I've talked about, blogged about, and tweeted about PowerPivot for a while now. By itself, PowerPivot is pretty cool, but add it to SharePoint and you have the Voltron of BI solutions.
I can pull from Oracle, Excel, a CSV, and DB2 all in the same file. CRAZY! What's even cooler is that I can pull an SSRS ATOM feed into PowerPivot... sometimes.
By default - if you're using SQL 2012 Integrated mode - the largest SSRS ATOM feed you can pull is 110 MB. We have some pretty ridiculously large reports at Trek, and a user was attempting to pull one of these monsters into PowerPivot and it was choking. To make matters worse, PowerPivot was just throwing a 500 error (Unknown error). Not really helpful...
I opened an MS case and began troubleshooting. We went back and forth and back and forth (6 weeks!), but we finally found the solution. So, to get PowerPivot working with large SSRS ATOM feeds, do the following:
- Open the file client.config on your front-end servers. The file is located here: C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\WebClients\Reporting\client.config
- Search for "httpStreaming" and "httpsStreaming"
- Within both of these bindings, change the following values - this will increase the data size limit from 110 MB to 1.1 GB:
- maxReceivedMessageSize from "115343360" to "1153433600"
- maxStringContentLength from "115343360" to "1153433600"
- maxArrayLength from "115343360" to "1153433600"
- Save the File
- Do an IISRESET across all SharePoint Servers.
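If you have more than a couple of front ends, here's a quick-and-dirty PowerShell sketch to make the swap for you. This is my own shortcut, not anything Microsoft handed me, so test it somewhere safe first:

#Back up client.config, then bump every 110 MB value to 1.1 GB.
#Run as an admin on each front-end server, then IISRESET.
$path = "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\WebClients\Reporting\client.config"
Copy-Item $path "$path.bak"
(Get-Content $path) -replace '115343360', '1153433600' | Set-Content $path

It works because all three attributes use the same value, so a straight find-and-replace covers them.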
I had an interesting request a few weeks ago. HR wanted to be able to share pages out to Yammer. I instantly ruled out a workflow because I couldn't post as the user. Then I ruled out a console app for two different reasons: the logic could get complicated, and it didn't allow for much flexibility. I went over to the Yammer Customer Network and started poking around. I ran across a few threads that talked about using the bookmarklet, so I went and took a look at it: https://www.yammer.com/company/bookmarklet
The code from that page gives you the Share to Yammer button.
Once the user clicks the icon, a new window opens and - as long as they're logged in to Yammer - they can create a new Yam with the URL of the page they want to share in the update's body.
Give it a shot and let me know what you think.
For the last year and a half I've been taking one class a semester at MATC in Madison. Having been trained as a technical writer, I've basically learned all this sysadmin stuff "on the job." I figured it would be a good idea to fill in the blanks for the stuff I haven't learned yet. The classes require an external hard drive to house and manage the VMs you use during labs and tests. Being 30 and having a full-time job allows me to buy really cool, really fast hardware to satisfy this class requirement. I opted for a 128GB Vertex 4. This thing SCREAMS. I get labs done in record time.
So how am I supposed to get my homework done if a spaghetti and meatball tornado comes through and wipes out the lower half of Wisconsin, taking my external hard drive with it?
TO THE CLOUD!
I've been using Azure at work for a variety of things so I figured I'd give this a try. I have 3 VMs and with them all zipped up (individually) I have about 16GB total to upload to Azure.
There are 3 Azure storage basics you need to know about: storage accounts, containers, and blobs. A storage account is the first thing you need in order to get started.
The storage account sets up the subdomain you'll use to be able to communicate with your storage objects: yourstorage.*.core.windows.net. You also set the affinity group (location) where your content will be stored.
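If you'd rather script the account creation than click through the portal, the Azure PowerShell module can do it too. A rough sketch - the publish settings path and affinity group name here are placeholders:

#Wire up the subscription, then create the storage account.
Import-AzurePublishSettingsFile "C:\temp\mysubscription.publishsettings"
New-AzureStorageAccount -StorageAccountName "inhifistereo" -Label "inhifistereo" -AffinityGroup "MyAffinityGroup"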
Once your storage account is up, you'll need a container. Think of a container as a folder - only it's not a folder - it's a container. It holds your blobs - binary large objects (i.e., your files). More on that in a bit.
Click Storage in the left-hand navigation
Click on the storage account name (my account is called "inhifistereo"; you can call yours whatever you like)
Click containers at the top of the page > then click New at the bottom of the page
Now give the container a name and choose Public or Private
Private is just that: private. Meaning you have to be logged on (or have a shared access key, but that's fodder for another blog post) to access your stuff. A public container is cool because you can access it from anywhere as long as you have the URL. Click the checkmark and we're good to go.
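Same deal here - if you want to script the container instead, something like this should work with the Azure PowerShell cmdlets. The account key is a placeholder; grab the real one from the portal:

#Build a storage context, then create the container.
#-Permission Off = private; Blob = anyone with the URL can read blobs.
$ctx = New-AzureStorageContext -StorageAccountName "inhifistereo" -StorageAccountKey "<your key>"
New-AzureStorageContainer -Name "homework" -Permission Blob -Context $ctx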
So a container is a container - like a folder, only it's a container. And a blob is a file. The part that took me a second to understand is that this storage isn't like a file share up in the cloud. It's the basic building block of storage in the cloud. A container dictates the access method, and a blob is the big ol' file that sits within the container.
Now to get content up to Azure. You could write a console app, use PowerShell, or use a third-party tool. For this exercise, I opted for a third-party tool: http://azurestorageexplorer.codeplex.com/. There are other tools out there, too.
The files took me basically all day to upload, for a few reasons - the big one being that I have the most basic broadband package Charter offers. But I'm not doing this for a living or every day, so the time is no big deal. I'm charged by the GB, not the minute, so if it took several days, no biggie. But I'm not getting any younger...
I did learn from this blog post that the tools above do not upload in parallel, which explains why it took so long.
"But David, you have a Skydrive and Dropbox account along with a hosting account. Why use Azure Storage?" Why not!? The real beauty of Azure storage is I only pay for what I use, and I pay pennies at that. Skydrive and Dropbox require a yearly commitment, and college classes only last 18 weeks. So when the class is over I can blow the container away and I don't get charged anymore. I don't plan on ever using these backups so they're cheap insurance. Now having said that, Azure storage (and Amazon, and Google, etc.) aren't really setup for consumer usage. But I'm not your typical consumer.
I'll give PowerShell a shot next time and probably try Amazon as well to see if there are any performance differences. If I'm feeling really ambitious I may try doing a console app.
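For the curious, here's roughly what I expect that PowerShell attempt to look like - a sketch using the same placeholder account/key and hypothetical "homework" container from above, plus a made-up local path. No promises it uploads in parallel either:

#Upload each zipped VM to the container as its own blob.
$ctx = New-AzureStorageContext -StorageAccountName "inhifistereo" -StorageAccountKey "<your key>"
Get-ChildItem "D:\VMBackups\*.zip" | ForEach-Object {
    Set-AzureStorageBlobContent -File $_.FullName -Container "homework" -Blob $_.Name -Context $ctx
}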
Price-wise, I've been charged a total of 15 US cents so far. I may have to go raid the couch cushions...
What's the best way to get someone to eat their vegetables? Force-feed them.
All kidding aside, CRM can be a pretty powerful tool, and when people don't want to use it we have to find creative ways to get them on board. In addition to the CRM problem, we have people who are absolutely married to Excel. And these aren't your typical Excel files. These are files that legends are made of. Crazy formulas, VLOOKUPs galore; you name it, we use it. To make matters worse, they e-mail these Excel reports around all day, every day as attachments. So let's kill two birds with one stone: we stop a big group of people from e-mailing Excel attachments, and we get them to use CRM. Win-win for everyone (or at least that's the hope).
- Save the Excel file in an easy-to-find, easy-to-access place in SharePoint - doing this in SharePoint gives us all the doc mgmt benefits that we've come to know and love
- Configure your Excel REST API URL - I've made it pretty clear in the past that I love SharePoint's REST APIs, and the Excel API is no exception. You can read more about it here: http://msdn.microsoft.com/en-us/library/ee556413(v=office.14).aspx
- Navigate to the Model page of your Excel Doc: https://dude.com/sites/site1/_vti_bin/ExcelRest.aspx/Documents/Document.xlsx/Model/
- You can then pick any Range, Chart, Table, or PivotTable to display in a variety of formats (I prefer image or html myself)
- Once you've chosen the desired element within the Excel file in the desired format, your URL should look something like this: https://dude.com/sites/site1/_vti_bin/ExcelRest.aspx/Documents/Document.xlsx/Model/Ranges('Scorecard')?$format=html
- Copy the URL as-is from the address bar
- Create a new Web Resource in CRM - we're going to iframe our Excel REST API URL call
- Choose Web Page (HTML) as the Type and then click "Text Editor"
- Click the Source tab
- Paste in your iframe code between the <body> tags. Your code should look like this:
<iframe src="https://dude.com/sites/site1/_vti_bin/ExcelRest.aspx/Documents/Document.xlsx/Model/Ranges('Scorecard')?$format=html" frameborder="0" width="4000" height="1300"></iframe>
- Note the frameborder, height, and width attributes. These are needed to eliminate the nasty border and to make scrolling work correctly. iframes aren't perfect and getting them to work feels "hacky," but the user won't know the difference and it should perform relatively seamlessly in all browsers.
- Click Publish
- Now, navigate to the desired dashboard and add your new Web Resource, click Save, and Publish.
Users should now see the Excel spreadsheet in their dashboard:
If users do not have access to the spreadsheet, they'll get an "Error: Access Denied" prompt or a blank screen, depending on the browser they use.
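A quick troubleshooting tip: when the iframe comes up blank, I like to test the REST URL outside of CRM. Here's a sketch using the example URL from above (Invoke-WebRequest requires PowerShell 3.0; the output path is made up):

#Request the REST URL with your current credentials and eyeball the result.
$url = 'https://dude.com/sites/site1/_vti_bin/ExcelRest.aspx/Documents/Document.xlsx/Model/Ranges(''Scorecard'')?$format=html'
$response = Invoke-WebRequest -Uri $url -UseDefaultCredentials
$response.StatusCode                                  #expecting 200
$response.Content | Out-File "C:\temp\scorecard.html" #open this in a browser to check the rendering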
In our case, the Excel spreadsheet scrolled FOREVER. I wanted to give users a pleasurable experience but I also didn't necessarily want them resorting to Excel on the client right away. I added a "Click to View in a separate Window" link in the iframe Web resource. Here's what my code looked like:
<p><a href="https://dude.crm.dynamics.com/WebResources/new_iframe2">Click to View in separate Window</a></p><iframe src="https://dude.com/sites/site1/_vti_bin/ExcelRest.aspx/Documents/Document.xlsx/Model/Ranges('Scorecard')?$format=html" frameborder="0" width="4000" height="1300"></iframe>
All HTML Web Resources are web pages, so I linked directly to the web page. But notice that I linked to new_iframe2? I didn't want users seeing "Click to View" on every page, so I made an identical web resource, except I removed the hyperlink from the top, making for a seamless experience for the user. There are all sorts of other things I could have done on the new_iframe2 page. I could have linked to Excel Web Access or even directly to Excel itself, but we'll leave it like that for now.
Ultimately, I've gotten the report builders to stop e-mailing this specific report as an attachment, and now the audience of the spreadsheet has to go to CRM to view it rather than getting it e-mailed to them. Awesome.
Maybe you're still kicking the tires on SharePoint 2013, installing it for the first time. Maybe you're in the throes of planning your migration. Maybe you're a consultant who's been stuck on a SharePoint 2010 project for the last 18 months. Whatever the case, we all have to face the music sooner or later and upgrade to SharePoint 2013. When you do, you'll have to set up a Search Service. It's not as bad as you'd think. And if you have at least 3 servers in your farm (1 app and 2 WFEs), then this script will work for you. Without further delay:
#Config Section
$APP1 = "App1"
$WFE1 = "WFE1"
$WFE2 = "WFE2"
$SearchAppPoolName = "SearchServiceAppPool"
$SearchAppPoolAccountName = "domain\SearchSvc"
$SearchServiceName = "SharePoint Search Service"
$SearchServiceProxyName = "SharePoint Search Service Proxy"
$DatabaseServer = "DBserver"
$DatabaseName = "SP_Search_AdminDB"

#Create a Search Service Application Pool
$spAppPool = New-SPServiceApplicationPool -Name $SearchAppPoolName -Account $SearchAppPoolAccountName -Verbose

#Start the Search Service Instances on the application server
Start-SPEnterpriseSearchServiceInstance $APP1 -ErrorAction SilentlyContinue
Start-SPEnterpriseSearchQueryAndSiteSettingsServiceInstance $APP1 -ErrorAction SilentlyContinue

#Start the Search Service Instance on the WFEs
Start-SPEnterpriseSearchServiceInstance $WFE1 -ErrorAction SilentlyContinue
Start-SPEnterpriseSearchServiceInstance $WFE2 -ErrorAction SilentlyContinue

#Create the Search Service Application
$ServiceApplication = New-SPEnterpriseSearchServiceApplication -Partitioned -Name $SearchServiceName -ApplicationPool $spAppPool.Name -DatabaseServer $DatabaseServer -DatabaseName $DatabaseName

#Create the Search Service Proxy
New-SPEnterpriseSearchServiceApplicationProxy -Partitioned -Name $SearchServiceProxyName -SearchApplication $ServiceApplication

#Clone the active topology so we can add components to it
$clone = $ServiceApplication.ActiveTopology.Clone()

#Set variables for component creation
$App1SSI = Get-SPEnterpriseSearchServiceInstance -Identity $APP1
$WFE1SSI = Get-SPEnterpriseSearchServiceInstance -Identity $WFE1
$WFE2SSI = Get-SPEnterpriseSearchServiceInstance -Identity $WFE2

#Create Admin component
New-SPEnterpriseSearchAdminComponent -SearchTopology $clone -SearchServiceInstance $App1SSI
#Create Content Processing component
New-SPEnterpriseSearchContentProcessingComponent -SearchTopology $clone -SearchServiceInstance $App1SSI
#Create Analytics Processing component
New-SPEnterpriseSearchAnalyticsProcessingComponent -SearchTopology $clone -SearchServiceInstance $App1SSI
#Create Crawl component
New-SPEnterpriseSearchCrawlComponent -SearchTopology $clone -SearchServiceInstance $App1SSI
#Create Query Processing component
New-SPEnterpriseSearchQueryProcessingComponent -SearchTopology $clone -SearchServiceInstance $App1SSI

#Set the primary and replica index locations; ensure these folders exist on the WFEs before running
$PrimaryIndexLocation = "C:\SPSearch"
$ReplicaIndexLocation = "C:\SPSearchReplica"

#We need two index partitions and a replica for each partition. Follow the sequence.
New-SPEnterpriseSearchIndexComponent -SearchTopology $clone -SearchServiceInstance $WFE1SSI -RootDirectory $PrimaryIndexLocation -IndexPartition 0
New-SPEnterpriseSearchIndexComponent -SearchTopology $clone -SearchServiceInstance $WFE2SSI -RootDirectory $ReplicaIndexLocation -IndexPartition 0
New-SPEnterpriseSearchIndexComponent -SearchTopology $clone -SearchServiceInstance $WFE1SSI -RootDirectory $PrimaryIndexLocation -IndexPartition 1
New-SPEnterpriseSearchIndexComponent -SearchTopology $clone -SearchServiceInstance $WFE2SSI -RootDirectory $ReplicaIndexLocation -IndexPartition 1

#Activate the new topology
$clone.Activate()

#Verify the Search Topology
$ssa = Get-SPEnterpriseSearchServiceApplication
Get-SPEnterpriseSearchTopology -Active -SearchApplication $ssa
This script is actually pretty basic. 90% of the services end up on your App box, while the Index partitions live on your WFEs. You could provision the service on one box or any combination you see fit. That's the beauty of this model. For me, I like having the Index partition closest to where people will be searching (i.e. the WFEs).
The script should take anywhere from 10-30 minutes to run, maybe longer depending on your hardware. Once done, navigate to your Search Service in Central Admin and this is what you should see in the Search Topology.
The most important thing to remember when using this script is to create the C:\SPSearch and C:\SPSearchReplica directories on your WFEs PRIOR to running it. The script will fail if you don't, and it's a pain to clean up after, so create the directories first. Next time I set up an environment, I'll probably write in a check to see if the directories exist - and if they don't, go ahead and create them.
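Here's a sketch of what that check might look like - it assumes the admin C$ shares are reachable from wherever you run it:

#Create the index directories on each WFE if they don't already exist.
$IndexServers = "WFE1", "WFE2"
$IndexPaths = "C:\SPSearch", "C:\SPSearchReplica"
foreach ($server in $IndexServers) {
    foreach ($path in $IndexPaths) {
        $unc = "\\$server\" + $path.Replace(":", "$")
        if (-not (Test-Path $unc)) {
            New-Item -Path $unc -ItemType Directory | Out-Null
        }
    }
}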
I have a SharePoint farm where I use the ASP.NET membership provider. Now and again I need to remove users due to separations, job changes, etc., so they can't access SharePoint.
Like all of you I have to research this thing anew every time I have to do this. But no more! My future self will thank me.
Using the aspnet_Users_DeleteUser stored proc we can remove users via SSMS.
USE [database name]
GO

EXEC [dbo].[aspnet_Users_DeleteUser]
    @ApplicationName = '[Application Name]',
    @UserName = '[Username]',
    @TablesToDeleteFrom = 15,
    @NumTablesDeletedFrom = 0
GO
@TablesToDeleteFrom can be a bit confusing when you open up the stored proc. It's actually a bit mask. I typically have to remove users entirely, so I use 15, but this blog post details some additional options for that parameter: http://vsproblemssolved.blogspot.com/2007/01/using-sqlmembershipprovider.html
@NumTablesDeletedFrom is an output parameter, meaning whatever you pass in gets ignored and replaced with the number of tables the user was actually deleted from. You can capture that value to inspect the result, but I'm not that ambitious.
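If you are that ambitious, here's a sketch of grabbing it from PowerShell - the connection string, application name, and username are all placeholders:

#Run aspnet_Users_DeleteUser and capture @NumTablesDeletedFrom.
$conn = New-Object System.Data.SqlClient.SqlConnection("Server=DBserver;Database=aspnetdb;Integrated Security=True")
$cmd = $conn.CreateCommand()
$cmd.CommandType = [System.Data.CommandType]::StoredProcedure
$cmd.CommandText = "dbo.aspnet_Users_DeleteUser"
[void]$cmd.Parameters.AddWithValue("@ApplicationName", "MyApp")
[void]$cmd.Parameters.AddWithValue("@UserName", "jdoe")
[void]$cmd.Parameters.AddWithValue("@TablesToDeleteFrom", 15)
$out = $cmd.Parameters.Add("@NumTablesDeletedFrom", [System.Data.SqlDbType]::Int)
$out.Direction = [System.Data.ParameterDirection]::Output
$conn.Open()
[void]$cmd.ExecuteNonQuery()
$conn.Close()
Write-Host "Deleted the user from $($out.Value) tables"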
I spin up my spiffy new SharePoint 2013 environment, migrate my PerformancePoint databases, and then try to hit the dashboards. I'm greeted by all kinds of errors. Take note: we're using more and more tabular SSAS data sources at Trek.
Fast forward a few days. I open a ticket with Microsoft and begin troubleshooting. The engineer was a helpful chap. He had an idea from the get-go of what the problem was, but wanted to check some environment variables before we went and started installing stuff.
Long story short: you need to install ADOMD.NET from the SQL 2008 R2 feature pack if you want to hit tabular data sources (LINK - you'll find the correct pack towards the bottom of the Install Instructions section). #lamesville #sql2012hasbeenoutforalmostayear
I asked the engineer to send me the TechNet article stating this, to which he replied, "Wish I could." Nowhere in any of Microsoft's documentation does it state you need to install this feature pack in order to use tabular data sources in SP 2013. He did, however, send me this blog post, so kudos to him for that: http://blogs.technet.com/b/microsoft_in_education/archive/2013/04/29/configuring-performancepoint-in-sharepoint-2013.aspx
I'm pissed. Like, in disbelief pissed. Kinda like my buddy Tracy:
The KB lists 2 possible workarounds: 1) install a hotfix, or 2) run a whole bunch of PowerShell that requires the OS .iso to be readily available. I've tried both with little to no success. I especially see issues when it comes to AppFabric and the Distributed Cache.
So you can imagine my disbelief when the true workaround is to install the IIS role first, before running the pre-req installer. Do that and the pre-req installer runs just fine (at least on Server 2012). Son of a...
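For reference, here's the two-liner I'd run on Server 2012 before kicking off the pre-req installer (the installer still layers on whatever role services it needs after this):

#Install the IIS role first, then run the pre-req installer.
Import-Module ServerManager
Install-WindowsFeature Web-Server -IncludeManagementTools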
When I saw SkyDrive Pro work with Windows 8 for the first time.