Getting PowerPivot working in SharePoint

PowerPivot with SharePoint Diagnostics

There are a lot of moving parts to PowerPivot. Here are some common causes of PowerPivot not working within SharePoint:

No associated Service Application

I list this first, because I spent way too much time finding this one.  There is a Service Application created in support of PowerPivot.  If your web application doesn’t have a proxy association to this Service Application, you will get a generic connection error when trying to view a PowerPivot spreadsheet within Excel Services.
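You can check (and fix) the proxy association from PowerShell; a minimal sketch, where the web application URL is hypothetical:

# List the service application proxies currently associated with the web application
$wa = Get-SPWebApplication http://sharepoint   # hypothetical URL
$wa.ServiceApplicationProxyGroup.Proxies | Select-Object DisplayName, TypeName

# If the PowerPivot proxy is missing, add it to the web application's proxy group
$proxy = Get-SPServiceApplicationProxy | Where-Object { $_.TypeName -like "*PowerPivot*" }
Add-SPServiceApplicationProxyGroupMember -Identity $wa.ServiceApplicationProxyGroup -Member $proxy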

Data Connection Library trust

If you use an external (i.e. ODC) data connection in a library, that library needs to be trusted as a Data Connection Library in Excel Services.

Trusted file location

The location where the PowerPivot spreadsheet is stored needs to be a trusted file location.
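Both this and the Data Connection Library trust above can be scripted; a minimal sketch, assuming an Excel Services application named "Excel Services Application" and hypothetical site URLs:

# Trust the site (and its children) where PowerPivot workbooks live
New-SPExcelFileLocation -ExcelServiceApplication "Excel Services Application" -Address "http://sharepoint/sites/bi" -LocationType SharePoint -IncludeChildren

# Trust the library holding the ODC files
New-SPExcelDataConnectionLibrary -ExcelServiceApplication "Excel Services Application" -Address "http://sharepoint/sites/bi/DataConnections" -Description "Trusted ODC library"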

Install the appropriate SSAS OLAP library

For SQL Server 2008 R2 SP1, the correct library is MSOLAP.4; for SQL Server 2012, the correct library is MSOLAP.5.  If you open the ODC data connection referenced by your PowerPivot spreadsheet, you will find a reference to this library near the beginning of the connection string.
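If you just want to confirm which provider an ODC file references without opening it, a quick check (the path is hypothetical):

# Show the first line of the ODC file that names the MSOLAP provider version
Select-String -Path "C:\temp\MyConnection.odc" -Pattern "MSOLAP" | Select-Object -First 1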

Ensure SSAS is running under a domain account

There are two SharePoint solution packages (WSPs) installed in support of PowerPivot, and both need to be deployed. Here’s the PowerShell:

Add-SPSolution -LiteralPath "C:\Program Files\Microsoft SQL Server\110\Tools\PowerPivotTools\ConfigurationTool\Resources\PowerPivotFarm.wsp"
Install-SPSolution -Identity PowerPivotFarm.wsp -GACDeployment -Force

$centralAdmin = Get-SPWebApplication -IncludeCentralAdministration | Where { $_.IsAdministrationWebApplication -eq $TRUE }
Add-SPSolution -LiteralPath "C:\Program Files\Microsoft SQL Server\110\Tools\PowerPivotTools\ConfigurationTool\Resources\PowerPivotWebApp.wsp"
Install-SPSolution -Identity PowerPivotWebApp.wsp -GACDeployment -Force -WebApplication $centralAdmin

If you want to check the deployment status, try this command:

Get-SPSolution "powerpivotfarm.wsp" | Format-List

As background, there is a deployment job triggered when deploying the solution. This job is managed under the Timer Job Service, which you can check on:

Get-SPTimerJob -Type Microsoft.SharePoint.Administration.SPSolutionDeploymentJobDefinition | Format-List

Next, go into Farm Solutions in Central Admin, and deploy both solutions. What do these solutions do?

The Powerpivotfarm.wsp solution does the following:

  • Adds Microsoft.AnalysisServices.SharePoint.Integration.dll to the global assembly cache.
  • Adds Microsoft.AnalysisServices.ChannelTransport.dll to the global assembly cache.
  • Installs features and resource files, and registers content types.
  • Adds library templates for PowerPivot Gallery and Data Feed libraries.
  • Adds application pages for service application configuration, PowerPivot Management Dashboard, data refresh, and PowerPivot Gallery.

The Powerpivotwebapp.wsp solution does the following:

  • Adds Microsoft.AnalysisServices.SharePoint.Integration.dll resource files to the web server extensions folder on the web front-end.
  • Adds the PowerPivot Web service to the web front-end.
  • Adds thumbnail image generation for PowerPivot Gallery.

Ensure OLE DB providers are available

If you are running SQL Server 2008 R2 SP1, then the OLE DB provider should have been installed when SharePoint was installed.  You can find it in the GAC as “microsoft.analysisservices.xmla.dll”.  The GAC is in C:\windows\assembly.

SQL Server 2008 R2: MSOLAP100.dll; within the data connection string it is called MSOLAP.4, and the version (when you examine the properties of the file in the GAC) is 10.50.1600 or later.

SQL Server 2012: MSOLAP110.dll; within the data connection string it is called MSOLAP.5, and the version (when you examine the properties of the file in the GAC) is 11.00.0000 or later.
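To confirm the provider version on a given server, check the file version of the DLL itself; a sketch assuming the default SQL Server 2012 install path:

# Default path for the SQL Server 2012 OLAP provider; adjust for your install
$dll = "C:\Program Files\Microsoft Analysis Services\AS OLEDB\110\msolap110.dll"
(Get-Item $dll).VersionInfo | Select-Object FileVersion, ProductVersion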

Ensure the OLE DB provider is trusted by Excel Services

Check within Central Admin, Manage Service Applications, Excel Services, that the Trusted Data Providers list includes MSOLAP.4 and/or MSOLAP.5, depending on which version of SQL Server you are using.

If it is not listed, click Add Trusted Data Provider and, for the Provider ID, type MSOLAP.5.
For Provider Type, ensure that OLE DB is selected. In Provider Description, type Microsoft OLE DB Provider for OLAP Services 11.0.
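The same registration can be scripted; a sketch, assuming your Excel Services application is named "Excel Services Application":

# Register MSOLAP.5 as a trusted OLE DB provider for Excel Services
New-SPExcelDataProvider -ExcelServiceApplication "Excel Services Application" -ProviderId "MSOLAP.5" -ProviderType OleDb -Description "Microsoft OLE DB Provider for OLAP Services 11.0"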

Ensure CTWTS is started

The Claims to Windows Token Service is required to communicate and authenticate seamlessly with SQL Server.  This service has to be started.  In Central Admin, go to Services on Server, and start it up.
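You can also start it from PowerShell on each server:

# Start the Claims to Windows Token Service instance on this server
Get-SPServiceInstance -Server $env:COMPUTERNAME | Where-Object { $_.TypeName -eq "Claims to Windows Token Service" } | Start-SPServiceInstance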

Ensure CTWTS is running as LOCAL

The Claims to Windows Token Service needs to run as the Local System built-in account.  You can see this by starting services.msc on each SharePoint server.
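A quick way to verify the run-as account on each server from PowerShell (the underlying Windows service name is c2wts):

# StartName shows the account the service runs as
Get-WmiObject Win32_Service -Filter "Name='c2wts'" | Select-Object Name, StartName, State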

Ensure Secure Store Service is configured

This Service Application needs to be configured (it has its own database), running, and associated with your web application, with an Application ID defined that maps to an AD account.  I prefer configuring it as a “Group” entry.  You’ll need to enter credentials for this Application ID, and also grant permissions to the users who will reference it; the Secure Store Application ID is set within the data connection, under the Authentication configuration.
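A quick sanity check that the Secure Store service application and its proxy exist and are online:

Get-SPServiceApplication | Where-Object { $_.TypeName -like "*Secure Store*" } | Select-Object DisplayName, Status
Get-SPServiceApplicationProxy | Where-Object { $_.TypeName -like "*Secure Store*" } | Select-Object DisplayName, Status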

Ensure Excel Services is running and associated

PowerPivot makes use of Excel Services.  This Service Application must be configured, and associated with your web application.  The data connection library (if used) and file location must be trusted.  Excel Services should also be configured to allow PowerPivot workbooks of sufficient size to render.  Note I have seen a bug in early versions of SharePoint 2013 where increasing the maximum workbook size does not allow larger workbooks to render.
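The maximum workbook size is set per trusted file location; a sketch, assuming the application name shown (check yours with the Get- cmdlets first):

# Review current limits (WorkbookSizeMax is in MB)
Get-SPExcelServiceApplication | Get-SPExcelFileLocation | Select-Object Address, WorkbookSizeMax

# Raise the limit for the default trusted location ("http://" covers everything)
Set-SPExcelFileLocation -ExcelServiceApplication "Excel Services Application" -Identity "http://" -WorkbookSizeMax 100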

Ensure XLSX renders in Excel Services

This one is a real “gotcha” if you are running SharePoint 2013 with an Office Web Apps farm.  By default Office Web Apps renders spreadsheets, and it cannot handle external data connections.  If the URL contains “_layouts/15/WopiFrame.aspx” then the spreadsheet is being rendered by Office Web Apps.  The tricky part is that, other than the URL, Office Web Apps rendered spreadsheets look virtually identical to their Excel Services rendered equivalents.  Under Excel Services the URL would instead contain “_layouts/15/xlviewer.aspx”.  The fix for this is a single PowerShell command:

New-SPWOPISuppressionSetting -Extension XLSX -Action View
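To confirm the suppression took effect, and to reverse it later if needed:

# List current suppression settings
Get-SPWOPISuppressionSetting

# To reverse the change later:
# Remove-SPWOPISuppressionSetting -Extension XLSX -Action View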

Ensure Analysis Services Account can act as O/S

The service account running the PowerPivot dedicated Analysis Services instance on the SharePoint server can fail to refresh unless the following steps are taken:

  • Run “secpol.msc”.
  • Click Local Security Policy, then click Local Policies, and then click User Rights Assignment.
  • Open the “Act as part of the operating system” right and add the service account.

Check the ULS

When all else fails, the ULS can be a lifesaver.  While often cryptic or hyperventilating about some minutiae, it still can very clearly indicate the source of a problem.
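Rather than trawling the log files by hand, you can query them from PowerShell; a sketch pulling recent high-level events:

# Show high-severity ULS events from the last 15 minutes
Get-SPLogEvent -StartTime (Get-Date).AddMinutes(-15) | Where-Object { $_.Level -eq "High" } | Select-Object Timestamp, Area, Category, Message -First 20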

How to write a SharePoint File to disk

It is quite common: there is a file in SharePoint, and you want it written to a local disk. How do you do that in PowerShell? Here’s how:

# $SharePointLocation is the full URL of the document; $LocalDiskFileLocation is the target path on disk
$site = New-Object Microsoft.SharePoint.SPSite($SharePointLocation)
$web = $site.OpenWeb()
$item = $web.GetListItem($SharePointLocation)
$bytes = $item.File.OpenBinary()   # the file contents as a byte array

$stream = New-Object System.IO.FileStream($LocalDiskFileLocation, [System.IO.FileMode]::Create)
$writer = New-Object System.IO.BinaryWriter($stream)
$writer.Write($bytes)
$writer.Close()
$web.Dispose()
$site.Dispose()

Additional Reading
SharePoint Document IDs

Great new book on SharePoint Disaster Recovery

There’s a great new book on SharePoint Disaster Recovery. Given I wrote a chapter, I’m a bit biased 🙂
https://www.packtpub.com/application-development/microsoft-sharepoint-2013-disaster-recovery-guide

Disaster Recovery planning is key to maintaining enterprise system availability for a SharePoint environment.

Getting the absolute URL for a SharePoint SPItem

Getting the fully encoded absolute URL

Most properties for an SPFile and SPItem only give part of the URL. To get the full URL, use this reference:

SPItem["EncodedAbsUrl"]

It’s not so obvious to find this property, as it will not be listed in the fields returned in the SPField collection for the SPList.
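For example, from PowerShell (the URL and list name are hypothetical):

$web = Get-SPWeb "http://sharepoint/site"
$item = $web.Lists["Documents"].Items[0]
$item["EncodedAbsUrl"]   # e.g. http://sharepoint/site/Documents/My%20File.docx
$web.Dispose()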

Report on all Search Site references across SharePoint Site Collections

I got an interesting request recently to find all search centers configured for all Site Collections. I thought I would share the very simple script to do this:

Get-SPSite -Limit All | ForEach-Object {
    Write-Host "$($_.Url),$($_.RootWeb.AllProperties['SRCH_ENH_FTR_URL'])"
}

You can set the search dropdown property using this assignment:

$web.AllProperties["SRCH_SITE_DROPDOWN_MODE"] = "HideScopeDD_DefaultContextual"

Here are the possible SRCH_SITE_DROPDOWN_MODE values and what they mean:

Site Collection Search Dropdown Mode                                        Property Value                   Search Results URL
Do Not Show Scopes Dropdown, and default to contextual scope                HideScopeDD_DefaultContextual    Y
Do Not Show Scopes Dropdown, and default to target results page             HideScopeDD                      N
Show scopes Dropdown                                                        ShowDD                           Y
Show, and default to ‘s’ URL parameter                                      ShowDD_DefaultURL                Y
Show and default to contextual scope                                        ShowDD_DefaultContextual         Y
Show, do not include contextual scopes                                      ShowDD_NoContextual              N
Show, do not include contextual scopes, and default to ‘s’ URL parameter    ShowDD_NoContextual_DefaultURL   N

Here’s the full PowerShell script to set these values:

$web = Get-SPWeb http://SharePoint/managedpath/site
$web.AllProperties["SRCH_ENH_FTR_URL"] = "/search/"
$web.AllProperties["SRCH_SITE_DROPDOWN_MODE"] = "HideScopeDD_DefaultContextual"
$web.AllProperties["SRCH_TRAGET_RESULTS_PAGE"] = "/_layouts/OSSSearchResults.aspx"   # the "TRAGET" misspelling is the actual property name
$web.Update()

Generating alternate colors in an email HTML table generated by PowerShell in SharePoint

Generating rich text emails with PowerShell

In this post: Generating automatic emails with embedded-reports and link to uploaded csv
we explored sending emails and attachments with HTML tables. Now let’s have a look at alternating row colors in the HTML.

This function will be used later to inject the CSS classes for alternating row colors into the HTML table:

Function Set-AlternatingRows {
 
    [CmdletBinding()]
    Param(
        [Parameter(Mandatory=$True,ValueFromPipeline=$True)]
        [string]$Line,
 
        [Parameter(Mandatory=$True)]
        [string]$CSSEvenClass,
 
        [Parameter(Mandatory=$True)]
        [string]$CSSOddClass
    )
    Begin {
        $ClassName = $CSSEvenClass
    }
    Process {
        If ($Line.Contains("<tr><td>"))
        {   $Line = $Line.Replace("<tr>","<tr class=""$ClassName"">")
            If ($ClassName -eq $CSSEvenClass)
            {   $ClassName = $CSSOddClass
            }
            Else
            {   $ClassName = $CSSEvenClass
            }
        }
        Return $Line
    }
}

Let’s now define $a as the styles, including the style for alternating rows, which we will use when generating the HTML table:

$a = "
<style>"
$a = $a + "TABLE{border-width: 1px;border-style: solid;border-color:black;}"
$a = $a + "Table{background-color:#EFFFFF;border-collapse: collapse;}"
$a = $a + "TH{border-width:1px;padding:5px;border-style:solid;border-color:black;background-color:#DDDDDD}"
$a = $a + "TD{border-width:1px;padding-left:5px;padding-right:3px;border-style:solid;border-color:black;}"
$a = $a + ".odd { background-color:#ccddee; }"
$a = $a + ".even { background-color:#eeeeff; }"
$a = $a + "</style>

In generating the HTML table, note the reference to the above function in the Pipeline:

$MyOutput = $table | ConvertTo-Html Title, Link, Author, Size, FileName, Modified -head $a -body "<H2>$($rows.count) $($Header)</H2>" | Set-AlternatingRows -CSSEvenClass even -CSSOddClass odd | Convert-HTMLEscape
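From there, emailing the result is straightforward; a minimal sketch, with hypothetical addresses and SMTP server:

Send-MailMessage -To "team@example.com" -From "sharepoint@example.com" -Subject "SharePoint report" -Body ($MyOutput | Out-String) -BodyAsHtml -SmtpServer "smtp.example.com"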

As a refresher, we use this function to handle HTML escaping so the HTML comes out delightfully formatted:

Function Convert-HTMLEscape {
<#
convert &lt; and &gt; to < and >
It is assumed that these will be in pairs
#>

[cmdletbinding()]

Param (
[Parameter(Position=0,ValueFromPipeline=$True)]
[string[]]$Text
)

Process {
foreach ($item in $text) {
    if ($item -match "&lt;") {
        #unescape the angle brackets and quotes that ConvertTo-Html encoded
        (($item.Replace("&lt;","<")).Replace("&gt;",">")).Replace("&quot;",'"')
     }
     else {
        #otherwise just write the line to the pipeline
        $item
     }
 }
} #close process

} #close function

Inconsistent SharePoint timestamps with WebDAV

There are situations where moving documents using Explorer mode retains the correct “Modified” timestamp in the browser, yet shows an updated timestamp in Explorer mode.

This is due to WebDAV showing the date associated with the File (SPFile) rather than the date associated with the SPItem. Explorer mode can update the modified date in the SPFile.

When you probe further, the “Item” has the correct timestamp (item[“Modified”] is in the local timezone). However the SPFile has a property called vti_timelastmodified that holds the GMT timestamp.

Note the File has a property called vti_nexttolasttimemodified as well.

Both item.file.properties[“vti_timelastmodified”] and item[“Modified”] are the same type (DateTime), so I can compare them, which is precisely what I did in a custom PowerShell report looking for such timestamp divergence. The report flags files whose timestamps are off by more or less than 4 or 5 hours (depending on the time of year, the offset is either 4 or 5 hours), given my ET timezone. I deliberately compare only to the “minute” and not the “second”, so I don’t flag false positives.
Here is how to fix a single instance of this issue:

$docurl = "http://SharePoint/site/list/TimestampTest/test2.docx"
$site = New-Object Microsoft.SharePoint.SPSite($docurl)
$web = $site.OpenWeb()
$item = $web.GetListItem($docurl)
$list = $item.ParentList

# Re-save the item with its existing Modified date and Editor, then remove the extra version this creates
[System.DateTime]$date = $item["Modified"]
$user = New-Object Microsoft.SharePoint.SPFieldUserValue($web, $item["Editor"])
$item["Modified"] = $date
$item["Editor"] = $user
$item.Update()
try { $item.Versions[1].Delete() } catch { Write-Host -ForegroundColor Red "Error (1) could not delete old version of $($item['Name'])" }

Here’s a function that reports on all URLs that suffer from this issue. Note the date comparison: a timestamp is considered consistent when it is off by exactly 4 or 5 hours, with the day and minute deltas both zero:

function Reset-Dates ($WebUrl, $ListName)
{
    # Get web and list objects
    $web = Get-SPWeb $WebUrl
    $list = $web.Lists[$ListName]
    if ($ReportFile -eq $null) { $ReportFile = "C:\report.csv" }
    if ($sep -eq $null) { $sep = "," }   # CSV field separator, assumed set at script scope

    # Capture site and list names for the report
    $xSit = $WebUrl
    $xLis = $ListName

    # Go through each item in the list
    $list.Items | ForEach-Object {
        $item = $_

        $fd = $item.File.Properties["vti_timelastmodified"]   # file timestamp (GMT)
        $id = $item["Modified"]                               # item timestamp (local time)

        $dd = ($id - $fd)

        $hoursMatch = (($dd.Hours -eq -4) -or ($dd.Hours -eq -5))
        $daysMatch = ($dd.Days -eq 0)
        $minutesMatch = ($dd.Minutes -eq 0)

        if ($hoursMatch -and $daysMatch -and $minutesMatch)
        {
            Write-Host '.' -NoNewline
        }
        else
        {
            $xURL = $item.Url
            $xNam = $item['Name']

            $xMod = $item["Modified"]
            $xfMod = $item.File.Properties["vti_timelastmodified"]
            # "DOMAIN\" is a placeholder for your AD domain prefix
            try { $xEdi = $item["ows_Modified_x0020_By"].Replace("DOMAIN\",$null) } catch { $xEdi = $item["ows_Modified_x0020_By"] }
            try { $xAut = $item["ows_Created_x0020_By"].Replace("DOMAIN\",$null) } catch { $xAut = $item["ows_Created_x0020_By"] }

            $Line1 = $xURL+$sep+$xSit+$sep+$xLis+$sep+$xNam+$sep+$xAut+$sep+$xEdi+$sep+$xMod+$sep+$xfMod+$sep+$dd.Days+$sep+$dd.Hours+$sep+$dd.Minutes+$sep+'1'
            $Line1 | Out-File -FilePath $ReportFile -Append
            $Line1 = $null
        }
    }

    $web.Dispose()
}
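Usage is then (hypothetical URL and list name; $ReportFile and $sep can be preset at script scope):

$ReportFile = "C:\TimestampReport.csv"
Reset-Dates -WebUrl "http://sharepoint/managedpath/site" -ListName "Documents"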

SharePoint Database Naming

The primary naming convention to put in place is the Database Naming standard.  By default, SharePoint puts a GUID at the end of every database name.

This is to ensure that two SharePoint farms can use the same database server without conflict. However, the GUIDs make memorizing a database name practically impossible.

Here are some basic naming conventions to consider adopting:

  • Avoid GUIDs at all costs
  • Avoid blanks
  • Avoid underscores
  • Leave “DB” and “database” out of the name
  • Use Capital Letters to highlight the start of words (CamelCase)
  • Consistent description to allow a database to be clearly associated with a specific Web App and Service App
  • References to Production vs. Development are not necessary
  • References to SharePoint version are not necessary
  • References to “SharePoint” are unnecessary, especially for a dedicated SQL Server
  • Leave the obscure “WSS” SharePoint convention for content databases behind; instead use “Content” at the start of the database name.  That’s clearer for DBAs who are not versed in the mysterious acronyms of SharePoint.

Here’s a proposed syntax for structuring database names:

[Major Application][Type] [Minor Application] [Specific]

Component             Description                                                   Sample Values
[Major Application]   Major category of application                                 (left blank for SharePoint); MSPS (for MS Project Server)
[Type]                Category or type of database, based on the primary system     Content; ServiceApp
                      using the database
[Minor Application]   Can be the Service Application name                           PerformancePoint; ManagedMetadata
[Specific]            Distinguishes multiple service app DBs, or describes the      CentralAdmin
                      use of a Content DB for a Web App

Examples

Default: Search_Connector_CrawlStoreDB_4040b7300e9e42779edb3e6b926be5a7

New: ServiceApp_SearchConnectorCrawlStoreDB

Default: SharePoint_AdminContent_ff35d171-482c-4f9d-8305-a4a259ec1a15

New: Content_CentralAdmin

Default: wss_content_eaee9d8f-ed75-4a56-bad3-5abf232b4f66

New: Content_DIV_HR

Default: StateService_0f2a42e8b90d4c60830ca442e753de13

New: ServiceApp_State
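When provisioning, you can supply the compliant name directly, so the GUID-suffixed default never exists; for example (hypothetical web application URL):

# Create a new content database with a convention-compliant name
New-SPContentDatabase -Name "Content_DIV_HR" -WebApplication http://sharepoint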

Designing your SharePoint content hierarchy

Key SharePoint Limits

Microsoft publishes a long list of soft and hard boundaries and limits within SharePoint. These have not changed significantly between 2010 and 2013. The definitive list can be found at: https://technet.microsoft.com/en-us/library/cc262787.aspx

An example of a hard boundary limit is the 2GB limit on file size. This limit is based on Operating System and SQL Server limits, and involves no trade-off options. However it is not advisable to allow 2GB uploads without first considering the options and tradeoffs involved, as well as end-user performance, and configuration settings such as BlobCache that can alleviate some of the delays associated with huge files.

An example of a soft limit is the List View Threshold of 5,000, which prevents any list view from displaying more than 5,000 items. Even trying to display 5,001 will return an error. This setting is easily changed, but the impact can be significant, as the underlying SQL Server switches from row locking to table locking on operations involving more than 5,000 items. This is an example of trading off performance and scalability against any usability benefits of increasing this setting. There are situations where it may make sense, such as a farm with a disproportionately low number of concurrent users, making table locking a good trade-off given the low likelihood of another user being concurrently impacted. However, with a larger number of concurrent users, you cannot increase this threshold without causing likely and noticeable performance degradation.

While there are a large number of limits, the ones that are most likely to guide your design are the two listed above, plus Security Scopes and Content Database size.

Security Scopes in a Library

Each time an object (web, library, folder or item) gets unique security permissions, that is termed a “Security Scope”. If 100 documents are assigned unique permissions, but have the same three people as contributors, this still counts as 100 Security Scopes. If they are each assigned to 200 Contributors, that is still only 100 Security Scopes. A rule of thumb is that once you reach 1,000 unique Security Scopes, performance degrades by 20%. Increase the number further, and performance nosedives. If you are going to have large libraries, it is best to have few or no unique Security Scopes. This is purely a performance impact, and the result is noticeable.
Content Database Size

This has the most direct impact on your DR strategy. The limits of your hardware (bounded by the laws of physics) determine the speed of your backups and restores. The larger the Content Database, the longer your backup and recovery. A prime example is restoring a single Content Database. The scenario: a user “stomps” on a document, replacing it completely, using Explorer Mode. You’ve got to restore from a backup, so you start the restore, get ready to do an unattached database extract, and wait for the restore to complete. However a large Content Database of 200GB or more can require the better part of a day to restore; or worse, you may have insufficient spare disk space for the restore.

URL Length

Aside from the published limits, there’s a crucial limit that is not listed in the SharePoint Boundaries and Limits site: the maximum URL length of 255 characters. That might sound like a large URL, but once you use an FQDN (Fully Qualified Domain Name) for your Web Application, a Managed Path, Sites and SubSites, Libraries and Nested Folders, enterprising users will use longer file names than you’d imagine, sprinkling them with characters not easily represented in a URL (such as blanks), which get expanded to %20 (the hex of ASCII 32, a blank). This limitation is insidious: your design could appear to work fine until users encounter seemingly random problems, where only some files have issues. Renaming files and folders tersely can buy some relief, as can replacing blanks with underscores, but getting users to adopt CamelCase is a tough sell for end users accustomed to naming files as they want.
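A quick way to find files already at risk is to walk the farm and flag long URLs; a minimal sketch (note this enumerates every item, so run it off-hours on a large farm):

# Report document URLs that exceed 255 characters
Get-SPSite -Limit All | ForEach-Object {
    foreach ($web in $_.AllWebs) {
        foreach ($list in $web.Lists) {
            foreach ($item in $list.Items) {
                $url = "$($web.Url)/$($item.Url)"
                if ($url.Length -gt 255) { Write-Host $url }
            }
        }
        $web.Dispose()
    }
    $_.Dispose()
}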

Impact of Design Decisions

Design decisions can be made implicitly that have a great impact on scalability, performance and DR. Let’s summarize these key decisions:

  • Web Applications
    The fewer the better. Aside from the untouchable Central Admin Web Application, and separate MySites, try to limit your Web Applications. Each Web Application carries not only its own configuration, but also a whole raft of dedicated Timer Jobs firing multiple jobs every second, and additional IIS load, even if you share an Application Pool among multiple Web Applications.
  • Managed Paths
    These are defined at the Web Application level, and provide the structure for additional Site Collections.
  • Host Named Site Collections
    Aside from perhaps Search, this is the largest infrastructure change in SharePoint 2013. It’s geared to allow for cloud based multi-tenancy, and scalability beyond what can be done with the relatively structured Managed Paths. Best is to select one Web Application, and use that for all Host Named Site Collections.
  •  Number of Site Collections
    Some users feel comfortable within the confines of a single site collection. There are certainly conveniences within a site collection, including shared security groups, shared content types, and an immediate navigable hierarchy. Going with monolithic Site Collections limits your ability to scale across Content Databases, as a Site Collection can only ever exist within one Content Database. Once a Site Collection has grown too large, splitting it can be fraught with challenges. A large Content Database takes proportionately longer to backup, and more importantly, to recover.
  • Number of Sites
    A more diverse set of Sites (SPWeb objects, not to be confused with Site Collections) and their hierarchy requires thoughtful design to echo how users typically navigate SharePoint. A greater number of sites is recommended, which in turn should allow for scalability.
  • Number of Libraries
    Huge libraries suffer a proportionate impact to performance, and expose the possibility of errors if a View returns a result set larger than the List View Threshold (default and recommended setting of 5,000). It is better to guide users toward separate libraries instead of a desired set of top-level folders.

Going with a more diverse yet coherent and logical structure of Site Collections, Sites, and Libraries will allow for the scalability your farm needs to satisfy daily user needs, as well as growth objectives.

A good approach when working with users is to play down the name (site collection, site, library and folder) and focus instead on the hierarchy, which users understand. You can always tune navigation to give them an optimal experience.

Limiting Library Version Storage across your SharePoint farm

There are situations where documents are frequently edited. Each edit creates a new version in SharePoint. In SP2010, each version consumed the full disk space, with no optimization for managing deltas. In SP2013, one of the benefits of Shredded Storage is that it optimizes storage usage for multiple similar versions of not just Office XML (Office 2010/2013) documents but also other filetypes like PDFs and image files. It does this by working out and storing only the file differentials. Even with Shredded Storage, you can limit the number of versions retained on document edits. Here’s how to do this across your farm. Let’s limit major versions to three, and minor versions to five:

$spWebApp = Get-SPWebApplication http://SharePoint
for ($Si=0; $Si -lt $spWebApp.Sites.count; $Si++)
{
  $site = $spWebApp.Sites[$Si];
 
  for ($Wi=0; $Wi -lt $site.AllWebs.count; $Wi++)
  {
    $web = $site.AllWebs[$Wi];
 
    for ($Li=0; $Li -lt $web.Lists.count; $Li++)
    {
      $List = $web.Lists[$Li];
      if ($list.EnableVersioning)
      {
        $list.MajorVersionLimit = 3
      }
 
      if ($list.EnableMinorVersions)
      {
         $list.MajorWithMinorVersionsLimit = 5
      }
      $list.Update()
    }
  $web.dispose()
  }
  $site.dispose()
}
$spWebApp.dispose()