Enhancing SharePoint breadcrumbs when navigating deeply nested folders

I had an interesting challenge improving breadcrumbs when navigating a deeply nested folder library. The nice breadcrumb dropdown icon is actually gone in SP2013 by default; it can easily be re-enabled via CSS, and an onMouseOver event handler in JavaScript can eliminate the need to click on it. However, to improve the breadcrumb navigation itself, I created an SPWeb-scoped feature that enhances breadcrumbs by overriding the OnPreRender event. OnPreRender only fires if the 'Visible' property of the web control is set to true. The PreRender event is raised just before the page renders its contents; it is the last chance we have to modify the page output before it is received by the browser.

I deployed this web-scoped solution that overrides the breadcrumbs during OnPreRender page processing and provides the full folder path, with a clickable link for each segment. It recursively searches the page for the "TitleBreadcrumb" control and builds a replacement, adding an anchor tag, folder link name, and breadcrumb image iteratively. It only affects the page when you are navigating within a library.

namespace MyDocLibraryBreadcrumb
{
    public class MyDocLibBreadcrumbDelegateControl : WebControl 
    {
        protected override void OnPreRender(EventArgs e)
        {
 
            try
            {
                base.OnPreRender(e);
 
                // Get the path to the current folder
                string path = Context.Request.QueryString["RootFolder"];
 
                // If there's no path then there is nothing to do; it implies we are not in the context of the library 
                if (String.IsNullOrEmpty(path))
                {
                    return;
                }
 
                // Let's get the current folder
                SPWeb web = SPContext.Current.Web;
                SPFolder currentFolder = web.GetFolder(path);
 
                // Let's find the breadcrumb control on the current page - it's a ListProperty control where the property is  "TitleBreadcrumb". 
                Control c = Utils.FindRecursive(Page.Controls, ctrl => ctrl is ListProperty && ((ListProperty)ctrl).Property == "TitleBreadcrumb");
 
                // If not found, nothing to do, and we are not likely in a library. 
                if (c == null)
                    return;
 
                // Let's substitute the OOTB breadcrumb control with our enhanced replacement
                var parent = c.Parent;
                var index = parent.Controls.IndexOf(c);
                parent.Controls.RemoveAt(index);
                parent.Controls.AddAt(index, new LiteralControl { Text = GetReplacementBreadCrumbOutput(currentFolder) });
            }
            catch (Exception ex)
            {
                // log errors quietly 
                Utils.WriteMyLog(Utils.GetErrorInfo(ex));
            }
        }
 
 
        /// <summary>Builds the replacement breadcrumb HTML</summary>
        /// <param name="folder">The SPFolder to create navigation to</param>
        /// <returns>The HTML output</returns>
        private string GetReplacementBreadCrumbOutput(SPFolder folder)
        {
            List<BreadcrumbNodeData> nodes = new List<BreadcrumbNodeData>();
 
            // Collect a path from current folder to its root folder
            SPFolder nodeFolder = folder;
            while (nodeFolder != null)
            {
                // If we're in a folder, use the folder name as the title. If not, use the library title instead.
                BreadcrumbNodeData node  = new BreadcrumbNodeData();
                node.Url = nodeFolder.ServerRelativeUrl;
                if (string.IsNullOrEmpty(nodeFolder.ParentFolder.Url))
                {
                    if (nodeFolder.DocumentLibrary != null)
                        nodes.Add(new BreadcrumbNodeData
                                      {Title = nodeFolder.DocumentLibrary.Title, Url = nodeFolder.ServerRelativeUrl});
                }
                else
                {
                    nodes.Add(new BreadcrumbNodeData { Title = nodeFolder.Name, Url = nodeFolder.ServerRelativeUrl });
                }
 
                nodeFolder = string.IsNullOrEmpty(nodeFolder.ParentFolder.Url) ? null : nodeFolder.ParentFolder;
            }
 
            // Reverse the collected path because the root folder must be on the left in the breadcrumb
            nodes.Reverse();
 
            // Create HTML output similar to the original. The arrow image was created from the original.
            string htmlOutput = String.Empty;
 
            foreach (var node in nodes)
            {
                if (node != nodes.Last())
                    htmlOutput +=
                        String.Format(
                            @"<A href=""{0}"">{1}</A> <IMG style=""vertical-align:middle"" alt="""" src=""/_layouts/images/MyDocLibraryBreadcrumb/breadcrumb_arrow.png""/> ", node.Url, node.Title);
                else
                {
                    htmlOutput += node.Title;
                }
            }
 
            return htmlOutput;
        }
    }
 
 
    /// Temporary class to hold navigation node data
 
    public class BreadcrumbNodeData
    {
        /// Title for URL (it will be a folder name)
        public string Title { get; set; }
 
        /// Url to navigate on click (it will be a server relative URL of the folder)
        public string Url { get; set; }
    }
 
    public class Utils
    {
        public static string GetErrorInfo(Exception ex)
        {
            string result = "ex.Message=" + ex.Message;
            result += ex.InnerException == null ? "|ex.StackTrace=" + ex.StackTrace : String.Empty;
 
            if (ex.InnerException != null)
                result += "[INNER EXCEPTION: ]" + GetErrorInfo(ex.InnerException);
 
            return result;
        }
 
        public static void WriteMyLog(string text)
        {
            SPDiagnosticsService.Local.WriteTrace(0,
                                                  new SPDiagnosticsCategory("MyDocLibBreadcrumbDelegateControl", TraceSeverity.High,
                                                                            EventSeverity.Error),
                                                  TraceSeverity.High, text, null);
        }
 
 
 
        /// Finds a control based on provided criteria inside controls hierarchy
 
        /// <param name="controls">A Controls collection to start search</param>
        /// <param name="criteria">A criteria to return control</param>
        /// <returns>The found control, or null if no control matches</returns>
        public static Control FindRecursive(ControlCollection controls, Func<Control, bool> criteria)
        {
            foreach (Control control in controls)
            {
                if (criteria(control))
                    return control;
 
                var innerControl = FindRecursive(control.Controls, criteria);
                if (innerControl != null)
                    return innerControl;
            }
 
            return null;
        }
    }
}

Fixing Checked Out Files


I ran into a challenge this evening with a monstrously large library containing 5,000+ folders and 47,000+ files, many of them checked out.

What to do?

Firstly, things don’t work correctly until the list view threshold is temporarily lifted. Once that is done, we can iterate through the files, take over the checked-out ones, and force the check-in.

Here’s how:

$root = Get-SPWeb "http://SharePoint/sites/site"
$lib = $root.Lists["LibraryName"]
$x = $lib.CheckedOutFiles
$count = $x.Count
for ($i=$count-1; $i -ge 0; $i--)
{
    $CheckedItem = $x.Get_Item($i)
    $CheckedItem.TakeOverCheckOut()

    $libItem = $lib.GetItemById($CheckedItem.ListItemId)
    $libItem.File.CheckIn("")
    Write-Host -NoNewline "."
}

Custom SQL Reporting in MS-Project Server 2013


It is easy to navigate the database schema in MS-Project Server to generate reports. The SQL can be embedded in an ODC, or used within PowerPivot. If joining Task and Project data, there’s the challenge of rollups. The first challenge is avoiding double-counting from summary tasks. The solution is to exclude them in the join by adding this condition:

where TaskIsSummary=0

The next source of double-counting is external tasks: those exposed through cross-linking tasks in separate projects. We can exclude both this way:

where TaskIsSummary=0 and TaskIsExternal = 0

The next problem is that when merging task and project tables, project values would roll up incorrectly. However, such numeric fields can be pro-rated to the project work, as long as we avoid divide-by-zero errors. Here’s how, referencing a custom field called “Budgeted Costs”; note how its value is made proportionate to the task work:

, case
 when [MSP_EpmProject_UserView].[Budgeted Costs] = 0 THEN 0
 when MSP_EpmTask_UserView.TaskRegularWork = 0 THEN 0
 when MSP_EpmProject_UserView.ProjectWork = 0 THEN 0
 else
 [MSP_EpmProject_UserView].[Budgeted Costs] * ( MSP_EpmTask_UserView.TaskRegularWork/ MSP_EpmProject_UserView.ProjectWork )
 END as [Budgeted Costs]
 
FROM dbo.MSP_EpmProject_UserView INNER JOIN dbo.MSP_EpmTask_UserView
ON MSP_EpmProject_UserView.ProjectUID = MSP_EpmTask_UserView.ProjectUID
where TaskIsSummary=0 and TaskIsExternal = 0
ORDER BY MSP_EpmProject_UserView.ProjectName, MSP_EpmTask_UserView.TaskIndex, MSP_EpmTask_UserView.TaskName
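The guard-then-multiply pattern in the CASE expression above can be restated outside SQL. Here is an illustrative Python sketch (function name and sample numbers are hypothetical) showing how the pro-rating allocates a project-level value proportionally to task work while avoiding divide-by-zero:

```python
def prorate(project_value, task_work, project_work):
    """Pro-rate a project-level value down to a task, mirroring the
    CASE expression: any zero operand short-circuits to 0."""
    if project_value == 0 or task_work == 0 or project_work == 0:
        return 0
    return project_value * (task_work / project_work)

# A 25h task within a 100h project receives a quarter of the budgeted cost:
print(prorate(10000, 25, 100))  # 2500.0
```

Summing the pro-rated values across all non-summary, non-external tasks of a project reproduces the project-level total, which is exactly why the rollup no longer double-counts.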

One step further, we can do the same using task assignment data; here’s what that looks like using the assignment work:

, case
 when [MSP_EpmProject_UserView].[Budgeted Costs] = 0 THEN 0
 when MSP_EpmAssignment_UserView.AssignmentWork = 0 THEN 0
 when MSP_EpmProject_UserView.ProjectWork = 0 THEN 0
 else
 [MSP_EpmProject_UserView].[Budgeted Costs] * ( MSP_EpmAssignment_UserView.AssignmentWork/ MSP_EpmProject_UserView.ProjectWork )
 END as [Budgeted Costs]
 
,[MSP_EpmResource_UserView].[Cost Type]
 
,[MSP_EpmResource_UserView].[Resource Departments]
,[MSP_EpmResource_UserView].[RBS]
,[MSP_EpmResource_UserView].[Resource Title]
FROM dbo.MSP_EpmProject_UserView
INNER JOIN dbo.MSP_EpmTask_UserView
  ON MSP_EpmProject_UserView.ProjectUID = MSP_EpmTask_UserView.ProjectUID
LEFT OUTER JOIN dbo.MSP_EpmAssignment_UserView
  ON MSP_EpmTask_UserView.TaskUID = MSP_EpmAssignment_UserView.TaskUID
  AND MSP_EpmTask_UserView.ProjectUID = MSP_EpmAssignment_UserView.ProjectUID
LEFT OUTER JOIN dbo.MSP_EpmResource_UserView
  ON MSP_EpmAssignment_UserView.ResourceUID = MSP_EpmResource_UserView.ResourceUID
where TaskIsSummary=0 and TaskIsExternal = 0
ORDER BY MSP_EpmProject_UserView.ProjectName, MSP_EpmTask_UserView.TaskIndex, MSP_EpmTask_UserView.TaskName

How to delete all SharePoint webs in a Site Collection, except the root web

It is pretty useful and simple to delete all webs in a site collection except the root web. This worked for me because there was no nesting. Even with nesting, and with the pipeline returning SPWebs in a suboptimal sequence, running this a few times will at a minimum wipe the leaf nodes, whittling you down to a clean root web.

Get-SPSite http://SharePointSiteURL | Get-SPWeb -Limit All | Where-Object {$_.IsRootWeb -ne $true} | % {$_.Delete()}

Assigning an Application Pool to a SharePoint Web App

The easy/obvious way is the wrong way. The trick is to reassign the app pool to the web app so SharePoint knows about it. That way, on a reconfiguration, things don’t get configured incorrectly.
To do it correctly, we’ll leverage this PowerShell snippet:
To do it correctly, we’ll leverage this PowerShell snippet:

$sourceWebAppPool = (Get-SPWebApplication < URL of an existing webapp application pool you want to use >).ApplicationPool
$webApp = Get-SPWebApplication < URL of the web application you want to update >;
$webApp.ApplicationPool = $sourceWebAppPool
$webApp.ProvisionGlobally()
$webApp.Update()
iisreset

If you do not have a web application already using the desired application pool, you have the option to create a new application pool to use:

Import-Module WebAdministration
$appPool = New-WebAppPool "My new App Pool"

However, I initially could not find a cmdlet for doing a “Get-WebAppPool”; my friend Brett Parker suggested:

$appPool = [Microsoft.SharePoint.Administration.SPWebService]::ContentService.ApplicationPools | Where {$_.Name -eq "NameOfAppPool" }

Programmatically Targeting Audiences for Audience Enabled Libraries in SharePoint

Overview

Targeting an audience for a given document is a great capability within SharePoint. There’s a simple Document Library setting to enable Audience Targeting.

This enables a dedicated Audience Targeting field to be configured per document. I customized a Content Query Web Part (CQWP) that honors audience targeting, while displaying the fields in a grid view, enabling a targeted and custom portal home page. While the CQWP is harder to work with than a Data View Web Part, CQWP is the primary way to leverage Audience targeting.

Additional Read

Set an Email alert for Document library or List for SharePoint 2013

 

Reporting

I prefer to first output a CSV of all documents, with one column for Audience. Note I also output the absolute URL. Here’s a function to do it; note the use of Start-SPAssignment and Stop-SPAssignment for proper memory management:

$LineHeader= 'Site,Lib,URL,ID,Versions,Name,Title,Created,Modified By,Modified,Audiences'
$LineHeader | Out-file -Filepath $ReportFile
 
function generate-VerReport ($WebUrl, $ListName)
{
    $ac=Start-SPAssignment
    $web = $ac | Get-SPWeb $WebUrl
    $list = $web.Lists[$ListName]
    $VerOut=$null;
 
    Write-Host "+" -NoNewline #each plus is a full look through all Docs in a library
    $xSit=$WebUrl;
    $xLib=$ListName;
 
        #Go through each item in the list
        $Items = $list.Items
        $ItemCount = $Items.count;
        for ($i=0; $i -lt $ItemCount; $i++)
        {
            #Gather the fields of interest for this item
            $item=$items[$i];
 
            $sep='","';
            $xURL = $Item["EncodedAbsUrl"]
            $RawAud = $item["Target Audiences"];
            $xNam=$item['Name']
            $xTit=$item['Title']
            $xCre=$item['Created']
            $xEdi=$item["Editor"]
            $xMod=$item["Modified"]
            $xID= $item.ID
            $xVer= $item.Versions.count;
            $Line1='"'+$xSit+$sep+$xLib+$sep+$xURL+$sep+$xID+$sep+$xVer+$sep+$xNam+$sep+$xTit+$sep+$xCre+$sep+$xEdi+$sep+$xMod+$sep+$RawAud+'"';
            $Line1 | Out-file -Filepath $ReportFile -Append
            #$LineHeader+= '`r`n'
            #$VerOut+=$Line1;
 
        }
 
 
  $web.Dispose()
 
   $ac | Stop-SPAssignment
}

Here’s the function call:

generate-VerReport -weburl $web.url -listname $JPlib

The output enables an end user to modify the CSV to specify the audience per document.
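As an aside, the hand-rolled quote-and-separator assembly in the function above is fragile when a field value itself contains a comma or quote; a CSV writer escapes those automatically. An illustrative Python sketch (function name and sample values are hypothetical):

```python
import csv, io

def report_row(site, lib, url, doc_id, versions, name, title,
               created, editor, modified, audiences):
    """Emit one report line with correct CSV quoting; embedded commas
    and quotes in field values are handled automatically."""
    buf = io.StringIO()
    csv.writer(buf).writerow([site, lib, url, doc_id, versions, name,
                              title, created, editor, modified, audiences])
    return buf.getvalue().strip()

print(report_row("http://sp/sites/hr", "Policies",
                 "http://sp/sites/hr/Policies/a.docx", 12, 3, "a.docx",
                 "Policy, v2", "2014-01-01", "Smith", "2014-02-01", ";;;;HR Readers"))
```

Note how the title "Policy, v2" comes out quoted, so the embedded comma does not shift the remaining columns.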

Additional Read

Mirroring SharePoint Document Library Contents to Multiple Sites

Audience Shuffling

I created a script to apply the Audiences specified per document in the control spreadsheet. The Audiences are expected to be in semicolon-separated format (multiple values possible). There is no error checking on the spelling of Audiences. It turns out Document Audiences in SharePoint are weakly typed; the value is simply a string. In this script we only apply SharePoint Groups, which appear comma-separated after four semicolons. If an AD group is used, it appears nestled between two pairs of semicolons, as in this example:
;;CN=GroupName Membership,OU=Security Groups,OU=Information Technology,DC=MyDomain,DC=com;;
Under the covers, SharePoint will accept not just AD groups and SharePoint groups, but also UPS (User Profile Service) Audiences, specified as GUIDs. Pairs of semicolons are used as delimiters between the three sections (beats me why, but this is the first time I’ve seen this convention). Here’s how it is structured:
[GUIDs, comma separated];;[AD group LDAP paths separated by line breaks];;[SharePoint security group names, comma separated]
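To make the delimiter convention concrete, here is an illustrative Python sketch (the function name is hypothetical) that assembles the weakly-typed string from its three sections. With SharePoint groups only, the two empty leading sections collapse into the "four semicolons" form:

```python
def build_audience_field(guids=(), ad_groups=(), sp_groups=()):
    """Assemble the Target Audiences string:
    [GUIDs, comma separated];;[AD group LDAP paths];;[SP group names, comma separated]"""
    return ";;".join([",".join(guids), "\n".join(ad_groups), ",".join(sp_groups)])

# SharePoint groups only -> four leading semicolons, then comma-separated names:
print(build_audience_field(sp_groups=["Finance Readers", "HR Readers"]))
# ;;;;Finance Readers,HR Readers
```

With only an AD group supplied, the same function yields the ;;CN=...;; shape shown in the example above.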

Here’s the script to apply the Audiences. It assumes all are SharePoint Groups, but is easily extensible to support AD groups if desired. Note it does not check the validity of the SharePoint groups. Note the “Action” column determines whether to “D”elete or “A”ssign audiences:

Add-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue
Start-SPAssignment -Global
$ShuffleSource = "C:\Users\plautj\Documents\PowerShell\AudienceShuffle.txt"
$siteUrl = "http://SharePoint"
 
Write-Host "script starting $(get-date)"
$myheader = "STARTING: $(get-date)"
 
$ShuffArr = Import-Csv -Path $ShuffleSource -Delimiter "`t"
 
$ShuffCount = $ShuffArr.count;
 
for ($i=0; $i -lt $ShuffCount; $i++)
{
        $Row = $ShuffArr[$i]
 
        if (($Row.action -ne $null) -and ($Row.action.length -ne 0))
        {
        $docurl = $row.url;
        $site = New-Object Microsoft.SharePoint.SPSite($docurl)
        $web = $site.OpenWeb()
        $item = $web.GetListItem($docurl)
        $list = $item.ParentList
 
        if ($Row.action -eq "D")
        {
            $item["Target Audiences"] = $null;
            $item.SystemUpdate();
        }
        elseif ($Row.action -eq "A")
        {
 
            #The actual Target Audience property has four semicolons, followed by comma-delimited SharePoint groups.
            #For an AD group, the semicolons work like this: ;;CN=ADW Membership,OU=Security Groups,OU=Information Technology,DC=DOMAIN,DC=com;;
            if ($Row.Audiences.Length -gt 0) #ignore if empty
            {
                $AudBuilder = ";;;;"
                $AudArr = $Row.Audiences.Split(";");
                for ($ai=0; $ai -lt $AudArr.Count; $ai++)
                {
                    if ($ai -gt 0)
                    {
                        $AudBuilder = $AudBuilder + ","
                    }
                    $AudBuilder = $AudBuilder + $AudArr[$ai]
                }
                $item["Target Audiences"] = $AudBuilder;
                $item.SystemUpdate();
 
            } #IF Audiences is not null
 
        } #Add action
 
        } #action exists
} #loop of rows
 
Write-Host "script finishing $(get-date)"
 
Stop-SPAssignment -Global
###########################


Getting PowerPivot working in SharePoint

PowerPivot with SharePoint Diagnostics

There are a lot of moving parts to PowerPivot. Here are some common causes of PowerPivot not working within SharePoint:

No associated Service Application

I list this first because I spent way too much time finding this one. There is a Service Application created in support of PowerPivot. If your web application doesn’t have a proxy association to this Service Application, you will get a generic connection error when trying to view a PowerPivot spreadsheet within Excel Services.

Data Connection Library trust

If you use an external (i.e., ODC) data connection in a library, that library needs to be trusted.

Trusted file location

The location where the PowerPivot spreadsheet is located needs to be trusted.

Install the appropriate SSAS OLAP library

For SQL Server 2008 R2 SP1, the correct library is MSOLAP.4; for SQL Server 2012, the correct library is MSOLAP.5. If you open the ODC data connection referenced by your PowerPivot spreadsheet, you will find a reference to this library at the beginning.

Ensure SSAS is running under a domain account

There are two solutions, and both need to be deployed. Here’s the PowerShell:

Add-SPSolution -LiteralPath "C:\Program Files\Microsoft SQL Server\110\Tools\PowerPivotTools\ConfigurationTool\Resources\PowerPivotFarm.wsp"
Install-SPSolution -Identity PowerPivotFarm.wsp -GACDeployment -Force
 
 
$centralAdmin = $(Get-SPWebApplication -IncludeCentralAdministration | Where { $_.IsAdministrationWebApplication -eq $TRUE})
Add-SPSolution -LiteralPath "C:\Program Files\Microsoft SQL Server\110\Tools\PowerPivotTools\ConfigurationTool\Resources\PowerPivotWebApp.wsp"
Install-SPSolution -Identity PowerPivotWebApp.wsp -GACDeployment -Force -WebApplication $centralAdmin

If you want to check the deployment status, try this command:

Get-SPSolution "powerpivotfarm.wsp" | Format-List

As background, there is a deployment job triggered by deploying the solution. This job is managed under the Timer Job Service, which you can check on:

Get-SPTimerJob -Type Microsoft.SharePoint.Administration.SPSolutionDeploymentJobDefinition | Format-List

Next, go into Farm Solutions in Central Admin, and deploy both solutions. What do these solutions do?

The Powerpivotfarm.wsp solution does the following:

  • Adds Microsoft.AnalysisServices.SharePoint.Integration.dll to the global assembly cache.
  • Adds Microsoft.AnalysisServices.ChannelTransport.dll to the global assembly cache.
  • Installs features and resource files, and registers content types.
  • Adds library templates for PowerPivot Gallery and Data Feed libraries.
  • Adds application pages for service application configuration, PowerPivot Management Dashboard, data refresh, and PowerPivot Gallery.

The Powerpivotwebapp.wsp solution does the following:

  • Adds Microsoft.AnalysisServices.SharePoint.Integration.dll resource files to the web server extensions folder on the Web front-end.
  • Adds the PowerPivot Web service to the Web front-end.
  • Adds thumbnail image generation for PowerPivot Gallery.

Ensure OLE DB providers are available

If you are running SQL Server 2008 R2 SP1, then the OLE DB provider should have been installed when SharePoint was installed. You can find it in the GAC as “microsoft.analysisservices.xmla.dll”. The GAC is in C:\windows\assembly.

SQL Server 2008 R2: MSOLAP100.dll; within the data connection string it is called MSOLAP.4, and the version (when you examine the properties of the file in the GAC) is 10.50.1600 or later.

SQL Server 2012: MSOLAP110.dll; within the data connection string it is called MSOLAP.5, and the version (when you examine the properties of the file in the GAC) is 11.00.0000 or later.

Ensure the OLE DB provider is trusted by Excel Services

Check within Central Admin, Manage Service Applications, Excel Services, that Trusted Data Providers lists MSOLAP.4 and/or MSOLAP.5, depending on which version of SQL Server you are using.

If it is not listed, click Add Trusted Data Provider, and in the Provider ID, type MSOLAP.5. For Provider Type, ensure that OLE DB is selected. In Provider Description, type Microsoft OLE DB Provider for OLAP Services 11.0.

Ensure CTWTS is started

The Claims to Windows Token Service is required to communicate and authenticate seamlessly with SQL Server.  This service has to be started.  In Central Admin, go to Services on Server, and start it up.

Ensure CTWTS is running as LOCAL

The Claims to Windows Token Service needs to run using the LOCAL builtin account.  You can see this by starting services.msc on each SharePoint server.

Ensure Secure Store Service is configured

This Service Application needs to be configured (it has its own database), running, associated with your web application, with an Application ID defined that maps to an AD account.  I prefer configuring as a “Group” entry.  You’ll need to enter credentials for this Application ID, and also grant permissions to the users who will reference the Secure Store Application ID, which is set within the data connection for the Authentication configuration.

Ensure Excel Services is running and associated

PowerPivot makes use of Excel Services.  This Service Application must be configured, and associated with your web application.  The data connection library (if used) and file location must be trusted.  Excel Services should also be configured to allow PowerPivot workbooks of sufficient size to render.  Note I have seen a bug in early versions of SharePoint 2013 where increasing the maximum workbook size does not allow larger workbooks to render.

Ensure XLSX renders in Excel Services

This one is a real “gotcha” if you are running SharePoint 2013 with an Office Web Apps farm. By default Office Web Apps renders spreadsheets, and it cannot handle external data connections. If the URL contains “_layouts/15/WopiFrame.aspx” then the spreadsheet is being rendered by Office Web Apps. The tricky part is that, other than the URL, Office Web Apps rendered spreadsheets look virtually identical to the Excel Services rendered equivalent. Under Excel Services the URL would instead contain “_layouts/15/xlviewer.aspx”. The fix for this is a single PowerShell command:

New-SPWOPISuppressionSetting -Extension XLSX -Action View

Ensure Analysis Services Account can act as O/S

The Service account running the PowerPivot dedicated Analysis Services on the SharePoint server can fail to refresh unless the following steps are taken:

  • Run “secpol.msc”
  • Click Local Security Policy, then click Local Policies, and then click User Rights Assignment.
  • Add the service account to “Act as part of the operating system”.

Check the ULS

When all else fails, the ULS can be a lifesaver. While often cryptic or hyperventilating about some minutia, it can still very clearly indicate the source of a problem.

How to write a SharePoint File to disk

It is quite common: there is a file in SharePoint, and you want to see it written to a local disk. How to do that in PowerShell? Here’s how:

$site = New-Object Microsoft.SharePoint.SPSite($SharePointLocation)
$web = $site.OpenWeb()
$item = $web.GetListItem($SharePointLocation)
$fileBytes = $item.File.OpenBinary()
$stream = New-Object System.IO.FileStream($LocalDiskFileLocation, [System.IO.FileMode]::Create)
$writer = New-Object System.IO.BinaryWriter($stream)
$writer.Write($fileBytes)
$writer.Close()

Additional Read
SharePoint Document IDs

Generating alternate colors in an email HTML table generated by PowerShell in SharePoint

Generating rich text emails with PowerShell

In this post: Generating automatic emails with embedded reports and a link to an uploaded CSV,
we explored sending emails and attachments with HTML tables. Now let’s have a look at alternating row colors in the HTML.

This function will be used later to inject CSS reference to alternating colors into the HTML Table:

Function Set-AlternatingRows {
 
    [CmdletBinding()]
    Param(
        [Parameter(Mandatory=$True,ValueFromPipeline=$True)]
        [string]$Line,
 
        [Parameter(Mandatory=$True)]
        [string]$CSSEvenClass,
 
        [Parameter(Mandatory=$True)]
        [string]$CSSOddClass
    )
    Begin {
        $ClassName = $CSSEvenClass
    }
    Process {
        If ($Line.Contains("<tr>"))
        {   $Line = $Line.Replace("<tr>","<tr class=""$ClassName"">")
            If ($ClassName -eq $CSSEvenClass)
            {   $ClassName = $CSSOddClass
            }
            Else
            {   $ClassName = $CSSEvenClass
            }
        }
        Return $Line
    }
}
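The same row-toggling idea, sketched in Python for illustration (function name and sample markup are hypothetical): each line containing a <tr> gets the next class from an alternating cycle, just as Set-AlternatingRows does with the state it carries between Process calls:

```python
import itertools

def set_alternating_rows(html_lines, even="even", odd="odd"):
    """Tag each <tr> line with an alternating CSS class."""
    classes = itertools.cycle([even, odd])
    # next(classes) is only consumed for lines that actually contain a row
    return [line.replace("<tr>", '<tr class="%s">' % next(classes))
            if "<tr>" in line else line
            for line in html_lines]

rows = ["<table>", "<tr><td>one</td></tr>", "<tr><td>two</td></tr>", "</table>"]
print(set_alternating_rows(rows)[1])  # <tr class="even"><td>one</td></tr>
```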

Let’s now define $a as the styles, including the style for alternating rows, which we will use when generating the HTML table:

$a = "<style>"
$a = $a + "TABLE{border-width: 1px;border-style: solid;border-color:black;}"
$a = $a + "Table{background-color:#EFFFFF;border-collapse: collapse;}"
$a = $a + "TH{border-width:1px;padding:5px;border-style:solid;border-color:black;background-color:#DDDDDD}"
$a = $a + "TD{border-width:1px;padding-left:5px;padding-right:3px;border-style:solid;border-color:black;}"
$a = $a + ".odd { background-color:#ccddee; }"
$a = $a + ".even { background-color:#eeeeff; }"
$a = $a + "</style>"

In generating the HTML table, note the reference to the above function in the Pipeline:

$MyOutput = $table | ConvertTo-Html Title, Link, Author, Size, FileName, Modified -head $a -body "$($rows.count) $($Header)" | Set-AlternatingRows -CSSEvenClass even -CSSOddClass odd | Convert-HTMLEscape

As a refresher, we use this function to handle HTML escaping so the HTML comes out delightfully formatted:

Function Convert-HTMLEscape {
<#
convert &lt; and &gt; entities back to < and >. It is assumed that these will be in pairs.
#>
 
[cmdletbinding()]
 
Param (
[Parameter(Position=0,ValueFromPipeline=$True)]
[string[]]$Text
)
 
Process {
foreach ($item in $text) {
    if ($item -match "&lt;") {
 
        (($item.Replace("&lt;","<")).Replace("&gt;",">")).Replace("&quot;",'"')
     }
     else {
        #otherwise just write the line to the pipeline
        $item
     }
 }
} #close process
 
} #close function

Designing your SharePoint content hierarchy

Key SharePoint Limits

Microsoft publishes a long list of soft and hard boundaries and limits within SharePoint. These have not changed significantly between 2010 and 2013. The definitive list can be found at: https://technet.microsoft.com/en-us/library/cc262787.aspx

An example of a hard boundary limit is the 2GB limit on file size. This limit is based on Operating System and SQL Server limits, and involves no trade-off options. However, it is not advisable to allow 2GB uploads without first considering the options and tradeoffs involved, as well as end-user performance, and configuration settings such as BlobCache that can alleviate some of the delays associated with huge files.

An example of a soft limit is the List View Threshold of 5,000, which prevents any list from trying to display more than 5,000 items. Even trying to display 5,001 will return an error. This setting is easily changed, but the impact can be significant, as the underlying SQL Server changes from row locking to table locking on operations touching over 5,000 items. This is an example of trading off performance and scalability against any usability benefits of increasing this setting. There are situations where it may make sense, such as a farm with a disproportionately low number of concurrent users, making table locking an acceptable trade-off given the low likelihood of another user being concurrently impacted. For a larger number of concurrent users, however, you definitely do not want to increase this threshold, as doing so will cause likely and noticeable performance degradation.

While there are a large number of limits, the ones that are most likely to guide your design are the two listed above, plus Security Scopes and Content Database size.

Security Scopes in a Library

Each time an object (web, library, folder or item) gets unique security permissions, that is termed a “Security Scope”. If 100 documents are assigned unique permissions, but have the same three people as contributors, this still counts as 100 Security Scopes. If they are each assigned 200 Contributors, that is still only 100 Security Scopes. A rule of thumb is that once you reach 1,000 unique Security Scopes, performance degrades 20%. Increase the number further, and performance nosedives. If you are going to have large libraries, it is best to have few or no unique Security Scopes. This is purely a performance impact, and the result is noticeable.
Content Database Size

This has the most direct impact on your DR strategy. The limits of your hardware (bounded by the laws of physics) determine the speed of your backups and restores. The larger the Content Database, the longer your backup and recovery. A prime example is restoring a single Content Database. The scenario: a user “stomps” on a document, replacing it completely, using Explorer Mode. You’ve got to restore from a backup; you start the restore, get ready to do an unattached database extract, and wait for the restore to complete. However, a large Content Database of 200GB or more requires the better part of a day to restore, or worse, you have insufficient spare disk space for the restore.

URL Length

Aside from the published limits, there’s a crucial limit that is not listed on the SharePoint Boundaries and Limits site: the maximum URL length of 255 characters. That might sound like a long URL, but once you use an FQDN (Fully Qualified Domain Name) for your Web Application, a Managed Path, Sites and SubSites, Libraries and Nested Folders, enterprising users will use longer file names than you’d imagine, sprinkling them with characters not easily represented in a URL (such as blanks), which get expanded to %20 (hex of ASCII 32, a blank). This limitation is insidious. Your design could appear to work fine…until users encounter seemingly random problems, where only some files have issues. Renaming files and folders tersely can buy some relief, replacing blanks with underscores; but getting users to adopt CamelCase is a tough sell for end users used to naming files as they want.
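To see how quickly that ceiling approaches, here is an illustrative Python sketch (the server name and folder names are hypothetical) that percent-encodes each path segment, expanding every blank to %20, and reports the resulting URL length:

```python
from urllib.parse import quote

def encoded_url_length(server, path_parts):
    """Build a server-relative URL, percent-encoding each segment
    (blanks become %20), and return its length alongside the URL."""
    url = server + "/" + "/".join(quote(p) for p in path_parts)
    return len(url), url

n, url = encoded_url_length("https://portal.contoso.com",
    ["sites", "dept", "Shared Documents",
     "Quarterly Planning 2024", "Final Budget Review Notes.docx"])
print(n, url)
```

Each blank costs three characters once encoded, so a few descriptive folder names under a managed path can consume a surprising share of the 255-character budget.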

Impact of Design Decisions

Design decisions can be made implicitly that have a great impact on scalability, performance and DR. Let’s summarize these key decisions:

  • Web Applications
    The fewer the better. Aside from the untouchable Central Admin Web Application, and separate MySites, try to limit your Web Applications. Each Web Application carries not only its own configuration, but also a whole raft of dedicated Timer Jobs firing multiple jobs every second, and additional IIS load, even if you share an Application Pool among multiple Web Applications.
  • Managed Paths
    These are defined at the Web Application level, and provide the structure for additional Site Collections.
  • Host Named Site Collections
    Aside from perhaps Search, this is the largest infrastructure change in SharePoint 2013. It’s geared to allow for cloud based multi-tenancy, and scalability beyond what can be done with the relatively structured Managed Paths. Best is to select one Web Application, and use that for all Host Named Site Collections.
  • Number of Site Collections
    Some users feel comfortable within the confines of a single site collection. There are certainly conveniences within a site collection, including shared security groups, shared content types, and an immediate navigable hierarchy. Going with monolithic Site Collections limits your ability to scale across Content Databases, as a Site Collection can only ever exist within one Content Database. Once a Site Collection has grown too large, splitting it can be fraught with challenges. A large Content Database takes proportionately longer to backup, and more importantly, to recover.
  • Number of Sites
    A more diverse set of Sites (SPWeb objects, not to be confused with Site Collections) and their hierarchy requires thoughtful design to echo how users typically navigate SharePoint. A greater number of sites is recommended, which in turn should allow for scalability.
  • Number of Libraries
    Huge libraries suffer a proportionate performance impact, and expose the possibility of errors if a View returns a result set larger than the List View Threshold (default and recommended setting of 5,000). It is better to guide users to replace a desired set of top-level folders with libraries instead.

Going with a more diverse yet coherent and logical structure of Site Collections, Sites, and Libraries will allow for the scalability your farm needs to satisfy daily user needs, as well as growth objectives.

A good approach when working with users is to play down the names (site collection, site, library and folder) and focus instead on the hierarchy, which users understand. You can always tune navigation to give them an optimal experience.
