Clearing out all SharePoint document versions for a set of documents

Sometimes users inadvertently create many versions of a document. An example is a PDF whose properties have been edited frequently. SharePoint 2013 has Shredded Storage, which stores only the deltas between versions (differential save), but in SharePoint 2010 these versions can waste a lot of disk storage. Let’s clean it up!

$mylogfile = "L:\PowerShell\ongoinglogfile.txt"  #log file path used below; adjust for your environment
$web = Get-SPWeb "http://SharePoint"
 
   for ($i=0;$i -lt $web.Lists.Count;$i++)
   { 
   $JPLib=$web.Lists[$i];
   $A_Lib_Count++;
   $SkipLib=$true; #true
    
   if ( ($JPlib.BaseType -ne "DocumentLibrary") -or ($JPlib.hidden) )
    {
      # forget the rest and return to top
      Write-Host -foregroundcolor blue "fast test skipping Library: $($JPlib)";   
      Add-Content $mylogfile "Skipping Library: $($JPlib.title)`n";   
    }
    elseif ($JPLib.Title -Match "SitesAssets|Photo|Image|CustomizedReports|Templates|Pages|Picture|cache|style|Slide")
    {
      # forget the rest and return to top
      Write-Host -foregroundcolor blue "fast test skipping Library because it mentions $Matches: $($JPlib)";   
      Add-Content $mylogfile "Skipping Library: $($JPlib.title)`n";   
    }
    elseif ($JPLib.BaseTemplate -ne "DocumentLibrary")   #alternatively, only skip if -eq XMLForm
    {
      # forget the rest and return to top
      Write-Host -foregroundcolor blue "fast skipping Library because it is not of base DocumentLibrary, it is BaseTemplate:$($JPlib.basetemplate): $($JPlib.title)";   
      Add-Content $mylogfile "fast skipping Library because it is not of base DocumentLibrary, it is BaseTemplate:$($JPlib.basetemplate): $($JPlib.title)`n";   
    }
    elseif (($JPLib.ThumbnailsEnabled) -or ($JPLib.DefaultView -eq "AllSlides"))
    {
      # forget any library with thumbnails, these are not normal doclibs, and return to top
      Write-Host -foregroundcolor blue "fast test skipping Library because it has Thumbnails/Slides $($JPlib)";   
      Add-Content $mylogfile "fast test skipping Library because it has Thumbnails/Slides: $($JPlib)`n";   
    }
    else
    {  $SkipLib=$false; }
 
 
    if (!$SkipLib)
    {
      write-Host -foregroundcolor darkgreen "Processing Library to Merge: $($JPlib)";   
      Add-Content $mylogfile "Processing to merge: $($JPlib)`n"; 
 
       $JPItems = $JPLib.items;
       $JPCount = $JPitems.count;
 
          for ($itemIndex=0; $itemIndex -lt $JPCount; $itemIndex++)
          {
            $JPItem=$JPItems[$itemIndex];
            $FName=$JPItem.file.name;
 
 
             if ($FName -match "\.pdf$") #you choose the criteria; here I match on PDFs (anchored regex, not a bare substring)
            {
                $JPFile=$JPItem.file;
                Write-Host "Deleting $($JPFile.Versions.Count), size of each: $($JPFile.properties.vti_filesize),$($JPitem.url)"
 
                $JPfile.Versions.DeleteAll()
            }
 
             continue; # skip the timestamp/author restoration below for now; $SourceItem and $TargetItem must be defined before enabling it
             
 
             [Microsoft.SharePoint.SPSecurity]::RunWithElevatedPrivileges(
                {
                [System.DateTime]$date = $SourceItem["Modified"]
                $TargetItem["Modified"]=$date;
                try {
                $TargetItem["Editor"] = $SourceItem["Editor"]
                }
                catch
                {
                    write-host -foregroundcolor red "Error could not assign editor of $($targetItem.url)"
                }
 
 
                $TargetItem.update()   
                try
                {  # two deletes required
                    $TargetItem.Versions[1].delete()
                    $TargetItem.Versions[1].delete()
                }
                catch
                {
                    write-host -foregroundcolor red "Warning: could not delete old version of $($targetItem.url)"
                }
 
                })
          }
    }
}


Scripting SharePoint logging to ULS and Event Log

It’s easy to dump output to a text file in a script, but for enterprise-class logging, the two standards are the Event Log and ULS (Unified Logging Service). First, ULS.

Below in PowerShell I grab a reference to the SPDiagnosticsService, define an SPDiagnosticsCategory, then call the WriteTrace() method:

$diagSrc = [Microsoft.SharePoint.Administration.SPDiagnosticsService]::Local
$diacategory = new-object Microsoft.SharePoint.Administration.SPDiagnosticsCategory("MyTestCategory",[Microsoft.SharePoint.Administration.TraceSeverity]::Monitorable, [Microsoft.SharePoint.Administration.EventSeverity]::ErrorCritical)
$diagSrc.WriteTrace(98765, $diacategory, [Microsoft.SharePoint.Administration.TraceSeverity]::Monitorable, "Write your log description here" )

ULS is a good standard central way to go, but let’s move onto writing into the Event Log, which is extra useful given we are first going to create a custom application log:

New-EventLog -LogName MyCustomScripts -Source scripts

The first challenge is that if the log already exists, New-EventLog will throw an error, even if you try encapsulating it in a try/catch. The trick is to leverage the Get-EventLog cmdlet.

First, to see what exists, format as a list:

Get-EventLog -list

You now have your very own Event Log, and can write into it with your own event IDs, messages, and severity levels. Here are two working examples:

Write-EventLog -LogName MyCustomScripts -Source Scripts -Message "trying 4142 it works ... COOL!" -EventId 4142 -EntryType information
Write-EventLog -LogName MyCustomScripts  -Source Scripts -Message "trying 4942 as an error" -EventId 4942 -EntryType error

Now let’s simplify for re-use and consistency. Let’s declare some basics at the top of all scripts:

$Eventing = $true;  #determine if any events are written to the event log
$LogName = "JoelScripts"
$SourceName = "Scripts"
$ScriptID = 3; # unique number per script

Here’s a one-line function to make life simpler for our scripts:

function Write-MyLog([int] $EventID, [string] $Description, [system.Diagnostics.EventLogEntryType] $Severity)
{
    if ($Eventing)
    {
    Write-EventLog -LogName $LogName -Source $SourceName -Message $Description -EventId $EventID -EntryType $Severity
    }
}

Now let’s add a line at the start and end of the scripts to trigger an information event on what’s running. Note that references to $MyInvocation contain information about the currently running script:

Write-MyLog -Description "Start of $($MyInvocation.MyCommand.Path)" -EventId $ScriptID -Severity information
Write-MyLog -Description "End of $($MyInvocation.MyCommand.Path)" -EventId $ScriptID -Severity information

Lastly, here’s a sample normal message evented for warning, and next for error:

Write-MyLog -Description $RunStatus -EventId $ScriptID -Severity Warning
Write-MyLog -Description $RunStatus -EventId $ScriptID -Severity Error

A nice way to write events is to use the “Source” to map back to the script name or some other useful value for filtering. However, sources need to be pre-defined. Here’s how to define a source:

New-EventLog -LogName $LogName -Source "X"

The challenge is when to create this source. I find it’s best to declare the source only if it does not exist:

try
{
$Sources = Get-EventLog -LogName $logname | Select-Object Source -Unique
$found = $false;
foreach ($OneSource in $Sources)
    {
        if ($OneSource.source -eq $Source)
        {
            $found=$true;
        }
    }   
}
catch
{
    Write-Host "cannot find logfile, so we are in deep trouble"
}
 
if (!$found)
{
    New-EventLog -LogName $LogName -Source $Source
    Write-Host "Created new Source $($Source) in log name $($LogName)"
}

How to Fix Bad Taxonomy Terms in SharePoint Automatically

A given document using managed metadata can have a term orphaned from the termset. This can happen due to bad references in the intermediary cached terms list, which exists hidden in each site collection under the list name “TaxonomyHiddenList”. Here’s how to construct its URL: [siteurl]/Lists/TaxonomyHiddenList/AllItems.aspx

While it’s easy to extend this to check the health of each document term, for simplicity let’s imagine we want to fix a single document, where we have the URL and know the name of the taxonomy field. Let’s set the basics before we get started:

$docurl = "http://WebApp/path/lib/doc.xlsx" #URL to correct
$site = New-Object Microsoft.SharePoint.SPSite($docurl)  #grab the SPSite
$web = $site.OpenWeb() #grab the SPWeb
$item = $web.GetListItem($docurl) #grab the SPItem
$targetField = "MMS Field Name" # let's establish the name of the field
$TermValueToReplace = $item[$targetField].label;  #this is the termset value we want to re-assign correctly

Now let’s get a taxonomy session, with proper error detection:

try
{
    Write-Host "getting Tax Session for $($Site.url)..." -NoNewline
    $taxonomySession = Get-SPTaxonomySession -Site $site  #get one session per site collection you are in
    $termStore = $taxonomySession.TermStores[0]  #We need to move the  get Termset to above the lib level too!
    Write-Host "Got Tax Session. "
}
catch
{
    Write-Host "Tax session acquisition problem for $($site.url)"
    $ok=$false;
}

Given the field, let’s get the correct termset.

[Microsoft.SharePoint.Taxonomy.TaxonomyField]$taxonomyField = $item.Fields.GetField($targetField)  
$termSetID=$taxonomyField.TermSetId
 
$termSet = $termStore.GetTermSet($taxonomyField.TermSetId)  #if we ever loop, move to outside item loop, this takes a long time!

Now we do a lookup for the term in the Managed Metadata Service. We expect precisely one match, but we’ll check for that later.

#"true" parameter avoids untaggable terms, like parent term at higher tier that should not be selected
[Microsoft.SharePoint.Taxonomy.TermCollection] $TC = $termSet.GetTerms($TermValueToReplace,$true)

Now let’s populate a TaxonomyFieldValue, assign it to the SPItem, and save it without changing the timestamp or author by using SystemUpdate():

$taxonomyFieldValue = new-object Microsoft.SharePoint.Taxonomy.TaxonomyFieldValue($taxonomyField)
$t1=$TC[0]  #use the first result
$taxonomyFieldValue.set_TermGuid($t1.get_Id())  #this assigns the GUID
$taxonomyFieldValue.set_Label($TermValueToReplace)  #this assigns the value
$taxonomyField.SetFieldValue($Item,$taxonomyFieldValue)  #let's assign to the SPItem
$item.systemupdate()

That’s the meat of it. Let’s put it all together with error handling:

$ok=$true;
$docurl = "http://WebApp/path/lib/doc.xlsx" #URL to correct
 
$site = New-Object Microsoft.SharePoint.SPSite($docurl)
$web = $site.OpenWeb()
$item = $web.GetListItem($docurl)
$targetField = "FieldName"
$TermValueToReplace = $item[$targetField].label;
 
try
{
    Write-Host "getting Tax Session for $($Site.url)..." -NoNewline
    $taxonomySession = Get-SPTaxonomySession -Site $site  #get one session per site collection you are in
    $termStore = $taxonomySession.TermStores[0]  #We need to move the  get Termset to above the lib level too!
    Write-Host "Got Tax Session. "
}
catch
{
    Write-Host "Tax session acquisition problem for $($site.url)"
    $ok=$false;
}
 
[Microsoft.SharePoint.Taxonomy.TaxonomyField]$taxonomyField = $item.Fields.GetField($targetField)  
$termSetID=$taxonomyField.TermSetId
$termSet = $termStore.GetTermSet($taxonomyField.TermSetId)  #Move to outside item loop, this takes a long time!     
[Microsoft.SharePoint.Taxonomy.TaxonomyFieldValue]$taxonomyFieldValue = New-Object Microsoft.SharePoint.Taxonomy.TaxonomyFieldValue($taxonomyField)  
 
[Microsoft.SharePoint.Taxonomy.TermCollection] $TC = $termSet.GetTerms($TermValueToReplace,$true)  #true avoids untaggable terms, like parent company at higher tier
 
if ($TC.count -eq 0)
    {
        Write-Host -ForegroundColor DarkRed "Argh, no Taxonomy entry for term $($TermValueToReplace)"
 
        $ok=$false;
    }
    else
    {
        if ( $TC.count -gt 1)
        {
            Write-Host -ForegroundColor DarkRed "Argh, $($TC.count) Taxonomy entries for term $($TermValueToReplace)"
 
            $ok=$false; #we can't be sure we got the right term!
        }
 
        $taxonomyFieldValue = new-object Microsoft.SharePoint.Taxonomy.TaxonomyFieldValue($taxonomyField)
 
        $t1=$TC[0]
 
 
        $taxonomyFieldValue.set_TermGuid($t1.get_Id())
        #$taxonomyFieldValue.ToString()
        #$targetTC.add($taxonomyFieldValue)
    }
 
try
{
    $taxonomyFieldValue.set_Label($TermValueToReplace)
    $taxonomyField.SetFieldValue($Item,$taxonomyFieldValue)  
}
catch
{
    Write-Host -ForegroundColor DarkRed "Argh, can't write Tax Field for $($Item.url)"
    $ok=$false;
}
 
if ($ok)
{
    $item.systemupdate()
    write-host "Fixed term for item"
}
else
{
    Write-Host -ForegroundColor DarkRed "Did not fix term for item"
}

Each TaxonomyFieldValue has three important properties; these often appear as pipe-separated values:
Label: the label the user selected, from the Labels property of the Term object
TermGuid: the Id (GUID) property of the Term (inherited from TaxonomyItem)
WssId: the ID of the corresponding entry in the site collection’s TaxonomyHiddenList
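
For reference, a single-value taxonomy field serializes these three properties into one string of the form WssId;#Label|TermGuid. A minimal sketch of pulling it apart (the WssId, label, and GUID below are made-up examples):

```powershell
# Hypothetical serialized taxonomy field value: WssId;#Label|TermGuid
$raw = "3;#Finance|1a2b3c4d-5e6f-7a8b-9c0d-1e2f3a4b5c6d"

# Split into the three parts
$wssId, $rest = $raw -split ';#', 2       # "3" and "Finance|1a2b..."
$label, $termGuid = $rest -split '\|', 2  # "Finance" and the GUID

# Note: a WssId of -1 indicates the term has not yet been pushed
# into the site collection's TaxonomyHiddenList.
Write-Host "$wssId / $label / $termGuid"
```

This is handy when eyeballing raw field values while debugging orphaned terms.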

 


Granting SharePoint Shell Administration access

Enabling PowerShell access for a user can be as simple as granting local admin rights plus the following command:

add-spshelladmin "DOMAIN\USER"

Sometimes though there remain Content DBs for which the user doesn’t gain PowerShell access. In this case, the following command pipes in the content DBs and forces the PowerShell access granted:

get-spcontentdatabase | add-spshelladmin "DOMAIN\USER"

There are service application databases as well, and these could be all handled with this single command:

get-spdatabase | add-spshelladmin "DOMAIN\USER"

In the end, what is required is that the user has the securityadmin server role on the SQL instance and the db_owner role in the database.

The securityadmin role is required so that the underlying service accounts get granted the appropriate permissions, say when mounting a content DB.

Also, a user must be a member of the SharePoint_Shell_Access role on the configuration database and a member of the WSS_ADMIN_WPG local group on the server (best done on each server in the farm).
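
To audit the result, here is a quick sketch (assuming an on-prem farm with the SharePoint snap-in available) that lists who currently holds shell access on each database:

```powershell
Add-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue

# For every farm database, list the users registered as shell admins
Get-SPDatabase | ForEach-Object {
    $db = $_
    Get-SPShellAdmin -Database $db.Id | ForEach-Object {
        Write-Host "$($db.Name): $($_.UserName)"
    }
}
```

Running this before and after Add-SPShellAdmin makes it easy to confirm the grant actually reached every content and service application database.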

SharePoint Group Management

SharePoint Groups are a great mechanism for managing user permissions; however, they exist within a single site collection. What if you have hundreds of site collections? We can easily script a range of common operations.

I prefer a CSV-fed approach to managing groups and users. I create a CSV with the name of the group and the users, which I list in pipe-separated format (commas are already being used by the CSV). To read in a CSV use:

Import-Csv "L:\PowerShell\AD and SP group mapping.csv"

Let’s get the Site, Root Web, as well as an SPUser for the group owner, and get the groups object:

$Site = New-Object Microsoft.SharePoint.SPSite($SiteName)
write-host $site.Url
$rootWeb = $site.RootWeb;
$Owner = $rootWeb.EnsureUser($OwnerName)
$Groups = $rootWeb.SiteGroups;

Here’s how to add a Group:

$Groups.Add($SPGroupName, $Owner, $web.Site.Owner, "SharePoint Group to hold AD group for Members")

Here’s how to give the group Read access, for example:

$GroupToAddRoleTo = $Groups[$SPGroupName]
if ($GroupToAddRoleTo) #if group exists
{
   $MyAcctassignment = New-Object Microsoft.SharePoint.SPRoleAssignment($GroupToAddRoleTo)
   $MyAcctrole = $RootWeb.RoleDefinitions["Read"]
   $MyAcctassignment.RoleDefinitionBindings.Add($MyAcctrole)
   $RootWeb.RoleAssignments.Add($MyAcctassignment)
}

Here’s how to add a Member to a Group:

$UserObj = $rootWeb.EnsureUser($userName);
if ($UserObj) #if it exists
{
   $GroupToAddTo.addUser($UserObj)  
}

Note that a duplicate addition of a member is a null-op, throwing no errors.

Here’s how to remove a member:

$UserObj = $rootWeb.EnsureUser($userName);
if ($UserObj)
{
   $GroupToAddTo.RemoveUser($UserObj)  
}

Here’s how to remove a user from the site collection entirely. This wipes the user out of the whole site collection, so use this approach with care and consideration:

$user1 = $RootWeb.EnsureUser($MyUser)
try
{
   $RootWeb.SiteUsers.Remove($MyUser)
   $RootWeb.Update()
}
catch
{
   Write-Host -ForegroundColor Red "Failed to remove $($MyUser) from $($RootWeb.Url)"
}

Here’s the full script, with flags for selecting the specific actions described above:

Add-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue
# uses feedfile to load and create set of SharePoint Groups.
$mylogfile="L:\PowerShell\ongoinglogfile.txt"
$ADMap= Import-Csv "L:\PowerShell\AD and SP group mapping.csv"
$OwnerName = "DOMAIN\sp2013farm"
$AddGroups = $false;
$AddMembers = $false;  # optionally populates those groups, Comma separated list
$GrantGroupsRead = $true; #grants read at top rootweb level
$RemoveMembers = $false; # optionally  removes Comma separated list of users from the associated group
$WipeMembers = $false;  # wipes the groups clean        
$WipeUsersOutOfSite = $false;  #The Nuclear option. Useful to eliminate AD groups used directly as groups
 
 
 #we do not need a hashtable for this work, but let's load it for extensibility
$MyMap=@{}  #load CSV contents into HashTable
for ($i=0; $i -lt $ADMap.Count; $i++)
{
    $MyMap[$ADMap[$i].'SharePoint Group'] = $ADMap[$i].ADGroup;
}
 
# Script changes the letter heading for each site collection
$envrun="Dev"           # selects environment to run in
 
if ($envrun -eq "Dev")
{
$siteUrl = "http://DevServer/sites/"
$mylogfile="L:\PowerShell\ongoinglogfile.txt"
$LoopString = "A,B,C,D,E,F,G,H,I,J,K,L,M,N,O,P,Q,R,S,T,U,V,W,X,Y,Z"
$LoopStringArr = $LoopString.Split(",")
 
}
elseif ($envrun -eq "Prod")
{
$siteUrl = "http://SharePoint/sites/"
$mylogfile="L:\PowerShell\ongoinglogfile.txt"
$LoopString = "A,B,C,D,E,F,G,H,I,J,K,L,M,N,O,P,Q,R,S,T,U,V,W,X,Y,Z"
$LoopStringArr = $LoopString.Split(",")
}
else
{
Write-Host "ENVIRONMENT SETTING NOT VALID: script terminating..."
$siteUrl =  $null;
return;
}
 
Write-Host "script starting"
 
$myheader = "STARTING: $(get-date)"
 
foreach ($letter in $LoopStringArr)
{
    $SiteName=$siteurl+$letter
    $Site = New-Object Microsoft.SharePoint.SPSite($SiteName)
 
    write-host $site.Url
    $rootWeb = $site.RootWeb;
    $Owner = $rootWeb.EnsureUser($OwnerName)
    $Groups = $rootWeb.SiteGroups;
 
    for ($ADi = 0; $ADi -lt $ADMap.count; $ADi++)
    {
        $SPGroupName = $ADMap[$ADi].'SharePoint Group';
 
        if ($AddGroups)
        {
            if (!$Groups[$SPGroupName]) #no exist, so create
            {
                try
                {
                    $Groups.Add($SPGroupName, $Owner, $web.Site.Owner, "SharePoint Group to hold AD group members")
                }
                catch
                {
                    Write-Host -ForegroundColor DarkRed "Ouch, could not create $($SPgroupName)"
                }
            }
            else
            {
                    Write-Host -ForegroundColor DarkGreen "Already exists: $($SPgroupName)"
            }
        } #endif Add Groups
 
            if ($GrantGroupsRead)
        {
            $GroupToAddRoleTo = $Groups[$SPGroupName]
            if ($GroupToAddRoleTo) #if group exists
            {
 
                $MyAcctassignment = New-Object Microsoft.SharePoint.SPRoleAssignment($GroupToAddRoleTo)
                $MyAcctrole = $RootWeb.RoleDefinitions["Read"]
                $MyAcctassignment.RoleDefinitionBindings.Add($MyAcctrole)
                $RootWeb.RoleAssignments.Add($MyAcctassignment)
            } #if the group exists in the first place
        } #ActionFlagTrue
 
        if ($AddMembers)
        {
            $GroupToAddTo = $Groups[$SPGroupName]
            if ($GroupToAddTo) #if group exists
            {
                $usersToAdd = $ADMap[$ADi].ADGroup;
 
                if ($usersToAdd.length -gt 0) #if no users to add, skip
                {
                    $usersToAddArr = $usersToAdd.split("|")
                    foreach ($userName in $usersToAddArr)
                    {
                        try
                        {
                            $UserObj = $rootWeb.EnsureUser($userName);
                            if ($UserObj)
                            {
                                $GroupToAddTo.addUser($UserObj)  #dup adds are a null-op, throwing no errors
                            }
                        }
                        catch
                        {
                        Write-Host -ForegroundColor DarkRed "cannot add user $($userName) to $($GroupToAddTo)"
                        }
 
                    }
                } #users to add
            } #if the group exists in the first place
        } #ActionFlagTrue
 
        if ($RemoveMembers)
        {
            $GroupToAddTo = $Groups[$SPGroupName]
            if ($GroupToAddTo) #if group exists
            {
                $usersToAdd = $ADMap[$ADi].'SharePoint Group';
 
                if ($usersToAdd.length -gt 0) #if no users to add, skip
                {
                    $usersToAddArr = $usersToAdd.split("|")
                    foreach ($userName in $usersToAddArr)
                    {
                        try
                        {
                            $UserObj = $rootWeb.EnsureUser($userName);
                            if ($UserObj)
                            {
                                $GroupToAddTo.RemoveUser($UserObj)  #dup adds are a null-op, throwing no errors
                            }
                        }
                        catch
                        {
                        Write-Host -ForegroundColor DarkRed "cannot remove user $($userName) from $($GroupToAddTo)"
                        }
 
                    }
                } #users to add
            } #if the group exists in the first place
        } #ActionFlagTrue
 
        if ($WipeMembers)  #Nukes all users in the group
        {
            $GroupToAddTo = $Groups[$SPGroupName]
            if ($GroupToAddTo) #if group exists
            {
                    foreach ($userName in @($GroupToAddTo.Users))  #copy the collection so we can remove users while enumerating
                    {
                        try
                        {
                            $UserObj = $rootWeb.EnsureUser($userName);
                            if ($UserObj)
                            {
                                $GroupToAddTo.RemoveUser($UserObj)  #dup adds are a null-op, throwing no errors
                            }
                        }
                        catch
                        {
                        Write-Host -ForegroundColor DarkRed "cannot remove user $($userName) from $($GroupToAddTo)"
                        }
 
                    }
 
            } #if the group exists in the first place
        } #ActionFlagTrue
 
if ($WipeUsersOutOfSite)  #Nukes all users in the group
        {
        $usersToNuke = $ADMap[$ADi].ADGroup;
 
        if ($usersToNuke.length -gt 0) #if no users to add, skip
                {
                    $usersToNukeArr = $usersToNuke.split("|")
                    foreach ($MyUser in $usersToNukeArr)
                    {
                        try
                            {
                                try
                                {
                                    $user1 = $RootWeb.EnsureUser($MyUser)
                                }
                                catch
                                {
                                    Write-Host "x1: Failed to ensure user $($MyUser) in $($Site.url)"
                                }
 
                                try
                                {
                                    $RootWeb.SiteUsers.Remove($MyUser)
                                    $RootWeb.update()
                                }
                                catch
                                {
                                    Write-Host "x2: Failed to remove $($MyUser) from all users in $($Site.url)"
                                }
                           }
                           catch
                           {
                                Write-Host "x4: other failure for $($MyUser) in $($Site.url)"
                           }
                } #foreach user to nuke
            } #if any users to nuke
        } #ActionFlagTrue
 
    }
 
 
    $rootWeb.dispose()
    $site.dispose()
 
} #foreach site


Content Type Syndication tips from the Masters

Content Type Syndication is a must for enterprise-class structured SharePoint document management systems.

There are some limitations to be aware of. A site column has an internal name and a display name; the internal name is set when the Site Column is created and never changes. So if you create a site column called “my data”, its internal name will be an encoded variant (the space gets encoded, as in “my_x0020_data”). Now suppose you have 100 site collections, and one of them already has a field called “my data” of a different type (number vs. text, for example); the Content Type Syndication Hub will then be unable to publish that site column, and will show it in an error report. Within your syndication hub, under Site Collection Administration, there is an entry called “Content type service application error log”, a list of the site columns that could not be published. During the publishing of Content Types there is actually a “pre-import check”, and it is performed on both the visible display name and the cryptic internal site column name. The frustrating part is that when you create a site column, you don’t always know whether that name is already in use somewhere.

When you publish a Content Type, it doesn’t get deployed right away. There’s an hourly batch job for each Web Application. You can go into Central Admin, Monitoring, Timer Jobs, and force the job to run for your web application. It’s called “Content Type Subscriber”. This picks up any recently published Content Types, and “pushes” them down throughout the Site Collection, starting with replicating the Site Columns, Content Types, Information Policies, then pushing them into sites and libraries, if the published Content Type is set to propagate down to lists (that’s a check box in the Content Type).
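
Forcing the job can also be scripted rather than clicked through Central Admin. A sketch, assuming an on-prem farm (the web application URL is a placeholder for yours):

```powershell
Add-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue

$wa = Get-SPWebApplication "http://SharePoint"   # your web application URL
$job = Get-SPTimerJob -WebApplication $wa |
       Where-Object { $_.DisplayName -eq "Content Type Subscriber" }
if ($job)
{
    $job.RunNow()   # queue an immediate run instead of waiting for the hourly schedule
    Write-Host "Queued '$($job.DisplayName)' for $($wa.Url)"
}
```

Matching on DisplayName keeps the script readable; run it once per web application that subscribes to the hub.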

Similarly, within the subscribing Site Collection, in the Site Collection Administration, there’s a “Content type publishing error log” that summarizes issues with propagating Content Types and Site Columns.

When publishing many Content Types, I usually script both the publishing and the running of the subscriber jobs. I find that works extremely well and avoids human error.

On the home page for the Content Type Syndication Hub I always put a link to:

1. Site Columns

2. Content Types

3. The Central Admin Timer Jobs, or a link directly to the main Web Application’s “Content Type Subscriber” Timer Job.

There’s a lot more, but let me know if there are any aspects I can clarify further.

Remote Blob Storage Report

When configuring RBS (Remote Blob Storage), we get to select a minimum blob threshold. Setting this offers a tradeoff between performance and storage cost efficiency.

Wouldn’t it be nice to have a report of the RBS (Remote Blob Storage) settings for all content databases within a farm? Well, here’s a script that reports on each content database, whether it is configured for RBS, and what that minimum blob threshold size is.

$sep="|"
Write-Host "DB Name$($sep)RBS Enabled$($sep)MinBlobThreshold"
Get-SPContentDatabase | foreach {
  try {
      $rbs = $_.RemoteBlobStorageSettings;
      Write-Host "$($_.name)$($sep)$($rbs.enabled)$($sep)$($rbs.MinimumBlobStorageSize)"
      } 
  catch {
  write-host -foregroundcolor red "RBS not installed on $($_.name)!`n"
  Write-Host "$($_.name)$($sep)False$($sep)0"
  }
}

View all Crawled Properties for a given SharePoint Document

View all Crawled Properties for a SharePoint Document

I often need to examine all the properties of a document. This is most useful when researching issues relating to crawled property values.

In this little PowerShell function I grab the SPItem and split its internal XML with line feeds. Here’s the function:

 

Function Get-CrawledPropertyNames([string]$DocURL)
{
    $DocURL = $DocURL.Replace("%20", " ")
    $webfound = $false
    $weburl = $DocURL
 
    # walk up the URL until we find the containing SPWeb
    while (!$webfound)
    {
        if ($weburl.Contains("/"))
        {
            $weburl = $weburl.Substring(0, $weburl.LastIndexOf("/"))
            $web = Get-SPWeb -Identity $weburl -ErrorAction SilentlyContinue
            if ($web -ne $null)
            {
                $webfound = $true
            }
        }
        else
        {
            Write-Host -ForegroundColor Red "The Web could not be found"
            return -1
        }
    }
    # split the item's XML onto a new line after each quoted attribute value
    $web.GetFile($DocURL).item.xml.Replace("' ", "'`n").Replace("`" ", "`"`n")
}

#To use, simply replace with the URL of a file within a document library; here's an example:

#Get-CrawledPropertyNames "http://SharePoint/sites/SPWeb/Library/folder/FileName.DOC"

Adding one field to a SharePoint View in every library in every site

Sometimes one needs to change a SharePoint library view across all libraries and sites. Here’s how to do it. First, loop through sites/webs/libraries and find the target libraries; in this example, I use Content Types being enabled as one of the criteria. Next, I delete the field I want to add, which makes the script safe to re-run: if we didn’t delete the field first, we would end up with a view containing multiple columns for the same field. Note that no Update() method call is required to alter a View.

#adds "Field" to right  in Default View; deletes field first, in case it already exists, then moves the field into position. 
Add-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue
 
$envrun="Prod"          # selects environment to run in
if ($envrun -eq "Dev")
{
# you can always tune Dev to behave differently.  I prefer to make my scripts multi-environment enabled
}
elseif ($envrun -eq "Prod")
{
$siteUrl = "http://SharePoint/"  #use your site collection URL, or choose to loop through all site collections
}
else
{
Write-Host "ENVIRONMENT SETTING NOT VALID: script terminating..."
$siteUrl =  $null;
return;
}
 
Write-Host "script starting"
$myheader = "STARTING: $(get-date)"
 
$SiteName=$SiteURL
$rootSite = New-Object Microsoft.SharePoint.SPSite($SiteName)
 
 $site=$rootSite  #skipping traversing all sites for now
 
   Write-Host -foregroundcolor darkblue "$($site.id) - $($site.Url) - $($site.contentdatabase.id) - $($site.contentdatabase.name)"  
 
   $rootWeb = $site.RootWeb
   $web=$rootWeb;
 
   $JPLists=$web.Lists;
   $JPListsCount=$JPLists.Count
 
   for ($i=0;$i -lt $JPListsCount;$i++)
   { 
   $JPLib=$JPLists[$i];
   $A_Lib_Count++;
   $SkipLib=$true; #true
    
   if ( ($JPlib.BaseType -ne "DocumentLibrary") -or ($JPlib.hidden) )
    {
      # forget the rest and return to top
      Write-Host -foregroundcolor green "fast test skipping Library: $($JPlib)";   
      Add-Content $mylogfile "Skipping Library: $($JPlib.title)`n";   
    }
    elseif ($JPLib.Title -Match "SitesAssets|Photo|Image|CustomizedReports|Templates|Pages|Picture|cache|style|Slide")
    {
      # forget the rest and return to top
      Write-Host -foregroundcolor red "fast test skipping Library because it mentions $Matches: $($JPlib)";   
    }
    elseif ($JPLib.BaseTemplate -ne "DocumentLibrary")   #alternatively, only skip if -eq XMLForm
    {
      # forget the rest and return to top
      Write-Host -foregroundcolor red "fast skipping Library because it is not of base DocumentLibrary, it is BaseType:$($JPlib.basetemplate): $($JPlib.title)";   
    }
    elseif (($JPLib.ThumbnailsEnabled) -or ($JPLib.DefaultView -eq "AllSlides"))
    {
 
      # forget any library with thumbnails, these are not normal doclibs, and return to top
      Write-Host -foregroundcolor red "fast test skipping Library because it has Thumbnails/Slides $($JPlib)";   
    }
    elseif (!$JPLib.ContentTypesEnabled)
    {
      Write-Host -foregroundcolor red "skipping because it does not have CTs enabled: $($JPlib)";   
    }
    else
    {  $SkipLib=$false; }
 
    if (!$SkipLib)
    {
      write-Host -foregroundcolor green "Processing Library: $($JPlib)";   
 
      try
      {
      $ListView = $JPLib.Views["All Documents"];  #This is the one view we are looking for
      }
      catch
      {
      $ListView=$null;
      }
 
      if ($ListView -eq $null)  #let's not try to add a column to a non-existent view.
      {
        continue;
      }
 
      $ListFields=$JPLib.Fields;
      $ListViewFields=$ListView.ViewFields;
 
      #We might have duplicate entries; let's delete them all, then when we add the field, we know there's only one field
      $stillDeleting=$true;
      do {
        try   {$ListViewfields.Delete("LinkFilename")}   #note the use of the internal (static) name whenever deleting 
        catch {$stillDeleting=$false}
      } while ($stillDeleting)
 
      $stillDeleting=$true;   #similar field, just to be sure
      do {
        try   {$ListViewfields.Delete("LinkFilenameNoMenu")}
        catch {$stillDeleting=$false}
      } while ($stillDeleting)
 
 
 
      #Re-add field
      $ListViewFields.add("LinkFilename");
      #Move field to position #2 (the index is zero-based)
      $ListViewFields.MoveFieldTo("LinkFilename",1);
 
      $ListView.Update();  #commit the view changes
 
    }
}
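After a run, one can spot-check a single library by reading the view’s field order back. This is a minimal sketch; the URL and library name are placeholders, not values from the script above.

```powershell
Add-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue
$web  = Get-SPWeb "http://SharePoint"                          # placeholder URL
$view = $web.Lists["Shared Documents"].Views["All Documents"]  # placeholder library name
$view.ViewFields   # internal field names in display order; LinkFilename should now be second
$web.Dispose()
```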

Adding a SharePoint View to every library

Sometimes one has to create a custom view and deploy it to every library. This example walks through a set of explicitly named site collections under a given Managed Path and adds a custom view to each qualifying library. To allow the script to be run multiple times, it first deletes every existing instance of that named view; otherwise each run would add a duplicate view to each library. Note that both the filtering and the sort order are specified by a CAML query. This example shows a very simple one (sort by filename), but it is possible to construct complex queries. The two approaches I like are to use a tool such as CAML Builder, or to create a test view in the browser and then navigate the object model to extract its CAML query. Lastly, I add the fields by their static (internal) names in the preferred sequence.
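Harvesting CAML from a test view can be sketched as follows; the URL, library, and view names are placeholders for whatever test view you built through the browser UI.

```powershell
Add-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue
$web  = Get-SPWeb "http://SharePoint"    # placeholder URL
$list = $web.Lists["Shared Documents"]   # placeholder library
$view = $list.Views["My Test View"]      # a view configured through the browser UI
$view.Query                              # prints the CAML, ready to paste into a $viewQuery string
$web.Dispose()
```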

 

#adds "Field" to right  in Default View; deletes field first, in case it already exists, so this can be rerun safely
Add-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue
 
# Script changes the default View. 
$envrun="Prod"          # selects environment to run in
 
if ($envrun -eq "Dev")
{
}
elseif ($envrun -eq "Prod")
 
{
$siteUrl = "http://SharePoint/ManagedPath/"
$LoopString = "A,B,C,D,E,F,G,H,I,J,K,L,M,N,O,P,Q,R,S,T,U,V,W,X,Y,Z,09"  #these are the explicitly named Site Collections
$LoopStringArr = $LoopString.Split(",")
}
else
{
Write-Host "ENVIRONMENT SETTING NOT VALID: script terminating..."
$siteUrl =  $null;
return;
}
 
Write-Host "script starting"
 
$myheader = "STARTING: $(get-date)"
 
foreach ($letter in $LoopStringArr)
{
$SiteName=$siteurl+$letter
 
$rootSite = New-Object Microsoft.SharePoint.SPSite($SiteName)
 
 
 $site=$rootSite  #skipping traversing all sites for now
  write-host $site.Url
 
  if ($true) #this is useful to uncomment as a filter
#  if ($site.Url -like "$siteurl/personal/plau*")  
  {
   Write-Host -foregroundcolor darkblue "$($site.id) - $($site.Url) - $($site.contentdatabase.id) - $($site.contentdatabase.name)"  
 
 
   $rootWeb = $site.RootWeb
   $web=$rootWeb;
 
   $JPLists=$web.Lists;
   $JPListsCount=$JPLists.Count  #it is much more efficient to resolve the Lists object and its count outside the loop
 
   for ($i=0;$i -lt $JPListsCount;$i++)
   { 
   $JPLib=$JPLists[$i];
   $A_Lib_Count++;
   $SkipLib=$true; #true
    
   if ( ($JPlib.BaseType -ne "DocumentLibrary") -or ($JPlib.hidden) )
    {
      # forget the rest and return to top
      Write-Host -foregroundcolor green "fast test skipping Library: $($JPlib)";   
 
    }
    elseif ($JPLib.Title -Match "SitesAssets|Photo|Image|CustomizedsReports|Templates|Pages|Picture|cache|style|Slide")
    {
      # forget the rest and return to top
      Write-Host -foregroundcolor red "fast test skipping Library because it mentions $Matches: $($JPlib)";   
 
    }
    elseif ($JPLib.BaseTemplate -ne "DocumentLibrary")   #alternatively, only skip if -eq XMLForm
    {
      # forget the rest and return to top
      Write-Host -foregroundcolor red "fast skipping Library because it is not of base DocumentLibrary, it is BaseType:$($JPlib.basetemplate): $($JPlib.title)";   
 
    }
    elseif (($JPLib.ThumbnailsEnabled) -or ($JPLib.DefaultView -eq "AllSlides"))
    {
      # forget any library with thumbnails, these are not normal doclibs, and return to top
      Write-Host -foregroundcolor red "fast test skipping Library because it has Thumbnails/Slides $($JPlib)";   
 
    }
    elseif (!$JPLib.ContentTypesEnabled)
    {
      Write-Host -foregroundcolor red "skipping because it does not have CTs enabled: $($JPlib)";   
 
    }
    else
    {  $SkipLib=$false; }
 
    if (!$SkipLib)
    {
      write-Host -foregroundcolor green "Processing Library: $($JPlib)";   
 
 
#if the library already has the view, don't bother deleting and re-adding it; just skip to the next library
$x=$null;
try
{
$x=$JPLib.Views.get_Item("NEW View")
}
catch {$x=$null}
if ($x -ne $null)
{
continue;
}
 
 
#fallback: delete any existing copy of the view (only relevant if the skip above is removed)
try
{
$x=$JPLib.Views.get_Item("NEW View")
if ($x.id -ne $null) #prevents duplicate entries
{
    $JPLib.Views.Delete($x.ID.ToString())
}
}
catch
{}
 
if ($JPLib.Views["NEW View"] -eq $null) #prevents duplicate entries
{
 
     $viewQuery = '<OrderBy><FieldRef Name="FileLeafRef" /></OrderBy>'  #This is any CAML query you construct
      
     #let's add the fields, by internal name, in the preferred sequence
     $viewFields = New-Object System.Collections.Specialized.StringCollection
     $viewFields.Add("DocIcon") > $null
     $viewFields.Add("LinkFilename") > $null
     $viewFields.Add("Title") > $null
     $viewFields.Add("Modified") > $null
     $viewFields.Add("Editor") > $null
     $viewFields.Add("ProductCode") > $null
     $viewFields.Add("ProductDescription") > $null
     $viewFields.Add("DocumentType") > $null
     $viewFields.Add("ReportSubType") > $null
 
     #RowLimit property
     $viewRowLimit = 30
     #Paged property
     $viewPaged = $true
     #DefaultView property
     $viewDefaultView = $false
     #$ViewQuery=$null;
     $viewTitle="NEW View"
     $newview = $JPLib.Views.Add($viewTitle, $viewFields, $viewQuery, $viewRowLimit, $viewPaged, $viewDefaultView)
   } #endif view doesn't exist
 
    }
    }
 
try
{
$rootweb.Dispose()
$web.Dispose()
}
catch{}
 
} #if $true/Siteurl is not null, if environment setup is valid
try
{
$rootSite.Dispose()
$site.Dispose()
}
catch{}
} #foreach letter
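As mentioned above, the CAML can be more elaborate than a simple sort. A hedged sketch of a richer query, combining a filter with a descending sort; the DocumentType field name is the hypothetical one used in the script:

```powershell
# Filter to a single DocumentType value, then sort newest-first.
# Field names here are hypothetical, matching the script above.
$viewQuery = '<Where><Eq><FieldRef Name="DocumentType" /><Value Type="Text">Report</Value></Eq></Where>' +
             '<OrderBy><FieldRef Name="Modified" Ascending="FALSE" /></OrderBy>'
```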
