Feature Reporting and Deactivation across a SharePoint Farm

Reporting on features across a SharePoint Farm

Before retracting a SharePoint solution, it is best to deactivate the associated feature in each Site Collection. Here’s a script that reports which Site Collections have a feature enabled, and optionally deactivates it. It is easily adapted to Web- or Web Application-scoped features as well. First get the feature GUID using the Get-SPFeature cmdlet, then flip the $Deactivate switch to $true to actually deactivate. Note that a feature not found on a given site generates a non-terminating error that a try/catch clause will not trap. The quick workaround is to capture the output into $OutLine, then clear the console and print the summary at the end of the run.

$OutLine=$null;
$Deactivate = $false;
$a=Get-SPFeature 359d84ef-ae24-4ba6-9dcf-1bbffe1fb788     #my site collection feature, substitute for yours
 
Get-SPWebApplication | Get-SPSite -Limit all | % {
    $b = Get-SPFeature -Identity 359d84ef-ae24-4ba6-9dcf-1bbffe1fb788 -Site $_.url

    if ($b -ne $null)
    {
        $OutLine = $OutLine + "Found active in $($_.url)`n"
        $b = $null;
        if ($Deactivate)
        {
            Disable-SPFeature 359d84ef-ae24-4ba6-9dcf-1bbffe1fb788 -Url $_.url -Confirm:$false
        }
    }
}
cls
if ($Deactivate)
{
    Write-Host "Deactivated all; Here are Site Collections where it was active:"
}
else
{
    Write-Host "Here are Site Collections where it is active:"
}
write-Host $OutLine

Adjusting Quicklinks Programmatically in SharePoint

Use PowerShell to set Quicklinks Programmatically in SharePoint

Wouldn’t it be great to hide all lists of a certain type across a farm? Perhaps Tasks, Calendars, or Discussions, or all of the above. Programmatically, they can be hidden from or exposed on navigation using set_OnQuickLaunch(). Here’s how a given library is hidden from the quick launch:

$LIB.set_OnQuickLaunch($false)

Let’s now do it across a full web application; all the site collections, sites, and for a set of libraries.

Add-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue
Start-SPAssignment -Global
 
$mylogfile="C:\logfolder\ongoinglogfile.txt"
 
$envrun="Prod"          # selects environment to run in
   
if ($envrun -eq "Dev")
{
$siteUrl = "http://devdocs.SharePoint.com"
$LibsToFlip = "Tasks,Site Pages,Calendar,Documents"
 
$LibsToFlipArray = $LibsToFlip.Split(",")
}
elseif ($envrun -eq "Prod")
{
$siteUrl = "http://docsny.SharePoint.com"
 
$LibsToFlip = "Tasks,Site Pages,Calendar,Documents,Team Discussion"
$LibsToFlipArray = $LibsToFlip.Split(",")
}
else
{
Write-Host "ENVIRONMENT SETTING NOT VALID: script terminating..."
$siteUrl =  $null;
return;
}
 
Write-Host "Quick Launch Flip script starting $(get-date)"
 
 
 
if ($siteurl)
{
    $rootSite = New-Object Microsoft.SharePoint.SPSite($siteUrl)
    $spWebApp = $rootSite.WebApplication
    foreach ($site in $spWebApp.Sites)
    {
        write-host $site.Url

        #  if ($site.Url -like "$siteurl/personal/*")
        if ($site.Url -like "$siteurl*")
        {
            $webs = $site.AllWebs;
            $WebsCount = $webs.count;

            for ($wi=0; $wi -lt $WebsCount; $wi++)
            {
                $web = $webs[$wi]
                $changed = $false;

                $lists = $web.Lists;
                $listcount = $lists.count;
                for ($li=0; $li -lt $listcount; $li++)
                {
                    $JPLib = $lists[$li]

                    if ($LibsToFlipArray -contains $JPLib.Title)
                    {
                        Write-Host -ForegroundColor darkgreen "$($JPLib.Title) in $($web.url)"
                        $JPLib.set_OnQuickLaunch($false)
                        $JPLib.Update()
                        $changed = $true;
                    }
                }
                if ($changed)
                {
                    #$Web.update()
                }
            }
        }
    }
}
 
Stop-SPAssignment -Global
###########################

Removing a stubborn Content Type from a SharePoint Library

Removing a Content Type from a SharePoint Library

Having a single content type in a library can make the library sing for end-users, by avoiding choices and prompts. However, removing an existing Content Type can be a problem: if it is still in use, SharePoint refuses with the error “Content Type is still in use”. The first thing to do is to clear the recycle bin, if feasible. Microsoft reports this isn’t necessary, but when the Content Type hits the fan, we have to try a few things.
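A minimal sketch for clearing both recycle-bin stages first (assuming $WebUrl points at the web that holds the library; DeleteAll() is irreversible, so use with care):

$web = Get-SPWeb $WebUrl
$web.RecycleBin.DeleteAll()        # first-stage (end-user) recycle bin
$web.Site.RecycleBin.DeleteAll()   # site collection (second-stage) recycle bin
$web.Dispose()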

I recently found a case where the following PowerShell still failed when trying to delete two different ways, even with unsafe updates enabled:

$web = Get-SPWeb $WebUrl
$web.set_AllowUnsafeUpdates($true)
$list = $web.Lists[$ListName]
$oldCT = $list.ContentTypes[$OldCTName]
 
$oldCTID = $oldCT.ID   #fails, still in use
$list.ContentTypes.Delete($oldCTID) #fails, still in use
$oldCT.delete()
$web.set_AllowUnsafeUpdates($false)
$web.Dispose()

I finally tracked it down to documents that were checked out to a user, and had never before been checked in. Without a checked in version, they weren’t visible. The trick is to take ownership of these, then change their content types. Only then can the unused Content Type be deleted.
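One hedged way to find and take over such never-checked-in documents is the server object model’s CheckedOutFiles collection (a sketch; $list is assumed to be the SPList in question):

# Documents with no checked-in version appear only in $list.CheckedOutFiles.
foreach ($cof in @($list.CheckedOutFiles))
{
    Write-Host "Taking over checkout of $($cof.Url)"
    $cof.TakeOverCheckOut()   # the administrator takes ownership; the file can then be checked in
}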

To automate the reassignment of a Content Type, see my next blog article: Reassigning Content Types programmatically.

Programmatically reassigning Content Types in a library

Reassigning Content Types in a library using PowerShell

It is useful to be able to change the Content Type of each document in a library to a new Content Type, perhaps because you are consolidating, or moving to a new Content Type hierarchy. There are a few tricks to changing a document’s Content Type. First, you’ll need to force a check-in if the document is checked out. Next, you’ll want to re-fetch the item by ID, specifically via $list.Items.GetItemById($item.ID); any other attempt to reassign the Content Type will fail. Lastly, either do a SystemUpdate() or clean up after yourself by restoring the timestamp and editor and deleting the interim versions. Note there needs to be sufficient metadata, or the change will result in an error and/or the document being left checked out; this function traps and reports that condition. Here’s the function we will use:

function Reset-SPFileContentType ($WebUrl, $ListName, $OldCTName, $NewCTName)
{
    #Get web, list and content type objects
    $web = Get-SPWeb $WebUrl
    $list = $web.Lists[$ListName]
    $oldCT = $list.ContentTypes[$OldCTName]
    $newCT = $list.ContentTypes[$NewCTName]
    $newCTID = $newCT.ID
 
    #Check if the values specified for the content types actually exist on the list
    if (($oldCT -ne $null) -and ($newCT -ne $null))
    {
        #Go through each item in the list
        $list.Items | ForEach-Object {
            #Check if the item content type currently equals the old content type specified
            if ($_.ContentType.Name -eq $oldCT.Name)
            {
                $ForcedCheckin=$false;
                if ($_.File.CheckOutType -ne "None")
                {
                   try {
                     $_.File.CheckIn("Checked In By Administrator");
                     $ForcedCheckin=$true;
                     write-host -ForegroundColor Yellow "Forced checkin for $($_.File.Title)"
                   }
                   catch {
                     write-host -ForegroundColor Red "FAILED to force checkin for $($_.File.Title)"
                     $ForcedCheckin=$false;
                   }
                }
 
                #Check the check out status of the file
                if ($_.File.CheckOutType -eq "None")
                {
 
                   $item = $_
                   [System.DateTime]$date = $item["Modified"]
                   $user = New-Object Microsoft.SharePoint.SPFieldUserValue($web, $item["Editor"])

                   try { #failure here could be due to inadequate required metadata
                       #Change the content type association for the item
                       [Microsoft.SharePoint.SPSecurity]::RunWithElevatedPrivileges(
                       {
                           $item2 = $list.Items.GetItemById($item.ID);
                           #Checkout is not really needed if MustCheckOut was disabled earlier
                           #$item2.File.CheckOut()
                           Write-Host "." -NoNewline

                           $item2["ContentTypeId"] = $newCTID
                           $item2.Update()

                           $item2["Modified"] = $date;
                           $item2["Editor"] = $user;
                           $item2.Update()

                           #get rid of the last two versions now, trapping any errors
                           try { $item2.Versions[1].delete() } catch {write-host -foregroundcolor red "Error (1) could not delete old version of $($item2['Name'])"}
                           try { $item2.Versions[1].delete() } catch {write-host -foregroundcolor red "Error (2) could not delete old version of $($item2['Name'])"}
                           if ($ForcedCheckin)
                           {try { $item2.Versions[1].delete() } catch {write-host -foregroundcolor red "Error (3) could not delete ForcedCheckin version of $($item2['Name'])"}}

                           #$item2.File.CheckIn("Content type changed to " + $newCT.Name, 1)
                       } )
                   } catch { write-host -foregroundcolor red "Error (possibly inadequate metadata) updating file: $($item.Name) from $($oldCT.Name) to $($newCT.Name)" }
               }
                else
                {
                    write-host -foregroundcolor red "File $($_.Name) is checked out to $($_.File.CheckedOutByUser.ToString()) and cannot be modified";
 
                }
            }
            else
            {
                 #Write-Host -ForegroundColor DarkRed "File $($_.Name) is associated with the content type $($_.ContentType.Name) and shall not be modified `n";
            }
        }
    }
    else
    {
        write-host -foregroundcolor red "One of the content types specified has not been attached to the list $($list.Title)"
 
    }
 
  $web.Dispose()
}

Next, let’s just call the function passing in the old name and the new name:

reset-spfilecontenttype -weburl $weburl -listname $JPlib  -oldctname "Documents" -newctname "DOCS"

Clearing out all SharePoint document versions for a set of documents

Wiping out all SharePoint document versions for a set of documents

Sometimes users inadvertently create many versions of a document; an example is a PDF whose properties were edited frequently. SP2013 has Shredded Storage, which stores deltas (differential save), but in SharePoint 2010 versions can result in a lot of wasted disk storage. Let’s clean it up!

$web=Get-SPWeb "http://SharePoint"
 
   for ($i=0;$i -lt $web.Lists.Count;$i++)
   { 
   $JPLib=$web.Lists[$i];
   $A_Lib_Count++;
   $SkipLib=$true; #true
    
   if ( ($JPlib.BaseType -ne "DocumentLibrary") -or ($JPlib.hidden) )
    {
      # forget the rest and return to top
      Write-Host -foregroundcolor blue "fast test skipping Library: $($JPlib)";   
      Add-Content $mylogfile "Skipping Library: $($JPlib.title)`n";   
    }
    elseif ($JPLib.Title -Match "SitesAssets|Photo|Image|CustomizedsReports|Templates|Pages|Picture|cache|style|Slide")
    {
      # forget the rest and return to top
      Write-Host -foregroundcolor blue "fast test skipping Library because it mentions $Matches: $($JPlib)";   
      Add-Content $mylogfile "Skipping Library: $($JPlib.title)`n";   
    }
    elseif ($JPLib.BaseTemplate -ne "DocumentLibrary")   #alternatively, only skip if -eq XMLForm
    {
      # forget the rest and return to top
      Write-Host -foregroundcolor blue "fast skipping Library because it is not of base DocumentLibrary, it is BaseType:$($JPlib.basetemplate): $($JPlib.title)";   
      Add-Content $mylogfile "fast skipping Library because it is not of base DocumentLibrary, it is BaseType:$($JPlib.basetemplate): $($JPlib.title)`n";   
    }
    elseif (($JPLib.ThumbnailsEnabled) -or ($JPLib.DefaultView -eq "AllSlides"))
    {
      # forget any library with thumbnails, these are not normal doclibs, and return to top
      Write-Host -foregroundcolor blue "fast test skipping Library because it has Thumbnails/Slides $($JPlib)";   
      Add-Content $mylogfile "fast test skipping Library because it has Thumbnails/Slides: $($JPlib)`n";   
    }
    else
    {  $SkipLib=$false; }
 
 
    if (!$SkipLib)
    {
      write-Host -foregroundcolor darkgreen "Processing Library to Merge: $($JPlib)";   
      Add-Content $mylogfile "Processing to merge: $($JPlib)`n"; 
 
       $JPItems = $JPLib.items;
       $JPCount = $JPitems.count;
 
          for ($itemIndex=0; $itemIndex -lt $JPCount; $itemIndex++)
          {
            $JPItem=$JPItems[$itemIndex];
            $FName=$JPItem.file.name;
 
 
            if ($FName -match "\.pdf$") #you choose the criteria, here I match on PDFs
            {
                $JPFile=$JPItem.file;
                Write-Host "Deleting $($JPFile.Versions.Count), size of each: $($JPFile.properties.vti_filesize),$($JPitem.url)"
 
                $JPfile.Versions.DeleteAll()
            }
 
            continue; # don't mess with timestamp/author just yet
             
 
             [Microsoft.SharePoint.SPSecurity]::RunWithElevatedPrivileges(
                {
                [System.DateTime]$date = $SourceItem["Modified"]
                $TargetItem["Modified"]=$date;
                try {
                $TargetItem["Editor"] = $SourceItem["Editor"]
                }
                catch
                {
                    write-host -foregroundcolor red "Error could not assign editor of $($targetItem.url)"
                }
 
 
                $TargetItem.update()   
                try
                {  # two deletes required
                    $TargetItem.Versions[1].delete()
                    $TargetItem.Versions[1].delete()
                }
                catch
                {
                    write-host -foregroundcolor red "Warning: could not delete old version of $($targetItem.url)"
                }
 
                })
          }
    }
}


Scripting SharePoint logging to ULS and Event Log

It’s easy to dump output to a text file from a script, but for enterprise-class logging the two standards are the Event Log and ULS (Unified Logging System). First, ULS.

Below, in PowerShell, I grab a reference to the SPDiagnosticsService, define an SPDiagnosticsCategory, then call the WriteTrace() method:

$diagSrc = [Microsoft.SharePoint.Administration.SPDiagnosticsService]::Local
$diacategory = new-object Microsoft.SharePoint.Administration.SPDiagnosticsCategory("MyTestCategory",[Microsoft.SharePoint.Administration.TraceSeverity]::Monitorable, [Microsoft.SharePoint.Administration.EventSeverity]::ErrorCritical)
$diagSrc.WriteTrace(98765, $diacategory, [Microsoft.SharePoint.Administration.TraceSeverity]::Monitorable, "Write your log description here" )

ULS is a good standard central way to go, but let’s move onto writing into the Event Log, which is extra useful given we are first going to create a custom application log:

New-EventLog -LogName MyCustomScripts -Source scripts

The first challenge is that if the log already exists, New-EventLog throws an error, even if you try encapsulating it in a try/catch. The trick is to leverage the Get-EventLog cmdlet.

First, to see which logs exist, use the -List switch:

Get-EventLog -list
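To guard the creation, one sketch (log and source names assumed from above) is:

# Create the custom log and source only if the log is not already registered.
if (-not (Get-EventLog -List | Where-Object { $_.Log -eq "MyCustomScripts" }))
{
    New-EventLog -LogName MyCustomScripts -Source Scripts
}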

You now have your very own Event Log, and can write into it with your own event IDs, messages, and severity levels. Here are two working examples:

Write-EventLog -LogName MyCustomScripts -Source Scripts -Message "trying 4142 it works ... COOL!" -EventId 4142 -EntryType information
Write-EventLog -LogName MyCustomScripts  -Source Scripts -Message "trying 4942 as an error" -EventId 4942 -EntryType error

Now let’s simplify for re-use and consistency. Let’s declare some basics at the top of all scripts:

$Eventing = $true;  #determine if any events are written to the event log
$LogName = "JoelScripts"
$SourceName = "Scripts"
$ScriptID = 3; # unique number per script

Here’s a one-line function to make life simpler for our scripts:

function Write-MyLog([int] $EventID, [string] $Description, [system.Diagnostics.EventLogEntryType] $Severity)
{
    if ($Eventing)
    {
    Write-EventLog -LogName $LogName -Source $SourceName -Message $Description -EventId $EventID -EntryType $Severity
    }
}

Now let’s add a line at the start and end of the scripts to trigger an information event on what’s running. Note that references to $MyInvocation contain information about the currently running script:

Write-MyLog -Description "Start of $($MyInvocation.MyCommand.Path)" -EventId $ScriptID -Severity information
Write-MyLog -Description "End of $($MyInvocation.MyCommand.Path)" -EventId $ScriptID -Severity information

Lastly, here’s a sample message evented as a warning, and next as an error:

Write-MyLog -Description $RunStatus -EventId $ScriptID -Severity Warning
Write-MyLog -Description $RunStatus -EventId $ScriptID -Severity Error

A nice way to write events is to use the “Source” to map back to the script name or some other useful value for filtering. However sources need to be pre-defined. Here’s how to define a source:

New-EventLog -LogName $LogName -Source "X"

The challenge is when to create this source. I find it’s best to create the source only if it does not already exist:

$found = $false;
try
{
    $Sources = Get-EventLog -LogName $LogName | Select-Object Source -Unique
    foreach ($OneSource in $Sources)
    {
        if ($OneSource.Source -eq $SourceName)
        {
            $found = $true;
        }
    }
}
catch
{
    Write-Host "cannot find the log, so we are in deep trouble"
}

if (!$found)
{
    New-EventLog -LogName $LogName -Source $SourceName
    Write-Host "Created new Source $($SourceName) in log name $($LogName)"
}

How to Fix Bad Taxonomy Terms in SharePoint Automatically

Fixing bad SharePoint taxonomy term references

A given document using managed metadata can end up with a term orphaned from the termset. This can happen due to bad references in the intermediary cached terms list, which is hidden in each site collection under the list name “TaxonomyHiddenList”. Here’s how to construct its URL: [siteurl]/Lists/TaxonomyHiddenList/AllItems.aspx
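As a quick sketch, the cached terms can be inspected from PowerShell as well (the URL is hypothetical, and the hidden list’s title is assumed to be TaxonomyHiddenList):

# Peek at the site collection's cached term entries.
$web = Get-SPWeb "http://WebApp"
$hidden = $web.Lists["TaxonomyHiddenList"]
$hidden.Items | ForEach-Object { Write-Host "$($_.ID): $($_.Title)" }
$web.Dispose()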

While it’s easy to extend this to check the health of each document’s terms, for simplicity let’s imagine we want to fix a single document, for which we have the URL and know the name of the taxonomy field. Let’s set the basics before we get started:

$docurl = "http://WebApp/path/lib/doc.xlsx" #URL to correct
$site = New-Object Microsoft.SharePoint.SPSite($docurl)  #grab the SPSite
$web = $site.OpenWeb() #grab the SPWeb
$item = $web.GetListItem($docurl) #grab the SPItem
$targetField = "MMS Field Name" # let's establish the name of the field
$TermValueToReplace = $item[$targetField].label;  #this is the termset value we want to re-assign correctly

Now let’s get a taxonomy session, with proper error detection:

try
{
    Write-Host "getting Tax Session for $($Site.url)..." -NoNewline
    $taxonomySession = Get-SPTaxonomySession -Site $site  #get one session per site collection you are in
    $termStore = $taxonomySession.TermStores[0]  #We need to move the  get Termset to above the lib level too!
    Write-Host "Got Tax Session. "
}
catch
{
    Write-Host "Tax session acquisition problem for $($site.url)"
    $ok=$false;
}

Given the field, let’s get the correct termset.

[Microsoft.SharePoint.Taxonomy.TaxonomyField]$taxonomyField = $item.Fields.GetField($targetField)  
$termSetID=$taxonomyField.TermSetId
 
$termSet = $termStore.GetTermSet($taxonomyField.TermSetId)  #if we ever loop, move to outside item loop, this takes a long time!

Now we look the term up in the Managed Metadata Service. We expect precisely one match, but we’ll check for that later.

#"true" parameter avoids untaggable terms, like parent term at higher tier that should not be selected
[Microsoft.SharePoint.Taxonomy.TermCollection] $TC = $termSet.GetTerms($TermValueToReplace,$true)

Now let’s populate a TaxonomyFieldValue, assign it to the SPItem, and save without changing timestamp or author by using SystemUpdate():

$taxonomyFieldValue = new-object Microsoft.SharePoint.Taxonomy.TaxonomyFieldValue($taxonomyField)
$t1=$TC[0]  #use the first result
$taxonomyFieldValue.set_TermGuid($t1.get_Id())  #this assigns the GUID
$taxonomyFieldValue.set_Label($TermValueToReplace)  #this assigns the value
$taxonomyField.SetFieldValue($Item,$taxonomyFieldValue)  #let's assign to the SPItem
$item.systemupdate()

That’s the meat of it. Let’s put it all together with error handling:

$ok=$true;
$docurl = "http://WebApp/path/lib/doc.xlsx" #URL to correct
 
$site = New-Object Microsoft.SharePoint.SPSite($docurl)
$web = $site.OpenWeb()
$item = $web.GetListItem($docurl)
$targetField = "FieldName"
$TermValueToReplace = $item[$targetField].label;
 
try
{
    Write-Host "getting Tax Session for $($Site.url)..." -NoNewline
    $taxonomySession = Get-SPTaxonomySession -Site $site  #get one session per site collection you are in
    $termStore = $taxonomySession.TermStores[0]  #We need to move the  get Termset to above the lib level too!
    Write-Host "Got Tax Session. "
}
catch
{
    Write-Host "Tax session acquisition problem for $($site.url)"
    $ok=$false;
}
 
[Microsoft.SharePoint.Taxonomy.TaxonomyField]$taxonomyField = $item.Fields.GetField($targetField)  
$termSetID=$taxonomyField.TermSetId
$termSet = $termStore.GetTermSet($taxonomyField.TermSetId)  #Move to outside item loop, this takes a long time!     
[Microsoft.SharePoint.Taxonomy.TaxonomyFieldValue]$taxonomyFieldValue = New-Object Microsoft.SharePoint.Taxonomy.TaxonomyFieldValue($taxonomyField)  
 
[Microsoft.SharePoint.Taxonomy.TermCollection] $TC = $termSet.GetTerms($TermValueToReplace,$true)  #true avoids untaggable terms, like parent company at higher tier
 
if ($TC.count -eq 0)
    {
        Write-Host -ForegroundColor DarkRed "Argh, no Taxonomy entry for term $($TermValueToReplace)"
 
        $ok=$false;
    }
    else
    {
        if ($TC.count -gt 1)
        {
            Write-Host -ForegroundColor DarkRed "Argh, $($TC.count) Taxonomy entries for term $($TermValueToReplace)"

            $ok=$false; #we can't be sure we got the right term!
        }
 
        $taxonomyFieldValue = new-object Microsoft.SharePoint.Taxonomy.TaxonomyFieldValue($taxonomyField)
 
        $t1=$TC[0]
 
 
        $taxonomyFieldValue.set_TermGuid($t1.get_Id())
        #$taxonomyFieldValue.ToString()
        #$targetTC.add($taxonomyFieldValue)
    }
 
try
{
    $taxonomyFieldValue.set_Label($TermValueToReplace)
    $taxonomyField.SetFieldValue($Item,$taxonomyFieldValue)  
}
catch
{
    Write-Host -ForegroundColor DarkRed "Argh, can't write Tax Field for $($Item.url)"
    $ok=$false;
}
 
if ($ok)
{
    $item.systemupdate()
    write-host "Fixed term for item"
}
else
{
    Write-Host -ForegroundColor DarkRed "Did not fix term for item"
}

Each TaxonomyFieldValue has three important properties; these often appear as pipe-separated values:
Label : the label selected by the user, from the Labels property of the Term object
TermGuid : the Id (GUID) property of the Term (inherited from TaxonomyItem)
WssId : a reference back to the TaxonomyHiddenList; the ID of that entry in the site collection’s hidden list
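As an illustration (values hypothetical), the serialized form concatenates them as WssId;#Label|TermGuid:

# A taxonomy field value as stored: <WssId>;#<Label>|<TermGuid>
3;#Finance|1a2b3c4d-5e6f-7a8b-9c0d-1e2f3a4b5c6d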

 


Granting SharePoint Shell Administration access

Granting SharePoint Shell Administration access via PowerShell

Enabling a PowerShell user can be as simple as granting local admin rights and running the following command:

add-spshelladmin "DOMAIN\USER"

Sometimes, though, there remain Content DBs to which the user doesn’t gain PowerShell access. In this case, the following command pipes in the content DBs and forces the access grant:

get-spcontentdatabase | add-spshelladmin "DOMAIN\USER"

There are service application databases as well; these can all be handled with this single command:

get-spdatabase | add-spshelladmin "DOMAIN\USER"

In the end, what is required is that the user holds the securityadmin server role on the SQL instance and the db_owner role in the database.

The securityadmin role is required so that the underlying service accounts get granted the appropriate permissions, say when mounting a content DB.

The user must also be a member of the SharePoint_Shell_Access role on the configuration database and of the WSS_ADMIN_WPG local group on the server (best done on each server in the farm).
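To verify the result, Get-SPShellAdmin lists who holds shell access, farm-wide and per database (a quick sketch):

# Who has shell access at the farm level?
Get-SPShellAdmin

# And per content database:
Get-SPContentDatabase | ForEach-Object {
    Write-Host $_.Name
    Get-SPShellAdmin -Database $_.Id
}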

SharePoint Group Management

Managing SharePoint Groups in PowerShell

SharePoint Groups are a great mechanism for managing user permissions; however, they exist within a single site collection. What if you have hundreds of site collections? We can easily script a range of common operations.

I prefer a CSV-fed approach to manage groups and users: I create a CSV with the name of each group and its users, the users listed in pipe-separated format (commas are already used by the CSV). To read in the CSV:

Import-Csv "L:\PowerShell\AD and SP group mapping.csv"
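Assuming columns named 'SharePoint Group' and 'ADGroup' (as in the full script further down), each pipe-separated user list splits out like this:

$ADMap = Import-Csv "L:\PowerShell\AD and SP group mapping.csv"
foreach ($row in $ADMap)
{
    $users = $row.ADGroup.Split("|")   # pipe-separated users within one CSV cell
    Write-Host "$($row.'SharePoint Group'): $($users.Count) user(s)"
}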

Let’s get the Site, Root Web, as well as an SPUser for the group owner, and get the groups object:

$Site = New-Object Microsoft.SharePoint.SPSite($SiteName)
write-host $site.Url
$rootWeb = $site.RootWeb;
$Owner = $rootWeb.EnsureUser($OwnerName)
$Groups = $rootWeb.SiteGroups;

Here’s how to add a Group:

$Groups.Add($SPGroupName, $Owner, $web.Site.Owner, "SharePoint Group to hold AD group for Members")

Here’s how to give the group Read access, for example:

$GroupToAddRoleTo = $Groups[$SPGroupName]
if ($GroupToAddRoleTo) #if group exists
{
   $MyAcctassignment = New-Object Microsoft.SharePoint.SPRoleAssignment($GroupToAddRoleTo)
   $MyAcctrole = $RootWeb.RoleDefinitions["Read"]
   $MyAcctassignment.RoleDefinitionBindings.Add($MyAcctrole)
   $RootWeb.RoleAssignments.Add($MyAcctassignment)
}

Here’s how to add a Member to a Group:

$UserObj = $rootWeb.EnsureUser($userName);
if ($UserObj) #if it exists
{
   $GroupToAddTo.addUser($UserObj)  
}

Note that a duplicate addition of a member is a null-op, throwing no errors.

Here’s how to remove a member:

$UserObj = $rootWeb.EnsureUser($userName);
if ($UserObj)
{
   $GroupToAddTo.RemoveUser($UserObj)  
}

Here’s how to remove a user from the site collection entirely. This wipes the user from the whole site collection, so use this approach with care and consideration:

$user1 = $RootWeb.EnsureUser($MyUser)
try
{
   $RootWeb.SiteUsers.Remove($MyUser)
   $RootWeb.update()
}
catch
{
   Write-Host "Failed to remove $($MyUser) from $($RootWeb.Url)"
}

Here’s the full script, with flags to set the specific actions described above:

Add-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue
# uses feedfile to load and create set of SharePoint Groups.
$mylogfile="L:\PowerShell\ongoinglogfile.txt"
$ADMap= Import-Csv "L:\PowerShell\AD and SP group mapping.csv"
$OwnerName = "DOMAIN\sp2013farm"
$AddGroups = $false;
$AddMembers = $false;  # optionally populates those groups, Comma separated list
$GrantGroupsRead = $true; #grants read at top rootweb level
$RemoveMembers = $false; # optionally  removes Comma separated list of users from the associated group
$WipeMembers = $false;  # wipes the groups clean        
$WipeUsersOutOfSite = $false;  #The Nuclear option. Useful to eliminate AD groups used directly as groups
 
 
 #we do not need a hashtable for this work, but let's load it for extensibility
$MyMap=@{}  #load CSV contents into HashTable
for ($i=0; $i -lt $ADMap.Count; $i++)
{
    $MyMap[$ADMap[$i].'SharePoint Group'] = $ADMap[$i].ADGroup;
}
 
# Script loops over the letter-named site collections in each environment
$envrun="Dev"           # selects environment to run in
 
if ($envrun -eq "Dev")
{
$siteUrl = "http://DevServer/sites/"
$mylogfile="L:\PowerShell\ongoinglogfile.txt"
$LoopString = "A,B,C,D,E,F,G,H,I,J,K,L,M,N,O,P,Q,R,S,T,U,V,W,X,Y,Z"
$LoopStringArr = $LoopString.Split(",")
 
}
elseif ($envrun -eq "Prod")
{
$siteUrl = "http://SharePoint/sites/"
$mylogfile="L:\PowerShell\ongoinglogfile.txt"
$LoopString = "A,B,C,D,E,F,G,H,I,J,K,L,M,N,O,P,Q,R,S,T,U,V,W,X,Y,Z"
$LoopStringArr = $LoopString.Split(",")
}
else
{
Write-Host "ENVIRONMENT SETTING NOT VALID: script terminating..."
$siteUrl =  $null;
return;
}
 
Write-Host "script starting"
 
$myheader = "STARTING: $(get-date)"
 
foreach ($letter in $LoopStringArr)
{
    $SiteName=$siteurl+$letter
    $Site = New-Object Microsoft.SharePoint.SPSite($SiteName)
 
    write-host $site.Url
    $rootWeb = $site.RootWeb;
    $Owner = $rootWeb.EnsureUser($OwnerName)
    $Groups = $rootWeb.SiteGroups;
 
    for ($ADi = 0; $ADi -lt $ADMap.count; $ADi++)
    {
        $SPGroupName = $ADMap[$ADi].'SharePoint Group';
 
        if ($AddGroups)
        {
            if (!$Groups[$SPGroupName]) #no exist, so create
            {
                try
                {
                    $Groups.Add($SPGroupName, $Owner, $web.Site.Owner, "SharePoint Group to hold AD group members")
                }
                catch
                {
                    Write-Host -ForegroundColor DarkRed "Ouch, could not create $($SPgroupName)"
                }
            }
            else
            {
                    Write-Host -ForegroundColor DarkGreen "Already exists: $($SPgroupName)"
            }
        } #endif Add Groups
 
        if ($GrantGroupsRead)
        {
            $GroupToAddRoleTo = $Groups[$SPGroupName]
            if ($GroupToAddRoleTo) #if group exists
            {
 
                $MyAcctassignment = New-Object Microsoft.SharePoint.SPRoleAssignment($GroupToAddRoleTo)
                $MyAcctrole = $RootWeb.RoleDefinitions["Read"]
                $MyAcctassignment.RoleDefinitionBindings.Add($MyAcctrole)
                $RootWeb.RoleAssignments.Add($MyAcctassignment)
            } #if the group exists in the first place
        } #ActionFlagTrue
 
        if ($AddMembers)
        {
            $GroupToAddTo = $Groups[$SPGroupName]
            if ($GroupToAddTo) #if group exists
            {
                $usersToAdd = $ADMap[$ADi].ADGroup;
 
                if ($usersToAdd.length -gt 0) #if no users to add, skip
                {
                    $usersToAddArr = $usersToAdd.split("|")
                    foreach ($userName in $usersToAddArr)
                    {
                        try
                        {
                            $UserObj = $rootWeb.EnsureUser($userName);
                            if ($UserObj)
                            {
                                $GroupToAddTo.addUser($UserObj)  #dup adds are a null-op, throwing no errors
                            }
                        }
                        catch
                        {
                        Write-Host -ForegroundColor DarkRed "cannot add user $($userName) to $($GroupToAddTo)"
                        }
 
                    }
                } #users to add
            } #if the group exists in the first place
        } #ActionFlagTrue
 
        if ($RemoveMembers)
        {
            $GroupToAddTo = $Groups[$SPGroupName]
            if ($GroupToAddTo) #if group exists
            {
                $usersToAdd = $ADMap[$ADi].ADGroup;  #user list lives in the ADGroup column, as in AddMembers
 
                if ($usersToAdd.length -gt 0) #if no users to add, skip
                {
                    $usersToAddArr = $usersToAdd.split("|")
                    foreach ($userName in $usersToAddArr)
                    {
                        try
                        {
                            $UserObj = $rootWeb.EnsureUser($userName);
                            if ($UserObj)
                            {
                                $GroupToAddTo.RemoveUser($UserObj)  #dup adds are a null-op, throwing no errors
                            }
                        }
                        catch
                        {
                        Write-Host -ForegroundColor DarkRed "cannot add user ($($userName) to $($GroupToAddTo)"
                        }
 
                    }
                } #users to add
            } #if the group exists in the first place
        } #ActionFlagTrue
 
        if ($WipeMembers)  #removes every user from the group
        {
            $GroupToWipe = $Groups[$SPGroupName]
            if ($GroupToWipe) #if group exists
            {
                    foreach ($user in @($GroupToWipe.Users))  #snapshot with @() so removal doesn't break enumeration
                    {
                        try
                        {
                            $UserObj = $rootWeb.EnsureUser($user.LoginName);
                            if ($UserObj)
                            {
                                $GroupToWipe.RemoveUser($UserObj)
                            }
                        }
                        catch
                        {
                            Write-Host -ForegroundColor DarkRed "cannot remove user $($user) from $($GroupToWipe)"
                        }

                    }

            } #if the group exists in the first place
        } #WipeMembers
 
        if ($WipeUsersOutOfSite)  #removes the listed users from the entire site collection
        {
            $usersToNuke = $ADMap[$ADi].ADGroup;

            if ($usersToNuke.length -gt 0) #if no users to remove, skip
            {
                $usersToNukeArr = $usersToNuke.split("|")
                foreach ($MyUser in $usersToNukeArr)
                {
                    try
                    {
                        try
                        {
                            $user1 = $RootWeb.EnsureUser($MyUser)
                        }
                        catch
                        {
                            Write-Host "x1: Failed to ensure user $($MyUser) in $($Site.url)"
                        }

                        try
                        {
                            $RootWeb.SiteUsers.Remove($MyUser)
                            $RootWeb.Update()
                        }
                        catch
                        {
                            Write-Host "x2: Failed to remove $($MyUser) from all users in $($Site.url)"
                        }
                    }
                    catch
                    {
                        Write-Host "x4: other failure for $($MyUser) in $($Site.url)"
                    }
                } #foreach user to nuke
            } #users to nuke exist
        } #WipeUsersOutOfSite
 
    }
 
 
    $rootWeb.Dispose()
    $site.Dispose()
 
} #foreach site
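
The script above addresses rows of an `$ADMap` collection by index; the code that builds it isn't shown in this excerpt. A hypothetical sketch of loading such a mapping from a CSV (the file path and column names here are assumptions, matching the properties the script reads):

```powershell
# Hypothetical mapping file: one row per SharePoint group, with
# pipe-delimited user lists. Example contents:
#   SPGroupName,ADGroup,SharePoint Group
#   Site Owners,DOMAIN\alice|DOMAIN\bob,DOMAIN\carol
$ADMap = Import-Csv -Path "C:\Scripts\GroupMap.csv"

# Each row is then addressed by index, matching the $ADMap[$ADi] usage above:
for ($ADi = 0; $ADi -lt $ADMap.Count; $ADi++)
{
    $SPGroupName = $ADMap[$ADi].SPGroupName
    $usersToAdd  = $ADMap[$ADi].ADGroup
}
```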


Content Type Syndication tips from the Masters

Content Type Syndication must-know tips

Content Type Syndication is a must for enterprise-class structured SharePoint document management systems.

There are some limitations to be aware of. A site column has an internal name and a display name. The internal name is set when the Site Column is created and never changes. So if you create a site column called “my data”, it might appear to have an internal name of “my%20data”, with %20 being the hex encoding of ASCII 32, the space character. Now if you have 100 site collections, and one of them already has a field called “my data” of a different type (number vs. text, for example), then the Content Type Syndication Hub will not be able to publish that site column, and it will show up in an error report. Within your syndication hub, under Site Collection Administration, there is an entry called “Content type service application error log”: a list of errors where such site columns could not be published. During the publishing of Content Types there is actually a “pre-import check”. The frustrating part is that when you create a site column, you don’t always know whether that name is already in use somewhere. The check covers both the externally visible display name and the cryptic internal site column name.
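
Since you can't easily tell whether a display name is already taken, one approach is to scan every site collection before creating the column in the hub. A minimal sketch, assuming the SharePoint snap-in is loaded and substituting your own column name:

```powershell
$displayName = "my data"   # the display name you intend to use in the hub

Get-SPWebApplication | Get-SPSite -Limit All | % {
    $rootWeb = $_.RootWeb
    if ($rootWeb.Fields.ContainsField($displayName))  # matches display or internal name
    {
        Write-Host "Column name already in use at $($_.Url)"
    }
    $rootWeb.Dispose()
    $_.Dispose()
}
```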

When you publish a Content Type, it doesn’t get deployed right away. There’s an hourly batch job for each Web Application called “Content Type Subscriber”. You can go into Central Admin, Monitoring, Timer Jobs, and force the job to run for your web application. It picks up any recently published Content Types and “pushes” them down throughout each subscribing Site Collection: first replicating the Site Columns, Content Types, and Information Policies, then pushing them into sites and libraries, if the published Content Type is set to propagate down to lists (that’s a check box in the Content Type).
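
Forcing that run can be scripted as well; a sketch assuming the standard job display name and your own web application URL:

```powershell
$wa  = Get-SPWebApplication "http://yourwebapp"   # substitute your web application URL
$job = Get-SPTimerJob -WebApplication $wa |
    Where-Object { $_.DisplayName -eq "Content Type Subscriber" }
if ($job) { $job.RunNow() }   # queues an immediate run instead of waiting for the hourly schedule
```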

Similarly, within the subscribing Site Collection, in the Site Collection Administration, there’s a “Content type publishing error log” that summarizes issues with propagating Content Types and Site Columns.

When publishing many Content Types, I usually script both the publishing and the subscribing timer jobs. I find that works extremely well and avoids human error.
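
For the publishing side, a hedged sketch of what such a script can look like, using the ContentTypePublisher class from the Microsoft.SharePoint.Taxonomy assembly (the hub URL and content type name below are placeholders):

```powershell
# Publish one content type from the syndication hub programmatically.
$hubSite   = Get-SPSite "http://yourhub"   # substitute your hub site collection URL
$publisher = New-Object Microsoft.SharePoint.Taxonomy.ContentTypeSync.ContentTypePublisher($hubSite)
$ct        = $hubSite.RootWeb.ContentTypes["My Document Type"]   # placeholder content type name
if ($ct) { $publisher.Publish($ct) }
$hubSite.Dispose()
```

Looping this over a list of content type names publishes everything in one pass; the “Content Type Subscriber” timer job then distributes the results as described above.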

On the home page for the Content Type Syndication Hub I always put links to:

1. Site Columns

2. Content Types

3. The Central Admin Timer Jobs, or a link directly to the main Web Application’s “Content Type Subscriber” Timer Job.

There’s a lot more, but let me know if there are any aspects I can clarify further.
