Add a Taxonomy Field into Every Site Collection and to a Content Type

Add a Taxonomy Field into Every Site Collection

I recently had to add a Taxonomy Site Column into a range of Site Collections, and then add the new Site Column into a Content Type.  Lastly, I needed to update all libraries in the site collection to reflect the updated Content Type.  How might we do this?

Step by step

Let’s get a web application, and its sites.  Note that this is not the most efficient approach, as a collection of SPSites can be large.  A more efficient way could be to pass the Sites through a pipeline.

$webApp = Get-SPWebApplication "http://SharePoint" # your web app url
$sites = $webApp.Sites;

Let’s loop through the SPSites, and get the SPWeb to work with.  The SPSite actually has no content, nor does it have Site Columns or Content Types.  The root web is where Site Columns and Content Types are managed.

  for ($i=0; $i -lt $sites.count; $i++)
{
$site= $sites[$i];
$JPweb=$site.rootweb;
}

To do anything with Managed Metadata requires opening a Taxonomy session.   This can be done directly through the Service Application.  For simplicity, I open the session at the Site level.  This code assumes only one Managed Metadata Service association (index offset of zero).  If you have more than one MMS service application, query the collection of termstores to ensure you are grabbing the correct one.  Note below there is a “group name” and “Term Set Name” required.  Adjust these to match your environment.  Lastly there’s a check below to ensure you grabbed a termset, and aren’t holding a null reference:

$taxSession = new-object Microsoft.SharePoint.Taxonomy.TaxonomySession($site, $true);
$termStore = $taxSession.TermStores[0];
$TermSet = $termStore.Groups["Your Term Group"].TermSets["TermSetName"]
 
if ($TermSet -eq $null)
{
Write-Host -ForegroundColor DarkRed "Termset does not exist"
continue;
}
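If you do have multiple term stores, it is safer to select one by name rather than trusting index zero. Here is a small sketch of that selection logic; the store names and GUIDs below are stand-ins for what a real `$taxSession.TermStores` collection would return:

```powershell
# Stand-ins for $taxSession.TermStores: in a real farm these come from
# the TaxonomySession; the names here are hypothetical.
$termStores = @(
    [pscustomobject]@{ Name = "Managed Metadata Service";   Id = [guid]::NewGuid() },
    [pscustomobject]@{ Name = "Secondary Metadata Service"; Id = [guid]::NewGuid() }
)

# Pick the store by name instead of by index
$termStore = $termStores | Where-Object { $_.Name -eq "Managed Metadata Service" }
Write-Host "Using term store $($termStore.Name)"
```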

Now let’s create a new field. We’ll need to know the display name for the field, as well as define a static (internal) name.  As with all Site Columns, I highly recommend a static name that avoids spaces and other characters that turn into escape sequences in code (%20 for a space, etc).  The termset that is “latched” to this field refers to the specific term store (it could point to any of your Managed Metadata Service Applications) and to the specific termset; both references are GUIDs.  Each Managed Metadata Service Application is associated with a dedicated database, which is where your termsets are stored.  You’ll want to select a “group” under which the site column will be displayed.  We’ll add the field, and update the SPWeb object.

 $taxonomyField = $JPweb.Fields.CreateNewField("TaxonomyFieldType", $FieldToAddDisplayName)
$taxonomyField.SspId = $termSet.TermStore.Id
$taxonomyField.TermSetId = $termSet.Id
$taxonomyField.AllowMultipleValues = $false
$taxonomyField.Group = "Site Column Group"
$taxonomyField.StaticName = $FieldToAddStaticName
$taxonomyField.ShowInEditForm = $true
$taxonomyField.ShowInNewForm = $true
$taxonomyField.Hidden = $false
$taxonomyField.Required = $false
 
$JPweb.Fields.Add($taxonomyField);
$JPweb.Update();

Now let’s grab the set of Content Types, find our target Content Type and field, and update the Content Type.  We’ll turn off ReadOnly for the Content Type, and do the following special object update to force propagation of the Content Type into all the libraries in the site collection: $ct.UpdateIncludingSealedAndReadOnly()

$cts=$JPWeb.ContentTypes
$ct=$cts["Specific Content Type"] # replace with your content type
 
$ct.set_ReadOnly($false)
$fields=$ct.fields
 
$fs=$JPWeb.Fields
$favField=$fs.get_Item($FieldToAddDisplayName)
 
if ($favField -eq $null)
{
Write-Host -ForegroundColor DarkRed "Cannot find $($FieldToAddDisplayName) in web $($JPWeb.url)"
continue;
}
 
$link = new-object Microsoft.SharePoint.SPFieldLink $favField
$ct.FieldLinks.Add($link)
$ct.UpdateIncludingSealedAndReadOnly($true)
$ct.set_ReadOnly($true)

Let’s now put it all together into one neat script that includes some extra error handling:

$FieldToAddStaticName = "InternalFieldName"
$FieldToAddDisplayName = "Field Display Name"
$webApp = Get-SPWebApplication "http://SharePoint" # your web app url
$sites = $webApp.Sites;
 
for ($i=0; $i -lt $sites.count; $i++)
{
$site= $sites[$i];
$JPweb=$site.rootweb;
 
if ($site.url -notlike "http://SharePoint/MyPreferredPath/*")
{
continue;
}
 
$taxSession = new-object Microsoft.SharePoint.Taxonomy.TaxonomySession($site, $true);
$termStore = $taxSession.TermStores[0];
$TermSet = $termStore.Groups["Your Term Group"].TermSets["TermSetName"]
 
if ($TermSet -eq $null)
{
Write-Host -ForegroundColor DarkRed "Termset does not exist"
continue;
}
 
$taxonomyField = $JPweb.Fields.CreateNewField("TaxonomyFieldType", $FieldToAddDisplayName)

$taxonomyField.SspId = $termSet.TermStore.Id
$taxonomyField.TermSetId = $termSet.Id
$taxonomyField.AllowMultipleValues = $false
$taxonomyField.Group = "Site Column Group"
$taxonomyField.StaticName = $FieldToAddStaticName
$taxonomyField.ShowInEditForm = $true
$taxonomyField.ShowInNewForm = $true
$taxonomyField.Hidden = $false
$taxonomyField.Required = $false
 
$JPweb.Fields.Add($taxonomyField);
 
$JPweb.Update();
$cts=$JPWeb.ContentTypes
$ct=$cts["Specific Content Type"] # replace with your content type
if ($ct -eq $null)
{
Write-Host -ForegroundColor DarkRed "Cannot add field to Content Type in web $($JPWeb.url)"
continue;
}
 
$ct.set_ReadOnly($false)
$fields=$ct.fields
 
$fs=$JPWeb.Fields
$favField=$fs.get_Item($FieldToAddDisplayName)
 
if ($favField -eq $null)
{
Write-Host -ForegroundColor DarkRed "Cannot find $($FieldToAddDisplayName) in web $($JPWeb.url)"
continue;
}
 
$link = new-object Microsoft.SharePoint.SPFieldLink $favField
 
$ct.FieldLinks.Add($link)
 
$ct.UpdateIncludingSealedAndReadOnly($true)
 
$ct.set_ReadOnly($true)
}

Creating a View in each Document Library in a Web Application

Create a View in each Library in a Web App

My friend Rob Holmes raised an interesting challenge: deploying a new view to all libraries. Here’s the script to apply it to every Document Library with Content Types enabled, across all webs in all site collections of a web application.

Sorting/filtering entries in a view requires the use of CAML, for which I provide a few examples commented in the script. To write CAML, one needs to know the available fields. If you have a $Lib, you can dump the static or internal names for every field in the library with this command:

$Lib.Fields | select internalname

Here are some CAML examples:

'<OrderBy><FieldRef Name="mvReceived_x0020_Time" Ascending="FALSE" /></OrderBy><Where><IsNotNull><FieldRef Name="mvReceived_x0020_Time" /></IsNotNull></Where>'
'<OrderBy><FieldRef Name="mvSentOn" Ascending="FALSE" /></OrderBy><Where><IsNotNull><FieldRef Name="mvReceived_x0020_Time" /></IsNotNull></Where>'
'<OrderBy><FieldRef Name="mvSentOn" Ascending="FALSE" /></OrderBy><Where><IsNotNull><FieldRef Name="mvSentOn" /></IsNotNull></Where>'

There are a few good CAML builder tools. You can also just navigate to an existing View and grab the XML from the View in SharePoint.
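Because a typo in hand-written CAML can fail silently when the view is created, it can be worth validating the string with the .NET XML parser first. This is a small sanity-check sketch using one of the queries above (CAML view queries have no single root element, so we wrap the fragment before parsing):

```powershell
# One of the CAML view queries from the examples above
$caml = '<OrderBy><FieldRef Name="mvSentOn" Ascending="FALSE" /></OrderBy><Where><IsNotNull><FieldRef Name="mvSentOn" /></IsNotNull></Where>'

# Wrap in a root element so the fragment parses as well-formed XML;
# a malformed query throws here instead of inside SharePoint
[xml]$parsed = "<Query>$caml</Query>"

$orderField = $parsed.Query.OrderBy.FieldRef.GetAttribute("Name")
Write-Host "Ordering by field: $orderField"
```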

Here’s the script:

Add-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue

Write-Host "script starting $(get-date)"

$wa = Get-SPWebApplication http://SharePoint
$MatchStr = "http://SharePoint/*"   # adjust to filter which site collections are processed

foreach ($Site in $wa.Sites)
{
    if ($site.url -like $MatchStr)
    {
        $webs = $Site.AllWebs
        $webcount = $Site.AllWebs.Count

        for ($i=0; $i -lt $webcount; $i++)
        {
            $web = $webs[$i]

            Write-Host "working in $($web.url)"

            $lists = $web.lists;

            for ($k=0; $k -lt $lists.count; $k++)
            {
                $JPLib = $lists[$k];

                # don't bother adding a view to a hidden library
                if ($JPLib.Hidden)
                {
                    continue;
                }

                # only for libraries, not for GenericList and other types
                if ($JPLib.BaseType -ne "DocumentLibrary")
                {
                    continue;
                }

                # choose your own lib filter; this acts on every content-type-enabled library
                if ($JPLib.ContentTypesEnabled)
                {
                    write-host -f green "The Library $($JPLib.title) exists in the site $($web.url), about to tune the view"
                    try
                    {
                        $x = $JPLib.Views.get_Item("Email")
                        if ($x.id -ne $null) # prevents duplicate entries
                        {
                            $JPLib.Views.Delete($x.ID.ToString())
                        }
                    }
                    catch
                    {}

                    if ($JPLib.Views["Email"] -eq $null) # prevents duplicate entries
                    {
                        # pick (or adapt) one of the CAML examples above
                        $viewQuery = '<OrderBy><FieldRef Name="mvSentOn" Ascending="FALSE" /></OrderBy><Where><IsNotNull><FieldRef Name="mvSentOn" /></IsNotNull></Where>'

                        $viewFields = New-Object System.Collections.Specialized.StringCollection
                        $viewFields.Add("DocIcon") > $null
                        $viewFields.Add("LinkFilename") > $null
                        $viewFields.Add("Title") > $null
                        $viewFields.Add("mvTo") > $null
                        $viewFields.Add("mvFrom") > $null
                        $viewFields.Add("mvSubject") > $null
                        $viewFields.Add("mvSentOn") > $null
                        $viewFields.Add("mvAttach_x0020_Count") > $null
                        # $viewFields.Add("DocType") > $null # this is accounting specific

                        $viewRowLimit = 50          # RowLimit property
                        $viewPaged = $true          # Paged property
                        $viewDefaultView = $false   # DefaultView property
                        $viewTitle = "Email"
                        $newview = $JPLib.Views.Add($viewTitle, $viewFields, $viewQuery, $viewRowLimit, $viewPaged, $viewDefaultView)
                    }
                }
            }
        }
    }
} # foreach site

Configuring Blob Cache correctly

Blob Cache

For snappy SharePoint performance, one great option to enable is Blob Caching. Make sure to first back up the web-config for your target web app, which you can navigate to via IIS.

I prefer to increase the max-age="600" to a full day: max-age="86400". This parameter forces remote browser caching, which really helps remote users on relatively poor connections.

Set an appropriate location; I prefer to segregate by web app name. Microsoft suggests a 10GB maxSize, with 20% extra free space. You can lower it; the cache rarely grows to that level.

Best is configuring this on a dedicated drive. Unless you have an enterprise-class SAN, dedicated spindles are the way to go.
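Putting those settings together, the BlobCache line in web.config looks roughly like this (the location and the file-type pattern below are examples; adjust both to your environment):

```xml
<!-- Example only: enable the cache, point it at a dedicated drive,
     cache common static file types, cap at 10GB, max-age of one day -->
<BlobCache location="D:\BlobCache\MyWebApp"
           path="\.(gif|jpg|jpeg|png|css|js|swf)$"
           maxSize="10"
           max-age="86400"
           enabled="true" />
```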

For every cache, you’ll want to know how to flush it:

$webApp = Get-SPWebApplication "http://SharePoint"
[Microsoft.SharePoint.Publishing.PublishingCache]::FlushBlobCache($webApp)
Write-Host "Flushed the BLOB cache for:" $webApp
It’s totally safe to flush the cache at any time without disruption, and best of all, wastes no water 🙂

AD User group membership not propagating into site collections

AD User group membership propagation issue

In some rare instances, users may exist within a Site Collection that don’t receive their AD group membership updates.

I’ve traced this down to recreated AD users that have the same account name, yet a new SID. The solution is to wipe the user references from the site collection.

Be forewarned, any user permissions will be wiped as well. One more excellent reason to only use AD groups for assigning permissions in SharePoint!

You can see this internal list and even delete the user by adapting this URL:
http://WebApp/ManagedPath/namedSiteCollection/_layouts/people.aspx?MembershipGroupId=0

Better to do it in PowerShell for speed, extensibility, consistency, and across many site collections. The trick comes down to a specific way to eliminate the user from the site collection:

$RootWeb.SiteUsers.Remove($MyUser)

Note trying $RootWeb.Users.Remove($MyUser) or $RootWeb.AllUsers.Remove($MyUser) will not work.

To finish it off, I prefer to re-add the user:

$RootWeb.EnsureUser($MyUser)

Here’s the full script, where I traverse through site collections in a Web App, filter them based on criteria (in this case the managed path), then carefully take the action on a list of users (one or more, comma separated), and output any failures along the way:

Start-SPAssignment -Global
$UsersToWipe = "DOMAIN\PoorBloke"   # one or more accounts, comma separated
$UsersToWipeArray = $UsersToWipe.Split(",")

$siteUrl = "http://SharePoint"
 
Write-Host "script starting $(get-date)"
 
$rootSite = New-Object Microsoft.SharePoint.SPSite($siteUrl)
$spWebApp = $rootSite.WebApplication 
foreach($site in $spWebApp.Sites)
{
 
 
 if ($site.Url -notlike "$siteurl/SpecificPath/*") 
 {
     Write-Host "Fast Skipping $($site.Url)"
 }
 else
  { 
   $rootWeb = $site.RootWeb;
 
   foreach ($MyUser in $UsersToWipeArray)
   {
        try
        {
            try
            {
                $user1 = $RootWeb.EnsureUser($MyUser)
            }
            catch
            {
                Write-Host "x1: Failed to ensure user $($MyUser) in $($Site.url)"
            }
 
            try
            {
                $RootWeb.SiteUsers.Remove($MyUser)
                $RootWeb.update()
            }
            catch
            {
                Write-Host "x2: Failed to remove $($MyUser) from all users in $($Site.url)"
            }
 
            try
            {
                $user1 = $RootWeb.EnsureUser($MyUser)
            }
            catch
            {
                Write-Host "x3: Failed to ensure user $($MyUser) in $($Site.url)"
            }
 
       }
       catch
       {
            Write-Host "x4: other failure for $($MyUser) in $($Site.url)"
       }
   }
 } #Site to process 
   
    $site.Dispose();  
 } #foreach Site
 
 
Write-Host "script finishing $(get-date)"
 
 
Stop-SPAssignment -Global

When users can’t access a taxonomy

When users cannot access a MMS taxonomy

The Managed Metadata Service is great; but what do you do when users can’t view some taxonomy entries? This occurs when the user does not have access to the HiddenTaxonomyList, native to each Site Collection.

You can view the list using SharePoint Manager from CodePlex, navigating the Object Model, or simply by modifying this URL to reflect your site collection URL:
http://SharePoint/sites/SiteCollection/Lists/TaxonomyHiddenList/AllItems.aspx where “http://SharePoint/sites/SiteCollection” is replaced with your site collection URL.

I recently found a situation where permissions by default were empty. Best would be to allocate all Authenticated users access. However what does one do if there are many site collections within a given Web Application? Here’s a script that will iterate through Site Collections, and grant the access:

$WebAppUrl = "http://SharePoint" # replace with your own web app
$WebApp = Get-SPWebApplication $WebAppUrl
 
function AddPerm ([Microsoft.SharePoint.SPList] $TargetObj, [string] $RoleValue, [string] $RoleGroup)
{ #SPWeb is implied and not passed as parms for efficiency!
    if ((!$RoleValue) -or (!$RoleGroup))
    {
    return; #race to be efficient on NullOp
    }
    try
    {
                $user = $SPWeb.ensureuser($RoleGroup)
                $roledef = $SPWeb.RoleDefinitions[$RoleValue]
                $roleass = New-Object Microsoft.SharePoint.SPRoleAssignment($user)
                $roleass.RoleDefinitionBindings.Add($roledef)
 
                $TargetObj.RoleAssignments.Add($roleass)  
    }
    catch
    {
    Write-Host -ForegroundColor DarkRed "ERR: Can't Assign $($RoleGroup)"
    }
}
 
for ($i=0; $i -lt $WebApp.Sites.Count; $i++)
{
    $site=$webapp.Sites[$i];
    $SPWeb=$site.rootweb;
    $list = $SPWeb.Lists["TaxonomyHiddenList"]
    addPerm $list "Read" "SharePoint NT Authenticated Users"
}

Send Email From PowerShell

Sending An Email from PowerShell

It’s easy to send an email from PowerShell, and it’s really useful for notifying yourself of the completion of a long-running script.

It’s also a way to document what ran when, if you are like me and running hundreds of scripts and need a way to organize documentation on when you run things.

Just alter the parameters below, and let ‘er rip and spam yourself!

   param(  
        [string] $From = "server@joelDomain.com",
        [string] $To = "joelplaut@joelDomain.com",
        [string] $Title = "title",
        [string] $Body = "body"
    )
    $SmtpClient = New-Object System.Net.Mail.SmtpClient
    $SmtpServer = "mail.domainserver.com"
    $SmtpClient.host = $SmtpServer
    $SmtpClient.Send($From,$To,$Title,$Body)
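The param() block at the top binds named arguments and supplies defaults. As a quick illustration of that binding (the scriptblock below is just a stand-in; it returns the message instead of sending it, so no SMTP server is needed):

```powershell
# Stand-in for the script above: same param() shape, but returns the
# would-be message rather than calling SmtpClient.Send
$notify = {
    param(
        [string] $From  = "server@joelDomain.com",
        [string] $To    = "joelplaut@joelDomain.com",
        [string] $Title = "title",
        [string] $Body  = "body"
    )
    [pscustomobject]@{ From = $From; To = $To; Title = $Title; Body = $Body }
}

# Override only what you need; the rest fall back to the defaults
$msg = & $notify -Title "Backup complete" -Body "All site collections processed"
Write-Host "$($msg.To) <- $($msg.Title)"
```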

Output User Profile Service Colleagues

Output User Profile Service Colleagues

SharePoint User Profile Service maintains the set of colleagues associated with each user registered within the User Profile Service. This is used for a range of social features, and is leveraged by third party products for managing relationships between users. The colleagues are initially defined based on the users with whom a user shares a direct reporting line. Users can then add or remove colleagues.

Below is a PowerShell script to output the set of colleagues for a given user, both into XML and also into a text file in list format.

## This script gets the colleagues for the specified user using the UPS web service, outputting to both XML and txt as a list
$uri = "http://mysites.FQDN.com/_vti_bin/UserProfileService.asmx?wsdl"
## $accountName is a string that contains the account name for which you need to get all the colleagues
$Acct = "JPlaut"
$accountName = "Domain\$($Acct)"
$OutputDir = "C:\Users\SharePoint Consultant\Documents\PowerShell\UPSReports\"

$userProfileWebServiceReference = New-WebServiceProxy -Uri $uri -UseDefaultCredential
$colleagues = $userProfileWebServiceReference.GetUserColleagues($accountName)
## Creates a GetUserColleagues<account>.xml file in $OutputDir which contains colleague information for the specified $accountName
$output = New-Object -TypeName System.IO.StreamWriter -ArgumentList "$($OutputDir)GetUserColleagues$($Acct).xml", $false
$output.WriteLine("<?xml version=""1.0"" encoding=""utf-8"" ?>")
$output.WriteLine("<Colleagues>")
foreach ($colleague in $colleagues)
{
    $accountName = $colleague.AccountName
    $privacy = $colleague.Privacy
    $name = $colleague.Name
    $isInWorkGroup = $colleague.IsInWorkGroup
    $group = $colleague.Group
    $email = $colleague.Email
    $title = $colleague.Title
    $url = $colleague.Url
    $userProfileId = $colleague.UserProfileId
    $id = $colleague.Id
    # element layout below is a reconstruction; the original markup was lost to formatting
    $output.WriteLine("<Colleague AccountName=""$accountName"" Name=""$name"" Email=""$email"" Title=""$title"" Group=""$group"" Privacy=""$privacy"" IsInWorkGroup=""$isInWorkGroup"" Url=""$url"" UserProfileId=""$userProfileId"" Id=""$id"" />")
}
$output.WriteLine("</Colleagues>")
$output.Dispose()

$colleagues > "$($OutputDir)GetUserColleagues$($Acct).txt"

As an added bonus, here’s a command to remove a specific Colleague from a user

$userProfileWebServiceReference.RemoveColleague("Domain\user","Domain\colleague")

While SharePoint does a good job of guessing colleagues based on the AD reporting structure, it doesn’t do a great job of keeping them up to date. As the AD reporting structure changes, stale Colleagues can remain, which can cause issues. If you want to reboot colleagues for a user, here’s how:

   $userProfileWebServiceReference.RemoveAllColleagues("domain\victim")

Extending the Outlook Ribbon programmatically

Extending the Outlook Ribbon

The MS-Office ribbon can be extended programmatically. In this article, I add a new tab to the ribbon, and new icons that open links to sites, then create a ClickOnce deployable package. We’ll explore a few tricks including how to import new icons. For a great overview of the Fluent UI, please see: https://msdn.microsoft.com/en-us/library/aa338198.aspx

Ribbon XML

Microsoft designed the Fluent UI (aka Ribbon) to be extensible. The easiest way to do this is via the Ribbon Designer. However I chose to hand-craft the solution based on Ribbon XML, after creating a new Visual Studio 2010 project and ensuring use of .NET 4.0.

To use an existing MS-Office Tab, just reference it by name, using tab idMso (the XML below is a minimal reconstruction, as the original markup was lost to formatting; labels are placeholders):

<tab idMso="TabMail">
  <group id="MyGroup" label="My Links">
    <button id="SiteButton1" label="Open Site" onAction="ActionOpenInsureds" imageMso="RecordsAddFromOutlook" />
  </group>
</tab>

However you can create your own tab. Just remember to give it a unique ID, and also add the label. Note the use of “id” rather than “idMso”:

<tab id="MyCustomTab" label="My Company">
  <group id="MyCustomGroup" label="SharePoint Links">
    <button id="SiteButton2" label="Open Site" onAction="ActionOpenInsureds" imageMso="RecordsAddFromOutlook" />
  </group>
</tab>
For your image, you can reference existing MS-Office images, which is easiest and familiar to users. Note that the tag “imageMso” is used. The other advantage, as we’ll soon see, is these images are already sized and loaded correctly:

 imageMso="RecordsAddFromOutlook"

MS-Office has a great set of familiar icons you can leverage. These are easier to use than importing your own (which we will get to shortly). To select from existing icons, check out this great site, which includes the XML to reference the MS-Office images.

To load your own images, first import the desired images into the Visual Studio project; under Resources is a great place. I recommend right-clicking and marking each to “Include”. These images will only ever appear if you explicitly binary-stream them in on Ribbon load. Here’s how. At the start of MyRibbon.xml, add a new callback reference called “GetImage” to load the images you will reference in this MyRibbon.xml:

<customUI xmlns="http://schemas.microsoft.com/office/2009/07/customui" loadImage="GetImage">

Rather than using imageMso, just use an image tag:

image="DMFLogo.ico"

Now for every image you reference, the loadImage method will be called, passing in the name of the image as a parameter. Let’s now add the getImage method to the myRibbon.cs:

public Bitmap GetImage(string imageName)
{
    Assembly assembly = Assembly.GetExecutingAssembly();

    // this is useful for navigating to discover the actual resource name reference
    // string[] x;
    // x = assembly.GetManifestResourceNames();

    Stream stream = assembly.GetManifestResourceStream("MyProjectName.Resources." + imageName);

    return new Bitmap(stream); // uses System.Drawing
}

At runtime, the stream is loaded based on the manifest reference. During initial debugging, it’s helpful to be able to see the manifest to ensure the path is correct. I’ve found it best to keep a few lines of code on hand for inspecting the manifest; if you ever need them, just uncomment these and examine the result at a breakpoint:

   //string[] x;
//x = assembly.GetManifestResourceNames();

As the images are stored in a folder, you’ll have to get the project name reference and the “Resources” folder path correct.

One last point: I found ICO files don’t give the best resolution. Microsoft recommends PNG, which is a good format to start with.

To open a browser as a ribbon action, adapt this function:

public void ActionOpenInsureds(Office.IRibbonControl control)
{
    string targetURL = "http://SharePoint/sites/MySite/SitePages/Index1.aspx";
    System.Diagnostics.Process.Start(targetURL);
    return;
}

This function is called based on the button onAction tag in myRibbon.xml:

 onAction="ActionOpenInsureds"

For completeness, here’s the full sample myRibbon.xml (reconstructed in outline, since the original markup was mangled; labels are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<customUI xmlns="http://schemas.microsoft.com/office/2009/07/customui" loadImage="GetImage">
  <ribbon>
    <tabs>
      <tab id="MyCustomTab" label="My Company">
        <group id="MyCustomGroup" label="SharePoint Links">
          <button id="textButton" label="Open Site" onAction="ActionOpenInsureds" imageMso="RecordsAddFromOutlook" />
          <button id="DMFButton" label="DMF" onAction="ActionOpenInsureds" image="DMFLogo.ico" />
          <button id="tableButton" label="Open Table" onAction="ActionOpenInsureds" imageMso="RecordsAddFromOutlook" />
        </group>
      </tab>
    </tabs>
  </ribbon>
</customUI>

ClickOnce

ClickOnce is a fabulous technology. Previously we had to build a deployment project, or purchase 3rd-party software. This ribbon can be published to a central location as a ClickOnce install. It will appear in the Control Panel, and it can check in itself for updates. Here’s a great overview of ClickOnce: https://msdn.microsoft.com/en-us/library/t71a733d

To get started, simply select “Build, Publish” in Visual Studio 2010 and follow the menus.

A few things to note:
1. Unless you sign your code with a digital certificate, users will get prompted
2. The cert you use needs to have “code signing” authority, best is to get one issued by the CA (Certificate Authority) in your organization
3. If you set pre-requisites, the user needs admin authority on their machine in order to check pre-requisites
4. There’s a registry change to control the .NET install prompts
5. Click-once does not support install for “All users” on the machine, only the current user
6. Even trying to run in silent mode still results in a single-acknowledgement window after the install

Silence is golden

I would expect running off My Documents would end up in the MyComputer Zone, which should be “Enabled” as a trusted zone for install without prompt, but here are the keys to experiment with:

HKLM\Software\Microsoft\.NETFramework\Security\TrustManager\PromptingLevel

Then add the following string values for one or more of the security zones, as appropriate:

  • MyComputer
  • LocalIntranet
  • Internet
  • TrustedSites
  • UntrustedSites

Set the value of each security zone to one of the following:

  • Enabled. The installation prompts the user if the solution is not signed with a trusted certificate. This is the default for the MyComputer, LocalIntranet, and TrustedSites zones.
  • AuthenticodeRequired. The user cannot install the solution if it is not signed with a trusted root authority certificate. The installation prompts the user if the solution is not signed with a trusted publisher certificate. This is the default for the Internet zone.
  • Disabled. The user cannot install the solution if it is not signed with a trusted publisher certificate. This is the default for the UntrustedSites zone.
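Put together as a .reg fragment (the zone values below are just one illustration of the defaults described above; adjust per zone as needed):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\Software\Microsoft\.NETFramework\Security\TrustManager\PromptingLevel]
"MyComputer"="Enabled"
"LocalIntranet"="Enabled"
"TrustedSites"="Enabled"
"Internet"="AuthenticodeRequired"
"UntrustedSites"="Disabled"
```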

The last option is to add this solution to the inclusion list, which is a list of trusted solutions.

Refining People-Picker

Refining The SharePoint People-Picker

In SharePoint there are several locations where a set of users is presented in what is known as the “People-Picker”. Examples include a “people” field in lists, and in assigning security.

One can manage the set of users and groups presented, however the underlying mechanism is not well known in the SharePoint community.

In short, it is the set of all users returned from AD (ActiveDirectory) plus the set of local users in the Site Collection being used.

In this article, I’ll provide guidance on how to adjust both the users returned from AD, as well as the users in the site collection.

AD Filtering

To return a subset of AD results, the following stsadm command is used:

stsadm -o setproperty -url http://SharePoint/ -pn peoplepicker-searchadcustomfilter -pv ""

In this example, http://SharePoint is your web application, and the "" is the LDAP query. This clears the AD filter, as the LDAP query is empty. Note there is no PowerShell equivalent in SP2010, and this applies to a Web Application and not individual site collections. To test the change, try editing the set of Site Collection Administrators or User Policy for the Web Application in Central Administration.

In the example below, an LDAP query is specified to select AD entries where users have a manager (which filters out all kinds of non-standard user accounts), and groups starting with “SharePoint”.

stsadm -o setproperty -url http://SharePoint -pn peoplepicker-searchadcustomfilter -pv "(|(&(objectcategory=group)(sAMAccountName=domain\SharePoint_*))(&(&(objectcategory=person)(objectclass=user))(manager=*)))"

Let’s check out the LDAP Query string above, with this bit of PowerShell:

$strFilter = "(|(&(objectcategory=group)(sAMAccountName=yourdomain\SharePoint_*))(&(&(objectcategory=person)(objectclass=user))(manager=*)))"
$objDomain = New-Object System.DirectoryServices.DirectoryEntry
 
$objSearcher = New-Object System.DirectoryServices.DirectorySearcher
$objSearcher.SearchRoot = $objDomain
$objSearcher.PageSize = 1000
$objSearcher.Filter = $strFilter
$objSearcher.SearchScope = "Subtree"
 
$colProplist = "name"
foreach ($i in $colPropList){$objSearcher.PropertiesToLoad.Add($i)}
 
$colResults = $objSearcher.FindAll()
 
foreach ($objResult in $colResults)
    {$objItem = $objResult.Properties; $objItem.name}

$colResults now has a nice array of results to work with.

While the above is all straightforward, the results in PeoplePicker could be less than perfect. If you test, the users returned may contain additional users, including those not returned by the LDAP Query.

It turns out, People Picker returns the superset of the LDAP Query results AND a local user list that is cached in a site collection. You can see this list by going to the Site Collection URL, plus this: “_catalogs/users/”. Note the default view does not allow you to delete items, but if you add a new view, you can add “Edit” as a column, and delete individual users one at a time.

The script below will delete all cached users in a Site Collection (except for Site Collection Administrators); but I don’t think you want to run it, as it will remove all user permissions as well!

  $url = "http://YourSiteCollection"
 
$web = get-spweb $url
$list = $web.Lists["User Information List"]
$listItems = $list.Items
$listItemsTotal = $listItems.Count
for ($x=$listItemsTotal-1; $x -ge 0; $x--)
{
    Write-Host("DELETED: " + $listItems[$x].name)
    Remove-SPUser $listItems[$x]["Account"] -web $url -confirm:$false
}
$web.dispose()

To get this right, we need to extract the LDAP query results into a hashtable, and loop through the cached user list and only remove entries that are not in the LDAP query.
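The matching step just described can be sketched with plain collections; the account names below are illustrative stand-ins for the LDAP results and the cached User Information List entries:

```powershell
# Sketch: keep only cached users that the LDAP query did NOT return.
# $ldapNames stands in for the names collected from $colResults above;
# $cachedUsers stands in for the User Information List entries.
$ldapNames = @("alice", "bob")
$cachedUsers = @("alice", "bob", "ghost-account")

# A hashtable gives constant-time lookups while looping the cached list
$known = @{}
foreach ($n in $ldapNames) { $known[$n] = $true }

$toRemove = @($cachedUsers | Where-Object { -not $known.ContainsKey($_) })
# $toRemove would then be fed to Remove-SPUser as in the script above
Write-Host "Would remove: $($toRemove -join ', ')"
```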

Gradually reducing the Change Log for a Web Application

Safely reducing the Change Log for a Web Application

Content Databases by default contain logs of all change events for 60 days. If your Content Databases are under high loads, consider culling these logs.

In general, reducing really large database tables should be done gradually, to prevent Transaction Log failure due to lack of disk space, leading to failures and rollbacks.

There are two related tables that are trimmed through this operation:
dbo.EventCache
dbo.EventLog

I do not recommend reducing these to less than 30 days, or disabling them. That is because they seem to be used for two purposes by SharePoint itself. One is for User Alerts, the other is by the search incremental crawl, for detecting content changes.

In the script below, I reduce the log duration one day at a time, and wait a full 10 minutes after triggering the daily Change Log cleanup Timer Job to run.

The internal name of the timer job is “job-change-log-expiration”. You can get the list of the internal timer job names by using this CmdLet:

Get-SPTimerJob | select name

Or alternatively, pipe the Web Application into it:

Get-SPWebApplication http://SharePoint | Get-SPTimerJob | select name

Once you get the Timer Job, you can execute it:

 $SubTJ = Get-SPTimerJob "job-change-log-expiration" -WebApplication $wa
$SubTJ.RunNow()

Note that the duration of log retention is stored as the type “TimeSpan”. The string format is actually “days.hours:minutes:seconds”, so you can change it to suit your needs.
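As a quick illustration of that TimeSpan arithmetic (pure PowerShell, no farm required; the 60-day starting value is just an example of the default retention):

```powershell
# TimeSpan strings parse as days.hours:minutes:seconds,
# so these are 1 day and 30 days respectively
[TimeSpan]$OneDay = [timespan]"1.00:00:00"
[TimeSpan]$TargetDuration = [timespan]"30.00:00:00"

# Simulate one pass of the reduction loop, starting from the 60-day default
$current = [timespan]"60.00:00:00"
$current = $current - $OneDay
Write-Host "After one pass: $($current.Days) days (target: $($TargetDuration.Days))"
```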

$wa=Get-SPWebApplication http://SharePoint
[TimeSpan]$OneDay=[timespan]"1.00:00:00"
[TimeSpan]$TargetDuration=[timespan]"30.00:00:00"
while ($wa.ChangeLogRetentionPeriod -gt $TargetDuration)
{
    $wa.ChangeLogRetentionPeriod=$wa.ChangeLogRetentionPeriod-$OneDay
    $wa.Update()
    $SubTJ = Get-SPTimerJob "job-change-log-expiration" -WebApplication $wa
         $SubTJ.RunNow()
    Write-Host -ForegroundColor DarkGreen "Change Log Retention Period reduced by a single day to $($wa.ChangeLogRetentionPeriod.Days)"
    Start-Sleep -s 600
}
 
Write-Host -ForegroundColor DarkRed "Already reduced the Change Log Retention Period to target"
