Reporting on all user permissions in a web application

Frequently I am asked to report on user permissions across the enterprise. While each site collection, each site, and possibly each library and item has its own permissions visible in place, there is no way to report on them all from the User Interface. This little PowerShell script I wrote will output all permissions in a CSV format that's easily opened in Excel and manipulated via a PivotTable. I even published this spreadsheet PivotTable in Excel Services, allowing end users to manipulate the views interactively in a browser. Here it is:

$siteCol = Get-SPSite "http://SharePoint"  #replace with your own site collection URL
$i = 0
$webPermissions = ""
Get-SPWeb -Site $siteCol -Limit ALL | ForEach-Object {
    $i++
    $j = 0
    $web = $_
    $str1 = $i.ToString() + "," + $web.Title + "," + $web.Url + ","
    Write-Host $str1
    foreach ($usr in $web.Users) { $j++; $webPermissions += $str1 + $j.ToString() + "," + $usr.UserLogin + "," + $usr.DisplayName + "," + $usr.Roles + "`r`n" }
    $web.Dispose()
}
$webPermissions += "`r`n"
$webPermissions | Out-File -FilePath L:\logs\JP_security.csv  #I am partial to my own initials, replace with your own name/path
$webPermissions  #echo the output to the console as well
$siteCol.Dispose()

Using the SP2013 Client Object Model in InfoPath managed code

Someone asked me today about using the SharePoint 2013 Client Object Model in Managed Code.  I’ve done this in VSTA (Visual Studio Tools for Applications) in C#.  It took a bit of experimentation to get it going; here’s how I did it:

I extracted the Client Object Model runtime DLLs from the SharePoint server.  That’s both Microsoft.SharePoint.Client.dll and Microsoft.SharePoint.Client.Runtime.dll.  For good measure I also took Microsoft.SharePoint.Client.Runtime.dll.deploy.

In my C# project (managed C# in InfoPath, edited in VSTA), I:

  1. Added three project references:
    1. System.Core
    2. Microsoft.SharePoint.Client
    3. Microsoft.SharePoint.Client.Runtime
  2. Added a using directive for Microsoft.SharePoint.Client
  3. Opened AssemblyInfo.cs to deal with the partial trust exception, and added:

using System.Security; //Added to try to resolve the partial trust error: JP
[assembly: AllowPartiallyTrustedCallers] //Added to try to resolve the partial trust error: JP

 

At that point, you can use the Client Object Model.  Getting data can be done in two different ways:

  • CAML: my native old-style preferred approach for simple queries
  • LINQ: for SQL-like queries; the one catch is you need to use SPMetal to create a runtime thunk of sorts specific to your target sites/lists
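For the CAML route, here’s a minimal sketch of a query through the Client Object Model.  I’m showing it in PowerShell against the same two client DLLs (the DLL folder, site URL, list title, and query are placeholder values); the managed C# in VSTA uses the same types and calls.

#load the client DLLs extracted earlier (placeholder folder)
Add-Type -Path "C:\CSOM\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\CSOM\Microsoft.SharePoint.Client.Runtime.dll"

$ctx = New-Object Microsoft.SharePoint.Client.ClientContext("http://SharePoint")
$list = $ctx.Web.Lists.GetByTitle("Documents")

$query = New-Object Microsoft.SharePoint.Client.CamlQuery
$query.ViewXml = "<View><Query><Where><Eq><FieldRef Name='Title'/><Value Type='Text'>Status Report</Value></Eq></Where></Query></View>"

$items = $list.GetItems($query)
$ctx.Load($items)
$ctx.ExecuteQuery()     #nothing goes over the wire until this call
$items | ForEach-Object { $_["Title"] }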

Hope that helps!

Eliminating metadata from the Folder Content Type

Once again I was compelled to swoop in to fix a mangled set of Metadata and Content Types for a key document library.  The simplest of the challenges was removing metadata from Folders, or so I thought.  In this case, somehow, all the fields in the Document Library were associated with the Folder Content Type, so adding a new folder presented the user with roughly ten fields to complete, none of which were needed.

My first attempt was to hide the fields, which didn’t succeed:

$Fields = $tContentType.Fields;
$Fields[$Field].set_showinaddform($false);   #$Field holds the field name
$Fields[$Field].set_showineditform($false);

I then tried the old standby of  trying to remove the field, where $i is the index of the field within the Content Type:

$Fields=$tContentType.Fields;
$Fields.item($i).remove()

I then tried my usual approach of redoing the FieldLinks (the list of fields presented on forms) by removing the selected fields, to no avail:

$fieldLinks = $tContentType.FieldLinks;  #simple shortcut reference
$fieldList = New-Object 'System.Collections.Generic.List[System.String]'  #new ordered list of fields
# add everything to the list EXCEPT the fields to be removed
for ($i = 1; $i -le $fieldLinks.Count; $i++) {
  $FieldName = $fieldLinks[$i - 1].Name;
  if ($FieldsToEliminate -notcontains $FieldName)  #if this is a field that belongs
    { $fieldList.Add($FieldName); }
}
$fieldLinks.Reorder($fieldList.ToArray())  #convert the list to an array as we trigger the Reorder method on the field links
$tContentType.Update()
By this time, I was getting pretty hot under the collar, having spent hours wrestling with removing a few fields from a Folder Content Type.  This simple script did the trick.  Here it is as a function.  Note how I defined the parameter types, and how I take care to clear and restore the ReadOnly flag for any Content Type that started out Read-Only:
 
Function Eliminate_Visible_Fields_ContentType( [Microsoft.SharePoint.SPContentType]$tContentType
                                        , [System.String]$tFields
                                        ) {
 $originallyReadonly = $tContentType.ReadOnly;
 if ($originallyReadonly) {$tContentType.set_readonly($false)};

 $FieldsToEliminate = $tFields.Split(',')  #convert the comma separated set of field names into an array
 foreach ($field in $FieldsToEliminate) {
   $tContentType.FieldLinks.Delete($field); }
 Write-Output "Content Type stripped!"

 $tContentType.Update()
 if ($originallyReadonly) {$tContentType.set_readonly($true)};
}
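Here’s a hypothetical call to the function, assuming a library titled "Documents" whose list-level Folder content type needs stripping (the web URL, library title, and field names are placeholders):

$web = Get-SPWeb "http://SharePoint"   #placeholder URL
$folderCT = $web.Lists["Documents"].ContentTypes["Folder"]   #the list-level Folder content type
Eliminate_Visible_Fields_ContentType -tContentType $folderCT -tFields "Project_x0020_Num,Project_x0020_Status"
$web.Dispose()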

A few points to note:

  • The input parameter tFields is a comma separated string of field names
  • The field name expected is the “Internal” field name.  That means the ugly name with the internal hex characters.  So my field “Project Num” can be deleted only by referencing “Project_x0020_Num”.
  • SharePoint Manager can be used to derive the internal field names
  • Referencing the RootWeb Field list can get you the internal field names programmatically, as shown below
  • Trying to delete by both the public and internal names is safe.  Best is to add them both to the list to be deleted; the correct one will be deleted.
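Here’s a minimal sketch of that programmatic lookup, assuming a web at a placeholder URL and a field displayed as “Project Num”:

$web = Get-SPWeb "http://SharePoint"   #placeholder URL
$web.Site.RootWeb.Fields | Where-Object { $_.Title -eq "Project Num" } | Select-Object Title, InternalName
$web.Dispose()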

Trusted MySite Host Location

How often can one make a change in Development that blows up Production?  Within the User Profile Service Application one can configure the MySite Host Location.

However, watch out: I made a change in a Dev environment that propagated and affected all Production users.

In Dev, I set the Trusted Host Location without setting a target audience.  This broadcast to all farms (including Production) that this location should be the default.

Users attempting to go to their MySite were instantly redirected to Dev!  Of course DNS and Production didn’t offer a clue.

Removing the Trusted Host location in Dev eliminated the problem.

In a related annoyance, the My Site Host location in Setup MySite defaults back to the Default Zone for the MySite web app.

With AAM set up correctly, it will change any entry back to the default zone.

So to change the MySite Host Location, it seems I’ll need to rebuild the web app with the desired default MySite URL.

Not too convenient since I have quite a few Web App custom settings such as Maximum file upload size and Super User Publishing Cache account definitions.
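If you do end up rebuilding the web app, it helps to capture those settings first.  Here’s a hedged sketch that records the two just mentioned (the web app URL is a placeholder):

$wa = Get-SPWebApplication "http://mysite"    #placeholder URL
$wa.MaximumFileSize                           #maximum upload size, in MB
$wa.Properties["portalsuperuseraccount"]      #object cache Super User
$wa.Properties["portalsuperreaderaccount"]    #object cache Super Reader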

RBS external provider invalid reference

Inside every Content Database is a key table called AllDocs; when configured for RBS, there is a field called DocFlags that can provide insights into the configuration.  If the value is 65868, it tells the RBS provider to use the deprecated farm ExternalBinaryStoreClassId, which was used as part of EBS in SP2007.  You can see whether this is set for your farm by running these two PowerShell commands.  For my farm, it gives a zeroed-out GUID:

PS C:\Users\SP2013Farm> $farm = Get-SPFarm
PS C:\Users\SP2013Farm> $farm.ExternalBinaryStoreClassId
Guid
----
00000000-0000-0000-0000-000000000000

The following SQL run in the context of your content database provides a count of this DocFlag:

select COUNT(*)
from AllDocs where DocFlags = 65868

This situation prevents the PowerShell Migrate() cmdlet from running. As a refresher, here’s the full PowerShell set of commands to set the minBlobSize and call Migrate():

$cdb = Get-SPContentDatabase -Identity "[replace with your content db]"
$rbs = $cdb.RemoteBlobStorageSettings
$rbs.MinimumBlobStorageSize   #show the current threshold
$rbs.GetProviderNames()
$rbs.Installed()
$rbs.Enable()
$rbs.MinimumBlobStorageSize = 1MB
$cdb.Update()                 #persist the settings on the content database
$rbs.Migrate()

For me, I get a hideous “Object Variable Not Set” for the migrate() command. Oddly, the underlying AllDocs records with this one 65868 flag all predate my installation of SP1 with June CU, hinting that this Service Pack may have fixed a condition going forward…

What occurs for DocFlags is that the 0x10000 (65536) bit is set to indicate an external provider (EBS) is in use; this bit should be zero for the typical SharePoint 2010 out-of-box RBS configuration.

The simple solution is to fix the offending bit. However this violates Microsoft’s rules telling us users not to muck in their database internals. We would never do that of course, but “hypothetically” here is how you can fix it. Take your hypothetical backup before hypothetically running this SQL:

update AllDocs
set DocFlags = DocFlags & 0xFFFEFFFF
where
( DocFlags & 0x10000 ) = 0x10000

What this does is clear the 65536 (0x10000) bit. This SQL would run (hypothetically speaking) in a fraction of a second.

We can then (again, of course hypothetically) enable RBS and do a Migrate() of content back into FILESTREAM.

Ciao, and happy Blobbing!

Secure Store Master Key Error

Don’t you hate mysterious, scary errors?  How about this one?

A critical incident has occurred where the Secure Store service application errored out because the master encryption key was not found.

Another error:

The Microsoft Secure Store Service application Secure Store Services failed to retrieve the master secret key.  The error returned was: ‘Unable to obtain master key.’

While obscure, this was easily solved. It seems the Secure Store Service encrypts the database of credentials.  When a new server joins the farm, it doesn’t yet have the decryption key.  Hence the above error occurs when the Secure Store Service is started on the newly joined farm server.

To fix it, in Central Admin, go to Manage Service Applications, select the Secure Store Service, and click "Refresh Key".  This propagates the key to all servers.

I did a Generate New Key for good measure, requiring a Passphrase (entered twice, of sufficient complexity).

I waited a minute for it to propagate, and started the service on the server, and checked ULS logs to confirm all was well in my happy farm.
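For reference, the same fix can be scripted.  Here’s a hedged PowerShell sketch, assuming a single Secure Store service application proxy and a placeholder passphrase:

$proxy = Get-SPServiceApplicationProxy | Where-Object { $_.TypeName -like "*Secure Store*" }
#equivalent of "Generate New Key": creates a new master key from the passphrase
Update-SPSecureStoreMasterKey -ServiceApplicationProxy $proxy -Passphrase "YourC0mplex!Passphrase"
#equivalent of "Refresh Key": re-encrypts and propagates using the same passphrase
Update-SPSecureStoreApplicationServerKey -ServiceApplicationProxy $proxy -Passphrase "YourC0mplex!Passphrase"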

With my small set of Secure Store Application IDs (ten or so) my Secure Store database size was around 11MB, comparatively tiny.

Then again, how much space could a dozen credentials take up?

SharePoint Farm Trusted Certificate

We all want our SharePoint farms to be blindingly fast.  Here’s an easy step you can take to improve performance, by reducing the traffic from your SharePoint servers that check on farm certificate validity.  Even though you may be using Classic NTLM as authentication for your Web Applications, inside SharePoint the components rely on Claims Authentication.

First, let’s export the Farm Certificate to C: with these two PowerShell commands:

$rootCert = (Get-SPCertificateAuthority).RootCertificate
$rootCert.Export("Cert") | Set-Content -Path "C:\SharePoint RootAuthority.cer" -Encoding Byte

Now, let’s import the certificate into the Microsoft Management Console (MMC):

1. Click Start, Run, MMC

2. Add the Certificates snap-in:


3. Select “Computer Account”


4. Then import the Certificate:

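Alternatively, here’s a hedged PowerShell sketch that adds the exported certificate to the local computer’s Trusted Root Certification Authorities store without the MMC snap-in (the .cer path matches the export above):

$cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2("C:\SharePoint RootAuthority.cer")
$store = New-Object System.Security.Cryptography.X509Certificates.X509Store("Root", "LocalMachine")
$store.Open("ReadWrite")
$store.Add($cert)
$store.Close()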

FAST Search and the ticking time bomb

Tick…tick..tick…when you install FAST Search for SharePoint (FS4S), you probably have a time bomb set to go off exactly one year later to the minute. Unless you configured trust certificates between servers using a Certificate Authority, FAST/SharePoint uses a self-signed certificate with a one year expiration.

Worse, the ULS logs only point out that SharePoint can’t connect to the Content Distributor (the server name and port below are environment specific):

Failed to connect to srv-fast01.ReplaceWithYourDomain.com:13391 sp. Error=Failed to initialize session with document engine: Unable to resolve Contentdistributor [documentsubmitterworkerthread.cpp:132] d:\office\source\search\native\gather\plugins\contentpi\documentsubmitterworkerthread.cpp

I also saw the following error which is either misleading or unrelated:

At memory capacity. Load is 80%, configured to block at 80%. have been waiting 00:57 to queue this document [documentmanager.cpp:969]

To make matters worse, without connectivity to FAST, crawls hang and get stuck saying “Stopping”, and clearing the FAST Index hangs in Central Admin; it’s not pretty…

Let’s take a step back. FAST requires SSL for search-crawl communication between the FAST Server(s) and SharePoint Server(s). To communicate via SSL, a certificate needs to be generated on the FAST Server and installed on the SharePoint server.

If you are getting errors connecting to the Content Distributor, it makes sense to first see if it is running, by running the following PowerShell in FAST: nctrl status

A very useful PowerShell command shows the connectivity and certificate status:

Ping-SPEnterpriseSearchContentService srv-fast01.[ReplaceWithYourDomain].com:13391

The output below shows a list of the certificates. The timestamp doesn’t lie; this is what I was doing at 5:30am this morning. When the problem existed, the highlighted entry showed an ExpiryDate of the day before and a ConnectionSuccess of False. Note the port is 391 above your default base port, which is 13000 unless you changed it on installation.

[Screenshot: Ping-SPEnterpriseSearchContentService output listing the certificates, with ExpiryDate and ConnectionSuccess columns]

Here’s how to create a refreshed cert that will again celebrate its own birthday via expiration. On the FAST Server, open a PowerShell window in D:\YourFASTdirectory\installer\scripts and run:

.\ReplaceDefaultCertificate.ps1 -generateNewCertificate $true

Microsoft helpfully provides a PowerShell script to load the certificate into the SharePoint server. Before we run that, let’s configure things so we generate a certificate that’s good for longer than a single year. To do that, let’s edit the script called certificatesetup.ps1 in C:\FASTSearch\installer\scripts\include, and find the line around line number 246 which says:

Add-Content -Path $infFile -Value "SuppressDefaults=true"
Add the following lines underneath it:

Add-Content -Path $infFile -Value "ValidityPeriod=Years"
Add-Content -Path $infFile -Value "ValidityPeriodUnits=20"  #now we are good for 20 years

You’ll want to be sure to get the SSA name correct, as well as the service account; this is the account under which the Application Pool that hosts the FAST Search connector Service Application runs in IIS. Note you’ll need to copy both the SecureFASTSearchConnector.ps1 script and the certificate itself to the SharePoint server (find the certificate by going to Certificates (Local Computer)\Trusted Root Certification Authorities in MMC). You will also need to halt the FAST Search Service and FAST Search Monitoring services in services.msc before being able to generate a new certificate. When exporting the certificate, make sure to export the private key, for which you will be prompted for a password. You want it in PFX format, not in DER or CER format. If you try to use MMC, you may find you won’t be able to export the cert with the private key. The good news is that the FAST script already exports the certificate in the right format automatically, in this location: D:\FASTSearch\data\data_security\cert
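A quick hedged sketch for halting those two services from PowerShell on the FAST server, matching on display name rather than guessing the exact service names:

#stop the FAST Search for SharePoint and FAST Search Monitoring services before regenerating the certificate
Get-Service -DisplayName "FAST Search*" | Stop-Service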

For the FAST SSA, use the Content SSA, and not the Query SSA. You can determine the service account easily by checking the Service Application in Central Admin and clicking “Properties”.

.\SecureFASTSearchConnector.ps1 -certPath "path of the certificate\certificatename.pfx" -ssaName "name of your content SSA" -username "domain\username"

If you do need to change the port for your Content Distributor, here are the PowerShell commands. Remember to replace the server/domain/port with your own:

$SearchSSA = Get-SPEnterpriseSearchServiceApplication -Identity 'FAST Search Connector'
$SearchSSA.ExtendedConnectorProperties["ContentDistributor"] = "srv-fast01.YourDomain.com:13391"
$SearchSSA.Update()

Removing metadata from Folders

A dear friend of mine inadvertently tagged his SharePoint Document Library folders while using a combination of Default Metadata and Explorer mode.  Could SharePoint Manager solve this? Nope, much of SharePoint Manager (a quite nice utility) is read-only.  PowerShell to the rescue!

$web = Get-SPWeb "http://YourWebURL"
$list1 = $web.Lists["YourLibraryTitle"]
$list1.Folders | ForEach-Object `
{ $_.properties["YourFieldName"]=""
  $_.update()
}
$list1.update()
$web.update()

To understand the code, one needs to recognize that Folders are stored separately from files; that’s usually an annoyance, but in this case, where we are only interested in Folders, it works great. Note the three separate update() methods. All three are indeed required.

Of course this can be extended into a function, and it could process all libraries, in all webs, etc…
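Here’s a hedged sketch of that extension, clearing a hypothetical field "YourFieldName" from the folders of every document library in every web of a site collection (the URL and field name are placeholders):

$site = Get-SPSite "http://YourSiteCollection"
foreach ($web in $site.AllWebs) {
  foreach ($list in ($web.Lists | Where-Object { $_.BaseType -eq "DocumentLibrary" })) {
    $list.Folders | ForEach-Object {
      $_.Properties["YourFieldName"] = ""
      $_.Update()
    }
    $list.Update()
  }
  $web.Update()
  $web.Dispose()
}
$site.Dispose()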

Migrating a full copy of MS-Project Server content

There’s a frequent need to refresh MS-Project Server test environments from Production.  The conventional wisdom holds that you need to delete and recreate the SharePoint Web Application (PWA).   However this requires you to:

  • Recreate Alternate Access Mappings (AAM)
  • Set quotas
  • Refine Blocked File Types
  • Set User Policy
  • Set Service Application connections…

The faster and smoother way is instead to drop the old PWA content DB and reconnect the new PWA content database.  Regardless, the very first step should be dropping the Project Server Web App.  This is done through the Project Server Service Application in Central Administration, Service Applications.  Now let’s switch over to the replacement PWA content DB.  This houses both the top-level Site Collection and the PWA application Site Collection, plus all webs under it (mostly project sites):

Dismount-SPContentDatabase [the old content database we are replacing]
Mount-SPContentDatabase -Name [the new content database] -DatabaseServer [your DB server] -WebApplication "http://pwa"  [change as needed]

However there is one big wrinkle.  Doing this seems to leave an orphaned explicit managed path definition in the Web Application that prevents creation of the PWA Site Collection.  This appears to be what leads people to simply delete and recreate the Web Application.  However the solution is quite simple: remove the orphaned site collection:

Remove-SPSite -Identity "http://pwa/"  [change as needed]
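To confirm whether a leftover explicit managed path is the culprit, this hedged one-liner lists the managed paths on the web application (the URL is a placeholder):

Get-SPManagedPath -WebApplication "http://pwa"   #look for a lingering explicit PWA path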

When recreating the Project Web Application note:

  • When removing the Project Server Web App, you may wish to uncheck the “Remove Content DB” checkbox
  • Halting the Timer jobs may be required for the steps above.  One advantage of using PowerShell is that it does not depend on the Web App Application Pool to be active
  •  “PWA” must be the name of the project web app
  • Get the database names right, to ensure you connect to the target databases migrated to this environment

If your replacement PWA DBs come from an environment with different security, you’ll need to adjust security manually at the database level.  I prefer to take screenshots in SQL Studio for comparison before starting.

OLAP Configuration

The OLAP cube inevitably needs to be reconfigured.  When refreshing the databases, the OLAP configuration will now mirror the source environment.  Make sure you know the name of the OLAP server and database to reset the OLAP configuration.

Delete the Data Connections OLAP folder, as well as the 13 assorted cubes and the folder in which they reside.  When rebuilding the cube, these get recreated with reference to the OLAP Server and database.

Lastly, check the cube rebuild frequency and rebuild the cube.  You should see a successful OLAP cube build log, the new cubes recreated, and data in the cubes that is visible in the Excel PivotTables stored in the OLAP folder that is as current as the source PWA database.

Testing

Check the main navigation links.  Any hard-coded navigation links in the source may not get repointed automatically in the newly refreshed environment and could require hand-tuning.

Check that the new Excel OLAP pivots are in a SharePoint Excel Services Trusted Location.
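A hedged way to verify that from PowerShell (assumes a single Excel Services service application in the farm):

$esa = Get-SPExcelServiceApplication
Get-SPExcelFileLocation -ExcelServiceApplication $esa | Select-Object Address, IncludeChildren, LocationType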

Other areas to test

  1. Add-ons
  2. Links
  3. Data (projects, resource pool and associated metadata)
  4. Sites
  5. Views
  6. Configurable fields and Lookup tables
  7. Enterprise Calendar
  8. Security
  9. Ability to access via MSPS client
  10. Scheduled backups
  11. Quick Launch settings in Server Settings
  12. Time/task management settings
  13. Project Detail pages
  14. Project site templates
  15. Ability to add a Risk or Issue to project sites
  16. OLAP
  17. Reports

When you’ve done a refresh once, you are golden!  I like to say “only in technology can you do something once, then be considered an ‘expert’”!