Feature Reporting and Deactivation across a SharePoint Farm

Reporting on features across a SharePoint Farm

Before retracting a SharePoint solution, it is best to deactivate the associated feature in each Site Collection. Here’s a script that reports on which Site Collections have a feature enabled, and optionally deactivates the feature. It is easily adapted to Web or Web Application features as well. First, get the SPFeature GUID using the Get-SPFeature cmdlet. Then just flip the $Deactivate switch to $true to actually deactivate. Note that a feature that is not found generates an error that a try/catch block will not catch. The quick workaround is to capture the output into $OutLine, then clear the console and write the summary at the end of the run.

$OutLine=$null;
$Deactivate = $false;
$a=Get-SPFeature 359d84ef-ae24-4ba6-9dcf-1bbffe1fb788     #my site collection feature, substitute for yours
 
Get-SPWebApplication | Get-SPSite -Limit All | % {
    $b = Get-SPFeature -Identity 359d84ef-ae24-4ba6-9dcf-1bbffe1fb788 -Site $_.Url

    if ($b -ne $null)
    {
        $OutLine = $OutLine + "Found active in $($_.Url)`n"
        $b = $null
        if ($Deactivate)
        {
            Disable-SPFeature 359d84ef-ae24-4ba6-9dcf-1bbffe1fb788 -Url $_.Url
        }
    }
}
cls
if ($Deactivate)
{
    Write-Host "Deactivated all; Here are Site Collections where it was active:"
}
else
{
    Write-Host "Here are Site Collections where it is active:"
}
write-Host $OutLine

Reporting on all SharePoint Search Scopes

Search scopes are often created to refine the results returned by SharePoint Search. I’ve written this small snippet of PowerShell as an easy way to get a report on all scopes. I decided not to embellish it and keep it quick and (not too) dirty; here goes:

$a = Get-SPEnterpriseSearchServiceApplication #grabs Content and Query
$scopes = $a | Get-SPEnterpriseSearchQueryScope
 
foreach ($Scope in $scopes)
{
    Write-Host $Scope.Name
    Write-Host "======================="
    $Scope.Rules  # outputs all the rules
}

Adjusting Quicklinks Programmatically in SharePoint

Use PowerShell to set Quicklinks Programmatically in SharePoint

Wouldn’t it be great to hide all lists of a certain type in a farm? Perhaps Tasks, Calendars, or Discussions, or all of the above. Programmatically, they can be hidden from or exposed on navigation using set_OnQuickLaunch(). Here’s how a given library is hidden from quick launch:

$LIB.set_OnQuickLaunch($false)

Let’s now do it across a full web application; all the site collections, sites, and for a set of libraries.

Add-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue
Start-SPAssignment -Global
 
$mylogfile="C:\logfolder\ongoinglogfile.txt"
 
$envrun="Prod"          # selects environment to run in
   
if ($envrun -eq "Dev")
{
$siteUrl = "http://devdocs.SharePoint.com"
$LibsToFlip = "Tasks,Site Pages,Calendar,Documents"
 
$LibsToFlipArray = $LibsToFlip.Split(",")
}
elseif ($envrun -eq "Prod")
{
$siteUrl = "http://docsny.SharePoint.com"
 
$LibsToFlip = "Tasks,Site Pages,Calendar,Documents,Team Discussion"
$LibsToFlipArray = $LibsToFlip.Split(",")
}
else
{
Write-Host "ENVIRONMENT SETTING NOT VALID: script terminating..."
$siteUrl =  $null;
return;
}
 
Write-Host "Quick Launch Flip script starting $(get-date)"
 
 
 
if ($siteurl)
{
    $rootSite = New-Object Microsoft.SharePoint.SPSite($siteUrl)
    $spWebApp = $rootSite.WebApplication
    foreach ($site in $spWebApp.Sites)
    {
        Write-Host $site.Url

        # if ($site.Url -like "$siteurl/personal/*")
        if ($site.Url -like "$siteurl*")
        {
            $webs = $site.AllWebs
            $WebsCount = $webs.count

            for ($wi=0; $wi -lt $WebsCount; $wi++)
            {
                $web = $webs[$wi]
                $changed = $false

                $lists = $web.Lists
                $listcount = $lists.count
                for ($li=0; $li -lt $listcount; $li++)
                {
                    $JPLib = $lists[$li]

                    if ($LibsToFlipArray -contains $JPLib.Title)
                    {
                        Write-Host -ForegroundColor DarkGreen "$($JPLib.Title) in $($web.url)"
                        $JPLib.set_OnQuickLaunch($false)
                        $JPLib.Update()
                        $changed = $true
                    }
                }
                if ($changed)
                {
                    #$Web.update()
                }
            }
        }
    }
}
 
Stop-SPAssignment -Global
###########################

FAST SharePoint Property Mapping Report

Generating a FAST SharePoint Property Mapping Report

In Search, it is not an exaggeration to say that the property mapping is the heart of customized search intelligence. Managed properties allow search to be tailored to an organization’s needs. I thought it would be useful to report on the mapping of managed properties to crawled properties. I use a CSV format for the report, where a ‘|’ is the delimiter and a semicolon separates crawled properties. Using Excel, one can easily convert pipe-delimited text into columns.

The first thing we want to do is get the collection of managed properties using Get-FASTSearchMetadataManagedProperty. Then, for each managed property, we get the associated crawled properties using the GetCrawledPropertyMappings() method. Here’s the full script:

$ReportFileName = "C:\temp\MappingReport.csv"
$sep = '|'
cls
 
$LineOut = "Name$($sep)Description$($sep)Type$($sep)Mapping"
add-content $ReportFileName $($LineOut)
 
$mps = Get-FASTSearchMetadataManagedProperty
 
Foreach ($MP in $mps)
{
    $q = $MP.getcrawledPropertyMappings();
    $CPs=$null
    if ($q.gettype().Name -eq "CrawledPropertyMappingImpl") 
    {
        foreach ($cp in $q)
        {
            $CPs = "$($cps);$($cp.name)"
        }
        if ($CPs -ne $null)
        {
            $cps = $CPs.remove(0,1);
        }
    }
    else
    {
        $CPs = $q.gettype().Name;
    }
 
    $LineOut = "$($MP.Name)$($sep)$($MP.Description)$($sep)$($MP.Type)$($sep)$($CPs)"
    add-content $ReportFileName $($LineOut)
}

Removing a stubborn Content Type from a SharePoint Library

Removing a Content Type from a SharePoint Library

Having a single content type in a library can make the library sing for end users, by avoiding choices and prompts. However, removing an existing Content Type can be a problem. Primarily, if it is still in use, SharePoint will refuse the removal with “Content Type is still in use”. The first thing to do is clear the recycle bin, if feasible. Microsoft reports this isn’t necessary, but when the Content Type hits the fan, we have to try a few things.

I recently found a case where the following PowerShell still failed when trying the deletion two different ways, even with unsafe updates enabled:

$web = Get-SPWeb $WebUrl
$web.set_AllowUnsafeUpdates($true)
$list = $web.Lists[$ListName]
$oldCT = $list.ContentTypes[$OldCTName]
 
$oldCTID = $oldCT.ID   #fails, still in use
$list.ContentTypes.Delete($oldCTID) #fails, still in use
$oldCT.delete()
$web.set_AllowUnsafeUpdates($false)
$web.Dispose()

I finally tracked it down to documents that were checked out to a user and had never been checked in. Without a checked-in version, they weren’t visible. The trick is to take ownership of these, then change their content types. Only then can the unused Content Type be deleted.
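That cleanup can be scripted. Here’s a sketch (assuming $WebUrl and $ListName are already set, and the script runs as a site collection administrator); SPList exposes a CheckedOutFiles collection of files with no checked-in version, and SPCheckedOutFile offers a TakeOverCheckOut() method:

```powershell
# Sketch: take ownership of files that were never checked in, then check
# them in, so their content type can be changed and the old CT deleted.
# Assumes $WebUrl and $ListName are already defined.
$web  = Get-SPWeb $WebUrl
$list = $web.Lists[$ListName]

# CheckedOutFiles holds files with no checked-in version -
# invisible to everyone except the user who checked them out
foreach ($cof in $list.CheckedOutFiles)
{
    Write-Host "Taking over: $($cof.Url) (was checked out to $($cof.CheckedOutBy))"
    $cof.TakeOverCheckOut()
}

# Now force a check-in so the documents become visible and editable
foreach ($item in $list.Items)
{
    if ($item.File.CheckOutType -ne "None")
    {
        $item.File.CheckIn("Checked in by administrator")
    }
}
$web.Dispose()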

To automate the reassignment of a Content Type, see my next blog article: Reassigning Content Types programmatically.

Programatically reassigning Content Types in a library

Reassigning Content Types in a library using PowerShell

It is useful to be able to change the Content Type of each document in a library to a new Content Type. Perhaps you are consolidating, or just moving to a new Content Type hierarchy. There are a few tricks to changing a document’s Content Type. First, you’ll need to force a check-in if a document is checked out. Next, you’ll want to re-fetch the item by ID, specifically using $list.Items.GetItemById($item.ID); any other attempt to reassign the Content Type will fail. Lastly, either do a SystemUpdate(), or clean up after yourself by restoring the timestamp and editor and deleting the interim versions. Note there needs to be sufficient metadata, or the change will result in an error and/or the document being left checked out. This function traps and reports that condition. Here’s the function we will use:

function Reset-SPFileContentType ($WebUrl, $ListName, $OldCTName, $NewCTName)
{
    #Get web, list and content type objects
    $web = Get-SPWeb $WebUrl
    $list = $web.Lists[$ListName]
    $oldCT = $list.ContentTypes[$OldCTName]
    $newCT = $list.ContentTypes[$NewCTName]
    $newCTID = $newCT.ID
 
    #Check if the values specified for the content types actually exist on the list
    if (($oldCT -ne $null) -and ($newCT -ne $null))
    {
        #Go through each item in the list
        $list.Items | ForEach-Object {
            #Check if the item content type currently equals the old content type specified
            if ($_.ContentType.Name -eq $oldCT.Name)
            {
                $ForcedCheckin=$false;
                if ($_.File.CheckOutType -ne "None")
                {
                   try { 
                     $_.File.CheckIn("Checked In By Administrator");
                     $ForcedCheckin=$true;
                     write-host -ForegroundColor Yellow "Forced check-in for $($_.File.Title)"
                   }
                   catch { 

                     write-host -ForegroundColor Red "FAILED to force check-in for $($_.File.Title)"
                     $ForcedCheckin=$false;
                   }
                }
 
                #Check the check out status of the file
                if ($_.File.CheckOutType -eq "None")
                {
 
                   $item=$_
                   [System.DateTime]$date = $item["Modified"]
                   $user = New-Object microsoft.SharePoint.SPFieldUserValue($web, $item["Editor"])
 
                   try { #untested try, failure could be due to inadequate required metadata
                    #Change the content type association for the item
      [Microsoft.SharePoint.SPSecurity]::RunWithElevatedPrivileges(
      {
        $item2 = $list.Items.GetItemById($item.ID);
#CHECKOUT NOT REALLY NEEDED, IF MUSTCHECKOUT IS DISABLED EARLIER
#        $item2.File.CheckOut()
 
#write-host $item2['Name']
Write-Host "." -NoNewline
 
 
        $item2["ContentTypeId"] = $newCTID
        $item2.Update()
 
        $item2["Modified"] = $date;
        $item2["Editor"] = $user;
        $item2.Update()
 
        #get rid of the last two versions now, trapping any errors 
        try { $item2.Versions[1].delete() } catch {write-host -foregroundcolor red "Error (1) could not delete old version of $($item2['Name'])"}
        try { $item2.Versions[1].delete() } catch {write-host -foregroundcolor red "Error (2) could not delete old version of $($item2['Name'])"}
        if ($ForcedCheckin)
        {try { $item2.Versions[1].delete() } catch {write-host -foregroundcolor red "Error (3) could not delete ForcedCheckin version of $($item2['Name'])"}}
 
#get rid of the last version now
 
#        $item2.File.CheckIn("Content type changed to " + $newCT.Name, 1)
     } )
 
    } catch { write-host -foregroundcolor red "Error (possibly inadequate metadata) updating file: $($item.Name) from $($oldCT.Name) to $($newCT.Name)" }
 
            }
                else
                {
                    write-host -foregroundcolor red "File $($_.Name) is checked out to $($_.File.CheckedOutByUser.ToString()) and cannot be modified";
 
                }
            }
            else
            {
                 #Write-Host -ForegroundColor DarkRed "File $($_.Name) is associated with the content type $($_.ContentType.Name) and shall not be modified `n";
            }
        }
    }
    else
    {
        write-host -foregroundcolor red "One of the content types specified has not been attached to the list $($list.Title)"
 
    }
 
  $web.Dispose()
}

Next, let’s just call the function passing in the old name and the new name:

reset-spfilecontenttype -weburl $weburl -listname $JPlib  -oldctname "Documents" -newctname "DOCS"

Clearing out all SharePoint document versions for a set of documents

Wiping out all SharePoint document versions for a set of documents

Sometimes users create many versions of a document inadvertently. An example is a PDF whose properties may have been edited frequently. SP2013 has Shredded Storage, which stores deltas (differential saves), but in SharePoint 2010 versions can result in a lot of wasted disk storage. Let’s clean it up!

$web=Get-SPWeb "http://SharePoint"
 
   for ($i=0;$i -lt $web.Lists.Count;$i++)
   { 
   $JPLib=$web.Lists[$i];
   $A_Lib_Count++;
   $SkipLib=$true; #true
    
   if ( ($JPlib.BaseType -ne "DocumentLibrary") -or ($JPlib.hidden) )
    {
      # forget the rest and return to top
      Write-Host -foregroundcolor blue "fast test skipping Library: $($JPlib)";   
      Add-Content $mylogfile "Skipping Library: $($JPlib.title)`n";   
    }
    elseif ($JPLib.Title -Match "SitesAssets|Photo|Image|CustomizedsReports|Templates|Pages|Picture|cache|style|Slide")
    {
      # forget the rest and return to top
      Write-Host -foregroundcolor blue "fast test skipping Library because it mentions $Matches: $($JPlib)";   
      Add-Content $mylogfile "Skipping Library: $($JPlib.title)`n";   
    }
    elseif ($JPLib.BaseTemplate -ne "DocumentLibrary")   #alternatively, only skip if -eq XMLForm
    {
      # forget the rest and return to top
      Write-Host -foregroundcolor blue "fast skipping Library because it is not of base DocumentLibrary, it is BaseType:$($JPlib.basetemplate): $($JPlib.title)";   
      Add-Content $mylogfile "fast skipping Library because it is not of base DocumentLibrary, it is BaseType:$($JPlib.basetemplate): $($JPlib.title)`n";   
    }
    elseif (($JPLib.ThumbnailsEnabled) -or ($JPLib.DefaultView -eq "AllSlides"))
    {
      # forget any library with thumbnails, these are not normal doclibs, and return to top
      Write-Host -foregroundcolor blue "fast test skipping Library because it has Thumbnails/Slides $($JPlib)";   
      Add-Content $mylogfile "fast test skipping Library because it has Thumbnails/Slides: $($JPlib)`n";   
    }
    else
    {  $SkipLib=$false; }
 
 
    if (!$SkipLib)
    {
      write-Host -foregroundcolor darkgreen "Processing Library to Merge: $($JPlib)";   
      Add-Content $mylogfile "Processing to merge: $($JPlib)`n"; 
 
       $JPItems = $JPLib.items;
       $JPCount = $JPitems.count;
 
          for ($itemIndex=0; $itemIndex -lt $JPCount; $itemIndex++)
          {
            $JPItem=$JPItems[$itemIndex];
            $FName=$JPItem.file.name;
 
 
            if ($FName -match "\.pdf$") #you choose the criteria, here I match on PDFs
            {
                $JPFile=$JPItem.file;
                Write-Host "Deleting $($JPFile.Versions.Count), size of each: $($JPFile.properties.vti_filesize),$($JPitem.url)"
 
                $JPfile.Versions.DeleteAll()
            }
 
            continue; # don't mess with timestamp/author just yet
             
 
             [Microsoft.SharePoint.SPSecurity]::RunWithElevatedPrivileges(
                {
                [System.DateTime]$date = $SourceItem["Modified"]
                $TargetItem["Modified"]=$date;
                try {
                $TargetItem["Editor"] = $SourceItem["Editor"]
                }
                catch
                {
                    write-host -foregroundcolor red "Error could not assign editor of $($targetItem.url)"
                }
 
 
                $TargetItem.update()   
                try
                {  # two deletes required
                    $TargetItem.Versions[1].delete()
                    $TargetItem.Versions[1].delete()
                }
                catch
                {
                    write-host -foregroundcolor red "Warning: could not delete old version of $($targetItem.url)"
                }
 
                })
          }
    }
}


Scripting SharePoint logging to ULS and Event Log

It’s easy to dump output to a text file from a script, but for enterprise-class logging the two standards are the Event Log and ULS (Unified Logging System). First, ULS.

Below, in PowerShell, I grab a reference to the SPDiagnosticsService, define an SPDiagnosticsCategory, then call the WriteTrace() method:

$diagSrc = [Microsoft.SharePoint.Administration.SPDiagnosticsService]::Local
$diacategory = new-object Microsoft.SharePoint.Administration.SPDiagnosticsCategory("MyTestCategory",[Microsoft.SharePoint.Administration.TraceSeverity]::Monitorable, [Microsoft.SharePoint.Administration.EventSeverity]::ErrorCritical)
$diagSrc.WriteTrace(98765, $diacategory, [Microsoft.SharePoint.Administration.TraceSeverity]::Monitorable, "Write your log description here" )

ULS is a good standard central way to go, but let’s move onto writing into the Event Log, which is extra useful given we are first going to create a custom application log:

New-EventLog -LogName MyCustomScripts -Source scripts

The first challenge: if the log already exists, New-EventLog throws an error, even if you encapsulate it in a try/catch. The trick is to leverage the Get-EventLog cmdlet.

First, to see what exists, format as a list:

Get-EventLog -list
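Putting that trick to use, here’s a minimal sketch that creates the custom log only when Get-EventLog cannot find it (log and source names match the ones above):

```powershell
# Sketch: create the custom event log only if it doesn't already exist.
# Get-EventLog -List enumerates existing logs without throwing.
$LogName = "MyCustomScripts"
$exists = Get-EventLog -List | Where-Object { $_.Log -eq $LogName }
if (-not $exists)
{
    New-EventLog -LogName $LogName -Source "Scripts"
    Write-Host "Created event log $LogName"
}
else
{
    Write-Host "Event log $LogName already exists; skipping creation"
}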

You now have your very own Event Log, and can write into it with your own event IDs, messages, and severity levels. Here are two working examples:

Write-EventLog -LogName MyCustomScripts -Source Scripts -Message "trying 4142 it works ... COOL!" -EventId 4142 -EntryType information
Write-EventLog -LogName MyCustomScripts  -Source Scripts -Message "trying 4942 as an error" -EventId 4942 -EntryType error

Now let’s simplify for re-use and consistency. Let’s declare some basics at the top of all scripts:

$Eventing = $true;  #determine if any events are written to the event log
$LogName = "JoelScripts"
$SourceName = "Scripts"
$ScriptID = 3; # unique number per script

Here’s a one-line function to make life simpler for our scripts:

function Write-MyLog([int] $EventID, [string] $Description, [system.Diagnostics.EventLogEntryType] $Severity)
{
    if ($Eventing)
    {
    Write-EventLog -LogName $LogName -Source $SourceName -Message $Description -EventId $EventID -EntryType $Severity
    }
}

Now let’s add a line at the start and end of the scripts to trigger an information event on what’s running. Note that references to $MyInvocation contain information about the currently running script:

Write-MyLog -Description "Start of $($MyInvocation.MyCommand.Path)" -EventId $ScriptID -Severity information
Write-MyLog -Description "End of $($MyInvocation.MyCommand.Path)" -EventId $ScriptID -Severity information

Lastly, here’s a sample normal message evented for warning, and next for error:

Write-MyLog -Description $RunStatus -EventId $ScriptID -Severity Warning
Write-MyLog -Description $RunStatus -EventId $ScriptID -Severity Error

A nice way to write events is to use the “Source” to map back to the script name or some other useful value for filtering. However sources need to be pre-defined. Here’s how to define a source:

New-EventLog -LogName $LogName -Source "X"

The challenge is when to create this source. I find it’s best to declare the source only if it does not exist:

try
{
$Sources = Get-EventLog -LogName $logname | Select-Object Source -Unique
$found = $false;
foreach ($OneSource in $Sources)
    {
        if ($OneSource.source -eq $Source)
        {
            $found=$true;
        }
    }   
}
catch
{
    Write-Host "cannot find logfile, so we are in deep trouble"
}
 
if (!$found)
{
    New-EventLog -LogName $LogName -Source $Source
    Write-Host "Created new Source $($Source) in log name $($LogName)"
}

How to Fix Bad Taxonomy Terms in SharePoint Automatically

Fixing bad SharePoint taxonomy term references

A document tagged with managed metadata can end up with a term orphaned from the term set. This can happen due to bad references in the intermediary cached terms list, which exists hidden in each site collection under the name “TaxonomyHiddenList”; here’s how to reconstruct its URL: [siteurl]/Lists/TaxonomyHiddenList/AllItems.aspx
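To peek inside that hidden cache list, a small sketch (the site collection URL below is hypothetical; the list name TaxonomyHiddenList is fixed by SharePoint, and the IdForTerm column maps each entry back to its term GUID):

```powershell
# Sketch: inspect the hidden taxonomy cache list of a site collection
$site = Get-SPSite "http://WebApp"   # hypothetical site collection URL
$web  = $site.RootWeb
$taxList = $web.Lists["TaxonomyHiddenList"]
foreach ($entry in $taxList.Items)
{
    # IdForTerm is the GUID of the term in the term store
    Write-Host "$($entry.ID): $($entry['Title']) -> $($entry['IdForTerm'])"
}
$site.Dispose()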

While it’s easy to extend this to check the health of each document’s terms, for simplicity let’s imagine we want to fix a single document: we have its URL and know the name of the taxonomy field. Let’s set the basics before we really get started:

$docurl = "http://WebApp/path/lib/doc.xlsx" #URL to correct
$site = New-Object Microsoft.SharePoint.SPSite($docurl)  #grab the SPSite
$web = $site.OpenWeb() #grab the SPWeb
$item = $web.GetListItem($docurl) #grab the SPItem
$targetField = "MMS Field Name" # let's establish the name of the field
$TermValueToReplace = $item[$targetField].label;  #this is the termset value we want to re-assign correctly

Now let’s get a taxonomy session, with proper error detection:

try
{
    Write-Host "getting Tax Session for $($Site.url)..." -NoNewline
    $taxonomySession = Get-SPTaxonomySession -Site $site  #get one session per site collection you are in
    $termStore = $taxonomySession.TermStores[0]  #We need to move the  get Termset to above the lib level too!
    Write-Host "Got Tax Session. "
}
catch
{
    Write-Host "Tax session acquisition problem for $($site.url)"
    $ok=$false;
}

Given the field, let’s get the correct termset.

[Microsoft.SharePoint.Taxonomy.TaxonomyField]$taxonomyField = $item.Fields.GetField($targetField)  
$termSetID=$taxonomyField.TermSetId
 
$termSet = $termStore.GetTermSet($taxonomyField.TermSetId)  #if we ever loop, move to outside item loop, this takes a long time!

Now we do a lookup for the term in the Managed Metadata Service. We expect precisely one match, but we’ll check for that later.

#"true" parameter avoids untaggable terms, like parent term at higher tier that should not be selected
[Microsoft.SharePoint.Taxonomy.TermCollection] $TC = $termSet.GetTerms($TermValueToReplace,$true)

Now let’s populate a TaxonomyFieldValue, assign it to the SPItem, and save it without changing the timestamp or author by using SystemUpdate():

$taxonomyFieldValue = new-object Microsoft.SharePoint.Taxonomy.TaxonomyFieldValue($taxonomyField)
$t1=$TC[0]  #use the first result
$taxonomyFieldValue.set_TermGuid($t1.get_Id())  #this assigns the GUID
$taxonomyFieldValue.set_Label($TermValueToReplace)  #this assigns the value
$taxonomyField.SetFieldValue($Item,$taxonomyFieldValue)  #let's assign to the SPItem
$item.systemupdate()

That’s the meat of it. Let’s put it all together with error handling:

$ok=$true;
$docurl = "http://WebApp/path/lib/doc.xlsx" #URL to correct
 
$site = New-Object Microsoft.SharePoint.SPSite($docurl)
$web = $site.OpenWeb()
$item = $web.GetListItem($docurl)
$targetField = "FieldName"
$TermValueToReplace = $item[$targetField].label;
 
try
{
    Write-Host "getting Tax Session for $($Site.url)..." -NoNewline
    $taxonomySession = Get-SPTaxonomySession -Site $site  #get one session per site collection you are in
    $termStore = $taxonomySession.TermStores[0]  #We need to move the  get Termset to above the lib level too!
    Write-Host "Got Tax Session. "
}
catch
{
    Write-Host "Tax session acquisition problem for $($site.url)"
    $ok=$false;
}
 
[Microsoft.SharePoint.Taxonomy.TaxonomyField]$taxonomyField = $item.Fields.GetField($targetField)  
$termSetID=$taxonomyField.TermSetId
$termSet = $termStore.GetTermSet($taxonomyField.TermSetId)  #Move to outside item loop, this takes a long time!     
[Microsoft.SharePoint.Taxonomy.TaxonomyFieldValue]$taxonomyFieldValue = New-Object Microsoft.SharePoint.Taxonomy.TaxonomyFieldValue($taxonomyField)  
 
[Microsoft.SharePoint.Taxonomy.TermCollection] $TC = $termSet.GetTerms($TermValueToReplace,$true)  #true avoids untaggable terms, like parent company at higher tier
 
if ($TC.count -eq 0)
    {
        Write-Host -ForegroundColor DarkRed "Argh, no Taxonomy entry for term $($TermValueToReplace)"
 
        $ok=$false;
    }
    else
    {
        if ($TC.count -gt 1)
        {
            Write-Host -ForegroundColor DarkRed "Argh, $($TC.count) Taxonomy entries for term $($TermValueToReplace)"

            $ok=$false; #we can't be sure we got the right term!
        }
 
        $taxonomyFieldValue = new-object Microsoft.SharePoint.Taxonomy.TaxonomyFieldValue($taxonomyField)
 
        $t1=$TC[0]
 
 
        $taxonomyFieldValue.set_TermGuid($t1.get_Id())
        #$taxonomyFieldValue.ToString()
        #$targetTC.add($taxonomyFieldValue)
    }
 
try
{
    $taxonomyFieldValue.set_Label($TermValueToReplace)
    $taxonomyField.SetFieldValue($Item,$taxonomyFieldValue)  
}
catch
{
    Write-Host -ForegroundColor DarkRed "Argh, can't write Tax Field for $($Item.url)"
    $ok=$false;
}
 
if ($ok)
{
    $item.systemupdate()
    write-host "Fixed term for item"
}
else
{
    Write-Host -ForegroundColor DarkRed "Did not fix term for item"
}

Each TaxonomyFieldValue has three important properties; these often appear as pipe-separated values:
Label : the label selected by the user, from the Labels property of the Term object
TermGuid : the Id (GUID) property of the Term (inherited from TaxonomyItem)
WssId : a reference back to the TaxonomyHiddenList; specifically, the ID of that entry in the site collection list
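For reference, the serialized form of such a field value, as seen when dumping a list item, concatenates these as WssId;#Label|TermGuid (the value below is illustrative only):

```powershell
# Illustrative only - the serialized taxonomy field value format:
#   <WssId>;#<Label>|<TermGuid>
# e.g.  5;#Finance|1a2b3c4d-5e6f-7081-92a3-b4c5d6e7f801
$item[$targetField].ToString()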

 


Granting SharePoint Shell Administration access

Granting SharePoint Shell Administration access via PowerShell

Granting a user PowerShell access can be as simple as granting local admin rights plus the following command:

add-spshelladmin "DOMAIN\USER"

Sometimes, though, there remain content DBs for which the user doesn’t gain PowerShell access. In that case, the following command pipes in the content DBs and forces the access grant:

get-spcontentdatabase | add-spshelladmin "DOMAIN\USER"

There are service application databases as well; these can all be handled with this single command:

get-spdatabase | add-spshelladmin "DOMAIN\USER"

In the end, what is required is that the user has the securityadmin server role on the SQL instance and the db_owner role in the database.

The securityadmin role is required so that the underlying service accounts are granted the appropriate permissions, for example when mounting a content DB.

Also, the user must be a member of the SharePoint_Shell_Access role on the configuration database and a member of the WSS_ADMIN_WPG local group on the server (best done on each server in the farm).
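To verify the result, or to target a single stubborn database, Get-SPShellAdmin and the -Database parameter of these cmdlets can help (the content database name below is hypothetical):

```powershell
# List current shell admins registered on the configuration database
Get-SPShellAdmin

# Grant shell access on one specific content database
$db = Get-SPContentDatabase -Identity "WSS_Content_Docs"   # hypothetical name
Add-SPShellAdmin -UserName "DOMAIN\USER" -Database $db

# Verify the grant on that database
Get-SPShellAdmin -Database $db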