Migrating Documents via SFTP

A previous post, How to download from FTP programmatically, covered how to download documents via FTP.
If FTP over SSL (FTPS) is needed, that's just a property to enable:

$request = [Net.WebRequest]::Create($url)   # $url must be an ftp:// address
$request.EnableSsl = $true                  # enable SSL
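
For a fuller picture, here is a minimal end-to-end FTPS download sketch, with placeholder URL, credentials, and local path:

$url = "ftp://ftp.example.com/folder/file.txt"     # placeholder FTP URL
$request = [System.Net.WebRequest]::Create($url)
$request.Method = [System.Net.WebRequestMethods+Ftp]::DownloadFile
$request.Credentials = New-Object System.Net.NetworkCredential("user", "password")
$request.EnableSsl = $true

$response = $request.GetResponse()
$reader = New-Object System.IO.StreamReader($response.GetResponseStream())
$reader.ReadToEnd() | Out-File "C:\temp\file.txt"  # placeholder local path; suitable for text files
$reader.Close()
$response.Close()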

Sometimes there is a need to access and download documents via SFTP. That’s a completely different beast. To do that, I utilize the open source WinSCP. Both the .exe and DLL are needed, and can be co-located with the script.  Read more about WinSCP.

In PowerShell, just load the type library:

Add-Type -Path "WinSCPnet.dll"

Then set up the session. Below I chose to use the actual SSH host key fingerprint:

$sessionOptions = New-Object WinSCP.SessionOptions
$sessionOptions.Protocol = [WinSCP.Protocol]::Sftp
$sessionOptions.HostName = "sftp.myDomain.com"
$sessionOptions.UserName = "username"
$sessionOptions.Password = 'password'
$sessionOptions.SshHostKeyFingerprint = "ssh-rsa 1024 96:9b:ed:1f:66:8b:13:64:c3:ed:11:e0:27:68:62:67"

If you don’t want to bother confirming the crypto key, just set this property instead:

$sessionOptions.GiveUpSecurityAndAcceptAnySshHostKey = $true
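
Alternatively, newer releases of the WinSCP assembly can fetch the fingerprint for you, so you can pin the key without copying it by hand; a sketch, assuming a version of Session.ScanFingerprint that accepts an algorithm name:

$scanner = New-Object WinSCP.Session
$sessionOptions.SshHostKeyFingerprint = $scanner.ScanFingerprint($sessionOptions, "SHA-256")
$scanner.Dispose()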

Then create a new session and open it:

$session = New-Object WinSCP.Session
$session.Open($sessionOptions)

Note $session.Output contains all the useful FTP transactions, which you can log.
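
For example, to persist those transactions once the transfers are done (the logfile path is a placeholder):

$session.Output | Add-Content "C:\temp\sftp-transactions.log"   # placeholder path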

You also have the option to capture debugging information and set the debugging level:

$session.DebugLogPath = "D:\plautj\mypath"
$session.SessionLogPath = "D:\plautj\mypath2"
$session.DebugLevel = 1
 
Once the connection is established, use the session to:
1. Capture the directory listing: $directory = $session.ListDirectory($FTPDir)
2. Download files: $session.GetFiles($remotePath, $localPath).Check()
3. Delete files: $session.RemoveFiles($remotePath).Check()

Below is the full script to connect via SFTP, download all files, and delete them from the SFTP server:

$DeleteSource = $true
$DestLocation = "\\DestinationServer\Location\"
$FTPDir = "/USERS/MyDir"
$ok = $true

try
{
    $ok = Test-Path $DestLocation
}
catch
{
    $ok = $false
    Write-Host -ForegroundColor DarkRed "Failed to reach destination location $($DestLocation)"
}

if ($ok)
{
    try
    {
        # Load WinSCP .NET assembly
        Add-Type -Path "WinSCPnet.dll"  # requires script co-located DLL
    }
    catch
    {
        $ok = $false
        Write-Host -ForegroundColor DarkRed "Failed to acquire types from the WinSCPnet.dll"
    }
}

if ($ok)
{
    try
    {
        # Set up session options
        $sessionOptions = New-Object WinSCP.SessionOptions
        $sessionOptions.Protocol = [WinSCP.Protocol]::Sftp
        $sessionOptions.HostName = "sftp.SomeDomain.com"
        $sessionOptions.UserName = "userID"
        $sessionOptions.Password = 'Password'
        $sessionOptions.SshHostKeyFingerprint = "ssh-rsa 1024 96:9b:ed:1f:66:8b:13:64:c3:ed:11:e0:27:68:62:67"

        $session = New-Object WinSCP.Session
        $session.Open($sessionOptions)
    }
    catch
    {
        $ok = $false
        Write-Host -ForegroundColor DarkRed "Failed to open SFTP connection"
    }
}

if ($ok)
{
    try # to get the directory listing
    {
        $directory = $session.ListDirectory($FTPDir)
    }
    catch
    {
        $ok = $false
        Write-Host -ForegroundColor DarkRed "Failed to get FTP Directory $($FTPDir)"
    }
}

if ($ok)
{
    try # to download each file that is not itself a directory
    {
        foreach ($f in $directory.Files)
        {
            if (!$f.IsDirectory)
            {
                try
                {
                    $RemotePath = "$($FTPDir)/$($f.Name)"
                    $LocalPath  = "$($DestLocation)$($f.Name)".Trim()
                    $session.GetFiles($RemotePath, $LocalPath).Check()
                    Write-Host -ForegroundColor DarkGreen "Downloaded file from $($RemotePath) to $($LocalPath)"
                }
                catch
                {
                    $ok = $false
                    Write-Host -ForegroundColor DarkRed "Failed to download file from $($RemotePath) to $($LocalPath)"
                }
            }
        }
    }
    catch
    {
        $ok = $false
        Write-Host -ForegroundColor DarkRed "Generic failure downloading files from FTP Directory $($FTPDir)"
    }
}

if ($ok)
{
    if ($DeleteSource)
    {
        foreach ($f in $directory.Files)
        {
            if (!$f.IsDirectory)
            {
                try # to delete each FTP file that is not a directory
                {
                    $RemotePath = "$($FTPDir)/$($f.Name)"
                    $session.RemoveFiles($RemotePath).Check()
                    Write-Host -ForegroundColor DarkGreen "Deleted file $($RemotePath)"
                }
                catch
                {
                    $ok = $false
                    Write-Host -ForegroundColor DarkRed "Failed to delete file $($RemotePath)"
                }
            }
        }
    }
}

if ($session) { $session.Dispose() }  # release the WinSCP session

Errors trying to publish SP2013 workflows

Recently I encountered an issue trying to publish SP2013 workflows. Every attempt to save the workflow gave an odd error about the underlying XML. I traced it in the ULS log, which showed: "Cannot set unknown member 'LookupSPListItemDoubleProperty.ItemId'".

The problem was related to the binding between the Workflow Manager instance and the IIS Site that supports the SharePoint web application. The key was to rebind them.  Here’s the command to do so:

Register-SPWorkflowService -SPSite "http://yoursite" -WorkflowHostUri "http://localhost:12291" -AllowOAuthHttp -Force

If that doesn't do the trick, run the Workflow Configuration Wizard to rejoin the SharePoint farm to the workflow farm. Note you need to know:
– Service account password
– Database instance
– Workflow passphrase

Once that is done, be sure to check the IIS bindings for the Workflow Manager site. Both 12290 (HTTPS) and 12291 (HTTP) should be configured.
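
To double-check those bindings from PowerShell rather than the IIS console, something like this should work (a sketch, assuming the default site name of "Workflow Management Site"):

Import-Module WebAdministration
# Expect one binding for 12290 (HTTPS) and one for 12291 (HTTP)
Get-WebBinding -Name "Workflow Management Site" | Select-Object protocol, bindingInformation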

Then issue the Register-SPWorkflowService cmdlet above once more.

Voila!

Quickly finding the SharePoint Search index

It is useful to quickly locate the SharePoint 2013 search index on a farm.

For example, you might want to exclude it from antivirus scans, check its security or disk usage, or relocate the index.
Here's how:

This set of commands will give you details on the search topology:

$ssa = Get-SPServiceApplication -Name "Search Service Application"
$active = Get-SPEnterpriseSearchTopology -SearchApplication $ssa -Active
Get-SPEnterpriseSearchComponent -SearchTopology $active

To simply locate the search index on the file system, use these commands:

$ssi = Get-SPEnterpriseSearchServiceInstance
$ssi.Components
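
To narrow the topology output from the first approach down to just the index components and their on-disk locations, a filter along these lines should work (a sketch; it assumes the index component type exposes RootDirectory, the same property New-SPEnterpriseSearchIndexComponent accepts):

Get-SPEnterpriseSearchComponent -SearchTopology $active |
    Where-Object { $_.GetType().Name -eq "IndexComponent" } |
    Select-Object ServerName, IndexPartitionOrdinal, RootDirectory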

Fixing Repeated Logon Prompts in SharePoint

There’s nothing that annoys users as much as repeated logon prompts.

Recently a SharePoint 2013 farm was prompting all users for logon except when logging on from the web server itself.

It seems someone had changed the Web Application User Authentication setting for the web app from Claims Based Authentication to Windows Authentication.
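
You can confirm from PowerShell whether a web application is running in claims mode (the URL is a placeholder):

$wa = Get-SPWebApplication "http://yoursite"
$wa.UseClaimsAuthentication   # $true means claims-based authentication is enabled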

Other areas to check:

  • Add SharePoint web application to the Trusted Sites
  • Clear all cached credentials, in case old credentials are stored: go to Control Panel, User Accounts, Credential Manager, Manage Windows Credentials, and remove all relevant cached entries.
  • Make sure Integrated Windows Authentication is enabled in IE (Tools >> Internet Options >> Advanced >> under Security, enable integrated authentication)
  • Ensure IE settings for User Authentication has “Automatic Logon with current user name and password” selected

In conclusion, repeated logon prompts are a frustrating issue for users, but the fixes are straightforward: ensure the web application uses the correct authentication setting, add it to Trusted Sites, clear cached credentials, and enable integrated Windows authentication in Internet Explorer.

By following these steps and staying vigilant for changes to authentication settings, you can maintain the integrity and usability of your SharePoint environment.

Crisply report on script duration

While scripts can get written in a jiffy, it's best to build them to be usable and operable from the start.

Some scripts can take time to execute. As a habit, I build into my scripts a crisp, clear report on script duration, both to console and to a logfile.

Let's declare a logfile and capture the script's start time, then output both to console and logfile at the start of the run:

$mylogfile = "C:\temp\myscript.log"   # placeholder logfile path
$startTime = Get-Date
Write-Host "STARTING: $($startTime) Script: $($MyInvocation.MyCommand.Definition)"
Add-Content $mylogfile "STARTING: $($startTime) Script: $($MyInvocation.MyCommand.Definition)"

At the end, let’s do the same; output the duration of the script run to console and logfile:

$endTime = Get-Date
Write-Host "Started script at $($startTime), Ended script at $($endTime), duration of $($endTime.Subtract($startTime).Minutes) minutes and $($endTime.Subtract($startTime).Seconds) seconds"
Add-Content $mylogfile "Started Script: $($MyInvocation.MyCommand.Definition) at $($startTime), Ended script at $($endTime), duration of $($endTime.Subtract($startTime).Minutes) minutes and $($endTime.Subtract($startTime).Seconds) seconds"
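
Note that .Minutes and .Seconds are components of the elapsed TimeSpan, not totals, so runs of over an hour would under-report. For a complete figure, a formatted TimeSpan is safer; a small sketch:

$elapsed = New-TimeSpan -Start $startTime -End $endTime
Write-Host ("Duration: {0:hh\:mm\:ss}" -f $elapsed)   # e.g. Duration: 01:02:33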

Simple as that. Reuse wherever needed.

Solving Timer Service halting daily

Overview

The SharePoint Timer Service is key to keeping SharePoint humming. It runs all the timer jobs and is therefore responsible for a near-endless set of tasks. I recently found that the timer service was halting shortly after 6 AM daily; the service simply appeared as halted. Some additional symptoms:

  • Trying to set the password in Central Administration, Security, Managed Accounts doesn’t fix the issue
  • Trying to set the managed account password via PowerShell doesn’t help
  • The following appears in the event log:

[screenshot: event log error]

  • Trying to start the service fails:

[screenshot: attempting to start the service fails]

Solution

First, check GPEdit.msc to make sure the local computer security policy allows the service account to log on as a service. The real catch is that the domain policy overrides the local policy, so unless the Tenant account has the 'Log on as a service' right at the domain level, the service will fail again the next morning as the Group Policy settings propagate.
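
To check from the server whether the account currently holds that right, one option is a secedit export (a sketch; run from an elevated prompt, writes a temporary file):

secedit /export /areas USER_RIGHTS /cfg "$env:TEMP\rights.inf" | Out-Null
Select-String -Path "$env:TEMP\rights.inf" -Pattern "SeServiceLogonRight"

The accounts listed next to SeServiceLogonRight are those allowed to log on as a service.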

Use of Dynamic SQL for SSIS Data Sources

Quite a set of adventures in SSIS-land today. It seems using parameters on SQL Server Integration Services data sources can cause CAST errors. The only reason I was using parameterized OLE DB SQL data sources was to avoid a heavily nested SQL query that was performing poorly.

Here’s the CAST error that I was unable to solve no matter how many CASTs and CONVERTs I applied around the parameter:

[OLE DB Source [219]] Error: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80040E21.
An OLE DB record is available. Source: "Microsoft SQL Server Native Client 11.0" Hresult: 0x80040E21 Description: "Invalid character value for cast specification".

A good approach is to break the query down, using a Control Flow Execute SQL Task to populate a variable. Note that to populate a variable, you have to set the ResultSet property to Single row.

The next thing to be aware of is that the Variables window does not seem to show updated values. Ever. To see updated values, add a Script Task with a MessageBox; that's the way I roll, pop-ups during development all the way 🙂

MessageBox.Show("My favorite variable " + Dts.Variables["User::TestVar1"].Value.ToString());

The last gotcha is that the Data Source can use a variable as the source of its SQL, but it expects the variable to contain complete SQL that parses at design time.

Happy SSISing!

Copying folder hierarchy with date filter and elimination of empty folders

Ever need to copy a tree of folders? I had to do this for only the files older than a specified date, and empty folders were not welcome. How to go about this?

We'll approach this in four easy steps. First we'll set the source, destination, and threshold date, then recreate the folder structure (without files) on the target:

$sourceLocation = "\\ny-srv-fs3\Reserve"
$DestLocation = "D:\plautj\ACTTMP"
$thresh = Get-Date "December 31, 2006"
xcopy $sourceLocation $DestLocation /T   # /T copies the folder structure only, no files

Next we’ll get the full set of files and folders:

$q = Get-ChildItem $sourceLocation -Recurse

We will now copy all the files older than the threshold:

foreach ($qItem in $q)
{
    if (!$qItem.PSIsContainer)
    {
        if ($qItem.LastWriteTime -lt $thresh)
        {
            $DestItemLoc = $qItem.FullName.Replace($sourceLocation, $DestLocation)
            Copy-Item $qItem.FullName $DestItemLoc
        }
    }
}

Lastly, let's delete empty folders. The key is specifying the AllDirectories search option; otherwise it will delete folders that are devoid of immediate files but that have files in subfolders:

$a = Get-ChildItem $DestLocation -Recurse | Where-Object { $_.PSIsContainer -eq $true }
$a | Where-Object { $_.GetFiles("*", [System.IO.SearchOption]::AllDirectories).Count -lt 1 } |
    Select-Object FullName | ForEach-Object { Remove-Item $_.FullName -Recurse }

Extracting to a CSV the list of files in folders

It's easy to extract the set of files across all folders; here's how, using PowerShell:

$Files = Get-ChildItem "C:\ChooseAnyFolder" -Recurse
$Files | Select-Object Name | ConvertTo-Csv -NoTypeInformation | Out-File "C:\temp\reportFile.csv"

Feel free to select your choice of fields within the Pipeline.
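
For example, Export-Csv collapses the conversion and file output into one step, with a few more fields selected (paths are placeholders):

Get-ChildItem "C:\ChooseAnyFolder" -Recurse |
    Select-Object Name, FullName, Length, LastWriteTime |
    Export-Csv "C:\temp\reportFile.csv" -NoTypeInformation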

Using URL Parameters for an SSRS Report in SharePoint Integrated Mode

SSRS running in SharePoint integrated mode provides a great experience, but how can one pass in parameters to such a report (rdl) published to SharePoint?

Here's a URL format that works:
http://WebAppName/sites/SiteName/WebName/_vti_bin/reportserver?http://WebAppName/sites/SiteName/WebName/LibName/ReportName.rdl&rc:Parameters=Collapsed

Here’s how to construct the URL:
1. Get the SPWeb; this is the lowest-level site containing the RDL report. In this case it is http://WebAppName/sites/SiteName/WebName, an SPWeb below the site collection.
2. Add on “/_vti_bin/reportserver?”
3. Add on the full URL of the RDL; you can get that by going to the All Files view, clicking the ellipsis ("…"), and copying the full URL, or by right-clicking the RDL link and copying the URL.
4. Add on your &parameters
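
Putting the construction together in PowerShell, using the placeholder names from the example above:

$web = "http://WebAppName/sites/SiteName/WebName"
$rdl = "$web/LibName/ReportName.rdl"
$reportUrl = "$web/_vti_bin/reportserver?$rdl&rc:Parameters=Collapsed"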

Parameter examples include:

Collapsing parameter pane: &rc:Parameters=Collapsed

Zooming to whole page or page width: &rc:Zoom=Page Width

Jump to report section: &rc:Section=4

Happy reporting!