Business Intelligence Solution: What It Is and Why It Matters Today

Business intelligence (BI) consolidates data and renders valuable insights in a user-friendly manner through dashboards, reports, graphs, and charts. This class of tools enables entrepreneurs and professionals to access historical, new, in-house, third-party, unstructured, and semi-structured data sets to evaluate business performance and trends. Business Intelligence solutions offer valuable insights to identify problems, spot market trends, discover new business opportunities, and improve business decisions.

History of Business Intelligence

The term business intelligence was coined by author Richard Millar Devens in 1865, when he described a banker who gathered intelligence on the market ahead of his competitors. In 1958, Hans Peter Luhn, an IBM computer scientist, researched the possibility of gathering business intelligence by using technology. His work established the programs that underpinned IBM's analytics platforms.

By the 1990s, Business Intelligence solutions had gained popularity rapidly, but the technology remained complex to apply. It required significant investment, preplanning, and IT support, often leading to delayed reports and backlogs. Further, users and analysts needed extensive training in business intelligence to successfully analyze and run queries on data.

Today, Business Intelligence solutions focus on self-service BI applications that enable users to create personalized reports and analyses based on their own measuring parameters. Additionally, cloud-based platforms have extended BI's global reach. With faster machines and improved programs, data can now be processed in real time, enabling decision-makers to make informed and profitable choices.

How Business Intelligence Works

Business Intelligence solutions involve four steps to transform massive raw data sets into meaningful information that can be converted into wins, improving the organization's bottom line.

Step 1: Gather and transform data from various sources

BI tools use the ETL method (extract, transform, and load) to combine unstructured and structured data from several sources. The collected data is sorted and transformed into smaller, comprehensible data sets and stored at a central location, so that applications and programs can easily query or analyze it.
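As a toy illustration of the ETL flow described above (invented data, no specific BI product or library assumed), a minimal Python sketch might look like:

```python
# Minimal ETL sketch: extract rows from two "sources" with different shapes,
# normalize them, and load them into one central store keyed by region.

def extract():
    # Two hypothetical sources (e.g. a CRM export and a web analytics feed)
    crm = [{"Region": "EMEA", "Sales": "1200"}, {"Region": "APAC", "Sales": "950"}]
    web = [("EMEA", 300), ("NA", 700)]
    return crm, web

def transform(crm, web):
    # Normalize both feeds into (region, amount) pairs with numeric amounts
    rows = [(r["Region"], int(r["Sales"])) for r in crm]
    rows += [(region, amount) for region, amount in web]
    return rows

def load(rows):
    # Central store: total sales per region
    store = {}
    for region, amount in rows:
        store[region] = store.get(region, 0) + amount
    return store

warehouse = load(transform(*extract()))
print(warehouse)  # {'EMEA': 1500, 'APAC': 950, 'NA': 700}
```

Real BI platforms do the same three steps at scale, against databases and APIs rather than in-memory lists.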

Step 2: Discover trends and variations

Data mining is an automated process that quickly analyzes data to discover patterns to render meaningful insights on the latest business performance. Additionally, BI tools feature different types of data modeling and analytics models such as predictive, descriptive, statistical, and exploratory models that explore data, predict trends, and make recommendations.
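As a toy illustration of descriptive versus predictive analytics (invented numbers, not any particular BI tool's models), consider:

```python
from statistics import mean

# Monthly revenue (hypothetical figures)
revenue = [100, 110, 125, 135]

# Descriptive: month-over-month changes
deltas = [b - a for a, b in zip(revenue, revenue[1:])]  # [10, 15, 10]

# Naive predictive model: next month = last month + average change
forecast = revenue[-1] + mean(deltas)
print(deltas, forecast)
```

Production predictive models are of course far more sophisticated, but the shape is the same: describe the past, then extrapolate.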

Step 3: Use data visualization to deliver results

Business Intelligence solutions use data visualization techniques to present results in an easy-to-understand format that can be shared with everybody in an organization. Further, to help users comprehend the data and the latest business performance or trends, results are reported through interactive dashboards, maps, graphs, and charts.

Step 4: Act on insights in real-time

Businesses can quickly transform insights from historical and current data into action with Business Intelligence solutions. These BI tools allow real-time adjustments and long-term strategic shifts to eliminate inefficiencies, adapt to dynamic market changes, address supply challenges, and resolve customer problems.

How Business Intelligence Supports Organizations

The objective of a Business Intelligence solution is to help organizations make informed decisions. Organizations using a BI strategy obtain accurate, organized, and meaningful data that holds the potential to improve their organizational processes and/or their sales and revenue.

BI tools help teams gauge their strategy's performance through key performance indicators (KPIs) that can be tracked and monitored by the Business Intelligence solution. Easy access to KPIs and metrics enables teams to organize, focus, and re-align their strategies to the organization's overall objectives. Further, it frees their working hours for core business tasks that impact the organization's performance.

Business Intelligence solutions are also used to demonstrate historical patterns to stakeholders, who can then evaluate the health and performance of their vertical or industry and take corrective measures to improve it.

Benefits of using Business Intelligence software

  • Transform the organization into a data-driven enterprise, improving performance and gaining a competitive edge.
  • Improve ROI: businesses can use insights to allocate resources toward strategic objectives.
  • Discover customer behavior, trends, and preferences to target prospects or clients with relevant services/products.
  • Monitor, improve, or streamline business operations based on the insights.
  • Improve supply chain management by monitoring activity across the funnel and communicating outcomes with vendors and partners.

Start Your Business Intelligence Project in a Click

Our technology and wide delivery footprint have created billions of dollars in value for clients globally and are widely recognized by industry professionals and analysts.

How Does Business Intelligence Make Work Easier?

Because BI tools sort, structure, and analyze historical and new data, organizations can work smarter, making rational decisions in real time and gaining a competitive advantage.

Business Intelligence solutions provide tools for measuring processes and strategies; hence organizations, as well as sales and marketing teams, can eliminate or improve their tasks and performance. Further, these tools reveal new market trends and evolving consumer behavior, which teams can tap into for successful outcomes.

Business Intelligence Solutions provide benefits in six key areas

  • Enhanced operational efficiency.
  • Valuable insights into customer behavior and purchase patterns.
  • Accurate monitoring and evaluation of marketing, sales, and financial performance.
  • Transparent benchmarks based on historical and new data.
  • Instant alerts on customer problems and data variances.
  • Sharing of analysis in real-time across verticals and industries.

Conclusion:

Business Intelligence solutions are a strong strategic option for enduring competitive advantage and business success. To make timely and informed decisions and stay ahead of the competition, BI tools can be essential: they provide a deep understanding of market trends and customer behavior, and they monitor and track business processes in real time for profitable business decisions. Reality Tech helps you to visualize your enterprise data better than ever.

Challenges in the Hybrid sourcing model

Overcoming cultural challenges in the Hybrid sourcing model

The focus of this article is how team members within offshore organizations can be more effective when working with the USA within a hybrid model. Without active effort to address the cultural differences, USA client satisfaction erodes and general frustration follows.

While the cultural differences in a fully offshore model are commonly understood, the differences can be more stark and even destructive in the hybrid model, where staff are present both on and off shore.

It is understood that avoiding cultural miscues at times requires going against instinct and doing what feels unnatural. However, success depends on closing the cultural gap, so it is a goal worthy of effort.

In this article I try to offer specific guidance for the Indian audience that can be put to immediate use to improve effectiveness when working in the USA.

Punctuality

In the USA, being late for a meeting can be fatal to a relationship. People are expected to arrive not only on time but preferably slightly early, both to accommodate unanticipated delays and to get through security and be fresh and ready for the meeting. In the USA, meetings can run over, but they generally start on time.

Language

While Indian written and spoken English is generally excellent, Americans often have a hard time parsing English spoken by Indians. This seems to be a greater problem over the phone. Note that the human ear adjusts to accents, so over time this issue is reduced. For example, many American companies use Philippine-based offshore call centers, as that accent is considered easier for Americans to understand.
Lastly, some expressions are unknown in the West, cause puzzlement, and should be avoided in conversation. These include:

  • “Kindly Revert”
    Try instead “Please get back to me”
    In general, outside of mathematics and programming, “Revert” is rarely used in spoken American English.

  • “Do the needful”
    Try instead “Please do ‘x’”
  • “Discuss about”
    Try instead “Can we discuss it?”
  • “Veg”
    That abbreviation is unknown in the West. Instead say “Vegetarian” and “Non-Vegetarian”
  • “Holiday”
    Use the word “Vacation” instead.
  • “Rest is fine”
    Instead, use the phrase “All the rest is fine”

Suggestions for success:

  • Visit clients in person when possible
  • Use Skype over telephone, as that connection is often clearer
  • Avoid some phrases commonly used by Indians
  • Add in “Please” and “Thank you” in verbal and written correspondence

Being direct

The nature of the client/service provider relationship puts the USA client in the driver's seat. The perception in India is that the client is generally right, and it would be rude to question the client. In the USA, however, the expectation is that the service provider should challenge guidance and offer insights, alternatives, possible improvements, and even critique. The problem becomes more subtle and insidious when delivering bad news. There are two general approaches to managing the delivery of bad news in the USA:

  1. Try to deliver all the bad news at once
    Better to declare a delivery date slippage of a full week than to slip a day each day of the week.
  2. Deliver the news as early as possible
    Using the scheduling delay issue above, a top British architect once told me “If a project is declared late on the delivery date, I fire the staffer for one of two reasons: Either he is incompetent and did not know it was late until the last day, or he hid the information from me, which is dishonest”
    In short, it’s almost always better to be up-front. Blunt advice is considered especially refreshing in the USA and is a key factor in promotions, even when the communication skips reporting layers.

Facing things head-on

Similar to being direct: in the USA, avoiding something uncomfortable is frowned upon, and even downright confusing to Americans. If you cannot do what is asked, it is expected that you will push back directly and say you can't. The best advice is to offer alternatives rather than a blanket refusal.
How to succeed: If you can’t do something, offer guidance on who can, or a date when you can, or an alternative approach that you can deliver on.

Initiative

In the USA, guidance is often general, indirect, and unspecific. In contrast, Indian staff expect to be given precise directives. USA staff are used to giving vague, summary guidance and only general goals.

This cultural gap leads to stress for Indian staff and dissatisfaction on the part of USA clients.

This cultural gap feeds the perception in the USA that Indian staff do not show initiative. This perception is grounded in experience: USA organizations expect staff to apply common sense, promote ideas and alternatives, and make unrequested changes.

In the USA there is an expression: “The Customer is King”. So if the client asks for something small and doable, the expected response is “Yes”, or a qualified yes with a promise to get back. It is less acceptable to say one needs to climb the management chain for approval; staff are assumed to take responsibility. Certainly this has to be balanced against managing the scope and costs caused by commitments.

Email confirmation

Americans expect confirmation when sending an email. This may sound simple and even petty, but it has been an issue, especially exacerbated by time zone differences. Emails are discussions in America, and Americans expect confirmation or a reply with questions, ideas, or suggestions.

I’ve seen this simple expectation result in significant consternation when an American manager emails guidance, and doesn’t get questions or acknowledgement back in response.

Americans are generally happy to coordinate and navigate even complex business issues in writing.

Suggestions for success:

  • Acknowledge receipt of guidance along with stating that it was understood
  • Ask any questions or clarifications in response
  • Feel free to suggest improvements or enhancements or even critique

Summary

The cultural differences encountered in business can be a cause for delight, or a source for frustration and even failed business relationships. Closing the cultural gap starts with an understanding of differences, and effort to change behavior and reduce the impact of such differences.

Excel corruption writing DIF files

When Excel writes a file in the DIF format, SAS is unable to read the file. Here's a valid DIF header for a file that has 24 columns and 446,213 rows:

TABLE
0,1
""
VECTORS
0,24      
""
TUPLES
0,446213  

Note that “Tuples” are equivalent to rows. A VECTOR is like a dimension, or field; in the case of Excel, it refers to columns. So far so good. However, here is how such a file would be saved by Excel 2010:

TABLE
0,1
"EXCEL"
VECTORS
0,446213
""
TUPLES
0,24

 

Excel has no problem reading the above file format, as it ignores tuples/vectors internally. However SAS cannot handle this not-so-standard deviation.

Below is VBA code that, after a DIF file is saved, “fixes” the header by opening the file in binary mode and correcting the issue. Note that Fname contains both path and filename:

 
'Fixup TUPLES and VECTORS

Dim filenum As Integer
Dim strData As String
Dim VectorIdx As Integer
Dim TuplesIdx As Integer
Dim VectorStr As String
Dim TuplesStr As String
Const CharsToProcess = 60
Dim outStr As String
Dim CRLF As String
Dim DoubleQuote As String
Dim Fname As String

CRLF = Chr(13) & Chr(10)
DoubleQuote = Chr(34)

Fname = saveString 'saveString holds the full path and filename of the DIF just saved

'Read the first CharsToProcess bytes of the header
filenum = FreeFile
Open Fname For Binary Access Read As filenum

strData = String$(CharsToProcess, " ")
Get #filenum, , strData
Close #filenum

'Locate both keywords and capture the count line that follows each
VectorIdx = InStr(strData, "VECTORS")
TuplesIdx = InStr(strData, "TUPLES")
VectorStr = Mid(strData, VectorIdx + 9, 14) 'overly generous portion of chars
TuplesStr = Mid(strData, TuplesIdx + 8, 14)

If InStr(TuplesStr, Chr(13)) > 0 Then 'trim CR LF
  TuplesStr = Left(TuplesStr, InStr(TuplesStr, Chr(13)) - 1)
End If

If InStr(VectorStr, Chr(13)) > 0 Then 'trim CR LF
  VectorStr = Left(VectorStr, InStr(VectorStr, Chr(13)) - 1)
End If

'Swap the two counts; the rewritten header is the same length as the original
outStr = "VECTORS" & CRLF & TuplesStr & CRLF & DoubleQuote & DoubleQuote & CRLF & "TUPLES" & CRLF & VectorStr

'Write the corrected header back over the old one, in place
filenum = FreeFile
Open Fname For Binary Access Write As filenum
Put #filenum, VectorIdx, outStr
Close #filenum
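For readers outside VBA, the same header fix can be sketched in Python. This is a hypothetical fix_dif_header helper that swaps the two count lines in the header text shown earlier; the VBA above patches the bytes in place instead:

```python
def fix_dif_header(text):
    """Swap the counts under VECTORS and TUPLES in a DIF header.

    Excel writes the row count under VECTORS and the column count under
    TUPLES; SAS expects the opposite, so we swap the two count lines.
    """
    lines = text.split("\r\n")
    vec = lines.index("VECTORS")
    tup = lines.index("TUPLES")
    # The count line ("0,N") sits immediately after each keyword
    lines[vec + 1], lines[tup + 1] = lines[tup + 1], lines[vec + 1]
    return "\r\n".join(lines)

broken = 'TABLE\r\n0,1\r\n"EXCEL"\r\nVECTORS\r\n0,446213\r\n""\r\nTUPLES\r\n0,24\r\n'
fixed = fix_dif_header(broken)
print("VECTORS\r\n0,24" in fixed)  # True
```

Because the two swapped lines have the same combined length, the corrected header occupies exactly the same bytes, which is what makes the in-place binary patch in the VBA safe.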

SharePoint Group Management

Managing SharePoint Groups in PowerShell

SharePoint Groups are a great mechanism for managing user permissions; however, they exist within a single site collection. What if you have hundreds of site collections? We can easily script a range of common operations.

I prefer to use a CSV-fed approach to manage groups and users. I create a CSV with the name of the group and the users, which I list in pipe-separated format (commas are already being used for the CSV). To read in a CSV, use:

Import-Csv "L:\PowerShell\AD and SP group mapping.csv"
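To make the feedfile shape concrete, here is a small Python sketch of the same parsing idea (the SharePointGroup and ADGroup column names come from the script below on this page; the sample rows are hypothetical):

```python
import csv
import io

# The feedfile maps a SharePoint group name to a pipe-separated user list
# (pipes, because commas already delimit the CSV columns).
feed = io.StringIO(
    "SharePointGroup,ADGroup\n"
    "Site Owners,DOMAIN\\alice|DOMAIN\\bob\n"
    "Site Visitors,DOMAIN\\carol\n"
)

mapping = {}
for row in csv.DictReader(feed):
    # Split the pipe-separated user list into individual logins
    mapping[row["SharePointGroup"]] = row["ADGroup"].split("|")

print(mapping["Site Owners"])  # ['DOMAIN\\alice', 'DOMAIN\\bob']
```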

Let’s get the Site, Root Web, as well as an SPUser for the group owner, and get the groups object:

$Site = New-Object Microsoft.SharePoint.SPSite($SiteName)
write-host $site.Url
$rootWeb = $site.RootWeb;
$Owner = $rootWeb.EnsureUser($OwnerName)
$Groups = $rootWeb.SiteGroups;

Here’s how to add a Group:

 
$Groups.Add($SPGroupName, $Owner, $site.Owner, "SharePoint Group to hold AD group for Members")

Here’s how to give the group Read access, for example:

 
$GroupToAddRoleTo = $Groups[$SPGroupName]
if ($GroupToAddRoleTo) #if group exists
{
   $MyAcctassignment = New-Object Microsoft.SharePoint.SPRoleAssignment($GroupToAddRoleTo)
   $MyAcctrole = $RootWeb.RoleDefinitions["Read"]
   $MyAcctassignment.RoleDefinitionBindings.Add($MyAcctrole)
   $RootWeb.RoleAssignments.Add($MyAcctassignment)
}

Here’s how to add a Member to a Group:

$UserObj = $rootWeb.EnsureUser($userName);
if ($UserObj) #if it exists
{
   $GroupToAddTo.addUser($UserObj)  
}

Note that a duplicate addition of a member is a no-op; no errors are thrown.

Here’s how to remove a member:

$UserObj = $rootWeb.EnsureUser($userName);
if ($UserObj)
{
   $GroupToAddTo.RemoveUser($UserObj)  
}

Here’s how to remove a user from the site collection entirely. This wipes the user from the whole site collection, not just one group, so use this approach with care and consideration:

$user1 = $RootWeb.EnsureUser($MyUser)
try
{
   $RootWeb.SiteUsers.Remove($MyUser)
   $RootWeb.Update()
}
catch
{
   Write-Host "Failed to remove $($MyUser) from $($RootWeb.Url)"
}

Here’s the full script, with flags to set the specific actions described above:

Add-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue
# uses feedfile to load and create set of SharePoint Groups.
$mylogfile="L:\PowerShell\ongoinglogfile.txt"
$ADMap= Import-Csv "L:\PowerShell\AD and SP group mapping.csv"
$OwnerName = "DOMAIN\sp2013farm"
$AddGroups = $false;
$AddMembers = $false;  # optionally populates those groups, Comma separated list
$GrantGroupsRead = $true; #grants read at top rootweb level
$RemoveMembers = $false; # optionally  removes Comma separated list of users from the associated group
$WipeMembers = $false;  # wipes the groups clean        
$WipeUsersOutOfSite = $false;  #The Nuclear option. Useful to eliminate AD groups used directly as groups
 
 
 #we do not need a hashtable for this work, but let's load it for extensibility
$MyMap=@{}  #load CSV contents into HashTable
for ($i=0; $i -lt $ADMap.Count; $i++)
{
    $MyMap[$ADMap[$i].SharePointGroup] = $ADMap[$i].ADGroup;
}
 
# Script changes the letter heading for each site collection
$envrun="Dev"           # selects environment to run in
 
if ($envrun -eq "Dev")
{
$siteUrl = "http://DevServer/sites/"
$mylogfile="L:\PowerShell\ongoinglogfile.txt"
$LoopString = "A,B,C,D,E,F,G,H,I,J,K,L,M,N,O,P,Q,R,S,T,U,V,W,X,Y,Z"
$LoopStringArr = $LoopString.Split(",")
 
}
elseif ($envrun -eq "Prod")
{
$siteUrl = "http://SharePoint/sites/"
$mylogfile="L:\PowerShell\ongoinglogfile.txt"
$LoopString = "A,B,C,D,E,F,G,H,I,J,K,L,M,N,O,P,Q,R,S,T,U,V,W,X,Y,Z"
$LoopStringArr = $LoopString.Split(",")
}
else
{
Write-Host "ENVIRONMENT SETTING NOT VALID: script terminating..."
$siteUrl =  $null;
return;
}
 
Write-Host "script starting"
 
$myheader = "STARTING: $(get-date)"
 
foreach ($letter in $LoopStringArr)
{
    $SiteName=$siteurl+$letter
    $Site = New-Object Microsoft.SharePoint.SPSite($SiteName)
 
    write-host $site.Url
    $rootWeb = $site.RootWeb;
    $Owner = $rootWeb.EnsureUser($OwnerName)
    $Groups = $rootWeb.SiteGroups;
 
    for ($ADi = 0; $ADi -lt $ADMap.count; $ADi++)
    {
        $SPGroupName = $ADMap[$ADi].SharePointGroup;
         
        if ($AddGroups)
        {
            if (!$Groups[$SPGroupName]) #no exist, so create
            {
                try
                {
                    $Groups.Add($SPGroupName, $Owner, $site.Owner, "SharePoint Group to hold AD group members")
                }
                catch
                {
                    Write-Host -ForegroundColor DarkRed "Ouch, could not create $($SPgroupName)"
                }
            }
            else
            {
                    Write-Host -ForegroundColor DarkGreen "Already exists: $($SPgroupName)"
            }
        } #endif Add Groups
     
        if ($GrantGroupsRead)
        {
            $GroupToAddRoleTo = $Groups[$SPGroupName]
            if ($GroupToAddRoleTo) #if group exists
            {
                 
                $MyAcctassignment = New-Object Microsoft.SharePoint.SPRoleAssignment($GroupToAddRoleTo)
                $MyAcctrole = $RootWeb.RoleDefinitions["Read"]
                $MyAcctassignment.RoleDefinitionBindings.Add($MyAcctrole)
                $RootWeb.RoleAssignments.Add($MyAcctassignment)
            } #if the group exists in the first place
        } #ActionFlagTrue
     
        if ($AddMembers)
        {
            $GroupToAddTo = $Groups[$SPGroupName]
            if ($GroupToAddTo) #if group exists
            {
                $usersToAdd = $ADMap[$ADi].ADGroup;
                 
                if ($usersToAdd.length -gt 0) #if no users to add, skip
                {
                    $usersToAddArr = $usersToAdd.split("|")
                    foreach ($userName in $usersToAddArr)
                    {
                        try
                        {
                            $UserObj = $rootWeb.EnsureUser($userName);
                            if ($UserObj)
                            {
                                $GroupToAddTo.addUser($UserObj)  #dup adds are a null-op, throwing no errors
                            }
                        }
                        catch
                        {
                        Write-Host -ForegroundColor DarkRed "cannot add user $($userName) to $($GroupToAddTo)"
                        }
 
                    }
                } #users to add
            } #if the group exists in the first place
        } #ActionFlagTrue
         
        if ($RemoveMembers)
        {
            $GroupToAddTo = $Groups[$SPGroupName]
            if ($GroupToAddTo) #if group exists
            {
                $usersToAdd = $ADMap[$ADi].ADGroup;
                 
                if ($usersToAdd.length -gt 0) #if no users to add, skip
                {
                    $usersToAddArr = $usersToAdd.split("|")
                    foreach ($userName in $usersToAddArr)
                    {
                        try
                        {
                            $UserObj = $rootWeb.EnsureUser($userName);
                            if ($UserObj)
                            {
                                $GroupToAddTo.RemoveUser($UserObj)  #remove the user from the group
                            }
                        }
                        catch
                        {
                        Write-Host -ForegroundColor DarkRed "cannot remove user $($userName) from $($GroupToAddTo)"
                        }
 
                    }
                } #users to add
            } #if the group exists in the first place
        } #ActionFlagTrue
         
        if ($WipeMembers)  #Nukes all users in the group
        {
            $GroupToAddTo = $Groups[$SPGroupName]
            if ($GroupToAddTo) #if group exists
            {
                    foreach ($UserObj in @($GroupToAddTo.Users)) #snapshot, since we modify the collection
                    {
                        try
                        {
                            if ($UserObj)
                            {
                                $GroupToAddTo.RemoveUser($UserObj)
                            }
                        }
                        catch
                        {
                        Write-Host -ForegroundColor DarkRed "cannot remove user $($UserObj) from $($GroupToAddTo)"
                        }
                        }
 
                    }
                 
            } #if the group exists in the first place
        } #ActionFlagTrue
 
        if ($WipeUsersOutOfSite)  #removes the listed users from the entire site collection
        {
        $usersToNuke = $ADMap[$ADi].ADGroup;
         
        if ($usersToNuke.length -gt 0) #if no users to add, skip
                {
                    $usersToNukeArr = $usersToNuke.split("|")
                    foreach ($MyUser in $usersToNukeArr)
                    {
                        try
                            {
                                try
                                {
                                    $user1 = $RootWeb.EnsureUser($MyUser)
                                }
                                catch
                                {
                                    Write-Host "x1: Failed to ensure user $($MyUser) in $($Site.url)"
                                }
                                 
                                try
                                {
                                    $RootWeb.SiteUsers.Remove($MyUser)
                                    $RootWeb.update()
                                }
                                catch
                                {
                                    Write-Host "x2: Failed to remove $($MyUser) from all users in $($Site.url)"
                                }
                           }
                           catch
                           {
                                Write-Host "x4: other failure for $($MyUser) in $($Site.url)"
                           }
                    } #foreach user to nuke
                } #if users to nuke
        } #ActionFlagTrue
         
    }
     
     
    $rootWeb.dispose()
    $site.dispose()
     
} #foreach site

Restore SharePoint document timestamp and author from feedfile

Often an administrator, during maintenance or when checking in a document for a user, “stomps” on the timestamp and the record of who edited the document. In a perfect world we take the time to restore authorship and timestamp. Here’s a script that reads in a CSV of the URL, timestamp, and user for any number of documents to correct. It will also try to remove the previous incorrect version, if possible.

 
$actionlist= Import-Csv "C:\scripts\NameDateTag.csv"
 
for ($Ai=0; $Ai -lt $actionlist.Count; $Ai++)
    {
    $ActionRow=$ActionList[$Ai]
    $docurl=$ActionRow.DocURL;
    $site = New-Object Microsoft.SharePoint.SPSite($docurl)
    $web = $site.OpenWeb()
    $item = $web.GetListItem($docurl)
    $list = $item.ParentList
     
    [System.DateTime] $dat = Get-Date $ActionRow.Timestamp
    $usr = $web.ensureuser($ActionRow.Editor)
     
     $item["Modified"] = $dat;
     $item["Editor"] = $usr;
     $item.Update()
     try { $item.Versions[1].delete() } catch {write-host -foregroundcolor red "Error (1) could not delete old version of $($item['Name'])"}
    }

Use PowerShell to Automate Migration of FTP files to a File Share

A common business process to automate is moving files from an FTP server. To copy the files from an FTP server we first need to get a file listing. The following routine serves nicely:

function Get-FtpDir ($url, $credentials) {
    $request = [Net.WebRequest]::Create($url)
    $request.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectory
    if ($credentials)
    {
        $request.Credentials = $credentials
    }
    $response = $request.GetResponse()
    $reader = New-Object IO.StreamReader $response.GetResponseStream()
    $reader.ReadToEnd()
    $reader.Close()
    $response.Close()
}

Let’s set some basic parameters for the file migration:

$url = 'ftp://sftp.SomeDomain.com/';   #trailing slash matters: source paths are built by simple concatenation
$user = 'UserID';
$pass = 'Password'  #single quotes recommended if unusual characters are in use
$DeleteSource = $true;  #controls whether to delete the source files from the FTP Server after copying
$DestLocation = '\\DestinationServer\AnyLocation\';

Let’s start the processing with connecting to the FTP server, getting the file list, and processing:

$credentials = new-object System.Net.NetworkCredential($user, $pass)
 $webclient = New-Object System.Net.WebClient
 $webclient.Credentials = New-Object System.Net.NetworkCredential($user,$pass)  
 
$files=Get-FTPDir $url $credentials
$filesArr = $files.Split("`n")
 
Foreach ($file in ($filesArr )){
    if ($file.length -gt 0) #this actually happens for last file
    {
    $source=$url+$file
    $dest = $DestLocation + $file
    $dest = $dest.Trim()        # this is actually needed, as trailing blank appears in FTP directory listing
    $WebClient.DownloadFile($source, $dest)
    }
}


Let’s now delete the source files, if configured to do so:

if ($DeleteSource)
{
Foreach ($file in ($filesArr )){
    if ($file.length -gt 0) #this actually happens for last file
    {
    $source=$url+$file
    $ftprequest = [System.Net.FtpWebRequest]::create($source)
    $ftprequest.Credentials =  New-Object System.Net.NetworkCredential($user,$pass)
    $ftprequest.Method = [System.Net.WebRequestMethods+Ftp]::DeleteFile
    $ftprequest.GetResponse()
    }
}
}

That’s it in a nutshell. Just don’t try to move zero-length files, and trim the filenames for trailing blanks. Ensure your FTP ports are open, and you are good to go!
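The two gotchas above (zero-length entries and trailing blanks in the directory listing) can be isolated in a small helper. Here is a minimal Python sketch, with clean_listing as a hypothetical name, assuming the raw newline-separated text an FTP listing call returns:

```python
def clean_listing(raw):
    """Turn a raw FTP directory listing into usable filenames.

    Listings arrive as newline-separated text, often with trailing
    whitespace on each name and an empty entry after the final newline,
    so we trim every name and drop the empties.
    """
    return [name.strip() for name in raw.split("\n") if name.strip()]

raw = "report1.csv \r\nreport2.csv \r\n"
print(clean_listing(raw))  # ['report1.csv', 'report2.csv']
```

The PowerShell above does the same job with $file.length checks and $dest.Trim() inside the loop.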


How to Download All Attachments for All Tasks in a List

Tasks in a SharePoint task list can have attachments. In fact, they can have multiple attachments.

However, these are stored in an “AttachmentCollection”. We can iterate through all items in the task list to download all attachments.

What we do is create a folder for each of the items and name the folder by the ID of the task.

 
$webUrl = "http:.."            # this is the URL of the SPWeb
$library = "Compliance Tasks"  # this is the SPList display name
$tempLocation = "D:\PROD"      # local folder to dump files
$s = new-object Microsoft.SharePoint.SPSite($webUrl)
$w = $s.OpenWeb()
$l = $w.Lists[$library]
foreach ($listItem in $l.Items)
{
    Write-Host "    Content: " $listItem.ID
    $destinationfolder = $tempLocation + "\" + $listItem.ID
    if ($listItem.Attachments.Count -gt 0)
    {
        if (!(Test-Path -path $destinationfolder))
        {
            $dest = New-Item $destinationfolder -type directory
        }
        foreach ($attachment in $listItem.Attachments)
        {
            $file = $w.GetFile($listItem.Attachments.UrlPrefix + $attachment)
            $bytes = $file.OpenBinary()
            $path = $destinationfolder + "\" + $attachment
            Write "Saving $path"
            $fs = new-object System.IO.FileStream($path, "OpenOrCreate")
            $fs.Write($bytes, 0, $bytes.Length)
            $fs.Close()
        }
    }
}

A folder for each task was created to allow for multiple attachments. The ID was applied to each folder to allow a subsequent script to traverse and upload the attachments by ID or for any linkage preservation.

For how to upload attachments from a task list, please see: Uploading attachments to tasks.
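As a rough sketch of the reverse operation, attachments can be re-added from the ID-named folders with SPAttachmentCollection.Add. This assumes the same $webUrl, $library, and $tempLocation variables as the download script above; it is an illustration, not the exact upload script referenced.

$webUrl       = "http:.."            # this is the URL of the SPWeb
$library      = "Compliance Tasks"   # this is the SPList display name
$tempLocation = "D:\PROD"            # local folder holding the ID-named folders

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$s = New-Object Microsoft.SharePoint.SPSite($webUrl)
$w = $s.OpenWeb()
$l = $w.Lists[$library]

foreach ($folder in (Get-ChildItem $tempLocation | Where-Object { $_.PSIsContainer }))
{
    # The folder name is the task ID created by the download script
    $listItem = $l.GetItemById([int]$folder.Name)
    foreach ($file in (Get-ChildItem $folder.FullName | Where-Object { !$_.PSIsContainer }))
    {
        $bytes = [System.IO.File]::ReadAllBytes($file.FullName)
        $listItem.Attachments.Add($file.Name, $bytes)
    }
    $listItem.Update()
}

$w.Dispose()
$s.Dispose()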


How to Configure a URL to a Specific Location Inside a PDF

URL to a Location Inside a PDF

It turns out there is no way to specify a URL that jumps to a bookmark inside a PDF; links to bookmarks do not work.
Instead, here's how to jump to a place inside a PDF from a link by using what are called "Named Destinations". Note these are not bookmarks.


Here is an example link that will work: http://SharePoint/dept/abc/Shared%20Documents/Test%20document%20for%20anchoring3.pdf#nameddest=dest3

In the URL above, note the "#", then "nameddest=", then the named destination, in this case one I created called "dest3". The general pattern is #nameddest=[destination].

To create a named destination (again, these are not bookmarks):

1. Use Adobe Acrobat X.

2. Edit the PDF.

3. Enable viewing of named destinations by clicking View, Show/Hide, Navigation Panes, Destinations (see image below).

4. Click "Destinations" below the "Bookmarks" icon in the left pane. See below:

img-01

Named Destination
img-02


How To Migrate Documents Into SharePoint

Getting documents into SharePoint

There are many ways to get documents into SharePoint. This article covers a range of approaches aside from basic file upload. Let's first take a quick look at SharePoint's general file upload limitations:

  • Zero-length files: Files with no content, such as shortcuts, cannot be imported
  • Oversized files: Files larger than the configured maximum upload size will not be uploaded
  • Blocked file types: Some file types are blocked by default, often for good reason; executables
    and scripts are dangerous to upload, since they can run on users' desktops or contain malware
  • Invalid filename characters: Unsupported characters include ~, #, %, &, *, {, }, \, :,
    <, >, ?, /, and |
  • Trailing periods: A filename cannot end with a period
  • Leading or trailing spaces: A filename cannot start or end with a blank. Often an upload with
    trailing blanks will work, but the blank may be truncated
  • Long filenames: The full URL cannot exceed somewhere around 230 characters. Adding in the web
    application, site, library, and any number of folders, it's not too hard to hit this limit
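These rules can be checked up front before a bulk upload. The following is a rough pre-flight sketch; the character list and the roughly 230-character URL budget come from the limitations above, and the source folder and target URL prefix are hypothetical placeholders.

# Sketch: flag local files that will likely fail a SharePoint upload
$invalidChars    = '~#%&*{}\:<>?/|'.ToCharArray()
$maxUrlLength    = 230
$targetUrlPrefix = "http://SharePoint/dept/abc/Shared Documents/"   # hypothetical target

Get-ChildItem "D:\PROD" -Recurse | Where-Object { !$_.PSIsContainer } | ForEach-Object {
    $name = $_.Name
    $problems = @()
    if ($_.Length -eq 0)      { $problems += "zero-length file" }
    if ($name.Trim() -ne $name) { $problems += "leading/trailing space" }
    if ($name.EndsWith("."))  { $problems += "trailing period" }
    foreach ($c in $invalidChars)
    {
        if ($name.Contains([string]$c)) { $problems += "invalid character '$c'" }
    }
    if (($targetUrlPrefix + $name).Length -gt $maxUrlLength) { $problems += "URL too long" }
    if ($problems) { Write-Output "$name : $($problems -join ', ')" }
}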


  Drag and Drop

SharePoint 2013 supports drag-and-drop upload directly into a library. Some limitations to consider:

  • Only 100 documents at a time
  • Filenames can only contain valid characters
  • Unless default metadata is configured, this approach does not support tagging.


  Explorer Mode

This is also known as WebDAV. In the ribbon under the "Library" tab, there should be an "Open with Explorer" option. This approach supports folders and large numbers of documents. Limitations include poor handling of errors during upload.


  OneDrive for Business

If Sync is selected on the library, documents copied to the locally synced OneDrive for Business folder are uploaded to the library automatically.

  Scripting

Using PowerShell, file migration can be scripted. I've created scripts that:

  • Consolidate multiple copies of a document into a single document with versions
  • Preserve authorship and timestamps
  • Produce detailed output as a CSV
  • Migrate deltas of only the files meeting specific date or other criteria
  • Apply custom business logic to map files to sites, libraries, content types, folders, and document sets
  • Fix filenames to avoid illegal-filename errors
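As an illustration of the authorship and timestamp point, the server object model's SPFileCollection.Add has an overload that accepts the created/modified users and times. The following is a minimal sketch; the site URL, file path, and account name are hypothetical.

# Sketch: upload one file while preserving author and timestamps
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$w      = Get-SPWeb "http://SharePoint/dept/abc"     # hypothetical site
$folder = $w.GetFolder("Shared Documents")
$local  = Get-Item "D:\PROD\report.docx"             # hypothetical file
$bytes  = [System.IO.File]::ReadAllBytes($local.FullName)
$user   = $w.EnsureUser("DOMAIN\original.author")    # hypothetical account

# Overload: Add(url, bytes, createdBy, modifiedBy, timeCreated, timeLastModified)
$file = $folder.Files.Add($local.Name, $bytes, $user, $user,
                          $local.CreationTimeUtc, $local.LastWriteTimeUtc)
$w.Dispose()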

  Migration Tools

A range of document migration tools is available, such as ShareGate, AvePoint DocAve, Tzunami, and Lightning Tools.

For large-scale or complex migrations, our SharePoint development solutions team can help with customized scripting, error handling, and secure deployment, tailored to your organizational needs.


Setting a Site Collection to not be read-only

How to set a site collection as not read-only

Is your site collection read-only?

It is critical to be able to set a site collection back to read-write. A site collection can get stuck read-only if a site backup is interrupted, because an SPSite is temporarily made read-only while it is being backed up.

$site = Get-SPSite "http://SharePoint/managedpath/sitename"
$site.set_ReadOnly($false)

To turn it back to read-only:

$site=Get-SPSite "http://SharePoint/managedpath/sitename"
$site.set_ReadOnly($true)
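To find out which site collections are currently stuck read-only before toggling the flag, a quick sketch using the standard Get-SPSite cmdlet and the SPSite lock properties:

# Sketch: list all site collections whose read-only flag is set
Get-SPSite -Limit All | Where-Object { $_.ReadOnly } |
    Select-Object Url, ReadOnly, ReadLocked, WriteLocked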
