To get RBS going, we are going to configure SQL Server, install the RBS libraries on the SharePoint servers, and then configure RBS for each content database.
Configure SQL Server
First enable FILESTREAM on the SQL Server instance by:
- In SQL Server Configuration Manager, right-click the SQL Server instance, open Properties, and on the FILESTREAM tab enable “FILESTREAM for Transact-SQL access”
- Enable “FILESTREAM for file I/O streaming access”
- Enable “Allow remote clients to have streaming access to FILESTREAM data”
- Run the following SQL
EXEC sp_configure filestream_access_level, 2
RECONFIGURE
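To confirm FILESTREAM actually took effect, you can query the configured and effective levels. This is just a sketch; it assumes Invoke-Sqlcmd (from the SQLPS or SqlServer module) is available and that you substitute your own instance name:

[sourcecode language="powershell"]
# Sketch: confirm the FILESTREAM access level on the instance.
Import-Module SqlServer -ErrorAction SilentlyContinue   # or SQLPS on older servers

Invoke-Sqlcmd -ServerInstance "SQL Server Instance Name" -Query @"
SELECT SERVERPROPERTY('FilestreamConfiguredLevel') AS ConfiguredLevel,
       SERVERPROPERTY('FilestreamEffectiveLevel')  AS EffectiveLevel;
"@
# The effective level should reflect what you just enabled
# (2 = T-SQL plus local streaming access, 3 = remote streaming access also allowed).
[/sourcecode]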
Next, we’ll provision a BLOB Store. You want a dedicated drive (not used by SQL Server binaries, temp, or the actual databases), and preferably cheaper storage. The BLOB Store needs to be configured for each database; otherwise the RBS configuration on the SharePoint side fails (with a very unhelpful error).
use [WSS_Content]
-- name of the Content DB for which you are adding a BLOB Store; you'll undoubtedly have a number of DBs if you are doing RBS, so name them with a standard

-- replace with your own master key password
if not exists (select * from sys.symmetric_keys where name = N'##MS_DatabaseMasterKey##')
    create master key encryption by password = N'Admin Key Password !2#4'

use [WSS_Content]
-- SharePoint RBS has an undocumented limitation in that there can be only one File Stream provider, so use the same one!
if not exists (select groupname from sysfilegroups where groupname = N'RBSFilestreamProvider')
    alter database [WSS_Content] add filegroup RBSFilestreamProvider contains filestream

use [WSS_Content]
-- set the filename to your new, cheaper, dedicated Blob Store location; you can create separate drives per DB
alter database [WSS_Content]
    add file (name = RBSFilestreamFile, filename = 'c:\Blobstore')
    to filegroup RBSFilestreamProvider
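To confirm the FILESTREAM filegroup and data file landed where you expect, a quick check can be run per content database. A minimal sketch, assuming Invoke-Sqlcmd and substituting your own instance and database names:

[sourcecode language="powershell"]
# Sketch: confirm the FILESTREAM filegroup and its file exist for a content DB.
Invoke-Sqlcmd -ServerInstance "SQL Server Instance Name" -Database "WSS_Content" -Query @"
SELECT fg.name AS FilegroupName, df.name AS LogicalFileName, df.physical_name
FROM sys.filegroups fg
JOIN sys.database_files df ON df.data_space_id = fg.data_space_id
WHERE fg.type = 'FD';  -- FD = FILESTREAM data filegroup
"@
# You should see RBSFilestreamProvider pointing at your dedicated Blob Store path.
[/sourcecode]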
A point to be aware of:
Windows Firewall on the database server can cause this step to fail with an error. This is solved by disabling the firewall or by unblocking remote debugging.
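If you suspect the firewall, one quick way to rule it out (in a test environment only) is to turn it off temporarily and re-run the SQL above. A sketch:

[sourcecode language="powershell"]
# Sketch: temporarily disable Windows Firewall on the database server to rule it
# out as the cause. Test environments only; re-enable or add proper rules after.
netsh advfirewall set allprofiles state off
# ...re-test the SQL step above, then turn the firewall back on:
netsh advfirewall set allprofiles state on
[/sourcecode]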
Install RBS on SharePoint servers
Before we get started, do yourself a favor and snapshot or back up a server image, as well as the Content DBs. Installing RBS actually changes the Content DBs by adding a number of tables and stored procedures. Make sure you use the x64 RBS drivers, and do not install by double-clicking the .MSI file to run the wizard; Microsoft reports that the wizard configures RBS suboptimally and the result cannot be changed afterwards. Lastly, anything you ever install on SharePoint servers should be saved for reference, server reinstallation, DR, recovery, or even simply for adding another server to the farm. This includes any web part, feature, WSP, or executable, but I digress…
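If you want a scriptable way to take those Content DB backups before installing, here is a minimal sketch. It assumes Invoke-Sqlcmd is available on the server; the instance name and backup path are placeholders to adjust:

[sourcecode language="powershell"]
# Sketch: take a SQL backup of every content database before touching RBS.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$instance   = "SQL Server Instance Name"     # your SQL instance
$backupPath = "E:\Backups"                   # an existing path on the SQL server

Get-SPContentDatabase | ForEach-Object {
    $name = $_.Name
    Invoke-Sqlcmd -ServerInstance $instance -QueryTimeout 65535 -Query `
        "BACKUP DATABASE [$name] TO DISK = N'$backupPath\$name.bak' WITH INIT;"
}
[/sourcecode]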
This is the command for the first install, against the first database:
msiexec /qn /lvx* rbs_install_log.txt /i RBS.msi TRUSTSERVERCERTIFICATE=true FILEGROUP=PRIMARY DBNAME="Content_Intranet_IT" DBINSTANCE="SQL Server Instance Name" FILESTREAMFILEGROUP=RBSFilestreamProvider FILESTREAMSTORENAME=[TheNameOfTheFileStream]
This is for subsequent databases:
The key here is that the name of the File Stream Store cannot change between databases; this is the RBS limitation mentioned earlier. For each DB, the installer creates a set of tables and stored procedures.
msiexec /qn /i rbs.msi REMOTEBLOBENABLE=1 FILESTREAMPROVIDERENABLE=1 DBNAME="your content db name" FILESTREAMSTORENAME=[TheNameOfTheFileStream] ADDLOCAL=EnableRBS,FilestreamRunScript DBINSTANCE="SQL Server Instance Name"
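If you have many content databases, the per-database install can be scripted. This is only a sketch that wraps the msiexec command above in a loop; the store name, instance name, first database name, and per-database log file are placeholders or additions of my own to adjust:

[sourcecode language="powershell"]
# Sketch: run the "subsequent database" RBS install for every remaining content DB.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$storeName = "TheNameOfTheFileStream"        # must be identical for every database
$instance  = "SQL Server Instance Name"      # your SQL instance
$firstDb   = "Content_Intranet_IT"           # the DB already handled by the first install

Get-SPContentDatabase | Where-Object { $_.Name -ne $firstDb } | ForEach-Object {
    # /lvx* logging is added here so each DB gets its own install log
    $msiArgs = "/qn /lvx* `"rbs_install_$($_.Name).txt`" /i rbs.msi " +
               "REMOTEBLOBENABLE=1 FILESTREAMPROVIDERENABLE=1 " +
               "DBNAME=`"$($_.Name)`" FILESTREAMSTORENAME=$storeName " +
               "ADDLOCAL=EnableRBS,FilestreamRunScript DBINSTANCE=`"$instance`""
    Start-Process msiexec.exe -ArgumentList $msiArgs -Wait
}
[/sourcecode]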
The next challenge is determining whether the installation worked. The command returns to the prompt instantly even though the install keeps running in the background; it typically takes a minute or so. The logs are actually quite unclear. Microsoft says to scroll to the end of the file and look within the last 20 lines for “Product: SQL Remote Blob Storage – Installation completed successfully“. Another way to tell is the size of the log file: a failed install produces a log of around 500 KB without the above text near the end, while a successful install produces a log of roughly 1.4 MB if the database was RBS-enabled as part of the install. One last thing to check is that the RBS tables were created in the database. Fire up SQL Server Management Studio, open the content database, and look at the end of the list of tables. If RBS is configured for the database, you’ll see roughly 20 tables starting with “rbs_”. These tables track references into the Blob Store, as well as configuration and housekeeping data such as the history and deleted items used by the lazy-delete cleanup process; we’ll cover configuring and running that process, the “Maintainer”, later on.
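To make those checks repeatable, here is a small sketch that looks for the success string in the install log and counts the rbs_ tables. Invoke-Sqlcmd, the log file name, and the server/database names are assumptions to adjust:

[sourcecode language="powershell"]
# Sketch: two quick checks that an RBS install actually succeeded.
$log = "rbs_install_log.txt"                 # whichever log you passed to /lvx*
if (Select-String -Path $log -Pattern "Installation completed successfully" -Quiet) {
    Write-Host "RBS installer reported success in $log"
} else {
    Write-Host "Success string not found in $log - check the last lines of the log"
}

# Count the rbs_ tables the installer should have created in the content DB
# (the post notes roughly 20 tables for an RBS-enabled database).
Invoke-Sqlcmd -ServerInstance "SQL Server Instance Name" -Database "WSS_Content" -Query `
    "SELECT COUNT(*) AS RbsTableCount FROM sys.tables WHERE name LIKE 'rbs[_]%';"
[/sourcecode]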
Configuring the Blob Store in SharePoint
At this point we might think we are done, but there’s more to do; nothing is being stored yet in the FILESTREAM Blob Store. SharePoint will continue to blindly store all files in the database. Here are the simple steps to take in PowerShell (ensure the SharePoint snap-in is loaded):
# Provide your content DB by name, or, if there's only one Content DB for a web application, get it by identifying the web app
$cdb = Get-SPContentDatabase -Identity Content_DB_Use_Your_Name
$cdb = Get-SPContentDatabase -WebApplication http://MySite

# This grabs the Blob Storage settings object; if you type "$rbs" it will show the status, which initially is not enabled
$rbs = $cdb.RemoteBlobStorageSettings
$rbs.GetProviderNames()
$rbs.Installed()    # this confirms RBS is installed correctly
$rbs.Enable()       # this is the key; it enables the Blob Store

# Setting the active provider is required; there's only one, at index 0
$rbs.SetActiveProviderName($rbs.GetProviderNames()[0])

# This is the minimum size of a file to be sent to the Blob Store; anything smaller is kept in the Content DB
$rbs.MinimumBlobStorageSize = 500KB

# This forces content into the Blob Store from the Content DB, or moves content out of the Blob Store and back to the Content DB, based on MinimumBlobStorageSize
$rbs.Migrate()

# Both updates are absolutely required for MinimumBlobStorageSize to "stick"; existing documentation neglects this point
$rbs.Update()
$cdb.Update()
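If you are enabling RBS on several content databases, the same steps can be wrapped into a small helper. This is only a sketch; the function name and default threshold are my own, not part of the RBS API:

[sourcecode language="powershell"]
# Sketch: apply the RBS enablement steps above to a named content database.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

function Enable-RbsOnContentDatabase {
    param(
        [string]$DatabaseName,
        [long]$MinimumBlobSize = 500KB
    )
    $cdb = Get-SPContentDatabase -Identity $DatabaseName
    $rbs = $cdb.RemoteBlobStorageSettings
    if (-not $rbs.Installed()) { throw "RBS is not installed for $DatabaseName" }

    $rbs.Enable()
    $rbs.SetActiveProviderName($rbs.GetProviderNames()[0])
    $rbs.MinimumBlobStorageSize = $MinimumBlobSize
    $rbs.Update()       # both updates before migrating, so the threshold sticks
    $cdb.Update()
    $rbs.Migrate()
}

# Example:
# Enable-RbsOnContentDatabase -DatabaseName "WSS_Content" -MinimumBlobSize 500KB
[/sourcecode]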
Setting the MinimumBlobStorageSize is key. Setting it aggressively low results in a smaller Content DB, but relies more heavily on the speed of the Blob Store. A range of 100 KB to 1 MB is reasonable. You can experiment with different values, running $rbs.Migrate() and testing after each change. Note that the Blob Store can grow rapidly by design: files overwritten in SharePoint are never overwritten in the Blob Store; instead, new files are created. A subsequent cleanup process, the “Maintainer”, needs to be configured and run, but that’s tomorrow’s blog.
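One rough way to gauge the effect of a given threshold after a migrate is to compare the Blob Store folder size with the content database’s data-file size. A sketch; the Blob Store path and server/database names are examples only:

[sourcecode language="powershell"]
# Sketch: compare Blob Store folder size with the content DB data-file size.
$blobStorePath = "c:\Blobstore"              # your Blob Store location
$blobBytes = (Get-ChildItem $blobStorePath -Recurse |
              Where-Object { -not $_.PSIsContainer } |
              Measure-Object Length -Sum).Sum
"{0:N0} MB in the Blob Store" -f ($blobBytes / 1MB)

# Data-file size of the content DB (excludes the FILESTREAM filegroup)
Invoke-Sqlcmd -ServerInstance "SQL Server Instance Name" -Database "WSS_Content" -Query `
    "SELECT SUM(size) * 8 / 1024 AS ContentDbSizeMB FROM sys.database_files WHERE type_desc = 'ROWS';"
[/sourcecode]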
Check out RBS Part 3
6 thoughts on “RBS (Remote Blob Storage) part 2”
Hi Joel,
I’m trying to find a SQL query that could give me, for each file in my Blob Store, the name and path in SharePoint where the file lives.
I’m pretty sure you know a way to do that, and it would be a great help for me…
Regards
Hi Joel,
I really played with fire and got burned: I did an Active Directory migration, and my SQL Server and my SharePoint server were migrated. At first sight SharePoint was still working, but I have a problem with RBS: when I try to get a file stored in RBS (because it is larger than 1 MB) I get a 404 error, and the following error message in the Windows event viewer on the SharePoint server.
Message ID:16, Level:ERR , Process:8388, Thread:25
Exception thrown:
Operation: StoreReadBlob
BlobStoreId: 0
Log Time: 11/02/2014 16:40:37
Exception: Microsoft.Data.SqlRemoteBlobs.RemoteBlobStoreException: The blob store threw an exception. The exception thrown is saved in InnerException. ---> System.ComponentModel.Win32Exception: The network name cannot be found
at System.Data.SqlTypes.SqlFileStream.OpenSqlFileStream(String path, Byte[] transactionContext, FileAccess access, FileOptions options, Int64 allocationSize)
at System.Data.SqlTypes.SqlFileStream..ctor(String path, Byte[] transactionContext, FileAccess access, FileOptions options, Int64 allocationSize)
at Microsoft.Data.BlobStores.FilestreamBlobStore.FilestreamReaderStream..ctor(String filestreamPathName, CommittableTransaction transaction, Byte[] transactionCtx, Boolean asyncReadDisabled)
at Microsoft.Data.BlobStores.FilestreamBlobStore.FilestreamStoreLibraryBase.ReadBlob(Byte[] storePoolId, Byte[] storeBlobId)
at Microsoft.Data.BlobStores.BlobStore.ReadBlobInternal(Request request)
--- End of inner exception stack trace ---
ExceptionType: BlobStore
ExceptionCode: BlobStoreUnhandledException
Request Information:
Operation: StoreReadBlob
Request:
Request 65 for OpenReadableBlobStream
Request Configuration not set.
Command Details:
FetchBlobCommand : Request 65 , CheckedConfigVersion: Undefined, RetryNeeded: False, RetryAttempts: 3
BlobDetails: BlobStoreId 1, StorePoolId 0x01000000, StoreBlobId 0x00227700000056efbd960b382481cfa3
Session Information not set.
Provider Session Information:
Provider Session Info:
ProviderSession: ProviderSession. Id: , Name: , Type: , StoreVersion: , Location:
Core Configuration:
Extended Configuration:
Any ideas to help me?
Regards
Check FILESTREAM permissions.
One way to try to get your production farm going is to get the blobs back out. Here’s how (note: I have not tested this recently, but it should work):
[sourcecode language="powershell"]
$db = New-Object System.Collections.ArrayList
Get-SPContentDatabase | ForEach-Object { [void]$db.Add($_) }
for ($ii = 0; $ii -lt $db.Count; $ii++)
{
    Write-Host -ForegroundColor Green "$($db[$ii].Name)"
    try {
        $rbs = $db[$ii].RemoteBlobStorageSettings
        Write-Host "Provider Name=$($rbs.GetProviderNames())"
        Write-Host "Enabled=$($rbs.Enabled)"
        Write-Host "Min Blob Size before resetting=$($rbs.MinimumBlobStorageSize)"
        $rbs.MinimumBlobStorageSize = 2GB   # set it really high so everything migrates back into the Content DB
        $rbs.Update()
        $rbs.Migrate()
        $db[$ii].Update()   # not documented, but the db.Update() is absolutely required in addition to the rbs.Update()
    }
    catch { Write-Host -ForegroundColor Red "RBS not installed on this database!`n" }
    finally { Write-Host "------------------------------------------------------------------`n" }
}
[/sourcecode]
Thanks for your quick reply. I’ve tried it, and I guess it should take some time to reintegrate all the files into the Content database, but I don’t get very verbose messages and I still can’t access files bigger than 1 MB in SharePoint 🙁 I will have to restore the backup I made before I activated FILESTREAM. 🙁 It was Thursday, so I will have to back up all files modified since then and reintegrate them after the restore.
Or maybe you have some magic to suggest… I must admit I was totally dumb to activate RBS before performing an AD migration. And do you have any tips about this and SharePoint, because ADMT doesn’t offer “SharePoint farm” among the objects it can migrate.
Hi Joel, when I run the command $rbs.Update() it tells me that there is no Update method for this kind of object.
It is possible the Update() method is not available in your version and hence not required.