How are your SharePoint Content Databases Feeling?

Content databases are the magic black boxes where all of our SharePoint content, information and documents alike, is stored. They work hard in the background, doing their job, and most of the time you don’t have to pay them much attention.

That said, there’s a lot to gain from becoming friends with, understanding, and looking after your content databases. We all want our SharePoint farms to perform at their best, and the content databases are one of several factors in achieving that.

One of the things you should be aware of and study is the SharePoint Software Boundaries and Limitations documentation. There’s a section about content databases that I will pay particular attention to in this article.

The tricky part is that there are recommendations as well as soft and hard limits within SharePoint. If you don’t pay attention to your SharePoint farm, it’s easy to accidentally cross those recommendations, and maybe even the limits.

Exceeding the recommendations and limits will most often not bring your farm down completely at once, but it may gradually affect the performance of the farm. You may wake up to a farm that is behaving strangely and throwing occasional, weird exceptions. Performance problems and exceptions like these can be very difficult to trace back to one of those soft limits (speaking from my own personal experience).

SharePoint supports content databases up to 4 TB, but the recommendation is to keep them below 200 GB. If you need to store more than 200 GB of data within a web application, you should add additional content databases. You may attach up to 500 content databases within a farm (I have never tested this limit myself, though).
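Adding an additional content database can be done from Central Administration, or with the New-SPContentDatabase cmdlet. A minimal sketch, where the database name and web application URL are example values you would replace with your own:

```powershell
# Create an additional content database for a web application.
# The name and URL below are placeholders; adjust them to your farm.
New-SPContentDatabase -Name "WSS_Content_Intranet_02" `
                      -WebApplication "http://intranet.contoso.com"
```

New site collections will then be created in the content database with the most remaining capacity, so you can steer growth simply by having databases available.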

Another limitation that we should focus on is the number of site collections within a single content database. You are able to store up to 10,000 site collections within a content database. However, the recommendation for non-personal (My Site) site collections is a maximum of 2,500.
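A quick way to check how close each content database is to those numbers is the CurrentSiteCount property, for example:

```powershell
# List each content database and its current number of site collections,
# largest first.
Get-SPContentDatabase |
    Select-Object Name, CurrentSiteCount |
    Sort-Object CurrentSiteCount -Descending
```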

So, how do we keep track of our content databases in order to stay below these two recommendations?

You are able to set a maximum number of site collections for a content database within Central Administration. However, these warnings are not that easy to spot, and you are not able to specify a warning level or maximum size for the content itself.
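The same site collection levels can also be set with PowerShell; Set-SPContentDatabase exposes both the warning and the maximum site collection counts. The database name and the values below are examples:

```powershell
# Set the warning and maximum number of site collections for a database.
# "WSS_Content_Intranet" is a placeholder database name.
Set-SPContentDatabase -Identity "WSS_Content_Intranet" `
                      -WarningSiteCount 2000 `
                      -MaxSiteCount 2500
```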

Keeping track of the size of the content databases by manually looking at the data files is quite impractical. It’s a lot better to automate the process somehow, using an off-the-shelf tool or with PowerShell.

Below is a small PowerShell script that may help you keep track of the content databases. The script summarizes a small report and sends it off by e-mail. The report is a list of the SharePoint farm’s content databases, the size of their data files, and the number of site collections within each content database. Add a scheduled task in Windows that initiates the script once a week, once a month, or even daily to get a report that will help you keep an eye on those content databases.

# Database server instance.
$DBInstance = 'server'

# Set warning levels.
$WarningNumSites = 2000
$WarningSize = 100 #GB

# Set e-mail properties.
# NOTE: the HTML tags in the strings below were stripped when this post was
# first published; they have been restored here as plain table markup.
$MailSubject = 'SharePoint content database report'
$MailBody = '<p>Content database report</p>'
$MailFrom = ''
$MailTo = ''
$MailSmtp = ''

# Register the SharePoint snap-in.
$snapin = Get-PSSnapin | Where-Object { $_.Name -eq 'Microsoft.SharePoint.Powershell' }
if ($snapin -eq $null) {
    Add-PSSnapin 'Microsoft.SharePoint.Powershell'
}

# Get a list of databases and their data file sizes from the database server.
$dt = New-Object 'System.Data.DataTable'
$cn = New-Object System.Data.SqlClient.SqlConnection "server=$DBInstance;database=msdb;Integrated Security=sspi"
$cn.Open()
$sql = $cn.CreateCommand()
# size is reported in 8 KB pages: /128 gives MB, /1024 gives GB.
$sql.CommandText = "
    SELECT @@servername AS ServerName,
           DB_NAME(mf.database_id) AS DBName,
           CONVERT(dec(14,2), size)/128/1024 AS DBSize
    FROM sys.master_files AS mf
    INNER JOIN sys.databases AS db ON db.database_id = mf.database_id
    WHERE mf.type_desc = 'ROWS'"
$rdr = $sql.ExecuteReader()
$dt.Load($rdr)
$cn.Close()

$MailBody += '<table><tr><th>Web Application</th><th>DB Name</th><th>Num Sites</th><th>DB Size</th></tr>'
$Warning = $false

# Loop all the SharePoint content databases and add them to the e-mail report.
Get-SPDatabase | Where-Object { $_.Type -like 'Content Database' } |
    Sort-Object { $_.WebApplication.DisplayName } |
    ForEach-Object {
        $WebAppName = $_.WebApplication.DisplayName
        $DBName = $_.Name
        $NumSite = $_.CurrentSiteCount
        $db = $dt | Where-Object { $_.DBName -like $DBName }
        $DBSize = $db.DBSize

        # Highlight values that exceed the warning levels.
        if ($WarningNumSites -le $NumSite) {
            $NumSite = '<strong>' + $NumSite + '</strong>'
            $Warning = $true
        }
        if ($WarningSize -le $DBSize) {
            $DBSize = '<strong>' + $DBSize.ToString('0.000') + ' GB</strong>'
            $Warning = $true
        }
        else {
            $DBSize = $DBSize.ToString('0.000') + ' GB'
        }
        $MailBody += '<tr><td>{0}</td><td>{1}</td><td>{2}</td><td>{3}</td></tr>' -f $WebAppName, $DBName, $NumSite, $DBSize
    }
$MailBody += '</table>'

if ($Warning) {
    $MailSubject = 'Warning! ' + $MailSubject
}

# Send the e-mail report.
Send-MailMessage -From $MailFrom -To $MailTo -SmtpServer $MailSmtp -BodyAsHTML -Subject $MailSubject -Body $MailBody

Please note! You will have to run the script on one of the SharePoint farm’s servers, and run it as a user that has direct access to the SQL Server.
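One way to register the scheduled task mentioned above is with schtasks.exe; the task name, schedule, and script path below are examples:

```powershell
# Register a weekly task (Mondays at 06:00) that runs the report script.
# The task name and script path are placeholders.
schtasks /Create /TN "SPContentDbReport" /SC WEEKLY /D MON /ST 06:00 `
         /TR "powershell.exe -File C:\Scripts\ContentDbReport.ps1"
```

Remember to run the task as an account with the access rights described above.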

Note that a site collection is stored within a single content database, so you shouldn’t wait until the previous content database hits that magic 200 GB mark before creating a new one. Each site collection within that content database may continue to grow. As a result, you will have to plan for new content databases based on how much you expect each site collection to grow over its life cycle.

If you prefer, you can also modify the script above to create new content databases automatically when the previous one hits a limit that you specify.
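As a rough sketch of that idea (the threshold, URL, and naming scheme are all assumptions, and you would still want to review the result manually):

```powershell
# Sketch: create a new content database when the newest one passes a size limit.
$SizeLimitGB = 150   # assumed threshold, kept below the 200 GB recommendation
$WebApp = Get-SPWebApplication "http://intranet.contoso.com"   # example URL

# Pick the most recently named content database in the web application.
$newest = Get-SPContentDatabase -WebApplication $WebApp |
    Sort-Object Name | Select-Object -Last 1

if (($newest.DiskSizeRequired / 1GB) -gt $SizeLimitGB) {
    # The naming scheme is up to you; "_new" is only a placeholder suffix.
    New-SPContentDatabase -Name ($newest.Name + "_new") -WebApplication $WebApp
}
```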
