Windows PowerShell is a fantastic tool; SharePoint 2010 has literally hundreds of different PowerShell cmdlets available out of the box - if you don't believe me, check this out: http://technet.microsoft.com/en-us/library/ff678226.aspx. What about MOSS 2007? Whilst there aren't any native cmdlets for MOSS 2007, PowerShell can be used to access the SharePoint object model directly instead and in most cases achieve the same objectives. This isn't as daunting as it sounds; I'm not a developer, but even I have been able to find my way around the object model and put together some useful scripts (at least useful to me anyway!)
I've included the script below. All you need to do is copy it into Notepad (or your text editor of choice) and edit the two variables to match your requirements: $Output specifies the location to write the results to in CSV format, and $SiteURL specifies the URL of the root site collection, which is then used to discover the other site collections within the Web application. Once you have done this, save the file with a .ps1 extension, for example ScriptName.ps1. The script can be run on any server in the SharePoint farm from within a Windows PowerShell window using .\ScriptName.ps1.
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")

# Configure the location for the output file
$Output = "C:\Output.csv"
"Site URL,Owner Login,Owner Email,Root Site Last Modified,Quota Limit (MB),Total Storage Used (MB),Site Quota Percentage Used" | Out-File -Encoding Default -FilePath $Output

# Specify the root site collection within the Web application
$SiteURL = "http://intranet.contoso.com"
$RootSite = New-Object Microsoft.SharePoint.SPSite($SiteURL)
$WebApp = $RootSite.WebApplication

# Loop through each site collection within the Web application and append its details to the output file
foreach ($Site in $WebApp.Sites)
{
    if ($Site.Quota.StorageMaximumLevel -gt 0) {[int]$MaxStorage = $Site.Quota.StorageMaximumLevel / 1MB} else {$MaxStorage = "0"}
    if ($Site.Usage.Storage -gt 0) {[int]$StorageUsed = $Site.Usage.Storage / 1MB} else {$StorageUsed = "0"}
    if ($StorageUsed -gt 0 -and $MaxStorage -gt 0) {[int]$SiteQuotaUsed = $StorageUsed / $MaxStorage * 100} else {$SiteQuotaUsed = "0"}
    $Web = $Site.RootWeb
    $Site.Url + "," + $Site.Owner.Name + "," + $Site.Owner.Email + "," + $Web.LastItemModifiedDate.ToShortDateString() + "," + $MaxStorage + "," + $StorageUsed + "," + $SiteQuotaUsed | Out-File -Encoding Default -Append -FilePath $Output
    $Site.Dispose()
}
Community Sites are a new type of site introduced in SharePoint 2013. If you are unfamiliar with them, the following article has some background information on their purpose and functionality - http://technet.microsoft.com/en-us/library/jj219805(v=office.15)
I was recently working with a customer that reported some performance issues in their SharePoint 2013 proof of concept environment; specifically, page render time was very poor. I wanted to establish exactly how bad this was and also put in place some basic automated tests to measure response times over a short period of time. I put together a very quick script that measures how long a specific page takes to download and then repeats the test x number of times. As this only measures page download time, it doesn't include any client-side rendering and therefore isn't wholly accurate, but it is a good starting point for assessing performance.
Below is the script itself. It takes two parameters: -URL, which is the URL of the page to download, and -Times, which is the number of times to perform the test. By default the script uses the credentials of the currently logged on user to connect to the URL provided.
param($URL, $Times)

$i = 0
while ($i -lt $Times)
{
    # Download the page using the current user's credentials and time the request
    $Request = New-Object System.Net.WebClient
    $Request.UseDefaultCredentials = $true
    $Start = Get-Date
    $PageRequest = $Request.DownloadString($URL)
    $TimeTaken = ((Get-Date) - $Start).TotalMilliseconds
    $Request.Dispose()
    Write-Host Request $i took $TimeTaken ms -ForegroundColor Green
    $i++
}
To run the script, copy the commands above into a text editor, save as a .ps1 file and run from a suitable machine. An example of the script in action can be found below.
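As a hypothetical illustration (the file name PageTest.ps1 and the URL are placeholders, not from the original post), invoking the script to test a page ten times might look like this:

```powershell
# Run from a PowerShell window on a machine that can reach the site;
# each request's download time is written to the console in milliseconds.
.\PageTest.ps1 -URL http://intranet.contoso.com/SitePages/Home.aspx -Times 10
```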
Brendan Griffin
MOSS 2007 and SharePoint 2010 keep a record of the total storage consumed by each Site collection; this is presented in Central Administration within the Site Collection Quotas and Locks page and also the Storage Space Allocation page within each Site collection in MOSS 2007. This figure is used by SharePoint to determine if a Site collection is exceeding its configured storage quota.
Occasionally this figure can be incorrect and may not accurately reflect the total amount of storage actually being consumed by the Site collection. If you ever encounter this scenario, you can tell SharePoint to recalculate the figure by using the Windows PowerShell commands below, which use the RecalculateStorageUsed method - http://msdn.microsoft.com/en-us/library/microsoft.sharepoint.spsite.recalculatestorageused.aspx. Simply replace the URL with that of the Site collection that you want to recalculate, and then either run the commands manually or save them as a .ps1 file and execute as a script.
MOSS 2007
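A minimal sketch of the MOSS 2007 commands, assuming the script is run on a server in the farm and that the placeholder URL http://intranet.contoso.com/sites/teamsite is replaced with your own Site collection URL:

```powershell
# Load the SharePoint object model (MOSS 2007 has no native cmdlets)
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")

# Replace with the URL of the Site collection to recalculate
$Site = New-Object Microsoft.SharePoint.SPSite("http://intranet.contoso.com/sites/teamsite")

# Tell SharePoint to recalculate the storage used by the Site collection
$Site.RecalculateStorageUsed()
$Site.Dispose()
```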
As you will have seen in some of our other blog entries here, we often get asked to assist our customers in creating PowerShell scripts.
I've recently been working on a script to help ensure that a client is able to set site collection auditing. For compliance reasons they needed to be sure that all auditing activities are logged. The script below enables specific logging options for each site collection:
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
$farm = [Microsoft.SharePoint.Administration.SPFarm]::Local

foreach ($webService in $farm.Services)
{
    # Skip any service that isn't a Web service
    if (!($webService -is [Microsoft.SharePoint.Administration.SPWebService])) { continue }

    foreach ($webApp in $webService.WebApplications | Where-Object {($_.DefaultServerComment -ne 'SharePoint Central Administration v3') -and ($_.DisplayName -eq '<WebApplicationName>')})
    {
        # Skip the Central Administration Web application
        if ($webApp -is [Microsoft.SharePoint.Administration.SPAdministrationWebApplication]) { continue }

        foreach ($site in $webApp.Sites)
        {
            # Combine the audit flags to enable on each site collection
            $site.Audit.AuditFlags = [Microsoft.SharePoint.SPAuditMaskType]::View -bxor
                [Microsoft.SharePoint.SPAuditMaskType]::Update -bxor
                [Microsoft.SharePoint.SPAuditMaskType]::CheckIn -bxor
                [Microsoft.SharePoint.SPAuditMaskType]::CheckOut -bxor
                [Microsoft.SharePoint.SPAuditMaskType]::Copy -bxor
                [Microsoft.SharePoint.SPAuditMaskType]::Move -bxor
                [Microsoft.SharePoint.SPAuditMaskType]::Delete -bxor
                [Microsoft.SharePoint.SPAuditMaskType]::Undelete -bxor
                [Microsoft.SharePoint.SPAuditMaskType]::SecurityChange
            $site.Audit.Update()
            $site.Dispose()
        }
    }
}
To use this, all you need to do is set the web application name and change the audit options to suit your requirements.
The SharePoint Health Analyzer included in SharePoint 2010/2013 has a rule named "Missing server side dependencies". This rule reports details of all artifacts that are referenced within a content database but are not installed within the local farm, for example Web Parts, Assemblies and Features. The one issue with this is that the formatting of the output isn't great (example below!). I was recently working with a customer that had a large number of issues reported, and it was difficult to decipher these. To make our lives a little easier, I created a Windows PowerShell script that runs Test-SPContentDatabase (which performs the same tests) against every registered content database within a farm and outputs the results for each database to a separate CSV file.
The script can be found below; simply change the output path for the CSV files before running.
Add-PSSnapin Microsoft.SharePoint.PowerShell -EA SilentlyContinue

Foreach ($WebApp in (Get-SPWebApplication))
{
    "Testing Web Application - " + $WebApp.Name | Write-Host -ForegroundColor Green
    Foreach ($CDB in $WebApp.ContentDatabases)
    {
        # Test each content database and write the results to a per-database CSV file
        Test-SPContentDatabase -Name $CDB.Name -WebApplication $WebApp.URL -ServerInstance $CDB.Server | ConvertTo-Csv | Out-File -Encoding default -FilePath $("C:\" + $CDB.Name + ".csv")
    }
}
An example of the output can be seen below: