FromTheField

Real world experiences of SharePoint PFE and CTS engineers from Microsoft UK

July, 2013

  • Windows PowerShell Script to Output Site Collection Information

Windows PowerShell is a fantastic tool; SharePoint 2010 has literally hundreds of PowerShell cmdlets available out of the box - if you don't believe me, check this out: http://technet.microsoft.com/en-us/library/ff678226.aspx. What about MOSS 2007? Whilst there aren't any native cmdlets for MOSS 2007, PowerShell can be used to access the SharePoint object model directly instead and in most cases achieve the same objectives. This isn't as daunting as it sounds; I'm not a developer, but even I have been able to find my way around the object model and put together some useful scripts (at least to me anyway!)

I've recently been helping one of my customers write some PowerShell scripts to improve their reporting capabilities and reduce the burden of day-to-day SharePoint administration on the support team. One of the scripts that I've written analyses every site collection within a Web application. The purpose of the script was to identify sites that were no longer required (as they hadn't been updated for a long time), or that had a large quota assigned but were only using a small proportion of it, so that a smaller quota could be assigned. The script outputs the following information for each site collection into a CSV file:
    • URL
    • Owner login
    • Owner e-mail address
    • Last time that the root web was modified
    • The size of the quota assigned
    • The total storage used
    • The percentage of the quota being used

I've included the script below. All you need to do is copy it into Notepad (or your text editor of choice) and edit the two highlighted variables to match your requirements: $Output specifies the location to output the results to in CSV format, and $SiteURL specifies the URL of the root site collection, which is then used to discover the other site collections within the Web application. Once you have done this, save the file with a .PS1 extension, for example ScriptName.PS1. The script can be run on any server in the SharePoint farm from within a Windows PowerShell window using .\ScriptName.PS1.

[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
#Configure the location for the output file
$Output = "C:\Output.csv"
"Site URL,Owner Login,Owner Email,Root Site Last Modified,Quota Limit (MB),Total Storage Used (MB),Site Quota Percentage Used" | Out-File -Encoding Default -FilePath $Output
#Specify the root site collection within the Web app
$SiteURL = "http://intranet.contoso.com"
$RootWeb = New-Object Microsoft.SharePoint.SPSite($SiteURL)
$WebApp = $RootWeb.WebApplication
#Loop through each site collection within the Web app and write its details to the output file
foreach ($Site in $WebApp.Sites)
{
    if ($Site.Quota.StorageMaximumLevel -gt 0) {[int]$MaxStorage = $Site.Quota.StorageMaximumLevel / 1MB} else {$MaxStorage = 0}
    if ($Site.Usage.Storage -gt 0) {[int]$StorageUsed = $Site.Usage.Storage / 1MB} else {$StorageUsed = 0}
    if ($StorageUsed -gt 0 -and $MaxStorage -gt 0) {[int]$SiteQuotaUsed = $StorageUsed / $MaxStorage * 100} else {$SiteQuotaUsed = 0}
    $Web = $Site.RootWeb
    $Site.Url + "," + $Site.Owner.Name + "," + $Site.Owner.Email + "," + $Web.LastItemModifiedDate.ToShortDateString() + "," + $MaxStorage + "," + $StorageUsed + "," + $SiteQuotaUsed | Out-File -Encoding Default -Append -FilePath $Output
    $Site.Dispose()
}
$RootWeb.Dispose()

    Below is an example of the script output.
     Brendan Griffin
  • Using PowerShell to Measure Page Download Time

I was recently working with a customer that reported some performance issues in their SharePoint 2013 proof of concept environment; specifically, page render time was very poor. I wanted to establish exactly how bad this was and also put in place some basic automated tests to measure response times over a short period of time. I put together a very quick script that measures how long a specific page takes to download and then repeats the test a specified number of times. As this only measures page download time and doesn't include any client-side rendering, it isn't wholly accurate, but it is a good starting point for assessing performance.

Below is the script itself. It takes two parameters: -URL, the URL of the page to download, and -Times, the number of times to perform the test. By default the script uses the credentials of the currently logged on user to connect to the URL provided.

param($URL, $Times)
$i = 0
while ($i -lt $Times)
{
    $Request = New-Object System.Net.WebClient
    #Use the credentials of the currently logged on user
    $Request.UseDefaultCredentials = $true
    $Start = Get-Date
    $PageRequest = $Request.DownloadString($URL)
    $TimeTaken = ((Get-Date) - $Start).TotalMilliseconds
    $Request.Dispose()
    Write-Host "Request $i took $TimeTaken ms" -ForegroundColor Green
    $i++
}

    To run the script, copy the commands above into a text editor, save as a .ps1 file and run from a suitable machine. An example of the script in action can be found below.

    Brendan Griffin

  • Community Sites - Where are my analytics reports!

Community Sites are a new type of site introduced in SharePoint 2013; if you are unfamiliar with them, the following article has some background information on their purpose and functionality - http://technet.microsoft.com/en-us/library/jj219805(v=office.15)

I was playing around with a Community Site the other day and noticed something very strange: for some reason, Site Settings includes links to the SharePoint 2010 Web Analytics reports. As you may know, Web Analytics has been removed from SharePoint 2013 and its functionality has been replaced by the Analytics Processing component provided by Search - http://technet.microsoft.com/en-gb/library/ff607742.aspx#section1.
     
    Below is a screenshot of how this appeared:
     
     
As Web Analytics isn’t available in SharePoint 2013, neither of these reports works. In addition, the links for the SharePoint 2013 equivalent analytics reports are missing – Popularity and Search Reports & Popularity Trends. What should appear is the following (taken from a standard “Team Site”).
     
     
I did some digging around and found that the reason for this is that the Site Collection “Reporting” feature isn’t activated for Community Sites by default. Activating this feature removes the links to the legacy SharePoint 2010 Web Analytics reports and adds links to the new SharePoint 2013 Analytics reports. Not sure exactly why this feature isn’t automatically activated!
     
If you do need to access the reports for Community Sites, it is simple enough to activate the feature manually, or to put together a PowerShell script to automate the activation of this feature across all Community Sites. To prevent this from occurring in the future you could staple the “Reporting” feature to the Community Site template, which will ensure that all new Community Sites have the “Reporting” feature activated automatically.
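 
Activating the feature across every existing Community Site can be scripted. Below is a minimal sketch of one way to do this; it assumes the feature folder name is "Reporting" and that the Community Site template ID is "COMMUNITY#0" - verify both against your own farm before running.

    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
    #Loop through every site collection in the farm and activate the "Reporting"
    #feature on those based on the Community Site template.
    #Assumption: the feature is identified by the folder name "Reporting" and
    #Community Sites use the "COMMUNITY#0" template - check these in your farm first.
    foreach ($Site in (Get-SPWebApplication | Get-SPSite -Limit All))
    {
        $Web = $Site.RootWeb
        $Template = $Web.WebTemplate + "#" + $Web.Configuration
        if ($Template -eq "COMMUNITY#0")
        {
            if (-not (Get-SPFeature -Site $Site | Where-Object {$_.DisplayName -eq "Reporting"}))
            {
                Enable-SPFeature -Identity "Reporting" -Url $Site.Url
                Write-Host "Activated Reporting feature on" $Site.Url
            }
        }
        $Site.Dispose()
    }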
     
    Brendan Griffin
  • Current Storage Used Figure is Incorrect for a Site Collection

    MOSS 2007 and SharePoint 2010 keep a record of the total storage consumed by each Site collection; this is presented in Central Administration within the Site Collection Quotas and Locks page and also the Storage Space Allocation page within each Site collection in MOSS 2007. This figure is used by SharePoint to determine if a Site collection is exceeding its configured storage quota.

Occasionally this figure can be incorrect and may not accurately reflect the total amount of storage that is actually being consumed by the Site collection. If you ever encounter this scenario, you can tell SharePoint to recalculate the figure by using the Windows PowerShell commands below, which use the RecalculateStorageUsed method - http://msdn.microsoft.com/en-us/library/microsoft.sharepoint.spsite.recalculatestorageused.aspx. Simply replace the highlighted URL ($URL) with that of the Site collection that you want to recalculate, and then either run the commands manually or save them as a .ps1 file and execute it as a script.

    MOSS 2007

[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
$URL = "http://moss2007"
$Site = New-Object Microsoft.SharePoint.SPSite($URL)
$Site.RecalculateStorageUsed()
$Site.Dispose()
     
    SharePoint 2010
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
$URL = "http://sp2010"
$Site = Get-SPSite -Identity $URL
$Site.RecalculateStorageUsed()
$Site.Dispose()
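 
If the figure is wrong for more than one site collection, the same method can be looped over every site collection in a Web application. A minimal SharePoint 2010 sketch is below; the URL is an example, so change it to match your environment.

    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
    #Recalculate the storage used figure for every site collection in a
    #Web application - the URL below is an example, change it to suit
    foreach ($Site in (Get-SPWebApplication "http://sp2010" | Get-SPSite -Limit All))
    {
        $Site.RecalculateStorageUsed()
        $Site.Dispose()
    }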

     

    Brendan Griffin

     

  • Automating Test-SPContentDatabase

The SharePoint Health Analyzer included in SharePoint 2010/2013 has a rule named "Missing server side dependencies". This rule reports details of all artifacts that are referenced within a content database but are not installed within the local farm, for example Web Parts, assemblies and Features. The one issue with this is that the formatting of the output isn't great (example below!). I was recently working with a customer that had a large number of issues reported, and it was difficult to decipher them. To make our lives a little easier, I created a Windows PowerShell script that runs Test-SPContentDatabase (which performs the same tests) against every registered content database within the farm and outputs the results for each database to a separate CSV file.

     

The script can be found below; simply change the output path for the CSV files (C:\ by default) before running.

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
#Run Test-SPContentDatabase against every content database in the farm
#and write the results for each database to a separate CSV file
foreach ($WebApp in (Get-SPWebApplication))
{
    Write-Host ("Testing Web Application - " + $WebApp.Name) -ForegroundColor Green
    foreach ($CDB in $WebApp.ContentDatabases)
    {
        Test-SPContentDatabase -Name $CDB.Name -WebApplication $WebApp.Url -ServerInstance $CDB.Server | ConvertTo-Csv | Out-File -Encoding Default -FilePath ("C:\" + $CDB.Name + ".csv")
    }
}

    An example of the output can be seen below:

    Brendan Griffin

  • Audit settings in Microsoft Office SharePoint Server 2007

    As you will have seen in some of our other blog entries here, we often get asked to assist our customers in creating PowerShell scripts.

    I've recently been working on a script to help ensure that a client is able to set site collection auditing. For compliance reasons they needed to be sure that all auditing activities are logged.
    The script below enables specific logging options for each site collection:

[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")

$farm = [Microsoft.SharePoint.Administration.SPFarm]::Local

foreach ($webService in $farm.Services)
{
    #Skip any farm service that isn't the Web service
    if (!($webService -is [Microsoft.SharePoint.Administration.SPWebService]))
    {
        continue
    }

    foreach ($webApp in $webService.WebApplications | Where-Object {($_.DefaultServerComment -ne 'SharePoint Central Administration v3') -and ($_.DisplayName -eq '<WebApplicationName>')})
    {
        #Skip the Central Administration Web application
        if ($webApp -is [Microsoft.SharePoint.Administration.SPAdministrationWebApplication])
        {
            continue
        }

        foreach ($site in $webApp.Sites)
        {
            #Combine the required audit flags with -bor (bitwise or)
            $site.Audit.AuditFlags = [Microsoft.SharePoint.SPAuditMaskType]::View -bor [Microsoft.SharePoint.SPAuditMaskType]::Update -bor [Microsoft.SharePoint.SPAuditMaskType]::CheckIn -bor [Microsoft.SharePoint.SPAuditMaskType]::CheckOut -bor [Microsoft.SharePoint.SPAuditMaskType]::Copy -bor [Microsoft.SharePoint.SPAuditMaskType]::Move -bor [Microsoft.SharePoint.SPAuditMaskType]::Delete -bor [Microsoft.SharePoint.SPAuditMaskType]::Undelete -bor [Microsoft.SharePoint.SPAuditMaskType]::SecurityChange
            $site.Audit.Update()
            $site.Dispose()
        }
    }
}

To use this, all you need to do is replace <WebApplicationName> with the name of your Web application and change the audit flags to suit your requirements.
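 
Once the script has run, it is worth confirming that the flags were actually applied. Below is a small sketch that reads the audit flags back from a single site collection using the same object model calls as above; the URL is an example, so change it to match your environment.

    [void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
    #Check which audit flags are set on a single site collection -
    #the URL below is an example, change it to match your environment
    $site = New-Object Microsoft.SharePoint.SPSite("http://moss2007")
    Write-Host "Audit flags for" $site.Url "-" $site.Audit.AuditFlags
    $site.Dispose()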