FromTheField

Real world experiences of SharePoint PFE and CTS engineers from Microsoft UK


  • SharePoint 2010 Organization Browser - Expose more than just User, Title and About Me (Sort Of).

    One of my favourite customers was very keen to display the OOB Organisation Browser web part on several of their pages, but they wanted to expose more than the default values against each user. I know this can be done by clicking through to the user's profile page, but the extra click was not acceptable.

    As you will know, this is a Silverlight control and therefore not easily customisable, so the solution I came up with was essentially to grab the properties they wanted to expose and copy them all into the 'About Me' portion of the user profile.

    They have somewhere in the region of 1,000 users on this particular system, so obviously I enlisted the help of our friend PowerShell.

    The code looks something like the below. Please be aware of the warning, and note that this will obviously not keep the data fresh... It might also be worth removing the users' ability to edit the About Me field, so they are not too disappointed when you run it again and clear all their entries!

    WARNING! THIS CODE WILL REMOVE ALL INFORMATION IN THE ABOUT ME FIELD FOR ALL USERS BEFORE IT ADDS THE NEW PROPERTIES!

    #Add the SharePoint PowerShell snap-in
    Add-PSSnapin "Microsoft.SharePoint.PowerShell"

    #Get the Web application           
    $site=new-object Microsoft.SharePoint.SPSite("http://sharepoint2010/");    

    #Get the Service Context of the Web Application      
    $serviceContext = Get-SPServiceContext $site;           
    $site.Dispose();           

    #Get the User Profile Manager
    $upm = new-object Microsoft.Office.Server.UserProfiles.UserProfileManager($serviceContext);

    #Get all of the user profiles for the selected User Profile manager
    $profiles = $upm.GetEnumerator()

    #Execute a For Each loop against every user
    #Loop through every user profile
    foreach ($userProfile in $profiles)
    {
        #Get the fields you want to expose in the About Me section of the User Profile and on the Org Browser
        $userProperty1 = $userProfile["WorkEmail"].Value
        $userProperty2 = $userProfile["Office"].Value

        #Overwrite the About Me section with the concatenated values - this removes anything
        #the user had entered there. "&#10;" is the Silverlight encoding for a new line.
        $userProfile["AboutMe"].Value = "Email: " + $userProperty1 + "&#10;" + "Office Location: " + $userProperty2

        #Save the profile changes back to the User Profile store
        $userProfile.Commit()
    }


    Essentially it just grabs existing profile properties and puts them into the 'About Me' field. Don't forget to use "&#10;", which is the Silverlight encoding for a new line.

    Thanks to Dave Little for his expert help with that :)

    I just ran the script on my test VM with 1,000 users and it completed in under 30 seconds.

    The results are a little something like this:

    WARNING! THIS CODE WILL REMOVE ALL INFORMATION IN THE ABOUT ME FIELD FOR ALL USERS BEFORE IT ADDS THE NEW PROPERTIES!


    Hopefully someone will find this useful - I scoured the internet and could not find anything that did what I wanted :)


    Enjoy!


    Andy

  • SharePoint Maintenance

    From time to time you may need to temporarily make SharePoint unavailable, for example when applying a CU or Service Pack. One of my customers places an App_Offline.htm file at the root of the IIS virtual directory for each Web Application, on each WFE server within their farm. This presents users with a friendly message when they try to access SharePoint, explaining that SharePoint is currently unavailable due to scheduled maintenance.

    This works really well, and they wanted to automate the process of copying the App_Offline.htm file prior to performing maintenance, so I wrote the following two scripts. The first script copies the file (to enter maintenance); the second deletes it (to return SharePoint to service).

    The script has logic to handle MOSS 2007, SharePoint 2010 and 2013. It will iterate through all Web Applications and zones and copy the App_Offline holding page to the web root of the IIS site for each Web Application, on each server that hosts a Web Application. Simply execute it with PowerShell on a server within the farm; the script expects an App_Offline.htm file to exist in the directory that the script is executed from. The script needs to be run using an account that has local admin permissions on each server, and it presumes that SMB/CIFS access is available.
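
    The App_Offline.htm file itself is just a static HTML page - while it exists in the web root, ASP.NET returns its contents for every request to that site. A minimal example (the wording is of course up to you) might look like this:

    ```html
    <!DOCTYPE html>
    <html>
      <head><title>Scheduled Maintenance</title></head>
      <body>
        <h1>SharePoint is currently unavailable</h1>
        <p>We are performing scheduled maintenance. Please try again later.</p>
      </body>
    </html>
    ```

    One caveat worth knowing: some browsers replace very small error responses with their own "friendly" error page, so it is worth keeping the file comfortably above a few hundred bytes (padding with an HTML comment works).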

    Enter Maintenance

    #Load SharePoint assembly
    $Assemblies = [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")

    #Check if the script is running on MOSS 2007
    If (Test-Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\12"){$Mode="2007"}

    #Grab all Web Apps
    $WebApps = [Microsoft.SharePoint.Administration.SPWebService]::ContentService.WebApplications

    #Retrieve Servers
    $Farm = [Microsoft.SharePoint.Administration.SPFarm]::Local
    $SPServers = $Farm.Servers | Where {$_.Role -eq "Application"} | Foreach {$_.Name}

    Foreach ($WebApp in $WebApps)
    {
    Foreach ($URL in $WebApp.AlternateUrls)
        {
        If ($Mode -eq "2007")
        {
        $WebRoot = ($WebApp.GetIisSettingsWithFallback($URL.URLZone)).Path.FullName -replace ":","$"
        }
        Else
        {
        $WebRoot = ($WebApp.GetIisSettingsWithFallback($URL.Zone)).Path.FullName -replace ":","$"
        }
        Write-Host "Setting the Holding Page for" $WebApp.Name "- Zone:" $Url.URLZone -ForegroundColor Green
        Foreach ($Server in $SPServers)
            {
            Write-Host "-Updating" $Server -ForegroundColor Green
            Copy-Item "app_offline.htm" "\\$Server\$Webroot\"
            }
        }
    }

    Return SharePoint into Service

    #Load SharePoint assembly
    $Assemblies = [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")

    #Check if the script is running on MOSS 2007
    If (Test-Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\12"){$Mode="2007"}

    #Grab all Web Apps
    $WebApps = [Microsoft.SharePoint.Administration.SPWebService]::ContentService.WebApplications

    #Retrieve Servers
    $Farm = [Microsoft.SharePoint.Administration.SPFarm]::Local
    $SPServers = $Farm.Servers | Where {$_.Role -eq "Application"} | Foreach {$_.Name}

    Foreach ($WebApp in $WebApps)
    {
    Foreach ($URL in $WebApp.AlternateUrls)
        {
        If ($Mode -eq "2007")
        {
        $WebRoot = ($WebApp.GetIisSettingsWithFallback($URL.URLZone)).Path.FullName -replace ":","$"
        }
        Else
        {
        $WebRoot = ($WebApp.GetIisSettingsWithFallback($URL.Zone)).Path.FullName -replace ":","$"
        }
        Write-Host "Removing the Holding Page for" $WebApp.Name "- Zone:" $Url.URLZone -ForegroundColor Green
        Foreach ($Server in $SPServers)
            {
            Write-Host "-Updating" $Server -ForegroundColor Green
            Remove-Item "\\$Server\$Webroot\app_offline.htm"
            }
        }
    }

    Brendan Griffin - @brendankarl

  • Export Members of a SharePoint Audience

    Here is a quick PowerShell script that I put together for a customer to enable them to export the usernames and e-mail addresses of all users that are members of a specific SharePoint Audience to a CSV file.

    Please update the three values: $Output is the location to write the CSV file to; replace http://intranet.contoso.com with the URL of a Site Collection that resides within a Web Application associated with the User Profile Service Application that contains the Audience; and replace "Test" with the name of the Audience that you wish to export.

    This script was tested on SharePoint 2010 but should also work on SharePoint 2013.

    asnp *SharePoint* -ea 0
    $Output="D:\Output.csv"
    "Username"+","+"Email" | Out-File -Encoding Default -FilePath $Output;
    $Site = Get-SPSite "http://intranet.contoso.com"
    $Context=[Microsoft.Office.Server.ServerContext]::GetContext($Site)
    $AudManager=New-Object Microsoft.Office.Server.Audience.AudienceManager($Context)
    $Audience=$AudManager.Audiences | Where {$_.AudienceName -eq "Test"}
    Foreach ($Member in $Audience.GetMembership())
    {
    Write-Host $Member.NTName
    Write-Host $Member.Email
    $Member.NTName + "," + $Member.Email | Out-File -Encoding Default -Append -FilePath $Output
    }

    Brendan Griffin - @brendankarl

  • Backup/Restore-SPSite and the Missing Term Set!

    One of my customers recently wanted to move a Site Collection into a new Content Database, however instead of using Move-SPSite to do this they decided to use Backup/Restore-SPSite and here is where the fun began! 

    One of the key differences between Move-SPSite and Restore-SPSite is that Restore-SPSite assigns a new GUID to the Site Collection as part of the restore process. One result of this is that the Site Collection Term Set loses its mapping and becomes inaccessible - the reason is that the ACL for the Term Set uses the site's GUID, so if that GUID changes, the Site Collection no longer has access to its Site Collection Term Store.

    The result for my customer was that the Site Collection Term Set vanished and was inaccessible from the Site Collection that had been restored. Fortunately this is fairly easy to resolve using the steps below:

    • Determine the GUID (UniqueId) of the Site Collection Term Set - This can be obtained by querying the MMS Service Application database. In the example below the Site Collection was named "qwerty".

    • Execute the following script - replacing TermSetGUID with the UniqueId obtained in the step above, and SiteURL with the URL of the Site Collection that you need to grant access to the Term Set. Please run it with an account that has "Full Control" permissions on the MMS Service Application and is a "Term Store Administrator".

    asnp *sharepoint* -ea 0
    $TermSetGUID = "TermSetGUID"
    $Site = Get-SPSite "SiteURL"
    $TermSet = Get-SPTaxonomySession -Site $Site
    $Group = $TermSet.DefaultSiteCollectionTermStore.Groups | Where {$_.Id -eq $TermSetGUID}
    $Group.AddSiteCollectionAccess($Site.Id)
    $Group.TermStore.CommitAll()
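
    If you would rather not query the MMS database directly, the GUID from the first step can also be found from PowerShell - the taxonomy API will list each group in the term store along with its Id. A rough sketch (SiteURL is a placeholder, as above):

    ```
    asnp *sharepoint* -ea 0
    $Site = Get-SPSite "SiteURL"
    $Session = Get-SPTaxonomySession -Site $Site

    #List every group in the default term store with its Id -
    #the Id of the orphaned Site Collection group is the GUID to use in the script above
    $Session.DefaultSiteCollectionTermStore.Groups | Select Name, Id
    ```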


    Brendan Griffin - @brendankarl

  • Hybrid Search with SharePoint 2013 and Office 365

    I recently presented a session at the Yorkshire (UK) SharePoint User Group on Hybrid Search with SharePoint 2013 and Office 365. As promised to the attendees, attached is the presentation (including demo videos).

    Brendan Griffin - @brendankarl

  • SP2010 workflow performance

    Today I want to talk about something quite odd that I ran across at one of my customers...

    They had some issues with one of their workflows not "performing" well... It took ages for the workflow to complete.

    One of the most important troubleshooting steps with workflows is usually quite simple...

    It's just "Wait..."

    But since that had already been done for a few days, we had to take a closer look to see what was going on. SharePoint workflows can run in either the W3WP process, or they can be handed over to the OWS Timer service. Usually this transition happens when you add a workflow action that calls for a "wait", or when some events need to be picked up as a reaction to edits. So we checked the timer service and reviewed the running timer jobs. The odd thing was that the workflow timer job had already been running on every server in the farm for several hours. That is very odd, since usually that job only runs for a few minutes per run, so we suspected a hung workflow. The workflow engine provides a lot of ULS entries if it fails somewhere, but none of those showed up. A good collection of the events that are output on failure can be found here: http://blogs.technet.com/b/sharepointdevelopersupport/archive/2013/03/12/sharepoint-workflow-architecture-part-3.aspx (this also provides some very good information on how to troubleshoot workflows in general).

    So we increased the ULS logging, and that helped us to identify what was happening in the job. The next step was to take the data, port it to a single server, and analyze it (Merge-SPLogFile). But since it was a busy production server, the amount gathered was very high. We found that the OWS timer jobs were constantly updating list items in the same site collection, yet there were no obvious workflow errors in the ULS logs. So now we at least had a place where we could take a closer look.

    As it turned out, the site collection in question was already known to the customer. The site collection was Nintex Workflow enabled (only users with special in-house training are allowed to use the Nintex workflow components at that customer).

    Now let's go a bit off topic here... The power user on that site collection had a small issue, which he actually managed to solve...

    He had a list and wanted to use [Today] in a calculated field. SharePoint does not support the use of dynamic data (like [Me] or [Today]) in calculated fields, but we can use other fields in a calculated field. So the user outsmarted SharePoint by adding a field called "Today" to his lists. In order for this to work, the field had to be updated each day, and he found a workflow action that helped him out:
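
    As an illustration of the trick (the column names here are hypothetical): once a real date column named Today exists and is refreshed daily by the workflow, a calculated column can reference it where the dynamic [Today] token would be rejected, e.g.:

    ```
    =IF([Today]>[DueDate],"Overdue","On track")
    ```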

    Armed with this knowledge, he went on and created a site workflow, which should be executed daily... Here is a picture of the workflow the user designed:

    Basically it was a very simple workflow. The first action was to activate the correct permissions to edit the lists, and then there were 12 instances of the "Update multiple items" action, for 12 different lists on the farm.

    And since this workflow had to do its work each day, they scheduled it to run daily. The lists themselves are not really "huge" - the total number of items across all lists is only about 10,000 - so there were no throttling exceptions or anything else...

    So back to our workflow issue... why did this cause a problem?

    Well... there are two timeouts when it comes to workflows. One of these is the event delivery timeout, which determines how long a workflow can run. It is documented here: http://technet.microsoft.com/en-us/library/cc262968(v=office.12).aspx

    Note:

    If you create a workflow solution that has a very long processing time to start your workflows, complete tasks, or modify workflows, you should consider increasing this value. View the ULS logs and watch the Microsoft SQL Server table ScheduledWorkItems to determine if the workflow jobs are timing out. The default folder location for the ULS log is Program Files\Common Files\Microsoft Shared\Web server extensions\12\Logs. In the ULS log file, you can use "workflow" or "workflow infrastructure" as search keywords.

    There is also a second timeout, which determines when a workflow is considered to have died. This timeout kicks in after 20 minutes and resets the state of the workflow so that it will be processed again.

    At some point the server tipped over, and the processing for all those lists exceeded the second timeout, which caused the workflows to be reset. Each day a new workflow was added to the queue. None of these workflows managed to complete, and we piled up a total of about 100 never-ending workflows over time. The issue was never noticed, since the workflows managed to update "most" items, so everything looked fine. There was only one strange result: the workflow timer job kept increasing in duration and took longer and longer to complete (100 x 20 minutes = 2,000 minutes; 5 servers in the farm means an average of 6-7 hours per workflow timer job run per server).

    This started to raise other alerts, since the normal workflows got "stuck" behind this one (processing is done one content database at a time), things like approvals took too long to be processed, and users started to complain.

    How could this have been fixed?

    Assuming the user insists on keeping his solution, the easiest way to fix it would be to use the filter in the workflow action to only retrieve items whose date is NOT today's. If the workflow fails the first time, it will just restart and fix the items left over from the first run, instead of performing millions upon millions of update actions that set today's date on records that already have it.

  • SharePoint 2013 and Office 365 Hybrid

    I recently presented a session at SharePoint Saturday UK on Configuring SharePoint 2013 and Office 365 Hybrid. As promised to the attendees, attached is the presentation (including demo videos). This is an updated version of my previous session at the Yorkshire SharePoint User Group, which goes into greater depth on the identity requirements - http://blogs.technet.com/b/fromthefield/archive/2014/11/04/hybrid-search-with-sharepoint-2013-and-office-365.aspx.

    Brendan Griffin - @brendankarl

  • SharePoint Update Deployment - Automating Parallel Content Database Upgrades

    I recently helped a customer to deploy a Cumulative Update to their SharePoint environment. Due to the amount of content hosted within the farm, the customer uses the approach of detaching all content databases, upgrading the farm, and then re-attaching and upgrading all content databases afterwards. This potentially reduces downtime, as the content databases can be upgraded in parallel (using separate PowerShell sessions).

    I've put together a script that automates the upgrade of content databases once they have been re-attached to the farm (where they will be running in compatibility mode until upgraded).

    The following script splits the content databases into two batches and then executes a PowerShell job on the server for each batch to upgrade the databases in parallel (2 at a time instead of 1). It has been tested on SharePoint 2010 but should also work on SharePoint 2013.

    #Specify two script blocks for the two batches of upgrades to perform
    $Batch1 = {
    asnp *SharePoint* -ea 0
    $CDB = Get-SPContentDatabase
    $Count = $CDB.Count
    $Batch1 = [Decimal]::Round(($Count/2))
    $CDB[0..$Batch1] | Upgrade-SPContentDatabase -Confirm:$false
    }

    $Batch2 = {
    asnp *SharePoint* -ea 0
    $CDB = Get-SPContentDatabase
    $Count = $CDB.Count
    $Batch1 = [Decimal]::Round(($Count/2))
    $CDB[($Batch1 + 1)..($CDB.Count -1)] | Upgrade-SPContentDatabase -Confirm:$false
    }

    #Start the two upgrade jobs in parallel
    Start-Job -ScriptBlock $Batch1
    Start-Job -ScriptBlock $Batch2

    #Report the status - re-run as needed
    Get-Job
    #Reports the job output once the job has completed
    Get-Job | Receive-Job

    Brendan Griffin - @brendankarl

  • Flush the SharePoint Configuration Cache

    From time to time you may need to flush the SharePoint configuration cache on servers within your farm; my colleague Joe Rodgers blogged about this many moons ago - http://blogs.msdn.com/b/josrod/archive/2007/12/12/clear-the-sharepoint-configuration-cache-for-timer-job-and-psconfig-errors.aspx. If you run into a scenario where you need to flush the configuration cache on every server within a farm, this can become a very boring and laborious job.

    I wrote the following script, which works with MOSS 2007, SharePoint 2010 and SharePoint 2013. Simply execute it from one server within the farm and it will perform a configuration cache flush on all servers within the farm.

    $ErrorActionPreference = "Stop"
    #Load SharePoint assembly
    $Assemblies = [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")

    #Retrieve SharePoint servers
    $Farm = [Microsoft.SharePoint.Administration.SPFarm]::Local
    $SPServers = $Farm.Servers | Where {$_.Role -eq "Application"} | Foreach {$_.Name}

    #Detect if running on MOSS 2007 and set the appropriate name for the service
    If (Test-Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\12")
        {
        $Timer = "SPTimerV3"
        }
    Else
        {
        $Timer = "SPTimerV4"
        }

    If ([Environment]::OSVersion.Version.Major -lt 6)
    {
    $Folder = "Documents and Settings\All Users\Application Data\Microsoft\SharePoint\Config"
    }
    Else
    {
    $Folder = "programdata\Microsoft\SharePoint\Config"
    }

    #Loop through each server
    Foreach ($Server in $SPServers)
    {
    Try {
        #Stop the Timer Service
        Write-Host "-Stopping the Timer Service on $Server" -ForegroundColor Green
        $A = (Get-WmiObject Win32_Service -filter "name='$Timer'" -ComputerName $Server).StopService()
        While ((Get-WmiObject Win32_Service -filter "name='$Timer'" -ComputerName $Server).State -ne "Stopped")
                {
                Start-Sleep 5
                }

        #Clear the Config Cache
        Write-Host "-Clearing the Config Cache on $Server" -ForegroundColor Green
        $ConfigCache = Get-ChildItem \\$Server\c$\$Folder | Sort LastWriteTime | Select -last 1
        $ConfigCachePath = $ConfigCache.FullName
        Remove-Item -Path $ConfigCachePath -Include *.XML -Recurse
        "1" > "$ConfigCachePath\Cache.ini"

        #Restart the Timer Service
        Write-Host "-Starting the Timer Service on $Server" -ForegroundColor Green
        $C = (Get-WmiObject Win32_Service -filter "name='$Timer'" -ComputerName $Server).StartService()
        }
    Catch {
          Write-Host "Unable to flush the cache on $Server, please do this manually" -ForegroundColor Red
          }
    }
    Write-Host "-Configuration Cache Flush Complete" -ForegroundColor Yellow


    Brendan Griffin - @brendankarl

  • Office 365 - Creating a Subsite (Web) using CSOM in SharePoint Online

    SharePoint Online has a number of PowerShell Cmdlets - https://technet.microsoft.com/en-us/library/fp161374(v=office.15).aspx. These include New-SPOSite, which provides the ability to create a Site Collection; unfortunately it's not possible to create a SubSite (web) using these Cmdlets. It is, however, possible to use CSOM to do this, and the script below demonstrates how to create a SubSite and a site beneath that SubSite (a SubSubSite?).

    The script below is broken into three sections. The first section is used to connect to the SharePoint Online site that you wish to create the SubSite within - simply update the $Site variable with the relevant URL.

    The second section creates a SubSite within the Site Collection. Update the variables to match your requirements - I've used the Team Site template (STS#0) in this example.

    The third section creates a site beneath the SubSite that was just created; again, update the variables to meet your requirements. The key thing to note when creating a SubSite beneath an existing SubSite is that you must include the relative path to the new SubSite within the $WCI.Url variable. In this case I first created a SubSite with the URL https://site.sharepoint.com/SubSite; to create a SubSite beneath it, I must specify the relative path of the SubSubSite, therefore SubSite/SubSubSite.

    #Add references to SharePoint client assemblies and authenticate to Office 365 site
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll"
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Publishing.dll"
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"
    $Username = Read-Host -Prompt "Please enter your username"
    $Password = Read-Host -Prompt "Please enter your password" -AsSecureString
    $Site = "https://site.sharepoint.com"
    $Context = New-Object Microsoft.SharePoint.Client.ClientContext($Site)
    $Creds = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($Username,$Password)
    $Context.Credentials = $Creds


    #Create SubSite
    $WCI = New-Object Microsoft.SharePoint.Client.WebCreationInformation
    $WCI.WebTemplate = "STS#0"
    $WCI.Description = "SubSite"
    $WCI.Title = "SubSite"
    $WCI.Url = "SubSite"
    $WCI.Language = "1033"
    $SubWeb = $Context.Web.Webs.Add($WCI)
    $Context.ExecuteQuery()

    #Create SubSubSite
    $WCI = New-Object Microsoft.SharePoint.Client.WebCreationInformation
    $WCI.WebTemplate = "STS#0"
    $WCI.Description = "SubSubSite"
    $WCI.Title = "SubSubSite"
    $WCI.Url = "SubSite/SubSubSite"
    $WCI.Language = "1033"
    $SubWeb = $Context.Web.Webs.Add($WCI)
    $Context.ExecuteQuery()

    Brendan Griffin - @brendankarl

    Steve Jeffery - @moss_sjeffery

  • Remove SharePoint 2007 Top Navigation using PowerShell.

    Following on from Steve's great post yesterday about creating site navigation from CSV, it got me thinking: what if I want to use this to create the global navigation on a farm that already has a hotch-potch top nav structure?

    If this is the situation you are in, then the following PowerShell is your friend. Essentially it runs through all the site collections in your farm and removes all of the top navigation apart from the initial 'Home' tab.

    You could also use this script to clean up if you have used Steve's script in a test environment and you are trying to figure out which nav structure suits you best... or if you got the .csv input wrong :)

    Thanks goes to Steve for all his help :) 

    WARNING - THIS WILL DELETE ALL OF THE TOP NAVIGATION BAR FOR ALL OF YOUR SITE COLLECTIONS IN ALL OF YOUR WEB APPLICATIONS WITH A SINGLE CLICK.

    Enjoy..

    Andy

     

    [void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")

    $farm = [Microsoft.SharePoint.Administration.SPFarm]::Local

    foreach ($webService in $farm.Services)
    {
        #Skip anything that is not the content web service
        if (!($webService -is [Microsoft.SharePoint.Administration.SPWebService]))
        {
            continue
        }

        foreach ($webApp in $webService.WebApplications | Where-Object {$_.DefaultServerComment -ne 'SharePoint Central Administration v3'})
        {
            if ($webApp -is [Microsoft.SharePoint.Administration.SPAdministrationWebApplication])
            {
                continue
            }

            foreach ($site in $webApp.Sites)
            {
                $web = $site.OpenWeb()
                $tn = $web.Navigation.TopNavigationBar
                [int]$count = $tn.Count

                #Walk the navigation bar backwards, deleting everything except the 'Home' tab
                while ($count -ne 0)
                {
                    if ($tn[$count-1].Title -ne "Home")
                    {
                        $tn[$count-1].Delete()
                    }
                    $count--
                }

                $web.Dispose()
                $site.Dispose()
            }
        }
    }

     

     

  • Current Storage Used Figure is Incorrect for a Site Collection

    MOSS 2007 and SharePoint 2010 keep a record of the total storage consumed by each Site collection; this is presented in Central Administration within the Site Collection Quotas and Locks page and also the Storage Space Allocation page within each Site collection in MOSS 2007. This figure is used by SharePoint to determine if a Site collection is exceeding its configured storage quota.

    Occasionally this figure can be incorrect and may not accurately reflect the total amount of storage actually being consumed by the Site collection. If you ever encounter this scenario, you can tell SharePoint to recalculate the figure using the Windows PowerShell commands below, which use the RecalculateStorageUsed method - http://msdn.microsoft.com/en-us/library/microsoft.sharepoint.spsite.recalculatestorageused.aspx. Simply replace the URL with that of the Site collection that you want to recalculate, then either run the commands manually or save them as a .ps1 file and execute as a script.

    MOSS 2007

    [void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
    $URL = "http://moss2007"
    $Site = New-Object Microsoft.SharePoint.SPSite($URL)
    $Site.RecalculateStorageUsed()
    $Site.Dispose()

    SharePoint 2010

    Add-PSSnapin Microsoft.SharePoint.PowerShell -EA SilentlyContinue
    $URL = "http://sp2010"
    $Site = Get-SPSite -Identity $URL
    $Site.RecalculateStorageUsed()

     

    Brendan Griffin

     

  • Audit settings in Microsoft Office SharePoint Server 2007

    As you will have seen in some of our other blog entries here, we often get asked to assist our customers in creating PowerShell scripts.

    I've recently been working on a script to help ensure that a client is able to set site collection auditing. For compliance reasons they needed to be sure that all auditing activities are logged.
    The script below enables specific logging options for each site collection:

    [void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")

    $farm = [Microsoft.SharePoint.Administration.SPFarm]::Local

    foreach ($webService in $farm.Services)
    {
        #Skip anything that is not the content web service
        if (!($webService -is [Microsoft.SharePoint.Administration.SPWebService]))
        {
            continue
        }

        foreach ($webApp in $webService.WebApplications | Where-Object {($_.DefaultServerComment -ne 'SharePoint Central Administration v3') -and ($_.DisplayName -eq '<WebApplicationName>')})
        {
            if ($webApp -is [Microsoft.SharePoint.Administration.SPAdministrationWebApplication])
            {
                continue
            }

            foreach ($site in $webApp.Sites)
            {
                #Combine the audit flags to log (-bor rather than -bxor, as these are bit flags)
                $site.Audit.AuditFlags = [Microsoft.SharePoint.SPAuditMaskType]::View -bor `
                    [Microsoft.SharePoint.SPAuditMaskType]::Update -bor `
                    [Microsoft.SharePoint.SPAuditMaskType]::CheckIn -bor `
                    [Microsoft.SharePoint.SPAuditMaskType]::CheckOut -bor `
                    [Microsoft.SharePoint.SPAuditMaskType]::Copy -bor `
                    [Microsoft.SharePoint.SPAuditMaskType]::Move -bor `
                    [Microsoft.SharePoint.SPAuditMaskType]::Delete -bor `
                    [Microsoft.SharePoint.SPAuditMaskType]::Undelete -bor `
                    [Microsoft.SharePoint.SPAuditMaskType]::SecurityChange
                $site.Audit.Update()
                $site.Dispose()
            }
        }
    }

    To use this all you need to do is set the web application name, and change the options to suit your requirements.

  • Automating Test-SPContentDatabase

    The SharePoint Health Analyzer included in SharePoint 2010/2013 has a rule named "Missing server side dependencies". This rule reports details of all artifacts that are referenced within a content database but not installed within the local farm - for example Web Parts, assemblies and Features. The one issue with this is that the formatting of the output isn't great (example below!). I was recently working with a customer that had a large number of issues reported, and it was difficult to decipher them. To make our lives a little easier, I created a Windows PowerShell script that runs Test-SPContentDatabase (which performs the same tests) against every registered content database within a farm and outputs the results for each database to a separate CSV file.

     

    The script can be found below, simply change the output path for the CSV files (highlighted) before running.

    Add-PSSnapin Microsoft.SharePoint.PowerShell -EA SilentlyContinue
    Foreach ($WebApp in (Get-SPWebApplication))
    {
        Write-Host "Testing Web Application - $($WebApp.Name)" -ForegroundColor Green
        Foreach ($CDB in $WebApp.ContentDatabases)
        {
            Test-SPContentDatabase -Name $CDB.Name -WebApplication $WebApp.Url -ServerInstance $CDB.Server |
                ConvertTo-Csv | Out-File -Encoding default -FilePath ("C:\" + $CDB.Name + ".csv")
        }
    }

    An example of the output can be seen below:

    Brendan Griffin


  • Build site navigation from CSV file

    One of our customers needed some help to build out navigation from a .csv file in their Microsoft Office SharePoint Server 2007 environment.

    PowerShell seemed the simplest option for them, so my colleague Andy and I created the following script:

    [void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")

    $url = "http://intranet"

    $sc = New-Object Microsoft.SharePoint.SPSite($url)
    $csv = Import-Csv -Path "e:\siteNav.csv"

    $w = $sc.OpenWeb()
    $tn = $w.Navigation.TopNavigationBar

    if (!$w.Navigation.UseShared)
    {
        foreach ($line in $csv)
        {
            if ($line.parent -eq "")
            {
                #Top-level link - add directly to the top navigation bar
                $node = New-Object Microsoft.SharePoint.Navigation.SPNavigationNode($line.description,$line.link,$true)
                $tn.AddAsLast($node)
            }
            else
            {
                #Child link - find the matching parent node and add beneath it
                foreach ($navItem in $tn)
                {
                    if ($navItem.Title -eq $line.parent)
                    {
                        $node = New-Object Microsoft.SharePoint.Navigation.SPNavigationNode($line.description,$line.link,$true)
                        $navItem.Children.AddAsLast($node)
                    }
                }
            }
        }
        $w.Update()
    }

    $w.Dispose()
    $sc.Dispose()

     

    The CSV file was structured with three columns (Description, Link and Parent):
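
    A hypothetical siteNav.csv might look like the following (the Parent column is left empty for top-level links; the URLs shown here are purely illustrative):

    Description,Link,Parent
    Home,http://intranet,
    HR,http://intranet/hr,
    Holidays,http://intranet/hr/holidays,HR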

     

  • Identifying Sites using the Publishing Feature

    Below is a simple script that iterates through all Site Collections in all Web Applications within a farm and outputs a list of those that have the Publishing Feature enabled. I needed this recently during a customer engagement to help them assess how widely the Publishing Feature was being used within their environment. Simply update the output path and run.

    Add-PSSnapin Microsoft.Sharepoint.PowerShell -EA SilentlyContinue
    $Output = "C:\Output.txt"
    "Site Collections with the Publishing Feature enabled " + (Get-Date) > $Output
    "------------------------------------------------------------------------" >> $Output
    $WebApps = Get-SPWebApplication
    $Feature = Get-SPFeature "PublishingSite"
    Foreach ($WebApp in $WebApps)
    {
        Foreach ($Site in $WebApp.Sites)
        {
            $FeaturePresent = Get-SPFeature -Site $Site | Where {$_.DisplayName -eq $Feature.DisplayName}
            if ($FeaturePresent -ne $null) {$Site.URL >> $Output}
            $Site.Dispose()
        }
    }

    Below is an example of the output.

    Brendan Griffin

  • Invalid field name {17ca3a22-fdfe-46eb-99b5-9646baed3f16}

    Not the most descriptive title, but it summarises the issue. A customer contacted me because they had issues with an Approval Workflow: the Workflow would execute, but it would never add anything to the Tasks list. The only information present in the ULS logs was:

    Invalid field name. {17ca3a22-fdfe-46eb-99b5-9646baed3f16}

    I did some investigation and found that this GUID relates to the FormURN field that should be present in the Tasks list - thanks Bing! My next course of action was to check the fields available in the Tasks list and compare these to a working Tasks list in a different Site Collection. I used the following PowerShell commands to collect this information, where RootWebURL equals the URL of the Site Collection, for example http://intranet.contoso.com/sites/legal.

    $Web = Get-SPWeb "RootWebURL"
    $List = $Web.Lists["Name of Tasks List"]
    $List.Fields | Select Title,InternalName | Sort InternalName > C:\Output.txt

    From this output I could see that several fields were missing from the Tasks list (including FormURN).
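
    If you need to do a similar comparison yourself, Compare-Object makes spotting the missing fields easy once you have both field lists (the two URLs below are purely hypothetical examples of a broken and a working Site Collection):

    $Broken = (Get-SPWeb "http://intranet/sites/legal").Lists["Tasks"].Fields | Select -ExpandProperty InternalName
    $Working = (Get-SPWeb "http://intranet/sites/hr").Lists["Tasks"].Fields | Select -ExpandProperty InternalName
    #Fields present in the working list but missing from the broken one
    Compare-Object $Working $Broken | Where {$_.SideIndicator -eq "<="}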

    I then compared the Task Content Type (which is automatically associated with the Tasks list) with a different Site Collection to identify any discrepancies using the following PowerShell commands:

    $Web = Get-SPWeb "RootWebURL"
    $Task = $Web.AvailableContentTypes | Where {$_.Name -eq "Task"}
    $Task.Fields | Select Title,InternalName > C:\Output.txt

    Interestingly I could see from the output that the content type had no fields associated with it (output cropped for brevity), this should have a number of fields associated.

    My next step was to check the Site Columns to ensure that they were all available and present (in particular FormURN), again comparing with a "working" Site Collection using the following PowerShell commands:

    $Web = Get-SPWeb "RootWebURL"
    $Web.AvailableFields | Select InternalName > C:\Output.txt

    This identified a large number of Site Columns that were missing from the Site - including FormURN. Fortunately the fix was pretty simple: there is a Feature imaginatively named "Fields" that is responsible for creating a number of Site Columns, so in this case it was simply a matter of disabling and then re-enabling this Feature using the following PowerShell commands:

    Disable-SPFeature -Identity "Fields" -Url RootWebURL
    Enable-SPFeature -Identity "Fields" -Url RootWebURL

    This successfully re-created all of the missing Site Columns and upon creating a new Workflow and associated Task list everything worked!

    I would always advise thoroughly testing the final two commands on a copy of the Site Collection in a testing environment - you don't want to make things any worse :)

    Brendan Griffin

  • The Magically Disappearing "Destination Folder" Option

    I recently investigated an interesting issue for a customer. As you may know, when uploading a document to a Document Library that contains folders, you are presented with the option to select which folder within the library to upload the document to.

    In the example below, you can see (highlighted) the option to select the destination folder, as the Document Library has a folder within - aptly named "Folder".

    The issue I ran into was that this option was not available for libraries within a specific Site (SPWeb) even if they contained a folder (see below).

    It turned out that somebody had changed the SPWeb property "CustomUploadPage" to NULL. The custom upload page is called whenever a Document Library contains at least one folder. The fix was simply a matter of configuring this property to use its default value:

    asnp *SharePoint* -ErrorAction SilentlyContinue
    $Web = Get-SPWeb "http://team.contoso.com/Sites/IT"
    $Web.CustomUploadPage = "/_layouts/15/UploadEx.aspx"
    $Web.Update()

    If you do run into this issue, run the above script, replacing the URL with that of the affected Site (SPWeb).
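
    To check whether a Site is affected before making any changes, you can simply read the property back - an empty value indicates the problem described above (the URL here is just the example from the script):

    (Get-SPWeb "http://team.contoso.com/Sites/IT").CustomUploadPage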

    Brendan Griffin


  • Getting Started with Office 365 and the Client Side Object Model - CSOM

    SharePoint 2013 included some huge improvements to the Client Side Object Model. As more and more of the customers that I work with are moving to Office 365, I need to start thinking about updating my various scripts to use CSOM. This is critical as my collection of scripts has been written exclusively against the Server Object Model - which I'm never, ever going to be able to access in Office 365 :)

    The beauty of CSOM is that scripts can be run remotely - as long as you have sufficient permissions to the SharePoint site, they can be run from your local machine!

    One of the pre-requisites of using CSOM is the SharePoint 2013 client-side assemblies, which need to be installed on the machine that is running the script. These are included in the SharePoint Server 2013 Client Components SDK, which can be downloaded from http://www.microsoft.com/en-us/download/details.aspx?id=35585.

    To get me started, I've written a super-basic script that connects to a Site within an Office 365 tenant and updates the Title and Description - that is all! As I become more proficient with CSOM, I will share what I learn on the blog. The script will prompt for the username and password; all you need to do is update the URL, Title and Description to get this to work!

    #Add references to SharePoint client assemblies and authenticate to Office 365 site
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.dll"
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"
    $Username = Read-Host -Prompt "Please enter your username"
    $Password = Read-Host -Prompt "Please enter your password" -AsSecureString
    $Site = "https://tenant.sharepoint.com/site"
    $Context = New-Object Microsoft.SharePoint.Client.ClientContext($Site)
    $Creds = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($Username,$Password)
    $Context.Credentials = $Creds

    #Connect to site within the tenant and update the title and description
    $Web = $Context.Web
    $Context.Load($Web)
    $Web.Title = "New Title"
    $Web.Description = "New Description"
    $Web.Update()
    $Context.ExecuteQuery()

     Brendan Griffin

     

  • Creating a Large Number of Users in AD for testing

    I recently had to troubleshoot an issue with the User Profile Service Application and needed to build a repro environment. As part of this I needed to create a large number of users in AD to import into the UPA (too many to create manually using AD Users and Computers!). I thought there must be an easy way using PowerShell - and there is, using the Active Directory PowerShell module and the New-ADUser cmdlet.

    The script below will create 1000 users in AD (update the $NumAccounts variable if you need more). It will prompt for a password, which will be used for all of the accounts. I tested the script on Windows Server 2008 R2; it should also work on Windows Server 2012. The script can either be run from a DC or a server with the Remote Server Administration Tools (RSAT) feature enabled.

    Import-Module ActiveDirectory
    $Password = Read-Host -AsSecureString
    $NumAccounts = 1000
    $i = 1
    While ($i -le $NumAccounts)
    {
    Write-Host "Creating User" $i
    New-ADUser -Name ("User" + $i) -AccountPassword $Password -PasswordNeverExpires $true -Enabled $true
    $i++
    }
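
    Once the script has finished, a quick sanity check with the AD module confirms the accounts were created (assuming no other accounts in the domain match the User* naming pattern):

    (Get-ADUser -Filter 'Name -like "User*"' | Measure-Object).Count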


    Brendan Griffin

  • Download language packs for SharePoint 2013

    Hi all,

    I've been working with a customer recently to help them deploy a multi-lingual SharePoint 2013 environment.
    Downloading each language pack manually can be a little time consuming, so I helped automate the process with PowerShell. Here is the script:

    # Import BITS module
    Import-Module BitsTransfer

    # Path to download language packs to
    $downloadPath = "C:\SPLanguagePacks"

    # Hashtable of language packs to download
    $lPacks = @{
        Arabic      = "http://download.microsoft.com/download/3/2/C/32C97E8A-E1C4-4BC3-B4B5-1E85B2E0A571/serverlanguagepack.img"
        ChineseSimp = "http://download.microsoft.com/download/4/7/7/477BFB7A-C9C2-4B1D-8408-D70D4AF52DBA/serverlanguagepack.img"
        ChineseTrad = "http://download.microsoft.com/download/F/2/D/F2D67EBD-C9AE-482E-83FA-C4669F058073/serverlanguagepack.img"
        English     = "http://download.microsoft.com/download/7/E/C/7EC7E73F-F172-453F-877C-640AF0B82D26/serverlanguagepack.img"
        Kazakh      = "http://download.microsoft.com/download/0/D/1/0D1FD1A6-9104-4E57-A531-DDD26EE82E8F/serverlanguagepack.img"
        Korean      = "http://download.microsoft.com/download/0/0/D/00D60DFF-E7D2-4EA2-BF6B-0FD591ED7AC3/serverlanguagepack.img"
    }

    # Loop through each hashtable item
    $lPacks.GetEnumerator() | ForEach-Object {
        $lang = $_.Name
        $link = $_.Value

        # Create a folder for each language
        $destination = New-Item -Path "$downloadPath\$lang" -ItemType Directory

        # Start download of the language pack
        Start-BitsTransfer -Source $link -Destination $destination -DisplayName "Downloading $lang SharePoint Language Pack" -Priority High
    }
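
    The downloads are .img files. On Windows Server 2012 you can mount one directly with Mount-DiskImage to get at the installer (on Windows Server 2008 R2 you would need to extract or burn the image instead) - for example, using the English pack downloaded above:

    Mount-DiskImage -ImagePath "C:\SPLanguagePacks\English\serverlanguagepack.img"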

  • Office 365 - PowerShell Script to Create a List, Add Fields and Change the Default View all using CSOM

    In my continued quest to get to grips with the Client Side Object Model (CSOM) in SharePoint 2013, I have put together a sample script below that connects to a Site Collection within an O365 tenant and does the following:

    • Creates a list using the "Custom" list template
    • Adds two Site Columns to the list (City and Company)
    • Adds these fields to the default view
    • Adds an item to the list

    You may find this useful as a reference! The usual disclaimers apply :)

    All you need to run this script is an O365 tenant, the SharePoint client components SDK installed on the machine running the script (http://www.microsoft.com/en-us/download/details.aspx?id=35585), and to update the $User, $SiteURL and $ListTitle variables. When the script is executed it will prompt for the password of the user specified in the $User variable.

    #Specify tenant admin and site URL
    $User = "admin@tenant.onmicrosoft.com"
    $SiteURL = "https://tenant.sharepoint.com/sites/site"
    $ListTitle = "List Title"

    #Add references to SharePoint client assemblies and authenticate to Office 365 site - required for CSOM
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.dll"
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"
    $Password = Read-Host -Prompt "Please enter your password" -AsSecureString
    $Creds = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($User,$Password)

    #Bind to site collection
    $Context = New-Object Microsoft.SharePoint.Client.ClientContext($SiteURL)
    $Context.Credentials = $Creds

    #Retrieve lists
    $Lists = $Context.Web.Lists
    $Context.Load($Lists)
    $Context.ExecuteQuery()

    #Create list with "custom" list template
    $ListInfo = New-Object Microsoft.SharePoint.Client.ListCreationInformation
    $ListInfo.Title = $ListTitle
    $ListInfo.TemplateType = "100"
    $List = $Context.Web.Lists.Add($ListInfo)
    $List.Description = $ListTitle
    $List.Update()
    $Context.ExecuteQuery()

    #Retrieve site columns (fields)
    $SiteColumns = $Context.Web.AvailableFields
    $Context.Load($SiteColumns)
    $Context.ExecuteQuery()

    #Grab city and company fields
    $City = $Context.Web.AvailableFields | Where {$_.Title -eq "City"}
    $Company = $Context.Web.AvailableFields | Where {$_.Title -eq "Company"}
    $Context.Load($City)
    $Context.Load($Company)
    $Context.ExecuteQuery()

    #Add fields to the list
    $List.Fields.Add($City)
    $List.Fields.Add($Company)
    $List.Update()
    $Context.ExecuteQuery()

    #Add fields to the default view
    $DefaultView = $List.DefaultView
    $DefaultView.ViewFields.Add("City")
    $DefaultView.ViewFields.Add("Company")
    $DefaultView.Update()
    $Context.ExecuteQuery()

    #Adds an item to the list
    $ListItemInfo = New-Object Microsoft.SharePoint.Client.ListItemCreationInformation
    $Item = $List.AddItem($ListItemInfo)
    $Item["Title"] = "New Item"
    $Item["Company"] = "Contoso"
    $Item["WorkCity"] = "London" #The "City" site column's internal name is WorkCity
    $Item.Update()
    $Context.ExecuteQuery()


    Brendan Griffin