FromTheField

Real world experiences of SharePoint PFE and CTS engineers from Microsoft UK


  • SharePoint 2013 and Office 365 Hybrid

    I recently presented a session at SharePoint Saturday UK on Configuring SharePoint 2013 and Office 365 Hybrid. As promised to the attendees, attached is the presentation (including demo videos). This is an updated version of my previous session at the Yorkshire SharePoint User Group, which goes into greater depth on the identity requirements - http://blogs.technet.com/b/fromthefield/archive/2014/11/04/hybrid-search-with-sharepoint-2013-and-office-365.aspx.

    Brendan Griffin - @brendankarl

  • SP2010 workflow performance

    Today I want to talk about something quite odd that I ran across at one of my customers...

    They had some issues with one of their workflows not "performing" well... It took ages for the workflow to complete.

    One of the most important troubleshooting steps with workflows is usually quite simple...

    It's just "Wait..."

    But since that had already been done for a few days, we had to take a closer look to see what was going on. SharePoint workflows can run either in the W3WP process, or they can be handed over to the OWSTimer service. Usually this transition happens when you add a workflow action that calls for a "wait", or when events need to be picked up as a reaction to edits... So we checked the timer service and reviewed the running timer jobs. The odd thing was that the workflow timer job had already been running on every server in the farm for several hours. That is very odd, since usually that job only runs for a few minutes per run, so we suspected a hung workflow. The workflow engine provides a lot of ULS entries if it fails somewhere, but none of those showed up... A good collection of the events that are output on failure can be found here: http://blogs.technet.com/b/sharepointdevelopersupport/archive/2013/03/12/sharepoint-workflow-architecture-part-3.aspx (this also provides some very good information on how to troubleshoot workflows in general).

    So we increased the ULS logging level, which helped us identify what was happening in the job. The next step was to gather the data, port it to a single server, and analyze it (Merge-SPLogFile). But since it was a busy production farm, the amount gathered was very high. We found that the workflow timer jobs were constantly updating list items in the same site collection, but there were no obvious workflow errors in the ULS logs. So now we at least had a place where we could take a closer look.
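    For reference, merging the ULS logs from every server in the farm for a recent time window looks something like this (the output path and the two-hour window are just examples):

    ```powershell
    #Merge the last two hours of ULS logs from all farm servers into a single file
    Merge-SPLogFile -Path "D:\Logs\FarmMerged.log" -StartTime (Get-Date).AddHours(-2) -Overwrite
    ```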

    As it turned out, the site collection in question was already known to the customer... The site collection was Nintex Workflow enabled (only users with special in-house training are allowed to use the Nintex workflow components at that customer).

    Now let's go a bit off topic here... The power user on that site collection had a small issue, which he actually managed to solve...

    He had a list and wanted to use [Today] in a calculated field. But SharePoint does not support the use of dynamic data (like [Me] or [Today]) in a calculated field. It does, however, support the use of other fields in a calculated field. So the user outsmarted SharePoint by adding a field called "Today" to his lists... But for this to work, the field had to be updated each day. So he found a workflow action that helped him out:

    Armed with this knowledge, he went on and created a site workflow to be executed daily... Here is a picture of the workflow the user designed:

    Basically it was a very simple workflow. The first action was to activate the correct permissions to edit the lists, followed by 12 instances of the "Update multiple Items" action for 12 different lists.

    And since this workflow had to do its work each day, they scheduled it to run daily... The lists themselves are not really "huge" and the total number of items across all lists is only about 10,000, so there were no throttling exceptions or anything like that...

    So back to our workflow issue... Why did this cause a problem?

    Well... There are two timeouts when it comes to workflows. The first is the event delivery timeout... This timeout determines how long a workflow can run. It is documented here: http://technet.microsoft.com/en-us/library/cc262968(v=office.12).aspx

    Note:

    If you create a workflow solution that has a very long processing time to start your workflows, complete tasks, or modify workflows, you should consider increasing this value. View the ULS logs and watch the Microsoft SQL Server table ScheduledWorkItems to determine if the workflow jobs are timing out. The default folder location for the ULS log is Program Files\Common Files\Microsoft Shared\Web server extensions\12\Logs. In the ULS log file, you can use "workflow" or "workflow infrastructure" as search keywords.

    There is also a second timeout, which determines when a workflow is considered dead. This timeout kicks in after 20 minutes and resets the state of the workflow, so that it will be processed again.
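    If you do find that you need a longer event delivery timeout, it is exposed as a property (in minutes) that can be read and set with stsadm - for example (the web application URL is a placeholder, and the value of 15 is just an illustration):

    ```powershell
    stsadm -o getproperty -pn workflow-eventdelivery-timeout -url http://intranet.contoso.com
    stsadm -o setproperty -pn workflow-eventdelivery-timeout -pv "15" -url http://intranet.contoso.com
    ```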

    Now there was a point when the server tipped over, and the processing for all those lists exceeded the second timeout, which caused the workflows to be reset. Each day a new workflow was added to the queue... None of these workflows managed to complete, and we piled up a total of about 100 never-ending workflows over time. The issue was never noticed, since the workflows managed to update "most" items, so everything looked fine... There was only one strange symptom: the workflow timer job kept increasing in duration, and took longer and longer to complete... (100 x 20 minutes = 2,000 minutes; 5 servers in the farm means an average of 6-7 hours per workflow timer job run per server.)

    This started to raise other alerts, since the normal workflows got "stuck" behind this one (processing is done one content DB at a time), things like approvals took too long to be processed, and users started to complain.

    How could this have been fixed?

    Assuming the user insists on keeping his solution, the easiest fix would be to use the filter in the workflow action to only retrieve items whose "Today" field is NOT today's date. If the workflow fails the first time, it will just restart and fix the items left over from the first run, instead of performing millions upon millions of update actions that set today's date on records that already have it...
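    For illustration, here is the same "only touch stale items" idea expressed in PowerShell (this is just a sketch of the logic, not the Nintex filter itself; the web URL, list name and "Today" field name are all assumptions):

    ```powershell
    asnp *SharePoint* -ea 0
    $Web = Get-SPWeb "http://intranet.contoso.com"   #Placeholder URL
    $List = $Web.Lists["News"]                       #Hypothetical list name
    Foreach ($Item in $List.Items)
    {
        #Skip items whose helper field is already set to today's date
        If (([DateTime]$Item["Today"]).Date -ne (Get-Date).Date)
        {
            $Item["Today"] = (Get-Date).Date
            $Item.Update()
        }
    }
    $Web.Dispose()
    ```

    Only the stale items generate update calls, so a restarted run does a fraction of the work of the original workflow.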

  • Hybrid Search with SharePoint 2013 and Office 365

    I recently presented a session at the Yorkshire (UK) SharePoint User Group on Hybrid Search with SharePoint 2013 and Office 365. As promised to the attendees, attached is the presentation (including demo videos).

    Brendan Griffin - @brendankarl

  • Backup/Restore-SPSite and the Missing Term Set!

    One of my customers recently wanted to move a Site Collection into a new Content Database, however instead of using Move-SPSite they decided to use Backup/Restore-SPSite - and here is where the fun began!

    One of the key differences between Move-SPSite and Restore-SPSite is that Restore-SPSite assigns a new GUID to the Site Collection as part of the restore process. One consequence is that the Site Collection Term Set loses its mapping and becomes inaccessible - the reason being that the ACL for the Term Set uses the Site GUID, so if the Site's GUID changes, the site no longer has access to its Site Collection Term Store.

    The result for my customer was that the Site Collection Term Set vanished and was inaccessible once the Site Collection had been restored. Fortunately this is fairly easy to resolve using the steps below:

    • Determine the GUID (UniqueId) of the Site Collection Term Set - This can be obtained by querying the MMS Service Application database. In the example below the Site Collection was named "qwerty".
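    If you would rather not query the MMS database directly, you can usually get the same information from the object model - a quick sketch that lists every group in the default term store together with its Id, from which the orphaned site collection group can be identified (SiteURL is a placeholder; run it against any site attached to the same MMS Service Application):

    ```powershell
    asnp *sharepoint* -ea 0
    $Site = Get-SPSite "SiteURL"
    $Session = Get-SPTaxonomySession -Site $Site
    #List each term store group with its Id
    $Session.DefaultSiteCollectionTermStore.Groups | Select Name, Id
    ```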

    • Execute the following script - replacing TermSetGUID with the UniqueId obtained in the step above and SiteURL with the URL of the Site Collection that you need to grant access to the Term Set. Please run it with an account that has "Full Control" permissions on the MMS Service Application and is a "Term Store Administrator".

    asnp *sharepoint* -ea 0
    $TermSetGUID = "TermSetGUID"
    $Site = Get-SPSite "SiteURL"
    $Session = Get-SPTaxonomySession -Site $Site
    $Group = $Session.DefaultSiteCollectionTermStore.Groups | Where {$_.Id -eq $TermSetGUID}
    $Group.AddSiteCollectionAccess($Site.Id)
    $Group.TermStore.CommitAll()


    Brendan Griffin - @brendankarl

  • Export Members of a SharePoint Audience

    Here is a quick PowerShell script that I put together for a customer to enable them to export the usernames and e-mail addresses of all users that are members of a specific SharePoint Audience to a CSV file.

    Please update the three values in the script: $Output is the location to write the CSV file to; replace http://intranet.contoso.com with the URL of a Site Collection that resides within a Web Application associated with the User Profile Service Application that contains the Audience; and replace "Test" with the name of the Audience that you wish to export.

    This script was tested on SharePoint 2010 but should also work on SharePoint 2013.

    asnp *SharePoint* -ea 0
    $Output = "D:\Output.csv"
    "Username" + "," + "Email" | Out-File -Encoding Default -FilePath $Output
    $Site = Get-SPSite "http://intranet.contoso.com"
    $Context = [Microsoft.Office.Server.ServerContext]::GetContext($Site)
    $AudManager = New-Object Microsoft.Office.Server.Audience.AudienceManager($Context)
    $Audience = $AudManager.Audiences | Where {$_.AudienceName -eq "Test"}
    Foreach ($Member in $Audience.GetMembership())
    {
        Write-Host $Member.NTName
        Write-Host $Member.Email
        $Member.NTName + "," + $Member.Email | Out-File -Encoding Default -Append -FilePath $Output
    }

    Brendan Griffin - @brendankarl

  • SharePoint Maintenance

    From time to time you may need to temporarily make SharePoint unavailable, for example when applying a CU or Service Pack. One of my customers places an App_Offline.htm file at the root of the IIS virtual directory for each Web Application on each WFE server within their farm. This presents users with a friendly message when they try to access SharePoint, explaining that it is currently unavailable due to scheduled maintenance.

    This works really well, and they wanted to automate the process of copying the App_Offline.htm file prior to performing maintenance, so I wrote the following two scripts. The first script copies the file (to enter maintenance); the second deletes it (to return SharePoint to service).

    The script has logic to handle MOSS 2007, SharePoint 2010 and 2013. It iterates through all Web Applications and zones and copies the App_Offline holding page to the web root of the IIS site for each Web Application on each server that hosts one. Simply execute it with PowerShell on a server within the farm; the script expects an App_Offline.htm file in the directory it is executed from. The script needs to be run using an account that has local admin permissions on each server, and presumes that SMB/CIFS access is available.

    Enter Maintenance

    #Load SharePoint assembly
    $Assemblies = [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")

    #Check if the script is running on MOSS 2007
    If (Test-Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\12"){$Mode="2007"}

    #Grab all Web Apps
    $WebApps = [Microsoft.SharePoint.Administration.SPWebService]::ContentService.WebApplications

    #Retrieve Servers
    $Farm = [Microsoft.SharePoint.Administration.SPFarm]::Local
    $SPServers = $Farm.Servers | Where {$_.Role -eq "Application"} | Foreach {$_.Name}

    Foreach ($WebApp in $WebApps)
    {
        Foreach ($URL in $WebApp.AlternateUrls)
        {
            If ($Mode -eq "2007")
            {
                $WebRoot = ($WebApp.GetIisSettingsWithFallback($URL.URLZone)).Path.FullName -replace ":","$"
            }
            Else
            {
                $WebRoot = ($WebApp.GetIisSettingsWithFallback($URL.Zone)).Path.FullName -replace ":","$"
            }
            Write-Host "Setting the Holding Page for" $WebApp.Name "- Zone:" $URL.URLZone -ForegroundColor Green
            Foreach ($Server in $SPServers)
            {
                Write-Host "-Updating" $Server -ForegroundColor Green
                Copy-Item "app_offline.htm" "\\$Server\$WebRoot\"
            }
        }
    }

    Return SharePoint into Service

    #Load SharePoint assembly
    $Assemblies = [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")

    #Check if the script is running on MOSS 2007
    If (Test-Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\12"){$Mode="2007"}

    #Grab all Web Apps
    $WebApps = [Microsoft.SharePoint.Administration.SPWebService]::ContentService.WebApplications

    #Retrieve Servers
    $Farm = [Microsoft.SharePoint.Administration.SPFarm]::Local
    $SPServers = $Farm.Servers | Where {$_.Role -eq "Application"} | Foreach {$_.Name}

    Foreach ($WebApp in $WebApps)
    {
        Foreach ($URL in $WebApp.AlternateUrls)
        {
            If ($Mode -eq "2007")
            {
                $WebRoot = ($WebApp.GetIisSettingsWithFallback($URL.URLZone)).Path.FullName -replace ":","$"
            }
            Else
            {
                $WebRoot = ($WebApp.GetIisSettingsWithFallback($URL.Zone)).Path.FullName -replace ":","$"
            }
            Write-Host "Removing the Holding Page for" $WebApp.Name "- Zone:" $URL.URLZone -ForegroundColor Green
            Foreach ($Server in $SPServers)
            {
                Write-Host "-Updating" $Server -ForegroundColor Green
                Remove-Item "\\$Server\$WebRoot\app_offline.htm"
            }
        }
    }

    Brendan Griffin - @brendankarl

  • SharePoint 2010 Organization Browser - Expose more than just User, Title and About Me (Sort Of).

    One of my favourite customers was very keen to display the OOB Organisation Browser web part on several of their pages but they wanted to expose more than the default values against the user. I know this can be done by clicking across to the users profile page but the extra click was not acceptable.

    As you will know, this is a Silverlight control and therefore not easily customisable, so the solution I came up with was essentially to grab the properties they wanted to expose and copy them all into the 'About Me' portion of the user profile.

    They have somewhere in the region of 1000 users on this particular system so obviously I enlisted the help of our friend PowerShell.

    The code looks something like the below. Please be aware of the warning, and note that this will obviously not keep the data fresh... It might also be worth removing the users' ability to edit the About Me field, so they are not too disappointed when you run it again and clear all their entries!

    WARNING! THIS CODE WILL REMOVE ALL INFORMATION IN THE ABOUT ME FIELD FOR ALL USERS BEFORE IT ADDS THE NEW PROPERTIES!

    #Add the SharePoint PowerShell snap-in
    Add-PSSnapin "Microsoft.SharePoint.PowerShell"

    #Get a Site Collection within the Web Application
    $site = new-object Microsoft.SharePoint.SPSite("http://sharepoint2010/");

    #Get the Service Context of the Web Application
    $serviceContext = Get-SPServiceContext $site;
    $site.Dispose();

    #Get the User Profile Manager
    $upm = new-object Microsoft.Office.Server.UserProfiles.UserProfileManager($serviceContext);

    #Get all of the user profiles for the selected User Profile Manager
    $profiles = $upm.GetEnumerator()

    #Execute a foreach loop against every user
    foreach ($userProfile in $profiles)
    {
        #Delete all values in the AboutMe section of the User Profile for ALL users
        $am = $userProfile["AboutMe"]
        $am.Clear()

        #Set the AboutMe property to the value of $am (empty)
        $userProfile["AboutMe"].Value = $am;

        #Save the profile changes back to the User Profile store
        $userProfile.Commit()

        #Get the fields you want to expose in the AboutMe section of the User Profile and on the Org Browser
        $userProperty1 = $userProfile["WorkEmail"].Value
        $userProperty2 = $userProfile["Office"].Value

        #Concatenate the values and add them to the AboutMe section, separated by a new line
        $userProfile["AboutMe"].Value = "Email: " + $userProperty1 + "`n" + "Office Location: " + $userProperty2

        #Commit the changes to the user profile
        $userProfile.Commit();
    }


    Essentially it just grabs existing profile properties and puts them into the 'About Me' field. Don't forget the embedded new line ("`n" in PowerShell), which is rendered as a line break by the Silverlight control.

    Thanks to Dave Little for his expert help with that :)

    I just ran the script on my test VM with 1000 users and it completed in under 30 seconds.

    The results are a little something like this:



    Hopefully someone will find this useful - I scoured the internet and could not find anything that did what I wanted :)


    Enjoy!


    Andy

  • SharePoint Evolution Conference Roadshow - SharePoint Online Administration with PowerShell

    I recently presented a session at the SharePoint Evolution Conference Roadshow - http://www.sharepointevolutionconference.com

    As promised to the attendees, I have attached the slides from my session "SharePoint Online Administration with PowerShell"

    Brendan Griffin - @brendankarl

  • BI Virtual Labs for customers to have a dabble

    Are you curious about Business Intelligence and want to have a little bit of a dabble on some VMs to understand the possibilities better?

    V-Lab Central are now hosting some fully loaded BI demo VMs with a walkthrough script, which you can quickly spin up to see 'the art of the possible'. I have used these VMs during several demos and they are a good way to see some first-hand action on SharePoint's interaction with PowerPivot, Power View, Reporting Services and a whole load of other Microsoft BI stuff.

    You can access the site at http://bi.vlabcentral.com/.

    Note: This is a non-Microsoft site - I assume this is an MS partner doing the hosting. There is also a small cost associated with spinning up the VMs, depending on how long you want them for (at the time of blogging there is a nominal $8 charge for a 2-hour slot).

  • Creating custom display templates

    Custom Display Templates / SharePoint 2013

    Recently I have been working with display templates and the Content by Search web part on a customer engagement.  I thought I would write up some ‘how to’ type material for you all in case you are having your first experience with them.

    This blog article walks you through the following scenario:

    1. I have a heavily branded SharePoint 2013 Intranet for my customer Contoso. They have a number of news articles that they want to aggregate onto the landing page.

    Display Templates

    If you are here, then you most likely know what a display template is, and as SharePoint 2013 has been around for a while now there is already a lot of great content out there explaining what they do, so I won't try to write a new take on them. My only comment about customising display templates is that the syntax may be a little confusing at first, but when I compare the experience of writing XSLT (in past versions of SharePoint) to the experience of creating a custom display template, there is clearly a winner - and the winner is display templates, by a long way :)

    Below you can see a screenshot of what we want to create for our first scenario. It is a web part (content by search) that displays the first four employee stories:

    The first thing to point out here is that the end result (e.g. what you are seeing in the screenshot above) is actually composed of two separate display templates. We have a control template, and an item template.

    Look at the screenshot, forget SharePoint for a brief moment, and think about which HTML element would best suit a bunch of items that you want to look the same. Yep, the <ul> element. (That wasn't a test question, but +10 points if you got it right!)

    Our ‘Control’ display template is going to contain our <ul> element and any other code that we need, think custom JavaScript or CSS classes. The control template is your opportunity to control how the group of results is going to be displayed; not the individual items themselves.

    Moving on to the ‘item’ template. As we are going to use the unordered list <ul> element for the collection of items, we need to use the list item <li> element for each of the individual news stories. The ‘Item’ display template is the markup, JavaScript and CSS that is applied to each item within the list.

    Types of Display Template

    Whilst we are looking at two different types of display template, we should also look at what else is available to us as there are actually four different types of Display Template:

    1. Control Display Templates

    2. Group Display Templates

    3. Item Display Templates

    4. Filter Display Templates

    Display templates are associated with content types in the Master Page gallery. When a .html document (e.g. your display template) is uploaded to the Master Page Gallery; the same event receivers that are responsible for creating .aspx and .master versions of design files work to create the .js version (HTMLDesignEventReceiver).

    The HTML.

    So far we have established that news article items are going to be returned as list items (<li>). Each <li> contains an image, which is also going to link to the news item page, some introduction text, and a label for the author; so in terms of HTML we are looking at:

    <ul>
        <li>
            <a href="<some url in here>">
                <img src="<some url in here>" />
            </a>
            <h2>Byline</h2>
            <p>Author</p>
        </li>
        <li>
            <!-- next item -->
        </li>
    </ul>


    The content.

    The first thing that we need before we start developing our display templates is the content!

    Below is a partial screenshot of a ‘news article’ page:


     Firstly, as we are going to be using the Content By Search web part, we need to make sure that our content has been crawled! Otherwise, this is going to be a short article!

    Secondly, we are going to be mapping news article properties in the web part’s configuration. Our news article contains images, headings, bylines, author name, etc, so we can really choose what properties we want to be displayed. So the second check there is to make sure that the properties are mapped.

    In this walkthrough, I have already crawled my content, and checked that the crawl has picked up and more importantly mapped properties for my news article.


    Quick sense check:

    1. We have a number of news article pages that are all ‘ContosoNewsPage’ content types.

    2. They have been crawled by our SSA.

    3. The crawled properties of our news articles are mapped.

    We are ready to create our display templates, and then add & configure the content by search web part.

    Reinventing the wheel.

    If we take a look at the display templates that are provided out of the box with SharePoint, you can see that we have most scenarios covered. By that, I mean that in this example, we have a list of items that are being rendered with an image, and essentially two lines of text.  This means that we can make the most of what the product group have given us, and take a copy of the closest display template that matches what we are trying to do.

    In this instance, I am going to take a copy of the ‘List’ control display template, and the ‘Picture on top, 3 lines on bottom’ item display template. This gives us a good starting point.


    Locating display templates.

    If you go to:

    1. Open Site Settings, and then click on the Master pages and page layouts link.

    2. Open the Display Templates document library, and then open the Content Web Parts document library.

    You should see a bunch of .html and .js files.  We want to download a copy of the two display templates that we intend to update to display the content how we want it to :)

    Therefore, go ahead and download the Control_List.html and Item_Picture3Lines.html files.

    Note: do not download the .js equivalent files.



    Updating Control_List.html

    We’re going to take a perfectly suitable display template, and tweak it a little so that it has our own markup. If you open the Control_List.html file in your preferred ISE; then take a look at the <head> section:

    You'll notice pretty much straight away that when you edit the web part's properties, the <title> element is what is shown as the title of the display template. So change this to whatever you like - 'Employee Story List' in my case.

    You should also notice the following:

    <mso:MasterPageDescription msdt:dt="string">This is the default Control Display Template…</mso:MasterPageDescription>

    This property contains the information that is displayed as the description of the Control Display template. Go ahead and enter some user friendly description of what this template will do.

    We don’t need to change any of the other properties in the <head> section, so scroll further down to about line 50…

    In the screenshot above you can see there is some HTML markup (line 60 & 64) where we are creating our <ul> item. In between the opening and closing <ul> we are going to pass in our results from the Content By Search web parts query in the form of <li> that our item display template will control.

    The other interesting piece here, which you don't need to change, is the var ListRenderRenderWrapper. Within this, we are writing out our <li> items.

    The only customisation I have had to make to the Control Display Template is to add the class="fullWidth containsImage" so that I can style the results as per my responsive CSS classes.

    Once I have finished editing this file I will save it as Control_EmployeeStory.html.

    The final stage is to import this into SharePoint:

    1. Open Site Settings, and then click on the Master pages and page layouts link.

    2. Open the Display Templates document library, and then open the Content Web Parts document library.

    3. In the ribbon, click Files, then Upload Document.

    4. Browse to the location where you have saved the Control_EmployeeStory.html file and click OK.

    You will now notice that there is a corresponding .js file that has been created for you in the background:
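    As an aside, if you would rather script the upload than use the ribbon, something along these lines works - a sketch, with the site URL and local file path as assumptions:

    ```powershell
    asnp *SharePoint* -ea 0
    $Web = Get-SPWeb "http://intranet.contoso.com"   #Placeholder - use your site collection's root web
    #The display template folder for content web parts in the Master Page Gallery
    $Folder = $Web.GetFolder("_catalogs/masterpage/Display Templates/Content Web Parts")
    $File = Get-Item "C:\Temp\Control_EmployeeStory.html"   #Assumed local path
    $Stream = $File.OpenRead()
    $Folder.Files.Add($File.Name, $Stream, $true) | Out-Null
    $Stream.Close()
    $Web.Dispose()
    ```

    The same event receiver mentioned earlier fires on upload, so the .js version is generated for you just as it is when you upload through the browser.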

    Next we need to go through the same process and edit our copy of Item_Picture3Lines.html

    Updating Item_Picture3Lines.html

    Take a look at the <head> section of this display template

    You’ll notice that it differs slightly from the control display template. We still have the <title> field that we need to change (this is what we are going to see when we edit the content by search web part properties); but in addition to this we now have the following section to take a look at:

     This is where we are able to map our default properties to variables; take a look:

    <mso:ManagedPropertyMapping msdt:dt="string">
        'Picture URL'{Picture URL}:'PublishingImage'

    And then look at the UI of the web part:

    These mapped properties indicate which values will be passed into our display template by default; we can change them, but for now we’ll leave things as they are.

    The next steps are the same as for the control display template: we need to save the .html file and upload it into the Master pages and page layouts gallery.

    Once you have finished, you can go ahead and define the search query that will return your content, and select the two new display templates that you just uploaded.

    Your results may not look the same as mine as I have added some custom CSS to make it look how I wanted, and also to be responsive. 

    Hope this helps you create some amazing display templates :)


    Steve
    @moss_sjeffery


  • SharePoint: Planning for the Future - PowerShell Scripts

    I recently presented a session on planning for the future with SharePoint. As promised to the attendees, below are two of the scripts that I used in my demonstrations to audit Content Databases and Site Collections.

    Content Database Inventory

    This script outputs details of all Content Databases within a SharePoint farm to a CSV file, including the following information:

    • Name
    • SQL Server
    • Web Application
    • Current number of sites
    • Maximum number of sites allowed
    • Total size of the database (MB)

    Prior to running the script, please update the $Output variable with the location to store the output CSV file.

    asnp *SharePoint* -ErrorAction SilentlyContinue
    #Configure output location for CSV file
    $Output = "D:\CDBInventory.csv"
    #Create headings in the CSV file
    $Headings = "Name","Server","Web App","No. of Sites","Max Site Count","Size (MB)"
    $Headings -join "," | Out-File -Encoding default -FilePath $Output
    #Loop through each Content Database and output the required properties
    Foreach ($CDB in (Get-SPContentDatabase))
    {
        $Info =
        $CDB.Name,
        $CDB.Server,
        $CDB.WebApplication.Url,
        $CDB.CurrentSiteCount,
        $CDB.MaximumSiteCount,
        ($CDB.DiskSizeRequired /1MB)
        $Info -join "," | Out-File -Encoding default -Append -FilePath $Output
    }

    Below is an example of the CSV file created.


    Site Collection Inventory

    This script outputs details of all Site Collections within a SharePoint farm to a CSV file, including the following information:

    • URL
    • Owner
    • Content database the Site is stored within
    • Number of users
    • Size (MB)
    • Template
    • Number of sub-sites (Webs)
    • Certification date (useful if Site Confirmation and Deletion is turned on)

    Prior to running the script, please update the $Output variable with the location to store the output CSV file.

    asnp *SharePoint* -ErrorAction SilentlyContinue
    #Configure output location for CSV file
    $Output = "D:\SiteInventory.csv"
    #Create headings in the CSV file
    $Headings = "URL","Owner","Content DB","Total Users","Storage Used (MB)",
    "Template","Number of Webs","Certification Date"
    $Headings -join "," | Out-File -Encoding default -FilePath $Output
    #Loop through each Site Collection and output the required properties
    Foreach ($S in (Get-SPSite -Limit All))
    {
        $Info =
        $S.URL,
        $S.Owner,
        $S.ContentDatabase.Name,
        $S.RootWeb.SiteUsers.Count,
        ($S.Usage.Storage /1MB),
        ($S.RootWeb.WebTemplate + "#" + $S.RootWeb.WebTemplateId),
        $S.AllWebs.Count,
        $S.CertificationDate
        $Info -join "," | Out-File -Encoding default -Append -FilePath $Output
        $S.Dispose()
    }

    Below is an example of the CSV file created.

    Brendan Griffin - @brendankarl

  • Who are Global Business Support? What Do They Do?

    It's been nearly three weeks since my last post - how time flies! I'm back with a slightly different post. The contributors to this blog are from the PFE and CTS teams within Microsoft UK; these teams are part of a larger team known as Global Business Support (GBS). One of my colleagues recently recorded a video that provides some background on the GBS team and the job role, and I thought it was worth sharing to provide a little more insight into the background of the blog contributors.

    https://www.youtube.com/watch?v=hedHasRflkY&list=PLEXuFMEOTa_geCL4_0lW8EIMMkFinAml5

    Brendan Griffin - @brendankarl

  • Sizing SharePoint - The 2nd stage Site collection recycle bin

    One of my customers asked me about the impact of the second-stage recycle bin on sizing a new farm. They were concerned that the default value of 50% of the current site quota could have a negative impact on sizing the farm and could inflate the expected size of the content databases. This setting can be changed at the web application level:
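    For reference, the same change can be made in PowerShell; a minimal sketch, assuming a placeholder web application URL (SecondStageRecycleBinQuota is the percentage of the live site quota reserved for the second stage):

    ```powershell
    # Sketch: adjust the second-stage recycle bin quota for one web
    # application. "http://webapp" is a placeholder URL.
    asnp *SharePoint* -ErrorAction SilentlyContinue
    $WebApp = Get-SPWebApplication "http://webapp"
    $WebApp.SecondStageRecycleBinQuota = 50   # default: 50 (% of site quota)
    $WebApp.Update()
    ```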

    My initial response to the customer was that this value could most likely be ignored, and that the feature exists to help SharePoint administrators, who can retrieve an item from there without starting a restore. But since I said "most likely", the answer did not pass, and the customer wanted a real percentage number that they could plug into their calculations.

    So I started a PowerShell session on one of their farms in order to retrieve some actual values...

    The Script I wrote for this is here:


     

    asnp *SharePoint* -ErrorAction SilentlyContinue

    $countS1 = 0
    $countS2 = 0
    $SizeS1 = 0
    $SizeS2 = 0
    $SizeSite = 0

    ### Get all sites in the farm
    $sites = Get-SPSite -Limit All
    foreach ($site in $sites)
    {
        $site.Url
        $SizeSite += $site.Usage.Storage / 1MB
        foreach ($item in $site.RecycleBin)
        {
            if ($item.ItemState -eq "SecondStageRecycleBin")
            {
                $countS2++
                $SizeS2 += $item.Size / 1MB
            }
            else #Yes, I know the state can also be "Invalid", but that is ignored here
            {
                $countS1++
                $SizeS1 += $item.Size / 1MB
            } #End If
        } #End Item
        $site.Dispose()
    } #End Sites

    "Site count = " + $sites.Length
    "Total recycled items = " + ($countS1 + $countS2)
    "Total Size Site in MB = " + $SizeSite
    "Total Size Stage 1 in MB = " + $SizeS1
    "Total Size Stage 2 in MB = " + $SizeS2
    "Stage 2 Percentage vs total size = " + ($SizeS2 / $SizeSite) * 100


    And for our test farm, it generated the following output (thousands separators added for clarity, and all site URLs removed)...

    Site count = 7894

    Total recycled items = 333,003

    Total Size Site in MB = 5,222,218.89358997

    Total Size Stage 1 in MB =89,204.0437278748

    Total Size Stage 2 in MB =45,874.8446807861

    Stage 2 Percentage vs total size =0.878455032535985


    Which means that even with a 50% quota buffer, we are using less than a single percent on a farm with almost 5 TB of data. I know that this number can fluctuate a lot, but if your sizing is unable to absorb that, you should be monitoring the farm very closely anyway, and then this should be no issue ;)


    I would consider that a number I would ignore in a sizing calculation, especially since there are other traps that can hurt you much more, such as auditing data, workflows, or whole site collections sitting in the content database waiting for a restore. I would even go as far as suggesting to increase this value from 50% to 100%, which would allow even the recovery of a sub-web, which can contain almost the whole site collection. The additional cost you pay for this is easily absorbed by the increased ease of recovery. Always remember that SharePoint has many options to recover a lost document or site. Make sure you try the built-in options before you start to restore a database, and this includes Restore-SPDeletedSite.
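    To illustrate that last point, a quick sketch of the deleted-site cmdlets; "/sites/teamsite" is a placeholder URL:

    ```powershell
    # Sketch: list site collections still held (deleted) in the content
    # databases, then restore one by its server-relative URL.
    asnp *SharePoint* -ErrorAction SilentlyContinue
    Get-SPDeletedSite | Select-Object Path, DeletionTime
    Get-SPDeletedSite "/sites/teamsite" | Restore-SPDeletedSite
    ```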

     

    Greetings

    Heiko Hatzfeld


  • Create a SharePoint Application Pool using PowerShell

    As you may know, there isn't an out of the box PowerShell cmdlet in SharePoint 2010 or 2013 that can be used to create an Application Pool for a Web Application. New-SPServiceApplicationPool (http://technet.microsoft.com/en-us/library/ff607595(v=office.15).aspx) is available, however it can only be used to create an Application Pool for a Service Application.

    A customer recently asked me to help them to re-configure one of their Web Applications to use a new Application Pool, I put this script together to facilitate this. The script performs the following actions:

    • Creates a new Application Pool
    • Associates the new Application Pool with an existing Web Application

    Three variables need to be updated prior to running the script:

    $WebAppURL is the URL of the Web Application whose Application Pool will be changed.
    $NewAppPoolName is the name of the new Application Pool that will be created.
    $NewAppPoolUserName is the user account that the Application Pool will run under.

    The script will prompt for the credentials of the account specified in $NewAppPoolUserName. This account should be registered as a Managed Account in SharePoint prior to executing the script - New-SPManagedAccount: http://technet.microsoft.com/en-us/library/ff607831(v=office.15).aspx
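    If the account is not yet registered, a one-liner along these lines (prompting for the password) should do it; the account name is a placeholder:

    ```powershell
    # Register the application pool account as a SharePoint managed account.
    asnp *SharePoint* -ErrorAction SilentlyContinue
    New-SPManagedAccount -Credential (Get-Credential "contoso\apppool")
    ```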

    asnp *SharePoint* -ErrorAction SilentlyContinue
    $WebAppURL = "http://WebApp"
    $NewAppPoolName = "NewAppPool"
    $NewAppPoolUserName = "contoso\apppool"

    $Farm = Get-SPFarm
    $Service = $Farm.Services | where {$_.TypeName -eq "Microsoft SharePoint Foundation Web Application"}
    $Password = Read-Host -Prompt "Please enter your password" -AsSecureString
    $NewAppPool = New-Object Microsoft.SharePoint.Administration.SPApplicationPool($NewAppPoolName,$Service)
    $NewAppPool.CurrentIdentityType = "SpecificUser"
    $NewAppPool.Username = $NewAppPoolUserName
    $NewAppPool.SetPassword($Password)
    $NewAppPool.Provision()
    $NewAppPool.Update($true)

    $NewAppPool = $Service.ApplicationPools[$NewAppPoolName]
    $WebApp = Get-SPWebApplication $WebAppURL
    $WebApp.ApplicationPool = $NewAppPool
    $WebApp.Update()
    $WebApp.ProvisionGlobally()


    Brendan Griffin - @brendankarl

  • Office 365 - Exporting Site Collection Search Configuration using CSOM with PowerShell

    Chris O'Brien has a fantastic Blog post - Using CSOM in PowerShell Scripts with Office 365: http://www.sharepointnutsandbolts.com/2013/12/Using-CSOM-in-PowerShell-scripts-with-Office365.html. One of the examples that he provides shows how to import search configuration from an XML file; this is a new feature in SharePoint 2013 that is documented here - http://technet.microsoft.com/en-us/library/jj871675(v=office.15).aspx.

    I've put together a PowerShell script that can be used to export the search configuration; using this along with Chris's script is a great way to copy search configuration between Site Collections.

    Three variables need to be updated prior to running the script: $User is the username of a tenant administrator, $SiteURL is the URL of the Site within the tenant to export the search configuration from, and $Schema is the location of the file to write the configuration to. The $Scope variable can be left at its default of "SPSite".

    #Please install the SharePoint client components SDK - http://www.microsoft.com/en-us/download/details.aspx?id=35585 prior to running this script.

    #Specify tenant admin, site URL and scope to export from
    $User = "admin@tenant.onmicrosoft.com"
    $SiteURL = "https://tenant.sharepoint.com/sites/site"
    $Scope = "SPSite"
    $Schema = "D:\SearchSchema.XML"

    #Add references to SharePoint client assemblies and authenticate to Office 365 site - required for CSOM
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.dll"
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Search.dll"

    $Password = Read-Host -Prompt "Please enter your password" -AsSecureString
    $Creds = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($User,$Password)

    #Export search configuration
    $Context = New-Object Microsoft.SharePoint.Client.ClientContext($SiteURL)
    $Context.Credentials = $Creds
    $Owner = New-Object Microsoft.SharePoint.Client.Search.Administration.SearchObjectOwner($Context,$Scope)
    $Search = New-Object Microsoft.SharePoint.Client.Search.Portability.SearchConfigurationPortability($Context)
    $SearchConfig = $Search.ExportSearchConfiguration($Owner)
    $Context.ExecuteQuery()
    $SearchConfig.Value > $Schema
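    For the reverse direction (the job Chris's script performs), the matching call is ImportSearchConfiguration. A sketch, assuming $Context and $Owner have been built as above but against the destination site, and $Schema points at the exported XML:

    ```powershell
    # Sketch: import a previously exported search configuration into the
    # site that $Context is bound to.
    $Search = New-Object Microsoft.SharePoint.Client.Search.Portability.SearchConfigurationPortability($Context)
    $Search.ImportSearchConfiguration($Owner,[System.IO.File]::ReadAllText($Schema))
    $Context.ExecuteQuery()
    ```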


    Brendan Griffin - @brendankarl

  • Office 365 - Automating the Creation of a Design Manager Package using CSOM with PowerShell

    One of the really cool features in SharePoint 2013 is the Design Manager, further information on this feature can be found here: Overview of Design Manager in SharePoint 2013 - http://msdn.microsoft.com/en-us/library/jj822363.aspx.

    Design Manager provides the ability to export a Design Manager Package so that customizations can be easily copied to another Site Collection and re-used, potentially saving a lot of time and hassle. The package itself is a sandbox solution using the WSP file format. A screenshot of the option can be found below:

    Wouldn't it be cool if you could automate the creation of a Design Manager Package using PowerShell? Below is a PowerShell script that uses CSOM to perform this very task! The resultant package is saved to a local drive for re-use.

    Three variables need to be updated. $Username is the username of an administrator of the Site Collection, $Site is the URL of the Site Collection and $DestinationDir is the local directory to save the exported package to.

    #Please install the SharePoint client components SDK - http://www.microsoft.com/en-us/download/details.aspx?id=35585
    $Username = "admin@tenant.onmicrosoft.com"
    $Site = "https://tenant.sharepoint.com/sites/site"
    $DestinationDir = "D:\Downloads\"

    #Add references to SharePoint client assemblies and authenticate to Office 365 site
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.dll"
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Publishing.dll"
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"
    $Password = Read-Host -Prompt "Please enter your password" -AsSecureString
    $Context = New-Object Microsoft.SharePoint.Client.ClientContext($Site)
    $Creds = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($Username,$Password)
    $Context.Credentials = $Creds

    $SC = $Context.Site
    $Context.Load($SC)
    $Context.ExecuteQuery()

    $RootWeb = $SC.RootWeb
    $Context.Load($RootWeb)
    $Context.ExecuteQuery()

    $DP = [Microsoft.SharePoint.Client.Publishing.DesignPackage]::ExportEnterprise($Context,$SC,$False)
    $Context.ExecuteQuery()

    #Download Design Package
    $Package =  $SC.ServerRelativeUrl + "/_catalogs/Solutions/" + $RootWeb.Title + "-1.0.wsp"
    $Destination =  $DestinationDir + $RootWeb.Title + "-1.0.wsp"
    $FileInfo = [Microsoft.SharePoint.Client.File]::OpenBinaryDirect($Context,$Package)
    $WriteStream = [System.IO.File]::Open($Destination,[System.IO.FileMode]::Create);
    $FileInfo.Stream.CopyTo($WriteStream)
    $WriteStream.Close();

    Brendan Griffin - @brendankarl

  • Office 365 - Automating the Creation of Managed Metadata Groups, Term Sets and Terms using CSOM with PowerShell

    I've previously written a Blog post on how to output MMS Term Sets and Terms using CSOM - http://blogs.technet.com/b/fromthefield/archive/2014/03/03/office-365-output-managed-metadata-term-sets-and-terms-using-csom.aspx. This next script example can be used to automate the creation of MMS Groups, Term Sets and Terms. The only reason that I wrote this script was for an excuse to improve my understanding of the CSOM API for MMS!

    The script itself takes input in the form of an XML file, example below:

    The script will create the MMS Groups, Term Sets and Terms contained within the XML file. I have attached an example XML file for reference.
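    The attached example file is not reproduced here, but a hypothetical input of the shape the script expects (inferred from the element names the loops walk: Groups/Group, TermSets/TermSet, Terms/Term) would look something like this:

    ```xml
    <!-- Hypothetical example input; group, term set and term names are
         placeholders. -->
    <Groups>
      <Group Name="Geography">
        <TermSets>
          <TermSet Name="Countries">
            <Terms>
              <Term>United Kingdom</Term>
              <Term>France</Term>
            </Terms>
          </TermSet>
        </TermSets>
      </Group>
    </Groups>
    ```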

    Three variables need to be updated prior to running the script: $User is the username of a tenant administrator, $Site is the URL of any Site within the tenant - this is simply used to bind to the Managed Metadata Service - and $Import, which loads the XML input file.

    #Please install the SharePoint client components SDK - http://www.microsoft.com/en-us/download/details.aspx?id=35585 prior to running this script.

    #Specify tenant admin and URL
    $User = "admin@tenant.onmicrosoft.com"
    $Site = "https://tenant.sharepoint.com/sites/CSOM"
    [XML]$Import = Get-Content "D:\Example.xml"

    #Add references to SharePoint client assemblies and authenticate to Office 365 site - required for CSOM
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.dll"
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Taxonomy.dll"
    $Password = Read-Host -Prompt "Please enter your password" -AsSecureString

    #Bind to MMS
    $Context = New-Object Microsoft.SharePoint.Client.ClientContext($Site)
    $Creds = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($User,$Password)
    $Context.Credentials = $Creds
    $MMS = [Microsoft.SharePoint.Client.Taxonomy.TaxonomySession]::GetTaxonomySession($Context)
    $Context.Load($MMS)
    $Context.ExecuteQuery()

    #Retrieve Term Stores
    $TermStores = $MMS.TermStores
    $Context.Load($TermStores)
    $Context.ExecuteQuery()

    #Bind to Term Store
    $TermStore = $TermStores[0]
    $Context.Load($TermStore)
    $Context.ExecuteQuery()

    Foreach ($GroupName in $Import.Groups.Group)
    {
    #Create Groups
    $Group = $TermStore.CreateGroup($GroupName.Name,[System.Guid]::NewGuid().toString())
    $Context.Load($Group)
    $Context.ExecuteQuery()
        Foreach ($TermSetName in $GroupName.TermSets.TermSet)
            {
            #Create Term Sets
            $TermSet = $Group.CreateTermSet($TermSetName.Name,[System.Guid]::NewGuid().toString(),1033)
            $Context.Load($TermSet)
            $Context.ExecuteQuery()
            Foreach ($TermName in $TermSetName.Terms.Term)
                {
                #Create Terms
                $TermAdd = $TermSet.CreateTerm($TermName,1033,[System.Guid]::NewGuid().toString())
                $Context.Load($TermAdd)
                $Context.ExecuteQuery()
                }
            }
    }

    Below is an example of how the Term Store will look after running the script with Example.XML file:

    Brendan Griffin - @brendankarl

  • SharePoint 2013 - December CU - Exception: No mapping between account names and security IDs was done

    I was recently helping a colleague to troubleshoot an issue that he was running into whilst deploying the December 2013 CU for SharePoint 2013; the following error was being logged in the upgrade log.

    Exception: No mapping between account names and security IDs was done 00000000-0000-0000-0000-000000000000
    03/03/2014 18:46:19.17 PSCONFIG (0x0EA8) 0x0BCC SharePoint Foundation Upgrade SPUpgradeSession ajxnm ERROR    at Microsoft.Office.Server.Utilities.WindowsSecurity.LookupAccountName(String accountName, String& domainName, SID_NAME_USE& use)     at Microsoft.Office.Server.Utilities.WindowsSecurity.GetNT4AccountName(String name)     at Microsoft.Office.Server.Data.SqlServerManager.GrantLogin(String user)     at Microsoft.Office.Server.Search.Administration.SearchDatabase.GrantAccess(String username, String role)     at Microsoft.Office.Server.Search.Administration.SearchDatabase.SynchronizeAccessRules(SearchServiceApplication searchApp)     at Microsoft.Office.Server.Search.Administration.SearchServiceApplication.SynchronizeDatabases()     at Microsoft.Office.Server.Search.Upgrade.SearchAdminDatabaseSequence.PostUpgrade()     at Microsoft.SharePoint.Upgrade.SPUpgradeSession.Upgrade(Object o, Boolean bRecurse)


    I had a look to see what the Microsoft.Office.Server.Search.Administration.SearchDatabase.SynchronizeAccessRules() method was doing. It turns out that this method (amongst other things) validates each user or group that has been added as an Administrator of the Search Service Application. If a user account or group no longer exists, the lookup fails, which causes the above exception and the upgrade process to fail. The fix is fairly simple: remove any users or groups that no longer exist from the permissions list of the Search Service Application within Central Administration. I have written a PowerShell script that outputs all of the account types that are verified by the method. Whilst the issue could be with one of the other three types of accounts that the method validates, it's not likely, as Search or the Farm will be seriously broken if they are configured to use an account that no longer exists!

    • Search Service Application Administrators
    • Search Service Process Identity
    • Search Service Application Pool
    • Timer Service Identity

    The script makes the presumption that you have a single Search Service Application in the farm.

    asnp *sharepoint* -ea SilentlyContinue
    $SSA = Get-SPEnterpriseSearchServiceApplication
    $SSAAdmins = $SSA.GetAdministrationAccessControl().AccessRules.Name
    $SearchService = (Get-SPEnterpriseSearchService).ProcessIdentity
    $AppPool = $SSA.ApplicationPool.ProcessAccountName
    $TimerService = (Get-SPFarm).TimerService.ProcessIdentity.UserName
    Write-Host SSA Admins: $SSAAdmins -ForegroundColor Green
    Write-Host Search Service: $SearchService -ForegroundColor Green
    Write-Host App Pool: $AppPool -ForegroundColor Green
    Write-Host Timer Service: $TimerService -ForegroundColor Green



    Brendan Griffin - @brendankarl

  • Office 365 - Create Managed Metadata Terms using CSOM with PowerShell

    In a follow up to my previous post - "Office 365 - Output Managed Metadata Term Sets and Terms using CSOM with PowerShell" - http://blogs.technet.com/b/fromthefield/archive/2014/03/03/office-365-output-managed-metadata-term-sets-and-terms-using-csom.aspx, I've written a sample PowerShell script that can be used to add Terms to a specific Term Set. The script uses a simple text file as input with each Term to be added listed on a separate line:

    Five variables need to be updated prior to running the script: $User is the username of a tenant administrator, $Site is the URL of any Site within the tenant - this is simply used to bind to the Managed Metadata Service, $GroupName is the name of the group that contains the Term Set you wish to add Terms to, $TermSetName is the name of that Term Set, and $Terms is the location of the text file that contains the Terms to be added.

    #Please install the SharePoint client components SDK - http://www.microsoft.com/en-us/download/details.aspx?id=35585 prior to running this script.

    #Specify tenant admin and URL
    $User = "admin@tenant.onmicrosoft.com"
    $Site = "https://tenant.sharepoint.com/sites/site"
    $GroupName = "Group"
    $TermSetName = "Term Set"
    $Terms = Get-Content "D:\terms.txt"

    #Add references to SharePoint client assemblies and authenticate to Office 365 site - required for CSOM
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.dll"
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Taxonomy.dll"
    $Password = Read-Host -Prompt "Please enter your password" -AsSecureString

    #Bind to MMS
    $Context = New-Object Microsoft.SharePoint.Client.ClientContext($Site)
    $Creds = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($User,$Password)
    $Context.Credentials = $Creds
    $MMS = [Microsoft.SharePoint.Client.Taxonomy.TaxonomySession]::GetTaxonomySession($Context)
    $Context.Load($MMS)
    $Context.ExecuteQuery()

    #Retrieve Term Stores
    $TermStores = $MMS.TermStores
    $Context.Load($TermStores)
    $Context.ExecuteQuery()

    #Bind to Term Store
    $TermStore = $TermStores[0]
    $Context.Load($TermStore)
    $Context.ExecuteQuery()

    #Bind to Group
    $Group = $TermStore.Groups.GetByName($GroupName)
    $Context.Load($Group)
    $Context.ExecuteQuery()

    #Bind to Term Set
    $TermSet = $Group.TermSets.GetByName($TermSetName)
    $Context.Load($TermSet)
    $Context.ExecuteQuery()

    #Create Terms
    Foreach ($Term in $Terms)
        {
        $TermAdd = $TermSet.CreateTerm($Term,1033,[System.Guid]::NewGuid().toString())
        $Context.Load($TermAdd)
        $Context.ExecuteQuery()
        }


    Below is a screenshot of the result:

    Brendan Griffin - @brendankarl

  • Office 365 - Output Managed Metadata Term Sets and Terms using CSOM with PowerShell

    This PowerShell script will connect to an O365 SharePoint tenant and output the following information to the console from the Managed Metadata Term Store:

    • Groups
    • Term Sets
    • Terms

    As with my previous CSOM scripts, this is more a sample to get you started than something you would use in production. My term store has a relatively small number of terms, and I'm not sure how this script will behave with hundreds or thousands of terms!

    Three variables need to be updated prior to running the script: $User is the username of a tenant administrator, $TenantURL is the URL of the Tenant Admin Site and $Site is the URL of any Site within the tenant - this is simply used to bind to the Managed Metadata Service.

    #Please install the SharePoint client components SDK - http://www.microsoft.com/en-us/download/details.aspx?id=35585 prior to running this script.

    #Specify tenant admin and URL
    $User = "admin@tenant.onmicrosoft.com"
    $TenantURL = "https://tenant-admin.sharepoint.com"
    $Site = "https://tenant.sharepoint.com/sites/site"

    #Add references to SharePoint client assemblies and authenticate to Office 365 site - required for CSOM
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.dll"
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Taxonomy.dll"
    $Password = Read-Host -Prompt "Please enter your password" -AsSecureString

    #Bind to MMS
    $Context = New-Object Microsoft.SharePoint.Client.ClientContext($Site)
    $Creds = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($User,$Password)
    $Context.Credentials = $Creds
    $MMS = [Microsoft.SharePoint.Client.Taxonomy.TaxonomySession]::GetTaxonomySession($Context)
    $Context.Load($MMS)
    $Context.ExecuteQuery()

    #Retrieve Term Stores
    $TermStores = $MMS.TermStores
    $Context.Load($TermStores)
    $Context.ExecuteQuery()

    #Bind to Term Store
    $TermStore = $TermStores[0]
    $Context.Load($TermStore)
    $Context.ExecuteQuery()

    #Retrieve Groups
    $Groups = $TermStore.Groups
    $Context.Load($Groups)
    $Context.ExecuteQuery()

    #Retrieve TermSets in each group
    Foreach ($Group in $Groups)
        {
        $Context.Load($Group)
        $Context.ExecuteQuery()
        Write-Host "Group Name:" $Group.Name -ForegroundColor Green
        $TermSets = $Group.TermSets
        $Context.Load($TermSets)
        $Context.ExecuteQuery()
        Foreach ($TermSet in $TermSets)
            {
            Write-Host "      Term Set Name:"$TermSet.Name -ForegroundColor Yellow
            Write-Host "        Terms:" -ForegroundColor DarkCyan
            $Terms = $TermSet.Terms
            $Context.Load($Terms)
            $Context.ExecuteQuery()
            Foreach ($Term in $Terms)
                {
                Write-Host "            " $Term.Name -ForegroundColor White
                }
            }
        }


    Below is an example of the output from the script:



    In my next post I plan to demonstrate how PowerShell can be used to create Term Sets and Terms using CSOM.

    Brendan Griffin - @brendankarl

  • PowerShell Snippets


    As Premier Field Engineers, we often find that our customers need assistance creating PowerShell scripts that can help automate administrative processes, or retrieve information from their various SharePoint farms.

    Whilst each of our customers is different and has a unique environment, many scripts have similar, if not identical, requirements regardless of the environment or the actions that the script will perform.

    This blog post (the first of a series of blog posts that I am aiming to write) takes a look at some of these common scripting requirements, with the aim of providing you with some useful ‘snippets’ of code. As with all blog posts that we write, the code has not been tested in your environment, so do use caution and fully test anything that you use.

     

    (1) Event Logging

    “When we run a script, we want to ensure that we log/capture who runs it in the server event log…”

    This requirement is pretty common. Being able to keep a record of who is running what on your servers is useful; not just for those ‘oops’ moments, but really useful in general :)

     

    Script example: Creating a new entry in the Application Log

    # Get the current logged in user
    $userObj = [System.Security.Principal.WindowsIdentity]::GetCurrent()
    $user = $userObj.Name

    # Set up the event log variables
    $eventLog = Get-EventLog -List | Where-Object {$_.Log -eq 'Application'}
    $eventLog.MachineName = "."
    $eventLog.Source = "Contoso IT Department"

    # Write an entry into the Application Log (the source must already be
    # registered - New-EventLog can create it if it does not exist)
    $eventLog.WriteEntry("The script was started by $user.","Information",2014)

     


    Fig 1: Log entry created in Application log


     

    Fig 2: Log entry detail


     

    (2) Showing Script Progress

    “When we run the script, we want to be able to see its progress rather than stare at a blinking cursor…”

    This is a pretty common requirement too; especially on scripts that query a large collection of objects. The good news is that PowerShell provides us with a great cmdlet to do just this (Write-Progress):

    Script example: iterating through all sub-webs in a site collection

    $collWeb = (Get-SPSite -Identity http://intranet.contoso.com/sites/sitecollection1).AllWebs

    [int]$i = 0

    Foreach ($web in $collWeb)
    {
        $i++
        Write-Progress -Activity "Collecting all sub webs" -Status "Status: " -PercentComplete (($i / $collWeb.Count) * 100)
    }

     

    Fig 3: Showing progress bar


     

    That’s all for now, I hope that this has been useful.

    Cheers, Steve
    @moss_sjeffery

  • Office 365 - Retrieve User Profile Properties using CSOM with PowerShell

    Update 12/04/14 - I have added an example that demonstrates how to export User Profile Properties to a CSV file at the end of this post.

    The example PowerShell script below can be used to retrieve all user profile properties for all users within a Site Collection. It doesn't appear to be possible to connect to the User Profile Service Application and retrieve profile properties for all users in the tenant using CSOM; the only approach is to perform this at the Site Collection level. You could of course add a ForEach loop to iterate through all Site Collections, however additional effort would be required to remove duplicate profile data from the output, as each user is likely to have permission to multiple Site Collections and would therefore be retrieved multiple times.

    It requires three variables to be updated - $User which is the tenant admin, $TenantURL which is the tenant admin URL and $SiteURL which is the URL for the Site Collection that you wish to retrieve User Profile data from.

    #Please install the SharePoint client components SDK - http://www.microsoft.com/en-us/download/details.aspx?id=35585 prior to running this script.

    #Specify tenant admin and URL
    $User = "admin@tenant.onmicrosoft.com"
    $TenantURL = "https://tenant-admin.sharepoint.com"

    #Configure Site URL and User
    $SiteURL = "https://tenant.sharepoint.com/sites/site"

    #Add references to SharePoint client assemblies and authenticate to Office 365 site - required for CSOM
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.dll"
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.UserProfiles.dll"
    $Password = Read-Host -Prompt "Please enter your password" -AsSecureString
    $Creds = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($User,$Password)

    #Bind to Site Collection
    $Context = New-Object Microsoft.SharePoint.Client.ClientContext($SiteURL)
    $Context.Credentials = $Creds

    #Identify users in the Site Collection
    $Users = $Context.Web.SiteUsers
    $Context.Load($Users)
    $Context.ExecuteQuery()

    #Create People Manager object to retrieve profile data
    $PeopleManager = New-Object Microsoft.SharePoint.Client.UserProfiles.PeopleManager($Context)
    Foreach ($User in $Users)
        {
        $UserProfile = $PeopleManager.GetPropertiesFor($User.LoginName)
        $Context.Load($UserProfile)
        $Context.ExecuteQuery()
        If ($UserProfile.Email -ne $null)
            {
            Write-Host "User:" $User.LoginName -ForegroundColor Green
            $UserProfile.UserProfileProperties
            Write-Host ""
            } 
        }


    Here is an example of the output.

    If you only need to retrieve a selection of user profile properties rather than everything, the script can be easily updated - simply replace $UserProfile.UserProfileProperties with $UserProfile | Select (Property Names). The following properties can be retrieved using this approach:

    • AccountName
    • DirectReports
    • DisplayName
    • Email
    • PersonalURL
    • PictureURL
    • Title
    • UserURL

    To retrieve AccountName,Email and PictureURL the following command can be used - $UserProfile | Select AccountName,Email,PictureURL

    If you need to retrieve other profile properties you will need to use the following approach - $UserProfile.UserProfileProperties.PropertyName, for example to retrieve the department for a user the following can be used $UserProfile.UserProfileProperties.Department.
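    Putting both approaches together, the selection inside the loop might look like this (Department is read via the UserProfileProperties dictionary; the others are direct properties):

    ```powershell
    # Sketch: print a few selected properties for the current $UserProfile
    # inside the Foreach loop above.
    $UserProfile | Select AccountName,Email,PictureURL
    $UserProfile.UserProfileProperties.Department
    ```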

    Below is an example of how to export User Profile Properties to a CSV file - which you may find a little more useful than outputting to the console. Simply replace the last section of the script above with the script below, you need to update the $Output variable to specify the location to output the CSV file to.

    If you need to add any additional properties, two lines in the script need to be updated. The first is the $Headings line, which specifies the headings of the CSV file; their order must exactly match the properties specified in the second line.

    The second is the $Properties line, which specifies the actual properties to output for each user. The $UserProfile and $UPP variables can be used to select the properties; $UPP is $UserProfile.UserProfileProperties. My example contains both types of properties.

    #Specify the output file and write the CSV headings
    $Output = "D:\Output.csv"
    $Headings = "Name","Email","OneDrive URL","Phone","Job Title","Department"
    $Headings -join "," | Out-File -Encoding default -FilePath $Output

    #Create People Manager object to retrieve profile data
    $PeopleManager = New-Object Microsoft.SharePoint.Client.UserProfiles.PeopleManager($Context)
    Foreach ($User in $Users)
        {
        $UserProfile = $PeopleManager.GetPropertiesFor($User.LoginName)
        $Context.Load($UserProfile)
        $Context.ExecuteQuery()
        If ($UserProfile.Email -ne $null)
            {
            $UPP = $UserProfile.UserProfileProperties
            $Properties = $UserProfile.DisplayName,$UserProfile.Email,$UserProfile.PersonalUrl, $UPP.WorkPhone,$UPP.'SPS-JobTitle',$UPP.Department
            $Properties -join "," | Out-File -Encoding default -Append -FilePath $Output
            } 
        }

    The example outputs the following properties:

    • Name
    • Email
    • OneDrive URL
    • Phone
    • Job Title
    • Department
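    For example, to add an extra column to the export for the Office profile property (an illustrative addition - this assumes the default Office property exists in your tenant), the two lines discussed above would become:

    #Add an "Office" column heading to the CSV
    $Headings = "Name","Email","OneDrive URL","Phone","Job Title","Department","Office"

    #Append the matching property, keeping the same order as the headings
    $Properties = $UserProfile.DisplayName,$UserProfile.Email,$UserProfile.PersonalUrl,$UPP.WorkPhone,$UPP.'SPS-JobTitle',$UPP.Department,$UPP.Office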

    Here is the output file that is created:

    Hopefully this gives you enough basic information to make a start using PowerShell to retrieve user profile information.

    Brendan Griffin - @brendankarl

  • Office 365 - PowerShell Script to Upload Files to a Document Library using CSOM

    UPDATE: The script now supports uploading files larger than 2MB.

    Another PowerShell sample script for you. This one uploads all files within a specified local directory to a Document Library within a Site in an O365 tenant.

    All you need to run this script is an O365 tenant, the SharePoint client components SDK installed on the machine running the script - http://www.microsoft.com/en-us/download/details.aspx?id=35585 - and to update the $User, $SiteURL, $DocLibName (name of the destination Document Library) and $Folder (path to the local folder containing the files to upload) variables. When the script is executed it will prompt for the password of the user specified in the $User variable.

    One thing to point out is that CSOM has a maximum upload size of 2MB when the file body is passed inline as a byte array; the script below avoids this by assigning an open FileStream to the ContentStream property, which streams the upload instead.
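    A minimal sketch of the difference between the two upload styles (assuming a $FileCreationInfo object and a $File item as in the loop below):

    #Small files only - the whole file travels inline in the CSOM request (2MB limit)
    $FileCreationInfo.Content = [System.IO.File]::ReadAllBytes($File.FullName)

    #Large files - the content is streamed, avoiding the 2MB request limit
    $FileCreationInfo.ContentStream = New-Object IO.FileStream($File.FullName,[System.IO.FileMode]::Open)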

    #Specify tenant admin and site URL
    $User = "admin@tenant.onmicrosoft.com"
    $SiteURL = "https://tenant.sharepoint.com/sites/site"
    $Folder = "C:\FilesToUpload"
    $DocLibName = "DocLib"

    #Add references to SharePoint client assemblies and authenticate to Office 365 site - required for CSOM
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.dll"
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"
    $Password = Read-Host -Prompt "Please enter your password" -AsSecureString

    #Bind to site collection
    $Context = New-Object Microsoft.SharePoint.Client.ClientContext($SiteURL)
    $Creds = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($User,$Password)
    $Context.Credentials = $Creds

    #Retrieve list
    $List = $Context.Web.Lists.GetByTitle($DocLibName)
    $Context.Load($List)
    $Context.ExecuteQuery()

    #Upload each file in the folder
    Foreach ($File in (Get-ChildItem $Folder))
        {
        $FileStream = New-Object IO.FileStream($File.FullName,[System.IO.FileMode]::Open)
        $FileCreationInfo = New-Object Microsoft.SharePoint.Client.FileCreationInformation
        $FileCreationInfo.Overwrite = $true
        $FileCreationInfo.ContentStream = $FileStream
        $FileCreationInfo.URL = $File.Name
        $Upload = $List.RootFolder.Files.Add($FileCreationInfo)
        $Context.Load($Upload)
        $Context.ExecuteQuery()
        #Close the file handle once the upload completes
        $FileStream.Dispose()
        }


    Brendan Griffin

  • Office 365 - PowerShell Script to Create a List, Add Fields and Change the Default View all using CSOM

    In my continued quest to get to grips with the Client Side Object Model (CSOM) in SharePoint 2013, I have put together a sample script below that connects to a Site Collection within an O365 tenant and does the following:

    • Creates a list using the "Custom" list template
    • Adds two Site Columns to the list (City and Company)
    • Adds these fields to the default view
    • Adds an item to the list

    You may find this useful as a reference! The usual disclaimers apply :)

    All you need to run this script is an O365 tenant, the SharePoint client components SDK installed on the machine running the script - http://www.microsoft.com/en-us/download/details.aspx?id=35585 - and to update the $User, $SiteURL and $ListTitle variables. When the script is executed it will prompt for the password of the user specified in the $User variable.

    #Specify tenant admin and site URL
    $User = "admin@tenant.onmicrosoft.com"
    $SiteURL = "https://tenant.sharepoint.com/sites/site"
    $ListTitle = "List Title"

    #Add references to SharePoint client assemblies and authenticate to Office 365 site - required for CSOM
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.dll"
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"
    $Password = Read-Host -Prompt "Please enter your password" -AsSecureString

    #Bind to site collection
    $Context = New-Object Microsoft.SharePoint.Client.ClientContext($SiteURL)
    $Creds = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($User,$Password)
    $Context.Credentials = $Creds

    #Retrieve lists
    $Lists = $Context.Web.Lists
    $Context.Load($Lists)
    $Context.ExecuteQuery()

    #Create list with "custom" list template
    $ListInfo = New-Object Microsoft.SharePoint.Client.ListCreationInformation
    $ListInfo.Title = $ListTitle
    $ListInfo.TemplateType = 100 #100 is the "Custom List" template
    $List = $Context.Web.Lists.Add($ListInfo)
    $List.Description = $ListTitle
    $List.Update()
    $Context.ExecuteQuery()

    #Retrieve site columns (fields)
    $SiteColumns = $Context.Web.AvailableFields
    $Context.Load($SiteColumns)
    $Context.ExecuteQuery()

    #Grab city and company fields
    $City = $SiteColumns | Where {$_.Title -eq "City"}
    $Company = $SiteColumns | Where {$_.Title -eq "Company"}
    $Context.Load($City)
    $Context.Load($Company)
    $Context.ExecuteQuery()

    #Add fields to the list
    $List.Fields.Add($City)
    $List.Fields.Add($Company)
    $List.Update()
    $Context.ExecuteQuery()

    #Add fields to the default view
    $DefaultView = $List.DefaultView
    $DefaultView.ViewFields.Add("City")
    $DefaultView.ViewFields.Add("Company")
    $DefaultView.Update()
    $Context.ExecuteQuery()

    #Adds an item to the list
    $ListItemInfo = New-Object Microsoft.SharePoint.Client.ListItemCreationInformation
    $Item = $List.AddItem($ListItemInfo)
    $Item["Title"] = "New Item"
    $Item["Company"] = "Contoso"
    $Item["WorkCity"] = "London"
    $Item.Update()
    $Context.ExecuteQuery()


    Brendan Griffin

  • Download language packs for SharePoint 2013

    Hi all,

    I've been working with a customer recently to help them deploy a multi-lingual SharePoint 2013 environment.
    This can be a little time consuming, so I helped automate this process with PowerShell:

    Here is the script:

    # Import the BITS module
    Import-Module BitsTransfer

    # Path to download the language packs to
    $downloadPath = "C:\SPLanguagePacks"

    # Hashtable of language packs to download
    $lPacks = @{
        Arabic      = "http://download.microsoft.com/download/3/2/C/32C97E8A-E1C4-4BC3-B4B5-1E85B2E0A571/serverlanguagepack.img"
        ChineseSimp = "http://download.microsoft.com/download/4/7/7/477BFB7A-C9C2-4B1D-8408-D70D4AF52DBA/serverlanguagepack.img"
        ChineseTrad = "http://download.microsoft.com/download/F/2/D/F2D67EBD-C9AE-482E-83FA-C4669F058073/serverlanguagepack.img"
        English     = "http://download.microsoft.com/download/7/E/C/7EC7E73F-F172-453F-877C-640AF0B82D26/serverlanguagepack.img"
        Kazakh      = "http://download.microsoft.com/download/0/D/1/0D1FD1A6-9104-4E57-A531-DDD26EE82E8F/serverlanguagepack.img"
        Korean      = "http://download.microsoft.com/download/0/0/D/00D60DFF-E7D2-4EA2-BF6B-0FD591ED7AC3/serverlanguagepack.img"
    }

    # Loop through each hashtable entry
    $lPacks.GetEnumerator() | ForEach-Object {
        $lang = $_.Name
        $link = $_.Value

        # Create a folder for each language
        $destination = New-Item -Path "$downloadPath\$lang" -ItemType Directory

        # Start the download of the language pack
        Start-BitsTransfer -Source $link -Destination $destination -DisplayName "Downloading $lang SharePoint Language Pack" -Priority High
    }
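
    Once the downloads complete, the .img files need to be mounted to get at the installer. As a hedged sketch for Windows Server 2012 and later (the Mount-DiskImage cmdlet expects an .iso extension, so the file is renamed first - paths here match the $downloadPath above and one example language):

    #The language pack .img files are ISO images - rename so Mount-DiskImage accepts them
    Rename-Item "C:\SPLanguagePacks\Arabic\serverlanguagepack.img" "serverlanguagepack.iso"

    #Mount the image and work out which drive letter it received
    $image = Mount-DiskImage -ImagePath "C:\SPLanguagePacks\Arabic\serverlanguagepack.iso" -PassThru
    $driveLetter = ($image | Get-Volume).DriveLetter

    #Run the language pack installer from the mounted image, then detach it
    Start-Process "${driveLetter}:\setup.exe" -Wait
    Dismount-DiskImage -ImagePath "C:\SPLanguagePacks\Arabic\serverlanguagepack.iso"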