Partner Technical Services Blog

A worldwide group of consultants who focus on helping Microsoft Partners succeed throughout the business cycle.


  • Embedding a PowerPoint Deck on SharePoint 2010

    (Post dedicated to Nuri, Operations Manager for our delivery team in EMEA, and courtesy Sean Earp)

    With the addition of PowerPoint Web App to SharePoint 2010, you can now view and edit PowerPoint presentations directly from within your browser.  This technology has also been made available on consumer web services.


    In the past, it has been difficult to embed a PowerPoint document within a webpage, requiring workarounds such as saving the presentation as pictures, PDFs, or MHT documents.  If you have a public presentation, it is now extremely easy to embed a PowerPoint deck on any web page by following the steps in the aptly named “how to embed a PowerPoint presentation on a web page” post.

    Unfortunately, these steps do not work if your installation of PowerPoint Web App is local.  The Share –> Embed option available on the consumer service is simply not present in SharePoint 2010.


    So what to do if you want to embed an internal, private, or confidential PowerPoint presentation on an internal SharePoint page?  Fortunately, it is possible to embed a presentation on a webpage without posting the presentation on a broadly available public site.

    Step 1: Ensure that Office Web Apps have been installed and configured on SharePoint 2010.  Those steps are out of scope for this article, but the official documentation should be all you need:  Deploy Office Web Apps (Installed on SharePoint 2010 Products)

    Step 2: Upload the PowerPoint to a document library


    Step 3: Click on the PowerPoint deck to open it in PowerPoint Web App.  It will open with a long URL in the address bar.



    Don’t worry about writing down the URL. Unfortunately, you can’t paste it into a Page Viewer web part without getting an error message.  So… a little magic to get the URL we need to embed our PowerPoint deck on our SharePoint Page.

    Step 4: Open the Developer Tools in Internet Explorer (F12), and search for iframe.


    Step 5: Copy the first result into your text editor of choice.  The magic URL you need is the one within the src attribute.


    Step 6: Delete everything except the part inside the quotes.  Before the PowerPointFrame.aspx, add the relative URL to your site collection _layouts directory, and copy the whole URL into your clipboard.
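
    For reference, the URL you end up with in steps 4–6 follows this general pattern. The host and site path below are hypothetical, and the query string varies by installation, which is exactly why you copy it from the iframe src rather than typing it by hand:

```
http://sharepoint/sites/sales/_layouts/PowerPointFrame.aspx?<query-string copied from the iframe src>
```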


    Step 7: Go to the SharePoint page you want to embed the PowerPoint into.  Add a Page Viewer Web Part to the page, and open the tool pane for the web part.


    Step 8: In the Page Viewer tool pane, paste in the URL, and optionally enter a height/width and chrome state for the PowerPoint deck.


    Step 9: Hit “OK” and be awed at how awesome it looks to have a fully functional PowerPoint deck embedded on your page.  You can view the deck full screen by clicking “Start Slide Show”, change slides, view notes, click links, or click the “popout” button to have the deck open up in a popout window.


    Super-secret-squirrel trick: If you want the deck to default to a slide other than the cover slide, click through to the slide you want, and then click the popout button in the top right of the PowerPoint Web App.  The deck will be open to that slide in its own window. 

    Use the same Developer Tools trick from step 4, but this time search for &SlideId.  You will see the URL has two added parameters: a slide ID and popout=1 (the URL will end with something like &SlideId=590&popout=1).  You can guess what popout=1 does, and the SlideId is some sort of internal reference to the slide (I have no idea how it is generated, but it doesn’t matter; my web app-fu will work just the same).  Just copy the &SlideId=somenumber and paste it at the end of your URL in the Page Viewer web part, and now your web page will display the PowerPoint deck starting on whatever slide you specified!
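
    Putting it together, the final Page Viewer URL looks something like this (hypothetical host and path; the SlideId value 590 is just the example from above):

```
http://sharepoint/sites/sales/_layouts/PowerPointFrame.aspx?<query-string copied from the iframe src>&SlideId=590
```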

    Additional Resources

    Office Web Apps technical library

  • SharePoint and Exchange Calendar together

    (post courtesy Anand Nigam)

    One of the cool things in SharePoint 2010 is the ability to show the Exchange calendar on a SharePoint site, side by side with a SharePoint calendar. This is called Calendar Overlay.

    This post will walk through how to configure this.

    Step 1 (prerequisite)

    1. I have a SharePoint Site http://fabrikam which looks like this


    2. I also have a SharePoint calendar, “MySharePointCalender”, with a few calendar events entered.


    3. I have my Exchange Calendar in Outlook, with a few meeting/events there as well.


    4. What we want is to see events from my Exchange calendar show up on the SharePoint calendar.

    Step 2 (The actual process)

    1. Open the SharePoint calendar –> Calendar Tools –> Calendar Overlay –> New Calendar.



    Fill in the following:

    • Name: Give a name to this calendar
    • Type: Select Exchange
    • Outlook Web Access URL: The OWA URL of your organization
    • Exchange Web Service URL: Determined as follows:

    If your OWA URL is, for example, https://mail.fabrikam.com/owa, then the Exchange Web Service URL would be https://mail.fabrikam.com/ews/exchange.asmx

    (in other words, take the OWA URL, remove the trailing “owa”, and append “ews/exchange.asmx”)


    Step 3 (The error you will hit, and the fix)

    If you have not previously configured SharePoint to trust your Exchange server, you will receive the following error message:

    Could not establish trust relationship for the SSL/TLS secure channel with authority ‘dc’. (GUID)


    Here is the fix

    1. Get the CA Root Certificate for your domain

    (Just a note: there are many ways to get the certificate; I’m using the one that is least prone to error.)

    a. Go to the Server where you have the Certificate Authority installed. Open IIS and select the Server Certificates component.


    Double click on Server Certificates

    Locate the root certificate of the CA in the list; here is the one that I have.


    (To double-check that this is the root certificate, open the certificate and look at the certification path. It should have just one entry (the root), which is the name of the Certification Authority in your domain.) Below is an image of my root certificate.


    b. Now that we have located the certificate, open it, go to the Details tab, and click Copy to File.


    And now we have the Certificate exported to a file
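
    As an alternative to exporting through the IIS certificate dialogs, the same root certificate can be exported from an elevated prompt on the CA server itself. This assumes a single default CA on that server; certutil ships with Windows:

```powershell
# Export the CA's root certificate to RootCA.cer (DER-encoded)
certutil -ca.cert RootCA.cer
```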


    Copy this certificate to the SharePoint server, and follow these steps:

    a. Open Central Administration > Security > Manage Trust


    b. Click New, provide a name (I used RootCA), navigate to the RootCA.cer file you exported in the previous step, and click OK.


    Now refresh the same calendar and confirm that you can see the Exchange calendar events for the logged-in user.


    Step 4 (Enhance the default behavior)

    Although we can now see the Exchange calendar, we can only see the free/busy status, and not the actual details of each event. It would be good if we could have the details displayed here too. To display details:

    1. Open Outlook> File > Options>


    2. Go to the Calendar Section > click Free/Busy Options


    3. Select one of the options (I selected Full details). Click Apply, then OK, and exit Outlook.  Now refresh the SharePoint calendar and see the difference.



    Additional reading:

    Note: The calendar overlay is per user, meaning it will only show calendar items for the currently logged-in user.

  • SharePoint 2010–Returning Document ID in Search Results

    (Post courtesy Sean Earp, with research and XSLT authoring by Alaa Mostafa)

    One of my favorite features of SharePoint 2010 is the Document ID.

    As discussed in the MSDN article Developing with Document Management Features in SharePoint Server 2010 (ECM):

    A document ID is a unique identifier for a document or document set and a static URL that opens the document or document set associated with the document ID, regardless of the location of the document. Document IDs provide:

    • A way to reference items such as documents and document sets in SharePoint Server 2010 that is less fragile than using URLs. URLs break if the location of the item changes. In place of the URL, the document ID feature creates a static URL for each content item with a document ID assigned to it.

    • More flexible support for moving documents or document sets at different points in the document life cycle. For example, if you create a document on a MySite or Workspace page and then publish it on a team site, the document ID persists and travels with the document, circumventing the broken URL problem.

    • A document ID generator that assigns a unique document ID to items. You can customize the format of the IDs that the service generates. By using the document management API, you can write and use custom document ID providers.


    When browsing a document library with this feature enabled, you can display the Document ID column, and you will be able to see the Document ID for a given document.  Easy enough, and useful if you need to reference this Document ID in another system.

    This works great when you can browse a document library, perhaps using the new metadata navigation and filtering capabilities of SharePoint 2010.  But if your document library holds thousands and thousands of documents, users may resort to using search to find the document they are looking for.  Unfortunately, SharePoint search does not display the Document ID in the search results by default.


    Fortunately, SharePoint indexes Document IDs as a managed property by default, which means that with a little magic, we can add the Document ID into the search results.

    In a nutshell, SharePoint retrieves the search results as XML, and uses XSLT to transform the XML into the pretty results you see on the search results page.  Same basic concept as HTML (which has your content) and CSS (which styles that content).  We just need to tell SharePoint to return the managed property with our Document ID, and then update the XSLT to display that managed property in the search results. 
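
    If you have never touched XSLT, here is the idea in miniature. This is a simplified sketch: the element names below are illustrative only, not SharePoint’s actual search results schema.

```xml
<!-- Hypothetical search-results XML (SharePoint's real schema is richer) -->
<Results>
  <Result>
    <title>Budget.xlsx</title>
    <spdocid>DOC-3-15</spdocid>
  </Result>
</Results>

<!-- XSLT that renders each result, with the Document ID in brackets -->
<xsl:template match="Result">
  <div>
    <xsl:value-of select="title" />
    <xsl:text> [ </xsl:text>
    <xsl:value-of select="spdocid" />
    <xsl:text> ] </xsl:text>
  </div>
</xsl:template>
```

    The real work in this post is exactly this pattern, just applied to SharePoint’s own results schema and stylesheet.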

    It is not as hard as it sounds.

    Assumptions: I assume you have enabled the Document ID feature on the site collection, all documents have been assigned Document IDs, and a full crawl has been done of the site.  I also assume you are a site collection administrator with full permissions to the site collection.

    From your Search Results page in the site collection (wherever you have it), click on Page –> Edit (or Site Actions –> Edit Page).  You will see a ton of zones and web parts (such as the Refinement Panel, Search Statistics, and Search Box).  You can customize the heck out of the search results page, and move things all over the place.


    For now, however, we are just going to modify the Search Core Results web part that contains…er… the core search results.  How intuitive!

    Edit the Search Core Results web part, and expand the section that says “Display Properties”.  Uncheck the box that says “Use Location Visualization”.  I have no idea why this option is named as it is… this is really the option that lets you edit the fetched properties and XSL.


    As a quick aside: although you can edit the fetched properties and XSL directly from the web part properties, the experience is horrible.  I strongly recommend using an XML editor like Visual Studio or Notepad++.

    In the Fetched Properties section you will see a number of columns that look like the following.  These are the managed properties that are returned by SharePoint Search:

    <Column Name="PictureHeight"/>
    <Column Name="PictureWidth"/>

    Somewhere before the closing </Columns> tag, add:

    <Column Name="spdocid"/>

    (Note: if you are using SharePoint Search instead of FAST Search, replace all instances of “spdocid” with “DocId”.)

    This will cause SharePoint to return the Document ID in the search results XML.  Now let’s modify the XSL so that we display the ID in the search results.  Click on the “XSL Editor…” and copy the XSL into your XML editor of choice (or, if you like pain, just edit the 938-line long XSL sheet in a browser that does no validation or color coding.  Your choice.)

    At the top of the XSL is a list of parameter names.  Add in the following parameter (order does not matter)

    <xsl:param name="spdocid" />


    Next, search for “DisplayAuthors”.  After the DisplayAuthors call template, we are going to add a new call template named “DisplayID” to… well, display the ID.  The template is wrapped in a conditional to ensure that if there is no Document ID, it does not attempt to display a null value.

    Add the following lines:

                  <xsl:if test="string-length($hasViewInBrowser) &gt; 0">
                          <xsl:call-template name="DisplayID">
                            <xsl:with-param name="spdocid" select="spdocid" />
                            <xsl:with-param name="browserlink" select="serverredirectedurl" />
                            <xsl:with-param name="currentId" select="$currentId" />
                          </xsl:call-template>
                  </xsl:if>

    Search for “DisplayString” and we will add a section to call our template, display the ID (along with a URL that links to the document), and we’ll put brackets around the Document ID so it stands out visually.  Add the following:

      <xsl:template name="DisplayID">
        <xsl:param name="spdocid" />
        <xsl:param name="currentId" />
        <xsl:param name="browserlink" />
        <xsl:if test="string-length($spdocid) &gt; 0">
          <xsl:text xml:space="default"> [ </xsl:text>
          <a href="{concat($browserlink, $ViewInBrowserReturnUrl)}" id="{concat($currentId,'_VBlink')}">
            <xsl:value-of select="$spdocid" />
          </a>
          <xsl:text xml:space="default"> ] </xsl:text>
        </xsl:if>
      </xsl:template>

    We’re almost done!  Select all your XSL, copy it, and paste it back into your SharePoint window, hit Save –> Okay –> Check In –> Publish

    Voila!  The Document ID now shows up in the search results with a clickable link back to the source document.


    Random troubleshooting tip:  If you get the message “Property doesn't exist or is used in a manner inconsistent with schema settings”, this typically means one of two things:

    1. You created a custom managed property and have not yet run a full crawl so that this property does not exist in the index (this property is mapped out of the box, so it does not apply here)
    2. You are using the wrong managed property.  FAST search uses “spdocid” while SharePoint search uses “DocId”



    Attachments: I have attached a copy of the XSL I used for the above post to save you time copying and pasting into the right sections.  It works for me with SharePoint search, but use on a test server first and at your own risk.

  • Configure Power Management with SCCM 2007 R3

    (Post courtesy Anil Malekani)

    In this post I’ll explain how to configure Power Management with SCCM 2007 R3. The post is divided in three parts; prerequisites and dependencies, enabling power management on site and clients, and configuring reports for power management.

    The power management feature in SCCM 2007 R3 provides the following capabilities:

    1. Collect power usage information from clients in the site and calculate cost of power consumed during a specific period.
    2. Enforce Power Policies on clients in different collections. Each collection of clients can have different power plans for peak and non-peak hours.
    3. Monitor power plan applied on clients in a collection.
    4. Display reports of carbon emissions saved over a specified time period.

    Part 1: Checking Prerequisites and Dependencies for Power Management


    • Make sure hotfix KB977384 is installed on computers running the following components of SCCM 2007 SP2:
      • Primary and secondary site servers
      • Remote administrator console servers
      • Remote provider servers
      • Client computers
    • Make sure SCCM site servers are running SCCM 2007 R3.
    • SQL Server Reporting Services must be installed and configured. This feature is required to view reports related to Power Management.
    • Inventory Client Agents should be configured and inventory collection should be working on clients.


    • Client computers must be capable of supporting different power states as defined by the power management policy.

    Part 2: Enable Site and Clients for Power Management and Deploy Power Policies on a Collection

    1. Install KB977384 on the site server and create a package for deployment to SCCM client machines.

    a. Locate the file SCCM2007-SP2-KB977384-ENU.msi on the SCCM 2007 R3 install media and start installation.

    b. During installation it will prompt to create a patch install package for clients. Select the first option to create the package and Press Next.

    c. Press Next or modify the package name for the Patch.


    d. Press Next twice.

    e. Click Finish.

    This process creates a package and a program in the SCCM console. Open the console and verify that the package is present under Packages. Now you can create an advertisement and target the patch install program at a collection of workstations. Make it a mandatory installation; it does not require a restart on client machines.

    2. Enable the Power Management Client Agent setting on the Site server.


    On the SCCM client workstation, you’ll find the new Power Management Agent component under the Configuration Manager Properties. This will appear only if you have patch KB977384 installed and policy updates have been received from the site server after enabling the power management agent.


    3. Enable Power plans for a collection of machines.

    a. Right-click a collection of machines and select Modify collection settings.

    b. Click on the Power Management tab and select the option to Specify power management settings for the collection.

    c. Define Peak hours and select a Peak plan.

    d. You may select any of the predefined power plans, or modify the Customized Peak plan.

    e. Similarly for Non-peak plan you may customize the Non-peak power plan and press OK.


    Part 3: Configure Reporting for Power Management

    1. Make sure the SQL Server Reporting Services component is installed and configured. Use the Reporting Services Configuration tool to configure it.

    2. Install Reporting Services Point as a new site role in SCCM



    3. Specify the Report Folder Name and press Next twice.

    4. Go to Reporting Services under Computer Management > Reporting. Right-click the Reporting Services server role and select Copy Reports to Reporting Services.


    5. Select Database server name and database instance name for SCCM.

    6. Select the database authentication method used to connect to the database, and press Next.

    7. Select the second option to Import Reports from a cabinet file, and click Browse.


    8. Browse to the "%ConfigMgr install folder%\Reports\Power Management" folder and select the cabinet file.


    9. This will list all Power Management reports which will be imported.

    10. Press Next twice.

    11. To view reports, open IE and type URL http://%servername%/Reports.

    12. Click on ConfigMgr_%SiteCode% Folder and select Power Management Reports.


    These reports may also be executed from the SCCM server console.

    More Information

  • System Center Operations Manager 2012 Installation Walkthrough

    (Post courtesy Rohit Kochher)

    System Center Operations Manager 2012 has significant changes in setup from Operations Manager 2007. Setup of 2012 has become simpler and installation has become easier.

    If you want to follow along on a test server, you can download Beta version of SCOM 2012 from here.

    Note: The Root Management Server (RMS) concept from Operations Manager 2007 R2 has been removed in Operations Manager 2012. All Operations Manager 2012 servers are management servers. However, there is an RMS emulator to support those management packs that target the RMS. Architecturally, servers in Operations Manager 2012 have a peer-to-peer relationship rather than the parent-child relationship of Operations Manager 2007 R2.

    In this blog we will discuss the setup of Operations Manager 2012, with some screenshots of the installation wizard. Microsoft SQL Server 2008 SP1 or 2008 R2 should be installed prior to running SCOM 2012 setup. You can get more information on SCOM 2012 supported configurations here.

    Now, once we run setup.exe we will see the following screen:


    Click Install to set up the management server, management console, web server, and reporting server. Under Optional Installations you can choose to install the local agent, Audit Collection Services, a gateway management server, and ACS for Unix/Linux.

    Once you click Install, you will get a screen to accept the license agreement. After you accept it, you will see the screen below:


    You can select the components that you want to install. Clicking the down arrow in front of each role gives brief information about that role. There is no explicit option to install the operations database and data warehouse, as they are integrated. After selecting the desired features, you will get a screen for the program files location. The default location is C:\Program Files\System Center Operations Manager 2012.


    The next step will show you prerequisite failures (if any). You will get information for failures along with download links to install any missing prerequisites.

    Next you get a screen to input information about the management server. You can specify whether it is the first management server in a new management group or an additional management server in an existing management group.


    You can specify the name of the management group here. You will also get a screen to specify the operations database. We need to install both the operations database and the data warehouse in Operations Manager 2012; installing the data warehouse is mandatory in 2012 (a change compared with Operations Manager 2007). The data warehouse is needed for features such as dashboards. If this is a second management server, you can click the Add a management server to an existing management group option.


    After specifying the required information about the operations database and clicking Next, you will get a similar screen for the Operations Manager data warehouse.

    The next screen allows you to configure Operations Manager service accounts.


    You can specify the required accounts on this screen and click Next to complete the setup. Setup automatically assigns the server’s local Administrators group to the Operations Manager Administrators role. Once you enter account information here, it is automatically verified in the background. If an account cannot be verified (or the password is incorrect), you will get a red warning, as the picture above illustrates.

    After this, you will get the option to participate in the Microsoft Customer Experience Improvement Program (CEIP) and Error reporting. Finally, you will also get the option for configuring Microsoft Updates.


    The last screen provides an installation summary. Clicking Install starts the installation. Once it finishes, you are all set to monitor your infrastructure! Some of the great features in Operations Manager 2012 are the new dashboards, network monitoring, and application monitoring, which will be covered in future posts.

    You can check the deployment guide for Operations Manager 2012 here.

    System Center Operations Manager 2012 Beta resources

  • CRM 2011 and SharePoint 2010 Integration - Part 1

    (Post courtesy of Anand Nigam)

    Hi SharePoint Folks,

    I am back with yet another post. This time I will focus on an evergreen subject: integration between SharePoint 2010 and CRM 2011. This is a huge topic, as there are many possibilities, so for clarity I am going to split the post based on the integration point covered:

    • Part 1: Introduction and CRM 2011 - Document management Integration with SharePoint 2010 (This post)
    • Part 2: Reporting CRM data in SharePoint using Excel services
    • Part 3: Publishing CRM entities in SharePoint.
    • Part 4: Search CRM entities from SharePoint Enterprise Search.

    The word you are thinking of is “awesome”. Well, I know.

    Let’s cut the talking short and make it work. Get ready!

    CRM 2011 - Document management with SharePoint 2010

    Here is what you will need:

      • CRM 2011 deployment and some sample data (I populated my CRM with built in sample data).
      • A SharePoint 2010 farm, with a web app created
      • This post

    In CRM 2011, the document management feature comes out of the box (OOB). This makes it very easy to set up document management compared to CRM 4. Once this is set up, a CRM 2011 user can create, upload, download, and modify documents/content in SharePoint without leaving the CRM UI, AND without using the “not so good looking” IFRAME method. Moreover, it is easy to pull administrative information too, all with minimal effort.

    So here is my CRM


    And here is my SharePoint


    To actually integrate the two, we have to complete two tasks:

    1. Make SharePoint aware that CRM is going to speak to it. This is done by an add-on, which is essentially a sandbox solution. Please be aware that a sandbox solution is restricted to its site collection, so if you need multiple SharePoint site collections to integrate with CRM, you have to install the sandbox solution on each of them.

    Also note: I am not talking about subsites; I am talking about site collections.

    2. Configure CRM to use SharePoint as the backend document server. This is done simply by configuring the SharePoint site collection URL in the CRM document management’s settings page.

    1. On SharePoint – “Make SharePoint aware that CRM is going to speak to it

    Step 1 – Log in to a SharePoint server with the farm service account (if you have multiple servers in the SharePoint farm, pick any application server).

    Download the List component here

    Step 2 – Extract the files.

    Double click the installer


    Read the terms and, if you accept them, tick the checkbox and continue.


    Select the Location for extraction


    Once extraction succeeds, open the location (“c:\CRM List Component\” in my case).

    You will see three files; only two of them are important.

    1 – AllowHtcExtn.ps1 – a PowerShell script that tells SharePoint to allow users to use .htc files. In simple terms, SharePoint does not by default allow all file types to be opened from within SharePoint. Consider this: if an HTML file containing JavaScript is stored in SharePoint and you open it, the browser will render the JavaScript in that file, and if the JavaScript is malicious there is a possibility of damage. So, to protect clients, SharePoint does not let you open such files directly, but gives you the option to save them to disk. With this PowerShell script we add the .htc extension to the list of allowed ones. To view the web application’s current file-type restrictions, run:

    $WebApp = Get-SPWebApplication -Identity <url of your web app>
    $WebApp.BlockedFileExtensions


    2 – crmlistcomponent.wsp – the solution that we will upload to the SharePoint site collection’s solution gallery; it enables SharePoint to “speak” with CRM.

    Easy so far? The next part is easy too!


    Now open the SharePoint Central Administration site. Navigate to System Settings > Manage Services on Server, look for “Microsoft SharePoint Foundation Sandboxed Code Service”, and click Start. On a side note, it is always a good idea/best practice to turn on the service on multiple servers.


    Ensure that you have started it; otherwise the sandbox solution WILL NOT work. Once this is done, open services.msc and ensure the service “SharePoint 2010 User Code Host” is started. Skipping these steps is the most common reason people get an error when they try to activate the solution.


    Now open the SharePoint site and navigate to the site collection where you want CRM to push the data. In my case I have a default site collection “/”, so I will navigate to it in the browser.


    Navigate to Site Actions > Site Settings > Solutions (under Galleries)



    Click Upload and point to crmlistcomponent.wsp


    Now click Activate.


    On successful activation, click the solution once and ensure that your window looks similar to the one below. Seeing the Deactivate option confirms that the solution is successfully activated.


    Now it’s time to run the script.

    Fire up the SharePoint Management Shell, navigate to the AllowHtcExtn.ps1 location, and run it as follows:

    > .\AllowHtcExtn.ps1 <site collection url>

    e.g. > .\AllowHtcExtn.ps1 http://app1/

    A successful run will show a window similar to this:
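
    For the curious: conceptually, the script just unblocks the .htc extension on the web application that hosts your site collection. A rough PowerShell sketch of the equivalent (this is an assumption about the script’s internals, not its actual source; run it only in a test environment):

```powershell
# Sketch: unblock the .htc extension on the parent web application
# (assumes the script simply removes "htc" from BlockedFileExtensions)
$site = Get-SPSite "http://app1/"
$webApp = $site.WebApplication
if ($webApp.BlockedFileExtensions.Contains("htc")) {
    $webApp.BlockedFileExtensions.Remove("htc")
    $webApp.Update()   # persist the change on the web application
}
```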


    2. On CRM Server –“Configure CRM to use SharePoint as the backend document server

    Step 1- Open CRM 2011 Site > Settings (at bottom of the image)


    Click on Document management > click Document Management Settings


    Now, in the following window, tick those entities where you WANT to have the document management option. For example, if you only want the “Account” entity to have the option to upload and download documents, select just that; if you want document management on other entities such as Order, Price List, Contract, etc., select those as well. By default, Account, Article, Lead, Opportunity, Product, Quote, and Sales Literature are already selected.

    Next, enter the site collection URL in the URL box.


    CRM will now validate that the sandbox solution is present in that site collection; only once it detects its presence will it continue further.


    Now we are given an option to select the entity on which CRM should base the folder structure: accounts or contacts.


    Once we click Next, it will prompt us to create the document libraries in SharePoint; say OK, and it finishes like a charm.


    Since we went with the default entities, we got seven libraries created in SharePoint.


    And with that last click we are done! Time for testing.

    See it in action

    Open the CRM workspace and open an Account.

    Click the Documents menu on the left, and you will receive a prompt (it says that CRM is going to create a folder with the account’s name); say OK.


    We got the library created


    Add a document to this


    You will see it below like this


    If I now do the same operation for the Opportunity entity, I will see the documents in a folder named Opportunity under the account library.

    That’s it! We are done with the document management integration of CRM 2011 and SharePoint 2010. Wasn’t that easy?

    Good to know -

    1. Finally, all the features of a document management server are provided in the CRM UI itself:


    1. Navigate up – For navigating up in the SharePoint location
    2. Create folder – create a new folder in SharePoint
    3. New document – requires Office 2007/2010 installed on the client; on Windows Server 2008 the Desktop Experience feature is required
    4. Upload – upload a document
    5. Edit document – Edit in office application, Office client must be installed on the client
    6. Delete document – Deletes from SharePoint
    7. Open the SharePoint site – Opens in another window the SharePoint site
    8. Check Out – Exclusively lock the document for yourself
    9. Check In – after editing is done, check in to commit your changes and unlock the document
    10. Discard Check out- Discard whatever you did after checkout
    11. Set alert on SharePoint – subscribe to email alerts on events on the document (e.g., document changed or deleted)
    12. Download document – download a copy of the document locally
    13. Copy Shortcut – get the document url in clipboard
    14. Send Shortcut – needs outlook installed on client
    15. View Properties – view the metadata (date modified, author, last modified by, last modified date etc)
    16. Edit Properties – edit the metadata
    17. Versions – Shows different versions of document from previous edits

    2. If you do NOT have the list component installed on SharePoint, you can still use SharePoint libraries to store documents. When you use the Documents option on a CRM entity, it will ask you for a SharePoint library, which CRM will show in an IFRAME (like the good old days of CRM 4 with an IFRAME of a SharePoint library). CRM 2011 does the IFRAME configuration automatically.


    Enter the absolute URL of the SharePoint library and you will see it in the IFRAME.


    Now click Save and see the library in the IFRAME.


    3. On the entities we can see the option “Add Location”; this way we can have multiple locations added for content storage. It can be another SharePoint site or a different URL.


    Notice the URL of the SharePoint site: it’s a subsite, or you could use another folder too. My intention is to have a separate folder for contract documents and another folder for other types of documents.


    The result: users get a drop-down to select where they want to upload the document.


    4. The Edit Location button gives us the option to alter the location of the SharePoint folder.


    5. Something that you must know: the option to remove or delete an added location is nowhere here. You must navigate to Document Management Settings > SharePoint Document Locations. Remember, this is the only place where you can see all the SharePoint sites currently used by CRM.
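    As a side note, that same master list of locations is also exposed through CRM 2011's OData endpoint, so it can be pulled programmatically. A minimal sketch (the server and organization names are placeholders; the entity set name follows CRM 2011's standard "Set" suffix convention):

    ```shell
    # Build the OData query that returns every SharePoint document location
    # CRM 2011 knows about. "crmserver" and "orgname" are placeholders for
    # your own deployment.
    CRM="http://crmserver/orgname"
    URL="${CRM}/XRMServices/2011/OrganizationData.svc/SharePointDocumentLocationSet"
    echo "$URL"
    # On a domain-joined client, curl with NTLM auth would return the Atom feed:
    # curl --ntlm -u 'DOMAIN\user' "$URL"
    ```

    This is handy for auditing which SharePoint sites CRM is pointing at without clicking through the UI.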


    I hope you liked the post. I will be back with other parts soon!

    Questions and feedback are welcome.

  • Integrating Remote Desktop Services with SharePoint Server 2010

    Post courtesy of Yashkumar Tolia

    One of the first things that an IT administrator (and even an end user) dreams of is “integration”. For an administrator, integrating multiple technologies in the environment, consolidating products, managing everything from a single place, and provisioning data securely are a few of the reasons. For an end user, a single point of access, anytime access to data, and single sign-on come to mind.

    Remote Desktop Services

    One of the prime technologies used by IT administrators for virtualization is Remote Desktop Services, previously known as Terminal Services. The TechNet article Remote Desktop Services Overview is a great starting point for understanding this technology. As the article mentions, the major advantages of adopting it are:

    • Application deployment: You can quickly deploy Windows-based programs to computing devices across an enterprise. Remote Desktop Services is especially useful when you have programs that are frequently updated, infrequently used, or difficult to manage.
    • Application consolidation: Programs are installed and run from an RD Session Host server, eliminating the need for updating programs on client computers. This also reduces the amount of network bandwidth that is required to access programs.
    • Remote access: Users can access programs that are running on an RD Session Host server from devices such as home computers, kiosks, low-powered hardware, and operating systems other than Windows.
    • Branch office access: Remote Desktop Services provides better program performance for branch office workers who need access to centralized data stores. Data-intensive programs sometimes do not have client/server protocols that are optimized for low-speed connections. Programs of this kind frequently perform better over a Remote Desktop Services connection than over a typical wide area network.

    SharePoint Server 2010

    SharePoint Server 2010 is no longer viewed as just a content sharing and access product, but as a Business Collaboration Platform for the Enterprise and the Internet. With features like content management, workflows, and search, SharePoint Server 2010 helps you connect with colleagues and information; manage and govern enterprise content; balance user experience with policy and process; and help users find content, information, and people. A great guide for understanding SharePoint Server 2010 is found on TechNet: SharePoint Server 2010.

    Integration of Remote Desktop Services with SharePoint Server 2010

    Integration of these two technologies opens up great avenues for consolidation. The Remote Desktop Web Access server role can be taken over by the SharePoint Server 2010 installation already present in the environment. This provides possibilities such as:

    • Single Website: As the SharePoint website is already present, it can be leveraged to publish the RemoteApps that are hosted on the Remote Desktop Session Host server. This reduces the number of URLs that the end user has to remember to access company data and applications.
    • Customization: The SharePoint website, unlike the default Remote Desktop Web Access web portal, can be customized to company needs. This gives the company the freedom to brand, color code, and provide additional links or shortcuts as required.
    • Accessing content through RemoteApps: If content in SharePoint needs a particular RemoteApp to open it, you can make a connection to the RemoteApp and then open the document in it. This gives you the capability of accessing this data over the internet as well, without having to worry about security.

    Steps to integrate Remote Desktop Services with SharePoint Server 2010

    The integration of SharePoint Server 2010 (from now on, SPS) with Remote Desktop Services (from now on, RDS) is divided into five steps:

    1. Installation of the RDS Session Host server role
    2. Installation of SPS
    3. Installation of RDS Web Access server role on SPS
    4. Configuration of the Terminal Services Web Part
    5. Publishing of RemoteApps

    1. Installation of RDS Session Host server

    Perform these steps on the RDS Session Host server:

    a. Go to Server Manager -> Roles -> Add Roles. This will take you to the Add Roles Wizard. Click Next.


    Figure 1: Add Roles Wizard

    b. Select Remote Desktop Services. Click Next.


    Figure 2: Role Selection

    c. Click Next.


    Figure 3: Introduction to Remote Desktop Services

    d. Select the Remote Desktop Session Host role service. Click Next.


    Figure 4: Role Service Selection

    e. Click Next.


    Figure 5: Uninstall and Reinstall Application for compatibility warning

    f. Select Require Network Level Authentication. Click Next.


    Figure 6: Network Level Authentication Selection

    g. Select the appropriate licensing scheme. Click Next.


    Figure 7: Licensing Mode Selection

    h. Select the appropriate users you want to give access to the RDSH server. Click Next.


    Figure 8: User Group Definition

    i. Select any of the features that you want to include in the Desktop Experience. Click Next.


    Figure 9: Enabling Desktop Experience

    j. Click Install. Reboot the server.


    Figure 10: Installation summary

    2. Installation of SPS 2010

    Perform these steps on the SPS server:

    a. Install SPS 2010.


    Figure 1: SharePoint Installation

    b. Check the box Run the SharePoint Products Configuration Wizard now. Click Close.


    Figure 2: SharePoint Installation completion and Run Configuration Wizard

    c. Click Next.


    Figure 3: Configuration Wizard

    d. Click Yes to restart the services.


    Figure 4: Restarting of Services

    e. Go to the SPS website by typing the following URL: http://<servername>/, to verify that the SharePoint site is working.
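    If you prefer a script over a browser, a quick reachability test can be run from any machine with curl; a minimal sketch ("spserver" is a placeholder for your SharePoint server's hostname):

    ```shell
    # Quick reachability check for the new SharePoint site. "spserver" is a
    # placeholder hostname; substitute your own server name.
    SERVER="spserver"
    SP_URL="http://${SERVER}/"
    echo "$SP_URL"
    # An HTTP status of 200 (or 401 before credentials are supplied) means IIS
    # is answering. NTLM credentials would be passed like this:
    # curl --ntlm -u 'DOMAIN\user' -s -o /dev/null -w '%{http_code}\n' "$SP_URL"
    ```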


    Figure 5: SharePoint Website Home Page

    3. Installation of RDS Web Access server role on SPS

    Perform these steps on the SPS server:

    a. Go to Server Manager -> Roles -> Add Roles. This will take you to the Add Roles Wizard. Click Next.


    Figure 1: Add Roles Wizard

    b. Select Remote Desktop Services. Click Next.


    Figure 2: Selection of Role Services

    c. Click Next.


    Figure 3: Introduction to Remote Desktop Services

    d. Click on Add required role services.


    Figure 4: Add required Role Services

    e. Click on Next.


    Figure 5: Introduction to IIS

    f. Click Next.


    Figure 6: Add role services

    g. Click Finish to complete the installation.

    4. Configuration of the Terminal Services Web Part

    Perform these steps on SPS server:

    a. Go to %SystemDrive%\inetpub\wwwroot\wss\VirtualDirectories\80. Right-click Web.config and open it in WordPad.


    Figure 1: Editing web.config file

    b. In the <SafeControls> section, add the following line under the other SafeControl Assembly entries (as a single line):

    <SafeControl Assembly="TSPortalWebPart, Version=, Culture=neutral, PublicKeyToken=31bf3856ad364e35" Namespace="Microsoft.TerminalServices.Publishing.Portal" TypeName="*" Safe="True" AllowRemoteDesigner="True" />


    Figure 2: Adding SafeControl Assembly
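    For context, the new entry lives inside the existing <SafeControls> element, alongside the SafeControl lines already present for the SharePoint assemblies. A sketch of the result (the neighboring entry is illustrative, and the empty Version value is reproduced from the post as-is; use the TSPortalWebPart assembly version installed on your server):

    ```xml
    <SafeControls>
      <!-- existing SafeControl entries, e.g. for Microsoft.SharePoint assemblies -->
      <SafeControl Assembly="TSPortalWebPart, Version=, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
                   Namespace="Microsoft.TerminalServices.Publishing.Portal"
                   TypeName="*" Safe="True" AllowRemoteDesigner="True" />
    </SafeControls>
    ```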

    c. Open an elevated command prompt. To do this, click Start, right-click Command Prompt, and then click Run as administrator:

    · Type mkdir "%SystemDrive%\Program Files\Common Files\Microsoft Shared\Web Server Extensions\wpresources\TSPortalWebPart\images" and then press ENTER.

    · Type mkdir "%SystemDrive%\Program Files\Common Files\Microsoft Shared\Web Server Extensions\wpresources\TSPortalWebPart\rdp" and then press ENTER.

    · Type cacls "%SystemDrive%\Program Files\Common Files\Microsoft Shared\Web Server Extensions\wpresources\TSPortalWebPart\images" /T /E /P NetworkService:F and then press ENTER.

    · Type cacls "%SystemDrive%\Program Files\Common Files\Microsoft Shared\Web Server Extensions\wpresources\TSPortalWebPart\rdp" /T /E /P NetworkService:F and then press ENTER.
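    To avoid typos in those long paths, the four commands above can also be generated by a small script and pasted into the elevated prompt in one go; a hypothetical sketch (bash syntax, runnable anywhere; the path is the one used in the post):

    ```shell
    # Print the step 4c commands for both folders so they can be pasted into an
    # elevated command prompt (or redirected into a .cmd file). cacls grants the
    # NetworkService account full control of each new folder.
    BASE='%SystemDrive%\Program Files\Common Files\Microsoft Shared\Web Server Extensions\wpresources\TSPortalWebPart'
    gen_cmds() {
      for d in images rdp; do
        printf 'mkdir "%s\\%s"\n' "$BASE" "$d"
        printf 'cacls "%s\\%s" /T /E /P NetworkService:F\n' "$BASE" "$d"
      done
    }
    gen_cmds
    ```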


    Figure 3: Adding files to the Web Parts


    d. Go to the SharePoint website as an administrator. In the upper-right corner, on the Site Actions tab, click Site Settings.


    Figure 4: Editing the Site Settings


    e. Under Galleries, click Web Parts.


    Figure 5: Adding Web Part to the Gallery


    f. Under the Web Part Gallery heading, click New.


    Figure 6: Adding the TSPortalWebPart to the list


    g. Select the check box next to Microsoft.TerminalServices.Publishing.Portal.TSPortalWebPart, and then click Populate Gallery.


    Figure 7: Adding the new Web Part

    5. Publishing of RemoteApps

    Perform these steps on the SPS server:

    a. On the Site Actions tab, click Edit Page.


    Figure 1: Edit Web Page

    b. Choose the location on the website where you want to add the Web Part, and then click Add a Web Part.


    Figure 2: Adding the Web Part to the site

    c. In the Add Web Parts -- Webpage Dialog dialog box, under the All Web Parts heading, select the TSPortalWebPart check box, and then click Add. The TSPortalWebPart Web Part will appear on the page.


    Figure 3: Select the TSPortal Web Part

    d. To configure the Web Part, click edit in the upper-right corner of the Web Part, and then click Modify Shared Web Part.


    Figure 4: Editing the Web Part

    e. In the RD Session Host server(s) or RemoteApp and Desktop Connection Management server name box, type <RDSservername> and then click OK.


    Figure 5: Adding the RDS Session Host Server

    f. Click the Save icon in the top-left corner of the website.


    Figure 6: Saving the Web Part to the site


    g. Test the application by running a RemoteApp.


    Figure 7: Selecting the RemoteApp


    Figure 8: Connecting to the RemoteApp


    Figure 9: Providing credentials


    Figure 10: Using the RemoteApp

    In this way, you can leverage your existing environment and integrate it into a single website for users to log into and get their RemoteApps. This, combined with SharePoint's search and content sharing capabilities, gives the user a seamless experience.

    Additional Information

    To learn more, check out the following articles:

    Customizing Remote Desktop Web Access by Using Windows SharePoint Services Step-by-Step Guide

  • SharePoint Conference 2011 Session Recordings Available

    Update: It looks like the recordings have been pulled off of YouTube.  Perhaps they were not supposed to be there in the first place.  It was good while it lasted… make sure to sign up for SharePoint Conference 2012.

    Hello All-

    After eating all your turkey today (if you are in the US), what better to do afterwards than kick back and watch recordings of the SharePoint Conference 2011?  Thanks to a tip from Spencer, I noticed that most of the sessions from SharePoint Conference 2011 have been uploaded to YouTube.

    In the interest of helping you find the best sessions, I have compiled a list of every session below (and put the same information in an attached spreadsheet).  Nearly every session from the conference is available, and I have kept the sessions that were not uploaded in the list below on the off chance someone uploads them later.

    (REPEAT) Creating Beautiful and Engaging Web Sites with SharePoint 2010 200
    SPC410r (REPEAT) Out of the Sandbox and into the cloud: Build your next SharePoint app on Azure 400 Andrew Connell
    (REPEAT) SharePoint 2010 Planning and Adoption Framework 200
    SPC200 “Searching for Sarah" - Why Booz & Co. chose FAST Search 200 Polly Kahler, Jeremiah Fellows, Adrienne Crowther
    “We’re Going two-way Baby!” How Vancity Took its Intranet From Static to Social 200
    36 Terabytes: How Microsoft IT Manages SharePoint in the Enterprise 200
    SPC300 A Closer Look at SQL and SharePoint: Tips and tricks from the field 300 Wayne Ewington, Steve Letford
    SPC301 Access & Office 365: Rapid Cloud App Development on Office 365 with Access Services 300 Steven Greenberg
    SPC302 Access Services - Everything You Wanted to Know 300 Greg Lindhorst
    SPC303 Advanced BI Visualizations using Visio Services 300 Christopher Hopkins, A.J. Briant
    SPC304 Advanced Scorecarding and Dashboards with Excel, Visio and PerformancePoint Services 300 Peter Myers
    SPC400 Advanced SharePoint Data Access with Silverlight 400 Robert German, Ryan Sockalosky
    SPC305 Advanced Windows PowerShell for SharePoint 2010 300 Todd Bleeker
    SPC319 Application Lifecycle Management: Automated builds and testing for SharePoint projects 300 Chris O'Brien, Mike Morton

    SPC201 Applying a Brand to your SharePoint Web Site 200 Jon Flanders
    SPC224 Architecting and Automating SharePoint Governance 200 Dan Holme
    SPC306 Are Office Web Apps Enough? Architecting the best Office experience for your customers 300 Andy O'Donald, Alistair Speirs
    SPC202 Attractive Business Intelligence: Dashboards, Pivots, Scorecards, KPIs, and Reports Using Microsoft SharePoint 2010, Office 2010, PowerPivot, and SQL Server 2008 R2 200 Rafal Lukawiecki
    SPC307 Automating Business Processes with SharePoint 2010 (Part 1) - Using SharePoint Designer, InfoPath and Workflow 300 Keenan Newton
    SPC308 Automating Business Processes with SharePoint 2010 (Part 2) - Using BCS, Word Automation Services (and More Workflow) 300 Keenan Newton
    SPC287 Avanade: Unleashing Competitive Advantage through the use of Social Media and Collective Organizational Intelligence 200 Thomas Krofta, Markus Sprenger
    SPC293 AvePoint: Governance Enforcement with out of the box SharePoint and AvePoint’s DocAve Platform 200 John Hodges, Ion Tobescu
    SPC344 BA Insight: Enterprise Search and the Cloud: Unifying Information across SharePoint, O365, LOB systems, and more 300 Guy Mounier, Jeff Fried
    SPC309 Backup and Recovery for SharePoint 2010 300 Chris Whitehead, Sam Hassani

    SPC310 Best Practices Around SharePoint 2010 User Profiles 300 Scott Jamison
    SPC311 Best Practices for Building your Website for Scale with SharePoint 2010 300 Ethan Gur-esh
    SPC312 Best Practices for Creating Publishing Page Layouts 300 Geoffrey Edge
    SPC313 Best Practices for Deploying Project Server 2010 on SharePoint Farm 300 Christophe Fiessinger
    SPC203 Best Practices from the Field: Managing Corporate Metadata and Taxonomies with SharePoint 2010 200 Nikos Anagnostou, Lesly Goh
    SPC314 Best Practices With jQuery and SharePoint 300 Mark Rackley, Eric Harlan

    SPC204 Beyond Ten Blue Links: Why Search-Driven Applications Matter to the Enterprise 200 Jeff Fried
    SPC315 Beyond the Basics: An Advanced Conversation on FAST Search for SharePoint 2010 300 Thomas Molbach , Thomas Svensen
    SPC316 Branding and Customizing My Sites with SharePoint 2010 300 John Ross, Randy Drisgill
    Branding SharePoint Online Sites 200
    SPC320 Building Business Applications on Azure using Office365 and Windows Azure AppFabric 300 Tony Meleg
    SPC321 Building Custom In Place Records Management Solutions 300 John Holliday
    SPC322 Building integrated SharePoint 2010 and CRM Online solutions. 300 Girish Raja
    SPC362 Building Language-Based SharePoint Internet Sites Using Variations 300 Israel Vega , Shad Phillips
    SPC324 Building Office Solutions that Leverage Duet Enterprise 300 Joyanta Sen, Matjaz Perpar
    SPC325 Building Self-Service BI Applications using PowerPivot v2 “Denali” for Excel 2010 and SharePoint 2010 300 John Hancock
    SPC318 Building Visually Compelling BI Experiences with PowerPivot 300 Peter Myers
    SPC3983 Business Intelligence Overall Architecture 300 Jason Burns
    SPC206 Campaign and Experience Management on a SharePoint 2010 Website 200 Geoffrey Edge
    SPC326 Capacity Management for SharePoint 2010 300 Spencer Harbar
    SPC207 Change Management: Preparing End Users for SharePoint 2010 200 Dan Holme
    Clearing away the Clouds: What’s Hype and What’s Real in Cloud Adoption, Today and Tomorrow 200
    SPC327 Cloud Packing: Preparing for the Move into SharePoint Online 300 James Petrosky, Kimmo Forss
    SPC328 CMIS Deep Dive and Roadmap 300 Ryan McVeigh, Adam Harmetz, Mike Mahon
    Collaborative Decision Making: How the Convergence of Collaboration Software and Business Intelligence can Optimize Decision Making in Your Organization 200
    Communication and Collaboration across Baker Hughes 200
    SPC213 Content Acquisition for Search in SharePoint 2010 200 Vaidyanathan Raghavan , Sid Shah
    SPC329 Content Query WebPart: A Deep Dive on SharePoint's Swiss Army Knife WebPart 300 Christina Wheeler
    SPC330 Content Targeting with the FAST Search Web Part 300 Martin Harwar
    SPC331 Creating a FAST Search Driven Windows Phone 7 Application for a SharePoint Internet Sites 300 Shad Phillips , Andy Kojs
    SPC332 Creating an Easy To Use File Plan Builder for Your SharePoint Records Center 300 John Holliday
    SPC215 Creating Awesome Dashboards with SharePoint 2010, Infopath 2010 and SharePoint Designer 200 Eric Harlan
    Creating Beautiful and Engaging Web Sites with SharePoint 2010 200
    Creating Vivid BI reports with "Project: Crescent" for SharePoint 200
    SPC218 Customer Spotlight: Reduce Costs and Increase Reliability by Migrating from a Legacy ECM System to SharePoint 2010 200 Mark Barron, Charles Norman II
    SPC219 Customer Spotlight: Saying Goodbye to Paper - Building a Multi Million Document Repository to Save Time and Money 200 Arik Kalininsky
    SPC333 Customizing Content Publishing Approval Workflows 300 Robert Bogue
    SPC334 Data Access with Search and the KeywordQuery API 300 Phil Wicklund
    SPC401 Deep Dive on Developing Custom Service Applications 400 Todd Bleeker
    SPC402 Deep Dive on SharePoint Ribbon Development & Extensibility 400 Chris O'Brien, Andrew Connell

    SPC403 Deep Dive: Excel Services and PerformancePoint Services Administration and Troubleshooting 400 Kevin Donovan
    SPC404 Deep Dive: Implementing Kerberos for your BI Applications 400 Tom Wisnowski
    SPC220 Delivering Data as a Service using Azure platform and self service BI at Microsoft 200 Sanjay Soni , Phani Vikram Jungali
    SPC222 Deployed SharePoint Search? What’s next? 200 Denis Heliszkowski , Neil Hodgkinson
    SPC335 Deploying SharePoint 2010 as a Mission Critical Application 300 Sam Hassani, Chris Whitehead
    SPC223 Deploying SharePoint 2010 in Private, Public and Hybrid Cloud Architectures 200 Phil Wicklund

    SPC406 Developing and Extending Enterprise Content Management features 400 Paul Swider
    SPC336 Developing and Getting the Most from Sandboxed Solutions 300 Michael Ammerlaan
    SPC225 Developing and Managing SharePoint Solutions with Visual Studio 200 Mike Morton, Jay Schmelzer

    SPC276 Developing Cloud-Based Applications for SharePoint Online using Windows Azure 200 Steve Fox
    SPC337 Developing LOB Connectors in SQL Azure and Business Connectivity Services 300 Scot Hillier
    SPC338 Developing SharePoint applications with HTML5 and JQuery 300 Ted Pattison
    SPC339 Developing Windows Phone 7 Applications for SharePoint 300 Paul Stubbs
    SPC226 Document Management: Planning For Success 200 Russ Edelman
    Documents are boring but document solutions are not!! 300
    Drive Adoption and Get Users Excited About SharePoint 100
    SPC2993 EMC Corporation: Proven Best Practices for Virtualized SharePoint and FAST Search 200 James Baldwin , Eyal Sharon
    SPC341 Enabling the new Self-Service Alerting with SQL Server "Denali" Reporting Services 300 Lukasz Pawlowski
    SPC342 End-to-end BI Security Implementation Best Practices 300 Carl Rabeler
    SPC343 Enforce Governance by Automating Site Provisioning in SharePoint 2010 300 Ed Hild
    Enliven Customer Web Sites With Excel Services Web Parts and Embedding 200
    SPC407 Enterprise Deployment Considerations for the User Profile Service Application 400 Spencer Harbar
    SPC345 Everything you Need to Know About Security and FAST Search for SharePoint 2010 300 Steven Fowle, Paul Branson

    SPC346 Exploring the Office Developer Story in Microsoft Office 365 300 Donovan Follette
    SPC347 Extending SharePoint 2010 Health & Monitoring 300 Todd Carter
    SPC348 Extending SharePoint 2010 to your customers and partners 300 Corey Roth
    Extending the reach of SharePoint in multi-vendor environments 200
    SPC228 FAST Search for SharePoint 2010 – Hello World 200 Johnny Benitez
    From Web Content Management to Customer Experience Management: How to Optimize External Web Sites and Deliver a Search-Driven Experience 200
    SPC349 Generating Business Documents using Word Automation Services and Open XML 300 Scot Hillier
    GimmalSoft: Managing Electronic Records within SharePoint 2010, including DoD 5015.2 200
    SPC350 Got iPads, Android tablets, smart phones and Windows devices? Managing Office 2010 endpoints in an Interoperable and multi-device World 300 Jeremy Chapman
    Hallmark Retail Connect ® - Connecting Hallmark Gold Crown Stores 200
    SPC408 Handling Explosive Content Growth: Advanced Strategies for Managing Retention and Disposition in SharePoint 400 Ben Robb
    Bring SharePoint to Outlook and Lotus Notes 200
    SPC351 Hit The Ground Running with Claims Authentication in SharePoint 2010 300 Steve Peschka

    SPC232 How Del Monte Foods has kept things fresh through 4 versions of SharePoint 200 Michael Bernot , Scott Smith
    How eBay Successfully Upgraded their Intranet to SharePoint 2010 200
    SPC234 How FAST Search Empowers Information Discovery and Intranet Search at Microsoft 200 Kajal Soni, Rene Sanchez Almaguer

    How Gaming Giant, EA Drove Culture Change with SharePoint 2010 100
    How General Mills Made Enterprise Search Personal 200
    SPC257 How is Duet Enterprise helping customers in the real world? 200 David Christensen
    SPC352 How Microsoft Builds, Deploys and Runs SharePoint Online: A Peek Behind the Curtain 300 Roberto Taboada, Doron Bar-Caspi

    SPC353 How Microsoft Built Academy, its Social Video Platform 300 Austin Winters
    How SharePoint is Being Used to Manage Content at the World’s Largest Airline 200
    SPC3982 How to Effectively Deploy Updates in SharePoint 2010 300 Chris Whitehead, Sam Hassani

    How Turner Broadcasting System Turned On Employee Engagement With SharePoint 2010 200
    How We Built Community Site 300
    SPC298 HP: Private Cloud Collaboration with HP Enterprise Cloud Services 200 Chip Vollers
    SPC259 Identity in SharePoint Online 200 Phil Wicklund
    Implementing an OnPrem/Office365 Hybrid Architecture 200
    Implementing FAST Search for SharePoint 2010 at the IMF 200
    SPC239 InfoPath + SharePoint Designer + Office 365 = Forms in the Cloud! 200 Asif Rehmani
    SPC296 InfoPath 2010 – Best Practices for Design and Performance 200 Darvish Shadravan
    SPC397 InfoPedia: A High Performance, Self-Service knowledge sharing and Enterprise Content Management Solution 300 Ludovic Fourrage, Gary Snowberger

    SPC355 Instrumentation and Debugging on Premises and in the Cloud 300 David Mann
    SPC409 Integrating and Synchronizing SharePoint Metadata with other Metadata Stores and Environments 400 Pete Gonzalez, Daniel Kogan

    SPC356 Integrating Commerce Server with a SharePoint Internet Site 300 Shyam Narayan, William Cornwil
    SPC357 Integrating Microsoft Office 2010 and Windows Phone 7 300 Donovan Follette
    SPC358 Integrating Microsoft Visio Services with System Center for BI 300 Marshall Copeland, Scott Wold, Julian Soh

    SPC359 Integrating SharePoint Social features into your Windows Phone 7 Application 300 Todd Baginski
    SPC360 Integrating Social Networking Sites with a SharePoint Internet Site 300 Ryan Sockalosky, Brian Rodriguez

    Introduction to SharePoint 2010 Development 200
    SPC381 IW (heart) Office. Helping Information Workers Love Office Even More 300 Chris Auld
    SPC299 K2: Is Workflow and Process Automation the Key for Gaining More Value from SharePoint? 200 Olaf Wagner
    Knowledge Communities: Unlocking SharePoint 2010's Hidden Value 200
    KnowledgeLake: Getting Ready for the Cloud - Is your Content Strategy Partly Cloudy? 200
    SPC361 Landing SharePoint Data in Office Client Solutions 300 Donovan Follette
    SPC387 Leveraging AlwaysOn in SQL Server Denali with SharePoint 2010 200 Bill Baer
    SPC242 Leveraging Project 2010 with Office 365 for Project Management Success 200 Dux Raymond Sy
    SPC363 Localizing SharePoint Solutions/Lösungen/פתרונות/解决方案 300 Michael Ammerlaan
    SPC243 Make Your Search Social with FAST Search! 200 Paul Summers, Dan Benson
    SPC244 Making Enterprise Search a Strategic Platform Within an Organization 200 Jan Skjoy , Stein Wenberg Jacobsen
    SPC245 Making SharePoint 2010 Collaboration Rock by Increasing Findability 200 Scott Jamison
    SPC364 Making the Most of Search in SharePoint Online 300 Corey Roth
    SPC365 Making the Most of Your Content: Combining ECM and Enterprise Search 300 Jeff Fried
    SPC366 Making your SharePoint Websites Sing on Smartphones 300 Chris Auld , Gavin Barron
    SPC354 Managing Innovation with SharePoint & Project Server 2010 300 Simon Floyd
    Managing LOB Data with BCS & SharePoint Search 300
    SPC248 Measuring the Value of Your SharePoint 2010 Investments 200 Susan Hanley
    SPC2995 Metalogix: At Last - Size Doesn’t Matter! Considerations for building a SharePoint ECM platform to accommodate TBs of Content 200 Stephen Cawood, Steve Marsh

    Microsoft Project and Project Server 2010 Overview 200
    SPC249 Microsoft's Vision and Strategy for the Future of Business Intelligence 200 Steve Tullis, Kamal Hathi
    SPC368 Migrating & Organizing Content: Unleashing the Value of Automatic Placement 300 Robert Bogue
    SPC246 Migrating from 2007 to SharePoint 2010 - How to do it "Search First" 200 Mark Stone , Harry Jones , Natalya Voskresenskaya
    SPC212 More Than My: How Microsoft is Driving Social Adoption and Intranet Transformation 200 Chris Slemp, Sean Squires

    SPC369 Moving beyond Service Applications to build a social ecosystem 300 Andries den Haan
    SPC370 Multi-Tenancy with SharePoint 2010 300 Spencer Harbar
    SPC2992 NetApp: Addressing the Challenges of SharePoint Data Management Efficiently 200 Reena Gupta
    Neudesic: Space X Launches SharePoint 2010 with Neudesic Pulse for Out of this World Business Collaboration 200
    Nintex: Nintex Workflow End-to-End Solutions: One Big Demo, Live Without a Net 100
    SPC371 Notes From the Field: Sizing and High Availability with FAST Search Server 2010 for SharePoint 300 Steven Fowle, Barry Waldbaum

    SPC252 One Content Library, Multiple Websites with SharePoint 2010 at D&M Holdings 200 Alex Alexandrou, Lalit Panda

    SPC109 OpenText: Connecting SharePoint to your Information Governance Strategy 100 Dave Martin
    SPC263 Oranges, Rocket Ships and Six Pack Abs - What your SharePoint Corporate Portal is Lacking 200 Karuana Gatimu, Walter Cruzate
    SPC410 Out of the Sandbox and into the cloud: Build your next SharePoint app on Azure 400 Andrew Connell
    SPC254 Overview - Enterprise Search in SharePoint 2010 200 Mark Stone, Aastha Gupta
    SPC254 Overview of Business Intelligence in Office and SharePoint 2010 200 Seayoung Rhee, Albert Chew

    SPC255 Overview of Duet Enterprise for SharePoint 2010 and SAP 200 Joyanta Sen , Matjaz Perpar
    Overview of ECM in SharePoint 2010 200
    SPC372 Packaging SharePoint Branding Elements for Deployment 300 Christina Wheeler
    SPC374 Performance Testing and Optimizing SharePoint Websites 300 Israel Vega, Frank Marasco

    SPC373 Performance Tuning SharePoint 2010 300 Eric Shupps
    SPC375 Planning and Implementing SharePoint 2010 Upgrade and Migration 300 Todd Klindt, Shane Young
    SPC376 Planning and Managing Sandboxed Solutions and Services 300 Maurice Prather
    SPC258 Planning for the Lifecycle of Your SharePoint 2010 Website 200 PJ Zargarzadeh

    SPC260 Planning Your Approach to SharePoint ECM 200 Israel Vega , Charles Maxson
    SPC377 Planning your SharePoint 2010 Topology 300 Scott St. Jean, Oleg Lysyk

    Practical Approach to SharePoint Governance: The Key to Successful SharePoint 2010 Solutions 200
    SPC262 Project 2010 and SharePoint 2010 Better Together 200 Joriz De Guzman
    SPC378 Project 2010 Development for SharePoint Developers 300 Jan Kalis, Steven Haden

    SPC323 Putting Search on the Map with FAST Search For SharePoint 300 Steve Kuenzli
    SPC390 Quest Software: SharePoint Migration, Implementation and Customization Notes from the Field: Observations on What Not to Do 300 Adam Woodruff
    SPC379 Real World Examples for Virtualization, Configuration, and Security with FAST Search 300 Brent Groom
    SPC264 Relevance and Ranking in SharePoint Search 2010 200 Victor Poznanski
    SPC380 Remote BLOB Storage with SharePoint Server 2010 Deep Dive 300 Bill Baer
    SPC101 SAP: Bring the Power of SAP Applications to Microsoft SharePoint Platform 100 Pascal Gibert, Jamie Stuart

    Scaling SharePoint Document and Records Centers to Terabytes and Beyond 300
    SPC383 Search Engine Optimization on a SharePoint 2010 Internet Site 300 Daniel Haywood
    SPC384 Searching Video with FAST Search for SharePoint 2010 300 Matthew Roberts, Peter Petley, Nathan Treloar

    SPC411 Security Design with Claims Based Authentication 400 Nathan Miller , Israel Vega
    SPC265 Selecting a Records Management Strategy: What's Best for You? 200 Brad Teed
    SPC385 Service Application Federation with SharePoint 2010 300 Shannon Bray
    SPC386 Setting up and Configuring PowerPivot v2 for SharePoint in SQL Server 2008 R2 and "Denali" 300 Dave Wickert
    Seven Habits of Highly Effective SharePoint Developers 200
    SPC266 SharePoint 2010 Planning and Adoption Framework 200 Todd Ray
    SharePoint 2010 Solutions for Public Sector 200
    SPC271 SharePoint 2010: Improving Productivity with Social 200

    Dave Pae , Paul Javid

    SPC108 SharePoint and Friends: The Future of Productivity 100

    Chris Barnard , Laura Baur

    SPC272 SharePoint Designer 2010- A Tool for End Users? 200 Asif Rehmani
    SPC388 SharePoint Infrastructure for Geographically Distributed Organizations 300

    David McNamee

    SPC389 SharePoint Internet Sites that Integrate FAST Search 300

    Cem Aykan , Runar Olsen

    SPC273 SharePoint Lifecycle Management Solution with Project Server 200

    Scott Jamison , Christophe Fiessinger

    SPC274 SharePoint Online Overview 200 Mark Kashman
    SharePoint Workflow Best Practices 300
    SPC412 SharePoint, Azure and Claims Integration for Developers 400 Steve Peschka , James Petrosky
    SPC275 Solving Agile and PMO Problems by Integrating Project Server 2010 with Team Foundation Server 2010 200

    Christophe Fiessinger

    SPC392 Solving Enterprise Search Challenges with SharePoint 2010 300 Matthew McDermott
    SPC250 Solving Office Compatibility to accelerate Office deployments 200

    Curtis Sawin , Brian Shiers

    SPC393 Step-by-step: Building Search Driven Applications That Matter 300 Scot Hillier
    SPC278 Tackling the Challenges of a Multinational Organization with Collaboration 200 David McNamee
    SPC394 Taxonomy Based Content Targeting for a SharePoint Internet Site 300 Gary Lapointe
    SPC279 TELUS Goes Social: How a Canadian Telecom Changed the Learning Model 200 Dan Pontefract
    The City of SharePoint: What SharePoint Planners Can Learn From City Planning 200
    The Convergence of ECM and Knowledge Management: Strategies for Success 200
    SPC395 The End to End Guide to Upgrading Custom Code from 2007 to 2010 300 Becky Isserman
    The Forrester Survey: Best Practices in SharePoint 2010 Adoption and Migration 200
    SPC396 The Inside Scoop: How Microsoft Architected an Enterprise Scale Records Management Solution 300

    Nishan Desilva , Steve Pogrebivsky

    SPC281 The Inside Scoop: How Microsoft BI Solutions are helping Seattle Public Schools 200

    Adam Nathan , Paul Haldi , Stephen Drew

    SPC399 The Inside Scoop: How Microsoft Built a Scale Lab for 120 Million items 300

    Barry Waldbaum , Paul Andrew , Paul Learning

    SPC282 The Inside Scoop: How Microsoft IT Created their CIO Scorecard Using Self-Service BI 200

    Sanjay Soni , Sreepada Santhegudda

    The Inside Scoop: How Microsoft IT Enables Information Discovery with Managed Metadata 300
    SPC283 The Inside Scoop: How the SharePoint BI platform was used to create stunning dashboards at the Cherwell District Council 200 David McMahon
    SPC413 The Inside Scoop: How the SharePoint Dev Team Troubleshoots Performance and Reliability 400 Corey Roussel
    SPC414 The Nuts and Bolts of Managing Enterprise Content Types At Scale 400 Daniel Kogan
    SPC3991 The Official Guide to Troubleshooting FAST Search for SharePoint 2010 300

    Dan Harrington , Kristopher Loranger

    SPC3992 Tips and Tricks: Configuring SQL Server 2008 R2 Reporting Services with SharePoint 2010 300 Andrew Karcher
    SPC3993 Tips and Tricks: Effectively manage your SharePoint Farm with BI 300 Kevin Donovan
    SPC3997 TITUS: Using Claims for Authorization in SharePoint 2010 300 Antonio Maio
    True Business and IT Partnership: Best Buy Governance and SharePoint 2010 200
    SPC285 Understanding SharePoint Administration Part 1 200 Shane Young, Todd Klindt
    SPC286 Understanding SharePoint Administration Part 2 200 Shane Young, Todd Klindt
    SPC3995 Upgrading SharePoint Solutions and Features – A Closer Look 300 Wayne Ewington
    SPC3994 Upgrading User Profiles and My Sites from SharePoint 2007 to SharePoint 2010 300 Mirjam van Olst
    SPC3996 User-Centric Design for Deploying FAST Search Server 2010 for SharePoint 300

    Fergus Mcdowall

    SPC269 Using SharePoint to Drive Adoption of Records and Information Management – The Charter Communications Story 200 Dan Vasey
    SPC288 Using the Business Decision Appliance to Enable Self-Service BI in any Organization 200 Dana Kaufman , Dave Wickert
    SPC289 Visio Services – Creating a No-Code Visio Services Dashboard using Office 365 200

    Christopher Hopkins , Aftab Alam

    SPC291 What's new for SQL Server "Denali" Analysis Services and PowerPivot v2 200 John Hancock , T K Anand
    SPC292 What's new for SQL Server "Denali" Reporting Services 200 Carolyn Chau
    SPC3999 Why Your Next SharePoint Deployment Should be Virtualized 300 Damir Bersinic
    SPC251 With SharePoint 2010 Your Extranet is Easy 200

    Brad Freels , Peter Carson

    SPC221 Work Smarter, not Harder! Top Ten Tips for Improving Productivity with SharePoint 200 Debbie Ireland
  • Open XML SDK: Merging Documents

    (This post courtesy Natalia Efimsteva)

    Office Open XML (OpenXML) is a zipped, XML-based file format developed by Microsoft for representing spreadsheets, charts, presentations, and word processing documents. The specification has been standardized by Ecma International as ECMA-376, and Open XML is the native format for Microsoft Office 2007 and 2010.

    Open XML allows you to manipulate Microsoft Office files any way you need. For example, you can create .docx files programmatically on the server side (something that was never recommended with binary Office formats like .doc).

    The Open XML SDK 2.0 for Microsoft Office is built on top of the System.IO.Packaging API and provides strongly typed part classes to manipulate Open XML documents. The SDK also uses the .NET Framework Language-Integrated Query (LINQ) technology to provide strongly typed object access to the XML content inside parts of Open XML documents.

    The Open XML SDK 2.0 simplifies the task of manipulating Open XML packages and the underlying Open XML schema elements within a package. The Open XML Application Programming Interface (API) encapsulates many common tasks that developers perform on Open XML packages, so you can perform complex operations with just a few lines of code.

    Now let's discuss an often-asked question: how do you programmatically merge Open XML documents? It is not a very complicated task, but there are a few things to think about.

    First of all, let’s look at the internal .docx structure. Below is an unzipped view:


    The OpenXML SDK 2.0 contains a great tool – the Document Explorer – which allows us to view the XML markup of a document as well as the .NET code that would construct that markup:


    So when we merge documents, we need to merge not only the content (text) but also the styles of the document and other formatting settings.

    The Open XML SDK operates on Open XML elements such as paragraphs, rather than on objects that are logical for the user, such as pages, content, and so on.

    Fortunately, we have a tool that can make our life easier: DocumentBuilder from PowerTools for Open XML. Another way is to use altChunk, an element that specifies a location within a document where the contents of a specified external file should be imported into the main WordprocessingML document. The differences between these two approaches are described in the post “Comparison of altChunk to the DocumentBuilder Class”. We will focus on the DocumentBuilder approach here.
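    For reference, here is roughly what the altChunk route looks like. This sketch is not from the original post; the file names are placeholders, and it assumes a project referencing the Open XML SDK 2.0 (DocumentFormat.OpenXml) and WindowsBase:

    ```csharp
    using System.IO;
    using DocumentFormat.OpenXml.Packaging;
    using DocumentFormat.OpenXml.Wordprocessing;

    class AltChunkMergeSample
    {
        static void Main()
        {
            // Append ToAppend.docx to the end of Main.docx via altChunk.
            using (WordprocessingDocument mainDoc =
                WordprocessingDocument.Open("Main.docx", true))
            {
                MainDocumentPart mainPart = mainDoc.MainDocumentPart;

                // Embed the external document as an alternative format import part.
                string chunkId = "AltChunkId1";
                AlternativeFormatImportPart chunk =
                    mainPart.AddAlternativeFormatImportPart(
                        AlternativeFormatImportPartType.WordprocessingML, chunkId);
                using (FileStream fs = File.OpenRead("ToAppend.docx"))
                {
                    chunk.FeedData(fs);
                }

                // Reference the imported part from the document body.
                mainPart.Document.Body.AppendChild(new AltChunk { Id = chunkId });
                mainPart.Document.Save();
            }
        }
    }
    ```

    Note that with altChunk the actual merge is performed by the consuming application (Word) when the document is next opened, which is one of the behavioral differences covered in the comparison post.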

    Using the DocumentBuilder utility is really very simple:

    using (WordprocessingDocument part1 = WordprocessingDocument.Open(@"Doc1.docx", false))
    using (WordprocessingDocument part2 = WordprocessingDocument.Open(@"Doc2.docx", false))
    {
        List<Source> sources = new List<Source>();
        sources.Add(new Source(part1, true)); // true = keepSections
        sources.Add(new Source(part2, true));
        DocumentBuilder.BuildDocument(sources, "MergedDoc.docx");
    }
    The most interesting part is the second argument of the Source class constructor. Using the keepSections argument appropriately allows you to precisely control which sets of section properties (visual formatting, in other words) are moved from the source documents into the destination document. For more information, please see the post “How to Control Sections when using OpenXml.PowerTools.DocumentBuilder”.

    We have two documents:





    DocumentBuilder does all of the work of merging the two documents for you, preserving:

    • formatting
    • page numbers (including Link Sections)
    • headers and footers
    • orientation
    • and so on.

    That’s magic!


    Additional Resources

  • Capture a Windows® Image from a Reference Computer Using Capture Media—for IT Pros

    (This post courtesy of Simone Pace)

    In order to use System Center Configuration Manager 2007 to distribute the Windows 7 operating system to our managed clients, we need to get the OS bits to the site server somehow. One of the methods we can use is capturing a Windows 7 WIM image from a previously prepared reference computer.

    System Center Configuration Manager 2007 offers standard and easy ways to deploy software in our IT Infrastructure. One of the most relevant features we can take advantage of is the highly customizable Operating System Deployment capability built in the product.

    The WIM image format introduced with Windows Vista® and Windows 7 further simplifies OS distribution by being independent of the destination client's hardware, so we can use a single image to target different computers and keep our image repository less complex and more easily managed. This post shows the steps we can follow to successfully capture a WIM image of Windows 7 Ultimate Edition x64 from a reference computer.

    Note: Further posts will follow that illustrate the specific tasks required to upgrade a Windows XP computer.

    Testing lab description: the screenshots and computer names used in this article refer to a virtual scenario running on a Hyper-V R2 host:

    • Domain: (single site)
    • All servers are Windows Server 2008 R2 Enterprise.
    • Server CON-001
      • SCCM with almost all roles installed
      • SQL Server 2008
      • Windows Automated Installation Kit 2.0
      • WDS Transport Server role installed
    • Server CON-002
      • Active Directory Domain Controller role installed
      • DNS Server role installed
      • DHCP Server role installed
    • SCCM Primary Site: C01
    • Reference client: a clean Windows 7 Ultimate Edition x64 setup

    1. Create a Capture Media iso file.

    The ISO image we create in this section will be used to boot the reference machine and start the OS WIM image capture sequence.

    a. Log on CON-001 and open the Configuration Manager console.

    b. Go to Task Sequences node.

    c. Click on “Create Task Sequence Media” in the action panel.

    d. Select Capture Media and click Next on the welcome page.


    e. On the “Media file” page, click Browse, select the folder where you want to save the media ISO file, give it a name (for example, MediaCapture), and click Next.

    f. On “Boot Image” click Browse, and select the boot image suitable for your reference computer.

    Note: Two boot images (x86 and x64) are automatically added when you install the WDS role on the system.

    g. On Distribution Point, leave \\CON-001 (or select your preferred DP) and click Next.

    h. Review the summary and click Finish.

    i. The server starts building the iso image.


    j. Click Close to close the wizard.

    2. Prepare the reference computer.

    a. Log on to CON-Ref7Client with the local Administrator account

    b. Check the following requirements

    i. The computer must be a workgroup member.

    ii. The local Administrator password must be blank.

    iii. The local system policy must not require password complexity.

    iv. Apply the latest Service Pack and updates.

    v. Install the required applications.

    3. Capture the image using the Capture Media.

    a. Mount the MediaCapture.iso you created in Step 1 in the virtual DVD drive of the reference PC (if it is a VM), or

    b. Burn the MediaCapture.iso on a DVD and insert it in the computer.

    c. Boot the reference computer normally.

    d. Start the autoplay DVD and launch the Capture Image Wizard.


    e. Click Next.

    f. Set the path where you want to save the WIM file, give the image a name, and enter credentials that can access the path and write to it.

    g. Click Next.

    h. Fill in the required data in the Image Information window.


    i. View the summary and launch the capture by clicking Finish.


    j. The program will start executing the sysprep phase.


    k. After sysprep, the computer will restart in WinPE to start the capture.


    l. (Reboot).


    m. Computer restarts in WinPE and starts the Capture.


    n. Capturing first Partition (1-2)


    o. And capturing second partition (2-2).


    Note: The number of partitions captured depends on the reference PC's disk partitions. In the case shown, the VM had a 100 MB partition for the BitLocker® capability (Partition 1 of 2).

    p. When finished, press OK to quit and restart.


    q. On the Server we can see the captured image file.
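    Before importing the WIM, you can optionally sanity-check it with ImageX from the Windows AIK already installed on CON-001. This command is not part of the original steps, and the share and file name below are just examples; /info prints the image metadata (image name, index count, sizes):

    ```
    imagex /info \\CON-001\Images\Win7UltimateX64.wim
    ```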


    4. Add the file to the image repository in SCCM 2007.

    a. Share a folder and move the image file there (for example, \\ServerName\Images).

    b. Open the SCCM console, navigate to Site Database > Computer Management > Operating System Deployment > Operating System Images.

    c. Import the image by clicking Add Operating System Image in the task panel.

    d. Type or browse to the network path of the image you want to import, and click Next.


    e. Fill in the required information, then click Next.


    f. Review the summary and complete the wizard.



    5. Distribute the image to Distribution Point.

    a. In the SCCM console, navigate to the image you uploaded in step 4 (Site Database > Computer Management > Operating System Deployment > Operating System Images) and select it.

    b. Click Manage Distribution Points in the action panel.


    c. Click Next on the wizard starting page.

    d. As the DP doesn't have the image deployed yet, leave the default selection (copy) and click Next.

    e. Select the DPs you want to deploy the image to, and include their PXE DPs' hidden shares.

    f. Click Next and Next again in the Completion page.



    g. Check the copy progress in the Package Status folder until you see it is Installed.


    h. You are now ready to distribute the Windows 7 Ultimate x64 image to client computers, either by upgrading existing machines or installing new ones.

  • Configuring SharePoint 2013 Forms-Based Authentication with SQLMemberShipProvider

    Post courtesy Partner Solution Consultant Priyo Lahiri


    With SharePoint 2013, a lot of partners and customers are opening up their on-premises deployments to their vendors and customers. While the way you configure this is very similar to SharePoint 2010, things get a little tricky when you perform a real-world deployment spread across multiple servers. This post is an end-to-end walkthrough of setting up Forms-Based Authentication with the SqlMembershipProvider in a three-tier SharePoint 2013 deployment.


    It would be a whole lot easier if I had a single-server environment with the same account running everything, and that account were also a Domain Admin. However, I chose a different approach, since this is most likely how your real-world deployment will be set up, and the steps are a little different when your farm spans three servers. Here is my environment:

    WFE01 – Web Server running Microsoft SharePoint Foundation Web Application. I am connecting to the SQL instance using an Alias. It’s a very smart move. If you have ever had to move your SharePoint databases across SQL Servers or decommission an aging SQL Server, you know that having a SQL Alias will save you from a lot of nightmares. If you are looking for a step by step, click here.

    APP01 – Central Admin Server. Note: this is NOT running Microsoft SharePoint Foundation Web Application and is configured to be a “True” application server. This also means that the Web Application that we create will not reside on this server.

    SQL01 – SQL Server running SQL Server 2012 with SP1

    SharePoint Server 2013 RTM and Windows Server 2012 RTM are used for this setup.

    Tools to use

    While the steps documented below can be done without these tools, they do make your life a whole lot easier.

    1. FBA Configuration Manager for SharePoint 2013 – Author and credit go to Steve Peschka. The download comes with a ReadMe file; please read it, since you need to register the WSP that comes with it.

    2. SharePoint 2013 FBA Pack – Author and credit go to Chris Coulson. Here is the documentation that will tell you how to install, activate, and work with it. Not only will this make user management a breeze, it also has some very useful features like password reset and self-service account management.

    NOTE: I have only tested the user management portion of the FBA Pack and didn't have time to play with the rest of the features.

    How it’s done

    Step 1 – Create the Web Application

    In this step we will create the web application with Windows authentication (claims) and Forms-Based Authentication (FBA) on the same zone. In SharePoint 2013, you can have multiple authentication providers without extending the web application. Having said that, at times you might have to extend the web application depending on your scenario. More on that in a different post, where I will show you how to use the LDAPMembershipProvider to talk to your AD.

    From Central Administration, we will create a Web Application and call it and enable both Windows Auth and FBA. Note the names I am using: ASP.NET Membership Provider Name = SQL_Membership and ASP.NET Role manager name = SQL_Role. You can call them whatever you want, just ensure you use the same names everywhere.


    We will create a new app pool and use the web app pool account. Make a note of this account, since you will need to grant it permissions in the ASP.NET database in the next step.


    Create the web app and then the site collection; it doesn't matter which template you choose. Once the site collection is created, visiting it will take you to the default sign-in page, where you will be asked to choose an authentication provider to sign in with. If you want your external users to have only the FBA option, set this default zone to Windows auth, extend the web application, and put FBA on the extended web app. Obviously, the URLs will then be different.

    Your sign-in page should look like this (make sure your DNS record (CNAME) points to WFE01):


    Do you want to see a custom sign in page with your company brand on it? Well, let’s defer that to a different post.

    Step 2 – Verify Tools

    Now that the web app is created, we will make sure the FBA Pack and FBA Configuration Manager are deployed as they should be. Go to Central Administration >> System Settings >> Manage Farm Solutions. Make sure fbaConfigFeature.wsp is globally deployed and visigo.sharepoint.formsbasedauthentication.wsp is deployed to your web application (see the screenshot below). If visigo.sharepoint.formsbasedauthentication.wsp is not deployed, click on the WSP and deploy it to your web application.


    Log in to the site collection created in the step above and activate the following feature:

    Site Settings >> Site Collection Administration >> Site Collection Features >> Form based Authentication Management


    Once the feature is activated, it should add the following to your Site Settings under Users and Permissions:


    Step 3 – Creating the SQL Database for User Management

    The first task is to create the SQL database that will hold the extranet users:

    • Browse to C:\Windows\Microsoft.NET\Framework64\v4.0.30319
    • Run aspnet_regsql.exe
    • Click Next
    • Choose Configure SQL Server for Application Services >> Click Next
    • Enter your SQL Server name, choose Windows Authentication, and type in a database name


    • Click Next twice to provision the database
    • Now we need to add the application pool account that runs the web application and give it the required permissions. In this case, the account is waterfall\spweb. Perform the following steps:
      • Open up SQL Management Studio, Expand the database we created and expand Security
      • Right click Users and add a new User
      • User Type = Windows User
      • User name = choose <yourAppPoolAccountName>
      • Login name = browse and choose the login name (should be same as the app pool name above)


      • Click Owned Schemas and choose the following:
        • aspnet_Membership_FullAccess
        • aspnet_Personalization_FullAccess
        • aspnet_Profile_FullAccess
        • aspnet_Roles_FullAccess
        • aspnet_WebEvent_FullAccess
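    If you prefer to script these permissions rather than click through SSMS, the equivalent T-SQL looks roughly like this. This sketch is not from the original post: it assumes the names used in this walkthrough (Extranet_User_DB, waterfall\spweb) and that a server login for the app pool account already exists. Whichever page your SSMS dialog shows these aspnet_*_FullAccess principals on, the ASP.NET documentation describes them as database roles, so the script grants role membership:

    ```sql
    USE Extranet_User_DB;
    GO
    -- Map the web application pool account into the database
    CREATE USER [waterfall\spweb] FOR LOGIN [waterfall\spweb];
    GO
    -- Grant full access to the membership, personalization, profile,
    -- role, and web event features created by aspnet_regsql.exe
    EXEC sp_addrolemember 'aspnet_Membership_FullAccess', 'waterfall\spweb';
    EXEC sp_addrolemember 'aspnet_Personalization_FullAccess', 'waterfall\spweb';
    EXEC sp_addrolemember 'aspnet_Profile_FullAccess', 'waterfall\spweb';
    EXEC sp_addrolemember 'aspnet_Roles_FullAccess', 'waterfall\spweb';
    EXEC sp_addrolemember 'aspnet_WebEvent_FullAccess', 'waterfall\spweb';
    ```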


    Step 4 – Editing the web.config files

    We need to edit the following web.config files:

    • Web Application Web.config – WFE server
    • STS Application web.config – WFE server and Application Server
    • Central Admin web.config – CA Server
    • If you have more WFEs and app servers, you need to edit them as well. A lot of people put these settings in the machine.config file instead, so that they are inherited by every web.config; I am not too keen on editing the machine.config file.

    Let's log in to our WFE server and fire up FBAConfigMgr.exe. While you can get the code you need from here and edit web.config yourself, if you just let the tool run its course, it will create a timer job and do the task for you. In FBAConfigMgr, type in your web application URL and, from the sample configurations, choose the following:

    • People Picker Wildcard
    • Connection String
    • Membership Provider
    • Role Provider

    Here is what the screen looks like when default values are chosen:


    We will modify the default values to reflect the following (highlighted items need modification per your environment):

    • Web Application URL -
    • People Picker Wildcard - <add key="SQL_Membership" value="%" />
    • Connection String -
      <add name="fbaSQL" connectionString="server=SQL01;database=Extranet_User_DB;Trusted_Connection=true" />
    • Membership Provider -
      <add connectionStringName="fbaSQL" applicationName="/"
      type="System.Web.Security.SqlMembershipProvider, System.Web,
      Version=2.0.3600.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" />
    • Role Provider -
      <add connectionStringName="fbaSQL" applicationName="/"
      name="SQL_Role" type="System.Web.Security.SqlRoleProvider, System.Web,
      Version=2.0.3600.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"/>

    The screen should now look like this:


    It's time to hit Apply Config. This will create a timer job to update your web.config files. Though the tool creates a backup, you should be proactive and take your own backup of your web application web.config and STS web.config files. Here is how to back up the web.config file, and here is how to find the STS web.config file.

    Once you click Apply Config, the tool will tell you when it's done. It might take a few minutes before you see any changes, so wait for it (you should see a new backup of your web.config file with a time stamp and _FBAConfigMgr at the end of the file name). To verify that the job is done, open the web.config for your web application and search for <membership. You should see the following:

    <<Web Application web.config file>>


    The <connectionStrings> section gets added at the end of the file, right above </configuration>.
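    To make the placement concrete, here is a stripped-down sketch of the relevant parts of a finished web.config. This fragment is illustrative only, using the values from this walkthrough; your file contains many other elements and attributes, and SharePoint's own claims providers remain the defaults:

    ```xml
    <configuration>
      <system.web>
        <membership>
          <providers>
            <add name="SQL_Membership" connectionStringName="fbaSQL" applicationName="/"
                 type="System.Web.Security.SqlMembershipProvider, System.Web,
                       Version=2.0.3600.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" />
          </providers>
        </membership>
        <roleManager>
          <providers>
            <add name="SQL_Role" connectionStringName="fbaSQL" applicationName="/"
                 type="System.Web.Security.SqlRoleProvider, System.Web,
                       Version=2.0.3600.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" />
          </providers>
        </roleManager>
      </system.web>
      <connectionStrings>
        <add name="fbaSQL"
             connectionString="server=SQL01;database=Extranet_User_DB;Trusted_Connection=true" />
      </connectionStrings>
    </configuration>
    ```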


    <<STS web.config file>>

    Open up the STS Web.Config and you should see the following:


    The <connectionStrings> section gets added at the end of this file as well, just like in the web application's web.config.

    <<Central Administration web.config file on App Server>>

    If you go back to the application server and open up the web.config file for the Central Admin site, you will see that no changes were made there, so we will make the changes manually. Create a backup of the file, then open the file and find <machineKey. It should look like this:


    We will add the following (copied from web.config file of web application or the code from FBAConfigMgr)

    1. Search for <machineKey and paste the following under <roleManager><providers>
    <add connectionStringName="fbaSQL" applicationName="/" name="SQL_Role" type="System.Web.Security.SqlRoleProvider, System.Web,&#xD;&#xA; Version=2.0.3600.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" />

    2. Under <membership><providers> paste the following
    <add connectionStringName="fbaSQL" applicationName="/" name="SQL_Membership" type="System.Web.Security.SqlMembershipProvider, System.Web,&#xD;&#xA; Version=2.0.3600.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" />
    The screen should now look like this:

    3. Scroll to the end of the document and paste the following right before </configuration>

    <add name="fbaSQL" connectionString="server=SQL01;database=Extranet_User_DB;Trusted_Connection=true" />



    <<STS web.config file on App Server>>

    Just like with the Central Admin web.config, make the same changes in this web.config as well. Just make sure you are pasting the information from the roleManager providers and membership providers in the right place. Here is what the code looks like (you can use the code below and make changes to the highlighted areas to suit your environment):




    <add connectionStringName="fbaSQL" applicationName="/" name="SQL_Membership" type="System.Web.Security.SqlMembershipProvider, System.Web,&#xD;&#xA; Version=2.0.3600.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" />





    <add connectionStringName="fbaSQL" applicationName="/" name="SQL_Role" type="System.Web.Security.SqlRoleProvider, System.Web,&#xD;&#xA; Version=2.0.3600.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" />





    <add name="fbaSQL" connectionString="server=SQL01;database=Extranet_User_DB;Trusted_Connection=true" />


    Here is a screenshot


    Step 5 - Use FBA Pack to add and manage users

    Our configuration is done. We will now go to our site collection and use the FBA Pack to add and manage users and roles.

    Go to Site Settings and click on FBA User Management >> click New User, then create a test user and add it to the contributor group.


    Step 6 – Verify Forms user

    Now open IE in InPrivate mode and visit your site collection; this time choose Forms Authentication and enter the account information you just created to log in. You're done!


    Click on the user name and then My Settings; you will see the account information coming from the SQL membership provider.


    If you go to a document library and try to add the user there, you will see it resolves from your SQL database.



    How to create SQL Alias for SharePoint

    Follow the steps below to create a SQL Alias on all your SharePoint Servers:

    TechNet Reference:

    1. Perform this on the Application Server that is hosting Central Administration

    a. Stop all SharePoint Services

    b. Open cliconfg.exe from C:\Windows\System32\cliconfg.exe (the 64-bit version of the tool)

    c. Enable TCP/IP under the General tab

    d. Click on the Alias tab

    e. Type the current SQL Server name in the Alias Name field

    f. Type the current SQL Server name in the Server field (see the screenshot below; in this case the SQL alias and the SQL Server name are the same)

    g. Validate SQL Alias

    i. Create a new text file on SharePoint Server and name it “TestDBConnection.udl”

    ii. Double click to open the file and enter your SQL Server Alias name

    iii. Use Windows Integrated Security

    iv. You should be able to see all your SharePoint databases when you click on “Select the database on the Server”

    h. Start all services for SharePoint Server / Reboot SharePoint Server

    i. Perform the steps above on all other SharePoint servers
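    If you need to repeat this on many servers, note that cliconfg simply writes the alias to the registry, so the same alias can be created from an elevated command prompt. This sketch is not part of the original steps; the alias name, server, and port are examples for this environment (DBMSSOCN selects TCP/IP, and 32-bit clients read the Wow6432Node equivalent of this key):

    ```
    reg add "HKLM\SOFTWARE\Microsoft\MSSQLServer\Client\ConnectTo" /v SQL01 /t REG_SZ /d "DBMSSOCN,SQL01,1433" /f
    ```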

    How to backup web.config file

    To back up web.config file, perform the following:

    · From IIS Manager (start >> Run > inetmgr)

    · Right click on the web site and click Explore

    · Copy the web.config file somewhere else, or copy it to the same location with a different name


    Where is the STS web.config file?

    · On your WFE open up IIS Manager and expand SharePoint Web Services

    · Right click on SecurityTokenServiceApplication and click Explore


  • Migrating File Shares to SharePoint Online

    (Post courtesy Partner Solution Consultant Andre Kieft)

    It has been a while since I created a blog post, but recently I have received a lot of questions and requests for advice on how to migrate file shares to SharePoint and use SkyDrive Pro (SDP). So I figured I would create a blog post covering the things you need to consider as a Small and Medium Business (SMB) partner when you are planning to migrate file share content into SharePoint and want to use SDP to synchronize the SharePoint content offline.

    Note that these steps are valid both for SharePoint 2013 on-premises (on-prem) and for SharePoint Online (SPO).

    Step 1 – Analyze your File Shares

    As a first step, try to understand the data that resides on the file shares. Ask yourself the following questions:

    • What is the total size of the file share data that the customer wants to migrate?
    • How many files are there in total?
    • What are the largest file sizes?
    • How deep are the folder structures nested?
    • Is there any content that is not being used anymore?
    • What file types are there?
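    To answer these questions quickly, you can script the inventory instead of clicking through folders. The script below is not from the original post; it is a minimal Python sketch (standard library only) that assumes the share is reachable as a normal path, such as a mapped drive or UNC path:

    ```python
    import os
    from collections import Counter

    def analyze_share(root):
        """Walk a folder tree and gather the numbers the questions above
        ask for: total size, file count, largest files, deepest folder
        nesting, and a breakdown by file extension."""
        total_size = 0
        file_count = 0
        sizes = []            # (size, path) pairs for the "largest files" question
        max_depth = 0
        extensions = Counter()
        root = os.path.abspath(root)
        for dirpath, _dirnames, filenames in os.walk(root):
            rel = os.path.relpath(dirpath, root)
            depth = 0 if rel == "." else rel.count(os.sep) + 1
            max_depth = max(max_depth, depth)
            for name in filenames:
                path = os.path.join(dirpath, name)
                size = os.path.getsize(path)
                total_size += size
                file_count += 1
                sizes.append((size, path))
                ext = os.path.splitext(name)[1].lower() or "(no extension)"
                extensions[ext] += 1
        sizes.sort(reverse=True)
        return {
            "total_size_bytes": total_size,
            "file_count": file_count,
            "largest_files": sizes[:10],
            "max_folder_depth": max_depth,
            "files_per_type": dict(extensions),
        }
    ```

    Point it at the share root (for example, analyze_share(r"\\fileserver\marketing") – a hypothetical path) and compare the results with the limits discussed in the rest of this post.
    
    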

    Let me try to explain why you should ask yourself these questions.

    Total Size

    If the total size of the file shares is more than the storage capacity you have in SharePoint, you need to buy additional storage (SPO) or increase your disk capacity (on-prem). To determine how much storage you will have in SPO, please check the total available tenant storage in the tables in this article. Another issue that may arise is reaching the capacity limit per site collection. For SPO that is 100 Gigabyte; for on-premises the recommended size per site collection is around 200 Gigabyte, which also means the content database is around 200 Gigabyte, the recommended maximum. Though you can stretch this number in on-prem, it is not recommended.

    So, what should I do when my customer has more than 100 Gigabyte?

    • Try to divide the file share content over multiple site collections when it concerns content which needs to be shared with others.
    • If certain content is just for personal use, try to migrate that specific content into the personal site of the user.

    How Many Files

    The total number of files on the file shares is important, as there are limits in both SharePoint and SDP that can leave a library or list in an unusable state within SharePoint, and you might also end up with missing files when using the SDP client.

    First, in SPO we have a fixed limit of 5000 items per view, folder, or query. The reasoning behind this 5000-item limit boils all the way down to how SQL Server works under the hood; if you would like to know more about it, please read this article. In on-prem there is a way to raise this limit, but it is not something we recommend, as performance can decrease significantly when you do.

    Secondly, for SDP there is also a limit: 5000 items for synchronizing team sites and 20000 for synchronizing personal sites. This means that if you have a document library that contains more than 5000 items, the rest of the items will not be synchronized locally.

    There is also a limit of 5 million items within a document library, but I guess most customers in SMB won't reach that limit very easily.

    So, what should I do if my data that I want to migrate to a document library contains more than 5000 items in one folder?

    • Try to divide that amount over multiple subfolders, or create additional views that limit the number of documents displayed.

    But wait! If I already have 5000 items in one folder, doesn’t that mean that the rest of the documents won’t get synchronized when I use SDP?

    Yes, that is correct. So if you would like to use SDP to synchronize documents offline, make sure that the total number of documents per library in a team site does not exceed 5000.

    So, how do I fix that?

    • Look at the folder structure of the file share content and see if you can divide that data across multiple sites and/or libraries. If there is a Marketing folder, for example, it might make more sense to migrate that data into a separate site anyway, as this department probably wants to store additional information besides just documents (e.g. a calendar, general info about the marketing team, a site mailbox, etc.). An additional benefit of spreading the data over multiple sites/libraries is that it gives SDP users more granularity in choosing what data they take offline using SDP. If you migrated everything into one big document library (not recommended), all users would need to synchronize everything, which can have a severe impact on your network bandwidth.
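    The folder checks above can be scripted before migration. Here is a minimal Python sketch (the walking logic and the threshold constant are illustrative, not an official Microsoft tool) that flags folders whose direct item count exceeds the 5000 limit:

```python
import os
from collections import Counter

ITEM_LIMIT = 5000  # SPO view/folder limit, also the SDP team-site sync limit

def items_per_folder(root):
    """Count the files and subfolders directly inside each folder."""
    counts = Counter()
    for dirpath, dirnames, filenames in os.walk(root):
        counts[dirpath] = len(dirnames) + len(filenames)
    return counts

def over_limit(root, limit=ITEM_LIMIT):
    """Return {folder: item_count} for folders exceeding the limit."""
    return {path: n for path, n in items_per_folder(root).items() if n > limit}
```

    Running this against the file share root gives a quick list of folders that would need to be split across subfolders, libraries or sites before migration.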

    Largest File Sizes

    Another limit that exists in both SPO and on-prem is the maximum file size: 2 Gigabyte per file in both cases. On-prem the default is 250 MB, but it can be increased to a maximum of 2 Gigabyte.

    So, what if I have files that exceed this size?

    • Well, it won’t fit in SharePoint, so you can’t migrate these files. See what type of files they are and determine what they are used for in the organization. Examples could be software distribution images, large media files, training courses or other materials. If these are still being used and are not highly confidential, it is not a bad thing to keep them on alternative storage like a SAN, NAS or DVDs. If it concerns data that just needs to be kept for legal reasons and doesn’t need to be retrieved instantly, you might put it on DVD or an external hard drive and store it in a safe, for example.
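    A quick scan for such files takes only a few lines of Python. This is a rough sketch; the 2 GB constant matches the hard limit above, but adjust it to whatever maximum your environment is configured with:

```python
import os

MAX_FILE_SIZE = 2 * 1024 ** 3  # 2 GB hard limit per file

def oversized_files(root, max_size=MAX_FILE_SIZE):
    """Yield (path, size_in_bytes) for every file larger than max_size."""
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            size = os.path.getsize(path)
            if size > max_size:
                yield path, size
```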

    Folder Structures

    Another important aspect to look at on your file shares is the depth of nested folders and the length of file names. The recommended maximum length of a URL in SharePoint is around 260 characters. You would think that 260 characters is pretty lengthy, but remember that URLs in SharePoint often have encoding applied to them, which takes up additional space. E.g. a space is one character, but URL-encoded it becomes %20, which takes up three characters. The problem is that you can run into issues when the URL becomes too long. More details about the exact limits can be found here, but as a best practice try to keep the URL length of a document under 260 characters.

    So, what if I have files that will have more than 260 characters in total URL length?

    • Make sure you keep your site URLs short (the site title can be long, though). E.g. don’t make the URL Human Resources, but call it HR. When you land on the site, you will still see the full name Human Resources, as Site Title and URL are separate things in SharePoint.
    • Shorten the document name (e.g. strip off …v.1.2 or …modified by Andre), as SharePoint has versioning built in. More information about versioning can be found here.
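    Because the limit applies to the encoded URL, it helps to measure the length after percent-encoding rather than the raw path. A small Python sketch (the site URL and document path in the test are made-up examples):

```python
from urllib.parse import quote

MAX_URL_LENGTH = 260  # recommended maximum for a SharePoint document URL

def encoded_url_length(site_url, relative_path):
    """Length of the full document URL after percent-encoding.
    A space is one character on disk but three (%20) in the URL."""
    return len(site_url.rstrip("/") + "/" + quote(relative_path))

def too_long(site_url, relative_path, limit=MAX_URL_LENGTH):
    """True if the encoded URL would exceed the recommended limit."""
    return encoded_url_length(site_url, relative_path) > limit
```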

    Idle Content

    Migrating file shares into SharePoint is often also a good moment to clean up some of the information that the organization has been collecting over the years. If you find a lot of content that has not been accessed for a couple of years, what would be the point of migrating that data to SharePoint?

    So, what should I do when I come across such content?

    • Discuss this with the customer and determine if it is really necessary to keep this data.
    • If the data cannot be purged, you might consider storing it on a DVD or external hard drive and keep it in a safe.
    • If the content has multiple versions, such as proposal 1.0.docx, proposal 1.1.docx, proposal final.docx, proposal modified by Andre.docx, you might consider moving just the latest version instead of migrating them all. This manual process might be time consuming, but it can save you lots of storage space in SharePoint. Versioning is also built into SharePoint and is optimized for storing multiple versions of the same document. For example, SharePoint only stores the delta of the next version, saving storage space that way. Note that this functionality is only available in SharePoint on-prem.
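    To get a first impression of how much idle content there is, you can walk the share and look at timestamps. A rough Python sketch; it uses the last-modified time, since last-access tracking is often disabled on file servers, and the two-year cutoff is just an example:

```python
import os
import time

def idle_files(root, years=2):
    """Yield files whose last-modified time is older than `years` years."""
    cutoff = time.time() - years * 365 * 24 * 3600
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) < cutoff:
                yield path
```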

    Types of Files

    Determine what kind of files the customer has. Are they mainly Office documents? If so, then SharePoint is the best place to store such content. However, if you come across developer source code for example, it is not a good idea to move that into SharePoint. There are also other file extensions that are not allowed in SPO and/or on-prem. A complete list of blocked file types for both SPO and on-prem can be found here.

    So, what if I come across such file extensions?

    • Well, you can’t move them into SharePoint, so you should ask yourself: do I still need these files? And if so, is there an alternative storage facility, such as a NAS, I can store them on? If it concerns developer code, you might want to store it on Team Foundation Server (or the hosted Team Foundation Service) instead.
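    A pre-migration scan for blocked extensions can be sketched in Python as well. Note that the extension set below is an illustrative subset only; always check the official blocked file type list for your SharePoint version:

```python
import os

# Illustrative subset only -- consult the official blocked file type
# list for your SharePoint version before relying on this.
BLOCKED_EXTENSIONS = {".exe", ".bat", ".cmd", ".msi", ".vbs"}

def blocked_files(root, blocked=BLOCKED_EXTENSIONS):
    """Yield files whose extension is on the blocked list."""
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if os.path.splitext(name)[1].lower() in blocked:
                yield os.path.join(dirpath, name)
```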

    Tools for analyzing and fixing file share data

    In order to determine whether you have large files or exceed the 5000-item limit, for example, you need some kind of tooling. There are a couple of approaches here.

    • First off, there is a PowerShell script, enhanced by a German colleague, Hans Brender, which checks for blocked file types, bad characters in files and folders, and the maximum URL length. The script can even fix invalid characters and file extensions for you. It is a great script, but it requires some knowledge of PowerShell. Another alternative I was pointed at is a tool called SharePrep. This tool scans for URL length and invalid characters.
    • Secondly, there are other 3rd party tools that can scan your file share content, such as TreeSize. Such tools do not necessarily check for the SharePoint limitations we talked about in the earlier paragraphs, but at least they will give you a lot more insight into the size of the file share content.
    • Finally, there are 3rd party migration tools that move the file share content into SharePoint and check for invalid characters, extensions and URL length upfront. We will dig into these tools in Step 2 – Migrating your data.

    Step 2 – Migrating your data

    So, now that we have analyzed our file share content, it is time to move it into SharePoint. There are a couple of approaches here.

    Open with Explorer

    If you are in a document library, you can open up the library in Windows Explorer. That way you can simply copy and paste files into SharePoint.


    But there are some drawbacks to this scenario. First of all, I’ve seen lots of issues trying to open up a library in Windows Explorer. Secondly, the technology used for copying the data into SharePoint is not very reliable, so keep that in mind when copying larger chunks of data. Finally, there is also drag & drop, but this is limited to files (no folders) and to a maximum of 100 files per drag. So if you have 1000 files, you need to drag them in 10 chunks. More information can be found in this article. Checking for invalid characters, extensions and URL length upfront is also not addressed when using the Open with Explorer method.

    Pros: Free, easy to use, works fine for smaller amounts of data

    Cons: Not always reliable, no metadata preservations, no detection upfront for things like invalid characters, file type restrictions, path lengths etc.

    SkyDrive Pro

    You could also use SDP to upload the data into a library. This is fine as long as you don’t sync more than 5000 items per library. Remember, though, that SDP is a sync tool, not a migration tool, so it is not optimized for copying large chunks of data into SharePoint. Things like character and file type restrictions, path length, etc. are on the SDP team’s list to address, but they are not handled yet.

    The main drawback of using either the Open in Explorer option or SDP is that these tools don’t preserve the metadata of the files and folders on the file shares. By this I mean that fields like modified date and owner are not migrated into SharePoint: the owner becomes the user who copies the data, and the modified date becomes the timestamp of when the copy operation was executed. So if this metadata on the file shares is important, don’t use either of the methods mentioned earlier, but use one of the third party tools below.

    Pros: Free, easy to use, works fine for smaller amounts of data (max 5000 per team site library or 20000 per personal site)

    Cons: No metadata preservations, no detection upfront for things like invalid characters, file type restrictions, path lengths etc.

    3rd party tools

    Here are some of the 3rd party tools that will provide additional detection, fixing and migration capabilities that we mentioned earlier:

    (Thx to Raoul for pointing me to additional tools)

    The list above is in random order; some tools focus on SMB, while others are more focused on the enterprise segment. We can’t express a preference for one tool or another, but most of the tools have a free trial version available, so you can try them out yourself.


    So, when should I use what approach?

    Here is a short summary of capabilities:

                                     Open in Explorer   SkyDrive Pro                          3rd party
    Amount of data                   Relatively small   No more than 5000 items per library   Larger data sets
    Invalid character detection      No                 No                                    Mostly yes (1)
    URL length detection             No                 No                                    Mostly yes (1)
    Metadata preservation            No                 No                                    Mostly yes (1)
    Blocked file types detection     No                 No                                    Mostly yes (1)

    (1) This depends on the capabilities of the 3rd party tool.


    SDP gives me issues when synchronizing data
    Please check whether you have the latest version of SDP installed. There have been stability issues in earlier builds of the tool, but most of the issues should be fixed by now. To check whether you are running the latest version, open Word -> File -> Account and click Update Options -> View Updates. If your current version number is lower than the latest available version, click the Disable Updates button (click Yes if prompted), then click Enable updates (click Yes if prompted). This will force downloading the latest version of Office and thus the latest version of the SDP tool.


    If you are running the stand-alone version of SDP, make sure you have downloaded the latest version from here.

    Why is the upload taking so long?
    This really depends on a lot of things. It can depend on:

    • The method or tool that is used to upload the data
    • The available bandwidth for uploading the data. Tips:
      • Check your upload speed at and do a test for your nearest Office 365 data center. This will give you an indication of the maximum upload speed.
      • Often companies have less available upload bandwidth than people at home. If you have the chance, uploading from a home location might be faster.
      • Schedule the upload at times when there is much more bandwidth for uploading the data (usually at night)
      • Test your upload speed upfront by uploading maybe 1% of the data. Multiply it by 100 and you have a rough estimate of the total upload time.
    • The computers used for uploading the data. A slow laptop can become a bottleneck while uploading the data.
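    The 1% trick above is just arithmetic, and the same estimate can be made directly from the data size and a measured upload speed. A back-of-the-envelope Python sketch (the 50 GB and 10 Mbit/s figures are made-up examples):

```python
def estimated_upload_hours(total_gb, upload_mbps):
    """Rough upload-time estimate: total size in gigabytes divided by a
    sustained upload speed in megabits per second."""
    total_megabits = total_gb * 1024 * 8  # GB -> megabits
    return total_megabits / upload_mbps / 3600

# e.g. 50 GB at a sustained 10 Mbit/s:
# round(estimated_upload_hours(50, 10), 1) -> 11.4 hours
```

    Real throughput varies with protocol overhead and contention, so treat this as a lower bound and still test with a small sample first.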

    If you feel that there are things missing here, please let me know and I’ll try to add them to this blog post.

  • What’s new with SP1 for VMM 2008 R2


    (Post courtesy Iftekhar Hussain)

    As you all know, SP1 for Windows Server 2008 R2 and Windows 7 has just gone RTM, adding two new virtualization capabilities: Dynamic Memory and RemoteFX.

    Let’s first understand these capabilities in detail.

    Dynamic Memory: An enhancement to Hyper-V R2, Dynamic Memory pools all the memory available on a physical host. Dynamic Memory then dynamically distributes available memory, as it is needed, to virtual machines running on that host. Then with Dynamic Memory Balancing, virtual machines will be able to receive new memory allocations, based on changes in workload, without a service interruption.


    RemoteFX: Microsoft RemoteFX leverages the power of virtualized graphics resources and advanced codecs to recreate the fidelity of hardware-assisted graphics acceleration, including support for 3D content and Windows Aero, on a remote user’s device. This allows for a local-like, remote experience.


    In light of these new features, Service Pack 1 will also be released for VMM 2008 R2, to manage Dynamic Memory and RemoteFX on multiple Hyper-V servers from a single pane of glass.

    Let’s look at the requirements on your Hyper-V hosts and VMs for them to be manageable by VMM 2008 R2 SP1.

    System Requirements for Managing Dynamic Memory:

    • Windows Server 2008 R2 SP1 Host
    • Supported Operating System in the VM
    • Upgrade Integration Services in the VM

    RemoteFX Requirements on Hosts

    • CPU must support SLAT
    • One or more GPUs (Graphics Processing Units) that support DirectX 10
    • Enough GPU Memory available for max monitors and resolution
    • RemoteFX feature enabled under the Remote Desktop Services Role

    RemoteFX requirements on VMs

    • Supported Operating System in the VM
    • New Integration Services (part of Windows 7 SP1)
    • Can be combined with Dynamic Memory

    VMM 2008 R2 SP1 now has settings to manage the Dynamic Memory configuration of VMs, which are as follows:

    • Startup Memory (8 to 65536 MB)
    • Maximum Memory (8 to 65536 MB)
    • Memory Buffer (0 to 95%)
    • Memory Priority
      • Normal, Low, High
      • Custom (0 – 10000)
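    The ranges above are easy to sanity-check before applying a configuration. Here is a small illustrative Python validator; it is not part of VMM, it simply encodes the listed ranges:

```python
def validate_dynamic_memory(startup_mb, maximum_mb, buffer_pct, priority):
    """Check a Dynamic Memory configuration against the ranges
    exposed by VMM 2008 R2 SP1 (as listed above)."""
    errors = []
    if not 8 <= startup_mb <= 65536:
        errors.append("startup memory must be 8-65536 MB")
    if not 8 <= maximum_mb <= 65536:
        errors.append("maximum memory must be 8-65536 MB")
    if startup_mb > maximum_mb:
        errors.append("startup memory cannot exceed maximum memory")
    if not 0 <= buffer_pct <= 95:
        errors.append("memory buffer must be 0-95%")
    if priority not in ("Low", "Normal", "High") and not (
            isinstance(priority, int) and 0 <= priority <= 10000):
        errors.append("priority must be Low/Normal/High or custom 0-10000")
    return errors
```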


    To manage RemoteFX, the following settings have been added:

    • Default Standard video adapter
    • New RemoteFX 3D video adapter
    • Maximum number of monitors: 1,2,3,4
    • Max monitor resolution
      • 1024x768 (4 monitors max)
      • 1280x1024 (4 monitors max)
      • 1600x1200 (3 monitors max)
      • 1920x1200 (2 monitors max)
    • Additional VM property
      • GPU ID (PowerShell only)


    Since RemoteFX requires a SLAT-capable CPU, a specific GPU and available GPU memory, VMM 2008 R2 SP1 will also let you identify the following on the hosts.

    1. CPU Supports SLAT


    2. GPU and available Memory


    3. Intelligent Placement

    When creating a new VM or migrating a VM from one host to another, VMM 2008 R2 SP1 adds new checks for Dynamic Memory and RemoteFX to Intelligent Placement.

    • Use current memory footprint for DM VMs on hosts: When migrating or creating a new VM, VMM checks how much memory the VM is currently running with. For example, you may have specified a startup memory of 1 GB and a maximum of 6 GB, but at the time of migration the VM is running at 2 GB. In that case, VMM checks whether the other host has at least 2 GB of free memory before migrating this VM.
    • Check for GPU compatibility when migrating RemoteFX VMs: When migrating a RemoteFX-based VM, VMM 2008 R2 SP1 checks whether an identical GPU with the required memory is available on the other host; RemoteFX must also be enabled in RDS there.
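    The Dynamic Memory placement check boils down to comparing the VM's current footprint with the target host's free memory. A trivial Python sketch of that logic (illustrative only, not VMM code):

```python
def can_place_dm_vm(current_vm_memory_mb, host_free_memory_mb):
    """Intelligent Placement sketch for a Dynamic Memory VM: VMM compares
    the VM's *current* memory footprint (not its configured maximum)
    against the free memory on the target host."""
    return host_free_memory_mb >= current_vm_memory_mb

# The blog's example: startup 1 GB, max 6 GB, currently running at 2 GB --
# the target host only needs at least 2 GB free, not 6 GB.
```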


    4. Updates to Performance and Resource Optimization

    PRO ties specific Operations Manager alerts to remediation actions in VMM; for example, VMM load-balances VMs in a cluster when CPU or memory utilization exceeds a threshold. The SP1 update to the VMM PRO pack considers the current memory utilization (not the configured fixed memory) for Dynamic Memory VMs and matches it with the memory available on the other nodes.

    How do I upgrade from SCVMM 2008 R2 to SP1?

    Pretty simple, just a few steps to upgrade:

    • Backup VMM Database
    • Launch setup and follow the Upgrade Wizard
    • Repeat on each box [Console, Self-Service Web Server, Operations Manager Server]
    • Remove, then import new PRO Management Pack in Operations Manager
    • R2 agents are supported in SP1, so there is no need to re-deploy agents to hosts and library servers
    • Upgrade the Integration components on all the VMs.


    VMM 2008 R2 SP1 will take virtualization management to a whole new level and is scheduled to be released within 30 days of the SP1 release by the Windows Server team. So we can expect it sometime next month.

    I hope this post was of some help for those who are expecting VMM to manage Dynamic Memory and RemoteFX on multiple hosts from a single console, instead of managing them locally using Hyper-V Manager on each box.

    If you have any further queries, please feel free to reach me



  • Network Monitoring with System Center Operations Manager 2012

    (Post courtesy Nikunj Kansara)

    This post describes the network monitoring capabilities of the System Center Operations Manager 2012 Beta.

    In my opinion, network monitoring is the most exciting feature of the upcoming Operations Manager 2012 release. This article will give you an overview of network monitoring: how to discover network devices, how to configure network monitoring rules and object discoveries, and a sneak peek at the reports generated from network monitoring and the network dashboard.

    I have split this blog up into four topics:

    How to discover the network devices:

    Discovery is the process of identifying network devices to be monitored.

    Operations Manager 2012 can monitor devices that use SNMP v1, v2c and v3.

    The benefit of configuring network monitoring is that if a critical server appears to be down, we will also see an alert that the switch/router port connected to that server is down. We can also see a network topology diagram called the Network Vicinity view.

    Operations Manager 2012 provides the following monitoring for discovered network devices:

    • We can view connection health between the network devices and between the server and the network device
    • We can view the VLAN health based on health state of switches in VLAN
    • We can view HSRP group health based on health state of individual HSRP end points
    • We can view Port/Interface Monitoring like Up/Down, Inbound / Outbound volume traffic
    • We can view Port/Interface Utilization, Packets dropped, broadcasted.
    • We can view Processor Utilization for some certified devices
    • We can view Memory utilization for some certified devices

    Network device discovery is performed by discovery rules that you create.

    Below are steps for creating the discovery rule:

    1. Open the Operations Console

    2. Go to the Administration workspace, right-click Administration, and then click Discovery Wizard

    Figure 1

    3. The What would you like to manage? page shown in Figure 1 will open; select the Network Devices option and click Next.

    4. The General page in Figure 2 appears; provide the name of the discovery rule, select the management server from the drop-down, and then click Next.


    • We can create one discovery rule per management server or gateway server.
    • If we are creating a second discovery rule, we will only see the management servers that don’t yet have a discovery rule associated with them.
    • Also, we might want to plan ahead and strategically place the management servers or gateway servers so they can access the network devices that we would like to discover.

    Figure 2

    5. On the Discovery Method page in figure 3, we need to select the method to discover the network device. In this example we need to select Explicit discovery and then click next.


    • Differences between Explicit discovery and Recursive Discovery:
      • Explicit discovery – An explicit discovery rule will try to discover the devices that you explicitly specify in the wizard by IP address or FQDN. It will only monitor those devices that it can successfully access. The rule will try to access the device by using ICMP, SNMP, or both, depending on the configuration of the rule.
      • Recursive discovery – A recursive discovery rule will attempt to discover the devices that you explicitly specify in the wizard by IP address, as well as other network devices that are connected to the specified SNMP v1 or v2 device and that the specified device knows about through its Address Resolution Protocol (ARP) table, its IP address table, or the topology Management Information Base (MIB).

    Figure 3

    6. On the Default Account page in Figure 4, click Create default Run As Account, as we need to create an account that will be used to discover the network devices.

    Figure 4

    7. On the Introduction page of Create Run As account Wizard in Figure 5, click next

    Figure 5

    8. On the General Properties page of the Create Run As account Wizard in Figure 6; enter the Display name of the Run As Account and click next.

    Figure 6

    9. On the Credentials page on the Create Run As account Wizard in Figure 7, enter the SNMP community string and click on create.

    SNMP Community Strings

    We can configure read-only [RO] and read-write [RW] SNMP community strings. With the RO community string we have read access to the network device. Operations Manager 2012 needs only the RO SNMP community string to access the device, so it should be easy to convince the network guys ;-)

    Figure 7

    10. On the Default Account Page in Figure 8, select the created Run As Account and click on Next.

    Figure 8

    11. On the Devices Page, click on Add Button

    Figure 9

    12. In the Add a Device window in Figure 10, enter the IP address/name of the device we want to monitor; select the access mode ICMP and SNMP (you can also select ICMP only or SNMP only); select the version of SNMP, v1 or v2; select the created Run As account; and then click OK.


    • We use ICMP only in the scenario where we need to know the availability of the gateway router from the ISP to verify if the interface is up or down.
    • We use SNMP only in the scenario where we want to monitor a Firewall on which ICMP is blocked.
    • If we specify that a device uses both ICMP and SNMP, Operations Manager must be able to contact the device by using both methods or discovery will fail.
    • If you specify ICMP as the only protocol to use, discovery is limited to the specified device and monitoring is limited to whether the device is online or offline.

    Figure 10

    13. Now Click Next on the Devices Page as in Figure 11.

    Figure 11

    14. On the Schedule discovery Page in Figure 12, Select the discovery schedule and click Next.


    You may also select to run the discovery manually.

    Figure 12

    15. Click Create on the Summary page

    Figure 13

    16. Click Yes on the Warning box as in Figure 14. We need to distribute the created Run As account to the Management server for discovery and to the Management Server resource pool for monitoring that was selected in General properties [Figure 2]

    Figure 14

    17. Click close on Completion.

    Figure 15

    18. Now, in the Administration workspace, go to the Discovery Rules node under the Network Management node. You will be able to see the discovery rule that was created. Click Run if you want to run the discovery manually. See Figure 16.

    Figure 16

    19. See Figure 17 for the Task Status window that appears when we run the discovery manually. The Success status indicates that the discovery was submitted successfully, not that the devices have been discovered. Click Close.

    Figure 17

    20. The discovery rule will show a Probing status when it has actually found the device. See Figure 18.

    Figure 18

    21. The discovery rule starts processing the discovered components, as in Figure 19.

    Figure 19

    22. The status of the discovery rule will go to Pending, and it will run again as per the discovery schedule that we selected in the wizard. If we had selected the manual discovery option in the wizard, the status would go to Idle. See Figure 20.

    Figure 20

    23. Go to Network Devices under Network Management to see the discovered device. See Figure 21.

    Figure 21

    24. Double click the Network device to view the properties page and more information about that discovered device. See Figure 22.

    Figure 22

    B. Network Monitoring:

    We will look at some of the views that are relevant to the network device we discovered in the previous step.

    1. Go to the Monitoring workspace and double-click the Network Monitoring folder to see the network views. See Figure 23.

    Figure 23

    2. Select the Network Devices view to see the Network Devices being monitored.

    Figure 24

    3. Click on Health Explorer to see the subcomponents of the switch. See Figures 25 & 26.

    Figure 25

    Figure 26

    4. Click on the VLANs view to see the VLANs in which the switch is participating. See Figure 27

    Figure 27

    5. Click on the ICMP Ping Response Performance view or Processor utilization Performance view to see the performance graph for ping response. See Figure 28 & 29.

    Figure 28

    Figure 29

    C. Dashboard:

    1. To see the connections between the connected nodes and the network device, click on the Network Vicinity view. See figure 30.

    Figure 30

    2. Click on the show computers check box to see the connections. See figure 31.


    By default we can see connections which are one hop away from the network device.

    We can select at most 5 hops. In environments with a large number of network devices, selecting five hops can make Operations Manager 2012 take a while to show the data, and the view might not be useful to you.

    Figure 31

    3. Now, coming back to the Network Devices view in the Monitoring workspace, click on the Network Node Dashboard. We will be able to view all the information related to the network device in just one window. See Figures 32, 33, 34 and 35.

    Figure 32

    Figure 33

    Figure 34

    Figure 35

    D. Reporting: [See Figure 36]

    Processor Utilization Report: It displays the processor utilization of a particular network device in a specified period of time.

    Memory Utilization Report: It displays the percentage of free memory on a particular network device in a specified period of time.

    Interface Traffic Volume Report: It displays the rate of inbound and outbound traffic that goes through the selected port or interface in a specified period of time.

    Interface Error Packet Analysis Report: It displays the percentage of error packets or discarded packets, both inbound and outbound, for the selected port or interface.

    Interface Packet Analysis Report: It displays the types of packets (unicast or non-unicast) that traverse the selected port or interface.

    Figure 36

    Additional Resources

  • Monitoring machines using Certificates with System Center Operations Manager 2007 R2 - Part 2

    (Post courtesy Rohit Kochher)

    In part 1, we discussed scenarios where Operations Manager uses certificates to monitor computers in a workgroup or non-trusted domain. We also configured the certificate template for Operations Manager. In this post we will use certificates for gateway servers and deploy them. At the end we will also have the steps to monitor machines in a workgroup.

    Download and Import trusted Root (CA) Certificate

    Open a browser to https://<servername>/certsrv, where <servername> is the name of the server running Active Directory Certificate Services. On the welcome page, click Download a CA certificate, certificate chain, or CRL.


    Save the certificate. Now, to import it, open MMC. From File, select Add/Remove Snap-in. Add the Certificates snap-in and select Computer Account.

    Expand Certificates (Local Computer), expand Trusted Root Certification Authorities, and then click Certificates. Use All tasks to Import the trusted root (CA) certificate.

    We already covered configuring the certificate template for SCOM in part 1.

    Request a certificate for SCOM/Gateway server

    1) Open a browser to https://<servername>/certsrv again. On the Select a task page, click Request a certificate.

    2) On the Request a Certificate page, select advanced certificate request.


    3) On the Advanced Certificate Request page, select Create and submit a request to this CA.


    4) In Certificate Template, select from the drop-down the certificate template that we configured in part 1 of the series.


    5) At the bottom of the page, in Friendly Name, give the FQDN of the SCOM server/gateway server.

    We need to install the certificate on both the SCOM server and the gateway server.

    Installation of Gateway server

    On the gateway server, open the SCOM installation media. Click Install Operations Manager 2007 R2 Gateway. Enter the management group name and the management server.


    Select the Gateway Action Account; we have two options: Local System or a domain account.


    Click Next and wait for the installation to complete.

    Registering Gateway server with Management group using gateway approval tool

    Copy the Microsoft.EnterpriseManagement.GatewayApprovalTool.exe from the installation media to the Operations Manager 2007 installation directory.

    Open a Command Prompt window, and navigate to the \Program Files\System Center Operations Manager 2007 directory.

    The syntax of the command is: Microsoft.EnterpriseManagement.GatewayApprovalTool.exe /ManagementServerName=<managementserverFQDN> /GatewayName=<GatewayFQDN> /Action=Create


    Importing Certificates with the MOMCertImport.exe Tool

    We need to import the certificate on both the management server and the gateway server.

    At a command prompt, navigate to \SupportTools\<platform> (i386 or ia64).

    Run momcertimport.exe /SubjectName <certificate subject name>.


    With this, I was able to monitor machines in a non-trusting domain using a gateway server and certificates. To confirm everything is good, you can check one thing.

    Open the certificate that you installed on the management/gateway server. Click on the Details tab and note the serial number.

    Now navigate to HKLM\Software\Microsoft\Microsoft Operations Manager\3.0\Machine Settings and check the value of ChannelCertificateSerialNumber. The serial number of the certificate should be listed backwards (byte-reversed) here in the registry.
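    The byte reversal is simple to reproduce if you want to verify the registry value by hand. Here is an illustrative Python sketch (the serial-number value below is a made-up example, and the actual registry display may differ in spacing and case):

```python
def registry_serial(cert_serial):
    """Byte-reverse a certificate serial number (hex octets) the way it
    appears in the ChannelCertificateSerialNumber registry value."""
    octets = cert_serial.replace(" ", "").lower()
    pairs = [octets[i:i + 2] for i in range(0, len(octets), 2)]
    return " ".join(reversed(pairs))

# registry_serial("61 2d 5c 05 00 00 00 00 00 05")
# -> "05 00 00 00 00 00 05 5c 2d 61"
```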

    Further, I can configure multiple gateway servers for agents to fail over to, and multiple SCOM servers for my gateways to fail over to. This can be done using PowerShell and is covered in the blog.

    Monitoring Workgroup machines using certificates

    Now we will discuss how to monitor machines that are in a workgroup. I have outlined this in a few steps:

    Name resolution between the SCOM server and the workgroup server can be done via hosts files, which are located at C:\Windows\System32\drivers\etc.

    Make sure TCP port 5723 is open for communication. You can use telnet to confirm this.

    You can manually install the SCOM agent on the workgroup server and later use certificates. Copy the AGENT folder from the SCOM media. Based on the 32/64-bit OS, run the MSI. Specify the management group and SCOM server name and complete the installation.

    Check the Download and Import trusted Root (CA) Certificate section at the beginning of this blog. Follow it to download and import the CA certificate on the local computer. The CA certificate should be imported into the Trusted Root Certification Authorities store. Here I am assuming you can connect to https://<servername>/certsrv through your browser.

    The next step is to get a certificate for the workgroup server. In part 1 we discussed how to configure the certificate template for SCOM. Check the Request a certificate for SCOM/Gateway server section at the top of the blog to request a certificate from the workgroup server. You need permissions in the SCOM server's domain for this. Also, while requesting the certificate, specify the workgroup server's name as the FRIENDLY NAME.

    By default, the certificate is installed in the Personal store of the current user. Open MMC, export that certificate from the current user store to a file, and then import it into the Personal store of the local computer.
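    The export/import can also be done with certutil (a sketch; the subject name, file path, and password are hypothetical):

    ```powershell
    # Export the certificate (with private key) from the current user's store,
    # then import it into the local computer's Personal store
    certutil -user -p P@ssw0rd -exportPFX My "WG01.workgroup.local" C:\temp\wg01.pfx
    certutil -p P@ssw0rd -importPFX C:\temp\wg01.pfx
    ```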

    Importing Certificates with the MOMCertImport.exe Tool

    Copy MOMCertImport from SCOM support tools on workgroup server.

    On Command prompt Navigate to \SupportTools\<platform> (i386 or ia64).

    Run momcertimport.exe /SubjectName <certificate subject name>.

    Process Manual Agent Installations in Operations Manager 2007

    On your Operations Manager server, configure the security settings for manually installed agents. The setting should be Review new manual agent installations in pending management view, with or without Auto-approve new manually installed agents, depending upon your security requirements. Refer to the article for more details.

    Thanks for reading!!

  • Monitoring machines using Certificates with System Center Operations Manager 2007 R2 - Part 1

    (Post courtesy Rohit Kochher)

    In this series of two blog posts, we will discuss monitoring machines in a non-trusted domain. In part one we will discuss scenarios for using certificates and configuring a certificate template for Operations Manager. In part two, we will cover the installation and approval of gateway servers and configuring monitoring for workgroup machines.

    Kerberos or Certificates

    System Center Operations Manager 2007 R2 uses mutual authentication to communicate with agents. This can be done using Kerberos v5 or certificates. If the monitored computers are in the same domain as the Operations Manager server, or if the two domains have a two-way trust, we can use Kerberos. But if you want to monitor machines in a workgroup or in a non-trusted or one-way trusted domain, we need certificates. Certificates provide the mutual authentication in those cases.

    The following blog post from the Operations Manager support team has a nice diagram that shows where you would use Kerberos vs. certificates for authentication: Step by Step for using Certificates to communicate between agents and the OpsMgr 2007 server.

    Scenarios to use Certificates

    If my Operations Manager server is in domain A and I want to monitor machines which are in a workgroup, I need to use certificates. I will install certificates on my Operations Manager server and on each workgroup machine that I want to monitor.

    If my Operations Manager server is in domain A and I want to monitor machines in untrusted domain B, I will use certificates along with a gateway server. But this time I don’t need to install certificates on all machines in domain B. I can simply install the Gateway Server in domain B and have certificates installed on the Operations Manager server of domain A and the Gateway Server of domain B. Within Domain B, Kerberos is the security mechanism between the agents and the Gateway server. Between the Gateway and Operations Manager servers, certificates are used to provide mutual authentication.

    Another benefit of gateway servers is that I need to open only one port, TCP 5723, between the Gateway and Operations Manager servers.

    We will also need name resolution between the Operations Manager server and the gateway server. This can be done using DNS, hosts files, etc.

    Let’s get it Started

    I have installed the Active Directory Certificate Services (AD CS) and Certificate Authority Web Enrollment roles on Windows Server 2008 R2. The Certificate Authority is of the Enterprise type. More on the 2008 R2 CA can be found here. Also, to configure an HTTPS binding for the CA, check this article.

    Configuring certificate template for SCOM

    1) On 2008 R2 Server, Click Start, then Administrative Tools and open Certification Authority snap-in. Click on Certificate templates, then on Manage.


    2) Right Click on IPSec (offline request) template and select Duplicate Template option.

    Select Windows Server 2003 Enterprise option for the version.

    3) In the Properties of the new template, on the General tab, give the template a name such as OpsMgr Certificate using Template Display Name.


    4) On the Request Handling tab, check Allow private key to be exported.

    5) Click the Extensions tab, and in Extensions included in this template, click Application Policies, and then click Edit. In the Edit Application Policies Extension dialog box, click IP security IKE intermediate, and then click Remove.


    6) Click Add and then select Client Authentication and Server Authentication and click OK.


    7) Click on the Security tab and give Authenticated Users the Read and Enroll permissions.

    8) Close the Certificate templates console.

    Add the configured templates to certificate templates folder

    Right-click Certificate Templates in the CA console. Click on New and then Certificate Template to Issue. Select the certificate template that we named in step 3.


    This is how we configure our certificate template for SCOM. In part 2 we will discuss the installation of certificates and the deployment of the gateway server.

    Stay Tuned!!


  • Security considerations when using the *.cloudapp.net domain as a production environment in Windows Azure

    (Post courtesy Giuseppe Branca)

    The Windows Azure platform offers you an easy way to publish applications: just press the publish button in the management user interface. I want to go a little more in depth on what can be a possible security concern if you publish applications using *.cloudapp.net as a production environment.

    Let’s start with a recap

    After having deployed your Windows Azure solution package in the cloud you can start your application. Starting your application will automatically create a first type of URI related to the application; this URI follows the pattern http://<guid>.cloudapp.net, where the wildcard part stands for a group of letters and digits randomly generated by the Windows Azure platform. At this point you are in a phase of the lifecycle of a Windows Azure application called staging. The staging phase is intended to be used only for testing; in this phase your application is fully published on the Internet but is reachable only by users who know the URI: i.e. only by you, people you're sharing the URL with, and/or people who can access your Windows Azure account for management purposes.

    The second phase is putting the solution in production; as before, it is very easy: just press another button in the Windows Azure management interface. After that your application will be published with an XXXX.cloudapp.net domain, where XXXX is the name you chose when you uploaded your Windows Azure package.

    This means mainly two things:

    1. A very easy way to publish an application, if you are not bothered that users will access your application through the cloudapp.net domain.

    2. There will be a lot of Windows Azure applications running within the same *.cloudapp.net domain.

    Indirectly, points 1 and 2 bring two downsides you must be aware of. You will now see that for an application in production it's better to avoid the *.cloudapp.net domain and to associate your own custom domain instead.


    The first security concern is phishing. Phishing is a technique whereby an attacker sets up a frontend copy of a website and, using social engineering techniques, brings users of the attacked website to the phishing site. The user is asked to enter sensitive data (e.g. username and password) into the phishing site, and often users will provide the data because, based on the look and feel and URL of the phishing website, they are tricked into thinking that they are interacting with the original website they trust.

    A simple example will let you understand this scenario: you did a good job and you finally published your Windows Azure application, called myroomcolor, using the *.cloudapp.net domain, so it is published as http://myroomcolor.cloudapp.net. An attacker decides to attack your site. The attacker creates a website with the same layout as yours, publishes it on Windows Azure under a very similar URL, and then uses techniques (such as links in emails, etc.) to bring visitors to his site. If a user is not very careful he will not notice any difference from your own website, and may enter sensitive data (such as username and password) into the phishing site. This is because users often base their trust on the site's layout and on the URL of its pages. The layout can easily be cloned, and as you can see an attacker can choose a URL very similar to the attacked one, with a typo or other very small differences that can mislead users.

    Cookie stealing, cookie hijacking, CSFR

    The second security concern in hosting an application in the production environment using the *.cloudapp.net domain is not related to the Windows Azure platform but to the design of the HTTP protocol, which makes this scenario risky.

    Let’s consider the HTTP protocol's cookie policy in brief. Cookies are a simple way to share data (typically user control data, such as a user session id) between a web server and an HTTP client, using a dedicated field in the HTTP header. The HTTP protocol scopes a cookie to a domain (cookies are bound to a domain); so every time an HTTP client (e.g. a web browser) connects to a domain it can get/set the values of cookies against the server (and the server vice versa to the client). Sending cookies happens contextually and automatically on every HTTP request, with every verb (GET, POST, PUT, DELETE; see RFC 2616), from the client (e.g. a web browser). Users typically have little control over this; clients have settings to customize the behavior, but a user cannot choose, for every HTTP request, which cookies he wants to allow or reject. The server behaves the same way, but developers have more control over it because they can write custom logic for a cookie allow/deny policy.

    Consider the following scenario: applications A and B are on the *.cloudapp.net domain, where application A is our application and application B is used by the attacker. Application A binds its cookie domain to *.cloudapp.net instead of its own subdomain.

    • User uses application A for hotel reservation. After he has booked a room he doesn't log out from the application and simply closes the tab. Then the user opens a new tab and goes to application B. When the user browses application B, the browser silently sends the cookie to B. So application B can read all the cookies set while the user was dealing with application A. Application B can also set values for those cookies, so on the browser's next visit to application A, the values set by application B will be sent to application A.
    • User uses application A for hotel reservation. While he's browsing A he decides to have a look at his favorite weather channel to check the weather during his holiday. So the user opens a new tab, types the URL of application B, and looks at the forecast. For every page request to application B, application B can read and set values for the cookies set by application A.

    These two scenarios can easily be generalized into a wider set of attack scenarios. As you can see, the fact that cookies are set by different application owners on the same web domain brings the threat of cookie stealing or cookie injection.

    By default, ASP.NET cookies running on Windows Azure are bound to the application's specific domain instead of *.cloudapp.net, but if a developer intentionally sets this value (or uses a third-party framework that does not set the proper value) then the application will be vulnerable to cookie-based attacks.
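    The difference is visible in the Set-Cookie response header (a sketch with hypothetical values; the Domain attribute is what scopes the cookie):

    ```
    Safe  (scoped to the app's own subdomain):
    Set-Cookie: session=abc123; Domain=myroomcolor.cloudapp.net; Path=/; Secure; HttpOnly

    Risky (visible to every app under cloudapp.net):
    Set-Cookie: session=abc123; Domain=.cloudapp.net; Path=/; Secure; HttpOnly
    ```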

    So now you are asking how to fix this behavior. The answer is simple: as this weakness is not in Windows Azure but in the HTTP protocol, you have to prevent a cookie from being sent to any application other than the intended one. You can achieve this by associating your own custom domain name with the application and emitting cookies bound to that domain. This way, your application's cookies can be read or set only when an HTTP request addresses the right context (i.e. the right domain).

    How to set a custom web domain for your own Windows Azure application

    Setting a custom domain for your own application is a process that does not involve Windows Azure platform directly. You have to deal with your own preferred DNS provider and you can follow one of these procedures:

    • Bind the IP address (an A record), so you will bind the IP address in the Microsoft datacenter for your application to a custom domain.
    • Use a CNAME record, so you will associate your custom domain with your application, but each request will involve a double DNS lookup.
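    In DNS zone-file terms the two options look roughly like this (hypothetical names and address; the CNAME approach is generally preferred because the underlying datacenter IP can change):

    ```
    ; Option 1: A record pointing at the datacenter IP (hypothetical address)
    www.myroomcolor.com.   IN  A      65.52.0.10

    ; Option 2: CNAME record pointing at the cloudapp.net name
    www.myroomcolor.com.   IN  CNAME  myroomcolor.cloudapp.net.
    ```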



  • CRM 2011 and SharePoint 2010 Integration - Part 2

    (Post courtesy of Anand Nigam)

    Hi SharePoint Folks,

    This post comes a long time after Part 1, but as the saying goes, better late than never. Here I am back with Part 2; this time we will focus on reporting CRM information into SharePoint using Excel Services.

    1. Part 1: Introduction and CRM 2011 - Document management Integration with SharePoint 2010
    2. Part 2: Reporting CRM data in SharePoint using Excel services (This post)
    3. Part 3: Publishing CRM entities in SharePoint.
    4. Part 4: Search CRM entities from SharePoint Enterprise Search.

    The word you are thinking of is “awesome”, well I know :). Ok, let’s cut the talking short and make it work. Get ready!

    Reporting CRM data in SharePoint using Excel services

    So below is what you will need,

    1. CRM 2011 deployment and some sample data (I populated my CRM with built in sample data).
    2. A SharePoint 2010 farm, with a web app created
    3. This post

    What we will achieve by the end of this post is an Excel-based report that shows every account's revenue, with a filter on accounts based on number of employees. Our first step on the SharePoint side is the SSS, i.e. the Secure Store Service. We will use the Secure Store Service to create an Application ID, and we will use that Application ID to retrieve information from the database server that CRM 2011 is currently connected to.

    TASK 1

    So we go to SharePoint Central Administration > Secure Store Service application, create a new Application ID, and fill in the info as shown in the figure below:

    1. Target Application ID – CRMKey
    2. Display name – CRMKey
    3. Contact E-mail – any valid email address preferably
    4. Target Application Type – Keep it to Individual


    Click Next,

    Modify the field names to reflect the appropriate application credentials. This is not a necessary step; you could leave the default names as they are.


    Click Next

    Specify who is going to manage the target application, apart from the farm admin, who has rights to modify the settings by default. For now, I have specified my CRM admin's account – contoso\crm11.


    Click Ok and proceed to SSS main page, with our Application ID created.


    Here we just need to set the credentials once. Click on the drop-down and select Set Credentials.


    Below is what you will see. In Credential Owners, specify the account that will manage these credentials – more simply put, just put the farm admin account here. More important are the username and password boxes: here you specify the CRM 2011 account that has admin credentials, i.e. one that can create a connection to the database server of the CRM 2011 deployment.

    I have specified my crm11 admin account and its password. Click Ok to Finish


    We are DONE with SSS now.

    TASK 2

    Next we will create a Data Connection Library, which is a special type of library where we will store the data connection (.odc) file.



    Now create a library where you want the resulting excel file to be published, this can be any normal library. I am going to use my Shared document library.

    TASK 3

    Now one important task: we need to configure Excel Services to trust the data connection library and the Shared Documents library. Unless the trust is configured, the report will not render.

    So open the Excel Services application main page > click on “Trusted File Locations”


    Click “Add Trusted File Location”


    Enter the shared documents location (and of course remove the trailing /forms/allitems.aspx)


    Check Children Trusted.

    Scroll down; under the External Data section, select Trusted data connection libraries only (this is because we will use the connection stored in SharePoint).

    Uncheck Refresh warning enabled


    Click on OK

    Now go back to the Excel Services application main page > click on “Trusted Data Connection Libraries”


    Click on Add Trusted data connection library


    Enter the location for data connection library and click OK


    The result should look like this


    TASK 4

    Next comes the task of creating a connection file that will enable Excel to connect to the CRM database views.

    1. Open Excel > Data > From Other Sources> From SQL Server


    2. Specify the CRM’s SQL server name,


    3. Specify the organization database of the CRM, in my case Fabrikam_MSCRM, and select FilteredAccount.
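    For context, the FilteredAccount view exposes the account data the report will pull; conceptually, the connection retrieves something like this (a sketch only; the column names follow the CRM filtered-view convention and match the fields used later in the pivot):

    ```sql
    -- Hypothetical sketch of the data behind the report
    SELECT name, revenue, numberofemployees
    FROM Fabrikam_MSCRM.dbo.FilteredAccount
    ```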


    4. On the next screen, click Authentication Settings next to Excel Services, and enter the SSS Application ID we created



    5. Click OK and come back to the wizard. Now click on Browse and save the connection file to the SharePoint data connection library.


    Click Finish in the wizard, after the wizard finishes just click on OK, in the property window that pops up


    Then just hit cancel and exit out of Excel.

    6. Now click Save, Once saved, Go to SharePoint data connection library and Approve the connection file, as shown below



    TASK 5

    Let’s now create a simple Excel report. Open Excel > Data > Existing Connections > click on Browse for More, specify the SharePoint data connection library location, and click Open.


    Now you will get the Import Data prompt; select PivotChart and PivotTable Report, and click OK.


    You get the Pivot chart on the Excel worksheet


    Now, coming back to our objective – “show all the accounts' revenue, and filter the accounts based on number of employees”.

    We will drag the “Name” field to the “Axis Fields (Categories)” box. By the way, the fields are sorted in alphabetical order, so it should be easy to find them.


    Next drag the “revenue” field to “Values” box,


    The end result is as below


    You will notice the pivot table shows the COUNT of the revenue, which is not what we want; we want the sum. So right-click on “revenue” or “Count of revenue”.


    By default Count is selected, change it to Sum and click OK


    Now we see the table and the chart correctly


    Now the next job is to add a Slicer that will filter the chart and table based on the number of employees. Select the chart by clicking it once, then click Insert and select Slicer.


    Select “numberofemployees” and click OK


    You will now be able to see a more meaningful report.


    Task 6

    Now the real hero comes into the picture: Excel Services. Click on File > Save & Send > Save to SharePoint > double-click on “Browse for a location” and locate the SharePoint document library where the Excel report will be saved.


    Navigate to shared document library


    As soon as you save you see the report in the browser


    Verify that it's working by selecting the “numberofemployees” slicer.


    And yes it does :)

    Ok, what happens when the CRM data changes? Try it – open a CRM account and change the revenue value. I did this for “A store (sample)”, from 10000 to 12000.


    Now go back to the Excel Services report and click Data > Refresh All Connections.


    And see the smooth update of data


    And this way you can hopefully build more complex and meaningful reports and publish them in SharePoint.

    Enhance this ?

    If you want to show the Excel workbook anywhere in SharePoint, you can use the built-in “Excel Web Access” web part.

    Open your SharePoint site, edit the page, and add the “Excel Web Access” web part.


    Click Add and see the web part added. Now open web part properties



    You will see the properties


    In the Workbook box, enter the Excel workbook location, in my case \crmrpt/shareddocument/CRM report.xlsx. Click Apply.


    Further we can just have the chart shown in the site

    Change the web part properties: enter Chart 1 as the Named Item (this is the chart object's name). If you want to verify it, open Excel and check the chart's name.


    And It just shows the chart on the home page,


    To get the chart name in Excel, click on the chart object to select it and see the name in the ribbon.


    Further Enhance this using Connected Web parts

    If you have several reports in the library and you want click-and-see behavior, we can create connections between web parts to get that kind of experience. Just ensure that you have multiple reports in the library.

    Edit the home page, add the library that has the Excel reports in it, and add the Excel Web Access web part on the same page. Configure the Excel Web Access web part to show a report (this will be the default report it shows). Now click on the menu of the library > Connections > Send Row of Data To > Excel Web Access Web Part.


    In the following popup menu

    Go to tab 2, Configure Connection, set the Field name to Document URL, and click Finish. Now save and close the page.


    See it in action, by selecting the report



    With that I will come back with the 3rd and the 4th part soon. Thanks for reading.

  • OWA Cross-Site Silent Redirection in Exchange 2010 SP2

    (Post courtesy of Krishan Kant Mehta)

    It wasn’t too long ago that we were celebrating the release of Exchange Server 2010 SP1. Now, high on the hog, we have Exchange Server 2010 SP2, with a pretty interesting set of new features and enhancements, including the much-awaited Address Book Policies feature that provides a simpler mechanism to accomplish GAL separation for on-premises organizations that need to run disparate GALs.

    In this blog, I am going to talk about the Cross-site silent redirection feature that did not make it into SP1. To get an overview of important new features and functionality in Exchange Server 2010 Service Pack 2, please refer to this link.

    Before Service Pack 2, with Client Access Servers in two different Internet-facing AD sites, an OWA user would be presented with a link to click on to log in to his mailbox in the site where his mailbox resided.


    And after clicking the link, the user would also have to login a second time… isn’t life complicated enough?


    Thanks to the Cross-site silent redirection feature, the user will not get a link but will be silently redirected to his own Client Access Server without having to log in again.

    As can be seen above, an OWA user is notified that he is using the wrong URL and is required to enter his credentials twice, which leads to a sub-optimal experience with manual redirection. To improve the user experience, a new parameter, CrossSiteRedirectType, has been introduced on the Set-OwaVirtualDirectory cmdlet in Exchange Server 2010 SP2. As the name implies, it enables silent redirection to a CAS located in another Active Directory site, within the same Exchange organization, that has an OWA ExternalURL specified.

    This parameter supports two values, Manual and Silent. Cross-site silent redirection is disabled by default, which means the Manual setting is in effect, and manual redirection between CAS in different Active Directory sites will continue after you deploy Exchange Server 2010 SP2.

    Cross-Site Silent Redirection can be enabled by setting the CrossSiteRedirectType to Silent on the Internet-facing CAS OWA virtual directories:

     Set-OwaVirtualDirectory -Identity "companyname\owa (Default Web Site)" -CrossSiteRedirectType Silent

    When you configure the CrossSiteRedirectType parameter to Silent for a CAS OWA virtual directory, you will get a warning that cross-site silent redirection will only work if the corresponding virtual directories in the target Active Directory sites have an ExternalURL specified that uses the HTTPS protocol (Fig 1).


    (Fig 1)

    The output of the command Get-OwaVirtualDirectory shows that the silent redirection is enabled on the Exchange 2010 CAS server in an AD site (Fig 2).


    (Fig 2)
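    As a quick check (a sketch; run from the Exchange Management Shell), you can list the redirect setting and external URL across your CAS OWA virtual directories:

    ```powershell
    # List the redirect type and ExternalUrl for every OWA virtual directory
    Get-OwaVirtualDirectory |
        Format-Table Server, CrossSiteRedirectType, ExternalUrl -AutoSize
    ```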

    Cross-site silent redirection prevents users from having to learn a secondary Outlook Web App URL. Silent redirection also provides a single sign-on (SSO) experience when forms-based authentication is enabled on each Client Access Server, i.e. if the authentication method for the Outlook Web App virtual directory on both the source and target Client Access Servers is set to forms-based authentication, the user will only have to enter their credentials once. If the authentication methods differ on the source and target Client Access Servers, the user may have to enter their credentials twice. Bear in mind that when using forms-based authentication, you should have SSL on both the source and target Outlook Web App virtual directories.

    Click here to download Exchange Server 2010 Service Pack 2, and let your fingers do the walking!!

    Let’s Exchange – KK Mehta (Krishan Kant Mehta)

    Partner Technical Consultant

    Microsoft Partner Technical Services

  • Dashboards in System Center Operations Manager 2012 (Part 1)

    (Post courtesy Rohit Kochher)

    System Center Operations Manager 2012 has some exciting out-of-the-box features, like network monitoring, application monitoring, and dashboards. We will cover dashboards in a series of two blog posts. In the first post we talk about dashboard terminology and create a performance widget. In the second post, we will create state and alert widgets. I will be using the Operations Manager 2012 Beta edition for these posts.

    The data warehouse (which was an optional component in System Center Operations Manager 2007 R2) is now mandatory in Operations Manager 2012. The main reason for this change is dashboards.

    While designing dashboards, we define two things

    1) Templates: We have two types of templates, Column layout and Grid layout, and you specify the number of cells after you select a template. These layouts specify the arrangement of the cells that actually host the content.

    2) Widgets: Once the layout is created, we add widgets to it. In the OM 2012 Beta edition, we have three types of widgets: Alert, Performance, and State. While creating a widget, we define the criteria used to collect data from the database.

    To create a dashboard, we start from the Operations Console. Choose New –> Dashboard View.


    We have to choose one layout out of the two available templates. We will select Grid layout and name the dashboard.


    Next, we define the number of cells and choose one of the layouts.


    Once the wizard is completed, you can click on Configure to change the number of cells and the layout of the dashboard. You can also interchange the positions of the widgets by using the two arrows.


    Adding widgets to dashboards: You can click on “Click to Add widget” and it will start the wizard of creating widgets. In Operations Manager 2012 Beta edition we have alert, state and performance widgets.

    We will start by creating a Performance widget.


    We will name the widget “SQL Performance counter”, and will use this to view performance of SQL computers.

    On “Specify the Scope and Counters”, select a group.


    We select a group of SQL computers. Next we will select performance counters. We will define object, counter and instance.


    The next step is to define Time Range.


    The next step is to configure the way you would like chart and legend to display.


    Finally we get our SQL Performance widget. You can click on Configure to change the scope, counters, time range, and chart preferences. You can also click on Personalize to change how the chart and legend are displayed.


    You can also hover anywhere on graph to see the exact value.


    In the part 2 of the series, we will create alert and state widgets.


  • Private Cloud Management with VMM 2012 (Part 3): Adding an update server and enabling orchestrated update management

    See the first two parts of this series here:
    Private Cloud Management with VMM 2012 (Part 1): What's new with System Center Virtual Machine Manager 2012
    Private Cloud Management with VMM 2012 (Part 2): Creation of a Hyper-V Cluster Using VMM 2012

    (Post courtesy Iftekhar Hussain)

    In my last blog post, we saw how to create a Hyper-V cluster using the VMM 2012 cluster creation wizard as part of preparing a fabric for the cloud.

    Virtual Machine Manager 2012 also provides a feature by which you can manage updates for your virtual machine hosts, library servers, PXE servers, the Windows Server Update Services (WSUS) server, and the VMM server itself, all from the VMM console.

    When you perform update remediation on a host cluster, VMM orchestrates the updates, in turn placing each cluster node in maintenance mode, migrating virtual machines off the host by using intelligent placement, and then installing the updates. If the cluster supports live migration of Windows Server-based virtual machines, live migration is used. If the cluster does not support live migration, VMM saves state for the virtual machines and does not migrate them.

    To manage updates in VMM 2012, you need a WSUS server

    Here are the prerequisites:

    1. You must install the 64-bit version of Windows Server Update Services (WSUS) 3.0 Service Pack 2 (SP2).
    2. VMM requires a single, dedicated WSUS root server; downstream servers are not supported.
    3. If you install WSUS on a remote server, you must install a WSUS Administrator Console on the VMM management server and then restart the VMM service.
    4. Before you install the WSUS server, ensure that the server meets all WSUS prerequisites described on the Windows Server Update Services 3.0 SP2 download page.

    Once you have configured the WSUS server, let's add it to VMM 2012.

    1. Open the Fabric workspace.

    2. On the Home tab, in the Add group, click Add Resources, and then click Update Server.


    3. My WSUS server is named SCCM, hence I have put SCCM as the computer name, along with the port number and credentials.

    4. Once the WSUS server is added, click on Synchronize to sync the WSUS updates with VMM.
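    The same can be scripted (a sketch with hypothetical names, port, and Run As account, using the VMM 2012 update-server cmdlets):

    ```powershell
    # Add the WSUS server to VMM and trigger a synchronization
    $cred = Get-SCRunAsAccount -Name "WSUSAdmin"
    Add-SCUpdateServer -ComputerName "SCCM.contoso.com" -TCPPort 8530 -Credential $cred
    Get-SCUpdateServer | Start-SCUpdateServerSynchronization
    ```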


    5. You will see all the update metadata showing up in your VMM console.


    Create Baseline

    After you enable update management in VMM, you are ready to prepare for patching by configuring update baselines. An update baseline contains a set of required updates. During a compliance scan, computers that are assigned to a baseline are graded for compliance to their assigned baselines. After a computer is found noncompliant, an administrator brings the computer into compliance through update remediation.


    6. Provide a Name and Description for your Baseline



    7. Add the updates to your baseline against which your Hyper V hosts will be compared.


    8. Assign the Baseline to the host groups.


    9. Finish the wizard


    10. You’ll see your newly created Baseline


    To find out the compliance status for each baseline, you must scan the computer for compliance. When a computer is scanned for compliance, WSUS checks each update in the assigned update baselines to determine whether the update is applicable and, if the update is applicable, whether the update has been installed.

    After a compliance scan, each update has a compliance status of Compliant, NonCompliant, Error, or Unknown.

    To scan computers for compliance

    1. In Compliance view of the Fabric workspace, select the computers that you want to scan.

    2. On the Home tab, in the Compliance group, click Scan.

    While the scan is in progress, the compliance status changes to Unknown. After the compliance scan completes, the compliance status of each update on the computer is Compliant, NonCompliant, or Error.
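    A compliance scan can likewise be triggered from the VMM command shell; a sketch:

```powershell
# Scan all VMM-managed computers against their assigned baselines
$computers = Get-SCVMMManagedComputer
$computers | Start-SCComplianceScan

# Review per-update compliance status after the scan completes
Get-SCComplianceStatus -VMMManagedComputer $computers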



    Perform Update Remediation

    To perform update remediation, the target computers must be noncompliant. To make a compliant computer noncompliant for testing purposes, you might need to use Add and Remove Programs to temporarily uninstall one or more of the updates listed in Compliance view.

    On the Home tab, in the Compliance group, click Remediate. (The Remediate task is only available when the selected objects are noncompliant.)

    If you select the host cluster by its cluster name, VMM orchestrates remediation of the hosts in the cluster.

    VMM rolls through the host cluster, remediating one cluster node at a time. If a cluster node is compliant, VMM bypasses that node.

    Before VMM begins remediating a host, it places the host in maintenance mode and migrates all virtual machines to other hosts in the cluster. If the cluster supports live migration, live migrations are performed. If the cluster does not support live migration, VMM saves state before migrating virtual machines.

    If any updates require a restart and you prefer to restart the computers manually after remediation completes, select the Do not restart the servers after remediation check box.
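    For reference, a remediation run can be sketched in the command shell; the host name below is hypothetical, and -SuspendReboot corresponds to deferring the restart:

```powershell
# Remediate a noncompliant host, deferring any required restart
$managedHost = Get-SCVMMManagedComputer -ComputerName "HVHost01.contoso.com"
Start-SCUpdateRemediation -VMMManagedComputer $managedHost -SuspendReboot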


    Once the remediation is complete, you'll see your Hyper-V cluster hosts reported as Compliant.


    Hope this post was helpful for those of you who are evaluating VMM 2012 and its update management feature.

    In my next post, we’ll configure the rest of the fabric components like Logical Network, Storage and Load Balancers.

    Stay tuned.


  • Performing an Active Directory Health Check Before Upgrading

    (Post courtesy Bonoshri Sarkar)

    Hi everyone, this is Bonoshri Sarkar here. I have worked at Microsoft as a Partner Technical Consultant specializing in Directory Services for the past two years, providing end-to-end consulting to enable partners to design, position, sell, and deploy Microsoft platforms for their customers. Before that, I worked for more than four years on the Microsoft Support team, focusing on Microsoft Directory Services.

    Since I have a great affinity for Directory Services, I thought it would be a great idea to pen down my thoughts and experience on ensuring a smooth Active Directory Upgrade.

    For any kind of upgrade, migration, or transition to go smoothly, and to end up with a healthy environment afterwards, you need to spend a fair amount of time planning and making sure that the source environment is in a healthy state. Two driving factors for any upgrade or transition are the need to use the new features the new version of the product has to offer, and the desire to ease the complexities and issues of the current environment. However, most IT pros do not take adequate steps to check the health of their existing Active Directory environment. In this post, I would like to address some of the key steps that an AD administrator should perform prior to an upgrade or transition.

    In my experience assisting customers and partners with different transitions, most issues pertain to the source domain or the source domain controllers, so I will discuss a few important things that should be considered mandatory before any kind of upgrade, migration, or transition.

    Performing an Active Directory Health Check

    The health check should be done in 2 phases.

    1. Planning Phase

    2. Deploy Phase (just before implementing the upgrade, transition or migration)

    In the first phase, we should identify which services and roles are running on the machine we plan to upgrade, and rule out anything we do not want to move to the new box.

    With an emphasis on diagnosing AD issues, we can use dcdiag to ensure a healthy Active Directory. We have been using dcdiag for years and typically look for failure messages in the output, but beyond outright failures, we should also consider issues such as the ones discussed below.
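    If you want to capture the full output for later review, a typical invocation looks like this (the target DC name and log path are just examples):

```powershell
# Verbose, comprehensive dcdiag run, saved to a log file
dcdiag /v /c /f:C:\Temp\dcdiag.txt

# Run the tests against a specific domain controller
dcdiag /s:DC01 /v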




    Notice that the first part of the dcdiag output says "failed test replication", which implies that there are Active Directory replication issues on this domain controller.

    The second message tells us that there are issues with the NETLOGON and SYSVOL shares, which are the default logon shares; the two errors can be interdependent, or they could have completely different causes.

    In this scenario, we need to fix AD replication first, or dig in further to find what is causing these errors. You can use a few more commands to check AD replication, such as repadmin /syncall /eAP. In a large enterprise, you can also use Replmon (Windows Server 2003).
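    A couple of repadmin invocations that are handy at this point (the output path is just an example):

```powershell
# Force a sync of all partitions with all partners
# (/e = enterprise-wide, /A = all partitions, /P = push changes)
repadmin /syncall /eAP

# Dump inbound replication status for every DC to CSV for review
repadmin /showrepl * /csv > C:\Temp\showrepl.csv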

    The third message tells us which important services are running. We need to make sure these services are started to ensure a smooth transition.

    If the dcdiag results don't give enough detail, check the event viewer; if you still do not see anything, restart the FRS service and then check the event log for Event ID 13516.
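    Restarting FRS and checking for the event can be done from an elevated prompt; note that wevtutil is available on Windows Server 2008 and later:

```powershell
# Restart the File Replication Service
net stop ntfrs
net start ntfrs

# Look for Event ID 13516 (SYSVOL ready) in the FRS log
wevtutil qe "File Replication Service" /q:"*[System[(EventID=13516)]]" /c:5 /f:text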


    Apart from dcdiag, you can also use netdiag to check the network status and get detailed information.

    In addition, make sure the NIC drivers are up to date on the old server.

    Instead of disabling the hardware- or software-based firewall on the servers (old and new), make the appropriate exceptions and port configurations to ensure proper communication between the directory servers (see Active Directory and Active Directory Domain Services Port Requirements).

    Any third-party or legacy applications should be tested in a lab environment to make sure they are compatible with the new version of the server OS and Active Directory.


    We also have different versions of the Exchange Best Practices Analyzer (ExBPA), depending on the version of Exchange, to check Exchange integrity and Exchange-specific permissions (select the Permission Check to gather that information).

    Last but not least, read the migration or transition documents to make sure the server meets all the minimum requirements.

    Once you are sure the servers are in a healthy state, do not forget to take a full backup and a system state backup using a supported backup system, as documented on TechNet.

    All these stitches in time will definitely save you nine hours' worth of troubleshooting. It's up to you to decide: would you rather troubleshoot, or enjoy your fries with a Coke?


  • WSUS not configured error during Configuration Manager 2012 Software Update Point Installation

    (Post courtesy Anil Malekani)

    Recently I tried configuring Software Update Management in Configuration Manager 2012. After installing WSUS on the Configuration Manager 2012 box, I tried to install Software Update Point as a site role.


    The Software Update Point role installed successfully, according to the SUPSetup.log file (under C:\Program Files\Microsoft Configuration Manager\Logs).

    However, updates still did not appear in the console. After checking the site component status for SMS_WSUS_SYNC_MANAGER and SMS_WSUS_CONFIGURATION_MANAGER, I noticed the errors below.

    SMS_WSUS_SYNC_MANAGER: Message ID 6600




    I checked WCM.log (under C:\Program Files\Microsoft Configuration Manager\Logs) and found the following proxy error:


    SCF change notification triggered.        SMS_WSUS_CONFIGURATION_MANAGER        1/1/1601 12:00:00 AM        5408 (0x1520)

    This SCCM2012.CORP80.COM system is the Top Site where WSUS Server is configured to Sync from Microsoft Update (WU/MU) OR do not Sync.        SMS_WSUS_CONFIGURATION_MANAGER        1/1/1601 12:00:00 AM        5408 (0x1520)

    Found WSUS Admin dll of assembly version Microsoft.UpdateServices.Administration, Version=3.0.6000.273, Major Version = 0x30000, Minor Version = 0x17700111        SMS_WSUS_CONFIGURATION_MANAGER        1/1/1601 12:00:00 AM        5408 (0x1520)

    Found WSUS Admin dll of assembly version Microsoft.UpdateServices.Administration, Version=3.1.6001.1, Major Version = 0x30001, Minor Version = 0x17710001        SMS_WSUS_CONFIGURATION_MANAGER        1/1/1601 12:00:00 AM        5408 (0x1520)

    The installed WSUS build has the valid and supported WSUS Administration DLL assembly version (3.1.7600.226)        SMS_WSUS_CONFIGURATION_MANAGER        1/1/1601 12:00:00 AM        5408 (0x1520)

    System.Net.WebException: The request failed with HTTP status 502: Proxy Error ( The host was not found. ).~~ at Microsoft.UpdateServices.Administration.AdminProxy.CreateUpdateServer(Object[] args)~~ at Microsoft.UpdateServices.Administration.AdminProxy.GetUpdateServer(String serverName, Boolean useSecureConnection, Int32 portNumber)~~ at Microsoft.SystemsManagementServer.WSUS.WSUSServer.ConnectToWSUSServer(String ServerName, Boolean UseSSL, Int32 PortNumber)        SMS_WSUS_CONFIGURATION_MANAGER        1/1/1601 12:00:00 AM        5408 (0x1520)

    Remote configuration failed on WSUS Server.        SMS_WSUS_CONFIGURATION_MANAGER        1/1/1601 12:00:00 AM        5408 (0x1520)

    STATMSG: ID=6600 SEV=E LEV=M SOURCE="SMS Server" COMP="SMS_WSUS_CONFIGURATION_MANAGER" SITE=CM1 PID=2424 TID=5408 GMTDATE=Fri Oct 14 00:20:03.092 2011 ISTR0="" ISTR1="" ISTR2="" ISTR3="" ISTR4="" ISTR5="" ISTR6="" ISTR7="" ISTR8="" ISTR9="" NUMATTRS=0        SMS_WSUS_CONFIGURATION_MANAGER        1/1/1601 12:00:00 AM        5408 (0x1520)

    Waiting for changes for 46 minutes        SMS_WSUS_CONFIGURATION_MANAGER        1/1/1601 12:00:00 AM        5408 (0x1520)


    I validated that the proxy had been configured correctly and my browser settings also contained the same settings.

    Resolution: After spending some time, I found that Configuration Manager 2012 uses the system account's proxy settings, which were set to Automatically detect settings.

    1. Using the excellent PsExec utility, I opened a command prompt under the system account (using the –s parameter).
    2. Within this command prompt running as system, I launched Internet Explorer and removed proxy settings.
    3. Finally, updates started appearing in the console.
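    The commands involved look roughly like this, assuming PsExec is on your path and Internet Explorer is in its default location:

```powershell
# Launch an interactive command prompt as the LocalSystem account
psexec -i -s cmd.exe

# Inside that SYSTEM prompt, start Internet Explorer:
#   "C:\Program Files\Internet Explorer\iexplore.exe"
# then clear the proxy settings under Internet Options >
# Connections > LAN settings (uncheck "Automatically detect settings")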


  • Sending e-mails from Microsoft Dynamics CRM

    (post courtesy Sarkis Derbedrossian)

    I often meet Microsoft CRM users who don't know how sending e-mail works within Microsoft Dynamics CRM. Most users think that when they create an e-mail in CRM and hit the Send button, the e-mail is sent automatically. In fact, neither Outlook nor CRM can send e-mail without a mail system, e.g. an Exchange server. Below you will learn how e-mail within CRM works with and without Outlook.

    E-mail in relation to CRM

    Once you have created an e-mail activity in MS CRM and clicked the Send button, the e-mail is handled differently depending on each user's e-mail settings in MS CRM.

    E-mail can be handled through Outlook or directly through CRM, but neither Outlook nor MS CRM performs the physical delivery of the e-mail. That is done by a mail server (Microsoft Exchange Server or another mail system).

    Sending an E-mail in MS CRM

    Do not jump to the conclusion that MS CRM can neither receive nor send e-mail; rather, understand that this task requires an e-mail system to complete.

    When you send e-mail from MS CRM, the following steps usually occur:

    1. The user creates an e-mail activity and clicks on the send button. The e-mail is now saved with the user as the recipient
    2. The e-mail gets synchronized to the user’s Outlook
    3. The user's Outlook client sends the e-mail to the mail server (Exchange)
    4. Exchange sends the e-mail through the internet

    What if the user does not have the Outlook client open? In that case, the e-mail will not be sent until the user opens Outlook. In some scenarios this is insufficient; fortunately, installing the E-mail Router can solve it.

    Sending e-mail via the e-mail router

    If you want to be independent of Outlook, and thus able to send e-mail directly from MS CRM, you can do so by installing and configuring the E-mail Router.

    The E-mail Router is free software that ships with MS CRM. It can be installed on any server that has access to both a mail server (Exchange Server or another POP3/SMTP mail system) and MS CRM.

    When you send e-mail from MS CRM using the E-mail Router, the following steps usually occur:

    1. The user creates an e-mail activity and clicks on the send button. The e-mail is now saved with the user as the recipient
    2. The e-mail is sent to the e-mail router
    3. The email router sends the e-mail to the mail server (Exchange)
    4. Exchange sends the e-mail through the internet

    E-mail settings in CRM

    Depending on how you want your organization to send e-mails, remember to check the following settings:

    1. In CRM, Settings, Users
    2. Open the user form
    3. In the configuration section of the e-mail access, select the desired setting


    Configuring e-mail access

    It is possible to choose one of the following settings from the option list:

    Outlook cannot be used for sending and receiving e-mails related to MS CRM

    Microsoft Dynamics CRM for Outlook
    Outlook is responsible for sending/receiving e-mail. MS CRM for Outlook must be installed and configured. E-mails are sent/received only when Outlook is active (open)

    E-mail router
    E-mail is sent and received through the MS CRM E-mail Router. If this option is selected, a dialog box allows you to enter credentials; check the box if you want to specify credentials

    Forwarded mailbox
    E-mail is forwarded from another e-mail address; the E-mail Router is responsible for sending/receiving e-mails.


  • Going to MMS?

    (post courtesy Iftekhar and cross-posted from his blog).  If you are going to the Microsoft Management Summit, make sure to say hi!

    MMS 2011 Design

    Hello Everybody,

    This is again that part of the year when Microsoft reaches out to all its customers, partners, IT pros, and developers through back-to-back technical conferences, with great announcements, technical sessions, and demonstrations of its latest and cutting-edge products and solutions.

    The year 2011 starts with Microsoft Management Summit 2011 next week in Las Vegas, followed by TechEd North America 2011 in May and the Worldwide Partner Conference 2011 (WPC) in July.

    MMS 2011 is expected to be all about private cloud, datacenter automation, and virtualization management. The products I am really looking forward to at MMS this year are VMM 2012, Opalis, and Service Manager.

    I'll be working at MMS 2011 in Las Vegas next week as a product specialist for Virtual Machine Manager 2012. So all you partners, customers, IT pros, and virtualization enthusiasts who are interested in the next-gen capabilities of Virtual Machine Manager 2012, and in learning how it helps create and manage a private cloud end to end, can find me at the VMM booth in the Microsoft Pavilion.

    In addition to my booth duty, I am really looking forward to presenting, meeting some of the customers and partners I work with to discuss their virtualization and private cloud practices, hanging out with some old friends, and exploring Vegas.

    Though MMS 2011 is completely sold out for general attendees, there are other options.

    For those who are not attending but would like to stay updated on what's happening in Vegas, I am planning to do some heavy-duty tweeting.

    See you in Las Vegas.