Partner Technical Services Blog

A worldwide group of consultants who focus on helping Microsoft Partners succeed throughout the business cycle.


  • Configuring SharePoint 2013 Forms-Based Authentication with SQLMemberShipProvider

    Post courtesy Partner Solution Consultant Priyo Lahiri

    Background

    With SharePoint 2013, a lot of partners and customers are opening up their on-premises deployments to their vendors and customers. While the way you configure this is very similar to SharePoint 2010, things get a little tricky when you perform a real-world deployment spanned across multiple servers. This post is an end-to-end walkthrough of setting up Forms-Based Authentication with SQLMemberShipProvider in a 3-tier SharePoint 2013 deployment.

    Environment

    It would be a whole lot easier if I had a single-server environment with the same account running everything, and that account were also a Domain Admin. However, I chose a different approach, since this is most likely how your real-world deployment will be set up, and the steps are a little different when your farm is spanned across three servers. Here is my environment:

    WFE01 – Web Server running Microsoft SharePoint Foundation Web Application. I am connecting to the SQL instance using an alias, which is a very smart move: if you have ever had to move your SharePoint databases across SQL Servers or decommission an aging SQL Server, you know that having a SQL alias will save you from a lot of nightmares. If you are looking for a step-by-step, see the appendix at the end of this post.

    APP01 – Central Admin Server. Note: this is NOT running Microsoft SharePoint Foundation Web Application and is configured to be a “True” application server. This also means that the Web Application that we create will not reside on this server.

    SQL01 – SQL Server running SQL Server 2012 with SP1

    SharePoint Server 2013 RTM and Windows Server 2012 RTM are used for this setup.

    Tools to use

    While the steps documented below can be done without these tools, they do make your life a whole lot easier.

    1. FBA Configuration Manager for SharePoint 2013 – Author and Credit goes to Steve Peschka. The download comes with a ReadMe file. Please read it, since you need to register the WSP that comes with it.

    2. SharePoint 2013 FBA Pack – Author and Credit goes to Chris Coulson. Here is the documentation that will tell you how to install/activate/work with it. Not only will this make user management a breeze, it has some very useful features like password reset and self-service account management.

    NOTE: I have only tested the user management portion of the FBA Pack and didn’t have time to play with the rest of the features.

    How it’s done

    Step 1 – Create the Web Application

    In this step we will be creating the web application with Windows Authentication (Claims) and Forms Based Authentication (FBA) on the same zone. In SharePoint 2013, you can have multiple authentication providers without extending the web application. Having said that, at times you might have to extend the web application depending on your scenario. More on that in a different post, where I will show you how to use LDAPMemberShipProvider to talk to your AD.

    From Central Administration, we will create a Web Application and call it Extranet.waterfall.net and enable both Windows Auth and FBA. Note the names I am using: ASP.NET Membership Provider Name = SQL_Membership and ASP.NET Role manager name = SQL_Role. You can call them whatever you want, just ensure you use the same names everywhere.

    clip_image002

    We will create a new App Pool and use the Web App Pool account. Make a note of this account, since you will need to grant it permissions on the ASP.NET database in the next step.

    clip_image004
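
    If you prefer scripting this step, here is a minimal PowerShell sketch of the same web application creation, run from the SharePoint 2013 Management Shell. The application pool name is an illustrative assumption, and waterfall\spweb must already be registered as a managed account; the provider names match the ones used above.

      Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

      # Claims web application with Windows auth and FBA in the same zone
      $win = New-SPAuthenticationProvider -UseWindowsIntegratedAuthentication
      $fba = New-SPAuthenticationProvider -ASPNETMembershipProvider "SQL_Membership" `
                                          -ASPNETRoleProviderName "SQL_Role"

      New-SPWebApplication -Name "Extranet" -URL "http://extranet.waterfall.net" -Port 80 `
          -ApplicationPool "ExtranetAppPool" `
          -ApplicationPoolAccount (Get-SPManagedAccount "waterfall\spweb") `
          -AuthenticationProvider $win, $fba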

    Create the Web App and then the Site Collection; it doesn’t matter what template you choose. Once the Site Collection is created, visiting it will take you to the default sign-in page, where you will be asked to choose an Authentication Provider to sign in with. If you want your external users to have only the FBA option, set this default zone to Windows Auth, extend the web application, and put FBA on the extended web app (sketched just below). Obviously, the URLs will then be different.
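
    For that extended-web-application variant, a hedged PowerShell sketch (the zone and the second URL are assumptions for illustration):

      # Extend the web app into the Extranet zone with FBA only
      $fba = New-SPAuthenticationProvider -ASPNETMembershipProvider "SQL_Membership" `
                                          -ASPNETRoleProviderName "SQL_Role"
      Get-SPWebApplication "http://extranet.waterfall.net" |
          New-SPWebApplicationExtension -Name "Extranet FBA" -Zone "Extranet" `
              -URL "http://fba.waterfall.net" -AuthenticationProvider $fba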

    Your sign in page should look like this (make sure your DNS record (CNAME) points to WFE01):

    clip_image006

    Do you want to see a custom sign in page with your company brand on it? Well, let’s defer that to a different post.

    Step 2 – Verify Tools

    Now that the web app is created, we will make sure the FBA Pack and FBA Configuration Manager are deployed as they should be. Go to Central Administration >> System Settings >> Manage Farm Solutions. Make sure fbaConfigFeature.wsp is globally deployed and visigo.sharepoint.formsbasedauthentication.wsp is deployed to http://extranet.yourdomain.com. See the screenshot below. If visigo.sharepoint.formsbasedauthentication.wsp is not deployed, click on the WSP and deploy it to your web application (or script it, as sketched below).

    clip_image008
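
    If you would rather verify and deploy the solutions from the SharePoint 2013 Management Shell, a quick sketch (the URL matches this walkthrough):

      # List all farm solutions and where they are deployed
      Get-SPSolution | Select-Object Name, Deployed, DeployedWebApplications

      # Deploy the FBA Pack to the extranet web application if needed
      Install-SPSolution -Identity "visigo.sharepoint.formsbasedauthentication.wsp" `
          -WebApplication "http://extranet.waterfall.net" -GACDeployment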

    Log in to the site collection created in the above step and activate the following feature:

    Site Settings >> Site Collection Administration >> Site Collection Features >> Form based Authentication Management

    clip_image009

    Once the feature is activated, it should add the following to your Site Settings under Users and Permissions:

    clip_image011

    Step 3 – Creating the SQL Database for User Management

    The first step is to create the SQL database that will hold the extranet users:

    • Browse to C:\Windows\Microsoft.NET\Framework64\v4.0.30319
    • Run aspnet_regsql.exe
    • Click Next
    • Choose Configure SQL Server for Application Services >> Click Next
    • Enter your SQL Server name, choose Windows Authentication, and type in a database name

    clip_image013

    • Click Next twice to provision the database (a scripted alternative to the wizard is sketched after this list)
    • Now we need to add the application pool account that runs the web application and give it the required permissions. In this case, the account is waterfall\spweb. Perform the following steps:
      • Open up SQL Management Studio, Expand the database we created and expand Security
      • Right click Users and add a new User
      • User Type = Windows User
      • User name = choose <yourAppPoolAccountName>
      • Login name = browse and choose the login name (should be same as the app pool name above)

    clip_image015

      • Click Owned Schemas and choose the following:
        • aspnet_Membership_FullAccess
        • aspnet_Personalization_FullAccess
        • aspnet_Profile_FullAccess
        • aspnet_Roles_FullAccess
        • aspnet_WebEvent_FullAccess

    clip_image017
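
    As an alternative to clicking through the wizard, the database provisioning portion of this step can be scripted. A sketch using the standard aspnet_regsql.exe switches (-S server, -E Windows authentication, -A all application services, -d database name):

      # Provision the ASP.NET application services database from the command line
      & "$env:windir\Microsoft.NET\Framework64\v4.0.30319\aspnet_regsql.exe" `
          -S SQL01 -E -A all -d Extranet_User_DB

    The SQL user and owned-schema assignments for the app pool account still need to be granted as shown above.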

    Step 4 – Editing the web.config files

    We need to edit the following web.config files:

    • Web Application Web.config – WFE server
    • STS Application web.config – WFE server and Application Server
    • Central Admin web.config – CA Server
    • If you have more WFEs and App Servers, you need to edit them as well. A lot of people put these entries in the machine.config file instead, so that they are inherited by the web.config files; I am not too keen on editing the machine.config file.

    Let’s log in to our WFE server and fire up FBAConfigMgr.exe. While you can get the code you need from here and edit web.config yourself, if you just let the tool run its course, it will create a timer job and do the task for you. In FBAConfigMgr, type in your web application URL, and from the sample configuration choose the following:

    • People Picker Wildcard
    • Connection String
    • Membership Provider
    • Role Provider

    Here is what the screen looks like when default values are chosen:

    clip_image019

    We will modify the default values to reflect the following (highlighted items need modification per your environment):

    • Web Application URL - http://extranet.waterfall.net
    • People Picker Wildcard - <add key="SQL_Membership" value="%" />
    • Connection String -
      <add name="fbaSQL" connectionString="server=SQL01;database=Extranet_User_DB;Trusted_Connection=true" />
    • Membership Provider -
      <add connectionStringName="fbaSQL" applicationName="/"
      name="SQL_Membership"
      type="System.Web.Security.SqlMembershipProvider, System.Web,
      Version=2.0.3600.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" />
    • Role Provider -
      <add connectionStringName="fbaSQL" applicationName="/"
      name="SQL_Role" type="System.Web.Security.SqlRoleProvider, System.Web,
      Version=2.0.3600.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"/>

    The screen should now look like this:

    clip_image021

    It’s time to hit Apply Config. This will create a timer job to update your web.config files. Though the tool creates a backup, you should be proactive and back up your web application web.config and STS web.config files yourself. See the appendices at the end of this post for how to back up the web.config file and where to find the STS web.config file.

    Once you click Apply Config, the tool will tell you when it’s done. It might take a few minutes before you see any changes, so wait for it (you should see a new backup copy of your web.config file with a timestamp and _FBAConfigMgr at the end of the file name). To verify that the job is done, open up the web.config for your web application and search for <membership. You should see the following:

    <<Web Application web.config file>>

    clip_image023

    The <connectionStrings> section gets added at the end of the file, right above </configuration>:

    clip_image025

    <<STS web.config file>>

    Open up the STS Web.Config and you should see the following:

    clip_image027

    The <connectionStrings> section gets added at the end of this file as well, just like the web.config of the web application.

    <<Central Administration web.config file on App Server>>

    If you go back to the application server and open up the web.config file for the Central Admin site, you will see there are no changes made there, so we will make that change manually. Create a backup of the file, then open it and find <machineKey. It should look like this:

    clip_image029

    We will add the following (copied from the web.config file of the web application, or the code from FBAConfigMgr):

    1. Search for <machineKey and paste the following under <roleManager><providers>:
    <add connectionStringName="fbaSQL" applicationName="/" name="SQL_Role" type="System.Web.Security.SqlRoleProvider, System.Web,&#xD;&#xA; Version=2.0.3600.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" />

    2. Under <membership><providers>, paste the following:
    <add connectionStringName="fbaSQL" applicationName="/" name="SQL_Membership" type="System.Web.Security.SqlMembershipProvider, System.Web,&#xD;&#xA; Version=2.0.3600.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" />
    The screen should now look like this:
    clip_image031

    3. Scroll to the end of the document and paste the following right before </configuration>:
    <connectionStrings>

    <add name="fbaSQL" connectionString="server=SQL01;database=Extranet_User_DB;Trusted_Connection=true" />

    </connectionStrings>

    clip_image033

    <<STS web.config file on App Server>>

    Just like the Central Admin web.config, make the same changes on this web.config as well. Just make sure you are pasting the information from the roleManager providers and membership providers in the right place. Here is what the code looks like (you can use the code below and make changes to the highlighted areas to suit your environment):

    <system.web>
      <membership>
        <providers>
          <add connectionStringName="fbaSQL" applicationName="/" name="SQL_Membership" type="System.Web.Security.SqlMembershipProvider, System.Web,&#xD;&#xA; Version=2.0.3600.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" />
        </providers>
      </membership>
      <roleManager>
        <providers>
          <add connectionStringName="fbaSQL" applicationName="/" name="SQL_Role" type="System.Web.Security.SqlRoleProvider, System.Web,&#xD;&#xA; Version=2.0.3600.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" />
        </providers>
      </roleManager>
    </system.web>

    <connectionStrings>
      <add name="fbaSQL" connectionString="server=SQL01;database=Extranet_User_DB;Trusted_Connection=true" />
    </connectionStrings>

    Here is a screenshot

    clip_image035

    Step 5 - Use FBA Pack to add and manage users

    Our configuration is done. We will now go to our site collection and use the FBA Pack to add and manage users and roles.

    Go to Site Settings and click FBA User Management >> click New User, create a dummy user, and add the user to the contributor group.

    clip_image037

    Step 6 – Verify Forms user

    Now open up IE in InPrivate mode, visit your site collection, choose Forms Authentication this time, and enter the account information you just created to log in. You’re done!

    clip_image039

    Click on the user name and then My Settings; you will see the account information coming from the SQL membership provider.

    clip_image041

    If you go to a document library and try to add the user there, you will see that it resolves from your SQL database.

    clip_image043

    Appendix

    How to create SQL Alias for SharePoint

    Follow the steps below to create a SQL Alias on all your SharePoint Servers:

    TechNet Reference: http://technet.microsoft.com/en-us/library/ff607733.aspx#clientalias

    1. Perform this on the Application Server that is hosting Central Administration

    a. Stop all SharePoint Services

    b. Open cliconfg.exe from C:\Windows\System32\cliconfg.exe (the 64-bit version of the tool)

    c. Enable TCP/IP under the General tab
    clip_image045

    d. Click on Alias Tab

    e. Type the current SQL Server name in the Alias Name field

    f. Type the current SQL Server name in the Server field (see screenshot below; in this case the SQL alias and the SQL Server name are the same)
    clip_image047

    g. Validate SQL Alias

    i. Create a new text file on SharePoint Server and name it “TestDBConnection.udl”

    ii. Double click to open the file and enter your SQL Server Alias name

    iii. Use Windows Integrated Security

    iv. You should be able to see all your SharePoint databases when you click on “Select the database on the Server”

    h. Start all services for SharePoint Server / Reboot SharePoint Server

    i. Perform the steps above on all other SharePoint servers
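
    If you prefer not to click through cliconfg.exe on every server, the same TCP/IP alias can be written straight to the registry with PowerShell. This is a sketch: the key below is the 64-bit client location, and the server FQDN and port are assumptions; adjust both for your environment.

      # DBMSSOCN = TCP/IP network library; value format is "DBMSSOCN,<server>,<port>"
      $key = "HKLM:\SOFTWARE\Microsoft\MSSQLServer\Client\ConnectTo"
      if (-not (Test-Path $key)) { New-Item -Path $key -Force | Out-Null }
      New-ItemProperty -Path $key -Name "SQL01" `
          -Value "DBMSSOCN,SQL01.waterfall.net,1433" -PropertyType String -Force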

    How to backup web.config file

    To back up web.config file, perform the following:

    · From IIS Manager (Start >> Run >> inetmgr)

    · Right click on the web site and click Explore

    · Copy the web.config file somewhere else, or to the same location with a different name

    clip_image049
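
    The same backup can be scripted. A minimal sketch, assuming the default virtual directory path (use the physical path that Explore shows for your site):

      $src = "C:\inetpub\wwwroot\wss\VirtualDirectories\extranet.waterfall.net80\web.config"
      Copy-Item $src ("$src." + (Get-Date -Format "yyyyMMdd-HHmmss") + ".bak")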

    Where is the STS web.config file?

    · On your WFE open up IIS Manager and expand SharePoint Web Services

    · Right click on SecurityTokenServiceApplication and click Explore

    clip_image051

  • Embedding a PowerPoint Deck on SharePoint 2010

    (Post dedicated to Nuri, Operations Manager for our delivery team in EMEA, and courtesy Sean Earp)

    With the addition of PowerPoint Web App to SharePoint 2010, you can now view and edit PowerPoint presentations directly from within your browser.  This technology has also been made available to consumers on services such as http://office.live.com/ and http://docs.com/.

    image

    In the past, it has been difficult to embed a PowerPoint document within a webpage, requiring workarounds such as saving the presentation as pictures, PDFs, or MHT documents.  If you have a public presentation, it is now extremely easy to embed a PowerPoint deck on any web page, following the steps on the aptly named how to embed a PowerPoint presentation on a web page post.

    Unfortunately, these steps do not work if your installation of PowerPoint Web App is local.  The Share –> Embed option available from http://office.live.com is simply not present on SharePoint 2010.

    image

    So what to do if you want to embed an internal, private, or confidential PowerPoint presentation on an internal SharePoint page?  Fortunately, it is possible to embed a presentation on a webpage without posting the presentation on a broadly available public site.

    Step 1: Ensure that Office Web Apps have been installed and configured on SharePoint 2010.  Those steps are out of scope for this article, but the official documentation should be all you need:  Deploy Office Web Apps (Installed on SharePoint 2010 Products)

    Step 2: Upload the PowerPoint to a document library

    image

    Step 3: Click on the PowerPoint Deck to open it in PowerPoint Web App.  It will have a URL that looks like:

    http://sharepoint/sites/team/_layouts/PowerPoint.aspx?PowerPointView=ReadingView&PresentationId=/sites/team/Shared%20Documents/SharePoint%202010%20100-level%20overview.pptx&Source=http%3A%2F%2Fteam%2Fsites%2Fteam%2FSitePages%2FHome%2Easpx&DefaultItemOpen=1

    image

    Don’t worry about writing down the URL. Unfortunately, you can’t paste it into a Page Viewer web part without getting an error message.  So… a little magic to get the URL we need to embed our PowerPoint deck on our SharePoint Page.

    Step 4: Open the Developer Tools in Internet Explorer (F12), and search for iframe.

    image

    Step 5: Copy the first result into your text editor of choice.  The magic URL you need is the one within the src attribute.

    image

    Step 6: Delete everything except the part inside the quotes.  Before the PowerPointFrame.aspx, add the relative URL to your site collection _layouts directory, and copy the whole URL into your clipboard.

    image

    Step 7: Go to the SharePoint page you want to embed the PowerPoint into.  Add a Page Viewer Web Part to the page and open the tool pane for the web part.

    image

    Step 8: In the Page Viewer tool pane, paste in the URL, and optionally enter a height/width and chrome state for the PowerPoint deck.

    image

    Step 9: Hit “OK” and be awed at how awesome it looks to have a fully functional PowerPoint deck embedded on your page.  You can view the deck full screen by clicking “Start Slide Show”, change slides, view notes, click links, or click the “popout” button to have the deck open up in a popout window.

    image

    Super-secret-squirrel trick: If you want the deck to default to a slide other than the cover slide, click through to the slide you want, and then click the popout button in the top right of the PowerPoint Web App.  The deck will be open to that slide in its own window. 

    Use the same Developer Tools trick from step 4, but this time search for &SlideId.  You will see the URL has added two parameters… a slide ID and popout=1 (the URL will end with something like &SlideId=590&popout=1).  You can guess what popout=1 does, and the SlideId is some sort of internal reference to the slide (I have no idea how it is generated, but it doesn’t matter; my web app-fu will work just the same). Just copy the &SlideId=somenumber and paste it to the end of your URL in the Page Viewer web part, and now your web page will display the PowerPoint deck starting on whatever slide you specified!

    Additional Resources

    Office Web Apps technical library

  • SharePoint and Exchange Calendar together

    (post courtesy Anand Nigam)

    One of the cool things in SharePoint 2010 is the ability to show the Exchange calendar on a SharePoint site, side by side. This is called Calendar Overlay.

    This post will walk through how to configure this.

    Step 1 (prerequisite)

    1. I have a SharePoint Site http://fabrikam which looks like this

    clip_image002

    2. I also have a calendar “MySharePointCalender”, with a few calendar events entered.

    clip_image004

    3. I have my Exchange Calendar in Outlook, with a few meeting/events there as well.

    clip_image006

    4. What we want is to see events from my Exchange calendar show up on the SharePoint calendar.

    Step 2 (The actual process)

    1. Open the SharePoint calendar –> Calendar Tools –> Calendar Overlay –> New Calendar.

    clip_image008

    clip_image011

    Fill in the following:

    • Name: Give a name to this calendar
    • Type: Select Exchange
    • Outlook Web Access URL: the OWA URL of your organization.
    • Exchange Web Service URL: this can be determined as follows:

    If your OWA URL is https://exch.contoso.com/owa, then the Exchange web Service URL would be https://exch.contoso.com/ews/exchange.asmx

    (in other words, from the OWA URL, remove the trailing “owa” and add “ews/exchange.asmx”)
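
    As a quick illustration of that rule, a small PowerShell sketch that derives the EWS URL from an OWA URL (using the example values from above):

      $owa = "https://exch.contoso.com/owa"
      $ews = ($owa -replace "/owa/?$", "") + "/ews/exchange.asmx"
      $ews   # https://exch.contoso.com/ews/exchange.asmx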

    clip_image014

    Step 3 (The awaiting Error and the fix)

    If you have not previously configured SharePoint to trust your Exchange server, you will receive the following error message:

    Could not establish trust relationship for the SSL/TLS secure channel with authority ‘dc’. (GUID)

    clip_image016

    Here is the fix

    1. Get the CA Root Certificate for your domain

    (Just a note: there are many ways to get the certificate; I’m taking the one that is least prone to error.)

    a. Go to the Server where you have the Certificate Authority installed. Open IIS and select the Server Certificates component.

    clip_image018

    Double click on Server Certificates

    Locate the root certificate of the CA in the list; here is the one that I have.

    clip_image021

    (To double-check that this is the root certificate, open the certificate and look at the certification path. It should have just one entry (the root), which is the name of the Certification Authority in your domain.) Below is an image of my root certificate.

    clip_image023

    b. Now that we have located the certificate, open it, go to the Details tab, and click Copy to File

    clip_image026 clip_image028

    clip_image031 clip_image033

     clip_image036 clip_image038

    And now we have the certificate exported to a file.

    clip_image040

    Copy this certificate to the SharePoint Server and follow the steps below:

    a. Open Central Administration > Security > Manage Trust

    clip_image042

    b. Click New, provide a name (I use RootCA), navigate to the RootCA.cer file you exported in the previous step, and click OK.

    clip_image044
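
    For reference, the same trust can be created from the SharePoint Management Shell. A sketch, assuming the exported file was copied to C:\Temp\RootCA.cer:

      $cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2("C:\Temp\RootCA.cer")
      New-SPTrustedRootAuthority -Name "RootCA" -Certificate $cert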

    Now refresh the same calendar and confirm that you can see the Exchange calendar events for the logged-in user.

    clip_image046

    Step 4 (Enhance the default behavior)

    Although we can now see the Exchange calendar, we can only see the free/busy status, and not the actual details of the events. It would be good if we could have the details displayed here too. To display details:

    1. Open Outlook> File > Options>

    clip_image048

    2. Go to the Calendar Section > click Free/Busy Options

    clip_image050

    3. Select any one of the options; I have selected Full details. Click Apply and OK, and exit Outlook.  Now refresh the SharePoint calendar and see the difference.

    clip_image052

    clip_image054


    Note: The calendar overlay is per user, meaning it will only show calendar items for the currently logged-in user.

  • Network Monitoring with System Center Operations Manager 2012

    (Post courtesy Nikunj Kansara)

    This post describes the network monitoring capabilities of the System Center Operations Manager 2012 Beta.

    In my opinion, network monitoring is the most exciting feature of the upcoming Operations Manager 2012 release. This article gives an overview of network monitoring: how to discover network devices, how to configure network monitoring rules and object discoveries, and a sneak peek at the reports generated from network management and at the network dashboard.

    I have split up the blog into four different topics:

    A. How to discover the network devices:

    Discovery is the process of identifying network devices to be monitored.

    Operations Manager 2012 can monitor devices that use SNMP v1, v2c, and v3.

    The benefit of configuring network monitoring is that if a critical server seems to be down, we will also see an alert that the switch/router port connected to that critical server is down. We can also see the network topology diagram, called the Network Vicinity view.

    Operations Manager 2012 provides the following monitoring for discovered network devices:

    • We can view connection health between the network devices and between the server and the network device
    • We can view the VLAN health based on health state of switches in VLAN
    • We can view HSRP group health based on health state of individual HSRP end points
    • We can view Port/Interface Monitoring like Up/Down, Inbound / Outbound volume traffic
    • We can view Port/Interface Utilization, packets dropped, and packets broadcast
    • We can view Processor Utilization for some certified devices
    • We can view Memory utilization for some certified devices

    Network device discovery is performed by discovery rules that you create.

    Below are steps for creating the discovery rule:

    1. Open the Operations Console

    2. Go to the Administration workspace, right-click Administration, and then click Discovery

    image
    Figure 1

    3. The What would you like to manage? page in Figure 1 will open; select the Network Devices option and click Next.

    4. The General page in Figure 2 appears; provide the name of the discovery rule, select the management server from the drop-down, and then click Next.

    Note:

    • We can create one discovery rule per management server or gateway server.
    • If we are creating a second discovery rule then we will only see the management servers that don’t have any discovery rule associated with them.
    • Also, we might want to plan ahead and strategically place the management servers or gateway servers so they can access the network devices that we would like to discover.

    image
    Figure 2

    5. On the Discovery Method page in Figure 3, we need to select the method to discover the network device. In this example we select Explicit discovery and then click Next.

    Note:

    • Differences between Explicit discovery and Recursive Discovery:
      • Explicit discovery – An explicit discovery rule will try to discover the devices that you explicitly specify in the wizard by IP address or FQDN. It will only monitor those devices that it can successfully access. The rule will try to access the device by using ICMP, SNMP, or both, depending on the configuration of the rule.
      • Recursive discovery – A recursive discovery rule will attempt to discover the devices that you explicitly specify in the wizard by IP address, as well as other network devices that are connected to the specified SNMP v1 or v2 device and that the specified device knows about through its Address Resolution Protocol (ARP) table, its IP address table, or the topology Management Information Base (MIB).

    image
    Figure 3

    6. On the Default Account page in Figure 4, click Create default Run As Account, as we need to create an account that will be used to discover the network devices.

    image
    Figure 4

    7. On the Introduction page of Create Run As account Wizard in Figure 5, click next

    image
    Figure 5

    8. On the General Properties page of the Create Run As account Wizard in Figure 6; enter the Display name of the Run As Account and click next.

    image
    Figure 6

    9. On the Credentials page on the Create Run As account Wizard in Figure 7, enter the SNMP community string and click on create.

    Note:
    SNMP Community Strings

    We can configure Read Only [RO] and Read Write [RW] SNMP community strings. With the RO community string we have read access to the network device. For Operations Manager 2012, we need only the RO SNMP community string to access the device, so it should be easy to convince the network team ;-)

    image
    Figure 7

    10. On the Default Account Page in Figure 8, select the created Run As Account and click on Next.

    image
    Figure 8

    11. On the Devices page, click the Add button

    image
    Figure 9

    12. On the Add a device window in Figure 10, enter the IP address / name of the device we want to monitor; select the Access Mode as ICMP and SNMP (you can also select ICMP only or SNMP only); select the SNMP version as v1 or v2; select the created Run As account; and then click OK.

    Note:

    • We use ICMP only in the scenario where we need to know the availability of the gateway router from the ISP to verify if the interface is up or down.
    • We use SNMP only in the scenario where we want to monitor a Firewall on which ICMP is blocked.
    • If we specify that a device uses both ICMP and SNMP, Operations Manager must be able to contact the device by using both methods or discovery will fail.
    • If you specify ICMP as the only protocol to use, discovery is limited to the specified device and monitoring is limited to whether the device is online or offline.

    image
    Figure 10

    13. Now Click Next on the Devices Page as in Figure 11.

    image
    Figure 11

    14. On the Schedule discovery Page in Figure 12, Select the discovery schedule and click Next.

    Note:

    You may also select to run the discovery manually.

    image
    Figure 12

    15. Click Create on the Summary page

    image
    Figure 13

    16. Click Yes on the Warning box as in Figure 14. We need to distribute the created Run As account to the management server for discovery, and to the management server resource pool for monitoring, as selected in General properties [Figure 2].

    image
    Figure 14

    17. Click Close on the Completion page.

    image
    Figure 15

    18. Now in the Administration workspace, go to the Discovery Rules node under the Network Management node. You will be able to see the discovery rule that was created. Click Run if you want to run the discovery manually. See Figure 16.

    image
    Figure 16

    19. See Figure 17 for the Task Status window that appears when we run the discovery manually. The Success status means that the discovery was submitted successfully, not that the devices have been discovered. Click Close.

    image
    Figure 17

    20. We will see a Probing status on the discovery rule when it has actually found the device. See Figure 18.

    image
    Figure 18

    21. The discovery rule starts processing the discovered components, as in Figure 19.

    image
    Figure 19

    22. The status of the discovery rule will go to Pending, and the rule will run again as per the discovery schedule that we selected in the wizard. If we had selected the manual discovery option in the wizard, the status would go to Idle instead. See Figure 20.

    image
    Figure 20

    23. Go to Network Devices under Network Management to see the discovered device. See Figure 21.

    image
    Figure 21

    24. Double click the Network device to view the properties page and more information about that discovered device. See Figure 22.

    image
    Figure 22

    B. Network Monitoring:

    We will see some of the views that are relevant to the network device that we discovered in the previous step.

    1. Go to Monitoring Workspace; double click the Network Monitoring Folder to see the Network views. See Figure 23.

    image
    Figure 23

    2. Select the Network Devices view to see the Network Devices being monitored.

    image
    Figure 24

    3. Click on Health Explorer to see the subcomponents of the switch. See Figures 25 & 26.

    image
    Figure 25

    image
    Figure 26

    4. Click on the VLANs view to see the VLANs in which the switch is participating. See Figure 27

    image
    Figure 27

    5. Click on the ICMP Ping Response performance view or the Processor Utilization performance view to see the corresponding performance graphs. See Figures 28 & 29.

    image
    Figure 28

    image
    Figure 29

    C. Dashboard:

    1. To see the connections between the connected nodes and the network device, click on the Network Vicinity view. See figure 30.

    image
    Figure 30

    2. Click on the show computers check box to see the connections. See figure 31.

    Note:

    By default we can see connections which are one hop away from the network device.

    We can select at most 5 hops. In environments with a large number of network devices, selecting five hops can take a while for Operations Manager 2012 to display, and the view might not be useful to you.

    image
    Figure 31

    3. Now, coming back to the Network Devices view in the Monitoring workspace, click on the Network Node Dashboard. We will be able to view all the information related to the network device in just one window. See Figures 32, 33, 34, and 35.

    image
    Figure 32

    image
    Figure 33

    image
    Figure 34

    image
    Figure 35

    D. Reporting: [See Figure 36]

    Processor Utilization Report: It displays the processor utilization of a particular network device in a specified period of time.

    Memory Utilization Report: It displays the percentage of free memory on a particular network device in a specified period of time.

    Interface Traffic Volume Report: It displays the rate of inbound and outbound traffic that goes through the selected port or interface in a specified period of time.

    Interface Error Packet Analysis Report: It displays the percentage of error packets or discarded packets, both inbound and outbound, for the selected port or interface.

    Interface Packet Analysis Report: It displays the types of packets (unicast or non-unicast) that traverse the selected port or interface.

    image
    Figure 36


  • Performing an Active Directory Health Check Before Upgrading

    (Post courtesy Bonoshri Sarkar)

    Hi everyone, this is Bonoshri Sarkar here. I have worked for Microsoft as a Partner Technical Consultant specializing in Directory Services for the past two years, providing end-to-end consulting to enable partners to design, position, sell, and deploy Microsoft platforms for their customers. In my earlier role, I worked for more than four years on the Microsoft Support team, focusing on Microsoft Directory Services.

    Since I have a great affinity for Directory Services, I thought it would be a great idea to pen down my thoughts and experience on ensuring a smooth Active Directory Upgrade.

    For any kind of upgrade/migration/transition to go smoothly, and to end up with a healthy environment afterwards, you need to spend a fair amount of time planning and making sure that the source environment is in a healthy state. Two driving factors for any upgrade or transition are the need to utilize the new features that the new version of the product has to offer, and the desire to ease the complexities and issues of the current environment. However, most IT pros do not take adequate steps to check the health of their existing Active Directory environment. In this post, I would like to address some of the key steps that an AD administrator must perform prior to an upgrade or transition.

    In my experience of assisting customers and partners in different transitions, most of the issues pertain to the source domain or the source domain controllers, so I will discuss a few important things which should be considered mandatory before any kind of upgrade/migration/transition.

    Performing an Active Directory Health Check

    The health check should be done in 2 phases.

    1. Planning Phase

    2. Deploy Phase (just before implementing the upgrade, transition or migration)

    In the first phase we should identify which services and roles are running on the machine that we are planning to upgrade, and rule out anything that we do not want to move to our new box.

    Putting emphasis on diagnosing AD issues, we can use dcdiag to ensure a healthier Active Directory. We have been using dcdiag for many years and usually look for failure messages in the output, but apart from the failure messages, we can also consider issues such as those highlighted in yellow below:

    clip_image001

    clip_image003

    clip_image004

    Notice that the first part of the dcdiag output says “failed test replication”, which implies that there are issues with Active Directory replication with this domain controller.

    The second message tells us that there are issues with NETLOGON and SYSVOL, which are the default logon shares; both errors can be interdependent or could have completely different causes.

    In this scenario we need to fix AD replication first, or dig into it more to find what is causing these errors. You can use a few more commands to check AD replication, like repadmin /syncall /eAP. In the case of a huge enterprise, you can also use Replmon (Windows Server 2003).

    The third message tells us that the important services are running. We need to be sure that the above services are started to ensure a smooth transition.

    If we don’t get enough details from the dcdiag results, check Event Viewer; if you do not see anything, restart the FRS service and then check Event Viewer for Event ID 13516 (a few of these checks are sketched below).

    clip_image005
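
    For convenience, here is a short sketch of the replication and FRS checks discussed above, runnable from an elevated PowerShell prompt (standard dcdiag/repadmin switches):

      dcdiag /v /f:dcdiag-report.txt        # verbose health check, saved to a file
      repadmin /replsummary                 # forest-wide replication summary
      repadmin /showrepl * /csv > repl.csv  # per-DC replication status as CSV
      repadmin /syncall /eAP                # force a sync, as mentioned above

      # Look for FRS Event ID 13516 after restarting the FRS service
      Get-EventLog -LogName "File Replication Service" -Newest 20 |
          Where-Object { $_.EventID -eq 13516 }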

    Apart from dcdiag you can also use Netdiag to check the network status and get detailed information.

    In addition to this, make sure the NIC drivers are updated on the old server.

    Instead of disabling the hardware- or software-based firewall on the servers (old and new), ensure that you make the appropriate exceptions and port configurations so that the directory servers can communicate properly (see Active Directory and Active Directory Domain Services Port Requirements).

    Any third-party legacy applications should be tested in a lab environment to make sure that they are compatible with the new version of the server OS and Active Directory.

    clip_image007

    We also have different versions of the Exchange BPA (Best Practices Analyzer) tool, depending on the version of Exchange, to check Exchange integrity and Exchange-specific permissions (you can select the permission check to gather that information).

    Last but not least, read the migration or transition documents (http://technet.microsoft.com/en-us/library/cc731188(WS.10).aspx) to make sure the server meets all the minimum requirements.

    Once we are sure that the servers are in a healthy state, do not forget to take a full backup and a system state backup using a supported backup system, as documented in the TechNet article below:

    http://technet.microsoft.com/en-us/library/cc731188(WS.10).aspx
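
    For example, on Windows Server 2008 and later a system state backup can be taken with Windows Server Backup from an elevated prompt (a sketch; E: is an assumed backup target, and the Windows Server Backup feature must be installed):

      wbadmin start systemstatebackup -backupTarget:E: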

    All these stitches in time would definitely save you nine hours’ worth of troubleshooting. It’s up to you to decide: would you rather troubleshoot, or enjoy your fries with Coke?


  • SharePoint 2010–Returning Document ID in Search Results

    (Post courtesy Sean Earp, with research and XSLT authoring by Alaa Mostafa)

    One of my favorite features of SharePoint 2010 is the Document ID.

    As discussed in the MSDN article Developing with Document Management Features in SharePoint Server 2010 (ECM):

    A document ID is a unique identifier for a document or document set and a static URL that opens the document or document set associated with the document ID, regardless of the location of the document. Document IDs provide:

    • A way to reference items such as documents and document sets in SharePoint Server 2010 that is less fragile than using URLs. URLs break if the location of the item changes. In place of the URL, the document ID feature creates a static URL for each content item with a document ID assigned to it.

    • More flexible support for moving documents or document sets at different points in the document life cycle. For example, if you create a document on a MySite or Workspace page and then publish it on a team site, the document ID persists and travels with the document, circumventing the broken URL problem.

    • A document ID generator that assigns a unique document ID to items. You can customize the format of the IDs that the service generates. By using the document management API, you can write and use custom document ID providers.

    image

    When browsing a document library with this feature enabled, you can display the Document ID column, and you will be able to see the Document ID for a given document.  Easy enough, and useful if you need to reference this Document ID in another system.

    This works great when you can browse a document library, perhaps using the new metadata navigation and filtering capabilities of SharePoint 2010, but if your document library is holding thousands and thousands of documents, users may resort to using search to find the document they are looking for.  Unfortunately, SharePoint search does not display the document ID in the search results by default.

    image

    Fortunately, SharePoint indexes Document IDs as a managed property by default, which means that with a little magic, we can add the Document ID into the search results.

    In a nutshell, SharePoint retrieves the search results as XML, and uses XSLT to transform the XML into the pretty results you see on the search results page.  Same basic concept as HTML (which has your content) and CSS (which styles that content).  We just need to tell SharePoint to return the managed property with our Document ID, and then update the XSLT to display that managed property in the search results. 

    It is not as hard as it sounds.

    Assumptions: I assume you have enabled the Document ID feature on the site collection, all documents have been assigned Document IDs, and a full crawl has been done of the site.  I also assume you are a site collection administrator with full permissions to the site collection.

    From your search results page in the site collection (wherever you have it), click on Page –> Edit (or Site Actions –> Edit Page).  You will see a ton of zones and web parts (such as the Refinement Panel, Search Statistics, and Search Box).  You can customize the heck out of the search results page and move things all over the place.

    image

    For now, however, we are just going to modify the Search Core Results web part that contains…er… the core search results.  How intuitive!

    Edit the Search Core Results web part, and expand the section that says “Display Properties”.  Uncheck the box that says “Use Location Visualization”.  I have no idea why this option is named as it is… this is really the option that lets you edit the fetched properties and XSL.

    image

    As a quick aside… although you can edit the fetched properties and XSL directly from the web page properties, the experience is horrible.  I strongly recommend using an XML editor like Visual Studio or Notepad++.

    In the Fetched Properties section you will see a number of columns that look like the following.  These are the managed properties that are returned by SharePoint Search:

    <Column Name="PictureHeight"/>  <Column Name="PictureWidth"/>

    Somewhere before the closing </Columns> tag, add:

    <Column Name="spdocid"/>

    (Note: if you are using SharePoint search instead of FAST search, replace all instances of “spdocid” with “DocId”)

    This will cause SharePoint to return the Document ID in the search results XML.  Now let’s modify the XSL so that we display the ID in the search results.  Click on “XSL Editor…” and copy the XSL into your XML editor of choice (or, if you like pain, just edit the 938-line-long XSL sheet in a browser that does no validation or color coding. Your choice.)

    At the top of the XSL is a list of parameter names.  Add in the following parameter (order does not matter)

    <xsl:param name="spdocid" />

    image

    Next, search for “DisplayAuthors”.  After the DisplayAuthors call template, we are going to add a new call template called “DisplayID” to… well, display the ID. The template is wrapped in a conditional to ensure that if there is NOT a document ID, it does not attempt to display a null value.

    Add the following lines:

                  <xsl:if test="string-length($hasViewInBrowser) &gt; 0">
                    
                          <xsl:call-template name="DisplayID">
                            <xsl:with-param name="spdocid" select="spdocid" />
                            <xsl:with-param name="browserlink" select="serverredirectedurl" />
                            <xsl:with-param name="currentId" select="$currentId" />
                          </xsl:call-template>
                      
                  </xsl:if>

    image

    Search for “DisplayString” and we will add a section to call our template, display the ID (along with a URL that links to the document), and we’ll put brackets around the Document ID so it stands out visually.  Add the following:

      <xsl:template name="DisplayID">
        <xsl:param name="spdocid" />
        <xsl:param name="currentId" />
        <xsl:param name="browserlink" />
        <xsl:if test="string-length($spdocid) &gt; 0">
          <xsl:text xml:space="default"> [ </xsl:text>
          <a href="{concat($browserlink, $ViewInBrowserReturnUrl)}" id="{concat($currentId,'_VBlink')}">
            <xsl:value-of select="$spdocid" />
          </a>
          <xsl:text xml:space="default"> ] </xsl:text>
        </xsl:if>
      </xsl:template>
    image

    We’re almost done!  Select all your XSL, copy it, and paste it back into your SharePoint window, hit Save –> Okay –> Check In –> Publish

    Voila!  The Document ID now shows up in the search results with a clickable link back to the source document.

    image

    Random troubleshooting tip:  If you get the message “Property doesn't exist or is used in a manner inconsistent with schema settings”, this typically means one of two things:

    1. You created a custom managed property and have not yet run a full crawl, so the property does not exist in the index yet (the Document ID property is mapped out of the box, so this does not apply here)
    2. You are using the wrong managed property.  FAST search uses “spdocid” while SharePoint search uses “DocId”

    image

    image

    Attachments: I have attached a copy of the XSL I used for the above post to save you time copying and pasting into the right sections.  It works for me with SharePoint search, but use on a test server first and at your own risk.

  • Office 365 New Features for November 2013

    November was another month full of updates for Office 365.  Partner Solution Consultant Jon Horner runs through the updates in the video above, and you can click through to the various announcements below.

    OneNote for Windows Store v2.1

    A major update for OneNote for Windows Store is now available, with a redesigned user interface. Included are new views, enhanced sharing, improved ink, and more new features.

    More Information: A BIG OneNote update for Windows note-taking devices
    Download: OneNote

    Folder Permissions and Calendar Delegation in Outlook Web App

    You can now configure folder and calendar permissions in Outlook Web App to give delegates access to your inbox and other folders.  This is particularly useful if you need someone like an admin to manage your inbox or calendar while you’re out of the office.

    More information: Configuring delegate access in Outlook Web App

    November 2013 Update for the Lync 2013 Desktop Client

    Features included in the November Update:

    • Photos of Sender/Receiver – View photos of sender/receiver inline with IM conversation
    • URL Photo Experience – Set your own photo from a public web site instead of using the corporate image
    • Login Trace Files – Easily access Lync client login logs
    • New Recording Options – Choose preferred resolution for client-side recordings

    More information: November 2013 Update for the Lync 2013 Desktop Client

    Office Web Apps Update

    Improved Online User Experience

    • Real time coauthoring and presence across Office Web Apps

    Excel Web App

    • the ability to drag and drop cells
    • reorder sheets
    • open and interact with spreadsheets that have sheet protection
    • see aggregates of the selected range (SUM, COUNT, AVERAGE, etc.) in the status bar
    • insert Apps for Office

    Word Web App

    • ability to Find and Replace content
    • apply styles to tables
    • insert Headers and Footers
    • see page numbers as placeholders, and
    • Auto save

    PowerPoint Web App

    • crop pictures

    More information: Collaboration just got easier: Real-time co-authoring now available in Office Web Apps

    SharePoint Online

    Get a Link

    It is now possible to generate a Guest Link (an anonymous link to a document) directly in the SharePoint Online document library user interface. This removes the step of sending an email to yourself to generate the Guest Link in order to use it for non-email purposes – and it’s quicker and more efficient.

    More Information: Get a Link

    Touch Design

    We are updating the default HTML5 "mobile view" in mobile browsers. Users accessing SharePoint on mobile devices will get an updated touch-friendly experience. The new experiences primarily target the Web views of a user's SkyDrive Pro, the Sites hub and default team site pages.

    More information: SharePoint Online introduces the Touch Design mobile experience

    Improved Sharing Emails - All on To... line

    Now everyone you share with will be on the To... line. When you share with multiple people via the sharing dialog in SharePoint, SharePoint will send one email to everyone you shared with and cc you as the sender, rather than sending a separate email to each recipient. Note: external email addresses typed in the people picker will still send individual external invitations, since external invites need unique redemption links.

    More information: SharePoint Online improves external sharing

    Outlook

    High DPI Improvements

    The latest update of Outlook 2013 is now optimized for high pixel density screens on tablet devices.

    Compact Message Header

    In the latest update of Outlook 2013, the email message header in the reading pane can be collapsed to provide more screen space for the body of the email.  Also, when the reading pane is too narrow to show all recipients in the message header, an indicator appears to tell you how many recipients are not shown.

    Yammer

    Yammer Channel Expansion

    Yammer Enterprise will be included in O365 E1-E4, including Government and Non-Profits, for the Direct, Syndication, and Open channels with automated self service provisioning. Additionally, new SKUs Yammer Enterprise Standalone and SharePoint Online Plan 1/2 + Yammer will launch for Direct and Syndication Channels.

    More Information: Getting it done with social: Yammer introduces new features, expands to all Office 365 enterprise customers

    Identity

    Windows Azure Active Directory Premium – PREVIEW

    Windows Azure Active Directory Premium, which is currently in preview, provides a number of directory-related features through the Windows Azure portal. Many of these will be very valuable for Office 365 customers. The new features in Windows Azure Active Directory Premium include:

    1. Self-service Password Reset for Users
    2. Group-based provisioning and access management to SaaS apps
    3. Customizable access panel
    4. Machine learning based security monitoring and reports.

    Stay tuned, as additional features are planned for future previews. More information about the Windows Azure Active Directory Premium preview is available here.

    To stay completely up-to-date on the latest announcements, make sure to visit the Office 365 Message Center. This is where tenant-specific announcements or items requiring administrative action will be posted.  The following blogs are where the product teams will release announcements about new features:

    Technology-Specific Blogs

  • System Center Operations Manager 2012 Installation Walkthrough

    (Post courtesy Rohit Kochher)

    System Center Operations Manager 2012 makes significant changes to setup compared with Operations Manager 2007: setup has become simpler and installation easier.

    If you want to follow along on a test server, you can download the beta version of SCOM 2012 from here.

    Note: The Root Management Server (RMS) concept from Operations Manager 2007 R2 has been removed in Operations Manager 2012; all Operations Manager 2012 servers are management servers. However, there is an RMS emulator to support those management packs which target the RMS. Architecturally, servers in Operations Manager 2012 have a peer-to-peer relationship, not a parent-child relationship like Operations Manager 2007 R2.

    In this blog we will discuss the setup of Operations Manager 2012, with some screenshots of the installation wizard. Microsoft SQL Server 2008 SP1 or 2008 R2 should be installed prior to running SCOM 2012 setup. You can get more information on SCOM 2012 supported configurations here.

    Now, once we run setup.exe we will see the following screen:

    image

    You can click Install to set up the management server, management console, web server, and reporting server. Under optional installations, you can choose to install the local agent, Audit Collection Services (ACS), the gateway management server, and ACS for UNIX/Linux.

    Once you click Install, you will get the screen to accept the license agreement. Once you accept it, you will see the screen below:

    image

    You can select the components that you want to install. Clicking on the downward-pointing arrow in front of each role will give brief information about that role. There is no explicit option to install the operations database and data warehouse, as they are integrated into setup. After selecting the features, you will get a screen for the location of the program files. The default location is C:\Program Files\System Center Operations Manager 2012.

    image

    The next step will show you prerequisite failures (if any). You will get information for failures along with download links to install any missing prerequisites.

    Next you get a screen to input information about the management server. You can specify whether it is the first management server in a new management group or an additional management server in an existing management group.

    image

    You can specify the name of the management group here. You will also get the screen to specify the operations database. We need to install both the operations database and the data warehouse in Operations Manager 2012; installing the data warehouse is mandatory in 2012 (a change compared with Operations Manager 2007), as it is needed for features like dashboards. If this is a second management server, you can click on the Add a management server to an existing management group option.

    image

    After specifying the required information about the operations database and clicking Next, you will get a similar screen for the Operations Manager data warehouse.

    The next screen allows you to configure Operations Manager service accounts.

    image

    You can specify the required accounts on this screen and click Next to continue the setup. Setup will automatically assign the server’s local Administrators group to the Operations Manager admin role. Once you enter account information here, it will be automatically verified in the background. If an account cannot be verified (or the password is incorrect), you will get a red warning, as the picture above illustrates.

    After this, you will get the option to participate in the Microsoft Customer Experience Improvement Program (CEIP) and Error reporting. Finally, you will also get the option for configuring Microsoft Updates.

    image

    The last screen will provide you with an installation summary. Clicking Install will start the installation. Once it finishes, you are all set to monitor your infrastructure! Some of the great features in Operations Manager 2012 are the new dashboards, network monitoring, and application monitoring, which will be covered in future posts.

    You can check the deployment guide for Operations Manager 2012 here.

    System Center Operations Manager 2012 Beta resources

  • Getting Started with Side Loading Windows Apps

    If you have worked with Windows 8 for any length of time, chances are your customer has wanted to deploy a custom app. The app isn’t something you want to publish to the Windows Store, so what is the best way to go about deploying it to all the customer’s devices?  Consider side loading.  This brief video will explain how to get started with side loading in test environments.
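
    For a quick start on a test machine, side loading boils down to two things: allowing trusted apps to install, and adding the package. A hedged PowerShell sketch (the package path and name are illustrative, the certificate that signed the .appx must already be trusted on the machine, and the device must meet the side-loading licensing requirements):

      # Allow trusted (non-Store) apps to install on this test machine
      $key = "HKLM:\Software\Policies\Microsoft\Windows\Appx"
      if (-not (Test-Path $key)) { New-Item -Path $key -Force | Out-Null }
      New-ItemProperty -Path $key -Name "AllowAllTrustedApps" -Value 1 `
          -PropertyType DWord -Force

      # Install the app package for the current user
      Add-AppxPackage -Path "C:\Apps\ContosoApp.appx"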

After watching the video, if you want to go deeper or have specific questions, please don't hesitate to contact Partner Support. We are here to help.

  • Capture a Windows® Image from a Reference Computer Using Capture Media—for IT Pros

    (This post courtesy of Simone Pace)

In order to use System Center Configuration Manager 2007 to distribute the Windows 7 operating system to our managed clients, we need to provide the OS bits to the site server somehow. One of the methods we can use is capturing a Windows 7 WIM image from a previously prepared reference computer.

    System Center Configuration Manager 2007 offers standard and easy ways to deploy software in our IT Infrastructure. One of the most relevant features we can take advantage of is the highly customizable Operating System Deployment capability built in the product.

The WIM image format introduced with Windows Vista® and Windows 7 further simplifies OS distribution by being independent of the destination client system's hardware, so we can use a single image to target different computers and keep our image repository less complex and more easily managed. This post shows an example of the steps we can follow to successfully capture a WIM image of Windows 7 Ultimate Edition x64 from a reference computer.
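
For reference, the same WIM format can also be captured manually with the ImageX tool from the Windows AIK when booted into WinPE, in case you ever need to grab an image outside of Configuration Manager. A minimal example, where the drive letter, target path and image name are illustrative:

imagex /capture C: D:\Win7UltX64.wim "Windows 7 Ultimate x64"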

    Note: Further posts will follow that illustrate the specific tasks required to upgrade a Windows XP computer.

Testing lab description: the screenshots and computer names used in this article refer to a virtual scenario running on a Hyper-V R2 host:

• Domain: contoso.com (single site)
• All servers are Windows 2008 R2 Enterprise servers.
• Server CON-001
    • SCCM with almost all roles installed
    • SQL Server 2008
    • Windows Automated Installation Kit 2.0
    • WDS Transport Server role installed
• Server CON-002
    • Active Directory Domain Controller role installed
    • DNS Server role installed
    • DHCP Server role installed
• SCCM Primary Site: C01
• Reference client: a Windows 7 Ultimate edition x64 clean setup

1. Create a Capture Media ISO file.

The ISO image we are creating in this section will be used to boot the reference machine and start the OS WIM image creation sequence.

a. Log on to CON-001 and open the Configuration Manager console.

    b. Go to Task Sequences node.

    c. Click on “Create Task Sequence Media” in the action panel.

d. Select Capture Media and click Next on the welcome page.

    clip_image002

e. On the “Media file” page, click Browse, select the folder where you are going to save the media ISO file, give it a name (for example MediaCapture), and click Next.

clip_image004

    f. On “Boot Image” click Browse, and select the boot image suitable for your reference computer.

Note: Two boot images (x86 and x64) are automatically added when you install the WDS role on the system.

g. On Distribution Point, leave \\CON-001 (or select your preferred DP) and click Next.

clip_image006

    h. Review the summary and click Finish.

    i. The server starts building the iso image.

    clip_image008clip_image010clip_image012

    j. Click Close to close the wizard.

    2. Prepare the reference computer.

a. Log on to CON-Ref7Client with the local Administrator account.

b. Check the following requirements:

    i. The computer must be a workgroup member.

    ii. The local Administrator password must be blank.

    iii. The local system policy must not require password complexity.

    iv. Apply the latest Service Pack and updates.

    v. Install the required applications.

    3. Capture the image using the Capture Media.

a. Mount the MediaCapture.iso you created in Step 1 in the virtual DVD drive of the reference PC (if it is a VM), or

b. Burn the MediaCapture.iso to a DVD and insert it into the computer.

    c. Boot the reference computer normally.

    d. Start the autoplay DVD and launch the Capture Image Wizard.

    clip_image013

    e. Click Next.

f. Set the path where you want to save the WIM file, give the image a name, and enter credentials that can access the path and write to it.
    clip_image014

    g. Click Next.

    h. Fill in the required data in the Image Information window.

    clip_image015

    i. View the summary and launch the capture by clicking Finish.

    clip_image016

    j. The program will start executing the sysprep phase.

     clip_image017

    k. After sysprep, the computer will restart in WinPE to start the capture.

    clip_image018

    l. (Reboot).

    clip_image020

    m. Computer restarts in WinPE and starts the Capture.

    clip_image021

n. Capturing the first partition (1 of 2).

    clip_image022

o. And capturing the second partition (2 of 2).

     clip_image023

Note: The number of partitions captured depends on the reference PC’s disk partitions. In the case shown, the VM had a 100 MB partition for BitLocker® capability (partition 1 of 2).

    p. When finished, press OK to quit and restart.

    clip_image024

    q. On the Server we can see the captured image file.

    clip_image025

    4. Add the file to the image repository in SCCM 2007.

a. Share a folder and move the image file there (for example, \\ServerName\Images).

    b. Open the SCCM console, navigate to Site Database > Computer Management > Operating System Deployment > Operating System Images.

    c. Import the image by clicking Add Operating System Image in the task panel.

d. Type or browse to the network path of the image you want to import, and click Next.

    clip_image027

    e. Fill in the required information, then click Next.

     clip_image029

    f. Review the summary and complete the wizard.

    clip_image031

    clip_image033

    5. Distribute the image to Distribution Point.

    a. In the SCCM console, navigate to the image you uploaded in step 4 (Site Database > Computer Management > Operating System Deployment > Operating System Images) and select it.

    b. Click Manage Distribution Points in the action panel.

    clip_image035

    c. Click Next on the wizard starting page.

d. As the DP doesn’t have the image deployed yet, leave the default selection (copy) and click Next.

clip_image037

e. Select the DPs you want to deploy the image to, and include their PXE DPs’ hidden shares.

clip_image039

    f. Click Next and Next again in the Completion page.

    clip_image041

    clip_image043

    g. Check the copy progress in the Package Status folder until you see it is Installed.

    clip_image045

    h. You are now ready to distribute the Windows 7 Ultimate x64 Image to client computers, either by upgrading or installing new machines.

  • Office 365 New Features for December 2013

Checking back in after a great holiday vacation, and wishing all of our Partners a great 2014!

    Normally at the beginning of the month, I post a video and list of the O365 features that were released in the previous month, but December was relatively slow on new feature releases as our Partners, Customers, and Employees spent some well-deserved time with their families.  Don’t worry… there is plenty of innovation in the pipeline, so look forward to a much longer “Office 365 New Features for January 2014” post next month.

    Over on the Office 365 Technology Blog, Andy posted a list of the Office 365 features released in December:

    http://blogs.office.com/b/office365tech/archive/2014/01/02/what-s-new-december-2013.aspx

    Office 365 Home Premium & Office 365 University updates:

    Outlook.com makes it even easier to switch from Gmail - A new tool easily imports your mail and contacts, preserves the "read" status, and enables you to send email from your @gmail.com address from within Outlook.com.

    OneNote for Android update - Capture information from anywhere on your phone with a new Add to OneNote via the Android Share button.  See recent notes from your Home screen with the new OneNote Recent Widget.

    Office 365 for business updates*:

    Office 365 Admin app - Office 365 admins can now see their Office 365 service health dashboard on their iPhone and Android devices.

    Switch Plans to a different Office 365 service family - Customers can now upgrade from Small Business plans to Midsize and Enterprise plans, or from the Midsize plan to an Enterprise plan.

    OneNote for Android update - Capture information from anywhere on your phone with a new Add to OneNote via the Android Share button.  See recent notes from your Home screen with the new OneNote Recent Widget.

    Office 365 Education updates*:

    In addition to the Office 365 for Business updates,

    Student Advantage is now available - Educational institutions that subscribe to Office 365 ProPlus for their faculty and staff can now extend Office 365 ProPlus to their students at no additional cost.

    *Not all updates apply to every Office 365 plan; please check the individual post for specifics.

    Previous feature update posts are available below:

    Office 365 New Features for October 2013
    Office 365 New Features for November 2013

    A playlist with all the Office 365 Monthly Service Update videos is available here.

  • Build and Deploy Solutions Faster: Practice Accelerator Schedule for January–March

image

Microsoft Practice Accelerator is a Microsoft Partner Network benefit that provides a detailed framework of resources, spanning from pre-engagement to post-delivery, to help you improve efficiency and deliver solutions faster. The complete documentation set includes project guides, templates, architecture guidance, and planning and design guides. You will also receive customizable leave-behind materials for your customers.

    A Practice Accelerator session can be attended by several people from your organization, so that all roles involved in building and deploying a solution can take advantage of the tools and information. Click on the dates below to register. Read more about Practice Accelerator.

    Microsoft Office 365

    Course description, prerequisites, and course outline
    Download the Office 365 PA Datasheet

    SharePoint 2013: Search, Social, Portals & Collaboration

    Course description, prerequisites, and course outline
    Download the SharePoint 2013 PA Datasheet

    Data Center Infrastructure & Management

    Course description, prerequisites, and course outline
    Download the Data Center Infrastructure & Management Datasheet

    Windows 8 Flexible Workstyle

    Course description, prerequisites, and course outline
    Download the Flexible Workstyle Datasheet

The dates above are for the US deliveries. For international dates, visit the Practice Accelerator page on the Partner Portal, select the Practice Accelerator you are interested in, and choose your language/location.

  • Configuring SharePoint 2013 Search with PowerShell

    Post courtesy Partner Solution Consultant Priyo Lahiri:

I wrote this script for a demo during our Practice Accelerator for SharePoint 2013. If you have attended the session, you have already seen this in action. If not, here is the script for you to try out in your lab.

Disclaimer: before we proceed, you should know that this script has only been tested in my lab, in a very specific scenario. If you wish to use it, your environment should look exactly like mine. In other words, we are not responsible if this script ruins your farm. :)

    Take a note on the environment first:

    SharePoint Farm:

    • 2 web front end servers
    • 2 application servers
    • 1 SQL Server

Here are the services running on the servers:

    clip_image002

This script will provision Search on all the servers and configure our WFEs to host the Query Processing role.

    Very important: if your environment doesn’t look like this, STOP here.

    If you already have Search Configured, which would be your default setting if you have run the Farm Configuration Wizard, don’t use this script.

Follow this TechNet guidance to understand more:

As you can see from the above screenshot, my environment doesn’t even have the Search Service started, so we are good to go in using this script. It’s OK to modify the script for a 3-server or 2-server environment, as long as Search has never been configured on your farm, or was configured and has since been removed.

Let’s walk through the script piece by piece:

Section 1: setting up the environment, loading the SharePoint snap-in, and gathering inputs such as the server names and the managed account to use.

Set-ExecutionPolicy Unrestricted
Clear-Host

# Load the SharePoint snap-in if it is not already loaded
$snapin = Get-PSSnapin -Name Microsoft.SharePoint.PowerShell -EA SilentlyContinue
if ($snapin -ne $null) {
    Write-Host -f Green "SharePoint Snap-in is loaded... No action taken"
} else {
    Write-Host -f Yellow "SharePoint Snap-in not found... Loading now"
    Add-PSSnapin Microsoft.SharePoint.PowerShell
    Write-Host -f Green "SharePoint Snap-in is now loaded"
}
# END Loading SharePoint snap-in

# Search Service instances on the four servers (two APP, two WFE)
$hostA = Get-SPEnterpriseSearchServiceInstance -Identity "SP13App"
$hostB = Get-SPEnterpriseSearchServiceInstance -Identity "SP13-App2"
$hostC = Get-SPEnterpriseSearchServiceInstance -Identity "SP13WFE01"
$hostD = Get-SPEnterpriseSearchServiceInstance -Identity "SP13WFE02"

$searchName        = "Fabricam Search Service"
$searchDB          = "SP_Services_Search_DB"
$searchAcct        = "fabricam\spService"
$searchAcctCred    = ConvertTo-SecureString "pass@word1" -AsPlainText -Force
$searchManagedAcct = Get-SPManagedAccount | Where {$_.Username -eq 'fabricam\spService'}
$searchAppPoolName = "Search Services Application Pool"

# Create the service application pool if it does not exist yet
if ((Get-SPServiceApplicationPool | Where {$_.Name -eq $searchAppPoolName}).Name -ne $searchAppPoolName) {
    $searchAppPool = New-SPServiceApplicationPool -Name $searchAppPoolName -Account $searchManagedAcct
}
     

Section 2: starting the Search Service on all servers. You will notice there is some error handling in this script; for example, instead of just firing off the commands, I actually wait for the Search Service instance to come online before going on to the next step. I have always found this approach very stable.

     
## Start Search Service Instances
Write-Host "Starting Search Service Instances..."

# Server 1: start the instance if it is disabled, then poll until it reports Online
if ((Get-SPEnterpriseSearchServiceInstance -Identity $hostA).Status -eq 'Disabled') {
    Start-SPEnterpriseSearchServiceInstance -Identity $hostA
    Write-Host "Starting Search Service Instance on" $hostA.Server.Name
    do { Start-Sleep 5; Write-Host -NoNewline "." }
    while ((Get-SPEnterpriseSearchServiceInstance -Identity $hostA).Status -ne 'Online')
    Write-Host -ForegroundColor Green "Search Service Instance Started on" $hostA.Server.Name
} else { Write-Host -f Green "Search Service Instance is already running on" $hostA.Server.Name }

# Server 2
if ((Get-SPEnterpriseSearchServiceInstance -Identity $hostB).Status -eq 'Disabled') {
    Start-SPEnterpriseSearchServiceInstance -Identity $hostB
    Write-Host "Starting Search Service Instance on" $hostB.Server.Name
    do { Start-Sleep 5; Write-Host -NoNewline "." }
    while ((Get-SPEnterpriseSearchServiceInstance -Identity $hostB).Status -ne 'Online')
    Write-Host -ForegroundColor Green "Search Service Instance Started on" $hostB.Server.Name
} else { Write-Host -f Green "Search Service Instance is already running on" $hostB.Server.Name }

# Server 3
if ((Get-SPEnterpriseSearchServiceInstance -Identity $hostC).Status -eq 'Disabled') {
    Start-SPEnterpriseSearchServiceInstance -Identity $hostC
    Write-Host "Starting Search Service Instance on" $hostC.Server.Name
    do { Start-Sleep 5; Write-Host -NoNewline "." }
    while ((Get-SPEnterpriseSearchServiceInstance -Identity $hostC).Status -ne 'Online')
    Write-Host -ForegroundColor Green "Search Service Instance Started on" $hostC.Server.Name
} else { Write-Host -f Green "Search Service Instance is already running on" $hostC.Server.Name }

# Server 4
if ((Get-SPEnterpriseSearchServiceInstance -Identity $hostD).Status -eq 'Disabled') {
    Start-SPEnterpriseSearchServiceInstance -Identity $hostD
    Write-Host "Starting Search Service Instance on" $hostD.Server.Name
    do { Start-Sleep 5; Write-Host -NoNewline "." }
    while ((Get-SPEnterpriseSearchServiceInstance -Identity $hostD).Status -ne 'Online')
    Write-Host -ForegroundColor Green "Search Service Instance Started on" $hostD.Server.Name
} else { Write-Host -f Green "Search Service Instance is already running on" $hostD.Server.Name }
     

Section 3: starting the Search Query and Site Settings Service instance on the application servers. If you don’t wait for the Search Service on the application servers to respond in the step above and try to run this step, it’s very likely to fail.

    Additional Reference: http://msdn.microsoft.com/en-us/library/office/microsoft.office.server.search.administration.searchqueryandsitesettingsservice.aspx

    This service provides the creation of a SearchServiceApplication or a SearchServiceApplicationProxy to a Search service application. This allows the caller to establish effective load balancing of Search queries across query servers.


    ## Start Query and Site Settings Service Instance
    Write-Host "
    Starting Search Query and Site Settings Service Instance on" $hostA.server.Name "and" $hostB.server.Name
    Start-SPEnterpriseSearchQueryAndSiteSettingsServiceInstance $hostA.server.Name
    Do { Start-Sleep 3;
    Write-host -NoNewline "."  } 
    While ((Get-SPEnterpriseSearchQueryAndSiteSettingsServiceInstance | Where {$_.Server.Name -eq $hostA.server.Name}).status -ne 'Online')
    Write-Host -ForegroundColor Green "
        Query and Site Settings Service Instance Started on" $hostA.Server.Name
     
    Start-SPEnterpriseSearchQueryAndSiteSettingsServiceInstance $hostB.server.Name
    Do { Start-Sleep 3;
    Write-host -NoNewline "."  } 
    While ((Get-SPEnterpriseSearchQueryAndSiteSettingsServiceInstance | Where {$_.Server.Name -eq $hostB.server.Name}).status -ne 'Online')
    Write-Host -ForegroundColor Green "
        Query and Site Settings Service Instance Started on" $hostB.Server.Name 

     

    Section 4: Create the Search Service Application


## Create Search Service Application
Write-Host "
Creating Search Service Application..."

$searchAppPool = Get-SPServiceApplicationPool -Identity "Search Services Application Pool"

# Provision the Search Service Application if one is not already online
if ((Get-SPEnterpriseSearchServiceApplication).Status -ne 'Online') {
    Write-Host " Provisioning. Please wait..."
    $searchApp = New-SPEnterpriseSearchServiceApplication -Name $searchName -ApplicationPool $searchAppPool -AdminApplicationPool $searchAppPool -DatabaseName $searchDB
    do { Start-Sleep 2; Write-Host -NoNewline "." }
    while ((Get-SPEnterpriseSearchServiceApplication).Status -ne 'Online')
    Write-Host -f Green "
    Provisioned Search Service Application"
} else {
    Write-Host -f Green "Search Service Application already provisioned."
    $searchApp = Get-SPEnterpriseSearchServiceApplication
}

Section 5: creating the admin component. The initial search topology is created with this as well.


    ## Set Search Admin Component
    Write-Host "Set Search Admin Component..."
    $AdminComponent = $searchApp | Get-SPEnterpriseSearchAdministrationComponent | Set-SPEnterpriseSearchAdministrationComponent -SearchServiceInstance $hostA 

    Section 6: get the Initial Search Topology and Clone it

    ## Get Initial Search Topology
    Write-Host "Get Initial Search Topology..."
    $initialTopology = Get-SPEnterpriseSearchTopology -SearchApplication $searchApp
     
    ## Create Clone Search Topology
    Write-Host "Creating Clone Search Topology..."
    $cloneTopology = New-SPEnterpriseSearchTopology -SearchApplication $searchApp -Clone -SearchTopology $initialTopology 

     

Section 7: here is where we define where the different search components will live. We want to create a redundant topology for load balancing and high availability. As long as the same services are running on multiple servers, SharePoint will use its internal load balancer.

    We will deploy Admin Component, Crawl Component, Analytics Processing Component, Content Processing Component and Index Partition 0 on both the Application Servers and we will run Query Processing Component on both the Web Front End Servers.

If you are new to the components mentioned above, refer to this TechNet article.


    ## Host-A Components
     
    Write-Host "Creating Host A Components (Admin, Crawl, Analytics, Content Processing, Index Partition)..."
     
    $AdminTopology = New-SPEnterpriseSearchAdminComponent -SearchServiceInstance $hostA -SearchTopology $cloneTopology
    $CrawlTopology = New-SPEnterpriseSearchCrawlComponent -SearchServiceInstance $hostA -SearchTopology $cloneTopology
    $AnalyticsTopology = New-SPEnterpriseSearchAnalyticsProcessingComponent -SearchServiceInstance $hostA -SearchTopology $cloneTopology
    $ContentProcessingTopology = New-SPEnterpriseSearchContentProcessingComponent -SearchServiceInstance $hostA -SearchTopology $cloneTopology
    $IndexTopology = New-SPEnterpriseSearchIndexComponent -SearchServiceInstance $hostA -SearchTopology $cloneTopology -IndexPartition 0
     
    ## Host-B Components
     
    Write-Host "Creating Host B Components (Admin, Crawl, Analytics, Content Processing, Index Partition)..."
     
    $AdminTopology = New-SPEnterpriseSearchAdminComponent -SearchServiceInstance $hostB -SearchTopology $cloneTopology
    $CrawlTopology = New-SPEnterpriseSearchCrawlComponent -SearchServiceInstance $hostB -SearchTopology $cloneTopology
    $AnalyticsTopology = New-SPEnterpriseSearchAnalyticsProcessingComponent -SearchServiceInstance $hostB -SearchTopology $cloneTopology
    $ContentProcessingTopology = New-SPEnterpriseSearchContentProcessingComponent -SearchServiceInstance $hostB -SearchTopology $cloneTopology
    $IndexTopology = New-SPEnterpriseSearchIndexComponent -SearchServiceInstance $hostB -SearchTopology $cloneTopology -IndexPartition 0
     
    ## Host-C Components
     
    Write-Host "Creating Host C Components (Query)..."
     
    $QueryTopology = New-SPEnterpriseSearchQueryProcessingComponent -SearchServiceInstance $hostC -SearchTopology $cloneTopology
     
    ## Host-D Components
     
    Write-Host "Creating Host D Components (Query)..."
     
    $QueryTopology = New-SPEnterpriseSearchQueryProcessingComponent -SearchServiceInstance $hostD -SearchTopology $cloneTopology

Section 8: next we will activate the new topology that we just created above and remove the initial search topology, which should now be in the “Inactive” state.

    ## Activate Clone Search Topology
    Write-Host "Activating Clone Search Topology...Please wait. This will take some time"
    Set-SPEnterpriseSearchTopology -Identity $cloneTopology
     
    ## Remove Initial Search Topology
    Write-Host "Removing Initial Search Topology..."
    $initialTopology = Get-SPEnterpriseSearchTopology -SearchApplication $searchApp | where {($_.State) -eq "Inactive"}
    Remove-SPEnterpriseSearchTopology -Identity $initialTopology -Confirm:$false 

Section 9: the last step is to create the Search Service Application proxy.

    ## Create Search Service Application Proxy
    Write-Host "Creating Search Service Application Proxy..."
    $searchAppProxy = New-SPEnterpriseSearchServiceApplicationProxy -Name "$searchName Proxy" -SearchApplication $searchApp
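
At this point you can optionally verify from PowerShell that all components report a healthy state before you open Central Administration. A quick check, reusing the $searchApp variable from the script above:

## Optional: list the state of every search component
Get-SPEnterpriseSearchStatus -SearchApplication $searchApp -Text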

    And here is the end result:

    Here are the services running on all servers:

    clip_image002[5]

    And here is the topology:

    clip_image004

    Cheers!

    Priyo

    Note: The referenced PowerShell script is attached to this post.  Use at your own risk, modify to meet the parameters of your own environment, and test before using in production!

  • Migrating File Shares to SharePoint Online

    (Post courtesy Partner Solution Consultant Andre Kieft)

It has been a while since I created a blog post, but recently I received a lot of questions and requests for advice on how to migrate file shares to SharePoint and use SkyDrive Pro (SDP). So I figured I would create a blog post with the things you need to consider as a Small and Medium Business (SMB) partner when you are planning to migrate file share content into SharePoint and want to use SDP for synchronizing the SharePoint content offline.

Note that these steps are valid for both SharePoint 2013 on-premises (on-prem) and SharePoint Online (SPO).

    Step 1 – Analyze your File Shares

    As a first step, try to understand the data that resides on the file shares. Ask yourself the following questions:

    • What is the total size of the file share data that the customer wants to migrate?
    • How many files are there in total?
    • What are the largest file sizes?
    • How deep are the folder structures nested?
    • Is there any content that is not being used anymore?
    • What file types are there?

    Let me try to explain why you should ask yourself these questions.

    Total Size

If the total size of the file shares is more than the storage capacity you have in SharePoint, you need to buy additional storage (SPO) or increase your disk capacity (on-prem). To determine how much storage you will have in SPO, please check the Total available tenant storage in the tables in this article. Another issue that may arise is reaching the capacity limit per site collection. For SPO that is 100 gigabytes; for on-premises the recommended size per site collection is around 200 gigabytes, which also means the content database is around 200 gigabytes, the recommended size. Though you can stretch this number in on-prem, it is not recommended.

    So, what should I do when my customer has more than 100 Gigabyte?

    • Try to divide the file share content over multiple site collections when it concerns content which needs to be shared with others.
    • If certain content is just for personal use, try to migrate that specific content into the personal site of the user.

    How Many Files

The total amount of files on the file shares is important, as there are some limits in both SharePoint and SDP that can result in an unusable library or list within SharePoint, and you might also end up with missing files when using the SDP client.

First, in SPO we have a fixed limit of 5000 items per view, folder or query. The reasoning behind this 5000 limit boils all the way down to how SQL Server works under the hood. If you would like to know more about it, please read this article. In on-prem there is a way to raise this limit, but it is not something we recommend, as performance can decrease significantly when you increase it.

Secondly, for SDP there is also a limit of 5000 items for synchronizing team sites and 20,000 for synchronizing personal sites. This means that if you have a document library that contains more than 5000 items, the rest of the items will not be synchronized locally.

There is also a limit of 5 million items within a document library, but I guess most customers in SMB won’t reach that limit very easily.

    So, what should I do if my data that I want to migrate to a document library contains more than 5000 items in one folder?

    • Try to divide that amount over multiple subfolders or create additional views that will limit the amount of documents displayed.

But wait! If I already have 5000 items in one folder, doesn’t that mean the rest of the documents won’t get synchronized when I use SDP?

Yes, that is correct. So if you would like to use SDP to synchronize documents offline, make sure that the total number of documents per library in a team site does not exceed 5000.

    So, how do I fix that?

• Look at the folder structure of the file share content and see if you can divide that data across multiple sites and/or libraries. If there is a Marketing folder, for example, it might make more sense to migrate that data into a separate site anyway, as this department probably wants to store additional information besides just documents (e.g. a calendar, general info about the marketing team, a site mailbox, etc.). An additional benefit of spreading the data over multiple sites/libraries is that it gives SDP users more granularity over what data they take offline. If you migrated everything into one big document library (not recommended), all users would need to synchronize everything, which can have a severe impact on your network bandwidth.

    Largest File Sizes

Another limit that exists in both SPO and on-prem is the maximum file size. For both, the maximum size per file is 2 gigabytes. In on-prem the default is 250 MB, but this can be increased to a maximum of 2 gigabytes.

    So, what if I have files that exceed this size?

• Well, they won’t fit in SharePoint, so you can’t migrate them. See what type of files they are and determine what they are used for in the organization. Examples could be software distribution images, large media files, training courses or other materials. If these are still being used and not highly confidential, it is not a bad thing to keep them on alternative storage like a SAN, NAS or DVDs. If it concerns data that just needs to be kept for legal reasons and doesn’t need to be retrieved instantly, you might put it on DVD or an external hard drive and store it in a safe, for example.

    Folder Structures

Another important aspect to look at on your file shares is the depth of nested folders and the resulting URL length. The recommended maximum length of a URL in SharePoint is around 260 characters. You would think 260 characters is pretty lengthy, but remember that URLs in SharePoint often have encoding applied to them, which takes up additional space. E.g. a space is one character, but URL-encoded it becomes %20, which takes up three characters. The problem is that you can run into issues when the URL becomes too long. More details about the exact limits can be found here, but as a best practice try to keep the URL length of a document under 260 characters.

    So, what if I have files that will have more than 260 characters in total URL length?

• Make sure you keep your site URLs short (the site title can be long, though). E.g. don’t call the URL Human Resources; call it HR. When you land on the site, you will still see the full name Human Resources, as site title and URL are separate things in SharePoint.
• Shorten document names (e.g. strip off “…v.1.2” or “…modified by Andre”), as SharePoint has versioning built in. More information about versioning can be found here.

    Idle Content

Migrating file shares into SharePoint is often also a good opportunity to clean up some of the information the organization has been collecting over the years. If there is a lot of content that has not been accessed for a couple of years, what would be the point of migrating that data to SharePoint?

    So, what should I do when I come across such content?

    • Discuss this with the customer and determine if it is really necessary to keep this data.
    • If the data cannot be purged, you might consider storing it on a DVD or external hard drive and keep it in a safe.
• If the content has multiple versions, such as proposal 1.0.docx, proposal 1.1.docx, proposal final.docx, proposal modified by Andre.docx, you might consider just moving the latest version instead of migrating them all. This manual process might be time consuming, but it can save you lots of storage space in SharePoint. Versioning is also built into SharePoint and is optimized for storing multiple versions of the same document; for example, SharePoint only stores the delta of the next version, saving storage space that way. Note that this functionality is only available in SharePoint on-prem.

    Types of Files

Determine what kinds of files the customer has. Are they mainly Office documents? If so, then SharePoint is the best place to store such content. However, if you come across developer code, for example, it is not a good idea to move that into SharePoint. There are also other file extensions that are not allowed in SPO and/or on-prem. A complete list of blocked file types for both SPO and on-prem can be found here.

    So, what if I come across such file extensions?

• Well, you can’t move them into SharePoint, so ask yourself: do I still need these files? And if so, is there an alternative storage facility, such as a NAS, I can store them on? If it concerns developer code, you might want to store that code on a Team Foundation Server instead.

    Tools for analyzing and fixing file share data

In order to determine whether you have large files or exceed the 5000-item limit, for example, you need some kind of tooling. There are a couple of approaches here.

• First off, there is a PowerShell script that has been enhanced by a German colleague, Hans Brender, which checks for blocked file types, bad characters in files and folders, and the maximum URL length. The script will even fix invalid characters and file extensions for you. It is a great script, but it requires some knowledge of PowerShell. Another alternative I was pointed at is a tool called SharePrep, which scans for URL length and invalid characters.
• Secondly, there are other 3rd party tools that can scan your file share content, such as TreeSize. Such tools do not necessarily check for the SharePoint limitations we talked about in the earlier paragraphs, but at least they will give you a lot more insight into the size of the file share content.
• Finally, there are actual 3rd party migration tools that move the file share content into SharePoint and check for invalid characters, extensions and URL length upfront. We will dig into these tools in Step 2 – Migrating your data.
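
If you just want a quick first pass before reaching for those tools, a few lines of PowerShell can surface the worst offenders. Here is a minimal sketch; the share path and the thresholds (the 2 GB file size cap, the ~260-character URL budget and the 5000-item view/sync limit) are illustrative and should be adjusted to your environment:

# Quick scan of a file share for common SharePoint migration blockers (sketch)
$share = '\\ServerName\Share'   # illustrative share path

# Files that exceed the maximum file size or the URL length budget
Get-ChildItem -Path $share -Recurse -File | ForEach-Object {
    if ($_.Length -gt 2GB)          { Write-Host "TOO LARGE:" $_.FullName }
    if ($_.FullName.Length -gt 260) { Write-Host "LONG PATH:" $_.FullName }
}

# Folders holding more than 5000 items
Get-ChildItem -Path $share -Recurse -Directory | ForEach-Object {
    $count = (Get-ChildItem -Path $_.FullName -File).Count
    if ($count -gt 5000) { Write-Host "BIG FOLDER ($count items):" $_.FullName }
}

Note that this counts only files directly in each folder; it is a rough first pass, not a replacement for the dedicated tools above.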

    Step 2 – Migrating your data

    So, now that we have analyzed our file share content, it is time to move them into SharePoint. There are a couple of approaches here.

    Open with Explorer

If you are in a document library, you can open the library in Windows Explorer. That way you can just copy and paste the files into SharePoint.

    image

But there are some drawbacks to this scenario. First of all, I’ve seen lots of issues trying to open a library in Windows Explorer. Secondly, the technology used for copying the data into SharePoint is not very reliable, so keep that in mind when copying larger chunks of data. Finally, there is also drag & drop, but this is limited to files (no folders) and a maximum of 100 files per drag; so if you have 1000 files, you need to drag them in 10 chunks. More information can be found in this article. Checking for invalid characters, extensions and URL length upfront is also not addressed when using the Open with Explorer method.

    Pros: Free, easy to use, works fine for smaller amounts of data

Cons: Not always reliable, no metadata preservation, no upfront detection of things like invalid characters, file type restrictions, path lengths, etc.

    SkyDrive Pro

You could also use SDP to upload the data into a library. This is fine as long as you don’t sync more than 5000 items per library. Remember, though, that SDP is a sync tool, not a migration tool, so it is not optimized for copying large chunks of data into SharePoint. Things like character and file type restrictions and path length checks are on the SDP team’s list to address, but they are not there yet.

The main drawback of using either the Open in Explorer option or SDP is that they don’t preserve the metadata of the files and folders on the file shares. By this I mean that fields like modified date or owner are not migrated into SharePoint: the owner becomes the user who copied the data, and the modified date becomes the timestamp of when the copy operation was executed. So if this metadata on the file shares is important, don’t use either of the methods mentioned earlier; use one of the third-party tools below.

    Pros: Free, easy to use, works fine for smaller amounts of data (max 5000 per team site library or 20000 per personal site)

Cons: No metadata preservation, no upfront detection of things like invalid characters, file type restrictions, path lengths, etc.

    3rd party tools

    Here are some of the 3rd party tools that will provide additional detection, fixing and migration capabilities that we mentioned earlier:

    (Thx to Raoul for pointing me to additional tools)

The list above is in random order; some tools focus on SMB, while others are more focused on the enterprise segment. We can’t express a preference for one tool or another, but most of the tools have a free trial version available, so you can try them out yourself.

    Summary

    So, when should I use what approach?

    Here is a short summary of capabilities:

Capability                     Open in Explorer   SkyDrive Pro                          3rd party
Amount of data                 Relatively small   No more than 5000 items per library   Larger data sets
Invalid character detection    No                 No                                    Mostly yes (1)
URL length detection           No                 No                                    Mostly yes (1)
Metadata preservation          No                 No                                    Mostly yes (1)
Blocked file types detection   No                 No                                    Mostly yes (1)

(1) This depends on the capabilities of the 3rd party tool.

    Troubleshooting

    SDP gives me issues when synchronizing data
Please check that you have the latest version of SDP installed. There have been stability issues in earlier builds of the tool, but most of those issues should be fixed by now. You can check whether you are running the latest version by opening Word -> File -> Account and clicking Update Options -> View Updates. If your installed version number is lower than the latest available version, click the Disable Updates button (click Yes if prompted), then click Enable Updates (click Yes if prompted). This will force a download of the latest version of Office, and thus the latest version of the SDP tool.

    image

    If you are running the stand-alone version of SDP, make sure you have downloaded the latest version from here.

    Why is the upload taking so long?
    This really depends on a lot of things. It can depend on:

    • The method or tool that is used to upload the data
    • The available bandwidth for uploading the data. Tips:
      • Check your upload speed at http://www.speedtest.net and do a test for your nearest Office 365 data center. This will give you an indication of the maximum upload speed.
• Often companies have less available upload bandwidth than people at home. If you have the chance, uploading from a home location might be faster.
      • Schedule the upload at times when there is much more bandwidth for uploading the data (usually at night)
      • Test your upload speed upfront by uploading maybe 1% of the data. Multiply it by 100 and you have a rough estimate of the total upload time.
    • The computers used for uploading the data. A slow laptop can become a bottle neck while uploading the data.

    If you feel that there are things missing here, please let me know and I’ll try to add them to this blog post.

  • Sending e-mails from Microsoft Dynamics CRM

    (post courtesy Sarkis Derbedrossian)

I often meet Microsoft CRM users who don’t know how sending e-mail works within Microsoft Dynamics CRM. Most users think that when they create an e-mail in CRM and hit the Send button, the e-mail is sent automatically. But neither Outlook nor CRM can send e-mail without a mail system, e.g. an Exchange server. Below you will learn how e-mail within CRM works, with and without Outlook.

    E-mail in relation to CRM

Once you’ve created an e-mail activity in MS CRM and clicked the Send button, the e-mail is handled differently depending on the settings each user has in MS CRM.

E-mail can be handled through Outlook or directly through CRM... but neither Outlook nor MS CRM performs the physical delivery of the e-mail. This is done by a mail server (Microsoft Exchange Server or another mail system).

    Sending an E-mail in MS CRM

Do not jump to the conclusion that MS CRM can neither receive nor send e-mail; just understand that the task above requires an e-mail system to accomplish.

    When you send email from MS CRM it usually happens by the following steps:

    1. The user creates an e-mail activity and clicks on the send button. The e-mail is now saved with the user as the recipient
    2. The e-mail gets synchronized to the user’s Outlook
3. The user’s Outlook sends the e-mail to the mail server (Exchange)
    4. Exchange sends the e-mail through the internet

What if the user does not have the Outlook client open? In that case the mail will not be sent until the user logs into Outlook. In some situations this is insufficient. Fortunately, installing the E-mail Router can solve it.

    Sending e-mail via the e-mail router

If you want to be independent of Outlook, and thus send e-mail directly from MS CRM without using Outlook, you can do so by installing and configuring an E-mail Router.

The E-mail Router is free software that comes with MS CRM. The software can be installed on any server that has access to both a mail server (Exchange Server or another mail system (POP3/SMTP)) and MS CRM.

When you send e-mail from MS CRM using an E-mail Router, it usually happens by the following steps:

    1. The user creates an e-mail activity and clicks on the send button. The e-mail is now saved with the user as the recipient
    2. The e-mail is sent to the e-mail router
    3. The email router sends the e-mail to the mail server (Exchange)
    4. Exchange sends the e-mail through the internet

    E-mail settings in CRM

    Depending on how you want your organization to send e-mails, remember to check the following settings:

    1. In CRM, Settings, Users
    2. Open the user form
3. In the configuration section for e-mail access, select the desired setting

    clip_image002

    Configuring e-mail access

    It is possible to choose one of the following settings from the option list:

    None
Outlook cannot be used for sending and receiving e-mails related to MS CRM

    Microsoft Dynamics CRM for Outlook
Outlook is responsible for sending/receiving e-mail. MS CRM for Outlook must be installed and configured. E-mails are sent/received only when Outlook is active (open)

    E-mail router
E-mail is sent and received with the MS CRM E-mail Router. If this option is selected, a dialog box allows entering credentials. Check the box if you want to specify credentials

    Forwarded mailbox
E-mail is forwarded from another e-mail address. The E-mail Router is responsible for sending/receiving e-mails.

    More Information:

  • Competing against Google for an Office 365 Deal?

image

Our Q2 FY14 Google compete campaign was built for Microsoft partners specifically targeting SMB customers. It provides targeted demand generation materials to help guide you in 1:1 conversations with customers who are either considering Google Apps for Business or who have recently purchased Google Apps and aren’t satisfied.

    One size doesn’t fit all when it comes to the needs of your customers – and we’ve built the Google compete campaign with that in mind.  This campaign provides marketing and readiness tools to help you start discussions with your SMB customers about the benefits of Office 365 when competing with Google Apps for Business.

This campaign is not intended to replace the campaign materials found in Office BEST for SMB or the Office Small Business Premium campaign – so continue using those as a supplement to your Google compete activity. The intention of this campaign is to highlight the multiple areas where Office 365 beats the competition – Google – and provide specific scenario examples for you to use when marketing and selling.

    Campaign Includes:

    • Google Compete "Win-back Scenario" Emails
    • Google Compete "Consideration Scenario" Emails
    • Google Compete Sales Presentation
    • Google Compete Infographic
    • Google Compete Campaign Talking Points
    • Google Compete Campaign Guidance
    • Google Compete Leave-Behind

    Access the materials on the Ready-to-Go Marketing Campaign site here: Office 365 Beats Google

    If you have a Silver or Gold competency, make sure to take advantage of your Technical Presales Assistance benefit.  Partners who have attained a silver or gold competency receive unlimited presales advisory support for any deal worth US$3,000 or more. For deals worth less than US$3,000, you can access Technical Presales Assistance by phone (1-800-MPN-SOLV) using your allotted partner advisory hours or through the Partner Support Community.

    Submit a request now

    Examples of services
    • Competitive assistance
    • Feature overview and comparison guidance
    • Request for proposal (RFP) questions
    • Business value proposition
    • Proof-of-concept guidance
    • Technical licensing recommendations
  • Top 10 Partner PTS Blog Posts for 2013

    Reflecting back on 2013, I just dived (dove?) into the logs for the PTS Blog for 2013, and thought I would share the top 10 posts for the year. 

    It looks like the deeply technical posts are much more popular than the “announcement” type posts.  If there are any topics you’d like to see a Partner Solution Consultant cover, post in the comments!

  • WSUS not configured error during Configuration Manager 2012 Software Update Point Installation

    (Post courtesy Anil Malekani)

    Recently I tried configuring Software Update Management in Configuration Manager 2012. After installing WSUS on the Configuration Manager 2012 box, I tried to install Software Update Point as a site role.

    clip_image002

The Software Update Point role successfully installed, as per the SUPSetup.log file (under C:\Program Files\Microsoft Configuration Manager\Logs).

However, my updates still did not appear in the console. After checking the Site Component status for SMS_WSUS_SYNC_MANAGER and SMS_WSUS_CONFIGURATION_MANAGER, I noticed the errors below:

    SMS_WSUS_SYNC_MANAGER: Message ID 6600

    clip_image003

    SMS_WSUS_CONFIGURATION_MANAGER: Message ID 6600

    clip_image004

    I checked under WCM.log (under C:\Program Files\Microsoft Configuration Manager\Logs), and found the following proxy error

    =============================

    SCF change notification triggered.        SMS_WSUS_CONFIGURATION_MANAGER        1/1/1601 12:00:00 AM        5408 (0x1520)

    This SCCM2012.CORP80.COM system is the Top Site where WSUS Server is configured to Sync from Microsoft Update (WU/MU) OR do not Sync.        SMS_WSUS_CONFIGURATION_MANAGER        1/1/1601 12:00:00 AM        5408 (0x1520)

    Found WSUS Admin dll of assembly version Microsoft.UpdateServices.Administration, Version=3.0.6000.273, Major Version = 0x30000, Minor Version = 0x17700111        SMS_WSUS_CONFIGURATION_MANAGER        1/1/1601 12:00:00 AM        5408 (0x1520)

    Found WSUS Admin dll of assembly version Microsoft.UpdateServices.Administration, Version=3.1.6001.1, Major Version = 0x30001, Minor Version = 0x17710001        SMS_WSUS_CONFIGURATION_MANAGER        1/1/1601 12:00:00 AM        5408 (0x1520)

    The installed WSUS build has the valid and supported WSUS Administration DLL assembly version (3.1.7600.226)        SMS_WSUS_CONFIGURATION_MANAGER        1/1/1601 12:00:00 AM        5408 (0x1520)

    System.Net.WebException: The request failed with HTTP status 502: Proxy Error ( The host was not found. ).~~ at Microsoft.UpdateServices.Administration.AdminProxy.CreateUpdateServer(Object[] args)~~ at Microsoft.UpdateServices.Administration.AdminProxy.GetUpdateServer(String serverName, Boolean useSecureConnection, Int32 portNumber)~~ at Microsoft.SystemsManagementServer.WSUS.WSUSServer.ConnectToWSUSServer(String ServerName, Boolean UseSSL, Int32 PortNumber)        SMS_WSUS_CONFIGURATION_MANAGER        1/1/1601 12:00:00 AM        5408 (0x1520)

    Remote configuration failed on WSUS Server.        SMS_WSUS_CONFIGURATION_MANAGER        1/1/1601 12:00:00 AM        5408 (0x1520)

    STATMSG: ID=6600 SEV=E LEV=M SOURCE="SMS Server" COMP="SMS_WSUS_CONFIGURATION_MANAGER" SYS=SCCM2012.corp80.com SITE=CM1 PID=2424 TID=5408 GMTDATE=Fri Oct 14 00:20:03.092 2011 ISTR0="SCCM2012.corp80.com" ISTR1="" ISTR2="" ISTR3="" ISTR4="" ISTR5="" ISTR6="" ISTR7="" ISTR8="" ISTR9="" NUMATTRS=0        SMS_WSUS_CONFIGURATION_MANAGER        1/1/1601 12:00:00 AM        5408 (0x1520)

    Waiting for changes for 46 minutes        SMS_WSUS_CONFIGURATION_MANAGER        1/1/1601 12:00:00 AM        5408 (0x1520)

    =============================

I validated that the proxy had been configured correctly and that my browser contained the same settings.

    Resolution: After spending some time I found that Configuration Manager 2012 uses the system account proxy settings, which were set to Automatically detect settings.

    1. Using the excellent PsExec utility, I opened a command prompt under the system account (using the –s parameter).
    2. Within this command prompt running as system, I launched Internet Explorer and removed proxy settings.
    3. Finally, updates started appearing in the console.
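
For reference, the commands look like this (run from an elevated command prompt; the Internet Explorer path shown is the default on a 64-bit system):

psexec -s -i cmd.exe
"%ProgramFiles%\Internet Explorer\iexplore.exe"

The -s switch runs the process as the Local System account and -i makes it interactive on your desktop; from the Internet Explorer instance that opens, clear the proxy settings under Internet Options > Connections > LAN settings.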

    clip_image005

  • Designing Apps for Different Window Sizes

We often receive questions from partners about the best way to manage different window and screen sizes when building an app. Below is a short video that will help you through some of the common scenarios.

  • Office 365 ProPlus Deployment for IT Pros

    If you are planning a rollout of Office 365 ProPlus (the client bits), make sure to head on over to Microsoft Virtual Academy to watch the Office 365 ProPlus Deployment for IT Pros Jump Start.

The goal of this course is for the audience to learn about the IT Pro deployment features of Office 365 ProPlus, such as licensing and activation, Click-to-Run, Office Telemetry, and building App-V packages. Join us as we deliver demo-rich sessions that discuss these topics, first as an overview and then more in depth. Come hear how these new IT investments make upgrading Office a lot easier, faster, and more economical.

    The presenters have a wealth of knowledge. Curtis Sawin is a Technology Solutions Professional specializing in Office Compatibility and Office Deployment for Microsoft, while Dean Yamada is a Senior Premier Field Engineer, specializing in proactive deliveries of Office Premier Workshops and onsite Office Setup, Deployment and Migration advisories.

    imageimage

    Modules include:

    01 | Office 365 ProPlus Identity, Licensing, and Activation
    An overview of the identity, licensing, and activation requirements of Office 365 ProPlus

    02 | Office Telemetry Overview
    Introduces Office Telemetry, what Telemetry can do for you, and details all the Office Telemetry components

    03 | Advanced Telemetry
    In-depth about the Office Telemetry infrastructure, capacity planning, performance, and the Telemetry Dashboard Administration Tool

    04 | Telemetry Custom Reporting
    Examples of how to mine data from the Office Telemetry database, and a behind-the-scenes look at custom reporting

    05 | Office 365 ProPlus Click-to-Run Deployment Overview
    Introduces Microsoft’s new streaming and virtualization technology, Click-to-Run, which is used to deploy Office 365 ProPlus

    06 | Click-to-Run Deep Dive
    Internals of how Click-to-Run works, and real world tips and tricks that will enable you to successfully deploy Office 365 ProPlus

    07 | Office and App-V
How Office 365 ProPlus and App-V work together, the advantages of this approach, and a demonstration of how it works

    image

  • SLA and Disaster Recovery Planning for Microsoft Business Intelligence

(Post courtesy a Partner Technical Consultant specializing in Data Analytics)

    Service Level Agreement Planning and Disaster Recovery Planning for the Microsoft Business Intelligence Stack with Microsoft SQL Server and Microsoft SharePoint Server

For a fully implemented Microsoft Business Intelligence stack, which might be composed of SQL Server, SSAS, SSIS, SSRS, SharePoint Server, MDS, and DQS, the question may arise of how to ensure a consistent state across all of the applications in case of a total or partial failure, where the different components may be subject to varying maintenance schedules.

    In the worst case, disaster recovery requires us to recover from a situation where the primary data center, which hosts the servers, is unable to continue operation. Even potentially smaller disruptions like power outages, data corruption or accidental deletion of data can force us to restore data or configuration from our backups. It is well known that fault-tolerance is achieved through redundancy, ideally not only at the data layer but also in each and every component of the network and all services (switches, servers, Active Directory, SMTP…)

In this blog post we would like to focus on the Microsoft Business Intelligence stack and provide an overview of what you need to consider when defining Service Level Agreements, and how to prepare for a fast resumption of operation after such an unwelcome event has occurred.

    Balancing cost and risk of downtime

    First, let's consider the two factors that determine the Service Level Agreement corresponding to Availability. The whitepaper on "High Availability with SQL Server 2008 R2" at http://technet.microsoft.com/en-us/library/ff658546.aspx explains it concisely:

    "The two main requirements around high-availability are commonly known as RTO and RPO. RTO stands for Recovery Time Objective and is the maximum allowable downtime when a failure occurs. RPO stands for Recovery Point Objective and is the maximum allowable data-loss when a failure occurs. Apart from specifying a number, it is also necessary to contextualize the number. For example, when specifying that a database must be available 99.99% of the time, is that 99.99% of 24x7 or is there an allowable maintenance window?"

This means that both the RPO, i.e. the amount of data you are willing to lose, and the RTO, the duration of the outage, need to be determined individually, depending on your customer's specific needs. Their calculation follows an actuarial principle in that cost and risk need to be balanced. Please do not forget that the RTO not only depends on how soon your services are back online but might in certain circumstances also encompass the time needed to restore data up to a certain point in time from backups.

    Assuming a 24x7x365 operation, the following calculation applies, taken from "Create a high availability architecture and strategy for SharePoint 2013" at http://technet.microsoft.com/en-us/library/cc748824.aspx:

   Availability class   Availability measurement   Annual downtime
   Two nines            99%                        3.7 days
   Three nines          99.9%                      8.8 hours
   Four nines           99.99%                     53 minutes
   Five nines           99.999%                    5.3 minutes

So now we start to appreciate what it means that in Windows Azure we receive a general SLA of 99.9% across services, and 99.95% for cloud services; cf. http://www.windowsazure.com/en-us/support/legal/sla.

And here is one more argument in favor of using Windows Azure as your secondary site and standby data center: if you back up your databases and transaction logs to Azure blob storage and take Hyper-V based snapshots of your virtual machines, which you then transfer to Azure blob storage, you will only incur the (cheap) storage cost and still be able to bring the VMs online any time you decide to, paying for them only while they are running. Windows Server 2012 Backup and Windows Azure Backup allow you to back up system state and files/folders to Windows Azure storage as well.

    Alternatively, availability can be calculated as the expected time between two consecutive failures for a repairable system as per the following formula:

       Availability = MTTF / (MTTF + MTTR)

    where MTTF = Mean Time To Failure and MTTR = Mean Time To Recover.
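
    For example, assuming on average one failure per year (MTTF ≈ 8,760 hours) and a mean recovery time of 8.8 hours, we land exactly in the "three nines" row of the table above:

       Availability = 8,760 / (8,760 + 8.8) ≈ 0.999 = 99.9%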

    Scope

    A disaster recovery concept needs to encompass the whole architecture and all technologies involved, and include best practices for both functional and non-functional requirements (non-functional requirements cover qualities such as performance, security, etc.).

    To summarize, partners need to define in the SLA towards their customers an RPO (recovery point objective) and RTO (recovery time objective). For this, they are looking for a disaster recovery concept that takes into account:

    - full, differential and transaction log backups (assuming the database is in the full recovery model)

    - application backups

    - any add-on components of the software

    - Hyper-V virtual images of production servers

    With that, let's take a detailed look at end-to-end disaster recovery planning for a Microsoft BI solution.

    SQL Server databases

    To begin with, how do the above concepts apply to the SQL Server databases?

    Where would you look first to find out about the recovery time of all of your databases? Correct: the SQL Server error log, which can be read along a timeline.
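
    For example, the (undocumented but widely used) sp_readerrorlog procedure lets you filter the current log for recovery messages; the search term here is an assumption about the message text you are after:

       -- 0 = current log file, 1 = SQL Server error log (as opposed to the Agent log)
       EXEC sp_readerrorlog 0, 1, N'Recovery';
       -- Typical matches: "Recovery of database 'SalesDW' (5) is 45% complete"
       --                  "Recovery completed for database SalesDW ..."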

    To estimate the roll forward rate for a standalone system, one could use a test copy of a database and restore a transaction log from a high-load time period to it. The application design plays an important role as well: Short-running transactions reduce the roll forward time.

    Upon failover in an AlwaysOn failover cluster instance, all databases need to be recovered on the new node: transactions that are committed in the transaction log are rolled forward in the database, whereas transactions that were still open (uncommitted) are rolled back.

    Side note: in a Failover Cluster Instance, the switchover time is furthermore impacted by factors such as storage regrouping or DNS/network name provisioning. On the client side, you can configure the connection timeout to shorten the time needed to reestablish a broken connection.

    The new SQL Server 2012 Availability Groups make it easy to monitor RPO and RTO. For details, see "Monitor for RTO and RPO" at http://technet.microsoft.com/en-us/library/dn135338.aspx.

    Here are some tips for an efficient backup strategy of your SQL Server databases:

    - Use a separate physical location where you store the backups.

    - Have a schedule to carry out regular backups, for example a nightly full backup, a differential backup every 6 hours, and a transaction log backup every 30 minutes if you need point-in-time recovery (a T-SQL sketch of such a schedule follows this list).

    - Enable CHECKSUM on backups. This is the default with backup compression, which is available in Standard, Business Intelligence and Enterprise Edition.

    - Test your backups periodically by restoring them, because you might unknowingly carry data corruption forward, making your backups useless. Monitor the suspect_pages table in msdb to determine when a page-level restore is sufficient.

    - With regard to long-term archival, it is considered good practice to maintain three different retention periods. If you use three rotation schemes, for example creating full backups daily, weekly and monthly and storing each on its own media set, then you can regenerate your data from these if necessary. This is called the grandfather-father-son principle and allows media sets to be reused after their retention period: a backup taken on a Monday overwrites that of some previous Monday, and so on. The screenshot at http://social.msdn.microsoft.com/Forums/en-US/92fbf076-3cd1-4ab2-97d2-1ae6c9e909c7/grandfatherfatherson-backup-scenario depicts these options very well.

    - Filegroups for historical partitions can be marked as read-only, hence require only a one-time filegroup backup. A piecemeal restore of read-write filegroups can accelerate recovery.

    - Use "SQL Server Backup to Windows Azure" to upload the backup files for offsite storage, optimally with compression and encryption, even for versions earlier than SQL Server 2014. Check out the "Microsoft SQL Server Backup to Microsoft Windows Azure Tool" at http://www.microsoft.com/en-us/download/details.aspx?id=40740.

    - While the RBS FILESTREAM provider, which uses local disk storage, is integrated with SQL Server's backup and restore procedures, with a third party RBS provider it will be your responsibility to back up the Remote Blob Storage separately in a consistent manner, cf. "Plan for RBS in SharePoint 2013" http://technet.microsoft.com/en-us/library/ff628583.aspx.
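
    Here is the promised minimal T-SQL sketch of such a schedule (placeholder database and file names; in practice these commands would run as SQL Server Agent jobs with timestamped file names instead of INIT overwrites):

       BACKUP DATABASE SalesDW TO DISK = 'E:\Backup\SalesDW_full.bak'
       WITH CHECKSUM, COMPRESSION, INIT;                 -- nightly full backup

       BACKUP DATABASE SalesDW TO DISK = 'E:\Backup\SalesDW_diff.bak'
       WITH DIFFERENTIAL, CHECKSUM, COMPRESSION, INIT;   -- every 6 hours

       BACKUP LOG SalesDW TO DISK = 'E:\Backup\SalesDW_log.trn'
       WITH CHECKSUM, COMPRESSION, INIT;                 -- every 30 minutes

       -- Check for damaged pages that may call for a mere page-level restore:
       SELECT database_id, file_id, page_id, event_type, error_count, last_update_date
       FROM msdb.dbo.suspect_pages;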

    Fortunately, the Microsoft stack is built with availability in mind. With SQL Server Availability Groups in SQL Server 2012 and higher you get a highly available set of databases with secondary replicas for failover, disaster recovery, or load balancing of read requests. Availability groups are a feature of SQL Server Enterprise Edition, which also comes with more online features than the other editions to allow for higher availability and faster recovery, notably online page and file restore and the Database Recovery Advisor. The latter is helpful for point-in-time restores across sometimes complicated backup chains. For a concise list please see the table at http://msdn.microsoft.com/en-us/library/cc645993.aspx#High_availability.

    With SQL Server Availability Groups spread out to Windows Azure virtual machines it is even possible to host your secondary database replicas in Azure and, for example, run your System Center Data Protection Manager and its agent in the cloud against them.

    Marked transactions allow you to restore several databases, for example the MDS database, the SSIS catalog and the corresponding user databases on the same instance of SQL Server, consistently up to the very same point in time. This can be advantageous around a major event, for example the merger of two companies’ databases. See "Use Marked Transactions to Recover Related Databases Consistently (Full Recovery Model)" at http://technet.microsoft.com/en-us/library/ms187014.aspx.
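
    A minimal sketch with placeholder database, table and file names; note that the mark is only recorded in the log of a database that is actually modified inside the marked transaction:

       -- Set a named mark in the logs of all databases touched by the transaction
       BEGIN TRANSACTION MergeCutover WITH MARK 'Before merging company databases';
           UPDATE MDM.dbo.SystemFlags     SET Frozen = 1;  -- placeholder tables
           UPDATE SalesDW.dbo.LoadControl SET Halted = 1;
       COMMIT TRANSACTION MergeCutover;

       -- Later, restore each database consistently up to the same mark:
       RESTORE DATABASE MDM FROM DISK = 'E:\Backup\MDM_full.bak' WITH NORECOVERY;
       RESTORE LOG MDM FROM DISK = 'E:\Backup\MDM_log.trn'
       WITH STOPATMARK = 'MergeCutover', RECOVERY;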

    SSAS

    Since SQL Server Analysis Services is mainly a read-only system, you can do without transaction logging or differential backups. If the metadata (.xmla files) is available, it is sufficient to recreate and reprocess your cubes. If you additionally have working database backups (.abf files), those can be restored directly.

    It is possible to run a separate SSAS server, which has the same configuration settings, in a remote location and supply it regularly with the latest data via database synchronization.

    Hints:

    - When running SSAS in SharePoint mode (as the POWERPIVOT instance), the SharePoint content and service application databases contain the application and data files.

    - If you host your secondary replica for read access in Windows Azure, you will want to place your SSAS instance running in an Azure VM within the same Availability Set.

    SSIS

    SQL Server Integration Services offers two deployment models since version 2012: the package deployment model for backward compatibility and the new project deployment model. Backup and restore procedures depend on where the packages are stored. With package deployment, the package store can be folders in the file system or the msdb database; any file-based packages should be copied away together with a dtutil script that can upload them again, plus any centrally managed configuration files.

    Starting with SQL Server 2012, project deployment to the Integration Services server is strongly recommended. The SSISDB catalog is a database that stores projects, packages, parameters, environments and operational history, and as such can be backed up into a .bak file. You also need to back up the master key for the SSISDB database; the resulting file is encrypted with a password you specify. Unless the key changes, this is a one-time operation.
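
    A minimal sketch of both steps, with placeholder path and password:

       -- The SSISDB catalog is backed up like any user database
       BACKUP DATABASE SSISDB TO DISK = 'E:\Backup\SSISDB.bak'
       WITH CHECKSUM, COMPRESSION;

       -- One-time backup of the SSISDB master key, encrypted with a password
       USE SSISDB;
       BACKUP MASTER KEY TO FILE = 'E:\Backup\SSISDB_key'
           ENCRYPTION BY PASSWORD = '<strong password>';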

    SSRS

    With SQL Server Reporting Services in native mode being a stateless service, it is the ReportServer database that contains the metadata and the report snapshots with data. It can be protected as required by your SLA and RTO via full or differential backups; experience has shown that full backups alone are often fast enough. The ReportServerTempDB database can be recreated at any time. Do not forget to back up the encryption key, which protects sensitive data stored in the database. This should be done at creation time and suffices unless the service identity or computer name changes. If you use subscriptions, you need to back up the corresponding SQL Server Agent jobs as well, which can be accomplished via a simultaneous backup of the msdb database. For a backup of the Report Server configuration and custom assemblies, please refer to the corresponding links in the final section of this blog post.
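
    As a minimal sketch for native mode (placeholder paths), the catalog and msdb backups go hand in hand; the encryption key itself is backed up separately, for example with the rskeymgmt.exe utility:

       BACKUP DATABASE ReportServer TO DISK = 'E:\Backup\ReportServer.bak'
       WITH CHECKSUM, COMPRESSION;
       BACKUP DATABASE msdb TO DISK = 'E:\Backup\msdb.bak'  -- subscription Agent jobs
       WITH CHECKSUM, COMPRESSION;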

    Concerning SQL Server Reporting Services in SharePoint mode, the SharePoint 2013 built-in backup does not take care of the SSRS databases (including the additional Reporting Services alerting database), so the previous paragraph still applies: use the SQL Server tools with SharePoint Server, or the SQL Server (Express) tools with SharePoint Foundation. As for the application part, since SSRS in SharePoint mode is a true SharePoint service application, configuration occurs through Central Administration, and SharePoint Server's own backup and recovery applies.

    SharePoint Server

    The BI (also called Insights) features of SharePoint Server, such as Excel Services, Access Services, Visio Services and PerformancePoint Services, benefit from SharePoint Server's backup options for service applications. A restore of a service application database has to be followed by provisioning the service application. Please find further details in the TechNet articles referenced below.

    MDS

    Master Data Services consists of a database, in which all master data as well as the MDS system settings are stored, plus the Master Data Manager web application. Scheduling daily full backups and more frequent transaction log backups is recommended. MDSModelDeploy.exe is a useful tool for creating packages of your model objects and data.

    Side note: in our experience, it is less the IIS-hosted web site than the MDS database itself that tends to become the bottleneck under high load. Hence, a scale-out would not necessarily mean just several MDS web sites pointing to the same database, although this does provide redundancy and keeps the service available while web servers are updated. Rather, it would separate models out into different MDS databases. On the one hand this increases the overhead for security accounts and administration, given that the metadata tables are completely isolated from each other; on the other hand, blocking is avoided and the databases can be managed independently.

    DQS

    Data Quality Services keeps its information in three databases, DQS_MAIN, DQS_PROJECTS and DQS_STAGING_DATA, and can therefore be integrated neatly into your SQL Server backup and restore processes. With the help of DQSInstaller.exe it is even possible to export all of the published knowledge bases from a Data Quality Server to a DQS backup file (.dqsb) in one go.
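
    For instance (placeholder paths), the three databases simply join the regular backup routine:

       BACKUP DATABASE DQS_MAIN         TO DISK = 'E:\Backup\DQS_MAIN.bak'         WITH CHECKSUM, COMPRESSION;
       BACKUP DATABASE DQS_PROJECTS     TO DISK = 'E:\Backup\DQS_PROJECTS.bak'     WITH CHECKSUM, COMPRESSION;
       BACKUP DATABASE DQS_STAGING_DATA TO DISK = 'E:\Backup\DQS_STAGING_DATA.bak' WITH CHECKSUM, COMPRESSION;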

    Cross-cutting best practices

    - Using a SQL alias for connections to your SQL Server computer eases the process of moving a database, for example when a SQL virtual cluster name changes. For instructions see "Install & Configure SharePoint 2013 with SQL Client Alias" http://blogs.msdn.com/b/sowmyancs/archive/2012/08/06/install-amp-configure-sharepoint-2013-with-sql-client-alias.aspx, which shows how you gain flexibility over the SQL Server connection string by populating the SQL Server Network Configuration and the SQL Server Client Network Utility appropriately. This procedure has a significant advantage over a DNS A record or CNAME alias in that the SQL alias does not change the Kerberos SPN format for connections: you continue to use the registered DNS host name (A record) in the Service Principal Name when connecting. Furthermore, it allows you to specify more than one alias pointing to the same instance. For example, you can create one alias for your content databases, another for your search databases, and so on, thereby planning ahead for a future scale-out.

    - System Center Data Protection Manager can be used for both database backups and application server backups. For a list of protected workloads please see the left-hand navigation bar on the page "Administering and Managing System Center 2012 - Data Protection Manager" http://technet.microsoft.com/en-us/library/hh757851.aspx.

    - In the context of private clouds, System Center comes into play with its Operations Manager to monitor SQL Server instances and virtual machines and its Virtual Machine Manager to quickly provision new virtual machines.

    Closing words

    SQL Server and SharePoint Server allow for robust disaster recovery routines as part of your business continuity plan. New hybrid and cloud based solutions enhance traditional possibilities greatly.

    As has become clear, configuration changes that occur outside of user databases should always happen in a controlled manner, requiring a tight Change Management process.

    Further reading

    "Microsoft SQL Server AlwaysOn Solutions Guide for High Availability and Disaster Recovery" http://msdn.microsoft.com/en-us/library/hh781257.aspx.

    With some good discussions: "Simple script to backup all SQL Server databases" http://www.mssqltips.com/sqlservertip/1070/simple-script-to-backup-all-sql-server-databases/.

    "Back Up and Restore of System Databases (SQL Server)" http://technet.microsoft.com/en-us/library/ms190190.aspx.

    "SQL Server AlwaysOn Architecture Guide: Building a High Availability and Disaster Recovery Solution by Using AlwaysOn Availability Groups" http://msdn.microsoft.com/en-us/library/jj191711.aspx.

    "SQL Server AlwaysOn Architecture Guide: Building a High Availability and Disaster Recovery Solution by Using Failover Cluster Instances and Availability Groups" http://msdn.microsoft.com/en-us/library/jj215886.aspx.

    "Backup and Restore of Analysis Services Databases" http://technet.microsoft.com/en-us/library/ms174874.aspx.

    "Disaster Recovery for PowerPivot for SharePoint" http://social.technet.microsoft.com/wiki/contents/articles/22137.disaster-recovery-for-powerpivot-for-sharepoint.aspx.

    "Package Backup and Restore (SSIS Service)" http://technet.microsoft.com/en-us/library/ms141699.aspx.

    "Backup, Restore, and Move the SSIS Catalog" http://technet.microsoft.com/en-us/library/hh213291.aspx.

    "Backup and Restore Operations for Reporting Services" http://technet.microsoft.com/en-us/library/ms155814.aspx.

    "Migrate a Reporting Services Installation (Native Mode)" http://technet.microsoft.com/en-us/library/ms143724.aspx.

    "Migrate a Reporting Services Installation (SharePoint Mode)" http://technet.microsoft.com/en-us/library/hh270316.aspx.

    "Backup and Restore Reporting Services Service Applications" http://technet.microsoft.com/en-us/library/hh270316.aspx.

    "Planning Disaster Recovery for Microsoft SQL Server Reporting Services in SharePoint Integrated Mode" http://msdn.microsoft.com/en-us/library/jj856260.aspx.

    "Overview of backup and recovery in SharePoint 2013" http://technet.microsoft.com/en-us/library/ee663490.aspx.

    "Plan for backup and recovery in SharePoint 2013" http://technet.microsoft.com/en-us/library/cc261687.aspx.

    "Backup and restore SharePoint 2013" http://technet.microsoft.com/en-us/library/ee662536.aspx.

    "Supported high availability and disaster recovery options for SharePoint databases (SharePoint 2013)" http://technet.microsoft.com/EN-US/library/jj841106.aspx.

    "Database Requirements (Master Data Services)" http://technet.microsoft.com/en-us/library/ee633767.aspx.

    "Web Application Requirements (Master Data Services)" http://technet.microsoft.com/en-us/library/ee633744.aspx.

    "Export and Import DQS Knowledge Bases Using DQSInstaller.exe" http://technet.microsoft.com/en-us/library/hh548693.aspx.

    "Using AlwaysOn Availability Groups for High Availability and Disaster Recovery of Data Quality Services (DQS)" http://msdn.microsoft.com/en-us/library/jj874055.aspx.

    "Install SQL Server 2012 Business Intelligence Features" http://technet.microsoft.com/en-us/library/hh231681.aspx.

    "SQLCATs Guide to High Availability and Disaster Recovery", "SQLCAT's Guide to BI and Analytics" http://blogs.msdn.com/b/sqlcat/archive/2013/10/23/sqlcat-com-ebook-downloads.aspx.

    Case Study for failover to a standby data center: "High Availability and Disaster Recovery at ServiceU: A SQL Server 2008 Technical Case Study" http://technet.microsoft.com/en-us/library/ee355221.aspx.

    "Business Continuity in Windows Azure SQL Database" http://msdn.microsoft.com/en-us/library/hh852669.aspx.

    "SQL Server Managed Backup to Windows Azure" http://msdn.microsoft.com/en-us/library/dn449496.aspx.

    "SQL Server Deployment in Windows Azure Virtual Machines" http://msdn.microsoft.com/en-us/library/windowsazure/dn133141.aspx.

    Hybrid storage appliance "StorSimple cloud integrated storage" http://www.microsoft.com/en-us/server-cloud/products/storsimple.

    This posting is provided "AS IS" with no warranties, and confers no rights.

  • In Depth Training from the experts who built Windows Azure

    Partners, we are excited to share with you an excellent learning opportunity on Windows Azure brought to you by the Microsoft Virtual Academy.  Below are more details.

    Join us for a week-long series of (FREE) live, interactive sessions from the experts who built Azure, showing how you can start using Windows Azure in your solutions today. In addition to seeing lots of demos and real-world examples, you’ll be able to get your questions answered in real-time Q&A.

    Scott Guthrie, Azure Guru and Microsoft Corporate Vice President, will kick off the week on Monday January 27 by building a real world application from scratch, end to end, to show you the latest capabilities of Azure. Then each day, we’ll have deep dive sessions led by Microsoft’s top cloud platform development experts, including Scott Hanselman, Scott Hunter, Marc Mercuri, Cheryl McGuire, and Miranda Luna.

    Monday January 27, Overview Day: Get Started with Windows Azure Today Jump Start

    Join Scott Guthrie, Azure guru and Microsoft Corporate Vice President, as he builds a real-world application from scratch, and hear about the platform in a nutshell.  Register Now

    Tuesday January 28, Architecture Day: Designing Applications for Windows Azure Jump Start

    Developers, do you wonder how to design apps for the cloud? Are you interested in best practices for architectures that include services that run in the cloud? Join Marc Mercuri for a day of design and architecture lessons for Windows Azure developers.  Register Now

    Wednesday January 29, Developer Day: Building Windows Azure Applications Jump Start

    If you’ve been thinking about Windows Azure for your next application, but aren’t sure where to begin, join this deep dive with Scott Hanselman and Scott Hunter to get started building applications for the cloud today.  Register Now

    Thursday January 30, Infrastructure Day: Windows Azure IaaS Deep Dive Jump Start

    Join Cheryl McGuire for Hybrid Cloud Day, a deep dive on integrating Windows Azure into your infrastructure. She will explore details about creating VMs, how they behave in Windows Azure, and how to configure good network communication to get things up and running in the cloud.  Register Now

    Friday January 31, Mobile Services Day: Mobile Apps to IoT: Connected Devices with Windows Azure

    Join Miranda Luna for a comprehensive, infrastructure-focused look at how your mobile apps can use cloud data, manage users with ease, and quickly deliver push notifications at scale.  Register Now

  • Implementing a Loading Splash Screen for your Windows Phone App

    One of our Escalation Engineers from support put together this fantastic video for Windows Phone App developers.

    This video tutorial demonstrates how to create a loading screen in your Windows Phone app that is bound to a property in the ViewModel. It covers how to use Blend to create a simple loading screen, and how to use the DataStateBehavior in conjunction with a Boolean ViewModel property to display the loading screen while data is loading and the main view for the page when data has finished loading.

  • Office 365 New Features for October 2013

    As Microsoft has transitioned from a company that releases products every three years to a Devices and Services company that releases features and updates on a regular basis, it is becoming more important for Partners to stay on top of what is available in the current release of Office 365.  

    To help stay on top of the latest releases, the Partner Technical Services team will be releasing a monthly recap of the features that were released in the previous month.  The first video is embedded below, and a link to the announcements follows.

    “But Sean!” (I hear you thinking) “you are releasing the ‘New Features for October’ video at the end of November!” You are correct… it took longer than planned to connect the dots and make this happen. Future updates will be more timely.

    Note: If you want to skip the introduction where we discuss the release history and cadence, the feature update starts at 5:50 in the video below.

    To stay completely up-to-date on the latest announcements, make sure to visit the Office 365 Message Center. This is where tenant-specific announcements or items requiring administrative action will be posted.  The following blogs are where the product teams will release announcements about new features:

    Technology-Specific Blogs

    October Announcement Recap

    GoDaddy Integration

    While signing up for Office 365, customers now have the option to either purchase a new domain via GoDaddy from within Office 365 or change their MX records at GoDaddy, all from within the Office 365 admin page. Customers still need their GoDaddy login credentials, but they can use a much simpler interface from within Office 365 to sign in and purchase or change their domain information.

    OneNote for Android 2.2

    The update for OneNote for Android is now available as of October 2013.

    Exchange Online Mailbox Size Increase

    The size of user mailboxes in Exchange Online and Office 365 multi-tenant service plans is doubling: the current standard 25 GB user mailbox will now include 50 GB of mailbox storage. Shared mailboxes and resource mailboxes are both more than doubling, to 10 GB each. Kiosk user mailboxes will increase from 1 GB to 2 GB. Site mailboxes remain unchanged, as do all Office 365 dedicated mailboxes.

    Folder Permissions and Calendar Delegation in Outlook Web App

    You can now configure folder and calendar permissions in Outlook Web App to give delegates access to your inbox, calendar, and other folders.  This is particularly useful if you need someone like an admin to manage your inbox or calendar while you’re out of the office.

    Members Can Share

    Members of a site can share documents, and the site itself, without requiring approval from the site collection owner. When the site owner allows members to share on the site, anyone with at least Edit permissions on a document will be able to share that document, and any user in the default Members group for the site will be able to share the site. (Office.com video)

    SkyDrive Pro for iOS v1.1

    Microsoft is adding two new capabilities to the existing SkyDrive Pro for iOS app:

    1. Integration with the Office Mobile for iPhone app (on iPhone) and Office Web Apps (on iPad) for editing Word, Excel, and PowerPoint documents while maintaining file integrity in-place in SharePoint Online.
    2. Integration with the OneNote app for viewing and editing notebooks. OneNote apps, as of this month, support Office 365, enabling mobile editing of OneNote files stored in SkyDrive Pro.

    Content Search Web Part

    A core web part previously available only on-premises can now be enabled for dynamic publishing, to showcase specific content based on a predefined search query.

    The Content Search Web Part (CSWP) uses SharePoint enterprise search technology to display dynamic results on a page. When you browse to a page that contains a CSWP, you're actually issuing a query. What’s different with CSWPs is that instead of you entering query terms in a search box, the query is contained within the web part itself. This means that when you browse to a page that contains a CSWP, a query is automatically issued.

    Lync Mobile Updates for Windows Phone and iOS

    The Lync team has released some new exciting features for Windows Phone and iOS Lync Mobile clients:

    • Join a meeting even if you don’t have a Lync account – Anonymous Join
    • Support for certificate-based and passive authentication
    • View more information about meeting participants
    • Pin a contact to your home screen for quick communication (only for WP)
    • Easily start a conversation with a group (IM and Video)

    SharePoint & Yammer Integration: POST & Search

    • Ability to start a document conversation from any SP document library in Yammer (new "POST" button next to EDIT & SHARE)
    • New "Search on Yammer" link in the SP search center will pass the query string in Yammer

    Power BI PREVIEW for Office 365

    At WPC we publicly disclosed a new offering: Power BI for Office 365. Power BI provides a complete self-service BI solution for all users through the familiar and intuitive BI capabilities in Excel, with easy deployment through Office 365. This new BI offering will enable users to:

    Discover, Analyze, and Visualize with Excel

    Collaborate and Stay Connected – share insights, find answers, and stay connected from anywhere and on any device. Power BI for Office 365 enables users to quickly create Power BI Sites, BI workspaces where users can share and view larger workbooks of up to 250 MB, refresh report data, maintain data views for others and track who is accessing them, and easily find the answers they need with natural-language queries. Users can also stay connected to their reports in Office 365 from any device, with HTML5 support for Power View reports and a new Power BI mobile app for Windows.

    In the Cloud, on Your Terms – extend your existing systems with a turn-key BI solution running in Microsoft’s cloud, enterprise grade and secure. Power BI for Office 365 brings new capabilities that help IT enable users and keep the system running smoothly. First, through the Data Management Gateway, IT can enable on-premises data access for all reports published to Power BI, so that users always have the latest data. IT can also enable enterprise data search across the organization, making it easier for users to discover the data they need. The system also monitors data usage across the organization, providing IT with the information needed to understand and manage the system overall.

    Power BI Mobile App

    A new visualization app for Office that helps visualize graphs and data residing in an Excel workbook is now available in the Windows Store. Users can navigate through the data with multiple views and the ability to zoom in and out at different levels. The app will first be available for Windows 8, Windows RT, and Surface devices through the Windows Store, specifically for customers using the Power BI for Office 365 Preview. It provides touch-optimized access to BI reports and models stored in Office 365.

    Apps for Office - Seller Dashboard Enhancements

    Enhancements to the Apps for Office Seller Dashboard enable subscription pricing for apps for Office and SharePoint.

    ---------------------------

    Whew! That’s a lot!  The links above will take you to the blog posts announcing the features.  If you’d like to see a short 5-10 minute video on how to plan for and configure any of the above features, let me know at firstname.lastname@microsoft.com (substitute my actual first and last name) and I’ll add that topic to our queue of videos to release.

    As always, if you are a Microsoft Partner and need technical assistance in planning for, selling, or deploying Microsoft technologies, visit us at https://mspartner.microsoft.com/en/us/Pages/Support/technical-presales-and-advisory-services.aspx

  • Using Visual Studio Templates for Designing Windows Store Apps - Quick Start

    In this video we'll show you some of the default Visual Studio templates you can use to quickly get started building your Windows Store app.