Partner Technical Services Blog

A worldwide group of consultants who focus on helping Microsoft Partners succeed throughout the business cycle.

  • Configuring SharePoint 2013 Forms-Based Authentication with SQLMemberShipProvider

    Post courtesy Partner Solution Consultant Priyo Lahiri

    Background

    With SharePoint 2013, a lot of partners and customers are opening up their on-premises deployments to their vendors and customers. While the way you configure this is very similar to SharePoint 2010, things get a little tricky when you perform a real-world deployment spanned across multiple servers. This post is an end-to-end walkthrough of setting up Forms-Based Authentication with SQLMemberShipProvider in a three-tier SharePoint 2013 deployment.

    Environment

    It would be a whole lot easier if I had a single-server environment with the same account running everything and that account is also a Domain Admin. However, I chose a different approach, since this is most likely how your real-world deployment will be set up, and the steps are a little different when your farm is spanned across three servers. Here is my environment:

    WFE01 – Web server running Microsoft SharePoint Foundation Web Application. I am connecting to the SQL instance using an alias, which is a very smart move. If you have ever had to move your SharePoint databases across SQL Servers or decommission an aging SQL Server, you know that having a SQL alias will save you from a lot of nightmares. If you are looking for a step-by-step, see the appendix at the end of this post.

    APP01 – Central Admin Server. Note: this is NOT running Microsoft SharePoint Foundation Web Application and is configured to be a “True” application server. This also means that the Web Application that we create will not reside on this server.

    SQL01 – SQL Server running SQL Server 2012 with SP1

    SharePoint Server 2013 RTM and Windows Server 2012 RTM are used for this setup.

    Tools to use

    While the steps documented below can be done without these tools, they do make your life a whole lot easier.

    1. FBA Configuration Manager for SharePoint 2013 – Author and Credit goes to Steve Peschka. The download comes with a ReadMe file. Please read it, since you need to register the WSP that comes with it.

    2. SharePoint 2013 FBA Pack – Author and Credit goes to Chris Coulson. Here is the documentation that will tell you how to install/activate/work with it. Not only will this make user management a breeze, it also has some very useful features like password reset and self-service account management.

    NOTE: I have only tested the user management portion of the FBA Pack and didn’t have time to play with the rest of the features.

    How it’s done

    Step 1 – Create the Web Application

    In this step we will be creating the web application with Windows Authentication (Claims) and Forms-Based Authentication (FBA) on the same zone. In SharePoint 2013, you can have multiple authentication providers without extending the web application. Having said that, at times you might have to extend the web application depending on your scenario. More on that in a different post, where I will show you how to use LDAPMemberShipProvider to talk to your AD.

    From Central Administration, we will create a Web Application and call it Extranet.waterfall.net and enable both Windows Auth and FBA. Note the names I am using: ASP.NET Membership Provider Name = SQL_Membership and ASP.NET Role manager name = SQL_Role. You can call them whatever you want, just ensure you use the same names everywhere.

    clip_image002
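
    If you prefer scripting, the same web application can be created from the SharePoint 2013 Management Shell. This is a minimal sketch, assuming the URL, provider names, and app pool account used in this walkthrough; the application pool name is just an example, and the Central Administration walkthrough continues below.

      # Minimal sketch: create the claims web application with both Windows auth and FBA.
      Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

      # Windows (NTLM) claims plus forms auth pointing at the provider names used in this post
      $win = New-SPAuthenticationProvider -UseWindowsIntegratedAuthentication
      $fba = New-SPAuthenticationProvider -ASPNETMembershipProvider "SQL_Membership" -ASPNETRoleProviderName "SQL_Role"

      New-SPWebApplication -Name "Extranet" -Url "http://extranet.waterfall.net" -Port 80 `
          -HostHeader "extranet.waterfall.net" `
          -ApplicationPool "Extranet_AppPool" `
          -ApplicationPoolAccount (Get-SPManagedAccount "waterfall\spweb") `
          -AuthenticationProvider $win, $fba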

    We will create a new App Pool and use the Web App Pool account. Make a note of this account, since you will need to grant it permissions on the ASP.NET database in a later step.

    clip_image004

    Create the Web App and then the Site Collection; it doesn’t matter what template you choose. Once the Site Collection is created, visiting it will take you to the default sign-in page, where you will be asked to choose an Authentication Provider to sign in with. If you want your external users to only have the option of FBA, set this default zone to Windows Auth, extend the web application, and put FBA on the extended web app. Obviously, the URLs will then be different.

    Your sign-in page should look like this (make sure your DNS record (CNAME) points to WFE01):

    clip_image006

    Do you want to see a custom sign in page with your company brand on it? Well, let’s defer that to a different post.

    Step 2 – Verify Tools

    Now that the web app is created, we will make sure the FBA Pack and FBA Configuration Manager are deployed as they should be. Go to Central Administration >> System Settings >> Manage Farm Solutions. Make sure fbaConfigFeature.wsp is globally deployed and visigo.sharepoint.formsbasedauthentication.wsp is deployed to your web application (http://extranet.waterfall.net in this walkthrough). See the screenshot below. If visigo.sharepoint.formsbasedauthentication.wsp is not deployed, click on the WSP and deploy it to your web application.

    clip_image008

    Log in to the site collection created in the above step and activate the following feature:

    Site Settings >> Site Collection Administration >> Site Collection Features >> Form based Authentication Management

    clip_image009

    Once the feature is activated, it should add the following to your Site Settings under Users and Permissions:

    clip_image011
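
    If you would rather verify and deploy these from the SharePoint 2013 Management Shell, here is a hedged sketch; the feature identity passed to Enable-SPFeature is a placeholder, so look up the FBA Pack’s actual feature name or GUID with Get-SPFeature first.

      # Check which solutions are deployed
      Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
      Get-SPSolution | Select-Object Name, Deployed

      # If the FBA Pack WSP is not deployed yet, deploy it to the web application (check its ReadMe for exact switches)
      # Install-SPSolution -Identity "visigo.sharepoint.formsbasedauthentication.wsp" -WebApplication "http://extranet.waterfall.net" -GACDeployment

      # Find the FBA Pack feature, then activate it on the site collection
      Get-SPFeature | Where-Object { $_.DisplayName -like "*FBA*" }
      Enable-SPFeature -Identity "<FBA-Pack-feature-name-or-guid>" -Url "http://extranet.waterfall.net"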

    Step 3 – Creating the SQL Database for User Management

    The first step is to create the SQL database that will hold the extranet users:

    • Browse to C:\Windows\Microsoft.NET\Framework64\v4.0.30319
    • Run aspnet_regsql.exe
    • Click Next
    • Choose Configure SQL Server for Application Services >> Click Next
    • Enter your SQL Server name, choose Windows Authentication and type in a database name

    clip_image013

    • Click Next twice to provision the database
    • Now we need to add the application pool account that runs the web application and give it the required permissions; a scripted alternative to these steps is sketched after the screenshot at the end of this step. In this case, the application pool account is waterfall\spweb. Perform the following steps:
      • Open up SQL Management Studio, Expand the database we created and expand Security
      • Right click Users and add a new User
      • User Type = Windows User
      • User name = choose <yourAppPoolAccountName>
      • Login name = browse and choose the login name (should be same as the app pool name above)

    clip_image015

      • Click Owned Schemas and choose the following:
        • aspnet_Membership_FullAccess
        • aspnet_Personalization_FullAccess
        • aspnet_Profile_FullAccess
        • aspnet_Roles_FullAccess
        • aspnet_WebEvent_FullAccess

    clip_image017
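
    The wizard and SSMS clicks above can also be scripted. Here is a hedged sketch that reuses the server, database, and app pool account names from this walkthrough; adding the account to the aspnet_*_FullAccess database roles is a commonly used alternative to assigning it the owned schemas listed above, and the sketch assumes the waterfall\spweb login already exists on the SQL Server.

      # Provision the ASP.NET application services database from the command line
      # (-S server, -E Windows authentication, -A all = all application services features, -d database).
      & "$env:windir\Microsoft.NET\Framework64\v4.0.30319\aspnet_regsql.exe" -S SQL01 -E -A all -d Extranet_User_DB

      # Give the web app pool account access to the membership and role tables
      sqlcmd -S SQL01 -E -d Extranet_User_DB -Q "CREATE USER [waterfall\spweb] FOR LOGIN [waterfall\spweb];"
      sqlcmd -S SQL01 -E -d Extranet_User_DB -Q "ALTER ROLE aspnet_Membership_FullAccess ADD MEMBER [waterfall\spweb];"
      sqlcmd -S SQL01 -E -d Extranet_User_DB -Q "ALTER ROLE aspnet_Roles_FullAccess ADD MEMBER [waterfall\spweb];"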

    Step 4 – Editing the web.config files

    We need to edit the following web.config files:

    • Web Application Web.config – WFE server
    • STS Application web.config – WFE server and Application Server
    • Central Admin web.config – CA Server
    • If you have more WFEs and App Servers, you need to edit them as well. A lot of people put these settings in their machine.config file too, so that they get inherited by the web.config files. I am not too keen on editing the machine.config file.

    Let’s log in to our WFE server and fire up FBAConfigMgr.exe. While you can get the code you need from here and edit web.config yourself, if you just let the tool run its course, it will create a Timer Job and do the task for you. In the FBAConfigMgr, type in your application URL and from the sample configuration choose the following:

    • People Picker Wildcard
    • Connection String
    • Membership Provider
    • Role Provider

    Here is what the screen looks like when default values are chosen:

    clip_image019

    We will modify the default values to reflect the following (highlighted items need modification per your environment):

    • Web Application URL - http://extranet.waterfall.net
    • People Picker Wildcard - <add key="SQL_Membership" value="%" />
    • Connection String -
      <add name="fbaSQL" connectionString="server=SQL01;database=Extranet_User_DB;Trusted_Connection=true" />
    • Membership Provider -
      <add connectionStringName="fbaSQL" applicationName="/"
      name="SQL_Membership"
      type="System.Web.Security.SqlMembershipProvider, System.Web,
      Version=2.0.3600.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" />
    • Role Provider -
      <add connectionStringName="fbaSQL" applicationName="/"
      name="SQL_Role" type="System.Web.Security.SqlRoleProvider, System.Web,
      Version=2.0.3600.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a"/>

    The screen should now look like this:

    clip_image021

    It’s time to hit Apply Config. This will create a timer job to update your web.config files. Though the tool creates a backup, you should be proactive and take your own backup of your web application web.config and STS web.config files. Here is how to back up the web.config file and here is how to find the STS web.config file.

    Once you click Apply Config, the tool will tell you when it’s done. It might take a few minutes before you see any changes, so wait for it (you should see a new backup of your web.config file with a timestamp and _FBAConfigMgr at the end of the file name). To verify that the job is done, open up the web.config for your web application and search for <membership. You should see the following:

    <<Web Application web.config file>>

    clip_image023

    The connectionStrings section gets added to the end of the file, right above </configuration>.

    clip_image025

    <<STS web.config file>>

    Open up the STS Web.Config and you should see the following:

    clip_image027

    The connectionStrings section gets added to the end of this file as well, just like in the web application’s web.config.

    <<Central Administration web.config file on App Server>>

    If you go back to the application server and open up the web.config file for the Central Admin site, you will see there are no changes made there, so we will make that change manually. Create a backup of the file, then open the file and find <machineKey. It should look like this:

    clip_image029

    We will add the following (copied from the web.config file of the web application, or from the code in FBAConfigMgr):

    1. Search for <machineKey and paste the following under <roleManager><providers>
    <add connectionStringName="fbaSQL" applicationName="/" name="SQL_Role" type="System.Web.Security.SqlRoleProvider, System.Web,&#xD;&#xA; Version=2.0.3600.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" />

    2. Under <membership><providers> paste the following
    <add connectionStringName="fbaSQL" applicationName="/" name="SQL_Membership" type="System.Web.Security.SqlMembershipProvider, System.Web,&#xD;&#xA; Version=2.0.3600.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" />
    The screen should now look like this:
    clip_image031

    3. Scroll to the end of the document and paste the following right before </configuration>
    <connectionStrings>

    <add name="fbaSQL" connectionString="server=SQL01;database=Extranet_User_DB;Trusted_Connection=true" />

    </connectionStrings>

    clip_image033

    <<STS web.config file on App Server>>

    Just like the Central Admin web.config, make the same changes on this web.config as well. Just make sure you are pasting the information from the roleManager providers and membership providers in the right places. Here is what the code looks like (you can use the code below and make changes to the highlighted areas to suit your environment):

    <system.web>

    <membership>

    <providers>

    <add connectionStringName="fbaSQL" applicationName="/" name="SQL_Membership" type="System.Web.Security.SqlMembershipProvider, System.Web,&#xD;&#xA; Version=2.0.3600.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" />

    </providers>

    </membership>

    <roleManager>

    <providers>

    <add connectionStringName="fbaSQL" applicationName="/" name="SQL_Role" type="System.Web.Security.SqlRoleProvider, System.Web,&#xD;&#xA; Version=2.0.3600.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" />

    </providers>

    </roleManager>

    </system.web>

    <connectionStrings>

    <add name="fbaSQL" connectionString="server=SQL01;database= Extranet_User_DB;Trusted_Connection=true" />

    </connectionStrings>

    Here is a screenshot

    clip_image035

    Step 5 - Use FBA Pack to add and manage users

    Our configuration is done. We will now go to our site collection and use the FBA Pack to add and manage users and roles.

    Go to Site Settings and click FBA User Management >> Click New User, then create a test user and add it to a group with Contribute permissions

    clip_image037
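
    As a side note, once an account exists in the membership database, you can also grant it access to the site from PowerShell instead of the FBA Pack page. A hedged sketch using the claims encoding for forms users (the group name below is just an example):

      # This does NOT create the account in the membership database (the FBA Pack's New User page does that);
      # it only adds an existing forms account to the site. A forms user's claims-encoded login is
      # i:0#.f|<membership provider>|<user name>.
      Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
      New-SPUser -UserAlias "i:0#.f|sql_membership|testuser" -Web "http://extranet.waterfall.net" -Group "Extranet Members"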

    Step 6 – Verify Forms user

    Now open up IE in InPrivate mode and visit your site collection and this time choose Forms Authentication and enter the account information you just created to log in. You’re done!

    clip_image039

    Click on the user name and then My Settings; you will see the account information coming from the SQL membership provider.

    clip_image041

    If you go to a document library and try to add the user there, you will see it resolves from your SQL database.

    clip_image043

    Appendix

    How to create SQL Alias for SharePoint

    Follow the steps below to create a SQL Alias on all your SharePoint Servers:

    TechNet Reference: http://technet.microsoft.com/en-us/library/ff607733.aspx#clientalias

    1. Perform this on the Application Server that is hosting Central Administration

    a. Stop all SharePoint Services

    b. Open cliconfg.exe from C:\Windows\System32\cliconfg.exe (this is the 64-bit version of the tool)

    c. Enable TCP/IP under the General tab
    clip_image045

    d. Click on Alias Tab

    e. Type the current SQL Server name in the Alias Name field

    f. Type the current SQL Server name in the Server field (see screenshot below; in this case the SQL alias and the SQL Server name are the same)
    clip_image047

    g. Validate SQL Alias

    i. Create a new text file on SharePoint Server and name it “TestDBConnection.udl”

    ii. Double click to open the file and enter your SQL Server Alias name

    iii. Use Windows Integrated Security

    iv. You should be able to see all your SharePoint databases when you click on “Select the database on the Server”

    h. Start all services for SharePoint Server / Reboot SharePoint Server

    i. Perform the steps above on all other SharePoint servers
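
    If you would rather script the alias than click through cliconfg.exe on every server, here is a minimal sketch that writes the same TCP/IP alias to the registry (both the 64-bit and the 32-bit client keys). The names match this walkthrough; append a port (e.g. ",1433") to the value if you need one.

      # Create a SQL alias named "SQL01" that points at server SQL01 over TCP/IP (DBMSSOCN = TCP/IP).
      $aliasName = "SQL01"
      $target    = "DBMSSOCN,SQL01"
      $keys = "HKLM:\SOFTWARE\Microsoft\MSSQLServer\Client\ConnectTo",
              "HKLM:\SOFTWARE\WOW6432Node\Microsoft\MSSQLServer\Client\ConnectTo"
      foreach ($key in $keys) {
          if (-not (Test-Path $key)) { New-Item -Path $key -Force | Out-Null }
          New-ItemProperty -Path $key -Name $aliasName -Value $target -PropertyType String -Force | Out-Null
      }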

    How to backup web.config file

    To back up web.config file, perform the following:

    · From IIS Manager (start >> Run > inetmgr)

    · Right click on the web site and click Explore

    · Copy the web.config file somewhere else, or to the same location with a different name

    clip_image049
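
    A quick scripted equivalent, in case you prefer PowerShell; the physical path below is only an example, so confirm the real location with the Explore step above.

      # Make a timestamped copy of a web.config before editing it.
      $webConfig = "C:\inetpub\wwwroot\wss\VirtualDirectories\80\web.config"   # example path; confirm via IIS Explore
      Copy-Item $webConfig -Destination ("{0}.{1}.bak" -f $webConfig, (Get-Date -Format "yyyyMMdd_HHmmss"))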

    Where is the STS web.config file?

    · On your WFE open up IIS Manager and expand SharePoint Web Services

    · Right click on SecurityTokenServiceApplication and click Explore

    clip_image051

  • Embedding a PowerPoint Deck on SharePoint 2010

    (Post dedicated to Nuri, Operations Manager for our delivery team in EMEA, and courtesy Sean Earp)

    With the addition of PowerPoint Web App to SharePoint 2010, you can now view and edit PowerPoint presentations directly from within your browser.  This technology has also been made available to consumers on services such as http://office.live.com/ and http://docs.com/.

    image

    In the past, it has been difficult to embed a PowerPoint document within a webpage, requiring workarounds such as saving the presentation as pictures, PDFs, or MHT documents.  If you have a public presentation, it is now extremely easy to embed a PowerPoint deck on any web page, following the steps on the aptly named how to embed a PowerPoint presentation on a web page post.

    Unfortunately, these steps do not work if your installation of PowerPoint Web App is local.  The Share –> Embed option available from http://office.live.com is simply not present on SharePoint 2010.

    image

    So what to do if you want to embed an internal, private, or confidential PowerPoint presentation on an internal SharePoint page?  Fortunately, it is possible to embed a presentation on a webpage without posting the presentation on a broadly available public site.

    Step 1: Ensure that Office Web Apps have been installed and configured on SharePoint 2010.  Those steps are out of scope for this article, but the official documentation should be all you need:  Deploy Office Web Apps (Installed on SharePoint 2010 Products)

    Step 2: Upload the PowerPoint to a document library

    image

    Step 3: Click on the PowerPoint Deck to open it in PowerPoint Web App.  It will have a URL that looks like:

    http://sharepoint/sites/team/_layouts/PowerPoint.aspx?PowerPointView=ReadingView&PresentationId=/sites/team/Shared%20Documents/SharePoint%202010%20100-level%20overview.pptx&Source=http%3A%2F%2Fteam%2Fsites%2Fteam%2FSitePages%2FHome%2Easpx&DefaultItemOpen=1

    image

    Don’t bother writing down that URL; unfortunately, you can’t paste it into a Page Viewer web part without getting an error message.  So… a little magic is needed to get the URL we need to embed our PowerPoint deck on our SharePoint page.

    Step 4: Open the Developer Tools in Internet Explorer (F12), and search for iframe.

    image

    Step 5: Copy the first result into your text editor of choice.  The magic URL you need is the one within the src attribute.

    image

    Step 6: Delete everything except the part inside the quotes.  Before the PowerPointFrame.aspx, add the relative URL to your site collection _layouts directory, and copy the whole URL into your clipboard.

    image
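
    To make that concrete, here is a hedged illustration of assembling the final URL; the site path and query string below are examples based on the URL shown earlier, and your iframe src will differ.

      # The src copied from the iframe is relative, so prefix it with your site collection's /_layouts/ path.
      $iframeSrc = "PowerPointFrame.aspx?PowerPointView=ReadingView&PresentationId=/sites/team/Shared%20Documents/SharePoint%202010%20100-level%20overview.pptx"
      $embedUrl  = "http://sharepoint/sites/team/_layouts/" + $iframeSrc
      $embedUrl   # paste this into the Page Viewer web part's Link field (Step 8)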

    Step 7: Go to the SharePoint page you want to embed the PowerPoint into.  Add a Page Viewer Web Part to the page and open the tool pane for the web part.

    image

    Step 8: In the Page Viewer tool pane, paste in the URL, and optionally enter a height/width and chrome state for the PowerPoint deck.

    image

    Step 9: Hit “OK” and be awed at how awesome it looks to have a fully functional PowerPoint deck embedded on your page.  You can view the deck full screen by clicking “Start Slide Show”, change slides, view notes, click links, or click the “popout” button to have the deck open up in a popout window.

    image

    Super-secret-squirrel trick: If you want the deck to default to a slide other than the cover slide, click through to the slide you want, and then click the popout button in the top right of the PowerPoint Web App.  The deck will be open to that slide in its own window. 

    Use the same Developer Tools trick from step 4, but this time search for &SlideId.  You will see the URL has added two parameters: a slide ID and popout=1 (the URL will end with something like &SlideId=590&popout=1).  You can guess what popout=1 does, and the SlideId is some sort of internal reference to the slide (I have no idea how it is generated, but it doesn’t matter; my web app-fu will work just the same). Just copy the &SlideId=somenumber and paste it to the end of your URL in the Page Viewer web part, and now your web page will display the PowerPoint deck starting on whatever slide you specified!

    Additional Resources

    Office Web Apps technical library

  • SharePoint and Exchange Calendar together

    (post courtesy Anand Nigam)

    One of the cool things in SharePoint 2010 is the ability to show the Exchange calendar on a SharePoint site, side by side with a SharePoint calendar. This is called Calendar Overlay.

    This post will walk through how to configure this.

    Step 1 (prerequisite)

    1. I have a SharePoint Site http://fabrikam which looks like this

    clip_image002

    2. I also have a calendar “MySharePointCalender”, with a few calendar events entered.

    clip_image004

    3. I have my Exchange Calendar in Outlook, with a few meeting/events there as well.

    clip_image006

    4. What we want is to see events from my Exchange calendar show up on the SharePoint calendar.

    Step 2 (The actual process)

    1. Open the SharePoint calendar –> Calendar Tools –> Calendar Overlay –> New Calendar

    clip_image008

    clip_image011

    Fill in the following:

    • Name: Give a name to this calendar
    • Type: Select Exchange
    • Outlook Web Access URL: the OWA URL of your organization.
    • Exchange Web Service URL: this can be determined as follows:

    If your OWA URL is https://exch.contoso.com/owa, then the Exchange web Service URL would be https://exch.contoso.com/ews/exchange.asmx

    (in other words, from the OWA URL , remove the trailing “owa” and add “ews/exchange.asmx”)
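
    If you want to script that transformation, a one-line sketch (example value from this post):

      # Derive the EWS URL from the OWA URL as described above.
      $owaUrl = "https://exch.contoso.com/owa"
      $ewsUrl = ($owaUrl -replace '/owa/?$', '') + "/ews/exchange.asmx"
      $ewsUrl   # https://exch.contoso.com/ews/exchange.asmx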

    clip_image014

    Step 3 (The awaiting Error and the fix)

    If you have not previously configured SharePoint to trust your Exchange server, you will receive the following error message:

    Could not establish trust relationship for the SSL/TLS secure channel with authority ‘dc’. (GUID)

    clip_image016

    Here is the fix

    1. Get the CA Root Certificate for your domain

    (Just a note, there are many ways to get the certificate, I’m taking the one that is less prone to error)

    a. Go to the Server where you have the Certificate Authority installed. Open IIS and select the Server Certificates component.

    clip_image018

    Double click on Server Certificates

    Locate the root certificate of the CA from the list; here is the one that I have.

    clip_image021

    (To double-check that this is the root certificate, open the certificate and look at the certification path. It should have just one entry (the root), which is the name of the Certification Authority in your domain.) Below is the image of my root certificate.

    clip_image023

    b. Now that we have located the certificate, open it, go to the Details tab, and click Copy to File

    clip_image026 clip_image028

    clip_image031 clip_image033

     clip_image036 clip_image038

    And now we have the Certificate exported to a file

    clip_image040

    Copy this certificate to the SharePoint Server, and follow the steps below:

    a. Open Central Administration > Security > Manage Trust

    clip_image042

    b. Click New, provide a name (I use RootCA), navigate to the RootCA.cer file you exported in the previous step, and click OK

    clip_image044
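
    The same trust can also be established from the SharePoint 2010 Management Shell; a minimal sketch, assuming the exported file sits at C:\Temp\RootCA.cer:

      # Add the exported root CA certificate to SharePoint's trust store.
      Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
      $cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2("C:\Temp\RootCA.cer")
      New-SPTrustedRootAuthority -Name "RootCA" -Certificate $cert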

    Now refresh the same calendar and confirm that you can see the Exchange calendar event for the logged in user

    clip_image046

    Step 4 (Enhance the default behavior)

    Although we can now see the Exchange calendar, we can only see the free/busy status, and not the actual details of the events. It would be good if we could have the details displayed here too. To display details:

    1. Open Outlook> File > Options>

    clip_image048

    2. Go to the Calendar Section > click Free/Busy Options

    clip_image050

    3. Select one of the options below; I have selected Full details. Click Apply and OK and exit out of Outlook.  Now refresh the SharePoint calendar and see the difference.

    clip_image052

    clip_image054

    Additional reading:

    Note: The calendar overlay is per user, meaning it will only show calendar items for the currently logged-in user.

  • Migrating File Shares to SharePoint Online

    (Post courtesy Partner Solution Consultant Andre Kieft)

    It has been a while since I created a blog post, but recently I received a lot of questions and requests for advice on how to migrate file shares to SharePoint and use SkyDrive Pro (SDP). So I figured I would create a blog post on the things you need to consider as a Small and Medium Business (SMB) partner when you are planning to migrate file share content into SharePoint and want to make use of SDP for synchronizing the SharePoint content offline.

    Note that these steps are valid for both SharePoint 2013 on-premises (on-prem) and SharePoint Online (SPO).

    Step 1 – Analyze your File Shares

    As a first step, try to understand the data that resides on the file shares. Ask yourself the following questions:

    • What is the total size of the file share data that the customer wants to migrate?
    • How many files are there in total?
    • What are the largest file sizes?
    • How deep are the folder structures nested?
    • Is there any content that is not being used anymore?
    • What file types are there?

    Let me try to explain why you should ask yourself these questions.

    Total Size

    If the total size of the file shares is more than the storage capacity that you have in SharePoint, you need to buy additional storage (SPO) or increase your disk capacity (on-prem). To determine how much storage you will have in SPO, please check the total available tenant storage in the tables in this article. Another issue that may arise is that you reach the capacity limit per site collection. For SPO that is 100 Gigabyte; for on-premises the recommended size per site collection is around 200 Gigabyte. This would mean that the content database is also around 200 Gigabyte, which is the recommended maximum size. Though you can stretch this number in on-prem, it is not recommended.

    So, what should I do when my customer has more than 100 Gigabyte?

    • Try to divide the file share content over multiple site collections when it concerns content which needs to be shared with others.
    • If certain content is just for personal use, try to migrate that specific content into the personal site of the user.

    How Many Files

    The total amount of files on the file shares is important, as there are some limits in both SharePoint and SDP that can result in an unusable state of the library or list within SharePoint, and you might also end up with missing files when using the SDP client.

    First, in SPO we have a fixed limit of 5000 items per view, folder or query. Reasoning behind this 5000 limit boils all the way down to how SQL works under the hood. If you would like to know more about it, please read this article. In on-prem there is a way to boost this up, but it is not something we recommend as the performance can significantly decrease when you increase this limit.

    Secondly, for SDP there is also a limit of 5000 items for synchronizing team sites and 20000 for synchronizing personal sites. This means that if you have a document library that contains more than 5000 items, the rest of the items will not be synchronized locally.

    There is also a limit of 5 million items within a document library, but I guess that most customers in SMB won’t reach that limit very easily.

    So, what should I do if my data that I want to migrate to a document library contains more than 5000 items in one folder?

    • Try to divide that amount over multiple subfolders or create additional views that will limit the amount of documents displayed.

    But wait! If I already have 5000 items in one folder, doesn’t that mean that the rest of the documents won’t get synchronized when I use SDP?

    Yes, that is correct. So if you would like to use SDP to synchronize documents offline, make sure that the total amount of documents per library in a team site does not exceed 5000 documents.

    So, how do I fix that?

    • Look at the folder structure of the file share content and see if you can divide that data across multiple sites and/or libraries. So if there is a folder marketing for example, it might make more sense to migrate that data into a separate site anyway, as this department probably wants to store additional information besides just documents (e.g. calendar, general info about the marketing team, site mailbox etc). An additional benefit of spreading the data over multiple sites/libraries is that it will give the SDP users more granularity about what data they can take offline using SDP. If you would migrate everything into one big document library (not recommended), it would mean that all users will need to synchronize everything which can have a severe impact on your network bandwidth.

    Largest File Sizes

    Another limit that exists in SPO and on-prem is the maximum file size. For both the maximum size per file is 2 Gigabyte. In on-prem the default is 250 MB, but can be increased to a maximum of 2 Gigabyte.

    So, what if I have files that exceed this size?

    • Well, it won’t fit in SharePoint, so you can’t migrate these files. See what type of files they are and determine what they are used for in the organization. Examples could be software distribution images, large media files, training courses or other materials. If these are still being used and not highly confidential, it is not a bad thing to keep them on alternative storage like a SAN, NAS or DVDs. If it concerns data that just needs to be kept for legal reasons and doesn’t need to be retrieved instantly, you might just put it on DVD or an external hard drive and store it in a safe, for example.

    Folder Structures

    Another important aspect to look at on your file shares is the depth of nested folders and the length of file names. The recommended total length of a URL in SharePoint is around 260 characters. You would think that 260 characters is pretty lengthy, but remember that URLs in SharePoint often have encoding applied to them, which takes up additional space. E.g. a space is one character, but URL-encoded it becomes %20, which takes up three characters. The problem is that you can run into issues when the URL becomes too long. More details about the exact limits can be found here, but as a best practice try to keep the URL length of a document under 260 characters.

    So, what if I have files that will have more than 260 characters in total URL length?

    • Make sure you keep your site URLs short (the site title name can be long though). E.g. don’t call the URL Human Resources, but call it HR. If you land on the site, you would still see the full name Human Resources as Site Title and URL are separate things in SharePoint.
    • Shorten the document name (e.g. strip off …v.1.2, or …modified by Andre), as SharePoint has versioning built in. More information about versioning can be found here.

    Idle Content

    Migrating file shares into SharePoint is often also a good moment to clean up some of the information that the organization has been collecting over the years. If you find there is a lot of content which has not been accessed for a couple of years, what would be the point of migrating that data to SharePoint?

    So, what should I do when I come across such content?

    • Discuss this with the customer and determine if it is really necessary to keep this data.
    • If the data cannot be purged, you might consider storing it on a DVD or external hard drive and keep it in a safe.
    • If the content has multiple versions, such as proposal 1.0.docx, proposal 1.1.docx, proposal final.docx, proposal modified by Andre.docx, you might consider just moving the latest version instead of migrating them all. This manual process might be time consuming, but can save you lots of storage space in SharePoint. Versioning is also built into the SharePoint system and is optimized for storing multiple versions of the same document. For example, SharePoint only stores the delta of the next version, saving more storage space that way. Note that this functionality is only available in SharePoint on-prem.

    Types of Files

    Determine what kind of files the customer has. Are they mainly Office documents? If so, then SharePoint is the best place to store such content. However, if you come across developer code for example, it is not a good idea to move that into SharePoint. There are also other file extensions that are not allowed in SPO and/or on-prem. A complete list of blocked file types for both SPO and on-prem can be found here.

    So, what if I come across such file extensions?

    • Well, you can’t move them into SharePoint, so you should ask yourself: do I still need these files? And if so, is there an alternative storage facility, such as a NAS, I can store these files on? If it concerns developer code, you might want to store that code on a Team Foundation Server instead.

    Tools for analyzing and fixing file share data

    In order to determine if you have large files or exceed the 5000 limit for example, you need to have some kind of tooling. There are a couple of approaches here.

    • First off, there is a PowerShell script that has been pimped up by a German colleague, Hans Brender, which checks for blocked file types, bad characters in files and folders, and finally the maximum URL length. The script will even fix invalid characters and file extensions for you. It is a great script, but requires you to have some knowledge of PowerShell. Another alternative I was pointed at is a tool called SharePrep. This tool does a scan for URL length and invalid characters.
    • Secondly there are other 3rd party tools that can do a scan of your file share content such as Treesize. However such tools do not necessarily check for the SharePoint limitations we talked about in the earlier paragraphs, but at least they will give you a lot more insight about the size of the file share content.
    • Finally there are actual 3rd party migration tools that will move the file share content into SharePoint, but will check for invalid characters, extensions and URL length upfront. We will dig into these tools in Step 2 – Migrating your data.
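
    If you just want a quick first pass before reaching for any of these tools, a minimal PowerShell sketch like the one below (assuming a UNC path and PowerShell 3.0 or later) will surface the limits discussed above; it is a rough scan, not a replacement for a proper migration tool.

      $share = "\\server\share"                      # example UNC path
      $files = Get-ChildItem -Path $share -Recurse -File -ErrorAction SilentlyContinue

      # Files larger than the 2 Gigabyte SharePoint maximum
      $files | Where-Object { $_.Length -gt 2GB } | Select-Object FullName, Length

      # Paths likely to exceed the ~260 character URL guidance
      $files | Where-Object { $_.FullName.Length -gt 260 } | Select-Object FullName

      # Folders holding more than 5000 items (the view / SkyDrive Pro sync threshold)
      Get-ChildItem -Path $share -Recurse -Directory -ErrorAction SilentlyContinue |
          Where-Object { (Get-ChildItem $_.FullName -File -ErrorAction SilentlyContinue).Count -gt 5000 } |
          Select-Object FullName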

    Step 2 – Migrating your data

    So, now that we have analyzed our file share content, it is time to move them into SharePoint. There are a couple of approaches here.

    Open with Explorer

    If you are in a document library, you can open up the library in Windows Explorer. That way you can just copy and paste the files into SharePoint.

    image

    But, there are some drawbacks using this scenario. First of all, I’ve seen lots of issues trying to open up the library in the Windows Explorer. Secondly, the technology that is used for copying the data into SharePoint is not very reliable, so keep that in mind when copying larger chunks of data. Finally there is also drag & drop you can use, but this is only limited to files (no folders) and only does a maximum of 100 files per drag. So this would mean if you have 1000 files, you need to drag them 10 times in 10 chunks. More information can be found in this article. Checking for invalid characters, extensions and URL length upfront are also not addressed when using the Open with Explorer method.

    Pros: Free, easy to use, works fine for smaller amounts of data

    Cons: Not always reliable, no metadata preservations, no detection upfront for things like invalid characters, file type restrictions, path lengths etc.

    SkyDrive Pro

    You could also use SDP to upload the data into a library. This is fine as long as you don’t sync more than 5000 items per library. Remember, though, that SDP is not a migration tool but a sync tool, so it is not optimized for copying large chunks of data into SharePoint. Things like character and file type restrictions, path length etc. are on the SDP team’s list to address, but they are currently not handled.

    The main drawback of using either the Open with Explorer option or SDP is that these tools don’t preserve the metadata of the files and folders that are on the file shares. By this I mean things like the modified date or owner field are not migrated into SharePoint. The owner will become the user that is copying the data, and the modified date will be the timestamp of when the copy operation was executed. So if this metadata on the file shares is important, don’t use any of the methods mentioned earlier, but use one of the third party tools below.

    Pros: Free, easy to use, works fine for smaller amounts of data (max 5000 per team site library or 20000 per personal site)

    Cons: No metadata preservations, no detection upfront for things like invalid characters, file type restrictions, path lengths etc.

    3rd party tools

    Here are some of the 3rd party tools that will provide additional detection, fixing and migration capabilities that we mentioned earlier:

    (Thx to Raoul for pointing me to additional tools)

    The list above is in random order, where some have a focus on SMB, while other more focused on the enterprise segment. We can’t speak out any preference for one tool or the other, but most of the tools will have a free trial version available, so you can try them out yourself.

    Summary

    So, when should I use what approach?

    Here is a short summary of capabilities:

    • Amount of data – Open in Explorer: relatively small; SkyDrive Pro: no more than 5000 items per library; 3rd party: larger data sets
    • Invalid character detection – Open in Explorer: no; SkyDrive Pro: no; 3rd party: mostly yes¹
    • URL length detection – Open in Explorer: no; SkyDrive Pro: no; 3rd party: mostly yes¹
    • Metadata preservation – Open in Explorer: no; SkyDrive Pro: no; 3rd party: mostly yes¹
    • Blocked file types detection – Open in Explorer: no; SkyDrive Pro: no; 3rd party: mostly yes¹

    ¹ This depends on the capabilities of the 3rd party tool.

    Troubleshooting

    SDP gives me issues when synchronizing data
    Please check that you have the latest version of SDP installed. There have been stability issues in earlier released builds of the tool, but most of the issues should be fixed by now. You can check whether you are running the latest version by opening Word -> File -> Account and clicking Update Options -> View Updates. If the version number you are running is lower than the latest available one, click the Disable Updates button (click Yes if prompted), then click Enable Updates (click Yes if prompted). This will force downloading the latest version of Office and thus the latest version of the SDP tool.

    image

    If you are running the stand-alone version of SDP, make sure you have downloaded the latest version from here.

    Why is the upload taking so long?
    This really depends on a lot of things. It can depend on:

    • The method or tool that is used to upload the data
    • The available bandwidth for uploading the data. Tips:
      • Check your upload speed at http://www.speedtest.net and do a test for your nearest Office 365 data center. This will give you an indication of the maximum upload speed.
      • Often companies have less available upload bandwidth than people at home. If you have the chance, uploading from a home location might be faster.
      • Schedule the upload at times when there is much more bandwidth for uploading the data (usually at night)
      • Test your upload speed upfront by uploading maybe 1% of the data. Multiply the time it takes by 100 and you have a rough estimate of the total upload time (a quick back-of-the-envelope calculation is sketched after this list).
    • The computers used for uploading the data. A slow laptop can become a bottleneck while uploading the data.
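
    Here is the back-of-the-envelope calculation referred to above, with made-up numbers for illustration:

      # Rough upload time estimate from total size and measured upload speed.
      $totalGB    = 50    # total amount of file share data to migrate (example)
      $uploadMbps = 10    # measured upload speed in megabits per second (example, e.g. from speedtest.net)
      $hours = ($totalGB * 8 * 1024) / $uploadMbps / 3600
      "{0:N1} hours" -f $hours   # about 11.4 hours for 50 GB at a sustained 10 Mbps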

    If you feel that there are things missing here, please let me know and I’ll try to add them to this blog post.

  • SharePoint 2010–Returning Document ID in Search Results

    (Post courtesy Sean Earp, with research and XSLT authoring by Alaa Mostafa)

    One of my favorite features of SharePoint 2010 is the Document ID.

    As discussed in the MSDN article Developing with Document Management Features in SharePoint Server 2010 (ECM):

    A document ID is a unique identifier for a document or document set and a static URL that opens the document or document set associated with the document ID, regardless of the location of the document. Document IDs provide:

    • A way to reference items such as documents and document sets in SharePoint Server 2010 that is less fragile than using URLs. URLs break if the location of the item changes. In place of the URL, the document ID feature creates a static URL for each content item with a document ID assigned to it.

    • More flexible support for moving documents or document sets at different points in the document life cycle. For example, if you create a document on a MySite or Workspace page and then publish it on a team site, the document ID persists and travels with the document, circumventing the broken URL problem.

    • A document ID generator that assigns a unique document ID to items. You can customize the format of the IDs that the service generates. By using the document management API, you can write and use custom document ID providers.

    image

    When browsing a document library with this feature enabled, you can display the Document ID column, and you will be able to see the Document ID for a given document.  Easy enough, and useful if you need to reference this Document ID in another system.

    This works great when you can browse a document library, perhaps using the new metadata navigation and filtering capabilities of SharePoint 2010, but if your document library is holding thousands and thousands of documents, users may resort to using search to find the document they are looking for.  Unfortunately, SharePoint search does not display the Document ID in the search results by default.

    image

    Fortunately, SharePoint indexes Document IDs as a managed property by default, which means that with a little magic, we can add the Document ID into the search results.

    In a nutshell, SharePoint retrieves the search results as XML, and uses XSLT to transform the XML into the pretty results you see on the search results page.  Same basic concept as HTML (which has your content) and CSS (which styles that content).  We just need to tell SharePoint to return the managed property with our Document ID, and then update the XSLT to display that managed property in the search results. 

    It is not as hard as it sounds.

    Assumptions: I assume you have enabled the Document ID feature on the site collection, all documents have been assigned Document IDs, and a full crawl has been done of the site.  I also assume you are a site collection administrator with full permissions to the site collection.

    From your search results page in the site collection (wherever you have it), click on Page –> Edit (or Site Actions –> Edit Page).  You will see a ton of zones and web parts (such as the Refinement Panel, Search Statistics, Search Box, etc.).  You can customize the heck out of the search results page and move things all over the place.

    image

    For now, however, we are just going to modify the Search Core Results web part that contains…er… the core search results.  How intuitive!

    Edit the Search Core Results web part, and expand the section that says “Display Properties”.  Uncheck the box that says “Use Location Visualization”.  I have no idea why this option is named as it is… this is really the option that lets you edit the fetched properties and XSL.

    image

    As a quick aside… although you can edit the fetched properties and XSL directly from the web page properties, the experience is horrible.  I strongly recommend using an XML editor like Visual Studio or NotePad++

    In the Fetched Properties section you will see a number of columns that look like the following.  These are the managed properties that are returned by SharePoint Search:

    <Column Name="PictureHeight"/>  <Column Name="PictureWidth"/>

    Somewhere before the closing </Columns> tag, add:

    <Column Name="spdocid"/>

    (Note: if you are using SharePoint search instead of FAST search, replace all instances of “spdocid” with “DocId”)

    This will cause SharePoint to return the Document ID in the search results XML.  Now let’s modify the XSL so that we display the ID in the search results.  Click on the “XSL Editor…” and copy the XSL into your XML editor of choice (or, if you like pain, just edit the 938-line long XSL sheet in a browser that does no validation or color coding.  Your choice.)

    At the top of the XSL is a list of parameter names.  Add in the following parameter (order does not matter)

    <xsl:param name="spdocid" />

    image

    Next, search for “DisplayAuthors”.  After the DisplayAuthors call template, we are going to add a call to a new template named “DisplayID” to… well, display the ID. The call is wrapped in a conditional to ensure that if there is NOT a document ID, it does not attempt to display a null value.

    Add the following lines:

      <xsl:if test="string-length($hasViewInBrowser) &gt; 0">
        <xsl:call-template name="DisplayID">
          <xsl:with-param name="spdocid" select="spdocid" />
          <xsl:with-param name="browserlink" select="serverredirectedurl" />
          <xsl:with-param name="currentId" select="$currentId" />
        </xsl:call-template>
      </xsl:if>

    image

    Search for “DisplayString” and we will add the “DisplayID” template itself, which displays the ID (along with a URL that links to the document); we’ll put brackets around the Document ID so it stands out visually.  Add the following:

      <xsl:template name="DisplayID">
        <xsl:param name="spdocid" />
        <xsl:param name="currentId" />
        <xsl:param name="browserlink" />
        <xsl:if test="string-length($spdocid) &gt; 0">
          <xsl:text xml:space="default"> [ </xsl:text>
          <a href="{concat($browserlink, $ViewInBrowserReturnUrl)}" id="{concat($currentId,'_VBlink')}">
            <xsl:value-of select="$spdocid" />
          </a>
          <xsl:text xml:space="default"> ] </xsl:text>
        </xsl:if>
      </xsl:template>
    image

    We’re almost done!  Select all your XSL, copy it, and paste it back into your SharePoint window, hit Save –> Okay –> Check In –> Publish

    Voila!  The Document ID now shows up in the search results with a clickable link back to the source document.

    image

    Random troubleshooting tip:  If you get the message “Property doesn't exist or is used in a manner inconsistent with schema settings”, this typically means one of two things:

    1. You created a custom managed property and have not yet run a full crawl, so the property does not yet exist in the index (the Document ID property is mapped out of the box, so this does not apply here)
    2. You are using the wrong managed property.  FAST search uses “spdocid” while SharePoint search uses “DocId”

    image

    image

    Attachments: I have attached a copy of the XSL I used for the above post to save you time copying and pasting into the right sections.  It works for me with SharePoint search, but use on a test server first and at your own risk.

  • Performing an Active Directory Health Check Before Upgrading

    (Post courtesy Bonoshri Sarkar)

    Hi everyone, this is Bonoshri Sarkar here. I have worked for Microsoft as a Partner Technical Consultant specializing in Directory Services for the past two years, providing end-to-end consulting to enable partners to design, position, sell and deploy Microsoft platforms for their customers. In my earlier role, I worked for more than 4 years on the Microsoft Support team focusing on Microsoft Directory Services.

    Since I have a great affinity for Directory Services, I thought it would be a great idea to pen down my thoughts and experience on ensuring a smooth Active Directory Upgrade.

    For any kind of upgrade, migration or transition to go smoothly, and to have a healthy environment later on, you need to spend a fair amount of time planning and making sure that the source (present) environment is in a healthy state. Two driving factors for any upgrade or transition are the need to utilize the new features that the new version of the product has to offer, and the need to ease the complexities and issues of the current environment. However, most IT Pros do not take adequate steps to check the health of their existing Active Directory environment. In this post, I would like to address some of the key steps that an AD administrator must perform prior to an upgrade or transition.

    In my experience of assisting customers and partners with different transitions, most of the issues pertain to the source domain or the source domain controllers, so I will discuss a few important things which should be considered mandatory before going for any kind of upgrade, migration or transition.

    Performing an Active Directory Health Check

    The health check should be done in 2 phases.

    1. Planning Phase

    2. Deploy Phase (just before implementing the upgrade, transition or migration)

    In the first phase we should identify which services and roles are running on the machine that we are planning to upgrade, and rule out things that we do not want to move to our new box.

    Putting emphasis on diagnosing AD issues, we can use dcdiag to ensure a healthier Active Directory. I know we have been using dcdiag for many years and we look for failure messages in the output, but apart from the failure messages, we can also consider issues such as those highlighted in yellow below:

    clip_image001

    clip_image003

    clip_image004

    Notice that the first part of the dcdiag output says “failed test replication”, which implies that there are issues with Active Directory replication on this domain controller.

    The second message tells us that there are issues with netlogon and sysvol, which are the default logon shares; both errors can be interdependent or could have completely different causes.

    In this scenario we need to fix AD replication first, or dig into it more to find what is causing these errors. You can use a few more commands to check AD replication, like repadmin /syncall /eAP. In the case of a huge enterprise, you can also use Replmon (Windows Server 2003).

    The third message tells us that the important services are running. We need to be sure that the above services are started to ensure a smooth transition.

    If we don’t get enough details from the dcdiag results, check the Event Viewer; if you do not see anything, restart the FRS service and then check the Event Viewer for Event ID 13516.

    clip_image005

    Apart from dcdiag you can also use Netdiag to check the network status and get detailed information.
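
    For reference, here is a hedged cheat sheet of dcdiag/repadmin commands commonly used at this stage; run them from an elevated prompt on a domain controller.

      # Overall DC health: verbose, all tests, every DC in the enterprise
      dcdiag /v /c /e

      # Summary of replication status for every DC
      repadmin /replsummary

      # Detailed inbound replication status for this DC
      repadmin /showrepl

      # Force a synchronization of all partitions across the enterprise (as mentioned above)
      repadmin /syncall /eAP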

    In addition to this, make sure the NIC drivers are updated on the old server.

    Instead of disabling the hardware or software based firewall between the servers (old and new), ensure that you make the appropriate exceptions and port configurations to ensure proper communication between the directory servers (see Active Directory and Active Directory Domain Services Port Requirements).

    Any third party legacy applications should be tested in a lab environment to make sure that they are compatible with the new version of the server OS and Active Directory.

    clip_image007

    We also have different versions of the Exchange BPA (Best Practices Analyzer) tool, depending on the version of Exchange, to check Exchange integrity and Exchange-specific permissions (you can select the Permission check to gather that information).

    Last but not least, read the migration or transition documentation (http://technet.microsoft.com/en-us/library/cc731188(WS.10).aspx) to make sure the server meets all the minimum requirements.

    Once we are sure that the servers are in a healthy state, do not forget to take a full backup and a system state backup using a supported backup system, as documented in the TechNet article below:

    http://technet.microsoft.com/en-us/library/cc731188(WS.10).aspx

    All these stitches in time would definitely save you nine hours’ worth of troubleshooting. It’s up to you to decide, would you like to troubleshoot or enjoy your Fries with Coke?

    Additional References

  • Network Monitoring with System Center Operations Manager 2012

    (Post courtesy Nikunj Kansara)

    This post describes the network monitoring capabilities of the System Center Operations Manager 2012 Beta.

    In my opinion, network monitoring is the most exciting feature of the upcoming Operations Manager 2012 release. This article will give you an overview of network monitoring: how to discover network devices, how to configure network monitoring rules and object discoveries, and a sneak peek at the reports and the network dashboard that come out of network management.

    I have split up the blog into four different topics:

    A. How to discover the network devices:

    Discovery is the process of identifying network devices to be monitored.

    Operations Manager 2012 can monitor devices that use SNMP v1, v2c and v3.

    The benefit we get by configuring network monitoring is that if a critical server seems to be down, we will see an alert that the switch/router port connected to that critical server is down. We can also see the network topology diagram, called the Network Vicinity view.

    Operations Manager 2012 provides the following monitoring for discovered network devices:

    • We can view connection health between the network devices and between the server and the network device
    • We can view the VLAN health based on health state of switches in VLAN
    • We can view HSRP group health based on health state of individual HSRP end points
    • We can view Port/Interface monitoring, like up/down status and inbound/outbound traffic volume
    • We can view Port/Interface utilization and packets dropped or broadcast
    • We can view Processor utilization for some certified devices
    • We can view Memory utilization for some certified devices

    Network device discovery is performed by discovery rules that you create.

    Below are steps for creating the discovery rule:

    1. Open the Operations Console

    2. Go to the Administration Workspace, right click Administration and then click Discovery

    image
    Figure 1

    3. The What would you like to manage? page shown in Figure 1 will open up; select the Network Devices option and click Next.

    4. The General page in Figure 2 appears; we need to provide the name of the discovery rule, select the management server from the drop-down, and then click Next.

    Note:

    • We can create one discovery rule per management server or gateway server.
    • If we are creating a second discovery rule then we will only see the management servers that don’t have any discovery rule associated with them.
    • Also, we might want to plan ahead and strategically place the management servers or gateway servers so they can access the network devices that we would like to discover.

    image
    Figure 2

    5. On the Discovery Method page in Figure 3, we need to select the method used to discover the network devices. In this example we select Explicit discovery, and then click Next.

    Note:

    • Differences between Explicit discovery and Recursive Discovery:
      • Explicit discovery – An explicit discovery rule will try to discover the devices that you explicitly specify in the wizard by IP address or FQDN. It will only monitor those devices that it can successfully access. The rule will try to access the device by using ICMP, SNMP, or both, depending on the configuration of the rule.
      • Recursive discovery – A recursive discovery rule will attempt to discover those devices that you explicitly specify in the wizard by IP address, as well as other network devices that are connected to the specified SNMP v1 or v2 device and that the specified SNMP v1 or v2 device knows about through the device’s Address Resolution Protocol (ARP) table, its IP address table, or the topology Management Information Base (MIB).

    image
    Figure 3

    6. On the Default Account Page in Figure 4, click on the Create default Run As Account as we need to create an account which will be used to discover the network devices.

    image
    Figure 4

    7. On the Introduction page of Create Run As account Wizard in Figure 5, click next

    image
    Figure 5

    8. On the General Properties page of the Create Run As account Wizard in Figure 6; enter the Display name of the Run As Account and click next.

    image
    Figure 6

    9. On the Credentials page on the Create Run As account Wizard in Figure 7, enter the SNMP community string and click on create.

    Note:
    SNMP Community Strings

    We can configure Read Only [RO] and Read Write [RW] SNMP community strings. With the RO community string we have read access to the network device. For Operations Manager 2012, we need only the RO SNMP community string to access the device, so it should be easy to convince the network guys ;-)

    image
    Figure 7

    10. On the Default Account Page in Figure 8, select the created Run As Account and click on Next.

    image
    Figure 8

    11. On the Devices Page, click on Add Button

    image
    Figure 9

    12. On the Add a Device window in Figure 10, enter the IP address or name of the device we want to monitor; select the access mode as ICMP and SNMP (you can also select ICMP only or SNMP only); select the SNMP version as v1 or v2; select the created Run As account, and then click OK.

    Note:

    • We use ICMP only in the scenario where we need to know the availability of the gateway router from the ISP to verify if the interface is up or down.
    • We use SNMP only in the scenario where we want to monitor a Firewall on which ICMP is blocked.
    • If we specify that a device uses both ICMP and SNMP, Operations Manager must be able to contact the device by using both methods or discovery will fail.
    • If you specify ICMP as the only protocol to use, discovery is limited to the specified device and monitoring is limited to whether the device is online or offline.

    image
    Figure 10
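
    Following up on the note above about ICMP and SNMP access: before running discovery, it can help to confirm basic ICMP reachability from the management server to the device. A minimal PowerShell sketch (the IP address 10.0.0.1 is a placeholder for your own device):

      # Quick ICMP reachability check, run from the management server that owns the discovery rule
      Test-Connection -ComputerName 10.0.0.1 -Count 4

      # Note: this only proves ICMP; SNMP access still depends on the community string and any firewalls in between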

    13. Now Click Next on the Devices Page as in Figure 11.

    image
    Figure 11

    14. On the Schedule discovery Page in Figure 12, Select the discovery schedule and click Next.

    Note:

    You may also select to run the discovery manually.

    image
    Figure 12

    15. Click Create on the Summary page

    image
    Figure 13

    16. Click Yes on the warning box, as in Figure 14. We need to distribute the created Run As account to the management server used for discovery and to the management server resource pool used for monitoring, both of which were selected in the General properties [Figure 2].

    image
    Figure 14

    17. Click Close on the Completion page.

    image
    Figure 15

    18. Now, in the Administration workspace, go to the Discovery Rules node under the Network Management node. You will be able to see the discovery rule that has been created. Click Run if you want to run the discovery manually. See Figure 16.

    image
    Figure 16

    19. See Figure 17 for the Task Status window that appears when we run the discovery manually. The Success status means only that the discovery was submitted successfully, not that the devices have been discovered. Click Close.

    image
    Figure 17

    20. The discovery rule will show a Probing status once it has actually found the device. See Figure 18.

    image
    Figure 18

    21. The discovery rule then starts processing the discovered components, as in Figure 19.

    image
    Figure 19

    22. The status of the discovery rule will go to Pending, and the rule will run again as per the discovery schedule that we selected in the wizard. If we had selected the manual discovery option in the wizard, the status would go to Idle instead. See Figure 20.

    image
    Figure 20

    23. Go to Network Devices under Network Management to see the discovered device. See Figure 21.

    image
    Figure 21

    24. Double click the Network device to view the properties page and more information about that discovered device. See Figure 22.

    image
    Figure 22
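
    As a cross-check, the discovered devices can also be listed from the Operations Manager Shell. A minimal sketch; the class name System.NetworkManagement.Node and the management server name MS01 are assumptions to adjust for your environment:

      # List discovered network devices and their current health state
      Import-Module OperationsManager
      New-SCOMManagementGroupConnection -ComputerName "MS01"   # hypothetical management server

      $nodeClass = Get-SCOMClass -Name "System.NetworkManagement.Node"
      Get-SCOMClassInstance -Class $nodeClass | Select-Object DisplayName, HealthState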

    B. Network Monitoring:

    We will now look at some of the views that are relevant to the network device that we discovered in the previous step.

    1. Go to Monitoring Workspace; double click the Network Monitoring Folder to see the Network views. See Figure 23.

    image
    Figure 23

    2. Select the Network Devices view to see the Network Devices being monitored.

    image
    Figure 24

    3. Click on Health Explorer to see the subcomponents of the switch. See Figures 25 & 26.

    image
    Figure 25

    image
    Figure 26

    4. Click on the VLANs view to see the VLANs in which the switch is participating. See Figure 27

    image
    Figure 27

    5. Click on the ICMP Ping Response performance view or the Processor Utilization performance view to see the corresponding performance graphs. See Figures 28 & 29.

    image
    Figure 28

    image
    Figure 29

    C. Dashboard:

    1. To see the connections between the connected nodes and the network device, click on the Network Vicinity view. See figure 30.

    image
    Figure 30

    2. Click the Show Computers check box to see the connections. See Figure 31.

    Note:

    By default we can see connections that are one hop away from the network device.

    We can select a maximum of 5 hops. In environments with a large number of network devices, selecting five hops can take a while for Operations Manager 2012 to render, and the resulting view might not be useful to you.

    image
    Figure 31

    3. Now, coming back to the Network Devices view in the Monitoring workspace, click on the Network Node Dashboard. We will be able to view all the information related to the network device in just one window. See Figures 32, 33, 34 and 35.

    image
    Figure 32

    image
    Figure 33

    image
    Figure 34

    image
    Figure 35

    D. Reporting: [See Figure 36]

    Processor Utilization Report: It displays the processor utilization of a particular network device in a specified period of time.

    Memory Utilization Report: It displays the percentage of free memory on a particular network device in a specified period of time.

    Interface Traffic Volume Report: It displays the rate of inbound and outbound traffic that goes through the selected port or interface in a specified period of time.

    Interface Error Packet Analysis Report: It displays the percentage of error packets or discarded packets, both inbound and outbound, for the selected port or interface.

    Interface Packet Analysis Report: It displays the types of packets (unicast or non-unicast) that traverse the selected port or interface.

    image
    Figure 36

    Additional Resources

  • Sending e-mails from Microsoft Dynamics CRM

    (post courtesy Sarkis Derbedrossian)

    I often meet Microsoft CRM users who don’t know how sending e-mail works within Microsoft Dynamics CRM. Most users think that when they create an e-mail in CRM and hit the Send button, the e-mail is sent automatically. Neither Outlook nor CRM can send e-mails without a mail system, e.g., an Exchange server. Below you will learn how e-mail within CRM works with and without Outlook.

    E-mail in relation to CRM

    Once you’ve created an e-mail activity in MS CRM and clicked the Send button, the mail is handled differently depending on the e-mail settings of each user in MS CRM.

    E-mail can be handled through Outlook or directly through CRM, but neither Outlook nor MS CRM performs the physical delivery of the e-mail. This is done by a mail server (Microsoft Exchange Server or another mail system).

    Sending an E-mail in MS CRM

    Do not jump to the conclusion that MS CRM can neither receive nor send e-mail; it simply requires an e-mail system to do so.

    When you send e-mail from MS CRM, it usually happens through the following steps:

    1. The user creates an e-mail activity and clicks on the Send button. The e-mail is now saved with the user as the recipient
    2. The e-mail is synchronized to the user’s Outlook
    3. The user’s Outlook sends the e-mail to the mail server (Exchange)
    4. Exchange sends the e-mail out through the Internet

    What if the user does not have the Outlook client open? The mail will not be sent until the user opens Outlook. In some situations this is not acceptable. Fortunately, installing the E-mail Router can solve this.

    Sending e-mail via the e-mail router

    If you want to be independent of Outlook and send e-mail directly from MS CRM, you can do so by installing and configuring an E-mail Router.

    The E-mail Router is free software that comes with MS CRM. It can be installed on any server that has access to both the mail server (Exchange Server or another POP3/SMTP mail system) and MS CRM.

    When you send e-mail from MS CRM using an E-mail Router, it happens through the following steps:

    1. The user creates an e-mail activity and clicks on the Send button. The e-mail is now saved with the user as the recipient
    2. The e-mail is sent to the E-mail Router
    3. The E-mail Router sends the e-mail to the mail server (Exchange)
    4. Exchange sends the e-mail out through the Internet

    E-mail settings in CRM

    Depending on how you want your organization to send e-mails, remember to check the following settings:

    1. In CRM, Settings, Users
    2. Open the user form
    3. In the configuration section of the e-mail access, select the desired setting

    clip_image002

    Configuring e-mail access

    It is possible to choose one of the following settings from the option list:

    None
    Outlook cannot be used for sending and receiving e-mails that are related to MS CRM.

    Microsoft Dynamics CRM for Outlook
    Outlook is responsible for sending/receiving e-mail. Integration with MS CRM for Outlook must be installed and configured. E-mails are sent/received only when Outlook is active (open).

    E-mail router
    E-mail is sent and received with MS CRM Email Router. If this element is selected, a dialog box allows entering credentials. Check the box if you want to specify credentials

    Forwarded mailbox
    E-mail forwarded from another e-mail address. The e-mail Router is responsible for sending / receiving e-mails.

    More Information:

  • Migrate from Gmail to Office 365 in 7 steps

    If you are doing a large migration from Gmail to Office 365, you will generally want to use a 3rd party tool that automates the process. However, if you are migrating a small customer with a few mailboxes, it is quick, easy, (and free!) to do so manually.

    Here is how: http://technet.microsoft.com/en-us/library/dn568114.aspx

    What do you need to know before you begin?

    This guide covers migrating from Gmail to Office 365 and will take about an hour to complete.

    For more information on deploying Office 365, see the first article in the series at Office 365 Midsize Business Quick Deployment Guide and also watch the YouTube video at Office 365 Midsize Business Quick Deployment Guide video.

    Before you begin the Gmail to Office 365 migration, you need to know or have at hand a few key pieces of information:

    1. Your Google Apps and Office 365 administrator accounts and passwords.
    2. The URLs to access the Google admin console, the Office 365 admin center, and the Exchange admin center. If you don't have them, don’t worry—they are covered later in this document.
    3. The user names and passwords of the Gmail mailboxes you want to migrate.
    4. How to create MX records at your Internet service provider.

    noteNote:

    If you’re using Office 365 Midsize Business with the Microsoft Open License or the Open Value program, go to the get started with Office 365 page and create an Office 365 account first. After you’ve created the account, return to this document and begin Step 1: Sign in to the Gmail Admin console and Office 365 admin center.

    What Gmail information is migrated?

    1. Email is migrated, and this is covered in Step 5: Migrate a Gmail mailbox.

    2. Gmail contacts are migrated and imported by using a CSV file. This topic is covered in Step 6: Migrate Gmail contacts.

    3. Gmail calendar items are imported by exporting Google Calendar to an iCal file. This is covered in Step 7: Migrate Gmail calendar.

    Okay, let’s get started.

    Step 1: Sign in to the Gmail Admin console and Office 365 admin center

    To get started, sign in to both the Google Admin console and the Office 365 admin center pages.

    Sign in to the Google Admin console

    1. By using your Google Apps administrative credentials, sign in to http://admin.google.com.

    2. After you’re signed in, choose Users and verify the list of users you want to migrate to Office 365.

      image

    Sign in to the Office 365 admin center or the Exchange admin center
    1. By using your Office 365 administrative credentials, sign in to https://portal.microsoftonline.com.

    2. After you’re signed in, you will be directed to the Office 365 admin center page.

    3. To go to the Exchange admin center, click the drop-down arrow next to the Admin name in the ribbon bar.

      image

    4. From the list, select Exchange.

    5. Select Office 365 to return to the Office 365 admin center page.

    Step 2: Create Office 365 mailboxes for Gmail users you want to migrate

    One of the most important tasks in preparing to migrate Gmail to Office 365 is first creating an Office 365 mailbox for each Gmail mailbox you want to migrate. Fortunately, creating an Office 365 mailbox is easy: you simply create a new user account and assign an Exchange Online Plan license to the user. Refer to the list of Gmail mailboxes you want to migrate, and complete the following steps to create the corresponding Office 365 mailboxes (a scripted alternative is sketched at the end of this step).

    To create an Office 365 mailbox for each user you want to migrate from Gmail

    1. From the Office 365 admin center, click users and groups > active users.

    2. Click the plus icon (+) to add a new user account. You can also create multiple user accounts at the same time by clicking the Bulk add icon, as shown in the following figure.

    image

    3. Click Assign role > Set user location, and then click Next.

    4. On the Assign licenses page, ensure that Exchange Online Plan 1 or Exchange Online Plan 2 is selected. This helps ensure that the user account being created will have access to email.

    image

    5. On the Send results in email page, type an email address where you will receive the temporary password for the user.

      The newly created user name and password appear on the Results page and are also sent to the administrator via email.

    6. Lastly, send the email message with the user name and temporary password information to each user.
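
    If you prefer to script these accounts instead of using the admin center, a minimal sketch with the MSOnline PowerShell module might look like the following. The tenant domain contoso.onmicrosoft.com, the user details, and the AccountSkuId are placeholders; check your own SKU names with Get-MsolAccountSku.

      # Connect with your Office 365 administrator credentials
      Import-Module MSOnline
      Connect-MsolService

      # Create a user and assign an Exchange Online license in one call
      # (the UPN, display name, and license SKU below are hypothetical values)
      New-MsolUser -UserPrincipalName "kim@contoso.onmicrosoft.com" -DisplayName "Kim Akers" -UsageLocation "US" -LicenseAssignment "contoso:EXCHANGESTANDARD"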

    Step 3: Create a Gmail migration file

    The migration file, a comma-separated values (CSV) file, contains the list of Gmail accounts that will be migrated to Office 365. Each row of the file contains the email address of an Office 365 mailbox and the corresponding user name and password of the Gmail account that will be migrated.

    The CSV file can easily be created by using Microsoft Excel, or with the short PowerShell sketch after these steps.

    image

    Create the Gmail migration file

    1. On your local computer, open Excel 2013 or Excel 2010.

    2. Using the preceding figure as a template, create the migration file.

    3. Column A lists the Office 365 mailbox.

    4. Column B lists the Gmail user name.

    5. Column C lists the password for the Gmail user in Column B.

    6. Save the file as a CSV file type, and then close the program.
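
    If you would rather generate the file from PowerShell than from Excel, here is a minimal sketch. The header names EmailAddress, UserName, and Password are what the IMAP migration batch expects; the sample rows and path are placeholders.

      # Build the migration CSV without Excel (all values below are placeholders)
      $lines = @(
          "EmailAddress,UserName,Password"
          "kim@contoso.onmicrosoft.com,kim.akers@gmail.com,P@ssw0rd1"
          "lee@contoso.onmicrosoft.com,lee.gu@gmail.com,P@ssw0rd2"
      )
      Set-Content -Path "C:\Migration\GmailMigration.csv" -Value $lines -Encoding UTF8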

    Step 4: Verify that Office 365 can communicate with Gmail

    As part of the migration process, Office 365 must verify that it can communicate with Gmail. It’s very important to successfully connect to the Gmail server before continuing. If you do experience any problems performing this step, see Troubleshooting the Gmail connection to resolve the issue.

    Test the connection to the Gmail server

    1. Go to the Exchange admin center.

    2. Select migration > More > migration endpoints.

    3. Choose + and then select IMAP.

    4. Set IMAP server to imap.gmail.com, and leave the remaining settings as they are.

    5. Choose Next.

    image

    6. Reaching the new migration endpoint page verifies that Office 365 can connect to the Gmail server.

    image

    7. Enter a name for the connection and choose new to create the migration endpoint. The preceding figure uses Gmail-migration as the name of the migration endpoint.

    8. The migration endpoints page appears and displays the endpoint you just created.

    image
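
    The same endpoint can also be created from remote PowerShell (connected to Exchange Online as shown in the troubleshooting section at the end of this post). A minimal sketch; the endpoint name Gmail-migration matches the example above:

      # Create an IMAP migration endpoint pointing at Gmail
      New-MigrationEndpoint -IMAP -Name "Gmail-migration" -RemoteServer imap.gmail.com -Port 993 -Security Ssl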

    Step 5: Migrate a Gmail mailbox

    When you migrate your Gmail mailbox to Office 365, only the items in your inbox or other mail folders are migrated. The steps for migrating your contacts and calendar items are covered in later steps.

    Migrate messages from Gmail to Office 365

    1. Go to the Exchange admin center.

    2. Navigate to Recipients > Migration.

    3. Click the plus icon (+), and choose Migrate to Exchange Online.

    4. Choose IMAP migration.

    5. Choose Browse, and specify the file created in Step 3: Create a Gmail migration file.

    6. On the Start the batch page, select Automatically start the batch. The status field will initially be set to Created, as shown below.
      image

    7. The status will change to Syncing and then to Synced after the Gmail messages have been synchronized with Office 365.
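
    Steps 3-6 above can also be scripted once the endpoint exists. A minimal sketch, reusing the CSV from Step 3 and the Gmail-migration endpoint (the path and batch name are placeholders):

      # Create and automatically start an IMAP migration batch from the CSV
      New-MigrationBatch -Name "GmailBatch" -SourceEndpoint "Gmail-migration" -CSVData ([System.IO.File]::ReadAllBytes("C:\Migration\GmailMigration.csv")) -AutoStart

      # Check progress; the batch moves from Syncing to Synced
      Get-MigrationBatch -Identity "GmailBatch"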

    Step 6: Migrate Gmail contacts

    You migrate your contacts from Gmail to Office 365 by first exporting the list of contacts to a comma-separated values (CSV) file and then importing that file into Office 365.

    Export Gmail contacts to a CSV file

    1. Using your Google Apps administrative credentials, sign in to the Google Admin console.

    2. Choose Contacts > More > Export.

    3. Choose All contacts > Outlook CSV format > Export.

    4. Select a location to save your file.

    importantImportant:

    When you export Gmail contacts to a CSV file, you must choose the Outlook CSV format to successfully import the Gmail contacts into Office 365.

    Import Gmail contacts into Office 365

    1. Using your Office 365 administrative credentials, sign in to the Office 365 admin center.

    2. Choose People > Settings > Import contacts.

    3. Select the Gmail contacts CSV file you exported in the previous procedure, and choose Next.

    4. After the Gmail contacts have been successfully imported into Office 365, choose Finish.

    Step 7: Migrate Gmail calendar

    You migrate calendar items from Gmail to Office 365 by using a two-step process. First, you export the Gmail calendar items as an iCal file. Once the iCal file is saved, you use Microsoft Outlook to import the calendar items into the Outlook Calendar. You cannot import the iCal file directly into Outlook Web Access.

    noteNote:

    There are third-party tools available that simplify the task of moving Gmail calendar items and contacts to Office 365 and Microsoft Outlook. An Internet search for “Gmail to Office 365 migration tools” lists some of these tools.

    Export your Gmail calendar to an iCal file

    1. Using your Google Apps administrative credentials, sign in to http://admin.google.com.

    2. Choose Calendar > My calendars > Settings > Export calendars.

    3. Select a location to save your file. Gmail saves the iCal file as a compressed file. Be sure to decompress the file before proceeding to the next step.

    Import your Gmail calendar into Microsoft Outlook
    1. Set up Microsoft Outlook to access Office 365. For guidance, see Set up email in Outlook 2010 or Outlook 2013.

    2. In Outlook, open the Import and Export Wizard, choose Import an iCalendar (.ics) or vCalendar file (.vcs), and then choose Next.

    3. Select the iCalendar file you saved in the previous step.

    4. Choose Outlook’s calendar > Finish. You should now see the Gmail calendar items within the Outlook calendar.

    Verify Gmail migration completed successfully

    Now that you have migrated Gmail messages, contacts, and calendar items to Office 365, you can use Outlook Web App, which comes with Office 365, to verify that Gmail migrated successfully.

    Verify Gmail migrated successfully using Outlook Web App

    1. Open the email message sent by the Office 365 administrator that includes your temporary password.

    2. Go to the sign-in page https://portal.microsoftonline.com.

    3. Sign in with the user name and temporary password.

    4. Update your password, and set your time zone.

      noteNote:

      It’s very important that you select the correct time zone to ensure your calendar and email settings are correct.

    5. When Outlook Web App opens, send an email message to the Office 365 administrator to verify that you can send email.

    6. Choose the Outlook icon, and verify that the Gmail messages have been migrated.

    7. Choose the People icon, and verify that the Gmail contacts have been migrated.

    8. Choose the Calendar icon, and verify that the Gmail calendar items have been migrated.

      noteNote:

      You cannot import Gmail calendar items directly into Outlook Web App. However, you can view the items using Outlook Web App after they have been imported by Microsoft Outlook.

    Next steps after migrating Gmail to Office 365

    Well, you’ve reached the end of migrating Gmail to Office 365. At this stage, email is flowing to both Gmail and Office 365 mailboxes. Many administrators choose to keep both the Gmail and Office 365 mailboxes running in parallel for a period of time. There’s nothing wrong with this approach; the limitation is that email is copied from Gmail to Office 365 only once every 24 hours. To remove this limitation and route new Gmail messages straight to Office 365, follow the procedure below.

    Route all future Gmail messages to Office 365

    1. Sign in to your DNS hosting provider’s website.

    2. Select your domain.

    3. Find the page where you can edit DNS records for your domain.

    4. Open a new browser window, and sign in to the Office 365 website using your Office 365 administrative credentials.

    5. Choose domains > your company domain > View DNS Settings > View DNS records.

    6. In the Exchange Online section, in the MX row, copy the Priority, Host Name, and Points to Address.

    image

    7. Return to your DNS hosting provider’s website, and use this information to create a new MX record.

    8. Set the MX record to the highest priority available (the lowest numeric value, typically 0), and save the record.

    For detailed instructions for creating MX records to point to Office 365, see the article Create DNS records for Office 365 when you manage your DNS records.

    For information about creating an MX record, see Find your domain registrar or DNS hosting provider.

    noteNote:

    Typically, it takes about 15 minutes for DNS changes to take effect. However, it can take up to 72 hours for a changed record to propagate throughout the DNS system.
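
    Once the change has propagated, you can verify the new MX record from PowerShell (Resolve-DnsName requires Windows 8 / Windows Server 2012 or later; contoso.com is a placeholder domain):

      # Confirm the MX record now points at Office 365
      Resolve-DnsName -Name "contoso.com" -Type MX | Select-Object NameExchange, Preference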

    See the following list of resources to further your exploration of Office 365:

    • Join the Office 365 Yammer group to discuss the latest news about Office 365. Sign up on the Office 365 Yammer page to get started.
    • The Office 365 community site posts the latest developments and information related to Office 365. It includes a discussion area where site members can post questions and answers.

    Troubleshooting the Gmail connection

    The information in this article covers troubleshooting Step 4: Verify that Office 365 can communicate with Gmail. If you successfully created a connection to Gmail from Office 365, you can skip this topic. However, if you were not successful connecting to Gmail from Office 365, perform the following steps.

    Test the connection to the Gmail server

    1. Open Windows PowerShell as an administrator on your computer.

    2. From the Windows PowerShell command window, run Get-ExecutionPolicy.

      The Get-ExecutionPolicy cmdlet tells you which of the four execution policies (policies that determine which Windows PowerShell scripts, if any, will run on your computer) is set. In the next step, we’ll change this setting to remotesigned.

    3. From the Windows PowerShell command window, run Set-ExecutionPolicy remotesigned.

    4. Next, store your Office 365 administrator credentials and create the remote session by running the following commands:

      $cred = Get-Credential
      $session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri "https://ps.outlook.com/powershell/" -Credential $cred -Authentication Basic -AllowRedirection
    5. When prompted to enter your Windows PowerShell credentials, enter your Office 365 administrator credentials.

    6. Next, run Import-PSSession $session.

      This command provides access so you can test the connection between Gmail and Office 365.

    7. To see a list of Office 365 mailboxes configured on Office 365, run Get-Mailbox. This is just a quick test to verify that we are communicating with Office 365.

    8. Finally, to test the connection between Gmail and Office 365, run the following command:

      Test-MigrationServerAvailability -IMAP -RemoteServer imap.gmail.com -Port 993 -Security SSL
          

      You should see Success appear in the Result row. If you see any errors, verify you have entered the command correctly.

      image

    9. Now that you’ve verified that Office 365 can connect to Gmail, it’s important to disconnect from Office 365. To do that, from the Windows PowerShell command window, run Exit.

    10. Troubleshooting is now complete. Return to Step 4: Verify that Office 365 can communicate with Gmail.

  • Take advantage of your Office 365 Internal Use Rights

    In the past, MPN Partners in the Cloud Essentials and Cloud Accelerate programs had access to Internal Use Rights for Office 365.  Now, however, all partners with a Microsoft Action Pack Subscription (MAPS), as well as Partners with a Silver or Gold competency, have access to free Internal Use Rights licenses for Office 365.  This gives you an opportunity to try out the service, so you can speak from experience when you discuss the benefits with your customers.  It also means that someone else takes care of running your servers so that you can spend more time working and less time patching and troubleshooting.

    I wanted to share a few resources to help get you started.  First, the page with all the information you need on your Internal Use Rights licenses, how to access them, how to earn more licenses, and how to activate your partner features is available at: http://aka.ms/mpniur.

    In the following video, York Hutton walks through the Internal Use Rights (IUR) core benefits, discussing how they now give partners the power of choice to mix and match online services and on-premises software licenses. Microsoft partners can choose between work-alike solutions for productivity, demonstration, development, testing, and internal training purposes.

    In this video, York walks through the process of activating your IUR benefits, whether you are using them for the first time, or transitioning from a previous license grant:

    A few additional resources:
    KB2887467: Support Article: What are my internal use rights benefits?

    Office 365 partner features how-to guide (Learn about partner features available to help you sell to and manage your customers, including how to offer and use delegated administration, and how to send quotes and trials.)

    If you have program questions (how do I get my license, where is my key, how do I sign up for MAPS or renew my membership?) visit the Partner Membership Community

    If you have technical questions (why am I getting an error message when migrating my mailboxes? how do I resolve a DirSync error about an invalid attribute?), visit the Partner Online Services community.

    If you have a Silver or Gold competency, you have access to 20 and 50 (respectively) hours of advisory services consultation with a Partner Technical Consultant.  These consultants are a great resource to help plan for a deployment (even if it is an internal deployment). Submit an advisory request via: http://aka.ms/mysupport

    All partners holding current internal-use software licenses obtained through a cloud program must transition to the new internal-use software license process and entitlements, which are available to Action Pack subscribers or competency partners, prior to June 30, 2014, or their internal-use software licenses will expire.
    Download the instructions to transition to the new system

  • Microsoft Virtual Academy Presents: "Building Blocks" a pre //build event

    What do @geektrainer and @bitchwhocodes have in common?

    They both have awesome Twitter handles! And they’re both sharing their experience and insights in our upcoming “Building Blocks” Jump Start series. These entertaining and inspiring technology experts are teaming up with other seasoned pros, including @codefoster and @mlunes90, for three lively days of deep dives to help you gear up for next month’s //build conference. Whether you’re a web, app, C#, .NET, or JavaScript developer, you’re sure to stretch your dev muscles before the //build workout.

    We start the series on March 26 with “Initialize(),” which focuses on various paradigms comparing JavaScript and C# side by side on the Microsoft platform. We continue on March 27 with “Construct(),” where you learn how to create great layout and style with XAML and HTML5. And we wrap on March 28 with “Extend(),” a session on successful mobile app and smart device strategies.

    Sign up for one, two, or all three sessions, and be sure to bring questions for the Q&A!

    Register now! Building Blocks series:

    Initialize(), Wednesday, March 26, 9:00am‒5pm PDT

    Construct(), Thursday, March 27, 9:00am‒5pm PDT

    Extend(), Friday, March 28, 9:00am‒5pm PDT
    Where: Live, online virtual classroom
    Cost: Free!

  • How to fix the ACS Error ACS50008 in Windows Azure

    This video shows how to fix the Error ACS50008 in the context of Windows Azure Access Control Service.

    This error usually is displayed as an Inner Message like this:

    An error occurred while processing your request.
    HTTP Error Code: 401
    Message: ACS20001: An error occurred while processing a WS-Federation sign-in response.
    Inner Message: ACS50008: SAML token is invalid.
    Trace ID: 903f515f-3196-40c9-a334-71277700aca6
    Timestamp: 2014-03-02 10:16:16Z

    Links:

    How to fix Error ACS50008 http://msdn.microsoft.com/en-us/library/windowsazure/jj571618.aspx

    ACS Error Codes http://msdn.microsoft.com/en-us/library/windowsazure/gg185949.aspx

    ACS Documentation http://msdn.microsoft.com/acs

  • Capture a Windows® Image from a Reference Computer Using Capture Media—for IT Pros

    (This post courtesy of Simone Pace)

    In order to use System Center Configuration Manager 2007 to distribute the Windows 7 operating system to our managed clients, we need to provide the OS bits to the site server somehow. One of the methods we can use is capturing a Windows 7 WIM image from a previously prepared reference computer.

    System Center Configuration Manager 2007 offers standard and easy ways to deploy software in our IT Infrastructure. One of the most relevant features we can take advantage of is the highly customizable Operating System Deployment capability built in the product.

    The new WIM image format used by Windows Vista® and Windows 7 further simplifies OS distribution by being independent of the destination client system’s hardware, so we can use a single image to target different computers and keep our image repository less complex and easier to manage. This post shows an example of the steps we can follow to successfully capture a WIM image of Windows 7 Ultimate Edition x64 from a reference computer.

    Note: Further posts will follow that illustrate the specific tasks required to upgrade a Windows XP computer.

    Testing lab description: the screenshots and computer names used in this article refer to a virtual scenario running on a Hyper-V R2 host:

    • Domain: contoso.com (single site)
    • All servers are Windows Server 2008 R2 Enterprise servers.
    • Server CON-001
      • SCCM with almost all roles installed
      • SQL Server 2008
      • Windows Automated Installation Kit 2.0
      • WDS Transport Server role installed
    • Server CON-002
      • Active Directory Domain Controller role installed
      • DNS Server role installed
      • DHCP Server role installed
    • SCCM Primary Site: C01
    • Reference client is a Windows 7 Ultimate edition x64 clean setup

    1. Create a Capture Media iso file.

    The iso image we are creating in this section will be used to boot the reference machine and start the OS wim image creation sequence.

    a. Log on CON-001 and open the Configuration Manager console.

    b. Go to Task Sequences node.

    c. Click on “Create Task Sequence Media” in the action panel.

    d. Select Capture Media and click next on the welcome page.

    clip_image002

    e. On the “Media file” page, click Browse, select the folder where you are going to save the media iso file, give it a name (for example MediaCapture), and click Next.

    clip_image004

    f. On “Boot Image” click Browse, and select the boot image suitable for your reference computer.

    Note: Two boot images (x86 and x64) are automatically added when you install the WDS role on the system.

    g. On Distribution Point, leave \\CON-001 (or select your preferred DP), and click Next.

    clip_image006

    h. Review the summary and click Finish.

    i. The server starts building the iso image.

    clip_image008clip_image010clip_image012

    j. Click Close to close the wizard.

    2. Prepare the reference computer.

    a. Log on to CON-Ref7Client with the Administrator user account

    b. Check the following requirements (a quick way to verify some of them from PowerShell is sketched after this list):

    i. The computer must be a workgroup member.

    ii. The local Administrator password must be blank.

    iii. The local system policy must not require password complexity.

    iv. Apply the latest Service Pack and updates.

    v. Install the required applications.
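
    A quick way to verify some of these requirements from an elevated PowerShell prompt on the reference computer (a sketch; the export path is arbitrary and the folder must already exist):

      # Should return False: the machine must be a workgroup member, not domain-joined
      (Get-WmiObject Win32_ComputerSystem).PartOfDomain

      # Export the local security policy and check that PasswordComplexity = 0
      secedit /export /cfg C:\Temp\secpol.cfg
      Select-String -Path C:\Temp\secpol.cfg -Pattern "PasswordComplexity"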

    3. Capture the image using the Capture Media.

    a. Capture the MediaCapture.iso you created in Step 1 in the virtual DVD drive of the reference PC (if it is a VM), or

    b. Burn the MediaCapture.iso on a DVD and insert it in the computer.

    c. Boot the reference computer normally.

    d. Start the autoplay DVD and launch the Capture Image Wizard.

    clip_image013

    e. Click Next.

    f. Set the path where you want to save the wim file, give the image a name, and enter the appropriate credentials to access the path and write to it.
    clip_image014

    g. Click Next.

    h. Fill in the required data in the Image Information window.

    clip_image015

    i. View the summary and launch the capture by clicking Finish.

    clip_image016

    j. The program will start executing the sysprep phase.

     clip_image017

    k. After sysprep, the computer will restart in WinPE to start the capture.

    clip_image018

    l. (Reboot).

    clip_image020

    m. Computer restarts in WinPE and starts the Capture.

    clip_image021

    n. Capturing the first partition (1 of 2).

    clip_image022

    o. And capturing the second partition (2 of 2).

     clip_image023

    Note: The number of partitions captured depends on the reference PC’s disk partitions. In the case shown, the VM had a 100 MB partition for the BitLocker® capability (Partition 1 of 2).

    p. When finished, press OK to quit and restart.

    clip_image024

    q. On the Server we can see the captured image file.

    clip_image025

    4. Add the file to the image repository in SCCM 2007.

    a. Share a folder and move the image file (example \\ServerName\Images).

    b. Open the SCCM console, navigate to Site Database > Computer Management > Operating System Deployment > Operating System Images.

    c. Import the image by clicking Add Operating System Image in the task panel.

    d. Type or browse to the network path of the image you want to import, and click Next.

    clip_image027

    e. Fill in the required information, then click Next.

     clip_image029

    f. Review the summary and complete the wizard.

    clip_image031

    clip_image033

    5. Distribute the image to Distribution Point.

    a. In the SCCM console, navigate to the image you uploaded in step 4 (Site Database > Computer Management > Operating System Deployment > Operating System Images) and select it.

    b. Click Manage Distribution Points in the action panel.

    clip_image035

    c. Click Next on the wizard starting page.

    d. As the DP doesn’t have the image deployed yet, leave the default selection (Copy) and click Next.

    clip_image037

    e. Select the DPs to which you want to deploy the image, and include their PXE DPs’ hidden share.

    clip_image039

    f. Click Next and Next again in the Completion page.

    clip_image041

    clip_image043

    g. Check the copy progress in the Package Status folder until you see it is Installed.

    clip_image045

    h. You are now ready to distribute the Windows 7 Ultimate x64 image to client computers, either by upgrading existing machines or installing new ones.

  • System Center Operations Manager 2012 Installation Walkthrough

    (Post courtesy Rohit Kochher)

    System Center Operations Manager 2012 has significant setup changes compared to Operations Manager 2007; setup has become simpler and installation easier.

    If you want to follow along on a test server, you can download the Beta version of SCOM 2012 from here.

    Note: The Root Management Server (RMS) concept from Operations Manager 2007 R2 has been removed in Operations Manager 2012. All Operations Manager 2012 servers are management servers. However, we do have an RMS emulator to support those management packs that target the RMS. Architecturally, servers in Operations Manager 2012 have a peer-to-peer relationship, not the parent-child relationship of Operations Manager 2007 R2.

    In this blog we will discuss the setup of Operations Manager 2012 with some screenshots of the installation wizard. Microsoft SQL Server 2008 SP1 or 2008 R2 should be installed prior to running SCOM 2012 Setup. You can get more information on SCOM 2012 supported configurations here.

    Now, once we run setup.exe we will see the following screen:

    image

    You can click on Install for setup of Management server, Management Console, Web server and Reporting Server. Under Optional installations you can choose to install Local agent, Audit Collection Services, Gateway management server, and ACS for Unix/Linux.

    Once you click Install, you will get the screen to accept the license agreement. Once you accept it, you will see the screen below.

    image

    You can select the components that you want to install. Clicking the down arrow in front of each role gives brief information about that role. There is no explicit option to install the operations database and data warehouse, as they are integrated into the setup. After selecting the features, you will get the screen for the program files location. The default location is C:\Program Files\System Center Operations Manager 2012.

    image

    The next step will show you prerequisite failures (if any). You will get information for failures along with download links to install any missing prerequisites.

    Next, you get a screen to enter information about the management server. You can specify whether it is the first management server in a new management group or an additional management server in an existing management group.

    image

    You can specify the name of the management group here. You will also get the screen to specify the operations database. We need to install both the operations database and the data warehouse in Operations Manager 2012; installing the data warehouse is mandatory in 2012 (a change compared with Operations Manager 2007), as it is needed for features like dashboards. If this is a second management server, you can click the Add a management server to an existing management group option.

    image

    After specifying the required information about the operations database and clicking Next, you will get a similar screen for the Operations Manager data warehouse.

    The next screen allows you to configure Operations Manager service accounts.

    image

    You can specify the required accounts on this screen and click Next. Setup will automatically assign the server’s local Administrators group to the Operations Manager admin role. Once you enter account information here, it is automatically verified in the background. If an account cannot be verified (or the password is incorrect), you will get a red warning, as the picture above illustrates.

    After this, you will get the option to participate in the Microsoft Customer Experience Improvement Program (CEIP) and Error reporting. Finally, you will also get the option for configuring Microsoft Updates.

    image

    The last screen provides an installation summary. Clicking Install will start the installation. Once it finishes, you are all set to monitor your infrastructure! Some of the great features in Operations Manager 2012 are the new dashboards, network monitoring, and application monitoring, which will be covered in future posts.

    You can check the deployment guide for Operations Manager 2012 here.

    System Center Operations Manager 2012 Beta resources

  • Windows Store App Development Training Videos (Spanish Versions)

    Recently we’ve released several short training videos around building Windows Store apps.  As a part of this process we’re now taking some of the most popular and offering them in additional languages.  In the coming weeks we’ll bring you French, German, Portuguese, and Turkish as a start.

    Today we’re excited to bring the first five in Spanish.

    Desarrollo de Windows 8.1 Store Apps en C#

    Certificación de Store Apps

    Plantillas de Proyecto de Visual Studio para Store Apps

    Consejos y Trucos para Implementación de Notificaciones en Store Apps

    Como Adaptar Store Apps a los Tamaños de Ventana

    Como Implementar Live Tile del Store App en menos de 10 minutos

  • Automatically Collect Windows Azure Storage Analytic Logs

    This video shows you how to automatically collect Windows Azure Storage Analytics logs. Storage Analytics logs are key to diagnosing issues with blob, table, and queue storage. You can run the Windows Azure Storage Analytics Diagnostics package (.DiagCab) to automatically collect the previously generated logs.

    Before using this package, you will need to enable Windows Azure Storage Analytics.
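
    If you have not enabled it yet, logging can be turned on with the Windows Azure PowerShell cmdlets. A minimal sketch, assuming the classic Azure PowerShell module; the storage account name and key are placeholders:

      # Enable Storage Analytics logging for the Blob service (repeat for Table/Queue as needed)
      $ctx = New-AzureStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey "<account key>"
      Set-AzureStorageServiceLoggingProperty -ServiceType Blob -LoggingOperations All -RetentionDays 10 -Context $ctx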

    Links:

    Download .DiagCab http://dsazure.blob.core.windows.net/azuretools/AzureStorageAnalyticsLogs_global.DiagCab

    How to use .DiagCab http://blogs.msdn.com/b/kwill/archive/2014/02/06/windows-azure-storage-analytics-sdp-package.aspx

    Storage Analytics Video http://channel9.msdn.com/Series/DIY-Windows-Azure-Troubleshooting/Storage-Analytics

    Storage Analytics Documentation http://msdn.microsoft.com/en-us/library/windowsazure/hh343270.aspx

    Storage Analytics Billing http://msdn.microsoft.com/en-us/library/windowsazure/hh360997.aspx

    Storage Analytics Log Format http://msdn.microsoft.com/en-us/library/windowsazure/hh343259.aspx

    Storage Analytics Logging - How to Enable and Where to Find the logs. http://blogs.msdn.com/b/cie/archive/2013/10/10/storage-analytics-logging-how-to-enable-and-where-to-find-the-logs.aspx

    Note:

    This package will only work on a computer running Windows 7 or later, or Windows Server 2008 R2 or later. You will need to have Microsoft Excel installed on the machine where you run this package in order to see the charts.

  • Desarrollo de Windows 8.1 Store Apps en C#

    Windows 8.1 offers ISVs an opportunity to develop innovative applications and monetize them through the Windows Store. Store Apps make it possible to quickly implement mobility solutions that require a touch interface and permanent data connectivity. This webcast series is aimed at C# developers who want to learn about the new functionality available for Store Apps in Windows 8.1.

    We have shared the material for this course on OneDrive.

    When / Title / Description

    21/04/2014, 10:00-11:15 (RECORDING LINK)
    Windows 8.1 para Desarrolladores (1:15h, Level 100)
    We introduce the Windows 8.1 operating system functionality that is of interest to developers. We discuss how Store Apps integrate with the OS, and we introduce the Windows Store, the APIs, and the development tools.

    23/04/2014, 10:00-11:00 (RECORDING LINK)
    Controles XAML UI (1:00h, Level 200)
    We present an overview of the basic XAML controls for building the user interface: TextBox, RichTextBlock, ProgressRing, RichEditBox, DatePicker and TimePicker, Buttons, Flyouts, Shapes, Paths and Images. We also explain how to apply styles to the controls.

    25/04/2014, 10:00-11:00 GMT+1 (RECORDING LINK)
    Controles de Listado Modernos (1:00h, Level 200)
    We present the modern list controls: FlipView, GridView, ListView and SemanticView. We also cover the list performance improvements that can be achieved by using UI virtualization and incremental data loading.

    28/04/2014, 10:00-11:00 GMT+1 (RECORDING LINK)
    Lenguaje de Diseño de Store Apps (1:00h, Level 100)
    We present the design principles of Store Apps. The purpose is to help developers apply the design language so that the application has a distinct brand and is perceived as an integral part of Windows 8.1. We explain the 4 design pillars: Principles, Personality, Patterns and Platform.

    30/04/2014, 10:00-11:00 GMT+1 (RECORDING LINK)
    Navegación, Comandos, Ventanas y Layout (1:00h, Level 200)
    We cover the navigation mechanism between pages and how to implement commands with the CommandBar. Finally, we explain how to adapt the user interface layout to changes in the size of the window that hosts the application.

    5/05/2014, 10:00-10:40 GMT+1 (RECORDING LINK)
    Controles WebView y RenderTargetBitMap (0:40h, Level 200)
    We explain how to integrate HTML content into the app using the WebView control. We also cover how to generate an image in a variety of formats from a branch of the XAML visual tree and share it with other applications using RenderTargetBitMap.

    9/05/2014, 10:00-11:00 GMT+1 (RECORDING LINK)
    APIs de Windows Store (1:00h, Level 200)
    This presentation focuses on the Windows Store APIs and on designing the application for monetization. We explain the available models (ad-funded, trial, in-app purchase, consumables), how to handle changes to the application’s license, and how to enable the corresponding functionality. We also cover the process of publishing the application to the Windows Store.

    14/05/2014, 10:00-10:40 GMT+1 (RECORDING LINK)
    Integrando con los Contactos y Calendario (0:40h, Level 200)
    In this presentation we explain how to integrate with the standard Windows 8.1 Contacts and Calendar apps using the contract APIs, so that an application that needs to manage data about people or appointments can access the information managed by those apps.

    You can find additional videos (in English) about Store App development with XAML and C# at:

    “Designing Your XAML UI with Blend Jump Start”

    http://www.microsoftvirtualacademy.com/training-courses/designing-your-xaml-ui-with-blend-jump-start

    “XAML Deep Dive for Windows & Windows Phone Apps Jump Start”

    http://www.microsoftvirtualacademy.com/training-courses/xaml-deep-dive-for-windows-windows-phone-apps-jump-start

    "Windows Store App Development Essentials with C# Refresh"

    http://www.microsoftvirtualacademy.com/training-courses/windows-store-app-development-essentials-with-c-refresh

    "Advanced Windows Store App Development Using C# Refresh"

    http://www.microsoftvirtualacademy.com/training-courses/advanced-windows-store-app-development-using-c-refresh

    We have also released Update 2 for Visual Studio 2013, which you can download from:

    “Microsoft Visual Studio 2013 Update 2”

    http://www.microsoft.com/en-us/download/details.aspx?id=42666

  • WSUS not configured error during Configuration Manager 2012 Software Update Point Installation

    (Post courtesy Anil Malekani)

    Recently I tried configuring Software Update Management in Configuration Manager 2012. After installing WSUS on the Configuration Manager 2012 box, I tried to install Software Update Point as a site role.

    clip_image002

    The Software Update Point role successfully installed, as per the SUPSetup.log file (under C:\Program Files\Microsoft Configuration Manager\Logs)

    However, my updates still did not appear in the console. After checking the Site Component status for SMS_WSUS_SYNC_MANAGER and SMS_WSUS_CONFIGURATION_MANAGER, I noticed the errors below:

    SMS_WSUS_SYNC_MANAGER: Message ID 6600

    clip_image003

    SMS_WSUS_CONFIGURATION_MANAGER: Message ID 6600

    clip_image004

    I checked under WCM.log (under C:\Program Files\Microsoft Configuration Manager\Logs), and found the following proxy error

    =============================

    SCF change notification triggered.        SMS_WSUS_CONFIGURATION_MANAGER        1/1/1601 12:00:00 AM        5408 (0x1520)

    This SCCM2012.CORP80.COM system is the Top Site where WSUS Server is configured to Sync from Microsoft Update (WU/MU) OR do not Sync.        SMS_WSUS_CONFIGURATION_MANAGER        1/1/1601 12:00:00 AM        5408 (0x1520)

    Found WSUS Admin dll of assembly version Microsoft.UpdateServices.Administration, Version=3.0.6000.273, Major Version = 0x30000, Minor Version = 0x17700111        SMS_WSUS_CONFIGURATION_MANAGER        1/1/1601 12:00:00 AM        5408 (0x1520)

    Found WSUS Admin dll of assembly version Microsoft.UpdateServices.Administration, Version=3.1.6001.1, Major Version = 0x30001, Minor Version = 0x17710001        SMS_WSUS_CONFIGURATION_MANAGER        1/1/1601 12:00:00 AM        5408 (0x1520)

    The installed WSUS build has the valid and supported WSUS Administration DLL assembly version (3.1.7600.226)        SMS_WSUS_CONFIGURATION_MANAGER        1/1/1601 12:00:00 AM        5408 (0x1520)

    System.Net.WebException: The request failed with HTTP status 502: Proxy Error ( The host was not found. ).~~ at Microsoft.UpdateServices.Administration.AdminProxy.CreateUpdateServer(Object[] args)~~ at Microsoft.UpdateServices.Administration.AdminProxy.GetUpdateServer(String serverName, Boolean useSecureConnection, Int32 portNumber)~~ at Microsoft.SystemsManagementServer.WSUS.WSUSServer.ConnectToWSUSServer(String ServerName, Boolean UseSSL, Int32 PortNumber)        SMS_WSUS_CONFIGURATION_MANAGER        1/1/1601 12:00:00 AM        5408 (0x1520)

    Remote configuration failed on WSUS Server.        SMS_WSUS_CONFIGURATION_MANAGER        1/1/1601 12:00:00 AM        5408 (0x1520)

    STATMSG: ID=6600 SEV=E LEV=M SOURCE="SMS Server" COMP="SMS_WSUS_CONFIGURATION_MANAGER" SYS=SCCM2012.corp80.com SITE=CM1 PID=2424 TID=5408 GMTDATE=Fri Oct 14 00:20:03.092 2011 ISTR0="SCCM2012.corp80.com" ISTR1="" ISTR2="" ISTR3="" ISTR4="" ISTR5="" ISTR6="" ISTR7="" ISTR8="" ISTR9="" NUMATTRS=0        SMS_WSUS_CONFIGURATION_MANAGER        1/1/1601 12:00:00 AM        5408 (0x1520)

    Waiting for changes for 46 minutes        SMS_WSUS_CONFIGURATION_MANAGER        1/1/1601 12:00:00 AM        5408 (0x1520)

    =============================

    I validated that the proxy had been configured correctly and my browser settings also contained the same settings.

    Resolution: After spending some time, I found that Configuration Manager 2012 uses the system account’s proxy settings, which were set to Automatically detect settings.

    1. Using the excellent PsExec utility, I opened a command prompt under the system account (using the –s parameter).
    2. Within this command prompt running as system, I launched Internet Explorer and removed proxy settings.
    3. Finally, updates started appearing in the console.

    clip_image005
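
    For reference, the PsExec invocation from step 1 looks roughly like this (a sketch of my steps, run from an elevated PowerShell prompt in the folder where PsExec.exe lives; it is not an official procedure):

      # Open an interactive command prompt running as LocalSystem
      .\PsExec.exe -s -i cmd.exe

      # Or launch Internet Explorer directly under the LocalSystem account,
      # then clear the proxy settings under Internet Options > Connections > LAN settings
      .\PsExec.exe -s -i "$env:ProgramFiles\Internet Explorer\iexplore.exe"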

  • Windows Store App Development Training Videos (Portuguese Versions)

    Recently we’ve released several short training videos around building Windows Store apps.  As a part of this process we’re now taking some of the most popular and offering them in additional languages.  Last week we launched the Spanish series.  Today we’re launching the Portuguese versions.

    Construir um Live Tile para a sua aplicação da Windows Store em menos de 10 minutos

    Construir uma aplicação para diferentes tamanhos de janela

    Windows App Certification Kit (Portuguese)

    Dicas e truques para utilizar notificações em aplicações da Windows Store


    Como utilizar o Visual Studio para criar Windows Store Apps


  • Windows Store App Development Training Videos (Turkish Versions)

    Recently we’ve released several short training videos around building Windows Store apps.  As a part of this process we’re now taking some of the most popular and offering them in additional languages.  Last week we launched the Spanish series.  Today we’re launching the Turkish series.

    Windows Store uygulamanız için 10 dakikadan az bir süre içinde Live Tile oluşturun

    Farklı ekran boyutları için Windows Store uygulamaları tasarlama

    Windows App Certification Kit (Turkish)

    Windows Store uygulamaları tasarlamak için Visual Studio şablonlarını kullanma

    Windows Store uygulamalarında bildirimleri kullanmak için ipuçları

  • Ask the Experts… any time

    I just sat through an “Ask the Experts” session on Exchange Online migrations at the Microsoft Exchange Conference, and there were some great questions asked. It got me to thinking… what do YOU do when you have a question?

    Search Bing/Google? Read TechNet or MSDN?  Read the Office 365 Service Descriptions? The Office 365 Deployment Guide? Stack Exchange?  Save up the question for SharePoint Conference or MEC or TechEd?  Those are all excellent resources.  Many teams within Microsoft write blogs to share tips/tricks/issues so they can be found later.  Definitely use them to find an answer if you can.

    For that matter, take advantage of the excellent training on Microsoft Virtual Academy or Ignite or  from a Microsoft Learning Partner. We even have some great videos over on the MSPartnerTech YouTube channel. However, training tends to cover our products working as designed, in a vanilla environment.  Out in the real world, things are much trickier, which is why we depend on our Partners for technical/industry/integration expertise. 

    That means our Partners hit some super interesting scenarios… “For a mailbox that was originally created on Exchange 2003, I cannot enable an archive once the user is moved to the cloud… am I doing something wrong or did I hit a bug?” “I am trying to move 30,000 mailboxes to the cloud, and when we did a few test migrations, I was only able to move mailboxes at 0.5 GB/hr. At that rate, it will take a year to move to the cloud. How can I speed up the migration?” “How can I move SharePoint list items from on-premises to SharePoint Online without changing the ‘modified’ date?”

    Wouldn’t it be great if you could get ahold of someone at Microsoft that had seen that scenario before and had an answer or a workaround or a pointer to documentation? Someone that could track down an answer from the thousands of smart people within Microsoft that may have hit that edge case before?  Some way to “Ask the Expert” when you have a problem, rather than waiting for a conference that takes place once a year?

    Let me point you to resources for our partners that let you “Ask an Expert” when you need it most… when you are planning for or carrying out a project:

    All Partners:

    Partner Support Communities – Unlimited no-charge support for both technical and program questions (about your membership, benefits, etc.). SLAs vary by your membership level.

    Office 365 Partner Yammer Community – This is a Yammer group maintained by the Office 365 Partner team. There are no guaranteed answers or SLA, but it is a great place to collaborate with the Product team and other partners.

    Silver/Gold Competency Partners

    Partners with a Silver competency have access to 20 advisory hours a year, and Partners with a Gold competency have access to 50 advisory hours a year.  This is the Bat Phone to speak directly with a Partner Technical Consultant about your technical issue. There are many things a Partner Technical Consultant can help you out with (deployment planning, design review, and more). Submit your request via http://aka.ms/mysupport and a consultant will call you directly.

    Cloud Accelerate/Cloud Deployment Program Partners

    The Cloud Partner Support team is available to Cloud Accelerate and Cloud Deployment Program Partners via the Microsoft Online Portal (MOP) and via phone submission for severity A (critical) issues. Note: Cloud Accelerate partners must submit MOP issues on their partner Internal Use Rights (IUR) tenant to ensure routing to the correct Support team. A quick reference guide with SLAs, best practices, escalation resources, etc. is available here: Cloud Partner Support Quick Reference Guide

  • Partners - Maximize Your Investment and Use Your Free Azure Credits! We can help you get started.

    Partners, did you know that if you have an MSDN subscription you qualify for free Windows Azure credits each month? MAPS Partners also qualify for $100 in monthly credits, and most Gold and Silver partners qualify as well.

    To see if you qualify for Internal Use Rights on Azure follow the steps in this video.

    You might say, “but we don’t work on Azure, so those don’t help us” or, “we’d like to learn Azure, but where do we start?”  There is a common business need you can start addressing that applies to almost everyone.

    Just about every Partner I talk with uses virtual machines in some way.  It might be for demos, lab testing, development work, training, customer support, or many other scenarios.  Most of the folks I talk with also run into challenges with VMs.  See if any of these apply to you:

    • “Our IT staff doesn’t have time to maintain VMs for us.”
    • “We frequently run out of VM capacity on our servers when we are busy and need it the most.”
    • “We want to build a catalog of VMs that we can spin up quickly.”
    • “The developers only use VMs for a short amount of time, but need them daily.”
    • “We want to access our VMs from anywhere.”
    • And the list goes on….

    Why not put those free Azure credits to work and start running some of those VM scenarios on Azure?

    To help you get started, the Partner Services team has put together a new offering called (ready for my burst of creativity?) Labs on Azure.  This offering is an opportunity for you or your team to spend some time 1-1 with one of our consultants.  You’ll learn how to get started in Azure, build a VM in minutes, customize and re-use VMs, upload VMs you might be running on-premises today, and start automating VM creation through PowerShell.
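    To give you a feel for that last piece, here is a minimal sketch of spinning up a lab VM with the classic (service management) Azure PowerShell module that is current as of this writing; the subscription name, cloud service name, VM name, admin credentials, and region are all placeholders you would swap for your own values.

        # Sign in and select the subscription that carries your partner Azure credits
        Add-AzureAccount
        Select-AzureSubscription -SubscriptionName "Windows Azure MSDN - Visual Studio Premium"   # placeholder name

        # Find the most recent Windows Server 2012 R2 image in the gallery
        $image = Get-AzureVMImage |
            Where-Object { $_.ImageFamily -eq "Windows Server 2012 R2 Datacenter" } |
            Sort-Object PublishedDate -Descending |
            Select-Object -First 1

        # Create a small lab VM in a new cloud service (names and password are placeholders)
        New-AzureQuickVM -Windows `
            -ServiceName "contosolabs01" `
            -Name "LabVM01" `
            -ImageName $image.ImageName `
            -AdminUsername "labadmin" `
            -Password "P@ssw0rd123!" `
            -InstanceSize "Small" `
            -Location "West US"

    Wrap those few lines in a small script or function and you can rebuild lab VMs on demand instead of keeping them around (and paying for them) between projects.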

    This is a great way to get started on Azure, solve a common challenge, and use a benefit you already have!

    To get started, visit http://aka.ms/mysupport to view your available advisory hours and submit a request.

    Also, if you want some broader training on Azure be sure to check out this learning path.

  • Migrate from Gmail to Office 365 (video)

    In a recent post (Migrate from Gmail to Office 365 in 7 steps), I shared the steps to migrate from Gmail to Office 365.

    If you are doing a large migration from Gmail to Office 365, you will generally want to use a 3rd party tool that automates the process.  However, if you are migrating a small customer with a few mailboxes, it is quick, easy (and free!) to do so manually.
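    If you would rather script it than click through the admin center, roughly the same manual process can be driven from Exchange Online PowerShell. Here is a minimal sketch, assuming you already have a remote PowerShell session connected to Exchange Online, the Office 365 mailboxes already exist, and you have a CSV of Gmail credentials; the endpoint name, batch name, and file path are placeholders.

        # Create an IMAP migration endpoint pointing at Gmail
        New-MigrationEndpoint -IMAP -Name "GmailEndpoint" `
            -RemoteServer imap.gmail.com -Port 993 -Security Ssl

        # Create and start a migration batch from a CSV of mailboxes
        # (CSV columns: EmailAddress, UserName, Password)
        New-MigrationBatch -Name "GmailBatch" `
            -SourceEndpoint "GmailEndpoint" `
            -CSVData ([System.IO.File]::ReadAllBytes("C:\Migration\gmail-users.csv")) `
            -AutoStart

        # Keep an eye on progress
        Get-MigrationBatch -Identity "GmailBatch" | Format-List Status, TotalCount, SyncedCount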

    As a picture is worth a thousand words (and a movie shows 30 pictures a second), I thought I would share the process in two short videos.

    As always… if you are a Partner and need assistance migrating mailboxes, ask our experts!

    Part 1

     

    Part 2

  • Virtual Drumbeat Office 365

    Come learn the proven best practices for selling Office 365, Virtually!

    The Virtual Drumbeat Sales Day, April 18th, provides partners in sales and pre-sales technical roles with best-practice sales training for Office 365.  Selling Office 365 requires a new way of selling; come hear about it.  In addition to sharing Microsoft best practices, programs, and selling tools, we will also present insights into the cloud services market and the opportunity for partners who invest in growing an Office 365 practice.

    You will also have the opportunity to interact and learn from your industry peers and representatives from Microsoft.

    Date: April 18th, 2014
    Time: 9:00 am – 5:00 pm PST

    WHO SHOULD ATTEND:

    Sales Professionals
    Pre-Sales Technical

    There is no charge for this exclusive training; however, we will impose a no-show fee of $39 (USD) if you register but do not cancel your registration within fourteen (14) business days before the start of the first event.

    Space is limited. RSVP today!

    Session Descriptions

    Session 01 The Office 365 Enterprise Partner Opportunity

    The new Office represents a once-in-a-generation shift in technology and a new era of partner opportunity.  Microsoft is at the forefront of the industry transformation to the cloud, and Office 365 is leading the charge.  Learn more about our investments in the new Office and how we have created new partner opportunities across the customer lifecycle.

    Session 02 Office 365 What to Sell

    Office 365 is Microsoft’s fastest-growing business ever, to the tune of $1 billion and counting.  And three out of four enterprise customers work with a partner to deploy their Office 365 service.  Are you one of these partners?  Learn more about the benefits of becoming a recognized Office 365 Cloud Deployment partner and what it takes to be one.

    Session 03 How to Sell Office 365

    Microsoft's Office 365 is built on a set of cloud principles that shape how we position Office 365 to customers.  Become familiar with these principles and learn how to showcase the value of Office 365 cloud services across a breadth of real customer scenarios.

    Session 04 Google Compete

    The proliferation of devices, broadening workplace demographics, and a transformative shift to the cloud are all trends impacting the way we work.  Office 365 clearly addresses all of these trends and is backed by a sales process that has helped grow a $1B business.  Learn how to sell to customers using the Customer Decision Framework, a sales process that enables partners to make the shift from traditional software selling to successfully selling Office 365 in the cloud.

    Session 05 Selling with the Customer Immersion Experience 

    The Microsoft Customer Immersion Experience (CIE) is a hands-on introduction to Windows 8 and the new Office.  For partners, it is an effective sales tool that provides customers with an opportunity to experience these powerful new productivity solutions for themselves.  Learn how the CIE simplifies customer conversations and provides business decision-makers with an opportunity to experience the full Office stack to accelerate sales and close revenue.

    Session 06 Pilot and Deploy Customers with Office 365 FastTrack

    Office 365 FastTrack is Microsoft’s new 3-step pilot and deployment process, designed so customers experience service value early in the sales cycle, with a smooth path from pilot to full deployment within hours and no 'throw-away' effort.  Learn how to use the Office 365 FastTrack process to get customers up and running quickly and win against the competition.

    Session 07 Office 365 Support and Communications

    Microsoft is strengthening its partner support and communications strategy to better enable our partners to sell, service and support customers.  Learn about new ways to enhance your service offerings and stay connected with the latest developments on Office 365.

    Register Here

  • Windows Store App Development Training Videos (Italian Versions)

    Recently we’ve released several short training videos on building Windows Store apps. As part of this process we’re now taking some of the most popular and offering them in additional languages. Today we’re excited to bring you the Italian series.

    Windows App Certification Kit

    Build a Live Tile for your Windows Store App in less than 10 minutes

    Use the Visual Studio templates to create Windows Store apps

    Tips and tricks for using notifications in Windows Store apps

    Create apps that adapt to different window sizes