GD Bloggers

This is the blog site for the Microsoft Global Delivery Communities, focused on sharing technical knowledge about devices, apps, and the cloud.

December, 2011

  • How to create Pin Authorization Normalization rule in Lync

    I have received many requests about how to create PIN authorization for specific users in Lync so they can dial international calls.

    Here’s a sample case scenario and print screen of the normalization rule that meets this case:

    - The user has to type a PIN before any dialed international number. The PIN is #9090, and it will be shared with the user by the administrator in a secured email.

    Scenario 1: The user wants to dial 0097466040129 and just types it directly. Notice the highlighted test result below: the call will not succeed because the number is not normalized:

    AST-007

    Scenario 2: The user has to type the number in the format #9090009746040129. Notice the highlighted test result below: the resulting number is +97466040129 and the call will succeed because it is normalized correctly.

    AST-006

    The above is just an example that can be tailored according to different environments.

    Notes:

     - The above normalization rule should be duplicated to cover PIN + E.164 typed numbers.

    - Another normalization rule might be required to catch the E.164 numbers typed directly by users to bypass the PIN enforcement; this rule would discard international numbers typed in E.164 format without the PIN.
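    For illustration only, the matching logic of such a rule can be sketched as a regular expression (Lync normalization rules are pattern/translation pairs); the PIN, international prefix, and number below are hypothetical and must be adapted to your own dial plan:

    ```python
    import re

    # Hypothetical rule: strip the PIN (#9090) and the international access
    # prefix (00), then prepend "+" to produce an E.164 number.
    PIN_RULE = re.compile(r"^#909000(\d+)$")

    def normalize(dialed):
        """Return the normalized E.164 number, or None when the rule does not match."""
        match = PIN_RULE.match(dialed)
        return "+" + match.group(1) if match else None

    print(normalize("#90900097466040129"))  # PIN + 00 + number: normalized
    print(normalize("0097466040129"))       # no PIN: rule does not match
    ```

    In the actual Lync dial plan, the pattern would go in the rule's "Pattern to match" box and `+$1` in the "Translation rule" box.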

    I hope it helps!

  • My Hello Azure Service Bus WCF Service–Step by step guide

    This is the first post in a series to take you through my learning experience with the Windows Azure Service Bus. The Service Bus is a concrete implementation of the service bus pattern designed to operate at Internet scope within highly scalable Microsoft data centers as a constituent of the Windows Azure platform. The Service Bus provides a federated identity and access control mechanism through Access Control, a federated naming system, a dynamic service registry, and a robust messaging fabric capable of overcoming the connectivity challenges of the internet and distributed services. The internet challenges that any distributed service platform faces are summarized in the diagram below.

    clip_image002[4]

    The way that the service bus overcomes these challenges is by implementing a relay service pattern as below.

    clip_image004[4]

    Here’s how it works: the on-premises service connects to the relay service through an outbound port and creates a bidirectional socket for communication tied to a particular rendezvous address. The client can then communicate with the on-premises service by sending messages to the relay service targeting the rendezvous address. The relay service will then “relay” messages to the on-premises service through the bidirectional socket already in place. The client does not need a direct connection to the on-premises service nor does it need to know where it resides. The on-premises service doesn’t need any inbound ports open on the firewall. This is how most instant messaging applications work today.

    A central component of the Service Bus messaging fabric is a centralized (but highly load-balanced) relay service that supports a variety of different transport protocols and Web services standards, including SOAP, WS-*, and even REST.  The relay service provides a variety of different relay connectivity options and can even help negotiate direct peer-to-peer connections when possible.

    So in this article I will show you how to develop your first hello service hosted on the service bus. The major steps can be summarized as below.

    Step 1: Create a new service bus namespace on the Azure service bus.

    Step 2: Implement the service and client.

    Step 3: Configure the service and client to use the required relay binding configuration.

    So as you can see, it is as easy as 1, 2, and 3. Or is it? :)

    Step 1: Create a new service bus namespace

    So you need to go to the URL https://windows.azure.com/, which will ask you either to log in using a Live ID that already has an active Azure subscription or to sign up. If you already have an account then just log on; if not, click sign up.

    clip_image006[4]

    That would lead you to the purchase screen of subscription options, study your options well and pick whatever type you want or (even better) click on the FREE trial.

    clip_image008[4]

    Then you will be asked to log in using an existing Live ID, and you will need to go through three steps as below:

    clip_image010[4]

    clip_image012[4]

    Then, in the third step, you will need to enter a credit card just as a form of verification, and voilà: you have a three-month trial subscription on Azure.

    Next, log on to your management portal using your Live ID at the URL https://windows.azure.com/. It should look something like this.

    clip_image014[4]

    Now to create a new namespace do the following steps.

    • Click on “Service Bus, Access Control & Caching” at the bottom left.

    • Then click on “Service Bus”.
    clip_image016[4]

    • Then click “New”.
    clip_image017[4]

    • Enter the namespace name (the portal will check whether it is available for you) and select the needed details as below:
    clip_image019[4]

    • Click “Create” and the namespace will be ready to use.

    • Now we need three details to get started: the service hosting URL, the issuer name, and the issuer secret. The service hosting URL depends on the namespace you created, so for the example shown above the URL would be: https://MyAzureHello.servicebus.windows.net.

    • To get the secret details, click on the created namespace.
    clip_image021[4]

    • Then scroll the right action pane all the way down and click “View” for the “Default Key”.
    clip_image023[4]

    • It will then give you the option to copy both the issuer name and the issuer secret to the clipboard. Do this one by one and keep this information where you can use it later.
    clip_image024[4]

    Step 2: Implement the service and client

    This is a straightforward step. Just create two console applications, one for the service host and another for the client, and create a service definition as below.

    [ServiceContract]
    public interface IHelloServiceBus
    {
        [OperationContract]
        string SayHello(string name);
    }

    Listing 1: Service Contract

    public class HelloServiceBus : IHelloServiceBus
    {
        public string SayHello(string name)
        {
            string greeting = string.Format("Hello {0}!", name);
            Console.WriteLine("Returning: {0}", greeting);
            return greeting;
        }
    }

    Listing 2: Service Implementation

    static void Main(string[] args)
    {
        Console.WriteLine("**** Service ****");
        ServiceHost host = new ServiceHost(typeof(HelloServiceBus));
        host.Open();

        Console.WriteLine("Press [Enter] to exit");
        Console.ReadLine();

        host.Close();
    }

    Listing 3: Host Implementation

    static void Main(string[] args)
    {
        Console.WriteLine("**** Client ****");
        Console.WriteLine("Press <Enter> to run client.");
        Console.ReadLine();
        Console.WriteLine("Starting.");

        ChannelFactory<IHelloServiceBus> channelFactory =
            new ChannelFactory<IHelloServiceBus>("webRelay");
        IHelloServiceBus channel = channelFactory.CreateChannel();

        for (int i = 0; i < 10; i++)
        {
            string response = channel.SayHello("Service Bus");
            Console.WriteLine(response);
        }

        channelFactory.Close();
    }

    Listing 4: Client Implementation

    Now we need to configure both the client and service to use the service bus bindings.

    Step 3: Configure the service and client to use the required relay binding configuration

    All you need to do is add the proper Azure service bus configuration. But before you do that: how would the service and client implementations know where to get the binding implementation from? You did not add any custom references, right? So here is how.

    • Make sure both the client and service are using the full .NET 4 profile (not the Client Profile).

    • Then, in Visual Studio, go to Tools and then Extension Manager:
    clip_image025[4]

    • Click on Online Gallery and search for “NuGet”.
    clip_image027

    • Click Download to install the NuGet Package Manager.

    • Close the Extension Manager.

    • Right-click the References node of the service project and click “Manage NuGet Packages”.
    clip_image028

    • Click Online.

    • Search for “Azure”, select “Windows Azure Service Bus”, and click “Install”.
    clip_image030

    • Install the same NuGet package into the client project.

    • Now that you have the Azure assemblies in place, you can change the service and client configurations as below.

    <system.serviceModel>
      <services>
        <service name="Service.HelloServiceBus">
          <endpoint address="https://momalek.servicebus.windows.net/helloservicebus" behaviorConfiguration="sharedSecretClientCredentials" binding="ws2007HttpRelayBinding" contract="Service.IHelloServiceBus"/>
        </service>
      </services>
      <behaviors>
        <endpointBehaviors>
          <behavior name="sharedSecretClientCredentials">
            <transportClientEndpointBehavior credentialType="SharedSecret">
              <clientCredentials>
                <sharedSecret issuerName="[Issuer name retrieved before]" issuerSecret="[issuer secret retrieved before]"/>
              </clientCredentials>
            </transportClientEndpointBehavior>
          </behavior>
        </endpointBehaviors>
      </behaviors>
    </system.serviceModel>

    Listing 5: Service Configuration

    <system.serviceModel>
      <client>
        <endpoint address="https://momalek.servicebus.windows.net/helloservicebus" behaviorConfiguration="sharedSecretClientCredentials" binding="ws2007HttpRelayBinding" contract="Service.IHelloServiceBus" name="webRelay"/>
      </client>
      <behaviors>
        <endpointBehaviors>
          <behavior name="sharedSecretClientCredentials">
            <transportClientEndpointBehavior credentialType="SharedSecret">
              <clientCredentials>
                <sharedSecret issuerName="[Issuer name retrieved before]" issuerSecret="[issuer secret retrieved before]"/>
              </clientCredentials>
            </transportClientEndpointBehavior>
          </behavior>
        </endpointBehaviors>
      </behaviors>
    </system.serviceModel>

    Listing 6: Client Configuration

    Now you are ready to start the service (wait for it to start properly, as this takes some time: a couple of minutes or so) and then the client, and watch them communicate through the Azure service bus.

    Final Notes

    During this exercise I tried many bindings and I must say that the most reliable one I used was the “ws2007HttpRelayBinding” (I mean reliable from the perspective of being able to start hosting the service with no problems).

    Hosting a service behind a proxy (especially a proxy that requires authentication) is not supported and does not work. Check this URL: http://msdn.microsoft.com/en-us/library/windowsazure/ee706729.aspx.

  • Visual Round Trip Analyzer For SharePoint administrator

    Visual Round Trip Analyzer (VRTA) is a tool that helps the SharePoint Administrator identify
    what is being downloaded at a web page level.

    One of the biggest complaints from users is the response time.

     

    VRTA excels in showing the network round trip relationship between the client and the server.

    This is also critical to the well-being of a farm. While an administrator can optimize the server
    response, there are several other parties that can inadvertently be working against this:

    • Web developers: These folks create the HTML, CSS, and stylesheets.
    • End users: They load content such as images, which directly hampers performance.
    • Application developers: These folks load JavaScript, jQuery, and now have the client
      object model at their disposal.


    All of these listed parties create solutions using SharePoint Designer, Notepad, and possibly
    Visual Studio, and the administrator would have no knowledge of this. But in the end, the
    administrator is the person who will get the support call.

    Using VRTA, the administrator can identify the bottlenecks and involve the right parties.

     

    You must have VRTA loaded on a PC (free download from the Microsoft Download Center).
    Netmon 3.4, also a free download, needs to be loaded on the PC. These tools should not be
    run on servers but on local machines. No special permissions are needed and it can be run
    against a public site.

    VRTA uses Microsoft Network Monitor 3.4 packet analyzer as its foundation. Visually, it shows
    files and packets, along with the round trip information that occurs between a client
    and server.

    When evaluating page loads, several factors should be taken into account:

    • Distance: The round trip
    • Number of round trips
    • Images on a home page
    • Files that need to be downloaded (CSS, JavaScript, and so on)

     

    image

    Using the four tabs, Main Chart, Statistics, All Files, and Analysis, the data the page is
    retrieving and loading can be seen in detail. In the preceding screenshot, every file that is
    loaded shows how long it took to load, the port, the file type, a status code, and the size.


    The administrator can observe the assets that are being used
    and be able to offer recommendations such as creating a sprite instead of loading each
    individual image, or combining JavaScript files. Hovering over each detail item will present
    further detail on the individual asset.


    VRTA also has an Analysis tab that acts as a best practice guide. It grades the files and page
    on several basic factors such as an average file size rule, white spaces rule, and image
    clustering rule. Using a color-coded scheme, it makes recommendations to help you
    improve performance.


    Finally, every time a recording is made, it is saved by default to a directory whose path can
    be seen in the title bar of the VRTA application.

  • TFS 2010 in Practice –Import Upgrade from 2005/2008

    Introduction

    Hello again. In the first part we gave an introduction to TFS 2010 and walked through an installation using a dual-server topology. In this post, we will talk about migrating existing TFS databases onto the TFS 2010 platform. Let's start!

    First, before getting into the details of an import upgrade, I would like to mention other scenarios you might encounter:

    • Migrating sources from 3rd-party servers such as IBM Rational ClearCase or ClearQuest, Perforce, HP Quality Center, or Subversion: you can use the TFS Integration Tool (1), developed by the MS TFS product group and the ALM Rangers. In fact, the tool can be used for synchronization between those platforms as well. It can also be used for the import upgrade scenario defined below. Although I have not tried this, it is very promising. For more details please see reference (1).
    • In-Place Upgrade: simply upgrading from an old version of TFS (2005/2008) to 2010 on the same box by selecting Upgrade in the TFS installation wizard. No other tool is needed. For details, please see the first part of this series.

    How-To

    So, here we have TFS 2010 up and running, and we now want to import the content of a TFS 2008 server (this applies to TFS 2005 as well). One of the biggest differences between 2010 and the old versions is that 2010 is based on Team Project Collections (TPCs): one collection maps to a single database and may contain zero or more team projects, independent of other TPCs.

    Steps:

    1. Inform your team members about the process and schedule.
    2. Back up all TFS databases (both the source and the destination databases).
    3. RDC to the TFS server (where TFS 2010 is installed) and make sure the account has permissions on both the source and destination databases. I used the tfs.setup account, which is a local administrator on the TFS 2010 machine and has access rights to the databases. For details, again please look at the first part.
    4. Run a Command Prompt with administrative privileges (right-click Command Prompt and choose "Run as administrator").
    5. Tfsconfig.exe is located in the “%programfiles%\Microsoft Team Foundation Server 2010\Tools” directory, so cd to it in the prompt window.
    6. Type “Tfsconfig import /sqlinstance:<FromDatabaseServerInstance\FromDatabase> /collectionName:<CollectionName> /confirmed”. This will take some time depending on the network and the size of the source database content; in my case it took 45-50 minutes. If you end with 0 errors/warnings, the import succeeded!
    7. Configure the portal (on SharePoint Foundation) and reports.
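    As a concrete sketch of steps 5 and 6, the commands might look like the following; the SQL instance name (SQLSRV01\Tfs2008) and collection name (LegacyProjects) are placeholders you must replace with your own values:

    ```
    cd /d "%programfiles%\Microsoft Team Foundation Server 2010\Tools"
    TfsConfig.exe import /sqlinstance:SQLSRV01\Tfs2008 /collectionName:LegacyProjects /confirmed
    ```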

    clip_image002

    Picture 1: Arguments of tfsconfig.exe

    Data Model in TFS 2010

    Databases used by TFS 2010 with some notes

    • ReportServer
    • ReportServerTempDB
    • SharePoint_AdminContent_#: Please remember that SharePoint Foundation 2010 is used instead of WSS 3.0
    • SharePoint_Config: portal configuration
    • Tfs_Configuration: central database that includes the list of TPCs
    • Tfs_Warehouse: where data about team project collections is stored and optimized for reports
    • Tfs_DefaultCollection: Again, 2010 creates a database for each TPC that stores collection-related data (source, work items, builds, etc.).
    • WSS_Content
    • WSS_Search

    clip_image004

    Picture 2: TFS 2010 SQL Database, Tfs_Warehouse

    Health Check

    After importing, it would be wise to re-run the Best Practices Analyzer, just as you did before the import process. To do so:

    1. Install Microsoft Team Foundation Server Power Tool (2). You should have this tool not just for server analysis, but for TFS Backups configuration and scheduling, Custom Check-in policies, Team Explorer enhancements, Team Members collaboration, etc.
    2. Start > Programs > Microsoft Team Foundation Server Power Tool > Team Foundation Server Best Practices Analyzer

    clip_image006

    Picture 3: TFS 2010 Best Practices Analyzer running

    Conclusion

    In this second part, we have executed a scenario of importing TFS 2005/2008 databases into TFS 2010 Server by using tfsconfig command. Hope you liked it.

    References

    1. TFS Integration Platform
    2. Team Foundation Server Power Tools
    3. Team Foundation Server – Migration and Integration
    4. Moving Team Foundation Server
  • Implementing a message pass-through WCF Behaviour (Router WCF service)

    I recently came across a requirement to implement a Windows Communication Foundation service that acts as a message router across several other WCF backend services. The router service's main functionality is to encapsulate the backend services and to add automatic request monitoring. This should not be hard, I was thinking, but there are a couple of twists: first, the router service should not be sensitive to the services themselves, how they change, or how they communicate with their clients. It will also be handling encrypted messages that it cannot decrypt, as these messages belong to other partners that we do not control and, of course, we do not have the certificate used for encryption.

    The way this is done is by implementing a custom WCF dispatch message inspector and a custom WCF client message inspector behaviour. Why do we need both? Because our service acts as a WCF service to all the clients and as a client to all the backend services. The main idea is this: when a message is received over the wire, the entire message body is escaped and placed in a custom header element, and before the message is sent to the backend the original message body is extracted from the header and written back to the outgoing message. When the response is received from the backend service, the same logic happens in reverse order: the reply is written to the custom header as an escaped string, and then, before the reply is delivered to the original caller, it is read from the header and sent to the client. Simple, right? :) Well, it gets even better: since we want the original message to arrive on the wire unparsed and unchanged, this has to be done using a custom message encoder in both directions, receiving requests from the clients and receiving responses from the backend services. To try to make this simple, I put together the following diagram.

    clip_image002[4]

    So you need to implement a custom message encoder inheriting from the class “MessageEncoder” and mainly implement the method “ReadMessage” to be as follows.

            public override Message ReadMessage(ArraySegment<byte> buffer, BufferManager bufferManager, string contentType)
            {
                byte[] msgContents = new byte[buffer.Count];
                Array.Copy(buffer.Array, buffer.Offset, msgContents, 0, msgContents.Length);
                bufferManager.ReturnBuffer(buffer.Array);

                MemoryStream stream = new MemoryStream(msgContents);
                Message message = ReadMessage(stream, int.MaxValue);

                string ns = "http://Somenamespace";

                // Preserve the raw wire message in a custom header for later re-injection.
                MessageHeader header = MessageHeader.CreateHeader("OriginalFullMessage", ns, UTF8Encoding.UTF8.GetString(msgContents));
                message.Headers.Add(header);

                return message;
            }

    Next, you need to implement a custom class that implements “IDispatchMessageInspector” along with “IEndpointBehavior”, whose “ApplyDispatchBehavior” method adds this dispatch inspector to the endpoint as follows.

            public void ApplyDispatchBehavior(ServiceEndpoint endpoint, EndpointDispatcher endpointDispatcher)
            {
                endpointDispatcher.DispatchRuntime.MessageInspectors.Add(this);
                endpointDispatcher.AddressFilter = new System.ServiceModel.Dispatcher.MatchAllMessageFilter();
            }

    Then you need to implement the method “BeforeSendReply” to put the original reply back on the wire as follows.

            public void BeforeSendReply(ref System.ServiceModel.Channels.Message reply, object correlationState)
            {
                string origMsg = OperationContext.Current.RequestContext.RequestMessage.ToString();
                string fullMessageHeader = "OriginalFullMessage";
                string ns = "http://Somenamespace";
                int fullMessageHeaderIndex = reply.Headers.FindHeader(fullMessageHeader, ns);
                if (fullMessageHeaderIndex >= 0)
                {
                    origMsg = UnescapeXml((reply.Headers.GetHeader<string>(fullMessageHeaderIndex)));
                }

                Message newreply = Message.CreateMessage(MessageVersion.None, reply.Headers.Action, new SimpleMessageBody(origMsg));

                newreply.Headers.To = reply.Headers.To;
                reply = newreply;
                return;
            }

    You will then need to implement the method “BeforeSendRequest” from the interface “IClientMessageInspector” to put the original request back on the wire before it is sent to the backend services.

            public object BeforeSendRequest(ref System.ServiceModel.Channels.Message request, System.ServiceModel.IClientChannel channel)
            {
                string origMsg = OperationContext.Current.RequestContext.RequestMessage.ToString();
                string fullMessageHeader = "OriginalFullMessage";
                string ns = "http://Somenamespace";
                int fullMessageHeaderIndex = request.Headers.FindHeader(fullMessageHeader, ns);
                if (fullMessageHeaderIndex >= 0)
                {
                    origMsg = UnescapeXml((request.Headers.GetHeader<string>(fullMessageHeaderIndex)));
                }

                Message newRequest = Message.CreateMessage(MessageVersion.None, request.Headers.Action, new SimpleMessageBody(origMsg));

                newRequest.Headers.To = request.Headers.To;
                request = newRequest;

                return null;
            }

     

    And that’s it! :)

  • Streamed XPath Extraction using hidden BizTalk class XPathReader

    Usually, when writing custom BizTalk pipeline components, you find yourself wanting to extract specific values from the message being passed using XPath statements.

    You can do this with either XPathDocument or XDocument, but those approaches require loading the entire XML document into memory, which may not be possible if the file is huge. They also make the pipeline component slower. The alternative is to use a streaming class such as XmlReader, but that would be too much work, right?

    The solution comes in the form of a hidden gem among the installed BizTalk components, called the XPathReader. This is a stream-based class that searches for a node or element using a given set of XPath strings.

    This class is defined in the assembly Microsoft.BizTalk.XPathReader.dll deployed to the GAC. You need to add a reference to this assembly first and then use the class as below.

                MsgStream.Seek(0, SeekOrigin.Begin);
                XmlReader reader = XmlReader.Create(MsgStream, settings);
                string strValue = null;
                if (!string.IsNullOrEmpty(MsgXPath))
                {
                    XPathCollection xPathCollection = new XPathCollection();
                    XPathReader xPathReader = new XPathReader(reader, xPathCollection);
                    xPathCollection.Add(MsgXPath);
                    if (xPathReader.ReadUntilMatch())
                    {
                        if (xPathReader.Match(0))
                        {
                            strValue = xPathReader.ReadString();
                        }
                    }
                    MsgStream.Seek(0, SeekOrigin.Begin);
                }

    Here MsgStream is a seekable stream obtained from the message.

  • Installing and configuring a 4-server SharePoint farm–a successful order


    Below is the order I followed recently to install and configure a 4-server SharePoint Farm.

     

    1. Ran PrerequisitesInstaller.exe on Application Server 01 – installer downloaded and installed prerequisites
    2. Installed SharePoint Server 2010
    3. Ran the Configuration Wizard – wizard completed (created the farm)
    4. Ran PrerequisitesInstaller.exe on Application Server 02 – installer downloaded and installed prerequisites
    5. Installed SharePoint Server 2010
    6. Ran the Configuration Wizard – wizard completed (added the server to the farm and chose for this app server to host Central Administration, for redundancy)
    7. Ran PrerequisitesInstaller.exe on Web front-end Server 01 – installer downloaded and installed prerequisites
    8. Installed SharePoint Server 2010
    9. Ran the Configuration Wizard – wizard completed (added the server to the farm )
    10. Ran PrerequisitesInstaller.exe on Web front-end Server 02 – installer downloaded and installed prerequisites
    11. Installed SharePoint Server 2010
    12. Ran the Configuration Wizard – wizard completed (added the server to the farm )
    13. Installed a SharePoint Foundation Language Pack on all servers
    14. Installed a SharePoint Server Language Pack on all servers
    15. Ran the Configuration Wizard on all servers in order (App01, App02, WFE01,WFE02)
    16. For each server in the farm and in the order (App01,App02,WFE01,WFE02)
      • Installed SharePoint Foundation 2010 Service Pack 1
      • Installed Service Pack 1 for SharePoint Foundation 2010 Language Pack
      • Installed Dec 2011 CU for SharePoint Foundation 2010*
      • Installed SharePoint Server 2010 Service Pack 1
      • Installed Service Pack 1 for Server Language Pack 2010
      • Installed Dec 2011 CU for SharePoint Server 2010
    17. Ran the Configuration Wizard on all servers in order (App01, App02, WFE01,WFE02)

    * Update: I was discussing the order of installation with a colleague and got the news that the Dec 2011 CU hotfix package (2597014) is "the full server package for SharePoint Server 2010 and contains also the MSF2010 [SharePoint Foundation] fixes so you need only this one package" - so there is no need to apply the CU for SharePoint Foundation separately.

    Good luck!

  • Securing WCF Services with Custom WIF STS: A Step-By-Step guide

    In a real SOA implementation, you will probably be exposing many WCF services that you need to secure. Many blogs cover STS and WCF, but none of them guides you through a basic implementation of a custom STS using Windows Identity Foundation (WIF) to secure your WCF services. If you are just starting with STS/WIF, or you have spent some time trying to implement a basic STS with no luck, this blog is just for you. I assume you are already familiar with what an STS is as an authentication mechanism, with WCF in general, and with Visual Studio. When you finish all the steps, you should have an STS, a console client, and a WCF service that uses the STS for authentication.

    I've added the images to allow you to follow easily with the steps. 

    Pre-requisites:

    Environment Description and preparing certificates

    This demonstration assumes that you will code on only one machine, which will contain the console client, the service, and the custom STS. I use a machine named VS2010 in the domain contoso.com.

    You will need to prepare a single certificate that will be used for signing and encrypting the STS tokens. To do so, create a self-signed certificate from IIS, then use this certificate to enable SSL on the IIS default website.
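    As an alternative to the IIS UI, one way to create such a self-signed certificate is with the makecert tool from the Windows SDK; the subject name below is an assumption based on the machine name used in this walkthrough:

    ```
    makecert -r -pe -n "CN=vs2010.contoso.com" -sky exchange -ss my -sr localmachine
    ```

    This places an exportable self-signed certificate in the LocalMachine\My store, where the STS and IIS can find it.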

    Steps to create your custom STS:

    • Create Custom STS with Visual Studio
    • Creating claims aware WCF Service
    • Secure WCF Service with STS
    • Update the STS to use specific certificates for encryption and signing
    • Create and run console application

    Create Custom STS with Visual Studio

      • Create an empty VS2010 solution. Name it “STSDemoSolution”
      • Right-click solution node in the solution explorer and choose “Add-->New Web Site”.
     
    • In the “Add New Web Site” dialog window, select “WCF Security Token Service” as the project type. Then, in the “Web location” drop-down, select “http” and in the text box enter an address on the development IIS where you want to put your service. In the given example, the address “http://vs2010.contoso.com/STSDemo” uses the FQDN of the local development machine. Finally, press “OK”.

    • Compile the solution and make sure that it runs successfully.
    • Open the “web.config” of the “DemoSTS” project and modify the “ws2007HttpBinding” to make the message security use “Windows” authentication. It should look like the following…
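    The original screenshot is unavailable, so here is a sketch of what the modified binding section might look like; verify the element and attribute names against the web.config generated by your own template:

    ```
    <bindings>
      <ws2007HttpBinding>
        <binding>
          <security mode="Message">
            <message clientCredentialType="Windows" />
          </security>
        </binding>
      </ws2007HttpBinding>
    </bindings>
    ```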

    This step is critical. If you skip it, your STS will probably consider only anonymous users for authentication.

    • To ensure that your STS is working, browse to the folder “2007-06”, click on the “FederationMetaData.xml” XML file and click “View in Browser”.
       

    You should be able to view the content of that XML file in your browser. This ensures that your STS is created successfully.

     

    Creating Claims Aware WCF Service

    • Right-click the solution node in the solution explorer and choose “Add”-->“New Web Site”.

    • In the “Add New Web Site” dialog window, select “Claims-aware WCF Service” as the project type. Then, in the “Web location” drop-down, select “http” and in the text box enter an address on the development IIS where you want to put your service. In the given example, “vs2010.contoso.com” is the FQDN of the local development machine. Finally, press “OK”.

    • Your project should look like this

     

     

     

    Secure WCF Service using STS

    Now we will configure the WCF service to use the STS for security.
    1. Right-click the project “Secure WCF Service” and click “Add STS Reference”
     

    2. In the first screen, leave all defaults and click “Next”.

     

     


     
    3. In the second screen, leave all defaults as well and click “Next”.

     


    4. In the next screen, select “Use an Existing STS”.
    5. In the “STS WS-Federation Metadata document location” box, type the address of the STS Federation metadata file. It should be something like “http://vs2010.contoso.com/DemoSTS/FederationMetaData/2007-06/FederationMetaData.xml”. Then click Next.

     


    6. Click “Enable Encryption”, then click “Select an Existing Certificate from Store”


     
    7. Click “OK” when the certificate is selected then click “Next”
    8. In the next Screen click “Next”
    9. In the final screen click “Finish”. Make sure that the checkbox “Schedule task to perform…” is unchecked.


     

     

    Update STS to use specific certificates for signing and encryption

    1. Open the “web.config” file of the “DemoSTS” project
    2. Modify the “appSettings” section to specify the certificates that you want to use for the encryption and signing. After you modify it, it should look something like this…
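    A sketch of the result, assuming the key names from the WIF STS template; the certificate subject names here are illustrative only:

```xml
<appSettings>
  <add key="IssuerName" value="DemoSTS" />
  <!-- Subject names of certificates installed in the local machine store;
       replace with your own. The same certificate is used for both here,
       which is fine for a demo but not for production. -->
  <add key="SigningCertificateName" value="CN=vs2010.contoso.com" />
  <add key="EncryptingCertificateName" value="CN=vs2010.contoso.com" />
</appSettings>
```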


    Note: In the above example, I use the same certificate for both signing and encryption. In real-life scenarios, those certificates should be different.

    3. Save the “web.config” file and close it.
    4. Right-click the project “SecureWCFService” and click “Update Federation Meta Data”

    Create a console application and modify the WCF service to list all claims in the STS token

    Now, we will create a console application and modify the service code to read the claims in the STS token
    1. Now add a simple console application to the same solution so that its structure looks like this:

     

     
    2. Right-click “DemoConsole”, the console application project you’ve just added, and select “Add Service Reference”:

     



    3. In the “Add Service Reference” dialog window, click “Discover”; the “Address” text box will be populated with the address of the WCF service you created. Double-click the service name in the “Services” list box (in the picture below it is “SecureWCFService/Service.svc”), then click “OK”:

     

     

    4. Now test that the client and service actually work by populating the “Main” method of the console application’s “Program” class with the following code:
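    A minimal sketch of what that “Main” method might look like; the proxy type name depends on the service reference name you chose (“ServiceReference1” is an assumption here):

```csharp
using System;

namespace DemoConsole
{
    class Program
    {
        static void Main(string[] args)
        {
            // "ServiceReference1" is whatever name you gave the service reference.
            var client = new ServiceReference1.ServiceClient();

            // Calling the service makes WIF request a token from the STS first,
            // then invoke the operation with that token attached.
            Console.WriteLine(client.GetData(1));

            client.Close();
            Console.ReadLine();
        }
    }
}
```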


     
    5. Modify the “GetData” function in the service code to list all claims as well. The code should look like the following…
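    A sketch of such a “GetData”, assuming the WIF (Microsoft.IdentityModel) claims model used by the template:

```csharp
using System;
using System.Text;
using System.Threading;
using Microsoft.IdentityModel.Claims;

public string GetData(int value)
{
    // In a claims-aware service, WIF places an IClaimsIdentity
    // on the current thread principal.
    var identity = Thread.CurrentPrincipal.Identity as IClaimsIdentity;
    if (identity == null)
        return "No claims identity found.";

    // List every claim the STS put into the token.
    var sb = new StringBuilder();
    sb.AppendFormat("You entered: {0}", value).AppendLine();
    foreach (Claim claim in identity.Claims)
    {
        sb.AppendFormat("{0} : {1}", claim.ClaimType, claim.Value).AppendLine();
    }
    return sb.ToString();
}
```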


     
    6. Now, compile everything.
    7. Set the console application as the start-up project and press F5. You should receive the following console prompt after a while:

     

     

     Happy Coding:)

     

  • WSPAD file is in bad format

     

     

    I’ve been configuring TMG automatic client proxy detection and it was failing. After researching it online and internally, I didn’t find anything useful.

    So I installed the TMG client and ran this test:

    C:\Program Files (x86)\Forefront TMG Client>FwcTool.exe testautodetect
    FwcTool version 7.0.7734.100
    Forefront TMG Client support tool
    Copyright (c) Microsoft Corporation. All rights reserved.
    Action: Test the auto detection mechanism
    Type: Default
    Detection details:
    Timeout is set to 60 seconds
    Locating WSPAD URL on the Active Directory server
    WSPAD object was found in the global Active Directory container
    WSPAD URL found on the Active Directory server:
    http://XXX.YYY.com:8080/wpad.dat

    Initializing Web server connection
    Resolving IP addresses for XXX.YYY.com
    Resolved 1 address(es):
    10.0.110.110
    Connecting to address #1: 10.0.110.110:8080
    Waiting for address #1 to connect
    Address #1 successfully connected
    Requesting wspad.dat file
    Web server is connected and ready to send WSPAD file
    Downloading WSPAD file
    WSPAD file was downloaded successfully
    WSPAD file is in bad format
    Failed to detect Forefront TMG
    Result: The command failed and was not completed.

     

    So to fix that, I changed the SCP from wpad to wspad using adsiedit (or you can use the tmgadconfig tool again with the –f switch).

     

    Hope this helps!

    M

  • Operation Tasks for Monitoring & Maintaining SharePoint environment

     

     

    Below are the operational tasks which should be done on a regular basis. There is still work to be done on deciding who will perform each task.

    Daily Tasks

    The following tasks should all be carried out on a daily basis:

    • Backup SQL Server Database
    • Backup Web Front End Servers
    • Backup MOSS Index
    • View Windows Event Log for Errors

    Weekly Tasks

    These tasks should be carried out every week:

    • View/Review custom reports – these are the SQL Reporting Services reports which show solution usage. These will help from a capacity planning perspective
    • View/Review search usage reports – these are the SSP search report
    • View/Review Profile import log
    • View/Review Crawl logs

    Monthly Tasks

    Take care of these tasks once a month:

    • Defragment SQL Indexes
    • Monitor SharePoint Server’s free disk space – this is to ensure enough disk space as index grows
    • Monitor SQL SAN Storage – this is to pre-empt the need to order more disk space
    • Manage best bet results and keywords
    • Review and monitor usage reports

    As Needed

    The following tasks should be performed as necessary; however, they are frequently also covered by standard procedures:

    • Configure Search settings for more relevant results, e.g. thesaurus, keywords, adjusting the global advanced search page, etc.
    • Configure/Maintain Search Scopes
    • Configure/Maintain Content Sources
    • Configure/Maintain Indexing Schedules
    • Configure Profile Properties
    • Configure Profile Importing Schedule and settings
    • Configure Audiences
    • Manage the Global Navigation
    • Manage Administrative Permissions
    • Manage file size limits
    • Manage site quotas
    • Manage usage confirmation settings
    • Recover content (content depends on reason for recovery)
    • Change service account passwords in SharePoint
    • Disseminate general SharePoint info
    • Monitor the size of content databases and create new ones as required

    Additional Tasks (may be farmed to a Server Operations Group)

    The following tasks are general server-based tasks that may be executed by a server operations team (or other) and are not necessarily specific to SharePoint Server services.

    • Manages the Windows Server operating system
    • Manages system security
    • System patching and upgrades
    • Initial install and configuration of SharePoint
    • Deploy custom developed and third-party SharePoint solution packages
    • Hardware installation, maintenance, and support
    • Network support
    • Synchronization of portal with AD
    • Approve and activate custom developed and third-party SharePoint solution packages
    • Review and monitor System Center performance levels
  • A Crash Course in Optimizing SQL Server for SharePoint

        • Do not enable auto-create statistics on a SQL Server that is supporting SharePoint Server
        • To ensure optimal performance, it is recommended that you set max degree of parallelism (MAXDOP) to 1
        • To improve ease of maintenance, configure SQL Server connection aliases for each database server in your farm
        • Create a secondary FILEGROUP for each database and mark it as DEFAULT*
        • Only create files in the primary file group for the content database
        • Spread database files on separate disks
        • Use RAID 1 or RAID 10 when possible
        • For collaboration or update-intensive sites, use the following ranking for storage distribution: http://technet.microsoft.com/en-us/library/hh292622.aspx
          1. tempdb data files and transaction logs on the fastest disks
          2. Content database transaction log files
          3. Search databases, except for the Search administration database
          4. Content database data files
        • In a heavily read-oriented portal site, prioritize data and search over transaction logs as follows: http://technet.microsoft.com/en-us/library/hh292622.aspx
          1. tempdb data files and transaction logs on the fastest disks
          2. Content database data files
          3. Search databases, except for the Search administration database
          4. Content database transaction log files
        • Set the autogrow property of database files to a percentage. A general rule of thumb you can use for testing is to set your autogrow setting to about one-eighth the size of the file. http://support.microsoft.com/kb/315512
        • Rebuild indices daily
        • Limit a content database size to 200GB

    * updated the post as per the guidance from the technet article http://technet.microsoft.com/en-us/library/cc298801.aspx (Storage and SQL Server capacity planning and configuration)
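    The MAXDOP recommendation above can be applied with sp_configure; a sketch (run against the SQL Server instance hosting the SharePoint databases):

```sql
-- 'max degree of parallelism' is an advanced option, so expose it first.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE WITH OVERRIDE;

-- SharePoint expects MAXDOP = 1 on its SQL Server instance.
EXEC sp_configure 'max degree of parallelism', 1;
RECONFIGURE WITH OVERRIDE;
```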

     

  • Cannot Login to SQL Server using administrator account

    After installing SQL Server on a machine, it can happen that you later join that machine to a domain or remove it from one. When you do this, the administrator account can no longer access the database engine. The steps below allow you to regain access to the SQL Server...

    1. Stop the SQL service: on the command line, type: net stop MSSQLServer
    2. Start SQL Server in single-user mode: on the command line, type: net start MSSQLServer /m
    3. Open SQL Server Management Studio and cancel the login dialog
    4. Open a new database engine query window: from the menu, click File -> New -> Database Engine Query
    5. Enable the sa account if it is not enabled: in the query window, type: ALTER LOGIN sa ENABLE
    6. Set the password of the sa account: ALTER LOGIN sa WITH PASSWORD = 'my password'
    7. Stop the SQL service from the command line: net stop MSSQLServer
    8. Start the SQL service from the command line: net start MSSQLServer
    9. Start SQL Server Management Studio and connect to the server using the sa account
    10. Add your domain administrator as sysadmin
    11. Disable the sa account when you finish
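    The same sequence can be scripted from an elevated command prompt with sqlcmd instead of Management Studio; a sketch, with the password as a placeholder (in single-user mode, a member of the local Administrators group connects as sysadmin):

```
net stop MSSQLSERVER
net start MSSQLSERVER /m"SQLCMD"

rem Enable sa and reset its password while connected via Windows auth.
sqlcmd -S . -E -Q "ALTER LOGIN sa ENABLE; ALTER LOGIN sa WITH PASSWORD = 'StrongP@ssw0rd';"

rem Return to normal multi-user mode.
net stop MSSQLSERVER
net start MSSQLSERVER
```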