• Writing a custom HostApps ESB adapter provider to integrate ESB toolkit with a Mainframe

    The adapter provider framework is how the ESB Toolkit configures off-ramps to send requests through dynamic send ports. An adapter provider is an implementation of the abstract base class BaseAdapterProvider. It is simply a component, executed as part of the send pipeline, that adds the context properties required by the adapter on the dynamic send port before the message is sent.
    The HostApps adapter is BizTalk's way to communicate with mainframe programs using CICS integration. To use this adapter with the BizTalk ESB Toolkit, you need to enable dynamic ports to send requests through it, and the ESB way to do that is to implement a custom adapter provider.
    The steps to implement a custom adapter provider are simple. The following are the steps required to implement a HostApps adapter provider:
    1. Create a new C# class library project and sign the assembly with a strong name.
    2. Add references to the following DLLs:

    a.      Microsoft.BizTalk.Messaging.dll

    b.      Microsoft.BizTalk.Pipeline.dll

    c.      Microsoft.Practices.ESB.Adapter.dll

    d.      Microsoft.Practices.ESB.Itinerary.Pipelines.dll

    e.      Microsoft.XLANGs.BaseTypes.dll

    3. Add a new class with any name and inherit from BaseAdapterProvider.
    4. Override the public properties AdapterName and AdapterContextPropertyNamespace to return a name for your adapter provider. The following is an example for the HostApps adapter provider.

     

            public override string AdapterName
            {
                get
                {
                    return "HostApps";
                }
            }
     
            public override string AdapterContextPropertyNamespace
            {
                get
                {
                    return "HostApps";
                }
            }

     

    5. Override the method SetEndpoint and add the context properties required by the adapter. For the HostApps adapter you simply need to write an undocumented context property called "AdapterConfig" containing all the required settings, including the assembly mappings. The following is an example of the full configuration to set in this property:


    <Config xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
      <uri>Connection String</uri>
      <PersistentConnections>Yes</PersistentConnections>
      <Transactions>Yes</Transactions>
      <SecurityOverrides>No</SecurityOverrides>
      <Tracing>No</Tracing>
      <AdvancedOverrides>No</AdvancedOverrides>
      <AssemblyMappings>&lt;?xml version="1.0" encoding="utf-8"?&gt;&lt;mappings&gt;&lt;mapping&gt;&lt;assembly&gt;&lt;![CDATA[C:\Program Files\Microsoft Host Integration Server 2013 SDK v1.0\ApplicationIntegration\WindowsInitiated\InstallationVerification\TiHostDefinitions\NetClnt1\bin\NetClnt1.DLL]]&gt;&lt;/assembly&gt;&lt;connectionString&gt;&lt;![CDATA[CodePage=37;Name=TrmLink;TimeOut=0;SecurityFromClientContext=False;IPAddress=127.0.0.1;TCPPorts=7508;ConcurrentServerTransactionId=MSCS]]&gt;&lt;/connectionString&gt;&lt;/mapping&gt;&lt;/mappings&gt;</AssemblyMappings>
    </Config>
    6. You can gather these properties any way you want, but below is a full example of how to implement this.

     

            public override void SetEndpoint(Dictionary<string, string> resolverDictionary, Microsoft.BizTalk.Message.Interop.IBaseMessageContext pipelineContext)
            {
                try
                {
                    if (resolverDictionary == null)
                        throw new ArgumentNullException("resolverDictionary");
                    if (pipelineContext == null)
                        throw new ArgumentNullException("pipelineContext");
                    // Set the end point
                    base.SetEndpoint(resolverDictionary, pipelineContext);
                    // Get the endpoint configuration
                    string endpointConfig = resolverDictionary["Resolver.EndpointConfig"];
                    // Split the values.
                    // The endpoint config is in the format "key1=val1&key2=val2&...".
                    // We use '&' instead of ';' because the AssemblyMappings value itself contains ';'.
                    // The assembly mapping will be the last element.
     
                    Config cng = new Config();
                    string elemname, elemvalue;
                    int i;
                    var configelements = endpointConfig.Split('&');
                    foreach (string elem in configelements)
                    {
                        i = elem.IndexOf('=');
                        if (i == -1) continue;
                        elemname = elem.Substring(0, i).Trim();
                        elemvalue = elem.Substring(i + 1).Trim();
                        switch (elemname)
                        {
                            case "Uri":
                                cng.uri = elemvalue;
                                break;
                            case "PersistentConnections":
                                cng.PersistentConnections = elemvalue;
                                break;
                            case "Transactions":
                                cng.Transactions = elemvalue;
                                break;
                            case "SecurityOverrides":
                                cng.SecurityOverrides = elemvalue;
                                break;
                            case "AdvancedOverrides":
                                cng.AdvancedOverrides = elemvalue;
                                break;
                            case "AssemblyMappings":
                                cng.AssemblyMappings = elemvalue;
                                break;
                            case "Tracing":
                                cng.Tracing = elemvalue;
                                break;
                        }
                    }
     
                    StringBuilder Builder = new StringBuilder();
                    StringWriter Writer = new StringWriter(Builder);
                    XmlSerializer serializer = new XmlSerializer(cng.GetType());
                    serializer.Serialize(Writer, cng);
                    Writer.Flush();
                    string configStr = Builder.ToString();
                    string AdapterConfigNamespace = "http://microsoft.com/HostApplications/TI/WIP/Properties";
     
                    pipelineContext.Write("AdapterConfig", AdapterConfigNamespace, configStr);
                }
                catch (Exception ex)
                {
                    // Don't swallow failures silently; let the messaging engine handle them.
                    throw new InvalidOperationException("Failed to set HostApps endpoint configuration.", ex);
                }
            }
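    The Resolver.EndpointConfig string consumed above is a flat list of key=value pairs joined by '&'. As a standalone, testable sketch of just that parsing step (the sample values below are made up for illustration, not taken from a real resolver):

    ```csharp
    using System;
    using System.Collections.Generic;

    // Standalone sketch of the key/value parsing used in SetEndpoint above.
    // '&' is the pair separator because the AssemblyMappings value itself
    // contains ';' characters, ruling out the more usual ';' delimiter.
    public static class EndpointConfigParser
    {
        public static Dictionary<string, string> Parse(string endpointConfig)
        {
            var result = new Dictionary<string, string>();
            foreach (string elem in endpointConfig.Split('&'))
            {
                int i = elem.IndexOf('=');
                if (i == -1) continue; // skip malformed pairs
                // Only the FIRST '=' is treated as the separator, so values that
                // themselves contain '=' (like an embedded connection string)
                // survive intact.
                result[elem.Substring(0, i).Trim()] = elem.Substring(i + 1).Trim();
            }
            return result;
        }
    }
    ```

    Treating only the first '=' as the separator is what lets the connection string embedded inside AssemblyMappings pass through unharmed.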

     

    7. The Config class is just a helper class that enables serialization of the properties to XML:

     

        public class Config
        {
            public Config()
            {
                uri = "Connection String";
                PersistentConnections = "Yes";
                Transactions = "Yes";
                SecurityOverrides = "No";
                Tracing = "No";
                AdvancedOverrides = "No";
                AssemblyMappings = "";
            }
            public string uri { get; set; }
            public string PersistentConnections { get; set; }
            public string Transactions { get; set; }
            public string SecurityOverrides { get; set; }
            public string Tracing { get; set; }
            public string AdvancedOverrides { get; set; }
            public string AssemblyMappings { get; set; }
        }
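    To confirm the shape of the XML this produces, the Config class can be serialized on its own. A minimal sketch (the class is repeated here only so the snippet compiles by itself):

    ```csharp
    using System;
    using System.IO;
    using System.Text;
    using System.Xml.Serialization;

    // Copy of the helper class above, repeated so this snippet is self-contained.
    public class Config
    {
        public Config()
        {
            uri = "Connection String";
            PersistentConnections = "Yes";
            Transactions = "Yes";
            SecurityOverrides = "No";
            Tracing = "No";
            AdvancedOverrides = "No";
            AssemblyMappings = "";
        }
        public string uri { get; set; }
        public string PersistentConnections { get; set; }
        public string Transactions { get; set; }
        public string SecurityOverrides { get; set; }
        public string Tracing { get; set; }
        public string AdvancedOverrides { get; set; }
        public string AssemblyMappings { get; set; }
    }

    public static class ConfigDemo
    {
        // Serializes a default Config the same way SetEndpoint does,
        // yielding XML in the shape shown in step 5.
        public static string ToXml()
        {
            var builder = new StringBuilder();
            using (var writer = new StringWriter(builder))
            {
                new XmlSerializer(typeof(Config)).Serialize(writer, new Config());
            }
            return builder.ToString();
        }
    }
    ```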

     

    8. Then create the manifest file as follows to enable configuration inside the itinerary designer.

     

    <adapterPropertyManifest adapterName="TI">
      <aliases>
        <alias name="tiPropertySchemas" value="NBE.MW.Common.Schemas, Version=1.0.0.0, Culture=neutral, PublicKeyToken=b504e32a0298c59a" />
      </aliases>
      <properties>
        <property name="uri" type="HostApps.uri" description="Always set to Connection String" assembly="tiPropertySchemas" />
        <property name="PersistentConnections" type="HostApps.PersistentConnections" description="" assembly="tiPropertySchemas" />
        <property name="Transactions" type="HostApps.Transactions" description="" assembly="tiPropertySchemas" />
        <property name="SecurityOverrides" type="HostApps.SecurityOverrides" description="" assembly="tiPropertySchemas" />
        <property name="Tracing" type="HostApps.Tracing" description="" assembly="tiPropertySchemas" />
        <property name="AdvancedOverrides" type="HostApps.AdvancedOverrides" description="" assembly="tiPropertySchemas" />
        <property name="AssemblyMappings" type="HostApps.AssemblyMappings" description="" assembly="tiPropertySchemas" />
      </properties>
    </adapterPropertyManifest>

     

    9. Copy the manifest file to the folder "C:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\IDE\Extensions\Microsoft.Practices.Services.Itinerary.DslPackage" on each development machine.
    10. Build the DLL and install it into the GAC.
    11. Add the following line to the Esb.Config file:
    <adapterProvider name="HostApps" type="MyDll.AdapterProviders.HostAppsAdapterProvider, MyDll.AdapterProviders, Version=1.0.0.0, Culture=neutral, PublicKeyToken=32323232" moniker="HostApps" />

     

    12. Now open the itinerary designer; the new adapter provider will appear in the list of adapters for the static resolver, where you can configure the required properties. You can also supply the same configuration through any dynamic resolver, such as the BRE resolver.
    Happy BizTalking!
  • AutoSPInstaller for beginners

    I love to write blog posts for beginners, maybe because I always feel happy and energized when I stumble upon a helpful blog post while learning a new subject. In this post, I will introduce you to AutoSPInstaller (http://autospinstaller.codeplex.com/), a framework for installing and configuring SharePoint using automated scripts. It is a great toolkit: whether for a test farm or a high-end production environment, AutoSPInstaller can save you time and effort and bring consistency and predictability to your SharePoint work. Let's get started.

    So what is AutoSPInstaller? It is a set of PowerShell scripts, XML configuration files, and batch files that automate the installation and configuration of a SharePoint farm. From the AutoSPInstaller CodePlex page:

    "This project consists of PowerShell scripts, an XML input file, and a standard windows batch file (to kick off the process) which together provide a quick and near-unattended installation and initial configuration (Service Apps, My Sites) of Microsoft SharePoint Server 2010/2013. Works on Windows 2008 (though I hardly test on that OS these days), 2008 R2 and Windows 2012 (x64 only of course).

    Perfect for repeated Virtual Machine-based installs/tear-downs, etc., but also great for production installs where you want to guarantee consistency and minimize data entry glitches. The immediate value is for installing and configuring the first/only server in a farm, but also supports using either server-specific input files or a single, all-encompassing input file for running the script on all farm servers (with parameters - e.g. for the service apps - set according to your desired topology).

    "But doesn't SharePoint 2010 have a nice wizard now that does all this for me??" - Yes, and it's a huge improvement over what was available in MOSS 2007. However if you've ever seen the 'DBA nightmare' left behind on your SQL server after the Farm Configuration Wizard has completed (GUID'ed databases with inconsistent naming, etc.):"


    AutoSPInstaller is supported by two other tools:

    - AutoSPSourceBuilder: which is a "utility for building a SharePoint 2010 / 2013 install source including prerequisites, service packs, language packs & cumulative updates". See http://autospsourcebuilder.codeplex.com/

    - AutoSPInstallerGUI: which is a light-weight desktop tool used to edit the input configuration file of AutoSPInstaller. See https://autospinstallergui.codeplex.com/

    So how does AutoSPInstaller work?

    1. You download the AutoSPInstaller package and extract it. You get one folder, "SP", which contains three subfolders:

    • AutoSPInstaller, which has the scripts
    • 2010, a folder where you place your SharePoint 2010 prerequisites, core binaries, and service packs, CUs, etc... (in case you are installing 2010)
    • 2013, a folder where you place your SharePoint 2013 prerequisites, core binaries, and service packs, CUs, etc... (in case you are installing 2013)

    2. You place the SharePoint prerequisites, core binaries, service packs, and CUs in their respective subfolders of the version of choice (see above). Note: use AutoSPSourceBuilder to slipstream and build the source files.

    3. You create all the prerequisite service accounts and install/configure SQL Server.

    4. You modify the input file, AutoSPInstallerInput, according to the desired topology, service accounts, web applications, service applications, etc.

    5. You run the batch file AutoSPInstallerLaunch.bat and it works like magic: AutoSPInstaller executes AutoSPInstallerMain and the referenced .ps1 files and builds/configures the farm.

    Now, I have simplified the whole process in a few steps, but make sure to follow the SharePoint setup/configuration best practices and prepare everything needed for the farm in advance.

    The AutoSPInstaller team did a great job and most of the problems/glitches you will run into are listed in the tool's codeplex site. This is an awesome tool and I recommend it for your SharePoint deployments.

    Good luck!

    Yousef

  • Configuring SharePoint 2013 Search Topology

     

    Introduction

    When you create the Search service application, a default topology is built automatically in which all search components are assigned to the server running Central Administration. In a multi-server farm you need to change this topology, and the only way currently available is through PowerShell, which also provides more flexibility in configuring the topology (you can NOT modify the topology through the UI as you could with SharePoint 2010).

     

     

    Sample Topology

    I will guide you in this post on how to configure 5 servers as follows:

    • 3 front-end servers running the query processing component to handle search query requests from users; the query processing components will be located on the WFE servers.
    • 2 back-end servers running the rest of the search components; these will be located on the application servers.
    • We will configure the topology below so that all search components are highly available (no single point of failure for any component); even the index partition will be replicated to the other server (a replica).

    You can refer to the diagram below to get a better understanding of the topology we are about to configure:

    [Diagram: the five-server search topology]

     

    Before you start

    • Make sure that the user who will execute the PowerShell commands is added as an administrator of the Search service application.
    • Make sure to run the PowerShell commands below on the server that is hosting Central Administration.
    • From the Central Administration – Search Administration page, make sure the index is empty; Searchable items should appear as 0.

    Now let us start by opening PowerShell on the server running Central Administration. Make sure to execute all the commands below in the same PowerShell session; I have divided them into sections to make them more readable.

    SPAPP-SRV1, SPAPP-SRV2, SPWF-SRV1, SPWF-SRV2, and SPWF-SRV3 are the server names.

    Prepare the topology variables

    $hostApp1 = Get-SPEnterpriseSearchServiceInstance -Identity "SPAPP-SRV1"  

    $hostApp2 = Get-SPEnterpriseSearchServiceInstance -Identity "SPAPP-SRV2"

    $hostWF1 = Get-SPEnterpriseSearchServiceInstance -Identity "SPWF-SRV1"

    $hostWF2 = Get-SPEnterpriseSearchServiceInstance -Identity "SPWF-SRV2"

    $hostWF3 = Get-SPEnterpriseSearchServiceInstance -Identity "SPWF-SRV3"

    Start-SPEnterpriseSearchServiceInstance -Identity $hostApp1

    Start-SPEnterpriseSearchServiceInstance -Identity $hostApp2

    Start-SPEnterpriseSearchServiceInstance -Identity $hostWF1

    Start-SPEnterpriseSearchServiceInstance -Identity $hostWF2

    Start-SPEnterpriseSearchServiceInstance -Identity $hostWF3

     

    Get services status

    Get Search Service Instance status (started or stopped) after running the above commands

    Get-SPEnterpriseSearchServiceInstance -Identity $hostApp1
    Get-SPEnterpriseSearchServiceInstance -Identity $hostApp2
    Get-SPEnterpriseSearchServiceInstance -Identity $hostWF1
    Get-SPEnterpriseSearchServiceInstance -Identity $hostWF2
    Get-SPEnterpriseSearchServiceInstance -Identity $hostWF3

     

    Setting up the topology

    $ssa = Get-SPEnterpriseSearchServiceApplication

    $newTopology = New-SPEnterpriseSearchTopology -SearchApplication $ssa

    #SPAPP-SRV1

    New-SPEnterpriseSearchAdminComponent -SearchTopology $newTopology -SearchServiceInstance $hostApp1

    New-SPEnterpriseSearchCrawlComponent -SearchTopology $newTopology -SearchServiceInstance $hostApp1

    New-SPEnterpriseSearchContentProcessingComponent -SearchTopology $newTopology -SearchServiceInstance $hostApp1

    New-SPEnterpriseSearchAnalyticsProcessingComponent -SearchTopology $newTopology -SearchServiceInstance $hostApp1

    New-SPEnterpriseSearchIndexComponent -SearchTopology $newTopology -SearchServiceInstance $hostApp1 -IndexPartition 0

    #SPAPP-SRV2

    New-SPEnterpriseSearchAdminComponent -SearchTopology $newTopology -SearchServiceInstance $hostApp2

    New-SPEnterpriseSearchCrawlComponent -SearchTopology $newTopology -SearchServiceInstance $hostApp2

    New-SPEnterpriseSearchContentProcessingComponent -SearchTopology $newTopology -SearchServiceInstance $hostApp2

    New-SPEnterpriseSearchAnalyticsProcessingComponent -SearchTopology $newTopology -SearchServiceInstance $hostApp2

    New-SPEnterpriseSearchIndexComponent -SearchTopology $newTopology -SearchServiceInstance $hostApp2 -IndexPartition 0

    #SPWF-SRV1, SPWF-SRV2, SPWF-SRV3

    New-SPEnterpriseSearchQueryProcessingComponent -SearchTopology $newTopology -SearchServiceInstance $hostWF1

    New-SPEnterpriseSearchQueryProcessingComponent -SearchTopology $newTopology -SearchServiceInstance $hostWF2

    New-SPEnterpriseSearchQueryProcessingComponent -SearchTopology $newTopology -SearchServiceInstance $hostWF3

    #Activate the topology

    Set-SPEnterpriseSearchTopology -Identity $newTopology

    #Verify the topology

    Get-SPEnterpriseSearchTopology -SearchApplication $ssa

  • Getting Error: "This Setup Requires Internet Information Server 5.1 or higher and Windows XP and higher" when installing a setup project on Windows Server 2008 and IIS 7

     

    Introduction:

    I was getting the error:

    "This Setup Requires Internet Information Server 5.1 or higher and Windows XP and higher" when I was trying to install a setup project on Windows Server 2008 with IIS 7.0.

    After an hour of searching, I discovered there were some configuration problems in the Conditions Editor (Requirements on Target Machine).


    Walkthrough:

    • Go to the setup project and launch the Conditions Editor.


    • Right-click the IIS condition and view its properties.


    • Check the Condition formula; it should be as follows:

    (IISMAJORVERSION >= "#5" AND IISMINORVERSION >= "#1") OR IISMAJORVERSION >= "#6"

    In the above case, the minimum IIS version accepted is 5.1 (or any IIS 6 and later).

    If you want to require IIS 7.5 or later, use the condition formula below:

    (IISMAJORVERSION >= "#7" AND IISMINORVERSION >= "#5")

  • How to Query Custom View in Event Viewer using C# Code

    Introduction:

    I want to talk about how to filter events from the custom views in the Windows Event Viewer using C# code; this gives you the ability to drill down further and provides more advanced filtering options.

    Below is an example of how to use the same custom view and set the options in your server-side code.

    Walkthrough Scenario:

    Querying the custom view requires building a dynamic XML query; a good starting point is to generate the basic XML query from the Event Viewer itself:


     

    Now the extra filtering will use the event level and the time created:

    string filter =
        "*[System[Provider[@Name='Microsoft-Windows-Application Server-System Services'" +
        " or @Name='Microsoft-Windows-Application Server-System Services Event Collector'" +
        " or @Name='Microsoft-Windows-Application Server-System Services Hosting'" +
        " or @Name='Microsoft-Windows-Application Server-System Services IIS Manager'" +
        " or @Name='Microsoft-Windows-Application Server-System Services Power Shell'" +
        " and *[System[(" + ddlFilterByType.SelectedItem.Value + ") and TimeCreated[timediff(@SystemTime) &lt;= " +
        ddlFilterByTime.SelectedItem.Value + "]]]]]]";

    queryString =
        "<QueryList>" +
        "<Query Id=\"0\" Path=\"Microsoft-Windows-Application Server-System Services/Admin\">" +
        "<Select Path=\"Microsoft-Windows-Application Server-System Services/Admin\">" + filter + "</Select>" +
        "<Select Path=\"Microsoft-Windows-Application Server-System Services/Debug\">" + filter + "</Select>" +
        "<Select Path=\"Microsoft-Windows-Application Server-System Services/Operational\">" + filter + "</Select>" +
        "</Query>" +
        "</QueryList>";

    The extra level/time filter is the following fragment (ddlFilterByType supplies the level expression):

    "*[System[(" + ddlFilterByType.SelectedItem.Value + ") and TimeCreated[timediff(@SystemTime) &lt;= " + ddlFilterByTime.SelectedItem.Value + "]]]"

    The values for the dropdown list to filter by level will be set as the following:

    <asp:DropDownList ID="ddlFilterByType" runat="server" Height="16px" Width="160px">
      <asp:ListItem Text="All Levels" Value="0,1,2,3,4,5" />
      <asp:ListItem Text="Critical" Value="Level=1" />
      <asp:ListItem Text="Error" Value="Level=2" />
      <asp:ListItem Text="Warning" Value="Level=3" />
      <asp:ListItem Text="Information" Value="Level=4" />
      <asp:ListItem Text="Verbose" Value="Level=5" />
    </asp:DropDownList>

    The values for the dropdown list to filter by Time will be set as the following:

    <asp:DropDownList ID="ddlFilterByTime" runat="server" Width="160px">
      <asp:ListItem Text="Any Time" Value="0" />
      <asp:ListItem Text="Last Hour" Value="3600000" />
      <asp:ListItem Text="Last 12 Hours" Value="43200000" />
      <asp:ListItem Text="Last 24 Hours" Value="86400000" />
      <asp:ListItem Text="Last 7 Days" Value="604800000" />
      <asp:ListItem Text="Last 30 Days" Value="2592000000" />
    </asp:DropDownList>
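    The values from the two dropdowns are spliced into the XPath fragment shown earlier. As an isolated, testable sketch, here is a hypothetical helper (not part of the original page code) that produces that fragment from a level expression such as "Level=2" and a time window in milliseconds:

    ```csharp
    using System;

    // Hypothetical helper that builds the level/time filter fragment from the
    // two dropdown values, e.g. "Level=2" and "3600000" (one hour in ms).
    public static class EventFilterBuilder
    {
        public static string Build(string levelValue, string timeValue)
        {
            // "&lt;=" rather than "<=" because the fragment is embedded in an
            // XML query document, where a raw '<' would be invalid.
            return "*[System[(" + levelValue + ") and TimeCreated[timediff(@SystemTime) &lt;= "
                + timeValue + "]]]";
        }
    }
    ```

    Keeping the fragment in one place like this avoids repeating the same concatenation for each <Select> path in the query.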

    Now we need to initialize an instance of EventLogReader by specifying an EventLogQuery:

    EventLogQuery eventsQuery = new EventLogQuery("Application", PathType.LogName, queryString);
    EventLogReader logReader = new EventLogReader(eventsQuery);

    Read events from the EventLogReader by looping through each EventRecord:

    for (EventRecord record = logReader.ReadEvent(); record != null; record = logReader.ReadEvent())
    {
        string eventId = record.Id.ToString();
    }