GD Bloggers

This is the blog site for the Microsoft Global Delivery Communities, focused on sharing technical knowledge about devices, apps, and the cloud.
Follow Us On Twitter! Subscribe To Our Blog! Contact Us

September, 2012

  • Why and How-To Asynchronous Programming

     

    Introduction

    Our world is becoming more and more globalized and distributed, moving into the cloud. That is especially true for IT, in both the consumer and the enterprise space; some examples are Amazon AWS, Google Docs, Windows Azure Services, Office 365, and Windows 8 UI applications. In this world, the asynchronous programming approach can be used to develop scalable, fluent, and resilient, less-dependent applications.

    In this post, I will give you the big picture of the async world (what it is, why to use it, and how and when/where to use it) on the .NET platform. In the next post, I will focus on the Microsoft-recommended approach, the Task-based Asynchronous Pattern (TAP).

     

    What is it?

    In short, asynchronous programming enables delegating parts of an application's work to other threads, systems, and/or devices. A synchronous program runs in a sequential manner, whereas an asynchronous application can start a new operation and, without waiting for that operation's completion, continue working on its own flow (the main operation). To simplify, visualize a person who sends an email and can do nothing until a response is received from the recipient. Here, the tasks beyond sending the email are blocked by a response that you have no control over and that may take a while. The asynchronous way would be to send the email and continue working on other tasks while waiting for the recipient's response. In order to capture the full context of asynchronous programming, we need to understand the roles and meanings of the OS, the CLR, application domains, threads, and processes.
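The email analogy can be sketched with tasks (a minimal illustration; SendEmailAndWaitForReply is a hypothetical stand-in for any long-running, externally bound operation):

```csharp
using System;
using System.Threading.Tasks;

class EmailAnalogy
{
    // Hypothetical stand-in for a long-running operation we have no control over
    public static string SendEmailAndWaitForReply()
    {
        Task.Delay(100).Wait(); // simulate waiting on the recipient
        return "reply";
    }

    static void Main()
    {
        // Synchronous style would block here until the reply arrives:
        // string reply = SendEmailAndWaitForReply();

        // Asynchronous style: start the operation, keep working, join later.
        Task<string> replyTask = Task.Run(() => SendEmailAndWaitForReply());
        Console.WriteLine("Working on other tasks meanwhile...");
        Console.WriteLine("Got: " + replyTask.Result); // block only when the reply is needed
    }
}
```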

    • An application domain provides isolation (security, GC, memory, etc.) and can host 0 or more applications/processes.
    • Each process may use 1 or more threads; threads within the same process can share memory. This is why multi-threaded applications can be problematic and hard to troubleshoot: concurrency and race conditions.
    • Threads are lightweight workers; each process has at least 1 thread (the main thread).
    • The CLR provides the runtime for managed code.

    When and where to use it?

    Any point where your application initiates an operation that is long-running and/or I/O-bound. For example, you may leverage asynchronous programming in the following scenarios:

    • Reading/writing a file (I/O-bound)
    • Running a long-running operation (CPU-bound)
    • Calling a remote service (e.g., a service in the cloud), possibly through an ESB
    • Other places, depending on your KPIs
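For instance, the file read/write case can be handed off so the calling thread stays free (a minimal sketch using the TAP-style StreamReader API added in .NET 4.5; the file name is made up for the demo):

```csharp
using System;
using System.IO;
using System.Threading.Tasks;

class FileReadDemo
{
    // Returns a Task immediately; the calling thread is free while the read is in flight
    public static async Task<int> CountCharsAsync(string path)
    {
        using (var reader = new StreamReader(path))
        {
            string text = await reader.ReadToEndAsync();
            return text.Length;
        }
    }

    static void Main()
    {
        File.WriteAllText("demo.txt", "hello"); // small sample file for the demo
        Console.WriteLine(CountCharsAsync("demo.txt").Result); // prints 5
    }
}
```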

     

    How does it work?

    Let me explain it with an example: a simple application that processes data from two XML documents into a list. Here is the pseudo code:

    1. Parse the 1st file's book elements into a list.
    2. Parse the 2nd file's book elements into another list.
    3. Merge the lists.

    Now, I am interested only in steps 1-3 to differentiate the use of synchronous and asynchronous calls.

    Here is a code snippet showing the ProcessXml and Book classes and the method that does the parsing:

    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Xml.Linq;
    using System.Threading.Tasks;
     
    namespace AsyncDemo
    {
        public class ProcessXml
        {
            /// <summary>
            /// Parses 'book' named elements from a file into a list of Book objects
            /// </summary>
            /// <param name="path">file path</param>
            /// <returns>List of Books obj</returns>
            public static List<Book> ParseToList(string path)
            {
                //System.Threading.Thread.Sleep(1000);
                if (System.IO.File.Exists(path))
                {
                    XDocument doc = XDocument.Load(path);
                    if (doc != null)
                    {
                        var coll = doc.Root.Elements("book").Select(p =>
                            new Book
                            {
                                Author = p.Elements("author").First().Value,
                                Title = p.Elements("title").First().Value,
                                Genre = p.Elements("genre").First().Value,
                                Price = double.Parse(p.Elements("price").First().Value),
                                PublishDate = DateTime.Parse(p.Elements("publish_date").First().Value),
                                Description = p.Elements("description").First().Value
                            });
                        return coll.ToList<Book>();
                    }
                }
                return null;
            }
     
            /// <summary>
            /// Asynchronously parses 'book' named elements from a file into a list of Book objects
            /// </summary>
            /// <param name="path">file path</param>
            /// <returns>Task that yields the List of Book objects</returns>
            public static Task<List<Book>> XmlProcessEngineAsync(string path)
            {
                // Off-load the synchronous parsing to a thread-pool thread and
                // return a Task the caller can wait on or continue from.
                return Task.Run(() => ParseToList(path));
            }
        }
     
        public class Book
        {
            public string Author { get; set; }
            public string Title { get; set; }
            public string Genre { get; set; }
            public double Price { get; set; }
            public DateTime PublishDate { get; set; }
            public string Description { get; set; }
        }
    }

     

    Here are 2 unit test methods that demonstrate the scenario in both synchronous and asynchronous manners:

    /// <summary>
    /// Tests the method synchronously
    /// </summary>
    [TestMethod]
    public void ParseToListTestSync()
    {
        books1 = ProcessXml.ParseToList(file1);
        books2 = ProcessXml.ParseToList(file2);
        var list = MergeLists(books1, books2);
        Assert.IsNotNull(list); 
    }
     
    /// <summary>
    /// Tests the method asynchronously with the APM approach, without a callback
    /// </summary>
    [TestMethod]
    public void ParseToListTestAsyncWithAPM()
    {
        books1 = books2 = null;
        // DelProcessing is declared in the test class as:
        // public delegate List<Book> DelProcessing(string path);
        DelProcessing del = new DelProcessing(ProcessXml.ParseToList);
        IAsyncResult result = del.BeginInvoke(file2, null, null);
        books1 = ProcessXml.ParseToList(file1);
     
        //if (!result.IsCompleted)
        //{
        //    // this runs on the main thread; do other work while the async call is in progress
        //    //Thread.Sleep(1000);
        //}
     
        books2 = del.EndInvoke(result);
        var list = MergeLists(books1, books2);
     
        Assert.IsNotNull(list); 
    }

     

    Here is a screenshot of the run results:

    image

     

    As you can see, the 1st test method calls the function in a synchronous way:

    • takes 17 ms, which is the sum of the time spent on each step of the operation
    • uses a single thread
    • runs sequentially

    The 2nd test method calls the function in an asynchronous way:

    • takes 2 ms, roughly the time of the slowest parsing call plus the time spent on the remaining work
    • leverages the delegate's asynchronous invocation methods, which use another thread from the thread pool, on which the assigned method (ParseToList) gets executed behind the scenes. I would certainly recommend spending some time on delegates.
    • Simply put, 2 threads are used here (specific to this example): one is the main thread, which starts the application, and the other executes the 1st call to the ParseToList method. When the delegate is invoked, the main thread continues its own operation and only waits for the 2nd thread's completion at the EndInvoke call.

    Here is a picture that demonstrates this:

    SyncVsAsync

     

    How to do it in .NET?

    .NET has supported asynchronous programming since version 1.1, and each release since then has brought new enhancements (delegates, TPL, etc.). Here is a picture taken while searching for the Async methods available in .NET Framework 4.5.

    image

    With FW 4.5, there are 3 patterns available for Async development:

    • Asynchronous Programming Model (APM), or the IAsyncResult pattern:
      • Available since FW 1.1
      • Requires 2 methods minimum (Begin- and End-prefixed methods)
    • Event-based Asynchronous Pattern (EAP): available since FW 2.0
    • Task-based Asynchronous Pattern (TAP): introduced with FW 4.0 and enhanced in FW 4.5
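To make these shapes concrete, here is the same file read expressed in the APM and TAP styles on FileStream (a minimal sketch; the EAP shape, an Async-suffixed method plus a Completed event as on WebClient, is omitted here because it would need a network call):

```csharp
using System;
using System.IO;
using System.Threading.Tasks;

class PatternShapes
{
    // APM: a Begin/End method pair around IAsyncResult (available since FW 1.1)
    public static int ReadWithApm(string path, byte[] buffer)
    {
        using (var fs = new FileStream(path, FileMode.Open, FileAccess.Read,
                                       FileShare.Read, 4096, useAsync: true))
        {
            IAsyncResult ar = fs.BeginRead(buffer, 0, buffer.Length, null, null);
            return fs.EndRead(ar); // blocks here until the read completes
        }
    }

    // TAP: a single *Async method returning Task<T> (added in FW 4.5)
    public static int ReadWithTap(string path, byte[] buffer)
    {
        using (var fs = new FileStream(path, FileMode.Open, FileAccess.Read,
                                       FileShare.Read, 4096, useAsync: true))
        {
            Task<int> read = fs.ReadAsync(buffer, 0, buffer.Length);
            return read.Result; // block only when the bytes are actually needed
        }
    }

    static void Main()
    {
        File.WriteAllBytes("sample.bin", new byte[] { 1, 2, 3, 4 });
        var buffer = new byte[4];
        Console.WriteLine(ReadWithApm("sample.bin", buffer)); // prints 4
        Console.WriteLine(ReadWithTap("sample.bin", buffer)); // prints 4
    }
}
```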

    The APM style can be implemented in 2 ways: with or without a callback. The sample call above (ParseToListTestAsyncWithAPM) is an example of non-callback APM. We can implement the same functionality with a callback, as seen below:

    /// <summary>
    /// Tests the method asynchronously with an APM callback
    /// </summary>
    [TestMethod]
    public void ParseToListTestAsyncWithAPM_Callback()
    {
        books1 = books2 = null;
        int t1 = System.Threading.Thread.CurrentThread.ManagedThreadId;
        int t2 = 0;
     
        DelProcessing del = new DelProcessing(ProcessXml.ParseToList);
        IAsyncResult result = del.BeginInvoke(file2, (r) =>
        {
            books2 = del.EndInvoke(r);
            t2 = System.Threading.Thread.CurrentThread.ManagedThreadId;
        }, null);
     
        books1 = ProcessXml.ParseToList(file1);
     
        var list = MergeLists(books1, books2);
     
        Assert.IsNotNull(list);
        Assert.IsTrue(t2 > 0);
    }

    Let me explain this in a little more detail:

    • A lambda expression is used for the callback method
    • 'r' represents an IAsyncResult
    • To show which threads are used, t1 (the main thread id) and t2 (the id of the second thread) are captured; in this run their values were 11 and 9, respectively.
    • Note that threads are scheduled by the OS, so you have no control over when they start and stop! For example, if the main thread executes the final assertion (Assert.IsTrue(t2 > 0);) before the callback completes, this test method will fail (t2 is still 0). That means you need to pay attention to when and where you use async calls in your application.

     

    TAP is the simplest of the three and is the approach recommended by Microsoft. Here is the code for implementing the same scenario with TAP:

    /// <summary>
    /// Tests the method asynchronously with TAP
    /// </summary>
    [TestMethod]
    public void ParseToListTestAsyncWithTAP()
    {
        int t1 = System.Threading.Thread.CurrentThread.ManagedThreadId;
        int t2 = 0;
     
        books1 = books2 = null;
        var task = Task.Factory.StartNew(() => {
            books1 = ProcessXml.ParseToList(file2);
            t2 = System.Threading.Thread.CurrentThread.ManagedThreadId;
        });
        books2 = ProcessXml.ParseToList(file1);
        task.Wait(); // ensure the background parse has finished before merging
     
        var list = MergeLists(books1, books2);
     
        Assert.IsNotNull(list);
        Assert.IsTrue(t2 > 0);
    }

     

    TAP is hot :) and I will explain it in detail in my next post, hopefully. For now, I would like to share the results of my efforts so far with you:

    image

    Obviously, another post comparing sync vs. async, or APM vs. TAP, with load tests would be good. We will see. This is a very lively field and there are many things still to explore, aren't there?
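As a teaser for that comparison, the two-file scenario can also be written with Task.WhenAll so both parses run concurrently (a self-contained sketch; the stand-in parser below is hypothetical, substituting for ProcessXml.ParseToList from the article):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;

public static class TapTeaser
{
    // Hypothetical stand-in for ProcessXml.ParseToList from the article
    public static List<string> ParseToList(string path)
    {
        return new List<string> { path + ":book1", path + ":book2" };
    }

    // Run both parses concurrently and merge once both have completed
    public static async Task<List<string>> ParseBothAsync(string file1, string file2)
    {
        Task<List<string>> t1 = Task.Run(() => ParseToList(file1));
        Task<List<string>> t2 = Task.Run(() => ParseToList(file2));
        List<string>[] lists = await Task.WhenAll(t1, t2); // join both tasks
        return lists[0].Concat(lists[1]).ToList();
    }
}
```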

     

    Conclusion

    Wow, this has been my longest post :). I forgot how fast time passes here in Robert’s Coffee in Istanbul.

    Well, in this post, I have explained various aspects of asynchronous programming: its meaning, how it differs from synchronous programming, and why and how to use it. Asynchronous programming can be implemented on both the client and the server side and provides scalability and performance advantages over synchronous programming. I would certainly recommend investing some time in it, since it is now simpler (TAP) and its use is becoming almost a must-have as applications integrate more with the cloud.

     

  • Adding a Publishing Image Field to a SharePoint 2010 Calendar List

    As I was working on customizing a SharePoint Calendar, someone asked me if it is possible to add a Publishing Image field called “Event Image” to a SharePoint Calendar to display the current event’s picture. I decided to write this post to explore the options I have in this case.

     Option #1:

    I tried to add a new column to the calendar list, but I couldn’t find a column with the Publishing Image type.

    PS:

    You can add a Hyperlink or Picture column as shown below, but it behaves differently in the Add/Edit forms, as it doesn’t have a browse dialog to search for the image location.

     Option #2:

    I tried adding an existing Publishing Image site column, such as Page Image (found in the Page Layout Columns group), and it works fine, but it will keep the name “Page Image” instead of “Event Image”.

     Option #3:

    The 3rd option is to create a custom content type that inherits from the Event content type using Visual Studio 2010; below are the steps to do so:

     To create a SharePoint 2010 content type application solution in Visual Studio 2010

    1. Start Visual Studio 2010.
    2. On the File menu, click New, and then click Project.
    3. In the Installed Templates section, expand either Visual Basic or C#, expand SharePoint, and then click 2010.
    4. In the template pane, click Content Type.
    5. In the Name box, type CustomEvent.
    6. Leave other fields with their default values, and then click OK.
    7. In the What local site do you want to use for debugging? box, select your site.
    8. Select the Deploy as a farm solution box. Then, click Next.
    9. In the Choose Content Type Settings dialog box, in the Which base content type should this content type inherit from? list, select Event.
    10. Click Finish.

    To edit the content type details in the Elements.xml file

    1. In Solution Explorer, expand ContentType1 and then open Elements.xml.
    2. Edit the Name attribute in the <ContentType> tag and type CustomEvent.
    3. Edit the Group attribute in the <ContentType> tag to Custom Content Type
    4. Edit the Description attribute in the <ContentType> tag and type Custom Event Content Type.

     To add a field to the Elements.xml file

    1. Above the <ContentType> tag, add the following markup. Notice that this includes a new field that is identified by a new GUID. For this exercise, you can either use the GUID in the following code or create your own GUID.

       <Field ID="{66510192-771D-420B-BA43-1AC8AAF69D7E}" Type="Image" Name="EventImage" DisplayName="Event Image" Required="FALSE" Sealed="TRUE" RichText="TRUE" RichTextMode="FullHtml"/>

    2. Between the begin and end <FieldRefs> tags in the <ContentType> tag, add the following <FieldRef> tag. Ensure that the GUID matches that of the <Field> in the previous step.

       <FieldRef ID="{66510192-771D-420B-BA43-1AC8AAF69D7E}" Name="EventImage" DisplayName="Event Image" />

    To deploy the project

    1. In Solution Explorer, right-click the project, and then click Deploy.
    2. In SharePoint, Go to the Calendar and add the new content type.
    3. Give it a try. 

    The Final Markup should look like this:


     


  • Create a password dialog for Visual Studio deployment project

     

     

    Introduction

    Most of us have worked with Visual Studio deployment projects to create setup projects for components we developed. In many cases we used the custom user interface dialogs that come with Visual Studio to take extra parameters from the user. The problem arises when you need more parameters, or different dialogs, than Visual Studio provides. In my case, I needed a dialog that takes username and password fields from the user, and hence I wanted to style one of the edit box controls as a password field. In the following I will show you how to create a new custom dialog and how to mark an edit box as a password field.

    How to create a custom Visual Studio Setup dialog?

    A Visual Studio setup project has the functionality to add extra dialogs from a library of custom dialogs, as shown in the figure below.
    clip_image002

    So what I wanted to do is customize one of the three-textboxes dialogs and create a new one where I change one of the fields to be a password field. To do this you will need the Orca MSI editor from the Windows SDK Components for Windows Installer Developers. The steps are as follows.

    1-     Close the Visual Studio.

    2-     Browse to the folder “C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\Tools\Deployment\VsdDialogs\0”, make a copy of the file “VsdCustomText3Dlg.wid”, and rename it to “VsdUsernamePasswordDlg.wid”.

    3-     Open the file “VsdUsernamePasswordDlg.wid” in Orca.
    clip_image004

    4-     Change all occurrences of “CustomTextC” to “CustomTextD” in all tables and all values.

    5-     In the ModuleSignature table, make sure you edit the GUID and replace it with a new GUID.
    clip_image006

    6-     Open the “ModuleDialog” table and change the display name to any name you would like.
    clip_image008

    7-     Open the “Control” table, then make the second edit box control a password field by changing its Attributes value to “2097159”.
    clip_image010

    8-     Save the file.

    9-     Open the “ModuleSignature” table and change the value of the language to 1033 (for English).
    clip_image012

    10-  Save the file with the same name in the folder “C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\Tools\Deployment\VsdDialogs\1033”.

    11-  Close Orca and re-open Visual Studio.

    12-  Now when you open the Add Dialog screen, you will find your custom dialog added to the list of dialogs.
    clip_image014
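As an aside on the attribute value in step 7: 2097159 is simply the Windows Installer password-input bit combined with the standard edit box control bits (a quick check using the MSI control attribute constants):

```csharp
using System;

class PasswordAttributeCheck
{
    // Windows Installer control attribute bits (from the MSI SDK)
    const int Visible  = 0x00000001; // msidbControlAttributesVisible
    const int Enabled  = 0x00000002; // msidbControlAttributesEnabled
    const int Sunken   = 0x00000004; // msidbControlAttributesSunken
    const int Password = 0x00200000; // msidbControlAttributesPasswordInput

    static void Main()
    {
        // A visible, enabled, sunken edit box with the password style
        int attributes = Visible | Enabled | Sunken | Password;
        Console.WriteLine(attributes); // prints 2097159
    }
}
```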

    Conclusion

    You can edit and create any custom dialog you want in the list of dialogs used by the setup solution, but remember that your project will not build unless this custom dialog exists on the development machine.

     

  • Divisional Portal (Single vs. Multiple Site Collections)

    What is Divisional Portal?

    A divisional portal is a SharePoint Server 2010 deployment where teams mainly do collaborative activities and some content publishing; for example, it can be used to host an intranet portal with multiple departments. In general we usually face the choice between using sites or site collections. Below we will go through each scenario in more detail:

     

    Single Site Collection Portal:

    In this scenario the divisional portal is built using a single site collection, and all the teams are created as subsites under the root portal. This approach has the following characteristics:

    • The solution has a common navigation
    • Has a single Quota template
    • Has a single content database
    • SharePoint resources (like content types, workflows, designs, etc.) can be shared among all subsites
    • Security configuration can be inherited
    • Search is easier to configure
    • You can aggregate contents using SharePoint out of the box functionality (like Content Query Web Part, Data View Web Part … etc.)
    • Easier to administer and operate (ex: single backup & restore)

    Multiple Site Collections Portal:

    In this scenario the divisional portal might have a root site collection to aggregate content (like news) coming from the teams, and each team is implemented as a separate site collection. This approach has the following characteristics:

    • Can support dedicated URLs per Team.
    • Supports multiple quota templates.
    • Can be distributed among multiple content databases.
    • Additional effort is needed to aggregate SharePoint resources (ex: create a hub to manage content types).
    • Custom navigation needs to be considered.
    • Each site collection has an isolated administration configuration.
    • Each site collection must be added separately to Search.
    • Needs some customization to aggregate content (.NET development)
    • Requires additional administration efforts (Backup, Restore … etc.).
       

    Conclusion:

    Before deciding on one of the above approaches, you need to study and review your solution carefully and make sure the approach covers the requirements. In addition, you need to consider the operational and administration effort behind each approach.

     


     

  • SQL Consolidation Planning and recommended practices

     

    What is SQL Server consolidation

    It can be defined simply as reducing the number of physical SQL Servers by migrating/moving SQL databases running on different servers onto one high-performance server machine.

    image

    Early in the process of a consolidation project, you will create a profile to help identify which applications are good candidates for consolidation. Then you can identify the applications that fit this profile.

    Some general traits that make an application a good candidate for consolidation are:

    • Low machine resource utilization
    • Moderate performance requirements
    • Little active development
    • Low maintenance costs

     

    Consolidation strategies

    SQL Server consolidation can be achieved mainly through one of the following strategies:

    1. Virtualization: using a single physical machine to host multiple virtual machines (VMs) running Microsoft® SQL Server® data management software
    2. Multiple instances: using a single machine to host multiple SQL Server instances
    3. Multiple databases: using a single instance of SQL Server to host multiple databases

    Each of these strategies has different advantages and disadvantages related to security and compliance requirements, high availability and disaster recovery requirements, resource management benefits, level of consolidation density, and manageability tradeoffs.

    Consolidation projects are typically started to achieve specific goals such as creating room for new servers or reducing operating expenditure. These goals can be broadly grouped into the following categories:

    • Lack of space in the data center: reducing the physical space required to host applications
    • Reducing costs and improving efficiency: hardware runs closer to capacity, reducing inefficiencies and allowing for fewer machines
    • Standardization and centralization: placing various databases in a centrally managed system, with better audit control
    • IT agility: by moving applications onto newer hardware, they can take advantage of the newer technology and improved performance of those machines
    • Green IT: fewer computers and fewer idle machines result in lower power consumption and a reduced need for cooling

     

    Consolidation Strategy Checklist

    When choosing among the consolidation strategies, each of which has its pros and cons, the checklist below lists the main concerns to take care of. These concerns can be viewed from the following perspectives:

    • Security
    • High availability and disaster recovery
    • Resource management
    • Density
    • Manageability

    Answering the checklist below will help you decide the best strategy for your environment:

    | Category            | Check item                                                                                              | If yes, choose           | If no, choose      |
    |---------------------|---------------------------------------------------------------------------------------------------------|--------------------------|--------------------|
    | Security            | Share a single SQL service account for all applications (DBs)                                            | Database                 | Instance, VM       |
    | Security            | Do you need to isolate system admins (e.g. sa)?                                                          | Instance, VM             | Database           |
    | Security            | Do you need to isolate the local Windows admin account?                                                  | VM                       | Instance, database |
    | Security            | Do you seek isolation of SQL binaries and patching for each application DB?                              | Instance, VM             | Database           |
    | Resource management | Full software & hardware resource isolation; hard limits on CPU and memory set per application           | VM, Instance (partially) | Database           |
    | High availability   | Applications (DBs) can be moved to different hardware without a Windows restart or downtime              | VM                       | Instance, database |
    | Resource management | One shared tempdb for all applications (no tempdb isolation)                                             | Database                 | Instance, VM       |
    | Resource management | Isolation of server-level objects (credentials, linked servers, msdb, SQL Server Agent jobs, and so on)  | VM, Instance             | Database           |
    | Density             | Best performance when the same hardware is provided                                                      | Database                 | VM, Instance       |
    | Manageability       | Reduces the number of physical servers to maintain                                                       | Instance, database       | VM                 |
    | Manageability       | Reduces the number of Windows installations to maintain                                                  | Instance, database       | VM                 |
    | Manageability       | Reduces the number of SQL Server instances to maintain                                                   | Database                 | VM, Instance       |
    | Manageability       | Are you seeking reduced management overhead and licensing cost?                                          | Database                 | VM, Instance       |

     

    Recommendations when planning for SQL consolidation

    • Use the MAP tool to assess the current environment and plan for consolidation. In my next post I am going to talk in detail about how to use the MAP tool and analyze its results; you can download the tool from here.

     

    • Performance considerations
      • If multiple applications are consolidated as databases and they have dependencies on tempdb, I/O bottlenecks on tempdb can cause performance issues; assess your disks' IOPS and allocate dedicated physical disks to tempdb.
      • Virtualization: we generally recommend using a fixed-size virtual hard disk (VHD) or a pass-through disk, because dynamic VHDs can cause additional I/O overhead.
      • Virtualization: use SLAT-capable processors; the VMs will perform better, especially as the number of VMs increases.
      • Virtualization: if an application has very strict security requirements, it is an ideal candidate for a virtualized approach to consolidation, because a virtual machine has almost the same security isolation options as a dedicated physical host.
      • Virtualization: mapping one virtual processor to one physical processor is recommended.
      • Instance-level and database-level consolidation options provide direct access to the consolidated server's physical hardware, which may help scalability by providing support for hot-add CPU and memory.

     

    • SQL Server provides the max server memory and CPU affinity mask settings to set limits on how much memory and how many logical processors a SQL Server instance can use.
    • Database-level consolidation provides the lowest overhead, because all other resources are shared with the other databases on the single instance.
    • Plan for network bandwidth thoroughly: another factor to consider when planning for consolidation is the impact on the application's network and I/O latency, because both the network and storage resources become shared as part of consolidation.

     

    • CPU sizing
      • Newer processors may reduce the need for an application to use as many processors as it previously did.
      • Look at all applications that significantly under-utilize CPU, pick the one that utilizes the most processors, and take that number of processors as a base.
      • You should always leave room for peak performance or application usage growth; targeting approximately 50 percent utilization is a good starting point.
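The sizing rule above can be turned into simple arithmetic (a sketch; the workload numbers below are hypothetical examples, not measurements from the source):

```csharp
using System;

class CpuSizing
{
    // Estimate cores to provision on the consolidated host:
    // sum the core-equivalents consumed today, then divide by the target utilization.
    public static int CoresToProvision(double[] avgUtilization, int oldCoresPerServer,
                                       double targetUtilization)
    {
        double coresUsed = 0;
        foreach (double u in avgUtilization)
            coresUsed += u * oldCoresPerServer; // core-equivalents actually consumed

        return (int)Math.Ceiling(coresUsed / targetUtilization);
    }

    static void Main()
    {
        // Hypothetical candidates: average CPU fractions on their current 4-core servers
        double[] utilization = { 0.05, 0.08, 0.04, 0.06 };

        // Target ~50% utilization on the new host to leave peak headroom
        Console.WriteLine(CoresToProvision(utilization, 4, 0.5)); // prints 2
    }
}
```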

     

    • Density: the number of databases that can be placed together in one instance or server. In terms of performance, the sample results below show throughput and response time for each option, measured against a baseline on the old hardware:

    | Consolidation method    | Number of applications | Throughput | Response time | Host system CPU utilization |
    |-------------------------|------------------------|------------|---------------|-----------------------------|
    | Baseline (old hardware) | 1                      | 100%       | 100%          | 6%                          |
    | Virtualization          | 24                     | +0.8%      | 80%           | 24%                         |
    | Instance                | 24                     | +0.6%      | 58%           | 20%                         |
    | Database                | 24                     | +0.9%      | 53%           | 16%                         |
    | Virtualization          | 40                     | +0.6%      | 95%           | 45%                         |
    | Instance                | 40                     | +1.1%      | 73%           | 37%                         |
    | Database                | 40                     | +1.3%      | 55%           | 34%                         |

    Table: sample density results based on throughput (higher is better) and response time (lower is better) across options

     

    • Scalability
      • It is important to check how much scalability headroom the current hardware has (scaling up RAM, CPU, available PCI network slots, storage).
    • Security
      • Instance-level consolidation provides an additional layer of protection, because the binaries and the SQL Server logins are separate, but the instances still share the same Windows accounts and operating system configuration. At the instance level, we recommend that you use different service accounts for each instance to reduce security risks.
    • High availability
      • All three approaches can leverage the various high-availability features built into SQL Server, such as failover clustering, database mirroring, and replication.
      • If SQL Server failover clustering is the high-availability solution, database-level consolidation may not be the best choice, because failover occurs at the instance level: applications consolidated at the database level will need to rely on health monitoring based on the entire instance failing over. Virtualization or dedicated hardware may be the better choice in this scenario.

     

    Limitations and Boundaries

    • The virtualization host requires hardware-assisted virtualization support (Intel VT or AMD-V).
    • The virtualization host requires hardware data execution prevention (DEP, also called the Intel XD bit and AMD NX bit).
    • Hardware resources in a virtualized environment are allocated to the VM regardless of whether or not they are fully utilized. In addition, the guest operating system of the VM itself consumes some of the allocated resources, and the host operating system also requires an additional allocation of resources, although these are generally relatively small.
    • SQL Server is currently limited to a maximum of 50 instances per operating system environment (physical or virtual).
    • Hyper-V has a limit of 64 VMs per node, and a SQL Server instance has a limit of 32,767 databases per instance.

     

    Next blog: how to use the MAP tool to assess your environment's readiness for SQL consolidation:

    http://blogs.technet.com/b/meamcs/archive/2012/09/24/how-to-use-map-tool-microsoft-assessment-and-planning-toolkit.aspx

     

    References: http://msdn.microsoft.com/en-us/library/ee819082.aspx

  • How to use MAP Tool–Microsoft Assessment and Planning toolkit

     

      Introduction

      Larger IT organizations are likely to have a proliferation of SQL Server installations that are not fully known to the internal IT department. Furthermore, some of these installations may be in various stages of repair because they are not managed systematically. If these unsupervised SQL databases carry business-critical or sensitive information without backup or control, this can be problematic for the organization.

      The MAP Toolkit helps you create an inventory of all the SQL Server installations in the network, complete with component and version information. This information can help you consider the value of consolidating, possibly virtualizing, certain databases, and bringing them under IT supervision when appropriate.

       

       

      Install MAP tool

    • Download the MAP tool: http://technet.microsoft.com/en-us/library/bb977556.aspx
    • Install the MAP tool
    •              clip_image001

                  clip_image002

       

      How to run MAP tool

    • Open MAP tool
    • Create the inventory database, which is used to save inventory data and collected statistics while working with the MAP tool. SQL Server Express is installed by default along with the MAP tool and is used to host the inventory database.

                        clip_image003

    • After creating the database, the MAP console launches and gives you the option to select your inventory scenario, since the MAP tool can target different scenarios such as SQL database consolidation, VM migration, Windows upgrade, Lync readiness check, etc.
    •        clip_image004

    • The MAP tool launches the home screen (the Inventory and Assessment page). For most scenarios you need to run the first two steps as preparation before performing any assessment; these steps are:
      • Perform an inventory
      • Collect performance data

            clip_image005

      To do so, follow the steps below.

      • Click Perform an inventory → Go
      • Determine the discovery method

               clip_image006

      Select an account that has administrative permissions on the targeted servers. You can add more than one account, and the MAP tool will try to log in to the servers using each account in turn until one succeeds.
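The sequential fallback behaviour just described can be sketched as follows. This is purely illustrative: `try_login` stands in for whatever remote-access check MAP actually performs against a server, and is an assumption rather than the tool's real API.

```python
def discover_with_accounts(server, accounts, try_login):
    """Return the first account that can log in to `server`, else None.

    `try_login(server, account)` is a hypothetical callback that returns
    True on a successful login; MAP tries accounts in the order given.
    """
    for account in accounts:
        if try_login(server, account):  # attempt login with this account
            return account              # stop at the first success
    return None                        # no supplied account worked
```

For example, if only the second account has rights on `SQL01`, `discover_with_accounts("SQL01", accounts, try_login)` returns that account after the first attempt fails.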

              clip_image007

              clip_image008

              clip_image009

              clip_image010

              clip_image011

     

      • Click Collect performance data → Go

                     clip_image012

      Select all the SQL Servers to participate in the analysis.

                   clip_image013

                   clip_image014

                   clip_image015

                    clip_image016

      Collection runs on a separate thread, so you can continue using the tool in the meantime, but for the SQL scenario you must wait until counter collection has completely finished.

                     clip_image017

      The Collect Performance Data step is now done.

                      clip_image018
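The counters collected in this step ultimately feed the sizing analysis. As a rough illustration (our assumption, not MAP's documented algorithm), consolidation sizing is often based on a high percentile of the sampled counter values rather than the raw peak. A nearest-rank percentile over a list of samples can be computed like this:

```python
def percentile(samples, pct):
    """Return the pct-th percentile (nearest-rank method) of a sample list."""
    ordered = sorted(samples)
    # nearest rank: smallest value with at least pct% of samples at or below it
    rank = max(1, -(-len(ordered) * pct // 100))  # ceiling division
    return ordered[rank - 1]
```

For example, the 95th percentile of CPU samples gives a sizing target that ignores short, unrepresentative spikes.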

     

    • Since the MAP tool covers multiple technology assessment scenarios, we do not need to run the remaining steps below for SQL consolidation; just move on to the next step now (step 8)
    • clip_image019

    • To continue with SQL consolidation, expand Database in the left tree view and run the uncompleted scenarios to get a complete SQL consolidation assessment
    • clip_image020

    • If you are planning to consolidate your current environment onto hardware appliances, make sure to run the Microsoft Database Consolidation Appliance wizard:
      • Click Go button
      • Currently there are two hardware flavors, one Intel-processor based and one AMD-processor based; for this scope, v1.2 is selected

                    clip_image021

      • Select the servers you wish to consolidate; in this scenario all SQL Servers are selected

                    clip_image022

                     clip_image023

      Finish

                      clip_image024

      Important: I will show in a later step where to find and how to read all the reports generated by our assessment.

       

       

      Reports generation and collection

      After completing all the previous steps, it is now time to generate the reports:

    • To generate the SQL Server Proposal, SqlServerAssessment, and SqlServerDatabaseDetails reports, follow the steps below
      • From the tree view, click Microsoft SQL Server Discovery; this displays a summary view of the MAP execution
    •                     clip_image025

      • To get a detailed report, from the Actions list on the far right, select Generate report/proposal

                         clip_image026

      • Wait until report generation is finished:

                          clip_image027

      • Then, from the View menu, select Saved Reports and Proposals

                           clip_image028

      • This opens the physical location of the report

                           clip_image029

     

    • To generate the Microsoft Database Consolidation Appliance report, follow these steps:
      • From the tree view, click Microsoft SQL Server Discovery; this displays a summary view of the MAP execution
    •                   clip_image030

      • To get a detailed report, from the Actions list on the far right, select Generate report
      • Wait until the report is generated

                         clip_image031

      • From the View menu, open the saved reports
      • The physical file location opens; collect Microsoft Database Consolidation Appliance Report.xlsx
      • HardwareAndSoftwareSummary

                           clip_image032

       

    • PerfMetricResults report
    • clip_image033

      Alternatively, you can generate all the reports of your selection in one pass:

      • To create SQL Server Reports
      • From the main menu, select File → Prepare New Reports and Proposals to launch the Select Reports and Proposals dialog.

                                   clip_image034

      • Click Select All/Unselect All to clear all selections.
      • Click Microsoft SQL Server Discovery.
      • Click Next to review the list of reports and proposals that will be generated.
      • Click Finish to start generating the reports and proposals and to launch a status dialog.
      • After the status dialog reports that the generation has completed, click Close.
      • Select View → Saved Reports and Proposals from the main menu (or navigate to a previously opened file explorer) to launch a file browser on the directory where the generated files are stored.

      Note: Reports and proposals are created in folders named after the database currently in use.

      • Open the following reports:
        • SQLServerAssessment-<date-and-time> Excel workbook
          • View the Summary tab to see how many SQL Server database components were found in the network in total.
          • View the Database Instances tab to see all database instances listed with server details, including whether the server is virtualized (Machine Type).
          • View the Components tab to see all installed database components listed with server details.
      • SQLServerDatabaseDetails-<date-and-time> Excel workbook
        • In the SQLServerDatabaseDetails Excel report, view the SQL Server database information.
        • Database summary
        • Instances summary
      • SQLServerProposal-<date-and-time> Word document
        • Inspect the discovered instances of SQL Server.
        • The Word document is provided in a customer-ready format that you can include in a larger proposal or analysis with a small amount of customization.
        • The proposal contains most of the information provided in the Excel spreadsheet, but also contains background information on the advantages of SQL Server 2012.
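Because each run stamps its output files with a date and time (for example, SQLServerAssessment-<date-and-time>.xlsx), it can be handy to pick out the most recent report programmatically. The sketch below is our own helper and assumes the embedded timestamp is zero-padded so file names sort chronologically; adjust the prefix to match what MAP actually writes on your machine.

```python
def latest_report(filenames, prefix="SQLServerAssessment-"):
    """Return the matching file name with the greatest embedded timestamp."""
    matches = [f for f in filenames if f.startswith(prefix)]
    # a zero-padded timestamp sorts lexicographically, so max() is the newest
    return max(matches, default=None)
```

In practice you would feed this the contents of the saved-reports directory (e.g. from `os.listdir`).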
  • "The base type 'Microsoft.Office.Server.Search.Internal.UI.SearchFarmDashboard' is not allowed for this page. The type is not registered as safe"

    So, I finished the deployment of a multi-server SharePoint farm and set up the Search Service Application; I was able to configure Search and browse the admin pages in Central Administration successfully. When I tried to browse the Search admin main page from Central Administration a couple of days later, I got the error message "The base type 'Microsoft.Office.Server.Search.Internal.UI.SearchFarmDashboard' is not allowed for this page. The type is not registered as safe". To resolve this, I added the following entry to the Central Administration web site's web.config:

    <SafeControl
         Assembly="Microsoft.Office.Server.Search, Version=14.0.0.0,
         Culture=neutral, PublicKeyToken=71e9bce111e9429c"
         Namespace="Microsoft.Office.Server.Search.Internal.UI"
         TypeName="*" />

     

    But when I clicked the Search Service Application name from searchfarmdashboard.aspx, I got the error message "Code blocks are not allowed in this file." To resolve the issue, I added the following entry to the Central Administration web site's web.config:

     

    <PageParserPath
         VirtualPath="/searchadministration.aspx"
         CompilationMode="Always" AllowServerSideScript="true"
         />
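Before hand-editing web.config it is worth checking whether the registration is already present, so you do not add a duplicate entry. The standalone sketch below (our own helper, not a SharePoint API) parses the config with the Python standard library and looks for a SafeControl registration for a given namespace:

```python
import xml.etree.ElementTree as ET

def has_safe_control(config_xml, namespace):
    """True if any <SafeControl> element in the config registers `namespace`."""
    root = ET.fromstring(config_xml)
    return any(
        el.get("Namespace") == namespace
        for el in root.iter("SafeControl")  # search at any nesting depth
    )
```

Run it against the file's contents before and after the edit to confirm the entry took effect, e.g. `has_safe_control(open(path).read(), "Microsoft.Office.Server.Search.Internal.UI")`.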

     

    Hope this helps!