• Diagnostics of Windows Azure Web Sites – Built-in Diagnostic Features

    Introduction

    I have been developing an MVC 4.0 web application hosted on the Windows Azure platform. In this first part of a two-post blog series, I will talk about leveraging the built-in logging and tracing features of Windows Azure.

     

    Context

    Using the built-in diagnostic features requires three main steps:

    Step 1: Activation of built-in Windows Azure diagnostics features

    1. Log in to the Azure management portal with your Windows account
    2. Select your web site
    3. Click ‘Configure’ on the top menu
    4. Under the diagnostics section, turn on the logs you wish to capture
    5. Click Save.

    Please see the image on the right as a reference.

    Now you should be able to see the deployment and FTP account on the dashboard, as seen in the second image on the right.

    AzureDiagnosticsTurnOn
    AzureFtpAccountReset3

         

    Step 2: Downloading and analysis of log files

    Once Step 1 is completed, Windows Azure starts logging diagnostic events for the site. To see them, you need to create an FTP deployment account and use an FTP program to download the log files, as detailed below:

    • Click on the Quick Start icon (the one next to the Dashboard link at the top menu)
    • Click on the ‘Reset deployment credentials’ link in the ‘Publish your app’ section
    • AzureFtpAccountReset1
    • Enter the username and password for the FTP account
    • Click the Save icon to finish.
    AzureFtpAccountReset2
    • Download an FTP client application – I use FileZilla, an open-source FTP program (alternatively, you can use Windows Explorer or a web browser).
    • Provide the host address, FTP username, and password, as noted in the image on the right.
    AzureFtpFileZilla
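    If you prefer the command line to a GUI client, the same files can be pulled with curl. This is only a sketch: the host name below is a placeholder for the FTP host shown on your dashboard, and the credentials are the deployment credentials you created above.

    ```shell
    # Placeholder host and credentials -- substitute the values from your dashboard.
    # Lists the raw HTTP log files for the site.
    curl --user 'mysite\deployuser:password' --list-only \
      'ftp://waws-prod-xx-000.ftp.azurewebsites.windows.net/LogFiles/http/RawLogs/'
    ```

    Dropping the --list-only flag and appending a file name downloads an individual log file instead of listing the folder.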

     

     

    Step 3: Analysis of diagnostic logs

    In Step 1, we turned on the diagnostic features that come with Windows Azure web sites:

    • Web Server Logging – these resemble IIS logs and are located under LogFiles\http\RawLogs with a .log extension. Each request (success or failure) is recorded with these log fields: date, time, s-sitename, cs-method, cs-uri-stem, cs-uri-query, s-port, cs-username, c-ip, cs-host, sc-status, sc-substatus, sc-win32-status, sc-bytes, cs-bytes, time-taken. The files are readable as they are, but you can get good reports by using the LogParser tool.
    • Detailed Error Messages – these live in the LogFiles\DetailedErrors folder and give you details on errors that occurred in the application.
    • Failed Request Tracing – these come as XML (document) and XSLT (transformation) pairs and give you further details on failed requests.
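    To make the Web Server Logging fields concrete, here is a quick sketch that pulls the status code and response time out of a log line with awk. The sample line is made up, but it follows the field order listed above (sc-status is the 11th field, time-taken the 16th).

    ```shell
    # Hypothetical log line following the field order listed above
    line='2013-01-15 10:22:31 MYSITE GET /default.aspx - 80 - 1.2.3.4 mysite.azurewebsites.net 200 0 0 5120 420 187'

    # Pull out sc-status (field 11) and time-taken in ms (field 16)
    summary=$(echo "$line" | awk '{print "sc-status=" $11 " time-taken=" $16 "ms"}')
    echo "$summary"
    ```

    For anything beyond one-off greps, LogParser (mentioned above) is the better tool, since it understands the W3C log format natively.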

    image

    A partial image from the tracing document for this demo.

     

    Conclusion

    Windows Azure comes with built-in diagnostic features that you can use to monitor applications running on the platform. These features are useful for reporting page-related statistics (server logs) and for diagnosing errors (detailed errors) and performance issues (traces) at a high level. For more detailed logging, you need to come up with your own custom solution.

    Happy clouding! And stay tuned for my next post on how to write custom messages to logs in Windows Azure web sites.

  • What is Microsoft StreamInsight?

    Introduction

    As I was attending one of the sessions, I came across an interesting Microsoft tool called "Microsoft StreamInsight", so I decided to write this post to provide more information and some general guidelines about it.

     

    What is Microsoft StreamInsight?

    Microsoft StreamInsight™ is a powerful platform that you can use to develop and deploy complex event processing (CEP) applications. Its high-throughput stream processing architecture and the Microsoft .NET Framework-based development platform enable you to quickly implement robust and highly efficient event processing applications. Event stream sources typically include data from manufacturing applications, financial trading applications, Web analytics, and operational analytics. By using StreamInsight, you can develop CEP applications that derive immediate business value from this raw data by reducing the cost of extracting, analyzing, and correlating the data; and by allowing you to monitor, manage, and mine the data for conditions, opportunities, and defects almost instantly.

     

    By using StreamInsight to develop CEP applications, you can achieve the following tactical and strategic goals for your business:

     

    • Monitor your data from multiple sources for meaningful patterns, trends, exceptions, and opportunities.

      Analyze and correlate data incrementally while the data is in flight – that is, without first storing it – yielding very low latency. Aggregate seemingly unrelated events from multiple sources and perform highly complex analyses over time.

    • Manage your business by performing low-latency analytics on the events and triggering response actions that are defined on your business key performance indicators (KPIs).

      Respond quickly to areas of opportunity or threat by incorporating your KPI definitions into the logic of the CEP application, thereby improving operational efficiency and your ability to respond quickly to business opportunities.

    • Mine events for new business KPIs.

    • Move toward a predictive business model by mining historical data to continuously refine and improve your KPI definitions.

     

    Skillset needed to work with Microsoft StreamInsight

     

    • .NET & LINQ

    • Dashboards & Reporting

    • Microsoft SQL Server 2008 R2 or later (Enterprise edition)

     

     

    Coding Sample

    In this section I will provide a quick walkthrough of a StreamInsight sample; below are the steps:

    • Create an instance of the server; in this case I called it "MyInstance"
    • Create an application, "MyApp".
    • Create an input stream; for demo purposes, this input stream provides the data source.
    • Create a query to filter the results, using a LINQ query
    • Configure the events & endpoints
    • Run & stop the engine

     

    PS: For more details, visit StreamInsight on MSDN; a link to related topics can be found in the references section.

     

    The Code:

    using (Server server = Server.Create("MyInstance"))
    {
        try
        {
            Application myApp = server.CreateApplication("MyApp");

            var inputstream = CepStream<MyDataType>.Create("inputStream",
                                                           typeof(MyInputAdapterFactory),
                                                           new InputAdapterConfig { someFlag = true },
                                                           EventShape.Point);

            var filtered = from e in inputstream
                           where e.Value > 95
                           select e;

            var query = filtered.ToQuery(myApp,
                                         "filterQuery",
                                         "Filter out Values over 95",
                                         typeof(MyOutputAdapterFactory),
                                         new OutputAdapterConfig { someString = "foo" },
                                         EventShape.Point,
                                         StreamEventOrder.FullyOrdered);

            query.Start();
            Console.ReadLine();
            query.Stop();
        }
        catch (Exception e)
        {
            Console.WriteLine(e.ToString());
        }
    }

     

    References

  • Creating a SQL Server alias using the SQL Server Client Network Utility

    Many think that you have to install the SQL Server client tools to be able to create a SQL Server alias on a client machine. You can do this without installing them: Windows ships with the SQL Server Client Network Utility, which is all you need to create an alias. Click Run, type cliconfg.exe, and the utility will open. Enable a protocol – e.g. TCP/IP.

    Click on the Alias tab and click Add...

    Add an alias as the figure below depicts.
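    Under the hood, cliconfg.exe stores each alias as a string value under HKLM\SOFTWARE\Microsoft\MSSQLServer\Client\ConnectTo (32-bit clients on 64-bit Windows use the Wow6432Node variant of that key), so an alias can also be scripted. The alias, server name, and port below are examples only; DBMSSOCN selects the TCP/IP network library.

    ```shell
    reg add "HKLM\SOFTWARE\Microsoft\MSSQLServer\Client\ConnectTo" /v MyAlias /t REG_SZ /d "DBMSSOCN,MYSERVER,1433"
    ```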

    Enjoy!

     

     

  • YouTube Video Gallery in SharePoint 2010

    During one of our engagements, we had a requirement from one of our customers to build a video gallery on SharePoint.

    The video galleries are built on a SharePoint content type as the backend storage, with a Content Query Web Part combined with jQuery features to present the videos/images as sliding objects. The following diagram shows the high-level design of the solution.

     

    image

    First, you have to create the following content types:

    1. Video Album

    A custom content type that inherits from the folder content type, with the following property:

    • Video Album cover image

    2. YouTube Video

    A custom content type that inherits from the list item content type, with the following properties:

    • YouTube thumbnail URL
    • YouTube video URL

     

    image

     

    Assign the content types to a SharePoint list and name it ‘Video Gallery’.

    image

    Now you can add video albums and YouTube videos to the Video Gallery list.

    image

    image

     

    Now that the backend is set up and filled with videos and albums, we need to set up the UI using the Content Query Web Part available on CodePlex (http://imtech.codeplex.com/releases/view/39782). This web part supports paging and custom XSLT styles:

    image 

    We used the jQuery sliding gallery pikachoose (http://www.pikachoose.com) to display the videos in the gallery. Once a user clicks a specific album, we open a popup that displays that album’s videos as sliding objects; the album name is passed in the query string so the videos can be filtered by the clicked album. To reference the needed script files, CSS, and XSLT – and, most importantly, to filter based on the passed query string – we created a custom Content Query Web Part by inheriting from the standard one as follows:

    public class CustomCQWP : Microsoft.SharePoint.Publishing.WebControls.ContentByQueryWebPart
    {

    Then we added the needed properties as follows; here I am listing two of them:

            [Personalizable(PersonalizationScope.Shared)]
            [WebBrowsable(true)]
            [WebDisplayName("Custom CAML")]
            [Description("")]
            [Category("Custom")]
            public string CustomQueryString { get; set; }

            [Personalizable(PersonalizationScope.Shared)]
            [WebBrowsable(true)]
            [WebDisplayName("Parameter Name")]
            [Description("")]
            [Category("Custom")]
            public string ParameterName { get; set; }

    Then we surface the custom properties in the web part tool pane using:

    public override ToolPart[] GetToolParts()
    {
        ArrayList res = new ArrayList(base.GetToolParts());
        res.Insert(0, new CustomPropertyToolPart() { Title = "Custom" });
        return (ToolPart[])res.ToArray(typeof(ToolPart));
    }

    Then, in CreateChildControls, we used the query string value to filter the data from the list:

    if (string.IsNullOrEmpty(ParameterName) == false)
    {
        string value = HttpContext.Current.Request.QueryString[ParameterName];

        if (SPContext.Current.FormContext.FormMode == SPControlMode.Display)
        {
            // ... apply "value" to the CAML query before the base web part executes it
        }
    }

    The properties are shown as follows in the web part properties pane:

    image

    image

    We registered the CSS and JavaScript links as follows:

    HtmlHead head = (HtmlHead)Page.Header;

    HtmlLink link = new HtmlLink();
    link.Attributes.Add("href", Page.ResolveClientUrl(CSSPath));
    link.Attributes.Add("type", "text/css");
    link.Attributes.Add("rel", "stylesheet");
    head.Controls.Add(link);

    if (string.IsNullOrEmpty(JavaScriptPath) == false)
        Page.ClientScript.RegisterStartupScript(this.GetType(), "CustomCQWPJS",
            @"<script type=""text/javascript"" src=""" + JavaScriptPath + @"""></script>");

    Now the gallery retrieves the objects from the backend list and displays them to users as a gallery:

    image

  • SharePoint Administration & Development Tools and Guidance

    In this post, I am going to list the most commonly used tools, aligned with MS recommended practices, for custom SharePoint development and administration. Please note that there are hundreds of tools available in the market (from MS or third parties); the tools listed below are either MS proprietary or MS open source.

     

    Administration Tools

    • Windows PowerShell Cmdlets: PowerShell is the scripting and administrative tool for Windows platforms, widely used from SharePoint to TFS and from Exchange to SQL Server. If you are administering a SharePoint farm, this tool should be your best friend. The SharePoint-specific shell (which loads Microsoft.SharePoint.PowerShell) is the SharePoint Management Shell, introduced with SP 2010; its latest version is currently 3.0. PowerShell scripts can also be called programmatically.
    • STSADM Tool: A command-line tool for administering a SharePoint farm, used particularly with MOSS 2007. Its equivalent is the PowerShell cmdlets mentioned above, and using PowerShell instead is recommended.
    • Other OOB Tools (depending on the SharePoint version):
      • SPDisposeCheck: A tool that checks whether SP objects are disposed of according to disposal best practices. It gives you an initial indication of whether the deployment has a memory-leak issue, but requires further study – dump analysis, for example. Please note that SP objects use unmanaged code underneath (such as COM objects), and you need to pay attention to those since they are not disposed of by the .NET garbage collector (GC) automatically. For such cases, the recommended practices are the use of ‘using’ blocks, ‘try-finally’, and/or Dispose() calls. It can be used standalone or as a Visual Studio add-in. Applicable to all versions.
      • SP Best Practices Analyzer: Checks best-practice rules against data collected from performance counters, code, repositories, etc.
      • SharePoint Health Analyzer: Lets you set up regular, automated checks for potential configuration, performance, and usage problems on servers across the farm, per health rules defined by default in the Microsoft.SharePoint.Health assembly. Rule run cycles (timer jobs) can be customized.
    • SharePoint Administration Toolkit: Contains kits for farm administrators, such as the Load Testing Kit and the Security Configuration Wizard. Available for both SP 2010 and 2007.
    • SharePoint Timer Job Administration: A GUI tool to manage (schedule, enable/disable) the SharePoint timer jobs deployed with your SharePoint solution.
    • SharePoint Manager: Lets you explore any property of any site within the local farm. Available for SP 2007, 2010, and 2013 from CodePlex.
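    As a quick illustration of the cmdlets above, here are a few SharePoint Management Shell commands. This is a sketch only – it assumes an elevated shell on a SP 2010 farm server, and the timer job name filter is an assumption that may need adjusting per farm:

    ```powershell
    # Load the SharePoint snap-in when running from a plain PowerShell console
    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

    # List all site collections in the farm with their content database
    Get-SPSite -Limit All | Select-Object Url, ContentDatabase

    # Find the Health Analyzer timer jobs and run them on demand
    Get-SPTimerJob | Where-Object { $_.Name -like "Health*" } | Start-SPTimerJob
    ```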

     

    Development Tools

    • Visual Studio: Visual Studio (VS) is much more than a tool; in fact, it is one of the most sophisticated Integrated Development Environments (IDEs). Please see here for details on developing custom SharePoint applications using VS. The VS tools/features I find most useful for SP development are listed below:
      • Web Performance and Load Test: Capacity planning (determining the minimal hardware configuration of a SharePoint portal that still meets business-continuity targets) is one of the integral jobs in SharePoint development. It is especially critical when the portal is integrated more and more with other platforms (SAP, CRM, PeopleSoft, Lotus, etc.) and has a massive concurrent-user volume. This feature identifies the areas causing bottlenecks in the system via performance counters (processor, memory, network) with threshold values that you set. It is closely related to the Service Level Agreement (SLA) or Operational Level Agreement (OLA) defined for your processes. It requires the VS Ultimate edition, plus a Load Test Feature Pack license if tests run with 250+ concurrent users.
      • Performance Profiler: A feature that comes with VS and performs end-to-end performance analysis. It does explicit code-based performance analysis, so you can detect the code blocks slowing down the application. It requires the VS Ultimate or Premium edition.
      • Visual Studio 2010 SharePoint Power Tools: Comes with additional site templates and extended functionality. Please note that one of the changes in SharePoint 2013 is a smaller number of site templates, so these extensions may not be a good fit for new development.
      • Productivity Power Tools 2012: An add-in that facilitates your development efforts with power commands and features that boost developer productivity.
    • SharePoint Designer: Lets you work on site collections directly. It uses a “discover and use” approach – no deployment effort needed. Thus, it is easy to use for quick fixes/enhancements, but limited compared to the Visual Studio IDE mentioned above.
    • AjaxControl Toolkit: A library of AJAX-enabled controls; it has more than 40 controls, from Calendar to Accordion and from Modal Popup to AutoComplete.
    • Silverlight Toolkit: Includes Silverlight components that are useful if your site has complex graphs.
    • Developer Dashboard: Very similar to ASP.NET tracing, it monitors the performance of the rendered SharePoint page, echoing execution times at the page level (per web part, SQL query, etc.). It is Off by default; this can be changed using STSADM, Windows PowerShell, or the SharePoint Foundation object model. It comes with the framework and is available in both 2010 and 2013. Once enabled, it is displayed on any page whose master page (or custom master page) includes the Dashboard control. In SP 2013 it is displayed on a separate page, whereas in SP 2010 it is displayed within the page. Compared to collecting and analyzing ULS logs, this is much simpler and quicker.
    • Microsoft Code Analysis Tool .NET (CAT.NET): A secure-code-analysis tool that checks custom code for common vulnerabilities such as cross-site scripting and SQL injection.
    • DebugView: Monitors debug output, locally or over TCP. Useful for troubleshooting; its effectiveness depends on the diagnostic code implemented within the application.
    • WinDbg: The Windows debugging tools let you analyze code by attaching to a process at run time or through captured dumps. Dump analysis with WinDbg is a common practice, because taking dumps is easy and lets you work remotely and repeatedly.
    • LogParser: A command-line utility for analyzing text-based logs such as IIS and Windows event logs (XML, TXT, CSV, etc.). Logging is a standard practice for recording resource- and/or application-related data (health, statistics, etc.), but most of the time the log files are too large and complex to analyze by hand. LogParser uses standard SQL-like queries and can produce different outputs, such as text-based files or charts.
    • NetMon: A tool to analyze the network traffic (in and out) on the network adapter it binds to. It has both a GUI and a command-line utility (NMCap). Certainly useful for capturing and analyzing inbound and outbound traffic when an explicit packet study is intended.
    • CLR Profiler: A tool from MS used for tracking the memory usage of managed code in .NET applications (2.0+).
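    As an example for the Developer Dashboard item above, here is one way to switch it to OnDemand mode from PowerShell via the SharePoint Foundation object model (a sketch for SP 2010; STSADM can do the same):

    ```powershell
    # Switch the Developer Dashboard display level (Off | On | OnDemand)
    $svc = [Microsoft.SharePoint.Administration.SPWebService]::ContentService
    $settings = $svc.DeveloperDashboardSettings
    $settings.DisplayLevel = "OnDemand"
    $settings.Update()

    # Equivalent STSADM command:
    # stsadm -o setproperty -pn developer-dashboard -pv ondemand
    ```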

     

    Guidance