Insufficient data from Andrew Fryer

The place where I page to when my brain is full up of stuff about the Microsoft platform

July, 2011

  • Analysis Services cubed in SQL Server Denali ctp3

    I have now managed to install analysis services (SSAS) in SQL Server Denali ctp3 three times to reflect the three installation options that now exist ..

[Screenshot: the three SSAS instances in Denali ctp3]

• The top instance is the table-based installation that's completely new for SQL Server Denali.
• Then I have an OLAP-based SSAS instance that I have deliberately called OLAP.  This is essentially the same as was introduced in SQL Server 2005 and performance-enhanced in SQL Server 2008. It has had some minor improvements in SQL Server Denali, specifically scalable string storage to allow more than 4GB of strings in dimension tables, and it will still support existing tools such as the data mining add-ins for Excel (note: you'll need the SQL Server 2008 add-in and it must be installed into 32-bit Excel).
• Finally I have a PowerPivot for SharePoint integration instance (the instance name is hard-coded as PowerPivot during the installation), and as you can see I have one PowerPivot instance in SharePoint.

Anyway, back to table-based, or VertiPaq, analysis services.  Essentially this is server-side PowerPivot:

• It uses the same query language, DAX (there's a quick query sketch after this list).
• It uses the same VertiPaq column-based engine to cache data, which can be likened to MOLAP storage in SSAS.
• You can import a PowerPivot workbook into BI Development Studio (BIDS) and then deploy it to a table-based SSAS instance.
• You'll see that the design experience in BIDS is very similar to what you get in PowerPivot for Excel today with SQL Server 2008 R2.
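To make the "server-side PowerPivot" point concrete, here's a minimal sketch of sending a DAX query to a table-based instance from PowerShell via the ADOMD.NET client. The instance name (localhost\TABULAR), catalog and table names are just placeholders for whatever you have deployed, and you'll need the ADOMD.NET client library installed on the machine running the script:

```powershell
# Minimal sketch: run a DAX query against a table-based (VertiPaq) SSAS instance.
# The instance, catalog and table names below are placeholders - substitute your own.
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices.AdomdClient") | Out-Null

$connStr = "Data Source=localhost\TABULAR;Initial Catalog=AdventureWorks Tabular"
$conn = New-Object Microsoft.AnalysisServices.AdomdClient.AdomdConnection($connStr)
$conn.Open()

# EVALUATE is plain DAX - the same language PowerPivot users already write.
$cmd = $conn.CreateCommand()
$cmd.CommandText = "EVALUATE 'DimDate'"

$reader = $cmd.ExecuteReader()
while ($reader.Read()) { $reader.GetValue(0) }   # dump the first column of each row
$reader.Close()
$conn.Close()
```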

However, like PowerPivot itself in SQL Server Denali, there are also improvements in BIDS, such as a visual view of the relationships between the tables ..

[Screenshot: a table-based SSAS model in BIDS]

     

What you are actually looking at here is the new BI Semantic Model (BISM).  This is replacing the report model in previous versions of SQL Server and will also replace the Unified Dimensional Model (UDM) that underpins analysis services, so that there is a simple three-layer structure for BI:

• Data sources, be they structured data that exists in-house or OData feeds from the cloud like the Azure DataMarket
    • BISM to add business logic, and store how the data sets are joined and where they came from.
• Analysis & reporting tools, both from Microsoft and from third parties, to suit different users and differing needs.  For example, I can point plain Excel (with no add-ins) directly at a table-based SSAS model like this, or point the Dashboard Designer in SharePoint at it, and both will treat it as a conventional OLAP cube (see the sketch below).
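A quick way to convince yourself of that last point is to throw MDX at the tabular model, which is what Excel and Dashboard Designer do under the covers. This is only a sketch: the instance, catalog, measure and attribute names are assumptions that will differ in your model, and I've assumed the model is exposed to MDX clients under the default cube name of Model:

```powershell
# Sketch: query a table-based model with MDX, the way a conventional OLAP client would.
# Instance, catalog, cube, measure and attribute names are all placeholders.
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices.AdomdClient") | Out-Null

$connStr = "Data Source=localhost\TABULAR;Initial Catalog=AdventureWorks Tabular"
$conn = New-Object Microsoft.AnalysisServices.AdomdClient.AdomdConnection($connStr)
$conn.Open()

$cmd = $conn.CreateCommand()
$cmd.CommandText = @"
SELECT [Measures].[Internet Total Sales] ON COLUMNS,
       [DimDate].[Calendar Year].MEMBERS ON ROWS
FROM   [Model]
"@

$cells = $cmd.ExecuteCellSet()            # a CellSet, just as if this were an OLAP cube
$cells.Cells | ForEach-Object { $_.FormattedValue }
$conn.Close()
```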

At this point I probably need to do a series of posts on why BISM is important, as well as compare-and-contrast posts on VertiPaq vs OLAP analysis services, so I'll stop here. In the meantime, if you want to try any of this, SQL Server Denali ctp 3 is available for download here and there are matching sample databases, reports and tutorials for it on codeplex.

  • Reporting Alerting in SQL Server Denali ctp3

The installation experience for SQL Server Denali is very similar to SQL Server 2008 R2, but there are some important differences because of the new services.  I would also add that this is the install of the beta, and I daresay I'll need to do another post when it is released. The options for the database itself are largely unchanged; the biggest differences are for reporting and analysis services, because of the cool new things you can do now, and in this post I want to look specifically at reporting services.

Reporting Services can still be installed natively, and I couldn't see that this had changed at all, even to the point that Report Builder 3 is still supplied to design the reports.  If you have seen anything around Denali you may well have heard of project Crescent, but there are two other key things to know about this new version of reporting services in SharePoint mode:

• End users can set up alerts when numbers on reports go out of band.
• Reporting Services is now a part of SharePoint (it requires SharePoint 2010 sp1) and is configured through PowerShell. This has been done to remove the performance degradation that integrated mode suffered in previous versions of Reporting Services; that overhead, although reduced in SQL Server 2008 R2, also meant you had to configure SharePoint and the Reporting Services service separately, the latter with its own configuration tool.

To set up Reporting Services in SharePoint:

• Check this dialog during the installation of SQL Server:

[Screenshot: SQL Server setup feature selection page, with the relevant features circled]

• Follow this guide to configure SharePoint for Reporting Services (there's a PowerShell sketch after the screenshot below).
• Having done that, you can set up the alerting functionality using this link. Note that this uses SQL Server Agent, so make sure you have it running.
• You'll also want to set up e-mail in Reporting Services by defining your SMTP server once the service is set up.  This can be done from SharePoint Central Administration..

[Screenshot: Reporting Services settings in SharePoint Central Administration]
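For reference, the PowerShell side of the SharePoint configuration looks roughly like the sketch below, run from the SharePoint 2010 Management Shell. The service application, application pool, account and database names are only examples, and the exact cmdlets and parameters may differ slightly in the ctp, so treat it as a sketch rather than something to paste in verbatim:

```powershell
# Rough sketch: configure Reporting Services in SharePoint mode (Denali ctp3).
# Run in the SharePoint 2010 Management Shell; all names below are examples only.

Install-SPRSService             # register the Reporting Services shared service
Install-SPRSServiceProxy        # register the service proxy

# An application pool for the service application (the account is a placeholder
# and needs to be a registered managed account).
$pool = New-SPServiceApplicationPool -Name "SSRS App Pool" -Account "CONTOSO\svc_ssrs"

# Create the Reporting Services service application and its proxy.
$app = New-SPRSServiceApplication -Name "SSRS Service App" -ApplicationPool $pool `
           -DatabaseServer "DENALI1" -DatabaseName "ReportingService_Denali"
New-SPRSServiceApplicationProxy -Name "SSRS Service App Proxy" -ServiceApplication $app

# Data alerts rely on SQL Server Agent, so make sure it's running on the database server
# (SQLSERVERAGENT is the service name for a default instance).
Start-Service -Name "SQLSERVERAGENT"
```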

Once this has been set up, you can run any report that has stored credentials (i.e. it's not running as the user who's running the report) and set up an alert..

[Screenshot: the new data alert option]

and then complete this template to be alerted on any condition you can describe..

[Screenshot: the data alert template]

If you want to try any of this, then SQL Server Denali ctp 3 is available for download here and there are matching sample databases, reports and tutorials for it on codeplex.

  • BISM and Consumerisation in BI

What on earth is consumerisation and what has it got to do with Business Intelligence?  Consumerisation is the recognition that our users are tech-savvy and want access to the tools and ways of working that they have outside of work.  Self service is the result of applying these ideas to BI:

Our users know what they want:

• They want it now.
• They don't want the IT department getting in the way, and they want to see the results of their work as quickly as possible.
• They want to share their work in the way they share their digital lives with their friends and family.

My way of looking at how the end-user BI tools look today is …

[Diagram: the end user's view of the BI stack]

The users think it's all green and only the BI IT community worries about the red stuff.  On the other hand, the more traditional approach to BI might be summed up as..

[Diagram: the traditional approach to BI]

It's strategic but can be hard to change, and while no one can deny that the MDX query language is very powerful, it isn't easy for business users or for IT (outside the BI industry) to understand.   So a third way is needed, and that would look like this..

[Diagram: the third way, self-service BI scaled up under IT control]

i.e. take the work done in the self-service tools and then scale it up, putting in control and security.  Having done that, we then want to put it into production and hook up all the end users' familiar tools to interrogate it.

In Denali ctp3 this is now doable using the BI Semantic Model (BISM):

    • Your users create PowerPivots in Excel
    • You can then open that in the new BI Development Studio in Denali as a .. project
• Add in the security roles and, if necessary, partition it, clean it and refine it.  The original business user who designed the PowerPivot can be involved in this process, and they will recognise what's going on as the designer is virtually the same as it is in Excel.
• Deploy to a table-based instance of analysis services (see the sketch after this list)
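The deployment step in that last bullet can also be scripted once the project has been built in BIDS; the sketch below calls the Analysis Services deployment utility that ships with the client tools. The executable path, project paths and log file are placeholders, and the utility's location and switches may vary between versions, so check them against your own install:

```powershell
# Sketch: deploy a built table-based SSAS project from the command line.
# The executable path, project paths and log file are assumptions - adjust for your setup.

$deployExe  = "C:\Program Files (x86)\Microsoft SQL Server\110\Tools\Binn\ManagementStudio\Microsoft.AnalysisServices.Deployment.exe"
$asDatabase = "C:\Projects\SalesTabular\bin\SalesTabular.asdatabase"   # produced when the BIDS project is built

# /s runs the deployment wizard silently, logging to the file after the colon.
& $deployExe $asDatabase "/s:C:\Projects\SalesTabular\deploy.log"
```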

I would stress that this approach is in addition to the traditional approach that has been in SQL Server with analysis services since 2005, and you can verify that and try this approach by pulling down SQL Server Denali ctp 3, SharePoint 2010 with sp1 and the matching sample databases, reports and tutorials for BISM on codeplex.

The BI Semantic Model in SQL Server Denali ctp3

    Some sort of semantic model is needed in every BI solution but what is it and why do I need one? 

It's a view of the data store(s) you want to work on for business intelligence, which adds information over and above what is stored in the individual tables themselves: for example, how the tables are joined together, friendly names for the columns in those tables that make sense, and security to define which parts of the data each user can work with.   This semantic model is then used as the source for the reports and analytics that comprise a BI solution.
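To make the security part a little more concrete, here is a simplified sketch that uses AMO from PowerShell to give a Windows group read access to a deployed Analysis Services database. The instance, database and group names are made up, and a real model would typically also layer row or dimension restrictions on top of this:

```powershell
# Sketch: grant a Windows group read access to an Analysis Services database via AMO.
# Instance, database and group names are placeholders.
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") | Out-Null

$server = New-Object Microsoft.AnalysisServices.Server
$server.Connect("localhost\TABULAR")

$db   = $server.Databases.FindByName("AdventureWorks Tabular")
$role = New-Object Microsoft.AnalysisServices.Role("Report Readers")
$db.Roles.Add($role) | Out-Null
$role.Members.Add((New-Object Microsoft.AnalysisServices.RoleMember("CONTOSO\BI Users"))) | Out-Null

# Tie the role to a database permission that allows reading.
$perm = $db.DatabasePermissions.Add($role.ID)
$perm.Read = [Microsoft.AnalysisServices.ReadAccess]::Allowed

# Push the updated definition (database, roles and permissions) back to the server.
$db.Update([Microsoft.AnalysisServices.UpdateOptions]::ExpandFull)
$server.Disconnect()
```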

This layer of abstraction may seem unnecessary in some cases, for example if there is only one source of data, such as a data warehouse, which already has all this functionality in place. However, the semantic model in many solutions, including Microsoft's, provides the means to group and sum data across hierarchies, such as summing by year down to month down to day, or from enterprise to division to department.

The other advantage of this approach is that it can mask changes to the underlying schema from the reports that depend on it; the report will reference an object in the model which can have its definition changed to reflect the change without having to touch the report.  Multiply this across a BI solution that has hundreds of reports and this can be a reason to use the model in its own right. Models can also span different sources of data, e.g. databases like SQL Server, Oracle and DB2, through to spreadsheets, flat files and, more recently, data feeds from internal web services and the cloud.

Up until now Microsoft has had two semantic models in place: one to reflect the relational world in reporting services, the report model, and the Unified Dimensional Model (UDM) built into analysis services to model the complexities of OLAP cubes.  The problem I have with this is that you can end up having to do everything twice; set up security, add business logic and so on in each model, making change control difficult, especially as by their very nature they don't work in the same way.  One workaround I have used before to get around this is to lay a report model with no logic in it at all directly on top of a UDM, so that the reports inherit the same security hierarchies as exist in the analysis services world.  That is a bit of a fudge to be honest, so I am pleased to see the problem is going to go away with the introduction of a new model, the BI Semantic Model, in SQL Server Denali ctp3.

Actually, BISM v1 is in SQL Server 2008 R2 as part of PowerPivot, as it is the mechanism that describes how the sheets (tables) in the PowerPivot were related and also where the business logic you specified (in DAX) was stored.  As I mentioned in my last post, PowerPivot in SQL Server Denali ctp3 now lets you use a table design view to create these relationships and add calculations, and now also allows you to create hierarchies, either in the PowerPivot viewer in Excel..

[Screenshot: creating a hierarchy in the BISM]

..or in BI Development Studio.

    You can also see that some of the columns are greyed out, which indicates I have hidden them.

    Once you have created a BISM, you can then use it much as you would analysis services:

• in SharePoint Dashboard Designer to create scorecards and dashboards
    • in reporting services as a source for reports, and as a source for the new Crescent reporting tool
    • in any Excel version as a source of a pivot table (without using the PowerPivot add-in)

     

    So why all this change apart from the need to just have one model?

For me the key thing BISM provides is that it enables BI professionals to work with end users to scale up the self-service, tactical BI that they have built in PowerPivot. I think this is important because, if the business has gone to the trouble of building a tactical BI solution, it should be protected and possibly scaled.

     

This is available to try now; you'll need SQL Server Denali ctp 3 and SharePoint 2010 with sp1 installed, and there are matching sample databases, reports and tutorials for BISM on codeplex.

  • Reach for the Cloud - System Center 2012

     

For me the most interesting stuff in System Center 2012 is the cloud stuff.  Some of this is obvious: there's a cloud button in Virtual Machine Manager (VMM) 2012, and there's even a new product in the line-up, currently called Concero, that is totally cloud focused, as you can see from this preliminary screenshot..

[Screenshot: a preliminary look at Concero]

    However some of the cloud enablement is more subtle and is really an extension of what System Center has always been capable of i.e. automation and event handling.

For example, one of the capabilities that marks the private cloud out from a traditional data centre is the ability to provide self service.  I don't think this is for end users, no matter how easy it is, but rather for what I would refer to as applied IT Professionals, as distinct from the Data Centre Professionals. The applied IT Professionals might either be embedded in a business unit or be experts in a particular application, like the DBAs and the Exchange and SharePoint administrators.  This group would have limited rights to create services and provision VMs within limits set by the Data Centre Professional.  In System Center 2012 there are several ways to accomplish this:

    • Use the VMM  2012 console but with limited privileges
    • Concero will allow users to see and manage all of their cloud services whether public or private
    • Use a combination of Service Manager 2012 and the new Orchestrator 2012 to provide a more controlled approval process for creating and modifying services.

    The other key feature of cloud computing is scalability and System Center can support this depending on what the service needs:

• Some applications like SharePoint and Exchange have the inbuilt potential to scale to more users and/or deliver more throughput as more servers are added to their farms.  Processes can be set up in a combination of Orchestrator and Service Manager to do this not only automatically but also in a controlled fashion.  Of course these servers can also automatically be scaled back when and if demand drops off.
• Applications like SQL Server can just use dynamic memory and be assigned more CPU as demands on them increase, rather than provisioning more virtual machines to boost the service. Here System Center will be used to track usage through the relevant management packs and improved PRO (Performance & Resource Optimisation) to show what state everything is in.
• Moving some applications to the cloud might be appropriate for scale or agility, and this has been difficult to do until now. However, SCVMM 2012 will have server application virtualisation, where you can package a server application just like in App-V and then deploy it to Azure or another data centre.

One final thought: we never discuss backup and disaster recovery of the private cloud, but of course it's essential, and in fact protecting System Center 2012 itself from disaster is also essential if you're using it to manage your data centre.  So the least known of the System Center products, Data Protection Manager (DPM), is also being upgraded for 2012 to allow remote management and a single console view, so that all of your DPM servers can be centrally managed and you can quickly find the data you need to recover from wherever you are working.

     

    Not all of this is in beta right now but you can get your hands on:

     

    and I’ll keep you posted on the rest.

  • Haiku-V

    Virtualisation

    Good management tools

    Private cloud appears

     

     

    Public cloud worries

    Cause myth and uncertainty,

    Trust is essential

     

    [Haiku is a strict form of Japanese poetry dating back to the 1600s]

  • SCO Saturday part 2–configuring interop with SCVMM

Apologies for skipping a post on this last week; embarrassingly, I couldn't get System Center Orchestrator 2012 (SCO) to work with System Center Virtual Machine Manager 2008 R2 (SCVMM). Since then I have been on a course with QA Training which included a day on Opalis, and the excellent trainer Paul Gregory sorted it for me.  So I thought it would be good to share what I did so you can start to use SCO to control your VMs.

In my last video all I did was install and deploy the integration pack that talks to SCVMM.  Having done that, there's only one more thing you have to do, and that's set up the connection to your SCVMM server in the Runbook Designer.  The key thing you need to do is not to enter any credentials in the dialog box when creating the new connection:

• Enter the name of the SCVMM server twice, once at the top and again at the bottom
    • Blank out all of the connection credentials – domain, login and password
    • Leave all the connection information as is ..

[Screenshot: SCVMM connection settings in the Runbook Designer]

You will also need to open up the remote admin port (5095) on SCVMM, and a simple way to do this is to allow remote management on that server with winrm quickconfig (a sketch of both steps follows).
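On the SCVMM server itself that boils down to something like this, from an elevated prompt. The winrm quickconfig step is the one described above; the explicit firewall rule is just an assumption for the case where the port still isn't reachable afterwards:

```powershell
# Run on the SCVMM 2008 R2 server from an elevated prompt.

# Enable WinRM remote management (this also adds the related firewall exceptions).
winrm quickconfig

# If the remote admin port is still blocked, allow it explicitly.
# 5095 is the port mentioned above - adjust if your SCVMM install uses a different one.
netsh advfirewall firewall add rule name="SCVMM remote admin" dir=in action=allow protocol=TCP localport=5095
```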

The best way to understand how SCO works with VMs is to pull down the sample runbooks from codeplex and import them, as I did in this week's screencast which shows you how to work with virtual machines…

     

     

    Things to note:

• The Runbooks (or policies, in Opalis) were originally written for Opalis 6.3, but they work here, showing that upgrade issues aren't going to be too much of a problem. These Runbooks inherit a connection called VMM, so to use them edit this connection to point it to your SCVMM server, and check each step for any hard-coded references to physical hosts, paths etc.  In reality none of this stuff should be hard coded in a Runbook, but these are just samples after all.
• SCO doesn't work with the SCVMM 2012 beta; the integration pack for that and for integration with the other System Center 2012 components are still being developed. So to manage VMs right now I am using SCVMM 2008 R2 and the integration pack that came with Opalis 6.3 (which is included with the SCO beta).
• You can download the SCO beta, and the samples I used came from Codeplex.
    • There are also community developed runbooks on the TechNet Gallery

    If you are serious about looking at this then I can recommend the QA Course  on Opalis 

  • Gone Critical

Many of us would associate the term critical with the Register, so when I was asked by Tim Phillips, their broadcast and commercial editor, to join a discussion titled “Going Critical” I was a little nervous.  The title refers to some Register-commissioned research by Andrew Buss at Freeform Dynamics on what Register subscribers are doing about mission-critical applications in a virtual world.  This indicates that, now the initial rush to virtualise all the simple stuff like file and print and simple applications is over, there is a shift in two directions:

    • A move to embrace the sorts of techniques associated with private clouds, such as process management, resource pooling, and building scalability.
    • Starting to virtualise mission critical applications.

Also on the panel was Alun Rogers, one of the founders of Microsoft gold partner Risual, as he's in the real world making all this actually happen.

    The webcast was recorded and is available on the Register whitepapers website.

I contributed a couple of slides to the deck based on some other research by the Enterprise Server Group on Hyper-V performance with the sort of workloads we were discussing. The other resource I always mention in these conversations is the free Microsoft Assessment and Planning Toolkit (MAPT), which you should use to plan and baseline any major change to your infrastructure, such as physical-to-virtual or virtual-to-virtual migrations and version upgrades.

  • Before I get started with SQL Server Denali ctp3

I have spent the last couple of days setting up a demo environment for SQL Server “Denali”, and I thought it might be useful to share some of my experiences over the next few days. Before I get into that, a short post about the environment I have, which comprises 7 virtual machines (VMs) running on my Dell Precision M6500, aka the Orange Data Centre:

[Screenshot: the running VMs]

    Those VMs work as follows:

    • DC -  my domain controller, also running DHCP and DNS
• BIClient - a Windows 7 VM with all of the SQL Server 2008 R2 tools on it, such as BI Development Studio (BIDS), SQL Server Management Studio (SSMS), PowerPivot, and SharePoint Dashboard Designer.
    • Tangerine - a second Windows 7 VM for  Denali BI, BIDS, SSMS, PowerPivot, SharePoint Dashboard Designer as well as Visual Studio 2010 Ultimate
    • BI2010 - a SQL Server 2008 R2 BI server with SharePoint and all of the BI tools like PerformancePoint, PowerPivot, Reporting Services, Analysis Services and FAST search.
    • Denali 1  - a SQL Server Denali BI Server with everything BI in Denali plus SharePoint 2010 sp1, for Crescent, PowerPivot, BISM  and the new integrated Reporting Services
    • Denali 2 & Denali 3  - two more SQL Server Denali VMs to show the high availability ‘Always On’ capability.

    But why not have a VM for SQL Server 2008 R2 and another for Denali?

    Speed and Flexibility:

Speed: If any of you have tried to use or create an “all in one” VM, complete with a domain controller, Silverlight and the SQL client tools installed, it won't work too well even if you throw 16GB of RAM and all your cores at it.  You could argue my approach means multiple copies of the operating system (one in each VM), but Windows 7 is designed for client stuff like Silverlight and Office, while the domain controller does slow down SQL Server if they are co-located. I could take this a stage further and put SQL Server in its own VM, but that would really only benefit lots of concurrent usage.

Flexibility: I have to use my demo laptop for other demos, e.g. System Center, which requires another 6 VMs (virtual machine manager, operations manager, orchestrator and 2 for service manager).  These are joined to the same domain as my BI environment so I can use BI to show System Center or manage BI with System Center, although even my 16GB, SSD-only Dell can't run all of this, even with dynamic memory turned on in Windows Server 2008 R2 sp1 Hyper-V.

    I wanted to mention this because I see a lot of monolithic VMs which end up being huge and can be very difficult to manage and basically don’t run as well.  Of course you might choose not to believe a random Microsoft evangelist but if you look at this graph:

[Graph: supported users and response time as more VMs are added]

    you can see the number of supported users goes up as we add more VMs to the same hardware while response time (the dotted line) drops slightly  (Research by the Enterprise Services Group).

However, for my demo I am sort of breaking my own guidance, as I will just have the one VM for SharePoint and SQL Server Denali.  Before I go through my experience of doing that (some of which is whirring away as I type this), I want to use my next post to describe some of the moving parts of SQL Server Denali ctp3, which is available for download here.