What on earth is consumerisation, and what has it got to do with Business Intelligence? Consumerisation is the recognition that our users are tech savvy and want access to the tools and ways of working that they have outside of work. Self-service is the result of applying these ideas to BI:
Our users know what they want
My view of how the end-user BI tools look today is…
the users think it’s all green and only the BI/IT community worries about the red stuff. On the other hand, the more traditional approach to BI might be summed up as…
It’s strategic but can be hard to change, and while no one can deny that the MDX query language is very powerful, it isn’t easy for business users, or for IT outside the BI industry, to understand. So a third way is needed, and that would look like this…
i.e. take the work done in the self-service tools and then scale it up, adding control and security. Having done that, we then want to put it into production and hook up all the end users’ familiar tools to interrogate it.
In Denali CTP3 this is now doable using the BI Semantic Model (BISM):
I would stress that this approach is in addition to the traditional approach that has been in SQL Server with Analysis Services since 2005, and you can verify that and try this approach by pulling down SQL Server Denali CTP3, SharePoint 2010 with SP1, and the matching sample databases, reports and tutorials for BISM on CodePlex.
Some sort of semantic model is needed in every BI solution, but what is it and why do I need one?
It’s a view of the data store(s) you want to work on for business intelligence, which adds information that can’t be stored in the individual tables themselves: for example, how the tables are joined together, friendly names for the columns that make sense to the business, and security to define which parts of the data each user can work with. This semantic model is then used as the source for the reports and analytics that comprise a BI solution.
This layer of abstraction may seem unnecessary in some cases, for example if there is only one source of data, such as a data warehouse which already has this functionality in place. However, the semantic model in many solutions, including Microsoft’s, provides the means to group and sum data across hierarchies, such as summing by year down to month down to day, or from enterprise to division to department.
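To make that hierarchy rollup concrete, here is a toy Python sketch; the fact rows and column order are invented for illustration, and a real semantic model does far more than this, but the group-and-sum idea is the same:

```python
from collections import defaultdict

# Toy fact table: (year, month, day, sales) rows, as a date hierarchy
# might expose them. All values here are made up.
facts = [
    (2011, 7, 1, 100),
    (2011, 7, 2, 150),
    (2011, 8, 1, 200),
    (2010, 12, 31, 50),
]

def rollup(rows, levels):
    """Sum the measure at a given depth of the year > month > day hierarchy."""
    totals = defaultdict(int)
    for row in rows:
        key = row[:levels]      # the first `levels` hierarchy members
        totals[key] += row[-1]  # the last element is the measure
    return dict(totals)

print(rollup(facts, 1))  # totals by year
print(rollup(facts, 2))  # totals by year and month
```

The same rollup function serves every level of the hierarchy, which is exactly the convenience the model gives report authors: they pick a level, the model does the summing.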
The other advantage of this approach is that it can mask changes to the underlying schema from the reports that depend on it; the report references an object in the model, whose definition can be changed to reflect the schema change without having to touch the report. Multiply this across a BI solution with hundreds of reports and this can be a reason to use a model in its own right. Models can also span different sources of data, from databases like SQL Server, Oracle and DB2, to spreadsheets, flat files and, more recently, data feeds from internal web services and the cloud.
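The schema-masking idea can be sketched in a few lines of Python. Everything here, the table names, the model dictionary, the query builder, is hypothetical, purely to show why reports that reference the model survive a schema change:

```python
# A minimal sketch of the abstraction layer: reports reference friendly
# model names, and only the model knows the physical columns behind them.
semantic_model = {
    "Customer Name": "dbo.DimCustomer.cust_nm",
    "Sales Amount":  "dbo.FactSales.sls_amt",
}

def build_query(report_columns):
    """A report asks for model objects; the model resolves physical columns."""
    physical = [semantic_model[c] for c in report_columns]
    return "SELECT " + ", ".join(physical)

# The report definition itself never changes...
report = ["Customer Name", "Sales Amount"]
print(build_query(report))

# ...even when the warehouse schema does: only the model entry is updated,
# and every report that uses "Sales Amount" picks up the new column.
semantic_model["Sales Amount"] = "dbo.FactSalesV2.sales_amount"
print(build_query(report))
```

With hundreds of reports, that one-line model change replaces hundreds of report edits, which is the point made above.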
Until now Microsoft has had two semantic models in place: the report model, which reflects the relational world in Reporting Services, and the Unified Dimensional Model (UDM) built into Analysis Services to model the complexities of OLAP cubes. The problem I have with this is that you can end up having to do everything twice; setting up security, adding business logic and so on in each model makes change control difficult, especially as by their very nature they don’t work in the same way. One workaround I have used before is to lay a report model with no logic in it at all directly on top of a UDM, so that the reports inherit the same security hierarchies that exist in the Analysis Services world. That is a bit of a fudge, to be honest, so I am pleased to see the problem is going to go away with the introduction of a new model, the BI Semantic Model, in SQL Server Denali CTP3.
Actually, BISM v1 is in SQL Server 2008 R2 as part of PowerPivot: it is the mechanism that describes how the sheets (tables) in a PowerPivot workbook are related, and also where the business logic you specified (in DAX) is stored. As I mentioned in my last post, PowerPivot in SQL Server Denali CTP3 now lets you use a table design view to create these relationships and add calculations, and now also allows you to create hierarchies, either in the PowerPivot viewer in Excel…
..or in the BI Development studio.
You can also see that some of the columns are greyed out, which indicates I have hidden them.
Once you have created a BISM, you can then use it much as you would analysis services:
So why all this change, apart from the need to have just one model?
For me the key thing that BISM provides is that it enables BI professionals to work with end users to scale up the self-service, tactical BI that they have built in PowerPivot. I think this is important because if the business has gone to the trouble of building a tactical BI solution, it should be protected and possibly scaled.
This is available to try now: you’ll need SQL Server Denali CTP3 and SharePoint 2010 with SP1 installed, and there are matching sample databases, reports and tutorials for BISM on CodePlex.
I have now managed to install Analysis Services (SSAS) in SQL Server Denali CTP3 three times, to reflect the three installation options that now exist…
Anyway, back to table-based, or VertiPaq, Analysis Services. Essentially this is server-side PowerPivot:
However, as with PowerPivot itself in SQL Server Denali, there are also improvements in BIDS, such as a visual view of the relationships between the tables…
What you are actually looking at here is the new BI Semantic Model (BISM). This replaces the report model in previous versions of SQL Server and will also replace the Unified Dimensional Model (UDM) that underpins Analysis Services, so that there is a simple three-layer structure for doing BI:
At this point I probably need to do a series of posts on why BISM is important, as well as compare-and-contrast posts on VertiPaq versus OLAP Analysis Services, so I’ll stop here. In the meantime, if you want to try any of this, SQL Server Denali CTP3 is available for download here, and there are matching sample databases, reports and tutorials for it on CodePlex.
The installation experience for SQL Server Denali is very similar to SQL Server 2008 R2, but there are some important differences because of the new services. I would also add that this is the install of the beta, and I daresay I’ll need to do another post when it is released. The options for the database engine itself are largely unchanged; the biggest differences are for Reporting and Analysis Services, because of the cool new things you can now do, and in this post I want to look specifically at Reporting Services.
Reporting Services can still be installed natively, and I couldn’t see that this had changed at all, even to the point that Report Builder 3 is still supplied to design the reports. If you have seen anything around Denali you may well have heard of project Crescent, but there are two other key things to know about this new version of Reporting Services in SharePoint mode:
In order to set up Reporting Services in SharePoint:
Once this has been set up you can then run any report that has stored credentials (i.e. it’s not running as the user who’s running the report) and set up an alert…
and then complete this template to be alerted on any condition you can describe here..
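As a rough illustration of what such an alert amounts to, the sketch below re-checks a report’s data against a rule and fires when it matches. The column names, rows and threshold are all made up; the real feature is driven from the template in the UI, not from code:

```python
# A toy version of a data alert: evaluate a user-described condition
# against the report's result set and surface any matching rows.
def check_alert(rows, column, op, threshold):
    """Return the rows that satisfy the alert condition."""
    ops = {
        ">": lambda a, b: a > b,
        "<": lambda a, b: a < b,
        "=": lambda a, b: a == b,
    }
    return [r for r in rows if ops[op](r[column], threshold)]

# Hypothetical report data.
report_data = [
    {"region": "North", "sales": 120},
    {"region": "South", "sales": 80},
]

# "Alert me when sales drop below 100 in any region"
breaches = check_alert(report_data, "sales", "<", 100)
if breaches:
    print("alert:", [r["region"] for r in breaches])
```

The stored credentials mentioned above matter because it is the server, not the user, that re-runs the data on a schedule to evaluate the rule.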
If you want to try any of this, then SQL Server Denali CTP3 is available for download here, and there are matching sample databases, reports and tutorials for it on CodePlex.
I have spent the last couple of days setting up a demo environment for SQL Server “Denali”, and I thought it might be useful to share some of my experiences over the next few days. Before I get into that, a short post about the environment, which comprises seven virtual machines (VMs) running on my Dell Precision M6500, aka the Orange Data Centre:
Those VMs work as follows:
But why not have a VM for SQL Server 2008 R2 and another for Denali?
Speed and Flexibility:
Speed If any of you have tried to use or create an “all in one” VM, complete with a domain controller, Silverlight and the SQL client tools installed, it won’t work too well even if you throw 16GB of RAM and all your cores at it. You could argue my approach means multiple copies of the operating system (one in each VM), but Windows 7 is designed for client stuff like Silverlight and Office, while the domain controller does slow down SQL Server if they are co-located. I could take this a stage further and put SQL Server in its own VM, but that would really only benefit lots of concurrent usage.
Flexibility I have to use my demo laptop for other demos, e.g. System Center, which requires another six VMs (Virtual Machine Manager, Operations Manager, Orchestrator and two for Service Manager). These are joined to the same domain as my BI environment so I can use BI to show System Center, or manage BI with System Center, although even my 16GB, SSD-only Dell can’t run all of this, even with dynamic memory turned on in Windows Server 2008 R2 SP1 Hyper-V.
I wanted to mention this because I see a lot of monolithic VMs which end up being huge, can be very difficult to manage, and basically don’t run as well. Of course you might choose not to believe a random Microsoft evangelist, but if you look at this graph:
you can see the number of supported users goes up as we add more VMs to the same hardware, while response time (the dotted line) drops only slightly (research by the Enterprise Strategy Group).
However, for my demo I am sort of breaking my own guidance, as I will have just the one VM for SharePoint and SQL Server Denali. Before I go through my experience of doing that (some of which is whirring away as I type this), I want to use my next post to describe some of the moving parts of SQL Server Denali CTP3, which is available for download here.
Many of us would associate the term critical with the Register, so when I was asked by Tim Phillips, their broadcast and commercial editor, to join a discussion titled “Going Critical”, I was a little nervous. The title refers to some Register-commissioned research by Andrew Buss at Freeform Dynamics on what Register subscribers are doing about mission-critical applications in a virtual world. This indicates that, now that the initial rush to virtualise all the simple stuff like file and print and simple applications is over, there is a shift in two directions:
Also on the panel was Alun Rogers, one of the founders of Microsoft gold partner Risual, as he’s in the real world making all this actually happen.
The webcast was recorded and is available on the Register whitepapers website.
I contributed a couple of slides to the deck based on some other research by the Enterprise Strategy Group on Hyper-V performance with the sort of workloads we were discussing. The other resource I always mention in these conversations is the free Microsoft Assessment and Planning Toolkit (MAPT), which you should use to plan and baseline any major change to your infrastructure, such as physical-to-virtual or virtual-to-virtual migrations and version upgrades.
Apologies for skipping a post on this last week; embarrassingly, I couldn’t get System Center Orchestrator 2012 (SCO) to work with System Center Virtual Machine Manager 2008 R2 (SCVMM). Since then I have been on a course with QA Training which included a day on Opalis, and the excellent trainer Paul Gregory sorted it for me. So I thought it would be good to share what I did so you can start to use SCO to control your VMs.
In my last video all I did was install and deploy the integration pack that talks to SCVMM. Having done that, there’s only one more thing you have to do, and that’s set up the connection to your SCVMM server in the Runbook Designer. The key thing is not to enter any credentials in the dialog box to create a new connection:
You also need to open up the remote admin port (5095) on SCVMM, and a simple way to do this is to allow remote management on that server with WinRM QuickConfig.
The best way to understand how SCO works with VMs is to pull down the sample runbooks from CodePlex and import them, as I did in this week’s screencast, which shows you how to work with virtual machines…
Things to note:
If you are serious about looking at this then I can recommend the QA course on Opalis.
Good management tools
Private cloud appears
Public cloud worries
Cause myth and uncertainty,
Trust is essential
[Haiku is a strict form of Japanese poetry dating back to the 1600s]
For me the most interesting stuff in System Center 2012 is the cloud stuff. Some of this is obvious: there’s a cloud button in Virtual Machine Manager (VMM) 2012, and there’s even a new product in the line-up, currently called Concero, that is totally cloud-focused, as you can see from this preliminary screenshot…
However, some of the cloud enablement is more subtle and is really an extension of what System Center has always been capable of, i.e. automation and event handling.
For example, one of the capabilities that marks the private cloud out from a traditional data centre is the ability to provide self service. I don’t think this is for end users, no matter how easy it is, but rather for what I would refer to as applied IT Professionals, as distinct from Data Centre Professionals. The applied IT Professionals might either be embedded in a business unit or be experts in a particular application, like DBAs and Exchange and SharePoint administrators. This group would have limited rights to create services and provision VMs within limits set by the Data Centre Professional. In System Center 2012 there are several ways to accomplish this:
The other key feature of cloud computing is scalability, and System Center can support this depending on what the service needs:
One final thought: we never discuss backup and disaster recovery of the private cloud, but of course it’s essential, and in fact protecting System Center 2012 itself from disaster is also essential if you’re using it to manage your data centre. So the least known of the System Center products, Data Protection Manager (DPM), is also being upgraded for 2012 to allow remote management and a single console view, so that all of your DPM servers can be centrally managed and you can quickly find the data you need to recover from wherever you are working.
Not all of this is in beta right now but you can get your hands on:
and I’ll keep you posted on the rest.