It is always nice to see historic aircraft and vintage cars out and about rather than stuck in museums; the noise and even the smell of castor oil add to the nostalgia. However, keeping them going requires a lot of effort, and keeping them current with modern rules means that a Fokker triplane will need a proper seat harness, radio and so on, and my mate’s Porsche 356 now runs on unleaded fuel.
Keeping software current means patching and possibly bolting on add-ons, which can affect performance and make management more of an issue, and I would argue that you don’t get the same feeling of pride and a job well done from looking after old software. Getting hold of the bits in both scenarios can be tricky as manufacturers cease production. With old cars and planes this can lead to small engineering firms recreating new parts and, at the extreme, complete replicas. However I can’t see many people writing their own hotfixes and service patches!
I mention all of this because SQL Server 2005 is now coming to the end of its life. The key event is the second anniversary of the release of its successor, SQL Server 2008, which occurs on 12th April 2011, and the implications of this are:
Full details of the support arrangements for SQL Server are here (you’ll need to click on the SQL Server 2005 tab)
What you decide to do about this is of course up to you. However, while I can see the fun in maintaining and restoring an old car or plane, I can’t see the justification for running databases on SQL Server 2005 unless:
You will at this point tell me you don’t have software assurance and you can’t justify the upgrade. However there is so much extra stuff in SQL Server 2008 R2 that you can just turn on without upgrading:
The upgrade from SQL Server 2005 to 2008 should be a straightforward process, but it is still important to run the application/database through the Upgrade Advisor, and there are a couple of other useful links here:
• TechNet Upgrading to SQL Server 2008 R2
• SQL Server 2008 R2 Upgrade Advisor
• SQL Server 2008 R2 Upgrade Guide
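As a quick sanity check alongside the Upgrade Advisor, it can be worth listing the compatibility level of each database on the instance before and after the move. This is just a sketch using sqlcmd; the instance names and the database name are placeholders of mine, not anything from your environment:

```shell
:: List each database and its compatibility level
:: (90 = SQL Server 2005, 100 = SQL Server 2008/2008 R2).
:: ".\SQL2005" is a placeholder instance name - replace with your own.
sqlcmd -S .\SQL2005 -E -Q "SELECT name, compatibility_level FROM sys.databases"

:: After the upgrade, databases keep their old compatibility level
:: until you raise it explicitly:
sqlcmd -S .\SQL2008R2 -E -Q "ALTER DATABASE MyAppDb SET COMPATIBILITY_LEVEL = 100"
```

Leaving a database at compatibility level 90 after the upgrade is supported, but you won’t get the newer query behaviour until you raise it, so test at level 100 before you flip it in production.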
As ever I am interested in your upgrade stories and why you feel you can’t upgrade, so ping me. I have polo shirts with SQL Server 2008 R2 on them, even if you can’t put it on your server yet.
When I applied to join Microsoft nearly 4 years ago I had to give a presentation on OLAP for small business, which now looks a little dated. So yesterday I got the chance to revisit my thoughts on this over espresso with BI guru Rafal Lukawiecki from Project Botticelli, who was over to run a Microsoft BI seminar.
We both thought BI for small business has become even easier, because the capabilities of the top BI end-user tool, Excel, have grown and grown since Excel 2003: partly because it has got better itself, with slicers, sparklines, conditional formatting etc., but also because of the add-ins available:
What of the downsides? I would say data quality is key, as is the ability to join data together. In either case getting this wrong will produce faulty analysis, but as Rafal pointed out, in a smaller business this is much more likely to be picked up, as the users have a better feel for the numbers. So a sense of reality is key to this issue.
Training and awareness is the other barrier to adoption. A lot of people still don’t know about the more powerful capabilities of Excel, and Rafal mentioned a family-run business where he introduced them to basket analysis; they were so pleased with the results that they have passed this best practice on to other businesses in their community. Another case in point was a guy at a housing trust I met recently who didn’t know that SQL Server came with a comprehensive set of BI tools, and as he rightly pointed out, where are the simple getting-started guides for these other components that a semi-professional IT/business user can understand? I don’t think TechNet has this, and actually it’s probably not the right place, but to be honest I couldn’t find too many other articles on using Excel for BI on Microsoft sites apart from a few blog posts (mine included!), so I shall see what I can do to fix this!
What does this all mean for the BI developer/practitioner/[insert your own word here]? I think it means that there are more customers wanting BI, and that what they need is practical help to clean their data, plus education and guidance to get the most out of the data themselves. I actually enjoy working with these smaller businesses because, as Rafal pointed out, they value agility over control: they just want to get going quickly, so you don’t get so bogged down in procedures and overhead; you just get to play with the data and get close to their business.
I also think this trend will continue as PowerPivot evolves in the next release of SQL Server, but that’s another story!
As ever comments and feedback are welcome, and if you weren’t among the 300 at the seminar yesterday, there may well be another one in May.
Dell have kindly loaned me a very orange Dell Precision M6500 (aka Covet) and I plan to use it to show off the latest developments in SQL Server, System Center etc. However I won’t pretend I can run a private cloud on it, as that needs at least one cluster, and I don’t see me lugging two of these monsters around even if Dell decided to let me have a pair of them! What I will do is run all of the System Center suite on here, including Virtual Machine Manager (SCVMM) with its self service portal. I will also be able to run SQL Server, not just to support System Center but also to show off Denali (the project name for SQL Server vNext).
The first thing I need to do is put Windows Server 2008 R2 on it and then install the Hyper-V role. The Covet has 1x 80GB and 2x 256GB SSDs, with the two larger SSDs set up as a RAID 0 to offer a 512GB volume where I’ll be putting all my virtual machines. Rather than install the physical operating system with the hypervisor onto the 80GB SSD, I have actually created a 15GB fixed virtual disk (.VHD file) and I will boot directly to that (this is a feature of Windows 7 and Windows Server 2008 R2). The cool thing about this is that I can copy this VHD to another machine and get back to my environment (possibly with a reboot to detect hardware changes), so it’s dead handy if Dell want their machine back!
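For anyone wanting to try the boot-from-VHD trick themselves, the rough shape of it from an elevated command prompt looks like this. The file paths and the boot entry description are my own placeholders, and you still need to apply a Windows image to the VHD before it will boot:

```shell
:: Create a 15GB fixed VHD to hold the operating system
:: (these are diskpart commands, entered at its interactive prompt).
diskpart
create vdisk file=C:\vhds\ws2008r2.vhd maximum=15360 type=fixed
attach vdisk
exit

:: Add a boot menu entry that points at the VHD. bcdedit /copy prints the
:: new entry's GUID - substitute it for {guid} in the lines below.
bcdedit /copy {current} /d "Windows Server 2008 R2 (VHD boot)"
bcdedit /set {guid} device vhd=[C:]\vhds\ws2008r2.vhd
bcdedit /set {guid} osdevice vhd=[C:]\vhds\ws2008r2.vhd
bcdedit /set {guid} detecthal on
```

The `[C:]` syntax is bcdedit’s way of saying "the volume that holds the VHD", which is what lets the same entry survive a move between machines.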
Next I am going to break a few rules and put additional roles and features on the physical operating system, notably the Wireless LAN Service (needed to get the wireless adapter to work) and Desktop Experience. Desktop Experience gives me Media Player in case I need to show off a video and also allows me to give Windows Server the Windows 7 treatment, including Aero (obviously at the expense of VM performance). While I am on the subject of video, I have installed the Windows x64 NVIDIA drivers for the graphics card. I have seen some articles suggesting that graphics drivers can interfere with Hyper-V and that you should stick with the generic driver. However, there’s a powerful graphics card in the Covet with 1GB of dedicated RAM, and without the drivers I won’t be able to project; I fell foul of this at TechEd 2009 and had to install the graphics drivers while I was presenting a session!
One other thing I do is to fire up MMC (type MMC into the search bar to get it) to bring up an empty management console. I then add in the Hyper-V snap-in and save this as Hyper-V Manager (which appears in Administrative Tools by default) so I have one screen especially to manage virtual machines, which I pin to my taskbar (this technique is older than many of my jokes, so you just might not have heard of it!).
The final thing I do before I start bringing in my virtual machines is to set up the virtual networks in Server Manager. I stick to really simple naming on all my VMs to make it easy to move them around. I have Virtual Internal Network and Virtual External Network, and each VM I use connects to one or both of these (I’ll go into more detail on that in subsequent posts).
Now I am all set to bring in virtual machines and set up SCVMM, which is what I will be going through next time, as running virtual machines without SCVMM is a bit like changing gear without a clutch: technically possible but not that easy.
I now have Hyper-V running on my shiny new orange Dell Covet (Precision M6500), and next I want to set up SCVMM to manage it properly.
Actually, before I do that I am going to import a virtual machine (VM) I already have, namely my domain controller (DC). I have other VMs joined to this domain, and it has some accounts I want to use to run services. This VM uses a special type of virtual disk, a differencing disk:
What this allows me to do is to create another differencing disk (SCVMM.VHD) which uses the same base disk as the DC above (base R-enterprise.vhd), and this keeps down the space used on my precious SSDs. A couple of points about my use of differencing disks:
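For reference, a differencing disk like SCVMM.VHD can be created from a shared parent with diskpart; the paths here are placeholders based on the disk names above:

```shell
:: Create a differencing (child) VHD that stores only the changes relative
:: to the shared parent disk (diskpart commands, entered at its prompt).
:: The parent must not be modified afterwards or all children break.
diskpart
create vdisk file=C:\VMs\SCVMM.vhd parent=C:\VMs\base-R-enterprise.vhd
exit
```

The space saving comes from every child VM sharing the one sysprepped parent image, at the cost of slightly slower disk I/O and the parent becoming read-only in practice.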
Anyway back to SCVMM. I have created a new VM to use that SCVMM differencing disk and given it:
When I start this machine it will come out of sysprep and complete installation of the OS. My next steps to prepare this VM to run SCVMM are:
At this point I snapshot the machine so I can redo the installation of SCVMM if I make a mess of it or to re-demo that step.
In part 3 I’ll set up SCVMM itself, now that I am ready to go.
In this post I am going to set up System Center Virtual Machine Manager 2008 R2 (SCVMM) in a virtual machine and then use it to manage the physical OS on which it is running (parts 1 & 2 of this series explain what I have done up to now to set this up).
To get SCVMM running on my shiny Dell Covet laptop, I actually need to do three installations:
I have a short video covering all three installations here.
Other things to note in the video are:
Next time I am going to add System Center Operations Manager 2007 R2 (SCOM) into the mix (actually into the same VM), as this adds the vital capability to manage the contents of the virtual machine, e.g. updates and the health of services like SQL Server running in the virtual machine.
In the meantime here are some links for further reading and extra credit:
I was going to do the next part of setting up my Dell Covet as a data centre and install System Center Operations Manager (SCOM) on it, but I have to be honest: I got stuck, because SCOM 2007 R2 is not really that keen on SQL Server 2008 R2, and while I was researching this at home my phone and broadband died, putting me way behind.
While I was offline I reset everything to run on SQL Server 2008 SP2 (SP1 or later is needed for this), and then I remembered the various announcements at the War on Cost event run by Inframon last year, specifically the sessions where the next versions of the System Center suite were discussed:
• War on Cost event: Attack the Costs and Complexity of managing mobile devices in your enterprise
• War on Cost event: Microsoft Private Cloud Story
• War on Cost event: Microsoft Keynote - Desktop and Security Convergence
• War on Cost event: Keynote from Gordon McKenna, CEO at Inframon
• War on Cost event: Microsoft keynote - Datacenter to the Cloud
• War on Cost event: Heterogenous Management with Operations Manager 2007
• War on Cost event: Operations Manager R2 and V.Next
• War on Cost event: Service Manager - the Better Together Story
• War on Cost event: System Center Configuration Manager v.Next Highlights
• War on Cost event: Datacenter IT Process Automation (Opalis)
On the basis of this, I have now decided that rather than doing a load of screencasts on what has been out there for some time, I will postpone this project until I get my hands on the betas of vNext of SCOM etc.
So until then I will ask you to tune into my TechDays Online session tomorrow on building a private cloud. By a strange coincidence, Gordon McKenna, CEO of Inframon and a System Center MVP, will be joining me live to share his experiences and views, as well as taking you through the demo rig we have set up in the Microsoft Technology Centre (MTC) in Reading, which has a working private cloud built on System Center and Hyper-V.
BTW, if you have been on one of these TechDays Online sessions before, we have sorted out our audio issues (we have some decent microphones) and we have prizes.
If you can’t make it or want to watch earlier videos in the series the same site has those recordings too.
Like many IT professionals, I love my gadgets, but what really matters is what you do with them: which games do you play on your gamer PC, what did you photograph with that monster lens you bought for your DSLR, and who did you hook up with on Facebook at the weekend? The same is true of Business Intelligence (BI): it’s what you do with the stuff, not how much RAM you have in your servers or even which BI tools you have; it’s all about how these tools are providing value to the business.
What is unusual about Microsoft BI is that many organisations have most or all of the capabilities sitting on the shelf: SQL Server, SharePoint and Office, and possibly some Project and Visio too. What is also very common is that businesses don’t openly share what they are doing with BI, because if it’s done right then this gives them a real edge over the competition. So on the one hand you have the kit; on the other it can be difficult to know where to apply it.
So I am quite pleased Microsoft have put together this BI scenarios portal, to give you some ideas about how to expand your use of BI. One of these scenarios makes use of BI to make the IT team more effective, providing sophisticated analysis of how SharePoint is performing by picking up telemetry from System Center Operations Manager, and I know this should be popular as it is exactly the kind of thing that my good friend Gordon McKenna at Inframon does all the time. I also see there are plans for more scenarios, and I have a few ideas; I think this is a great way of helping to deliver BI to wider audiences in the business.
These scenarios are also a great training resource for an IT professional wanting to break into BI, possibly from a DBA or SharePoint admin background, as they have deeper worked examples than you might get studying in MS Learning, and I think if BI is to be used more widely then more of us IT professionals need some awareness of it.
So have a look and as they say in your local supermarket “Try something different”.
First, an apology. One of the reasons I stopped trying to turn my laptop into a mobile data centre was that I have been given a login to the Reading Microsoft Technology Centre (MTC), which has the entire System Center suite running on it, and my Dell Covet can’t really compete with that. However, I don’t have total control over it, and when I tried to set up the System Center Virtual Machine Manager (SCVMM) Self Service Portal (SSP) on it, it died during my recent TechDays Online session on planning the private cloud. So back to plan A: set up my own environment, including the Self Service Portal.
The current Self Service Portal (version 2) is essentially a solution accelerator and you can download that from here
I have a short video on installing it here, and that is all pretty straightforward, but as with other System Center products it needs its own SQL Server database.
The only hiccup I had was turning on MSMQ (a feature in Windows Server 2008 and later), as you also need to enable Directory Integration mode for MSMQ, and although I had enabled it, it took some time for the prerequisite check in the SSP install to pick this up. However, while installing it is pretty straightforward, setting it up and allowing your users to start creating their own services and virtual machines requires more thought: not because the portal is hard to use, but because it requires an understanding of the terminology and approach. And that’s what I’ll be doing in my next post in this series.
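For reference, on Windows Server 2008 R2 the MSMQ feature and its Directory Service Integration can be enabled from PowerShell rather than clicking through Server Manager. This is a sketch; I’m assuming the feature names below match your build, so check `Get-WindowsFeature` first:

```shell
:: Enable MSMQ plus its Directory Service Integration sub-feature,
:: then confirm both show as installed (Windows Server 2008 R2).
powershell -Command "Import-Module ServerManager; Add-WindowsFeature MSMQ-Server, MSMQ-Directory"
powershell -Command "Import-Module ServerManager; Get-WindowsFeature MSMQ*"
```

Doing it this way makes it easy to see at a glance whether the Directory Integration piece (the bit the SSP prerequisite check is fussy about) is actually on.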
Getting started with the Self Service Portal (SSP) for System Center Virtual Machine Manager (SCVMM) isn’t just a matter of turning on the portal and letting users create virtual machines at random, as that would cause a lot of sprawl and a loss of control. You also probably don’t want users creating these all over your data centre, so the first step in setting up a private cloud-like environment is to decide which parts of your virtualisation infrastructure will be used for self service:
You may also wish to implement charge back for resources reserved by the business to get them to be more responsible for their decisions.
Having done that, you then create a series of environments. These might be nothing more than Development, Test and Production, or could represent available infrastructure in each of your different data centres in London, Paris and Milan.
With these steps completed you have now set out your stall, and users can make requests to create and describe their business units and the services they need. That’s what I’ll look at in the next post in this series; in the meantime I have a 4-minute webcast of these steps which may help to further explain this.
Getting your head around the Self Service Portal requires some thought and careful planning, not because it is hard to use but because it requires a change in the way the business and the IT department interact. It’s also important to understand the terminology:
You will have realised by now that the average lawyer, accountant or other business professional isn’t going to be doing this for themselves, and the key thing in moving to the private cloud is the underlying cultural change needed. In the IT department there will be fewer staff, but the key function of the data centre administrator will still exist. In the business units there will be a new role of business services administrator, who will need to have a good understanding of IT and could well be an IT professional. The difference for these staff is that they would be members of the business unit, not the IT department. I have spent pretty much my whole IT career in business, not in an IT department, so this should not be a new idea (I am far from new myself).
The SSP provides the collaboration necessary for these business administrators to provision services within the framework created by the data centre administrator, and so for each of the objects above, the business services administrator initiates requests in the SSP which the data centre administrator actions.
To help make sense of that I made this screencast of the dialogue between the two roles:
The screencast also shows the mechanics of the business service administrator setting up their first virtual machine and it starting up in Hyper-V and SCVMM. What I didn’t do in this was set up the templates in SCVMM that are used by the SSP to make VMs. That is a whole topic in itself, which I will cover off in another post. If you want to try any of this yourself then you can get the latest Self Service Portal from here.
I don’t think data centre management or the private cloud is particularly difficult to learn, but it could be easier to find out where to start. Of course, if your focus is virtualisation, or you only make tools in this space, then your site should make this journey pretty straightforward. However, I would argue that the move to cloud-style IT as a service impacts all aspects of IT, and that most if not all IT professionals should understand it. Moreover, this approach will only become widespread if the training as well as the tools is affordable, and this has been one of the things Microsoft has brought to IT, whether it was a PC in every home or, in my background of business intelligence, the fact that everyone has access to reporting and analytics through their browser and Excel.
I think a free hypervisor and affordable, easy-to-use management tools are helping, but understanding how to use these tools to implement dynamic data centres/private clouds, as well as integrating them with public cloud services, not to mention migrating services to the cloud, still needs training. Hence yesterday’s launch of the Microsoft Virtual Academy (MVA)
..where you can dip into the topics you are interested in and build up the skills and experience you need to compete in today’s tightening job market. It’s all free; all you need is your time and an internet connection. You get points for doing the various courses, and the plan is to further extend this with more content and to offer discounts on the traditional certifications and courses, so that this becomes everyone’s first step to getting qualified. I also think this idea helps with cross-training: for example, DBAs, SharePoint administrators etc. also need to understand this stuff at a basic level to be more marketable and promotable. So have a look and let us know what you think.
Windows Server 2008 R2 SP1 introduced two key enhancements to Hyper-V: dynamic memory and RemoteFX. RemoteFX is a way of sharing a physical server’s GPU to provide a better VDI experience, and I’ll leave that for Simon to blog about, so back to dynamic memory. This seemingly minor enhancement to Hyper-V allows you to set a minimum start-up memory requirement for each VM and then increase it when they are under pressure, according to rules and priorities you define (more details in a previous post of mine on its impact on SQL Server). Of course you don’t want to be rooting around in Hyper-V on each server setting this for every VM; you will want to manage it centrally, and so you should be in System Center Virtual Machine Manager. To do that you’ll need the corresponding service pack (SP1) for SCVMM 2008 R2, which has been released today.
I was surprised how large it was, but the download is a slipstreamed (i.e. full) install. I have this short screencast on doing the upgrade and seeing the dynamic memory settings showing up in SCVMM post-install.
One thing to note is that getting dynamic memory to work in the first place requires you to apply Windows Server 2008 R2 SP1 to the physical (host) operating system. The next step is to get the feature picked up in the virtual machines. If the guest operating system is Windows Server 2008 R2 or Windows 7, then applying Service Pack 1 to these guest operating systems will enable dynamic memory as well as providing the fixes included in the service pack. For older supported operating systems, such as Vista and Windows Server 2008, reapplying the integration components will also turn on dynamic memory. For more on this there is a Hyper-V Dynamic Memory deployment guide on TechNet.
[added 3 April 2010]
For more on this you might want to come to our Private Cloud TechDay on 24th May in London, or if you're not able to make that, then tune into the Managing the Private Cloud TechDays Live session on 19th April with Gordon McKenna (from Inframon) and me.