In my last post I fired up the new release candidate of System Center Advisor, and this is what it looks like after leaving it running across a couple of servers I have pointed at it..
Just one thing to note: although in my last post I introduced Advisor as “essentially System Center Operations Manager in the cloud”, it isn’t actually a replacement for SCOM – for a start it doesn’t collect real-time data from your servers; currently it’s just once a day. This means you don’t get the alerting and event handling in SCOM that makes so much automation possible.
So it’s worth looking at if you haven’t got SCOM already, but you do then have to action the feedback the tool gives you.
One of the many announcements to come out of the recent Microsoft Management Summit was System Center Advisor. Essentially this is Operations Manager (SCOM) in the cloud, so a sort of Intune for servers.
I had a look at the precursor to this, Project Atlanta, back in January, but at the time it was just focused on SQL Server management. However, in the new Advisor you can also manage Windows Server health, specifically Hyper-V and Active Directory as well as the general health of the operating system.
Of course this means my Atlanta screencasts are now obsolete so here’s the replacement screencast on setting up Advisor if you haven’t tried it before..
Here are the things you need to know if screencasts aren’t your thing:
This is just a release candidate, so I expect more and more environments will be included, in much the same way as there are numerous management packs for on-premise SCOM today. So this could be well worth looking into, and for now it’s free as well as being very easy to set up.
If that’s too much trouble then you might want to come to our Private Cloud TechDay on 24th May in London for more on this, or tune into the Managing the Private Cloud TechDays Live session on 19th April with Gordon McKenna (from Inframon) and me.
I saw a discussion the other day having a go at the fact that it is possible to create KPIs in a number of Microsoft tools and how confusing that can be, and I thought it would be good to understand why there is a choice at all.
Firstly, the term KPI means key performance indicator. I mention this because of the word “key”: in any business there should be relatively few KPIs. For example, inside Microsoft we have 31, and despite pressure to increase that number it has stayed the same for the last 5 years. These are set once a year and may be readjusted at the half-year point, so there isn’t too much work in creating them, but they need to be available across the business, so they need to be on the company intranet.
Each KPI is typically the responsibility of a manager and their team, and to hit that number each division and department will have its own dashboard and subordinate KPIs. This hierarchy cascades down until each member of staff can identify their objectives with those KPIs.
Of course measuring performance is one thing; achieving it is another. Even if things are going well there will always be the desire to improve and to understand the underlying factors affecting successes and failures. The same type of tools used to create KPIs can be used for this, and I would call these just PIs, as they aren’t so key any more.
This is one of the reasons that there are several tools to create indicators in the Microsoft BI platform:
SharePoint: At the strategic end of the business, where the ‘true’ KPIs are created, these need to be widely available, and so there are dashboards, scorecards and tools to create KPIs embedded into SharePoint Enterprise. The data for these can come from pretty well any structured source (I have yet to see a BI solution without at least one Excel spreadsheet as a source), and not just from Microsoft products.
Excel: At the other end of the scale you can do quite a lot to create things that look like KPIs in Excel, and if you are in a five-man company this would be more than sufficient to keep your finger on the pulse of the business.
Analysis Services: KPIs have been a feature of cubes since SQL Server 2005, and this is a very powerful feature that is rarely used. The value of putting them here is really good performance, and once they are created in a backend datastore they can be accessed from any tool, Microsoft or not, that can connect to the cube. For example, you can import these KPIs directly into Dashboard Designer and deploy them to SharePoint, and there are some 35 third-party tools that work with Analysis Services if you don’t want to or can’t invest in SharePoint.
Reporting Services: Since SQL Server 2008 there are all sorts of traffic lights and gauges in Reporting Services. This can be a good option for sharing performance indicators with third parties; perhaps to show parents how a school is performing, or in a business-to-business scenario like suppliers and supermarkets.
However, what makes any one of these a true KPI is the data, not the tool: if the number, trend and status in the traffic light, speed gauge or thermometer is key to your business then it’s a KPI. My only concern is that you use the right tool to make sure the right people can see the up-to-date status quickly and simply, and that there are backend systems in place to keep it up to date.
Windows Server 2008 R2 SP1 introduced two key enhancements to Hyper-V: dynamic memory and RemoteFX. RemoteFX is a way of sharing a physical server’s GPU to provide a better VDI experience, and I’ll leave that for Simon to blog about, so back to dynamic memory. This seemingly minor enhancement to Hyper-V allows you to set a minimum startup memory requirement for each VM and then increase its memory when it is under pressure, according to rules and priorities you define (more details in a previous post of mine on dynamic memory and its impact on SQL Server). Of course you don’t want to be rooting around in Hyper-V on each server setting this VM by VM; you will want to manage it centrally, and for that you should be in System Center Virtual Machine Manager. To do that you’ll need the corresponding service pack (SP1) for SCVMM 2008 R2, which has been released today.
I was surprised how large the download was, but that’s because it is a slipstreamed (i.e. full) install. I have this short screencast on doing the upgrade and seeing dynamic memory showing up in SCVMM post install..
One thing to note is that getting dynamic memory to work in the first place requires you to apply Windows Server 2008 R2 SP1 to the physical (host) operating system. The next step is to get the feature picked up in the virtual machines. If the guest operating system is Windows Server 2008 R2 or Windows 7, then applying Service Pack 1 to these guest operating systems will enable dynamic memory as well as providing the fixes included in the service pack. For older supported operating systems such as Vista and Windows Server 2008, reapplying the integration components will also turn on dynamic memory. For more on this there is a Hyper-V Dynamic Memory deployment guide on TechNet.
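If you prefer scripting to clicking, the same settings can be driven from the SCVMM PowerShell console once SP1 is on. Here is a minimal sketch; Get-VMMServer, Get-VM and Set-VM are the standard SCVMM cmdlets, but the three memory parameter names are my assumption from the SP1 documentation, so check Get-Help Set-VM on your own install:

# hedged sketch: enable dynamic memory on an existing VM (the memory parameter
# names below are assumptions - verify them with Get-Help Set-VM on your server)
Get-VMMServer -ComputerName "vmmserver.contoso.com"
$vm = Get-VM -Name "MyVM"
Set-VM -VM $vm -DynamicMemoryEnabled $true -MemoryMB 1024 -DynamicMemoryMaximumMB 4096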
[added 3 April 2011]
For more on this you might want to come to our Private Cloud TechDay on 24th May in London, or if you’re not able to make that then tune into the Managing the Private Cloud TechDays Live session on 19th April with Gordon McKenna (from Inframon) and me.
I don’t think data centre management or the private cloud is particularly difficult to learn, but it could be easier to find out where to start. Of course if your focus is virtualisation, or you only make tools in this space, then your site should make this journey pretty straightforward. However, I would argue that the move to cloud-style IT as a service impacts all aspects of IT, and that most if not all IT professionals should understand it. Moreover, this approach will only become widespread if the training as well as the tools are more affordable, and this has been one of the things Microsoft has brought to IT, whether it was a PC in every home or, in my background of business intelligence, the fact that everyone has access to reporting and analytics through their browser and Excel.
I think a free hypervisor and affordable, easy-to-use management tools are helping, but understanding how to use these tools to implement dynamic data centres/private clouds, as well as integrating them with public cloud services, not to mention migrating services to the cloud, still needs training. Hence yesterday’s launch of the Microsoft Virtual Academy (MVA)
..where you can dip into the topics you are interested in and build up the skills and experience you need to compete in today’s tightening job market. It’s all free; all you need is your time and an internet connection. You get points for doing the various courses, and the plan is to further extend this with more content and to offer discounts on the traditional certifications and courses so that this becomes everyone’s first step to getting qualified. I also think this idea helps with cross-training; for example, DBAs, SharePoint administrators etc. also need to understand this stuff to a basic level to be more marketable and promotable. So have a look and let us know what you think.
Getting your head around the self service portal requires some thought and careful planning, not because it is hard to use but because it requires a change in the way the business and the IT department interact. It’s also important to understand the terminology:
You will have realised by now that the average lawyer, accountant or other business professional isn’t going to be doing this for themselves, and the key thing in moving to the private cloud is the underlying cultural change needed. In the IT department there will be fewer staff, but the key function of the data centre administrator will still exist. In the business units there will be a new role of business services administrator, who will need a good understanding of IT and could well be an IT professional. The difference for these staff is that they would be members of the business unit, not the IT department. I have spent pretty much my whole IT career in business rather than in an IT department, so this should not be a new idea (I am far from new myself).
The SSP provides the collaboration necessary for these business administrators to provision services within the framework created by the data centre administrator – and so for each of the objects above the business services administrator initiates requests in the SSP which the datacentre administrator actions.
To help make sense of that I made this screencast of the dialogue between the two roles:
The screencast also shows the mechanics of the business service administrator setting up their first virtual machine and it starting in Hyper-V and SCVMM. What I didn’t do in this was set up the templates in SCVMM that are used by the SSP to make VMs. That is a whole topic in itself, which I will cover off in another post. If you want to try any of this yourself you can get the latest self service portal from here.
Getting started with the Self Service Portal (SSP) for System Center Virtual Machine Manager (SCVMM) isn’t just a matter of turning on the portal and letting users create virtual machines at random, as that would cause a lot of sprawl and loss of control. Also, you probably don’t want users creating these all over your data centre, and so the first step in setting up a private cloud-like environment is to decide which parts of your virtualisation infrastructure will be used for self service:
You may also wish to implement charge back for resources reserved by the business to get them to be more responsible for their decisions.
Having done that, you then create a series of environments. These might be nothing more than Development, Test and Production, or could represent available infrastructure in each of your different data centres in London, Paris and Milan.
With these steps completed you have now set out your stall, and users can make requests to create and describe their business units and the services they need. That’s what I’ll look at in the next post in this series; in the meantime I have a 4 minute webcast of these steps which may help to further explain this..
Like many IT professionals, I love my gadgets, but what really matters is what you do with them – so which games do you play on your gamer PC, what did you photograph with that monster lens you bought for your DSLR, and who did you hook up with on Facebook at the weekend? The same is true of Business Intelligence (BI) – it’s what you do with the stuff, not how much RAM you have in your servers or even what BI tools you have; it’s all about how those tools are providing value to the business.
What is unusual about Microsoft BI is that many organisations have most or all of the capabilities sitting on the shelf: SQL Server, SharePoint and Office, and possibly some Project and Visio too. What is also very common is that businesses don’t openly share what they are doing with BI, because done right it gives them a real edge over the competition. So on the one hand you have the kit; on the other, it can be difficult to know where to apply it.
So I am quite pleased Microsoft have put together this BI scenarios portal, to give you some ideas about how to expand your use of BI. One of these scenarios makes use of BI to make the IT team more effective, providing sophisticated analysis of how SharePoint is performing using telemetry picked up from System Center Operations Manager, and I know this should be popular as it is exactly the kind of thing that my good friend Gordon McKenna at Inframon does all the time. I also see there are plans for more scenarios, and I have a few ideas myself; I think this is a great way of helping to deliver BI to wider audiences in the business.
These scenarios are also a great training resource for an IT professional wanting to break into BI, possibly from a DBA or SharePoint admin background, as they have deeper worked examples than you might get studying in MS Learning. And I think if BI is to be used more widely then more of us need some awareness of it.
So have a look and as they say in your local supermarket “Try something different”.
First an apology. One of the reasons I stopped trying to turn my laptop into a mobile data centre was that I had been given a login to the Reading Microsoft Technology Centre (MTC), which has the entire System Center suite running on it, and my Dell Covet can’t really compete with that. However, I don’t have total control over it, and when I tried to set up the System Center Virtual Machine Manager (SCVMM) self service portal (SSP) on it, it died during my recent TechDays Online session on Planning the Private Cloud. So back to plan A: set up my own environment including the Self Service Portal.
The current Self Service Portal (version 2) is essentially a solution accelerator and you can download that from here
I have a short video on installing it here, and that is all pretty straightforward, but as with other System Center products it needs its own SQL Server database..
The only hiccup I had was turning on MSMQ (a feature in Windows Server 2008 and later), as you also need to enable Directory Integration mode for MSMQ, and although I had enabled it, it took some time for the prerequisite check in the SSP install to pick this up. However, while installing it is pretty straightforward, setting it up and allowing your users to start creating their own services and virtual machines requires more thought: not because the portal is hard to use, but because it requires an understanding of the terminology and approach. And that’s what I’ll be doing in my next post in this series.
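Going back to that MSMQ hiccup: for reference, the prerequisite can be scripted rather than clicked through Server Manager. A quick sketch on Windows Server 2008 R2; the feature names are what I believe Get-WindowsFeature reports, so do confirm them on your own build:

# add MSMQ plus its Directory Service Integration from an elevated PowerShell
# (run Get-WindowsFeature MSMQ* first to confirm these feature names)
Import-Module ServerManager
Add-WindowsFeature MSMQ-Server, MSMQ-Directory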
It’s always nice to see historic aircraft and vintage cars out and about rather than stuck in museums; the noise and even the smell of castor oil add to the nostalgia. However, keeping them going requires a lot of effort, and keeping them current with modern rules means that a Fokker triplane will need a proper seat harness, radio etc., and my mate’s Porsche 356 now runs on unleaded fuel.
Keeping software current means patching and possibly bolting on add-ons, which can affect performance and make management more of an issue, and I would argue that you don’t get the same feeling of pride and a job well done from looking after old software. Getting hold of the bits in both scenarios can be tricky as manufacturers cease production. With old cars and planes this can lead to small engineering firms recreating new parts and, at the extreme, complete replicas. However, I can’t see many people writing their own hotfixes and service packs!
I mention all of this because SQL Server 2005 is now coming to the end of its life. The key event is the 2nd anniversary of the release of its successor, SQL Server 2008, which occurs on 12th April 2011, and the implications of this are:
Full details of the support arrangements for SQL Server are here (you’ll need to click on the SQL Server 2005 tab)
What you decide to do about this is of course up to you. However, while I can see the fun in maintaining and restoring an old car or plane, I can’t see the justification for running databases on SQL Server 2005 unless:
You will at this point tell me you don’t have software assurance and you can’t justify the upgrade. However there is so much extra stuff in SQL Server 2008 R2 that you can just turn on without upgrading:
The upgrade from SQL Server 2005 to 2008 should be a straightforward process, but it is still important to run the application/database through the Upgrade Advisor, and there are a couple of other useful links here..
• TechNet Upgrading to SQL Server 2008 R2
• SQL Server 2008 R2 Upgrade Advisor
• SQL Server 2008 R2 Upgrade Guide
As ever I am interested in your upgrade stories and why you feel you can’t upgrade, so ping me: I have polo shirts with SQL Server 2008 R2 on them, even if you can’t put it on your server yet.
I was going to do the next part of setting up my Dell Covet as a data centre, installing System Center Operations Manager (SCOM) on it etc., and I have to be honest: I got stuck, because SCOM 2007 R2 is not really that keen on SQL Server 2008 R2, and while I was researching this at home my phone and broadband died, putting me way behind.
While I was offline I reset everything to run on SQL Server 2008 SP2 (SP1 or later is needed for this), and then I remembered the various announcements at the War on Cost event run by Inframon last year, specifically the discussions of the next versions of the System Center suite..
War on Cost event: Attack the Costs and Complexity of managing mobile devices in your enterprise
War on Cost event: Microsoft Private Cloud Story
War on Cost event: Microsoft Keynote - Desktop and Security Convergence
War on Cost event: Keynote from Gordon McKenna, CEO at Inframon
War on Cost event: Microsoft keynote - Datacenter to the Cloud
War on Cost event: Heterogenous Management with Operations Manager 2007
War on Cost event: Operations Manager R2 and V.Next
War on Cost event: Service Manager - the Better Together Story
War on Cost event: System Center Configuration Manager v.Next Highlights
War on Cost event: Datacenter IT Process Automation (Opalis)
On the basis of this, I have now decided that rather than doing a load of screencasts on what has been out there for some time, I will postpone this project until I get my hands on the betas of the vNext releases of SCOM etc.
So until then I will ask you to tune into my TechDays Online session tomorrow on building a Private Cloud. By a strange coincidence, Gordon McKenna, CEO of Inframon and a System Center MVP, will be joining me live to share his experiences and views, as well as taking you through the demo rig we have set up in the Microsoft Technology Centre (MTC) in Reading, which has a working private cloud built on System Center and Hyper-V.
BTW, if you have been on one of these TechDays Online sessions before, we have now sorted out our audio issues (we have some decent microphones) and we have prizes.
If you can’t make it or want to watch earlier videos in the series the same site has those recordings too.
In this post I am going to set up System Center Virtual Machine Manager 2008 R2 (SCVMM) in a virtual machine and then use it to manage the physical OS on which it is running (parts 1 & 2 of this series explain what I have done up to now to set this up).
To get SCVMM running on my shiny Dell Covet laptop, I actually need to do three installations:
I have a short video covering all three installations here..
Other things to note in the video are:
Next time I am going to add System Center Operations Manager 2007 R2 (SCOM) into the mix (actually into the same VM), as this adds the vital capability to manage the contents of the virtual machine, e.g. updates and the health of services like SQL Server running inside it.
In the meantime here are some links for further reading and extra credit:
I now have Hyper-V running on my shiny new orange Dell Covet (Precision 6500), and in order to manage it properly I want to set up SCVMM next.
Actually, before I do that I am going to import a virtual machine (VM) I already have, namely my Domain Controller (DC). I have other VMs joined to this domain, and it has some accounts I want to use to run services. This VM uses a special type of virtual disk, a differencing disk:
What this allows me to do is to create another differencing disk (SCVMM.VHD) which uses the same base disk as the DC above (base R-enterprise.vhd), and this keeps down the space used on my precious SSDs. A couple of points about my use of differencing disks:
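As a quick aside, a child disk like SCVMM.VHD can be knocked up with diskpart; the paths below are just my layout, so substitute your own:

# run diskpart elevated, then at the DISKPART> prompt:
diskpart
create vdisk file="D:\VMs\SCVMM.vhd" parent="D:\VMs\base R-enterprise.vhd"
exit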
Anyway, back to SCVMM. I have created a new VM to use that SCVMM differencing disk and given it:
When I start this machine it will come out of sysprep and complete the installation of the OS. My next steps to prepare this VM to run SCVMM are:
At this point I snapshot the machine so I can redo the installation of SCVMM if I make a mess of it or to re-demo that step.
In part 3 I’ll set up SCVMM itself, now I am ready to go..
Dell have kindly loaned me a very orange Dell Precision M6500 (aka Covet), and I plan to use it to show off the latest developments in SQL Server and System Center etc. However, I won’t pretend I can run a private cloud on it, as that needs at least one cluster and I don’t see me lugging two of these monsters around even if Dell decided to let me have a pair of them! However, what I will do is run all of the System Center suite on here, including Virtual Machine Manager (SCVMM) with its self service portal. I will also be able to run SQL Server, not just to support System Center but also to show off Denali (the project name for SQL Server vNext).
The first thing I need to do is put Windows Server 2008 R2 on it and then install the Hyper-V role. The Covet has 1x 80Gb and 2x 256Gb SSDs, with the two larger SSDs set up as a RAID0 to offer a 512Gb volume where I’ll be putting all my virtual machines. Rather than install the physical operating system with the hypervisor onto the 80Gb SSD, I have actually created a 15Gb fixed virtual disk (.VHD file) and I will boot directly to that (this is a feature in Windows 7 and Windows Server 2008 R2). The cool thing about this is that I can copy this VHD to another machine and get back to my environment (possibly with a reboot to detect hardware changes), so dead handy if Dell want their machine back!
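In case you want to replicate the boot-from-VHD trick, it’s all done with bcdedit from an elevated command prompt; the path and description below are my example, and {guid} stands for the identifier the /copy step hands back:

# clone the current boot entry and note the GUID bcdedit returns
bcdedit /copy {current} /d "Server 2008 R2 (VHD boot)"
# point the clone at the VHD, substituting the returned GUID for {guid}
bcdedit /set {guid} device vhd=[C:]\VHDs\host2008r2.vhd
bcdedit /set {guid} osdevice vhd=[C:]\VHDs\host2008r2.vhd
bcdedit /set {guid} detecthal on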
Next I am going to break a few rules and put additional roles and features on the physical operating system, notably the wireless LAN service (needed to get the wireless adapter to work) and desktop experience. Desktop experience gives me Media Player in case I need to show off a video and also allows me to give Windows Server the Windows 7 treatment, including Aero (obviously at the expense of VM performance). While I am on the subject of video, I have installed the Windows x64 Nvidia drivers for the graphics card. I have seen some articles suggesting that graphics drivers can interfere with Hyper-V and that you should stick with the generic driver, but there’s a powerful graphics card in the Covet with 1Gb of dedicated RAM, and without the drivers I won’t be able to project - I fell foul of this at TechEd 2009 and had to install the graphics drivers while I was presenting a session!
One other thing I do is to fire up MMC (type MMC into the search bar to get it) to bring up an empty management console. I then add in the Hyper-V snap-in and save this as Hyper-V Manager (which appears in administrative tools by default) so I have one screen especially to manage virtual machines, which I pin to my taskbar (this technique is older than many of my jokes, so you just might not have heard of it!).
The final thing I do before I start bringing in my virtual machines is to set up the virtual networks in Server Manager. I stick to really simple naming to make it easy to move my VMs around: I have Virtual Internal Network and Virtual External Network, and each VM I use connects to one or both of these (I’ll go into more detail on that in subsequent posts).
Now I am all set to bring in virtual machines and set up SCVMM, which is what I will be going through next time, as running virtual machines without SCVMM is a bit like changing gear without a clutch: technically possible but not that easy.
When I applied to join Microsoft nearly 4 years ago I had to give a presentation on OLAP for small business, which now looks a little dated. So yesterday I got the chance to revisit my thoughts on this over espresso with BI guru Rafal Lukawiecki from Project Botticelli, who was over to run a Microsoft BI seminar.
We both thought BI for small business has become even easier, because the capabilities of the top BI end-user tool, Excel, have grown and grown since Excel 2003: partly because it has got better itself, with slicers, sparklines, conditional formatting etc., but also because of the add-ins available:
What of the downsides? I would say data quality is key, as is the ability to join data together. In either case getting this wrong will produce faulty analysis, but as Rafal pointed out, in a smaller business this is much more likely to be picked up, as the users have a better feel for the numbers. So a sense of reality is key to this issue.
Training and awareness is the other barrier to adoption. A lot of people still don’t know about the more powerful capabilities of Excel; Rafal mentioned a family-run business where he introduced them to basket analysis, and they were so pleased with the results that they passed this best practice on to other businesses in their community. Another case in point was a guy at a housing trust I met recently who didn’t know that SQL Server came with a comprehensive set of BI tools, and as he rightly pointed out, where are the simple getting-started guides for these other components that a semi-professional IT/business user can understand? I don’t think TechNet has them, and actually it’s probably not the right place, but to be honest I couldn’t find too many other articles on using Excel for BI on Microsoft sites apart from a few blog posts (mine included!), so I shall see what I can do to fix this!
What does this all mean for the BI developer/practitioner/[insert your own word here]? I think it means that there are more customers wanting BI, and that what they need is practical help to clean their data, plus education and guidance to get the most out of the data themselves. I actually enjoy working with these smaller businesses because, as Rafal pointed out, they value agility over control – they just want to get going quickly, so you don’t get bogged down in procedures and overhead; you just get to play with the data and get close to their business.
I also think this trend will continue as PowerPivot evolves in the next release of SQL Server, but that’s another story!
As ever comments and feedback are welcome, and if you weren’t among the 300 at the seminar yesterday there might well be another one in May.
Running SQL Server in a virtual machine is a good thing for many reasons however there are two basic rules:
The guidance from the SQL CAT (SQL Server Customer Advisory Team) doesn’t extend to memory, as up until now you couldn’t do much with memory in Hyper-V except allocate a fixed amount to each virtual machine, an amount which can’t be changed while it’s running.
However, with the release of Windows Server 2008 R2 SP1 it is now possible to make use of dynamic memory. This allows you to set a minimum amount of memory needed to start a VM, and then each VM can be given a number of properties to make use of any unused memory on a given physical server:
The interesting thing here is that the amount used by the physical OS is controlled by Hyper-V unless you hack this in the registry (with the obvious word of caution implied in doing this).
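For completeness, the value in question is (as I read the SP1 guidance) a DWORD called MemoryReserve, in MB, under the Virtualization key; a hedged sketch below, and the usual back-up-the-registry-first caveat applies:

# reserve 2GB of RAM for the parent partition (value name, location and units
# are my reading of the SP1 guidance - verify before relying on this)
New-ItemProperty -Path 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Virtualization' `
  -Name MemoryReserve -PropertyType DWord -Value 2048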
You can still assign static memory to a virtual machine, so should you use dynamic memory with SQL Server?
The answer is probably yes, with one caveat: the extra memory Hyper-V hands over once a virtual machine has started appears to SQL Server as memory that has been hot-added while the server is running. Only certain versions and editions of SQL Server can recognise hot-added memory, namely the Enterprise and Datacenter editions of SQL Server 2005 and later.
If you use dynamic memory with other editions and versions, the memory will still be there, but SQL Server itself simply won’t recognise it.
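If you aren’t sure what an instance is running, one line from PowerShell will tell you whether it is an edition that can see hot-added memory (SERVERPROPERTY is standard T-SQL; MYSERVER is just a placeholder):

# report the version and edition of an instance - you want Enterprise or
# Datacenter for hot-added memory to be recognised
sqlcmd -S MYSERVER -Q "SELECT SERVERPROPERTY('ProductVersion'), SERVERPROPERTY('Edition')"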
For more on this check KB 956893
I reckon the IT professionals working in education have some of the most demanding jobs in the IT industry: they have to contend with all sorts of privacy and security problems and a multitude of software, and with only one or two of them per school they need to be real Jacks (or Jills) of all trades. Clearly they aren’t in it for the money either, so I spent a day with them yesterday to find out what makes them tick and understand a little of their world.
They have a big not-for-profit community complete with its own portal (#Edugeek), which is now growing beyond the UK, and they decided to visit Microsoft for their grand day out.
I wanted to break up the more day-to-day stuff on Office 365, System Center and licensing with a look at what Microsoft is working on in the near and distant future. This is a bit like being a Londoner: you really ought to know what to see and where to go in your own city, but you don’t always make the effort until friends or relatives turn up. So I had to do some digging around.
The big stuff worked on by Microsoft Labs and Microsoft Research is pretty much all in the space of natural interfaces. Kinect and Surface are good examples, and the new Surface 2 (the Samsung SUR40) has made this exciting device more of a commodity and integrated it more into the Windows fold. The Kinect SDK for Windows will do the same for that device, and I may well be able to run my future PowerPoint presentations using gestures rather than a mouse.
I also found a public video of StreetSlide, Microsoft Research’s approach to showing the detail of roads integrated into Bing Maps. This has already addressed the problem of making the view visually appealing and easy to navigate; the other change is around privacy, to avoid the concerns this technology has raised in the past.
In between this blue-sky stuff I also wanted to call out the interesting free stuff that might be useful in schools, like Live Photo Gallery with its fuse function to clean up your photos
and Photosynth, to stitch together a 3D composite which can then be geotagged on Bing Maps.
They loved all of this stuff, so hopefully I’ll get an invite to the next one. In the meantime, if you are in education you might wish to sign up for EduGeek (it’s free) and share problems with your hard-working peers.
If I want to copy 10 files from server A to server B, it doesn’t matter whether I copy them one at a time or try to launch 10 copy processes at once: both methods will take about the same amount of time, and the limiting factor is the network bandwidth I have. Now imagine I want to migrate ten virtual machines from Node A to Node B on a cluster - surely it shouldn’t matter whether I choose to sequence them or let them all move in parallel.
However, what is being copied is not a static file but the memory state of each virtual machine: the migration process tracks which blocks of memory change while the copy is made and then recopies those changed blocks. During this second copy yet more blocks will have changed (though not so many this time), and those in turn need to be recopied. This recopying continues until the memory on Node B is in sync with Node A, at which point the migration completes. The longer the migration takes, the more of this recursive copying of changed blocks is required, as the virtual machine keeps changing during the process.
If several virtual machines are migrated in parallel, the memory copy process is open for each virtual machine for the entire time of the migration; if they are copied serially, the copy process for each virtual machine is only open for the time needed to migrate that one machine. So copying virtual machines in parallel takes longer than doing each one in sequence: every machine is in flight for longer, so more blocks change in that time and need to be recopied.
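To put some toy numbers on that argument, here is a little PowerShell model of the pre-copy loop; it’s my own back-of-an-envelope sketch with made-up figures, not how Hyper-V actually schedules anything:

# each copy pass takes pages/rate seconds, during which the VM dirties more
# pages that have to be recopied on the next pass
function Get-CopyTime([double]$pages, [double]$rate, [double]$dirty) {
  $time = 0
  while ($pages -gt 1) {
    $pass = $pages / $rate      # seconds this pass takes
    $time += $pass
    $pages = $dirty * $pass     # pages dirtied while the pass ran
  }
  return $time
}
# 10 VMs of 4096 pages each, a link moving 1024 pages/sec, each in-flight VM
# dirtying 64 pages/sec: serial gives each VM the whole link briefly,
# parallel makes all ten share it for the whole job
$serial   = 10 * (Get-CopyTime 4096 1024 64)
$parallel = Get-CopyTime 4096 (1024 / 10) 64
"serial total: {0:N0}s   parallel total: {1:N0}s" -f $serial, $parallel

With those figures the ten serial migrations finish in about 43 seconds all told, while the ten parallel ones take around 107 seconds, because each pass at a tenth of the bandwidth leaves far more dirtied pages behind to recopy.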
Of course the sharp-eyed virtualisation experts reading this will tell me that you can’t copy virtual machines in parallel using Live Migration in Hyper-V. Correct, and my point is: why would you want to, if it takes longer?
Or have I missed something?
I’ve just got back from 5 weeks off in NZ to celebrate my 50th lap of the sun, hence the guest posts on here recently. Just before I went away the UK TechNet team setup Tech Days Online
.. based on a chat Simon and I had about what 2011 holds for the IT Professional which you can watch here
We recognised that it’s hard for you to give up complete days of your time to come and learn this directly, even if your manager signs off your expenses, so we thought 1-hour, focused, bite-sized chunks on core topics would be the way to go. Simon has done a couple of sessions while I was away, and my debut is next Tuesday (22nd Feb) on exactly what Microsoft’s Hyper-V Cloud is. The full list of events in this series is:
08 February The Modern Desktop
15 February Microsoft cloud for the IT Professional
22 February What is Hyper-V cloud
08 March Practical Deployment
15 March Creating Your Own Private Cloud
22 March Security integration with the cloud using Active Directory Federated Services (ADFS)
05 April Desktop Virtualisation
12 April Automation and the Private Cloud
19 April Managing the cloud
03 May Deployment with the Microsoft Desktop Optimisation Pack (MDOP)
10 May Virtual Desktop Infrastructure
17 May Office 365 for the IT Professional
07 June The Dynamic Duo, Windows Server 2008 R2 and Windows 7
14 June Mixing and moving services between the private and public cloud
21 June Governance Risk & Compliance in the cloud
We’ve put some clues in the videos to enable you to enter a competition to win some useful tech stuff, like a Samsung 42-inch plasma TV, a Canon EOS 1000D camera and an LG DVD home cinema system, but the real value of these sessions is helping you stay on top of Microsoft’s current thinking, which should translate into being a more valuable member of your IT team. If you can’t make the dates then the registration pages will also give you access to the recordings after the sessions.
Hopefully by Tuesday the jetlag will have worn off, the email backlog will clear and the memory of these guys will still be fresh..
I was looking at the Self Service Portal (SSP), a free add-on to System Center Virtual Machine Manager (SCVMM), recently and wondered if this would work alongside System Center Essentials 2010 (SCE) to create a basic private cloud. If you haven’t heard of the SSP, it enables business users to request services, run up virtual machines etc., and be charged monthly for the amount of resources they use.
I did get it all working after quite a bit of work, but I am pretty sure it’s not supported, so I have decided not to post the videos at the moment or to include it in the series I did on SCE Sunday. If you want to try it yourself you’ll need to do the following, and it should work for you:
1. Get the SSP from here
2. Install the Message Queuing (MSMQ) feature with Directory Services Integration enabled.
3. Run some PowerShell to
# create a host group
New-VMHostGroup MyHostGroup
# add my VM host to it
Move-VMHost –VMHost MyHost –ParentHostGroup MyHostGroup
as SCE doesn’t understand the concept of host groups, and this works because hidden under the covers of SCE, pretty much all of SCVMM is there except the user interface, and the SSP is essentially a solutions accelerator with PowerShell scripts to work its magic.
If this is of interest I will do some more research, and you should also put your comments on Microsoft Connect.
One of the capabilities of SQL Server is the ability to create and publish rich reports on your data. One of the capabilities of SharePoint Server is the ability to store, manage and provide a portal to documents. Those documents could be your reports. So you have one system letting you create reports and the other letting you share them afterwards, delivering your data into the hands of the users who need it. In theory, this should mean that your users should be able to find the data they need in existing reports, be sure they’re looking at the most recent information and not request new reports when the data already exists in others. I say in theory because I have less faith in users doing this than I have in the capabilities of the software to deliver this functionality.
In order to get to this situation, you need to do some configuration on both sides. This article on TechNet describes the steps clearly to get the initial configuration working. One of the main steps is that you need to install the Reporting Services add-in for SharePoint 2010, which includes the admin screens for Reporting Services inside SharePoint.
Once you’ve done your configuration, you can give people the ability to view reports through SharePoint. This is done using the SQL Server Reporting Services Report Viewer web part – quite a mouthful! This allows you to point to a specific report on your SQL Server and display it through the browser.
To add this web part, go into edit mode in SharePoint by going to the Page tab of the ribbon in the SharePoint site in question and clicking the Edit button. As with adding any other web part, you then go to the Insert tab that appears in the ribbon and click on the Web Part button. When integrated mode is set up, the Miscellaneous folder in the menu includes the option for the Report Viewer web part. This inserts the web part, but it starts off blank. So how do you connect it to a specific report?
You configure this by opening the tool pane. In the tool pane, you have the option to browse through SharePoint libraries to find reports that have been published. So now you can display SSRS reports in a SharePoint page, perhaps showing team reports in a team site or as part of a dashboard alongside other BI elements.
So now you can combine the rich power of Report Builder Reports with all the other capabilities that SharePoint provides.
Configuration article - http://technet.microsoft.com/en-us/library/bb326356.aspx
Jess is a partner technology advisor specialising in SharePoint and BI, working for Microsoft in the UK.
I have now completed my tour of System Center Essentials (SCE) 2010 and here is the complete list of videos and posts for this series:
Part 3 Post setup administration
Part 4 Computer Administration
Part 5 Monitoring
Part 6 Managing updates
Part 7 Software deployment
Part 8 Authoring
Part 9 Authoring continued
Part 10 SCE Backup
Part 11 Restoring SCE
Part 12 Virtual Machine Management
Part 13 Reporting
Hopefully that’s all of some use and shows how SCE can provide a comprehensive view of your IT infrastructure, ensuring you know about any problems and can respond to them before the help desk phones start ringing. SCE is included in TechNet Subscriptions here, or you can get a time-bombed trial version here, and I would also recommend looking at the SCE deployment guide and SCE operations guide, which will enable you to get going in the real world.
System Center Essentials 2010 (SCE) relies on SQL Server Reporting Services (SSRS) to show what’s going on with your infrastructure. As I mentioned in part 1, you need to either set this up at install time or point to an existing installation of SSRS. Once you have done that the reports you get will depend on which management packs you installed and whether you elected to setup SCE to manage virtual machines. I have made a short screencast introducing reporting in SCE here.
Other things to note, if the screencasts aren’t your thing:
This is the last in the series on SCE, and if you now want to try any of the stuff I have shown you over the last few weeks, it’s included in TechNet Subscriptions here or you can get a time-bombed trial version here.
Andrew asked a really interesting question back in December about the future of domain controllers. I’d like to point out two complementary paths that may converge in the future and work out a possible user story for them.
The first path is represented by Active Directory Federation Services. ADFS v2 is being used by Microsoft IT to provide identity information to internal, and some external, sites: http://channel9.msdn.com/shows/Identity/How-ADFS-v2-Helps-Microsoft-IT-to-Manage-Application-Access/. Using ADFS with 3rd parties means that my identity information is provided directly to the site based on my ability to log into a Microsoft domain; working within the corporate network this is entirely transparent, and I don’t have to create and manage accounts for the dozens of different internal and external services that I use. Should I leave Microsoft at some point, MSIT don’t need to contact all these companies to remove my access, as that access becomes impossible as soon as my account is disabled. Could a future version of Windows allow access to resources based on a standardized secure token and the claims that it contains?
A second path is that the number of identity providers I use is slowly consolidating. Previously it would be normal to create a new account for each service; now I expect to be able to sign in directly to new services such as Project Emporia using a Windows Live or Facebook account. The more experimental, temporary or infrequently used the service, the less I trust it to maintain my account. Why wouldn’t I consider employers the same way, rather than authenticating to a Windows AD?
Imagine a future scenario for Contoso Cycles looking at staff identity. They continue to have an Active Directory, but ADFS has been deployed, enabling staff to access supplier ordering sites directly based on their corporate identity, using federated identity at the supplier site. They have seasonal demand and take on temporary staff. The IT manager is aware that shops have been creating shared accounts for holiday staff, so rather than raising IT requests for each temporary staff member, a closed Facebook group is created and temporary staff are added to it by the store manager. Contoso IT authenticate Facebook users for domain access and give log-in permissions based on membership of the Facebook group.
BTW John recently joined Microsoft as an architect in the MTC
I met Greg Charman, one of the ex-Opalis experts who now work for Microsoft, a couple of weeks ago, and I thought it would be good to get his thoughts on how Opalis works with System Center and other similar tools in the systems management space. Take it away, Greg..
In December 2009 Microsoft acquired Opalis, a specialist provider of IT Process Automation (ITPA) software. The Opalis product is now in the process of being fully integrated into the System Center family of datacenter management products.
IT Process Automation software, formerly known as Run Book Automation (RBA), provides a platform to design and run IT processes. Standardizing the IT processes that underpin IT services means best practices can be deployed across the environment, regardless of the underlying management infrastructure. This is achieved by orchestrating and integrating the existing IT tools.
Traditional IT tools support the functions of one particular IT silo, sometimes offering automation of tasks within that silo. Unfortunately, IT business processes cross multiple IT silos, and today those bridges are provided by human beings, introducing delay, error and the rekeying of data. Opalis allows you to integrate and orchestrate the tools in each of the silos to support your end-to-end IT business process, rather than have those tools define what your business process will be.
Microsoft recognizes that companies run heterogeneous data centers. As a part of the System Center portfolio, Opalis workflow processes orchestrate System Center products and integrate them with non-Microsoft systems to enable interoperability across the entire datacenter. Opalis provides solutions that address the systems management needs of complex heterogeneous datacenter environments. Opalis has developed productized integrations to management software from Microsoft, IBM, BMC, CA, VMware, EMC, and Symantec. This enables users to automate best practices such as incident triage and remediation, service provisioning and change management process, and achieve interoperability across tools.
The combined offering of Opalis and System Center provides the ability to orchestrate and integrate IT management through workflow and simplifies routine systems management tasks in complex heterogeneous environments by:
With the new capabilities added to System Center in 2010, namely Service Manager and Opalis alongside the rest of the System Center suite, Microsoft can provide the tools to truly achieve the “Infrastructure on Demand” requirements being placed on IT executives.
Imagine a user has a requirement for a new virtual server which will host a business application.
First they go to a web front end, select a virtual machine template from the available options, and specify which application must be installed on the machine and how much data storage is required.
A fully automated request for provision of new infrastructure has been achieved with no human intervention required.
Opalis works with event management and monitoring tools to run automated diagnostics, triage and remediation actions to lower the number of level 1 and level 2 tickets staff have to manage. In this example, Opalis monitors Operations Manager for a critical performance alert on a virtual machine. To triage the cause, it retrieves the host name and checks performance on the host and virtual machines. If the host is the issue, it initiates Virtual Machine Manager to migrate the VM; once complete, it verifies performance and updates/closes the originating alert. If the VM is the issue, it creates and populates a ticket in Service Manager, initiates VMM to start a standby VM and updates the Service Manager incident with the new VM details.
In the above workflow Opalis monitors Operations Manager, runs triage and then takes appropriate action.
Opalis works with change management systems to automate requests and enforce best practices. Using Opalis, users can authorize, deploy and test changes such as adding new services, patching systems, or running audits to detect configurations that are out of compliance. In this use case, Opalis coordinates a patching process during the maintenance window. It opens a service desk ticket so all activity is tracked. It then queries VMM to get a list of offline VMs running Windows 7 and starts those machines. Opalis then reaches out to Active Directory for a list of computers running Windows 7 and initiates Data Protection Manager to run a backup. Once that is complete, Configuration Manager is initiated to update all machines with the patch. Upon completion, the VMs are returned to their offline state.
There is more information on Opalis + System Center at the links below, and a technology roadmap fully integrating Opalis as part of the Microsoft System Center portfolio will be available shortly, to clarify how System Center is becoming an increasingly powerful systems management platform for heterogeneous data center environments:
· Opalis (information on the acquisition):
· Opalis portal
· Microsoft System Center
· Installing Opalis