Well, that's it. I left home last Saturday morning at 7:00 and will be home tomorrow (Sunday) mid-morning. Add on a few days of jetlag and I've spent ten days at MMS. Was it worth it?
In a word - Yes.
This was the best conference I've ever been to (even better than the Microsoft internal-only events), and I've been in Microsoft just over ten years (so I've attended more than a few). The content was something else. The organisation was amazing (I'm always impressed by how we feed thousands of people). External events are generally organised better than internal ones anyway, because you have to pay to get in. San Diego is a fantastic town (even if it wasn't its normal sunny self). Obviously food & beer help the enjoyment factor, but it was really the content that sold it to me. If I were limited to one event a year - I would choose MMS.
If I had to summarise the whole week, it would be: the System Center products are, and will be, the best management products for the Microsoft platform ever. Who else knows how to do it better? Our goals within the Dynamic Systems Initiative are nearly met with the products that we'll ship within the next 18 months. I've always said that we were 80% of the way there now (the easy 80% - the next 20% will be the hard part), but looking at where we'll be in 18 months - I reckon we'll be well over 90%.
The Microsoft Platform is the best managed & easiest to manage platform out there.
Ron Markezich - our CIO
Proactive Management Strategies at Microsoft IT
Basically Ron's presentation was about how we use our own stuff. We have to 'dogfood' everything (i.e. run our business on beta software - we don't release products until our customers say it's OK, and Ron's organisation gets the last say).
He covered off some statistics about our environment (which seems to change every time I see the same/similar slide): 340,000 PCs, 7,200 production servers, 3 data centres (goal is 1), 189,000 SharePoint sites, 99.99% availability of Exchange, 3,000,000 internal emails per day, 10,000,000 inbound emails per day (9,000,000 of which are spam that is filtered at the gateway), 46,000,000 remote connections per month.
He put what he does into the frame of IOI, which basically helps you move from a reactive environment to a proactive one. Then he covered off the three tenets of IT (People, Process & Technology).
People are either Users or IT employees. For Users, his goals are to empower them, offer them seamless IT, help them be compliant and to make them cool. For his staff, his goals are to empower them, to make them a global workforce, help them provide remote management and to give them a mission rather than a job.
Process is all about MOF (we get assessed every year). His mantra is that what gets measured gets managed, and he's putting a lot of effort into configuration management.
Technology is all about Standardisation, Centralisation and Elimination.
We got to see a couple of demos of how IT use the MOM 2005 SLA Scorecard to monitor our Exchange environment (there'll be one for SQL in September). We also got to see how we use the Desired Configuration Monitoring to keep servers from 'drifting' from their desired state. The demo was running the desired state rules against our SharePoint infrastructure (supposedly for the first time) - the results were pretty good - as in there was a lot of stuff wrong (so maybe it really was live).
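The drift-detection idea behind Desired Configuration Monitoring can be sketched as comparing a desired-state baseline against the settings actually found on a server (a toy illustration with made-up setting names, not the real DCM rule format):

```python
def find_drift(desired, actual):
    """Return settings whose actual value differs from (or is missing
    against) the desired baseline."""
    drift = {}
    for key, want in desired.items():
        have = actual.get(key)
        if have != want:
            drift[key] = {"desired": want, "actual": have}
    return drift

# Hypothetical SharePoint-style settings, purely for illustration.
desired = {"AllowAnonymous": False, "MaxUploadMB": 50, "AuthMode": "Windows"}
actual = {"AllowAnonymous": True, "MaxUploadMB": 50}  # AuthMode not set at all

print(find_drift(desired, actual))
# Reports AllowAnonymous (wrong value) and AuthMode (missing)
```

A real implementation would pull the actual values from WMI or the registry; the point is just that "drift" is whatever differs from the baseline.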
Ron closed off by sharing what he's looking forward to in the not-too-distant future:
Security (NAP, Strong User Authentication, Bitlocker Drive Encryption, User Account Control, Role Based Security and Secure Web Publishing) - which for me means that I'll no longer be able to bring 'rogue' PCs into the office, I'll have to start using my SmartCard to logon to the network, my laptop's disk will be encrypted, I'll no longer be an admin on my own PCs and I'll be able to access LOB applications from the Internet without having to VPN in.
Manageability (ERP for IT, Desktop Instrumentation, Email Lifecycle Management, Mobile Device Management and Virtualisation of both Storage and Compute) - which again, being selfish, means to me that I'll be getting a 2Gb mailbox (up from my current 200Mb) - but I'll not be allowed to use PSTs anymore. And my mobile phone will start to be managed by IT (I'll have to remove all the games I've got on it).
Must go - Dave.
Software Virtualisation - ending DLL Hell once and for all
I saw a demo of this the other day at one of the keynotes - but hadn't quite got my head around it (wasn't entirely sure I understood what was going on) - I do now (I think?).
The presenter reckons this is the hottest thing to happen to desktops in the last 10 years.
So you can virtualise things like Storage (SANs) and Hardware (Virtual Server/PC) - this was about virtualising applications.
Obviously the issues are around 'DLL Hell' for older legacy applications (that needed their own version of a Windows system DLL) and other application conflicts (like same file extensions, shortcuts, ini files, services, etc) for the newer 'well behaved' applications.
Current solutions from Microsoft are the Windows Installer service and Terminal Services in Windows Server.
The Windows Installer 'owns' the software lifecycle (as in: purchase, implementation, production & maintenance and finally retirement) - it solves a lot of problems. Windows XP allows side-by-side DLLs, but only for apps that use MSI installer packages. Applications that don't can still get messed up because of the Windows System File Protection service (they'll install their DLL into the Windows directories & Windows will put the 'correct' version back again) - very good for Windows stability, but bad for the app.
Terminal Services seems to offer a solution, but all you're really doing is reducing the number of places for things to go wrong. Apps still need to use the Windows Installer.
Enter 3rd parties. There are two offerings: Streaming Software Virtualisation and Local Software Virtualisation.
Streaming Software Virtualisation is currently offered by Softricity (Softricity SoftGrid System). Citrix are working on something in this space (project tarpon). Basically, this method virtualises the File System, the Registry, Fonts, INI files, COM objects, etc for each application (it creates a 'sandbox' for each app). The apps are packaged up and put on a server. When the user initiates the app, only what is needed to run the app comes over the wire (Office is over 500Mb - to run Word takes 9Mb and less than 10 seconds). Nothing gets installed on the PC, but the app does get cached for speedier application initialisation. Users can request an offline license. As functions of the application are invoked (e.g. spellcheck), just that function is streamed over the network. The package then defines how the application runs, rather than how it is installed. You can't run the application unless you have a license (so license management is easy). Overall a very impressive solution, but expensive - a SoftGrid client is needed on every PC and all applications need to be re-packaged into this format.
Local Software Virtualisation is currently offered by Altiris (Software Virtualisation Solution - SVS), which takes a very different approach. SVS virtualises every component of the PC and uses a very light filter driver to map real system resources to virtual ones. Applications have to be re-packaged into its format and can be deployed as a single file (MSI packages can be 'wrapped' and used). The application package can then be deployed simply by copying the file down to the PC. Once there, it can be activated, deactivated and reset (goes back to the installation state). This virtualisation method is invisible to the user, to the application and to the operating system - which kind of means that you'd never need to retest an application if you were to patch the operating system. This approach is a lot cheaper than the streaming method, but you don't get the centralised control.
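The layering idea behind that filter driver can be caricatured like this - writes land in a per-application overlay, reads fall through to the real system when the overlay has no entry (a toy model I've made up to show the concept, nothing like the actual Altiris driver):

```python
class VirtualLayer:
    """Toy model of an application layer over a shared base system."""

    def __init__(self, base):
        self.base = base      # the 'real' file system (shared by all apps)
        self.overlay = {}     # this application's virtual entries

    def write(self, path, data):
        self.overlay[path] = data   # never touches the base system

    def read(self, path):
        # The app sees its own version first, the real one otherwise.
        return self.overlay.get(path, self.base.get(path))

    def reset(self):
        self.overlay.clear()        # discard everything the app changed

base = {r"C:\Windows\shared.dll": "v1"}
app = VirtualLayer(base)
app.write(r"C:\Windows\shared.dll", "v2")   # app installs its own DLL version
print(app.read(r"C:\Windows\shared.dll"))   # the app sees v2
print(base[r"C:\Windows\shared.dll"])       # the base system still has v1
app.reset()
print(app.read(r"C:\Windows\shared.dll"))   # back to v1
```

That separation is exactly why two apps wanting different versions of the same DLL can coexist: each one reads through its own overlay.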
So, Microsoft don't play in the Application Virtualisation space at the moment (well, Internet Explorer in Windows Vista runs in its own sandbox), but I would bet money (if I had any) that we will soon...
Data Protection Manager - how it works
Started off with the pain points (recovery is unreliable & painful, backup is too complex & slow, costs are too high). Then to the ideal customer (medium-sized data centres with 5-99 servers, or large enterprises with remote offices) - but DPM is very good for customers of all sizes.
So the DPM agent (that gets installed onto the servers you want to manage) is going to capture byte-level changes to the disk. It can be installed on Windows 2000 SP4 or above (no support for 64-bit machines or clusters - yet). We reckon there is a 3-5% overhead to the systems and you'll need around 500Mb of free disk space (for the sync log) - but this is really dependent on how many changes are made between your snapshots. The agent installs a file system filter driver (FSRecord.sys) that copies the byte-level changes to the sync log - in a copy-on-write manner. The DPM File Agent service will copy the contents of the sync log to the DPM server (where it creates the replica of the data).
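The capture-and-replay flow can be sketched like this: the filter records each byte-level change in a log, and the agent periodically ships the log to the server to bring the replica up to date (a big simplification of the real driver, with names of my own invention):

```python
sync_log = []  # stands in for the on-disk sync log

def write_bytes(volume, offset, data):
    """Filter driver's job: record the change, then let it hit the 'disk'."""
    sync_log.append((offset, data))
    volume[offset:offset + len(data)] = data

def synchronise(replica):
    """File agent's job: replay the log against the server-side replica."""
    for offset, data in sync_log:
        replica[offset:offset + len(data)] = data
    sync_log.clear()   # log only needs to hold changes since the last sync

volume = bytearray(b"hello world")
replica = bytearray(volume)       # the initial full replica on the DPM server
write_bytes(volume, 6, b"there")  # an application modifies 5 bytes
synchronise(replica)
print(replica.decode())  # -> hello there
```

The payoff is that only the changed bytes cross the wire between snapshots, not the whole file - which is why the sync log stays small relative to the protected data.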
We currently only work with file data, although there is documentation on how to get it to look after SQL (KB 910401) and Exchange (KB 909644) - basically you stop writes to the databases and take a snapshot.
SP1 will introduce support for 64-bit machines and clusters. The next version (V2 - probably a year away) will work natively with SQL & Exchange.
We have two technologies that seem to do similar things: DPM and Windows 2003 R2's DFS Replication (the branch office solution). To clarify the two, DPM is a backup & restore solution, while R2's solution is for high availability of file data (if the local server dies, your files are still online on the replica).
System Center Essentials 2007
Designed for small & medium businesses (not confirmed, but 'probably' 10 servers & 500 workstations).
Kind of MOM & SMS & Reporting Manager in one box.
One Console, with different 'spaces' for each management function (monitoring, updates, software, etc).
Getting & Staying Secure - built on the next version of WSUS with support for additional content (drivers, hotfixes and 3rd party stuff)
Monitoring is straight from System Center Operations Manager 2007 - state, diagram, performance and dashboard views with a really neat 'Daily Health Report'.
Software deployment and inventory.
Remote Control of servers & workstations.
Everything a small or medium business would ever need.
And here's the really great bit: Service Provider support: As a Service Provider, I install System Center Operations Manager 2007. I then give my customers a CD with an unattended install of System Center Essentials and now I'm managing their entire environment securely over the internet. I can even run reports per customer to demonstrate what a great service I'm providing for them.
No idea how much it will cost (or the detail of the number of devices it will support). Looks like it'll be out early in 2007.
This was the best demo session yet. I can't wait to get my hands on it.
I've been asked a few times now about how to get Windows Vista running inside Virtual PC and/or Virtual Server. Finally here's the answer:
First up, to get it working you are going to need the latest Virtual Machine Additions that come with the current version of Virtual Server. Download it (for free) from http://www.microsoft.com/virtualserver. If you're a Virtual PC user, you're still going to need the Virtual Server Additions (VMAdditions.iso).
Create a new Virtual Machine and give it at least 512Mb of memory. Put the Windows Vista DVD into the drive (either physically or mount the ISO) and boot the virtual machine. Setup starts: enter your Product Key, select the Custom installation option (Upgrade is greyed out), click Advanced, then create a new partition and format it. I've seen a "feature" whereby even though you now have a formatted partition on which to install, setup ignores it - reset the virtual machine and re-run setup - it always works second time around. Once you have selected a partition on which to install, setup will continue (go and do something else for a couple of hours - this part takes a long time). Setup will end and ask you for a username, a computer name and your timezone, then it automatically logs you on.
Next you have to install the Virtual Machine Additions - until you have done this, the machine is VERY SLOW.
For Virtual Server select: Edit Configuration, then scroll down and select Virtual Machine Additions, check Install Virtual Machine Additions and click OK.
For Virtual PC: Right Click the CD icon in the lower left hand corner of the Virtual PC window, Select Capture ISO Image, then Browse for VMAdditions.iso that you’ve taken from a Virtual Server install.
Click through the setup screens, reboot and you're done.
The local administrator's password is blank - so set it please.
Windows Vista speeds up with time, especially after the initial install - so leave it running for a long while (overnight) before you start "playing".
Enjoy - Dave.
Management in Windows Vista.
Three big goals: Maintain PC Configuration, Simplify Configuration Management & Desktop Troubleshooting and Task Automation.
This was a very good session with loads of demos.
The big thing to help maintain the PC's configuration is User Account Control - users can be standard users and not admins. This restricts what users can do to a system (which will mean less downtime and higher productivity) and it will mean less requirement to re-image PCs, as the worst a user will be able to do is screw their own profile (it's quicker to put a new user profile onto a machine than it is to re-image it). Windows Resource Protection also plays a big part here - only the OS Trusted Installer service can change system files or system registry settings. Of course all this is managed via Group Policy (something like another 500 new settings - including what removable devices are allowed and what can be done with them, e.g. read-only or read/write). We've improved update management (fewer reboots, the ability to update an image and the ability to use auto-update to fix everything - not just critical Windows fixes). We also introduce Windows Remote Management (WinRM) - a WS-Management-based, firewall-friendly remote access protocol.
I really like the new Event Viewer and Task Scheduler. All events are actionable - I can assign a task to an event. The Reliability Analysis Console is very good - the ability to monitor reliability over time and map changes in reliability to application installations and Windows configuration changes.
Deploying Windows Vista clients with SMS
Sorry, but I've seen this before too many times (so my interest was low).
SMS shipped an Operating System Deployment Feature Pack (OSD) a couple of years ago. We are updating that feature pack to support Windows Vista.
The original OSD introduced the Windows Imaging Format (WIM) - a file based image that eliminates duplicate files in the image and offers around 3:1 compression. The version of WIM was 0.9
Windows Vista uses WIM itself for its installation process - it uses WIM 1.0.
Windows Vista is HAL independent and you can load extra device drivers into the image - so you're closer to getting to one image.
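The single-instance trick that makes WIM images small can be illustrated by storing each unique file body once, keyed by a content hash (a toy sketch of the idea only - this is not the WIM file format, and the file names are made up):

```python
import hashlib

def build_image(files):
    """Store each unique file body once; the index maps paths to hashes."""
    store, index = {}, {}
    for path, data in files.items():
        digest = hashlib.sha256(data).hexdigest()
        store[digest] = data   # identical bodies collapse to one entry
        index[path] = digest
    return store, index

files = {
    r"C:\Windows\notepad.exe": b"AAAA",
    r"C:\Backup\notepad.exe": b"AAAA",   # duplicate body, different path
    r"C:\Windows\calc.exe": b"BBBB",
}
store, index = build_image(files)
print(len(files), "files,", len(store), "unique bodies")  # -> 3 files, 2 unique bodies
```

An image of a typical Windows install is full of duplicated files, which is where most of that ~3:1 saving comes from before any byte-level compression is applied.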
So it works like this:
In SMS create the Image Capture CD (this is a WinPE disk, that you can add extra drivers to) - you actually create a 150Mb ISO file, that you burn to CD.
Then you build your reference PC - Operating System, Applications, etc.
Then insert the Image Capture CD into the reference PC. It will run sysprep and reboot the PC. The machine boots from the CD and loads WinPE, which then creates the image file (of the C: drive) and copies it to a file share (that you defined when you created the CD).
I now have a .WIM file of my reference PC, which I load into SMS as a package. I then associate a Program to the package that includes all my setup options and any additional things I might want the setup to do (like run USMT). Now I have my SMS package defined, I get it out to my distribution points. Once it's there I advertise the package to a collection of PCs.
When the advertisement runs, the user is notified of what's going on (if you made that choice during the program definition). The program runs several phases:
- Validation (is this PC capable of accepting this image - enough disk space, etc).
- Pre-Installation (USMT to save user & machine state, partition & format disk if required).
- Installation (delete unwanted files from disk - \windows, \program files, etc. then expand WIM file onto disk)
- Post-Installation (additional SMS packages to install other stuff)
- State Restore (USMT in reverse).
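The phase sequence above can be sketched as a simple pipeline that only proceeds past validation if the PC qualifies (the phase names are from the session; the code and the disk-space threshold are mine, purely for illustration):

```python
def validate(pc):       return pc["free_disk_gb"] >= 15   # assumed threshold
def pre_install(pc):    pc["state_saved"] = True          # USMT capture
def install(pc):        pc["os"] = "Windows Vista"        # expand WIM onto disk
def post_install(pc):   pc["extras_installed"] = True     # extra SMS packages
def state_restore(pc):  pc["state_restored"] = True       # USMT in reverse

def run_advertisement(pc):
    """Run the OSD phases in order; bail out if validation fails."""
    if not validate(pc):
        return "failed validation"
    for phase in (pre_install, install, post_install, state_restore):
        phase(pc)
    return "deployed"

print(run_advertisement({"free_disk_gb": 40}))  # -> deployed
print(run_advertisement({"free_disk_gb": 5}))   # -> failed validation
```

The important design point is that validation runs first and is the only phase allowed to abort - once the disk is being wiped, there's no going back.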
More later - Dave.
WOW - that was pretty good.
Kirill Tatarinov (Corporate VP, Windows Enterprise Management Division) covered off System Center - current & future.
He positioned System Center as the vehicle to deliver the "Manage Complexity, Achieve Agility" part of our People Ready message.
We'll be doing well if we help you to Ensure Your Business is Always Running, Eliminate Unnecessary Complexity and Establish a Responsive Infrastructure.
He covered off Infrastructure Optimisation (http://www.microsoft.com/io) - which is all about getting from where you are now to an eventual destination of Dynamic Systems (it's all about implementing Best Practices).
So System Center is Microsoft's branding for our 'knowledge driven management solutions family', which are Enterprise Ready, Flexible & Extensible.
He covered off what we have delivered so far (Data Protection Manager, Capacity Planner, Reporting Manager, SMS, MOM, etc and betas of some new stuff).
"SMS is dead - long live System Center Configuration Manager 2007" (previously known as SMS v4).
"MOM is dead - long live System Center Operations Manager 2007" (previously known as MOM v3) - public beta due in May/June.
The roadmap of future products is pretty exciting. The big investments are around Deeper Knowledge, a Self Service Portal, a CMDB and Workflow. Also products that cover Operations Management, Incident & Problem Management, Asset Lifecycle Management, Change Management and Configuration Management. Products are: SMS R2 (soon), System Center Operations Manager 2007 (this year), System Center Essentials 2007 (MOM & SMS & Reporting Manager in one box for small organisations - early 2007), System Center Configuration Manager 2007 (early 2007), System Center "Carmine" (management of a virtualised environment - late 2007), System Center "Service Desk" (late 2007) and System Center Operations Manager 2007 R2 (early 2008).
Some fantastic demos of all this, including one of SMS and AssetMetrix (a recent Microsoft acquisition) that enables "real" business-level asset management - it showed how to report on what's installed side by side with what you own (licenses) - excellent.
Too much information in too short a time - there was definitely more (but I can't remember it all). Must dash, I'm late for the next session.
More later, Dave.
Remote Management Capabilities in Windows Vista.
Bit of a misleading title.
The challenges here are: heterogeneous hardware & software, multiple protocols and security models, ports & firewall exceptions and packet filters. Cross platform is very hard and expensive.
We're addressing the problem by going down the standards route. WS-Management was ratified by the DMTF the other day. It's got support from most of the big industry players (Intel, AMD, Sun, HP, Dell, etc) at both the hardware & software levels. Big management ISVs are in too (BMC, CA, HP, Novell, etc).
Event aggregation, health monitoring & troubleshooting, and collecting events from anywhere (we have a generic adapter that will collect WMI events from hardware and convert them into WS-Management). WS-Management will obviously work through firewalls.
In Vista, the components are: Windows Remote Management (WinRM), Windows Remote Shell (WinRS), an IPMI driver, an event collector and an event forwarder. The Remote Shell will initially be CMD.exe but in the future will be the Windows PowerShell.
WS-Management will be natively in HW soon - that will enable you to remotely configure hardware and potentially perform out-of-band functions like watch a re-boot (all over http/https).
System Center Operations Manager 2007 will be able to consume events from agentless machines - just by configuring the event forwarder to send events to it. That opens up monitoring of workstations.
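The forwarder/collector pattern can be sketched as a subscription filter: the source machine pushes only the events that match the subscription to the collector (toy code of my own, not the WS-Management wire format or any real WinRM API):

```python
class Collector:
    """Stands in for the central event collector (e.g. a monitoring server)."""
    def __init__(self):
        self.received = []

    def receive(self, event):
        self.received.append(event)

class Forwarder:
    """Pushes locally raised events that match a subscription to a collector."""
    def __init__(self, collector, min_severity):
        self.collector = collector
        self.min_severity = min_severity   # the subscription: severity floor

    def raise_event(self, source, severity, message):
        if severity >= self.min_severity:  # only matching events cross the wire
            self.collector.receive(
                {"source": source, "severity": severity, "msg": message})

collector = Collector()
fwd = Forwarder(collector, min_severity=2)  # forward warnings (2) and errors (3)
fwd.raise_event("workstation-01", 1, "informational - stays local")
fwd.raise_event("workstation-01", 3, "disk failure predicted")
print(len(collector.received))  # -> 1
```

Filtering at the source is what makes agentless monitoring of thousands of workstations plausible - the collector only ever sees the events it subscribed to.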
Had to leave this session early, so missed the wrap up (hence the 'not so structured' notes).