Lots of good press coming out of Mix. I was able to catch the keynote webcast, and it was great to hear from Ray Ozzie in some depth about his vision for the future of software plus services. Even internally, unless you're on one of those teams, the news has been pretty sparse. I like the fact that virtually everything they announced is available at least in beta to test, and most of it is scheduled to be completed before year's end. No vaporware here. Stepping back a little to take it all in, these announcements are very significant for the light they shed on the future: cross-platform support, cloud-based services, rich user experiences, APIs everywhere. For as huge a company as Microsoft is, the amount of change and the number of new products in the last 18 months is incredible.
Here is some of the best coverage so far:
http://visitmix.com/ - The official site, lots of content and the session videos should be up here within a day or two
http://blogs.zdnet.com/Stewart/?p=356 - Best overall summary so far compliments of Ryan Stewart
http://blogs.msdn.com/jasonz/archive/2007/04/30/announcing-net-framework-support-for-silverlight.aspx - Good commentary from Jason Zander
http://blog.jonudell.net/2007/04/30/a-conversation-with-john-lam-about-the-dynamic-language-runtime-silverlight-and-ruby/ - Jon Udell article and podcast
Hello from TechReady 5 in Seattle. Like last year, I'm going to blog each day this week with news from TechReady, Microsoft's internal technical conference for employees. TechReady is a great event held twice a year where we get to hear from the product groups about all the upcoming releases for the year as well as training on all of the current releases.
Today kicked off with a keynote address by Kevin Turner, Microsoft's Chief Operating Officer. As is usually the case, at the end of his presentation he took questions from the audience. This is one of the great things about the company: the executives will take any question, and in most cases the audience doesn't pull any punches. Today, I had the opportunity to step up to the mic and ask Kevin a question. I wanted to gauge the executive commitment to the amount of investment and vision required for the move to software plus services, hosting, and large data centers that we keep talking about and that I am a big proponent of. These are huge business changes that are in some cases completely different from the models we have historically been successful with.
Since this is an internal conference, I don't think it's appropriate to quote his response, but I came away convinced that the commitment is absolutely there from the highest level. I think the top-level vision is dead on in terms of offering customers the choice: services hosted by Microsoft, services hosted by our partners, or services hosted by the customers themselves. Everyone else is pitching an all-or-nothing approach.
Tomorrow Bill Gates will be our morning keynote, so I am definitely looking forward to that! Maybe I'll get a question in to him too! Check back tomorrow for a summary of his thoughts and a review of the technical sessions I'll be attending.
As someone who attempts (and whose job it is!) to maintain at least a passing knowledge of all of the Microsoft infrastructure servers and expertise in at least a few of them, weeks like this are both exciting and a great challenge. In case you missed them, there were a lot of announcements and releases this week. Here are some of the big ones:
System Center Operations Manager RTM
Office Communications Server 2007 Public Beta
PowerShell to be included in Longhorn Server
As if that isn't enough, these are some other announcements from MMS about upcoming releases:
"At MMS 2007, Microsoft will highlight progress made on its commitment to delivering a highly integrated, modular family of systems management solutions including the spring release to manufacturing (RTM) of Systems Management Server Service Pack 3 with AssetMetrix; the public availability of Data Protection Manager v2 Beta 2 within 45 days and the first public beta of Microsoft System Center Service Manager offering (formerly code named “Service Desk”) available in 30 days; the public availability of the Beta 2 release of Microsoft System Center Virtual Machine Manager within 45 days; and the recent release of System Center Configuration Manager 2007 Beta 2 this past February. Microsoft also announced plans to build and ship an add-on that will support the 2007 Intel vPro with Intel AMT technology (code-named “Weybridge”) after the RTM of System Center Configuration Manager 2007 (SMS V4). This ensures that customers that implement and use Intel AMT and Systems Management Server 2003 are fully supported when they move to System Center Configuration Manager 2007."
These are just huge announcements. All of these are either major upgrades to existing products or completely new offerings. Looking at that list, it should be pretty clear that Microsoft is very serious about the management space. The announcements with Cisco and EMC, the submission of the SML standard, etc. should also make it clear that System Center will expand far beyond just managing Windows servers.
Brian Madden has an excellent post up today called The hidden costs of VDI. I’ve been working nearly full time for the last two months helping to put together a Microsoft Services offering around desktop virtualization in general and VDI in particular, so I have spent a lot of time looking into both the technical and business considerations that must be taken into account. I’d summarize his post in three points:
As a well-known fan of and expert on Server Based Computing (SBC), i.e. Terminal Services or Citrix Presentation Server/XenApp, Brian prefaced the article by saying that he likes VDI “where it makes sense”. He correctly points out that nearly all vendors and TCO models show that Server Based Computing still provides the lowest TCO due to its high user density, but that there are limitations which make other approaches such as VDI relevant.
That is where I’ll jump in with my thoughts because I completely agree with those statements and it has been the foundation of the offering I have been working on. It starts with the notion of flexible desktop computing and desktop optimization that Microsoft has been talking about for some time now. An overview of this approach is presented in this whitepaper. To summarize, there are a variety of ways that a desktop computing environment can be delivered to users ranging from traditional desktops, to server based computing, to VDI, with a multitude of variations in between with the addition of virtualization at the layers illustrated below:
Rather than selecting a one-size-fits-all solution, virtualization provides architects a new, more flexible set of choices that can be combined to optimize the cost and user experience of the desktop infrastructure. The following four steps lead to an optimized solution:
Define User Types: Analyze your user base and define categories such as Mobile Workers, Information Workers, Task Workers, etc. and the percent distribution of users among them. The requirements of these user types will be utilized to select the appropriate mix of enabling technologies.
Define Desktop Architecture Patterns: Each architecture pattern should consist of a device type (thin client, PC, etc.) and choice of:
For each pattern, determine which user types it can be applied to. For example, with mobile or potentially disconnected users, presentation virtualization alone would not be applicable as it requires a network connection. Power users may require a full workstation environment for resource intensive applications but may be able to leverage application virtualization for others. These are just a few examples where different user groups have different requirements.
Determine TCO for each Architecture Pattern: Use a recognized TCO model to determine the TCO for each pattern. Minor adjustments to these models can be made to account for specific technology differences, but most include TCO values for PCs, PCs with virtualized apps, VDI, and TS/Citrix thin client scenarios. Be wary of vendor-provided TCO models. To Brian’s points, be sure to gain a full and complete understanding of the chosen TCO model and what it does and does not include. Consistent application of the model across the different architecture patterns is critical for relevant comparisons.
Model Desktop Optimization Scenarios: With the above data, appropriate architecture patterns can be selected for each user type by choosing the lowest-TCO architecture pattern that still meets user requirements. By varying the user distribution and selected architecture patterns, an optimized mix can be determined. It is tempting to simply choose the lowest-TCO architecture pattern for all users, but this can be very dangerous: it will typically impact your high-value power users the most if their requirements are not accounted for.
A one-size-fits-all approach would result in either a large number of PCs if not using virtualization, a large number of servers if virtualizing everything, or failure to meet power user needs if using only server based computing. An optimized solution is one which utilizes the right mix of technologies to provide the required functionality for each user type at the lowest average TCO. Combined with a unified management system that handles physical and virtual resources across devices, operating systems, and applications, substantial cost savings can be realized.
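To make the selection logic concrete, here is a minimal sketch of the optimization step described above. All of the user types, capability requirements, and TCO figures are hypothetical placeholders for illustration, not values from any real TCO model; the point is simply how the lowest-TCO applicable pattern is chosen per user type and rolled up into a weighted average.

```python
# Illustrative sketch of the desktop optimization steps above.
# User types, patterns, capabilities, and TCO numbers are all
# hypothetical examples, not values from a real TCO model.

user_types = {
    # name: (share of user base, capabilities this user type requires)
    "Mobile Worker": (0.30, {"offline"}),
    "Information Worker": (0.50, set()),
    "Task Worker": (0.20, set()),
}

patterns = {
    # name: (annual TCO per user, capabilities the pattern provides)
    "Traditional PC": (1000, {"offline", "local_apps"}),
    "PC + App Virtualization": (900, {"offline", "local_apps"}),
    "VDI": (800, {"local_apps"}),
    "Server Based Computing": (600, set()),
}

def optimize(user_types, patterns):
    """For each user type, pick the lowest-TCO pattern that still
    covers all of that user type's required capabilities."""
    plan = {}
    for utype, (share, needs) in user_types.items():
        candidates = [
            (tco, name)
            for name, (tco, caps) in patterns.items()
            if needs <= caps  # pattern must meet every requirement
        ]
        tco, name = min(candidates)  # cheapest applicable pattern
        plan[utype] = (name, tco, share)
    return plan

plan = optimize(user_types, patterns)
avg_tco = sum(tco * share for _, tco, share in plan.values())

for utype, (name, tco, share) in plan.items():
    print(f"{utype}: {name} (${tco}/yr for {share:.0%} of users)")
print(f"Weighted average TCO: ${avg_tco:.0f}/user/yr")
```

With these made-up numbers, mobile workers land on the PC-with-app-virtualization pattern (the cheapest one that supports offline use) while the other types land on Server Based Computing, which is exactly the mixed outcome the text argues for over a one-size-fits-all choice.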
As I mentioned at the top, a lot of these concepts, along with very detailed architecture and implementation guidance, are part of the Microsoft Services Core IO offerings. For the last two years, in addition to my customer work, I have been deeply involved in the creation of the Server Virtualization with Advanced Management (SVAM) offering. The work I mentioned above around VDI architecture will complement that and be available later this summer. Finally, specific to desktop imaging, deployment, and optimization, there is also the Desktop Optimization using Windows Vista and 2007 Microsoft Office System (DOVO) offering. Taken together with the underlying product suites, these illustrate Microsoft’s “desktop to datacenter” solutions and how to plan, design, and implement them.
If you want to learn more about SQL 2008, here is a great list of resources.