Your Guide to the Latest Windows Server Product Information
Hi, Mike Neil here. It’s been a while since I've posted. To update you, I now oversee the development teams, plans and strategy for our virtualization software, both desktop and server. In this role, I’m really fortunate in that I get to meet with a broad range of customers and partners. I value these meetings because sometimes the discussions and input validate my thinking and our planning, and other times they challenge it.
Along those lines, a couple of weeks ago I had a conversation with a reporter from the NY Times. The reporter’s questions challenged the rationale behind some of our thinking and decisions relative to virtualization. So in many ways it was like the customer and partner meetings I attend on a weekly basis. But unlike those meetings, a reporter wields the power of the pen. And it goes without saying that a NY Times reporter carries a big pen.
I know some of our customers and partners, after reading the Times' article, may wonder about our goals and strategy with respect to virtualization. I shared much of this with the reporter, but inevitably a reporter can’t print every comment. It’s important for customers to understand our broader goals and strategy, and from there some of the more specific issues. So I wanted to share my thoughts on this.
My team's ultimate goal here is to help customers’ IT systems become more self-managing and dynamic. We know that the effort required to deploy and configure IT systems often makes change too costly and too slow for IT to keep pace with changing business needs. Virtualization software is just one technology – one piece of the formula – that helps customers gain more control of their IT systems and enables the business to respond faster and stay ahead of the competition.
The people I work with are making broad investments to offer customers a set of virtualization products that help them become more dynamic. These investments span multiple disciplines and range from the desktop to the datacenter. These investments – in the areas of the platform, management, applications, interoperability and licensing – fuel our virtualization strategy. I’ll touch on each now.
Platform: resource management has always been part of operating systems, be it mainframe, UNIX, or more recently x86-based operating systems. Today, vendors such as Sun, Novell and Red Hat incorporate virtualization into their x86 operating systems, and just a few weeks back we saw KVM added to the Linux kernel. Our desire is to bring hypervisor-based virtualization to Windows Server – and a wider range of customers – with Windows Server "Longhorn." We’re adding some innovative functions to Windows Server virtualization (aka Viridian), like live migration, support for up to eight virtual processors, and hot add of resources such as disk, networking, memory and CPU, so customers have more flexible and dynamic deployment options for all their workloads.
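To give a feel for what live migration involves: the common approach is iterative pre-copy, where the host keeps sending memory pages to the destination while the VM runs, re-sending whatever the guest dirties, until the remaining set is small enough to transfer during a brief pause. The sketch below is a toy simulation of that general pre-copy idea, not Viridian's actual implementation; the parameters are illustrative.

```python
def live_migrate(pages, dirty_rate=0.1, max_rounds=10, stop_threshold=8):
    """Toy simulation of pre-copy live migration.

    pages: number of guest memory pages to transfer.
    dirty_rate: fraction of sent pages the running guest dirties each round.
    Returns (pre_copy_rounds, pages_left_for_the_final_pause).
    """
    to_send = pages  # initially every page is "dirty"
    rounds = 0
    while rounds < max_rounds and to_send > stop_threshold:
        sent = to_send
        # While those pages were in flight, the guest dirtied some again.
        to_send = int(sent * dirty_rate)
        rounds += 1
    # Now pause the VM briefly and copy the small remaining dirty set
    # (the "stop-and-copy" phase), then resume on the destination host.
    return rounds, to_send

rounds, final_copy = live_migrate(pages=100_000)
print(rounds, final_copy)  # 5 rounds of pre-copy, 1 page left to copy while paused
```

The key property the simulation shows is that downtime depends on the final dirty set, not total memory size, which is why a running workload can move between hosts with only a sub-second pause.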
Management: we want to make Windows the most manageable virtualization platform by enabling customers to manage both physical and virtual environments using the same tools, knowledge and skills. No other virtualization platform provider is delivering this. Today customers can use MOM and the management pack for Virtual Server. We’re extending the virtual infrastructure management capabilities with System Center Virtual Machine Manager, which will allow customers to increase physical server utilization, centralize management of virtual machine infrastructure and quickly provision new virtual machines. And it’s fully integrated with the System Center product family so customers can leverage existing skill sets.
Applications: application virtualization provides a finer-grained solution than virtual machines, and it’s better suited to enterprise desktop environments. It’s about making application deployment more cost-effective via centralized management. SoftGrid allows customers to more easily deploy and update applications.
But there’s a greater significance here. When SoftGrid is combined with its streaming delivery mechanism, customers can turn any Windows program into a dynamic service that follows users wherever they go. Integration into the Microsoft management infrastructure means that these application services can be administered using policy-based management with existing tools. And in my mind’s eye, I see a time when application virtualization will allow customers to easily move to an on-demand computing / services model for the hundreds of thousands of Windows applications built by partners, customers and Microsoft.
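The core mechanism behind this kind of application virtualization is isolation: the app sees a private, virtualized view of files and settings, so its writes never alter the underlying machine. Here is a deliberately simplified copy-on-write sketch of that isolation idea in Python; it is illustrative only and not how SoftGrid is actually built.

```python
class VirtualFileLayer:
    """Toy copy-on-write file layer: reads fall through to the base
    system, writes land in a per-application overlay. Illustrative
    sketch of the isolation concept, not SoftGrid's design."""

    def __init__(self, base):
        self.base = dict(base)   # the "real" machine state (never modified)
        self.overlay = {}        # this app's private writes

    def read(self, path):
        # The app sees its own changes first, then the base system.
        if path in self.overlay:
            return self.overlay[path]
        return self.base[path]

    def write(self, path, data):
        # The virtualized app never touches the base system.
        self.overlay[path] = data

    def reset(self):
        # Removing the app = discarding the overlay; the base is untouched.
        self.overlay.clear()

system = {"C:/Windows/app.ini": "version=1"}
layer = VirtualFileLayer(system)
layer.write("C:/Windows/app.ini", "version=2")
print(layer.read("C:/Windows/app.ini"))  # the app sees version=2
layer.reset()
print(layer.read("C:/Windows/app.ini"))  # the machine still has version=1
```

Because the application's whole footprint lives in that overlay, deploying, updating, or removing it becomes a data operation rather than an install, which is what makes streaming the app to whatever desktop the user logs into practical.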
Interop: this area of our strategy has been well documented of late, be it the Novell agreement, the XenSource agreement, moving the VHD license under the Open Specification Promise, or running and supporting Linux guest VMs on Virtual Server. But these are just the higher-profile examples. We’ve also taken more subtle steps to enable and support interoperability. For example, the APIs for Virtual Server 2005 have been publicly available on MSDN since day one. And much of the preliminary detail on the Windows Server virtualization (part of Longhorn) APIs was shared at WinHEC last year in the documentation given to each attendee. Like other Windows APIs, we plan to publish these publicly at beta. We’re doing this because, in the end, customers with mixed environments expect it all to work together.
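One concrete payoff of putting the VHD format under the Open Specification Promise is that anyone can implement the on-disk format from the published specification. As an illustration: per that spec, a fixed VHD ends with a 512-byte footer that starts with the "conectix" cookie and carries a one's-complement checksum. The sketch below builds and re-parses a minimal footer; field offsets follow my reading of the published specification, so treat it as a hedged example rather than production code.

```python
import struct

def vhd_checksum(footer):
    """One's-complement of the byte sum, with the 4-byte checksum field
    at offset 64 treated as zero -- as described in the VHD spec."""
    total = sum(footer[:64]) + sum(footer[68:])
    return (~total) & 0xFFFFFFFF

def make_fixed_vhd_footer(size_bytes):
    footer = bytearray(512)
    footer[0:8] = b"conectix"                       # cookie
    struct.pack_into(">I", footer, 8, 2)            # features (reserved bit set)
    struct.pack_into(">I", footer, 12, 0x00010000)  # file format version 1.0
    struct.pack_into(">Q", footer, 16, 0xFFFFFFFFFFFFFFFF)  # fixed disk: no data offset
    struct.pack_into(">Q", footer, 40, size_bytes)  # original size
    struct.pack_into(">Q", footer, 48, size_bytes)  # current size
    struct.pack_into(">I", footer, 60, 2)           # disk type 2 = fixed
    struct.pack_into(">I", footer, 64, vhd_checksum(footer))
    return bytes(footer)

def parse_footer(footer):
    assert footer[0:8] == b"conectix", "not a VHD footer"
    stored = struct.unpack_from(">I", footer, 64)[0]
    assert stored == vhd_checksum(footer), "checksum mismatch"
    size = struct.unpack_from(">Q", footer, 48)[0]
    disk_type = struct.unpack_from(">I", footer, 60)[0]
    return size, disk_type

footer = make_fixed_vhd_footer(64 * 1024 * 1024)
print(parse_footer(footer))  # (67108864, 2): a 64 MB fixed disk
```

That openness is the point: a tool on any platform can create, inspect, or convert VHDs without asking us first.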
Licensing: this was one of the bigger challenges a couple of years back – figuring out how to provide customers and partners more cost-effective, flexible and simplified licensing for server virtualization. As with licensing software for multi-core servers, virtualization licensing has been, and continues to be, a challenge for the overall software industry, because licensing models were built around hardware. Virtualization allows new and more dynamic uses of software that were not possible in a hardware-only model. A couple of years back we moved from installation-based licensing to instance-based licensing for server products. Since these updates, the market has grown … as I mentioned in the Times’ article. And just last week, with the release of SQL Server 2005 SP2, we announced expanded virtualization use rights to allow unlimited virtual instances on servers that are fully licensed for SQL Server 2005 Enterprise Edition. I’m sure licensing for virtualized environments will continue to evolve for us and the industry.
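To make the instance-based model concrete: Windows Server 2003 R2 Enterprise Edition permits up to four running virtual instances per license, so licenses can be stacked on a host rather than bought per installation. The snippet below works through that arithmetic for a single host; the prices are placeholders I made up for illustration, not real list prices.

```python
import math

def enterprise_licenses_needed(running_vms):
    """Enterprise Edition permits up to four running virtual instances
    per license, so stacking licenses on one host covers more VMs."""
    return math.ceil(running_vms / 4)

def cheapest_option(running_vms, std_price, ent_price):
    """Compare per-instance Standard licensing against stacked Enterprise
    licenses for one host. Prices are hypothetical placeholders."""
    standard_cost = running_vms * std_price
    enterprise_cost = enterprise_licenses_needed(running_vms) * ent_price
    if standard_cost <= enterprise_cost:
        return ("Standard", standard_cost)
    return ("Enterprise", enterprise_cost)

# Ten VMs on one host: three stacked Enterprise licenses vs. ten Standard ones.
print(enterprise_licenses_needed(10))          # 3
print(cheapest_option(10, std_price=999, ent_price=3999))
```

The break-even shifts with density: the more instances you consolidate onto a host, the better the stacked (or, with Datacenter Edition, unlimited) rights look, which is exactly the dynamic-use pattern virtualization creates.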
Along those lines, there’s been much written about the EULA for the home editions of Windows Vista. But what hasn’t been well reported is where we have made advances. The primary use cases here are business customers and enthusiasts. With Windows Vista Enterprise Edition, we allow the user to run four installations of Windows in VMs, and they can install and use Vista Business Edition in a VM. Virtualization is a new technology for consumers, and one that isn’t yet mature enough from a security perspective for broad consumer adoption. But for enthusiasts and early adopters, we do permit Vista Ultimate to be used in a VM. We have researched these issues with current virtualization hardware architectures, and one thing that is clear is that our security and data protection features can potentially be subverted by a malicious virtualization layer. We’re working with the hardware and software industry to improve the security of virtualization technologies, and we will evolve our licensing policies as virtualization becomes more widely used on client systems.
So these are the major parts of our virtualization strategy. It’s a means to an end, the end being to help customers’ IT systems become more self-managing and dynamic. Does this mean everything will be virtualized in five years? Not likely, given continuing innovation at both the hardware and software levels and the fact that no single solution applies to everyone. That said, virtualization will become the default setting in the operating system – whether that’s Windows or other OSes in the market. I feel confident that this strategy will provide customers with the most integrated solutions and the most cost-effective platform.
Thanks for reading.
GM, virtualization strategy