Issue 24 of the Microsoft Architecture Journal focuses on virtualization. My paper, From Virtualization to Dynamic IT, has been published in this issue. My thesis:
Virtualization is a critical infrastructure-architecture layer that is required for achieving higher IT-maturity levels, but several other layers—such as automation, management, and orchestration—are equally important.
For the last several months I have been focused on virtualization and private cloud offerings. My paper outlines several architecture layers, in addition to virtualization, that are required for a robust cloud infrastructure. The different forms of virtualization—server, application, storage, and network—are key foundational technologies, but they must be complemented by automation, management, and orchestration layers in order to achieve the cloud attributes of elastic capacity, user self-service, and resource pooling. In the paper I define those layers, detail some of their key requirements, and then discuss how together they form the foundation for cloud infrastructure as a service (IaaS), whether public or private.
In keeping with the mission of the Journal, my paper focuses on architecture and doesn’t get into specific details of Microsoft product offerings. For that, stay tuned to this blog: in a series of posts I will be detailing some of the work that I and others are doing in this space using current Microsoft technologies such as Hyper-V, VMM, Opalis, PowerShell, and the rest of the System Center suite. We’re doing some quite interesting work around datacenter automation that is going to make its way into both Microsoft and partner offerings in the virtualization space.
Concurrent with the release of Windows Server 2008 R2 SP1 Beta (available here), a bunch of documentation and step-by-step guides have been released. These two posts have a good list of links for RemoteFX and Dynamic Memory:
For RemoteFX, be sure to closely review the hardware considerations document to ensure your test systems meet the requirements, and note that the blog post calls out particular NVIDIA and ATI driver versions known to provide a good experience.
I’ve been evaluating both RemoteFX and Dynamic Memory for some upcoming training I’m delivering. In a very short amount of time you can learn to implement and configure both features, and the settings that are exposed are explained well in the documentation. For a deeper understanding of what’s going on, it’s a good idea to test under a variety of configurations and loads while watching the new performance counters that have been added. That gives a good view of what the system is doing in different scenarios.
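As a rough sketch of that kind of testing: the Hyper-V PowerShell module that shipped later (Windows Server 2012 and up) exposes the Dynamic Memory settings directly via Set-VMMemory, while in the 2008 R2 SP1 timeframe you would set the same values in the VM’s memory page in Hyper-V Manager. The VM name "TestVM", the memory sizes, and the sampling interval below are placeholders, and the counter paths assume the "Hyper-V Dynamic Memory VM" counter set that SP1 adds:

```powershell
# Enable Dynamic Memory on a test VM (Set-VMMemory is from the later
# Hyper-V PowerShell module; on 2008 R2 SP1 use Hyper-V Manager instead).
Set-VMMemory -VMName "TestVM" -DynamicMemoryEnabled $true `
    -StartupBytes 1GB -MinimumBytes 512MB -MaximumBytes 4GB

# Sample the Dynamic Memory counters while the VM is under load,
# e.g. every 5 seconds for one minute.
Get-Counter -Counter @(
    "\Hyper-V Dynamic Memory VM(*)\Physical Memory",
    "\Hyper-V Dynamic Memory VM(*)\Average Pressure"
) -SampleInterval 5 -MaxSamples 12 |
    ForEach-Object { $_.CounterSamples } |
    Select-Object Path, CookedValue
```

Watching how Physical Memory and Average Pressure move as you vary guest workloads is what shows you how the balancer actually assigns memory across VMs, rather than just what the settings dialog implies.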