I had this conversation last week:

IT manager: “Stephen, how can I reduce my x86 server sprawl while increasing efficiency and reducing costs?”
My answer: “Have you been thinking about virtualization?”
IT Manager: “Stephen, can you tell me more?”

In our conversation, it became clear he didn't realize that the technology has matured sufficiently, in both hardware and software, that the whole area has to be seriously considered.

We talked some more and it resulted in this two-part blog on server virtualization.

__________

Part 1/2:

Why virtualize?

Server virtualization enables server consolidation and thus reduces costs. Moreover, virtualization readily supports business agility, growth, more rapid deployments, and a unified, standardized IT infrastructure.

I feel it makes sense to strategize for it, if you have not implemented some form of virtualization already. It definitely saves time, reduces server sprawl through server consolidation, lowers administrative and maintenance costs, eases testing and deployment scenarios, allows quick and secure client provisioning [setup of user computers], and simplifies storage management. Aren't these the hot buttons for IT departments: do more with less? Provisioning, securing, and testing systems are so much easier with virtualization. You can save on licensing costs too.

Virtualization allows running multiple operating environments on one server, increasing utilization and reducing the overall server count. Microsoft’s Virtual Server 2005 R2 is a prime example.
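To make the consolidation math concrete, here is a minimal back-of-the-envelope sketch. The server count and utilization figures are purely illustrative assumptions, not measurements from any real deployment; plug in your own numbers.

```python
import math

# Hypothetical figures -- adjust these to match your own environment.
physical_servers = 12     # existing lightly loaded x86 servers
avg_utilization = 0.08    # typical CPU utilization per server (~8%)
target_utilization = 0.60 # comfortable ceiling for a virtualization host

# Total workload expressed in "fully utilized server" units.
total_load = physical_servers * avg_utilization

# Hosts needed so each stays at or below the target ceiling.
hosts_needed = math.ceil(total_load / target_utilization)

print(f"Consolidate {physical_servers} servers onto {hosts_needed} host(s)")
```

With these assumed numbers, twelve underutilized machines collapse onto two well-utilized hosts, which is the kind of server-count reduction the paragraph above describes.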

IT executives were aware of virtualization technology in an overview sense prior to 2005, with a limited few embracing the technology. This started to change in 2005, with uptake increasing. In the past, the majority of IT decision makers couldn’t justify looking at virtualization more closely, since the hardware wasn’t in place to support the needed performance. The advent of 64-bit server deployments overtaking 32-bit new server shipments in 2005; 64-bit servers becoming somewhat the norm this year in 2006, including even in the consumer space in 2007; and dual-core [later multi-core] chips proliferating in 2006: all of this provides the underlying performance platform, making the idea more attractive and gaining mindshare among IT decision makers. In addition, the full benefits and the details are just now becoming apparent, as there are more blogs, podcasts, articles, webcasts, and face-to-face technical sessions in this area. This blog is just one example.

Tomorrow, I’ll provide part 2 of this blog on virtualization.
_______________ 

Thank you,

Stephen Ibaraki, FCIPS, I.S.P.