I recently had the honor of having a blog post published on Corporate EcoForum's EcoInnovator blog, focused on the role of incentives in data center efficiency. Take a look:


As incredible as it sounds, and as I wrote in The IT Energy Efficiency Imperative, a typical data center uses less than 5% of the energy it consumes for actual computing – the rest is lost to various overheads and inefficiencies. If that weren’t bad enough, data centers also generate huge piles of e-waste each year as computer servers are replaced by more powerful and, ironically, more energy-efficient models.

While this may sound terrible, keep in mind that it usually takes less energy and fewer natural resources to communicate and consume goods digitally rather than physically, so more investment in IT is often a net positive for the environment.

However, underlying all those boasts about new, “energy-efficient” data centers, there is usually a dirty little secret: the servers are often woefully under-utilized. Most run at less than 10 percent of their capacity, yet draw about half the energy they would at full load. So all over the world, millions of servers operate like mostly empty delivery trucks: consuming lots of resources but delivering very little.
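A quick back-of-the-envelope sketch makes the delivery-truck analogy concrete. Using the illustrative figures above – roughly 10 percent utilization, with a mostly idle server drawing about half its peak power – the energy consumed per unit of useful work comes out several times higher than for a fully loaded machine. (The peak wattage below is an assumed figure, not from the post.)

```python
# Back-of-the-envelope sketch of the "mostly empty delivery truck" effect,
# using the illustrative numbers from the post. PEAK_POWER_W is an assumption.

PEAK_POWER_W = 500        # hypothetical server peak power draw (assumed)
UTILIZATION = 0.10        # typical utilization cited in the post
IDLE_POWER_FRACTION = 0.5 # fraction of peak power drawn while mostly idle

# Energy per unit of useful work, relative to a fully utilized server
# (which, by definition, scores 1.0 on this measure):
relative_energy_per_work = IDLE_POWER_FRACTION / UTILIZATION

print(f"A server at 10% utilization uses {relative_energy_per_work:.1f}x "
      f"the energy per unit of work of a fully loaded one")
# → 5.0x: five times the energy for the same computing delivered
```

In other words, under these assumptions a lightly loaded server burns five times the energy per unit of computing that a fully loaded one does.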

True, most IT departments have tried to tackle poor server utilization by consolidating applications onto fewer servers through the use of virtualization technology. Unfortunately, many consolidation projects have stalled due to financial constraints, organizational politics, and staffing shortages. As a result, server utilization across the enterprise remains very low – often less than 10 percent. Even worse, as each generation of server hardware has become more powerful, utilization has tended to fall further still, because applications and IT operational practices have been unable to take advantage of the added capacity.