By Christian Belady, General Manager, Data Center Research, Global Foundation Services
As the demand for cloud services continues to grow, many companies in the datacenter industry (particularly start-ups) have met with me to discuss their businesses. Their proposals usually include huge expectations about the revenue that can be generated from constructing datacenters. Most of the time, these vendors assume that their niche business projections will have great impact, when in reality they are relatively small compared to the actual market opportunity. Having worked in the datacenter industry for a long time, I generally do my best to help them understand that they are overestimating the market opportunity, but when they pressed for numbers to back up that claim, I didn’t have any.
Figure 8. Year-over-year datacenter construction market size
My responses in the past were based on my professional observations: even though we have seen the datacenter power footprint double every five years, that did not necessarily mean the datacenter construction business would follow the same trajectory. Datacenter designs are emerging today that drive cost per megawatt to half of what was traditionally expected. As a result, one would expect that the actual market size (in dollars) may in fact not be growing as fast as we think.
As usual, my curiosity got the best of me, and I tried to find research that may have been done on the topic: How big is the annual construction opportunity for datacenters?
Since I could not find anything, I decided to estimate the annual new datacenter construction market size using data already available in the industry. My calculations resulted in a projection out to 2020 that I believe is fairly representative of what we can expect. In addition, I made the assumption that the industry will likely move from an average construction cost of $15 million per megawatt to something more like $6 million per megawatt over the next five years, resulting in a flattening of year-over-year growth. The resulting white paper is titled Projecting Annual New Datacenter Construction Market Size. It also shares the steps I took to generate these projections, and it estimates that the United States and global datacenter markets will grow 50 percent or so by 2020, to $18 billion and $78 billion, respectively. These estimates were derived by combining my own data with the great work of my good friend Jonathan Koomey and some of the Uptime Institute’s initial work on datacenter costs.
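The shape of this kind of first-order estimate can be sketched in a few lines of code. The sketch below is an illustration of the two opposing trends quoted above, not the paper's actual model: it assumes new construction (in megawatts) grows at a rate that doubles the power footprint every five years, while cost per megawatt declines linearly from $15 million to $6 million over the same period. The starting figure of 800 MW per year is a purely hypothetical input.

```python
# First-order sketch of annual datacenter construction spend.
# Assumptions (illustrative, taken from the figures quoted in the post):
#   - power footprint doubles every 5 years -> ~14.9% annual MW growth
#   - construction cost falls linearly from $15M/MW to $6M/MW over 5 years

DOUBLING_YEARS = 5
POWER_GROWTH = 2 ** (1 / DOUBLING_YEARS)  # annual multiplier, ~1.149


def annual_spend(base_mw_per_year, years=5,
                 cost_start=15e6, cost_end=6e6):
    """Yearly construction spend = new MW built x cost per MW."""
    spend = []
    mw = base_mw_per_year
    for year in range(years + 1):
        # Linear interpolation of $/MW across the transition period
        cost = cost_start + (cost_end - cost_start) * min(year / years, 1)
        spend.append(mw * cost)
        mw *= POWER_GROWTH
    return spend


# Hypothetical example: 800 MW/year of new construction today
spend = annual_spend(800)
# MW built rises ~15%/yr while $/MW falls ~17%/yr, so dollar
# spend flattens (or even dips) despite rapid capacity growth.
```

The point of the exercise is that capacity growth and dollar growth decouple: even with the power footprint compounding rapidly, the falling cost per megawatt flattens the year-over-year construction market, which is the effect Figure 8 illustrates.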
Figure 8 (inserted above) captures the conclusion of the paper; however, I would urge you to read the full paper to understand all of the underlying assumptions and how I arrived at these results.
It is my hope that this paper will serve as an aid for industry businesses and venture capital firms to better understand and calculate the approximate size of the datacenter construction business that they can realistically capture. In addition, it is important to note that this is a “first order estimate.” I also hope the paper seeds a dynamic debate and discussion that further informs our industry in this area. Please send your thoughts and feedback on the paper, and as always, I look forward to the dynamic discussion with my industry friends!
I read through your white paper on sizing the data center market and really enjoyed your use of Jevons’ Paradox (gains in efficiency do not increase fast enough to offset growth in overall consumption of a contended resource).
I wonder whether Jevons, in describing his rule, took into account the possibility of both exponential gains and the inter-relatedness of benefits from multiple dimensions of innovation. And could exponential gains in efficiency affect your estimate of data center efficiency, which uses power growth as one of its primary scaling metrics? I wonder.
Some years ago I worked at Intel Corp. and one of the big trends in the semiconductor industry at that time was the increase in density (they called it process resolution) that defined how small the individual gates on the processor could be - the smaller the process dimension, the more devices could be placed on a chip, etc. Device density also affected heat and power consumption for a given processor. A little secret was it also drove the cost of the factory and instruments that built the chip!
Moore’s law stated that the number of transistors on a chip would double about every 18-24 months. However, there had to be an end in sight! As the company’s fabs moved from 0.080 (micron) to 0.065 to 0.045 process geometry, successive generations of processors were developed with more and more devices, but eventually that trend would be limited by the wavelength of light.
Twenty years on, however, Moore’s law remains in effect, mostly because along with increasing density, there were other teams at Intel working on low-power logic, better and more integrated device design, and so on. The output of all that work kept innovation proceeding at a rate more or less as predicted by Moore, despite the fact that the drive to ever-smaller process geometries flattened out somewhat.
I think this relates to your data center sizing model in the following way: could we account for some step-function improvement in data center efficiency (measured as either power density or improved compute power per watt) based on things like the introduction of solid state memory, a replacement for switching power supplies, or even LED lighting?
In other words, I wonder if there are some further dimensions of innovation that would revise the slope of your growth curve upward even more?
Great paper. Thanks!
Bainbridge Island, WA
I agree that there is tons of opportunity.
Awesome post. Most folks do not appreciate the coming compute wave. Proportionate to total data volume, character data is becoming a thing of the past. Audio, video, GIS, 4G... What happens when we get to 10G?
Kurzweilian - ‘The singularity approaches’ - and it demands data centers… I think your projections might be low.