December 17, 2015
Among the most important inputs to a data center TCO analysis is a capacity demand forecast, ideally measured in kilowatts. Yet accurately forecasting capacity requirements beyond a 12-24 month horizon is virtually impossible.
So what happens when you’re tasked with comparing the 10-year TCO of two data center alternatives and face the virtually impossible task of accurately forecasting capacity demands?
Typically, you substitute a similar task that is possible: inaccurately forecasting demand. This is also known as choosing a best guess and “just going with it,” and it often takes a form like “500kW initially, growing at 50kW per year for the next 10 years.”
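In code, a single-point forecast of that form is just a straight line. Here is a minimal sketch in Python, using the example’s 500kW base and 50kW/year growth figures:

```python
# A single discrete ("just go with it") capacity forecast:
# 500kW initially, growing 50kW per year over a 10-year term.
base_kw, growth_kw_per_year, term_years = 500, 50, 10

forecast_kw = [base_kw + growth_kw_per_year * year for year in range(term_years)]
print(forecast_kw)  # [500, 550, 600, ..., 950] -- one path, no probabilities
```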
When you “just go with” a single discrete capacity forecast as an input to a data center TCO comparison, you lose the ability to quantify the value of flexibility, which is a function of probability, not certainty. As a result, single-forecast TCO analysis becomes a useless tool for comparing alternatives that vary significantly with respect to flexibility features.
That loss matters, because flexibility features in a data center solution can very often impact actual, empirical TCO differentials more than every other variable combined. And yes, “every other variable” means rental price, power rate, PUE, operating expenses, capital expenses, WACC, and so on. Flexibility is the elephant in the room.
Let’s look at an example. Suppose your best-guess forecast calls for 500kW on day one, with a second 500kW block coming online at the start of year 6. Provider A requires a firm 10-year commitment to both blocks; Provider B delivers the same capacity on the same schedule, but makes the second block optional.
Assuming an arbitrary cost of $2,500 per kilowatt of capacity per year, Provider A’s solution costs $6.25mm for the first 5 years and $12.5mm for the second 5 years, a total of $18.75mm under all load growth scenarios. Through the lens of your “just go with it” forecast alone, Provider B’s solution has an identical cost: $18.75mm over the 10-year term.
But what happens if the second 500kW block never materializes six years from now? Perhaps you outperformed your virtualization targets, outsourced an application to the cloud, or divested a subsidiary. Provider B’s 10-year cost drops to $12.5mm, a savings of 33.3% relative to Provider A!
Technically, you should probability-weight the two models. Say there is a 50% likelihood that the second load block won’t materialize. In that case, there is a 50% likelihood of Provider B’s solution costing $18.75mm and a 50% likelihood of it costing $12.5mm, yielding a risk-weighted cost of $15.625mm, a 16.7% savings over Provider A.
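Here is a minimal sketch of that arithmetic in Python; the $2,500/kW/year rate and the 50% probability are simply the assumed figures from the example above:

```python
RATE_PER_KW_YEAR = 2_500   # assumed cost per kW of capacity per year ($)
PHASE_YEARS = 5            # the 10-year term splits into two 5-year phases

def phase_cost(kw: int) -> float:
    """Cost of one 5-year phase at a given committed capacity."""
    return kw * RATE_PER_KW_YEAR * PHASE_YEARS

# Provider A: firm commitment to both 500kW blocks under all scenarios.
provider_a = phase_cost(500) + phase_cost(1_000)              # $18.75mm

# Provider B: the second 500kW block is optional.
b_if_growth = phase_cost(500) + phase_cost(1_000)             # $18.75mm
b_if_flat = phase_cost(500) + phase_cost(500)                 # $12.50mm

p_flat = 0.50  # assumed probability the second block never materializes
b_expected = p_flat * b_if_flat + (1 - p_flat) * b_if_growth  # $15.625mm

savings = 1 - b_expected / provider_a
print(f"Provider A (firm):          ${provider_a / 1e6:.3f}mm")
print(f"Provider B (risk-weighted): ${b_expected / 1e6:.3f}mm")
print(f"Risk-weighted savings:      {savings:.1%}")           # 16.7%
```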
All else equal, the obvious choice relative to TCO is Provider B. And the choice would still be obvious if Provider B’s per-unit rental price were 10% higher than Provider A’s; but it wouldn’t be obvious at all unless the above approach was taken. Said differently, investing the time to build multiple probability-weighted forecasts for use in comparative TCO analysis is critical to a sound result.
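To sanity-check the 10% claim, here is a continuation of the sketch above; the break-even figure follows directly from the two risk-weighted totals:

```python
# Figures carried over from the example above (in $).
provider_a_firm = 18.75e6       # Provider A's firm 10-year cost
provider_b_expected = 15.625e6  # Provider B's 50/50 risk-weighted cost

premium = 0.10  # suppose Provider B's unit price is 10% higher
b_at_premium = provider_b_expected * (1 + premium)  # $17.19mm, still cheaper

# Break-even premium: the markup at which Provider B's risk-weighted
# cost climbs back up to Provider A's firm cost.
break_even = provider_a_firm / provider_b_expected - 1  # 20%

print(f"Provider B at +10%: ${b_at_premium / 1e6:.2f}mm vs Provider A: $18.75mm")
print(f"Break-even premium: {break_even:.0%}")
```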
The above example is extremely simple, but in reality the same dynamics present themselves over and over again across the different types of flexibility, or, conversely, the different types of rigidity, that characterize competing data center solutions.
Some rigidities are explicit, such as the Provider A structure above, but others are implicit: solutions that can’t accommodate physical footprint growth, solutions that can’t accommodate hardware form-factor shifts (e.g. many containers), rooms that can’t accommodate changing concentrations of power density, and rooms that can’t accommodate technological change (e.g. water-to-the-rack cooling).
The example above could just as easily have been structured with two different alternatives, say a containerized solution and a conventional data hall of equal capacity. Assuming identical pricing, these solutions would look identical with respect to TCO if you simply used a baseline “just go with it” capacity forecast of 20 standard-sized racks at 15 kW each over the relevant period.
But what if, upon refresh, you need to replace those racks with free-standing gear that won’t fit in the container (e.g. large IBM or EMC boxes)? Or, even if the individual units would fit, what if their watt-to-footprint ratio is lower, requiring a higher quantity of units for the equivalent power capacity, and that quantity won’t fit?
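To make the second scenario concrete with hypothetical numbers: suppose the container’s 20 footprints were sized for 15kW racks, but the replacement gear delivers only 10kW per unit of footprint (both figures assumed for illustration):

```python
import math

target_kw = 20 * 15   # 300kW: the original 20 racks at 15kW each

old_footprints = math.ceil(target_kw / 15)   # 20 -- fits the container
new_footprints = math.ceil(target_kw / 10)   # 30 -- ten more than will fit

print(old_footprints, "->", new_footprints)
# The shortfall strands purchased capacity or forces a second facility.
```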
All of these scenarios would result in premature abandonment of purchased capacity and lead to the exact same conclusion as the analysis detailed above: at comparable pricing, the more flexible, less rigid solution will always have a better risk-weighted TCO.
One of our prime business goals at Sentinel over the last decade has been to remove as many of these rigidities as possible, so that your data center facility enables your business’s evolution rather than prohibiting it.