
By: Ernest Sampera on August 6th, 2014


Hot to Trot? The Paradox of Data Center Cooling

colocation | data center cooling | Data Center Performance


Finland is the new hot spot for data center cooling. Tech giants like Microsoft and Google have both built facilities in this northern European country to take advantage of cold winters and chilly Baltic Sea water. With some server racks now hitting 20-25 kilowatt (kW) loads and 100 kW theoretically possible, anything that can offset massive air conditioning budgets is worth a try. But what if you're not a Microsoft, Google, or Amazon? Is it possible to maximize data center cooling while minimizing spend?

Cool Down

Your first choice is to cool down. This means bigger, more powerful air conditioning units or in-rack liquid cooling, which sidesteps the problem of moving air around the room. But with liquid cooling still prohibitively expensive, most companies are left playing what Future Facilities CEO Hassan Moezzi calls a 'game of Tetris'.

Ideally, colocation and on-premises data centers would be completely full, with no rack space left empty. In reality, gaps appear as components are upgraded, removed, or simply fail. The result? Physical fragmentation that leaves some areas too cold, some too hot, and some stuck in the middle. Moezzi points to one customer operating at just 45 percent capacity that still experienced overheating issues. Effective data center cooling requires strategy, forethought, and in some cases good luck. If you're in the market for a colo provider, make sure staying cool is at the top of their priority list.

Heat Up

Your other option is to run hot. As Gnodal CTO Fred Homewood notes in a recent SearchDataCenter interview, some businesses are running servers at up to 113 or even 122 degrees Fahrenheit, still within ASHRAE guidelines but hotter than conventional wisdom allows. Supporting this kind of environment means finding devices that can perform at higher temperatures and then creating dedicated 'hot' and 'cold' aisles. At those higher temperatures, outside air has a far greater impact on cooling, even if you're not in the middle of a Finnish winter, owing to the wider differential between that air and your server racks.

Hot or Cold?

Going cold costs more money, since you need to invest not only in powerful A/C but also in close monitoring of server configurations and air movement. Running hot can offer long-term savings, but only if equipment doesn't fail early; high-availability demands can't be met if servers are constantly offline. If you're designing a local data center, both methods offer viable avenues for ROI. If you're considering colocation, be clear about your cooling needs: do you prefer cold air or hot servers?


About Ernest Sampera

Ernie Sampera is the Chief Marketing Officer at vXchnge, where he is responsible for product marketing, external and corporate communications, and business development.
