Housing racks upon racks of powerful servers, as well as the complex environmental controls needed to keep them at optimal temperatures, data centers consume massive amounts of electricity. In 2017, data centers in the United States used more than 90 billion kilowatt-hours, roughly the equivalent of 34 coal-fired plants generating 500 megawatts each. Globally, data centers account for about 3 percent of all electricity produced on the planet.
And with more data centers on the way, including massive hyperscale facilities, those numbers might well continue to increase.
Fortunately, concerns about these power demands have been driving innovation in the industry for well over a decade. Today’s data centers are far more energy efficient than those of 10-15 years ago. Although US data center energy usage grew by 24 percent from 2005 to 2010, the figure plummeted in the ensuing years. Between 2010 and 2014, growth slowed to a minuscule 4 percent, where it’s expected to remain until at least 2020 thanks to more sustainable data center practices. So while power demands are increasing in the aggregate, they’re not escalating nearly as quickly as they were in the early 2000s.
The US Department of Energy has devoted significant resources to identifying new efficiency strategies that data centers can benefit from today and in the future. With energy demands unlikely to decline in the coming years, the best way to control power usage in these facilities is to use that power more efficiently. Programs like the Better Buildings Challenge are committed to improving building efficiency by 20 percent or more over the next ten years. The Data Center Accelerator program focuses specifically on promoting sustainable data centers, aiming to reduce infrastructure energy intensity in participants’ facilities by 25 percent over five years.
Today’s processors and hard drives offer much better performance than those of previous generations. Low-power chips are more efficient than older versions and can sustain higher workloads over long periods (more on that in a moment). Solid-state drives also reduce energy demands because they have no physical platters to spin when reading stored data.
Gone are the days of outdated air conditioning systems inefficiently pumping cold air into a mostly empty room. Today’s sustainable data centers utilize multiple forms of cooling technology to circulate and direct air to where it’s needed most, ensuring that the facility remains at a stable temperature at all times. Many energy efficient data centers have already implemented innovative liquid cooling systems, even going so far as to develop direct-to-chip liquid cooling for high-performance processors. Facilities in geographically favorable locations have even incorporated outside air into their cooling systems, keeping cooling power needs to a minimum and reducing overall energy use significantly.
Newer, more efficient servers have the ability to switch over to a low-power standby mode when they’re not needed. A server in standby mode consumes only a fraction of the electricity it uses when fully powered up. Since servers running at full power generate more heat, incorporating a standby mode into an operational strategy also reduces the need for extra cooling, further contributing to a more energy efficient data center.
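A back-of-the-envelope sketch of the standby savings described above, using illustrative figures (400 W active draw, 40 W standby draw, and a PUE of 1.5 to fold in cooling overhead — these are assumptions for the sketch, not numbers from this article):

```python
ACTIVE_W = 400.0    # assumed per-server draw at full power
STANDBY_W = 40.0    # assumed standby draw (~10% of active)
PUE = 1.5           # assumed power usage effectiveness (cooling overhead)

def facility_energy_kwh(n_active, n_standby, hours):
    """Total facility energy, scaling IT load by PUE to include cooling."""
    it_kw = (n_active * ACTIVE_W + n_standby * STANDBY_W) / 1000.0
    return it_kw * PUE * hours

# 100 servers over a 12-hour off-peak window: all on vs. half in standby.
always_on = facility_energy_kwh(100, 0, 12)
with_standby = facility_energy_kwh(50, 50, 12)
print(f"Saved: {always_on - with_standby:.0f} kWh ({1 - with_standby/always_on:.0%})")
```

Because the standby servers draw only a tenth of their active power, and every watt avoided is also a watt the cooling system never has to remove, the savings compound under these assumptions.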
Servers running at less than 50 percent of their potential workload are a massive waste of energy. The ratio between utilization and power consumption isn’t 1:1, so a server running at only 20 percent of its available workload isn’t using 80 percent less energy than it would at full capacity. Since it’s drawing more power than its workload warrants, it’s also generating more heat, which puts additional demand on the facility’s cooling system. Running five servers at 15 percent capacity, then, is far less energy efficient for a data center than running a single, high-density server at 75 percent. Applied across the entire facility, increasing server utilization rates can result in much better energy efficiency metrics.
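A minimal sketch of why consolidation wins, assuming a simple linear power model with idle draw at 50 percent of peak (a common rule of thumb; the 250 W / 500 W figures are illustrative, not from this article):

```python
# Assumed linear power model: P = P_idle + (P_peak - P_idle) * utilization.
# The 250 W idle / 500 W peak figures are illustrative, not measured values.
def server_power_w(utilization, p_idle=250.0, p_peak=500.0):
    return p_idle + (p_peak - p_idle) * utilization

# Same total workload (5 x 15% = 1 x 75%), very different power draw.
spread_w = 5 * server_power_w(0.15)      # five lightly loaded servers
consolidated_w = server_power_w(0.75)    # one well-utilized server

print(f"Five servers at 15%: {spread_w:.1f} W")
print(f"One server at 75%:   {consolidated_w:.1f} W")
```

Under these assumptions the consolidated server delivers the same work on roughly a third of the power; the exact ratio depends on the hardware’s idle-to-peak power curve.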
Data centers have made great strides in diversifying their energy sources. Whether they’re growing their green power by incorporating renewable energy generated on-site, such as solar panels, wind turbines, and geothermal power, or purchasing it from off-site providers in the form of Renewable Energy Certificates (RECs), sustainable data centers are making a massive push towards better efficiency and environmental consciousness. Many customers now treat renewable energy usage as a key differentiator between companies. By investing in green power now, sustainable data centers can not only improve their efficiency metrics, but also position themselves as key stakeholders in the burgeoning renewable energy industry.
As data centers continue to grow and multiply, their overall energy demands will rise with them. By incorporating the most energy efficient technology and implementing practices designed to keep power consumption to a minimum, data centers can meet the insatiable data needs of today’s consumers while also avoiding creating an energy or environmental crisis in their rush to keep their servers up and running.