Once considered the stuff of science fiction, artificial intelligence (AI) is beginning to find its way into a variety of computing applications that directly impact both businesses and their customers. While perhaps not as dramatic (or as threatening) as 2001: A Space Odyssey’s iconic HAL 9000, AI data center technology is nevertheless revolutionizing the way companies manage their infrastructure and operations. As these innovations become more affordable in the coming years due to economies of scale, more facilities will be able to implement them to improve data center efficiency and offer better services to their customers.
Google made quite a splash in 2018 when it turned over control of the cooling systems in a number of its hyperscale data centers to an AI program. Looking at the results of an extended trial period, it’s not hard to understand why the tech giant made the change. The AI algorithm produced a series of optimization recommendations that delivered a 40 percent reduction in the energy used by cooling equipment. By implementing machine learning systems at scale, AI data centers could realize similar energy efficiency gains across their infrastructure. The benefits go beyond power consumption, however. Since cooling systems don’t need to run as heavily, they suffer less wear and tear over time. More importantly, accurate temperature controls informed by real-time sensor data make heat-related damage to computing equipment far less likely.
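To make the idea concrete, here is a minimal sketch of the recommendation pattern described above: learn from historical telemetry how cooling energy responds to operating conditions, then suggest the setting with the lowest predicted energy use. The data, the nearest-neighbour model, and every name here are illustrative assumptions, not a description of Google’s actual system.

```python
# Illustrative sketch: recommend a chiller setpoint from historical
# telemetry using a k-nearest-neighbour regressor (pure Python).
# All records below are synthetic.
from math import dist

# ((chiller setpoint degC, outdoor temp degC), cooling energy kW)
history = [
    ((16.0, 30.0), 120.0),
    ((18.0, 30.0), 104.0),
    ((20.0, 30.0), 95.0),
    ((22.0, 30.0), 92.0),
    ((16.0, 20.0), 90.0),
    ((18.0, 20.0), 78.0),
    ((20.0, 20.0), 70.0),
    ((22.0, 20.0), 66.0),
]

def predict_energy(setpoint, outdoor, k=3):
    """Predict cooling energy by averaging the k most similar records."""
    nearest = sorted(history, key=lambda rec: dist(rec[0], (setpoint, outdoor)))[:k]
    return sum(energy for _, energy in nearest) / k

def recommend_setpoint(outdoor, candidates=(16.0, 18.0, 20.0, 22.0)):
    """Recommend the candidate setpoint with the lowest predicted energy."""
    return min(candidates, key=lambda s: predict_energy(s, outdoor))
```

A production system would use far richer sensor inputs and a more capable model, but the loop is the same: predict the energy cost of each candidate action, then surface the cheapest one as a recommendation.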
Automation is an obvious appeal of AI, which is why many companies have been experimenting with completely unmanned, “lights out” data centers over the last decade. The concept goes beyond simple automation. Conventional data center facilities are designed not only for efficient computing and power consumption, but also to accommodate human technicians. If data centers could be built and optimized entirely around IT considerations and monitored remotely via DCIM software, they could capitalize on a number of energy-efficient strategies. Lower oxygen levels to reduce fire risk, more efficient cooling designs, and increased rack heights accessible by robots are just a few ways that automated facilities could save energy and improve performance. Without onsite personnel, these data centers would be less susceptible to human error as well.
When it comes to combating the latest forms of cyberattack, humans sometimes struggle to keep pace with emerging threats. Identifying threats and optimizing systems to counter them is incredibly time-consuming. There’s always something new around the corner, and humans can only monitor so many trends at once. Machine learning systems can not only evaluate network traffic in real time to identify emergent threats, but also run continuous simulations and tests to expose potential weaknesses in cybersecurity measures and uncover deeper security issues buried within accumulated data. An AI cybersecurity algorithm could, for example, notice subtle variations in user access patterns over time, giving organizations a leg up in combating potential insider threats and identifying vulnerabilities exposed by human error.
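One simple way to spot the kind of access-pattern variation described above is to compare each user’s activity today against their own historical baseline and flag large statistical deviations. The sketch below uses a basic z-score test; real security tooling uses far richer features, and the threshold and data here are illustrative assumptions.

```python
# Illustrative sketch: flag users whose access volume deviates sharply
# from their own historical baseline, using a z-score test.
from statistics import mean, stdev

def access_anomalies(history, today, threshold=3.0):
    """Return users whose access count today is an outlier.

    history: {user: [daily access counts over a baseline period]}
    today:   {user: access count today}
    """
    flagged = []
    for user, counts in history.items():
        mu, sigma = mean(counts), stdev(counts)
        if sigma == 0:
            continue  # no historical variation to measure against
        z = (today.get(user, 0) - mu) / sigma
        if abs(z) > threshold:
            flagged.append(user)
    return flagged

# Synthetic example: bob's sudden spike stands out against his baseline,
# while alice stays within her normal range.
flagged = access_anomalies(
    {"alice": [10, 12, 11, 9, 10, 11, 10], "bob": [5, 6, 5, 7, 6, 5, 6]},
    {"alice": 11, "bob": 40},
)
```

The appeal of baselining per user rather than applying one global rule is exactly the insider-threat case: 40 file accesses in a day may be normal for one role and highly suspicious for another.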
The power of predictive analytics will also make it easier to manage equipment in an AI data center environment. No piece of equipment lasts forever, but predicting when hardware might fail isn’t always as easy as it might sound. A manufacturer’s guidelines may provide a reasonably accurate average lifespan for a device, but data centers must contend with the actual piece of equipment they have, not an abstract average. Heavy usage, accidents, and defects could all severely reduce the lifespan of hardware, and having to replace it unexpectedly could cause significant system downtime. With predictive analytics built into DCIM software, data centers can monitor equipment usage trends and real-time performance to gain a much more accurate estimate of when that hardware is inching toward failure. This form of predictive maintenance will allow facilities to replace equipment more efficiently with little or no service disruption.
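The trend-monitoring idea above can be sketched very simply: fit a line to a degradation signal (say, a drive’s reallocated sector count) and extrapolate to a failure threshold to estimate how many days remain. The metric, threshold, and data below are hypothetical, not taken from any vendor’s DCIM product, and real predictive-maintenance models are considerably more sophisticated.

```python
# Illustrative sketch: estimate time-to-failure by fitting a linear
# trend to a degradation metric and extrapolating to a threshold.

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def days_until_failure(days, metric, threshold):
    """Extrapolate the fitted trend to the failure threshold.

    Returns the estimated days remaining after the last observation,
    or None if the metric is not trending toward the threshold.
    """
    slope, intercept = linear_fit(days, metric)
    if slope <= 0:
        return None
    crossing = (threshold - intercept) / slope
    return max(0.0, crossing - days[-1])

# Synthetic example: reallocated sectors sampled monthly, with a
# hypothetical replace-at-100 policy.
eta = days_until_failure([0, 30, 60, 90], [5, 12, 22, 31], threshold=100)
```

An estimate like this is what lets a facility schedule the swap during a maintenance window instead of reacting to an outage, which is the whole point of predictive maintenance.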
Powering sophisticated machine learning systems will require massive amounts of processing power. As more companies implement AI technology, data centers will need to step up their computing capacity to accommodate increased workloads and meet storage demands. A new generation of hyperscale facilities is already being built specifically to deliver high-density deployments made possible by innovations in liquid cooling. Modest enterprise data centers used primarily for backup and storage may give way to these technological marvels over the next few years. While the new facilities will have intense power demands, their efficient design will help the data center industry make tremendous strides in sustainability.
The AI revolution has not yet arrived in every data center, but the technology is rapidly being incorporated into many aspects of data center operations. As cloud computing platforms make powerful machine learning systems more accessible and a new generation of facilities is built from the ground up to take advantage of automation efficiencies, AI technology will surely become a key differentiator in the data center market in the years to come.