Today’s data centers are marvels of engineering and computing technology. Far from the self-contained computer mainframe rooms of old, the modern data center offers robust connectivity options that allow companies to connect to a variety of cloud-based services while still maintaining a secure environment for valuable data and proprietary assets.
While most people are aware that these facilities make their internet-based activities possible, they rarely stop to think about some of the things that make data centers so important in today’s fast-moving digital landscape. Even the IT professionals who work in data centers or interact with them on a regular basis may not always appreciate the impact they’re having on technology, networking, and energy trends around the world.
According to a 2018 report from AFCOM, 42 percent of respondents said they had already deployed a green energy solution in their data centers or were planning to do so over the next year. Of those polled who had already rolled out an eco-friendly energy option, solar was the most widely adopted option, with 83 percent of people noting they use it in their facilities.
These data center statistics are significant because they indicate a growing awareness that operating data centers responsibly, with an emphasis on sustainability, is crucial. Sustainable practices can help data centers reduce operating costs, and they appeal to members of the public who want to know that major companies and the data center industry are making an effort to promote sustainability. Heavy investments in both internal and external renewable energy sources are also expanding the availability of green power, which will help promote more efficient energy practices across the economy.
The rise of cloud computing and the distinctive technical demands that mobile applications place on the data center industry are two of the forces making facilities bigger than ever. As a case in point, data from the end of 2018 found that more than 400 data centers around the world qualified as hyperscale facilities, with even more on the way. It's not uncommon for these facilities to house tens of thousands of servers in a single location.
The International Data Corp., a market intelligence firm, defines a facility as hyperscale if it has at least 5,000 servers and a total size of no less than 10,000 square feet, but the real measure of a hyperscale facility is its ability to utilize high-density deployments at scale. Server virtualization has drastically increased processing performance, allowing hyperscale facilities to help companies manage big data with powerful analytics and machine learning. There’s also the small matter of storage space. With billions of people generating around 2.5 quintillion bytes of data every single day, all of that data has to go somewhere.
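The IDC criteria quoted above can be expressed as a simple two-threshold check. This is an illustrative sketch, not an official IDC tool; the function name and example figures are assumptions for demonstration.

```python
# Illustrative sketch of the two IDC size thresholds cited above:
# at least 5,000 servers and at least 10,000 square feet.

def is_hyperscale(servers: int, square_feet: int) -> bool:
    """Return True if a facility meets both IDC size thresholds."""
    return servers >= 5_000 and square_feet >= 10_000

# A facility with 50,000 servers in 250,000 sq ft easily qualifies;
# a small 3,000-server site does not, regardless of floor space.
print(is_hyperscale(50_000, 250_000))  # True
print(is_hyperscale(3_000, 12_000))    # False
```

As the article notes, these thresholds are only a floor: what really distinguishes hyperscale facilities is high-density deployment at scale, which no simple size check captures.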
According to data center statistics from the U.S. Chamber of Commerce's Technology Engagement Center, a facility being built in a community typically results in 1,688 new jobs for people involved in the construction process, generating $77.7 million in total earnings for those workers. The construction phase for a data center is usually 18-24 months. That means the creation of a data center brings an immediate and prolonged boost to employment opportunities.
Every year a data center operates, it supports about 157 additional local jobs paying $7.8 million in wages. There are also long-term benefits a community gains from the capital investment in a data center: over the course of ten years, a $1 billion data center could generate around $200 million in total tax revenue due to increased economic activity.
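The ongoing figures above compound over a facility's lifetime. This back-of-the-envelope sketch simply multiplies the quoted annual numbers out over a ten-year horizon; the variable names and the ten-year window are assumptions for illustration.

```python
# Back-of-the-envelope arithmetic using the Chamber of Commerce
# figures quoted above: 157 ongoing local jobs paying $7.8M in
# annual wages, plus ~$200M in tax revenue over ten years for a
# $1B facility.

ANNUAL_JOBS = 157
ANNUAL_WAGES = 7.8e6        # dollars per year of operation
TAX_REVENUE_10YR = 200e6    # estimated total for a $1B facility

years = 10
total_wages = ANNUAL_WAGES * years

print(f"Cumulative wages over {years} years: ${total_wages / 1e6:.0f}M")
print(f"Estimated tax revenue over {years} years: ${TAX_REVENUE_10YR / 1e6:.0f}M")
```

Even before counting the construction-phase earnings, the ongoing wages alone come to roughly $78 million over a decade.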
For decades, data centers have relied upon fairly traditional HVAC systems to meet their cooling needs. As processors have become more powerful, however, even the most sophisticated cooling infrastructure is having a hard time keeping up. When Alphabet introduced its third-generation Tensor Processing Unit (TPU 3.0) AI chips in 2018, it had to implement a direct-to-chip liquid cooling system to manage the heat produced by the chips' eight-fold increase in performance. DownUnder GeoSolutions' "Bubba" supercomputer joined the liquid cooling trend in 2019 by immersing its processors in 722 specially designed tanks filled with polyalphaolefin dielectric fluid.
While the concept of liquid cooling is hardly a new one, the technology has been neither practical nor necessary until recent years. As processors designed to handle intensive AI computations continue to develop, air-based cooling solutions are having difficulty keeping up with the massive amounts of heat these chips generate. Liquid cooling innovations may be confined to the largest players in the tech industry for the time being, but the technology will likely begin to migrate to conventional data centers in the near future.
Google caused a stir in 2018 when it put an AI system in charge of the cooling systems in many of its hyperscale data centers. The program, developed by the AI company DeepMind, had already undergone extensive testing and served a long apprenticeship under the watchful eye of data center managers. After its recommendations wound up delivering a 40 percent reduction in the energy used for cooling, it was hard to dispute that the AI could manage a facility's cooling infrastructure just as well as, if not better than, its human counterparts.
Recent developments in AI and machine learning have finally made truly autonomous systems possible: systems capable of evaluating data and making decisions accordingly rather than simply performing a repetitive task. As the technology becomes more widespread, data centers will begin tasking AI with monitoring network and infrastructure environments to make continuous micro-adjustments that enhance efficiency and performance. The shift to AI will also greatly reduce the threat of human error, which data center statistics show is one of the largest causes of system downtime.
Microsoft made headlines in recent years with Project Natick, an ambitious plan to build a fully automated micro-data center designed to be submerged in the ocean. The idea could deliver unique connectivity options while also providing unmatched security (little chance of unauthorized visitors to the Natick facility!). And with more companies investing in space exploration, some innovative startups see an opportunity to place similar micro-facilities in orbit around the Earth. While there are a number of challenges to overcome, most notably how to harden servers against the harsh environment of space and how to combat latency, the idea certainly has some promise.
The data center industry already stands at the forefront of innovation, and it will surely continue to occupy that position for many years to come. As facilities become more advanced, they will undoubtedly keep people wondering what the next great developments will offer and how they might set records for size, functionality, and performance.
Kayla Matthews writes about data centers and big data for several industry publications, including The Data Center Journal, Data Center Frontier and insideBIGDATA. To read more posts from Kayla, you can follow her personal tech blog at ProductivityBytes.com.