With billions of people now using the internet daily, massive amounts of data are being generated every minute. Setting aside the millions of text messages and emails sent out, social media is helping users share and view more images and videos than ever before. Roughly 2.5 quintillion bytes of data are created each day, and with over 100 new devices connecting to the internet every second, that figure will only grow in the years to come.
This boom in data has pushed many companies to shift their operations to hyperscale data centers. Roughly defined as any data center with at least 5,000 servers and 10,000 square feet of floor space, hyperscale computing facilities set themselves apart by providing cloud, e-commerce, and social media services at scale while still maintaining energy efficiency. At the end of 2018, the total number of these facilities had grown to 430, up 11 percent from the previous year. Another 132 hyperscale data centers are slated for construction in 2019, with major cloud providers like Amazon, Google, IBM, and Microsoft investing billions alongside device manufacturers like Apple and social media giants like Facebook.
Although the US continues to have more hyperscale data centers than any other country, facilities opened in 16 other countries over the course of 2018. The massive growth trends caused John Dinsdale of Synergy Research to observe: “There is no end in sight to the data center building boom.”
Given these unprecedented growth trends, it’s worth considering what factors are driving companies to build these huge, capital-intensive facilities. While there are a number of reasons why hyperscale computing facilities meet the business needs of these companies, they may not be the ideal solution for every organization’s data center requirements.
It’s no accident that Amazon, Google, IBM, and Microsoft are major contributors to the hyperscale data center “building boom.” These four corporations are among the world’s leading providers of cloud computing solutions. Amazon’s AWS service hosts roughly 40 percent of the cloud market, while the others provide powerful infrastructure-, platform-, and software-as-a-service tools that allow companies across a wide range of industries to develop products and deliver services to their customers.
Although the cloud may sometimes seem like something that exists in the ether, the computing infrastructure that makes it possible is both very real and incredibly resource intensive. Many cloud assets are built on virtualization, which abstracts a server’s processing power into a compartmentalized software form. This technique allows each physical machine to run many virtual servers at once with minimal impact on performance, albeit at the cost of intensive power and cooling requirements.
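The consolidation math behind virtualization can be sketched in a few lines. The numbers and the overcommit ratio below are illustrative assumptions, not figures from any particular cloud provider:

```python
# Illustrative sketch (hypothetical numbers): estimating how many virtual
# servers a single physical host can run by packing identical VM resource
# requests against the host's capacity.

def max_vms(host_cpus, host_ram_gb, vm_cpus, vm_ram_gb, overcommit=2.0):
    """Return how many identical VMs fit on one host.

    `overcommit` models the common practice of allocating more virtual
    CPUs than physical cores, since most VMs idle much of the time.
    """
    by_cpu = int(host_cpus * overcommit // vm_cpus)
    by_ram = int(host_ram_gb // vm_ram_gb)  # memory is rarely overcommitted
    return min(by_cpu, by_ram)

# A 64-core, 256 GB host running 4-vCPU, 16 GB VMs:
print(max_vms(64, 256, 4, 16))  # 16 VMs, limited by memory
```

Note that memory, not CPU, is the binding constraint in this example, which is why dense hyperscale hosts tend to be configured with large amounts of RAM.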
Hyperscale computing facilities are built from the ground up to accommodate these high-performance workloads, allowing them to handle heavy processing tasks with correspondingly high efficiency. So while a hyperscale data center might consume a lot of energy, getting the same amount of processing power and performance out of a collection of conventional enterprise data centers would actually consume much more energy.
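That efficiency claim is usually expressed through power usage effectiveness (PUE), the ratio of total facility energy to the energy consumed by IT equipment alone. The PUE values below are illustrative assumptions chosen to show the shape of the comparison, not measurements of any specific facility:

```python
# Back-of-the-envelope sketch: comparing annual facility energy for the
# same IT workload under different assumed PUE ratios.

def facility_energy_kwh(it_load_kw, pue, hours=24 * 365):
    """Total facility energy over the period: IT load times PUE."""
    return it_load_kw * pue * hours

it_load_kw = 1000  # hypothetical 1 MW of IT equipment

hyperscale = facility_energy_kwh(it_load_kw, pue=1.2)  # assumed efficient facility
enterprise = facility_energy_kwh(it_load_kw, pue=1.8)  # assumed enterprise server room

print(f"Hyperscale: {hyperscale:,.0f} kWh/yr")
print(f"Enterprise: {enterprise:,.0f} kWh/yr")
print(f"Savings: {1 - hyperscale / enterprise:.0%}")  # ~33% less energy
```

Under these assumed ratios, the same 1 MW of computing work costs a third less energy in the more efficient facility, which is the sense in which consolidating workloads into hyperscale facilities saves power overall.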
With more people connected to the internet than ever before and internet of things (IoT) devices transmitting more data from the edge of existing networks, companies are practically drowning in information that they could be using to guide their business decisions. Unfortunately for them, much of this data is unstructured and difficult to manage. In order to properly analyze it and derive meaningful insights that could inform strategy, they need to use sophisticated algorithms capable of sifting the “wheat from the chaff” to identify the trends and events that may be useful.
Developing these algorithms is challenging enough; finding the computing resources to power them is even more difficult. The world’s most sophisticated analytical software applications are incredibly resource-intensive, requiring computational power far beyond the capacity of most organizations. Cloud computing solutions have made it possible for companies to purchase the analytics tools they need, but the servers running those programs need to be located somewhere. With their concentrated and expansive computing resources, hyperscale computing facilities are a natural place to turn for companies looking to capitalize on the potential of big data.
With so many IoT devices hitting the market and generating actionable data, companies are struggling to find the ideal network solutions to take advantage of enhanced connectivity. While hyperscale data centers can accommodate the massive amounts of data being generated throughout company networks, their centralized locations make them poorly suited to enhancing the functionality of IoT edge devices. Distance matters when it comes to delivering services to end users, and the typical hyperscale data center is located far from the average device user. With that distance comes latency, the delay incurred as data travels from one point in a network to another.
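The physics behind that latency is easy to estimate. Signals in optical fiber travel at roughly 200,000 km/s (about two-thirds the speed of light in a vacuum), so distance alone sets a hard floor on round-trip time before any queuing or processing delay is added:

```python
# Rough sketch: the minimum round-trip time imposed by fiber propagation
# delay alone, ignoring routing, queuing, and processing overhead.

FIBER_KM_PER_MS = 200.0  # ~200,000 km/s expressed as km per millisecond

def min_round_trip_ms(distance_km):
    """Lower bound on round-trip time over fiber at the given distance."""
    return 2 * distance_km / FIBER_KM_PER_MS

for km in (50, 500, 5000):
    print(f"{km:>5} km -> at least {min_round_trip_ms(km):.1f} ms RTT")
# 50 km -> 0.5 ms, 500 km -> 5.0 ms, 5000 km -> 50.0 ms
```

A device 50 km from an edge facility sees a propagation floor of half a millisecond, while the same device talking to a hyperscale facility 5,000 km away cannot do better than 50 ms round trip, no matter how fast the servers are.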
To truly unleash the power of IoT edge devices, companies need to incorporate edge computing architectures that view hyperscale data centers as only one component of a larger solution. Smaller edge data centers located closer to end users can provide faster services by taking on much of the processing workloads that IoT edge devices can’t handle locally. The edge data center serves as something of a relay station, handling immediate processing needs while passing analytics data on to the hyperscale computing facility. By incorporating the different strengths of these facilities into their network architecture, forward-thinking companies can deliver outstanding (and fast) services to their customers.
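The relay pattern described above can be sketched as follows. This is a conceptual illustration, not a reference implementation, and all names in it are hypothetical:

```python
# Conceptual sketch of the edge-relay pattern: an edge node answers
# latency-sensitive requests locally and batches analytics records for
# later shipment to the distant hyperscale facility.

class EdgeNode:
    def __init__(self, batch_size=3):
        self.batch_size = batch_size
        self.pending = []   # analytics records awaiting upload
        self.uploaded = []  # stands in for the hyperscale facility's store

    def handle_reading(self, device_id, value):
        # Immediate, local decision -- no round trip to the core network.
        alert = value > 100
        # Queue the raw record for bulk analytics upstream.
        self.pending.append({"device": device_id, "value": value})
        if len(self.pending) >= self.batch_size:
            self._flush()
        return alert

    def _flush(self):
        # In a real deployment this would be a network call to the core.
        self.uploaded.extend(self.pending)
        self.pending.clear()

edge = EdgeNode()
print(edge.handle_reading("sensor-1", 42))   # False: within limits
print(edge.handle_reading("sensor-1", 120))  # True: alert raised locally
edge.handle_reading("sensor-2", 7)
print(len(edge.uploaded))                    # 3: batch shipped upstream
```

The design choice this illustrates is the split in the text: time-critical decisions happen at the edge, while the heavyweight, delay-tolerant analytics work is deferred to the hyperscale facility.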
As data demands continue to intensify, hyperscale data centers will play an important role for any organization that needs cloud computing solutions or big data analytics. While a hyperscale data center is not a cure-all, companies should keep in mind how incorporating its services into their network architecture can greatly enhance flexibility and performance.