Is Your Data Center Ready to Take on the Challenges of IoT?
By: Kaylie Gyarmathy on July 24, 2018
Gone are the days of data centers that house little more than centralized servers. Today’s data centers serve a variety of functions, incorporating hybrid-cloud deployments and the latest edge computing techniques. As more and more organizations deploy Internet of Things (IoT) devices in their network architecture, data centers are scrambling to keep up with the latest trends in the technology sector.
The number of IoT devices is expected to exceed 20 billion by 2020, and all of them will be generating data of some kind. Even if these devices process data locally via edge computing, there will still be some information that needs to be sent back to a central server for storage or analytics processing. Data centers will therefore continue to play an important role in the network architecture that makes IoT possible, but their form and function may change depending upon the specific needs of their clients.
How Much Data?
So exactly how much data do experts expect IoT devices to generate? Industry estimates vary, but Cisco anticipates that the number will exceed 800 zettabytes per year by the end of 2021 and will grow exponentially, not linearly, in the years beyond. If that doesn’t sound imposing enough, consider that a single zettabyte is equal to approximately one trillion gigabytes.
The volume of data generated is growing so fast that it’s difficult to even measure how much is being produced each year. To put the scale in context, a 2013 study estimated that about 90% of all the world’s data had been generated in the previous two years. Most of this data is unstructured, lacking any pre-defined organizational model that determines where it should be processed and stored. Fortunately, only about 10% of unstructured data is considered useful enough to be retained for analysis. But applied to the 800 zettabytes IoT devices are expected to generate each year, that still amounts to over 80 zettabytes.
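As a rough check, the 80-zettabyte figure follows directly from combining Cisco’s estimate above with the 10% usefulness rule of thumb. A minimal sketch of that arithmetic (the inputs are the article’s estimates; the calculation itself is purely illustrative):

```python
# Back-of-envelope arithmetic behind the "over 80 zettabytes" figure.
annual_iot_data_zb = 800        # Cisco's ~800 ZB/year IoT estimate for 2021
useful_fraction = 0.10          # ~10% of unstructured data deemed worth keeping

useful_data_zb = annual_iot_data_zb * useful_fraction     # 80 ZB
gigabytes_per_zettabyte = 1_000_000_000_000               # 1 ZB = ~1 trillion GB
useful_data_gb = useful_data_zb * gigabytes_per_zettabyte

print(f"~{useful_data_zb:.0f} ZB of useful data per year "
      f"({useful_data_gb:.1e} GB)")
```

Even after discarding 90% of what IoT devices produce, the remainder is still tens of trillions of gigabytes per year that must be stored and analyzed somewhere.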
Much of the data generated by IoT devices can be considered unstructured and must be mined by powerful analytics tools to produce valuable business insights. Data centers will play a key role in this process. While IoT devices are effective at processing information quickly to make immediate decisions, they lack the power and scope to make meaningful use of every byte of data they gather. Companies will need to leverage data centers for both storage and the kind of big data analysis they require for making strategic decisions.
How Many Data Centers?
With IoT devices generating so much information, data center infrastructure has been growing fast to keep pace with demand. One industry analyst projected that 400 million new servers will be needed by 2020 to support the demands of IoT and other cloud services. Assuming that a large, 400,000 square foot data center might contain 5,000 racks with 20 servers each (100,000 servers per facility), 4,000 such facilities would be needed just to meet those needs. Considering that each one of those centers would require about 50 megawatts of power, it’s no surprise that data centers are expected to consume a staggering 20% of the world’s power by 2025.
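The back-of-envelope math above can be sketched in a few lines. The server projection comes from the article; the rack and power figures are the same illustrative assumptions, not vendor specifications:

```python
# Estimate of the data center build-out implied by 400 million new servers.
new_servers = 400_000_000        # projected new servers needed by 2020
racks_per_facility = 5_000       # assumed for a large 400,000 sq ft site
servers_per_rack = 20            # assumed rack density

servers_per_facility = racks_per_facility * servers_per_rack   # 100,000
facilities_needed = new_servers // servers_per_facility        # 4,000

megawatts_per_facility = 50      # assumed draw for a facility this size
total_megawatts = facilities_needed * megawatts_per_facility

print(f"{facilities_needed:,} facilities drawing "
      f"{total_megawatts / 1000:,.0f} GW in total")
```

At 4,000 facilities drawing 50 MW each, the implied total is on the order of 200 gigawatts, which makes the projected share of global power consumption easier to believe.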
Fortunately, organizations are taking steps to satisfy these massive storage needs. In 2017, companies invested $18.17 billion in data centers in the United States alone. The US was already a world leader in data center usage, with almost 3 million data centers, or about one for every 100 citizens. It’s also home to 44% of the world’s 390+ “hyper-scale” data centers, the most famous of which, the NSA’s Utah Data Center (code-named “Bumblehive”), is said to have the capacity to gather and store a yottabyte (1000 zettabytes).
With edge computing becoming more common, however, large data centers are not the only solution to data storage needs. Smaller, modular data centers that can be located closer to end users are being deployed in a number of ways. From offering prefabricated buildings assembled on site to repurposing existing commercial space, data center providers are finding creative ways to take the pressure off overburdened networks by facilitating more edge computing capacity. These solutions are often far less expensive than traditional data centers, making them more viable in emerging edge markets where it doesn’t make economic sense to construct a facility with a huge footprint.
Many of these new data centers will be optimized to facilitate heavy machine-to-machine (M2M) workloads. With so many IoT devices transmitting data for analytical processing, it makes sense to set up dedicated edge data centers to accommodate them. These data centers will be unique in that their “customers” are machines rather than human users. Since their usage patterns will be dictated by software, they will be much more predictable and compliant users, putting less strain on power infrastructure.
Better predictability contributes to better reliability. Since these facilities are less susceptible to failure, many of them will be able to operate unmanned. Automated software can monitor the data center’s usage needs in real-time, creating alerts for human technicians to resolve issues only when necessary. These data centers will be tremendously valuable in terms of reducing the footprint of the infrastructure needed to support edge computing and IoT.
As IoT devices generate more and more data, it falls to data center providers to keep up with the intense demand for storage space and processing power. While traditional data centers will meet some of these needs, more creative solutions in facility construction and automation can provide advantages in emerging edge markets, helping companies get the most out of the ever-expanding potential of IoT devices.
About Kaylie Gyarmathy
As the Marketing Manager for vXchnge, Kaylie handles the coordination and logistics of tradeshows and events. She is responsible for social media marketing and brand promotion through various outlets. She enjoys developing new ways and events to capture the attention of the vXchnge audience.