
By: Ernest Sampera on August 2nd, 2019


Top 2019 Data Center Trends Impacting Your Infrastructure and Operations


Today’s data centers don’t much resemble the mainframes familiar to previous generations of IT personnel. Modern facilities are wonders of computing technology, featuring state-of-the-art high-density servers and revolutionary cooling systems. They also offer a wide range of services that are accessible to even some of the smallest startups, providing them with infrastructure resources that would have been unthinkable in previous decades.

Colocation data centers are continuing to drive innovation to provide their customers with better, and often customized, solutions for their networking and computing challenges. Keeping pace with the next generation of data center trends can help companies identify which services are right for them and learn how data centers can push their businesses forward. If these organizations don’t know what’s available to them, they will have difficulty charting the future course of their IT and data strategy.

Top 2019 Data Center Trends

Software Defined Infrastructures

Perhaps the most exciting data center trend over the last decade has been the widespread push toward software-defined infrastructures. Rather than simply serving as a storehouse for data, a software-defined data center (SDDC) virtualizes the computing and storage power available through its servers into software form, which is then bundled and sold to customers as a service. This process allows multiple users to install and manage their own services on the same physical server. Each virtualized server is cordoned off from the others, ensuring both privacy and flexibility. Since customers are purchasing virtualized assets in a software-defined infrastructure, it’s incredibly easy to scale them to suit their needs or relocate them to take advantage of a data center’s bundled services.
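
The mechanics are easier to picture in miniature. The sketch below is purely conceptual (the class and tenant names are invented for illustration, not any real hypervisor or cloud provider API): a single physical host’s CPU and memory are carved into isolated tenant slices, and resizing a slice is just a software change rather than new hardware.

```python
# Conceptual sketch only: models how an SDDC carves one physical host into
# isolated, independently sized virtual servers. Names are illustrative,
# not a real hypervisor or cloud-provider API.
from dataclasses import dataclass, field


@dataclass
class PhysicalHost:
    total_vcpus: int
    total_ram_gb: int
    tenants: dict = field(default_factory=dict)

    def _fits(self, tenants: dict) -> bool:
        # A proposed layout fits if the summed slices stay within the hardware.
        return (sum(c for c, _ in tenants.values()) <= self.total_vcpus
                and sum(r for _, r in tenants.values()) <= self.total_ram_gb)

    def provision(self, tenant: str, vcpus: int, ram_gb: int) -> None:
        # Each tenant gets its own cordoned-off slice of the same hardware;
        # calling this again for an existing tenant simply resizes the slice.
        proposed = {**self.tenants, tenant: (vcpus, ram_gb)}
        if not self._fits(proposed):
            raise ValueError("insufficient capacity on this host")
        self.tenants = proposed


host = PhysicalHost(total_vcpus=96, total_ram_gb=768)
host.provision("startup-a", vcpus=8, ram_gb=64)
host.provision("startup-b", vcpus=16, ram_gb=128)
host.provision("startup-a", vcpus=16, ram_gb=128)  # scaling up is only a software change
print(host.tenants)
```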

Server virtualization has had a tremendous impact on data center efficiency. Software-defined infrastructures allow facilities to offer services at low costs while also keeping power and cooling demands in check. High-density servers running heavy virtualized workloads may consume a lot of electricity and generate a lot of heat, but it’s far more efficient to host multiple customers on a single high-density server than on several lower density units. Those cost savings can be passed along to customers, allowing them to invest in additional services beyond server space.
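
A quick back-of-the-envelope comparison shows why consolidation pays off. Every figure below is an illustrative assumption, not a measurement from any particular hardware.

```python
# Back-of-the-envelope comparison; all wattage and workload figures are
# illustrative assumptions, not vendor or facility measurements.
HIGH_DENSITY_SERVER_WATTS = 800   # one fully loaded high-density host
LOW_DENSITY_SERVER_WATTS = 250    # one lightly loaded standalone server
WORKLOADS = 10                    # customer workloads to place

consolidated_watts = HIGH_DENSITY_SERVER_WATTS           # all workloads share one virtualized host
dedicated_watts = WORKLOADS * LOW_DENSITY_SERVER_WATTS   # one low-density server per workload

print(f"consolidated: {consolidated_watts} W")
print(f"dedicated:    {dedicated_watts} W")
print(f"power savings: {1 - consolidated_watts / dedicated_watts:.0%}")
```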

Colocation

As the costs of building a private data center continue to escalate and the versatility of third-party data centers expands, more companies are turning to colocation as an ideal solution for their IT infrastructure. While renting space in a facility might not sound like it would qualify as a data center technologies trend, it’s important to remember that today’s data centers are entirely different beasts from the old data repositories of previous decades. The modern data center is a connectivity bonanza, offering organizations access to almost every digital service imaginable. Colocation isn’t just about renting floor space; it also means leveraging the enhanced power and cooling capabilities of data centers to reduce ongoing infrastructure costs.

In most cases, colocation data centers give customers tools for managing and overseeing their infrastructure that go far beyond anything they could afford to put in place in a private facility of their own. From business intelligence software that gives unparalleled visibility into the actual usage patterns of their network to robust asset tracking and 24x7x365 remote hands support, colocation gives small to medium-sized companies the resources to compete directly with their bigger, better-resourced competitors.

Hybrid Cloud and Multi-Cloud Architecture

Cloud computing is one of the most consequential next-generation data center trends affecting businesses today. When cloud services first became available, many companies jumped in with both feet, lifting and shifting or re-architecting their infrastructure to migrate the whole of their network into the cloud. While this worked out well for some of them, other companies found that the security concerns and unexpected expenses of operating in a purely public cloud environment didn’t make sense for their business. Some companies opted to leave the cloud altogether, transitioning instead into a colocation solution or taking advantage of a data center’s software-defined infrastructure to build a private cloud in virtualized servers.

But some companies still need the services of the public cloud. Responding to that need, data centers have developed hybrid cloud architecture and multi-cloud solutions that allow companies to take advantage of the power of public cloud computing while still enjoying the security and control of a private network. Hybrid cloud architecture stores sensitive and valuable data in a private network while also establishing connections to a public cloud service. This architecture allows companies to protect and control their data while still using it in a public cloud environment. Since the data resides in the private network and only travels into the public cloud under heavy encryption, hybrid clouds provide peace of mind that simply doesn’t exist in a purely public cloud. Multi-cloud deployments work in a similar fashion, but incorporate multiple public cloud services, either through a single vendor or a best-of-breed strategy.
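
In practice, the pattern looks something like the sketch below: sensitive data stays in the private environment, and only encrypted payloads ever cross into the public cloud. The example uses the open-source Python cryptography package, and publish_to_public_cloud() is a hypothetical placeholder for whatever public cloud SDK call a real deployment would make.

```python
# Minimal sketch of the hybrid-cloud pattern: data lives in the private
# network and is encrypted before anything leaves it.
# Requires the real `cryptography` package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # stays in the private network / key vault
fernet = Fernet(key)


def publish_to_public_cloud(payload: bytes) -> bytes:
    # Hypothetical placeholder: stands in for an object-storage upload or
    # message publish. The public side only ever sees ciphertext.
    return payload


record = b'{"customer_id": 42, "balance": 1337.00}'  # lives in the private network
ciphertext = fernet.encrypt(record)                  # encrypted before it leaves
stored = publish_to_public_cloud(ciphertext)

# Results returned from the public cloud are decrypted back on the private side.
assert fernet.decrypt(stored) == record
```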

Edge Computing

If cloud computing is the darling of the previous decade, edge computing might well be the next major trend for the data center industry. Edge computing architectures expand the reach of a typical cloud network by pushing key processing functions to the edge of the network, closer to where the data itself is gathered. Traditional cloud computing requires collected data to travel back to the core of the network where it can be processed by the central server. Since data is constrained by the laws of physics, it doesn’t travel there instantaneously, resulting in latency that slows streaming content services, medical devices, industrial scanners, and other connected applications.
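
A rough, physics-only calculation shows how quickly that latency adds up. The distances below are illustrative assumptions, and real round trips add routing, queuing, and processing time on top of pure propagation delay.

```python
# Rough physics-only illustration of why distance matters for latency.
SPEED_IN_FIBER_KM_PER_MS = 200.0   # light in optical fiber travels at roughly 2/3 the speed of light


def round_trip_ms(distance_km: float) -> float:
    # Best-case propagation delay there and back, ignoring all other overhead.
    return 2 * distance_km / SPEED_IN_FIBER_KM_PER_MS


for label, km in [("edge site, 50 km away", 50), ("regional core, 2,000 km away", 2000)]:
    print(f"{label}: ~{round_trip_ms(km):.1f} ms best-case round trip")
```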

By processing data closer to the edge of the network where it’s collected, edge computing can greatly increase speed and responsiveness. As internet of things (IoT) devices become more common each year, the number of devices capable of handling that processing load is increasing to the point at which edge computing is more viable than ever. Edge data centers are also being used to extend network reach and increase speed, providing more powerful processing resources that can handle tasks too big for IoT devices, but not large enough to send back to the core of the network.

Hyperscale Data Centers

As more organizations turn to cloud computing solutions, the demand for the data center infrastructure that supports them is increasing as well. Hyperscale facilities are substantially larger than most enterprise data centers, sometimes housing thousands upon thousands of servers. Most of them are operated by the biggest names in the tech industry, such as Google, Apple, and Microsoft, because these companies offer the type of cloud computing services that require such immense scale. Microsoft’s Quincy, WA data center, for example, utilizes 24,000 miles of network cable, the equivalent of six Amazon Rivers.

With demand for cloud and social media services showing no sign of decline, companies are investing in the construction of more of these massive facilities. There were 430 data centers that could be classified as hyperscale at the end of 2018, and another 132 are scheduled for construction in 2019. These facilities will be crucial to managing the massive amounts of data generated by virtual reality services, big data social media analytics, and the information being gathered by IoT devices.

Innovative Cooling Infrastructure

Data centers have long relied upon conventional air conditioning infrastructure to meet their cooling needs. Considering that cooling is responsible for a huge percentage of data center energy consumption (up to 40 percent), it’s hardly a surprise that many facilities have focused on improving their cooling strategies as a way to become more efficient. The power-hungry processors required to run modern AI applications are also generating more heat than traditional cooling infrastructure can handle, forcing data centers to take new approaches to meet their cooling needs.
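
One common way to quantify that overhead is Power Usage Effectiveness (PUE): total facility power divided by the power delivered to IT equipment, where 1.0 would be a perfectly efficient facility. The load figures in the sketch below are made-up examples rather than measurements from any particular data center.

```python
# Illustrative PUE calculation; the load figures are made-up examples.
# PUE = total facility power / IT equipment power, so lower is better and 1.0 is ideal.
it_load_kw = 1000.0          # servers, storage, network gear
cooling_kw = 600.0           # air-conditioning overhead
other_overhead_kw = 100.0    # lighting, power distribution losses, etc.

total_kw = it_load_kw + cooling_kw + other_overhead_kw
pue = total_kw / it_load_kw
cooling_share = cooling_kw / total_kw

print(f"PUE: {pue:.2f}")                               # 1.70 in this example
print(f"Cooling share of total: {cooling_share:.0%}")  # roughly a third of the power bill
```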

Fortunately, AI applications themselves provide an ideal solution, with their ability to dynamically monitor and regulate the environment of a data center. Google recently handed over control of the environmental system in one of its hyperscale data centers to a DeepMind AI program after several years of successful testing. The program’s minute adjustments in cooling performance translated into significant energy savings within the first several months. Liquid cooling technology has also come into its own, both in the form of direct-to-chip and full-immersion solutions. As high-performance processors begin to generate too much heat to be managed by air-cooled systems, liquid cooling will surely become a more cost-effective and practical means of regulating the data center environment efficiently.

Data Center Industry Talent Demands

At 2018’s Data Center World convention, two Google data center executives made headlines by pointing out that the data center industry is facing a major talent crisis. The industry’s workforce is predominantly older and male. According to a 2018 survey of industry-wide data center operations, 56 percent of respondents had more than 20 years of experience, while only five percent had less than five years. More revealing, women made up less than six percent of the workforce, which only 30 percent of respondents saw as an issue.

Although the data center industry is growing rapidly, a 2017 study found that 81 percent of IT leaders said it was difficult to find qualified candidates. With so many experienced workers transitioning out of the workforce, it’s more important than ever for data centers to take an aggressive approach to hiring and training candidates to take their place before significant institutional knowledge goes out the door with them. 

These data center trends have already changed the industry significantly. Where data centers were once the exclusive concern of large companies, today they give smaller organizations the ability to deliver products and services more effectively than ever before. As data center technologies continue to evolve, these trends will surely give way to even more remarkable developments that drive innovation to another level.

 

About Ernest Sampera

Ernie Sampera is the Chief Marketing Officer at vXchnge. Ernie is responsible for product marketing, external & corporate communications and business development.
