
By: Kaylie Gyarmathy on January 16th, 2019


How to Reduce Latency Using Edge Computing



Edge computing architecture has emerged as an exciting new topic in the world of network infrastructure in recent years. While the concept isn’t necessarily new, developments in internet of things (IoT) devices and data center technology have made it a viable solution for the first time. Edge computing relocates key data processing functions from the center of a network to its edge, closer to where data is gathered and where it’s delivered to end users. While there are many reasons why this architecture makes sense for certain industries, the most obvious advantage of edge computing is its ability to combat latency.

The Lag is Real

Although many people may only hear about latency when they’re blaming it for their online gaming misfortunes, video games are actually a good example for explaining the concept. Often confused with bandwidth, which measures how much data can travel over a connection, latency measures how quickly that data travels from one point to another. In the context of a video game, high latency means it takes longer for a player’s controller input to reach the multiplayer server. High latency connections result in significant lag, a delay between a player’s controller inputs and the on-screen response. To players with low latency connections, a lagging opponent seems to react slowly to events or even stand still. From the high latency player’s perspective, other players appear to teleport around the screen because the connection can’t send and receive data quickly enough to keep up with the game information coming from the server.
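In practice, latency is usually measured as round-trip time (RTT): how long it takes a small message to reach a server and come back. As a rough, minimal sketch of that idea in Python, the snippet below times TCP connection handshakes to estimate RTT; the host name is a placeholder assumption, not a real game server.

```python
import socket
import time

def measure_rtt(host: str, port: int = 443, samples: int = 5) -> float:
    """Estimate network latency by timing TCP connection handshakes."""
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        # Opening the connection requires a full round trip to the server.
        with socket.create_connection((host, port), timeout=2):
            pass
        times.append((time.perf_counter() - start) * 1000)  # milliseconds
    return sum(times) / len(times)

# "game.example.com" is a hypothetical host used only for illustration.
print(f"Average RTT: {measure_rtt('game.example.com'):.1f} ms")
```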

The same latency is responsible for sputtering, fragmented streaming content. Buffering delays already occur in 29 percent of streaming experiences. Since video content is expected to make up 67 percent of global internet traffic (an estimated 187 exabytes) by 2021, latency is a problem that could very well become even more common in the near future. Studies have shown that internet users abandon videos that buffer or are slow to load after as little as two seconds of delay. Companies that provide streaming services need to find solutions to this problem if they expect to undertake the business digital transformation that will keep them competitive in the future.

What Causes Latency

In most cases, latency is a byproduct of distance. Although fast connections may make networks seem to work instantaneously, data is still constrained by the laws of physics. It can’t move faster than the speed of light, although innovations in fiber optic technology allow it to get about two-thirds of the way there. Under the very best conditions, it takes data about 21 milliseconds to travel from New York to San Francisco. That number is misleading, however: various bottlenecks caused by bandwidth limitations and rerouting near the data endpoints (the “last mile” problem) can add anywhere from 10 to 65 milliseconds of latency.
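As a back-of-the-envelope check on that 21-millisecond figure, the propagation delay can be worked out directly from the distance and the speed of light in fiber. The distance and fiber factor below are approximations used only for illustration.

```python
# Rough propagation-delay estimate, assuming light moves through fiber
# at roughly two-thirds of its speed in a vacuum.
SPEED_OF_LIGHT_KM_S = 299_792      # km per second in a vacuum
FIBER_FACTOR = 2 / 3               # approximate slowdown inside fiber
NYC_TO_SF_KM = 4_130               # approximate New York–San Francisco distance

one_way_ms = NYC_TO_SF_KM / (SPEED_OF_LIGHT_KM_S * FIBER_FACTOR) * 1000
print(f"Theoretical one-way delay: {one_way_ms:.1f} ms")   # roughly 21 ms
print(f"Theoretical round trip:    {2 * one_way_ms:.1f} ms")
```

Real-world routes are longer than the straight-line distance and pass through switches and routers, which is exactly why the extra 10 to 65 milliseconds shows up on top of this theoretical floor.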

Reducing the physical distance between the data source and its eventual destination is the best strategy for how to reduce latency. For markets and industries that rely on the fastest possible access to information, such as IoT devices or financial services, that difference can save companies millions of dollars. Speed, then, can provide a significant competitive advantage for organizations willing to commit to it.

How to Reduce Latency With Edge Computing

Edge computing architecture offers a groundbreaking solution to the problem of latency and how to reduce it. By locating key processing tasks closer to end users, edge computing can deliver faster and more responsive services. IoT devices provide one way of pushing these tasks to the edge of a network. Advancements in processor and storage technology have made it easier than ever to increase the power of internet-enabled devices, allowing them to process much of the data they gather locally rather than transmitting it back to centralized cloud computing servers for analysis. By handling more processing close to the source and relaying far less data back to the center of the network, IoT devices can greatly improve performance. This will be critically important for technology like autonomous vehicles, where a few milliseconds of lag could be the difference between a safe journey to a family gathering and a fatal accident.
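To illustrate the pattern (not any particular vendor’s API), the sketch below shows an edge device aggregating raw sensor readings locally and shipping only a compact summary upstream. The read_sensor and send_to_cloud functions are hypothetical stand-ins for real device and cloud integrations.

```python
import random
import statistics

def read_sensor() -> float:
    """Stand-in for a real sensor read; returns a simulated temperature."""
    return 20.0 + random.uniform(-0.5, 0.5)

def send_to_cloud(summary: dict) -> None:
    """Stand-in for an upload to a central cloud endpoint."""
    print(f"uploading summary: {summary}")

def edge_loop(window: int = 60) -> None:
    """Process readings locally and send only a small summary upstream."""
    readings = [read_sensor() for _ in range(window)]
    summary = {
        "count": len(readings),
        "mean": round(statistics.mean(readings), 2),
        "max": round(max(readings), 2),
    }
    # One small payload crosses the network instead of 60 raw samples.
    send_to_cloud(summary)

edge_loop()
```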

Of course, not every business digital transformation will be delivered by way of IoT devices. Video streaming services, for example, need a different kind of solution. Edge data centers, smaller, purpose-built facilities located in key emerging markets, make it easier to deliver streaming video and audio by caching high-demand content much closer to end users. This not only ensures that popular services are delivered faster, but also frees up bandwidth for content that must still come from more distant locations. If the top ten Netflix shows stream from a hyperscale facility in New York City but that same content is also cached in an edge facility outside Pittsburgh, for instance, end users in both markets can stream it more efficiently because the sources are distributed closer to consumers.
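One simple way to picture what an edge cache does is the classic cache-aside pattern: serve popular content from a nearby store and fall back to the distant origin only on a miss. The Python sketch below uses an in-memory cache as a stand-in for an edge facility; fetch_from_origin is a hypothetical placeholder, not Netflix’s or any CDN’s actual API.

```python
from functools import lru_cache

def fetch_from_origin(content_id: str) -> bytes:
    """Stand-in for a slow request to a distant origin/hyperscale facility."""
    return f"video bytes for {content_id}".encode()

@lru_cache(maxsize=128)
def serve_from_edge(content_id: str) -> bytes:
    """Cache-aside: serve popular content from local storage at the edge,
    falling back to the origin only on a cache miss."""
    return fetch_from_origin(content_id)

# The first request pays the origin round trip; repeats are served locally.
serve_from_edge("top-show-episode-1")
serve_from_edge("top-show-episode-1")  # cache hit, no origin traffic
```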

The combination of edge data centers and IoT devices has the potential to transform the way companies build their network architecture. Edge computing opens up a new range of options for how to reduce latency and deliver services more efficiently to end users. In a market increasingly driven by short attention spans, speed will very likely continue to be a key differentiator, making edge computing strategies increasingly vital to companies across many industries.


About Kaylie Gyarmathy

As the Marketing Manager for vXchnge, Kaylie handles the coordination and logistics of tradeshows and events. She is responsible for social media marketing and brand promotion through various outlets. She enjoys developing new ways and events to capture the attention of the vXchnge audience.
