In 2016, revenue from streaming video services hit $6.2 billion, eclipsing revenue from DVD sales ($5.4 billion) for the first time. From titans like Netflix and Amazon Prime to less popular providers like Acorn TV and Tubi TV, it’s hard to remember a time before streaming video was a part of everyday life.
But the truth is that it’s only been a little more than a decade since Netflix launched its nascent streaming service. During that time, several companies have entered the market, some finding a specific niche while others made a play for a broader audience. As streaming becomes the preferred form of content delivery for consumers, providers face a number of challenges in their quest for viewership.
Today’s consumers have little patience for delays when it comes to streaming video content. A study from 2012 analyzing data from 6.7 million unique users found that viewers began to abandon a buffering video after only a two-second delay, with viewership dropping off at a rate of about six percent per second thereafter. For streaming content providers, then, latency can absolutely kill their business.
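A rough sketch of what those numbers imply, assuming a simple linear model in which abandonment begins after the two-second threshold and then removes about six percent of the audience per additional second (the function name and the linear model are illustrative, not taken from the study itself):

```python
def viewers_remaining(delay_seconds, initial_viewers=1_000_000):
    """Estimate viewers left after a buffering delay, using the study's
    rough figures: abandonment starts after ~2 s, then viewership drops
    about 6% of the initial audience per additional second."""
    if delay_seconds <= 2:
        return initial_viewers
    lost_fraction = min(1.0, 0.06 * (delay_seconds - 2))
    return int(initial_viewers * (1 - lost_fraction))

print(viewers_remaining(2))   # no loss at the two-second threshold
print(viewers_remaining(10))  # 8 extra seconds -> roughly half the audience gone
```

Under this model, a ten-second stall costs a provider nearly half its viewers, which is why latency is treated as an existential problem rather than a cosmetic one.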
With more and more media companies breaking into the streaming market, the challenges are only going to increase due to the sheer amount of data being forced through networks that were never meant to handle so much traffic. To make matters worse, the proliferation of smartphones has created an expectation among consumers that they should be able to stream their content anywhere at any time, regardless of local network infrastructure. Given the inevitable bottlenecks that come from moving content between wired and wireless networks, companies need to find solutions that allow them to meet their customers’ demands or risk losing market share.
Streaming services took up over 70 percent of downstream data traffic in North America during peak hours in 2015, with Netflix alone accounting for 37 percent of that content. Video content made up 51 percent of global internet traffic in 2016 (49 exabytes total), and Cisco anticipates that share will increase to 67 percent by 2021 (187 exabytes).
That’s a lot of traffic, and it's bound to make existing problems worse. A 2014 study found that buffering delays already occur in 29 percent of global streaming experiences. If streaming providers can’t find a way to deliver that content instantly, they won’t be in the market for very long.
Fortunately, edge computing offers a way out for companies looking to deliver content quickly and with minimal latency. By distributing content and caching data across multiple data center facilities, streaming providers can put their content closer to end users. This reduces the distance data must travel, ensuring that it’s delivered quickly and at higher quality.
The location of the data center matters. Speed is constrained by latency, the time it takes a piece of data to physically travel from its source to the end user. While most people think of data traversing great distances almost instantaneously, even high-end fiber optic cable is constrained by the laws of physics. Data does travel quickly, at about 2/3 the speed of light in fiber, but it can still take up to 65 milliseconds to get from New York to San Francisco under the best of circumstances. That might not sound like much, but it takes quite a toll on video content, which must transfer huge amounts of data from the content provider to the end user.
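The physics here is simple enough to work out directly: divide the fiber route length by roughly two-thirds the speed of light. A back-of-the-envelope sketch, where the 4,700 km New York to San Francisco route length is an assumption (real fiber paths run longer than the great-circle distance, and switching and queuing add further delay on top of pure propagation):

```python
SPEED_OF_LIGHT_KM_S = 299_792                    # km/s in a vacuum
FIBER_SPEED_KM_S = SPEED_OF_LIGHT_KM_S * 2 / 3   # signals in fiber: ~2/3 c

def one_way_latency_ms(route_km):
    """Minimum one-way propagation delay over a fiber route,
    ignoring switching, queuing, and protocol overhead."""
    return route_km / FIBER_SPEED_KM_S * 1000

# Assumed ~4,700 km fiber route from New York to San Francisco.
print(round(one_way_latency_ms(4_700), 1))  # roughly 23-24 ms each way

# A nearby edge data center ~100 km away, by contrast:
print(round(one_way_latency_ms(100), 2))    # well under a millisecond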
Reducing that distance leads to better overall performance. Since data isn’t traveling as far, drops in bandwidth are less likely to cause buffering issues due to latency. This is particularly important for users in locations outside of Tier 1 media markets. Many streaming providers are already setting up colocation facilities along key fiber optic line routes across the US. Caching high demand content in these edge data centers also frees up network bandwidth for other content that must still be accessed from a more distant location. This prevents the typical traffic jams that often bog down networks forced to serve up on-demand content on a massive scale.
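The caching strategy described above can be sketched as a small least-recently-used (LRU) cache at the edge: high-demand titles are served locally, and only cache misses make the slow trip back to a distant origin facility. The class and method names below are illustrative, not drawn from any provider's actual stack:

```python
from collections import OrderedDict

class EdgeCache:
    """Minimal LRU cache sketch: hot titles stay at the edge;
    misses fall through to the distant origin data center."""

    def __init__(self, origin, capacity=3):
        self.origin = origin            # callable: title -> content
        self.capacity = capacity
        self.cache = OrderedDict()
        self.hits = self.misses = 0

    def get(self, title):
        if title in self.cache:
            self.hits += 1
            self.cache.move_to_end(title)    # mark as recently used
            return self.cache[title]
        self.misses += 1
        content = self.origin(title)         # slow trip to origin
        self.cache[title] = content
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)   # evict least recently used
        return content

edge = EdgeCache(origin=lambda title: f"<video:{title}>")
for title in ["show-a", "show-b", "show-a", "show-a", "show-c"]:
    edge.get(title)
print(edge.hits, edge.misses)  # 2 hits served locally, 3 origin fetches
```

Even in this toy run, repeat requests for a popular title never leave the edge, which is exactly the traffic that would otherwise pile onto the long-haul links.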
Another advantage of using edge data centers relates to high volume traffic markets. A Tier 1 market like New York City streams huge amounts of video content around the clock. If data can only be accessed from a few centralized hyperscale data centers around the country, the bulk of their bandwidth will inevitably be dedicated to serving major markets. By positioning edge data centers to serve these markets with cached content, providers can alleviate the pressure on their networks, freeing up precious bandwidth to service a much wider market with reduced latency.
By incorporating edge data centers into their streaming service strategies, content providers can overcome many of the challenges inherent with delivering high quality video on demand. As more companies crowd their way into the streaming market, edge computing will continue to represent a key advantage for companies looking to minimize latency and make the most of their network infrastructure.