
February 3, 2019   •   Nathan Moore

Cut Media Workload Latency and Costs with Edge Computing


Latency is one of the big problems of the modern Internet, though not everyone realizes it. Online video streaming, for example, is usually measured in terms of bandwidth, not latency, but the truth is you can't sustain high bandwidth without low latency. The emergence of edge computing brings the promise of dramatically lower media workload latency as well as significant operating cost reduction.

What is latency and how can you decrease it?

Latency is the delay between the moment your end user (the person viewing a video, seeing an ad, downloading a game, or otherwise consuming online content) requests content and the moment a server decides what content to serve and actually serves it.

Modern networks are so fast they are bounded in speed by the speed of light. Every mile closer to your end user reduces the latency by a tiny fraction, so the lowest latency you can get is to be next door to your end user. Edge is all about being metaphorically next door to your end user, your content consumer.
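To get a feel for why distance matters, here is a quick back-of-the-envelope sketch of the latency floor that physics imposes. The fiber factor is a standard rule of thumb and the distances are illustrative:

```python
# Rough lower bound on network round-trip time imposed by the speed of
# light in optical fiber (roughly 2/3 of c in vacuum). No queuing,
# routing, or processing delays are modeled; this is the physical floor.

SPEED_OF_LIGHT_KM_S = 299_792   # km/s in vacuum
FIBER_FACTOR = 2 / 3            # light in fiber travels at roughly 2/3 c

def min_rtt_ms(distance_km: float) -> float:
    """Best-case round trip: there and back at the speed of light in fiber."""
    one_way_s = distance_km / (SPEED_OF_LIGHT_KM_S * FIBER_FACTOR)
    return 2 * one_way_s * 1000  # round trip, in milliseconds

# Roughly 3,940 km between Los Angeles and New York as the crow flies
print(f"LA <-> NY floor:   {min_rtt_ms(3940):.1f} ms")   # about 39 ms
print(f"Same city (~30 km): {min_rtt_ms(30):.2f} ms")    # well under 1 ms
```

Real-world paths add routing hops and queuing on top of this floor, which is how a 39ms physical minimum becomes the 75ms cross-country round trip discussed below. Being in the same city shrinks the floor to a fraction of a millisecond.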

Edge means that the server is as physically close to your end user as possible: the whole serving infrastructure and the whole supporting network sit in the same city as the end user, no matter where in the world that city is. Servers follow population: your end users, the content consumers.

The next generation of Edge is here

The previous generation of Edge held content at the Edge (those familiar with CDNs know about caching content closer to consumers) but kept most content-serving decisions remote and centralized. Moving content-serving logic away from remote, centralized sites and into the distributed Edge is the key to reducing latencies still further. The hard problem has always been making the Edge a multi-tenant environment with proper boundaries between tenants, so that your custom logic executes quickly and reliably. It's this problem that the next generation of Edge finally solves.

Introducing Edge Computing

Let's use the example of serving display advertising, which applies to distributing video, images, music, and games as well. It's a common truism that no one waits for content to start, particularly an ad. You have one chance to serve it and serve it quickly. This is a concern not just of the ad serving community but of every media, entertainment, and game publisher with ad-supported content. With Edge Computing, ads are not only served from the Edge, but you can use Edge Computing to move the ad decision logic to the Edge as well. Picking the optimal ad to serve now happens at the Edge, and the wait time for your end customer (the latency) is the lowest it can be.
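To make "ad decision logic at the Edge" concrete, here is a minimal, purely hypothetical sketch of what such logic might look like. The inventory, CPM values, and targeting fields are all invented for illustration and are not part of any StackPath API:

```python
# Hypothetical edge ad-selection sketch: pick the highest-paying ad whose
# targeting tags all match the viewer's interests. All names are invented.

ADS = [
    {"url": "https://cdn.example.com/ads/sneakers.mp4", "tags": {"sports"}, "cpm": 4.0},
    {"url": "https://cdn.example.com/ads/laptop.mp4",   "tags": {"tech"},   "cpm": 6.5},
    {"url": "https://cdn.example.com/ads/generic.mp4",  "tags": set(),      "cpm": 1.0},
]

def choose_ad(interests: set[str]) -> str:
    """Return the URL of the best-paying ad whose tags are a subset of the
    viewer's interests. The untagged ad matches everyone, so there is
    always a fallback."""
    matching = [ad for ad in ADS if ad["tags"] <= interests]
    best = max(matching, key=lambda ad: ad["cpm"])
    return best["url"]

print(choose_ad({"tech", "sports"}))  # the laptop ad wins on CPM
print(choose_ad(set()))               # only the generic ad matches
```

Logic this small runs in microseconds; what dominates the end user's wait is where it runs, which is exactly the point of the example below.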

The round-trip ping time between Los Angeles and New York is typically about 75ms. If you have an ad server in New York whose job it is to choose which ad to serve, and a website visitor in Los Angeles, here is the trip the request must take. First, the visitor's browser makes an HTTP call to New York, where the ad server chooses the ad and returns the ad URL. This takes about 75ms. Then the browser makes a second HTTP call, using that ad URL, to a cache server in Los Angeles. This second call takes about 5ms. The total time to choose and serve the ad is about 80ms. With Edge Computing, however, the logic to choose which ad to serve sits at the Edge, which in this scenario is our Los Angeles data center. The total time to choose and serve the ad drops all the way to 5ms. This estimate covers the HTTP call from the Los Angeles visitor to the Los Angeles edge node, the edge node consulting the edge computing server, and the ad being returned directly.
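The arithmetic above boils down to a simple model. Here it is as a sketch, using only the 75ms and 5ms figures from the example:

```python
# Back-of-the-envelope model of the two serving paths described above.
RTT_LA_TO_NY_MS = 75   # cross-country HTTP call to the central ad server
RTT_LA_LOCAL_MS = 5    # HTTP call to a cache/edge node in the same city

def centralized_path_ms() -> int:
    """Ad decision made in New York, ad bytes fetched from an LA cache."""
    return RTT_LA_TO_NY_MS + RTT_LA_LOCAL_MS   # 75 + 5 = 80 ms

def edge_path_ms() -> int:
    """Ad decision and ad bytes both served from the Los Angeles edge."""
    return RTT_LA_LOCAL_MS                     # 5 ms

print(f"centralized: {centralized_path_ms()} ms")  # 80 ms
print(f"edge:        {edge_path_ms()} ms")         # 5 ms
```

The savings come entirely from eliminating the cross-country round trip, not from faster hardware: the decision still happens, it just happens next door.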

Another video delivery example is if you intend to insert ads within an existing video stream. In addition to putting ad choice logic at the Edge, Edge Computing can intercept the stream and dynamically insert that ad into the ongoing stream, without interrupting it. Because of the low latency inherent with Edge, this can be done at a very high bitrate, ensuring your customers get a seamless experience.
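Server-side ad insertion of this kind typically works by rewriting the stream's manifest on the fly. Here is a toy sketch for an HLS-style media playlist; the segment names are invented, and a production implementation would also handle timing, encryption, and ad-marker tags:

```python
# Toy server-side ad insertion: splice ad segments into an HLS-style
# media playlist before a chosen content segment, marking the timeline
# breaks with EXT-X-DISCONTINUITY tags. Segment names are invented.

CONTENT = """#EXTM3U
#EXT-X-TARGETDURATION:6
#EXTINF:6.0,
content0.ts
#EXTINF:6.0,
content1.ts
"""

AD_BREAK = ["#EXT-X-DISCONTINUITY", "#EXTINF:6.0,", "ad0.ts",
            "#EXT-X-DISCONTINUITY"]

def insert_ad(playlist: str, before_segment: str) -> str:
    """Insert the ad break immediately before `before_segment`, backing up
    over that segment's #EXTINF line so the pair stays together."""
    lines = playlist.splitlines()
    i = lines.index(before_segment) - 1   # the #EXTINF line of the segment
    return "\n".join(lines[:i] + AD_BREAK + lines[i:]) + "\n"

print(insert_ad(CONTENT, "content1.ts"))
```

Because the player just keeps requesting the next segment in the playlist, the ad plays and the content resumes without the stream ever being interrupted, which is what makes the seamless transition possible.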

Decreasing the wait time for an ad and improving the transition between video streams and inserted ad content both translate directly into more successful ad serving. Beyond the ad serving example, decreased latency also directly impacts your other goals for online content, including the number of clicks, views, downloads, and purchases. Improved performance and better targeting are critical for video streaming, image loading, game downloads, and app performance.

Customizing your workload with Edge Computing

The possibilities of what Edge Computing can offer are practically endless. Moving serving logic to the Edge also means moving some or all of your workload to the Edge. Combined with the StackPath API, the ability to dynamically deploy new logic allows near-instant changes to take place globally. Deploying the same logic everywhere at once solves the coordination problem and ensures consistency across your workloads.

Business process at the Edge

You know your business process very well, better than anyone else. Making Edge Computing a part of it makes for faster decisions and more efficient operations. Coordinating content decisions at the Edge, for example via log aggregation or metric collection, allows for business process optimization. The reduced latencies mean faster throughput in your process, and faster results from changes in your process, driving faster iteration and other business process improvements.
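As one small illustration of metric collection at the Edge, an edge node could pre-aggregate its request logs into compact summaries before shipping them to a central store, rather than forwarding every raw event. This sketch uses invented event fields:

```python
# Minimal sketch of edge-side metric pre-aggregation: each edge node
# rolls raw request events up into counters and forwards only the
# summary upstream. Event fields are invented for illustration.
from collections import Counter

def aggregate(events: list[dict]) -> dict:
    """Roll raw request events up into per-status counts and total bytes."""
    status_counts = Counter(e["status"] for e in events)
    return {
        "requests": len(events),
        "bytes_out": sum(e["bytes"] for e in events),
        "by_status": dict(status_counts),
    }

events = [
    {"status": 200, "bytes": 5120},
    {"status": 200, "bytes": 2048},
    {"status": 404, "bytes": 512},
]
summary = aggregate(events)
print(summary)  # {'requests': 3, 'bytes_out': 7680, 'by_status': {200: 2, 404: 1}}
```

Shipping one summary instead of thousands of raw log lines is also one of the bandwidth savings discussed in the next section.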

Reducing costs

Cost reduction is a function of removing unnecessary work, leaving out unneeded steps, and improving efficiencies. Edge Computing means minimizing the billable bandwidth used between the Edge and a remote, centralized computing environment. It means not requiring as many servers, or potentially any, at that remote, centralized computing environment. It even means, thanks to the StackPath API and global distribution, lower deployment and integration costs.

The things we have yet to consider

The promise of Edge Computing is real. With these new technologies we expect our customers to develop use cases beyond what we’ve identified. Since the launch of our serverless EdgeEngine last year, developers across the ecosystem have built innovative solutions just as we had hoped. With the launch of StackPath Edge Computing we expect the innovation to continue. We’d love to hear your ideas. Reach out to a StackPath expert via phone or chat to share your unique use case. Let’s work together to see what Edge Computing can do for you.
