
Latency or ‘Wait and See’ in Video Streaming

Latency is one of the biggest challenges to delivering a quality streaming experience to viewers. Latency can be defined as the period between when a frame is captured and when it is displayed on a screen. Usually expressed in milliseconds or seconds, latency in traditional broadcasting is around five seconds (not to be confused with the intentional seven-second delay used in live TV broadcasts to avoid airing mistakes or unacceptable content). In today's digital age, viewers are watching more broadband video content on various devices and expecting the same quality as traditional television, but latency can be a deterring factor.

Since the quality of mobile viewing depends on internet connection and bandwidth, a slow start often means buffering: the delay between hitting 'play' and seeing an image on screen. If viewers abandon the stream in frustration, your service could suffer audience churn and lower ad revenue. And if viewers are unhappy with their subscriptions, they may cancel and turn to a competitor, even when the weak link in the delivery chain is outside your brand's control.

Hitting 'play' and waiting to see content is the initial latency problem of a program. It's even more crucial for live streams such as sporting events, where 'grass-to-glass' delay, from the field to the viewer's screen, means any buffering or lag can spoil the game for fans. There are anecdotes of patrons at one sports bar being heard cheering at the outcome of a play several seconds before the neighboring bar saw the same play. Longer latencies can also sour the experience when viewers see social media posts or messages ahead of the actual play. Other streaming events that depend on the 'live' experience are gambling, gaming and auctions, because any delay can create an unfair advantage among competing participants. Their 'glass-to-glass' quality of experience is just as important.

OTT network delivery suffers from high latency because it is packet-based, traveling over hundreds of nodes and many systems before being displayed on a viewer's device, and each system along the way adds another delay. Typical HTTP streaming latency is 25 to 40 seconds, compared with approximately five seconds for traditional RF transmission. Latency is commonly categorized as reduced (six to 18 seconds), low (two to six seconds), and ultra-low (0.2 to two seconds). At five to six seconds of latency, OTT becomes competitive with traditional broadcast signals.
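These categories can be sketched as a simple lookup. The thresholds below are the ranges cited above; the function name and the boundary handling (inclusive upper bounds) are illustrative assumptions:

```python
def latency_category(seconds: float) -> str:
    """Classify glass-to-glass delay using the ranges cited above."""
    if seconds <= 2:       # 0.2 to 2 seconds
        return "ultra-low"
    if seconds <= 6:       # 2 to 6 seconds
        return "low"
    if seconds <= 18:      # 6 to 18 seconds
        return "reduced"
    return "standard"      # typical 25 to 40 second HTTP streaming

print(latency_category(1.5))  # ultra-low
print(latency_category(5))    # low: competitive with broadcast
print(latency_category(30))   # standard
```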

There are several factors that cause latency, some of which you may or may not control:
• Encoders
• Upload time
• CDN technologies or chunk-based transfer
• End users' network and connection type (wired, Wi-Fi, cellular)
• Playback devices and players

Competition for streaming audiences is growing from subscription video on demand (SVOD) services, pay TV operators, and other OTT streaming services such as HBO, Apple TV+ and Disney+. If your service quality doesn't support low latency, you could easily lose customers to one that does. There are many low-latency encoding and playback solutions that service providers and CDNs might consider.

To strive for low latency, Amazon recommends measuring and optimizing latency at every step in your workflow. HTTP Live Streaming (HLS) and Dynamic Adaptive Streaming over HTTP (DASH) can meet broadcast-like latency of six to 10 seconds by using short segments of two to three seconds. Segment length matters: one-second segments can bring latency down to about five seconds, while two-second segments yield seven to 10 seconds. Other factors to consider are updating your delivery architecture and your video player.
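As a back-of-the-envelope check of those figures, here is a rough model, not a definitive one: it assumes the player buffers about three full segments before starting, and the two-second fixed encode/packaging/CDN overhead is an assumption chosen to match the numbers above:

```python
def estimate_latency(segment_s: float, buffered_segments: int = 3,
                     overhead_s: float = 2.0) -> float:
    """Rough glass-to-glass estimate for segmented HLS/DASH delivery.

    Players typically buffer a few full segments before playback starts,
    on top of fixed encode/packaging/CDN overhead (assumed here).
    """
    return segment_s * buffered_segments + overhead_s

print(estimate_latency(1.0))  # 5.0 seconds with one-second segments
print(estimate_latency(2.0))  # 8.0 seconds with two-second segments
```

Under these assumptions the model reproduces the article's figures: roughly five seconds with one-second segments, and seven to 10 seconds with two-second segments.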

Recently, the Common Media Application Format (CMAF) has been introduced as a standard that can reduce latency. It uses 'chunked encoding' to transfer small chunks of video as soon as they are ready, while later chunks are still being processed. CMAF must be supported at every step of the distribution chain, otherwise the latency remains. Service providers are slowly adopting CMAF, but there will be a lag time (no pun intended) before viewers experience it, because legacy devices and browsers don't support CMAF.
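A minimal sketch of why chunked transfer helps, using hypothetical durations: with chunked encoding, the packager can hand the first CMAF chunk to the CDN as soon as it is encoded, instead of waiting for the whole segment to finish:

```python
SEGMENT_S = 4.0  # segment duration (hypothetical)
CHUNK_S = 0.5    # CMAF chunk duration (hypothetical)

def time_to_first_byte(chunked: bool) -> float:
    """Time until the first media data can leave the encoder/packager."""
    return CHUNK_S if chunked else SEGMENT_S

print(time_to_first_byte(chunked=False))  # 4.0: wait for the whole segment
print(time_to_first_byte(chunked=True))   # 0.5: first chunk ships at once
```

The same early-forwarding must happen at each hop downstream, which is why the article notes that CMAF only pays off when the entire distribution chain supports it.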

Qligent’s Vision platform monitors and analyzes low latency factors along with other parameters such as manifest verification, adaptive bitrate metrics, download rates and lost segments that can negatively impact the viewer experience. Its location-specific, real-time monitoring capabilities uncover instances of rebuffering, jitter, stuttering, and other latency issues. Operators then receive actionable information to take proactive steps and help preserve customer loyalty and protect revenue.

To learn how Qligent Vision can help you address latency issues, contact us.