The quality conundrum: Data drives OTT improvement, but lack of standards is a problem

By Samantha Bookman

With the potential for a record number of new OTT services launching in 2016, along with the addition of exciting new technologies like virtual reality and 4K quality video, the online video world – and the requirements for bandwidth on all networks – is about to get a whole lot bigger.

Getting the best video to consumers over the top has become a much bigger talking point for vendors. At this year's NAB Show, for example, multiscreen and online video ecosystem players put QoS and QoE front and center at many of their booths -- demonstrating features like near-simultaneous OTT delivery of broadcast events, virtual reality, Ultra HD video and more.

The point of the renewed quality push isn't so much that vendors haven't been focusing on ensuring good service and a good experience for viewers; it's that the next-generation technologies emerging into the commercial sphere demand solid QoS.

Chris Knowlton, streaming guru for Wowza, an encoding and delivery provider, said that the increased need for improved QoS has "been gradual, sort of like boiling the frog … what people see on some experiences, especially if they see Netflix streaming on Roku to a 4K television, suddenly they realize, 'I could be having the same experience but even more of an immersive, a better experience than what I can get on my cable subscription.' And that tends to drive the quality that [providers] want to deliver, especially if that becomes a differentiator."

Big data, OTT advances drive quality concerns

Digital quality of experience is becoming important to companies across industry verticals, not just OTT video, and that transformation has been driven by the improvement in data collection and analysis.


In April, Actual Experience released its 2016 report on the overall digital experience across industry verticals. It pointed out that while 79 percent of business leaders surveyed see consistent QoE as a critical element to success in the digital space, many of them underestimate the impact of a poor experience online. About 29 percent of executives surveyed said they didn't understand how well things were working from a customer perspective -- signaling that "they do not have the right data," the report said.

"They desperately want to improve it," said Dave Page, CEO and co-founder of Actual Experience, which provides business intelligence and analysis to companies in numerous industries that have digital content elements – such as banks and retail stores.

Landmark streaming events predict future requirements

Yahoo's exclusive livestream of a regular season NFL game took place last fall, but the event still gets attention. One reason is the measurement opportunity it provided: Yahoo released its own audience numbers, recording 15.2 million unique viewers worldwide, and the third-party vendors involved also got to provide their analysis of the event. Qwilt, a vendor that offers "open caching" video delivery, measured the traffic increases and drop-offs throughout the game and provided a breakdown of the types of devices it was viewed on.

"The impact of this streaming event is more profound as you look under the hood," wrote Mark Fisher, VP of marketing and business development for Qwilt, in a blog post. "In addition to the major Commercial Content Delivery Networks (CDNs) that were enlisted to provide streaming capacity worldwide to meet demand, this NFL game also leveraged the new open caching infrastructure, deployed deep in ISP networks, close to consumers, which boosted overall delivery capacity and improved Quality of Experience for viewers."

Yahoo managed to capture 7 percent of all North American broadband data subscribers during the game's fourth quarter, Fisher said, based on Qwilt's measurement results from major ISPs. That equated to about 8 million viewers in the region.
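As a sanity check, the two figures above are consistent with each other: if roughly 8 million viewers represent a 7 percent share, the implied North American broadband subscriber base is around 114 million. The arithmetic below is an inference from the quoted numbers, not a figure reported by Qwilt:

```python
# Back-of-the-envelope check of the figures quoted above.
# Assumption: the ~8 million viewers are the 7 percent share measured by Qwilt.
viewers = 8_000_000
share = 0.07

# Implied total broadband subscriber base in the region.
implied_subscribers = viewers / share  # roughly 114 million
```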


Fast-forward to April of this year, when WWE Network, professional wrestling's SVOD outlet, streamed Wrestlemania 32. Qwilt measured a six-fold traffic spike on April 3, to more than 1 TB of total traffic consumed. In some U.S. networks, Fisher said, only Netflix ranked higher in terms of streaming volume during the live event.

Traffic spikes like that require high-level QoS to ensure the viewer has a good experience, that buffering isn't a problem, and that the stream doesn't fail entirely.

Monitoring and maintaining quality

Online video providers and the industry players that support them rely on monitoring, measurement and analysis of the video stream in order to make sure quality of service is being maintained and to improve on existing video quality. But there is some variation in how QoS is measured and even how it's defined by each service provider in the online video ecosystem.

For example, most, if not all, CDN providers place beacons at, or as close as possible to, the end point of a video stream – ideally at the device, through a software solution. These send reports on the stream back to the provider with data such as streaming bitrate, whether the video is being delivered successfully, buffering events and other information.
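A beacon report of this kind can be sketched as a small JSON payload posted back to the provider. This is a minimal illustration with hypothetical field names, not any vendor's actual schema:

```python
import json
import time

def build_beacon(session_id, bitrate_kbps, buffering_events,
                 bytes_delivered, delivered_ok):
    """Assemble a hypothetical end-point beacon payload.

    Field names are illustrative only; real CDN beacons carry
    richer, vendor-specific telemetry.
    """
    return {
        "session_id": session_id,
        "timestamp": int(time.time()),
        "bitrate_kbps": bitrate_kbps,          # current adaptive bitrate
        "buffering_events": buffering_events,  # stalls since last report
        "bytes_delivered": bytes_delivered,
        "delivered_ok": delivered_ok,          # segments arrived successfully?
    }

# A player would typically serialize this and POST it on a timer.
payload = build_beacon("abc123", 4500, 2, 1_250_000, True)
report = json.dumps(payload)
```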

Nice People At Work (NPAW), on the other hand, offers business intelligence for its online video provider customers. The Barcelona-based company analyzes data on multiscreen delivery to help improve the experience for viewers.

For example, NPAW's analysis of multiscreen provider customer Antena 3 found that its IP video stream had a buffer rate of about 9 percent, the company said in a case study. The problem: A3 "had overestimated the computing power of the devices their customers were using," NPAW said. Based on that data, A3 changed its streaming format to better match the majority of the devices to which they were streaming video.
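A buffer-rate metric like the one cited above can be approximated as the share of session time spent stalled. The sketch below uses that simplified definition; real analytics products such as NPAW's define and weight the metric differently:

```python
def buffer_ratio(buffering_seconds, playback_seconds):
    """Fraction of total session time spent stalled (buffering).

    Simplified illustration of a 'buffer rate' metric; not NPAW's
    actual formula.
    """
    total = buffering_seconds + playback_seconds
    if total == 0:
        return 0.0
    return buffering_seconds / total

# 54 seconds of stalls across a 10-minute session is a 9 percent buffer rate.
ratio = buffer_ratio(54, 546)
```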

But there are still some problems with all of this data: one, different data may be measured by different vendors; two, each vendor may have a different definition of quality; and three, not all of a vendor's data may be made available to a content provider customer.

Kurt Michel, who leads marketing efforts at IneoQuest, said that measuring a video stream only at the end point makes it impossible to home in on a problem with the video's delivery. Even worse, because the buffering or stream loss frustrates viewers, "people then stop watching (and) then you lose end point data and you can't solve the problem," he said. "You must have monitoring in the network to be able to act on the data."

With that perspective, it's not surprising that IneoQuest's cornerstone service is keeping an eye on a video stream's delivery from beginning to end. The provider recently upgraded its brand image, and this spring introduced a three-part product, FoQus, which promises QoS monitoring from several points along the video delivery chain.

IneoQuest can deploy its monitoring services either in a virtualized software environment or as an appliance on a customer's premises, placing data acquisition elements at various points along the network. That enables the provider to gather much more data during the delivery of a video stream and, during an event, pinpoint a problem and resolve it much more quickly.
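The benefit of measuring at several points is that a failure can be localized to a specific hop rather than merely detected at the player. A toy version of that idea, assuming each monitoring point reports a segment-delivery success rate (this is not IneoQuest's actual method), might look like this:

```python
def locate_degradation(measurements, threshold=0.98):
    """Walk the delivery chain from origin toward the viewer and return
    the first monitoring point whose success rate falls below threshold.

    Illustrative only; real mid-network monitoring correlates far
    richer telemetry than a single success rate per hop.
    """
    for point_name, success_rate in measurements:
        if success_rate < threshold:
            return point_name
    return None

# Hypothetical readings, ordered origin -> viewer.
chain = [
    ("origin", 0.999),
    ("cdn_edge", 0.995),
    ("isp_cache", 0.90),
    ("player", 0.85),
]
problem_hop = locate_degradation(chain)  # "isp_cache"
```

With only the player-side reading, all four hops would be equally suspect; the mid-network points are what narrow the search.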

By being a provider-agnostic vendor, IneoQuest is generally able to provide third-party monitoring across a video stream's delivery path. The advantage is that the customer has a single source of monitoring data from one end to the other with a single quality standard.

"Finger pointing is a problem. Delays are a problem. You need a third party solution," Michel said. For example, he noted that IneoQuest has worked with Elemental Technologies -- a video processing software provider acquired last year by Amazon -- to verify the quality of the video it delivers.

Standardizing QoS and QoE

With concerns over quality measurements mounting, IP-based video market players are focusing their efforts on this space.

The Streaming Video Alliance (SVA), for example, released a number of guidelines for streaming media delivery in early May after its quarterly meeting in New York City. This included functional requirements for open caching and its first set of guidelines for QoE. "The QoE Guidelines are the first step in the process – allowing members to focus on future deliverables that allow streaming video to continue scaling to millions of viewers without sacrificing video quality," said Thomas Edwards, VP of engineering and development at Fox Networks Engineering & Operations, in a statement. The QoE working group spent a year developing its guidelines, and the open caching working group spent 18 months carving out its set of guidelines for the delivery tech.

The SVA has more than 40 member companies participating, which could make a significant difference in how quickly QoE standards are developed and adopted across the industry. The variety of members – from Netflix to Fox Networks to Level 3 Communications -- reflects the different perspectives and priorities across the online video ecosystem.

But is it enough? Should streaming media companies look only within their own industry segment for answers to quality issues, or should they turn to the IT vertical at large?

Actual Experience's Page said that the digital experience must be "a relentless data-driven journey," comparing it to the manufacturing industry and its development of data-reliant quality practices like Six Sigma.

"We're on the tip of a quality transformation in the digital world. We haven't had the technology and the data to enable it (before). The brands that make up their minds to do that transformation in the digital world today – it's a structural business change, to focus on quality and be led by quality – in three years are going to be the digital brand leaders in whatever their business is."

Regardless of whether the online video industry adopts Six Sigma-like quality standards or not, working together to develop agreed-upon guidelines, as the SVA is encouraging companies to do, could help the online video market segment respond much more quickly to changing technologies and user demands.

"We have to be working with the infrastructure vendors so we're ready when they are," Michel said.