Why MAU doesn’t work as a performance metric for FAST services—Industry Voices: Grines

If you proclaimed the death of linear TV, you were dead wrong. Sure, U.S. pay TV revenues will decline by more than $33.6 billion by 2025, but at the same time, free ad-supported TV (FAST) is booming, with revenues projected to nearly double, reaching $4.1 billion in 2023.

Call it the “resurgence of linear,” but linear as a viewing medium was never broken. What was broken was linear as a double-dip business model: paying $45 to $130 for a swath of channels, getting limited on-demand options, and still having to watch advertisements.

There are about 20 FAST services in the U.S., with Google TV rumored to be the latest to jump into the fray. The two OGs, Pluto TV and Xumo, have clocked an impressive 52 million and 24 million monthly active users, respectively.

Monthly Active Users (MAU) is a key performance indicator (KPI) for many businesses, and it’s become the de facto standard for how FAST services are measured.

No doubt we need transparency and useful metrics, but is MAU the best we can do?

What is MAU? 

Simply put, MAU is the number of users that have done something meaningful with a product (in this case, a streaming service) in the last 30 days or calendar month.

Why MAU can be good

MAU looks at only those who engage with a service, making it much more meaningful than other metrics such as total users, downloads or app installs. MAU can be used to quickly and efficiently take a business’s metaphorical pulse, measure the effectiveness of its marketing strategies and gauge how successful it is in customer acquisition, engagement, and retention.

Calculating MAU is pretty straightforward once it’s determined what constitutes an “active” user. It can be someone logging into an app or visiting a website, watching a video or linear channel, or completing a particular task such as rating a title.

Although there are many ways to tally MAUs, we can presume that most FAST services count anyone who opens an app or visits a website as “active.”

While I’m a proponent of using MAU internally as a business success barometer, it becomes a slippery playing field when comparing how one service stacks up against another.

Why MAU can be unreliable 

While MAU has its good points, it’s not always the most reliable or accurate metric for user engagement and whether people are getting value from a service.

There’s no universal way to measure MAU 

As discussed, different companies can define user activity differently. That means that while it’s easy to calculate MAU, no two companies can guarantee they’re measuring the same thing from month to month.

For instance, if you measure MAU by tracking users who interact more than 10 times with a service and a competitor ranks MAU as any user who logs in to one of their apps at least once, you’re going to have significantly different MAU readings at the end of the month.

Consequently, there’s no hard-and-fast rubric against which to hold your MAU figure and compare it with other companies’. The information exists in a vacuum, and like any figure in isolation, it’s hard to ascribe much meaning to it. 
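To make the definition problem concrete, here’s a minimal sketch using made-up event data (the user IDs, event names, and thresholds are all hypothetical, chosen only for illustration). The same event log produces very different MAU figures depending on which definition of “active” a service picks:

```python
from collections import Counter

# Hypothetical one-month event log: (user_id, event_type) pairs.
events = [
    ("u1", "open_app"), ("u1", "open_app"),
    ("u2", "open_app"),
    ("u3", "open_app"), ("u3", "watch_video"),
] + [("u3", "watch_video")] * 10  # u3 is a heavy viewer

# Definition A: anyone who opened the app at least once is "active".
mau_a = len({user for user, event in events if event == "open_app"})

# Definition B: only users with more than 10 total interactions count.
interaction_counts = Counter(user for user, _ in events)
mau_b = len([user for user, n in interaction_counts.items() if n > 10])

print(mau_a)  # 3 — all three users opened the app
print(mau_b)  # 1 — only the heavy viewer clears the bar
```

Both services could truthfully report “MAU,” yet one number is triple the other from identical underlying behavior.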

Accuracy isn’t guaranteed 

One of the primary problems with MAU as a metric is that it oversimplifies engagement.

A user who opens and quickly closes an app is not the same as the user who watches 90 minutes of programming every month. But MAU makes no distinction between the two.

It’s possible to define MAU in terms of longer, more meaningful engagement with a streaming service, but this can lower MAU numbers.

MAU doesn’t consider unique users 

Say you log onto Pluto TV using a Roku. While you’re channel surfing, your tablet receives a push notification from Pluto’s iPad app. You open the service on your tablet from the notification, and while doing that, you receive an email from Pluto TV on your phone. You click the email’s call to action, and that too brings you into the Pluto TV service.

You have now registered as three different active users, although all three activities came from the same person.

MAU is now three times higher than it should be. On paper, it looks like a sudden surge in activity and may go over well with shareholders.

In context, however, it becomes a drastically inflated measurement that inaccurately reflects users’ interaction with a service. 
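The inflation above can be sketched in a few lines. The device and account IDs are invented for illustration; the point is simply that without profiles, a service can only count devices, while unified profiles would collapse the same person into one user:

```python
# Hypothetical one-month session log: (device_id, account_id) pairs.
sessions = [
    ("roku-123", "kirby"),   # Roku session while channel surfing
    ("ipad-456", "kirby"),   # iPad session opened from a push notification
    ("phone-789", "kirby"),  # phone session from an email call to action
]

# Without profiles, each device looks like a distinct active user.
mau_by_device = len({device for device, _ in sessions})

# With unified profiles, the same person counts once.
mau_by_account = len({account for _, account in sessions})

print(mau_by_device)   # 3
print(mau_by_account)  # 1
```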

MAU disincentivizes cross-platform engagement 

It’s not unusual for consumers to install a streaming service’s app on at least three devices.

Logically, you would suppose that streaming services would see this as an opportunity to create user profiles. These would allow users to:

  • Create playlists 
  • Save videos for later 
  • Resume videos across different devices 
  • Mark channels or shows as favorites

While this is table stakes for subscription-based services, why isn’t this functionality available for their FAST counterparts? 

As of this writing, Xumo is working on introducing profiles. However, there’s little business incentive to follow through, since creating them would unify user interaction across multiple devices.

That, in turn, would see a drop in monthly active users as customers signed in across all three devices associated with Xumo’s service, turning what had previously been three independent instances of activity into one.

The numbers might be more accurate, and the falloff in MAU could be easily explained. On paper, however, collapsing those three device interactions into one unified customer interaction doesn’t make good business sense.

MAU might be a vanity metric, but it’s a persuasive metric when talking to shareholders.

Moreover, what can you use if you don’t rely on monthly active users to gauge customer interaction?

What about Total Sign Ups?

Peacock requires its users to create profiles. Accordingly, it doesn’t report MAU, since its numbers would be relatively lower than those of FAST platforms without user profiles in place.

So instead, Peacock relies on total sign-ups, which currently sits around 54 million.

But what total sign-ups doesn’t tell you is:

  • How often do people use the service?
  • How much content are they watching?
  • How many total sign-ups are on free accounts versus paid ones?

You could have signed up for Peacock when it launched in April 2020 and not used the service since. But you’re still counted as a sign-up, which is why total sign-ups is a metric that obscures the picture of what’s really going on.

If not MAUs or sign-ups, then what? 

The short answer is total viewing time (TVT).

What is Total Viewing Time? 

TVT measures the number of actual hours users interact with a streaming service.

This is the metric Tubi uses to evaluate its performance. There’s some suspicion that this saves Tubi from having to reveal a lower MAU than its competitors’. Still, the fact remains that as a way of assessing engagement, total viewing time makes for a more accurate measurement than MAU.

TVT has a direct, correlative relationship to ad inventory and revenue opportunities. Say I’m an advertiser. I’d feel a lot more comfortable buying inventory from a service with bona fide viewership of 200 million hours per month than from one that’s merely been logged into a few million times.
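The contrast between the two metrics is easy to see side by side. This sketch uses invented watch sessions; the point is that MAU scores a two-minute visitor and a two-hour viewer identically, while TVT captures the attention advertisers actually buy:

```python
# Hypothetical one-month watch sessions: (user_id, minutes_watched).
watch_sessions = [
    ("u1", 90),   # an engaged viewer, session one
    ("u1", 45),   # same viewer, session two
    ("u2", 2),    # opened the app, bounced almost immediately
    ("u3", 120),  # a two-hour binge
]

# MAU counts every user the same, regardless of time spent.
mau = len({user for user, _ in watch_sessions})

# Total viewing time sums the actual hours watched.
tvt_hours = sum(minutes for _, minutes in watch_sessions) / 60

print(mau)        # 3 users, including the two-minute bounce
print(tvt_hours)  # ~4.28 hours of real attention
```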

Conclusion 

The problem with the MAU metric is twofold. In the first place, it gives companies and their shareholders an inflated sense of user interaction.

In the second place, there’s no standardized rubric against which to compare MAU measurements. Consequently, any MAU figures streaming services publicize are rendered meaningless.

These problems are compounded by the fact that these services rely on unauthenticated access across multiple connected TV and mobile devices. Without profiles, MAU can’t identify unique users. Instead, it logs a multi-device household’s interaction as multiple instances of user activity, even when that’s manifestly not the case.

This has led streaming services to consider other metrics, most notably the total sign-up metric. But this is also inherently flawed. While total sign-ups tell you how many users a service has registered since launch, that too exists in a vacuum.

For instance, it doesn’t differentiate between trial users and paying subscribers. Nor can it tell you why customers signed up, or whether they still use the service at all.

But that doesn’t mean there's no good way to measure customers’ interaction with streaming services. Pluto TV thinks that one day we’ll reach a consensus on how we measure free ad-supported services.

And I think we’re already there.

Rather than using the vanity metric of MAU or relying on the total sign-up figures, I think we should really look at total viewing time as a benchmark for ad-supported services.

Unlike the other metrics, total viewing time does what it says on the tin. It reports how many hours customers have cumulatively spent watching video. This directly correlates with attention, which is what all advertisers are looking for.

There’s no need for further contextualization and less risk of exaggerated results. The figure is meaningful and uniform. When streaming services compare total viewing time, the comparison is direct. That's something that can’t be said of other metrics.

Kirby Grines is an entrepreneur, advisor and the founder of 43Twenty, a strategic advisory and marketing firm that accelerates growth for companies in technology, media and entertainment. He’s also the creator of “The Streaming Wars,” which is a free, weekly newsletter that curates the latest developments in the OTT video industry. Previously, Kirby was a co-founder of Float Left, an application development company, where he designed and built some of the first connected TV apps.

Industry Voices are opinion columns written by outside contributors—often industry experts or analysts—who are invited to the conversation by Fierce Video staff. They do not represent the opinions of Fierce Video.