Q&A: Samsung’s Behbehani talks video use cases for 5G

Taher Behbehani sees mobile processors like the one in the Samsung Galaxy S10 playing a major role in immersive video applications using 5G networks. (Samsung)

LAS VEGAS—After years of talk and development, 5G mobile networks are rolling out in the U.S. Now industries including media and entertainment need to come up with ways to use them.

Taher Behbehani, senior vice president and general manager of mobile B2B at Samsung, joined a 5G panel at the NAB Show alongside Shelly Palmer, CEO of the Palmer Group; Lynn Comp, general manager of the visual cloud division at Intel; and Christopher Levendos, vice president of engineering and operations at Crown Castle.

After the discussion, which touched on how 5G can disrupt media and broadcasting, we sat down with Behbehani to talk about the 5G use cases for video we’ll see first and how 5G will impact technologies like 4K and 8K along with augmented and virtual reality.

This interview has been edited for clarity and length.

FierceVideo: What are the 5G use cases across the media distribution chain that can be done this year?

Taher Behbehani: We showed a very interesting demo which I encourage you to see. We connected different cameras and then we could control the feed from the cameras instantaneously from the phone. So, it gives me a different perspective.

I think this type of a service is quite interesting if you elaborate on it. If you’re sitting in the stadium or if you’re at a race or match, sports could be one of the first areas from a B2B2C environment where I think you’ll see a lot of interest. That’s on the media distribution side of the game. The other part that we’ve been experimenting with—perhaps we could do it with a partner—would be to take lots of different content curated by individuals or semiprofessionally and put them together to create new experiences and edit them using an AI engine that exists today. That I think we can also do and be able to deliver that on a 5G device.

FierceVideo: You’re talking about user-generated content?

Behbehani: User-generated content. Our device now has a neural processor on it which runs AI connected to a cloud. We could do this in semi-real-time and create on-demand streaming [product] customized for me. Let’s assume I’m sitting in a new museum or new area and I want to understand the dynamics around it, not from the perspective we have today but from a user-generated perspective. It could be quite interesting, and these are things that can happen now.

FierceVideo: How will 5G expand what media companies can do further down the road? Is 5G going to reignite interest in 4K, 8K and VR?

Behbehani: I think so, for several different reasons. I sort of alluded to work we’re doing with different industries, and we have specific use cases. One of the use cases that keeps popping up is AR, especially in the B2B space, because it has a number of different benefits in automotive, transport and logistics. For anything in heavy industrial use that needs a lot of information downloaded on the fly, in real time, to be used for training, I think AR is very possible. Because the processors are very powerful on these devices of ours, because the network is there and on the back end data can be stored, this will now begin to happen.

The question that comes up is: Is all the content ready to be delivered in an AR mechanism? We’ll see. There probably needs to be some type of compiler for AR content for that to happen. But I think AR will begin to take off now.

FierceVideo: What do you mean by a compiler?

Behbehani: So, let’s assume I have different shots and different slices of information, but as a document to read. How do I translate that into AR so I can superimpose the content of a manual to fix an engine?

FierceVideo: And we’re talking about something that will happen on mobile devices. We’re not picturing people wearing glasses.

Behbehani: The processing brain is the smart mobile device. Our Galaxy S10 5G is a really powerful device and that’s the brain. It’s the hub, it’s the brain, it’s the processor, it’s mobile everything. And that will be connected to glasses or devices that are not heavy and are usable in a work environment. I would expect, based on what I’ve seen, that these are no different than the protective goggles you wear today.

FierceVideo: What will 5G mean for video consumption in the near term beyond just addressing the growing amount of video traffic on mobile networks? It will allow people to watch more video, but what will it mean in terms of what kind of video people watch?

Behbehani: That’s a great question. I’ll give you a sample. I have my Samsung TV at home and I have this device, which is the S10. I’ve begun to watch a lot of nature shows because what they show, you could never have seen before, and it’s an incredible experience. I think people will continue to consume content that has a lot of refinement built into it and that, from a visual, sensory point of view, really engages you. I bet you if you watch the data behind what people are streaming when they buy these brand-new TVs and they get these new devices, you’ll see a lot of that happening. Sports is the other category.

FierceVideo: You said 5G has the potential to bring experiences together so you don’t have to look at each experience separately. Can you elaborate on that?

Behbehani: User experience is critical for adoption of any new technology. I think 5G is behind it, the user experience comes through the mobile device, and I think how we present that and develop apps is really critical for people to get the full benefits.