Both Google and Microsoft have made official announcements for their respective video game streaming services in the past few weeks. Though both projects are still in development, the technology advances behind them could eventually impact regular old video streaming, too.
First up, Google announced Project Stream, a “technical test to solve some of the biggest challenges of streaming.” The company is conducting trials by streaming the latest “Assassin’s Creed” video game to Chrome browsers.
Next up, Microsoft announced Project xCloud, a video game streaming service it’s currently testing with plans for public trials in 2019. The service will work on PCs and mobile devices—along with consoles—and will allow users to play with a Bluetooth controller or on-screen controls.
Having two tech giants forge ahead into video game streaming is reason enough to get excited about the possibilities of cloud gaming. But CCS Insight analyst Raghu Gopal said it may still be a few years before a reliable cloud-based games platform can launch.
“The reality is that many popular fighting, racing and action games need millisecond-precise input to deliver a competitive gaming experience. This problem becomes most apparent when trying to play games online through this type of service,” wrote Gopal in a blog post. “Given Microsoft's efforts to solve many important technical challenges, Project xCloud will be eagerly watched by the gaming community.”
While details regarding both tests are still scarce, both companies have indicated that the biggest technical hurdles to clear in streaming graphics-rich, compute-hungry video games are minimizing buffering and graphics degradation while getting latency down to near-nonexistent levels.
PlayGiga, a streaming video game company that recently launched in the U.S. and is working with telecoms on gaming solutions based on 5G network technology, has also been working to address latency levels in cloud gaming. PlayGiga CEO Javier Polo said the key is to map out where latency builds, whether that’s in the game controller, the streaming technology, the set-top box or the network.
“You really need to understand where latency is building up and then optimize it,” Polo said, adding that PlayGiga can visualize how that latency builds in each link of the streaming chain.
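The per-link accounting Polo describes can be sketched as a simple latency budget. The figures below are purely illustrative placeholders, not PlayGiga measurements; the point is that summing each hop and ranking the worst offenders shows where to optimize first.

```python
# Hypothetical per-link latency budget for a cloud-gaming stream (milliseconds).
# These figures are illustrative assumptions, not measured values.
LATENCY_BUDGET_MS = {
    "controller_input": 8,       # Bluetooth polling + OS input handling
    "network_uplink": 15,        # input packet traveling to the data center
    "game_simulation": 16,       # one frame of game logic at 60 fps
    "video_encode": 5,           # hardware video encoder
    "network_downlink": 15,      # encoded frame returning to the client
    "video_decode_display": 12,  # decode plus display refresh
}

def total_latency(budget):
    """Sum the end-to-end latency across every link in the chain."""
    return sum(budget.values())

def worst_links(budget, n=2):
    """Return the n links contributing the most latency."""
    return sorted(budget.items(), key=lambda kv: kv[1], reverse=True)[:n]

if __name__ == "__main__":
    print(f"End-to-end: {total_latency(LATENCY_BUDGET_MS)} ms")
    for link, ms in worst_links(LATENCY_BUDGET_MS):
        print(f"Optimize first: {link} ({ms} ms)")
```

With these sample numbers the chain totals 71 ms, comfortably inside a 120 ms interaction target, but doubling any one link could push it over.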
As more high-speed 5G networks come online, telecoms need use cases to justify why consumers should pay for such rapid throughput. Polo said that streaming immersive media formats like virtual reality and augmented reality could be possible offerings from telecoms, and that both of those formats will require the kind of ultralow latency being worked on for streaming video games.
CDN providers like Akamai are working on getting latency levels down in the sub-one-second range, but current standards typically fall around the 10- to 12-second range. That’s generally good enough for streaming video, but for video games, where the response time between the controller and the screen needs to be instantaneous, 10 seconds of delay won’t cut it.
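Much of that 10-to-12-second figure comes from how segmented streaming protocols such as HLS and DASH work: players typically buffer several multi-second segments before playback begins, so latency scales with segment length. A rough sketch, with illustrative segment counts and durations:

```python
def glass_to_glass_latency_s(segment_duration_s, buffered_segments,
                             encode_cdn_overhead_s=1.0):
    """Rough glass-to-glass latency for segmented (HLS/DASH-style) streaming.

    Players generally wait for `buffered_segments` full segments before
    starting playback, so latency grows with segment duration. The overhead
    term stands in for encoding and CDN propagation and is an assumption.
    """
    return segment_duration_s * buffered_segments + encode_cdn_overhead_s

# Traditional defaults: ~4-second segments, three held in the buffer.
print(glass_to_glass_latency_s(4, 3))  # 13.0 -- fine for video, fatal for games
# Aggressive low-latency tuning: 1-second segments, two buffered.
print(glass_to_glass_latency_s(1, 2))  # 3.0 -- still far above a 120 ms input budget
```

Even aggressive segment tuning leaves segmented delivery orders of magnitude slower than what interactive gaming needs, which is why game streaming uses frame-level delivery instead.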
Nelson Rodriguez is global director of media industry strategy for Akamai, which counts both Google and Microsoft as customers. He said that based on his recent discussions with video game publishers like EA, Sony and Ubisoft, the general consensus is that the technology issues around video game streaming are almost solved. He said that Google’s and Microsoft’s tests could further advance video game streaming and add global scale.
Rodriguez said that right now, most of the streaming solutions for video games are based on a super PoP infrastructure, which means the data center is about 200 miles away from most of the people accessing it.
“That’s significantly different from a fully distributed architecture where there are servers literally within five miles of anyone,” Rodriguez said. “All of the solutions right now are relying on the super PoP approach, and there are some limitations there.”
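A back-of-the-envelope calculation shows why those distances matter. Light in fiber travels at roughly two-thirds the speed of light in vacuum, about 124 miles per millisecond; the sketch below computes propagation-only round trips for the two distances Rodriguez cites (real networks add routing detours and queuing delay on top):

```python
# Back-of-the-envelope fiber propagation delay: super PoP vs. edge distances.
# Light in fiber moves at ~2/3 c, roughly 124,000 miles/s = 124 miles per ms.
FIBER_MILES_PER_MS = 124.0

def round_trip_ms(distance_miles):
    """Propagation-only round-trip time; real paths add routing and queuing delay."""
    return 2 * distance_miles / FIBER_MILES_PER_MS

print(f"Super PoP (200 mi): {round_trip_ms(200):.2f} ms round trip")
print(f"Edge (5 mi):        {round_trip_ms(5):.2f} ms round trip")
```

Propagation itself is only a few milliseconds either way; the bigger gains from the edge come from cutting the number of router hops, congestion points, and peering handoffs between player and server.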
He said that moving closer to the edge of the network presents advantages for cloud gaming along with other streaming applications. But he said that Microsoft’s reliance on Xbox motherboards sitting in a data center would be harder to accomplish at the network edge.
“It’s going to depend on the solution that’s necessary for the publisher or customer that’s trying to serve their audience,” said Rodriguez. “I think in the case where companies aren’t dependent on a particular form of hardware, the edge is going to be a better solution.”
Microsoft is working to enable its cloud gaming project on the data center level. The company removed the guts from multiple Xbox One consoles and stuffed them into a server blade which will be installed at its Azure data centers. According to ZDNet, Microsoft has also already increased data center bandwidth, and the company is working on developing advances in encoding and decoding along with network topology. Network topology refers to methods for arranging network elements such as links and nodes.
As for how what Microsoft and Google are testing could eventually impact streaming video, Rodriguez pointed out that the current video game streaming tests run at 1080p or lower resolutions, so the model doesn’t yet work for the highest-end video like 4K, or eventually 8K.
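One reason resolution is such a hurdle is that required bitrate scales roughly with pixel count, and real-time encoding can’t lean on the multi-pass tricks video-on-demand uses. The sketch below uses an assumed 20 Mbps baseline for low-latency 1080p60 game streaming (a ballpark figure, not a number from Google, Microsoft, or Akamai) and scales it by pixel count:

```python
# Crude bitrate scaling for real-time game streaming (illustrative figures only).
# Real-time encoders can't use multi-pass optimization, so bitrates run high.
BASE_1080P60_MBPS = 20.0  # assumed ballpark for a low-latency 1080p60 stream

def estimated_bitrate_mbps(width, height, base=BASE_1080P60_MBPS):
    """Scale bitrate with pixel count relative to 1080p (a rough approximation)."""
    return base * (width * height) / (1920 * 1080)

RESOLUTIONS = {"720p": (1280, 720), "1080p": (1920, 1080),
               "4K": (3840, 2160), "8K": (7680, 4320)}

for label, (w, h) in RESOLUTIONS.items():
    print(f"{label}: ~{estimated_bitrate_mbps(w, h):.0f} Mbps")
```

Under this crude model, 4K needs roughly four times the bandwidth of 1080p and 8K sixteen times, all delivered with no buffer to hide hiccups behind.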
He said the problem that video game streaming tests are looking to solve isn’t purely about video streaming. It’s more about interaction (being able to push a button and get a response within 120 milliseconds) and less about maintaining quality of experience for viewers.
“When we get to the day when it’s 4K, full quality, low latency, you’re pressing a button and 120 milliseconds later you’re seeing a 4K image on your screen, then the question for broadcasters will be how they can use it,” Rodriguez said.
He said emerging programming formats like esports and gambling could be cases where broadcasters really do need latency down as close to real time as possible.
“But I don’t think yet that’s a widespread mainstream need,” Rodriguez said.