
How 5G networks will make low-latency game streaming a reality

Ahead of the launch of 5G networks this year, the buzz around speed is palpable. It’s about more than speed for its own sake, though. Faster networks can transform actual user experiences, with early tests showing that an entire 4K movie can be downloaded in as little as eight seconds. But while we eagerly wait for faster speeds and more bandwidth, the real killer feature of 5G is low latency. Mobile carriers are exploring ways to leverage low latency to bring higher-fidelity content to mobile devices, ranging from game streaming to mobile mixed-reality experiences and even remote surgery.

Low latency not only reduces the lag and delays that can ruin gaming; it can also be used to bring more power to devices with limited hardware. How? By moving more and more of that GPU power to the cloud.

Rethinking rendering with a hybrid approach


“We know we can’t do 100 percent of the rendering on the devices simply because we don’t have enough batteries to support it, and if we did, we would melt the device – we simply cannot make that much heat,” Dr. Morgan McGuire, a research scientist at Nvidia, explained. “We’ve known for a while that cloud streaming has to be the answer.”


But streaming a 4K or 8K gaming experience, especially at high frame rates, is intensive for any network, let alone a mobile network, so carriers are devising new approaches to handling immersive, interactive experiences. “We can’t just think of it like we used to do — 100ms latency for passive media and now we’re reducing it to 10ms or 1ms, and that’s what we need for [gaming and VR],” said Alisha Seam, a principal researcher at AT&T’s Foundry lab in Palo Alto, California. “It’s really not that simple.”

Depending on the research and the network, generally accepted latency figures for game streaming range from a few milliseconds to approximately 20ms.

One option, which AT&T and Verizon are exploring for gaming, is to split the rendering duties between the cloud and the device. Essentially, hybrid rendering relies on a virtual gaming PC in the edge cloud to do most of the heavy lifting, while the client device handles some of the decoding of rendered images. It’s a technique that Nvidia uses for its GeForce Now game streaming platform, which gives lightweight computing devices access to GeForce graphics located in the edge cloud.
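To make the division of labor concrete, here is a minimal, single-process sketch of the split-rendering idea: a simulated edge server does the heavy render-and-encode work while the client only decodes and applies a cheap local pass. All of the names and timings are hypothetical stand-ins, not any carrier’s actual pipeline.

```python
import queue
import threading
import time

frame_queue: "queue.Queue[bytes]" = queue.Queue(maxsize=2)

def edge_server() -> None:
    """Heavy lifting: render and encode frames in the edge cloud."""
    for frame_id in range(120):
        time.sleep(0.008)                        # stand-in for an expensive GPU render
        encoded = f"frame-{frame_id}".encode()   # stand-in for H.264/H.265 encoder output
        frame_queue.put(encoded)

def client_device() -> None:
    """Light work: decode the stream and add a cheap local pass."""
    for _ in range(120):
        frame = frame_queue.get().decode()       # stand-in for hardware video decode
        frame += " +local-overlay"               # e.g. HUD, reprojection, foveation
        # display(frame) would happen here

threading.Thread(target=edge_server, daemon=True).start()
client_device()
```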

“There are different approaches where you can launch everything as a web service or virtualizing everything as a VM where you send your inputs up and your video streams down or finally having a hybrid stack where you’re decoupling the render loops and seeing what you can actually offset,” said Raheel Khalid, the chief engineer of Verizon’s virtual reality lab, with each solution having its own pros and cons.

But when you’re streaming games, success has to be measured with new metrics. Passive video streams were fine with 100ms of latency. But for VR and gaming, the server isn’t just pushing content to your eyes; you’re constantly sending input to the server based on your movement in a game. The server then has to render new video and send it back to the device to be decoded and displayed. Latency has to be reduced even further.

“The big determining factor for user experience is responsiveness, and we’re fighting against the vestibulo-ocular reflex – the correlation between what your eyes see and where your inner ear places you when you turn your head – and that’s on the order of 7ms,” Seam explained of VR experiences.

“The metric that we’re most interested in is motion to photon latency, which is measured by time-stamping packets,” Seam continued. “We measure what input the server is responding to, so the user will hit an input. We will send that packet to the server with a stamp from the user, render something with that input and send it back. And we can basically measure the time between the user making that movement, what we’re rendering, and what is actually displayed to users.”
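Based on Seam’s description, a rough sketch of that measurement might look like the following: stamp the input packet on the client, carry the stamp through the server’s render, and difference it against the moment the frame is displayed. The packet format and simulated delays are assumptions for illustration.

```python
import time

def stamp_input(input_event: str) -> dict:
    """Client: stamp the input packet the moment the user acts."""
    return {"input": input_event, "client_stamp": time.monotonic()}

def server_render(packet: dict) -> dict:
    """Server: render a frame in response to that exact input, keeping the stamp."""
    time.sleep(0.004)                            # simulated render + encode time
    return {"frame": f"rendered({packet['input']})",
            "client_stamp": packet["client_stamp"]}

def display_and_measure(frame_packet: dict) -> float:
    """Client: decode and display, then compute motion-to-photon latency in ms."""
    time.sleep(0.002)                            # simulated decode + scanout time
    return (time.monotonic() - frame_packet["client_stamp"]) * 1000.0

packet = stamp_input("head_turn_left")
frame = server_render(packet)                    # network hops omitted in this sketch
print(f"motion-to-photon: {display_and_measure(frame):.1f} ms")
```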


AT&T predicts that game streaming will largely rely on split rendering, where the servers render scenes and the compute left over on client devices is used for clever technologies that augment the gaming experience. Seam envisions that client devices can fill in the gaps with tricks like foveated rendering for VR. “We don’t want the client side to be consumed with trying to make up for the network, so the more closely we’re able to couple the network performance with the performance of the software layer of these applications, the more we’re able to let them do what they should be doing and not have them be just network workarounds,” she said.
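As a toy illustration of the foveated-rendering trick Seam alludes to, a client could spend full shading work only near the gaze point and progressively less toward the periphery. The radii and rates below are invented for illustration.

```python
def shading_rate(pixel, gaze, full_radius=200, mid_radius=500):
    """Return the fraction of full shading work to spend on a pixel."""
    dx, dy = pixel[0] - gaze[0], pixel[1] - gaze[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist < full_radius:
        return 1.0        # fovea: full quality where the eye is looking
    if dist < mid_radius:
        return 0.5        # near periphery: half the shading work
    return 0.25           # far periphery: quarter rate

print(shading_rate((960, 540), gaze=(960, 540)))  # 1.0 at the gaze point
print(shading_rate((0, 0), gaze=(960, 540)))      # 0.25 in the corner
```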

That’s different from the historical relationship cloud gaming has had with the network. In the past, cloud gaming platforms tried to compensate for the unpredictability of a network by sending massive amounts of data and hoping that some of those packets would arrive. Many of the packets that do arrive are delayed or out of order, placing strain on the client device to decode and rearrange them in a timely, orderly manner.
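A toy example of that reordering burden: packets carry sequence numbers, arrive scrambled, and the client has to buffer and re-sort them before it can decode anything. This is purely illustrative, not any platform’s actual transport code.

```python
import heapq

def reorder_stream(arrivals):
    """Yield packets strictly in sequence order, buffering early arrivals."""
    buffer, next_seq = [], 0
    for seq, payload in arrivals:
        heapq.heappush(buffer, (seq, payload))
        # Drain everything that is now contiguous with what we've decoded.
        while buffer and buffer[0][0] == next_seq:
            yield heapq.heappop(buffer)
            next_seq += 1

# Packets 0-4 arriving scrambled, as a congested network might deliver them.
arrivals = [(0, "I-frame"), (2, "P2"), (1, "P1"), (4, "P4"), (3, "P3")]
for seq, payload in reorder_stream(arrivals):
    print(seq, payload)    # decoded strictly in order: 0, 1, 2, 3, 4
```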

“The latency number itself is important, but even more so is the distribution of that latency, and that’s something that we can really get into with 5G and edge,” she continued, noting that jitter, or the variability of the latency, plays a significant role in streaming performance.
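A quick illustration of why the distribution matters more than the average: two links with the same mean latency can feel completely different if one of them has high jitter. The sample values below are made up for the sake of the comparison.

```python
import statistics

stable_link = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0]   # per-frame latency in ms
jittery_link = [2.0, 25.0, 3.0, 22.0, 4.0, 4.0]    # same 10ms mean, wild swings

for name, samples in [("stable", stable_link), ("jittery", jittery_link)]:
    mean = statistics.mean(samples)
    jitter = statistics.stdev(samples)             # one common proxy for jitter
    print(f"{name}: mean={mean:.1f}ms jitter={jitter:.1f}ms worst={max(samples):.1f}ms")
```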

Verizon’s plug-in model


Verizon’s hybrid approach is largely app-based, and the carrier has worked with Unreal Engine to create an edge-based plug-in that enables split streams.

“When we have this ecosystem with very complex games — which have a lot of rendering potential that have to have a lot done in a short amount of time and you have to also maintain a 60 or 120 Hz refresh rate on your device where you have your inputs received and never have jitter or lag — we look at how you decouple these two things, and what we started to do is build a new paradigm for game engines where you can take certain rendering steps and push them into the edge, or the cloud, and decouple that from the input loops that you have on your device,” Khalid said.

“And that’s mainly where we focus on – the future of that main rendering stack and how we’re going to achieve that,” Khalid continued. “How are you going to take the input and the frame update and separate that from the things that you need? As we’ve gone and built this Unreal Engine plugin and investigated how you build split rendering stacks, we figured out what you’re going to take and move up into the edge and what you can move up into the cloud.”
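In spirit, the decoupling Khalid describes might be sketched like this: the device keeps its own fixed-rate input-and-present loop and never blocks on the edge, which renders heavy frames asynchronously. The loop rates and names below are assumptions, not Verizon’s actual Unreal Engine plug-in API.

```python
import threading
import time

latest_edge_frame = {"value": None}                # shared between the two loops

def edge_render_loop() -> None:
    """Compute-heavy rendering pushed to the edge; may run slower than input."""
    for n in range(30):
        time.sleep(1 / 60)                         # ~60 Hz heavy render
        latest_edge_frame["value"] = f"edge-frame-{n}"

def device_update_loop() -> None:
    """Local 120 Hz loop: handle input and present without ever blocking."""
    for _ in range(60):
        frame = latest_edge_frame["value"] or "locally-predicted-frame"
        # poll_input() and present(frame) are hypothetical stand-ins
        time.sleep(1 / 120)                        # hold the 120 Hz cadence

threading.Thread(target=edge_render_loop, daemon=True).start()
device_update_loop()
```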

Regardless of the approach, both Seam and Khalid agree that frame loss is a major factor in making game streaming successful. Gamers may not care if certain effects – like lighting or shadows – are delayed by a frame or two. What makes or breaks the user experience is input lag and frame loss.

“Hardcore gamers care about that. If you have a frame loss, it’s going to be the end of that service. You’re never coming back, and why would you? It’s too big an impact,” Khalid said. “The casual gaming crowd may be more accepting and tolerant, but at the end of the day, we’ve built the holy grail: How do you go and separate your game update loop and your game buffer update from the most compute-heavy operations?”

The economics of 5G


Because interactive streaming, like gaming, is more complex to deliver, carriers expect to charge a premium for gamers who demand a more stable network experience on 5G.

“It’s just so much more complicated because you can’t do something as simple as a progressive download,” said John Benko, a researcher at Orange’s Silicon Valley lab. “Since we’re talking about really pushing 10, 50, 100, 200 Mbps through the wireless channel, this is not going to be something that everyone can do and expect to pay the exact same price that they’re paying now to stream a 2 Mbps signal. So, the economics will need to be looked at carefully to see how we can make it a reality for people who want it.”

Part of the advantage of using a 5G network, Benko explained, is that operators can create network slices for particular use cases, offering more reliability and stability for customers who are willing to pay more.

Casual traffic from a mobile phone can be deprioritized during congestion, for example, but if a client device is provisioned for gaming or VR, networks can offer a guaranteed experience with a promised latency range for that use. Beyond gaming, mission-critical applications, like remote surgery, can be provisioned to an even higher priority tier to avoid any network interruptions that could endanger the application.
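A hypothetical sketch of how such slice profiles might be expressed: each use case gets its own scheduling priority, latency bound, and guaranteed throughput. All field names and values here are invented for illustration; real 3GPP slice definitions are far more involved.

```python
from dataclasses import dataclass

@dataclass
class SliceProfile:
    use_case: str
    priority: int          # lower number = scheduled first under congestion
    max_latency_ms: float  # promised upper bound for the slice
    guaranteed_mbps: float

SLICES = [
    SliceProfile("remote_surgery", priority=0, max_latency_ms=5.0, guaranteed_mbps=50.0),
    SliceProfile("game_streaming", priority=1, max_latency_ms=20.0, guaranteed_mbps=100.0),
    SliceProfile("casual_browsing", priority=9, max_latency_ms=100.0, guaranteed_mbps=2.0),
]

# Under congestion, a scheduler would service slices in priority order.
for s in sorted(SLICES, key=lambda s: s.priority):
    print(f"{s.use_case}: <= {s.max_latency_ms}ms, >= {s.guaranteed_mbps}Mbps")
```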

Though 5G promises to deliver a lot for gaming and other uses, cost remains a big factor for mobile adoption of game streaming.
