
Nvidia researchers use artificial intelligence to upgrade your game’s graphics

At the Conference on Neural Information Processing Systems (NeurIPS) in Montreal, Canada, Nvidia researchers demonstrated that they could use the power of artificial intelligence to render synthetic yet realistic scenes, complete with details and textures. The researchers say the work is still in its early stages, and it is unclear when the technology will reach consumers, but there is big potential for Nvidia's artificial intelligence-driven rendering in the gaming space.

“This work is about a new way of rendering computer graphics using neural networks,” Nvidia’s Vice President of Applied Deep Learning Bryan Catanzaro said in a conference call. Essentially, the researchers wanted to know how A.I. could be applied to make computer graphics better, and their solution is to train machine learning models on real-world videos and use them to render new graphics.

“We’ve built a system that takes high-level representations of the physical world — basically taking a video sketch and convert that into a rendered scene,” Catanzaro said. “The model understands high-level information of objects in the real world, and then elaborate those to add texture and lighting information. The goal is to be able to synthesize new scenes with graphics.”
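In very rough terms, the "video sketch to rendered scene" idea means taking a per-pixel map of object classes and synthesizing texture and lighting on top of it. The toy sketch below illustrates only that input/output relationship; the class names, colors, and noise-as-texture stand-ins are invented for illustration, and the real system uses a trained conditional generative network rather than fixed per-class colors:

```python
import numpy as np

# Toy illustration of "sketch -> rendered scene": each pixel in a
# label map (road, car, tree, ...) is turned into texture by a
# per-class decoder. In Nvidia's system a conditional GAN does this
# job; here fixed colors plus noise stand in for learned texture.

rng = np.random.default_rng(0)

# Hypothetical per-class "texture" parameters (mean RGB values).
class_color = {0: np.array([90, 90, 95]),    # road
               1: np.array([180, 30, 30]),   # car
               2: np.array([40, 140, 60])}   # tree

def render_from_labels(label_map):
    """Turn an HxW integer label map into an HxWx3 'rendered' image."""
    h, w = label_map.shape
    out = np.zeros((h, w, 3), dtype=np.uint8)
    for cls, color in class_color.items():
        mask = label_map == cls
        # Random noise is a crude stand-in for synthesized detail.
        noise = rng.integers(-10, 10, size=(int(mask.sum()), 3))
        out[mask] = np.clip(color + noise, 0, 255)
    return out

label_map = np.zeros((4, 4), dtype=int)
label_map[:2, :] = 2   # trees in the top half
label_map[3, 1:3] = 1  # a car on the road
image = render_from_labels(label_map)
print(image.shape)  # (4, 4, 3)
```

The key design point the quote describes is that the input carries only high-level semantics (what is where), and all low-level appearance is elaborated by the model.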


Machine learning is used to analyze existing videos, and Nvidia applies computer vision techniques to label objects and their properties. This means the A.I. can recognize an urban cityscape and understand which objects are trees, cars, or buildings, for example.
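The labeling step produces something like a per-pixel class map plus simple properties for each object class. The sketch below shows that data shape only; the class names and the tiny hand-built scene are invented for illustration, and a real pipeline would get the class map from a trained segmentation network:

```python
import numpy as np

# Hedged sketch: given a per-pixel class map (the kind of output a
# segmentation network produces), attach simple labels and
# properties per object class.

CLASS_NAMES = {0: "road", 1: "car", 2: "tree"}  # illustrative only

def describe(label_map):
    """Return {class_name: {"pixels": n, "bbox": (r0, c0, r1, c1)}}."""
    props = {}
    for cls, name in CLASS_NAMES.items():
        rows, cols = np.nonzero(label_map == cls)
        if rows.size == 0:
            continue
        props[name] = {
            "pixels": int(rows.size),
            "bbox": (int(rows.min()), int(cols.min()),
                     int(rows.max()), int(cols.max())),
        }
    return props

scene = np.zeros((4, 4), dtype=int)
scene[0, :] = 2        # a row of trees
scene[2:4, 1:3] = 1    # a car
summary = describe(scene)
print(summary)
```

This kind of labeled summary is what makes the later rendering step possible: the generator only needs to know which class occupies which pixels, not the original footage.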


This technology is derived from existing research, like that from the University of California, Berkeley, according to Nvidia. The company has shown off other A.I.-based rendering techniques in the past, including one that would remove noise from an image.

Researchers were able to achieve real-time rendering on a single Tensor Core GPU; for the conference, Nvidia demonstrated the technology on its Titan V card. “Though you can do this on any processor, the real-time aspect does require a lot of A.I. throughput,” Catanzaro explained, noting that the Tensor Core GPU is important.

By making it possible to create content and add it to virtual worlds, this research could greatly benefit the gaming market. Developers, for example, could “remaster” old games by re-rendering old titles with high-definition visuals, or they could add new levels to existing games with little effort.

As a basic example of how this would work, a user could take a video of themselves, upload it to a game, and the system would render a highly personalized avatar for use in that game. Nvidia has already open-sourced the code from its research, but Catanzaro cautions that this early work is better suited to computer scientists than game developers.

In a separate demo, Nvidia showcased that it could analyze dance moves — like those from the popular music video Gangnam Style — and transfer them to another person using the same computer vision techniques. “We analyze the pose of the person, and that becomes our sketch,” Catanzaro said. “And the model renders the target person given that sketch.”
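The pose-as-sketch idea can be illustrated with a toy skeleton: the source dancer contributes joint angles, and those angles are re-applied to a target body with different limb lengths. Everything below (the two-joint chain, the angles, the limb lengths) is an invented minimal example; the real demo extracts 2D keypoints from video and feeds them to a conditional generator rather than doing simple forward kinematics:

```python
import math

# Toy pose transfer: the "sketch" is the source dancer's joint
# angles; we re-pose a target skeleton that has different limb
# lengths. A real system would render a photorealistic person from
# the keypoints instead of computing stick-figure coordinates.

def forward_kinematics(angles, limb_lengths, origin=(0.0, 0.0)):
    """Chain joints: each absolute angle paired with a limb length."""
    points = [origin]
    x, y = origin
    for theta, length in zip(angles, limb_lengths):
        x += length * math.cos(theta)
        y += length * math.sin(theta)
        points.append((x, y))
    return points

# Source dancer's pose (the motion being copied) ...
source_angles = [math.pi / 2, math.pi / 4]
# ... applied to a target person with longer limbs.
target_limbs = [2.0, 1.5]
pose = forward_kinematics(source_angles, target_limbs)
print(pose)
```

The design point mirrors the scene-rendering work: motion is reduced to a high-level representation (the pose), and the model fills in everything specific to the target person.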

Nvidia cautions that this is still an early-stage technology, but it could also be used to upgrade the graphics of an existing game: train a model on real-world imagery, then re-render old titles to make them look better. Because the technology requires the computer to analyze objects it knows from the real world, Catanzaro cautions that it won’t work on fantastical subjects, like rendering Santa’s elves. In theory, he admits, you could train the computer to render elves, but you would first need physical elves to capture images of so the computer could learn.

Like the ray tracing technology introduced on Nvidia’s recent consumer RTX series graphics cards, this A.I.-based rendering is used in a hybrid way and isn’t meant to replace traditional rendering techniques, the company said. Instead, A.I.-based rendering is meant to coexist with, and be used alongside, traditional graphics rendering engines.

Right now, it’s unclear when this technology will hit the gaming market. It could take as little as a couple of years, Catanzaro optimistically speculated. Combined with ray tracing, A.I.-based scene rendering could deliver quality visuals in game titles that are rendered in real time.

Chuong Nguyen
Silicon Valley-based technology reporter and Giants baseball fan who splits his time between Northern California and Southern…