You can now moonwalk on the moon with Nvidia’s A.I. and ray tracing tech

Apollo Moon Landing with RTX Technology. Image courtesy of Nvidia.

In addition to launching 10 new RTX Studio laptops aimed at creators at SIGGRAPH, Nvidia announced some of the work its research teams have been doing in artificial intelligence, augmented reality, and computer graphics. On the 50th anniversary of the Apollo 11 lunar landing, Nvidia showed how ray tracing technology from its RTX graphics cards can visually enhance the images NASA captured 50 years ago. At SIGGRAPH, Nvidia's effort to commemorate Apollo 11 goes a step further: space fans can superimpose themselves into a short video clip, as if they were astronauts Neil Armstrong and Buzz Aldrin, using A.I. and the power of ray tracing to render the video in real time.

“What we’re doing is using artificial intelligence to aim a camera at people just in their street clothes, to be able to do 3D pose estimation,” explained Rob Estes, Nvidia TKTK. “We know where they are in 3D space, and we know how their limbs are moving. So we’re drawing that using ray tracing and placing you, with your movement, as an astronaut in the scene.”

Doing the moonwalk on the moon

Hollywood visual effects teams have been doing something similar for years using a green screen and motion-capture actors wearing suits dotted with markers to replicate limb movement, but ray tracing and A.I. take this a step further by eliminating the need for specialized equipment and actors. “You could do this anywhere you can film somebody,” Estes said of the benefits of A.I.-enabled ray tracing in this lunar example. Essentially, SIGGRAPH attendees can film a short video montage of themselves in Aldrin’s and Armstrong’s spacesuits, doing the moonwalk as if they were on the moon as part of the Apollo 11 mission.
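To make the idea concrete, here is a minimal sketch, not Nvidia's actual pipeline, of the "retargeting" step that follows 3D pose estimation: once an estimator gives the joint positions of a person filmed in street clothes, those joints can be re-expressed relative to the hips and rescaled to drive an astronaut model. The joint names and the `scale` parameter are illustrative assumptions.

```python
# Illustrative sketch (not Nvidia's actual pipeline): 3D pose estimation
# yields joint positions for the filmed person; retargeting re-expresses
# each joint relative to the hips and rescales limbs so the same motion
# can drive a differently proportioned astronaut model.

def retarget_pose(joints, scale):
    """Re-express joints relative to the 'hips' joint and rescale.

    joints: dict mapping joint name -> (x, y, z) from a pose estimator.
    scale:  ratio of the astronaut model's limb lengths to the actor's.
    """
    hx, hy, hz = joints["hips"]
    return {
        name: ((x - hx) * scale, (y - hy) * scale, (z - hz) * scale)
        for name, (x, y, z) in joints.items()
    }

# One estimated frame: the actor stands somewhere in the capture space.
frame = {"hips": (1.0, 0.9, 2.0), "left_hand": (1.4, 1.5, 2.1)}
astronaut = retarget_pose(frame, scale=1.1)
print(astronaut["hips"])  # hips become the local origin: (0.0, 0.0, 0.0)
```

A real system would repeat this per video frame and feed the result to the ray-traced renderer.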

“We’re going to have a mock-up of the lunar surface and the lunar lander, and we’re going to let them see what it would be like for them to be on the moon,” he said, with all the lighting effects and rendering performed in real time. “This is very leading-edge research, and nobody has done this before. You’re combining A.I. and ray tracing in a way that has many, many practical benefits.” These benefits not only aid Hollywood and its visual effects teams but also designers and researchers trying to solve hard problems with pose estimation, Nvidia said.

Foveated rendering with prescription glasses

Nvidia wants to make rendering for augmented and virtual reality applications appear more realistic and crisp, and it is applying its foveated rendering technology to accomplish this. Its researchers have added support for prescription lenses, a first for the industry. Though this is still in the early research stage, Nvidia envisions a day when wearers of prescription eyeglasses won’t need separate glasses for their augmented reality devices.

Nvidia is working with several types of displays to build VR or AR displays that match the wearer’s prescription. “This is a big deal,” Estes said. “You’ve never been able to see 20/20 before, because there just wasn’t the visual acuity for these displays.”

In the same way that ray tracing can bring lifelike cinematic effects to video games, foveated rendering can make AR scenes and images appear more realistic and higher in resolution while conserving graphics power. Rather than rendering the entire scene at full fidelity, foveated rendering renders only the middle of the scene, where your eyes typically focus, in high detail; the peripheral areas can be rendered in less detail to save GPU power.

“So we’re doing work to make sure that we’re tracking where your gaze is, and applying this so that you get faster frame rates and better graphics in augmented reality, or this could be applied to virtual reality as well,” Estes said.
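The principle described above, full detail at the gaze point and progressively less toward the periphery, can be sketched as a simple falloff function. This is an illustrative toy, not Nvidia's implementation, and the `inner` and `outer` radii are assumed values.

```python
# Illustrative sketch of the foveation idea (not Nvidia's implementation):
# shading detail falls off with distance from the gaze point, so the GPU
# spends full effort only where the eye is actually looking.

def detail_level(px, py, gaze_x, gaze_y, inner=0.2, outer=0.5):
    """Return a shading-detail multiplier in [0.25, 1.0] for one pixel.

    Coordinates are normalized to [0, 1]. Within `inner` of the gaze
    point we render at full detail; beyond `outer` we drop to quarter
    detail; in between we blend linearly.
    """
    dist = ((px - gaze_x) ** 2 + (py - gaze_y) ** 2) ** 0.5
    if dist <= inner:
        return 1.0
    if dist >= outer:
        return 0.25
    t = (dist - inner) / (outer - inner)  # 0 at inner edge, 1 at outer
    return 1.0 + t * (0.25 - 1.0)

print(detail_level(0.5, 0.5, 0.5, 0.5))  # at the gaze point: 1.0
print(detail_level(0.0, 0.0, 0.5, 0.5))  # far periphery: 0.25
```

In a real headset, the gaze point comes from an eye tracker and the multiplier maps to the GPU's variable-rate shading hardware rather than a per-pixel function call.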

GauGAN, the A.I. artist, is freed

Nvidia GauGAN researchers. Photo: NVIDIA

In the past, Nvidia showed how its GauGAN drawing tool can turn even the art-challenged among us into artists, allowing you to draw complex, lifelike landscapes with just a few simple strokes by leveraging the power of artificial intelligence. The A.I.-drawn scenes, Nvidia revealed, have also been used as backdrops in some big-name Hollywood movies, with studios overlaying elements to set the overall mood for the clip.

Rather than using A.I. to identify elements of a scene, like a dog or a cat, GauGAN, as its name implies, uses a generative adversarial network, or GAN. When the user specifies where the ocean meets the sky by drawing a horizon line, GauGAN begins to “draw” the scene with clouds and waves, and the user can add rocks and sea cliffs to this seascape, with the scene rendered with proper shadows and reflections. So instead of identifying objects in an image, the GAN fills in the image with a realistic creation and rendering of the scene, Nvidia explained.
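The generator itself is a trained neural network and is not reproduced here, but the input it consumes is easy to sketch: the user's strokes become a 2D semantic label map, with each cell tagged "sky," "sea," or "rock." The label values and grid size below are illustrative assumptions.

```python
# Illustrative sketch of GauGAN's input (the trained GAN that turns this
# map into a photoreal image is not reproduced here): drawing a horizon
# line labels everything above it "sky" and everything below it "sea",
# and further strokes stamp other labels such as "rock" onto the map.

SKY, SEA, ROCK = 0, 1, 2

def label_map(width, height, horizon_row):
    """Build a semantic map: sky above the drawn horizon, sea below."""
    return [
        [SKY if row < horizon_row else SEA for _ in range(width)]
        for row in range(height)
    ]

def paint_rock(seg, row, col):
    """The user adds a rock with a stroke at (row, col)."""
    seg[row][col] = ROCK
    return seg

seg = paint_rock(label_map(8, 6, horizon_row=2), row=4, col=3)
print(seg[0][0], seg[4][3], seg[5][0])  # prints: 0 2 1  (sky, rock, sea)
```

A GAN trained on landscape photos then synthesizes plausible clouds, waves, and rock textures for each labeled region, including the shadows and reflections the article mentions.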

Given the popularity of the tool, Nvidia revealed that it will be expanding access to GauGAN to everyone. At this time, Estes said there are no plans to monetize GauGAN, despite its creative work in the movie industry, and that the company is simply relishing the joy that other people get from using it.

Chuong Nguyen
Silicon Valley-based technology reporter and Giants baseball fan who splits his time between Northern California and Southern…