Nvidia RTX DLSS: Everything you need to know

When it launched in 2018, Nvidia’s Turing generation of GPUs introduced some intriguing new features for gamers everywhere. Ray tracing is the easiest to wrap your head around, but deep learning super sampling, or DLSS, is a little more nebulous.

Even if it’s more complicated to understand, DLSS is one of Nvidia’s most important graphical features, offering higher frame rates at higher resolutions while demanding fewer GPU resources. To help you understand just how it works, here’s our guide to everything you need to know about Nvidia’s RTX DLSS technology, so you can decide whether it’s enough of a reason to upgrade to a new RTX 30 series GPU.

What is DLSS?

Deep learning super sampling uses artificial intelligence and machine learning to produce an image that looks like a higher-resolution image, without the rendering overhead. Nvidia’s algorithm learns from tens of thousands of rendered sequences of images that were created using a supercomputer. That trains the algorithm to be able to produce similarly beautiful images, but without requiring the graphics card to work as hard to do it.
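
To make that training idea concrete, here’s a minimal, runnable sketch of the same principle in Python: a toy “model” learns, from low-resolution inputs and high-resolution ground truth, how to fill in the samples that weren’t rendered. Everything in it is illustrative; Nvidia’s actual network and training pipeline aren’t public.

```python
# A deliberately tiny illustration of the training idea: learn, from pairs of
# (low-res, high-res) examples, how to fill in missing samples. Here the
# "image" is a 1D ramp and the "model" is a single learned blend weight, a
# toy stand-in for Nvidia's real (and unpublished) neural network.

def upscale(signal, w):
    """Insert one sample between each pair, blended by learned weight w."""
    out = []
    for a, b in zip(signal, signal[1:]):
        out += [a, w * a + (1 - w) * b]
    return out + [signal[-1]]

def loss(pred, target):
    return sum((p - t) ** 2 for p, t in zip(pred, target))

target = [x / 10 for x in range(11)]  # high-res "ground truth"
low = target[::2]                     # the cheap, low-res render

w = 0.0
for _ in range(200):  # crude numerical gradient descent on the blend weight
    eps = 1e-4
    grad = (loss(upscale(low, w + eps), target) - loss(upscale(low, w), target)) / eps
    w -= 0.1 * grad

print(round(w, 2))  # ~0.5: the model learned to interpolate the midpoints
```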

DLSS also incorporates more traditional beautifying techniques like anti-aliasing to create a final image that looks like it was rendered at a much higher resolution and detail level, without sacrificing frame rate.

This is all possible thanks to Nvidia’s Tensor cores, which are only available in RTX GPUs (outside of data center solutions, such as the Nvidia A100). Although RTX 20 series GPUs have Tensor cores inside, the RTX 3070, 3080, and 3090 come with Nvidia’s second-generation Tensor cores, which offer greater per-core performance.

Though DLSS originally launched with little competition, image-sharpening techniques from both AMD and Nvidia itself now compete with it for mindshare in 2020, even if they don’t work in quite the same way.

What does DLSS actually do?

DLSS is the end result of an exhaustive process of teaching Nvidia’s A.I. algorithm to generate better-looking games. After rendering the game at a lower resolution, DLSS infers information from its knowledge base of super-resolution image training to generate an image that still looks like it was running at a higher resolution. The idea is to make games rendered at 1440p look like they’re running at 4K, or 1080p games look like 1440p. DLSS 2.0 offers up to 4x the pixel count, allowing you to render games at 1080p while outputting them at 4K.
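
As a quick sanity check on that 4x figure, the pixel arithmetic works out, since both dimensions double from 1080p to 4K:

```python
# Pixel counts behind the "4x" claim (standard 16:9 sizes).
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(pixels["4K"] / pixels["1080p"])  # 4.0 -- 4K is 4x the pixels of 1080p
print(pixels["4K"] / pixels["1440p"])  # 2.25 -- 1440p-to-4K infers over half the pixels
```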

More traditional super-resolution techniques can lead to artifacts and bugs in the final picture, but DLSS is designed to account for those errors and generate an even better-looking image. It’s still being optimized, and Nvidia claims that DLSS will continue to improve over the months and years to come, but in the right circumstances, it can deliver substantial performance uplifts without affecting the look and feel of a game.

Where early DLSS games like Final Fantasy XV delivered modest frame rate improvements of just five to 15 FPS, more recent releases have seen far greater gains. With games like Deliver Us the Moon and Wolfenstein: Youngblood, Nvidia introduced a new A.I. engine for DLSS, which we’re told improves image quality, especially at lower resolutions like 1080p, and can increase frame rates in some cases by over 50%.

DLSS also now offers selectable quality modes: Performance, Balanced, and Quality, each focusing the RTX GPU’s Tensor core horsepower on a different balance of frame rate and image quality.
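
Nvidia doesn’t spell out exactly what each mode does internally, but commonly reported DLSS 2.0 render scales give a feel for the trade-off. The factors in this sketch are assumptions drawn from those reports, not official specifications:

```python
# Hypothetical mapping from DLSS quality mode to internal render resolution.
# Scale factors are commonly reported DLSS 2.0 values, not official specs.
RENDER_SCALE = {
    "Quality": 2 / 3,     # ~67% of output resolution per axis
    "Balanced": 0.58,     # ~58% per axis
    "Performance": 0.5,   # 50% per axis
}

def render_resolution(out_w, out_h, mode):
    scale = RENDER_SCALE[mode]
    return round(out_w * scale), round(out_h * scale)

print(render_resolution(3840, 2160, "Performance"))  # (1920, 1080) -- 4K output from a 1080p render
print(render_resolution(3840, 2160, "Quality"))      # (2560, 1440) -- 4K output from a 1440p render
```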

How does DLSS work?

DLSS forces a game to render at a lower resolution (typically 1440p) and then uses its trained A.I. algorithm to infer what it would look like if it were rendered at a higher one (typically 4K). It does this by utilizing some anti-aliasing effects (likely Nvidia’s own TAA) and some automated sharpening. Visual artifacts that wouldn’t be present at higher resolutions are also ironed out and even used to infer the details that should be present in an image.
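
In rough terms, the per-frame flow looks like the toy pipeline below. It’s purely illustrative: none of these function names are Nvidia’s, and the naive upscaler merely marks the step where the trained network does its inference:

```python
# Toy version of the per-frame flow: render low, then upscale to the output
# size. The naive pixel-duplication upscaler stands in for the step DLSS
# performs with neural inference on Tensor cores inside the driver.

def render_low_res(width, height):
    """Stand-in for the game rendering a frame at reduced resolution."""
    return [[(x + y) % 2 for x in range(width)] for y in range(height)]

def upscale_2x(frame):
    """Naive 2x upscale by duplicating pixels in both dimensions."""
    out = []
    for row in frame:
        doubled = [p for p in row for _ in range(2)]
        out.append(doubled)
        out.append(list(doubled))
    return out

low = render_low_res(1920, 1080)  # internal render at 1080p
high = upscale_2x(low)            # output-sized frame at 3840x2160
print(len(high[0]), len(high))    # 3840 2160
```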

As Eurogamer explains, the A.I. algorithm is trained to look at certain games at extremely high resolutions (supposedly 64x supersampling) and is distilled down to something just a few megabytes in size, before being added to the latest Nvidia driver releases and made accessible to gamers all over the world. Originally, Nvidia had to go through this process on a game-by-game basis. Now, with DLSS 2.0, Nvidia provides a general solution, so the A.I. model no longer needs to be trained for each game.

In effect, DLSS is a real-time version of Nvidia’s screenshot-enhancing Ansel technology. It renders the image at a lower resolution to provide a performance boost, then applies various effects to deliver a result comparable to raising the resolution outright.

The end result can be a mixed bag, but in general it leads to higher frame rates without a substantial loss in visual fidelity. Nvidia claims frame rates can improve by as much as 75% in Remedy Entertainment’s Control when using both DLSS and ray tracing. The uplift is usually less pronounced than that, and not everyone is a fan of the final look of a DLSS game, but the option is certainly there for those who want to beautify their games without the cost of running at a higher resolution.

In Death Stranding, we saw significant improvements at 1440p over native rendering. Performance mode lost some of the finer details on the package carried on the character’s back, particularly in the tape. Quality mode maintained most of the detail while smoothing out some of the rough edges of the native render. Our “DLSS off” screenshot shows the quality without any anti-aliasing. Although DLSS doesn’t maintain that level of quality, it’s very effective at combating aliasing while maintaining most of the detail.

We didn’t see any over-sharpening in Death Stranding, but that’s something you might encounter while using DLSS.

Better over time

Deep learning super sampling has the potential to give gamers who can’t quite reach comfortable frame rates at resolutions above 1080p a way to get there through inference. DLSS could end up being the most impactful feature of Nvidia’s RTX GPUs moving forward. The cards aren’t as powerful as we might have hoped, and their ray-tracing effects are pretty but tend to carry a sizable performance cost, but DLSS could give us the best of both worlds: better-looking games that perform better, too.

The best place for this kind of technology could be in lower-end cards, but, unfortunately, it’s only supported by RTX graphics cards, the weakest of which is the RTX 2060 — a $300 card. The new RTX 3000 GPUs offer a glimpse as to how Nvidia will use DLSS in the future: Pushing resolutions above 4K while maintaining stable frame rates.

Nvidia has shown the RTX 3090, a $1,500 GPU with 24GB of memory, rendering games like Wolfenstein: Youngblood at 8K with ray tracing and DLSS turned on. Although wide adoption of 8K is still a ways off, 4K displays are becoming increasingly common. Instead of rendering at native 4K and hoping to stick around 50-60 FPS, gamers can render at 1080p or 1440p and use DLSS to fill in the missing information. The result is higher frame rates without a noticeable loss in image quality.
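
Simple pixel arithmetic shows why: each step up the resolution ladder multiplies the rendering load, so inferring pixels instead of rendering them pays off more the higher you go:

```python
# Pixel counts at each target resolution (standard 16:9 sizes).
targets = {"1080p": (1920, 1080), "1440p": (2560, 1440),
           "4K": (3840, 2160), "8K": (7680, 4320)}

px = {name: w * h for name, (w, h) in targets.items()}
for name, count in px.items():
    print(f"{name}: {count / 1e6:.1f}M pixels ({count / px['1080p']:.1f}x 1080p)")
# 8K is 16x the pixels of 1080p, which is why even a $1,500 GPU leans on
# DLSS rather than rendering 8K natively.
```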

Because DLSS works through a neural network, it will get better over time. DLSS 2.0 already has far fewer artifacts compared to original DLSS, allowing games like Death Stranding to provide a cleaner image compared to other image reconstruction techniques, such as checkerboard rendering. The problem now is game support.

There are currently only 15 games that support DLSS 2.0, which is fewer than the number of games that support ray tracing. Thankfully, wide adoption should come soon. Upcoming releases like Cyberpunk 2077 and Call of Duty: Black Ops Cold War will support DLSS. With the launch of Ampere GPUs, developers will likely be looking for ways to save on system resources while displaying at high resolutions. DLSS provides a tested, general solution.

It could be that in a year or two DLSS is a commonplace feature in most games due to its ease of implementation and the dominance of RTX GPUs in gamer systems. AMD may in turn be forced to develop something similar.
