
What is Nvidia DLAA? New anti-aliasing technology explained

Nvidia revealed a new feature coming to RTX 2000 and RTX 3000 graphics cards called DLAA. Built on the same system as DLSS, this new anti-aliasing technique could provide the best anti-aliasing quality on PC, well beyond the techniques currently available. Is it really that good, though? Or is it just marketing hype?

We’re here to get you up to speed on what DLAA is, how it works, and how it stacks up to traditional anti-aliasing techniques. Right now, DLAA is only available in The Elder Scrolls Online, but we suspect more games will support the feature in the future.


What is Nvidia DLAA?


Nvidia Deep Learning Anti-Aliasing (DLAA) is an anti-aliasing feature that uses the same pipeline as Nvidia’s Deep Learning Super Sampling (DLSS). In short, it’s DLSS with the upscaling portion removed. Instead of upscaling the image, Nvidia is putting its A.I.-assisted tech to work for better anti-aliasing at native resolution.

Anti-aliasing solves the problem of aliasing in video games (go figure). Pixels are arranged in a grid on your display, so when a diagonal line shows up on screen, it creates a blocky, stair-stepping effect. These are known as jaggies. Anti-aliasing techniques try to fill in the gaps between pixels, leading to a smoother edge on objects.

Next time you boot up a game, look at foliage, fences, or any thin object with straight lines. You’ll see some amount of aliasing at work. The three main anti-aliasing techniques are multi-sampling anti-aliasing (MSAA), fast approximate anti-aliasing (FXAA), and temporal anti-aliasing (TAA). Each takes samples of pixels to create an average color value, dealing with the jaggies, but the way they do it is different.

MSAA is the most demanding, sampling each pixel at multiple points and averaging the result to fill in the missing data. TAA is similar, but it uses temporal (time-based) data instead of sampling the same pixel multiple times. That makes TAA more efficient overall while providing a similar level of quality.

Finally, FXAA is the least demanding of the lot. It only samples pixels once like TAA, but it doesn’t use past frames for reference. It’s only focused on what’s showing up on your screen for a given frame, which makes FXAA much faster than MSAA and TAA, though at the cost of image quality.
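The core idea behind MSAA's sampling can be shown with a toy sketch. The scene, pixel coordinates, and sample offsets below are all illustrative, not real graphics-API calls: a pixel straddling a diagonal edge is shaded by averaging the scene color at several sub-pixel positions, which is what turns a hard black-or-white step into a softer gray.

```python
# Toy illustration of how sample count affects edge smoothing.
# The "scene" is a hard diagonal edge along the line y = x:
# white (1.0) above it, black (0.0) below. All names here are
# illustrative; this is not a real graphics API.

def sample(x, y):
    """Return the scene color at a point: 1.0 above the line y = x, else 0.0."""
    return 1.0 if y > x else 0.0

def shade_pixel(px, py, offsets):
    """Average the scene color at several sub-pixel sample positions."""
    samples = [sample(px + dx, py + dy) for dx, dy in offsets]
    return sum(samples) / len(samples)

# One sample at the pixel center (no anti-aliasing): a hard 0 or 1.
center_only = [(0.5, 0.5)]
# Four sub-pixel samples in a 2x2 grid (MSAA-style).
msaa_4x = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]

# A pixel sitting right on the edge:
print(shade_pixel(0.0, 0.0, center_only))  # 0.0 -- hard step, a jaggy
print(shade_pixel(0.0, 0.0, msaa_4x))      # 0.25 -- partial coverage, softer edge
```

The four-sample version produces a gray that reflects how much of the pixel the edge actually covers, which is why MSAA smooths jaggies, and also why it costs several times the sampling work of a single-sample technique.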

This short romp through anti-aliasing techniques is important for understanding DLAA and DLSS. DLAA works just like TAA, but instead of sampling every pixel, it only samples pixels that have changed from one frame to the next to fill in the missing information. DLAA also uses machine learning, giving the anti-aliasing technique much more information to work with.

How does Nvidia DLAA work?

If you know how DLSS works, you know how DLAA works. It’s the same technique, just applied in a different way. Although DLSS deals with upscaling an image, it’s essentially an anti-aliasing technique. That makes DLAA much easier to understand, offering the anti-aliasing bit without the upscaling.

Under the hood, DLAA works by utilizing an A.I. model and dedicated Tensor cores on RTX 2000 and RTX 3000 graphics cards. Nvidia trains an A.I. model by feeding it low-resolution, aliased images rendered by the game engine, as well as motion vectors from the same low-resolution scene. During this process, the A.I. model compares the low-resolution image to a 16K reference image.


After being trained, Nvidia bundles the model into a GPU driver and sends it off to you. Once you download the driver, the Tensor cores on RTX 2000 and RTX 3000 offer the computational power to run the A.I. model in real time while you’re playing games.

To understand DLAA, we need to look again at TAA. As mentioned, TAA collects only one sample per pixel, unlike MSAA, which collects several. These samples are averaged into a single color value, evening out the jaggies. TAA jitters the sample position slightly from frame to frame, letting it gather more information for that average without taking multiple samples in a single frame.

It’s a great solution, and it looks about as good as MSAA with a vastly lower performance cost. The problem is that TAA doesn’t handle motion well. The samples from jittered pixels aren’t usable once something in the scene moves, which leads to the ghosting effect that TAA is infamous for.
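That jitter-and-accumulate idea can be sketched in a few lines. This is a toy model of a static scene, with an illustrative blend weight and jitter sequence (real TAA implementations vary): each frame takes one jittered sample and blends it into a running history, which drifts toward the same average that multiple samples per frame would give.

```python
# Toy sketch of TAA-style temporal accumulation on a STATIC scene.
# One jittered sample per frame is blended into a history buffer,
# converging toward the multi-sample average over time. The blend
# weight and jitter sequence are illustrative, not from a real engine.

def sample(x, y):
    # Scene: white (1.0) above the diagonal edge y = x, black (0.0) below.
    return 1.0 if y > x else 0.0

ALPHA = 0.1  # weight given to the newest frame's sample
jitters = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]

history = sample(0.5, 0.5)  # first frame: one centered sample
for frame in range(64):
    dx, dy = jitters[frame % len(jitters)]
    current = sample(dx, dy)                           # ONE sample this frame
    history = ALPHA * current + (1 - ALPHA) * history  # exponential blend

print(history)  # settles near 0.25, the 4-sample average from before
```

The catch, as the article notes, is that this only works while the scene holds still: the history buffer assumes old samples still describe the current frame.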

DLAA is essentially TAA with the motion problem solved. The A.I. model can track motion, lighting changes, and edges throughout the scene and adjust accordingly, sidestepping the stale samples that trip up TAA while producing a cleaner image.
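The ghosting problem, and one common hand-tuned workaround, can be sketched as well. The neighborhood-clamp below is a well-known TAA heuristic, not DLAA itself; DLAA's trained model replaces heuristics like this, and this sketch makes no attempt to reproduce it. All values are illustrative.

```python
# Toy sketch of why motion breaks naive TAA history, and how clamping
# stale history reduces ghosting. The clamp is a common hand-written
# TAA heuristic; DLAA replaces such heuristics with a trained model.

def blend(history, current, alpha=0.1):
    """Exponential blend of the history buffer with the newest sample."""
    return alpha * current + (1 - alpha) * history

# An object just moved: this pixel was white last frame, black this frame.
history, current = 1.0, 0.0

# Naive blend keeps mostly stale color -- the classic ghost trail.
naive = blend(history, current)  # roughly 0.9

# Clamp history to the color range of the new frame's neighborhood
# (illustrative min/max values) before blending.
lo, hi = 0.0, 0.2
clamped = blend(max(lo, min(hi, history)), current)  # roughly 0.18

print(naive, clamped)
```

The clamped result snaps much closer to the object's new color, which is the same outcome DLAA's model is trained to produce, just without the manual tuning.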

DLSS and DLAA work in the same way. The only real difference is the goal: DLSS uses the technique to deliver acceptable image quality with a big performance gain, while DLAA uses it to deliver the best image quality at a performance cost.

Nvidia DLAA image comparison

With the technobabble out of the way, it’s time to look at DLAA in action. Right now, the feature is only available in The Elder Scrolls Online, which also features DLSS and TAA. DLAA is meant to replace TAA, not DLSS. If you’ve been using the upscaling tech to improve performance, DLAA will take you in the opposite direction.

We took screenshots of The Elder Scrolls Online with the Maximum preset at 4K, only changing the anti-aliasing mode between shots. Zoomed in to three times the original resolution, we can see some major differences between DLSS and DLAA. DLSS is working with less information, so areas like the shingles on the roof and the area under the spire ledge look muddy.

There’s not much of a difference between TAA and DLAA. They’re roughly the same, and some areas, such as the green leaves at the bottom, look slightly better with TAA. That makes sense, though. TAA and DLAA are using very similar anti-aliasing techniques, so they should produce about the same image quality.

The difference comes in motion. As mentioned, TAA doesn’t always handle motion well. DLAA does. In short, it provides the same image quality as TAA, just without the ghosting and smearing that sometimes accompany it.

It’s important to note that you’ll see a more pronounced difference at lower resolutions. Naturally, more pixels on the screen means less work for the anti-aliasing. As DLSS has proved time and again, Tensor cores can work wonders with an A.I. model on low-resolution scenes.

Same tech, but not DLSS


Although DLSS and DLAA do the same thing and work with the same tech, you shouldn’t confuse them. Think of them as opposites. DLSS focuses on performance at the cost of image quality, while DLAA focuses on image quality at the cost of performance.

DLAA has applications in games like The Elder Scrolls Online, where a good chunk of players has extra GPU power that’s effectively left on the table. You won’t see it in the next Cyberpunk 2077 or Control, and if you do, you’ll need some of the most powerful hardware around to use it.

The unfortunate news is that, like DLSS, DLAA is restricted to RTX 2000 and RTX 3000 graphics cards. It requires the Tensor cores to work, but the end result is worth it.
