
Intel is cooking up an exciting DLSS 3 rival

Kena: Bridge of Spirits on the Samsung Odyssey OLED G9.
Jacob Roach / Digital Trends

Intel is working on a rival to Nvidia’s DLSS 3. Intel and researchers from the University of California published a paper detailing the tech, currently dubbed ExtraSS, at Siggraph Asia 2023 (spotted by Wccftech). It accomplishes the same goal as DLSS 3 by generating new frames to improve performance. Unlike DLSS 3, however, ExtraSS uses frame extrapolation, not frame interpolation.


The latter method, interpolation, is how DLSS 3 (and AMD’s FSR 3) works. It takes two sequential frames and compares them to generate a frame in between. This naturally means you’re playing on a slight delay, as the tech needs both the current and next frame to do its dirty work. Intel is proposing a technique based on extrapolation, which uses data only from previous frames to predict the next frame. That sidesteps the latency issue currently present in DLSS 3.
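The difference is easy to see in a toy model. The sketch below is not Intel's or Nvidia's algorithm; it reduces each "frame" to a single brightness value to show why interpolation must wait for a future frame while extrapolation does not.

```python
import numpy as np

def interpolate(frame_prev, frame_next):
    # Interpolation (DLSS 3 / FSR 3 style, greatly simplified): the
    # generated frame sits between two rendered frames, so the newer
    # frame must already exist -- hence the added latency.
    return 0.5 * (frame_prev + frame_next)

def extrapolate(frame_older, frame_newer):
    # Extrapolation (ExtraSS style, greatly simplified): predict the
    # next frame from past frames only, by carrying the observed
    # change forward. No waiting on a future frame.
    return frame_newer + (frame_newer - frame_older)

# Toy "frames": a brightness value drifting upward each frame.
f0, f1, f2 = np.array([0.0]), np.array([0.1]), np.array([0.2])

# Interpolation reconstructs the middle frame from its neighbors,
# but f2 had to be rendered first.
print(interpolate(f0, f2))   # -> [0.1]
# Extrapolation predicts f2 before it is rendered.
print(extrapolate(f0, f1))   # -> [0.2]
```

The trade-off is also visible here: extrapolation is a guess about the future, so any motion that doesn't continue as predicted produces errors, which is where the artifacts discussed below come from.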

A demonstration of artifacts in frame generation.
Intel

Extrapolation isn’t free of problems, which is probably why we haven’t seen it applied in games yet. The researchers point out the disocclusion issue with extrapolation. If your character is blocking part of a scene and then moves, that part of the scene suddenly becomes visible. Because extrapolation essentially predicts future frames, it has no details on what your character was blocking. This creates disocclusion artifacts, which look like a shimmering ghost trailing moving objects. But the researchers say they’re able to solve this problem.
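Disocclusion is easiest to see in a tiny warping example. The sketch below is a hypothetical illustration, not the paper's method: it treats a frame as a 1-D row of pixels, shifts each pixel by its motion vector, and marks the destinations nothing lands on, which are exactly the regions the previous frame never saw.

```python
import numpy as np

def warp_forward(frame, motion):
    # Shift each pixel of a 1-D "frame" by its integer motion vector.
    # Destination pixels that no source pixel lands on are disoccluded:
    # the previous frame simply has no data for them.
    h = frame.shape[0]
    warped = np.full(h, np.nan)        # NaN marks "no information"
    for x in range(h):
        tx = x + motion[x]
        if 0 <= tx < h:
            warped[tx] = frame[x]
    return warped

# A 1-D scene: background (value 1.0) with an "object" (value 9.0)
# covering pixels 2-3, which moves two pixels to the right.
frame  = np.array([1.0, 1.0, 9.0, 9.0, 1.0, 1.0])
motion = np.array([0,   0,   2,   2,   0,   0])

warped = warp_forward(frame, motion)
holes = np.isnan(warped)
# Pixels 2 and 3 are now disoccluded: the object moved away, but the
# previous frame never saw what it was covering.
print(holes)
```

An interpolation-based method can peek at the next frame to fill those holes; an extrapolation-based one has to hallucinate them, which is the source of the shimmering ghost artifacts the researchers describe.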

Or, in the researchers’ own words: “Our method proposes a new warping method with a lightweight flow model to extrapolate frames with better qualities [compared] to the previous frame generation methods and less latency compared to interpolation-based methods.”

A video showing artifacts caused by frame generation.
Intel

In addition to frame extrapolation, the framework the researchers show includes supersampling, similar to the base versions of Nvidia DLSS and AMD FSR. The researchers say the implementation shows comparable quality, which they highlighted in a brief demo video.

The researchers say they’re able to achieve high-quality results by utilizing the geometry buffer (or G-buffer) for warping. As the paper states, “Our method still shows comparable results due to the good initialization provided by G-buffer guided warping and shading refinement modules. Even using the target resolution as inputs, [temporal anti-aliasing upscaling, or TAAU] generates lagging glossy reflections, while our [ExtraSS] has correct shadings.”
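The advantage of G-buffer guidance can be sketched in the same toy 1-D setting. This is an illustration of the general idea only, not Intel's implementation: the renderer's G-buffer supplies exact per-pixel motion vectors (rather than motion estimated from pixels), giving the warp a good initialization, and a refinement step then fills the disoccluded holes. Here the "shading refinement" is reduced to a crude nearest-valid-pixel fill.

```python
import numpy as np

def gbuffer_guided_extrapolate(frame, gbuf_motion):
    # Warp the previous frame forward using exact motion vectors
    # read from the G-buffer (assumed given, not estimated).
    h = frame.shape[0]
    warped = np.full(h, np.nan)
    for x in range(h):
        tx = x + gbuf_motion[x]
        if 0 <= tx < h:
            warped[tx] = frame[x]
    # Stand-in for the paper's shading refinement: fill each
    # disoccluded hole from the nearest pixel that has data.
    for x in range(h):
        if np.isnan(warped[x]):
            valid = np.flatnonzero(~np.isnan(warped))
            warped[x] = warped[valid[np.argmin(np.abs(valid - x))]]
    return warped

# Same toy scene as before: an object over background, moving right.
frame  = np.array([1.0, 1.0, 9.0, 9.0, 1.0, 1.0])
motion = np.array([0,   0,   2,   2,   0,   0])
print(gbuffer_guided_extrapolate(frame, motion))
```

In a real renderer the refinement would be a learned module rather than a nearest-neighbor fill, but the structure is the same: an accurate geometric warp first, then a lightweight network to repair what the warp cannot know.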

A demonstration of artifacts caused by super sampling.
Intel

Right now, this is just a research paper, not a product. We’ll still need to see if Intel can apply the technique to its XeSS utility, but it seems like that’s where we’re headed. The paper lists four engineers from Intel among its authors, including Anton Kaplanyan, the original creator of Nvidia’s DLSS.

We currently expect Intel to reveal details about its next-gen Battlemage GPUs in the second half of 2024. It’s possible that we’ll see more details on this frame extrapolation technique around that time, but it’s too soon to say right now.

Jacob Roach
Former Digital Trends Contributor