
Intel is cooking up an exciting DLSS 3 rival

Kena Bridge of Spirits on the Samsung Odyssey OLED G9.
Jacob Roach / Digital Trends

Intel is working on a rival to Nvidia’s DLSS 3. Intel and researchers from the University of California, Santa Barbara, published a paper detailing the tech, currently dubbed ExtraSS, at Siggraph Asia 2023 (spotted by Wccftech). It accomplishes the same goal as DLSS 3 by generating new frames to improve performance. Unlike DLSS 3, however, ExtraSS uses frame extrapolation rather than frame interpolation.

Interpolation is how DLSS 3 (and AMD’s FSR 3) works. The tech takes two sequential frames and compares them to generate a frame in between. That naturally means you’re playing on a slight delay, as the tech needs both the current frame and the next one to do its dirty work. Intel is proposing a technique that uses extrapolation instead, predicting the next frame using data only from previous frames. That gets rid of the latency penalty currently present in DLSS 3.
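To see why that matters for latency, here’s a minimal sketch, not Intel’s code, that treats a frame as a single moving value. The interpolated frame can’t exist until the next rendered frame does, while the extrapolated one is built entirely from history:

```python
def interpolate(frame_prev, frame_next, t=0.5):
    # Needs the *next* rendered frame, so the generated frame can only be
    # shown after frame_next is finished -- that wait is the added latency.
    return frame_prev + t * (frame_next - frame_prev)

def extrapolate(frame_older, frame_prev):
    # Uses only frames that already exist, so nothing has to wait.
    velocity = frame_prev - frame_older
    return frame_prev + velocity  # predict one step ahead

rendered = [0.0, 1.0, 2.0, 3.0]  # stand-in "frames" of a steadily moving object

# DLSS 3 / FSR 3 style: the in-between frame depends on rendered[2]
print(interpolate(rendered[1], rendered[2]))   # 1.5, only available once frame 2 exists

# ExtraSS style: the new frame is predicted from history alone
print(extrapolate(rendered[1], rendered[2]))   # 3.0, predicted before frame 3 is rendered
```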

A demonstration of artifacts in frame generation.
Intel

Extrapolation isn’t free of problems, which is probably why we haven’t seen it applied in games yet. The researchers point out the disocclusion issue with extrapolation: if your character is blocking part of a scene and then moves, that part of the scene suddenly becomes visible. Because extrapolation is essentially predicting future frames, it doesn’t have details on what your character was blocking. This creates disocclusion artifacts, which look like a shimmering ghost trailing moving objects. But the researchers say they’re able to solve this problem.
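Here’s a toy 1D illustration of the problem (my own, not from the paper): forward-warp the previous frame along its motion and see which pixels end up with no source data. Those holes are exactly what the paper’s refinement step has to fill in.

```python
import numpy as np

W = 8
prev_frame = np.arange(W, dtype=float)   # a simple "background"
prev_frame[2:4] = -1.0                   # an "object" covering pixels 2..3

# Per-pixel motion: the object moves 2 pixels right, the background is static.
motion = np.zeros(W, dtype=int)
motion[2:4] = 2

extrapolated = np.full(W, np.nan)        # NaN = no information yet

# Crude painter's pass: scatter static pixels first, then the moving object,
# so the object lands on top where the two overlap.
order = [x for x in range(W) if motion[x] == 0] + [x for x in range(W) if motion[x] != 0]
for x in order:
    target = x + motion[x]
    if 0 <= target < W:
        extrapolated[target] = prev_frame[x]

print(extrapolated)
# Pixels 2..3 stay NaN: the background behind the object was never visible in
# past frames, so a naive warp has nothing to put there -- the disocclusion hole.
```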

Or, in the researchers’ own words: “Our method proposes a new warping method with a lightweight flow model to extrapolate frames with better qualities [compared] to the previous frame generation methods and less latency compared to interpolation-based methods.”

A video showing artifacts caused by frame generation.
Intel

In addition to frame extrapolation, the framework the researchers demonstrate includes supersampling, similar to the base versions of Nvidia DLSS and AMD FSR. The researchers say the implementation delivers comparable quality, which they highlight in a brief demo video.
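Based on that description, the two pieces could slot together roughly like the sketch below: every rendered frame is upscaled, and extra frames are extrapolated in between. The function bodies here are simple placeholders of my own, not Intel’s algorithms.

```python
import numpy as np

RENDER_RES, TARGET_RES = 4, 8   # tiny 1-D "resolutions" for illustration

def render(t):
    # Stand-in renderer: a low-resolution "frame" that drifts over time.
    return np.linspace(t, t + 1, RENDER_RES)

def supersample(low_res):
    # Placeholder upscale (nearest-neighbor); the real step is learned.
    return np.repeat(low_res, TARGET_RES // RENDER_RES)

def extrapolate(older, prev):
    # Placeholder prediction: constant-velocity step from frame history.
    return prev + (prev - older)

history, output = [], []
for t in range(4):
    frame = supersample(render(t))                 # supersampled rendered frame
    if len(history) >= 2:
        output.append(extrapolate(*history[-2:]))  # generated frame, no waiting
    output.append(frame)
    history.append(frame)

print(f"{len(output)} frames displayed from 4 rendered")  # 6 from 4
```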


The researchers say they’re able to achieve high-quality results by utilizing the geometry buffer (or G-buffer) for warping. As the paper states, “Our method still shows comparable results due to the good initialization provided by G-buffer guided warping and shading refinement modules. Even using the target resolution as inputs, [temporal anti-aliasing upscaling, or TAAU] generates lagging glossy reflections, while our [ExtraSS] has correct shadings.”
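In plainer terms, if the engine can cheaply produce the new frame’s G-buffer (motion vectors, depth, normals), each new pixel can look backward into the previous shaded frame to find its color, which gives the refinement stage a much better starting point than blind prediction. The sketch below is a simplified illustration under that assumption, not code from the paper.

```python
import numpy as np

W = 8
prev_shaded = np.arange(W, dtype=float)   # previous frame's final colors

# Motion vectors from the *new* frame's G-buffer: where each new pixel was
# located in the previous frame (here, everything shifted left by one pixel).
motion_to_prev = np.full(W, -1)

warped = np.full(W, np.nan)
for x in range(W):
    src = x + motion_to_prev[x]
    if 0 <= src < W:
        warped[x] = prev_shaded[src]      # gather shading from history
    # Pixels whose source falls outside the frame (or is disoccluded) stay
    # NaN and are left for the shading-refinement module to fill in.

print(warped)   # [nan, 0., 1., 2., 3., 4., 5., 6.]
```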

A demonstration of artifacts caused by super sampling.
Intel

Right now, this is just a research paper, not a product. We’ll still need to see whether Intel can apply the technique to its XeSS upscaling tech, but it seems like that’s where we’re headed. The paper lists four researchers from Intel, including Anton Kaplanyan, one of the original creators of Nvidia’s DLSS.

We currently expect Intel to reveal details about its next-gen Battlemage GPUs in the second half of 2024. It’s possible that we’ll see more details on this frame extrapolation technique around that time, but it’s too soon to say right now.
