Intel XeSS, or Xe Super Sampling, is a new feature of the upcoming Arc Alchemist graphics cards that is aimed at competing with technologies from AMD and Nvidia. XeSS won’t be here until Intel’s first discrete graphics cards launch in early 2022, but we have some hints about how it will stack up to the competition.
Based on what we know, XeSS looks similar to Nvidia Deep Learning Super Sampling (DLSS). DLSS uses artificial intelligence (A.I.) upscaling to improve performance in supported games, leveraging the dedicated Tensor cores on recent Nvidia graphics cards. AMD has a similar feature called FidelityFX Super Resolution (FSR) that supports multiple generations of graphics cards from different vendors, but it doesn’t use A.I.
XeSS looks like a balance between the two. It supports a lot of different hardware, but it can also take advantage of dedicated cores to enhance the image on Intel graphics cards. Here’s what we know about XeSS so far, as well as how it stacks up to DLSS and FSR.
When it comes to image quality, DLSS is winning right now. XeSS could change that once it launches — we’ll dig into why in a moment — but Nvidia’s tech handily beats AMD’s FSR. That’s because Nvidia uses deep learning to enhance quality, and it tracks motion vectors to prevent ghosting and other visual artifacts that come up with upscaling.
In their highest-quality modes, DLSS and FSR are roughly equal. Both techniques produce the odd visual artifact, but it’s very hard to tell either apart from native resolution without some serious magnification. The differences between them come up in the more demanding quality modes.
With FSR, there’s a steep decline after the highest-quality mode, as we saw in our FidelityFX Super Resolution review. In Godfall, for example, the Balanced mode produced a severe blur across textures, rendering the game virtually unplayable. Beyond the highest-quality mode, FSR simply falls apart.
That’s likely because FSR uses a dated upscaling algorithm, which is enhanced by a sharpening filter. DLSS uses a general A.I. model that has been trained on high-quality images of games, allowing it to recreate detail more accurately from a smaller internal resolution. There’s still a drop in quality with DLSS, but even the more aggressive performance modes are playable.
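To make the distinction concrete, here is a minimal, illustrative sketch in pure Python of spatial upscaling followed by a sharpening pass. FSR’s real kernels (an edge-adaptive, Lanczos-based upscaler plus the RCAS sharpening filter) are far more sophisticated; this only shows the general shape of the approach, with no A.I. involved:

```python
# Illustrative sketch only: nearest-neighbor upscale plus an unsharp-mask
# sharpening pass on a 2D grayscale image (list of lists of 0-255 values).
# FSR 1.0's actual EASU/RCAS kernels are far more advanced than this.

def upscale_nearest(img, factor):
    """Upscale a 2D grayscale image by an integer factor per axis."""
    out = []
    for row in img:
        new_row = []
        for px in row:
            new_row.extend([px] * factor)
        for _ in range(factor):
            out.append(new_row[:])
    return out

def sharpen(img, amount=0.5):
    """Unsharp mask: push each interior pixel away from its 4-neighbor average."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            blur = (img[y-1][x] + img[y+1][x] + img[y][x-1] + img[y][x+1]) / 4
            out[y][x] = min(255, max(0, img[y][x] + amount * (img[y][x] - blur)))
    return out

# A tiny 2x2 "frame" with a hard edge, upscaled 2x and then sharpened
# to restore some of the edge contrast that upscaling smears out.
frame = [[10, 200],
         [10, 200]]
big = upscale_nearest(frame, 2)   # now 4x4
crisp = sharpen(big)
```

The sharpening step is why FSR looks good at mild scale factors: it restores edge contrast. But it can only work with the pixels it is given, which is why quality collapses once the internal resolution drops too far.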
We don’t know about the image quality of XeSS yet, but a demo provided by Intel looks promising. The demo shows a scene going from 1080p to 4K, and zoomed in by four times, the version using XeSS renders a lot more detail. It’s important to point out that this is a demo created by Intel, not XeSS running in a game. We’ll reserve our verdict for when XeSS is actually here.
XeSS will likely be closer to DLSS than FSR, however. Like Nvidia, Intel is using machine learning and motion information for its upscaling. It also leverages dedicated hardware on upcoming Intel Arc Alchemist graphics cards, similar to how Nvidia takes advantage of Tensor cores on RTX graphics cards.
Assuming XeSS can at least hit the mark set by FSR and DLSS, the machine learning aspect should allow it to hold up at more aggressive quality modes. Intel hasn’t confirmed if XeSS has quality modes yet, but it seems like a necessary inclusion for the feature to have any legs.
DLSS and FSR support different games — and once XeSS launches, it’ll likely support different games, too. That makes direct comparisons tough, not only because the games are different but because the graphics cards are different. Nvidia requires a recent RTX graphics card, after all.
It’s better to look at how much of an improvement you can expect overall than at gains in specific games. It’s important to note that simply turning on DLSS or FSR will produce a much higher frame rate across titles. The two are largely the same when it comes to performance improvements; the bigger difference comes in image quality.
DLSS is capable of about a 2x improvement at 4K when using the aggressive Performance mode. In Wolfenstein: Youngblood, Nvidia shows the game going from 57 frames per second (fps) to 104 fps at 4K. In Control, DLSS shows even more gains — 25.8 fps to 69.2 fps.
FSR can push things a little further, though not by much. In FSR’s Performance mode, AMD shows Godfall jumping from 49 fps to 150 fps, which is higher than DLSS. It’s important to point out that FSR’s Performance mode is equivalent to DLSS’ Ultra Performance mode. It’s confusing, but you shouldn’t equate AMD’s Performance mode with Nvidia’s.
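The mode names make more sense in terms of render resolution. AMD publishes per-axis scale factors for FSR 1.0’s quality modes; the helper below is ours for illustration, not part of any SDK, and shows what each mode means for the resolution a game actually renders before upscaling:

```python
# Per-axis render-scale factors for FSR 1.0's quality modes, as published
# by AMD. The helper function is illustrative only, not part of any SDK.

FSR_MODES = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def render_resolution(target_w, target_h, scale):
    """Internal render resolution before the upscaler fills in the target."""
    return round(target_w / scale), round(target_h / scale)

# At a 4K (3840x2160) target:
for mode, scale in FSR_MODES.items():
    w, h = render_resolution(3840, 2160, scale)
    pixels = 100 / (scale * scale)  # percentage of target pixels rendered
    print(f"{mode}: {w}x{h} ({pixels:.0f}% of the 4K pixel count)")
```

Balanced and Performance hand the upscaler only about a third and a quarter of the target’s pixels, respectively, which goes a long way toward explaining why detail falls apart so quickly below the top mode.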
Similarly, Intel claims up to a 2x performance increase at 4K, but once again, we haven’t seen the feature in action. If Intel wants XeSS to compete, it needs to reach the 2x performance increase set by DLSS and FSR, and it looks like that’s Intel’s goal.
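Taken together, the vendor figures quoted above work out to the following speedups; this is simple arithmetic on the quoted frame rates, nothing more:

```python
# Speedups implied by the vendor-supplied frame rates quoted above.

def speedup(before_fps, after_fps):
    """Ratio of upscaled to native frame rate, rounded to two decimals."""
    return round(after_fps / before_fps, 2)

print(speedup(57, 104))     # Wolfenstein: Youngblood, DLSS Performance at 4K
print(speedup(25.8, 69.2))  # Control, DLSS
print(speedup(49, 150))     # Godfall, FSR Performance
```

All three land around or above the 2x mark that Intel says it is targeting with XeSS.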
Boiling the frame rates down, there isn’t a huge difference in performance. FSR can provide a slightly higher frame rate, but that’s mainly because Nvidia doesn’t offer its Ultra Performance mode in all games. Simply using one of the upscaling features gains you back a lot of performance, and across games, the results are largely the same.
It’s more about balancing performance with image quality. DLSS may not scale as high as FSR in certain games, but its image quality holds up in the aggressive performance modes. Due to the different hardware, however, it’s hard to draw firm conclusions. Add on top of that disparate game support — each feature supports a different list of games — and it becomes clear that comparing performance directly doesn’t mean much. Use what’s available; that will produce the best results.
DLSS is the oldest of the three upscaling technologies, and unsurprisingly, it supports the most games. It’s available in dozens of titles, including Cyberpunk 2077, Marvel’s Avengers, and Outriders, and Nvidia is constantly adding support for new games. The upcoming Back 4 Blood, for example, is launching with the feature.
FSR is much newer, but that hasn’t held it back from growing an impressive list of supported titles. At the time of publication, the heavy hitters are Resident Evil Village, Godfall, and Marvel’s Avengers, but AMD has announced that some big upcoming games will support it, too. Far Cry 6 and Forspoken are two of the larger games coming up with FSR support.
The number of games doesn’t tell the full story, however. Nvidia has aggressively pursued large, AAA games like Call of Duty: Modern Warfare, Doom Eternal, and Control, while FSR shows up in a lot of smaller, less demanding games that don’t call for an upscaling feature. DLSS often shows up alongside ray tracing, too, while FSR doesn’t.
DLSS not only supports more games, but it also supports more games you’ll actually use it in. That said, FSR is much easier for modders to splice into existing titles. It has already shown up in Grand Theft Auto 5, as well as in the PlayStation 3 emulator RPCS3.
For XeSS, Intel hasn’t announced any supported games. Despite saying “several” developers are engaged with the tech, Intel hasn’t confirmed or hinted at any titles that will support XeSS. With three upscaling technologies vying for the latest releases, Intel might have a hard time gaining a foothold. But we’ll need to wait and see once the feature is here.
The biggest difference between DLSS, FSR, and XeSS is hardware support — and it will be the difference that defines which is the best upscaling option. DLSS requires an Nvidia RTX graphics card. Not only is the feature limited to Nvidia hardware, but it’s also limited to the last two generations of Nvidia hardware.
That’s because DLSS requires the Tensor cores on recent Nvidia graphics cards, which handle the A.I. calculations. FSR doesn’t use A.I., so it doesn’t require any particular hardware. The strength of FSR isn’t broad game support or image quality that beats DLSS; it has neither. Its strength is that anyone can use it.
Outside of graphics cards from AMD and Nvidia, FSR also works on integrated graphics, APUs, and graphics cards that are older than a couple of generations. There’s a quality trade-off, but most gamers don’t have a recent Nvidia graphics card. The majority of people are still using older GPUs, an AMD card, or integrated graphics.
XeSS has a nice compromise between the two. Like DLSS, XeSS uses dedicated cores — called XMX cores on Intel graphics cards — to handle the A.I. calculations. XeSS requires these cores to work, so the full version of XeSS will only work on Intel graphics cards. But Intel is making two versions.
This is something we wanted to see out of DLSS. Essentially, Intel is offering developers two different versions of XeSS, one that requires the dedicated XMX cores and another that’s a general-purpose solution for a “wide range of hardware.” It’s the best of DLSS and FSR mashed up into one.
Although exciting, we still need to wait until XeSS is here. Two versions mean twice the work for developers if XeSS is difficult to implement, so it’s possible that developers won’t adopt it widely in their games. XeSS hinges on this dual-version approach, and if it works, it could dethrone DLSS.
A new challenger approaching
DLSS leads the pack in quality and game support. If it worked across multiple generations of graphics cards from different brands, it would make FSR obsolete. Although FSR can increase your performance and the highest-quality mode looks good, it sacrifices too much image quality with the more demanding modes.
XeSS could make both obsolete, assuming it can hold up in terms of image quality and performance. The fact that XeSS uses A.I. hints that image quality will be closer to DLSS, and the two-lane approach ensures that it will work across a broad range of hardware. Performance should be similar, too, based on Intel’s claims.
We need to get hands-on with XeSS before drawing any conclusions, but it looks like the DLSS competitor that many assumed FSR would be. Intel has a long road ahead, however, and a lot to prove. At the moment, DLSS still dominates the world of supersampling.