
Intel XeSS is already disappointing, but there’s still hope

Intel’s hotly anticipated Xe Super Sampling (XeSS) tech is finally here, arriving a couple of weeks before Intel’s Arc Alchemist GPUs show up. It’s available now in Death Stranding and Shadow of the Tomb Raider, and more games are sure to come. But right now, it’s hard to recommend turning XeSS on.

Bugs, lackluster performance, and poor image quality have gotten XeSS off to a rough start. Although there are glimmers of hope (especially with Arc’s native use of XeSS), Intel has a lot of work ahead to bring XeSS up to the level of competing features from AMD and Nvidia.

Spotty performance

Norman Reedus crying in Death Stranding.

Before getting into performance and image quality, it’s important to note that there are two upscaling models for XeSS. One is for Intel’s Arc Alchemist GPUs, while the other uses DP4a instructions on GPUs that support them. Both use AI, but the DP4a version can’t do the calculations nearly as fast as Arc’s dedicated XMX cores. Because of that, the DP4a version uses a simpler upscaling model. On Arc GPUs, not only should performance be better, but image quality should be, too.
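
For a sense of what that means in practice, here’s a minimal sketch (not Intel’s code; the function is purely illustrative) of what a single DP4a operation computes: a dot product of four packed 8-bit integers accumulated into a 32-bit value. An upscaling network’s layers boil down to enormous numbers of these dot products, which is why hardware that can chew through them in wide, fused batches, like Arc’s XMX units, can afford a heavier model in the same frame budget.

```python
# Illustrative sketch only: what one DP4a-style operation computes.
# DP4a multiplies two vectors of four signed 8-bit integers element-wise
# and adds the sum to a 32-bit accumulator.
def dp4a(a: list[int], b: list[int], acc: int = 0) -> int:
    assert len(a) == 4 and len(b) == 4
    assert all(-128 <= v <= 127 for v in a + b)  # int8 range
    return acc + sum(x * y for x, y in zip(a, b))

# Example: 1*5 + (-2)*6 + 3*(-7) + 4*8 = 4
print(dp4a([1, -2, 3, 4], [5, 6, -7, 8]))  # -> 4
```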

We don’t have Arc GPUs yet, so I tested the DP4a version. To avoid any confusion, I’ll refer to it as “XeSS Lite” for the remainder of this article.

That’s the most fitting name because XeSS Lite isn’t the best showcase of Intel’s supersampling tech. Death Stranding provided the most consistent experience, and it’s the best point of comparison because it includes Nvidia’s Deep Learning Super Sampling (DLSS) and AMD’s FidelityFX Super Resolution 2.0 (FSR 2.0).

XeSS performance results for the RTX 3060 Ti in Death Stranding.

With the RTX 3060 Ti and a Ryzen 9 7950X, XeSS trailed in both its Quality and Performance modes. DLSS is the performance leader, and FSR 2.0 isn’t far behind (about 6% lower in Performance mode), while XeSS in its Performance mode sits a full 18% behind DLSS. XeSS still provides nearly a 40% boost over native resolution, but DLSS and FSR 2.0 are significantly ahead (71% and 61%, respectively).
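
As a quick sanity check on how those figures fit together, the percentages are measured against the native-resolution result, which is why XeSS can be roughly 40% up on native and still about 18% behind DLSS. The frame rates below are placeholders to show the arithmetic, not our benchmark data.

```python
# Hedged arithmetic sketch with a hypothetical 60 fps native baseline;
# the real averages come from benchmark runs, not this example.
def uplift(new_fps: float, base_fps: float) -> float:
    """Percentage change of new_fps relative to base_fps."""
    return (new_fps / base_fps - 1) * 100

native = 60.0
xess_perf = native * 1.40   # ~40% over native
dlss_perf = native * 1.71   # ~71% over native

print(round(uplift(xess_perf, native)))     # 40  -> XeSS gain over native
print(round(uplift(xess_perf, dlss_perf)))  # -18 -> XeSS relative to DLSS
```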

The situation is worse with AMD’s RX 6600 XT. It seems XeSS Lite heavily favors Nvidia’s GPUs at the moment, as XeSS only provided a 24% boost in its Performance mode. That may sound decent, but consider that FSR 2.0 provides a 66% jump. In Quality mode, XeSS provided basically no benefit, with only a 3% increase.

XeSS results for the RX 6600 XT in Death Stranding.

Shadow of the Tomb Raider also shows the disparity between recent Nvidia and AMD GPUs, but I’ll let the charts do the talking on that front. The bigger story is that, across both the RX 6600 XT and RTX 3060 Ti, XeSS would consistently break the game.

I was finally able to get the Performance mode to work by setting the game to exclusive fullscreen and turning on XeSS in the launcher (thank goodness this game has a launcher). If I turned on XeSS in the main menu instead, the game would slow to a slideshow. And in the case of the Quality mode, I couldn’t get a consistent run even with the launcher workaround.

A new update for Shadow of the Tomb Raider reportedly fixes the bug, but we haven’t had a chance to retest yet. For now, make sure to update to the latest version of the game if you want to use XeSS.

I tried out Shadow of the Tomb Raider on my personal rig with an RTX 3080 12GB, and it worked great without the launcher workaround. This is the case for many GPUs, and the update should fix the startup crashes that were occurring for others.

Poor image quality

The performance of XeSS isn’t great right now, but the bigger disappointment is image quality. It lags a bit behind FSR 2.0 and DLSS in the Quality mode, but bumping down to the Performance mode shows just how far behind XeSS really is.

Shadow of the Tomb Raider is the best example of that. DLSS looks a bit better, using sharpening to pull out some extra detail on the distant skull below. XeSS, meanwhile, falls apart. In Performance mode, it looks like you’re simply running the game at a lower resolution.
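
That low-res impression tracks with how these presets work: the game renders internally at a reduced resolution and the upscaler reconstructs the output image. As a rough illustration, the scale factors below (about 1.5x per axis for Quality and 2x for Performance) are common defaults for this class of upscaler and are an assumption here, not confirmed values for this game.

```python
# Rough illustration with assumed per-axis scale factors; actual values
# can vary by upscaler version and per-game implementation.
def internal_resolution(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    return round(out_w / scale), round(out_h / scale)

print(internal_resolution(3840, 2160, 1.5))  # Quality at 4K     -> (2560, 1440)
print(internal_resolution(3840, 2160, 2.0))  # Performance at 4K -> (1920, 1080)
```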

Image quality in Shadow of the Tomb Raider.

This is zoomed in quite a bit, so the difference isn’t nearly as stark when zoomed out. The Quality mode also holds up decently: it still suffers from the low-res look under this much magnification, but the differences are much harder to spot when you’re actually playing the game.

XeSS image quality in Shadow of the Tomb Raider.

Death Stranding tells a different story, largely because it includes FSR 2.0. In Quality mode, FSR 2.0 and native resolution are close, aided in large part by FSR 2.0’s aggressive sharpening. DLSS isn’t quite as sharp, but it still manages to maintain most of the detail on protagonist Sam Porter Bridges. XeSS is a step behind, though the gap isn’t as stark as in Shadow of the Tomb Raider. It manages to reproduce details, but they’re not as well-defined; see the hood and shoulder on Bridges and the rock behind him.

XeSS Quality comparison in Death Stranding.

Performance mode is where things get interesting. Once again, FSR 2.0 is ahead in a still image thanks to its aggressive sharpening, but XeSS and DLSS are almost identical. Frame rates still lag, and Intel still needs to work with developers on better XeSS implementations, but this shows that XeSS can be competitive. One day, at least.

XeSS Performance comparison in Death Stranding.

Just like with performance, it’s important to keep in mind that this isn’t the full XeSS experience. Without Arc GPUs to test yet, it’s hard to say if XeSS’ image quality will improve once it’s running on the GPUs it was intended to run on. For now, XeSS Lite is behind on image quality, though Death Stranding is proof that it could catch up.

Will XeSS hold up on Arc?

Intel Arc A750M Limited Edition graphics card sits on a desk.

XeSS is built first and foremost for Intel Arc Alchemist, so although these comparisons are useful now, it will all come down to how XeSS performs once Arc GPUs are here. XeSS Lite still needs some work, especially in Shadow of the Tomb Raider, but Death Stranding is a promising sign that the tech can get there eventually.

Even then, it’s clear that XeSS isn’t the end-all, be-all supersampling tech it was billed as. It’s possible that by trying to do both machine learning and general-purpose supersampling, XeSS will lose on both fronts. For now, though, we just have to wait until we can test Intel’s GPUs with XeSS.

Jacob Roach
Senior Staff Writer, Computing