
Intel XeSS is already disappointing, but there’s still hope

Intel’s hotly anticipated Xe Super Sampling (XeSS) tech is finally here, arriving a couple of weeks before Intel’s Arc Alchemist GPUs show up. It’s available now in Death Stranding and Shadow of the Tomb Raider, and more games are sure to come. But right now, it’s really difficult to recommend turning XeSS on.

Bugs, lackluster performance, and poor image quality have gotten XeSS off to a rough start. Although there are glimmers of hope (especially with Arc’s native usage of XeSS), Intel has a lot of work ahead to get XeSS on the level of competing features from AMD and Nvidia.

Spotty performance

Norman Reedus crying in Death Stranding.

Before getting into performance and image quality, it’s important to note that there are two upscaling models for XeSS. One is for Intel’s Arc Alchemist GPUs, while the other uses DP4a instructions on GPUs that support them. Both use AI, but the DP4a version can’t do the calculations nearly as fast as Arc’s dedicated XMX cores. Because of that, the DP4a version uses a simpler upscaling model. On Arc GPUs, not only should performance be better, but image quality should be, too.
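To make that distinction concrete: DP4a is a dot product of four packed 8-bit integers, accumulated into a 32-bit integer, which is the kind of low-precision math AI inference leans on. Here’s a minimal C++ sketch of what a single DP4a operation computes (this illustrates the instruction’s semantics only, it is not Intel’s actual XeSS kernel, and the dp4a function name is ours):

```cpp
#include <cstdint>
#include <iostream>

// Sketch of DP4a semantics: multiply four packed 8-bit integers pairwise
// and accumulate the results into a 32-bit integer. GPUs with hardware
// DP4a run this at full rate; Arc's XMX cores do wider matrix versions.
int32_t dp4a(uint32_t a, uint32_t b, int32_t acc) {
    for (int i = 0; i < 4; ++i) {
        int8_t ai = static_cast<int8_t>(a >> (8 * i));
        int8_t bi = static_cast<int8_t>(b >> (8 * i));
        acc += static_cast<int32_t>(ai) * static_cast<int32_t>(bi);
    }
    return acc;
}

int main() {
    // Two vectors of four int8 values each, packed into 32-bit words.
    uint32_t a = 0x01020304; // bytes (low to high): 4, 3, 2, 1
    uint32_t b = 0x01010101; // bytes: 1, 1, 1, 1
    std::cout << dp4a(a, b, 0) << "\n"; // 4 + 3 + 2 + 1 = 10
}
```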


We don’t have Arc GPUs yet, so I tested the DP4a version. To avoid any confusion, I’ll refer to it as “XeSS Lite” for the remainder of this article.

That’s the most fitting name because XeSS Lite isn’t the best showcase of Intel’s supersampling tech. Death Stranding provided the most consistent experience, and it’s the best point of comparison because it includes Nvidia’s Deep Learning Super Sampling (DLSS) and AMD’s FidelityFX Super Resolution 2.0 (FSR 2.0).

XeSS performance results for the RTX 3060 Ti in Death Stranding.

With the RTX 3060 Ti and a Ryzen 9 7950X, XeSS trailed in both its Quality and Performance modes. DLSS is the performance leader, but FSR 2.0 isn’t far behind (about 6% lower in Performance mode). XeSS in its Performance mode is a full 18% behind DLSS. XeSS still provides nearly a 40% boost over native resolution, but DLSS and FSR 2.0 are significantly ahead (71% and 61%, respectively).

The situation is worse with AMD’s RX 6600 XT. It seems XeSS Lite heavily favors Nvidia’s GPUs at the moment, as XeSS only provided a 24% boost in its Performance mode. That may sound decent, but consider that FSR 2.0 provides a 66% jump. In Quality mode, XeSS provided basically no benefit, with only a 3% increase.

XeSS results for the RX 6600 XT in Death Stranding.
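As a side note, the percentage figures in these comparisons are plain frame-rate uplifts over native rendering. A quick C++ sketch of that arithmetic, using hypothetical frame rates chosen only to reproduce the RTX 3060 Ti percentages above (these are not our measured numbers):

```cpp
#include <iostream>

// Percentage uplift of an upscaled frame rate over native rendering.
double uplift(double upscaled_fps, double native_fps) {
    return (upscaled_fps / native_fps - 1.0) * 100.0;
}

int main() {
    // Hypothetical frame rates, picked only to show how the percentages
    // in the text relate to one another.
    double native = 100.0;
    double dlss_perf = 171.0; // a 71% boost over native
    double xess_perf = 140.0; // a ~40% boost over native

    std::cout << "DLSS uplift:   " << uplift(dlss_perf, native) << "%\n";
    std::cout << "XeSS uplift:   " << uplift(xess_perf, native) << "%\n";
    // 140 / 171 works out to roughly -18%, the gap quoted above.
    std::cout << "XeSS vs. DLSS: " << uplift(xess_perf, dlss_perf) << "%\n";
}
```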

Shadow of the Tomb Raider also shows the disparity between recent Nvidia and AMD GPUs, but I’ll let the charts do the talking on that front. There’s a much bigger story with Shadow of the Tomb Raider. Across both the RX 6600 XT and RTX 3060 Ti, XeSS would consistently break the game.

I was finally able to get the Performance mode to work by setting the game to exclusive fullscreen and turning on XeSS in the launcher (thank goodness this game has a launcher). If I turned on XeSS from the main menu instead, the game would slow to a slideshow. And in the case of the Quality mode, I couldn’t get a consistent run even with the launcher workaround.

A new update for Shadow of the Tomb Raider reportedly fixes the bug, but we haven’t had a chance to retest yet. For now, make sure to update to the latest version of the game if you want to use XeSS.

I tried out Shadow of the Tomb Raider on my personal rig with an RTX 3080 12GB, and it worked great without the launcher workaround. This is the case for many GPUs, and the update should fix the startup crashes that were occurring for others.

Poor image quality

The performance of XeSS isn’t great right now, but the bigger disappointment is image quality. It lags a bit behind FSR 2.0 and DLSS in Quality mode, and bumping down to Performance mode shows just how far behind XeSS currently is.

Shadow of the Tomb Raider is the best example of that. DLSS looks a bit better, using sharpening to pull out some extra detail on the distant skull below. XeSS, on the other hand, falls apart. In Performance mode, it looks like you’re simply running the game at a lower resolution.

Image quality in Shadow of the Tomb Raider.

This is zoomed in quite a bit, so the difference isn’t nearly as stark when zoomed out. And the Quality mode holds up decently. It still suffers from the low-res look under this much magnification, but in Quality mode the differences are much harder to spot when you’re actually playing the game.

XeSS image quality in Shadow of the Tomb Raider
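Part of why Performance mode degrades so visibly is that these upscalers render internally at a fraction of the output resolution and reconstruct the rest. A quick sketch, assuming the per-axis scale factors temporal upscalers commonly use (roughly 1.5x for Quality and 2x for Performance; treat these as assumptions rather than Intel-confirmed values for every XeSS mode):

```cpp
#include <iostream>

struct Resolution { int width, height; };

// Internal render resolution for a given upscaler mode, using assumed
// per-axis scale factors common to temporal upscalers.
Resolution internalRes(Resolution output, double scale) {
    return { static_cast<int>(output.width / scale),
             static_cast<int>(output.height / scale) };
}

int main() {
    Resolution out{3840, 2160}; // 4K output
    Resolution quality = internalRes(out, 1.5);     // ~2560 x 1440
    Resolution performance = internalRes(out, 2.0); // 1920 x 1080
    std::cout << "Quality:     " << quality.width << " x " << quality.height << "\n";
    std::cout << "Performance: " << performance.width << " x " << performance.height << "\n";
    // Performance mode reconstructs four times as many pixels as it
    // renders, which is why weaknesses in the model show up there first.
}
```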

Death Stranding tells a different story, largely because it includes FSR 2.0. In Quality mode, FSR 2.0 and native resolution are close, aided a lot by FSR 2.0’s aggressive sharpening. DLSS isn’t quite as sharp, but it still manages to maintain most of the detail on protagonist Sam Porter Bridges. XeSS is a step behind, though the gap isn’t as stark as in Shadow of the Tomb Raider. It manages to reproduce details, but they’re not as well-defined; see the hood and shoulder on Bridges and the rock behind him.

XeSS Quality comparison in Death Stranding.

Performance mode is where things get interesting. Once again, FSR 2.0 is ahead in a still image thanks to its aggressive sharpening, but XeSS and DLSS are almost identical. XeSS’ frame rates still trail, and Intel still needs to work with developers on better XeSS implementations. But this showcases that XeSS can be competitive. One day, at least.

XeSS Performance comparison in Death Stranding.
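Since sharpening keeps coming up in these comparisons: a post-upscale sharpening pass boosts each pixel relative to its neighbors so fine detail stands out in still images. FSR 2.0’s actual filter (RCAS) is contrast-adaptive and more careful than this, but a basic fixed 3x3 kernel shows the principle:

```cpp
#include <algorithm>
#include <iostream>
#include <vector>

// Basic 3x3 sharpen: weight each pixel by 5 and subtract its four
// neighbors, exaggerating local contrast so edges stand out.
std::vector<float> sharpen(const std::vector<float>& img, int w, int h) {
    std::vector<float> out(img);
    for (int y = 1; y < h - 1; ++y) {
        for (int x = 1; x < w - 1; ++x) {
            float v = 5.0f * img[y * w + x]
                    - img[(y - 1) * w + x] - img[(y + 1) * w + x]
                    - img[y * w + x - 1]  - img[y * w + x + 1];
            out[y * w + x] = std::clamp(v, 0.0f, 1.0f);
        }
    }
    return out;
}

int main() {
    // A soft vertical edge: 0.3 on the left, 0.7 on the right.
    int w = 4, h = 3;
    std::vector<float> img = {0.3f, 0.3f, 0.7f, 0.7f,
                              0.3f, 0.3f, 0.7f, 0.7f,
                              0.3f, 0.3f, 0.7f, 0.7f};
    auto sharp = sharpen(img, w, h);
    // The pixels on either side of the edge get pushed apart:
    // 0.3 drops to 0.0 and 0.7 rises to 1.0, making the edge pop.
    std::cout << sharp[1 * w + 1] << " " << sharp[1 * w + 2] << "\n";
}
```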

Just like with performance, it’s important to keep in mind that this isn’t the full XeSS experience. Without Arc GPUs to test yet, it’s hard to say if XeSS’ image quality will improve once it’s running on the GPUs it was intended to run on. For now, XeSS Lite is behind on image quality, though Death Stranding is proof that it could catch up.

Will XeSS hold up on Arc?

Intel Arc A750M Limited Edition graphics card sits on a desk.

XeSS is built first and foremost for Intel Arc Alchemist, so although these comparisons are useful now, everything will come down to how XeSS performs once Arc GPUs are here. XeSS Lite still needs some work, especially in Shadow of the Tomb Raider, but Death Stranding is a promising sign that the tech can get there eventually.

Even then, it’s clear that XeSS isn’t the end-all-be-all supersampling tech it was billed as. It’s possible that by trying to do both machine learning-based and general-purpose supersampling, XeSS will lose on both fronts. For now, though, we just have to wait until we can test Intel’s GPUs with XeSS.
