
AMD just revealed a game-changing feature for your graphics card


AMD is set to reveal a research paper about its technique for neural texture block compression at the Eurographics Symposium on Rendering (EGSR) next week. It sounds like some technobabble, but the idea behind neural compression is pretty simple. AMD says it’s using a neural network to compress the massive textures in games, which cuts down on both the download size of a game and its demands on your graphics card.

We’ve heard about similar tech before. Nvidia introduced a paper on Neural Texture Compression last year, and Intel followed up with a paper of its own that proposed an AI-driven level of detail (LoD) technique that could make models look more realistic from farther away. Nvidia’s claims about Neural Texture Compression are particularly impressive, with the paper asserting that the technique can store 16 times the data in the same amount of space as traditional block-based compression.
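To put those ratios in perspective, here's a back-of-the-envelope sketch (our own illustration, not code from any of the papers). Traditional block compression such as BC7 stores each 4x4 block of RGBA8 texels in 128 bits, a fixed 4:1 ratio; a neural codec that really packed 16 times the data into the same space would reach an effective ratio of roughly 64:1 against the uncompressed texture.

```python
def rgba8_size(width, height):
    # Uncompressed RGBA8: 4 bytes per texel
    return width * height * 4

def bc7_size(width, height):
    # Block compression (BC7): each 4x4 texel block -> 16 bytes (128 bits)
    blocks = (width // 4) * (height // 4)
    return blocks * 16

# A 4K texture (4096 x 4096)
raw = rgba8_size(4096, 4096)   # 64 MiB
bc7 = bc7_size(4096, 4096)     # 16 MiB, a fixed 4:1 ratio
# Nvidia's paper claims ~16x the data in the same space as block
# compression, implying roughly 64:1 versus the raw texture.
neural = bc7 // 16             # ~1 MiB for the same texture
print(raw // 2**20, bc7 // 2**20, raw // neural)  # 64 16 64
```

The sizes here are the standard BC7 figures; the "divide by 16" step simply applies the paper's headline claim, not any measured result.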


AMD hasn’t revealed its research yet, so there aren’t many details about how its method would work. The key to Nvidia’s approach is that it leverages the GPU to decompress textures in real time, cutting down on how much VRAM they occupy. Running out of VRAM has been an issue in several games released over the past couple of years, from Halo Infinite to The Last of Us Part I to Redfall. In all of them, textures drop to low quality when VRAM runs out, which is particularly noticeable on 8GB graphics cards like the RTX 4060 and RX 7600.

One detail AMD did reveal is that its method should be easier to integrate. The tweet announcing the paper reads, “unchanged runtime execution allows easy game integration.” Nvidia hasn’t said whether its technique is particularly hard to integrate, or whether it will require specific hardware to work (though the latter is probably a safe bet). AMD hasn’t mentioned any particular hardware requirements, either.

We'll present "Neural Texture Block Compression" @ #EGSR2024 in London.

Nobody likes downloading huge game packages. Our method compresses the texture using a neural network, reducing data size.

Unchanged runtime execution allows easy game integration. https://t.co/gvj1D8bfBf pic.twitter.com/XglpPkdI8D

— AMD GPUOpen (@GPUOpen) June 25, 2024

At this point, neural compression for textures isn’t a feature available in any game. These are just research papers, and it’s hard to say if they’ll ever turn into features on the level of something like Nvidia’s DLSS or AMD’s FSR. However, the fact that we’re seeing AI-driven compression from Nvidia, Intel, and now AMD suggests that this is a new trend in the world of PC gaming.

It makes sense, too. Features like DLSS have become a cornerstone of modern graphics cards, serving as an umbrella for a large swath of performance-boosting features. Nvidia’s CEO has said the company is looking into more ways to leverage AI in games, from generating objects to enhancing textures. As features like DLSS and FSR continue to become more prominent, it makes sense that AMD, Nvidia, and Intel would look to expand their capabilities.

If we do see neural texture compression as a marketable feature, it will likely show up with the next generation of graphics cards. Nvidia is expected to reveal its RTX 50-series GPUs in the second half of the year, AMD could showcase its next-gen RDNA 4 GPUs in a similar time frame, and Intel’s Battlemage architecture is arriving in laptops in a matter of months through Lunar Lake CPUs.

Jacob Roach
Former Lead Reporter, PC Hardware