
AMD just revealed a game-changing feature for your graphics card

AMD logo on the RX 7800 XT graphics card.
Jacob Roach / Digital Trends

AMD is set to reveal a research paper about its technique for neural texture block compression at the Eurographics Symposium on Rendering (EGSR) next week. It sounds like some technobabble, but the idea behind neural compression is pretty simple. AMD says it’s using a neural network to compress the massive textures in games, which cuts down on both the download size of a game and its demands on your graphics card.
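
To make that idea more concrete, here is a minimal sketch of what neural texture compression can look like in general: a tiny autoencoder that squeezes 4x4 texel blocks into small latent codes, which a lightweight decoder expands back into texels. This is purely illustrative and isn’t drawn from AMD’s paper; the block size, network shapes, and latent size are assumptions.

```python
# Toy illustration of neural texture compression: compress 4x4 RGB texel
# blocks into small latent codes with an autoencoder. Purely illustrative;
# not AMD's or Nvidia's published method, and all sizes are assumptions.
import torch
import torch.nn as nn

BLOCK = 4                    # 4x4 texel blocks, as in BC-style block compression
IN_DIM = BLOCK * BLOCK * 3   # 48 RGB values per block
LATENT = 8                   # hypothetical compressed code size per block

class BlockAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        # The encoder runs offline at build time to produce the compressed codes.
        self.encoder = nn.Sequential(nn.Linear(IN_DIM, 64), nn.ReLU(),
                                     nn.Linear(64, LATENT))
        # The small decoder is what a game would evaluate to get texels back.
        self.decoder = nn.Sequential(nn.Linear(LATENT, 64), nn.ReLU(),
                                     nn.Linear(64, IN_DIM))

    def forward(self, blocks):
        codes = self.encoder(blocks)        # 48 floats -> 8 floats per block
        return self.decoder(codes), codes   # reconstructed texels plus codes

model = BlockAutoencoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Train on random stand-in data; a real pipeline would use actual texture blocks.
blocks = torch.rand(1024, IN_DIM)
for _ in range(200):
    recon, codes = model(blocks)
    loss = nn.functional.mse_loss(recon, blocks)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

In a scheme like this, the compressed codes (plus the small decoder weights) are what would ship with the game, which is where the savings in download size and memory would come from.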

We’ve heard about similar tech before. Nvidia introduced a paper on Neural Texture Compression last year, and Intel followed up with a paper of its own that proposed an AI-driven level of detail (LoD) technique that could make models look more realistic from farther away. Nvidia’s claims about Neural Texture Compression are particularly impressive, with the paper asserting that the technique can store 16 times the data in the same amount of space as traditional block-based compression.
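
For a rough sense of scale (our own back-of-the-envelope arithmetic, not a figure from Nvidia’s paper): a conventional BC7 block stores a 4x4 texel tile in 128 bits, or 8 bits per texel, so packing 16 times as much data into the same footprint works out to an effective budget of roughly half a bit per texel.

```python
# Back-of-the-envelope arithmetic for the "16x" claim. The BC7 numbers are
# standard; the comparison itself is our own illustration, not Nvidia's math.
bc7_bits_per_block = 128                                      # 16 bytes per 4x4 tile
texels_per_block = 4 * 4
bc7_bits_per_texel = bc7_bits_per_block / texels_per_block    # 8.0 bits per texel
effective_bits_per_texel = bc7_bits_per_texel / 16            # ~0.5 bits per texel
print(bc7_bits_per_texel, effective_bits_per_texel)
```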

AMD hasn’t revealed its research yet, so there aren’t a ton of details about how its method would work. The key with Nvidia’s approach is that it leverages the GPU to decompress textures in real time, which cuts down on how much video memory a game needs. VRAM capacity has been an issue in several games released in the past couple of years, from Halo Infinite to The Last of Us Part I to Redfall. In all of these games, textures drop to low quality if you run out of VRAM, which is particularly noticeable on 8GB graphics cards like the RTX 4060 and RX 7600.
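
As a rough picture of what real-time decompression means in practice, here is a hypothetical sketch of the runtime side: the texture lives in memory as a grid of small latent codes, and a fetch decodes only the block it needs on demand instead of keeping full-resolution texels resident in VRAM. The decoder shape, grid layout, and function names are assumptions for illustration, not details from Nvidia’s or AMD’s work.

```python
# Hypothetical runtime-side sketch: store a texture as a grid of small latent
# codes and decode a 4x4 block on demand when a texel is sampled. Names and
# sizes are illustrative assumptions, not any vendor's actual implementation.
import torch
import torch.nn as nn

BLOCK, LATENT = 4, 8
decoder = nn.Sequential(nn.Linear(LATENT, 64), nn.ReLU(),
                        nn.Linear(64, BLOCK * BLOCK * 3))

# A 256x256 texture held as a 64x64 grid of 8-float codes instead of raw texels.
latent_grid = torch.randn(64, 64, LATENT)

def sample_texture(u: int, v: int) -> torch.Tensor:
    """Return the RGB color at texel (u, v), decoding its block on the fly."""
    code = latent_grid[v // BLOCK, u // BLOCK]          # compressed data in memory
    texels = decoder(code).reshape(BLOCK, BLOCK, 3)     # decoded only when needed
    return texels[v % BLOCK, u % BLOCK]

print(sample_texture(100, 37))
```

In a real game this decode would happen in a shader on the GPU, which is the sense in which an approach like this trades a bit of compute for a much smaller memory footprint.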

One detail AMD did reveal is that its method should be easy to integrate into existing games. The tweet announcing the paper reads, “unchanged runtime execution allows easy game integration.” Nvidia hasn’t said whether its technique is particularly hard to integrate, nor whether it will require specific hardware to work (though the latter is probably a safe bet). AMD hasn’t mentioned any particular hardware requirements, either.

We'll present "Neural Texture Block Compression" @ #EGSR2024 in London.

Nobody likes downloading huge game packages. Our method compresses the texture using a neural network, reducing data size.

Unchanged runtime execution allows easy game integration. https://t.co/gvj1D8bfBf pic.twitter.com/XglpPkdI8D

— AMD GPUOpen (@GPUOpen) June 25, 2024

At this point, neural compression for textures isn’t a feature available in any game. These are just research papers, and it’s hard to say if they’ll ever turn into features on the level of something like Nvidia’s DLSS or AMD’s FSR. However, the fact that we’re seeing AI-driven compression from Nvidia, Intel, and now AMD suggests that this is a new trend in the world of PC gaming.

It makes sense, too. Features like DLSS have become a cornerstone of modern graphics cards, serving as an umbrella for a large swath of performance-boosting features. Nvidia’s CEO has said the company is looking into more ways to leverage AI in games, from generating objects to enhancing textures. As features like DLSS and FSR continue to become more prominent, it’s natural that AMD, Nvidia, and Intel would look to expand their capabilities.

If neural texture compression does turn into a marketable feature, it will likely show up with the next generation of graphics cards. Nvidia is expected to reveal its RTX 50-series GPUs in the second half of the year, AMD could showcase its next-gen RDNA 4 GPUs in a similar time frame, and Intel’s Battlemage architecture is arriving in laptops in a matter of months through Lunar Lake CPUs.
