You could be gaming on AMD’s Navi graphics card before the end of the summer

Just days after catching our first glimpse of AMD’s next-generation 7nm Vega graphics card, the Radeon VII, the rumor mill spat out a tidbit about the cards coming after that, which reportedly are not far away. Off the back of AMD’s promise to tell us more about its 7nm Navi graphics cards before the end of 2019, new rumors suggest the company will debut the new range at E3 in June and put the cards on sale a month later.

Alongside AMD’s big CPU announcements, we expected to see a Navi unveiling, or at least a breakdown of what the new architecture might look like. That didn’t happen, but we got an in-depth look at the Vega-based Radeon VII instead: a surprisingly mature GPU that’s set to go on sale to the general public on February 7. As capable as that card is, though, with its alleged RTX 2080-like performance, Navi is expected to be a full generation of new GPUs with a more midrange focus, and is therefore likely to see much broader adoption among gamers.

All we’ve heard from AMD itself on the subject of Navi over the last year are tantalizing hints of promising performance. Unverified sources have proved far more exciting, though, with one claiming that a supposed Navi-based “RX 3080” would be priced at $250 while offering performance comparable to Nvidia’s $500 RTX 2070. That seems almost too good to be true, but if the latest rumor is anything to go by, we won’t have long to wait to find out how accurate it is: an E3 showing would put the cards less than six months away from an unveiling, and only a little more than that from a retail release.

According to sources cited by RedGamingTech, the Navi cards we’ll see around the midpoint of this year will be “low to midrange,” which suggests they will compete with the likes of Nvidia’s RTX 2060, its rumored GTX 1660 Ti, and anything beneath them in the product stack.

It has long been reported that we won’t see a high-end Navi solution until 2020, though that could clash with AMD’s planned successor generation, code-named Arcturus, which is rumored to launch that same year. Considering AMD has described Navi as a very “scalable” architecture, its potential for high-end solutions is intriguing. As Tom’s Hardware suggests, that could mean it offers strong multi-chip or CrossFire performance, or that it scales well beyond the 4,096 stream processors that cap the Graphics Core Next architecture.

Jon Martindale
Jon Martindale is a freelance evergreen writer and occasional section coordinator, covering how-to guides, best-of lists, and…
This could be the reason you upgrade your GPU
Now more than ever, the best graphics cards aren't defined by their raw performance alone -- they're defined by their features. Nvidia has set the stage with DLSS, which now encompasses upscaling, frame generation, and a ray tracing denoiser, and AMD is hot on Nvidia's heels with FSR 3. But what will define the next generation of graphics cards?

It's no secret that features like DLSS 3 and FSR 3 are a key factor when buying a graphics card in 2024, and I suspect AMD and Nvidia are privy to that trend. We already have a taste of what could come in the next generation of GPUs from Nvidia, AMD, and even Intel, and it could make a big difference in PC gaming. It's called neural texture compression.
Let's start with texture compression

Read more
AMD just revealed a game-changing feature for your graphics card
AMD is set to reveal a research paper about its technique for neural texture block compression at the Eurographics Symposium on Rendering (EGSR) next week. It sounds like some technobabble, but the idea behind neural compression is pretty simple. AMD says it's using a neural network to compress the massive textures in games, which cuts down on both the download size of a game and its demands on your graphics card.

We've heard about similar tech before. Nvidia introduced a paper on Neural Texture Compression last year, and Intel followed up with a paper of its own that proposed an AI-driven level of detail (LoD) technique that could make models look more realistic from farther away. Nvidia's claims about Neural Texture Compression are particularly impressive, with the paper asserting that the technique can store 16 times the data in the same amount of space as traditional block-based compression.
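To put that 16x claim in context, here is a back-of-the-envelope size comparison. This is only a sketch: the 4K texture size and the BC7 block format are my own illustrative choices, and the "16x the data in the same space" figure is simply Nvidia's claim taken at face value.

```python
# Rough texture-memory arithmetic, illustrating why compression matters.
# Assumptions: a 4096x4096 RGBA8 texture, BC7-style block compression
# (16 bytes per 4x4 texel block), and Nvidia's claimed 16x density gain
# for neural compression over block compression.

TEXELS = 4096 * 4096           # one 4K-by-4K texture

uncompressed = TEXELS * 4      # RGBA8: 4 bytes per texel
bc7 = TEXELS * 1               # BC7: 16 bytes per 4x4 block = 1 byte/texel
neural = bc7 / 16              # claimed: 16x the data in the same space

for name, size in [("uncompressed", uncompressed),
                   ("BC7", bc7),
                   ("neural (claimed)", neural)]:
    print(f"{name:>16}: {size / 2**20:.1f} MiB")
```

At those rates a single 4K texture drops from 64 MiB raw to 16 MiB block-compressed, and to about 1 MiB under the claimed neural scheme, which is where the download-size and VRAM savings would come from.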

Read more
AMD’s multi-chiplet GPU design might finally come true
An interesting AMD patent has just surfaced, and although it was filed a while back, finding it now is all the more exciting because this tech might be closer to appearing in future graphics cards. The patent describes a multi-chiplet GPU with three separate dies, which is something that could both improve performance and cut back on production costs.

In the patent, AMD describes a GPU that's partitioned into multiple dies, which it calls GPU chiplets. These chiplets can either function together as a single GPU or operate as multiple GPUs in what AMD calls the "second mode." The GPU has three modes in total, the first of which makes all the chiplets work together as a single, unified GPU. This lets them share resources and, as Tom's Hardware notes, allows the front-end die to handle command scheduling for all the shader engine dies, much as a regular, non-chiplet GPU would.
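The two modes described above can be sketched in a few lines. This is purely illustrative (the names and logic are my own, not AMD's): it just models a front-end die either merging every command stream and spreading the work across all shader-engine dies, or treating each die as its own GPU.

```python
# Illustrative sketch only -- not AMD's implementation. Models the two
# chiplet modes the patent excerpt describes: all dies acting as one GPU,
# or each die exposed as an independent GPU.
from enum import Enum, auto

class Mode(Enum):
    UNIFIED = auto()    # first mode: all chiplets act as one logical GPU
    MULTI_GPU = auto()  # "second mode": chiplets exposed as separate GPUs

def schedule(mode: Mode, shader_dies: list[str],
             streams: list[list[str]]) -> dict[str, list[str]]:
    """Assign client command streams to shader-engine dies."""
    if mode is Mode.UNIFIED:
        # The front-end die merges every stream and round-robins the work
        # across all shader dies, like a monolithic GPU's scheduler would.
        merged = [cmd for stream in streams for cmd in stream]
        plan = {die: [] for die in shader_dies}
        for i, cmd in enumerate(merged):
            plan[shader_dies[i % len(shader_dies)]].append(cmd)
        return plan
    # MULTI_GPU: each die serves its own stream independently.
    return {die: list(s) for die, s in zip(shader_dies, streams)}

print(schedule(Mode.UNIFIED, ["die0", "die1", "die2"],
               [["draw", "dispatch"], ["copy"]]))
```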

Read more