
How GDDR7 memory could solve PC gaming’s VRAM woes

Micron has just announced that some of the best graphics cards might soon receive a considerable boost — and this announcement couldn’t have come at a better time.

According to Micron, the next-gen GDDR7 memory standard is expected to launch in the first half of 2024. Will this address the growing concerns regarding VRAM limitations, or will it simply contribute to the ongoing rise in GPU prices?

A graphic depicting the inside of a GPU. (Image: Micron)

While GDDR6 and GDDR6X are the current video memory standards in graphics cards, it’s time to move on — the tech has been around since 2018, which is a long time in the world of computer hardware. Micron has a fix in the form of GDDR7, and it talked about its plans during its recent earnings call. So far, everything appears to be on track, and GDDR7 memory should reach the market less than a year from now — but that doesn’t mean we’ll be seeing it in GPUs right away.


Once GDDR7 is here, it’ll definitely serve up a massive boost in bandwidth, and that could be a lifesaver for cards with a small memory bus, like the RTX 4060 Ti or the RX 7600. However, even high-end GPUs will certainly benefit.


Wccftech reports that Micron is aiming for 36Gbps per pin, while the current maximum is around 22Gbps in Nvidia’s GDDR6X solutions and 20Gbps for AMD’s GDDR6 options. Raising the per-pin data rate will drastically boost the memory bandwidth of any GPU equipped with GDDR7. For instance, a budget card with a 128-bit bus would offer 576GB/s of bandwidth, which is a huge step up. High-end models, like the RTX 4090, would be able to hit a massive 1.7TB/s of memory bandwidth versus the current maximum of around 1TB/s.
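For a quick sanity check on those numbers, peak memory bandwidth is just the per-pin data rate multiplied by the bus width in bits, then divided by 8 to convert bits to bytes. Here’s a minimal sketch of that math, assuming the 22Gbps and 36Gbps figures above (the card pairings are only illustrative):

```python
# Minimal sketch: peak memory bandwidth = per-pin data rate (Gbps) x bus width (bits) / 8.
# The data rates and card pairings below are taken from the figures above and are
# illustrative assumptions, not confirmed GDDR7 product specs.

def peak_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s for a given per-pin data rate and bus width."""
    return data_rate_gbps * bus_width_bits / 8

for label, bus_width in [("128-bit bus (RTX 4060 Ti class)", 128),
                         ("384-bit bus (RTX 4090 class)", 384)]:
    today = peak_bandwidth_gb_s(22, bus_width)   # ~22Gbps GDDR6X today
    gddr7 = peak_bandwidth_gb_s(36, bus_width)   # 36Gbps GDDR7 target
    print(f"{label}: {today:.0f} GB/s -> {gddr7:.0f} GB/s")

# Prints roughly:
# 128-bit bus (RTX 4060 Ti class): 352 GB/s -> 576 GB/s
# 384-bit bus (RTX 4090 class): 1056 GB/s -> 1728 GB/s
```

That’s where the 576GB/s and roughly 1.7TB/s figures come from: the bus widths don’t change, but each pin moves data much faster.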

This boost will arrive courtesy of Micron’s latest 1β (1-beta) node, which utilizes deep ultraviolet lithography (DUV). The node that will follow, dubbed 1γ (1-gamma), will move on to extreme ultraviolet lithography (EUV).

Micron’s announcement is, indeed, perfectly timed. There’s been a lot of buzz about the problems that limited VRAM and memory bandwidth can cause in gaming scenarios. Nvidia received quite a bit of backlash for its $400 RTX 4060 Ti, which only sports 8GB of VRAM across a 128-bit bus. Imagine how much better that card might have fared if it already had GDDR7 RAM to give it that crucial boost of bandwidth.

RTX 4090. (Image: Jacob Roach / Digital Trends)

The release schedule also lines up well with GPU road maps. Nvidia isn’t planning to release a follow-up to its RTX 40-series graphics cards until 2025, which gives it enough time to migrate to GDDR7. AMD is likely to follow suit, although it still has a fairly small range of RDNA 3 cards to fill out before it moves on to the next generation.

However, there’s a downside to GDDR7, which is that it will likely cause an increase in GPU prices. Seeing as some current-gen cards are already extremely overpriced, that doesn’t bode well for our wallets.

It also means that the GPUs that need this upgrade the most — meaning midrange to entry-level cards with a narrow memory bus — will likely not receive it for some time. We may see an RTX 5090 sporting GDDR7X VRAM, but the RTX 5060 might not be quite as lucky. Of course, nothing is certain right now, so we’ll just have to wait and see.

Monica J. White