
The surprising reason your powerful PC still can’t handle the latest games

We’re off to a rocky start with PC releases in 2023. Hogwarts Legacy, Resident Evil 4 Remake, Forspoken, and most recently and notably The Last of Us Part One have all launched in dire states, with crashes, hitches, and lower performance despite only a minor increase in visual quality. A big reason is that the graphics cards of the last few years aren’t equipped to handle the demands of today’s games.

The GPUs themselves are powerful enough; raw horsepower isn’t the issue. The problem is video memory, or VRAM. Many of the most powerful GPUs from the previous generation simply weren’t equipped with enough VRAM for modern games, which may explain why your relatively powerful PC can’t handle the latest and most exciting releases.

What does your VRAM do anyway?

[Image: Nvidia GPU core. Niels Broekhuijsen / Digital Trends]

Think of your graphics card as a self-contained computer. In your PC, your processor and RAM work together to do the brunt of the processing work: the processor handles the actual calculations, while the RAM keeps the data those calculations need close at hand. If your CPU had to reach out to your hard drive every time it wanted to do a calculation, your computer would be too slow to be useful.

Your graphics card works the same way. The GPU handles the actual processing, while the VRAM holds the data that processing needs. This most notably comes up with texture resolution, since higher-resolution textures take up far more space than lower-resolution ones. But other data flows in and out of VRAM, too: shadow maps, geometry, and critically, shaders.
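
To get a rough sense of why texture resolution matters so much, consider a quick back-of-the-envelope calculation. This Python sketch uses illustrative, uncompressed RGBA8 sizes; real games use compressed texture formats that shrink these numbers considerably, but the scaling with resolution is the same:

```python
# Illustrative math: uncompressed VRAM footprint of a single texture.
# Real games use compressed formats, so actual sizes are smaller,
# but each step up in resolution still quadruples the footprint.

def texture_size_mb(width: int, height: int,
                    bytes_per_pixel: int = 4,  # RGBA8 = 4 bytes per pixel
                    mipmaps: bool = True) -> float:
    base = width * height * bytes_per_pixel
    total = base * 4 / 3 if mipmaps else base  # a full mip chain adds ~33%
    return total / 1024 ** 2

for res in (1024, 2048, 4096):
    print(f"{res}x{res}: ~{texture_size_mb(res, res):.0f} MB uncompressed")
```

A 4K texture works out to roughly 16 times the size of a 1K texture, which is why cranking up texture quality fills VRAM so quickly.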

Shaders, especially in titles with ray tracing, are complex and take up a lot of space in VRAM. Combined with rising texture resolutions, the demands of modern AAA games often push past the 8GB of VRAM that was the standard in years past, especially if you’re playing at higher resolutions. Unfortunately, that’s not a problem many last-gen GPUs were built to account for.

The RTX 3070 Ti problem

[Image: Nvidia’s RTX 3070 Ti graphics card]

When ExtremeTech published a round-up of RTX 3070 Ti reviews, it didn’t mince words: the card had a “long-term problem” with its low VRAM, and we’re starting to see that problem take shape.

Resident Evil 4 Remake can hog up to 8GB of VRAM on textures alone, though you have the option to go much lower. The Last of Us Part One can consume nearly 7GB at its lowest graphics preset and upwards of 14GB at its highest. And Hogwarts Legacy sucked up nearly 13GB of VRAM with ray tracing on, and close to 8GB with it off.

The effects of this are already clear. In preliminary testing of The Last of Us Part One, Hardware Unboxed found massive stuttering with 8GB of VRAM compared to 16GB, even between two graphics cards that should perform at around the same level. Keep in mind, too, that the game’s recommended system requirements call for only 4GB of VRAM.

Even powerful graphics cards from the last couple of years are running out of VRAM. Stuttering is one issue, but running out of VRAM can also cause crashes and force you to turn down settings that your GPU is otherwise capable of handling.

I’m calling this the RTX 3070 Ti problem, but it’s not exclusive to the RTX 3070 Ti. The card just serves as a good touchstone for a wide swath of GPUs that are stuck at or under 8GB of VRAM despite sporting excellent GPU power otherwise. Even the 10GB RTX 3080 isn’t immune, especially at the highest graphics settings at 4K.

Focused in one direction

[Image: Two Intel Arc graphics cards on a pink background. Jacob Roach / Digital Trends]

It’s upsetting that graphics cards that should be plenty powerful to run modern games are simply running out of VRAM, causing stuttering and crashes that shouldn’t be happening. Most of this problem is focused in one direction, though: Nvidia.

Nvidia makes the best graphics cards you can buy today, but AMD and Intel have put more effort into VRAM, even on lower-end models. Intel’s Arc A770, for example, includes 16GB of VRAM for under $350, while even the $900 RTX 4070 Ti only includes 12GB. Similarly, AMD opted for 12GB of memory on its midrange RX 6700 XT, while Nvidia stuck with 8GB. That can make a difference in games like Hogwarts Legacy, where Intel’s GPU performs much better than its price would suggest.

Some of that is being rectified with newer cards. Rumors suggest Nvidia’s RTX 4070 could carry 12GB of VRAM, but it still stings that high-end GPUs capable of running the most demanding games are hitting a wall simply because of VRAM limits. Unfortunately, if you’re running out of video memory, there’s not much you can do short of upgrading your graphics card.

You can reduce some stuttering issues, though. If you’re limited by VRAM, turning down your texture resolution can help a lot. You can also reset your shader cache through AMD Software, or try increasing the shader cache size in the Nvidia Control Panel. The real fix, however, is more VRAM on graphics cards, especially lower-end models, which is going to come as a major letdown for those who recently upgraded.
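
If you want to confirm that VRAM is the bottleneck, overlays like MSI Afterburner or the Windows Task Manager will show usage while you play. On Nvidia cards, you can also query it programmatically. Here’s a minimal Python sketch using the pynvml bindings, assuming you have an Nvidia GPU and the nvidia-ml-py package installed:

```python
# Minimal sketch: read GPU memory usage via NVML (Nvidia cards only).
# Assumes: pip install nvidia-ml-py
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
info = pynvml.nvmlDeviceGetMemoryInfo(handle)

print(f"VRAM used: {info.used / 1024**2:.0f} MB "
      f"of {info.total / 1024**2:.0f} MB")

# Usage pinned near the total while a game stutters is a strong hint
# that data is spilling over the PCIe bus into slower system RAM.
pynvml.nvmlShutdown()
```

Run it while a game is loaded: if usage sits at or near your card’s capacity during stutters, you’re most likely VRAM-bound rather than GPU-bound.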

Jacob Roach
Senior Staff Writer, Computing