The 5 worst Nvidia GPUs of all time

Nvidia has a strong pedigree for making great graphics cards. It has never really been the underdog, and its best GPUs have outshone rival AMD’s time and time again. But despite Nvidia’s penchant for innovation and technological advancement, it has put out quite a few abominable cards, cursed not necessarily by bad technology but often by poor decision-making. Let’s reminisce about some Nvidia GPUs we wish we could forget.

GeForce GTX 480

The way it’s meant to be grilled

The Nvidia GeForce GTX 480. Image: Hyins

Although Nvidia has been in business for over 20 years now, there’s really only one GPU the company has ever put out that was truly terrible on a technological level, and it’s the GTX 480. Powered by the Fermi architecture, the GTX 480 (and the entire 400 series by extension) was plagued by a multitude of issues, which gave AMD the opening to close the gap and nearly overtake Nvidia in market share.

The 480’s biggest claim to fame (or infamy) was its power consumption and heat. AnandTech’s testing found that a single GTX 480 consumed as much power as dual-GPU systems and could hit 94°C in normal games, which at the time was insane. It was an unfortunate coincidence that the 480’s stock cooler looked like a grill, prompting detractors to turn Nvidia’s “the way it’s meant to be played” slogan into “the way it’s meant to be grilled.”


To make matters worse, Fermi was about six months late to the party, as AMD’s HD 5000 series launched first. Sure, the 480 was the fastest graphics card with a single GPU die, but AMD’s HD 5870 delivered 90% of the performance without being a toaster. Besides, AMD’s dual-die HD 5970 was faster, and in 2010, CrossFire had much better support in games. Last but not least, the 480’s $500 price tag was just too high to make it competitive.

Nvidia ended up killing the GTX 400 series ignominiously just eight months later by launching the GTX 500 series, which was basically a fixed version of Fermi. The new GTX 580 was faster than the GTX 480, consumed less power, and had the same price tag.

GeForce GTX 970

3.5 equals 4


When it first came out, the GTX 970 was actually very well received, much like the other 900 series cards powered by the legendary Maxwell architecture. It cost $329 and was as fast as AMD’s 2013 flagship R9 290X while consuming significantly less power. In AnandTech’s opinion, it was a strong contender for the generation’s best value champion. So, what did the 970 do so badly that it ended up on this list?

Well, a few months after the 970 came out, new information came to light about its specifications. Although the GPU had 4GB of GDDR5 VRAM, only 3.5GB of it was usable at full speed, with the remaining half gigabyte running barely any faster than DDR3 system memory, which is what a GPU falls back on when it runs out of VRAM. For all intents and purposes, the 970 was a 3.5GB GPU, not a 4GB one, and this led to a lawsuit that Nvidia settled out of court, paying each 970 owner $30.
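The segmentation was something owners could measure for themselves. Below is a minimal CUDA sketch in the spirit of the community benchmarks from that era: it allocates VRAM in 256MB chunks and times a simple write to each one, so on a GTX 970, chunks landing in the final 0.5GB segment would report far lower bandwidth. The chunk size, chunk count, and kernel here are illustrative assumptions, not a reconstruction of any specific tool.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Fill a buffer so every byte of the allocation gets written once.
__global__ void touch(float *p, size_t n) {
    size_t i = blockIdx.x * (size_t)blockDim.x + threadIdx.x;
    if (i < n) p[i] = 1.0f;
}

int main() {
    const size_t chunkBytes = 256ull << 20;        // 256MB per chunk
    const size_t n = chunkBytes / sizeof(float);
    float *chunks[16] = {};                        // up to 4GB total

    for (int c = 0; c < 16; ++c) {
        if (cudaMalloc(&chunks[c], chunkBytes) != cudaSuccess) break;

        cudaEvent_t start, stop;
        cudaEventCreate(&start);
        cudaEventCreate(&stop);

        cudaEventRecord(start);
        touch<<<(unsigned)((n + 255) / 256), 256>>>(chunks[c], n);
        cudaEventRecord(stop);
        cudaEventSynchronize(stop);

        float ms = 0.0f;
        cudaEventElapsedTime(&ms, start, stop);
        // On a healthy card every chunk reports similar bandwidth;
        // a slow segment shows up as a sudden drop near the end.
        printf("chunk %2d: %6.1f GB/s\n", c,
               (chunkBytes / 1e9) / (ms / 1e3));

        cudaEventDestroy(start);
        cudaEventDestroy(stop);
    }
    for (int c = 0; c < 16; ++c) cudaFree(chunks[c]);
    return 0;
}
```

Note that the last allocations will usually fail even on a 4GB card, since the driver reserves some VRAM for itself; what matters is the relative bandwidth across chunks, not the absolute numbers.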

In reality, the performance implications of having half a gigabyte less VRAM were basically nonexistent, according to AnandTech. At the time, most games that demanded more than 3.5GB of VRAM were just too intensive in the first place, even for the GTX 980, which had the full 4GB of VRAM at full speed.

Nowadays, there are a few games where the 970 struggles due to its suboptimal memory configuration, but performance isn’t the point here. Ultimately, Nvidia more or less lied about what the GTX 970 had, and that’s unacceptable; it stains the legacy of an otherwise great card. Unfortunately, playing fast and loose with GPU specifications is a habit Nvidia has had trouble breaking ever since.

GeForce GTX 1060 3GB

Ceci n’est pas une 1060


After the 970 debacle, Nvidia never again attempted a GPU with a slow segment of VRAM, and it made sure each card was advertised with the correct amount of memory. However, Nvidia found another specification that was easier to mess with: CUDA core count.

Before the 10 series, it was common to see GPUs offered in multiple (usually two) versions that differed only in VRAM capacity, such as the GTX 960 2GB and the GTX 960 4GB. Cards with more VRAM were just that; in the vast majority of cases, they didn’t even have more memory bandwidth. That all started to change with Nvidia’s 10 series, which introduced GPUs like the GTX 1060 3GB. On the surface, it sounds like a GTX 1060 with half the normal 6GB of memory, but there’s a catch: it had fewer cores, too.
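The difference is easy to verify. Here’s a minimal sketch, again in CUDA, that queries a card’s streaming multiprocessor (SM) count alongside its memory; the figures in the comment are the published specs for the two 1060 variants.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    cudaDeviceProp prop;
    cudaGetDeviceProperties(&prop, 0);   // query the first GPU

    printf("%s: %d SMs, %.1f GB VRAM\n",
           prop.name, prop.multiProcessorCount,
           prop.totalGlobalMem / (1024.0 * 1024.0 * 1024.0));

    // A GTX 1060 6GB reports 10 SMs (1,280 CUDA cores on Pascal),
    // while the GTX 1060 3GB reports only 9 SMs (1,152 cores) --
    // a different chip configuration, not just half the memory.
    return 0;
}
```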

As an actual product, the GTX 1060 3GB was passable, according to reviewers like TechSpot and Guru3D, who didn’t much mind the lower core count. But the 1060 3GB ushered in a barrage of GPU variants with both less VRAM and fewer cores, and frankly, this trend has caused nothing but confusion. Core count is arguably what distinguishes one GPU model from another, with VRAM being only a secondary factor in performance.

The worst example of Nvidia pulling this bait-and-switch would have been the RTX 4080 12GB, which was set to have just 78% of the cores of the RTX 4080 16GB, making it feel more like an RTX 4070 than anything else. However, the backlash was so intense that Nvidia actually canceled the RTX 4080 12GB, which (un)fortunately means it will never be on this list.

GeForce RTX 2080

One step forward and two back

The RTX 2080. Image: Riley Young/Digital Trends

With the GTX 10 series, Nvidia achieved total domination in the GPU market; cards like the GTX 1080 Ti and the GTX 1080 are easily some of Nvidia’s best GPUs of all time. Nvidia wasn’t slowing down, either, as its next-generation RTX 20 series introduced real-time ray tracing and AI-powered resolution upscaling. The 20 series was way more technologically advanced than the 10 series, which was basically the 900 series on a better node.

In fact, Nvidia thought so highly of its new technology that it gave the RTX 20 series the kind of price tag it felt it deserved, with the RTX 2080 coming in at $800 and the RTX 2080 Ti at $1,200. Ray tracing and DLSS were the next big thing, Nvidia reasoned, so they would make up for the cost. Except that value wasn’t apparent to anyone, because on launch day there were no games with ray tracing or DLSS, and there wouldn’t be for months. Only by the time RTX 30 cards came out were there lots of games supporting these new features.

The RTX 2080 was an especially bad 20 series GPU. It was about $100 more expensive than the GTX 1080 Ti while delivering slightly lower performance in our testing; at least the 2080 Ti could claim to be about 25% faster than the old flagship. Even once ray tracing and DLSS came into play, ray tracing was so intensive that the 2080 struggled to hit 60 fps in most titles that supported it, while DLSS 1.0 simply didn’t look very good. By the time DLSS 2 came out in early 2020, RTX 30 was just over the horizon.

Nvidia had overplayed its hand, and it knew it. Just eight months after the 20 series launched, Nvidia released its RTX 20 Super GPUs, a throwback to the GTX 500 series and how it patched up the 400 series. The new Super variants of the 2060, 2070, and 2080 featured more cores, better memory, and lower price tags, somewhat fixing the problems of the original 20 series.

GeForce RTX 3080 12GB

How to make a good GPU terrible

An RTX 3080 graphics card on a pink background. Image: Jacob Roach / Digital Trends

So we’ve seen what happens when Nvidia takes a good GPU and cuts down its VRAM and core count without changing the name, but what happens when it takes a good GPU and adds more VRAM and cores? Making a good GPU even faster sounds like a great idea! Well, in the case of the RTX 3080 12GB, the result might be Nvidia’s most pointless GPU ever.

Compared to the original RTX 3080 10GB, the 3080 12GB wasn’t actually much of an upgrade. Like other Nvidia GPUs with more memory, it did have more cores too, but only about 3% more. In our review, we found that the 10GB and 12GB models performed almost identically, very unlike how the 1060 3GB was noticeably slower than the 1060 6GB. To Nvidia’s credit, the 3080 12GB’s name was at least accurate, a noticeable improvement over the 1060 3GB.

So, what’s the problem with offering a new version of a GPU with more memory? Well, Nvidia released the 3080 12GB during the GPU shortage of 2020 to 2022, and naturally, it retailed for an absurdly high price of anywhere between $1,250 and $1,600. Meanwhile, 10GB variants were selling for $300 to $400 less, and since the memory upgrade clearly didn’t matter, it was obvious which card you should buy.

Perhaps the most embarrassing thing for the 3080 12GB wasn’t its cheaper 10GB sibling but the existence of the RTX 3080 Ti, which had the same memory capacity and bandwidth as the 3080 12GB. The thing is, it also had 14% more cores and consequently delivered significantly higher performance. On review day, the 3080 Ti was also cheaper, making the 3080 12GB pointless from literally every angle, just another card released during the shortage that didn’t make any sense.

Nvidia’s worst GPUs, so far

To Nvidia’s credit, even most of its worst GPUs had something going for them: the 970 was good in spite of its memory, the 1060 3GB was merely named poorly, and the RTX 2080 was just overpriced by about $200. Nvidia has made very few technological mistakes so far, and even the GTX 480 was at least the fastest graphics card with a single GPU die.

That being said, good technology can’t make up for bad business decisions like poor naming conventions and exorbitant pricing, and those are mistakes Nvidia keeps making year after year. Unfortunately, it doesn’t seem like either is going away any time soon: the RTX 4080 12GB almost made it to market, and the RTX 4080 and RTX 4090, excellent cards as they are, are simply far too expensive to make sense.

It wasn’t hard to predict that Nvidia’s GPUs would keep getting more and more expensive, and I fully expect that trend to continue. Nvidia’s next worst GPU won’t be let down by shady marketing, misleading branding, or technological blunders, but by price alone. We’d be lucky to see the RTX 4070 cost no more than AMD’s upcoming RX 7900 XTX.
