Why Intel and Nvidia controversies prove you should always wait for benchmarks

Biased benchmarks are nothing new, but that doesn’t make them any less misleading.

The last couple of months have been an exciting time for anyone looking to upgrade their PC. SSD prices have continued to fall, and both Nvidia and Intel have showcased new hardware that is more powerful than anything they’ve released before, especially when it comes to games. But in both the debut of Nvidia’s Turing-powered RTX 2000 graphics cards and the reveal of Intel’s 9000-series CPUs, we’ve been fed benchmark information that exaggerates the advantages of the new hardware.

Just over a month since Nvidia’s controversial claims about the performance of its RTX-series of graphics cards, the PC hardware community is once again facing problems with misleading results from tests of pre-release hardware. As we sit mired in the controversy over the paid-for test results of Intel’s Core i9-9900K CPU, it’s more important than ever to remember that waiting for third-party benchmarks is a must when it comes to making an informed purchase of new components.

After spending most of its Gamescom reveal of the RTX cards talking about a feature that wasn’t even available at launch, Nvidia suggested that its cards were several times faster than previous generations using a brand new metric: RTX OPS. It responded to criticism about a lack of traditional gaming numbers with another skewed table of results which again heavily favored its new hardware.

Nvidia’s own comparison results were skewed by a focus on DLSS.

When we finally got our hands on the cards, we found them to be very capable and certainly more powerful than their predecessors, but not as dominant as Nvidia claimed. Indeed, Pascal hardware remains competitive on both price and performance even today, when the RTX 2080 and 2080 Ti cards are readily available for those who want them.

Intel’s new CPU lineup was similarly marred by iffy benchmark results. In its keynote presentation, Intel showed off results published by a company that it paid to test its new chips. The flagship of the consumer range, the 9900K with its eight cores and 5GHz clock speed, was shown to dominate both the 8700K from the last generation of Intel chips and the top-tier AMD consumer CPU, the 2700X. But there were discrepancies in both the results and the test conditions used to achieve them.

Principled Technologies, the company that performed the testing, has since admitted that it made a mistake by using inferior cooling on the AMD CPUs it benchmarked, and that it erroneously used a setting in the Ryzen Master overclocking software that disabled half the cores on the Ryzen 2700X. Media and consumers have also highlighted a number of other concerns with the benchmarking methodology, and Principled Technologies has responded by pledging to redo the tests with those concerns in mind.

Intel is standing by the results, claiming that they are “consistent with what we have seen in our labs,” but because an NDA prevents independent media from publishing their own results, there’s no way to refute such claims until the chips are on sale.

Misleading results like those generated by both Nvidia and Intel lead to headlines and help push pre-orders of hardware that is entirely unproven in real-world settings in the rigs of real gamers. You could argue that it’s putting new hardware’s best foot forward, but it could equally be seen as deliberately misleading. Don’t forget: Companies like Intel and Nvidia will always want to push people to buy the next generation of hardware, even when the previous generation might still be the better option for some people.

That’s exactly why pre-ordering hardware is such a bad idea: you don’t really know what you’re buying. Always wait for the benchmarks; otherwise, all you’re doing is rewarding companies for great media spin, not great hardware.

Jon Martindale
Former Digital Trends Contributor
Jon Martindale is a freelance evergreen writer and occasional section coordinator, covering how to guides, best-of lists, and…