Why now is actually a great time to build a new PC

Earlier this month, my colleague Jacob Roach wrote an article about how right now is one of the worst times to build a PC in quite a while, not because prices are terrible right now (they’re not), but because the next generation is just around the corner. New CPUs and GPUs will force retailers to sell older models for less, buying DDR4 for dead-end platforms is a bad idea, and DDR5 will continue to fall in price as we await the arrival of Ryzen 7000 and Intel Raptor Lake. If you can just wait a few months for the next generation, you’ll be much better off, or so the argument goes.

I disagree. I think right now is actually a great time to build a PC, because it’s hard to expect the next generation to be great. If you look at the past five years, it really seems like we’re entering an era where you can’t expect new CPUs and GPUs to provide better value than older models. And while it is tempting to immediately buy into new platforms with cutting-edge features like DDR5 memory and PCIe 5.0, it’s unlikely you’ll really be able to get your money’s worth.

Stagnant prices for CPUs

AMD Ryzen 7 5800X3D pins facing up on a table.
Jacob Roach / Digital Trends

For most of PC gaming history, newer hardware has almost always provided better value than older hardware, usually through a combination of performance increases and price reductions. However, we’re entering a new era where value improvements are steadily declining generation to generation, and it’s starting to look more like a trend than a simple speed bump.

I want to discuss CPUs first, and I want to set the stage with AMD’s Bulldozer-powered FX CPUs, which launched in 2011. The Bulldozer architecture was terrible at basically everything and crippled AMD CPUs for years. From 2011 to 2017, Intel effectively had a monopoly on the entire x86 CPU world, including the mainstream desktop. PC gamers had to be content with the same $330 Intel quad-core year after year, with modest generational improvements.

The launch of Ryzen 1000 in 2017 is often seen as the beginning of a renaissance for desktop CPUs, and it’s not hard to see why. AMD offered the eight-core Ryzen 7 1700 for $329, the same price as Intel’s then-flagship Core i7-7700K. Later that same year, Intel quickly launched 8th-gen CPUs with more cores. The back and forth between AMD and Intel has been going on ever since, albeit with a brief pause in 2020 and 2021 thanks to Intel failing to deliver 10nm CPUs in a timely manner.

As it turns out, we’re not exactly in the renaissance we thought we were. While AMD is certainly delivering significant performance increases every generation, pricing is becoming a problem. The eight-core Ryzen 7 1700 launched for $329 in 2017, a great deal at the time, but five years later you’re paying just about the same amount for the Ryzen 7 5700X, which is also an eight-core CPU. Six-core CPUs still cost about $200, just as they did in 2017.

Someone holding the Ryzen 7 5800X3D in a red light.
Jacob Roach / Digital Trends

Ryzen 5000 in particular was bad for budget buyers. Budget options like the Ryzen 5 5500 and even the Ryzen 7 5700X didn’t arrive until a few months ago — nearly two years after the first processors came out. AMD has provided generational improvements, but value CPUs arrived too late to the party to matter.

Concerning Intel, we’re seeing increased MSRPs rather than stagnation. Up until 7th-gen, $329 was the limit, but starting with 8th-gen, Intel began increasing both core counts and prices. The Core i7-8700K was Intel’s first six-core CPU for the mainstream and was just 10% more expensive than the Core i7-7700K. But the next generation, Intel raised prices by nearly 40% with the introduction of a new performance tier led by the Core i9-9900K. AMD followed suit, and now it’s not uncommon to see CPUs like the Core i9-12900K go on sale for well over $600.

It’s hard to argue that this price stagnation and these MSRP increases simply come down to a lack of competition. AMD and Intel have been fiercely competitive on performance over the past five years, yet AMD isn’t compelled to cut prices, and Intel keeps raising prices on its flagship parts even when those flagships aren’t very competitive (see the Core i9-11900K). It seems like AMD and Intel are catering more and more to high spenders while neglecting cheaper segments of the market.

The death of budget GPUs

Two graphics cards sitting on top of each other.
Jacob Roach / Digital Trends

GPUs haven’t fared any better over the past five years. Ever since Nvidia launched its phenomenal GTX 10-series, both Nvidia and AMD have released several poor-value GPUs and all but killed the low-end and midrange segments. Despite a few promising launches, AMD and Nvidia have made it clear that value isn’t a focus.

It all started with the RTX 20-series. Yes, it introduced ray tracing and AI upscaling to the mainstream, but with few games that supported these features, the price for the 20-series was simply unbearable. The RTX 2080 at $699 was straight up a worse deal than the GTX 1080 at $499, being only 11% faster for $200 more. I don’t think there has ever been a GPU series before this generation that actually provided worse value than the previous generation, and I see the RTX 20-series as the turning point in desktop GPUs.

The RTX 20-series represented a shift in Nvidia’s behavior where increasing bang for buck was no longer a priority, and while that hasn’t impacted the high end very much, it has absolutely destroyed the midrange and low end. Budget GPUs used to start at $100, and you could get good-value budget GPUs at around the $150 mark. But today, Nvidia’s cheapest 30-series GPU is the RTX 3050 at $249. You’re not even getting your money’s worth with the 3050; the GTX 1650 Super from 2019 was $159, and the 3050 is just 30% faster.
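To make the value comparisons concrete, here’s a quick performance-per-dollar calculation using only the figures cited in this article (launch MSRPs and rough relative performance; the exact speedup percentages will vary by benchmark):

```python
# Rough performance-per-dollar comparison using the MSRPs and speedups cited above.
# Speedups are relative to the older card (older card = 1.0).

def value_ratio(new_price, old_price, speedup):
    """Return the new card's performance per dollar relative to the old card."""
    return speedup / (new_price / old_price)

# GTX 1080 ($499) -> RTX 2080 ($699), ~11% faster
rtx2080 = value_ratio(699, 499, 1.11)
print(f"RTX 2080 vs GTX 1080: {rtx2080:.2f}x the performance per dollar")

# GTX 1650 Super ($159) -> RTX 3050 ($249), ~30% faster
rtx3050 = value_ratio(249, 159, 1.30)
print(f"RTX 3050 vs GTX 1650 Super: {rtx3050:.2f}x the performance per dollar")
```

Both ratios come out below 1.0 (roughly 0.79x and 0.83x), meaning each newer card actually delivers less performance per dollar than the card it replaced.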

And we don’t even need to talk about the disastrous GTX 1630.

AMD seems to have followed in Nvidia’s footsteps and deprioritized value as well. A prime example is the RX 6500 XT, which is supposed to replace the RX 5500 XT. The problem? The 5500 XT came with 8GB of VRAM, whereas the 6500 XT only comes with 4GB. It’s also barely any faster than the five-year-old RX 580, which likewise came with 8GB of VRAM. All of these GPUs launched around the $200 price point, and they all deliver roughly the same performance. As someone who bought an RX 480 in 2016, it’s a pretty depressing trend to see. The RX 480 is six years old, and AMD still hasn’t come out with a GPU worth upgrading to at the same price point.

It’s possible that supply chain issues caused by the pandemic are responsible for AMD and Nvidia’s lack of budget options this generation. That doesn’t necessarily mean things will go back to normal once those problems go away, though. AMD and Nvidia might decide that things actually went really well without offering budget GPUs. After all, more expensive GPUs carry fatter margins, which is good for business.

Don’t be an early adopter

Intel Core i9-12900K in a motherboard.
Jacob Roach / Digital Trends

Intel’s LGA 1700 socket introduced DDR5 and PCIe 5.0, and AMD plans to follow suit with its upcoming AM5 socket. It’s certainly tempting to upgrade in order to take advantage of these features, but it’s usually not worth being an early adopter when it comes to technology.

DDR5 has been on the market for some time now and with Intel’s 12th-gen Alder Lake CPUs, you can choose between newer DDR5 and older DDR4. If you have an Alder Lake CPU, going for DDR5 doesn’t really net you much more performance, making DDR4 a much better value as most DDR4 kits are half the price of DDR5 kits of the same size. It is true that DDR5 will get cheaper and faster in the future, but DDR4 is cheap today and has good performance.

PCIe 5.0 is certainly an improvement over PCIe 4.0, providing twice the bandwidth, but more bandwidth only translates to more performance if devices are designed to take advantage of it. The extra bandwidth definitely makes sense for SSDs, and there’s no doubt that fast PCIe 5.0 SSDs will be available soon, but PCIe 5.0 for GPUs will likely not be necessary for some time. We saw the same thing happen with PCIe 4.0, the main selling point of which was really SSDs and not GPUs.

Finally, consider the teething issues that platforms with new technology tend to have. New features can’t be tested perfectly before they’re released to the world, so it’s more than likely that users who build a PC on these new platforms will run into at least a bug or two. Given the price, the initial lack of uses for these features, and the high likelihood of bugs, I think older platforms using DDR4 and PCIe 4.0 are still very viable.

I hope I’m wrong

A high-performance custom MSI PC build.

I would really like for the next generation to get us back on track. I really want Ryzen 7000 and Raptor Lake to launch at good prices and to see new AMD and Intel CPUs cover the entire stack, from low end to high end. I really want AMD and Nvidia to bring back truly good budget and midrange GPUs with the upcoming RX 7000 and RTX 40 GPUs.

I just don’t see that happening given what I’ve seen over the past five years. The big three have certainly made great strides in technology, but you can no longer enjoy that progress unless you’re willing to shell out hundreds of dollars. So buy your CPU and GPU while they’re relatively cheap, because it probably won’t be like this forever.
