Upcoming Nvidia GPUs may require monstrous levels of power

The ongoing speculation that Nvidia’s next-gen GeForce RTX 4090 will have monstrous power requirements keeps being reinforced by more and more sources.

YouTuber Moore’s Law is Dead talked about the power draw of the RTX 4090 in his latest video, speculating that the GPU will have a total board power (TBP) of 600 watts, and that figure will only increase with nonreference designs.

RTX 4090 600w TBP Leak: Lovelace is CONFIRMED to be INSANE! (+ RX 6x50 XT Update)

Nvidia’s upcoming Ada Lovelace graphics cards have long been rumored to require intense amounts of power. Moore’s Law is Dead discussed both the consumer flagship AD102 GPU and the workstation RTX L6000, and everything points to the rumors being true: both of these GPUs will need beastly PSUs.

Some reports about the AD102 GPU estimated the power draw at a massive 850-watt TBP. Those numbers will hopefully go down, and the flagship is now more commonly rumored to have a TBP of 600 watts. Even so, that figure applies only to the base configuration released by Nvidia.


Most commonly, Nvidia’s partners release even more powerful versions of the GPU, so it’s very likely that we will see even higher requirements in those nonreference designs. As an example, the reference model for the RTX 3090 Ti has a 450-watt TBP, but custom models go up to 516 watts. The same is going to be true for the next generation of cards, so even if the Nvidia model manages to squeeze the TBP down to 600 watts (which is still a lot), the custom models may bring it right back up.

Unsurprisingly, such a large power draw will require lots of cooling. Although Nvidia itself may stick to just fans, with perhaps a quad-fan design, the rumor mill points to its board partners switching to hybrid cooling with some AIO cooling added into the mix.

Wccftech has speculated about the difference in power requirements between the RTX 4080 and the flagship RTX 4090. Assuming the reference model of the RTX 4090 has a 600-watt TBP, we may see up to 450 watts (or more) for the RTX 4080.

Fans on the Nvidia RTX 3080.
Jacob Roach / Digital Trends

Pairing such a beastly graphics card with one of the best CPUs will result in unprecedented demands when it comes to picking the right power supply for your rig. Leakers such as harukaze5719 on Twitter have already speculated that the GPU would require at least a 1000-watt gold-rated power supply, though that was in reference to the RTX 4090 Ti. However, as Wccftech points out, a 1000- or even 1200-watt PSU may become the new standard as we settle into the next generation of computer hardware.

Nvidia’s upcoming workstation lineup is also getting updated to the AD102 GPU. The top variant of the card, dubbed either the RTX L6000 or RTX L8000, is expected to have a TBP of at least 320 watts, and that number could go as high as 375 watts. Compared to the estimates for the RTX 4090, that seems perfectly reasonable, but it still marks a 25% increase in power draw over the previous generation (the current RTX A6000 tops out at 300 watts).

There is no doubt that Nvidia’s next-gen top card is going to require a lot of power. While the exact figures will remain a topic of speculation until Nvidia officially announces the GPUs, mentally preparing yourself to buy a new power supply seems like a wise thing to do if you’re looking to upgrade your GPU this year.

Monica J. White
Nvidia just made GeForce Now so much better
Playing games with GeForce Now on a laptop.

Nvidia has just added adaptive refresh rates to GeForce Now, its cloud gaming service. The new tech, dubbed Cloud G-Sync, works first and foremost on PCs with Nvidia GPUs, but also on Macs. These include Macs with Apple Silicon, as well as older models with Intel CPUs and AMD GPUs. On the Windows side, however, Intel and AMD GPUs are not supported for now. Nvidia has also made one more change to GeForce Now that makes it a lot easier to try out -- it introduced day passes.

Cloud G-Sync's variable refresh rate (VRR) feature will sync your monitor's refresh rate to match the frame rates you're hitting while gaming with GeForce Now. Nvidia's new cloud solution also uses Reflex to lower latency regardless of frame rates. Enabling VRR in GeForce Now should provide a major boost by reducing screen tearing and stuttering, improving the overall gaming experience on PCs and laptops that normally can't keep up with some titles. To pull this off, Nvidia uses its proprietary RTX 4080 SuperPODs.

Read more
GPUs just broke a 25-year-old record
Two RTX 4070 graphics cards sitting side by side.

The PC graphics card market witnessed notable growth in the fourth quarter of 2023, according to Jon Peddie Research. Shipments climbed by 6% to reach 76.2 million units, marking a 24% increase year over year, which is the most substantial gain in over 25 years.

Projections indicate a continued upward trend: an expected 3.6% annual growth rate from 2024 to 2026, potentially culminating in a total installed base of 5 billion units by the end of 2026, with discrete GPUs making up 30% of the market.

Read more
Why I’m feeling hopeful about Nvidia’s RTX 50-series GPUs
The RTX 4070 Super on a pink background.

I won't lie -- I was pretty scared of Nvidia's RTX 50-series, and I stand by the opinion that those fears were valid. They didn't come out of thin air; they were fueled by Nvidia's approach to GPU pricing and value for the money.

However, the RTX 40 Super refresh is a step in the right direction, and it's one I never expected to happen. Nvidia's most recent choices show that it may have learned an important lesson, and that's good news for future generations of graphics cards.
The price of performance
Nvidia really didn't hold back with the RTX 40 series. It introduced some of the best graphics cards we've seen in a while, but raw performance isn't the only thing to consider when estimating the value of a GPU. Price is the second major factor, and weighing it against performance can often tip the scales from "great" to "disappointing." That was the case with several GPUs in the Ada generation.

Read more