

Nvidia addresses the rumors about the RTX 40 GPUs’ power consumption

The new Nvidia GeForce RTX 40 lineup includes some of the most power-hungry graphics cards on the market. Because of that, you may be wondering if you’ll need a new power supply (PSU) in order to support the borderline monstrous capabilities of the RTX 4090.

To answer some of these concerns, Nvidia released new information about the power consumption of its new GPUs. The conclusion? Well, it’s really not all that bad after all.

Nvidia RTX 4090 graphics card. Image credit: Nvidia

Prior to the official announcement of the RTX 40-series, the cards had been the subject of much power-related speculation. The flagship RTX 4090 received the most coverage of all, with many rumors pointing toward insane requirements in the 800W to 900W range. Fortunately, we now know that those rumors weren’t true.

The RTX 4090 has a TGP of 450W, the same as the RTX 3090 Ti, and calls for a minimum 850W PSU. The RTX 4080 16GB takes things down a few notches with a 320W TGP and a 750W power supply. Lastly, the RTX 4070 in disguise, also known as the RTX 4080 12GB, draws 285W and calls for a 700W PSU.
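For a sense of where PSU recommendations like these come from, here’s a rough, back-of-the-envelope sizing sketch in Python. The CPU-and-system figure and the headroom margin are our own illustrative assumptions, not numbers from Nvidia.

# Rough PSU sizing sketch. The CPU/system draw and the headroom margin
# are illustrative assumptions, not figures published by Nvidia.
GPU_TGP_W = 450         # RTX 4090 total graphics power (per Nvidia)
CPU_AND_SYSTEM_W = 250  # assumed high-end CPU, drives, fans, etc.
HEADROOM = 1.2          # ~20% margin for transient spikes and efficiency

recommended_w = (GPU_TGP_W + CPU_AND_SYSTEM_W) * HEADROOM
print(f"Suggested PSU: ~{recommended_w:.0f}W")  # ~840W, close to Nvidia's 850W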


Nvidia claims that this is not an increase from the previous generation, but it kind of is — the RTX 3090 Ti matches the 4090’s 450W, but the RTX 3090 had a TGP of just 350W. With that said, it’s not as bad as we had feared, though many are still left wondering whether they need to upgrade their existing PSUs.

Nvidia has now assured its customers that they can stick to the PSU they currently own as long as it meets the wattage requirements for that given card.

Similarly, Nvidia doesn’t expect any problems when it comes to 8-pin to PCIe Gen 5 16-pin adapter compatibility. As Nvidia puts it on its FAQ page: “The adapter has active circuits inside that translate the 8-pin plug status to the correct sideband signals according to the PCIe Gen 5 (ATX 3.0) spec.”

There’s also another fun little fact to be found in that FAQ: Nvidia confirms that the so-called smart power adapter detects how many 8-pin connectors are plugged in. With four connectors instead of three, the adapter lets the RTX 4090 draw more power (up to 600 watts) for extra overclocking headroom.
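To make that behavior concrete, here’s a minimal sketch of how an active adapter could map the number of populated 8-pin plugs onto the 16-pin connector’s SENSE0/SENSE1 sideband pins. The pin encoding follows the PCIe Gen 5 (ATX 3.0) 12VHPWR spec; the 150W-per-plug assumption and the function itself are our illustration, not Nvidia’s adapter firmware.

# Minimal sketch, not Nvidia's firmware: translating the number of populated
# 8-pin plugs into the SENSE0/SENSE1 sideband states the PCIe Gen 5
# (ATX 3.0) 12VHPWR spec uses to advertise a power limit.

# Sideband encoding from the spec: (SENSE0, SENSE1) -> watts.
# "gnd" means the pin is pulled to ground, "open" means left floating.
SENSE_TO_WATTS = {
    ("gnd", "gnd"): 600,
    ("gnd", "open"): 450,
    ("open", "gnd"): 300,
    ("open", "open"): 150,
}

def sideband_for_plugs(populated_plugs: int) -> tuple[str, str]:
    # Assumption for illustration: treat each 8-pin plug as good for 150W,
    # so three plugs advertise 450W and four advertise the full 600W.
    advertised = min(populated_plugs, 4) * 150
    for pins, watts in SENSE_TO_WATTS.items():
        if watts == advertised:
            return pins
    return ("open", "open")  # fail safe: fall back to the 150W floor

for plugs in (3, 4):
    s0, s1 = sideband_for_plugs(plugs)
    print(f"{plugs} plugs -> SENSE0={s0}, SENSE1={s1}, "
          f"limit {SENSE_TO_WATTS[(s0, s1)]}W")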

Nvidia CEO Jensen Huang with an RTX 4090 graphics card. Image credit: Nvidia

There have also been questions about the durability of the PCIe 5.0 connectors, which are rated for 30 mating cycles. That may not sound like much, but Nvidia clears this up by noting that such a rating has been the norm for power connectors for at least the past twenty years.

Lastly, Nvidia addressed the possibility of an overcurrent or overpower risk when using the 16-pin power connector with non-ATX 3.0 power supplies. It did, indeed, spot an issue during the early stages of development, but it has since been resolved. Again, seemingly nothing to worry about there.

All in all, the power consumption fears have largely been squelched. Nvidia did ramp up the power requirements, but not as significantly as expected, so as long as your PSU matches what the card asks for, you should be fine. Let’s not breathe that sigh of relief yet, though — the RTX 4090 Ti might still happen, and that will likely be one power-hungry beast.

Monica J. White
Nvidia just made GeForce Now so much better

Nvidia has just added adaptive refresh rates to GeForce Now, its cloud gaming service. The new tech, dubbed Cloud G-Sync, works first and foremost on PCs with Nvidia GPUs, but also on Macs -- including Macs with Apple Silicon as well as older models with Intel CPUs and AMD GPUs. On Windows PCs, however, Intel and AMD GPUs are not supported for now. Nvidia has also made one more change to GeForce Now that makes it a lot easier to try out -- it introduced day passes.

Cloud G-Sync's variable refresh rate (VRR) feature will sync your monitor's refresh rate to match the frame rates you're hitting while gaming with GeForce Now. Nvidia's new cloud solution also uses Reflex to lower latency regardless of frame rates. Enabling VRR in GeForce Now should provide a major boost by reducing screen tearing and stuttering, improving the overall gaming experience on PCs and laptops that normally can't keep up with some titles. To pull this off, Nvidia uses its proprietary RTX 4080 SuperPODs.
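For the curious, this is roughly what a VRR scheduler does conceptually. It’s a simplified sketch, not GeForce Now’s actual implementation, and the 48Hz to 120Hz VRR window is just an assumed example.

import math

# Conceptual VRR sketch (not GeForce Now code): pick a refresh rate that
# tracks the frame rate within the panel's assumed 48-120Hz VRR window.
def effective_refresh_hz(frame_rate: float, vrr_min: float = 48.0,
                         vrr_max: float = 120.0) -> float:
    if frame_rate >= vrr_max:
        return vrr_max             # cap at the panel's maximum refresh
    if frame_rate >= vrr_min:
        return frame_rate          # refresh tracks the frame rate exactly
    # Below the VRR floor, displays repeat each frame at a multiple of the
    # frame rate (low framerate compensation) instead of tearing.
    multiple = math.ceil(vrr_min / frame_rate)
    return frame_rate * multiple   # e.g. 30 fps -> 60Hz, frames shown twice

print(effective_refresh_hz(97.0))  # 97.0: no tearing, no stutter
print(effective_refresh_hz(30.0))  # 60.0: each frame displayed twice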

The RTX 4090 is past its prime, and that’s OK

In October 2022, when I first reviewed the RTX 4090, I called it "both a complete waste of money and the most powerful graphics card ever made." That's even more true now than it was more than a year ago. The AI boom that followed shortly after the RTX 4090's launch, combined with international restrictions on the GPU, caused prices to skyrocket, pushing the card from barely affordable to basically unattainable.

But that's changing. Reports indicate that prices are slowly dropping, moving from a high of $2,200 down to around $2,000. That's still way above the GPU's list price of $1,600, but the trajectory now is at least positive.

Why I’m feeling hopeful about Nvidia’s RTX 50-series GPUs

I won't lie -- I was pretty scared of Nvidia's RTX 50-series, and I stand by the opinion that those fears were valid. They didn't come out of thin air; they were fueled by Nvidia's approach to GPU pricing and value for the money.

However, the RTX 40 Super refresh is a step in the right direction, and it's one I never expected to happen. Nvidia's most recent choices show that it may have learned an important lesson, and that's good news for future generations of graphics cards.
The price of performance
Nvidia really didn't hold back in the RTX 40 series. It introduced some of the best graphics cards we've seen in a while, but raw performance isn't the only thing to consider when estimating the value of a GPU. Price is the second major factor, and weighing it against performance can often tip the scales from "great" to "disappointing." That was the case with several GPUs in the Ada generation.
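As a toy illustration of that price-versus-performance reasoning (the frame rates and prices below are made-up placeholders, not benchmark results), a faster card can still be the worse value:

def fps_per_dollar(avg_fps: float, price_usd: float) -> float:
    # Crude value metric: higher is better.
    return avg_fps / price_usd

print(f"{fps_per_dollar(100, 600):.3f}")   # 0.167 fps per dollar
print(f"{fps_per_dollar(120, 1000):.3f}")  # 0.120: faster card, worse value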
