
Nvidia’s next-gen GPUs will likely require a new power supply

Another day, another rumor about the specs of Nvidia’s next-gen GPUs. This time, the rumor mill is buzzing about the potential power limits of Nvidia’s Lovelace GPUs and whether you’ll need to upgrade your power supply. Let’s just say you may have to factor a new PSU into your next build.

According to Moore’s Law is Dead and Wccftech, Nvidia’s upcoming GPUs will likely max out at 600 watts. For comparison, the RTX 3090 tops out at 350W, and the RTX 3090 Ti is rumored to increase that to 450W. As always, rumors must be taken with great skepticism, but 600W is a significant jump and may require many PC builders to upgrade their power supply.
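To see why a 600W card could force a PSU upgrade, it helps to sketch the arithmetic. The figures for CPU draw, other components, and headroom below are illustrative assumptions, not official specs; the function name is hypothetical:

```python
# Rough PSU sizing sketch. All non-GPU wattage figures are
# illustrative assumptions, not official requirements.

def recommended_psu_watts(gpu_w, cpu_w=150, other_w=75, headroom=1.25):
    """Sum component draw, add headroom for transient spikes and
    efficiency losses, then round up to the next 50 W tier."""
    total = (gpu_w + cpu_w + other_w) * headroom
    return int(-(-total // 50) * 50)  # ceiling to a 50 W step

print(recommended_psu_watts(350))  # RTX 3090-class build -> 750
print(recommended_psu_watts(600))  # rumored 600 W flagship -> 1050
```

Under these assumptions, a build that was comfortable on a 750W unit would want roughly a 1,000W-class PSU for a 600W card, which is why these rumors matter to upgraders.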

Jeff Fisher presents the RTX 3090 Ti at an unveiling event.

We’ve seen rumors of high power figures before. It was reported earlier that the RTX 4090 and 4090 Ti would require a massive 1,200W power supply. Those outrageous numbers were later tempered a bit, since the leakers involved were unable to confirm the exact TDP figures.

It seems that Nvidia is trying to push the limits of power consumption, which could explain why such huge numbers are being rumored. The main challenge appears to be ensuring that graphics cards can be adequately air-cooled, both in Nvidia’s reference designs and in those of its add-in board (AIB) partners. ExtremeTech notes that Nvidia’s power targets are in line with the 12VHPWR PCIe Gen 5 connector, which supports up to 600W.


Alongside the power-consumption rumors, Lovelace GPUs may also feature superfast GDDR7 memory. The GDDR6X currently used in the RTX 3080, 3080 Ti, and 3090 cards maxes out at 19Gbps per pin, so faster GDDR7 would deliver a noticeable bandwidth boost even over the existing 256-bit and 384-bit memory interfaces.
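The bandwidth math behind that claim is simple: total bandwidth is the bus width (in bytes) times the per-pin data rate. The GDDR7 speed used below is a hypothetical placeholder, since no rate has been confirmed:

```python
# Memory bandwidth (GB/s) = bus width in bits / 8 * per-pin rate in Gbps.
# The 24 Gbps figure is a hypothetical GDDR7 speed, not a confirmed spec.

def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits // 8 * gbps_per_pin

print(bandwidth_gbs(384, 19))  # 384-bit GDDR6X at 19 Gbps -> 912 GB/s
print(bandwidth_gbs(384, 24))  # same bus, hypothetical 24 Gbps -> 1152 GB/s
```

On the same 384-bit bus, any per-pin speed increase translates directly into proportionally more bandwidth, which is why faster memory alone is a meaningful upgrade.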

Both of these rumors are on top of the massive performance gains the next-gen cards are rumored to deliver. Nvidia’s flagship GPUs could have up to 75% more CUDA cores than the RTX 3090, plus a huge L2 cache, which would greatly reduce the time and energy needed to access data from main memory.

All of this news comes right as GPU prices are finally dropping, as much as 25% for certain graphics cards. Intel’s entrance into the graphics card market with its Arc Alchemist GPU lineup should also help ease shortage concerns.

David Matthews
Former Digital Trends Contributor
David is a freelance journalist based just outside of Washington D.C. specializing in consumer technology and gaming. He has…