Nvidia RTX 4080 vs. RTX 3080: Is Ada Lovelace worth it?

Nvidia made the RTX 4080 official at its GeForce Beyond event, and now that we have some details, it’s time to see how the next-generation Ada Lovelace card compares to the old guard: the RTX 3080. Although we don’t have a lot of concrete details yet, we still have enough to compare the specs, pricing, and expected performance of the RTX 3080 and RTX 4080.

Pricing and availability

The RTX 4080 series of graphics cards.

The RTX 3080 launched on September 17, 2020, for a list price of $700. Very few cards actually sold for that price, though, and the card remained expensive for nearly two years. Now, you can find models available for around $750, but we expect those prices to drop as RTX 40-series cards become more common.

The RTX 4080 was announced at Nvidia’s special GeForce Beyond broadcast on September 20, 2022. Nvidia hasn’t confirmed a release date, but it says the cards will arrive sometime in November. There are actually two models, one with 16GB of memory and another with 12GB. The 16GB model will run $1,200, while the 12GB model is cheaper at $900. Keep in mind that these are the prices set by Nvidia; board partner cards will likely sell for more.

RTX 3080 graphics cards among other GPUs.
Jacob Roach / Digital Trends

There’s a second version of the RTX 3080 as well, one with 12GB of memory and a slightly higher core count. The 12GB RTX 3080 typically sells for around $800 to $900, slightly more than the base model, though Nvidia never set an official list price for this card. You’ll often find it listed as just an RTX 3080, sometimes with a slightly higher markup.


At current prices, the RTX 3080 is the better value, but we’ll have to see where prices land once the RTX 4080 is finally released.

Specs

Specs for the RTX 4080 graphics card.

For raw specs, the most interesting note about the two RTX 4080 models is the massive boost to clock speed. Core counts are up a hair, at least on the 16GB model, but the roughly 800MHz jump in boost clock is the big kicker, especially considering the RTX 4080 moves to chipmaker TSMC’s 5nm node.

| | RTX 3080 | RTX 4080 16GB | RTX 4080 12GB |
|---|---|---|---|
| Process | Samsung 8nm | TSMC 5nm | TSMC 5nm |
| Architecture | Ampere | Ada Lovelace | Ada Lovelace |
| CUDA cores | 8,960 / 8,704 | 9,728 | 7,680 |
| Memory | 12GB / 10GB GDDR6X | 16GB GDDR6X | 12GB GDDR6X |
| Boost clock speed | 1,710MHz | 2,505MHz | 2,610MHz |
| Bus width | 384-bit / 320-bit | 256-bit | 192-bit |
| Power | 350W / 320W | 320W | 285W |

Samsung’s 8nm manufacturing process did wonders for the Ampere architecture in the previous generation, but manufacturing technology has moved on. TSMC’s 5nm process packs far more transistors into the same area, so although the specs don’t look like a massive uplift on paper, they should be taken in the context of the new node.

One spec we can compare is memory bandwidth. Nvidia is sticking with the same GDDR6X memory, but it’s cutting the bus width while upping the capacity, a move it experimented with in the previous generation on the RTX 3060. It’s unlikely that bandwidth will be a big problem on these flagship cards, but it’s definitely something to keep in mind with lower-end RTX 40-series offerings.
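To see why bus width matters, here’s the rough bandwidth math as a minimal sketch. The 19Gbps rate for the RTX 3080’s GDDR6X is well established; the RTX 40-series per-pin data rates below are assumptions for illustration only:

```python
# Peak GDDR6X bandwidth: (bus width in bits / 8) * per-pin data rate in Gbps = GB/s.

def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s for a given bus width and data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# The RTX 3080's 19Gbps GDDR6X is confirmed; the RTX 40-series
# data rates here are assumptions, not Nvidia-confirmed figures.
cards = [
    ("RTX 3080 10GB", 320, 19.0),
    ("RTX 3080 12GB", 384, 19.0),
    ("RTX 4080 16GB", 256, 22.4),  # assumed data rate
    ("RTX 4080 12GB", 192, 21.0),  # assumed data rate
]

for name, bus, rate in cards:
    print(f"{name}: {bus}-bit @ {rate} Gbps -> {bandwidth_gb_s(bus, rate):.0f} GB/s")
```

Even with faster memory, a 256-bit RTX 4080 lands below the 320-bit RTX 3080 on raw bandwidth under these assumptions, which is why the narrower buses are worth watching further down the stack.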

Nvidia didn’t increase power demands, despite what rumors claimed. In fact, the 12GB RTX 4080 drops its power requirement to 285W, which is a decent cut. Power draw has been trending upward, so it’s nice to see it go in the opposite direction this generation.

Expected performance

Jacob Roach / Digital Trends

Nvidia hasn’t shared many details about the RTX 4080 yet, so we don’t know how it will perform. However, the company says that it’s two to four times faster than the RTX 3080 Ti, which itself is around 10% to 15% faster than the RTX 3080. You shouldn’t take those numbers as law, however, because the RTX 4080 has some special tech under the hood.
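On paper, chaining those two claims together gives a rough range. The arithmetic below is purely illustrative, not a benchmark:

```python
# Chaining Nvidia's claim: RTX 4080 = 2x-4x the RTX 3080 Ti, and the
# 3080 Ti = ~1.10x-1.15x the RTX 3080. Illustrative arithmetic only.
claim_low, claim_high = 2.0, 4.0  # claimed range vs. RTX 3080 Ti
ti_low, ti_high = 1.10, 1.15      # RTX 3080 Ti vs. RTX 3080

print(f"Implied vs. RTX 3080: {claim_low * ti_low:.1f}x to {claim_high * ti_high:.1f}x")
# -> Implied vs. RTX 3080: 2.2x to 4.6x
```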

That special tech comes down to two things: DLSS 3 and Shader Execution Reordering (SER). Unlike DLSS 2, the new version generates entirely new frames. Some of the frames in your game will be created by AI without your GPU rendering a single pixel. SER boosts ray tracing performance by reordering shading work on the fly so the GPU’s resources are used more efficiently.
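As a rough illustration of what frame generation implies for frame rates, here’s a toy model, a simplification for the sake of the math rather than Nvidia’s actual pipeline:

```python
# Toy model of frame generation: for every frame the GPU renders, one
# AI-generated frame is inserted. A simplification for illustration,
# not Nvidia's actual DLSS 3 pipeline.

def displayed_fps(rendered_fps: float, generated_per_rendered: int = 1) -> float:
    """Displayed frame rate when AI frames are inserted between rendered ones."""
    return rendered_fps * (1 + generated_per_rendered)

print(displayed_fps(60))  # 120.0: 60 rendered frames plus 60 generated frames
```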

The problem with Nvidia’s performance claims is that it’s not clear whether SER and DLSS 3 are factored into them. We expect a solid uplift in performance over the RTX 3080, but without concrete benchmarks to compare, it’s hard to say much at this point.

Go for the deal or performance?

The MSI Suprim X RTX 3080 installed in a PC.
Jacob Roach / Digital Trends

Although the RTX 4080 may provide a big increase in performance over the RTX 3080, it comes at a big cost. At the least, you’re looking at a $200 upcharge, and upwards of a $500 increase at the top end. And that’s against current RTX 3080 prices, which will almost certainly drop once the RTX 4080 is here.

It all comes down to performance, and for now, we have to wait until benchmarks start circulating. If you’re going for the value play, though, the RTX 3080 still looks like a winner.

Jacob Roach
Lead Reporter, PC Hardware