
Nvidia says it’s better than AMD for low-lag gaming, and has the data to prove it

Nvidia RTX 2080 Super impressions
Riley Young/Digital Trends

Nvidia has been showing off some new technological innovations at this year's Gamescom show, most of which require one of its RTX graphics cards to experience at their full potential. One that doesn't is the new ultra-low latency option included with its latest Game Ready Driver (436.02), which debuted during the show. Like AMD's Radeon Anti-Lag, it helps reduce the latency between a user's input and the corresponding action in a game. According to Nvidia's own benchmarks, its solution is better.
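
Nvidia's low-latency setting is broadly understood to work by limiting how many frames the CPU is allowed to queue ahead of the GPU (the successor to the old "maximum pre-rendered frames" control): the deeper that queue, the longer an input waits before it reaches the screen. The sketch below is only a toy illustration of that idea, not Nvidia's actual implementation, and the frame-time, queue-depth, and display numbers are made-up example values.

```python
def end_to_end_latency_ms(frame_time_ms, queued_frames, display_ms=5.0):
    """Very rough input-to-photon model: an input waits behind the frames
    already sitting in the render queue, then its own frame is rendered,
    then the display scans it out. Real pipelines are more complicated."""
    return queued_frames * frame_time_ms + frame_time_ms + display_ms

frame_time = 1000 / 144  # ~6.9 ms per frame at 144 fps (example value)
for queue in (3, 1, 0):  # deeper queue vs. shallower, low-latency-style settings
    print(f"queue={queue}: ~{end_to_end_latency_ms(frame_time, queue):.0f} ms")
```

Shrinking the queue from three frames to zero in this toy model cuts the modeled latency by more than half, which is the basic intuition behind the feature.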

After two graphics card generations from both Nvidia and AMD that haven't pushed the performance envelope significantly, features are a greater selling point than they've been in the past. For Nvidia, the two biggest benefits of its RTX generation have been RT core-powered ray tracing and Tensor core-powered deep learning super sampling (DLSS). For AMD, it's been image sharpening and input lag reduction. Nvidia has long claimed to offer its own anti-lag option, but it has now released a new, improved version that's more in line with what AMD offers. Only better, according to its own benchmarks.

Although the time saved is measured in milliseconds, high-speed gamers should notice a slight improvement over having no anti-lag feature enabled at all. Nvidia claims, via TechRadar, that it can reduce input lag in Apex Legends from 30ms down to just 19ms, while The Division 2's input lag drops from 49ms to 22ms with ultra-low latency enabled.
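
For context, here is the relative improvement those quoted figures work out to. The before/after numbers are Nvidia's claims as reported; only the percentages are derived here.

```python
# Nvidia's quoted input-lag figures (before, after) in milliseconds.
claims = {"Apex Legends": (30, 19), "The Division 2": (49, 22)}
for game, (before, after) in claims.items():
    cut = (before - after) / before * 100
    print(f"{game}: {before} ms -> {after} ms (~{cut:.0f}% lower)")
```

That works out to roughly a 37% reduction in Apex Legends and a 55% reduction in The Division 2, according to Nvidia's own numbers.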


AMD has made similarly impressive claims about its Anti-Lag feature, suggesting that input lag could be cut in half. That would likely compete favorably with Nvidia's ultra-low latency mode, but according to Nvidia's own testing, Nvidia is the clear winner. First-party benchmarks should always be taken with a healthy dose of skepticism, but this is still good news for consumers. AMD's Anti-Lag was enough of a concern for Nvidia that it built a comparable feature for its own gamers, one that is as good or, in some cases, better. That just means more gamers end up with lower input lag in their games, and that's a good thing.

AMD introduced image sharpening in a recent driver release, and now Nvidia has followed suit. Also included in the new driver is a Freestyle Sharpening filter that purportedly works better than Nvidia's previous "detail" filter, providing better-quality images without exacting as much of a performance hit.

Elsewhere in this driver release, Nvidia also introduced a more expansive list of G-Sync-compatible monitors and beta support for its GPU integer scaling, which should make older games and those with a pixel art style look far better on higher-resolution monitors where they can otherwise look a little blurry.
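
Integer scaling is conceptually simple: instead of blending neighboring pixels the way bilinear upscaling does, each source pixel is duplicated into a whole-number block, so hard pixel edges stay hard. The snippet below is a generic nearest-neighbour illustration of the idea, not Nvidia's driver code.

```python
def integer_scale(pixels, factor):
    """Duplicate every source pixel into a factor x factor block
    (nearest-neighbour upscaling by a whole-number factor)."""
    return [
        [row[x // factor] for x in range(len(row) * factor)]
        for row in pixels
        for _ in range(factor)
    ]

# A 2x2 checkerboard scaled 3x becomes a perfectly crisp 6x6 checkerboard.
print(integer_scale([[0, 1], [1, 0]], 3))
```

This is also why the feature pairs so nicely with high-resolution monitors: a 1080p pixel-art game maps onto a 4K panel at an exact 2x in each direction, with no blur introduced.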

You can download Nvidia’s latest drivers from the official website.

Interested in reducing your input lag even further? A high refresh rate monitor can help.
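
The refresh-rate point is mostly arithmetic: a faster panel starts drawing a finished frame sooner, so the display's own contribution to lag shrinks. A quick back-of-the-envelope illustration, considering only the refresh interval and ignoring the GPU:

```python
# Time per refresh for common panel speeds; higher refresh means a new
# frame can reach the screen sooner (display contribution only).
for hz in (60, 120, 144, 240):
    print(f"{hz:>3} Hz -> {1000 / hz:.1f} ms per refresh")
```

Going from 60Hz to 240Hz, for example, cuts the refresh interval from roughly 16.7ms to about 4.2ms.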


Jon Martindale
Jon Martindale is the Evergreen Coordinator for Computing, overseeing a team of writers addressing all the latest how to…
Nvidia just made GeForce Now so much better
Playing games with GeForce Now on a laptop.

Nvidia has just added adaptive refresh rates to GeForce Now, its cloud gaming service. The new tech, dubbed Cloud G-Sync, works first and foremost on PCs with Nvidia GPUs, but also on Macs -- including Macs with Apple Silicon as well as older models with Intel CPUs and AMD GPUs. On the Windows PC side more broadly, Intel and AMD GPUs are not supported right now. Nvidia has also made one more change to GeForce Now that makes it a lot easier to try out -- it introduced day passes.

Cloud G-Sync's variable refresh rate (VRR) feature will sync your monitor's refresh rate to match the frame rates you're hitting while gaming with GeForce Now. Nvidia's new cloud solution also uses Reflex to lower latency regardless of frame rates. Enabling VRR in GeForce Now should provide a major boost by reducing screen tearing and stuttering, improving the overall gaming experience on PCs and laptops that normally can't keep up with some titles. To pull this off, Nvidia uses its proprietary RTX 4080 SuperPODs.
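
The core idea of VRR is that the panel refreshes when a frame is actually ready rather than on a fixed schedule, as long as the implied rate stays inside the panel's supported window. The snippet below is a simplified model of that behavior; the 48Hz-144Hz range is an assumed example, not a GeForce Now or G-Sync specification.

```python
def vrr_refresh_interval_ms(frame_time_ms, vrr_min_hz=48, vrr_max_hz=144):
    """Toy VRR model: refresh when the new frame arrives, clamped to the
    panel's supported refresh window (example range, not a real spec)."""
    fastest = 1000 / vrr_max_hz  # can't refresh faster than the panel's max
    slowest = 1000 / vrr_min_hz  # must refresh before dropping below the min
    return min(max(frame_time_ms, fastest), slowest)

# A 90 fps stream (11.1 ms frames) is matched exactly on a 48-144 Hz panel,
# instead of waiting for the next fixed 60 Hz scanout (tearing or stutter).
print(vrr_refresh_interval_ms(1000 / 90))
```

On a fixed 60Hz display, that same 90 fps stream would either tear or have frames held back to the next scanout, which is exactly the mismatch VRR is meant to remove.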

AMD’s GPUs had a bigger year in 2023 than you might realize
AMD's RX 7700 XT in a test bench.

It's safe to say that 2023 turned out to be a good year for the discrete graphics card market. According to the latest data, both AMD and Nvidia saw an increase in add-in board (AIB) GPU shipments in the final quarter of 2023, and the year-over-year gains are also massive. While Nvidia still dominates the market, AMD's share is climbing steadily, and Intel remains in the shadows.

Today's round of market insights comes from Jon Peddie Research (JPR), and it's all about discrete GPUs. According to the analyst firm, discrete GPU shipments increased by 6.8% in the fourth quarter of 2023 compared to the previous quarter, rising from 8.9 million to 9.5 million units. That's well above the less-than-impressive 10-year average of -0.6%. The year-over-year gains are even more impressive, though, with JPR noting a 32% increase over the final quarter of 2022.
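
As a quick sanity check, the quarter-over-quarter growth figure lines up with the unit counts quoted above; only the percentage is derived here.

```python
# Quarter-over-quarter growth implied by the quoted unit counts.
q3_2023, q4_2023 = 8.9, 9.5  # million discrete GPUs shipped
print(f"{(q4_2023 - q3_2023) / q3_2023 * 100:.1f}% growth")  # ~6.7%, in line with JPR's 6.8%
```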

AMD finally has a strategy to beat Nvidia’s DLSS
Frank Azor presenting at AMD's RDNA 3 launch event.

AMD's FidelityFX Super Resolution 3 (FSR 3) has had an uphill climb so far, but things might get better in 2024. According to a statement from the company's chief technology officer, this year will be a big one for AMD in terms of AI -- and this doesn't just mean large-scale AI, but also upscaling. Are we going to see some major changes in AMD's next-gen RDNA 4 graphics cards?

The tantalizing bit of information comes from Mark Papermaster, AMD CTO, who was a guest on the No Priors Podcast. At the very end of the interview, Papermaster gave a few hints as to what's on AMD's agenda for 2024. It's all about AI, and no surprise -- Nvidia has adopted the same approach.
