
Nvidia G-Sync bug is causing high power draw at high refresh rates

Nvidia’s G-Sync technology is designed as a replacement for the much-hated, age-old Vsync frame-rate syncing technology that helps prevent screen tearing. As the more expensive, higher-quality alternative to AMD’s FreeSync, it’s supposed to perform this function without impacting GPU usage, but a bug is causing those running it on high-refresh-rate monitors to run into an irritating issue.

The bug in question seems to stem from running G-Sync on monitors at refresh rates of 144Hz and above. In those cases, users reported that the GPU’s idle clock frequencies jumped to much higher levels and its power draw increased dramatically, despite the PC as a whole sitting largely idle.

The difference can be quite stark, too, with PC Perspective (PCPer) reporting that while raising the refresh rate from 60Hz to 100Hz or 120Hz makes almost no difference to the system’s power draw, jumping to 144Hz or more sees it nearly double.
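If you want to check whether your own setup is affected, the GPU’s idle clocks and power draw are easy to read out programmatically. The sketch below uses Nvidia’s NVML library via the third-party pynvml Python bindings (an assumption on our part; any NVML wrapper, or nvidia-smi itself, would do): leave the desktop idle, run it at 60Hz and again at 144Hz, and compare the numbers.

```python
# Minimal sketch: read idle GPU clocks and power draw via NVML.
# Assumes the third-party pynvml bindings (pip install pynvml) and an
# Nvidia driver recent enough to expose NVML -- not an official Nvidia tool.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

    name = pynvml.nvmlDeviceGetName(handle)
    if isinstance(name, bytes):  # older pynvml versions return bytes
        name = name.decode()

    core_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
    mem_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_MEM)
    power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # NVML reports milliwatts

    print(f"{name}: core {core_mhz} MHz, memory {mem_mhz} MHz, drawing {power_w:.1f} W")
finally:
    pynvml.nvmlShutdown()
```

On an affected system, the core clock and power figures at 144Hz should sit well above their 60Hz values even with nothing running; `nvidia-smi -q -d CLOCK,POWER` reports the same information if you’d rather not script it.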

Related: G-Sync promises to make games buttery smooth, but does it really work? We tested

This is surprising, since Nvidia’s frame-syncing hardware sits inside the monitor itself, meaning the system shouldn’t have to work harder to drive higher refresh rates the way it might with other solutions. In this case, though, that’s exactly what’s happening.

Nvidia has acknowledged the issue, stating that its own tests confirmed a bug in the way the GPU manages its clocks for G-Sync at high refresh rates. A fix is in the works, however, so those with the latest hardware and the newest high-refresh-rate monitors won’t be affected for long: users suffering from this problem can expect a fix in the next driver release from the green GPU maker.

Have you got one of the new high-refresh-rate monitors released by the likes of Asus? If so, have you run into higher power draw than expected?
