
What is G-Sync?

When shopping for a gaming monitor, you’ll undoubtedly come across a few displays advertising Nvidia’s G-Sync technology. In addition to a hefty price hike, these monitors usually come with gaming-focused features like a fast response time and high refresh rate. To help you know where your money is going, we put together a guide to answer the question: What is G-Sync?

In short, G-Sync is a hardware-based adaptive refresh technology that helps prevent screen tearing and stuttering. With a G-Sync monitor, you’ll notice smoother motion while gaming, even at high refresh rates.


What is G-Sync?


G-Sync is Nvidia’s hardware-based display synchronization technology. Its main job is eliminating screen tearing: it matches your monitor’s refresh rate to the number of frames your GPU is pushing out each second.


Your GPU renders a number of frames each second, and put together, those frames give the impression of smooth motion. Similarly, your monitor refreshes a certain number of times each second, clearing the previous image for the new frames your GPU is rendering. To keep things moving smoothly, your GPU stores upcoming frames in a buffer. The problem is that the buffer and your monitor’s refresh rate can fall out of sync, leaving the screen showing parts of two frames stitched together with a visible tear line between them.
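To make that mismatch concrete, here’s a toy Python model of tearing. The function name, the 60Hz panel, and all the numbers are our own illustrative assumptions, not anything Nvidia specifies; the point is simply that a tear happens whenever the GPU swaps buffers while the display is mid-refresh.

```python
# Toy model of screen tearing (illustrative numbers only): the display
# scans out at a fixed 60Hz while the GPU swaps its buffer whenever a
# frame finishes. A swap that lands mid-scanout mixes two frames on
# screen, producing a visible tear line.

REFRESH_HZ = 60
SCANOUT_MS = 1000 / REFRESH_HZ  # time to draw one full refresh

def find_tears(swap_times_ms):
    """Return how far down the screen (0.0-1.0) each tear appears."""
    tears = []
    for t in swap_times_ms:
        pos = (t % SCANOUT_MS) / SCANOUT_MS  # progress through scanout
        if 0.0 < pos < 1.0:                  # swap mid-refresh: tear
            tears.append(round(pos, 2))
    return tears

# A GPU finishing a frame every 7 ms (~143 fps) against a 60Hz panel
# tears on essentially every frame:
tears = find_tears([i * 7 for i in range(1, 15)])
```

Because the 7 ms frame time never lines up with the 16.7 ms scanout, every swap in this example lands mid-refresh.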

V-Sync emerged as a solution. This software-based feature essentially forces your GPU to hold frames in its buffer until your monitor is ready to refresh. That solves the screen tearing problem, but it introduces another: input lag. V-Sync forces your GPU to hold frames it has already rendered, causing a slight delay between what’s happening in the game and what you see on screen.
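A rough sketch of where that delay comes from, again assuming a 60Hz panel (the function and numbers are ours, for illustration): a frame that just misses a refresh boundary sits in the buffer until the next one arrives.

```python
import math

# Toy model of V-Sync input lag (illustrative): a finished frame waits
# in the buffer until the next refresh boundary before it is shown.

def vsync_wait_ms(frame_ready_ms, refresh_hz=60):
    """Milliseconds a frame waits between finishing and being displayed."""
    interval = 1000 / refresh_hz
    next_refresh = math.ceil(frame_ready_ms / interval) * interval
    return next_refresh - frame_ready_ms

# A frame finishing just after a 60Hz refresh waits nearly a full
# 16.7 ms interval before the player sees it:
wait = vsync_wait_ms(17.7)  # roughly 15.6 ms
```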

Nvidia’s first alternative to V-Sync was Adaptive VSync. Like V-Sync, this driver-based solution locked the frame rate to the display’s refresh rate to prevent screen tearing. However, when the GPU struggled, Adaptive VSync unlocked the frame rate until performance improved, then locked it again once the GPU stabilized.
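The toggle described above boils down to a simple threshold. Here’s a sketch of that rule in Python (the function name and 60Hz figure are our own illustration, not Nvidia’s implementation):

```python
# Sketch of Adaptive VSync's decision rule: V-Sync stays on while the
# GPU can keep up with the refresh rate, and switches off (accepting
# some tearing instead of stutter) when frame rates drop below it.

def adaptive_vsync(fps_samples, refresh_hz=60):
    """Return (fps, vsync_on) for each sampled frame rate."""
    return [(fps, fps >= refresh_hz) for fps in fps_samples]

# V-Sync disengages during the 58 and 41 fps dips, then re-engages:
states = adaptive_vsync([72, 65, 58, 41, 60, 90])
```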

Nvidia introduced a hardware-based solution in 2013 called G-Sync. Like VESA’s Adaptive-Sync standard that followed it, G-Sync enables variable refresh rates on the display side. Instead of forcing your GPU to hold frames, G-Sync forces your monitor to adapt its refresh rate to the frames your GPU is rendering. That addresses both input lag and screen tearing.

However, Nvidia uses a proprietary board that replaces the typical scaler board, which controls everything within the display like decoding image input, controlling the backlight, and so on. A G-Sync board contains 768MB of DDR3 memory to store the previous frame so that it can be compared to the next incoming frame. It does this to decrease input lag.

On the PC end, Nvidia’s driver takes full control of the display’s proprietary board. It manipulates the vertical blanking interval, or VBI: the gap between when the monitor finishes drawing one frame and begins drawing the next.

With G-Sync active, the monitor follows your PC’s lead. As the GPU rotates a rendered frame into the primary buffer, the display clears the old image and gets ready to receive the next frame. As the frame rate speeds up and slows down, the display draws each frame as instructed by your PC. Since the G-Sync board supports variable refresh rates, images are often redrawn at widely varying intervals.
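In effect, the refresh interval simply tracks the GPU’s frame time, clamped to the panel’s supported range. A minimal sketch, assuming a hypothetical 30-144Hz panel (real ranges vary by monitor):

```python
# Simplified model of variable refresh: the panel redraws when a new
# frame arrives, so its effective refresh rate follows the GPU's frame
# time, clamped to the range the hardware supports (assumed 30-144Hz).

def gsync_refresh_hz(frame_time_ms, min_hz=30, max_hz=144):
    """Effective refresh rate when the panel follows the GPU."""
    return max(min_hz, min(max_hz, 1000 / frame_time_ms))

# A 12.5 ms frame (80 fps) is drawn at 80Hz; a 5 ms frame is capped at
# the panel's 144Hz maximum; a 50 ms frame falls back to the 30Hz floor:
rates = [gsync_refresh_hz(t) for t in (12.5, 5.0, 50.0)]
```

Real G-Sync modules also handle behavior below the minimum refresh rate (such as redrawing a frame twice), which this sketch ignores.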

G-Sync system requirements


For years, G-Sync monitors came with one big caveat: You needed an Nvidia graphics card. You still need an Nvidia GPU (like the recent RTX 3080) to take full advantage of G-Sync, but more recent G-Sync displays support HDMI variable refresh rate under the “G-Sync Compatible” banner (more on that in the next section). That means you can use variable refresh rate with an AMD card, though not Nvidia’s full G-Sync module. Beyond a display with G-Sync branding, here’s what you need:

Desktops

  • GPU – GeForce GTX 650 Ti BOOST or newer
  • Driver – R340.52 or higher

Laptops connected to G-Sync monitors

  • GPU – GeForce GTX 980M, GTX 970M, or GTX 965M GPU or newer
  • Driver – R340.52 or higher

Laptops with built-in G-Sync displays

  • GPU – GeForce GTX 980M, GTX 970M, or GTX 965M or newer
  • Driver – R352.06 or higher

G-Sync vs. G-Sync Compatible vs. G-Sync Ultimate

Because G-Sync is a hardware solution, certified monitors must include Nvidia’s proprietary board. Fortunately, most major monitor manufacturers like Asus, Philips, BenQ, AOC, Samsung, and LG offer G-Sync displays.

Nvidia currently lists three monitor classes: G-Sync Ultimate, G-Sync, and G-Sync Compatible. Here’s a breakdown of each:

G-Sync Compatible

  • 24 to 88 inches
  • Validated for no artifacts

G-Sync

  • 24 to 38 inches
  • Validated for no artifacts
  • Certified with 300+ image-quality tests

G-Sync Ultimate

  • 27 to 65 inches
  • Validated for no artifacts
  • Certified with 300+ image-quality tests
  • Best-quality HDR
  • 1,000 nits brightness

For G-Sync Ultimate displays, you’ll need a hefty GeForce GPU to handle HDR visuals at 4K. They’re certainly not cheap, but they provide the best experience.

As for G-Sync Compatible, it’s a newer category. These displays do not include Nvidia’s proprietary G-Sync board, but they do support variable refresh rates. These panels typically fall under AMD’s FreeSync umbrella, which is a competing technology for Radeon-branded GPUs that doesn’t rely on a proprietary scaler board. Nvidia tests these displays to guarantee “no artifacts” when connected to GeForce-branded GPUs. Consider these displays as affordable alternatives to G-Sync and G-Sync Ultimate displays.

Overall, resolutions range from Full HD to 4K, while maximum refresh rates range from 60Hz to 240Hz. Nvidia provides a full list of compatible monitors on its website. Prices range from about $100 to well over $1,000; Asus’ ROG Swift PG279Q, for example, is a 27-inch monitor selling for $698.

G-Sync TVs


Since G-Sync launched in 2013, it has been exclusive to monitors. However, Nvidia is expanding. Last year, Nvidia partnered with LG to certify recent LG OLED TVs as G-Sync Compatible. You’ll need updated drivers and TV firmware to get started, which Nvidia outlines on its site. Here are the currently available TVs that support G-Sync:

  • LG BX 2020 (50-, 65-, and 77-inch)
  • LG CX 2020 (50-, 65-, and 77-inch)
  • LG GX 2020 (50-, 65-, and 77-inch)
  • LG B9 2019 (50- and 65-inch)
  • LG C9 2019 (50-, 65-, and 77-inch)
  • LG E9 2019 (50- and 65-inch)

FreeSync: the G-Sync alternative


As we pointed out earlier, AMD’s FreeSync derives from VESA’s Adaptive-Sync technology. One of the main differences is that it doesn’t use proprietary hardware. Rather, FreeSync-certified displays use off-the-shelf scaler boards, which lowers the cost. The only AMD hardware you need for FreeSync is a Radeon-branded GPU. AMD introduced FreeSync support in 2015.

FreeSync offers more freedom in supported monitor options, and you don’t need extra hardware, making it a budget-friendly alternative to G-Sync hardware. Asus’ MG279Q, for example, costs around $100 less than the aforementioned ROG Swift monitor.

No matter which you choose, both technologies address the graphical glitches caused by monitor and GPU synchronization issues, and there are numerous graphics cards and monitors to choose from to up your gaming experience.

Some downsides

One downside is the price. Whether you’re looking at a laptop or desktop, G-Sync requires both a capable monitor and graphics card. While there are many G-Sync compatible graphics cards, giving you plenty of budgetary options, G-Sync monitors are almost always more expensive than their AMD FreeSync counterparts. Compatible laptops may be even more expensive.

In addition, users point to a lack of compatibility with Nvidia’s Optimus technology. Optimus, implemented in many laptops, adjusts graphics performance on the fly to provide the necessary power to graphics-intensive programs and optimize battery life. Because Optimus routes frames through the integrated graphics system, they reach the screen at a fixed interval rather than as they are created, as with G-Sync. You can buy an Optimus-capable device or a G-Sync-capable device, but no laptop does both.

Kevin Parrish
Former Digital Trends Contributor