What is G-Sync?

If you’re shopping for a gaming computer, you may see the term “G-Sync” in product descriptions and specifications. You likely already know that G-Sync comes highly recommended for serious gamers, but what is G-Sync exactly? We’re here to answer that burning question.

In short, G-Sync is a tool developed by Nvidia to solve problems with image smoothness. Without proper synchronization, the visuals on your screen can “rip” horizontally, as if the display taped together two pieces of a torn photograph. You may also see “stuttering” when the PC falls below your monitor’s refresh rate.

G-Sync aims to solve these issues.

Ripping apart your universe

Few computing programs require more system resources than games, and game developers typically push graphics hardware to its limits. Because of that, your graphics card and monitor can fall out of sync, meaning your PC’s output doesn’t match the display’s refresh rate.

The movement you see on the screen, whether it’s a movie, TV show, or PC game, is just an illusion. You see images flash before your eyes 30 (or more) times per second, and your brain pieces them together. Movies and TV shows still mostly remain at 24 frames per second, while you can push PC games well beyond the 200 fps mark.
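The arithmetic behind those numbers is simple: at a fixed frame rate, a new image appears every 1000/fps milliseconds. A quick illustrative sketch:

```python
# Frame interval for common rates: a new image appears every
# 1000 / fps milliseconds (illustrative arithmetic only).
for fps in (24, 30, 60, 144, 240):
    interval_ms = 1000 / fps
    print(f"{fps:3d} fps -> one frame every {interval_ms:.1f} ms")
```

At 24 fps each frame lingers for almost 42 ms, while at 240 fps a frame is gone in about 4 ms.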

Inside your PC, the graphics chip does all this rendering while the CPU handles math, physics, artificial intelligence, and so on. The two work side-by-side, but your GPU does most of the heavy lifting.

In a simplified explanation, the GPU uses two dedicated slots, or buffers, in the video memory. The secondary buffer is where the GPU renders the current frame, while the primary buffer holds a completed frame that’s transported to the display. When the GPU completes a frame in the secondary buffer, the two buffers swap positions: The secondary becomes the primary and the primary becomes the secondary.
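The swap described above can be sketched in a few lines of Python. This is a toy model of double buffering, not a real graphics API; all names here are illustrative:

```python
# Minimal sketch of double buffering: the GPU draws into the "back"
# (secondary) buffer while the display reads the "front" (primary) one.
# Illustrative model only, not a real graphics API.

class DoubleBuffer:
    def __init__(self):
        self.front = "frame 0"   # completed frame, scanned out to the display
        self.back = None         # frame currently being rendered

    def render(self, frame):
        self.back = frame        # GPU finishes drawing into the back buffer

    def swap(self):
        # When rendering completes, the two buffers trade roles
        self.front, self.back = self.back, self.front

buffers = DoubleBuffer()
buffers.render("frame 1")
buffers.swap()
print(buffers.front)  # the display now scans out "frame 1"
```

Tearing happens when the display reads from the front buffer while this swap occurs mid-scan.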

Meanwhile, the display receives a frame, flashes it before your eyes, and then clears it out — aka vertical blanking — for the next frame. If this process isn’t synchronized with the GPU’s buffer swaps, the image displayed on the screen comprises part of the first completed frame in the old primary buffer and part of the next completed frame in the new primary buffer.

Due to the way your screen draws images, these “rips” are horizontal. If there’s no movement, then you probably won’t see this effect. If the camera moves horizontally just a hair, the issue rips apart your virtual world.

Stuttering and input lag

Vertical Synchronization, or V-Sync, attempts to address this issue using software. The idea is to cap the GPU’s output at the display’s refresh rate and eliminate visual tearing, but this introduces additional issues that can be just as annoying: stuttering and input lag.

Typically, games include a V-Sync setting. If your display can only flash 60 images per second, then this setting limits the GPU’s output to 60 frames per second. Stuttering appears when the GPU can’t maintain that frame rate, and the display must re-use a frame until the GPU sends over a fresh copy. If the issue continues, V-Sync will lock down the game’s frame rate to 50% of the display’s refresh rate.
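That fallback behavior comes down to timing: with double-buffered V-Sync, a frame can only appear on a refresh boundary, so a frame that just misses one waits a full extra interval. A simplified model (not actual driver logic):

```python
import math

# Sketch of double-buffered V-Sync pacing: a frame can only be shown on a
# refresh boundary, so a frame that misses one must wait for the next.
# Simplified illustrative model, not actual driver code.

def effective_fps(render_time_ms, refresh_hz=60):
    interval = 1000 / refresh_hz
    # number of whole refresh intervals each frame occupies on screen
    intervals_per_frame = math.ceil(render_time_ms / interval)
    return refresh_hz / intervals_per_frame

print(effective_fps(15))  # GPU keeps up with the 16.7 ms budget: 60 fps
print(effective_fps(17))  # just misses a refresh: drops straight to 30 fps
```

Notice there’s no middle ground: missing the 16.7 ms budget by even a fraction of a millisecond halves the effective frame rate, which is exactly the stutter described above.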

Meanwhile, input lag has absolutely nothing to do with your mouse or keyboard — your PC and game receive the input as normal. But because the GPU is forced to withhold frames due to V-Sync, there’s a delay between your input and your rendered actions on the screen.

Overall, V-Sync fixes one problem but introduces two others. That’s why G-Sync is the better solution.

This is G-Sync


Nvidia’s first alternative to V-Sync was Adaptive VSync. Like the older technology, Nvidia’s driver-based solution locked the frame rate to the display’s refresh rate to prevent screen tearing. However, when the GPU struggled, Adaptive VSync unlocked the frame rate until the GPU’s performance improved. Once stable, Adaptive VSync locked the frame rate until the GPU’s performance dropped again.

Nvidia introduced a hardware-based solution in 2013 called G-Sync. It predates VESA’s Adaptive-Sync standard and uses Nvidia’s own hardware to enable variable refresh rates on the display side.

However, Nvidia uses a proprietary board that replaces the typical scaler board, which controls everything within the display like decoding image input, controlling the backlight, and so on. A G-Sync board contains 768MB of DDR3 memory to store the previous frame so that it can be compared to the next incoming frame. It does this to decrease input lag.

On the PC end, Nvidia’s driver can fully control the display’s proprietary board. It manipulates the vertical blanking interval, or VBI, which is the interval between the moment a monitor finishes drawing the current frame and the beginning of the next one.

With G-Sync active, the monitor becomes a slave to your PC. As the GPU rotates the rendered frame into the primary buffer, the display clears the old image and gets ready to receive the next frame. As the frame rate speeds up and slows down, the display renders each frame accordingly as instructed by your PC. Since the G-Sync board supports variable refresh rates, images are often redrawn at widely varying intervals.
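The idea can be sketched as a simple timing model: the display refreshes when a frame is ready, clamped to the panel’s supported range. The minimum and maximum rates below are hypothetical examples, and this is a conceptual sketch, not Nvidia’s implementation:

```python
# Sketch of variable refresh: instead of refreshing at fixed intervals,
# the display refreshes when a frame arrives, clamped to the panel's
# supported range. Hypothetical model of the concept only.

def refresh_interval_ms(frame_time_ms, min_hz=30, max_hz=144):
    fastest = 1000 / max_hz   # can't refresh faster than the panel allows
    slowest = 1000 / min_hz   # below min_hz the panel must redraw anyway
    return min(max(frame_time_ms, fastest), slowest)

for ft in (5.0, 12.5, 40.0):
    print(f"frame ready in {ft} ms -> display refreshes after "
          f"{refresh_interval_ms(ft):.1f} ms")
```

Within the panel’s range, the refresh interval simply tracks the GPU’s frame time, so there is nothing to tear and nothing to stutter against.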

Finally, to take advantage of Nvidia’s G-Sync technology, you must have a GeForce-branded graphics card (desktop) or discrete GPU (laptop). Here are the hardware and driver requirements:


Desktops connected to G-Sync monitors

  • GPU – GeForce GTX 650 Ti BOOST or newer
  • Driver – R340.52 or higher

Laptops connected to G-Sync monitors

  • GPU – GeForce GTX 980M, GTX 970M, or GTX 965M GPU or newer
  • Driver – R340.52 or higher

Laptops with built-in G-Sync displays

  • GPU – GeForce GTX 980M, GTX 970M, or GTX 965M or newer
  • Driver – R352.06 or higher

Manufacturers that support G-Sync

Because G-Sync is a hardware solution, certified monitors must include Nvidia’s proprietary board. Fortunately, most major monitor manufacturers like Asus, Philips, BenQ, AOC, Samsung, and LG offer G-Sync displays.

Nvidia currently lists three monitor classes: G-Sync Ultimate, G-Sync, and G-Sync Compatible. Here’s a breakdown of each:

G-Sync Ultimate

  • 27 to 65 inches
  • Validated no artifacts
  • Certified with 300+ tests
  • Best quality HDR
  • 1000 nits brightness


G-Sync

  • 24 to 38 inches
  • Validated no artifacts
  • Certified with 300+ tests

G-Sync Compatible

  • 24 to 88 inches
  • Validated no artifacts

For G-Sync Ultimate displays, you’ll need a hefty GeForce GPU to handle HDR visuals at 4K. They’re certainly not cheap, but they provide the best experience.

As for G-Sync Compatible, it’s a newer category. These displays do not include Nvidia’s proprietary G-Sync board, but they do support variable refresh rates. These panels typically fall under AMD’s FreeSync umbrella, which is a competing technology for Radeon-branded GPUs that doesn’t rely on a proprietary scaler board. Nvidia tests these displays to guarantee “no artifacts” when connected to GeForce-branded GPUs. Consider these displays as affordable alternatives to G-Sync and G-Sync Ultimate displays.

Overall, resolutions range from Full HD to 4K, while maximum refresh rates range from 60Hz to 240Hz. Nvidia provides a full list of compatible monitors on its website. Prices range from about $100 to well over $1,000, like Asus’ ROG Swift PG279Q 27-inch monitor selling for $698.

Some downsides

One downside is the price. Whether you’re looking at a laptop or desktop, G-Sync requires both a capable monitor and graphics card. While there are many G-Sync compatible graphics cards, giving you plenty of budgetary options, G-Sync monitors are almost always more expensive than their AMD FreeSync counterparts. Compatible laptops may be even more expensive.

In addition, users point to a lack of compatibility with Nvidia’s Optimus technology. Optimus, implemented in many laptops, adjusts graphics performance on the fly to provide the necessary power to graphics-intensive programs and optimize battery life. Because the technology relies on an integrated graphics system, frames move to the screen at a set interval, not as they are created as seen with G-Sync. One can purchase an Optimus-capable device or a G-Sync-capable device, but no laptop exists that can do both.

FreeSync: the G-Sync alternative

AMD’s FreeSync builds on VESA’s Adaptive-Sync technology and doesn’t use proprietary hardware. Instead, FreeSync-certified displays use off-the-shelf scaler boards, reducing the overall cost. The only AMD hardware required for FreeSync is a Radeon-branded GPU. AMD introduced FreeSync support in 2015.

For a full explanation on FreeSync and its two variants, check out our separate article: What is FreeSync?

Currently, there are more monitors that support FreeSync than G-Sync. Furthermore, because these monitors don’t require additional hardware, they tend to be cheaper than their G-Sync-capable counterparts. For example, Asus' MG279Q is about $100 less than the aforementioned ROG Swift monitor.

Each technology has its unique strengths, but there is a plethora of graphics card and monitor combinations that support these features. If you’re tired of the graphical glitches caused by your monitor and GPU being out of sync, help is here.
