
What is G-Sync?

When shopping for a gaming monitor, you’ll undoubtedly come across a few displays advertising Nvidia’s G-Sync technology. In addition to a hefty price hike, these monitors usually come with gaming-focused features like a fast response time and high refresh rate. To help you know where your money is going, we put together a guide to answer the question: What is G-Sync?

In short, G-Sync is a hardware-based adaptive refresh technology that helps prevent screen tearing and stuttering. With a G-Sync monitor, you’ll notice smoother, tear-free motion while gaming, even when your frame rate fluctuates.

What is G-Sync?


G-Sync is Nvidia’s hardware-based monitor syncing technology. It primarily solves screen tearing by synchronizing your monitor’s refresh rate with the number of frames your GPU is pushing out each second.


Your GPU renders a number of frames each second, and put together, those frames give the impression of smooth motion. Similarly, your monitor refreshes a certain number of times each second, clearing the previous image for the new frames your GPU is rendering. To keep things moving smoothly, your GPU stores upcoming frames in a buffer. The problem is that the buffer and your monitor’s refresh rate can fall out of sync, leaving the screen showing parts of two different frames stitched together at a visible seam — that’s screen tearing.
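To make that mismatch concrete, here’s a toy Python model (all numbers are illustrative and not tied to any real display): a 60Hz monitor scans out an image every ~16.7ms, and any buffer swap that lands mid-scanout produces a torn refresh.

```python
# Toy model of screen tearing (illustrative numbers only): a 60Hz monitor
# scans out an image every ~16.7ms. If the GPU swaps its frame buffer
# while a scanout is in progress, that refresh shows parts of two
# different frames stitched together -- a tear.

REFRESH_MS = 1000 / 60  # one scanout per refresh interval

def count_tears(swap_times_ms, total_ms):
    """Count refreshes during which a buffer swap landed mid-scanout."""
    tears = 0
    t = 0.0
    while t < total_ms:
        scan_start, scan_end = t, t + REFRESH_MS
        if any(scan_start < s < scan_end for s in swap_times_ms):
            tears += 1
        t = scan_end
    return tears

# A GPU rendering at ~90 fps swaps buffers every ~11.1ms, which rarely
# lines up with the 60Hz scanout, so most refreshes are torn.
swaps = [i * (1000 / 90) for i in range(1, 9)]
print(count_tears(swaps, 100))
```

The key point the sketch captures is that tearing isn’t a GPU bug or a monitor bug; it’s purely a timing collision between two devices running on independent clocks.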

V-Sync emerged as a solution. This software-based feature essentially forces your GPU to hold frames in its buffer until your monitor is ready to refresh. That solves the screen tearing problem, but it introduces another: input lag. Because V-Sync forces your GPU to hold frames it has already rendered, there’s a slight delay between what’s happening in the game and what you see on screen.
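That added delay can be sketched in a few lines: a finished frame has to wait for the next refresh boundary, and the wait is the lag. This is a simplified model, not any driver’s actual code.

```python
import math

REFRESH_MS = 1000 / 60  # a hypothetical 60Hz display

def vsync_display_time(render_done_ms):
    """With V-Sync on, a finished frame is held until the next refresh
    boundary; the wait between finishing and displaying is added lag."""
    return math.ceil(render_done_ms / REFRESH_MS) * REFRESH_MS

# A frame finishing ~1ms into a refresh cycle waits nearly a full
# interval before it appears on screen.
frame_done = 17.7
lag = vsync_display_time(frame_done) - frame_done
print(f"added input lag: {lag:.1f}ms")
```

In the worst case, a frame that finishes just after a refresh begins waits almost a full 16.7ms before it’s shown, which is why V-Sync feels sluggish in fast-paced games.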

Nvidia’s first alternative to V-Sync was Adaptive VSync. Like the older technology, Nvidia’s driver-based solution locked the frame rate to the display’s refresh rate to prevent screen tearing. However, when the GPU struggled, Adaptive VSync unlocked the frame rate until the GPU’s performance improved. Once stable, Adaptive VSync locked the frame rate until the GPU’s performance dropped again.
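The Adaptive VSync policy described above boils down to a simple rule; the threshold logic below is a sketch of that behavior, not the driver’s actual implementation.

```python
REFRESH_HZ = 60  # hypothetical fixed display refresh rate

def adaptive_vsync_enabled(current_fps):
    """Toy version of the policy described above: keep V-Sync on while
    the GPU can match the refresh rate, and turn it off (accepting some
    tearing) when performance drops, rather than letting missed
    refreshes stutter or halve the frame rate."""
    return current_fps >= REFRESH_HZ

for fps in (90, 60, 45):
    state = "on" if adaptive_vsync_enabled(fps) else "off"
    print(f"{fps} fps -> V-Sync {state}")
```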

Nvidia introduced a hardware-based solution in 2013 called G-Sync. Instead of forcing your GPU to hold frames, G-Sync forces your monitor to adapt its refresh rate to the frames your GPU is rendering. That deals with both input lag and screen tearing. (VESA later standardized a similar display-side technique, Adaptive-Sync, which enables variable refresh rates without proprietary hardware.)

However, Nvidia uses a proprietary board that replaces the typical scaler board, which controls everything within the display like decoding image input, controlling the backlight, and so on. A G-Sync board contains 768MB of DDR3 memory to store the previous frame so that it can be compared to the next incoming frame. It does this to decrease input lag.

On the PC end, Nvidia’s driver can fully control the display’s proprietary board. It manipulates the vertical blanking interval, or VBI, which represents the interval between the time when a monitor finishes drawing the current frame and the beginning of the next frame.

With G-Sync active, the monitor is driven entirely by your PC. As the GPU flips a finished frame into the front buffer, the display clears the old image and prepares to receive the next one. As the frame rate speeds up and slows down, the display draws each frame as instructed by your PC, and since the G-Sync board supports variable refresh rates, images are often redrawn at widely varying intervals.
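A rough sketch of that variable-refresh behavior: the panel holds its blanking interval until the next frame arrives, refreshing no sooner than its minimum refresh interval and no later than its maximum. The 144Hz-to-30Hz bounds here are hypothetical, and the model glosses over details like low-framerate frame redraws.

```python
REFRESH_MIN_MS = 1000 / 144  # fastest this hypothetical panel can refresh
REFRESH_MAX_MS = 1000 / 30   # longest it can hold the blanking interval

def gsync_display_times(render_done_ms):
    """Variable refresh, simplified: the panel extends its vertical
    blanking interval until the next frame arrives, then refreshes
    right away, clamped to the panel's min/max refresh intervals."""
    shown, last = [], 0.0
    for done in render_done_ms:
        earliest = last + REFRESH_MIN_MS  # panel can't refresh sooner
        latest = last + REFRESH_MAX_MS    # ...or hold the blank longer
        last = min(max(done, earliest), latest)
        shown.append(last)
    return shown

# Unevenly timed frames appear almost as soon as they finish, instead
# of waiting for a fixed 60Hz boundary.
print(gsync_display_times([5.0, 30.0, 36.0, 80.0]))
```

Compare this with the V-Sync model earlier: frames are no longer quantized to fixed refresh boundaries, which is where the smoothness and reduced lag come from.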

G-Sync system requirements


For years, there was one big caveat with G-Sync monitors: You needed an Nvidia graphics card. You still need an Nvidia GPU (like the recent RTX 3080) to take full advantage of G-Sync, but more recent displays support variable refresh rate under the “G-Sync Compatible” banner (more on that in the next section). That means you can use variable refresh rate with an AMD card, though not Nvidia’s full G-Sync module. Beyond a display with a G-Sync badge, here’s what you need:

Desktops

  • GPU – GeForce GTX 650 Ti BOOST or newer
  • Driver – R340.52 or higher

Laptops connected to G-Sync monitors

  • GPU – GeForce GTX 980M, GTX 970M, or GTX 965M GPU or newer
  • Driver – R340.52 or higher

Laptops with built-in G-Sync displays

  • GPU – GeForce GTX 980M, GTX 970M, or GTX 965M or newer
  • Driver – R352.06 or higher

G-Sync vs. G-Sync Compatible vs. G-Sync Ultimate

Because G-Sync is a hardware solution, certified monitors must include Nvidia’s proprietary board. Fortunately, most major monitor manufacturers like Asus, Philips, BenQ, AOC, Samsung, and LG offer G-Sync displays.

Nvidia currently lists three monitor classes: G-Sync Ultimate, G-Sync, and G-Sync Compatible. Here’s a breakdown of each:

G-Sync Compatible

  • 24 to 88 inches
  • Validated no artifacts

G-Sync

  • 24 to 38 inches
  • Validated no artifacts
  • Certified with 300+ tests

G-Sync Ultimate

  • 27 to 65 inches
  • Validated no artifacts
  • Certified with 300+ tests
  • Best quality HDR
  • 1000 nits brightness

For G-Sync Ultimate displays, you’ll need a hefty GeForce GPU to handle HDR visuals at 4K. They’re certainly not cheap, but they provide the best experience.

As for G-Sync Compatible, it’s a newer category. These displays do not include Nvidia’s proprietary G-Sync board, but they do support variable refresh rates. These panels typically fall under AMD’s FreeSync umbrella, which is a competing technology for Radeon-branded GPUs that doesn’t rely on a proprietary scaler board. Nvidia tests these displays to guarantee “no artifacts” when connected to GeForce-branded GPUs. Consider these displays as affordable alternatives to G-Sync and G-Sync Ultimate displays.

Overall, resolutions range from Full HD to 4K, while maximum refresh rates range from 60Hz to 240Hz. Nvidia provides a full list of compatible monitors on its website. Prices range from about $100 to well over $1,000, like Asus’ ROG Swift PG279Q, a 27-inch monitor selling for $698.

G-Sync TVs


Since G-Sync launched in 2013, it has been limited to monitors. However, Nvidia is expanding: It has partnered with LG to certify recent LG OLED TVs as G-Sync Compatible. You’ll need up-to-date drivers and firmware to get started, which Nvidia outlines on its site. Here are the currently available TVs that support G-Sync:

  • LG BX 2020 (55-, 65-, and 77-inch)
  • LG CX 2020 (55-, 65-, and 77-inch)
  • LG GX 2020 (55-, 65-, and 77-inch)
  • LG B9 2019 (55- and 65-inch)
  • LG C9 2019 (55-, 65-, and 77-inch)
  • LG E9 2019 (55- and 65-inch)

FreeSync: the G-Sync alternative


As we pointed out earlier, AMD’s FreeSync is built on VESA’s Adaptive-Sync standard. One of the main differences from G-Sync is that it doesn’t use proprietary hardware. Rather, FreeSync-certified displays use off-the-shelf scaler boards, which lowers the cost. The only AMD hardware you need for FreeSync is a Radeon-branded GPU. AMD introduced FreeSync support in 2015.

FreeSync supports a wider range of monitors, and you don’t need extra hardware, making it a budget-friendly alternative to G-Sync hardware. Asus’ MG279Q, for example, is around $100 less than the aforementioned ROG Swift monitor.

No matter which you choose, both technologies eliminate the graphical glitches caused by monitor and GPU synchronization issues, and there are numerous graphics cards and monitors to choose from to up your gaming experience.

Some downsides

One downside is the price. Whether you’re looking at a laptop or desktop, G-Sync requires both a capable monitor and graphics card. While there are many G-Sync-compatible graphics cards, giving you plenty of budgetary options, G-Sync monitors are almost always more expensive than their AMD FreeSync counterparts. Compatible laptops may be even more expensive.

In addition, users point to a lack of compatibility with Nvidia’s Optimus technology. Optimus, implemented in many laptops, switches between integrated and dedicated graphics on the fly to balance performance in graphics-intensive programs against battery life. Because Optimus routes frames through the integrated graphics system, they reach the screen at a set interval rather than as they are created, which is incompatible with G-Sync’s variable refresh approach. You can buy an Optimus-capable laptop or a G-Sync-capable laptop, but not one that does both.

Kevin Parrish
Former Digital Trends Contributor