FreeSync vs. G-Sync

G-Sync and FreeSync can make your games look better, but which is best?

[Image: Acer Predator XB2. Bill Roberson/Digital Trends]

If you’ve ever experienced screen tearing in a PC game, you know how annoying it can be. An otherwise perfectly rendered title totally ruined by gross horizontal lines and stuttering. You can turn on V-Sync, but if you don’t have a high-end system, it can put a huge dent in your performance.

Both Nvidia and AMD have stepped up to try to solve the issue while preserving framerates, and both manufacturers have turned to adaptive refresh technology for the solution. Let's break down how each works and which is the better option for you.

Performance

G-Sync and FreeSync are both designed to smooth out gameplay, reduce input lag, and prevent screen tearing. They take different approaches to those goals, and what really sets them apart is that one is closely guarded while the other is openly shared. Nvidia's G-Sync works through a proprietary module built into the monitor, while FreeSync uses the video card to manage the monitor's refresh rate via the VESA Adaptive-Sync standard built into DisplayPort. The result is a difference in performance.
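To make the contrast concrete, here's a minimal sketch in Python of why a fixed-refresh display tears while an adaptive one doesn't. The function names and the 48–144Hz panel range are hypothetical, for illustration only; this isn't a real driver API.

```python
# Illustrative sketch only: contrasts a fixed-refresh display, which scans out
# on a rigid clock, with an adaptive-refresh display, which waits for the GPU
# to finish each frame (within the panel's supported range).

def fixed_refresh_tears(frame_time_ms, refresh_hz=60):
    """A fixed display scans out every 1000/refresh_hz ms; if the buffer swap
    lands mid-scan (frame time not a multiple of that interval), part of the
    old frame and part of the new one appear together: a tear."""
    scanout_interval_ms = 1000 / refresh_hz
    return frame_time_ms % scanout_interval_ms != 0

def adaptive_refresh_tears(frame_time_ms, min_hz=48, max_hz=144):
    """An adaptive display delays scanout until the frame is ready, so there
    is no tearing as long as the frame time stays inside the supported range."""
    return not (1000 / max_hz <= frame_time_ms <= 1000 / min_hz)

# A 75fps game (13.3ms frames) tears on a fixed 60Hz panel, but an adaptive
# 48-144Hz panel simply refreshes at 75Hz:
print(fixed_refresh_tears(13.3))     # True
print(adaptive_refresh_tears(13.3))  # False
```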

Users have noted that although tearing and stuttering are reduced with FreeSync enabled, some monitors exhibit another problem: ghosting. As objects move on the screen, they leave behind a bit of the image of their last position, like a shadow. It's an artifact that some people don't notice at all and that really annoys others.

There are a lot of fingers being pointed at what might be causing it, but the physical culprit is power management. Apply too little power to the pixels and the image shows gaps; apply too much and you'll see ghosting. Balancing adaptive refresh technology against proper power distribution is hard.


Both FreeSync and G-Sync also start to suffer when the framerate doesn't stay within the monitor's supported refresh range. G-Sync can show flickering at very low framerates, and while the technology usually compensates to fix it, there are exceptions. FreeSync, meanwhile, has stuttering problems if the framerate drops below a monitor's stated minimum refresh rate. Some FreeSync monitors have an extremely narrow adaptive refresh range, and if your video card can't deliver frames within that range, problems arise.
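The low-framerate compensation mentioned above can be sketched as frame multiplication: when the framerate falls below the panel's minimum, each frame is scanned out two or more times so the effective refresh rate moves back into range. The numbers and function below are hypothetical illustrations, not vendor code; note that this trick only works when the panel's maximum refresh is at least roughly double its minimum, which is exactly why narrow-range monitors can't compensate.

```python
# Hedged sketch of low-framerate compensation on a hypothetical 48-144Hz
# panel. Not a vendor API; numbers are illustrative.

def effective_refresh(fps, min_hz=48, max_hz=144):
    """Return (effective_hz, repeats): each rendered frame is scanned out
    `repeats` times so the panel stays inside its supported range."""
    if fps >= max_hz:
        return max_hz, 1              # capped at the panel's maximum
    repeats = 1
    while fps * repeats < min_hz:     # below the panel's minimum?
        repeats += 1                  # show each frame 2x, 3x, ...
    return fps * repeats, repeats

print(effective_refresh(30))   # (60, 2): each frame shown twice
print(effective_refresh(20))   # (60, 3): each frame shown three times
print(effective_refresh(100))  # (100, 1): already in range

# On a narrow-range panel (say 48-75Hz), doubling a 40fps signal would demand
# 80Hz -- beyond the panel's maximum -- so compensation isn't possible and
# the stutter described above appears instead.
```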

Most reviewers who've compared the two side by side seem to prefer the quality of G-Sync, which doesn't show stutter issues at low framerates and is thus smoother in real-world situations.

Selection

One of the first differences you'll hear people talk about when it comes to adaptive refresh technology, besides the general rivalry between AMD and Nvidia, is the difference between a closed and an open standard. While G-Sync is proprietary Nvidia technology that requires the company's permission and cooperation to use, FreeSync is free to implement; widespread adoption is the goal of the program, not licensing revenue. As a result, there are more monitors available with FreeSync support.

On the other hand, G-Sync has been around longer, and it's also managed by Nvidia, the current leader in GPU manufacturing. That may keep AMD's lead in compatible monitors from growing, but for now AMD still has the upper hand in selection.

In most cases, you can't mix and match between the two technologies. While the monitors themselves will work irrespective of the brand of graphics card, the FreeSync and G-Sync features require an AMD or Nvidia GPU, respectively. You have to choose whether you want to go with Nvidia or AMD, and then purchase a monitor accordingly.

If you go the Nvidia route, the module in the monitor is going to handle a lot of the heavy lifting involved in adjusting the refresh rate. That’s going to be reflected in the price you pay for the monitor since each manufacturer has to pay Nvidia for the hardware. The upside is that the technology has been readily available since early 2014, so it’s available in monitors as cheap as $350, like the Acer Predator XB241H.

The G-Sync module also does most of the heavy lifting, so as long as your monitor is compatible, you can use lower-end cards. Nvidia lists the compatible options, which range from the Titan X and 1080 Ti all the way down to the 1050, which retails for as little as $150.

[Image: Asus FreeSync display. Asus]

You won't end up paying much extra for a monitor with FreeSync, since manufacturers pay no premium to include it, unlike with G-Sync. As such, FreeSync monitors in the mid-hundreds frequently come with a 1440p display and a 144Hz refresh rate (where their G-Sync counterparts might not), and monitors without those features can run as low as $160.

You'll also need a card that supports FreeSync, which has traditionally meant AMD graphics cards and APUs, including consoles like the Xbox One, which uses an AMD APU. But that traditional separation between G-Sync and FreeSync has become blurrier now that Nvidia cards can support FreeSync, thanks to a driver update that allows GeForce GTX 10-series, GTX 16-series, and RTX 20-series graphics cards to work with FreeSync monitors. It generally works, but there's a catch: it's only guaranteed to work properly on FreeSync monitors certified as 'Nvidia G-Sync Compatible,' meaning the monitors have been rigorously tested and approved by Nvidia to ensure FreeSync runs smoothly across that card range. Here's a current list of certified monitors.

Conclusion

Without any other components, you should expect to spend at least $450 on a 1080p G-Sync monitor and GTX 1050 graphics card, and much more if you want a setup that can actually handle 4K gaming. Yet for a little under $300, you can get into the base level of FreeSync compatibility with a monitor like the Asus VG245H and a card like the Radeon RX 550, which will squeeze out 1080p gaming with decent performance. The good news with AMD is that, up to the RX 580 (which is a great card for 1440p gaming), price points are comparable to Nvidia cards. That means you'll be able to get an equally powerful GPU without the G-Sync premium.
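Tallied explicitly, the budgets above look like this. Prices are the article's ballpark figures except the Radeon RX 550, whose rough $130 price is my assumption, and pairing the $350 XB241H with a $150 GTX 1050 lands a bit above the article's $450 floor, which evidently assumes a cheaper G-Sync panel. All of these prices shift over time.

```python
# Entry-level build totals using the article's ballpark prices.
# The Radeon RX 550 price (~$130) is an assumption, not from the article.

gsync_build = {
    "Acer Predator XB241H (G-Sync, 1080p)": 350,
    "GeForce GTX 1050": 150,
}
freesync_build = {
    "Asus VG245H (FreeSync, 1080p)": 160,
    "Radeon RX 550": 130,
}

print(sum(gsync_build.values()))     # 500
print(sum(freesync_build.values()))  # 290
```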

Given the price gap, you might wonder why anyone would prefer G-Sync. The answer is simple — it’s superior. Nvidia’s adaptive refresh technology just delivers more consistent overall performance. It’s also worth noting that, when it comes to high performance and 4K gaming, Nvidia video cards are currently the performance king. Going with FreeSync, and thus buying an AMD Radeon card, might mean purchasing hardware that delivers less bang for your buck.

Fortunately, the new G-Sync Compatible program gives buyers a lot of new options. If you already have a GeForce GTX 10-series or newer card, you can buy a cheaper FreeSync monitor that is certified to work with your Nvidia card. After that, simply use this handy guide to activate G-Sync on a FreeSync monitor.

Ultimately, both of these technologies largely accomplish their goals and deliver an experience that's superior to V-Sync. Your choice will depend on whether you prefer value or a top-notch gaming experience.
