
AMD is finally taking FreeSync to the next level


It took a long time, but AMD has finally updated the requirements for its FreeSync adaptive sync technology, and it was a much-needed change. Previously, the base tier of FreeSync had no refresh rate requirement that monitors had to meet. Now AMD hasn't just added a requirement; it has added a pretty massive one, and that's great news for the future of gaming monitors.

When AMD first introduced FreeSync in 2015, the vast majority of gamers and casual users alike were using a 60Hz monitor. While screens with higher refresh rates existed, they were a rarity. That’s no longer the case today, and almost all of the top monitors, regardless of their price, offer refresh rates of over 120Hz.


Displays with lower refresh rates are still being made and sold widely, but you can usually find an alternative that offers 144Hz or more, provided we’re not talking about massive 4K ultrawide monitors.


In its blog post, AMD notes the shift the market has undergone over the past nine years and announces new requirements for FreeSync, FreeSync Premium, and FreeSync Premium Pro.

FreeSync
- Laptops: max refresh rate of 40-60Hz
- Monitors and TVs (horizontal resolution under 3,440 pixels): max refresh rate of at least 144Hz

FreeSync Premium
- Laptops: max refresh rate of at least 120Hz
- Monitors and TVs under 3,440 horizontal pixels: at least 200Hz
- Monitors and TVs at 3,440 horizontal pixels and above: at least 120Hz

FreeSync Premium Pro
- Laptops, monitors, and TVs: same as FreeSync Premium, plus AMD FreeSync HDR

Let’s review the new requirements. For starters, basic FreeSync asks laptops for 40-60Hz variable refresh rates, but monitors with a horizontal resolution of less than 3,440 pixels (so 1080p and 1440p monitors) must now offer a refresh rate of at least 144Hz. That’s a major improvement over the previous spec, which didn’t set any minimum refresh rate for monitors at all.

The midrange FreeSync Premium tier kicks things up a notch. Laptops must now offer a refresh rate of at least 120Hz, while monitors under 3,440 horizontal pixels need to hit at least 200Hz. Higher-resolution displays, such as 4K monitors and ultrawides, need at least 120Hz to qualify. The top tier, FreeSync Premium Pro, carries the same refresh rate requirements as FreeSync Premium but adds mandatory support for AMD FreeSync HDR.
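To make those thresholds concrete, here’s a minimal sketch in Python that encodes the monitor and TV column of the requirements above. It is purely illustrative, not any official AMD certification tool, and the function name, parameters, and return format are assumptions made for this example.

```python
# Illustrative sketch only: this encodes the monitor/TV requirements described
# in the article, not AMD's actual certification process. The function name,
# parameters, and return format are assumptions made for this example.

def qualifying_tiers(horizontal_pixels, max_refresh_hz, supports_freesync_hdr=False):
    """Return which FreeSync tiers a monitor or TV would satisfy."""
    tiers = []

    # Base FreeSync: displays under 3,440 horizontal pixels need at least 144Hz.
    if horizontal_pixels < 3440 and max_refresh_hz >= 144:
        tiers.append("FreeSync")

    # FreeSync Premium: at least 200Hz under 3,440 horizontal pixels,
    # or at least 120Hz at 3,440 pixels and above.
    if horizontal_pixels < 3440:
        meets_premium = max_refresh_hz >= 200
    else:
        meets_premium = max_refresh_hz >= 120

    if meets_premium:
        tiers.append("FreeSync Premium")
        # Premium Pro: same refresh rate bar, plus AMD FreeSync HDR support.
        if supports_freesync_hdr:
            tiers.append("FreeSync Premium Pro")

    return tiers


# A 1440p (2,560-pixel-wide) 165Hz monitor clears base FreeSync but not Premium.
print(qualifying_tiers(2560, 165))        # ['FreeSync']

# A 4K (3,840-pixel-wide) 144Hz monitor with FreeSync HDR hits the top tiers.
print(qualifying_tiers(3840, 144, True))  # ['FreeSync Premium', 'FreeSync Premium Pro']
```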

AMD’s new requirements are a good reflection of where monitors (especially gaming monitors) are at right now. It’s true that not everyone needs a 144Hz (or higher) display, but there’s almost no reason to buy one that has lower refresh rates unless it’s significantly cheaper.

While all those 60Hz and 75Hz monitors still have their place and will continue to be made, new displays in that range will no longer qualify for FreeSync certification, and that’s a good change.

Monica J. White
Monica is a computing writer at Digital Trends, focusing on PC hardware. Since joining the team in 2021, Monica has written…