
The sad, misleading, and embarrassing state of HDR in PC gaming

HDR is laughably bad for PC gaming in most cases, and we all know it’s true.

That might surprise you, though, if you were only considering how gaming monitors are advertised. After all, on paper, HDR is technically supported by your monitor, games, and graphics card. Heck, even Windows supports HDR relatively free of bugs these days.

So, who’s to blame then? Well, when I dug down deep for the answer, I found three major culprits that explain our current predicament. And even with some light at the end of the tunnel, this multifaceted problem isn’t about to just go away on its own.


The game problem

I need to start by laying the groundwork for why HDR is a problem specifically with PC games. It’s a highly variable experience depending on what display you have and what game you’re playing, which makes this whole HDR mess all the more confusing on PC. The big reason why is static metadata.

HDR comparison in Tiny Tina's Wonderlands.
Jacob Roach / Digital Trends

There are three main HDR standards: HDR10, HDR10+, and Dolby Vision. The latter two support dynamic metadata, which basically means they can feed the display information dynamically based on what the monitor is capable of and the scene you’re in (even what frame is currently on screen). HDR10, on the other hand, only has static metadata.
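To make the distinction concrete, here is a minimal sketch of the two metadata models. MaxCLL and MaxFALL are real HDR10 static metadata fields; the class and field names themselves are illustrative, not any real API.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class StaticMetadata:
    # HDR10: one set of values describes the entire piece of content.
    max_cll: int   # Maximum Content Light Level, in nits
    max_fall: int  # Maximum Frame-Average Light Level, in nits

@dataclass
class DynamicMetadata:
    # HDR10+/Dolby Vision: brightness info can change per scene (or even
    # per frame), so the display can tone-map each moment to its own limits.
    per_scene_max_nits: List[int]

# With static metadata, a dark cave and a sun-drenched field share the same
# tone-mapping hints; with dynamic metadata, each scene gets its own.
game_static = StaticMetadata(max_cll=4000, max_fall=400)
game_dynamic = DynamicMetadata(per_scene_max_nits=[120, 4000, 850])
```

The practical upshot: a display receiving static metadata has to pick one tone-mapping compromise for the whole game, which is why the same title can look great on one monitor and washed out on another.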

Dynamic metadata is a big reason why console HDR is so much better than HDR on PC.

Only a select few monitors support Dolby Vision, like Apple’s Pro Display XDR, and none of them are gaming monitors. There are a few HDR10+ monitors, but they’re limited to Samsung’s most expensive displays. The vast majority of monitors are dealing with static metadata. TVs and consoles widely support Dolby Vision, however, which is a big reason why console HDR is so much better than HDR on PC.

As former game developer and product manager for Dolby Vision Gaming Alexander Mejia points out, static metadata creates a big problem for game developers: “There are more and more HDR TVs, monitors, and laptops on the market than ever, but if you grab a couple from your local big-box retailer, your game is going to look drastically different on each one … How do you know that the look you set in your studio will be the same one the player sees?”

HDR comparison in Devil May Cry 5.
Jacob Roach / Digital Trends

On my Samsung Odyssey G7, for instance, Tiny Tina’s Wonderlands looks dark and unnatural with HDR turned on, but Devil May Cry 5 looks naturally vibrant. Look up user experiences with these two games, and you’ll find reports ranging from the best HDR ever to downright terrible image quality.

It doesn’t help the matter that HDR is usually an afterthought for game developers. Mejia writes that developers “still need to deliver a standard dynamic range version of your game — and creating a separate version for HDR means twice as much mastering, testing, and QA. Good luck getting sign-off on that.”

There are numerous examples of developer apathy toward HDR. The recently released Elden Ring, for example, shows terrible flickering in complex scenes with HDR and motion blur turned on. Turn HDR off, and the problem goes away (even with motion blur still turned on). And in Destiny 2, the HDR calibration was broken for four years. HDTVTest found that the slider didn’t map brightness correctly in 2018, and the issue was only fixed in February 2022 with the release of The Witch Queen expansion.

Games are one source of HDR issues on PC, but they’re a downstream problem: one that stems from a gaming monitor market that seems frozen in time.

The monitor problem

The Alienware QD-OLED monitor in front of a window.
Digital Trends

Even with the numerous Windows bugs that HDR has caused over the past few years, monitors are the main source of HDR problems. Anyone in tune with display tech can list the issues without a second thought, and that’s the point: After years of HDR monitors flooding the market, displays are mostly in the same place they were when HDR first landed on Windows.

The traditional knowledge has been that good HDR requires at least 1,000 nits of peak brightness, which is only partially true. Brighter displays help, but only because they’re able to drive higher levels of contrast. For example, the Samsung Odyssey Neo G9 is capable of twice the peak brightness of the Alienware 34 QD-OLED, but the Alienware display offers much better HDR due to its exponentially higher contrast ratio.

There are three things a display needs to achieve good HDR performance:

  1. High contrast ratio (10,000:1 or higher)
  2. Dynamic HDR metadata
  3. Expanded color range (above 100% sRGB)
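The three requirements above can be sketched as a simple checklist. This is a toy sketch using the article’s thresholds; the function and parameter names are hypothetical, not any real spec-sheet format.

```python
def meets_hdr_bar(contrast_ratio: int, has_dynamic_metadata: bool,
                  srgb_coverage: float) -> bool:
    """Check a monitor spec against the three criteria for good HDR."""
    return (contrast_ratio >= 10_000       # 10,000:1 or higher
            and has_dynamic_metadata       # HDR10+ or Dolby Vision
            and srgb_coverage > 1.0)       # beyond 100% sRGB

# A typical 3,000:1 LED panel with static metadata falls short:
print(meets_hdr_bar(3_000, False, 1.0))       # False
# An OLED with dynamic metadata and a wide gamut passes:
print(meets_hdr_bar(1_000_000, True, 1.25))   # True
```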

TVs like the LG C2 OLED are so desirable for console gaming because OLED panels provide massive contrast (1,000,000:1 or higher). Most LED monitors top out at 3,000:1, which is not enough for solid HDR. Instead, monitors use local dimming — independently controlling the light on certain sections of the screen — to increase contrast.
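Here is a toy calculation (all numbers illustrative) of why local dimming raises effective contrast: the panel’s native ratio caps contrast within a single zone, but dimming the backlight in a dark zone lowers its black level relative to a fully lit zone elsewhere on screen.

```python
def effective_contrast(peak_nits: float, native_ratio: float,
                       dimmed_backlight_fraction: float) -> float:
    """Contrast between a fully lit zone and a backlight-dimmed dark zone."""
    native_black = peak_nits / native_ratio            # black level, lit zone
    dimmed_black = native_black * dimmed_backlight_fraction  # dimmed zone
    return peak_nits / dimmed_black

# A native 3,000:1 panel at 600 nits: dimming a dark zone's backlight
# to 10% yields 30,000:1 between bright and dimmed zones.
print(round(effective_contrast(600, 3_000, 0.10)))  # 30000
```

This is also why the number of zones matters so much: with only a handful of zones, bright and dark objects constantly share a zone, and the backlight can’t dim without crushing highlights or blooming around them.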

A colorful image on the LG C2 OLED's screen.
Dan Baker / Digital Trends

Even premium (above $800) gaming monitors don’t come with enough zones, though. The LG 27GP950-B only has 16, while the Samsung Odyssey G7 has an embarrassing eight. For a truly high contrast ratio, you need a lot more zones, like the Asus ROG Swift PG32UQX with over 1,000 local dimming zones — a monitor that costs more than building a new computer.

The vast majority of HDR monitors don’t even scratch the bare minimum. On Newegg, for example, 502 of the 671 HDR gaming monitors currently available only meet VESA’s DisplayHDR 400 certification, which doesn’t require local dimming, expanded color range, or dynamic metadata.

An example of local dimming on a Vizio TV.
Digital Trends

Spending up for a premium experience isn’t new, but this has been the case for four years now. Instead of premium features becoming mainstream, the market has been flooded with monitors that can advertise HDR without offering any of the features that make HDR tick in the first place. And monitors that check those boxes under $1,000 usually cut corners to do so with few local dimming zones and shoddy color coverage.

There are exceptions, such as the Asus ROG Swift PG27UQ, that offer a great HDR gaming experience. But the point remains that the vast majority of monitors available today aren’t too different from the monitors available four years ago, at least in terms of HDR.

Light at the end of the tunnel

The ultrawide, curved QD-OLED monitor.
Image used with permission by copyright holder

The HDR experience on PC has been mostly static for four years, but that’s changing due to some fancy new display tech: QD-OLED. As the Alienware 34 QD-OLED shows, this is the panel technology that will truly drive HDR in PC games. And good news for gamers, you won’t have to spend north of $2,500 to access it.

MSI just announced its first QD-OLED monitor with identical specs to the Alienware one, and I suspect it’ll use the exact same panel. If that’s the case, we should see a wave of 21:9 QD-OLED monitors by the beginning of next year.

We’re seeing more OLED monitors, too, like the 48-inch LG 48GQ900 that was recently announced. They’re TVs marketed as gaming monitors, sure, but display makers are clearly in tune with gamers’ demand for OLED panels. Hopefully, we’ll see some that come in the size of a proper monitor.

There are other display technologies driving better HDR performance, such as mini-LED. But QD-OLED is the seismic shift that will finally, hopefully, make HDR a reality for PC gaming.

Jacob Roach
Lead Reporter, PC Hardware