

What is HDR10+? Everything you need to know about the HDR format

It’s no mystery that TVs have come a long way over the last decade or so. Remember when 1080p was a huge deal? Now that 4K is the standard resolution and 8K models are available to purchase, there are even more things to consider when investing in a new set. And on top of display tech options, including QLED, OLED, and QD-OLED screens, there’s also something called HDR10+ to think about.

HDR10+ has been around for a while now, but that doesn’t make it any less important in the grand scheme of things. HDR10+ is a powerful HDR format that unlocks some truly breathtaking brightness, contrast, and color perks, but not every modern TV has HDR10+ capabilities, even if the model in question supports other leading HDR formats. If this is starting to sound puzzling, don’t fret. We’re here to teach you about the awesome world of HDR10+.

What is HDR?

Hisense 110UX ULED X mini-LED 4K TV.

Before we can dive into HDR10+, we need to make sure we understand HDR. We’ve got a few fantastic deep dives on this technology that you can peruse at your leisure, but for the sake of a quick introduction: high dynamic range, as it pertains to TVs, allows for video and still images with much greater brightness and contrast, and better color accuracy, than was possible in the past. HDR works for movies, TV shows, and video games. Unlike increases in resolution (like 720p to 1080p), which aren’t always immediately noticeable — especially when viewed from a distance — great HDR material is eye-catching from the moment you see it.

HDR requires two things at a minimum: a TV that is HDR-capable, and a source of HDR video, such as a 4K HDR Blu-ray disc and a compatible Blu-ray player, or an HDR movie on Netflix or another streaming service that supports it. Confused consumers often conflate 4K and HDR, but they are very different technologies; not all 4K TVs can handle HDR, and some do it much better than others. That said, most new TVs support both 4K UHD and HDR.

But saying “HDR” is like saying “digital music”: There are several different types of HDR, and each has its own strengths and weaknesses.

What is HDR10?

Every TV that is HDR-capable is compatible with HDR10. It’s the minimum specification. The HDR10 format allows for a maximum brightness of 1,000 nits (the unit used to measure luminance) and a color depth of 10 bits. On their own, those numbers don’t mean much, but in context they do: Compared to regular SDR (standard dynamic range), HDR10 allows for an image that is over twice as bright, with a corresponding increase in contrast (the difference between the blackest blacks and the whitest whites), and a color palette with more than a billion shades, as opposed to the measly 16.7 million of SDR.
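If you’re curious where those color counts come from, the arithmetic is simple: each extra bit doubles the number of shades available per color channel, and the three channels multiply together. Here’s a quick back-of-the-envelope sketch (a rough illustration only; real-world color reproduction also depends on the TV’s gamut and processing):

```python
# Rough illustration: how bit depth translates into total color shades.
# Each channel (red, green, blue) gets 2^bits levels, and the channels
# combine multiplicatively. Gamut and chroma subsampling are ignored here.

def color_count(bits_per_channel: int) -> int:
    levels = 2 ** bits_per_channel  # shades per color channel
    return levels ** 3              # combinations across R, G, and B

print(f"8-bit (SDR):    {color_count(8):,} colors")   # 16,777,216 (~16.7 million)
print(f"10-bit (HDR10): {color_count(10):,} colors")  # 1,073,741,824 (~1.07 billion)
```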

As with all HDR formats, how well HDR10 is implemented depends upon the quality of the TV on which you view it. When utilized properly, HDR10 makes video content look really good, but it is no longer the top of the HDR food chain.

What is HDR10+?

As the name suggests, HDR10+ takes all of the good parts of HDR10 and improves upon them. It quadruples the maximum brightness to 4,000 nits, which in turn increases potential contrast. But the biggest difference is in how HDR10+ handles information. With HDR10, the “metadata” fed by the content source is static, meaning there’s one set of values established for a whole piece of content, like an entire movie. HDR10+ makes this metadata dynamic, allowing it to change from scene to scene or even frame to frame. Every frame can be treated to its own brightness, contrast, and color parameters, making for a much more realistic-looking image, and areas of the screen that might have lost detail under HDR10’s single, movie-wide settings keep their full detail with HDR10+.

But wait, there’s more: Samsung, long a proponent of HDR10+, has kicked things up yet another notch. The company’s HDR10+ Adaptive technology allows your TV to detect the brightness of your viewing space and make micro adjustments to the brightness, contrast, and other settings in response to changes in the room.
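To make the static-versus-dynamic distinction a little more concrete, here’s a deliberately simplified sketch of the idea. This is not the actual HDR10+ algorithm or metadata format; the display peak, scene values, and tone-mapping math are all made up for illustration.

```python
# Simplified illustration of static vs. dynamic HDR metadata.
# Real HDR10/HDR10+ tone mapping is far more sophisticated; the point is
# only that dynamic metadata lets the mapping adapt to each scene or frame.

DISPLAY_PEAK_NITS = 600  # what this hypothetical TV can actually output

def tone_map(pixel_nits: float, content_peak_nits: float) -> float:
    """Scale a pixel's brightness so the content's peak fits on the display."""
    if content_peak_nits <= DISPLAY_PEAK_NITS:
        return pixel_nits  # already within the TV's range, no compression
    return pixel_nits * (DISPLAY_PEAK_NITS / content_peak_nits)

# Static metadata (HDR10): one peak value covers the entire movie.
movie_peak = 4000  # nits, set once for the whole title
print(tone_map(300.0, movie_peak))      # 45.0 — a 300-nit highlight in a dim scene gets crushed

# Dynamic metadata (HDR10+): each scene (or frame) carries its own peak.
dim_scene_peak = 350
print(tone_map(300.0, dim_scene_peak))  # 300.0 — the same highlight keeps its punch
```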

When the HDR10+ picture standard first rolled out, it was difficult to find the format supported by TV brands other than Samsung and Panasonic. One of the biggest reasons for this is that HDR10+ was developed by a consortium made up of 20th Century Fox, Samsung, and Panasonic. Currently, though, HDR10+ is showing up on TVs from other brands, including TCL, Hisense, and Toshiba.

And as for the streaming landscape, as it currently stands, you can find HDR10+ media on Amazon Prime Video, Apple TV+, Hulu, Paramount+, YouTube, and the Google Play Movies & TV apps. A number of streaming devices also support the picture standard, including Samsung’s web-connected lineup of Blu-ray players, the Apple TV 4K (2022), and various Roku devices, including the Roku Express 4K, Roku Express 4K+, and Roku Ultra (2022).

So … what about Dolby Vision?

Dolby Vision on the TCL 5-Series (S546).

HDR10+ isn’t the only HDR format with ambitions of becoming the next king of the HDR castle. Dolby Vision is an advanced HDR format created by Dolby Labs, the same organization behind the famous collection of Dolby audio technologies like Dolby Digital and Dolby Atmos. Dolby Vision is very similar to HDR10+ in that it uses dynamic, not static, metadata, giving each frame its own unique HDR treatment. But Dolby Vision provides for even greater brightness (up to 10,000 nits) and more colors, too (12-bit depth, for a staggering 68 billion colors).

Thanks to continued improvements in HDMI technology, the latest HDMI 2.1 protocol allows for up to 16-bit depth levels within the Rec.2020 color space. While it’s going to be a while before consumer displays can decode these 16-bit signals, HDMI 2.1 does support the 12-bit data you’ll get from Dolby Vision signals. Of course, this means you’ll need to have a TV that’s capable of decoding those 12-bit Dolby Vision signals, along with a few other AV essentials (more on that below).

Unlike HDR10+, though, which only had its official launch in 2018, Dolby Vision has been around for several years longer and enjoys wide industry support, which could help make it the HDR standard going forward.

Oh no, not another format war!

The Samsung QN900C QLED 8K Smart Tizen TV on a stand in a living room.

Does the presence of competing HDR formats like HDR10+ and Dolby Vision mean we’re in for another format war? Not exactly. Unlike previous tech tiffs like Blu-ray versus HD DVD, HDR formats are not mutually exclusive. This means there’s nothing stopping a movie studio from releasing a Blu-ray that contains HDR10, HDR10+, and Dolby Vision metadata on a single disc.

A TV that supports HDR can support multiple HDR formats, and many of today’s TVs do just that. The most common combo is HDR10 and Dolby Vision support on a single TV; however, we’re also just beginning to see the arrival of TVs that add HDR10+ and even HLG (the version of HDR favored by digital TV broadcasters) to that mix. It’s also possible that some TVs that shipped from the factory with support for just two formats — say HDR10 and Dolby Vision — could be updated via a firmware upgrade to handle HDR10+.

Blu-ray players and media streamers can also support multiple HDR formats. The challenge is that, despite the ability to support multiple HDR formats, not every TV, playback device, streaming video service, or Blu-ray actually does. This means that, as consumers, we need to pay close attention to the labels to understand the capabilities of the devices and content we own — and the ones we plan on buying.

Many Blu-ray players, for instance, only offer support for HDR10, while some models, like Sony’s UBP-X700, add Dolby Vision support. The same considerations apply to set-top streaming boxes. Right now, there are many different peripherals that support all three major HDR formats (HDR10, HDR10+, and Dolby Vision), including the Apple TV 4K (2022), the Amazon Fire TV Stick 4K and 4K Max, the Roku Streaming Stick 4K and Roku Ultra (2022), and the Chromecast with Google TV (4K).

What equipment do I need to get HDR10+?

To summarize, HDR10+ is an HDR format that offers higher levels of brightness and contrast, plus more true-to-life colors and detail. To get it, you’ll need:

  • A source of HDR10+ video, such as a Blu-ray movie, Hulu, Amazon Prime Video, etc.
  • A device that is capable of reading HDR10+ encoded material, like a compatible Blu-ray player or media streamer
  • A TV that is HDR10+ compatible (these may also have built-in apps that let you sidestep the need for a playback device)

One more thing: If you’re using a media streamer or a Blu-ray player for your HDR10+ content, especially if its signal passes through an AV receiver or soundbar before reaching your TV, the HDMI cable you’re using should ideally be rated for HDMI 2.1 (an Ultra High Speed HDMI cable). Higher bit-depth HDR signals at 4K and high frame rates demand a lot of bandwidth, and older HDMI 2.0-era cables may not carry them reliably.
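For a rough sense of why cable bandwidth matters, here’s some back-of-the-envelope math. This is a sketch only: it ignores blanking intervals, audio, and link-encoding overhead, so real-world requirements run somewhat higher than these figures.

```python
# Back-of-the-envelope estimate of uncompressed video bandwidth.
# Ignores blanking intervals, audio, and encoding overhead, so actual
# HDMI requirements are somewhat higher than these figures.

def video_gbps(width: int, height: int, fps: int, bits_per_channel: int) -> float:
    bits_per_pixel = bits_per_channel * 3  # full R, G, B (no chroma subsampling)
    return width * height * fps * bits_per_pixel / 1e9

print(f"4K/60 at 10-bit: {video_gbps(3840, 2160, 60, 10):.1f} Gbps")  # ~14.9
print(f"4K/60 at 12-bit: {video_gbps(3840, 2160, 60, 12):.1f} Gbps")  # ~17.9

# HDMI 2.0 tops out at 18 Gbps of raw link bandwidth (less after overhead),
# while HDMI 2.1 raises that ceiling to 48 Gbps.
```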

So that’s that! Whether you’re looking to upgrade your home theater system or you just want to understand this cool tech, that’s really all you need to know. Stay tuned for updates!
