The HDMI Licensing Administrator, the group that licenses the HDMI standard, has set some confusing requirements around HDMI 2.1. The group retired HDMI 2.0 in 2017, and it says display makers should use the HDMI 2.1 label going forward, even for displays that lack the defining features of the newer standard.
All of this comes from a statement the HDMI Licensing Administrator sent to TFT Central. In short, the HDMI 2.0 standard "doesn't exist" anymore, and display designers should mark any HDMI 2.x display as supporting HDMI 2.1 as long as it supports at least one new feature. The features of HDMI 2.1, according to the statement, are optional, and display manufacturers are supposed to list the features each display supports.
TFT Central spotted a Xiaomi 1080p 240Hz monitor that claims to support HDMI 2.1, even though the port is limited to the specs of HDMI 2.0. We likely haven’t seen the end of these “fake” HDMI 2.1 monitors, and the HDMI Licensing Administrator doesn’t seem to have an issue with that.
HDMI 2.0 and HDMI 2.1 are close in name, but they're essentially different generations. HDMI 2.1 can carry almost three times as much data (48Gbps versus 18Gbps), which enables higher resolutions and refresh rates, and it supports features like variable refresh rate (VRR). HDMI 2.1 is so different that it even requires a dedicated Ultra High Speed HDMI cable to work at its full potential.
The HDMI Licensing Administrator doesn't see that big of a difference. In an email, the group told me that HDMI 2.0 was deprecated in November 2017, and that the features of HDMI 2.0 are now a subset of HDMI 2.1. As long as a monitor supports one HDMI 2.1 feature, it supports HDMI 2.1, full stop. The higher data transmission of HDMI 2.1 counts as one such feature, too, meaning that ports that run at HDMI 2.0 levels of bandwidth can still use the new name as long as they support at least one other feature.
That’s a big problem. HDMI 2.1 has almost too many new features to count, including eARC, VRR, Display Stream Compression (DSC) for higher resolutions, Auto Low Latency Mode (ALLM), and support for dynamic HDR, to name just a few. Add support for one, and the display gets to claim HDMI 2.1.
Anyone who has upgraded to a PlayStation 5 or Xbox Series X gaming console sees the issue here. HDMI 2.1 supports some critical gaming features like ALLM and 4K at 120Hz. The standard — HDMI 2.1 — is supposed to encompass those features so customers don't need to dig through a list of technical jargon to know they're buying the right product. It doesn't.
Imagine if LG’s C1 OLED TV — a champion for next-gen console gaming thanks to the inclusion of HDMI 2.1 — only supported eARC and not the other features of HDMI 2.1. We’d have a lot of expensive TVs in the homes of duped gamers.
Thankfully, this isn't a widespread issue yet. Even though the HDMI Licensing Administrator discontinued HDMI 2.0 in 2017, countless recent monitors still advertise that standard. The LG 27GN950, for example, is limited to HDMI 2.0, and the vast majority of displays on Newegg use the older standard.
They could, under the HDMI Licensing Administrator’s rules, advertise support for HDMI 2.1. And we’ve seen what monitor manufacturers will do with a hot new buzzword in the past.
Roll back a few years to a time when high dynamic range (HDR) monitors were just making their way to market. At the time, you would find HDR branding on nearly every new monitor, even if those monitors didn't support any HDR standard. Even reputable manufacturers like Dell and Samsung were labeling monitors with mere dynamic contrast settings as HDR-ready.
That was until VESA, the organization behind standardized display mounting holes and the DisplayPort connection, created the DisplayHDR standard. Instead of the Wild West of HDR monitors, the standard forced designers to conform to a list of specs so that buyers didn’t have to worry about the technical details.
HDMI 2.1 is supposed to do the same thing. If you buy a new console and want a display to see its full fidelity, you shouldn’t need to know the difference between Fixed Rate Link (FRL) and Transition Minimized Differential Signaling (TMDS). The HDMI Licensing Administrator seems to expect customers to know that difference, though.
“A manufacturer needs to only support one 2.1 feature (such as eARC, for example) to call a device HDMI 2.1-compliant. However, they need to also state which features the device supports — so as to be clear what features the device supports. This is important so consumers won’t think a 2.1-enabled device supports all features,” the group wrote in an email.
We don't have a major problem with HDMI 2.0 and HDMI 2.1 now, as most reputable brands have done the heavy lifting of clarifying what their displays support. That doesn't mean this won't be a problem in the future, though, especially as features like VRR and high refresh rates become more in demand.
The job of any computing standard is to make complex, unapproachable technology simple. You don't need to worry about how many pins a USB 3.0 port has versus a USB 2.0 one — just look for the blue port and know that USB 3.0 is much faster. Standards create space between technology and products and help demystify marketing.
HDMI is a port, but it’s also a standard. Each new number should mean something, but under the current guidance, they don’t need to. I’m in favor of well-informed customers, but the opportunity for manufacturers to be deceptive under HDMI’s current guidance is too high.
It calls into question the point of having a standard at all. If HDMI 2.1 isn't different enough from HDMI 2.0 to warrant stricter requirements, then no standard is.