If you’ve been shopping for a new TV at any point in the last decade, you’ve probably run across the industry’s favorite tech terms, like brightness, contrast ratio, black levels, and even high dynamic range (or HDR, as it’s most commonly known). But there’s one term we’re betting has flown under your radar so far: contrast modulation.
Scratching your head? Don’t worry; almost no TV makers list contrast modulation on their spec sheets, and very few media outlets (including Digital Trends) have ever discussed it.
But thanks to some new industry standards being adopted to help buyers find the best 8K TVs, contrast modulation is suddenly in vogue. Here’s everything you need to know about contrast modulation and how it affects your TV’s picture quality.
Let’s make a resolution
High definition, 720p, 1080p, 4K, and 8K — these are all references to a TV’s resolution. TV resolution has always been measured one way: by counting the number of individual pixels that make up your TV’s screen. The more pixels a TV has, the higher the resolution. That makes sense, and it’s simple to understand. It’s the whole reason why we’ve been introduced to a steady flow of ever-improving TVs, starting with the first HDTVs, all the way to today’s leading-edge 8K TVs.
If the sheer number of pixels a TV can display were the only criteria that mattered to our viewing experience, we could stop right here. You could go out and buy any 8K TV and know that you’re getting the highest resolution possible. But it turns out that resolution isn’t just about how many pixels a TV has. That’s where contrast modulation comes into play.
Not all pixels are alike
Saying that two TVs that both have the same resolution will have an equally good picture isn’t necessarily true. Just like two cars with the same size engine and the same number of cylinders can produce wildly different amounts of horsepower, not all 4K TVs (or 8K TVs) will produce the same image quality.
A few years ago, the International Committee for Display Metrology (ICDM) — the group that establishes the measurements by which displays such as TVs are objectively judged — started to place an emphasis on a different way of evaluating a TV’s resolution. Instead of simply counting the number of pixels, the committee began to look at a display’s contrast modulation, or CM.
CM is a measurement of how precisely one individual pixel can be visually distinguished from its eight neighboring pixels. It’s expressed as a percentage, where 0% means that it’s impossible to know where one pixel starts and another one ends, and 100% means that each pixel perfectly preserves its own color and brightness and never affects the adjacent pixels.
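To make that percentage scale concrete, here’s a minimal sketch of how a CM figure can be computed from luminance measurements. It assumes the standard Michelson-style contrast formula, CM = (Lmax − Lmin) / (Lmax + Lmin), applied to an alternating black/white pixel pattern; the luminance values below are hypothetical examples, not measurements from any real TV.

```python
# Sketch: turning measured luminance values into a contrast modulation (CM)
# percentage. Assumes the Michelson-style formula on a one-pixel-wide
# black/white grille pattern; the numbers used below are made up.

def contrast_modulation(l_max: float, l_min: float) -> float:
    """Return CM as a percentage (0 = pixels indistinguishable, 100 = perfect)."""
    if l_max + l_min == 0:
        raise ValueError("luminance values cannot both be zero")
    return 100.0 * (l_max - l_min) / (l_max + l_min)

# Ideal display: white pixels at full brightness, black pixels emit nothing.
print(round(contrast_modulation(300.0, 0.0), 1))  # 100.0

# More realistic display: light bleeds between adjacent pixels, so "white"
# lines dim slightly and "black" lines brighten, lowering the measured CM.
print(round(contrast_modulation(280.0, 9.0), 1))  # 93.8
```

Intuitively, the more light leaks from a bright pixel into its dark neighbors, the closer Lmax and Lmin get to each other, and the lower the CM percentage falls.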
In the two images above, the display on the left exhibits a high CM value, while the display on the right has a much lower CM value. A higher CM value is better.
Why does contrast modulation matter?
A low CM value means that some on-screen images will lack sharpness or clarity. It’s most noticeable when looking at high-frequency edges. These are areas of the screen where high-contrast patterns are displayed, like black text on a white background, or a close-up shot of facial hair.
There will be times when differences in CM values are negligible. Fast on-screen movements in movies where motion blur isn’t smoothed out by the dreaded soap-opera effect are a good example of when it will be challenging for even the most perceptive viewers to see the benefits of a high CM value.
Static graphics, on-screen user interfaces like TV guides, and some games will likely benefit the most, as will still photos if you use your TV as a giant digital frame.
How high does contrast modulation need to be?
This is a tough one. In theory, any display with a CM value of 100 provides the sharpest, most detailed image possible. Of course, no TVs that we know of have a CM that high, but some come close. LG’s 8K TVs possess a CM of 94 or higher, which is one of the reasons why LG labels its 8K TVs “Real 8K.”
Does that mean you need a CM in the 90-or-higher range? Not necessarily.
The Consumer Technology Association (CTA) recently finalized its certification standards for 8K TVs, and for the first time it has included contrast modulation as part of its specifications. Curiously, the 8K Ultra HD certification only requires an 8K TV to exhibit a minimum CM of 50. This suggests that while a higher CM is better than a lower CM, you don’t need to look for a sky-high CM display in order to be assured you’re getting good image quality.
The big picture
If by now you’re thinking that you need to pay super-close attention to contrast modulation values when shopping for a new TV, don’t worry. For one thing, very few TV makers even discuss their CM values, so it can be hard to compare apples to apples. But more importantly, CM is just one element of overall picture quality, and we’d argue it isn’t as critical to the day-to-day enjoyment of your TV as other factors.
Brightness, contrast ratio, black levels, image processing and upscaling, and viewing angles are all far more noticeable than CM for most people. That said, if you ever find yourself comparing two TVs that are identical on every attribute except CM, go for the one with the higher CM value. We doubt you’ll ever find two such closely matched TVs, but if you do, please let us know!