Full HD: Hip or Hype?

Full HD is the latest buzzphrase in television marketing. If we want the latest and greatest in HDTV, we want Full HD—or so we’re told. But is it possible to have too much of a good thing?

Full HD is marketing-ese for 1080p. Since alphanumeric monikers tend to leave us cold, the name is an adroit way of turning something seemingly dry and technical into something that sounds more desirable, something you’ve just got to have—unless you want your neighbor smirking at you because his home theater system is Full HD and yours is only, well, partial HD.

I first wrote about 1080p for Audio Video Interiors in 1998, when HDTV was still gleaming on the horizon. But I didn’t hear the phrase Full HD until relatively recently. Now it’s rampant in TV ads, reviews, and all the assorted information and misinformation that surrounds digital television. When I realized what it meant, I began wondering: why this, why now?

The Case for Full HD

What is 1080p, a.k.a. Full HD? Since I’ve written about this subject here before, I’ll keep the definition brief. It’s high-definition television with 1080 by 1920 pixels, delivered in full frames. The p is what distinguishes 1080p from 1080i, which uses an interlacing process to deliver gapped pairs of half-frames.

But 1080p and 1080i aren’t the only forms taken by HDTV. There is also a 720p format that delivers 720 by 1280 pixels. If 1080p is Full HD, then this other format must be less than Full HD, right? After all, it has fewer pixels.

The case for Full HD seems even clearer if you count the total number of pixels onscreen. Multiply 720 by 1280 and you get 921,600 pixels. Multiply 1080 by 1920 and you get 2,073,600 pixels. Now it makes perfect sense, right? Full HD has more than two million pixels and that inferior HD has fewer than a million. Case closed. Send the jury home. Let’s go to the bar across the street from the courthouse and get wasted. We can drink and watch basketball games in Full HD.
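If you’d rather let a machine do the multiplication, here is a trivial sketch in Python (my own illustration; the names and numbers are simply the ones discussed above):

```python
# Checking the pixel math behind the "Full HD" claim.
formats = {
    "720p": (1280, 720),    # width x height
    "1080p": (1920, 1080),
}

for name, (width, height) in formats.items():
    print(f"{name}: {width} x {height} = {width * height:,} pixels")

# 720p: 1280 x 720 = 921,600 pixels
# 1080p: 1920 x 1080 = 2,073,600 pixels
```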

1080p Is Off the Air

Not so fast. Maybe this isn’t as cut and dried as it seems at first. True, 1080p has more than twice the pixel count of 720p. But if you have all the facts, some of them will continue niggling at the back of your mind.

For one thing, the people who devised the HDTV standard didn’t even bother to provide for 1080p (at least, not in practice). The broadcast standard they had in mind included 1080i, 720p, and standard-def formats like 480i. Since 1080p isn’t part of the broadcast standard, at least not yet, there are no 1080p broadcasts. CBS and NBC, for instance, use 1080i, while ABC and Fox prefer 720p.

You can get a true 1080p signal from a Blu-ray or HD DVD disc, and potentially from some PC and game sources. The format has also gained traction as a production standard. But due to the initial setting of broadcast standards, there are lots of working HDTVs that don’t support 1080p. Most of them are 1080i. In those that do offer 1080p, it’s often just an upconversion standard—these Full HD sets accept signals in other formats and display them in 1080p. In the case of 1080i to 1080p, this is a straightforward line doubling (there’s a quick sketch of it below).

As far as HDTV’s founding fathers were concerned, 1080i and 720p were both designated as HDTV, to distinguish them from SDTV formats like 480p and 480i. This whole notion that 720p is less than Full HD is relatively new and largely marketing-driven.

It’s all so confusing, isn’t it? But there’s one more point that makes everything perfectly clear. All these DTV formats have to go through a bottleneck that puts their relative merits on a different basis than the one implied by pixel counts. Actually, two bottlenecks. They’re your eyes.
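An aside before we get to those bottlenecks: for film-sourced material, where both fields of a 1080i pair come from the same instant in time, that 1080i-to-1080p conversion amounts to weaving the two 540-line fields back into one 1080-line frame. Here is a toy Python sketch of the idea (my own illustration; real sets add motion-adaptive logic for video-sourced material):

```python
def weave(top_field, bottom_field):
    """Interleave two fields (lists of scanlines) into one progressive frame.

    A 1080i frame pair carries 540 even scanlines in one field and 540 odd
    scanlines in the other; weaving restores the full 1080-line frame.
    """
    frame = []
    for even_line, odd_line in zip(top_field, bottom_field):
        frame.append(even_line)  # scanlines 0, 2, 4, ...
        frame.append(odd_line)   # scanlines 1, 3, 5, ...
    return frame

# Toy example: a 4-line image split into two 2-line fields.
print(weave(["line0", "line2"], ["line1", "line3"]))
# ['line0', 'line1', 'line2', 'line3']
```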

Believe Your Eyes

Eyesight is ultimately what mocks the Full HD hype. Even if you have 20/20 vision, at typical viewing distances your eyes can’t distinguish the pixel size of 1080p vs. 720p on screens of 42 inches or less. Even at 50 inches, the difference is debatable. Up at 70 inches, you may see differences—but even then, things like video artifacts, video noise, and the limits of source material (whether HD or not) take their toll.

Furthermore, with movie content, there is effectively no difference between 1080p and 1080i. True, 35mm to 70mm film is an extraordinarily high-resolution medium, so movies shot on film can be sharp enough to take advantage of 1080 by 1920 resolution (or better, in the distant future).

But movies are shot at 24 frames per second, and progressive video, including 1080p, is always displayed at 60fps or a multiple of that (or, for 1080i, 60 fields making 30 frames). So a process called 3:2 pulldown comes into play to translate 24fps to 60fps (there’s a sketch of the cadence at the end of this piece). Whether 3:2 pulldown happens in the HDTV or in the Blu-ray/HD DVD player is irrelevant. It just happens, because without it, you wouldn’t get a watchable picture. So the p in 1080p doesn’t add anything to a movie that you wouldn’t get from a 1080i set.

My smarter colleague at Home Theater, video editor Geoff Morrison, explains it all here, here, and especially here.

I’m not saying Full HD/1080p is a bad thing. If you’re buying any kind of wall-hogging front-projection system, and plan to spend a lot of time watching Blu-ray or HD DVD, 1080p is a must. I’d also want it if I were buying a flat-panel or rear-projection set of 50 inches and up. Since a little headroom is never a bad thing, make that 42. Even so, I have no intention of dumping the smaller of my two HDTV sets, a 32-inch 768p LCD, to get 1080p.

Why the Full HD hype? The real story—the one you won’t read in a lot of reviews, and certainly not in any ad—is that profit margins are plummeting in the TV manufacturing industry. While this is nothing short of fantastic for consumers, it’s also nearly catastrophic for TV makers. They need to sell us bigger TVs, because they make more money on bigger sizes. And they desperately need a “step up” feature to persuade us that a slightly more expensive medium-sized LCD or plasma is better than a slightly less expensive one. Full HD spells “performance,” while 720p and 768p are “value.”

That’s why you’re hearing so much about Full HD. Again, it’s not a bad thing. Just know what you’re buying.

Mark Fleischmann is the author of the annually updated book Practical Home Theater.
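A footnote for the curious: here is what that 3:2 pulldown cadence looks like in code. This is a toy Python sketch of the repeat pattern only (my own illustration, not how any particular player or TV implements it):

```python
from itertools import cycle

def pulldown_32(film_frames):
    """Yield 60 Hz display frames from 24 fps film via the 2-3 cadence."""
    for frame, repeats in zip(film_frames, cycle([2, 3])):
        for _ in range(repeats):
            yield frame  # frame A is shown twice, frame B three times, ...

one_second = [f"F{i}" for i in range(24)]  # 24 film frames
output = list(pulldown_32(one_second))
print(len(output))  # 60 -- one second of display frames at 60fps
print(output[:8])   # ['F0', 'F0', 'F1', 'F1', 'F1', 'F2', 'F2', 'F3']
```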
