Kids born today won’t know what a pixel is, and that’s a dream come true

The pixel is a cultural icon. Wander down any street and you’re likely to see it not only in old LCD screens lining storefronts but also in logos, advertisements, and even fashion. “Pixel art,” the intentional use of lo-fi, pixelated graphics, is virtually the default among indie games, and even Digital Trends’ own 404 page features 8-bit characters on a pixelated sky-scape.

Users have become accustomed to the pixel, and most have forgotten it’s an artifact of limited graphics technology. The pixel was never desired; it exists only because of the particular limitations of computer displays and graphics renderers. Over the past three decades, countless minds have tried to tame unsightly pixelation, also known as aliasing, in numerous ways.

The fight has been long, but the forces of progress are winning. Pixels are dying. It’s entirely possible that a baby born today will never see one, and a child born a decade from now will probably never know of their existence unless he or she decides to learn computer science. Let’s take a moment to reflect on the pixel before it’s laid to rest in the graveyard of obsolescence.

The jagged edge

Pixels have existed from the beginning of computer graphics, and for many early computer scientists, they represented a serious problem. While pixels didn’t necessarily hobble interface design for early mainframes and the first home PCs, they were a major obstacle for anyone seeking to push the limits of realistic computer graphics.

Lucasfilm was an early pioneer in the field. Its computer division, which was eventually sold to Steve Jobs and renamed Pixar, searched desperately for ways to render graphics detailed enough to be used alongside miniatures in Star Wars.

What’s in one pixel could be a city.

Robert Cook, Pixar’s former Vice President of Software Development, was there from the beginning, and remembers the challenge well. “The basic problem,” he explained, “is you’re trying to make an image with a lot of detail, and you only have so many pixels.”

This inevitably forces computers to make a difficult decision. Multiple objects might inhabit the space of a single pixel, yet only one can be shown – so which one should it be? “What’s in that one pixel could be a city,” said Cook, “but the computer has to pick one color.” Early computers, with limited pixels and no way to combat aliasing, were forced to dramatically simplify images. The result was coarse, jagged graphics that looked nothing like reality.

Foiled by the pixelation of computer-generated graphics, Star Wars’ producers turned to real-life miniatures instead, like this recreation of the Death Star. Starwars.com

Those “jaggies” were particularly nasty in objects oriented at a diagonal to the pixel grid, and they precluded the use of computer graphics for most special effects until the problem was solved.

That proved a long, difficult road. Computer graphics never contributed significantly to the original Star Wars trilogy, which relied on a complicated dance of miniatures through Return of the Jedi‘s epic final battle. Lucasfilm, refocusing on its core entertainment business and unhappy with the results of the Computer Division, sold it in 1986 to Steve Jobs, who renamed the company Pixar after its star product, a $135,000 anvil of processing power called the Pixar Image Computer.

A new hope

While the Pixar Image Computer was technically stunning, it wasn’t a commercial success, and it didn’t represent the company’s passion. Many of its employees wanted to use computer graphics to create entertainment, even art. This included former Disney animator John Lasseter, who was hired by the Lucasfilm Computer Division to breathe life into its graphics.

PC users expect razor-sharp image quality and despise softness, even if aliasing is the result.

Everyone knew, though, that even an animator of Lasseter’s skill couldn’t produce a compelling scene from computer graphics if jaggies remained an issue. Pixels don’t appear natural, they obscure the detail of a scene, and in motion objects snap abruptly from one pixel to the next, eliminating the motion blur that makes film seem realistic.

The geeks at Lucasfilm tried to tackle the problem in a number of ways. Eventually a hardware engineer at the company, Rodney Stock, came up with an idea, which Rob Cook refined into a fix for aliasing: randomness.

“The jaggies come from the samples all being lined up on a grid,” Cook explained. “If you add some randomness, you break up the patterns.” Adding randomness to aliased portions of an image introduces uneven noise that, unlike patterns of perfectly stepped pixels, doesn’t seem unusual to the human eye.

Rob Cook (2010, Photo by Deborah Coleman/Pixar)

Randomness did more than just serve as effective anti-aliasing. It also helped blend computer effects with film and created a blur effect when applied to multiple frames of motion, addressing numerous problems with one tidy solution. Alternative anti-aliasing techniques existed for 3D film, but they proved more computationally intense without producing superior results, leaving random sampling to reign as king.
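To make the idea concrete, here’s a toy Python sketch, not Pixar’s production code, just an illustration under simple assumptions (a hard-edged diagonal “scene” and a handful of samples per pixel). One centered sample per pixel produces stair-stepped jaggies; several randomly jittered samples averaged together dissolve those steps into mild, inoffensive noise.

```python
import random

def scene(x, y):
    """A simple scene: a diagonal edge. Returns 1.0 (white) above the
    line y = 0.35 * x, and 0.0 (black) below it."""
    return 1.0 if y > 0.35 * x else 0.0

def render(width, height, samples_per_pixel, jitter):
    """Render the scene to a grid of pixel brightness values.

    With one centered sample per pixel the edge comes out as hard
    stair-steps (jaggies). With several randomly jittered samples the
    averaged coverage produces in-between grays, so the steps turn
    into slight noise instead of a visible pattern."""
    image = []
    for py in range(height):
        row = []
        for px in range(width):
            total = 0.0
            for _ in range(samples_per_pixel):
                # Sample position inside the pixel: its center, or a
                # random (jittered) point when jitter is enabled.
                dx = random.random() if jitter else 0.5
                dy = random.random() if jitter else 0.5
                total += scene(px + dx, py + dy)
            row.append(total / samples_per_pixel)
        image.append(row)
    return image

if __name__ == "__main__":
    aliased = render(16, 8, samples_per_pixel=1, jitter=False)
    smoothed = render(16, 8, samples_per_pixel=16, jitter=True)
    shades = " .:-=+*#%@"
    for label, img in (("point sampling", aliased), ("jittered sampling", smoothed)):
        print(label)
        for row in img:
            print("".join(shades[min(int(v * (len(shades) - 1)), len(shades) - 1)]
                          for v in row))
```

Running it prints the same diagonal edge twice: once as a hard staircase, once softened by the averaged random samples.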

Bringing it home

Solving the problem of pixels on the average home PC is not the same as solving it in film, however. Computer-generated movies are expected to replicate the nature of film, including its imperfections. A little noise or blur is not just acceptable, but desirable.

The Windows desktop, and computer interfaces in general, are a different animal. Users expect razor-sharp image quality, despise softness and frown upon noise. The ideal font is pixel-perfect, high-contrast, fine yet readable, standing out boldly from its surroundings. Even computer gamers expect a very specific experience and often look down on motion blur as an artifact or distracting visual add-on rather than a desirable effect. Games that pay homage to film, such as the original Mass Effect (which implemented a “film grain” filter), catch flak from those who prefer a sharper experience, even at the cost of aliasing. Pixels are preferable to noise.

Mass Effect

Given the choice, though, users prefer to have the best of both worlds: razor-sharp image quality and smooth edges. A number of techniques have been developed to deliver this, with varying success. Windows uses ClearType, a sub-pixel anti-aliasing technology designed specifically for fonts. Apple uses font smoothing along with tight guidelines for the standards of art assets used by developers, particularly with its Retina displays. And games use numerous tactics, from multi-sample anti-aliasing, which only smooths the edges of polygons, to temporal anti-aliasing, which smooths all aliased edges while drawing on data from multiple frames.
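As a rough illustration of the temporal idea, here’s a minimal Python sketch. It is not any engine’s actual implementation; real temporal anti-aliasing also reprojects the history buffer using motion vectors, which this toy version skips. It simply jitters the sample position differently each frame and blends frames into a running history, which averages out aliased edges on a static image.

```python
import random

def sample_frame(width, height, frame_index, scene):
    """Render one frame with a per-frame sub-pixel jitter offset.
    Shifting the sample point a little each frame means successive
    frames see slightly different spots inside each pixel."""
    random.seed(frame_index)  # deterministic jitter per frame, for the demo
    jitter_x, jitter_y = random.random(), random.random()
    return [[scene(x + jitter_x, y + jitter_y) for x in range(width)]
            for y in range(height)]

def temporal_accumulate(history, current, blend=0.1):
    """Blend the new frame into the running history buffer.
    Keeping most of the history (1 - blend) and adding a little of the
    current frame averages the jittered samples over time, which
    smooths aliased edges for the static parts of the image."""
    if history is None:
        return current
    return [[(1 - blend) * h + blend * c for h, c in zip(hrow, crow)]
            for hrow, crow in zip(history, current)]

if __name__ == "__main__":
    edge = lambda x, y: 1.0 if y > 0.35 * x else 0.0  # same diagonal edge as before
    history = None
    for frame in range(64):
        history = temporal_accumulate(history, sample_frame(16, 8, frame, edge))
    for row in history:
        print("".join("#" if v > 0.66 else ("+" if v > 0.33 else ".") for v in row))
```

The accumulated output shows intermediate values along the edge rather than a hard staircase; the cost, as with real temporal techniques, is that fast-moving content needs extra handling to avoid ghosting.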

These efforts have gradually eroded the pixel’s prominence, making unsightly, jagged edges less common, but they’re not a complete solution. Aliasing is a tough problem to solve, particularly when compute power is limited, as it so often is with home PCs. And there’s always a trade-off between sharpness and smoothness. Turning Apple’s text smoothing up a few notches in the command line can make aliasing very difficult to detect, but it also results in soft, fuzzy fonts that aren’t at all like the crisply printed text in a book.

The visual limit

Anti-aliasing was not the only solution to jaggies considered in the early days of computer graphics. Researchers also looked into rendering images with resolutions as high as 8,000 pixels on a side, which made individual pixels too small for the human eye to detect. Lucasfilm itself commissioned several high-resolution renders of its X-Wing fighter by the graphics group of a company called Information International, Inc. One of these highly impressive renders ended up on the cover of Computer magazine.

Yet this technique was soon abandoned for a number of reasons. It was insanely computationally intense, which meant a single frame effectively cost thousands of dollars, and increasing the resolution did nothing to solve the motion blur issue that plagued computer graphics. Though it effectively lacked visible pixels, the render still didn’t look real, and for Lucasfilm, deep in the production of Star Wars, that was an unforgivable sin.

Upscaling low-resolution content is an issue that’ll persist for years.

The failure of early high-resolution renders obscured the usefulness of high pixel counts for decades, but the past five years have brought resolution back to the spotlight. Apple’s first iPhone with Retina sparked the trend, and it’s quickly spread to other devices – for good reason.

Tom Peterson, Director of Technical Marketing at Nvidia, told us that packing in extra pixels really does render them invisible. “As the pixel density gets really high, it reaches the threshold of what the human eye can observe. We call that the visual limit.” A display that exceeds the visual limit looks less like a display and more like a printed page, albeit one that glows.

What is the visual limit? It’s best described in terms of pixels per degree of vision, a metric that changes based on the size of a display and the observer’s distance. The golden number is 50 PPD, a figure that many modern smartphones easily exceed. 4K monitors don’t quite meet that goal, but 5K displays like the iMac with Retina and Dell UP2715K do, and a 65-inch 4K television can also hit the magic number if viewed from six feet away.
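As a back-of-the-envelope check, here’s a small Python sketch of that calculation. The 50 PPD threshold comes from the discussion above; the assumed viewing distances (six feet for the television, roughly two feet for a desktop display) and the 16:9 aspect ratio are assumptions for illustration, and exact thresholds vary by source.

```python
import math

def pixels_per_degree(horizontal_pixels, diagonal_inches, distance_inches,
                      aspect=(16, 9)):
    """Estimate pixels per degree of vision (PPD) for a display.

    One degree of visual angle covers 2 * distance * tan(0.5 deg) inches
    of screen at the given viewing distance; PPD is how many pixels fit
    inside that span."""
    ar_w, ar_h = aspect
    width_inches = diagonal_inches * ar_w / math.hypot(ar_w, ar_h)
    pixels_per_inch = horizontal_pixels / width_inches
    inches_per_degree = 2 * distance_inches * math.tan(math.radians(0.5))
    return pixels_per_inch * inches_per_degree

if __name__ == "__main__":
    # A 65-inch 4K TV viewed from six feet (72 inches) comfortably clears
    # the 50 PPD mark, at roughly 85 PPD.
    print(round(pixels_per_degree(3840, 65, 72)))
    # A 27-inch 5K display at an assumed two-foot desk distance also
    # clears it, at roughly 90 PPD.
    print(round(pixels_per_degree(5120, 27, 24)))
```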

This is not to say that reaching the visual limit immediately eliminates aliasing. Upscaling low-resolution content is an issue that’s likely to persist until pixel-dense displays become the norm. Windows is currently struggling to curtail this issue because it must maintain compatibility with numerous applications, some of which may be over a decade old and are no longer actively supported by their developers.

“Applications have some work to do using rendered fonts,” Tom Peterson explained, “because a lot of them are using bitmapped fonts.” These scale poorly, as they are “pixelated text images.” Ideally, fonts should be vector-based, making them a collection of lines and curves that can scale easily. This is why the text in Windows 8.1’s interface looks brilliant at any resolution, but the text in desktop applications often appears soft and blurred.
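Here’s a toy sketch of that difference, not how Windows actually renders fonts: a bitmapped glyph scaled up by repeating its pixels stays blocky, while a vector outline simply scales its coordinates and can be re-rasterized sharply at any size.

```python
def scale_bitmap(glyph, factor):
    """Nearest-neighbor upscale of a 1-bit glyph: every pixel becomes a
    factor-by-factor block, so the jagged shape is preserved (and
    magnified). This is why bitmapped fonts look blocky when scaled up."""
    return [[pixel for pixel in row for _ in range(factor)]
            for row in glyph for _ in range(factor)]

def scale_vector(outline, factor):
    """Scaling a vector outline just multiplies its coordinates; the shape
    stays mathematically exact and can be re-rasterized cleanly at the
    new size."""
    return [(x * factor, y * factor) for x, y in outline]

if __name__ == "__main__":
    # A crude 5x5 bitmap of a diagonal stroke...
    glyph = [[1 if x == y else 0 for x in range(5)] for y in range(5)]
    for row in scale_bitmap(glyph, 3):
        print("".join("#" if p else "." for p in row))
    # ...versus the same stroke described as a vector segment.
    print(scale_vector([(0, 0), (4, 4)], 3))  # [(0, 0), (12, 12)]
```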

Still, this is a solvable problem, and one that developers will be pressured to fix as pixel densities continue to surge. Users who spend hard-earned money on an upgraded display will want to see the benefits, and are sure to avoid software that refuses to modernize.

We’re here today to mourn the pixel

Many new devices have already exceeded the visual limit, and while computers have lagged behind mobile devices, they’re beginning to catch up. Any 15-inch laptop with a 4K display easily exceeds the limitations of the human eye, and any 27-inch display with 5K resolution does the same. These panels will only decrease in price over time, just as 1080p did; within a few years they’ll be in everyday notebooks and monitors anyone can afford.

That will be the final nail in the pixel’s coffin. With televisions already heading to 4K (and beyond), there will no longer be any device on store shelves without the density needed to render pixels invisible at a typical viewing distance. Resolution itself will start to lose its meaning; users will simply be concerned with whether a display does, or doesn’t, appear as sharp as a photograph. Pixels will fade out of popular knowledge, and further advancements in sharpness will exist only for marketing. Programmers, artists and others who deal with digital images will continue to acknowledge pixels, but most users, even enthusiasts and videophiles, will have little reason to care.

Just as today’s youth can’t remember a world without the Internet, children born five years from now won’t remember a world in which the pixel exists. To them, displays will have always appeared as crisp as a window, and pixel art will be nostalgia for an era only their parents remember.

Their world will not be so different from our own. But it’ll look a hell of a lot smoother.

The views expressed here are solely those of the author and do not reflect the beliefs of Digital Trends.
