
Kids born today won’t know what a pixel is, and that’s a dream come true

The pixel is a cultural icon. Wander down any street and you’re likely to see it not only in old LCD screens lining storefronts but also in logos, advertisements, and even fashion. “Pixel art,” the intentional use of lo-fi, pixelated graphics, is virtually the default among indie games, and even Digital Trends’ own 404 page features 8-bit characters on a pixelated skyscape.

Users have become accustomed to the pixel, and most have forgotten it’s an artifact of limited graphics technology. The pixel was never desired; it exists only because of the particular limitations of computer displays and graphics renderers. Over the past three decades, countless minds have tried to tame unsightly pixelation, also known as aliasing, in numerous ways.

The fight has been long, but the forces of progress are winning. Pixels are dying. It’s entirely possible that a baby born today will never see one, and a child born a decade from now will probably never know of their existence unless he or she decides to learn computer science. Let’s take a moment to reflect on the pixel before it’s laid to rest in the graveyard of obsolescence.

The jagged edge

Pixels have existed since the beginning of computer graphics, and for many early computer scientists, they represented a serious problem. While their existence didn’t necessarily hobble interface design for early mainframes and the first home PCs, they were a major obstacle for anyone seeking to push the limits of realistic computer graphics.

Lucasfilm was an early pioneer in the field. Its computer division, which was eventually sold to Steve Jobs and renamed Pixar, searched desperately for ways to render graphics detailed enough to be used alongside miniatures in Star Wars.

What’s in one pixel could be a city.

Robert Cook, Pixar’s former Vice President of Software Development, was there from the beginning and remembers the challenge well. “The basic problem,” he explained, “is you’re trying to make an image with a lot of detail, and you only have so many pixels.”

This inevitably forces computers to make a difficult decision. Multiple objects might inhabit the space of a single pixel, yet only one can be shown – so which one should it be? “What’s in that one pixel could be a city,” said Cook, “but the computer has to pick one color.” Early computers, with limited pixels and no way to combat aliasing, were forced to dramatically simplify images. The result was coarse, jagged graphics that looked nothing like reality.

Foiled by the pixelation of computer-generated graphics, Star Wars’ producers turned to real-life miniatures instead, like this recreation of the Death Star. Starwars.com

Those “jaggies” were particularly nasty in objects oriented at a diagonal to the pixel grid, and they precluded the use of computer graphics for most special effects until the problem was solved.

That proved a long, difficult road. Computer graphics never contributed significantly to the original Star Wars trilogy, which relied on a complicated dance of miniatures through Return of the Jedi‘s epic final battle. Lucasfilm, refocusing on its core entertainment business and unhappy with the results of its Computer Division, sold the division to Steve Jobs in 1986. Jobs renamed the company Pixar after its star product, a $135,000 anvil of processing power called the Pixar Image Computer.

A new hope

While the Pixar Image Computer was technically stunning, it wasn’t a commercial success, and it didn’t represent the company’s passion. Many of its employees wanted to use computer graphics to create entertainment, even art. This included former Disney animator John Lasseter, who was hired by the Lucasfilm Computer Division to breathe life into its technically impressive graphics.

PC users expect razor-sharp image quality and despise softness, even if aliasing is the result.

Everyone knew, though, that even an animator of Lasseter’s skill couldn’t produce a compelling scene from computer graphics if jaggies remained an issue. Pixels don’t look natural, they obscure the detail of a scene, and objects in motion snap cleanly from one pixel to the next, lacking the motion blur that makes film seem realistic.

The geeks at Lucasfilm tried to tackle the problem in a number of ways. Eventually, a hardware engineer at the company, Rodney Stock, came up with an idea that Rob Cook refined into a fix for aliasing: randomness.

“The jaggies come from the samples all being lined up on a grid,” Cook explained. “If you add some randomness, you break up the patterns.” Adding randomness to aliased portions of an image introduces uneven noise that, unlike patterns of perfectly stepped pixels, doesn’t seem unusual to the human eye.
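
The core of the trick is easy to see in a few lines of code. Below is a minimal sketch in Python; the one-line scene function and all of the numbers are invented for illustration, and this is nowhere near a production renderer. Sampling each pixel at a fixed point on the grid produces the familiar stair-steps, while averaging a handful of randomly jittered samples per pixel trades them for faint noise.

```python
import math
import random

# Toy scene: a hard diagonal edge. Returns 1.0 (white) above the line, 0.0 below.
def scene(x, y):
    return 1.0 if y > 0.31 * x else 0.0

def render(width, height, samples_per_pixel=1, jitter=False):
    """Average one or more samples inside each pixel.

    With a single centered sample per pixel, the edge comes out as classic
    stair-step jaggies. With several randomly jittered samples, the steps
    dissolve into fine noise that the eye reads as a smooth gradient.
    """
    n = max(1, int(math.sqrt(samples_per_pixel)))
    image = []
    for py in range(height):
        row = []
        for px in range(width):
            total = 0.0
            for sy in range(n):
                for sx in range(n):
                    if jitter:
                        ox, oy = (sx + random.random()) / n, (sy + random.random()) / n
                    else:
                        ox, oy = (sx + 0.5) / n, (sy + 0.5) / n
                    total += scene(px + ox, py + oy)
            row.append(total / (n * n))
        image.append(row)
    return image

aliased = render(16, 16)                                      # hard stair-steps
smoothed = render(16, 16, samples_per_pixel=16, jitter=True)  # noisy but smooth
```

Printed side by side, the first image shows the stepped edge while the second trades it for faint, even noise.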

Rob Cook in 2010. Photo by Deborah Coleman/Pixar

Randomness did more than just serve as effective anti-aliasing. It also helped blend computer effects with film, and when samples were spread across time as well as space, it produced a natural-looking motion blur, addressing numerous problems with one tidy solution. While there were alternative techniques for fighting aliasing in 3D film, they proved too computationally intense and didn’t produce superior results, leaving random sampling to reign as king.
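
Extending the earlier sketch into time shows where that blur comes from: if each sample is also taken at a random moment within the interval a single frame represents, a moving edge averages out into a soft gradient. Again, this is a toy illustration with invented numbers, not how any production renderer actually worked.

```python
import random

def moving_scene(x, t):
    """A white bar whose left edge slides from x = 40 to x = 60 during the
    frame's exposure (t runs from 0 to 1). Purely invented for illustration."""
    return 1.0 if x > 40 + 20 * t else 0.0

def render_scanline(width, samples_per_pixel=8):
    """Jitter each sample in time as well as space.

    Averaging samples taken at random moments within the exposure smears the
    moving edge into a gradient -- a rough stand-in for the motion blur a
    film camera records naturally, and it falls out of random sampling for free.
    """
    row = []
    for px in range(width):
        total = 0.0
        for _ in range(samples_per_pixel):
            x = px + random.random()   # spatial jitter inside the pixel
            t = random.random()        # random moment during the exposure
            total += moving_scene(x, t)
        row.append(total / samples_per_pixel)
    return row

blurred_edge = render_scanline(width=80)  # values ramp from 0.0 to 1.0 around x = 40..60
```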

Bringing it home

Solving the problem of pixels on the average home PC is not the same as solving it in film, however. Computer-generated movies are expected to replicate the nature of film, including its imperfections. A little noise or blur is not just acceptable, but desirable.

The Windows desktop, and computer interfaces in general, are a different animal. Users expect razor-sharp image quality, despise softness, and frown upon noise. The ideal font is pixel-perfect, high-contrast, fine yet readable, standing out boldly from its surroundings. Even computer gamers expect a very specific experience and often look down on motion blur as an artifact or distracting visual add-on rather than a desirable effect. Games that pay homage to film, such as the original Mass Effect (which implemented a “film grain” filter), catch flak from those who prefer a sharper experience, even at the cost of aliasing. Pixels are preferable to noise.

Mass Effect

Given the choice, though, users prefer to have the best of both worlds: razor-sharp image quality and smooth edges. A number of techniques have been developed to deliver this, with varying success. Windows uses ClearType, a sub-pixel anti-aliasing technology designed specifically for fonts. Apple uses font smoothing along with tight guidelines for the art assets developers supply, particularly for its Retina displays. And games use numerous tactics, from multi-sample anti-aliasing, which only smooths the edges of polygons, to temporal anti-aliasing, which smooths all aliased edges while drawing on data from multiple frames.
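
The idea behind temporal anti-aliasing, in particular, can be sketched in a few lines: render each frame with a slightly different sub-pixel offset and blend it into a running history buffer. The snippet below shows only that accumulation step, with invented names, and leaves out the reprojection and history clamping that real game engines rely on.

```python
def taa_resolve(history, current, alpha=0.1):
    """Blend this frame's jittered render into a running history buffer.

    Because each frame is rendered with a slightly different sub-pixel jitter,
    the exponential average converges toward a supersampled image over time.
    Real engines also reproject the history to follow camera and object motion
    and clamp it to avoid ghosting; both steps are omitted here.
    """
    return [
        [(1.0 - alpha) * h + alpha * c for h, c in zip(h_row, c_row)]
        for h_row, c_row in zip(history, current)
    ]

# Usage sketch: feed the resolved frame back in as the next frame's history.
# history = taa_resolve(history, render_with_jitter(frame_index))
```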

These efforts have gradually eroded the pixel’s prominence, making unsightly, jagged edges less common, but they’re not a complete solution. Aliasing is a tough problem to solve, particularly when compute power is limited, as it so often is with home PCs. And there’s always a trade-off between sharpness and smoothness. Turning Apple’s text smoothing up a few notches in the command line can make aliasing very difficult to detect, but it also results in soft, fuzzy fonts that aren’t at all like the crisply printed text in a book.

The visual limit

Anti-aliasing was not the only solution to jaggies considered in the early days of computer graphics. Researchers also looked into rendering images with resolutions as high as 8,000 pixels on a side, which made individual pixels too small for the human eye to detect. Lucasfilm itself commissioned several high-resolution renders of its X-Wing fighter from the graphics group of a company called Information International, Inc. One of these highly impressive renders even made the cover of Computer magazine.

Yet this technique was soon abandoned for a number of reasons. It was insanely computationally intense, which meant a single frame effectively cost thousands of dollars, and increasing the resolution did nothing to solve the motion blur issue that plagued computer graphics. Though effectively free of visible pixels, the renders didn’t look real, and for Lucasfilm, deep in the production of Star Wars, that was an unforgivable sin.

Upscaling low-resolution content is an issue that’ll persist for years.

The failure of early high-resolution renders obscured the usefulness of high pixel counts for decades, but the past five years have brought resolution back into the spotlight. Apple’s first iPhone with a Retina display sparked the trend, and it has quickly spread to other devices – for good reason.

Tom Peterson, Director of Technical Marketing at Nvidia, told us that packing in extra pixels really does render them invisible. “As the pixel density gets really high, it reaches the threshold of what the human eye can observe. We call that the visual limit.” A display that exceeds the visual limit looks less like a display and more like a printed page, albeit one that glows.

What is the visual limit? It’s best described in terms of pixels per degree of vision, a metric that changes based on the size of a display and the observer’s distance. The golden number is 50 PPD, a figure that many modern smartphones easily exceed. 4K monitors don’t quite meet that goal, but 5K displays like the iMac with Retina and Dell UP2715K do, and a 65-inch 4K television can also hit the magic number if viewed from six feet away.
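
The math behind those figures is simple trigonometry. Here is the usual back-of-the-envelope calculation, sketched in Python; the exact PPD value depends on where on the screen you measure and which calculator you trust, so treat the output as an approximation.

```python
import math

def pixels_per_degree(diag_inches, res_h, res_v, distance_inches):
    """Approximate pixels per degree of vision at the center of the screen.

    Uses the common approximation PPD = horizontal pixels / horizontal field
    of view in degrees; the figure varies slightly across the screen and
    between published calculators.
    """
    aspect = res_h / res_v
    width = diag_inches * aspect / math.sqrt(aspect ** 2 + 1)
    fov_degrees = math.degrees(2 * math.atan(width / (2 * distance_inches)))
    return res_h / fov_degrees

# The article's example: a 65-inch 4K TV viewed from six feet (72 inches).
print(round(pixels_per_degree(65, 3840, 2160, 72)))  # roughly 89 -- comfortably above 50 PPD
```

Move the same television a few feet closer, or shrink the viewing distance on a monitor, and the PPD figure drops quickly, which is why distance matters as much as resolution.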

This is not to say that reaching the visual limit immediately eliminates aliasing. Upscaling low-resolution content is an issue that’s likely to persist until pixel-dense displays become the norm. Windows is currently struggling to curtail this issue because it must maintain compatibility with numerous applications, some of which may be over a decade old and are no longer actively supported by their developers.

“Applications have some work to do using rendered fonts,” Tom Peterson explained, “because a lot of them are using bitmapped fonts.” They scale poorly, as they are “pixelated text images.” Ideally, fonts should be vector-based, making them a collection of lines and angles that can scale easily. This is why the text in Windows 8.1’s interface looks brilliant at any resolution, but the text in desktop applications often appears soft and blurred.
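
A toy comparison makes the point. The bitmap glyph below is invented and real font rendering is far more sophisticated, but it shows why a fixed grid of pixels scales badly while a description made of strokes can be redrawn cleanly at any size.

```python
# A made-up 5x5 bitmap "glyph" (a crude capital N) and a toy comparison of
# bitmap scaling versus re-rasterizing a vector description at the target size.
BITMAP_N = [
    "X...X",
    "XX..X",
    "X.X.X",
    "X..XX",
    "X...X",
]

def scale_bitmap(glyph, factor):
    """Nearest-neighbour scaling: each source pixel becomes a factor x factor
    block, so the diagonal turns into chunky stair-steps -- the blocky, blurry
    text described in the quote above."""
    out = []
    for row in glyph:
        stretched = "".join(ch * factor for ch in row)
        out.extend([stretched] * factor)
    return out

def rasterize_vector_n(size):
    """A 'vector' N is just two verticals and a diagonal described by their
    endpoints, so it can be redrawn crisply at any target size."""
    out = []
    for y in range(size):
        row = "".join(
            "X" if (x == 0 or x == size - 1 or x == y) else "."
            for x in range(size)
        )
        out.append(row)
    return out

print("\n".join(scale_bitmap(BITMAP_N, 3)))   # 15x15, with 3-pixel-wide steps
print()
print("\n".join(rasterize_vector_n(15)))      # 15x15, redrawn with fine steps
```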

Still, this is a solvable problem, and one that developers will be pressured to fix as pixel densities continue to surge. Users who spend good money on an upgraded display will want to see the benefits, and they’re sure to avoid software that refuses to modernize.

We’re here today to mourn the pixel

Many new devices have already exceeded the visual limit, and while computers have lagged behind mobile devices, they’re beginning to catch up. Any 15-inch laptop with a 4K display easily exceeds the limits of the human eye, and any 27-inch monitor with 5K resolution does the same. These panels will only decrease in price over time, just as 1080p panels did; within a few years they’ll be in everyday notebooks and monitors anyone can afford.

That will be the final nail in the pixel’s coffin. With televisions already heading to 4K (and beyond), there will no longer be any device on store shelves without the density needed to render pixels invisible at a typical viewing distance. Resolution itself will start to lose its meaning; users will simply be concerned with whether a display does, or doesn’t, appear as sharp as a photograph. Pixels will fade out of popular knowledge, and further advancements in sharpness will exist only for marketing. Programmers, artists, and others who deal with digital images will continue to acknowledge pixels, but most users, even enthusiasts and videophiles, will have little reason to care.

Just as today’s youth can’t remember a world without the Internet, children born five years from now won’t remember a world in which the pixel exists. To them, displays will have always appeared as crisp as a window, and pixel art will be nostalgia for an era only their parents remember.
Their world will not be so different from our own. But it’ll look a hell of a lot smoother.

Matthew S. Smith