
Zooming in on Apple’s high-density Retina displays

Apple raised eyebrows when it introduced the iPhone 4 with its so-called “Retina” display. Although many dismissed the screen as a gimmick, once folks set eyes on one it was hard not to notice that the image quality and readability of text on the iPhone 4 were often superior to anything else on the market. Apple stuck with the Retina display for the iPhone 4S, and then upped the ante again with the third-generation iPad, which features a 9.7-inch LCD panel with a 2,048 by 1,536-pixel resolution, something that would have been unthinkable only a few years ago. Now, reports (such as those from 9to5Mac) have Apple readying a new MacBook Pro line with 15-inch displays sporting a 2,880 by 1,800-pixel resolution: a screen twice as dense as current models.

What’s the big deal about so-called Retina displays? Do they have the potential to revolutionize the desktop computing experience the same way high-resolution displays have changed the pixel landscape for current iPhone and iPad users?

The basic idea


The essential idea behind Apple’s Retina display has nothing to do with the particular technology used by a display. To date, Apple has used LCD panels, but the same ideas apply equally to AMOLED screens and even old-school CRTs. The premise is to increase the pixel density of the display beyond the point where the human eye can distinguish individual pixels, so that images shown on the display appear as smooth, continuous tones akin to a high-quality printed page, rather than a grid of colored dots.

The idea isn’t new: IBM has been offering high-density displays for use with some of its top-drawer visualization and supercomputing systems since at least the mid 1990s, and things like high-end flight simulators (particularly in the military) have been using technology like this for years. The term Retina display is an Apple invention — hence the capitalization. A better term is high-density display, and even Apple uses terminology like that in its developer materials.

Pixel density is usually measured in PPI — pixels per inch — which is a bit of a carryover term from the print and broadcast industries, where dpi (dots per inch) is often used to describe resolution. At a very basic level, the same idea applies to film photography, where “grainy” images can result from individual particles on the film stock: enlarge a film image enough and you’ll start to see the “dots,” although they’re not laid out on a regular grid like pixels.

With normal 20/20 vision, the human eye can resolve objects with an apparent size of about 1 arcminute*. For a sense of how small that is, there are 60 arcminutes in a single degree and, of course, turning full circle covers 360 degrees. The key here is apparent size, because objects appear to be bigger or smaller with distance. We’re all about the same distance from the moon, so it makes a handy reference object: the full moon (overhead, say at midnight) is about 30 arcminutes across, or about half a degree. A human eye with normal vision can resolve objects about thirty times smaller. (Yes yes, the moon seems bigger at the horizon: that’s an optical illusion that hasn’t been fully explained.)

Here “resolve” means that we can actually distinguish the object’s shape and characteristics. Objects smaller than one arcminute (with normal vision!) don’t magically turn invisible and vanish from view: If there’s enough contrast, we can still see them. But all we see is a dot: we know something is there, but we can’t distinguish much of anything about it. That spot in the sky might be a bird, might be a plane, might be a guy with his underwear on the outside of his pants hurtling through the air. We won’t be able to tell until it gets closer.

Doing the math

So, how does the math on that work out? At a distance of 12 inches, the figures above mean a human eye with normal vision can resolve objects about 0.0035 inches across. For individual pixels on a display to be too small for the eye to resolve at that distance, the display needs to pack in about 286 pixels per inch (ppi).
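If you want to check that figure yourself, the arithmetic is straightforward. Here’s a minimal sketch in Python, assuming the 1-arcminute acuity limit and a 12-inch viewing distance (the function name is just for illustration):

    import math

    def retina_threshold_ppi(viewing_distance_inches, arcminutes=1.0):
        # Convert the visual acuity limit from arcminutes to radians.
        angle = math.radians(arcminutes / 60.0)
        # Smallest feature the eye can resolve at this distance, in inches.
        feature_inches = viewing_distance_inches * math.tan(angle)
        # Pixels become indistinguishable once they're smaller than that feature.
        return 1.0 / feature_inches

    print(retina_threshold_ppi(12))   # roughly 286 ppi at 12 inches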

Here’s how some Apple products stack up:

Display resolutions of popular Apple products, including pixels per inch (ppi)

By way of comparison, the first Macs had screen resolutions of 72 ppi (Windows systems varied quite a lot). These days, typical screen resolutions for notebook and desktop computers range from about 96 ppi to about 150 ppi. This math isn’t new: it’s the same logic that led to the first commercial laser printers offering “high-quality” output of 300 dots per inch (dpi).
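If you’d like to verify the ppi numbers in that table, all it takes is a screen’s pixel dimensions and its diagonal size. A quick sketch, using the third-generation iPad’s panel and the rumored MacBook Pro panel (assuming the latter keeps the current 15.4-inch diagonal):

    import math

    def ppi(width_px, height_px, diagonal_inches):
        # Pixel density is the pixel count along the diagonal divided by its length.
        return math.hypot(width_px, height_px) / diagonal_inches

    print(round(ppi(2048, 1536, 9.7), 1))    # third-generation iPad: ~264 ppi
    print(round(ppi(2880, 1800, 15.4), 1))   # rumored MacBook Pro: ~220 ppi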

Notice that only the iPhone 4 and 4S exceed that 286 ppi threshold. If Apple’s forthcoming MacBook Pros do include a Retina display, it looks like it’ll come in at about 220 ppi.

But roughly 220 ppi could be just fine, because all things aren’t quite equal. First, most people don’t consistently position a display exactly 12 inches from their eyes: some hold their devices a little closer, many hold them a bit further away, and displays on notebooks and desktop computers tend to be further away still. Remember: the further away something is, the smaller its apparent size. As an example, my desktop displays are about 34 inches from my eyes; a notebook on a tabletop seems to be about 24 inches. (Flipside, I’m sure by now most of us have seen folks walking around with their iPhones nearly touching their noses.)
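Plugging those distances into the same 1-arcminute formula shows how quickly the threshold drops as the screen moves away (the distances are the rough estimates above, not measurements):

    import math

    for distance_inches in (12, 18, 24, 34):
        # Required density falls off in direct proportion to viewing distance.
        threshold = 1.0 / (distance_inches * math.tan(math.radians(1 / 60)))
        print(f"{distance_inches} in: ~{threshold:.0f} ppi")
    # 12 in: ~286 ppi, 18 in: ~191 ppi, 24 in: ~143 ppi, 34 in: ~101 ppi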

Second, plenty of people don’t have normal vision. For instance, I don’t have any trouble picking out individual pixels on an iPhone 4S from about 18 inches…at least, with one of my eyes. The other eye doesn’t do so well. It’s easy to argue that folks who don’t have 20/20 vision probably have corrective lenses, so the same visual rules about being able to resolve individual pixels ought to apply — but that’s not really true. Plenty of people don’t wear their glasses or contacts all the time, and plenty of people who don’t have 20/20 vision don’t use corrective lenses at all. In the United States, folks typically only need about 20/40 vision to pass a driving test.

All this means that screens with resolutions lower than 286 ppi can offer a Retina-like experience for many users, provided they’re either far enough back from the screen, don’t have perfect vision, or both.

Memory, bandwidth, and resolution

Image: iPad Retina display comparison

Observant readers will note two other columns in that table above: pixels and memory. It’s easy to think of a display measuring 2,048 by 1,536 pixels as having twice the resolution of one measuring 1,024 by 768 pixels; after all, each side is twice as long. However, that translates to a four-fold increase in the number of pixels, and that, in turn, directly translates to a four-fold increase in the amount of memory the device needs to manage the display.

In loose terms, the device with the larger display has to have about four times the memory and be able to move data in and out of its graphics systems four times faster — at least, if it wants the display to appear to perform as well as the smaller version. That means more reading and writing from memory, and that means consuming more power. So, particularly on portable devices, manufacturers use processors and other components that use as little power as possible.
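To put loose numbers on that, here’s a back-of-the-envelope sketch assuming 24-bit color (3 bytes per pixel), which is roughly the assumption behind the 9.2 MB figure mentioned below:

    def framebuffer_bytes(width_px, height_px, bytes_per_pixel=3):
        # Memory needed to hold a single full-screen image at 24-bit color.
        return width_px * height_px * bytes_per_pixel

    ipad_1 = framebuffer_bytes(1024, 768)    # original iPad
    ipad_3 = framebuffer_bytes(2048, 1536)   # third-generation iPad

    print(ipad_3 / ipad_1)                   # 4.0: doubling each side quadruples the pixels
    print(round(ipad_3 / 2**20, 1))          # about 9 MB for one full-screen image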

Again, all things aren’t equal. The figures above may make it look like a third-generation iPad only needs 9.2 MB of video memory to manage its display — and that would be true if the iPad were just managing one image for the whole screen at any given time. (That’s what the original personal computers did, by the way.) The reality of software development is that applications often render and load up graphic elements in video memory so things can be displayed fast. Common examples include games (where characters, objects, and other graphical elements get stashed in video memory for real-time access) but also seemingly prosaic applications like Web browsers: When you switch to a different tab, the page that just disappeared is probably still in video memory, ready to be re-summoned when you want it. The same thing is true of interface elements like menus, buttons, and scrollbars, and hundreds of things we see on screen all the time.

The impact of high-resolution displays is most apparent in these elements. Let’s say a typical application icon measures 128 pixels square. On a display with 100 ppi resolution that icon is more than an inch across, and very clear and visible to most users. Put that same icon on a display with a 200 ppi resolution, and suddenly it’s about two-thirds of an inch across and occupies a quarter of the area it previously consumed. The same effect applies across the board: double a display’s pixel density and everything on it shrinks to a quarter of the area it covered before. That’s a quick way to make applications (and interfaces) completely unusable.
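A quick sanity check of that icon arithmetic (the 128-pixel icon is just the hypothetical example above):

    icon_px = 128
    for density_ppi in (100, 200):
        # Physical size is simply pixel count divided by pixel density.
        side_inches = icon_px / density_ppi
        print(f"{density_ppi} ppi: {side_inches:.2f} in per side, {side_inches ** 2:.2f} sq in")
    # 100 ppi: 1.28 in per side, 1.64 sq in
    # 200 ppi: 0.64 in per side, 0.41 sq in (one quarter of the area)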

Makes you squint

Image: iPhone Retina display comparison

So why didn’t iOS become unusable when Apple took the iPad and the iPhone to high-density displays? Apple included interface elements with four times the pixels of the old versions: their apparent size on screen didn’t change, but their apparent quality did, since they used four times as many pixels to present the same image. And yes, those elements take four times as much memory and storage to manage. Similarly, application developers who wanted to take advantage of the Retina display had to update their apps, and that includes supplying graphics with four times the pixels of previous versions.
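Roughly speaking, iOS pulls this off by keeping layout in logical units and applying a scale factor when it draws: on a Retina device each logical unit maps to two pixels in each direction, and apps supply double-resolution artwork (conventionally suffixed “@2x”). Here’s an illustrative sketch of that idea; the function names are made up for illustration, not Apple’s API:

    def pixels_for(points, scale_factor):
        # Logical layout units map to physical pixels via the screen's scale factor.
        return points * scale_factor

    def asset_name(base_name, scale_factor):
        # Retina-aware apps ship a double-resolution copy of each image,
        # conventionally suffixed "@2x" on iOS.
        return f"{base_name}@2x.png" if scale_factor == 2 else f"{base_name}.png"

    icon_points = 57                        # same apparent size on both screens
    print(pixels_for(icon_points, 1))       # 57 pixels on a pre-Retina iPhone
    print(pixels_for(icon_points, 2))       # 114 pixels on an iPhone 4/4S
    print(asset_name("Icon", 2))            # Icon@2x.png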

The same thing will be true if (ahem, when) Apple begins to introduce high-density displays to its notebook and desktop lines. Most existing applications will run just fine — but they’ll appear chunky in comparison to apps developed with the high-density display in mind. Where an application may have drawn a line that was one pixel wide before, the high-density displays will use two pixels. For some apps, this may not matter at all; for others, it might be a major eyesore.

There may be some apps that don’t make the transition very well. The most likely candidates are games that do sneaky graphics tricks, particularly those built on OpenGL or other graphics engines that make low-level tweaks to interface with graphics hardware. Similarly, some image and video editing apps may have performance issues or glitches: the more dependent they are on interacting with graphics hardware and drivers directly, the more potential for problems.

Are high-density displays worth the hassle?

Do high-density displays bring real benefits? The overwhelming answer seems to be “yes.” Consumers are voting with their feet, and their money: the third-generation iPad and the iPhone 4 and 4S have been astonishingly successful products for Apple, and their displays have been almost universally lauded. Other mobile device manufacturers were quick to jump on the bandwagon: Samsung, HTC, Motorola, Sony, and LG have all released smartphones with high-density displays, many of which exceed the iPhone’s density. (The Sony Xperia S and HTC Rezound currently seem to be tied for the top spot, with displays offering 342 ppi, though RIM seems to be gunning for over 350 ppi with its next tablet.)

However, it’s important to note that high-density displays have, so far, been limited to rather isolated ecosystems. In Apple’s case, the company’s tight rein on the iOS platform (and the platform’s sheer popularity) has helped ease the transition to high-density displays. When the original iPad debuted, it would run apps developed for the original iPhone, but they were seriously clunky compared to versions made specifically for the iPad. Developers quickly embraced the iPad, however, and have similarly embraced the high-density displays on the iPhone 4/4S and third-generation iPad.

Android is another story. While there are a number of Android devices that offer high-density displays, fragmentation means they haven’t seen anywhere near the same level of developer support Apple has commanded with iOS. Yes, people can buy Android devices with high-density displays, and yes, there are apps designed for those displays. But persuading an Android developer to make a high-density version of their app can be a tough sell. If an app benefits substantially from high-density displays, it might be a no-brainer. Otherwise, most Android developers are likely to consider apps that target a wide range of screen resolutions to be good enough; if they look a little clunky on high-density displays, so be it.

Apple may experience something similar if (ahem, when) it introduces high-density displays for Macs. The Macintosh software ecosystem is nowhere near as tightly knit as the iOS ecosystem, and is littered with legacy applications folks have been using for years — and which may never be updated. Apple has made a number of transitions that have forced its Mac OS X users to abandon legacy software — such as killing off Classic in Mac OS X 10.5 Leopard, and eliminating support for PowerPC applications in Mac OS X 10.7 Lion. Apple is now making moves to lock down the Mac OS X software ecosystem almost as hard as iOS: Apple’s forthcoming Mac OS X 10.8 Mountain Lion will drive all users towards Apple’s Mac App Store, and — perhaps as soon as June 1 — apps won’t be able to get into the store unless they’re limited to a security sandbox. It’s reasonable to assume explicit support for high-density displays will soon be considered a key feature.

Microsoft will have a much tougher time with high-density displays due to the vast ecosystem of legacy Windows software that’s in active use. Microsoft is in a very strong position to take advantage of high-density displays with the brand-new Windows Metro (particularly on Windows RT for ARM-based devices) but will likely have a tougher time convincing budget-sensitive customers dependent on legacy software for the traditional Windows desktop to step up. Plus, the ever-frustrating dance many Windows users perform with video drivers, BIOS, and applications will carry forward to high-density displays.

But gosh: when it’s all over, won’t everything be so pretty?

* — There are varying definitions of “normal” vision. Optometrists often define normal 20/20 vision as the ability to distinguish letters that take up five minutes of arc (with individual strokes of about one arcminute); however, some definitions of normal vision include distinguishing objects with an apparent size as small as 0.6 arcminutes.
