
Are high-resolution displays screwing up the Web?


Beginning with Apple’s iPhone 4, high-resolution displays have gone from being a plaything of well-funded engineers, renderers, and video producers to a mainstay of consumer devices. Although Apple has continued the trend with its latest iPad tablet and (most recently) the MacBook Pro with Retina Display, it’s far from alone: HTC’s Rezound Android handset packs in more than 340 pixels per inch (higher than the iPhone’s 326 ppi), and manufacturers like Acer are producing Android tablets with resolutions above 260 ppi, putting them in the same league as Apple’s latest iPad. Plenty of other manufacturers are (or will be) producing mobile devices and notebook computers with similarly high-res displays.

There’s just one downside: Apple’s so-called “Retina” displays have a tendency to make the whole Internet look fuzzy. The reason comes down to resolution: Where (so far) Android tablets and other devices with high-res displays have primarily used the extra pixels to offer more screen real estate, Apple uses them to double the display’s pixel density. That works great for apps designed with high-density displays in mind, but the vast majority of material available on the Internet is crafted for lower-resolution displays, and compared to high-res material it looks fuzzy at best, crappy at worst.

How can a higher quality display make things on the Internet look worse? And since high-res displays aren’t going away, are there any technological solutions in the works that can make things look great?

Zooming in on resolution

[Image: the original Macintosh 128K displaying “hello”]

At a conceptual level, high-resolution displays like the ones in the new iPad and MacBook Pros are simple: They just cram more individual pixels into the same area. Naturally, there’s a great deal of engineering involved in that feat: shrinking the size of LCD pixel elements, managing backlighting and power consumption, and making the screens strong, lightweight, and viewable from very wide angles. But it basically boils down to more pixels, each taking up less space.

How much less space? When the original Apple Macintosh brought graphical user interfaces to the mainstream, it had a 9-inch display measuring 512 by 342 pixels. (And it was black and white!) That worked out to roughly 72 dots per inch, or dpi. This wasn’t a coincidence: It aligned neatly with the standard used in modern-day typography, where a single point of type is 1/72nd of an inch. On the Mac, that meant 12-point type could be represented on screen with 12 pixels. Combine that with a 72 dpi (dot matrix!) printer and voilà: What You See Is What You Get. No wonder one of the Mac’s most popular early features was the Font menu.

Windows eventually took a different approach. Microsoft didn’t make Windows PCs, so it had no control over the physical resolution of the displays used with Windows systems. So Microsoft essentially decided by fiat that Windows would assume everything was 96 ppi. (There were many reasons, but it basically came down to a compromise between old-school graphics and the higher-resolution devices becoming available: monitors, yes, but particularly printers.) That meant Windows would use 16 pixels to represent a 12-point font on screen. To some, this meant Windows had better on-screen typographic display than Macs: more pixels equals more resolution equals more accuracy and easier reading, right? However, it also meant Windows machines could fit less 12-point type on a standard display, so users got in the habit of using 10-, 9-, 8-, and even 7-point type for onscreen display. This had a funny effect on Mac users years later, when the Web first went mainstream: Web pages designed with 96 ppi text sizes in mind were often impossible to read in Mac Web browsers that assumed 72 dpi, because the Macs drew the text at 75 percent of the already-small size meant for Windows.
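For the numbers-minded, here’s a minimal sketch of that point-to-pixel arithmetic. The pointsToPixels helper is purely illustrative, not lifted from any operating system:

```typescript
// Illustrative only: convert a type size in points (1 point = 1/72 inch)
// into on-screen pixels for a display model that assumes a given ppi.
function pointsToPixels(points: number, assumedPpi: number): number {
  return points * (assumedPpi / 72);
}

console.log(pointsToPixels(12, 72)); // 12 -> classic Mac model: 12 pixels for 12-point type
console.log(pointsToPixels(12, 96)); // 16 -> Windows model: 16 pixels for the same type
// The mismatch: 12 / 16 = 0.75, which is why early Mac browsers drew
// Windows-sized text at 75 percent of its intended size.
```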

Web standards groups eventually settled on 96 ppi for browsers. That remains the assumption behind Web formatting standards like CSS and its relative units, including typographic units like ems and exes, but also pixels. It’s tempting to think of pixels as a fixed unit; after all, computers and other devices can’t show anything smaller than a pixel, right? But as we’ve seen, not all pixels are created equal. Old-school pixels were 1/72nd of an inch (or even bigger!); new-school pixels can be smaller than 1/300th of an inch. So pixels are a relative unit: You can talk about how many pixels wide or tall something is on screen, but that doesn’t tell you anything about the physical size of the presentation. A 500 by 500-pixel graphic would measure almost 7 inches across on an old display like the ones used in the early days of the Web. Show that same image on the HTC Rezound and it will measure just under 1.5 inches square. Quite a difference!
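To put that in concrete terms, here’s a minimal sketch of the arithmetic; the figures are approximate and the helper function is just an illustration:

```typescript
// Illustrative only: the same pixel count yields very different physical
// sizes depending on the display's pixel density (ppi).
function physicalInches(pixels: number, ppi: number): number {
  return pixels / ppi;
}

console.log(physicalInches(500, 72).toFixed(2));  // "6.94" -> almost 7 inches on a 72 ppi display
console.log(physicalInches(500, 342).toFixed(2)); // "1.46" -> under 1.5 inches on a ~342 ppi phone
```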

[Image: MacBook Pro display comparison]

The Retina routine

From initial resolutions of 72 ppi (or even lower), display resolutions have been creeping up over the years. The biggest jumps arrived with flat-panel displays and notebooks. Although early notebook computers were still at or near 72 or 96 ppi, it wasn’t long before they cracked 100 ppi, and resolutions of 100 to 120 ppi have been mainstream for several years. There are exceptions: High-end notebooks aimed at gamers and power users have often offered high-resolution options, like 15.4-inch panels packing 1,920 by 1,200 pixels. That works out to about 147 ppi, or roughly twice the physical resolution of those original 72 ppi screens.

However, most of those high-resolution displays show the same Windows (or Mac OS, or Linux, or Android, or what-have-you) interfaces, just smaller. Folks buying those 15.4-inch notebooks with 1,920 by 1,200-pixel displays are usually doing it so they have much more screen real estate for their software, email, Web browsers, messaging, and (of course) games and movies. Everything is drawn using the same number of pixels, but those pixels are physically smaller. The result is a really long task bar (or menu bar, or dock, or what-have-you) along with tiny icons and controls. For some folks that’s a godsend; for others it’s a quick route to eyestrain.

Apple’s approach with its so-called Retina displays is different. Instead of using the same number of pixels to display icons, controls, and other interface elements, Apple is doubling the resolution of its displays and using twice as many pixels in each dimension to show those elements at roughly the same physical size. For example, an icon or other element that measures 128 by 128 pixels on a standard display is probably about an inch across, give or take. On a high-resolution display, Apple replaces that element with a graphic measuring 256 by 256 pixels (four times as many pixels), but the display still shows the element at about one inch across, give or take. The result, as anyone who has ogled a MacBook Pro with Retina display, new iPad, or iPhone 4 or 4S can attest, is a dramatic jump in perceived display quality. Instead of looking boxy and digital, applications take on a nearly photographic quality.
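In other words, an element keeps its logical size while its backing image gets scaled by a factor of two in each direction. Here’s a rough sketch of that bookkeeping; the names and structure are made up for illustration, not taken from Apple’s frameworks:

```typescript
// Rough sketch of the "same size, more pixels" idea: an element keeps its
// logical dimensions, but its backing image is scaled by the display's
// scale factor (2 on the displays discussed here). Names are illustrative.
interface ElementSize {
  logicalWidth: number;   // size the layout works in (stays the same)
  logicalHeight: number;
  pixelWidth: number;     // pixels actually drawn on screen
  pixelHeight: number;
}

function backingSize(logicalWidth: number, logicalHeight: number, scale: number): ElementSize {
  return {
    logicalWidth,
    logicalHeight,
    pixelWidth: logicalWidth * scale,
    pixelHeight: logicalHeight * scale,
  };
}

// A 128 x 128 icon on a 2x display is drawn from a 256 x 256 image:
// four times the pixels, roughly the same physical inch on screen.
console.log(backingSize(128, 128, 2));
```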

The fuzzy Web

[Image: MacBook Pro Retina vs. non-Retina display comparison]

Of course, these days most people spend inordinate amounts of time on the Internet, yet with the exception of Apple.com and a handful of Apple-centric sites, virtually no Internet sites or services have been updated to serve up high-resolution graphics. The same goes for apps: While apps like iPhoto and iMovie have been updated for use on high-resolution displays, everyday applications like Web browsers, Microsoft Word, Photoshop, and Twitter clients have not.

In theory, this isn’t a problem. That neat doubling trick means Apple’s high-resolution devices are fully compatible with older graphics. Sure, the new MacBook Pro display has a physical resolution of 2,880 by 1,800 pixels, but for older apps it thinks of itself as a standard 1,440 by 900-pixel display. When an app identifies itself as having high-resolution elements available, the Mac (or the new iPad, or the iPhone 4) will happily use them. But when an app doesn’t say anything, the devices use the regular graphics, just substituting four pixels for one. If a picture has a single red pixel, the high-res displays will use four (two across, two down) to display it. The result: There’s a red dot on screen at pretty much the same size the image (or interface) designer intended.
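In effect, the device performs a simple nearest-neighbor upscale: every source pixel becomes a 2 x 2 block of identical physical pixels. A toy sketch of that substitution (obviously not Apple’s actual code):

```typescript
// Toy illustration of pixel doubling: each pixel in a low-res image is
// repeated into a 2 x 2 block, so the image covers the same physical area
// on a 2x display without gaining any real detail.
type Pixel = [number, number, number]; // RGB

function pixelDouble(image: Pixel[][]): Pixel[][] {
  const doubled: Pixel[][] = [];
  for (const row of image) {
    const wideRow: Pixel[] = [];
    for (const px of row) {
      wideRow.push(px, px); // two across
    }
    doubled.push(wideRow, [...wideRow]); // two down
  }
  return doubled;
}

// A single red pixel becomes a 2 x 2 red block on screen.
console.log(pixelDouble([[[255, 0, 0]]]));
```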

This means that interfaces and graphics intended for today’s mainstream displays have about the same display quality on Apple’s high-resolution displays as they do on mainstream devices. However, they seem to suffer in comparison because they’re often shown side by side with elements designed for the high-resolution display, whether that’s an app in the background, interface elements, chrome around the display, or simply the shock of switching from a high-res-savvy app like iPhoto to one that hasn’t been updated, like (say) pretty much any website. In comparison, the standard-resolution graphics look blocky, chunky, and fuzzy. If there’s one thing Apple’s high-resolution displays have proven, it’s how quickly people get used to their quality.

Fighting the fuzzies

iPad "retina" display comparison

So here’s the bad news: There’s absolutely nothing Apple can do to make those millions upon millions of Web sites look any better on its high-res displays. Similarly, there’s no way Apple can force developers to upgrade their software to support the devices. (Of course, as time goes on, most of them will feel pressure to support high-res displays in their apps or risk losing business to developers of similar apps who do.)

However, that doesn’t mean all is lost — just that solutions will take time.

For one thing, not everything about websites necessarily looks fuzzy on Apple’s high-res displays. For instance, Web browsers (both in iOS and Mac OS X) tie back into the operating system to display text, and the operating system’s text-rendering engine will take advantage of a display’s full physical resolution. This means that most type (so long as it’s not embedded in images) should be as crisp as the high-res display allows, regardless of whether a particular browser has been upgraded to handle it. (This also applies to Web fonts, not just fonts available on a particular device.) At the very least, most people’s reading experiences will be enhanced.

It’s possible (though awkward) for website developers to detect whether a browser is connecting to their site from a high-resolution display, and opt to serve higher-resolution imagery to those devices. (If you’re a code jockey, Scott Gilbertson has an overview at Wired’s Webmonkey.) Techniques like these are what Apple uses on its own site: Users who connect from a plain-Jane computer get one version of the site, but pull up Apple.com from a high-resolution iPad or MacBook Pro and you get a whole different experience. However, those users will also notice considerably more strain on their broadband connections, and for folks surfing on a capped mobile data plan, all those additional bytes of high-resolution imagery can add up very quickly.
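For the curious, the client-side half of such a technique often looks something like the sketch below: check the pixel ratio the browser reports and swap in larger images on capable displays. The data-hires attribute and the @2x file-naming convention are assumptions for illustration, not any standard:

```typescript
// Client-side sketch: if the browser reports a device pixel ratio of 2 or
// more, swap marked images for larger "@2x" versions. The data-hires
// attribute and @2x file naming are illustrative conventions only.
function upgradeImagesForHighResDisplays(): void {
  const ratio = window.devicePixelRatio || 1;
  if (ratio < 2) {
    return; // standard display: keep the smaller, cheaper images
  }
  const images = document.querySelectorAll<HTMLImageElement>("img[data-hires]");
  images.forEach((img) => {
    // e.g. "photo.jpg" -> "photo@2x.jpg"
    img.src = img.src.replace(/(\.\w+)$/, "@2x$1");
  });
}

upgradeImagesForHighResDisplays();
```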

That bandwidth consumption is why, at a fundamental level, simply doubling the resolution of every image on a Web site isn’t the right way to “fix” the Web for high-resolution displays. Plenty of sites are already image-intensive: Think about flipping through an image album on Facebook or Flickr, or scanning through product images on Zappos, eBay, or another site. For most people, those sites and services are already slow and annoying to use. Potentially quadrupling the amount of bandwidth required to load every image element on those pages is going to make them even slower and more annoying to use. Plus, folks who don’t have high-resolution displays — and that’s still the vast majority of Internet users — won’t see any visual benefit at all, but will suffer all the additional slowness and bandwidth consumption.

Instead, what’s needed is a standardized way for Web browsers to identify themselves to Web sites as capable of handling high-resolution graphics, and to do so in a way that’s backward compatible with sites developed before high-resolution displays were even a possibility. It would also be nice if browsers put that capability under the control of users; for instance, it would be great if folks could configure mobile Web browsers not to request high-resolution imagery when using a 3G or 4G connection.
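No such standard exists yet, but one hypothetical stopgap shows the shape of the idea: have a script record the display’s pixel ratio, and the user’s preference, somewhere the server can see it, then let the server decide which images to send. Everything below, including the hires-images cookie name, is invented for illustration:

```typescript
// Hypothetical sketch only: record display density and a user preference in
// a cookie so the server can decide whether to send high-res imagery.
// The cookie name ("hires-images") and the preference flag are inventions
// for illustration; there is no standard behind them.
function advertiseDisplayCapability(userWantsHiResOnMobileData: boolean): void {
  const ratio = window.devicePixelRatio || 1;
  // Only ask for high-res imagery if the display can use it AND the user
  // hasn't opted out (say, to protect a capped mobile data plan).
  const wantsHiRes = ratio >= 2 && userWantsHiResOnMobileData;
  document.cookie = `hires-images=${wantsHiRes ? 1 : 0}; path=/`;
}

advertiseDisplayCapability(false); // user opted out: server keeps sending standard images
```

The point isn’t this particular trick; it’s that any real solution has to leave the final say, and the bandwidth bill, in the user’s hands.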

Once that mechanism is agreed upon and standardized, literally millions of Web developers around the world will need to make intelligent decisions about which elements of their sites benefit from high-resolution imagery and which do not. Event photographs, product imagery, and showcases? Sure: It makes sense to serve those in high resolution if a Web browser asks for it. Buttons, logos, faint background images and tiles, images with simple geometry? Those might be better served with alternate formats, or even with the same old images already on the site. Just because something can be high-res doesn’t mean it needs to be.

The bottom line here? It’s never going to happen. Just as we still occasionally see <blink> and <marquee> tags left over from the bad old days of the Netscape-Internet Explorer war, the Web is never going to completely convert to intelligently handling high-resolution displays. And just when we think the problem is mostly fixed, someone is going to come along with a new display technology (wearable goggles? holograms?) that disrupts the industry again. For folks eager to embrace high-res displays, take heart: It only gets better from here.
