The iPod. The iPhone. The iMac. The Macintosh. The LaserWriter. The MacBook. Mac OS X. The list goes on. Apple, the multinational corporation that began life in the mid-seventies in a garage trumpeting “Think Different” as its mantra, has concocted and released some of the most imaginative, groundbreaking, and iconic products of the digital age. It has continually set the tone for style and ease of use, and has harmonized the relationship between person and computer more gracefully than any other company on the planet.
But amongst all the smooth user interfaces, the slick innovations, the slicker marketing, and the rabid brand loyalty, this Apple does have its worms. Not everything Steve Jobs and company touched turned to gold, and some concepts were just plain rotten right from the start.
Inside this seemingly free-spirited, creative environment founded by two pseudo-hippies lies an us-against-the-world overtone and a certain conceit that just won’t go away. Granted, not everything that died prematurely did so out of pure arrogance. But the truth is that Apple doesn’t play particularly well with others. And that is both its greatest strength and its greatest flaw.
Let’s take a few bytes out of a company that lately seems to do very little wrong.
When a computer is so bad that you’re forced to replace 14,000 of them, you know you have an abject failure on your hands. Such was the case for the Apple III, the pricey (configurations started at roughly $4,500 and ran up to $8,000) business-oriented computer Apple introduced in 1980 as a big brother to its triumphant Apple II. But what business wanted a computer that grew so hot inside its fanless, cramped interior that chips would literally pop out of their sockets? Other, equally annoying problems surfaced with time, and to Apple’s credit, it not only replaced faulty early machines, but also redesigned and re-priced the device. It didn’t matter. Consumers rejected the Apple III en masse, purchasing just 75,000 of them over its three-year lifespan.
As the Apple III lay dying in 1983, the Apple Lisa was born. But in 1983, $10,000 was a hell of a lot of money. Heck, it still is today. Nevertheless, that’s the princely sum Apple asked customers to cough up for its impressively innovative but otherwise unspectacular Lisa.
Why on earth would Jobs and company put such a crushing price tag on a PC? It wasn’t particularly fast, and it wasn’t particularly well-endowed on the hardware front. Yet it had one thing virtually no personal computer before it could offer – a graphical user interface. This was huge news back in the text-based early ‘80s, as were Lisa’s cooperative multitasking and virtual memory. But that price? Ugh.
Apple halved the original price a year later for Lisa’s offspring, the Lisa II, and added numerous upgrades, but the first of many Macintosh success stories had already appeared by then, and IBM-based PCs were substantially more affordable. Lisa was dead and buried – literally, rumor has it, in the Utah desert – by 1985.
No licensing for you!
In a now-infamous 1985 letter to Apple CEO John Sculley, Microsoft’s Bill Gates suggested in pleasant-yet-no-uncertain terms that Sculley strongly consider licensing “Mac technology” to established mega-corporations such as Northern Telecom, AT&T, and Motorola. Presumably, Gates, an early Mac fan, was simply being a good guy who saw promise in the revolutionary graphical operating system and wanted to bring it out of its Apple-imposed shell.
Apple said no dice, yet subsequently signed an agreement permitting Microsoft to essentially copy elements of its OS. Gates and Microsoft soon released the first permutation of Windows, encouraging software developers the world over to get involved. Some would successfully argue that keeping everything in the family, so to speak, has helped Apple to stay…Apple. Yet today, Macs make up just eight percent of the total US PC market.
Scorned in comic strips such as Doonesbury, and suffering from a digital glandular problem that kept the unit nearly as bulky as a small notebook, Apple’s Newton – widely considered to be the first tangible PDA – wasn’t an immediate failure. Indeed, the Newton was considered by many to be ahead of its time. But its oft-ballyhooed handwriting recognition flopped initially (though it was later improved upon), and its price point, like that of so many other Apple products, kept it out of the hands of all but the wealthy. With a ton of third-party enhancements and an active online community, the device has done very well in the aftermarket. But such was deservedly not the case back in 1993.
Was it a computer, or a gaming system? In 1995, while Apple was in the throes of one of its worst-ever creative and financial periods, the Apple Bandai Pippin was the subject of exactly such questions. Though the unit featured the sleek, compact lines of a console and the branding of toymaker Bandai, it also housed a PowerPC processor and a dial-up modem, and was priced at roughly double that of competing game consoles from Nintendo, Sony, and Sega. Little wonder, then, that it was cross-promoted as both a television-feeding game machine and a low-cost PC – and that customers couldn’t tell which it was.
Potential customers, for the most part, did not bite, and the Pippin eventually drifted off to never-never land. But not without first showing its mainly Japanese audience that console gaming and network support are not necessarily exclusive concepts. Unaffordable, problematic, yet sporting a rather key innovation – the common threads of Apple’s biggest blunders run deep within the Pippin.
We figure any flick starring Robert De Niro, Harvey Keitel, Ray Liotta, Michael Rapaport, and, lest we forget, Sylvester Stallone is worthy of discussion anytime. Nevertheless, we’re not talking the 1997 police drama Cop Land; we’re talking Apple’s Copland (pronounced “Copeland”) – the little operating system that couldn’t.
Apple’s PR machine worked overtime back in 1996, promoting what was to be a strong opponent to Microsoft’s Windows 95 and the successor to the long-in-the-tooth Mac OS 7. Copland, named after composer Aaron Copland, was that successor: an OS that would purportedly offer improved multitasking, superior memory allocation, and a bunch of other futuristic building blocks. But Copland’s development proved erratic, its backward compatibility never really worked itself out, and early test releases were buggier than a summer vacation at the lake.
Copland was history by the time Apple purchased Steve Jobs’ NeXT Software in late ’96, welcomed Jobs back into the fold, and slowly began morphing the NeXT OS into the Mac universe.
Apple USB Mouse
In a classic case of style over substance – something not entirely unfamiliar to Apple historians – 1998’s Apple USB Mouse looked hyper-cool and certainly complemented the curvy, translucent aesthetics of the iMac G3. One problem: It didn’t work. Or at least not very well. Strike one: It was round and hard to grip. Strike two: It sported just one, somewhat vague button. Strike three: Nobody could tell which way it was pointing.
The so-called Puck Mouse resembled many other Apple disasters in that it boasted one highly redeeming characteristic, even on the road to oblivion. In this case, it was among the first mice to offer USB connectivity. Yet that wasn’t enough to stop numerous third-party vendors from unleashing a variety of adapter-like attachments to make the Puck Mouse marginally palatable. Two years later, it was gone.
Power Mac G4 Cube
We swear this is not a recording, but Apple’s Power Mac G4 Cube, first released in 2000, is yet one more example that shows how Apple has continued, throughout the years, to put a sky-high premium on aesthetics, often to the detriment of practicality. While it’s true a Cube once resided in The Museum of Modern Art, and while it’s also true that numerous home aquarium buffs have purchased used Cubes, merely to plunder the innards and replace them with tropical fish, the product simply wasn’t a popular computer.
It cost more than a Power Mac G4, yet shipped without a monitor. It sported tiny blemishes that looked like cracks in its otherwise stunning clear case. And it suffered from infamous quality control issues. Deprived of a cooling fan (Steve Jobs isn’t a fan of fans), the Cube sold miserably and was mercifully cancelled just a year after its debut.
Introduced in 2006 as Apple’s answer to the growing number of iPod docks and speaker systems permeating the retail market – including Bose’s popular SoundDock – the iPod Hi-Fi impressed many with its big bass and surprisingly powerful output. Indeed, early reviews proclaiming the unit a true sonic champ still litter the Internet. Yet not everything was peachy in Hi-Fi land.
An inauspicious start was a taste of things to come. Unveiled by Steve Jobs himself at a February 2006 Apple media event, the iPod Hi-Fi lost immediate luster simply because attendees were expecting far more significant announcements. And despite positive initial reviews, the system soon took criticism for the conspicuous absence of a built-in radio and carrying strap. Audiophiles soon piled on, dissing the unit for its muddy high frequencies. Moreover, users were forced to precariously position their iPods atop an otherwise featureless iPod Hi-Fi roof, like a single, vertical hair on a bald man’s head.
But arguably most damning was the price. At $349, Apple’s dock approached the price of some full-blown home theatre systems. Ultimately, the iPod Hi-Fi was unceremoniously discontinued just a year and a half after its launch.
Steve Jobs, almost flippantly, calls it a “hobby.” Others have substantially more wide-ranging opinions on Apple TV, calling it everything from satisfactory, to limiting, to a serious blunder. One thing is sure – the 40GB hard drive that originally shipped with the Apple TV at its 2007 launch was ridiculously small. So, seemingly, is the amount of effort Apple puts into updates, upgrades, and enhancements. A comparatively high $229 price tag doesn’t help, especially when other, similar-yet-cheaper devices (ahem…Roku) seem to do it and do it better (Netflix, anyone?). Even the venerable Xbox 360 has become competition.
Clearly, the set-top-box market and concept hasn’t sorted itself out yet. But still, one wonders whether Apple will even be a player a couple of years down the road.