Was Dennis Ritchie more important than Steve Jobs?


Computing pioneer Dennis Ritchie died this past weekend at age 70, becoming the second technology giant to pass away within a week — the other, of course, being Apple’s Steve Jobs. Although Jobs was unquestionably the better-known figure, Ritchie was the creator of the C programming language and one of the primary developers of the Unix operating system, both of which have had profound impacts on modern technology. Unix and C lie at the heart of everything from Internet servers to mobile phones, set-top boxes, and software. They have exerted tremendous influence on almost all current languages and operating systems. And, these days, computers are everywhere.

The coinciding events lead to an obvious question: Who was more important to modern technology, Ritchie or Jobs? It’s a classic apples-to-oranges question… but the search for an answer sheds a bit of light on what led to the high-tech revolution and all the cool toys we have today.

Dennis Ritchie, Unix, and C

Dennis Ritchie was a computer scientist in the truest sense: He earned a degree in physics and applied mathematics from Harvard in the 1960s and followed his father to work at Bell Labs, which was one of the hotbeds of tech development in the United States. By 1968 Ritchie had completed his Ph.D. work, and from 1969 to 1973 he developed the C programming language for use with the then-fledgling Unix operating system. The language was named C because it grew out of another language called B, created by Ken Thompson (with some input from Ritchie) for early Unix development after Bell Labs withdrew from Multics, a Unix precursor. So, yes, even the name is geeky.


Multics was built for room-filling mainframes, while Unix was developed for early minicomputers. Of course, minicomputers were “mini” in name only: Back in the early 1970s, a “minicomputer” was a series of cabinets that dominated a room, made more noise than an asthmatic air conditioner, and had a five- or six-figure price tag. The processing and storage capacities of those systems are utterly dwarfed by commonplace devices today: An average calculator or mobile phone has thousands-to-millions of times the storage and processing capability of those minicomputers. Minicomputers’ memory and storage constraints meant that, if you wanted to develop a multitasking operating system that could run several programs at once, you needed a very, very efficient implementation language.

Initially, that language was assembly: a low-level, processor-specific language with a nearly one-to-one mapping to machine language, the actual instructions executed by computer processors. (Basically, when people think of utterly incomprehensible screens of computer code, they’re thinking of assembler and machine code.) Ritchie’s C enabled programmers to write structured, procedural programs in a high-level language without sacrificing much of the efficiency of assembler. C offers low-level memory access, requires almost no run-time support from an operating system, and compiles in ways that map very well to machine instructions.

If that were all C did, it probably would have been little more than a fond footnote in the history of minicomputers, alongside things like CPL, PL/I, and ALGOL. However, the Unix operating system was designed to be ported to different hardware platforms, so C was also developed with portability in mind. The first versions of Unix were primarily coded in assembler, but by 1973 Unix had been almost completely rewritten in C. Portability turned out to be C’s superpower: Eventually, a well-written program in standard C could be compiled for an enormous range of computer hardware platforms with virtually no changes — in fact, that’s still true today. As a result, C compilers have been available for virtually every computer hardware platform for the last three decades, learning C is still a great way to get into programming for a huge number of platforms, and C remains one of the most widely used programming languages on the planet.

The popularity of C was tied tightly to the popularity of Unix, along with its many offshoots and descendants. Today, you see Unix’s DNA not only in the many distributions of Linux (like Red Hat and Ubuntu) but also at the core of Android as well as Apple’s iOS and Mac OS X. However, Ritchie made another tremendous contribution to C’s popularity as co-author, with Brian Kernighan, of The C Programming Language, widely known as the “K&R.” For at least two generations of computer programmers, the K&R was the definitive introduction not just to C, but to compilers and structured programming in general. The K&R was first published in 1978 and, despite being a slim volume, set the standard for excellence in both content and quality. And if you’ve ever wondered why almost every programming reference or tutorial starts out with a short program that displays “Hello world”… just know that tradition was popularized by the K&R.

To be sure, neither Unix nor C is beyond criticism: Ritchie himself noted that “C is quirky, flawed, and an enormous success.” Both C and Unix were developed for use by programmers and engineers, with brevity and efficiency in mind. There’s almost nothing user-friendly or accessible about either one. If you want to stun non-technical computer users into cowed silence, a Unix command prompt or a page of C code is guaranteed to do the job. C’s low-level power can also be its Achilles’ heel: For instance, C (and derivatives like C++) offers no bounds checking or other protection against buffer overflows — which means many of the security exploits common these days can often be traced back to C… or, at least, to programmers using C and its descendants. Good workmen don’t blame their tools, right?

But the simple fact is that Unix and C spawned an incredibly broad and diverse ecosystem of technology. Microcontrollers, security systems, GPS, satellites, vehicle systems, traffic lights, Internet routers, synthesizers, digital cameras, televisions, set-top boxes, Web servers, the world’s fastest supercomputers — and literally millions of other things… the majority descend from work done by Dennis Ritchie. And that includes a ton of computers, smartphones, and tablets — and the components within them.

Steve Jobs and the rest of us

Steve Jobs’ legacy is (and will continue to be) well-documented elsewhere: As co-founder and long-time leader of Apple, as well as a technology and business celebrity enveloped in a cult of personality, Jobs’ impact on the modern technology world is indisputable.

However, Jobs’ contributions are an interesting contrast to Ritchie’s. Ritchie was about a decade-and-a-half older than Jobs, and got started in technology at a correspondingly earlier date: When Ritchie started, there was no such thing as a personal computer. Although a perfectionist with a keen eye for design and usability — and, of course, a charismatic showman — Jobs was neither a computer scientist nor an engineer, and didn’t engage in much technical work himself.


There’s a well-known anecdote from Jobs’ pre-Apple days, when he was working at Atari to save up money for a trip to India. Atari gave Jobs the task of designing a simpler circuit board for its Breakout game, offering him a bonus of $100 for every chip he could eliminate from the design. Jobs’ response — not being an engineer — was to take the work to long-time friend and electronics hacker Steve Wozniak, offering to split the $100-per-chip bounty with him. The incident is illustrative of Jobs’ style. In creating products, Jobs didn’t do the work himself: He recognized opportunities, then got the best people he could find to work on them. Woz reportedly cut more than four dozen chips from the board.

Wozniak and Jobs founded Apple in 1976 (with Ronald Wayne), just as Unix was graduating from research-project status at AT&T to an actual product, and before the K&R was first published. But even then, Jobs wasn’t looking at the world of mainstream computing — at least, as it existed in 1976. Apple Computer (as it was known then) was about personal computers, which were essentially unknown at the time. Jobs realized there was a tremendous opportunity to take the technology that was then the realm of engineers and computer scientists like Ritchie — and, to be fair, Wozniak — and make it part of people’s everyday lives. Computers didn’t have to be just about numbers and payrolls, balance sheets and calculations. They could be entertaining, communication tools, even artful. Jobs didn’t see computers as empowering only large corporations and industry: They could empower small businesses, education, and everyday people. And, indeed, Apple Computer did jumpstart personal computers, with the Apple II essentially defining the industry — even if it was later eclipsed by IBM and IBM-compatible systems.

With the Apple Lisa and (much more successfully) the Apple Macintosh, Jobs continued to extend that idea. Unix and its brethren were inscrutable and intimidating; with the Macintosh, Jobs set out to make a “computer for the rest of us.” As we all know, the Macintosh redefined the personal computer as a friendly, intuitive device that was immediately fun and useful, and to which many users formed a personal connection. Macs didn’t just work, they inspired.

Jobs was forced out of Apple shortly after the Mac’s introduction — and, indeed, Apple spent many years literally building beige boxes while Microsoft worked on its own GUI — but his return to the company brought back the same values. With the original iMac, Macintosh design regained its flair. With the iPod, Apple melded technology and elegant design with many consumers’ obsession — popular music — and when Apple finally turned its attention to the world of mobile phones, the results were an undeniable success. It’s not much of an exaggeration to say that the bulk of the PC industry has been following Apple’s lead for at least the last dozen years (even longer, when it comes to notebooks), and Apple was never matched in the portable media player market. Similarly, Android might be the world’s leading smartphone platform, but there’s no denying the Apple iPhone is the world’s leading smartphone — and the iPad defined (and still utterly dominates) the tablet market.

As with the original Mac, the Apple II, and even that old Atari circuit board, Jobs didn’t do these things himself. He turned to the best people he could find and worked to refine and focus their efforts. In his later years, that involved remaking Apple Computer into Apple, Inc., and applying his razor-sharp sense for functionality, purpose, and design to a carefully selected range of products.

Who wins?

Dennis Ritchie eventually became head of Lucent Technologies’ System Software Research Department before retiring in 2007; he never led a multibillion-dollar corporation, sought the public eye, or had his every utterance scrutinized and re-scrutinized. Ritchie was by all accounts a quiet, modest man with a strong work ethic and a dry sense of humor. But his work played a key role in spawning the technological revolution of the last forty years — including technology on which Apple went on to build its fortune.

Conversely, Steve Jobs was never an engineer. Instead, his legacy lies in democratizing technology, bringing it out of the realm of engineers and programmers and into people’s classrooms, living rooms, pockets, and lives. Jobs literally created technology for the rest of us.

Who wins? We all do. And now, it’s too late to personally thank either of them.

The views expressed here are solely those of the author and do not reflect the beliefs of Digital Trends.
