
Does Tesla have blood on its hands? The complex morality of autonomy

We don’t live in a black and white world. Reality has a whole messy collection of grays too, not to mention stripes and plaids. And while we like simple answers that cut through the morass, real life doesn’t always provide them.

On Thursday the National Highway Traffic Safety Administration (NHTSA, pronounced “nit-sah”) announced plans to investigate the May 7 fatality of a Florida man behind the wheel of – but not driving – a Tesla Model S. The car has an Autopilot feature that allows it to take full control of highway driving, and during this accident, the car was in control.

So is Tesla at fault? The real answers are far from black and white.

Beta testing at 80 mph

Tesla’s Autopilot feature is a “beta” that’s disabled every time you turn the car off. This driver (and every driver who wants the feature) had to turn it on and click through the warnings. And there are many warnings. Among them is this one:

Warning: Traffic-Aware Cruise Control cannot detect all objects and may not detect a stationary vehicle or other object in the lane of travel. There may be situations in which Traffic-Aware Cruise Control does not detect a vehicle, bicycle, or pedestrian. Depending on Traffic-Aware Cruise Control to avoid a collision can result in serious injury or death.

Maybe the driver is responsible. That warning is pretty clear — but disclaimers are just that: disclaimers. You don’t get to absolve yourself of responsibility simply because you post a note saying you aren’t responsible. If a restaurant posted a sign saying “eat at your own risk,” would it be off the hook for food poisoning?

That said, what does “beta” mean in this context? Cars aren’t computers. We’re fine dealing with “beta” software on a computer, where crashes are as frequent as unpopped kernels in a bag of popcorn. Crashes on the highway don’t lead to rebooting; they lead to twisted metal. Given the potential outcomes, unfinished software shouldn’t be released to users.


A note on Tesla’s website carries more than a tinge of defensiveness, as though a project manager at the company were already preparing to be excoriated for the death. The blog post is titled “A Tragic Loss,” but it opens not with notes of sadness but with this comment on the incidence of collisions:

“This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles.”

It’s as if the company were saying, “Hey, we didn’t do it! Lots of people die every year!!” Only in the final paragraph of the note does the company acknowledge that “the customer who died in this crash had a loving family and we are beyond saddened by their loss… We would like to extend our deepest sympathies to his family and friends.”

Humans will be humans

A tale of two cars

David Weinberger, a senior researcher at Harvard’s Berkman Center, wrote an essay for us last year titled, “Should your self-driving car kill you to save a school bus full of kids?”

Today’s cars should do all they can to preserve the life of the driver and passenger, he argued, because that’s about as far as today’s tech can go. In the future, when cars are completely networked, they’ll know all about their passengers as well — at which point cars will need to make moral judgments.

Imagine this scenario: Two autonomous cars are about to crash, and the computers driving them can save either one, but not both. One has a 25-year-old mother in it. The other has a 70-year-old childless man in it. Do we program our cars to always prefer the life of someone young? Of a parent? Do we give extra weight to the life of a medical worker beginning a journey to an Ebola-stricken area, or a renowned violinist, or a promising scientist, or a beloved children’s author?

But it’s not Tesla’s fault, at least not completely. When Tesla enabled the Autopilot feature, people invariably posted videos of themselves jumping in the backseat while the car steered down the highway. One man was caught napping behind the wheel of his Tesla as the car blithely drove itself down the highway. Even in a fully autonomous vehicle, which Tesla doesn’t claim to manufacture, we should be awake and alert as 5,000 pounds of steel, leather, and batteries zips us along at 80 miles per hour.

Cars aren’t toys, and cars that can steer themselves and avoid obstacles shouldn’t turn us into passengers or children.

For another thing, records reveal that the driver had 8 speeding tickets in 6 years. In theory, a self-driving car could turn him into a better driver, one who obeys the speed limits and doesn’t change lanes recklessly. That’s in the future, of course, when cars are fully autonomous. Today’s cars are hardly smart enough.

Perhaps the trillion-dollar question in this case – “Is it Tesla’s fault?” — should be rephrased as, “How do you deal with human nature?”

It’s inevitable that people will act recklessly – the videos of people pulling stupid stunts are evidence of that. How do self-driving cars (and the people who program them) deal with that? Google has said it wants to make its cars drive more like humans. After all, human drivers expect other vehicles on the road to act as they would, and humans aren’t good drivers. Imagine if the car in front of you came to a full stop at that yellow light as it’s supposed to, rather than tearing through as you would. Would that catch you by surprise? A car that anticipates human foibles, and knows when it’s safer to continue through a yellow light, may actually reduce accidents.

A speed bump on the road to autonomy

The ultimate point of self-driving vehicles is just that: reducing accidents. Call them what they really are: collisions, and usually avoidable ones at that. More than a million people die every year in vehicle crashes, and the vast majority of them are caused simply because humans are human. We look at cell phones. We get distracted by others, our own thoughts, the radio, passing street signs, UFOs, whatever.

While this incident was a tragedy, it shouldn’t detract from the larger goal of reducing vehicular deaths. If designed right, computers will be much better drivers than we are – they never tire, they don’t get distracted, they come to a full stop and respect yellow lights. The road to complete autonomy for cars is potholed and full of obstacles. But let’s keep the destination in our sights.
