Should your self-driving car protect you at all times? What if that meant driving into a crowd of pedestrians rather than running into a bridge abutment? What if it meant driving off the road to avoid hitting one pedestrian?
When we leave the driving to our cars, we can’t do so without accepting that the car may face moral choices. With vehicles moving at speed, there are moments when avoiding a collision is no longer possible, and the choice lies between two or more harmful outcomes.
When humans are in control, events often unfold so quickly that there’s no time to make conscious moral decisions. But computers can “think” and make choices much faster than people can, and herein lies the challenge. Two questions arise. What should autonomous vehicles do when faced with two harmful choices? What would humans want the vehicles to do?
In a study reported in Science, researchers from the Department of Psychology and Social Behavior, University of California, conducted six online surveys of moral principles involving autonomous vehicles (AVs). The surveys were given between June and November 2015 to 1,928 participants.
When asked about harming many people versus harming the car’s passengers, respondents approved of vehicles deciding for the greater good — sacrificing the one or the few to save the many. However, that approval applied to other people’s cars. The same participants wanted their own autonomous cars to protect them and their passengers at all costs. They also opposed laws or regulations mandating self-sacrifice and indicated they would not buy AVs programmed to sacrifice their occupants for the greater good.
The U.S. National Highway Traffic Safety Administration (NHTSA) plans to publish its first set of guidelines for autonomous cars in July. The agency hopes to open the conversation and propose national rules rather than leave states to decide separately with no guidance. We already know that consumers are less enthusiastic about self-driving cars than automakers, insurance companies, and government groups are. If consumer buy-in for fully autonomous vehicles hinges on accepting utilitarian decision making — self-sacrifice for the good of the many — that buy-in may take even longer than expected.