
MIT is teaching self-driving cars how to psychoanalyze humans on the road

In March 2004, the U.S. Defense Advanced Research Projects Agency (DARPA) organized a special Grand Challenge event to test out the promise — or lack thereof — of current-generation self-driving cars. Entrants from the world’s top A.I. labs competed for a $1 million prize, their custom-built vehicles trying their best to autonomously navigate a 142-mile route through California’s Mojave Desert. It didn’t go well. The “winning” team managed to travel just 7.4 miles in several hours before shuddering to a halt. And catching fire.

A decade and a half later, a whole lot has changed. Self-driving cars have successfully driven hundreds of thousands of miles on actual roads. It’s non-controversial to say that humans will almost certainly be safer in a car driven by a robot than they are in one driven by a human. However, while there will eventually be a tipping point when every car on the road is autonomous, there’s also going to be a messy intermediate phase when self-driving cars will have to share the road with human-driven cars. You know who the problem parties are likely to be in this scenario? That’s right: the fleshy, unpredictable, sometimes-cautious, sometimes-prone-to-road-rage humans.


To try and solve this problem, researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have created a new algorithm intended to allow self-driving cars to classify the “social personalities” of other drivers on the road. In the same way that humans (often non-scientifically) try and ascertain the likely responses of other drivers when, say, negotiating an intersection, so the autonomous vehicles will attempt to figure out who they’re dealing with to avoid accidents on the road.


“We’ve developed a system that integrates tools from social psychology into the decision-making and control of autonomous vehicles,” Wilko Schwarting, a research assistant at MIT CSAIL, told Digital Trends. “It is able to estimate the behavior of drivers with respect to how selfish or selfless a particular driver appears to be. The system’s ability to estimate drivers’ so-called ‘Social Value Orientation’ allows it to better predict what human drivers will do and is therefore able to drive more safely.”

Social Value Orientation

On the whole, our driving frameworks function fairly well: giving priority to one driver over another, dividing us into directional lanes, and so on. But there are still plenty of more subjective moments when multiple parties have to figure out how to coordinate their efforts to complete a maneuver, sometimes at high speeds. Knowing whether you’re dealing with an impatient driver who’s going to cut you up or a patient one who’s going to wait or make way can mean the difference between a successful journey and a fraught fender bender. The fact that there are hundreds of thousands of lane-changing, merging, and right- or left-turn accidents each year in the United States alone shows that humans haven’t quite mastered this subtle art.

Social Value Orientation is part of the field of interdependent decision-making, which looks at the strategic interactions between two or more people. It is rooted in game theory, whose concepts were first outlined in a 1944 book by Oskar Morgenstern and John von Neumann titled Theory of Games and Economic Behavior.

The broad idea is essentially this: Agents have their own preferences, which can be ordered in terms of their utility (level of satisfaction). Within these parameters, they will act logically, according to those preferences. Translated into driving behavior: no matter how unpredictable the road might seem at rush hour, by knowing how altruistic, prosocial, egoistic or competitive the drivers around you might be, you can predict their behavior and complete your journey without incident.
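In the SVO literature, this trade-off between an agent's own reward and another agent's reward is commonly expressed as a single angle, with utility a weighted blend of the two. A minimal sketch of the idea (the function name and reward numbers are illustrative assumptions, not figures from the MIT work):

```python
import math

def svo_utility(reward_self: float, reward_other: float, phi: float) -> float:
    """Blend an agent's own reward with another agent's reward by the
    Social Value Orientation angle phi (radians).

    phi = 0      -> purely egoistic (only own reward counts)
    phi = pi/4   -> prosocial (equal weight to both)
    phi = pi/2   -> purely altruistic (only the other's reward counts)
    """
    return math.cos(phi) * reward_self + math.sin(phi) * reward_other

# A merge that saves this driver 5 seconds but costs another driver 3:
egoistic = svo_utility(5.0, -3.0, 0.0)            # 5.0: looks great
prosocial = svo_utility(5.0, -3.0, math.pi / 4)   # ~1.41: far less appealing
```

The same maneuver scores very differently for the two drivers, which is exactly why knowing a neighbor's angle helps predict whether they will take it.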

Social Behavior for Autonomous Vehicles

By observing the way that other cars drive, the MIT algorithm assesses other drivers on the “reward to others” vs. “reward to self” scale. That means sorting fellow road-dwellers into “altruistic,” “prosocial,” “egoistic,” “competitive,” “sadistic,” “sadomasochistic,” “masochistic,” and “martyr” categories. By learning that not all other cars behave in the same way, the team believe their model could prove a welcome addition to self-driving car systems.
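That scale can be pictured as positions on a ring, with each category occupying a range of angles. A toy classifier covering a few of those categories (the boundary angles are hypothetical illustrations, not the paper's calibrated values):

```python
import math

# Hypothetical SVO ring: phi measures how much weight a driver puts on
# others' reward relative to their own. Boundary angles are illustrative.
CATEGORY_BOUNDS = [
    (-math.pi / 8, "competitive"),   # below this: gains when others lose
    (math.pi / 8, "egoistic"),       # mostly self-interested
    (3 * math.pi / 8, "prosocial"),  # balances self and others
    (math.pi / 2, "altruistic"),     # mostly others-interested
]

def classify_svo(phi: float) -> str:
    """Map an SVO angle to a coarse social category."""
    for upper, label in CATEGORY_BOUNDS:
        if phi < upper:
            return label
    return "martyr"  # beyond altruistic: sacrifices own reward entirely

print(classify_svo(0.0))          # egoistic
print(classify_svo(math.pi / 4))  # prosocial
```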

“We trained the system first by modeling road scenarios where each driver tried to maximize their own utility and analyzing their most effective responses in light of the decisions of all other agents,” Schwarting said. “The utility incorporates how much a driver weights their own benefit against the benefit of another driver, weighted by the SVO. Based on that tiny snippet of motion from other cars, our algorithm could then predict the surrounding cars’ behavior as cooperative, altruistic, or egoistic during interactions. We calibrated the rewards based on real driving data with machine learning, essentially encoding how much human drivers value comfort, safety, or getting to their goal quickly.”
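The estimation step Schwarting describes can be caricatured in a few lines: if a driver's observed motion traded their own reward against another's in some ratio, the best-fitting SVO angle is simply the direction of that trade-off. The function and the reward numbers below are a hypothetical illustration, not the team's calibrated model:

```python
import math

def estimate_svo(observed_reward_self: float,
                 observed_reward_other: float) -> float:
    """Infer the SVO angle (radians) consistent with an observed trade-off.

    A driver maximizing cos(phi) * r_self + sin(phi) * r_other pushes
    outcomes toward the direction (cos(phi), sin(phi)), so the angle of
    the observed (r_self, r_other) pair is the natural estimate.
    """
    return math.atan2(observed_reward_other, observed_reward_self)

# A driver who gave up 2 seconds so another car could merge and save 4:
phi = estimate_svo(-2.0, 4.0)  # ~2.03 rad, well past pi/4: others-first
```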

Predicting the behavior of drivers

In tests, the team showed that their algorithm could predict the behavior of other cars 25% more accurately. This helped the vehicle know when it should wait at a left turn versus turning in front of an oncoming driver.

“It also allows us to decide how cooperative or egoistic an autonomous vehicle should be depending on the scenario,” Schwarting continued. “Acting overly conservative is not always the safest option because it can cause misunderstandings and confusion among human drivers.”


The team say that the algorithm is not yet ready for prime time when it comes to real-world road testing. But they are continuing to develop it, and think that its applications could extend even further beyond the one described here. For one thing, observing other cars could help future self-driving vehicles learn to exhibit more human-like traits that will be easier for human drivers to understand.

“[In addition], this could be useful not just for fully self-driving cars, but for existing cars that we use,” Schwarting said. “For example, imagine that a car suddenly enters your blind spot. With the system [we have developed], you might get a warning in the rearview mirror that the car in your blind spot has an aggressive driver, which could be particularly valuable information.”

Next, the researchers hope to apply the model to pedestrians, bicycles and other agents who may appear in driving environments. “We’d also like to look at other robotic systems that need to interact with us, such as household robots,” Schwarting noted.

Luke Dormehl
Former Digital Trends Contributor
I'm a UK-based tech writer covering Cool Tech at Digital Trends. I've also written for Fast Company, Wired, the Guardian…