It’s time to talk. Ford is calling on other automakers and technology companies to jointly develop a common language that self-driving cars can use to communicate with the world around them. The company stresses communication will play a key role in the safe, timely, and orderly deployment of autonomous technology.
“The idea that pedestrians, cyclists, and scooter users should change their behavior to accommodate self-driving cars couldn’t be further from our vision of how this technology should be integrated,” the company explained in a Medium post.
John Shutko, a human factors specialist for Ford’s self-driving vehicles program, explained that autonomous cars need to be able to tell other cars, whether driven by humans or software, when they’re coming to a stop, when they’re about to start moving again, and when they don’t intend to stop at all. The race to deploy this technology on public roads is getting increasingly crowded, so agreeing on a common code sooner rather than later is necessary.
Otherwise, self-driving cars made by Ford, BMW, Uber, Apple, and others will each use a different code to communicate, turning our roads into chaos. Imagine if a Toyota Corolla’s turn signals looked completely different from the ones on an Audi A3, or if Ford integrated the Fusion’s brake lights into the tires but other carmakers didn’t. Where you would need to look to figure out what another motorist is about to do would depend on the type of car in front of you.
“Having one, universal communication interface [that] people across geographies and age groups can understand is critical for the successful deployment of self-driving technology,” Shutko argued.
Ford isn’t waiting for rivals and partners to join its cause. It has already teamed up with the Virginia Tech Transportation Institute (VTTI) to create a light bar that lets autonomous cars communicate with the outside world, and it put together a fun stunt to show off what it learned. Mounted right above the windshield, the bar blinks rapidly to tell onlookers the car is about to accelerate from a full stop. It glows solid white when the car intends to continue on its path, and it flashes when the car is coming to a full stop (e.g., to let a pedestrian cross the road). The company says its research showed these basic signals are easy for other road users to understand: it took just two exposures for other motorists to learn the meaning of a single signal, and between five and 10 exposures to learn the full set.
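The light bar effectively defines a tiny signal vocabulary: three display patterns, each mapped to one vehicle intent. As a rough sketch of that mapping (the pattern and intent names below are illustrative, not Ford's or VTTI's terminology), it might look like this:

```python
from enum import Enum


class LightBarSignal(Enum):
    """Display patterns of the prototype light bar (names are hypothetical)."""
    RAPID_BLINK = "rapid blink"   # shown just before pulling away from a stop
    SOLID_WHITE = "solid white"   # shown while driving normally
    FLASHING = "flashing"         # shown while slowing to a full stop


# Each pattern maps to the intent described in the article.
SIGNAL_MEANING = {
    LightBarSignal.RAPID_BLINK: "about to accelerate from a full stop",
    LightBarSignal.SOLID_WHITE: "continuing on its current path",
    LightBarSignal.FLASHING: "coming to a full stop",
}


def describe(signal: LightBarSignal) -> str:
    """Translate a light-bar pattern into the intent it communicates."""
    return f"Vehicle is {SIGNAL_MEANING[signal]}."


print(describe(LightBarSignal.FLASHING))
```

The point of Ford's proposal is precisely that this table would be identical across every automaker, so a pedestrian who has learned the three patterns once can read any self-driving car's intent.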
It’s too early to tell who will join Ford in creating a common language, but it isn’t the only company working toward that goal. The 22nd-century-esque 360c concept Volvo introduced in September 2018 also explores ways robots can talk to humans, though it takes a more sophisticated, design-led approach than Ford’s proposal.