Headphones can seal us inside our own isolated sound bubbles, putting an invisible wall around wearers even in public spaces. At least, it can feel that way. In reality, the world doesn’t actually disappear when you put on your fancy AirPods Pro, as walking across a busy street without paying attention would quickly remind you.
Could machine intelligence help where human intelligence fails us?
That’s certainly what researchers from Columbia University hope. They have developed a Pedestrian Audio Warning System (PAWS) that seeks to alert headphone wearers to the threat posed by passing vehicles. The smart headphone technology uses machine learning algorithms to interpret vehicle sounds from up to 60 meters away, and can then provide information about the location of those vehicles. The result could be a major boon to pedestrian safety at a time when, tragically, more pedestrians than ever are being killed on roads in the United States.
The headphones used for the prototype system feature an array of low-cost microphones, located in different parts of the headset. The relevant sound features of possible cars are extracted by an onboard custom integrated circuit, which then sends them to a paired smartphone app. The smartphone uses machine learning algorithms to determine what is and is not a vehicle sound. The neural network it relies on was trained using audio from a wide range of both vehicles and environmental conditions.
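To make the pipeline concrete, here is a minimal sketch of the classification step described above. Everything in it is hypothetical: the feature set (coarse spectral band energies) and the fixed low-frequency threshold are stand-ins for the features the custom circuit extracts and the trained neural network the smartphone app actually runs.

```python
import numpy as np

SAMPLE_RATE = 16_000  # Hz; assumed for this sketch, not stated by the researchers

def band_energies(frame: np.ndarray, n_bands: int = 8) -> np.ndarray:
    """Split the frame's magnitude spectrum into coarse bands and return
    the total energy in each band -- a stand-in for the real features."""
    spectrum = np.abs(np.fft.rfft(frame))
    bands = np.array_split(spectrum, n_bands)
    return np.array([b.sum() for b in bands])

def looks_like_vehicle(frame: np.ndarray, low_band_ratio: float = 0.6) -> bool:
    """Toy classifier: engine and tire noise concentrate energy in the low
    bands. The actual system feeds its features to a neural network trained
    on many vehicles and environments, not a fixed threshold like this."""
    e = band_energies(frame)
    return e[:2].sum() / e.sum() > low_band_ratio
```

A low-frequency rumble triggers the detector while a high-pitched hiss does not, which is the rough shape of the decision the real model makes with far more nuance.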
The system is still far from complete. For one thing, it can only identify the approximate position of vehicles, not their trajectory. Being able to determine trajectory would be far more useful than assuming a static road state for vehicles that are, in reality, anything but static. Secondly, the researchers are still figuring out the best way to signal this information to wearers. One possibility would be warning beeps played on different sides of the stereo headphones to make clear exactly where a sound is coming from.
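The stereo-beep idea could be sketched as a simple panning rule. This is purely illustrative: the researchers have not settled on a cue design, and the function name, azimuth convention, and constant-power panning law below are all assumptions of this sketch.

```python
import math

def warning_pan(azimuth_deg: float) -> tuple[float, float]:
    """Map a vehicle's estimated bearing to (left, right) beep gains.
    Convention assumed here: -90 = hard left, 0 = straight ahead,
    +90 = hard right. Uses constant-power panning so perceived loudness
    stays roughly even as the cue sweeps across the stereo field."""
    clamped = max(-90.0, min(90.0, azimuth_deg))
    pan = (clamped + 90.0) / 180.0  # normalize to [0, 1]
    left = math.cos(pan * math.pi / 2)
    right = math.sin(pan * math.pi / 2)
    return left, right
```

A car approaching from the left would then play the beep almost entirely in the left earcup, shifting rightward as it passes.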
The PAWS project has already received a grant of $1.2 million from the National Science Foundation. According to IEEE Spectrum, the team is hoping to pass a “more refined” version of the tech over to a company that could bring it to market.