Ever wanted to know what animals are saying? Neuroscientists at the University of Delaware have taken a big leap forward in decoding the sounds made by one particular animal, getting closer than anyone has so far. The animal in question? The humble mouse.
To study mouse vocalizations, the team gathered data as groups of four mice — two males and two females — interacted. The mice interacted for five hours at a time in a chamber kitted out with eight microphones and a video camera; in total, the researchers recorded encounters between 44 mice. From the enormous amount of resulting video and audio data, the researchers then used machine learning to develop a system able to connect specific sounds with distinct animal behaviors. In short, it could work out which mouse was squeaking, where, and in what scenario.
“To link mouse vocalizations to specific actions, we needed multiple technological advances,” University of Delaware neuroscientist Joshua Neunuebel told Digital Trends. “First, we needed to be able to assign specific vocalizations to individual mice that were socially interacting. To do this, we developed a sound source localization system that simultaneously recorded mouse ultrasonic vocalizations on eight different microphones, as well as the position of the mice with a video camera.”
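The core idea behind sound source localization is that the same squeak reaches each microphone at a slightly different time, and those time differences pin down where the sound came from. The papers don't spell out the team's exact algorithm, so here is a minimal, hypothetical two-microphone sketch: it estimates the time difference of arrival by cross-correlation, then converts it to a position for a source on the line between the mics (the sample rate, mic spacing, and chirp are all made up for illustration).

```python
import numpy as np

def tdoa(sig_a, sig_b, fs):
    """Estimate how much later the sound arrived at mic B than at mic A,
    in seconds, via the peak of the cross-correlation."""
    corr = np.correlate(sig_b, sig_a, mode="full")
    lag = np.argmax(corr) - (len(sig_a) - 1)
    return lag / fs

# Simulate one ultrasonic "squeak" reaching two mics, with mic B
# hearing it 25 samples after mic A (hypothetical numbers throughout).
fs = 250_000                          # sample rate (Hz), ultrasonic-capable
t = np.arange(1024) / fs
chirp = np.sin(2 * np.pi * 70_000 * t) * np.hanning(1024)

sig_a = np.zeros(4096)
sig_b = np.zeros(4096)
sig_a[100:100 + 1024] = chirp
sig_b[125:125 + 1024] = chirp         # 25-sample delay

dt = tdoa(sig_a, sig_b, fs)           # ≈ 25 / fs seconds

# For a source on the line between mics at x=0 (A) and x=L (B):
# t_A = x/c, t_B = (L-x)/c, so dt = (L - 2x)/c  →  x = (L - c*dt)/2.
c = 343.0                             # speed of sound (m/s)
L = 0.60                              # hypothetical mic spacing (m)
x = (L - c * dt) / 2                  # estimated source position (m)
```

With eight microphones instead of two, the same pairwise delays over-determine the position in 2D, which is what lets the system say not just where the sound came from but which mouse on the video made it.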
The combination of microphones and camera allowed the team to estimate where a particular vocal signal was emitted and then assign it to a specific mouse. Once vocalizations could be assigned to individual animals, the team used an unsupervised learning algorithm — one that groups items with similar features — to categorize the calls. Finally, they used a tool called JAABA, the Janelia Automatic Animal Behavior Annotator, to automatically extract specific social behaviors with high fidelity.
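The article doesn't name the unsupervised algorithm the team used, but the idea — grouping calls with similar features without any labels — can be illustrated with a bare-bones k-means sketch. The two "call types" and the features (duration, mean pitch) are synthetic stand-ins, not real mouse data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-call features: [duration (ms), mean pitch (kHz)].
# Two synthetic call types stand in for recorded vocalizations.
short_high = rng.normal([20, 80], [3, 4], size=(50, 2))
long_low = rng.normal([90, 50], [8, 5], size=(50, 2))
calls = np.vstack([short_high, long_low])

def kmeans(X, k, iters=50, seed=0):
    """Minimal k-means: assign each point to its nearest centroid,
    then move each centroid to the mean of its assigned points."""
    r = np.random.default_rng(seed)
    centroids = X[r.choice(len(X), k, replace=False)]
    for _ in range(iters):
        dists = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
        labels = np.argmin(dists, axis=1)
        centroids = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
            for j in range(k)
        ])
    return labels, centroids

labels, centroids = kmeans(calls, k=2)
```

Because no one tells the algorithm what the categories are, the clusters it finds are candidate call types; a researcher can then check whether those types line up with specific behaviors.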
“It’s not necessarily a translational tool per se, but it’s a tool that helps us interpret mouse social behaviors,” Neunuebel said. “However, this being said, mice are good models for understanding the neural basis of social behavior, which may ultimately shed light upon how the brain circuitry of humans is functioning.”
The work is the subject of two new papers published in the journals Nature Neuroscience and Scientific Reports. The papers cover different aspects of the work: how communication shapes social behavior, and the neural networks that encode this information.
As Neunuebel says, the team hasn’t developed a full-fledged human-to-mouse translation tool. Nonetheless, this work — alongside similar research into the communication of animals like dolphins — certainly brings us closer to understanding the subtleties of animal chat.