Speech recognition and machine translation are two of the most useful everyday applications of artificial intelligence as it exists today. Both let a machine understand our words and turn them into something actionable, whether that is controlling a device like the Google Home smart speaker or holding a conversation with someone who speaks a different language. Could similar machine learning technology also help decipher a baby's cries, and in the process shed some light on exactly what it is they are trying to communicate?
The makers of a new free Android and iOS app called Chatterbaby certainly believe it can. Developed by researchers at the University of California, Los Angeles, the app is built on an algorithm that works out what each baby cry means and relays that information to parents. According to its creators, it does this with impressive accuracy, far better than the guesswork with which most first-time parents respond to their baby's crying.
“I have four children; this project came about after I realized that number three had cries that sounded remarkably similar to my first two babies,” Ariana Anderson, an assistant professor and the lead researcher on the UCLA project, told Digital Trends. “Since I am a statistician, I see patterns everywhere. I wanted to test whether the vocal patterns I could hear in my own children were present in other children as well. We decided to put this algorithm into our free Chatterbaby app not just to help parents of babies now, but to help them later as well when their children are older.”
To create the Chatterbaby app, Anderson and her fellow researchers started by uploading 2,000 audio samples of infant cries. They then used A.I. algorithms to distinguish between cries induced by pain, hunger, and fussiness.
“The training was done by extracting many acoustic features from our database of pre-labeled cries,” Anderson continued. “Pain cries were taken during vaccinations and ear piercings. We labeled other cries using parent nomination and a ‘mom panel’ consisting of veteran mothers who had at least two children. Only cries that received three unanimous ratings were used to train our algorithm, which changes and improves regularly. We used the acoustic features to train a machine learning algorithm to predict the most likely cry reason. Within our sample, the algorithm was about 90 percent accurate at flagging pain, and over 70 percent accurate overall.”
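The pipeline Anderson describes — extract acoustic features from pre-labeled cries, then train a classifier to predict the most likely cry reason — can be sketched in miniature. Chatterbaby's actual features and model are not public, so everything below is an illustrative assumption: the toy energy and zero-crossing features, the nearest-centroid classifier, and the synthetic tones standing in for recorded clips are not the team's method.

```python
import math
from collections import defaultdict

def extract_features(samples):
    """Two toy acoustic features: RMS energy (loudness) and
    zero-crossing rate (a rough proxy for pitch). A real cry
    classifier would extract many more."""
    n = len(samples)
    rms = math.sqrt(sum(s * s for s in samples) / n)
    zcr = sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0) / n
    return (rms, zcr)

def train_centroids(labeled_clips):
    """'Train' a nearest-centroid model: average the feature
    vectors of all clips that share a label (e.g. 'pain', 'fussy')."""
    sums = defaultdict(lambda: [0.0, 0.0, 0])
    for samples, label in labeled_clips:
        rms, zcr = extract_features(samples)
        acc = sums[label]
        acc[0] += rms
        acc[1] += zcr
        acc[2] += 1
    return {lab: (r / c, z / c) for lab, (r, z, c) in sums.items()}

def predict(centroids, samples):
    """Return the label whose centroid is closest to the clip's features."""
    rms, zcr = extract_features(samples)
    return min(centroids,
               key=lambda lab: (centroids[lab][0] - rms) ** 2
                             + (centroids[lab][1] - zcr) ** 2)

def make_tone(amplitude, freq_hz, rate=8000, n=800):
    """Synthetic sine-wave stand-in for a recorded cry clip."""
    return [amplitude * math.sin(2 * math.pi * freq_hz * t / rate)
            for t in range(n)]
```

Training on a loud, high-pitched "pain" tone and a quiet, low-pitched "fussy" tone, then classifying a new loud, high-pitched clip — `predict(train_centroids([(make_tone(1.0, 400), "pain"), (make_tone(0.2, 100), "fussy")]), make_tone(0.9, 380))` — returns `"pain"`.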
Anderson does, however, note that parents should still use their best judgment, “and remember that their brain and their instincts are far more powerful than any artificial intelligence algorithm.”
While Chatterbaby could be useful for many parents, especially first-timers, it could prove particularly helpful in certain scenarios. For example, it could alert parents who are deaf or hard of hearing that their baby is crying when their eyes are otherwise occupied.
It may also turn out to be a powerful tool for diagnosing autism at a younger age. At present, autism is typically diagnosed later in childhood, often around the age of three. A number of researchers have been working to predict autism as early as possible, since that would allow for earlier intervention. Anderson suggested that one early cue may be found by listening for unusual vocal patterns in infants.
Previous studies have shown promising results in detecting abnormal vocal patterns in at-risk children, but their sample sizes have been small. In an attempt to add more data to the pile, Chatterbaby offers a voluntary study that parents can opt into. Right now, it's still at an early stage, but long term it could provide valuable insights that allow an earlier diagnosis.
“By inviting people into our research study with the Chatterbaby cry translator, we can then follow them for six years and provide free screenings for autism which they can do from their own home,” Anderson continued. “If their children are higher risk, they can then go to their doctor for a full evaluation. We want to bring the lab to the participant, instead of the participant to the lab. By offering a free service in our app that is of high value to new parents, we believe we can connect with parents and improve our ability to identify risk factors for autism. We believe that a baby’s voice can be one of many risk factors to improve our understanding of autism.”