The concept of an AI psychoanalyst has been in circulation for decades, tracing all the way back to Joseph Weizenbaum’s ELIZA chatterbot in the 1960s. But now researchers from the University of Southern California are taking the idea to the next level, courtesy of a machine learning algorithm designed to analyze a person’s speech patterns and, in the process, help detect signs of depression.
The tool is part of an ongoing research project called SimSensei, a Kinect-powered virtual therapist able to “read” patients’ body language for signs of anxiety, nervousness, contemplation, and other emotional states.
More recently, however, the project has increasingly focused not just on understanding what patients say (as Apple’s Siri does, for instance), but on the manner in which they say it. “I’m not so interested in what people say, as how they say it,” Stefan Scherer, one of the researchers involved with the work, tells Digital Trends. “We’re focusing on aspects of speech like voice quality — from the timbre to the color of the voice: whether it’s a tense voice, a harsh voice, or a breathy voice. We want to pick up these changes and contextualize them.”
Scherer calls his work “behavioral analytics” and says that it’s all part of creating a more fully realized tool that can augment the abilities of a real therapist or physician. “It provides a different set of eyes and ears that they would not normally have available,” he says.
In a recent paper, the authors of the study explain: “Depressed patients often display flattened or negative affect, reduced speech variability and monotonicity in loudness and pitch, reduced speech, reduced articulation rate, increased pause duration, and varied switching pause duration. Further, depressed speech was found to show increased tension in the vocal tract and the vocal folds.” Such vocal cues may not be immediately picked up on by a human listener.
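To make the quoted markers concrete, here is a minimal sketch of how two of them — pause duration (as a fraction of low-energy frames) and loudness variability — could be computed from a raw waveform. This is purely illustrative and is not the researchers’ actual pipeline; the frame size and energy threshold are arbitrary assumptions, and a real system would use far richer features (pitch, articulation rate, voice quality).

```python
import numpy as np

def speech_markers(signal, sr, frame_ms=25, energy_thresh=0.02):
    """Compute two crude acoustic proxies: pause ratio (fraction of
    low-energy frames) and loudness variability (std. dev. of per-frame
    RMS). Parameter values are illustrative, not clinically validated."""
    frame_len = int(sr * frame_ms / 1000)
    n_frames = len(signal) // frame_len
    frames = signal[:n_frames * frame_len].reshape(n_frames, frame_len)
    rms = np.sqrt(np.mean(frames ** 2, axis=1))   # per-frame loudness
    pause_ratio = float(np.mean(rms < energy_thresh))
    loudness_var = float(np.std(rms))
    return pause_ratio, loudness_var

# Toy example: one second of "speech" (a tone) followed by one second
# of silence, so roughly half the frames should register as pauses.
sr = 16000
t = np.linspace(0, 1, sr, endpoint=False)
speech = 0.5 * np.sin(2 * np.pi * 200 * t)
silence = np.zeros(sr)
pause_ratio, loudness_var = speech_markers(np.concatenate([speech, silence]), sr)
print(pause_ratio)  # → 0.5
```

A production system would of course work on real recordings and compare these measures against a baseline for the same speaker, since absolute values vary widely between individuals.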
Looking forward, Scherer says he could see technology such as this being installed in smartphone apps, so that people can more objectively measure moods in a similar way to how the “Quantified Self” movement currently does health-tracking. “You could imagine people asking if they’ve done their 1,000 smiles in a day, or whether or not they are getting excited about things,” he says. “It could be used for both people suffering from depression but also for the general population.”