Their answer? An optimistic ‘yes.’
Over the past 21 years, Beyond Verbal has gathered an impressively broad dataset of 2.5 million emotion-tagged voice samples in upwards of 40 languages. The idea is that, working with its medical partners, the company can use machine learning and big data to examine the connection between vocal intonations and health issues.
“We are not calling what we’re doing right now diagnostics,” Yuval Mor, CEO of Beyond Verbal, told Digital Trends. “Instead, it’s about long-term monitoring and decision support systems. The diagnostic tools that a doctor can use right now can do a better job than we can with vocal markers. However, what we’re doing is to use voice abnormalities observed over time to say that your voice is significantly different from how it sounded yesterday, a week ago, or two months ago — and then to correlate those changes with specific things we can identify.”
The ability to recognize certain conditions based on voice is not new, nor does it always require a machine. Listening to a person speak can help reveal that they may be suffering from certain conditions, such as dementia or Parkinson’s disease. Machine learning tools, however, can go further — homing in on granular details that may not be readily observable to the human ear.
In particular, Beyond Verbal hopes to expand this recognition to medical and physiological conditions that are not associated with the brain.
“That’s where this work is becoming very fascinating, and that’s what we’ve been working with the Mayo Clinic over the past two years to do,” Mor said. This may include the diagnosis of coronary artery disease, for example, which Mor said is something that is being looked at very closely.
Right now, the so-called Beyond Health Research Platform is still rolling out, but Mor noted that collaborators are certainly thinking big with the project.
“The long-term vision is that everyone will have their own companion, a guardian angel, which could be anything from your mobile phone to your Amazon Echo,” he said. “It might be that it monitors your phone conversations, or that you speak to an app for 30 seconds every morning — and from that it gives you high-level indications. You either get a green light saying everything is okay, a yellow light saying it’s worth monitoring closely, or a red light telling you that something is significantly different and it can be mapped to a specific condition, so you should go see your doctor.”
Coming soon to a device near you? We certainly hope so. In the meantime, this is a fascinating piece of research we’ll be sure to keep our eyes on.