The Web and social applications have given us many, many ways to express ourselves. We can update statuses, share photos, and stream music, and it’s all shareable and analyzable, helping us understand the motivations behind the things we upload and connect with. So what if our emotions – the ones we keep inside instead of broadcasting – could be used the same way?
Beyond Verbal is trying to do just that by analyzing people’s moods, attitudes, and emotional characteristics – also known as personalities – in real time, using raw vocal intonations. “To understand these three things is to understand the dimension of emotions in human communications,” says Dan Emodi, Beyond Verbal’s VP for Marketing and Strategic Accounts. “Bundled into software, our technology allows devices and applications to understand not just what we type, click, say, or touch, but what we mean and how we feel.”
The science of emotions
Beyond Verbal’s software is backed by 18 years of research spearheaded by chief scientist and physicist Dr. Yoram Levanon and neuro-psychologist Dr. Lan Lossos. According to Emodi, the idea for Beyond Verbal came when Levanon became interested in how babies – who do not understand a single word – are able to figure out exactly what their caretakers feel toward them; that interest led him to search for the answer in the physical modulations of air pressure that carry the sounds around us. He and Lossos went on to study over 60,000 test subjects in at least 26 languages, and their success in extracting, decoding, and measuring human moods, attitudes, and personalities gave birth to what they are calling Emotions Analytics.
To demonstrate the potential of this technology, Beyond Verbal has created Moodies, the world’s first publicly available mood-recognition app for the Web. Anyone with a microphone built into their computer can record a 20-second clip of their own voice, and the app analyzes the speech for mood, attitude, and personality markers. Results are categorized into primary and secondary moods, the latter usually reflecting subliminal cues, and can be shared via Facebook, Twitter, or email.
Emodi went on to demonstrate the capability of Beyond Verbal’s software by analyzing The Guardian’s recent interview with Edward Snowden.
His speech pattern may seem pedantic to the untrained ear – he is, after all, an analyst – but the Beyond Verbal software was able to pick up on Snowden’s intense passion for his beliefs. For someone considered persona non grata by the U.S. government for his revelation of classified National Security Agency documents, he showed a lot of control and restraint, as well as confidence and a sense of pride.
On the other hand, when asked about his knowledge of PRISM, President Barack Obama’s response went from self-confident to one that exuded anger, rejection, and intolerance.
For our own hands-on purposes, here are a few YouTube videos of celebrity apologies we ran through the Moodies app, along with their corresponding results, using only the first 20 seconds of each. The results are impressive and seem like accurate assessments of the speakers’ attitudes. According to Emodi, Moodies users have reported accuracy of at least 80 percent, which is pretty good considering the many factors that can affect the cleanliness of a voice recording, such as background noise, multiple people speaking at once, and microphone quality and volume.
Chris Brown’s apology for abusing Rihanna:
Paula Deen’s tearful apology for racial slurs:
Reese Witherspoon’s apology for disorderly conduct:
Kanye West’s apology for interrupting Taylor Swift’s MTV VMAs acceptance speech:
Hugh Grant’s apology for canoodling with a prostitute:
The future of emotions analytics
Even from only a handful of examples, it’s obvious just how useful this type of technology can be. According to Emodi, the response since the launch a few months ago has been so overwhelming that the company has decided to make the technology available through an over-the-cloud API, partnering with third-party app developers and companies in the fields of consumer applications, mobile devices, and appliances rather than building solutions in-house alone. “Our goal is to work with our partners to introduce Emotions Analytics capabilities in practically any voice-powered device, system, or application out there, to introduce emotional understanding into every aspect of our lives, and to allow us to gain a deeper hold of context and meaning and achieve more in practically everything we do,” says Emodi.
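Emodi describes the offering only at a high level, so every endpoint, field name, and response key in the sketch below is a hypothetical placeholder rather than Beyond Verbal’s actual interface; it simply illustrates the general shape of an over-the-cloud emotions-analytics integration: a developer uploads a short voice clip and gets back mood classifications.

```python
import json

# All URLs, fields, and the response schema here are hypothetical
# placeholders -- Beyond Verbal's real API may look entirely different.
API_URL = "https://api.example.com/v1/emotions/analyze"

def build_analysis_request(audio_path, api_key):
    """Assemble request metadata for a hypothetical analysis endpoint."""
    return {
        "url": API_URL,
        "headers": {"Authorization": "Bearer " + api_key},
        "files": {"audio": audio_path},  # a ~20-second voice clip
    }

def parse_moods(response_body):
    """Pull primary and secondary moods out of a hypothetical JSON reply."""
    data = json.loads(response_body)
    analysis = data["analysis"]
    return analysis["primaryMood"], analysis["secondaryMood"]

# The kind of reply such a service might return:
sample = '{"analysis": {"primaryMood": "confidence", "secondaryMood": "restraint"}}'
primary, secondary = parse_moods(sample)
```

The appeal of this model for partners is that the heavy lifting – the acoustic analysis itself – stays in the cloud, so a device or app only needs to ship audio and read back a small structured result.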
One example Emodi offered is a safety app that detects distraction levels from the quality of your speech and lowers the speed of the vehicle you are driving accordingly. From there, the ideas just start flowing – emotions analytics will greatly affect how people experience media, business, recreation, relationships, and technology in general. An app employers can use to analyze the personality of job applicants. An app that helps you understand your significant other’s reactions. An app casting agents can use to choose the actor most believable for a particular role. An app that suggests music playlists depending on your mood. The possibilities are endless.
What’s next for Beyond Verbal
“Beyond Verbal hopes to create a new cloud-based economy that’s immediately available,” Emodi shares. He also confirms there are a lot of companies that have formally expressed intent to forge partnerships with Beyond Verbal, but he says he’d like to protect their privacy and defer to their personal launch schedules. What he did share is that the company has just secured an additional $1 million in its second seed round of funding, led by Winnovation, the start-up investment fund of industrialist Sami Sagol. Additionally, Winnovation’s co-founder and CEO Barak Ben-Eliezer is joining Beyond Verbal’s Board of Directors.
“The team at Beyond Verbal has developed a platform that rests on the cutting edge of understanding the underlying emotions that people project,” says Ben-Eliezer. “We couldn’t be more excited to invest in what we feel is a revolutionary brain science that is sure to impact numerous verticals all around the world, and are confident that the additional capital towards Beyond Verbal’s efforts will yield impressive forthcoming results.”
According to Beyond Verbal’s announcement, the recent funding will be used for further research and development of the company’s emotional analytics and artificial intelligence, as well as leveraging the growing interest surrounding the licensing of the platform’s cloud-based API to expand the technology’s capabilities, applications, and market reach.
“Following the flood of positive responses surrounding the launch of Beyond Verbal, we are looking to push the limits of the platform and further our growth in ways previously unforeseen,” says Yuval Mor, CEO of Beyond Verbal. “By continuing to innovate and unlock the potential of voice enabled emotional interpretation, we feel we can open up a new dimension of emotional understanding that will impact how people communicate with machines, as well as with each other.”
- Emotion-sensing A.I. is here, and it could be in your next job interview
- New algorithm could help diagnose depression by analyzing the tone of your voice
- Hands on with Moodsnap, the app that makes music from photos