We’ve all dreamt of having X-ray glasses that reveal things we cannot see. But what if we had a device that could show us something deeper, some truth about the people we meet? That is now a reality: a new kind of glasses uses real-time facial recognition technology to unmask a person’s true emotional state, reports Sally Adee in New Scientist. The implications of such a technology are profound, to say the least.
The special specs were developed by Rana el Kaliouby, a researcher at the University of Cambridge, UK, who wanted to help autistic people by giving them a way to read the emotional cues of the people they come in contact with each day, something their condition makes difficult.
El Kaliouby sought the help of fellow Cambridge associate and autism expert Simon Baron-Cohen (yes, he’s Borat‘s cousin). The two identified six independent facial expressions that convey our range of emotions: thinking, agreeing, concentrating, interested, confused, and disagreeing. The pair then hired actors to perform the various expressions, which volunteers were asked to interpret and describe. The majority description was deemed the most accurate one.
The glasses, developed by MIT electrical engineer Rosalind Picard, use “a camera the size of a rice grain connected to a wire snaking down to a piece of dedicated computing machinery about the size of a deck of cards,” writes Adee. The camera watches 24 “feature points” on a person’s face and feeds the data into software, which interprets the movements and micro-movements and compares them against a database of known expressions.
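To make the idea concrete, here is a minimal sketch of that last step, matching observed feature-point movements against a database of known expressions. This is not Picard’s actual software; the expression labels, the toy two-point vectors, and the nearest-neighbor matching are all illustrative assumptions (the real system tracks 24 facial feature points and uses far more sophisticated models).

```python
import math

# Hypothetical database: each expression maps to an averaged
# feature-point displacement vector. Only two (x, y) points are
# used here for brevity; the real system tracks 24 feature points.
EXPRESSION_DB = {
    "agreeing":    [0.8, 0.1, 0.7, 0.0],
    "confused":    [-0.2, 0.9, -0.3, 0.8],
    "disagreeing": [-0.7, -0.1, -0.6, 0.0],
}

def classify(observed):
    """Return the expression whose stored displacement vector is
    closest (by Euclidean distance) to the observed movements."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(EXPRESSION_DB, key=lambda label: dist(EXPRESSION_DB[label], observed))

# Movements close to the stored "agreeing" pattern
print(classify([0.75, 0.05, 0.65, 0.1]))  # -> agreeing
```

The nearest-neighbor lookup stands in for whatever classifier the real device uses; the point is simply that recognition reduces to comparing measured facial movements against stored reference patterns.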
Also built into the glasses are an earpiece and a light on the lens, both of which tell the wearer whether the person they are speaking with has a negative reaction to something that’s said. If everybody’s happy, the light flashes green; if things go sour, a red light appears on the lens. The team responsible for the glasses hopes to one day make an augmented reality version that displays computer-generated information directly on the lenses.
While the glasses were primarily developed for people with autism, the researchers found that most people are terrible at reading emotional cues: on average, their test subjects identified the correct emotion only 54 percent of the time. The glasses, while far from perfect, raise the odds of getting it right to 64 percent.
In addition to the glasses, other teams of scientists have made similar devices that can help us get a better read on each other. One, a patch worn on the chest, monitors a person’s social interactions and can tell them when they talk too much or too loudly; the team of MIT doctoral students who developed it call the patch the “jerk-o-meter.” Another, also developed by Picard, is software that uses a webcam to determine a person’s heart rate and other vital health information.
Combined, these devices could turn an ordinary person into an emotional-intelligence super-genius. And the technologies have already begun to attract the attention of private industry, so it may not be long before hiding what you’re feeling becomes next to impossible.