Scientists create glasses that expose what people are really feeling


We’ve all dreamt of having X-ray glasses that reveal things we cannot see. But what if a device could show us something deeper: some truth about the people we meet? That is now a reality, thanks to a new kind of glasses that use real-time facial recognition technology to unmask a person’s true emotional state, reports Sally Adee in New Scientist. The implications of such a technology are profound, to say the least.

The special specs were developed by Rana el Kaliouby, a researcher at the University of Cambridge, UK, who wanted to help autistic people by giving them a way to read the emotional cues of the people they meet each day, cues that their condition makes difficult to perceive.

El Kaliouby sought the help of fellow Cambridge associate and autism expert Simon Baron-Cohen (yes, he’s Borat‘s cousin). The two identified six independent facial expressions that express our range of emotions: thinking, agreeing, concentrating, interested, confused, and disagreeing. The pair then hired actors to perform the various expressions, which were shown to volunteers who were asked to describe their meaning. The majority description was deemed the most accurate one.

The glasses, developed by MIT electrical engineer Rosalind Picard, use “a camera the size of a rice grain connected to a wire snaking down to a piece of dedicated computing machinery about the size of a deck of cards,” writes Adee. The camera watches 24 “feature points” on a person’s face and feeds the data into software that interprets the movements and micro-movements, comparing them against a database of known expressions.
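In broad strokes, the matching step described above resembles nearest-neighbor classification: take the measured movement of the feature points and find the stored expression template it most closely resembles. Here is a minimal illustrative sketch of that idea; the function names, template vectors, and distance metric are all assumptions for the example, not details of the actual system.

```python
import math

# Toy "database of known expressions": one 24-value template per expression,
# where each value stands in for the movement of one facial feature point.
# These numbers are made up purely for illustration.
EXPRESSIONS = {
    "agreeing":    [0.1] * 24,
    "disagreeing": [-0.1] * 24,
    "confused":    [0.05, -0.2] * 12,
}

def classify(movements):
    """Return the known expression whose template is nearest (Euclidean distance)."""
    def distance(template):
        return math.sqrt(sum((m - t) ** 2 for m, t in zip(movements, template)))
    return min(EXPRESSIONS, key=lambda name: distance(EXPRESSIONS[name]))

print(classify([0.09] * 24))  # agreeing
```

A real system would of course track the points over time and weigh subtle micro-movements, but the core comparison-against-a-database step can be pictured this simply.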

Also built into the glasses are an earpiece and a light on the lens, both of which tell the wearer whether the person they are speaking with has reacted negatively to something that was said. If everybody’s happy, the light flashes green. If things go sour, a red light appears on the lens. The team responsible for the glasses hopes to one day make an augmented reality version that displays information from a computer on the lenses.

While the glasses were primarily developed for people with autism, the researchers found that most people are terrible at reading emotional cues; on average, their test subjects were able to identify the correct emotion only 54 percent of the time. The glasses, while far from perfect, raise the probability of getting it right to 64 percent.

In addition to the glasses, other teams of scientists have made similar devices that can help us get a better read on each other. One, a patch worn on the chest, monitors a person’s social interactions and can tell them when they talk too much or too loudly. The team of MIT doctoral students that developed it calls the patch the “jerk-o-meter.” Another, also developed by Picard, is software that uses a webcam to determine a person’s heart rate and other vital health information.

Combined, these devices can turn an ordinary person into an emotional IQ super-genius. And the technologies have already begun to attract the attention of private industry, so it may not be too long before hiding what you’re feeling becomes next to impossible.

Read the full New Scientist article here.

Andrew Couts
Former Digital Trends Contributor
Features Editor for Digital Trends, Andrew Couts covers a wide swath of consumer technology topics, with particular focus on…