Scientists create glasses that expose what people are really feeling

We’ve all dreamt of having X-ray glasses that reveal what we cannot see. But what if a device could show us something deeper, some truth about the people we meet? That is now a reality, with a new kind of glasses that use real-time facial recognition technology to unmask a person’s true emotional state, reports Sally Adee in New Scientist. The implications of such a technology are profound, to say the least.

The special specs were conceived by Rana el Kaliouby, a researcher at the University of Cambridge, UK, who wanted to help autistic people by giving them a way to read the emotional cues of the people they come into contact with each day, cues their condition makes difficult or impossible to read.

El Kaliouby sought the help of fellow Cambridge researcher and autism expert Simon Baron-Cohen (yes, he’s Borat‘s cousin). The two identified six facial expressions that convey a range of mental states: thinking, agreeing, concentrating, interested, confused, and disagreeing. The pair then hired actors to perform the various expressions, which volunteers were asked to describe; the description chosen by the majority was deemed the correct label for each expression.
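That labeling step is, in effect, a majority vote. Here is a minimal sketch, assuming each acted clip is shown to a panel of volunteers; the function and data below are illustrative, not the researchers’ actual code:

```python
from collections import Counter

# The six mental states the researchers identified.
EXPRESSIONS = ["thinking", "agreeing", "concentrating",
               "interested", "confused", "disagreeing"]

def majority_label(volunteer_labels: list[str]) -> str:
    """Return the description most volunteers agreed on for one clip."""
    label, _count = Counter(volunteer_labels).most_common(1)[0]
    return label

# Example: seven volunteers describe the same acted clip.
votes = ["confused", "thinking", "confused", "confused",
         "interested", "confused", "thinking"]
print(majority_label(votes))  # -> "confused"
```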

The glasses, developed by MIT electrical engineer Rosalind Picard, use “a camera the size of a rice grain connected to a wire snaking down to a piece of dedicated computing machinery about the size of a deck of cards,” writes Adee. The camera tracks 24 “feature points” on a person’s face and feeds the data into software that interprets the movements and micro-movements, comparing them against a database of known expressions.
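To make that pipeline concrete, here is a heavily simplified sketch: it assumes 24 tracked (x, y) feature points per frame, flattened into a vector and matched against stored expression templates by nearest-neighbour distance. The real system’s features and matching are surely more sophisticated; every name and the toy database below are illustrative assumptions.

```python
import numpy as np

NUM_POINTS = 24  # feature points tracked on the face

def frame_to_vector(points: np.ndarray) -> np.ndarray:
    """Flatten a (24, 2) array of feature-point coordinates."""
    assert points.shape == (NUM_POINTS, 2)
    return points.reshape(-1)

def classify(points: np.ndarray, database: dict[str, np.ndarray]) -> str:
    """Return the known expression whose stored template is closest."""
    vec = frame_to_vector(points)
    return min(database, key=lambda name: np.linalg.norm(vec - database[name]))

# Toy database: one random template per expression, standing in for
# real averaged feature-point configurations learned from the actors.
rng = np.random.default_rng(0)
database = {name: rng.normal(size=NUM_POINTS * 2)
            for name in ["thinking", "agreeing", "concentrating",
                         "interested", "confused", "disagreeing"]}

frame = rng.normal(size=(NUM_POINTS, 2))  # one frame of tracked points
print(classify(frame, database))
```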

Also built into the glasses are an earpiece and a light on the lens, both of which tell the wearer if the person with whom they are speaking has a negative reaction to something that’s said. If everybody’s happy, the light flashes green; if things go sour, a red light appears on the lens. The team responsible for the glasses hopes to one day make an augmented reality version that displays information from a computer on the lenses.
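The signaling itself amounts to a simple threshold. Assuming the software reduces the listener’s reaction to a single score between -1 (negative) and 1 (positive), which is a simplification rather than a detail from the article, the light logic might look like this:

```python
def feedback_light(reaction_score: float) -> str:
    """Map a reaction score in [-1, 1] to the in-lens light colour."""
    return "green" if reaction_score >= 0 else "red"

for score in (0.8, -0.3):
    print(f"reaction={score:+.1f} -> light: {feedback_light(score)}")
```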

While the glasses were developed primarily for people with autism, the researchers found that most people are terrible at reading emotional cues; on average, their test subjects identified the correct emotion only 54 percent of the time. The glasses, while far from perfect, raise the odds of getting it right to 64 percent.

In addition to the glasses, other teams of scientists have made similar devices that can help us get a better read on each other. One, a patch worn on the chest, monitors a person’s social interactions and can tell them when they talk too much or too loudly; the team of MIT doctoral students that developed it calls the patch the “jerk-o-meter.” Another, also developed by Picard, is software that uses a webcam to determine a person’s heartbeat and other vital signs.

Combined, these devices could turn an ordinary person into an emotional-IQ super-genius. And the technologies have already begun to attract the attention of private industry, so it may not be too long before hiding what you’re feeling becomes next to impossible.

Read the full New Scientist article here.

Andrew Couts