
Machines are getting freakishly good at recognizing human emotions

Until very recently, we’ve had to interact with computers on their own terms. To use them, humans had to learn inputs designed to be understood by the machine, whether that meant typing commands or clicking icons with a mouse. But things are changing. The rise of A.I. voice assistants like Siri and Alexa makes it possible for machines to understand humans the way they ordinarily communicate in the real world. Now researchers are reaching for the next Holy Grail: computers that can understand emotions.

Whether it’s Arnold Schwarzenegger’s T-800 robot in Terminator 2 or Data, the android character in Star Trek: The Next Generation, the inability of machines to understand and properly respond to human emotions has long been a common sci-fi trope. However, real-world research shows that machine learning algorithms are getting impressively good at recognizing the bodily cues we use to hint at how we’re feeling inside. And it could open up a whole new frontier of human-machine interaction.


Don’t get us wrong: Machines aren’t yet as astute as your average human when it comes to recognizing the various ways we express emotions. But they’re getting a whole lot better. In a recent test carried out by researchers at Dublin City University, University College London, the University of Bremen and Queen’s University Belfast, a combination of people and algorithms were asked to recognize an assortment of emotions by looking at human facial expressions.


The emotions included happiness, sadness, anger, surprise, fear, and disgust. While humans still outperformed machines overall (with an average accuracy of 73%, compared to between 49% and 62% depending on the algorithm), the scores racked up by the various systems tested showed how far they have come. Most impressively, happiness and sadness were the two emotions at which machines outperformed humans, simply by looking at faces. That’s a significant milestone.

Emotions matter

Researchers have long been interested in whether machines can identify emotion from still images or video footage. But it is only relatively recently that a number of startups have sprung up to take this technology mainstream. The recent study tested commercial machine classifiers of facial expressions developed by Affectiva, CrowdEmotion, FaceVideo, Emotient, Microsoft, MorphCast, Neurodatalab, VicarVision, and VisageTechnologies. All of these are leaders in the growing field of affective computing, a.k.a. teaching computers to recognize emotions.

The test was carried out on 938 videos, including both posed and spontaneous emotional displays. With six emotion types, the chance of a correct random guess would be around 16%.
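As a rough, purely illustrative sketch (in Python, using invented labels rather than the study’s data), here is how per-emotion accuracy and that six-way chance baseline can be computed:

```python
from collections import defaultdict

EMOTIONS = ["happiness", "sadness", "anger", "surprise", "fear", "disgust"]

def per_emotion_accuracy(true_labels, predicted_labels):
    """Return overall accuracy plus accuracy broken down by emotion."""
    correct_by_emotion = defaultdict(int)
    total_by_emotion = defaultdict(int)
    for truth, guess in zip(true_labels, predicted_labels):
        total_by_emotion[truth] += 1
        if truth == guess:
            correct_by_emotion[truth] += 1
    overall = sum(correct_by_emotion.values()) / len(true_labels)
    breakdown = {
        emotion: correct_by_emotion[emotion] / total_by_emotion[emotion]
        for emotion in EMOTIONS if total_by_emotion[emotion] > 0
    }
    return overall, breakdown

# With six emotion classes, random guessing lands near 1/6 (about 16.7%),
# the baseline against which the 49% to 62% machine scores are measured.
chance_level = 1 / len(EMOTIONS)

# Toy example: three clips with hypothetical labels (not from the study).
truth = ["happiness", "sadness", "anger"]
guesses = ["happiness", "sadness", "surprise"]
print(per_emotion_accuracy(truth, guesses), chance_level)
```

Comparing each classifier’s per-emotion breakdown against that baseline, and against the human scores, is essentially what the study reports at much larger scale.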

Damien Dupré, an Assistant Professor at Dublin City University’s DCU Business School, told Digital Trends that the work is important because it comes at a time when emotion recognition technology is becoming more relied upon.

“Since machine learning systems are becoming easier to develop, a lot of companies are now providing systems for other companies: mainly marketing and automotive companies,” Dupré said. “Whereas [making] a mistake in emotion recognition for academic research is, most of the time, harmless, stakes are different when implanting an emotion recognition system in a self-driving car, for example. Therefore we wanted to compare the results of different systems.”


The idea of controlling a car using emotion-driven facial recognition sounds, frankly, terrifying — especially if you’re the kind of person prone to emotional outbursts on the road. Fortunately, that’s not exactly how it’s being used. For instance, emotion recognition company Affectiva has explored the use of in-car cameras to identify emotion in drivers. It could one day be used to spot things like drowsiness or road rage, which might trigger a semi-autonomous car taking the wheel if a driver is deemed unfit to drive.
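To make the idea concrete, here is a minimal sketch of such a trigger. Everything in it is hypothetical: the thresholds, window size, and score format are assumptions, not anything Affectiva has published.

```python
from collections import deque

WINDOW_FRAMES = 90          # roughly 3 seconds of video at 30 fps (assumed)
DROWSINESS_THRESHOLD = 0.8  # hypothetical score above which a frame counts as drowsy

def should_engage_assistance(drowsiness_scores, window=WINDOW_FRAMES,
                             threshold=DROWSINESS_THRESHOLD, required_fraction=0.7):
    """Return True if drowsiness stays high across most of the recent window.

    `drowsiness_scores` is any iterable of per-frame scores in [0, 1],
    standing in for the output of an in-car emotion classifier.
    """
    recent = deque(maxlen=window)
    for score in drowsiness_scores:
        recent.append(score)
        if len(recent) == window:
            drowsy_frames = sum(1 for s in recent if s >= threshold)
            if drowsy_frames / window >= required_fraction:
                return True  # sustained drowsiness: hand control to the car
    return False
```

A production system would combine far more signals (eye closure, head pose, steering input) before taking an action as drastic as handing control to the car.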

Researchers at the University of Texas at Austin, meanwhile, have developed technology that curates an “ultra-personal” music playlist that adapts to each user’s changing moods. A paper describing the work, titled “The Right Music at the Right Time: Adaptive Personalized Playlists Based on Sequence Modeling,” was published this month in the journal MIS Quarterly. It describes using emotion analysis that predicts not just which songs will appeal to users based on their mood, but the best order in which to play them, too.
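The paper’s actual approach uses sequence modeling; as a much simpler stand-in, the sketch below greedily orders a made-up catalog of songs so that each track stays close to an evolving estimate of the listener’s mood. The (valence, energy) mood coordinates are hypothetical.

```python
import math

def mood_distance(a, b):
    """Euclidean distance between two (valence, energy) mood vectors."""
    return math.dist(a, b)

def order_playlist(songs, listener_mood):
    """Greedily order songs so each track stays close to the evolving mood.

    `songs` maps a title to a hypothetical (valence, energy) profile;
    the real paper learns song order with sequence models instead.
    """
    remaining = dict(songs)
    current = listener_mood
    playlist = []
    while remaining:
        title = min(remaining, key=lambda t: mood_distance(remaining[t], current))
        current = remaining.pop(title)  # let the chosen track nudge the mood estimate
        playlist.append(title)
    return playlist

# Toy usage with invented songs and mood coordinates.
catalog = {"Calm Tune": (0.2, 0.1), "Upbeat Track": (0.9, 0.8), "Mellow Song": (0.4, 0.3)}
print(order_playlist(catalog, listener_mood=(0.3, 0.2)))
```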


There are other potential applications for emotion recognition technology, too. Amazon, for instance, has recently begun to incorporate emotion tracking of voices into its Alexa assistant, allowing the A.I. to recognize when a user is showing frustration. Further down the line, there’s the possibility this could even lead to full-on emotionally responsive artificial agents, like the one in Spike Jonze’s 2013 movie Her.

In the recent study, emotion sensing was based on images. However, as some of these examples show, there are other ways that machines can “sniff out” the right emotion at the right time.


“People are generating a lot of non-verbal and physiological data at any given moment,” said George Pliev, founder and managing partner at Neurodata Lab, one of the companies whose algorithms were tested for the facial recognition study. “Apart from the facial expressions, there are voice, speech, body movements, heart rate, and respiration rate. A multimodal approach states that behavioral data should be extracted from different channels and analyzed simultaneously. The data coming from one channel will verify and balance the data received from the other ones. For example, when facial information is for some reason unavailable, we can analyze the vocal intonations or look at the gestures.”
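A minimal sketch of that multimodal idea, with hypothetical channel names and weights (not Neurodata Lab’s actual system), might look like this: each channel produces its own emotion probabilities, the channels are combined with reliability weights, and the weights are renormalized whenever a channel drops out.

```python
EMOTIONS = ["happiness", "sadness", "anger", "surprise", "fear", "disgust"]

# Hypothetical reliability weights per channel; a real system would learn these.
CHANNEL_WEIGHTS = {"face": 0.5, "voice": 0.3, "gesture": 0.2}

def fuse_channels(channel_predictions):
    """Combine per-channel emotion probabilities into one distribution.

    `channel_predictions` maps channel name -> {emotion: probability}.
    Channels that are unavailable (e.g. the face is occluded) are simply
    missing from the dict, and the remaining weights are renormalized,
    mirroring the fallback Pliev describes.
    """
    available = {c: w for c, w in CHANNEL_WEIGHTS.items() if c in channel_predictions}
    total_weight = sum(available.values())
    fused = {}
    for emotion in EMOTIONS:
        fused[emotion] = sum(
            weight / total_weight * channel_predictions[channel].get(emotion, 0.0)
            for channel, weight in available.items()
        )
    return fused

# Example: the camera is blocked, so only voice and gesture contribute.
voice = {"anger": 0.6, "sadness": 0.2, "happiness": 0.2}
gesture = {"anger": 0.5, "surprise": 0.3, "happiness": 0.2}
print(fuse_channels({"voice": voice, "gesture": gesture}))
```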

Challenges ahead?

However, there are challenges, as everyone involved agrees. Emotions are not always easy to identify, even for the people experiencing them.

“If you wish to teach A.I. how to detect cars, faces or emotions, you should first ask people what do these objects look like,” Pliev continued. “Their responses will represent the ground truth. When it comes to identifying cars or faces, almost 100% of people asked would be consistent in their replies. But when it comes to emotions, things are not that simple. Emotional expressions have many nuances and depend on context: cultural background, individual differences, the particular situations where emotions are expressed. For one person, a particular facial expression would mean one thing, while another person may consider it differently.”
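One simple way to quantify that ambiguity is to measure how often annotators agree with each other on a label. The sketch below (with invented annotations) scores each clip by the fraction of annotators who picked its most common label: for cars or faces that fraction sits near 1.0, while contested emotional expressions score much lower.

```python
from collections import Counter

def majority_agreement(annotations):
    """Fraction of annotators who picked the most common label for each clip.

    `annotations` maps a clip id to the list of labels different people gave it.
    Values near 1.0 mean the "ground truth" is uncontroversial (as with cars or
    faces); lower values reflect the ambiguity Pliev describes for emotions.
    """
    scores = {}
    for clip, labels in annotations.items():
        _, top_count = Counter(labels).most_common(1)[0]
        scores[clip] = top_count / len(labels)
    return scores

# Invented example: five annotators label two clips.
print(majority_agreement({
    "clip_smile": ["happiness"] * 5,                      # full agreement -> 1.0
    "clip_raised_brow": ["surprise", "surprise", "fear",
                         "anger", "surprise"],            # contested label -> 0.6
}))
```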

Dupré agrees with the sentiment. “Can these systems [be guaranteed] to recognize the emotion actually felt by someone?” he said. “The answer is not at all, and they will never be! They are only recognizing the emotion that people are deciding to express — and most of the time that doesn’t correspond to the emotion felt. So the take-away message is that [machines] will never read … your own emotion.”

Still, that doesn’t mean the technology isn’t going to be useful, or stop it from becoming a big part of our lives in the years to come. And even Dupré leaves himself slight wiggle room on his prediction that machines will never read our true emotions: “Well, never say never,” he noted.

The research paper, “Emotion recognition in humans and machine using posed and spontaneous facial expression,” is available to read online.

Luke Dormehl
Former Digital Trends Contributor