
This A.I. literally reads your mind to re-create images of the faces you see

Do you see what I see? Harnessing brain waves can help reconstruct mental images

Google’s artificial intelligence technology may sometimes seem like it’s reading our minds, but neuroscientists at Canada’s University of Toronto Scarborough are literally using A.I. for that very purpose: reconstructing images of the faces a person sees from brain activity recorded by electroencephalography (EEG).


In a test, subjects were hooked up to EEG brainwave-reading equipment and shown images of faces. Their brain activity was recorded as they viewed the pictures and then analyzed using machine learning algorithms. Impressively, the researchers were able to use this information to digitally re-create the face image held in each person’s mind. Unlike basic shapes, faces involve a high degree of fine-grained visual detail, which makes their reconstruction a notable demonstration of the technology’s sophistication.
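To make the general idea concrete, here is a minimal, heavily simplified sketch of an EEG-to-image reconstruction pipeline. It is not the researchers’ actual method: it simply learns a regularized linear mapping from flattened EEG epochs to a low-dimensional image code (PCA components of the face images) and then inverts that code back into pixels for held-out trials. All data, array shapes, and model choices below are synthetic and purely illustrative.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical dimensions: 400 trials of 64-channel EEG, 128 samples per epoch,
# each paired with a 32x32 face image (all values here are random stand-ins).
n_trials, n_channels, n_samples = 400, 64, 128
img_h, img_w = 32, 32
eeg = rng.standard_normal((n_trials, n_channels, n_samples))
faces = rng.random((n_trials, img_h * img_w))

# 1) Flatten each EEG epoch into a feature vector. A real pipeline would use
#    band-filtered, time-resolved, or spatially filtered features instead.
X = eeg.reshape(n_trials, -1)

# 2) Compress the face images into a low-dimensional code.
pca = PCA(n_components=50)
y = pca.fit_transform(faces)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# 3) Learn a regularized linear mapping from EEG features to the image code.
model = RidgeCV(alphas=np.logspace(-2, 4, 13)).fit(X_train, y_train)

# 4) Reconstruct held-out faces by predicting their code and inverting the PCA.
reconstructed = pca.inverse_transform(model.predict(X_test))
print(reconstructed.shape)  # (80, 1024): each row reshapes to a 32x32 image

With real recordings, the EEG features and the image representation would be far more carefully engineered, but the train-on-seen-trials, reconstruct-on-held-out-trials structure is the core of this kind of study.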


While this isn’t the first time that A.I. has been used to read people’s minds, it’s the first time this has been achieved using EEG data. Previous studies relied on fMRI technology, which measures brain activity by detecting changes in blood flow. One of the most exciting differences between the two techniques is that EEG is far more portable and inexpensive, and it can capture neural responses with millisecond-level timing precision.

The technology could potentially be used by law enforcement to create more accurate depictions of a suspect’s likeness from eyewitness accounts. Currently, that information is relayed to a sketch artist through verbal descriptions, which can reduce its accuracy. It might also help people who lack the ability to communicate verbally: the EEG technology could be employed to produce a neural-based reconstruction of what a person is perceiving at any given moment, or to visualize memories and imagined scenes, giving them another way to express themselves.

In the future, the team hopes to build on this work by examining how effectively images can be reconstructed from EEG data based on a person’s memory of an event. They also want to move beyond faces to explore whether they can re-create accurate images of other objects.

A paper describing the work, titled “The Neural Dynamics of Facial Identity Processing: Insights from EEG-Based Pattern Analysis and Image Reconstruction,” was recently published in the journal eNeuro.

Luke Dormehl
Former Digital Trends Contributor
Futuristic new appliance uses A.I. to sort and prep your recycling

Given the potential planet-ruining stakes involved, you’d expect that everyone on Earth would be brilliant at recycling. But folks are lazy and, no matter how much we might see footage of plastic-clogged oceans on TV, the idea of sorting out the plastic, glass, and paper for the weekly recycling day clearly strikes many as just a little bit too much effort.

A.I. teaching assistants could help fill the gaps created by virtual classrooms

There didn’t seem to be anything strange about the new teaching assistant, Jill Watson, who messaged students about assignments and due dates in professor Ashok Goel’s artificial intelligence class at the Georgia Institute of Technology. Her responses were brief but informative, and it wasn’t until the semester ended that the students learned Jill wasn’t actually a “she” at all, let alone a human being. Jill was a chatbot, built by Goel to help lighten the load on his eight other human TAs.

"We thought that if an A.I. TA would automatically answer routine questions that typically have crisp answers, then the (human) teaching staff could engage the students on the more open-ended questions," Goel told Digital Trends. "It is only later that we became motivated by the goal of building human-like A.I. TAs so that the students cannot easily tell the difference between human and A.I. TAs. Now we are interested in building A.I. TAs that enhance student engagement, retention, performance, and learning."

This basic human skill is the next major milestone for A.I.

Remember the amazing, revelatory feeling when you first discovered the existence of cause and effect? That’s a trick question. Kids start learning the principle of causality as early as eight months old, which helps them make rudimentary inferences about the world around them. But most of us don’t remember much before the age of around three or four, so the important lesson of “why” is something we simply take for granted.

It’s not only a crucial lesson for humans to learn, but also one that today’s artificial intelligence systems are pretty darn bad at. While modern A.I. is capable of beating human players at Go and driving cars on busy streets, this is not necessarily comparable with the kind of intelligence humans might use to master these abilities. That’s because humans -- even small infants -- possess the ability to generalize by applying knowledge from one domain to another. For A.I. to live up to its potential, this is something it also needs to be able to do.
