
Mind-reading A.I. analyzes your brain waves to guess what video you’re watching

Neural networks taught to "read minds" in real time

When it comes to things like showing us the right search results at the right time, A.I. can often seem like it's darn close to being able to read people's minds. But engineers at Russian robotics research company Neurobotics Lab have shown that artificial intelligence really can be trained to read minds, guessing what videos users are watching based on their brain waves alone.

“We have demonstrated that observing visual scenes of different content affects the human brain waves, so that we can distinguish the scene categories from [one another] by analyzing the corresponding EEG (electroencephalogram) signal,” Anatoly Bobe, an engineer of Neurorobotics Lab in Moscow, told Digital Trends. “We [then] created a system for reconstructing the images from EEG signal features.”


The researchers trained the A.I. by showing it video clips of different objects alongside the brain wave recordings of the people watching them. This allowed the deep learning neural network to learn the features commonly seen in brain wave activity when people were viewing particular types of video content. They then validated the model by having test subjects don EEG caps and recording their brain activity as they watched video clips ranging from people on jet skis to nature sequences to human expressions. In 210 of 234 attempts, the A.I. was able to categorize and appropriately tag the brain activity.
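The pipeline described above can be sketched at a high level: extract feature vectors from EEG windows, train a classifier on labeled examples, then score held-out recordings. This is a minimal illustrative sketch, not Neurobotics Lab's actual model; the feature counts, category count, and synthetic data are all assumptions standing in for real EEG recordings.

```python
import numpy as np

rng = np.random.default_rng(0)

N_CATEGORIES = 5       # hypothetical video-content categories (jet skis, nature, faces, ...)
N_FEATURES = 32        # per-window EEG features (e.g. band powers) -- assumed size
SAMPLES_PER_CAT = 60

# Synthetic stand-in for EEG feature windows: each category gets its own mean.
means = rng.normal(0, 1, (N_CATEGORIES, N_FEATURES))
X = np.vstack([rng.normal(means[c], 0.5, (SAMPLES_PER_CAT, N_FEATURES))
               for c in range(N_CATEGORIES)])
y = np.repeat(np.arange(N_CATEGORIES), SAMPLES_PER_CAT)

# Shuffle and split into training and test sets.
idx = rng.permutation(len(X))
split = int(0.8 * len(X))
train, test = idx[:split], idx[split:]

# Softmax (multinomial logistic) classifier trained with gradient descent.
W = np.zeros((N_FEATURES, N_CATEGORIES))
b = np.zeros(N_CATEGORIES)
for _ in range(300):
    logits = X[train] @ W + b
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    p = np.exp(logits)
    p /= p.sum(axis=1, keepdims=True)
    onehot = np.eye(N_CATEGORIES)[y[train]]
    grad = (p - onehot) / len(train)              # cross-entropy gradient
    W -= 0.5 * X[train].T @ grad
    b -= 0.5 * grad.sum(axis=0)

# Score held-out windows, analogous to the 210-of-234 evaluation.
pred = (X[test] @ W + b).argmax(axis=1)
accuracy = (pred == y[test]).mean()
print(f"test accuracy: {accuracy:.2f}")
```

A real system would replace the synthetic features with ones computed from EEG recordings, and the study used a deep network rather than this linear classifier, but the train/evaluate structure is the same.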

“It cannot reconstruct the actual things that a subject sees or imagines, only some related images of the same category,” Bobe explained.

Bobe said that Neurobotics Lab appears to be the first research group to reconstruct video stimuli from EEG signals in this way. However, it is not the first group to explore A.I.-driven mind-reading technology; we have covered a number of related research projects in the past. Many of these, though, have focused on fMRI analysis rather than EEG. As Bobe pointed out, "fMRI signals contain much more information on brain processes than EEG." But a drawback of fMRI is that it requires large and expensive equipment found only in clinics, and its poor time resolution makes real-time results difficult to obtain. EEG, while a noisier and less reliable signal, is far easier to use. This could make it more practical in real-world BCI (brain-computer interface) applications.

“Our system can be used in, for example, post-stroke rehabilitation, when a person needs either to exercise his brain in order to regain his cognitive abilities or needs to send mental commands through an EEG interface,” Bobe said. “Our system acts as a training system, in which a subject can train to generate mental commands, and use the reconstructed images as native feedback which shows how well he is doing with this task.”
