
Mind-reading A.I. analyzes your brain waves to guess what video you’re watching

Neural networks taught to "read minds" in real time

When it comes to things like showing us the right search results at the right time, A.I. can often seem like it's darn close to being able to read people's minds. But engineers at Russian robotics research company Neurobotics Lab have shown that artificial intelligence really can be trained to read minds — and guess what videos users are watching based on their brain waves alone.

“We have demonstrated that observing visual scenes of different content affects the human brain waves, so that we can distinguish the scene categories from [one another] by analyzing the corresponding EEG (electroencephalogram) signal,” Anatoly Bobe, an engineer at Neurorobotics Lab in Moscow, told Digital Trends. “We [then] created a system for reconstructing the images from EEG signal features.”

The researchers trained the A.I. by showing it video clips of different objects, alongside the brain wave recordings of the people watching them. This allowed the deep learning neural network to learn the features commonly seen in brain wave activity when people were viewing particular types of video content. They then tested their model by getting subjects to don EEG caps and record their brain activity as they watched video clips ranging from people on jet skis to nature sequences to human expressions. In 210 of 234 attempts, the A.I. was able to correctly categorize and tag the brain activity.
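The pipeline described above — extract features from each EEG trial, train a classifier on labeled trials, then score it on held-out trials — can be sketched in miniature. This is an illustrative toy, not the Neurorobotics Lab model: real systems extract spectral and temporal features from multi-channel EEG, whereas here each video category is simulated as a hypothetical Gaussian cluster of feature vectors, classified with a small softmax model trained by gradient descent.

```python
import numpy as np

rng = np.random.default_rng(0)
N_CATEGORIES = 3      # e.g. "jet skis", "nature", "faces" (illustrative)
N_FEATURES = 16       # hypothetical per-trial EEG feature dimension
N_TRAIN, N_TEST = 300, 90

def make_trials(n):
    """Simulate n labeled EEG trials: one Gaussian cluster per category."""
    labels = rng.integers(0, N_CATEGORIES, n)
    centers = np.eye(N_CATEGORIES, N_FEATURES) * 3.0   # separated means
    feats = centers[labels] + rng.normal(size=(n, N_FEATURES))
    return feats, labels

# Train a linear softmax classifier with plain gradient descent.
X, y = make_trials(N_TRAIN)
W = np.zeros((N_FEATURES, N_CATEGORIES))
onehot = np.eye(N_CATEGORIES)[y]
for _ in range(200):
    logits = X @ W
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)                  # softmax probs
    W -= 0.1 * X.T @ (p - onehot) / N_TRAIN            # gradient step

# Score on held-out trials, mirroring the 210-of-234 style evaluation.
Xt, yt = make_trials(N_TEST)
pred = (Xt @ W).argmax(axis=1)
accuracy = (pred == yt).mean()
print(f"held-out accuracy: {accuracy:.2f}")
```

The study's reported 210/234 corresponds to roughly 90% accuracy on real EEG data; the toy above reaches a similar range only because its synthetic clusters are cleanly separated.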

“It cannot reconstruct the actual things that a subject sees or imagines, only some related images of the same category,” Bobe explained.

Bobe said that Neurobotics Lab appears to be the first research group to demonstrate this kind of reconstruction from EEG signals recorded during video viewing. However, it is not the first group to explore A.I.-driven mind-reading technology. We have covered a number of related research projects in the past. Many of these, though, have focused on fMRI analysis rather than EEG. As Bobe pointed out, “fMRI signals contain much more information on brain processes than EEG.” But a drawback of fMRI is that it requires large and expensive equipment that is only found in clinics, and its poor time resolution makes real-time results difficult to obtain. EEG, though a noisier and harder-to-interpret signal, is far easier to record. This could make it more practical in real-world BCI (brain-computer interface) applications.

“Our system can be used in, for example, post-stroke rehabilitation, when a person needs either to exercise his brain in order to regain his cognitive abilities or needs to send mental commands through an EEG interface,” Bobe said. “Our system acts as a training system, in which a subject can train to generate mental commands, and use the reconstructed images as native feedback which shows how well he is doing with this task.”

Luke Dormehl
I'm a UK-based tech writer covering Cool Tech at Digital Trends. I've also written for Fast Company, Wired, the Guardian…