When it comes to things like showing us the right search results at the right time, A.I. can often seem like it’s darn close to being able to read people’s minds. But engineers at Russian robotics research company Neurobotics Lab have shown that artificial intelligence really can be trained to read minds — and guess which videos users are watching based on their brain waves alone.
“We have demonstrated that observing visual scenes of different content affects the human brain waves, so that we can distinguish the scene categories from [one another] by analyzing the corresponding EEG (electroencephalogram) signal,” Anatoly Bobe, an engineer of Neurorobotics Lab in Moscow, told Digital Trends. “We [then] created a system for reconstructing the images from EEG signal features.”
The researchers trained the A.I. by showing it video clips of different objects alongside the brain wave recordings of the people watching them. This allowed the deep learning neural network to learn the features that commonly appear in brain wave activity when people view particular types of video content. They then tested their model by having subjects don EEG caps and recording their brain activity as they watched video clips ranging from people on jet skis to nature sequences to human expressions. In 210 of 234 attempts, the A.I. was able to categorize and appropriately tag the brain activity.
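The basic idea — learning a characteristic brain-wave feature pattern for each video category, then tagging new recordings by their closest match — can be illustrated with a toy sketch. To be clear, this is not Neurobotics Lab’s actual model (they used a deep learning network on real EEG data); it is a minimal nearest-centroid classifier over synthetic "EEG features," and every name and value in it is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each "EEG recording" is a vector of signal
# features, and each video category has its own characteristic
# feature pattern. Categories echo the article's examples.
CATEGORIES = ["jet_ski", "nature", "faces"]
N_FEATURES = 16

# Synthetic per-category feature templates (stand-ins for real EEG data).
templates = rng.normal(size=(len(CATEGORIES), N_FEATURES))

def simulate_eeg(cat_idx, n, noise=0.5):
    """Draw n noisy feature vectors around one category's template."""
    return templates[cat_idx] + noise * rng.normal(size=(n, N_FEATURES))

# "Training": learn the mean feature vector for each category.
train = {c: simulate_eeg(i, 50) for i, c in enumerate(CATEGORIES)}
centroids = np.stack([train[c].mean(axis=0) for c in CATEGORIES])

def classify(eeg_features):
    """Tag a recording with the category whose centroid is nearest."""
    dists = np.linalg.norm(centroids - eeg_features, axis=1)
    return CATEGORIES[int(np.argmin(dists))]

# Evaluate on fresh simulated recordings, loosely mirroring the
# article's 234-trial test.
correct = total = 0
for i, c in enumerate(CATEGORIES):
    for x in simulate_eeg(i, 30):
        correct += classify(x) == c
        total += 1
accuracy = correct / total
print(f"{correct}/{total} recordings tagged correctly")
```

On well-separated synthetic data a classifier like this scores near-perfectly; real EEG signals are far noisier, which is why the actual system needed a deep network and still achieved 210 of 234 rather than perfection.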
“It cannot reconstruct the actual things that a subject sees or imagines, only some related images of the same category,” Bobe explained.
Bobe said that Neurobotics Lab appears to be the first research group to demonstrate reconstruction of video stimuli from EEG signals. It is not, however, the first to explore A.I.-driven mind-reading technology; we have covered a number of related research projects in the past, though most of them focused on fMRI analysis rather than EEG. As Bobe pointed out, “fMRI signals contain much more information on brain processes than EEG.” But fMRI requires large, expensive equipment found only in clinics, and its poor time resolution makes real-time results difficult to obtain. EEG, while a noisier and less reliable signal, is far easier to use — which could make it more practical for real-world BCI (brain-computer interface) applications.
“Our system can be used in, for example, post-stroke rehabilitation, when a person needs either to exercise his brain in order to regain his cognitive abilities or needs to send mental commands through an EEG interface,” Bobe said. “Our system acts as a training system, in which a subject can train to generate mental commands, and use the reconstructed images as native feedback which shows how well he is doing with this task.”