Robot arm can be controlled with thoughts, no brain implant needed

Noninvasive EEG-based control of a robotic arm for reach and grasp tasks
Researchers at the University of Minnesota say they have made a significant breakthrough in the field of neural interfaces: a system that allows people to control a robotic arm using just their thoughts.

Published in the journal Scientific Reports, the research is particularly impressive because the robot arm manipulates objects in a complex 3D environment without the use of a brain implant.

“We demonstrated that a group of 8 human subjects can control a robotic arm for 3D reach and grasp tasks using a noninvasive brain-computer interface (BCI),” Bin He, a University of Minnesota biomedical engineering professor and lead researcher on the study, told Digital Trends. “This represents an important advancement as noninvasive BCI does not have [the] risks or costs of invasive BCI which requires surgery and implants, and may have implications to help millions of patients who can be benefited from such a technology.”

Instead of a brain implant, the technology involves an electroencephalography (EEG) setup, which records electrical activity from the brain using a cap kitted out with 64 electrodes. Signal processing and machine learning then decode these signals and turn them into specific actions for the robot to perform.
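To get a feel for the record-decode-act pipeline described above, here is a deliberately simplified toy sketch. The channel names, weights, and "band power" proxy below are invented for illustration; the study's actual decoder is far more sophisticated than a fixed linear map.

```python
# Toy EEG decoder: map per-channel signal power to a 2D movement command.
# All channel names and weights are hypothetical, for illustration only.

def band_power(samples):
    """Mean squared amplitude -- a crude proxy for EEG band power."""
    return sum(s * s for s in samples) / len(samples)

# Hypothetical linear decoder weights: channel -> (x, y) contribution.
WEIGHTS = {
    "C3": (+1.0, 0.0),   # left motor cortex -> rightward intent
    "C4": (-1.0, 0.0),   # right motor cortex -> leftward intent
    "Cz": (0.0, +1.0),   # midline -> upward intent
}

def decode(eeg_window):
    """Turn one window of multichannel EEG into an (x, y) velocity command."""
    vx = vy = 0.0
    for channel, samples in eeg_window.items():
        wx, wy = WEIGHTS.get(channel, (0.0, 0.0))
        power = band_power(samples)
        vx += wx * power
        vy += wy * power
    return vx, vy

# Simulated window: strong activity on C3, weak elsewhere,
# which this toy decoder reads as a rightward command.
window = {
    "C3": [0.9, 1.1, 1.0, 0.8],
    "C4": [0.1, 0.2, 0.1, 0.1],
    "Cz": [0.3, 0.2, 0.3, 0.2],
}
vx, vy = decode(window)
print(vx, vy)
```

In a real BCI, the features would come from filtered frequency bands rather than raw amplitude, and the decoder weights would be learned per user during the on-screen training phase the study describes.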

College of Science and Engineering, University of Minnesota

In the study, subjects started by learning to control a cursor on a computer screen. They then moved into the physical world, directing the robot arm to pick up objects from a table and place them on a three-layer shelf. All of the participants were able to control the arm, and success rates for precisely moving the objects ranged from 70 to 80 percent.

“We plan to move forward to test [the] prosthetic arm down the road to directly test the applicability of noninvasive BCI technology to help patients,” Professor He continued.

While at present this remains a fascinating futuristic research project, it is easy to see how this kind of technology might dramatically improve the lives of people with disabilities — in addition to having other applications involving the remote control of robots.

Luke Dormehl
I'm a UK-based tech writer covering Cool Tech at Digital Trends. I've also written for Fast Company, Wired, the Guardian…