
‘Locked-in’ patient communicates through first at-home brain implant

In a medical first, a “locked-in” patient has regained the ability to communicate through a home-use brain implant that translates her thoughts into text, New Scientist reports.

The 58-year-old — who has chosen to remain anonymous — was diagnosed with amyotrophic lateral sclerosis (ALS) in 2008. Over the years, the disease destroyed her capacity to control most of her body and caused her to develop locked-in syndrome, an inability to move or communicate due to paralysis while still maintaining consciousness.

Until recently, the patient communicated by spelling out words with an eye-tracking system. That solution is only temporary, however: a third of all ALS patients eventually lose control of their eye movements. Other methods — like the one used by physicist Stephen Hawking, who communicates through a sensor controlled by his cheek muscle — also depend on capabilities that may not remain reliable.

New England Journal of Medicine

The at-home brain-computer interface (BCI) was developed by Nick Ramsey’s team at the Brain Center of University Medical Center Utrecht in the Netherlands. Previous research has produced probes that let users control artificial limbs through brain signals, but those devices have been impractical outside the laboratory because they need constant recalibration.

To sidestep this issue, Ramsey and his team focused on detecting only the brain signals the patient produces when she counts backwards and when she commands her body to click a mouse.
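
The article doesn’t specify how those signals are decoded, but conceptually the decoder reduces to recognizing a sustained rise in one feature of the recorded brain activity. A minimal sketch, in which the feature values, window length, and threshold are purely illustrative assumptions rather than the Utrecht team’s actual parameters:

```python
# Illustrative sketch only: detect a "brain click" as a sustained rise
# in a single signal feature (e.g., band power). The window length and
# threshold are assumptions, not parameters of the actual system.

def detect_click(feature_values, threshold=0.8, window=5):
    """Return True when the average of the last `window` feature
    samples exceeds `threshold`, standing in for the burst of activity
    the patient produces by counting backwards."""
    if len(feature_values) < window:
        return False
    recent = feature_values[-window:]
    return sum(recent) / window > threshold

# Quiet baseline activity: no click detected.
print(detect_click([0.1, 0.2, 0.1, 0.2, 0.1]))  # → False
# Sustained high activity: click detected.
print(detect_click([0.9, 0.9, 1.0, 0.9, 0.9]))  # → True
```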

The new device connects electrodes to the brain through a small hole beneath the skull. The electrodes register brain signals and transmit them wirelessly to a tablet that translates the signals into “clicks,” which specialized software can use to help the patient spell words or select items.
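
The article doesn’t describe the spelling software itself; one common way click-based BCIs turn a single binary signal into letters is row/column scanning, sketched below. The grid layout and selection scheme are assumptions for illustration, not the actual Utrecht software:

```python
# Hypothetical row/column scanning speller: the software highlights
# rows in turn, a click picks the current row, then it highlights that
# row's columns, and a second click picks the letter. The grid
# contents are illustrative assumptions.

GRID = [
    list("ABCDEF"),
    list("GHIJKL"),
    list("MNOPQR"),
    list("STUVWX"),
    list("YZ_.,?"),
]

def select_letter(row_click, col_click):
    """Return the letter at the scan positions where the two clicks
    landed: `row_click` during the row sweep, `col_click` during the
    column sweep of the chosen row."""
    return GRID[row_click][col_click]

# A click at row-sweep position 1, then at column-sweep position 2,
# selects "I".
print(select_letter(1, 2))  # → I
```

Two-stage scanning like this is slow, but it needs only one reliable binary signal from the user, which suits a one-click implant.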

“I want to contribute to possible improvements for people like me,” the patient told New Scientist.

In her first day of use, the patient was able to generate a brain signal with the system. Six months later, she had an accuracy of 95 percent. Ramsey presented his team’s work yesterday at the Society for Neuroscience in San Francisco, California.

The current system isn’t perfect, though. Because of its simplicity, it is unlikely ever to handle complicated tasks such as controlling artificial limbs. And it’s slower than the eye tracker, taking about 20 seconds to select a single letter.

But the system’s simplicity makes it ideal for practical, at-home applications. And it offers the patient more freedom, since the previous device often couldn’t register slight eye movements while she was outdoors. “Now I can communicate outdoors when my eye-track computer doesn’t work,” she said. “I’m more confident and independent now outside.”

Ramsey and his team plan to trial the system with another patient. With the software already refined, they expect future trials to progress faster than the first. Hardware updates may one day make it possible to capture more commands, such as driving a wheelchair.

Dyllan Furness
Dyllan Furness is a freelance writer from Florida. He covers strange science and emerging tech for Digital Trends, focusing…