Groundbreaking new prosthetic translates spinal cord signals into movement

Scientists at Imperial College London have developed smart sensor technology that allows a robot arm to be controlled via signals from nerves in the spinal cord.

To use the prosthesis, wearers think of actions, which are then interpreted as commands via the electrical signals sent from their spinal motor neurons.

“By means of advanced computational neuroscience, combined with state-of-the-art surgical techniques, we have shown that using non-invasive technology we can decode users’ movement intention all the way down at the level of the spinal cord,” Ivan Vujaklija, one of the researchers who worked on the project, told Digital Trends. “With this, we have established a highly functional and precise man-machine interface which can be applied in control of bionic limbs.”
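The core idea Vujaklija describes, mapping recorded neural activity onto intended movements, can be caricatured in a few lines of code. To be clear, this is an illustrative sketch only: the team's actual system decodes individual spinal motor neuron activity with far more sophisticated methods, and every signal pattern, feature, and threshold below is an assumption invented for the example.

```python
import numpy as np

# Toy intention decoder: classify windows of noisy multi-channel signal into
# movement commands by nearest-centroid matching on per-channel RMS amplitude.
# All channel counts, patterns, and noise levels are hypothetical.

rng = np.random.default_rng(0)

# Hypothetical per-channel activation patterns for two intended movements.
PATTERNS = {
    "open_hand": np.array([1.0, 0.2, 0.2]),
    "close_hand": np.array([0.2, 1.0, 0.2]),
}

def make_window(pattern, n_samples=200, noise=0.1):
    """Simulate one recording window: channel-scaled activity plus sensor noise."""
    signal = pattern[:, None] * rng.standard_normal((len(pattern), n_samples))
    return signal + noise * rng.standard_normal(signal.shape)

def rms_features(window):
    """Root-mean-square amplitude per channel -- a simple intensity feature."""
    return np.sqrt(np.mean(window ** 2, axis=1))

# "Train": average the features over a few labelled windows per movement.
centroids = {
    label: np.mean([rms_features(make_window(p)) for _ in range(20)], axis=0)
    for label, p in PATTERNS.items()
}

def decode(window):
    """Map a new window to the movement whose feature centroid is closest."""
    feats = rms_features(window)
    return min(centroids, key=lambda label: np.linalg.norm(feats - centroids[label]))
```

On this simulated data, `decode(make_window(PATTERNS["open_hand"]))` recovers the `"open_hand"` label; the point is only to show the shape of a decode step (signal window in, movement command out), not the real pipeline.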

The robotic arm potentially represents a significant step forward from existing robot prostheses, which often rely on users controlling them with remnant shoulder and arm muscles that are frequently damaged. It also compares favorably to some of the more cutting-edge robotic prosthesis projects that require users to wear EEG caps or receive percutaneous implants for brain signal recording.

In lab experiments carried out at Imperial College, six amputee or partial-amputee volunteers were able to use the robotic arm to demonstrate a greater range of motion than is possible with a conventional muscle-controlled prosthesis.

It all sounds extremely promising, although Vujaklija was eager to point out that a finished product is still some way off.

“The work presented is a proof of concept which involved strictly laboratory tests,” he said. “The next stage is to optimize the developed technology and commence the translational activities, which will include extensive clinical tests with focus on robustness and miniaturization of the system. The idea would be to adapt and evaluate the system across a larger population of volunteers with different levels of impairment.”

Luke Dormehl
I'm a UK-based tech writer covering Cool Tech at Digital Trends. I've also written for Fast Company, Wired, the Guardian…