
Camera-clad rubber fingertips allow robots to manipulate cables and wires

Robots that can Manipulate Cables

Tying a knot or plugging a charger into a port requires human-like dexterity, but thanks to research out of MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), robots could soon be up to the task.

By embedding camera-based sensors in soft rubber fingertips, the team created a robot capable of working with rope, cord, and wires in a more human-like manner.

Each of the robot arm’s two fingers is equipped with a sensor called GelSight, which embeds tiny cameras in soft rubber. The cameras let the robot track where the cable is, estimate its size and shape, and gauge the friction force as the cable slides between the two fingers.
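The article doesn’t spell out the perception pipeline, but a tactile image from a GelSight-style sensor could, in principle, be reduced to a cable estimate along the lines of the minimal sketch below. The `tactile_image` depth map, the contact threshold, and the force proxy are all assumptions for illustration, not the CSAIL method; a real sensor would also track marker displacement in the gel to measure friction directly.

```python
import numpy as np

def estimate_cable_state(tactile_image, contact_threshold=0.2):
    """Estimate where a cable sits in a GelSight-style tactile image.

    A minimal sketch, not the CSAIL pipeline: `tactile_image` is assumed
    to be a 2D float array of gel indentation depth, where larger values
    mean the cable is pressing deeper into the rubber.
    """
    # Pixels indented past the threshold are treated as cable contact.
    contact = tactile_image > contact_threshold
    ys, xs = np.nonzero(contact)
    if len(xs) == 0:
        return None  # cable has slipped out of the grip

    points = np.stack([xs, ys], axis=1).astype(float)
    center = points.mean(axis=0)

    # PCA on the contact patch: the dominant axis approximates the
    # cable's orientation, the minor spread its apparent width.
    u, s, vt = np.linalg.svd(points - center, full_matrices=False)
    angle = np.arctan2(vt[0, 1], vt[0, 0])
    width = 2.0 * points.std(axis=0).min()

    # Total indentation is a crude proxy for normal (grip) force.
    force_proxy = tactile_image[contact].sum()

    return {"center": center, "angle": angle,
            "width": width, "force": force_proxy}
```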


Understanding the cable’s physical properties and location is just one part of the equation. Next, the researchers built a framework that lets the robot use that information to adjust its movements in real time as it works with the cable. One controller regulates grip strength, while a second adjusts the gripper’s position.
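The article doesn’t detail the control laws, but conceptually the pair could be two simple proportional loops like the sketch below. The gains, setpoints, and `Gripper` interface are hypothetical stand-ins rather than MIT’s implementation; the `cable_state` dictionary matches the perception sketch above.

```python
# A minimal sketch of the two cooperating controllers described above;
# all gains, setpoints, and the gripper interface are hypothetical.

GRIP_GAIN = 0.5        # grip-force controller gain
POSE_GAIN = 0.8        # gripper-position controller gain
TARGET_FORCE = 1.0     # desired friction force while the cable slides

def control_step(gripper, cable_state):
    if cable_state is None:
        gripper.open()  # lost the cable: release and re-grasp
        return

    # Controller 1: regulate grip strength so the cable keeps sliding
    # smoothly without being dropped or pinched too hard.
    force_error = TARGET_FORCE - cable_state["force"]
    gripper.adjust_squeeze(GRIP_GAIN * force_error)

    # Controller 2: move the gripper to keep the cable centered
    # between the fingers as it is pulled through.
    offset = cable_state["center"][0] - gripper.finger_center_x
    gripper.move_lateral(-POSE_GAIN * offset)
```

Run at each sensor frame, the first loop keeps the cable sliding under steady friction while the second keeps it from drifting toward a fingertip edge; splitting the two concerns mirrors the grip-strength and position controllers the researchers describe.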

When put to the test, the robotic arm successfully plugged a pair of earbuds into a headphone jack and handled ropes, wires, and cables of varying thicknesses. While that may bring to mind a robot butler capable of plugging itself in at night, the technology is most likely destined for the automotive industry first, though the researchers also see potential in surgical sutures, industrial applications, and, eventually, household tasks.


“Manipulating soft objects is so common in our daily lives, like cable manipulation, cloth folding, and string knotting,” said Yu She, an MIT postdoctoral associate and the paper’s lead author. “In many cases, we would like to have robots help humans do this kind of work, especially when the tasks are repetitive, dull, or unsafe.”

GelSight builds on research published last month that embedded cameras in a robot gripper to pick up objects as delicate as a potato chip. The new system manipulates cables with fewer drops, over longer distances, and with more accuracy than previous robots, the team says. The group plans to study additional tasks, such as routing cables, as well as real-world uses like automatically manipulating cables in the automotive industry.
