New artificial skin will allow robots to detect touch and textures

[Image: A robot hand reading a Braille pattern]

Researchers at the National University of Singapore (NUS) are working hard to teach robots how to feel. And, no, not in the sense that their subjects are opening up emotionally about the state of robotics, circa 2020. Instead, they’re learning to feel — as in, to use sensory identification to detect touches and identify the shape, texture, and hardness of objects, much as we humans do.

What the researchers have developed is an artificial skin for robots they claim is able to detect these touches more than 10,000 times faster than the (already impressively rapid) human sensory nervous system. While it’s still early days for the research, it could nonetheless open up a plethora of new applications for future robots.

“Enabling a sense of touch in robotics could lead to many new use cases, and in general, allows for safer human-robotic interaction by improving the robot’s perception of its environment,” Mike Davies, director of Intel’s neuromorphic computing lab, which developed the Loihi neuromorphic research chip used to power the robotic skin, told Digital Trends. “For example, robot-assisted surgery technology would greatly benefit from more sensitive and faster tactile sensing. In the factory, challenging manufacturing tasks such as soft material assembly and spring detangling will be impossible to automate until industrial robots can achieve sufficient fine motor dexterity with tactile sensing.”

In an initial experiment, the researchers used a robotic hand wearing their artificial skin to read Braille. It was able to achieve this with more than 92% accuracy, while using significantly less power than alternative approaches. The team built on this work by looking at how a combination of vision and touch data could be used to classify objects. The results were revealed at the ongoing Robotics: Science and Systems conference, which runs through this week.
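The study's actual pipeline runs spiking neural networks on Intel's Loihi chip, which is well beyond a short snippet. Purely as an illustrative sketch of the general idea of fusing touch and vision features for object classification — with made-up toy data and a simple nearest-centroid classifier, none of which comes from the NUS work — the concept might look like this:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: each object yields a tactile feature vector
# (e.g. pressure readings) and a visual feature vector (e.g. pixel
# statistics). Shapes and values are illustrative, not from the study.
def make_samples(touch_level, vision_level, n=20):
    touch = touch_level + 0.1 * rng.standard_normal((n, 8))
    vision = vision_level + 0.1 * rng.standard_normal((n, 16))
    # Feature-level fusion: concatenate the two modalities.
    return np.hstack([touch, vision])

soft_objects = make_samples(0.2, 0.8)   # class 0
hard_objects = make_samples(0.9, 0.3)   # class 1

# Nearest-centroid classifier over the fused feature space.
centroids = np.stack([soft_objects.mean(axis=0),
                      hard_objects.mean(axis=0)])

def classify(fused_features):
    """Return the index of the closest class centroid."""
    distances = np.linalg.norm(centroids - fused_features, axis=1)
    return int(np.argmin(distances))

# A new sample with hard-object readings should fall into class 1.
sample = np.hstack([0.9 + 0.05 * rng.standard_normal(8),
                    0.3 + 0.05 * rng.standard_normal(16)])
print(classify(sample))
```

The point of the sketch is only that combining modalities gives the classifier more separable features than either sensor alone; the real system additionally exploits event-driven spiking computation for its speed and power advantages.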

“NUS plans to further develop this robotic system for applications in the logistics and food manufacturing industries, where there is a high demand for robotic automation — especially moving forward in the post-COVID era,” Davies continued. “These benefits can be compounded by adding additional sensing capabilities, such as hearing and smelling.”

It’s not quite at the level of Terminator-style cyborgs, yet. But advances such as this certainly have the potential to help robots advance to the next step in their evolution.
