Skin doesn’t just exist to keep our insides in; it’s also an incredibly useful material for sensing. Rather than relying on vision to confirm that we are successfully grasping objects, we can feel them. That’s an ability that engineers at the University of Washington are keen to give robots — making them more useful for a wide range of tasks, from moving objects in a warehouse to carrying out complex surgery.
“We have developed an artificial skin that can feel pressure and shear,” Jonathan Posner, professor of mechanical engineering and chemical engineering, told Digital Trends. “The skin mimics the way a human finger experiences tension and compression as it slides along a surface or distinguishes among different textures. As you slide your finger across a surface, one side of your nailbed bulges out while the other side becomes tight. We leveraged this effect in our artificial skin to generate asymmetric stretching. We measure how much the skin stretches using tiny channels that [are] filled with liquid metal, similar to mercury. When the channel geometry changes, so does the amount of electricity that can flow through them.”
The stretchable electronic skin was manufactured at the University of Washington’s Washington Nanofabrication Facility. It’s made from the same silicone rubber commonly used in swimming goggles, with the addition of tiny channels the width of individual human hairs, which are filled with electrically conductive liquid metal. The advantage of this conductive liquid metal is that it won’t crack or fatigue when stretched, as regular wires would. In experiments, the artificial skin was able to detect tiny vibrations at 800 times per second, which is even better than human fingers can manage.
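The sensing principle Posner describes — channel geometry modulating how much electricity can flow — follows from the standard resistance formula R = ρL/A. As a rough, illustrative sketch (the values and the incompressibility assumption below are ours, not from the paper): if the liquid metal conserves volume, stretching a channel by a factor s lengthens it to sL and narrows its cross-section to A/s, so resistance scales with s².

```python
def channel_resistance(rho, length, area):
    """Resistance of a uniform conductive channel: R = rho * L / A."""
    return rho * length / area

def stretched_resistance(rho, length, area, stretch):
    """Resistance after stretching by factor `stretch`, assuming the
    incompressible liquid metal conserves volume:
    L' = s * L, A' = A / s, hence R' = R * s**2."""
    return channel_resistance(rho, length * stretch, area / stretch)

# Illustrative values (not from the paper): an eGaIn-like liquid-metal
# resistivity, a 10 mm channel with a 100 um x 100 um cross-section.
rho = 2.9e-7        # ohm-meters, roughly eGaIn liquid metal
L = 10e-3           # meters
A = (100e-6) ** 2   # square meters

R0 = channel_resistance(rho, L, A)
R1 = stretched_resistance(rho, L, A, 1.10)   # 10% stretch
print(f"relative resistance change: {R1 / R0:.3f}")  # ~1.210 = 1.10**2
```

This quadratic sensitivity is one reason even small, asymmetric stretches of the skin — like the nailbed bulging Posner describes — produce a measurable electrical signal.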
A paper describing the work, titled “Bioinspired flexible microfluidic shear force sensor skin,” was recently published in the journal Sensors and Actuators.
“[Next up,] we want to show that the sensors can be used to improve the manipulation of objects in a wide range of applications,” Posner continued. “We need to show that using the sensors can increase the capability of robotic and prosthetic hands in complex tasks.”