Robots can already cook for us. But what about helping us eat?
Engineers at the University of Washington have developed a robot that can feed people who struggle to feed themselves. Powered by an artificial intelligence algorithm, the system detects pieces of food on a plate, stabs them with a fork, and transports the morsels to a person’s mouth.
The project was first motivated by a trip that Siddhartha Srinivasa, UW engineer and lead researcher, took six years ago to a rehab institute, where a young girl asked him to develop a robot that would let her eat by herself. After meeting with caregivers and other people with mobility impairments, Srinivasa and his team recognized the broad need for an autonomous feeding system and set out to create one.
“The system is a robot arm … integrated with a wrist-mounted camera, a tactile sensor on the fingers, and a fork gripped by the two fingers,” Gilwoo Lee, a UW doctoral student who worked on the project, told Digital Trends.
When in use, the arm, which is mounted on the user’s wheelchair, prompts the user to select an item to eat. The system then runs the camera data through a set of algorithms to determine the food type and the “skewering position,” the angle at which the fork should stab the food. Through trials, Lee, Srinivasa, and their colleagues found that eating often entails orienting different foods differently on a fork. For example, stabbing a strawberry near its tip and tilting it toward the person’s mouth makes it easier to eat.
“Based on the food identity and the skewering position, the robot moves down to skewer an item, executing the most successful skewering strategy tailored for each item,” Lee explained. “Once the item is skewered, the arm moves around to deliver the food to the person sitting in the wheelchair. During this time, the camera keeps detecting the person’s face and delivers the food close to the mouth. The system then waits until the person has taken a bite or eats the whole food, and then repeats.”
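The detect-skewer-deliver loop Lee describes can be sketched in a few lines of Python. This is only an illustration of the logic as reported, not the UW team’s actual software; all class, function, and strategy names here are hypothetical.

```python
# Hypothetical sketch of the feeding loop described above.
# Names (FoodItem, plan_skewer, SKEWER_STRATEGY) are illustrative,
# not the researchers' actual API.
from dataclasses import dataclass

@dataclass
class FoodItem:
    name: str
    position: tuple  # (x, y) of the item on the plate, in camera coordinates

# Per-food skewering strategies: for instance, a strawberry is stabbed
# near its tip and tilted toward the mouth, as the researchers report.
SKEWER_STRATEGY = {
    "strawberry": {"offset": "near_tip", "tilt_deg": 45},
    "carrot": {"offset": "center", "tilt_deg": 0},
}
DEFAULT_STRATEGY = {"offset": "center", "tilt_deg": 0}

def plan_skewer(item: FoodItem) -> dict:
    """Choose a skewering pose based on the detected food type."""
    strategy = SKEWER_STRATEGY.get(item.name, DEFAULT_STRATEGY)
    return {"target": item.position, **strategy}

def feeding_loop(selected_items, deliver, bite_taken):
    """Skewer each chosen item, then hold it near the mouth until a bite."""
    for item in selected_items:
        plan = plan_skewer(item)
        deliver(plan)  # move the arm: skewer the item, track the face,
                       # and bring the fork close to the mouth
        while not bite_taken():  # wait for the person to take a bite,
            pass                 # then repeat with the next item
```

The dictionary lookup stands in for the learned, per-food policy the real system uses; the point is only that each detected food type maps to its own skewering pose before the arm moves.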
The researchers are testing their technology with caregivers and patients in assisted living facilities to tailor it more precisely to users’ needs.
A paper detailing part of the project was published recently in IEEE Robotics and Automation Letters.