Disney Research, the R&D wing of everyone’s favorite Hollywood House of Mouse, has built a robot with uncanny valley-defying gaze interaction that’s so realistic you’ll be convinced that you’re looking at a real person. (Excepting the servo-assisted metallic musculature peeking out from under its robot shell, that is!)
The purpose of the research is to create more realistic “Audio-Animatronics” that could potentially be used at Disney theme parks and elsewhere. Audio-Animatronics is Disney’s name for animatronic robots, created by Walt Disney Imagineering, that move and make sounds in synchronized fashion. This latest update means that Disney’s robots could potentially lock eyes with visitors and follow them around with their gaze. Depending on the animatronic model this was incorporated into, that could either create an emotional connection with the guest or, potentially, intimidate the bejesus out of them.
“Eye gaze is a significant part of the interactions between people, quite a bit of information is conveyed through movements of the eyes,” Matthew Pan, a postdoctoral associate at Disney Research, told Digital Trends. “We wanted to try to emulate this communication on a robot by designing eye gazing behaviors using principles of animation. We have layered these behaviors using a subsumption architecture proposed over 30 years ago by Rodney Brooks to create complexity and realism.”
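The subsumption architecture Pan references works by stacking simple behaviors in priority order, with higher layers suppressing lower ones when they have something to do. The sketch below illustrates that layering idea for gaze selection; the layer names, thresholds, and `World` fields are hypothetical examples, not Disney’s actual implementation.

```python
# A minimal subsumption-style gaze controller, loosely inspired by the
# layered-behavior approach described above (after Rodney Brooks, 1986).
# All behaviors and parameters here are illustrative assumptions.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class World:
    guest_distance: Optional[float] = None  # meters to nearest guest, if any
    motion_bearing: Optional[float] = None  # bearing (degrees) of motion, if any


class Behavior:
    """A layer proposes a gaze target, or None to yield to lower layers."""
    def propose(self, world: World) -> Optional[str]:
        raise NotImplementedError


class EngageGuest(Behavior):
    """Top layer: lock eyes with a guest who is close enough."""
    def __init__(self, engage_within: float = 2.0):
        self.engage_within = engage_within

    def propose(self, world):
        if world.guest_distance is not None and world.guest_distance <= self.engage_within:
            return "fixate:guest"
        return None


class GlanceAtMotion(Behavior):
    """Middle layer: glance toward peripheral motion."""
    def propose(self, world):
        if world.motion_bearing is not None:
            return f"saccade:{world.motion_bearing:.0f}deg"
        return None


class IdleScan(Behavior):
    """Base layer: always active; slow ambient scanning of the scene."""
    def propose(self, world):
        return "scan:ambient"


def select_gaze(layers: List[Behavior], world: World) -> str:
    # Subsumption: walk the layers from highest priority down; the first
    # layer that produces an output suppresses everything beneath it.
    for layer in layers:
        target = layer.propose(world)
        if target is not None:
            return target
    return "scan:ambient"  # fallback; unreachable if a base layer is present


layers = [EngageGuest(), GlanceAtMotion(), IdleScan()]
print(select_gaze(layers, World(guest_distance=1.2)))   # → fixate:guest
print(select_gaze(layers, World(motion_bearing=45.0)))  # → saccade:45deg
print(select_gaze(layers, World()))                     # → scan:ambient
```

Because each layer is self-contained, complexity emerges from the stack rather than from any single behavior: a nearby guest suppresses motion glances, which in turn suppress idle scanning, yielding gaze that appears attentive without a central planner.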
A Turing Test for gaze
This new work is described in a research paper recently presented at the IROS (International Conference on Intelligent Robots and Systems) 2020 conference. Along with the Disney Research investigators, the project involved collaborators from the California Institute of Technology, University of Illinois at Urbana-Champaign, and Walt Disney Imagineering.
As an article in IEEE Spectrum notes, the investigation is a bit like the gaze version of a Turing Test: an attempt to create a realistic eye-based interaction with humans that aims to be, frankly, indistinguishable from looking into the eyes of another person. While we’ve covered some impressively lifelike robots here at Digital Trends (check out this recent feature on a swimming dolphin robot that could be used to substitute for the real thing in future marine parks), this could be one of the biggest challenges for roboticists. That’s because our brains are so adept at spotting abnormalities in humanlike faces.
So how effective is it? From an outsider’s perspective, it certainly seems an impressive demonstration. Matthew Pan said that he was happy with it — and, from the sound of it, so are his fellow researchers. That’s even though it’s difficult to verify at a time when large-scale user studies can’t easily be run (and some of Disney’s theme parks remain shuttered).
“Typically, in human-robot interaction research, you conduct user studies with people where they interact with a robot and rate the interaction and give qualitative feedback,” Pan said. He pointed to several tools and metrics that have been established to help quantify these effects. “For this work, the pandemic prevented any human interaction studies from happening. However, as this project progressed, we witnessed that we were creating something quite remarkable. Many of us and our colleagues stepped in front of this animatronic and said ‘Wow! That looks realistic!’ With that, we decided [it was time to share this work.]”