Whether it’s first-person controls that place you in the virtual cockpit of your drone or experiments with brain-control interfaces, engineers are exploring all kinds of ways to control unmanned aerial vehicles. Roboticists at British Columbia’s Simon Fraser University have developed an alternative control system: it involves getting users to “throw” drones onto a set trajectory by pulling a series of funny faces.
The project’s official title is Ready-Aim-Fly, which describes the three phases required for a successful flight. In the “Ready” stage, the modified Parrot Bebop drone learns the user’s face while they hold a neutral expression; the user then programs in a trigger face that is distinct from that neutral expression. In the “Aim” phase, the drone takes off while keeping the user centered in its camera view. The user can then line up the desired trajectory and back away from the drone to increase how far it will fly, a bit like pulling back the string on a catapult. Finally, in the “Fly” stage, the drone performs its programmed trajectory.
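The three-phase interaction can be pictured as a small state machine. The sketch below is illustrative only: the expression labels, the distance-to-range mapping, and the 45-meter cap (roughly the article's 150-foot figure) are assumptions, not the researchers' actual recognition pipeline or drone API.

```python
from dataclasses import dataclass

READY, AIM, FLY = "ready", "aim", "fly"

@dataclass
class Observation:
    expression: str    # hypothetical classifier output: "neutral", "trigger", or "unknown"
    distance_m: float  # estimated user-to-drone distance

def step(state, obs, max_range_m=45.0, max_aim_distance_m=10.0):
    """Advance the interaction by one observation; returns (new_state, flight_range_m or None).

    Catapult analogy: the farther the user backs away during AIM,
    the longer the resulting "throw".
    """
    if state == READY:
        # The drone has learned the neutral face; the trigger face arms it.
        return (AIM, None) if obs.expression == "trigger" else (READY, None)
    if state == AIM:
        # A second trigger face "releases the string" and launches the flight.
        if obs.expression == "trigger":
            scale = min(obs.distance_m / max_aim_distance_m, 1.0)
            return FLY, scale * max_range_m
        return AIM, None
    return FLY, None
```

Backing away five meters before pulling the trigger face, for instance, would launch a flight of half the ten-meter aiming budget's maximum range.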
It’s possible to use facial expressions to make the drone travel in a straight line or return to the user like a boomerang. In tests, the Ready-Aim-Fly system dispatched the drone on outdoor flights of close to 150 feet. All in all, the idea is pretty darn wacky, but we kind of love the concept of harnessing recent breakthroughs in image recognition (and particularly facial recognition) to issue non-verbal commands to a robot.
“The demo is cute and small scale, but we are serious about this interface,” Richard Vaughan, one of the researchers on the project, told Digital Trends. “The important part is that the robot flies in a parabola, as if it was an object being thrown. People are really good at throwing things, so the interaction is easy to learn in one demonstration. The ability to place a drone in 3D from a quick interaction is new and powerful. With a little practice, one can send the robot over a building, or onto its roof, in a couple of seconds of aiming. The user carries no special equipment and doesn’t need their hands free. Since we did this work, we are now able to read facial expressions very accurately — using the same techniques as the iPhone X animated emojis — so we can send off the robot with a big smile.”
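The "flies in a parabola, as if it was an object being thrown" behavior Vaughan describes is just textbook projectile motion. As a rough illustration (the launch speed, angle, and waypoint count are made up, and the real system plans its own trajectory), one could sample the arc a thrown object would follow:

```python
import math

def throw_waypoints(speed_mps, angle_deg, g=9.81, n=8):
    """Sample (x, z) points along the ballistic parabola of a thrown object.

    Illustrative only: maps a launch speed and angle to waypoints a drone
    could track to mimic a throw, ending when the arc returns to launch height.
    """
    theta = math.radians(angle_deg)
    vx, vz = speed_mps * math.cos(theta), speed_mps * math.sin(theta)
    t_flight = 2 * vz / g  # time for the arc to come back down to launch height
    return [(vx * t, vz * t - 0.5 * g * t * t)
            for t in (i * t_flight / (n - 1) for i in range(n))]
```

A steeper aim trades horizontal range for height, which is how a quick gesture could loft the drone over a building or onto its roof.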
A paper describing the project, titled “Ready-Aim-Fly! Hands-Free Face-Based HRI for 3D Trajectory Control of UAVs,” was presented at the Institute of Electrical and Electronics Engineers Canadian Conference on Computer and Robot Vision.