There is. A company called Elliptic Labs has been working on gesture-control technology that uses ultrasound to detect your hand movements. A modified chip inside the phone tunes the speakers to emit ultrasound waves, which bounce off your hands and are picked up again by the microphone. The rebounding signals are interpreted as hand motions, which map to corresponding gesture controls.
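The emit-bounce-listen loop can be sketched in a few lines of code. The Python snippet below is purely illustrative: the sample rate, chirp parameters, and function names are our own assumptions, not anything Elliptic Labs has published. It shows the basic idea of finding an echo's delay by cross-correlating the microphone recording with the emitted ultrasonic tone, then converting that delay into a distance.

```python
import numpy as np

SAMPLE_RATE = 96_000      # Hz; assumed rate high enough to carry ultrasound
SPEED_OF_SOUND = 343.0    # m/s at room temperature

def emit_chirp(duration=0.001, freq=40_000):
    """A 1 ms ultrasonic tone burst: the 'ping' the speaker plays."""
    t = np.arange(int(duration * SAMPLE_RATE)) / SAMPLE_RATE
    return np.sin(2 * np.pi * freq * t)

def echo_delay_samples(chirp, mic_recording):
    """Find the lag (in samples) where the recording best matches the chirp."""
    corr = np.correlate(mic_recording, chirp, mode="valid")
    return int(np.argmax(np.abs(corr)))

# Simulate a hand 0.5 m away: the echo arrives after the sound's round trip.
chirp = emit_chirp()
round_trip_s = 2 * 0.5 / SPEED_OF_SOUND
delay = int(round_trip_s * SAMPLE_RATE)
recording = np.zeros(delay + len(chirp))
recording[delay:] += 0.3 * chirp          # attenuated echo of the chirp

measured = echo_delay_samples(chirp, recording)
distance_m = (measured / SAMPLE_RATE) * SPEED_OF_SOUND / 2
```

The recovered `distance_m` comes out at roughly the simulated half metre; a real system would also have to filter out the direct speaker-to-mic path and ambient noise before correlating.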
Sound confusing? It’s not, really. Elliptic Labs told us that it’s exactly how bats use echolocation to evaluate their environment, hunt, and avoid obstacles. The bat sends out a signal, which travels until it hits something and bounces back. The bat interprets the returning echo, and flies off to catch a juicy bug. Elliptic Labs is doing the same thing with your smartphone. Eating bugs is optional, though.
We saw several impressive demos of the tech, each giving us a look at what’s possible with this surprisingly powerful new control system. The first action we saw was the most basic one: turning on your phone with a wave of a hand. The others were more complicated. One lets you check what time it is when the phone’s off, see an alarm, and even hit snooze — all with a casual wave of your hand. Best of all, the phone reacted quickly and accurately every time we tried them out.
Snap selfies and control games using gestures
Other gesture commands allowed us to swipe through pictures, pause a movie, and snap a selfie. The most complicated of them all was a game where you control an on-screen character, in this case a jellyfish, and your hands are the only controller you need. The idea is to bob through the ocean, avoiding plastic bags and eating fish to survive. This simple game was a lot more fun to play, and unexpectedly relaxing, compared to using a touchscreen.
The phone can even detect hand motions from an impressive distance. We were at least 3 to 4 feet away from the demo unit, and the phone still responded to waves just as quickly as it did up close. Since it’s using ultrasound, Elliptic Labs’ tech can also judge distance, so app developers can use depth to create apps with multiple menus or stacked pages.
Here’s an example of how this could be a real timesaver. The phone will bring different kinds of notifications up on your lock screen, based on how far away your hands are positioned. From far away, you see three or four notifications from your favorite apps, but as you get closer, you see more notifications in more detail.
By the time you’re inches away from the screen, it shows only the unlocking mechanism, at which point you have the option to actually touch the screen. Up until that point, though, you’ve seen all your notifications without touching your phone at all. It gives you the chance to decide whether the notifications you’re getting are worth unlocking your phone for.
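A developer using this distance reading might implement the tiered lock screen as a simple mapping from distance to detail level. This is a minimal sketch, assuming hypothetical thresholds and tier names of our own; Elliptic Labs hasn’t published the actual values its demo uses.

```python
def notification_detail(distance_cm: float) -> str:
    """Return how much lock-screen detail to show for a hand at the
    given distance, mirroring the far-to-near behavior described above.
    Thresholds are illustrative assumptions, not real values."""
    if distance_cm > 60:
        return "summary"    # far away: three or four favorite-app notifications
    if distance_cm > 10:
        return "expanded"   # closer: more notifications, in more detail
    return "unlock"         # inches away: show only the unlock control
```

A real implementation would also smooth the raw distance readings so the interface doesn’t flicker between tiers as your hand drifts near a threshold.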
Of course, these are all just examples of what can be done with the technology. In the future, device makers and app developers can get really creative with the gesture control system. Elliptic Labs is currently in talks with major manufacturers to get its technology embedded into phones. It’s already got four big fish in its net (or should that be bats in the belfry?), though the company wouldn’t say who. If everything works out, we could be happily waving at our phones as early as later this year.