Optical proximity sensors are found in most smartphones, where they perform a number of functions. For example, they turn off the screen when you take a call so that your cheek doesn’t trigger any actions. The sensors can disrupt the aesthetic of the device, though, as they are typically placed near the front-facing camera at the top of the phone. Elliptic Labs’ solution removes the need for those sensors, saving money as well as space, both inside the device and out.
Instead, ultrasonic waves are emitted through the phone’s speakers and bounce off your hands. The echoes are then picked up by the microphone, and the software interprets the distances and hand movements as specific gestures. So, like an optical sensor, the tech can still determine when you pick up the phone to look at it, and when you put it up to your ear to take a call.
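Elliptic Labs has not published its signal-processing pipeline, but the core emit-and-listen idea is classic time-of-flight ranging: correlate the microphone signal against the emitted pulse, find the echo delay, and convert that delay into distance. The sketch below is a minimal, hypothetical illustration of that principle in Python; the pulse frequency, sample rate, and function name are illustrative assumptions, not Elliptic's actual implementation.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at room temperature
SAMPLE_RATE = 48000     # Hz; a common phone audio sample rate (assumption)

def estimate_distance(emitted, recorded, sample_rate=SAMPLE_RATE):
    """Estimate hand distance from the round-trip delay between an
    emitted ultrasonic pulse and its echo, via cross-correlation."""
    corr = np.correlate(recorded, emitted, mode="full")
    # Lag (in samples) at which the echo best lines up with the pulse
    lag = np.argmax(corr) - (len(emitted) - 1)
    round_trip_time = lag / sample_rate
    # Halve the round trip to get the one-way distance to the hand
    return SPEED_OF_SOUND * round_trip_time / 2.0

# Simulate a 20 kHz pulse and an attenuated echo from a hand ~0.5 m away
t = np.arange(0, 0.005, 1 / SAMPLE_RATE)
pulse = np.sin(2 * np.pi * 20000 * t)
delay_samples = int(2 * 0.5 / SPEED_OF_SOUND * SAMPLE_RATE)
recorded = np.zeros(delay_samples + len(pulse))
recorded[delay_samples:] = 0.3 * pulse  # quieter, delayed copy of the pulse

print(round(estimate_distance(pulse, recorded), 2))  # → 0.5 (metres)
```

A real system would also need to separate overlapping echoes and track them over time to turn raw distances into gestures, but the distance estimate itself reduces to this correlation step.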
The benefit of Elliptic’s method is that it requires no additional hardware and could potentially add a multitude of other gestures to interactions with the phone. The company says there is no difference in quality compared to an actual optical proximity sensor; in fact, ultrasound can deliver a greater detection range. That greater range increases the number of actions the device can interpret, while dropping the black circles from the front of your smartphone makes for a more “aesthetically pleasing design.”
We saw the tech in action at Mobile World Congress last year, and were impressed by the number of different motions ultrasound could detect from as far away as 4 feet. We snapped selfies, navigated through menus, checked notifications, and even played a game — all without ever touching the device.
We don’t know exactly which OEMs the company is working with, or when devices will arrive, but Elliptic Labs has announced that it is working to integrate its new Beauty ultrasound proximity software into smartphones this year.