Absolutely. Try making that shape with your hand right now, and we’re pretty sure you’ll succeed. Now imagine using that same gesture to summon a virtual mouse on the screen of your tablet, or, in the future, on an interactive tabletop, complete with left and right buttons and that always-helpful scroll wheel. Such natural movements could transform the way we interact with touchscreens, especially large ones.
Qeexo can make those kinds of gestures possible right now, and we got a chance to see its impressive technology in action.
Anyone who has used a modern Huawei phone may be familiar with Qeexo’s first product. It’s called FingerSense, though Huawei renamed it Knuckle Sense for use in the Mate S, Mate 8, P9, and other phones. The concept is simple: you use your knuckle instead of a fingertip to activate various features, such as taking a screenshot or opening apps. Underneath FingerSense is a versatile machine learning engine, around which Qeexo has built Touch Tools and Impact Force, two tools that go far beyond using just our knuckles.
For the last couple of years, Qeexo has refined its machine learning code to teach our devices how we, as humans, actually hold and use tools. The company has demonstrated the technology in other products, such as FingerAngle, along the way, and that engine is the real star of the show. You teach the machine how you personally hold a mouse or a pen, so it always interprets the gesture correctly. It doesn’t just guess what you’re doing based on what hundreds of other people do, and it adapts over time to changes in the way you “make” the mouse, camera, or any other gesture. That kind of intelligence makes it far more likely the machine will recognize the gesture you made and provide the tool you want.
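To make the idea concrete, here is a minimal sketch of one way a personalized, adaptive recognizer could work: a nearest-centroid classifier that starts from generic gesture templates and drifts toward an individual user’s style with every confirmed gesture. The feature vectors, gesture names, and update rule are invented for illustration; this is not Qeexo’s actual method or API.

```python
# Each gesture maps to a running centroid of hypothetical touch features
# (e.g. finger spread, contact area, number of touch points), seeded
# from population averages before any personalization happens.
centroids = {
    "mouse": [0.8, 0.3, 3.0],
    "pen":   [0.2, 0.1, 1.0],
}

def classify(features):
    """Return the gesture whose centroid is nearest to `features`."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(features, c))
    return min(centroids, key=lambda g: dist(centroids[g]))

def adapt(gesture, features, rate=0.1):
    """Nudge a gesture's centroid toward the shape the user just made,
    so the model slowly tracks how *this* user forms the gesture."""
    c = centroids[gesture]
    centroids[gesture] = [a + rate * (b - a) for a, b in zip(c, features)]

# A mouse-like hand shape is recognized, then the model adapts to it.
guess = classify([0.75, 0.35, 3.0])
adapt(guess, [0.75, 0.35, 3.0])
```

The `adapt` step is what the article describes as learning your personal grip: the stored template moves toward your observed gestures rather than staying fixed at a crowd average.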
We watched Qeexo’s Touch Tools work on an iPad, and it looked great. It was a super slick demo that made us grin, just like all the best tech innovations should. However, it gets really exciting when you consider its use beyond mobile devices. The gestures seen here can be adapted to any shape and size, so one could become a knob or dial, ready for use on a touchscreen inside a car.
For instance, a large touchscreen dominates the Tesla Model S’s dashboard, and although it looks amazing, it’s a problem to use while driving because you have to look away from the road to find settings and make adjustments. Now imagine reaching out and grabbing a virtual dial to adjust the stereo’s volume or turn down the heat. For Touch Tools, this is simple; for the driver, it’s a potential lifesaver.
“We don’t want to make people learn a whole lot of new gestures, we want to make gestures that people are already familiar with,” Sang continued. But how would we differentiate between dials, and understand we’re grabbing the right one without looking? “The button for the volume in my car is smaller than the one for the AC, so you’d make a different gesture, but what we can also do is link the gesture to haptic feedback, and as soon as you turn the dial you’ll feel that the engine has activated the right tool.”
Because similar gestures could conflict with one another, the company will work with app developers to integrate only the tools most relevant to each situation. No one needs a magnifying glass in the car, for example, and Qeexo is keen to avoid that kind of gesture overload.
Impact Force (which sounds like the title of another Dirty Harry movie, but isn’t) is a little different from Touch Tools, but uses the same learning engine. It’s designed as an alternative to pressure-sensitive gesture systems, like 3D Touch on the iPhone 6S and Force Touch on the Apple Watch.
“People are struggling to trigger force sense functionality. They can’t tell the difference between a long press and a hard press,” according to Sang, “because we don’t have a pressure sensor on the end of our finger.” Qeexo sees value in the concept of a third selection process, but wants to solve it at the software level, making it accessible to manufacturers beyond those producing only high-end, expensive smartphones.
Impact Force reacts to a sharp impact on the display, and the machine learning engine can tell the difference between a tap, a long press, and a hard, sharp press. It’s an alternative to swapping in pressure-measuring hardware, but there is a difference between the two approaches. “We can do most of the things a pressure-sensitive screen does, but not continuous pressure sensing, such as that used for Huawei’s weight-measurement trick,” Sang said.
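The distinction Impact Force draws can be sketched in a few lines. This is a deliberately crude, threshold-based illustration using two made-up features, touch duration and the sharpness of the initial impact; a real system like the one described here would learn these boundaries from data rather than hard-coding them.

```python
def classify_touch(duration_ms, peak_impact):
    """Classify one touch event from two hypothetical features:
    how long the finger stayed down, and how sharp the initial
    impact spike was (e.g. from the touch signal or accelerometer).
    Thresholds are invented for the example."""
    if peak_impact > 0.7:      # sharp strike, regardless of duration
        return "hard press"
    if duration_ms > 500:      # gentle touch that lingers
        return "long press"
    return "tap"               # quick and gentle

classify_touch(80, 0.2)    # quick, gentle  -> "tap"
classify_touch(900, 0.2)   # slow, gentle   -> "long press"
classify_touch(60, 0.9)    # quick, sharp   -> "hard press"
```

Note what the hard-press branch does not use: sustained pressure. That mirrors Sang’s caveat above, since a one-off impact spike can’t support continuous pressure sensing.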
Why haven’t we seen this approach to gesture controls before now?
“It took us two and a half years to perfect,” said Sang. “We had to do a lot of optimization work based on feedback from customers, such as bringing down the power consumption and getting the CPU usage under a certain limit.”
Now it’s ready for the big time. Qeexo expects Impact Force to make an appearance on a smartphone soon, and the company hinted that it may happen after the summer. For Touch Tools, Qeexo is working with an OEM partner, which Sang wouldn’t name for obvious business reasons, and hopes to have it inside a device we can buy before the end of 2016.
Notice we said “device” rather than phone or tablet? According to Sang, “It’s bigger than a tablet. Not a 55-inch TV or something, but between a typical tablet and TV. It will be targeting both regular people and businesses.”
Once these devices reach consumers, the days of ignoring gesture controls because we can’t remember what to do may be coming to an end.