Accessibility options for the physically impaired just got a huge boost, thanks to eBay's open-source release of its HeadGaze software, which lets users control an iPhone X with simple head movements.
The software only works on the iPhone X at the moment, using the TrueDepth camera system behind Face ID to track the user's head movements and translate them directly onto the screen. The software creates an on-screen cursor that is then steered by those movements. A button or keypress is triggered by letting the cursor dwell over the desired control, and users can scroll or turn pages by holding the cursor on specific parts of the screen. It's a control scheme that will be familiar to anyone who has used a phone-based virtual reality headset like the Gear VR, but with the added benefit of not needing a bulky headset.
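The dwell-to-select scheme described above can be sketched in a few lines. This is a hypothetical illustration, not eBay's actual HeadGaze code: the class names, the one-second threshold, and the coordinate scheme are all assumptions. The idea is simply that a cursor position (derived from head pose) fires a "tap" once it has stayed inside a target's bounds for a continuous dwell period.

```python
DWELL_SECONDS = 1.0  # assumed threshold; a real system would tune this per user

class DwellButton:
    """Hypothetical on-screen target selected by letting the cursor linger."""

    def __init__(self, x, y, w, h, on_select):
        self.bounds = (x, y, w, h)
        self.on_select = on_select
        self._dwell_start = None  # when the cursor entered the target
        self._fired = False       # prevents repeat triggers during one dwell

    def contains(self, px, py):
        x, y, w, h = self.bounds
        return x <= px <= x + w and y <= py <= y + h

    def update(self, px, py, now):
        """Feed the latest cursor sample; fire once after a full dwell."""
        if not self.contains(px, py):
            # Cursor left the target: reset the dwell timer.
            self._dwell_start = None
            self._fired = False
            return
        if self._dwell_start is None:
            self._dwell_start = now
        if not self._fired and now - self._dwell_start >= DWELL_SECONDS:
            self._fired = True
            self.on_select()

# Usage: simulate a head-driven cursor hovering over a 100x50 button at (10, 10).
pressed = []
button = DwellButton(10, 10, 100, 50, lambda: pressed.append("buy"))
for t in [0.0, 0.5, 1.1]:          # timestamps in seconds
    button.update(50, 30, now=t)   # cursor stays inside the button
print(pressed)  # → ['buy']
```

On a real device the cursor samples would come from the front camera's head-pose tracking rather than hard-coded coordinates, but the selection logic reduces to this kind of timer.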
The technology was created by eBay's engineers with guidance from PhD candidate and eBay intern Muratcan Cicek. While the software hasn't yet been integrated into eBay's app, the company has released a video showing it working there. In an altruistic move, the auction giant has also published the source code on GitHub, so other developers can benefit from the software.
The software is primarily aimed at improving accessibility for the physically impaired. The hope is that it could be incorporated into future mobile operating systems, giving more options to those who may struggle with the touch-based interfaces used in most mobile phones today. With a system like HeadGaze, a phone could be held in position while the user browses with head movements alone. Future improvements in voice-based A.I. could also play a role in these accessibility options, helping to open the world of tech to a larger community of people.
Speaking to VentureBeat, Cicek said that the team is working on incorporating other inputs into the system, including eye tracking. He also pointed out uses the technology could have beyond accessibility, with examples from everyday life:
“Tired of trying to scroll through a recipe on your phone screen with greasy fingers while cooking? Too messy to follow the how-to manual on your cell phone while you’re tinkering with the car engine under the hood? Too cold to remove your gloves to use your phone?”
While touchscreens aren't going anywhere for a while yet, we're likely to see some serious developments in this area within the next few years. After all, a lot of us can already unlock our phones with our faces; why not control them that way, too?