Researchers from Stanford University and the University of Pittsburgh have developed software that harnesses a smartphone’s built-in accelerometer, and uses this to determine whether its user might be drunk — based entirely on how they walk.
“Every smartphone currently manufactured has many embedded sensors,” Brian Suffoletto, associate professor in emergency medicine at Stanford University, told Digital Trends. “One of these sensors is a 3-axis accelerometer. We accessed this sensor and sampled the movement in the forward-backward, side-to-side, and up-down direction 100 times per second while individuals were walking in a straight line using a free app, Phyphox. We then cleaned the data and generated features related to walking such as step speed and variability of side-to-side movement. [After that, we] trained a model where each person served as their own control and examined how well these models, when shown new data on the individual, could discriminate between periods of intoxication and sobriety.”
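The pipeline Suffoletto describes, sample three accelerometer axes, clean the signal, then derive gait features such as step rate and side-to-side variability, can be sketched in a few lines. This is an illustrative approximation, not the researchers' code: the axis convention, the `gait_features` function, and the zero-crossing step counter are all assumptions made for the example.

```python
import math
import statistics

def gait_features(samples, rate_hz=100.0):
    """Derive simple gait features from 3-axis accelerometer samples.

    samples: list of (x, y, z) tuples, where x is assumed side-to-side
    and z up-down (axis mapping is an assumption, not from the study).
    """
    xs = [s[0] for s in samples]
    zs = [s[2] for s in samples]

    # Crude step detection: count upward zero-crossings of the
    # mean-centered vertical signal as a proxy for individual steps.
    mean_z = statistics.mean(zs)
    centered = [v - mean_z for v in zs]
    steps = sum(1 for a, b in zip(centered, centered[1:]) if a < 0 <= b)

    duration_s = len(samples) / rate_hz
    cadence = steps / duration_s  # steps per second

    # Variability of side-to-side movement: standard deviation of x.
    sway_sd = statistics.stdev(xs)

    return {"cadence_hz": cadence, "sway_sd": sway_sd}
```

Feeding the function a synthetic walk, for instance a roughly 1.8 Hz vertical oscillation with mild lateral sway, yields a cadence near the oscillation frequency and a nonzero sway figure; a real system would add filtering and many more features before any model training.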
For the study, the researchers recruited 22 adult volunteers, aged 21 to 43. They were given a mixed drink containing vodka dosed to produce a breath alcohol concentration of 0.2 percent. For seven hours, the participants had to perform various walking tasks while being breathalyzed.
This data was then used to build an A.I. model. The algorithm is reportedly about 90 percent accurate at predicting when breath alcohol concentration exceeds 0.08 percent. This is the level at which a person will test positive for driving under the influence (DUI) in the United States.
There are, of course, differences in people’s walking patterns that could affect such readings. A person with a hitch in their step, from breaking a leg as a child, might walk in a way that triggers a false positive. Because of this, Suffoletto said, any future commercial version of the tool would need each user to serve as their own control. That means each user would have to supply enough labeled baseline data, recording whether they had been drinking at the time, to train the model on their particular phone.
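Using each person as their own control amounts to comparing a new walk against that user's own sober history rather than against a population norm. The sketch below illustrates the idea with a simple z-score check on one gait feature; the class name, the single-feature design, and the deviation threshold are all assumptions for illustration, and the study's actual per-person models were more sophisticated.

```python
import statistics

class PersonalGaitBaseline:
    """Flag walks whose gait deviates sharply from this user's
    own sober walks (an illustrative stand-in for a trained
    per-person model, not the study's method)."""

    def __init__(self, threshold=3.0):
        self.threshold = threshold      # z-score cutoff (assumed)
        self.sober_sway = []            # sway values from known-sober walks

    def add_sober_walk(self, sway_sd):
        """Record a side-to-side variability reading from a sober walk."""
        self.sober_sway.append(sway_sd)

    def looks_impaired(self, sway_sd):
        """True if sway exceeds the user's sober mean by > threshold SDs."""
        if len(self.sober_sway) < 2:
            raise ValueError("need at least two baseline walks")
        mu = statistics.mean(self.sober_sway)
        sd = statistics.stdev(self.sober_sway) or 1e-9
        return (sway_sd - mu) / sd > self.threshold
```

The design mirrors the point in the paragraph above: someone with an unusual but consistent gait contributes that gait to their own baseline, so it no longer reads as an anomaly.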
Don’t expect this tool to ship in the near future, though. “There are several key things that need to happen before we are sure about efficacy and this tool could be rolled out commercially,” Suffoletto said. “First, we need to build models that are useful when the phone is placed on different parts of the body or in a handbag. Then, we need to determine how well the models work in the real world where people may not be walking in straight lines. Most interesting to me is whether combining different features such as typing speed on a phone could [also] complement gait features.”
How useful a tool like this would be, of course, depends on how the data is used. If it’s employed to stop people from drinking and driving by warning users that they shouldn’t be behind the wheel of a car, it would be a great idea. If it’s used to tell advertisers you enjoy drinking at lunchtime on a Friday? Maybe less so.
A paper describing the work was recently published in the Journal of Studies on Alcohol and Drugs.