Staggering implications: Smartphones may soon use our gait to see if we’re drunk


Researchers from Stanford University and the University of Pittsburgh have developed software that harnesses a smartphone’s built-in accelerometer, and uses this to determine whether its user might be drunk — based entirely on how they walk.

“Every smartphone currently manufactured has many embedded sensors,” Brian Suffoletto, associate professor in emergency medicine at Stanford University, told Digital Trends. “One of these sensors is a 3-axis accelerometer. We accessed this sensor and sampled the movement in the forward-backward, side-to-side, and up-down direction 100 times per second while individuals were walking in a straight line using a free app, Phyphox. We then cleaned the data and generated features related to walking such as step speed and variability of side-to-side movement. [After that, we] trained a model where each person served as their own control and examined how well these models, when shown new data on the individual, could discriminate between periods of intoxication and sobriety.”
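To make the feature-extraction step in that quote concrete, here is a minimal Python sketch of how raw 3-axis accelerometer samples could be turned into two of the gait features mentioned: step speed and side-to-side variability. This is hypothetical illustrative code, not the study's actual pipeline; the function name, the 100 Hz sampling default, and the specific step-counting heuristic are all assumptions.

```python
import statistics

def gait_features(ax, ay, az, fs=100.0):
    """Toy gait features from 3-axis accelerometer samples (illustration only).

    ax: side-to-side, ay: forward-backward, az: up-down acceleration,
    each a list of samples recorded at fs Hz while walking in a straight line.
    """
    # Remove the constant gravity offset from the vertical axis.
    mean_z = sum(az) / len(az)
    z = [v - mean_z for v in az]
    # Crude step count: each upward zero-crossing of the vertical
    # axis is treated as one step.
    crossings = sum(1 for a, b in zip(z, z[1:]) if a < 0 <= b)
    duration_s = len(z) / fs
    return {
        "cadence_steps_per_s": crossings / duration_s,
        # Variability of side-to-side movement, as named in the quote.
        "lateral_sway_std": statistics.pstdev(ax),
        "forward_energy": sum(v * v for v in ay) / len(ay),
    }
```

A real system would filter the signal and use a more robust step detector, but the shape of the computation, raw samples in, a small vector of walking features out, is what the researchers describe.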

For the study, the researchers recruited 22 adult volunteers aged 21 to 43. Each was given a mixed drink containing enough vodka to produce a breath alcohol level of 0.2 percent. For seven hours, the participants performed various walking tasks while being periodically breathalyzed.

This data was then used to build an A.I. model. The algorithm is reportedly about 90% accurate at predicting when breath alcohol concentration exceeds 0.08 percent, the level at which a person will test positive for driving under the influence (DUI) in the United States.

Coming soon to a phone near you?

There are, of course, differences in people’s walking patterns which could affect such readings. A person with a hitch in their step from a childhood leg injury, for example, might walk in a way that triggers a false positive. Because of this, Suffoletto said, any future commercial version of the tool would need each user to serve as their own control: users would have to supply enough baseline walking data, labeled with whether they had drunk alcohol at the time, to train the model on their particular phone.
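The "own control" idea can be illustrated with a deliberately simplified sketch: learn a user's sober baseline for one gait feature, then flag walks that deviate sharply from it. The z-score rule below is a hypothetical stand-in for the study's actual trained model, whose details the article doesn't give; the class and method names are invented for illustration.

```python
import statistics

class PersonalGaitModel:
    """Each user serves as their own control: learn a sober baseline for
    one gait feature (e.g. lateral sway), then flag walks that deviate
    from it. A simple z-score threshold stands in for a trained model."""

    def __init__(self, threshold_z=2.0):
        self.threshold_z = threshold_z
        self.mean = None
        self.std = None

    def fit_baseline(self, sober_sway_values):
        # Baseline statistics from walks recorded while the user was sober.
        self.mean = statistics.mean(sober_sway_values)
        self.std = statistics.pstdev(sober_sway_values) or 1e-9

    def looks_intoxicated(self, sway_value):
        # Flag a walk whose sway sits far above this user's own baseline.
        z = (sway_value - self.mean) / self.std
        return z > self.threshold_z
```

Because the threshold is relative to each person's own baseline, a user with an unusual but consistent gait (like the hitch-in-the-step example above) would not be flagged for simply walking the way they always do.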

Don’t expect this tool to ship in the near future, though. “There are several key things that need to happen before we are sure about efficacy and this tool could be rolled out commercially,” Suffoletto said. “First, we need to build models that are useful when the phone is placed on different parts of the body or in a handbag. Then, we need to determine how well the models work in the real world where people may not be walking in straight lines. Most interesting to me is whether combining different features such as typing speed on a phone could [also] complement gait features.”

How useful a tool like this would be, of course, depends on how the data is used. If it’s employed to stop people from drinking and driving, by sending a notification that you should not get behind the wheel of a car, it could be a great idea. If it’s used to tell advertisers you enjoy drinking at lunchtime on a Friday? Maybe less so.

A paper describing the work was recently published in the Journal of Studies on Alcohol and Drugs.

Luke Dormehl
I'm a UK-based tech writer covering Cool Tech at Digital Trends. I've also written for Fast Company, Wired, the Guardian…