Microsoft and Shell build A.I. into gas stations to help spot smokers

Shell invests in safety with Azure, AI, and machine vision

The last thing you want to see when you pull into a gas station is some doofus lighting up a smoke.

Whether they missed the warning notices or, perhaps, the science class back at high school about open flames and flammable vapor is, in that moment at least, largely immaterial.

As for your own course of action upon seeing such reckless behavior, you can either put your foot down and hightail it out of there before the whole place goes up, or yell at the smoker to put it the hell out.

Tackling the very same issue, Shell has been working with Microsoft on a solution that aims to make all future visits to gas stations stress-free, at least in terms of potential explosive activity. The system uses Microsoft’s Azure IoT Edge cloud intelligence system to quickly identify and deal with smokers at a gas station, and it’s already being tested at two Shell stations in Thailand and Singapore.

It works like this: High-tech cameras positioned around the gas station filter the footage on site to identify behavior that suggests someone is lighting up, or already smoking.

Images that appear to show such behavior are then automatically uploaded to the Microsoft Azure cloud, which can power more sophisticated deep learning artificial intelligence (A.I.) models to confirm whether the person is actually smoking. If so, an alert is sent immediately to the station manager who can then shut down the pump before anything potentially cataclysmic happens. The system presumably could also be fully automated and configured to shut down the pump without the manager having to do it manually, with an audible warning given to the smoker via a speaker in the pump. Taking it to the extreme, the setup could even blast the perpetrator with foam from a fire extinguisher incorporated into the pump.

The entire process, from identification to shutdown, can take place in a matter of seconds. Shell said that this is because so much of the initial data is processed by on-site computers rather than sending everything to the cloud for processing — a feature of Azure IoT Edge. In other words, only the important data — in this case images that appear to show someone smoking or about to smoke — is sent to the cloud, a procedure that helps to speed up analysis and response time.
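The division of labor described above — a cheap on-site screen over every frame, with only flagged frames escalated to a heavier cloud-side model — can be sketched in a few lines. This is a hypothetical illustration of the pattern, not Shell's or Microsoft's actual code; the thresholds and the stand-in "model" scores are invented for the example.

```python
# Hypothetical sketch of the edge-filtering pattern: a lightweight
# on-site model screens every frame, and only frames it flags as
# suspicious are uploaded for confirmation by a heavier cloud model.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Frame:
    pump_id: str
    local_score: float  # stand-in for a lightweight on-device model's output

def screen_on_edge(frames: List[Frame], threshold: float = 0.5) -> List[Frame]:
    """Keep (i.e. upload) only frames the on-site model flags as possible smoking."""
    return [f for f in frames if f.local_score >= threshold]

def confirm_in_cloud(frame: Frame,
                     cloud_model: Callable[[Frame], float],
                     confirm_threshold: float = 0.9) -> bool:
    """Run the heavier cloud-side model only on pre-filtered frames."""
    return cloud_model(frame) >= confirm_threshold

def handle_confirmed(frame: Frame, shut_down_pump: Callable[[str], None]) -> None:
    """On confirmation, act at the station (here: shut the pump down)."""
    shut_down_pump(frame.pump_id)
```

In use, most frames never leave the site; only the filtered minority incur upload and cloud-inference cost, which is what keeps the identification-to-shutdown loop down to seconds:

```python
frames = [Frame("pump-1", 0.1), Frame("pump-3", 0.8)]
suspects = screen_on_edge(frames)  # only pump-3 is uploaded
for f in suspects:
    if confirm_in_cloud(f, cloud_model=lambda fr: 0.95):
        handle_confirmed(f, shut_down_pump=lambda pid: print(f"shutting down {pid}"))
```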

“Each of our retail locations has maybe six cameras and captures something in the region of 200 megabytes per second of data,” said Daniel Jeavons, Shell’s general manager for data science. “If you try to load all that into the cloud, that quickly becomes vastly unmanageable at scale. The intelligent edge allows us to be selective about the data we pass up to the cloud.”
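To put that figure in perspective, a quick back-of-the-envelope calculation shows why uploading everything is a non-starter. The assumption here — that the quoted 200 MB/s is sustained around the clock — is ours, not Shell's; real throughput will vary.

```python
# Back-of-the-envelope: daily data volume for one site, assuming the
# quoted 200 MB/s were sustained continuously (an assumption for
# illustration; actual throughput will vary).
mb_per_second = 200
seconds_per_day = 24 * 60 * 60       # 86,400
mb_per_day = mb_per_second * seconds_per_day
tb_per_day = mb_per_day / 1_000_000  # decimal terabytes
print(f"{tb_per_day:.2f} TB/day")    # → 17.28 TB/day
```

Tens of terabytes per day, per station, across a retail network of thousands of sites makes the case for on-site filtering by itself.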

As Microsoft explains on its A.I. blog, intelligent computer vision tools like these could be used across a range of industries to automatically detect dangerous behaviors or conditions. For example, “it could be deployed on construction projects to flag when employees aren’t wearing proper safety equipment or to inspect equipment sitting on the seafloor thousands of feet underwater.”

Trevor Mogg
Contributing Editor