It’s not quite Minority Report, but it’s not too far off: London’s police force has tested a new software application that can help it identify the gang members most likely to commit crimes in the future. The trial run, which lasted for 20 weeks, is the first of its kind in Britain.
The system used various indicators, such as five years’ worth of historical data and social media posts, to try to identify the groups of individuals within gangs who posed the greatest threat. If the software is eventually deployed across the police force, it will use up-to-the-minute data to make its predictions.
According to representatives from Accenture, the company that developed the program, the idea is to target the police’s resources more effectively rather than to try to prevent theoretical future crime. “You’ve got limited police resources and you need to target them efficiently,” Muz Janoowalla, head of public safety analytics at the firm, told the BBC. “What this does is tell you who are the highest risk individuals that you should target your limited resources against.”
“For example, if an individual had posted inflammatory material on the Internet… it would be recorded in the intelligence system,” he added. “What we were able to do was mine both the intelligence and the known criminal history of individuals to come up with a risk assessment model.”
Taking data from 32 different London boroughs, the number crunching was focused on four years’ worth of records — the computer’s predictions were then compared with the actual crime statistics for the fifth year to see how accurate it was. A spokesperson said the trial run had been a success but didn’t divulge any precise figures on the system’s success rate.
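The evaluation described above is a standard temporal hold-out: score individuals using the first four years of data, then check the highest-risk predictions against what actually happened in the fifth year. A minimal sketch of that idea follows; the field names, the toy linear scoring rule, and the sample data are all illustrative assumptions, since the article does not disclose Accenture’s actual model or figures.

```python
# Hedged sketch of a temporal hold-out evaluation for a risk model.
# Everything here (fields, weights, data) is assumed for illustration.
from dataclasses import dataclass

@dataclass
class Record:
    person_id: int
    prior_offences: int   # known criminal history (training years 1-4)
    flagged_posts: int    # intelligence indicators, e.g. inflammatory posts
    offended_year5: bool  # ground truth from the held-out fifth year

def risk_score(r: Record) -> float:
    # Toy linear score: weight criminal history above intelligence flags.
    return 2.0 * r.prior_offences + 1.0 * r.flagged_posts

def top_k_precision(records: list[Record], k: int) -> float:
    """Rank everyone by score, take the k highest-risk individuals,
    and measure what fraction actually offended in the validation year."""
    ranked = sorted(records, key=risk_score, reverse=True)
    hits = sum(r.offended_year5 for r in ranked[:k])
    return hits / k

records = [
    Record(1, prior_offences=6, flagged_posts=4, offended_year5=True),
    Record(2, prior_offences=5, flagged_posts=0, offended_year5=True),
    Record(3, prior_offences=1, flagged_posts=1, offended_year5=False),
    Record(4, prior_offences=0, flagged_posts=2, offended_year5=False),
    Record(5, prior_offences=0, flagged_posts=0, offended_year5=False),
]

print(top_k_precision(records, k=2))  # → 1.0
```

A metric like this precision-at-k matches the stated goal of targeting “limited resources” at the highest-risk individuals, rather than predicting every crime.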
The London police force says the software highlights groups of known criminals rather than singling out individuals, but privacy groups have voiced concerns. “The police need to be very careful about how they use this kind of technology,” said Daniel Nesbitt, research director at Big Brother Watch. “The Metropolitan Police must ensure that they are fully transparent about how they intend to implement this technology and what type of information will be used in the process.”