In the not-so-distant future, Hitachi — maker of TVs, power tools, and a bevy of business and medical equipment — plans to unleash a predictive analytics system that it says can anticipate crimes before they happen. While this sounds eerily similar to the plot of 2002’s sci-fi thriller Minority Report, rest assured the Japan-based company didn’t uncover a band of psychic precogs in Fukushima or anything like that. Instead, it devised a futuristic computer system capable of absorbing massive amounts of data and learning on the fly.
The system, called Hitachi Visualization Predictive Crime Analytics (PCA), comes from researchers Darrin Lipscomb and Mark Jules, co-founders of the crime-monitoring technology companies Avrio and Pantascene. After Hitachi acquired the companies last year, Lipscomb and Jules took up the task of developing the new tech, opting to make use of machine learning rather than relying on preconceived variables and factors. Because of this, the PCA can derive patterns from a near-infinite number of sources, surfacing behavior patterns often overlooked by the human eye.
In an interview with Fast Company, Jules says police investigators traditionally built crime-prediction models rooted in their own experiences, the product of personal variables. Hitachi’s system removes the bias of specific variables, effectively analyzing thousands of factors capable of affecting crime. For instance, the PCA culls data such as weather patterns, public transit movements, social media activity, gunshot sensors, and many, many others. The data collected, Hitachi hopes, represents a comprehensive system capable of accurately predicting crime.
“You just feed those data sets,” Jules tells Fast Company. “And it decides, over a couple of weeks, is there a correlation.”
One unique aspect of Jules and Lipscomb’s PCA deals with how it ingests social media activity. For starters, the duo claims social media plays a significant part in predicting crime, ostensibly responsible for improving predictions by an astounding 15 percent. Because the system can decipher colloquial text and speech, keywords and slang native to a specific area or gang don’t go unnoticed. The PCA makes use of latent Dirichlet allocation (LDA), a topic-modeling technique that sorts tweets based on their geography, then chronicles specific language to get an idea of what’s going on. Jules and Lipscomb hope this method allows law enforcement to identify when something is uncommon, enabling them to act accordingly.
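To make the idea concrete, here is a minimal sketch (not Hitachi’s actual pipeline) of how latent Dirichlet allocation can group short, geo-bucketed text like tweets into recurring topics; the toy tweets, topic count, and use of scikit-learn are all illustrative assumptions.

```python
# Illustrative only: a tiny LDA topic model over stand-in "tweets",
# not Hitachi's PCA system or its real data.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Toy corpus standing in for tweets already bucketed by geography.
tweets = [
    "big game tonight at the stadium",
    "stadium parking is packed before the game",
    "heard shots near the corner store last night",
    "another shooting reported by the corner store",
    "game day traffic around the stadium again",
    "police responding to shots fired downtown",
]

# Bag-of-words counts, then fit a two-topic LDA model.
vectorizer = CountVectorizer(stop_words="english")
counts = vectorizer.fit_transform(tweets)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(counts)

# Each row is a per-tweet topic distribution; the dominant topic hints at
# which theme (e.g. sports chatter vs. gunfire reports) a tweet belongs to.
dominant = doc_topics.argmax(axis=1)
for text, topic in zip(tweets, dominant):
    print(topic, text)
```

A production system would operate on far larger corpora and combine the topic signal with the other data streams described above; the point here is only that LDA assigns each document a distribution over latent topics, so unusual shifts in a neighborhood’s dominant topics can be flagged.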
While Hitachi’s new tech obviously provides a novel way to predict and stop crime, the elephant in the room no doubt concerns the seemingly inevitable problem of profiling innocent people. Though Lipscomb posits the PCA provides law enforcement with a better policing tool than New York City’s stop-and-frisk scheme, it’s likely the new tech will raise its fair share of eyebrows.
“We’re trying to provide tools for public safety so that [law enforcement is] armed with more information on who’s more likely to commit a crime,” Lipscomb says. “I don’t have to implement stop-and-frisk. I can use data and intelligence and software to really augment what police are doing.”
Until Hitachi officially unleashes the PCA in a real, working environment, it’s unclear just how accurate it will ultimately prove to be. Lipscomb and Jules feel confident in its capability; however, the duo understands the tech needs to pass a series of real-world tests to gain widespread approval. To do this, Hitachi plans to allow law enforcement agencies in a number of (currently unknown) cities to give its system a spin.
Some of the agencies will even participate in a double-blind trial, meaning they’ll run the predictive system in the background but won’t see the predictions when they happen. After the agencies carry on with their normal day-to-day operations for a predetermined amount of time, Hitachi will compare the PCA’s predictions to the actual police activity over the same period.
Though extensive and detailed testing is necessary to discover the true benefit of Hitachi’s Predictive Crime Analytics, there’s no denying just how incredible a tool it already appears to be. It may not be time to call Philip K. Dick’s 1956 short story a work of nonfiction quite yet. However, it’s clear that now more than ever humanity is fully entrenched in “the future.”