There’s a new weapon in the war against sex offenders preying on unwitting child victims online, and it comes in the form of a smart algorithm.
Created by researchers at Purdue University, the artificial intelligence-powered Chat Analysis Triage Tool (CATT) is designed to help law enforcement more easily discover instances of grooming online. Given the sheer volume of conversation that takes place on the internet, it aims to do a job that would be impossible without an entire army of dedicated humans: monitoring online conversations and flagging instances in which adults are behaving in a suspiciously inappropriate way.
“CATT analyzes the chats between minors and different types of child sex offenders, specifically offenders [who aim to] meet up with minors for sex in the real world, and fantasy-driven offenders [interested in] cybersex fantasy,” Kathryn Seigfried-Spellar, assistant professor of computer and information technology at Purdue, told Digital Trends. “Law enforcement are bombarded with cases of child sexual solicitation, so our tool triages these cases for law enforcement by analyzing the chats based on differences in language, to provide a risk assessment score on the likelihood that these individuals [might be] a contact-driven offender.”
The algorithm was developed by analyzing 4,353 messages across 107 chat sessions involving sex offenders who were later arrested. According to its creators, the tool is robust enough to understand messages even when their meaning is obfuscated by acronyms or shorthand, or simply by spelling errors.
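To get a feel for the kind of processing involved, here is a minimal, purely illustrative sketch of a chat-triage approach. It is not CATT itself: the shorthand table, cue words, and weights below are invented for illustration, standing in for the real language differences the researchers found between contact-driven and fantasy-driven offenders.

```python
import re
from collections import Counter

# Toy normalization table for shorthand and acronyms (invented examples).
SHORTHAND = {"u": "you", "r": "are", "wanna": "want to", "2nite": "tonight"}

def normalize(message):
    """Lowercase, strip punctuation, and expand common shorthand."""
    tokens = re.findall(r"[a-z0-9']+", message.lower())
    return [SHORTHAND.get(t, t) for t in tokens]

# Hypothetical per-word weights for language associated with attempts
# to arrange a real-world meeting (the "contact-driven" profile).
CONTACT_CUES = {"meet": 2.0, "tonight": 1.5, "address": 2.0, "alone": 1.0}

def contact_risk_score(chat):
    """Sum cue weights over the whole chat; crudely scale into [0, 1]."""
    counts = Counter(t for msg in chat for t in normalize(msg))
    raw = sum(w * counts[word] for word, w in CONTACT_CUES.items())
    return min(raw / 10.0, 1.0)

chat = ["do u wanna meet 2nite", "r you alone"]
print(round(contact_risk_score(chat), 2))  # prints 0.45
```

A real system would learn these signals from labeled chat data rather than hand-coding them, but the two-stage shape, normalizing obfuscated text and then scoring it against an offender-type profile, mirrors the triage role described above.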
In the future, the researchers say that CATT could also be used to teach undercover officers to better portray underage victims online by revealing constantly changing factors like language, emojis, and acronyms.
“This will be a free tool for law enforcement, and we ask that agencies who are interested in testing our tool reach out to us this summer,” Seigfried-Spellar said. “We are looking for partners to share data and test our tool, so we can have CATT ready for deployment by the end of this year.”
A paper describing the research, “Exploring Detection of Contact vs. Fantasy Online Sexual Offenders in Chats with Minors,” was published in the journal Child Abuse & Neglect.