An algorithmic tool developed by Purdue Polytechnic Institute faculty will help law enforcement sift through cases and focus on the sex offenders most likely to set up face-to-face meetings with child victims.
The Chat Analysis Triage Tool (CATT) was presented last week by principal investigator Kathryn Seigfried-Spellar, assistant professor of computer and information technology, at the International Association of Law Enforcement Intelligence Analysts Conference in Anaheim, California.
Seigfried-Spellar said law enforcement officers are inundated with cases involving the sexual solicitation of minors – some suspects interested only in sexual fantasy chats, others intent on persuading an underage victim into a face-to-face meeting.
CATT allows officers to work through the volume of solicitations, using algorithms to examine a suspect's word usage and conversation patterns. Seigfried-Spellar said the data were drawn from online conversations voluntarily provided by law enforcement agencies around the country.
"We went through and tried to identify language-based differences and factors like self-disclosure," she said. Self-disclosure is a tactic in which the suspect tries to develop trust by sharing a personal story, which is usually negative, such as parental abuse.
"If we can identify language differences, then the tool can identify these differences in the chats in order to give a risk assessment and a probability that this person is going to attempt face-to-face contact with the victim," Seigfried-Spellar said. "That way, officers can begin to prioritize which cases they want to put resources toward to investigate more quickly."
Another standout characteristic of sexual predators grooming victims for a face-to-face meeting is that the chats often go on for weeks or even months until a meeting is arranged. Those involved in sexual fantasy chatting, by contrast, move quickly from one youth to another.
The project started as a result of a partnership with Ventura County Sheriff's Department in California.
Seigfried-Spellar said the research found that tactics like self-disclosure are used early in a predator's conversations with a potential victim.
"Meaning that we could potentially stop a sex offense from occurring because if law enforcement is notified of a suspicious chat quickly enough, CATT can analyze and offer the probability of a face-to-face," she said. "We could potentially prevent a child from being sexually assaulted."
Seigfried-Spellar developed CATT with two co-principal investigators: associate professor Julia Taylor Rayz, who specializes in machine learning and natural language processing, and computer and information technology department head Marcus Rogers, who has an extensive background in digital forensics tool development.
CATT's algorithms examine only conversational factors and, at this time, do not take the sex of either the suspect or the victim into consideration.
The project began with initial research by Seigfried-Spellar and former Purdue professor Ming Ming Chiu. The exploratory study used statistical discourse analysis to examine more than 4,300 messages across 107 online chat sessions involving arrested sex offenders, identifying differing trends in word usage and self-disclosure between fantasy and contact sex offenders.
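As a rough illustration of that kind of message-level comparison, the sketch below scores messages with crude proxies for self-disclosure and negative-emotion words and compares two groups of messages. It is not the published statistical discourse analysis; the word lists and example messages are invented, and a simple two-sample t-test is swapped in as a much simpler stand-in for the study's method.

```python
# Minimal sketch of a message-level comparison -- not the published analysis.
# Word lists and example messages are invented for illustration.
from scipy import stats

# Crude proxies: first-person self-reference plus negative-emotion vocabulary.
SELF_WORDS = {"i", "me", "my", "myself"}
NEG_EMOTION_WORDS = {"hurt", "abuse", "alone", "sad", "hate"}

def self_disclosure_score(message: str) -> int:
    """Count tokens that signal self-reference or negative emotion."""
    tokens = message.lower().split()
    return sum(t.strip(".,!?") in SELF_WORDS | NEG_EMOTION_WORDS for t in tokens)

# Hypothetical per-message scores for two offender groups.
contact_scores = [self_disclosure_score(m) for m in [
    "my dad used to hurt me too, i felt so alone",
    "i hate my life, nobody listens to me",
]]
fantasy_scores = [self_disclosure_score(m) for m in [
    "what are you wearing right now",
    "tell me what you look like",
]]

# A two-sample t-test is one simple way to ask whether the groups differ.
t_stat, p_value = stats.ttest_ind(contact_scores, fantasy_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```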
The trends determined through this research formed the basis for CATT. The research, "Detecting Contact vs. Fantasy Online Sexual Offenders in Chats with Minors: Statistical Discourse Analysis of Self-Disclosure and Emotion Words," has been accepted for publication in the journal "Child Abuse & Neglect."
Initial plans are to turn the tool over to several law enforcement departments for a test run. Seigfried-Spellar said CATT could be handling data from active cases as early as the end of the year.
The conversation analysis provides the basis for future law enforcement tools as well, she said.
"What if there is a chat online and you don't know if you're chatting with an offender or someone who is 15 years old pretending to be 30," she said. "Maybe then, this tool can analyze the differences in an actual 13-year-old versus someone who is pretending to be 13 or an actual adult versus someone who is pretending to be an adult.
"So, you can then start trying to figure out, language wise, who this person is I'm chatting with."
She believes that, at some point, CATT could even help train officers to better portray a 10-year-old victim by keeping pace with constantly changing factors like language, emojis and acronyms.
"In these types of operations, our goal isn't to entrap people," she said. "In these, the offender is initiating, and as they do that, law enforcement is simply responding.
"If officers can respond in a way that speeds up the process, that gets the person off the street sooner compared to waiting eight months to allow a trust relationship to develop."