Thanks to a Freedom of Information request and some further digging, the Electronic Frontier Foundation (EFF) has uncovered just about everything there is to know about the U.S. Army’s chatbot sergeant — including the technology’s past as a tool for catching pedophiles and terrorists on the Web. The revelations are part of EFF’s ongoing investigation into how the military interacts with and collects information from the public over the Internet.
Sgt. Star is the well-known virtual public spokesperson for the United States military, appearing on the Army Careers website and associated Facebook pages to answer questions from potential recruits. EFF has compiled all of his possible responses into a 288-page document that covers every aspect of military service, down to the question of whether soldiers can use umbrellas (yes, in certain circumstances). There's also a page of usage statistics — Sgt. Star engaged in nearly 600,000 online chats in 2013.
The chatbot was introduced as a cost-cutting measure in 2006, designed to reduce the time that human operators needed to spend on online inquiries — the military estimates that Sgt. Star can do the work of 55 real recruiters. The company behind the bot, Next IT of Spokane, Washington, has previously developed similar technology for the FBI and CIA. In those cases, chatbots were used to engage suspected pedophiles and terrorists, looking for signals that would point to suspicious behavior and allowing a single federal agent to monitor 20-30 conversations at once.
EFF believes this kind of activity raises questions about data collection and online monitoring — once again, the balance between the need to catch criminals online and the need to protect the privacy of ordinary citizens is called into question. “What happens to conversations that aren’t relevant to an investigation, and how do the agencies weed out the false positives, such as when a chatbot misinterprets a benign conversation as dangerous?” asks EFF researcher Dave Maass.
“For all his character quirks, a user would never mistake Sgt. Star for human — that’s just not how he was designed,” continues Maass. “That can’t necessarily be said for other government bots. Military, law enforcement and intelligence agencies have employed virtual people capable of interacting with and surveilling the public on a massive scale, and every answer raises many, many more questions.”