Police are deploying fake personas to catch criminals—but no arrests yet

Published 18 Apr 2025


Police departments along the U.S.–Mexico border are using a controversial AI tool called Overwatch to pose as online personas—ranging from college protesters to sex workers—in an effort to track suspected criminals. The technology, developed by New York-based company Massive Blue, is being funded with taxpayer-backed anti-trafficking grants and deployed primarily in Arizona. Despite months of operation and a $360,000 contract in Pinal County, Overwatch has not yet led to a single arrest.

According to internal documents obtained through public records, Overwatch creates AI-generated social media profiles to interact with potential suspects in human trafficking, narcotics, and other criminal activities. The tool is marketed as an “AI-powered force multiplier” that can crawl public platforms and messaging services like Telegram and Discord, then communicate with individuals using tailored personas designed to appear lifelike and convincing.

“The problem with all these things is that these are ill-defined problems,” said Dave Maass of the Electronic Frontier Foundation. “One version of the AI persona is an escort. I’m not concerned about escorts. I’m not concerned about college protesters. So like, what is it effective at—violating protesters’ First Amendment rights?”

The AI personas range from a “child trafficking AI persona” modeled as a 14-year-old gamer to a “honeypot” persona of a 25-year-old woman fluent in Arabic, to “AI pimp” and “college protester” personas. Each profile comes complete with backstories, interests, and communication scripts. Sample conversations released from Massive Blue’s presentation include chats with suspected predators and sex workers, where the AI engages in text dialogue designed to build trust or extract information.

Despite its ambitious scope, Overwatch has not yet resulted in prosecutions. The Pinal County Sheriff’s Office, which signed the largest known contract with Massive Blue, confirmed the tool has produced leads, but no actionable outcomes.

“Our investigations are still underway,” said Sam Salzwedel, a spokesperson for the sheriff’s office. “Massive Blue has been a valuable partner… but we cannot risk compromising our efforts by providing specifics.”

Massive Blue declined to share the full list of clients or any arrest statistics. Cofounder Mike McGraw cited the need to protect ongoing investigations: “We cannot risk jeopardizing these investigations and putting victims’ lives in further danger by disclosing proprietary information.”

Some Arizona officials have raised alarms over the program’s secrecy and the company’s limited transparency. At a Pinal County Board of Supervisors meeting, former District 1 Supervisor Kevin Cavanaugh questioned the wisdom of spending public money on unproven technology with no demonstrable results.

“Fighting human and sex trafficking is too important to risk half a million dollars on unproven technology,” Cavanaugh later told a local paper. “If it is just being used to collect surveillance on law-abiding citizens and is not leading to any arrests, then the program needs to be discontinued.”

Despite initial hesitation, the Pinal County Board later approved additional funding for Overwatch without public discussion. Meanwhile, Yuma County, which tested the software with a $10,000 contract, declined to renew, stating it “did not meet our needs.”

Behind the scenes, Massive Blue has pitched its tool to other agencies, including Cochise County and the Texas Department of Public Safety. Cochise ultimately did not purchase the program, with meeting notes reflecting skepticism over vague marketing language and a lack of concrete results.

Overwatch marketing materials claim the platform uses AI and blockchain to analyze suspects’ behavior, create fake social profiles, and compile “Recon Reports.” These reports include geographic and cryptocurrency transaction data—though how this data is collected or verified remains unclear.

“Why is he talking about converting folks into ‘buying something’… Talk about the widget, not how you’re selling the widget to law enforcement,” wrote Jorge Brignoni from the Cochise County Sheriff’s Office, after attending a product demo.

Public-facing representatives for the company include Chris Clem, a former U.S. border agent, and ex-NFL kicker Nick Lowery. Clem has publicly promoted Overwatch as a kind of “cyberwall” to stop traffickers and hackers but declined to explain how the technology actually works.

“I worked on a physical wall, now we’ve created a cyberwall,” Clem said in an interview. “I believe it will save lives.”

As of early 2025, Overwatch remains active in several counties, but critics warn of its potential to normalize mass surveillance without clear evidence of effectiveness. Investigations continue, and officials say it may take time to evaluate results. Still, questions remain about the program’s legality, ethical use of AI, and its real-world value in combating serious crimes.

“No matter the source of the funding,” said Cavanaugh, “this is taxpayer money. The burden of proof is on Massive Blue to show it works—and that it respects civil liberties while doing so.”