Amazon, Ashton Kutcher And America’s Surveillance Of The Sex Trade

Huge police databases armed with Amazon’s facial recognition are harvesting online sex ads of exploited children and consenting sex workers alike. Forbes finds that even when wielding a mass surveillance arsenal, cops can and do fail to keep sex workers and trafficking victims safe.


Missing for over a week and alone, 14-year-old runaway Jessie made the call to her parents from the Red Carpet Inn, a two-story peach and maroon anomaly juxtaposed with gray freeway and a plain white gas station in Lima, Ohio. She said she’d been abandoned by the man with whom she’d absconded and needed rescuing. Cops from the Allen County Sheriff’s office found her in one of the motel’s cheap, gaudily decorated rooms. In an interview that followed, in mid-July 2019, the teenager told officers that she’d simply gone off with a boyfriend. But thanks to a surveillance technology that harvests sex advertisements from across the web, the cops knew otherwise: she’d been a victim of human trafficking.

A week before, as officers were trying to locate Jessie (not her real name), they had obtained her phone number. They entered it into a tool called Spotlight, an ever-growing database of sex advertisements and a free product supplied to police by Thorn, a nonprofit founded by actors Ashton Kutcher and Demi Moore, whose singular aim is to curtail child exploitation in America. Spotlight returned an ad containing Jessie’s number, along with two nude photos of the minor on a bed. Presented with the evidence, the victim explained how she’d met 28-year-old registered sex offender Nicolas Cochran on Snapchat. He’d convinced her to have sex with him and then with at least 12 others, Cochran later admitted to police. In 2020, he was sentenced to 42 years in prison for trafficking and a sex offense against a minor.

While the case received local and national press coverage, the role of Spotlight hasn’t been previously detailed. That’s not a surprise: Thorn has been reticent to show the public how its technology works and has only given a handful of examples of its use since Thorn’s founding in 2012. But after being shown a police demo, reviewing previously unreported court filings and interviewing current and former police, as well as sex workers and their advocates, Forbes has learned that Spotlight is one of three tools widely used by U.S. law enforcement that do much the same: scrape sex advertisements from across the web every day to fill vast, easily searchable databases containing names, numbers, images and payment details of not just trafficked individuals but consenting sex workers as well. Fed a phone number or name, machine learning tech in the products will attempt to find a person’s previous ads and possible connections to other individuals, drawing out a map of what it believes are trafficking networks.

In a police training video for Spotlight, an analyst described it as “the Google for human trafficking or online escort activities.”

“If you get one phone number, or one picture of a girl who is being trafficked, you’re never going to run out of new leads,” they said.

Spotlight, launched in 2016, and its two main competitors — Traffic Jam, created by Forbes Under 30 alum Emily Kennedy of Marinus Analytics, and TellFinder by Canadian company Uncharted Software — all rely on facial recognition, allowing police to take a photo and drop it in the system to see if any matches come back. While TellFinder has its own facial recognition algorithms, Spotlight and Traffic Jam both use Amazon’s controversial Rekognition technology — a tool that, two years ago, Amazon banned from use by police unless it was via Thorn, Traffic Jam or other tools aimed at human trafficking. That ban was implemented due to ethical concerns and the risk of false positives after researchers at the American Civil Liberties Union found Rekognition incorrectly identified 28 members of Congress as people who were previously arrested for a crime. The false matches were disproportionately of people of color, the ACLU found. (Amazon didn’t respond to a request for comment.)

Cops say they often combine Spotlight, Traffic Jam and TellFinder to maximize their returns when looking for the owner of a given telephone number or face. But police are especially complimentary of Kutcher’s Thorn. “They’re a phenomenal organization,” said Jim Cole, a 25-year veteran agent with Homeland Security Investigations. “Their goal is to eradicate child exploitation material, which is a lofty, lofty goal.”

Not everyone is so enamored. While the companies cite their work in rescuing victims of child sexual abuse and trafficked adults, Spotlight and its rivals are, by their nature, harvesting vast amounts of information about the marginalized consenting sex worker community from publicly posted ads and sex trade forums, often without their knowledge.

While sex work is illegal across most of America, there have long been calls for decriminalization, which would help prevent the “very real harms” caused by police snooping, says Kate D’Adamo, a sex worker advocate. D’Adamo, a partner at Reframe Health and Justice, a queer and trans people of color collective focused on harm reduction and legal reform, said that by monitoring sex workers, rather than working with them, police are making it “incredibly hard to do sex work in a way that is remotely safe.”

Combining the three databases with phone location tracking and myriad other surveillance tools, American cops have a vast spying apparatus they can exert over the sex trade. Law enforcement should be using its panopticon to perform its most basic function: protecting exploited citizens from harm. In some cases, though, even where the cops believe they have found a trafficking victim, they do the opposite, monitoring women from afar and not stepping in when they are endangered.

“Law enforcement has re-victimized and compounded the violence that this person already experienced…”

Kendra Albert, instructor at the Cyberlaw Clinic at Harvard Law School

In a statement sent to Forbes, Thorn CEO Julie Cordua said Spotlight was limited to officers who investigate child sex trafficking. “Law enforcement has reported that with Spotlight they have seen over 60% time savings in their investigative process and over 21,000 children have been identified using this tool,” Cordua said. Court filings and interviews with current and former police show Spotlight is frequently used to identify adults, a point on which Cordua declined to comment.

Emily Wyatt, director of counter human trafficking initiatives at TellFinder maker Uncharted Software, said the company carefully vets law enforcement users and only those with a mandate to investigate human trafficking are allowed to register. She declined to provide detail on how the tool could be limited to such use cases, saying it was proprietary information.

Marinus didn’t respond to multiple requests for comment. On its website, it says it is “cautious” about uses of AI and facial recognition outside of finding missing persons and sex traffickers. “Regardless of the legality of sex work, the issues of organized crime, human trafficking and missing persons remain our focus,” the company writes.


The FBI called Sarah “Victim 1.”

As a sex worker, Sarah (not her real name) sought clients in an innocuous-looking red brick area of suburban Washington D.C., which has become a bustling nexus of the city’s illicit trade. On the night of August 7, 2019, surveillance footage later reviewed by local police and the FBI showed a man grabbing her arm, slapping her, putting his hands around her neck and shaking her, according to a search warrant reviewed by Forbes.

Along with being a victim of these attacks, Victim 1 had been targeted in another way: through persistent police surveillance before and after the assault. According to court records, the cops appeared to know Sarah was working with alleged “pimp” Michael Wilkins before and after the assault, but didn’t intervene to protect her.


Got a tip about surveillance of the sex trade? Or online monitoring of marginalized communities? Reach out to the author Thomas Brewster on Signal at +447782376697 or email [email protected].


From the start of 2019, all of Sarah’s ads were scraped into an unspecified database. Because she witnessed Wilkins being shot, her image was uploaded to unnamed facial recognition software in July. Then, a few days before she was beaten up, an undercover cop posed as a sex worker on the same street and was threatened by Wilkins. In September, Sarah was targeted in a sting operation by an undercover cop, charged with solicitation and had her phone searched; finally, throughout October, her mobile location was tracked via T-Mobile.


Joe Scaramucci, a detective with the Waco, Texas police department, reviewed the court document and said the escort advertisement database was almost certainly Spotlight. When the FBI, which has numerous contracts with Thorn, searched for Sarah’s number, the tool returned 691 ads between February and August 2019, according to a search warrant.

Ultimately, all that surveillance failed to protect her from the man who assaulted her. It also failed to protect three others associated with the suspected trafficker, one of whom was beaten so badly that she lost vision in one eye, the FBI said. (After admitting to one count of forcing a person into a commercial sex act, Wilkins is now awaiting sentencing. Neither the FBI nor the Washington D.C. Metropolitan Police Department responded to requests for comment.)

While police regularly spy on sex workers to compile evidence of trafficking, even some cops balk at what was done to Sarah. “That’s crazy to think that somebody is being trafficked, and you’re going to let her get physically, sexually abused, you’re going to watch the trafficking occur, all so you can build a case on [the trafficker], it’s absolutely ridiculous,” said Scaramucci, the Waco detective, who also trains police across America on how to investigate trafficking. “It’s crazy, but it’s not uncommon.”

“It’s horrifying but I’m not surprised,” added Kendra Albert, a public interest technology lawyer and clinical instructor at the Cyberlaw Clinic at Harvard Law School. “Law enforcement has re-victimized and compounded the violence that this person already experienced.”

Olivia Snow, a dominatrix and research fellow at UCLA’s Center for Critical Internet Inquiry, said that kind of surveillance is “par for the course” for sex workers. “The cops can go to the media and say, ‘We were able to successfully rescue this woman from sex trafficking,’ when, no, you just spied on a chick,” she added.


For police, it can be difficult to determine whether a person is a consenting sex worker or a trafficking victim and, therefore, whether surveillance is justified. That means that sex workers can get caught up in law enforcement’s dragnet, even when they’re not the main target of the investigation.

For instance, when Natasha (not her real name) lost her job at a daycare center in Auburn, Washington, as the U.S. went into Covid-19 lockdown, she started doing sex work, putting ads online and posting on OnlyFans, she told Forbes. As part of this work, she began a relationship with previously convicted burglar and domestic abuser Everett Hayes — a relationship that would put Natasha at the center of a trafficking investigation and under the glare of police surveillance.

In August 2021, Hayes called the police, trying to recover belongings that were seized after he was arrested for being a felon in possession of a firearm. Suspecting him of being involved in other crimes, cops searched the number in an unspecified sex ads database, finding it was used in numerous advertisements for Natasha’s services.

“I feel like once it’s on the internet, it’s hard to get off.”

Natasha, former sex worker

Investigators then got a warrant to track the phone’s location, finding it went to the same areas as Natasha had posted in her ads. Data retrieved from a search of Hayes’ Facebook account indicated he was in the same areas at the same time, suggesting he was traveling with Natasha “for the purpose of prostitution,” investigators said. (Hayes’ lawyer did not respond to a request for comment.)

Hayes signed a plea agreement this September saying that he trafficked Natasha. Natasha, who said she was a consenting sex worker and has not been charged with any crime to date, said she was unaware her number and image had been added to a sex advertisement database or that her mobile location had been tracked until Forbes informed her. She had never considered that she might be surveilled in such a way before she got into the industry, which she quit last year.

“I do have some concerns about it… I got laid off, I dabbled in a little bit of sex work and I didn’t think too far ahead about my number being searched, my name being brought up in searches.”

Since quitting sex work, Natasha said she had tried to get the websites hosting her old adverts to delete them. She got no response. “I feel like once it’s on the internet, it’s hard to get off.”


As Spotlight, TellFinder and Traffic Jam continue to hoover up mountains of data on the sex trade, the three surveillance tools are getting smarter and easier to use. In a police training video obtained by Forbes, an investigator from Connecticut noted how the tool could be fed ages and locations — not just specific phone numbers — to enable broader searches. Spotlight alerts can inform law enforcement of any new advertisements containing a face or a number, giving them near real-time tracking, he said. Reverse image searches could also flag any similar photos used across advertisements, not just those of faces, but of surroundings or items of clothing too.

And the scale of these tools is vast: In Connecticut alone, when searching for the term, “new in town,” Spotlight returned 1.1 million ads, the analyst said.

Beyond the sheer size of their databases, the products’ use of facial recognition remains deeply worrying to privacy and sex worker advocates. With Amazon Rekognition and competing technologies previously shown to be biased, misidentifying people of color more often than white individuals, one concern is that an already marginalized minority citizen could be mistakenly identified as a sex worker. Then there’s the basic anxiety around the impact on anyone’s privacy, leading civil rights organizations like the EFF and ACLU to call for outright bans of the technology.

“It’s definitely government overreach.”

Kate D’Adamo, sex worker advocate

Earlier this week, the Center on Privacy & Technology at Georgetown Law released a research report saying there was no scientific validation for facial recognition as a reliable tool in any criminal investigation. “Technology initially intended to serve a narrow purpose like combatting human trafficking can easily and quickly expand into a tool for mass surveillance, especially when there is no oversight of these private and philanthropic partnerships between technology creators and law enforcement agencies,” said Meg Foster, one of the authors of the report.

D’Adamo, the sex worker advocate, said face recognition combined with Spotlight-type technology was a threat to people’s bodily autonomy. “Having the facial recognition technology run, tying into their ads, it’s a problem. It’s definitely government overreach,” she added.

According to Matt Mahmoudi, researcher and adviser at Amnesty International on artificial intelligence and human rights, companies use a “moralizing narrative” to justify the deployment of their tools. But without more checks on how such invasive surveillance is used in practice, they could be put to more nefarious uses than originally intended.

“It’s always the narrative of we’re going to help stop human trafficking,” Mahmoudi said. “Ultimately, it’s used in very different, pernicious ways.”
