Why Google engineers worked full-time to combat sex trafficking

When the government seized classified-ads site Backpage and forced it to shut down in April, it became much harder to locate potential victims of sex trafficking. The shutdown was a symbolic victory, and the site's CEO later pleaded guilty to charges of sex trafficking, but it created a significant technical challenge for law enforcement and the organizations trying to prevent sex trafficking, Google Senior Software Engineer Sam Ainsley told TechCrunch.

“Once Backpage was gone, you were looking at an ecosystem in which all of those previously more centralized advertisements [for those being sex trafficked] are being redistributed on much more websites,” Ainsley said.

Ainsley got involved with this work through Google.org’s new fellowship program, which embeds Google engineers inside non-profit organizations on a full-time basis for six months. She did this work at Thorn, a non-profit organization founded by Ashton Kutcher that seeks to protect children from sexual abuse and trafficking. Thorn was the first non-profit to host Google Fellows.

When Backpage was up and running, Thorn had been indexing its advertisements and providing them to law enforcement in order to facilitate the recovery of victims. That task became harder once the site shut down. While working at Thorn, Ainsley and four other Google engineers set out to make this information more easily available to law enforcement.

WASHINGTON, DC – FEBRUARY 15: Ashton Kutcher, Actor and Co-Founder of Thorn: Digital Defenders of Children, speaks at a Senate Foreign Relations Committee hearing on Ending Modern Slavery: Building on Success at Dirksen Senate Office Building on February 15, 2017 in Washington, DC. (Photo by Paul Morigi/WireImage)

“The goal of this project was to create actual profiles of victims so that this information could be as accessible as possible,” Ainsley said. “That you could actually understand that victim’s history and where they might be at any given period of time: What are the cities they’ve been in over time? What are the various phone numbers they’ve used over time? How has their appearance changed? This is all incredibly important because, in this space, that is the strategy. It’s to change things as quickly as possible to evade being found. So it’s really about taking a holistic approach and a personal approach.”

Google machine learning engineers took the massive number of ads and worked to determine which advertisements belonged to a single individual, then assembled that person's history, Ainsley said. As a data visualization expert, Ainsley's role was to present all of that information in a way that is accessible to law enforcement.
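The article doesn't describe Thorn's actual pipeline, and the real system is far more sophisticated. But the core idea of merging ads into per-person profiles is a form of entity resolution, which can be illustrated with a union-find structure that links ads sharing a hard identifier such as a phone number (all ad data below is hypothetical):

```python
# Hypothetical sketch: group classified ads that share a phone number
# into per-person clusters using union-find. This illustrates the
# general entity-resolution idea only; it is not Thorn's actual system.

def find(parent, x):
    # Path-compressing find: returns the root representative of x.
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

def cluster_ads(ads):
    """ads: list of dicts with 'id' and 'phones' (list of numbers).
    Returns a mapping from cluster root -> list of ad ids."""
    parent = {ad["id"]: ad["id"] for ad in ads}
    seen_phone = {}  # phone number -> first ad id that used it
    for ad in ads:
        for phone in ad["phones"]:
            if phone in seen_phone:
                # Union: ads sharing a phone belong to one profile.
                ra = find(parent, ad["id"])
                rb = find(parent, seen_phone[phone])
                parent[ra] = rb
            else:
                seen_phone[phone] = ad["id"]
    clusters = {}
    for ad in ads:
        clusters.setdefault(find(parent, ad["id"]), []).append(ad["id"])
    return clusters

ads = [
    {"id": "a1", "phones": ["555-0100"]},
    {"id": "a2", "phones": ["555-0100", "555-0199"]},
    {"id": "a3", "phones": ["555-0199"]},
    {"id": "a4", "phones": ["555-0123"]},
]
print(cluster_ads(ads))  # a1, a2, a3 chain into one cluster; a4 stands alone
```

Note how a1 and a3 never share a number directly but are still linked through a2, which is exactly the kind of transitive connection that lets a timeline of cities, numbers and photos be stitched together for one person.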

“At this point in time we have built a prototype that we’re really excited about,” Ainsley said. “So we’re really optimistic about where this can go.”

Although her fellowship is over, Ainsley said she will continue working with the Thorn team one day a week because she wants to see the work taken all the way into the field.

“That’s the dream and I want to be available as much as I possibly can to assist in that effort,” Ainsley said.

(L) Doug Grundman, Google.org Fellow and Software Engineer, Google; (R) Julie Cordua, CEO, Thorn. Photo via Google

Thorn CEO Julie Cordua told TechCrunch the project was a success, noting it “was an exploration we’ve been wanting to do for quite a while.”

In addition to working with law enforcement around sex trafficking, Thorn tackles child pornography, livestream abuse and grooming. The organization also works to activate companies in the private sector to identify and remove child porn.

Thorn recently launched Safer, a tool for small- and medium-sized businesses to identify child pornography on their platforms. Imgur, Roblox and Flickr are currently participating in the beta program.

“Larger companies have good systems in place but when you don’t have the capacity to build out risk teams, then this type of bad content starts to flourish,” Cordua said. “That’s what we’re trying to solve for.”

(L) Samantha Ainsley, Google.org Fellow and Software Engineer, Google; (R) Doug Grundman, Google.org Fellow and Software Engineer, Google. Photo via Google.

Thorn was the first partner for the Google.org fellowship, and there are now two more active fellowships, at the Family Independence Initiative and Goodwill, which started in November. Before selecting an organization to work with, Google.org examines the potential for impact and the organization's ability to continue the work once the Google engineers leave.

“And we want to look for an area where there’s a really complex challenge where technology is the solution, or at least is a solution that can really help move the needle,” Google.org Product Manager and Head of Technical Team Jen Carter told TechCrunch.

And for Google, it’s a good retention strategy. Ainsley, for example, said the opportunity came at the right time because she had actually been considering leaving the company.

“I had been at this crossroads in my career as an engineer, where I was feeling pretty siloed in terms of the impact that my work was having,” Ainsley said. “I loved the work I was doing from a technical aspect — I always have — but I was considering potentially even exploring different career options because it was unclear to me how, through programming, I could have a direct impact on people’s lives. And that was something that I really wanted. And when this program was posted, it was right around that time.”

Cordua agrees, noting that it could be a good retention strategy for companies. It's also a model she thinks other tech companies should replicate.

“Our goal is to get more companies to think about this for their engineering teams,” Cordua said. “It’s an opportunity to build engineering teams that have empathy and understanding of tech’s social impact on both negative and positive things. Tech can do amazing things. Let’s channel it toward these social issues.”

Inside Facebook’s anti-sex trafficking hackathon

Tech giants put their rivalries aside for two days this week to code for a common cause: protecting children on the internet. Deep inside Facebook’s Menlo Park headquarters, teams drawn from Uber, Twitter, Google, Microsoft and Pinterest worked through the night to prototype new tools designed to help nonprofits in their fight against child sex trafficking.

Much of their work from Facebook’s third annual child safety hackathon is actually too sensitive to publish. To stay one step ahead of the criminals, the specifics of how these tools track traffickers and missing children across websites must be kept secret. But the resulting products, all donated to NGOs like Thorn and the Internet Watch Foundation, could help tech companies rally a united front against those who’d seek to hurt kids.

“The thing with work on safety and security and fighting abuse is it’s an area where the industry is collaborative,” says Guy Rosen, Facebook’s VP of product management and one of the event’s judges. “Hackathons are a great way to bring people together to actually bootstrap some of these ideas … ensuring that the engineers who have the smart ideas can actually understand the pain points and apply that thinking to these problems.”

The winner of 2016’s hackathon has grown into an invaluable resource for groups like the National Center for Missing and Exploited Children. The “child finder” tool matches online photos, like those on escort service listings, against NCMEC’s database of missing children. It helps reduce law enforcement’s response time so officers can be deployed in hopes of rescuing these kids.
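The matching specifics are deliberately kept secret, so purely as a stand-in for the general idea: photo-matching systems often rely on perceptual hashes, fingerprints that stay stable when an image is slightly edited, so near-duplicates can be found by comparing bits rather than pixels. A toy "difference hash" over a grayscale pixel grid, with entirely made-up data and no relation to NCMEC's real method, looks like this:

```python
# Illustrative toy only: a "difference hash" compares each pixel to its
# right-hand neighbor, producing a bit string that is unchanged by
# uniform brightness shifts. Production systems are far more robust;
# this is not the actual child-finder matching method.

def dhash(grid):
    """grid: rows of grayscale values (0-255). Returns a bit string."""
    bits = []
    for row in grid:
        for left, right in zip(row, row[1:]):
            bits.append("1" if left > right else "0")
    return "".join(bits)

def hamming(h1, h2):
    # Number of differing bits; a small distance suggests a likely match.
    return sum(a != b for a, b in zip(h1, h2))

original = [[10, 60, 30], [200, 90, 90], [15, 140, 5]]
brightened = [[v + 10 for v in row] for row in original]  # minor edit
unrelated = [[250, 0, 250], [0, 250, 0], [250, 0, 250]]

h = dhash(original)
print(hamming(h, dhash(brightened)))  # 0: brightening preserves every bit
print(hamming(h, dhash(unrelated)))   # 6: every bit differs
```

The payoff for investigators is that a brightened, cropped or re-compressed copy of a known photo can still land within a small Hamming distance of the database entry, whereas an exact cryptographic hash would miss it entirely.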

Speaking in tech’s language of computer code, Facebook engineering manager Cristian Canton Ferrer described the tool saying, “People affected = 1; magnitude of change = enormous; lasting impact of the change = forever.”

While Facebook has recently been criticized for its dominance in social networking and approach to data privacy, its size affords it the resources to spearhead projects like this. And because it’s already accustomed to hacking on scaled tools, teaming up with NGOs and other web platforms can let the fruits of 10 years of labor around child safety be passed on to those who couldn’t build them themselves.

“It benefits no company if the general perception is that the internet is not a safe place,” says Facebook’s global head of safety Antigone Davis. “All of us have an individual interest as well as the industry’s interest in ensuring that not only people perceive it as a safe place but that it is a safe place.”

Among the projects at this year’s hackathon was a tool that uses machine vision to identify people and other distinguishing features in photos from sites known to be used for sex trafficking. Artificial intelligence can help take some of the burden off human investigators, who can be emotionally taxed by constantly viewing images of the exploited.

The winning project, called “Spotting Trends,” uses clustering analysis to keep tabs on traffickers as they move around the internet. Referring to the recent termination of a popular online prostitution marketplace, Rosen told the hackathon attendees that “Backpage coming down is a big event, but the bad guys are still out there.”
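The hackathon's actual approach isn't public. As a generic illustration of one clustering signal such a tool might use, listings posted by a single operator often reuse near-identical boilerplate text across sites, and word-shingle Jaccard similarity is a simple way to flag that reuse (the listing text below is invented, and this is not the Spotting Trends method):

```python
# Illustrative sketch: flag template reuse across sites by comparing
# word shingles (overlapping 3-word phrases) with Jaccard similarity.
# Hypothetical data; not the hackathon's actual technique.

def shingles(text, k=3):
    # Break text into overlapping k-word phrases.
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    # Shared shingles divided by total distinct shingles: 0.0 to 1.0.
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb)

site_a = "new in town available tonight call now limited time"
site_b = "new in town available tonight text now limited time"  # one word changed
site_c = "quality used cars best prices this weekend only"

print(jaccard(site_a, site_b))  # well above zero: likely the same template
print(jaccard(site_a, site_c))  # 0.0: unrelated listing
```

Clustering listings whose pairwise similarity clears a threshold is one way to watch a template, and the operator behind it, reappear on new sites after an old one goes down.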

The Spotting Trends team wasn’t awarded a giant novelty check or some golden trophy. Instead, they’ll get the opportunity to present their work at the big Dallas Crimes Against Children Conference, which last year drew more than 4,300 professionals from the safety industry.

“The kind of folks that come to this, they’re really motivated and really proud of the work because as internet companies we operate at the scale of hundreds of millions or billions of users. But when you do this work, you hear those individual stories,” Rosen explains. “Just knowing the things we work on have a real impact on real people is what keeps all these people coming every morning and driven to do really good work.”

Davis concludes, “I think theirs is the quiet behind-the-scenes work that doesn’t get championed nearly enough.”