“By November 2022, Snap employees were discussing 10,000 user reports of sextortion each month, while acknowledging that these reports ‘likely represent a small fraction of this abuse’ given the shame and other barriers to reporting,” says a newly unsealed version of the lawsuit filed by New Mexico’s attorney general against Snap. This less-redacted version of the filing, which we first saw a month ago, adds fresh details about what Snap employees allegedly knew about the scope of the sextortion problem the company is accused of facilitating on its platform.
In one alleged instance, employees referenced a case with 75 reports against it “mentioning nudes, minors, and extortion, yet the account was still active.” And in 2022, Snap’s internal research allegedly found that over a third of teen girls and 30 percent of teen boys on its app had been exposed “to unwanted contact on its platform,” the complaint says.
The new details paint a picture of a company aware of its alleged shortcomings when it came to protecting kids on its service, yet not sufficiently focused on fixing them. “Former Snap trust and safety employees complained that ‘they had little contact with upper management, compared to their work at other social media companies, and that there was pushback in trying to add in-app safety mechanisms because [Snap CEO] Evan Spiegel prioritized design,’” the complaint says.
In a statement posted to its newsroom, Snap said it designed its app “as a place to communicate with a close circle of friends, with built-in safety guardrails,” and said it has “made deliberate design choices to make it difficult for strangers to discover minors on our service. We continue to evolve our safety mechanisms and policies, from leveraging advanced technology to detect and block certain activity, to prohibiting friending from suspicious accounts, to working alongside law enforcement and government agencies, among so much more.”
According to the complaint, Snap employees circulated an external report in 2021 that included examples of alleged predators connecting with kids as young as eight through Snapchat and obtaining child sexual abuse material. But they feared measures to catch this kind of behavior would be unduly burdensome on user privacy and “create disproportionate admin costs,” the complaint alleges.
Employees also allegedly identified risks with certain Snapchat features, like Quick Add, which suggests other users to connect with. “We need to come up with new approaches that ringfence our most vulnerable users (minors) and make it harder for predatory users to find them via quick add, search, etc.,” an executive wrote, according to the complaint. “We believe we can achieve this without meaningfully degrading the product experience for these users if we pursue new strategies in inventory generation/constraints and other techniques to more effectively silo minors from people outside their networks.” Snap later changed Quick Add so that it would only surface 13- to 17-year-olds’ accounts to users who had “a certain number of friends in common with that person.” But internally, the complaint says, employees recognized that the approach would still have significant shortcomings.
The unsealed complaint also includes more details on how Snap allegedly facilitated the illicit sale of guns. In one undated presentation, the company acknowledged that its platform sees “50 posts related to illegal gun sales per day and 9,000 views per day of these marketed weapons.” And even when content is flagged, “[r]eported content is usually viewed hundreds of times before report.”
It also includes internal communications acknowledging the addictiveness of Snapstreaks, where users are told how many days they’ve continued communicating with another user. “Wow, we should have more addicting features like this,” one employee allegedly wrote, according to a January 2017 email. “Most streakers are our core demographic,” wrote another. An October 2019 presentation allegedly noted that “Streaks make it impossible to unplug for even a day.”