According to Meta, its Trusted Partner program “is a key part of our efforts to improve our policies, enforcement processes, and products, to help keep users safe on our platforms.” According to some trusted partners, though, Meta neglects its flagship initiative — leaving it significantly under-resourced, understaffed, and allegedly prone to “operational failures” as a result.
That’s one of the core accusations of a report that the media nonprofit Internews published on Wednesday. The Trusted Partner program consists of 465 civil society and human rights groups around the world. It aims to give them a dedicated channel for alerting Facebook and Instagram to dangerous and harmful content such as death threats, hacked accounts, and incitement to violence. Meta promises to prioritize those reports and escalate them quickly.
But Internews claims that some participating organizations receive the same treatment as regular users: they wait months for a reply, get ignored, and are alienated by poor, impersonal communication. According to the report, response times are erratic, and in some cases, Meta doesn’t react at all or offer any explanation. That allegedly applies even to highly time-sensitive content, like serious threats and calls for violence.
“Two months plus. And in our emails we tell them that the situation is urgent, people are dying,” one anonymous trusted partner said. “The political situation is very sensitive, and it needs to be dealt with very urgently. And then it is months without an answer.”
For the report, Internews gathered assessments from 23 trusted partners across every major global region and added its own observations as a participant in the program. Most organizations reported similar experiences, but there was one exception: Ukraine, where responsiveness was far above average. Ukrainian partners can expect a response within 72 hours, while in Ethiopia, reports relating to the Tigray War can go unanswered for several months.
The report’s conclusions fit with previous leaks and reports on Meta’s global priorities. Trusted partners are particularly important outside of North America and Europe, where users can’t rely on content being constantly checked by AI and thousands of human Meta moderators. Yet, two years ago, former Facebook employee Frances Haugen published internal documents that revealed how little Meta cares about the Global South. In countries such as Ethiopia, Syria, Sri Lanka, Morocco, and Myanmar, Facebook and Instagram fail to stop extremists from inciting violence. The alleged failures of the Trusted Partner program may be part of the reason why.
In May 2023, nearly 50 human rights and tech accountability groups signed an open letter to Mark Zuckerberg and Nick Clegg after Meareg Amare, a Tigrayan professor, was doxxed in a racist attack on Facebook and murdered shortly afterward in Ethiopia. His son, Abrham, tried in vain to get Facebook to take the posts down. “By failing to invest in and deploy adequate safety improvements to your software or employ sufficient content moderators, Meta is fanning the flames of hatred, and contributing to thousands of deaths in Ethiopia,” the letter reads.
“Trusted flagger programs are vital to user safety, but Meta’s partners are deeply frustrated with how the program has been run,” said Rafiq Copeland, platform accountability advisor at Internews and author of the report. Copeland thinks that more investment is needed to ensure Meta’s platforms are safe for users. “People’s lives depend on it.”
The review was originally set up as a collaboration with Meta, but the company withdrew its participation in 2022. Meta claims that “the reporting issues of the small sample of Trusted Partners who contributed to the report do not, in our view, represent a full or accurate picture of the program.” Internews says it asked Meta to help notify its partners about the review, but Meta declined.
Meta does not disclose its average or target response times, or the number of employees who work on the program full time. A spokesperson declined to comment on the report.