Why Telegram’s CEO was detained in France


Pavel Durov, CEO and co-founder of Telegram, speaks onstage during day one of TechCrunch Disrupt on September 21, 2015, in San Francisco, California. | Steve Jennings/Getty Images for Tech Crunch

Pavel Durov, the CEO and co-founder of messaging app Telegram, was detained in Paris on Saturday as part of an ongoing French investigation into financial and cyber crimes. On Monday, French officials said he remains under arrest, though he has not been charged with any crime.

French President Emmanuel Macron denied the arrest was politically motivated. Durov holds French and United Arab Emirates citizenship but is originally from Russia; France has been highly critical of Russia’s invasion of Ukraine and has enforced sanctions on its economy. 

Details on exactly what led to the arrest are limited. However, according to French prosecutors, Durov is being held as part of a larger French investigation. The New York Times reported that prosecutors said they are looking into a “person unnamed” who they believe may have committed an extensive list of crimes — apparently with the aid of Telegram — that include the distribution of child sexual abuse material, money laundering, and drug trafficking. The Washington Post has reported that French police have suggested that “child sex crimes” are an area of particular focus for officials.

It is unclear what Durov’s relationship, if any, is to the “person unnamed.” Unless formally charged, Durov can only be held until Wednesday.

This isn’t the first time Telegram has been linked to illegal activity. It is a globally popular platform that offers both broadcast channels (in which users can send text and media to large groups of people) and user-to-user chats. It also offers what it calls “secret chat” conversations that are end-to-end encrypted — meaning that the messages sent are only decipherable to the conversation participants and that no one else, not even Telegram, can see the content.  
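The core idea of end-to-end encryption can be illustrated with a deliberately simplified sketch (this is a toy one-time-pad example, not Telegram’s actual MTProto protocol): the two participants share a secret key, and the relaying server only ever handles ciphertext it cannot read.

```python
import os

# Toy illustration of the end-to-end principle (NOT a real protocol):
# only the two participants hold the key; the server relays ciphertext.
def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # One-time-pad XOR: secure only if the key is random, as long as
    # the message, and never reused.
    return bytes(k ^ p for k, p in zip(key, plaintext))

decrypt = encrypt  # XOR is its own inverse

message = b"meet at noon"
key = os.urandom(len(message))   # known to the two participants only

ciphertext = encrypt(key, message)  # what the server relays and stores
assert ciphertext != message        # the server sees only noise
assert decrypt(key, ciphertext) == message  # the recipient recovers it
```

Because the server never possesses the key, it cannot inspect message content even if compelled to, which is precisely what makes such conversations both privacy-preserving and hard to moderate.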

That feature, along with other privacy features like self-deleting messages, makes the app extremely useful for political dissidents and journalists trying to work under repressive regimes or protect sources. But the app has also, over the years, become a space where extremists can radicalize users and organize terror attacks.

That has led governments to pressure Telegram to share more of its data with authorities. Despite this, however, Telegram has largely been able to avoid dramatic legal encounters — until now.

Durov’s arrest is renewing scrutiny on the app and reigniting the hotly debated issues of free speech and the challenges of content moderation on social media.

Telegram and the problem of content moderation

Durov and his brother Nikolai founded Telegram to offer an app that centered user privacy following Russia’s “Snow Revolution” in 2011 and 2012, when blatant election fraud ignited months of protests, culminating in a harsh and ever-evolving government crackdown. Previously, Durov quarreled with Russian authorities who wanted to suppress speech on the Facebook-like service he founded called VKontakte.  

In the years since its founding, Telegram has allegedly enabled some truly shocking crimes. Perhaps most infamously, it was used to coordinate ISIS attacks in Paris and Berlin. Telegram cracked down on ISIS activity on the app after those attacks, but its content moderation policies have continued to face scrutiny.

As Vox has noted, those policies are laxer than those of other social media platforms, and outlets such as the Washington Post have reported that Telegram has played host to a variety of criminal content, including child pornography. Keeping that sort of material off of a platform is an arduous — but not impossible — task, Alessandro Accorsi, a researcher at the International Crisis Group, told Vox.

“The effectiveness of content moderation is largely dependent on the platform and the resources it allocates to safety,” Accorsi said. “Social media companies are generally reactive. They want to limit the financial resources dedicated to moderation, as well as possible legal, political, and ethical headaches. So what usually happens is that they will focus their efforts on a few groups or issues for which inaction on their part carries legal or reputational costs.”

For example, when ISIS uses a service for terror attacks, that service focuses on stopping ISIS from using its products. 

In communications that aren’t end-to-end encrypted, tech companies use a combination of human reviewers and automated, algorithm-driven systems to sort through content. The sort of end-to-end encryption used in Telegram’s “secret chats,” however, makes that type of moderation all but impossible.
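One common automated technique is matching uploads against a database of hashes of known illegal material. The sketch below is a hypothetical simplification: production systems such as Microsoft’s PhotoDNA use perceptual hashes that survive resizing and re-encoding, whereas plain SHA-256 here only catches byte-for-byte copies.

```python
import hashlib

# Hypothetical database of hashes of known prohibited files.
# (Real systems use perceptual hashing; SHA-256 is exact-match only.)
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"known-prohibited-file-bytes").hexdigest(),
}

def flag_for_review(uploaded_bytes: bytes) -> bool:
    """Return True if an upload matches a known-bad hash."""
    digest = hashlib.sha256(uploaded_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES

assert flag_for_review(b"known-prohibited-file-bytes") is True
assert flag_for_review(b"holiday photo") is False
```

Note that this kind of server-side scanning requires access to the plaintext bytes, which is exactly what end-to-end encryption denies the server.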

Also complicating matters is the varied nature of internet law across the globe. In the US, platforms are generally shielded from legal liability for what users post. But that’s not universally the case; many countries have much stricter legal frameworks around intermediary liability. France’s SREN Act is extremely stringent and can levy fines against platforms for content violations.

“It’s a really hard thing to do, especially in comparative context, because what’s hateful or extreme or radical speech in some place like the US is going to be different from Myanmar or Bangladesh or other countries,” David Muchlinski, professor of international affairs at Georgia Tech, told Vox. That makes content moderation “a clumsy tool at best.”

Telegram has, in response to recent outside pressure, employed some content moderation, Accorsi told Vox. It has banned channels associated with a handful of organizations (most recently Hamas and far-right groups in the UK), but thousands of problematic groups are still present. 

France’s investigation suggests Telegram may not be doing enough to keep bad actors from using the platform to commit crimes.