Photo by Schreibvieh via Wikimedia Commons
On August 24, 2024, Pavel Durov, the CEO of the messaging app Telegram, was arrested in Paris. French authorities have charged him with various crimes, including complicity in the distribution of child pornography, the sale of illegal drugs, and the distribution of hacking software.[1] Durov is banned from leaving France and must check in at a police station twice a week. If convicted, Durov faces a ten-year prison sentence.[2] French prosecutors are holding Durov criminally liable for permitting criminal activity to go unchallenged on Telegram. This indictment is possible under France’s LOPMI law, which was passed in January 2023. Under LOPMI, a person is liable for “complicity in the administration of an online platform to allow an illicit transaction, in an organised gang.”[3] LOPMI, which currently has no equivalent in other countries’ legal systems, “criminalises tech titans whose platforms allow illegal products or activities.”[4] While “holding executives at social networks criminally liable for content that appears on sites was, until now, considered almost unthinkable,” Telegram’s approach to compliance with government requests for information and cooperation is itself exceptional. France is seeking to send a message: tech executives are responsible for activity on their platforms.[5]
Durov began wrangling with the notion of privacy in 2013. That year, protesters in Ukraine began using Durov’s platform VKontakte, “the Facebook of Russia,” to organize protests against the pro-Russian government in Kyiv. The Kremlin pressured Durov to pass along protesters’ data, which Durov refused to do, arguing that it would be a betrayal of his Ukrainian users. Consequently, in 2014, Durov left Russia, stepped down from VKontakte, and sold his stake in the company.[6] Durov’s actions established him as a privacy champion and demonstrated that he was ready to prioritize his users’ privacy above his own personal safety.
France’s problem with Telegram began in 2015–2016, when terrorists used the platform to organize attacks in Paris. Telegram made clear that it was “not going to cooperate with security services in terms of taking down Islamist content…[or] banning users who were attempting to spread Islamist propaganda.”[7] To address concerns about the dark side of online platforms and social media, the European Union (EU) passed the Digital Services Act (DSA), which went into effect in late 2023 and governs Durov’s case. The DSA’s scope is significant and covers virtually all aspects of tech operations.[8] The DSA forces social media companies to create and carry out procedures that govern the removal of content flagged as harmful by platform users or governments, and it obliges platforms to work with European authorities in their criminal investigations.[9] Telegram falls just under the 45-million-user threshold that would designate it a very large platform, a designation that would subject it to heightened monitoring and enforcement under the DSA.[10] The French government argues that Telegram has not complied with its obligations. Indeed, the company is known for rarely checking the email inbox to which requests from law enforcement are directed.[11] This non-compliance extends into Durov’s current situation: Paris prosecutors have argued that “Telegram exhibited an ‘almost total failure to respond to judicial requests’” throughout the French investigation into Telegram.[12] Telegram, meanwhile, argues that it is in compliance with the DSA.
Durov’s arrest has raised concerns in the global tech community, with many wondering whether American tech executives, like Mark Zuckerberg, could find themselves facing charges similar to Durov’s. It is increasingly clear that social media poses a threat to vulnerable groups all over the world. Strong cases can be made both in favour of and against stronger regulation of social media. Key arguments for stronger regulation point to a plethora of evidence indicating that social media use has contributed to a rise in false information, conspiracy theories, and deteriorated mental health, particularly among young adults and teenagers. Regulation critics assert that binding social media to greater control risks limiting freedom of speech and the free circulation of ideas and creating privacy issues. Social media regulation can be weaponized and designed to crack down on minority beliefs or statements made by political dissenters.
However, it is important to at least acknowledge problems associated with social media. In Meta’s case, for example, the company had affirmative knowledge that its algorithms “risked contributing to extreme violence” in Myanmar as early as 2012—five years before the genocide carried out against the Rohingya in 2017.[13] In January of this year, the Senate Judiciary Committee questioned top social media executives, including Mark Zuckerberg, CEO of Meta, and Evan Spiegel, CEO of Snap, for four hours. The Committee accused the executives of having failed to protect children from exploitation, citing evidence from the National Center for Missing & Exploited Children and U.S. Surgeon General, Vivek Murthy, that pointed towards the deleterious effects of social media use on children’s mental health and their risk of being abused sexually, physically, and psychologically.[14]
American tech executives are likely safe for now. While U.S. law has no crime directly analogous to the LOPMI law, “U.S. prosecutors could charge a tech boss as a ‘co-conspirator or an aider and abetter of the crimes committed by users,’” if evidence demonstrates that the individual in question sought to actively participate in criminal activity.[15] This standard is laid out in Twitter v. Taamneh.[16] In a separate case, Ross Ulbricht, the founder of Silk Road, was found guilty of aiding and abetting in 2015 and received a life sentence.[17] Proving knowledge of criminal activity is the key challenge for prosecutors, particularly given that American tech behemoths like Meta and Google “have worked to take down and report illegal content to law enforcement officials, so their executives can argue they tried to do the right thing.”[18]
American executives also benefit from Section 230 of the Communications Decency Act, which protects platforms and their owners from being held accountable for “harmful speech.”[19] However, Section 230 is not absolute. The law preserves executives’ accountability regarding instances of “obscenity, [the] sexual exploitation of children, prostitution, [and] sex trafficking,” as well as intellectual property and communications privacy concerns.[20] Section 230 also does not apply outside of the U.S.[21] In response, legislators have proposed the Kids Online Safety Act (KOSA), which seeks to provide legal remedies to people who have suffered harm through social media companies.[22] Zuckerberg has refused to support KOSA in its current form.
Durov’s case is a strong indicator that the EU is increasingly prepared to hold tech executives accountable for activity on their platforms, even when the companies are not based in Europe. Proponents of the DSA laud the Act for creating accountability in a realm where ownership and responsibility are nebulous. However, the DSA could also be criticized for using social media and technology executives as scapegoats for a larger problem. Few would argue that tech executives are the root cause of false information and online crime. Accordingly, there is a valid question of whether targeting these executives and holding them vicariously liable is ultimately fair. It remains to be seen whether European courts will seek to exercise this control over American executives. For now, American tech executives are on their guard, especially with the prospect of KOSA on the horizon. While the U.S. has been hesitant to apply the same legal regulations to the tech industry that Europe has, protecting one’s citizens from harm has universal appeal. Durov’s case is a litmus test for just how much weight this appeal carries.
Soraya Mazarei is a Staff Editor at CICLR.
[1] Matthew Dalton & Kate Linebaugh, The Journal: What’s Behind the Arrest of the Telegram CEO?, Spotify & The Wall Street Journal (Sept. 3, 2024), https://www.wsj.com/podcasts/the-journal/what-behind-the-arrest-of-the-telegram-ceo/a5d3fcd1-01fd-417a-a2d4-b3927fb7e39a.
[2] Gabriel Stargardter, France Uses Tough, Untested Cybercrime Law to Target Telegram’s Durov, Reuters (Sept. 17, 2024), https://www.reuters.com/world/europe/france-uses-tough-untested-cybercrime-law-target-telegrams-durov-2024-09-17/; Dalton & Linebaugh, supra note 1.
[3] Stargardter, supra note 2.
[4] Id.
[5] Bobby Allyn, Telegram CEO Pavel Durov Indicted in France, NPR (Aug. 28, 2024), https://www.npr.org/2024/08/28/nx-s1-5091295/telegram-ceo-pavel-durov-france-indicted.
[6] Dalton & Linebaugh, supra note 1.
[7] Id.
[8] Id.
[9] Id.
[10] Matthew Dalton & Ann M. Simmons, France Detains Telegram CEO Pavel Durov, Fanning Tensions with Russia, The Wall Street Journal (Aug. 25, 2024), https://www.wsj.com/world/russia/france-detains-telegram-ceo-pavel-durov-fanning-tensions-with-russia-5a2805e1.
[11] Dalton & Linebaugh, supra note 1.
[12] Allyn, supra note 5.
[13] Amnesty International, Myanmar: Facebook’s Systems Promoted Violence against Rohingya; Meta Owes Reparations – New Report, Amnesty International (Sept. 29, 2022), https://www.amnesty.org/en/latest/news/2022/09/myanmar-facebooks-systems-promoted-violence-against-rohingya-meta-owes-reparations-new-report/.
[14] Cheyenne Haslett & Alexandra Hutzler, Zuckerberg Apologizes to Families of Kids Harmed Online as Senate Grills Tech CEOs, ABC News (Jan. 31, 2024), https://abcnews.go.com/Politics/social-media-ceos-face-grilling-senators-child-safety/story?id=106825984; Why Did Mark Zuckerberg Apologise at the US Senate?, Al Jazeera (Feb. 1, 2024), https://www.aljazeera.com/news/2024/2/1/why-did-mark-zuckerberg-apologise-at-the-us-senate.
[15] Stargardter, supra note 2.
[16] Twitter v. Taamneh, 598 U.S. 471 (2023) (holding that plaintiffs failed to prove that Twitter, Meta, and Google aided and abetted ISIS in carrying out the terrorist attack on the Reina nightclub in Istanbul in 2017 due to a lack of evidence proving the tech companies’ affirmative conduct aimed at helping ISIS).
[17] Stargardter, supra note 2.
[18] Adam Satariano & Cecilia Kang, Can Tech Executives Be Held Responsible for What Happens on Their Platforms?, The New York Times (Aug. 28, 2024), https://www.nytimes.com/2024/08/28/technology/durov-telegram-liability-platforms.html.
[19] Satariano & Kang, supra note 18.
[20] Michelle P. Scott, Section 230 Protection: Meaning, Criticism, Purpose, Investopedia (Dec. 1, 2022), https://www.investopedia.com/section-230-definition-5207317#:~:text=Section%20230%20has%20%E2%80%9Cexceptions%E2%80%9D%20(,Intellectual%20property; 47 U.S.C. § 230.
[21] Scott, supra note 20.
[22] Haslett & Hutzler, supra note 14.