Durov claims to abide by the law, including the Digital Services Act, but his arrest raises a deeper question: who sets the limits in one's own house, that is, on a privately owned platform?
As we are all aware by now, Pavel Durov, the Russian-born billionaire and founder of the Telegram messaging app, was arrested at Le Bourget airport outside Paris after arriving on a private jet from Azerbaijan. Durov was taken into custody late Saturday by the French National Anti-Fraud Office. The arrest is reportedly related to alleged illegal activity by Telegram users, which the platform is accused of failing to moderate. An investigating magistrate has extended Durov’s detention while the case is examined; the detention can last up to 96 hours, after which the judge will decide how to proceed.
Telegram’s response and legal context
In response to the arrest, Telegram issued a statement asserting that Durov has “nothing to hide” and highlighted that the platform complies with EU regulations, including the Digital Services Act. The company emphasized that its moderation policies meet industry standards and are continuously improving.
The ethical debate: who is the ultimate decision maker?
Beyond the mere reporting of events, the Durov case has reignited the ongoing debate about freedom of speech and the power of authorities over social platforms. Politicizing the case makes little sense: Durov has refused to cooperate with Russia in the past, not only with Western governments. Setting that aside, there are two key points to analyze:
- What should Telegram do for its users?
- Who should decide the moderation policies?
Telegram markets itself as a privacy-focused messaging service, though other apps, such as Signal, offer stronger protections, including end-to-end encryption by default. Even so, Telegram’s positioning as a privacy-centric platform is a strategic choice that resonates with a large user base seeking a balance between usability and security. Given this brand identity, it makes perfect sense that Telegram resists collaborating with authorities on matters that could compromise user data. Such cooperation could undermine the very value proposition that has attracted millions of users to the platform.
If Telegram were to capitulate to demands for data sharing or content moderation in ways that violate user privacy, it would risk eroding the trust that underpins its entire business model. Users who prioritize privacy might abandon the platform in favor of alternatives that offer stronger guarantees of confidentiality and security. Therefore, Telegram’s stance on non-cooperation with authorities is not merely a matter of principle but a strategic decision to protect its core offering.
Self-moderation is the only way
A crucial aspect of the debate on moderation concerns the capacity and legitimacy of digital platforms to control content. When a company like Telegram introduces the possibility of moderating content, several fundamental questions arise. First, in order to decide which content should be moderated, someone has to read it. This immediately raises a privacy concern: who guarantees that users’ private messages are not being read more broadly? If anyone other than the participants can read a message, privacy is already compromised; the sketch below illustrates the trade-off.
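To make the first point concrete, here is a minimal sketch, not Telegram’s actual architecture, of why server-side moderation presupposes plaintext access. The keyword filter, the `BANNED_WORDS` set, and the use of the Fernet cipher from the `cryptography` package are all illustrative assumptions.

```python
# Minimal sketch, assuming a toy keyword filter and the `cryptography`
# package (pip install cryptography). Not Telegram's real pipeline.
from cryptography.fernet import Fernet

BANNED_WORDS = {"contraband"}  # hypothetical moderation rule


def server_side_filter(message: bytes) -> bool:
    """A server-side filter can only flag content it can actually read."""
    text = message.decode(errors="ignore")
    return any(word in text for word in BANNED_WORDS)


plaintext = b"selling contraband tonight"

# Case 1: the server stores messages in plaintext.
# Moderation works, but so does any other inspection of private chats.
print(server_side_filter(plaintext))  # True

# Case 2: end-to-end encryption; only the chat participants hold the key.
key = Fernet.generate_key()              # generated client-side, never shared with the server
ciphertext = Fernet(key).encrypt(plaintext)
print(server_side_filter(ciphertext))    # False: the server sees only ciphertext

# The recipient, who holds the key, can still read the message.
print(Fernet(key).decrypt(ciphertext))   # b'selling contraband tonight'
```

Whatever form the filter takes, keywords, hashes, or classifiers, the operator’s scanning code must be able to read the content, which is exactly the tension described above.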
Second, who decides what is legitimate? The notion of legitimacy is fluid: it varies from country to country, from government to government, and even within the same society over time. What the European Union considers acceptable today could be deemed illegitimate tomorrow, or vice versa, and the same applies to every other global actor. So whose standards should prevail? The answer: the owner’s.
The only way to find common ground in this global context is to rely not on existing laws, which often contradict one another, but on the principle of private property. Telegram itself should be the sole authority deciding what to moderate on its own platform and how. The market will then judge its actions: users, through their choices, will determine whether Telegram’s moderation policies are fair. If users believe Telegram manages moderation fairly and respectfully, they will keep using the service; otherwise, they will migrate to other platforms that better meet their needs for freedom and privacy.