What makes Telegram so controversial?
Telegram's rise to prominence over the last decade has come at a price for the popular messaging app. It is praised for its privacy and freedom and, in equal measure, criticized for the dark undercurrents it harbors. To understand how the Telegram controversy came to a head with the arrest of its founder in Paris last month, it helps to look at how the app became so ubiquitous in the first place.
A meteoric rise
Some argue that Telegram has been a victim of its own success. Whether or not that's true, the app's surge to mainstream adoption is beyond dispute. It is now the fourth most popular messaging app on the planet, with 900 million monthly active users worldwide.
Founded in 2013 by Russian-born entrepreneur Pavel Durov, Telegram was created as a secure messaging platform in response to increasing government scrutiny in Durov's native Russia. Durov, who previously founded the Russian social network VKontakte (VK), envisioned Telegram as a tool that would prioritize user privacy and freedom of speech, offering end-to-end encrypted Secret Chats and a commitment to resist government censorship.
It is now the fourth most popular messaging app on the planet.
Telegram's success and controversy can both be attributed to several key features that set it apart from other messaging apps. Its encryption options and emphasis on privacy quickly gained the trust of users concerned about their data security. The app's availability across multiple platforms, including mobile devices and desktops, made it accessible to a global audience. Telegram's support for large group chats of up to 200,000 members, alongside broadcast channels, allowed communities to form and thrive, especially in regions where free speech is suppressed.
Other factors expanded its functionality and appeal, notably the platform's open Bot API, which encourages developers to build bots and services on top of it. These elements combined to propel Telegram into widespread use, particularly among those seeking an alternative to more regulated platforms like WhatsApp or Facebook. The app acts as a flagbearer for the right to freedom of speech, but that right is not always exercised virtuously by its users.
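To give a sense of how low the barrier to entry is for developers, here is a minimal sketch of a bot built on Telegram's public Bot API, written in Python with the requests library. The token shown is a placeholder of the kind issued by Telegram's @BotFather, and the echo behavior is purely illustrative rather than a real-world bot design.

```python
import time
import requests

# Placeholder token; a real one is issued by Telegram's @BotFather.
BOT_TOKEN = "123456:REPLACE-WITH-YOUR-TOKEN"
API = f"https://api.telegram.org/bot{BOT_TOKEN}"


def run_echo_bot():
    """Long-poll getUpdates and echo any text message back to its chat."""
    offset = None
    while True:
        params = {"timeout": 30}  # long polling keeps the request count low
        if offset is not None:
            params["offset"] = offset  # acknowledge updates already handled
        updates = requests.get(f"{API}/getUpdates", params=params, timeout=60).json()
        for update in updates.get("result", []):
            offset = update["update_id"] + 1
            message = update.get("message", {})
            if message.get("text"):
                # sendMessage returns the same text to the originating chat.
                requests.post(
                    f"{API}/sendMessage",
                    data={"chat_id": message["chat"]["id"], "text": message["text"]},
                    timeout=30,
                )
        time.sleep(1)


if __name__ == "__main__":
    run_echo_bot()
```

A handful of plain HTTPS calls is all it takes, which goes some way toward explaining why bots and third-party services proliferated on the platform.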
Telegram courts controversy
Even before the recent legal issues for its founder, Telegram was no stranger to controversy. The same elements of the app that make it popular also cultivate a significant dark side. The privacy features that have been a boon for users seeking secure communication have also made the platform a magnet for illegal activities, while the app’s commitment to minimal content moderation has attracted a range of malicious actors.
Terrorist organizations have utilized Telegram to spread propaganda, recruit members, and coordinate attacks. The app’s encrypted communication and large group capabilities provide an ideal environment for these activities.
Cybercriminals have also found a haven in Telegram, using it as a hub for illegal transactions, including drug trafficking, the sale of counterfeit goods, and even human trafficking. The anonymity of Telegram’s channels and private groups makes it difficult for authorities to trace these activities. Illicit activities that once took place on the dark web — requiring a special browser, a VPN, and other tools — now take place on an app that anyone can download to their phone.
The approach to content moderation is a feature, not a bug.
Telegram's hands-off approach to content moderation has led to the unchecked spread of misinformation, conspiracy theories, and hate speech. This lack of oversight has allowed harmful narratives to flourish, yet Telegram's popularity among political dissidents in authoritarian regimes highlights the platform's double-edged nature. While social networks like Facebook employ tens of thousands of content moderators worldwide, the entire Telegram operation consists of fewer than 60 people.
To Durov, this approach to content moderation is a feature, not a bug. Telegram also built a reputation for refusing or ignoring requests from law enforcement agencies for assistance in their investigations. Such was Telegram's protective stance toward its users that many authorities simply gave up making requests.
The arrest of Telegram’s founder
In August 2024, Pavel Durov was arrested in France, marking a significant moment both in the Telegram controversy and in the broader debate over accountability for online content. The arrest was part of a wider investigation into criminal activities conducted via the platform, including the spread of child sexual abuse material, fraud, and drug trafficking. French authorities cited Telegram's "almost total lack of response" to law enforcement requests as a key factor in the arrest.
In other words, the authorities have accused Durov of complicity by allowing such activities to take place on the platform and then failing to assist law enforcement in pursuing them.
The hashtag #FreePavel quickly gained traction on social media.
Durov, who has long championed privacy over government demands, responded by defending Telegram’s approach. He argued that it is unfair to hold him personally responsible for the actions of the platform’s users, emphasizing that the app was designed to protect free speech and privacy. He also hinted at upcoming changes aimed at addressing the abuse of Telegram, signaling a potential shift in how the platform is managed.
The arrest has sparked a global debate about the balance between privacy and security, with many viewing Durov as a martyr for free speech and others seeing his detention as a necessary step toward regulating online spaces that harbor illegal activity. The hashtag #FreePavel quickly gained traction on social media, with figures like Elon Musk and Edward Snowden expressing support for Durov.
What next for Telegram?
The arrest of Durov could lead to significant changes for Telegram. As legal and regulatory pressures mount, the platform may be forced to reconsider its approach to content moderation and cooperation with law enforcement.
Following Durov’s arrest, Telegram has already agreed to share user data with law enforcement agencies under certain conditions. It has announced that it will now share the IP addresses and phone numbers of users who violate its rules when it receives valid legal requests from authorities. This change is a significant departure from the platform’s previous practices and is likely aimed at pacifying prosecutors and regulatory bodies, particularly in Europe.
This wasn't the only development over the past month. Telegram had reporting tools before, but they applied only to public-facing content, such as bots, stickers, and channels. The platform has now quietly extended that policy, with the Telegram FAQs making clear that users have the tools to report illegal content anywhere on the app.
Durov's situation is unprecedented.
The future of Telegram is uncertain, but the arrest of its founder and the surrounding controversy underscore the complex challenges facing platforms of its kind. It remains to be seen whether Telegram can continue to be a haven for free speech or whether it will be forced to compromise even further on its core principles.
Durov's fate is equally unclear: no messaging app founder has ever been held criminally liable for content posted on their platform. There have been many civil suits, and Mark Zuckerberg was called to testify before Congress, but Durov's situation is unprecedented. What happens to both Telegram and Durov could have far-reaching implications, and the whole industry will be watching.