TLDR:
- Telegram updated its policy to allow reporting of illegal content in private chats
- CEO Pavel Durov was arrested and released in France as part of an investigation
- Durov acknowledged Telegram’s growth has made it easier for criminals to abuse the platform
- Telegram now has 950 million users and removes millions of harmful posts daily
- The platform added new reporting options and an email for automated takedowns
Telegram, the popular messaging platform known for its strong privacy stance, has quietly updated its content moderation policies following the recent arrest and release of CEO Pavel Durov in France.
The changes, which now allow users to report illegal content in private chats, mark a significant shift in the company’s approach to handling potentially harmful material on its platform.
On September 6, 2024, Telegram revised its FAQ section to include new guidelines for reporting illegal content. Previously, the platform stated that it did not process any requests related to private chats.
The updated policy now provides users with specific steps to flag inappropriate messages for review by Telegram’s moderation team.
According to the new guidelines, users can report illegal content by tapping on a message and selecting “Report” from the menu on Android devices, or by pressing and holding a message on iOS. Desktop users can right-click on a message to access the reporting option.
Telegram has also introduced an email address (abuse@telegram.org) for automated takedown requests.
The policy update comes in the wake of Durov’s arrest by French authorities in late August. The Telegram founder was held for four days as part of a broader investigation into the platform’s alleged role in facilitating illegal activities. Durov was released on bail and is currently under judicial supervision, facing preliminary charges.
In his first public statement since the incident, Durov described his arrest as “misguided” but acknowledged that Telegram needs to improve its efforts to combat criminal abuse.
He attributed some of the platform’s challenges to its rapid growth, which has seen the user base expand to 950 million users.
Durov stated, “Our abrupt increase in user count has caused growing pains that made it easier for criminals to abuse our platform.” He emphasized that Telegram already removes “millions of harmful posts and channels every day” and rejected characterizations of the platform as an “anarchic paradise.”
The Telegram CEO also mentioned ongoing efforts to engage with regulators to find a balance between privacy and security.
Durov noted that in cases where agreement cannot be reached with a country’s regulators, the company is prepared to exit that market, as it has previously done in Russia and Iran.
Telegram’s policy update signals a potential shift in its approach to content moderation, particularly in private chats. The platform has traditionally been known for its strong stance on user privacy and minimal interference in private communications.
This change could have implications for Telegram’s reputation and user base, which includes many individuals and groups who value the platform’s previously hands-off approach.
The new reporting options are now available across all Telegram apps, allowing users to flag content for various reasons, including spam, violence, child abuse, illegal drugs, personal details, and pornography.
Telegram has also pointed European Union users to dedicated reporting guidelines, signaling a focus on compliance with EU regulations.