Telegram, the popular messaging app, has announced changes to its content moderation policies, including the removal of certain features. These changes come in response to criticism and allegations of complicity in illegal activities, particularly related to child pornography.

Telegram founder Pavel Durov revealed that the platform will be discontinuing media uploads to the “Telegraph” blog and disabling the “People Nearby” feature. While these features were used by a small minority of users, Durov stated that they were disproportionately exploited by malicious actors involved in illicit activities.

Durov emphasized Telegram’s commitment to improving its content moderation efforts. The company aims to transform content moderation from a source of criticism into a point of praise.

French prosecutors have accused Telegram of complicity in the spread of child pornography. They allege that the platform's lax content moderation allowed illegal material to circulate unchecked.

French authorities opened an investigation into Telegram after uncovering a chatroom on the platform that was being used to lure underage girls. The probe led to arrest warrants for Pavel Durov and his brother, Nikolai.

Pavel Durov recently broke his silence on the allegations, providing clarification on the events surrounding his arrest and the subsequent investigations.

In light of the criticism and legal challenges, Telegram has signaled a shift in its content moderation approach. The company has modified the language surrounding the moderation of private chats and trimmed certain features to enhance its ability to combat illegal activities.

Telegram's decision to tighten its content moderation policies is a significant shift. While the company long resisted calls for stricter oversight, these changes signal an intent to address the concerns raised and improve the platform's safety.

As Telegram deepens its ties to the cryptocurrency industry, maintaining strong content moderation practices will be essential for protecting users and satisfying regulators.