Can decentralized social media really guarantee freedom of expression?

November 28, 2024

As traditional social media platforms dominate the digital conversation, are decentralized alternatives a promising counterweight to censorship, or a breeding ground for hate speech?

BeInCrypto speaks to Anurag Arjun, co-founder of blockchain infrastructure project Avail, about how decentralization can transform online discourse and governance.

Decentralized social media faces challenges in oversight and privacy

In October, X (formerly known as Twitter) suspended the Hebrew-language account of Iranian Supreme Leader Ali Khamenei for “violating the platform’s rules.” The post in question commented on Israel’s retaliatory attack on Tehran, reigniting global debates about the power centralized platforms have over public discourse.

Many wondered: should the leader of a nation be barred from commenting on airstrikes taking place within his own borders?

Beyond politically sensitive cases, everyday creators face the same treatment in far less consequential contexts. In Q2 2024, YouTube removed about 8.19 million videos flagged by its automated systems, compared with only about 238,000 flagged by human reporters.

In response, decentralized platforms like Mastodon and Lens Protocol are growing in popularity. Mastodon, for example, gained around 2.5 million active users after Elon Musk acquired Twitter in October 2022. These platforms promise to redistribute control, but that promise raises complex questions about moderation, accountability, and scalability.

“Decentralization doesn’t mean no moderation—it’s about transferring control to user communities while maintaining transparency and accountability,” Anurag Arjun, co-founder of Avail, told BeInCrypto in an interview.

Decentralized platforms aim to remove institutional influence over online discourse, allowing users themselves to define and enforce moderation standards. Unlike Facebook or YouTube, which have faced accusations of algorithmic bias and shadow bans, decentralized systems claim to foster open dialogue.

However, while decentralization removes any single point of control, it doesn’t guarantee fairness. The appetite for change is nonetheless clear: a Pew Research Center poll found that 72% of Americans believe social media companies have too much power over public discourse.

“Distributed governance ensures that no single individual or company makes unilateral decisions about what can or cannot be said, but it still requires safeguards to balance diverse viewpoints,” Arjun explains.

Decentralized platforms face the challenge of balancing freedom of expression with the control of harmful content such as hate speech, misinformation, and illegal activities. A prominent example is the controversy around Pump.fun, a platform that allowed livestreaming to promote meme coins and suspended the feature after users broadcast harmful content.

“This brings up a crucial point,” Arjun explains. “Platforms need multi-layered governance models and evidence-based mechanisms to address harmful content without becoming authoritarian.”

Artificial intelligence seems like the obvious answer. While AI tools can reportedly identify harmful content with up to 94% accuracy, they lack the nuanced judgment that sensitive cases demand. Decentralized systems therefore need to combine AI with transparent, human-led oversight to achieve effective results.
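To make that hybrid idea concrete, here is a minimal illustrative sketch of how a network might route posts: an AI classifier’s confidence score decides whether content is removed outright, escalated to community reviewers, or left alone. The thresholds, names, and logic are hypothetical assumptions for illustration, not any real platform’s pipeline.

```python
from dataclasses import dataclass

# Hypothetical confidence thresholds -- a real platform would tune these openly
AUTO_REMOVE_THRESHOLD = 0.98   # near-certain violations are removed automatically
HUMAN_REVIEW_THRESHOLD = 0.60  # ambiguous cases go to community reviewers

@dataclass
class ModerationDecision:
    action: str        # "remove", "review", or "allow"
    reason: str
    appealable: bool   # automated actions should remain open to appeal

def triage_post(ai_confidence: float) -> ModerationDecision:
    """Route a post based on an AI classifier's confidence that it is harmful."""
    if ai_confidence >= AUTO_REMOVE_THRESHOLD:
        return ModerationDecision("remove", "high-confidence AI flag", appealable=True)
    if ai_confidence >= HUMAN_REVIEW_THRESHOLD:
        # Ambiguous content is escalated to transparent, human-led review
        return ModerationDecision("review", "escalated to community moderators", appealable=True)
    return ModerationDecision("allow", "below review threshold", appealable=False)

if __name__ == "__main__":
    print(triage_post(0.99))  # removed, but still appealable
    print(triage_post(0.75))  # queued for human review
    print(triage_post(0.10))  # left up
```

The point of the sketch is the division of labor: automation handles the clear-cut cases at scale, while anything ambiguous is surfaced to humans, and every automated removal stays appealable.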

So the question remains: how do you protect people from harm, or impose any form of regulation, without first agreeing on what counts as foul play? And how must online communities reorganize themselves if they are to police themselves organically?

Decentralized governance democratizes decision-making, but it introduces new risks. Voting systems, while participatory, can marginalize minority views, reproducing the very problems decentralization set out to solve.

For example, on Polymarket, a decentralized prediction platform, majority voting has sometimes suppressed dissenting opinions, highlighting the need for safeguards.
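One way to picture such safeguards is a voting rule that requires both a quorum and a supermajority before content comes down, with an appeal window for the losing side. The sketch below is purely illustrative; the numbers are assumptions, not any platform’s actual governance parameters.

```python
# Illustrative only: the quorum, supermajority, and appeal-window values are
# hypothetical, not taken from any specific platform's governance rules.
MIN_QUORUM = 100          # minimum number of voters for a decision to be valid
SUPERMAJORITY = 0.67      # a bare 51% majority cannot remove content on its own
APPEAL_WINDOW_HOURS = 72  # the losing side can trigger an independent re-review

def resolve_vote(votes_for_removal: int, votes_against: int) -> str:
    """Decide whether a community vote to remove content takes effect."""
    total = votes_for_removal + votes_against
    if total < MIN_QUORUM:
        return "no action: quorum not reached"
    share = votes_for_removal / total
    if share >= SUPERMAJORITY:
        return f"content removed; appeal open for {APPEAL_WINDOW_HOURS}h"
    return "content stays up; a simple majority is not sufficient"

print(resolve_vote(80, 10))    # no action: too few voters
print(resolve_vote(120, 80))   # stays up: only 60% support removal
print(resolve_vote(150, 50))   # removed, subject to appeal
```

Rules like these do not eliminate majority power, but they raise the bar for it and guarantee a formal path for dissenters to be heard.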

Transparent appeal mechanisms and checks on majority power are essential to prevent new forms of censorship.

Decentralized platforms also emphasize user privacy, giving individuals control over their data and social graphs.

This independence builds trust, since users are no longer at the mercy of corporate data scandals like Facebook’s Cambridge Analytica affair in 2018, which exposed the data of 87 million users. In 2017, 79% of Facebook users trusted the company with their privacy; after the scandal, that figure dropped to 66%.

However, privacy can complicate efforts to address malicious behavior. The challenge is to keep decentralized networks secure without compromising their core principles.

“Privacy cannot come at the expense of accountability,” Arjun explains. “Platforms must adopt mechanisms that protect user data while enabling fair and transparent moderation.”

A key challenge for decentralized platforms is handling legal issues like defamation and incitement. Unlike centralized platforms such as X, which receives around 65,000 government data requests annually, decentralized networks lack clear mechanisms for legal recourse. Arjun emphasizes the importance of collaboration between platform builders and lawmakers.

The question is not whether decentralization can work, but whether it can evolve to balance freedom and responsibility in the digital age.