Everyone seems to agree that social media has become a cesspit, with cancel-culture mobs enforcing ideological purity on one side and trolls spreading conspiracy theories on the other.

X and Facebook are accused of amplifying hatred and conflict, with riots in the United Kingdom highlighting how a handful of social media posts can ignite a cauldron of simmering anger and resentment.

In response, governments around the world are cracking down on free speech. Turkey banned Instagram, Venezuela banned X and the UK government has been sending people to jail for inciting violence and in some cases, just for having shitty opinions.

But there's a better way to fix social media than banning accounts and censoring misinformation.

The root cause of the problem isn't fake news in individual posts; it's how social media algorithms prioritize conflict and highlight the most polarizing content in a bid for engagement and ad dollars.

"This is going to sound a little bit crazy, but I think the free speech debate is a complete distraction right now," former Twitter boss Jack Dorsey told the Oslo Freedom Forum in June. "I think the real debate should be about free will."

A marketplace of social media algorithms

Dorsey argues that black-box social media algorithms are impacting our agency by "twisting our reality" and "hacking our mind space." He believes the solution is to enable users to choose between different algorithms to have greater control over the sort of content they serve up.

"Give people choice of what algorithm they want to use, from a party that they trust. Give people choice to build their own algorithm that they can plug in on top of these networks and see what they want. And they can shift them out as well. And give people choice to have, really, a marketplace."

It's a simple but compelling idea, though there's a truckload of hurdles to overcome before a mainstream platform would willingly give users a choice of algorithm.

Why social media platforms will resist algorithmic choice

Princeton computer science professor Arvind Narayanan has extensively researched the impact of social media algorithms on society. He tells Cointelegraph that Dorsey's idea is great but unlikely to happen on the big platforms.

"A marketplace of algorithms is an important intervention. Centralized social media platforms don't allow users nearly enough control over their feed, and the trend is toward less and less control, as recommendation algorithms play a more central role," he explains.

"I expect that centralized platforms won't allow third-party algorithms for the same reasons they don't provide user controls in the first place. That's why decentralized social media is so important."

There are some early experiments on decentralized platforms like Farcaster and Nostr, but Twitter spinoff Bluesky is the most advanced and already has this functionality built in. However, it's only been used so far for specialty content feeds.


Bluesky to trial algorithm choice

But Northwestern University Assistant Professor William Brady tells Cointelegraph he'll be trialing a new algorithm on Bluesky in the coming months that will be offered as an alternative to the site's main algorithm.

"Studies have shown that up to 90% of the political content we see online comes from a tiny minority of highly motivated and partisan users. So trying to reduce some of their influence is one key feature," he says.

The "representative diversification" algorithm aims to surface the most common views rather than the most extreme ones, without making the feed vanilla.

"We're actually not getting rid of strong moral or political opinions, because we think that's important for democracy. But we're getting rid of some of that most toxic content that we know is associated with the most extreme people on that distribution."
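In pseudocode, the approach Brady describes could look something like the sketch below. It assumes each post carries an author ID and a toxicity score; the cutoff value and the volume-based reweighting are illustrative assumptions, not details of Brady's actual design.

```python
# Hedged sketch of a "representative diversification" re-ranking:
# trim the most toxic tail of the distribution, then downweight
# prolific authors so a small partisan minority can't dominate.
from collections import Counter


def diversify(posts, toxicity_cutoff=0.9):
    """Re-rank a feed: drop posts above the toxicity cutoff, then
    order the rest so low-volume authors surface first."""
    kept = [p for p in posts if p["toxicity"] < toxicity_cutoff]
    # Count how many surviving posts each author contributed.
    volume = Counter(p["author"] for p in kept)
    # Prolific posters sink; occasional posters rise (stable sort).
    return sorted(kept, key=lambda p: volume[p["author"]])
```

Strong opinions survive this pass; only the extreme tail is removed, which matches Brady's stated goal of keeping moral and political speech in the feed.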

Create a personalized algorithm using AI

Approaching the subject from a different direction, Groq AI researcher and engineer Rick Lamers recently developed an open-source browser extension that works on desktop and mobile. It scans and assesses posts from people you follow and auto-hides posts based on content and sentiment.

Lamers tells Cointelegraph he created it so that he could follow people on X for their posts about AI, without having to read inflammatory political content.

"I needed something in between unfollowing and following all content, which led to selectively hiding posts based on topics with an LLM/AI."

The use of large language models (LLMs) to sort through social media content opens up the intriguing possibility of designing personalized algorithms that do not require social platforms to agree to change.

But reordering content on your feed is a much bigger challenge than simply hiding posts, Lamers says, so we're not there yet.
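The hide-or-show approach Lamers describes can be sketched in a few lines. In his extension the classification is done by an LLM; here a simple keyword heuristic stands in for that call so the example is self-contained, and the topic labels and function names are illustrative assumptions.

```python
# Sketch of LLM-assisted post filtering: classify each post's topic,
# then hide anything on the user's blocklist. classify_post is a
# stand-in for a real LLM call (here, a keyword heuristic).

HIDE_TOPICS = {"politics"}  # topics the user has opted to hide


def classify_post(text: str) -> str:
    """Placeholder for an LLM classification call: return a topic label."""
    political_terms = {"election", "senate", "protest", "policy"}
    words = {w.strip(".,!?").lower() for w in text.split()}
    return "politics" if words & political_terms else "other"


def should_hide(text: str) -> bool:
    """Hide a post when its topic is on the user's blocklist."""
    return classify_post(text) in HIDE_TOPICS


posts = [
    "New open-weights model tops the leaderboard",
    "The senate vote on the policy sparked protest",
]
visible = [p for p in posts if not should_hide(p)]
```

Because the filter runs client-side on whatever the platform already serves, it needs no cooperation from X, which is the key property Lamers exploits.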

How social media algorithms amplify conflict

When social media first began in the early 2000s, content was displayed in chronological order. But in 2011, Facebook's news feed started choosing "Top Stories" to show users.

Twitter followed suit in 2015 with its "While You Were Away" feature and moved to an algorithmic timeline in 2016. The world as we knew it ended.

Although everyone claims to hate social media algorithms, they're actually very useful in helping users wade through an ocean of content to find the most interesting and engaging posts.

Dan Romero, the founder of the decentralized platform Farcaster, points Cointelegraph to a thread he wrote on the topic. He says that every world-class consumer app uses machine learning-based feeds because that's what users want.

"This is [the] overwhelming consumer revealed preference in terms of time spent," he said.

Unfortunately, the algorithms quickly learned that the content people are most likely to engage with involves conflict and hatred, polarizing political views, conspiracy theories, outrage and public shaming.

"You open your feed and you are smashed with the same stuff," says Dave Catudal, the co-founder of the SocialFi platform Lyvely.

"I don't want to be bombarded with Yemen and Iran and Gaza and Israel and all that [...] They are clearly pushing some kind of political, disruptive conflict. They want conflict."

Studies show that algorithms consistently amplify moral, emotional and group-based content. Brady explains this is an evolutionary adaptation.

"We have biases to pay attention to this type of content because in small group settings, this actually gives us an advantage," he says. "If you are paying attention to emotional content in your environment, it helps you survive physical and social threats."


Social media bubbles work differently

The old concept of the social media "bubble," where users only see content they agree with, is not really accurate.

While bubbles do exist, research shows that users are exposed to more opinions and ideas that they hate than ever before. That's because they are more likely to engage with content that enrages them, either by getting into an argument, dunking on it in a quote tweet, or via a pile-on.

Content that you hate is like quicksand: the more you fight against it, the more the algorithm serves up. Yet the feed still reinforces people's beliefs and darkest fears by highlighting the absolute worst takes from the other side.

Like cigarette companies in the 1970s, platforms are well aware of the harm their focus on engagement causes to individuals and society, but it appears that there's too much money at stake to change direction.

Meta made $38.32 billion in ad revenue last quarter (98% of its total revenue), with Meta's chief financial officer, Susan Li, attributing much of this to AI-driven ad placements. Facebook has trialed the use of "bridging" algorithms, which aim to bring people together rather than divide them, but elected not to put them into production.

Bluesky, Nostr and Farcaster: Marketplace of algorithms

Dorsey also realized he wasn't going to be able to bring meaningful change to Twitter, so he created Bluesky in an attempt to build an open-source, decentralized alternative. But disillusioned with Bluesky making many of the same mistakes as Twitter, he's now thrown his weight behind Bitcoin-friendly Nostr.

The decentralized network allows users to choose which clients and relays to use, potentially offering users a wide choice of algorithms.

But one big issue for decentralized platforms is that building a decent algorithm is a massive undertaking that is likely beyond the community's abilities.

A team of developers built a decentralized feed market for Farcaster for the Paradigm hackathon last October, but no one seemed interested. 

The reason, according to Romero, was that community-built feeds were "unlikely to be performant and economic enough for a modern, at-scale consumer UX. Might work as an open source, self-hosted type client."

"Making a good machine learning feed is hard and requires significant resources to make performant and real-time," he said in another thread.

"If you want to do a feed marketplace with good UX, you'd likely need to create a back end where developers would upload their models and the client runs the model in their [infrastructure]. This obviously has privacy concerns, but maybe possible."

A bigger problem, however, is that "it's TBD if consumers would be willing to pay for your algo."
