Each month, our panel of crypto lawyers looks at the legal implications of some of the thorniest problems facing the industry in different jurisdictions around the world.

The arrest of Telegram CEO Pavel Durov in France has reignited a global debate on the rights and responsibilities of social media platforms.

Is it right to arrest a founder for criminal behavior on their platform they had nothing to do with? Critics have likened it to arresting the head of a phone company because criminals discussed a crime on a call.

The European Union has rolled out increasingly restrictive laws with the Digital Services Act (DSA) and the General Data Protection Regulation (GDPR).

The DSA sets strict obligations for online platforms to tackle illegal content and ensure transparency. Meanwhile, the GDPR is a comprehensive law that governs how personal data is collected, processed and stored.

With vast amounts of user-generated content (UGC) flowing across global platforms, where do we draw the line between free speech, internet safety and privacy?

Magazine spoke with a panel of legal experts to find out more: Digital & Analogue Partners co-founder Catherine Smirnova in Europe, co-chair of the Hong Kong Web3 Association Joshua Chu from Asia, and Rikka Law Managing Partner Charlyn Ho from the United States.

The discussion has been edited for clarity and brevity.

Magazine: Durov has been charged in France for allegedly allowing criminal activity and illicit content on his social media and messaging platform. We don't often see tech executives held directly responsible for what happens on their platforms. Why do you think this case is different?

Ho: It surprised me that something like this could result in the arrest of a CEO. Often, there is a lot of publicity around issues of perhaps fostering or permitting illicit activity over a platform, but it doesn't typically result in the arrest of the CEO. There are so many platforms that permit the types of communications that Telegram permits. But to have the CEO arrested is quite interesting.

Smirnova: The jurisdiction was also quite surprising, I would say, because we could expect it in any country without such transparent regulation regarding digital platforms, but not in France.

From the very beginning, I didn't think that this arrest and detention was in any way connected to the creation of Telegram itself or to the DSA. There has been a lot of speculation about this now that the DSA is in force. The DSA is about companies' liability, not personal liability.

Chu: When the news broke, it was easy for us to quickly take sides because the French police also did a poor job, drip-feeding information. We had no idea what he was arrested for, and many people assumed they were looking into Telegram's messages. It later emerged that one of the main issues was certain illicit materials being published on their public platform, which is essentially a blog.

If you are a tech platform and you have been alerted by law enforcement that you are displaying child pornography, for example, you simply cant ignore it.


Magazine: There's a growing tension between platform responsibility and user freedoms. How do you see regulatory frameworks like the DSA or the Digital Markets Act reshaping the way platforms are held accountable for user content?

Smirnova: The DSA may not be as well-known as its counterpart, the DMA (Digital Markets Act). It applies to all online platforms, not just the large companies targeted by the DMA.

Originally, internet regulation in the EU and UK was based on the principle that no online platform could be liable for content posted by others. But the internet has changed significantly since its inception, and it's both fair and reasonable to find a balance. On one hand, we have the freedom of the internet and speech; on the other, we need to make the internet a safe space comparable to a city street.

In the US, you can see a similar trend. While there isn't federal regulation yet, several states have introduced laws aimed at protecting minors online. This mirrors the EU's approach, where the DSA's precursors were national laws aimed at internet safety, particularly for minors.

Ho: As Catherine said, there aren't a tremendous number of specific internet safety laws at the federal level [in the US]. There are certain laws that are broad and may potentially touch on aspects of internet safety, particularly with respect to children.

On the state level, there are pushes for laws. In California, you have the Age-Appropriate Design Code, which is modeled on the UK Age-Appropriate Design Code, but that has encountered legal challenges in the courts and hasn't been fully rolled out yet.

Internet safety is a very complex topic. There is content moderation, which may potentially be covered under the Communications Decency Act. One of the key points is that unless you're a publisher of content, you're not generally liable. But a few years ago, an amendment was passed at the federal level that did away with that liability shield for child exploitation materials. It's called SESTA. Regardless of whether you were the actual publisher of that content, there were certain liabilities that could apply to the platform.


Magazine: What limitations do local governments face when enforcing their laws on global platforms?

Chu: Data privacy in Hong Kong is governed by the Personal Data (Privacy) Ordinance (PDPO), which is often criticized as antiquated. Introduced right after the handover, it reflects standards that even the UK has since moved away from with the introduction of GDPR. Moreover, Hong Kong has several data privacy provisions that, although passed, have not been enacted for over 20 years. This situation appeals to companies: cross-border data transfer rules are not yet enforced, which, for both political and commercial reasons, has made Hong Kong an attractive business hub precisely because of the lack of regulatory change.

Tying this back to the topic of publication platforms, the issue of content removal comes into play. For instance, if you want to remove content from YouTube stored in the US, the Hong Kong government can only enforce laws within its own jurisdiction. The most they can achieve is to have it geo-blocked so it's not accessible within Hong Kong, rather than removed entirely from the internet.

A police officer is merely a tourist outside their home jurisdiction unless they have consent from another jurisdiction.

Smirnova: GDPR has significantly influenced the market. I would even say not only the European market but all markets globally.

[It's similar to] the SEC. We all know that the SEC acts like it's investigating whatever it wants across the globe, even concerning companies not headquartered in the US. The same applies to GDPR.

GDPR affects every company, regardless of where it's headquartered or whether it has legal representatives in the EU. The crucial factor is whether the company handles the private data of European citizens. GDPR also influences US regulations because regulators are always trying to harmonize their approaches to data. It has impacted all companies in many ways, such as requiring the localization of European users' data within the EU and imposing strict rules on data transfer across borders.

Ho: The way the SEC operates and how privacy laws work are not exactly comparable. The SEC is an executive agency in the US, and they frankly have a very vague scope of authority. As we've seen, there's been a lot of debate about whether they've exceeded their authority.

An executive agency in the US must be granted authority under federal law to have a specific mandate, and if they exceed that mandate, they are essentially operating outside their legal bounds. I think the SEC is not necessarily the model we should look to for how society should be governed.

Laws are passed by elected legislators, at least in Europe and the United States. Regardless of one's political stance, this is how laws are made.

In terms of privacy law, and specifically GDPR, Articles 2 and 3 clearly outline who is responsible for compliance. It's either a company established within the European Union or a company outside the EU that monitors the behavior of EU data subjects or offers goods and services to them.


Magazine: Platforms are increasingly seen as responsible for moderating harmful or illegal material. What do you see as the limits of this responsibility, and how should we balance privacy, safety and free speech?

Chu: These platforms are not law enforcement agencies and have no obligation to patrol the internet, vetting content. They are reactive, and it's up to the authorities to flag content as problematic. Even then, they must go through proper channels to address these issues. For instance, because the internet is largely borderless, the most a tech company based overseas might do, in response to a court order, is geo-block certain content. To actually remove content, one must navigate the relevant jurisdictions to obtain the necessary court orders.

Smirnova: I agree they are not the police, and their primary duty is to react when they receive information about illegal content. I wouldn't say they should receive this information only from the police, which was the norm before the DSA. The E-Commerce Directive adopted in 2000 in the EU had the same rule: You are not liable unless you, as a platform, were informed that the content is illegal. So, there were no pre-moderation obligations.

However, considering the amount of data we produce and consume every day, society needs new tools of control — in a positive sense, of course, although these can be used negatively like anything else. Especially with AI-generated content, it's unrealistic to expect a special department in the police or the FBI to be responsible for determining which content is allowed and which is not, and then to bring a claim against the platform only after a compliance process. It doesn't work like this anymore. In some countries, it still works this way, like in Brazil, where Judge [Alexandre] de Moraes has a special responsibility for the internet in a country of 200 million people.

Ho: Depending on who's using the platform, there are First Amendment issues in the United States. We've had situations where political parties have pressured media companies like Meta to suppress messages, such as those related to COVID. If the government directs a private company to suppress messages, that potentially raises constitutional issues.

Where the average person gets confused is that platforms themselves are not obligated to provide freedom of speech because they aren't the government. Only the government has to respect the Bill of Rights. A platform has every right to introduce content moderation policies, and it can determine how much or how little it wants to police content.
