Both AI and crypto move at breakneck speed and are deeply technical, making them difficult to regulate — but whistleblowers are being silenced.

Another week, and another warning about artificial intelligence.

But this open letter — expressing fears it could exacerbate inequalities, fuel misinformation, and lead to uncontrollable AI systems that “potentially result in human extinction” — hit differently.

Why? Because four of the anonymous signatories are current employees of OpenAI, the maker of the wildly popular ChatGPT. Six others used to work there.

The fact that so many people intimately involved in bringing AI to the masses fear for the future is significant. While they do believe this still-nascent technology could deliver “unprecedented benefits” to humanity, they fear the public — and regulators — aren’t getting the whole picture.

“AI companies possess substantial non-public information about the capabilities and limitations of their systems, the adequacy of their protective measures, and the risk levels of different kinds of harm. However, they currently have only weak obligations to share some of this information with governments, and none with civil society. We do not think they can all be relied upon to share it voluntarily.”

Right to Warn

The parallels between artificial intelligence and the crypto space are pretty stark. Both industries move at breakneck speed and are deeply technical. This creates huge stumbling blocks for governments and regulators alike. For one, some politicians can find it challenging to get their heads around the issue itself. Just ask U.S. Representative Brad Sherman, who infamously referred to Bitcoin’s creator as “Saratoshi Nagamoto.” 

https://twitter.com/guti_uno/status/1684383259025883137

From here, it becomes difficult to draft well-informed laws that encourage innovation among good actors while deterring criminality among the bad. And by the time the authorities have caught up, such industries are often so unrecognizable that the legislation on the table fails to reflect the realities of how the technology is being used… and where the biggest risks lie. It’s telling that, more than 15 years after Bitcoin first launched, there’s still significant regulatory paralysis concerning cryptocurrencies in the U.S.

As the AI-focused open letter notes, a lack of effective government oversight means there is a huge dependence on whistleblowers within companies to hold them accountable. One of the authors’ biggest concerns relates to how confidentiality agreements effectively gag them from speaking out.

“Ordinary whistleblower protections are insufficient because they focus on illegal activity, whereas many of the risks we are concerned about are not yet regulated. Some of us reasonably fear various forms of retaliation.”

Right to Warn

Again, symmetry exists between artificial intelligence and crypto here — as evidenced by a recent, in-depth, and damning report released by an independent examiner tasked with investigating FTX’s implosion in 2022. In that case, it was found that six anonymous whistleblowers with legitimate concerns were paid off to the tune of $25 million. One was told to apologize to now-jailed CEO Sam Bankman-Fried, and ended up reaching a settlement for $16 million after resigning from their role.

While the crypto industry has made promising strides to right past wrongs following a slew of bankruptcies in recent years — BlockFi, Voyager and Celsius among them — you could argue there is still more work to be done. And that makes the four commitments asked of AI firms in this open letter especially applicable to the digital assets sector.

There’s a request for leading artificial intelligence companies to refrain from enforcing clauses that stop current employees from voicing criticism over emerging risks, and to introduce anonymous procedures so concerns can be raised with boards, regulators and experts. Some of crypto’s biggest controversies could have been avoided if similar safeguards were in place.

As well as embracing a culture of open criticism, there’s a call for leaders in AI to vow that they won’t retaliate against workers who release confidential information after exhausting every other avenue for escalating an issue.

It’s unclear how much this open letter will move the needle in the quest to regulate AI. And there’s something to be said for the inherent transparency of blockchain technology, where the flow of funds — and records of transactions — can be monitored in real time. Large language models, typically built behind closed doors, are far more opaque by comparison.

But the consequences of failing to act and the potential harms that everyday consumers face are equally dire across both industries. Too many crypto investors have lost their life savings because they weren’t adequately informed of the risks, while a lack of coordination among international regulators has allowed offshore bad actors to go unchecked. And as AI gets smarter and more intuitive to use with every passing day, the livelihoods of millions of hard-working people could now be jeopardized too.