AI deepfakes threaten the KYC checks of crypto exchanges

The recent mainstream spread of AI technology has fueled the growth of fraudulent impersonation techniques known as deepfakes, which now also threaten the KYC checks of crypto exchanges.

Recently, a new tool called ProKYC has emerged on the web, capable of evading the "Know Your Customer" (KYC) identity checks of exchanges.

It represents a new level of highly sophisticated financial fraud, capable of causing serious damage to crypto companies.

To this are added increasingly frequent data breaches, which unfortunately also involve well-known investment companies, putting their clients' online information at risk.

Let’s see all the details below.

AI: the advent of the deepfake threat and the KYC security measures of the exchanges 

In the last two years, the expansion of AI technology has driven an increase in financial scams on the web, led by the ever more widespread use of deepfake videos.

These videos, created with the help of artificial intelligence, allow scammers to impersonate famous people and celebrities by accurately replicating their appearance and voice. With these videos, the fraudsters then lure their less attentive victims into clicking phishing links or purchasing scam tokens.

On Twitter, YouTube, and other social media, we increasingly see cases like this, with deepfake videos featuring the faces of Cristiano Ronaldo, Elon Musk, and others.

Sometimes these models are even used in corporate video calls, aiming to deceive collaborators, employees, and managers in order to extract valuable data.

Deepfake AI scammers stole $26 million by impersonating company’s CFO in a video call pic.twitter.com/6k0TqvxIcr

— Dexerto (@Dexerto) February 6, 2024

Unsurprisingly, a good portion of deepfake fraud is tied to the world of cryptocurrencies, given the pseudo-anonymity that the scammers themselves exploit to cash out.

There are, however, security measures, such as the KYC verification of exchanges, designed to prevent AI from being exploited with malicious intent.

Exchange platforms use KYC when registering new users, requiring photos, videos, and biometric face analysis, and presenting anti-deepfake barriers.

In this way, exchanges prevent their platforms from being exploited for illegal activities such as money laundering and cashing out scam proceeds.
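As a rough illustration of the biometric cross-check at the heart of these KYC flows, the sketch below compares face embeddings from an ID photo and a live selfie. The embedding vectors, the cosine-similarity metric, and the 0.8 threshold are all illustrative assumptions, not any exchange's actual implementation:

```python
import math
import random

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def kyc_face_match(id_embedding, selfie_embedding, threshold=0.8):
    """Accept the applicant only if the live selfie matches the ID photo."""
    return cosine_similarity(id_embedding, selfie_embedding) >= threshold

# Toy embeddings standing in for a real face-recognition model's output.
random.seed(42)
id_vec = [random.gauss(0, 1) for _ in range(128)]
same_person = [x + random.gauss(0, 0.05) for x in id_vec]   # small variation
other_person = [random.gauss(0, 1) for _ in range(128)]     # unrelated face

print(kyc_face_match(id_vec, same_person))    # similar embeddings -> True
print(kyc_face_match(id_vec, other_person))   # dissimilar -> False
```

A tool like ProKYC attacks exactly this step: if the generated video fools the upstream face-recognition model into producing an embedding close to the forged ID's, the threshold check passes.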

Now, however, we are entering a new phase of AI scams, with highly sophisticated technologies that sometimes manage to hit even the largest crypto exchanges.

The potential damage they can cause is enormous if they are not blocked with new countermeasures.

A new AI-based deepfake tool capable of bypassing the KYC of cryptocurrency exchanges

According to the cybersecurity company Cato Networks, an AI deepfake-creation tool is spreading that manages to evade the KYC checks of exchanges.

This is ProKYC, a tool available for an annual subscription of 629 dollars, with which fraudsters can create fake IDs and generate convincing videos.

The package includes a camera, a virtual emulator, facial animations, fingerprints, and generation of verification photos.

The tool has been tailored specifically for financial companies whose KYC protocols require cross-checking biometric face scans against documents.

The forgeries look so real that they manage to cross a barrier that, until today, even the best AI systems had never crossed.

A deepfake created with ProKYC and published in Cato's blog post shows how the AI can generate fake ID documents that bypass the checks of the Bybit exchange.

The user completes the platform’s KYC verification using a fictitious name, a fake document, and an artificial video.

🚨 New threat research!

🕵️‍♂️ Cato CTRL has discovered a threat actor, ProKYC, selling a deepfake tool in the cybercriminal underground to enable new account fraud against cryptocurrency exchanges.

🔗 Blog: https://t.co/STI3xjCUVd#NetworkSecurity #deepfake #SASE pic.twitter.com/sAKrR2MaYb

— Cato Networks (@CatoNetworks) October 10, 2024

In a recent report, Cato Networks' head of security Etay Maor stated that ProKYC represents a new level of sophistication in financial scams.

The new AI tool represents a significant step forward compared to the old-fashioned methods cybercriminals used to beat two-factor authentication and KYC. Scammers no longer need to purchase fake (or stolen) identities on the deep web; they can simply take advantage of this convenient new service.

Identity fraud surges in the crypto world with AI and data breaches

The advent of ProKYC, with deepfake AI that bypasses the KYC checks of exchanges, risks driving a further increase in financial scams that are already at dangerously high levels.

Threat actors are now able to create multiple accounts on exchanges through a scheme known as New Account Fraud (NAF).

Furthermore, ProKYC claims to be capable of bypassing the KYC measures of the payment platforms Stripe and Revolut as well.

Inevitably, if financial platforms do not develop new systems to detect deepfakes, they could suffer serious damage.

The problem is that overly strict biometric controls risk generating "false positives," making the user experience extremely frustrating.

As Cato states on this subject:

“The creation of super restrictive biometric authentication systems can lead to many false positive warnings. On the other hand, lax controls can lead to fraud.”

There are, however, alternative detection methods against deepfake AI tools, which keep a human in the loop as a controller.

An exchange employee should manually review images and videos, looking for inconsistencies in facial movements and image quality.
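A simple automated heuristic can pre-screen clips for that kind of human review. The sketch below flags videos whose frame-to-frame pixel differences spike abruptly, a crude stand-in for the temporal glitches deepfake generators can leave; the metric, the 3x spike threshold, and the toy frames are illustrative assumptions, not a method described by Cato:

```python
def frame_deltas(frames):
    """Mean absolute pixel difference between consecutive frames.
    `frames` is a list of equal-length grayscale pixel lists."""
    return [
        sum(abs(a - b) for a, b in zip(prev, curr)) / len(curr)
        for prev, curr in zip(frames, frames[1:])
    ]

def flag_suspicious(frames, spike_ratio=3.0):
    """Flag a clip if any inter-frame delta jumps far above the average,
    a rough proxy for the temporal artifacts of face-swapped footage."""
    deltas = frame_deltas(frames)
    average = sum(deltas) / len(deltas)
    return any(d > spike_ratio * average for d in deltas)

# Smooth, natural motion: small, steady changes between frames.
smooth = [[i + t for i in range(16)] for t in range(8)]
# Glitchy clip: one frame jumps sharply, as a badly blended face might.
glitchy = [row[:] for row in smooth]
glitchy[4] = [p + 50 for p in glitchy[4]]

print(flag_suspicious(smooth))    # False: steady motion
print(flag_suspicious(glitchy))   # True: abrupt spike at frame 4
```

A flagged clip would still go to a human reviewer; the heuristic only narrows down which videos deserve manual scrutiny.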

Data breaches and identity leaks are a separate matter, however: they exploit real documents and real evidence to launch a wide range of online scams.

Just a few days ago, the investment company Fidelity suffered a data breach affecting over 77,000 clients. Driver's licenses, Social Security numbers, and personal information were stolen, which will likely be sold on the dark web and used for various types of crypto scams.

JUST IN: $5.4 trillion asset manager Fidelity confirms 77,000+ customer records were hacked, including license, social security numbers, and personal information.

— Watcher.Guru (@WatcherGuru) October 10, 2024

We remind you in any case that the penalties for identity fraud in the United States can be severe and vary depending on the nature and extent of the crime. 

The maximum penalty is up to 15 years of imprisonment with heavy fines.