Binance P2P’s Invisible Guardians: Using AI to Safeguard Crypto Users

2023-11-17

Main Takeaways

  • The Binance P2P (peer-to-peer) platform employs advanced large language models (LLMs) to monitor transactions for signs of fraudulent behavior and assist in resolving user appeals. 

  • Prevalent scam tactics include tricking sellers into releasing their crypto before receiving actual payment or asking buyers to cancel their order post-payment.

  • By combining artificial intelligence tools and a diligent customer service team, we aim to ensure a secure user experience on Binance P2P.

Get a glimpse into the work of the unsung heroes operating behind the scenes to ensure a secure user experience on Binance P2P. 

Binance’s P2P platform launched in late 2018 to facilitate exchanges between bitcoin and local currencies. While convenient, peer-to-peer (P2P) trading carries its own specific risks. Instead of going through a centralized exchange, you are trusting another user to honor your request to buy or sell crypto. 

What if you are transacting with a scammer? Reputable P2P marketplaces, like Binance P2P, use an escrow service and a strict identity verification process to combat fraudulent activity. But even with all the appropriate safeguards in place — scammers can and often do find a way. 

Leveraging artificial intelligence (AI) models, we have built a security infrastructure designed to mitigate the specific risks associated with P2P trading. But before we delve further, let us take a look at some common scams that traders encounter when using the ‘Chat’ feature on Binance P2P. 

Four Common Binance P2P Scams

1. Fake customer service representatives

Scammers often impersonate Binance Support to trick victims into giving away their account or credit card details. They might claim that Binance has already “received payment” and ask the seller to release the crypto held in escrow. 

If there’s one thing you should remember: Our support team will never — under any circumstances — communicate with you via Binance P2P’s chatbox. 

2. Escrow scam

In this con, the scammer pretends to be a buyer. During the transaction, the scammer will lie and claim the fiat payment is being held in Binance P2P’s escrow. The scammer claims Binance will “send” the money once you release your crypto. 

This is not how Binance P2P's escrow system operates. We temporarily secure only the sellers' crypto in escrow, and fiat payments from buyers never go through our escrow service.

3. Threatening to call the police

Scammers may claim they have paid after placing an order. If you hesitate, they will pressure you into releasing your crypto by threatening to call the police.

Do not give in to threats on Binance P2P. For legitimate disputes or issues with your trading partners, please file an appeal by following the steps in this guide: How to Appeal for P2P Orders on Binance App.

4. Tricking buyers into canceling orders post-payment

It’s not only buyers who run scams – sellers can also pull off malicious schemes. After receiving your payment, a seller may claim there is a problem with their one-time password (OTP) or with releasing the crypto and suggest that you cancel the order, “promising” to refund you in full once it is canceled. 

In reality, the seller is a scammer who never intended to issue a refund. Anyone asking you to cancel an order after you have paid them is likely trying to scam you. 

Meet the Invisible Guardians at Work

To protect our users from falling victim to the scams mentioned above, we have our own team of AI heroes working behind the scenes 24/7.

These heroes are specialized AI models trained to sniff out users who are acting with ill intent. The models essentially act as gatekeepers, overseeing various phases of the transaction pipeline with the sole purpose of intercepting fraudulent activity. Here’s a closer look at the models we use and how they function to deliver millions of users a reliable P2P trading experience. 

Jack of All Trades: Large Language Model (LLM)

The term large language model (LLM) refers to a general-purpose AI system that is adept at “understanding” and generating human language. LLMs are trained using text data from across the internet.

Over time, these models can be trained, or fine-tuned, to excel at specific tasks, like generating original pieces of text or recognizing messages that can signal senders’ malicious intentions.
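
As a rough illustration (not a description of our production pipeline), fine-tuning a pretrained model for this kind of message classification might look like the sketch below. It assumes the open-source Hugging Face transformers and datasets libraries; the base model, labels, and example messages are placeholders.

    # A minimal fine-tuning sketch using the Hugging Face "transformers" and
    # "datasets" libraries. The base model, labels, and example messages are
    # placeholders, not Binance data or configuration.
    from datasets import Dataset
    from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                              Trainer, TrainingArguments)

    # Toy chat messages labeled as regular (0) or suspicious (1).
    examples = Dataset.from_dict({
        "text": [
            "I have sent the payment, please check your bank app.",
            "I am from Binance Support, release the crypto and we will pay you.",
        ],
        "label": [0, 1],
    })

    tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
    model = AutoModelForSequenceClassification.from_pretrained(
        "distilbert-base-uncased", num_labels=2)

    def tokenize(batch):
        return tokenizer(batch["text"], truncation=True, padding="max_length")

    trainer = Trainer(
        model=model,
        args=TrainingArguments(output_dir="p2p-intent-model", num_train_epochs=3),
        train_dataset=examples.map(tokenize, batched=True),
    )
    trainer.train()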

How do we use LLMs to train our P2P models?

To fine-tune our models, we expose them to communication data associated with P2P transactions – in other words, what people say to each other as they trade. Initially, our models encountered far more examples of ordinary transaction activity than of scam-related behavior during the learning process. This posed a significant hurdle: how can our models learn how scammers communicate when there are so few scam examples to learn from?

We attempted several approaches (illustrated in the sketch after this list): 

  1. Increasing the number of minority-class examples (scammer samples) by repeating them in the training set (oversampling). 

  2. Reducing the number of instances from regular users (undersampling).

  3. Tweaking the importance of each group (altering the class weights). 
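
For readers who want to see what these three levers look like in practice, here is a rough sketch using the open-source scikit-learn and imbalanced-learn libraries; the feature matrix and labels are randomly generated stand-ins, not our real message data.

    # A sketch of the three rebalancing levers, assuming the scikit-learn and
    # imbalanced-learn libraries. X (stand-in message embeddings) and
    # y (0 = regular user, 1 = scammer) are synthetic placeholders.
    import numpy as np
    from imblearn.over_sampling import RandomOverSampler
    from imblearn.under_sampling import RandomUnderSampler
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 16))         # placeholder embeddings
    y = np.array([1] * 20 + [0] * 980)      # scam samples are the rare class

    # 1. Oversampling: duplicate minority-class (scam) examples.
    X_over, y_over = RandomOverSampler(random_state=0).fit_resample(X, y)

    # 2. Undersampling: drop examples from the majority (regular) class.
    X_under, y_under = RandomUnderSampler(random_state=0).fit_resample(X, y)

    # 3. Class weights: keep the data as-is but penalize mistakes on the
    #    rare scam class more heavily during training.
    clf = LogisticRegression(class_weight={0: 1.0, 1: 10.0}).fit(X, y)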

All three methods proved unsatisfactory because the limited sample size left us with too little data diversity. The most effective approach was creating additional training instances with LLMs such as Llama 2, OpenAssistant, and Falcon.

We used these LLMs to rephrase existing examples of scammers’ communicative behavior, or even invent new examples with similar messaging. This yielded a more balanced set of training material with a satisfactory sample size of scammers for our classification models.
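
To give a flavor of this kind of augmentation, the sketch below paraphrases a known scam message with an open-weight chat model via the transformers text-generation pipeline; the model name, prompt, and seed message are illustrative assumptions rather than our exact setup.

    # A sketch of LLM-based augmentation: paraphrasing a known scam message to
    # create extra minority-class training examples. The model and prompt are
    # illustrative; any capable open-weight chat model could play this role
    # (note that the Llama 2 checkpoints are access-gated on Hugging Face).
    from transformers import pipeline

    generator = pipeline("text-generation", model="meta-llama/Llama-2-7b-chat-hf")

    seed_message = ("I am from Binance Support. Your payment is held in escrow; "
                    "release the crypto and we will send you the money.")

    prompt = ("Rewrite the following scam message in five different ways, "
              "keeping the same intent:\n" + seed_message + "\n")

    output = generator(prompt, max_new_tokens=200, do_sample=True)
    synthetic_text = output[0]["generated_text"]
    # The generated variants are then labeled as scam samples and added to the
    # fine-tuning set alongside the real (human-written) examples.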

Understanding Users’ Intentions

Most user interaction on Binance P2P takes place in our built-in chat feature. The contents of these conversations can reveal key insights about user intentions. For example, someone impersonating a customer service agent, breaking the rules about how to pay, or needing help to finish an order will tend to say certain telltale things in the chat.

We are continually tweaking our LLMs to identify user intentions across various P2P situations, as shown in the diagram above. Our models are tailored to comprehend situations unique to our marketplace, as well as differentiate between suspicious and normal interactions. 
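
As a simplified stand-in for our fine-tuned models, the sketch below labels a single chat message with one of several P2P-specific intents using an off-the-shelf zero-shot classifier; the intent list and example message are made up for illustration.

    # A sketch of intent detection on one chat message, using the off-the-shelf
    # zero-shot classification pipeline from "transformers" as a stand-in for a
    # fine-tuned model. The candidate intents and message are illustrative.
    from transformers import pipeline

    intent_classifier = pipeline("zero-shot-classification",
                                 model="facebook/bart-large-mnli")

    message = ("Hello, this is Binance Support. We have received the payment, "
               "please release the crypto now.")

    intents = [
        "impersonating customer support",
        "requesting a third-party payment",
        "asking for help completing an order",
        "normal trade coordination",
    ]

    result = intent_classifier(message, candidate_labels=intents)
    print(result["labels"][0], result["scores"][0])   # top intent and its score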

Our aim is to prevent scams before they get a chance to do harm to our users. LLMs help us flag suspicious messages before the conversation leads to a transaction. In addition to strengthening security, they routinely help us identify and assist users who need support completing a transaction. So far, our AI models have helped us detect and prevent well over 2,000 potential scams and automatically facilitated 212,000 order completions in the appeals chat, with funds involved totaling over $28 million.

To better illustrate how our models function, here are two examples of them in action. 

Scenario 1: Third-party payment

When our model identifies that a user intends to use a third-party payment method – for example, paying from someone else's account – it promptly triggers an alert in the chat that is visible to both parties. 

This alert aims to inform our users about the risks associated with accepting such a request.
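
In code, the hand-off from detection to warning might look something like the sketch below, which reuses the zero-shot classifier and intent list from the earlier example; the confidence threshold, alert text, and send_chat_alert helper are hypothetical stand-ins for internal components.

    # Hypothetical glue code: if the classifier flags a third-party payment
    # request with enough confidence, a warning is posted into the order chat
    # where both parties can see it. The threshold and send_chat_alert() helper
    # are illustrative stand-ins, not Binance's actual internal API.
    RISK_THRESHOLD = 0.85

    def send_chat_alert(order_id: str, text: str) -> None:
        # Stub for illustration; in production this would post to the order chat.
        print(f"[chat {order_id}] {text}")

    def handle_message(order_id: str, message: str) -> None:
        result = intent_classifier(message, candidate_labels=intents)
        top_intent, confidence = result["labels"][0], result["scores"][0]
        if top_intent == "requesting a third-party payment" and confidence >= RISK_THRESHOLD:
            send_chat_alert(
                order_id,
                "Risk reminder: payments must come from an account registered in "
                "your counterparty's own verified name. Third-party payments are "
                "not allowed and are a common sign of fraud.",
            )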

Scenario 2: Order completion

When a seller encounters challenges while releasing and fulfilling an order, they can contact our appeals chat for assistance.

Our model, upon recognizing that a seller needs assistance with the order, will activate a predefined set of rules to assess whether the criteria for automated order processing have been met. If these conditions are satisfied, the system will proceed to release and fulfill the order on the seller's behalf.
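
As a simplified picture of what such a rule gate could look like, the sketch below invents its own order fields and criteria (payment confirmed, verified counterparty, no open dispute, order size under a limit); the real conditions used on Binance P2P are internal.

    # A simplified, hypothetical rule gate for automated order processing.
    # The Order fields, limit, and criteria are invented for illustration and
    # do not reflect the actual conditions used on Binance P2P.
    from dataclasses import dataclass

    @dataclass
    class Order:
        payment_confirmed: bool     # buyer's fiat payment has been verified
        buyer_verified: bool        # buyer passed identity verification
        open_dispute: bool          # an appeal is currently open on the order
        amount_usd: float           # order size in USD equivalent

    AUTO_PROCESS_LIMIT_USD = 1_000.0

    def meets_auto_processing_criteria(order: Order) -> bool:
        return (order.payment_confirmed
                and order.buyer_verified
                and not order.open_dispute
                and order.amount_usd <= AUTO_PROCESS_LIMIT_USD)

    example = Order(payment_confirmed=True, buyer_verified=True,
                    open_dispute=False, amount_usd=250.0)
    print(meets_auto_processing_criteria(example))   # True: safe to auto-process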

Closing Thoughts

At Binance, we invest significant resources into ensuring the safety of our users, and we use a wide range of approaches to achieve that goal – including innovative solutions like AI-powered tools. We employ large language models across our P2P marketplace to identify users who might be engaging in suspicious behavior. To combat the ever-evolving scam industry, our language models are constantly retrained to detect the latest tactics and trends. 

Our AI tools work alongside a passionate team of customer service agents – after all, in some situations nothing can replace a human touch. Together, they make Binance not only safe but also capable of delivering an exceptional user experience, so that every user can put their trust in every product and feature available in the Binance ecosystem. 

If you’ve fallen victim to a P2P scammer, please file a report to Binance Support by following the steps in this guide: How to Report Scams on Binance Support.
