Safety knowledge: Beware of the dangers of AI face-changing

1. What is "AI face-changing"

AI face-changing usually refers to the use of deep-learning-based deepfake technology to replace one person's face with another's. By analyzing the facial features of a target person and applying them to another video or image, the technology can produce fake videos or pictures that look real.

Face-changing technology has legitimate applications in entertainment and film production, but it also brings many dangers. In particular, deepfakes have attracted widespread attention for their potential use in producing child sexual abuse material, celebrity pornographic videos, revenge pornography, fake news, pranks, bullying, and financial fraud.

2. The history of "AI face-changing"

In 2017, Supasorn Suwajanakorn and colleagues at the University of Washington used deep learning to create "Synthesizing Obama" [1], published at SIGGRAPH 2017. In short, the technique modifies a video clip of Barack Obama so that he appears to speak the words of a new audio track unrelated to the original video. It combines a recurrent neural network (RNN), which maps audio to mouth shapes, with mouth-texture synthesis and video compositing. The results are remarkably convincing: the audio and the mouth movements in the video match closely, producing a realistic synthesized clip.
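The core idea of audio-driven mouth synthesis can be sketched very roughly in code. The toy example below is an illustrative assumption, not the paper's actual model: a simple RNN consumes one audio feature vector per video frame and emits a set of mouth-shape parameters per frame (all dimensions and weights here are made up for illustration; a real system would learn them from many hours of aligned audio and video).

```python
import numpy as np

rng = np.random.default_rng(0)

AUDIO_DIM = 13    # e.g. 13 MFCC-like coefficients per frame (assumption)
HIDDEN_DIM = 32   # size of the recurrent state (assumption)
MOUTH_DIM = 18    # e.g. 18 mouth landmark coordinates (assumption)

# Randomly initialized weights stand in for parameters a real system
# would learn from aligned audio/video training footage.
W_xh = rng.normal(0.0, 0.1, (HIDDEN_DIM, AUDIO_DIM))
W_hh = rng.normal(0.0, 0.1, (HIDDEN_DIM, HIDDEN_DIM))
W_hy = rng.normal(0.0, 0.1, (MOUTH_DIM, HIDDEN_DIM))

def synthesize_mouth(audio_frames):
    """Run a simple Elman-style RNN over audio frames,
    producing one mouth-shape vector per frame."""
    h = np.zeros(HIDDEN_DIM)
    mouth_shapes = []
    for x in audio_frames:
        h = np.tanh(W_xh @ x + W_hh @ h)  # recurrent state carries context
        mouth_shapes.append(W_hy @ h)     # predict mouth parameters
    return np.array(mouth_shapes)

audio = rng.normal(size=(100, AUDIO_DIM))  # 100 frames of dummy audio features
shapes = synthesize_mouth(audio)
print(shapes.shape)  # one 18-dim mouth shape per audio frame
```

The recurrence is what lets the model account for context: the mouth shape for a given sound depends on the sounds before it, which is why the synthesized lips look natural rather than jumping between poses frame by frame.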

3. The harm of "AI face-changing"

· Cases in the traditional financial field:

In traditional finance, face-changing technology is used for fraud. For example, according to reports [7,8], criminals use illegally obtained personal information and computer algorithms to simulate the faces and voices of a victim's relatives, superiors, colleagues, or public officials, then impersonate those people to commit fraud. After gaining the victim's trust, they follow prepared scripts to push fraudulent schemes such as bank card transfers, fake investment and wealth-management products, or "order-brushing" rebates. They also use video calls and repeated voice messages to further lower the victim's guard, making it difficult to notice anything abnormal in a short period of time. Once the victim falls for the scam and completes a transfer, the fraudster disappears without a trace.