A new AI-related scam has surfaced featuring thousands of AI-generated influencers who have flooded social sites like Instagram using stolen images of real people.
The scam involves stealing videos from real models and adult content creators, giving them AI-generated faces, and monetizing their bodies with links to dating sites, Patreon, OnlyFans competitors, and various AI apps.
AI-generated influencers compete with humans on Instagram
404 Media first reported on this practice in April. It has since grown into a widespread trend, leaving many to assume that the Meta-run platform is either unable or unwilling to stop it.
This comes as human content creators on Instagram say they are now competing with AI content in a way that is hurting their viability.
With off-the-shelf AI tools and apps hosted on the Apple App Store and Google Play Store, anyone can now easily create and monetize these AI influencer accounts. Wired reviewed more than 1,000 AI-generated Instagram accounts, Discord channels where the people who make this content share tips and discuss strategy, and several guides that explain how to make money through “AI pimping”.
Wired found that the problem, once minor on Instagram, has grown into a widespread practice, in line with predictions of a future social media landscape in which AI-generated content eclipses that of humans.
Adult content creators who promote their work on Instagram are now worried about directly competing with these AI rip-off accounts, many of which use photographs and videos stolen from adult content creators and Instagram models.
Elaina St James, an adult content creator, said that while other changes to Instagram’s algorithm could have contributed, her “reach went down tremendously” since the explosion of AI-generated influencer accounts on Instagram, from a typical 1 million to 5 million views a month. She has not cracked a million in the last 10 months, sometimes coming in under 500,000 views.
“This is probably one of the reasons my views are going down. It is because I am competing with something that is unnatural,” said St James.
Alexios Mantzarlis, director of the security, trust and safety initiative at Cornell Tech and formerly principal of trust and safety intelligence at Google, compiled a list of around 900 such accounts as part of the 404 Media research.
A sign of what social media will soon look like
Mantzarlis said this was nothing entirely new or surprising; it simply fulfilled earlier predictions about AI technology and its adoption on social media platforms.
“It felt like a possible sign of what social media is going to look like in five years,” said Mantzarlis.
“Because this may be coming to other parts of the Internet, not just the attractive people niche on Instagram. This is probably a sign that it is going to be pretty bad,” Mantzarlis said.
Out of more than 1,000 AI-generated Instagram influencer accounts, at least 100 included deepfake content that took existing videos, usually from models and adult entertainment performers, and replaced their faces with an AI-generated one to give the videos a fresh look.
The videos then appear to be new, original content, consistent with the other AI-generated images and videos shared by the AI-generated influencer.
Instagram acts only upon receiving complaints
Instagram said that it would take action on accounts only if they are reported by the rights owner or someone authorized to report on their behalf. However, many human influencers say this channel has not worked for them, as there are too many impersonator accounts to keep up with.
St James added that the fact that many of these AI-generated influencer accounts are operated by men adds insult to injury.
“We as women in the world make less money. We are at a disadvantage in a lot of ways. One area of the world where we do have an advantage is the influencer and the modeling thing, so it is just an extra layer,” said St James.
“It kind of makes me a little pissed off that it is a guy that is making money pretending to be a woman,” she added.
Apple, which so far has failed to solve the “dual use” problem of face-swapping apps that can be used innocently but are often used to create nonconsensual content, removed one such app from the App Store after it was quizzed about the issue.