San Francisco City Attorney David Chiu has announced that his office is suing the operators of 16 websites that use AI to create and distribute non-consensual deepfake nude images of women and girls. The move comes amid heightened attention on the creation and spread of non-consensual AI-generated images.
The lawsuit, the first of its kind in San Francisco, accuses the websites' operators of violating state and federal laws prohibiting deepfake pornography, child pornography, and revenge pornography, as well as California's unfair competition law.
Chiu wants to raise the alarm about this practice
According to the New York Times, the initiative was the idea of chief deputy city attorney Yvonne Mere, who rallied her colleagues to craft a lawsuit aimed at shutting down the 16 websites.
The websites' names were redacted in the copy of the lawsuit made available to the public on Thursday.
While the city attorney's office has yet to identify most of the websites' owners, officials have expressed optimism about uncovering the proprietors' names and holding them accountable.
During a press conference Thursday, Chiu said the sites produce "pornographic" material without the consent of the people in the photos.
Chiu said the lawsuit seeks to shut down the websites, raise the alarm about this form of "sexual abuse," and bring it to an end.
"This investigation has taken us to the darkest corners of the internet, and I am absolutely horrified for the women and girls who have had to endure this exploitation," Chiu said.
On these websites, users upload photos of fully clothed real people, and AI then alters the photographs to simulate what the person would look like naked.
According to the lawsuit, one of the sites openly advertises the non-consensual nature of the images: "Imagine wasting time taking her out on dates, when you can just use [redacted website name] to get her nudes."
The availability of open-source AI models means anyone can access and adapt AI-powered engines for their own purposes. This has resulted in the creation of sites and apps that can generate deepfake nudes from scratch or "nudify" existing photographs in realistic ways, often for a fee.
San Francisco is not the only place facing this challenge
In January, deepfake apps grabbed headlines when fake nude photos of Taylor Swift went viral online. Many other, far less famous people have been targeted both before and after Swift.
Chiu acknowledged that the "proliferation of these images has exploited a shocking number of women and girls across the globe," from celebrities to middle school students.
Through its own investigations, the city attorney's office found that the websites were visited over 200 million times in the first six months of this year. The office expressed concern that once an image goes online, it becomes difficult for victims to identify which websites were used to "nudify" their photographs.
This is because the images carry no unique or identifying marks that can be traced back to the websites. It is also difficult for victims to remove the photographs from the internet, which damages their self-esteem and their digital footprint.
Earlier in the year, five Beverly Hills eighth-graders were expelled for generating and sharing deepfake naked images of 16 eighth-grade girls, overlaying the girls’ faces onto AI-generated bodies.
Chiu's office said it has observed similar incidents at other schools in California, New Jersey, and Washington, with the images used to humiliate, bully, and threaten women and girls.
The net effect on victims, the attorney's office said, has been devastating: shattered reputations, deteriorating mental health, loss of self-esteem, and, in some instances, suicidal thoughts.