San Francisco City Atty. David Chiu today announced that his office is filing a legal action against 16 A.I.-powered “undressing” websites that enable users to create and distribute deepfake nude photos of women and girls.
City officials said the suit is thought to be the first of its kind. It accuses the operators of the sites of violating state and federal laws prohibiting deepfake pornography, revenge porn and child pornography, as well as California’s unfair competition law. The names of the sites were redacted in the copy of the suit made public Thursday.
Chiu’s office has not yet identified the owners of many of the websites, but it believes it will be able to track them down and hold them accountable.
The lawsuit serves two purposes, according to Chiu: shutting down the websites and raising awareness about this form of “sexual assault.”
Users of such platforms upload photos of real, clothed people; an artificial intelligence then alters the images to depict the subjects without clothes. The result, Chiu said at a Thursday news briefing, is “pornographic” images produced without the consent of the people whose faces or bodies appear in them.
One site quoted in the complaint makes the nonconsensual nature of these images explicit: “Imagine wasting time taking her out on dates, when you can just use [redacted website name] to get her nudes.”
Open-source AI models have put highly adaptable AI engines within anyone’s reach, powering sites and apps that can generate entirely new fake nudes or, for a fee, turn existing images into realistic fakes.
Deepfake apps made headlines when sexually explicit images depicting Taylor Swift circulated online earlier this year; long before she was targeted, however, the victims were largely unknown people — “from celebrities down through middle school students around the world who have been exploited by billions of views,” Chiu said.
According to the city attorney’s office, the websites were visited more than 200 million times in the first six months of 2024.
Once an image is posted online, victims struggle to determine which sites were used to “nudify” their photos, because the images carry no unique or identifying marks pointing back to a particular website, said Yvonne R. Meré, chief deputy city attorney for San Francisco. Victims also find it very difficult to get the content removed from the web.
Earlier this year, five Beverly Hills eighth-graders were expelled for creating and circulating deepfake nude images of 16 eighth-grade girls by superimposing their faces on A.I.-generated bodies.
Chiu’s office said similar incidents have been reported at other schools in California, as well as in Washington and New Jersey.
These images are used “as a form of bullying, humiliation and intimidation” directed at women and girls, Chiu said, adding that the harm to victims’ lives, reputations and mental health has been devastating, in some cases driving them to suicide.