SAN FRANCISCO — San Francisco City Attorney David Chiu is suing the operators of 16 websites based in the U.S. and abroad, alleging they create and distribute nonconsensual, AI-generated pornography.
In a statement released Thursday, Chiu’s office called the lawsuit, filed on behalf of the People of the State of California, “a first-of-its-kind,” saying it alleges violations of state and federal laws prohibiting deepfake pornography, revenge pornography, and child sexual abuse material (CSAM), as well as violations of California’s Unfair Competition Law.
The People, the statement notes, “seek the removal of Defendants’ websites as well as injunctive relief to permanently restrain Defendants from engaging in this unlawful conduct. The lawsuit also seeks civil penalties and costs for bringing the lawsuit.”
The statement alleges that “the proliferation of nonconsensual deepfake pornographic images has exploited real women and girls across the globe,” prompting the lawsuit against what the City Attorney calls “the owners of 16 of the most-visited websites that invite users to create nonconsensual nude images of women and girls.”
The lawsuit alleges that the people behind those sites “violate state and federal laws prohibiting deepfake pornography, revenge pornography and child pornography.”
“This investigation has taken us to the darkest corners of the internet, and I am absolutely horrified for the women and girls who have had to endure this exploitation,” said City Attorney David Chiu. “Generative AI has enormous promise, but as with all new technologies, there are unintended consequences and criminals seeking to exploit the new technology. We have to be very clear that this is not innovation—this is sexual abuse. This is a big, multi-faceted problem that we, as a society, need to solve as soon as possible.”
The City Attorney’s office alleges that “bad actors” behind these deepfakes sites have “impacted women and girls in California and beyond, causing immeasurable harm to everyone from Taylor Swift and Hollywood celebrities to high school and middle school students.”
Deepfakes, according to the statement, “are used to extort, bully, threaten, and humiliate women and girls.”
The 16 targeted websites, the statement alleges, “offer to ‘undress’ images of women and girls. These websites offer user-friendly interfaces for uploading clothed images of real people to generate realistic pornographic versions of those images. These websites require users to subscribe or pay to generate nude images, profiting off of nonconsensual pornographic images of children and adults. Collectively, these websites have been visited over 200 million times just in the first six months of 2024.”
Chiu’s statement does not mention nonconsensual, AI-generated pornography targeting men.
A Tangled Web of Obscure Sites and Companies
Chiu’s office attached the complaint, filed in San Francisco Superior Court as “People of the State of California v. Sol Ecom, Inc., et al.,” but redacted the names of the websites.
The unredacted version of the complaint names the targeted websites as: Drawnudes.io (operated by Sol Ecom); Porngen.art and Undresser.ai (operated by Briver); Undress.app, Undress.love, Undress.cc, and Ai-nudes.app (operated by Itai Tech); Nudify.online (operated by Defirex); Undressing.io (operated by Itai OÜ); Undressai.com (operated by Gribinets); Deep-nude.ai (operated by someone only identified as Doe #1); Pornx.ai (operated by someone only identified as Doe #2); Deepnude.cc (operated by someone only identified as Doe #3); Ainude.ai (operated by someone only identified as Doe #4); and Clothoff.io (operated by someone only identified as Doe #5).
According to the lawsuit, several of these sites openly promote the nonconsensual nature of their deepfakes.
Drawnudes.io, for example, allegedly “allows users to ‘deepnude girls for free’ by uploading an image and using the website’s AI technology to ‘undress’ the image. Users are invited to upload a photo with the message: ‘Have someone to undress?’ Sol Ecom provides step-by-step instructions on how to select images that will provide ‘good’ quality nudified results.”
Some of these companies and sites, along with others, were first investigated by reporter and researcher Kolina Koltai for the investigative journalism website Bellingcat back in February.
Koltai’s article claimed to have identified “a loosely affiliated network of similar platforms” including Clothoff, Nudify, Undress and DrawNudes, which had “variously manipulated financial and online service providers that ban adult content and non-consensual deepfakes by disguising their activities to evade crackdowns.”
Koltai reported that DrawNudes initially appeared connected to an obscure firm called GG Technology LTD, but during her investigation DrawNudes “changed the company listed on its website to Sol Ecom Inc.,” the first defendant named in Chiu’s lawsuit.
Sol Ecom listed addresses in Miami and Los Angeles, and both GG Technology and Sol Ecom listed Ukrainian nationals as operators.