City Attorney sues most-visited websites that create nonconsensual deepfake pornography

AI-generated nonconsensual explicit images of real adults and minors cause immeasurable harm and violate state and federal laws

SAN FRANCISCO (August 15, 2024) — San Francisco City Attorney David Chiu announced a first-of-its-kind lawsuit today against some of the world’s largest websites that create and distribute nonconsensual AI-generated pornography. The proliferation of nonconsensual deepfake pornographic images has exploited real women and girls across the globe. The lawsuit, filed against the owners of 16 of the most-visited websites that invite users to create nonconsensual nude images of women and girls, alleges the website owners and operators violate state and federal laws prohibiting deepfake pornography, revenge pornography, and child pornography.

City Attorney speaks at a press conference in August 2024.

“This investigation has taken us to the darkest corners of the internet, and I am absolutely horrified for the women and girls who have had to endure this exploitation,” said City Attorney David Chiu. “Generative AI has enormous promise, but as with all new technologies, there are unintended consequences and criminals seeking to exploit the new technology. We have to be very clear that this is not innovation—this is sexual abuse. This is a big, multi-faceted problem that we, as a society, need to solve as soon as possible. We all need to do our part to crack down on bad actors using AI to exploit and abuse real people, including children.”

Background
The availability of free, sophisticated, open-source AI models has allowed bad actors to further train and adapt these models for their own purposes. Bad actors have now trained AI image generation models on pornographic images and child sexual abuse materials, so that they can be used to generate nude images of real, identifiable women and girls without their consent.

This disturbing phenomenon has impacted women and girls in California and beyond, causing immeasurable harm to everyone from Taylor Swift and Hollywood celebrities to high school and middle school students. For example, in February 2024, AI-generated nonconsensual nude images of 16 eighth-grade students were circulated among students at a California middle school.

These images, which are virtually indistinguishable from real photographs, are used to extort, bully, threaten, and humiliate women and girls. The Federal Bureau of Investigation has also warned of an uptick in extortion schemes using nonconsensual AI-generated pornography. Worse yet, victims of nonconsensual deepfake pornography have found virtually no recourse or ability to control their own image after deepfaked images have been distributed.

Defendants own and operate 16 of the world’s most-visited websites that offer to “undress” images of women and girls. These websites offer user-friendly interfaces for uploading clothed images of real people to generate realistic pornographic versions of those images. These websites require users to subscribe or pay to generate nude images, profiting from nonconsensual pornographic images of children and adults. Collectively, these websites were visited over 200 million times in the first six months of 2024 alone.

Some of the websites allow users to create nonconsensual pornographic images of adults only, but others allow users to create nonconsensual pornographic images of children.

One defendant’s website specifically promotes the nonconsensual nature of the images, stating “Imagine wasting time taking her out on dates, when you can just use [website] to get her nudes.”

The lawsuit, filed on behalf of the People of the State of California, alleges violations of state and federal laws prohibiting deepfake pornography, revenge pornography, and child pornography, as well as violations of California’s Unfair Competition Law. The People seek the removal of Defendants’ websites as well as injunctive relief to permanently restrain Defendants from engaging in this unlawful conduct. The lawsuit also seeks civil penalties and costs for bringing the lawsuit.

The case is People of the State of California v. Sol Ecom, Inc., et al., San Francisco Superior Court. A redacted version of the complaint can be found here.

Tips
Any person who has been the victim of nonconsensual deepfake pornography, or who has relevant information, is welcome and encouraged to contact the City Attorney’s Office through its consumer complaint web portal or hotline at (415) 554-3977.

###