Deepfake porn websites sued by San Francisco city attorney
SAN FRANCISCO (KRON) — Artificial intelligence rapidly emerged on the technology scene this year, pushing the boundaries of what computers are capable of. Bad actors have already exploited the new technology to create nonconsensual deepfake pornographic images.
The AI-generated images are fake, but the victims are real, San Francisco City Attorney David Chiu said.
Chiu filed a “first-of-its-kind lawsuit” on Thursday against 16 of the world’s largest websites that create and distribute nonconsensual AI-generated pornography. The lewd images exploit real women and girls across the globe, ranging from high-profile celebrities like Taylor Swift to children as young as middle school, the city attorney’s office wrote.
“This investigation has taken us to the darkest corners of the internet, and I am absolutely horrified for the women and girls who have had to endure this exploitation,” Chiu said.
The lawsuit was filed against the most-visited websites that invite users to create nonconsensual nude images of women and girls. The suit accuses website owners and operators of violating state and federal laws prohibiting deepfake pornography, revenge pornography and child pornography.
“Generative AI has enormous promise, but as with all new technologies, there are unintended consequences and criminals seeking to exploit the new technology. We have to be very clear that this is not innovation — this is sexual abuse,” Chiu said. “This is a big, multi-faceted problem that we, as a society, need to solve as soon as possible. We all need to do our part to crack down on bad actors using AI to exploit and abuse real people, including children.”
The accused websites offer to “undress” images of women and girls in exchange for a subscription fee, the city attorney’s office said. The sites provide user-friendly interfaces for uploading clothed images of real people to generate realistic pornographic versions of those images.
Earlier this year, AI-generated nude images of 16 eighth grade students circulated among students at a California middle school.
“These images, which are virtually indistinguishable from real photographs, are used to extort, bully, threaten, and humiliate women and girls. Worse yet, victims of nonconsensual deepfake pornography have found virtually no recourse or ability to control their own image,” the city attorney’s office wrote.
One of the websites’ marketing pitches states, “Imagine wasting time taking her out on dates, when you can just use (website’s name) to get her nudes.”
The city attorney filed the lawsuit, People of the State of California v. Sol Ecom, Inc., in San Francisco Superior Court on behalf of all Californians.
The suit seeks removal of the websites from the internet, injunctive relief and civil penalties.