Cracking down on AI misuse exploiting women & girls

Earlier this year, several of my deputies who are also mothers of daughters brought an alarming trend to my attention: a disturbing corner of the AI industry where new technologies allow users to create deepfake pornography.

Websites engaged in this AI-facilitated exploitation take photos of real, clothed people and generate pornographic images that are virtually indistinguishable from real photographs, all without the consent of the person depicted. Victims have no recourse: it is impossible to determine which website generated an image, or to remove every copy from the Internet.

This has impacted a shocking number of women and girls across the globe, from Taylor Swift and Hollywood celebrities to high school and middle school students. Our investigation took us to the darkest corners of the Internet, and we were horrified by what we learned. These images have been used to bully, humiliate, and extort victims, and the impact on their reputations, mental health, and autonomy has been devastating.

This week, my office filed a first-of-its-kind lawsuit against the world’s largest websites that create and distribute AI-generated nonconsensual pornography. These 16 websites were visited over 200 million times in the first six months of 2024 alone.

While profiting off this exploitation, these companies have violated state and federal laws banning deepfake pornography, revenge pornography, and child pornography. We’re asking the court to shut down these websites and prevent their operators from continuing to break the law.

We’re also sounding the alarm. Generative AI holds enormous promise in many areas, but as with any new technology, there are unintended consequences and criminals eager to exploit it. This is not innovation – this is sexual abuse.

This is also a big, multi-faceted problem that we, as a society, need to solve as soon as possible. We must each do our part to crack down on bad actors using AI to exploit and abuse real people, including children. 

You can read more in the New York Times.

Sincerely,

David