Alexandria Ocasio-Cortez has described her shock at discovering, during a conversation with aides on a car ride in February, that an artificial-intelligence "deepfake porn" image had been made of her and shared on X.
In a recent interview with Rolling Stone, the progressive congresswoman emphasized that encountering a simulation of herself engaged in a sexual act highlighted the real and disturbing implications of deepfake technology. She stressed that the unsettling image lingered in her mind throughout the day, underscoring the seriousness of the issue. “There’s a shock to seeing images of yourself that someone could think are real,” Ocasio-Cortez told Rolling Stone. “As a survivor of physical sexual assault, it adds a level of dysregulation. It resurfaces trauma, while I’m … in the middle of a f*cking meeting.”
She went on to liken this "digitizing [of] violent humiliation" to physical rape, and predicted that "people are going to kill themselves over this. There are certain images that don't leave a person, they can't leave a person. It's not as imaginary as people want to make it seem. It has real, real effects not just on the people that are victimized by it, but on the people who see it and consume it. And once you've seen it, you've seen it."
Ocasio-Cortez, 34, is throwing her weight behind congressional efforts to give victims of nonconsensual AI pornography legal recourse. She backs legislation that would allow individuals to sue the publishers, distributors, and consumers of such illicit digital content.
The proposed law, the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024, or DEFIANCE Act, has the support of Sen. Dick Durbin. The Illinois Democrat said the legislation would hold accountable those involved in disseminating nonconsensual, sexually explicit "deepfake" material.
Durbin noted how often such content is used to exploit and harass women, particularly public figures such as politicians and celebrities. He cited the case of Taylor Swift, whose fabricated explicit images circulated widely on social media earlier this year.
“Although the imagery may be fake, the harm to the victims from the distribution of sexually explicit deepfakes is very real,” Durbin said. “Laws have not kept up with the spread of this abusive content.”
Meanwhile, Ocasio-Cortez said that deepfake pornography “parallels the same exact intention of physical rape and sexual assault, which is about power, domination, and humiliation. Deepfakes are absolutely a way of digitizing violent humiliation against other people.”