AI means anyone can be a victim of deepfake porn. Here’s how to protect yourself
“All we have to have is just a human form to be a victim.”

That’s how lawyer Carrie Goldberg describes the risk of deepfake porn in the age of artificial intelligence.

While revenge porn — or the nonconsensual sharing of sexual images — has been around for nearly as long as the internet, the proliferation of AI tools means that anyone can be targeted by this form of harassment, even if they’ve never taken or sent a nude photo. Artificial intelligence tools can now superimpose a person’s face onto a nude body, or manipulate existing photos to make it look as if a person is not wearing clothes.

In the past year, targets of AI-generated, nonconsensual pornographic images have ranged from prominent women like Taylor Swift and Rep. Alexandria Ocasio-Cortez to high school girls.

For someone discovering that they, or their child, have been made the subject of deepfake porn, the experience is typically scary and overwhelming, said Goldberg, who runs the New York-based firm C.A. Goldberg Law, representing victims of sex crimes and online harassment. “Especially if they’re young and they don’t know how to cope and the internet is this big, huge, nebulous place,” she said.

But there are steps that targets of this form of harassment can take to protect themselves and places to turn for help, Goldberg told me in an interview on CNN’s new tech podcast, Terms of Service with Clare Duffy. Terms of Service aims to demystify the new and emerging technologies that listeners encounter in their daily lives. (You can listen to the full conversation with Goldberg here.)