WASHINGTON, D.C.: Photo apps digitally undressing women, sexualized text-to-image prompts creating "AI girls" and manipulated images fueling "sextortion" rackets — a boom in deepfake porn is outpacing US and European efforts to regulate the technology.
Artificial intelligence-enabled deepfakes are typically associated with fake viral images of well-known personalities such as Pope Francis in a puffer coat or Donald Trump under arrest, but experts say they are more widely used for generating non-consensual porn that can destroy ordinary lives.
Women are a particular target of AI tools and apps — widely available for free and requiring no technical expertise — that allow users to digitally strip off clothing from their pictures or insert their faces into sexually explicit videos.
"The rise of AI-generated porn and deepfake porn normalizes the use of a woman's image or likeness without her consent," Sophie Maddocks, a researcher at the University of Pennsylvania tracking image-based sexual abuse, told AFP.
"What message do we send about consent as a society when you can virtually strip any woman?"
In a tearful video, an American Twitch streamer who goes by QTCinderella lamented the "constant exploitation and objectification" of women after she became a victim of deepfake porn. She added that she was harassed by people sending her copies of the deepfakes.
The scandal erupted in January during a livestream by fellow streamer Brandon Ewing, who was caught looking at a website that contained deepfaked sexual images of several women including QTCinderella.
"It's not as simple as 'just' being violated. It's so much more than that," she wrote on Twitter, adding that the experience had "ruined" her.