
Deepfakes: How AI Porn Is Being Used As A Weapon Against Women

"It’s a shocking violation to have to see yourself in that way.”

Noelle Martin was 18 years old when she first watched herself performing hardcore porn.

Sitting idly in front of her laptop one night, the then law student decided to do a reverse image search of herself on Google. She was expecting to unearth an archaic MySpace profile or some old, cringeworthy social media posts. Instead, Martin was met with hundreds of explicit images and videos of herself, including graphic scenes of her engaging in intercourse and oral sex. Except it wasn’t her.

Martin is one of hundreds of thousands of women who have been targeted by non-consensual sexual deepfakes, meaning the image of her face was stolen and digitally mapped onto someone else’s body using AI technology.

“I felt sick,” recalls Martin, now an award-winning activist and a researcher at UWA Tech and Policy Lab.

“I was an absolute nobody. I didn’t know what the term was to describe what was happening to me. I’d never heard of image-based sexual abuse. My mind was racing with questions: ‘Were the videos made by someone that I knew? Were they going to come after me? How is this going to affect my job? Should I tell my family?’ It’s a shocking violation to have to see yourself in that way.”

Noelle Martin was a victim of deepfake pornography. (Credit: Supplied)

The dark reality of deepfakes is that even if you have never heard the term, you’ve likely already watched, shared or interacted with one.

First popularised by a 2017 Reddit thread linking to a series of fake celebrity sex tapes, the term “deepfakes” has crawled from the dark corners of the internet to the forefront of mainstream media, blurring the boundaries between fact and fiction.

In March, pictures of the Pope sporting an ostentatious puffer jacket made headlines, before a user came forward to claim ownership of the AI-generated image. On TikTok the hashtag #deepfake has more than 1.6 billion views, and as the technology becomes increasingly sophisticated, the opportunities to distort the perception of reality grow – impacting political campaigns, public profiles and, most detrimentally, the lives of everyday women.

While a deepfake video of a wide-eyed Tom Cruise laughing unnervingly might seem perfectly harmless, the stark reality is that up to 97 per cent of deepfakes are non-consensual porn.

This now-famous image of the Pope was generated by AI.

“Deepfakes are a violence-against-women issue,” says Adam Dodge, the founder of EndTAB, a US non-profit that educates people about technology-enabled abuse.

“The narrative continually gets subverted as being a misinformation issue or a political disruption. People have a really difficult time getting their minds around this type of abuse. It’s easy to default to the position of saying, ‘Well, it’s a fake video. So what’s the big deal?’ But there are a lot of problems with that.”

When Noelle Martin walked into the local police station with her laptop under her arm, she hoped justice would prevail and the trauma of the videos would end. She was horrified to discover there were no specific criminal laws addressing the non-consensual sharing of intimate images or videos across the states or nationally. Frustrated with the lack of action, Martin, now 28, began campaigning for change, petitioning the federal government and speaking out publicly.

As a result, updates were made to Australia’s Enhancing Online Safety Act 2015, so that people who distribute intimate images without consent can now face fines or jail. But for Martin, the changes are not nearly enough.

“Deepfakes are a violence-against-women issue.” (Credit: Getty)

“The laws specifically around image-based abuse are generally the same [as a decade ago]. There haven’t been any major reforms to that area. In Australia, we have a regulator called the Office of the eSafety Commissioner, and that body receives taxpayer funds to try to protect people online, and they have a lot of powers to do that. One of the things that I found, which is backed up by every annual report of the regulator, is that the powers that they have been given by our politicians are just not being utilised. We have a regulator who, in my opinion, does not regulate. To my understanding, there have been absolutely no fines issued at all.”

The eSafety Commissioner’s office says it has received fewer than 10 deepfake abuse complaints, but expects this number to climb as AI technology advances. However, last year it received 7,000 complaints about online image-based abuse (i.e., revenge porn) and helped remove 90 per cent of those images. It says no further sanctions, such as fines, were required. “Most people just want the material removed as quickly as possible,” says the commissioner, Julie Inman Grant. “I encourage everyone who has experienced image-based abuse to report it to us.”

Deepfake porn has been a life sentence for Martin, who lives with the knowledge that most of those images and videos remain online. “The process of taking the material down is a never-ending battle. Even if you do manage to take down material, you can never guarantee it won’t pop up again a month, a year, even 10 years from now. Taking the content down also frustrates these perpetrators and puts a bigger target on your back for them to keep doing it.”

Today, deepfake porn is booming, with doctored videos becoming increasingly accessible. A recent NBC News review of two of the largest websites hosting deepfake porn found they are easily accessed through Google, and that creators are also using the online chat platform Discord to offer custom videos, which can be purchased via Mastercard or Visa.

The going rate for a five-minute deepfake of a “non-celebrity” (also known as a “personal girl”, someone with fewer than 2 million followers) is about $US65; other sites allow you to upload a woman’s image and a bot will strip her naked. With demand for personalised porn increasing, two popular deepfake creators are even advertising paid positions to help them create more content.

“I get people [who are deepfake victims] reaching out and I don’t want to mislead them, because they’re just not being helped,” says Martin. “No-one wants to talk about that. It’s uncomfortable, because you want to believe that there is hope. But I’m not interested in giving people false hope.”

It’s unfair to put the onus solely on women to better protect themselves online; a large part of the problem lies with the lack of consequences for the creators of deepfakes – and their consumers. Earlier this year, the popular Twitch streamer Brandon “Atrioc” Ewing was caught buying and watching deepfake porn of fellow female streamers after accidentally leaving a website tab open during a live stream. “I’d been reading so much AI stuff and I was on Pornhub and there was an ad on every video, so I know everyone must be clicking it because it’s on every video. It was 2am. I got curious and I clicked something,” Ewing says tearfully in an apology video with his crying wife by his side.

There is a misconception that because deepfakes are ‘fake’ they aren’t harmful. (Credit: Getty)

A female Twitch creator, who goes by the name Sweet Anita, is one of the many victims of deepfake porn brought to light by that live stream, discovering fakes of herself only after Ewing was exposed. “My initial reaction was that I felt exhausted, because there’s already this huge stereotype of female streamers being sexualised,” says Anita, who has 1.9 million followers on Twitch. “The video was so hyperrealistic that my immediate worry was that if I ever decide not to make content someone will look me up one day and go, ‘Oh, she made porn.’”

Online sex workers face significantly higher safety risks than creators who do not produce sexual content. But they are generally aware of those risks and, thanks to the followers and income their content brings in, typically have better means to protect themselves. “What deepfakes do is they convince people that I’ve made porn,” says Sweet Anita. “Because of that I’m treated as a sex worker, except I’m vulnerable because I don’t have the wealth generated from sex work to protect me. Not doing porn was a safety decision that has now been taken away from me.

“The frustrating part is that even when I try to get ahead of the narrative and say that the videos are deepfakes, I get comments saying, ‘You’re lying because you’ve been caught out being a slut.’ It’s become my responsibility to advocate and educate people, when it should be the creators and consumers apologising and funding the legal costs of taking those videos down.”

Scroll through any forum or Twitter thread on deepfakes, and you’ll find a prevailing attitude that because deepfakes aren’t “real” they can’t harm you, and therefore aren’t as damaging as revenge porn. Dodge agrees that a large part of changing perceptions of deepfakes comes from educating people about the impact of these videos as a form of gendered violence.

“Deepfake porn manufactures the experience of a sexual assault where the victim is filmed while unconscious or inebriated to the point where they don’t have a memory of their assault,” he says. “As with those assaults, when the victim watches the video they have no memory of being assaulted, but they are absolutely affected by watching themselves be raped online. Deepfake pornography manufactures that experience, which is deeply traumatic and harmful.

“Being inserted into a video without your consent does not harm men in the same way it does female-identifying victims,” Dodge continues. “The shame and the blame that exist for women harmed this way exist at a very watered-down level for male-identifying individuals. So there’s a lack of empathy, perspective and privilege that we have to address.”

For Martin, who has dedicated more than a decade to spreading awareness and advocating for change, the damage of deepfakes is very much “real”. “It is the violation and deprivation of someone’s agency and autonomy,” she says. “People don’t realise it’s not even about the harms of the emotional reaction; it’s the consequences of deepfakes that are a life sentence. The harm of these deepfakes is the flow-on effect into your whole life: your employability, your interpersonal relationships, your romantic relationships, your economic opportunities and your self-determination. It doesn’t matter if it looks real or not, it could affect every job that you go for. People aren’t looking closely at the content they see to try to determine if it’s authentic.”

In many cases, victims of image-based sexual abuse have had to remove themselves from the internet completely, living in constant fear of the images resurfacing. In other cases, deepfake porn has forced victims to change their names altogether. However, the full extent of the damage caused by deepfakes is still being understood. “When the victims who are depicted in these videos have difficulty identifying as victims, this is really dangerous,” says Dodge. “Because if you can’t identify as a victim, then you cannot start to heal or process your trauma. That’s really an unsafe place for people to be in.”

The complexity of deepfakes is that they go much deeper than sexual gratification; they are about stripping women of power and their right to autonomy over their own bodies. “They allow men and male-identifying individuals to exert control over the victims, without anybody’s consent or anybody stopping them,” Dodge explains. “Nobody is being fooled by these deepfake pornographic videos. It’s not about misinformation, because a perfect detection solution could be introduced tomorrow, and the deepfake porn websites will continue to thrive, because people are not consuming this content under the assumption that what they’re getting is ‘real’.”

The deepfake economy appears to be a terrifying byproduct of a new digital age, but we need only look at history to understand that stripping women of their power is no new phenomenon but an age-old fixation with dehumanising them.

This article originally appeared in the June issue of marie claire Australia.
