Lately, the news has been full of stories of women being subjected to a new form of image-based sexual abuse (or IBSA, for short), perpetrated using artificial intelligence technologies. Recently, fake videos depicting Taylor Swift performing sexual acts went viral on X before being taken down, and her name was made temporarily unsearchable on the platform. A quick Google search of “deepfake pornography” yields countless results: women sharing their stories of being victimised with this technology, and advocates calling on lawmakers to do something about it.
But here’s the thing: the language we use to talk about this phenomenon is itself a problem.

When mobile phone and social media use exploded among teens and young adults, conversations about “revenge porn” rose with it. Advocates rightly highlighted that this language is harmful for two reasons. First, calling it revenge pornography insinuates that the victim did something to deserve it – it implies it was her fault. Second, referring to this form of abuse as “pornography” is not right either. Some (though many feminists would disagree) consider pornography a form of entertainment, and the non-consensual sharing of nude or sexual images is certainly not that.
Similarly, in an August 2024 LinkedIn post, Dawn Hawkins, CEO of the National Center on Sexual Exploitation in the United States, highlighted problems with using the terms intimate image abuse and non-consensual intimate imagery. She writes:
“Referring to IBSA as “intimate” imagery is a gross mischaracterization. The word “intimate” implies consent and privacy, which are wholly absent in cases of IBSA. This term fails to capture the exploitation, extortion, harassment, and humiliation that victims endure. Let’s be clear: IBSA is a form of sexual assault and public humiliation, not an intimate act.”
So why, then, do we use the term “pornography” when referring to this form of deepfake imagery?
I argue that we should not call this form of abuse “deepfake pornography” when there are so many other options. The non-consensual creation and/or sharing of nude or sexual imagery is image-based sexual abuse, and the language we use should describe it as such.
Victoria Rousay has proposed referring to these images as sexual deepfakes, which lets us acknowledge the sexual nature of the image without calling it “pornography”. This is a strong option because it can encompass both non-consensual imagery and imagery made with the consent of the person or people depicted.
However, I would like to propose two alternatives. The first is non-consensual synthetic sexual imagery (abbreviated NCSSI). Despite its wordiness, this option captures exactly what has been done: synthetic (fake) sexual imagery was created or shared without the consent of those depicted. The second, which may be preferable for those of us who love abbreviations, is AI-facilitated or AI-generated image-based sexual abuse (AI-IBSA).
Rebekah Wells, founder of Women Against Revenge Porn, wrote of image-based sexual abuse:
“It has been called “revenge porn,” “involuntary pornography” and “nonconsensual pornography.” But using these terms is like calling rape “involuntary sex.” It simply doesn’t reflect the emotional, psychological and physical costs. Revenge porn is cyberrape, and we should call it as such.”
Wells suggests using the term cyberrape for seemingly all forms of image-based sexual abuse, extending well beyond images generated or altered using AI or deepfake technologies. However, what is clear is this: we need to stop referring to non-consensual deepfakes as “pornography”.