I acknowledge the use of Bing Copilot AI (https://copilot.microsoft.com/) and ChatGPT (https://chatgpt.com/) to get feedback on the organization, clarity, and conciseness of my writing. All of the drafting and final decisions about proofreading are my own.
Trigger Warning:
This essay discusses topics related to sexual violence, including nonconsensual deepfake pornography, victim experiences, and the psychological harms associated with image-based sexual abuse. While this paper specifically focuses on deepfake pornography as it relates to the broader social context of violence against women, it is important to acknowledge that deepfake abuse, sexual violence, and image-based sexual abuse can affect people of all genders and identities. The purpose of this essay is to analyze one aspect of this issue, not to suggest that these harms are limited to one group.
The 2024 election cycle was one of the most polarizing and divisive in my recent memory. Millions of articles, interviews, TikToks, podcasts, and other pieces of media were created, and every day it seemed that the front page of the New York Times had some new story about Congress, the presidential campaigns, or the candidates. I am somewhat embarrassed to admit that I, like many people my age, didn’t really pay attention to the vast majority of this election news. I read about it occasionally, and my family discussed politics at the dinner table, but honestly, even by August, I was ready to be done hearing about the election, and so I mostly tuned it out. One of the few articles that was memorable to me during this time, however, was not about politics or the candidates themselves, but about Taylor Swift. I think it was October when I read it, and I really only clicked on it because I wanted to see if it said anything about Reputation (Taylor’s Version). It didn’t, and in fact, it had very little to do with her at all. It was a piece of political news about deepfakes of Taylor Swift that showed her endorsing Donald Trump in the election. Further down, the article noted that she had also been the victim of deepfake pornography. What shocked me was not that the deepfake porn had been created but that it received barely any coverage at the time; it says something striking about our values that the political campaign was at the forefront of our priorities while a much more disturbing use of AI technology was ignored.
Tragically, this trivialization of sexually explicit deepfakes is far from a rare phenomenon. MEA Digital Evidence Integrity, an organization dedicated to the protection of people’s data online, defines deepfakes as “synthetic media, often in the form of videos, audio, or images, generated through artificial intelligence (AI) and deep learning algorithms.” The organization’s research finds a 3000% increase in deepfakes online and attributes that growth largely to broader trends in AI technology; it reasons that as AI becomes more widely used, the number of deepfakes will continue to grow and the deepfakes themselves will become more and more believable (MEA Digital Evidence Integrity). The degree to which these deepfakes are sexually explicit is also astounding: according to a 2018 study by Sensity AI, ninety to ninety-five percent of all deepfakes generated were sexually explicit, and of those, approximately ninety percent were sexually explicit images of women (Hao). These deepfakes also have profound impacts on their victims, with fifty-one percent experiencing suicidal thoughts or contemplating suicide, and an astounding ninety-three percent reporting severe psychological or emotional damage (Rousay). Despite this obvious trend toward the use of deepfake pornography to harm women, only six states (California, Georgia, Florida, New York, Pennsylvania, and Texas) have laws against nonconsensual deepfake pornography. These laws generally place a high burden on the victim, who must prove both that the pornography was created nonconsensually and that it was created with malicious intent. Additionally, these laws cover only the initial creation of the pornography, leaving post hoc distribution, the sharing of the image after its initial posting, completely legal. This legal framework is similar to the laws that exist in forty-eight states covering traditional revenge pornography, such as real photographs (Rousay).
As deepfake technology continues to develop and the number of people victimized continues to grow, image-based sexual abuse (IBSA) using deepfakes appears to replicate patterns observed in other forms of sexual violence, such as a lack of consent and a culture of violence, while also inflicting harms that are unique to the technology. This makes broader social recognition and legal reform even more crucial, given the damage a deepfake can do to a victim’s life. Laws, after all, exist only insofar as they protect people. If the laws fail to do this, and it is abundantly clear that these laws are failing countless women, then it makes sense to question how they are structured and why they are failing. That, in turn, requires an understanding of how sexually explicit deepfakes compare to other forms of violence, as well as an understanding of the harms that are distinct to deepfake pornography.
A frame of reference that can help make sense of the impacts of deepfake pornography is to examine how it dovetails with other forms of violence. Victims commonly describe feelings of powerlessness, violation, and betrayal because of the nonconsensual nature of the distribution of these images, which are frequently shared by people they once trusted, like friends, acquaintances, or partners. One victim, a British woman named Jodie, who chose to be identified by her first name only, recalls that her “sense of violation intensified when she found out that the man responsible was someone who’d been a close friend for years,” a tragic sentiment echoed by many other survivors of this violence at the hands of people they know. It is violating enough to live with the knowledge that someone, even an anonymous person, created and posted these deepfakes; victims must also live with the knowledge that they trusted someone and that this trust was betrayed (McGlynn). Victims cite difficulty trusting people in future relationships as one of the most psychologically scarring impacts of the experience.
This phenomenon is similar to nearly all forms of sexual violence. Nonconsensual distribution of traditional sexually explicit media, e.g., photos or videos, is perhaps the most straightforward parallel to draw. In cases involving sexually explicit photography, for example, a victim might initially agree to participate in the photographs with a person they trust not to distribute the pictures. The photographer then abuses the victim’s trust, sometimes out of a desire for revenge on an ex-partner or for profit (Yousif), and distributes the photos. At first glance, this comparison seems to break down: after all, victims of deepfake pornography did not agree to any such photoshoot, even initially. While this is true, the victim of a deepfake also allowed the perpetrator into their life, trusted them, and put themself in a position of intimacy, albeit emotional intimacy as opposed to physical intimacy.
Date rape can also be considered similar to deepfake pornography, even though the two initially seem very different. Date rape is generally legally understood as a person being forced or coerced to have sexual intercourse with someone considered to be an acquaintance, during or after a voluntary social engagement (Legal Information Institute). It requires a degree of trust in the perpetrator on the part of the victim for that initial social engagement: trust that he will understand that a yes to a third date is not always a yes to something more, and trust that he understands that an invitation into a bedroom is not a waiver to do whatever he wants. An agreement to go on a date is also an implicit statement of trust in the man you’re going on a date with, trust that he isn’t dangerous or creepy, and trust that you will be returned home safely. The perpetrator then takes advantage of the initial yes and of her trust that he isn’t a bad person. Instances of deepfake pornography creation also involve this violation of trust, only the initial yes given is not a yes to a date. It is rather a yes to things like sharing social media information with the perpetrator, or a yes to the emotional vulnerability that comes with a friendship. This trust is subsequently broken with the abuse of that initial yes, and victims across the board are left with feelings of betrayal and uncertainty about future relationships.
Another feeling commonly associated with this sort of violence is a feeling of being objectified. The deepfaker, after all, takes a woman’s image and uses it for his own sexual gratification; the victim’s feelings are not even considered. Her comfort and safety are totally disregarded, and she simply becomes a tool of pleasure for the perpetrator. Deepfake creation websites even exist, allowing people to simply upload photos and have an algorithm create pornography for them. The creator of one of the most prolific of these websites, Mr Deepfake, was even quoted in a 2022 BBC interview, saying, “I don’t really feel that consent is required – it’s a fantasy, it’s not real” (McDermott and Davies). This whole process denies women their agency by taking away their ability to decide how their own image and identity are used in sexual situations. The interview makes this point even more clearly: the creators of this type of media believe that these images can be used to fulfill their sexual fantasies, regardless of whether or not the women are comfortable with their images being used in this way.
This objectification is evident in all forms of gender-based violence, from catcalling to sexual assault. All of these forms of violence deny a woman her ability to choose whether or not to engage in sexual behavior. A man catcalling an underage girl from his car does so for his own pleasure. He does not consider how his actions affect the girl; he does not think about how deeply unsafe his words make her feel, and she simply becomes an object for his sexual gratification. A rapist acts in much the same way: the woman is merely a means to an end for him, a tool or a toy or an opportunity. She is not treated as a person, but rather as an object incapable of agency. This objectification of women is uniquely damaging because it multiplies a victim’s feelings of violation and sends her the fundamental signal that the perpetrator does not even see her as human or as worthy of agency, deepening the psychological and emotional harms she experiences.
Alongside this culture of objectification, discussions of deepfakes also reveal a culture of victim blaming. Tragically, in cases of sexual assault, victim blaming is frequent and focuses on the questions of what a woman could have done to prevent, or worse, what she did to invite, this form of abuse (Sexual Assault Centre of Edmonton). This sort of shaming is also a prevalent experience among victims of deepfakes, with women being “harassed, silenced, sexualized, judged and abused” (Rousay). One woman, who chose to go by Annie in Rousay’s study, recalls people trying to define her as “promiscuous” and “immoral.” This is consistent with the pattern of victim blaming in other instances of sexual violence, because those words suggest that Annie invited the deepfakes’ creation through her own behavior, instead of holding the perpetrator, and the perpetrator alone, accountable for the deepfakes.
While deepfakes are comparable to other forms of sexual violence, they also carry distinct harms that differentiate them from other instances of sexual violence. One of the ways they are uniquely insidious is that while they are fake, they are also functionally permanent, and once posted, they can exist on the internet in some form forever. This is a concern frequently expressed by victims: that their identity will be forever tied to publicly visible, sexually explicit content. One study reported that fifty-seven percent of victims feared for both their employability and their ability to advance professionally. One woman even reported that she was fired from her teaching job after a sexually explicit deepfake of her was discovered; when she spoke out, people only became more compelled to find the video (Rousay). Even if a woman manages to win a lawsuit or somehow prove that the image is AI-generated, the image will still exist in some form online, so a future employer might find the image itself without ever learning that it was later revealed to be a fake. It is also possible that the employer won’t believe a woman when she claims the image was faked, and, as seen with the teacher, whether or not the image was faked is somewhat irrelevant, because the act of speaking out can compel more people to distribute and view the media, making it even more easily accessible online.
The dissemination of deepfakes also presents a new and complex challenge, independent of employment. Because deepfakes live online, they can spread indefinitely and are not confined to a single violent act. As seen in the case of the teacher, more and more people were able to find and spread the deepfakes, so there was never an end to the violence. One Australian woman, Noelle Martin, became an advocate for laws banning deepfake pornography after discovering that she had been the victim of such sexually explicit media. After she spoke out, she became the target of a much more extensive deepfake pornography campaign (Hao). This self-perpetuating cycle is unique to deepfakes, largely because other forms of violence face practical constraints. A victim of traditional revenge pornography does not usually experience this sort of backlash and further victimization, simply because it is not feasible for would-be perpetrators to gain access to more sexually explicit material. A rapist is usually unable to accost the same victim twice, due to the practical constraint of no longer being allowed around her. A deepfake, by contrast, can be made at the push of a button, which allows the harms to continue indefinitely and undermines effective organizing against deepfake pornography, since anyone who tries to organize against it can easily be victimized again.
The ability to generate a deepfake at the push of a button is, in and of itself, a distinctive feature of deepfake pornography. Unlike conventional forms of violence, which require physical proximity to and intimacy with a victim, a deepfake can be made by anyone, anywhere, in less than sixty seconds (Rousay). While it is most likely created by an acquaintance, partner, or friend, it can theoretically be created by anyone, since, unlike conventional violence, it is not restricted by physical separation. A man can simply find a person’s picture online and manipulate it using AI technology as he sees fit. Deepfake-generating websites complicate this further, as a lack of coding knowledge no longer limits a perpetrator’s ability to generate sexually explicit material. This makes the violence much more prevalent, since there are significantly fewer factors acting as a deterrent.
A particularly chilling consequence of this accessibility is the threat of deepfake pornography: unlike with conventional forms of violence, it is entirely possible for pornography to be generated and held indefinitely by the perpetrator, or for it to circulate for some time without the victim finding out. A researcher at Harvard University who studied image-based sexual abuse found, in the quantitative part of her study, that twenty-four of her twenty-six female participants had been threatened with the creation of sexually explicit media. These threats alone can have a chilling effect on victims because, unlike with conventional forms of violence, it is possible for a perpetrator to carry them out without the victim’s immediate knowledge. For other questions, such as “Has someone ever created a nude, semi-nude, and/or sexual image(s) of you without your consent?”, participants were able to select “I don’t know” as an answer, since it is entirely possible for such images to exist without the victim ever learning of them. One participant even recalls being alerted to a sexually explicit deepfake of her by a friend and subsequently finding out that it had already gone viral on Twitter without her knowledge. These threats reproduce some of the same harms as deepfake pornography itself, as victims still fear that the images will be distributed or redistributed and dread the negative impacts of that distribution.
The accessible nature of deepfake technology also means that, even if the abuse does occur at the hands of someone the victim knows, the sexually explicit material can be created and distributed anonymously, so a perpetrator could continue to create media for years without getting caught. The laws around deepfake pornography make this an even more dangerous form of violence because they require a person to prove both the nonconsensual nature of the pornography and the malicious intent behind it. This can be especially difficult for victims when the perpetrator is anonymous, since it can be nearly impossible to prove that an unknown person acted with truly bad intentions. One British woman, Helen Mort, suffered for years at the hands of her abuser, and yet because he acted anonymously and did not reveal her real name, Mort was unable to do anything; he had managed, if only barely, to keep his actions within the bounds of British law (Hao).
All this raises the question: since sexually explicit, nonconsensual deepfake pornography so clearly has destructive effects on victims’ lives and reflects patterns seen in other, widely criminalized, forms of violence against women, why do adequate legal protections continue to elude victims? The answer is complex and rooted in nuanced social problems and contentious dynamics, yet it is also reflected in the harms that deepfakes create. Deepfakes are harder to address because they are anonymous and easy to create, and once they are created, it is nearly impossible to remove them entirely from the internet. They are an objectification of women in a culture that loves to objectify women, and a violation of women’s trust in a society that is steeped in misogyny and sexism.
This leads to a new question: where, then, does the solution lie? This is perhaps a more complex question, as it asks us to solve the root causes of all these harms. While I don’t have a solution to the specifics of deepfake pornography, it does make me wonder why my first introduction to this form of violence was a throwaway line in an article about Taylor Swift. Why wasn’t it discussed in my health class, instead of yet another lesson on the importance of reducing stress from school? Why is such an insidious and destructive form of violence, a form which so clearly harms victims, treated as an afterthought? Again, I do not know the answer, or even whether a clear-cut answer exists in the first place. It does, however, make me consider my own complacency in a culture that devalues and objectifies women. It is a chilling thought to wonder how much deepfake pornography would exist in a society that truly valued and protected women, and then to think about how far away we are from that society. And as technology continues to improve and develop, I wonder whether we as a people are moving toward or away from that more equal society.
Works Cited
Hao, Karen. “Deepfake Porn Is Ruining Women’s Lives. Now the Law May Finally Ban It.” MIT Technology Review, 12 Feb. 2021, www.technologyreview.com/2021/02/12/1018222/deepfake-revenge-porn-coming-ban/. Accessed 3 Feb. 2025.
“Date Rape.” Legal Information Institute, Cornell Law School, Aug. 2022, www.law.cornell.edu/wex/date_rape. Accessed 2 Apr. 2025.
McDermott, Sarah, and Jesse Davies. “Deepfaked: ‘They Put My Face on a Porn Video.'” BBC News, 22 Oct. 2022, www.bbc.com/news/uk-62821117. Accessed 12 Mar. 2025.
McGlynn, Clare. “Deepfake Porn: Why We Need to Make It a Crime to Create It, Not Just Share It.” The Conversation, 9 Apr. 2024, theconversation.com/deepfake-porn-why-we-need-to-make-it-a-crime-to-create-it-not-just-share-it-227177. Accessed 15 Jan. 2025.
Rousay, Victoria. Sexual Deepfakes and Image-Based Sexual Abuse: Victim-Survivor Experiences and Embodied Harms. MA thesis, Harvard University. Digital Access to Scholarship at Harvard, dash.harvard.edu/server/api/core/bitstreams/0aa9cfdb-daf2-429e-9487-05e3c4c732b5/content. Accessed 12 Mar. 2025.
“Victim Blaming.” Sexual Assault Centre of Edmonton, www.sace.ca/learn/victim-blaming/. Accessed 20 Apr. 2025.
Yousif, Nadine. “Texas Woman Awarded $1.2bn in Revenge Porn Case.” BBC, 15 Aug. 2023, www.bbc.com/news/world-us-canada-66514052. Accessed 20 Apr. 2025.