De-nudifying Humanity: On the Reasons Why We Need to Talk About Image-Based Sexual Abuse

In an age when undressing someone online can happen with a single click, we need to look for ways to combat a worrisome new trend that the European Commission defines as ‘image-based sexual abuse’ (IBSA). Although the majority of victims are women and teenage girls, it affects men and boys as well (Dodge, 2021). 

Some cases of IBSA are highly publicized, such as the deep fake nudes of Taylor Swift that spread on the social media platform X in early 2024 (Verma and Mark, 2024). Those images were most likely created with AI, and Swift’s fans mounted a massive support campaign; most cases of IBSA, however, are far more mundane. 

When men share nonconsensual pornography online, they wield significant power over women: such actions can severely damage a woman’s employment prospects, her familial and social relationships, and her self-esteem. Women, too, have engaged in image-based sexual harassment, for example ex-wives or jealous individuals who gain access to the electronic devices or messages of a victim’s partner and disseminate the material without consent (Bates, 2017). Research consistently shows that women and girls are disproportionately affected by IBSA compared to men and boys. Not only do women and girls encounter elevated levels of online abuse and image-based harassment, but the overwhelming majority of images and individuals targeted on dedicated ‘revenge porn’ platforms are female (Rigotti and McGlynn, 2022). 

The overwhelming majority of deep fakes, approximately 96%, are pornographic. According to an international survey, 14% of respondents reported that someone had either generated or circulated digitally manipulated nude or sexual images of them without their permission (Rigotti and McGlynn, 2022). More specifically, among those whose photos were taken without consent, digital alteration occurred in one-third (34%) of cases (Rigotti and McGlynn, 2022). Recent reports from Ireland highlight the proliferation of thousands of deep fake pornographic images of women being produced and exchanged online (Rigotti and McGlynn, 2022). Unsurprisingly, a survey conducted by the German organization HateAid in 2021 revealed that 30% of women across the EU harbour concerns about the possibility of fake intimate images of themselves being disseminated without their consent (Rigotti and McGlynn, 2022). 

The Extent of the Problem 

IBSA has emerged alongside the normalization of women’s objectification in the media, where it has come to be treated as an acceptable means of “punishing” its victims. The evolution of internet technologies, particularly social media platforms, has given perpetrators of online sexual abuse ever greater reach (Davis Kempton, 2020). 

A 2017 survey conducted by the Cyber Civil Rights Initiative in the United Kingdom found that one in eight social media users has been a target of IBSA. According to a 2016 study, one in 10 women under the age of 30 has been threatened with IBSA (Davis Kempton, 2020). A 2021 survey covering 51 countries, including many European ones, found that 57% of women had been victims of IBSA (Rigotti and McGlynn, 2022). 

Studies show that individuals in committed romantic relationships are more likely to sext than those not in a relationship (Bates, 2017). Evidence also indicates that women generally do not send nude photos to men they do not know; a level of trust is necessary before a woman feels comfortable sending a naked picture. Trust is irrelevant in cases of deep fake nudes, but where a victim has first sent nudes herself (or himself), the abuse is often compounded by victim blaming (Bates, 2017). 

Further, many women and girls are unaware that they have been victims of IBSA at all, particularly in cases involving deep fakes, hidden-camera voyeurism, or material shared in groups and internet forums the victim never sees. The reported incidence of abuse is therefore likely to be a considerable underestimate (Rigotti and McGlynn, 2022). 

Importance of Terminology 

The term ‘image-based sexual abuse’ encompasses various acts of non-consensual creation, capture, or dissemination of intimate images or videos, which may involve altered or manipulated content, as well as threats to distribute such material (Rigotti and McGlynn, 2022). This includes situations commonly referred to as ‘revenge porn,’ where former partners maliciously share intimate content without consent, along with behaviours like threatening to share such material or circulating hacked images (Rigotti and McGlynn, 2022). 

The concept of IBSA emerged in response to the widespread use of the term ‘revenge porn,’ which is still commonly employed worldwide (Rigotti and McGlynn, 2022). That term, however, is misleading and tends to shift blame onto victims: many individuals subjected to this form of abuse feel it implies they are somehow at fault for provoking the ‘revenge,’ which exacerbates their sense of victimization (Rigotti and McGlynn, 2022). 

Importantly, IBSA extends beyond mere distribution to encompass the unauthorized creation of intimate media, particularly through the use of technology and artificial intelligence to produce sexual or pornographic material, commonly referred to as ‘deep fakes’ (Rigotti and McGlynn, 2022). It also encompasses the unauthorized capturing of intimate images, such as instances where individuals are filmed or photographed without their knowledge while changing, showering, asleep, or under the influence of drugs or alcohol, or through the surreptitious use of hidden cameras in public places or on public transportation, or as a result of coercion (Rigotti and McGlynn, 2022). Many specific forms of this abuse have distinct labels, such as ‘upskirting’ or ‘sextortion,’ and may also include images depicting sexual assault (Rigotti and McGlynn, 2022). 

One way AI is used to create nude images is the relatively new phenomenon of ‘nudification’ apps, accessed by millions worldwide: a non-sexual image is uploaded, and within seconds a nude image is generated (Rigotti and McGlynn, 2022). Nudification apps such as Nudify or Deepnude are freely available online to anyone. 

Consequences of IBSA: Toll on Victim’s Mental Health 

The impact of nonconsensual pornography includes public shame and humiliation, an inability to find new romantic partners, mental health effects such as depression and anxiety, job loss or problems securing new employment, and offline harassment and stalking (Bates, 2017). 

Almost all victims questioned reported a significant decline in their trust of others after their intimate photos or videos were spread (Bates, 2017). For many, the shift was drastic: they went from being highly trusting individuals to rarely confiding in anyone after being betrayed by someone they deeply cared about. In addition to the erosion of trust, many participants endured heightened and disruptive mental health repercussions, often resulting in formal diagnoses of PTSD, anxiety, and depression (Bates, 2017). The loss of autonomy over one’s body was particularly traumatic, akin to the violation experienced in cases of sexual assault (Bates, 2017). 

Legal Background and Sanctions 

As indicated earlier, IBSA is often perceived as a violation of sexual autonomy and a type of sexual assault. Consequently, for many victims, it is most accurately understood as a sexual offence. This perspective has been acknowledged by the European Parliament, which has urged Member States to revise their domestic legislation to incorporate ‘image-based sexual abuse’ into the category of sexual offences (Rigotti and McGlynn, 2022). 

The European Commission has proposed, as Article 7 of the Directive of the European Parliament and of the Council on combating violence against women and domestic violence, a definition of the offence of non-consensual sharing of intimate or manipulated material. It covers photos and videos of real people and situations as well as material manipulated with AI, and it targets both spreading such images, videos, and material and threatening to do so (European Parliament, 2022). 

Some EU Member States, such as Romania, have already taken the Commission’s approach, enacting new criminal laws that treat some forms of IBSA as sexual offences as part of their transposition of the Istanbul Convention. Other jurisdictions characterize IBSA as a privacy violation, as in recent French legislation. There are also more comprehensive approaches: Swedish law, for instance, covers many forms of IBSA as “acts of ‘intrusive photography’ and a breach of privacy, as well as part of laws on ‘liberty’ and ‘peace’” (Rigotti and McGlynn, 2022). 

Legal means to protect people from IBSA in Estonia are rather limited, concludes Kristjan Kikerpill, PhD, lecturer in Information Law and Digital Sociology at the University of Tartu, in an interview with the author of this article. “Unlawful use of an image is, in the most general case, a civil law issue in Estonia – a similar situation would exist, among other things, if people’s pictures were used in the distribution of fake advertisements. Nor would it be possible to rely, for example, on Section 157-1 of the Penal Code (illegal use of the identity of another person), because in the context of ‘revenge porn’ the creator and distributor of a deep fake is not trying to impersonate another person. Only if the matter moves beyond sharing the image to, for example, extortion is it possible to react. The extortion provision of the Penal Code covers the ‘publication of embarrassing information’: it applies to demanding the transfer of pecuniary benefits under threat of restricting a person’s freedom, publishing embarrassing information, or destroying or damaging property, as well as where violence has been used.” 

Kikerpill suggests turning to the online police officer who, even if the matter does not fall within the legal remit of the Estonian Police and Border Guard Board, will probably still be able to help. He adds: “If the content is illegal, a relevant note from the online police officer can have a more effective outcome with the platform manager than a complaint from a private person. It is worth contacting the Data Protection Inspectorate in any case if it is clear who ‘is in the picture’ and that the matter involves personal data processing.” 

Kikerpill admits that drafting a new law in the European Union and adopting it at the Member State level is a years-long process. Yet he expects that Estonia will eventually transpose the aforementioned directive on the basis of equal treatment, i.e. both women and men will be protected from IBSA in the – hopefully not too distant – future. 

Positive Interventions 

Education has power. Analyses of studies carried out around the world suggest that part of the solution lies in educational programs for both boys and girls and in raising awareness through the media (Pérez-Martínez et al., 2023). It is crucial to challenge the hegemonic masculinity stereotype, “a gendered ideal that promotes the dominant position of men and the subordination of women,” and to replace it with positive masculinity, which “promotes more inclusive, empathetic, caring, and egalitarian norms of manhood” (Pérez-Martínez et al., 2023). 

Positive masculinity is seen as the alternative to hegemonic masculinity in that it promotes more inclusive, empathetic, caring, and egalitarian forms of manhood (Lomas, 2013). Studies suggest that transforming gender norms and attitudes is key to preventing victimization (Semahegn et al., 2019). Secondary socialization of young people in school, the mass media (Evers, 2013), and public discourse all shape models of masculinity (Pérez-Martínez et al., 2023). 

Because hegemonic masculinity is associated with the acceptance of violent behaviours and attitudes, the interventions reviewed also incorporated conflict-management and communication-skills components aimed at resolving conflict situations without violence (Pérez-Martínez et al., 2023). 

Resources Available Online

Internet Safety and Media Literacy 

  1. Safer internet. SALTO. (n.d.). https://participationpool.eu/resource-category/information-critical-thinking/safer-internet/  
  2. Sexual and sexist bullying. Anti-Bullying Alliance. (n.d.). https://anti-bullyingalliance.org.uk/tools-information/all-about-bullying/sexual-and-sexist-bullying 

Help for victims of IBSA 

  1. Cyber crime against women. (2023, December 27). GeeksforGeeks. https://www.geeksforgeeks.org/cyber-crime-against-women/ 
  2. Stop Non-Consensual Intimate Image Abuse. StopNCII.org (South West Grid for Learning). (n.d.). https://stopncii.org/ 

Positive Masculinity 

  1. The Foundation for Positive Masculinity. (2023, May 29). https://positivemasculinity.org.au/ 
  2. The Centre for Male Psychology. (n.d.). https://www.centreformalepsychology.com/ 

Bibliography

  1. Bates, S. (2017). Revenge porn and mental health: A qualitative analysis of the mental health effects of revenge porn on female survivors. Feminist Criminology, 12(1), 22–42. https://doi.org/10.1177/1557085116654565 
  2. Commission welcomes political agreement on new rules to combat violence against women and domestic violence. (2024, February 6). European Commission. Retrieved April 14, 2024, from https://ec.europa.eu/commission/presscorner/detail/en/ip_24_649
  3. Davis Kempton, S. (2020). Erotic extortion: Understanding the cultural propagation of revenge porn. SAGE Open, 10(2), 2158244020931850. https://doi.org/10.1177/2158244020931850 
  4. De Angeli, A., Falduti, M., Menendez Blanco, M., & Tessaris, S. (2021, July). Reporting revenge porn: A preliminary expert analysis. In CHItaly 2021: 14th Biannual Conference of the Italian SIGCHI Chapter (pp. 1–5). https://doi.org/10.1145/3464385.3464739 
  5. Deepnude. https://app.deepnude.cc/upload (accessed April 14, 2024) 
  6. Dodge, A. (2021). Trading nudes like hockey cards: Exploring the diversity of ‘revenge porn’ cases responded to in law. Social & Legal Studies, 30(3), 448–468. https://www.researchgate.net/publication/342782553_Trading_Nudes_Like_Hockey_Cards_Exploring_the_Diversity_of_’Revenge_Porn’_Cases_Responded_to_in_Law 
  7. Evers, C. (2013). The media and models of masculinity. Gender, Place & Culture, 20(7), 933–934. 
  8. Gavin, J., & Scott, A. J. (2019). Attributions of victim responsibility in revenge pornography. Journal of Aggression, Conflict and Peace Research, 11(4), 263–272. https://www.researchgate.net/publication/336925764_Attributions_of_victim_responsibility_in_revenge_pornography 
  9. Kikerpill, K. (2024, April 14). Author’s interview via email. Tartu. 
  10. Lomas, T. (2013). Critical positive masculinity. Masculinities & Social Change, 2(2), 167–193. 
  11. Nudify. https://nudify.site/ (accessed April 14, 2024) 
  12. Pérez-Martínez, V., Marcos-Marcos, J., Cerdán-Torregrosa, A., Briones-Vozmediano, E., Sanz-Barbero, B., Davó-Blanes, M. C., Daoud, N., Edwards, C., Salazar, M., La Parra-Casado, D., & Vives-Cases, C. (2023). Positive masculinities and gender-based violence educational interventions among young people: A systematic review. Trauma, Violence, & Abuse, 24(2), 468–486. https://doi.org/10.1177/15248380211030242 
  13. Proposal for a Directive of the European Parliament and of the Council on combating violence against women and domestic violence. (2022, March 8). EUR-Lex. Retrieved April 14, 2024, from https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:52022PC0105 
  14. Rigotti, C., & McGlynn, C. (2022). Towards an EU criminal law on violence against women: The ambitions and limitations of the Commission’s proposal to criminalize image-based sexual abuse. New Journal of European Criminal Law, 13(4), 452–477. https://doi.org/10.1177/20322844221140713 
  15. Semahegn, A., Torpey, K., Manu, A., Assefa, N., Tesfaye, G., & Ankomah, A. (2019). Are interventions focused on gender norms effective in preventing domestic violence against women in low and lower-middle-income countries? A systematic review and meta-analysis. Reproductive Health, 16(1), 93. 
  16. Verma, P., & Mark, J. (2024, January 26). How Taylor Swift’s legions of fans fought back against fake nudes. The Washington Post. Retrieved June 6, 2024, from https://www.washingtonpost.com/technology/2024/01/26/taylor-swift-fans-deepfakes-ai-porn/

Eva Ladva

Eva Ladva is currently a Master’s student in the Disinformation and Societal Resilience program at the University of Tartu. She is a fierce women’s rights advocate and was one of the first Estonians who dared to call herself a feminist in public.