TFGBV Taxonomy
Abuse Type:

Intimate image abuse (IIA)

Last Updated 6/12/25
Definition: Producing, reproducing, or sharing intimate images without consent.
Sub Types:
Deceptive synthetic media, Voyeuristic recording
Perpetrators:
Personal connection, Informal group, Nation-state, Formal group
Perpetrator Intents:
Compliance, Entertainment, Punitive intent, Financial gain, Silence, Aggrandizement
Targets:
Public figure, Private individual
Impact Types:
Economic harm, Psychological & emotional harm, Self-censorship, Social & political harm, Abuse normalization, Sexual harm
Synonyms:
Non-consensual explicit imagery (NCEI), Revenge porn (misnomer), Non-consensual intimate image sharing (NCIIS), Non-consensual intimate image abuse (NCII), Image-based sexual abuse (IBSA), Non-consensual pornography (misnomer), Non-consensual image abuse, Image-based abuse, Non-consensual distribution of intimate images (NDII)
Skill Required: Low

Intimate image abuse (IIA), formerly (and mistakenly) termed "revenge porn," involves producing, reproducing, and/or sharing intimate images or videos without the depicted person's consent. The shift away from the older moniker emphasizes that the issue isn't solely spiteful ex-partners seeking retaliation: the abuse is a serious violation of privacy and consent regardless of motive. IIA is a persistent challenge for platforms that permit explicit content, because distinguishing between consensual and non-consensual material is not always possible from the content alone. Additionally, sexual content that was consensually recorded can later be shared non-consensually, further complicating an already fraught dilemma.

While the vast majority of platforms prohibit IIA (or its synonyms) in their Terms of Service or Policies/Community Standards, a few nefarious sites are explicitly designed for this kind of content. The difficulty of determining consent from content alone leads many platforms that ostensibly prohibit IIA to inadvertently host it alongside consensual material.

Yet the omnipresence of the internet means that once something is uploaded, eradicating it entirely is nearly impossible.

Digital platforms today have limited means to address the hosting of IIA, though legal regimes, cross-platform collaboration, and better reporting mechanisms may offer some hope for moving toward robust takedowns. The role platforms can play most effectively is to ensure they aren't actively promoting or enabling the discovery of IIA. Web searches for an individual's name should only return intimate imagery for people who have intentionally associated their name or image with adult material online. Known IIA websites can be added to deny-lists so that platforms prevent users from sharing or distributing links to their content. Approaches like these might seem low stakes, but for a user whose privacy has been violated, they can make a world of difference in how they are perceived and able to represent themselves in the world.
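The deny-list approach can be sketched as a simple check at link-share time. The following is a minimal illustration, not a production design; the `DENY_LIST` contents and function names are hypothetical, and a real deployment would source the list from a maintained, shared registry:

```python
from urllib.parse import urlparse

# Hypothetical deny-list of domains known to host IIA content.
# In practice this would come from a maintained, shared list.
DENY_LIST = {"example-iia-site.test", "another-bad-host.test"}

def is_blocked(url: str) -> bool:
    """Return True if the URL's host is on the deny-list,
    including any subdomain of a listed domain."""
    host = (urlparse(url).hostname or "").lower()
    return any(host == d or host.endswith("." + d) for d in DENY_LIST)

def allow_share(url: str) -> bool:
    """Gate applied before a user-submitted link is posted."""
    return not is_blocked(url)
```

Matching on the host (rather than the full URL string) catches mirrors and subdomains of a listed site while leaving unrelated pages that merely mention the domain unaffected.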

The perpetrator may also have committed other abuse types while perpetrating IIA:

  • Was the creation of the image itself abuse? That may be either voyeuristic recording or deceptive synthetic media, or the target may have been coerced via sexual extortion to create the image.
  • Was there a threat of sharing the image unless the target performed some action? That is sexual extortion.
  • Was the receiver of the image or recording harmed by the viewing of the material? That is inappropriate content.

Cultural variation

What counts as an "intimate" image varies widely, and this variation is a large factor in the difficulty of mitigating this kind of abuse. Actions that are innocuous in some cultures are taboo in others.

Examples include:

  • An image of two people holding hands, or of a person who appears to be in a relationship
  • A person normally veiled being shown without veil
  • A person being shown next to materials forbidden/taboo in their culture

In Pakistan's remote Kohistan region, a woman was reportedly killed by family members after the circulation of a digitally altered photograph showing her holding hands with a man (Mao et al., 2023). In Bangladesh, a deepfake of a woman politician in a bikini led to public criticism (Verma & Zakrzewski, 2024).

In conservative societies, even minor image manipulations can severely damage reputation and safety, while similar manipulations might have lesser consequences elsewhere.

Companies and policies that intend to address this form of abuse at a global scale must contend with these differences in interpretation across the populations they serve.

References

  • India Ministry of Home Affairs, National Crime Records Bureau. (2022). Crime in India 2022 - Statistics Volume-I. https://www.ncrb.gov.in/uploads/nationalcrimerecordsbureau/custom/1701607577CrimeinIndia2022Book1.pdf
  • Joyful Heart Foundation. (2024). Image-Based Abuse. https://www.joyfulheartfoundation.org/learn/image-based-abuse
  • Kaspersky. (2024). The Naked Truth - How intimate image sharing is reshaping our world. https://media.kasperskydaily.com/wp-content/uploads/sites/86/2024/07/15164921/The-Naked-Truth-Kaspersky.pdf
  • Mao, F., Ng, K., & Zubair, M. (2023, November 28). Pakistan: Woman killed after being seen with man in viral photo. BBC News. https://www.bbc.com/news/world-asia-67551554
  • Papachristou, K. (2023). Revenge Porn Helpline 2023 Report. Revenge Porn Helpline. https://revengepornhelpline.org.uk/assets/documents/revenge-porn-helpline-report-2023.pdf
  • Powell, A., Flynn, A., & Hindes, S. (2022, December). Technology-facilitated abuse: National survey of Australian adults’ experiences. ANROWS - Australia’s National Research Organisation for Women’s Safety. https://www.anrows.org.au/publication/technology-facilitated-abuse-national-survey-of-australian-adults-experiences/
  • Revenge Porn Helpline. (2024, April 18). Reports to the Revenge Porn Helpline increased by 106% in 2023. https://revengepornhelpline.org.uk/news/reports-to-the-revenge-porn-helpline-increased-by-106-in-2023/
  • Ruvalcaba, Y., & Eaton, A. A. (2019). Nonconsensual pornography among U.S. Adults: A sexual scripts framework on victimization, perpetration, and health correlates for women and men. Psychology of Violence, 10(1). https://doi.org/10.1037/vio0000233
  • UN Women. (2021). Violence against women in the online space: insights from a multi-country study in the Arab States. UN Women – Arab States. https://arabstates.unwomen.org/en/digital-library/publications/2021/11/violence-against-women-in-the-online-space
  • UN Women. (2023a). The dark side of digitalization: Technology-facilitated violence against women in Eastern Europe and Central Asia. UN Women – Europe and Central Asia. https://eca.unwomen.org/en/digital-library/publications/2023/11/the-dark-side-of-digitalization-technology-facilitated-violence-against-women-in-eastern-europe-and-central-asia
  • UN Women. (2024). Frequently Asked questions: Tech-facilitated gender-based Violence. UN Women – Headquarters. https://www.unwomen.org/en/what-we-do/ending-violence-against-women/faqs/tech-facilitated-gender-based-violence
  • Vengattil, M., & Kalra, A. (2022, July 21). Facebook’s growth woes in India: too much nudity, not enough women. Reuters. https://www.reuters.com/technology/facebooks-growth-woes-india-too-much-nudity-not-enough-women-2022-07-21/
  • Verma, P., & Zakrzewski, C. (2024, April 23). AI deepfakes threaten to upend global elections. No one can stop them. Washington Post. https://www.washingtonpost.com/technology/2024/04/23/ai-deepfake-election-2024-us-india/

AI Risks and Opportunities

As AI improves and becomes more ubiquitous, the risk of deceptive synthetic media being successfully used as a form of IIA increases, both because such media is less easily identified and because the skill required to develop convincing images is lower.

Prevalence

  • In a 2024 global survey, 46% of respondents reported that intimate images of themselves or of an acquaintance had been shared (Kaspersky, 2024).
  • Women have intimate images shared without their consent 28 times more often than men (Papachristou, 2023).
  • In one study, “30% of men who received intimate images believed that it granted them ownership" (Kaspersky, 2024).
  • In an Australian study, 28.9% of women had experienced sexual and image-based abuse, compared to 19.3% of men (Powell et al., 2022).
  • 57% of women in one survey said that they had experienced video- and image-based abuse (UN Women, 2025).
  • In the US, a national survey of 3,000 participants, 54% of whom were women, found that 1 in 12 adults report being victim-survivors of IIA, with the real number thought to be much higher (Ruvalcaba & Eaton, 2019).
  • In India, according to the latest report from the National Crime Records Bureau in 2022, the number of reported crimes in which sexually explicit material of women was published increased by 18% compared to the previous year (India Ministry of Home Affairs, 2022).
  • A 2019 survey by IT for Change of 881 college-going women aged 19-23 in South India found that 30% had sexually explicit images shared without their consent (India Ministry of Home Affairs, 2022).
  • In India, 79% of female Facebook users cited concerns about photo misuse as a reason they did not want to use the platform (Vengattil & Kalra, 2022).
  • In the UK, 95% of cases reported to the Revenge Porn Helpline involved a female survivor, and the organization observed a 106% increase in cases in 2023 compared to the previous year (Revenge Porn Helpline, 2024).

Mitigation Strategies

Know your customer (KYC): Require proof of identity for functions with a higher potential for abuse.
Update ranking model: Move away from engagement-based content ranking.
Default to highest privacy settings: Set default privacy settings to minimize user vulnerability.
Rate limits on low-trust accounts: Apply rate limits to interactions from new or unverified accounts.
Transparent feedback and reporting: Enhance feedback mechanisms for reporting and transparency.
Real-time prompts for reconsideration: Nudge users to reconsider harmful behavior.