Someone has my personal images

Image-based abuse involves the sharing, threat of sharing, or creation of intimate or private images without consent. It’s sometimes called ‘revenge porn’ in media or legal contexts—but that term is misleading and stigmatising. It’s not pornography. It’s a violation of our trust, privacy, and safety.

What to know

This kind of abuse can involve photos or videos taken with or without consent. In some cases, we shared the image willingly at the time, but it’s later used in ways we never agreed to. In others, images may be secretly taken or generated using AI tools.

Image-based abuse can happen in any kind of relationship or online interaction. It may be part of a broader pattern of coercion, blackmail, or domestic abuse—or it may happen in isolation. Regardless of how the images were created or shared, no one ever deserves this kind of violation. In the following pages, we’ll help you understand what’s happening, what you can do about it, and how you can get support.

This is for you if…

This section is for you if you've experienced, or are worried you may experience, image-based abuse: someone sharing, threatening to share, or creating intimate or private images of you without consent. You might be unsure whether what happened 'counts', or feel rushed or emotionally overwhelmed. You might feel anxious or scared about what happens next, or what you can do. The information in this section can help.

Understanding image-based abuse

Image-based abuse can take many forms. Some involve authentic images (taken with a camera); others involve AI-generated content that's made to look real. In every case, the impact can be deeply violating—especially when the content is sexual, intimate, or private, or when it's used to threaten, humiliate, or control.

Below are some of the ways that image-based abuse can take place. It’s okay if we feel violated, scared, or unsure—even if the image was shared consensually at one point, or created using AI. Image-based abuse is about power and control, not whether an image is ‘authentic’ or ‘explicit enough’. If our likeness or body is being used without consent, that harm is real—and we deserve support.

Nonconsensual image sharing
Secretly taken images
Threats to share images
Cyberflashing
Child Sexual Abuse Material (CSAM)

What it can mean for us: Identifying image-based abuse

Image-based abuse isn't always obvious at first. Sometimes it starts with a partner who takes intimate photos but won't delete them. Sometimes it's a stranger stealing selfies we posted and reposting them, so that others can reuse them or harass us. It can involve AI-generated images, photos taken without permission, or images once shared in trust now being used to control, threaten, or humiliate. The impact can be confusing, painful, or even hard to name. But if something feels wrong, it's worth paying attention.

Red flags to look out for

  • Someone refusing to delete photos or videos, even when asked.
  • Being pressured to send pictures, especially after saying no.
  • Threats to share images publicly or with friends, family, or employers.
  • Discovering images of ourselves online that we didn’t post—or that look altered or AI-generated.
  • Seeing our face, body, or name linked to sexual content without our consent.
  • Hearing from others that our images are circulating, even if we haven’t seen them directly.

How we might feel

  • Like we’ve lost control over our own body or identity.
  • Afraid to go online, go out, or use our real name.
  • Anxious about who has seen the image—or who might see it next.
  • Embarrassed, ashamed, or unsure whether it ‘counts’ as abuse.
  • Confused about how the image was made or where it came from.

Questions we can ask ourselves

  • Was this image created, kept, or shared without my full and ongoing consent?
  • Is someone using it to shame, control, or hurt me?
  • Do I feel unsafe, exposed, or unable to make it stop?
  • Has this changed how I use technology, social spaces, or how safe I feel in my body?

Even if we’re not sure, even if there’s no clear threat—our feelings matter. Image-based abuse doesn’t need to be shared publicly or go viral to be harmful. If something is worrying or doesn’t feel right, there are steps we can take. We’ll share some practical strategies further on in this section.

The impact of AI

Generative AI is a type of technology that creates new content—like images, videos, or voices—based on patterns it has learned from existing data. This can be used in creative or helpful ways, but it’s also being used to harm people through image-based abuse.

AI tools can now create sexually explicit images that look very convincing, even if the person in them never took part. Some are made by editing authentic photos, while others are built entirely from scratch using only a face or profile picture.

Below are some common AI image-based abuse tools. Many of these tools are free, anonymous, and easy to access—survivors are often unaware their likeness has been used until the image is shared or threatened. But even if an image is fake, the harm is real.

‘Undressing’ apps
Face-swapping apps
Prompt-based generative AI tools
Chatbots
AI and automated accounts

Remember, AI is evolving quickly

The clues that once helped us spot AI-generated content aren’t always reliable any more. Things like extra fingers in photos or blurry edges around objects were once common signs, but new tools are improving fast. 

Instead of relying on those glitches, it’s more helpful to look at the bigger picture. These are some helpful questions to ask:

  • Does the image or audio feel believable? 
  • Who originally shared it, and where did it come from? 
  • Can you double-check it through another reputable source or tool? 

Trusting your instincts, asking questions, and slowing down can all help make sense of what you’re seeing or hearing.

You’re not alone if you’re thinking…

Having private or intimate images shared—or threatened to be shared—can feel deeply violating. Some survivors describe feeling exposed, anxious, ashamed, or unsure what to do next. Others feel numb, angry, or overwhelmed by the fear of what might still happen. 

Different people react in different ways, and all these reactions are valid. These are some questions survivors have voiced, from clarifying legalities to assessing the extent of the abuse.

Can I report it even if I originally agreed to take or send the image?
They didn’t post the photo—they just threatened to. Is that illegal?
What if they say they deleted it, but I’m not sure?
What if my face isn’t in the image?
How do I know if the image has been shared somewhere else?
How do I prove an AI-generated image is fake?

Things we can do

Image-based abuse isn’t always preventable—but there are steps that can help reduce risk, increase control, and make it harder for someone to misuse private images or videos. These actions won’t stop someone who’s determined to cause harm, but they can create safer conditions, limit exposure, and support faster response if something does happen.

Taking these steps can be exhausting—both emotionally and practically. It’s okay to move at our own pace and choose only the steps that feel manageable right now. No one should have to manage this alone. Support is available, and we deserve safety, privacy, and peace of mind.

Secure your accounts
Check what’s visible online
Be safe when creating or sharing images
Verify identities
Set clear boundaries

Further steps and support

Responding to image-based abuse can feel overwhelming or frightening, but there are options for reporting, removal, and support.

Document the abuse

  • Save screenshots, URLs, platform names, usernames, and any related messages. 
  • Where possible, download or export a copy of the image, video, or webpage, and store everything in a secure folder, such as the Locked Folder in Google Photos on Android or the Hidden album in Apple Photos.
  • Safety Net Project can provide more detailed guidance on how to document online abuse.

Most social media, chat, and content-sharing platforms allow users to report nonconsensual image sharing or threats. If you choose to submit a report, it’s helpful to include screenshots, usernames, and URLs where possible.

Image-based abuse is illegal in many countries, although specific laws and reporting processes vary. If you’re considering this option, local support services can help you understand what protections exist where you live, and may be able to guide you through reporting if you choose to go forward. For instance, in the UK you can report via Report Harmful Content. You can find out about reporting processes in your country by searching online for terms like:

  • ‘[your country] revenge porn laws’
  • ‘[your country] image-based abuse’

Sharing, possessing, or threatening to share sexual images of anyone under 18 is considered a criminal offence, even if the image was originally shared willingly. 

Getting images removed

  • Our Survivor AI: Compassionate guidance through the content takedown request process. Our free, anonymous AI-powered tool helps you create strong takedown letters tailored to your specific situation, which you can then send via email.
  • StopNCII.org: Allows people to hash their intimate images (including those generated by AI) and prevent them from being uploaded on participating platforms.
  • Take It Down: A free removal service for anyone under 18 who is worried about their images being shared online. Available in the US and globally.
  • Cyber Civil Rights Initiative: Offers tools and legal info for removing images in the US and internationally.

What we’ve covered in this section

Over the last few screens, we’ve focused on image-based abuse, including nonconsensual sharing of private images, sextortion, and AI-generated sexual content. We broke down what image-based abuse looks like, how to identify it, and what steps survivors can take to regain control. We also confirmed that we can still be harmed, even when images weren’t shared widely or didn’t include our face.

Image-based abuse can be deeply distressing, and that can make it hard to absorb information or follow guidance. This guide is here to come back to at any time—for now, these are the main things to keep in mind.

Key takeaways

  • Image-based abuse can involve real or AI-generated images.

  • Consent to take or share an image once doesn’t mean consent forever.

  • Survivors don’t need proof to take action or seek support.

  • Image-based abuse doesn’t need to be shared publicly or go viral to be harmful.

  • Privacy settings, secure storage, and boundary-setting can reduce risk.
