Image-based abuse involves the sharing, threat of sharing, or creation of intimate or private images without consent. It’s sometimes called ‘revenge porn’ in media or legal contexts—but that term is misleading and stigmatising. It’s not pornography. It’s a violation of our trust, privacy, and safety.
This kind of abuse can involve photos or videos taken with or without consent. In some cases, we shared the image willingly at the time, but it’s later used in ways we never agreed to. In others, images may be secretly taken or generated using AI tools.
Image-based abuse can happen in any kind of relationship or online interaction. It may be part of a broader pattern of coercion, blackmail, or domestic abuse—or it may happen in isolation. Regardless of how the images were created or shared, no one ever deserves this kind of violation. In the following pages, we’ll help you understand what’s happening, what you can do about it, and how you can get support.
This section is for you if you’ve experienced image-based abuse, or are worried you may be targeted. You might be confused about whether to trust your instincts, or feel rushed or emotionally overwhelmed. You might feel anxious or scared about what happens next, or what you can do. The information in this section can help.
Image-based abuse can take many forms. Some involve authentic images (taken with a camera); others involve AI-generated content that’s made to look real. In every case, the impact can be deeply violating, especially when the content is sexual, intimate, or private, or when it’s used to threaten, humiliate, or control.
Below are some of the ways that image-based abuse can take place. It’s okay if we feel violated, scared, or unsure—even if the image was shared consensually at one point, or created using AI. Image-based abuse is about power and control, not whether an image is ‘authentic’ or ‘explicit enough’. If our likeness or body is being used without consent, that harm is real—and we deserve support.
Image-based abuse isn’t always obvious at first. Sometimes it starts with a partner who takes intimate photos but won’t delete them. Sometimes it’s a stranger stealing selfies we posted and reposting them so that others can reuse them or harass us. It can involve AI-generated images, photos taken without permission, or images once shared in trust now being used to control, threaten, or humiliate. The impact can be confusing, painful, or even hard to name. But if something feels wrong, it’s worth paying attention.
Even if we’re not sure, even if there’s no clear threat—our feelings matter. Image-based abuse doesn’t need to be shared publicly or go viral to be harmful. If something is worrying or doesn’t feel right, there are steps we can take. We’ll share some practical strategies further on in this section.
Generative AI is a type of technology that creates new content—like images, videos, or voices—based on patterns it has learned from existing data. This can be used in creative or helpful ways, but it’s also being used to harm people through image-based abuse.
AI tools can now create sexually explicit images that look very convincing, even if the person in them never took part. Some are made by editing authentic photos, while others are built entirely from scratch using only a face or profile picture.
Below are some common AI image-based abuse tools. Many of these tools are free, anonymous, and easy to access—survivors are often unaware their likeness has been used until the image is shared or threatened. But even if an image is fake, the harm is real.
The clues that once helped us spot AI-generated content aren’t always reliable any more. Things like extra fingers in photos or blurry edges around objects were once common signs, but new tools are improving fast.
Instead of relying on those glitches, it’s more helpful to look at the bigger picture. These are some helpful questions to ask:
Trusting your instincts, asking questions, and slowing down can all help make sense of what you’re seeing or hearing.
Having private or intimate images shared, or being threatened with their release, can feel deeply violating. Some survivors describe feeling exposed, anxious, ashamed, or unsure what to do next. Others feel numb, angry, or overwhelmed by the fear of what might still happen.
Different people react in different ways, and all of these reactions are valid. These are some questions survivors have voiced, from clarifying what the law says to assessing the extent of the abuse.
Image-based abuse isn’t always preventable—but there are steps that can help reduce risk, increase control, and make it harder for someone to misuse private images or videos. These actions won’t stop someone who’s determined to cause harm, but they can create safer conditions, limit exposure, and support faster response if something does happen.
Taking these steps can be exhausting—both emotionally and practically. It’s okay to move at our own pace and choose only the steps that feel manageable right now. No one should have to manage this alone. Support is available, and we deserve safety, privacy, and peace of mind.
Responding to image-based abuse can feel overwhelming or frightening, but there are options for reporting, removal, and support.
Most social media, chat, and content-sharing platforms allow users to report non-consensual image sharing or threats. If you choose to submit a report, it helps to include screenshots, usernames, and URLs where possible.
Image-based abuse is illegal in many countries, although specific laws and reporting processes vary. If you’re considering this option, local support services can help you understand what protections exist where you live, and may be able to guide you through reporting if you choose to go forward. For instance, in the UK you can report via Report Harmful Content. You can find out about reporting processes in your country by searching online for terms like:
Sharing, possessing, or threatening to share sexual images of anyone under 18 is considered a criminal offence, even if the image was originally shared willingly.
Over the last few screens, we’ve focused on image-based abuse, including non-consensual sharing of private images, sextortion, and AI-generated sexual content. We broke down what image-based abuse looks like, how to identify it, and what steps survivors can take to regain control. We also saw that harm is real even when images weren’t shared widely or didn’t include our face.
Image-based abuse can be deeply distressing, and that can make it hard to absorb information or follow guidance. This guide is here to come back to at any time—for now, these are the main things to keep in mind.
Image-based abuse can involve real or AI-generated images.
Consent to take or share an image once doesn’t mean consent forever.
Survivors don’t need proof to take action or seek support.
Image-based abuse doesn’t need to be shared publicly or go viral to be harmful.
Privacy settings, secure storage, and boundary-setting can reduce risk.