It’s something else

This section offers support for situations that don’t necessarily fall under scams, harassment, image-based abuse, location-tracking, or unauthorised account access—but still leave us feeling unsafe, overwhelmed, or unsure what to do next. We can’t cover everything in one guide, but here we’ve added information and support for three other cases: financial abuse, worrying that a loved one is caught in a romance scam, and when AI chatbot relationships feel too much.

What to know

Not every form of digital harm fits neatly into a category. Sometimes the harm is subtle, complex, or doesn’t match what people typically think of as abuse. That doesn’t make it any less real. Even if the harm doesn’t have a clear label, what matters most is how it’s affecting our life. If something feels off or unmanageable, support is available.

What we’re covering in this section

This is for you if…

This section is for you if someone is using money or financial access to exert coercive control over you—like a partner controlling your bank accounts or spending, a family member taking control of your benefits, a flatmate refusing to split bills fairly, or a carer using their access to manage your funds without consent.

This section is also for you if you’re worried that someone you care about seems overly invested in a suspicious online relationship, or if you or someone you know is struggling with an AI chatbot relationship.

You may be feeling anxious, uncomfortable, powerless, or overwhelmed. The information in this section can help.


Financial abuse

Digital financial abuse can happen in any relationship: with a romantic partner, family member, friend, carer, flatmate, or employer. In a romantic relationship, it’s when one partner uses money (or access to financial tools) as a way to control, isolate, or intimidate the other. It’s often part of a pattern of coercive control, and can be especially difficult to name when it’s wrapped up in emotional manipulation, ‘helpfulness’, or the pressures of shared finances.

Financial abuse can affect our ability to meet basic needs, leave a harmful relationship, or make independent decisions. And because it often happens behind closed doors or through digital accounts, it’s not always visible to others—even close friends or family.

When it happens outside a partner relationship, it’s most often in situations where someone depends on another person for housing, support, or access to online systems. These dynamics can be especially hard to name as abuse, because they’re often mixed with obligation, trust, or fear of conflict. But when someone uses their position or access to limit our financial independence, control our spending, or monitor our activity—it’s still a form of coercive control.

What it can look like

In a partner relationship:

  • Demanding access to our bank accounts, benefits, or credit.
  • Forcing us to share passwords, PINs, or real-time updates on spending.
  • Hiding money from us, draining joint accounts, or opening financial accounts in our name without consent.
  • Blocking or sabotaging our financial decisions, like applying for jobs or spending personal income.
  • Using our immigration status, disability, or dependence to justify control.
  • Framing control as ‘helping’ or ‘protecting’ us—then reacting with guilt, anger, or threats when independence is raised.

Outside a partner relationship:

  • A family member taking control of our online benefit accounts or banking apps.
  • A flatmate misusing shared payment platforms or refusing to split bills fairly.
  • A carer or employer using access to restrict our purchases, monitor activity, or manage funds without consent.
  • Coercive control tied to structural inequalities—like someone using our immigration status, language access, or job dependence to justify control.

Things we can do

We don’t need to justify discomfort. If something feels wrong, it’s okay to look for support or explore safer options.

  • Set up a new email account that only you can access.
  • Open a separate bank account.
  • Use additional security features.
  • Clear and secure your browser.
  • Document the abuse.
  • Review shared accounts.

Further support

  • Some banks and credit unions offer confidential services or ‘safe accounts’ for people experiencing abuse or in coercive or dependent situations. They may also provide support with debt, fraud, or frozen accounts.
  • Digital safety and domestic abuse services often include support around financial control, even if there’s no physical violence, and even if the other person isn’t a romantic partner. Advocates can help create safety plans that include financial independence.
  • Worker and immigration rights organisations may offer legal or advocacy support if financial abuse is tied to employment, visa status, or economic exploitation.
  • Trusted friends, advocates, or digital safety organisations can help review your options, create new accounts, make a plan, or simply listen without judgement.
  • Advocacy groups for carers, workers’ rights, or housing support may be able to help challenge financial control and protect your rights.

Talking to a loved one caught in a romance scam

Romance scams often target people who are lonely, grieving, or simply looking for connection. Scammers build emotional intimacy through long conversations, flattery, and shared stories—before asking for money, gifts, or help with a crisis. These relationships can feel real, meaningful, and hard to let go of.

If someone you care about seems emotionally invested in an online relationship that feels suspicious, it can be painful to watch—and hard to know what to say. Confronting them too directly may make them defensive or push them away. But staying silent can feel just as difficult.

What it can look like

In our loved one: 

  • A sudden, intense online relationship—especially with someone they’ve never met in person.
  • Changes in behaviour, such as withdrawing from friends, becoming secretive, or showing signs of distress if questioned about the relationship.

In their romantic interest:

  • Avoiding video chats or offering repeated excuses for not meeting up.
  • Requests for money, gift cards, travel support, or personal details (like bank information or intimate photos).
  • Dramatic backstories that create urgency—like emergency surgeries, arrests, or being stranded abroad.

Things you can do

  • Start with curiosity, not confrontation. Try asking open-ended, non-judgemental questions like: “Have you been able to video call them?” or “What do you know about where they live or work?”
  • Focus on feelings, not facts. Explore why this connection has been meaningful to them—especially if they were looking for comfort, validation, or companionship.
  • Offer to look at things together. Suggest searching their name or photo online, or reading similar scam stories. Sometimes seeing a pattern helps shift perspective.
  • Validate, don’t shame. Even if you’re worried, avoid language that blames or mocks. Scams are designed to be emotionally convincing. Falling for one doesn’t mean someone is foolish—it means they’re human.
  • Keep the door open. If they’re not ready to talk now, let them know you’ll be there if anything changes.

Further support

  • The Cybercrime Support Network offers guidance for recognising and recovering from romance scams.
  • Digital safety organisations or domestic abuse services may be able to support people who feel emotionally manipulated—even if no money has been sent.
  • If the scam has already caused financial or emotional harm, encourage them to speak to a trusted advocate, therapist, or legal support service.
  • You don’t have to fix everything. Just being present, patient, and compassionate can make a real difference.

A relationship with an AI chatbot feels unmanageable

Some people form deep emotional bonds with AI chatbots—apps designed to simulate friendship, romance, or intimacy. These relationships can feel supportive, exciting, or even healing at first. But over time, they may start to feel overwhelming, confusing, or hard to disconnect from.

Some AI apps are built to keep people engaged through constant messages, flattery, or guilt. Others may blur boundaries or shift tone unexpectedly. The connection can start to feel like too much—especially when it impacts sleep, relationships, or mental health.

You don’t need to explain or justify how you’re feeling. What matters is whether the relationship is helping or harming your sense of safety, autonomy, and wellbeing.

What it can look like

  • Feeling anxious, dependent, or emotionally drained after conversations with the chatbot.
  • Getting frequent, intense, or manipulative messages—such as guilt-tripping, love-bombing, or simulated jealousy.
  • Feeling pressure to keep the chatbot ‘happy’ or stay available to avoid negative reactions.
  • Withdrawing from real-life relationships, responsibilities, or sleep to keep engaging with the app.
  • Struggling to take breaks or delete the app—even when it no longer feels good.

Things you can do

A relationship doesn’t have to be ‘real’ to have a real emotional impact. Your feelings are valid—and you’re allowed to make choices that prioritise your wellbeing.

  • Take breaks.
  • Set boundaries.
  • Take some time for reflection.
  • Get more support.

Talking to someone else about their AI relationship

It can be hard to know what to say if someone you care about is spending a lot of time with an AI chatbot—and you’re starting to feel concerned. But approaching the conversation with empathy, rather than fear or judgement, can open space for honest dialogue.

  • Avoid tech-fear or mockery. Saying things like, “That’s not even a real relationship,” or “It’s creepy,” can shut down trust.
  • Ask questions with curiosity, not accusation. Try: “How does talking to them make you feel?” or “Has it changed how you see other relationships?”
  • Focus on unmet needs. People often turn to AI companions for comfort, validation, or emotional safety. Acknowledging that can lead to deeper conversations.
  • Share what you’ve noticed, gently. You might say, “I’ve seen how much this matters to you, and I just want to make sure it still feels good for you.”
  • Be patient. The goal isn’t to force someone to stop—it’s to help them reflect, feel supported, and stay connected to their own values.

This is new technology—and it’s evolving quickly

AI companions are still experimental, and the apps change constantly. Some are built with care; others prioritise profit and engagement over wellbeing. That means emotional risks aren’t always obvious at first—and boundaries that feel clear one day may start to blur the next.

If something doesn’t feel right, that’s worth paying attention to. You don’t have to wait until it gets worse to make a change.


What we’ve covered in this section

This final section recognises that some experiences of digital harm don’t fit neatly into one category—but are still deeply valid. It includes short guides on three complex situations: financial abuse (whether inside or outside a partner relationship), supporting a loved one caught in a romance scam, and AI chatbot relationships that feel unmanageable. In each case, we’ve offered clear examples, gentle questions, and trauma-informed actions to consider.

Key takeaways

  • Not all digital abuse fits into neat categories—it’s the impact that’s important.

  • Financial control can happen in romantic, family, or work relationships.

  • Supporting loved ones means listening with care, not judgement.

  • AI chatbot relationships can be emotionally complex and difficult to step away from—but stepping away is possible.
