This section offers support for situations that don’t necessarily fall under scams, harassment, image-based abuse, location-tracking, or unauthorised account access—but still leave us feeling unsafe, overwhelmed, or unsure what to do next. We can’t cover everything in one guide, but here we’ve added information and support for three other cases: financial abuse, worrying that a loved one is caught in a romance scam, and when AI chatbot relationships feel too much.
Not every form of digital harm fits neatly into a category. Sometimes the harm is subtle, complex, or doesn’t match what people typically think of as abuse. That doesn’t make it any less real. Even if the harm doesn’t have a clear label, what matters most is how it’s affecting our life. If something feels off or unmanageable, support is available.
This section is for you if someone is using money or financial access to exert coercive control over you—like a partner controlling your bank accounts or spending, a family member taking control of your benefits, a flatmate refusing to split bills fairly, or a carer using their access to manage your funds without consent.
This section is also for you if you’re worried that someone you care about seems overly invested in a suspicious online relationship, or if you or someone you know is struggling with an AI chatbot relationship.
You may be feeling anxious, uncomfortable, powerless, or overwhelmed. The information in this section can help.
Digital financial abuse can happen in any relationship: with a romantic partner, family member, friend, carer, flatmate, or employer. In a romantic relationship, it’s when one partner uses money (or access to financial tools) as a way to control, isolate, or intimidate the other. It’s often part of a pattern of coercive control, and can be especially difficult to name when it’s wrapped up in emotional manipulation, ‘helpfulness’, or the pressures of shared finances.
Financial abuse can affect our ability to meet basic needs, leave a harmful relationship, or make independent decisions. And because it often happens behind closed doors or through digital accounts, it’s not always visible to others—even close friends or family.
When it happens outside a partner relationship, it’s most often in situations where someone depends on another person for housing, support, or access to online systems. These dynamics can be especially hard to name as abuse, because they’re often mixed with obligation, trust, or fear of conflict. But when someone uses their position or access to limit our financial independence, control our spending, or monitor our activity—it’s still a form of coercive control.
In a partner relationship:
Outside a partner relationship:
We don’t need to justify discomfort. If something feels wrong, it’s okay to look for support or explore safer options.
Romance scams often target people who are lonely, grieving, or simply looking for connection. Scammers build emotional intimacy through long conversations, flattery, and shared stories—before asking for money, gifts, or help with a crisis. These relationships can feel real, meaningful, and hard to let go of.
If someone you care about seems emotionally invested in an online relationship that feels suspicious, it can be painful to watch—and hard to know what to say. Confronting them too directly may make them defensive or push them away. But staying silent can feel just as difficult.
In our loved one:
In their romantic interest:
Some people form deep emotional bonds with AI chatbots—apps designed to simulate friendship, romance, or intimacy. These relationships can feel supportive, exciting, or even healing at first. But over time, they may start to feel overwhelming, confusing, or hard to disconnect from.
Some AI apps are built to keep people engaged through constant messages, flattery, or guilt. Others may blur boundaries or shift tone unexpectedly. The connection can start to feel like too much—especially when it impacts sleep, relationships, or mental health.
You don’t need to explain or justify how you’re feeling. What matters is whether the relationship is helping or harming your sense of safety, autonomy, and wellbeing.
A relationship doesn’t have to be ‘real’ to have a real emotional impact. Your feelings are valid—and you’re allowed to make choices that prioritise your wellbeing.
It can be hard to know what to say if someone you care about is spending a lot of time with an AI chatbot—and you’re starting to feel concerned. But approaching the conversation with empathy, rather than fear or judgement, can open space for honest dialogue.
AI companions are still experimental, and the apps change constantly. Some are built with care; others prioritise profit and engagement over wellbeing. That means emotional risks aren’t always obvious at first—and boundaries that feel clear one day may start to blur the next.
If something doesn’t feel right, that’s worth paying attention to. You don’t have to wait until it gets worse to make a change.
This final section recognises that some experiences of digital harm don’t fit neatly into one category—but are still deeply valid. It includes short guides on three complex situations: financial abuse (whether inside or outside a partner relationship), supporting a loved one caught in a romance scam, and AI chatbot relationships that feel unmanageable. In each case, we’ve offered clear examples, gentle questions, and trauma-informed actions to consider.
Not all digital abuse fits into neat categories—it’s the impact that’s important.
Financial control can happen in romantic, family, or work relationships.
Supporting loved ones means listening with care, not judgement.
AI chatbot relationships can be emotionally complex and difficult to step away from, but stepping away is possible.