Charity Warns Abusers Are Using AI and Digital Technologies to Target and Manipulate Women

Domestic abuse charities, including Refuge, are warning of a troubling rise in abusers exploiting artificial intelligence, smartwatches, and other digital devices to control and harass their victims. Refuge, which provides support and resources for survivors of intimate partner violence, says the trend is accelerating.
In the last quarter of 2025, Refuge reported a 62% increase in complex technology-facilitated abuse cases, bringing the total to 829 women seeking its help. Referrals for women under 30 rose by 24%, underscoring how widespread the problem has become among younger people.
Recent cases have revealed disturbing patterns in which perpetrators use wearable technology, such as smartwatches, Oura rings, and Fitbits, to stalk and monitor their victims. Abusers are also manipulating smart home devices that control lighting, heating, and security in order to disrupt and destabilize victims' lives. AI tools have likewise been misused to create digital impersonations, or "spoofed" identities, to further manipulate and intimidate.
Emma Pickering, head of the tech-facilitated abuse team at Refuge, articulated a pressing issue: “Every day, we witness the consequences of devices being released into the market without adequate consideration for how they can be weaponized against women and girls. Perpetrators have far too much accessibility to exploit these smart accessories, which leads to devastating outcomes for victims.”
“It is unacceptable for the safety and wellbeing of women and girls to be an afterthought in technology design. Their security should be the cornerstone that informs the development of wearable devices and the regulations that govern their use,” she emphasized.
Refuge also stressed that the facilitation of abuse via smart technologies indicates a pressing need to incorporate safety considerations into the design of these products from the outset.
One example involves a survivor named Mina, who left her smartwatch behind when she escaped her abuser. The device became a tool for tracking her: her abuser accessed linked cloud accounts to discover her location, even after she had moved to a safe place.
Mina described her experience, stating: “[It] was deeply shocking and frightening. I felt suddenly exposed and unsafe, knowing that my location was being tracked without my consent. It created a constant sense of paranoia; I couldn’t relax, sleep properly, or feel settled anywhere because I knew my movements weren’t private.”
Even after the authorities returned the smartwatch to Mina, her abuser was still able to locate her through a private investigator, who allegedly employed similar tracking technology. When Mina reported these violations to the police, her concerns were dismissed on the premise that she had “not come to any harm.”
She expressed frustration, saying, “I was repeatedly advised to relocate for my safety instead of addressing the underlying issues with the smartwatch or confiscating it from him. Each move made me feel increasingly unstable and displaced. Overall, the experience left me feeling unsafe, unheard, and responsible for managing a situation that was completely out of my control. It illuminated just how tech abuse can extend coercive control silently and powerfully, leaving survivors to shoulder an emotional and practical burden when the systems designed to protect them fail to understand the issue fully.”
Pickering further noted that abusers are increasingly employing AI to manipulate evidence against survivors. For instance, they may edit videos to create a false narrative of the victim as a drunken or erratic individual, thereby providing false justification to social services and other authorities to label them as unfit parents or a risk to themselves and others. “As these technologies evolve, we will likely witness more instances of such manipulative tactics,” she cautioned.
Pickering also highlighted the potential for abuse of medical technologies, noting that tools designed to monitor vital health metrics, such as diabetes trackers, could be weaponized to control critical aspects of a survivor's health, with potentially fatal consequences.
In light of these growing concerns, Pickering urged the government to prioritize addressing digital technology-enabled crimes, suggesting that more funding is needed to train specialized digital investigations teams capable of tackling these intricate and evolving challenges. “The government often seeks immediate solutions but fails to invest in long-term strategies to keep pace with these rapid technological advancements. Without proper investment, we will fall behind,” she cautioned.
She also called for accountability within the tech industry, asserting that tech companies must ensure their products are designed to protect vulnerable individuals and prevent facilitation of abuse. “Current regulations and frameworks—like those from Ofcom and the Online Safety Act—are insufficient,” she stated.
In response to these concerns, a government spokesperson stated: “Addressing violence against women and girls in all its forms, including when facilitated through technology, is a top priority for this administration. Our new Violence Against Women and Girls (VAWG) strategy articulates how we intend to deploy the full capabilities of the state, both online and offline. We are collaborating with Ofcom to enhance how online platforms address the disproportionate abuse women and girls face in digital spaces.”
