A leading UK domestic abuse charity has issued a stark warning: artificial intelligence and everyday digital technologies are increasingly being used as tools of control and harassment against women. From smartwatches and home assistants to spoofing apps, location trackers, and hidden surveillance devices, abusers are exploiting technology designed for convenience to extend coercive control far beyond physical proximity.

According to the charity, survivors are reporting a sharp rise in what experts describe as digital coercive control, a pattern of behavior in which technology is used to monitor movements, impersonate victims, intercept communications, and instill fear. In many cases, women are unaware they are being tracked or manipulated until the abuse escalates.

What makes this trend particularly alarming is the pace at which technology is advancing compared to the slow evolution of legal and protective frameworks. Existing safety tools, law enforcement protocols, and regulatory safeguards were largely designed for an earlier digital era. As a result, many women find themselves navigating abuse that authorities struggle to recognize, investigate, or prosecute effectively.

The charity has called for urgent action from technology companies, lawmakers, and criminal justice systems. Recommendations include safer-by-design tech standards, clearer legal definitions of digital abuse, specialized training for police, and survivor-centered reporting mechanisms that reflect the realities of modern coercive control.

From our perspective, this is one of the clearest examples of how innovation without accountability becomes a gendered risk. Technology is often framed as neutral, but its misuse is not. When tools are designed without considering how power and abuse operate, women disproportionately pay the price.

What is most troubling is not simply that abusers are adapting, but that institutions are lagging. Women are being asked to protect themselves within systems that were never designed with their safety as a primary assumption. Telling survivors to "turn off devices" or "go offline" is not protection; it is a displacement of responsibility.

The rise of AI-enabled abuse forces a hard question: Who is technology really built for, and who bears the cost when it fails? Until women's safety is treated as a core design and policy requirement (not an afterthought), digital spaces will continue to mirror, and magnify, offline inequalities.

This is not just a tech issue. It is a women's rights issue, a public safety issue, and a test of whether progress will be governed, or allowed to harm in silence.