‘Domestic abuse laws are falling behind technology’


Once deemed a private matter beyond the reach of the law, domestic abuse has only slowly, and through hard-fought campaigning, been consolidated as a crime. Domestic abuse is not uncommon, nor is it without serious consequences – not only for survivors and victims, but for our society and our economy.

One in four women in England and Wales will experience domestic abuse in their lifetime. This troubling fact alone demonstrates the vital need for domestic abuse to be written into our laws, its many forms and ramifications fully expressed and understood.

Signed into law on 29 April 2021, the Domestic Abuse Act was a landmark piece of legislation with the aim of transforming how the government, the justice system and our society understand domestic abuse. The Labour government is now ensuring that tackling violence against women and girls (VAWG) remains a priority, with an ambitious target to halve VAWG by 2034.

But, as we approach the fifth anniversary of the Domestic Abuse Act, a rapidly evolving – and highly sinister – form of abuse is outpacing our laws.


From online misogyny to the rise in AI being weaponised to perpetrate intimate image abuse, tech-facilitated abuse is one of the greatest threats facing women and girls today.

Refuge, the UK’s largest specialist domestic abuse charity, played a central role in ensuring measures to combat tech-facilitated abuse, particularly intimate image abuse, were incorporated into the Act. This included a successful campaign to make threats to share intimate images a crime.

But, despite its prevalence, tech-facilitated abuse was not included in the legal definition of domestic abuse under the Act. Five years on, the ‘threat to share’ offence has still not been implemented effectively. Refuge told me that since the Domestic Abuse Act, they haven’t supported a single survivor whose perpetrator has been convicted of threatening to share an intimate image, despite survivors reporting to the police.

Worryingly, police forces have not received adequate training or resources to ensure survivors receive the full protection of the law. In fact, survivors supported by Refuge often say their reports to the police are routinely minimised or dismissed. Officers frequently demonstrate limited understanding of the harm caused by intimate image abuse, and in some cases, display victim-blaming attitudes. 

These cases are not rarities. This year alone, I’ve seen countless reports of women and girls, including my fellow MPs, who have been subjected to deepfake intimate image abuse. 

Referrals to the charity’s specialist Technology-Facilitated Abuse and Economic Empowerment team rose by more than 62% in 2025 compared to 2024, with the final three months of the year the highest on record for a single quarter. 

Technological developments are not slowing, nor is the pace at which perpetrators will find ways to abuse. It is now up to our government to take meaningful, decisive action.


Without urgent improvements in police practice, intimate image abuse legislation will continue to exist largely on paper rather than provide real protection for survivors. The government’s recent VAWG Strategy and police reform white paper both indicate plans for improved training for officers, but it is crucial this training is mandatory, equipping all officers with the knowledge and practical skills needed to identify, investigate and gather evidence of tech-facilitated abuse, including intimate image abuse.

The government has recently taken significant action to tackle tech-facilitated abuse, introducing new offences covering deepfake intimate image abuse and cyberflashing. Crucially, it has also strengthened requirements for tech companies to detect and prevent non-consensual intimate images, and to remove them swiftly when notified. 

To meet our target of halving VAWG by 2034, continued and increased pressure must be placed on these tech companies, which have consistently failed to police themselves. 

Ofcom published its VAWG guidance last year, with welcome recommendations for tech companies to protect the safety of women and girls online. But the guidance is voluntary and non-enforceable, allowing tech platforms to continue prioritising profit over protection.

Alongside improving police responses through mandatory training, Ofcom’s VAWG guidance must be upgraded to a legally enforceable Code, giving Ofcom powers to take enforcement action against non-compliant tech companies.

Five years since the Domestic Abuse Act was introduced, the impact of tech-facilitated abuse has become increasingly clear. Without urgent legislative upgrades and improved practice in the justice system, women and girls will continue to pay the price.
