Richard Morris, Technology reporter

Tech platforms would have to remove non-consensually shared intimate images within 48 hours, under a proposed UK law.
The government said tackling intimate image abuse should be treated with the same severity as child sexual abuse material (CSAM) and terrorist content.
Failure to abide by the rules could result in companies being fined up to 10% of their global sales or having their services blocked in the UK.
Janaya Walker, interim director of the End Violence Against Women Coalition, said the "welcome and powerful move... rightly places the responsibility on tech companies to act."
The proposals are being made through an amendment to the Crime and Policing Bill, which is making its way through the House of Lords.
Under the plans, victims would only have to flag an image once, rather than contact different platforms separately.
Tech companies would have to block the images from being re-uploaded once they have been taken down.
The proposal would also give internet service providers guidance on blocking access to sites hosting illegal content, targeting rogue websites that currently fall outside the reach of the Online Safety Act.
Women, girls and LGBT people are disproportionately affected by intimate image abuse (IIA).
A government report in July 2025 found young men and boys were largely targeted for financial sexual extortion - sometimes referred to as "sextortion" - where a victim is asked to pay money to keep intimate images from being shared online.
A Parliamentary report published in May 2025 highlighted a 20.9% increase in reports of intimate image abuse in 2024.
"I saw firsthand the unimaginable, often lifelong pain and trauma violence against women and girls causes," Prime Minister Sir Keir Starmer said, referencing his time as director of public prosecutions.
"My government is taking urgent action against chatbots and 'nudification' tools," he added.
Technology Secretary Liz Kendall said: "The days of tech firms having a free pass are over... no woman should have to chase platform after platform, waiting days for an image to come down."
The announcement comes after the government's standoff with X in January, when the AI tool Grok was used to generate images of real women wearing very little clothing. This eventually led to the feature being removed for users.