UK law gives tech firms 48 hours to remove intimate images
Summary
UK proposes law requiring tech firms to remove non-consensual intimate images within 48 hours, with fines up to 10% of global revenue for non-compliance.
UK proposes strict new law on intimate image abuse
The UK government has proposed a new law requiring tech platforms to remove intimate images shared without consent within 48 hours. The amendment to the Crime and Policing Bill would treat this abuse with the same severity as terrorist content and child sexual abuse material.
Failure to comply could result in fines of up to 10% of a company's global revenue or having its services blocked in the UK. Prime Minister Sir Keir Starmer announced the plans, stating the move is part of an "ongoing battle" with platform providers.
How the new rules would work
Under the proposal, a victim would only need to flag an image once to a central system, rather than contacting each platform individually. Tech companies would then be legally obligated to take it down and block it from being re-uploaded.
The law would also empower internet service providers to block access to rogue websites hosting this illegal content, closing a loophole in the existing Online Safety Act. Technology Secretary Liz Kendall said, "The days of tech firms having a free pass are over."
Starmer told BBC Breakfast the rule stops victims from playing a "sort of whack-a-mole chasing wherever this image is next going up." He emphasized that since companies already have this duty for terrorist material, "it can be done."
A growing problem requiring urgent action
The push for new legislation follows alarming data on the scale of intimate image abuse (IIA). A Parliamentary report from May 2025 highlighted a 20.9% increase in reports in 2024.
Women, girls, and LGBT people are disproportionately affected. However, a separate government report from July 2025 found young men and boys are increasingly targeted for financial sexual extortion, or "sextortion."
Janaya Walker, interim director of the End Violence Against Women Coalition, said the move "rightly places the responsibility on tech companies to act."
Enforcement and recent context
Enforcement will be handled by a combination of online oversight bodies, with violations becoming a criminal matter. Sir Keir Starmer stated he did not believe this would include prison sentences for tech bosses, but would involve fines and other measures.
The announcement comes on the heels of two major events:
- Legislation in early February 2026 that made non-consensual deepfake images illegal.
- A government standoff with platform X in January 2026, after its AI tool Grok was used to generate sexualized images of real women, leading to the feature's removal.
The proposed amendment is currently making its way through the House of Lords. It represents a significant escalation in holding technology companies accountable for harmful content on their platforms.