UK to Require 48-Hour Takedown of Nonconsensual Intimate Images Under Proposed Online Safety Rules


The United Kingdom plans to require internet platforms to remove non-consensual intimate images within 48 hours of being notified.

New rule targets revenge porn and similar content

Reuters reported that the UK government wants social media platforms, search engines, and other online services to remove non-consensual intimate images, often called “revenge porn,” within 48 hours of being notified. The aim is to help victims regain control of their digital privacy and shorten the time harmful content remains online.

This rule would be included in the updated Online Safety Act. Regulators could fine or otherwise penalize companies that miss the 48-hour deadline.

Deepfakes and broader protections

The Independent reported that the new rules are also expected to cover AI-generated deepfake images used to create non-consensual content. This reflects growing concern about synthetic media being used to harm people. Platforms that do not remove flagged material within 48 hours could face action from UK regulators.

These updated rules are part of the government’s wider effort to hold tech companies more accountable for managing harmful content on their platforms.

Government aims to strengthen user rights

NDTV reported that officials have highlighted how quickly intimate images can spread once shared without consent. They say the 48-hour rule is meant to shorten the time this content stays online before platforms respond.

The proposed rule is part of the government’s effort to strengthen online user protections by requiring companies to act faster when notified of serious violations, especially those affecting personal privacy and dignity. Platforms may also need to show they have effective internal reporting and enforcement systems to meet the new deadline.

Enforcement and next steps

Under the new framework, technology companies that repeatedly miss the 48-hour takedown requirement could face fines or legal action from Ofcom, the UK's communications regulator. The government is working with industry stakeholders to decide how the rule will be enforced and what types of content it will cover.

This move shows increasing political and public concern about how quickly harmful content, such as non-consensual intimate images and deepfakes, can spread on social media and other digital platforms. It has led to calls for clearer responsibilities and faster action from online services.
