Online Safety Act: How UK Law Restricts Content on Gaza and Ukraine

Several major social media platforms are now limiting content about Ukraine and Gaza. This is part of their effort to comply with the UK’s Online Safety Act. The new law is meant to protect minors but is raising concerns over censorship and freedom of speech.
Online Safety Act Rules: Impact on Gaza and Ukraine Posts
The Act requires platforms to block harmful content from reaching users under 18. This includes explicit material, self-harm posts, and violent footage. Companies that break the rules face fines of up to £18 million or 10% of global revenue, whichever is greater.
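The penalty ceiling described above is the larger of two figures: the flat £18 million amount or 10% of a company's global revenue. A minimal sketch of that calculation (illustrative only, not legal guidance; the function name and inputs are assumptions for this example):

```python
def max_fine_gbp(global_revenue_gbp: float) -> float:
    """Maximum penalty: £18 million or 10% of global revenue, whichever is greater."""
    return max(18_000_000, 0.10 * global_revenue_gbp)

# For a smaller firm, the flat £18m floor applies;
# for a large one, the 10% figure dominates.
print(max_fine_gbp(100_000_000))    # flat floor: £18m exceeds 10% of £100m
print(max_fine_gbp(1_000_000_000))  # 10% of £1bn exceeds the floor
```

This is why the law weighs far more heavily on large platforms: the effective cap scales with revenue once revenue passes £180 million.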
To comply, platforms like X (formerly Twitter) and Reddit have introduced age checks. But these checks are now hiding a wide range of content—including important posts about war and politics.
Public Interest Posts Are Being Blocked
Content affected includes drone footage from Ukraine and humanitarian clips from Gaza. Even videos of UK parliamentary debates have been restricted. Artistic works, such as Goya’s painting Saturn Devouring His Son, have also been flagged.

Critics say this restricts the public’s right to access important information. Some posts contain no graphic material at all.
How the Online Safety Act Filters Content for Logged-Out Users
Many adults browse platforms without logging in. In the UK, up to 59% of Reddit users and 37% of X users view content while logged out. These users are now treated like children under the law. As a result, they are blocked from viewing a lot of content—even when it serves the public interest.
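The mechanism behind this is a default-deny rule: without a login and an age check, a platform cannot distinguish an adult from a minor, so it treats every unverified viewer as under 18. A minimal sketch of that logic (all names here are illustrative assumptions, not any platform's real system):

```python
def can_view(post_is_sensitive: bool, logged_in: bool, age_verified: bool) -> bool:
    """Return True if the viewer may see the post.

    Unverified or logged-out viewers are treated as under 18,
    so sensitive posts are hidden from them by default.
    """
    if not post_is_sensitive:
        return True
    # Only logged-in, age-verified viewers pass the gate.
    return logged_in and age_verified

# A logged-out adult is indistinguishable from a minor:
print(can_view(post_is_sensitive=True, logged_in=False, age_verified=False))  # blocked
print(can_view(post_is_sensitive=True, logged_in=True, age_verified=True))    # allowed
```

The over-blocking complaint follows directly: anything a platform tags as "sensitive", including war reporting or parliamentary footage, disappears for the large logged-out audience.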
Experts Warn About Free Speech Risks
Digital rights groups say tech firms are playing it safe. Fearing fines, they may be over-censoring posts to avoid breaking the law. This could damage free expression, especially around news and politics.

The law allows platforms to self-regulate, but experts question how well they can do this. Many have cut back on content moderation teams. That means fewer people are reviewing content carefully.
Lack of Transparency and Oversight
Each platform handles the rules differently. Meta restricts teen profiles but hasn’t made clear how much public content is filtered. Critics are calling for more transparency and better oversight. They say decisions about public interest content shouldn’t be left entirely to tech companies.
Are Platforms Overblocking on Purpose?
Some say tech firms are blocking too much content to show the law’s flaws. Others believe this is part of a difficult adjustment period. The law is new and complex, and companies may still be figuring it out.

While the law may be reducing harmful content, it is also limiting access to important topics. War, politics, and public debates should not be hidden from adults.
Final Thoughts: Balancing Protection and Public Access
The Online Safety Act aims to protect users, especially children. But the current approach may be going too far. Blocking public interest content—even unintentionally—hurts transparency and democratic dialogue.
The UK must now find a way to protect its citizens without silencing critical voices.