X Promises Faster Hate Speech Removal, But Critics Say Talk Is Cheap

Elon Musk’s X has struck a deal with the UK’s media regulator Ofcom to review suspected hate speech and terrorist content within 24 hours on average. The commitment arrives amid mounting pressure on social media platforms to police illegal material more aggressively, particularly after a spike in attacks targeting Jewish communities across Britain.

According to BBC reporting, X will now submit performance data to Ofcom every three months, demonstrating whether it’s hitting the targets. The platform has also pledged to assess at least 85% of reports within 48 hours, and to withhold UK access to accounts operated by proscribed terrorist organisations. On the surface, this looks like progress. In reality, it raises a harder question: what does a commitment actually mean if nobody’s confident it will be kept?

When Promises Meet Skepticism

Ofcom’s online safety director Oliver Griffiths framed the announcement as a “step forward,” though his language carried a subtle edge. The regulator has evidence that terrorist content and illegal hate speech persist on major platforms, and these new targets represent an attempt to force real action rather than performative gestures.

The timing matters. Recent religiously motivated crimes have put the spotlight back on online radicalisation and the role platforms play in hosting material that fuels real-world violence. The Heaton Park Synagogue attack in Manchester, incidents in Golders Green, and arson attempts on Jewish sites have sharpened the focus on what actually happens when moderation fails.

Yet even those welcoming the move express wariness. Danny Stone, chief executive of the Antisemitism Policy Trust, called it a “good start” while simultaneously flagging that X is “failing in so many regards to tackle open racism on its platform.” His statement carries the tone of someone who’s learned not to celebrate prematurely.

The Reporting Problem Nobody’s Solved

One of X’s new commitments addresses a frustration that has plagued civil society groups: unclear reporting systems. Some organisations have flagged multiple pieces of suspected illegal content to X only to find themselves unsure whether their reports were even received, let alone acted upon. It’s a basic failure in communication that shouldn’t exist on a platform with X’s resources.

The second commitment, around blocking accounts operated by terrorist organisations, sounds straightforward until you consider the enforcement burden. Determining whether an account is “operated by or on behalf of” a proscribed organisation requires investigation, context, and human judgment. It’s not something a content filter can automate.

The Test Isn’t Promises, It’s Delivery

Iman Atta, director of Tell Mama, framed the stakes clearly: “The test is not what is promised, but what is delivered.” That’s the statement that should hang over these commitments. Ofcom has essentially put monitoring systems in place to check whether X actually follows through. In three months, we’ll have data. In a year, we’ll have a track record.

But here’s what troubles me: we’re still in an era where a major technology platform needs a regulatory body breathing down its neck to agree to review illegal hate content within 24 hours. That this is considered a breakthrough says something uncomfortable about where the bar sits. X employs thousands of people and possesses vast computational resources. The fact that removing terrorist content within a day requires a formal agreement with a regulator suggests the company either lacked the motivation or the systems to do so voluntarily.

The UK has recorded more than 1,500 racist hate crimes in the past year alone, alongside over 2,300 race-related incidents. These aren’t abstract problems. They represent real harm to real people, often preceded by online incitement. Whether X actually accelerates its review processes will have consequences beyond compliance scores.

What remains to be seen is whether 24 hours is genuinely sufficient, or whether it becomes the floor that everyone treats as a ceiling.

Written by

Adam Makins

I’m a published content creator, brand copywriter, photographer, and social media manager. I help brands connect with their customers by developing engaging content that entertains, educates, and offers value to their audience.