Reddit just got hit with a £14.47m fine from the UK’s Information Commissioner’s Office, and honestly, it feels like it’s been a long time coming. The platform failed to properly verify the age of its users, which means kids under 13 were potentially wandering around spaces they shouldn’t have been accessing. For years. That’s a massive problem.
The ICO’s investigation found that between May 2018 and July 2025, Reddit was processing personal data from children without any real legal basis to do so. The platform’s terms of service say under-13s aren’t allowed, but the regulator’s estimates suggest there were loads of them on the site anyway. It’s the kind of oversight that becomes harder to excuse when you’re dealing with a company as large as Reddit.
When Asking “How Old Are You?” Isn’t Enough
Here’s where it gets frustrating. Reddit’s age verification method was basically just asking users to declare their age when signing up. You know, the same approach that’s worked so well for adult websites for the past two decades. Spoiler alert: it doesn’t work. Kids lie online. They always have, and they always will. The ICO even specifically noted this method is “easy to bypass,” which feels like the understatement of the year.
John Edwards, the UK Information Commissioner, put it bluntly: companies the size of Reddit need to actually know how old their users are. They need real age assurance measures. Not just a checkbox. Not just “I declare under penalty of perjury that I am 18 years old.” Real verification.
Reddit only started implementing proper age checks in July 2025 when the Online Safety Act requirements kicked in. That’s seven years of operating without proper protections. Seven years.
The Bigger Picture
What’s interesting here is that this fine represents a shift in how regulators see Reddit. The platform’s always positioned itself as this scrappy, privacy-respecting forum where anonymity matters. And look, there’s something to that. But as social media expert Matt Navarra told the BBC, we’re watching Reddit transition from being treated as a quirky forum site to being recognized as what it actually is: a major social platform with major responsibilities.
The UK’s got two regulators now working in tandem on this stuff. The ICO is pushing technology platforms to take children’s data and age-appropriate design seriously. Meanwhile, Ofcom’s enforcing the Online Safety Act itself, making sure age verification actually happens across the board. It’s a pincer movement, and it’s catching up with platforms that were operating in a regulatory gray area.
What Reddit Says (And What They’re Doing About It)
Reddit’s response has been predictably defensive. A company spokesperson said the ICO’s push for better age verification is “counterintuitive and at odds with our strong belief in our users’ online privacy and safety.” They argue they didn’t require identity information precisely because they wanted to protect privacy.
That’s a fair point, actually. There is a genuine tension between data privacy and age verification. Nobody wants platforms hoarding more personal information than necessary. But here’s the thing: you can’t hide behind privacy concerns when you’re simultaneously allowing kids to access content they shouldn’t be seeing.
The company is planning to appeal the decision, which means this story isn’t over. But it has also implemented age verification controls since July 2025, limiting unverified users’ access to adult content and certain subreddits.
The Regulatory Momentum
This fine is one of several recent actions showing that regulators are finally getting serious about protecting children online. Ofcom’s been handing out penalties to porn sites with inadequate age checks. The ICO’s investigating multiple platforms over data practices. There’s genuine momentum here.
And Reddit’s not even the biggest player in this space. TikTok and Imgur were investigated alongside it. These are the platforms where kids actually congregate. Where billions of hours are spent. The idea that they weren’t properly protecting young users until regulators forced them to is genuinely concerning.
Reddit will probably appeal this fine, probably argue about the methodology, probably claim they’re being unfairly targeted. But the underlying issue doesn’t change: the platform knew kids were using it, knew they should be protecting those kids better, and did the bare minimum for years.
The question now is whether this fine actually changes behavior across the industry, or whether it just becomes a cost of doing business for platforms betting that the fines are cheaper than implementing real safeguards.