Instagram's Boss Says 16 Hours in a Single Day Isn't 'Addiction' and Parents Are Furious

Adam Mosseri walked into a California courthouse this week and tried to convince a jury that a teenager spending 16 hours on Instagram in a single day wasn’t necessarily addicted. It was just “problematic use,” he said. You know, like when you binge a Netflix show until 3am.

The head of Instagram has been running the platform for eight years, and now he’s the first major tech executive to testify in what could be a landmark trial. Meta is facing accusations that Instagram caused serious mental health damage to minors, and Mosseri’s testimony is already raising eyebrows.

The case centers on a plaintiff identified as K.G.M, who reportedly made over 300 reports to Instagram about bullying on the platform. Her longest single day of use? Sixteen hours. When asked about this, Mosseri admitted it sounded like “problematic use” but stopped short of calling it addiction.

The Semantics Game Nobody Asked For

Here’s where things get interesting. Mosseri repeatedly drew a distinction between “clinical addiction” and what he called “problematic use.” He even compared Instagram use to binge-watching Netflix, as if the two are remotely the same thing.

The issue is that Mosseri isn’t an addiction expert, which he admitted multiple times under questioning. So why is he the one defining what counts as addiction and what doesn’t? Meta’s lawyers are arguing that K.G.M’s mental health struggles came from other aspects of her life, not Instagram. But when someone logs 16 hours on your platform in a single day and reports bullying hundreds of times, maybe that’s worth examining.

During testimony, lead attorney Mark Lanier brought up an internal Meta survey that found 60% of 269,000 Instagram users had experienced or witnessed bullying in the previous week. Sixty percent. That’s not a small problem, and it’s definitely not limited to people with pre-existing issues.

When Your Own Executives Sound the Alarm

One particularly damning piece of evidence came from a 2019 email exchange between Meta executives. Nick Clegg, who spent years as Meta’s head of global affairs after his time in British politics, raised concerns about Instagram’s image filters that let people alter their physical appearance.

Clegg warned that Meta would be “rightly accused of putting growth over responsibility” and that it would damage the company’s reputation. Mosseri said they eventually banned filters that went beyond makeup effects. But under cross-examination, he admitted the ban was “modified.” In other words, they walked it back.

This is the kind of thing that makes people lose faith in technology companies. Internal executives flag serious concerns about harm to users, especially young ones, and the company’s response is to implement half-measures that get quietly rolled back later.

Outside the courthouse, parents gathered with photos of their children who they say were harmed by social media. Mariano Janin flew in from London holding a picture of his daughter Mia, who died by suicide at 14 in 2021. “They have the technology; they have the funds,” Janin said. “They should protect kids.”

The Business Model Question

The trial is expected to last six weeks and could set a precedent for how tech companies are held accountable for their impact on young users. Snapchat and TikTok already reached settlements before the trial began. YouTube is still named in the suit alongside Instagram.

Meta and other social media platforms are facing thousands of similar cases across the United States from families, prosecutors, and school districts. This isn’t one angry parent with an outlier case. It’s a pattern that business leaders in Silicon Valley can no longer dismiss as isolated incidents.

Mark Zuckerberg and YouTube CEO Neal Mohan are both expected to testify in the coming weeks. It’ll be fascinating to see if they stick to the same talking points about “problematic use” versus “addiction” or if they take a different approach.

What Sixteen Hours Actually Looks Like

Let’s put that 16-hour usage number in perspective. That’s two-thirds of a day. If you sleep eight hours, that leaves exactly zero time for school, meals, family, or literally anything else. But according to Mosseri, whether that’s too much is “a personal thing.” One person could use Instagram more than you and feel fine about it, he said.

This logic falls apart pretty quickly when you’re talking about teenagers whose brains are still developing. The idea that a 14-year-old can self-regulate their social media use when the platform is specifically designed to be as engaging as possible is absurd.

Mosseri did agree with one point early in his testimony. When asked if Instagram should do everything in its power to keep users safe, especially young people, he said yes. But agreeing to a principle in court and actually implementing meaningful protections are two very different things.

The whole testimony feels like watching someone try to thread a needle while wearing oven mitts. Mosseri wants to acknowledge that something might be wrong without admitting liability, claim they’re doing everything they can without actually changing the fundamental business model, and sound sympathetic while defending a platform that by his own admission has problems.

When your defense boils down to “sure, she spent 16 hours on our app in a single day and reported bullying 300 times, but we don’t think we’re responsible,” maybe it’s time to ask whether the problem is bigger than any one user’s pre-existing conditions.

Written by

Adam Makins
