Instagram Chief Says Social Media Isn't Addictive. Yeah, Right.

Adam Mosseri, Instagram’s head honcho, took the stand Wednesday in Los Angeles and delivered what might be the most careful dance around reality we’ve seen in a while. His main point? People aren’t addicted to social media. They just have “problematic use.”

Let that sink in for a moment. The guy running one of the world’s most engagement-optimized platforms is splitting hairs over whether endless scrolling constitutes clinical addiction or just something vaguely problematic. It’s like a casino executive arguing that gamblers aren’t addicted, they just have a problematic relationship with slot machines.

The testimony came during a landmark trial where Meta and YouTube are facing allegations that their platforms harm children. TikTok and Snap already settled, which tells you something about how those companies rated their chances in court.

The Semantics Game

Mosseri was quick to clarify he’s not a medical expert, though someone “very close” to him has dealt with serious addiction. That personal connection apparently gives him enough authority to decide what counts as real addiction and what’s just users spending more time on Instagram than they “feel good about.”

The plaintiffs’ lawyer, Mark Lanier, wasn’t having it. He pulled up old podcast clips where Mosseri used the term “addiction” himself. Mosseri’s response? He was probably just being “too casual” with his words, the way everyone does. You know, casually describing your product’s grip on users as addiction, then walking it back in court.

This whole semantic juggling act feels like watching someone try to redefine what “is” means. The difference between clinical addiction and problematic use might matter in a medical journal, but when kids are glued to their screens for hours and experiencing health issues as a result, does the label really change anything?

Filters, Body Image, and Corporate Responsibility

The courtroom got tense when discussion turned to Instagram’s cosmetic filters. These are the ones that smooth your skin, reshape your face, and basically turn you into a living Facetune experiment. Some filters were so aggressive they essentially promoted plastic surgery procedures.

Mosseri defended the approach by saying Meta tries to “be as safe as possible but also censor as little as possible.” That’s corporate speak for “we want to let people do whatever drives engagement, but we also don’t want to get sued.”

Parents in the courtroom visibly struggled during this part of the testimony. The judge actually had to remind spectators not to show reactions, which tells you how emotionally charged this case has become.

Meta finally shut down third-party augmented reality filters in January 2025, but that was only after years of these tools being available to teens already dealing with body image issues. The technology was out there, doing its damage, while the company collected data and ad revenue.

The Teen Safety Mirage

Instagram loves to tout all the safety features it’s added for young users. Parental controls, content warnings, screen time reminders. The works. But a report last year found that teen accounts were still being recommended graphic sexual content, self-harm material, and body image content that could trigger serious mental health issues.

Meta called that report “misleading, dangerously speculative” and accused researchers of misrepresenting their teen safety efforts. It’s a familiar pattern. Create tools that look good in press releases, then attack anyone who points out they don’t actually work as advertised.

The case centers on a 20-year-old identified as “KGM” and two other plaintiffs in bellwether trials. These are test cases that could shape how thousands of similar lawsuits play out. The stakes for Meta and the other tech giants in this space couldn’t be higher.

Meta is also dealing with a separate trial in New Mexico this week, because apparently one lawsuit at a time isn’t enough. The legal pressure is mounting, and no amount of careful wordsmithing about “problematic use” versus “addiction” is going to make the fundamental question go away: if your platform is designed to maximize engagement at all costs, and that design is harming kids, does it really matter what you call it?

Written by

Adam Makins