---
layout: post
title: "A 20-Year-Old's Fight Against Meta and Google Reveals the Dark Side of Social Media for Kids"
description: "Kaley's lawsuit exposes how Instagram and YouTube's design keeps children hooked, raising questions about tech giants' responsibility."
date: 2026-02-26 20:00:25 +0530
author: adam
image: 'https://images.unsplash.com/photo-1674064205823-1668a0777091?q=80&w=988'
video_embed:
tags: [news, tech]
tags_color: '#1f78b4'
---
When Kaley was just six years old, she downloaded YouTube. By nine, she'd picked up Instagram. Nobody stopped her. Nobody asked why a child that young needed access to platforms designed to keep adults scrolling for hours. Today, at 20, she's telling a Los Angeles jury exactly what those years of unrestricted access cost her.
Her story isn't unique, but it's one that's finally being heard in court. And it matters more than you might think.
Kaley's testimony paints a picture most of us recognize but rarely admit out loud. She woke up, grabbed her phone, and didn't put it down until bedtime. Instagram was the first thing she saw each morning and the last thing at night. YouTube's autoplay feature? It kept pulling her deeper, video after video, an endless stream designed specifically to hold her attention. She was nine years old.
The platforms didn't just take her time. They rewired how she felt about herself.
## When Likes Become Self-Worth
Low engagement on her posts left Kaley feeling "insecure" and "ugly." That's not teenage drama or typical growing pains. That's a nine-year-old kid internalizing the metrics that social media uses to measure human value. By her early teenage years, she'd been diagnosed with body dysmorphia, a condition where people become obsessed with perceived flaws in their appearance. Before social media? She didn't have these feelings.
The anxiety and depression came around the same time. So did the self-harm. By ten years old, Kaley was cutting herself.
Meta's defense has been fairly predictable. Their lawyers point to family issues, to a difficult relationship with her mother, to anything that doesn't involve their product. But here's what's interesting: Kaley's mother says most of their arguments weren't about normal teenage stuff. They were about her iPhone use. About time spent online.
Meta wants to argue that <a href="https://infeeds.com/tags/?tag=technology">technology</a> isn't the problem. That family dysfunction is. Maybe both things are true. But when you look at the timeline, when you watch a healthy child become unhealthy right after gaining access to platforms with no age verification and algorithmic systems designed to maximize engagement, that argument gets much harder to make.
## The First Real Reckoning
This trial is historic in a quiet, important way. For the first time, a court is actually examining what responsibility social media companies carry for their youngest users. TikTok and Snapchat already settled their portions of the lawsuit before trial even started. They clearly decided it wasn't worth the fight.
The case is expected to run until mid-March, and the implications stretch far beyond Kaley's story. Thousands of similar lawsuits have been filed by families and state governments across America, on behalf of kids dealing with eating disorders, depression, anxiety, and self-harm, symptoms that appeared or worsened after they started using social media. This trial could open doors, or at least force conversations that have been too comfortable to avoid.
What's wild is how long it took for this to happen. We've known for years that these platforms use engagement metrics that exploit psychological vulnerabilities. We've known about the autoplay features, the notification systems, the endless scroll. We've known that teenagers are especially susceptible because their brains are still developing, that the dopamine hits from likes and comments land harder on a young mind.
Yet for years, the response has been "parents should monitor their kids" or "personal responsibility." Tell a nine-year-old to have personal responsibility against a billion-dollar algorithm designed by PhD-level engineers to be addictive. Go ahead.
## What Happens Next
The jury will decide whether Meta and Google are liable. Whether they knowingly designed products to be addictive to children. Whether they failed in any duty of care. The verdict won't suddenly change how social media works, but it will change the legal landscape. It will make it harder for companies to hide behind the "it's just a platform" excuse.
Kaley's lawyer, Mark Lanier, stared directly at Mark Zuckerberg during his seven-hour testimony, pressing him to explain why his platforms require no age verification for young children. Why autoplay exists. Why the algorithm works the way it does. These aren't hypothetical questions anymore.
The uncomfortable truth is that these companies didn't accidentally create products that hook kids. They built them that way on purpose, because engagement is profitable. The business model depends on keeping people on the platform as long as possible. When that person is a developing brain, the stakes get higher. The damage gets more real.
Kaley spent her entire childhood staring at screens instead of living. She missed family moments, missed school experiences, missed the chance to figure out who she was without an algorithm telling her what to think about herself. Whether the jury finds Meta and Google guilty or not, that part of her story is already written.
The question is whether we're finally going to do anything about the systems that are still doing this to other kids right now.