The Girl Who Took on Big Tech

Yesterday, in a courtroom in Los Angeles, twelve jurors leaned forward in their seats while the judge read the verdict slowly, line by line. In the front row of the public benches sat a young woman in a tan sweater and a long pink dress. Her name was Kaley. She listened without speaking, hands folded in her lap. The jury had reached its decision. They found that the design of platforms owned by Meta Platforms and the video service YouTube had harmed her. Meta would pay $4.2 million and YouTube would pay $1.8 million. The total, $6 million, barely registers against the quarterly revenues of companies worth hundreds of billions. Yet the number itself was never the real story. What mattered was the reasoning behind it. A group of ordinary citizens had concluded that the architecture of modern social media could cause personal injury. For lawyers pursuing thousands of similar cases across the United States, the verdict felt like the first visible crack in a wall that had stood for nearly three decades.
Kaley’s story began long before the courtroom. In Chico, a quiet city in Northern California, she first discovered YouTube when she was six years old. At that age the platform felt less like a website and more like a glowing portal. Videos flowed endlessly, guided by recommendations that seemed to anticipate exactly what she wanted next. By the time she reached nine, she opened an account on Instagram, the photo sharing platform owned by Meta. The routines formed quickly. A few minutes after school stretched into long evenings. Notifications flashed across the screen with quiet urgency. A scroll revealed new faces and carefully edited lives. Each reaction to her posts carried a small surge of approval that felt difficult to ignore.
Kaley later testified that she often spent nearly the entire day online. On one occasion, her lawyers said, she used Instagram for sixteen hours. “I wanted to be on it all the time,” she told the jury. “If I wasn’t on it, I felt like I was going to miss out on something.” At first the platforms felt like a refuge from bullying at school. She posted photographs and adjusted them with beauty filters before sharing them publicly. Over time the filters began to alter how she saw her own reflection. The version of herself on the screen appeared smoother, slimmer, and brighter than the one in the mirror. Her lawyers argued that the distance between those two images slowly widened. She developed body dysmorphia. Anxiety settled in. Depression followed. By her early teens she was seeing therapists who diagnosed social phobia and related conditions that shaped her adolescence.
The lawsuit that emerged years later did not claim that a particular video or post caused the damage. That approach would likely have failed because of the legal protections that shield online platforms. Instead the case targeted design. Kaley’s legal team argued that features such as infinite scrolling feeds and autoplay videos were engineered to keep users engaged for as long as possible. Their strategy drew inspiration from the legal campaign against tobacco companies that culminated in the Tobacco Master Settlement Agreement of the late 1990s. In that earlier fight, lawyers argued that cigarette companies had concealed the addictive nature of their product. The comparison suggested that social media might belong in a similar category of products that reward companies when people struggle to disengage.
The case also sidestepped a long-standing legal shield, Section 230 of the Communications Decency Act, which protects internet companies from liability for material posted by their users. Kaley’s lawyers insisted the trial was not about speech. It was about engineering decisions made deep inside Silicon Valley offices. Inside the courtroom the trial unfolded across several weeks. Executives from the tech industry took the witness stand, including Mark Zuckerberg, the chief executive of Meta, and Adam Mosseri, who runs Instagram. Both rejected the idea that their platforms were intentionally designed to create addiction in any clinical sense. Zuckerberg told jurors that Meta’s goal had always been to provide useful tools for communication and creative expression. He explained that the company had moved away from measuring success through the amount of time users spent inside its apps.
Lawyers for YouTube presented a different defense. They argued that the service functioned primarily as a streaming platform rather than a traditional social network. Their team also suggested that Kaley’s mental health struggles could have emerged from pressures beyond the apps themselves, including challenges within her family life and persistent bullying at school. Yet the plaintiff’s lawyers presented internal documents and research emails showing that employees within these companies had debated the psychological effects of their products for years. In one message cited during the trial, a researcher described Instagram in blunt language: “We’re basically pushers.” The remark lingered in the courtroom long after it was read aloud.
During closing arguments, Kaley’s lead attorney Mark Lanier brought a prop to the courtroom. He placed a glass jar on the table and filled it with colorful candy. Four hundred and fifteen M&Ms. Each piece represented one billion dollars of shareholder equity belonging to Google’s parent company, Alphabet. Lanier scooped out several candies with his hand. The jar barely looked different. “You can take out a handful and not make a difference,” he told the jury. The demonstration was theatrical yet precise in its message. Financial penalties alone would not significantly affect companies of that scale. What mattered was the precedent created by the verdict.
After more than a week of deliberation, the jurors returned with a near-unanimous decision. They concluded that the platforms had been negligent in how they were designed and had failed to provide adequate warnings about possible risks. Meta was assigned seventy percent of the responsibility and YouTube the remaining thirty percent. Outside the courthouse, some jurors explained that they intentionally kept the damages modest. Their focus was not on producing a massive payout for one individual. The decision was meant to signal concern about the environment young users enter when they open these apps. “We wanted to focus on what teens and children would be subjected to in the future,” one juror said.
Both companies quickly announced plans to appeal. A spokesperson for Meta said the verdict overlooked the complexity of adolescent mental health and unfairly linked it to a single product. Google echoed that position, arguing that the trial misunderstood the nature of YouTube. The legal battle is far from over. Yet the verdict arrived at a moment when governments and regulators around the world have begun asking sharper questions about social media. In 2024 the U.S. surgeon general called for warning labels on platforms used by adolescents. Several countries are exploring age limits or restrictions for younger users.
Thousands of lawsuits are already waiting in line behind Kaley’s case. Many have been filed by families, school districts, and state attorneys general who argue that social media design harms children. For years Silicon Valley operated under the assumption that its products existed in a legal gray zone. Platforms hosted speech but rarely faced responsibility for the consequences of their design choices. In one Los Angeles courtroom, that assumption met a group of jurors who looked carefully at the evidence and reached a different conclusion. The young woman in the front row listened as the verdict was read and remained almost perfectly still while the judge finished the final lines. For a moment the courtroom was silent except for the soft rustle of papers. Then the legal machinery of appeals and future trials began to move again, and the long debate over social media’s influence on young lives entered a new chapter.