A Los Angeles jury delivered a historic verdict on March 25, 2026, finding Meta and YouTube negligent in a landmark social media addiction trial that legal experts say could reshape the entire technology industry. The jury awarded $6 million in damages to the plaintiff — a now-20-year-old California woman identified as Kaley — who alleged that addictive design features built into Instagram and YouTube caused her severe mental health harm beginning when she was a child. The decision marks the first time social media companies have been held liable in a personal injury lawsuit centered not on harmful content, but on the deliberate architecture of the platforms themselves.
The verdict came after nine days of deliberations spanning more than 43 hours in Los Angeles Superior Court. Jurors returned a “Yes” on every question related to negligence and failure to warn — ten jurors ruled in favor of the plaintiff, with two favoring the defense on every count. Meta was assigned 70 percent of the responsibility for the damages, with YouTube bearing the remaining 30 percent. The jury first awarded $3 million in compensatory damages, then shortly afterward added $2.1 million in punitive damages against Meta and an additional $900,000 against Google’s YouTube.
The case, filed under California’s Judicial Council Coordination Proceedings as a bellwether trial, was selected specifically to set legal precedent and help determine outcomes in thousands of similar consolidated lawsuits still working through state and federal courts. Snap and TikTok, originally named as co-defendants, reached undisclosed settlements before the trial began in late January. The federal trial covering consolidated claims by school districts and parents nationwide is scheduled to begin this summer in the Northern District of California.
How the Plaintiff Built Her Case Against Meta and YouTube
Kaley, identified in court filings by her initials KGM, testified that she began using YouTube at age six and Instagram at age nine — years before either platform’s stated minimum age of 13. Her legal team, led by attorney Mark Lanier of Lanier Law Firm, argued that both companies deliberately engineered their platforms with addictive feedback loops including infinite scrolling, algorithmic recommendation engines, and dopamine-triggering notification systems designed to maximize time-on-platform for the youngest and most vulnerable users.
Internal company documents presented during the seven-week trial proved decisive. One Meta memo included the statement: “If we wanna win big with teens, we must bring them in as tweens.” Another internal analysis revealed that 11-year-olds were four times more likely to return to Instagram compared with competing apps, despite the platform’s official age minimum of 13. A separate set of documents showed that Meta proceeded with beauty and cosmetic surgery filters on Instagram after 18 internal experts and employees raised concerns about the psychological harm they could cause to young users.
Kaley described in testimony how her social media use produced an emotional rush from likes and notifications, eventually causing compulsive behavior that she said still disrupts her adult life. She described sneaking away from work to scroll and spending extended periods manipulating her appearance using filters. Her legal team argued that these behaviors were not pre-existing conditions but direct consequences of deliberate design choices made by both companies — choices the companies knew were dangerous and chose not to warn users about.
Zuckerberg, Mosseri, and Goodrow Take the Stand
The trial drew extraordinary attention when Meta CEO Mark Zuckerberg testified on February 18, 2026 — a rare courtroom appearance for one of the world’s most powerful technology executives. Zuckerberg faced direct questioning about Instagram’s age restrictions and acknowledged that enforcing the 13-and-older rule is difficult because there are “a meaningful number of people who lie about their age to use our services.” He also testified that he once contacted Apple CEO Tim Cook directly to discuss child safety concerns, and described Meta’s internal decision-making process around the beauty filters that became a central exhibit in the trial.
Instagram head Adam Mosseri also testified, pushing back against the concept of clinical addiction while acknowledging that “problematic use” of Instagram — defined as spending too much time on the platform — is “real.” Mosseri previously stated publicly in the weeks before trial that characterizing social media use as addiction was “problematic.” YouTube Vice President of Engineering Cristos Goodrow testified that his own children use YouTube for hours each day and that he considers it “good” for them, and argued that YouTube was “not designed to maximize time” on the platform.
Plaintiff lawyers countered Goodrow’s claims by noting that Kaley often used YouTube while not logged into her account, meaning the platform’s usage records dramatically understated her actual exposure. Meta’s defense team argued that Kaley’s mental health challenges predated her social media use and that the platforms served as a coping mechanism rather than a cause. The jury rejected both defenses entirely.
Section 230 and the Legal Strategy That Beat Big Tech
For decades, Section 230 of the Communications Decency Act of 1996 has served as the primary legal shield protecting internet platforms from liability over user-generated content. The law states that no provider or user of an interactive computer service shall be treated as the publisher or speaker of information provided by another content provider. Big Tech has relied on this protection to defeat hundreds of lawsuits. The strategy used in the KGM case deliberately sidestepped Section 230 entirely by targeting product design rather than published content.
Rather than arguing that Instagram or YouTube showed Kaley harmful material, the plaintiff’s legal team argued that the companies built defective and dangerous products — products engineered to override natural psychological limits and create dependency in minors. This framing brought the case under product liability law, not platform publisher law. Legal analysts who followed the trial noted that product liability standards do not fall within the scope of Section 230 immunity, a distinction the jury effectively validated with its verdict. The approach mirrors the legal strategy that ultimately dismantled the tobacco industry’s protections in the 1990s, a comparison raised repeatedly during closing arguments.
The verdict does not immediately change how Section 230 operates, but it establishes that platform design decisions carry legal liability independent of what content users post or consume. That distinction is the foundation on which thousands of pending lawsuits are now expected to be argued.
The New Mexico Verdict — Meta’s Second Loss in Two Days
The timing of the Los Angeles verdict was not isolated. Just one day earlier, on March 24, a separate jury in New Mexico ordered Meta to pay $375 million after finding the company had misled consumers about platform safety and knowingly enabled child sexual exploitation on Instagram and Facebook. That case was brought by New Mexico Attorney General Raúl Torrez, who described the Los Angeles outcome as “a step toward justice” that puts Big Tech executives on notice.
The New Mexico trial will enter a second phase in May, during which a judge will determine whether Meta created a public nuisance and what additional penalties the company must pay. The state attorney general has signaled he will ask the court to compel Meta to structurally modify its apps. Dozens of other state attorneys general have filed or are preparing similar actions against Meta and other major platforms. The dual verdicts in less than 48 hours have accelerated what legal observers are already calling the most significant corporate accountability moment in the technology industry since the federal antitrust case against Microsoft in the late 1990s.
The back-to-back jury decisions also drew immediate comparisons to the Master Settlement Agreement that resolved sweeping state lawsuits against Big Tobacco in 1998. That settlement forced tobacco companies to change their marketing practices, fund anti-smoking campaigns, and pay out hundreds of billions of dollars to states. Legal strategists working on the social media litigation now point to that precedent as the endgame for the current wave of lawsuits.
What the Verdict Means for the 2,000 Pending Lawsuits
The KGM bellwether verdict carries enormous practical weight beyond the $6 million awarded to a single plaintiff. Because the case was designated as a test case for coordinated litigation, its findings on negligence, product design liability, and the limits of Section 230 will directly influence how judges and juries approach roughly 2,000 pending cases filed by individuals, families, and school districts across the United States. Each of those cases names Meta, YouTube, TikTok, Snap, or some combination of the four as defendants.
A federal trial covering consolidated claims by school districts nationwide begins this summer in the Northern District of California. School districts have argued that social media platforms have created a mental health crisis inside their buildings, forcing them to spend hundreds of millions of dollars on counseling, mental health services, and disciplinary infrastructure to manage the consequences of student addiction. If the federal trial produces a similar outcome, the financial exposure for Meta and Google could reach into the tens of billions of dollars.
Meta and YouTube have both stated they will appeal the Los Angeles verdict. Meta described teen mental health as “profoundly complex” and said it “cannot be linked to a single app.” Google insisted the verdict misrepresents YouTube, which it characterized as a responsibly built streaming platform rather than a social media site. Legal analysts consider both appeals predictable but difficult to win given the jury’s across-the-board findings on negligence and failure to warn.
The role of algorithmic systems in platform design will likely become central to the appeals process, as both companies will argue that their recommendation engines are protected expression rather than defective product features.
Industry Reactions and the Path to Legislative Action
Advocacy groups and parents who have spent years pushing for regulatory reform welcomed the verdict as the breakthrough they had been waiting for. Laura Marquez-Garrett, a plaintiffs’ attorney for the Social Media Victims Law Center, stood outside Los Angeles Superior Court with family members of affected children and called the decision a turning point. Parents holding photos of children they say were harmed by social media lined the courthouse steps, many describing years of trying to get tech companies to acknowledge the harms their products caused.
The verdict also landed with immediate political resonance. Congressional efforts to pass federal child online safety legislation have stalled repeatedly, including the Kids Online Safety Act, which passed the Senate in 2024 but failed to advance in the House. Safety advocates now argue that the courts are doing what Congress has been unable to accomplish — establishing accountability through civil liability. If the appeals courts uphold the verdict and similar outcomes accumulate across the 2,000 pending cases, the financial pressure on Big Tech may eventually force legislative compromise that has so far proven elusive.
Meta President Dina Powell McCormick addressed the verdict at Axios’ AI+DC Summit on the same day it was delivered, confirming the company would appeal and emphasizing that Meta had “continued to evolve” its child safety practices. Neither Meta nor YouTube announced any immediate product changes in response to the jury’s findings. Both companies indicated their legal teams were evaluating the full scope of the verdict before deciding next steps.
FAQ: Meta and YouTube Social Media Addiction Trial
What did the jury decide in the Meta and YouTube social media trial?
A Los Angeles jury found Meta and YouTube negligent on all counts, ruling that both companies deliberately designed their platforms to be addictive, knew the design was dangerous, and failed to adequately warn users about those risks. The jury awarded $6 million in total damages — $3 million compensatory and $3 million in punitive damages — to plaintiff Kaley, a 20-year-old California woman who began using Instagram and YouTube as a young child.
How does this verdict affect Section 230 protections for social media platforms?
The verdict does not eliminate Section 230 but establishes a critical precedent: platform design decisions are not shielded by Section 230 immunity. By framing the case as a product liability claim rather than a content moderation dispute, the plaintiff’s legal team bypassed the law’s publisher protections entirely. Courts and legal teams in the 2,000 related pending lawsuits are now expected to adopt the same product liability framing.
Will Meta and YouTube appeal the verdict?
Both companies confirmed they will appeal. Meta called the verdict inconsistent with the complexity of teen mental health and said it was evaluating all legal options. Google stated the ruling misrepresents YouTube, which it describes as a streaming platform rather than a social media site. Legal analysts consider the appeals predictable but challenging to win given the jury’s findings in the plaintiff’s favor on every negligence question posed.
What are the 2,000 pending social media lawsuits about?
Roughly 2,000 consolidated lawsuits filed by individuals, families, and school districts allege that Meta, YouTube, TikTok, and Snapchat designed their platforms to addict minors, causing measurable mental health harm. A federal trial involving school district claims nationwide is scheduled for this summer in California. The Los Angeles bellwether verdict will directly influence how those cases are argued and decided.
Did TikTok and Snapchat also face this trial?
TikTok and Snap Inc., Snapchat’s parent company, were originally named as defendants, but both settled with the plaintiff before trial began in late January 2026. Settlement terms were not disclosed. Both companies remain named defendants in other consolidated proceedings.
The Los Angeles verdict represents the first time a jury has held social media platforms directly accountable for addictive design choices that caused measurable harm to a minor. Whether it becomes a catalyst for sweeping industry reform — or gets absorbed as a manageable legal cost in the way tobacco companies once absorbed early jury awards — will depend entirely on what happens when the full weight of 2,000 pending cases lands in courtrooms across the country. What is already clear is that the legal framework for holding Big Tech accountable for its design decisions now exists, has been validated by a jury, and will not easily be dismantled. The dam, as legal observers put it, has cracked. How wide the breach grows is the question that will define the next decade of technology law.
For the hundreds of families still waiting for their own cases to be heard, Wednesday’s outcome in Los Angeles was not just a legal verdict. It was confirmation that the companies they believe harmed their children can, in fact, be held to account. The next phase of this battle moves to federal court this summer, and the legal, financial, and regulatory stakes have never been higher.