Zuckerberg in Court: the day social media addiction became a legal fact
KEY TAKEAWAYS
The Los Angeles social media addiction trial marks a historic legal challenge against Meta and Google, focusing on alleged intentional design of addictive features targeting children.
Internal company documents reveal that Meta and Google were aware of the addictive nature of their platforms and the vulnerability of young users, contradicting their public statements.
The trial uses a product liability framework rather than content liability, bypassing Section 230 protections and potentially reshaping legal accountability for tech companies.
Design features like infinite scroll, autoplay, and variable reward notifications are central to the addiction claims, argued to be deliberate psychological manipulations.
The case draws a direct parallel to tobacco litigation, suggesting social media companies knowingly contributed to a youth mental health crisis for profit.
GLOSSARY
Bellwether trials
Test cases selected to evaluate legal arguments and outcomes before a larger group of similar lawsuits proceeds.
Product liability law
Legal framework holding companies responsible for harm caused by the design or manufacture of their products, used here to challenge social media platforms.
Infinite scroll
A design feature that removes natural stopping points by continuously loading content as the user scrolls, implicated in addictive use.
Variable reward scheduling
A psychological mechanism where unpredictable rewards, like notifications, encourage compulsive engagement similar to slot machines.
Section 230
A 1996 US law protecting tech companies from liability for user-generated content, not directly challenged in this trial due to the product design focus.
Project Myst
An internal Meta study revealing that parental controls have limited impact on protecting vulnerable children from addictive social media use.
FAQ
What is the central allegation in the social media addiction trial?
The trial alleges that Meta and Google intentionally designed addictive features in their platforms to target children, causing mental health harm. This claim is supported by internal documents showing company awareness of these effects.
How do the companies defend against the addiction claims?
Meta and Google argue that the science on social media addiction is contested and that mental health outcomes result from multiple factors beyond social media. They also highlight actual usage data to question the extent of addiction.
Why is the product liability framework significant in this case?
By framing the case under product liability rather than content liability, the plaintiffs avoid Section 230 protections that usually shield tech companies from lawsuits about user content. This legal strategy allows the case to proceed to trial.
What design features are implicated in the addiction claims?
Features like infinite scroll, autoplay, and variable reward notifications are argued to be deliberate psychological manipulations that encourage compulsive use and remove natural stopping points for users.
How does this trial compare to past litigation against the tobacco industry?
The trial draws a parallel to tobacco lawsuits where companies publicly denied harm while internally acknowledging addiction risks. Similarly, social media companies are accused of knowingly designing addictive products targeting vulnerable youth for profit.
EDITORIAL NOTE
This piece is part of The Present Minds — essays on psychology, identity, and modern life.
A digital sanctuary for the overstimulated. Clarity. Depth. Silence.
The social media addiction trial went live in Los Angeles on February 9, 2026. Not as a talking point. Not as a parental concern debated on morning television. As a legal allegation, argued before a jury, backed by internal company documents, and capable of producing damages that could reshape the entire technology industry.
Mark Zuckerberg, the CEO of Meta and one of the most powerful people on earth, took the witness stand on February 18th. Across from him sat twelve ordinary people who had been told not to change their social media settings for the duration of a trial expected to last six to eight weeks. The irony is its own kind of statement.
At the centre of the Los Angeles trial is a young woman identified only by her initials, KGM, now twenty years old. She claims she was exposed to addictive design features on Instagram and YouTube from the age of six, resulting in anxiety, depression, and long-term mental health harm. She is one of three plaintiffs selected for bellwether trials, which are essentially test cases that both sides use to evaluate how their arguments hold up before a jury before thousands of similar cases are resolved.
TikTok and Snap were originally named as defendants in this proceeding. Both settled for undisclosed sums before trial began. Meta and Google remained.
Mark Lanier, the lead attorney for the plaintiff, delivered his opening statement in terms designed to be impossible to misunderstand. “This case is about two of the richest corporations in history who have engineered addiction in children’s brains,” he told the jury. “I’m going to show you evidence that these companies built machines designed to addict the brains of children, and they did it on purpose.”
His framing was as deliberate as it was blunt: “as easy as ABC.” Addicting the Brains of Children.
Meta’s defence argued that the science on social media addiction is genuinely contested, that some researchers believe addiction is not even the right framework to apply to platform use, and that KGM’s mental health outcomes were shaped by multiple factors beyond social media. Google’s attorney pointed to KGM’s actual usage data, noting her five-year average on YouTube was twenty-nine minutes per day, and that her average daily time on YouTube Shorts was one minute and fourteen seconds.
“Infinite scroll is not infinite,” the Google attorney told the jury. “In some cases, in this case, before this court, before you, the jury, it’s as little as a minute and 14 seconds.”
That argument will be tested against the internal documents the plaintiff’s lawyers spent years obtaining.
What the documents say
The documents are the heart of this case, and they are what make this trial genuinely different from previous disputes between technology companies and regulators.
Internal emails and studies from Meta were entered into evidence. One communication between Meta employees described Instagram as being “like a drug” and characterised the company’s role as being “basically pushers.” Internal Google documents, according to the plaintiff’s lawyers, likened certain company products to a casino.
The plaintiff’s legal team presented findings from a Meta internal study called “Project Myst,” which surveyed one thousand teenagers and their parents about their social media use. The two major findings presented to the jury were that Meta knew children who had experienced adverse events such as trauma and stress were particularly vulnerable to addictive use patterns, and that parental supervision and controls made little impact on that vulnerability.
Read that second finding again. Meta’s own internal research found that parental controls did not meaningfully protect the children most at risk. Meta’s public position for years has been that parents are the appropriate line of defence when it comes to children’s social media use. Their internal research, at least according to the documents presented in court, reached a different conclusion.
The lawsuit states that plaintiffs are not merely collateral damage of the defendants’ products, but the direct victims of intentional product design choices that pushed them into self-destructive feedback loops.
Meta’s public response maintained that the company strongly disagrees with the allegations and is confident the evidence will show its longstanding commitment to supporting young people. A Meta blog post argued that narrowing teen mental health challenges to social media as a single factor ignores scientific research and the many stressors impacting young people, including academic pressure, school safety, socioeconomic challenges and substance abuse.
Both things can be simultaneously true: social media may be one of several contributing factors to a mental health crisis, and the platforms may have knowingly designed features that made their contribution to that crisis larger than it needed to be. The question the jury is being asked to answer is not whether social media is the only cause. It is whether Meta and Google were a substantial contributing cause.
The big tobacco comparison
Every account of this trial reaches, eventually, for the same historical parallel. And it is the right one.
Lawyers for the plaintiffs are borrowing strategies used against the tobacco industry in the 1990s and 2000s, when it faced a similar onslaught of lawsuits. The legal mechanism is almost identical. Big Tobacco spent decades publicly claiming that cigarettes were safe while internally acknowledging the addictive properties of nicotine and the health damage caused by smoking. When the internal documents were finally forced into public view, the gap between the public position and the private knowledge became the legal foundation for a settlement that cost the industry hundreds of billions of dollars.
The 1998 Master Settlement Agreement between the major tobacco companies and forty-six US states remains the largest civil litigation settlement in American history. It required the companies to pay over $200 billion in health care costs over twenty-five years, restricted tobacco advertising targeting minors, and created a permanent public health foundation funded by the industry.
The parallel the plaintiffs’ lawyers are drawing is direct and specific. Social media companies, they argue, have done exactly what tobacco companies did: publicly claimed safety while internally documenting harm, and specifically directed their addictive products toward the most vulnerable users, in this case children, because those users were the most profitable to capture early.
The lawsuit’s language makes the comparison explicit, accusing the defendants of borrowing heavily from the behavioural and neurobiological techniques used by slot machines and exploited by the cigarette industry, and deliberately embedding in their products an array of design features aimed at maximising youth engagement to drive advertising revenue.
Whether this argument succeeds legally is one question. Whether it is factually accurate is another. The documents now entering the public record will allow the second question to be assessed with evidence rather than assumption.
The scale of what is actually at stake
This trial is one case. But it is operating within a legal landscape that is far larger than any individual plaintiff.
The Social Media Addiction multidistrict litigation has entered a critical phase, with the first two bellwether federal trials set for June 15 and August 6, 2026. More than 1,600 plaintiffs are involved in the broader litigation, including over 350 families and over 250 school districts. More than forty state attorneys general have filed lawsuits against Meta specifically. The New Mexico attorney general took Meta to a separate trial simultaneously with the Los Angeles proceedings, alleging the company violated the state’s Unfair Practices Act by misleading the public about platform safety for children.
In past mass torts such as Roundup, opioids, and 3M earplugs, the approach of bellwether trials and early plaintiff wins significantly accelerated global settlement negotiations. If the Los Angeles jury finds against Meta and Google, the settlement pressure across the entire litigation increases dramatically. Each subsequent trial becomes harder for the companies to fight. The internal documents already entered into evidence cannot be unread by the public, the press, or future juries.
Section 230 of the Communications Decency Act, the 1996 law that has protected technology companies from legal responsibility for content posted by users, is not directly at issue in this trial. The plaintiff’s lawyers have deliberately framed the case under product liability law rather than content law: they are not suing Meta for what users posted on Instagram. They are suing Meta for how Instagram was designed. That framing is the legal innovation that has finally allowed these cases to reach a jury after years of being dismissed under Section 230 protection.
If the product liability framing succeeds, it changes the legal landscape for the entire technology industry permanently.
The design features on trial
At the core of the addiction argument are specific features of the platforms whose design choices are being characterised not as neutral engineering decisions but as deliberate psychological manipulation.
Infinite scroll removes the natural stopping point that a finite page provides. Before infinite scroll, reaching the bottom of a page was a natural moment to pause and make a conscious decision to continue. Infinite scroll eliminates that moment. There is no bottom. The content continues as long as the user continues.
Autoplay removes the decision to watch the next video. Before autoplay, choosing the next piece of content required a conscious act. Autoplay converts that active choice into a passive experience: the content continues unless the user actively stops it. The default is continuation.
Variable reward scheduling is the psychological mechanism underlying notification systems. Slot machines pay out on an unpredictable schedule, and research has consistently found that unpredictable rewards produce stronger and more compulsive engagement than predictable ones. The notification that might be a like, a comment, a message, or nothing at all operates on the same principle. The uncertainty is the mechanism. The phone is checked not because there is definitely something there but because there might be.
These are not features that emerged accidentally from neutral engineering decisions. They were designed, tested, and optimised. The internal documents now entering public record will determine how clearly the companies’ own researchers understood what they were optimising for.
What this means for the generation that grew up inside it
The mental health data for this generation is consistently and significantly worse than for preceding generations across multiple countries and multiple measurement methodologies. Rates of anxiety, depression, self-harm and eating disorders among adolescents, particularly adolescent girls, began rising in the early 2010s, which corresponds precisely with the period when smartphone ownership and social media use became near-universal in this age group.
The causal relationship between social media and these outcomes is genuinely contested in the research literature. Some researchers argue the correlation is clear and the mechanism is established. Others argue the data is more complicated, that other factors contribute, and that the direction of causation is not always clear. Jonathan Haidt’s work, particularly his book “The Anxious Generation,” argues the link is causal and substantial. Other psychologists, including Amy Orben and Andrew Przybylski, have challenged this interpretation and argued the effect sizes are smaller than the public conversation suggests.
The jury in Los Angeles is not being asked to resolve this academic debate. They are being asked to determine whether Meta and Google knew their products were causing harm, continued designing them to maximise engagement among vulnerable young users, and misled the public about the safety of doing so.
Those are narrower questions. And the documents that are now public may make them easier to answer than the broader scientific debate suggests.
The honest question this trial forces
There is a question this trial forces into the open that the technology industry has been avoiding for a decade.
The business model of advertising-funded social media is built on one thing: attention. The more time a user spends on the platform, the more advertising the platform can serve. The more advertising the platform serves, the more revenue it generates. Every design choice that increases time on platform directly increases revenue. Every design choice that helps a user spend time more consciously, set limits, or exit after seeing enough works directly against revenue.
Given that structure, the question is not whether it was possible that these companies designed features to maximise engagement at the expense of user wellbeing. The question is whether, given that their entire revenue model depended on maximising engagement, it was possible that they did not.
The internal documents being entered into evidence are the answer to that question made visible. Whatever they ultimately show, they are now public. Families, regulators, and future jurors in future cases will be able to read them.
That transparency, whatever the verdict, is already a form of accountability. The sealed black box has been opened. The wiring is visible.
The jury will determine what the wiring means legally. The rest of us will determine what it means for the choices we make about the devices in our children’s hands.