Meta's Big Tobacco Moment
How a product liability lawsuit cracked the Section 230 shield that protected social media companies for 29 years
A former Facebook executive once said he felt “tremendous guilt” about what the platform does to people. He said the company had “created tools that are ripping apart the social fabric of how society works.”
On March 25, 2026, a Los Angeles jury showed us what he meant.
After five weeks of trial and 44 hours of deliberation stretched across nine days, a jury in Los Angeles County Superior Court found Meta and YouTube negligent in the design and operation of their platforms. The damages were $6 million. Meta pays 70%. YouTube pays 30%.
Six million dollars is a rounding error for companies each worth more than a trillion dollars. But the number was never the story. How they got there was.
The Untouchables
For 29 years, social media companies operated behind a legal shield that made them nearly impossible to sue. Section 230 of the Communications Decency Act, passed in 1996, was written for a different internet. The law was a direct response to Stratton Oakmont v. Prodigy (1995). Yes, that Stratton Oakmont, Jordan Belfort's firm from The Wolf of Wall Street.
Stratton Oakmont sued Prodigy, an early online service, because someone posted accusations of fraud on one of its message boards. The firm was actually committing fraud, but it won anyway: because Prodigy moderated its boards, the court treated it as a publisher, liable for everything its users posted. That ruling left platforms facing a perverse choice: moderate content and be treated as a publisher, or don't moderate and stay protected. Congress wrote Section 230 to let platforms moderate without becoming publishers. It was a reasonable fix for message boards and dial-up services.
It was never designed to protect companies that build algorithms to identify vulnerable teenagers and feed them content engineered to make them feel worse.
But that is what much of social media has become. For nearly three decades, every lawsuit aimed at social media companies ran into the same wall. You can’t sue the platform for what users post. The platform isn’t the publisher. Case dismissed.
The mob had a version of this. For decades, everyone knew what organized crime families were doing. The violence, the extortion. But the legal system couldn’t touch them through conventional prosecution. The evidence was there, the will was there, but the law wasn’t built for it.
How did the feds get around this? They got Al Capone on tax evasion. The feds couldn’t prove the murders or the bootlegging in court. But they could prove he didn’t pay his taxes. The charge didn’t match the public harm, but it fit the conduct, and it put him away.
The lawyers who brought this case against Meta did something similar.
The Side Door
They stopped trying to sue Meta as a publisher. Instead, they sued Meta as a manufacturer of a defective product.
They went after how the product was built: the infinite scroll, the notification loops designed like slot machines, the algorithm that learns within days what makes a 13-year-old girl feel ugly and then serves her more of it. And the beauty filters that Meta’s own researchers said harm teenage girls, the ones Mark Zuckerberg chose to reinstate anyway.
Product liability law has been on the books for decades. It’s how courts held tobacco companies accountable for engineering cigarettes to be more addictive while marketing them to teenagers. Opioid litigation worked the same way: not “your drug is dangerous” but “you designed the distribution system to maximize addiction.”
In November 2025, Judge Carolyn B. Kuhl drew a line that made this possible. She ruled that Section 230 did not protect Meta from claims about product design. The argument wasn’t about what users posted. It was about how the product was built. That distinction, content versus conduct, cracked the shield open.
Which left one question: what had the product actually done to someone?
What the Jury Saw
The plaintiff, identified in court as K.G.M., is now 20 years old. She started watching YouTube at age 6. She was on Instagram by 9. By her teens, she had given up her hobbies, withdrawn from friends, and developed depression and suicidal thoughts.
Her lawyers didn’t argue that bad posts hurt her. They argued that the machine was built to keep her there, and it worked exactly as designed.
Zuckerberg took the stand on February 18, 2026. Plaintiff attorney Mark Lanier, who previously won a $4.69 billion verdict against Johnson & Johnson in the talc cancer litigation using the same product-liability playbook, held up an internal Meta document from 2018: “If we wanna win big with teens, we must bring them in as tweens.”
Zuckerberg, who had testified that Meta does not target children, was asked to read it aloud.
A separate internal communication from Instagram employees was entered into evidence: “We’re basically pushers... We’re causing reward deficit disorder, because people are binging on Instagram so much they can’t feel the reward.”
Meta’s own people wrote the plaintiff’s best argument.
Death by a Thousand Cuts
Six million dollars barely registers on Meta’s balance sheet.
The real exposure isn’t one sweeping government settlement where lawyers consolidate everything into a single case, the company pays a lump sum, and prosecutors take jobs as in-house counsel later. That model, the one that made people furious about bank settlements after 2008, breaks down when you’re facing 2,400 individual cases.
More than 2,400 lawsuits are pending against social media companies in state and federal courts. This was the first to reach a verdict and provides a playbook for future cases. TikTok and Snapchat settled before trial. Only Meta and YouTube went the distance.
The day before the LA verdict, a separate jury in Santa Fe, New Mexico, ordered Meta to pay $375 million for 75,000 violations of the state’s Unfair Practices Act, each carrying a $5,000 penalty, for exploiting children on Facebook, Instagram, and WhatsApp. Attorney General Raúl Torrez’s team had run an undercover operation, creating fake child accounts to document how predators used the platforms and how Meta responded.
Meta cannot hire enough lawyers to fight every one of these individually, and consolidating them into a single class action is not an option: each plaintiff’s injuries are individual, so the cases have to be tried or settled one at a time. Every state attorney general watching this, particularly in an election year, is looking at a cause that polls at 80%+ bipartisan support and thinking about what a product liability case would do for their career.
There’s another detail that should worry Meta’s shareholders. A Delaware court ruled that Meta’s insurers have no duty to defend these cases. The reasoning: the lawsuits allege intentional design choices, not accidents. Insurance covers negligence. It doesn’t cover a product you built to work this way on purpose. Meta is paying its own legal bills across every one of those 2,400+ cases, which changes the financial calculation entirely.
Meta’s best argument is that proving damages here is harder than it was for tobacco or opioids. Tobacco plaintiffs had lung cancer. Opioid plaintiffs had overdose deaths and medical records. The harms in these cases are depression and suicidal thoughts. Real, but harder to put a dollar figure on. There’s no tumor to show a jury. In previous mass torts, individual opioid victims received as little as $400 after fees. The $6 million this jury awarded is actually high by comparison. Whether that number holds as a benchmark across 2,400 cases, or whether the psychological nature of the harm drives settlements lower, remains genuinely open. But Meta can’t bet on that ambiguity while it’s paying out of pocket for every fight.
The pressure accumulates from hundreds of individual cases that collectively make it more expensive to fight than to change. The tobacco industry learned this. It took decades of individual verdicts and state AG lawsuits, a slow accumulation of internal documents proving that the companies knew what they were doing. The 1998 Master Settlement Agreement cost the industry $246 billion. It didn’t happen because of one verdict. It happened because the losses kept coming, and at some point it was cheaper to settle than to keep showing up in court.
And the people inside the companies knew it was coming.
What They Already Knew
Meta’s practices don’t stop at American teenagers. In Southeast Asia, the company has faced accusations of enabling scam operations and failing to moderate content that destabilizes communities with no resources to fight back. This verdict gives local authorities in those countries something they didn’t have before: a finding from Meta’s home country that its product design is harmful.
You don’t need to follow international law to see why this matters to your family, though.
Internal Meta documents, now trial evidence, show the company knew that 30% of 10-to-12-year-olds in the U.S. were on Instagram despite a stated policy requiring users to be 13. Its researchers set internal targets to push daily engagement for tweens (ages 10-12) to 40 minutes.
The habits these companies engineer into your children are not the habits their own executives allow in theirs. Everyone knew that. Now a jury has seen the internal documents that prove it.
Meta will appeal. YouTube will appeal. The legal process will take years. But the legal theory that kept social media companies untouchable for three decades just failed in front of a jury.
That former Facebook executive said he felt tremendous guilt. He said it in 2017. It took eight more years and a nine-day jury deliberation for the legal system to catch up to what his own colleagues had written down.
If you’re a parent, you don’t have to wait that long.
Curious Jay is a weekly essay series on why things don’t work the way they should. Subscribe free for investigations into power, money, and broken systems.