In July 2000, a nicotine-addicted physician named Howard Engle won a landmark judgment against the American tobacco industry.
Amid a nationwide reckoning about the harms of smoking, Engle convinced a Florida jury that cigarette makers had knowingly sold addictive products while lying about their dangers.
Now, jurors in Los Angeles have reached a similar verdict about Instagram and YouTube.
While the consequences are still to play out, they could ultimately prove as seismic as the mass of lawsuits that humbled Big Tobacco in the 1990s.
On Wednesday, the panel at the Superior Court of California found YouTube and Meta — the sprawling social media company that owns Facebook, Instagram, and WhatsApp — liable for harming a young woman known as K.G.M. by knowingly designing addictive and distressing products.
Crucially, this ruling effectively bypasses Section 230, the traditional legal shield that Big Tech has used for decades to deflect such claims.

“For years, social media companies have profited from targeting children while concealing their addictive and dangerous design features,” said one of K.G.M.’s lawyers, Joseph VanZandt.
“Today’s verdict is a referendum — from a jury, to an entire industry — that accountability has arrived.”
To be clear, the actual penalties here, while huge for K.G.M., are completely insignificant for such massive companies.
Meta must pay $4.2 million in combined punitive and compensatory damages (roughly 0.02 percent of its annual profit of $22.7 billion), while YouTube must pay $1.8 million (just over 0.005 percent of its $34.5 billion profit last year). By itself, that’s hardly cause to make their accountants quake in their loafers.
But K.G.M.’s is not the only such case. Thousands of similar lawsuits have been filed across the nation by teenagers, parents, school districts, and state governments.
The outcome will be influential at least in California, where courts are treating this lawsuit as a test case. When you multiply those damages accordingly, you’ll soon reach the kind of numbers that make even a multi-trillion-dollar company sit up and take notice.
Meanwhile, on Tuesday, another jury in New Mexico found that Meta’s platforms are harmful to children’s mental health, imposing a far larger penalty of $375 million.
Together, these cases signal a coming flood of lawsuits against Big Tech, according to Cornell law professor Alexandra Lahav.
“The social media tort litigation is going to be beyond massive,” said Lahav on Bluesky after Wednesday’s verdict. “It will be asbestos level or bigger.”
“Imagine PFAS + Roundup + Earplugs combined,” she went on — referring to previous legal avalanches over harmful ‘forever chemicals’, carcinogenic weedkiller, and defective ear protectors — “and then 3x [it].”
‘The engineering of addiction’
For decades, tech giants have argued that they enjoy blanket protection from lawsuits like this under Section 230 of the Communications Decency Act.
Section 230 is highly controversial, but it’s also the bedrock of the modern internet. It allows companies and individuals to host — and, crucially, to police — user-generated material online, without being held legally liable for its contents.
That’s what allows social media companies to set their own rules and remove violating posts without being treated as the publishers of those posts. If I falsely smear someone in this article, The Independent could be sued for libel, but if you falsely smear someone in the comments, Section 230 would protect us.
But does this also protect the systems by which these companies distribute that content? Does it protect all the psychological hooks and tricks they use to keep their users scrolling and coming back each day?

K.G.M.’s lawyers argued no. They presented internal documents that showed both companies’ executives were briefed on their products’ damaging effects and warned that their policies were harming children.
“If we wanna win big with teens, we must bring them in as tweens,” said one Meta memo. Another showed that Meta was aware that 11-year-olds were regularly using Instagram, despite its rules requiring a minimum age of 13.
One of them was K.G.M., who testified that she started using YouTube at 6 years old and Instagram at 11. She said her compulsive app use had damaged her self-worth, isolated her from friends and family, and contributed to her depression and body dysmorphia.
“How do you make a child never put down the phone? That’s called the engineering of addiction,” said K.G.M.’s lawyer Mark Lanier.
The companies countered that K.G.M. had many other problems in her life, noting that her therapist never documented social media as a cause of her mental health problems. They said it was wrong and simplistic to blame social media for wider societal problems.
But, bluntly, it’s easy to see why the jury wasn’t persuaded. While Meta and YouTube are hardly the source of all society’s ills, there is evidence stretching back years of how senior executives repeatedly prioritized growth and profit over safety and harm reduction.
Most of us have used Instagram and YouTube ourselves, so we have personal experience of how compellingly they can play on our brain chemistry. Many Americans, too, have struggled to pull their children away from digital systems that seem precision-engineered to perpetually ensnare their brains.
A Meta spokesperson said it “respectfully disagrees with the verdict” and is evaluating its options. Google said the case had “misunderstood” YouTube, which is “a responsibly built streaming platform, not a social media site.”
The ruling has implications far beyond just these two companies. TikTok and Snapchat were also named in the case, only to settle out of court.
‘All of this could be reversed on appeal’
Meta and Google have shrugged off billion-dollar fines before. But there is now a plausible future timeline where the legal exposure grows expensive enough that they are forced to seriously re-engineer their products.
“There is a long road ahead, but this decision is quite significant,” Clay Calvert, a media law expert at the center-right American Enterprise Institute, told The New York Times.
“If there are a series of verdicts for plaintiffs, it will force the defendants to reconsider how they design social media platforms and how they deliver content to minors.”
That outcome is far from guaranteed. Many have predicted such a reckoning before, only for the “moment” to fizzle. That includes me, both in 2017 (a “Philip Morris moment”) and in 2021 (a “Lehman moment”).
According to The Guardian, there are 20 more “bellwether” trials scheduled on this subject, whose outcomes might be completely different.
“It is really early to tell the significance of this, because it could all be reversed on appeal,” said Kate Klonick, a law professor and digital policy expert at St. John’s University, on Bluesky.
“This will likely be years before it is final — or not.”
That would actually be similar to what happened to Big Tobacco. Rather than a singular “moment”, it ultimately took roughly four decades for the industry to be brought to heel, from the 1960s to the 2000s.
Even Howard Engle’s victory was partially reversed by an appeals court, limiting its scope and narrowing the path for similar plaintiffs.
Still, rightly or wrongly, this week’s judgments are a potent sign that Americans have lost patience with Silicon Valley’s talking points. If I were them, I’d be brainstorming new ones.