Juries hold Meta and YouTube liable for teen harms
Plaintiffs target addictive design rather than user posts, and the Section 230 shield faces a new flank
Two juries in the United States have delivered unusually direct verdicts against the business model of social platforms, assigning blame not to user-generated posts but to the products that deliver them.
In Los Angeles, a jury found Meta and YouTube liable for harm to a young user who said she became addicted to the platforms as a child, awarding $3 million in punitive damages and recommending an additional $3 million in compensatory damages, according to The Epoch Times. Meta was assigned 70% of the punitive damages and YouTube 30%. In a separate case in New Mexico, jurors sided with the state attorney general in finding that Meta violated state law by failing to disclose risks and protect children, imposing a $375 million fine; prosecutors had sought far more.
The immediate sums are not existential for companies of Meta’s size, but the legal theory is. The Los Angeles case was built around platform design choices—“infinite scroll”, filters, and recommendation algorithms—rather than the content users posted, the report notes. That distinction matters because it tries to route around the broad liability shield that US platforms have long relied on under Section 230 of the Communications Decency Act. If plaintiffs can persuade courts that the harm flows from the product’s mechanics, not third-party speech, the litigation target shifts from moderation decisions to the engagement machine itself.
That shift would also change what platforms optimise for. A feed that maximises time-on-app is easy to measure and easy to defend internally; a feed that must survive discovery, expert testimony, and juries is a different engineering problem. The Epoch Times reports lawyers already describing the cases as “bellwether” suits and predicting copycat claims, including class actions where the numbers scale quickly. Even if most verdicts are reduced or overturned on appeal, the cost of defending them—document production, depositions, internal research made public—becomes a recurring tax on the product.
Regulators are watching the same opening. New Mexico’s case framed the issue as consumer protection: disclosure, safety features, and the gap between public assurances and internal knowledge. That approach does not require proving a specific post caused harm; it treats the platform like any other product sold to minors. The practical consequence is pressure for age verification, default restrictions, and auditable safety claims—controls that large incumbents can implement and smaller rivals may struggle to afford.
Both cases are expected to be appealed, and the report suggests they could ultimately reach the US Supreme Court. But the verdicts already sketch a future in which a teenager’s scrolling habits can be litigated like a defective product.
Meta and YouTube were found liable in Los Angeles on March 25; New Mexico's jury set its fine a day earlier.