Recent jury verdicts against Meta and Google have escalated a legal fight that could change how much protection U.S. law gives tech companies from lawsuits about their platforms’ design.
Jurors in the first two U.S. trials over alleged harm to children found Meta and Google liable, setting the stage for appeals that could test the scope of Section 230 of the Communications Decency Act.
Two verdicts, two different cases, one larger legal question
In Los Angeles, Reuters reported that a jury found Meta and Google responsible for a young woman’s depression and suicidal thoughts after she said she became addicted to Instagram and YouTube at a young age.
The companies were ordered to pay a total of $6 million in damages. In a separate case in New Mexico, jurors ordered Meta to pay $375 million after deciding the company misled users about the safety of its products for young people and allowed sexual exploitation of children on its platforms.
NBC News called the California case a landmark social media addiction trial, showing how closely it is being watched as a sign for thousands of similar lawsuits. The Los Angeles verdict is one of the first major test cases in a larger group of lawsuits against social media companies over alleged harm to children and teens.
Why Section 230 is suddenly under pressure again
The bigger legal issue is how the plaintiffs managed to get around Section 230, a 1996 federal law that usually protects online platforms from being sued over user-generated content.
Reuters reported that both cases pierced a legal shield plaintiffs have rarely managed to break: the lawsuits claimed the companies harmed young users through their choices about the platforms' design, rather than through the content itself.
This difference could affect more than just Meta and Google.
Gregory Dickinson, an assistant professor at the University of Nebraska College of Law, said courts are now trying to separate claims about platform functionality or platform conduct from claims that would hold companies responsible for what users say.
In practice, this means courts may be more open to cases about features like infinite scroll, autoplay, recommendation systems, and engagement design, instead of just focusing on user content.
Appeals are coming, and the stakes go beyond social media
Both Meta and Google plan to appeal, and these appeals will likely focus on Section 230. Meta declined to comment further, only saying it would appeal in both cases. Google said it plans to appeal in the Los Angeles case.
The legal effects could go beyond Instagram and YouTube.
Meta, Google, Snap, and ByteDance are already facing thousands of lawsuits in state and federal courts, with claims that their platform designs have contributed to a mental health crisis among young people.
Over 2,400 cases are now before one judge in California federal court, and thousands more are combined in California state court.
More than 130 lawsuits are pending in federal court against Roblox, claiming it did not protect users from sexual exploitation.
This is why the verdicts are seen as more than just isolated losses.
Eric Goldman of Santa Clara University School of Law said, “I think the internet is on trial, not social media,” adding that if these legal theories succeed, “they will be deployed elsewhere.”
Supreme Court interest may be growing
The Supreme Court has already shown interest in Section 230. The justices heard a case involving YouTube in 2023 but ultimately declined to rule on the scope of legal protections for internet companies.
In 2024, the court chose not to revive a case accusing Snapchat’s owner, Snap, of failing to protect underage users from sexual predators.
However, Justices Clarence Thomas and Neil Gorsuch dissented, warning against further delay in addressing the issue. They wrote that social media platforms have increasingly used Section 230 as a get-out-of-jail-free card.
For now, these verdicts do not remove Section 230. However, they suggest that juries, and possibly appellate courts in the future, may be more willing to separate the act of hosting speech from designing systems that could make harm more likely.
If this change continues, it could redefine legal accountability for much of today’s internet.