A 20-year-old woman who began using social media at age six has won a landmark case against Meta and YouTube over addictive design. A jury found both companies liable for harm tied to how their platforms were built, not just what users see.
The jury awarded $6 million in damages, making this one of the first major verdicts to treat social media platforms as products that can cause harm. It also signals a shift in how courts examine the role of design in shaping user behavior.
The case was filed by a woman identified in court as K.G.M., who testified that her early and prolonged use of YouTube and Instagram affected her mental health. She described spending hours daily on the platforms, developing anxiety, depression, and body dysmorphia over time.
Jurors concluded that the companies’ design choices were a substantial factor in that harm and that they failed to provide adequate warnings to users, especially minors.
Unlike earlier legal challenges, this case focused on platform architecture rather than content. Lawyers pointed to features such as infinite scroll, autoplay videos, and algorithmic recommendations, which continuously feed users personalized content. These systems are designed to maximize engagement and keep users on the platform for extended periods.
By framing the issue this way, the case moved forward despite Section 230, a U.S. law that shields tech companies from liability over user-generated content.
The jury assigned 70% of the liability to Meta and 30% to YouTube, reflecting each platform's role in the plaintiff's harm. The damages included both compensatory and punitive awards, signaling recognition of the harm and an intent to hold the companies accountable.
While the financial penalty is relatively small compared to the companies’ scale, the ruling carries broader legal significance as a bellwether case that could influence thousands of similar lawsuits.
Meta has introduced safety measures in recent years, including stricter content moderation, teen protections, and warnings around sensitive material. However, this case examined how the platforms function at a structural level, rather than the content itself. The jury’s decision suggests that even with these safeguards, questions remain about how core design features affect young users.
“Teen mental health is profoundly complex and cannot be linked to a single app,” Andy Stone, Meta spokesperson, said, as the company signaled plans to appeal the ruling.
YouTube also said it disagrees with the verdict and will challenge the decision, arguing that the case misrepresents how its platform operates.