For the first time, a U.S. jury is hearing claims that major social media platforms harmed a teenager's mental health. The case, now underway in Los Angeles Superior Court, centers on a California teen identified as K.G.M. and her mother, who allege that platform design choices encouraged compulsive use and worsened her mental health.
The lawsuit names Meta, TikTok, and YouTube as defendants.
At the core of the case is how these platforms are built. The plaintiff argues that features like endless scrolling feeds, algorithmic recommendations, notifications, and suggested connections were designed to maximize engagement, even for young users.
According to court filings, those mechanics allegedly contributed to compulsive use, exposure to harmful content, and declining mental health, including thoughts of self-harm. Snapchat, originally a defendant, settled before trial on undisclosed terms.
The outcome matters beyond this single lawsuit. K.G.M.’s case serves as a bellwether for roughly 1,500 similar personal injury claims consolidated in federal court. A verdict could influence whether other cases move forward or settle, and how courts evaluate responsibility for platform design rather than user-generated content alone.
A key legal issue involves Section 230 of the Communications Decency Act, which generally shields platforms from liability for user posts. However, the judge has indicated that jurors should focus on whether product design choices themselves caused harm. That distinction could reshape how future claims against technology companies are argued.
The companies deny the allegations and point to safety tools rolled out in recent years. These include parental controls, default privacy settings for teens, content restrictions, and efforts to identify underage users. They argue the evidence does not prove their platforms caused the plaintiff’s mental health challenges.