Meta and Google have been found liable in a landmark US court case, in which a jury found that social media platforms, specifically Instagram and YouTube, can be held legally responsible for harm caused to a young user's mental health.

The case was brought by a 20-year-old woman who claimed she was exposed to social media platforms such as Instagram and YouTube from a young age, and that prolonged exposure to these platforms contributed to negative mental health effects, including depression and even self-harm. The jury ruled in her favor, finding that the platforms played a meaningful role in causing the harm.
What makes the case so significant is that it didn't focus on the harm of users seeing specific content, but on how the platforms themselves are designed: infinite scrolling, autoplaying videos, algorithm-driven recommendations, and notifications engineered to draw users back. The jury agreed that these design choices can foster addictive behavior, particularly in children, and that the companies behind the platforms failed to warn users of the potential harm.
The court awarded the woman approximately $6 million in damages, most of which will be paid by Meta and Google, the companies the court found most liable. Both have denied any wrongdoing and are expected to appeal the decision. The damages themselves, however, are a drop in the bucket compared with what could follow, as the ruling marks the first time a platform's design, its user interface rather than its content, has been found legally responsible for harming users.
As you can probably imagine, others are now considering legal action against social media companies over similar claims, potentially opening the door to thousands of additional lawsuits on top of those already filed. Regulators, too, could cite the case to justify imposing design restrictions on social platforms to mitigate harm to users.
Ultimately, the ruling represents a shift in how courts view social media platforms. Instead of treating each platform as a neutral tool whose operators bear no responsibility beyond deciding what content users can and can't see, courts are now treating them as products, recognizing that the design choices behind those products can harm user wellbeing, particularly for younger audiences.
