Meta’s Landmark Legal Defeats Over Teen Safety: A Turning Point for Social Media Accountability
In a groundbreaking series of legal decisions, Meta Platforms Inc., the parent company of Facebook and Instagram, has been held accountable for endangering the safety and mental health of teenage users. These rulings mark a significant shift in the legal landscape, potentially setting precedents for future litigation against social media giants.
New Mexico’s Landmark Verdict
Last week, a jury in Santa Fe, New Mexico, found Meta liable for violating the state’s Unfair Practices Act. The court determined that Meta misled consumers about the safety of its platforms, particularly concerning their impact on children. The jury imposed the maximum civil penalty of $5,000 per violation, for a total of $375 million. This marks the first time Meta has been held legally responsible for compromising child safety. ([techcrunch.com](https://techcrunch.com/2026/03/24/new-mexico-just-handed-meta-its-first-courtroom-defeat-over-child-safety-and-the-rest-of-the-country-is-watching/?utm_source=openai))
Los Angeles Jury Holds Meta and YouTube Accountable
In a related case, a Los Angeles jury found Meta and YouTube (owned by Google) negligent in designing their platforms to be addictive to children and teens, thereby harming their mental health. The plaintiff, a 20-year-old identified as K.G.M., alleged that her use of these platforms contributed to her anxiety, depression, and body dysmorphia during her adolescence. The jury assigned 70% of the liability to Meta and 30% to YouTube, awarding combined damages of $6 million. ([techcrunch.com](https://techcrunch.com/2026/03/25/jury-finds-meta-and-youtube-negligent-in-landmark-social-media-addiction-trial/?utm_source=openai))
Implications for Future Litigation
These verdicts have opened the floodgates for a wave of lawsuits targeting Meta’s practices concerning teen users. Thousands of similar cases are currently pending, and 40 state attorneys general have initiated lawsuits against Meta, echoing the claims made in New Mexico. Legal experts suggest that focusing on the design features of these platforms—such as endless scrolling and constant notifications—rather than on user-generated content has proven to be an effective strategy in court. Allison Fitzpatrick, a digital media lawyer and partner at Davis+Gilbert, noted, “They took the model that was used against the tobacco industry many years ago… It turned out to at least be, in these two cases, a winning argument.” ([techcrunch.com](https://techcrunch.com/2026/03/31/meta-was-finally-held-accountable-for-harming-teens-now-what/?utm_source=openai))
Meta’s Response and Internal Documents
Meta has expressed its intention to appeal these decisions. A company spokesperson stated, “Reducing something as complex as teen mental health to a single cause risks leaving the many, broader issues teens face today unaddressed and overlooks the fact that many teens rely on digital communities to connect and find belonging.” ([techcrunch.com](https://techcrunch.com/2026/03/31/meta-was-finally-held-accountable-for-harming-teens-now-what/?utm_source=openai))
During the litigation, internal documents from Meta were disclosed, revealing a pattern of inaction regarding the negative impact of its platforms on minors. These documents also highlighted deliberate efforts to increase teen engagement, even during school hours or through “finstas” (fake Instagram accounts created by teens to evade parental oversight). One 2019 study conducted by Meta involved 24 in-person interviews with users flagged for “problematic usage,” a designation that applied to approximately 12.5% of users. The report concluded, “The best external research indicates that Facebook’s impact on people’s well-being is negative.” ([techcrunch.com](https://techcrunch.com/2026/03/31/meta-was-finally-held-accountable-for-harming-teens-now-what/?utm_source=openai))
Internal Communications and Employee Concerns
Internal communications further revealed that Meta’s leadership prioritized increasing teen engagement. CEO Mark Zuckerberg commented that for Facebook Live to succeed with teens, the company would need to be adept at “not notifying parents/teachers.” Other executives made similar remarks, with one employee stating, “We learned one of the things we need to optimize for is sneaking a look at your phone in the middle of Chemistry :).” Another executive noted, “No one wakes up thinking they want to maximize the number of times they open Instagram that day. But that’s exactly what our product teams are trying to do.” ([techcrunch.com](https://techcrunch.com/2026/03/31/meta-was-finally-held-accountable-for-harming-teens-now-what/?utm_source=openai))
Former Meta employees have corroborated these findings. Kelly Stonelake, who worked at Meta from 2009 to 2024, stated, “The mountain of unsealed evidence really demonstrates what I experienced firsthand.” Stonelake, who led strategies for the VR social app Horizon Worlds, raised concerns about inadequate content moderation tools, particularly given the app’s appeal to teenagers. ([techcrunch.com](https://techcrunch.com/2026/03/31/meta-was-finally-held-accountable-for-harming-teens-now-what/?utm_source=openai))
Government and Legislative Response
The U.S. government has intensified its focus on children’s online safety, especially after whistleblower Frances Haugen leaked internal documents in 2021 showing that Meta was aware of Instagram’s harmful effects on teen girls. While Congress has proposed various bills to address these concerns, some privacy advocates argue that certain measures could lead to increased surveillance and censorship without effectively protecting minors. Evan Greer, director of Fight for the Future, stated, “There is no universe where passing censorship or ‘age verification’ law, under the guise of kids safety, doesn’t lead to massive online censorship of content and speech that Trump doesn’t like.” ([techcrunch.com](https://techcrunch.com/2026/03/31/meta-was-finally-held-accountable-for-harming-teens-now-what/?utm_source=openai))
Stonelake, who previously lobbied for the Kids Online Safety Act, has since become critical of the bill’s evolution. She expressed concerns over clauses that would override state regulations and limit legal recourse for affected parties, stating, “There is language in the latest version that would close the courthouse doors to school districts, to bereaved families, to states—and that’s wild.” ([techcrunch.com](https://techcrunch.com/2026/03/31/meta-was-finally-held-accountable-for-harming-teens-now-what/?utm_source=openai))
Conclusion
These legal defeats signal a pivotal moment in holding social media companies accountable for their impact on teen mental health. As more lawsuits emerge and legislative efforts continue, the tech industry faces growing pressure to reevaluate platform designs and implement more robust safeguards for younger users.