Meta Seeks to Narrow Evidence Scope in New Mexico Child Safety Trial

Meta Platforms, Inc., the parent company of Facebook and Instagram, is preparing for a significant legal battle in New Mexico, where it faces allegations of failing to protect minors from sexual exploitation on its platforms. As the trial date approaches, Meta’s legal team is actively seeking to restrict the scope of evidence that can be presented in court.

The lawsuit, initiated by New Mexico Attorney General Raúl Torrez in late 2023, accuses Meta of negligence in safeguarding minors against online predators, trafficking, and sexual abuse. The state contends that Meta permitted explicit material to reach minors and lacked sufficient child safety measures. This case is notable as it represents the first state-level trial of its kind, with proceedings set to commence on February 2.

According to public records reviewed by Wired, Meta aims to exclude various pieces of evidence from the trial, including:

– Research on social media’s impact on youth mental health.
– Accounts of teen suicides linked to social media use.
– Details of Meta’s financial status.
– Records of the company’s past privacy violations.
– Information pertaining to CEO Mark Zuckerberg’s college years.

Legal experts consulted by Wired suggest that while it’s common for companies to seek to narrow the focus of a case, Meta’s extensive list of exclusions is unusually broad. Notably, the company also seeks to prevent discussions about its AI chatbots and a public health warning issued by former U.S. Surgeon General Vivek Murthy regarding social media’s effects on youth mental health. Meta argues that such information is irrelevant or could unfairly influence the jury.

This legal strategy unfolds against a backdrop of increasing scrutiny over Meta’s handling of child safety. In August 2025, four whistleblowers alleged that Meta suppressed internal research on children’s safety, raising concerns about the company’s commitment to protecting young users. Additionally, in May 2024, the European Commission launched formal investigations into Facebook and Instagram over child protection concerns, citing the platforms’ potentially addictive designs and their impact on minors’ mental health.

Furthermore, in August 2025, Texas Attorney General Ken Paxton initiated an investigation into Meta and Character.AI for allegedly engaging in deceptive trade practices by marketing AI platforms as mental health tools for children without proper medical credentials or oversight. This probe followed reports that Meta’s AI chatbots were interacting inappropriately with minors, including engaging in flirtatious conversations.

In response to these concerns, Meta announced in August 2025 that it would update its chatbot rules to avoid inappropriate topics with teen users. The company said it would train its chatbots not to engage teenage users on self-harm, suicide, or disordered eating, and to steer away from potentially inappropriate romantic conversations.

As the New Mexico trial approaches, its outcome could carry significant implications for Meta and the broader tech industry over how far social media platforms must go to protect minors from online harm. The case underscores the ongoing debate over balancing free expression with robust child safety measures in the digital age.