A New Mexico jury has ordered Meta to pay $375 million after finding the company made false claims about protecting children on its platforms. The landmark verdict Tuesday represents one of the largest penalties ever imposed on a tech company specifically for child safety violations, marking a significant escalation in how states hold social media giants accountable for their youngest users.

The verdict follows a closely watched trial that examined Meta's public statements about child safety measures across Instagram, Facebook, and WhatsApp. New Mexico prosecutors argued the company systematically misled users about the effectiveness of its protections while continuing to expose minors to harmful content and predatory behavior.

Unlike typical privacy fines that focus on data handling, this case centered entirely on child welfare — a distinction that legal observers say could reshape how tech companies communicate about safety features. The $375 million penalty exceeds most state-level tech fines and signals growing willingness among state attorneys general to pursue child-focused litigation.

Why it matters: This ruling establishes a new precedent for holding tech companies accountable specifically for child safety claims, potentially opening the door for similar cases in other states targeting misleading safety representations.

The case emerged from a broader investigation into Meta's business practices, with New Mexico Attorney General Raúl Torrez leading efforts to examine whether the company's public safety commitments matched its actual protective measures. The trial featured evidence about internal company communications and platform design decisions.

Meta has consistently maintained that it invests heavily in child safety and has implemented numerous protective features across its platforms. The company's safety policies include age verification systems, content filtering, and tools for parents to monitor their children's activity.

The timing of the verdict coincides with growing national scrutiny of social media's impact on young users. Multiple states have introduced legislation requiring enhanced safety measures, while federal lawmakers have proposed comprehensive reforms to how platforms handle underage accounts.

Industry analysts note that the case's focus on misleading claims, rather than on platform design itself, could influence how other tech companies communicate about safety features. The distinction between actual harm and deceptive marketing creates new legal terrain for both regulators and companies.

New Mexico's approach of pursuing state-level enforcement rather than waiting for federal action reflects broader frustration with the pace of national tech regulation. Several other states are reportedly preparing similar child safety investigations targeting major social media platforms.

The $375 million penalty will be distributed according to New Mexico's consumer protection framework, with portions potentially funding digital literacy programs and child safety initiatives. Meta has not yet announced whether it plans to appeal the verdict.

Legal experts describe the ruling as the largest child-focused penalty ever levied against a social media company. They say it sets a financial benchmark that could encourage more aggressive state-level enforcement against tech companies, particularly around claims involving vulnerable populations like children.