Critics have long accused Meta of caring too little about children's safety, treating it as a public relations problem rather than a design problem. Now, as the parent company of Facebook and WhatsApp faces one of the most aggressive state-level crackdowns ever attempted against a social media giant, it is threatening New Mexico: push too far, and we'll take our platforms and go home.
That threat came after New Mexico Attorney General Raúl Torrez sought sweeping court-ordered restrictions. The motion for injunctive relief Torrez filed on April 30 would radically redefine how Facebook, Instagram, and WhatsApp operate for children and teenagers in the state.
Meta’s response was blunt. “While it is not in Meta’s interests to do so,” the company said in a statement, “if a workable solution to Attorney General Torrez’s demands is not reached, we may have no choice but to remove access to its platforms for users in New Mexico entirely.”
To Torrez, Meta’s statement sounded more like corporate intimidation. “Meta is showing the world how little it cares about child safety,” he said Thursday. “Meta’s refusal to follow the laws that protect our kids tells you everything you need to know about this company and the character of its leaders.”
The simmering confrontation between New Mexico and Meta has become one of the nation’s most consequential legal battles. At stake is the question of whether states can force social media companies to redesign products that critics say were engineered to maximize addiction, engagement, and advertising profits — even at the expense of children.
A Fake Teenager and a Flood of Predators
The lawsuit against the social media giant stemmed from a 2023 covert operation in which investigators from the New Mexico Department of Justice used a decoy account.
The fake account posed as a 13-year-old girl. Court filings said the profile was rapidly inundated with sexually explicit messages, exploitative solicitations, and unsolicited images from adult men. Investigators said Meta's safety systems neither flagged nor interrupted the interactions.
That operation became the foundation for a lawsuit accusing Meta of deceptive business practices, facilitating child sexual exploitation, and intentionally designing its platforms to keep young users hooked.
The case was notable because state prosecutors sidestepped the formidable legal shield of Section 230 of the Communications Decency Act, which grants tech companies immunity for most user-generated content and frequently leads courts to dismiss lawsuits against them. Prosecutors instead pursued the case under the state's consumer protection statutes.
In March, a jury in Santa Fe held Meta liable for 75,000 violations of New Mexico’s Unfair Practices Act. The jury ordered the company to pay $375 million in civil penalties — the maximum allowed under state law. The verdict marked the first trial victory by a state against a major technology company over allegations linked to child safety.
The court battle also exposed internal company discussions about the consequences of encryption policies. Jurors reviewed documents revealing that Meta employees believed CEO Mark Zuckerberg's 2019 encryption expansion would hinder the company's ability to spot and report child abuse content. According to testimony, one Meta researcher identified hundreds of thousands of potential child exploitation cases daily across Facebook and Instagram.
What New Mexico Wants Meta to Change
New Mexico will seek injunctive relief at a bench trial beginning May 4 before Chief Judge Bryan Biedscheid, relief that could fundamentally alter Meta's operations for minors in the state.
Among the proposed changes are the following:
- Mandatory age verification for users.
- Removal of children under 13 from Meta platforms.
- Linking every minor account to a parent or guardian account.
- Blocking adults from messaging minors unless directly connected.
- Ending algorithmic recommendations of minors to adults.
- Permanent “one-strike” bans for users involved in child exploitation.
- Disabling end-to-end encryption for users under 18.
- Restricting infinite scroll, autoplay, and push notifications during school and sleeping hours.
- Capping minors’ platform use at 90 hours per month.
- Installing a court-appointed Child Safety Monitor funded by Meta for at least five years.
Taken together, the demands amount to something more than a traditional regulatory settlement. They are closer to a court-supervised redesign of social media.
Meta argues the demands are impractical. “The State’s demands are technically impractical, impossible for any company to meet, and disregard the realities of the internet,” the company said. The social media giant also argued that focusing on a single platform overlooks the broader ecosystem of apps teenagers use and unfairly burdens parents’ decision-making authority.
But Torrez rejected the claim that Meta lacks the technical ability to comply. “For years, the company has rewritten its own rules, redesigned its products, and even bent to the demands of dictators to preserve market access,” he said. “This is not about technological capability.”
A National Reckoning for Big Tech
The case arrives as lawmakers across the United States struggle to catch up with a digital ecosystem that evolved far faster than federal regulation.
More than 40 attorneys general have filed lawsuits against Meta claiming that its platforms have harmed young people's mental health. Meanwhile, Congress has repeatedly failed to pass major legislation to protect children from online harm.
Lawmakers enacted the Children's Online Privacy Protection Act (COPPA) in 1998. The law predates the algorithmic feeds, influencer culture, and TikTok-style recommendation engines that have reshaped the internet into a behavioral marketplace.
What makes the New Mexico case especially explosive is not the size of the penalties or the aggressiveness of the legal theory. It is the prospect that a state court could effectively dictate how Meta, one of the world's largest technology companies, designs its products for millions of users.
Meta may have intended its threat to leave New Mexico entirely as leverage; Torrez dismissed it as a "PR stunt."
But the threat also revealed something deeper about the escalating standoff between governments and Silicon Valley: when regulation threatens the designs that drive engagement and profit, even the world's most connected tech companies may decide that disconnection is the better option.
