Meta has threatened to remove Facebook, Instagram, and WhatsApp from New Mexico if a judge sides with the state on court-ordered changes aimed at protecting underage users.
Some shit you should know before you dig in: Back in 2023, New Mexico Attorney General Raúl Torrez sued Meta and CEO Mark Zuckerberg, accusing the company of harming kids’ mental health and exposing them to sexual exploitation on its platforms. The whole thing started with a sting (NMDOJ investigators built a fake account pretending to be a 13-year-old girl, and the inbox started filling fast with predators sending images, propositions, and DMs). Meta’s automated systems didn’t catch a single one, and internal Meta documents introduced at trial revealed that once Zuckerberg flipped on default end-to-end encryption for Messenger in 2019, employees ran the numbers and figured out somewhere around 7.5 million CSAM reports would never reach law enforcement. One Meta researcher’s own estimate put the daily child exploitation incidents on Facebook and Instagram as high as 500,000. Ultimately, a jury in Santa Fe came back in March 2026 with 75,000 violations of the state’s Unfair Practices Act and hit the company with a $375 million fine. Next week lawyers for Meta will be back in the courtroom, and a judge will determine what reforms Meta will have to make.
What’s going on now: Attorney General Torrez has criticized Meta’s threat to pull its platforms from the state, dismissing it as a “PR stunt” during a virtual press conference Thursday and arguing the company has the technical capability to do what’s being asked. He said Meta has “rewritten its own rules, redesigned its products and even bent to the demands of dictators to preserve market access. This is not about technological capability. Meta simply refuses to place the safety of children ahead of engagement, advertising revenue and profit.”
Torrez also told reporters his team is still vetting candidates nationwide for the proposed Child Safety Monitor role and hasn’t landed on anyone yet.
The threat came in a court filing, originally sealed and made public April 29, in which Meta said the state's demands are "technologically or practically infeasible" and would essentially force the company to build separate apps just for New Mexico, leaving withdrawal as the only realistic path. The state's list of demands is sweeping: banning users under 13 and deleting their existing accounts, linking every minor account to a guardian account, killing end-to-end encryption for users under 18, banning infinite scroll, autoplay, and push notifications during school and sleep hours, capping monthly access at 90 hours per minor, requiring 99% accuracy in detecting new child sexual abuse material, and installing a court-appointed Child Safety Monitor (paid for entirely by Meta) to oversee compliance for at least five years.
A Meta spokesperson said singling out one platform misses the bigger picture and that the state’s proposed rules infringe on parental rights and free expression, adding that the company has already launched 13 safety measures over the past year and remains committed to providing age-appropriate experiences. The company also argued in its filing that its platforms can’t be a public nuisance because nobody is forced to use them, writing that under that logic, “fast-food chains would be liable for creating a public nuisance by selling food that can contribute to obesity.”
The bench trial begins Monday in Santa Fe before Chief Judge Bryan Biedscheid and is expected to run about three weeks. Meta has tried to delay or kill the case entirely, claiming Section 230 immunity and asking for a postponement, but the court has denied each request.
This all comes as more than 40 state attorneys general have brought their own child safety cases against Meta, and Congress keeps choking on federal bills covering addictive algorithms, age verification, and platform liability for minors.