NewsNation

Facebook parent Meta sued by New Mexico over alleged failure to shield children from predators

FILE - Meta's logo is seen on a sign, Nov. 9, 2022, at the company's headquarters in Menlo Park, Calif. Facebook and Instagram fail to protect underage users from exposure to child sexual abuse material and let adults solicit pornographic imagery from them, New Mexico's attorney general alleges in a lawsuit filed Tuesday, Dec. 5, 2023, that follows an undercover online investigation into Meta’s social media platforms. (AP Photo/Godofredo A. Vásquez, File)

SANTA FE, N.M. (AP) — Facebook and Instagram fail to protect underage users from exposure to child sexual abuse material and let adults solicit pornographic imagery from them, New Mexico’s attorney general alleges in a lawsuit that follows an undercover online investigation.

“Our investigation into Meta’s social media platforms demonstrates that they are not safe spaces for children but rather prime locations for predators to trade child pornography and solicit minors for sex,” Attorney General Raul Torrez said in a statement Wednesday.

The civil lawsuit filed late Tuesday against Meta Platforms Inc. in state court also names its CEO, Mark Zuckerberg, as a defendant.

In addition, the suit claims Meta “harms children and teenagers through the addictive design of its platform, degrading users’ mental health, their sense of self-worth, and their physical safety,” Torrez’s office said in a statement.

Those claims echo a lawsuit filed in late October by the attorneys general of 33 states, including California and New York, against Meta that alleges Instagram and Facebook include features deliberately designed to hook children, contributing to the youth mental health crisis and leading to depression, anxiety and eating disorders. New Mexico was not a party to that lawsuit.

Investigators in New Mexico created decoy accounts posing as children 14 and younger, which Torrez’s office said were served sexually explicit images even when the decoy child expressed no interest in them. State prosecutors claim that Meta let dozens of adults find, contact and encourage children to provide sexually explicit and pornographic images.

The accounts also received recommendations to join unmoderated Facebook groups devoted to facilitating commercial sex, investigators said, adding that Meta also let its users find, share, and sell “an enormous volume of child pornography.”

“Mr. Zuckerberg and other Meta executives are aware of the serious harm their products can pose to young users, and yet they have failed to make sufficient changes to their platforms that would prevent the sexual exploitation of children,” Torrez said, accusing Meta’s executives of prioritizing “engagement and ad revenue over the safety of the most vulnerable members of our society.”

Meta, which is based in Menlo Park, California, did not directly respond to the New Mexico lawsuit’s allegations, but said that it works hard to protect young users with a serious commitment of resources.

“We use sophisticated technology, hire child safety experts, report content to the National Center for Missing and Exploited Children, and share information and tools with other companies and law enforcement, including state attorneys general, to help root out predators,” the company said. “In one month alone, we disabled more than half a million accounts for violating our child safety policies.”

Meta spokesman Andy Stone pointed to a company report detailing the millions of tips Facebook and Instagram sent to the National Center in the third quarter of 2023, including 48,000 involving inappropriate interactions, which could include an adult soliciting child sexual abuse material directly from a minor or attempting to meet with one in person.

Critics including former employees have long complained that Meta’s largely automated content moderation systems are ill-equipped to identify and adequately eliminate abusive behavior on its platforms.