Mark Zuckerberg and Adam Mosseri acknowledged during court depositions that some harms to children, including sexual exploitation and mental health risks, inevitably occur across Meta's social media platforms. Their taped testimonies were presented during an ongoing trial in New Mexico, where the company is defending its safety practices.
Meta operates several of the world's largest social networks, including Facebook, Instagram, and WhatsApp. Each platform serves billions of monthly users. According to Zuckerberg, the massive scale of these networks makes some misuse unavoidable despite ongoing safety enforcement and monitoring systems.
Lawsuit Accuses Meta of Prioritizing Engagement Over Child Protection

As The Guardian reported, the lawsuit was filed by New Mexico Attorney General Raúl Torrez, who alleges that Meta allowed predators to exploit children through its platforms while prioritizing profit and user engagement. The trial, which began earlier this year, is expected to run for several weeks.
Prosecutors presented internal estimates suggesting that in 2020, roughly 500,000 minors may have received sexually inappropriate messages daily on Instagram. These interactions reportedly included grooming attempts, where adults try to build relationships with minors for exploitation.
Meta responded by stating that earlier monitoring tools counted a broad range of interactions, including some that were not inappropriate.
Fortune reported Mosseri's response when he was asked whether Instagram was doing everything it could to keep teens safe:
"I think we should do what we can. I think that there's over 2 billion people on Instagram, which means there are millions of teens on Instagram. So when you say everything, I want to be clear that we are a large enough platform that sometimes some things will — so for instance, problematic content will be seen."
Algorithms and Messaging Features Face Scrutiny
Court testimony also examined the role of Meta's recommendation systems, including the "People You May Know" feature on Facebook. Prosecutors argued that the algorithm may have helped predators discover potential victims in certain cases.
The trial also tackled Meta's decision to implement end-to-end encryption for Facebook Messenger in 2023. Child safety organizations such as Thorn and the National Center for Missing and Exploited Children warned that encryption could make harmful activity more difficult to detect.
Zuckerberg defended the move, arguing that stronger privacy protections are increasingly important for users. Meanwhile, Mosseri said Meta continues developing tools to detect suspicious behavior and prevent risky accounts from contacting teenagers.
Just last month, Meta pushed back against the "social media addiction" label following a series of exploitation claims. The most notable was a lawsuit from a California woman alleging that the platform's "addictive design" had damaged her mental health.
Meta Introduces Teen Safety Measures
In response to mounting criticism, Meta launched "Teen Accounts" in 2024. These accounts automatically apply stricter privacy settings, limit who can message teens, and reduce unwanted contact from strangers.
Originally published on Tech Times