Zuckerberg calls crime inevitable on Meta platforms
New Mexico trial tests whether recommendation and encryption choices priced harm into growth; prosecutors cite an internal estimate of 500,000 children contacted daily
Mark Zuckerberg’s deposition is played for jurors in court in Santa Fe on Wednesday. Photograph: Jim Weber/AP
Mark Zuckerberg told a New Mexico jury this week that “some very small percent” of Meta’s billions of users will inevitably include criminals, arguing that no platform can ever be “perfect,” according to The Guardian. The remarks, delivered via recorded depositions alongside Instagram head Adam Mosseri, were played in a trial brought by New Mexico attorney general Raúl Torrez, who alleges Meta put engagement and profit ahead of child safety.
Meta’s “inevitable at scale” claim is less a technical point than a boundary-setting exercise: if harm is framed as statistical background noise, the debate shifts from prevention to acceptable loss. Prosecutors used Meta’s own internal estimates to show what “small percent” means in practice, including evidence that the company estimated in 2020 that roughly 500,000 children were receiving sexually inappropriate communications on Instagram each day. The court also heard that Meta identified its “People You May Know” recommendations as a major driver of these interactions, a feature used to find victims in 79% of the cases identified in 2018.
That linkage matters because it ties the alleged harm to a product decision. Recommendation systems are built to increase connections, time spent, and return visits; the costs of misfires—especially when minors are involved—are borne by families, schools, police, and child-protection groups. Meta says it has invested billions in safety and points to teen accounts with default protections introduced in 2024, but the trial record presented by prosecutors focuses on earlier years when growth features scaled faster than enforcement.
The same dynamic appears in Meta’s encryption decisions. Jurors heard Zuckerberg approved end-to-end encryption for Facebook Messenger in 2023 despite warnings from groups including Thorn and the National Center for Missing and Exploited Children that it could increase risk. Zuckerberg argued privacy was the more pressing issue; Meta says it can still act on encrypted messages when users report them. The practical consequence is that detection shifts from proactive scanning to user reporting—moving the burden from platform monitoring to victims and bystanders.
Meta’s defense is that it publishes transparency data and removes violating content at scale. The prosecution’s case, as described by The Guardian, is that Meta knew which features were producing the highest-risk interactions and still shipped them because the same features also drove growth.
In court, “inevitable” is presented as a law of large numbers. The exhibits described to jurors show it can also be a business choice about which numbers are allowed to become large.