Media

Instagram removes Erin O’Connor pregnancy nudes again

Meta restores then reflags post as sensitive before second takedown; automated enforcement turns context into an appeals lottery

Images

A black and white image of Erin O’Connor, taken by Nick Knight, which she posted on Instagram on Mother’s Day and was deleted by the platform. Photograph: Nick Knight

Model Erin O’Connor says Instagram removed two nude pregnancy photographs she posted for UK Mother’s Day, restored them after she appealed, then flagged them as “sensitive” and removed them again minutes later. According to The Guardian, the black-and-white images were taken in 2014 by photographer Nick Knight when O’Connor was eight and a half months pregnant, and the accompanying post included a poem addressed to her son.

Meta’s message to O’Connor cited “nudity guidelines” and included boilerplate language about “freedom of expression” alongside the usual promise to keep the community “respectful and safe.” The sequence—remove, reinstate, restrict, remove again—shows how moderation now works in practice: rules are written broadly, enforcement is partly automated, and appeals become a queueing system where outcomes can flip without any new facts.

The practical standard is not simply “nudity” but “risk.” Platforms host billions of images; they cannot review context at scale without slowing the product and hiring armies of staff. So they translate ambiguous categories into machine-detectable signals (skin, nipples, poses, cropping) and then build escalation paths that prioritise cases likely to cause reputational blowback. When a case attracts press attention, it gets a human look; when the attention fades, it can fall back into automated enforcement.
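As a rough illustration of that logic, consider the sketch below. Nothing in it comes from Meta’s actual systems: the classifier score, the thresholds, and the press_attention flag are invented stand-ins for whatever signals the real pipeline uses.

from dataclasses import dataclass

# Illustrative thresholds -- assumed values, not Meta's actual policy.
REMOVE_THRESHOLD = 0.90     # auto-remove above this score
SENSITIVE_THRESHOLD = 0.60  # flag as "sensitive" above this score

@dataclass
class Post:
    nudity_score: float    # output of an image classifier (skin, poses, cropping)
    press_attention: bool  # escalation signal: is this case attracting coverage?

def moderate(post: Post) -> str:
    """Route a post through automated enforcement with a human-review escape hatch."""
    if post.press_attention:
        # Reputational risk buys a human look; context can be considered here.
        return "human_review"
    # Otherwise only machine-detectable signals decide the outcome;
    # context (art, pregnancy, Mother's Day) is invisible at this layer.
    if post.nudity_score >= REMOVE_THRESHOLD:
        return "remove"
    if post.nudity_score >= SENSITIVE_THRESHOLD:
        return "flag_sensitive"
    return "allow"

# The same image flips between outcomes as the escalation signal changes:
photo = Post(nudity_score=0.93, press_attention=False)
print(moderate(photo))        # "remove"
photo.press_attention = True
print(moderate(photo))        # "human_review" -> restored
photo.press_attention = False # attention fades
print(moderate(photo))        # "remove" again, with no new facts

The point of the sketch is the last three lines: the outcome changes while the image does not.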

That creates a predictable pattern of exceptions. Content that is commercial, stylised, or already integrated into brand-safe categories tends to survive because it is legible to both machines and advertisers. Content that is intimate, personal, or hard to classify—breastfeeding, birth, medical imagery, pregnancy nudity—can be punished because the cost of a false positive is borne by the user, while the cost of a false negative is borne by the company.
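That asymmetry can be stated as a simple expected-cost calculation. The numbers below are invented for illustration, not data from Meta; they only show why a platform minimising its own expected cost sets an aggressive removal threshold when false positives cost it little.

# Hypothetical cost model -- all numbers are invented for illustration.
# cost_fp: what the *company* loses by wrongly removing benign content
#          (an appeal to process; the real cost lands on the user)
# cost_fn: what the company loses by leaving violating content up
#          (advertiser pullback, press, regulatory exposure)
cost_fp = 1.0
cost_fn = 50.0

def expected_cost(remove: bool, p_violating: float) -> float:
    """Company's expected cost of a decision, given P(content violates)."""
    if remove:
        return (1 - p_violating) * cost_fp  # risk taken: false positive
    return p_violating * cost_fn            # risk taken: false negative

# Removal is cheaper whenever p > cost_fp / (cost_fp + cost_fn).
threshold = cost_fp / (cost_fp + cost_fn)
print(f"remove anything scoring above p = {threshold:.3f}")  # ~0.020

# Even content that is 95% likely benign gets removed under this asymmetry:
p = 0.05
print(expected_cost(remove=True, p_violating=p))   # 0.95
print(expected_cost(remove=False, p_violating=p))  # 2.50 -> removal is "cheaper"

Under those invented numbers the break-even point sits near p = 0.02: the platform removes anything the classifier thinks has even a one-in-fifty chance of violating, and the user absorbs the other forty-nine.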

O’Connor described the decision as a “double standard,” arguing at an event at London’s National Gallery that women are “hypersexualised on a daily basis” while non-sexual images of bodies are removed. The platform’s incentives do not require anyone to prefer sexualisation; they only require a preference for predictable outcomes. Advertisers buy stability, not nuance.

Meta restored the post once after media coverage, The Guardian reports. Then, within minutes, Instagram treated the same images as a compliance problem again.

The photographs did not change. The risk assessment did.