The Great Indifferent Censor

Let's not mistake content moderation for moral reasoning.

The week before last, poet and artist Rupi Kaur uploaded to Instagram a picture from a photo series depicting her and her sister's periods. In short order, the Facebook-owned social network deleted it twice, and Kaur posted a fine, blistering open response on the parent company's page, offending picture attached (out of neglect or apprehension, Facebook didn't can it that time around). Neither Kaur's words nor her images need further external explanation or qualification, and the picture is back.

Kaur’s pictures, elegantly framed and composed, stand up solidly as art. They’re also not the first or only photos of period blood leaking through skirts or pants to turn up on the internet. There’s a streak, long and not proud, of sleeping exes caught in embarrassing or "embarrassing" moments without their consent; of women, and people in general, semi-conscious, defecating, vomiting, caught at their most vulnerable and unaware. This stuff has always found its way online - on the crude teenage boys’ message boards of the pre-Web 2.0 era and places like them, and more recently on toilet-cistern subreddits and 4chan boards, probably with some “Y U DO THIS” or “DAT AWKWARD MOMENT WEN” macro slapped on it.

You see this less on modern social networks, with their swift and unprecedented policing of content. But this isn’t the result of some noble mission to create a kinder, gentler internet, any more than the decision to censor Kaur’s deeply human images was the product of an agonised boardroom discussion on what passes muster. These services don’t operate, or care, that way: their business is built to bypass the very conversations you might start.

It’s true that content moderation processes capture material that has towering odds of being abusive, grossly invasive, or criminal (and in part, that’s what they’re intended to do). But all the cloying chat about keeping “x network” safe lets these networks claim a responsibility and accountability they haven’t earned. In the real world, “community standards” denotes some sort of negotiated and renegotiated consensus about what can be seen, said and heard. For the big internet players, it’s simply a vast, indifferent dredging. Some of the objectively cruel things people do to each other online might get caught along the way. So will anything that simply makes someone feel squeamish.

Instagram’s apology to Kaur made it sound like a bungle on its part. More likely, it was standard operating procedure - an initial snap decision to delete, made within five seconds, and then a second snap decision by somebody else. The context of her practice or her life experience would not have been considered. The call will have been made by one of the innumerable content moderators who perform one of the most soul-sucking tasks in the digital service economy - people following labyrinthine checklists of when nudity is too nude and when violence is too real (and, for the worst of it, their own visceral intuition) to sort through corpses, abuse, hate speech, and a couple of drops of blood.

Facebook and many other platforms have outsourced this work to developing countries like the Philippines - a Wired article by Adrian Chen from last October gives a sense of how the work is at once RSI- and PTSD-inducing. The volume of material to be dealt with is huge, and the worst of it is unrelenting. This is the irony in Kaur and Instagram’s interaction - sandwiched between child porn and beheadings in a monitored cubicle, all the genuine disgust was ironed out of the equation a long time ago. This censorship is a streamlined, desensitised indifference, carried out a world away.

Labour efficiency is a tough taskmaster. Contrast the one-click approach to reporting a consensual nude with the various tiers of difficulty a person encounters trying to remove an image of themselves they never consented to having online in the first place. If it’s unflattering or embarrassing somewhere short of that community standards threshold, an online help portal directs them to their local privacy jurisdiction, and under no circumstances to a real, actual person who works for Facebook.

If it’s in that nebulous space where it’s not an automatic red flag for some unlucky worker, but effectively amounts to abuse and harassment, there’s another administrative process to go through, one that might eventually get an image taken down but doesn’t address its source. As for the most vicious stuff? Sure, that goes. But when Facebook, Instagram, any of them, delete “revenge porn” with the wave of a cursor, it’s ultimately the “porn” part that’s operative, not the “revenge”.

Kaur’s photo, her subsequent stand, and Instagram’s rush to atone highlight the tensions in the enforcement of the community standards we all agreed to when we signed up to these services. They’re generally not our standards to amend or appeal; their enforcement favours expediency, efficiency, and exploitation. The nuances of context and intention that we get to negotiate as flesh-and-blood people are out the door. She and those who support her scored a valuable symbolic victory, but absent some great change in the balance of power between user and provider, we’re going to keep seeing stories like this again and again.

The Pantograph Punch publishes urgent and vital cultural commentary by the most exciting new voices in Aotearoa.
