Queer Art Goes to Die in the Gray Area Between Instagram’s Community Guidelines

My name is Mitchell Allison. I’m a visual artist with a specific interest in queer identity: the politics of sex, desire, and the body.

I was raised in a very conservative and modest household in the South, where talk of sex was taboo, desire (straight or otherwise) was restrained, and the shape and size of my body were a regular topic of discussion. As I grew older I found freedom in reveling in the sheer “otherness” of my queer identity. In my work I often use whimsy and absurdity to question the standards and moral guidelines I was raised under, with the intention of spurring the same questioning in the minds of my audience. In one photo series I used glass vases filled with water to distort the human form into something un-sexual—an objective blob of limbs and joints void of desire. In another I used practical makeup to turn the thirst trap on its head, taking something erotic and making it unsettling. No matter the angle I come from, my intention is to evoke the same conflicting feelings of confusion, intimacy, and power. I collaborate heavily with the models I shoot to reach a final image that shares a piece of each of us, and that autonomy of my subjects is part of what I love about the images I create.

All of this is to say, I was surprised to wake up the other day, open Instagram, and see this:

I scrolled to see if there was any further information—what had been taken down? What had it been reported for? Was it possible to appeal the decision? The page was brief, stating that my posts must abide by a standard “appropriate for a diverse audience”: no intercourse, no genitals, no female nipples, no nude children. If I violated these rules again my account would be taken down.

There was only one button at the bottom of the page: “OK.”

My initial response was confusion. Beyond the politics of body policing and the opaque sexism behind the phrase “female-presenting nipple,” I didn’t know what I had done to violate these guidelines; anything posted to my feed already had large bars begrudgingly placed over genitalia or female nipples. I also couldn’t find any more information about the decision to remove my work, or any channel to appeal it. I cross-referenced my page with my archive and the photo files on my computer, but among the hundreds of photos I’ve worked on, I couldn’t pick out what had been removed. I went to the Help page and reported two problems, one as “General” and another as “Something Isn’t Working,” both requesting more information about what had been reported. Eventually I couldn’t help but laugh at the sheer irony that my photos on the absurdity of the sexual gaze had been reported as pornography.

So I turned to the internet for guidance, and the internet had plenty to give. “Instagram deleted my post,” says an anonymous post on a forum. “How do I find out what they removed and why?” The general answer? You don’t. “If you have used any content which is the copyright of some party then Instagram may remove your post,” one user says. “They have very strict community guidelines. We are the premium partner of Instagram and they won’t even allow us to post a picture that doesn’t fit with their community guidelines,” says another. One person suggests reaching out to Instagram’s Facebook page. I finally came across one response that seemed substantial: Instagram essentially has a vague blanket set of guidelines and enforces them strictly, leaving a large gray area for interpretation. Further, whenever Instagram removes a post from your page “they also immediately shadow ban you,” the post states. “Shadow Ban is the case of Instagram suppressing your posts from being seen by many others. This is technically known as your impression. As your impressions are suppressed, the less likes your posts get.”

I did a bit more digging and found a TechCrunch report from a few months earlier on Instagram’s guidelines and how they are enforced. In the report, Instagram’s product lead for Discovery, Will Ruben, is quoted as saying, “We’ve started using machine learning to determine if the actual media posted is eligible to be recommended to our community.” So what does that mean for daily use of the app? The report goes on to say that Instagram is training its content moderators to label content that is “borderline” in violation of guidelines, so that those labels can be used by an algorithm to learn what is and is not “recommended.” What constitutes “borderline” content? That question is left vague beyond stating that the effort aims to reduce content that is “inappropriate but [does] not go against Instagram’s community guidelines” of no sex, no genitals, no female nipples.
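To make that pipeline concrete, here is a toy sketch of what the report describes—this is my illustration, not Instagram’s actual system, and every caption, label, and threshold in it is invented. Human moderators label example posts as “borderline” or not, a model learns from those labels, and posts the model scores above a threshold stay up but are quietly excluded from recommendation surfaces such as Explore and hashtag pages:

```python
from collections import defaultdict

# Hypothetical moderator-labeled training data: (caption, marked_borderline).
# Note the last label: whatever bias the moderators hold, the model learns.
labeled_posts = [
    ("sunset over the beach", False),
    ("my new painting of a vase", False),
    ("figure study nude model bars over everything", True),
    ("thirst trap but make it unsettling", True),
    ("brunch with friends", False),
    ("queer body art nothing explicit", True),
]

def train(posts):
    """For each token, learn the fraction of labeled posts containing it
    that moderators marked borderline."""
    seen, flagged = defaultdict(int), defaultdict(int)
    for caption, is_borderline in posts:
        for token in set(caption.lower().split()):
            seen[token] += 1
            if is_borderline:
                flagged[token] += 1
    return {t: flagged[t] / seen[t] for t in seen}

def borderline_score(model, caption):
    """Average the learned rates of the caption's known tokens."""
    known = [model[t] for t in caption.lower().split() if t in model]
    return sum(known) / len(known) if known else 0.0

def eligible_for_recommendation(model, caption, threshold=0.5):
    """The post stays up, but a high score silently removes it from
    recommendations -- the 'shadow ban' users describe."""
    return borderline_score(model, caption) < threshold

model = train(labeled_posts)
print(eligible_for_recommendation(model, "sunset with friends"))  # True
print(eligible_for_recommendation(model, "nude figure study"))    # False
```

The sketch shows why the system feels arbitrary from the outside: nothing in the written guidelines changes, yet a post’s reach depends on how statistically similar its content is to whatever moderators happened to label—and those labels are never shown to the user.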

So the issue becomes: who decides what is and is not “appropriate”? While regulating spam and illegal content is crucial, if a post abides by community standards, Instagram using its platform to suppress content via moderators—or worse, an algorithm—sets a lazy and dangerous precedent. It would be one thing if the process to appeal these mistakes were transparent, as platforms such as Tumblr have (somewhat unsuccessfully) attempted. But when I reached out to friends and other artists to hear about their experiences with moderation and censorship on Instagram, that didn’t seem to be the case.

“The photo was of me in a robe with nothing underneath,” says Kevin Poole, regarding a photo he posted on a private Instagram page with only a few of his close friends following, “and then the next time I tried to log in, it told me my ‘account had been disabled for violating our terms.’” Poole says he reached out to Instagram Support multiple times with no response, and he is unable to log into any account on Instagram, including a business account he runs as a social media coordinator.

Two others—McKenzie Goodwin and Miny DuPonte—had issues with sponsored posts and stories being removed or not circulated. Goodwin, the cohost of a live comedy show titled Two Dykes and a Mic, says she’s had multiple posts and stories that feature the title of her show removed, and her repeated requests for appeal have fallen on deaf ears. “We are an LGBTQ inclusive and safe show and have even performed at multiple Pride events, but it hasn’t helped our cause at all,” says Goodwin. DuPonte, a lesbian-identified musician and songwriter who performs under the stage name Miny, says she had a promotion for an upcoming show denied multiple times with no explanation as to why—her only guess being that the phrase “Calling all gays” in the promotion may have been flagged.

Well, if you’re going to enforce vague community guidelines and suppress “borderline” content, this moderation should be uniform across the board, right? Not necessarily, says Andrew Harper, who runs the account @gaytona.beach, where he (with consent) posts racy photos sent to him on the hookup app Grindr, photoshopping the conversation bubbles over the bits not appropriate for a “diverse audience.”

Harper says he’s repeatedly had posts taken down, but the frustrating part is that when he edits and reposts, this time covering up far more of the body, the posts are often still removed. On the opposite end, he says, what is and isn’t taken down sometimes seems completely random, as he was able to get away with posting this photo:

“It’s so frustrating (and demeaning to me),” says Harper, “that Instagram seems to be very efficient at removing things it deems ‘inappropriate’ under uncomfortably vague guideline wording—I wholeheartedly believe it disproportionately affects the queer community as well.” Harper goes on to point out the arbitrary nature of this moderation, citing a recent promotional campaign on Instagram for Kim Kardashian West’s KKW Body fragrance, which features her body fully on display, with only her hands covering her nipples and breasts:

Regardless of one’s opinion on how much or how little one should be able to share on the internet, it’s clear Instagram’s vague “guidelines” are failing, and disproportionately affecting, many of the groups that make up its “diverse audience.” Beyond arbitrary and questionable rules on nudity and sexuality, by using machine learning to censor content and control its reach, Instagram is painting in broad strokes, and in doing so it is failing the creators who use the platform, specifically those from marginalized communities whose content is more likely to be flagged as inappropriate for no reason other than being of that community.

As of the time of publication, Instagram has not responded to my multiple requests for appeal.

By Mitchell Allison
