Can Trans People Police Themselves? Survey Says “Maybe?” by VJ

The subreddit “r/traaaaaaannnnnnnnnns” is billed as “Trans people making fun of themselves, others, and the situations they find themselves in with memes, gifs, and videos.” The site has guidelines set in place to help foster that sort of community engagement and to ensure that everyone who posts, or interacts with posts, will be in a safe environment. The posting guidelines, as listed on the site, are:

1. Absolutely no bigotry of any kind.

2. Bigotry towards non-binary people is transphobia.

3. No Spam.

4. Posts must relate to being trans or trans issues.

5. Selfies are not memes.

6. Don’t be a dick.

7. Have fun!

8. No trolling/stalking/harassment.

9. No Personally Identifiable Information Allowed.

At face value, these guidelines are important for a community such as this. A person’s gender identity is a deeply personal attribute and should be respected. Guidelines like these help keep TERFs (Trans-Exclusionary Radical Feminists) and Truscum (trans people who hold that dysphoria is necessary in order to be trans or to transition) from posting on the forum. Limiting posters’ ability to post hurtful messages provides a level of safety for visitors, but who makes those decisions? As on many subreddits, a faceless, anonymous administrator polices the posts to eliminate anything potentially offensive. That fact alone prevents this space from calling itself a truly democratic online space. The question at the base of my project is: does giving posters the ability to police content themselves help or harm a space such as this? Is democracy in a forum like this really something to be sought after?

TERF and Truscum rhetoric is often not hard for the average trans person to spot and recognize. Given that, why not allow visitors to the page to have a say in the posts? The issue is how much harm a post can do before enough people decide they are offended by it. For a democratic society to function, the majority has to rule. Is it possible for the majority to rule on a platform that claims 93,000 posters and roughly 1,000 active visitors at any given moment? Answering that question would require a drastic shift in how the site tracks post engagement and the number of visitors at a given time. And that is before the questions of what constitutes a majority, what the time frame for a decision would be, and whether handing that kind of power to the average poster creates an environment in which posts get removed for petty reasons rather than legitimate social and political ones.

To critique this site and assess its democratic nature (and the necessity of that democracy) for MALS 72200, I brought the shitposts to the real world. I gathered a few trans friends and asked them to create content for this subreddit in real time, face to face. The experiment was divided into two stages. The first stage functioned the way the site does: I gave out the guidelines and could refuse any post for my own reasons, as long as I could justify the refusal with the posting guidelines. The second stage operated in a truly democratic fashion: after the posts were made, the room decided what, if anything, should be removed, based on the posting guidelines. Before discussing the outcomes, it is worth noting that a study like this is automatically somewhat skewed, because everyone assembled in the room had at least some level of relationship or familiarity with the other creators. While that is not an unheard-of situation in a digital space (people make online connections every day), it does allow participants to tailor their posts to the individuals in the room if they so choose.

With six active participants, a majority was considered to be at least four votes. Votes were a simple “yes” or “no” on whether the post should be allowed. To clarify, “posting” in this experiment does not mean the content was actually posted to the site; it means that in this physical space, which stood in for a digital one, the post would have been allowed to exist in that theoretical digital space. In round one, I had each participant create a meme. There was no time limit, and the only instructions given were the posting guidelines listed above. Once the posts were created, I gathered them and chose which memes would be presented to the group, with no reasoning given for why some were approved and others were not. I approved four posts and shared them with the group. The reactions from those whose posts were not approved were what you would expect: the creators wanted to know why. The only response given was “this post violated community posting guidelines,” with no other context. Unsurprisingly, people were not pleased to have their content excluded without feedback, and the group as a whole was noticeably less enthused following those results.

Round two was structured the same way, except that my power to eliminate posts was removed. After that round was completed, the posts were presented without context as to why the creator had made the post the way they did. When the vote was opened, no posts were removed. Every vote was a unanimous approval, and the morale of the group was noticeably higher following this round of presentation.
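For readers who like to see the decision rule spelled out, here is a minimal sketch of the round-two peer vote, assuming six voters and the four-vote majority threshold described above. It is purely illustrative (written in Python for this write-up), not anything the subreddit or the experiment actually ran.

```python
# Toy model of the round-two decision rule: six peer voters,
# and a post stays up only if at least four of them vote "yes".
MAJORITY_THRESHOLD = 4  # majority of six voters

def peer_vote_outcome(votes):
    """Return True if the post is approved by a peer majority.

    `votes` is a list of booleans: True for "yes", False for "no".
    """
    yes_votes = sum(1 for vote in votes if vote)
    return yes_votes >= MAJORITY_THRESHOLD

# The unanimous approvals observed in round two:
print(peer_vote_outcome([True] * 6))                                # True
# A hypothetical 3-3 split would fall short of a majority:
print(peer_vote_outcome([True, True, True, False, False, False]))   # False
```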

To conclude the experiment, I asked the participants to share their thoughts on what had just transpired. The names of participants have been changed to preserve anonymity. Carly, a participant whose meme was removed in round one, said,

“I found that I was more relaxed and artistic when I felt like the only people whose approval I needed were my peers in the room. I knew who I was creating content for and it allowed me to be more creative. In the first round, I was definitely nervous about my meme getting removed because of the inability to provide context or to even know why I had been censored. I felt like I was being policed, which is never fun.”

For Carly, the knowledge that an outside source, who may or may not understand the artist’s intentions, could veto content was limiting. Sybil, a participant whose post was not removed in round one, had a similar take, but with an important caveat,

“It’s definitely more freeing to know that your peers will be deciding if your post gets approved or not as opposed to an authority figure digging through them before anyone else could see them. I can see how allowing the posters on the site to have the ultimate power could be problematic though, especially with how easy it is for people to infiltrate an online space and transform it. Like, all it would take is enough TERFs to go to a page and then fill a friendly trans-specific space with hurtful nonsense and then your safe space is now a TERF space, which no one would be cool with.”

Sybil’s caveat reveals the issue at the base of most, if not all, online community spaces: how do you prepare for a possible invasion of the space by those with destructive motives without having some sort of administrator keeping an eye on content? To do so, you would have to institute a vetting procedure to ensure that the only people allowed to post are those with the right intentions for the space. But if you institute a policy like that, you are back to policing content; it just has a different look. It also raises the question of how willing the average visitor to the site would be to go through a vetting process just to shitpost. Most visitors would not want to bother; they would take their memes elsewhere. Part of the motivation behind creating memes is the ability to act on a whim and have people interact with the post instantly.

The results of this experiment were conclusive in the sense that everyone involved liked being able to vote on the content themselves. Where the study lost some of its steam was when Sybil voiced their concern about the infiltration of the space. It is a valid concern, and it supports the idea of some form of mediator who can stop actions like that in their tracks. That concern, however, also supports the idea that a space like r/traaaaaaannnnnnnnnns can never safely operate in a truly democratic way, given the nature of its content. When the content under discussion is, for better or worse, divisive in the combined public consciousness, the ability to run free is also an invitation for hijacking. The experiment as conducted seems to prove one fact more conclusively than others: democracy is preferred, but only if everyone is on the same page. In an online space that anyone can access from anywhere in the world at any time, it is impossible to guarantee a safe space without some form of policing or management. The overpowering number of variables prohibits a truly safe, free space for content and ideas online.