In the aftermath of the horrific Orlando shooting, discussions of the attack occurred predominantly on social media. However, as is increasingly the case after a public tragedy, one of these discussions was about social media itself. This debate revolved around social media's censorship of the information about, and discussion of, Sunday's events. The main website in question was Reddit, although Twitter also came under fire yesterday for allegedly suspending the account of conservative commentator Milo Yiannopoulos. The exact reason for Yiannopoulos' suspension is unclear, but many suspected it was because he had been particularly vocal in his attacks on Islam in the wake of the Orlando shooting. The internet's beef with Reddit is, similarly, about its alleged suppression of discussions about Islam and the religion's link to the shooting. Users claim the censorship was so sweeping that no discussion was allowed at all.

Indeed, as the story of the attack was developing on Sunday, many Redditors turned to the popular "r/news" forum for updates. The Washington Post reports that what these users found was "thousands of deleted posts, furious Redditors accusing individual moderators of censorship, and a subreddit that appeared to be bleeding subscribers."

Although the subreddit has guidelines against posts that are "racist, sexist, vitriolic, or overly crude" or "unnecessarily rude or provocative," the moderators of the forum went far beyond merely enforcing those rules. Posts containing actual news updates, such as articles reporting that the Islamic State had claimed "responsibility" for the attack, were allegedly deleted. The Washington Post sums up the resulting fury:

Everyone seems to agree that something went very wrong for r/news on what should have been a crucial day for the subreddit. But the reason for that is the subject of heated debate, with the sides of the argument falling along the now familiar fault lines of Reddit’s ongoing culture war between free speech absolutists who believe Reddit has become too “politically correct” and those who believe the site needs to do more to limit the frequency and spread of hateful, harassing and abusive speech on the platform.

Amid this discussion, it is worth remembering that because the rules governing censorship remain entirely in the hands of the social media sites themselves, complications in their enforcement are inevitable.

In 2014, Marjorie Heins, a friend of NCAC, wrote an article for the Harvard Law Review titled "The Brave New World of Social Media Censorship." In the piece, Heins explores the relationship between the law and online expression, specifically in relation to Facebook. She writes:

“Facebook,” as Jeffrey Rosen has said, wields “more power [today] in determining who can speak . . . than any Supreme Court justice, any king or any president.” Facebook’s “Statement of Rights and Responsibilities” provides: “You will not post content that: is hate speech, threatening, or pornographic; incites violence; or contains nudity or graphic or gratuitous violence.” Within Facebook’s broad ban, it’s true, reside a few categories of speech that the First Amendment does not protect: threats, incitement, and some subcategory of pornography that might be constitutionally unprotected obscenity under a local community standard. But there is no judicial determination of illegality – just the best guess of Facebook’s censors. Facebook’s internal appeals process is mysterious at best.

And most of what Facebook proscribes is protected by the First Amendment. Its power to suppress nude images, to take one example, has a huge potential impact on communications about visual art — including cinema and photography — as well as sex education and discussions of sexual politics. Similarly, judgments about what content is gratuitously violent or hateful toward a religious or ethnic group can vary widely, and the result will be subjective and unpredictable censorship of literature, art, and political discussion.

Professor Ammori tells us that Facebook lawyers have created “a set of rules that hundreds of employees can apply consistently without having to make judgment calls.” The details of these rules, however, we do not know. Unlike censorship decisions by government agencies, the process in the private world of social media is secret.

It is probably true that Facebook has a First Amendment right to censor whatever it wants in order to maintain the kind of social space it wants. Facebook is not, and arguably should not be considered a common carrier, and thus it should not be forced into a legal strait jacket that would prohibit content-based terms of service. As several students in my 2013 censorship class at NYU pointed out when we debated this issue, it is a competitive social-media world out there, and if people are unhappy with Facebook’s censorship policies — or YouTube’s, or Tumblr’s, or Twitter’s, for that matter — they will gravitate to another site.

Thus, like Facebook, Twitter and Reddit operate without "judicial determination[s] of illegality," and although a site like Reddit makes its rules for unacceptable content clear, enforcement of those rules remains "the best guess of [Reddit's] censors." The anger that Reddit provoked in the aftermath of the Orlando tragedy was clearly due to the "subjective and unpredictable" decision-making of the site's moderators, who were overwhelmed by the volume of content and evidently overstepped the mark while monitoring the discussions. Indeed, as one volunteer moderator posted on the site late on Sunday, "To be clear, we know that mistakes were made."

Social media sites can attempt to establish and enforce a definitive and fair procedure for censoring problematic content; however, because the decision to censor something shared is ultimately subjective, that procedure will inevitably fail to please everyone. It seems the best a website can do is understand its own identity and establish censorship rules based on the type of audience it wants to attract. As Heins notes, failure to define one's target audience will result in a site's waning popularity in the hyper-competitive social media marketplace. Indeed, last year NCAC reflected on how Reddit is increasingly straying from its free speech roots. r/news' plummeting subscriber count is no doubt one consequence of that shift.