Several recent rulings from Facebook’s Oversight Board push Facebook to show greater respect for principles of free expression in its content moderation decisions. The Oversight Board is an independent body composed of 40 members who evaluate Facebook’s content review decisions; it has been colloquially described as Facebook’s “Supreme Court.” The Board reviews cases that have broad implications for Facebook’s content review process and determines whether to uphold or reverse Facebook’s original decisions, and its rulings are binding.

In recent decisions involving regulations on adult nudity and sexual activity, violence and incitement, and dangerous individuals and organizations, the Board encouraged Facebook to protect free expression within the confines of its Community Standards rather than take antagonistic or confrontational stances (Case Decision 2020-003-FB-UA). It also established precedents for providing users with detailed reasoning behind content decisions, a core demand of NCAC’s advocacy with social media platforms, and highlighted the need for Facebook to clarify vague language in its guidelines and to review content in its original context, including but not limited to the identity and intent of the user as well as the audience.

Nudity on Instagram

In Case Decision 2020-004-IG-UA, the Board overturned Facebook’s decision to remove an Instagram post that included eight images, five of which depicted visible and uncovered anatomically female nipples. Text overlays explained that the pictures were posted to raise awareness of the signs of breast cancer. Nevertheless, an automated system removed the post, citing a violation of the Community Standard on Adult Nudity and Sexual Activity, even though Facebook’s ban on female nipples includes an exception for raising awareness for medical reasons. In its decision, the Board advised Facebook to continually audit content removed by automated systems and to publish data on the number of automated removal decisions and the number reversed after human review. The case analysis noted that “the lack of proper human oversight raises human rights concerns” and reinforced that, because the policy treats male and female bodies differently, allowing automated systems to use inaccurate information “disproportionately affects women’s freedom of expression.”

The Board also highlighted the need for human review of appeals of removals made by automated systems when the content is alleged to violate the Community Standard on Adult Nudity and Sexual Activity or the Community Standards on Sexual Exploitation of Adults and on Child Sexual Exploitation, Abuse and Nudity.

This decision clarifies (and asks Facebook to clarify) that the ban on adult nudity is not absolute and that female nipples can be shown in order to raise breast cancer awareness. The Board also asks Facebook to make clear that although Instagram’s Community Guidelines should align with Facebook’s Community Standards, in the event of any discrepancy the Facebook Community Standards take precedence.

Although we appreciate that this decision shows some progress in line with the goals of our campaigns We the Nipple and Don’t Delete Art, NCAC continues to advocate for fewer restrictions on artistic nudity and a more inclusive approach to images of bodies.

Clarifying Terms in Community Standards

One of the crucial recommendations to emerge from these cases calls on Facebook to clarify the language in its Community Standards in order to ensure transparency and increase accessibility. NCAC has long criticized the use of subjective terms in these policies, including words like “inappropriate” or “offensive.” In Case Decision 2020-007-FB-FBR, the Board evaluated whether content posted in a public forum for Indian Muslims was likely to incite violence. The decision asks Facebook to publicly clarify what types of “veiled threats” are permissible and what enforcement of this restriction entails.

Reviewing Content in Context

In Case Decision 2020-006-FB-FBR, the Board reviewed Facebook’s decision to remove a post containing inaccurate information about the COVID-19 pandemic for violating its misinformation and imminent harm rule. Weighing various contextual factors, the Board overturned the removal and declared the misinformation and imminent harm rule “inappropriately vague” and inconsistent with international human rights standards. The Board suggested that Facebook create an updated standard on health misinformation that clarifies and consolidates its existing rules and defines important terms such as “misinformation.” The Board also noted an inconsistency in whether “misinformation” is characterized as opinion or as fact. The case establishes a precedent for Facebook to analyze the potential of misinformation to contribute to “imminent” harm in light of its context, including but not limited to the “status and credibility of the speaker, the reach of his/her speech, the precise language used, and whether the alleged treatment or cure is readily available to an audience vulnerable to the message.”

Allow Users to Clarify Intent

In Case Decision 2020-005-FB-UA, the Board overruled Facebook’s removal of content that included a quote incorrectly attributed to Joseph Goebbels, the Reich Minister of Propaganda in Nazi Germany, who is on Facebook’s list of dangerous individuals. To avoid confusion over the scope and enforcement of the Community Standard on Dangerous Individuals and Organizations, the Board recommends that Facebook clarify key terms in the policy, including “praise,” “support,” and “representation,” ensuring that their definitions align with Facebook’s Internal Implementation Standards. In doing so, Facebook should explain how users can make their intent clear when discussing dangerous individuals or organizations and publish a list of the organizations and individuals it designates as “dangerous.”