Facebook’s Supreme Court: One Board to Rule Them All

Stephen Ragan
Feb 14, 2021 · 7 min read

Facebook has long viewed itself as a “neutral platform,” one that avoids political censorship and is driven by the narrative of “connecting the world,” a moral imperative to expand across the globe in a contemporary colonial project.

What followed from those humble beginnings has been a series of missteps. In 2016, Russia used the platform in attempts to sway the presidential election; it is still unclear how effective that effort was. In 2019, a white supremacist in New Zealand live-streamed a mass shooting. In Myanmar, genocide was propagated through the platform, and millions joined anti-vaccine groups and conspiracy-theory movements like QAnon.

As a result, Facebook took slow and tentative steps toward content moderation. What developed was a set of rules and practices administered by Facebook on an ad hoc basis, and critics called for transparency around those community standards. Often Facebook hid behind Section 230, which shields website operators from liability for content published by third parties. Increasingly, Facebook and Twitter came under scrutiny; for a long time, neither took decisive steps to patrol their walled gardens of people yelling angrily at each other through a computer screen, and one of the most prominent culprits was the former President of the United States.

The justification for not removing President Trump from the platforms usually rested on the public’s interest in knowing the stream of consciousness of their fearless leader. Then January 6th happened, and, roughly 1,450 days into a 1,460-day presidential term, Facebook and Twitter acted decisively, plugging President Trump’s loudest bullhorn.

Many hooted and hollered about First Amendment violations and the infringement of the freedom of speech of the most powerful man in the world. Now the decision to remove President Trump from Facebook has come under scrutiny, and Facebook’s Supreme Court will hear the case.

What Is Facebook’s Supreme Court?

Since its founding, Facebook has cultivated its brand meticulously. Part of that brand was being a haven of free expression on the Internet: Facebook was to be the democratization of expression, a public square where everyone had a platform to say what they wanted. This cradle of expression, built on a business model that amplifies the most engaging content, has been overrun by conspiracy theories, hate speech, and disinformation. Critics have called this a threat to democracy.

In 2019, Facebook, in its ambition to create a company state, pledged to institute an external oversight board to govern the most difficult questions it faced. Instead of paying content moderators a livable wage, it staged a PR stunt, recruiting highly qualified and overpaid ministers, including a former Prime Minister and a Nobel laureate, to review challenges to post removals and bans from the platform. The board has jurisdiction over every Facebook user in the world.

That global context is complex. In Hong Kong, activists use the platform to organize, and Facebook’s free-expression principles act as protection from the state. In Myanmar, Facebook was used to perpetuate genocide in a vile misuse of the platform; in that case, the state was the one protected by Facebook’s free-expression principles.

Through workshops around the world, Facebook sought to crowdsource beliefs about speech and gain an understanding of free expression from the global community. It seemed that where the rule of law is strong, constraints on speech are viewed less favorably, and vice versa.

In theory, the board’s decisions will be binding and can overrule even Mark Zuckerberg, who in his benevolence foresaw that CEOs shouldn’t have complete control. Because Facebook is one of the largest platforms for speech, these cases will have consequences that ripple across the world, and the ambition is that the board’s decisions will form precedent within the company as well as in other jurisdictions.

To appoint the board members, Facebook opened a public portal, which received thousands of nominations, including suggestions from political groups and civil rights organizations. In May 2020, the first board members were announced; they included Helle Thorning-Schmidt, former Prime Minister of Denmark, and Tawakkol Karman, a Nobel Peace Prize winner for her role in the Arab Spring protests in Yemen.

The board’s membership has, of course, come under scrutiny. Facebook is headquartered in Menlo Park, whose politics skew far more liberal than those of the general populace, and when Facebook sought to appoint conservatives, the decision was met with backlash. Others worry the board will serve as a tool for Facebook to co-opt advocates who would otherwise be more critical, while still others view the whole thing as a charade that allows the company to outsource its most difficult decisions.

How Do Appeals Work?

The board originally consisted of twenty members, a number that has grown to more than forty. Members are paid six figures for about fifteen hours of work a week, and the board is managed by an independent trust that Facebook endowed with $130 million. The board chooses the most representative cases and hears each through a panel of five members.

Unlike at the Supreme Court, there are no oral arguments. Rather, the individual submits a written brief arguing their case, and representatives for the company file a brief explaining the company’s position. The panel’s decision, if ratified by the rest of the members, is binding.
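A minimal sketch of that flow, assuming hypothetical names and a stubbed-out ratification step; the board’s actual procedure lives in its charter and bylaws, not in code:

```python
# Hypothetical model of the appeal flow described above. The names,
# types, and ratification logic are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class Appeal:
    user_brief: str      # the user's written argument (no oral arguments)
    company_brief: str   # Facebook's explanation of its decision

def hear_appeal(appeal: Appeal, panel: list[str], board: list[str]) -> str:
    """A five-member panel drafts a ruling; it becomes binding only
    once the rest of the board ratifies it."""
    assert len(panel) == 5, "cases are heard by panels of five"
    draft = deliberate(appeal, panel)          # panel's review (stubbed)
    return draft if ratified_by(board, draft) else "returned for review"

def deliberate(appeal: Appeal, panel: list[str]) -> str:
    return "restore post"                      # placeholder outcome

def ratified_by(board: list[str], draft: str) -> bool:
    return True                                # placeholder full-board sign-off
```

The point of the sketch is only the shape of the process: written briefs in, a panel ruling out, with a ratification gate before anything binds Facebook.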

Users can appeal cases in which Facebook has removed a post, but not those in which Facebook has left the post up. Unfortunately, many of Facebook’s most pressing issues relate to content it has left up, although Facebook says it will be ready to allow user appeals of posts that are kept up by mid-2021.

Regarding precedent, Facebook says it will remove “identical posts with parallel context.” We will have to wait and see how this functions in practice. Policy recommendations, on the other hand, are only advisory, and Facebook can still do as it pleases.

What Cases Have Been Heard?

In October 2020, the board began allowing appeals from a random five percent of users, and the docket began to fill with Nazi propaganda and anti-Muslim posts. The board reviewed the ten-thousand-word document that codifies Facebook’s speech policies and consulted precedent from international human rights cases. One issue that has developed is what standards should guide decision-making. Some argue that decisions should follow Facebook policy, while others question the validity of those policies themselves. Would decisions be based on individual principle? Would the consequences of a post be taken into consideration? How easily can cause and effect in the physical world be linked to an online post?

In an early case on misinformation about Covid-19 cures, the standard for censoring speech involved determining whether the post was likely to incite direct harm through self-medication. Because the combination of medicines in question was not readily available over the counter, the board decided the risk was low and voted to restore the post, though it encouraged Facebook to include a link to more reliable scientific information.

The big case stems from January 6th, when a group of Trump supporters stormed the Capitol at the urging of President Trump, who had repeatedly claimed on Facebook and elsewhere that the election had been stolen. Trump later released videos disavowing the violence but reiterated his claims of a fraudulent election. In response, Facebook removed two of his posts and ultimately removed Trump from the platform.

In response to Trump’s removal from Twitter, Angela Merkel, the German Chancellor, described the ban as “problematic,” and Alexei Navalny, the Russian opposition leader, tweeted, “I think that the ban of Donald Trump on Twitter is an unacceptable act of censorship.” The day after Joe Biden’s inauguration, Facebook sent its own case to the board, asking it to rule on whether Trump should remain indefinitely banned from the platform. The board now has two months to deliberate. It will be interesting to see how it rules, and what the ruling implies for other political leaders in analogous situations.

Content Moderation From the Bottom Up

It’s important to remember that the board rules on a fraction of a fraction of the difficult cases. The real power lies with the underpaid content moderators who make the calls day to day in the trenches of online conversation. It remains a real concern that Facebook’s Supreme Court will merely be a distraction.

Facebook has a community-based regulation system. No one scrutinizes content before it is uploaded; rather, users report the posts they find inappropriate, and those reports are reviewed by an eclectic global workforce, a digital proletariat for whom the work never stops.

While the median wage of Facebook’s employees is about $240,000, content moderation is outsourced to contractors whose productive time and breaks are precisely calculated. A moderator is expected to handle 1,300 reports a day, every day, which leaves only seconds to reach a decision on each report deemed too complex for algorithms; the back-of-the-envelope sketch below makes the pace concrete.
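A minimal calculation of that pace; only the 1,300-report quota comes from the reporting above, while the shift and break lengths are assumptions, since the contractors’ schedules are not public:

```python
# Back-of-the-envelope: seconds available per report for one moderator.
# The quota is from press reporting; the shift and break lengths are
# assumptions made for illustration.

REPORTS_PER_DAY = 1300
SHIFT_HOURS = 8        # assumed shift length
BREAK_MINUTES = 45     # assumed total breaks

productive_seconds = SHIFT_HOURS * 3600 - BREAK_MINUTES * 60
seconds_per_report = productive_seconds / REPORTS_PER_DAY

print(f"{seconds_per_report:.0f} seconds per report")  # ~20 seconds
```

Roughly twenty seconds per report, with no slack for the genuinely hard calls.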

A lot of the content is extreme graphic violence. In a piece in Süddeutsche Zeitung Magazin, a former content moderator wrote,

“The display of blood, mutilated and charred bodies is mere horror. I learned how to overcome my disgust and stand it.”

Was that quote about the content being moderated, or about Facebook itself? At his core, Mark Zuckerberg is probably trying to do the right thing, but this is a hard equation to get right. Caring for your employees should not be the difficult part.
