Facebook has unveiled its plan to create an independent “oversight” board to make decisions over how the network is moderated.
The firm insisted the panel, which will hear its first “cases” in 2020, will have the power to override Facebook’s own decisions over contentious material and to influence new policy.
The idea, dubbed the Facebook supreme court, will eventually comprise 40 people around the world, but will be smaller at first.
Experts have questioned the board’s independence, as well as the motivation behind the move.
“Facebook does not have a court,” said Bernie Hogan, senior research fellow at the Oxford Internet Institute. “The only vote that really counts is the majority shareholder, Mark Zuckerberg.”
He added: “Facebook’s ‘supreme court’ invokes all the pomp and circumstance of actual judicial practice without any of the responsibility to citizens.”
The board will launch with no fewer than 11 part-time members, Facebook said, and the names of those appointed will be made public – as will the results of their deliberations. The board will be paid via a trust set up and funded by Facebook upfront.
“We are responsible for enforcing our policies every day and we make millions of content decisions every week,” wrote Facebook’s chief executive, Mark Zuckerberg. “But ultimately I don’t believe private companies like ours should be making so many important decisions about speech on our own.”
How would the process work?
Facebook outlined how the board would operate in a charter published on Tuesday. The goals of the panel, as stated by Facebook, are to:
- “Provide oversight of Facebook’s content decisions”
- “Reverse Facebook’s decisions when necessary”
- “Be an independent authority outside of Facebook”
Major disagreements will be escalated to the panel once all of Facebook’s existing moderation layers have been exhausted. Facebook controls which cases are submitted to the board, although panel members will decide which of those cases to take on.
Facebook anticipated that the board would consider only “dozens” of cases a year, focusing on those where a clear decision would offer “the greatest public benefit”.
Affected users will be allowed to make their case in writing, though Facebook said it anticipated some board members may wish to speak to users “face-to-face”.
“The board’s decision will be binding, even if I or anyone at Facebook disagrees with it,” Mr Zuckerberg said. “The board will use our values to inform its decisions and explain its reasoning openly and in a way that protects people’s privacy.”
One caveat, according to the firm’s charter, is when recommendations are not technically feasible.
Facebook said the trust would be opened up for other networks to join – and fund – in future.
Why is Facebook doing this?
Facebook’s primary concern is that it doesn’t want the power it currently wields – or at least, it doesn’t want the scrutiny that power attracts. Its ability to decide what goes on its platform, the biggest network of people ever created, brings it nothing but trouble, particularly in its home country.
One recent example demonstrates the conflict Facebook faces. An anti-abortion video, deemed to contain inaccuracies by an independent fact-checking group contracted by Facebook, was removed – only to be reinstated after four Republican senators complained to Mr Zuckerberg personally, accusing the site of having a bias against conservative views.
In future, this kind of decision could be taken out of Facebook’s direct control and handed to the oversight board, which has the power to override the site’s policies – although experts predict Facebook will still bear the brunt of criticism.
“This panel is seen as an attempt to do something, but it appears to be just short of having enough teeth to make a difference,” argued Mr Hogan. “It is a way to tell critics ‘lay off, we are doing all we can’. Such a panel, while admirable, is no match for some well organised trolls or broad systemic issues.”