The Facebook Oversight Board: A significant step for Facebook and a small step for freedom of expression

On 6 May, the Facebook Oversight Board (FOB) announced the names of its first cohort of 20 members, a number expected to grow to 40. It is an important step in Facebook’s development of an external mechanism to oversee its content moderation practices. Many of the newly appointed members have contributed significantly to the protection of freedom of expression and other fundamental rights globally. With their reputations at stake, one can hope that the Board will work to hold Facebook to account, and that freedom of expression will be better protected on the platform.

The FOB is an improvement on current practice, whereby social media companies decide by themselves what people get to see online, enforcing often vague content rules with little transparency. By putting difficult content moderation decisions before an independent group of experts, it may produce better outcomes for freedom of expression. The FOB is therefore an experiment worth watching.

However, ARTICLE 19 urges caution in assuming that the Board can fix the multitude of human rights issues presented by Facebook. For a start, its mandate and powers are inherently limited. Broader issues around the impact of social media on free expression, such as Facebook’s business model, content promotion and downranking, and, at least for now, political advertising, are out of bounds. Equally, the FOB is designed to address content moderation issues only on Facebook and its other platform, Instagram. Beyond its limited remit, the FOB is inevitably a means for Facebook to acquire the legitimacy it sorely needs and to show lawmakers that self-regulation à la Facebook can work. It will also help deflect attention from Facebook over difficult content moderation decisions. There is therefore every reason to be sceptical.

The FOB can only ever be a very small part of the answer to the content moderation puzzle. This is why we would like to see a more comprehensive approach to safeguarding free expression online taken by companies including Facebook, as well as other stakeholders. Changes to companies’ approaches to transparency and accountability; the unbundling of content moderation and hosting; and the development of independent oversight bodies grounded in international human rights law are all needed to provide a genuine response to the increasing risks to human rights presented by the current model of social media.

Limitations of the Oversight Board

A thorough assessment of the FOB will have to wait until it adopts its first decisions later this year. At this stage, it is important to be clear about what can be expected from this new body. The FOB has the authority to review individual content moderation decisions related to content on Facebook or Instagram (i.e., “specific posts, photos, videos, and comments”) and produce binding decisions in these cases only. By contrast, any recommended changes to Facebook’s content policies are non-binding.

The ability of the FOB to tackle the human rights questions raised by content moderation practices is also limited in a number of other important ways:

  • The FOB will apply standards ultimately decided by Facebook. The Board’s mission is to apply Facebook’s Community Standards and, more vaguely, its ‘values’: a set of rules that already fails to meet the requirements of international standards and that Facebook can revise at any moment. While the Board is urged to consider the impact of content decisions on freedom of expression, it is not asked to apply human rights standards to those decisions, which limits its ability to influence human rights protections in social media content moderation. Hopefully, the expertise in international human rights law of some FOB members will inform its work. Equally, we hope that the Board will make human rights-based recommendations that Facebook will genuinely take up. Nonetheless, the FOB is an instrument that applies Facebook’s standards, and this undercuts the claim at the heart of the Board’s creation: that it should not be up to Facebook or Mark Zuckerberg to unilaterally decide what content can circulate in the world’s largest public space.
  • The FOB’s mandate and capacity are limited. The FOB is made up of a group of part-time experts tasked with reviewing a tiny fraction of the billions of pieces of content removed from the platform on the basis of Facebook’s Community Standards. In the circumstances, the impact of the Board’s decisions on Facebook’s overall approach remains unclear. It is also uncertain how the FOB will be able to select cases effectively and identify those most deserving of its attention because they raise novel or systemic issues of public interest. In practice, for instance, decisions about hate speech are often highly context-dependent, so selecting one case as somehow worthier of attention and setting a clear precedent is likely to prove difficult. Whilst Facebook might be able to identify systemic issues through its internal appeals process and refer them to the Board, it remains unclear how the Board itself will be able to sift through potentially millions of individual appeals. Beyond case selection, several procedural issues could limit the Board’s ability to scrutinise Facebook’s decisions effectively, including the lack of reasoning provided in Facebook’s decisions and its wide discretion in applying its own rules. Although the bylaws promise a policy rationale for decisions, it is unclear how this will play out when content is removed on the basis of automated filters, for example. For decisions to be challenged effectively, their rationale must be understood. It therefore remains to be seen exactly what material the FOB will be scrutinising in deciding cases.
  • The FOB is not yet representative enough. Although the members of the FOB can collectively claim to represent a number of countries and speak an impressive list of languages, they predominantly come from US and European backgrounds, something that will hopefully be remedied in the Board’s next round of recruitment. Some have also highlighted the lack of expertise in content moderation at scale, or of vocal advocates on LGBTQI issues. As a central, global body reviewing content moderation decisions across a global platform, this small group of experts will inevitably appear a distant, out-of-reach interlocutor for the majority of Facebook users. Furthermore, the resolution of any content moderation case requires a detailed understanding of the language, culture, politics and history of the context in which the dispute takes place. In practice, will the FOB be able to develop a well-informed view of the complex circumstances of a case, in order to adopt credible, convincing decisions?

Beyond the Facebook Oversight Board

It is clear then that the FOB cannot provide a comprehensive solution to the issues of human rights protections on social media. So what else can be done? ARTICLE 19 believes there are a number of approaches to the problems of content moderation that are better tailored to ensure the comprehensive protection of free expression and other fundamental rights online.

At a minimum, fundamental changes to the approach of companies like Facebook are needed. Facebook must improve the clarity and detail of its Community Standards to better align them with international human rights law, and be much more transparent, including by sharing more comprehensive data on its content moderation decisions and processes. We are also campaigning for an expansion of, and improvements to, its appeals process, to give users more power to challenge wrongful takedowns.

Beyond this, we urge a comprehensive form of decentralisation known as unbundling. The ‘bundling’ of different services by dominant social media companies is what gives them such a significant degree of control over freedom of expression online. By controlling both the hosting of content on a platform and decisions around the moderation of that content, companies are able to exercise vast and largely unaccountable power over what we say online, while alternatives that might offer better protections for our rights are prevented from entering the market. Unbundling refers to the separation of hosting and content moderation services. Through unbundling, dominant social media platforms would still be able to moderate the content on their platforms, but they would also be obliged to allow competitors to provide competing content moderation services on those platforms. This would give users more choice about the kind of content they want to post and see on their social media, and would likely enable new providers that better protect the rights to freedom of expression and privacy to emerge.
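
To make the mechanics of unbundling more concrete, the sketch below shows, in TypeScript, one hypothetical way a host platform could delegate moderation decisions to interchangeable, user-chosen providers. Every name in it (HostPlatform, ModerationProvider, and so on) is invented for illustration; nothing here describes an actual Facebook interface or an ARTICLE 19 technical specification.

```typescript
// Hypothetical sketch of "unbundling": the host stores and serves content,
// while moderation decisions come from a user-chosen provider.
// All names are invented for illustration only.

interface Post {
  id: string;
  authorId: string;
  text: string;
}

// Every non-trivial decision carries a rationale, so it can be understood
// and appealed.
type ModerationDecision =
  | { action: "allow" }
  | { action: "downrank"; rationale: string }
  | { action: "remove"; rationale: string };

// Any competing moderation service implements this single interface.
interface ModerationProvider {
  name: string;
  moderate(post: Post): Promise<ModerationDecision>;
}

// The host keeps hosting and moderation separate: it stores posts, but
// delegates what a viewer sees to that viewer's chosen provider, rather
// than applying one house policy to everyone.
class HostPlatform {
  private providers = new Map<string, ModerationProvider>();
  private userChoice = new Map<string, string>(); // userId -> provider name

  registerProvider(provider: ModerationProvider): void {
    this.providers.set(provider.name, provider);
  }

  chooseProvider(userId: string, providerName: string): void {
    if (!this.providers.has(providerName)) {
      throw new Error(`Unknown moderation provider: ${providerName}`);
    }
    this.userChoice.set(userId, providerName);
  }

  // Decide whether (and how) a given viewer sees a given post.
  async viewPost(viewerId: string, post: Post): Promise<ModerationDecision> {
    const chosen = this.userChoice.get(viewerId);
    const provider = chosen ? this.providers.get(chosen) : undefined;
    // If the viewer has not picked a provider, fall back to allowing content.
    return provider ? provider.moderate(post) : { action: "allow" };
  }
}

// Usage: register a competing provider and let a user opt into it.
const platform = new HostPlatform();
platform.registerProvider({
  name: "strict-filter",
  moderate: async (post) =>
    post.text.includes("spam")
      ? { action: "remove", rationale: "Matched the provider's spam rule." }
      : { action: "allow" },
});
platform.chooseProvider("user-1", "strict-filter");
```

The design choice worth noting in this sketch is the rationale attached to each decision: as argued above in relation to the FOB, a removal can only be challenged effectively if its reasoning is visible, and a shared interface of this kind is one way competing providers could be held to that standard.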

In addition, ARTICLE 19 supports the creation of more independent and comprehensive bodies to oversee content moderation practices, in the form of Social Media Councils (SMCs). SMCs would be transparent, multi-stakeholder, voluntary compliance mechanisms that oversee the content moderation practices of social media platforms on the basis of international human rights standards.

While ARTICLE 19’s proposal to create Social Media Councils has similarities with the FOB, it differs significantly in that SMCs would be based on international human rights standards, fully independent of any one company, held to high standards of openness and transparency, and would operate in closer proximity to local users and the complexity of the contexts in which content moderation decisions arise.

At a moment when legislative initiatives often rely on self-regulatory mechanisms (within a legal framework of co-regulation) to address a number of complex online content issues, we believe that the SMC offers a model that fully ensures the protection of the fundamental right to freedom of expression.

Ultimately, therefore, while the creation of the FOB does mark significant progress by Facebook, it is only a small step towards the effective protection of free expression and other rights online.