Facebook oversight board: Recommendations for human rights-focused oversight

ARTICLE 19

In November last year, Mark Zuckerberg announced Facebook’s plan to create an ‘independent oversight body’ to review decisions about the removal of content from the platform, in response to ongoing concerns about content moderation. This was followed in January by the publication of a draft Charter setting out initial proposals for how the Oversight Board would operate. Facebook announced that it would hold global consultations to elaborate on this proposal and prepare for the creation of the Board; these consultations have been taking place this year.

ARTICLE 19 continues to advocate for the self-regulation of social media platforms through the development of open, participatory and accountable multi-stakeholder bodies, working across platforms and at national level. We believe this model, which we call Social Media Councils, is the best approach for the protection of human rights, including freedom of expression, in relation to content moderation.

However, the proposal put forward by Facebook is an important first step in acknowledging the need for, and developing, mechanisms that reinforce the protection of online freedom of expression and other rights. As part of the company’s process of consultation and development towards setting up the Oversight Board, ARTICLE 19 offers the following recommendations:

Composition of the Oversight Board

  1. The Board must be comprised of a diverse group of experts, representing a range of different views and experiences.

The Oversight Board must represent the whole diversity of society, including marginalised groups and those targeted for discrimination. We believe that consideration should be given to creating these boards at the national level, where it would be more feasible to identify the appropriate range of actors to be represented to ensure legitimacy and credibility. At a minimum, in addition to gender parity, the Oversight Board should include members of groups including LGBTQI people, those from different regions, ethnicities, cultural origins, and religious and philosophical perspectives. The Board must also include international human rights experts, to ensure that decisions protect human rights and comply with international standards which companies like Facebook must abide by in their activities.

  2. Members of the Board should be selected in a transparent and independent manner.

Given the need for genuine independence if this body is to work, the selection of the members of the Oversight Board should not remain in the sole hands of Facebook. To ensure credibility and transparency, the selection process should involve the participation of others, for example through a public consultation on a shortlist of applicants before the final choice is made, or by having an independent panel select the first cohort. Equally, only the Board itself should have the power to remove members, and the reasons and procedure for removal should be clearly defined in the Charter. If Facebook holds the power to remove a member, this will damage the credibility and independence of the Board.

Independent review of content decisions

  3. The Board should be in charge of the selection of cases, and the procedures for bringing cases and obtaining evidence should be set out in detail.

Allowing the Board to hear only individual complaints or cases selected by Facebook would undermine its effectiveness, independence and legitimacy. The Board should be able to select the cases it wants to hear (i.e. to control its docket, like a court), drawing primarily on complaints from users and, possibly, on cases referred by Facebook. Conditions for the admissibility of individual complaints, such as time limits, should be specified, and it should be possible for individuals to be represented by an organisation or lawyer, or for aggregated or ‘class action’ complaints to be made. We note that privacy interests may be at stake and that some personal information might have to be anonymised. Nonetheless, the Board should be able to request additional information from Facebook when it deems this necessary for the resolution of a case, and expert third-party opinions should be admissible.

  4. The Board must have a clear and public code of ethics.

The suggestions Facebook has made so far, such as preventing current or former Facebook employees from serving and prohibiting incentives and external lobbying of members, are an important start. However, to ensure impartiality, transparency and credibility, the standards of ethics applicable to Board members, including those on independence and conflicts of interest, should be detailed in a specific code of conduct that must be made public.

  5. The Charter or ‘rules’ to be applied by the Board must be based on internationally recognised human rights standards.

From the draft Charter, it is not clear what rules the Board should apply when reviewing content decisions: the Terms of Service? The Community Guidelines? A separate ‘set of values’? We believe that the Board should be tasked with ensuring compliance with international human rights standards, including those on freedom of expression. These universal rights have global reach and legal recognition, which gives them greater legitimacy, and there is a wealth of authoritative interpretation detailing their meaning and requirements. While Facebook is not legally bound by international human rights law, the UN Guiding Principles on Business and Human Rights set out the responsibility of companies like Facebook to respect human rights. This means ensuring that their Terms of Service and Community Standards are fully in line with human rights standards, which require them to be, among other things, sufficiently clear and accessible.

  6. The Board must be fully independent and able to make genuine, impartial and recognised decisions.

In the current proposal, Facebook is ‘ultimately responsible for making decisions related to policy, operations and enforcement’. We are concerned that this expresses an intent to exempt Facebook from any meaningful commitment to the independent oversight mechanism. Facebook should commit to implementing the decisions of the Board in good faith, in order to make it a fully effective oversight mechanism that can address concerns around content moderation on the platform.