HRC38: UN expert calls for “human rights by default” in regulation of online content

ARTICLE 19

ARTICLE 19 today welcomed the report of the UN Special Rapporteur on freedom of expression to the Human Rights Council (HRC), calling for States and social media companies to ensure and respect freedom of expression when regulating online content. This must be reflected in commitments States are making in resolutions under consideration at the 38th Session of the HRC, including on human rights and the Internet.  

The UN expert report coincides with the launch of a new ARTICLE 19 policy “Side Stepping Rights”, setting out recommendations to push back against privatised censorship, including State-driven demands for content-removal.

“When States seek the removal from the Internet of user-generated content, whether through legal or extra-legal pressure on online platforms like Facebook, Twitter or Google, it is censorship,” said Thomas Hughes, Executive Director of ARTICLE 19. “States cannot side-step their human rights obligations, including on freedom of expression, by deputising censorship in opaque or unaccountable ways to companies.”

“Companies too must do more to respect rights in the development and enforcement of their terms and conditions to maximise protections for free speech, and push back against State requests when they are not rights-compliant,” Hughes added.

Even in the digital age, archaic laws, including those on blasphemy, sedition, and criminal defamation, still form the basis of many governments’ efforts to control what is said and seen online. Newer forms of restriction, relating to “terrorist content”, “incitement”, and “extremism”, too often illegitimately target dissent, and must be repealed.

ARTICLE 19 shares the Special Rapporteur’s concerns at an increasing number of States pursuing measures to compel social media companies to quickly and automatically remove user-generated content, whether on the basis of its illegality or supposed “harm”. Social media companies are not courts, and States should not circumvent the due process and rights protections provided by judicial bodies through these forms of coercion. These arrangements are often opaque and without remedy – users’ expression can be censored at the behest of a government without notice of the reasons or opportunity to contest the decision.

Social media companies are also powerful actors. The Special Rapporteur identifies that, notwithstanding some recent reform announcements, they “remain enigmatic regulators, establishing a kind of ‘platform law’ in which clarity, consistency, accountability and remedy are elusive.” They can do much more.

The Special Rapporteur’s recommendations, based on the Ruggie Principles, urge a “human rights by default” approach, setting out a framework for companies to implement their responsibilities. These find support in ARTICLE 19’s “Side Stepping Rights” policy.

Where terms of service more accurately reflect international human rights standards, respecting users’ rights to anonymity and due process, business enterprises may more consistently resist government requests that breach those standards. This requires, in many cases, tightening terms of service, including on real-name registration requirements, as well as restrictions on “terrorism content” and on “hate speech.” Evidence suggests these standards are frequently applied arbitrarily, and to the detriment of minority and dissenting groups.

For users with complaints about the removal of content, or about other users’ expression, the Special Rapporteur identifies that the current systems instituted by social media companies are inadequate. They do little to protect freedom of expression. The amount of information provided to users who make complaints, or whose expression is complained about, is insufficient, in particular where the initiator of a complaint is a government. Algorithms to “promote” or “suppress” content, in particular in relation to “counter-narratives” to extremism or “disinformation”, can similarly raise concerns.

Increasing social media companies’ rights-based resilience to illegitimate censorship demands is central to ARTICLE 19’s new policy. To achieve this, we agree with the Special Rapporteur that a significant shift from corporate secrecy to transparency and accountability must happen – not just in the enforcement of rules, but also in their formulation.

ARTICLE 19 particularly welcomes that the Special Rapporteur has taken on board our recommendation that social media councils, modelled on self-regulatory press councils, be examined as an industry-wide response to concerns regarding online content moderation. This approach would, in our view, provide an opportunity to increase clarity, consistency, accountability and remedy, while protecting the right to freedom of expression.

At the 38th Session of the HRC, several resolutions are being negotiated that provide States the opportunity to commit to act on the Special Rapporteur’s recommendations. This includes, in particular, the resolution on “the promotion, protection and enjoyment of human rights on the Internet”, led by Sweden, with Brazil, Nigeria and Tunisia. The United States and Turkey, both former core group members, are no longer associated with the resolution.