Ahead of UNESCO’s Internet for Trust conference, held this week in Paris, ARTICLE 19 maintains serious concerns about the potential implications of UNESCO’s draft Guidelines for regulating digital platforms (the Guidelines) on freedom of expression and human rights online. We urge UNESCO to seriously reconsider its plans to formally adopt these Guidelines and to refrain from encouraging their use by Member States wishing to adopt or amend their regulation of digital platforms.
The Guidelines, currently in a second version, are being developed by UNESCO following the Windhoek+30 Declaration on Information as a Public Good. This week, from 21 to 23 February 2023, UNESCO is hosting an international conference, during which the Guidelines will be discussed. UNESCO is scheduled to present the final version of the Guidelines in mid-2023, with the intention that these will then ‘support regulators, governments, legislatures and companies, dealing with content that potentially damages human rights and democracy, while protecting freedom of expression and the availability of accurate and reliable information’.
In December 2022, ARTICLE 19 published comments on the initial draft of this document, at the time called the ‘model regulatory framework for the digital content platforms to secure information as a public good’ (the Framework). While we recognised that UNESCO’s efforts in this area may be driven by good intentions, we questioned UNESCO’s mandate to elaborate such a framework and raised serious concerns regarding the lack of transparency around the evidence and reasoning behind it, as well as the consultation process that produced it. Most importantly, we submitted that UNESCO’s efforts could backfire and the framework could be used by a number of governments around the world to justify repressive internet regulations.
In the meantime, the document has been renamed ‘Guidelines’, although its recommendations and overall scope remain very similar in nature. Although the current version of the Guidelines brings some improvements, the key concerns set out in our first response remain.
Some improvements to the Guidelines 2.0
It is evident that many of the concerns raised by various stakeholders – in particular once the consultation process actually became public – were taken seriously by UNESCO and are reflected in the new version. For instance:
- The language is more centred around human rights, and in particular the need to protect freedom of expression and the right to information in platform regulation. The United Nations Guiding Principles on Business and Human Rights are referenced several times.
- Many provisions deal with transparency obligations (both on States and platforms), user rights and human rights due diligence.
- There is increased recognition of and focus on the role of content curation and recommender systems and their impact on human rights. In particular, the Guidelines provide that users should be given more control over the content they are recommended.
- States are advised to refrain from imposing a general monitoring obligation on platforms or an obligation to take proactive measures in relation to illegal content.
Key human rights concerns remain
Despite these improvements, ARTICLE 19 is unable to support the approach taken by UNESCO with this document and urges UNESCO to reconsider moving forward with this project.
First of all, ARTICLE 19 is concerned that the Guidelines do not clarify their intended objective, purpose and precise legal nature. A document that is supposed to contain soft law – and we maintain that UNESCO would lack a mandate to adopt such a soft law instrument – is very different from a simple discussion paper or a high-level political commitment. It is further not clearly communicated whether it is envisaged that UNESCO Member States will endorse the Guidelines or commit to them in any way.
This information is, however, important to assess the exact implications of the content of the Guidelines. It also remains unclear what regulatory model and what sort of enforcement is supposed to accompany the proposed Guidelines. ARTICLE 19 reiterates that in many jurisdictions, independence of the regulator is not a reality and State regulation of digital platforms is not desirable. In these contexts, the Guidelines may well be used to legitimise tightening control over online public discourse and silencing of critics online.
ARTICLE 19 also finds that some of the recommendations in the document continue to be unclear, while others are problematic. These concerns include:
- Many definitions and concepts in the Guidelines are extremely vague and could be interpreted in a subjective, arbitrary and restrictive manner. For instance, it is suggested that the regulatory system should have the power to ‘summon any digital platform deemed non-compliant with its own policies or failing to protect users’. Apart from being open to all sorts of interpretations, ARTICLE 19 notes that it is problematic to tie enforcement actions in this broad manner to companies upholding their policies. These policies often provide for restrictions of speech that go well beyond those permitted under international human rights law and enable companies to censor many categories of lawful speech that they – or their advertisers – may consider harmful, inappropriate or controversial.
- While the point is touched upon in the Guidelines, the language fails to specify whether intermediaries should enjoy immunity for user-generated content – an essential component of any regulatory framework – and what the precise extent of said immunity should be.
- The Guidelines state prominently that they focus on platforms’ systems and processes. ARTICLE 19 has long argued that this would indeed constitute the right approach. However, we note that the document still places a strong emphasis on platforms ‘dealing’ with illegal content and ‘content that risks significant harm to democracy and the enjoyment of human rights’. The latter category is defined at the end of the document as ‘hate speech’, ‘disinformation and misinformation’, and ‘content which incites or portrays gender-based violence’. All these categories lack a clear legal definition and are not a useful basis to restrict freedom of expression online. From a human rights perspective, it is for instance not justifiable to suggest that misinformation and illegal content should be treated similarly.
- The scope of the Guidelines now includes messaging apps. If a regulation were to apply the broadly-phrased obligations under the Guidelines equally to hosting services and to messaging services, it would seriously risk undermining end-to-end encryption and thus the protection of freedom of expression and privacy online.
- ARTICLE 19 reiterates that any regulation of platforms that does not take into account competition and data protection aspects will prove ineffective in limiting the harmful effects of the biggest online platforms’ business models.
While we recognise that the Guidelines are still evolving, the issues that remain to be addressed are fundamental, and it is clear that addressing them will not be possible in the timeframe proposed for their development. The Guidelines now refer to the consultations continuing ‘in the following months’, which is largely insufficient for the highly complex task of achieving consensus on human rights-respecting Guidelines that are meant to be of global relevance.
The way forward
ARTICLE 19 recognises the value in creating a global multi-stakeholder shared space for debates on platform regulation. This is indeed an important conversation to have. However, the exact purpose of the Guidelines and their intended use after their adoption remain unclear. As such, it appears that the risks for human rights online have not yet been adequately considered. We believe that the Guidelines should not be viewed as a primary output of this process or any sort of soft law instrument.
We stand ready to collaborate with UNESCO to reconsider the way forward for the Guidelines and to discuss broad principles that should guide discussions in this area.