On 29 April 2021, ARTICLE 19 submitted feedback to the European Commission’s consultation on the Guidance on tackling disinformation. ARTICLE 19 believes that any update to the Code of Practice on Disinformation (the Code) must be made in accordance with international human rights standards. The definition of disinformation in the Code is overly broad and thus fails to protect freedom of expression. ARTICLE 19 highlights that greater transparency is needed both in companies’ transparency reporting and in their algorithms, and that companies’ business models must be addressed if disinformation is to be tackled effectively.
In our submission, ARTICLE 19 reiterates the importance of making freedom of expression a core objective of the Code. Unless freedom of expression and information are guaranteed, there is no assurance that restrictions on ‘disinformation’ will comply with freedom of expression standards.
Overall, ARTICLE 19 notes that the Code does not properly balance its objective of tackling ‘disinformation’ with the protection of freedom of expression. We, therefore, make the following recommendations:
- The definition of ‘disinformation’ is overly broad: In the Code, the definition of ‘disinformation’ extends beyond verifiably false information to include misleading information, and ‘public harm’ is defined by reference to vague concepts such as ‘security’ or ‘political processes’. In practice, it appears that the Code has been applied to ‘misinformation’, a broader category that includes the dissemination of false information without intent to cause ‘public harm’. ARTICLE 19 does not believe that the elimination of all ‘misinformation’ should be the aim of any regulation. Rather, the aim should be to develop guidance for States and companies on how to promote and protect independent and diverse media, as well as how to increase digital media literacy.
- The Guidance should address the business model of platforms: The Guidance needs to go beyond simply addressing advertising on online platforms and third-party websites. Platforms collect immense amounts of data and must therefore comply with data protection standards. The Code should explicitly reference the GDPR.
- Any ‘trustworthiness indicators’ should ensure content diversity: There is a risk that these indicators may entrench incumbent media at the expense of small newspapers or minority voices, which might become less visible. Compliance with indicators should not be made mandatory, lest it put public authorities in the position of arbiters of truth and reliable information.
- Greater transparency is needed in the way companies implement the Code: In particular, there is a severe lack of transparency regarding the algorithms that decide which content to amplify or drown out. There is also a risk that de-amplification may become a substitute for the removal of borderline content, preventing debate on controversial issues. Content curation practices also need to be more transparent, so that users understand why they are being shown particular content.