The United Nations (UN) Special Rapporteur on Freedom of Opinion and Expression, the Organization for Security and Co-operation in Europe (OSCE) Representative on Freedom of the Media, the Organization of American States (OAS) Special Rapporteur on Freedom of Expression and the African Commission on Human and Peoples’ Rights (ACHPR) Special Rapporteur on Freedom of Expression and Access to Information,
Having discussed these issues together with the assistance of ARTICLE 19 and the Centre for Law and Democracy (CLD);
Recalling and reaffirming our Joint Declarations of 26 November 1999, 30 November 2000, 20 November 2001, 10 December 2002, 18 December 2003, 6 December 2004, 21 December 2005, 19 December 2006, 12 December 2007, 10 December 2008, 15 May 2009, 3 February 2010, 1 June 2011, 25 June 2012, 4 May 2013, 6 May 2014, 4 May 2015 and 4 May 2016;
Taking note of the growing prevalence of disinformation (sometimes referred to as “false” or “fake news”) and propaganda in legacy and social media, fuelled by both States and non-State actors, and the various harms to which they may be a contributing factor or primary cause;
Expressing concern that disinformation and propaganda are often designed and implemented so as to mislead a population, as well as to interfere with the public’s right to know and the right of individuals to seek and receive, as well as to impart, information and ideas of all kinds, regardless of frontiers, protected under international legal guarantees of the rights to freedom of expression and to hold opinions;
Emphasising that some forms of disinformation and propaganda may harm individual reputations and privacy, or incite to violence, discrimination or hostility against identifiable groups in society;
Alarmed at instances in which public authorities denigrate, intimidate and threaten the media, including by stating that the media is “the opposition” or is “lying” and has a hidden political agenda, which increases the risk of threats and violence against journalists, undermines public trust and confidence in journalism as a public watchdog, and may mislead the public by blurring the lines between disinformation and media products containing independently verifiable facts;
Stressing that the human right to impart information and ideas is not limited to “correct” statements, that the right also protects information and ideas that may shock, offend and disturb, and that prohibitions on disinformation may violate international human rights standards, while, at the same time, this does not justify the dissemination of knowingly or recklessly false statements by official or State actors;
Highlighting the importance of unencumbered access to a wide variety of both sources of information and ideas, and opportunities to disseminate them, and of a diverse media in a democratic society, including in terms of facilitating public debates and open confrontation of ideas in society, and acting as a watchdog of government and the powerful;
Reiterating that States are under a positive obligation to foster an enabling environment for freedom of expression, which includes promoting, protecting and supporting diverse media, something which has come under growing pressure due to the increasingly difficult economic environment for the traditional media;
Acknowledging the transformative role played by the Internet and other digital technologies in supporting individuals’ ability to access and disseminate information and ideas, which both enables responses to disinformation and propaganda and facilitates their circulation;
Reaffirming the responsibilities of intermediaries, which facilitate the enjoyment of the right to freedom of expression through digital technologies, to respect human rights;
Deploring attempts by some governments to suppress dissent and to control public communications through such measures as: repressive rules regarding the establishment and operation of media outlets and/or websites; interference in the operations of public and private media outlets, including by denying accreditation to their journalists and politically-motivated prosecutions of journalists; unduly restrictive laws on what content may not be disseminated; the arbitrary imposition of states of emergency; technical controls over digital technologies such as blocking, filtering, jamming and closing down digital spaces; and efforts to “privatise” control measures by pressuring intermediaries to take action to restrict content;
Welcoming and encouraging civil society and media efforts aimed at identifying and raising awareness about deliberately false news stories, disinformation and propaganda;
Concerned about some measures taken by intermediaries to limit access to or the dissemination of digital content, including through automated processes, such as algorithms or digital recognition-based content removal systems, which are not transparent in nature, which fail to respect minimum due process standards and/or which unduly restrict access to or the dissemination of content;
Adopt, in Vienna, on 3 March 2017, the following Joint Declaration on Freedom of Expression and “Fake News”, Disinformation and Propaganda:
1. General Principles:
a) States may only impose restrictions on the right to freedom of expression in accordance with the test for such restrictions under international law, namely that they be provided for by law, serve one of the legitimate interests recognised under international law, and be necessary and proportionate to protect that interest.
b) Restrictions on freedom of expression may also be imposed, as long as they are consistent with the requirements noted in paragraph 1(a), to prohibit advocacy of hatred on protected grounds that constitutes incitement to violence, discrimination or hostility (in accordance with Article 20(2) of the International Covenant on Civil and Political Rights).
c) The standards outlined in paragraphs 1(a) and (b) apply regardless of frontiers so as to limit restrictions not only within a jurisdiction but also those which affect media outlets and other communications systems operating from outside of the jurisdiction of a State as well as those reaching populations in States other than the State of origin.
d) Intermediaries should never be liable for any third party content relating to those services unless they specifically intervene in that content or refuse to obey an order adopted in accordance with due process guarantees by an independent, impartial, authoritative oversight body (such as a court) to remove it and they have the technical capacity to do that.
e) Consideration should be given to protecting individuals against liability for merely redistributing or promoting, through intermediaries, content of which they are not the author and which they have not modified.
f) State-mandated blocking of entire websites, IP addresses, ports or network protocols is an extreme measure which can only be justified where it is provided by law and is necessary to protect a human right or other legitimate public interest, including in the sense that it is proportionate, there are no less intrusive alternative measures which would protect the interest, and it respects minimum due process guarantees.
g) Content filtering systems which are imposed by a government and which are not end-user controlled are not justifiable as a restriction on freedom of expression.
h) The right to freedom of expression applies “regardless of frontiers” and jamming of signals from a broadcaster based in another jurisdiction, or the withdrawal of rebroadcasting rights in relation to that broadcaster’s programmes, is legitimate only where the content disseminated by that broadcaster has been held by a court of law or another independent, authoritative and impartial oversight body to be in serious and persistent breach of a legitimate restriction on content (i.e. one that meets the conditions of paragraph 1(a)) and other means of addressing the problem, including by contacting the relevant authorities of the host State, have proven to be demonstrably ineffective.
2. Standards on Disinformation and Propaganda:
a) General prohibitions on the dissemination of information based on vague and ambiguous ideas, including “false news” or “non-objective information”, are incompatible with international standards for restrictions on freedom of expression, as set out in paragraph 1(a), and should be abolished.
b) Criminal defamation laws are unduly restrictive and should be abolished. Civil law rules on liability for false and defamatory statements are legitimate only if defendants are given a full opportunity to prove the truth of those statements and fail to do so, and also benefit from other defences, such as fair comment.
c) State actors should not make, sponsor, encourage or further disseminate statements which they know or reasonably should know to be false (disinformation) or which demonstrate a reckless disregard for verifiable information (propaganda).
d) State actors should, in accordance with their domestic and international legal obligations and their public duties, take care to ensure that they disseminate reliable and trustworthy information, including about matters of public interest, such as the economy, public health, security and the environment.
3. Enabling Environment for Freedom of Expression:
a) States have a positive obligation to promote a free, independent and diverse communications environment, including media diversity, which is a key means of addressing disinformation and propaganda.
b) States should establish a clear regulatory framework for broadcasters which is overseen by a body which is protected against political and commercial interference or pressure and which promotes a free, independent and diverse broadcasting sector.
c) States should ensure the presence of strong, independent and adequately resourced public service media, which operate under a clear mandate to serve the overall public interest and to set and maintain high standards of journalism.
d) States should put in place other measures to promote media diversity which may include, as warranted by the situation, some or all of the following:
i. Providing subsidies or other forms of financial or technical support for the production of diverse, quality media content;
ii. Rules prohibiting undue concentration of media ownership; and
iii. Rules requiring media outlets to be transparent about their ownership structures.
e) States should take measures to promote media and digital literacy, including by covering these topics as part of the regular school curriculum and by engaging with civil society and other stakeholders to raise awareness about these issues.
f) States should consider other measures to promote equality, non-discrimination, inter-cultural understanding and other democratic values, including with a view to addressing the negative effects of disinformation and propaganda.
4. Intermediaries:
a) Where intermediaries intend to take action to restrict third party content (such as deletion or moderation) which goes beyond legal requirements, they should adopt clear, pre-determined policies governing those actions. Those policies should be based on objectively justifiable criteria rather than ideological or political goals and should, where possible, be adopted after consultation with their users.
b) Intermediaries should take effective measures to ensure that their users can both easily access and understand any policies and practices, including terms of service, that they have in place for actions covered by paragraph 4(a), including detailed information about how they are enforced, where relevant by making available clear, concise and easy to understand summaries of, or explanatory guides to, those policies and practices.
c) In taking actions covered by paragraph 4(a), intermediaries should respect minimum due process guarantees, including by notifying users promptly when content which they created, uploaded or host may be subject to a content action and by giving the user an opportunity to contest that action, subject only to legal or reasonable practical constraints, by scrutinising claims under such policies carefully before taking action, and by applying measures consistently.
d) The standards outlined in paragraph 4(b) should, subject only to legitimate competitive or operational needs, also be applied to any automated processes (whether algorithmic or otherwise) run by intermediaries for taking action either in relation to third party content or their own content.
e) Intermediaries should support the research and development of appropriate technological solutions to disinformation and propaganda which users may apply on a voluntary basis. They should cooperate with initiatives that offer fact-checking services to users and review their advertising models to ensure that they do not adversely impact diversity of opinions and ideas.
5. Journalists and Media Outlets:
a) The media and journalists should, as appropriate, support effective systems of self-regulation, whether at the level of specific media sectors (such as press complaints bodies) or at the level of individual media outlets (ombudsmen or public editors), which include standards on striving for accuracy in the news, including by offering a right of correction and/or reply to address inaccurate statements in the media.
b) Media outlets should consider including critical coverage of disinformation and propaganda as part of their news services in line with their watchdog role in society, particularly during elections and regarding debates on matters of public interest.
6. Stakeholder Cooperation:
a) All stakeholders – including intermediaries, media outlets, civil society and academia – should be supported in developing participatory and transparent initiatives for creating a better understanding of the impact of disinformation and propaganda on democracy, freedom of expression, journalism and civic space, as well as appropriate responses to these phenomena.