Italy: ARTICLE 19 responds to AGCOM proposal for ‘hate speech’ regulation

ARTICLE 19 has submitted comments on a proposal to combat ‘hate speech’ (the Proposal) put forward by AGCOM, the Italian regulator for the postal, telecoms and media sectors. ‘Hate speech’ remains a serious concern in Italy, with a significant rise in recorded episodes of incitement to hatred on various grounds in recent years. While ARTICLE 19 welcomes the AGCOM initiative, we also raise concerns about aspects of the Proposal that fail to adequately safeguard the right to freedom of expression.

ARTICLE 19 is concerned about the definition of ‘hate speech’ in the Proposal, which contributes little to identifying the unlawful conduct to be prohibited by the new rules. The Proposal defines ‘hate speech’ by reference to three different definitions drawn from international and national instruments, none of which distinguishes between expressions of different degrees of severity; all appear to treat them alike. However, as explained on previous occasions, ARTICLE 19 argues that international freedom of expression standards require a differentiated approach to ‘hate speech’ based on its severity. Namely, (i) ‘hate speech’ that must be prohibited, including by criminal law (such as incitement to genocide or incitement to violence); (ii) ‘hate speech’ that may be prohibited (such as discriminatory or bias-motivated threats or harassment); and (iii) lawful ‘hate speech,’ which raises concerns in terms of intolerance and discrimination and merits a response, but should be protected from restrictions. We believe that the Proposal should include sufficient guarantees that lawful ‘hate speech’ is not unduly restricted.

The Proposal contains a number of criteria for legacy media and professional journalists, and in various instances calls for the adoption of ‘all appropriate measures’ or ‘all appropriate initiatives’ to reach a specific aim. For example, it requires journalists to: always pay due attention to context, distinguishing specific cases from generalisations; avoid publishing personal details of those involved that are not relevant to the story; and avoid improper associations between news items and phenomena that appear to establish a causal link between the facts reported and a specific group of individuals. ARTICLE 19 strongly advocates for the establishment and sharing of best practices for the sector, which could provide adequate guidance on such measures and initiatives. However, we believe that these best practices and ethical standards should be defined and agreed through independent self-regulatory bodies, established via a multi-stakeholder process in which all relevant actors are duly represented.

In addition, the Proposal contains specific rules for providers of video-sharing platforms. In particular, the Proposal promotes co-regulation mechanisms, to be adopted by the platforms, to counter the online dissemination, in particular on social media, of content that violates human dignity and to remove ‘hate content.’ The Proposal foresees that such measures shall include effective systems for identifying and flagging unlawful content and the users who uploaded it. ARTICLE 19 is particularly concerned by these provisions. First, the rules refer to the removal of ‘hate content’ without making the necessary distinction between lawful and unlawful content. The removal of lawful ‘hate speech’ constitutes a violation of users’ freedom of expression. Second, removal mechanisms must always include adequate procedural safeguards, such as the right to be notified of a removal and the ability to appeal the removal decision.

More generally, ARTICLE 19 is concerned that, without the guarantees necessary to protect freedom of expression under international standards, these provisions could lead to a system of constant monitoring, and possibly censorship, of online content. As repeatedly stated, ARTICLE 19 strongly opposes regulatory solutions that encourage private companies to censor content.

Finally, the Proposal calls for the establishment of an Advisory Committee, composed of five members of recognised professional and scientific competence. Its tasks include, among others: assessing compliance with the proposed rules; issuing opinions and making suggestions concerning respect for the non-discrimination principle and the distinction between ‘hate speech’ and incitement to hatred; and encouraging platforms to reach common understandings on countering the online dissemination of hate content and on promoting measures to educate users about respect for human dignity and about ‘hate speech.’

Again, ARTICLE 19 supports the establishment and sharing of common understandings and best practices in the sector. Nevertheless, this should be part of a multi-stakeholder process, in which all relevant actors are duly represented and can contribute, rather than something imposed by a few selected actors. ARTICLE 19 therefore calls on AGCOM to consider opening up the Advisory Committee to the participation of all relevant stakeholders in the sector, including civil society.

ARTICLE 19 has previously analysed the legal and policy framework on ‘hate speech’ in Italy, with a particular focus on the media. We are happy to contribute to the work of AGCOM, and we hope that it will carefully consider this contribution, address our concerns, and improve protection for freedom of expression under the Proposal.

Read the submission [Italian]