ARTICLE 19 comments on new Italian regulation on ‘hate speech’

ARTICLE 19 provides comments on the final version of the regulation on ‘hate speech’ issued by AGCOM, the Italian regulatory agency, following an earlier public consultation. ARTICLE 19 welcomes the improvements AGCOM has introduced in the final text, including several of our recommendations. However, a number of concerns from a freedom of expression perspective remain. ARTICLE 19 therefore calls on AGCOM to take action to ensure that the regulatory framework adequately complies with international standards on freedom of expression.

Background

In May 2019, the Italian Regulator for the postal, telecoms and media sectors (AGCOM) issued a Regulation on respect for human dignity and the principle of non-discrimination in relation to hate speech (the Regulation), contained in Resolution 157/19/CONS. The Regulation establishes new rules on ‘hate speech’ for audiovisual and radio service providers, as well as for providers of video-sharing platforms. It was preceded by a public consultation in which ARTICLE 19 participated.

The Regulation responds to the significant problem of ‘hate speech’ in Italy, previously highlighted by ARTICLE 19. This includes the incendiary tone used by political parties and movements in public debates about migrants, and biased media reporting on issues related to diversity and minority groups. The problem is exacerbated by the attitude of the leading political class, who have shown acceptance of, and even open support for, hateful or discriminatory statements.

In the explanatory memorandum to the Regulation, AGCOM acknowledges that the topics discussed on media and radio services are increasingly polarized and pushed towards extremes on social media, which constitutes the main source of information and the main channel of expression for part of the country’s population. It also recognizes that social media is often used to fuel disinformation strategies aimed at sustaining ‘hate speech’ or spreading false, instrumental and discriminatory representations of various complex phenomena. For this reason, pending the implementation by the legislators of the Revised EU Audiovisual Media Services Directive (2018 AVMSD), AGCOM has included providers of video-sharing platform services within the scope of the Regulation.

The relevant Italian regulatory framework (the Gasparri Law) provides that audiovisual media services must guarantee “the airing of programmes that respect the fundamental rights of the person” and must ban programmes that “contain incitement to hatred on any grounds or that, even with regard to the time of transmission, may harm the physical, psychological or moral development of minors.” The subsequent Law on Radio and Audiovisual Media Services (the Consolidated Act) further extends the ban to “programmes that instigate intolerant behaviour based on differences of race, sex, religion or nationality.” The print press and social media platforms are not subject to these laws; the print press is regulated by the Press Law.

AGCOM does not have legal powers to intervene and sanction broadcasters based in the Italian territory in cases of non-compliance with these laws; however, it can do so for providers based in other EU Member States. The Gasparri Law and the Consolidated Act grant AGCOM the power to issue warnings and pecuniary sanctions only when violations of the relevant provisions concern the content of programmes aired during the protected time band (between 4pm and 7pm), during which minors are afforded additional protections.

To fill this gap, in September 2016 AGCOM approved the Guidelines on Respect for Human Dignity and the Principle of Non-Discrimination in Programmes related to News, Information Analysis and Entertainment. The Guidelines are intended as an instrument of ‘moral persuasion’ and promote a constructive, rather than a punitive, sanctions-based approach to the broadcast media.

With regard to ‘hate speech’ on social media, AGCOM does not have any legal power to regulate or impose sanctions for offending content hosted by online intermediaries or platforms, since such content was not included in the definition of ‘audiovisual media services’ under the previous AVMSD, implemented in Italy since 2012, and the legislators have not yet implemented the 2018 AVMSD. Nevertheless, in 2014 AGCOM set up the Permanent Observatory of Guarantees and Protection of Minors and of the Fundamental Rights of the Person on the Internet. Its tasks include monitoring ‘incitement to hatred, threats, harassment, bullying, hate speech and the dissemination of deplorable contents.’ It is also tasked with drafting guidelines for the adoption of self-regulatory codes of conduct by Internet companies and social media platforms.

However, the framework currently in place has failed to deal adequately with the phenomenon of ‘hate speech’ in the country. Faced with this growing concern, AGCOM decided to intervene with the Regulation.

ARTICLE 19’s comments

Scope of application and definitions

In its submission to the consultation on the Regulation, ARTICLE 19 warned against a broad definition of ‘hate speech’. We welcome the fact that the definition has been narrowed in the final text. Nevertheless, we recall that any prohibition of ‘hate speech’ should comply with the international framework. Namely, ‘hate speech’ can be divided into three categories, distinguished by the response that international human rights law requires from States:

  • Severe forms of ‘hate speech’ that international law requires States to prohibit, including through criminal, civil, and administrative measures, under both international criminal law and Article 20(2) of the International Covenant on Civil and Political Rights (ICCPR);
  • Other forms of ‘hate speech’ that States may prohibit to protect the rights of others under Article 19(3) of the ICCPR, such as discriminatory or bias-motivated threats or harassment;
  • ‘Hate speech’ that is lawful but nevertheless raises concerns in terms of intolerance and discrimination, meriting a critical response by the State, but which should be protected from restriction under Article 19(3) of the ICCPR.

By contrast, the Regulation relies on a vague definition that neither distinguishes between these three categories of ‘hate speech’ nor provides for a correspondingly modular response.

Furthermore, ARTICLE 19 recalls that the protective scope of any measures to address ‘hate speech’ should encompass all protected characteristics recognized under international human rights law, and not be limited to those currently listed in Article 1(n) of the Regulation. In particular, that list should be revised in light of the right to non-discrimination as provided under Articles 2(1) and 26 of the ICCPR.

Co-regulation and codes of conduct

Section III of the Regulation focuses on co-regulation mechanisms, the instrument AGCOM has chosen for its approach to the media sector.

ARTICLE 19 has long argued that, although regulation is acceptable for the broadcast media to ensure that the limited broadcasting spectrum is democratically distributed, for social media and the print press the least intrusive instrument is clearly self-regulation. We have also argued that self-regulation should be managed by a self-regulatory body that is independent of government, commercial and special interests; established via a fully consultative and inclusive process; and democratic and transparent in the selection of its members and in its decision-making. We also recognise that, in certain circumstances, when self-regulation proves ineffective, co-regulation can represent a better option.

AGCOM’s choice to adopt co-regulatory mechanisms is grounded in the specific context of the failed self-regulation of social media platforms. Here, ARTICLE 19 notes that the mechanisms provided for in the Regulation present several problematic shortcomings, namely:

  • The reference to codes of conduct contained in Article 2(3) of the Regulation appears unnecessary and misplaced. Article 2 deals with the scope of application and not with codes of conduct, which are instead regulated by Article 9. As such, the wording of Article 2(3) and Article 9 raises doubts about how the two provisions are to be interpreted and coordinated.
  • The Regulation contains conflicting provisions on awareness campaigns. Article 2(3) establishes that AGCOM shall identify forms of awareness campaigns for providers of video-sharing platforms; however, Article 9(4) mandates the platforms themselves to do so. It is thus unclear whether awareness campaigns should be shaped and run by the video-sharing platform providers, by AGCOM, or by the two together and, in the latter case, how exactly they would act and coordinate.
  • Article 9(1) calls for codes of conduct to be adopted by audiovisual media service providers as well as by video-sharing platform providers, seemingly putting the two categories on an equal footing. Nonetheless, under Article 9(5), AGCOM reserves the possibility to review the Regulation in light of the codes of conduct adopted by audiovisual media service providers, but not in light of those adopted by video-sharing platform providers. This asymmetry creates, at best, an unjustified disparity of treatment and, at worst, suggests that AGCOM will dedicate less attention to the conduct of video-sharing platforms.
  • Article 9(2) establishes that measures aimed at combating ‘hate speech’ shall include efficient systems for flagging violations and for identifying the actors liable for them. ARTICLE 19 believes that this provision is overbroad and provides no parameters for assessing the efficiency of the systems to be put in place; we warn against the risk that the outcome could be a box-ticking exercise for providers rather than a real effort to limit ‘hate speech.’
  • Article 9(3) mandates providers of video-sharing platforms to send AGCOM a quarterly report on the monitoring carried out to identify online ‘hate content’, with an indication of the operating methods and verification systems used. First, it is not clear whether the concept of ‘hate content’ is limited to ‘hate speech’ that may be prohibited under international law or goes beyond those boundaries, and, in the latter case, what it includes. Second, while the rule enhances transparency, it does not provide any remedy where monitoring reveals insufficient outcomes.

Reporting and sanctions

Article 6(2) of the Regulation allows ‘hate speech’ to be reported only by a limited number of entities: associations or other organizations that represent users’ interests, as well as associations and entities dedicated by statute to fighting discrimination. Individuals, including those who may be the direct targets of ‘hate speech,’ cannot report it to AGCOM. ARTICLE 19 believes that this remedy is disproportionately narrow and that the limitation negatively impacts the efficacy of the reporting system.

Furthermore, Article 7 of the Regulation establishes different procedures for episodic and systemic violations. While ARTICLE 19 welcomes the distinction, it also warns against the lack of a careful definition of the thresholds for identifying a systemic violation. Vagueness in this respect creates legal uncertainty for both providers and citizens.

Overall, ARTICLE 19 welcomes AGCOM’s attempt to counter the widespread phenomenon of ‘hate speech’ in Italy and recognizes that the adopted text contains a number of improvements for the protection of freedom of expression in the country. Nevertheless, we call on AGCOM to ensure that the Regulation complies with international standards on freedom of expression. We also encourage AGCOM to clarify the interpretation of the Regulation’s provisions, in order to ensure legal certainty for all relevant actors.

Read the analysis in Italian.