France: Analysis of draft hate speech bill

ARTICLE 19 is concerned that the proposed French Bill on Countering Online Hatred – the so-called “Avia Bill” (Proposition de loi Avia) – fails to comply with international freedom of expression standards. We warn that the Bill entrenches private censorship of a wide range of illegal content at the expense of the courts, that the 24-hour time limit on content removals is too short, and that the sanctions meted out to eradicate online ‘hatred’ are disproportionate. ARTICLE 19 urges the National Assembly to reject the Bill. Instead, the French Government should explore alternative options that take a more holistic approach to concerns about the power and influence of the dominant social media platforms.

On 20 March 2019, Laetitia Avia MP introduced a Bill on Countering Online Hatred (‘proposition de loi’) before the National Assembly. The Bill broadly follows the approach of the NetzDG law in Germany and the recommendations of the 2018 Avia Mission Report on Combatting Racism and Anti-Semitism Online. On 19 June, the Public Bill Committee of the National Assembly adopted the Bill, including several amendments. The new text was debated on 3 July in the National Assembly under the fast-track procedure, meaning that the Bill will be given only one reading in each chamber of the French Parliament.

Key aspects of the draft Bill

The draft Bill proposed, inter alia, the following:

  • Communication service providers with a number of users exceeding a threshold set by decree would be required to remove, within 24 hours of notification, content that manifestly incites hatred or constitutes an insult on grounds of race, religion, ethnicity, sex, sexual orientation or disability, and that is illegal under French law;
  • The information required to notify online service providers of manifestly illegal content would be simplified. In particular, individuals wishing to notify ‘manifestly illegal content’ would no longer be required to set out the facts and reasons why they believe such content to be illegal;
  • Communication service providers would be required to put in place internal complaints mechanisms and provide information about external avenues of appeal;
  • Companies would be required to have a legal representative in the country in which they operate;
  • Companies would also be required to comply with a number of transparency obligations set by the regulator;
  • The French broadcasting regulator would be able to make recommendations as to the way in which companies could better tackle ‘hate’ online;
  • The French broadcasting regulator could impose fines of up to 4% of annual turnover for serious and recurrent failures to remove content;
  • An administrative authority – in practice the Central Office for Combating Crimes Related to Information and Communication Technologies (OCLCTIC), a special branch of the police dealing with online terrorist content and child abuse images – would be able to order the blocking or de-referencing of websites, servers or any other electronic means enabling access to content that has been deemed illegal by a court decision;
  • Fines for failing to cooperate with law enforcement or other agencies, including by preserving data that may enable the identification of those who posted the allegedly illegal content, would more than treble, from EUR 75,000 to EUR 250,000.

On 19 June 2019, the Public Bill Committee (‘commission des lois’) of the National Assembly examined the Bill in light of a number of recommendations made by the Conseil d’Etat in its Opinion dated 16 May 2019. The Public Bill Committee adopted the Bill with the following amendments:

  • Scope: the scope of the Bill has been considerably widened, both in terms of the companies covered and the subject matter.
  • Procedural obligations: the Bill generally simplifies the removal procedure under Law no. 2004-575 of 21 June 2004 on confidence in the digital economy. However, the latest draft maintains the requirement for individuals to provide information about the facts and reasons for the notification, as well as the location of the allegedly illegal content where appropriate.
  • Safeguards against abuse of the removal procedure: the Bill creates a new offence of knowingly misrepresenting content as illegal for the purposes of requesting its removal. In other words, it penalises malicious removal requests. The offence is punishable by one year’s imprisonment and a fine of EUR 15,000.
  • Transparency obligations: the Bill spells out in more detail the transparency requirements that companies must comply with, instead of leaving them to the regulator to decide. In particular, companies are required to remind the authors of removed ‘manifestly illegal’ content that they expose themselves to civil and criminal sanctions for the speech offences at issue. Companies must provide clarity about their content moderation standards. They are also required to explain how they organise themselves internally and what resources they deploy in order to comply with their removal obligations. They must also educate minors under 15 about the responsible and civil use of their services and the potential legal sanctions they expose themselves to if they post or share ‘hateful’ content.
  • The regulator: companies are now explicitly expected to comply with the recommendations of the broadcasting regulator. Failure to comply is taken into account in determining whether a company has overall failed to comply with its removal obligations. In making this assessment, the regulator is now also required to examine whether communication service providers are removing too little content or too much.
  • Blocking of mirror sites: in the new draft, the OCLCTIC can only request the blocking or de-referencing of websites, servers or any other electronic means enabling access to content that has been deemed illegal by a court decision. If providers do not comply, the OCLCTIC can make an application (including on an emergency basis) to the courts, which can then order the removal or other restriction on access to the content at issue.
  • Other sanctions: the Bill now criminalises companies’ failure to remove manifestly ‘hateful’ content, punishable by one year’s imprisonment and a fine of EUR 250,000 for natural persons, rising to EUR 1,250,000 for legal persons if other amendments are adopted.

ARTICLE 19’s concerns

ARTICLE 19 has significant concerns about the text adopted by the Public Bill Committee. Under international human rights law, any restriction on freedom of expression must (1) be provided by law; (2) pursue a legitimate aim as exhaustively listed in Article 19(3) of the International Covenant on Civil and Political Rights (ICCPR) or Article 10(2) of the European Convention on Human Rights (European Convention); and (3) be necessary and proportionate to that aim. In our view, the proposed Bill fails to meet the first limb of that test in that it generally fails to provide sufficient safeguards for the protection of freedom of expression.

First, the scope of the Bill has become exceptionally broad, both in terms of the companies covered and the types of content that must be removed:

  • The Bill now covers search engines explicitly. A further amendment is intended to bring communication service providers within scope when they enable the exchange of content, which may be both public and private, as well as companies that host public forums as an ancillary activity. This could raise significant issues for the right to privacy if communication providers are required to monitor their networks to detect ‘illegal’ content.
  • The obligation to remove ‘manifestly illegal’ content within 24 hours has been extended to a wide range of content that is illegal under French law, including: apology of acts constituting an offence against human dignity, war crimes, crimes against humanity, slavery, crimes of collaboration with an enemy, voluntary interference with life or physical integrity, sexual aggression, aggravated theft, extortion or destruction, voluntary degradation or deterioration which is dangerous to a person, sexual harassment, human trafficking, pimping, incitement to or apology of acts of terrorism, and child abuse content. Although some of these underlying offences cover speech or conduct that may legitimately be restricted, others are overly broad and vague under freedom of expression standards. For instance, the offence of apology of voluntary interference with life has been used to prosecute jokes in bad taste about the terrorist attacks of 11 September 2001.

Second, the Bill includes sanctions that may well be disproportionate under freedom of expression standards. In particular, the level of fines that could be levied against companies for failure to remove content, or to comply with their other obligations, appears unduly high and likely to lead to over-removal of content. The same is true of the criminal sanctions for failure to remove outlined above. The level of fines for failure to cooperate, among other things, is so high as to undermine the protection of the right to privacy: faced with fines of EUR 250,000, communication service providers seem highly unlikely to refuse to hand over the details of users accused of posting illegal content online.

Third, ARTICLE 19 finds the Bill’s proposed enforcement regime problematic. In particular:

  • 24 hours is an excessively short period in which to make decisions about ‘hate speech’, an inherently contextual and fact-specific area of law. The requirement that the content meet the threshold of ‘manifest illegality’ is unlikely to be helpful in practice;
  • The regulator could recommend upload filters as a more ‘effective’ means of tackling ‘hate speech’ online, yet it is unclear what would happen if communication service providers refused to follow those recommendations or guidance. It is more likely than not that they would ultimately be found in breach of their removal obligations;
  • Although the OCLCTIC can no longer order restrictions on access to ‘mirrored’ content itself, it can still apply to the courts to obtain such an order if operators fail to comply with its demands. It appears, however, that such orders would be made on an ex parte basis.

ARTICLE 19 recognises that the text adopted by the Public Bill Committee does contain a number of improvements for the protection of freedom of expression. For instance, the latest version of the Bill has restored a degree of procedural fairness by, among other things, requiring individuals or companies who notify content to set out the facts and reasons for notifying it. We also welcome provisions in the Bill aimed at improving transparency reporting by platforms and establishing internal complaints mechanisms. Requiring the regulator to consider over-removal of content when monitoring compliance with the law is also positive. While the new offence penalising bad-faith requests is no panacea – bad faith is notoriously hard to prove – it contributes to the protection of freedom of expression. Nonetheless, we believe that these improvements are insufficient to redress the overwhelming balance of incentives towards content removal.

Overall, ARTICLE 19 is concerned that the new regulatory framework laid down in the Bill entrenches private censorship powers and that its highly punitive fines will be very damaging for freedom of expression.

An alternative and holistic approach

ARTICLE 19 would urge the National Assembly to pause before adopting the Bill and carefully consider the approach put forward by the French Government in the ‘Facebook mission’ report. In our view, the protection of human rights, transparency and accountability should be at the heart of any proposals to regulate social media platforms.

We would also encourage lawmakers to take a more holistic approach to concerns about the power and influence of the biggest social media platforms. In particular, lawmakers should explore alternatives that would reward companies for embracing higher standards of conduct. This could include kite-marking or grading schemes that would enable the public to identify companies that abide by such standards. Policy-makers should also consider independent multistakeholder models, such as Social Media Councils.

Finally, we recommend that the French Government look into “unbundling” options, perhaps similar to those already enacted in European utilities markets or to the changes recently introduced for European payment services through PSD2. Broadly speaking, the big platforms would open up a neutral version of their service (i.e. without their ranking or community standards), and competitors would then offer a service in which users could find the same content but choose to apply different ranking and removal policies. We believe that French lawmakers and others should explore this idea further, as it could also help address some of the concerns arising from the dominance of some digital companies.