Austria: the draft Communication Platforms’ Act fails to protect freedom of expression


On 1 September 2020, Austria notified the European Commission that it was seeking to adopt a social media law to protect users online. ARTICLE 19 is concerned that the Austrian proposal significantly interferes with the right to freedom of expression, in particular because it delegates censorship powers to private companies. It also undermines EU efforts to develop a common framework for the governance of platforms under the Digital Services Act (DSA). ARTICLE 19 calls on Austria to withdraw its legislative proposals and contribute to the protection of freedom of expression through the DSA instead.

The draft Communication Platforms Act (Kommunikationsplattformen-Gesetz) puts in place an oversight mechanism for the “responsible” and transparent handling of user reports of allegedly illegal content online.

Key aspects of the draft Act

• Scope: The draft Act applies to domestic and foreign communication platform providers with more than 100,000 average registered users in Austria in the previous quarter or a turnover in Austria exceeding EUR 500,000 in the previous year. Online platforms for the sale of goods or services, such as Amazon or Airbnb, as well as non-profit encyclopaedias for imparting knowledge (e.g. Wikipedia), are exempt. Media companies providing a comment section on their content online are also exempt.

• Subject matter: The draft Act is concerned with illegal rather than ‘legal but harmful’ content. It more specifically targets offences including coercion, dangerous threats, persistent persecution, online harassment, unauthorised image recordings, blackmail, accusation of a judicial criminal act that has already been dismissed, disparagement of religious teachings, ‘pornographic representations of minors’, initiation of sexual contact with minors, a range of terrorist offences, including glorification of terrorism, and incitement to hatred.

• Notice and takedown procedure: The draft Act puts in place a tiered notice and takedown procedure. Upon receiving “information required for an assessment,” platforms would be required to remove or disable access to content within 24 hours when it is “already evident to a legal layperson” that the content is illegal “without further investigation”. If the assessment of legality requires a more detailed examination, content must be removed within 7 days. The user who reported the content and the uploader are informed about the possibility of participating in a complaint procedure and of applying for a review procedure.

• Review and complaint procedure: Under the review procedure, both the reporting user and the uploader have two weeks to complain about the initial decision. The internal review must be completed within two weeks of receiving an application for review from either party. The platform is not obliged to review vexatious or automated complaints. Under the complaint procedure, users can complain about the inadequacy of the reporting procedure or the review procedure, or about a failure to provide information. The complaints office, which sits within the Austrian Regulatory Authority for Broadcasting and Telecommunications, can help bring about an ‘amicable’ solution where no settlement has been reached after the review procedure.

• Transparency obligations: Under the draft Act, platforms must submit an annual report containing information about, amongst other things: (1) the company’s efforts to prevent illegal content on the platform; (2) description of the design and user-friendliness of the procedure, decision-making criteria for removal decisions, and description of the steps taken to determine whether content is illegal or in breach of community guidelines; (3) description of the number of reports of allegedly illegal content during the reporting period; (4) overview of the number of reports that led to the removal of content, including a summary description of the type of content; (5) overview of the quantity, content and result of the review procedures; (6) information about the organisation, staff numbers and technical equipment, as well as training and supervision of those responsible for processing reports; (7) overview of the period of time taken to respond to reports, broken down into ‘within 24 hours’, ‘within 72 hours’, ‘within 7 days’ and ‘at a later point in time’; and (8) overview of the number and type of cases in which the service provider has refrained from carrying out a reporting and review procedure. The supervisory authority can also issue guidelines on the structure of the report and the scope of the reporting obligations.

• Appointment of a company representative: Platforms must appoint a responsible representative. Failure to do so can result in fines, or in the company’s debtors being prohibited from making payments to it or to an affiliated company.

• Oversight by a ‘supervisory authority’: In practice, oversight is carried out by the Austrian Communications Authority, whilst the Austrian Regulatory Authority for Broadcasting and Telecommunications is responsible for the complaints office.

If more than five well-founded complaints about the inadequacy of a platform’s measures are made within one month, the supervisory authority initiates a procedure to review the measures put in place by the platform.

The supervisory authority can issue a warning to a platform, together with a notice giving it four weeks to comply with its obligations under the Act, if it concludes, on the basis of the frequency and nature of the complaints or the results of previous supervisory procedures, that the measures taken by the platform are inadequate. It can do the same if it concludes, independently of any complaints, either following a notification from the complaints office or on the basis of its own preliminary assessment, that the obligations set out in the Act are being seriously violated. If the platform has been notified more than once and has still not complied with a notice, the supervisory authority can impose a fine.

In performing its oversight function, the supervisory authority must ensure that it does not order measures that would amount to a prior ‘check’ on content. The measures must be proportionate to their intended objective, including increasing the efficiency of the protective mechanisms for users, protecting the general public from illegal content and safeguarding the interests of the individuals affected by such content – taking into account the service providers’ legal interests.

• Sanctions: Failure to comply with obligations under the Act can lead to fines of up to EUR 10 million being imposed on the platform following the supervisory procedure outlined above. In assessing the amount of the fine, the supervisory authority takes into account: (1) the financial power of the platform (e.g. total turnover); (2) the number of registered users; (3) previous violations; (4) the extent and duration of the platform’s failure to comply with its obligations; (5) its contribution to establishing the truth; and (6) the extent of the precautions taken to prevent a violation, or the instruction of employees on how to behave in accordance with the law.

Finally, the platform’s representative is liable to monetary penalties of up to EUR 10,000 for failing to comply with their obligations under the Act, and of up to EUR 50,000 for negligence in ensuring that the platform complies with its reporting and transparency obligations.

ARTICLE 19’s concerns

ARTICLE 19 notes that the stated purpose of the draft Act is to protect internet users. The Austrian proposal follows in the footsteps of the German NetzDG and the now largely struck down French ‘Avia’ Law on hate speech online.

ARTICLE 19 shares many of the concerns that have been expressed about the lack of transparency of platforms’ content moderation operations and the need for platforms to be accountable to the public. We also support strong remedies for the wrongful removal of content, both through internal complaints mechanisms and before the courts. We note that the draft Act contains a number of positive measures in this respect: the transparency and complaints procedures it provides for are both welcome. We also note that the draft Act is somewhat limited in scope, so that media content and Wikipedia fall outside it. This is also positive, as is the fact that the Act is limited to illegal, rather than ‘legal but harmful’, content.

However, like our partners at epicenter.works, ARTICLE 19 has serious concerns with the draft Act:

• Overall approach and delegation of censorship powers: To begin with, the draft Act’s logic is still very much to delegate censorship powers to platforms, which are tasked with assessing the legality of content. As we have said on numerous occasions, however, platforms are not best placed to make this assessment; it should be made by the courts. Even so-called ‘manifestly unlawful’ content does not lend itself to easy analysis. We are also concerned that appeals are heard by the complaints office, which sits in a special unit of the broadcasting regulator. In our view, this is inappropriate and unlikely to provide sufficient guarantees of independence. Final decisions about legality should be made by the courts or independent adjudicatory bodies.

• Excessively broad scope: Although the current exemptions from the Act are welcome, they do not go far enough: not-for-profits are not altogether exempt, and small platforms (compared to, e.g., Facebook) will still need to comply with onerous requirements under the law, including hiring legally trained staff to assess the legality of content.

• Unduly short timeframes: The draft Act further requires removals within 24 hours, an unduly short period of time. In practice, this means that large platforms are more likely to use automated filters, or to prioritise less serious allegations of illegality, in order to comply with their obligations under the law. This is plainly undesirable and inconsistent with international standards on freedom of expression. The French Constitutional Council recently ruled that 24 hours was too short a time limit in which to conduct a proper assessment of allegations of illegality.

• Lack of transparency requirements regarding the use of filters: We are further concerned that the overall objectives of this legislation continue to be aimed at countering illegal content and removing it rather than ensuring the protection of the right to freedom of expression. This translates into transparency obligations that appear more detailed in relation to the removal of content and how fast it is removed but do not ask enough questions about the quality of decision-making, for instance by requiring the publication of at least some decisions and the reasons given for them. More importantly, the draft Act does not require transparency in relation to the use of filters, for instance the rate of false positives and false negatives, or the use of impact assessments to ensure that filters do not over-remove content.

• Broad powers of the supervisory authority: ARTICLE 19 notes that the supervisory authority, i.e. the Austrian Regulatory Authority for Broadcasting and Telecommunications, ultimately has broad powers to dictate how companies should organise their content moderation systems – and therefore manage users’ speech – in order to comply with the law. It is of the utmost importance that such powers are only granted to an independent authority and exercised with restraint and with a view to ensuring the protection of freedom of expression.

• Threshold for sanctions: Given the likely volume of requests for content removal and companies’ reliance on filters that are prone to errors, the threshold for the oversight procedure to start (more than five well-founded complaints in a month) appears unduly low. Moreover, the procedure starts before complaints have been resolved, i.e. before a final decision has been made as to the legality of the content. Whilst fines of EUR 10 million are unlikely to dent Facebook’s bottom line significantly, they are still very high for smaller companies. We are also concerned about the personal liability of platform representatives for failing to comply with the requirements of the law; in our view, it is likely to encourage the setting up of operations that lead to overzealous removal of content. Moreover, sanctions of up to EUR 10,000 for failing to provide contact details that are ‘always’ easily and directly available, or for not being reachable by the supervisory authority ‘at any time’, are disproportionate.

ARTICLE 19 therefore urges the Austrian Government to withdraw the draft Act and to contribute to the protection of freedom of expression through the DSA instead. In any case, the protection of human rights, transparency and accountability should be at the heart of any proposal to regulate social media platforms.