EU: ARTICLE 19’s recommendations for the Digital Services Act trilogue

The Digital Services Act (DSA) presents an important opportunity to open up Big Tech to scrutiny and to protect human rights online. As the negotiations of the DSA reach their final stage in the trilogue, we urge the European Parliament and Member States in the EU Council to do their utmost to ensure that the DSA keeps its promise to increase safety online, and improve transparency and accountability of online platforms, all while fully respecting fundamental rights.

Since the first proposal was published, ARTICLE 19 has argued that the DSA must have the protection of freedom of expression at its core. While the draft DSA does contain a number of safeguards for the protection of freedom of expression and privacy, throughout the legislative process we have repeatedly raised concerns that some of its aspects are disquieting and should be brought in line with international human rights and free speech standards (see, for example, our recommendations on regulating recommender systems, due diligence obligations for online platforms, and the new proposed notice and action mechanism).

ARTICLE 19 recommends in particular:

  1. To protect users’ rights to privacy and anonymity online and refrain from imposing general monitoring obligations

Encryption and anonymity are key to protecting users’ right to privacy and to ensuring that they feel confident to express themselves freely in their online communication. If users are unable to communicate privately, this will substantially affect their right to freedom of expression. States should therefore refrain from restricting the provision of encrypted and anonymous services by intermediary service providers and from requiring such providers to analyse individuals’ communications and online activity via a general monitoring obligation. Apart from interfering with users’ privacy rights, a general monitoring obligation would likely lead to companies detecting and removing vast amounts of legitimate content, as content-monitoring technologies such as hash-matching algorithms and natural language processing tools are currently not advanced enough to reliably distinguish legal from illegal content.

For these reasons, we support:

Article 7 EP

  2. To avoid over-removal of content

The DSA must not create a mechanism that incentivises companies to over-remove content to shield themselves from liability. Maintaining conditional immunity from liability for hosting content in the draft DSA will go a long way towards achieving this goal. At the same time, the principle of conditional liability must be accompanied by solid provisions on the notice and action mechanism. This is because providers may lose their immunity if they fail to expeditiously remove or disable access to the illegal content following a notice of illegality. It is therefore of critical importance that users’ content that has been the subject of a notice of illegality remains accessible pending assessment of its legality (and that the immunity of providers remains intact during that time).

Removing content before such an assessment is even carried out would undoubtedly lead to a significant amount of completely legitimate content being taken down, opening up the door for abuse of the notice and action mechanism. This would not only be in contravention of due process principles but it would also risk the curtailment of freedom of expression.

For these reasons, we support:

Article 14(3) EP; Article 14(3a) (new) EP

  3. To bring due diligence obligations and risk mitigation measures into compliance with human rights standards

The draft DSA requires Very Large Online Platforms (VLOPs) to carry out risk assessments concerning a number of aspects, namely the dissemination of illegal content, any negative effects on fundamental rights, and the intentional manipulation of their service by automated means with a foreseeable negative effect on public health, public safety, civic discourse and electoral processes. ARTICLE 19 endorses in principle the adoption of due diligence obligations for VLOPs to ensure that potential risks to users’ human rights are properly identified and addressed.

At the same time, ARTICLE 19 has highlighted that the identification of systemic risks and the adoption of appropriate measures need to meet the legality test under international human rights law. Users or third parties must be able to foresee how their freedom of speech rights might be restricted to counter systemic risks. This requires clear rules governing the identification of systemic risks and clear limits as to which measures are acceptable and which are off limits. As ARTICLE 19 has highlighted before, the vague terminology around what could constitute a systemic risk remains a source of concern. At the very least, the DSA should contain strong transparency requirements regarding the methods applied in carrying out risk assessments and determining the measures to adopt. This should be coupled with the involvement of human rights organisations at all stages of this process.

For these reasons, we support:

Article 26(2a) (new) EP; Article 26(2b) (new) EP; Article 26(2c) (new) EP; Article 27(1) EP; Article 27(1a) (new) EP; Article 27(1b) (new) EP; Article 27(1c) (new) EP; Article 31(2) EP; Article 31(2a) EP

ARTICLE 19 is concerned that the European Parliament proposal expands the catalogue of potential systemic risks that can justify risk mitigation measures to include the dissemination of ‘content that is in breach with [VLOPs’] terms and conditions’. This would allow companies to take measures based on terms and conditions that can be changed at any point and that may go well beyond the legitimate aims foreseen in international human rights law.

For these reasons, we reject:

Article 26(1)(a) EP

  4. To provide users with effective remedy mechanisms

Procedural safeguards are an essential component of protecting users’ free speech online. For example, ARTICLE 19 believes that individuals should have access not only to internal complaint-handling and redress mechanisms but also to judicial remedies. Users should be able to challenge any decision by service providers that affects their rights, for example the removal of content or suspension of the service, before independent courts or tribunals.

For these reasons, we support:

Article 9a (new) EP; Article 17(5a) (new) EP

Read the full recommendations