EU: Protect freedom of expression in Digital Services Act

In September 2020, ARTICLE 19 responded to the EU consultation on the future Digital Services Act (‘DSA’) and to the parallel consultation on a New Competition Tool (NCT). The DSA is set to provide a blueprint for the regulation of platforms and other intermediaries in the EU for at least a generation. Meanwhile, the NCT will equip the European Commission with the instruments needed to address structural competition issues in digital markets. Both proposals are currently being developed by the Commission, with a draft text expected on 2 December 2020. ARTICLE 19 has previously articulated our 10 key recommendations for the DSA. In our response to the EU consultation, we call on the European Commission to take this opportunity to set a gold standard for the protection of fundamental rights online, with radical transparency and accountability at the heart of the DSA. We also call on the European Commission to duly consider the impact of gatekeepers not only on the economic dynamics of markets, but also on consumers’ fundamental rights.

In our responses to the EU consultations, ARTICLE 19 emphasises the following key recommendations:

  1. Transparency, accountability and the protection of human rights must be embedded in the DSA as core values

The DSA must have the protection of human rights at its heart. In particular, the principles of legality and proportionality must be upheld throughout to prevent government overreach and potential human rights abuses. For this reason, vague concepts such as the prevention of ‘harm’ must be avoided.

Comprehensive transparency and internal due process obligations in the DSA are essential to accountability and user empowerment. Users should be able to understand how and why companies make decisions about the content they can see and post. Users should also have access to appeals mechanisms to challenge company decisions on content.

User choice and robust evidence must also be central to technology design and policy solutions. Users must be given the tools they need to choose the kind of experience they want to have online. The DSA must also be sufficiently flexible to enable the development of technical and practical solutions that meet international human rights standards.

  2. The cornerstones of the E-Commerce Directive, conditional immunity from liability and the prohibition on member states mandating general monitoring, must be preserved

Conditional immunity from liability for content that platforms have not interfered with, and the prohibition on member states requiring platforms to conduct general monitoring of content, are the cornerstones of the E-Commerce Directive. ARTICLE 19 believes that any changes to the existing conditional immunity provisions risk either increasing censorship by encouraging platforms to take down more content, or pushing platforms to avoid content moderation altogether, which likewise threatens online free expression.

The current standard of knowledge required to benefit from immunity from liability must be maintained, i.e. it should remain ‘actual’ rather than ‘constructive’ knowledge, and actual knowledge of illegality should only be established by a court order. To hold otherwise would be to accept that content is illegal simply because a third party, such as a copyright holder, says so.

General monitoring requirements create a serious threat to both freedom of expression and privacy online and must continue to be prohibited in the DSA.

  3. The DSA should set out clear notice and action procedures to deal with allegations of illegality, creating a ‘notice to notice’ system for private disputes and a ‘notice and takedown’ approach for allegations of serious criminality

Clearer notice and action procedures are essential to protecting free expression online. We propose notice and notice procedures for private disputes and a court-based takedown model for allegations of criminality. Our proposal provides legal certainty for both users and companies whilst ensuring that companies are not put in charge of deciding what’s legal or not in the first place. With strong due process safeguards before content is removed, freedom of expression is better protected.

A notice to notice procedure for private disputes, such as defamation or copyright complaints, would require the complainant to file a ‘notice’ with the hosting provider regarding a piece of content. The hosting provider would then pass this notice to the original content creator. The company would be required to take down the content if the creator fails to respond, or if, following their response, the complainant seeks legal resolution and a court order is issued.

Notice and takedown, for serious criminal matters such as a threat to life or sexual abuse imagery, would mean that companies are liable for failing to take down content after a takedown order is issued by a court or, in urgent cases where serious criminality must be prevented, by law enforcement using statutory powers (to be confirmed later by a court order).

  4. The DSA should look beyond content regulation and require large platforms to unbundle content moderation from hosting and to provide interoperability. These obligations should be imposed through ex ante regulatory measures to increase competition, user choice and rights protections.

The concentration of power in a few large social media platforms means users have limited choice, particularly on issues of privacy, accessibility and free expression. By providing hosting and content moderation services as a bundle, large platforms raise barriers to entry for potential competitors and lock in users. In other words, by offering both services together, dominant social media platforms shield themselves from competitive pressure and deprive users of alternatives, securing their gatekeeper position.

Requiring big platforms to unbundle hosting from content moderation, and to allow third parties to access their platforms through interoperability requirements, would reduce the imbalance of power in social media markets and benefit business users, the wider industry, and individuals. Users would be able to choose which type of content moderation and which content rules they are subject to, and which company hosts and can access their data.

Unbundling, data portability and interoperability requirements are the best way to ensure healthy competition and innovation in social media markets, and to return to the promise of a free, diverse and decentralised Internet.

  5. Concentrated markets, barriers to entry, tipping and gatekeeping weaken or eliminate competition in the market, frustrating all the positive outcomes competition is supposed to deliver, especially in terms of consumer welfare and innovation

Effective cross-border competition is one of the EU’s core values and a founding pillar of the Digital Single Market. As the guardian of competition in the EU, the Commission should be able to intervene whenever a structural competition problem obstructs the delivery of the consumer welfare outcomes that competition rules are supposed to guarantee.

Sometimes, for example where there are demand-side or hold-up problems in markets, features such as gatekeeping or tipping allow platforms to exploit their power without necessarily engaging in the conduct sanctioned by Articles 101 and 102 TFEU. An enforcement gap may therefore arise, which the Commission should be able to fill with different instruments to make markets work effectively. In particular, the Commission should be able to impose behavioural remedies on large platforms, for example the unbundling obligations described above.


Read response to EU consultation on the DSA

Read response to EU consultation on the NCT