EU: Platforms’ election risk mitigation measures must put human rights first

ARTICLE 19 and the Electronic Frontier Foundation (EFF) jointly submitted a response to the European Commission’s (‘the Commission’) consultation on its “Guidelines for Providers of Very Large Online Platforms and Very Large Online Search Engines on the Mitigation of Systemic Risks for Electoral Processes”. In the submission, we recognise the significant responsibility that platforms bear during elections, given their influence over the information space, and make several key recommendations that promote a human-rights-based approach and emphasise the need to focus on the risks stemming from platforms’ own systems and processes.

ARTICLE 19 and the EFF welcome the opportunity to offer our input to the European Commission’s Guidelines, informed by Article 35(3) of the Digital Services Act (DSA) which provides that the Commission may issue guidelines on risk mitigation measures that providers of very large online platforms (VLOPs) and very large online search engines (VLOSEs) are required to adopt in relation to specific risks.

We believe the Guidelines raise several important points:

  • Foremost, online platforms should be properly prepared for elections and should address foreseeable negative effects on election integrity. We agree with the Guidelines that a proper understanding of local context is crucial in this effort.
  • Online platforms should dedicate sufficient resources and create dedicated teams to prepare for elections, proportionate to the level of risk.
  • Finally, we are pleased to see the recommendation that fundamental rights impact assessments are made available to civil society organisations as soon as they are completed.

We also offer a number of key recommendations and urge the Commission to ensure that the Guidelines:

  • Focus on platforms’ own systems and processes. In their current form, the Guidelines overly emphasise the risks associated with certain types of user-generated content and the activity of malicious actors on online platforms. Instead, the Guidelines should address certain steps VLOPs may undertake to mitigate risks stemming from their own systems and processes, such as their recommender systems, content moderation practices, ad libraries, monetary incentives, or design choices which may contribute to human rights risks during elections.
  • Follow a human-rights-based approach. The Guidelines should place greater focus on safeguarding human rights, one of the DSA’s key objectives, in particular the right to freedom of expression and the right to free and fair elections, including by promoting a healthy information ecosystem. Any suggested mitigation measures should be in line with international human rights standards.
  • Reconsider watermarking. We are sceptical of the suggestion that VLOPs and VLOSEs should use watermarking as a risk mitigation measure, in light of evidence suggesting that watermarking will not curb disinformation. Watermarking also presents risks to fundamental rights, such as privacy and freedom of expression.
  • Address ‘disinformation’ without censorship. It is essential that the DSA does not result in undue restrictions on ill-defined categories of user-generated content. Concepts like “hate speech”, “FIMI” (foreign information manipulation and interference), or “disinformation”, used repeatedly throughout the Guidelines, are not clearly defined under international human rights law and are not well suited to serve as a basis for restricting freedom of expression. Mitigation measures should not focus on restricting content deemed problematic. Instead, the Guidelines should focus on promoting access to diverse sources and viewpoints.
  • Ensure that cooperation with national authorities is rights-respecting and transparent. We caution against broadly worded recommendations for VLOPs to exchange information with national authorities. Before engaging with electoral authorities, VLOPs should conduct due diligence to understand those authorities’ level of independence, assess the human rights risks associated with engaging with them, and tailor the mode of engagement accordingly. While we do not oppose engagement that seeks to facilitate voter access to official information about elections or to promote media and digital literacy, we are wary of opaque modes of engagement that might lead to restrictions of content.