Seven priorities for online platforms during elections

The largest online platforms, including YouTube, Facebook, Instagram, Twitter/X and TikTok, play an increasingly crucial role during elections. As billions of citizens in over 50 countries head to the polls this year, ARTICLE 19 urges these platforms to do more to protect human rights and the integrity of the democratic process.

Elections held so far in 2024 have highlighted a number of serious challenges to citizens’ right to freely express themselves online, access diverse and accurate information, form opinions and choose their representatives without interference. 

From information manipulation and influence operations by foreign states, the use of deepfakes and AI-generated content, and coordinated disinformation campaigns, to political candidates engaging in hate speech targeting minorities and governments tightening their control over the internet, the integrity of the electoral process has never been more at risk.

The largest online platforms play a pivotal role in both exacerbating and mitigating those challenges. Yet, in the past, they have been slow to acknowledge this responsibility, and have frequently fallen short of meeting their human rights responsibilities – often with grave real-world consequences.

As the Super Election Year continues, ARTICLE 19 presents our seven key priority calls for online platforms. Each of these calls is essential for platforms to fulfil their responsibilities to safeguard free expression and the integrity of elections and the broader democratic process. 


1. Ensure readiness and sufficient resources for elections in all countries of operation

Historically, online platforms’ preparedness for elections has been dictated by the ‘strategic’ priority assigned to each country. Resource allocation must not be based on the profit potential or the perceived influence of the country’s government, but rather on specific human rights risks. Proper preparation for elections must account for the fact that each election presents unique challenges to free speech and the right to free and fair elections.

Platforms must conduct human rights due diligence for each significant election in every country they operate in and allocate sufficient resources to analyse risks and implement mitigation measures.

2. Enable independent scrutiny of election-related actions through transparency and stakeholder engagement

In most contexts, platforms do not produce publicly available information about measures taken around specific elections. This makes it challenging to scrutinise their responses as well as assess their consistency. 

Platforms’ responsibility to properly understand the dynamics and human rights risks in each election requires meaningful stakeholder engagement. Yet, platforms’ stakeholder engagement is notoriously lacking. Once again, this challenge is most pronounced in countries which are not perceived as having strategic global importance to the companies. 

Platforms must publicly communicate all election-related measures, and publish information about such measures in a consistent and centralised manner. They must also establish regular channels of communication with non-state actors, in particular civil society groups, in the months preceding elections, to ensure local stakeholders can meaningfully engage in the preparedness process and throughout the election period.

3. Ensure that recommender systems and content moderation practices promote access to diverse viewpoints and do not silence any voices 

Many of the issues in the online information space stem from, or are at least amplified by, the platforms’ own systems and processes, particularly their recommender systems and content moderation practices. 

The engagement-driven recommender systems of some online platforms have been shown to amplify false, polarising and inciting content. Under-investment in content moderation exacerbates the problem, and can result in either excessive removals or insufficient moderation – driven by too few trained and skilled human reviewers and an overreliance on poorly trained automated moderation systems.

Platforms must develop measures to ensure recommender systems prioritise user access to diverse viewpoints about elections and democratic processes. They must also bring their content moderation policies and practices in line with international freedom of expression standards, in particular those relating to political speech, disinformation and hate speech.

4. Cease harmful political advertising practices

Platforms that allow political advertising often do not provide enough transparency regarding funding sources for political ads, nor offer enough information to users about why they are targeted with specific content. Micro-targeting techniques pose a threat to the integrity of the electoral process, and hinder a free and diverse debate. 

Policies and practices around political advertising must clearly outline the type of content allowed in political advertisements, enhance compulsory transparency standards and ensure their consistent and global application, and limit permissible targeting methods.

5. When conducting dialogues with electoral authorities, do so in a transparent and human rights-based manner

While some electoral authorities are fully independent, others have been accused of bias or being politically controlled. This underscores potential risks associated with agreements and interactions between electoral authorities and online platforms. 

Additional risks can also stem from the specific nature of the engagement or agreements between electoral authorities and online platforms. ARTICLE 19 is particularly concerned about backchannel communication that can give government authorities influence over how platforms moderate online speech. Coupled with a lack of transparency, such cooperation raises significant concern about users’ right to free expression online. 

Platforms should analyse the human rights risks of engaging with each electoral authority and tailor their mode of engagement accordingly. Any engagement should involve civil society organisations and focus on areas such as enhancing transparency, facilitating access to official electoral content or digital literacy. Platforms should refrain from entering into agreements that allow electoral authorities to flag content for restriction. 

6. Resist and challenge government censorship

During elections, governments often place pressure on online platforms to grant them access to user data, restrict content or block accounts. When threatened with severe sanctions, such as fines, advertising bans, or throttling (bandwidth reductions that greatly slow down access to platforms’ services or render them effectively non-functional), platforms are often inclined to yield to government pressure. 

Instead, platforms should take all necessary measures under international human rights standards to prevent and mitigate adverse human rights impacts of such requests and to avoid becoming complicit in freedom of expression violations. 

This includes evaluating negative free expression implications resulting from potential sanctions and exploring all legal avenues to challenge censorship and data access demands. All assessments and actions taken by platforms must be informed by engagement with local civil society. Platforms must also be transparent about any takedown and data access requests received. 

7. Protect user access during internet shutdowns or bandwidth throttling 

Internet shutdowns have become increasingly common during election periods, and platforms are often confronted with the possibility of governments blocking access to their services or slowing them down through bandwidth throttling. 

Platforms must make every effort to preserve user access in the event of shutdowns during election periods. This includes taking legal steps to challenge shutdown or throttling orders, but also investing in designing and promoting technical tools to help circumvent the restrictions, such as making the platform accessible through proxy servers or providing ‘data-light’ versions of services that can function even at significantly reduced internet speeds.