Bridging the Gap: Local voices in content moderation


Social media platforms can be a space for free expression, democratic debate, and participation. But weak content moderation can transform them into hotbeds of ‘disinformation’, ‘hate speech’, and discrimination. This is especially concerning in post-conflict countries, where tensions between groups can erupt into violence. 

ARTICLE 19’s new research investigates how content is moderated on major social media platforms in three post-conflict countries – Bosnia and Herzegovina, Indonesia, and Kenya – with a particular focus on ‘harmful content’ (such as ‘hate speech’ and ‘disinformation’).

Our research found that social media companies fail to listen to local communities. They also fail to consider context – cultural, social, historical, economic, and political – when moderating users’ content.

This can have a dramatic impact, online and offline. It can increase polarisation and the risk of violence – as when Facebook allowed incitement of genocide against Rohingya in Myanmar. 

Bridging this gap between global companies and local communities is therefore vital to ensuring sustainable peace and democracy in post-conflict countries.

Content Moderation and Freedom of Expression: Bridging the Gap between Social Media and Local Civil Society


Read our country reports

Bosnia and Herzegovina

Indonesia

Kenya

Global problem, local solution

ARTICLE 19, together with our research participants, has proposed a solution: local Coalitions on Freedom of Expression and Content Moderation.

These coalitions would allow consistent engagement between social media platforms and local civil society organisations, which would contribute to bridging the gap between global tech giants and local communities. 

Our research provides more information on these coalitions. For each country, we outline practical steps for creating them, provide detailed risk assessments, and suggest potential members.

Edwin’s story

What are we asking social media companies to do?

Learn more about ARTICLE 19’s work on content moderation

Regulating content moderation: Who watches the watchmen?
08.12.2021 7 min read

Social Media Councils: One piece in the puzzle of content moderation
12.10.2021 4 min read

Side-stepping rights: Regulating speech by contract
19.06.2018 5 min read

International: The Santa Clara Principles and the push for transparency
08.12.2021 3 min read

EU: Digital Services Act crisis response mechanism must honour human rights
13.04.2022 8 min read


“The launch of ARTICLE 19’s reports on the first International Day to Counter Hate Speech is a significant contribution of the UNESCO ‘Social Media 4 Peace’ project to curbing hate speech on social media while protecting freedom of expression.”

Mr Tawfik Jelassi, UNESCO Assistant Director-General for Communication and Information


“In conflict-affected and fragile contexts, there is often a lack of governance mechanisms to deal with the growing challenge of harmful content online. This is why we support efforts to address this growing challenge.”

Marc Fiedrich, Acting Director and Head of Service for Foreign Policy Instruments (FPI) – European Commission


The Social Media 4 Peace project is run by UNESCO and funded by the EU
