
#MissingVoices

While social media platforms offer valuable spaces to connect, they also hold immense power over the information we see online.

Using algorithms and human moderators, both of which are prone to mistakes and bias, they are removing large amounts of content in error, silencing millions of people. This particularly impacts women, LGBTQI people and minorities, who are already often denied a voice in society.

Censorship by social media platforms reduces dialogue, shrinks public knowledge for everyone and prevents us all from holding those in power to account.

We are speaking up for the missing voices by asking platforms for more transparency and the right to appeal decisions that stifle the right to freedom of expression.

The Facts

Facebook has around 2.2 billion active users (nearly a third of the world’s population), while YouTube and Twitter have roughly 2 billion and 330 million active users respectively.

Social media platforms are restricting speech in response to government legal requests: 74% of government requests to remove content from Twitter between July and December 2018 came from Russia and Turkey.

Journalists, artists, activists and marginalised groups are frequent targets for takedown on social media. It is estimated that 22% of all violations of women artists’ freedom of expression took place in the digital space (Freemuse report).

Our Call

We ask Twitter, YouTube and Facebook to make simple changes that will give everyone a fair and equal right to be heard.

A right to appeal

• We ask for users to have the right, and a clear mechanism, to challenge decisions when their content is removed. This means that content taken down in error or unjustly will have a second chance to be reviewed, and the right to expression restored.
• Platforms must also clearly explain how a person can appeal a decision, and give a reasonable response timeline with contact details for more information.

More transparency

• We ask for proactively published data on the number of content removals, the types of flaggers, the reasons for removal, how many appeals they receive, and the outcomes of those appeals.

FAQs

Social media companies hold a huge amount of power over our freedom of expression. The biggest platforms, Facebook, Twitter and YouTube, are used by billions of people around the world every day to share ideas, communicate with others, and learn about the world around them. Through the application of their content rules (Facebook’s Community Standards, Twitter Rules and YouTube Policies), these companies ultimately make the decision on what we can see and post on their platforms.

Right now, there are very limited safeguards in place to check how content rules are being applied, and how social media platforms are making decisions on content on their platforms. Decisions on content are made by a combination of human moderators and algorithmic tools, tasked with applying these rules to a vast range of content.

However, both people and technology are prone to bias and mistakes, meaning content is often removed and accounts blocked wrongly. Platforms also offer us very little information on exactly how they are making content decisions, how often wrongful takedowns are happening, or even how repressive governments might be using them to suppress online dissent.

This kind of information is essential to understanding the scope of the problem, as well as finding solutions.

We want to ensure that the #MissingVoices, who are taken offline by error or bias, have their freedom of expression protected through the ability to appeal content decisions, and have their posts and accounts reinstated when the process concludes in their favour.

This would provide a safeguard for freedom of expression online, and enable users to challenge these companies when they make bad decisions. We also want to see more transparency from Facebook, Twitter and YouTube on their content moderation decisions, to make sure human rights are being respected on their platforms, and that failures are addressed where they are not.

It’s hard to say exactly how many voices online are being wrongfully silenced by bad content decisions. But as an example, Facebook’s July–September 2019 statistics on enforcement of its rules on nudity and sexual activity show that 860,800 pieces of content were reinstated after being identified, through appeals or secondary checks, as taken down in error. And those are only the mistakes the company is aware of: it provides an appeal mechanism for only some types of content, and there are likely many more pieces of content that were never appealed or picked up by its checks.

Given the scale of this issue, it clearly affects the freedom of expression of all social media users. However, those who already face discrimination offline, including racial and religious minorities, LGBTQI people, and women, are often disproportionately affected by wrongful takedown, meaning they face being doubly silenced. Existing biases in decision-making, whether by algorithms or people, together with targeted flagging and takedown campaigns, mean that wrongful takedowns are further entrenching existing discrimination in online spaces and reducing the diversity of voices online.

An appeals process is not just about the substance of the content, but is a safeguard to protect the rights of a platform’s users. Essentially, it serves to protect everyone against arbitrary decision-making by social media companies in the enforcement of their content moderation policies.

This includes, but is not limited to, content flagged and removed as ‘hate speech’, ‘extremist’ content, or nudity. Right now, where appeals safeguards exist at all, they are often limited to certain types of content. We want to see a robust appeals process in place across all of these companies’ content moderation decisions.

We’re focusing on Facebook, YouTube and Twitter, given the size of their user base, and we have three simple demands for them. These demands will mean companies are more transparent and accountable to users for what they do, and that better safeguards exist to protect free speech online.

  1. Whenever companies take down user content or suspend an account, we want them to notify the user and clearly explain what content has been removed and why.
  2. When notifying users of a take down or account suspension, we want companies to give users the opportunity to appeal the decision, using clear and simple language to tell them how to do this, and giving them the opportunity to discuss the matter with a person.
  3. Finally, we want these companies to proactively publish much more detailed data on the numbers of complaints, content takedowns and appeals which have been made together with detail on the type of information that was removed and reinstated.

While these changes will not solve the many challenges of human rights protections on social media, they are an important first step to better protecting individuals’ freedom of expression in the often opaque and flawed process of content moderation, and making the companies that hold so much control over our expression more transparent and accountable.

We are inviting you to join us in calling on Facebook, Twitter and YouTube to uphold their human rights responsibilities and better protect freedom of expression online.

Together with activists and organisations from across the world, we are asking you to promote our campaign messages on our Facebook page and on our Twitter page and encourage others to do the same.

We will be announcing campaign actions as they develop, and you can keep up to date with these when you sign up for the latest news.