Facebook must be more transparent and accountable over harmful content takedowns

ARTICLE 19 has responded to Facebook’s announcement that it will create 1,000 jobs in the UK, including more people to tackle harmful content on the platform.

Acting Executive Director of ARTICLE 19 Quinn McKew said:

“Facebook, and other social media platforms, are under immense pressure to remove harmful content, but we need to ensure that in doing so they do not threaten global freedom of expression.

“As the world’s biggest social media platform, Facebook’s processes for removing content can have a serious impact on artists, human rights organisations, journalists and activists who rely on social media for their work and connections. This is something ARTICLE 19 has had first-hand experience of recently, when a member of staff had their account temporarily blocked after trying to upload a video which, ironically, was aimed at raising awareness of censorship by social media platforms.

“Facebook needs to improve its transparency and accountability, particularly with the increased use of AI to take down content automatically.”

ARTICLE 19 is calling for Facebook to be more transparent about how it identifies content as harmful, the amount of content it removes and its error rates. It also needs to put in place a proper appeals process for when content is removed or users’ accounts are closed down in error.

ARTICLE 19 Facebook account suspended

An ARTICLE 19 staff member had their personal Facebook account blocked for three days after trying to upload a video aimed at raising awareness of censorship by social media platforms.

Facebook also blocked the uploading of ARTICLE 19 campaign videos that aimed to raise awareness of artistic censorship. The videos featured the work of Borghildur Indriðadóttir, an Icelandic visual artist whose work was censored by Facebook in 2018 because it breached Facebook’s nudity and nipple ban. However, in its Community Standards, Facebook says that its “nudity policies have become more nuanced over time” and that it now allows “photographs of paintings, sculptures and other art that depicts nude figures”.

Missing Voices campaign

ARTICLE 19’s Missing Voices campaign is calling for more transparency and the improvement of dispute resolution when social media platforms remove content and close down users’ accounts under their policies and community guidelines.

We are calling for Facebook, Twitter and Google to:

  • Provide more transparency about the number of content removals, types of flaggers, reasons for removal, how many appeals they receive and the outcomes of those appeals.
  • Give all users the right to appeal when their content is removed or their account is closed down.

The campaign has found that journalists, artists, activists and marginalised groups are often targets for takedown on social media. According to a 2018 report by Freemuse, “Women artists suffer being silenced not only by governments and religious structures, but also by social media platforms, such as Facebook, Instagram, Twitter and YouTube. These companies increasingly remove content they deem indecent or by request from authorities.”


For more information, contact [email protected].