Eastern Africa: Call for more transparency in Facebook content moderation

ARTICLE 19 Eastern Africa calls for greater transparency in Facebook’s content moderation and reporting processes regarding the removal of accounts in Sudan and Ethiopia in 2021. We are concerned by the incomplete transparency reporting surrounding the account removals in both countries.

In June 2021, Facebook released the ‘Coordinated Inauthentic Behavior Report’ detailing its take-down of accounts being used to ‘manipulate public debate.’ According to the report, accounts from seven countries, including Sudan and Ethiopia, were taken down. Facebook defines coordinated inauthentic behavior (CIB) as ‘domestic, non-government campaigns that include groups of accounts and Pages seeking to mislead people about who they are and what they are doing while relying on fake accounts.’ Facebook states that it removes both inauthentic and authentic accounts, Pages and Groups directly involved in this activity. 

We commend Facebook for its efforts to curb information disorders in Eastern Africa and beyond. However, we maintain that companies’ transparency reporting must be comprehensive, especially where highly restrictive sanctions such as permanent removal for non-compliance with Terms of Service have been applied, and where accounts and content have been removed following recommendations from third parties. This is not only important for the protection and promotion of the right to freedom of expression, but also integral to Facebook’s obligation to respect human rights, as affirmed in the UN Guiding Principles on Business and Human Rights.

In Sudan, Facebook removed ‘53 Facebook accounts, 51 pages, three groups, and 18 Instagram accounts’ that targeted domestic audiences and were linked to individuals including those associated with the Future Movement for Reform and Development, a political party in the country. Facebook stated that it was alerted to the existence of this network following reports from a third party, researchers at Valent Projects, a digital agency for social impact. ARTICLE 19 is concerned that, although the report provides some information about concerns with the accounts, Facebook’s CIB report does not detail the information it received from the researchers at Valent Projects, which led to the removal of accounts on both Facebook and Instagram. Further, ARTICLE 19 notes that the report does not specify whether the account holders were given an opportunity to respond before the removal, despite the impact of this action on freedom of expression and access to information online in Sudan.

In Ethiopia, Facebook removed ‘62 Facebook accounts, 49 Pages, 26 Groups, and 32 accounts on Instagram’ that targeted domestic audiences and were linked to individuals associated with the Information Network Security Agency (INSA) in Ethiopia. Facebook stated that it found this network as part of an internal investigation into suspected coordinated inauthentic behavior in the region. The conclusive findings of the internal investigation that were shared with the public do not disclose the administrative process followed, effectively denying Ethiopian citizens, and the public in general, the opportunity to review the conduct of their government in what constitutes a matter of public interest.

We call on the Facebook Oversight Board to investigate Facebook’s decision to permanently remove, rather than temporarily or indefinitely suspend, the accounts in Sudan and Ethiopia, given their close connection to democracy, freedom of expression and the public interest.

ARTICLE 19 Eastern Africa will continue to engage with both Facebook and stakeholders affected by content moderation.


Contact

Mugambi Kiai, Regional Director at [email protected]