Current content moderation practices allow social media companies to wield significant power over people’s right to freedom of expression. Despite this, social media platforms remain largely unaccountable for the way they moderate content.
As of October 2021, the only existing external self-regulatory mechanism for the oversight of content moderation, the Facebook Oversight Board, has rendered 15 decisions. While Facebook is bound by the Oversight Board’s decisions on whether a particular piece of content should be allowed on the platform, general recommendations that the Oversight Board makes are not binding. The long-term effectiveness of this model remains to be assessed. No other social media giant has committed to a similar initiative.
Recent legislative developments are guided by an ambition to make ‘Big Tech’ accountable. Among planned legal initiatives that are likely to be globally influential, the UK Draft Online Safety Bill poses a severe risk of stifling freedom of expression. The draft EU Digital Services Act, while retaining the general conditional immunity from liability for hosting providers and the prohibition on general monitoring obligations, raises a number of concerns about the protection of freedom of expression online.
Along with monitoring legal developments in the regulation of social media platforms, ARTICLE 19 has been developing a model for a multi-stakeholder, voluntary-compliance mechanism, known as a Social Media Council (SMC). The SMC provides a transparent and independent forum to address content moderation issues on social media platforms on the basis of international human rights standards.
The key objectives of the SMC are to:
Review individual content moderation decisions made by social media platforms on the basis of international standards on freedom of expression and other fundamental rights. The right of appeal gives the SMC more credibility in the eyes of the public and gives individual users an opportunity to be heard on matters that directly impact them.
Provide general guidance on content moderation guided by international standards on freedom of expression and other fundamental rights. While there is a growing consensus on the relevance of international human rights law for content moderation, this is still an emerging field with many open questions.
Act as a forum where all stakeholders can discuss and adopt recommendations (or interpretations thereof). This participatory methodology promotes collective adoption and interpretation of guidelines and can help embed international standards in content moderation practices.
Use a voluntary-compliance approach to the oversight of content moderation where social media platforms and all stakeholders sign up to a model that does not create legal obligations and where they voluntarily implement the SMC’s decisions and recommendations. The SMC will be a self-regulatory mechanism where representatives of the various stakeholders come together to regulate the practices of the sector.
Since our initial publication on SMCs in 2018, ARTICLE 19 has consulted a wide range of actors about this mechanism and has taken the first steps towards setting up a pilot SMC in Ireland.
This report brings that information and experience into the public domain.