Social Media and Content Moderation

There are currently around 2.2 billion active Facebook users and 330 million Twitter users, while 1 billion hours of video are watched daily on YouTube. These new giants have caused a huge disruption in how information and content are distributed, the effects of which are still becoming clear.

As social media companies hit the news – from Cambridge Analytica to Facebook’s role in the Myanmar crisis – the calls for a solution have become louder. The stakes are high but little progress is being made. However, viable solutions may be on the horizon, including a self-regulatory media council. [1]

The way that technology companies – particularly social media networks – function can be seen as a monopoly: one that distorts the market and violates both consumer and human rights. If a company controls a large part of the social media market, as giants like Facebook and Twitter do, it becomes not only an economic gatekeeper but also a human rights gatekeeper – particularly in terms of expression and privacy.

In a free and competitive market, abusive terms of service would be unsustainable: individuals would shift to other platforms offering better terms that protected their expression. In the world of social media, however, simply switching to another platform is not possible, due to the network effect – whereby the value of a product or service increases with the number of people using it. Competition law may therefore be a key part of a toolkit to check the power of platforms and service providers. [2]

[1] ARTICLE 19, Self-Regulation and ‘Hate Speech’ on Social Media Platforms, 2018, available at ‘hate-speech’-on-social-media-platforms_March2018.pdf

[2] ARTICLE 19, How Can Competition Law Help to Secure Freedom of Expression on Social Media?, 23 November 2018, available at