Taming Big Tech: Protecting expression for all

Governments around the world are seeking to regulate how social media companies address problematic content on their platforms, especially hate speech, harassment, and disinformation. 

But while well-intentioned, their proposals risk doing more harm than good, and they fail to tackle the real problem: the excessive power of a few huge companies whose business models are inherently exploitative. 

ARTICLE 19’s policies set out a solution that would not only protect freedom of expression and privacy online but also – finally – give us all a viable alternative to Big Tech.

The problem with platforms

Social media networks are a vital space for us to connect, share, and access information. But because the business models of large social media platforms rely on capturing our attention and selling it to advertisers, their algorithms are designed to keep us engaged for as long as possible – including by amplifying problematic content like hate speech and disinformation.

Governments have come up with various proposals to address this. Yet rather than tackling the flawed business model, many of their so-called solutions focus on what kinds of content people should and shouldn’t be allowed to post or access on social media. 

This gives large platforms even more power to police what we see, say, and share online – with disastrous consequences for public debate, the free flow of information, and democracy itself.

The conversation about social media has become marred by ‘regulatory drama’: we don’t want state intrusion, but we do want better regulation. 

How should this drama be resolved? 

ARTICLE 19 has a two-pronged solution.

ARTICLE 19’s solution

1. How to regulate content moderation while protecting freedom of expression

First, our policy Watching the Watchmen sets out how governments can ensure their efforts to regulate platforms respect users’ freedom of expression, improve platforms’ transparency, accountability, and decision-making, and – crucially – avoid giving even greater power to the handful of companies that dominate the digital sphere.

Watching the Watchmen

How to regulate content moderation while protecting free expression

Read our policy paper

But setting human rights standards for social media services addresses only part of the problem. 

Currently, a few platforms dominate the social media markets, exploit their users, and violate our rights to privacy, free expression, and non-discrimination. And the lack of viable alternatives locks us into these exploitative relationships.

To truly fix problems in the social media markets, we must tackle the excessive market power of the few huge corporations that control them.

2. How to tackle the excessive market power of social media giants

ARTICLE 19’s second policy, Taming Big Tech, shows how to do just that. 

It lays out a pro-competition solution that would transform social media – from a closed space, controlled by a handful of exploitative companies and riddled with hate speech and disinformation, to an open and diverse space where we have a real choice between service providers and can step out of exploitative relationships.

Taming Big Tech

 How to tackle the excessive market power of social media giants

Read our policy paper

Taken together, these two proposals would protect freedom of expression, media pluralism, and diversity, and lead to more open, fair, decentralised platforms that enable the free flow of information. 

This would be a win–win: for social media users, for smaller service providers, and for society and democracy more broadly.

Key questions answered

Isn’t it a good thing that governments are addressing problematic content on social media?

It’s certainly encouraging that governments want to tackle online abuse, hate speech, and other problematic content – problems that the biggest platforms have repeatedly failed to address, and that drive many users away. 

But while their intentions might be understandable, many of their actual proposals would do more harm than good. 

This is because, while they claim to be about regulating platforms, governments’ proposals are really more about regulating users’ speech. Effectively, governments are asking platforms to police us and decide what kinds of speech are ‘illegal’ – or even ‘legal but harmful’. This would give a few huge companies even more power.

ARTICLE 19’s policy Watching the Watchmen outlines how governments can regulate platforms’ content moderation and content curation in a way that protects users’ rights. 

But current content-moderation and -curation systems are only part of the problem. Governments must also address the excessive market power of the dominant tech companies.

This excessive market power is at the heart of many free expression challenges. That’s why our two-pronged approach both prevents further concentration of that power and sets out an innovative solution that would create more open, fair, decentralised social media markets.

As such, ARTICLE 19’s two policies represent two sides of the same coin – and both solutions are necessary to protect users’ rights.

If people don’t like social media, why don’t they just stop using it? 

Worldwide, over three-quarters of people aged 13+ use social media, and we spend an average of 2.5 hours on the platforms every day.

Simply leaving is not an option for most of us because social media is so central to every aspect of our lives.

From staying in touch with friends and family to shopping, from keeping up with world news to participating in community forums, from pursuing education to partaking in our hobbies, and from organising a birthday party to exchanging information about protests, social media has become the digital village green, town square, and city hall. 

Suggesting that people can simply leave social media therefore reflects a very privileged position. For many, leaving would be akin to leaving their communities – even their societies – and being cut off from basic services.

Nor should we have to leave, given that there is no competing service to switch to – which is itself a result of Big Tech’s market domination.

We are stuck between a rock and a hard place. 

The only escape route is to tackle the excessive power of Big Tech.

Why does it matter that just a few companies control our online spaces?

Monopolies of any kind are bad for society. They control the market, lock us into using their goods or services, and have no incentive to improve – after all, there’s no competition or alternative.

No single entity – private or public – should control the flow of information in society. Yet the excessive market power of the large social media platforms, coupled with their popularity as a source of information and their power over what we see, means they can do just that. 

This makes the dominant platforms gatekeepers: not only of the market (because they can see off competitors and lock in users) but also of human rights (because they can grant or restrict our rights to privacy, free expression, and other fundamental rights). 

To fix these challenges, we must dilute this power and keep it in check.

What is ARTICLE 19’s solution to Big Tech’s excessive power?

ARTICLE 19’s policy, Taming Big Tech, offers a single solution to both of these problems: the failings of platforms’ current content-moderation and -curation systems, and the excessive market power of the companies that own them.

We propose separating (also known as ‘unbundling’) two services that large platforms currently provide as one package (one ‘bundle’): (1) hosting content, and (2) curating content.

Currently, platforms both host our content (i.e. we can create our profiles and post on their platform) and curate it (i.e. they use their own algorithms to create our timeline or newsfeed: what we see on their platform). They offer us no choice in this: they present hosting and curation – two distinct services – as one inseparable package. 

But there is no reason why they should be inseparable. The only reason they currently are is that bundling them allows the dominant companies to lock out competitors, lock in users, and maximise their already-outrageous profits (Meta, for example, made a profit of $39.37 billion in 2021).

Separating these two services would mean the large platforms could still host our content (i.e. we could still use our existing profile on their platform), but they would have to allow third parties to curate it (i.e. create our timeline or newsfeed). Third parties could then compete with the Big Tech giants to curate what we see in more diverse ways, offering us greater control and choice, and breaking up the current monopoly.

This would mean that, for instance, Facebook would have to ask us whether we want Facebook itself or another company – which we could freely select – to curate content for us. We could then select a company that prioritises privacy, or simply one that specialises in a subject we’re interested in (be that football, hip-hop, or climate change), to curate what we see in our newsfeed.

Of course, some users would be happy for Facebook to continue to both host and curate their content – and that would also remain an option. The crucial factor here is user choice. 
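
To make the unbundling idea concrete, here is a minimal sketch in TypeScript of how hosting and curation could be exposed as two separate services. Every name in it (Post, HostingService, CurationProvider, and so on) is hypothetical, invented purely for illustration – it is not drawn from any real platform’s API, nor is it the specific design set out in Taming Big Tech.

```typescript
// A minimal, hypothetical sketch of unbundled hosting and curation.
// None of these names comes from a real platform API.

interface Post {
  id: string;
  authorId: string;
  body: string;
  postedAt: Date;
}

// The hosting service stores profiles and posts, and must expose them
// to any accredited curation provider on fair, non-discriminatory terms.
interface HostingService {
  fetchCandidatePosts(userId: string, limit: number): Promise<Post[]>;
}

// A curation provider turns those posts into a timeline. Competing
// providers (privacy-first, topic-focused, chronological, ...) would
// all implement this same interface.
interface CurationProvider {
  name: string;
  buildTimeline(userId: string, candidates: Post[]): Post[];
}

// Example third-party curator: a simple reverse-chronological feed.
const chronologicalCurator: CurationProvider = {
  name: "chronological",
  buildTimeline: (_userId, candidates) =>
    [...candidates].sort(
      (a, b) => b.postedAt.getTime() - a.postedAt.getTime()
    ),
};

// The host hands the same candidate posts to whichever provider the
// user has chosen; the user's profile and connections stay put.
async function renderTimeline(
  host: HostingService,
  curator: CurationProvider,
  userId: string
): Promise<Post[]> {
  const candidates = await host.fetchCandidatePosts(userId, 200);
  return curator.buildTimeline(userId, candidates);
}
```

The design point is that any provider can implement buildTimeline, so switching curators means nothing more than passing a different provider to renderTimeline – the user keeps their existing profile, friends, and followers on the host.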

But the current model is so lucrative for the large platforms that they aren’t going to change voluntarily. That’s why ARTICLE 19’s solution requires independent and accountable regulators to enforce and oversee it. 

Crucially, we need both unbundling and human rights-compliant standards that all content-curation providers, from the smallest to the largest, would have to adhere to. 

How would our solution benefit people, providers, and society?

As ARTICLE 19’s policy Taming Big Tech lays out, separating content hosting from content curation and providing access to competitors would have innumerable benefits: 

  • For individuals, it would finally give us concrete and viable alternatives to the biggest platforms’ content-curation systems. We could select a company to curate our content based on a variety of criteria – for example, how well they protect our privacy or give us access to a plurality of perspectives – conditions that are essential to making informed choices and protecting democracy. And we wouldn’t even need to leave the platform: we could keep our existing profile, friends, and followers.
  • For smaller providers, it would grant easier access to users and an incentive to compete with one another to provide content curation that best serves users’ interests – including safeguarding our privacy and online speech.
  • For society, it would lead to far more open, fair, diverse, and decentralised social media markets that enable the free flow of information – a healthier environment for free speech all round.

In other words, our solution would be a win–win for social media users, smaller providers, and society at large.

Who should fund our solution?

In 2021, Meta – which owns Facebook, Instagram, Messenger, and WhatsApp – made a profit of $39.37 billion

That’s more than the GDPs of over half the world’s countries – and more than the combined GDPs of the poorest 31 countries.  

Around 97% of Meta’s total revenue comes from advertising: that is, from monetising users’ attention and selling it on. 

We work for them.

We work for free.

And they make their billions by invading our privacy, controlling what we can see and say online, and amplifying conflict, disinformation, and hate speech. 

This is no accident; it’s their entire business model. 

They intentionally create these problems – which are profitable for them yet disastrous for society – while pretending they are making the world a better place for us all. 

Isn’t it about time they paid for solving the problems they created?

That’s why ARTICLE 19 believes the biggest platforms should foot the bill for separating content hosting from content curation. 

And it’s why governments should impose a levy on the biggest platforms to fund our other solutions, like giving people access to dispute-resolution mechanisms when their content is wrongfully removed, and supporting new business models for platforms that create social value – rather than simply extracting it from us.

Podcast: Taming the Titans

Episode 5: Momentum is building – so where now?

This week, we talk about a crucial third force in this discussion – beyond state and business: civil society – groups of citizens advocating for human and democratic rights and for the good of societies. A global debate is coming: legislators and regulators in different parts of the world will need to adapt emerging regulatory tools and concepts to their own context and markets. What can we learn from the process of negotiations around the Digital Markets Act in Europe and how can it be replicated in other contexts? Is it even the right template? How can we weave together a global civil society to make sure that people’s voices are really heard in this growing conversation?

Recommendations

For large social media companies

Separate content hosting from content curation, and ensure third-party players have access to users in order to offer them alternative curation services. 

This separation (unbundling) of services should take the form of functional separation. In other words, our proposal does not mean that a large platform needs to sell off the content-curation part of its business. The platform that provides the hosting should remain free to offer content curation too, so that users can freely choose which company provides them with this service. To guarantee a real choice, the large platform’s own curation service must be presented to the user as an opt-in.

For governments

Adopt regulation to require large social media platforms to separate their hosting and content-curation services and to provide fair, reasonable, and non-discriminatory access to third-party providers. 

Currently, platforms both host content on their platforms and curate it using algorithms. Separating (or ‘unbundling’) these two functions would offer users greater choice over which company curates their newsfeed or timeline, giving them a viable alternative to switch to and greater control over the information they see, without needing to leave the platform they currently use. It would also encourage providers to compete with each other to provide a service that best safeguards users’ privacy, represents plural perspectives, and thus strengthens informed debate and improves public knowledge overall. ARTICLE 19’s new policy, Taming Big Tech, deals with this in greater detail.

This unbundling must be mandatory, and it needs to be enforced and monitored by independent authorities. Large platforms will not separate these two functions unless they are made to, because doing so would expose them to competition. 

We offer preliminary recommendations for how regulators can effectively implement the unbundling of content hosting from content curation in our policy, Taming Big Tech.

Introduce regulation to make sure all social media companies, not just the biggest platforms, base their content-moderation and content-curation rules on human rights. 

Transparency, accountability, and the protection of human rights must form the overarching principles of any regulatory framework for social media platforms. 

Regulation should include obligations to improve transparency over content-moderation decisions, and to improve systems to resolve any disputes these decisions cause. 

Governments must not impose a general obligation on social media companies to monitor content. This would likely lead to increased censorship and silence far too many voices.

Any framework to regulate content moderation must:

  • Be strictly limited in scope. It should focus only on illegal (not ‘legal but harmful’) content, should not apply to private-messaging or news services, and should only apply in the country that passes the regulation.
  • Clearly define obligations. These should include transparency and an obligation to protect privacy and promote media diversity. They should not include compliance targets or an overly broad ‘duty of care’ to prevent ‘harm’.
  • Be proportionate. Governments should not adopt measures that, while intended to hold large social media companies to account, in reality impose an undue burden on smaller services.
  • Provide access to effective remedies for users. These should include internal complaint mechanisms, access to judicial remedies, and alternative dispute-resolution mechanisms.
  • Maintain platforms’ conditional immunity from liability for third-party content, but clarify its scope and the applicable notice-and-action procedures. Removing or limiting platforms’ immunity from liability would give them an incentive to remove either too much or too little content.

ARTICLE 19’s new proposals finally give us concrete and viable alternatives to the biggest platforms. They would allow us to select providers to curate our content based on how well those providers protect our privacy. Vitally, they would ensure that our communities and societies have access to a plurality of perspectives – essential conditions for making informed choices and protecting democracy.

Find out more

  • Digital markets: Why competition is good for freedom of expression (27.07.2022)
  • Bridging the Gap: Local voices in content moderation (06.06.2022)
  • Twitter: What Elon Musk must do to protect free speech (27.04.2022)
  • International: Why Haugen’s Facebook testimony misses the point (10.11.2021)
  • US: A Capitol riot and Big Tech takes a stand: but is it the one we want? (12.01.2021)
  • Why decentralisation of content moderation might be the best way to protect freedom of expression online (30.03.2020)
  • #MissingVoices Campaign (10.12.2019)