International: A new policy to tame Big Tech and protect expression

CEO of Twitter Jack Dorsey appears on a monitor as he testifies remotely during the Senate Commerce, Science, and Transportation Committee hearing 'Does Section 230's Sweeping Immunity Enable Big Tech Bad Behavior?', on Capitol Hill in Washington, DC, U.S., October 28, 2020. Michael Reynolds/Pool via REUTERS

Taming Big Tech: Protecting freedom of expression through the unbundling of services, open markets, competition, and users’ empowerment

Today, ARTICLE 19 publishes a new policy addressing how the excessive market power of large social media platforms undermines media diversity, restricts the right to freedom of expression, and limits the free flow of information. ARTICLE 19 sets out how competition and market-regulation tools can be used to achieve open, fair, and decentralised digital markets in which freedom of expression and other human rights are adequately protected, and in which no single entity, private or public, can control the flow of information in society.

Although social media platforms are important spaces in which people connect, engage, and share and access information, the platforms' business model, based on advertising and monetising users' attention, has amplified 'hate speech' and 'disinformation'. Governments around the world are looking for ways to address these problems. Many proposed solutions focus on content moderation, i.e. what social media companies allow on or remove from their platforms. Because such proposals regulate speech directly, they can do more harm than good to users' rights. Another new ARTICLE 19 policy, Watching the watchmen, tackles this in more detail.

But content moderation is only part of the problem. Governments, policymakers, and we as individuals and communities must devote equal attention to the excessive market power that social media companies exercise, which plays a fundamental role in content-moderation challenges.

These platforms act as gatekeepers of information, directly affecting what content we can and cannot see, which has an impact on public debate. Because of this, they are also gatekeepers of human rights, directly affecting our rights to privacy and freedom of expression. To fix these challenges, we must address this gatekeeping. 

ARTICLE 19’s policy offers a pro-competition, regulatory solution to these challenges: the unbundling of hosting and content-curation services on large social media platforms. Currently, platforms both host content and curate it using algorithms and moderators. These two services are offered together as one ‘bundle’. This allows large social media platforms to shield themselves from competitors that would offer better curation services, while locking in users and depriving them of alternatives.

ARTICLE 19 believes that this problem can be solved by regulators ordering large platforms to ‘unbundle’ hosting and content-curation activities, and to allow third parties to offer content curation to users. New providers would then be able to compete with the Big Tech giants for users. This would mean that, for instance, Facebook would ask its users whether they want Facebook itself or other players, which they could freely select, to provide the content-curation service.

At the same time, this unbundling obligation must be enforced by independent and accountable regulatory authorities, both in law and in practice. Regulators will also set human rights-compliant standards for the provision of content curation by all players, and set rules on technical standards.

For users, unbundling will mean there are finally concrete and viable alternatives to switch to, based on their preferences, without needing to leave the platform. 

For providers, there will be easier access to users, and incentives to compete with one another to provide content-curation services that best safeguard users’ privacy and free expression, among other rights.

ARTICLE 19’s policy outlines this proposal in greater detail, and makes the following recommendations:

  1. States should adopt regulation to impose the unbundling of hosting and content curation on large social media platforms. These platforms will not do this unless they are made to, as their current approach locks in users and deters competitors, making it highly lucrative.

  2. Independent regulatory authorities should be the ones to enforce this regulation. To protect users’ free expression, it is crucial that these authorities remain independent from public and private power.

  3. The unbundling of services should be shaped as a form of functional separation. In other words, our proposal does not mean that a large platform needs to divest assets. The large platform that provides the hosting should remain free to offer content curation too, so that users can freely choose which company provides them with this service. To guarantee a real choice, the option to select the large platform has to be presented to the user as an opt-in.

  4. Independent regulatory authorities should ensure the unbundling rules are implemented effectively. We offer preliminary recommendations for how regulators can do this in our policy.

  5. States should introduce regulation to make sure all social media companies, not just the biggest platforms, base their content-curation rules on human rights. This should include obligations to improve transparency over content-moderation decisions, and to improve systems to resolve any disputes these decisions cause.

It is high time we tackled the excessive market power of Big Tech. ARTICLE 19’s proposal would lead to better protection of freedom of expression, pluralism, and diversity, as well as to far more open, fair, and decentralised digital markets that enable the free flow of information in society. This would be a win-win for both social media users and alternative players.

 

Read the policy