Why decentralisation of content moderation might be the best way to protect freedom of expression online

In December 2019, Twitter CEO Jack Dorsey announced that Twitter was funding research into creating a decentralised standard for social media. The idea is that, by creating an open standard that could be used by a limitless number of content moderation providers, control over content (and data) would no longer be held by a few dominant companies, and those companies would consequently lose their outsized power over freedom of speech online.

The proposal presents a possible solution to a complex freedom of expression problem. As a freedom of expression organisation, ARTICLE 19 is worried about how current social media content moderation practices allow certain companies to wield significant power over freedom of expression online. Content moderation allows companies to decide what we can post and see online, based on rules they determine and implement, with limited oversight.

We have been talking about this topic a lot, and we are not the only ones. The problem of a few private companies’ control of freedom of expression and access to information online is one that many public and private actors are trying to solve, but so far with little success.

Facebook’s approach has been the creation of its own Oversight Board to provide additional, independent review, and recommend subsequent action to Facebook. ARTICLE 19’s counter-proposal, what we call Social Media Councils (SMCs), envisages the creation of open, participatory and accountable bodies, made up of various actors working at the national level, to safeguard freedom of expression online. We have outlined the benefits of our proposal compared to Facebook’s here. Both solutions are designed as a check on the current system of content moderation by powerful platforms. The Twitter proposal takes a different approach: instead of creating safeguards around platforms’ current content moderation, it would create an entirely new system.

Twitter’s decentralisation proposal

Although Twitter has yet to say more about the proposal, called the bluesky initiative, it is understood that it will enable social media platforms to function in a similar manner to email, creating an interoperable network which enables users to connect across different, distinct platforms. Dorsey acknowledged the challenges of the current, centralised standard for social media platforms and cited the difficulties of “centralised enforcement of global policy to address abuse and misleading information [which] is unlikely to scale over the long-term without placing far too much burden on people.” By creating protocols which no single organisation controls in a decentralised network, Twitter’s proposal is presented as “enabling consumer choice” between different content providers.
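To make the email analogy concrete, the sketch below shows how a post addressed to a user on another platform might be routed under a shared, open protocol, much as mail servers relay messages between independent hosts. It is only an illustration under assumed names (the handle format, the /inbox endpoint and the Post shape are our own inventions); Twitter has not published any technical detail of bluesky.

```typescript
// Hypothetical illustration of an email-like open standard for social media.
// Nothing here reflects the actual bluesky design, which has not been published.

interface Post {
  author: string; // e.g. "alice@platform-a.example"
  to: string;     // e.g. "bob@platform-b.example"
  body: string;
}

// A handle such as "bob@platform-b.example" identifies both the user and the
// host, just as an email address does, so no single company controls the namespace.
function hostOf(handle: string): string {
  const at = handle.lastIndexOf("@");
  if (at === -1) throw new Error(`Invalid handle: ${handle}`);
  return handle.slice(at + 1);
}

// Delivery resolves the recipient's host and hands the post to that host's
// (assumed) inbox endpoint, analogous to SMTP relaying mail between servers.
async function deliver(post: Post): Promise<void> {
  const host = hostOf(post.to);
  await fetch(`https://${host}/inbox`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(post),
  });
}

// Usage: a user hosted on one platform posts to a user hosted on another.
deliver({
  author: "alice@platform-a.example",
  to: "bob@platform-b.example",
  body: "Interoperability means we can talk across platforms.",
}).catch(console.error);
```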

Twitter’s Chief Technology Officer, Parag Agrawal, also acknowledged that moving towards a decentralised standard would present several challenges, including the difficulty of finding “viable and sustainable incentives and business models” for the companies adopting the standard. Right now, dominant platforms all use similar, data-driven business models, which might not be easy to replicate in a decentralised environment. However, Dorsey also noted that the fundamentals of such a decentralised system already exist.

While it’s not the first proposal of its kind, it has once again focused attention on how to tackle the dominance of a few, large social media companies, and their influence on freedom of expression online.

How decentralisation protects free expression

We believe that centralised control of key communication services, at both the content and infrastructure levels, in the hands of a few dominant companies is creating significant harm to individuals’ rights. This level of control means companies can impose restrictions on users’ freedom of expression (and other rights, such as privacy) without accountability. Users are largely unable to ‘opt out’ if they wish to continue engaging in key online communication spaces.

What remains to be seen is whether Twitter’s decentralisation proposal will encompass the guarantees and safeguards necessary to ensure the protection of users’ rights, especially freedom of expression. In order for any proposal to fully address the harms of centralised content moderation, it must be approached in a way that is truly independent from the control of powerful companies, applies sector-wide, addresses the relevant market failure, and observes human rights standards.

While decentralisation has a number of merits when it comes to addressing the various challenges raised by content moderation, it is not necessarily sufficient to solve all problems. Additional or complementary measures may remain necessary if we want to lay down fundamental standards for content moderation. For example, decentralisation may render it more difficult to implement common standards on how to respond to illegal content, or to avoid the spread of disinformation.

Decentralisation through unbundling 

A decentralised standard is an exciting potential development. At ARTICLE 19, we think that one of the most promising elements of it – the reduction of the bottlenecks created by the centralised enforcement of content moderation – can also be achieved through another, more comprehensive form of decentralisation, known as unbundling.

It is the ‘bundling’ of different services by dominant social media companies that enables them to have such a tangible impact on freedom of expression online, controlling both the hosting of the social media platform and the moderation of content on the platform. Unbundling refers to the separation of the hosting and content moderation services performed by dominant social media platforms. Through unbundling these two services, dominant social media platforms would still be able to moderate the content on their platforms, but they would also be obliged to allow competitors to provide competing content moderation services on those platforms.
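As a rough sketch of what this separation could look like in practice, the hypothetical TypeScript below has the hosting platform serve posts while delegating moderation decisions to whichever provider a user has chosen. The interface and names are our own illustration for this article, not a specification or an existing API.

```typescript
// Hypothetical sketch of unbundling: hosting and content moderation as separate services.

interface Post {
  id: string;
  author: string;
  text: string;
}

type Decision = "show" | "label" | "hide";

// Any company could implement this interface and compete on the quality
// (and rights-friendliness) of its moderation service.
interface ModerationProvider {
  name: string;
  review(post: Post): Promise<Decision>;
}

// The dominant platform keeps its in-house moderation as one provider among many...
const inHouse: ModerationProvider = {
  name: "platform-default",
  review: async (post) => (post.text.includes("banned-word") ? "hide" : "show"),
};

// ...while a competitor offers an alternative policy, e.g. labelling rather than removing.
const rightsFocused: ModerationProvider = {
  name: "rights-focused-coop",
  review: async (post) => (post.text.includes("banned-word") ? "label" : "show"),
};

// The hosting service stores and serves posts, but applies whichever provider
// each user has selected: hosting and moderation are no longer bundled together.
async function renderTimeline(posts: Post[], provider: ModerationProvider): Promise<void> {
  for (const post of posts) {
    const decision = await provider.review(post);
    if (decision !== "hide") {
      console.log(`[${provider.name}/${decision}] ${post.author}: ${post.text}`);
    }
  }
}

// Usage: the same timeline, moderated according to two different providers.
const timeline: Post[] = [{ id: "1", author: "alice", text: "hello, banned-word!" }];
renderTimeline(timeline, inHouse).then(() => renderTimeline(timeline, rightsFocused));
```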

In addition, unbundling requires decentralising the marketplace of content moderation on social media at both the contractual and the technical layers. This would affect the contractual relationship between the dominant platform and content moderation providers (the contractual layer), as well as require an adequate level of interoperability between content moderation providers (the technical layer). Unbundling is therefore a more holistic approach, aimed at the creation of a more open and sustainable marketplace.
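At the technical layer, that interoperability would most likely mean agreeing a common format in which any provider can express its decisions, so that switching provider becomes a settings change rather than a move to a new platform. The fragment below is again only an illustration under assumed names; no such shared standard currently exists.

```typescript
// Hypothetical common format for moderation decisions, shared by all providers
// so the hosting platform can consume any of them interchangeably.
interface ModerationDecision {
  postId: string;
  provider: string;                  // which provider made the call
  action: "show" | "label" | "hide";
  reason?: string;                   // human-readable explanation for the user
}

// A user's account settings record their chosen provider; changing the setting
// swaps the moderation service without moving the account or its data.
interface AccountSettings {
  handle: string;
  moderationProvider: string;
}

function switchProvider(settings: AccountSettings, newProvider: string): AccountSettings {
  // The contractual layer governs the platform/provider relationship; at the
  // technical layer the switch is just a change of which service the platform queries.
  return { ...settings, moderationProvider: newProvider };
}

// Usage: the user stays on the same platform but opts for a different moderator.
const updated = switchProvider(
  { handle: "alice@platform-a.example", moderationProvider: "platform-default" },
  "rights-focused-coop",
);
console.log(updated);
```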

Rather than relying on private companies’ initiatives (and putting greater power in their hands), unbundling relies on regulatory authorities taking action to ensure consumers benefit from a fair, open, and competitive marketplace. Regulatory authorities should lower the barriers to market entry and allow various competitors to provide content moderation services. New providers would then be able to compete with the existing dominant players for users on the basis of the quality of service they offer, including the level of rights protection they provide.

For users, unbundling will mean there are finally concrete and viable alternatives to switch to, based on their preferences about the characteristics of the service, without the need to leave the platform. For providers, there will be easier access to users and stronger incentives to compete with one another to provide the content moderation services which best safeguard users’ privacy and free expression.

How can unbundling be achieved?

To achieve the sort of change we envisage through unbundling, regulatory and competition authorities have a vital role to play. They should carefully design the unbundling remedy to be imposed on platforms, and to do so they should engage in a dialogue with platforms as well as with other relevant stakeholders.

Due to the high degree of information asymmetry currently in the market, regulators alone do not possess the information necessary to properly shape the unbundling requirement at the contractual and technical layers. Nonetheless, regulators should be the ones to lead this process and to closely monitor the compliance of market operators.

Importantly, any new proposals should also conform with the guarantees provided by the international human rights framework. This includes obligations to ensure that measures taken are transparent, and that users’ rights are duly respected. In particular, unbundling should be achieved while providing sufficient guarantees of privacy and data protection for users, and adequate measures should be taken to support the supply of services that ensure users’ exposure to a diversity and plurality of voices.

Twitter’s proposal represents a possible move towards a better solution, and we agree that an approach which diversifies control of hosting and content moderation could hold significant benefits for users’ human rights. However, we believe that unbundling presents advantages for both competitors and users compared to Twitter’s approach, and might lead to a more sustainable ecosystem for content moderation, and greater respect for freedom of expression online.

Read more:

ARTICLE 19 in Open Global Rights: Social media complicates mainstream media goals of pluralism and diversity

ARTICLE 19 in CIGI: The Decline of Media Diversity — and How We Can Save It