As Joe Biden begins his term as US President, the fallout from the 6 January storming of the Capitol building, and questions about the role of social media in it, continue. The riot followed a speech by Trump earlier that day in which he said that he would never concede the election and attacked the ‘fake news media’ as ‘the enemy of the people’. The speech was riddled with falsehoods about election fraud and about the power of the Vice-President, Michael Pence, to overturn the election result. Whilst he did not directly call for violence, Trump urged his supporters to walk down to the Capitol and give ‘weak Republicans’ the kind of ‘pride and boldness’ they needed to ‘take our country back’.
On 7 January, the major tech companies, including Twitter, Facebook and many others, suspended Trump’s accounts temporarily. Facebook and Twitter later said that Trump’s accounts would be suspended permanently, due to the likelihood of future violence resulting from his posts. Though not the first time platforms had taken such action, this was a remarkable display of their power. Quite rightly, many feel very uncomfortable about it, including Twitter’s CEO, Jack Dorsey. Should platforms really be the ones to decide whether the President of the United States gets to speak? Or any other elected politician, for that matter? This has re-ignited one of the most fraught debates in content moderation circles: who gets to decide what is or is not allowed in public discourse? The question has become even more acute as hosting providers such as Amazon Web Services, along with the Apple and Android app stores, decided to suspend access to Parler, a social network favoured by far-right groups and Republican politicians, until it commits to robust content moderation policies.
In the wake of Trump’s suspension, many governments reacted with unease, calling for democratic accountability or insisting that governments should be making these decisions. A spokesperson for the German Chancellor, Angela Merkel, suggested that the permanent suspension of Trump went too far. The European Commissioner for the internal market, Thierry Breton, likened it to a 9/11-style watershed moment for the big tech companies. In Poland, the government swiftly announced that it would pass legislation preventing social media companies from removing lawful speech. Meanwhile, the Mexican President, Andres Manuel Lopez Obrador, vowed to launch a global effort against social media bans.
And herein lies the rub. As uncomfortable as it is for large companies to make these decisions, the decision to suspend Trump from social media platforms should not be put in the hands of governments either. Nor is it likely that a government would ever want to de-platform its own leader, regardless of what she or he says. If anything, the Polish proposal testifies to the determination of some governments to ensure that politicians can continue to say things which are likely to be in breach of international law. Free speech organisations such as ARTICLE 19 are also well placed to know that regulators are often not independent: they are filled with political friends and prone to commercial capture. If any branch of the State is to decide whether what politicians say amounts to incitement, it should be the courts. And even then, in countries such as Poland or Turkey, the independence of the courts is in serious doubt. Moreover, in practice, it is difficult to see who would bring a case and whether a decision could be made in good time.
Indeed, another key difficulty in the Trump saga is that what happened did not come as a surprise. The storming of the Capitol did not happen in a vacuum. It was the culmination of four years during which President Trump relentlessly lied and stoked the flames of division in the United States. It followed months in which he sowed seeds of doubt about the fairness of the election and then persistently refused to accept its result. The question then becomes whether Trump should have been suspended earlier, and if so, when? After all, the US President has often tweeted falsehoods, but these were not necessarily illegal, and what he says deserves scrutiny. Even if Trump is ‘de-platformed’, as President of the United States what he says is covered by the media, and he has plenty of other outlets through which to share his views. Indeed, misinformation has been rife not only on social media but on broadcasting networks too. There are no easy answers to these questions.
What of some of the platform regulation proposals in Europe, like the Digital Services Act (‘DSA’)? Ironically, these proposals tend to delegate great power to companies to decide what is or is not illegal in the first place. Companies are also encouraged to moderate content on their platforms. Whilst the DSA contains many valuable proposals for greater transparency and due process, it is unclear whether they would have led to a different outcome in this case.
What to do then? As a first step, our public discourse on social media platforms needs to recognise that content moderation is messy and involves trade-offs. In the case of Trump’s tweets on 6 and 7 January, there is a strong case to be made that they were likely to incite violence as a matter of international law. Next, we need far more transparency and due process from platforms, as many civil society groups have advocated for years, for instance in the Santa Clara Principles on Transparency and Accountability in Content Moderation. The suspension of Trump’s accounts stands out because both Jack Dorsey for Twitter and Mark Zuckerberg for Facebook had to explain their decisions in some detail. That is more than elected representatives from Lebanon, generals from Burma or high-ranking officials in Iran ever got. Not only do the largest social media companies need to explain their decisions much more, they also need to be far more consistent in how they apply their standards across the globe, and to give themselves the means to understand the contexts in which they operate. Oversight by independent multi-stakeholder bodies, like Social Media Councils, could also help address the thorniest issues in content moderation, like the decision on Trump.
What about power? The decisions of Twitter or Facebook would be far less significant if they did not concentrate so much of the world’s population on their platforms. Regulating their content moderation processes may lead to greater transparency and better decision-making on their part, but it could also consolidate their dominance. As we have argued elsewhere, what is needed are pro-competitive measures and remedies that lower barriers to entry to the market, so that users are empowered and given viable alternatives. The discussion about power is not going away, but on its own it seems unlikely to yield the answers that we want for a free and open Internet.