USA: Supreme Court decision will have global impact on free speech online

On the day of the Supreme Court hearing in the Gonzalez v Google case, ARTICLE 19 warns that limiting the application of Section 230 will incentivise platforms to censor more user content, not just in the United States but around the world.

Gonzalez v Google and Twitter v Taamneh are landmark cases concerning the foundations of free expression on the internet: Section 230 of the Communications Decency Act, which protects platforms from liability for content posted by users. 

The cases were initiated by families whose loved ones were killed in ISIS attacks in Paris and Istanbul. Both deal with similar questions: should platforms that host terrorist content and show it to users be held accountable under the Anti-Terrorism Act, and can their immunity from liability under Section 230 be restricted?

Commenting on Gonzalez v Google, Bob Latham, Chair of the Board of ARTICLE 19, said:

“Section 230 helped shape much of the global internet we know today: a space where people can organise, share and access information, and hold power to account. Preserving its protections is about far more than the interests of Google, Twitter, and other platforms; it is about protecting freedom of expression for all of us.

“If Section 230 is limited, platforms will face the prospect of thousands of lawsuits if they fail to remove content that might be labelled as ‘terrorist’, not just in the United States but anywhere in the world. Companies will do everything in their power to avoid liability. They will likely do so by monitoring everything users post on their platforms and using algorithms to censor content en masse. We know those tools are not capable of understanding nuance or context – especially in languages other than English.”

David Kaye, Former UN Special Rapporteur on Freedom of Expression, Director of the International Justice Clinic at the University of California, Irvine School of Law, and ARTICLE 19 Board member, said:

“The wrong outcome in this case could transform the internet for everyone, not just in the United States. Its potential consequences are likely to be global. Since these are global companies, we should always be thinking about the impact that domestic regulatory steps have on the hundreds of millions of users outside the USA.

“This is not to say that Big Tech should not be held accountable for its practices, nor is it to oppose rights-respecting regulation of internet platforms. However, imposing liability on platforms for their recommendation systems will not guarantee the adoption of better regulation. The only guaranteed outcome will be the curtailment of free expression.”

In a joint amicus brief filed with the International Justice Clinic, ARTICLE 19 argued that limiting liability protection will incentivise platforms to over-rely on automated content moderation tools. Such tools are notoriously incapable of making complex assessments about the legality of speech, which require intimate knowledge of the political, social and cultural context, as well as the local language. Relying on them will result in more take-downs of lawful, even public-interest, content – likely impacting people from marginalised or minority communities, whose speech tends to be considered most ‘controversial’.

These concerns are not hypothetical. ARTICLE 19’s Missing Voices campaign advocated on behalf of journalists, activists and artists who had their content removed or accounts blocked because of the errors or biases of automated systems. Documenting human rights violations has also been a casualty of automated moderation, as in the case of the Syrian Archive project, which relies on platform content to build criminal cases and conduct human rights research. The project has tracked the removal of hundreds of thousands of posts documenting potential war crimes and human rights violations – essential evidence that could have been used to seek justice for the victims.

Barbora Bukovská, Senior Director for Law and Policy at ARTICLE 19, said: 

“The pressure on companies to screen all content and to add further categories of speech to be monitored and removed will lead platforms to further prioritise speed over accuracy. More content will be taken down quickly, with little to no transparency and with severe impacts on free expression.

“Social media platforms should not be immune from scrutiny. We know they have been slow to react to a number of legitimate concerns – such as over ‘hate speech’, ‘disinformation’, and the protection of children – on their platforms. They have also been profiting off the back of social ills they have enabled, without any public accountability. However, questions as fundamental as platform regulation should not be up to the Supreme Court. The legislature, not the judiciary, is best placed to weigh up different competing interests and design comprehensive reform – putting respect for freedom of expression, privacy and other human rights at the heart of it.”