ARTICLE 19 and the International Justice Clinic at the University of California, Irvine School of Law have made a joint intervention in a case against Google, urging the Supreme Court of the United States to maintain the existing protection of freedom of speech online.
In Gonzalez v. Google, the Supreme Court will consider whether the liability shield of Section 230 of the Communications Decency Act of 1996 applies to user-generated content promoted through automated recommendation systems. In the joint intervention, we urge the Court not to narrow Section 230’s application, as doing so would severely limit freedom of expression online in the US and beyond.
The case, Gonzalez v. Google, concerns the death of Nohemi Gonzalez (the petitioner’s daughter) in the 2015 ISIS attacks in Paris. The Gonzalez family sued Google under the Anti-Terrorism Act (18 U.S.C. § 2333), arguing that Google, through YouTube’s recommendation systems, pushed ISIS videos to users and is therefore partly responsible for Nohemi’s death. The petitioner argues that Section 230 does not shield internet platforms from liability for content promoted through automated recommendation systems. The case reached the Supreme Court after a District Court dismissed the petitioner’s claims in favour of Google and the Ninth Circuit affirmed the dismissal. The Court will hear the case in February 2023.
A Supreme Court decision in favour of Gonzalez could radically change how the internet is regulated and weaken free expression. If the Court limits the application of Section 230, it will incentivise internet platforms to over-rely on automated content moderation tools. These tools are notoriously poor at distinguishing lawful from unlawful speech, leading to the excessive removal of lawful, and even public-interest, content.
In the joint intervention, ARTICLE 19 and the International Justice Clinic, led by David Kaye, former UN Special Rapporteur on Freedom of Expression, call on the Supreme Court to protect free expression, uphold the settled approach by US courts to Section 230, and defer to Congress in its consideration of how to address automated recommendation systems. In particular, we argue the following:
- First, the Court’s application of Section 230 should be guided by the protections for freedom of expression in the First Amendment and international human rights law. In support, the brief shows that the history of Section 230, and Congress’s intent in passing it, favour protecting the free exchange of ideas online.
- Second, we argue that the petitioner’s claim is, in effect, about the illegality of the ISIS-related content rather than about the recommendation systems themselves. Without immunity for the algorithmic recommendation systems that promoted this content, internet platforms are likely to over-remove content to avoid liability, resulting in the significant removal of lawful content. The amicus highlights lessons learned from other jurisdictions’ efforts to regulate online content, as well as foreign court decisions that have endorsed careful approaches to protecting freedom of expression online.
- Finally, as ARTICLE 19 has highlighted elsewhere, there is good reason to pursue rights-respecting regulation of recommendation systems. The amicus argues, however, that legislative action – not judicial decision – is the appropriate avenue for such rule-making.
The brief emphasises that Section 230 was designed to promote the freedom of expression of internet users. As such, we urge the Court to affirm the lower court’s decision maintaining immunity under Section 230 and thus to protect and promote the freedom of individuals to seek, receive and impart information and ideas online.
In the proceedings, ARTICLE 19 and the International Justice Clinic are represented by Bob Latham (Chair of the Board of ARTICLE 19), and Marc Fuller and Hannah Walsh of Jackson Walker LLP.