In May 2018, ARTICLE 19 submitted written evidence to the House of Lords Select Committee on Communications for the inquiry on Internet regulation.
The Committee proposed a series of questions on how content is currently regulated on the Internet, whether these measures are effective, and how they might be improved. More specifically, the Committee asked whether social media companies have sufficient mechanisms in place to moderate content, who within these platforms is responsible for moderation, and how transparent the companies are about these mechanisms. It also asked what role users should play in moderating content on social media platforms and whether effective measures are available for users to appeal decisions to moderate or remove their content. Finally, the Committee asked whether online companies should bear legal liability for the content they decide to host and, if not, what the alternatives could be.
ARTICLE 19 argues that greater regulation of online content is unnecessary.
Our submission makes clear that online content in general and social media companies in particular are already regulated at both domestic and EU level. To introduce specific legislation to remove immunity from social media companies for hosting third-party content would create a ‘chilling effect’ on freedom of expression online.
Instead, we argue that the UK government should focus on strengthening laws related to data protection and campaign funding to protect the rights to privacy and freedom of expression.
In our submission, ARTICLE 19 reiterates our longstanding position that online platforms should remain immune from liability unless they directly intervene in producing third-party content or are ordered to remove that content by a court. At the same time, we propose alternative models, including notice-and-notice liability and the creation of self-regulatory bodies modelled on press councils, to address the wide range of issues that can arise in the regulation of online content.
We also identify various ways in which companies could improve their internal processes related to content removal.
Above all, ARTICLE 19 recommends that social media companies adhere to international human rights standards for protecting freedom of expression and privacy, as reflected in their community standards. This could be achieved by adopting the suggestions put forward by ARTICLE 19 in this submission and in our policy brief Sidestepping Rights: Regulating Speech by Contract.