Side-stepping rights: Regulating speech by contract

ARTICLE 19


In this policy brief, ARTICLE 19 examines the compliance of dominant social media platforms – Facebook, Twitter, and YouTube (owned by Google) – with international freedom of expression standards, and offers practical recommendations on what these companies should do to demonstrate their commitment to protecting freedom of expression.

Social media companies have become fundamental to how we communicate. According to recent estimates, there are currently 2.2 billion active Facebook users and 330 million Twitter users worldwide. While social media companies have had a positive effect on freedom of expression, they have also come to hold enormous power over what information we can access. Contrary to the common perception that ‘anything goes online’, sharing information on social media platforms is not free from control. When users join Facebook, Twitter or YouTube, they agree to the companies’ Terms of Service, which set out what types of content each company deems acceptable. Social media users who fall foul of these standards can see their content removed or their account disabled altogether. Furthermore, social media platforms are under constant pressure from governments to remove content deemed harmful or illegal under national laws. Online censorship is therefore increasingly privatised, raising serious questions for the protection of freedom of expression online.

In this policy brief, ARTICLE 19 argues that although social media companies are in principle free to restrict content on the basis of freedom of contract, they should respect human rights, including the rights to freedom of expression, privacy and due process. The policy sets out the applicable standards for the protection of freedom of expression online, particularly as they relate to social media companies, and outlines the key issues at stake in the regulation of speech by contract. It analyses selected Terms of Service of Google, Facebook, Twitter and YouTube and examines the various policy options available for regulating social media platforms. Finally, ARTICLE 19 makes recommendations on how companies should respect basic human rights standards.

Key recommendations

Recommendations to States

  • States should adopt laws that shield social media companies from liability for third-party content and refrain from adopting laws that would make them subject to broadcasting regulatory authorities or other similar public authorities;
  • States should refrain from putting undue extra-legal pressure on social media companies to remove content;
  • States should provide for a right to an effective remedy for violations of freedom of expression by social media companies. Individuals should be given the opportunity to appeal to a court or other independent body.

Recommendations to social media companies

Consistent with the Guiding Principles on Business and Human Rights, social media companies should observe the following:

  • Companies should ensure that their Terms of Service are sufficiently clear, accessible and in line with international standards on freedom of expression and privacy. Companies should preserve users’ anonymity. They should also provide more detailed examples or case studies of the way in which their community standards are applied in practice, and conduct reviews of their standards to ensure human rights compliance;
  • Companies should be more transparent about their decision-making processes, including the tools they use to moderate content, such as algorithms and trusted flagger-schemes;
  • Companies should ensure that sanctions for non-compliance with their Terms of Service are proportionate;
  • Companies should put in place internal complaints mechanisms, including for the wrongful removal of content or other restrictions on their users’ freedom of expression. In particular, individuals should be given detailed notice of a complaint and the opportunity to respond prior to content removal. Internal appeal mechanisms should be clear and easy to find on company websites;
  • Companies should collaborate with other stakeholders to develop new independent self-regulatory mechanisms. This could include a social media council, modelled on effective self-regulation mechanisms in the field of journalism;
  • Companies should resist government and court orders that breach international standards on freedom of expression or privacy, including individual legal requests that lack a legal basis;
  • Companies should publish comprehensive transparency reports, including detailed information about content removal requests received and actioned on the basis of their Terms of Service. Additional information should also be provided in relation to appeals processes, including the number of appeals received and their outcome.

Read the policy brief