Pakistan: Online Harms Rules violate freedom of expression


Credit: Rubi Joselin Ibarra CC BY-NC-ND 2.0

ARTICLE 19 is concerned about the proposed Citizens Protection (against Online Harms) Rules, 2020, currently being considered by the Pakistani Government. The Rules grant a government agency extensive powers to order the blocking or removal of vaguely defined content, in the absence of any meaningful safeguards, in violation of international standards on freedom of expression. They also impose obligations to filter content and to disclose user data at the request of the government, in breach of international standards on privacy. ARTICLE 19 calls on the Pakistani Government to withdraw the Rules, review its legislation related to digital technologies, and bring it in line with international law.

On 21 January 2020, the Pakistani Government published the Citizens Protection (against Online Harms) Rules 2020 (the Rules) in the Official Gazette and undertook an official consultation on this proposal. The Rules seek to implement various sections of the Pakistan Telecommunication (Re-Organisation) Act 1996 and the Prevention of Electronic Crimes Act 2016.

ARTICLE 19 has previously raised concerns about the Prevention of Electronic Crimes Act 2016. Those concerns remain valid to this day and are not remedied by the Rules, which, in our view, fail to comply with international standards on freedom of expression.

Vague definitions

The Rules define “extremism” as the violent, vocal or active opposition to fundamental values of the state of Pakistan including the security, integrity or defence of Pakistan, public order, decency or morality, the rule of law, individual liberty and the mutual respect and tolerance of different faiths and beliefs.

To begin with, ARTICLE 19 notes that illegal content should be defined by statute and subject to parliamentary scrutiny, rather than by Rules drafted by the Executive. Furthermore, the definition of ‘extremism’ is overly broad, in breach of the legality principle under international law. In practice, content by anyone criticising government policy on the wide range of issues listed above, whether matters of public morals or the government’s handling of a terrorist attack or any other national crisis, could be deemed extremist and therefore removed.

This definition appears to be inspired by the UK definition of ‘extremism’ in its Prevent strategy. However, both the UN Special Rapporteur on the rights to freedom of peaceful assembly and of association and the UN Special Rapporteur on counter-terrorism have concluded that this definition is overly broad. The former has gone further and described the Prevent strategy as inherently flawed. The same is true of the Pakistani definition of ‘extremism’ under the Rules. It can only lead to the suppression of dissenting and opposition voices, in violation of the right to freedom of expression.

Overbroad powers to censor content vested in an executive agency

Under the Rules, the Ministry of Information Technology and Telecommunications designates a ‘National Coordinator’ with extensive powers to issue instructions related to the blocking of unlawful content, to ‘acquire data or information’ from social media companies, ‘and other such matters’. The National Coordinator is composed of stakeholders ‘as notified by the Ministry’. In other words, the Rules create a new executive agency with extensive powers to remove content on the basis of overbroad definitions.

First of all, as pointed out by the Global Network Initiative, it is unclear that the Ministry has the authority to create a new executive agency with such broad powers under the 1996 Act or the 2016 Act. In our view, any institution with wide discretionary powers to interfere with the rights to freedom of expression and privacy should be established by way of primary legislation.

In any event, decisions on whether content is unlawful and should be blocked should be made by a court or other judicial authority, consistent with international standards on freedom of expression. Instead, the Rules entrench the exercise of unfettered censorship powers by the Executive. Moreover, website blocking is an extreme measure, analogous to banning a newspaper or broadcaster, which can only be justified in line with international human rights standards. In practice, website blocking is disproportionate in the vast majority of cases, since blocking orders are rarely sufficiently targeted and restrict access to perfectly legitimate content.

A further concern with the Rules is that website blocking orders must be implemented within 24 hours, or six hours in cases of emergency as determined by the National Coordinator. This is clearly insufficient time for social media companies to review blocking orders or for any suspensive appeal to take place. A similar law was recently struck down by the French Conseil constitutionnel. The Rules further fail to require an assessment of the proportionality of a blocking order in a given case.

We further note that the Rules provide for the disclosure of data on the mere say-so of a government agency in breach of international standards on privacy. Instead, access to users’ personal data should be authorised by a court or independent body.

Proactive measures and prevention of livestreaming

ARTICLE 19 is further concerned that, under the Rules, social media companies will be required to deploy ‘proactive measures’ to prevent the livestreaming on their platforms of any content in breach of any law or rules in force in Pakistan. ‘Terrorism, extremism, hate speech, defamation, fake news, incitement to violence and national security’ are highlighted as types of content of special concern, but they are not otherwise defined by reference to any existing legislation. We are unaware of any legal definition of ‘fake news’ under Pakistani law.

ARTICLE 19 further notes that ‘proactive measures’ are often synonymous with automated filters designed to remove content quickly. These measures significantly interfere with the right to privacy and often lead to the removal of lawful content, since algorithms are inherently incapable of properly taking context into account. EU law currently prohibits general monitoring requirements. The prevention of livestreaming further amounts to prior censorship. This is deeply disturbing in the absence of any court order determining the legality of content and could have a significant chilling effect on freedom of expression, including on news reporting.

Lack of independence of the Authority ordering content restrictions

The Rules set out a mechanism for complaints by individuals, legal entities or public authorities against ‘unlawful’ content that lacks basic guarantees of procedural fairness. A key flaw is that the legality of content is determined in the first instance by ‘the Authority’, i.e. a government body, rather than a court. This problem is compounded by the fact that an appeal before the High Court can only take place after an aggrieved party has asked the Authority to review its original decision. It is entirely unclear why a review by the same decision-maker is useful, and it is inconsistent with due process principles. To the extent that it is meant to enable a party aggrieved by the order to challenge it because they were not party to the original proceedings, it only highlights the lack of procedural safeguards at first instance. In this regard, we note that counter-notices are not systematic. Equally, the giving of reasons for a blocking order is at the discretion of the Authority. Accordingly, the complaints mechanism falls well below international standards on freedom of expression and due process rights.

Other concerns

ARTICLE 19 is further concerned that the following measures amount to unnecessary restrictions on the rights to freedom of expression and privacy:

  • Registration of social media companies with the Telecommunications Authority;
  • Requirement to store data in Pakistan in the absence of a domestic data protection law together with the establishment of local offices and a ‘focal’ person in Pakistan;
  • The extra-territorial application of the Rules to Pakistani citizens abroad;
  • A requirement that companies mark content as ‘false’ at the request of the Government;
  • The provision of data, including traffic data, subscriber data, or content data in ‘decrypted’ format, upon request from the Investigation Agency rather than pursuant to a court order.

In addition, sanctions for failing to comply with the Rules, including the blocking of an entire social media platform and fines of up to 500 million rupees, are disproportionate. Moreover, the blocking of a platform cannot be challenged before a court, only before a government committee. In other words, these provisions violate the rights to freedom of expression, privacy and due process.

ARTICLE 19 is not alone in its view that the Rules are deeply flawed. Many others have raised similar concerns, including the UN Special Rapporteur on freedom of expression, David Kaye, and the Digital Rights Foundation, Pakistan.

In ARTICLE 19’s view, the Rules are so flawed that they should be withdrawn in their entirety. Although the Pakistan Telecommunication Authority is expected to issue a revised version of the Rules, ARTICLE 19 urges the Government to withdraw the proposal altogether. Instead, we call on the Government to use this opportunity to bring its legislation on digital technologies into full compliance with international human rights standards. In particular, the Prevention of Electronic Crimes Act itself should be reviewed and brought in line with international standards on freedom of expression and privacy.