Tanzania: Online Content Regulations 2020 extremely problematic in the context of COVID-19 pandemic
ARTICLE 19 is concerned about the possible impact of the Electronic and Postal Communications (Online Content) Regulations, 2020 (the 2020 Regulations), adopted in July 2020, on the right to information during the COVID-19 pandemic. We have repeatedly raised concerns that the 2020 Regulations fail to comply with basic international freedom of expression standards. We are especially and gravely concerned that the 2020 Regulations prohibit the dissemination of critical public health information in Tanzania during the global health emergency of COVID-19. There is absolutely no justification under international law for restricting potentially life-saving public health information, including reporting by foreign journalists in Tanzania.

Background

The Electronic and Postal Communications (Online Content) Regulations, 2020 were issued by the Tanzanian Government on 17 July 2020. They entrench the licensing and taxation of bloggers, online discussion forums, and radio and television webcasters, and they repress online speech, privacy and access to information.

ARTICLE 19 analysed previous versions of the Regulations in April 2018 (the 2018 Regulations) and also raised concerns about the 2020 versions. We are deeply concerned that they not only fail to address the concerns and grave issues raised in our prior analysis, but also introduce several measures that make the 2020 Regulations an even greater threat to freedom of expression in Tanzania.

In the 2018 analysis, we found that the 2018 Regulations broadly prohibited many categories of speech in ways that go well beyond acceptable limitations, and provided for licensing requirements in clear violation of international standards. We noted other grave problems: steep penalties for minor offences under the 2018 Regulations, including imprisonment for at least a year; sweeping content removal powers granted to the Tanzania Communications Regulatory Authority (TCRA) without proper safeguards; and vague and overlapping definitions that make it unclear to whom the legislation applies.

As a result, we concluded that the 2018 Regulations failed to protect and promote freedom of expression and recommended that they be withdrawn entirely and subjected to more comprehensive engagement with relevant stakeholders. Indeed, those concerns materialised: months after the 2018 Regulations were implemented, an investigation by The Verge reported that scores of young content creators, bloggers, and video bloggers took down their accounts in response, dramatically harming the creative output and economic potential of artists, writers, and photographers in Tanzania. Further, whistleblowing sites such as JamiiForums, which have sought to protect their sources, have struggled with the mandates of the Regulations.

ARTICLE 19 continues to be concerned that the 2020 Regulations further exacerbate the problems of the preceding 2018 Regulations, merely re-organising problematic provisions to make the Regulations appear different despite identical substance in many respects. These problems include an expansion of the numerous categories of prohibited content, unrealistic two-hour time limits on content removals, and a continued failure to define basic terms in the legislation or to justify the imposition of licences for exercising the right to speak online. The 2020 Regulations also broaden ‘filtering’ requirements to cover over forty enumerated categories of prohibited content.

Perhaps most telling of the confused drafting and incomprehensibility of the 2020 Regulations is that they prohibit content that interferes with the freedom to practise one’s religion, while simultaneously barring content related to witchcraft, which by the very definition of the word is a religious practice. Taken together, these considerations weigh in favour of the immediate and urgent withdrawal of the 2020 Regulations, which are not only contrary to international human rights law, but have been documented to have a severe chilling effect in the country, to cause significant economic harm, and to stifle expression and creativity.

We reiterate that under international human rights law, any restriction on freedom of expression must be (1) provided by law; (2) pursue a legitimate aim as exhaustively listed in Article 19(3) of the International Covenant on Civil and Political Rights (ICCPR); and (3) be necessary and proportionate to that aim. We further continue to observe that Article 9 of the African Charter on Human and Peoples’ Rights, to which Tanzania is a signatory, guarantees the right to freedom of expression. Further, during its 2nd Universal Periodic Review Cycle, the Tanzanian government accepted recommendations to improve its national laws and regulations for the protection and promotion of human rights.

Key changes in the 2020 Regulations

The 2020 Regulations present the following relevant changes over the 2018 Regulations:

• Introducing, in the Third Schedule, ten far-reaching categories of prohibited content, each category prohibiting between one and thirteen subcategories of content. Virtually none of these prohibitions are acceptable limitations under international law. The few that may appear justified do not contain adequate safeguards to make them permissible under applicable standards. Taken together, these provisions amount to a staggering degree of censorship. Importantly, Category 8(c) describes content related to “outbreak of a deadly or contagious diseases in the country or elsewhere without the approval of the respective authorities” as part of content that may “cause public havoc and disorder.”

• Introducing a requirement that prevents providing technology that “helps users to have access to prohibited content,” which appears to criminalise the provision of basic technologies.

• Allowing an application services license holder only two hours to suspend or terminate a subscriber account on the grounds that it contains “prohibited content.” This prohibition covers so many categories as to be practically limitless. No judicial oversight, review, or right of appeal is provided. Previously, the 2018 Regulations provided 12 hours, which we already noted was effectively immediate.

• Creating four separate categories of licenses for various activities, including: news and current affairs; entertainment; education and religion; and simulcasting national coverage. Explicitly requiring pre-approval to disseminate these forms of content contravenes not only the right of freedom of expression under international law, but also interferes with the right to freedom of thought, conscience and religion among other rights.

• The definition of ‘content’ in Part I, Section 3, carves out private communications from its scope, though this limitation does not impact virtually any of the concerns expressed in our analysis. Further, it is not defined what constitutes a ‘private’ communication. For example, the 2020 Regulations cannot answer whether communication within a closed group of one hundred persons on a social media forum would constitute a ‘private’ communication.

Specific Concerns with the 2020 Regulations

ARTICLE 19 highlights specific concerns with some of the changes made in the 2020 Regulations:

Scope of prohibited content. Schedule III contains ten different categories of prohibited content. We note, as before, that prohibitions that may lead to criminal penalties such as heavy fines or imprisonment should be, at a minimum, promulgated by a legislative body and subject to judicial oversight, rights of appeal, and safeguards. These categories of prohibited content are even more problematic given that Part III, Section 9(d) requires the implementation of “moderating tools to filter prohibited content.” As before, we observe that requiring monitoring and filtering generally breaches freedom of expression obligations. The current categories include:

o Sexuality and decency. As before, we observe that outright bans on pornographic content are impermissible under international law. Similarly, content concerning homosexuality, sexual acts, or obscenity is not a form of expression that may be restricted; the Human Rights Committee has affirmed that restrictions on freedom of expression for the protection of public morals must be based on a broad understanding of what ‘public morals’ means. We do note that child sexual abuse images are a type of expression that States are required to prohibit under international law, although it is unclear that regulations of service providers and users are the proper place to do so; child sexual abuse materials should be subject to legislation.

o Personal privacy and respect to human dignity. While individuals have a right to privacy, these provisions more closely resemble provisions prohibiting defamation. The provisions also prohibit “publication of private information regardless of whether the information is true where publishing the same may harm the person.” Criminal defamation laws are not justified under international law, and these provisions do precisely that: they provide criminal penalties for defamatory content. Even in the context of defamation, truthful information is not defamatory. These provisions provide for penalties for publication of private information with no carve-out for public figures, for whom there may be overriding public interest reasons for publishing private information that may harm a person (such as evidence of corruption of a government official). In fact, most investigative journalism that exposes misconduct of an official for the public benefit can be said to “harm” the reputation of the individual exposed. Alleged violations of privacy are not enough to grant blanket censorship powers over users, with accompanying takedown obligations for providers. Further, the prohibition on content that promotes “practices of witchcraft, enchantment, or sorcery,” where those terms are not even defined but are often associated with particular religious beliefs, would appear to violate the religious rights of groups of individuals contrary to international law.

o Public security, violence, and national safety. The problems with this category of prohibitions were considered in our analysis of the 2018 Regulations. Briefly, while international standards do allow for some legitimate limitations pursuant to public safety, the thirteen sweeping prohibitions in this section are not narrowly crafted according to these standards. The first prohibits content “harming the reputation, prestige or status of the United Republic” or its flags or symbols, which is not a legitimate restriction; similar vague provisions include prohibiting content that “is likely to threaten the stability of the United Republic” and prohibiting promotion of “sedition.” We observe that generally speaking, sedition laws, which include laws that proscribe subversive activities, are undemocratic and infringe on the right to freedom of expression. In most democracies, sedition laws have formally been rescinded. Specifically, such laws run afoul of a central tenet of human rights law that restrictions on freedom of expression must be “necessary.” Vague or broadly defined restrictions on ‘subversion’ are generally unacceptable as they go beyond what is strictly required to protect an interest even if a legitimate interest exists and is provided by law.

The prohibition on “content that is involved in planning, organizing, promoting or calling for demonstrations, marches or the like which may lead to public disorder” expressly violates Article 21 of the ICCPR which provides for the right to freedom of peaceful assembly.

o Criminal activities and illegal trade activities. With respect to content that allegedly encourages illegal activities, it should not be up to service providers to make nuanced legal determinations as to what constitutes criminal activity under Tanzanian law. It certainly should not be up to third parties to be able to force the takedown of content judged to be criminal without any judicial oversight at all, or right of appeal. What is or is not criminal should be subject to laws passed by legislators, subject to precise definitions and intentionality requirements, and subject to judicial oversight. While some forms of content may be more acceptable to restrict access to (i.e. bomb-making manuals), it is unclear why those forms of problematic content cannot be addressed by appealing to a judicial authority on a case-by-case basis rather than imposing broad prohibitions on providers themselves to surveil for such content.

o Health and public safety. It is unclear why the 2020 Regulations require a separate prohibition on matters concerning health, and why this regulatory body is competent to do so. The only authority that appears to have oversight is the Cabinet; if anything, an independent health regulatory body should be the entity that promulgates standards on health materials in the media.

o Protection of intellectual property rights. Coupled with the short takedown requirements of two hours and complete lack of judicial oversight or right of appeal, this provision provides a sweeping censorship power that is wholly incompatible with minimal freedom of expression protections. Notice-and-takedown provisions require great scrutiny, and that is even when they allow for some form of process, appeals, and longer notice period (such as forty-eight hours). None of those safeguards are available here.

o Respect to religion and personal beliefs. While individuals have a right to freedom of thought and religion, these provisions go further to broadly prohibit “defaming” and “ridicule” which go beyond acceptable prohibitions on hate speech. This provision also claims to respect religious beliefs, while earlier prohibiting content related to witchcraft, which by definition is a religious practice. If any laws or regulations in Tanzania properly implement the acceptable nuances of hate speech prohibitions permitted under international human rights law, this separate provision should be unnecessary.

o Public information that may cause public havoc and disorder. We reiterate the concerns raised above regarding public security. However, we note that subsection (c), which prohibits “content with information with regards to the outbreak of a deadly or contagious diseases in the country or elsewhere without the approval of the respective authorities,” is particularly urgent given the COVID-19 pandemic. This provision will only serve to stifle journalism, both domestic and international, that is relevant not only to the safety of Tanzanian citizens but also to that of citizens of neighbouring countries. Further, the section also prohibits reporting on categories of information, such as “weather forecasts,” that seem completely benign.

o Use of bad languages and disparaging words. Profanities and disparaging words, even where they may offend, are not by themselves content that may permissibly be restricted under international law.

o False, untrue, misleading content. As before, false content by itself is not a legitimate prohibition under international standards. Requiring that satire or parody is identified as such also compels individuals to make statements that they may not otherwise make, and is not justified under applicable standards.

• Nearly-instant takedown requirements, and lack of procedural safeguards. We observe that the 2020 Regulations continue to fail to contain any meaningful safeguards, judicial oversight, or right of appeal for content removal decisions. This is particularly problematic given the revisions that reduce an already near-instant mandatory content takedown time of twelve hours to an astonishingly short two hours. Any restrictions on the right of freedom of expression must be subject to appropriate oversight by a competent independent body; the Tanzanian Minister for Information, Culture, Arts and Sports and the Tanzania Communications Regulatory Authority are not independent bodies.

• Prohibition on providing technology that helps users access prohibited content. The measures in Part IV, Section 16(2) prohibit providing technology, applications, or programs that “helps users to have access to prohibited content.” This appears to criminalise the provision of basic technologies, including phone apps and computer programs, based on expressive activities of end users that may have no connection to the technologies. For example, if a social media platform released an app, and a user posted an article about the COVID-19 pandemic, the platform would be liable under these provisions. We have written on the dangers of criminalising so-called “dual-use” technologies, but dual-use technologies typically involve uses that are criminal or violate a right (such as copyright) clearly defined in law. The 2020 Regulations go further, providing for sanctions of fines or imprisonment for speech-based uses. The 2020 Regulations prohibit technologies based on how a user might use them to engage in expression or to access the expression of others. There is absolutely no way for a technology to predict or control such attenuated use, and this provision utterly lacks any justification.

Issues the 2020 Regulations continue to fail to resolve

In its analysis of the 2018 Regulations, ARTICLE 19 noted several issues with the 2018 Regulations that have not been addressed at all; in some instances, they have been made worse. Those include:

• Failure to limit vague and overbroad definitions. Several terms including ‘hate material,’ ‘hate speech,’ ‘indecent material,’ ‘obscene content,’ ‘application services licensee,’ ‘blog or weblog,’ and ‘blogger’ still contain problematic definitions under international law. Some definitions have been removed from the definitions section, such as ‘obscene material.’ But these prohibitions have been simply moved, in substance, to the ‘Prohibited Content’ section of the Third Schedule, which prohibits depictions or promotion of sexuality and immorality. We previously noted how the definition of hate speech as “any portrayal . . . which denigrates, defames or otherwise devalues a person or group on the basis of race, ethnicity, religion or disability” conflated several concepts, including defamation, and did not narrowly track the definition of hate speech provided under international law.

• Failure to limit the sweeping power of the Authority. We previously commented on the wide latitude possessed by the Authority to require registration of bloggers, online forums, online radios and television, as well as powers to order removal of content. We also expressed concern that the Authority is not an independent figure. The Authority still possesses these powers in Part IV, Section 19, subject to even shorter takedown timeframes.

• Failure to clarify or limit obligations imposed on online content providers or provide appropriate due process safeguards. We took issue previously with the obligations imposed on various actors involved in online content, including everyday users and third parties. These obligations include requiring that content is ‘safe and secure,’ filtering certain forms of content, and that providers ‘cooperate with law enforcement officers.’ We previously noted that the scope of what this cooperation means is wholly unclear. These obligations still exist in Part III, Section 9 of the 2020 Regulations. They also still require, among other issues, the requirement of “mechanisms to identify source[s] of content” in Section 9(e), jeopardising the ability of journalists to protect their sources in matters of public concern. They also still require taking into account “trends and cultural sensitivities of the general public” in Section 9(b).

• Failure to limit obligations imposed on application service licensees or provide appropriate due process safeguards. We highlighted previously that the Regulations impose a contractual obligation on application service licensees to include particular terms, including a right to deny access or terminate service where a subscriber contravenes the provisions of the Regulations. While these provisions have been restructured, their underlying focus appears unchanged, requiring application service licensees to serve as censors by proxy. Licensees are still required to inform subscribers that they must remove content, and now must do so within two hours of receiving notice from the regulatory authority (rather than within twelve hours as before). Pursuant to Part III, Section 11(4), if the subscriber fails to remove content within two hours, the licensee must suspend or terminate the subscriber’s access. We took issue that these provisions contained no due process safeguards. Specifically, they do not require content removal decisions to be made by a competent court.

• Failure to limit obligations imposed on online content hosts and internet cafes. We noted that the regulations require content hosts to adopt codes of conduct allowing for mandatory removal of content, reiterating that content removal decisions should only be taken by a court or independent adjudicatory authority. The content host provisions are contained in Part IV, Section 15. We also noted that the Regulations impose filtering, surveillance, and registration requirements on internet cafes that are disproportionate to any legitimate aims under international human rights law. These provisions of concern are now in Part III, Section 13.

• Limitations of safeguards on disclosure of personal data. We noted that the Regulations contained some protections for disclosures of personal data, although these did not go far enough. The past version required requests to be made by courts, lawfully constituted tribunals, or law enforcement agencies. This appears to have been eliminated in favour of disclosure “where the information is required by relevant authorities according to the law,” as indicated in Part IV, Section 17. The term “relevant authorities” is nowhere defined in the 2020 Regulations. As before, we reiterate that access to personal data by any public authority should, in principle, require a judicial warrant, with narrow exceptions in limited circumstances. We also recommended the reference to and adoption of data protection legislation, which has not yet been done.

• Failure to limit the scope of ‘prohibited content.’ We noted that numerous forms of content were penalised under the 2018 Regulations that were not explicitly banned under Tanzanian law or that were legitimate under international human rights law. These included pornography, violence, annoyance, public disorder, bad language, and false content. Other categories, such as national security and hate speech, are not narrowly crafted in line with permissible limitations under international law. These forms of prohibited content are now included in Schedule III of the 2020 Regulations.

• Failure to limit sanctions. We noted that new offences should not be left to regulatory instruments but be provided for by legislation. We also noted that sanctions were particularly heavy and were likely to have a chilling effect on online freedom of expression in Tanzania, which they sadly have. Current penalties are included in Part IV, Section 21 of the 2020 Regulations.

Hence, ARTICLE 19 calls on the Tanzanian Government to immediately suspend the 2020 Regulations and ensure that all legislation complies with international freedom of expression standards. Any new regulation and legislation in this area must be the subject of broad public consultation, involving all stakeholders including civil society organisations.