UN: Data-driven technologies must not reinforce historic inequalities

Summary

ARTICLE 19 made this statement during the interactive dialogue on the report on the right to privacy in the digital age presented by the Office of the UN High Commissioner for Human Rights (OHCHR) at the 60th Session of the UN Human Rights Council.

ARTICLE 19 welcomes the report of the Office of the UN High Commissioner for Human Rights (OHCHR), which rightly shows how data-driven technologies often reinforce historic and structural social, economic, and political inequalities, and why technological innovation must be anchored in international human rights law and standards.

Sensitive data – relating to the most identifiable, personal, and intimate aspects of an individual – deserves special protection under international human rights law, as its improper disclosure can result in harm and discrimination, profoundly intruding on individuals' privacy, personal dignity, and other fundamental freedoms. Informational self-determination – recognition of an individual's right to control access to and use of such data – is paramount.

In modern societies, the potential for human rights violations in the collection, processing, and use of sensitive data is of enormous public relevance. The right to privacy is not only essential for the individual – it also plays an important role in the development of democratic societies, as it is a necessary condition for the exercise of other human rights.

Governments and businesses collect, process, analyse, use, and disseminate ever greater amounts of sensitive data through the increasing deployment of digital technologies in daily life – in welfare, health, education, security, and finance.

Across the world, biometric technologies are also being used to collect and process data on a mass scale, bringing not only an unprecedented chilling effect on the right to freedom of expression, but also risks of profiling, discrimination, and violations of the rights to privacy and due process.

Alarmingly, we are also seeing increased use of data-driven tools by law enforcement to determine the likelihood of criminal activity in certain localities or by certain groups. This predictive policing, which relies on historical and often prejudiced data to predict possible future events, is inherently biased and likely to produce discriminatory outcomes.

It is essential that States and businesses transparently and systematically conduct human rights due diligence throughout the life cycle of the data-driven systems they design, develop, deploy, sell, obtain, or operate, and that they stop feeding sensitive data into artificial intelligence systems. They should also ensure redress mechanisms for the abuse and misuse of sensitive data.

All data-driven technologies must meet the principles of legality, necessity, and proportionality. Many will not, and their use must therefore be banned outright – as with biometric mass surveillance, emotion recognition, and predictive policing.