European Court of Human Rights: Groundbreaking ruling on facial recognition


ARTICLE 19 welcomes the European Court of Human Rights’ decision today in a case concerning the use of facial recognition against protesters in Russia. The Court found that using facial recognition to locate and arrest a protester while he was travelling on the Moscow underground violated his rights to freedom of expression and privacy.

ARTICLE 19 has long called for a moratorium on the development and deployment of all biometric technologies, including facial recognition, by both States and private actors. This groundbreaking ruling is a stark reminder that these technologies are frequently misused to target individuals and groups, including protesters, journalists, and activists, who play significant roles in promoting democratic values.

In Glukhin v. Russia, police arrested the applicant, Mr Nikolay Sergeyevich Glukhin, on a train in the Moscow metro. The police told him his name was on the wanted persons list as he had previously held a solo demonstration without submitting prior notification to the authorities. Facial recognition cameras installed in the metro had been used to identify him, and screenshots from the recording were used as evidence in the proceedings against him. Mr Glukhin was then convicted of an administrative offence for breaching the notification procedure for the conduct of public events.

Barbora Bukovská, ARTICLE 19’s Senior Director for Law and Policy, commented:

‘The use of facial recognition and other biometric technologies represents one of the greatest threats to fundamental rights in the digital age. These technologies threaten the rights to privacy and anonymity and have a strong “chilling effect” on the right to freedom of expression. If governments can use biometric technologies to quash any form of dissent, and if people know they are being watched, they are less likely to express themselves freely in public spaces and may choose not to exercise their rights.

Facial recognition has been used around the world, including in Russia, without sufficient legal frameworks to protect the rights at stake. As a consequence, individuals’ rights have gone unprotected, leading to cases like the present one, in which severe human rights violations have occurred.

Today’s decision by the European Court of Human Rights rightly pointed out that the use of facial recognition can be particularly intrusive in the context of protests that present no danger to public or transport safety. We hope that the decision will support our future advocacy for a moratorium on the use of these kinds of technologies until proper human rights safeguards are put in place.’

Background

The case concerned the Russian authorities’ use of facial recognition technology against Mr Glukhin following his protest. He staged a solo demonstration in the Moscow underground, holding a life-size cardboard figure of Konstantin Kotov, a protester whose case had prompted a public outcry and attracted widespread media attention, and a sign reading, ‘I’m facing up to five years … for peaceful protests’. Later, the police discovered photographs and a video of this demonstration uploaded to social media. According to Mr Glukhin, they must have used facial recognition technology to identify him from those posts and to collect footage from CCTV cameras in the Moscow underground. He was convicted in administrative-offence proceedings of failing to notify the authorities of his solo demonstration using a ‘quickly (de)assembled object’ and fined 20,000 Russian rubles (about 283 euros). The screenshots from social media and the CCTV surveillance footage were used in evidence against him.

In October 2019, the Moscow City Court upheld Mr Glukhin’s conviction on appeal. It ruled that the peaceful nature of his protest was irrelevant and that the offence had been discovered, and the evidence collected, in accordance with the Police Act. Mr Glukhin then took the case to the European Court of Human Rights.

Today, the European Court found that the Russian Federation had violated Mr Glukhin’s right to freedom of expression (guaranteed in Article 10 of the European Convention on Human Rights) and his right to respect for his private and family life (under Article 8 of the Convention).

The Court stated that, although it was difficult for Mr Glukhin to prove his allegation that facial recognition technology had been used in his case, there was no other explanation for the police having identified him so quickly after his protest. It also noted that the Russian Government did not deny that facial recognition technology had been used in this case, and that there were numerous reports of protesters in Russia being identified through facial recognition technology.

The European Court concluded that ‘the processing of Mr Glukhin’s personal data in the context of his peaceful protest, which had not caused any danger to public order or safety, had been particularly intrusive. The use of facial recognition technology in his case had been incompatible with the ideals and values of a democratic society governed by the rule of law.’

Although the Russian Federation is no longer a party to the European Convention on Human Rights, the European Court found that it still had jurisdiction to deal with the case, as the facts giving rise to the alleged violations of the Convention had taken place before 16 September 2022, the date on which Russia ceased to be a party to the Convention.

ARTICLE 19 intervened in the case. Our submission is available here.

ARTICLE 19 has long raised concerns about the rapid and increasing use of biometric technologies by public authorities and the private sector. These technologies are highly intrusive: they violate people’s privacy, fail to adequately protect personal data, and prevent people from enjoying their right to freedom of expression.


Find out more about this work