EU: Risky biometric technology projects must be transparent from the start

Image: emotion recognition. Credits: fotologic and www.projectoxford.ai

ARTICLE 19 is disappointed with the decision the General Court of the European Union delivered earlier this week in a case concerning the secrecy of a controversial European Union-funded emotion recognition project to be used on travellers to the EU. The General Court recognised that the European Commission’s Research Executive Agency (REA) had failed to properly assess a number of access requests filed by digital rights activist and Member of the European Parliament Patrick Breyer (Patrick Breyer v European Commission). However, the Court did not rule that the public interest in a public debate about controversial tech experiments deserves priority over commercial interests from the very beginning of these projects.

In the verdict of 15 December 2021 on the lawsuit brought by MEP Patrick Breyer in 2019, the General Court established that the REA had not sufficiently justified its denial of a number of Breyer’s access requests. This is an important step in ensuring transparency regarding controversial biometric technology projects launched with EU funds.

However, we believe that, in its verdict, the General Court has missed an important opportunity to reinforce EU democratic values. On the one hand, the Court recognised that there was a public interest in the democratic oversight of the development of surveillance and control technologies such as iBorderCtrl. On the other hand, it suggested that such democratic oversight should begin only after these types of projects have been concluded.

Barbora Bukovska, Senior Director for Law and Policy of ARTICLE 19 said:

“The decision of the General Court in this case is an inadequate step towards transparency in this critical field. We appreciate that the General Court recognised that access to information requests cannot be denied on generic claims of protecting commercial interests and intellectual property, and that a further case-by-case assessment was needed to ensure the protection of individuals’ right to access information.

However, we are disappointed that the Court failed to recognise that wide access to crucial information should be available in the public domain from the very first phases of controversial biometric technology projects. This is especially the case for projects such as iBorderCtrl, paid for with EU funds, that is, by EU taxpayers. Importantly, the decision also fails to grasp the grave threats emotion recognition systems pose to freedom of expression, privacy, and equality.

ARTICLE 19 has been advocating for a complete ban on emotion recognition technology due to its pseudoscientific nature and fundamental inconsistency with international human rights standards. Research into the use of emotion recognition is not only a huge waste of money; it is also indicative of a dangerous tendency to enthusiastically adopt ‘sophisticated’ technology without considering its societal implications.”

Background to the case

In November 2018, Mr Patrick Breyer, currently a Member of the European Parliament, filed an access to documents request to the REA with regards to the iBorderCtrl research project. The project, funded by the EU with €4.5 million, included the development of a ‘video lie detector’ based on ‘artificial intelligence’ to be used on travellers to the EU to detect whether people are lying when answering questions. The idea was that people who want to travel to the EU should take a lie detector test at home in front of their webcam. Based on their facial expressions and behaviour when answering standard questions, special software would determine whether the person is telling the truth. The travellers’ Facebook profiles and other activities on social networks would also be included in the assessment.

Because of the high risks to fundamental rights that the deployment of this video lie detector technology raises, and seeking to bring more transparency to the issue, Mr Breyer asked the REA for access to documents related to the project, including an ethical assessment and a report on the legality of the technology. The European Commission denied access on the grounds that disclosure would undermine the protection of commercial interests, including intellectual property; that it would cause harm to the partners of the iBorderCtrl consortium; and that no public interest outweighing that harm existed. In March 2019, Mr Breyer filed an appeal before the EU General Court against the European Commission’s decision.
ARTICLE 19’s position on transparency of biometric technologies

ARTICLE 19 studied the assumptions and objectives underlying the iBorderCtrl project as part of our work on the impact of surveillance technologies on freedom of expression. In our report Emotional Entanglement, we revealed the fundamental inconsistency of emotion recognition systems with international human rights standards, the unscientific, oppressive and discriminatory foundations on which they are built, and the ill-advised turn to such systems by governments around the world.

ARTICLE 19 finds that the technologies investigated within the iBorderCtrl research project raise numerous challenges for fundamental rights. The invisible, opaque, and unfettered manner in which emotion recognition is being developed risks depriving people of their rights to freedom of expression, privacy, and the right to dissent through protest, amongst others. Moreover, emotion recognition’s pseudoscientific foundations render this technology untenable.

We believe it is therefore essential that a transparent and public debate about whether these technologies should be designed, developed and deployed within the EU or at the EU borders is fully supported from the initial phases. There is no reason why research projects into these technologies, substantially funded with EU money, should be shielded from this debate, and therefore from democratic oversight. There is no justification for postponing the public debate until the deployment phase of these technologies.