Israel: Stop using biometric mass surveillance against Palestinians

Protest against Israel's surveillance of Palestinians and the companies who make it possible, San Francisco, CA, August 2023. Credit: Phil Pasquini / Shutterstock

Drawing upon the detailed account recently published by The New York Times of Israel’s deployment of an extensive facial recognition programme in the Gaza Strip, ARTICLE 19 underscores the critical importance of safeguarding fundamental human rights in the deployment of surveillance and AI technologies, especially in times of war. We call on the Israeli government to stop using biometric mass surveillance in its war on Gaza, a war which, we reiterate, must be brought to an immediate end.

ARTICLE 19 has long maintained that the untargeted use of facial recognition for mass surveillance in public spaces is inherently incompatible with international human rights law and should never be allowed. Moreover, given the intrinsic invasiveness of facial recognition technology, its use should be permitted only in exceptional circumstances, justified and tied to a specific legitimate purpose recognised under international law.

We note that Israel had already used facial recognition technologies against Palestinians before 7 October 2023. In May 2023, Amnesty International documented a pervasive system of facial and biometric identification of Palestinians across the West Bank, used “to track Palestinians and automate harsh restrictions on their freedom of movement”. A 2021 analysis found that, although a lack of transparency made firm conclusions difficult, the deployment of facial recognition surveillance appeared not to satisfy the criteria of international human rights law, which also informs the obligations of an occupying power under international humanitarian law.

According to The New York Times, after the 7 October 2023 attacks by Hamas, Israeli intelligence officers in Unit 8200 also began using facial recognition technology in Gaza. The report shows how Israel uses Corsight’s technology and Google Photos to conduct mass surveillance of people in Gaza, with the alleged aim of identifying individuals based on their affiliation with Hamas or other groups.

In Gaza today, the deployment of facial recognition technologies can have serious consequences for physical integrity and human life, making it even more likely that the use of such technology will fail the proportionality requirements under both international human rights law and international humanitarian law.

As the case of Palestinian poet Mosab Abu Toha, a contributor to The New Yorker, demonstrates, Israel’s deployment of biometric mass surveillance has already resulted in misidentification with severe consequences. After facial recognition technology misidentified Abu Toha, he was beaten and interrogated in an Israeli detention centre for two days before being returned to Gaza. He credited his release to a campaign led by journalists at The New Yorker and other publications. His misidentification is not unique: numerous studies demonstrate that facial recognition performs poorly in terms of accuracy, particularly for underrepresented or historically disadvantaged groups, with potentially dire consequences for those wrongly identified by the Israeli army.

While Abu Toha’s case appears to be the first documented instance of misidentification through facial recognition during the Gaza war, numerous other cases likely exist, affecting people with a smaller public platform. Yet the Israeli army’s complete lack of transparency regarding its use of facial recognition technology, combined with the scarce information coming out of Gaza, makes it extremely difficult to properly scrutinise Israel’s actions in this respect.

This situation is emblematic of the broader concerns ARTICLE 19 has raised regarding the deployment of biometric mass surveillance technologies globally. Without stringent safeguards, these technologies can be misused to infringe upon fundamental human rights, particularly in contexts marked by conflict and tension. ARTICLE 19 finds that the use of biometric mass surveillance presents a danger to the life, human rights, and dignity of civilians in the Gaza Strip.

The use of facial recognition is only one dangerous facet of the deployment of AI-powered surveillance tools in warfare, including their potential use to identify civilians as bombing targets. A recent investigation by +972 magazine revealed that the Israeli army had developed an AI-based programme known as “Lavender”, designed to mark all suspected operatives in the military wings of Hamas and other groups. The system reportedly identified as many as 37,000 Palestinians as suspected militants, and thus as potential bombing targets to be systematically attacked while they were in their homes. These killings reportedly required only a few seconds of human verification, often just to confirm that the target was male. According to military whistleblowers, the system operates with a ten per cent error rate, and a toll of 10 to 15 civilians per strike was deemed permissible, as targets were killed alongside their families and other residents of their buildings.

ARTICLE 19 is deeply concerned about the impact of facial recognition and other AI-powered surveillance and targeting tools on the safety of civilians in Gaza. We urge the Israeli government to end biometric mass surveillance in Gaza. The government must also ensure that any use of facial recognition technologies or other AI tools in this conflict is strictly regulated in compliance with international human rights and humanitarian law standards, in particular the principles of necessity and proportionality.

We also remind companies engaged in the design, development, sale, and deployment of biometric or other AI technologies that they must adhere to the UN Guiding Principles on Business and Human Rights, which include respecting both international human rights and international humanitarian law in armed conflict. In this context, implementing those principles means guaranteeing transparency, accountability, and liability throughout these technologies’ life cycle, including in public-private partnerships, procurement processes, and exports to States and parties to a conflict.

Finally, we once again urge Israel to end its systematic violations of international humanitarian and human rights law in this conflict, including with regard to freedom of expression. Israel must also urgently implement the International Court of Justice’s provisional measures order, which acknowledged the spread of famine and starvation in Gaza and ordered Israel to ensure the unhindered provision of humanitarian assistance. Most importantly, Israel must bring this conflict to an immediate end.