Facial recognition technologies — a dual-use challenge

Stop Killer Robots
4 min read · Oct 26, 2023


by Yohan Freuville

A CCTV camera positioned behind a barbed wire fence, overlaid with a blue, yellow, and white filter

The United Nations (UN) Disarmament Week, observed from 24–30 October, seeks to promote awareness and better understanding of disarmament issues and their cross-cutting importance. Young people have an important role to play in promoting disarmament as a means of protecting people from harm and building safer and more peaceful societies.

In honour of this important week, members of the Stop Killer Robots Youth Network were asked to share their thoughts on the themes of disarmament and autonomous weapons systems. Disclaimer: The blogs in this series do not necessarily reflect the opinions of Stop Killer Robots, nor should they be considered the views of all Stop Killer Robots members.

Why should we talk about Facial Recognition Technologies (FRTs) in the humanitarian disarmament context? Because FRTs could be used in the development of autonomous weapons systems (AWS) to select and engage targets without human intervention. The use of AWS also raises concerns about their capacity to discriminate between civilians and combatants. Such weapons would be inhumane, as life-and-death decisions would rest solely in the “hands” of a machine, a process known as digital dehumanisation. Humanitarian disarmament seeks to prohibit weapons that have indiscriminate effects or are inhumane by nature. In a joint call, the United Nations Secretary-General and the President of the International Committee of the Red Cross (ICRC) recently called for new prohibitions and regulations on AWS, which pose “serious humanitarian, legal, ethical and security concerns”.

A first point of concern relates to the conduct of hostilities. During an armed conflict, hostilities are governed by a body of international rules known as International Humanitarian Law (IHL). One of the core principles of IHL is the principle of distinction, which prohibits the targeting of civilians or of combatants who are hors de combat (e.g. wounded or surrendering). Integrating FRTs into weapons systems carries serious risks, given the documented error rates of these technologies. In 2019, research demonstrated that FRTs from some of the leading companies in the sector had error rates of less than 1% for white men, while the error rate rose to 35% for black women. In addition to racial bias, FRTs have a hard time recognizing transgender people, misidentifying their gender more than one-third of the time. Such failures can lead a system to misidentify people, especially black women, which could result in innocent people being incarcerated or unjustly targeted.

With such error rates in mind, how could an AWS, especially one relying on biometric technologies, reliably discriminate between combatants who may be targeted and people who are protected under IHL? An error in the analysis could lead to the death of an innocent person. The conduct of hostilities requires precise, carefully scrutinized analysis of the context of the conflict and of every single operation, as this determines which rules apply and whether exceptions come into play. As in most fields of law, there is rarely a clear-cut right or wrong decision; decisions depend on many factors, which makes it difficult to believe that automated machines will reach the lawful decision every time. Finally, an action being lawful does not necessarily make it ethical or humane. Meaningful human control must therefore be retained in order to address both legal and ethical concerns.

Secondly, FRTs are already being used to violate human rights. In the United States, police departments have used Facial Recognition Technologies to monitor prominent Black activists, for example during Black Lives Matter protests, verifying their identities and keeping them under surveillance. This is worrying for human rights, as such technologies could enable police to track and suppress activists. In Israel and the West Bank, a facial recognition system is used at checkpoints and on roads in Hebron, for example, and can deny Palestinians entry into their own neighbourhoods. In East Jerusalem, Amnesty International estimated that there were roughly one to two cameras every five metres. This is deeply concerning not only for the right to privacy but also for the rights to freedom of expression and peaceful assembly. A Palestinian journalist told Amnesty International that Palestinians are afraid to protest because they know they will be recognized by those cameras and arrested later. In China, citizens are monitored through FRT-equipped cameras across the country, with their details registered in national records. FRTs are also used to track and surveil Uighur communities. As such national security systems develop, could governments “upgrade” them by integrating AWS capable of applying lethal force to anyone labelled a terrorist or an enemy of the state?

Autonomous weapons systems must be addressed through an international legally binding instrument, and states must adhere to such an instrument. Additionally, Facial Recognition Technologies must be specifically addressed as a dual-use technology and firmly prohibited from use in AWS. Lessons can be learned from existing disarmament and arms control instruments that have implemented dual-use technology restrictions and controls, such as the Chemical Weapons Convention or the Wassenaar Arrangement.

Hi, I am Yohan Freuville, a Political Science student who is deeply passionate about peace. I believe that weapons are a threat and that some weapons, such as killer robots or landmines, are deeply inhumane. As a Scout, I am dedicated to making the world a better place, and I therefore believe that such weapons should be banned.


Stop Killer Robots

With growing digital dehumanisation, the Stop Killer Robots campaign works to ensure human control in the use of force. www.stopkillerrobots.org