
Statement on Humanitarian Considerations to UNGA Meeting on Lethal Autonomous Weapons Systems

Delivered by Mary Wareham, Deputy Director

Thank you, Chair.

Human Rights Watch, a co-founding member of the Stop Killer Robots campaign, is deeply concerned about the grave humanitarian threats autonomous weapons systems pose during armed conflict. These systems would face significant challenges complying with international humanitarian law—particularly its core principles of distinction and proportionality—and thus endanger civilians and protected individuals on the battlefield.

There are serious doubts about whether autonomous weapons systems could distinguish adequately between lawful and unlawful targets. In today’s armed conflicts, combatants often shed their uniforms and operate in civilian areas, which means they must be identified in other ways.

Determining who is an active combatant and who is a civilian or a wounded or surrendering soldier often requires gauging an individual’s intentions based on subtle behavioral cues. Autonomous weapons systems would lack the ability to recognize and interpret such cues accurately because they would not possess the human emotions and contextual understanding essential for these tasks.

Even if technological developments could improve the ability of autonomous weapons systems to comply with the rule of distinction, these systems would face even greater challenges in weighing the proportionality of an attack. That test requires determining, on a case-by-case basis and in a rapidly changing environment, whether civilian harm is excessive in relation to military advantage.

Humans can apply judgment, informed by legal and moral norms and personal experience, but it would be difficult to replicate in a machine the judgment necessary to handle the unexpected and virtually infinite range of situations that could arise on the battlefield.

Autonomous weapons systems would also endanger civilians in other ways. As the ICRC has noted, they could lead to escalation and lower the threshold for going to war, thus putting civilians at greater risk.

The use of autonomous weapons systems, depending on their payload, may result in explosive remnants of war and other battlefield contamination, creating significant humanitarian threats for civilians during and after armed conflict.

Finally, autonomous weapons systems would create an accountability gap, meaning not only that no one would be held responsible for a system’s unlawful actions, but also that an important tool for preventing harm would be lost. Accountability, which helps promote deterrence and retribution, seeks in part to avert future violations.

A legally binding instrument that prohibits the targeting of people by autonomous weapons systems and guarantees meaningful human control over the use of force is the best way to prevent violations of international humanitarian law, maximize civilian protection, and address potential humanitarian impacts.

Thank you.  
