In spring 2013, a global coalition, the Campaign to Stop Killer Robots, launched with a mission to advocate for a ban on “machines that determine whom to kill.” Nine years later, almost to the day at the time of writing, no such ban exists. Autonomous weapons research is alive and well, and artificial intelligence has moved to the fore of the Pentagon’s future weapons development strategy. The latest Review Conference of the Convention on Certain Conventional Weapons (CCW), a primary forum for international talks on lethal autonomous weapon systems, failed to reach consensus on whether new international laws are needed to address the threats posed by autonomous weapons technology. Meanwhile, high-tech military powers, including China, Russia, Israel, South Korea, the US, and the UK, continue to invest heavily in the development of autonomous weapon systems.

One especially widely shared worry is that autonomous weapon systems (AWS) may not be able to comply with the laws of armed conflict. This paper warns that, though seemingly natural and ubiquitous, appeals to international humanitarian law (IHL) should be handled with care. By interrogating compliance with IHL as a criterion for assessing the moral permissibility of deploying AWS, this paper illuminates an altogether different dimension of the debate: what criteria we should apply in the first place as we confront the moral and legal conundrums of the increasing autonomization of warfare.

Linda Eggert. September 16, 2022. “Handle With Care: Autonomous Weapons and Why the Laws of War Are Not Enough.” Edited by Joshua Simons. Cambridge, MA: Harvard Kennedy School.