Description:
Intergovernmental discussions on the regulation of emerging technologies in the area of (lethal) autonomous weapon systems (AWS) are back on track in Geneva after more than a year of COVID-19-related disruptions. A critical task facing States is to further clarify how international humanitarian law (IHL) applies: what limits does it place on the development and use of AWS and, perhaps most importantly, what does it require of humans in the use of force?
In this post, Laura Bruun from the Stockholm International Peace Research Institute (SIPRI) reflects on whether IHL provides sufficiently clear guidance on how humans and machines may interact in use-of-force decisions. Building on the findings of a recent SIPRI study, she argues that clarification may be warranted and offers concrete suggestions for how States may further identify what IHL compliance requires in the development and use of AWS.