No Such Thing as Killer Robots

Journal of Applied Philosophy

Abstract

Two recent strands of argument (one offered by Rob Sparrow, the other by Duncan Purves, Ryan Jenkins, and Bradley Strawser) contend that fully autonomous weapon systems (AWS) are pro tanto impermissible. On Sparrow's view, AWS are impermissible because they generate a morally problematic ‘responsibility gap’. According to Purves et al., AWS are impermissible because moral reasoning is not codifiable and because AWS are incapable of acting for the ‘right’ reasons. I contend that these arguments are flawed and that AWS are not morally problematic in principle. Specifically, I contend that these arguments presuppose an incoherent conception of an AWS as somehow making genuine decisions while not being morally responsible for those very same decisions. Rather than conceiving of AWS in this way, I argue that an AWS is either a socially‐constructed institution that has been physically instantiated or a genuine agent. If it is the former, then we should treat AWS as we do any other collective action problem. If it is the latter, then we should treat AWS as responsibility‐bearers, but also as bearers of rights and/or interests. To reject this disjunction is not only conceptually incoherent but also potentially morally dangerous.