Whither the Anti-Killer Robot Lobby?

Peter Singer has written the definitive primer on how military robotics is changing the nature of war. I could rave, as I’ve already done elsewhere, about the various merits of the book – everything from his myriad citations from popular culture to the book’s remarkable readability on an extremely sophisticated subject. But my interest in Wired for War pertains primarily to the social implications of warbots, rather than the technologies themselves, and in particular to how this revolution is changing or challenging not war itself but rather the rules of war. So my remarks here will focus on this dimension of the book – on rule-sets.

In the chapter on “Digitizing the Laws of War,” Singer surveys some of the conceptual issues introduced by these weapons and provides a decent introduction to the laws of war for a book pitched at a wide audience. And he concludes by emphasizing the problems associated with the current legal vacuum: “We had better either enact a legal ban on such systems soon or start to develop some legal answers for how to deal with them” (p. 409).

A point well made. However, I want to take on a few of the claims he makes in this chapter.

1) One of the most interesting data-points in the chapter is the mind-blowing absence of a policy position on battlefield robotics by leading human rights and humanitarian law organizations, including the guardian of the Geneva Conventions, the International Committee of the Red Cross (p. 385). I think he explains this away rather too easily, attributing it to organizational overstretch on the one hand (“the organization is burdened with everything from ensuring detainee rights at Guantanamo Bay to pressuring nations to fulfill their pledges to end the use of landmines”) and to the “brewing breakdown between the laws of war and the reality of conflict in the twenty-first century” (p. 386) on the other.

But my sense is that there is a more complex story to the selection process in humanitarian law advocacy organizations. Singer’s suggestion that robots are “just too futuristic” doesn’t gel with the fact that the ICRC took the lead on banning blinding laser weapons in the late ’80s, at a time when most governments thought those weapons were science fiction. Nor are the laws of war necessarily “breaking down” just because they often lag behind the times – this is simply the nature of international law. The truth is, the law works very well where it is developed and refined through multilateral negotiation and where states are pressed by NGOs to follow through on their commitments. The problem here is not that the law doesn’t work; the problem is that the law isn’t being applied or adapted to all the most pressing problems.

The interesting question for me is why “global civil society” gravitates toward particular gaps in the law, like international judicial architecture and cluster bombs, but ignores others, like compensation for collateral damage victims or autonomous weapons systems. I presented a paper at Columbia University last week that fleshes out some of this variation. At any rate, an important lesson emerges from the absence of attention by NGOs that Singer documents: norm entrepreneurs like Noel Sharkey, who are calling for a precautionary principle against the use of these weapons, would be wise to focus on selling this idea to organizations like the ICRC and Human Rights Watch rather than to governments per se, because rare is the weapons ban that has been negotiated since the 1980s without the blessing of these humanitarian law gatekeepers.

2) Regarding the basis for such a ban: I don’t agree with Singer’s assessment that the best reason to ban autonomous weapons is “for no other reason than that the world doesn’t want them around” (p. 408). He says this twice, each time characterizing robots as analogous to blinding lasers and chemical weapons – the argument being that such weapons don’t violate jus in bello principles per se; they’re just socially unacceptable for reasons having nothing to do with humanitarian law.

But I think the analogy is incorrect. Chemical weapons are inherently uncontrollable (therefore indiscriminate) and so they actually do violate the rules of war. And blinding lasers are considered to cause disproportionate suffering even when used only against combatants, because of the physical and psychological damage associated with permanent blinding. Both these weapons were banned in the context of existing international law, not just because people didn’t like them.

So too might robots be banned on the basis of the law, rather than our gut feelings. If you treat robots as weapons (though you could also treat them as soldiers, which opens another can of worms), the burden is theoretically on states to demonstrate that they meet the principles of proportionality and discrimination. The most serious concern is whether they can be trained to discriminate between civilians and combatants, and so far I have serious doubts. Even if they can, their effects would by definition be uncontrollable once deployed.

So there is in fact a legal argument to be made here. The question is, will someone make it successfully? The burden of proof is only on states to the extent that humanitarian law advocates act as if it is.

Ultimately, Singer probably wouldn’t characterize himself as an advocate of a ban on autonomous weapons. He has too level-headed an appraisal of the good that might be done with them, and the book is chock-full of arguments from both sides. And one could envision rule-sets other than a complete treaty ban: I really like Singer’s idea of using product liability law as a model. But certainly I would agree with his assessment that some legal rule-sets need to be developed, and fast, to govern the changes that are taking place in this arena.
