War robots no different than landmines, says Landmine Action

Mar 28, 2008 16:22 GMT

Landmines are banned in more than 150 countries around the world, and the campaign group Landmine Action believes that autonomous robots capable of killing people should be banned as well, arguing that they fall under the same treaty that outlawed landmines. The US Army is currently the proud owner of a few thousand military robots, some of them armed with machine guns, deployed in countries such as Iraq and Afghanistan.

Although these robots are not fully autonomous at this time, the US Department of Defense wants to implement a system that would give them the ability to decide on their own whether or not to kill a person. After discussions with Sheffield University roboticist Noel Sharkey, Landmine Action was convinced to campaign against giving robots such autonomous capabilities.

The technology used in these robots is not very different from that found in cluster bombs, another class of weapon Landmine Action is campaigning against. These bombs burst in the air to release a series of smaller bomblets that descend on parachutes. Each bomblet carries a heat sensor that decides whether to detonate over the bodies or vehicles below it; if no heat is detected on the ground, the bomblet detonates high in the air instead, to avoid leaving live munitions behind.

"But that decision to detonate is still in the hands of an electronic sensor rather than a person. Our concern is that humans, not sensors, should make targeting decisions. So similarly, we don't want to move towards robots that make decisions about combatants and noncombatants," says Moyes.

"We should not use autonomous armed robots unless they can discriminate between combatants and noncombatants. And that will be never," said Sharkey.

University of Washington researchers agree: robots should not be used in armed conflicts, only for peaceful purposes. Roboticist Ronald Arkin of Georgia Tech, however, believes that robots could one day make more ethical soldiers than humans. He is already working on programming them with ethical rules consistent with the Geneva Conventions. Arkin, who works for the Pentagon, says that the big advantage of using robots in armed conflicts is that they are not affected by emotions, a sense of self-preservation, or fear of their commanders.