Campaign asks for international treaty to limit war robots
by Nic Fleming for New Scientist
30 September 2009
A robotics expert, a physicist, a bioethicist and a philosopher have founded the International Committee for Robot Arms Control (ICRAC) to campaign for limits on robotic military hardware. Roboticist Noel Sharkey at the University of Sheffield, UK, and his colleagues set up ICRAC after a two-day meeting in Sheffield earlier this month. Sharkey has spoken before of ethical concerns about military systems that make their own decisions. "Robot weapons are likely to change the character of warfare," Sharkey told New Scientist. "We seem to be rushing headlong into the development of autonomous weapons systems without any real concern for the long-term impact on civilian populations."
In its opening declaration the committee called for military robots to be banned from space and said no robotic systems should carry nuclear weapons. The other founding members of ICRAC are physicist Jürgen Altmann of Dortmund University of Technology, Germany; Robert Sparrow of the Centre for Human Bioethics, Monash University, near Melbourne, Australia; and philosopher Peter Asaro of Rutgers University in New Brunswick, New Jersey. The committee will recruit more people to monitor the development of autonomous weapons and to campaign for preventative arms control – like the regulations that govern nuclear and biological weapons – to be applied to robots.
The US air force's remote-controlled aircraft – MQ-1 Predators and MQ-9 Reapers – are playing an ever-growing role in the conflicts in Iraq and Afghanistan. And thousands of ground-based robots have been used to help western forces carry out surveillance in dangerous areas of these countries and to locate and disarm bombs worldwide. Among the most advanced military robots are Talons – small tracked vehicles fitted with chemical, temperature and radiation sensors that can also carry grenade launchers, machine guns and .50-calibre rifles. Close to 50 countries either already have or are working to obtain robotic military systems, says Sharkey. So far these are all controlled remotely by pilots or other operators.
ICRAC fears the principle of keeping a "man in the loop" will be eroded, so that the next generation of robot soldiers will be trusted with life-or-death decisions. Indeed, research into just such scenarios is taking place with US military funding. The committee is also worried that countries will be more likely to go to war if their casualties are robots rather than human soldiers, and it has raised the danger of autonomous systems starting and escalating conflicts automatically. The committee is drawing up a report on its concerns to present to the European parliament and plans to invite researchers, politicians and representatives of the military to a conference in Germany next summer.

However, robot soldiers have their place, says Michael Codner, director of military sciences at the Royal United Services Institute, a defence think tank in London. "If you are using them to clear mines and there is no one at risk, it makes absolute sense to use them. If one reaches the stage of artificial intelligence where robots become unpredictable because they are making their own minds up, it will be difficult to retain responsibility in the user," he concedes. "But that is an issue that will be some way in the future. There is time for ethics and law to cope with this eventually."

Robotics engineer Ron Arkin at the Georgia Institute of Technology, Atlanta, has argued that machines could perform more ethically than humans in some battlefield situations if they had ethical rules and biases incorporated into their control software.
[The best thing I can say about this is that at least some people are becoming aware of the issues around the potential use of intelligent robots in warfare. I can't help but think, though, that this is a cardinal case of too little, too late. I doubt very much whether any military organisation anywhere in the world will listen to academics about the risks associated with any future autonomous killing machines that will be developed to fight our wars for us. But at least they can't say that no one told them.]