
Ban Lethal Autonomous Weapons Systems!

The UK Section of the Women’s International League for Peace and Freedom (WILPF) supports a ban on Lethal Autonomous Weapons Systems (LAWS).

Developing technologies are increasing the levels of autonomy in today’s weapons systems.

China, Israel, Russia, the United Kingdom (UK), the United States of America (USA) and South Korea are among the high-tech states moving toward systems that would give greater combat autonomy to machines.[i]

Precise definitions vary, but there are three generally recognised degrees of supervision for robotic weapons systems (illustrated in the short sketch after this list):

  1. Human-in-the-Loop Weapons: Robots that can select targets and deliver lethal force only with a human command. This category includes drones (robotic aerial vehicles), which are piloted at a distance by humans who assess the target identified by the drone and decide whether or not to attack.
  2. Human-on-the-Loop Weapons: Robots that can select targets and deliver force under the oversight of a human operator who can override the robots’ actions. This category includes defence systems that target and track incoming rockets or missiles and respond without human intervention; the existing override windows are brief – half a second for Israel’s Iron Dome and 4.5 seconds for the NBS Mantis.[ii]
  3. Human-out-of-the-Loop Weapons: Robots that are capable of selecting targets and delivering force without any human input or interaction.[iii] South Korea currently uses a border system, the Super aEgis II, that is an out-of-the-loop weapon, although its fully autonomous function is currently switched off.[iv] The US Navy’s X-47B is capable of taking off from and landing on an aircraft carrier and of autonomous refuelling in mid-air.[v]
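As a purely illustrative aside, the difference between these three degrees of supervision can be sketched in a few lines of Python. This is a minimal sketch based only on the descriptions above, not on any real weapons system; the names and the engagement logic are assumptions made for illustration.

```python
# Illustrative sketch only: modelling the three degrees of human supervision
# described above. Names and logic are assumptions, not any real system's design.
from enum import Enum


class Supervision(Enum):
    HUMAN_IN_THE_LOOP = "force delivered only on a direct human command"
    HUMAN_ON_THE_LOOP = "force delivered unless a human overrides in time"
    HUMAN_OUT_OF_THE_LOOP = "force delivered with no human input at all"


def would_engage(level: Supervision, human_commanded: bool, human_overrode: bool) -> bool:
    """Would a hypothetical system deliver force under each supervision level?"""
    if level is Supervision.HUMAN_IN_THE_LOOP:
        return human_commanded       # nothing happens without a human decision
    if level is Supervision.HUMAN_ON_THE_LOOP:
        return not human_overrode    # proceeds unless the operator intervenes in time
    return True                      # fully autonomous: the human plays no part


# An on-the-loop system engages by default; the operator has only a brief window
# (e.g. the half-second cited above for Iron Dome) in which to override it.
print(would_engage(Supervision.HUMAN_ON_THE_LOOP, human_commanded=False, human_overrode=False))  # True
```

The point of the sketch is simply that, as supervision decreases, the default shifts from ‘do nothing without a human’ to ‘act unless a human intervenes in time’ to ‘act regardless’.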

Human Rights Watch have called for an international treaty that would prohibit the development, production, and use of fully autonomous weapons. They also call on individual nations to pass laws and adopt policies to prevent the development, production, and use of such weapons at the domestic level,[vi] because “the weapons’ likely use beyond the battlefield has been largely ignored. Autonomous weapons could easily be adapted for use in law enforcement operations (as has already happened with drones in Idaho, USA), which, if things go wrong, would trigger international human rights law”.[vii]

Human Rights Watch have comprehensively developed the legal arguments against LAWS: why a robot cannot stand trial, and why it is highly unlikely that any of the personnel involved in the development, implementation and deployment of LAWS – programmers, manufacturers, commanders – could be held liable for the actions of their creations.[viii]

A robot could not be held accountable because “the goals of criminal law, the duties of humanitarian and human rights law for legal accountability and the criminal act of intentionality could not be met. A robot is not a person, has no legal persona and cannot be tried in international courts and would have no concept of ‘punishment’”.[ix] Victims would therefore be left without a sense of justice having been done, without personal or collective psychological closure, and possibly unable to move on in a constructive way.

The UK Foreign Office has stated: “At present we do not see the need for a prohibition on the use of LAWS, as international humanitarian law already provides sufficient regulation for this area”.[x] However, Human Rights Watch has argued that autonomous weapons would not be capable of meeting the key principles of International Humanitarian Law: Distinction, Proportionality and Precaution. In addition, they might contravene the Martens Clause, which requires weapons to be evaluated according to the ‘principles of humanity’ and the ‘dictates of public conscience’.[xi]

Fully autonomous LAWS are estimated to be some 20-30 years away in development terms, but the technology is advancing rapidly: the DARPA (Defense Advanced Research Projects Agency) Robotics Challenge put up $3.5 million in prize money in 2015 (https://www.youtube.com/watch?v=BGOUSvaQcBs). Why would the research arm of the US defence establishment be interested in autonomous robots if not for military purposes?

Research into artificial intelligence (AI) is progressing and became the topic of many conversations following the successful television programme Humans. AI has many applications outside the military,[xii] but it is the weaponisation of this ‘intelligence’ that is the critical issue. The US Navy’s LOCUST programme is developing swarms of autonomous drones intended to ‘multiply combat power at decreased risk to the warfighter’.[xiii]

LAWS development has raised the possibility of a future arms race as nations compete for ‘superiority’. The security of computerised systems is another area receiving too little attention – the consequences of LAWS being hacked could be horrific. We already know the impact on civilians of lives lived under the threat of drones monitoring their every move – lives blighted by fear and marked by deep-seated psychological trauma.[xiv] What will be the impact of advanced LAWS operating among civilian populations?

Whilst there is nothing wrong with developing and using robotic technologies, both physical and AI-based, for economic or social purposes, we must question our own moral worth in delegating responsibility for life-and-death decisions to robots. The UK Section of WILPF is concerned that a ‘wait and see’ attitude prevails and may linger until it is too late. We remind our leaders that 105 states have signed up to the Protocol on Blinding Laser Weapons, and we ask why there is such reluctance on the part of the UK Government and other world leaders to take action and embrace a comprehensive, pre-emptive prohibition on LAWS.

_______________________________

Documents, websites and other materials consulted:

WILPF is on the steering committee of the Campaign to Stop Killer Robots – please visit the site for much more information on this subject. http://www.stopkillerrobots.org

5 of the most advanced robots: https://www.youtube.com/watch?v=tV6-V-h4fFI

Luth, Andrew. Mind Over Machine: Why Human Soldiers are (and Will Remain) Better than Killer Robots. http://stopkillerrobots.ca/2015/06/18/mind-over-machine-why-human-soldiers-are-and-will-remain-better-than-killer-robots/ (accessed 24 June 2015)

Human Rights Watch and the International Human Rights Clinic (IHRC) at Harvard Law School have published the following reports:

  • Shaking the Foundations: The Human Rights Implications of Killer Robots http://www.hrw.org/sites/default/files/reports/arms0514_ForUpload_0.pdf
  • Losing Humanity: The Case Against Killer Robots http://www.hrw.org/sites/default/files/reports/arms1112ForUpload_0_0.pdf
  • Mind The Gap: The Lack of Accountability for Killer Robots http://www.hrw.org/reports/2015/04/09/mind-gap

 

[i] http://www.stopkillerrobots.org/learn/

[ii] http://www.hrw.org/sites/default/files/reports/arms1112ForUpload_0_0.pdf, pp. 10-11

[iii] http://www.hrw.org/sites/default/files/reports/arms1112ForUpload_0_0.pdf, p. 2

[iv] http://www.bbc.com/future/story/20150715-killer-robots-the-soldiers-that-never-sleep

[v] http://www.theregister.co.uk/2015/04/17/unmanned_aerial_vehicles_are_now_autonomously_refueling/

[vi] http://www.hrw.org/news/2012/11/19/ban-killer-robots-it-s-too-late

[vii] http://www.hrw.org/reports/2014/05/12/shaking-foundations

[viii] Human Rights Watch, Mind The Gap, http://www.hrw.org/reports/2015/04/09/mind-gap, pp. 2-3

[ix] Human Rights Watch, Mind The Gap, http://www.hrw.org/reports/2015/04/09/mind-gap, pp. 2-3

[x] http://www.theguardian.com/politics/2015/apr/13/uk-opposes-international-ban-on-developing-killer-robots

[xi] http://www.hrw.org/sites/default/files/reports/arms1112ForUpload_0_0.pdf, pp. 30-35

[xii] http://www.technologyreview.com/news/533686/2014-in-computing-breakthroughs-in-artificial-intelligence/

http://www.independent.co.uk/life-style/gadgets-and-tech/news/a-robot-has-passed-the-selfawareness-test-10395895.html#

[xiii] http://www.onr.navy.mil/Media-Center/Press-Releases/2015/LOCUST-low-cost-UAV-swarm-ONR.aspx

[xiv] http://www.channel4.com/news/drone-attacks-traumatising-a-generation-of-children (accessed 8 July 2015)

International Human Rights and Conflict Resolution Clinic, Living Under Drones, Sept 2012. http://livingunderdrones.org/ (accessed 6 July 2015)
