
Ban Lethal Autonomous Weapons Systems!

The UK Section of the Women’s International League for Peace and Freedom (WILPF) supports a ban on Lethal Autonomous Weapons Systems (LAWS).

Developing technologies are adding to the levels of autonomy in today's weapons systems.

China, Israel, Russia, the United Kingdom (UK), the United States of America (USA) and South Korea are among the high-tech states moving toward systems that would give greater combat autonomy to machines.[i]

Precise definitions vary but there are three generally recognised degrees of supervision for robotic weapons systems:

  1. Human-in-the-Loop Weapons: robots that can select targets and deliver lethal force only with a human command. This includes drones (robotic aerial vehicles), piloted at a distance by humans who assess the target identified by the drone and decide whether or not to attack.
  2. Human-on-the-Loop Weapons: robots that can select targets and deliver force under the oversight of a human operator who can override the robots' actions. This includes defence systems that target and track incoming rockets or missiles and respond without human intervention. The current override windows are brief: half a second for Israel's Iron Dome, 4.5 seconds for the NBS Mantis.[ii]
  3. Human-out-of-the-Loop Weapons: robots that are capable of selecting targets and delivering force without any human input or interaction.[iii] South Korea currently deploys a border system, the Super aEgis II, that is an out-of-the-loop system, although its fully autonomous function is currently switched off.[iv] The US Navy X-47B is capable of taking off from and landing on an aircraft carrier and of autonomous mid-air refuelling.[v]

Human Rights Watch has called for an international treaty that would prohibit the development, production, and use of fully autonomous weapons. It also calls on individual nations to pass laws and adopt policies to prevent the development, production, and use of such weapons at the domestic level,[vi] because "the weapons' likely use beyond the battlefield has been largely ignored". Autonomous weapons could easily be adapted for use in law enforcement operations (as has already happened with drones in Idaho, USA), which, if things went wrong, would trigger international human rights law.[vii]

Human Rights Watch has comprehensively developed the legal arguments against LAWS: why a robot cannot stand trial, and why it is highly unlikely that any of the personnel involved in the development, implementation and deployment of LAWS (programmers, manufacturers, commanders) could be held liable for the actions of their creations.[viii]

A robot could not be held accountable because "the goals of criminal law, the duties of humanitarian and human rights law for legal accountability and the criminal act of intentionality could not be met. A robot is not a person, has no legal persona and cannot be tried in international courts and would have no concept of 'punishment'".[ix] Victims would therefore be left without a sense that justice had been done, without personal or collective psychological closure, and possibly unable to move on in a constructive way.

The UK Foreign Office has stated: "At present we do not see the need for a prohibition on the use of LAWS, as international humanitarian law already provides sufficient regulation for this area".[x] However, Human Rights Watch has argued that autonomous weapons would not be capable of meeting the key principles of international humanitarian law: distinction, proportionality and precaution. In addition, such weapons might contravene the Martens Clause, which requires weapons to be evaluated according to the 'principles of humanity' and the 'dictates of public conscience'.[xi]

Fully autonomous LAWS are estimated to be some 20-30 years away in development terms, but the technology is advancing rapidly: the DARPA (Defense Advanced Research Projects Agency) Robotics Challenge put up $3.5 million in prize money this year (2015). Why would the research arm of the US military establishment be interested in autonomous robots if not for military purposes?

Research into artificial intelligence (AI) is progressing, and it became the topic of many conversations following the successful television programme Humans. AI has many applications outside the military,[xii] but it is the weaponisation of this 'intelligence' that is the critical issue. The US LOCUST programme is concerned with the further development of swarms of autonomous robots that will 'multiply combat power at decreased risk to the warfighter'.[xiii]

LAWS development has raised the possibility of a future arms race as nations compete for 'superiority'. The security of computerised systems is another area receiving too little attention: the consequences of LAWS being hacked could be horrific. We already know the impact on civilians of lives lived under the threat of drones monitoring their every move: lives blighted by fear and deep-seated psychological trauma.[xiv] What will be the impact of advanced LAWS operating among civilian populations?

Whilst there is nothing wrong with developing and using robotic technologies, both physical and AI-based, for economic or social purposes, we must question our own moral worth in delegating responsibility for life-and-death decision-making to robots. The UK Section of WILPF is concerned that a 'wait and see' attitude prevails and may linger until it is too late. We remind our leaders that 105 states have signed up to the Protocol on Blinding Laser Weapons, and we ask why there is such reluctance by the UK Government and other world leaders to take action and embrace a comprehensive, pre-emptive prohibition on LAWS.


Documents, websites and other materials consulted:

WILPF is on the steering committee of the Campaign to Stop Killer Robots – please visit the site for much more information on this subject.

5 of the most advanced robots:

Luth, Andrew. Mind Over Machine: Why Human Soldiers are (and Will Remain) Better than Killer Robots (accessed 24 June 2015)

Human Rights Watch and the International Human Rights Clinic (IHRC) at Harvard Law School have published the following reports:

  • Shaking the Foundations: The Human Rights Implications of Killer Robots
  • Losing Humanity: The Case Against Killer Robots
  • Mind The Gap: The Lack of Accountability for Killer Robots



[ii] p10-11

[iii] p2





[viii]Human Rights Watch, Mind The Gap, p2-3

[ix]Human Rights Watch, Mind The Gap, p2-3


[xi] p.30-35



[xiv] (accessed 8 July 2015)

International Human Rights and Conflict Resolution Clinic, Living Under Drones, Sept 2012. (accessed 6 July 2015)


