Open letter: UK Government must ensure that the UK is not complicit in Israel’s AI-supported killing of Palestinian civilians

Along with other humanitarian, peace-building and human rights organisations, and in support of the UK Campaign to Stop Killer Robots, we have signed this letter to the UK Government raising the alarm at the Israeli Defence Forces' use of automated military systems in Gaza and at the UK's broader role in supporting Israeli military operations.

Read the full letter below, and find out more at UK Stop Killer Robots.

To: Rt Hon Rishi Sunak MP, UK Prime Minister; Lord David Cameron, Foreign Secretary, Foreign, Commonwealth & Development Office; Mr Grant Shapps, Secretary of State for Defence; Mr Andrew Mitchell, Minister of State in the Foreign, Commonwealth & Development Office (FCDO)

We write to you as technology experts, engineers, international lawyers, human rights advocates, academics and individuals of conscience, and as members and friends of the UK Campaign to Stop Killer Robots, regarding the devastating attacks on Palestine.

Over more than four months of crippling violence in Gaza, the Israeli Defence Forces (IDF) have targeted health infrastructure, UN schools, refugee camps, religious sites, private residences and public buildings. At the time of writing, more than 30,000 Palestinian civilians have been killed, and 1.7 million of Gaza’s 2.2 million residents have been displaced and are now succumbing to starvation and disease. Analysis suggests that over 50% of buildings across Gaza have been destroyed, and UN agencies, human rights organisations and the US Government have noted the indiscriminate nature of Israel’s bombing campaign. As humanitarians we unequivocally condemn such indiscriminate attacks and express our solidarity with Palestinians suffering in Gaza and facing increased settler violence in the West Bank. We unequivocally condemn the attacks by Hamas on 7 October and call for the release of Israeli hostages held in Gaza. We also call on the Israeli authorities to release all Palestinians who are arbitrarily detained.

In our capacity as UK organisations, academics and individuals with expertise in autonomous weapons systems, we have been assessing the IDF’s use of military artificial intelligence (AI) and the role played by the UK in helping facilitate Israeli operations. We are concerned by the global trend towards the increased application of military AI and other advanced technologies in conflict. Examples include, but are in no way limited to, reports of AI use in Ukraine, Facebook’s algorithms amplifying hate speech during Ethiopia’s Tigray civil war, and voice cloning in Sudan’s civil war, all of which directly contribute to violence and involve technologies that can be used for target manipulation by weapons systems. The widely reported use of military AI in Gaza and the Occupied Palestinian Territories (OPT) escalates this trend, giving rise to urgent questions for the UK Government. We are horrified that civilian populations and infrastructure appear to be being used as testing grounds for unregulated technology development, with lethal effect. As advocates for human rights law, international humanitarian law (IHL), the laws of armed conflict (LOAC) and all associated international law and conventions, we strongly urge adherence to and the upholding of the rule of law. As citizens of the United Kingdom, we call on our government to uphold its obligations and the very laws it purports to defend.

The role of AI in facilitating harm in Gaza and the OPT is multidimensional. Israel’s occupation and suppression of Palestinian rights, widely believed to constitute the crime of apartheid, depends on highly militarised policing supported by sophisticated surveillance tools. An experimental facial recognition technology known as ‘Red Wolf’ has been used to track Palestinians and determine restrictions on their freedom of movement. It is widely known that machine-generated functionality reinforces prejudices and structures of oppression: biases are first embedded in incomplete data and imperfect labels; the algorithm then produces decision outcomes from statistical probabilities based on historical patterns, set within fixed parameters and exposed to environmental chaos. Israel has developed a wide range of military systems that are integrated with AI, including rifles, bombs and other platforms. The IDF’s use of the “Habsora” (“The Gospel”) AI system, which uses machine learning to generate recommended targets, has enabled a significant increase in the scale of attacks. According to a former IDF Chief of Staff, AI target generation enabled the IDF to shift from a more manual process to one where a “machine produced 100 targets in one day”. AI generation may explain why, after more than four months, the IDF has seemingly not yet run out of targets. While a senior IDF official stated that there is always “a person supervising”, one Israeli intelligence source suggested the quantity of data generated by Habsora was so overwhelming that “tens of thousands of intelligence officers could not process it.” This classic information-overload (“infobesity”) problem makes it difficult even for the machine to decipher which information is relevant and which it should ignore, which to feed back to command and which to discard. This gives rise to the well-known phenomenon of automation bias, whereby human operators, without full knowledge of the inputs used to inform a decision, follow faulty machine-generated suggestions.

The combined effect of the IDF’s use of AI to process intelligence and generate targets, facilitating attacks in the apparent absence of meaningful human control, raises many of the same concerns that are widely associated with autonomous weapons systems. Handing over life-and-death decision-making to machines is an assault on fundamental human rights. It is profoundly wrong to reduce a human being to a series of data points and end their life on the basis of AI-performed pattern recognition, all in the absence of meaningful human oversight. These appear to be clear violations of Additional Protocol I to the Geneva Conventions: Article 48, which requires “distinction”; Article 51, prohibiting indiscriminate and disproportionate harm; Article 57, which states that parties to a conflict must take precautions to spare civilian populations and civilian objects; and Article 85, which makes it a “grave breach” to attack indiscriminately in the knowledge of anticipated excess harm. We fail to see how the use of knowingly biased data and demonstrably unpredictable technology in civilian areas, with no distinction “between the civilian population and combatants” as per Article 48, could not be a violation of any or all of these provisions. As such we deem this technology to be unlawfully deployed.

Why is this a UK issue?

The UK is the world’s second-largest arms exporter, with the UK Government having approved licences worth over half a billion pounds sterling to Israel in the last decade. The UK is also in the midst of deepening relations with Israel via a new free trade deal, part of a new bilateral strategy which describes “limitless” ambition for the relationship, further committing its diplomatic support to Israel. Despite calls for greater scrutiny, the UK has stated that it will continue to arm Israel. Meanwhile, the UK’s Foreign Secretary, David Cameron, now faces calls to clarify the UK’s justification for continued arms sales to Israel, particularly given the overwhelming evidence that the Israeli government and its armed forces are breaching IHL, as well as calls from independent UN experts for all states to “immediately cease” arms transfers to Israel which might be used in Gaza.

Israel relies heavily on arms imports via the UK Government’s secretive Open General Export Licences, which allow for unlimited exports of the listed items; no information is published regarding the quantities or values of items or technology transferred under open licences. The Government does report that under standard licences, which do place limits on quantities, the UK has since 2008 approved at least £185 million worth of licences under the ML22 category, “military technology”, which includes technology supplied to operate the software of weapons systems, and £1.9 million under the ML21 category, “software”, which includes software designed to operate weapons systems. A lack of transparency makes it unclear what military functionality is provided by this technology. For the UK to ensure that arms export licences are issued in accordance with the national arms export criteria, specifically Criterion 2 relating to serious violations of international humanitarian and human rights law, it would have been required to assess whether Israeli targeting systems were sufficiently capable of distinguishing between combatants and non-combatants and of ensuring commanders’ ability to take all feasible precautions to avoid civilian casualties. In addition, we understand that the UK’s main parliamentary oversight mechanism, the Committees on Arms Export Controls (CAEC), has recently been disbanded, amid disagreements over how scrutiny will proceed. We have forwarded a copy of this letter to the Chairs of the Committees that previously comprised the CAEC to request urgent clarification of how parliamentary oversight processes are addressing the concerns outlined in this letter at this crucial juncture.

The other significant dimension of the UK’s involvement relates to the provision of intelligence and surveillance data. On 13 October the UK confirmed that the Prime Minister had deployed UK military assets, including P-8 aircraft and other surveillance assets, to the Eastern Mediterranean to “support Israel and reinforce regional stability”. Since then, 12 UK military aircraft have been deployed to the region, as well as a naval task force. Recent reports suggest the UK has flown at least 50 spy missions over Gaza in support of Israel from its base in Cyprus. More broadly, the UK’s intelligence-sharing relationship with Israel is widely known to be extensive. While the UK Government has stated that information sharing will be restricted to the purposes of hostage release, we request reassurance that no UK-provided data will be fed, alongside other inputs, into Israel’s AI targeting processes, and details of the safeguards that would need to be in place to uphold this arrangement.

We are deeply alarmed by the use of automated military systems in Gaza and the UK’s broader role in supporting Israeli military operations. Gaza must not be used as a testing ground for new AI-enabled weapons technology. We urge the UK to:

* Support the numerous United Nations resolutions demanding an immediate ceasefire in Gaza, and uphold the International Court of Justice’s ruling obliging Israel to prevent genocidal acts against Palestinians

* Immediately halt all transfers of arms and military technology to Israel

* Publish details of any UK export, including technology and software, at risk of being used in Israel’s offensive in Gaza

* Explain what measures are being taken to ensure the UK is not complicit, through data sharing or otherwise, in Israel’s AI-driven targeting systems

* Join the UN Secretary-General, the Pope, and over 100 countries in working for a legally binding treaty to regulate autonomy in weapons systems

* Continue aid to refugees, as well as to support structures and organisations, including UNRWA

* Support independent accountability mechanisms and access required for record and evidence gathering.

Without action to address the concerns laid out in this letter, confidence that the UK is not complicit in Israel’s AI-supported slaughter of Palestinian civilians is impossible. 

Read more at UK Stop Killer Robots
