Israel accused of using AI to target thousands in Gaza, as killer algorithms outpace international law



Credit: Pixabay/CC0 Public Domain

The Israeli army used a new artificial intelligence (AI) system to generate lists of tens of thousands of human targets for potential airstrikes in Gaza, according to a report published last week. The report comes from the nonprofit outlet +972 Magazine, which is run by Israeli and Palestinian journalists.

The report cites interviews with six unnamed sources in Israeli intelligence. The sources claim the system, known as Lavender, was used with other AI systems to target and assassinate suspected militants—many in their own homes—causing large numbers of civilian casualties.

According to another report in the Guardian, based on the same sources as the +972 report, one intelligence officer said the system “made it easier” to carry out large numbers of strikes, because “the machine did it coldly”.

As militaries around the world race to use AI, these reports show us what it may look like: machine-speed warfare with limited accuracy and little human oversight, with a high cost for civilians.

Military AI in Gaza is not new

The Israeli Defence Force denies many of the claims in these reports. In a statement, it said it “does not use an artificial intelligence system that identifies terrorist operatives”. It said Lavender is not an AI system but “simply a database whose purpose is to cross-reference intelligence sources”.

But in 2021, the Jerusalem Post reported an intelligence official saying Israel had just won its first “AI war”—an earlier conflict with Hamas—using a number of machine learning systems to sift through data and produce targets. In the same year, a book called The Human-Machine Team, which outlined a vision of AI-powered warfare, was published under a pseudonym by an author reported to be the head of a key Israeli clandestine intelligence unit.

Last year, another report said Israel also uses an AI system called Habsora to identify potential militant buildings and facilities to bomb. According to that report, Habsora generates targets “almost automatically”, and one former intelligence officer described it as “a mass assassination factory”.

The latest report also claims a third system, called Where’s Daddy?, monitors targets identified by Lavender and alerts the military when they return home, often to their family.

Death by algorithm

Several countries are turning to algorithms in search of a military edge. The US military’s Project Maven supplies AI targeting that has been used in the Middle East and Ukraine. China too is rushing to deploy AI to analyze data, select targets, and aid in decision-making.

Proponents of military AI argue it will enable faster decision-making, greater accuracy and reduced casualties in warfare.

Yet last year, Middle East Eye reported that an Israeli intelligence officer said having a human review every AI-generated target in Gaza was “not feasible at all”. Another source said they personally “would invest 20 seconds for each target”, acting merely as a “rubber stamp” of approval.

The Israeli Defence Force response to the most recent report says “analysts must conduct independent examinations, in which they verify that the identified targets meet the relevant definitions in accordance with international law”.

As for accuracy, the latest report claims Lavender automates the process of identification and cross-checking to ensure a potential target is a senior Hamas military figure. According to the report, Lavender loosened the targeting criteria to include lower-ranking personnel and weaker standards of evidence, and made errors in “approximately 10% of cases”.

The report also claims one Israeli intelligence officer said that due to the Where’s Daddy? system, targets would be bombed in their homes “without hesitation, as a first option”, leading to civilian casualties. The Israeli army says it “outright rejects the claim regarding any policy to kill tens of thousands of people in their homes”.

Rules for military AI?

As military use of AI becomes more common, ethical, moral and legal concerns have largely been an afterthought. There are so far no clear, universally accepted or legally binding rules about military AI.

The United Nations has been discussing “lethal autonomous weapons systems” for more than ten years. These are devices that can make targeting and firing decisions without human input, sometimes known as “killer robots”. Last year saw some progress.

The UN General Assembly voted in favor of a new draft resolution to ensure algorithms “must not be in full control of decisions involving killing”. Last October, the US also released a declaration on the responsible military use of AI and autonomy, which has since been endorsed by 50 other states. The first summit on the responsible use of military AI was held last year, too, co-hosted by the Netherlands and the Republic of Korea.

Overall, international rules over the use of military AI are struggling to keep pace with the fervor of states and arms companies for high-tech, AI-enabled warfare.

Facing the ‘unknown’

Some Israeli startups that make AI-enabled products are reportedly making a selling point of their use in Gaza. Yet reporting on the use of AI systems in Gaza suggests how far AI falls short of the dream of precision warfare, instead creating serious humanitarian harms.

The industrial scale at which AI systems like Lavender can generate targets also effectively displaces humans in decision-making.

The willingness to accept AI suggestions with barely any human scrutiny also widens the scope of potential targets, inflicting greater harm.

Setting a precedent

The reports on Lavender and Habsora show us what current military AI is already capable of doing. The risks posed by future military AI may be greater still.

Chinese military analyst Chen Hanghui has envisioned a future “battlefield singularity”, for example, in which machines make decisions and take actions at a pace too fast for a human to follow. In this scenario, we are left as little more than spectators or casualties.

A study published earlier this year sounded another warning note. US researchers carried out an experiment in which large language models such as GPT-4 played the role of nations in a wargaming exercise. The models almost inevitably became trapped in arms races and escalated conflict in unpredictable ways, including by using nuclear weapons.

The way the world reacts to current uses of military AI—like we are seeing in Gaza—is likely to set a precedent for the future development and use of the technology.

Provided by
The Conversation

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Citation: Israel accused of using AI to target thousands in Gaza, as killer algorithms outpace international law (2024, April 11), retrieved 11 April 2024.

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.






