‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets | Israel-Gaza war

The Israeli military’s bombing campaign in Gaza used a previously undisclosed AI-powered database that at one stage identified 37,000 potential targets based on their apparent links to Hamas, according to intelligence sources involved in the war.

In addition to talking about their use of the AI system, called Lavender, the intelligence sources claim that Israeli military officials permitted large numbers of Palestinian civilians to be killed, particularly during the early weeks and months of the conflict.

Their unusually candid testimony provides a rare glimpse into the first-hand experiences of Israeli intelligence officials who have been using machine-learning systems to help identify targets during the six-month war.

Israel’s use of powerful AI systems in its war on Hamas has entered uncharted territory for advanced warfare, raising a host of legal and moral questions, and transforming the relationship between military personnel and machines.

“This is unparalleled, in my memory,” said one intelligence officer who used Lavender, adding that they had more faith in a “statistical mechanism” than a grieving soldier. “Everyone there, including me, lost people on October 7. The machine did it coldly. And that made it easier.”

Another Lavender user questioned whether humans’ role in the selection process was meaningful. “I would invest 20 seconds for each target at this stage, and do dozens of them every day. I had zero added-value as a human, apart from being a stamp of approval. It saved a lot of time.”

Palestinian children salvage items amid the destruction caused by Israeli airstrikes in Bureij, central Gaza, on 14 March.
Photograph: AFP/Getty Images

The testimony from the six intelligence officers, all of whom have been involved in using AI systems to identify Hamas and Palestinian Islamic Jihad (PIJ) targets in the war, was given to the journalist Yuval Abraham for a report published by +972 Magazine and the Hebrew-language outlet Local Call. Their accounts were shared exclusively with the Guardian in advance of publication.

All six said that Lavender had played a central role in the war, processing masses of data to rapidly identify potential “junior” operatives to target. Four of the sources said that, at one stage early in the war, Lavender listed as many as 37,000 Palestinian men who had been linked by the AI system to Hamas or PIJ.

Lavender was developed by the Israel Defense Forces’ elite intelligence division, Unit 8200, which is comparable to the US’s National Security Agency or GCHQ in the UK.

Several of the sources described how, for certain categories of targets, the IDF applied pre-authorised allowances for the estimated number of civilians who could be killed before a strike was authorised.

Two sources said that during the early weeks of the war they were permitted to kill 15 or 20 civilians during airstrikes on low-ranking militants. Attacks on such targets were typically carried out using unguided munitions known as “dumb bombs”, the sources said, destroying entire homes and killing all their occupants.

“You don’t want to waste expensive bombs on unimportant people – it’s very expensive for the country and there’s a shortage [of those bombs],” one intelligence officer said.

Another said the principal question they were faced with was whether the “collateral damage” to civilians allowed for an attack.

“Because we usually carried out the attacks with dumb bombs, and that meant literally dropping the whole house on its occupants. But even if an attack is averted, you don’t care – you immediately move on to the next target.
Because of the system, the targets never end. You have another 36,000 waiting.”

According to conflict experts, if Israel has been using dumb bombs to flatten the homes of thousands of Palestinians who were linked, with the assistance of AI, to militant groups in Gaza, that could help explain the shockingly high death toll in the war.

The health ministry in the Hamas-run territory says 33,000 Palestinians have been killed in the conflict in the past six months. UN data shows that in the first month of the war alone, 1,340 families suffered multiple losses, with 312 families losing more than 10 members.

Israeli soldiers stand on the Israeli side of the Israel-Gaza border surveying the Palestinian territory on 30 March. Photograph: Amir Cohen/Reuters

Responding to the publication of the testimonies in +972 and Local Call, the IDF said in a statement that its operations were carried out in accordance with the rules of proportionality under international law. It said dumb bombs are “standard weaponry” that are used by IDF pilots in a manner that ensures “a high level of precision”.

The statement described Lavender as a database used “to cross-reference intelligence sources, in order to produce up-to-date layers of information on the military operatives of terrorist organisations. This is not a list of confirmed military operatives eligible to attack.

“The IDF does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist,” it added. “Information systems are merely tools for analysts in the target identification process.”

Lavender created a database of tens of thousands of individuals

In earlier military operations conducted by the IDF, producing human targets was often a more labour-intensive process.
Multiple sources who described target development in previous wars to the Guardian said the decision to “incriminate” an individual, or identify them as a legitimate target, would be discussed and then signed off by a legal adviser.

In the weeks and months after 7 October, this model for approving strikes on human targets was dramatically accelerated, according to the sources. As the IDF’s bombardment of Gaza intensified, they said, commanders demanded a continuous pipeline of targets.

“We were constantly being pressured: ‘Bring us more targets.’ They really shouted at us,” said one intelligence officer. “We were told: now we have to fuck up Hamas, no matter what the cost. Whatever you can, you bomb.”

To meet this demand, the IDF came to rely heavily on Lavender to generate a database of individuals judged to have the characteristics of a PIJ or Hamas militant.

Details about the specific kinds of data used to train Lavender’s algorithm, or how the programme reached its conclusions, are not included in the accounts published by +972 or Local Call. However, the sources said that during the first few weeks of the war, Unit 8200 refined Lavender’s algorithm and tweaked its search parameters.

After randomly sampling and cross-checking its predictions, the unit concluded Lavender had achieved a 90% accuracy rate, the sources said, leading the IDF to approve its sweeping use as a target recommendation tool.

Lavender created a database of tens of thousands of individuals who were marked as predominantly low-ranking members of Hamas’s military wing, they added. This was used alongside another AI-based decision support system, called the Gospel, which recommended buildings and structures as targets rather than individuals.

Two Israeli air force F15 fighter jets near the city of Gedera, southern Israel, on 27 March.
Photograph: Abir Sultan/EPA

The accounts include first-hand testimony of how intelligence officers worked with Lavender and how the reach of its dragnet could be adjusted. “At its peak, the system managed to generate 37,000 people as potential human targets,” one of the sources said. “But the numbers changed all the time, because it depends on where you set the bar of what a Hamas operative is.”

They added: “There were times when a Hamas operative was defined more broadly, and then the machine started bringing us all kinds of civil defence personnel, police officers, on whom it would be a shame to waste bombs. They help the Hamas government, but they don’t really endanger soldiers.”

Before the war, the US and Israel estimated membership of Hamas’s military wing at approximately 25,000-30,000 people.

In the weeks after the Hamas-led 7 October attacks on southern Israel, in which Palestinian militants killed nearly 1,200 Israelis and kidnapped about 240 people, the sources said there was a decision to treat Palestinian men linked to Hamas’s military wing as potential targets, regardless of their rank or importance.

The IDF’s targeting processes in the most intensive phase of the bombardment were also relaxed, they said. “There was a completely permissive policy regarding the casualties of [bombing] operations,” one source said. “A policy so permissive that in my opinion it had an element of revenge.”

Another source, who justified the use of Lavender to help identify low-ranking targets, said that “when it comes to a junior militant, you don’t want to invest manpower and time in it”. They said that in wartime there was insufficient time to carefully “incriminate every target”.

“So you’re willing to take the margin of error of using artificial intelligence, risking collateral damage and civilians dying, and risking attacking by mistake, and to live with it,” they added.
‘It’s much easier to bomb a family’s home’

The testimonies published by +972 and Local Call may explain how a western military with such advanced capabilities, with weapons that can conduct highly surgical strikes, has conducted a war with such a vast human toll.

When it came to targeting low-ranking Hamas and PIJ suspects, they said, the preference was to attack when they were believed to be at home. “We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity,” one said. “It’s much easier to bomb a family’s home. The system is built to look for them in these situations.”

Relatives outside the morgue of the al-Najjar hospital in Rafah mourn Palestinians killed in Israeli bombings on 1 February. Photograph: Mohammed Abed/AFP/Getty Images

Such a strategy risked higher numbers of civilian casualties, and the sources said the IDF imposed pre-authorised limits on the number of civilians it deemed acceptable to kill in a strike aimed at a single Hamas militant. The ratio was said to have changed over time, and varied according to the seniority of the target. According to +972 and Local Call, the IDF judged it permissible to kill more than 100 civilians in attacks on top-ranking Hamas officials.

“We had a calculation for how many [civilians could be killed] for the brigade commander, how many [civilians] for a battalion commander, and so on,” one source said.

“There were regulations, but they were just very lenient,” another added. “We’ve killed people with collateral damage in the high double digits, if not low triple digits. These are things that haven’t happened before.”

There appear to have been significant fluctuations in the figure that military commanders would tolerate at different stages of the war. One source said that the limit on permitted civilian casualties “went up and down” over time, and at one point was as low as five.
During the first week of the conflict, the source said, permission was given to kill 15 non-combatants to take out junior militants in Gaza. However, they said estimates of civilian casualties were imprecise, as it was not possible to know definitively how many people were in a building.

Another intelligence officer said that more recently in the conflict, the rate of permitted collateral damage was brought down again. But at one stage earlier in the war they were authorised to kill up to “20 uninvolved civilians” for a single operative, regardless of their rank, military importance, or age.

“It’s not just that you can kill any person who is a Hamas soldier, which is clearly permitted and legitimate in terms of international law,” they said. “But they directly tell you: ‘You are allowed to kill them along with many civilians.’ … In practice, the proportionality criterion did not exist.”

The IDF statement said its procedures “require conducting an individual assessment of the anticipated military advantage and collateral damage expected … The IDF does not carry out strikes when the expected collateral damage from the strike is excessive in relation to the military advantage.” It added: “The IDF outright rejects the claim regarding any policy to kill tens of thousands of people in their homes.”

Experts in international humanitarian law who spoke to the Guardian expressed alarm at accounts of the IDF accepting and pre-authorising collateral damage ratios as high as 20 civilians, particularly for lower-ranking militants. They said militaries must assess proportionality for each individual strike.

Smoke rises over the Gaza Strip, as seen from the Israeli side of the border on 21 January. Photograph: Amir Levy/Getty Images

An international law expert at the US state department said they had “never remotely heard of a one to 15 ratio being deemed acceptable, especially for lower-level combatants. There’s a lot of leeway, but that strikes me as extreme”.
Sarah Harrison, a former lawyer at the US Department of Defense, now an analyst at Crisis Group, said: “While there may be certain occasions where 15 collateral civilian deaths could be proportionate, there are other times where it definitely wouldn’t be. You can’t just set a tolerable number for a category of targets and say that it’ll be lawfully proportionate in each case.”

Whatever the legal or moral justification for Israel’s bombing strategy, some of its intelligence officers appear now to be questioning the approach set by their commanders. “No one thought about what to do afterward, when the war is over, or how it will be possible to live in Gaza,” one said.

Another said that after the 7 October attacks by Hamas, the atmosphere in the IDF was “painful and vindictive”. “There was a dissonance: on the one hand, people here were frustrated that we were not attacking enough. On the other hand, you see at the end of the day that another thousand Gazans have died, most of them civilians.”