This Channel 4 News segment, about 20 minutes in, is very well worth watching.
They interview Yuval Abraham, the Israeli writer who has reported in +972 Magazine on the IDF’s use of an AI system called “Lavender” to select targets for the Israeli army.
The AI draws on mass surveillance of the entire Palestinian population to pick out Hamas targets. It creates lists of possible targets based on signals such as an individual being in a WhatsApp group that includes a Hamas terrorist, moving house, or changing mobile phones. But the lists can cast an even wider net, especially when the system goes on to flag many of the most minor Hamas operatives, whom it can select for the most minor of reasons.
The lists need human checking.
In practice, though, this human checking was not happening except at the most basic level: listening to hear whether the target’s voice was male or female. If female: wrong target. If male: target/Hamas. But this identification as ‘Hamas’ was not necessarily true.
Worse, once someone became a target, that person’s house would be bombed, regardless of who else was inside, and the whole family could be killed. They did not even bother checking afterwards to see whether they had got the target himself; they would simply move on to the next house to bomb. Individual operators could be given 40 houses a day to bomb off the AI list, striking one after the other.
At one stage the IDF was allowing 15–20 deaths as ‘collateral damage’ for every one ‘target’ (correct target or not).
The IDF claims not to have used this AI in this way. Yuval Abraham’s reporting shows that they have.
Is Israel using AI to identify targets in Gaza war? – Channel 4 News
https://www.channel4.com/news/is-israel-using-ai-to-identify-targets-in-gaza-war