
Conflict in the Middle East

IDF killing families in their homes

29 replies

Polka83 · 04/04/2024 14:57

Some commentators have blamed Hamas for the high number of casualties in Gaza, due to its use of civilians as "human shields" during active combat.

However, it now appears artificial intelligence was used to rapidly identify potential members of Hamas (with a significant 10% margin of error). These targets were not corroborated beyond a roughly 20-second check that they were not female. The targets were then located when they returned HOME to their families using a program called "Where's Daddy?"

Dumb bombs were used to destroy family homes; sometimes the target was not even at home. These were used because they were cheaper for the IDF!

https://www.972mag.com/lavender-ai-israeli-army-gaza/

‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza

The Israeli army has marked tens of thousands of Gazans as suspects for assassination, using an AI targeting system with little human oversight and a permissive policy for casualties, +972 and Local Call reveal.


OP posts:
Scirocco · 04/04/2024 15:05

The name of the program says a lot about the character of the people designing and using it.

Efacsen · 04/04/2024 15:10

Apparently it will be on sale soon - for any other countries who fancy 'Gaza-ing' their enemies.

Polka83 · 04/04/2024 15:22

It will be interesting to see what the ICJ thinks of the legality of this approach. It is more lethal than even the approach the IDF has previously taken with regard to what it viewed as proportional.

OP posts:
Silence1 · 04/04/2024 15:30

Polka83 · 04/04/2024 15:22

It will be interesting to see what the ICJ thinks of the legality of this approach. It is more lethal than even the approach the IDF has previously taken with regard to what it viewed as proportional.

It will be. It's very chilling to read and I am only halfway through it.

Polka83 · 04/04/2024 18:24

EasterIssland · 04/04/2024 16:42

I posted this the other day. They’re training the software in real life with Palestinians

https://asia.nikkei.com/Politics/Israel-Hamas-war/Israeli-startups-hope-to-export-battle-tested-AI-military-tech

“The company supplies smart fire-control equipment for rifles that tracks a target to ensure a precise hit.”
Shame they aren’t creating tech that prevents snipers killing children and civilians raising white flags.

OP posts:
Scirocco · 04/04/2024 20:09

Polka83 · 04/04/2024 18:24

“The company supplies smart fire-control equipment for rifles that tracks a target to ensure a precise hit.”
Shame they aren’t creating tech that prevents snipers killing children and civilians raising white flags.

"To ensure a precise hit"... So, they're either using dangerously unreliable technology that's killing civilians, or they're using technology that's working as intended to kill civilians.

10UsernamesNotAvailableTryAnotherOne · 04/04/2024 23:10

That's terrible. And people still don't believe that Israel is testing weapons on the Palestinians.

CreditKarm · 04/04/2024 23:12

Terrorists. It's terrorism, plain and simple.

10UsernamesNotAvailableTryAnotherOne · 04/04/2024 23:16

CreditKarm · 04/04/2024 23:12

Terrorists. It's terrorism, plain and simple.

It sure is. Miko Peled said back in 2012 that the Israeli army is the best fed, best trained, best equipped terrorist organisation in the world. He's right. I don't understand how anyone can still support them at this point, but some people still do.

EasterIssland · 17/04/2024 22:58

This reply has been deleted

Message deleted by MNHQ.

statsfun · 18/04/2024 06:46

This reply has been deleted

Message deleted by MNHQ.

That blogger lost me at 'being a member of Hamas is not illegal or even wrong: Israel's occupation of Gaza is illegal under international law, and Hamas' resistance against the IDF is legal and moral.'

statsfun · 18/04/2024 07:14

And no, saying that Hamas' 'violence' on October 7th was illegal and immoral doesn't make that OK.

I'd stepped past the blogger's 'powerful Jews secretly running the world' tropes because Meta's CSO having been in the IDF unit which built the AI is relevant. The comments on Mark Zuckerberg and the COO less so, and verging on the nasty.

But supporting murderous terrorists crosses a boundary. Unfortunately I didn't stop reading quite in time to avoid denial of the October 7th atrocities.

I think the IDF's use of AI does warrant thinking about. Ethical use of AI is a really important topic just now, and wartime use should feed into the discussions.

But this guy is just contemptible.

Scirocco · 18/04/2024 08:01

Interesting potential use of WhatsApp data... WhatsApp is marketed as end-to-end encrypted and there have been multiple cases where access to data for things like criminal trials and missing persons investigations has been 'impossible' due to 'security'.

If 'being in a WhatsApp group with a suspected member of Hamas' is grounds for having your entire family killed, then that's an extremely broad net to cast. Especially as the current criteria for 'suspected membership' in the Gaza Strip appears to be 'male and not obviously in the IDF'.
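To put a rough number on how broad that net could be, here is a minimal simulation sketch. Every figure in it (population size, number of groups, group size, seed-suspect count) is invented purely for illustration; nothing here comes from the article.

```python
import random

random.seed(0)

POPULATION = 100_000   # illustrative population, not a real figure
GROUPS = 20_000        # hypothetical number of chat groups
GROUP_SIZE = 30        # assumed typical group size
SEED_SUSPECTS = 100    # people flagged by some prior list (assumed)

# Assign people to groups at random. Real social graphs are clustered,
# which would widen the net even further within connected communities.
members = [random.sample(range(POPULATION), GROUP_SIZE) for _ in range(GROUPS)]

suspects = set(random.sample(range(POPULATION), SEED_SUSPECTS))

# One hop: anyone who shares any group with any seed suspect gets flagged.
flagged = set()
for group in members:
    if suspects.intersection(group):
        flagged.update(group)

print(f"{len(flagged)} of {POPULATION:,} people flagged from {SEED_SUSPECTS} seeds")
```

Even with random (unclustered) membership, a tiny seed list expands by orders of magnitude after a single hop, which is the point being made above: group co-membership is an extremely weak signal of anything.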

Scirocco · 18/04/2024 08:04

The potential link between Meta and the development of AI tools used to target and kill individuals and their families should be concerning for anyone who uses Meta's products. By using these products, people are providing the company with vast amounts of personal data. Here is yet another example - and a particularly chilling example at that - of how that data may be used.

EasterIssland · 18/04/2024 08:07

Oh well, the link was deleted 🤷🏻‍♀️ Time will tell.

Scirocco · 18/04/2024 08:16

It's also relevant that when we consider the safety of our information online, we often don't think about the meta-data. While WhatsApp messages and similar have often been 'unavailable' for investigations, the FBI and other agencies have in some cases been able to get meta-data with warrants.

Is linking meta-data robust grounds for killing people...? If so, what else could it be considered robust grounds for...?

And this is one reason why information security is important.

Rocknrollstar · 18/04/2024 08:23

Do people realise that Hamas are still firing hundreds of rockets into Israel on a daily basis? Hamas is a terrorist organisation linked to Iran, Hezbollah and Isis. They want to eradicate the State of Israel and all Jews and won’t stop there. Think of what is happening in Northern Nigeria for example.

EasterIssland · 18/04/2024 08:29

Rocknrollstar · 18/04/2024 08:23

Do people realise that Hamas are still firing hundreds of rockets into Israel on a daily basis? Hamas is a terrorist organisation linked to Iran, Hezbollah and Isis. They want to eradicate the State of Israel and all Jews and won’t stop there. Think of what is happening in Northern Nigeria for example.

Do you realise that this thread is about how Israel is using AI and by doing so it’s killing innocent civilians ?

statsfun · 18/04/2024 09:25

AI is starting to be used increasingly widely - a result of there being so much more data, and of it now being possible to process huge data sets. It's so powerful in areas like law enforcement (including identifying terrorists) that it's going to change the world. The ethics are being taken very seriously internationally - but are still being figured out. Some stuff is obvious, but other ethical issues are only coming to light as it's starting to be used. This is all very new.

From the very few articles I've seen on the IDF AI system (really only a few sentences), it does seem to have been designed with some thought about the ethics. It seems it's intended to find possible suspects - and then hand over to a human to verify and make the decision. That's important, since AI is perfect for trawling through data and spotting patterns which humans would take too long to find, if they ever did. But it can miss something which is instantly obvious to a human.

Unfortunately, the articles suggested that the human verification step was treated as a tick-box exercise. If that's true, that's a huge problem - but it's a problem with how the system is being used, not with the AI itself.
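The reported figures make the stakes of that tick-box concrete. A back-of-envelope sketch: the ~37,000 marked individuals and ~10% error rate are the numbers reported by +972; the 50% chance that a cursory sex-only check catches a wrong flag is purely an assumption for illustration.

```python
# Back-of-envelope: how many wrongly flagged people survive a cursory check.
marked = 37_000          # individuals marked, per the +972 article
error_rate = 0.10        # reported share of wrong flags

false_positives = marked * error_rate

# Assumed for illustration: a quick sex-only check catches half the errors.
catch_rate_of_quick_check = 0.5
surviving_errors = false_positives * (1 - catch_rate_of_quick_check)

print(f"wrongly flagged: {false_positives:.0f}")
print(f"still targeted after the quick check: {surviving_errors:.0f}")
```

Under those assumptions, thousands of wrongly flagged people would pass straight through a rubber-stamp verification step, which is why the quality of the human check matters far more than the headline accuracy of the model.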

Personally, I'd like to know more about it. But I know that's unlikely given that it's being used by the military in a current war.

I'm really annoyed with myself for giving any oxygen to the conspiracy theory by saying that the CSO's links could be relevant without verifying.

Meta's Chief Information Security Officer is Guy Rosen. He's Israeli and did his military service in a cyber security unit, then worked for various startups, then started his own security company, then joined Facebook in 2013 when it bought the cyber security company he founded, then worked in product safety and integrity at Meta before becoming the CISO. He's not some shadowy IDF guy.

It's not particularly mysterious that someone who worked in the cyber security arm of the IDF during his military service then moved into cyber security as a career. It certainly doesn't mean he has anything to do with the Israeli military's recent developments in AI: he went straight into the commercial sector after his military service. In fact, given timings, that seems incredibly unlikely. And it's not very mysterious that Meta needs someone top of the field in Information security as their CISO.

Hinting otherwise is just conspiracy theory - with an antisemitic undercurrent.

statsfun · 18/04/2024 09:31

Not saying that people on here are hinting at it. The blogger did. It's insidious though: people read it and think there must be something in it.

I think it's important to call out that this kind of conspiracy theory is a really common antisemitic trope.