An estimated 70% of buildings in Gaza have been either damaged or completely destroyed. The majority of the population has been displaced from their homes. Civilian casualty estimates range from 40,000 to 186,000 (according to earlier projections from The Lancet medical journal), though accurate counting is difficult since most hospitals are no longer operational and it's hard to retrieve bodies from under rubble. Polio and various other diseases are resurging among the civilian population.
Given that more bombs have been dropped on Gaza than were used in the World War II bombings of Dresden, Hamburg, and London combined, claims about AI-enabled precision targeting capabilities seem difficult to reconcile with the scale of destruction.
This article seems more like marketing for Israeli "defense" tech. And the AI here is really just a "get out of jail free" card for war crimes.
> In 2014, the IDF’s acceptable civilian casualty ratio was one civilian for a high-level terrorist, said Tal Mimran, a former legal adviser to the IDF. In the Gaza war, the number has grown to about 15 civilians for one low-level Hamas member and “exponentially higher” for mid- and high-level members, according to the Israeli human rights organization Breaking the Silence, citing numerous testimonies from IDF soldiers. The New York Times reported the number as 20 earlier this week.
This is key new information (to me; maybe it’s old news to people following the situation closely) that’s more horrifying than the AI angle. And given the liberal criteria for classifying someone as a combatant, and the probably undercounted civilian casualties, the real ratio has gotta be substantially higher than the already appalling official threshold.
The navel-gazing about AI is a stupid angle. It sounds like a typical corporate tale of the untouchable new boss who has decided that the software he brought in is infallible, “emperor has no clothes” style.
The real story is that they set a much higher acceptable ratio of collateral deaths and changed the internal processes to make it impossible to escalate problems, to the point that they allowed Israel to be attacked because nobody could report what they saw up the chain of command.
End of the day, if I’m allowed to blow up a crowd to kill one dude, more people are gonna die. The AI angle is a volume knob.
Unasked questions are:
Who exactly demanded the higher target count?
Did the flooding of the combat arms teams with targets allow important targets to escape?
Did this program advance the goals of executing a war? Or did it cause a needless war? Or did it primarily serve a political purpose?
this is important to think about, but more important is to never, ever let this sort of nonsense stop you from blaming a human.
if people outsource "target detection" to some software, they personally are responsible for what it does.
if people outsource the actual murdering to some software, they personally are responsible for those deaths.
a human made the decision to permit this and the entire consequences of it should be on their head. we can never ever let the people who wrote and/or approved the software escape blame for what it does.
we're already seeing this with self-driving cars - which individual at uber should be serving a sentence for their cars killing someone? or did we accidentally make murder completely legal as long as you obfuscate the cause chain enough?
Israel used ML to generate targets for its genocide in Gaza.
Fixed the title, given that it's so "off the mark", to use the IDF's words. Let's break it down.
AI to generate targets:
> To maintain the war’s breakneck pace, the IDF turned to an elaborate artificial intelligence [...] which could quickly generate hundreds of additional targets.
Flawed, since any engineer who has worked with these models knows how inaccurate their predictions can be. Imagine killing people based on those predictions.
> Reviewing reams of data from intercepted communications, satellite footage, and social networks, the algorithms spit out the coordinates of tunnels, rockets, and other military targets. Recommendations that survive vetting by an intelligence analyst are placed in the target bank by a senior officer.

> Another machine learning tool, called Lavender, uses a percentage score to predict how likely a Palestinian is to be a member of a militant group, allowing the IDF to quickly generate a large volume of potential human targets. Other algorithmic programs have names like Alchemist, Depth of Wisdom, Hunter and Flow, the latter of which allows soldiers to query various datasets and is previously unreported.
Genocide since:
> Genocide charges against Israel brought to The Hague by South Africa question whether crucial decisions about bombing targets in Gaza were made by software, an investigation that could hasten a global debate about the role of AI technology in warfare.
Let's put aside the war crimes stuff for a second and think about this as hackers.
Israel's military actions are so awful that they overshadow the instruments of surveillance and warfare that Israel's tech and defense industries have been quietly building.
Israel has a captive population at its borders that it is using to prototype tools to identify and track individuals, map their familial/professional/social networks, break open their communications and strip away any privacy of speech, and remotely execute (via drone) anyone who ends up marked for termination. All at stunning scale, if not necessarily accuracy. Some of this is mentioned in the WaPo article, some elsewhere (consider coverage of persistent surveillance tech deployed in the West Bank as well as Gaza).
This technology is not staying in Israel. The NSO Group's software is probably used by law enforcement near you to break into iPhones. Elbit Systems has been contracted by DHS for surveillance of the US-Mexico border, using camera and drone systems deployed in Gaza and Lebanon over the past decade.
The question you have to ask is whether you're ready to give up your last vestiges of privacy of movement or communication. Are you ready for the chilling effect that comes from that? Are you ready for what a government that has this technology, and the companies and capital that guide it, will ask of you?