Lavender – AI Killing Program (also see AI in War)

Updated 2024-07-25

2024-04-27 Will AI Fight Wars Autonomously in the Future? Or Are We Already There?

Artificial intelligence is expected to affect every aspect of human life, but the application experts fear most is warfare, and they are not wrong. The US military has conducted the first dogfight between a human-controlled aircraft and an AI-controlled jet fighter.

Meanwhile, AI is already deciding who to target in real war zones such as Gaza and Ukraine. According to research by the Israeli publication +972 Magazine, Israel is reported to have deployed an AI algorithm called Lavender to identify potential targets, with minimal human involvement in the decisions, which is said to have resulted in a large number of civilian casualties. https://www.msn.com/en-us/news/world/will-ai-fight-wars-autonomously-in-the-future-or-are-we-already-there/ar-AA1nM0cE?

2024-04-09 Israel’s AI Targeting System Reflects the Inhumanity It Was Programmed With: The rise of AI belies the fantasy that we can escape moral culpability by assigning life-or-death decisions to a machine.

The war in Gaza between Israel and Hamas marked its grim six-month anniversary on Sunday, and one of the most jarring things about this very 21st-century conflict has been the almost daily headlines about Israeli airstrikes obliterating the homes of notable Palestinians—sometimes known Hamas operatives, but often journalists or physicians or aid workers. In many of these attacks, large numbers of family members, including young children, die under the rubble.

In one of the war’s most notorious incidents, the prominent Palestinian poet and professor Refaat Alareer—so haunted by the daily devastation and the likelihood that he and his own family would be targeted that in his final weeks he wrote a poem called “If I Must Die”—had sought refuge at a family home when an Israeli airstrike killed not only him but his brother, sister, and four children. It’s a similar story for journalists in Gaza, whose death toll—at least 90 Palestinians, according to the conservative tally of the Committee to Protect Journalists—has exceeded that of any other modern conflict, in just half a year.

The +972 Magazine report, based on interviews with six Israeli intelligence officers, said an AI program known as “Lavender” has been used by the Israel Defense Forces to identify targets in Gaza since the start of the war on October 7. The IDF has confirmed that AI is used by its intelligence officers to guide its tactics in Gaza, but the military and the magazine differed sharply on the question of human involvement. The IDF claims the computer-driven data is only advisory and that humans still make the key targeting decisions, but the +972 report said human reviews of the AI-selected targets were often “a rubber stamp” lasting as little as 20 seconds. https://www.commondreams.org/opinion/israel-lavender-ai-inhumane


Categorized Directory: News and Articles about the Israel-Palestine Conflict

Palestine and Israel


Specific Issues Index

from Creating Better World

