
An investigation by +972 Magazine and Local Call reveals that the Israeli army has relied on an AI targeting system with little human oversight.
In 2021, a book titled "The Human-Machine Team" described an AI system designed to generate targets for military strikes. The investigation reveals that such a system, known as 'Lavender', has played a central role in Israel's bombing of Palestinians in Gaza.
The Lavender System
Lavender marks all suspected operatives in the military wings of Hamas and Palestinian Islamic Jihad (PIJ) as potential bombing targets. During the first weeks of the war, it marked as many as 37,000 Palestinians as suspected militants.
Human Oversight
The army gave officers sweeping approval to adopt Lavender's kill lists without thoroughly checking the machine's choices, a practice that led to significant civilian casualties.
The Gospel vs. Lavender
While The Gospel marks buildings and structures as targets, Lavender marks individual people, placing them on a kill list.
Civilian Casualties
Thousands of Palestinians, many of them women and children, were killed in Israeli airstrikes carried out on the basis of the AI program's decisions.
Conclusion
Lavender joins 'The Gospel' as the second AI system revealed in the Israeli military's targeting operations. The lack of human oversight over these systems has had devastating consequences for civilians in Gaza.



