
An investigation by +972 Magazine and Local Call reveals that the Israeli army has relied on an AI targeting system with little human oversight.
In 2021, a book titled "The Human-Machine Team" was published, describing an AI system capable of generating targets for military strikes. That system, known as 'Lavender', has now been revealed to have played a central role in Israel's bombing of Palestinians in Gaza.
The Lavender System
Lavender marks all suspected operatives in the military wings of Hamas and Palestinian Islamic Jihad (PIJ) as potential bombing targets. During the first weeks of the war, it marked as many as 37,000 Palestinians as suspected militants.
Human Oversight
The army gave officers sweeping approval to adopt Lavender's kill lists without thoroughly checking the machine's choices, leading to significant civilian casualties.
The Gospel vs. Lavender
While The Gospel marks buildings as targets, Lavender marks individuals, putting them on a kill list.
Civilian Casualties
Thousands of Palestinians, many of them women and children, were killed in Israeli airstrikes as a result of the AI program's decisions.
Conclusion
Lavender joins another AI system, 'The Gospel', in the Israeli military's use of technology for targeting. The lack of human oversight has had devastating consequences for civilians in Gaza.
