After the deadly US airstrike on an elementary school in Minab, Iran, which killed more than 160 people and which corroborating investigations date to February 28 as part of a US operation, Palantir's "Maven Smart System" has come under criticism. One analysis draws attention to outdated data and the so-called kill chain, at whose end the strike stands. Contrary to many initial reports, a chatbot such as Claude apparently played no decisive role in target selection, according to findings so far. The available analyses suggest the bombing points less to an isolated technical failure than to decisions along the entire process chain, from data maintenance to the end of the kill chain.

Evidence is mounting that targeting was based on the "Maven Smart System," a military platform for analyzing reconnaissance data that Palantir developed further. Maven fuses satellite imagery, sensor data, and intelligence information to identify potential targets and present them for strike approval in a heavily accelerated process.

According to media reports, the building that was hit was still listed as a military object in a US database, even though it had been used as an elementary school for years. This information was apparently never updated, with fatal consequences: Maven reportedly adopted the erroneous classification, which was then fed unchecked into the automated decision process.

Palantir's software is designed to massively accelerate the so-called kill chain. In military exercises it has produced thousands of targeting decisions in a short time. Critics warn, however, that this speed can come at the expense of human control and careful review.
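The failure mode described above, a stale facility classification flowing unchecked into an automated pipeline, can be illustrated with a toy sketch. This is not Maven's actual data model or logic; the record structure, field names, and the one-year threshold are all hypothetical, chosen only to show what a simple data-freshness gate would look like:

```python
from dataclasses import dataclass
from datetime import date


# Hypothetical record structure, purely for illustration: a facility
# whose classification can silently go stale if never re-verified.
@dataclass
class FacilityRecord:
    name: str
    classification: str   # e.g. "military" or "civilian"
    last_verified: date   # when the classification was last confirmed


def is_stale(record: FacilityRecord, today: date, max_age_days: int = 365) -> bool:
    """Freshness gate: flag records whose classification has not been
    re-verified within max_age_days, so they cannot pass into automated
    target nomination without human review."""
    return (today - record.last_verified).days > max_age_days


# A record last verified in 2013 would be flagged long before 2026.
school = FacilityRecord("Shajareh Tayyebeh", "military", date(2013, 1, 1))
assert is_stale(school, date(2026, 2, 28))
```

The point of the sketch is that such a gate is trivial to state in code; the reporting suggests the actual process chain had no equivalent check between the database entry and the strike decision.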

via heise: Alleged US strike on a school in Iran: Palantir system comes into focus

see also: Kill Chain: On the automated bureaucratic machinery that killed 175 children.

On the first morning of Operation Epic Fury, February 28, 2026, American forces struck the Shajareh Tayyebeh elementary school in Minab, in southern Iran, hitting the building at least two times during the morning session.[1] American forces killed between 175 and 180 people, most of them girls between the ages of seven and twelve.

Within days, the question that organized the coverage was whether Claude had selected the school as a target. Congress wrote to the Secretary of Defense about the extent of AI use in the strikes. The New Yorker asked whether Claude could be trusted to obey orders in combat, whether it might resort to blackmail as a self-preservation strategy, and whether the Pentagon's chief concern should be that the chatbot had a personality.[2] Almost none of this had any relationship to reality. The targeting for Operation Epic Fury ran on a system called Maven. Nobody was arguing about Maven.

Eight years ago, Maven was the most contested project in Silicon Valley. In 2018, more than four thousand Google employees signed a letter opposing the company's contract to build artificial intelligence for the Pentagon's targeting systems.[3] Workers organized a walkout. Engineers quit. And Google ultimately abandoned the contract. Palantir Technologies took it over and spent the next six years building it into a targeting infrastructure that fuses satellite imagery, signals intelligence, and sensor feeds into target packages and moves them from nomination to strike.

The building in Minab had been classified as a military facility in a Defense Intelligence Agency database that had not been updated since at least 2013, years after it had been walled off from the adjacent IRGC compound and converted into a school.[4] Maven processed that list. This is what the 2018 protesters were afraid of.
By the start of the Iran War, Maven had sunk into the plumbing; it had become part of the military's infrastructure, and the argument was all about Claude. This obsession with Claude is a kind of AI psychosis, though not of the kind we normally talk about, and it afflicts critics and opponents of the technology as fiercely as it does its boosters. You do not have to use a language model to let it organize your attention or distort your thinking.

(…)

Palantir's Maven Smart System is the latest iteration of this compression, and it grew out of a shift in strategic thinking during Obama's second term. In 2014, Secretary of Defense Chuck Hagel and his deputy, Robert Work, announced what they called the "Third Offset Strategy."[7] An "offset" in this line of thinking is essentially a bet that a technological advantage can compensate for a strategic weakness the country cannot fix directly.

The first two "offsets" addressed the same problem: the United States could not match the Soviet Union in conventional forces. The thinking was that the Red Army could just continue to throw personnel at a problem, as they did at Stalingrad, or, to be a little anachronistic, as the contemporary Russian Army did at Bakhmut and Avdiivka. Nuclear weapons, the first offset, made the personnel advantage irrelevant in the 1950s. When the Soviets reached nuclear parity in the 1970s, precision-guided munitions and stealth offered the promise that a smaller force could defeat a larger one.

By 2014, that advantage was eroding. China and Russia had spent two decades acquiring precision-guided munitions and building anti-access systems designed to neutralize the ones the US already had. Work insisted that the third offset was not about any particular technology but about operational and organizational constructs that would let the United States make decisions faster than China and Russia, overwhelming and disorienting the enemy by maintaining a faster operational tempo than they could match.[8]

[Screenshot: Palantir blog]
Categories: Services, Violence, Internet