15 Comments
Craig Ewing:

A most compelling analysis, Wes, were the actual consequences not so tragic. Instead, we have your excellent insights into the future of war when no one gives a damn about morality, about the innocents, about the future after the war is over. What an effing mess. And the blood is on our hands.

Michael Smit-Drury:

Incredibly troubling. Even more so that the systems are being reinforced to shoot first and ask questions later, in every respect and in every new action. And the "automation" is what creates the accountability gap. It's like a driverless car that learns busy sidewalks are an acceptable shortcut.

And the idiots deploying this tech only see it as a way to spend more budget, thinking it’s somehow a unique competency, when all of this tech and munitions is becoming available to any half-baked warmonger.

Does nuclear proliferation even matter when an automated airforce of 100,000 cheap drones could destroy an entire city?

Graham Nolan:

I have been using ChatGPT at work for the last few weeks, almost as a data modelling assistant. It is very good at coming up with ideas and filling gaps in my knowledge.

But I regularly notice, at the bottom of each conversation, the warning that 'ChatGPT can make mistakes. Check important info...'. No doubt Claude has something similar. Excellent advice, for AI-enhanced and human-led data analysis and decision-making alike. You have to wonder whether this consideration was sidelined in the need for speed.

Going from 2000 analysts to 20 (or some similarly-considerable reduction) makes me wonder whether the budget reduction played a part as well.

Steve Macca:

Fantastically revealing article. Although indirect, this is a case of AI misdirecting lethal power and killing 168 children.

Tina Johnson:

But Wes said it wasn’t a technology failure. I’m confused.

Steve Macca:

Directly it's not, but it's how AI is being used. Like Wes said, with the old system someone out of those 2000 analysts might have noticed there was a soccer field next to the target building and flagged it for review. But the visual AI system from Palantir obviously didn't have training data to note these things, so it completely missed it. And it will miss it again unless someone adds this to the training data. It means that Maven and other AI systems are going to learn on the fly, and that means a lot more civilian casualties that wouldn't have happened had humans been involved front to back.

Tina Johnson:

Steve! Your explanation was perfect, thank you. Cleared things right up for me and provided enough detail to help me get a better idea of the whole AI situation. I appreciate your time.

John Schwarzkopf:

It all comes down to GIGO. Garbage In Garbage Out. Artificial Stupidity can't tell the difference between accurate information and garbage. This won't be the last incident of this kind.

Brian Rosen:

As you have clearly laid out the problem for all to see, perhaps citizens will contact the responsible representatives who will listen… or not.

Porter:

Probably not.

JG:

Excellent detail Wes. The Machine might not have pulled the trigger, but it was definitely part of the problem. AI is not mature enough to handle this level of critical decision making. 'Human in the loop' needs to be more than Accept/Refuse 👍

USIBARIS:

The process / decision-tree algorithm has to be adjusted/improved/...

Essentially, it was human error or negligence or hubris or plain stupidity that caused this tragedy.

Hakan Arvidsson:

One moral question, Wes. At the moment, officially, the US military does not know what went wrong with the missiles that killed a lot of innocent people, mainly children.

If they do not know what went wrong, is it not morally wrong, even a war crime, to continue sending out new missiles that might have the same disastrous consequences?

Hakan Arvidsson:

A terribly sad story of how the USA committed a huge war crime with its missiles, killing all these innocent people, mainly children. But many thanks for your enlightening explanations of why it happened and of the frightening speed at which uncontrolled automation/AI is taking over now.

Robot Bender:

We need to take into consideration that Whiskey Pete said there would be "no more stupid rules of engagement" and "no politically correct wars." Given our current "leadership," I have to wonder if they even care that they killed a bunch of kids and teachers. It also makes me wonder whether they will simply ignore the Geneva Conventions.