Ukraine is About to Get Hivemind AI to Power Drone Swarms
There are four components to Hivemind, and they're worth understanding because Ukraine is getting a complete doctrinal shift in a box
Let’s talk about Shield AI, because outside the defense tech bubble, most people have never heard of them, and that’s a mistake worth correcting.
Shield AI builds AI that operates in GPS-denied, communications-denied, fully contested environments. The kinds of places where traditional drones go deaf and blind and then augur into the ground at terminal velocity, which, coincidentally, is also how my first sergeant described most of my career decisions.
Their flagship product is called Hivemind. And Ukraine’s about to get it.
Ukraine figured out early in this war that cheap FPV drones were devastating. A $500 first-person-view drone can achieve comparable tactical effect to a $78,000 Javelin missile.
Ukraine is now producing 4.5 million drones per year. The math is genuinely staggering. So is the fact that we spent thirty years and several trillion dollars building a military-industrial complex, and a Ukrainian farmer with a DJI and a 3D printer is out here eating Russia’s lunch.
But here’s the catch: every one of those drones needs an operator.
One drone, one person, staring at a screen, hands on the controls.
To field a million drones simultaneously, you’d need a million people.
Hivemind’s answer to that math is to change the ratio entirely.
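The ratio math is worth a quick sanity check. A minimal sketch, where the 1:10 and 1:100 spans of control are hypothetical illustrations rather than Shield AI figures:

```python
# Operators required to field 1,000,000 drones at different spans of
# control. The 1:1 row is today's FPV reality; 1:10 and 1:100 are
# assumed autonomy-enabled ratios, not numbers from Shield AI.
drones = 1_000_000
for span in (1, 10, 100):
    print(f"{span:>3} drones per operator -> {drones // span:>9,} operators")
```

One autonomy-driven order of magnitude in span of control takes you from a million operators to a hundred thousand; two takes you to ten thousand. That is the entire recruiting problem, deleted.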
But Shield AI isn’t without baggage. In April 2024, a V-BAT drone tipped over during a US Navy demonstration and partially severed a service member’s fingers in the propeller.
That’s not a PR hiccup, that’s a serious injury, and it turned out to be a symptom of something deeper. Internal reports indicated that executives had been brushing off propeller safety concerns for years.
In 2023, an employee’s shirt got caught and shredded by a propeller during a test flight, which is the kind of incident that should generate exactly one response from leadership, and apparently didn’t generate the right one.
Employees alleged their safety complaints were ignored, some said they were fired for raising them, and in July 2025, a whistleblower lawsuit landed alleging wrongful termination and violations of whistleblower protection laws.
The company also cycled through leadership in 2025, bringing in a new CEO named Gary Steele amid reported sales slumps.
None of that means Hivemind doesn’t work.
It means you should know who you’re buying it from, and Ukraine, operating on a battlefield where a malfunctioning drone doesn’t just embarrass you at a Navy demo but kills somebody, has every reason to go in with eyes open.
What Hivemind Actually Does
Traditional strike drone autopilots follow pre-planned routes. You program the waypoints, the drone flies them.
If something unexpected happens, like a jammer cuts the signal or a building appears where the map said open ground, the autopilot fails, returns to base, or becomes a very expensive lawn dart.
This is the drone equivalent of a soldier who can only operate according to the exact instructions in his OPORD and completely loses it the moment the enemy doesn’t cooperate. Every unit has one. Veterans know exactly who I’m talking about.
Hivemind can change course, avoid no-fly zones, navigate obstacles, respond to unforeseen conditions, and carry out missions without human involvement. No GPS required. No constant communication with the base required.
The intelligence lives on the aircraft.
Russia’s most effective counter to Ukrainian drones has been electronic warfare, jamming the signals that operators use to fly them. Hivemind-enabled drones don’t need those signals. You can’t jam a brain.
You can try, but that’s essentially Russia’s current strategy and it’s going about as well as everything else in this war has gone for them.
The concept works like this: a group of drones can interact with each other, distribute tasks, and act as a single swarm without any direct human control.
One scout drone charts the path. Bomb-carrying drones decide among themselves which will strike, and when. The human who launched the mission defined the objective. The swarm figures out the execution.
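Shield AI hasn’t published Hivemind’s allocation logic, but the “decide among themselves” step is a classic decentralized task-assignment problem. Here’s a toy greedy-auction sketch, with invented drone and target names, just to show the shape of it: every drone can compute the same answer from shared position broadcasts, so no controller is needed.

```python
# Toy sketch of decentralized strike assignment. NOT Shield AI's
# algorithm; drone/target names are invented for illustration.
# Each striker "bids" its distance to each target; the cheapest
# (drone, target) pair wins, one target per drone, repeat.

from math import dist

def assign_strikes(strikers, targets):
    """Greedy auction: repeatedly award the cheapest (drone, target) pair."""
    assignments = {}
    free_drones = dict(strikers)    # id -> (x, y) position
    open_targets = dict(targets)    # id -> (x, y) position
    while free_drones and open_targets:
        drone_id, target_id = min(
            ((d, t) for d in free_drones for t in open_targets),
            key=lambda pair: dist(free_drones[pair[0]], open_targets[pair[1]]),
        )
        assignments[drone_id] = target_id
        del free_drones[drone_id]
        del open_targets[target_id]
    return assignments

strikers = {"fpv-1": (0, 0), "fpv-2": (5, 5), "fpv-3": (9, 0)}
targets = {"tank": (1, 1), "radar": (8, 1)}
print(assign_strikes(strikers, targets))
# → {'fpv-1': 'tank', 'fpv-3': 'radar'}  (fpv-2 holds in reserve)
```

The real thing has to handle drones dying mid-auction, jammed peers, and moving targets, but the principle is the same: the assignment emerges from local bids, not from a human pushing icons around a screen.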
Ukraine’s already been doing a version of this with its domestic Swarmer system. A Ukrainian officer told journalists his unit used Swarmer more than a hundred times in the past year, often with three drones, but with tests going as high as twenty-five.
I have a separate article coming on Swarmer. Stay tuned.
Hivemind is that same concept, but orders of magnitude larger, tested on everything from reconnaissance drones to fighter aircraft.
I want to pause on that. Hivemind Pilot has been tested on the X-62A VISTA, which is a modified F-16 Viper variant. This is the AI that American engineers trust with a fifty-million-dollar airframe.
The video above shows the X-62A VISTA. Remember, this jet is flying itself.
That AI is now being integrated into Ukrainian combat drones. Somewhere, a very proud fighter pilot just read that sentence and poured one out for his profession.
The Stack
There are four components to Hivemind.
The first is Hivemind Pilot, the AI that directly controls the aircraft. It’s already deployed on the V-BAT reconnaissance drone and the MQ-20 Avenger.
The V-BAT isn’t new to Ukraine. Shield AI says it’s already there, has completed more than 130 combat missions, and is now getting a significant capability upgrade.
The Shield AI V-BAT is already flying combat missions in Ukraine. In this video you can see DoD contractors playing with it.
The second is EdgeOS, the operating environment that runs directly on the drone’s onboard hardware.
This is the piece that lets Hivemind function without a cloud connection. No phoning home, no infrastructure that Russia can target, no dependency on anything external.
The intelligence lives on the aircraft.
This is the tech equivalent of a special operations team that carries everything it needs, speaks no radio, and completes the mission without ever checking in with higher. Every commander says they want that team.
The third is Commander, which shifts the human role from pilot to commanding officer. You’re not flying the drone anymore. You’re commanding a unit that happens to be made of drones.
You define the objective. The swarm handles execution.
That’s a fundamental rewrite of how humans relate to autonomous weapons. The closest analog I can think of is the transition from individual marksmen to machine gun teams in World War One.
The people who understood what had just changed adapted. The ones who didn’t kept ordering cavalry charges.
The fourth is Forge, the development environment where new AI behaviors get built and tested. It’s the piece that allows the system to keep learning as the threat environment evolves.
Think of it as the drone swarm’s ongoing professional military education program, except it actually works and nobody falls asleep in the back row.
Hivemind is also being integrated into the Destinus Ruta miniature cruise missile and the Hornet air defense interceptor drone, with combined testing planned for later this year.

A strike missile and an interceptor, both running Hivemind. That’s both ends of the kill chain getting the same AI brain simultaneously. Offense and defense, unified under one autonomous architecture.
The Line That Nobody Wants to Name
Here’s where it gets ethically complicated.
The fastest combat engagements of 2026 unfold faster than humans can react. A drone swarm closing on a military installation at 200 kilometers per hour presents engagement windows measured in seconds.
An authorization chain that takes thirty seconds isn’t a human in the loop. It’s a human arriving after the loop has already closed, filling out the after-action report on a decision the machines already made.
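You can do that math on a napkin. The 200 km/h closing speed and the thirty-second authorization chain come from the paragraphs above; the one-kilometer detection range is my illustrative assumption:

```python
# Back-of-the-envelope engagement window. 200 km/h and the 30 s
# approval chain come from the text; the 1 km detection range is
# an illustrative assumption, not a published sensor figure.

speed_ms = 200 * 1000 / 3600               # 200 km/h ≈ 55.6 m/s
detection_range_m = 1000                   # assumed low-altitude pickup
window_s = detection_range_m / speed_ms    # ≈ 18 s from detection to impact

authorization_s = 30                       # the human approval chain
print(f"window {window_s:.0f} s vs. approval {authorization_s} s")
# The swarm arrives roughly twelve seconds before the approval does.
```

At these numbers the engagement is over before the human finishes deciding, which is the entire argument in one subtraction.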
Richard Drake, head of Anduril’s European branch, acknowledged that advanced autonomous kill chains can operate fully autonomously, but for now, “there is a human in the loop making those final decisions.” Government rules require it.
For now is doing a lot of work in that sentence.
Estonia’s defense officials say they insist that human control is maintained over lethal force decisions. Most NATO countries say the same thing.
Current US and allied policy supposedly requires a human to approve strikes. This is the official position. It is also the position of a person standing at the edge of a pool in a rainstorm insisting they’re not getting wet because they haven’t jumped in yet.
Meanwhile, Ukrainian developers are already testing drones with onboard AI that can lock onto a designated target and autonomously fly the terminal attack leg when data links are jammed.
Once cued, these systems pursue and strike without continuous human control. Russian technical experts have acknowledged that autonomous flying robots that determine their own targets are already being used in combat.
Not next year. Now.
Both sides are already past the line that Western doctrine says shouldn’t be crossed. The international debate is trying to regulate a threshold the battlefield crossed without waiting for permission.
UN Special Rapporteur Morris Tidball-Binz put it plainly: “The international community is crossing a threshold which may be difficult, if not impossible, to reverse later.”
He’s not wrong. The Shield AI partnership with Ukraine’s Brave1 defense accelerator is one more step across it.
The lawyers are still drafting the ethical framework. The drones already filed the flight plan.
Here’s my honest read on what this means strategically, and where I think this is all going.
Ukraine has the most battle-tested drone operators on earth.
It has the most extensive real-world data on autonomous systems operating under jamming, GPS denial, and fully contested airspace.
It has a domestic defense industry iterating on drone designs weekly based on live combat feedback. Their sprint cycles are measured in days because the alternative is losing ground.
And now Ukraine has an AI autonomy system purpose-built to operate in exactly the conditions Russian electronic warfare is designed to create.
A Hivemind-enabled drone mesh can send an ISR drone forward, followed by a swarm of simpler strike drones that find and attack using visual navigation. You can send ten drones without needing ten skilled operators to fly them.
One commander. Many autonomous drones. One mission.
The manpower equation flips. Russia’s electronic warfare budget can’t jam a swarm that doesn’t need a signal. You can throw every ruble you have at the problem and the swarm just shrugs and keeps flying, which is more than I can say for most of the units Russia has committed to this war.
Ukraine’s defense ministry says AI tools are already processing tens of thousands of frontline video feeds each month to identify, geolocate, and prioritize targets.
That’s not a future capability. That’s Tuesday. Hivemind is the next layer of that stack, not replacing human intelligence, but removing the human bottleneck from the execution phase.
Palantir’s CEO Alex Karp said he “would have slowed down AI development if not for China and other adversaries racing ahead.”
That pressure, that logic, that reluctant acceleration is what’s driving all of this.
Nobody in this race wanted to be here. Everyone is here anyway. The physics of the threat are pulling everyone across lines that took decades of doctrine to draw, and the doctrine is losing.
The side that solves autonomous swarm warfare first redefines what warfare costs: In manpower. In training time. In operator exposure. In the political willingness to sustain a fight when you can credibly tell the public that the machines are doing the dying.
That last one is the one that keeps strategists up at night, because a war that costs fewer human lives on your side is also a war that’s much easier to keep fighting.
The moral calculus on autonomous weapons gets very complicated very fast when the casualty notifications stop coming.
Ukraine is figuring this out faster than anyone, under fire, with real stakes, against a real adversary adapting in real time.
And now it’s doing it with Hivemind.
Russia should find that unsettling because Ukraine is getting better at using it every single day, and the best countermeasure Russia had just became significantly less effective.
The drone war didn’t peak in 2024. We’re not even in the middle of it yet. And the next phase isn’t going to wait for anyone’s doctrine, or ethical framework, to catch up.
Слава Україні! Glory to Ukraine!