Kneecapping AI to maintain a bloated military bureaucracy

One of two prototypes purchased by the Office of the Secretary of Defense’s Strategic Capabilities Office for its Ghost Fleet Overlord program, aimed at fielding an autonomous surface ship capable of launching missiles. (U.S. Defense Department)

Military drones are popping up everywhere. In Afghanistan and Iraq, we became used to seeing Predator drones flying around with Hellfire missiles, piloted from bases in the United States and providing a near-24/7 watch for opportunities to blow up terrorists. The latest batch of drones is becoming increasingly autonomous, meaning they can not only think for themselves but also react faster than a human and respond to an ever-changing environment. Recently in the news, an artificial intelligence beat a top U.S. Air Force F-16 pilot, and before that the Navy discussed how its Sea Hunter would operate as an autonomous missile barge.

But I’m not here to talk about technology, not only because the details are classified, but also because any technological issues will solve themselves over time. Human engineers are pretty smart. If some piece of code doesn’t work, we’ll find a solution. Technology isn’t holding us back in the realm of military drones. People are, and unfortunately people will remain the real weakness, as this quote emphasizes:

“AI matters because using drones as ‘loyal wingmen’ is a key part of future air power developments,” said Teal Group analyst Richard Aboulafia via email. “It’s less important as a fighter pilot replacement.”

If we build an AI that is smarter, faster, and all-around better than top-notch fighter pilots, why on earth would we not replace pilots? The Army just raised the minimum service contract for pilots to 10 years, which in military human-resources speak means it can’t keep these people in. All the military services struggle to retain people with skills like flying, electronic warfare, cyber, and anything else that requires significant technical expertise. Using AI to fill these billets gives the military significantly more flexibility in where it sends its manpower. Those people could instead lead squadrons of drone aircraft or command armies of online bots in cyberspace. That will require more training and expertise, and certainly a culture change in how we view people in the military.

Besides being short-sighted about replacing people, the other weakness we are going to find with autonomous systems is that we do a terrible job of writing out our intentions. I worked with some highly skilled folks on the Navy’s autonomous sea systems, and one of the biggest challenges was turning what we call “Commander’s Intent” into code. If a vessel is out looking for an enemy, it’s easy to say “Kill this type of enemy when you see them.” It’s much harder to give instructions like “Taking the current geopolitical events into consideration, make a judgment call on whether to shoot down an adversary aircraft.”
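To make that gap concrete, here is a minimal, hypothetical sketch of the difference. It is not drawn from any real Navy system; the class, field, and function names are invented for illustration only. The point is that the first instruction reduces to a predicate over observable sensor data, while the second has no obvious inputs, weights, or thresholds to write down.

```python
from dataclasses import dataclass

@dataclass
class Contact:
    """A hypothetical sensor contact, reduced to the fields a simple rule needs."""
    platform_type: str   # e.g. "fast_attack_craft"
    is_hostile: bool     # assumes classification has already happened upstream

# Easy: "Kill this type of enemy when you see them" maps cleanly to a predicate.
def simple_engagement_rule(contact: Contact) -> bool:
    return contact.is_hostile and contact.platform_type == "fast_attack_craft"

# Hard: "Taking current geopolitical events into consideration, make a judgment
# call" has no defined inputs. Everything below is a placeholder question,
# not an answer.
def judgment_call(contact: Contact, geopolitical_context: dict) -> bool:
    # Which events matter? How are they scored? Who updates the weights, and
    # how fast? These are the pieces of "Commander's Intent" a commander
    # carries in their head and almost never writes down.
    raise NotImplementedError(
        "Commander's Intent was never specified precisely enough to code"
    )
```

The sketch isn’t about the code itself; it shows the shape of the problem. The first function consumes data a sensor can produce, while the second consumes whatever the commander happened to be thinking about that week, which is exactly what the next section argues we never bother to articulate.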

To put it bluntly, what does that even mean? The military throws around the idea of “Commander’s Intent” like it’s some sort of magic that springs forth from someone’s brain. In reality, it’s a lot of processing happening in the back of your mind, constantly taking in data from the world around you. The military benefits from having extraordinary people who stick around long enough to reach command. These extraordinary people find ways to take an ugly bureaucracy devoted to mediocrity and somehow make it work. As our military bureaucracy has grown, that has gotten more difficult. Extraordinary people are less likely to stick around to fight a bureaucracy devoted to maintaining the status quo, especially when business is happy to snap them up and pay them more. Autonomous systems give us a chance to drop much of that bureaucracy and focus on intent, strategy, and “end state,” or what we want the world to look like at the end. If we don’t embrace this change, we’re missing out on the truly revolutionary changes that autonomy gives us.

Future warfare is going to feature autonomous systems, and it’s going to highlight how weak human beings are in a variety of areas. Rather than fight this, the military should embrace autonomous systems as a chance to recapitalize manpower. It should also begin training its future commanders, flag and general officers, on how to actually write out their intent, and stop relying on chance to give us great commanders. We can’t let a military bureaucracy devoted to maintaining the status quo on manpower stifle the massive innovation that AI offers us.

This post represents the views of the author and not the views of the Department of Defense, Department of the Navy, or any other government agency.
