That’s the crux of the research of Timothy Chung, an assistant professor in the Systems Engineering department at the Naval Postgraduate School in Monterey, California. For the most part, he and the Advanced Robotics Systems Engineering Lab (ARSENL) have been working on a way to construct a series of low-cost, lightweight autonomous flying vehicles known as Aerial Battle Bots that would give the US and its Western allies an advantage should a full-scale conflict involving UAVs happen.
The aspect of cost is especially important, seeing as how military drones cost on the order of several million dollars apiece. By supplementing reconnaissance and hunter-killer drones with dogfighting drones, the army and navy of the future will have a low-cost option for keeping their big-budget fliers safe. What’s more, it’s extremely important that the drones work in tandem, since it’s highly likely other nations will be developing similar swarms of drones in the future too.
With the help of a DARPA research grant, Chung and his associates have completed a small fleet of about a dozen drones. Each is essentially a commodity radio-controlled flying machine, called Unicorn, that has been retrofitted with an onboard computer and other gear in order to take its place in the larger group. He hopes that by this August, he and his team will have the vehicles flying and be able to start experimenting with getting them working together, as well as facing off!
In other news, questions relating to drone dogfights and the issue of autonomous drones were raised once again at the White House. Back around Thanksgiving, mounting concerns from the human rights community led Deputy Defense Secretary Ashton Carter to sign a series of instructions designed to ensure that human oversight would always be a factor where drone strikes and UAVs were concerned.
These concerns have since grown with the recent announcement that John Brennan, the White House’s counter-terrorism adviser and the man known as the “Drone Godfather”, was nominated to become the next head of the CIA. For years now, he has been the man in charge of US counter-terrorism efforts in Central Asia, many of which have involved the controversial use of Predator and Reaper strikes.
These concerns were voiced in a recent letter from Sen. Ron Wyden (D-Ore.), a member of the Senate intelligence committee. In it, he asked Brennan pointedly when and under what conditions the president would be able to target American citizens using drones:
“How much evidence does the President need to determine that a particular American can be lawfully killed? Does the President have to provide individual Americans with the opportunity to surrender before killing them?”
Naturally, the questions were quite specific when it came to the authorization of lethal force and when such authorization would be given to target people within the US’s borders. But there were also many questions that highlighted concerns over how this same process of authorization has played out in other countries, and how little oversight there has been.
In short, Wyden used the occasion to express “surprise and dismay” that the intelligence agencies haven’t provided the Senate intelligence committee with a complete list of countries in which they’ve killed people in the war on terrorism, a move which he says “reflects poorly on the Obama administration’s commitment to cooperation with congressional oversight.” And given the mounting criticism at home over the use of killer drones against unspecified targets in Afghanistan and Pakistan, not to mention the blowback happening overseas, he is not alone in thinking this.
Like it or not, it’s a new age where “unmanning” the front lines is having an effect, albeit not the desired one. At one time, the predominant thinking in military and intelligence communities was that by using automated aerial, land and sea vehicles, war could be fought cleanly, effectively, and without the loss of life – at least on OUR side. However, this thinking is coming under increasing scrutiny as it comes closer and closer to realization. And at the center of it all, the philosophical and existential questions are numerous and impossible to ignore.
For starters, war is and always will be a human endeavor. Just because you are not risking the lives of your own people doesn’t mean the fight is any more sanitary or bloodless. Second, the fact that none of your own citizens will be mourning the death of their loved ones doesn’t mean there won’t be mounting civilian opposition as conflicts go on. In a global community, people are able to witness and empathize with the plight of others. And finally, the increased use of machinery, be it autonomous or remote-controlled, will inevitably lead to fears of what will happen if that same technology were ever turned against its own people. No weapon is so safe and no government so trustworthy that people won’t fear the possibility of it being turned on them as well.