Features

November 1, 2007  

Who decides: Man or machine?

When the industrial revolution of the early 19th century threatened the centuries-old caste of English artisans by replacing man with machine, the artisans rose up in protest, allegedly led by a man named Ned Ludd. To protect their way of life, they attempted to destroy the machines in hopes of clinging to their past. Ever since, anyone who opposes technological advancement has been derided as a Luddite.

After reading an article in the Arizona Daily Star about future robotic warfare, such a title might well be lobbed at me — although for rather different reasons than those of my distinguished British predecessors.

The article celebrated the advent of technology that would soon see robots fighting wars for us, and even pondered what would happen when nations assemble robot armies. Although it brings to mind images of Qui-Gon Jinn and Obi-Wan Kenobi battling the droid army in “Star Wars,” the idea is anything but science fiction.

Already, numerous robots in Iraq and Afghanistan are showing utility in counter-IED efforts and can significantly enhance a soldier’s ability to clear a hostile building. Similarly, semi-robotic UAVs provide imagery on enemy movements, as well as fire precision-guided weapons at enemy targets. Without a doubt, technology is providing notable utility to the combat soldier.

However, the vision of warfare articulated by some defense experts could lead to a dangerous misunderstanding of the appropriate use and utility of technology in warfare. Globalsecurity.org director John Pike was quoted in the Arizona Daily Star as saying, “By the end of the century, there will be virtually no humans on the battlefield. … Robots do what you tell them, and they don’t have to be trained.” He emphasized that casualties wouldn’t be a problem because when a robot gets hit, you just take it to the repair shop.

Another expert, Robert Finkelstein, president of Robotic Technology Inc., was quoted as saying that in the future, “there can be a robot cognitively as good as humans” at fighting on the battlefield “somewhere between 2020 and 2030.”

Such statements are dangerous, because men disconnected from the realities of warfare may sway decision-makers on the structure and composition of future forces. Although we have made extraordinary advances in technology over the years, it is unlikely that any robotic or artificial intelligence could ever replicate the ability of a trained fighting man in combat.

Suggesting that within the next 12-plus years technology could exist that would permit life-and-death decisions to be made by algorithms is delusional. A machine cannot sense something is wrong and take action when no orders have been given. It doesn’t have intuition. It cannot operate within the commander’s intent and use initiative outside its programming. It doesn’t have compassion and cannot extend mercy. Although the proponents of robotic warfare tout the advantages of sending a machine to do the work of an American soldier, they disregard the problems created by a computer following its programming.

One of the uniquely positive attributes of the combat soldier is his humanity in a particularly inhumane environment. There are times when the circumstances of battle require pitiless brutality and the application of maximum violence. But there are other times, even while being shot at, when the best course of action is to hold fire.

There have been instances in virtually every war involving the U.S. in which the enemy was told that American soldiers would take no prisoners and kill everyone on the battlefield. Instead, the enemy discovered that although GIs could be as ruthless and vicious as any opponent, the same soldiers could extend mercy when appropriate. As information about U.S. soldiers’ humanity spread among enemy combatants, more of them willingly surrendered instead of choosing to continue the fight — which ultimately supports U.S. war aims and saves lives on both sides of the battle line.

MINDLESS KILLING MACHINE

A computer independently running a killing machine (robot) can only carry out its programming; it will kill if ordered to do so, even in a situation in which a well-trained soldier would recognize that the best thing to do is hold fire. Stories about mindless, merciless killing machines would likewise spread throughout the enemy camp, and that enemy would be much less likely to give up, instead choosing to fight to the death. Carried far enough, this could mean America is no longer respected as a powerful but ultimately life-loving nation and is instead regarded as an efficient, heartless killing machine to be feared and hated. That would not serve our national interest.

Additionally, the idea that robots can do our fighting masks the realities of war: unpredictability, violence, destruction, misery, suffering and death. What proponents of robotic warfare apparently don’t stop to recognize is that these robot armies will be sent to fight men, to kill, to destroy. When Pike boasts that in robot war “there is no condolence letter or funeral,” he exposes a lack of understanding about the nature of war. When these machines are employed, they will fight humans on a battlefield where flesh is ripped apart, civilians inevitably killed, and entire villages and towns laid to waste. To suggest that robots can be programmed to be “precision weapons” that can avoid collateral damage is self-delusion of the worst sort. There will be unintended consequences and, most assuredly, a great many funerals.

Perhaps a more significant problem is that this view of robotic warfare fails to consider the ability of potential enemy forces to leverage technology for their own purposes. If we become overly reliant on technology to do our fighting, we could lose our old-fashioned, human fighting skills. Throughout history, when a belligerent first developed a new weapon or technology, its introduction on the battlefield often resulted in a significant advantage for that nation. But other nations immediately began to study the technology and discovered ways to minimize the weapon’s effectiveness, followed closely by production of their own versions. The result was the loss of that initial advantage and a return of warfare to its historic roots: man vs. man.

One of America’s most celebrated wartime commanders, Gen. George Patton, was also one of our most forward-thinking technological and doctrinal innovators, a strong proponent of armored warfare long before most others in the U.S. But this forward thinker also thoroughly understood the nexus between technology and the nature of war.

“Wars may be fought with weapons,” he once said, “but they are won by men. It is the spirit of the men who follow and of the man who leads that gains the victory.” Many in the Army’s signature future force program, Future Combat Systems (FCS), recognize this immutable fact and are working hard to ensure that this balance is institutionalized in the Army’s future.

Col. Emmett Schaill, commander of the Army Experimental Task Force charged with testing and validating emerging FCS capabilities and doctrine at Fort Bliss, Texas, is a firm proponent of the centrality of the soldier to the Army’s future fighting ability.

“Anyone who thinks FCS is only about weapons and technology would be mistaken,” he said. “FCS does indeed incorporate leading-edge technology and exciting new capabilities. However, it is the American soldier and his ability to fight and lead that will continue to be decisive in the future.”

THE ‘DECIDE’ COMPONENT

FCS, when fully fielded, will include improved fighting vehicles and a significant increase in firepower, situational awareness and precision engagements. However, it will also include several variants of robots, some of which are designed to engage enemy forces in direct combat. It is critical that as FCS develops, these capabilities remain in the domain of enhancing the ability of the combat soldier as opposed to replacing him.

“The Armed Robotic Vehicle-Assault (Light) is a robot that extends the reach of a soldier but is still controlled by him,” Schaill said. “It is important that soldier-leaders remain fully in control of our combat actions so that the commander’s intent is fully accomplished; the ARV-A(L) is not designed to go on missions independent of the soldier.”

Col. Lee Fetterman, training and doctrine capabilities manager for FCS, said he sees potential for robots to significantly increase the Army’s ability to detect the enemy or target, deliver the ordnance necessary to destroy the target and assess the effects of the attack. However, in the design of the systems that will employ robots, Fetterman said he believes an important potential capability should not be employed: the “decide” component.

“The function that robots cannot perform for us — that is, the function we should not allow them to perform for us — is the decide function. Men should decide to kill other men, not machines,” he said. “This is a moral imperative that we ignore at great peril to our humanity. We would be morally bereft if we abrogate our responsibility to make the life-and-death decisions required on a battlefield as leaders and soldiers with human compassion and understanding. This is not something we would do. It is not in concert with the American spirit.”

Fetterman has a unique perspective on the use and utility of robotic warfare — not as a product of his position within the FCS program, but because of his combat experience demonstrating the utility and limitations of robots in war. The following is his description of two combat actions, one in Afghanistan and one in Iraq, that graphically illustrates the principles he described:

“I used a PackBot in Afghanistan. It was great. It allowed me to reduce soldier risk in caves, urban courtyards, and inside buildings. It reduced risk to civilians because I was able to observe the interior of human habitations before we entered. Thus, when there were women and children present, we did not enter in the dynamic fashion” — throw in a grenade and follow it after it explodes. “This was, needless to say, something we were keen to avoid doing whenever civilians were present, and the PackBot helped us sort that out when we had it.

“In Iraq, I did not have PackBot attached to my unit. It was still in the experimental stage at that time. We were about to enter a courtyard one day after having had a firefight in one dwelling. The area was totally enclosed, and we were on the outside of the walls. We had just shot two men who had engaged us with AK47s upon exiting this courtyard. In the momentary pause before entry, I could hear women and children wailing inside the courtyard. I could also see my lead squad leader preparing to take his men into that courtyard. He was an exceptional NCO, and I had no doubt that he and his men would perform the battle drills they had been trained to do flawlessly. That is to say, they would enter and clear the area of all resistance in accordance with our rules of engagement.

“I was, however, of the opinion at this point that we had no more combatants inside the courtyard. I had no evidence to support this belief, since I could not see inside the courtyard, but I was intuitively convinced there was risk to innocent life if I did not do something. Therefore, I told him to wait. I took the interpreter and entered the courtyard. There was a woman, a toddler and a baby inside the area. I have never been so happy that I followed my human instincts.”

However, there are those in the U.S. defense community who would like to see that focus change. Some proponents of robotic combat see only the promised benefits of employing robots in future war. However, a vision that is over-reliant on technology rests on shaky ground because it fails to adequately consider robotic limitations.

More importantly, insufficient attention has been paid to the actions and abilities of our future opponents and their ability to overcome the initial advantage we would gain through the use of robotics.

STEALING THE ADVANTAGE

In 1885, American inventor Hiram Maxim gave a demonstration to the British Army of his newest creation: the world’s first automatic, lightweight machine gun. The weapon was reckoned to match the firepower of 100 rifle-toting infantrymen and conferred a significant tactical advantage on England.

But five years later, half a dozen other European countries had adopted the weapon and begun development of improved versions of their own. By the outbreak of World War I in 1914, the gun’s advantage was lost and the capabilities had equalized on both sides of the line.

During World War II, the same process took place in regard to the technologies that made possible the submarine, radar, warplanes and tank warfare. Every development and discovery made by one side was quickly minimized or copied by the other, such that at the end of the war, the technological capabilities of all the major combatants were fairly equal. America’s nuclear attack on Japan gave an immediate, enormous strategic advantage to the U.S., but soon the USSR, the U.K., France and others joined the nuclear club, and the advantage was neutralized. As awesome as it is, the technology of FCS and the development of robotics will follow the same path.

Therefore, it is of paramount importance that as we aggressively pursue technology, we accord the human dimension at least an equal share of our focus. At the moment, the U.S. has a marked advantage over most of the rest of the world in technology, but already, even the decidedly un-modern Iraqi insurgency has proven quite adept at hiding from or deceiving our extraordinary satellite and UAV capabilities.

Other nations — with much more robust technological capabilities — are developing their own fleets of satellites and airborne sensors, and soon those abilities will be common among major powers. Therefore, it is critical that U.S. national leaders and decision-makers not fall prey to the seductive promise that technology and robotics will be all-seeing, all-knowing and always invincible.

We must wisely maximize our leverage of all present and emerging technology to enable our soldiers to fight and win in both the current and the future fight. But within that effort, leaders at every level must never forget that the violent and unpredictable nature of warfare has never changed and will never change, and that the ability of the combat soldier and his leader will remain the decisive factor in all future war. Although there will always be utility in developing new tools and weapons of war, the single most important factor — indeed, the indispensable component — is a well-trained combat soldier.

Maj. Daniel L. Davis is an Army cavalry officer who fought in Operation Desert Storm in 1991 and served in Afghanistan in 2005. He is the operations officer for TRADOC Capabilities Manager-Future Combat Systems at Fort Bliss, Texas.