identity… “we’re living through the end of humankind’s 5,000-year-old monopoly on fighting wars” (Frontline, A Robotics Revolution).
As the technology we use to fight wars has changed, so has our enemy. We have adapted, developing more sophisticated tools to fight an enemy that avoids direct confrontation. Since 9/11, national defense spending on weapons research has risen 74 percent, and a large portion of that budget is focused on developing robotics technology. The number of unmanned ground systems in the military went from zero in 2001 to five thousand in 2006 (Singer, 61). This leap can be attributed not only to the maturation of the technology, which made these systems cheaper to manufacture, but also to the need to meet an unprecedented new threat in fighting a war on terror. As one U.S. Navy researcher states, “Just as World War I accelerated automotive technology, the war on terrorism will accelerate the development of humanoid robot technology. The robot is our answer to the suicide bomber” (Singer, 62).
An adaptive enemy demands a reevaluation of tactics and creates an increased need for autonomy, driving change in how warfare is conducted. As AI expert Robert Epstein explains,
“…the military will want it [a robot] to be able to learn, react, et cetera, in order for it to do its mission well…But once you reach a space where it is really capable, how do you limit them?”
(Singer, 126). As robots become more autonomous, concerns arise as to whether these machines could eventually replace humans in orchestrating and conducting military operations.
Currently, there is a pervasive opinion that while increased autonomy has its advantages, humans must remain involved in making important decisions. But the balance between direct human involvement (“in the loop”), monitoring and authorizing lethal combat decisions (“on the loop”), and full autonomy will shift as technology advances (Cartwright). With an increased fleet, pilots will be expected to control multiple units at once, an extremely difficult task that reduces performance levels by an average of 50 percent (Singer, 126). Increased autonomy can pick up this slack. As former army colonel Thomas Adams describes, the machines “will be too fast, too small, too numerous, and will create an environment too complex for humans to direct” (128). It
becomes inevitable that the human role will diminish as technology advances, potentially becoming obsolete as autonomous robots surpass our ability to control them.
Increased use of robotic technology affects not only the role of the warrior but also the dynamics of soldier interaction. Unit cohesion, or the set of bonds soldiers share in the face of war, deteriorates with increased distance. As Marine general James Mattis writes, “You gain that trust and focus from living and breathing the operation together. With reachback operations, you lose that camaraderie” (335). Emotional and psychological bonds are difficult to establish through an artificial presence, and trust diminishes with increased distance. This extends to leadership roles in the military, as generals have a greater reach and more control over minutiae on the battlefield. This centralization of command progresses toward micromanagement, which shakes the structure of strategic command (348-352). Increased involvement, as well as increased autonomy, blurs the lines of responsibility. And there are psychological ramifications associated with these changes, as the altered experience of war creates a different reality of what it means to be a soldier.
While operators of autonomous machines are distanced from their comrades and enemies, the consequences of war remain very real. The role of Unmanned Aerial Vehicle (UAV) pilots is unique in that they live a dual lifestyle. Unlike pilots in the field, UAV pilots can leave the war environment at the end of the day and go home to their families. Despite not being in danger of physical harm, UAV pilots experience much of the same psychological and emotional stress as pilots immersed in the battlefield. Working the same hours as pilots in the war zone,
UAV pilots alternate 12-hour shifts seven days a week, and even wear flight uniforms, reinforcing the seriousness of the role they play in the battlefield. This dichotomy can be tiring, as their role is a 24/7 job. “There are no weekends or holidays, and the pace can be grueling… yet, while the war may be on, none of the pressures of the home front disappear” (346). Juggling the two realities of civilian and soldier life can take its toll. Colonel Michael Downs explains,
“Even when you are off, you’re out of sync with your family.” The grueling hours are coupled with the increasing pressures of the job. “I try to ensure that people understand there are people who are counting on us to do the mission… there’s severe both military and political consequences if we fail,” says Colonel William Brandt (Frontline, “Taking Out the Taliban: Home for Dinner”).
UAV pilots are counted on to save lives, and when things go wrong, the consequences can have lasting effects on those responsible. “You see Americans killed in front of your eyes and then you have to go to a PTA meeting,” explains Gary Fabricius, a UAV squadron commander (Singer, 347).
There are also the lasting effects of remotely attacking enemies. Predator drones, the most abundant UAVs presently used in military operations, have high-resolution cameras that provide intricate detail of their targets. Pilots observe these individuals simply living their day-to-day lives, sometimes for months, before being given an order to attack. “You watch it all the way to impact, and I mean it’s very vivid, it’s right there and personal. So it does stay in people’s minds for a long time,” describes Col. Albert K. Aimar (Associated Press). There are even cases of Predator drone pilots developing Post-Traumatic Stress Disorder. “These issues follow them home,” Aimar describes, “causing some family issues, some relationship issues” (Associated Press).
Many UAV pilots try to remind themselves of the necessity and importance of the mission to combat the psychological stress of war. “It does weigh on me,” says Air Force Captain Lamont Anderson, “but it’s not something I would question in the heat of battle” (Bowman). A drone pilot known only as “Captain Dan” justifies it by stating, “You’re saving people’s lives by deploying weapons. That’s the business that we’re in” (Frontline, “Taking Out the Taliban: Home for Dinner”). Although they remain highly involved in military operations, these pilots are thousands of miles away from the action. As physical presence decreases, so does the humanity of war.
Removal from the battlefield creates an absence of cost and risk that is central to war.
This absence is deep-rooted, bound to the very nature of controlling a UAV. Many of the controllers are designed with a younger generation’s preconceptions in mind. P.W. Singer describes, “…the military is free-riding off of the video game industry… the controllers have already been designed to fit your hands perfectly… and we have this generation that has already been trained up in their use” (Frontline, “Drone Pilots”). The danger lies in the association of war with the video games that these young pilots are so familiar with. Singer quotes Jeff Macgregor, noting the capacity of these games to “…eliminate the magnificent inconsistencies of the human heart and its capacity for courage or cowardice, and the game, the war, is no more than a fast twitch exercise – a battle fought without personal cost” (Singer, 367). This dissociation removes the ideals of war that have made such a reprehensible activity acceptable for thousands of years. It is here that we encounter what Yale Law School professor Paul Kahn describes as the “paradox of riskless warfare” (432). Historically, war has involved mutual risks and costs for both sides. But with autonomous robots fighting the wars and doing the killing, it is difficult to justify inflicting mayhem and destruction on your fellow man. This holds especially true when the enemy does not share the same technological advantages, or the same mentality of war and the meaning of sacrifice.
The combatants we face in our present day military endeavors have a unique understanding of war, and this shapes how they view our use of robotics. Fundamentally, the motivations for war are dissimilar. As Singer writes, “One side looks at war instrumentally, as a means to an end, while the other sees it metaphysically, placing great meaning on the very act of dying for a cause” (312). It is hoped that this viewpoint will aid our cause, as the psychological effect of unmanned weapons could prove to be demoralizing to the enemy. Singer offers a
Washington Times report noting that “while soldiers will fight against the enemy if they have a chance to kill the attacker even against all odds, being killed by a remote-controlled machine is dispiriting” (298). But this is not always the case. The use of machines could bring unintended consequences, portraying the U.S. as a weak and cowardly country that refuses to fight its enemies face to face. The perception of a robot army has the potential to discredit the U.S. as the beacon of freedom for the world. “If the U.S. doesn’t handle robotics right, it will undermine our
moral standing, and the U.S. can’t be a global leader without such standing,” says former assistant secretary of defense Larry Korb. “The age of robotic warfare has the potential to rob us of our ‘humanity,’ which vastly complicates the prerequisite of building the structures of peace” (309). In order to successfully utilize and legitimize robots on the battlefield, we must empathize with the ideals and culture of our enemy, and understand the consequences of increased autonomy in robotic warfare.
The increasing presence and impact of unmanned robots on the battlefield demands a reconsideration of international law. The International Committee of the Red Cross (ICRC), the only authority approved by the international community to establish and protect international humanitarian law, has yet to take a position on robotic warfare. The organization refers to the topic as irrelevant and “too futuristic” to spend time on. “The ICRC position on robotics, or rather absence of a position, is simply representative of the brewing breakdown between the laws of war and the reality of conflict in the twenty-first century” (386). Thus, the engineers designing these new robotic systems, such as the Predator drone, have no rules or principles to guide them.
“The most disturbing part,” says a drone squadron officer, “is that the needed laws and values typically can’t keep up with such rapid change” (386). At this point, there are no laws amongst the international community that prohibit artificial intelligence from making life-and-death decisions. The legality of technology has always come far behind its implementation. Unfortunately, it usually takes some sort of catastrophe to spring the international community into action. This delay in judgment could have disastrous consequences when dealing with sophisticated, highly armed, autonomous robots. As an Army War College professor describes, “There is no consensus yet on anything new and, unfortunately, I don’t think we are due for a breakthrough until something terribly bad happens” (387).
Further questions arise as the notion of accountability is complicated by the use of UAVs. It can be unclear at times which chain of command a drone pilot in Nevada follows when conducting operations in Afghanistan. Returning to the idea of the micromanaging general, confusion in accountability can arise when both the local commander in Nevada and the regional commander in Afghanistan want to manage the squadron. The potential for war crimes at the hands of an autonomous machine poses an even more complicated dilemma. If an unmanned robot with advanced decision-making AI causes civilian casualties, there is confusion as to who takes the blame. The developing firm, the engineers, the operator, and the commander all share responsibility.
With the continued increase in the autonomy, capability, and lethality of machines, concerns are arising over the dangers of increased robotics. Fears of the dehumanizing aspect of the technology, the possibility of it falling into the wrong hands, and even the potential for a robotic revolt have led some scientists to say the best resolution is to stop the research altogether. As Bill Joy argues, “the only realistic alternative I see is relinquishment: to limit development of the technologies that are too dangerous, by limiting our pursuit of certain kinds of knowledge” (420). But are such precautions even possible? Even if the government supports such relinquishment, many argue that development would continue underground. There are simply too many motivating factors, economic and otherwise, for both military and private development. If that is the case, it is imperative to establish a code of ethics for not only the machines but the people behind them. “Too often, when issues of robot ethics are raised, it comes across as science fiction. Indeed this is perhaps why so many roboticists avoid talking about the issue altogether. That is an ethical shame” (427).
Robotics stands to be one of the greatest technological revolutions in human history, and yet we seem completely unprepared for it. Although every technological breakthrough since the wheel has required a transitional period for mankind to adjust, the interconnectivity and speed at which the world operates today raise the stakes to an unprecedented level. “Even the most ardent supporters of robotics warn that we have to ‘get it right’ the first time as there is little room for recoverable errors. The speed of the technology means we have less time to react and adjust to these changes” (434). This increases the need to discuss the ethics and ramifications of integrating robots into not only the military but our daily lives. The decisions scientists and military officials make today will affect the world for decades to come. We are putting into motion possibly irreversible systems, the consequences of which remain predominantly undiscussed. We have a responsibility as scientists and engineers to evaluate the ethical implications of our work on a daily basis, as the systems we design have the potential to change society at large.