In the Social Europe Journal, Zygmunt Bauman argues that military technology has shifted power and responsibility away from the axmen to the axes:
“By the start of the 21st century, military technology has managed to float and so “depersonalise” responsibility to an extent unimaginable in Orwell’s or Arendt’s time. “Smart”, “intelligent” missiles or the “drones” have taken over the decision-making and the selection of targets from both rank-and-file and the highest placed ranks of the military machine. I would suggest that the most seminal technological developments in recent years have not been sought and accomplished in the murderous powers of weapons, but in the area of “adiaphorization” of military killing (i.e., removing it from the category of acts subject to moral evaluation). As Günther Anders warned after Nagasaki but still well before Vietnam, Afghanistan or Iraq, “one wouldn’t gnash teeth when pressing a button… A key is a key”.[i] Whether the pressing of the key starts a kitchen ice-cream-making contraption, feeds current into an electricity network, or lets loose the Horsemen of the Apocalypse, makes no difference. “The gesture that will initiate the Apocalypse would not differ from any of the other gestures – and it will be performed, as all other identical gestures, by a similarly routine-guided and routine-bored operator”. “If something symbolizes the satanic nature of our situation, it is precisely that innocence of the gesture”[ii], Anders concludes: “the negligibility of the effort and thought needed to set off a cataclysm – any cataclysm, including the “globocide”…
What is new is the “drone”, aptly called “predator”, that took over the task of gathering and processing information. The electronic equipment of the drone excels in performing its task. But which task? Just like the manifest function of an axe is to enable the axman to execute the convict, the manifest function of the drone is to enable its operator to locate the object of the execution. But the drone that excels in that function and keeps flooding the operator with the tides of information he is unable to digest, let alone promptly and swiftly, “in real time”, to process – may be performing another, latent and unspoken-about function: that of exonerating the operator of the moral guilt that would haunt him were he fully and truly in charge of selecting the convicts for the execution; and, more importantly yet, reassuring the operator in advance that in case a mistake happens, it won’t be blamed on his immorality.
If “innocent people” are killed, it is a technical fault, not a moral failure or sin – and judging from the statute books most certainly not a crime. As Shanker and Richtel put it, “drone-based sensors have given rise to a new class of wired warriors who must filter the information sea. But sometimes they are drowning”. But is not the capacity to drown the operator’s mental (and so, obliquely but inevitably, moral) faculties included in the drone’s design? Is not drowning the operator the drone’s paramount function? When, last February, 23 Afghan wedding guests were killed, the button-pushing operators could blame it on the screens turned into “drool buckets”: they got lost just by staring into them. There were children among the bombs’ victims, but the operators “did not adequately focus on them amid the swirl of data” – “much like a cubicle worker who loses track of an important e-mail under the mounting pile”. Well, no one would accuse such a cubicle worker of moral failure…”
It is no surprise that the US Department of Defense has used video games as recruiting tools.