Models grew to astonishing levels of complexity, fuelled by the desire to create an accurate simulation of conflict, a scientific understanding of a quite literal war machine. The father of systems analysis, RAND researcher Ed Paxson, was symptomatic of this tendency, as the obsessive minutiae of his planning for World War III attest:
His dream was to quantify every single factor of a strategic bombing campaign – the cost, weight, and payload of each bomber, its distance from the target, how it should fly in formation with other bombers and their fighting escorts, their exact routing patterns, the refueling procedures, the rate of attrition, the probability that something might go wrong in each step along the way, the weight and inaccuracy of the bomb, the vulnerability of the target, the bomb’s ‘kill probability,’ the routing of the planes back to their bases, the fuel consumed, and all extraneous phenomena such as the weather – and put them all into a single mathematical equation.44
Planning for nuclear war was a particularly urgent task during the Cold War and required continuous reviewing as the technology and availability of bombs and missiles were subject to rapid change. The explosive power of individual devices – first triggered by nuclear fission in atomic bombs, then by nuclear fusion in hydrogen bombs – escalated vertiginously. Launching systems gained in range and accuracy: with the evolution from the medium-range bomber in 1945 to the intercontinental ballistic missile in 1957, full-blown nuclear war could be initiated within a few hours of the executive decision. In the absence of any battlefield experience of nuclear weapons, systematic mathematical calculation of the theoretical damage they would inflict on urban areas and troops was the only means by which to assess defence needs. Consequently, nuclear payloads, delivery systems, military and civilian defensive measures, along with the strategies and tactics within which these would be inserted, were given the systems analysis treatment. Systems analysis, game theory and the whole range of available mathematical and statistical instruments were the only means to rationalize Armageddon and ‘think the unthinkable’.45 Although attempts were made at developing strategies of graduated nuclear war, the main priority of the analysts was to ensure effective deterrence and preserve the ‘delicate balance of terror’46 over and above any notion of winning the face-off, to ensure that the ‘Cold War system’ could return to homeostatic equilibrium and ward off the possibility of its exceeding the bounds beyond which it would self-destruct in an apocalyptic spasm dubbed ‘wargasm’ by the nuclear strategist Herman Kahn.
Nor was systems analysis limited to the nuclear aspects of warfare; the entire spectrum of conventional military operations was subject to its scrutiny. As RAND’s first vice-president Alan Henderson declared, ‘systems analysis seeks to cover the full range of possible future weapons characteristics and simultaneously analyse each set of possible characteristics in all possible tactics and strategies of employment’.47 However, while a comparison of the relative merits of two bombers across ten variables, such as was possible during World War II, already yielded over 1,000 combinations, raising the number of systems being considered by only four resulted in over a million combinations for evaluation.48
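The scale of this combinatorial explosion is worth making explicit. On one plausible reconstruction of the arithmetic (the counting conventions behind these figures are an assumption here, not spelled out in the source), comparing designs offering two alternatives for each of ten variables requires examining

$$2^{10} = 1{,}024$$

combinations, whereas allowing four alternatives per variable yields

$$4^{10} = 1{,}048{,}576$$

combinations, each of which would in principle have to be evaluated against every possible tactic and strategy of employment.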
Furthermore, the rapidly evolving characteristics of the weapons required a constant revision of their potential impact on existing tactical and strategic plans. This exponential increase in possible permutations made the task achievable only by computer. Thanks to its formidable processing power, the computer allowed for the creation of complex models with multiple variables and could rapidly recalculate outcomes whenever their values changed.
The same modelling techniques were also employed for a vast range of wargames which simulated an array of operations from the tactical to the operational to the strategic, from individual battalions to anti-aircraft defences to global geopolitical exercises. While war-gaming had long been practised – the Germans had been enthusiasts of Kriegsspiel from the nineteenth century onwards – the new generation of wargames and simulations was an extension of OR and SA since their models relied on the same methodology. Ghamari-Tabrizi points to the manner in which these wargames constituted their own closed worlds:
Following the thread of systems thinking, the gamers tried to shoehorn everything of importance into game design and play. Since a major war would batter every department of life, they were tempted to expand their model into infinitely complex details in the simulation of reality. But at the same time, they were determined to set upper and lower boundaries, limits and constraints of every kind onto that surging impulse towards the Weltbild. In other words, in war game design, one makes out a wish to catch a richly furnished world, but one sealed off like a terrarium or a tableau in a paperweight. This snug little world, in which the totality could be grasped all at once, encompasses the universe of miniature life.49
Computers became increasingly employed in wargames, first to calculate the outcome of the players’ decisions by processing them through complex models of warfare and international relations, and second as an interface for the players. To provide greater realism, the environments of decision makers were often painstakingly reproduced. Mediated through computerized displays and interfaces, real wartime situations and simulations would be largely indistinguishable. War-gamers at RAND would eventually grant computers an even greater role by making them fully-fledged players.
Faced with human players who persistently refused to cross the nuclear threshold in simulated exercises, the simulationists developed artificial intelligences that could play the role of the Soviet Union or the United States, creating a variety of iterations characterized by different ‘personalities’ and degrees of willingness to resort to force.50 Computers were effectively the ideal war-gamers: cold, logical, purely instrumental and devoid of the messy cultural, social and historical attributes that plagued human players and could not be mathematically modelled. Fully computerized wargames allowed for the rapid testing of an entire range of weapon system characteristics, logistics, tactics, and strategies for the purpose of identifying the optimal combination.
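The logic of these artificial players can be conveyed by a minimal sketch, offered purely as an illustration: the names, parameters and escalation mechanics below are invented for the example and are not a reconstruction of RAND’s actual programs. Each player’s ‘personality’ is reduced to a single aggressiveness parameter governing its willingness to escalate a crisis.

```python
import random

class ArtificialPlayer:
    """A hypothetical wargame player whose 'personality' is a single
    aggressiveness parameter (0.0 = dovish, 1.0 = hawkish)."""

    def __init__(self, name: str, aggressiveness: float):
        self.name = name
        self.aggressiveness = aggressiveness

    def choose_move(self, tension: float) -> str:
        # The more aggressive the player and the higher the current
        # tension, the more likely it is to escalate further.
        if random.random() < self.aggressiveness * tension:
            return "escalate"
        return "negotiate"

def run_crisis(player_a: ArtificialPlayer, player_b: ArtificialPlayer,
               rounds: int = 20) -> str:
    """Iterate the crisis: escalation raises tension, negotiation lowers it."""
    tension = 0.5
    for _ in range(rounds):
        for player in (player_a, player_b):
            if player.choose_move(tension) == "escalate":
                tension = min(1.0, tension + 0.1)
                if tension >= 1.0:
                    return f"{player.name} crosses the nuclear threshold"
            else:
                tension = max(0.0, tension - 0.1)
    return "crisis ends without nuclear use"

# Different 'personalities' produce different propensities to go nuclear,
# and whole batches of runs can be tabulated automatically.
print(run_crisis(ArtificialPlayer("Blue", 0.2), ArtificialPlayer("Red", 0.9)))
```

Iterated across many parameter settings, a loop of this kind is what allowed entire ranges of strategies to be tested automatically, and its outputs are precisely the sort of ‘synthetic facts’ discussed below.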
But the use of wargames was not restricted to testing models; they could also provide their own facts and statistics for interpretation and inclusion in the models. As one war-gamer observed: ‘as we recede from such sources of empirical data as World War II and Korea, an ability to generate synthetic battlefield facts becomes increasingly important.’51 These synthetic facts drawn from the experiences of simulated conflict could then be fed back into further models of war – simulation begetting simulation, the constitution of a hyperreal feedback loop.
Although institutional resistance from the Air Force – RAND’s main sponsor – to studies that attempted to model warfare in its entirety forced the organization to publicly downgrade the ambition and scope of its research projects, in practice the systemic interrelationship of the areas of study and the analysts’ commonly held belief in the superiority of their methodology made such restrictions difficult to maintain. Alain Enthoven, Deputy Assistant Secretary for Systems Analysis under McNamara, once pointed out to a general: ‘I have fought as many nuclear wars as you have’.52
In fact, given his time at RAND modelling nuclear war and playing wargames, Enthoven may well have believed that he had actually fought more, albeit in simulation. His quip was symptomatic of the attitude of RAND analysts towards the military brass, convinced as they were that ‘in order to approach nuclear war properly, one had to become a perfect amnesiac, stripped of the intuitions, judgments, and habits cultivated over a lifetime of active duty’.53 Combat experience and the traditional wisdom of the military were thus devalued in favour of the cool rational calculations of the defence intellectual.
In 1961, this latter vision appeared to have triumphed over the generals as Robert McNamara was made Secretary of Defense and proceeded to apply the paraphernalia of systems analysis across the military more systematically than ever before.
McNamara had first risen to prominence during World War II, distinguishing himself as one of the most brilliant analysts in the Statistical Control Office, where he had applied operations research to Air Force operations. He was notably involved in the strategic bombing campaign against Japan, recommending the switch to firebombing and lower-altitude bombing which the notorious Air Force General Curtis LeMay (later head of Strategic Air Command) adopted with devastating results for Japanese cities.
After the war, he left the armed forces to join the Ford Motor Company, applying the same principles of scientific management with great success, before being offered the post of Secretary of Defense by President Kennedy. Surrounding himself with a team of ex-RAND analysts who shared his worldview, McNamara set out to extend these principles to all branches of the military. A controversial figure, particularly unpopular with certain sections of the military over which he asserted previously unseen levels of civilian control, McNamara was once referred to as a ‘“human IBM machine” who cares more for computerised statistical logic than for human judgments’.54
In 1962, McNamara instituted the Planning, Programming and Budgeting System (PPBS), perhaps his most lasting legacy, institutionalizing systems analysis in the decision-making processes of military planning and procurement. With PPBS, cost-benefit and cost-effectiveness analysis were applied across all branches of the military so that military programmes from different services could be evaluated, compared, and granted funding accordingly.55 PPBS was subsequently extended across the federal bureaucratic structure, in particular to the social welfare agencies of the Department of Health, Education and Welfare and the Office of Economic Opportunity. Department of Defense Comptroller Charles Hitch (ex-RAND) insisted that systems analysis acted merely as an instrument assisting decision makers rather than being the decisive factor in determining spending plans. Gregory Palmer agrees that PPBS often functioned more as a heuristic or an ideal, but notes that ‘in its pristine form, PPBS was a closed system, rationally ordered to produce carefully defined outputs’.56 As such, critics claimed, its influence was pervasive and dangerously misleading if applied uncritically.
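In its textbook form, the cost-effectiveness logic at the heart of PPBS can be reduced to a simple selection rule (a schematic rendering for illustration, not a formula drawn from the Hitch-era manuals): given competing programmes $i$ with measured effectiveness $E_i$ and cost $C_i$, fund the set $S$ that maximizes

$$\sum_{i \in S} E_i \quad \text{subject to} \quad \sum_{i \in S} C_i \leq B,$$

where $B$ is the available budget; in practice this means ranking programmes by the ratio $E_i / C_i$. The critics’ point was precisely that everything turns on what gets admitted as a measurable $E_i$.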
In his Farewell Address to the nation on 17 January 1961, President Eisenhower had famously warned against ‘the acquisition of unwarranted influence, whether sought or unsought, by the military-industrial complex’. Eisenhower also spoke of the ‘danger that public policy could itself become the captive of a scientific-technological elite’.57 A military man, Eisenhower was most likely thinking at least in part of the operations researchers and systems analysts who had risen to prominence during his presidency and were about to take control of the Pentagon, equipped with instruments they believed could be used to tackle all social problems. ‘The military effect of cybernetics and computers did more than bring about changes in administration, logistics, communications, intelligence and even operations’, Van Creveld tells us; ‘they also helped a new set of people to take charge, people who thought about war – and hence planned, prepared, waged, and evaluated it – with the aid of fresh criteria and from a fresh point of view.’58
Because of the scientific and mathematical methodology upon which this new point of view relied, analysts systematically privileged the quantifiable aspects of warfare:
With computers acting as the stimulus, the theory of war was assimilated into that of microeconomics ... Instead of evaluating military operations by their product – that is, victory – calculations were cast in terms of input-output and cost effectiveness. Since intuition was replaced by calculation, and since the latter was to be carried out with the aid of computers, it was necessary that all the phenomena of war be reduced to quantitative form. Consequently everything that could be quantified was, while everything that could not be tended to be thrown onto the garbage heap.59
Under the impulse of computer modelling and systems analysis, the understanding of war which emerged during the Cold War was therefore frequently biased towards those elements which could be quantified. But even that which could be quantified could not necessarily be measured or estimated precisely, and was frequently the product of more or less inspired guesswork.
For Solly Zuckerman:

operational analysis implies a kind of scientific natural history. It is a search for exact information as a foundation for extrapolation and prediction. It is not so much a science in the sense of a corpus of exact knowledge, as it is the attempted application of rigorous methods of scientific method and action to new and apparently unique situations. The less exact the information available for analysis, the less it is founded on experience, the more imprecise are its conclusions, however sophisticated and glamorous the mathematics with which the analysis is done.60
As such, the outcome of SA studies or wargames was heavily dependent on the assumptions underpinning their models, some acknowledged by the analysts, others largely concealed or unquestioned. Driven by their desire for predictability, analysts constrained uncertainty either by setting the possible variations of factors within clearly delineated numerical ranges and probability sets, or by simply discounting all those elements that could not be treated in this bounded way. Princeton academic Klaus Knorr noted some of the uncertainties frequently neglected by SA:
Costs may be uncertain, technology may be uncertain, the properties of military conflict situations may be uncertain, and the reactions and capabilities of the potential enemy nations are apt to be uncertain. The last uncertainty is of particular import; it is imperative that military choices be examined within a framework of interaction. An opponent’s responses to our choices may, after all, curtail or altogether nullify the advantage we seek. Nor is it enough to recognize the conflict aspects of the problem. The possibilities of tacit or formal co-operation may be equally significant.61
McNamara himself came to be disillusioned with the approach he had championed, recognizing the impossibility of making war into a fully predictable instrument of policy: ‘war is so complex, it’s beyond the ability of the human mind to comprehend all the variables. Our judgement, our understanding, are not adequate.’62 McNamara was to learn this lesson during his tenure as Secretary of Defense between 1961 and 1968, during which the United States was progressively sucked into a Vietnam War it could not win, despite (or perhaps because of) its army of system analysts in the Pentagon.
Vietnam: cybernetic warfare fails
The limits of the centralizing cybernetic model became clear in Vietnam, although the large part it played in the US defeat has often been disregarded. James Gibson has perhaps done the most to document the dramatic failure of ‘technowar’, ‘a production system that can be rationally managed and warfare as a kind of activity that can be scientifically determined by constructing computer models’.63 The principles of OR and SA were applied to provide analysis of the conflict and guidance to the policy makers, while cybernetic command-and-control technologies were widely deployed. What developed in Vietnam can be appropriately described as an ‘information pathology’: an obsession with statistical evaluations and with directing the war from the top, perceived as the point of omniscience, when in practice soldiers on the ground often understood far better than their superiors how badly the war was going.
Between 1967 and 1972, the Air Force ran Operation Igloo White at a cost of nearly $1 billion a year. Through an array of sensors designed to record sound, heat, vibrations, and even the smell of urine, information was fed to a control centre in Thailand, which passed the resulting targeting data on to patrolling jet aircraft (even the release of bombs could be controlled remotely). This vast cybernetic mechanism was designed to disrupt the Ho Chi Minh Trail, a network of roads and trails providing logistical support to the North Vietnamese. At the time, extravagant claims were made about the performance of the system, with the reported number of destroyed trucks in 1970 exceeding the total number of trucks believed to be in all of North Vietnam. In reality, far fewer truck remains were ever identified, there were probably many false positives in target identification, and the North Vietnamese and their Laotian allies became adept at fooling the sensors. In spite of all this, the official statistics still trumpeted a 90 per cent success rate in destroying equipment travelling down the Ho Chi Minh Trail, an assertion difficult to sustain given that the North Vietnamese conducted major tank and artillery operations in South Vietnam in 1972. Edwards incisively observes that ‘Operation Igloo White’s centralized, computerized, automated, power-at-a-distance method of “interdiction” resembled a microcosmic version of the whole US approach to Vietnam’.64
Gibson submits that technowar not only altered the conduct of war but even the likelihood of the use of force: ‘by adopting microeconomics, game theory, systems analysis, and other managerial techniques, the Kennedy administration advanced “limited” war to greater specificity, making it seem much more controllable, manageable, and therefore desirable as foreign policy.’65 Henry Kissinger illustrated this very point when he claimed in 1969 that ‘a scientific revolution has, for all practical purposes, removed technical limits from the exercise of power in foreign policy’.66
The US bombing campaign in Vietnam obeyed a gradation in the use of force through which signals could be sent to the North Vietnamese. This amounted to a communicative theory of war in which the level of violence could be alternately ratcheted up or eased according to the message to be sent. In this manner the government wished to convince the North Vietnamese that they could not win, thereby forcing them to negotiate and steering them towards the desired behaviour.
As Kissinger put it, ‘in a limited war, the problem is to apply graduated amounts of destruction for limited objectives and also to permit the necessary breathing spaces for political contacts’.67 This thinking emerged from attempts by defence intellectuals, frustrated by the paradoxical powerlessness of nuclear weapons so destructive they could not be used, to theorize and rationalize their limited use against the Soviet Union as bargaining chips in an eventual showdown. This strategy was ultimately abandoned because it was impossible to guarantee that nuclear war would not rapidly escalate into an apocalyptic war of extermination, but it resurfaced in the context of the Vietnam War.
By applying bargaining models based on game theory which assumed a common utility-maximizing rationality and cost-benefit framework of analysis on all sides, strategists erected an understanding of the enemy that was a mere reflection of their own worldview. This perception was further bolstered by the military and civilian leadership’s conception of war as the management of complex industrial systems:
‘Limited war fought as a war of attrition means that only information about technological-production systems will count as valid knowledge about the enemy. For the military as well as civilian, the enemy becomes a mirror image of ourselves, only “less” so.’68 Since military effectiveness could only be measured by the yardstick of ‘technological-production systems’, the North Vietnamese were necessarily inferior and victory was the only conceivable outcome for the American war machine.
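The mirror-imaging at work here can be made explicit with a simple expected-utility sketch (an illustrative formalization, not one taken from the planners’ own documents). The bargaining models assumed that the enemy would choose negotiation over continued fighting once

$$p \cdot V < C,$$

where $p$ is its perceived probability of victory, $V$ the value it attaches to winning, and $C$ the cumulative cost inflicted by graduated bombing. Raising $C$ should therefore eventually compel negotiation; but the inequality only has purchase if the enemy’s $V$ is finite and commensurable with American measures of cost, which for an opponent treating unification as a near-absolute end it was not.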
Cybernetic warfare’s closed self-referentiality was thus a major factor in bringing about its defeat in Vietnam, blinding its proponents to the successful asymmetric strategy deployed by the Vietcong. Designed to fight total war against the Soviet Union, cybernetic warfare was susceptible to spectacular inefficiency and failure when engaged in a low-intensity conflict in which a dispersed enemy merged into a complex jungle environment. Attempts to simplify the battlespace through the practice of deforestation and the use of Agent Orange made little difference against an opponent that played to its strengths and understood its enemy far better than the Americans did. Witness North Vietnamese General Vo Nguyen Giap’s piercing observation: ‘The United States has a strategy based on arithmetic. They question the computers, add and subtract, extract square roots, and then go into action. But arithmetical strategy doesn’t work here. If it did, they would already have exterminated us with their airplanes.’69
The American reliance on information technologies to direct the war brought its own problems. For one, despite the development of ICTs, volumes of information escalated so fast that saturation and bottlenecks resulted, especially within highly centralized command-and-control structures. Intelligence on Vietcong positions and movements frequently arrived too late to be actionable, delayed in an information-processing infrastructure unable to treat all the data it was fed. And this despite the creation of an unprecedented telecommunications network in a field of operations, with electronic communications gear accounting for a third of all major items of equipment brought into the country and the first use of satellite communications for military purposes in 1965.70 As Arquilla and Ronfeldt recognize, ‘informational overload and bottlenecking has long been a vulnerability of centralized, hierarchical structures for command and control’.71
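Why more data produced slower rather than faster decisions has a standard formal expression in queueing theory, offered here only as an illustration (the Vietnam-era systems were not, to our knowledge, analysed in these terms): in a simple single-server model, the average time a message spends in the system is

$$W = \frac{1}{\mu - \lambda},$$

where $\lambda$ is the rate at which messages arrive and $\mu$ the rate at which the centre can process them. As $\lambda$ approaches $\mu$, $W$ grows without bound: a centralized node fed ever more information degrades precipitously rather than gracefully.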
Central to this was the fact that the measure of information gathering was frequently one of quantity over quality. The pressure on infantry units to produce detailed reports of their operations and particularly to match the ‘production’ targets in terms of enemy casualties led to wildly inaccurate and overblown estimates that masked the extent to which the American strategy was failing. Gibson points to a related problem in the intelligence field where operations were gauged primarily on data volumes:
Collection departments received most agency budgets and collection departments represented their progress in terms of how many ‘bits’ of information they collected, or how many hours of radio messages were recorded. Since their work was so tangible and measurable, collection departments got the most. As one senior staff member of the National Security Council said, ‘95 percent of the US intelligence effort has been on collection, and only 5 percent on analysis’.72
The paradox of this informational approach to warfare is noted by van Creveld: ‘designed to produce accuracy and certainty, the pressure exercised at the top for more and more quantitative information ended up by producing inaccuracy and uncertainty’.73 It was widely assumed that the development of information gathering and processing technologies would allow a far greater understanding and control of military operations. In practice, the collection and production of information for its own sake created at best greater uncertainty and confusion, and at worst a fictional account of the conflict based on a misplaced sense of omniscience and on which erroneous decisions would be made.
As Alain Enthoven was himself to recognize, ‘you assume that there is an information system that will tell you what you want to know. But that just isn’t so. There are huge amounts of misinformation and wrong information’.74 Thus, far from eliminating the Clausewitzian ‘fog of war’, cybernetic warfare itself generated ‘a kind of twilight, which, like fog or moonlight, often tends to make things seem grotesque and larger than they really are’.75 Statistical indicators pointing to US success in Vietnam were frequently erroneous and misleading, failing to grasp the determination of the enemy and the extent of the success of their political strategy. Colonel Harry Summers relates an anecdote whose absurdity captures the disjuncture between the statistical assessment of the war and its reality:
When the Nixon Administration took over in 1969 all the data on North Vietnam and the United States was fed into a Pentagon computer – populations, gross national product, manufacturing capability, number of tanks, ships, and aircraft, size of the armed forces, and the like. The computer was then asked, ‘When will we win?’ It took only moments to give the answer: ‘You won in 1964!’76
Conclusion
Defeat in Vietnam exposed the shortcomings of cybernetic warfare and revealed the inherent limitations of any attempt to make war into an entirely controllable and predictable activity. The cybernetic model of warfare erected by the system analysts was frictionless, a perfectly oiled machine resting on elegant mathematical constructs. Rather than eternal attributes of the battlefield, uncertainty and unpredictability were understood merely as a lack of information which could be overcome through the deployment of the proper information and communication technologies and the elaboration of appropriate models of conflict. John Lewis Gaddis explicitly criticizes the tendency in American strategic thought in the post-war era ‘to equate the importance of information with the ease of measuring it – an approach better suited to physics than to international relations’.77
The formidable technological impulse of World War II, marked in particular by the development of nuclear weaponry and ICTs, empowered those individuals who mastered the language and methodology of the science that accompanied this technology, to the detriment of established traditions of military thought and practice of warfare. Via an ‘“organized scientific discourse” through multiple, but centralizing relationships among high-bureaucratic positions, technobureaucratic or production logic in the structure of its propositions, and the conventional educated prose style’, cybernetic warfare excluded accounts of the war which did not conform to the exigencies of technoscientific discourse. For Gibson, this amounted to a neglect of ‘warrior knowledge’, which he describes in terms of Foucault’s notion of ‘subjugated knowledge’,78 that is, knowledge ‘disqualified as inadequate to their task or insufficiently elaborated: naïve knowledges, located low down on the hierarchy, beneath the required level of cognition or scientificity’.79
If the debacle of Vietnam resulted in some serious soul-searching among American strategists and military men, it did not result in a wholesale abandonment of the worldview epitomized by cybernetic warfare or a significant re-evaluation of other forms of thought on war. Throughout the rest of the Cold War and beyond, information technology continued to be embraced as the panacea for the chaos and indeterminacy of war. The Strategic Defense Initiative promised an invulnerable shield against nuclear attack through a combination of computers and space weapons, while revolutions in military affairs in the mould of Westmoreland’s vision have been repeatedly heralded (of which the current Pentagon doctrine of network-centric warfare is only the latest incarnation). A greater understanding of the role of scientific ideas and technological systems in the theory and practice of warfare therefore serves not only to shed light on the Cold War but also as a warning against any misplaced faith in the ability of information technology to grant complete control and predictability over the use of military force.
Notes
[1] Edwards, The Closed World: Computers, 127-8.
[2] Robin, The Making of the Cold War; Light, From Warfare to Welfare; Kaplan, The Wizards of Armageddon; Ghamari-Tabrizi, The Worlds of Herman Kahn.
[3] Edwards, The Closed World: Computers.
[4] Wiener, Cybernetics, 55.
[5] Ibid., 55.
[6] Wiener, The Human Use of Human Beings, 95.
[7] Ibid., 8.
[8] Heylighen and Joslyn, “Cybernetics and Second-Order Cybernetics,” http://pespmc1.vub.ac.be/Papers/Cybernetics-EPST.pdf, 18.
[9] Heims, Von Neumann and Wiener, 184.
[10] Dechert, The Social Impact of Cybernetics, 20.
[11] Quoted in Heims, The Cybernetics Group, 22.
[12] Quoted in Capra, The Web of Life, 62.
[13] Easton, A Framework for Political Analysis, 112, 128; Easton, A Systems Analysis of Political Life; Deutsch, The Nerves of Government.
[14] Wiener, The Human Use of Human Beings, 185-6.
[15] Edwards, The Closed World: Computers, 12.
[16] Ibid., 15.
[17] Ibid., 15.
[18] Ibid., 7.
[19] Edwards, “The Closed World: Systems Discourse,” 138-9.
[20] Ghamari-Tabrizi, The Worlds of Herman Kahn, 128.
[21] Edwards, “The Closed World: Systems Discourse,” 139.
[22] Levidow and Robins, “Towards a Military Information Society?,” 173.
[23] Westmoreland, “Address.”
[24] Edwards, “Why Build Computers?”
[25] Gray, Postmodern War, 95.
[26] Van Creveld, Technology and War, 239.
[27] Gibson, The Perfect War, 23.
[28] Rochlin, Trapped in the Net, 204.
[29] Ghamari-Tabrizi, “U.S. Wargaming Grows Up.”
[30] Edwards, “The Closed World: Systems Discourse,” 143.
[31] Edwards, The Closed World: Computers, 206.
[32] Ghamari-Tabrizi, The Worlds of Herman Kahn, 149.
[33] United States Army, Operations Research/Systems Analysis.
[34] Cummings, “How the World of OR Societies Began.”
[35] De Landa, War in the Age of Intelligent Machines, 5.
[36] Rochlin, Trapped in the Net, 59.
[37] Beer, “What Has Cybernetics to Do with Operational Research?”
[38] Martin and Norman, The Computerised Society, 569.
[39] Clayton and Sheldon, Air Force Operations Analysis.
[40] Van Creveld, Technology and War, 194.
[41] Clayton and Sheldon, Military Operations Research.
[42] Wilson, The Bomb and the Computer, 43.
[43] Kaplan, The Wizards of Armageddon, 87.
[44] Ibid., 87.
[45] Thinking about the Unthinkable was the title of a book by the notorious nuclear strategist Herman Kahn.
[46] Wohlstetter, The Delicate Balance of Terror.
[47] Ghamari-Tabrizi, The Worlds of Herman Kahn, 138.
[48] Holley, The Evolution of Operations Research, 101.
[49] Ghamari-Tabrizi, The Worlds of Herman Kahn, 166.
[50] De Landa, War in the Age of Intelligent Machines, 103.
[51] Ghamari-Tabrizi, The Worlds of Herman Kahn, 169.
[52] Kaplan, The Wizards of Armageddon, 254.
[53] Ghamari-Tabrizi, The Worlds of Herman Kahn, 48. Kahn echoed Enthoven’s sentiment when he asked officers who were critical of his approach, ‘how many thermonuclear wars have you fought recently?’
[54] At http://www.defenselink.mil/specials/secdef_histories/bios/mcnamara.htm.
[55] Kaplan, The Wizards of Armageddon, 254.
[56] Edwards, The Closed World: Computers, 5.
[57] Eisenhower, “Farewell Address.”
[58] Van Creveld, Technology and War, 246.
[59] Ibid., 246.
[60] Perry, “Commentary,” 117.
[61] Wilson, The Bomb and the Computer, 114.
[62] Morris, The Fog of War.
[63] Gibson, The Perfect War, 156.
[64] Edwards, “Cyberpunks in Cyberspace.”
[65] Gibson, The Perfect War, 80.
[66] Kissinger, American Foreign Policy, 51-97.
[67] Gibson, The Perfect War, 22.
[68] Ibid., 23.
[69] Mustin, “Flesh and Blood.”
[70] Van Creveld, Command in War, 239.
[71] Arquilla and Ronfeldt, “Cyberwar is Coming!”
[72] Gibson, The Perfect War, 367.
[73] Van Creveld, Command in War, 259.
[74] Herken, Counsels of War, 220.
[75] Clausewitz, On War, 140.
[76] Heuser, Reading Clausewitz, 170.
[77] Gaddis, Strategies of Containment, 84.
[78] Gibson, The Perfect War, 467.
[79] Foucault, Power/Knowledge, 82.
References
Arquilla, John and David Ronfeldt. “Cyberwar is Coming!” In In Athena’s Camp: Preparing for Conflict in the Information Age, edited by John Arquilla and David Ronfeldt. Santa Monica, CA: RAND, 1997.
Beer, Stafford. “What Has Cybernetics to Do with Operational Research?” Operational Research Quarterly 10, no. 1 (1959): 1-21.
Capra, Fritjof. The Web of Life: A New Synthesis of Mind and Matter. London: Flamingo, 1997.
Clausewitz, Carl von. On War. Harmondsworth: Penguin, 1968.
Clayton, Thomas J., and Sheldon, Robert S. Air Force Operations Analysis. Available at http://www.mors.org/history/af_oa.pdf.
Clayton, Thomas J., and Sheldon, Robert S. Military Operations Research. Available at http://www.mors.org/history/mor.pdf.
Cummings, Nigel. “How the World of OR Societies Began.” OR Newsletter, April 1997. Available at http://www.orsoc.org.uk/about/topic/news/article_news_orclub.htm.
De Landa, Manuel. War in the Age of Intelligent Machines. New York: Swerve Editions, 1991.
Dechert, Charles R., ed. The Social Impact of Cybernetics. Notre Dame, IN: University of Notre Dame, 1966.
Deutsch, Karl W. The Nerves of Government: Models of Political Communication and Control. New York: Free Press; London: Collier-Macmillan, 1963.
Easton, David. A Systems Analysis of Political Life. New York: John Wiley & Sons, 1965.
Easton, David. A Framework for Political Analysis. Chicago, IL and London: University of Chicago Press, 1979.
Edwards, Paul N. “The Closed World: Systems Discourse, Military Policy and Post-World War II US Historical Consciousness.” In Cyborg Worlds: The Military Information Society, edited by Les Levidow and Kevin Robins. London: Free Association Books, 1989.
Edwards, Paul N. “Cyberpunks in Cyberspace: The Politics of Subjectivity in the Computer Age.” In Cultures of Computing, edited by Susan Leigh Star. Keele: Sociological Review and Monograph Series, 1995.
Edwards, Paul N. The Closed World: Computers and the Politics of Discourse in Cold War America. Cambridge, MA and London: MIT Press, 1996.
Edwards, Paul N. “Why Build Computers?” In Major Problems in the History of American Technology: Documents and Essays, edited by Merritt Roe Smith and Gregory K. Clancey. Boston: Houghton Mifflin, 1998.
Eisenhower, Dwight. “Farewell Address to the Nation.” January 17, 1961. http://www.ourdocuments.gov/doc.php?flash=true&doc=90&page=transcript.
Foucault, Michel. Power/Knowledge. Hemel Hempstead: Harvester Press, 1980.
Gaddis, John Lewis. Strategies of Containment: A Critical Appraisal of Postwar American National Security Policy. Oxford: Oxford University Press, 1982.
Ghamari-Tabrizi, Sharon. The Worlds of Herman Kahn: The Intuitive Science of Thermonuclear War. Cambridge, MA: Harvard University Press, 2005.
Ghamari-Tabrizi, Sharon. “U.S. Wargaming Grows Up: A Short History of the Diffusion of Wargaming in the Armed Forces and Industry in the Postwar Period up to 1964.” Available at http://www.strategypage.com/articles/default.asp?target=Wgappen.htm.
Gibson, James. The Perfect War: Technowar in Vietnam. Boston: Atlantic Monthly Press, 1986.
Gray, Chris Hables. Postmodern War: The New Politics of Conflict. New York: Guilford Press, 1997.
Heims, Steve J. John Von Neumann and Norbert Wiener: From Mathematics to the Technologies of Life and Death. Cambridge, MA and London: MIT Press, 1980.
Heims, Steve J. The Cybernetics Group. Cambridge, MA: MIT Press, 1991.
Herken, Gregg. Counsels of War. New York: Alfred A. Knopf, 1985.
Heuser, Beatrice. Reading Clausewitz. London: Pimlico, 2002.
Heylighen, Francis and Cliff Joslyn. “Cybernetics and Second-Order Cybernetics.” In Encyclopedia of Physical Science & Technology, 3rd ed., edited by R.A. Meyers. New York: Academic Press, 2001. Available at http://pespmc1.vub.ac.be/Papers/Cybernetics-EPST.pdf.
Holley, I.B., Jr. “The Evolution of Operations Research and the Impact on the Military Establishment: The Air Force Experience.” In Science, Technology and Warfare: The Proceedings of the Third Military History Symposium, edited by Monte D. Wright and Lawrence J. Paszek. United States Air Force Academy, 8-9 May 1969.
Kahn, Herman. Thinking about the Unthinkable. London: Weidenfeld and Nicolson, 1962.
Kaplan, Fred. The Wizards of Armageddon. New York: Simon & Schuster, 1984.
Kissinger, Henry. American Foreign Policy: Three Essays by Henry Kissinger. New York: W.W. Norton, 1969.
Levidow, Les and Kevin Robins. “Towards a Military Information Society?” In Cyborg Worlds: The Military Information Society, edited by Les Levidow and Kevin Robins. London: Free Association Books, 1989.
Light, Jennifer S. From Warfare to Welfare: Defense Intellectuals and Urban Problems in Cold War America. Baltimore, MD: Johns Hopkins University Press, 2003.
Martin, James and Adrian R.D. Norman. The Computerised Society. Harmondsworth, Middlesex: Penguin Books, 1973.
Meyers, R.A., ed. Encyclopedia of Physical Science and Technology, 3rd ed. New York: Academic Press, 2001.
Morris, Errol, director. The Fog of War: Eleven Lessons from the Life of Robert S. McNamara. Columbia Tristar, 2004.
Mustin, Lt Jeff. “Flesh and Blood: The Call for the Pilot in the Cockpit.” Air and Space Power Journal: Chronicles Online Journal, July 2001. Available at http://www.airpower.maxwell.af.mil/airchronicles/cc/mustin.html.
Perry, Robert L. “Commentary.” In Science, Technology and Warfare: The Proceedings of the Third Military History Symposium, edited by Monte D. Wright and Lawrence J. Paszek. United States Air Force Academy, 8-9 May 1969.
Robin, Ron Theodore. The Making of the Cold War Enemy: Culture and Politics in the Military-Intellectual Complex. Princeton, NJ: Princeton University Press, 2001.
Rochlin, Gene I. Trapped in the Net: The Unanticipated Consequences of Computerization. Princeton, NJ: Princeton University Press, 1997.
United States Army. Operations Research/Systems Analysis. Official Department of the Army Administrative Publications, Department of the Army Pamphlet 600-3-49, 1987. Available at http://www.army.mil/usapa/epubs/pdf/p600_3_49.pdf.
Van Creveld, Martin. Command in War. Cambridge, MA and London: Harvard University Press, 2003.
Van Creveld, Martin. Technology and War: From 2000 B.C. to the Present. New York and London: Free Press and Collier Macmillan, 1989.
Westmoreland, William. “Address to the Association of the U.S. Army.” 14 October 1969.
Wiener, Norbert. Cybernetics, or Control and Communication in the Animal and the Machine. Cambridge, MA: MIT Press, 1948.
Wiener, Norbert. The Human Use of Human Beings: Cybernetics and Society. London: Eyre & Spottiswoode, 1954.
Wilson, Andrew. The Bomb and the Computer. London: Barrie & Rockliff, 1968.
Wohlstetter, Albert. The Delicate Balance of Terror. 1958. Available at http://www.rand.org/publications/classics/wohlstetter/P1472/P1472.html.