Cybernetics is THE all-encompassing enslavement agenda, superseding eugenics

Anti_Illuminati
« on: July 28, 2010, 04:21:42 pm »

Cyberneticizing the American war machine: science and computers in the Cold War
Antoine Bousquet
Birkbeck College, UK

Cold War History, Vol. 8, No. 1, February 2008, pp. 77–102
Online Publication Date: 01 February 2008

American victory in World War II was perceived to be due in large part to its scientific and technological superiority, best exemplified by the development of the atom bomb. Throughout the Cold War, scientific theories and methodologies were recruited even more extensively to weigh on military and strategic affairs. Cybernetics, along with operations research and systems analysis, sought to impose order and predictability on warfare through the collection, processing, and distribution of information. The emergence of the notion of command-and-control epitomized a centralizing approach which saw military organization purely as a vast techno-social machine to be integrated and directed on the basis of the predictions of mathematical models and the deployment of cybernetic technologies. Preparation for a nuclear conflict with the Soviet Union was the primary focus of this conception of warfare but it failed spectacularly the test of Vietnam, thereby dramatically revealing its theoretical and practical bankruptcy. Indeed, cybernetic warfare was deeply flawed in its restrictive assumptions about conflict, its exclusive focus on quantitative elements, its dismissal of any views that did not conform to its norms of scientificity, and its neglect of the risks of information inaccuracy and overload.


Where is your data? Give me something I can put in the computer. Don’t give me your poetry.
(Robert McNamara on being told by a White House aide that the Vietnam War was doomed to failure)1


Antoine Bousquet is a Lecturer in International Relations at Birkbeck, University of London, currently researching the relationship of science and technology to warfare. His latest publication is “Time Zero: Hiroshima, September 11 and Apocalyptic Revelations in Historical Consciousness,” Millennium: Journal of International Studies 34, no. 3 (2006). Forthcoming is The Scientific Way of Warfare (Hurst, 2008). Correspondence to: Antoine Bousquet, School of Politics and Sociology, Birkbeck College, University of London, 10 Gower Street, London WC1E 7HX. Email: a.bousquet@bbk.ac.uk

ISSN 1468-2745 print/ISSN 1743-7962 online
© 2008 Taylor & Francis
DOI: 10.1080/14682740701791359 http://www.informaworld.com
 

The Cold War was characterized not only by a transformation in the structure of world politics marked by the advent of bipolarity but also by momentous changes in the practice and theories of warfare. If the role of technology has been greatly studied, particularly in relation to nuclear weapons, the influence of the scientific ideas which accompanied technological development has been afforded far less attention. A number of scholars have nevertheless enquired into the relationship of scientific theory and practice to military planning and operations during the Cold War, whether it be the convergence between behavioural science and the psychological warfare strategies deployed in Korea and Vietnam (Robin), the extension of systems analysis to warfare and urban planning (Light), or the ideas and methods of the RAND Corporation’s nuclear strategists (Kaplan, Ghamari-Tabrizi).2 Closest in emphasis and an important influence on the present article, Paul Edwards’ The Closed World has provided a rich account of computers as both political icons and cultural metaphors central to the planning and conduct of the Cold War.3

The following article seeks to build on this work by charting the cyberneticization of military and strategic thought which accompanied the computerization of the United States military in the wake of World War II and analysing the inherent flaws of such an approach to warfare. Its aim is to demonstrate how cybernetic ideas, in particular those of systemic closure, information feedback, and homeostasis, contributed to erecting an understanding of war which strove to frame the use of military force into an activity totally amenable to scientific analysis, to the detriment of other forms of thought. I have thus called cybernetic warfare the nexus of ideas and practices produced by the extension of this scientific conceptual apparatus to military and strategic affairs.

The first section focuses on the development of cybernetics and the core principles and concepts of this self-professed ‘science of control and communications’ as first formulated by Norbert Wiener. I then turn to the Cold War itself, in which cybernetic technologies proliferated alongside a conceptual and methodological apparatus which emphasized the controllable and predictable nature of war. Traditional notions of command gave way to ‘command-and-control’, operations research and systems analysis reduced war to a set of mathematical functions and cost–benefit calculations susceptible to optimization, and conflict was increasingly modelled and simulated. The article concludes with the Vietnam War, a conflict in which the aforementioned ideas and techniques were truly put to the test, incurring spectacular reversals and revealing the flawed assumptions of cybernetic warfare.


Stepchild of war: Norbert Wiener’s cybernetics
 
The thought of every age is reflected in its technique.
(Norbert Wiener)4
 
Although drawing on older ideas and research, cybernetics was born of the imperatives of World War II. Indeed, the wartime research of cybernetics’ main progenitor Norbert Wiener played a major role in the elaboration of its central postulates. Wiener had worked on one of the most urgent technological problems of the war, namely the improvement of anti-aircraft defences. With the increase in the speed and altitude of bomber planes, anti-aircraft gunners could no longer simply visually target the plane since it would have moved out of position in the short time necessary for the projectile to reach it. Anti-aircraft defences were thus notoriously inefficient and successful hits resulted more from chance than the gunner’s accuracy. Whereas the traditional problem of ballistics required the production of lengthy tables detailing the appropriate artillery elevation according to type of gun, shell, and range to a fixed target, fire control against a rapidly mobile target was a real-time computational problem. Wiener therefore focused on first developing a mathematical theory for making a statistical prediction of the future course of a plane given available information on its position and motion. Real-time application of the theory required the processing of information provided by radar into adjustments in the aiming of the gun. A missed shot would be followed by an adjustment of the aim, a new shot and further adjustment if necessary. This led Wiener to think of this process as a feedback loop through which flows of information enable a system to adjust its behaviour in order to attain or maintain a desired state.
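
The gun-laying loop Wiener envisaged can be sketched in a few lines of code. The following is purely illustrative, assuming constant-velocity extrapolation and a simple proportional correction; the function names and figures are invented, and Wiener’s actual theory rested on far more sophisticated statistical filtering.

[code]
# Illustrative sketch only (not Wiener's actual predictor): extrapolate
# the target's future position, then correct the aim from the observed miss.

def predict_position(pos, vel, time_of_flight):
    # Constant-velocity assumption: where the plane will be when the shell arrives.
    return tuple(p + v * time_of_flight for p, v in zip(pos, vel))

def update_aim(aim, impact, target, gain=0.5):
    # Negative feedback: shift the aim by a fraction of the observed miss.
    return tuple(a + gain * (t - i) for a, i, t in zip(aim, impact, target))

pos, vel = (1000.0, 800.0), (150.0, 0.0)   # metres and metres/second (invented)
target = predict_position(pos, vel, time_of_flight=3.0)
aim = target
impact = (1430.0, 790.0)                   # where the shell actually burst (invented)
aim = update_aim(aim, impact, target)      # the loop closes: the miss becomes a correction
print(aim)
[/code]

Each pass round the loop is exactly the shoot-observe-adjust cycle described above.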

The etymology of cybernetics refers to the Greek for steersman or governor and reflected Wiener’s belief that a steersman and his rudder formed a feedback loop. The anti-aircraft unit thus constituted a self-steering device whether fully automated or with a human controller as only one part of the feedback loop. Wiener designated all self-steering devices relying on information feedback as servomechanisms:

The machines of which we are now speaking are not the dream of the sensationalist, nor the hope of some future time. They already exist as thermostats, automatic gyro-compass ship-steering systems, self-propelled missiles – especially such as seek their target – anti-aircraft fire-control systems, automatically controlled oil-cracking stills, ultra-rapid computing machines, and the like. They had begun to be used long before the war – indeed, the very old steam-engine governor belongs among them – but the great mechanization of the Second World War brought them into their own, and the need of handling the extremely dangerous energy of the atom will probably bring them to a still higher point of development ... the present age is as truly the age of the servomechanisms as the nineteenth century was the age of the steam engine or the eighteenth century the age of the clock.5

A cybernetic system, or servomechanism, is characterized by three distinct components: receptors or sensors that absorb inputs from the environment; a processing unit that records and translates this input, compares it with a desired state, and issues appropriate instructions; and an output mechanism through which the system acts on its environment. New outputs result in a new flow of inputs, thereby closing the feedback loop. This continuous closed loop is enabled by the flow of information that links all the components together and allows the system to respond to changes in the perceived environment and adjust its behaviour accordingly (Figure 1). Information feedback loops were not restricted to the creatures of engineers since Wiener would find cybernetic processes at work everywhere among living organisms.
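
This three-part anatomy is easy to make concrete. Below is a minimal sketch of the sensor/processor/output loop using the thermostat, one of Wiener’s own examples quoted above; the class, the numbers, and the crude environment model are all invented for illustration.

[code]
# A minimal servomechanism: sensor input, comparison with a desired
# state, and an output that acts back on the environment.

class Thermostat:
    def __init__(self, desired_temp, tolerance=0.5):
        self.desired = desired_temp
        self.tolerance = tolerance

    def step(self, sensed_temp):
        # One pass round the feedback loop: compare input with goal, then act.
        error = self.desired - sensed_temp
        if error > self.tolerance:
            return "heat on"       # output pushes the environment toward the goal
        if error < -self.tolerance:
            return "heat off"
        return "hold"              # within tolerable limits

room = 17.0
stat = Thermostat(desired_temp=20.0)
for _ in range(10):
    action = stat.step(room)                        # new input closes the loop
    room += 0.8 if action == "heat on" else -0.2    # crude environment model
print(round(room, 1))
[/code]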
 

In this way the behaviour of animal and machine could be brought under a single theory in which the identification of patterns of communication and control could be substituted for the study of specific physical embodiments. This would subsequently enable the ever closer integration of machines and organisms within single systems of control and information exchange.

Homeostasis was a term coined in the 1930s to describe the process by which living organisms adjust their internal environment to maintain a stable state. Examples would include the regulation of body temperature and cardiac rhythm or the concentration of nutrients and waste products within the tolerable limits of the organism. Wiener adopted the term and applied it more generally to all systems whose behaviour relies on negative feedback to stave off entropy, the general tendency of the universe towards disorder over time. Homeostasis was thus the means by which a system could maintain its goal – survival in the case of a biological life form (‘the process by which we living beings resist the general stream of corruption and decay’6), the continued regulation of a mechanical process within defined boundaries for a servomechanism – in a changing environment.

It should be noted that if another cybernetic system were put in the place of the environment in Figure 1, we would have two cybernetic systems interacting with one another as each tries to impose its own desired state on the other. If the respective goals are incompatible, the systems will be in a state of conflict or competition; if the goals can be conciliated, a mutually satisfactory equilibrium may be reached. A control relationship is established when one system can dominate another system and impose its preferences over it. In cybernetics, control and communication are inextricably linked since control ‘is nothing but the sending of messages which effectively change the behaviour of the recipient’.7

Complex control systems are thus composed of a hierarchy of nested cybernetic systems, each with its own goal but subservient to the goal of the system above it. For example, a machine or organism whose overall goal is survival might have a set of subsidiary goals that serve this purpose: a regular supply of energy, the evasion of a threat, or any other behaviour that addresses a disturbance that takes the system away from its overarching desired state. The more complex the environment and the greater the variety of possible perturbations, the more control loops will be required to attain and sustain the system’s goal. These nested hierarchies constitute a top-down architecture of control which cyberneticists see as explanatory of the ‘increasing complexity which characterizes such fundamental developments as the origin of life, multicellular organisms, the nervous system, learning, and human culture’.8
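
A toy rendering of such a nested hierarchy, in the same spirit as the sketches above: an outer loop pursuing the overall goal sets the setpoints that inner control loops then chase. The two subsidiary goals and all numbers here are invented for illustration.

[code]
# Nested control: a high-level loop chooses subsidiary goals, which
# become the desired states of lower-level homeostats.

def inner_loop(state, setpoint, gain=0.5):
    # Low-level homeostat: drive one variable toward its setpoint.
    return state + gain * (setpoint - state)

def outer_loop(energy, threat):
    # High-level goal (survival) selects subsidiary goals for the inner loops.
    if threat > 0.5:
        return {"speed": 10.0, "intake": 0.0}   # evasion overrides feeding
    if energy < 0.3:
        return {"speed": 2.0, "intake": 1.0}    # seek a supply of energy
    return {"speed": 0.0, "intake": 0.0}        # rest

state = {"speed": 0.0, "intake": 0.0}
setpoints = outer_loop(energy=0.2, threat=0.0)
state = {k: inner_loop(state[k], setpoints[k]) for k in state}
print(state)
[/code]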
 


Bureaucratic organizations and their top-down command layers offer another obvious example in human societies. In the 1950s, cybernetics appeared to offer a whole new interdisciplinary theory and methodology, with anthropologists, linguists, physiologists, sociologists, philosophers, engineers and computer scientists all applying cybernetic principles to their field. Cybernetics was not so much a traditional scientific discipline as a convergence of engineering techniques, scientific ideas and philosophical principles under a common discourse that allowed the discussion and analysis of artificial machines, biological organisms, and social organization as equivalent systems of control and communication operating under a single set of principles.

Initially, Wiener’s already ambitious goal for cybernetics was a theory ‘of control and communications in the animal and the machine’,9 but the definition was soon expanded to include the behaviour of all complex systems, including social ones.10 In 1946 Wiener had proposed that fields as diverse as statistical mechanics, communication engineering, the theory of control mechanisms in machines, biology, psychology and social science could all be understood through an emphasis on the role of communication:

The neuromuscular mechanism of an animal or of man is certainly a communication instrument, as are the sense organs which receive external impulses. Fundamentally the social sciences are the study of the means of communication between man and man, or, more generally, in a community of any sort of being. The unifying idea of these disciplines is the MESSAGE, and not any special apparatus acting on messages.11

He further claimed in 1948 that ‘it is certainly true that the social system is an organization like the individual, that is bound together by a system of communication, and that it has a dynamics in which circular processes of a feedback nature play an important role’.12 Several social scientists would later develop these ideas and apply many of the principles of cybernetics and systems analysis to their fields of study. Chief among them, Karl Deutsch explicitly drew from cybernetics to introduce notions of information feedback to the understanding of social systems and the ‘steering’ of government in his seminal Nerves of Government, while David Easton formulated a theory of the political system defined as ‘a means whereby certain inputs are converted into outputs’ and where the properties of feedback allow it ‘to regulate stress by modifying or redirecting its own behaviour’.13

But while Wiener did see the applications of his theory to social organization, he was reluctant to grant it the same scientific credibility as the study of machines and organisms. A liberal humanist at heart, Wiener was particularly concerned about the implications for the liberal subject of a cybernetic management of society:

I have spoken of machines, but not only of machines having brains of brass and thews of iron. When human atoms are knit into an organization in which they are used, not in their full right as responsible human beings, but as cogs and levers and rods, it matters little that their raw material is flesh and blood. What is used as an element in a machine, is in fact an element in the machine. Whether we entrust our decisions to machines of metal, or to those machines of flesh and blood which are bureaus and vast laboratories and armies and corporations, we shall never receive the right answers to our questions unless we ask the right questions.14

But while Wiener was uneasy about ‘social machines’, the multidisciplinary approach he fostered contributed to this logical extension of the cybernetic conceptual apparatus. It is also quite obvious that no social machine treats individuals as ‘cogs and levers and rods’ more completely than the military (an institution with which Wiener refused any association after the war). Wiener tended to see a democratic potential in cybernetics with its promise of feedback and reciprocal influence. However, cybernetics could just as well serve a hierarchical organization in which subservient systems fulfilled individual homeostatic roles set by the overarching system.

With its notions of information feedback, self-regulation, and homeostasis, cybernetics promoted an understanding of organisms, machines and organizations in terms of closed systems operating in their environment via continuous circular flows of information. In the context of an uncertain and precarious Cold War, such a conceptual framework, supported by a raft of new information and communication technologies, was a perfect match for the desire of politicians and the military for greater control and predictability in the conduct of their affairs.


The ‘closed world’ of cybernetic warfare

For Edwards, the rapid computerization of the military was central to the constitution of a ‘closed world’ discourse conveying ‘a radically bounded scene of conflict, an inescapably self-referential space where every thought, word, and action is ultimately directed towards a central struggle’.15 In a Cold War framed by the threat of nuclear devastation, computers acted as powerful tools and metaphors, promising ‘total oversight, exacting standards of control, and technical–rational solutions to a myriad of complex problems’.16

In analysing this ‘closed world discourse’ Edwards identifies several central features. Engineering and mathematical techniques which allow for the creation of models of aspects of the world as closed systems combine with technologies such as the computer which enable large-scale simulation, systems analysis and central control. A language of systems, gaming, communication and information is erected, privileging abstract formalisms over ‘experiential and situated knowledge’. Visions of omnipotence through air power and nuclear weapons assisted by ‘centralized, instantaneous, automated command and control’ are summoned in response to fears of an expansionist Soviet Empire.17

Edwards explicitly connects this worldview to the development of computers, which are seen as participating in the creation and sustaining of closed world discourse in two ways. ‘First, they allowed the practical construction of central real-time military control systems on a gigantic scale. Second, they facilitated the metaphorical understanding of world politics as a sort of system subject to technological management.’18 Hence the closed world is not simply the proliferation and imposition of the discursive framework of superpower confrontation on all international and domestic politics but also an understanding of the world that defines the latter as finite, manageable and computable. Edwards convincingly connects the computer sciences and systems theories to an overarching set of ‘tools, techniques, practices and languages which embody an approach to the world as composed of interlocking systems amenable to formal mathematical analysis’.19

As direct experience of total war receded and new inconceivably destructive weapons were developed, mathematical and logical models and simulations of warfare became fetishized for their promises of predictability and control. Defence intellectuals were their keenest practitioners and most outspoken proponents, wielding these instruments in the very highest spheres of executive power. Convinced with often near-religious fervour of the superiority of their method, they were determined to apply scientific rationalism to the entire spectrum of war. Sharon Ghamari-Tabrizi has noted that the quantitative studies they conducted and promulgated ‘often aimed toward an ideal of omniscient information management’.20

Founded on a Weltanschauung that drew its conviction from the practical engineering successes of the informational sciences, cybernetic warfare strove to shape military affairs into a perfectly modelled and controlled closed world. By importing this methodological and conceptual baggage, military thinkers internalized many of its assumptions. If, as ‘engineering approaches designed to solve real-world problems, systems theories tend in practice to assume the closure of the system they analyse’, military problems framed within the same conceptual and methodological apparatus naturally tended also to be perceived in terms of closed systems.21

Such closed systems lend themselves perfectly to modelling and simulation, the ability to run and re-run scenarios in the belief that all factors have been incorporated and appropriately weighted. Whatever the true usefulness of such models (and cases of models being ‘validated’ by real events are scant), they came to exert a strong influence on military leaders and policy makers, all looking for certainty and mastery over events, however illusory. Bill Nichols points to the role of cybernetic systems in creating a world of simulacra amenable to total control:

Cybernetic simulation renders experience, and the real itself, ‘problematic’. It draws us into a realm, a design for living, that fosters a fetishised relationship with the simulation as a new reality all its own, based on the capacity to control, within the domain of the simulation, what had once eluded control beyond it.22

It might be easy to dismiss all this as academic hyperbole if it were not for senior military commanders and politicians frequently preaching from the same holy book. In 1969, General William Westmoreland, Commander-in-Chief of US forces in Vietnam, famously prophesied the imminent arrival of the fully cyberneticized and frictionless battlefield:

On the battlefield of the future, enemy forces will be located, tracked, and targeted almost instantaneously through the use of data links, computer assisted intelligence evaluation, and automated fire control. With first round kill probabilities approaching certainty, and with surveillance devices that can continually track the enemy, the need for large forces to fix the opponent becomes less important. I see battlefields that are under 24-hour real or near-real time surveillance of all types. I see battlefields on which we can destroy anything we can locate through instant communications and almost instantaneous application of highly lethal firepower ... In summary, I see an Army built into and around an integrated area control system that exploits the advanced technology of communications, sensors, fire direction, and the required automatic data processing.23

As we shall see, such a drive for certainty and predictability was common amongst those who put faith in computerized systems and the analytical techniques of operations research and systems analysis in the 1950s and 1960s. For Edwards, Westmoreland’s speech epitomizes the ‘vision of a closed world, a chaotic and dangerous space rendered orderly and controllable by the powers of rationality and technology’.24

The appeal of such certainty to an institution in which the training of troops is designed ‘to reduce the conduct of war to a set of rules and a system of procedures – and thereby to make orderly and rational what is essentially chaotic and instinctive’ is obvious.25 It is therefore not surprising that the military embraced computers as the panacea to the eternal problem of uncertainty and unpredictability in war. Van Creveld facetiously sums up the attraction of computers to the military machine:

Computers with their binary on–off logic seem to appeal to the military mind. This is because the military, in order to counter the inherent confusion and danger of war, is forever seeking ways to make communications as terse and unambiguous as humanly possible. Computers by their very nature do just that. Had they only been able to stand at attention and salute, in many ways they would have made ideal soldiers.26

Cybernetic thought provides a comforting lens through which to view the use of force as it reduces military strategy to ‘a one-factor question about technical forces; success or failure is measured quantitatively ... machine-system meets machine-system and the largest, fastest, most technologically advanced system will win. Any other outcome becomes unthinkable’.27 This tendency to think of armies as ‘machine-systems’ is a product of the industrialization of warfare in which the ability to marshal resources and manage complex systems becomes the paramount factor determining military and strategic victory in total wars of attrition. The integration and operation of these increasingly complex systems would only be made possible by the development and extension of information and telecommunication technologies, leading to the establishment of centralized command-and-control structures.

From command to command-and-control

Command-and-control became the common term employed by the military brass to describe its function. The addition of the term ‘control’ to what previously had simply been called ‘command’ is revealing in itself; command suggests the mere transmission of orders while control suggests a process that involves a feedback mechanism allowing the controller to obtain new information from the system, adjust his orders accordingly, and thus direct his subordinates far more accurately. As Rochlin puts it:

command was historically an open cycle process: the commander set up the battle, gave instructions, provided for whatever contingencies could be planned for, and then issued the command to execute. After that, the ability to intervene was minimal. In contrast, control is a closed cycle process with feedback, analysis, and iteration; it was not possible even to consider the transition from command to command-and-control until modern technical means for intelligence and communications became available.28
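
The contrast Rochlin draws can be put in schematic form: command as a one-shot, open-loop process, control as an iterated loop of feedback and correction. The toy ‘battle’ model, the names and all numbers below are invented.

[code]
import random

def execute(orders, bias=-1.0, noise=0.2):
    # A systematic shortfall the commander cannot foresee, plus random friction.
    return {k: v + bias + random.uniform(-noise, noise) for k, v in orders.items()}

def command(orders):
    # Open cycle: issue orders once; after that, no ability to intervene.
    return execute(orders)

def command_and_control(orders, intended, rounds=5, gain=0.7):
    # Closed cycle: observe the outcome, compare with intent, reissue orders.
    outcome = execute(orders)
    for _ in range(rounds):
        orders = {k: orders[k] + gain * (intended[k] - outcome[k]) for k in orders}
        outcome = execute(orders)
    return outcome

plan = {"advance_km": 5.0}
print("command only:       ", command(dict(plan)))
print("command-and-control:", command_and_control(dict(plan), intended=plan))
[/code]

The closed loop learns to compensate for the shortfall; the open loop cannot, which is precisely the attraction of ‘control’.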

Integration of armed forces into a coherent system maintained by information and communication technologies (ICTs) amenable to centralized control is a necessary feature of any modern industrial army. As the range and specialization of military personnel and equipment increase along with the concomitant logistical challenges characteristic of industrial warfare, reliable channels of communication become essential. The limitations of early ICTs in terms of their availability and the volume of information that could be processed and transmitted made centralization all the more appealing since it reduced the number of potential channels of communication.

However, there were also specific geopolitical conditions which combined with the new technology of nuclear weapons to particularly drive centralization of the military in the post-war era. Indeed, concerns over the eventuality of nuclear war, be it intentional or accidental, were omnipresent in the 1950s and continued, somewhat abated, throughout the Cold War. Due to the incredible destructive power of nuclear weapons and the speed of the delivery systems, it became crucial to ensure very tight central control over their use as well as to develop effective early warning mechanisms for a credible nuclear deterrent.

With the appearance of jet-powered aircraft, the time available for detection and interception of bombers potentially carrying nuclear weapons shrank and existing command structures were no longer adequate. Computers presented a clear technological solution to the problem of effective and rapid processing and transmission of both incoming information (provided by radar and observation posts) and outgoing information (sent to anti-aircraft defences such as interceptor fighter planes or land-based weapons). As an article in the Air University Quarterly Review of Winter 1956–57 put it, ‘the speed with which these weapons could react, each to the other, seems to indicate that only a machine with vast memory and instant response could be expected to indicate a successful counter strategy in sufficient time to be useful’.29

Within a year of this article, the Air Force announced SAGE (Semi-Automated Ground Environment), the first computer-based command, control and communications system, for the purpose of constituting a centralized air defence network. Based on information from radar echoes, the calculation of precise positions and speeds of multiple planes required massive computing power, while the efficient and prompt transmission of this data to anti-aircraft weapon systems necessitated a reliable communications network.

SAGE broke significantly with existing computer technology because of its requirement for real-time processing and responses to user inputs. Until then, the norm was batch processing, the execution of series of non-interactive jobs all at one time. Users programmed the computer, entered the data to be processed, and waited for its output to be generated and displayed via print-outs. Real-time processing required a revolutionary user interface: SAGE presented data to operator stations via a cathode ray tube and responded to requests for additional information from operators handling light guns directed at the screen (Figure 2). The resulting decrease in the delay between inputs and outputs created a closer cybernetic loop between computer and user which has only gained in complexity and intimacy since (Figure 3).
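
The shift the paragraph describes, from batch to real-time processing, can be caricatured in a few lines. This is a schematic contrast only; the event format and handler are invented.

[code]
import queue

def batch_run(jobs):
    # Pre-SAGE norm: queue the jobs, run them all, read the print-out later.
    return [job() for job in jobs]

def real_time_run(events, handler):
    # SAGE-style: handle each radar echo as it arrives and update the display.
    q = queue.Queue()
    for e in events:
        q.put(e)
    while not q.empty():
        print("display update:", handler(q.get()))

print(batch_run([lambda: 2 + 2, lambda: 3 * 3]))
real_time_run(
    events=[(10, 52.1, 4.8), (11, 52.3, 4.9)],   # invented (time, x, y) echoes
    handler=lambda e: {"t": e[0], "pos": (e[1], e[2])},
)
[/code]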

Although obsolete by the time of its completion in 1963, since the deployment of ICBMs by the Soviet Union rendered anti-aircraft defences largely irrelevant, the system drove the development of crucial information technologies as well as the adoption of certain organizational principles. As such, ‘SAGE was less important as an actual defence system than as a symbol of things to come ... it is the idea of automated control and information processes – the concept itself – that has shaped, more than any technology, the contemporary US armed forces’.30 In total, between $8 and $12 billion were spent on developing and implementing SAGE, a higher level of expenditure than had been dedicated to the Manhattan Project.
 
Figure 2 SAGE operator and his console (circa 1959). Source: Picture used with the permission of The MITRE Corporation. Copyright © The MITRE Corporation. All Rights Reserved.
Figure 3 SAGE man-machine cybernetic loop. Source: Picture used with the permission of The MITRE Corporation. Copyright © The MITRE Corporation. All Rights Reserved.

SAGE was followed by numerous similar projects, most notably the World Wide Military Command and Control System (WWMCCS) in 1962. Progressively extended from Strategic Air Command to the rest of the military, WWMCCS allowed for centralized global command-and-control of American troops through a broad spectrum of telecommunication systems including military satellites, marking the extension of command-and-control structures across the globe and establishing total cybernetic system closure over the world.

Edwards sees these cybernetic principles as applied throughout the military, with machines, bodies, and organizations constituting

command-control-communications systems [which] operated as a nested hierarchy of cybernetic devices. Airplanes, communications systems, computers, and anti-aircraft guns occupied the micro levels of this hierarchy. Higher-level ‘devices’, each of which could be considered a cyborg or cybernetic system, included aircraft carriers, the WWMCCS, and NORAD early warning systems. At a still higher level stood military units such as battalions and the Army, Navy, and Air Force themselves. Each was conceptualized as an integrated combination of human and electronic components, operating according to formalized rules of action. Each level followed directives taken from the next highest unit and returned information on its performance to that unit. Each carried out its own functions with relative autonomy, issuing its own commands to systems under its control and evaluating the results using feedback from them.31

This understanding of military operations as interlocked systems obeying formalized rules invited, and indeed required, their analytical treatment through the lenses of mathematics and logic. In order to determine the formalized rules of action according to which each level of the hierarchy of cybernetic devices should operate on the basis of incoming information, it was necessary to identify the parameters and signals upon which to act. Exhaustive models of the behaviour of both the US military and that of its potential enemies were thus developed, thereby reducing war to a complex equation to be resolved by a technoscientific priesthood.

Operations research and systems analysis: solving the war equation


Modern war has become too complex to be entrusted to the intuition of even the most experienced military commander. Only our giant brains can calculate all the possibilities.

(John Kemeny, RAND Consultant and co-creator of the BASIC computer language, 1961)32

‘The representation and analysis of real world processes using logic, mathematics and computer science’, Operations Research (OR) and its offspring Systems Analysis (SA) transformed the manner in which war was prepared for, planned and imagined.33 Despite initial resistance by officers, statistical control, OR and SA gained a rapidly growing influence over planning and operations in the post-war era. By 1962 this approach had become so popular and ubiquitous that OR could proclaim itself to be ‘the attack of modern science on complex problems arising in the direction and management of large systems of men, machines, material and money in industry, business, government and defence’.34 However, as De Landa has observed, it was in the military that this science of organization was pioneered and its broad application effectively marked the transfer of ‘command and control structures of military logistics to the rest of society and the economy’.35

Operations research seeks to improve operations by studying an entire system rather than exclusively concentrating on specific elements. In the context of the transfer of operations research to business in the 1950s–60s, but equally applicable to military affairs, Rochlin tells us that ‘the new agenda differed from the old in a major expansion of the scope of analysis; instead of treating the firm as a series of isolated, interacting operations to be integrated from the top, it was now visualised as a single, complex, interrelated pattern of activities, to be analysed, coordinated and optimised as a whole’.36 In this sense OR and SA are very much in the cybernetic mould in their belief in a whole that is superior to the sum of its parts and their assumptions about the closure of the systems being modelled. For Stafford Beer, cybernetics was ‘the science of which operational research is the method’:

Operational research comprises a body of methods which cohere to provide a powerful tool of investigation. Cybernetics is a corpus of knowledge which might reasonably claim the status of a science. My contention is that the two are methodologically complementary; that the first is the natural technique in research of the second, and the second the natural embodiment in science of the first. By definition, each is concerned to treat a complex and interconnected system or process as an organic whole.37

The computer is here again central since the optimization of the mathematical models constructed by researchers is achieved through the use of computer-based algorithms which calculate changes in the system’s behaviour resulting from any changes in the multiple variables that constitute the models. Without the computer, the widespread application of OR and SA and the increasing complexity of the models developed would have been impossible. In this sense, these analytical techniques are inseparable from the technologies that support them – the viability of models is dependent on their ability to be translated into computer code, that is into a program that can convert quantifiable inputs into quantifiable outputs. As two system analysts put it, ‘a model without numbers cannot be manipulated so measurement and quantification is a fundamental part of the description resulting from analysis, and the basis of the evaluation of systems design’.38 Thus that which cannot be assigned a number or expressed in terms of logical relationships is necessarily excluded.
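
A toy version of the kind of model this dictum licenses: everything is a number, and the ‘analysis’ is an optimization over those numbers. The systems, costs, and effectiveness figures below are entirely invented, and brute force stands in for the optimization routines of the period.

[code]
# Toy cost-effectiveness study: choose a mix of two (invented) systems
# maximizing expected target coverage under a budget constraint.

COST = {"bomber": 6.0, "missile": 2.0}      # $ millions per unit (invented)
EFFECT = {"bomber": 5.0, "missile": 1.4}    # expected targets per unit (invented)
BUDGET = 30.0

best = max(
    ((b, m) for b in range(11) for m in range(16)
     if b * COST["bomber"] + m * COST["missile"] <= BUDGET),
    key=lambda bm: bm[0] * EFFECT["bomber"] + bm[1] * EFFECT["missile"],
)
print(f"optimal mix: {best[0]} bombers, {best[1]} missiles")
[/code]

Anything without a COST or EFFECT number simply cannot enter the calculation, which is exactly the exclusion the paragraph describes.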

Operations research was pioneered by the British in the late 1930s and was enthusiastically embraced by the US military during World War II. OR Air Force studies multiplied exponentially in this period:

offensive ones dealing with bombing accuracy, weapons effectiveness, and target damage ... defensive ones dealing with defensive formations of bombers, battle damage and losses of our aircraft, and air defense of our bases ... studies of cruise control procedures, maintenance facilities and procedures, accidents, in-flight feeding and comfort of crews, possibility of growing vegetables on South Pacific islands, and a host of others.39

The initial success of this mathematical approach can be accounted for by the particular nature of strategic bombing and the defensive measures deployed against it. The development of air defences necessitated the establishment of socio-technical systems of unprecedented complexity. The integration of radar required the creation of an extended coverage of airspace by individual radar stations connected to a communications network transmitting information to a centralized headquarters for processing and then onwards to air defence units such as anti-aircraft guns or fighter planes. Faced with such heavy defensive systems, it became necessary for bombers to fly at altitudes that were frequently too high for the crews to directly observe their targets. Consequently, operations had to be planned days ahead, as well as coordinated and integrated with ground facilities, air support and a host of anti-radar measures.

‘Thus strategic bombing not only found itself opposed by a technological system but itself assumed all the characteristics of such a system.’40 War in this arena therefore took on the characteristics of a battle of attrition between two competing technological systems. This configuration naturally lent itself to being run by centralized, statistically based forms of management.

Naval warfare, particularly anti-submarine activities, was the other field in which operations research showed impressive results during World War II. When, on the recommendation of operations researchers, British anti-submarine aircraft reduced the depth at which depth charges were set to detonate, the increase in successful attacks was so great that the German military became convinced the British were using a new type of explosive. OR was also able to reduce the loss rate of naval convoys when analysts realized that larger convoys suffered lower percentage losses than small convoys.
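
The article does not spell out why larger convoys fared better, but one common reconstruction is geometric: ships fill an area while escorts screen a perimeter, and the perimeter grows only with the square root of the number of ships. A hedged back-of-the-envelope version, with invented units:

[code]
import math

def perimeter_per_ship(n_ships, spacing=1.0):
    # Ships pack an area proportional to n; the rim to be screened grows ~ sqrt(n).
    radius = spacing * math.sqrt(n_ships / math.pi)
    return 2 * math.pi * radius / n_ships

for n in (20, 40, 80):
    print(f"{n} ships: {perimeter_per_ship(n):.2f} units of rim per ship")
[/code]

The rim to be defended per ship falls roughly as 1/√n, so the same escort force covers each ship better as convoys grow.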

The greater simplicity and homogeneity of the aerial and marine environments was certainly a crucial factor in the success of OR since warfare in those milieus was easier to model mathematically than land operations. More generally, modern war involved ‘more repetitive operations susceptible to analysis’ in that ‘a men-plus-machines operation can be studied statistically, experimented with, analyzed, and predicted by the use of known scientific techniques just as a machine operation can be’.41 Scientists felt that they were ideally trained to grapple with the problems of modern war with their ability ‘to get down to the fundamentals of a question – to seek out broad underlying principles through a mass of sometimes conflicting and irrelevant data ... with the result that they were often able to discredit what the military regarded as “commonsense” solutions’.42

After the war, operations research (the optimizing of existing systems) soon transformed into systems analysis (the design of the most effective system for the accomplishment of a defined objective), thereby granting analysts planning powers. After 1945, procurement cycles lengthened in terms of the time necessary for the research and development, production, and deployment of any new technology. Furthermore, closer cybernetic integration of vehicles, projectiles, communications, radar, and electronic counter-measures created weapon systems whose components all needed to be designed concomitantly. This naturally empowered those analysts who proposed a scientific methodology according to which technological and budgetary decisions could be determined. Systems analysis was perfectly suited to this task since its microeconomic logic and optimization routines enabled planners to determine the best allocation of a limited supply of resources. Reviewed across a range of possible future security and military environments, alternative systems could be compared and judged in terms of efficiency and cost.

This approach promoted an understanding of warfare which reduced it to a mathematical problem with a number of manipulable variables and a production model that could be scientifically managed along Taylorist ideals. The RAND think-tank became the home of systems analysis, often to the detriment of any other forms of thinking about national security. As Kaplan puts it, ‘for an organization dominated by mathematicians, systems analysis appeared to be the way to get the scientific – the right – answer. Projects that involved no systems analysis, such as most of the work produced by the social science division, were looked down upon, considered interesting in a speculative way at best’.43
« Last Edit: July 28, 2010, 06:58:54 pm by Anti_Illuminati »

Anti_Illuminati
« Reply #1 on: July 28, 2010, 04:24:21 pm »

Models grew to astonishing levels of complexity, fuelled by the desire to create an accurate simulation of conflict, a scientific understanding of a quite literal war machine. The father of systems analysis, RAND researcher Ed Paxson, was symptomatic of this tendency in the obsessive minutiae of his planning for World War III:

His dream was to quantify every single factor of a strategic bombing campaign – the cost, weight, and payload of each bomber, its distance from the target, how it should fly in formation with other bombers and their fighting escorts, their exact routing patterns, the refueling procedures, the rate of attrition, the probability that something might go wrong in each step along the way, the weight and inaccuracy of the bomb, the vulnerability of the target, the bomb’s ‘kill probability,’ the routing of the planes back to their bases, the fuel consumed, and all extraneous phenomena such as the weather – and put them all into a single mathematical equation.44

Planning for nuclear war was a particularly urgent task during the Cold War and required continuous reviewing as the technology and availability of bombs and missiles were subject to rapid change. The explosive power of individual devices – first triggered by nuclear fission in atomic bombs, then nuclear fusion for hydrogen bombs – escalated vertiginously. Launching systems gained in range and accuracy: with the evolution from the medium-range bomber in 1945 to the intercontinental ballistic missile in 1957, full-blown nuclear war could be initiated within a few hours of the executive decision. In the absence of any battlefield experience of nuclear weapons, systematic mathematical calculation of the theoretical damage they would inflict on urban areas and troops was the only means by which to assess defence needs. Consequently, nuclear payloads, delivery systems, military and civilian defensive measures, along with the strategies and tactics within which these would be inserted, were given the systems analysis treatment. Systems analysis, game theory and the whole range of available mathematical and statistical instruments were the only means to rationalize Armageddon and ‘think the unthinkable’.45 If attempts at developing strategies of graduated nuclear war were made, the main priority of the analysts was to enforce effective deterrence and preserve the ‘delicate balance of terror’46 over and above any notion of winning the face-off, to ensure that the ‘Cold War system’ could return to homeostatic equilibrium and ward off the possibility of it exceeding the bounds beyond which it would self-destruct in an apocalyptic spasm dubbed ‘wargasm’ by the nuclear strategist Herman Kahn.

Nor was systems analysis limited to the nuclear aspects of warfare; the entire spectrum of conventional military operations was subject to its scrutiny. As RAND’s first vice-president Alan Henderson declared, ‘systems analysis seeks to cover the full range of possible future weapons characteristics and simultaneously analyse each set of possible characteristics in all possible tactics and strategies of employment’.47 However, while the consideration of the relative merits of two bombers across ten variables, as was done during World War II, already yielded over 1,000 combinations, raising the number of systems being considered by only four resulted in over a million combinations for evaluation.48
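
The arithmetic behind these figures is the familiar exponential blow-up. On the (hedged) assumption of two-valued variables, ten of them already give over 1,000 combinations, and every additional variable doubles the space:

[code]
from itertools import product

# Ten two-valued design variables: 2**10 = 1,024 combinations.
print(sum(1 for _ in product((0, 1), repeat=10)))

# Ten more doublings reach the 'over a million' regime: 2**20.
print(2 ** 20)
[/code]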

Furthermore, the rapidly evolving characteristics of the weapons required a constant revision of their potential impact on existing tactical and strategic plans. This exponential increase in possible permutations made this a task only achievable by the computer. Thanks to its formidable processing power, the computer allowed for the creation of complex models with multiple variables, providing a rapid calculation of any changes in their values.

The same modelling techniques were also employed for a vast range of wargames which simulated an array of operations from the tactical to the operational to the strategic, from individual battalions to anti-aircraft defences to global geopolitical exercises. While war-gaming had long been practised – the Germans had been enthusiasts of Kriegsspiel from the nineteenth century onwards – the new generation of wargames and simulations were an extension of OR and SA since their models relied on the same methodology. Ghamari-Tabrizi points to the manner in which these wargames constituted their own closed worlds:

Following the thread of systems thinking, the gamers tried to shoehorn everything of importance into game design and play. Since a major war would batter every department of life, they were tempted to expand their model into infinitely complex details in the simulation of reality. But at the same time, they were determined to set upper and lower boundaries, limits and constraints of every kind onto that surging impulse towards the Weltbild. In other words, in war game design, one makes out a wish to catch a richly furnished world, but one sealed off like a terrarium or a tableau in a paperweight. This snug little world, in which the totality could be grasped all at once, encompasses the universe of miniature life.49

Computers became increasingly employed in wargames, first to calculate the outcome of any decision by the players by processing it through complex models of warfare and international relations, and second as an interface for the players. With the aim of providing greater realism, the environments of decision makers were often reproduced painstakingly. Mediated through computerized displays and interfaces, real wartime situations and simulations would be largely indistinguishable. War-gamers at RAND would eventually grant computers an even greater role by making them fully-fledged players.

Faced with human players who would persistently refuse to cross the nuclear threshold in simulated exercises, the simulationists developed artificial intelligences that could play the role of the Soviet Union or United States, creating a variety of iterations characterized by different ‘personalities’ and degrees of willingness to resort to force.50 Computers were effectively the ideal war-gamers: cold, logical, purely instrumental and devoid of the messy cultural, social and historical attributes that plagued human players and could not be mathematically modelled. Fully computerized wargames allowed for the rapid testing of an entire range of weapon system characteristics, logistics, tactics, and strategies for the purpose of identifying the optimal combination.
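
A toy version of such a machine player, with the ‘personality’ reduced to a single escalation threshold; the names and numbers are invented for illustration.

[code]
import random

class MachinePlayer:
    def __init__(self, name, escalation_threshold):
        self.name = name
        self.threshold = escalation_threshold    # low = hawkish, high = dovish

    def move(self, crisis_level):
        # Escalate whenever the perceived crisis exceeds the threshold.
        return "escalate" if crisis_level > self.threshold else "hold"

hawk = MachinePlayer("SIM-RED", escalation_threshold=0.3)
dove = MachinePlayer("SIM-BLUE", escalation_threshold=0.9)
crisis = random.random()
print(round(crisis, 2), hawk.move(crisis), dove.move(crisis))
[/code]

Unlike the human players, nothing stops such a player from crossing the nuclear threshold: its reluctance is just a parameter.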

But the use of wargames was not restricted to testing models; they could also provide their own facts and statistics for interpretation and inclusion in the models. As one war-gamer observed: ‘as we recede from such sources of empirical data as World War II and Korea, an ability to generate synthetic battlefield facts becomes increasingly important.’51 These synthetic facts drawn from the experiences of simulated conflict could then be fed back into further models of war – simulation begetting simulation, the constitution of a hyperreal feedback loop.

If institutional resistance from the Air Force – RAND’s main sponsor – to studies that attempted to model warfare in its entirety forced the organization into publicly downgrading the ambition and scope of its research projects, in practice the systemic interrelationship of areas of study and the commonly held belief of analysts in the superiority of their methodology made such restrictions difficult to maintain. Alain Enthoven, Deputy Assistant Secretary for Systems Analysis under McNamara, once pointed out to a General that ‘I have fought as many nuclear wars as you have’.52

In fact, given his time at RAND modelling it and playing wargames, Enthoven may well have believed that he had actually fought more nuclear wars, albeit simulated. Enthoven’s quip was symptomatic of the attitude of RAND analysts towards the military brass, convinced as they were that ‘in order to approach nuclear war properly, one had to become a perfect amnesiac, stripped of the intuitions, judgments, and habits cultivated over a lifetime of active duty’.53 Combat experience and traditional common wisdom of the military were thus devalued in favour of the cool rational calculations of the defence intellectual.

In 1961, this latter vision appeared to have triumphed over the generals as Robert McNamara was made Secretary of Defense and proceeded to apply the paraphernalia of systems analysis across the military more systematically than ever before.

McNamara had first risen to prominence during World War II, distinguishing himself as one of the most brilliant analysts in the Statistical Control Office, where he had conducted operations research on Air Force operations. He was notably involved in the strategic bombing campaign against Japan, recommending the switch to firebombing and lower-altitude bombing which the notorious Air Force General Curtis LeMay (later head of Strategic Air Command) adopted with devastating results for Japanese cities.

After the war, he left the armed forces to join the Ford Motor Company, applying the same principles of scientific management with great success, before being offered the role of Secretary of Defense by President Kennedy. Surrounding himself with a team of ex-RAND analysts who shared his worldview, McNamara set out to extend these principles to all branches of the military. A controversial figure, particularly unpopular with certain sections of the military over which he asserted previously unseen levels of civilian control, McNamara was once referred to as a ‘“human IBM machine” who cares more for computerised statistical logic than for human judgments’.54

McNamara instigated the Planning, Programming and Budgeting System (PPBS) in 1962, perhaps his most lasting legacy, institutionalizing systems analysis in the decision-making process of military planning and procurement. With PPBS, cost–benefit and cost-effectiveness analysis were applied across all branches of the military so that various military programmes from different services could be evaluated, compared, and granted funding accordingly.55 PPBS was subsequently extended across the federal bureaucratic structure, in particular the social welfare agencies of the Departments of Health, Education and Welfare and the Office of Economic Opportunity. Department of Defense Comptroller Charles Hitch (ex-RAND) insisted that systems analysis acted merely as an instrument assisting decision makers rather than being the decisive factor in determining spending plans. Gregory Palmer agrees that PPBS was often more of a heuristic or ideal, but that ‘in its pristine form, PPBS was a closed system, rationally ordered to produce carefully defined outputs’.56 As such, critics claimed its influence was pervasive and dangerously misleading if applied uncritically.

In his Farewell Address to the people on 17 January 1961, President Eisenhower had famously warned against ‘the acquisition of unwarranted influence, whether sought or unsought, by the military–industrial complex’. Eisenhower also spoke of the ‘danger that public policy could itself become the captive of a scientific–technological elite’.57 A military man, Eisenhower was most likely at least in part thinking of the operations researchers and system analysts who rose to prominence during his presidency and were about to take control of the Pentagon, equipped with instruments they believed could be used to tackle all social problems. ‘The military effect of cybernetics and computers did more than bring about changes in administration, logistics, communications, intelligence and even operations’, Van Creveld tells us, ‘they also helped a new set of people to take charge, people who thought about war – and hence planned, prepared, waged, and evaluated it – with the aid of fresh criteria and from a fresh point of view.’58

Because of the scientific and mathematical methodology upon which this new point of view relied, analysts systematically privileged the quantifiable aspects of warfare:

With computers acting as the stimulus, the theory of war was assimilated into that of microeconomics ... Instead of evaluating military operations by their product – that is, victory – calculations were cast in terms of input–output and cost effectiveness. Since intuition was replaced by calculation, and since the latter was to be carried out with the aid of computers, it was necessary that all the phenomena of war be reduced to quantitative form. Consequently everything that could be quantified was, while everything that could not be tended to be thrown onto the garbage heap.59

Under the impulse of computer modelling and systems analysis, the understanding of war which emerged during the Cold War was therefore frequently biased towards those elements which could be quantified. But even that which could be quantified could not necessarily be precisely measured or estimated and would frequently only be the product of more or less inspired guesswork.

For Solly Zuckerman:

operational analysis implies a kind of scientific natural history. It is a search for exact information as a foundation for extrapolation and prediction. It is not so much a science in the sense of a corpus of exact knowledge, as it is the attempted application of rigorous methods of scientific thought and action to new and apparently unique situations. The less exact the information available for analysis, the less it is founded on experience, the more imprecise are its conclusions, however sophisticated and glamorous the mathematics with which the analysis is done.60

As such, the outcome of SA studies or war games was heavily dependent on the assumptions underpinning their models, some acknowledged by the analysts, others largely concealed or unquestioned. Driven by their desire for predictability, analysts constrained uncertainty by either setting the possible variations of factors within clearly delineated numerical ranges and probability sets or by simply discounting all those elements that could not be treated in this bounded way. Princeton academic Klaus Knorr noted some of the uncertainties frequently neglected by SA:

Costs may be uncertain, technology may be uncertain, the properties of military conflict situations may be uncertain, and the reactions and capabilities of the potential enemy nations are apt to be uncertain. The last uncertainty is of particular import; it is imperative that military choices be examined within a framework of interaction. An opponent's responses to our choices may, after all, curtail or altogether nullify the advantage we seek. Nor is it enough to recognize the conflict aspects of the problem. The possibilities of tacit or formal co-operation may be equally significant.61
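
Knorr's warning can be restated computationally. In a typical SA study, each uncertainty was 'constrained' by assigning it a fixed numerical range and sampling within it; whatever resisted such bounding, above all enemy reaction, simply dropped out of the model. A hedged sketch of the procedure (all distributions and figures are invented):

    import random

    random.seed(1)

    # A toy SA-style estimate: each uncertain factor is forced into a bounded
    # numerical range and sampled; whatever cannot be bounded this way
    # (enemy reaction, politics) is left out of the model altogether.
    N = 10_000
    outcomes = []
    for _ in range(N):
        sortie_cost = random.uniform(30, 50)     # $K per sortie (assumed range)
        kill_prob = random.uniform(0.1, 0.3)     # per-sortie kill probability (assumed)
        target_value = random.uniform(100, 200)  # $K per target destroyed (assumed)
        outcomes.append(kill_prob * target_value / sortie_cost)

    mean = sum(outcomes) / N
    favourable = sum(o > 1 for o in outcomes) / N
    print(f"mean 'return' per dollar spent: {mean:.2f}")
    print(f"runs judged cost-effective: {favourable:.0%}")

The output looks precise, but it is only as good as the assumed ranges; move the bounds and the verdict moves with them, which is Zuckerman's point about glamorous mathematics resting on inexact information.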

McNamara himself came to be disillusioned with the approach he had championed, recognizing the impossibility of making war into a fully predictable instrument of policy: 'war is so complex, it's beyond the ability of the human mind to comprehend all the variables. Our judgement, our understanding, are not adequate.'62 McNamara was to learn this lesson during his tenure as Secretary of Defense between 1961 and 1968, during which the United States was progressively sucked into a Vietnam War it could not win, despite (or perhaps because of) its army of system analysts in the Pentagon.

Vietnam: cybernetic warfare fails

The limits of the centralizing cybernetic model became clear in Vietnam, although the large part it played in the US defeat has often been disregarded. James Gibson has perhaps done the most to document the dramatic failure of 'technowar', which conceived of war as 'a production system that can be rationally managed and warfare as a kind of activity that can be scientifically determined by constructing computer models'.63 The principles of OR and SA were applied to provide analysis of the conflict and guidance to the policy makers, while cybernetic command-and-control technologies were widely deployed. What developed in Vietnam can appropriately be described as an 'information pathology': an obsession with statistical evaluations and with directing the war from the top, perceived as the point of omniscience, when in practice soldiers on the ground often understood far better than their superiors how badly the war was going.

Between 1967 and 1972, the Air Force ran Operation Igloo White at a cost of nearly $1 billion a year. An array of sensors designed to record sound, heat, vibrations, and even the smell of urine fed information to a control centre in Thailand, which passed the resulting targeting information on to patrolling jet aircraft (even the release of bombs could be controlled remotely). This vast cybernetic mechanism was designed to disrupt the Ho Chi Minh Trail, a network of roads and trails providing logistical support to the North Vietnamese. At the time, extravagant claims were made about the performance of the system, with the reported number of trucks destroyed in 1970 exceeding the total number of trucks believed to exist in all of North Vietnam. In reality, far fewer truck remains were ever identified, there were probably many false positives in target identification, and the North Vietnamese and their Laotian allies became adept at fooling the sensors. In spite of all this, the official statistics still trumpeted a 90 per cent success rate in destroying equipment travelling down the Ho Chi Minh Trail, an assertion difficult to sustain given that the North Vietnamese conducted major tank and artillery operations in South Vietnam in 1972. Edwards incisively observes that 'Operation Igloo White's centralized, computerized, automated, power-at-a-distance method of "interdiction" resembled a microcosmic version of the whole US approach to Vietnam'.64
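
The inflation of the truck count is easy to reproduce in miniature: where every sensor activation is engaged and every strike credited, false positives accumulate unchecked. A toy simulation (all rates are invented for illustration and imply nothing about the actual sensor performance):

    import random

    random.seed(7)

    # Toy model of Igloo White-style reporting: sensors trigger on real
    # trucks but also on decoys, animals, and weather; every strike on a
    # trigger is credited as a destroyed truck. All rates are hypothetical.
    real_trucks = 2_000                 # trucks actually on the trail
    nights = 300

    reported_kills = 0
    actual_kills = 0
    for _ in range(nights):
        real_triggers = random.randint(20, 60)      # genuine detections
        false_triggers = random.randint(50, 150)    # decoys, animals, noise
        strikes = real_triggers + false_triggers    # every trigger is engaged
        reported_kills += strikes                   # ...and every strike 'counts'
        actual_kills += int(real_triggers * 0.3)    # only a fraction are real kills

    print(f"reported kills: {reported_kills}")
    print(f"plausible actual kills: {actual_kills}")
    print(f"reported kills / trucks in theatre: {reported_kills / real_trucks:.1f}x")

With no independent check on the feedback loop, the reported total can exceed the entire truck population several times over, which is exactly the pattern of the 1970 claims.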

Gibson submits that technowar altered not only the conduct of war but even the likelihood of the use of force: 'by adopting microeconomics, game theory, systems analysis, and other managerial techniques, the Kennedy administration advanced "limited" war to greater specificity, making it seem much more controllable, manageable, and therefore desirable as foreign policy.'65 Henry Kissinger illustrated this very point when he claimed in 1969 that 'a scientific revolution has, for all practical purposes, removed technical limits from the exercise of power in foreign policy'.66

The US bombing campaign in Vietnam obeyed a gradation in the use of force through which signals could be sent to the North Vietnamese. This amounted to a communicative theory of war in which the level of violence could be alternately ratcheted up or eased off according to the message to be sent. In this manner the government wished to convince the North Vietnamese that they could not win, thereby forcing them to negotiate and steering them towards the desired behaviour.

As Kissinger put it, 'in a limited war, the problem is to apply graduated amounts of destruction for limited objectives and also to permit the necessary breathing spaces for political contacts'.67 This thinking emerged from the attempts of defence intellectuals, frustrated by the paradoxical powerlessness of nuclear weapons so destructive they could not be used, to theorize and rationalize their limited use against the Soviet Union as bargaining chips in an eventual showdown. This strategy was ultimately abandoned because of the impossibility of guaranteeing that a limited nuclear war would not rapidly escalate into an apocalyptic war of extermination, but the underlying logic resurfaced in the context of the Vietnam War.

By applying bargaining models based on game theory, which assumed a common utility-maximizing rationality and cost–benefit framework of analysis on all sides, strategists erected an understanding of the enemy that was a mere reflection of their own worldview. This perception was further bolstered by the military and civilian leadership's conception of war as the management of complex industrial systems: 'Limited war fought as a war of attrition means that only information about technological–production systems will count as valid knowledge about the enemy. For the military as well as civilian, the enemy becomes a mirror image of ourselves, only "less" so.'68 Since military effectiveness could only be measured by the yardstick of 'technological–production systems', the North Vietnamese were necessarily inferior, and victory was the only conceivable outcome for the American war machine.
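
The mirror-imaging Gibson describes can be made explicit in game-theoretic terms: assign the enemy the analyst's own cost–benefit payoffs, and the model can only predict capitulation under sufficient pressure. A minimal sketch (the payoff numbers are invented; only the structure matters):

    # A toy 'graduated pressure' game: the US chooses a bombing level and the
    # model predicts Hanoi's response. The mirrored model gives the enemy the
    # analyst's own cost-benefit payoffs; the alternative gives political
    # objectives a weight no level of material damage can offset.
    # All payoff numbers are hypothetical.

    damage = {"restrained": 1, "moderate": 5, "heavy": 9}
    COST_OF_NEGOTIATING = 3      # mirrored model's assumed political cost
    POLITICAL_STAKES = 100       # what capitulation actually costs Hanoi

    def mirrored(level, negotiate):
        return -COST_OF_NEGOTIATING if negotiate else -damage[level]

    def actual(level, negotiate):
        return -POLITICAL_STAKES if negotiate else -damage[level]

    for level in damage:
        pred = max([True, False], key=lambda n: mirrored(level, n))
        real = max([True, False], key=lambda n: actual(level, n))
        print(f"{level:10s}: model predicts negotiation={pred}, "
              f"enemy actually negotiates={real}")

Once the opponent's utility function is assumed rather than investigated, graduated escalation appears guaranteed to produce negotiation; the model simply cannot represent an enemy who does not share its arithmetic.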

Cybernetic warfare’s closed self-referentiality was thus a major factor in bringing about its defeat in Vietnam, blinding its proponents  to the successful asymmetric strategy deployed by the Vietcong. Designed to fight total war against the Soviet Union, cybernetic warfare was susceptible to spectacular inefficiency and failure when engaged in a low-intensity conflict in which a dispersed enemy merged into a complex jungle environment. Attempts to simplify the battlespace through the practice of deforestation and the use of Agent Orange made little difference against an opponent that played to its strengths and understood its enemy far better than the Americans did. Witness North Vietnamese General Vo Nguyen Giap’s piercing observation: ‘The United States has a strategy based on arithmetic. They question the computers, add and subtract, extract square roots, and then go into action. But arithmetical strategy doesn’t work here. If it did, they would already have exterminated us with their airplanes.’69

The American reliance on information technologies to direct the war brought its own problems. For one, despite the development of ICTs, volumes of information escalated so fast that saturation and bottlenecks resulted, especially within highly centralized command-and-control structures. Intelligence on Vietcong positions and movements frequently arrived too late to be actionable, delayed in an information-processing infrastructure unable to handle all the data it was fed. This was despite the creation of an unprecedented telecommunications network in a field of operations, with electronic communications gear accounting for a third of all major items of equipment brought into the country and the first use of satellite communications for military purposes in 1965.70 As Arquilla and Ronfeldt recognize, 'informational overload and bottlenecking has long been a vulnerability of centralized, hierarchical structures for command and control'.71
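
The bottleneck Arquilla and Ronfeldt describe is elementary queueing behaviour: once reports arrive faster than a centralized node can process them, the backlog, and with it the age of the intelligence, grows without bound. A schematic sketch (the rates are invented):

    # Toy model of a centralized intelligence node: field reports arrive at a
    # fixed rate but the node can only process so many per hour. Once arrivals
    # exceed capacity, the backlog (and the age of the oldest unread report)
    # grows without bound. All rates are hypothetical.

    arrival_rate = 120   # reports per hour reaching the centre
    capacity = 90        # reports per hour the centre can analyse

    backlog = 0
    for hour in range(1, 25):
        processed = min(backlog + arrival_rate, capacity)
        backlog += arrival_rate - processed
        if hour % 6 == 0:
            print(f"hour {hour:2d}: backlog={backlog:4d} reports, "
                  f"delay ~{backlog / capacity:.1f} h")

Intelligence that is hours or days old by the time it clears the queue is, operationally, hardly intelligence at all.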

Central to this was the fact that information gathering was frequently measured by quantity rather than quality. The pressure on infantry units to produce detailed reports of their operations, and particularly to match the 'production' targets in terms of enemy casualties, led to wildly inaccurate and overblown estimates that masked the extent to which the American strategy was failing. Gibson points to a related problem in the intelligence field, where operations were gauged primarily on data volumes:

Collection departments received most agency budgets and collection departments represented their progress in terms of how many ‘bits’ of information they collected, or how many hours  of radio messages were recorded. Since their  work was so tangible and measurable, collection departments got the most. As one senior staff member of the National Security Council said, ‘95 percent of the US intelligence effort has been on collection, and only 5 percent on analysis’.72

The paradox of this informational approach to warfare is noted by Van Creveld: 'designed to produce accuracy and certainty, the pressure exercised at the top for more and more quantitative information ended up by producing inaccuracy and uncertainty'.73 It was widely assumed that the development of information gathering and processing technologies would allow a far greater understanding and control of military operations. In practice, the collection and production of information for its own sake created at best greater uncertainty and confusion, and at worst a fictional account of the conflict, based on a misplaced sense of omniscience, on which erroneous decisions would be made.

As Alain Enthoven was himself to recognize, ‘you assume that there is an information system that will tell you what you want to know. But that just isn’t so. There are huge amounts of misinformation and wrong information’.74  Thus, far from eliminating the Clausewitzian ‘fog of war’, cybernetic warfare itself generated ‘a kind of twilight, which, like fog or moonlight, often tends to make things seem grotesque and larger than they really are’.75  Statistical indicators pointing to US success in Vietnam were frequently erroneous and misleading, failing to grasp the determination of the enemy and the extent of the success of their political strategy. Colonel Harry Summers relates an anecdote whose absurdity captures the disjuncture between the statistical assessment of the war and its reality:

When the Nixon Administration took over in 1969 all the data on North Vietnam and the United States was fed into a Pentagon computer – populations, gross national product, manufacturing capability, number of tanks, ships, and aircraft, size of the armed forces, and the like. The computer was then asked, 'When will we win?' It took only moments to give the answer: 'You won in 1964!'76


Conclusion

Defeat in Vietnam exposed the shortcomings of cybernetic warfare and revealed the inherent limitations of any attempt to make war into an entirely controllable and predictable activity. The cybernetic model of warfare erected by the system analysts was a frictionless one, a perfectly oiled machine resting on elegant mathematical constructs. Rather than eternal attributes of the battlefield, uncertainty and unpredictability became understood merely as a lack of information which could be overcome through the deployment of the proper information and communication technologies and the elaboration of appropriate models of conflict. John Lewis Gaddis explicitly criticizes the tendency in American strategic thought in the post-war era 'to equate the importance of information with the ease of measuring it – an approach better suited to physics than to international relations'.77

The formidable technological impulse of World War II, marked in particular by the development of nuclear weaponry and ICTs, empowered those individuals who mastered the language and methodology of the science that accompanied this technology, to the detriment of established traditions of military thought and the practice of warfare. Via an '"organized scientific discourse" through multiple, but centralizing relationships among high-bureaucratic positions, technobureaucratic or production logic in the structure of its propositions, and the conventional educated prose style', cybernetic warfare excluded accounts of the war which did not conform to the exigencies of technoscientific discourse. For Gibson, this amounted to a neglect of 'warrior knowledge', which he describes in terms of Foucault's notion of 'subjugated knowledge',78 that is, knowledge 'disqualified as inadequate to their task or insufficiently elaborated: naïve knowledges, located low down on the hierarchy, beneath the required level of cognition or scientificity'.79

If the debacle of Vietnam resulted in some serious soul-searching among American strategists and military men, it did not result in a wholesale abandonment of the worldview epitomized by cybernetic warfare or a significant re-evaluation of other forms of thought on war. Throughout the rest of the Cold War and beyond, information technology continued to be embraced as the panacea for the chaos and indeterminacy of war. The Strategic Defense Initiative promised an invulnerable shield against nuclear attack through a combination of computers and space weapons, while revolutions in military affairs in the mould of Westmoreland's vision have been repeatedly heralded (of which the current Pentagon doctrine of network-centric warfare is only the latest incarnation). A greater understanding of the role of scientific ideas and technological systems in the theory and practice of warfare therefore serves not only to shed light on the Cold War but also as a warning against any misplaced faith in the ability of information technology to grant complete control and predictability over the use of military force.


Notes
[1] Edwards, The Closed World: Computers, 127–8.
[2] Robin, The Making of the Cold War Enemy; Light, From Warfare to Welfare; Kaplan, The Wizards of Armageddon; Ghamari-Tabrizi, The Worlds of Herman Kahn.
[3] Edwards, The Closed World: Computers.
[4] Wiener, Cybernetics, 55.
[5] Ibid., 55.
[6] Wiener, The Human Use of Human Beings, 95.
[7] Ibid., 8.
[8] Heylighen and Joslyn, "Cybernetics and Second-Order Cybernetics," http://pespmc1.vub.ac.be/Papers/Cybernetics-EPST.pdf, 18.
[9] Heims, Von Neumann and Wiener, 184.
[10] Dechert, The Social Impact of Cybernetics, 20.
[11] Quoted in Heims, The Cybernetics Group, 22.
[12] Quoted in Capra, The Web of Life, 62.
[13] Easton, A Framework for Political Analysis, 112, 128; Easton, A Systems Analysis of Political Life; Deutsch, The Nerves of Government.
[14] Wiener, The Human Use of Human Beings, 185–6.
[15] Edwards, The Closed World: Computers, 12.
[16] Ibid., 15.
[17] Ibid., 15.
[18] Ibid., 7.
[19] Edwards, "The Closed World: Systems Discourse," 138–9.
[20] Ghamari-Tabrizi, The Worlds of Herman Kahn, 128.
[21] Edwards, "The Closed World: Systems Discourse," 139.
[22] Levidow and Robins, "Towards a Military Information Society?," 173.
[23] Westmoreland, "Address."
[24] Edwards, "Why Build Computers?"
[25] Gray, Postmodern War, 95.
[26] Van Creveld, Technology and War, 239.
[27] Gibson, The Perfect War, 23.
[28] Rochlin, Trapped in the Net, 204.
[29] Ghamari-Tabrizi, "U.S. Wargaming Grows Up."
[30] Edwards, "The Closed World: Systems Discourse," 143.
[31] Edwards, The Closed World: Computers, 206.
[32] Ghamari-Tabrizi, The Worlds of Herman Kahn, 149.
[33] United States Army, Operations Research/Systems Analysis.
[34] Cummings, "How the World of OR Societies Began."
[35] De Landa, War in the Age of Intelligent Machines, 5.
[36] Rochlin, Trapped in the Net, 59.
[37] Beer, "What Has Cybernetics to Do with Operational Research?"
[38] Martin and Norman, The Computerised Society, 569.
[39] Clayton and Sheldon, Air Force Operations Analysis.
[40] Van Creveld, Technology and War, 194.
[41] Clayton and Sheldon, Military Operations Research.
[42] Wilson, The Bomb and the Computer, 43.
[43] Kaplan, The Wizards of Armageddon, 87.
[44] Ibid., 87.
[45] Thinking about the Unthinkable was the title of a book by notorious nuclear strategist Herman Kahn.
[46] Wohlstetter, The Delicate Balance of Terror.
[47] Ghamari-Tabrizi, The Worlds of Herman Kahn, 138.
[48] Holley, "The Evolution of Operations Research," 101.
[49] Ghamari-Tabrizi, The Worlds of Herman Kahn, 166.
[50] De Landa, War in the Age of Intelligent Machines, 103.
[51] Ghamari-Tabrizi, The Worlds of Herman Kahn, 169.
[52] Kaplan, The Wizards of Armageddon, 254.
[53] Ghamari-Tabrizi, The Worlds of Herman Kahn, 48. Kahn echoed Enthoven's sentiment when he asked officers who were critical of his approach, 'how many thermonuclear wars have you fought recently?'
[54] At http://www.defenselink.mil/specials/secdef_histories/bios/mcnamara.htm.
[55] Kaplan, The Wizards of Armageddon, 254.
[56] Edwards, The Closed World: Computers, 5.
[57] Eisenhower, "Farewell Address."
[58] Van Creveld, Technology and War, 246.
[59] Ibid., 246.
[60] Perry, "Commentary," 117.
[61] Wilson, The Bomb and the Computer, 114.
[62] Morris, The Fog of War.
[63] Gibson, The Perfect War, 156.
[64] Edwards, "Cyberpunks in Cyberspace."
[65] Gibson, The Perfect War, 80.
[66] Kissinger, American Foreign Policy, 51–97.
[67] Gibson, The Perfect War, 22.
[68] Ibid., 23.
[69] Mustin, "Flesh and Blood."
[70] Van Creveld, Command in War, 239.
[71] Arquilla and Ronfeldt, "Cyberwar is Coming!"
[72] Gibson, The Perfect War, 367.
[73] Van Creveld, Command in War, 259.
[74] Herken, Counsels of War, 220.
[75] Clausewitz, On War, 140.
[76] Heuser, Reading Clausewitz, 170.
[77] Gaddis, Strategies of Containment, 84.
[78] Gibson, The Perfect War, 467.
[79] Foucault, Power/Knowledge, 82.
 

References
Arquilla, John, and David Ronfeldt. "Cyberwar is Coming!" In In Athena's Camp: Preparing for Conflict in the Information Age, edited by John Arquilla and David Ronfeldt. Santa Monica, CA: RAND, 1997.
Beer, Stafford. "What Has Cybernetics to Do with Operational Research?" Operational Research Quarterly 10, no. 1 (1959): 1–21.
Capra, Fritjof. The Web of Life: A New Synthesis of Mind and Matter. London: Flamingo, 1997.
Clausewitz, Carl von. On War. Harmondsworth: Penguin, 1968.
Clayton, Thomas J., and Robert S. Sheldon. Air Force Operations Analysis. Available at http://www.mors.org/history/af_oa.pdf.
Clayton, Thomas J., and Robert S. Sheldon. Military Operations Research. Available at http://www.mors.org/history/mor.pdf.
Cummings, Nigel. "How the World of OR Societies Began." OR Newsletter, April 1997. Available at http://www.orsoc.org.uk/about/topic/news/article_news_orclub.htm.
De Landa, Manuel. War in the Age of Intelligent Machines. New York: Swerve Editions, 1991.
Dechert, Charles R., ed. The Social Impact of Cybernetics. Notre Dame, IN: University of Notre Dame, 1966.
Deutsch, Karl W. The Nerves of Government: Models of Political Communication and Control. New York: Free Press; London: Collier-Macmillan, 1963.
Easton, David. A Systems Analysis of Political Life. New York: John Wiley & Sons, 1965.
Easton, David. A Framework for Political Analysis. Chicago, IL and London: University of Chicago Press, 1979.
Edwards, Paul N. "The Closed World: Systems Discourse, Military Policy and Post-World War II US Historical Consciousness." In Cyborg Worlds: The Military Information Society, edited by Les Levidow and Kevin Robins. London: Free Association Books, 1989.
Edwards, Paul N. "Cyberpunks in Cyberspace: The Politics of Subjectivity in the Computer Age." In Cultures of Computing, edited by Susan Leigh Star. Keele: Sociological Review and Monograph Series, 1995.
Edwards, Paul N. The Closed World: Computers and the Politics of Discourse in Cold War America. Cambridge, MA and London: MIT Press, 1996.
Edwards, Paul N. "Why Build Computers?" In Major Problems in the History of American Technology: Documents and Essays, edited by Merritt Roe Smith and Gregory K. Clancey. Boston: Houghton Mifflin, 1998.
Eisenhower, Dwight. "Farewell Address to the Nation." January 17, 1961. http://www.ourdocuments.gov/doc.php?flash=true&doc=90&page=transcript.
Foucault, Michel. Power/Knowledge. Hemel Hempstead: Harvester Press, 1980.
Gaddis, John Lewis. Strategies of Containment: A Critical Appraisal of Postwar American National Security Policy. Oxford: Oxford University Press, 1982.
Ghamari-Tabrizi, Sharon. The Worlds of Herman Kahn: The Intuitive Science of Thermonuclear War. Cambridge, MA: Harvard University Press, 2005.
Ghamari-Tabrizi, Sharon. "U.S. Wargaming Grows Up: A Short History of the Diffusion of Wargaming in the Armed Forces and Industry in the Postwar Period up to 1964." Available at http://www.strategypage.com/articles/default.asp?target=Wgappen.htm.
Gibson, James. The Perfect War: Technowar in Vietnam. Boston: Atlantic Monthly Press, 1986.
Gray, Chris Hables. Postmodern War: The New Politics of Conflict. New York: Guilford Press, 1997.
Heims, Steve J. John Von Neumann and Norbert Wiener: From Mathematics to the Technologies of Life and Death. Cambridge, MA and London: MIT Press, 1980.
Heims, Steve J. The Cybernetics Group. Cambridge, MA: MIT Press, 1991.
Herken, Gregg. Counsels of War. New York: Alfred A. Knopf, 1985.
Heuser, Beatrice. Reading Clausewitz. London: Pimlico, 2002.
Heylighen, Francis, and Cliff Joslyn. "Cybernetics and Second-Order Cybernetics." In Encyclopedia of Physical Science & Technology, 3rd ed., edited by R.A. Meyers. New York: Academic Press, 2001. Available at http://pespmc1.vub.ac.be/Papers/Cybernetics-EPST.pdf.
Holley, I.B., Jr. "The Evolution of Operations Research and the Impact on the Military Establishment: The Air Force Experience." In Science, Technology and Warfare: The Proceedings of the Third Military History Symposium, edited by Monte D. Wright and Lawrence J. Paszek. United States Air Force Academy, 8–9 May 1969.
Kahn, Herman. Thinking about the Unthinkable. London: Weidenfeld and Nicolson, 1962.
Kaplan, Fred. The Wizards of Armageddon. New York: Simon & Schuster, 1984.
Kissinger, Henry. American Foreign Policy: Three Essays by Henry Kissinger. New York: W.W. Norton, 1969.
Levidow, Les, and Kevin Robins. "Towards a Military Information Society?" In Cyborg Worlds: The Military Information Society, edited by Les Levidow and Kevin Robins. London: Free Association Books, 1989.
Light, Jennifer S. From Warfare to Welfare: Defense Intellectuals and Urban Problems in Cold War America. Baltimore, MD: Johns Hopkins University Press, 2003.
Martin, James, and Adrian R.D. Norman. The Computerised Society. Harmondsworth, Middlesex: Penguin Books, 1973.
Meyers, R.A., ed. Encyclopedia of Physical Science and Technology, 3rd ed. New York: Academic Press, 2001.
Morris, Errol, director. The Fog of War: Eleven Lessons from the Life of Robert S. McNamara. Columbia Tristar, 2004.
Mustin, Lt Jeff. "Flesh and Blood: The Call for the Pilot in the Cockpit." Air and Space Power Journal, Chronicles Online Journal, July 2001. Available at http://www.airpower.maxwell.af.mil/airchronicles/cc/mustin.html.
Perry, Robert L. "Commentary." In Science, Technology and Warfare: The Proceedings of the Third Military History Symposium, edited by Monte D. Wright and Lawrence J. Paszek. United States Air Force Academy, 8–9 May 1969.
Robin, Ron Theodore. The Making of the Cold War Enemy: Culture and Politics in the Military-Intellectual Complex. Princeton, NJ: Princeton University Press, 2001.
Rochlin, Gene I. Trapped in the Net: The Unanticipated Consequences of Computerization. Princeton, NJ: Princeton University Press, 1997.
United States Army. Operations Research/Systems Analysis. Official Department of the Army Administrative Publications, Department of the Army Pamphlet 600-3-49, 1987. Available at http://www.army.mil/usapa/epubs/pdf/p600_3_49.pdf.
Van Creveld, Martin. Command in War. Cambridge, MA and London: Harvard University Press, 2003.
Van Creveld, Martin. Technology and War: From 2000 B.C. to the Present. New York and London: Free Press and Collier Macmillan, 1989.
Westmoreland, William. "Address to the Association of the U.S. Army." 14 October 1969.
Wiener, Norbert. Cybernetics: Or Control and Communication in the Animal and the Machine. Cambridge, MA: MIT Press, 1948.
Wiener, Norbert. The Human Use of Human Beings: Cybernetics and Society. London: Eyre & Spottiswoode, 1954.
Wilson, Andrew. The Bomb and the Computer. London: Barrie & Rockliff, 1968.
Wohlstetter, Albert. The Delicate Balance of Terror. 1958. Available at http://www.rand.org/publications/classics/wohlstetter/P1472/P1472.html.