
WARNING! The views expressed in FMSO publications and reports are those of the authors and do not necessarily represent the official policy or position of the Department of the Army, Department of Defense, or the U.S. Government.

Infosphere Threats

by Mr. Timothy L. Thomas, Foreign Military Studies Office, Fort Leavenworth, KS

This article appeared in Military Review, September-October 1999.

ON 3 July 1988, the USS Vincennes, located in the Persian Gulf, picked up an Iranian plane on its Aegis system radar. Seven minutes later, the ship's Phalanx gun system blew the plane from the sky. The aircraft turned out to be a civilian airliner and not an F-14 as indicated by the Aegis system. One analysis of the incident noted that "the U.S., and by extension other countries using high-tech weapons, may have become prisoners of a technology so speedy and complex that it forces the fallible humans who run it into snap decisions that can turn into disaster."1

This unfortunate incident highlighted some of the emerging problems of the information age: first, the inability of analysts and equipment to visualize the intent of electronic images often causes an inaccurate operator "perception-reaction" response; second, a dangerous game of digital roulette results from the inability of software's embedded scenarios to handle all of the anomalies and asymmetric options that develop, by design or otherwise; and third, the impact of electronic input can overwhelm the human dimension of decision making. The analysis suggests the need for a "military software science" to help understand these new phenomena. Such a science would provide a better interpretation and forecast of the scenarios that countries embed in their military software and improve our response posture.
Implications of the Switch to a Digitized Force

Force XXI's digitization represents a massive shift away from analog data representation. Analog systems process continuous voltage amplitudes and are costly and specially designed, causing difficulties when sharing information with other systems. Digital systems use rapidly switching "on" or "off" states of binary "1" or "0" as data representations. Digital technology permits a vast decrease in electronic hardware's size and cost, and it allows processing in software rather than hardware. The digital format's resulting flexibility explains our increased reliance on it. The underlying commonality in all digital signal processing hardware and the ready ability to convert formats and process the information by using software have caused the explosion in information sharing among digital systems.

But it is this very ease of transmission, extensive processing, changing software and widespread digital data sharing that make intrusion both possible and frightening. If intrusion and corruption succeed, stability disappears and the software's 1s and 0s start falling into unpredictable places, much like a ball landing unpredictably on a spinning roulette wheel. If the scenarios embedded in the software are unable to handle unexpected anomalies deliberately introduced by an opponent, stability could suffer.2 Nations play this game of digital roulette every day with the software in their advanced warning systems, rockets and satellites. Such a game could result in serious mishaps or instigate catastrophic chain reactions of events. For example, what would happen if one side could project false radar blips on a Joint Surveillance Target Attack Radar System (JSTARS) in a manner so realistic, extensive and threatening that a potential opponent expends an arsenal of cruise and other precision-guided missiles on the illusory threat?
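The two ideas above, binary representation of a continuous quantity and the fragility of that representation once an intruder corrupts it, can be sketched in a few lines of code. This is purely an illustrative example, not drawn from the article: the voltage range, bit width and function names are assumptions chosen for clarity.

```python
# Illustrative sketch (not from the article): a digital system represents an
# analog voltage as binary "1"/"0" states; flipping even one bit can change
# the decoded value drastically -- the "digital roulette" risk of corruption.

def quantize(voltage, v_max=5.0, bits=8):
    """Map a continuous voltage in [0, v_max] to an n-bit binary string."""
    level = round(voltage / v_max * (2**bits - 1))
    return format(level, f"0{bits}b")

def decode(bit_string, v_max=5.0):
    """Recover the approximate voltage from its binary representation."""
    n = len(bit_string)
    return int(bit_string, 2) / (2**n - 1) * v_max

sample = quantize(3.3)         # -> '10101000'
corrupted = "0" + sample[1:]   # an intruder flips the most significant bit

print(sample, round(decode(sample), 2))        # faithful: about 3.29 volts
print(corrupted, round(decode(corrupted), 2))  # one flipped bit: about 0.78 volts
```

The point of the sketch is the asymmetry the article describes: the same flexibility that lets software, rather than hardware, interpret the bits also means a single corrupted bit is silently accepted and decoded into a confidently wrong value.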
Could it result in command and control decisions that put nuclear forces in a ready-to-launch status once other assets are exhausted? The world could be thrust to the brink of a holocaust by false images on a computer display. There are no guarantees that all cultures and nations will include "fail-safe" rules in their software to guard against such an accidental launch.

A programmer writes code to fulfill a task, but as Ellen Ullman noted, "it is a task as a human sees it: full of unexpressed knowledge, implicit associations, allusions to allusions. Its coherence comes from knowledge structures deep in the body, from experience, memory."3 Human knowledge mechanisms, as they relate to culture, language and the means of expression, are quite complex. Someone should study the relationship between culture and programming in a systematic way, because there is no concrete understanding of the impact of culture on programming. On the one hand, it is reasonable to suggest that if different cultures think differently, there is no reason why this might not affect the way they program. On the other hand, there is the view that traditional culture makes little difference in the act of programming. Computers may, for example, be creating a horizontal culture, like rock music and McDonald's, which obliterates traditional national boundaries. Perhaps programmers in Calcutta, while on the job, live and program almost exactly as do programmers in Boston or Silicon Valley.4

Retired Air Force Colonel Richard Szafranski, writing on "Neocortical Warfare" in Military Review, November 1994, noted that F.S.C. Northrop discussed the impact of culture on the brain in 1946, before the development of computers. Northrop's interpretation would fit Ullman's first viewpoint because Szafranski, paraphrasing Northrop, remarked that "Culture conditions some of the operations of the left brain.
Specifically, atmospheric and linear perspective in classical Western art and the syntax of romance languages both work together to channel cognition in ways that are different from the ways that the undifferentiated aesthetic continuum of Eastern art and the syntax of the Asian word-picture or ideogram condition the thinking of those in the East."5

However, and perhaps more pertinent, any difference in programming will be swamped by doctrinal differences in how cultures develop and interpret computer displays, what they design the system to provide, and so forth. Cultures will specify tasks differently to solve problems, such as the manner in which heads-up displays were developed for the helmets of Russian and US fighter pilots.

Two Russian analysts who studied digitized-age implications added other characteristics of the digital context, noting that:

- Collection is becoming more closely linked with data analysis and processing, and the entire effort is much more integrated than before.
- Human involvement is decreasing, especially in the collection and processing phases.
- Just as virtual reality is blurring the geopolitical boundaries of information space, it also obscures the enemy image: is it a national or transnational threat?
- Distance is not as important as time in making and implementing decisions.
- Software may become hostile to mankind if it spawns systems of growing complexity or self-destructs.6

Understanding "Information-Based Intent"

Intent is an amorphous concept defined as a purpose or goal: why one performs an act. Intent originates in an individual today just as it always has. In the past, analysts measured intent by observing a country mobilize resources, move tanks to the front and deploy into battle formations. Discerning the intent of electrons, their purpose, goal or what act they are performing, is another matter.
Take, for instance, the difficulty of exposing the intent of electrons sent from a private computer somewhere in the world and rerouted through several intermediate countries. Where did the electrons originate? Who initiated the attack? What is the goal or purpose of the electrons, and what are they doing?

The soldier-operator (S-O) behind a computer monitor or radar screen is usually the front-line defense in the battle to detect and understand electronic intent. The S-O works in and relies on virtual space for his contextual understanding. This workspace is where the tension between space-based information capabilities and intent escalates when commanders and operators face uncertainty yet pressure to react. A Navy captain who "hesitates too long while trying to identify conclusively that radar-screen blip" could lose his ship and the lives of all those aboard.7 Pressure to react rather than think produces hair-trigger rules of engagement (ROE) for naval forces in the Persian Gulf, requiring only "some convincing indication [an electronic image?] that a ship or plane is approaching with hostile intent" to ask headquarters for permission to shoot.8

This leads to the frightening conclusion that the tactical operator/strategic decision maker works in a "perception-reaction" chain of events in the information technology (IT) age, a phenomenon that is even more dangerous than perception management. Perception reaction is the knee-jerk impulse that operators/decision makers feel when presented with images and warnings of an imminent attack, as happened in the case of the Vincennes. Perception reaction in IT can be understood as "actions based on observations of electronic images that elicit survival or other immediate response actions. Whether these images are real or artificial matters not, as both types will influence emotions, motives or the objective reasoning of individuals or systems."9 In the past, when opponents seemed aggressive, planners had time to