
Computing, research, and war: if knowledge is power, where is responsibility?

Published: 01 August 1989

Abstract

In the United States, artificial intelligence (AI) research is mainly a story about military support for the development of promising technologies. Since the late 1950s and early 1960s, AI research has received most of its support from the military research establishment [37, 55]. Not until the 1980s, however, has the military connected this research to specific objectives and products. In 1983, the $600-million Strategic Computing Program (SCP) created three applications for "'pulling' the technology-generation process by creating carefully selected technology interactions with challenging military applications" [16]. These applications, an autonomous land vehicle, a pilot's associate, and a battle management system, explicitly connect the three armed services to further AI developments [29, 51, 53]. The Defense Science Board Task Force on the "Military Applications of New-Generation Computer Technologies" recommended warfare simulation, electronic warfare, ballistic missile defense, and logistics management as also promising a high military payoff [18].
In his 1983 "Star Wars" speech, President Reagan enjoined "the scientific community, . . . those who gave us nuclear weapons, . . . to give us the means of rendering these nuclear weapons impotent and obsolete" [43]. As in the Manhattan and hydrogen bomb projects, AI researchers and more generally computer scientists are expected to play major parts in this quest for a defensive shield against ballistic missiles. Computing specialists such as John von Neumann played a supportive role by setting up the computations necessary for these engineering feats—with human "computers" for the atom bomb [10] and with ENIAC and other early computers for the hydrogen bomb [9]. The "Star Wars" project challenges computer scientists to design an intelligent system that finds and destroys targets—basically in real-time and without human intervention.
The interdependence of the military and computer science rarely surfaces during our education as computer practitioners, researchers, and teachers. Where might information concerning these important military applications enter into computer science and AI education? Where do students receive information concerning the important role they may play in weapon systems development? One of our students recently remarked that "as a computer science major, I did not realize the magnitude of the ramifications of advancing technology for the military . . . . In a field so dominated by the DoD, I will have to think seriously about what I am willing and not willing to do—and what lies in between those two poles."
As researchers and educators, the authors wish to encourage colleagues and students to reflect upon present and historical interactions between computer science as an academic discipline and profession, and military projects and funding. As computer professionals, we lay claim to specialized knowledge and employ that knowledge in society as developers of computing technologies. Thus, we exercise power. Recognizing that as professionals we wield power, we must also recognize that we have responsibilities to society. To act responsibly does not mean that computer professionals should advocate a complete separation between computer science and military missions. However, we should openly examine the inter-relationships between the military and the discipline and practice of computing. To act responsibly does not mean that computer scientists and practitioners should eschew support or employment from the military, although some are justified in taking such a stance. To act responsibly requires attention to the social and political context in which one is embedded; it requires reflection upon individual and professional practice; it requires open debate. The lack of attention to issues of responsibility in the typical computer science curriculum strikes us as a grave professional omission. With this article, we hope to add material to the dialogue on appropriate computing applications and their limits. We also hope to provoke reflections on computing fundamentals and practice at the individual, professional, and disciplinary levels, as well as to prod government institutions, professional societies, and industry to support in-depth research on the issues we raise here.
Reflection requires information and discussion. Academic computer science departments rarely support serious consideration of even general issues under the rubric of the social and ethical implications of computing. Unlike any other U.S. computer science department, Information and Computer Science (ICS) at UC Irvine has an active research program in the social implications of computing (Computers, Organizations, Policy and Society—CORPS). Even within CORPS, research that addresses the interactions between the military and computer science is difficult to pursue—not because individuals aren't interested, but because they are not able to find professional or academic support. The authors' interests in these issues arose from personal concerns over the dependence of military systems upon complex technology, and the possible grave outcomes of this fragile relationship. CORPS provided a supportive intellectual environment that allowed us to pursue our interests. In 1987, we developed and taught an undergraduate course designed to inform students about military applications and their limits, and allow dialogue on professional responsibilities. In general, little monetary support is available for research that considers these issues, and it is only through support from the Institute on Global Conflict and Cooperation and campus instructional funds that we were able to develop and teach the course.
Few researchers or educators can devote time and energy to pursuing the social and ethical implications of their work and profession in addition to their "mainstream" research. Since the discipline of computer science does not consider these reflections serious "mainstream" research, those who choose to pursue these vital questions have difficulty finding employment and advancing through the academic ranks. Growing concern over these issues and interest by computer scientists, as evidenced by the group Computer Professionals for Social Responsibility [38], individuals such as David Parnas [39], and this article, may lead to future research support and academic recognition.
For now, as concerned professionals, we offer the following reviews. They pose many more questions than answers. This article exemplifies the interdisciplinary investigations which are required as precursors to serious analysis of computing use in these applications. We hope that our reviews generate discussion and debate. In the first section, we present the course rationale and content, as well as student responses. In the sections following the course description, we consider three applications—smart weapons, battle management, and war game simulations—that are generating research and development funds and that have controversial implications for military uses of computing. We start with smart weapons, that is, the development of weapons that can destroy targets with minimal human intervention. Next we look at battle management systems designed to coordinate and assess the use of resources and people in warfare. Finally, we turn to war gaming as a means for evaluating weapon performance and strategies for war fighting. In each case, we describe the state of technology, its current and potential uses and its implications for the conduct of war.

References

[1]
Advanced military computing 1, 5 (1985), 1, 2, 5-7.
[2]
Advanced military computing 1, 6 (1985), 4.
[3]
Advanced military computing 2, 9 (1986), 5.
[4]
Advanced military computing 2, 10 (1986), 2.
[5]
Advanced military computing 2, 17 (1986), 3-4.
[6]
Allison, G.T. Essence of decision. Little, Brown and Co., Boston, 1971.
[7]
Anderson, P.A. Using artificial intelligence to understand decision making in foreign affairs: The problem of finding the right technology. In Artificial Intelligence and National Security, S.J. Cimbala, Ed., Lexington Books, Lexington, Mass, 1987.
[8]
Arkin, W.M. Tomahawk: Ominous new development. Bull. Atomic Scientists 40 (1984), 3-4.
[9]
Augarten, S. Bit by bit: An illustrated history of computers. Ticknor & Fields, New York, 1984.
[10]
Badash, L., Hirschfelder, J.O., and Broida, H.P., Eds. Reminiscences of Los Alamos, 1943-1945. D. Reidel Publishing Co., Holland, 1980.
[11]
Barr, A., and Feigenbaum, E.A., Eds. The handbook of artificial intelligence. Kaufmann, Los Altos: Vol. 1, 1981, Vol. 2, 1982; P.R. Cohen, and E.A. Feigenbaum, Eds., Vol. 3, 1982.
[12]
Borning, A. Computer system reliability and nuclear war. Commun. ACM 30, 2 (Feb. 1987), 112-131.
[13]
Brewer, G.D., and Shubik, M. The War Game. Harvard University Press, Cambridge, 1979.
[14]
Charniak, E., and McDermott, D. Introduction to Artificial Intelligence. Addison-Wesley, Menlo Park, 1985.
[15]
Corcoran, E. Strategic computing: Far from the finish line. IEEE Institute, December 1986.
[16]
DARPA, Strategic computing: New generation computing technology: A strategic plan for its development and application to critical problems in defense. Oct. 1983.
[17]
Davis, P.K. Applying artificial intelligence techniques to strategic-level gaming and simulation. In Modelling and Simulation Methodology in the Artificial Intelligence Era, M.S. Elzas, T.I. Ören, and B.P. Zeigler, Eds., Elsevier Science Publishers/North Holland, Amsterdam, 1986.
[18]
Defense Science Board Task Force, Military applications of new-generation computing technologies. December, 1984.
[19]
Dickey, A. Deep-sea robots cut their umbilical cords. The Engineer, 26+ (Sept. 11, 1986).
[20]
Dickson, P. The Electronic Battlefield. Indiana University Press, Bloomington, 1976.
[21]
Gerstenzang, J. Computers, lasers alter art of war. Los Angeles Times (Aug. 7, 1986).
[22]
Goodman, G.W., Jr. US military RPV programs have taken big strides in 1986. Armed Forces J. Int., 66+ (Dec 1986).
[23]
Gurney, G. Rocket and Missile Technology, Franklin Watts, New York, 1964.
[24]
Hellman, P. The little airplane that could. Discover (Feb. 1987), 78-87.
[25]
Hura, M., and Miller, D. Cruise missiles: Future options. In Proceedings of the U.S. Naval Institute. 112, 8 (1986), 49-53.
[26]
Hura, M., and Miller, D. Cruise missile warfare. In Proceedings of the U.S. Naval Institute. 111, 10 (1985), 96-101.
[27]
IEEE Trans. Systems, Man, and Cybernetics. (Nov./Dec.), 1986.
[28]
Kanade, T., and Thorpe, C. CMU strategic computing vision project report: 1984 to 1985. Carnegie Mellon University, CMU-RI-86-2, 1986.
[29]
Klass, P.J. DARPA envisions new generation of machine intelligence technology. Aviation Week & Space Technology (22 April 1985).
[30]
Lemmons, P. Autonomous weapons and human responsibility. BYTE 10, 1 (Jan. 1985), 6.
[31]
Leveson, N.G. Software safety: What, why, and how. ACM Comput. Surv. 18 (1986), 125-163.
[32]
Littauer, R., and Uphoff, N., Eds. The air war in Indochina. Beacon Press, Boston, 1972.
[33]
McNamara, R.S. Blundering into disaster. Pantheon Books, New York, 1986.
[34]
Moore, M.K., and Schemmer, B.F. Pinpointing targets, not real estate, electronically. Armed Forces J. Int. 124 (Oct. 1986).
[35]
Navigation challenges autonomous vehicle. Aviation Week & Space Technology (22 April 1985).
[36]
Nilsson, N.J. Principles of artificial intelligence. Tioga Publishing Co., Palo Alto, 1980.
[37]
Office of Technology Assessment, Information technology R&D: Critical trends and issues, OTA-CIT-268, Feb 1985.
[38]
Ornstein, S.M., Smith, B.C., and Suchman, L.A. Strategic computing: An assessment. Commun. ACM 28, 2 (Feb. 1985), 134-136.
[39]
Parnas, D.L. Software aspects of strategic defense systems. Commun. ACM 28, 12 (Dec. 1985), 1326-1335.
[40]
Pattern Recognition 18, 6, 1985.
[41]
Perrow, C. Normal accidents. Basic Books, New York, 1984.
[42]
RADC focuses on expert system applications for C3, natural speech technology. Aviation Week & Space Technology (22 April 1985), 84.
[43]
Reagan, R. Address on ballistic missile defense, March 1983.
[44]
Redell, D., and Cohen, D. SDI: Is the software feasible? Distributed by CPSR, Palo Alto, CA.
[45]
Rich, E. Artificial intelligence. McGraw-Hill Co., New York.
[46]
Rochlin, G.I. High-reliability organizations and technical change: Some ethical problems and dilemmas. IEEE Technology and Society Magazine 5, 3 (Sept. 1986), 3-9.
[47]
Rogers, M. Birth of the killer robots. Newsweek (25 June 1984).
[48]
Schrodt, P.A. Pattern matching, set prediction, and foreign policy analysis. In Artificial Intelligence and National Security, S.J. Cimbala, Ed., Lexington Books, Lexington, Mass., 1987.
[49]
Schutzer, D. Artificial intelligence. An applications oriented approach. Van Nostrand Reinhold, Co., New York, 1987.
[50]
Slagle, J.R., and Hamburger, H. An expert system for a resource allocation problem. Commun. ACM 28, 9 (Sept. 1985), 994-1004.
[51]
Stein, K.J. New strategic computing plan details programs, fiscal data. Aviation Week & Space Technology (15 December 1986).
[52]
Stevens, L. Artificial intelligence: A search for the perfect machine. Hayden Book Co., Hasbrouck Heights, N.J., 1985.
[53]
Sun, M. The Pentagon's ambitious computer plan. Science 222, (1983), 1213-1215.
[54]
Taylor, T.B. Third-generation nuclear weapons. Sci. Am. 256, 4 (1987), 30-39.
[55]
Thompson, C. Military direction of academic CS research. Commun. ACM 29, 7 (July 1986), 583-585.
[56]
Tsipis, K. Cruise missiles. Sci. Am. 236, 2 (1977), 20-29.
[57]
Tsipis, K. The operational characteristics of ballistic missiles. In World Armaments and Disarmament, SIPRI Yearbook 1984. Taylor & Francis, Philadelphia, 1984.
[58]
USAF lab simulates battle management tasks. Aviation Week & Space Technology (9 December 1985), 105.
[59]
U.S. General Accounting Office. Battlefield automation: Status of the Army Command and Control System Program. NSIAD-86-184FS, August, 1986.
[60]
Walker, P.F. Precision-guided weapons. Sci. Am. 245, 2 (1981), 37-45.
[61]
Walker, P.F. Smart weapons in naval warfare. Sci. Am. 248, 5 (1983), 53-61.
[62]
Weiss, G. Battle in control of the Ho Chi Minh Trail. Armed Forces J. 108, 12 (15 February 1971), 18-22.
[63]
Weiss, G. Restraining the data monster: The next step in C3. Armed Forces J.
[64]
Wiener, N. The human use of human beings. Cybernetics and society. Avon Books, New York, 1967.
[65]
Wilson, A. The bomb and the computer. Barrie and Rockliff Cresset Press, London, 1968.
[66]
Winston, P.H. Artificial intelligence. 2d ed., Addison-Wesley, Menlo Park, 1984.
[67]
Zuckerman, S. From apes to warlords. Harper & Row Publishers, New York, 1978.
[68]
Zuckerman, S. Nuclear illusion and reality. Collins, London, 1982.
[69]
Zusne, L. Visual perception of form. Academic Press, New York, 1970.

Cited By

  • (2023) AI explainability and governance in smart energy systems: A review. Frontiers in Energy Research 11 (27 Jan. 2023). DOI: 10.3389/fenrg.2023.1071291
  • (2022) Introduction: Critical Insights—Bringing the social sciences and humanities to AI. In Artificial Intelligence and Its Discontents (1 Feb. 2022), 1-20. DOI: 10.1007/978-3-030-88615-8_1
  • (2021) Unsavory medicine for technological civilization: Introducing 'Artificial Intelligence & its Discontents'. Interdisciplinary Science Reviews 46, 1-2 (7 Mar. 2021), 1-18. DOI: 10.1080/03080188.2020.1840820

Reviews

Peter M. Hahn

The title of this paper is misleading: it implies that the paper is about the social responsibilities of those who possess technical knowledge about computers and are involved in war research. The paper does not address this issue. Instead, it describes the lack of information or active discussion about ethical and moral issues. The authors also point out strongly that it is a well-hidden fact that a great deal of computer research in the US is defense-oriented. The authors detail clearly and with adequate examples just how important computing research is to the military. They point out where moral issues arise in several different contexts. They ask, for instance, whether a My Lai–type massacre by runaway robots would lessen the responsibility of the humans in charge. Another question is whether the availability of a "smart" weapon provides a temptation to attack in a questionable situation. These good questions, however, do not address the issue of how a computing professional can deal with the dilemma. Can the professional protest? Can you foresee before you take a job that you will be asked to do something immoral? Should the moral issue be your problem? Perhaps these issues are more the problem of the politician who asks for and funds the research, the institution which hopes to profit from the work, or the public which votes on government policy. Teaching computer science students to understand the applications of their studies and skills is laudable. Giving them insight into the character of the work they will do, the atmosphere of the workplace, and the consequences of decisions they might make is very important. The authors go a long way in describing how this can be done as part of an undergraduate curriculum for computer scientists. Actually, courses covering the social dilemmas arising from the use of computers are needed in nontechnical as well as technical curricula.
The paper is well written, contains many examples of the potential use of computers by the military, and poses some interesting moral questions. It falls short on answering or even raising the ethical question of what a professional can and should do when the product of his or her efforts is used for a less-than-moral objective.


Published In

Communications of the ACM, Volume 32, Issue 8
Aug. 1989
102 pages
ISSN: 0001-0782
EISSN: 1557-7317
DOI: 10.1145/65971
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States
