By Jonathan D. Moreno
Jonathan D. Moreno is the David and Lyn Silfen University Professor at the University of Pennsylvania, where he is a Penn Integrates Knowledge (PIK) professor. At Penn he is also Professor of Medical Ethics and Health Policy, of History and Sociology of Science, and of Philosophy. Moreno is an elected member of the National Academy of Medicine and is the U.S. member of the UNESCO International Bioethics Committee. A Senior Fellow at the Center for American Progress in Washington, D.C., Moreno has served as an adviser to many governmental and non-governmental organizations, including three presidential commissions, the Department of Defense, the Department of Homeland Security, the Department of Health and Human Services, the Centers for Disease Control, the Federal Bureau of Investigation, the Howard Hughes Medical Institute, and the Bill and Melinda Gates Foundation. Moreno has written several books, including Impromptu Man: J.L. Moreno and the Origins of Psychodrama, Encounter Culture, and the Social Network (2014), The Body Politic (2011), Mind Wars (2012), and Undue Risk (2000). He has also published hundreds of papers, articles, reviews, and op-eds, and frequently contributes to such publications as The New York Times, The Wall Street Journal, The Huffington Post, Psychology Today, and Nature. In 2008-09 he served as a member of President Barack Obama’s transition team. His work has also been cited by Al Gore and was used in the development of the screenplay for “The Bourne Legacy.”
A new U.S. strategic doctrine called the third offset poses an important challenge for the field of neuroethics. The neuroethical issues related to national security were not among those discussed at the Dana Foundation’s landmark “Mapping the Field” conference in 2002. But only a year after the Dana conference, Nature published a tough editorial called “Silence of the Neuroengineers.” The editors accused Pentagon-funded investigators of failing to respond to, or even consider, questions about the potential uses of technologies like brain-machine interfaces. An indignant letter from the chief scientist at the Defense Advanced Research Projects Agency (DARPA) suggested that the Nature editors harbored a prejudicial attitude, failing to take into account the medical advances that could eventuate from DARPA-funded neuroscience (1). Since then, the possible military and intelligence applications of modern neurotechnologies have stimulated a modest literature (2). Nonetheless, the field is still underperforming in its attention to the national security environment.
The Pentagon, image courtesy of Wikimedia.
The arguments for intensifying a focus on what might be called neurosecurity are many, including a steady pattern of substantial funding, in the tens of millions of dollars, for neuroscience projects by various national security agencies. Though DARPA has received the most attention among those who have followed these developments, both the Intelligence Advanced Research Projects Agency (IARPA) and the Office of Naval Research (ONR) have substantial programs in fields like electromagnetic neurostimulation and computational neuroscience. As well, the dual-use argument that reverberated in the exchange over the Nature editorial underscores the momentum behind federal funding for neuroscience. Even those who worry about “militarized” science are put in the awkward position of threading a moral needle when, for example, new prosthetics for severely incapacitated persons are in the offing and new therapies for dementia and trauma are so desperately needed. Such is the case with the U.S. BRAIN Initiative, in which DARPA plays a key role.
In my own work I have tried to locate national security neuroscience in both the history of science and the history of national security doctrine, especially in the U.S. Seen through those lenses, there are important overlapping cases illustrating that, though the relevant technologies are vastly improved, neurosecurity is not a new concern for defense planners. Since the ancient world, commanders have striven to achieve psychological advantages over adversaries, advantages ranging from propaganda to narcotics. The two world wars saw the introduction of intelligence tests and personality inventories. Cold War intelligence and military officials worried about whether a new compound called LSD-25 could be used as a “truth serum” or as a way to demoralize fighters. As strange as these concerns might seem to us now, still more bizarre were serious explorations of ESP, such as “remote viewing,” and of telekinesis.
One important lesson of American warfighting capacity during World War II was that, although U.S. industrial might outproduced all the other protagonists in sheer quantity, the quality of war materials often lagged behind that of Nazi Germany and Imperial Japan. Therefore, since the Truman administration, a consistent premise of U.S. policy has been technological superiority over all actual and potential adversaries. This posture has virtually assured that even implausible “technologies” that might confer an advantage will be considered. It has also resonated well in a country transformed into a post-World War II national security state, in which essentially all sectors play a role in the defense of the nation and all societal purposes and resources may be subordinated to national security goals.
U.S. Army Cyber Center of Excellence, image courtesy of Wikimedia.
The appreciation of the importance of a technological edge has been characterized among U.S. defense planners as an “offset strategy.” For that community, nuclear weapons comprised the first offset in the face of a Soviet enemy with significant numerical advantages in conventional weapons. For all its MAD quality (the doctrine was called “Mutual Assured Destruction”), the strategy enabled a balance of power during the Cold War and was subject to a more or less successful nonproliferation regime. The second offset included precision-guided munitions like laser-guided “smart bombs” and computerized command-and-control systems, which proved themselves in the Gulf War of 1990-91. These technologies were clearly cutting edge in their day, but new possibilities have emerged that require new ways of thinking about defense research and development. As well, national security strategists face a multipolar world that also includes non-state actors capable of terror attacks that pose mainly a psychological rather than an existential threat.
The result is a moving target in several senses of the term, one that requires exploring novel technological approaches. For several years these new technologies have been collected under the heading of the third offset, described by Real Clear Defense as “an attempt to offset shrinking U.S. military force structure and declining technological superiority in an era of great power competition—a challenge that military leaders have not grappled with in at least a generation (3).”
The precise outlines of third-offset technologies aren’t as clear as in the first two offsets, so observers have waited for Pentagon budgets in order to learn of their R&D commitments, though of course there are classified or “black” projects that are not publicly available. Nonetheless, some budget specifics are of particular interest for neuroethics. These include:
Autonomous "deep learning" machines and systems for early warning based on crunching big data;
Human-machine collaboration to help human operators make decisions;
Assisted-human operations so that humans can operate more efficiently with the help of machines like exoskeletons;
Advanced human-machine teaming in which a human works with an unmanned system;
Semi-autonomous weapons “hardened” for cyber warfare (4).
Because the third offset technologies are not clearly definable, one needs to be alert to other projects that fit within the scope of the new doctrine, complement some of its goals, and fill out the neuroethical issues. Naturally, many of these projects are dual use. An example is DARPA’s SUBNETS (Systems-Based Neurotechnology for Emerging Therapies) program. In cooperation with other federal agencies and private industry, DARPA aims to develop improved electronic microarray chips for deep-brain stimulation for neuropsychiatric disorders and for prosthetics (5). Implantable chips that go well beyond the standard 96-electrode array might be advantageous not only for new therapies but also for enhanced performance on a variety of tasks. An open question is whether materials might be developed to create biocompatible microarrays of thousands or even tens of thousands of electrodes. DARPA’s Neural Engineering System Design program “aims to develop an implantable neural interface able to provide unprecedented signal resolution and data-transfer bandwidth between the brain and electronics….The goal is to achieve this communications link in a biocompatible device no larger than one cubic centimeter in size, roughly the volume of two nickels stacked back to back (6).”
A reliable human-machine interface would be a critical component of the third offset. But some DARPA planners wonder if such a device could turn out to be still more powerful, enabling direct human-to-human communication over distances. The controversial Nature editorial in 2003 anticipated these efforts in its indictment of passive neuroscientists who were failing to address the underlying ethical questions: “The agency wants to create systems that could relay messages, such as images and sounds, between human brains and machines, or even from human to human (7).”
Military and intelligence applications of neurotechnologies are a critical driving force behind the developments that are of interest to neuroethicists. Yet reporting and analysis of these developments have been left mainly to political scientists and technology journalists. As the third offset strategy and its components make clear, it is no longer plausible for neuroethicists to fail to take the national security environment into account. To do so is to commit a form of scholarly malpractice.
References
1. Jonathan D. Moreno, Mind Wars: Brain Science and the Military in the 21st Century (Bellevue Literary Press, 2012).
2. James Giordano, Ed., Neurotechnology in National Security and Defense: Practical Considerations, Neuroethical Concerns (Taylor and Francis, 2015).
3. Mackenzie Eaglen, “What Is the Third Offset Strategy?” RealClearDefense, February 16, 2016.
4. Aaron Mehta, “Work Outlines Key Steps in Third Offset Tech Development,” Defense News, December 14, 2015.
5. Lawrence Livermore National Laboratory, “Lawrence Livermore Lab awarded $5.6 million to develop next generation neural devices,” June 11, 2014.
6. Defense Advanced Research Projects Agency, “Bridging the Bio-Electronic Divide.”
7. Editorial, “Silence of the Neuroengineers,” Nature 423, 787 (19 June 2003).
Want to cite this post?
Moreno, J.D. (2017). Neuroethics and the Third Offset Strategy. The Neuroethics Blog. Retrieved on , from http://www.theneuroethicsblog.com/2017/01/neuroethics-and-third-offset-strategy.html