Roundtable

Vol 9, Iss 2 | 102–122

On Optimism About New Military Technologies

This article identifies psychological, cultural, and organizational factors that drive optimism about emerging military technologies. Psychological influences include bounded rationality, cognitive biases (like the planning fallacy and confirmation bias), and motivated reasoning. Culturally, the US military's “can-do” ethos and historical narratives about technology's role in victory reinforce belief in technological solutions. Organizationally, interservice competition and strategic misrepresentation by program managers seeking resources amplify optimistic projections. All of these factors combine to contribute to technological optimism in US military acquisitions. Consistent with Amara's Law, short-term impacts are overestimated, while long-term effects are underestimated. Similar patterns often emerge as defense technology entrepreneurs consider and present “new” technologies such as AI despite mixed results. Two policy imperatives are discussed: realistic assessment through independent reviews and phased investment, and decentralized experimentation enabling rapid local adaptation alongside traditional top-down innovation.

Throughout human history, technology has advanced military capabilities. Since 1945, US military planners have looked to technology-based “offsets” to enable the United States to prevail in armed conflict—and there is no question that US military forces today are far superior in capability to the numerically larger forces the United States fielded at the end of World War II. Nevertheless, the US military establishment has also exhibited what often seems an unreasonable degree of optimism about how useful new technologies will be for military purposes, even in the short run.

This article documents a pattern of American technological optimism about many military technologies; argues that a variety of psychological, cultural, and institutional factors influence its emergence; and concludes that the United States is often overly optimistic about technology in the short term. I then discuss policy imperatives for the acquisitions process that could help to mitigate excessive short-term optimism.

The Quest for Technological Overmatch Against Adversaries

Thomas Mahnken has written: “Reliance on advanced technology has been a central pillar of the American Way of War, at least since WWII. No nation in recent history has placed greater emphasis upon the role of technology in planning and waging war than the United States.”1 The canonical story about US reliance on technology begins with World War II, a “whole-of-nation” effort that mobilized the entire US economy and society, entailing enormous disruptions to society and daily life. In the aftermath of the war, the United States was understandably unwilling to maintain its wartime stance, and shrank its armed forces by some 85 percent by 1947 as the country returned to peacetime footing.

The Soviet Union, however, did not follow suit. As it became increasingly clear that the Soviets would emerge as the next geopolitical—and military—rival of the United States, the United States faced the problem of sustaining credible commitments in Europe and Asia without restoring a wartime military establishment. Rather than try to match the Soviet Union’s larger conventional forces man-for-man and weapon-for-weapon, the US in the 1950s sought to exploit technological advantages, above all nuclear weapons and their delivery systems.2 Nuclear doctrine—in particular, “massive retaliation”—aimed to offset Warsaw Pact manpower and armor with long-range bombers, ballistic missiles, and theater nuclear forces, promising deterrence and defense without the societal burdens of huge standing forces. Crucially, this approach was underwritten by a significant US advantage in deliverable nuclear forces during much of the 1950s, which made threats of nuclear use more credible.

But as the Soviet nuclear arsenal began to expand in the 1960s, the mutual vulnerability of the United States and the Soviet Union eroded the credibility of early nuclear-heavy threats, and US defense planners turned toward the pursuit of conventional overmatch. In the 1970s and 1980s, during what came to be known as the “second offset,” the US developed an array of high-technology weapons—including precision-guided munitions; stealth; highly capable intelligence, surveillance, and reconnaissance (ISR) and advanced sensing; and battle networks—in an effort to help the United States and NATO allies to hold massed Soviet armor and deep targets at risk with conventional forces, without relying on an increasingly non-credible threat of nuclear escalation.

The effectiveness of the second offset showed in the First and Second Gulf Wars. In the First Gulf War (1991), high-technology weapons—including precision‑guided strikes integrated with wide‑area ISR, stealth‑enabled air superiority, and networked, joint command and control (C2)—propelled coalition forces to a rapid 100-hour ground victory that expelled Iraqi forces (which were modeled on Soviet forces) from Kuwait with very low coalition casualties. In the Second Gulf War (2003), the same high-tech-enabled approach, now more mature, led to the swift defeat of Iraq’s regular forces and Republican Guard outside Baghdad, although later stability and insurgency challenges demonstrated the limits of technology‑enabled conventional overmatch in addressing objectives beyond battlefield defeat—namely, political ones.

Potential adversaries, notably China, have adapted lessons from American victories and pursued force modernization by integrating second-offset technologies. China has tailored these efforts to its regional security concerns, focusing on anti-access / area denial (A2/AD) strategies to keep US forces, especially carriers, distant from its mainland. Russia has taken a page from the first US offset, elevating the role of nuclear weapons through new systems and doctrine.3 Russia is further seeking to develop its own advanced military capabilities, particularly in areas like hypersonic weapons, artificial intelligence (AI), autonomous systems, and directed-energy weapons. Iran and North Korea have also pursued asymmetric strategies, investing in missile technology, cyber capabilities, and electronic warfare to offset US technological advantages.

To maintain technological overmatch against these adversaries, the US has embarked on a third offset. Less well defined than the second and first offsets, the Pentagon’s third offset strategy pursues US military superiority through next-generation technologies and concepts,4 including AI and autonomy, human-machine collaboration for more timely decision-making, and network-enabled autonomous weapons and advanced non-nuclear weapons such as directed-energy weapons, electromagnetic rail guns, and hypersonic missiles.

The spirit of the second offset continues as well. Advanced versions of technologies first deployed in the Gulf Wars continue to be the focus of efforts to prosecute joint warfare across multiple domains simultaneously through digital networked architectures and comprehensive interservice collaboration. The hoped-for result is an operational tempo that will enable friendly forces to observe, orient, decide, and act much faster than opponents can—making technology an attractive solution to process and synthesize vast, real-time data streams from multiple domains.

In addition to operational advantages, the particular high-technology weapons that are the focus of the second and third offsets have three other valuable attributes. First, long-range weapons keep US military personnel out of harm’s way, and their high accuracy at long distances ensures continuing military effectiveness. These weapons thus play to the well-documented American aversion to wartime casualties,5 and are part of the politically desirable package of minimal casualties and short, decisive wars. For example, then–Secretary of Defense Dick Cheney wrote in the 1993 Regional Defense Strategy that “our response to regional crises must be decisive, requiring the high-quality personnel and technological edge to win quickly and with minimum casualties.”6

Second, precision weapons can achieve high effectiveness against military targets with less collateral damage, as compared to less accurate weapons. Better ISR makes it easier—at least in principle—to distinguish between combatants and noncombatants. If one accepts these propositions, it is easier to wage war in compliance with the requirements of the laws of armed conflict,7 which mandate both distinction and proportionality during hostilities.

Third, most theories of deterrence by punishment posit that both the capability of carrying out a deterrent threat and the likelihood of it being executed should the line-crossing act be committed must be sufficiently robust for deterrence to hold. Because precision weapons—combined with good intelligence and battlefield awareness—make it possible to conduct military actions with high confidence in their success, it is argued that technological advances are embodiments of a nation’s capacity and willingness to project power and thus strengthen deterrence.8

Some argue that American culture exhibits a distinctive form of “technological utopianism” that presumes that technology is the most effective means of problem-solving.9 This thinking reflects a persistent belief that technological advancement can resolve a range of tactical and strategic challenges, including minimizing American casualties, reducing collateral damage, and compensating for the numerical superiority of adversaries.

Technological Optimism in National Security: Some Examples

Below are several military technologies for which claims of revolutionary impact have been made, focusing mostly on those with some contemporary relevance: ballistic missile defense, rail guns, hypersonic weapons, and directed-energy weapons. AI is omitted from this list, but is discussed separately as an example of a qualitatively different kind of technology.

Early Efforts in Ballistic Missile Defense

US efforts to develop defenses against strategic ballistic missiles have a long history. Early US ballistic missile defense (BMD) efforts began in the late 1950s with the Nike Zeus program, which called for nuclear-armed interceptors to counter Soviet intercontinental ballistic missiles (ICBMs). This program evolved into Nike-X in the 1960s, replacing a single interceptor with a longer-range Spartan missile and a shorter-range Sprint interceptor (both still nuclear-armed) using phased-array radars for area and point defense.

At the time, the technical feasibility was not in doubt. For example, in 1961, Richard S. Morse, director of research and development for the Department of the Army, testified that

[Nike Zeus is] the only practical system [to guard against missile attack] that is capable of deployment or the only system that anyone is viewing that can be operable in the next decade. . . . There is no single facet of the Zeus system that any competent technical person that I know questions.10

In 1966, Secretary of the Army Stanley Resor said:

The technical feasibility of the NIKE-X [anti-ballistic missile system] and the high probability that the design objectives will be achieved are generally accepted. All major engineering problems have been solved, and only minor design fixes are foreseen.11

Announced in 1967, the first operational deployment of BMD systems was the Sentinel program, which was intended to provide “thin” nationwide protection against China’s emerging ICBMs. In 1969, with a new administration, the program became Safeguard, which focused solely on defending US Minuteman ICBM silos from Soviet attack. One operational site was deployed at Grand Forks, North Dakota, in 1975, but was shut down within months due to excessive costs and doubts about its effectiveness against a large attack.

Rail Guns

Rail guns are advanced electromagnetic weapons that use electricity rather than explosive chemical propellants to accelerate projectiles to hypersonic speeds. By utilizing powerful magnetic fields generated along two parallel rails, these weapons achieve muzzle velocities far exceeding traditional guns, enabling projectiles to reach distances over 200 nautical miles with remarkable precision and speed.

Various experts and officials have described rail guns as revolutionary and transformative technologies with the potential to reshape naval warfare. Tom Boucher, a program officer at the Office of Naval Research (ONR), stated that electromagnetic weapons like the rail gun “will play a critical role in the future of naval warfare by providing greater lethality and greater economy than existing weapons,”12 emphasizing the rail gun’s capability to launch projectiles at speeds beyond conventional technology and calling this development a “revolutionary leap” in naval gun technology. Similarly, Dr. Elizabeth D’Andrea, ONR’s program manager for the electromagnetic rail gun, highlighted the weapon’s transformational potential, noting that it can deliver lethal hypersonic projectiles over extremely long ranges within minutes.13 Navy Cmdr. Joseph McGettigan echoed this by describing the rail gun as a “transformational solution for volume fires and time-critical strike,”14 underlining its operational promise for naval combat.

Industrial leaders have also framed rail guns as revolutionary. Chris Hughes, former vice president at BAE Systems, described the technology as “innovative and game changing,” with the power to “revolutionize naval warfare,”15 particularly by enhancing the armed forces’ ability to defend against threats at unprecedented distances.

Across various sectors, the rail gun has been consistently portrayed as a cutting-edge weapons system that, when fully realized, could dramatically extend reach, precision, and lethality, marking a significant technological leap beyond traditional gun systems.

However, the US Navy officially canceled its electromagnetic rail-gun development program in 2021 after spending about $500 million over fifteen years.16 Despite early enthusiasm and promising test results—including projectiles fired at hypersonic speeds and extended ranges—technical challenges proved insurmountable. The rail gun faced problems such as rapid barrel wear from intense heat, a slow rate of fire due to massive power requirements, and limitations in available electrical power on naval ships. These issues, coupled with cost concerns and the advancing maturity of alternative technologies like hypersonic missiles and directed-energy weapons, led the navy to shift funding and focus away from the rail gun.

Hypersonics

Hypersonic weapons are advanced maneuverable military weapons capable of traveling at speeds exceeding Mach 5, or five times the speed of sound. These weapons combine extreme velocity with exceptional maneuverability and precision-strike capabilities, allowing them to rapidly hit targets with high accuracy while evading traditional missile defenses. The speed of these weapons drastically reduces enemy reaction time, making them a formidable strategic and tactical asset on modern battlefields.

Various experts and organizations have described hypersonic weapons using transformative and revolutionary language. Lockheed Martin, a key defense contractor, has stated: “We are developing game-changing hypersonic solutions to ensure our customers are always ready for what’s ahead,” underscoring their belief that “hypersonic systems are a game-changer for national security.”17 Similarly, Emergen Research notes that hypersonic weapons, with their unparalleled speed, stealth, and rapid response capabilities, are “revolutionizing modern warfare,” enabling militaries to strike swiftly and accurately.18 A prominent management consulting firm echoes this view, describing hypersonic weapons as “widely considered a revolution in modern warfare” due to their ability to combine high speed with precision strikes capable of penetrating advanced defense systems.19

Beyond commercial marketing, military experts have shown enthusiasm for these weapons’ disruptive potential. The US Army highlighted the long-range hypersonic weapon (LRHW) as a “game-changing capability” that supports rapid precision strikes against emerging threats.20 Vice Adm. Johnny R. Wolfe Jr., director of the navy’s Strategic Systems Programs, emphasized how the army and navy’s hypersonic missile partnership has produced a “transformational hypersonic weapon system” that delivers unmatched joint warfighting capabilities.21 Andrew F. Krepinevich Jr., an expert on military innovation, suggests that precision-accurate hypersonic weapons “may trigger shock waves in the strategic balance,” highlighting their potential to fundamentally alter deterrence and conflict dynamics.22

These characterizations collectively illustrate how hypersonic weapons are widely portrayed as revolutionary and transformative technologies with the potential to reshape military strategy, defense posture, and global security frameworks.

Nevertheless, only one nation has actually used hypersonic weapons in combat to date—Russia in its war against Ukraine. In that conflict, the use of hypersonic weapons has hardly proven decisive. Many have been successfully intercepted by Ukrainian defense forces, and the actual damage caused by those not intercepted has not been significantly greater than that caused by conventional ballistic missiles.

Directed-Energy Weapons

Directed-energy weapons (DEWs) employ concentrated electromagnetic energy, such as high-energy lasers and high-power microwaves, to damage or destroy targets with high precision at the speed of light. Unlike traditional kinetic weapons that rely on physical projectiles, DEWs can engage threats with scalable and controlled effects, often providing advantages like deep magazines, low cost per shot, and minimal collateral damage.

Various defense experts, government officials, and industry leaders have described DEWs as revolutionary and transformative for national security. A Defense Science Board report said that “investments in HEL [high-energy laser] technologies are expected to transform warfighting, enabling revolutionary advances in engagement precision, lethality, speed of attack, and range, while minimizing collateral damage and complementing precision munitions capability.”23 The Congressional Research Service underlined that DEWs “have the potential to change the very nature of warfare and could have implications for US national security.”24 Former Secretary of the Air Force Sheila E. Widnall said: “It isn’t very often an innovation comes along that revolutionizes our operational concepts, tactics, and strategies. You can probably name them on one hand—the atomic bomb, the satellite, the jet engine, stealth, and the microchip. It’s possible the airborne laser is in this league.”25 (The US Airborne Laser [ABL] was a high-energy laser-weapon system developed primarily by the Missile Defense Agency and carried on a modified 747 aircraft. Megawatt in class and fueled chemically, it was designed to intercept ballistic missiles during their boost phase from the air.)

From the think-tank world, the Lexington Institute observed that “an assessment of the current state of US directed-energy technology and its potential to change the nature of modern warfare must conclude that directed-energy weapons are the essence of transformation,” and that “directed-energy weapons offer the potential for the most dramatic transformation of modern militaries since the advent of electronics and possibly even gunpowder,” emphasizing revolutionary capabilities such as long-range, speed-of-light engagement and deep ammunition magazines.26

Officials and contractors have echoed these sentiments. Mark Spencer, director of the Joint Directed Energy Transition Office, called DEWs a “game-changing technology,”27 while Congressman Tim Ryan pointed to their wide-ranging threat applications, spanning from hypersonics to unmanned aerial systems.28 Industry sources also underscore this revolutionizing role: Boeing described its Mk 38 Mod 2 Tactical Laser system as “revolutionary because it combines kinetic and directed energy weapons capability.”

Collectively, this body of authoritative commentary—going back nearly twenty years—frames directed-energy weapons as pivotal advancements poised to reshape warfare and strengthen national defense. DEWs may yet become an important (though niche) component of the US arsenal for all of the reasons described above. Nevertheless, at the time of writing, no DEW has been fully deployed by the United States in an operational status; furthermore, no operational battlefield kill using a laser against an adversary target has yet been documented by any nation. The “revolutionary” ABL was mothballed in December 2011.29 The reason offered by Robert Gates, then secretary of defense, was that the ABL was the prototype for “a fleet of laser-bearing 747s circling slowly inside enemy air space to get off a shot at a missile right after launch,” an operational concept that did not pass the commonsense test.30

Technological Optimism: Warranted or Not?

On the one hand, looking at the long arc of technology applied for military purposes, it is undeniable that military technologies have improved—often dramatically—over time. No one would doubt that today’s airplanes or aircraft carriers are far superior to those in World War II. No one would doubt that military communications and surveillance are vastly different and far superior today than they were in World War II. And no one would doubt that the technologically enabled US military of today far outclasses the numerically superior US military of 1945.

On the other hand, the history of hyperbolic rhetoric about revolutionary or transformational capabilities, taken alongside the large number of major weapons-acquisition programs that are delayed, run over budget, or are cancelled after considerable expenditure, strongly suggests that the original promises made to justify these programs were often not well founded, and that policymakers erred in not being more skeptical of those promises from the outset. The examples above suggest strongly that, at least some of the time, technological optimism is not entirely justified.

So are we excessively optimistic about technology, or too pessimistic in not fully recognizing the promise of technology to advance operational military capabilities? An error in either direction carries nontrivial costs. Being overly optimistic builds false hopes and distracts us from focusing on improvements in military posture that will help to win the battles on the immediate horizon. Being overly pessimistic prevents us from adapting to a conflict environment that changes more rapidly than we expect.

In fact, we are both too optimistic and too pessimistic. Amara’s Law, proposed by futurist Roy Amara, holds that “we tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run.”31 That is, we are both overly optimistic about technology (in the short term) and overly pessimistic about it (in the long term).

Why? In early phases of adoption, forecasts typically emphasize first-order effects: direct, observable applications of an innovation. Not all of these will bear fruit; limitations imposed by present technical capacities in related domains, infrastructure, and complementary assets often lead to inflated expectations in at least some areas that do not materialize as anticipated. This phase of inflated expectations and subsequent disappointment lays the foundation for longer-term pessimism. In the long term, however, many difficulties in the implementation of good ideas are worked out. Moreover, second- and third-order effects, many of which are indirect, systemic, and emergent—and hence difficult to foresee—begin to manifest themselves.

Former Deputy Secretary of Defense Kathleen Hicks has pointed out that the most important military use of electricity was not as a weapon to electrocute enemy troops (which would have been a first-order use case).32 Instead, the main military value of electricity has come in capabilities offered by radios, radar, semiconductors, satellites, night-vision goggles, data links, battle networks, precision-guided munitions, and avionics that enable dagger-shaped stealth bombers to fly—all of which are second- and third-order-use cases.

The story of GPS provides a case in point. In the program’s early stages in the 1970s, skepticism about GPS was significant.33 Amid persistent doubts about its technical feasibility, a Government Accountability Office report in 1977 criticized the proposal (then called NAVSTAR) on the grounds that the user community had not formally established its need for a new capability.34 At the outset of the First Gulf War, manpack GPS receivers were heavy and bulky—eight kilograms and difficult to use. Today, however, GPS is an integral part of nearly every mobile platform or delivery system for the United States—and the US military is concerned about being overly dependent on GPS.

Stories similar to GPS could be told across all of the technologies of warfare (see box 1). Limited initial success is followed by a rise in effectiveness as operational experience accumulates, bugs are fixed, and new use cases emerge. What was once a hassle becomes an integral and essential part of current practice. (Conversely, we hear little today about innovations whose operational experience demonstrated that the game isn’t worth the candle.)

Box 1. Evolution in the technologies of warfare over centuries and millennia.

Firepower has evolved from simple projectile weapons like bows and arrows to advanced systems such as tanks, missiles, and nuclear weapons. Recent decades added precision-guided munitions and autonomous weapons for greater efficiency and reduced collateral damage.

Weapons performance parameters such as range and accuracy have improved from human-limited weapons to gunpowder rifles and artillery with extended distances. Modern precision-guided weapons can strike within meters even at ranges of hundreds to thousands of kilometers.

Maneuverability has improved from reliance on human and cavalry movement to ships, trains, motor vehicles, aircraft, and submarines. Now spaceflight and unmanned vehicles enable operations across unprecedented distances and new domains.

Battlespaces have expanded from confined ancient battlefields to global naval and air domains with colonialism and technological advances. Today, cyberspace and the electromagnetic spectrum broaden conflict arenas to physical and virtual multi-domain environments.

Information and decision-making have evolved from early military communications reliant on messengers, flags, and semaphores to innovations such as telegraph and radio. Today, reconnaissance aircraft, satellites, drones, and AI-assisted data fusion enable real-time global intelligence and rapid, coordinated decision-making across dispersed forces.

This history is a clear rebuttal to any claims that technological innovation is inherently undesirable. Technology-based offset strategies were in fact valuable in closing resource gaps that have been present since the end of World War II. But the claim that all technological innovation is ipso facto desirable is not true either. Underlying technological optimism is a sentiment that innovation is inherently desirable—which, in a military context, means that innovation naturally strengthens military power.35 But this proposition can only be true in cases of successful innovation. As a variety of cases discussed above suggest, not all innovations succeed. Unsuccessful innovations can diminish military power, at the very least because of opportunity costs; resources spent to acquire and deploy an unsuccessful innovation, which by definition does not contribute to combat power, could have been spent on something else that would have improved military power. In some cases, standing capabilities that contribute to military power may be underfunded and left to atrophy, while resources are diverted to support unproductive innovations.

An innovation may also be technically successful, but it is not useful or important if it cannot be deployed in a way that significantly contributes to military power. Many of the “wonder weapons” of Germany during World War II—including the V-1 “buzz bomb,” the V-2 ballistic missile, and the Me-262 jet fighter—indeed had capabilities significantly beyond those of ordinary weapons.36 Developing these newer weapons, however, had a high material and human cost, which drained resources from traditional and more proven military programs. The high cost, technical difficulties, and late deployment of these weapons meant that Germany overcommitted to unproven technologies at the expense of its conventional arms. The gains from these wonder weapons did not make up for the loss in conventional capabilities that could have been maintained had Germany followed a different investment path.

Kuo argues that innovation is most risky when expanding security commitments exceed available resources—a situation that has often applied to the US armed forces.37 Under these conditions, policymakers are tempted by the promise of greater “bang for the buck” as a way to close the gap between available resources and needed capabilities. These circumstances are a close-to-ideal environment in which wishful thinking and excessive technological optimism can easily take root.

Overoptimism can be recognized most easily in retrospect, a canonical example of which is the fallacy of the last move (FLM).38 FLM is the mistaken belief that a single technological or strategic action—such as deploying a new weapons system—will provide a permanent, decisive advantage. This fallacy ignores the dynamic and iterative nature of military competition, where adversaries continuously develop countermeasures and innovations, making any “last move” effectively impossible. At best, such deployment will result in a temporary advantage that will last until the adversary develops and deploys an effective counterresponse that either offsets or negates the advantages afforded by the innovation or eliminates the first deployer’s monopoly. But for strategists, policymakers, and warfighters, the temptation to believe otherwise is strong.

One prominent FLM example is the deployment of the multiple independently targetable reentry vehicle (MIRV) in the early 1970s. MIRV technology allowed a single missile to strike multiple, widely separated targets. US policymakers initially advocated MIRV as a decisive counter to Soviet numerical parity in strategic missiles and anti-ballistic missile (ABM) systems, operating under the assumption that American miniaturization capabilities would remain unrivaled for a generation. In 1971, Secretary of Defense Melvin Laird called MIRV the last frontier of US advantage in the strategic nuclear field.39

The Soviet Union, however, was in fact able to respond effectively to this US innovation in short order. The Soviets developed their own MIRV technology and coupled it with their own missiles, which had heavier payload capacity than US missiles and thus could carry more warheads per missile. This development ultimately led to a large increase in the total number of nuclear warheads. Even worse, MIRVed land-based ICBMs on both sides gave rise to a strategic instability in which each side could destroy the other’s ICBM force with only a fraction of its own ICBMs—a condition that provided incentives for each side to go first in a strategic crisis. This risk would drive US nuclear concerns for many years to follow and continues today.

Influences That Promote Excessive Technological Optimism

In the context of this article, excessive technological optimism is a tendency to ignore or downplay information that, if properly taken into account, would moderate claims of future success. This section addresses how cultural, psychological, and institutional factors contribute to excessive technological optimism.

Psychological Influences

The finite cognitive capacity of the human brain—constrained by limited attention, working memory, time, and information—drives certain important decision-making mechanisms. These mechanisms promote efficiency over systematic analysis, often fueling technological optimism in weapons-system acquisition by favoring rapid, positive conclusions.

First, bounded rationality emerges from cognitive limits that prevent full exploration of options under resource constraints like time and incomplete data.40 Decision-makers satisfice, settling for “good enough” solutions rather than optimizing, especially when facing multidimensional trade-offs (like cost, schedule, performance, and risk). For a hypersonic missile, program managers under time pressure might approve accelerated timelines based on early tests, focusing only or primarily on one salient, important, or defensible attribute (for example, speed or maneuverability) rather than fully weighing trade-offs among them (for example, cost or ease of maintenance).41

“Biases are the systematic errors in thinking that arise from the use of heuristics.”

Second, cognitive economy refers to the conservation of mental resources via heuristics, shortcuts, schemas, and familiar narratives, trading accuracy for efficiency in complex environments.42 This approach embeds systematic biases in any decision-making process but enables quick navigation amidst information overload. (Heuristics are mental shortcuts that reduce mental effort and enable quick decision-making under uncertainty by simplifying complex problems. Biases are the systematic errors in thinking that arise from the use of heuristics. Box 2 describes some cognitive biases that result from the use of heuristics.) Applied to a new AI-integrating drone, cognitive economy might predispose decision-makers to draw on memories of past drone successes, ignoring unique cyber vulnerabilities and projecting seamless integration with non-AI force elements.43

Box 2. Cognitive biases that may be relevant to excessive technological optimism.

Biases are the systematic errors in thinking that arise from the use of heuristics, which are mental shortcuts that enable quick decision-making under uncertainty by simplifying complex problems.* These biases include but are not limited to:

The planning fallacy leads developers to underestimate timelines, costs, and risks for novel systems by focusing on best-case scenarios, ignoring historical overruns from similar projects. For example, teams might project deployment of a $175 billion comprehensive missile defense for the population of the continental United States in three years despite past programs exceeding budgets by billions and some failing to deliver.

Illusions of control foster overconfidence in mastering unpredictable tech trajectories, especially among high-status leaders who overestimate influence. Pentagon officials might assume that tight oversight will tame autonomous drone swarms' emergent behaviors, setting aggressive timelines despite integration uncertainties.

Attribution errors cause decision-makers to internalize successes (in skill and control) while externalizing failures (unforeseen issues), building unchecked optimism. For example, early prototype hits for a laser weapon might be credited to team genius, with test glitches blamed on weather, ignoring systemic flaws and amplifying the desire to pursue the weapon.

Confirmation bias drives selective evidence-seeking, prioritizing supportive data while dismissing contradictions. Advocates for AI targeting systems highlight benchmark successes as proof of reliability, downplaying edge-case failures as anomalies, sustaining belief in flawless field performance.

WYSIATI bias ("what you see is all there is") confines judgments to readily available info, neglecting unknowns or absent evidence. Briefings on a new missile-defense interceptor focus on demonstration intercepts, overlooking untested countermeasures such as decoys, yielding overconfident readiness claims about the progress of the technology.

* Daniel Kahneman, Thinking, Fast and Slow (Farrar, Straus and Giroux, 2011), 7.

Third, the need for cognitive closure (NFC) refers to an individual’s need for firm answers to reduce ambiguity, prioritizing quick judgments that halt resource-intensive deliberation. Those with strong NFC often seize on early information and freeze on initial conclusions, reducing cognitive strain but increasing the risk of premature decisions and resistance to updating beliefs when confronted with new or contradictory evidence. Since programs don’t generally exist unless there was some measure of positive information to support their start, preliminary evidence is always available and is usually positive. If impressions derived from such evidence persist, the risk of overoptimism (as the result of ignoring subsequent negative information) increases.

Fourth, the dual-process model of cognition described by Daniel Kahneman posits two modes of thought: System 1, which operates automatically, rapidly, and with minimal effort, dominating by default to spare cognitive load; and System 2, which is slow, deliberate, logical, and analytical, and is needed for complex reasoning, problem-solving, or carefully evaluating evidence.44 Because visual processing of information is a System 1 function (and analysis a System 2 function), one can easily imagine that animated videos of new weapons-system concepts are more readily assimilated into the consciousness of an acquisition officer than a briefing paper addressing pros and cons. One of the most important aspects of System 2 thinking is that it must be deliberately invoked or activated. In the absence of explicit invocation, the always-on System 1 can lead individuals to premature conclusions. But even the use of System 2 is not a guarantee against error, as illustrated by the following phenomenon.

Fifth, motivated reasoning is a cognitive bias in which individuals’ desires, emotions, and preexisting beliefs shape how they interpret and evaluate information, often leading them to favor conclusions that align with what they want to believe rather than those supported by objective evidence.45 This reasoning often begins with an affective reaction to something—for example, a statement that a person wants to believe or disbelieve.

If the circumstances are such that belief or disbelief in the statement is warranted, the person may then invoke a reasoning process that justifies belief or disbelief. In particular, motivated reasoning is demonstrated when a person asks “Can I believe this?” when encountering information that supports what one wants to believe, but asks “Must I believe this?” when encountering information that contradicts one’s preferences.46 In other words, people readily accept minimal evidence for desired beliefs, but require overwhelming proof before accepting undesired ones. Motivated reasoning is still rational in that it requires evidence and rationale, but the evidentiary standards are asymmetric for affective reasons. Motivated reasoning is common in large projects that are managed by ostensibly rational planning and execution processes.47

Motivated reasoning provides a plausible account for the FLM. It seems intuitively obvious that an adversary would attempt to counter the deployment of a superweapon directed against it, so why might decision-makers ignore or downplay such a possibility? In the MIRV case, decision-makers had strong motivations to believe that the US innovation would not be matched by the Soviets for a long time, and they were thus able to deploy a variety of psychological mechanisms to justify that belief. For example, the belief in a “last move” is a cognitively economical strategy that promises decisive advantage and provides psychological comfort and clear resolution for decision-makers, strongly appealing to the need for closure.

Together, these psychological tendencies make the fallacy of the last move a common error. Under stress and information overload, decision-makers gravitate toward neat, decisive solutions that promise closure, while minimizing mental effort by employing oversimplified, static models of competition. This approach promotes overconfidence in technological “silver bullets” or one-off strategic breakthroughs that blind decision-makers to the move-countermove cycles of strategic competition.

Cultural Influences

Culture shapes cognition and decision-making by influencing the mental frameworks and heuristics that individuals use to interpret information and solve problems.48 Cognitive styles can vary significantly across cultures, affecting decision-making strategies and how people evaluate risks, weigh alternatives, and make choices. Heuristics are susceptible to cultural influence,49 as their adoption depends on culturally shaped perceptions of risk, uncertainty, and social expectations. For instance, cultural differences in tolerance for ambiguity and need for cognitive closure can modulate the extent to which individuals rely on heuristics versus systematic processing.50

Consider how this fundamental point might apply in a military context. Heidi Demarest, Tyler Jost, and Robert Schub argue that “the military’s organizational culture, rather than domain-specific expertise, [is] an important factor underpinning bureaucratic optimism about cyber coercion” and hence, by inference, optimism about the military value that high-tech cyberweapons afford.51 Similarly, Jon Lindsay finds that the path from technological innovation to improved military outcomes is neither deterministic nor linear, emphasizing that how people and organizations interact with technology mediates its actual effects.52 Austin Long argues that doctrinal approaches—in particular about counterinsurgency—become central only when they can be framed in ways that fit deeply rooted service beliefs about what “real” warfighting looks like and about whom the organization exists to serve.53 But while Long focuses on counterinsurgency doctrine, his framework implies a broader principle: Service culture acts as a powerful mediator for valuing any innovation—including technology—based on its perceived alignment with deeply rooted beliefs about the service’s purpose and “real” warfighting. Similarly, Adam Yang’s “culture-innovation selection model” posits that warfighting culture (a service’s dominant belief about how wars should be fought) drives organizational responses to innovations, predicting acceptance or promotion only when cultural congruence is high.54

One path of cultural influence is how the culture in question regards the lessons that history affords in defining the enduring narratives and foundational beliefs that are embedded in social traditions, education, and institutions. For example, the surprise attack at Pearl Harbor on December 7, 1941, generated historical lessons—indeed, historical trauma—that guided US nuclear strategy for decades into the Cold War in preparing for a “bolt from the blue” attack (for more on this, see Francis Gavin’s essay in this Roundtable).

For the purposes of this article, another salient historical lesson from World War II is the key role that the technologies of radar and the atomic bomb played in the Allied victory over the Axis powers. Radar revolutionized warfare by enabling early detection of enemy aircraft and ships, improving air-defense coordination, and enhancing navigation and targeting. From the Battle of Britain to the Battle of the Atlantic and in the Pacific theater against Japan, radar served as a major force multiplier for Allied forces. As for the atomic bomb, its use against Hiroshima and Nagasaki was widely believed to have been the direct cause of the Japanese surrender.

The actual utility of these technologies in ending the war and leading the Allied forces to victory (the atomic bombings in particular, but to a lesser extent the role of radar as well) is debated by scholars. There is, however, no question about widespread belief in their efficacy. It is not difficult to draw a line from this historical narrative that points toward technology as a decisive force in warfare, and from there, to optimism about the promises it affords.

Culture also shapes values. An important value in the US military is the “can-do” ethos,55 which calls for an unwavering determination to place the mission first, never accept defeat, never quit, and never leave a fallen comrade. This ethos shapes soldiers’ attitudes and actions, driving them to push relentlessly to accomplish missions despite obstacles and to persevere through adversity.

“An important value in the US military is the 'can-do' ethos, which calls for an unwavering determination to place the mission first, never accept defeat, never quit, and never leave a fallen comrade.”

Seen in the context of using technology to provide a particular military capability, it is easy to see how the same can-do attitude applied to assessments about emerging and unproven technologies could lead to overly optimistic judgments. In this environment, cautions about what the laws of physics allow (rarely) or what engineering practice or economics could support (more often) can be downplayed or ignored as the natterings of naysayers and Luddites. The can-do mindset thus biases program managers and advocates toward attempting the seemingly impossible, thereby risking operational failures, unnecessary time delays, or even eventual program cancellation.

Two examples are instructive. One is the Future Combat System program, the US Army’s early-2000s effort to field a “system of systems” that coordinated lighter manned vehicles, unmanned air and ground platforms, sensors, and precision munitions to create highly deployable, information-driven combat brigades. The program was canceled after at least $19 billion had been spent on it; a RAND report found a mindset in the program that “managers needed to maintain a positive attitude [emphasis added] to keep people motivated rather than dwelling on problems.”56

A second example of can-do being reflected in assessments of a program is the Strategic Defense Initiative (SDI), the Reagan administration effort to create a multilayered system of space- and ground-based defenses to intercept Soviet ballistic missiles and shield the entire United States from nuclear attack. At its zenith, the concept envisioned an integrated network of advanced sensors, satellites, directed-energy weapons, and non-nuclear interceptor missiles that would detect launches, track warheads, and destroy them before they could reach American soil.

Gen. James Abrahamson, the director of the SDI program, asserted that “we can do just about anything, technically, if we just decide to do it.”57 Secretary of Defense Caspar Weinberger equated desirability with feasibility, saying that “the goal of strategic defense is so eminently desirable that we can and will find solutions to any problems that might develop along the way.”58

In May 1993, the orientation of the SDI Organization (widely known then as SDIO) shifted to emphasize developing and fielding advanced theater missile defenses, and the organization’s name changed to the Ballistic Missile Defense Organization.59 It is noteworthy that after spending tens of billions of dollars on SDI,60 no defensive weapons were ever deployed as the result of the program.

Organizational Influences

Still another set of influences on optimism arises from the fact that military technologies emerge from and are adopted by bureaucracies. When an individual embedded in these institutions makes a statement that appears to express an opinion about a given technology, it is conceptually important to distinguish between a statement that expresses the individual’s true opinion and another that reflects some bureaucratic or institutional interests. A priori, these two statement types are not necessarily identical.

For example, a program manager’s foremost concern for an innovative military technology is the survival of the program developing it. Since a program cannot prove its worth if terminated early, the manager seeks to protect it from other programs that are competing for the same budget dollars. Under these circumstances, it makes little sense for the manager to be fully candid about problems with the program or the technology in question. Instead, the manager’s reports are likely to be at least cautiously optimistic. A similar example is a politician expressing optimism for a technology that—entirely “coincidentally”—also entails the creation of many jobs in their home district.

One term used for this phenomenon is strategic misrepresentation—the deliberate and systematic distortion of predictions of the future for strategic purposes, such as to secure project approval or funding. Here, a decision-maker’s optimism about a program or a technology is driven by what they have to gain by securing resources to pursue it.61 Other terms that could be used include “insincere optimism” or “staged optimism”—the adjectives are used to describe something other than a candid assessment from the individual about the prospects for a technology.

Alternatively, strategic misrepresentation could be an illustration of motivated reasoning—the program manager or decision-maker has a coherent, logical narrative to explain optimistic conclusions and plausible ways of explaining negative evidence as genuine anomalies, even if more disinterested analysis would be less optimistic. In this case, the program manager or decision-maker believes the story being told. Because it is impossible to know with certainty whether the manager offering optimism about a technology in public believes the same in private, external observers are forced to take the proffered statements at face value, so the effects at an institutional level may be identical.

The stressful conditions under which decision-making occurs in military institutions also influence how individuals process information. Research on decision-making demonstrates that elevated pressure tends to increase reliance on intuitive, heuristic-based reasoning, thereby amplifying susceptibility to cognitive errors.62 Defense acquisition presents stressful conditions of uncertainty, urgency, and organizational complexity. Acquisition leaders and managers confront stringent time constraints, incomplete information, and weighty responsibilities tied to national security, technological innovation, and fiscal stewardship. These factors combine to create a cognitively demanding environment more conducive to the emergence of systematic biases.

Moreover, interservice and intraservice competition constitute some of the most persistent features of peacetime defense affairs, as the various branches vie for budgets, missions, and institutional influence. This phenomenon was noted by Samuel Huntington, who observed that “the less money there [is] in the military budget, the more intense and bitter [is] the competition of the services for it.”63 Similarly, Emilie Berthelsen argues: “During peace, defeat in combat is an abstract risk. External civilian budget interventions are on the other hand perceived as an imminent threat to the military organization.”64 Indeed, Robert Gates, secretary of defense through much of the Afghan war, suggested that, despite being at war, much of the Department of Defense (DoD) was “preoccupied with future capabilities and procurement programs, wedded to lumbering peacetime process and procedures, stuck in bureaucratic low-gear [while t]he needs of those in combat too often were not addressed urgently or creatively.”65

In this environment, program managers, service branches, and their advocates orient their strategic behavior toward securing resources and preserving institutional relevance. These contests unfold both within the defense bureaucracy and in interactions with civilian policymakers, as each actor seeks to defend programs, justify expenditures, and shape the direction of future acquisitions.

Within the defense acquisition context, decision-makers frequently exhibit stress-induced cognitive biases such as the planning fallacy (the tendency to ignore what is known from previous, similar efforts in favor of what is different this time), optimism bias (overestimates of the likelihood of positive outcomes and underestimates of negative outcomes), recency bias (an undue tendency to weight recently acquired information more heavily than warranted), and trade-off bias (a reluctance to make trade-offs because of the difficulty of evaluating them).66

Acquisition professionals must balance competing institutional demands, reconcile ambiguous evidence, and operate within shifting political and strategic constraints—conditions that heighten the potential for flawed assessments and suboptimal judgments. Moreover, bureaucratic and organizational tensions introduce additional complexities that often assume an adversarial character among stakeholders. Consider in particular a common joke heard in service circles: For the navy, the adversary might be Russia or China, but the true enemy of the navy is the air force. (Of course, the names of the service branches are interchangeable.)

The stakes of such competition are extraordinarily high because budget and mission control dictate a service’s ability to innovate, modernize, and maintain operational readiness for future conflicts. This environment transforms peacetime into a continual contest for political influence, institutional prestige, and future performance—effectively turning internal rivalry into the central, and arguably most difficult, activity for military services outside of war.

Optimism 2.0: AI and the Defense-Technology Entrepreneurs

The technological optimism discussed in earlier sections generally refers to technology produced by legacy defense contractors. But in the past decade or so, a new generation of what could be called defense-technology entrepreneurs (DTEs) has sought to introduce a new paradigm for the acquisition of military capabilities.

This new generation shares many of the criticisms of the legacy contractors outlined above. DTEs contend that they are better positioned to deliver (and therefore more optimistic about delivering) on promises of improved military capability—a claim rooted in their business model and design philosophy rather than an established track record.

In their view, the cost-plus contracts favored by legacy contractors reward drawn-out complexity and schedule slippage over functional outcomes. DTEs claim to invert this model by investing private capital upfront to develop platforms that they describe as ready for immediate military use—systems that coordinate specialized hardware with militarily relevant attributes.

In the DTE universe, early product iterations deliver a minimum viable capability, with rapid, field-driven adaptation—ostensibly measured in weeks rather than years—enhancing performance over time. The DTE approach is centered on architecture: Modular systems connect hardware components that can, in principle, be reconfigured rapidly—ideally in near-real time—to respond to battlefield surprises. Over-the-air updates, they argue, transform hardware into rapidly evolvable platforms capable of generating attritable mass and integrating legacy assets into modern kill chains.

To understand the DTE technological worldview, it is helpful to distinguish between two categories they emphasize: artifactual and architectural technologies. Artifactual technologies provide raw physical capabilities (effectors causing physical effects), setting performance ceilings through hardware metrics like range and lethality. Directed-energy weapons (DEWs), hypersonics, rail guns, the Future Combat System, and defense against strategic ballistic missiles exemplify this category.

By contrast, architectural technologies provide the process intelligence needed to organize existing artifacts for desired outcomes—for example, command-and-control frameworks synchronizing capabilities to operational environments. The assembly line revolutionized industry not through new machines or tools but through synchronized production flow; the shipping container—nothing more than a low-tech steel box—standardized intermodal interfaces among the diverse hardware of ships, cranes, and trucks, transforming the entire logistics system beyond individual transport platforms.

“DTEs cite the Russia-Ukraine war as inspiration for and vindication of their approach.”

This distinction usefully highlights different sources of competitive advantage.67 However, the capabilities offered by many military systems in practice depend on both artifactual and architectural elements; the categories should be seen as analytic poles rather than empirical bins.

DTEs cite the Russia-Ukraine war as inspiration for and vindication of their approach. For example, Ukrainian forces have employed inexpensive reconnaissance drones not to destroy tanks directly but to enable cheap artillery strikes and loitering munitions—an example, in their telling, of coordination trumping artifactual sophistication.

DTE advocates frame AI as the paradigmatic architectural technology—a general-purpose enabler applicable wherever better information contributes to military success. This framing supports their broader argument that software-defined, rapidly iterable systems will dominate future conflict. AI is thus positioned as quite different in character from DEWs, rail guns, or hypersonic weapons; in this view, AI has a stronger claim to revolutionary significance than any individual artifact.

Senior officials have echoed this transformational framing, lending rhetorical weight to DTE claims—though such statements are better understood as expressions of strategic priority than as validated assessments of AI’s operational impact. For example, Gen. James Rainey, commanding general of Army Futures Command, emphasizes the unprecedented magnitude of this change by comparing it to historic turning points: “To say the period that we’re living in right now is disruptive would be like an epic understatement. Different people use different analogies, but I think what we’re witnessing right now is as significant as the nuclear arms race. Potentially, even the Industrial Revolution.”68

Jake Sullivan, national security advisor to President Biden, described AI as a technological leap unlike any the country has seen before: “I’ve had to grapple with AI and its implications for national security . . . about what makes it so potentially transformative and about [the challenges that] make it different from other technological leaps our country has navigated before, from electrification to nuclear weapons to space flight to the Internet.”69 Extending this theme of transformation, David Sacks, AI and crypto czar for the present Trump administration, highlighted AI’s profound strategic importance: “Artificial Intelligence is a revolutionary technology with the potential to transform the global economy and alter the balance of power in the world.”70

These statements collectively depict AI not merely as an incremental technological improvement, but as a game-changing, paradigm-shifting force that demands urgent adaptation and leadership in both military innovation and national security policymaking. Whether such rhetoric reflects sober analysis or bureaucratic positioning—or both—is difficult to disentangle. But precisely because AI is so broad in scope, its success or failure must be judged case by case.

This case-by-case standard can be illustrated through several present AI deployments, which exhibit both genuine utility and sobering limitations. One success is the FORGE mission data processing (MDP) capability, which integrates AI/machine learning (ML) throughout the processing stack of infrared early-warning satellites to enhance image processing and signal detection.71 This measure dramatically increases missile-warning sensitivity, enabling earlier detection and longer tracking of stealthier threats.

Second, an AI/ML-powered radar warning receiver (RWR)—the Cognitive Algorithm Deployment System—has been installed on a fourth-generation fighter aircraft.72 The system uses AI and ML to autonomously identify and classify radar threats in real time, outperforming traditional RWRs by processing complex electromagnetic signals faster and more accurately.

Third, GAMECHANGER is a DoD AI-powered tool that leverages natural language processing to search, analyze, and synthesize information from thousands of policy documents, supporting noncombat functions like compliance checks and decision-making.73 GAMECHANGER is intended to enable program officers to understand their programs through the lens of higher guidance and to help identify policy gaps and requirements.

Fourth, the US Air Force’s Predictive Analytics and Decision Assistant (PANDA) applies AI and ML to aircraft maintenance data for predictive analytics.74 It monitors thousands of components across over 3,000 aircraft, detects anomalies, predicts failures, optimizes parts availability, and reduces downtime by enabling proactive repairs, thereby boosting mission readiness.

On the failure side, consider Project Maven, a Pentagon initiative launched in 2017 to integrate AI and machine learning into military intelligence workflows by automating the analysis of drone and satellite imagery for rapid target identification and location, flagging potential threats like vehicles or buildings for human analysts. In a test of an experimental target-recognition algorithm looking for surface-to-surface missiles, the system was trained on data obtained from one sensor detecting a single missile at an oblique angle.75 However, when the system was tested in a different scenario—multiple missiles viewed at a near-vertical angle—it worked only around 25 percent of the time. Alarmingly, the system reported 90 percent confidence in its wrong predictions.

In another example, a target-identification algorithm for the Ripsaw robotic tank package was described as being able to differentiate between armed and unarmed individuals and yet was shown as marking a tree with the identical recognition box applied moments earlier to a person walking nearby in a parking lot.76 Another military recognition system, intended to enhance force protection and perimeter security, was trained on gait recognition and human movement patterns. However, a squad of marines was able to evade detection through the use of unconventional tactics, such as hiding under cardboard boxes, holding a small tree in front of themselves, and performing continuous somersaults over distances exceeding 1,000 feet.77

Another key distinction between artifactual and architectural technologies is that what is possible with the former is constrained by the laws of physics and the material realities of production, while what is possible with the latter is constrained by the limits of human imagination. Thus, artifactual technologies tend to evolve continuously and incrementally—the tank, the airplane, the aircraft carrier, and the submarine of today are vastly different in capability from, say, their 1926 counterparts, and yet these weapons would be recognizable by the soldier or sailor or airman of that time. Because the limits on architectural technologies are far less tangible, changes can be made less expensively and more rapidly (no “bending of metal” needed) and radical change is more feasible, at least in principle.

The optimism of DTEs stems from the malleability (in principle) of software‑defined systems. DTEs emphasize software’s low marginal cost—iterative updates without the need for factory retooling—as a force that allows startups to challenge legacy defense incumbents, much as Silicon Valley disrupted (but notably did not render obsolete) Detroit.

It remains to be seen whether the “in-principle” malleability of software touted by DTEs manifests in practice. Whether the DTE model will scale to major programs, for example, remains contested: the model’s commercial iteration tempo faces friction from military certification, security classification, and operational testing timelines, leaving the gap between vision and validated practice unclosed at scale. Critics further note that Ukrainian successes hinge on Western intelligence architecture, sustained logistics, and hard-won operator expertise—factors not easily replicated by commercial startups.

Thus, despite the fact that software is a malleable medium and radical changes are possible, large software systems tend to evolve from smaller ones. And though the laws of physics don’t constrain such systems very much, economic, cognitive, and organizational realities do—and limit what changes can be implemented, especially under pressure in times of crisis or war.

Discussion and Policy Implications

This article discusses optimistic views about new military technologies and asks whether those views are warranted. A few observations are in order.

First, it’s clear that some degree of optimism about a new technology is necessary if it is ever going to play a role in military affairs. If everyone says, “That will never work, and it’s not worth trying,” then no one will do anything and nothing will happen.

Second, it is also clear that there is such a thing as excessive optimism (that is, optimism that lacks rational justification), and by definition, excessively optimistic expectations will never be realized, no matter how fervent the belief.

Third, it is unfair to judge a technology negatively on the basis of its first setbacks or positively on the promises of its proponents. All technologies experience setbacks at some point, and it is often possible to recover and learn from them with an eye toward overcoming or circumventing them in the future. But that is not true for all setbacks, and sometimes a setback indicates that a promise should not have been made in the first place.

Dominic Johnson found that overconfidence can be a beneficial “strategic instinct” with high value in high-uncertainty, high-stakes environments such as international politics and warfare, where it boosts ambition to seize opportunities and enhances resolve and perseverance amid setbacks.78 Extrapolated to technological optimism, which mirrors positive illusions of capability and control, this suggests how optimism can sustain perseverance through high-uncertainty research and development challenges and fuel a determination to work through the problems that inevitably surface early in a technology’s life cycle.

On the other hand, Johnson also noted that hubristic overconfidence can lead to miscalculation and unnecessary conflict, even if in moderation it helps to foster ambition and resolve amidst uncertainty. Thus, structural safeguards—such as advisers for reality checks, institutional processes for collective deliberation, and historical analysis to temper instincts—help to prevent self-destructive overreach. Such safeguards would be valuable in considering the value of specific technologies as well, and the use of expert advisers, rigorous testing, and institutional checks is important in modern defense acquisition.

Johnson’s findings are consistent with other work on the impact of confidence. For example, David Hirshleifer, Angie Low, and Siew Hong Teoh find that overconfidence helps CEOs exploit innovative growth opportunities.79 By contrast, Ulrike Malmendier and Geoffrey Tate find that CEO overconfidence accounts for corporate investment distortions and overestimation of the returns to investment projects.80 Together, these findings suggest a “sweet spot” that reconciles the apparent contradiction: high confidence is advantageous, while excessively high confidence damages outcomes. Confidence needs to be high enough to drive action and persistence but sufficiently reality-tested to allow for course correction, feedback, and learning. High confidence becomes excessive when it is divorced from feedback and reality.

Translating these findings into acquisition policy highlights the challenge policymakers face in discerning where to invest limited resources, as well as which responsibilities demanding new or enduring military capabilities to take on. Two policy imperatives suggest themselves.

Imperative 1: Make every effort to assess technological promise realistically and invest accordingly.

First, program advocates should clearly define the period over which technological promise is to be assessed. One definition responsive to concerns about potential failures of innovation might span the interval from initial resource commitment to the delivery of full operational capability. This approach necessarily encompasses the testing, debugging, and early deployment phases, all of which routinely expose unforeseen limitations, dependencies, and integration challenges that shape the actual value and viability of a technology.

Second, those managing an acquisition program should have sufficient technical expertise to understand technology-related choices in acquisition. For example, the selection of program managers could be biased toward individuals with backgrounds in science, engineering, technology, or mathematics, thereby increasing the likelihood that technical risks, limitations, and development challenges will be assessed with appropriate rigor. Without the ability to interrogate technical assumptions, nontechnical decision-makers are more likely to accept optimistic projections. Empirical research further suggests that domain expertise mitigates overconfidence in expected outcomes, strengthening the realism of program assessments.81

Third, program managers should establish milestones and “go”/“no-go” decision points that explicitly incorporate measures of technical maturity—such as technology readiness levels or system integration readiness. Doing so would help prevent premature scaling of immature technologies. By linking major investment decisions to verified progress, these milestones would ensure that spending and expectations remain aligned with what is programmatically achievable within realistic time frames.
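
As a schematic illustration of such a gate (not an actual DoD process), the decision rule can be reduced to a comparison between verified, not projected, technical maturity and a threshold set for each milestone. The gate names, readiness thresholds, and tranche sizes below are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Gate:
    name: str
    required_trl: int   # minimum *verified* technology readiness level (1-9)
    tranche: float      # funding released only if the gate is passed

def decide(verified_trl: int, gate: Gate) -> str:
    """Release the next tranche only when demonstrated maturity meets the bar."""
    return "go" if verified_trl >= gate.required_trl else "no-go"

# Hypothetical program: prototyping may proceed, but scaling is held back
# because the subsystem has only demonstrated TRL 5, not the required TRL 7.
prototype_gate = Gate("prototype demonstration", required_trl=5, tranche=40e6)
scaling_gate = Gate("low-rate initial production", required_trl=7, tranche=250e6)

print(decide(5, prototype_gate))  # go
print(decide(5, scaling_gate))    # no-go
```

The design point is that the threshold is applied to independently verified maturity; if the input were the program office’s own projection, the gate would simply launder optimism into a “go.”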

Fourth, program managers should require independent technical reviews and red-team analyses early and throughout program development. Such reviews help surface hidden assumptions, uncover cognitive biases, and challenge unwarranted optimism in development plans; they not only validate the realism of underlying engineering assessments but also enable timely adjustments to investment strategy.

Finally, programs should adopt phased investment and iterative prototyping approaches that allow spending to track demonstrated progress and system performance. This pacing mechanism constrains sunk costs in unproven technologies while preserving flexibility to pivot or terminate programs when results or integration challenges indicate that initial expectations are unlikely to be met.

Together, these measures can create a more disciplined framework that grounds acquisition and investment decisions in realistic capabilities, limits hype-driven overcommitment, and fosters sustainable technology development aligned with operational needs.

Imperative 2: Expand the conceptual scope of responses to the growing capabilities of adversaries.

Traditionally, the US military has demonstrated a preference for innovation driven from the top, perhaps as a result of a hierarchical structure that values centralized control to ensure discipline, unity, and interoperability in complex military operations. Such an approach has many virtues, among them alignment of capabilities development with national strategic objectives and maintaining predictability and accountability across long acquisition cycles. By standardizing doctrine, training, and matériel, this approach also helps to maintain force readiness.

But top-down innovation has many disadvantages as well. Because of its reliance on centralized planning, a top-down process must accommodate many users with different needs, and anything that emerges from it is by definition a compromise not well tailored to local requirements. Moreover, centralized planning is time-consuming and poorly matched to the rapid evolution of many modern technologies that originate in the private sector.

Top-down innovation is unavoidable in many cases, especially for large capital investments (ships are the most obvious example)—but it should not crowd out other approaches. Another approach to innovation emphasizes decentralized, context-specific experimentation (rather than centralized, large-scale initiatives).82 Through multiple smaller-scale efforts, diverse actors can tailor solutions to their distinct operational environments, diffusing risk and increasing the likelihood of practical, scalable results. Incremental progress, driven by iterative cycles of experimentation, feedback, and adaptation, builds organizational capacity and confidence while mitigating the risks associated with overly ambitious projects that often fail due to complexity or resource misallocation.

The principle of local adaptation—illustrated, for example, by the diffusion of internet technologies83—demonstrates how empowering end users as active innovators accelerates technological advancement. Rather than emerging from centralized mandates, myriad internet applications flourished through user-driven experimentation addressing specific productivity and communication needs. This decentralized approach democratizes innovation, promotes solution diversity, and expedites learning cycles.

General Rainey articulates a logic based on local adaptation in his advocacy of a “continuous transformation” approach,84 which integrates new technologies directly into operational formations and emphasizes collaborative capability development between soldiers and engineers. Iterative in nature, continuous transformation calls for quickly fielding small quantities of minimally viable products that are “good enough” to be useful immediately rather than waiting for fully matured systems from protracted acquisition processes. Although such an approach is impractical for capital-intensive, long-duration platforms such as warships, it holds significant promise for comparatively low-cost systems (say, those with unit costs 1/1,000th to 1/10,000th of a ship). Bottom-up innovation at the tactical edge allows military units to adapt rapidly and effectively to evolving threats and emerging technologies within complex operational environments.

Local adaptation is not unfamiliar to the US military. For example, stopgap weapons, such as Vietnam-era gun trucks improvised with scavenged armor and heavy machine guns to shield convoys from ambushes, have long embodied frontline ingenuity amid urgent but unexpected threats.85 Unofficial acceptance thrives on tactical commanders’ tolerance and soldiers’ can-do ethos, as battlefield success forces pragmatic embrace and sometimes leads to eventual institutionalization, as with Iraq’s up-armored gun trucks, formalized after 2005. But before being officially endorsed at the top, these technologies often encountered official military discouragement due to rigid doctrines prioritizing standardization, safety, and formal acquisition processes.

The DoD has taken some first steps in the direction of local adaptation and innovation. For example, the Defense Innovation Unit sources specific operational challenges directly from DoD components, combatant commands, and warfighters and matches them with commercial entities that may have relevant technology.86 A second example is AFWERX’s Spark Cells, a decentralized network at over eighty Air Force and Space Force bases worldwide that empowers local innovators to ideate, prototype, and execute solutions tailored to base-specific operational challenges.87 A third is the Rapid Defense Experimentation Reserve, which coordinates joint prototyping and experimentation across DoD components to validate mature technologies addressing combatant command priorities.88

A detailed breakdown of the funding allocated to locally driven innovation is not available. But the DoD investment accounts (procurement plus research, development, test, and evaluation) total $384.3 billion for fiscal year 2026,89 and funding for locally driven innovation is almost certainly under 1 percent of that. Budget figures inevitably reflect priorities, and sub–1 percent speaks for itself.

Successful implementation of bottom-up approaches requires both technical agility and a cultural shift toward valuing experimentation and asymmetric problem-solving over technological replication of what adversaries are doing. Aligning technology investments with operational ingenuity enables military forces to generate strategic uncertainty, undermining adversary predictability and reducing the effectiveness, cost-efficiency, and relevance of conventional warfare approaches. Adoption of these two policy imperatives will not solve problems arising from excessive technological optimism, which may in the end be a feature (or a bug!) of the human psyche. But these imperatives may help to moderate the effects.

 

Herbert Lin is a senior research scholar and research fellow at Stanford University, with interests at the intersection of national security and emerging technologies. He is also director of the Stanford Emerging Technology Review (http://setr.stanford.edu). Additionally, he is chief scientist emeritus for the Computer Science and Telecommunications Board of the National Academies. He also served on the Science and Security Board of the Bulletin of the Atomic Scientists from 2016 to 2025. Lin was a member of President Obama’s Commission on Enhancing National Cybersecurity (2016) and the Aspen Commission on Information Disorder (2020). He was a professional staff member and staff scientist for the House Armed Services Committee, where his portfolio included defense policy and arms-control issues. He received his doctorate in physics from MIT.

Stanford University, Stanford, CA, USA, email: herblin@stanford.edu.

 

Acknowledgments: I’m grateful to my Stanford colleague Harold Trinkunas for helpful discussions and to editors Sheena Greitens and Josh Rovner for commentary on an earlier draft that was critical in the best of ways.

 

Image: Marine Aviation Weapons and Tactics Squadron-1 by Cpl. Alejandro Fernandez.90

 

© 2026 by Herbert Lin

 

Endnotes

1 Thomas G. Mahnken, Technology and the American Way of War Since 1945 (Columbia University Press, 2008), 5.

2 For example, President Eisenhower noted: “In any combat where these things [atomic weapons] can be used on strictly military targets and for strictly military purposes, I see no reason why they shouldn’t be used just exactly as you would use a bullet or anything else.” See Andrew Glass, “Eisenhower Defends Use of Nuclear Weapons, March 16, 1955,” Politico, March 16, 2019, https://www.politico.com/story/2019/03/16/eisenhower-defends-use-of-nuclear-weapons-march-16-1955-1224003.

3 Dominik P. Jankowski, “Russia and the Technological Race in an Era of Great Power Competition,” CSIS, September 14, 2021, https://www.csis.org/analysis/russia-and-technological-race-era-great-power-competition.

4 Cheryl Pellerin, “Deputy Secretary: Third Offset Strategy Bolsters America’s Military Deterrence,” US Department of Defense, October 31, 2016, https://www.war.gov/News/News-Stories/Article/Article/991434/deputy-secretary-third-offset-strategy-bolsters-americas-military-deterrence/.

5 Karl Eikenberry, “Take No Casualties,” The US Army War College Quarterly: Parameters 26, no. 2 (1996), https://doi.org/10.55540/0031-1723.1773; Sebastian Kaempf, Saving Soldiers or Civilians? Casualty-Aversion Versus Civilian Protection in Asymmetric Conflicts (Cambridge University Press, 2018), chap. 2, https://doi.org/10.1017/9781108551816.003.

6 Dick Cheney, "Defense Strategy for the 1990s: The Regional Defense Strategy," US Department of Defense, January 1993, https://www.bits.de/NRANEU/others/strategy/DoD-Regional-Defense-Strategy-01-1993.pdf.

7 Michael N. Schmitt, “Precision Attack and International Humanitarian Law,” International Review of the Red Cross 87, no. 859 (September 2005): 445–66, https://international-review.icrc.org/sites/default/files/irrc_859_3.pdf.

8 Michael J. Mazarr et al., Disrupting Deterrence: Examining the Effects of Technologies on Strategic Deterrence in the 21st Century, 2022, https://www.rand.org/pubs/research_reports/RRA595-1.html; Michael C. Horowitz, “Do Emerging Military Technologies Matter for International Politics?,” Annual Review of Political Science 23 (2020): 385–400, https://doi.org/10.1146/annurev-polisci-050718-032725.

9 See, for example, Daniel R. Lake, “Technology, Qualitative Superiority, and the Overstretched American Military,” Strategic Studies Quarterly (Winter 2012): 71–99, https://www.airuniversity.af.edu/Portals/10/SSQ/documents/Volume-06_Issue-4/05-Lake.pdf.

10 US House of Representatives, Committee on Science and Astronautics, Research and Development for Defense: Hearings Before the Committee on Science and Astronautics, Eighty-Seventh Congress, 1st sess. (US Government Publishing Office: 1961), https://www.govinfo.gov/content/pkg/CHRG-87hhrg66947/pdf/CHRG-87hhrg66947.pdf. The first quote appears on page 65, the second on page 68.

11 Hearings on Department of Defense Appropriations for FY 1967, House Appropriations Committee, Subcommittee on Defense, part 1, 408, https://www.govinfo.gov/app/details/CHRG-89hhrg60138p1/context.

12 Yasmin Tadjdeh, “Navy’s Electromagnetic Railgun Project Progressing,” June 15, 2017, https://www.nationaldefensemagazine.org/articles/2017/6/15/navys-electromagnetic-railgun-project-progressing.

13 Lucia Sanchez, “Electromagnetic Railgun—A ‘Navy After Next’ Game Changer First Test of Electromagnetic Railgun Facility Is a Success,” CHIPS: The Department of the Navy’s Information Technology Magazine, January–March 2007, https://www.doncio.navy.mil/Chips/ArticleDetails.aspx?ID=2984.

14 Sanchez, “Electromagnetic Railgun.”

15 Naresh Chend, “Railgun—Weapon of the Future,” SPSNavalForces.com, April 2016, https://www.spsnavalforces.com/story/?id=437.

16 Konstantin Toropin, “The Navy Finally Pulls the Plug on the Railgun,” Military.com, July 2, 2021, https://www.military.com/daily-news/2021/07/02/navy-finally-pulls-plug-railgun.html.

17 “Hypersonics,” Lockheed Martin, n.d., https://www.lockheedmartin.com/en-us/capabilities/hypersonics.html.

18 Emergen Research, “Top 10 Companies in Hypersonic Weapons Market in 2024 Shaping Global Industry Trends,” March 7, 2025, https://www.emergenresearch.com/blog/top-10-companies-in-hypersonic-weapons-market.

19 “Top 5 Growth Drivers for Hypersonic Weapons,” MarketsandMarkets.com, https://www.marketsandmarkets.com/blog/AD/top-5-growth-drivers-for-hypersonic-weapons.

20 Perla Alfaro, “US Army Showcases Long-Range Hypersonic Weapon During TS25 in Australia,” US Army Pacific, August 2, 2025, https://www.usarpac.army.mil/Our-Story/Our-News/Article-Display/Article/4262893/us-army-showcases-long-range-hypersonic-weapon-during-ts25-in-australia/.

21 Strategic Systems Programs, “Army and Navy Successfully Test Conventional Hypersonic Missile,” US Department of Defense Public Affairs, December 12, 2024, https://www.ssp.navy.mil/News-Media/News/Article/4172369/army-and-navy-successfully-test-conventional-hypersonic-missile/.

22 Andrew F. Krepinevich Jr., The Origins of Victory: How Disruptive Military Innovation Determines the Fates of Great Powers (Yale University Press, 2024).

23 Defense Science Board Task Force on Directed Energy Weapons, "Directed Energy Weapons," Office of the Under Secretary of Defense for Acquisition, Technology, and Logistics, 2007, https://dsb.cto.mil/wp-content/uploads/reports/2000s/ADA476320.pdf.

24 Andrew Feickert, "US Army Weapons-Related Directed Energy (DE) Programs: Background and Potential Issues for Congress," Congressional Research Service, Library of Congress, February 12, 2018, https://www.congress.gov/crs_external_products/R/PDF/R45098/R45098.5.pdf.

25 Glen Perram, “Lasers, Chemical,” in Encyclopedia of Physical Science and Technology, ed. Robert A. Meyers, 3rd ed. (Academic Press, 2002), https://doi.org/10.1016/B0-12-227410-5/00935-2.

26 Daniel Goure and Loren B. Thompson, "Directed-Energy Weapons: Technologies, Applications, and Implications," Lexington Institute, 2003, https://lexingtoninstitute.org/wp-content/uploads/directed-energy-weapons.pdf.

27 Stew Magnuson, “Directed Energy Weapons: Here Now? Or 5 Years Off?,” National Defense Magazine, February 29, 2024, https://www.nationaldefensemagazine.org/articles/2024/2/29/editors-notes-directed-energy-weapons-here-now-or-5-years-off.

28 “The Time Has Come for Directed Energy,” Booz Allen, n.d. 2025, https://www.boozallen.com/d/insight/blog/the-time-has-come-for-directed-energy.html.

29 Thomas Collina and Kelsey Davenport, “Airborne Laser Mothballed,” Arms Control Today, March 2012, https://www.armscontrol.org/act/2012-03/airborne-laser-mothballed.

30 Robert Gates, “Address at the American Enterprise Institute—Online Speech Bank,” May 24, 2011, https://www.americanrhetoric.com/speeches/robertgatesamericanenterpriseinstitute.htm.

31 Joshua Roller, “Amara’s Law and Its Place in the Future of Tech,” IEEE Computer Society, September 6, 2024, https://www.computer.org/publications/tech-news/trends/amaras-law-and-tech-future/.

32 US Department of War, “Remarks by Deputy Secretary of Defense Kathleen Hicks Keynote on ‘The Global AI Contest,’” Advantage DOD 2024: Defense Data & AI Symposium in Washington, DC, February 21, 2024, https://www.war.gov/News/Speeches/Speech/Article/3683202/remarks-by-deputy-secretary-of-defense-kathleen-hicks-keynote-on-the-global-ai/.

33 Christie Taylor and Elizabeth Shockman, “Did You Know GPS Used to Be Controversial? Here’s How It Survived,” The World from PRX, May 27, 2016, https://theworld.org/stories/2016/05/27/did-you-know-gps-used-be-controversial-here-s-how-it-survived; Gaylord Green, “Lost in the Desert, They Demanded GPS: The Adoption of GPS by the US Armed Services,” GPS World, December 4, 2023, https://www.gpsworld.com/lost-in-the-desert-they-demanded-gps-the-adoption-of-gps-by-the-us-armed-services/; Matthew E. Skeen, “The Global Positioning System: A Case Study in the Challenges of Transformation,” Joint Force Quarterly 51 (4th Quarter 2008): 88–93, https://apps.dtic.mil/sti/tr/pdf/ADA517963.pdf.

34 General Accounting Office, Comparison of the NAVSTAR Program with the Acquisition Plan Recommended by the Commission on Government Procurement, 1977.

35 This discussion is significantly informed by Kendrick Kuo, “Dangerous Changes: When Military Innovation Harms Combat Effectiveness,” International Security 47, no. 2 (2022): 48–87, https://doi.org/10.1162/isec_a_00446.

36 The Me-262 was the first jet fighter to fly in combat. See National Museum of the United States Air Force, “Messerschmitt Me 262A Schwalbe,” n.d., https://www.nationalmuseum.af.mil/Visit/Museum-Exhibits/Fact-Sheets/Display/Article/196266/messerschmitt-me-262a-schwalbe/.

37 Kendrick Kuo, “Dangerous Changes: When Military Innovation Harms Combat Effectiveness,” International Security 47, no. 2 (2022): 48–87, https://doi.org/10.1162/isec_a_00446.

38 The term is common today, but it was coined first in a military technology context by Herbert York, who was the first director of the Lawrence Livermore National Laboratory and also first chief scientist of the Advanced Research Projects Agency (now DARPA). He was also chief US negotiator for the Comprehensive Test Ban Treaty (CTBT) talks from 1979 to 1981. See Herbert F. York, “Military Technology and National Security,” Scientific American 221, no. 2 (1969): 17–29, https://www.jstor.org/stable/24926434.

39 Melvin R. Laird, “Memorandum from Secretary of Defense Laird to the President's Assistant for National Security Affairs (Kissinger),” August 17, 1971, in Foreign Relations of the United States, 1969–1976, Volume XXXII, SALT I, 1969–1972, ed. Erin R. Mahan (Government Printing Office, 2010), document 174, https://history.state.gov/historicaldocuments/frus1969-76v32/d174.

40 Gregory Wheeler, “Bounded Rationality,” in The Stanford Encyclopedia of Philosophy, ed. Edward N. Zalta and Uri Nodelman (Metaphysics Research Lab, Stanford University: Winter 2024), https://plato.stanford.edu/archives/win2024/entries/bounded-rationality/.

41 This bias is known as the prominence effect. See Paul Slovic, “Choice Between Equally Valued Alternatives,” Journal of Experimental Psychology: Human Perception and Performance 1, no. 3 (1975): 280–87, https://doi.org/10.1037/0096-1523.1.3.280.

42 Shelley E. Taylor and Jennifer Crocker, “Schematic Bases of Social Information Processing,” in Social Cognition: The Ontario Symposium, ed. Tory Higgins, C. Peter Herman, and Mark P. Zanna, vol. 1 (Routledge, 1981), https://www.routledge.com/Social-Cognition-The-Ontario-Symposium-Volume-1/Higgins-Herman-Zanna/p/book/9781032317960.

43 Optimism is enhanced by any indication of prior or potential achievement with that technology. See, for example, Brent B. Clark, Christopher Robert, and Stephen A. Hampton, “The Technology Effect: How Perceptions of Technology Drive Excessive Optimism,” Journal of Business and Psychology 31, no. 1 (2016): 87–102, https://doi.org/10.1007/s10869-015-9399-4.

44 Daniel Kahneman, Thinking, Fast and Slow (Farrar, Straus and Giroux, 2011).

45 Ziva Kunda, “The Case for Motivated Reasoning,” Psychological Bulletin 108, no. 3 (1990): 480–98, https://doi.org/10.1037/0033-2909.108.3.480.

46 Thomas Gilovich, How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life (Free Press, 1991), 84.

47 Bent Flyvbjerg and Alexander Budzier, “Why Your IT Project May Be Riskier Than You Think,” Harvard Business Review, September 2011, https://hbr.org/2011/09/why-your-it-project-may-be-riskier-than-you-think.

48 Li-Jun Ji and Suhui Yap, “Culture and Cognition,” Current Opinion in Psychology 8 (April 2016): 105–11, https://doi.org/10.1016/j.copsyc.2015.10.004.

49 C. Dominik Güss and Bernadette Robinson, “Predicted Causality in Decision Making: The Role of Culture,” Frontiers in Psychology 5 (May 2014): 479, https://doi.org/10.3389/fpsyg.2014.00479.

50 Emiko S. Kashima et al., “Open- and Closed-Mindedness in Cross-Cultural Adaptation: The Roles of Mindfulness and Need for Cognitive Closure,” International Journal of Intercultural Relations 59 (July 2017): 31–42, https://doi.org/10.1016/j.ijintrel.2017.05.001.

51 Heidi Demarest, Tyler Jost, and Robert Schub, “Bureaucracy and Cyber Coercion,” International Studies Quarterly 68, no. 1 (2024): sqad103, https://doi.org/10.1093/isq/sqad103.

52 Jon R. Lindsay, Information Technology and Military Power (Cornell University Press, 2020).

53 Austin Long, The Soul of Armies: Counterinsurgency Doctrine and Military Culture in the US and UK (Cornell University Press, 2016).

54 Adam Yang, “Dreams of Victory: How Military Organizations Select Innovations Based on Warfighting Culture and Civilian Strategic Interests” (PhD diss., American University, 2022), https://www.proquest.com/results/FCF10C12365E4627PQ/1.

55 The can-do ethos is often embodied in what the US military calls the "warrior ethos." For a description, see “Warrior Ethos,” US Army, January 5, 2011, https://www.army.mil/article/50082/warrior_ethos.

56 Christopher G. Pernin et al., Lessons from the Army's Future Combat Systems Program (RAND Corporation, 2012), 139, https://www.rand.org/pubs/monographs/MG1206.html.

57 Aerospace America, July 1984, 81.

58 Caspar Weinberger, “Seeking a Realistic Strategic Defense,” Defense/83, June 1983, 28.

59 “History of the Ballistic Missile Defense Organization,” BMDO Fact Sheet 96–003, US Department of Defense, February 1996, https://apps.dtic.mil/sti/tr/pdf/ADA338705.pdf.

60 James A. Abrahamson and Henry F. Cooper, “What Did We Get for Our $30-Billion Investment in SDI/BMD?,” National Institute for Public Policy, September 1999, https://highfrontier.org/wp-content/uploads/2016/08/What-for-30B_.pdf.

61 Bent Flyvbjerg, “Top Ten Behavioral Biases in Project Management: An Overview,” Project Management Journal 52, no. 6 (2021): 531–46, https://doi.org/10.1177/87569728211049046.

62 See, for example, Rongju Yu, “Stress Potentiates Decision Biases: A Stress Induced Deliberation-to-Intuition (SIDI) Model,” Neurobiology of Stress 3 (February 2016): 83–95, https://doi.org/10.1016/j.ynstr.2015.12.006.

63 Samuel P. Huntington, “Interservice Competition and the Political Roles of the Armed Services,” The American Political Science Review 55, no. 1 (1961): 40–52, https://doi.org/10.2307/1976048.

64 Emilie Berthelsen, “Hybrid Times: War and Peace in Military Innovation Studies,” Journal of Strategic Studies (June 23, 2025): 1–34, https://doi.org/10.1080/01402390.2025.2512238.

65 Robert M. Gates, “Business Executives for National Security: The Pentagon Isn’t at War,” speech presented at Business Executives for National Security, Washington, DC, May 15, 2008, https://web.archive.org/web/20080520182520/https://www.defenselink.mil/speeches/speech.aspx?speechid=1242.

66 Robert Mortlock and Nick Dew, “Behavioral Biases Within Defense Acquisition,” SYM-AM-21-049, Proceedings of the Eighteenth Annual Acquisition Research Symposium, Acquisition Research Program, Graduate School of Defense Management, Naval Postgraduate School, May 10, 2021, https://dair.nps.edu/handle/123456789/4356.

67 Rebecca M. Henderson and Kim B. Clark, “Architectural Innovation: The Reconfiguration of Existing Product Technologies and the Failure of Established Firms,” Administrative Science Quarterly 35, no. 1 (1990): 9–30, https://doi.org/10.2307/2393549. Henderson and Clark use the term “component” where the term “artifact” is used in this paper.

68 Brandi Vincent, “Army Futures Command’s Gen. Rainey Reflects on AI’s Potential in Modern Warfare,” DefenseScoop, June 28, 2024, https://defensescoop.com/2024/06/28/army-futures-commands-gen-rainey-reflects-on-ais-potential-in-modern-warfare/.

69 The White House, “Remarks by APNSA Jake Sullivan on AI and National Security,” October 25, 2024, https://bidenwhitehouse.archives.gov/briefing-room/speeches-remarks/2024/10/24/remarks-by-apnsa-jake-sullivan-on-ai-and-national-security/.

70 The White House, “White House Unveils America’s AI Action Plan,” July 23, 2025, https://www.whitehouse.gov/articles/2025/07/white-house-unveils-americas-ai-action-plan/.

71 “Space Force Operationally Accepts SciTec’s Revolutionary Missile Warning System,” SciTec, October 16, 2025, https://scitec.com/space-force-operationally-accepts-scitecs-revolutionary-missile-warning-system/.

72 Vadim Kushnikov, “The United States Tests the World’s First AI-Based Radar Warning Receiver,” militarnyi.com, February 25, 2025, https://militarnyi.com/en/news/the-united-states-tests-the-world-s-first-ai-based-radar-warning-receiver/.

73 Defense Intelligence Agency, “Gamechanger: Where Policy Meets AI,” February 7, 2022, https://www.dia.mil/News-Features/Articles/Article-View/Article/2926343/gamechanger-where-policy-meets-ai/.

74 Liz Martin, “The US Air Force Improves Aircraft Readiness with AI and Predictive Maintenance Solutions,” AWS Public Sector Blog, December 14, 2023, https://aws.amazon.com/blogs/publicsector/the-us-air-force-improves-aircraft-readiness-with-ai-and-predictive-maintenance-solutions/.

75 Patrick Tucker, “This Air Force Targeting AI Thought It Had a 90% Success Rate. It Was More Like 25%,” Defense One, December 9, 2021, https://www.defenseone.com/technology/2021/12/air-force-targeting-ai-thought-it-had-90-success-rate-it-was-more-25/187437/.

76 Kelsey Atherton, “Industry Starts Work on Weapons That Can See; Autonomy Comes Next,” Breaking Defense, October 15, 2020, https://breakingdefense.com/2020/10/industry-starts-work-on-weapons-that-can-see-autonomy-comes-next/.

77 Paul Scharre, Four Battlegrounds: Power in the Age of Artificial Intelligence (W. W. Norton & Company, 2023).

78 Dominic D. P. Johnson, Strategic Instincts: The Adaptive Advantages of Cognitive Biases in International Politics (Princeton University Press, 2020), https://press.princeton.edu/books/hardcover/9780691137452/strategic-instincts.

79 David Hirshleifer, Angie Low, and Siew Hong Teoh, “Are Overconfident CEOs Better Innovators?,” The Journal of Finance 67, no. 4 (2012): 1457–98, https://doi.org/10.1111/j.1540-6261.2012.01753.x.

80 Ulrike Malmendier and Geoffrey Tate, “CEO Overconfidence and Corporate Investment,” The Journal of Finance 60, no. 6 (2005): 2661–700, https://doi.org/10.1111/j.1540-6261.2005.00813.x.

81 Priscilla Kraft, Teresa Dickler, and Michael Withers, “When Do Firms Benefit from Overconfident CEOs? The Role of Board Expertise and Power for Technological Breakthrough Innovation,” Strategic Management Journal, July 28, 2024, https://sms.onlinelibrary.wiley.com/doi/full/10.1002/smj.3657.

82 Some similar reflections on appropriate paths for innovation are contained in Lindsay’s Information Technology and Military Power, specifically his advocacy of adaptive management as a way to capture the benefits of both top-down and bottom-up approaches to innovation (see chapter 7).

83 Marjory S. Blumenthal and David D. Clark, “Rethinking the Design of the Internet: The End-to-End Arguments vs. the Brave New World,” ACM Transactions on Internet Technology 1, no. 1 (August 2001): 70–109, https://dl.acm.org/doi/10.1145/383034.383037.

84 James Rainey, “Continuous Transformation: Transformation in Contact,” Military Review, August 2024, https://www.armyupress.army.mil/Journals/Military-Review/Online-Exclusive/2024-OLE/Transformation-in-Contact/.

85 Nina Kollars, “Military Innovation’s Dialectic: Gun Trucks and Rapid Acquisition,” Security Studies 23, no. 4 (2014): 787–813, https://doi.org/10.1080/09636412.2014.965000.

86 Defense Innovation Unit, n.d., https://www.diu.mil/.

87 AFWERX, “Overview,” July 18, 2023, https://afwerx.com/divisions/spark/overview/.

88 “Rapid Defense Experimentation Reserve (RDER): Prototypes & Experiments,” PowerPoint presentation at NDIA event, Office of the Under Secretary of Defense for Research & Engineering, July 18, 2022, https://ndia.dtic.mil/wp-content/uploads/2022/future/Wed_Keynote_Beaverson.pdf.

89 US Department of Defense, Office of the Under Secretary of Defense (Comptroller), “FY 2026 Program Acquisition Costs by Weapon System,” 2025, https://comptroller.war.gov/Portals/45/Documents/defbudget/FY2026/FY2026_Weapons.pdf.

90 For image, see https://www.dvidshub.net/image/8387462/wti-2-24-high-energy-laser-expeditionary.
