In times of peace and prosperity, both cities and individuals can have lofty ideals because they have not fallen before the force of overwhelming necessity.
— Thucydides1
Prior to the American Civil War, the US Military Academy at West Point was a school of engineering that produced more industrial tycoons than generals. The academy’s first superintendent, Maj. Jonathan Williams, was a scientist and grandnephew to Benjamin Franklin.2 Even under President Dwight Eisenhower in 1954, military history was “not taught as a separate course in the military schools and colleges of the United States Army.”3 A conversation in 1970 between journalist Ward Just and West Point’s history department chair, Col. Thomas E. Griess, might reveal the source of this trend. Griess described the army as a “highly technological” organization, which, to him, explained why there was still no required history course at the academy.4 He was and still is right. It is therefore no wonder that the United States invests so heavily in technological supremacy or “offsets” even after two decades of war in Afghanistan exposed the limits of that dominance.5
Since the beginning of that war, there has been no shortage of scholarship highlighting the dangers of political hubris that extend from technological superiority.6 Even so, unmanned systems are flooding modern battlefields in staggering quantities, giving credence to presentism in Western strategic thinking and devaluing the careful study of broader historical arcs that underscore the human element in war. Britain’s oldest think tank, the Royal United Services Institute, estimated in 2023 that Ukraine lost approximately 10,000 drones per month as it pushed back its Russian invaders.7 President Volodymyr Zelenskyy thus sought to produce one million drones for Ukraine in 2024.8 Russia is scrambling to meet the challenge by striking billion-dollar deals with Iran in an effort to build thousands of unmanned weapons domestically.9
As Western militaries clamor for drones and their machines grow more numerous and sophisticated, their human numbers are plummeting.10 The United States may witness its first true spectator war this century, one in which the preponderance of resources deployed to achieve military objectives cannot empathize with their biological targets—a mathematical enterprise devoid of emotion, with fewer casualties and less potential for human error, or so the world has been led to believe.11
One way of subjecting these predictions to scrutiny is by framing them within the historical context of what Thucydides describes as the human catalysts for war: fear, honor, and interest.12 This ancient passage is often quoted but rarely contextualized, namely because it is not attributed to Thucydides himself, but rather to a delegation of Athenian ambassadors speaking before a Spartan assembly prior to the Peloponnesian War (431–404 BC).13 Their words, used to justify the Athenian empire’s expansion and to explain why they would fight to prevent its contraction, can inform discourse on something future war theorist Christopher Coker described as post-human warfare.14
This essay does not argue that the pursuit of more sophisticated unmanned weapons is futile—only that the extent to which that pursuit should be used to justify the depletion of human end strength must be weighed more judiciously to avoid unrealistic strategies in peacetime and sticker shock when the cost of war becomes apparent.
As Jon Lindsay recently noted in this journal, studies of future warfare generally address “the ways in which autonomous machines will behave in familiar wars,” but fail to imagine “the ways in which human societies will behave in unfamiliar futures.”15 Humankind’s interaction with the uncertainty of existential war and its casualties is a good place to start searching for answers—or at least evidence that leads to asking the right questions. This essay is not a study of the characteristics of unmanned systems or their neural networks, nor is it an analysis that frames the latest conflicts as unique windows into future military exigencies. Rather, it is an interdisciplinary synthesis of long-form historical analysis and realist theory that paints a clearer picture of what the next major war may demand from the United States. It peers into the stubborn and flawed nature of the ghosts that infuse machines with purpose: us.
Below, I advance three main arguments using the fear-honor-interest framework:
- Western nations are overly invested in the belief that future wars will be governed by unmanned systems, not human mass. This faith reflects Western cultural bias more than it does the demands of twenty-first-century warfare.
- The presence of these machines on the battlefield, coupled with fewer humans, will not create a uniquely “modern” war that is cost efficient, controllable, or more precise. Instead, human passions are likely to make machine wars easier to start, harder to finish, and just as messy for combatants and civilians.
- Killing humans will still be the objective in smart wars because human suffering has the greatest effect on the political will that gives machines their raison d’être.
Despite the prophesied exodus of humans from the battlefield to reduce the cost of war and the political risk associated with waging one, war has never been an efficient enterprise because it springs from human passions that the horrors of armed conflict only inflame. Efforts to reduce the number of combatants in war are unlikely to make its conditions more manageable. The evidence suggests that such trends might increase risk to civilian populations, further disconnect military actions from political objectives, and make wars harder to stop once they have started—not despite human control of war’s machines, but because of it. These approaches contain the same risk identified by Gen. Matthew B. Ridgway when he cautioned against relying on air and naval assets to achieve military objectives in the 1950s: The allure of doing things the “cheap and easy way” can send the military situation on the ground spiraling into a void of control that only large numbers of foot soldiers can fill.16 Indeed, this point may be one of the hardest truths for the West to accept in 2025 after three years of supporting Ukraine’s fight against Russia with every type of aid except ground forces. Against a backdrop of worsening recruitment challenges, members of Congress, US national security officials, and senior military leaders must come to terms with this reality in their budget debates and particularly in their conversations with the American people about the potential cost of modern war.
The Great Exodus of Humans from the Battlefield
Modern technologies can reshape more than conduct in war; they can alter the calculus that serves as justification for war, especially if they depict conflicts as swift, decisive, or cheap in terms of lives and treasure—decisions over which leaders seek to retain some measure of control.17 This dynamic has been part of the American way of war since the early atomic age.18 Cold War military historian Brian McAllister Linn explains how this approach became especially acute by the end of the twentieth century:
With a rare display of unity, all the armed forces shared a single vision of future conflict: technologically empowered specialists executing precise, devastating, decisive, but relatively bloodless assaults against virtually powerless enemies. This, its advocates proclaimed, was a “New American Way of War” that would guarantee victory on the battlefield, national invulnerability, and global supremacy well into the next century.19
That vision, conceived in the 1990s with an eye to 2010, is alive and well in 2025. But as Paul Scharre has argued, the “widespread adoption of military AI” by the Department of Defense could decrease human control and make war “more challenging in terms of being able to manage escalation and bring war to an end.”20 The international community is going to great lengths to ensure that the concepts of jus ad bellum (just declaration of war) and jus in bello (just conduct in war) are protected—but with new capabilities comes the expectation that they will offset the human cost of war by shouldering its heaviest burdens.21
Modern militaries have been captivated by the possibilities associated with the mass application of unmanned systems. Russian Gen. Valery Gerasimov stated that his nation could deploy a “fully robotized unit” in the “near future,” but that was almost a decade ago.22 Former commander of US Army Training and Doctrine Command, Gen. Robert Cone, suggested in 2014 that robots could replace 25 percent of America’s troops by 2030.23 Three years later, British intelligence analyst John Bassett predicted that by 2025 the US military would have “more combat robots than it will have human soldiers.”24 Sir Nick Carter, who headed the United Kingdom’s military, echoed Cone’s assessment in late 2020, claiming that a quarter of Britain’s troops could be robots in the next decade.25 These robots will not be mindless tools. As one US naval officer predicted ten years ago, “military necessity” will make full weapon autonomy inevitable.26
These forecasts might be anecdotal in isolation, but more recent developments concerning end strength in Britain and the United States—two heavy hitters in NATO’s defense architecture—make their projections seem more like self-fulfilling prophecies. The US military is facing historic recruitment challenges with the smallest active force since Adolf Hitler invaded France in 1940.27 At 1.28 million people and falling, America’s active-duty population in 2024 is roughly the size of the US Army alone in 1955—a year in which some were calling for the Army to be absorbed into the Air Force because nuclear weapons had made land warfare obsolete.28 To put this into perspective: During the “air atomic age” when massive retaliation with strategic bombers drove US defense policy, the army was three times larger than it is now, and army leaders still thought it was too small to contend with the Warsaw Pact states.29
After the Cold War ended and America withdrew from its wars in Iraq and Afghanistan, the US military shrank, but its price tag did not. The 2008 base defense budget, excluding contingency operations funds, was roughly $480 billion for a force of 1.5 million active-duty members. By 2024, the budget request had nearly doubled to $895 billion even though the force had shed hundreds of thousands of troops.30 The US military missed its 2023 recruiting goals by a combined 41,000 recruits, due in part to well-known quality-of-life issues for military families that caught the attention of the House Armed Services Committee.31 In 2024, the army met its recruitment goal after lowering it by roughly 16 percent from 65,000 to 55,000.32
Despite these problems with recruiting and caring for service members, nearly 40 percent of the Pentagon’s 2025 budget request was allocated to the procurement, research, and development of new weapons and equipment ($310 billion).33 That sum alone exceeds the entire defense budget of just twenty-five years ago ($258 billion). As the budget slowly expands, the department’s workforce contracts and becomes more reliant on new tools to compensate. Barring a significant increase to defense appropriations, this trend seems baked into the budget process and therefore embedded in Western thinking about future warfare. Like troop cuts in the 1950s, this decrease in manpower is not accompanied by reduced security demands or fewer threats.
As a result, the Department of Defense is experimenting with compensatory measures. Deputy Secretary of Defense Kathleen Hicks’s announcement of the Pentagon’s Replicator initiative in September 2023 was met with a predictable cocktail of enthusiasm from futurists and skepticism from traditionalists.34 Aiming to achieve mass through the employment of tens of thousands of drones, the program reflects the logic of offset theory born of President Eisenhower’s reforms, which compensated for US-Soviet force imbalances by growing America’s nuclear arsenal.35 Then and now, the goal is for a technological edge to deter or, if necessary, defeat one’s enemies.
By the time Eisenhower left office in 1961, however, it was clear to President John F. Kennedy that the country needed a more “flexible” strategy.36 Nuclear weapons failed to deter “brushfire wars” or offer viable solutions once they had started.37 Eisenhower’s massive investment in nuclear overmatch as a technological solution to modern warfare was useful in diplomacy but useless on the battlefield. Similarly, it remains to be seen whether the United States has the industrial capacity today to manufacture its unmanned offset and integrate it at scale, or if its adversaries will develop countermeasures that knock these swarms out of the sky or sea, rendering them ineffective.38 The US Army’s latest reorganization in 2024 doubled down on this vision of future warfare by establishing counter-drone units and growing its air and missile defense battalions.39 With fewer soldiers on hand, the army aims to achieve its goals by cannibalizing other units or eliminating hard-to-fill authorized positions.40
Despite such challenges, the United Kingdom has been experimenting with similar policies. Even as it surges investment in unmanned systems, Britain has cut end strength: since 2000 the British Army and Royal Air Force have shrunk by 30 percent and 41 percent, respectively. The army fell from 109,600 to 76,950 soldiers while the air force dropped from 54,600 to 31,940—roughly the size of a single division—thus placing greater demands on its machines to offset risk.41 Such a brain drain cannot be reversed overnight, especially considering the loss of experienced aviators and surface and subsurface naval commanders. Retired Col. Tim Collins, a veteran of Britain’s elite Special Air Service, went so far as to declare that “Britain no longer has a military.”42 Western defense institutions have therefore embraced a similar approach to long-term planning based on a single, untested assumption: Allied armies composed of far fewer humans and more machines will be able to meet the demands of conventional warfare in the twenty-first century.
The West is not alone. During the first week of 2024, Russian manufacturer Kalashnikov unveiled a supposedly jam-proof autonomous attack drone called Item-55. The Russian Federation claimed to be training thousands of drone operators even as it struggled to conscript enough soldiers to support its war on Ukraine.43 Within days of that announcement, Moscow launched a devastating rocket and drone attack on Ukrainian cities, killing dozens of civilians, damaging a maternity hospital and several schools, and revealing an indiscriminate application of drone warfare.44 Russian military doctrine has long been partial to fires, but Vladimir Putin’s dependence on such modern measures only increased after he chose to invade the second largest European country with a paltry 150,000 troops. (Soviet leader Leonid Brezhnev used four times that many soldiers just to put down the 1968 rebellion in Czechoslovakia, a nation roughly one-fifth the size of Ukraine.45)
Increased reliance on unmanned capabilities to achieve war’s political objectives is thereby forcing nations into a deeply dependent relationship with unproven and often experimental systems and concepts—concepts that demand levels of technological synthesis that may be questionable in a multi-front war involving dozens of allied militaries with dissimilar capabilities.46 The sophisticated platforms on which these concepts are predicated must be deployed in ever-greater numbers to compensate for decreased human presence, which in turn places additional burdens on an already strained defense industrial base and the military’s contested logistics plans.47
Despite favorable outlooks on the convergence of artificial intelligence (AI) and autonomous systems, some experts are less sanguine about the benefits for humanity.48 In The Diffusion of Military Power, Michael C. Horowitz details how the proliferation of new weapons technologies can shift the balance of power toward smaller and potentially less stable states, thus increasing the likelihood of conflict.49 Despite the spread of unmanned platforms, Christopher Coker believed that combat can “only remain humane if war’s ‘human space’ is not hollowed out completely.”50 As the increased mechanization of armed conflict suggests, however, this space seems to be attenuating rapidly. Some futurist commentary borders on hyperbole, but even scholars such as Kenneth Payne, who takes a more cautionary approach to the transformational potential of new weapons, have expressed concern over this deemphasis on the role of humans in war.51
Governments are understandably eager to defy Gen. William Tecumseh Sherman’s maxim: “War is cruelty, and you cannot refine it.”52 But this great exodus of humans from the battlefield presents the United States with several ethical and strategic challenges as it incorporates more unmanned platforms into its theories of success. Even if humans become mere ghosts in future wars, the machines fighting in their stead will, to some extent, always reflect the fear, honor, and interest that motivate their masters.
Fear: The Real Killing Machine
The debate over AI-driven weapons is riddled with fears of sentient machines destroying the world without the wise chaperone of human ethics as their guide. A long history of popular science-fiction books and cinema fuels the fear, ranging from Isaac Asimov’s I, Robot to James Cameron’s Terminator films, in which an AI becomes self-aware and aims to extinguish humanity. The legacy of mankind’s wartime decision-making, however, testifies repeatedly to the ability of technological advances to make matters worse. After all, it was humans in liberal democracies who made the controversial decisions to firebomb Tokyo and deploy the Enola Gay over Hiroshima in 1945. Hundreds of thousands of Japanese civilians died because American leaders rationally feared that more Americans would be killed in the Pacific if President Harry Truman failed to act.53 Once the strikes were deemed necessary, it is unlikely that international law or moral algorithms could have prevented them.
Nuclear physicist and “father” of the bomb J. Robert Oppenheimer believed that science or scientists cannot make these decisions, which he insisted must be left to elected officials, even though he referred to his own creation as a weapon with no military use.54 Modern leaders and the governments they control are still plagued by moral compromise, war crimes, and corruption that are in some instances worsening even as the world’s militaries expand their modernization strategies.55 Canada and the United States recently accused the People’s Republic of China and the Russian Federation of genocide.56 Such accusations against members of the UN Human Rights Council undermine the legitimacy of international institutions, as do allegations that hundreds of UN employees supported Hamas’s October 2023 pogrom in Israel.57
Despite the role of NATO as a moral compass for modern warfare, enduring questions such as those surrounding the legitimacy of the 2003 invasion of Iraq prove that democracies are not immune to entertaining legal and moral ambiguities amid fear of a potentially nuclear adversary.58 Throughout history, fear has trumped notions of morality or even logic, particularly when existential threats became part of the equation. The assumption that war can be made ethical by keeping humans “in the loop” of an AI’s decision cycle is a puzzling leap of faith that fails to acknowledge the human role in previous unethical wartime decisions, even after careful consideration of their implications.59 In human minds, cruelty toward one’s enemy can quickly become the lesser evil.
During the years leading up to the Peloponnesian War, for instance, the most ruthless citizens fared best because the intellectuals who refused to take by force “what it was possible to achieve by policy, were often caught off guard and slaughtered.”60 As Thucydides advised, “War is a harsh teacher.”61 The lesson in fear to which the Athenian general alluded did not concern physical harm as much as the fear of losing status and reputation. It was this type of fear that underpinned Graham Allison’s hotly debated 2017 thesis in Destined for War. Allison used a line from Thucydides to make a structural argument for the inevitability of conflict when a rising power (then Athens, now China) challenges a status quo power (then Sparta, now the United States), and, out of fear, the latter is forced to wage war to uphold the existing power structure.62
There is also, however, another type of fear—the kind that some scholars argue may explain the origins of the Peloponnesian War more astutely. Failure to check the hubris of a rising power can lead it down a path of belligerent military escalation, especially if it fears losing the progress it has already made.63 Indeed, Spartan delegations offered peaceful terms to Athens on numerous occasions, but the Athenians, fearful of the decrease in power that would accompany relinquishing Potidaea or granting independence to Aegina, declined.64
Since the end of the Cold War in 1991, the United States has steadily reduced the size of its military while watching China’s expand astronomically.65 Russia also went relatively unchecked for its brutal assaults into Chechnya (in 1994 and 2000), Georgia (2008), and Ukraine (in 2014 and 2022), even as the Kremlin intensified its malign influence campaigns.66 Over the last twenty years, Iran’s leaders faced minimal backlash for stoking violence with their proxies throughout the Middle East, attacking US interests in Iraq and the United States, and financing chaos around the world.67 The leaders of these nations, like those in Athens, have been emboldened. Although it was not yet necessary in the eyes of America’s leaders to respond strongly to these developments, fear of what comes next may change that.
Human fear, with its capacity to cause leaders to lash out in the name of self-interest, remains the greatest threat to the idea of universal human rights born in the Magna Carta, expanded on by the constitutionalists of the eighteenth century, and codified in the UN’s Universal Declaration of Human Rights on December 10, 1948. The blunting of a nation’s fear of war by promises of standoff and robotic sacrifice may be the very factor that makes the use of force irresistible. Calls for militaries to innovate faster and win the ill-defined but omnipresent technology race neglect this inconvenient truth and fail to sufficiently consider the second- and third-order effects of post-human warfare.68
No Honor Among UAVs?
One of the most significant changes in the character of war since antiquity is the notion of honor and its relationship to the legitimate application of military force. No person or nation enjoys being shamed on the world stage.69 Gen. Curtis E. LeMay, who led US Strategic Air Command from 1948 to 1957 and later served as Air Force chief of staff under Presidents Kennedy and Johnson, described how he came to terms with killing so many: “I had blood upon my hands as I did this, but not because I preferred to bathe in blood. It was because I was part of a primitive world where men still had to kill in order to avoid being killed, or in order to avoid having their loved Nation stricken and emasculated.”70
Feelings of collective dishonor have justified war throughout history, from Adolf Hitler’s exploitation of the shame imposed on Germany by the Versailles Treaty to Vladimir Putin’s use of narratives designed to wash Russia clean of its Soviet failures and return it to a place of prominence on the world stage. These types of jus ad bellum most aptly resemble the honor in the Thucydidean trinity—an idea also articulated in the Melian dialogue, which inspired the theory of realism.71 Honor, then, can shape not only a nation’s claims to grievance but also the perceived legitimacy of its chosen strategy. Donald Kagan’s description of the Athenian way of war deserves attention here:
[Athens] had come to think of itself as an invulnerable island since its acquisition of a fleet, a vast treasury, and defensible walls. It had developed a unique and enviable way of fighting that used these advantages and avoided much of the danger and unpleasantness of ordinary warfare. . . . It permitted them to strike others without danger to their own city and population. Success in this style of warfare . . . made it seem the only one necessary.72
Kagan’s use of the word “enviable” warrants further inspection. This way of war was envied because others did not have access to it, but envy must not be confused with admiration. Athens’ techniques brewed resentment among its neighbors who had to fight the old-fashioned way. The ability to win wars without suffering greatly from them also caused Athenians to become so comfortable that they failed to imagine why any other way of war would be necessary. The citizens of Athens eventually became the first in Greece to stop carrying arms in daily life.73 The Athenian way of war thus transformed more than the battlefield—it altered civil tradition for its citizens at home.
A similar type of cultural evolution is taking place in the twenty-first century, as smaller armies wielding innovative technologies disconnect Western populations ever further from the consequences of their wars. Like Athens before it, America’s modern way of war is so appealing to the psyche of its people and their increasingly technocratic culture that no other form of warfare seems necessary.74 Military strategy thus begins to reflect this cultural bias. Indeed, killing without assuming risk is now considered honorable, and even culturally superior, which further depreciates the gravity of the decision to take life.
Although the honor to which Thucydides referred was a collective sense of reputation (or in Gen. LeMay’s words, the need to avoid national emasculation), the role of individual honor in war has also evolved.75 For thousands of years, some Europeans viewed ambushes and killing from a distance unfavorably, in contrast to, for example, Far Eastern cultures’ long-held view of the bow as a weapon of high esteem.76 America’s independence came about in part from exploiting this dichotomy by challenging traditional notions of honor and integrating ambush tactics into colonial military patrols against the British Imperial Army.77 Some recent studies suggest that “soldierly virtues” such as honor may even be unhelpful to the modern remote warrior.78 The ability to cast aside customary notions of honor can be useful, but this approach may also be the surest path to winning the war and losing the peace if victory appears ill-gained, or if dishonorable conduct goes too far. A victory secured in the gray area of international law or through the abuse of power can sow deeper seeds of resentment among the defeated population and rob the victor of its political legitimacy as a member of the rules-based democratic order.
As former US Air Force Weapons School instructor M. Shane Riza observes, ultimate accountability for the actions of machines will lie squarely with human operators (and, it should be noted, their governments).79 Rather than simplifying decision-making, the ramifications of human judgment in “hyper war” could multiply by triggering a cascade of events through automated decision networks that magnify the consequences of flawed assumptions.80 Given the number of complications and collateral effects the United States has experienced when its drone strikes have harmed civilians—even while exercising air supremacy in undeclared theaters of war—these challenges are likely to be amplified in a more contested environment.81 Even under the most favorable circumstances, foreign civilians might suffer the most in wars from which the United States chooses to distance its human representatives.
Between 2017 and 2019, the United States launched at least 108 air strikes in Somalia alone, killing some 800 members of the terrorist group al-Shabaab.82 As the security situation deteriorated in late 2020, President Donald Trump ordered US troops out of Somalia and sent them elsewhere on the Horn of Africa.83 This repositioning received little discussion in the United States; the US military has influenced Africa’s affairs for decades without so much as a mention of these policies in presidential debates or cable news roundtables.84
Participation in conflicts is easier when the faces of dead soldiers no longer grace one’s newspapers, but death still comes to America’s enemies by drone strike, as does collateral damage.85 Data compiled by legal expert Mitt Regan shows that between 2002 and 2020, civilians accounted for between 1 percent and 36 percent of casualties inflicted by US drone strikes, many in undeclared theaters of war.86
Even in the absence of such tragedy there is risk. Throughout the war on terror, NATO troops encountered hidden improvised explosive devices (IEDs) in Iraq and Afghanistan.87 Thousands of service members were killed by an enemy they could not see or confront on the battlefield. Some channeled their anger elsewhere, at times resulting in unprofessional or even illegal conduct, such as the 2005 Haditha Massacre.88 This reaction to an elusive enemy follows a pattern. In March 1968, US troops assigned to Charlie Company, 1st Battalion, 20th Infantry Regiment, killed between 350 and 500 Vietnamese civilians in the My Lai Massacre. Historian and Vietnam veteran Claude Cookman recalls that within Charlie Company’s first three months in Vietnam, four of its men had been killed and thirty-eight wounded “by mines, booby traps, or snipers.”89 As Cookman writes, “They were frustrated because the enemy avoided open battle, denying them a chance to retaliate.”90 War crimes are the product of many factors, including poor leadership, lax discipline, and weak oversight, and US policy should not count on an enemy’s leadership, discipline, or oversight to restrain it in the next war. It stands to reason, then, that removing soldiers from the battlefield while still killing humans with machines can redirect an enemy’s aggression onto civilians as a means of imposing will on an otherwise inaccessible opponent.
Other groups have reacted similarly by attacking soft targets and civilians if they cannot defeat the remote weapons targeting them. Militant Islamist violence against African civilians more than doubled between 2018 and 2021, and quadrupled in the Sahel region alone.91 Drone warfare analyst Zak Kallenborn recognized this tradeoff several years ago when he wrote that even if military robots reduce the number of humans in war, “the fight will increasingly focus on those [humans] that remain.”92 “Over-the-horizon” strategies thereby unburden American military forces of risk by reassigning some of it to foreign civilians.93
This reassignment cuts both ways. If an enemy cannot eliminate the remote weapons hunting him, then the support chains upstream from those weapons become more attractive targets: civilian factory workers and engineers who develop the systems, or the operators and logisticians responsible for their employment and maintenance.94 Russia has already adopted this approach in what it views as a proxy war against Ukraine’s NATO backers. In July 2024, US and German intelligence agencies foiled a Russian plot to assassinate Armin Papperger, the CEO of a German arms manufacturer supplying Kyiv with weapons.95 Suddenly, the battlefield’s contours become less distinct. This is the “hollowed out” war Christopher Coker feared.96
The United States has reached a pinnacle of risk aversion, in which its forces can kill with impunity while remaining largely insulated from public pressure or even policy debate.97 Taken to its logical conclusion, the aim of post-human warfare is to make these conditions a permanent fixture of modern conflict, even as America’s dwindling all-volunteer force trims the unruly edges of an imperiled free world.98 But if the notion of honor is excised from wars in which death is ever cheaper for its deliverer, little remains to restrain adversaries who seek to depart from legal restrictions in order to reduce the comfort from which their enemies wage war. When paired with the global proliferation of smart devices and an Internet drenched with anti-Western disinformation, the process of reconciling international law with national interest could come under significant strain in the coming years.99
National Interest and International Law
Much of the literature on ethics in remote warfare examines the relationship between drone operators and their targets, while other treatments explore potential legal restraints on the use of autonomous military systems.100 Fewer scholars have considered the implications of swarms of autonomous drones executing target packages in an existential war under conditions that discourage restraint, such as national survival. In other words, studies typically focus on the imperative of human control over the machine rather than on the implications of the machine’s subordination to human interest in wars that must be won.
Policies and public declarations related to the potential use of strategic AI suggest that world leaders are determined to maintain control over war’s most consequential decisions, such as when wars start, how they are waged, and when and how they end.101 Each of these decisions involves bringing the national interest into harmony with international law. The idea of human rights, written into constitutions since the eighteenth century, was only necessary because those rights needed a lawful means of protection against what Hannah Arendt characterized as “the new sovereignty of the state and the new arbitrariness of society.”102 The international community wanted checks on the realist notion of “might makes right” that Thucydides describes in his account of the Melian dialogue. But if legal documents were enough to quell humanity’s interest in self-preservation, then the drafting of the Magna Carta would have solved these problems centuries before the constitutionalists emerged.103 Hard times have a way of stoking dissonance between national interest and international expectations.
In 1861, for instance, President Abraham Lincoln’s secretary of state, William H. Seward, and even some of his Democratic colleagues, believed that the Civil War was about more than Confederate secession. In their eyes, the Union was engaged in a political battle on multiple fronts against those seeking to undermine the legitimacy of free republics everywhere, which gave their national interest global implications.104 This sense of desperation in an existential struggle to save the Union justified violations of international law, such as the seizure of passengers aboard a British vessel during the Trent Affair in 1861, and the imposition of incredible human costs on Confederate society in Gen. Sherman’s destructive march to the sea from Atlanta in 1864.105 Necessity altered the acceptable parameters of war; Sherman resorted to targeting civilian property only after years of restraint against his fellow countrymen.106
In the following century, the international community responded to the horrors of industrial warfare by establishing mechanisms of protection against war crimes, such as the International Criminal Court in The Hague, the International Court of Justice, international human rights courts, and the Geneva Conventions. These institutions have produced mixed results in deterring war crimes and even in bringing war criminals to justice, and instead serve more as expressions of ethical norms agreed upon by friendly nations.107 Some of the worst massacres of that century occurred with support or indifference from the Western world, such as the Ottoman Empire’s genocide of roughly 1.5 million Armenians between 1915 and 1916 or the killing of nearly three million Bengalis in East Pakistan (roughly 3 percent of that nation’s population) in 1971 during Pakistani President Yahya Khan’s term.108
The integrity of international agreements has been further corrupted by more recent developments such as the 1994 Budapest Memorandum, which compelled Ukraine to relinquish its nuclear arsenal to Moscow in exchange for security guarantees from the United States, United Kingdom, and Russia.109 Vladimir Putin’s repeated invasions of Ukraine since that time might give pause to potential signatories of treaties that ask states to sacrifice national interest for global order, such as those treaties that seek to impose severe restrictions on military AI and autonomous weapons.
Decisions in war are still governed by national interest, and machines have no interest beyond that which is given to them by their human masters. The record of these decisions is imperfect at best and unlikely to be perfected through international law. Smarter machines will not sterilize or simplify war as long as war remains governed by humans and wed to human perceptions of necessity rooted in fear, honor, and interest. International institutions, even those that exist to protect human rights, may therefore continue to uphold concepts of just war in peace, but must also recognize that war has a way of changing a nation’s calculus toward risk acceptance; unmanned weapons are simply the latest variable in that math.
Morality in Existential War
Italian political philosopher Niccolò Machiavelli (1469–1527), another realist in the vein of the Athenian delegation, is famous for inspiring the theory that ends justify means.110 He believed personal ethics were incompatible with public policy because the former can obstruct a clear-eyed approach to the latter. As one of the first authors to give political power to civilian populations in his theories, he influenced Western strategic thought in ways that shape philosophical approaches to warfare even today. Although Gen. Ridgway and British strategist B. H. Liddell Hart were wise to see that victory by any means can sow the seeds of another war, solace in their wisdom is much easier to find during times of limited sacrifice.111
The strategic guidance in the United States, however, directs security institutions to prepare for a kind of war that would demand levels of industrial and societal mobilization from which the public could not be entirely shielded.112 According to the 2022 National Security Strategy of the United States, the “risk of conflict between major powers is increasing,” but many analyses frame the answer to that risk in terms of the nation’s “technologically sophisticated military capabilities” rather than its end strength or capacity to reconstitute forces.113 Political analyst Iskander Rehman provides a somewhat optimistic take on America’s readiness for protracted conflict, but in any event, such high-end conventional wars cannot last forever, nor can the United States end them unilaterally on terms of its own choosing.114 America must compel a human opponent to accept such terms, and destruction of military equipment alone has rarely achieved this end.
Existential wars—that is, wars in which the existence of one or all participating nations is at risk—lead to moral dilemmas like those surrounding the atomic bombs dropped on Japan in August 1945. Yet five months before President Harry Truman gave that order to Lt. Gen. Nathan F. Twining, commander of the 20th Air Force in the Pacific, 334 American B-29s released 1,500 tons of napalm and magnesium bombs at low altitude over Tokyo, killing over 100,000 people, most of them civilians.115 The bombings continued, and Robert Oppenheimer recalled that the secretary of war at that time, Henry L. Stimson, was shocked by the lack of public interest in the air raids and the damage they were inflicting on Japan.116
Gen. LeMay, one of the commanders responsible for the Tokyo air raids, defended them in his memoirs. In fact, he insisted that if the United States had treated Pyongyang in 1950 as it had Tokyo in 1945, Washington could have “terminated” the Korean War “almost as soon as it began.”117 LeMay was part of the cult of the offensive. He, like Tecumseh Sherman before him, believed that all war is cruelty, and the crueler it is the sooner it will be over.118 This approach typically involved killing as many humans as possible, not just destroying their equipment. (Better yet to kill those few humans who build, maintain, or employ the most critical systems in an enemy’s arsenal.) One of America’s original strategic theorists, Robert Osgood, believed that US policy in the Korean War was generally successful, but seemed to endorse LeMay’s theory, at least indirectly, in his 1957 book, Limited War.119
The professional literature on future war theory is right to discuss the importance of morality, including how machines might free humans to dedicate more effort to fighting ethically.120 Conversations are scarce, however, on the role that morality will play as desperate governments struggle for survival, perhaps because few living persons have experienced such a struggle. This conundrum brings to mind Thucydides’ point that nations and their citizens can have “lofty ideals” when sheltered from “the force of overwhelming necessity.”121
Since 2014, Russia’s occupation of Ukraine has brought the force of such necessity to Kyiv’s doorstep, prompting Ukrainian agents to carry out assassinations of Russian leaders in response.122 Israel has directed a similar campaign of assassinations against its enemies for some time, with limited outcry from the West until recently—not because of the moral clarity of the acts, but because of the assertions of righteous intentions by those carrying them out.123 The Trump administration’s targeted killing of Iranian Gen. Qasem Soleimani in 2020 became the subject of rigorous legal debate, as did the Obama administration’s extrajudicial killing of American citizen and al-Qaeda member Anwar al-Awlaki in 2011.124 Perceptions of morality thus became relative to the presumed collective ethics of the actor, not necessarily the act—a state of affairs that revealed the human ghost inside war, and its complicated relationship with ethics. States known for fighting virtuously are more likely to be forgiven for resorting to questionable acts.125
A strategy of post-human war—in which unmanned weapons lay waste to entire cities while the people controlling them go to “the mall”—might change this dynamic in ways that do not favor the United States.126 If any quality of twentieth-century war can be admired, it is not so much that the American government decided to intervene but rather that so many Americans were willing to sacrifice alongside their allies and partners for a mutual cause. The legacy of that shared sacrifice, not simply America’s industrial contribution to victory, was what carried so much weight in the post-WWII era.127 The dwindling number of human faces that give international legitimacy to the application of US military force abroad is a problem that cannot be solved with stronger or smarter weapons.
To confront these challenges lucidly, Washington must not flatter itself with euphemistic theories of bloodless wars but should instead address honestly the history of America’s contributions to victory. Except when in possession of tremendous tactical and technical advantage—as in the Gulf War—the United States has secured victory by the very means it now deems reprehensible: overwhelming force and the largest human footprint possible.128 America must not be surprised if its competitors take a similar approach to war in the twenty-first century.
The Lies We Tell
If war is a harsh teacher, then one of its hardest lessons is that human necessity in war provokes reciprocal escalation beyond the control of its individual actors or their machines. Modern weapons may introduce new ways to escalate, but these tools cannot untether governments or their people from the burdens of war forever. In his assessment of the Korean War’s harmful psychological impact on the United States, Robert Osgood argued that the “aftermath was partly a result of the way in which the government represented the Korean War to the American people in order to elicit a united national effort.”129
The idea of war on the cheap has always been an intoxicating fantasy. In 1954, as President Eisenhower considered limited military intervention in support of besieged French forces at Dien Bien Phu, Gen. Ridgway marveled at the selective amnesia sweeping over Washington:
In Korea, we had learned that air and naval power alone cannot win a war and that inadequate ground forces cannot win one either. It was incredible to me that we had forgotten that bitter lesson so soon—that we were on the verge of making that same tragic error.130
The siren song of post-human war is dragging the world into similar territory in more spectacular ways.131 Even if machines make it easier to kill on our behalf, telling them to stop could become harder, because fewer friendly human casualties can reduce societal pressure for war termination. On the other hand, this great exodus of humans from the battlefield might fail to produce acceptable outcomes and lead to a technological stalemate or a prolonged war of attrition deemed unacceptable to the American people.
Under such conditions, the United States could be forced to reassess its dominant assumptions about modern war as it is thrown haphazardly into the kind of fight it is least prepared to win.132 If this challenge were to occur, America would not be the first powerful democracy to have its way of war tested fundamentally. Donald Kagan had this to say about the fall of Athens:
Years of success at little cost to human life had made [the Athenians] reluctant to accept the risk and the cost demanded by a new situation in which the traditional strategy was not appropriate. . . . Perhaps that is what Thucydides had in mind when he connected the Athenian defeat with the death of Pericles, who alone among Athenian politicians could persuade the people to fight in a way contrary to their prejudices and experiences.133
Like Athens, the US military has been backed into a corner by its own experiences, not through a reliance on naval power, but through a crippling dependence on unimpeded technological supremacy to fight its wars with ever fewer humans. Ironically, the Global War on Terrorism reinforced this dependence instead of weakening it. As the cost, volume, and prestige of new weapons increase, institutional pressure within the Department of Defense to invest less in its human core and more in its unmanned exterior will intensify. And although the United States has the best-trained and most experienced military in the world, that military is also the smallest it has been in nearly one hundred years and, according to numerous reports, its technological and industrial edge may be fading.134 The first step in overcoming this problem is for Congress and the Department of Defense to recognize that they cannot “buy victory” in Silicon Valley.135 Victory must be purchased the old-fashioned way: by putting large, highly disciplined armies in the mud.
Conclusion
America’s prejudices toward war are under strain. Theories driven by leaner, tech-laden formations, depicted in concepts such as multi-domain operations, have become extensions of the Western world’s cultural preference for wars that require fewer humans and better technology.136 Those biases have yet to be vindicated in practice. The assumption that technological breakthroughs can offset the risk imposed by personnel shortages and deter or win a conventional war on the cheap does not withstand scrutiny. Also unsubstantiated is the theory that committing fewer humans to war can make it less costly in life or treasure by controlling the passions of its human arbiters.
The recent history of human conflict in Iraq and Afghanistan, Ukraine, and Gaza reinforces that war is still “the human thing.”137 Nations tend to escalate out of fear, honor, and interest, often with a disregard for traditional notions of morality and law. Unmanned weapons can reduce the presumed cost of escalation, and thereby make such escalation more likely. In other instances, fear of the blowback from using one’s strongest weapons can lead nations to rely on human mass even when other options are available. Once countermeasures or technological parity amend prewar assumptions about the efficacy of new weapons, nations require vast reserves of humans to occupy defenses or wage offensives.
From a historical perspective, America’s tendency to exaggerate the efficacy of its newest military capabilities is extraordinary. Between 1945 and 1950, news coverage of nuclear weapons often contained the term “push-button warfare,” a phrase used to describe the allegedly rapid, remote character of future war.138 A 1952 political cartoon took this term to task: The drawing depicted a soldier knee-deep in a snowy Korean foxhole yelling out, “Button, button, who’s got the button?”139 By the end of 1964, Pentagon annexes describing potential courses of action in Vietnam still painted an unrealistic picture of the Johnson administration’s control over escalation with modern bombing practices. The Department of Defense’s assessment that “the military program would be conducted rather swiftly, but the tempo could be adjusted as needed to contribute to achieving our objectives” was soon disproved by the North Vietnamese Army.140 As the United States looked toward Iraq in 2002, the head of US Central Command, Gen. Tommy Franks, envisioned a new way of war fought with technologies considered “science fiction” ten years prior.141 Within a decade, though many hoped these devices would eliminate the need for large occupation forces in Iraq and Afghanistan, each theater at its peak held more than 100,000 US troops on the ground—far fewer than some recommended in 2002, and fewer still than others later suggested were needed.142 As of this writing, Western militaries are once again paving shiny new paths to the same muddy foxholes, unencumbered by the nuisance of their history.
Since 1945, America has eluded the dilemma of absolute necessity in its foreign wars. The nation’s limited conflicts, characterized by restraint, technological superiority, and proportionality, have allowed the United States to defend its interests abroad with minimal sacrifice at home. But America’s options are narrowing as its competitors assemble and its allies face threats not seen in generations.143 The age of autonomous warfare has the potential to arouse the most undesirable aspects of the human condition because unmanned weapons reduce the perceived risk associated with first contact between states, creating an illusion of control over the human passions that govern war.144 Civilians often pay the heaviest price and will continue to do so if Western nations embrace the great exodus of humans from the battlefield.
Leaders in Congress, the White House, and the Department of Defense should approach these challenges along two lines of effort. First, they must address the misalignment of human resources and human requirements before the next war forces the decision upon them. According to the 2022 National Defense Strategy, the United States is navigating a “decisive decade” of “growing threats to vital US national security interests” with the smallest military in eighty-five years and a defense industrial base entangled with the very “pacing challenge” it is charged with deterring or defeating.145 Although Curtis LeMay was a leading advocate of exploiting new weapons, he still put this gamble into context bluntly: “No weapon is really a weapon until it is battle-tested. When the firing starts, that popgun may not be as good as you thought it was.”146
The gamble in 2025 is that America’s latest popguns will not only meet expectations in the next war but also exceed them—to the point that humans are scarcely needed, even as the human cost of war mounts ever higher in Ukraine, for example. Assuming that humans will no longer be necessary is a risk that, once incurred, will have no quick fix. Army Chief of Staff Gen. Raymond Odierno, who commissioned a study of the Iraq War in 2013, wrote in the foreword of that study that “ground forces were overtaxed by the commitments in Iraq and Afghanistan, and the decision to limit our troop levels in both theaters had severe operational consequences.”147 These consequences would prove far greater in a large-scale conventional war against a state enemy or enemies with similar capabilities, but that is the situation that faith in unmanned warfare may be creating for America’s soldiers, sailors, airmen, marines—and citizens.
Despite talk of a “new Cold War,” the US military has one million fewer active members than it did in 1989 and nearly 250,000 fewer than it did in 2006, when the military was spread thin in a war on terror that spanned Central Asia and the Middle East.148 In the next major conflict, the fear, honor, and interest that govern war’s conduct could force the United States to contend with a perennial reality—that machines are incapable of reconciling the contradictions between the size of its military and its strategic ambitions.
The second line of effort rests on the recognition that wars are won by nations, not militaries.149 America’s modern way of war implies the opposite, and many have fed into this narrative amid the cultural bent toward a revolution in unmanned warfare. Public officials must begin pushing back against this narrative, lest the American people become convinced that the next war will be something like a video game.150 In President Trump’s first interview after returning to office in 2025, cable news host Sean Hannity issued a statement seventy-five years in the making: “I don’t believe we’re going to fight future wars on battlefields; they’re going to be fought in air-conditioned offices.”151 To Mr. Trump’s credit, he disagreed, but admitted that drones are important. Mr. Hannity is no strategist, yet his statement reflects a mode of “push-button” thinking that is as dangerous as it is popular in US history. Such theories promote societal disengagement from conversations on national security and impede the type of whole-of-nation investment needed to deter or win wars.
Like the Athenians who believed that their great navy could deliver them from the horrors of the battlefield, it is tempting to hope that flying or floating machines will do the same for America today. Athens learned its lesson the hard way. The United States does not have to. America is overdue for hard conversations that come to terms with the human core of its unmanned wars and challenge its assumptions about the characteristics of modern warfare. Doing so in the current state of competition, rather than in the heat of any future conflict, is the bitter pill that Washington must swallow if it is serious about securing its interests, and those of its allies, in the twenty-first century.
Maj. Michael P. Ferguson is an active-duty US Army officer and PhD student in the Department of History at The University of North Carolina at Chapel Hill. His main research project covers Dwight Eisenhower’s joint chiefs of staff and their advocacy for “limited” war in the air atomic age. Ferguson has served in Iraq, Afghanistan, Europe, and Africa, and is on assignment to teach history at the US Military Academy at West Point. In addition to his coauthored book, The Military Legacy of Alexander the Great: Lessons for the Information Age, Ferguson is published in Parameters, Joint Force Quarterly, PRISM, Military Review, Æther, Modern War Institute, The National Interest, and other outlets. He is also an opinion contributor at The Hill. The views expressed in this article are those of the author and do not represent the positions of the US Army or the Department of Defense.
Acknowledgments: This article began years ago as a book chapter but was cut due to length, so I’d like to thank my coauthor Ian Worthington for the opportunity to start writing this. I’m also indebted to my UNC advisors and faculty who have challenged my thinking about historical inquiry, especially Professors Wayne E. Lee, Michael C. Morgan, Fitz Brundage, and Erik Gellman. This article benefited immensely from the keen eyes of reviewers and editors at Texas National Security Review, for whom I am thankful. Any mistakes are of course my own. Finally, I am eternally grateful to my wife and two children, who tolerate and at times even support my writing.