From Panic to Policy: The Limits of Foreign Propaganda and the Foundations of an Effective Response

American leaders and scholars have long feared the prospect that hostile foreign powers could subvert democracy by spreading false, misleading, and inflammatory information through various media. Drawing on both historical experience and empirical literature, this article argues that such fears may be both misplaced and misguided. The relationship between people’s attitudes and their media consumption remains murky, at best, despite technological advances promising to decode or manipulate it. This limitation extends to foreign foes as well. Policymakers therefore risk becoming pessimistic toward the public and distracted from the domestic, real-world drivers of public confidence in democratic institutions. Policy interventions may also prove detrimental to democratic values like free expression and to the norms that the United States aims to foster in the information environment.

Russia’s interference in the 2016 U.S. presidential election led to intense public and scholarly debates over the role of foreign propaganda — deliberate and systematic attempts to use media to shape perceptions and direct behavior within domestic politics.1 Russia’s brazen operation and Donald Trump’s victory were both unexpected, leaving analysts grasping for answers about the extent to which Russian activities may have influenced the outcome. The episode breathed new life into an old American fear: wide-scale societal manipulation by malicious foreign actors weaponizing media at home.2 Such concerns went beyond Russia. China’s investment in social media, for example, led to congressional hearings in which representatives spoke ominously about information campaigns by rival great powers.3 These campaigns seem particularly insidious. They putatively threaten national security not by changing the balance of military power but by eroding faith in democratic institutions.4

With the 2024 U.S. presidential election looming, these anxieties seem well founded. That Chinese or Russian intelligence services seek to use technology and propaganda to covertly sway the American public is not in doubt.5 Governments, civil society organizations, and online platforms have demonstrated how narratives can spread online, spurred on by fake and foreign actors.6 Meanwhile, online data-harvesting is largely unregulated in the United States, fueling speculation that insights into Americans’ lives might be used to target them with both greater precision and persuasiveness. Insofar as national leaders presume that democratic institutions depend on citizens making rational calculations based on verifiable facts, the potential for disruption can seem catastrophic. Policymakers and researchers have therefore rallied to defend what is presumed to be the primary target of foreign propaganda, democracy itself: the trust among citizens and in institutions necessary for this participatory system of governance to function properly.

Yet this surge of interest and effort rests on potentially misleading views about the prospects for propaganda. Contrary to prevailing assumptions, a range of recent empirical studies have failed to validate any uniform, causal relationship between online media and major changes in human attitudes and behaviors. Moreover, research in this area remains limited in scope and beset by methodological challenges.7 Attempting to trace or wield influence is difficult, even with the help of systematic data collection. Both would-be online propagandists and policymakers often fail to appreciate this complexity.

This failure may lead to ineffective policy prescriptions, relying on military, foreign policy, and national security tools to address what are likely homegrown domestic issues. Insofar as policymakers aim to protect democracy from such subversion, outsized fears of foreign encroachment, undue faith in the power of media and technology, and pessimism toward the American public may prove equally corrosive to the trust in institutions ostensibly under greatest threat. Political leaders and institutions thus risk losing faith in the very public they exist to serve.

This article begins by describing the unsettled academic debate about the impact of media on people. The second section briefly outlines how this debate has persisted across political and academic discourse over the last century. The next section interrogates how technological advances came to revive the notion of media’s direct effects on people, in large part by citing the putative powers of data aggregation about them. The fourth section explains why the temptation by governments to intervene in the information environment risks backfiring, making foreign online influence a convenient scapegoat for home-grown Digital Age socio-political problems.8 The conclusion recommends a more introspective, domestic-focused approach to combatting foreign propaganda online — one that starts by acknowledging its limitations, accepting people’s own agency in the media they consume, and remembering that trust in democracy stems foremost from its ability to meet their real-world needs.

Propaganda’s Unsettled Questions

At present, there is no consensus in the extant literature about the effectiveness of foreign propaganda.9 Rather than aspiring to settle these disputes, this article argues that their very persistence is itself instructive: If the relationship between people and the media they read, watch, and listen to remains mysterious for researchers, it is likely no more scrutable to foreign adversaries. Moreover, insofar as media is but one avenue of subversion — the success of which seems to depend in part on preexisting political conditions — research and policy emphasis focused on countering online manipulation by foreign actors may be empirically and politically misplaced.

Researchers have studied, from a range of perspectives, how states seek to interfere in the domestic affairs of others, particularly in elections. According to some political scientists, outsiders find polarized publics particularly attractive, as domestic partisans often amplify narratives that putatively advance foreign state interests, inform the direction of the targeted state’s foreign policy, or influence its foreign relations.10 Judging from several historical case studies, “covert dissemination of scandalous exposés or disinformation on rival candidates” — alongside other acts such as public threats, promises, campaign donations, and quid pro quos — may have a statistically significant effect on election outcomes.11 Broadly, however, the perceived success of foreign interference appears to hinge on preexisting conditions within the target population, including uncertainty and doubt in their state institutions, elites, and leadership.12 The U.S. military’s own attempts at “winning hearts and minds” through propaganda operations in Afghanistan and elsewhere have cost hundreds of millions of dollars, with mixed success: “Over the years, huge amounts of money have been spent on information operations programs that are largely anchored in advertising and marketing styles of communication, with little concurrent investment, it would appear, in detailed understanding of audiences and environments.”13

There is no question that media play powerful roles in popular discourse — creating and enabling authoritative voices, continuously shaping and reshaping what is considered socially acceptable behavior.14 In this regard, it is unsurprising that foreign actors treat various media — from newspapers to broadcasts to digital social platforms — as prime avenues for subversion. Nevertheless, media’s role in the construction of people’s identities, beliefs, and behaviors remains a contested scientific question.15 People may exercise more agency and discretion over the media they interact with than is commonly portrayed in accounts of the spread of “fake news.”16 Cognitive scientist Hugo Mercier asserts that most efforts to sway people — by advertisers, politicians, or propagandists — are largely ignored, and that humanity’s biggest problem is more the truths it fails to internalize than the falsehoods it accepts as true.17 Where false or inflammatory content does seem to resonate, the question arises whether people had already consciously chosen to accept it.18 Most inaccurate or misleading information “reaches people who are already misinformed — or at least very open to being misinformed,” says Nieman Lab’s Joshua Benton.19

However detrimental self-delusion and ignorance may be to social cohesion, they might simply prove more useful to people’s everyday lives than facts. According to philosopher Dan Williams, the marketplace of ideas might be more aptly described as a marketplace of rationalizations, in which people and organizations compete to justify their preferred beliefs, in exchange for money, attention, and social status.20 A range of studies suggest that such desires are immutable, overpowering even a conscious preference for truth and accuracy.21 People may therefore learn to value most the worldviews that best suit their own social contexts.22

For example, in a recent study of Russia’s online operations in the run-up to the 2016 U.S. presidential election, Eady et al. found no significant linkages between exposure to the subversive content and subsequent changes in attitudes, polarization, or voting behaviors.23 Other studies concluded that content shared via social media had no discernible effect on people’s beliefs or opinions and that significant changes to people’s on-platform experience did not significantly alter their attitudes or levels of polarization.24 These studies contradict previous scholarship suggesting Russia’s efforts might have measurably altered public opinion.25

Even where propaganda may thrive online, “whether or not it has any impact on political outcomes such as levels of political knowledge, trust in democratic institutions, or political polarization remains an open question,” according to the Hewlett Foundation, a U.S.-based private foundation.26 This suggests that people are far less impressionable than presumed, their views much less moldable with any skill or reasonable expectation of success — by states or any other actors.27 This would not be the first time, however, that American threat perceptions about propaganda have solidified before any scientific consensus could.

Historical Echoes

Current research and policy discussions about propaganda and foreign influence resemble those from previous decades. In the 1920s and 1930s, journalist Walter Lippmann and several of his contemporaries analyzed the process of public persuasion, drawing on their experience working to rally public support for U.S. involvement in World War I. Their work initially rested on one key assumption: that media had a direct and powerful influence on the American public, whom they considered “volatile, unstable, rootless, alienated, and inherently susceptible to manipulation.”28

Lippmann found it unrealistic for the average citizen to develop what he called omnicompetence on weighty issues, to break free from their own social clusters to make the world somehow intelligible.29 Edward Bernays, a prominent American public relations practitioner-scholar, took Lippmann’s misanthropic view a step further, arguing that the so-called engineering of consent, “when used for social purposes, is among our most valuable contributions to the efficient functioning of modern society.” Unapologetically elitist in approach, Bernays held that government had a solemn duty to interpret important facts and events on behalf of what he considered an otherwise dim or disinterested public, to lead them “to socially constructive goals and values.”30 This duty, he claimed, was one of democracy’s defining features. Communications theorist Harold Lasswell went so far as to advocate that U.S. officials shield democracy from authoritarianism through systematic, state-led, mass manipulation of their own public.31

Consequently, research into media effects was framed as a liberal democratic imperative to counter fascism (and later communism).32 This model of media influence from the interwar period was later caricatured by scholars as the “hypodermic needle” or “magic bullet” model for its simplicity — signifying the gradual repudiation of the notion of an all-powerful media on one side, the public on the other, with little in between.33 While Lippmann claimed that knowledge “originated in individuals by means of isolated contact with objects,” his philosophical opponent, scholar John Dewey, contended that knowledge instead sprang from human interaction.34 Sociologist Joseph Klapper later expanded on this theme, arguing that media had little direct influence on people, but instead mostly reinforced their biases and attitudes about the world.35

In the post–World War II period, the term propaganda gradually receded from common, often pejorative use in favor of less ideologically charged terms like communication, information, and persuasion.36 This shift reflected growing acceptance among humanities scholars at the time of the need to account for not only new communication technologies, but also the perplexities of the human condition.37 Research funding from major donors like the Ford Foundation began to shift away from fuzzy means of studying people’s behavior, and instead toward examining it like chemistry or physics, using quantitative methods.38 The idea that social phenomena could be explained, if not manipulated, through the physical sciences also captured the imaginations of cyberneticists and information theorists.39 It appealed to a worldview in which information was objectified — like matter or energy — subject to the laws of nature, able to be deliberately amassed and directed against an opponent.40 However, essentializing information in this way inevitably led to objectifying the people who encountered it.

By mid-century, however, this preoccupation began to draw criticism of its hard-science pretensions from scholars who resented the search for uniform, stable dynamics within disorderly social relationships.41 Historian John Gaddis characterizes this period as one in which “the ‘soft’ sciences became ‘harder’ just as the ‘hard’ sciences were becoming softer.”42

The 1950s and 1960s were marked by anti-communist fervor in Washington, and the media was a frequent target.43 Most prominently, Senator Joseph McCarthy embodied the prevailing conviction that communist regimes might succeed in their plots to corrode democracies from within. As historian Jennifer Miller details, national leaders sought to help citizens “distinguish between healthy, ‘correct’ ideas and harmful, ‘false’ ones … claiming that democracy stemmed from psychological vigilance, rather than representative institutions or political rights.”44 Ironically, the very liberties democracy putatively guaranteed remained hopelessly out of reach for many Americans — particularly persons of color — at the time.45

This was also the period when television became a ubiquitous fixture in American homes, spurring even more nuance in media studies over the ensuing decades.46 Scholars gradually gave up on validating so-called direct effects — the idea that people might respond uniformly to a given media stimulus.47 They instead introduced concepts like agenda-setting, priming, and framing to describe the complex dynamics at play between mass media, prominent figures, and audiences.48 They asserted that meaning is neither fixed by the messenger, nor passively received by its recipient, nor necessarily transparent to outsiders — therefore, media influence is both a discursive construction and a byproduct of socio-economic conditions.49

As new information and communications technologies heralded the dawn of the Digital Age, media would become more interactive and diffuse, reviving the prospect that direct effects might be identified — if not perfected — by a more granular understanding of individual and public preferences revealed through these interactions. The long-sought promise of a data-fied and computable — and thus predictable and moldable — public seemed finally within reach.50

Data Deluge

The advent of the Internet upended many of the prevailing paradigms about the relationship between the media and the public. A scarcity of information, filtered by the former, gave way to a scarcity of attention among the latter.51 After the turn of the century, online media would blur the distinction between the two groups altogether. It would throw into doubt earlier theorizing about media effects, making “public opinion” simultaneously ever-present, yet somehow elusive.52 So-called surveillance capitalism would mark the tradeoff between citizens and media: convenience in connectivity in exchange for granular insights on everyday life.53

In the fallout from Russian interference in the 2016 election, concern and scandal emerged over British consulting firm Cambridge Analytica’s harvesting of data from millions of Facebook users to profile and target voters on behalf of political campaigns.54 Similar data-harvesting concerns would later punctuate congressional hearings about China-based social media platform TikTok’s influence in 2023.55 Policy discourse about media manipulation now centers on the supposed power of pairing big data about Americans with the algorithms that feed them information.56 This logic entails several self-reinforcing assumptions: New technologies will create new ways to generate, collect, and analyze data about people, revealing otherwise unobservable phenomena.57 Automation will minimize human error and bias, thereby making the data more “raw.”58 The resulting judgments about these phenomena will therefore be more accurate.59

This logic has served several disciplines well, leading to breakthroughs in fields from genetics to supply chains.60 Meanwhile, researchers, politicians, news producers — and, more recently, social media platforms — have also applied this data-centric paradigm to explain the complex relationship between the media people consume, their attitudes and behaviors, and, by extension, the phenomenon of suasion.61

Social media platforms like Facebook and Twitter (now X) became major proponents of this logic, as they assumed a leading role as intermediaries between users and content. Users, advertisers, and researchers also delegated to platforms the task of identifying value and ascribing meaning to whatever data this relationship generated. The result, according to Sun-ha Hong, was the prospect of “unprecedented knowledge for the human subject, precisely by shifting accepted norms around what counts as better knowledge.”62 Such power is both extensive in practice and seemingly bottomless in promise: human behavior, reduced and reconstituted into a vast pool of measurable units.63 Online media offered a seemingly sterile laboratory through which to observe and study this behavior, ostensibly objectively, in isolation from its most confounding, real-world variables.64

Yet no matter how automated, determining which factors qualify as “data” online remains a subjective exercise.65 Unlike chemical reactions, people do not operate according to any fixed trajectories or rules, but are self-contradictory, paradoxical, and unpredictable. Societies evolve variously across time and geography and conform to no optimal model.66 As a result, observers can be tempted to view flurries of social media activity as deterministic markers or avenues of influence, absent additional context.67

Such flurries of activity are frequently measured against the most direct and worst conceivable outcomes, such as foreign manipulation. But unlike economics or medicine — where catastrophic events can be contrasted with an optimally functioning model — communications studies lack such a baseline from which to start.68 Even in the pre-Internet era, when the media ecosystem was more settled and coherent, political warfare, conspiracism, and falsehoods persisted.69 There is no community with a “view from nowhere” to serve as a control group.70 As a result, a person’s media exposure, however extensively it might be measured, is not all-encompassing or wholly representative of their lived experience.71 People use media as much to escape their reality as to shape it, as much to assert their identity as to formulate it, as much to validate their lived experience as to interpret it.72

In historical terms, humanity has only just begun to generate and collect the oceans of data now available.73 The quality of research inquiry has, in many instances, suffered from such abundance. The search for statistically meaningful correlations has largely become a market unto itself.74 The emphasis on statistical significance over practical importance coincided with the deluge of data over past decades, infusing various fields of study — often to questionable ends.75 According to economist Gary Smith, trust in data and trust in science are not synonymous and can undermine each other:76

A virtually unlimited number of coincidental sequences, correlations, and clusters can be discovered in large databases. … Those who do not appreciate the inevitability of patterns are sure to be impressed. Those who are seduced by these shiny patterns are likely to be disappointed. Those who act on the basis of data-mined conclusions are likely to lose faith.77
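Smith’s warning is easy to demonstrate. The sketch below (an illustration of the general statistical point, not a reconstruction of any study cited here) generates a few dozen random, mutually unrelated data series and then tests every pair for correlation. At the conventional p < 0.05 threshold, roughly five percent of the pairs will look “statistically significant” by chance alone, and the larger the database, the more such coincidental patterns a determined search will turn up.

```python
import numpy as np
from scipy import stats

# Illustrative only: mine purely random data for "significant" correlations.
rng = np.random.default_rng(seed=1)
n_vars, n_obs = 40, 100
data = rng.normal(size=(n_vars, n_obs))  # 40 unrelated random series

spurious, n_pairs = 0, 0
for i in range(n_vars):
    for j in range(i + 1, n_vars):
        r, p = stats.pearsonr(data[i], data[j])  # test one pair
        n_pairs += 1
        if p < 0.05:  # conventional significance threshold
            spurious += 1

# Of the 780 pairwise tests, roughly 5 percent (about 39 pairs) will appear
# "statistically significant" despite having no real relationship at all.
print(f"{spurious} of {n_pairs} random pairs correlate at p < 0.05")
```

None of these correlations reflects anything about the world; they are artifacts of running many tests, which is precisely the trap awaiting anyone mining large behavioral datasets for evidence of influence.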

The main beneficiaries of the idea that people can be decoded through their online activities are not foreign adversaries or researchers, but major social media platforms and advertisers themselves.78 The latter industry — arguably the largest repository and beneficiary of user data on earth — operates on the assumption that seeing leads to clicking, which then leads to buying. However, the science behind the effect of digital advertising on consumption is also unsettled.79

Social interactions remain largely grounded in the material world, and people’s capacity to absorb communication and juggle relationships also remains constrained.80 To depend primarily on social media data to understand how narratives impact users’ real-world attitudes and behaviors — much less to try to shape them — is thus like trying to discern the plot of a film by staring into the beam of the projector.81 When it comes to data, the closest thing to natural law is “garbage in, garbage out.”82 This law applies equally to the researcher exploring media’s effects and to the foreign adversary seeking to exploit them.

Contesting the Space

States often grapple with the uncertainties of an external threat, prompting excessive policy remedies aimed at returning to some previous state of familiarity or equilibrium.83 The less clarity about another state’s intent and capacity to do harm, the greater the tendency for anxiety.84 For instance, at the height of the Cold War, the United States knew all too well how destructive nuclear weapons could be. What was less clear were Soviet capabilities and intentions. The perceived threat of online manipulation now inverts this dynamic: Ill intent from adversarial states is evident, and the tradecraft of online propagandists is well documented. What remains unclear is the degree to which “information warfare” is a causal factor in real-world events.85 This lack of clarity notwithstanding, the propaganda elements of Russian interference in 2016 sparked a paradigm shift in how U.S. policymakers conceptualized information as both a threat and a warfighting function.86

It is reasonable for states to be wary of foreign influence and subversion. In this vein, leaders understandably feel compelled to protect the public’s sense of routine, stability, and national character.87 Political scientists refer to this as “ontological security” — the idea that states must secure their social existence before they can succeed at much else.88 States thus perceive information threats in three general categories: technological (the ability to defend networks), intelligence (the ability to avoid surprise), and cognitive (the ability to protect the public’s sense of well-being).89 Moscow in 2016 breached all three, the last category perhaps most jarringly.

The prospect that adversaries might exploit domestic socio-political fissures by using relatively novel cyber means fueled outrage in the U.S. national security community.90 By 2020, the United States had marshaled military, law enforcement, diplomatic, intelligence, economic, and public messaging tools specifically to thwart foreign attempts to undermine public confidence in the electoral process. Most notably, U.S. Cyber Command reportedly undertook an operation to disrupt a notorious Russian troll farm ahead of the 2018 midterm elections. By autumn 2020, the U.S. intelligence community was preemptively and directly warning Americans of Russian efforts to influence public opinion.91 This was even though the real-world effects of Russian interference on voting behaviors in 2016 remained contested among researchers, and despite the risk of lending Russian actors both unnecessary amplification at home and unearned clout in Moscow. Meanwhile, domestic political and media figures have been found to be among the biggest purveyors of disinformation about the integrity of the vote in subsequent elections.92

U.S. civilian and military leaders are loath to cede what they consider a contested information domain to adversaries.93 But such measures are not without risk. Left unchecked, states’ suspicions of foreign orchestration behind every unpreferred narrative fuel the hubris of attempting such orchestration themselves.94 For example, the U.S. Department of Homeland Security’s recent efforts to counter disinformation online were met with skepticism from privacy and free speech advocates.95 Some commentators likewise consider a potential wholesale ban on TikTok — as recently passed by the U.S. House of Representatives and advocated by the Biden administration, state legislatures, and rival tech firms — to be “an entirely un-American, undemocratic, and inappropriate response to an unproven risk.”96 Government engagement with social media platforms about disinformation has also elicited allegations of soft censorship.97 Indeed, a policy bias toward intervening in the information environment risks backfiring by undermining the very trust and confidence it was designed to safeguard.98

Liberal democracies encounter structural limitations on which media and speech-related issues can be legally, normatively, and effectively cast as national security concerns. For government agencies that are unavoidably associated with partisan or political agendas, the goal of cleaning up or contesting the information environment from a credible, neutral position is one at which they are destined to fall short. Worse, democracies may begin to resemble foreign foes, such as when the U.S. Central Command reportedly created a small network of inauthentic social media profiles to boost messaging.99 Aside from being easily detected, the operation looked suspiciously like the very Russian behavior that had prompted outrage from Washington in 2016. Democracies thus risk responding to autocratic adversaries by poorly imitating them, likely being branded as hypocrites in the process by would-be partners in condemning online propaganda.100

The hard truth for democracies in the Digital Age: their citizens are free to fool themselves, says former Stanford Internet Observatory director Alex Stamos. “But that is not something that’s necessarily being done to them. … hunting down their speech and then changing it or pushing information on them is the kind of impulse that probably makes things worse,” particularly if driven by governments aiming to reassert their own authority or to reimpose some sense of ontological security.101

National leaders’ urge to intervene is also likely spurred in part by the memory of an era when the information space was far more hierarchical and confined. Over the past 25 years, the media ecosystem’s largely top-down structure has nearly vanished, replaced by a more diffuse and immediate — in a word, democratic — architecture.102 Political scientists predicted as much in the late 1990s as the Internet became ubiquitous, noting that it would create new and competing claims upon citizens’ allegiances.103 The Digital Age presented humanity with more facts — all of them more subject to individual interpretation — than it had ever encountered previously.104 Major upticks in conspiracism, political polarization, populism, and distrust followed. For many researchers, this heralded the beginning of a “post-truth” era and signaled the end of a shared sense of reality — absent which, they argue, a democracy ceases to function.105 A dwindling faith in traditional authority came with a variety of sense-making options online.106 A once-dominant monoculture fractured into smaller sub-groupings, some adhering to ideologies that were demonstrably false, incomprehensible to outsiders, or even violently at odds with other ways of thinking.

This disorienting information landscape may very well undermine the social trust necessary for democracy to thrive. Researchers and policymakers, however, can easily conflate this disorientation with, or attribute it to, foreign-orchestrated propaganda. To the extent the latter is effective, its impacts are more likely to be indirect and emergent (intertwined with other social and political phenomena) than direct and contingent (prompting attitudes or behaviors that would otherwise not have occurred).107 Complexity science explores this dynamic in nature: extreme dependency on initial conditions renders the search for singular causes of various phenomena fruitless and makes their effects wildly unpredictable.108
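The complexity-science point can be made concrete with a textbook example. The sketch below (a generic demonstration, not a model of any information operation) iterates the logistic map, a standard toy model of chaotic dynamics, from two starting values that differ by only one part in a million. Within a few dozen steps the trajectories bear no resemblance to one another, which is why searching for a singular cause of a later state is fruitless.

```python
# Illustrative sketch: sensitive dependence on initial conditions
# in the logistic map, a standard example from complexity science.
def logistic_path(x, r=3.9, steps=50):
    """Iterate x -> r * x * (1 - x), returning the whole trajectory."""
    path = [x]
    for _ in range(steps):
        x = r * x * (1 - x)
        path.append(x)
    return path

a = logistic_path(0.200000)  # two trajectories whose starting points
b = logistic_path(0.200001)  # differ by only one part in a million
for t in (0, 10, 25, 50):
    print(f"step {t:2d}: {a[t]:.6f} vs {b[t]:.6f}")
# By step 50 the two runs have fully decorrelated: no observer of the end
# state could recover which tiny initial difference "caused" it.
```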

Online media has made it too easy to conflate the volume of observed foreign attempts at online manipulation with demonstrable consequences.109 However, as intelligence scholars Cormac et al. note, “overestimating the ‘hidden hand’ can lead to conspiracism about foreign actors, undermine trust in democratic institutions, and provide a convenient scapegoat for domestic divisions.”110 This makes it difficult to hold authorities accountable for the domestic conditions that may lead certain communities to accept falsehoods and mischaracterizations, while the online and foreign aspects of their spread occupy the most attention. Yet, as Lippmann conceded, “the slogans of politics are not the essence of politics.”111 Social cohesion stems more from the formative function democratic institutions are supposed to play in the real world — servicing the needs of citizens — than from the often performative role they might now play in Digital Age media.112

Protect Democracy: At Home and Offline

Decisionmakers should situate their threat perceptions, operational decisions, and rhetoric within broader historical, research, and political contexts.113 To begin from the assumption that online propaganda is necessarily successful, merely understudied, may legitimize foreign adversaries’ attempts — thereby doing a measure of their work for them. As Cormac et al. note, the success of covert subversion is determined by prominent “observers judg[ing] that an operation met the goals that proponents set out to achieve.”114 In this regard, a more deliberate policy of ignoring online propaganda may be in order.115 As artificial intelligence now threatens to open the floodgates of synthetically produced content (so-called deepfakes), hypervigilance is likely to be ineffective, entailing significant opportunity and attention costs.116

Direct, causal linkages between propaganda and personal beliefs and behaviors remain speculative, at best. How large crowds behave and how their beliefs are formed — consciously or subconsciously — remains a largely open question.117 Insofar as media exposure does influence people, research suggests it likely does so in concert with myriad other structural, real-world factors that have gone relatively underexplored.118 Media scholar David Karpf explains:

That we could counter adversary advances in digital propaganda with advances of our own, or that we could regulate our way out of this psychometric arms race … is a story with clear villains, clear plans, and precise strategies that might very well be foiled next time around. It is a story that keeps being told, because it is so easy to tell. But we pay a price for the telling and retelling of this story. The problem is that the myth of the digital propaganda wizard is fundamentally at odds with the myth of the attentive public. If the public is so easily duped, then our political elites need not be concerned with satisfying their public obligations.119

To whatever degree foreign subversion does capitalize on domestic discontent, decisionmakers should turn their focus toward safeguarding democratic trust at home, in real-world spaces. A recent major survey of citizens in 22 democracies by the Organization for Economic Co-operation and Development finds that trust in institutions is shaped by three factors. The first is equal-opportunity access to the policymaking process, particularly for disadvantaged, less affluent, less educated, and minority groups. The second is policymakers’ responsiveness to citizen concerns. The final factor is the perceived degree of corruption, cronyism, and nepotism in government, including capture by special interests and the transparency of lobbying.120 These findings are neither surprising nor simple to address. But rather than a reflexive, militarized policy response to attempts at foreign subversion, policymakers should focus first and foremost on these domestic factors. Their prospects for success are arguably no dimmer than those for solving — or hijacking — human fallibility through digital media or technology.121

The democratic form of government, being rooted in free expression, has always been subject to — and required a degree of tolerance for — lies, misinterpretations, and machinations. As was true a century ago, attaining a healthier politics will require national security leaders to construe and depict the information environment as more than just a mechanism that can be calibrated at will.122 Recent studies suggest that alarmism about online manipulation itself might diminish faith in democracy and legitimize calls for excessive curbs on speech.123 Instead, policymakers should assume that even the best-resourced and most sophisticated actors in that space operate under the same heavy constraints, likely to only marginal effect.124

Otherwise, policymakers and researchers will find themselves right back where Lippmann started: pessimistic about the public’s ability to sort facts from nonsense, and thus determined to imbue elites and authorities with the power to interject and interpret on its behalf.125 Left unchecked, such instincts trend more toward technocracy and oligarchy than strengthened democracy.126

Americans must ultimately be slower to accept the premise that foreign online manipulation is more prevalent, their neighbors more gullible, and human behavior more dependent on media than is likely the case.127 Dewey once asserted that, provided the right conditions, American citizens are fully capable of living free from outside coercion.128 In this regard, shutting off avenues for foreign malign influence also means ensuring decisionmakers are not distracted or absolved from servicing the needs of the public.129 Scholars have debated for the past hundred years where confidence is better placed: in the power of the citizen and of democratic institutions, or in the ability of concerted propagandists to subvert them.130 The choice today is no less stark.

 

Gavin Wilde is a senior fellow at the Carnegie Endowment for International Peace, where his research focuses on cyber, propaganda, emerging technology, and Russia issues. He is also an adjunct professor at the Alperovitch Institute for Cybersecurity Studies at Johns Hopkins University.

Acknowledgments: The author wishes to express sincere gratitude to the editors of the Texas National Security Review, two anonymous reviewers, Michael van Landingham, Sam Forsythe, Lilly Muller, Jeff Rogg, Stephanie Carvin, Jill Kastner, Alicia Wanless, Jon Bateman, Yoel Roth, and other Carnegie colleagues for their generous contributions of time and expertise.

 

Image: Sebastian Wallroth (CC BY 4.0)

 

Endnotes

1 U.S. Office of the Director of National Intelligence, Assessing Russian Activities and Intentions in Recent U.S. Elections, January 6, 2017, https://www.dni.gov/files/documents/ICA_2017_01.pdf. Propaganda is defined here as “the deliberate and systematic attempt to shape perceptions, manipulate cognitions, and direct behavior to achieve a response that furthers the desired intent of the propagandist.” For a discussion, see Garth S. Jowett and Victoria J. O’Donnell, Propaganda & Persuasion, 6th ed. (Los Angeles: SAGE Publications, 2018), 1-7.

2 Jeffery L. Bineham, “A Historical Account of the Hypodermic Model in Mass Communication,” Communication Monographs 55, no. 3 (1988): 230-246, https://doi.org/10.1080/03637758809376169.

3 Brian Fung, “Lawmakers Say TikTok Is a National Security Threat, but Evidence Remains Unclear,” CNN, March 21, 2023, https://www.cnn.com/2023/03/21/tech/tiktok-national-security-concerns/index.html.

4 David A. Siegel, “Democratic Institutions and Political Networks,” in The Oxford Handbook of Political Networks, eds. Jennifer Victor et al. (Oxford: Oxford Academic, 2016), https://doi.org/10.1093/oxfordhb/9780190228217.013.35.

5 U.S. Department of State, GEC Special Report: Pillars of Russia’s Disinformation and Propaganda Ecosystem, August 2020, https://www.state.gov/russias-pillars-of-disinformation-and-propaganda-report; and Albert Zhang et al., “Gaming Public Opinion: The CCP’s Increasingly Sophisticated Cyber-Enabled Influence Operations,” Australian Strategic Policy Institute, April 26, 2023, http://www.aspi.org.au/report/gaming-public-opinion.

6 Dustin Volz and Michael R. Gordon, “China Is Investing Billions in Global Disinformation Campaign, U.S. Says,” Wall Street Journal, September 28, 2023, https://www.wsj.com/world/china/china-is-investing-billions-in-global-disinformation-campaign-u-s-says-88740b85.

7 Chico Q. Camargo and Felix M. Simon, “Mis- and Disinformation Studies Are Too Big to Fail: Six Suggestions for the Field’s Future,” Harvard Kennedy School Misinformation Review, September 20, 2022, http://dx.doi.org/10.37016/mr-2020-106; and Jon Bateman and Dean Jackson, “Countering Disinformation Effectively: An Evidence-Based Policy Guide,” Carnegie Endowment for International Peace, January 31, 2023, 11-15, https://carnegieendowment.org/files/Carnegie_Countering_Disinformation_Effectively.pdf.

8 The “Digital Age” refers to the period from the 1970s with the advent of the personal computer, through the subsequent prevalence of digital technologies in the 1980s and the widespread use of the Internet by the late 1990s, to the present. For a discussion, see Tendai S. Muwani et al., “The Global Digital Divide and Digital Transformation: The Benefits and Drawbacks of Living in a Digital Society,” in Digital Transformation for Promoting Inclusiveness in Marginalized Communities, eds. Munyaradzi Zhou et al. (Hershey, PA: IGI Global, 2022), 217-36, http://dx.doi.org/10.4018/978-1-6684-3901-2.ch011.

9 Zarine Kharazian et al., “Some Criticism of Misinformation Research Fails to Accurately Represent the Field It Critiques,” Center for an Informed Public, University of Washington, January 24, 2024, https://www.cip.uw.edu/2024/01/24/misinformation-field-research.

10 Johannes Bubeck and Nikolay Marinov, Rules and Allies: Foreign Election Interventions (Cambridge: Cambridge University Press, 2019); Michael Tomz and Jessica Weeks, “Public Opinion and Foreign Electoral Intervention,” American Political Science Review 114, no. 3 (August 2020): 856–73, https://doi.org/10.1017/S0003055420000064; Daniel Corstange and Nikolay Marinov, “Taking Sides in Other People’s Elections: The Polarizing Effect of Foreign Intervention,” American Journal of Political Science 56, no. 3 (February 2012): 655–70, https://doi.org/10.1111/j.1540-5907.2012.00583.x; and Benjamin E. Goldsmith and Yusaku Horiuchi, “Does Russian Election Interference Damage Support for U.S. Alliances? The Case of Japan,” European Journal of International Relations 29, no. 2 (2023): 427–448, https://doi.org/10.1177/13540661221143214.

11 Dov H. Levin, “Partisan Electoral Interventions by the Great Powers: Introducing the PEIG Dataset,” Conflict Management and Peace Science 36, no. 1 (2019): 88–106, https://doi.org/10.1177/0738894216661190; and Dov H. Levin “When the Great Power Gets a Vote: The Effects of Great Power Electoral Interventions on Election Results,” International Studies Quarterly 60, no. 2 (June 2016): 189-202, https://doi.org/10.1093/isq/sqv016.

12 Sarah Sunn Bush and Lauren Prather, Monitors and Meddlers: How Foreign Actors Influence Local Trust in Elections (Cambridge: Cambridge University Press, 2022), 47.

13 Tom Vanden Brook, “Propaganda Fails in Afghanistan, Report Says,” USA Today, December 4, 2013, https://www.usatoday.com/story/news/nation/2013/12/04/information-operations-propaganda-afghanistan-pentagon/3870179.

14 Patricia Moy et al., “Agenda-Setting, Priming, and Framing” in The International Encyclopedia of Communication Theory and Philosophy, eds. Klaus Jensen et al. (Hoboken, NJ: John Wiley & Sons, Ltd, 2016) 1-13, https://doi.org/10.1002/9781118766804.wbiect266.

15 Joseph Uscinski et al., “Cause and Effect: On the Antecedents and Consequences of Conspiracy Theory Beliefs,” Current Opinion in Psychology 47 (October 2022): https://doi.org/10.1016/j.copsyc.2022.101364.

16 Felix M. Simon and Chico Q. Camargo, “Autopsy of a Metaphor: The Origins, Use and Blind Spots of the ‘Infodemic,’” New Media & Society 25, no. 8 (2023): 2219-2240, https://doi.org/10.1177/14614448211031908.

17 Hugo Mercier, Not Born Yesterday: The Science of Who We Trust and What We Believe (Princeton, NJ: Princeton University Press, 2020).

18 Andreas Jungherr and Ralph Schroeder, “Disinformation and the Structural Transformations of the Public Arena: Addressing the Actual Challenges to Democracy,” Social Media + Society 7, no. 1 (January 2021): https://journals.sagepub.com/doi/full/10.1177/2056305121988928.

19 Joshua Benton, “Good News: Misinformation Isn’t as Powerful as Feared! Bad News: Neither Is Information,” Nieman Lab, January 10, 2023, https://www.niemanlab.org/2023/01/good-news-misinformation-isnt-as-powerful-as-feared-bad-news-neither-is-information.

20 Daniel Williams, “The Marketplace of Rationalizations,” Economics & Philosophy 39, no. 1 (March 2023): 99–123, https://doi.org/10.1017/S0266267121000389.

21 Cameron Anderson et al., “Is the Desire for Status a Fundamental Human Motive? A Review of the Empirical Literature,” Psychological Bulletin 141, no. 3 (May 2015): 574–601, https://doi.org/10.1037/a0038781; and Dan M. Kahan, “Misconceptions, Misinformation, and the Logic of Identity-Protective Cognition,” (May 24, 2017), Cultural Cognition Project Working Paper Series No. 164, Yale Law School, Public Law Research Paper No. 605, Yale Law & Economics Research Paper No. 575, http://dx.doi.org/10.2139/ssrn.2973067.

22 Brendan Nyhan, “Why Fears of Fake News Are Overhyped,” Reasonable Doubt, February 4, 2019, https://gen.medium.com/why-fears-of-fake-news-are-overhyped-2ed9ca0a52c9; and Gideon Lewis-Kraus, “How Harmful Is Social Media?,” The New Yorker, June 3, 2022, https://www.newyorker.com/culture/annals-of-inquiry/we-know-less-about-social-media-than-we-think.

23 Gregory Eady et al., “Exposure to the Russian Internet Research Agency Foreign Influence Campaign on Twitter in the 2016 U.S. Election and Its Relationship to Attitudes and Voting Behavior,” Nature Communications 14, no. 62 (January 9, 2023): https://doi.org/10.1038/s41467-022-35576-9.

24 2020 Election Research Project, “First Four Papers from U.S. 2020 Facebook & Instagram Research Election Study Published in Science and Nature,” Medium, July 27, 2023, https://medium.com/@2020_election_research_project/first-four-papers-from-us-2020-facebook-instagram-research-election-study-published-in-science-c099c235fc6c.

25 Damian J. Ruck et al., “Internet Research Agency Twitter Activity Predicted 2016 U.S. Election Polls,” First Monday 24, no. 7 (July 1, 2019): https://doi.org/10.5210/fm.v24i7.10107.

26 Joshua A. Tucker et al., “Social Media, Political Polarization, and Political Disinformation: A Review of the Scientific Literature,” William & Flora Hewlett Foundation, March 19, 2018, 15-16, 57, https://hewlett.org/library/social-media-political-polarization-political-disinformation-review-scientific-literature.

27 Claes Wallenius, “Do Hostile Information Operations Really Have the Intended Effects? A Literature Review,” Journal of Information Warfare 21, no. 2 (Spring 2022): https://www.jinfowar.com/journal/volume-21-issue-2/do-hostile-information-operations-really-have-intended-effects-literature-review.

28 Bineham, “A Historical Account of the Hypodermic Model in Mass Communication,” 232. Also see Jeffrey Whyte, “A New Geography of Defense: The Birth of Psychological Warfare,” Political Geography 67, (November 2018): 32-45, https://doi.org/10.1016/j.polgeo.2018.09.004.

29 Sean Illing, “Intellectuals Have Said Democracy Is Failing for a Century. They Were Wrong.,” Vox, December 20, 2018, https://www.vox.com/2018/8/9/17540448/walter-lippmann-democracy-trump-brexit.

30 Edward L. Bernays, “The Engineering of Consent,” The Annals of the American Academy of Political and Social Science 250 (March 1947): 113–120, https://www.jstor.org/stable/1024656; and Karl R. Popper, The Open Society and Its Enemies: New One-Volume Edition (Princeton, NJ: Princeton University Press, 2013).

31 Jill Lepore, If Then: How the Simulmatics Corporation Invented the Future (New York: Liveright Publishing, 2020), 33.

32 W. Russell Neuman, The Digital Difference: Media Technology and the Theory of Communication Effects (Cambridge, MA: Harvard University Press, 2018), 28-29.

33 Bineham, “A Historical Account of the Hypodermic Model in Mass Communication.”

34 David Greenberg, “Lippmann vs. Mencken: Debating Democracy,” Raritan 32, no. 2 (Fall 2012): 117-140, https://www.proquest.com/scholarly-journals/lippmann-vs-mencken-debating-democracy/docview/1238178273/se-2.

35 Robert H. Wicks, “Standpoint: Joseph Klapper and the Effects of Mass Communication: A Retrospective,” Journal of Broadcasting & Electronic Media 40, no. 4 (September 1996): 563–569, https://doi.org/10.1080/08838159609364377.

36 Andrea Scarantino and Gualtiero Piccinini, “Information Without Truth,” Metaphilosophy 41, no. 3 (April 2010): 313–330, https://www.jstor.org/stable/24439828.

37 Jowett and O’Donnell, Propaganda & Persuasion, 54-55.

38 Gabriel A. Almond and Stephen J. Genco, “Clouds, Clocks, and the Study of Politics,” World Politics 29, no. 4 (July 1977): 489-522, https://doi.org/10.2307/2010037; Lepore, If Then, 54-55.

39 Peter Kirschenmann, “Problems of Information in Dialectical Materialism,” Studies in Soviet Thought 8, no. 2/3 (1968): 105–121, https://www.jstor.org/stable/20098325.

40 John Arquilla and David Ronfeldt, “Information, Power, and Grand Strategy: In Athena’s Camp - Section 2” (Santa Monica, CA: RAND Corporation, 1997), 145, https://apps.dtic.mil/sti/pdfs/ADA485246.pdf. This philosophy persisted among Russian, Chinese — and later, Western — military thinkers. For a discussion, see Tim Stevens, “Information Matters: Informational Conflict and the New Materialism” (September 14, 2012), paper presented at the Millennium Annual Conference, “Materialism and World Politics,” October 20-21, 2012, London School of Economics, https://ssrn.com/abstract=2146565.

41 Albert O. Hirschman, A Bias for Hope: Essays on Development and Latin America (New Haven, CT: Yale University Press, 1971), 27.

42 John Lewis Gaddis, “International Relations Theory and the End of the Cold War,” International Security 17, no. 3 (Winter 1992-1993): 5, 53-54, https://doi.org/10.2307/2539129.

43 Nancy E. Bernhard, U.S. Television News and Cold War Propaganda, 1947-1960 (Cambridge: Cambridge University Press, 1999), 83-84.

44 Jennifer Miller, “Democracy and Misinformation,” Perspectives on History, June 10, 2019, https://www.historians.org/research-and-publications/perspectives-on-history/summer-2019/democracy-and-misinformation.

45 Andrea Friedman, Citizenship in Cold War America: The National Security State and the Possibilities of Dissent (Amherst, MA: University of Massachusetts Press, 2014), 17.

46 Marshall McLuhan, “Electronics and the Changing Role of Print,” Audio Visual Communication Review 8, no. 5 (1960): 74–83, https://www.jstor.org/stable/30216955.

47 James W. Carey, Communication as Culture: Essays on Media and Society, rev. ed. (New York: Routledge, 2009), 14-23.

48 Dietram A. Scheufele and David Tewksbury, “Framing, Agenda Setting, and Priming: The Evolution of Three Media Effects Models,” Journal of Communication 57, no. 1 (March 2007): 9–20, https://doi.org/10.1111/j.0021-9916.2007.00326.x.

49 Stuart Hall, “Encoding and Decoding in the Television Discourse,” CCCS stenciled paper no. 7 (Birmingham: Centre for Contemporary Cultural Studies, 1973), https://core.ac.uk/download/pdf/81670115.pdf.

50 Lepore, If Then, 3-4.

51 Michael H. Goldhaber, “Attention Shoppers!,” Wired, December 1, 1997, https://www.wired.com/1997/12/es-attention.

52 Hsuan-Ting Chen, “Spiral of Silence on Social Media and the Moderating Role of Disagreement and Publicness in the Network: Analyzing Expressive and Withdrawal Behaviors,” New Media & Society 20, no. 10 (October 2018): 3917–36, https://doi.org/10.1177/1461444818763384.

53 Evgeny Morozov, “Capitalism’s New Clothes,” The Baffler, February 4, 2019, https://thebaffler.com/latest/capitalisms-new-clothes-morozov.

54 Heidi Tworek, “Cambridge Analytica, Trump, and the New Old Fear of Manipulating the Masses,” Nieman Lab, May 15, 2017, https://www.niemanlab.org/2017/05/cambridge-analytica-trump-and-the-new-old-fear-of-manipulating-the-masses.

55 Tori Otten, “TikTok Is a Problem—but Not Our Biggest Social Media Problem,” The New Republic, March 24, 2023, https://newrepublic.com/article/171372/tiktok-not-congress-biggest-social-media-problem.

56 Matthew Rosenberg et al., “How Trump Consultants Exploited the Facebook Data of Millions,” New York Times, March 17, 2018, https://www.nytimes.com/2018/03/17/us/politics/cambridge-analytica-trump-campaign.html; and Sapna Maheshwari and Amanda Holpuch, “Why the U.S. is Weighing Whether to Ban TikTok,” New York Times, May 23, 2023, https://www.nytimes.com/article/tiktok-ban.html.

57 David Beer, The Data Gaze: Capitalism, Power and Perception (London: SAGE Publications Ltd, 2019), 25.

58 Lisa Gitelman, ed., Raw Data Is an Oxymoron (Cambridge, MA: MIT Press, 2013), 3.

59 Danah Boyd and Kate Crawford, “Critical Questions for Big Data: Provocations for a Cultural, Technological, and Scholarly Phenomenon,” Information, Communication & Society 15, no. 5 (May 2012): 662–679, https://doi.org/10.1080/1369118X.2012.678878.

60 David Biro, “How Big Data Is Changing Science,” Mosaic Science, October 2, 2018, https://medium.com/mosaic-science/how-big-data-is-changing-science-97201e911bf0.

61 Sun-ha Hong, Technologies of Speculation: The Limits of Knowledge in a Data-Driven Society (New York: New York University Press, 2020), 13-52; Antoinette Rouvroy, “The End(s) of Critique: Data-Behaviorism vs. Due-Process,” in Privacy, Due Process, and the Computational Turn, eds. Mireille Hildebrandt and Katja de Vries (New York: Routledge, 2013), 143–168; and Richard E. Petty and Duane T. Wegener, “Attitude Change: Multiple Roles for Persuasion Variables,” in The Handbook of Social Psychology, 4th ed. (New York: McGraw-Hill, 1998), 323–390.

62 Hong, Technologies of Speculation, 7, 20.

63 Claudia Aradau and Tobias Blanke, Algorithmic Reason: The New Government of Self and Other (Oxford: Oxford University Press, 2022), 1-18.

64 Steven T. Smith et al., “Influence Estimation on Social Media Networks Using Causal Inference,” IEEE Statistical Signal Processing Workshop, Freiburg im Breisgau, Germany, 2018, 328-332, https://doi.org/10.1109/SSP.2018.8450823; Lepore, If Then, 326.

65 Hong, Technologies of Speculation, 19.

66 Rob Kitchin, “Big Data, New Epistemologies and Paradigm Shift,” Big Data & Society 1, no. 1 (April 2014): 1–12, https://doi.org/10.1177/2053951714528481; David Pinsof, “The Evolution of Social Paradoxes,” PsyArXiv (March 2023), https://doi.org/10.31234/osf.io/avh9t; and Eran Fisher and Yoav Mehozay, “How Algorithms See Their Audience: Media Epistemes and the Changing Conception of the Individual,” Media, Culture & Society 41, no. 8 (March 2019): 1176-1191, https://doi.org/10.1177/0163443719831598.

67 Neuman, The Digital Difference, 95; Hong, Technologies of Speculation, 24-25; Kitchin, “Big Data, New Epistemologies and Paradigm Shift;” and Kate Crawford, The Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence (New Haven, CT: Yale University Press, 2021), 214.

68 Neuman, The Digital Difference, 33, 89-90.

69 Jill Kastner and William C. Wohlforth, “A Measure Short of War,” Foreign Affairs, June 22, 2021, https://www.foreignaffairs.com/articles/world/2021-06-22/measure-short-war; and Heidi Tworek, “Disinformation: It’s History,” Centre for International Governance Innovation, July 14, 2021, https://www.cigionline.org/articles/disinformation-its-history.

70 Tommaso Venturini, “From Fake to Junk News, the Data Politics of Online Virality,” in Data Politics: Worlds, Subjects, Rights, eds. Didier Bigo et al., (London: Routledge, 2019), https://hal.science/hal-02003893.

71 Daniyar Sabitov, “Problematics of Big Data Representation in Media” (master’s thesis, Charles University in Prague, 2020), https://doi.org/10.13140/RG.2.2.14133.91365.

72 Jowett and O’Donnell, Propaganda & Persuasion, 206-207.

73 Bernard Marr, “How Much Data Do We Create Every Day? The Mind-Blowing Stats Everyone Should Read,” Forbes, May 18, 2018, https://www.forbes.com/sites/bernardmarr/2018/05/21/how-much-data-do-we-create-every-day-the-mind-blowing-stats-everyone-should-read.

74 Christie Aschwanden, “We’re All ‘P-Hacking’ Now,” Wired, November 26, 2019, https://www.wired.com/story/were-all-p-hacking-now.

75 For an extensive review of positivism and its effects on the social sciences, see Jason Blakely, We Built Reality: How Social Science Infiltrated Culture, Politics, and Power (New York: Oxford University Press, 2020).

76 Gary Smith, Distrust: Big Data, Data-Torturing, and the Assault on Science (Oxford: Oxford University Press, 2023), 126, 154, 158, 197-199.

77 Smith, Distrust: Big Data, Data-Torturing, and the Assault on Science, 186.

78 Joseph Bernstein, “Bad News: Selling the Story of Disinformation,” Harper’s, September 2021, https://harpers.org/archive/2021/09/bad-news-selling-the-story-of-disinformation; and C. W. Anderson, “Fake News Is Not a Virus: On Platforms and Their Effects,” Communication Theory 31, no. 1 (February 2021): 42–61, https://doi.org/10.1093/ct/qtaa008.

79 Colin F. Jackson, “Information Is Not a Weapons System,” Journal of Strategic Studies 39, no. 5–6 (2016): 820-846, https://doi.org/10.1080/01402390.2016.1139496; Tim Hwang, Subprime Attention Crisis: Advertising and the Time Bomb at the Heart of the Internet (New York: Farrar, Straus, and Giroux, 2020), 62; and Jesse Frederik and Maurits Martijn, “The New Dot Com Bubble Is Here: It’s Called Online Advertising,” The Correspondent, November 6, 2019, https://thecorrespondent.com/100/the-new-dot-com-bubble-is-here-its-called-online-advertising.

80 R. I. M. Dunbar, “Do Online Social Media Cut through the Constraints That Limit the Size of Offline Social Networks?,” Royal Society Open Science 3, no. 1 (January 2016), https://doi.org/10.1098/rsos.150292.

81 Tung-Hui Hu, A Prehistory of the Cloud (Cambridge, MA: MIT Press, 2015), xx.

82 Margaret Rouse, “Garbage In, Garbage Out,” Techopedia, January 4, 2017, https://www.techopedia.com/definition/3801/garbage-in-garbage-out-gigo.

83 Jane Kellett Cramer, “National Security Panics: Overestimating Threats to National Security” (PhD diss., Massachusetts Institute of Technology, 2002), 30, 34–35, 37, 85, http://hdl.handle.net/1721.1/8312.

84 Robert Jervis, Perception and Misperception in International Politics, rev. ed. (Princeton, NJ: Princeton University Press, 2017), 67–76.

85 Lennart Maschmeyer et al., “Donetsk Don’t Tell – ‘Hybrid War’ in Ukraine and the Limits of Social Media Influence Operations,” Journal of Information Technology & Politics (May 14, 2023): 1–16, https://doi.org/10.1080/19331681.2023.2211969.

86 Timothy D. Haugh et al., “16th Air Force and Convergence for the Information War,” The Cyber Defense Review 5, no. 2 (Summer 2020): 29–43, https://cyberdefensereview.army.mil/CDR-Content/Articles/Article-View/Article/2288588/16th-air-force-and-convergence-for-the-information-war; Herbert Lin, “Doctrinal Confusion and Cultural Dysfunction in DoD: Regarding Information Operations, Cyber Operations, and Related Concepts,” The Cyber Defense Review 5, no. 2 (Summer 2020): 89–106, https://www.jstor.org/stable/26923525; and Sarah P. White, “The Organizational Determinants of Military Doctrine: A History of Army Information Operations,” Texas National Security Review 6, no. 1 (Winter 2022–2023): 51–78, http://dx.doi.org/10.26153/tsw/44440.

87 Hans Morgenthau, Politics among Nations: The Struggle for Power and Peace, 5th ed. (New York: Knopf, 1978), 138–140.

88 Jennifer Mitzen, “Ontological Security in World Politics: State Identity and the Security Dilemma,” European Journal of International Relations 12, no. 3 (September 2006): 341–370, https://doi.org/10.1177/1354066106067346; and Amir Lupovici, “Ontological Security, Cyber Technology, and States’ Responses,” European Journal of International Relations 29, no. 1 (March 2023): 153–178, https://doi.org/10.1177/13540661221130958.

89 Elgin M. Brunner and Myriam Dunn Cavelty, “The Formation of In-Formation by the U.S. Military: Articulation and Enactment of Infomanic Threat Imaginaries on the Immaterial Battlefield of Perception,” Cambridge Review of International Affairs 22, no. 4 (December 2009): 629–646, https://doi.org/10.1080/09557570903325454.

90 Tim Mak, “Senate Report: Russians Used Social Media Mostly To Target Race In 2016,” NPR, October 8, 2019, https://www.npr.org/2019/10/08/768319934/senate-report-russians-used-social-media-mostly-to-target-race-in-2016.

91 Ellen Nakashima, “U.S. Cyber Command Operation Disrupted Internet Access of Russian Troll Factory on Day of 2018 Midterms,” Washington Post, February 27, 2019, https://www.washingtonpost.com/world/national-security/us-cyber-command-operation-disrupted-internet-access-of-russian-troll-factory-on-day-of-2018-midterms/2019/02/26/1827fc9e-36d6-11e9-af5b-b51b7ff322e9_story.html; Devlin Barrett, Sari Horwitz, and Rosalind S. Helderman, “Russian Troll Farm, 13 Suspects Indicted in 2016 Election Interference,” Washington Post, February 17, 2018, https://www.washingtonpost.com/world/national-security/russian-troll-farm-13-suspects-indicted-for-interference-in-us-election/2018/02/16/2504de5e-1342-11e8-9570-29c9830535e5_story.html; U.S. Department of State, “Disarming Disinformation: Our Shared Responsibility,” last updated February 8, 2024, https://www.state.gov/disarming-disinformation/; U.S. Office of the Director of National Intelligence, “Intelligence Community Assessment on Foreign Threats to the 2020 U.S. Federal Elections,” March 10, 2021, https://www.dni.gov/files/ODNI/documents/assessments/ICA-declass-16MAR21.pdf; U.S. Department of the Treasury, “Treasury Escalates Sanctions Against the Russian Government’s Attempts to Influence U.S. Elections,” April 15, 2021, https://home.treasury.gov/news/press-releases/jy0126; and U.S. Office of the Director of National Intelligence, “Statement by NCSC Director William Evanina: Election Threat Update for the American Public,” August 7, 2020, https://www.dni.gov/index.php/newsroom/press-releases/press-releases-2020/3473-statement-by-ncsc-director-william-evanina-election-threat-update-for-the-american-public.

92 David Bauder et al., “Fox, Dominion Reach $787M Settlement over Election Claims,” AP News, April 18, 2023, https://apnews.com/article/fox-news-dominion-lawsuit-trial-trump-2020-0ac71f75acfacc52ea80b3e747fb0afe.

93 Ellen Nakashima, “Pentagon Opens Sweeping Review of Clandestine Psychological Operations,” Washington Post, September 19, 2022, https://www.washingtonpost.com/national-security/2022/09/19/pentagon-psychological-operations-facebook-twitter.

94 Grigory L. Tulchinsky, “Information Wars as a Conflict of Interpretations: Activating the ‘Third Party,’” Russian Journal of Communication 5, no. 3 (2013): 244–251, https://doi.org/10.1080/19409419.2013.822054; and Whyte, The Birth of Psychological War, 5.

95 Amy Goodman, “Dept. of Homeland Security Ramps Up Efforts to Police Online Speech on Ukraine, COVID & Afghanistan,” Democracy Now!, November 4, 2022, https://www.democracynow.org/2022/11/4/dhs_police_online_discourse_the_intercept.

96 Chris Stokel-Walker, “Banning TikTok Is a Bad Solution to the Wrong Problem,” Washington Post, March 15, 2023, https://www.washingtonpost.com/opinions/2023/03/15/tiktok-ban-propaganda-bytedance-china; and Rebecca Picciotto, “White House Urges Senate to ‘Move Swiftly’ on TikTok Bill as Lawmakers Drag Their Heels,” CNBC, March 17, 2024, https://www.cnbc.com/2024/03/17/white-house-senate-tiktok-bill.html.

97 Mayze Teitler, “Missouri v. Biden Raises More First Amendment Questions Than It Answers,” Just Security, July 19, 2023, https://www.justsecurity.org/87311/missouri-v-biden-raises-more-questions-than-it-answers.

98 Albert O. Hirschman, The Rhetoric of Reaction: Perversity, Futility, Jeopardy (Cambridge, MA: Harvard University Press, 1991), 136–137.

99 Nakashima, “Pentagon Opens Sweeping Review of Clandestine Psychological Operations.”

100 Alicia Wanless, “There Is No Getting Ahead of Disinformation Without Moving Past It,” Lawfare, May 8, 2023, https://www.lawfareblog.com/there-no-getting-ahead-disinformation-without-moving-past-it.

101 Peter Kafka, “Are We Too Worried about Misinformation?,” Vox, January 16, 2023, https://www.vox.com/recode/2023/1/16/23553802/misinformation-twitter-facebook-alex-stamos-peter-kafka-media-column; and Noortje Marres, “Why We Can’t Have Our Facts Back,” Engaging Science, Technology, and Society 4 (July 24, 2018): 423–443, https://doi.org/10.17351/ests2018.188.

102 Martin Gurri, The Revolt of the Public and the Crisis of Authority in the New Millennium (San Francisco: Stripe Press, 2018), 395–396.

103 Robert O. Keohane and Joseph S. Nye, “Power and Interdependence in the Information Age,” Foreign Affairs 77, no. 5 (September/October 1998): 81–94, https://doi.org/10.2307/20049052.

104 Nils Gilman, “Dictatorships and Data Standards,” The American Interest, April 17, 2018, https://www.the-american-interest.com/2018/04/17/dictatorships-data-standards.

105 Doug Irving, “Truth Decay Is Putting U.S. National Security at Risk,” RAND Corporation, June 28, 2023, https://www.rand.org/blog/rand-review/2023/06/truth-decay-is-putting-us-national-security-at-risk.html; and Jonathan Haidt, “Why the Past 10 Years of American Life Have Been Uniquely Stupid,” The Atlantic, April 11, 2022, https://www.theatlantic.com/magazine/archive/2022/05/social-media-democracy-trust-babel/629369.

106 W. Lance Bennett and Barbara Pfetsch, “Rethinking Political Communication in a Time of Disrupted Public Spheres,” Journal of Communication 68, no. 2 (April 2018): 243–253, https://doi.org/10.1093/joc/jqx017.

107 Jowett and O’Donnell, Propaganda & Persuasion, 199.

108 James Gleick, Chaos: Making a New Science (New York: Penguin Books, 2008), 96.

109 Gillian Murphy et al., “What Do We Study When We Study Misinformation? A Scoping Review of Experimental Research (2016-2022),” Harvard Kennedy School Misinformation Review, November 15, 2023, https://misinforeview.hks.harvard.edu/article/what-do-we-study-when-we-study-misinformation-a-scoping-review-of-experimental-research-2016-2022.

110 Rory Cormac et al., “What Constitutes Successful Covert Action? Evaluating Unacknowledged Interventionism in Foreign Affairs,” Review of International Studies 48, no. 1 (January 2022): 111–128, https://doi.org/10.1017/S0260210521000231.

111 Greenberg, “Lippmann vs. Mencken: Debating Democracy,” 119.

112 Yuval Levin, A Time to Build: From Family and Community to Congress and the Campus, How Recommitting to Our Institutions Can Revive the American Dream (New York: Basic Books, 2020), 29–42.

113 Anderson, “Fake News Is Not a Virus.”

114 Cormac et al., “What Constitutes Successful Covert Action? Evaluating Unacknowledged Interventionism in Foreign Affairs,” 118.

115 Anastasia Kozyreva et al., “Critical Ignoring as a Core Competence for Digital Citizens,” Current Directions in Psychological Science 32, no. 1 (2023): 81–88, https://doi.org/10.1177/09637214221121570.

116 Josh A. Goldstein et al., “Generative Language Models and Automated Influence Operations: Emerging Threats and Potential Mitigations,” Stanford Internet Observatory, January 2023, https://cdn.openai.com/papers/forecasting-misuse.pdf; and Michael Caulfield, “Recalibrating Our Approach to Misinformation,” EdSurge, December 19, 2018, https://www.edsurge.com/news/2018-12-19-recalibrating-our-approach-to-misinformation.

117 Renée DiResta, “How Online Mobs Act Like Flocks Of Birds,” Noema, November 3, 2022, https://www.noemamag.com/how-online-mobs-act-like-flocks-of-birds.

118 Jungherr and Schroeder, “Disinformation and the Structural Transformations of the Public Arena.”

119 David Karpf, “On Digital Disinformation and Democratic Myths,” MediaWell, December 10, 2019, https://mediawell.ssrc.org/articles/on-digital-disinformation-and-democratic-myths.

120 Organization for Economic Co-operation and Development, Building Trust to Reinforce Democracy: Main Findings from the 2021 OECD Survey on Drivers of Trust in Public Institutions (July 13, 2022), https://doi.org/10.1787/b407f99c-en.

121 For a discussion of digital literacy’s role in countering foreign propaganda, see Calder Walton, “What’s Old Is New Again: Cold War Lessons for Countering Disinformation,” Texas National Security Review 5, no. 4 (Fall 2022): 49–72, http://dx.doi.org/10.26153/tsw/43940.

122 Théophile Lenoir, “Reconsidering the Fight Against Disinformation,” Tech Policy Press, August 1, 2022, https://techpolicy.press/reconsidering-the-fight-against-disinformation.

123 Andreas Jungherr and Adrian Rauchfleisch, “Negative Downstream Effects of Alarmist Disinformation Discourse: Evidence from the United States,” Political Behavior (January 12, 2024), https://doi.org/10.1007/s11109-024-09911-3.

124 Kate Starbird et al., “Disinformation as Collaborative Work: Surfacing the Participatory Nature of Strategic Information Operations,” Proceedings of the ACM on Human-Computer Interaction 3, Issue CSCW (November 7, 2019), https://doi.org/10.1145/3359229.

125 Benton, “Good News: Misinformation Isn’t as Powerful as Feared! Bad News: Neither Is Information.”

126 Illing, “Intellectuals Have Said Democracy Is Failing for a Century. They Were Wrong.”

127 Alice E. Marwick, “Why Do People Share Fake News? A Sociotechnical Model of Media Effects,” Georgetown Law Technology Review 2, no. 2 (July 2018), https://georgetownlawtechreview.org/why-do-people-share-fake-news-a-sociotechnical-model-of-media-effects/GLTR-07-2018.

128 John Dewey, “Creative Democracy—The Task Before Us,” in John Dewey: The Later Works, 1925–1953, Volume 14: 1939–1941 (Carbondale, IL: Southern Illinois University Press, 2008), 224–230.

129 John Dewey, The Public and Its Problems (New York: H. Holt and Company, 1927), 206.

130 Matthew Festenstein, “Dewey’s Political Philosophy,” The Stanford Encyclopedia of Philosophy, eds. Edward N. Zalta and Uri Nodelman (Spring 2023), https://plato.stanford.edu/archives/spr2023/entries/dewey-political.
