The Scholar


What’s Old Is New Again: Cold War Lessons for Countering Disinformation

Hostile foreign states are using weaponized information to attack the United States. Russia and China are disseminating disinformation about domestic U.S. race relations and COVID-19 to undermine and discredit the U.S. government. These information warfare attacks, which threaten U.S. national security, may seem new, but they are not. Using an applied history methodology and a wealth of previously classified archival records, this paper uses two case studies to reveal how and why a hostile foreign state, the Soviet Union, targeted America with similar disinformation in the past. The methods that the U.S. government devised during the Cold War to counter Soviet disinformation are still relevant, even in today’s information landscape. By applying that history, this paper recommends a new U.S. strategy for countering hostile state disinformation: promoting digital literacy through a whole-of-society, generational effort. Establishing a coherent strategy is important because disinformation will be a major theme of 21st-century international security as societies and governments become increasingly interconnected.

Since 2016, hostile foreign states have been using weaponized information to attack the United States. There have been two prominent targets of such attacks: First, disinformation has been used to exploit domestic U.S. race relations, and second, it has been used to allege U.S. culpability for the COVID-19 pandemic. Race was the principal target of Russian disinformation during the 2016 U.S. presidential election, with the aim of assisting Donald J. Trump and undermining his opponent, Hillary Clinton.1 Race again became a target of Russian disinformation when protests against police violence erupted across the United States in May 2020. Meanwhile, China has borrowed Russia’s disinformation playbook to disseminate conspiracies alleging the U.S. government’s responsibility for the novel coronavirus.2 Russian and Chinese information warfare attacks on the United States about race and COVID-19 may appear new, but in fact they have a long history. At the start of the 21st century, the U.S. government has seemingly forgotten that history: the threat that Kremlin disinformation previously posed to U.S. national security, and, crucially, Washington’s own past experience with countering it.

Exploiting U.S. race relations and disseminating disinformation about pandemics were major methods of Soviet disinformation during the later stages of the Cold War. The Kremlin’s aim was to discredit the U.S. government, delegitimize its institutions, disorient and polarize American society, and cast doubt on the true account of events in order to undermine the country’s effective functioning as a democracy. Although the technology that Russia is deploying today to spread disinformation is new, its strategy is the same as that of its predecessor, the Soviet Union. Equally important, the methods that Washington previously devised for debunking Soviet disinformation about race relations and pandemics are still applicable today. Using a growing field of public policy research — applied history — this paper argues that the most effective way to counter hostile foreign state disinformation is through a whole-of-society effort in which U.S. intelligence agencies work with other government bodies as well as the private sector to expose audiences to accurate information and encourage digital literacy.

Despite the disinformation threats that the United States faces, it lacks a coherent, coordinated strategy to counter them.3 Individual government agencies are confronting disinformation, as are the social media technology giants, but their efforts are not coordinated across the spectrum of public and private sector stakeholders.4 Drawing on a wealth of previously classified intelligence records from multiple international archives and exclusive interviews with intelligence practitioners, this paper offers recommendations for creating such a strategy. Establishing a coherent strategy for countering disinformation is important because, far from being isolated incidents, recent efforts to influence particular events, like the 2016 and 2020 U.S. elections, indicate that disinformation will be a major, persistent theme of 21st-century international security. Foreign states will increasingly weaponize digital information to attack democracies in order to advance their geopolitical grand strategies.

Russia’s invasion of Ukraine in February 2022 damaged its past efforts to use disinformation to divide NATO: It created more resolve in the alliance than arguably anything else in NATO history. It would be a mistake, however, to think that Russian disinformation has been relegated to the past. In fact, there are good reasons to believe that new opportunities for Russian disinformation against the West will open as the war in Ukraine grinds on in the coming winter. For example, we are likely to find Russian disinformation campaigns aimed at exploiting social dislocation caused by continued oil price spikes and food supply disruptions from Ukraine, and resulting increases in inflation in Western countries. Given these circumstances, it is all the more important to understand Russian disinformation. Studying its history is the best way to do so.

Before Russia’s interference in the 2016 U.S. election, the history of Soviet active measures was a niche academic subject within intelligence history. Its study was undertaken by former practitioners and historians like Christopher Andrew.5 More recently, Thomas Rid has built on Andrew’s work.6 Since 2016, the U.S. government and public policy scholars have busily set about “re-learning” Moscow’s long history of using active measures.7 Scholars are also, more generally, turning to the Cold War as an analog for the resurgence of great powers in the 21st century.8 However, still missing from recent official investigations and scholarship are lessons that can be gleaned from the U.S. government’s own public policy efforts to counter Soviet disinformation in the later stages of the Cold War.9 While the diagnosis of the disinformation problem has advanced, scholars and practitioners have still failed to learn from past treatments. By studying and applying their history, this paper thus prescribes disinformation cures. When it comes to disinformation and debunking it, what is old is new again.

This paper makes three principal arguments. First, applied history is a valuable field of public policy research, as demonstrated by the history of intelligence, disinformation, and international security. Second, the history of Soviet disinformation targeting U.S. domestic racial protests and alleging Washington’s use of bioweapons to cause pandemics shows how and why hostile foreign states use disinformation to attack liberal democracies. Contrary to past and present claims about foreign malign “hidden hands” in U.S. domestic affairs, the Soviet Union’s disinformation strategy, and its impact, were in fact limited: It targeted existing divisions within American society and did nothing more than amplify them. Third, the U.S. government devised policies for countering Soviet disinformation about race and pandemics that are still applicable, even in today’s digital information landscape, where cyber interconnectivity and the prevalence of social media mean that citizens and policymakers drink from a daily firehose of information. Although the digital revolution has offered unprecedented capabilities through which states can disseminate disinformation, the history of what came before is still relevant and applicable. In fact, it is impossible to understand contemporary foreign state disinformation strategies without appreciating their past. This will become an increasingly important subject as societies become more interconnected this century. The digitized 21st century will witness “infodemic” events, producing so much information that it will be difficult, if not impossible, for audiences to distinguish facts from state-sponsored lies.

This paper proceeds by assessing Soviet disinformation strategy and using two case studies that illustrate it: the KGB’s targeting of U.S. race relations at the 1984 Los Angeles Olympics, and KGB allegations that the U.S. government created the AIDS virus. These two case studies were chosen because they show the separate aims of Soviet disinformation: to influence a specific outcome and to create a broad atmosphere of distrust. They were also chosen because they reveal the impact of disinformation and, crucially, how the U.S. government debunked that disinformation, providing lessons for doing so today.

The paper uses an applied history methodology: the express effort to use the past to inform current decisions and policymaking. The use of history for public policy has a long tradition, but is currently experiencing a renaissance.10 This article follows a methodological and analytic approach set out by the late Harvard professors Ernest May and Richard Neustadt in their seminal book, Thinking in Time, in which they demonstrate that, to provide policymakers with valuable insights from history, it is necessary to analyze similarities and differences between past and present situations.11 While the history of (counter)disinformation can be applied, some preliminary words of caution are necessary about doing so. History never repeats itself, so lessons cannot be lifted from one period and simply transplanted to another. History is no better at predicting the future than any other field of public-policy scholarship. It is also difficult to take deep historical expertise and turn it into easily consumed products for policymakers. As one scholar-turned-foreign policy adviser has put it, “History does not lend itself easily to the PowerPoints or executive summaries on which our policymakers increasingly rely.”12 However, not studying history is equally mistaken: It would mean living like an amnesiac, failing to learn from past experiences and repeating old mistakes.

Studying history in depth — not through cursory PowerPoint presentations — offers decision-makers insight into how their predecessors tackled analogous situations in the past and illuminates policy decisions today. Scholars are rightly concerned about anachronism and are wary of making “presentist” historical arguments. But May and Neustadt’s methodology, to embrace both similarities and differences with the past, helps to alleviate those concerns. Some scholars prefer “historical sensibility” or “historicism” to the term “applied history,” but the end purpose is the same: to inform contemporary policy decisions by identifying precedents and finding analogs with the past, while avoiding doing violence to history by applying bad historical lessons.13

Disinformation Tactics and Strategy

While disinformation currently attracts significant attention in the news, it frequently lacks a clear definition. It can most usefully be understood as false, misleading, or distracting information that is deliberately spread. To be effective, it must be unattributable to a government, which usually necessitates the involvement of a clandestine intelligence service. It is distinct from propaganda, the purpose of which is to persuade, and also from misinformation, which is false or misleading information that a government officially and openly produces.14 When a government secretly plants a false story in a newspaper and disguises the authorship, that is disinformation. When it publicly provides misleading “alternative facts,” that is misinformation.

Disinformation is an ancient part of warfare. However, during the 20th century, it was institutionalized in unprecedented ways. In Soviet Russia, the Bolsheviks adopted and expanded disinformation tactics that their tsarist predecessors had used against them. Disinformation was thereafter part of the Bolsheviks’ political warfare against their ideological enemies, foreign and domestic. The Soviet secret police, the Cheka, later known as the KGB, had a disinformation unit in its foreign intelligence branch from its early days.15 During World War II, Soviet Russia and the other major belligerent powers used disinformation in their war efforts. In the war’s early stages, British intelligence forged documents to deceive President Franklin D. Roosevelt’s administration and draw America into the war on Britain’s side. Once in the war, the British and U.S. governments then used disinformation as part of their successful strategic deception of the Axis powers, using a “bodyguard of lies,” as Prime Minister Winston Churchill put it, before the Allied invasion of Europe on D-Day, June 1944.16 Britain’s wartime foreign intelligence and sabotage services, MI6 and SOE, had forgery departments, as did America’s wartime intelligence service, the OSS.17

In the post-war years, after the wartime Grand Alliance had disintegrated, its members — the United Kingdom, the United States, and the Soviet Union — used disinformation as part of their respective ideologically driven Cold War grand strategies. One of the first acts undertaken by the CIA after its establishment in 1947 was to meddle in Italy’s democratic elections the following year, forging documents to discredit socialist candidates as communists and bribing moderate politicians.18 It did so, in cooperation with British intelligence, to counter Soviet clandestine subversion that was similarly targeting that election. Moscow also rigged elections elsewhere in post-war Eastern Europe. Thereafter, secretly planting stories became a common British and U.S. covert tactic during the Cold War.

As part of the British government’s controversial efforts to reduce Jewish immigration to the Mandate of Palestine, MI6 sabotaged boats carrying Jewish refugees there — including Holocaust survivors — and then forged documents claiming that a bogus pro-Soviet Arab group had undertaken the sabotage. The aim of MI6’s disinformation and sabotage operation, aptly named EMBARRASS, was to trick Soviet authorities into reducing Jewish immigration to Palestine from behind the Iron Curtain in order to prevent a Jewish-Arab civil war in Palestine.19

Meanwhile, as part of the CIA and MI6’s efforts to instigate a coup in Iran in 1953, the CIA planted articles and editorial cartoons in Iranian newspapers to discredit Prime Minister Mohammad Mosaddegh and bolster their preferred leader, Shah Mohammad Reza Pahlavi.20 The CIA used similar disinformation tactics in Latin America, both to topple the government of Jacobo Árbenz in Guatemala in 1954 and later to discredit Chile’s socialist president, Salvador Allende, in 1973. When Washington feared that Indonesian president Sukarno had pro-communist beliefs, a CIA team produced a sex film featuring a Sukarno lookalike in order to embarrass and blackmail him — an effort that failed. (Sukarno’s well-known libido also made him the target of an unsuccessful KGB plot to influence him by using a “honeytrap.”)21 During the Soviet invasion of Afghanistan in 1979, Soviet intelligence disseminated disinformation that Afghans “welcomed” Moscow’s “fraternal assistance.”22 Meanwhile, to discredit the invasion, the CIA planted stories of dubious validity in Afghan newspapers, stamped with the Soviet military seal to make them appear official, announcing “invasion day celebrations” at Soviet embassies across the Middle East. One senior CIA officer concerned with Soviet affairs has recalled that the agency regularly disrupted Soviet tactical intelligence operations against the United States by planting false or misleading stories to discredit them.23

Although Cold War-era Kremlin disinformation bore similarities to Western disinformation efforts, it was different in scope, scale, and nature. British and U.S. intelligence used disinformation to tactically support covert actions, like those noted above. For the Kremlin, disinformation had its own strategic goal: to destabilize the society of its “Main Adversary,” the United States, and disrupt relations between it and other Western nations.24 It did so to defend its strategic interests against perceived U.S. and NATO aggression.25 Western intelligence services correspondingly tried to use secret, non-avowed propaganda to amplify divisions in Soviet and Eastern Bloc societies. However, because these were police states, lacking free press and freedom of expression, Western intelligence services were unable to wage information warfare behind the Iron Curtain as Soviet intelligence did in the West. There was thus a fundamental asymmetry between Soviet and Western disinformation during the Cold War. Additionally, Soviet foreign disinformation against Western countries was an extension of its domestic “political warfare” — Eastern Bloc populations were the principal targets of Soviet active measures.26 Western governments did not similarly use disinformation against their own populations in the Cold War. There is no evidence that U.S. intelligence ever orchestrated a public health disinformation campaign as reckless and dangerous as the KGB’s campaign with regard to AIDS, discussed below. It would therefore be erroneous to draw an equivalence between East and West disinformation efforts during the Cold War.

Soviet intelligence used relative freedoms in Western societies — including freedom of speech, press, and association, and freedom to protest — against them, with the strategic aim of polluting public opinion and undermining democratic decision-making.27 Generally, the Soviet disinformation strategy against Western countries was not to create a false story — a lie — but instead to amplify existing grievances.28 A former senior Soviet Bloc disinformation officer and defector to the United States, Ladislav Bittman, explained that his strategy was never to create a “big lie” out of nothing, because it was unlikely to be believable. Instead, he collected intelligence about existing fractures in targeted Western societies, and then used disinformation to amplify them.29 In that respect, the best Soviet disinformation involved self-deception in Western societies, “playing upon the audience’s political and cultural biases, sending messages it wants to hear,” as one CIA analysis put it.30 Bittman recalled that he never tried to influence people with extreme political views, those whose beliefs were unshakable — either the “true believers” or the “atheists.” Instead, he targeted “the agnostic middle, whose beliefs could be swayed, and driven to extremes.” Other Soviet Bloc disinformation officers were less orthodox about grounding disinformation in facts. Hungarian defector Laszlo Szabo, who testified in the U.S. Congress about his work for the Hungarian intelligence service, the AVH, recalled his instructions when sent to its London residency in 1965:

He [the head of AVH disinformation] told me they preferred that an item for disinformation should have some real basis, be based on facts, but if I can produce a good idea that does not have any fact send it in any way. Truth is not important if the idea is good. Just send it in. They will make it look truthful, then get it published in some little paper somewhere. After that we Hungarians will hand it out, get it republished everywhere. Who can prove it is not true?31

Soviet active measures remain highly sensitive secrets in Putin’s Russia. The Russian Federation’s six-volume official history of foreign intelligence mentions them briefly — and misleadingly.32 However, secrets stolen from KGB archives and accounts from defectors reveal that Soviet disinformation reached unprecedented levels under Yuri Andropov, the longest-serving KGB chairman (1967 to 1982) and then Soviet leader (1982 to 1984). Andropov’s belief in the value of disinformation for Soviet statecraft stemmed from his experiences as Soviet ambassador to Hungary in 1956, where he observed its effective use in crushing the anti-Soviet uprising there.33 Under Andropov, KGB foreign intelligence — the First Chief Directorate — established its own department for carrying out political warfare called Service A (“A” for active measures).34 A 1972 top-secret KGB dictionary defined “disinformation data,” produced by Service A, as “especially prepared data, used for the creation, in the mind of the enemy, of incorrect or imaginary pictures of reality, on the basis of which the enemy would make decisions beneficial” to the Soviet Union.35 KGB officers stationed in residencies (rezidentura) overseas were best placed to suggest subjects for disinformation, which KGB Moscow headquarters (“the Center”), the Soviet Communist Party’s International Department, and the Central Committee’s Propaganda Department then approved.36

According to Bittman, the KGB was a “forgery factory,” producing bogus documents for planting in obscure publications in the hope that they would be picked up by witting and unwitting agents in Western media — “useful idiots,” in the KGB lexicon.37 The Kremlin used a constellation of front groups in the West, like the World Peace Council, to disseminate its disinformation.38 As an indication of the importance that Moscow attached to disinformation and other Soviet active measures, consider that, in the 1980s, it directed KGB political intelligence officers stationed overseas to spend about a quarter of their time on such activities. Between 1975 and 1985, Service A expanded from about 50 to 80 officers at the First Chief Directorate headquarters at Yasenevo, near Moscow, but this was only a tiny proportion of all Soviet officials devoted to active measures.39 In April 1982, Andropov decreed that it was the duty of all Soviet foreign intelligence officers, whatever their “line,” or department, to participate in active measures like disinformation. As a result, the entire KGB First Chief Directorate — approximately 15,000 officers in the 1980s — was engaged in active measures. In addition, other branches of the Soviet government, like the Soviet Communist Party’s International Department, the Central Committee’s Propaganda Department, and Soviet state media outlets like Novosti and Tass, also undertook active measures.40 In 1980, a conservative U.S. estimate put the cost of Soviet active measures at $3 billion.41 By contrast, the U.S. government’s body for countering Soviet disinformation, the Active Measures Working Group, had approximately 20 officials, who were drawn from various other departments. The State Department’s own Office of Active Measures Analysis and Response had just four to five full-time professional staff.42

Disinformation was one of the Soviet Union’s many active measures, which ranged from overt propaganda to covert acts of physical violence, up to and including assassinations. All of these measures were designed to influence world affairs in Moscow’s favor.43 For the Kremlin, active measures like disinformation were not a sideshow but were integral to Soviet foreign policy. They were the clandestine corollary to overt Soviet diplomacy.44 Sitting in the middle of the spectrum of Soviet political warfare, active measures were influence campaigns, which were separate from, but linked to, disinformation.45 An illustration of the relationship between disinformation and Soviet agents of influence was the KGB’s recruitment, in 1958, of a New York law student, Richard Flink. The KGB blackmailed him and then offered to finance his campaign as a Republican candidate for the New York state assembly in the hope of influencing U.S. politics and using him to disseminate Soviet disinformation. Flink, however, had reported his attempted KGB recruitment to the FBI at the outset.46

Soviet Disinformation Exploiting Domestic U.S. Race Relations

From the 1950s onwards, exploiting racial injustice in America became a stock in trade of Soviet disinformation. According to Bittman, Moscow instructed Eastern Bloc disinformation departments to make race in the United States a “top priority.” Amid racial protests and the civil rights movement, opportunities to inflame and amplify U.S. racial tensions abounded. In August 1967, the Center approved an operation by the deputy head of Service A, Yuri Modin, to discredit the U.S. government on “the Negro issue.” It authorized Modin to organize KGB residencies in the United States to fabricate and distribute leaflets denouncing the U.S. government’s brutal suppression of “the Negro rights movement,” and to forge documents claiming that white supremacist groups were planning to assassinate leading members of the civil rights movement, like Martin Luther King Jr.47 The KGB’s campaign to inflame U.S. race relations is described by a senior KGB officer stationed in the United States, Oleg Kalugin, who worked undercover as a Radio Moscow correspondent in New York and Washington in the 1960s and early 1970s: “Our active measures campaign did not discriminate on the basis of race, creed, or color: we went after everybody.” Kalugin befriended an editor of a black activist newspaper in New York and went on several trips with him to Harlem. “I knew our propaganda was exaggerating the extent of racism in America,” Kalugin later recalled, “yet I also saw first-hand the blatant discrimination against blacks.”48

Soviet active measures involving U.S. race relations during the civil-rights era took two forms: commissioning hate crimes and forging race-based hate letters. Both had the same strategic aim: to divide American society. Following violent riots in the summer of 1965 in a predominantly black district of Los Angeles, Andropov personally approved KGB active measures to exploit racial tensions in black communities in other major U.S. cities. Taking inspiration from a similar KGB operation in West Germany,49 its New York residency paid American agents to paint swastikas on synagogues in New York and desecrate Jewish cemeteries and then make anonymous phone calls to the police claiming they were the work of black activist groups.50 Soviet intelligence archives reveal that, on at least one occasion, the KGB ordered the use of an explosive to exacerbate racial tensions in New York. In July 1971, the Center instructed its New York residency to conduct an operation codenamed PANDORA: They were to plant a delayed-action explosive “in the Negro section of New York,” preferably targeting “one of the Negro colleges.” The Center’s instructions were that, after the explosion, the residency was to make anonymous calls to two or three black organizations claiming that the bomb had been planted by the hardline Jewish Defense League. Available evidence does not reveal PANDORA’s outcome. It is impossible to determine which, if any, of the attacks on black organizations that were blamed on the Jewish Defense League were, in fact, the KGB’s work.51

KGB forgeries designed to exacerbate U.S. racial tensions were also common. According to Kalugin, one of the KGB’s active measures in America “involved a nasty letter writing campaign against African diplomats at the United Nations.” The operation, conceived by the Center and approved by the Soviet Communist Party’s Central Committee itself, was carried out in November 1960. Residency officers like Kalugin typed hundreds of such anonymous letters and sent them to African missions at the United Nations. The letters, purportedly from white supremacists and “normal” Americans, contained virulent racist diatribes. The African diplomats publicized some of the letters as examples that racism was rampant in the United States.52 Kalugin, who covered the United Nations for Radio Moscow, dutifully broadcast about this letter-writing campaign to discredit the United States, which he, but not his listeners, knew was his own handiwork. As Kalugin later recalled, he had no qualms about “stirring up as much trouble as possible for the US government” because it was “all part of the job.” As he put it: “I lost no sleep over such dirty tricks, figuring they were just another weapon in the cold war.”53

Case Study I: The KGB Targets the 1984 Olympic Games in Los Angeles

A revealing example of Service A’s efforts to exacerbate racial tensions in the United States to influence a specific outcome occurred at the 1984 Los Angeles Olympic Games. In the first week of July, the KGB’s Washington residency mailed letters purportedly from the Ku Klux Klan to the Olympic committees of 20 African and Asian countries. The letters stated that the games were “for whites only” and threatened that athletes from these countries would be lynched or shot if they attended the games. Moscow’s disinformation strategy was to disrupt the Olympics, bolster support for its boycott of the games in retaliation for President Jimmy Carter’s boycott of the Moscow games four years earlier, and discredit the Reagan administration by claiming it was unable to guarantee the safety of athletes.54

The strategy was unsuccessful: None of the Olympic athletes who received the Soviet death threats withdrew from the games. As the games ended in August 1984, U.S. Attorney General William French Smith publicly denounced the letters as a major Soviet disinformation effort.55 Moscow predictably reacted with indignation at Washington’s anti-Soviet “slanders.”

The U.S. intelligence community detected the true origin of the letters through forensic analysis and intelligence collection. Eleven days before the games began, the CIA’s Office of Soviet Analysis produced an assessment for senior officials in the State Department, Department of Defense, and National Security Council that explained why it believed the letters were Soviet forgeries. First, the letters, postmarked within a 30-minute drive of Washington, contained grammar and syntax errors suggesting that they were written by native speakers of a Slavic language. Among the errors was a hyphen placed between the first two words of “Ku Klux Klan,” a construction not used by the Klan itself, and one that notably disappeared in subsequent Soviet media reporting about the letters. Second, Soviet media picked up the story suspiciously quickly. As the CIA assessment noted, “The typical Soviet active measures modus operandi is to disseminate disinformation abroad and subsequently replay it in the Soviet press. According to Soviet calculations, this replaying enhances the credibility of both the original disinformation and any later versions.” Third, the FBI was known to have kept the Klan under tight surveillance, through the use of human agents and technical collection, like telephone taps. According to the CIA, the FBI knew that the Klan had never targeted Asians nor addressed the issue of Third World countries participating in the Olympics.56

The U.S. intelligence community was also able to detect the KGB origins of the letters by using sources inside Soviet intelligence. The FBI and CIA are known to have run three sources inside Soviet intelligence at the time: Sergei Motorin, Valery Martynov, and Boris Yuzhin.57 Although a lack of publicly available documentation makes it impossible to prove conclusively, one of these agents likely provided U.S. intelligence with information about the letters. Martynov is the most likely candidate. He is known to have worked in the KGB’s Washington residency at the time. Codenamed GENTILE by the CIA and PIMENTA by the FBI, he met his agency and bureau handlers twice a month, usually in safe houses in Virginia suburbs, where they paid him between $2,000 and $4,000 per month. The FBI and CIA helped Martynov’s KGB career progress by feeding disinformation of their own to the KGB to make him appear successful. According to the KGB’s Washington resident at the time, Victor Cherkashin, “Martynov gave the FBI a running commentary on the goings on in the rezidentura, including operations and targets, instructions from the centre and rumours from Yasenevo [KGB foreign intelligence headquarters, near Moscow].”58 A year after the Olympics, all three U.S. agents were betrayed by a high-level Soviet agent inside the CIA, Aldrich Ames. Motorin and Martynov were recalled to Moscow and executed, while Yuzhin defected safely to the United States.59 U.S. authorities shared background notes on the forgeries with partners, like the United Kingdom, and international media outlets.60

Soviet Disinformation: U.S. Bioweapons

Another strain of Soviet disinformation, beginning in the 1950s, sought to discredit the U.S. government by alleging that it was developing biological weapons capable of causing pandemics. After the Korean War, the KGB disseminated disinformation that the U.S. military had unleashed germ warfare in China and North Korea. At that time, the U.S. government had a biological weapons research program based at Fort Detrick, Maryland — the same facility currently at the forefront of pandemic biosecurity. The Truman administration considered deploying nuclear weapons during the Korean War, but Soviet claims that it had used biological weapons in Korea were Kremlin disinformation, as Soviet records opened after the Cold War revealed. The U.S. government’s offensive biological research program was closed down by President Richard Nixon, who outlawed such research and signed the Biological Weapons Convention in 1972, which banned the development, production, and stockpiling of biological weapons of mass destruction.61 In reality, Soviet disinformation about U.S. bioweapons proved to be mere projection: In violation of the 1972 convention, the Soviet Union — not the United States — was secretly operating the world’s largest bioweapons program.62 Nevertheless, Fort Detrick offered a perfect opportunity for the Soviet Union to use disinformation to distract and discredit the U.S. government by suggesting that America’s bioweapons program secretly continued after Nixon closed it, a claim that, more recently, the North Korean regime has repeated.63

Case Study II: KGB Operation INFEKTION

In May 1983, Soviet intelligence again disseminated disinformation that Washington was waging chemical warfare in Southeast Asia, the so-called “yellow rain.”64 Another Soviet disinformation trope was that the U.S. military was developing an “ethnic bomb” capable of killing only non-white populations.65 However, Soviet disinformation about American bioweapons reached an entirely new level when a novel pathogen spread across the world in the 1980s: AIDS. In operation INFEKTION, also known as DENVER, the KGB spread disinformation that AIDS had been manufactured by the U.S. military at Fort Detrick — the same conspiracy theory that reappeared during the COVID-19 pandemic.66 The KGB’s strategic aim in doing so was to discredit the U.S. government domestically and internationally, cause Americans to lose faith in public health, and destroy Washington’s ability to claim the moral high ground when it came to Soviet human rights abuses.

Within the KGB, the recycling and repackaging of disinformation was known as creating an “echo effect.”

Andropov’s rise to power in the Kremlin coincided with the outbreak of AIDS, a novel disease first detected in the United States in 1981 that then grew into a pandemic. It offered the KGB an unprecedented opportunity to make allegations about America’s secret use of bioweapons. Service A laid the groundwork in July 1983, when it manufactured a front-page article in an obscure Indian newspaper called the Patriot, titled, “AIDS May Invade India: Mystery Disease Caused by U.S. Experiments.” The story cited a letter from an anonymous but “well-known American scientist and anthropologist” stating that AIDS was a bioweapon created by the Department of Defense. “Now that these menacing experiments seem to have gone out of control,” the story read, “plans are being hatched to hastily transfer them from the U.S. to other countries, primarily developing nations where governments are pliable to Washington’s pressure and persuasion.” In reality, Service A had written the letter, and the Patriot was itself a Soviet front.67

Eighteen months later, as the AIDS pandemic continued to intensify, the mainstream Soviet press picked up the Patriot’s story, reporting it as fact — and carefully omitting that it arose from a letter to a Soviet mouthpiece. Within the KGB, the recycling and repackaging of disinformation was known as creating an “echo effect.” As a U.S. official and expert on Soviet disinformation later stated, it was like a “pin ball game”: “A fake story ran in country A and then was picked up as a legitimate story in country B and C.”68 Service A organized pseudoscientific support for the story by recruiting a Russian-born, East German retired biophysicist, Professor Jacob Segal, who produced a 52-page booklet claiming that the U.S. military had created AIDS by artificially synthesizing two natural viruses, VISNA and HTLV-1, which had then “escaped” from Fort Detrick:

It is very easy using genetic technologies to unite two parts of completely independent viruses … but who would be interested in doing this? The military, of course … In 1977 a special top security lab … was set up … at the Pentagon’s central biological laboratory. One year after that … the first cases of AIDS occurred in the US, in New York City. How it occurred precisely at this moment and how the virus managed to get out of the secret, hush-hush laboratory is quite easy to understand. Everyone knows that prisoners are used for military experiments in the U.S. They are promised their freedom if they come out of the experiment alive.69

The KGB’s disinformation campaign alleging that the U.S. military had created AIDS through human experimentation fed on distrust among African Americans toward public health officials. In the “Tuskegee Study of Untreated Syphilis in the Negro Male,” exposed in 1972, the U.S. Public Health Service had experimented on 600 black men, who had been recruited on the promise of receiving free medical care, observing the course of untreated syphilis rather than treating it.70 The KGB’s AIDS disinformation efforts exploited these concerns, highlighting the prevalence of AIDS in Africa. In mid-1986, the “Segal Report” took off, becoming what may legitimately be called fake news in many Third World countries, particularly in Africa, where similar “proof” of the virus’ American origins was published in letters to newspapers that were likely Service A products disseminated by Soviet front groups.71 In October 1986, the conservative-leaning British tabloid Sunday Express made the KGB narrative its main front-page story. After that, it was picked up by an international news wire. By 1987, the story had received coverage in major media outlets in 80 countries and in 30 languages.72

U.S. Efforts to Counter Soviet Disinformation

The U.S. government’s efforts to combat the Soviet disinformation campaigns that targeted the 1984 Los Angeles Olympics and the AIDS pandemic reveal three principles for countering disinformation: first, the necessity of having a strategy; second, the value of intelligence collection; and third, the benefit of cooperation between government and non-government bodies and international partners.

Strategy

The U.S. government’s experience debunking Soviet disinformation shows the value of having a coherent, interagency strategy led by a single body that takes ownership of the counter-effort. Washington only developed such a strategy toward the end of the Cold War — and that strategy remains relevant to countering disinformation today.

In the early days of the Cold War, Washington’s response to Soviet disinformation was ad hoc. In 1961, the CIA’s Richard Helms, later the agency’s director, testified publicly before Congress to expose 32 Soviet counterfeits that the agency had identified over the previous four years.73 The U.S. government’s public response to Soviet disinformation slowed in the 1970s amid U.S.-Soviet détente and public criticism of the U.S. intelligence community following the Church Committee’s exposure of its abuses in 1975. However, Soviet intelligence archives reveal that, under Andropov, KGB active measures against the “Main Adversary” continued unabated during the period of détente.74 The defection of a KGB active measures officer, Stanislav Levchenko, to the United States in 1979 alerted U.S. authorities to these activities. Levchenko reported that the Soviet embassy in Tokyo, from which he had defected, housed approximately 50 KGB officers, five of whom were devoted full time to active measures and ran a network of 25 agents in Japan.75

The Soviet invasion of Afghanistan in 1979 and the election of President Ronald Reagan, an outspoken critic of the Soviet Union, brought a reinvigoration of KGB disinformation targeting America. After Soviet efforts to discredit Carter and interfere in Reagan’s election in 1980, the U.S. government created a public policy strategy for countering Soviet disinformation for the first time. In 1981, the Reagan administration established a dedicated interagency body to neutralize Soviet disinformation: the Active Measures Working Group. Part of the Reagan administration’s broader effort to counter Soviet propaganda — PROJECT TRUTH76 — the working group’s mission was to be the U.S. government’s disinformation response unit: to identify and provide timely responses to Soviet disinformation by publicly exposing it to domestic and foreign audiences.77

The Reagan administration accompanied its face-to-face confrontations about Soviet disinformation with a threat of sanctions, informing the Kremlin that U.S.-Soviet AIDS research, which Soviet scientists needed to address AIDS in their country, would be closed down unless the disinformation stopped.

The working group was led at the time by the State Department (later it would be led by the U.S. Information Agency) and comprised CIA, FBI, Department of Defense, and National Security Council officers.78 To disseminate its counter-messages, it worked closely with the Voice of America. Its methodology for debunking Soviet disinformation was threefold: Report, Analyze (or Attribute), and Publish, also known as “RAP.” As Lawrence Eagleburger, undersecretary of state for political affairs and a key supporter of the Active Measures Working Group, put it: “[A]ctive measures need to be countered by public exposure. They are infections that thrive only in darkness, and sunlight is the best antiseptic.”79 The group’s first head, Dennis Kux, a career U.S. foreign service officer, established its working methodology: that its efforts had to be based on facts, not broad ideological arguments. The group maintained high evidentiary standards in order to prove its case before the court of public opinion beyond reasonable doubt, as in a criminal trial. As Kux recalled, “The fact that we made a credible presentation — not an ideological show — lent a certain amount of professionalism to the whole effort.” He added, “[P]eople don’t like to be duped. Not only were we telling them that they were being duped but we told them how.”80 One of the group’s later heads, Kathleen Bailey, recalled that the group frequently knew that a false story was the Kremlin’s work but could not prove it sufficiently, and so refrained from trying to do so.81

The group’s strategy was successful at neutralizing Service A’s racist death threats at the Los Angeles Olympics. According to an FBI special agent who sat on the group, James Milburn, “[W]e really came together on the KKK forgeries in May of 1984. The group got a lot of good exposure in the national news stories. It was one of the few times what I did there was in the news.”82

The group was also successful in changing Soviet policy on AIDS disinformation. After reporting on and analyzing the KGB’s AIDS conspiracy theory, in the spring of 1987 the working group published a report about it and publicly attributed it to the Kremlin. In October 1987, U.S. Secretary of State George Shultz had what he would later call a “sour and aggressive” meeting with Soviet leader Mikhail Gorbachev, in which Gorbachev produced the group’s report and angrily said it went against the spirit of glasnost. Shultz replied that when the Soviet Union stopped lying, Washington would stop exposing its lies. Two months later, during a meeting between Gorbachev and Reagan in Washington, Gorbachev pulled aside the head of the U.S. Information Agency (and member of the working group), Charles Wick, and said, “No more lying. No more disinformation; I don’t want politicians and bureaucrats creating all of these tensions anymore, disinformation and all that. It’s going to be a new day.”83 The Reagan administration accompanied its face-to-face confrontations over Soviet disinformation with a threat of sanctions, informing the Kremlin that U.S.-Soviet AIDS research, which Soviet scientists needed to address AIDS in their own country, would be closed down unless the disinformation stopped. As one of the group’s senior officials, Herbert Romerstein, stated, “The effect was as if a faucet was turned off. Suddenly the stories practically disappeared.”84 Soon after, the Kremlin officially disowned the AIDS story.85 After the Soviet Union’s collapse, Yevgeny Primakov, the head of Russia’s new foreign intelligence service, the SVR, admitted that the AIDS story had been a Service A fabrication.86 Recently opened Eastern Bloc intelligence records have confirmed this.87

Intelligence Collection

The second principle from later Cold War U.S. efforts against Soviet disinformation is the value of intelligence collection. As Kux put it, “[Y]ou can’t do anything about disinformation without good information.”88 As noted above, the U.S. intelligence community detected the KGB death threats prior to the Olympics through forensic analysis, and also, it seems, from human sources inside Soviet intelligence. Published literature has not hitherto revealed how the U.S. intelligence community detected the KGB’s AIDS disinformation efforts. However, according to this author’s exclusive interviews, it did so through the open-source collection of Soviet broadcast and print media. The CIA’s Foreign Broadcast Information Service, which collected Soviet media, monitored the Patriot in India and then observed the mainstream Soviet press picking up its AIDS story and amplifying it, indicating that the Soviet disinformation strategy was at work.89 The work of the Foreign Broadcast Information Service, and its trans-Atlantic partner, the BBC Monitoring Service, during the Cold War reveals the value of using open-source intelligence in detecting foreign state threats, including disinformation.90

International Cooperation

A third principle that emerged from U.S. efforts to counter Soviet disinformation about the Los Angeles Olympics and the AIDS pandemic is the value of international cooperation. The fact that Soviet disinformation was transnational, spreading across Western countries and the developing world, required the response to be international. British Foreign Office records — only declassified in 2018 — reveal that the Active Measures Working Group worked closely with international partners to detect and expose Soviet disinformation. The group liaised with Britain’s Foreign Office, as well as NATO countries, to collect intelligence about recent developments in Soviet disinformation efforts, like the disinformation campaigns about the Olympics and the AIDS virus. The group had meetings with senior Foreign Office officials in which they shared their discoveries about both disinformation campaigns. The British officials, in turn, agreed to take steps to counter both conspiracy theories by alerting journalists and academics about them.91 Beginning in the spring of 1983, the group launched “truth squads” — road shows to visit and brief NATO countries — and had an annual meeting at its headquarters to share intelligence about disinformation.92

Applying History From the Cold War to the Digital Age

The conclusion of the Cold War did not mark the end of the KGB’s disinformation strategy. The post-Soviet Russian state continued to use the intelligence agency’s former playbook, updating and innovating KGB disinformation for the digital age. Russia’s foreign intelligence service, the SVR, has, from its establishment in 1991, proudly embraced its KGB past — and continues to do so today, regarding itself as the heir to its traditions. Continuity between Soviet and Russian intelligence centers on Putin himself, a former KGB officer. He graduated in 1985 from the KGB’s training school, the Andropov Institute, where recruits were trained in disinformation and other active measures.93 Before rising to power in Russia at the end of the 1990s, Putin led the country’s domestic security service, the FSB, and under his leadership veterans of the Soviet intelligence services gained a special place in Russian security and politics. These are the siloviki, “men of force”: people in Putin’s Russia with backgrounds in the intelligence services and the military. The continuity between Soviet and Russian disinformation was apparent in the revealing, but overlooked, case of Sergei Tretyakov, a former KGB officer who, in the mid-1990s, became SVR head of station in New York. After defecting to the United States in 1999, Tretyakov disclosed that his work for the SVR had involved visiting the New York Public Library to use public computer terminals to disseminate disinformation about Russia to U.S. media outlets and politicians.94 His efforts directly foreshadowed Russia’s online disinformation campaigns less than two decades later, during the 2016 U.S. election.

Russia’s foreign intelligence services, the SVR and GRU, have used disinformation to target the same issues within the United States as their Soviet predecessors: exploiting U.S. race relations and alleging that disease outbreaks are the result of U.S. bioweapons. The U.S. Senate’s bipartisan investigation into Russian interference in the 2016 election revealed that no subject was targeted more by Russian intelligence disinformation than U.S. race relations, with Russian online troll farms, like the Internet Research Agency, creating bogus Facebook and Twitter accounts purportedly belonging to Black Lives Matter to inflame racial tensions, discredit Clinton’s candidacy, and promote Trump’s.95 Russia’s active measures campaign in 2016 involved cyber espionage: hacking into the emails of the Democratic National Committee and then leaking them in an influence operation to discredit Clinton. The tradecraft that Russian intelligence used, masquerading as genuine American people and political groups, is known as “astroturfing.” Race was a key theme. Echoing the KGB disinformation described above, one Internet Research Agency troll described his work in 2015: “When there were black people rioting in the U.S. we had to write that U.S. policy on the black community had failed, Obama’s administration couldn’t cope with the problem, the situation is getting tenser. The negroes are rising up.”96 Russian disinformation similarly targeted U.S. race relations during the protests against racism and police violence that erupted in the United States in 2020.97

Amplifying domestic American anti-vaccine movements, they have also disseminated online conspiracy theories that wearing facemasks is harmful to health and ineffective for combating COVID-19.

Russian intelligence redeployed the KGB’s old conspiracy theory about the U.S. government’s culpability in disease outbreaks during the Ebola outbreak in West Africa in 2014 and then again during the COVID-19 pandemic.98 During the Cold War, the most successful targets of Soviet active measures were Third World countries, where KGB disinformation exploited existing anti-American sentiments.99 Echoing how Soviet intelligence targeted African countries with AIDS disinformation, Russian outlets have promoted bogus remedies for the novel coronavirus in Africa, with the intention of discrediting the United States. In the KGB tradition, Kremlin online actors have disseminated disinformation that Washington is seeking to test COVID-19 vaccines on prisoners, immigrants, and black Americans.100 Amplifying domestic American anti-vaccine movements, they have also disseminated online conspiracy theories that wearing facemasks is harmful to health and ineffective for combating COVID-19.101

Continuity between Soviet and Russian disinformation raises two applied history questions for analysis: First, has today’s digital information ecosystem changed the threat that disinformation poses to Western democracies compared to during the Cold War, or is it a case of plus ça change, plus c’est la même chose? Second, are the U.S. policies that were used to counter Cold War disinformation applicable today, in an age when digital bytes have replaced analog broadsheets?

Has the Digital Revolution Changed the Threat of Disinformation?

Before assessing whether the digital revolution has changed the threat of disinformation, it is first necessary to understand the threat that Soviet disinformation posed to U.S. national security during the Cold War. Contrary to claims made by KGB officers, often intended to exaggerate their successes to superiors in Moscow, Soviet disinformation never actually posed a strategic threat to the United States during the Cold War. This is confirmed by intelligence archives from both sides of the conflict. As Soviet intelligence officers later explained, KGB active measures never managed to be more than a tactical nuisance for the U.S. government.102 The CIA came to similar conclusions in 1985, when two of its senior officers, Robert Gates, future director of central intelligence and secretary of defense, and John McLaughlin, later acting director of central intelligence, testified before Congress about Soviet active measures. Responding to questions from then-Sen. Joe Biden, Gates stated that, in general, Soviet active measures were an annoyance that sapped U.S. government resources but did not pose a Cold War strategic threat. However, in certain circumstances, Gates said, they could provide the Soviet Union with a competitive edge: “[I]n a close election or legislative battle, they could spell all the difference.”103

After the Cold War, however, the threat was perceived to have diminished. This led the U.S. government, and its closest intelligence partner, the United Kingdom, to lose decades of institutional knowledge about Soviet disinformation. To achieve budget cuts — the “peace dividend” — McLaughlin slashed staff in the CIA’s Office of Slavic and Eurasian Analysis by 42 percent in three years.104 For experts on Soviet Russia, the post-Cold War “end of history” was the end of many careers.105 The British intelligence community underwent similar staffing reductions. At the height of the Cold War, 70 percent of Britain’s signals intelligence service, GCHQ, was focused on the Soviet Union and the Eastern Bloc. By 2000, its efforts on hostile state activity, including Russia, had decreased to 14 percent, and by 2006 they had fallen to just 4 percent. Its resources were overwhelmingly focused on counterterrorism.106 The experience of Soviet Bloc defector Bittman illustrates the historical amnesia that befell U.S. public policy regarding Soviet active measures. The American university where he taught a course on disinformation, Boston University, grew steadily less interested in the topic, viewing it as a relic of the Cold War past. Bittman retired in 1996 and the course was not continued.107 By contrast, for Putin, who came to power in Russia at the end of the 1990s, Cold War strategies remained alive. Consequently, while Russia deployed and updated Cold War tradecraft, the U.S. government dangerously failed to apply its own history. Like a person who has lost his or her memory, the U.S. government has had to rediscover its past experiences with disinformation. This historical amnesia has been made worse by political leaders on both sides of the Atlantic who have willfully neglected to task their intelligence services with investigating Russian interference in elections because they feared that doing so would de-legitimize their electoral victories.108

The digital era has transformed the potential for hostile states to use disinformation to “spell all the difference,” as Gates warned active measures could do. Leveraging digital tools, Russia’s intelligence services have spread disinformation more effectively than their Soviet predecessors. Today’s interconnected digital world makes it quicker, cheaper, and easier than ever before to use disinformation as a strategic weapon to deceive, confuse, and undermine democracies. During the Cold War, doing so was a slow, laborious, and complex process for Soviet intelligence, usually involving forged documents, as in the Olympic Games death threat letters and the AIDS disinformation campaign. While the KGB had to plant stories and use physical front groups and agents to propagate disinformation, today all that states like Russia need are social media accounts and online operatives (i.e., “trolls”).

Cyber interconnectivity has not only changed how states disseminate disinformation, but also how it is consumed, as individuals increasingly look online for information from news sites, search engines, and social media. Social media companies have so thoroughly disrupted mainstream news media that about half of adults in the United States now often get their news from social media rather than from traditional outlets.109 Traditionally, news outlets played a major role in publicly countering disinformation by fact-checking reporting before publication as part of their standard editorial process. Research shows that a consequence of inhabiting an online, algorithmically driven information environment is an increase in tribalism and confirmation bias — listening to and recycling information that conforms to one’s existing views and enhances one’s sense of belonging.110 According to Bittman, who devoted his life to studying disinformation after he defected to America, the advent of social media was a godsend for anyone in his former occupation — a “professional manipulator of public opinions,” as he described himself.111 To put things in perspective, a false story disseminated online is likely to reach 1,500 people six times more quickly than a factual story.112

To answer the first applied history question posed above, then: Evidence suggests that digital interconnectivity has increased the strategic threat of state disinformation seeking to cast doubt on truth, facts, and science. There have always been those who are willing to believe in conspiracy theories about events like disease outbreaks. A lack of data makes it impossible to determine whether contemporary open democratic societies believe in conspiracies more than in the past. However, it is clear that the digital revolution was a watershed moment in the dissemination of disinformation. There is compelling evidence that this has caused a public erosion of truth, facts, and science.113 The national security threat that contemporary digital disinformation poses is measured in its corrosive effects on democracies. Social media has provided megaphones to people who reject fundamental aspects of post-Enlightenment thinking: facts, rational thought, truth, falsifiable theories, and science itself. The more that American society is pulled into a post-fact, post-truth virtual reality, the more the Kremlin considers itself to be strategically winning.114

Understanding the history of successful past U.S. policies to counter disinformation is necessary, but not sufficient, for democracies to counter disinformation today.

It is also impossible to determine whether social media has increased the ability of disinformation to influence specific outcomes, succeeding where the KGB failed with its 1984 Olympic Games disinformation efforts. Scholars are divided over the impact that Russian disinformation had on the 2016 U.S. presidential election.115 Facebook believes that 126 million Americans may have viewed Russian-backed disinformation on its site before that election (Trump won the Electoral College on the strength of approximately 80,000 votes across key swing states).116 However, without data about the impact of disinformation on voters, which would need to be obtained from polling, it is impossible to determine whether disinformation decided the outcome of the election. There is no evidence, at least none that is publicly available, that such polling was undertaken — although Russia’s intelligence services may have done so, to judge the effectiveness of their 2016 influence campaign.

The U.S. government came to similar conclusions when trying to measure the effects of Soviet disinformation during the Cold War. One of the Active Measures Working Group’s key supporters, Eagleburger, noted that because Moscow did not use active measures independently, but rather in coordination with overt parts of Soviet foreign policy, it was impossible to conduct controlled experiments separating them from those other variables. The fact that the Kremlin’s active measures targeted existing fault lines in a given society made it even more difficult to isolate their effects. However, as Eagleburger noted, the Kremlin evidently believed that its disinformation was worth the effort, given the resources it devoted to it.117 The same can be said today.

Do Cold War Countermeasures Apply in the Digital Era?

The second applied history question is whether U.S. policies devised to counter Soviet disinformation are still applicable in the current digital information landscape. Using May and Neustadt’s suggested methodology, an applied history analytic approach that studies similarities and differences, the answer to that question is “yes” — up to a point. Understanding the history of successful past U.S. policies to counter disinformation is necessary, but not sufficient, for democracies to counter disinformation today. Truth and facts exist and still matter in open states. The methodology that the U.S. working group devised during the Cold War is still applicable to online environments: reporting (detecting) disinformation, attributing it, and publishing (exposing) it. The British government, for example, created a toolkit to counter online disinformation in 2015 called RESIST, later launching RESIST 2 to counter COVID-19 disinformation.118 In essence, RESIST follows the same principles as the Active Measures Working Group: instead of Report, Analyze, Publish, it advocates “Recognize,” “Early Warning,” “Situational Insight,” “Impact Analysis,” “Strategic Communication,” and “Track Outcomes.”

One of the key lessons that still applies from the U.S. government’s Cold War efforts to counter disinformation through bodies like the Active Measures Working Group is the importance of having a unified strategy. In practice, that means having someone, or something, take ownership of the effort, as discussed above. At present, Western democracies lack a strategy to counter disinformation. Their public-policy responses are fragmented, at best. A 2020 report by the U.K. Parliament’s Intelligence and Security Committee, for example, shows that countering Russian disinformation is such a “hot potato” for the British government that no single department or agency wants responsibility for it.119

This raises a deeper question: Should governments or technology giants like Facebook (Meta), Google, and Twitter, whose platforms are used to deliver disinformation, be responsible for countering it? Any strategy will obviously need to include these companies. After all, they have unique, proprietary technical capabilities that allow them to detect disinformation being spread on their platforms. Since Russia’s interference in the 2016 U.S. election, each company has improved its efforts to detect and expose disinformation, but such efforts are still insufficient, amounting to a game of whack-a-mole.120 Efforts by individual social media companies to counter disinformation, by de-platforming accounts, removing content, or attaching provenance “health warnings” to false information, work in some circumstances, but not in others. Evidence suggests that health warnings can, counterintuitively, make false information more attractive and increase its dissemination online.121 A 2022 study by Britain’s Royal Society, for example, argued against removing content as a solution to scientific misinformation because democracies benefit from open and honest discussion of scientific claims.122 De-platforming risks the content simply moving elsewhere online, with potentially new, smaller, and even less trustworthy platforms emerging. A fundamental problem with relying on social media companies to police themselves is that their business models are driven by advertising views, which incentivizes (dis)information dissemination, i.e., “click-bait.” The U.S. legal framework in which these companies currently operate does not necessarily match the U.S. government’s public interest in reducing the dissemination and consumption of disinformation.123 The financial incentives of click-bait place a premium on outrage (irrespective of accuracy) over balanced and more accurate, but less sensational, stories. Aldous Huxley was correct when he noted, “An unexciting truth may be eclipsed by a thrilling falsehood.”124 Social media companies are necessary, but not sufficient, for safeguarding the online public sphere.

Nevertheless, someone needs to take ownership of countering disinformation. If it is not going to be the government, then it is difficult to see who else would take the helm.

The U.S. government should consider establishing a dedicated coordinating body, following the Active Measures Working Group’s example. Such a body could be modeled after the multi-agency center established after 9/11, the National Counterterrorism Center. A corresponding counter-disinformation body would be responsible for implementing America’s counter-disinformation strategy. Due to the unique collection and analytical capabilities of U.S. intelligence agencies concerning foreign state disinformation — from espionage, technical collection, and open-source intelligence — that body would work with the intelligence community (falling under the Office of the Director of National Intelligence). It would also work with other U.S. government departments, like the State Department, as well as with the executive and legislative branches. Crucially, the new center would also act as a bridge to technology firms and news media. Unlike the Active Measures Working Group, it would need to be resourced with full-time, dedicated staff. It would be essential for such a new body to be public facing: Its primary customer should be public audiences, not secret corners of the U.S. government.

Another key lesson from the U.S. government’s history of countering disinformation in the Cold War that still applies today is that effectiveness requires combining government and non-government efforts to increase the public’s digital literacy. As Eagleburger put it regarding the Active Measures Working Group, while counterintelligence was important for detecting disinformation, its antidote was a broad, persistent approach: “Sudden enthusiasms to expose their dirty tricks followed by troughs of apathy are not the answer. A reasoned and effective response must be persistent and continuing, and this is best achieved by a growing public understanding and emerging consensus of the significance of these activities.”125 As the working group recognized, “public understanding” about Soviet disinformation meant creating an informed citizenry that is alert to hostile foreign states that are manipulating facts and casting doubt on them.

The same principles apply today. In the contemporary context, however, creating an informed citizenry means promoting digital literacy — teaching the public to detect online disinformation by separating facts from fiction.126 The best strategy for promoting digital literacy in democracies is through a new public-private partnership that combines government and private sector initiatives. This will require a patient, long-term — even generational — effort. Quick fixes, like removing online content, are insufficient.127 The Royal Society’s report on online scientific misinformation is unambiguous in its recommendation: “The UK should invest in lifelong, nationwide, information literacy initiatives.”128


Today, exposing disinformation requires an appreciation of online behavioral science. Public psychology research shows, for example, that publishing factual information is more effective for countering disinformation than highlighting false information.129 The United Nations has adopted this approach in launching “Verified,” an initiative to “counter the spread of COVID-19 misinformation by sharing fact-based advice.”130 Recent scholarship has demonstrated that purveyors of disinformation use narratives to gain traction among audiences. That suggests that establishing truth-based counter narratives may be a way of fighting back against online disinformation.131 Likewise, research suggests that “pre-bunking” — preemptively refuting a story — offers a useful method of building resistance to fake news.

Here, we get to the heart of the matter about applying the history of Cold War disinformation policies today. As in the past, disinformation relies on the public’s willingness to believe objectively falsifiable information. This willingness is higher in times of social upheaval, war, and pandemics. Conspiracy theories offer people seductive ways to “explain” apparently unexplainable events and find meaning in the otherwise meaningless. This is as true today as it was in the past. Similarities and continuities between past and present conspiracy theorists should not, however, blind us to the major differences that exist today concerning technology, polarization within the U.S. government and society, and the nature of the public sphere since the Cold War’s end.

Technological advances have magnified the speed at which disinformation spreads. This means that lies can now spread quickly across the world before truth can get its boots on, to borrow a phrase attributed to Mark Twain. In the past, it was possible to gradually formulate a response to Soviet disinformation involving documentary fraud. Today, the algorithmic amplification of information and disinformation requires rapid, near real-time, responses. This means that U.S. policies devised in the pre-digital, print-press era cannot simply be grafted onto the online world. The domestic political environment in the United States has also recently undergone seismic changes, polarizing to the left and right extremes. That has itself been facilitated by social media. Polarization has occurred at break-neck speed over the past five years, first under Trump’s presidency and now under Biden’s. The U.S. Congress is as polarized as American society, divided into political tribes that increasingly view each other not as political opponents with whom they can respectfully disagree, but as “enemies.”

In addition, the advent of the digital revolution has changed the nature of the public sphere. At the time of our case studies above, the guardians of the critical public sphere who determined what was respectable news were government press departments and people like newspaper editors and television anchors. Social media has caused that relationship between government and journalists to splinter, which has in turn fueled political polarization and created a crisis of authority in journalism. One way to rehabilitate the embattled public sphere is through digital literacy. This reinforces the importance of public-private cooperation in countering disinformation. Countries like Finland and Norway have, for example, embedded digital literacy in school curriculums, which is showing promising results.132 Whether this would be applicable in the United States, which has lower levels of public trust in government, remains to be seen.133 The signs are not promising, however. In May 2022, Biden’s adviser on disinformation, Nina Jankowicz, resigned from the Department of Homeland Security after just two months, having become the sustained target of critics on the right who labeled her work an Orwellian “Department of Truth.”134

Conclusion

The KGB’s AIDS disinformation effort reveals the lasting damage that disinformation about a pandemic can cause. Despite efforts by U.S. medical experts, like Anthony Fauci, to debunk the Soviet AIDS conspiracy theory, it continued to infect public thinking long after Moscow disowned it.135 In 2005, polling showed that 50 percent of African Americans still believed that AIDS was a man-made virus.136 Denial of the causal link between HIV and AIDS, which KGB disinformation contributed to, has had horrific consequences in African countries. In South Africa, President Thabo Mbeki denied a connection between HIV and AIDS.137 By the time he left office in 2008, there had been almost 330,000 preventable HIV deaths there.138 Meanwhile, HIV-AIDS denialism may have contributed to Russia’s own terrible AIDS epidemic.139 If this example from the pre-digital era, when dissemination of disinformation was relatively controllable, is any indication, then online disinformation about the novel coronavirus is likely to cause even broader damage. Early research shows that hundreds of people have died as a result of false COVID-19 “cures” promoted on social media.140 At the time of writing, polling indicates that one quarter of U.S. adults see at least some truth in the theory that “powerful people” intentionally planned the COVID-19 outbreak.141

Today’s digital revolution constitutes the greatest shift in the history of the transmission of ideas since the development of the printing press in the 15th century, which sparked generations of social dislocation in Europe and the New World, contributing to civil wars in both places. It is too early to understand the evolving impact of social media on our societies. Thanks to the internet, never before has so much information been available to so many people. Ninety percent of the world’s data was created in the two years before 2018.142 That unprecedented and exponential data explosion has transformed the dissemination of information as well as its consumption. This makes it tempting to assume that the history of disinformation from earlier, pre-digital periods is not applicable today. However, as I have demonstrated above, such an assumption is mistaken. In fact, using the applied history methodology proposed by May and Neustadt, we can see that there is much to glean from the history of Cold War disinformation. At the same time, there are major differences relating to the technological, social, and political landscape in the United States today compared to the (relatively recent) Cold War past. This should caution us against applying history, lock, stock, and barrel, to a contemporary context.

Soviet disinformation campaigns that targeted domestic racial injustice and disease outbreak are direct precedents for disinformation efforts by Russia, and other hostile states, that target Western democracies today. Russia weaponizes disinformation for the same strategic aims as its Soviet predecessor: to discredit the U.S. government and disrupt its effective democratic functioning. By targeting divisive “wedge” issues like race and a government’s purported creation of a pandemic, a hostile foreign state can inculcate public distrust in government institutions and damage the democratic process.143 Studying the history of Soviet disinformation is important for comprehending the contemporary aims of Russian disinformation and the national security threat that it poses. It is also important for understanding the disinformation efforts of the Chinese government, which, whether consciously or not, has deployed the KGB’s old conspiracy that the U.S. government is responsible for a pandemic — in this case, COVID-19.144 The strategic aim of Russian and Chinese disinformation about COVID-19 is to discredit democracy and highlight the comparative “security” that an authoritarian society can provide. The more doubt these countries can cast on events — by questioning official COVID-19 mortality rates, proposing that a cabal of powerful satanic cannibals and pedophiles operates in the United States, or suggesting that 5G cellular networks caused the novel coronavirus outbreak — the weaker they believe Western societies will be.145


U.S. efforts to debunk the KGB’s Olympic death threat letters and its AIDS bioweapon conspiracy provide important lessons for how to counter present Russian disinformation about U.S. racial injustice and COVID-19. As in the Cold War, disinformation today essentially depends on human nature, in particular psychological tendencies to believe stories that confirm existing prejudices. This has not changed over time, despite the immense changes in the technological landscape, as Rid’s study, Active Measures, shows.146 Thus, the policies devised for detecting and countering past disinformation are still relevant. Foreign intelligence collection and assessment remain essential: Forensic analysis of forgeries and well-placed human sources can detect and attribute foreign state disinformation. Meanwhile, the experience of the U.S. government’s Active Measures Working Group shows the value of having a dedicated body to coordinate government efforts to debunk disinformation. Its strategy of reporting, assessing, and publishing was effective in the above two case studies of Soviet disinformation and remains applicable today. However, the true lesson of U.S. efforts to counter later Cold War Soviet disinformation is that it requires broader efforts that go beyond the work of a single government department. Because disinformation targets existing grievances that, by definition, usually constitute broad trends in a society, countering it has to be correspondingly broad.

The digital age, with its global interconnections, is an era in which digital information warfare — with attacks in the form of bytes rather than bombs — has become a pervasive national security concern. The U.K. Parliament’s Intelligence and Security Committee has predicted that disinformation will be the new normal for open democratic governments and societies in the 21st century.147 The advent of immersive social media technologies, like virtual reality and “deep fakes,” will increase opportunities for what we may call disruptive synthetic mind viruses. If the use of disinformation is indeed becoming the “new normal,” democracies need to understand what is needed to successfully counter it. It is unrealistic to suppose that open societies, with freedom of speech and political association, can — or should want to — eradicate disinformation. It is inherent in democratic societies that people are free to express themselves, even if that means allowing the dissemination of objectively false information. Instead of eliminating disinformation, a successful counter-strategy for democracies should be based on inoculating against its effects. We need to learn to live with it, minimize its impact, and prevent it from metastasizing, like someone living with an inoperable disease. It is also worth appreciating that online platforms are not perfect delivery tools for disinformation. Once a shot is fired over social media, it is difficult, if not impossible, to control its spread and achieve its intended aims, while minimizing unintended consequences, including retaliation by targeted states. Cyberspace is not a perfect weapon.148

In an account published in 1988 about his time in the KGB’s Service A, Levchenko provided a primer on how Western democracies should defend against Kremlin disinformation. He advised them to be informed about Soviet disinformation, understand its nature, and be vigilant against its corrosive effects in the news they consume. The most effective countermeasure, Levchenko argued, was for Western citizens to read international newspapers widely: “Train yourself to read the front pages of your newspapers and read them every day. Read news magazines.” By reading widely, Western citizens can establish facts for themselves. Levchenko continued:

Many Americans say that they are dissatisfied with the various media. They say they don’t trust the newspapers. ‘Newspapers are too liberal’, some say, while others say that ‘reporters manage the news’. Whether these charges are true is irrelevant. The press in democratic countries is there to serve the readership. That is why my best advice for the future of the free world is to read, inform yourselves, make your interpretations, and draw your own conclusions.149

Today, Levchenko’s advice applies to promoting digital literacy, which will require a broad, generational effort involving all of society to achieve, for example, by having schools teach students to distinguish facts from fake news. Separating facts from fiction also applies to the subject of disinformation itself. As Eagleburger summarized about Soviet disinformation: “Active measures are not magic, nor does the world dance to a covert Moscow tune. Moscow does not dominate the political process of the western democracies.”150 The same is true today. As in the Cold War, the grievances that Russian disinformation targets in U.S. society are American made, not foreign.


Calder Walton is assistant director of the Applied History Project at the Kennedy School of Government, Harvard University, where he is also research director at its Intelligence Project. His new book, Spies: The Epic Intelligence War Between East and West, will be published in 2023.

Acknowledgements: I would like to thank Sean Power and Nidal Morrison, research assistants at the Belfer Center’s Intelligence Project, for their help with this article.

Image: www.kremlin.ru (CC BY 4.0)

Endnotes

1 U.S. Senate Select Committee on Intelligence, Report on Russian Active Measures Campaigns and Interference in the 2016 U.S. Election, Volume 2: Russia’s Use of Social Media with Additional Views, 116th Cong., 1st sess. (2020), 6, 38, https://www.intelligence.senate.gov/sites/default/files/documents/Report_Volume2.pdf.

2 Jessica Brandt and Torey Taussig, “The Kremlin’s Disinformation Playbook Goes to Beijing,” Brookings, May 19, 2020, https://www.brookings.edu/blog/order-from-chaos/2020/05/19/the-kremlins-disinformation-playbook-goes-to-beijing/.

3 Illustrative of which is the fact that the word “disinformation” appears just twice in the National Security Strategy of the United States of America, The White House, December 2017, https://www.whitehouse.gov/wp-content/uploads/2017/12/NSS-Final-12-18-2017-0905-2.pdf, 31, 35; and just once in the National Cyber Strategy of the United States of America, The White House, September 2018, https://www.whitehouse.gov/wp-content/uploads/2018/09/National-Cyber-Strategy.pdf, 21.

4 David E. Sanger, The Perfect Weapon: War, Sabotage, and Fear in the Cyber Age (New York: Broadway Books, 2018) (The United States sees cyber tools as weapons, not part of information warfare.); Robert S. Mueller, III, Report on the Investigation Into Russian Interference in the 2016 Presidential Election, Department of Justice, March 2019, https://www.justice.gov/storage/report.pdf; U.S. Senate Select Committee on Intelligence, Report on Russian Active Measures Campaigns, vol. 1, at 4, 54–61, vol. 2 at 71–76, vol. 5 at 931–37; and Michael J. Mazarr et al., Hostile Social Manipulation: Present Realities and Emerging Trends (Santa Monica, CA: RAND Corporation, 2019), https://www.rand.org/pubs/research_reports/RR2713.html.

5 Christopher Andrew and Oleg Gordievsky, KGB: The Inside Story of Its Foreign Operations from Lenin to Gorbachev (New York: Perennial, 1991); Christopher Andrew and Vasili Mitrokhin, The Sword and the Shield: The Mitrokhin Archive and the Secret History of the KGB (New York: Basic Books, 1999). Other studies include House Permanent Select Committee on Intelligence, Soviet Active Measures, 97th Cong., 2nd sess., 1982; Richard H. Shultz and Roy Godson, Dezinformatsia: Active Measures in Soviet Strategy (Washington, DC: Pergamon-Brassey’s, 1984); Dennis Kux, “Soviet Active Measures and Disinformation: Overview and Assessment,” Parameters 15, no. 4 (1985): 19–28, https://doi.org/10.55540/0031-1723.1388; Herbert Romerstein and Stanislav Levchenko, The KGB Against the ‘Main Enemy’: How the Soviet Intelligence Service Operates Against the United States (Lexington, MA: Lexington Books, 1989); Herbert Romerstein, Soviet Active Measures and Propaganda: ‘New Thinking’ and Influence Activities in the Gorbachev Era, Mackenzie Institute for the Study of Terrorism, Revolution, and Propaganda, 1989; Herbert Romerstein, “Disinformation as a KGB Weapon in the Cold War,” Journal of Intelligence History 1, no. 1 (2001): 54–67, https://doi.org/10.1080/16161262.2001.10555046.

6 Thomas Rid, Active Measures: The Secret History of Disinformation and Political Warfare (New York: Farrar, Straus and Giroux, 2020).

7 Assessing Russian Activities and Intentions in Recent US Elections, Office of the Director of National Intelligence, Jan. 6, 2017, https://www.dni.gov/files/documents/ICA_2017_01.pdf, ii–iii, 5; Mueller, Investigation Into Russian Interference; and U.S. Senate Select Committee on Intelligence, Report on Russian Active Measures, vol. 2, 11–15.

8 Hal Brands, The Twilight Struggle: What the Cold War Teaches Us About Great-Power Rivalry Today (New Haven, CT: Yale University Press, 2022).

9 The U.S. government body that was established to counter disinformation, the Active Measures Working Group, discussed below, is not mentioned in Rid’s recent book, Active Measures, nor in the five-volume Senate report on Russia’s efforts to interfere in the 2016 election. U.S. Senate Select Committee on Intelligence, Russian Active Measures.

10 R.G. Collingwood, The Idea of History (Oxford: Clarendon Press, 1946); Marc Bloch, The Historian’s Craft (New York: Knopf, 1953); Hal Brands and Jeremi Suri, eds., The Power of the Past: History and Statecraft (Washington, DC: Brookings, 2015); Edward Hallett Carr, What Is History? (New York: Knopf, 1962); Geoffrey Barraclough, An Introduction to Contemporary History (New York: Basic Books, 1964), 1–36; Ernest R. May, “Lessons” of the Past: The Use and Misuse of History in American Foreign Policy (London: Oxford University Press, 1973); John Lewis Gaddis, The Landscape of History: How Historians Map the Past (New York: Oxford University Press, 2002); Eliot A. Cohen, “The Historical Mind and Military Strategy,” Orbis 49, no. 4 (Autumn 2005): 575–88, https://doi.org/10.1016/j.orbis.2005.07.002; Marc Trachtenberg, The Craft of International History: A Guide to Method (Princeton, NJ: Princeton University Press, 2006); Margaret MacMillan, The Uses and Abuses of History (Toronto: Viking Canada, 2008); Gordon S. Wood, The Purpose of the Past: Reflections on the Uses of History (New York: Penguin, 2008); Philip Zelikow, “The Nature of History’s Lessons,” in The Power of the Past: History and Statecraft, ed. Brands and Suri, 281–310; and Francis J. Gavin, “Thinking Historically: A Guide for Strategy and Statecraft,” War on the Rocks, Nov. 19, 2019, https://warontherocks.com/2019/11/thinking-historically-a-guide-for-strategy-and-statecraft/.

11 Richard E. Neustadt and Ernest R. May, Thinking in Time: The Uses of History for Decision Makers (New York: Free Press, 1988), xi–xxii.

12 John Bew, “United Kingdom: The Best Education,” in “The Big Question: What Lessons from History Keep Being Forgotten,” World Policy Journal 33, no. 3 (2016): 1, https://doi.org/10.1215/07402775-3712909.

13 MacMillan, The Uses and Abuses of History; Ernest R. May and Philip D. Zelikow, “Introduction,” in Dealing with Dictators: Dilemmas of U.S. Diplomacy and Intelligence Analysis, 1945-1990, ed. May and Zelikow (Cambridge, MA: MIT Press, 2007), 1–13; Gavin, “Thinking Historically”; Bew, “United Kingdom: The Best Education”; and Calder Walton, “Delivering Applied History to Policy-Makers: Ideas and Lessons,” an interview with Gill Bennett, OBE, former chief historian of Britain’s Foreign Office, April 2020, https://www.belfercenter.org/sites/default/files/Calder%20Walton%20-%20Applied%20History%20Interview%20with%20Gill%20Bennett%20Dec%202019.pdf.

14 Ion Mihai Pacepa and Ronald J. Rychlak, Disinformation: Former Spy Chief Reveals Secret Strategies for Undermining Freedom, Attacking Religion, and Promoting Terrorism (Washington, DC: WND Books, 2013); and Ladislav Bittman, The Deception Game: Czechoslovak Intelligence in Soviet Political Warfare (Syracuse, NY: Syracuse University Research Corporation, 1972).

15 E.M. Primakov, ed., Ocherki istorii rossiĭskoĭ vneshneĭ razvedki: v shesti tomakh: Vol. 2 (Moscow: Mezhdunarodnye otnoshenii︠a︡, 1996–2006), chap. 12.

16 F.H. Hinsley, British Intelligence in the Second World War (Cambridge, UK: Cambridge University Press, 1979).

17 Keith Jeffery, MI6: The History of the Secret Intelligence Service, 1909–1949 (London: Bloomsbury, 2010); David Garnett, The Secret History of PWE: The Political Warfare Executive 1939-1945 (London: St. Ermin’s Press, 2002); War Report of the Office of Strategic Services (New York: Walker and Co., 1976); and Morale Operations Field Manual – Strategic Services, Office of Strategic Services, 1943, https://www.cia.gov/readingroom/docs/CIA-RDP89-01258R000100010002-4.pdf.

18 Kaeten Mistry, The United States, Italy and the Origins of Cold War: Waging Political Warfare, 1945–1950 (New York: Cambridge University Press, 2014); and David Shimer, Rigged: America, Russia, and One Hundred Years of Covert Electoral Interference (New York: Knopf, 2020).

19 Jeffery, MI6.

20 Donald Wilber, “Overthrow of Premier Mossadeq of Iran, November 1952-August 1953,” Clandestine Service History, March 1954, Electronic Briefing Book No. 29, “The Secret CIA History of the Iran Coup, 1953,” ed. Malcolm Byrne, the National Security Archive, George Washington University, 19, 28–29, 36–37, https://nsarchive2.gwu.edu/NSAEBB/NSAEBB28/.

21 John Ranelagh, The Agency: The Rise and Decline of the CIA (New York: Simon & Schuster, 1986); Bittman, Deception Game; and Author interview with Ladislav Bittman, July 24, 2018.

22 “Soviet Influence Activities: A Report on Active Measures and Propaganda, 1986-87,” Department of State Publication 9627, August 1987, 57.

23 Author interview with former senior CIA officer, on condition of anonymity, March 17, 2020.

24 “Statement of Laszlo Szabo,” in Hearings Before the CIA Subcommittee of the Committee on Armed Services of the House of Representatives, 89th Cong., 2nd sess., March 17, 1966, 5348–49; and Andrew and Mitrokhin, Sword and the Shield, 292–321.

25 Primakov, ed., Ocherki istorii rossiĭskoĭ vneshneĭ razvedki: Vol. 5 1945–1965, 466–67.

26 Author interview with Ladislav Bittman, July 24, 2018.

27 Max Holland, “The Propagation and Power of Communist Security Services Dezinformatsiya,” International Journal of Intelligence and Counterintelligence 19, no. 1 (2006): 1–31, https://doi.org/10.1080/08850600500332342.

28 Ladislav Bittman, The KGB and Soviet Disinformation: An Insider’s View (Washington, DC: Pergamon-Brassey’s, 1985), 43–45; and Bittman, The Deception Game, 20–22.

29 Author interview with Ladislav Bittman, July 24, 2018.

30 “Some Highlights from the Airlie House Discussion on Disinformation,” Memorandum for the Director of Central Intelligence, Aug. 12, 1985, CIA Records, https://www.cia.gov/readingroom/docs/CIA-RDP89G01126R000100110022-3.pdf.

31 “Statement of Laszlo Szabo,” 5347.

32 Primakov, ed., Ocherki istorii rossiĭskoĭ vneshneĭ razvedki: Vol. 5 1945–1965. Active measures are discussed in this volume (see pages 466–68), but the subsequent volume (Vol. 6 1966–2005) gives the misleading impression that they primarily functioned before 1965 and makes just one reference to “active measures.”

33 Andrew and Mitrokhin, Sword and the Shield, 7.

34 Andrew and Gordievsky, KGB, 630–32; and Andrew and Mitrokhin, Sword and the Shield, 318–19.

35 Romerstein, “Disinformation as a KGB Weapon,” 54–67.

36 “The Structure of the Residency and control of the Residency’s work by the Headquarters,” Handwritten note by Herbert Romerstein, n.d. c. 1990, box 1091, Herbert Romerstein Collection, Hoover Institute Archives [hereafter “HIA”], Stanford University, Palo Alto, CA.

37 Author interview with Ladislav Bittman, June 24, 2018; and “KGB Disinformation using Paese Sera,” Impedian Report 222, Feb. 24, 1998, box 700, Herbert Romerstein Collection, HIA.

38 “Soviet Active Measures in the United States — An Updated Report of the FBI,” Congressional Record, Vol. 133, Part 24, Dec. 9, 1987, 100th Cong., 1st sess., 34641–50, https://www.congress.gov/bound-congressional-record/1987/12/09/extensions-of-remarks-section?p=0.

39 Andrew and Gordievsky, KGB, 628.

40 According to a statement by former CIA Director William Casey in 1985, Novosti headquarters in Moscow contained a section of 50 KGB officers who worked full time on disinformation programs. “Soviet Use of Active Measures: Address before the Dallas Council on World Affairs,” 1985, box 14, folder 310, William Casey Papers, HIA, 2.

41 U.S. Department of State, “Soviet Influence Activities,” 87; “Lexicon of Soviet Terms Relating to Deception,” 1983, box 700, folder 2, Herbert Romerstein Collection, HIA; Bittman, The KGB and Soviet Disinformation, 17–18; and U.S. Senate Committee on Foreign Relations, Subcommittee on European Affairs, “Soviet Active Measures,” 99th Cong., 1st sess., 1985, 4.

42 Charles Wick, “Soviet Active Measures in the Era of Glasnost: A Report to Congress,” United States Information Agency, March 1988, 89.

43 Vasily Mitrokhin, ed., KGB Lexicon: The Soviet Intelligence Officer’s Handbook (London: Palgrave, 2002), 13; “Hostile Intelligence Threat: Soviet Active Measures,” 1983, 1984, box 1, Kenneth DeGraffenreid files RAC, Ronald Reagan Presidential Library [hereafter “RRL”], Simi Valley, CA; “Soviet Active Measures,” May 30, 1984, box 9, RRL; “Hostile Intelligence Threat: Soviet Active Measures,” Presidential active measures speech, 1985, box 9, RRL; Shultz and Godson, Dezinformatsia, 13–33; Oleg Kalugin, Spymaster: My Thirty-two Years in Intelligence and Espionage Against the West (New York: Basic Books, 2009), 34–36; and Rid, Active Measures.

44 Andrew and Mitrokhin, Sword and the Shield, 292–93.

45 Furthermore, for both the KGB and Western intelligence services during the Cold War, there was a nexus between espionage and agents of influence. According to Ambassador Dennis Kux (discussed further below), “The Agency basically does two things: it collects intelligence. For that purpose you recruit people to be your spies. They answer your questions. Then, to help undertake your ‘covert operations,’ you also need people who can influence others. Let us take one of our global, ‘covert operations.’ Say, we were trying to oppose the threat of communism — something very broad. To do that, the Agency may try to hire people to influence the local government. These people are called ‘agents of influence.’ The source of intelligence and the ‘agent of influence’ may be the same person, but not always necessarily. If you happen to recruit, the Secretary General of Party X or some official to provide you with information on what is going on — in other words, a ‘spy’ — he also can serve as an ‘agent of influence.’ Where we get into trouble is when something happens. We say that they are using Mr. X, and they say: ‘No, we are just collecting intelligence.’ It sometimes becomes difficult to distinguish one category of agent from another.” Thomas Stern, “Interview with Dennis Kux,” Association for Diplomatic Studies and Training Foreign Affairs Oral History Project, Jan. 13, 1995, http://hdl.loc.gov/loc.mss/mfdip.2004kux01, 97–98.

46 John Barron, KGB: The Secret Work of Soviet Agents (New York: Reader’s Digest Press, 1974); and Christian Brown, “Flink Discusses Role in Spy Case; Queens Lawyer Speaks of Contacts with Russian,” New York Times, Sept. 17, 1962, https://www.nytimes.com/1962/09/17/archives/flink-discusses-role-in-spy-case-queens-lawyer-speaks-of-contacts.html.

47 Andrew and Mitrokhin, Sword and the Shield, 236–38; and The Papers of Vasili Mitrokhin [hereafter “MITN”], 1/6/6, Churchill College, Cambridge University, UK [hereafter “CCC”].

48 Kalugin, Spymaster, 54–55.

49 Barron, KGB.

50 MITN, 1/6/6, CCC.

51 Andrew and Mitrokhin, Sword and the Shield, 238.

52 “Racist Hate Note Sent to the U.N. Aides; Asian and African Delegates Warned of ‘Klan’ — F.B.I. Investigation Planned,” New York Times, Nov. 29, 1960, https://www.nytimes.com/1960/11/29/archives/racist-hate-note-sent-to-un-aides-asian-and-african-delegates.html.

53 Kalugin, Spymaster, 54.

54 “Active Measures: A Report on the Substance and Process of Anti-U.S. Disinformation and Propaganda Campaigns,” U.S. Department of State, August 1986, 22–24, 54–56; and “Alleged KKK Death Threats to Third World Olympic Athletes: A Soviet Active Measure,” Memorandum, Directorate of Intelligence, July 17, 1984, CIA Records, https://www.cia.gov/readingroom/docs/CIA-RDP85T00287R001400750001-6.pdf. The operation is, unsurprisingly, omitted from Primakov, ed., Ocherki istorii rossiĭskoĭ vneshneĭ razvedki: Vol. 6 1966–2005.

55 “Olympic Sabotage,” New York Times, Aug. 9, 1984, https://www.nytimes.com/1984/08/09/opinion/olympic-sabotage.html; and “Speech on Soviet Active Measures,” Memorandum for DDCI, May 16, 1986, CIA Records, https://www.cia.gov/readingroom/docs/CIA-RDP89G00720R000600670002-0.pdf.

56 Directorate of Intelligence, “Alleged KKK Death Threats to Third World Olympic Athletes.”

57 Author interview with former senior CIA officer, on condition of anonymity, June 12, 2019.

58 Victor Cherkashin, Spy Handler: Memoir of a KGB Officer (New York: Basic Books, 2005), 217.

59 Sandra Grimes and Jeanne Vertefeuille, Circle of Treason: A CIA Account of Traitor Aldrich Ames and the Men He Betrayed (Annapolis, MD: Naval Institute Press, 2012).

60 Author interview with former CIA officer stationed in U.S. embassy, London, on condition of anonymity, July 1, 2019; and “The 1984 Olympic Games,” Foreign Office Background Brief, August 1984, FO 973/378, The U.K. National Archives.

61 Richard Nixon, RN: The Memoirs of Richard Nixon (New York: Grosset & Dunlap, 1978), 589.

62 Ken Alibek, Biohazard: The Chilling True Story of the Largest Covert Biological Weapons Program in the World — Told from Inside by the Man Who Ran It (New York: Random House, 1999); and Philip Zelikow and Condoleezza Rice, To Build a Better World: Choices to End the Cold War and Create a Global Commonwealth (New York: Grand Central Publishing, 2019).

63 Milton Leitenberg, “New Russian Evidence on the Korean War Biological Warfare Allegations: Background and Analysis,” Cold War International History Project Bulletin, no. 11 (Winter 1998): 180–199, https://www.cpp.edu/~zywang/leitenberg.pdf.

64 “Soviet Active Measures: An Update,” U.S. Department of State, Special Report no. 101, July 1982, 3.

65 U.S. Department of State, “Soviet Influence Activities,” 53.

66 Thomas Boghardt, “Soviet Bloc Intelligence and Its AIDS Disinformation Campaign,” Studies in Intelligence 53, no. 4 (December 2009), https://www.iwp.edu/wp-content/uploads/2019/05/20140905_BoghardtAIDSMadeintheUSA.pdf; Douglas Selvage, “Operation ‘Denver’: The East German Ministry of State Security and the KGB’s AIDS Disinformation Campaign, 1985–1986 (Part 1),” Journal of Cold War Studies 21, no. 4 (Fall 2019): 71–123, https://doi.org/10.1162/jcws_a_00907; and “Wuhan Lab Leak Theory: How Fort Detrick Became a Centre for Chinese Conspiracies,” BBC News, Aug. 23, 2021, https://www.bbc.co.uk/news/world-us-canada-58273322.

67 “AIDS May Invade India,” Patriot, July 16, 1983; Ilya Dzhirkvelov, Secret Servant: My Life with the KGB and Soviet Elite (New York: HarperCollins, 1988), 383; and Author interview with former senior Service A KGB officer, on condition of anonymity, Sept. 23, 2017.

68 Stern, “Interview with Dennis Kux.”

69 Jakob Segal, AIDS: Its Nature and Origin (1986), copy in possession of the author (not held by the author’s university library). The Segal report is discussed in U.S. Department of State, “Soviet Influence Activities,” 34–36; and in “Soviet Influence Activities: A Report on Active Measures and Propaganda, 1987–1988,” U.S. Department of State, Publication 9720 (August 1989), 2.

70 DeNeen L. Brown, “‘You’ve Got Bad Blood’: The Horror of the Tuskegee Syphilis Experiment,” Washington Post, May 16, 2017, https://www.washingtonpost.com/news/retropolis/wp/2017/05/16/youve-got-bad-blood-the-horror-of-the-tuskegee-syphilis-experiment/.

71 U.S. Department of State, “Soviet Influence Activities,” 1987, 34–43.

72 U.S. Department of State, “Soviet Influence Activities,” 1987, 34–36; and Wick, “Soviet Active Measures in the Era of Glasnost,” 12.

73 Richard Helms, A Look Over My Shoulder: A Life in the Central Intelligence Agency (New York: Random House, 2003); and Richard Helms, “Testimony of Richard Helms on Communist Forgeries,” Hearing Before the Subcommittee to Investigate the Administration of the Internal Security Act and Other Security Laws of the Committee on the Judiciary, U.S. Senate, June 2, 1961.

74 Wick, “Soviet Active Measures in the Era of Glasnost,” appendix, 1–12; and Shultz and Godson, Dezinformatsia.

75 Stanislav Levchenko, On the Wrong Side: My Life in the KGB (Washington, DC: Pergamon-Brassey’s, 1988), 237.

76 Nicholas J. Cull, The Cold War and the United States Information Agency: American Propaganda and Public Diplomacy, 1945–1989 (New York: Cambridge University Press, 2008).

77 The papers of Kenneth DeGraffenreid, held at the Ronald Reagan Presidential Library, have significant holdings concerning the Active Measures Working Group, but almost all are still classified. A valuable study is Fletcher Schoen and Christopher J. Lamb’s “Deception, Disinformation, and Strategic Communications: How One Interagency Group Made a Major Difference,” Strategic Perspectives, no. 11 (2012), https://ndupress.ndu.edu/Portals/68/Documents/stratperspective/inss/Strategic-Perspectives-11.pdf.

78 “Interagency Active Measures Working Group,” Memorandum from Richard J. Kerr to Morton I. Abramowitz, May 6, 1986, CIA Records, https://www.cia.gov/readingroom/docs/CIA-RDP90G01359R000300010043-9.pdf.

79 Lawrence Eagleburger, “Unacceptable Intervention: Soviet Active Measures,” NATO Review 31, no. 1 (April 1983): 6–11.

80 Stern, “Interview with Dennis Kux”; Wick to Howard Baker, letter, March 22, 1988, White House Office of Records Management Subject File Federal Government Organizations [‘FG’] FG 298 USIA 55813, RRL; “Possible Current Active Measures,” Jan. 11, 1982, box 27, Kenneth DeGraffenreid Papers RAC, RRL; Kux, “Soviet Active Measures and Disinformation”; and Romerstein, “Disinformation as a KGB Weapon,” 54–67.

81 Kathleen Bailey, email message to author, June 30, 2020; and Kerr to Abramowitz, “Interagency Active Measures Working Group.”

82 Schoen and Lamb, “Deception, Disinformation, and Strategic Communications,” 70.

83 Schoen and Lamb, “Deception, Disinformation, and Strategic Communications,” 103; and George P. Shultz, Turmoil and Triumph (New York: Charles Scribner’s, 1993), 997.

84 Wick, “Soviet Active Measures in the Era of Glasnost,” 3; and Romerstein, “Disinformation as a KGB Weapon,” 66.

85 Don Oberdorfer, “State Department Hails Moscow for ‘Disavowal’ on AIDS,” Washington Post, Nov. 3, 1987.

86 Yevgeny Primakov, “Vneshniaia Razvedka Ischet Talanty,” Izvestiya, March 19, 1992; and Kalugin, Spymaster, 297. However, Primakov’s admission is — unsurprisingly — omitted in Russia’s official history of foreign intelligence: Ocherki istorii rossiĭskoĭ vneshneĭ razvedki: Vol. 6 1966–2005.

87 Selvage, “Operation ‘Denver.’”

88 Stern, “Interview with Dennis Kux.”

89 “Soviet Propaganda Alert No. 29,” Memorandum from Charles Z. Wick (USIA) to DCI William J. Casey, Nov. 25, 1985, CIA Records, https://www.cia.gov/readingroom/docs/CIA-RDP87M00539R002404010003-5.pdf, 12; Kathleen Bailey, email message to author, June 30, 2020; and Author interview with former senior CIA officer, on condition of anonymity, July 12, 2020.

90 Sir David Omand, “The Importance of BBC Monitoring for Intelligence,” Listening to the World, Imperial War Museums, c. 2015, https://www.iwm.org.uk/sites/default/files/files/2018-11/The%20Importance%20of%20BBC%20Monitoring%20for%20Intelligence%20by%20David%20Omand.pdf.

91 “Soviet Active Measures,” FCO 28/7745, The U.K. National Archives. See, for example, “Record of Talks on Soviet Active Measures between FCO Officials and Mr. Herbert Romerstein,” United States Information Agency, Oct. 30, 1986.

92 Stern, “Interview with Dennis Kux.”

93 Catherine Belton, Putin’s People: How the KGB Took Back Russia and then Took On the West (New York: Farrar, Straus, and Giroux, 2020); Fiona Hill and Clifford G. Gaddy, Mr. Putin: Operative in the Kremlin (Washington, DC: Brookings, 2013); Masha Gessen, The Future Is History: How Totalitarianism Reclaimed Russia (New York: Riverhead Books, 2017); and Peter Pomerantsev, This Is Not Propaganda: Adventures in the War Against Reality (New York: PublicAffairs, 2019).

94 Pete Earley, Comrade J: The Untold Secrets of Russia’s Master Spy in America After the End of the Cold War (New York: Penguin, 2007).

95 Mueller, Investigation Into Russian Interference, 22, 43; and U.S. Senate Select Committee on Intelligence, Report on Russian Active Measures, vol. 2, 38.

96 Tom Parfitt, “Troll Factory Pours Bile on to the Internet,” The Times, Dec. 17, 2018, https://www.thetimes.co.uk/article/troll-factory-that-poured-bile-on-to-the-internet-tb3kgxhvm. Also see Shaun Walker, “The Russian Troll Factory at the Heart of the Meddling Allegations,” The Guardian, April 2, 2015, https://www.theguardian.com/world/2015/apr/02/putin-kremlin-inside-russian-troll-house.

97 Julian E. Barnes and Adam Goldman, “Russia Trying to Stoke U.S. Racial Tensions Before Election, Officials Say,” New York Times, March 10, 2020, https://www.nytimes.com/2020/03/10/us/politics/russian-interference-race.html.

98 Julian E. Barnes and David E. Sanger, “Russian Intelligence Agencies Push Disinformation on Pandemic,” New York Times, July 28, 2020, https://www.nytimes.com/2020/07/28/us/politics/russia-disinformation-coronavirus.html.

99 Andrew and Mitrokhin, Sword and the Shield, 317–18.

100 Shannon Vavra, “Someone Duped Twitter Verification to Spread Racist Disinformation on U.S. Coronavirus Vaccine,” Cyberscoop, Aug. 12, 2020, https://www.cyberscoop.com/twitter-verification-racist-disinformation-coronavirus-vaccine-iranian-endless-mayfly/.

101 William J. Broad, “Putin’s Long War Against American Science,” New York Times, April 13, 2020, https://www.nytimes.com/2020/04/13/science/putin-russia-disinformation-health-coronavirus.html.

102 Kalugin, Spymaster; Markus Wolf, Man Without a Face (New York: Times Books, 1997); and Andrew and Mitrokhin, Sword and the Shield.

103 “Soviet Active Measures,” Hearings Before the Subcommittee on European Affairs of the Committee on Foreign Relations, United States Senate, 99th Cong., 1st Sess. (1985), 18–22. Gates explained that he had in mind Spain’s upcoming referendum on NATO membership. Soviet intelligence was known to have meddled in Spain’s 1980 election over joining NATO: to dissuade Spanish voters from supporting membership, the KGB that year spread disinformation that biological weapons were leaking from the U.S. military facility in Spain.

104 John McLaughlin, “The Changing Nature of CIA Analysis in the Post-Soviet World,” Speech given at the Conference on CIA's Analysis of the Soviet Union, 1947–1991, Princeton, NJ, March 9, 2001.

105 This was, of course, the phrase made famous by Francis Fukuyama in The End of History and the Last Man (New York: Free Press, 1992).

106 “Russia,” U.K. Intelligence and Security Committee of Parliament (July 2020), 21, https://isc.independent.gov.uk/wp-content/uploads/2021/03/CCS207_CCS0221966010-001_Russia-Report-v02-Web_Accessible.pdf.

107 Author interview with Ladislav Bittman, July 24, 2018.

108 U.K. Intelligence and Security Committee of Parliament, “Russia”; and Aaron Holmes, “The White House Reportedly Quashed Part of an Intelligence Report that Said Russia Is Helping the Trump Campaign,” Business Insider, Aug. 9, 2020, https://www.businessinsider.com/trump-russia-report-2020-election-dni-coats-2020-8.

109 Mason Walker and Katerina Eva Matsa, “News Consumption Across Social Media in 2021,” Pew Research Center, Sept. 20, 2021, https://www.pewresearch.org/journalism/2021/09/20/news-consumption-across-social-media-in-2021/.

110 Amy Ross Arguedas et al., “Echo Chambers, Filter Bubbles, and Polarisation: A Literature Review,” The Royal Society, Jan. 19, 2022, https://royalsociety.org/-/media/policy/projects/online-information-environment/oie-echo-chambers.pdf; Stephan Lewandowsky, Ulrich K.H. Ecker, and John Cook, “Beyond Misinformation: Understanding and Coping with the ‘Post-Truth’ Era,” Journal of Applied Research in Memory and Cognition 6, no. 4 (2017): 353–69, esp. 359, https://doi.org/10.1016/j.jarmac.2017.07.008; Drew Calvert, “The Psychology Behind Fake News,” KelloggInsight, March 6, 2017, https://insight.kellogg.northwestern.edu/article/the-psychology-behind-fake-news; and Gordon Pennycook and David Rand, “Why Do People Fall for Fake News?” New York Times, Jan. 19, 2019, https://www.nytimes.com/2019/01/19/opinion/sunday/fake-news.html.

111 Author interview with Ladislav Bittman, July 24, 2018.

112 Jo Fox, “‘Fake News’ — The Perfect Storm: Historical Perspectives,” Historical Research 93, no. 259 (February 2020): 172–87, https://doi.org/10.1093/hisres/htz011; Soroush Vosoughi, Deb Roy, and Sinan Aral, “The Spread of True and False News Online,” Science 359, no. 6380 (2018): 1146–51, https://doi.org/10.1126/science.aap9559; and Robinson Meyer, “The Grim Conclusions of the Largest-Ever Study of Fake News,” The Atlantic, March 8, 2018, https://www.theatlantic.com/technology/archive/2018/03/largest-study-ever-fake-news-mit-twitter/555104/, quoted in Christina Nemr and William Gangware, Weapons of Mass Distraction: Foreign State-Sponsored Disinformation in the Digital Age, Park Advisors, March 2019, 3, https://www.state.gov/wp-content/uploads/2019/05/Weapons-of-Mass-Distraction-Foreign-State-Sponsored-Disinformation-in-the-Digital-Age.pdf.

113 Michael V. Hayden, The Assault on Intelligence: American National Security in an Age of Lies (New York: Penguin, 2018); James R. Clapper, Facts and Fears: Hard Truths from a Life in Intelligence (New York: Viking, 2018); and Gordon Gauchat, “Politicization of Science in the Public Sphere: A Study of Public Trust in the United States, 1974 to 2010,” American Sociological Review 77, no. 2 (April 2012): 167–87, https://doi.org/10.1177/0003122412438225.

114 Jennifer Kavanagh and Michael D. Rich, Truth Decay: An Initial Exploration of the Diminishing Role of Facts and Analysis in American Public Life (Santa Monica, CA: RAND Corporation, 2018), https://doi.org/10.7249/RR2314; and Jennifer Kavanagh, Hilary Reininger, and Norah Griffin, “Fighting Disinformation: A Database of Web Tools,” RAND Corporation, Dec. 19, 2019, https://doi.org/10.7249/TL323.

115 Kathleen Hall Jamieson, Cyber-War: How Russian Hackers and Trolls Helped Elect a President (New York: Oxford University Press, 2018); Rid, Active Measures; and Alan I. Abramowitz, “Did Russian Interference Affect the 2016 Election Results?” Sabato’s Crystal Ball, Aug. 8, 2019, https://centerforpolitics.org/crystalball/articles/did-russian-interference-affect-the-2016-election-results/.

116 Mike Isaac and Daisuke Wakabayashi, “Russian Influence Reached 126 Million through Facebook Alone,” New York Times, Oct. 30, 2017, https://www.nytimes.com/2017/10/30/technology/facebook-google-russia.html.

117 Eagleburger, “Unacceptable Intervention.”

118 “RESIST 2 Counter Disinformation Toolkit,” U.K. Government Communication Service, accessed Sept. 7, 2022, https://gcs.civilservice.gov.uk/publications/resist-2-counter-disinformation-toolkit/.

119 Intelligence and Security Committee of Parliament, “Russia,” 10.

120 Mike Isaac, “Facebook Finds New Disinformation Campaigns and Braces for 2020 Torrent,” New York Times, Oct. 21, 2019, https://www.nytimes.com/2019/10/21/technology/facebook-disinformation-russia-iran.html; and Matthew Ingram, “Facebook’s Fact-Checking Program Falls Short,” Columbia Journalism Review, Aug. 2, 2019, https://www.cjr.org/the_media_today/facebook-fact-checking.php.

121 Jeff Smith, Grace Jackson, and Seetha Raj, “Designing Against Misinformation,” Medium, Dec. 20, 2017, https://medium.com/facebook-design/designing-against-misinformation-e5846b3aa1e2.

122 “The Online Information Environment: Understanding how the Internet Shapes People’s Engagement with Scientific Information,” The Royal Society, January 2022, https://royalsociety.org/-/media/policy/projects/online-information-environment/the-online-information-environment.pdf.

123 Daisuke Wakabayashi, “Legal Shield for Social Media Is Targeted by Lawmakers,” New York Times, May 28, 2020, https://www.nytimes.com/2020/05/28/business/section-230-internet-speech.html.

124 Aldous Huxley, Brave New World Revisited (New York: Harper & Brothers, 1958), 129.

125 Eagleburger, “Unacceptable Intervention.”

126 Michael S. Goodman and Filippa Lentzos, “Battles of Influence: Deliberate Disinformation and Global Health Security,” Centre for International Governance Innovation, Aug. 24, 2020, https://www.cigionline.org/articles/battles-influence-deliberate-disinformation-and-global-health-security.

127 “Online Media Literacy Strategy,” U.K. Department for Digital, Culture, Media & Sport, July 14, 2021, https://www.gov.uk/government/publications/online-media-literacy-strategy.

128 The Royal Society, “The Online Information Environment,” 21.

129 Pennycook and Rand, “Why Do People Fall for Fake News?”

130 “‘Verified’ Initiative Aims to Flood Digital Space with Facts Amid COVID-19 Crisis,” United Nations Department of Global Communications, May 28, 2020, https://www.un.org/en/coronavirus/%E2%80%98verified%E2%80%99-initiative-aims-flood-digital-space-facts-amid-covid-19-crisis.

131 Michael F. Dahlstrom, “The Narrative Truth About Scientific Misinformation,” Proceedings of the National Academy of Sciences 118, no. 15 (April 2021), https://doi.org/10.1073/pnas.1914085117.

132 Jon Henley, “How Finland Starts Its Fight Against Fake News in Primary Schools,” The Guardian, Jan. 29, 2020, https://www.theguardian.com/world/2020/jan/28/fact-from-fiction-finlands-new-lessons-in-combating-fake-news.

133 Megan Brenan, “Americans’ Trust in Government Remains Low,” Gallup News, Sept. 30, 2021, https://news.gallup.com/poll/355124/americans-trust-government-remains-low.aspx.

134 “Biden's U.S. Counter-Disinformation Adviser Resigns After Two Months on the Job,” Reuters, May 18, 2022, https://www.reuters.com/world/us/bidens-counter-disinformation-executive-resigns-after-few-weeks-job-2022-05-18/.

135 Anthony Fauci, “AIDS: Acquired Immunodeficiency Syndrome,” Presentation at the National Institutes of Health, Bethesda, MD, 1984, http://resource.nlm.nih.gov/101674642; and Philip Alcabes, “Dread,” interview by Paul Orgel, C-SPAN, Aug. 17, 2009, https://www.c-span.org/video/?288411-5/dread&event=288411&playEvent.

136 Darryl Fears, “Many Blacks Cite AIDS Conspiracy,” NBC News, Jan. 25, 2005, https://www.nbcnews.com/id/wbna6867177.

137 David Robert Grimes, “Russian Fake News Is Not New: Soviet Aids Propaganda Cost Countless Lives,” The Guardian, June 14, 2017, https://www.theguardian.com/science/blog/2017/jun/14/russian-fake-news-is-not-new-soviet-aids-propaganda-cost-countless-lives.

138 Grimes, “Russian Fake News Is Not New.” Also see Sarah Boseley, “Mbeki Aids Denial ‘Caused 300,000 Deaths,’” The Guardian, Nov. 26, 2008, https://www.theguardian.com/world/2008/nov/26/aids-south-africa.

139 Marc Bennetts, “The Epidemic Russia Doesn’t Want to Talk About,” Politico, May 11, 2020, https://www.politico.eu/article/everything-you-wanted-to-know-about-aids-in-russia-but-putin-was-afraid-to-ask/.

140 Alistair Coleman, “‘Hundreds Dead’ Because of Covid-19 Misinformation,” BBC News, Aug. 12, 2020, https://www.bbc.com/news/world-53755067.

141 Katherine Schaeffer, “A Look at the Americans Who Believe There Is Some Truth to the Conspiracy Theory that COVID-19 Was Planned,” Pew Research Center, July 24, 2020, https://www.pewresearch.org/fact-tank/2020/07/24/a-look-at-the-americans-who-believe-there-is-some-truth-to-the-conspiracy-theory-that-covid-19-was-planned/.

142 Bernard Marr, “How Much Data Do We Create Every Day? The Mind-Blowing Stats Everyone Should Read,” Forbes, May 21, 2018, https://www.forbes.com/sites/bernardmarr/2018/05/21/how-much-data-do-we-create-every-day-the-mind-blowing-stats-everyone-should-read/.

143 Examples of wedge issues include police violence, Black Lives Matter, abortion, and gun control in the United States, and Brexit in the United Kingdom.

144 Laura Rosenberger, “China’s Coronavirus Information Offensive,” Foreign Affairs, April 22, 2020, https://www.foreignaffairs.com/articles/china/2020-04-22/chinas-coronavirus-information-offensive.

145 The FBI has described QAnon, the hard-right conspiracy theory and political movement, as a threat to U.S. national security due to its incitement of violence. Jana Winter, “Exclusive: FBI Document Warns Conspiracy Theories Are a New Domestic Terrorism Threat,” Yahoo News, Aug. 1, 2019, https://news.yahoo.com/fbi-documents-conspiracy-theories-terrorism-160000507.html; “Facebook Removes QAnon Conspiracy Theory Group with 200,000 Members,” BBC News, Aug. 7, 2020, https://www.bbc.com/news/technology-53692545; Ben Collins and Joe Murphy, “Russian Troll Accounts Purged by Twitter Pushed Qanon and Other Conspiracy Theories,” NBC News, Feb. 2, 2019, https://www.nbcnews.com/tech/social-media/russian-troll-accounts-purged-twitter-pushed-qanon-other-conspiracy-theories-n966091; Kate Gibson, “Twitter Bans Zero Hedge After It Posts Coronavirus Conspiracy Theory,” CBS News, Feb. 3, 2020, https://www.cbsnews.com/news/twitter-bans-zero-hedge-coronavirus-conspiracy-theory/; and Amy Davidson Sorkin, “The Dangerous Coronavirus Conspiracy Theories Targeting 5G Technology, Bill Gates, and a World of Fear,” New Yorker, April 24, 2020, https://www.newyorker.com/news/daily-comment/the-dangerous-coronavirus-conspiracy-theories-targeting-5g-technology-bill-gates-and-a-world-of-fear.

146 Rid, Active Measures.

147 Intelligence and Security Committee of Parliament, “Russia,” 15.

148 This phrase was popularized by Sanger, The Perfect Weapon.

149 Levchenko, On the Wrong Side, 243.

150 Eagleburger, “Unacceptable Intervention.”
