As war envelops Ukraine, Russian sources have strived to create a miasma of disinformation about the invasion. Among ample efforts to distort reality, the Russian Ministry of Defense asserted recently that U.S.-backed labs in Ukraine have been developing bioweapons. Outlandish as this falsehood may be, Fox’s Tucker Carlson gave it credence by arguing that the U.S. government’s response was a “cover-up.”

As the Russia-Ukraine war intensifies, so too will the flow of disinformation. This is an age-old strategy that Russia has a long history of employing, and a playbook that others, most notably anti-vaccine activists, have borrowed from liberally. Yet rather than focusing effort on convincing people of a falsehood, the Russian strategy takes a tack long employed by the tobacco industry: sowing so much doubt about what is true that people are sent into decision paralysis. Faced with a cacophony of wild and conflicting claims, people do nothing, unsure of what is right.

In our digital world, disinformation campaigns can be devastatingly effective despite constituting only a small part of our media diet. We are intrinsically biased towards information that is emotionally visceral. We afford more weight to content that frightens or outrages us, with the ability to induce anger serving as the single greatest predictor of whether content goes viral. This propels the most visceral, divisive narratives to the forefront of discourse, creating a sound and fury of passionately debated claims and counterclaims. In that atmosphere, it becomes increasingly difficult to ascertain what to believe, and easy to abandon the task of discerning the truth.

If we are not to fall victim to such rank dishonesty, it is crucial now that we question our sources more carefully than ever before. 

Indecision and distraction have long been central to Russia’s dezinformatsiya (disinformation) policy, a term Stalin himself is credited with coining. Disinformation is an ancient concept, but by the imperial age Russia had mastered dark obfuscation techniques refined for the era of mass communication. By the dawn of the Soviet empire, the state was realizing this potential on an industrial scale, establishing the world's first office dedicated to disinformation in 1923. In the 1960s, the KGB covertly sponsored American fringe groups, amplifying conspiratorial narratives about everything from the assassination of President John F. Kennedy to water fluoridation.

The goal, as KGB Major General Oleg Kalugin elucidated in 1998, was “not intelligence collection, but subversion: active measures to weaken the West, to drive wedges in the Western community alliances of all sorts, particularly NATO, to sow discord among allies, to weaken the United States in the eyes of the people of Europe, Asia, Africa, Latin America....” Operation INFEKTION, a mid-1980s clandestine effort to spread the myth that AIDS was a CIA-designed bioweapon, was but one infamous exemplar. While utterly fictitious, it resonated with communities ravaged by HIV and neglected by the callous indifference of the Reagan administration. Despite Russian intelligence taking responsibility for this lie in 1992, the legacy of AIDS denialism persists to this day worldwide.

During the Cold War, the doctrine of “active measures” was the beating heart of Soviet intelligence. This philosophy of political and information warfare had wide remit, including front groups, media manipulation, counterfeiting, infiltrating peace groups and even the occasional assassination.

And in our media-saturated era, Russia has been, by far, disinformation’s most enthusiastic user. Take the 2016 U.S. presidential election and the contentious Brexit referendum; Russia appears to have influenced both via lies and distortions.

But disinformation is not confined to geopolitics. By summer 2020, the European Commission had identified a concerted Russian drive to propagate COVID disinformation worldwide. From the outset of the pandemic, Kremlin-backed troll farms pushed the narrative that COVID was an engineered bioweapon and peddled the explosive fiction that 5G radio frequencies caused the virus—a lie that resulted in dozens of arson attacks on cell towers worldwide.

There is a dark irony in the observation that conspiracy-minded people can be weaponized in plots to which they’re entirely oblivious. The enduring popularity of the virus-as-a-bioweapon mantra is a stark reminder that in the age of social media, such manipulation has become ever easier and more effective. Perhaps the most odious example of this is the cynical rise of anti-vaccine propaganda.

The sheer efficacy of vaccination is scientifically incontrovertible, and after clean water, immunization is the most life-saving intervention in human history. Despite this, the last decade has witnessed precipitous drops in vaccine confidence worldwide. The renaissance of once-virtually-conquered diseases prompted the WHO to declare vaccine hesitancy a top-10 threat to public health in 2019.

Vaccine hesitancy is a spectrum rather than a simple binary, and exposure to anti-vaccine conspiracy theories nudges recipients towards rejection. But critically, many who decline vaccination are not dyed-in-the-wool anti-vaccine zealots, but simply scared by what they have heard, unsure what to believe. Our susceptibility to the illusory truth effect exacerbates this inertia: the mere repetition of a fiction is enough to prime us to accept it, even if we know it to be false on an intellectual level. While Russia has often amplified anti-vaccine conspiracy theories to increase tensions, anti-vaccine movements exist independently of these efforts, and they are masters at sowing the seeds of doubt with torrents of conflicting and emotive claims.

This illustrates the grim reality that disinformation has no need for consistency and zero commitment to objective reality; claims are frequently contradictory, arguing both sides of an issue in exaggerated and divisive ways. This “Russian firehose” model of propaganda is high-output, contradictory and multichannel. The stream encourages us to sleepwalk into apathy, distrustful of everything. This renders us supremely malleable, and dangerously disengaged.

When it comes to vaccination, concerned parents often opt to stay with the devil they know, delaying or even rejecting vaccination rather than sifting through the barrage of conflicting claims to which they’re subjected. Similarly, the outpouring of fictions about Ukraine, its president, Volodymyr Zelensky, and the war is designed to overwhelm our capacity to analyze, inducing us to implicitly accept uncertainty over aggressor and aggrieved—a manufactured doubt benefiting Russia and other nations.

Conviction is not the chief goal of disinformation; instilling doubt is. This is why anti-vaccine activists have been so successful online, and why Russian troll-farms push ample resources into hawking lies virtually everywhere. The ubiquity of these fictions gives them an implicit veneer of legitimacy, fueling polarization and distrust.

This is the strategy Putin continues to pursue: Russian propaganda has already tried to paint Ukraine (or NATO, or America) as the aggressor through staged disinformation. That effort has been blunted by the Biden administration’s creative approach of releasing intelligence about planned operations before they could unfold. Across social media, Russian front organizations still try to induce doubt, efforts that will only intensify as the war rages on. Truth, the old adage insists, is the first casualty of war.