Ahead of the 2020 US elections, the disinformation threat is more domestic than foreign

A detail view of the U.S. Capitol Building in Washington, D.C., on September 18, 2020, amid the coronavirus pandemic. (Graeme Sloan/Sipa USA via Reuters)

With barely six weeks to go before the US elections, conspiracy theories, misinformation, and outright lies are more prevalent online than ever—complicating the job of reporters scrambling to cover the deeply contentious race against the backdrop of a global pandemic and a newly vacant Supreme Court seat.

On September 21, three top journalists and editors working at the intersection of national security, technology, and elections discussed the challenges they’re facing on a daily basis during a conversation hosted by the Atlantic Council’s Digital Forensic Research Lab and moderated by Resident Senior Fellow Andy Carvin. The panel featured Stacy-Marie Ishmael, editorial director of the Texas Tribune; Ellen Nakashima, national security reporter at the Washington Post; and Brandy Zadrozny, an investigative reporter with NBC News.

Ishmael said US news outlets have treated fact-checking “as if that’s the point,” when in fact newsrooms are inadequately prepared to deal with visuals such as memes that constitute an ever-growing medium of public disinformation. “We have to slow down, which is the opposite of one of our most finely tuned instincts, to make sure we are not misrepresenting, in this urgency for speed, what may be a longer, more complex [and] nuanced process” of analyzing digital disinformation, she said.


Nakashima, who helped expose Russian election interference in the 2016 US presidential election, said the situation today is very different from what it was four years ago. After WikiLeaks published more than 20,000 hacked emails from the Democratic National Committee in 2016, Nakashima said she realized “this was no longer straight political espionage, but Russia taking the game to a new level in the form of information warfare.”

“The political story dominated, and unfortunately the national security aspect of the story was underplayed,” she said. Today, “in the rush to overcompensate for what happened, we’re ever so aware of and looking for the Russian interference when it’s the domestic disinformation threat that is most pervasive and impactful and dangerous.”

As for the latter threat, Zadrozny said that for the past several years she has tracked a range of groups and movements, including anti-vaxxers, white supremacists, Antifa, and the “far-right media machine,” as well as, most recently, medical disinformation related to COVID-19.

“All of the weird, insane rabbit holes that I’ve been following down for the last half-decade have now melded into this one animal we’re sort of fighting,” she said. “At the same time, we’re seeing conspiracy theories everywhere, from QAnon to COVID deniers.”

Nakashima said that when doing a “deeper dive into the content” of questionable allegations, editors at the Post insist on investigating three things: the information’s authenticity, its provenance (whether it’s coming from domestic sources, Russian proxies, and so on), and its newsworthiness. “What we pick and choose to report on is also part of the debate,” she said.

Referencing “Plandemic,” a 26-minute video that falsely claimed COVID-19 was created by a shadowy group of elites aiming to profit from a potential vaccine, Zadrozny noted how the video quickly spread online despite being “chock-full of misinformation about coronavirus and ludicrous allegations.”

“We saw [the video’s viral spread] coming for a week,” she said. “Plandemic taught us that maybe we were a little too late. We were so careful about not amplifying misinformation, and I think we could be a little more aggressive.”

The Digital Forensic Research Lab tracked the spread of Plandemic through public Facebook groups, and found that the debunked video endured content-moderation efforts by Facebook, Twitter, and YouTube by finding a home on niche “alt-tech” platforms. These platforms claim to be “pro-free speech,” but largely serve as a safe harbor for hate speech and disinformation.

Among its many other consequences, the pandemic has resulted in greater cooperation between the national security and political staffs at large newspapers such as the Washington Post. “There’s now more coordination on these stories. Health, science, technology, social justice—we’re all in this together in having to crowdsource our insights and expertise,” Nakashima said.

She added that in many ways, it’s more difficult to fight domestic purveyors of disinformation than foreign actors. Unlike Russian trolls, US citizens are protected by the First Amendment.

“We just have to grow thicker skins because we’re all getting attacked and trolled, and called ‘enemies of the people,’” Nakashima said. “We have to ignore it and continue to focus on what our job is, and that’s reporting—trying to get the facts, put them in context [and] get it out there.”

Still, added Zadrozny, “Someone may have freedom of speech, but that doesn’t mean that Facebook needs to amplify these conversations or amplify the posts that harass, target, and disinform other individuals.” Zadrozny wants social platforms to wrestle with the fact that while those who spread disinformation and lie online may have a constitutional right to do so, the vast majority of users have the right to be free from that harassment while using those platforms.

Despite the innumerable complications newsrooms face in combatting disinformation, Ishmael, a self-proclaimed optimist, said she still had faith: “Where I am hopeful is that people will get better at understanding that [disinformation] is happening and understanding how to deal with it.”

Larry Luxner is a Tel Aviv-based freelance journalist and photographer who covers the Middle East, Eurasia, Africa, and Latin America. Follow him on Twitter @LLuxner.
