Combating everyday falsehoods

From false headlines about veteran benefit cuts to broad campaigns discrediting climate science, the spread of fake or misleading information online has disrupted democratic societies around the world. At the Atlantic Council’s Global Forum on Strategic Communications 2019 on October 23, experts from the fields of politics, science, media, and public health described how they have identified the spread of misinformation online and what individuals and organizations can do to counter it.

Kristofer Goldsmith, chief investigator and associate director for policy and government affairs at Vietnam Veterans of America, said he was simply trying to access his organization’s Facebook page when he was confronted with a sophisticated disinformation campaign targeting his group’s members. A search result on the platform led him to a page with all the trappings of his organization—the logo, pictures of members, information relevant to veterans of the Vietnam War—but the linked website was not affiliated with the organization. While he initially thought the page was a well-intentioned effort to build support for the organization, he soon observed deliberate attempts to manipulate and distribute video content on divisive political events aimed specifically at Vietnam veterans. His careful study of the page and other instances of fake information resulted in a nearly 200-page report on foreign disinformation campaigns targeting US servicemembers and veterans.

Goldsmith said that the page drew veterans in because it would “mostly post benign stuff,” but “once in a while they would find a news story and sensationalize it,” such as the vandalization of a veterans memorial. Goldsmith explained that “veterans and especially Vietnam vets are an economically efficient target both for domestic campaigns and foreign disinformation,” as veterans are often seen as leaders in their communities and families. “For every veteran whose mind can be changed or radicalized, a family could go with it and a group of friends go with it,” he said.

Travis View, co-host of the QAnon Anonymous podcast and a researcher of the QAnon conspiracy theory, agreed that specific factors often make certain individuals and communities susceptible to promoting fake information. Following conspiracy theories often “give[s] people a sense of purpose and mission,” View explained, one that centers on opposing mainstream narratives rather than on the facts of any specific case. View cited research showing that respondents who believed former al-Qaeda leader Osama bin Laden was already dead before US forces arrived at his compound in 2011—bin Laden was killed by US Navy SEALs during the May 2011 raid—were also more likely to believe that bin Laden was still alive than those who accepted the factual account of the raid. While those two beliefs contradict each other, View argued that “what unites those beliefs…is that they both reject the mainstream narrative.”

Joseph McCarthy, an associate editor at the Weather Channel covering climate change, explained that this reflexive rejection of the mainstream narrative can turn even genuine information into a tool for conspiracy theorists. “Real factual information from people that hasn’t been doctored are presented in a context that creates a conspiracy or creates a false narrative,” he said, becoming “malinformation.” He cited the 2009 “Climategate” controversy, during which climate change deniers accused scientists from the University of East Anglia’s Climatic Research Unit of manipulating climate data to prove man-made climate change, using thirteen years of hacked emails as evidence. Although the emails contained no proof of any manipulation of data—and the scientists were cleared of any wrongdoing in their research by multiple independent inquiries—their very existence provided fertile ground for those looking for a wider conspiracy, McCarthy explained.

Dr. Heidi Larson, director of the Vaccine Confidence Project and professor of Anthropology, Risk, and Decision Science at the London School of Hygiene and Tropical Medicine, argued that this propensity of some individuals and groups to see everything as evidence against the mainstream is why so many efforts to combat misinformation backfire. In her work on vaccines—a constant source of online misinformation—she explained that many in public health “want to shove more right information” toward those who oppose vaccination, but that very information “is exactly what they are rejecting. So it aggravates the situation.” McCarthy agreed, pointing out that when YouTube added brief descriptions to videos promoting conspiracy theories in order to educate potential viewers, the conspiracy theorists “loved it because it was another example of how they are being censored.”

Larson suggested that public health advocates need to “put our guns down” and soften the often vitriolic language directed at those who oppose vaccination. Many who are skeptical “feel demonized by asking a question,” she argued, and are therefore driven to embrace the conspiracy fully.

Goldsmith warned that constant misinformation is a “health issue” for many veterans, who are bombarded with fake news about matters such as benefit cuts that have very real impacts on their lives. Governments, he argued, need to recognize that countering misinformation means more than debunking fringe online myths; rather, “cyber hygiene [is] a health need for Americans.”

As policymakers look to limit the damage of misinformation in their societies, they must understand the factors that feed the conspiracy-theory fire and work to build their communities’ resilience against the spread of everyday falsehoods.

David A. Wemer is associate director, editorial at the Atlantic Council. Follow him on Twitter @DavidAWemer.


Image: An aide puts out examples of Facebook pages, as executives appear before the House Intelligence Committee to answer questions related to Russian use of social media to influence U.S. elections, on Capitol Hill in Washington, U.S., November 1, 2017. REUTERS/Aaron P. Bernstein