The importance of working together in the fight against disinformation

A report released last week by the European Commission and the European Union’s diplomatic service said “evidence collected revealed a continued and sustained disinformation activity by Russian sources aiming to suppress turnout and influence voter preferences” during the European parliamentary elections in May. The European analysis said it was too soon to conclude whether these online campaigns had influenced the outcome of the elections.

The attribution of this disinformation campaign to “Russian sources” is notable, as the European Union has in the past been cautious about assigning blame for cyberattacks to a foreign country by name.

But Sasha Havlicek, founding chief executive officer of the Institute for Strategic Dialogue, confessed to being “quite surprised by the number of incidences [more than 900] that were quoted in that report.”

“It doesn’t chime with the analysis that we have done… which is not to say that it wasn’t there, but to say that it is difficult… to do that type of attribution,” Havlicek said. “The actors have evolved in a big way and we need to be looking at transnational nonstate actors, we need to be looking at domestic actors and they are multifold on both those fronts.”

Nahema Marchal, a researcher at the Computational Propaganda Project, said the project had conducted a study that looked at the spread of information on multiple platforms across seven different European languages. Marchal said she was struck by the fact that they found “very, very little content that redirected to known sources of Russian disinformation, particularly RT or Sputnik news, and instead we had a lot of homegrown, hyper partisan or alternative news sites which were shared in great majority… but across the board we found very few instances of junk news or sources of misinformation and disinformation.”

Havlicek and Marchal participated in a discussion at the Atlantic Council’s 360/OS conference in London on June 20. Mark Scott, chief technology correspondent at POLITICO, moderated the discussion.

The European report said: “Given the increasingly sophisticated nature of disinformation activities, and the difficulties of independent researchers to access relevant data from the platforms, a conclusive assessment of the scope and impact of disinformation campaigns will take time and require a concerted effort by civil society, academia, public actors and online platforms.”

Havlicek said there has been “an evolution of the actors, of the tactics, and of the targets.”

“It’s a much more complex picture maybe than what we had talked about in 2016,” she said. “Much media has been looking for hostile foreign state interventions and fakery… in reality we have seen in that tactical space something much more complex and nuanced happening.”

Marchal pointed out that one of the “striking things” during the European parliamentary elections was that the nature of the problematic content had changed a lot. “In fact,” she said, “there was a lot of hybrid content… There was an attempt to tap into people’s emotions, but also try to validate or either amplify a certain type of worldview, as opposed to demeaning or derogating specific candidates.” A lot of this content revolved around topics like migration, climate change, and even the fire that devastated the Notre-Dame Cathedral in Paris.

Havlicek said the quantity of coordinated disinformation was lower than in previous election cycles. Social media companies are looking out for bad behavior and are doing a “much better job, but it is certainly not a done deal,” she said.

As for attribution of blame, Marchal said: “We tend to look a lot to the east when looking for specific actors, specific bogeyman. It is high time that we start looking west and talking about the structural linkages that exist between the far right in the United States and in Europe.”

In October 2018, representatives of online platforms, leading social networks, advertisers, and the advertising industry agreed on a self-regulatory Code of Practice to address the spread of online disinformation and fake news.

Marchal said such initiatives should be lauded, but added: “We should also think about the specificity of the EP elections and how representative or not it might be of a threat that is continuing, ongoing, and never pauses.”

“The European elections, despite the great turnout this year, are not the most popular ones… and are focused around parties as opposed to candidates. All of that means it was harder to try to sway the elections one way or another,” she said.

“It is all going in the right direction and I think the Code of Practices is a really important step, it is a collaborative step, more needs to be done between institutions and the companies,” said Havlicek.

Scott wondered what more could be done.

Marchal emphasized the need for coordination among the different actors who are working on detecting disinformation. “Working together… is going to be key,” she said.

Havlicek agreed, noting that the big gap at present is that many of the actors are looking at the issue in a limited way. “They’re looking primarily for hostile state actors and specific ones. They are limited in terms of what they are able to really deep-dive research. You have certain organizations that are going beyond that. The connecting of the dots” is critical, she said.

“We need a comprehensive approach where methodologies can be shared… where we can interrogate the research,” Havlicek said.

Asked by Scott whether social media companies are doing enough to address disinformation and fake accounts, Marchal said “it is important to acknowledge some of the laudable efforts that are being made by platforms—from increased collaborations with researchers and election integrity initiatives and the like.”

“However, as a social scientist I take issue with the fact that platforms think of misinformation still as a design problem, something that can have technological solutions,” Marchal said.

Social scientists are not brought into the conversation early enough to help inform design processes, Marchal complained. “We need to have a deep understanding of why people are engaging in these types of behavior [spreading disinformation] in the first place.”

Ashish Kumar Sen is the deputy director of communications, editorial, at the Atlantic Council. Follow him on Twitter @AshishSen.


Image: Pound coins are seen in front of a displayed EU flag in this picture illustration taken January 18, 2017. (Reuters/Dado Ruvic/Illustration)