Any US or European response to the ongoing issue of disinformation must not exploit the openness of a democratic society, but work within its boundaries to ensure transparency of information, according to the Atlantic Council’s Daniel Fried.
“We have to fight disinformation within the norms of our government,” said Fried, a distinguished fellow in the Atlantic Council’s Future Europe Initiative and Eurasia Center. Though the United States and its European allies must take important steps to counter Russian actors meddling in future elections, according to Fried, “we don’t have to become them in order to fight them.”
“The solution to this problem is going to look nothing like the problem itself,” said Jonathan Henick, deputy director of the Global Engagement Center at the US Department of State. Rather than turn the Kremlin’s subversive tactics against it, he said: “We are going to need to be much more creative in how to address this problem, and it’s not going to involve troll farms.”
Fried added: “We have options consistent with our values.”
Fried and Henick spoke at an event hosted by the Atlantic Council on March 7 to launch its latest report: Democratic Defense Against Disinformation. Fried and Alina Polyakova, David M. Rubenstein fellow for foreign policy at the Brookings Institution, are the authors of the report.
According to Fried, the report provides a menu of “doable options” for US and European governments, companies, and civil society to counter disinformation while respecting the fundamental rights of citizens.
“It’s not clear to me that bots enjoy First Amendment rights,” he said. According to Fried, “It is within First Amendment norms and the norms of free expression to try to help and urge social media companies to organize themselves to limit bots and apply the principle of a human behind every keyboard.”
Fried, Henick, and Polyakova joined David O’Sullivan, the European Union’s ambassador to the United States; and Corina Rebegea, director of the US-Romania Initiative and fellow in residence at the Center for European Policy Analysis, at the report launch.
Thus far, the conversation around disinformation has been conflated with discussions on its primary perpetrator: Russia. “While the focus is on Russia right now, these are the kinds of tools that can be easily diffused and are already used by other state and non-state actors,” cautioned Polyakova. “Russia is a starting point, but this is about more than one state actor.”
As outlined in the report, the key to combating disinformation is building resilience within governments and societies. One of the most effective ways to do so, said Fried, is by learning from those who have been there before.
“We recognize this is not an American problem,” he said. “Europeans have been dealing with this for a long time, and are ahead of us” in finding a solution. According to Fried, the most important recommendation in the report is that, together, the United States and Europe organize an informal coalition of stakeholders to determine common methods to counter disinformation.
Rebegea described how Europe has already begun to tighten its regulations, passing a new amendment to existing data protection laws. “This will force social media companies to be more transparent about the data they collect from users,” she said, adding: “This is something that maybe the United States should consider.”
While regulation raises concerns about government intervention, “there’s always a danger of overreach,” said O’Sullivan. The distinction is that disinformation “is using the fault lines in our political systems and the freedoms granted by our political system for political influence,” he said. Therefore, governments must assess the vulnerabilities in the system.
However, O’Sullivan said that though Europe may not have a First Amendment, there are strong protections for freedom of speech. He said the existing measures, as well as those that must be put in place to fight false narratives, are not designed to restrict information, but clearly label it and provide all the necessary context to understand it. He called for lawmakers and the private sector to bring this important conversation to a public forum so as to develop a “whole-of-society response.”
“We are not advocating heavy regulation,” said Polyakova. However, she said, “social media companies, many of them US companies, need to begin taking steps, not to censor, but clearly identify misinformation.”
“So far,” she said, “neither Facebook nor Twitter has done enough.”
While “no one is asking the social media companies to be the arbiters of truth,” said Polyakova, “you also can’t put the burden on the individual user.” Rather, she advocated for “small technical fixes that could work.” For example, said Polyakova, social media companies and democratic governments should look to counterterrorism efforts to inform their strategies to fight false information.
However, Polyakova described how “social media companies don’t know who to talk to in the US government,” and this has hindered productive collaboration. “We clearly need some leadership point of contact position within governments to be able to liaise with the private sector,” she said. Polyakova said that there must be a domestic mandate to address this issue. Such an initiative, she said, would fall to the Department of Homeland Security.
Fried contended: “Where you place the initiative in the US government is less important than someone taking strategic leadership and working with Europeans.”
“When there’s a problem and a political imperative to do something, and to need to be seen as doing something, I see an opportunity for someone to come in and take ownership of the issue,” he said, adding that the “imperative to be seen as doing something, and doing something real, will be overwhelming.”
Ultimately, said Polyakova, “we are behind” in terms of finding a path forward to fight disinformation on the world stage. Russia interfered in the US presidential election in 2016. It is now 2018, she said, and “where are we? We’re still having the conversation about where do we start.”
Key stakeholders can base their conversation about first steps on lessons learned from successful efforts to counter disinformation. “We don’t have to have an abstract discussion of whether it is possible,” said Fried. “We know it is.” France’s 2017 presidential election witnessed a targeted social media campaign by Russia to undermine the candidacy of French President Emmanuel Macron. However, the spread of false narratives was promptly quashed.
All panelists agreed on the importance of US and European governments and civil society activists sharing best practices and lessons learned from their individual experiences.
However, said Fried, “we are under no illusions. It’s a moving target.” He added that “the suggestions in our paper are not the end, they are the beginning.”
Rachel Ansley is assistant director for editorial content at the Atlantic Council.