When voters in the twenty-eight European Union (EU) member states went to the polls in May to elect a new European Parliament, the second-largest democratic exercise in the world provided a “very tempting target for somebody who wanted to interfere in our democratic processes,” Julian King, European Commissioner for the Security Union, said at the Atlantic Council’s 360/OS conference in London on June 20. But thanks to increased measures to protect its citizens from disinformation, he added, the EU “didn’t see any kind of spectacular attack.”
The EU parliamentary elections, held from May 23 to May 26, “as a democratic exercise… went off well,” King said, adding that “we didn’t see a big hack of the sort we have seen in the past… [and] we didn’t see a very high-profile attack against a campaign,” like the leaking of fake material from the campaign of French President Emmanuel Macron in May 2017.
King is the first European Commissioner for the Security Union and coordinates the EU’s counterterrorism, cybercrime, and online disinformation responses. He argued that a big disinformation attack was avoided during the European parliamentary elections because of Europe’s ability to work together, both among member states and within individual governments and societies.
The EU “got all the member states together to work on election security,” King said, as well as set up a rapid alert system, which allowed “experts, including from civil society but also from the [government] administrations to all look out for examples of organized disinformation and share that information” with each other.
King’s team in Brussels also proactively reached out to the main political actors early on in the campaign, which he argued helped socialize the disinformation threat. “We worked with the politicians. We worked with the European Parliament, we worked with the parties, [and] we worked with the individual candidates,” King said. “That helped them, but it also meant they talked about this. They communicated about this problem, which raised some critical awareness.”
The EU also “sat down with the big social media platforms and together we agreed on a new code of practice on tackling disinformation,” King explained. The EU Code of Practice on Disinformation, signed in October 2018, commits social media companies and the EU to work on scrutinizing advertisements on social media platforms, proper labelling and disclosure of political and issue-based advertising, identification and deletion of false and misleading accounts, and empowerment of consumers and the research community to identify instances of disinformation.
The Code of Practice was not Brussels simply decreeing these new standards on social media companies, King said. “We didn’t come up with this, we worked with the platforms to develop it,” he said. The effort involves Google, Facebook, Twitter, Mozilla, and many other platforms.
King maintained that the steps the EU wants social media companies to take are not designed to censor free speech. “We agreed with the platforms that we had to take some measures to increase transparency,” he said. “This isn’t about judging whether a particular piece of political content is in itself true or false, good or bad. That would be to go down the road of censorship.”
The focus, rather, “is about insisting that there is more transparency,” King argued. “A light is shone on the provenance of political material, so that we all as citizens have a better chance to understand what it is we are being shown and to appraise it critically.” The goal of these efforts, he added, is not to completely rid the Internet of fake information, but “to build our resilience, to make us more secure against this type of challenge.”
King cautioned that although there was no big disinformation event, “the European parliamentary elections were not a disinformation-free zone, quite the contrary.” For all the benefits of the actions taken by the EU, he conceded that there have also been “changing tactics, changing ways of seeking to do disinformation where the big stuff that was very attractive a few years ago but attracts a lot of negative attention maybe is not quite so popular [now]. But smaller activity—the spreading of divisive and poisonous disinformation designed to disrupt and fan division in our democratic processes and societies is, I’m afraid, all too prevalent.”
There were also significant problems with some of the new protections that need to be addressed, he explained. While he lauded the setting up of ad libraries to help users and governments track current ads on social media, King noted that “we still need to make those ad libraries work better. We had problems with false positives and false negatives in Germany,” such as the labelling of a Dungeons & Dragons ad as political, while messages from the far-right Alternative für Deutschland (AfD) were left unmarked. In addition to these misclassifications, “we had problems with delays,” King said. “Legitimate political actors, parties and others, faced delays of up to seventy-two hours sometimes to get their ad registered. If we are going to use these libraries and we are going to encourage people to use them, they need to work better.”
While he praised the willingness of the social media companies to engage with Brussels and take real steps to help curb the problem of disinformation, King promised that he is “going to continue to press the platforms to do more and to do better.”
This pressure will be important, King added, because the relative quiet of the EU parliamentary elections may cause an undue sense of relief about the issue. “This problem isn’t going away. We got through the European parliamentary elections, but just in Europe there are elections every week of the year,” he said. “I don’t think we can take a pause. I think we need to find ways of making sure that we continue to do this work. Because the people trying to do us harm, the people trying to use cyber-enabled means to attack our shared values [and] way of life, unfortunately, they are not going to be taking a pause.”
The good news, King added, is that “there are more and more people who are in administrations and in ministries who are taking this much more seriously than they did twelve months ago.” With increased engagement from governments and civil society, along with the willingness of social media companies to work on this issue, King was hopeful that the challenge of disinformation can be successfully managed.
David A. Wemer is assistant director, editorial, at the Atlantic Council. Follow him on Twitter @DavidAWemer.