Atlantic Council

#DisinfoWeek Madrid 2019

 

 

Welcome Remarks:

Benjamin Ziff,

Deputy Chief of Mission,

U.S. Embassy to Spain

 

Vicente J. Montes Gan,

CEO,

Rafael del Pino Foundation

 

Geysha González,

Deputy Director, Eurasia Center,

Atlantic Council

 

Opening Keynote – Democratic Values in the Age of Disinformation:

Ana Palacio,

Former Minister of Foreign Affairs of Spain

 

Storyteller – Bots, Trolls, and Shaping the Political Discourse:

Alexandre Alaphilippe,

Executive Director,

EU DisinfoLab

 

Panel: “Addressing Election Interference in the Age of Disinformation”

Panelists:

Ambassador Daniel Fried,

Distinguished Fellow, Eurasia Center,

Atlantic Council

 

Nicolas de Pedro,

Senior Fellow,

The Institute for Statecraft

 

Kadri Kaska,

Research Fellow, Policy and Law,

NATO Cooperative Cyber Defence Centre of Excellence

 

David Alandete,

Journalist,

Diario ABC

 

Moderated by:

Geysha González,

Deputy Director, Eurasia Center,

Atlantic Council

 

 

Location:  Madrid, Spain

 

Time:  6:00 p.m. Local

Date:  Tuesday, March 5, 2019

 

 

VICENTE J. MONTES GAN:  (Translated.)  (In progress) – is none other than to bring light to the fundamentals and, why not, the consequences of the disinformation phenomenon.  With an eye to such a timely debate, we could not find individuals more accredited and enlightened on these issues than the ones who accompany us today.  That is why I would like to reiterate our gratitude to all for having accepted the invitation to come to Spain and participate in this dialogue about an aspect so decisive for the viability of the democratic systems of the world.  But why a debate here at the Rafael del Pino Foundation?  When our founder, Rafael del Pino, decided to create the foundation that bears his name, his principal motivations were to contribute to the development of leadership, the defense of freedom, and the diffusion of knowledge; something that we try to accomplish every day with initiatives and dialogues such as this one, with those who wish to hold debates in such a way that, in the exercise of individual freedom, each one of you may settle on the position you find most fitting.

For a country such as Spain, which according to the principal indicators finds itself among the 20 full democracies of the world, these issues are absolutely fundamental.  We live in very complex times in which information in politics and society often appears distorted.  This is the result of the sectarianism that afflicts us.  In that context, unfortunately, words often become an instrument to deprive citizens of their ability to think.  Without a doubt, these circumstances are not new, and we have analyzed them in the past in conferences organized by the foundation.  We have touched upon these issues from both a political and a media point of view.  We recalled it in the lecture given by Mark Thompson, the chief executive of The New York Times, not more than a year and a half ago, who reminded us of the deterioration of political speech already detected in classical antiquity.  Concretely, he mentioned Plato’s dialogues, in which the Greek philosopher narrated a conversation between Socrates and Gorgias, where the latter talks about how to manipulate people through the use of words.  In that text, Plato imagined a future in which leaders would plot how to manipulate people.

Now, unfortunately, on many occasions the same is done, using more advanced and sophisticated tools.  We should not forget, as Martin Baron, executive editor of The Washington Post, reminded us in one of our conferences, that the problem might be that many individuals only want to read information that is in line with their opinions.  Another problem, he says, is that information is often false or, what is even more dangerous, in the words of James Harding, former director of BBC News, who also visited us, it is junk food:  information that seeks to poison us with pieces that are partially true and have been assembled in a very ingenious way to disguise the authentic story.  All of that is used by the authoritarian and populist regimes that emerge in our societies.

This should be only a brief introduction, and it is not my place to defend at this time the fundamental role of freedom of the press and of information in our society – information that, we all agree, should be truthful.  But we would not be at the Rafael del Pino Foundation if we did not break the mold in defending the exercise of the profession of the press, the enormous value of politics, and the principles of democracy.  As I have had the opportunity to point out, today we have the privilege of having accredited experts to analyze this and many other issues.  That is why I conclude by warmly welcoming you to the Rafael del Pino Foundation, reiterating the foundation’s gratitude to all our invited guests, collaborators, and attendees.  We want you to feel at home here today.

I yield the podium to Mr. Benjamin Ziff, deputy chief of mission of the embassy of the United States in Spain.  Don Benjamin, it is a pleasure to have you collaborate with the foundation once again on this occasion.  I hope we will do so again on many future occasions.  (Applause.)

BENJAMIN ZIFF:  (Translated.)  Very good afternoon to all.

(Continues in English.)  For my Atlantic Council colleagues here, I’ll speak in Spanish, so I think there’s translation available.  But just understand I’m saying brilliant things in Spanish.  So understand that.  (Laughter.)

(Translated.)  Good afternoon and welcome to #DisinfoWeek.  It is truly an honor to be here in such distinguished company, participating in a crucial debate of our times; especially in this country, which is facing elections that will mark its future.  Spain is a partner, friend, and ally of the United States.  The truth is that I miss the old days when our only concern online was facing that legendary African prince anxious to transport his gold overseas, who would make you rich in exchange for a small payment.  Or the offer to test that miraculous treatment for your weight, your baldness, or your hormonal problem.  To say nothing of the email that explained how you could become a millionaire working at home with an innovative system, only two hours per day.

Yes, that was truly disinformation-lite.  It seems like centuries have passed since that time of emails only, but it has been only about 10 years since those simple lies and tricks.  Today, the distortions on social media are elaborate, targeted, and frequently pursued with a pure and simple political end.  Disinformation already marks key spaces in the relations between countries and in the functioning of our democracies.  It is digital bait that hides a fishhook, one that injures our peoples, our politics, and international relations.  So, what should we do?  That is why we are here today: to find the answer.

The roots of the problem of disinformation run deep and are present in various layers of our societies.  Today, together with the Atlantic Council and the Rafael del Pino Foundation, we debate how to resolve the problem.  More specifically, this threat extends over all the terminals we use on a daily basis – computers, mobile telephones – in a world that is hyperconnected, with billions of clicks per second.  The high-level experts we have here today approach this phenomenon with concrete examples of these processes and warn us about the evolving dangers.

Our adversaries, and they are many, have embraced this strategy and have rushed to encourage confusion and fear with the sole purpose of creating divisions in our societies.  In short, it is a forceful axiom:  to sow confusion to encourage lawlessness.  I am very much interested to hear the proposals that this high caliber panel of analysts and experts have on the best way of counteracting this threat.

But first, let me introduce one more element of the weakness of the (telematic ?) system that we have today.  It comes less from Twitter and much more from our humanity: our natural curiosity, our faith in or distrust of our fellow men.  Our desire for innovation leaves us unarmed when we face those who manipulate those virtues that are so human.  Being less human is not a real answer to the disinformation challenge.  We need to protect that humanity against this danger.  And how can we do this?  We are all used to having sex education classes in the schools.  We are also used to having our children vaccinated to protect them from illnesses.  We should treat the danger of disinformation in the same manner, as just one more illness against which we apply intellectual antibodies at an early age.  In other words, we need to educate the younger generations to identify and counteract that which in English is called “bullshit.”  (Laughter.)  We cannot accept that our children and grandchildren come to accept the contamination of the information environment as something unstoppable or as just another abnormality of a globally connected world.  We should include in our education systems specific programs that explain the origin, development, and purpose of this gangrene that is disinformation.  Even more, we have to vaccinate our fellow citizens against this virus that is politically lethal.

In the United States, we see how the primary school systems in Rhode Island, New Mexico, Connecticut, and other states have begun to develop online [media] literacy programs as a required subject.  Also, the principal universities include in their study programs specific classes dedicated to this phenomenon.  Indiana University has developed specific apps to combat the problem:  Hoaxy, Botometer, and Fakey help everyone apply critical judgment amid the tangled branches of disinformation.  The communications college of Michigan University offers on its web page resources to build a base of understanding of the subject and to expand the available documentation for developing concrete systems that will help us prevent the spread of disinformation.

From the embassy of the United States, we have also recently organized a series of media literacy workshops in our American Spaces for more than 300 high school students in Madrid, Valencia, and Barcelona.  It is a subject that we take very seriously here in Spain.  These workshops provide students with the skills and tools to detect disinformation on the internet and social media.  The subjects taught also emphasize the responsible use of all digital media and the importance of being well-informed citizens.

At the end of this year, we will be working on media literacy projects with an NGO known for offering media literacy training directed at high school teachers.  This program will offer not only a firsthand introduction to this complex world, but also specialized methods that these teachers will be able to offer their students and with which they will be able to create study plans.

I encourage everyone to seek opportunities to work together and in that way expand these initiatives even further.  Only with a profound knowledge of the subject matter and a continuing effort will we be able to combat this not-so-new threat with surgical precision.

Thanks to all for sharing your knowledge and experience about the threat of disinformation and how to combat it.  As a diplomat, as a citizen who supports open and free societies and, more important, as a parent of children who will also have to face this danger, I share the interest, the concern, and the determination to know more, and to do more.

Thank you very much.  (Applause.)

GEYSHA GONZÁLEZ:  (Translated.)  Good afternoon and welcome to #DisinfoWeek Madrid.

I ask for your patience because I am very Puerto Rican and my Spanish is also very Puerto Rican.  My name is Geysha González and I am the deputy director of the Eurasia Center of the Atlantic Council.  It is a pleasure for me to be here with all of you on this, the second night of #DisinfoWeek.  Athens sends her regards.

First of all, I want to thank the embassy of the United States in Madrid and the United States Mission to the European Union for their great support for this event.  I would also like to thank our co-host and partner, the Rafael del Pino Foundation.  It is a great pleasure for us to collaborate with you on this project, which for us is a transatlantic effort.  We could not have this conversation alone.  For those who are streaming this conference, do not forget to use the hashtag #DisinfoWeek.  I do not know how to say hashtag in Spanish.

AUDIENCE MEMBER:  (Off mic.)

MS. GONZÁLEZ:  (In English.)  Oh, it is “hashtag.”  Perfect.  Great!

(Translated.)  To continue with our conversation:  the Atlantic Council has been studying the hybrid war that the Kremlin has directed toward Eastern Europe since 2015, when, in addition to invading Ukraine and taking part of its territory, the Kremlin began a disinformation campaign to hide its actions.  Since then, we have seen how those disinformation campaigns have touched all the countries that are, or aspire to be, democratic, including Western Europe and the United States.

For us, this #DisinfoWeek is an opportunity to elevate the theme of disinformation and to unpack this and the other tools that state and nonstate actors use to affect, infect, and destabilize political speech.  But we do not look only at that problem – the problem of how we speak.  We know what disinformation, or false information, is.  We know it is “bullshit.”  But we also know that there are other tools used in this space.  This is an opportunity for us to elevate [political] speech and to talk about what tools exist to counteract this threat to our democracies.

It is an honor for me to be here with you, and I believe that when I switch to English my speech will be smoother, I promise.  But for me, it is a great honor to take time to introduce Ana Palacio, who is here with us.  She is a member of the Atlantic Council family.  She is also very distinguished in various fields.  She has had a career in government, as minister of foreign affairs of Spain and as a member of parliament.

I also have the pleasure to introduce Alex Alaphilippe, who is a friend, partner, and a defender of democracies throughout the world.  He will speak about the most technical tools used to destabilize the conversations we have on the internet.

Yes.  Disinformation is not a problem that just began.  It did not start in the year 2015, nor 2016 or 2017.  It is a problem that has been in the world throughout history.  Lies attract; they are like junk food.  You know that it is not good for you, but it satisfies you.

This is an opportunity for me to start a discussion that goes beyond admiring the problem and to truly try to resolve it in a transatlantic way.

Ana, the time is yours.  Thank you very much.  (Applause.)

ANA PALACIO:  (Translated.)  Good afternoon.  I love being here, and since this is being – (continues in English) – streamed, and this is part of #DisinfoWeek, I will go on in English.

By the way, in your presentation you have said that I’m part of the Atlantic Council family.  I’m a proud member of the Executive Committee of the Atlantic Council, and as such I will just express my gratitude – the gratitude of the Atlantic Council to the U.S. representation in Brussels, and to the U.S. embassy here, and to you Benjamin Ziff as representative.  This is an important – extremely important moment, and this is an important – extremely important issue.

And last but not least, it is always, always a pleasure to come to Fundación del Pino.  We owe to Fundación del Pino – and Vicente Montes has expressed a few of the highlights – we owe to Fundación del Pino a lot of what we know.  And Fundación del Pino has become, well, THE center to be connected to in order to know what the intellectual life around the world is.

Now, I would start by saying something which is obvious.  Democratic values in this era of disinformation – this is my topic, democratic values in this era of disinformation – is a fitting way to begin this discussion of the #DisinfoWeek, for disinformation and representative democracy are inextricably linked.  Democracy, liberal democracy, is predicated on citizens making informed choices.  And core to the model is critical rationality.  This is the model.  By obscuring the truth, disinformation necessarily undermines that core requisite.  At its most basic level it impairs the ability of democratic society to function properly by replacing facts or faith in facts with rumors and conspiracy.  At its most insidious it accentuates divisions within society, making the common approaches, vision, and narrative needed to sustain mass democratic societies exceedingly difficult.  In other words, disinformation represents an existential threat to liberal democracy.  It is a threat that eats our society from within, and it is a threat that must be responded to.

That is precisely the motivation behind this event of Disinformation Week.  And honestly, thank you.  Thank you, my dear friend from the trenches, Dan Fried, for being the – I would say the engine behind much of these efforts – behind so many of the efforts in the Atlantic Council.

This is a clarion call that is now being taken up transatlantically.  Here in Europe, we see interesting initiatives, though perhaps still more words than deeds.  There is the Commission’s code of practice on disinformation.  There is the EEAS External – (inaudible) – Force and its EU versus Disinfo site.  In national legislation we also have interesting approaches.  But all these are first steps.  I am sure that we will hear more from the panel to come.

My remarks focus specifically on our democratic values and what is needed to preserve them.  In a way this goes a step beyond what Benjamin told us, this idea of vaccination, this idea of being – (speaks in Spanish).  We are just at the beginning of the war against disinformation.  Of course, our moderator has said it:  rumors and information warfare, as well as propaganda, have long existed.  But the current onslaught, fueled by technology and social media, represents at the very least a new phase – a new qualitative phase.  Dazed by the steady stream of crises this last decade and deeply shaken by manipulation since 2016, it is only now that policymakers and some of the public are finding their feet and recognizing the challenge ahead.  Organizations such as the Atlantic Council and the Institute for Statecraft – Nicolas – are vital early actors, and NATO has been and will continue to be crucial given the natural linkage between disinformation and security.

But we must be clear about the battle we face and how to address it.  Disinformation is a cause of democratic deterioration, but it is also a symptom of a much deeper disease afflicting liberal democratic society.  And responding to information warfare from Russia or to for-profit progenitors of fake news in Macedonia is necessary, but insufficient to address the challenge we face.

I see here – and you will please excuse the fuzzy analogy – a cautionary parallel with the war on drugs.  This is a trap that we must avoid.  When confronted by anonymous danger, the knee-jerk – the first reaction is always action – short-term, immediate, visible action.  Interdiction and unmasking of conspirators, eradication of sources provide ample targets.  And they can be summarized in action plans, tallied in reports, and given budget lines.  This is not to denigrate supply-side policies.  They are needed.  But unless demand is also addressed, much like the war on drugs, we are in for a very, very long, maybe interminable war.

The problem is that addressing demand is a much harder nut to crack.  It certainly involves education, as we have heard.  And there are recent moves in many countries – Benjamin Ziff has enumerated some – and, to take one example from a country that lately doesn’t give us so many good examples, Italy has introduced media literacy into school curricula.  And this is an interesting step.  There are others.  But we have to do more.

We must address the idea of responsibility more deeply.  This entails reconfiguring the citizen-state relationship.  The key danger here is fragmentation, the debasement of citizenship.  Today the relationship between government and governed increasingly resembles that of a service provider and a consumer.  As governance has gotten increasingly complex and technical, the citizen has been left behind even as government intrudes more and more in our lives.  It creates a passive, weak connection between citizen and state.  It reduces a sense of responsibility and denies agency.  A disempowered population is fertile ground for those peddling disinformation.  And as societal connections weaken or disappear, it is easy to fall into alternative realities, all of which is of course propelled by technology.  These are the echo chambers that we know so much about.

In Europe, we have to add a loss of narrative to that.  With the end of the Cold War and more recently with the end of the impulse for enlargement, Europe – the European project ever more heavily rested on prosperity as a legitimating basis.  In the years post-crisis, this too has gone.  The prosperity legitimacy doesn’t any longer work as THE narrative.  Combined with the insecurity of an aging and shrinking population, there is a distinct sense of being adrift at sea in a global environment in which we feel lost and no longer in control.  It is a witch’s brew.

None of this is any great epiphany.  But it needs to be emphasized at the beginning of this discussion that will look at the threats coming from outside, because even as we push forward responding to supply we must be cognizant that we face a much greater challenge in healing the causes of demand within our society.  The cancer of division and disunity has metastasized.  Putting on sunscreen is no longer enough.

This is urgent, not just because it impacts the proper functioning of our democratic societies but because we are living in a world at a time of mutation.  We are reaching the end of a 200-year period in which the ideas of the Enlightenment, of the importance of the individual, have been ascendant.  The last seven decades of this period saw the growth of a world order based on liberal democratic ideas.  Today, both liberal democracy and the ideals of the Enlightenment are in retreat.  The authoritarian and/or illiberal-democracy models embody, more deeply, a societal perspective that privileges the collective above the individual, and they present a profound challenge to the liberal ideas that have defined our worldview.  If we – and here I’m speaking for the U.S. and Europe, along with our core partners – if we are to meet this challenge, we simply have to get our business, our affairs in order.  We must create resilience within our societies.

And I’m finishing.  Seventy-two years ago, the great American diplomat George Kennan published the very well-known “The Sources of Soviet Conduct” in Foreign Affairs under the pseudonym “X.”  The “X” article is, of course, remembered for establishing the intellectual foundations for that grandest of grand strategies, containment.  Less remembered is, however, how this article finishes:  More important than containing the Soviet Union, Kennan said, was demonstrated – demonstrating resilience and vibrancy.  And I quote:  “It is a question of the degree to which the United States can create among the peoples of the world generally the impression of a country which knows what it wants, which is coping successfully with the problems of its internal life and with the responsibilities of a world power, and which has a spiritual vitality capable of holding its own among the major ideological currents of the time…To avoid destruction the United States need only measure up to its own best traditions and prove itself worthy of preservation as a great nation.”

This is a challenge before all of us here today on both sides of the Atlantic, finding a way to measure up to our own best traditions and to revitalize our liberal democratic bases.  So even as we look at how to contain and respond to the challenges from without, let’s not forget or willfully ignore the challenges we face within.  Thank you.  (Applause.)

ALEXANDRE ALAPHILIPPE:  Hi.  My name is Alexandre Alaphilippe.  I’m executive director of the EU DisinfoLab.  The EU DisinfoLab is a Brussels-based NGO that works as a resource center for people working on disinformation, whether they’re journalists, fact-checkers, academics, or people with other backgrounds.

So two things.  First, I’ve been asked to be short, so I will try to keep it short – for a good reason, because then you have an excellent panel and we need to save time for questions.  And second, because we all know there is a Real Madrid game tonight and I don’t want to hold you up too much here.  (Laughter.)

So, a small presentation about “Bots, Trolls, and Shaping the Political Discourse.”  The idea here is to understand what kinds of techniques are used and how artificial intelligence can be used on a daily basis to help – or not help – information, or disinformation, to spread.

And I wanted to talk to you about Steven.  Steven is a researcher at the EU DisinfoLab.  And he made this very good comment yesterday.  He said to me:  You know, Alex, what we can do now with artificial intelligence, it’s very simple; we can create people that don’t exist.  In fact, Steven does not exist.  (Laughter.)  Steven is part of a project called ThisPersonDoesNotExist.com, and thanks to artificial intelligence trained on many real pictures of faces, you now have the capacity to produce fake images – fake faces of people – instantaneously.  On this website, if you click refresh, every second you get a new face.  And you can see that Steven has a little bit of a problem just under the eye here.  That’s because the machine doesn’t get this right yet, because this project is new.  But where will we be in five years, 10 years, or even in six months?

So the question is, if I refresh this website, let’s say, 200 times – it takes me 200 seconds – I can create 200 fake Twitter profiles.  And by doing this, I have the capacity to influence very much how our conversation works.
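
To make the scale he describes concrete, here is a minimal, hypothetical Python sketch of that kind of harvesting script.  It assumes the site still serves a freshly generated face image on each request to its root URL, which is an assumption about current behavior rather than a documented API.

```python
# Hypothetical sketch: harvesting synthetic profile pictures at scale.
# Assumes https://thispersondoesnotexist.com returns a fresh AI-generated
# face for every request; this behavior is an assumption and may change.
import time
import requests

FACE_URL = "https://thispersondoesnotexist.com"  # assumed endpoint

def harvest_faces(count: int, delay_seconds: float = 1.0) -> None:
    """Download `count` synthetic faces, one per request."""
    headers = {"User-Agent": "Mozilla/5.0"}  # some sites reject empty agents
    for i in range(count):
        response = requests.get(FACE_URL, headers=headers, timeout=10)
        response.raise_for_status()
        with open(f"face_{i:03d}.jpg", "wb") as f:
            f.write(response.content)
        time.sleep(delay_seconds)  # roughly the "200 faces in 200 seconds" pace

if __name__ == "__main__":
    harvest_faces(200)  # 200 unique faces, enough to seed 200 fake profiles
```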

So the question we have to ask ourselves right now is:  How much of the world around us is fake?  You may have read the story a few weeks ago claiming that 60 percent of web traffic is fake.  Why?  Because you have so many scripts interrogating different algorithms to understand what’s going on.  If you look at online advertising and its bidding system, every time you go to a website you have requests that go out to 10 ad servers, which can interrogate what your profile is and how you behave, and they can pay to display something to you.  That takes milliseconds to do, and that’s a huge part of the traffic.  And every part of this traffic is linked to every other.  If you are sponsoring a very good story on Twitter with a YouTube link, there is a high chance that the YouTube video will be ranked better by YouTube, because it sees that the video is gaining a lot of traction.

So the problem is that we had fake profiles before, but now we also have fake amplification.  The dummies we used to have were very simple – accounts that were obviously false and always did the same thing – and they are no longer used by sophisticated disinformation campaigns.  Why?  Because they are very easy to spot.  If you have a robot that retweets every time someone tweets #DisinfoWeek, it’s very easy to see, because if you use the hashtag in a different context it will still retweet it.  But now, because we have so much knowledge about how people behave, we have the capacity to create scripts that imitate human behavior.  So they may pass on one story, maybe on two stories.  Maybe they tweet at different times of the day or of the night.  So it’s very difficult for you to know what’s going on.

And if you look at things, this was before.  This was very easy.  This is a study we conducted last year.  You have accounts that are basically tweeting all day long.  These people, I’m very concerned about them, because they never sleep.  This kind is easy to catch.  There was also a technique that was very widely used:  you are a French account and you are followed by accounts from all over the world, including people learning how to box in Thailand and people learning how to build houses in South Africa.  That’s very nice, but if you tweet only in French, why do you have 18,000 Twitter followers like this?  But this was the old way.
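
A minimal sketch of the "accounts that never sleep" heuristic described above, in Python; the input format and the threshold are illustrative assumptions, not a production bot detector.

```python
# Minimal sketch of the "accounts that never sleep" heuristic described above.
# Input format and threshold are illustrative assumptions, not a real detector.
from datetime import datetime, timedelta
from typing import Iterable

def active_hours(timestamps: Iterable[datetime]) -> set:
    """Return the distinct hours of the day (0-23) in which an account posted."""
    return {ts.hour for ts in timestamps}

def looks_sleepless(timestamps: Iterable[datetime], min_hours: int = 22) -> bool:
    """Flag an account that posts in nearly every hour of the day:
    a human usually leaves a multi-hour gap for sleep."""
    return len(active_hours(timestamps)) >= min_hours

if __name__ == "__main__":
    # Toy example: an account posting every 37 minutes around the clock.
    start = datetime(2019, 3, 1)
    posts = [start + timedelta(minutes=37 * i) for i in range(200)]
    print(looks_sleepless(posts))  # True: no sleep gap in its posting pattern
```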

With deep-fake technology, it’s also very easy to build a story on things that are real.  The amount of personal data leaked online today through hacking and the like makes it very easy to impersonate someone who has absolutely no history online.  So if I can find someone named Pablo Rodriguez living in the city of Madrid who has no history on these apps, I can invent him and give him a personality, and if I can manage 10, 20, or 100 of these sock puppets, I have the capacity to make people believe that this is a real discussion going on online.  It used to be very easy to catch, because at some point, if you look at how the networks are composed, you see that these networks are focused on only one person, and all of that person’s followers tweet only once, and you can see that you have one central person who is activating the sock puppet accounts.
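
The star-shaped pattern described here – one hub account surrounded by throwaway accounts that each post once – can be spotted with simple graph analysis.  Below is a small illustrative sketch using the networkx library; the graph construction, thresholds, and toy data are assumptions for demonstration.

```python
# Illustrative sketch: flagging star-shaped sock puppet clusters in a retweet graph.
# Edge (a, b) means account `a` retweeted or replied to account `b`.
# Thresholds are arbitrary assumptions for demonstration purposes.
import networkx as nx

def find_star_hubs(edges, post_counts, min_spokes=20, max_spoke_posts=2):
    """Return accounts that look like hubs of sock puppet stars:
    many neighbors, almost all of which barely posted at all."""
    graph = nx.Graph()
    graph.add_edges_from(edges)
    hubs = []
    for node in graph.nodes:
        neighbors = list(graph.neighbors(node))
        if len(neighbors) < min_spokes:
            continue
        quiet = [n for n in neighbors if post_counts.get(n, 0) <= max_spoke_posts]
        if len(quiet) / len(neighbors) > 0.9:  # >90% of neighbors are near-silent
            hubs.append(node)
    return hubs

if __name__ == "__main__":
    # Toy data: "hub" is amplified by 30 accounts that each posted once.
    edges = [(f"puppet_{i}", "hub") for i in range(30)]
    post_counts = {f"puppet_{i}": 1 for i in range(30)}
    post_counts["hub"] = 500
    print(find_star_hubs(edges, post_counts))  # ['hub']
```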

But this is made possible mainly because we have a new paradigm, and the new paradigm is that we are now living in the economy of attention.  What does that mean?  It means that today the main platforms are designed for you to stay on them.  Why?  Because the more you stay, the more they learn about you.  And you might think, yeah, I stay, but I don’t click.  Yes, they love it when you click, but they also know when you don’t click.  They also know how fast you scroll, how much time you spend on one post, how much time you don’t spend on another.  They have all of this information, and they can link it with all the other information they have inside the system or sometimes outside it.  And the whole thing is meant for you to stay, because the longer you stay, the more money they make from your profile, because they can sell something that is designed just for you.  And if you add to this the capacity of our brains to stop thinking when we are in front of something that makes us furious, we arrive at a very complex and problematic situation, because everything is driven by emotion.

So this is a very provocative picture, I know.  (Laughter.)  Everything is driven by emotion.  When you see the picture on the left, what do you see?  You see policemen charging people holding a Catalonian flag.  When you see this, whatever your side is on this very complex issue, you can be furious, you can be sad, you can be angry.  The thing is you have an emotion, and the first thing you want to do is react.  You want to say, no, this is not true; or, yes, this is outrageous; et cetera.  But if you pass this image through a tool to check it, you see the flag was not originally in the picture.  So we don’t know whether these people were demonstrating at that event or not.  What we know is that we have people being charged by policemen.  And this is a real issue, because once you say this, once you understand this, once you take the 30 seconds to check it, you have a very different reaction, because you want to ask:  OK, so where was this taken?  Why am I seeing it?  What is the point?  How do I know about this issue?  Who should I check with about this?

And disinformation uses emotion because it’s easy to hijack your rational brain.  That’s how we are made:  when we were in the jungle, in the caves, and we had to get away from a lion trying to eat us, we didn’t think, yeah, I need to run in this direction for approximately 200 meters and climb a tree – no, no, you just run, because that’s how you survive.  And our brain still works like this.

And disinformation uses emotion and attaches it to a context.  So on the right you see a post that was published recently on Facebook pages around the gilets jaunes movement in France.  Again, whatever you think about the gilets jaunes movement, I don’t care.  The thing is, this collection of pictures was published to say:  look at how the police are brutalizing gilets jaunes demonstrators; this is an outrage, this is a shame, et cetera.  But if you look at the pictures, the one with the Arc de Triomphe and the Champs-Elysées is real, from the actual day.  The three others – the ones you can see in the bottom corner on the right, and the man like this – were pictures taken in France, but two years earlier, at other demonstrations.  And the last one, the man in a bloody state, let’s say, this man is from Spain.  This picture is from the violence in Catalonia.  And all of this has been used to justify everything.
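
Recycled photographs like these can often be caught automatically by comparing them against images from earlier events.  The brief sketch below uses perceptual hashing via the Pillow and imagehash libraries; the reference archive, file names, and distance threshold are assumptions for illustration only.

```python
# Sketch: detecting recycled protest photos with perceptual hashing.
# The reference archive and the distance threshold are illustrative assumptions.
from pathlib import Path
from PIL import Image
import imagehash

def build_archive_hashes(archive_dir: str) -> dict:
    """Hash every previously verified image in a local reference archive."""
    return {
        path.name: imagehash.phash(Image.open(path))
        for path in Path(archive_dir).glob("*.jpg")
    }

def find_recycled(candidate_path: str, archive_hashes: dict, max_distance: int = 8):
    """Return archive images whose perceptual hash is close to the candidate,
    i.e. likely the same photo despite cropping or recompression."""
    candidate_hash = imagehash.phash(Image.open(candidate_path))
    return [
        name for name, archived in archive_hashes.items()
        if candidate_hash - archived <= max_distance  # Hamming distance
    ]

if __name__ == "__main__":
    # Hypothetical usage (paths are placeholders, not real files):
    # archive = build_archive_hashes("verified_event_photos/")
    # print(find_recycled("viral_post_image.jpg", archive))
    pass
```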

And when you see this, you understand it in a different context – and that’s why, on the eve of the European election, this is very important.  Things that happen in the world do not stay in one country.  Information circulates, and false information circulates.  I’m pretty sure that in two years, or in three months, we will see the same pictures used for a demonstration in Poland or in Italy or somewhere else, because this is new content for people who have not been exposed to it before.  And they need to understand what’s going on.

And this takes us to the context – we talked a little bit about it – of polarized crowds.  We are in this situation also because of our own behavior, because we tend to follow and to be friends with people we choose.  The only people we don’t choose are our family, because we don’t choose our family, but we choose our friends.  These are the people you are in touch with on a daily basis, and they have a particular way of seeing the world.  And if you are a passionate horse lover, you can find people who have the same passion as you on the other side of the Atlantic.  That’s really cool.  But at some point, if you are only seeing the same content, it seems to be true because it has been repeated, repeated, and repeated again, and whether it’s true or not, we will have, as human beings, the tendency to follow what looks to us like the majority.

And this is why right now we have a huge problem in our societies:  the less we trust institutions, NGOs, academics, journalists, the public sector, companies, the more we retreat into what our friends and the people around us know.  And that is very difficult, because this can be manipulated very easily, because we can know exactly what you want to see and we can provide you exactly what you want to see.

And basically, we have moved away from a system in our democracy where, to discuss politics, you first went to a bar.  You knew the people in the bar.  You know, you had this guy who is a leftist, that one a rightist, and you could have a discussion.  Then we ended up having the discussion on the internet – I’d call it internet 1.0, 10 years ago.  We could still have a discussion on the internet.  It was complicated, it took time, and sometimes you didn’t go to sleep because you had to prove someone wrong, but at least you had debate.  Now it’s basically shouting at each other:  you’re wrong, you’re right, you’re false, you’re fake news, no you’re fake news, no you’re fake news.  And this is not going anywhere, because the more polarized we are, the less discussion and the fewer compromises we can make.  And democracy is about making compromises, having discussions, and holding fair debates.

So AI can uncover amplification.  There are many things that AI can do.  AI can learn how information spreads and how people are polarized.  AI can help us understand how content is repeated and how people behave in the same way when it doesn’t look authentic.  So AI is a great tool.  And the thing is – so this slide is called “don’t blame the internet.”  (Laughter.)  Don’t blame the internet, because it’s not the technology we have a problem with; it’s how it’s used.  If you look at printing, Gutenberg invented the printing press and we printed the Bible.  Two centuries later we printed the encyclopedia.  Two centuries later we printed “Mein Kampf.”  The problem is not to burn all the books; the problem is to understand why we have these kinds of things – why we have these kinds of problems with the technology that we’re using.
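
One concrete way this kind of analysis can "see" that content is repeated and that accounts behave the same way is to flag clusters of accounts posting near-identical text.  Here is a small, self-contained Python sketch of that idea; the similarity threshold and the toy data are assumptions, not a description of any platform's actual detection systems.

```python
# Sketch: flagging coordinated near-duplicate posts from different accounts.
# Similarity threshold and data format are illustrative assumptions.
from difflib import SequenceMatcher
from itertools import combinations

def similarity(a: str, b: str) -> float:
    """Rough text similarity in [0, 1] using difflib's ratio."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def coordinated_pairs(posts, threshold: float = 0.9):
    """posts: list of (account, text). Return account pairs that published
    near-identical text, a common sign of copy-paste amplification."""
    flagged = []
    for (acct_a, text_a), (acct_b, text_b) in combinations(posts, 2):
        if acct_a != acct_b and similarity(text_a, text_b) >= threshold:
            flagged.append((acct_a, acct_b))
    return flagged

if __name__ == "__main__":
    posts = [
        ("user_1", "Shocking! Police attack peaceful demonstrators AGAIN #outrage"),
        ("user_2", "Shocking!! Police attack peaceful demonstrators AGAIN #outrage"),
        ("user_3", "I went to the march today, it was mostly calm."),
    ]
    print(coordinated_pairs(posts))  # [('user_1', 'user_2')]
```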

And we are entering an era where we have new paradigms.  Basically, we had the paradigm of representative democracy.  It was very simple.  We had rational arguments, so we could discuss political visions.  We had representation, so we elected people, or we had academics whom we could trust to lead the debate.  We had time for deliberation, so a law was not about getting it passed next week but in two months, just to have the time for checks and balances.  And we had consensus.  Right now, this is still how our political system works in Western liberal democracies.

If we look at how the internet works, it doesn’t work like this at all.  The internet works on emotion.  It works on no representation – every voice is equal – on instantaneity, and on polarization.

And I’m not sure we will go back to what we had, and I’m not sure we should go to what we have.  So this means we need to find a way to find a new model between these – to use technology, to use internet, and to reform what used to be our system of governance.  And for this I think – I also see disinformation as a – as a very interesting moment where we can discuss about this because this is about what we want to do with democracy.  And we have to put ethical limits to this.  We need to protect people.  And when we say protect, I mean, regular user of the internet.  We need to protect whistleblowers, journalists, and sources, freedom defenders.  We need to protect anonymity.

This is one of my favorite memes.  The dog says:  “On the internet, nobody knows you’re a dog,” which is good.  The thing is, you should at least know who these persons are and how the conversation is being influenced, and whether that influence is fair or not, because we need to have fair debates.  I don’t care what side you are on, but use arguments.  The more you lie, the more you invent things, the less powerful your argument is.  We need to go back to having strong political discussions with arguments based on rationality and not on emotion, because an election is not about the next five weeks or the next five minutes; it’s about the next five years.  So we each need to have a discussion with ourselves about the arguments we’re using to justify our particular positions, whatever they are.

And we should not let the machines decide.  The machines are a tool to help us, and we need to be sure that we have human control over what machines say, just as we have civil control over what governments say, just as we have checks and balances.  We cannot rely on only one solution, one silver bullet, because it doesn’t exist – because if it existed, we would already be using it.

So this is a complex challenge and a complex time, and you’ll have a great panel to talk about this now.  Thank you.  (Applause.)

MS. GONZÁLEZ:  And, Alex, we’ll definitely pull you back for the Q&A.  So if any of you have any questions, we will be taking questions from the audience for Alex as well.

Please, go ahead.  We put you in the middle so you don’t get in trouble.  (Laughs.)

And now we’re here.  So we’ve had these great, great presentations.  Thank you so much, Ana.  Thank you so much, Alex, for really setting the stage for what we’re going to talk about.

Of course, our remarks have focused quite a bit on disinformation alone, but one of the key things to understand is that disinformation does not exist in a vacuum.  Disinformation is one of the tools of influence operations, and influence operations are what make it so that disinformation actually sticks, so that disinformation is actually effective.  And I think, you know, the reason a news story about someone who’s very corrupt sticks, in a country where corruption is everywhere, is because it matches part of people’s reality, right?  And so in this discussion we’re really going to focus on unpacking that toolkit, thinking about the different tools that are used to destabilize the political discourse and how that can be effective around elections.

I think that the best way to start will probably be to kick it over to Ambassador Dan Fried.  If you could give us a general overview:  What are influence operations?  What do bad actors do?  And then we will split it up with our experts for a deeper dive.

DANIEL FRIED:  It isn’t new.  In the old days, my days, the Soviet Union started the rumor that the CIA had invented AIDS.  And the way it planted the story was to go to pliable African newspapers, get the story planted, and then encourage left-wing newspapers in Europe – fringe newspapers – to pick it up, and by degrees move the story from the geographic fringes of the mainstream media to the political fringes to the political mainstream by degrees.  And it worked.  The rumor was picked up.  It remained folklore for years and years:  CIA invented AIDS.  This took a number of weeks to months to put into place.  And the Soviet propaganda apparatus and the KGB, which was running the operation, relied on, in Europe at least, the category of people called useful idiots and fellow travelers – various witting, semi-witting, or unwitting agents.

Now, of course, thanks to the internet, the process doesn’t take weeks; it takes hours, sometimes minutes.  And instead of actually relying on human beings to disseminate the story to favorable or well-inclined newspapers, you can have sites set up in the thousands, or bots, or various impersonator accounts flogging your story and moving it into the mainstream very quickly.

Mr. Alaphilippe was right when he pointed out that the printing press created not only the Gutenberg Bible, but all kinds of religious tracts that helped spark the religious wars.  And propaganda, using the technologies of the time, is as old as the technology of the time.  The United States, 1790s, scurrilous pamphlets making all kinds of claims, Adams about Jefferson, Jefferson about Adams.  This stuff goes on through the 19th century.  The cheap daily newspaper, the radio, the television, every advance of technology can be and has been subverted for propagandistic purposes.  So there is nothing conceptually new about propaganda on the internet except that it’s awfully fast and it is awfully hard to fight, the way all new technologies are seen as hard to fight.  Which gets us into the area of solutions, but that’s the next topic.

My point, though, is that this is not new.  We have dealt with this before.  And it is useful to take a step back and look at the solutions through which democratic and even non-democratic societies have developed and then applied norms of behavior to integrate new information technology without succumbing to the worst features of that technology.

MS. GONZÁLEZ:  Ambassador Fried, you mentioned useful idiots, fake idiots.  And I think this is a great opportunity to bring in David Alandete, who knows very well about useful idiots and fake idiots – (laughter) – who have, you know, decided to take up – (laughs) – quite a few stories and amplify that which is false.  If you could just, you know, get us started on that.

DAVID ALANDETE:  Yeah.  I know a little bit about them.

MS. GONZÁLEZ:  A little.  (Laughs.)

MR. ALANDETE:  Yeah.  So the thing is that I completely agree with you, Ambassador.  And I think the problem today is that when you compete online, the playing field is leveled, and journalists have to compete with people who act as if they are journalists but are not.  So I’m a journalist.  I’ve always been a journalist.  And I believe that what we do is, like, we are actually exercising a right that belongs to the whole of society, you know.  In order to make good choices and good decisions, society needs information.  You vote for this candidate or the other one, you know what to do, you know whether you have to protest or not.  But disinformation kills the journalist and tries to bypass the press in order to make it easy for politicians or, like, authoritarian regimes to talk to the people directly.

So you were talking about these little campaigns that we suffer as journalists, the attacks towards journalists.  I’ve been accused of being an agent of the CIA.  Now I’m supposed to work for NATO, I’ve got money from George Soros, just because, you know, I’ve been trying to report on disinformation here in Europe, first in the Catalan crisis now with the far-right parties, with the Italian referendum.  And yeah, like, we have to suffer a lot of people who actually play the role willingly.

Just to summarize, how did I come to the topic of disinformation and what a great challenge it is for democracy nowadays?  Actually, it was not here in Spain; I was a correspondent in the Middle East.  I worked for a Spanish newspaper there.  And when I had the chance to go to Syria, I had the chance to see how Russian public media portrayed the conflict, and portrayed the United States, and portrayed NATO.  The United States was going to start a war in Syria just because it wanted some type of gas or oil or something.  NATO was corrupt and was trying to, like, find ways – this is all, like, portrayed there.  Chemical attacks didn’t exist.  The regime, supported by Russia, was actually defending democracy against all odds.

And you know, this type of little bubble that was created there was later, like, slowly expanded into other areas of attack.  Now we face an environment in which, you know, the Russian public media operate in Spanish, in French, in Arabic – it’s a major problem – in Chinese, in any language that you may imagine.  And they are portraying an alternative, different reality.  That’s what they claim:  we portray a different reality.

And to actually address your point, when they come here they find people who are willing and ready to believe what they are portraying.  We’ve all – like, I’m sure you’ve heard about it, even if you’re living in the United States:  there were 1,000 people injured in the referendum in Catalonia.  We’ve seen those images right now.  Were there 1,000 people injured?  There were not.  You know, like, this was in the front pages of many, many newspapers.  There were a lot of people who were willing and ready to believe this because there were people here who were actually saying this.

And then, like, the problem is when you don’t have strong media who actually take on the challenge of verifying facts and, you know, doing our job.  Then, like, I think everything fails, and you have this current dystopian situation in which, you know, anything is possible, and you are far right or far left, and the middle ground is lost.

MS. GONZÁLEZ:  So much to unpack there, particularly thinking about journalists verifying facts in an era where everything is about being first, being the breaking news, being viral.  That is what the business model is designed for.

But before I, like, deep dive into that, I want to talk a little bit more – you know, Nico, we have known each other for quite a bit.  You contributed to The Kremlin’s Trojan Horses report series for the Atlantic Council at a time when folks were not paying attention to the challenge of the Kremlin, particularly as it targeted Spain.  You’re an expert on hybrid warfare.  We’ve been talking about how the far right and far left are starting to communicate beyond just their own communities and their own networks in their countries.  Help us unpack a little bit how dark money, disinformation, and political networks across domestic actors are actually, you know, operating right now, and the threat behind that.

NICOLAS DE PEDRO:  OK.  So it’s a difficult question, yeah.

MS. GONZÁLEZ:  Yeah.  Yeah.  Solve the problem.  (Laughter.)

MR. DE PEDRO:  Yeah.  Yeah, such a question.

Well, the point is that for those of us who have been following Russia for many years, it was probably easier to see this coming, because several years ago it was clear that they had made a decision that they needed to undermine the West – undermine us – in order to protect themselves.  This is a problem of wrong perception in the Kremlin, but to them it is real.  I mean, they really think that there is a plan from the West to destroy Russia.  So that’s why they interpret all their moves as defensive moves.  They always think that they are defending themselves – in Ukraine, in Syria, wherever.  They always think that.  And everything is part of a plan.  That’s their perception.  So if David is publishing whatever news, then it’s part of a plan, because either Soros or the CIA or whoever is behind that thing.

And what the Kremlin is doing is basically offering a platform to multiply the effect of our own crises, because here we have, I would say, three dimensions.

One is the one that has been referred to by Minister Palacio, this deeper crisis of the West, of our liberal democracies, which does not exactly date from 2008 but has definitely deepened since then.  We are still struggling with this, with the lack of prosperity or the apparent lack of prosperity, and with the crisis of legitimacy of our systems.

Then we have, related to this, the crisis of the traditional media.  The business model is not clear.  This is what David was referring to.  And also, this is connected to digitalization of the world.

So all this combined, this is a pretty severe challenge.  And it’s not only Russia doing this, of course.  Private actors, state actors, and many others, and the media standards are not necessarily the best everywhere.  I’m coming from Barcelona, and we have some local channels which are not that different from Russia Today I have to say.

Sorry, now for a moment I am lost.

But I also enjoy this analogy with the war on drugs, yes?  So we have this problem.  This would be connected to the demand side, yeah?  This wider crisis that we have to approach through education, media literacy, all these kinds of approaches that are part of the solution.  But then we have the suppliers, right?  And I know a bit about one of the main suppliers.  As Ambassador Fried was saying, Russia has been doing this for a long, long time.

And they have identified, rightly, that the question of the legitimacy of the West, which is now under pressure, is an element to exploit, because what they are doing is exploiting vulnerabilities.  And we have many vulnerabilities.  And the problem is that we have to decide whether we approach all of this together at the same time – which is going to be difficult, because we cannot solve all these vulnerabilities from one day to the next – or whether we focus at least on preventing these hostile foreign actors from taking advantage of this situation.  But this is easily said and difficult to tackle, because they are basically taking advantage of our free flow of information and freedom of speech.  And how to apply legislation to deal with this problem is particularly difficult.

At least something that needs to be done – and that’s what I and others are trying to do, which is not really easy – is to connect all the dots, because, as you said at the beginning, disinformation is not taking place in a vacuum.  And when we refer to the Kremlin, this is part of a big framework that connects nuclear intimidation and military means to the use of dirty money, to disinformation, to the intelligence services, of course, and to many other things.  It doesn’t mean that there is a clear, detailed plan in place, but there is clearly a coherent strategy behind all these moves.  So we can connect all these dots.

And of course, you can advance your strategic goals by funding specific political parties within Europe.  So probably we need to adapt our legislation to this problem.  So far we know about the Front National, which is now called the Groupe National if I’m correct.  A few days ago we heard about the Lega in Italy.  And these are probably just the tip of the iceberg, so there is much more there.  And we can also see a pattern in the kind of actors they are approaching:  far right and far left.

And I will finish with this.  Alexandre was referring to the yellow vests.  It’s quite interesting, the positive coverage that almost all Russian media – not only those owned by the state directly – are giving to the yellow vests as the representatives of the genuine French people against the globalist elite, this kind of hidden government or whatever.

So this is what they are offering.  They think they are defending themselves by undermining us.

MS. GONZÁLEZ:  Yeah.  I mean, when we talk about vulnerabilities, right, I think we ought to make sure that we separate the levels.  We have our societal vulnerabilities, and then we have the vulnerabilities of our critical infrastructure, which is where I really want to bring in Kadri.

A lot of the political discourse – when we think about democratic institutions, when we think about voting, when we think about elections – is about protecting our ballot system and cyber.  But the truth is that there are different elements within cyber that can easily disrupt and affect the way that we see, you know, candidates, the way that we see even our own environment, right?  There is the narrative that the Kremlin pushes a lot, the decay of the West, right?  Like, you know, the if-you-go-to-Copenhagen-they-will-take-your-kids kind of narrative, right?  And so it would be great if you could just, like, unpack a little bit the critical infrastructure, the vulnerabilities, and how we can start thinking about that.  No easy task.  I have no easy questions.

KADRI KASKA:  (Laughs.)  It’s funny; I normally get those questions from the other point or from the other perspective, in particular.  So describe to me how is this not a technical issue.  And you come from a – from exactly the opposite angle, because I’ve had – and we – and not least in Estonia we’ve had to preach quite a bit that election security is not simply about securing the – sort of the information system or vote-tallying and voting systems, but it’s about securing our increasingly digital societies.  And we are very often fixated on the – on sort of the technical issues.

In general, I mean, we are rather good at understanding elections in the context of the overall constitutional and political processes of a country.  But as soon as we begin discussing election cybersecurity, people very quickly become somehow myopic around the issue of technology, around technological or technical issues.  So I think we need to zoom out from there again.

For one, the issue is not primarily about elections as such; it is about the resilience of our digital ecosystems and very much about the resilience of the normal functioning of our societies.  So elections are simply an element in that – not unique, not different, just special because of their constitutional and political significance.

And in terms of the threat picture, or the cyber threat picture, what we’ve seen over the past five years, I would say, is that disinformation and attacks against election processes exploit and are amplified by our inherent weaknesses even in that domain – our dividing issues and value conflicts.  And they are aggravated by the means we tend to use to solve problems, oftentimes exposure, confrontation, and competition.  So there is little inherent drive towards cooperation, and that also means that we are all pretty comfortable operating within the walls of our silos, even in the technological or cybersecurity sense.  And we often lack practical appreciation for the level of connectedness we all have with everyone else.

So this – yeah, what I guess I would like to say is that these patterns tend to concentrate around elections because of their significance, because of their concrete rallying point in time, but I think we should not view and work to address them as attacks against elections but rather as attacks against society’s way of life and our values and our coherence.

And that, in turn, implies that election cybersecurity must be planned as – and executed as an integrated whole together with defending the overall digital ecosystem.  So in addition to securing the ballots and tallying systems, comprehensive elections risk management also means that we consider the risks in auxiliary systems that might impact the elections.  So what are the critical dependencies on other digital systems, and where do the threats lie against our critical infrastructure?  So communications services, electricity supply, digital identity used for not just voting, as we do in Estonia, but also vote tallying, for example.  And how could a continuity or integrity incident in these systems – in these auxiliary systems impact the process and the perceived legitimacy of elections process?  And how are the risks managed, these risks in particular, and who’s responsible for that?  So on the one hand critical infrastructure cybersecurity is vital for assuring the security of the elections process, but it also is important because it gives assurance of the reliability of the digital environment and the conditions – the normalcy for us as a society – also during the campaign period, not just sort of the voting days.
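
(Editor's illustration, not part of the panel: the kind of dependency mapping Kaska describes – which election functions rely on which auxiliary systems, and who owns each risk – can be sketched as a tiny risk register.  Every system name, owner, and impact rating below is a hypothetical placeholder, not an assessment from the discussion.)

```python
# Minimal sketch of an election dependency/risk register: each election
# function lists the auxiliary systems it relies on, and we roll up which
# systems could impact the process.  All names and ratings are placeholders.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Dependency:
    system: str   # e.g., "mobile networks", "digital identity service"
    impact: str   # "continuity", "integrity", or "both"
    owner: str    # who is responsible for managing this risk

@dataclass
class ElectionFunction:
    name: str
    dependencies: List[Dependency] = field(default_factory=list)

voter_auth = ElectionFunction("voter authentication", [
    Dependency("digital identity service", "both", "identity authority"),
    Dependency("mobile networks", "continuity", "telecom regulator"),
])
tallying = ElectionFunction("vote tallying", [
    Dependency("electricity supply", "continuity", "grid operator"),
    Dependency("results publication site", "integrity", "election authority"),
])

def systems_affecting(functions):
    """Roll up which auxiliary systems could impact which election functions."""
    exposure = {}
    for fn in functions:
        for dep in fn.dependencies:
            exposure.setdefault(dep.system, []).append((fn.name, dep.impact))
    return exposure

for system, hits in systems_affecting([voter_auth, tallying]).items():
    print(system, "->", hits)
```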

So in the EU we have the NIS Directive, the Network and Information System Security Directive, which has been in force since last May.  And I would say it's quite a significant achievement, a rather strong tool in two regards.  First, it creates a consistent understanding across the EU of what functions and services we consider as essential for our societies.  But it also creates a mandatory and regular risk assessment and management mechanism so that we ideally can be adequately aware of our digital reliance, of the vulnerabilities as well as threats to our most critical functions.

But in that vein, I believe we shouldn't be dogmatic about the NIS or mandatory mechanisms, especially in the context of democratic processes.  Critical infrastructure is not black and white.  There are varied levels of criticality, for example, due to cross-sector or cross-service dependencies.

And for another, there are services and infrastructures going beyond formally designated critical infrastructure that can significantly affect public functions and public trust.  So think media, for example, even social media, or think political parties and the NGOs.  They may not be subject to specific cybersecurity obligations, but extending assistance to them often means a significant leap in their – in bringing them up to speed in understanding and managing not just their cyber risks, but by proxy also building societal resilience to cyber threats.  And yes, of course, this does require resources as well.

And then, if I may – I've taken up quite a bit of time – I think one layer that cannot be ignored in this picture, if we think about ecosystem-wide cyber risk management, is the digital literacy of political parties, the candidates, and the electorate.  And I am fairly keen to stress this is not just about cyber hygiene, but more fundamentally an issue of literacy, because following cyber hygiene practices may amount to, well, about four-fifths of cybersecurity, but cyber hygiene rules will not stick if their purpose and nature – the why and how things work – are not understood at the fundamental level.  And that's something we need to work on when we talk about cybersecurity of the society at large and elections in particular as well.

GONZÁLEZ: Wonderful.  Thank you, Kadri.

I want to, you know, stay on this topic, on the vulnerabilities side, right, because we've talked about how often malicious actors use the very same freedoms that we have and that we enjoy in order to exploit and divide us.  You know, we've talked about how the Kremlin sees any move from the West as an attack on them and, you know, kind of mobilizes all of the tools that they have to counter.  So I wanted to talk a little bit about, OK, if we're thinking about how they're exploiting our freedom of expression and our freedom of speech, they're getting in there – how to explain this in a way that makes sense?  If our democracy fails, authoritarians succeed, right?  If freedom of expression is what's being exploited and our response is to limit freedom of expression, to really exhaust that, right, to react on that, then authoritarian regimes succeed.  So when we're talking about solutions, can we think a bit through what are some democratic things that we can employ now to address some of the challenges that we've addressed in this session?  Ambassador Fried?

FRIED: I think I'll start with two basic principles.  In attacking any public policy problem, first, unpack the problem.  Don't look at the problem as a whole because all problems looked at in the aggregate appear to be impossible, and you'll run screaming from the room because it all looks – it all looks hopeless.  If you regard – if we start looking at the disinformation challenge as a – as a challenge of the existential nature of truth, we're going to go down a rabbit hole and never be able to solve it.  So unpack the problem into smaller, digestible bits.

Secondly – second principle – work within our democratic norms.  Don’t become them in order to fight them.  The United States learned that lesson the hard way during the Cold War.  Our biggest screw-ups as the United States of America came from ignoring our own best principles and starting to imitate the enemy in tactics.  Bad idea, doesn’t work.  When we were true to our own principles, we succeeded the most, all right?  Basic stuff.

What is – how does that apply to the issue of – how do those two apply – unpack the problem, stay true to your principles – apply to the challenge of disinformation?  First, separate domestic and foreign.  The Russians exploit our divisions within our societies, and of course so do domestic actors.  But our field of maneuver and public policy solutions are more applicable to foreign actors, OK?  There are things we can and should do to limit foreign – occult, hidden – penetration of digital space that go further than what we should apply to our own domestic actors.

What do I mean by that?  Content controls really don’t have much of a place, in my opinion, fighting disinformation.  I know that in fighting – that there are controls against pornography or against ISIS beheading videos.  I understand that.  But that is not terribly applicable against, say, Russian disinformation ops.  If you go down the road of content controls, you force a government agency to be the arbiters of truth.  I don’t know about Spain, but I would not like and it is not in the American tradition to trust any administration in the United States with the role of being the arbiter of truth.  Don’t go there.

So use principles of transparency and integrity to filter out foreign disinformation.  By transparency and integrity I mean forcing, for example, disclosure of foreign bots disguised as human beings.  Use technical means to expose deep fakes – you know, fake people like the one Alexandre created for us.  That probably is in the realm of technology, if imperfect.

If Juan from Madrid or Barcelona is not actually Juan – and he isn’t actually from Barcelona, he’s Ivan from the St. Petersburg troll farm that the Kremlin funds – people ought to know that.  There ought to be a ban on impersonator accounts.  And frankly, that is well within the technical capability of most social media companies even today.
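
(Editor's illustration: a minimal sketch of the sort of impersonator-account heuristic Fried says is within platforms' technical reach.  The account fields, thresholds, and example are invented for this sketch; real platform detection relies on far richer signals and human review.)

```python
# Naive heuristic for flagging possibly inauthentic or impersonator accounts.
# The fields, thresholds, and example account are invented placeholders.
from dataclasses import dataclass

@dataclass
class Account:
    handle: str
    declared_country: str     # where the profile claims to be based
    login_country: str        # most frequent country observed at login
    posts_per_day: float
    account_age_days: int

def looks_inauthentic(acct: Account) -> bool:
    mismatch = acct.declared_country.lower() != acct.login_country.lower()
    hyperactive = acct.posts_per_day > 72     # roughly one post every 20 minutes, all day
    brand_new = acct.account_age_days < 14
    # Flag only when several weak signals coincide, to limit false positives.
    return sum([mismatch, hyperactive, brand_new]) >= 2

juan = Account("juan_madrid", "Spain", "Russia", 140.0, 9)
print(looks_inauthentic(juan))   # True in this toy example
```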

Algorithmic bias is another area that we ought to look at.  Algorithmic bias drives sensational and extremist coverage because it pays.  Alexandre talked about this earlier.  Extremist coverage is what we go for when our emotional brain kicks in at the expense of our rational brain.  Well, in the United States in the television era, there was a so-called Fairness Doctrine under which the major national television networks had to approximate covering both sides of an issue.  Does that precedent allow us – the “us” being the United States and the European Union – to impose or require a Fairness Doctrine for social media companies’ algorithms?  I mean, I don’t have an answer to this, but it is at least worth exploring whether or not we can limit, through regulation, the deliberate polarization that is part of the social media company business model and is exploited by the Russians and the Chinese.
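
(Editor's illustration: a toy contrast between a purely engagement-maximizing ranking and one with a crude balance constraint, in the spirit of the Fairness Doctrine analogy raised here.  The posts, viewpoint labels, and scores are invented; this is not how any real feed algorithm works.)

```python
# Toy contrast between a pure engagement-maximizing ranking and one with a
# crude "balance" constraint.  Posts, viewpoint labels, and scores are invented.
posts = [
    {"id": 1, "viewpoint": "A", "predicted_engagement": 0.91},
    {"id": 2, "viewpoint": "B", "predicted_engagement": 0.40},
    {"id": 3, "viewpoint": "A", "predicted_engagement": 0.85},
    {"id": 4, "viewpoint": "B", "predicted_engagement": 0.55},
]

def rank_by_engagement(items):
    """Pure engagement objective: the more sensational viewpoint dominates the top."""
    return sorted(items, key=lambda p: p["predicted_engagement"], reverse=True)

def rank_with_balance(items, slots=4):
    """Alternate viewpoints slot by slot so neither side dominates the top."""
    by_view = {}
    for p in rank_by_engagement(items):
        by_view.setdefault(p["viewpoint"], []).append(p)
    ranked, views = [], sorted(by_view)
    while len(ranked) < slots and any(by_view.values()):
        for view in views:
            if by_view[view]:
                ranked.append(by_view[view].pop(0))
    return ranked[:slots]

print([p["id"] for p in rank_by_engagement(posts)])   # [1, 3, 4, 2]
print([p["id"] for p in rank_with_balance(posts)])    # [1, 4, 3, 2]
```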

Now, there are also seemingly boring technical fixes like standard terms of service among social media companies.  That sounds boring and fine print, technical, but it isn’t, because until there is a common definition of a bot or a standard of inauthentic accounts social media companies can simply not work together.  When there is a common standard, they may be forced under common regulation to pull certain bots or certain impersonator accounts that meet a certain definition from the internet.
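
(Editor's illustration: what a common, machine-readable definition of an inauthentic account might look like, so that different platforms could evaluate the same criteria in the same way.  The criteria, field names, and threshold are hypothetical, not any existing standard.)

```python
# Sketch of a shared, machine-readable definition of an inauthentic account
# that different platforms could all evaluate the same way.  The criteria,
# field names, and threshold are hypothetical, not any real standard.
import json

COMMON_BOT_CRITERIA = {
    "version": "0.1-draft",
    "signals_required": 2,
    "signals": [
        "posts_generated_without_human_input",
        "identity_misrepresents_operator",
        "coordinated_posting_with_flagged_accounts",
    ],
}

def meets_common_definition(account_signals, criteria=COMMON_BOT_CRITERIA):
    """True if the account trips at least the required number of shared signals."""
    hits = [s for s in criteria["signals"] if account_signals.get(s)]
    return len(hits) >= criteria["signals_required"]

example = {
    "posts_generated_without_human_input": True,
    "identity_misrepresents_operator": True,
    "coordinated_posting_with_flagged_accounts": False,
}
print(json.dumps(COMMON_BOT_CRITERIA, indent=2))
print(meets_common_definition(example))   # True
```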

Now, I’m – I’ve deliberately gone small-bore and technical because my point is the technical solutions actually exist, that we need – we, the European Union – Europe and the United States need to get beyond the existential dread in the face of this crisis and start focusing on the solutions.  And I will admit immediately that no set of solutions will solve the disinformation problem, period, but it may limit it.

Which is where I go to Ana – I’m going to finish up with Ana Palacio’s point.  You have supply.  You have demand.  You whack at supply, hoping to get some control over it.  You will never reduce it to zero, but you may reduce it significantly.  The demand side is, of course, the increase of media literacy and social media literacy, and I refuse to believe that human beings are now much stupider than we were in the 19th century.  (Laughter.)  We may not be smarter, but we aren’t – we are probably not actually stupider.  We did learn to discriminate among daily newspapers – quality journalists, yellow journalists.  It took a generation.  We managed it.  We can do so again.

Our policy problem is to control the supply as much as possible while managing to increase the discrimination of demand.  And since I spent my whole career in public policy, I’m real good with imperfect solutions because those are the only kind you ever get, OK, except during political campaigns when politicians promise to solve everything.  In the real world, partial solutions – partial solutions that are good enough – are as good as it gets, and that actually probably is enough.

Anyway, more on that, but I – the United States has spent the last two years admiring the problem and pulling its hair out about the problem enough.  We have the tools to solve it.  And by the way, a shout-out to the European Commission; they have at least taken a stab at solving it through the code of practice, the action plan, the rapid response mechanism that’s being put in place before the elections.  Critics say it’s full of holes, it’s weak, it’s soft, but the Europeans are doing something about it and I want to give them full credit.  We, the United States, ought to be working with them and up our game.

GONZÁLEZ: Thank you, Ambassador Fried.

David, you’re ready.

ALANDETE: Yeah.  Ambassador, like, I cannot imagine how a company that knows what I want before I know it and shows the advertisement on my screen and on my phone doesn’t have the technical solution to actually tackle some bots.  So I completely agree with you.
FRIED: Exactly.
ALANDETE: You know, like, what works in the advertisement field should work also in the ethics field.

But when we talk about the ministry of the truth, I have to disagree a little bit, Ambassador.  I do not think that the European Commission has done enough.  I think they could have done more.  They rely on, you know, educational programs and self-regulation.  Go ask RT and Sputnik to self-regulate and you’ll see what happens, you know?  They rely on some measures that I think are not enough.  If we do nothing – I mean, if we do nothing because we think we’re going to run the risk of being a ministry of truth policywise, then, like, of course, I have some examples here that I noted down, like, fast.

Like, we will believe that that famous flight that went down over Ukraine was shot down because, you know, like, the people who were going to cure AIDS were flying there and some pharmaceutical company shot it down, as opposed to a Russian missile, you know?  And that’s what the Russian media are reporting.

We would believe that the responsible people for, like, poisoning Skripal would be Donald Trump and Theresa May, or maybe a Spanish company that had interest in, like spreading some type of virus, or maybe it was an overdose.  This has been published.

We would think that the war in Afghanistan is – started because the United States was looking for some type of gas or, like, some mineral that was, like, there, like, that never materialized.  This is published today by the Russian media.

We would believe that George Soros is behind absolutely anything that happens in the world, from, like, far left to far right to independence to nationalist sentiment.

So something needs to be done.  And as a journalist, I refuse to believe that propaganda agents have to be treated as I have to be treated.  You know, I go through some checks before I enter any press conference.  You know, I have to take exams to enter, like, my studies.  You know, I had to, like, do research.  I’m held to some standards, and they are not.  Whenever I make a mistake I have to write a correction; they don’t.  They are offered these platforms in order to spread this misinformation and they are not held accountable as we are.

I’ve seen headlines by RT and Sputnik in election time changing and nobody said anything.  “Tanks in the streets of Barcelona” and they’re like, oh, someone said that he saw a tank.  “One thousand people injured” – well, actually, like, it was not injured; like, they were just treated.  Like, this happens every day.

I’ve seen some things that actually work, and one of them has been done in the United States.  When you force these people to register as foreign agents, they actually suffer a lot and they make a lot of noise.  They make a lot of noise.  And when you see YouTube and it says, like, RT is a company that is funded by the Russian government, they make a lot of noise there.

I remember – I mean, we’re talking about the election coverage.  There are two experiences that I think were very enlightening recently.  One, in the elections in France, I don’t know if you remember, there’s a gag, you know, like two-day – I think it’s 48 hours in France where you cannot publish interviews, polls, whatever.  This happens in Europe for reasons that escape me nowadays because, you know, like, you can actually bypass this through social media.  But do you guys remember E.M. leaks – the Emmanuel Macron leaks – during the gag time?  Like, they accused Macron of doing drugs, having a lover, like one million things, and they were all fake.  But they were there and the media could not do anything.  Then Macron – what did Macron do?  These guys are not coming into my press conferences because they are not journalists – and he said that next to Putin – and said, like, when they behave like journalists they can come.

The second thing that I think works in election time is what Theresa May did during the Skripal poisoning crisis.  You know, as she was receiving the information she created a crisis – like a war room, and they were releasing the information in real time as they were getting it.  So you saw the photos of these people.  You saw when they came from St. Petersburg, who they were, what hotel they stayed in, what their names were, what their passport was.  And you know, the media and the people who were getting the information to counter – I mean, I counted 20 different versions from Russian media on who poisoned Skripal from, like, overdose to he was not poisoned, he didn’t exist –

GONZÁLEZ: His mother-in-law.
ALANDETE: Yeah, his mother-in-law.
GONZÁLEZ: His mother-in-law, obviously, right?  Yeah.  (Laughs.)
ALANDETE: So, just to summarize, like, I think there is something that is really good for fighting disinformation.  And you may agree with him or not, you may be for Trump or against Trump, but the Mueller report on the Internet Research Agency is a damning indictment of what Russia has been doing for the past year.  And devoting the resources to actually analyze this, research this, and actually see it, you know, I think that has done more for the fight against disinformation than anything that the European Commission has done.  But I’m European and I have to criticize the European Union.  It’s like something that we do here.  (Laughter.)
GONZÁLEZ: That you can do here.  (Laughter.)  That’s wonderful.

I do want to open it up to the audience.  I will ask Alex to take my seat.  I’m going to move up.  If I can get an extra microphone up here, just so that you get all of the fun.  And just raise your hand if you have a question and we have wonderful folks here with mics.  And don’t be scared.  And we’ll take them in Spanish, too, all of the options.  Perfect.  No questions?  Did we solve the problem?  OK, I have a question here.  A question here, I got it.  Oh, and please make them questions.  (Laughter.)

Q:  Well, first, thank you for the conference.  I have two quick questions.

One, for Mr. Alandete:  Do you think – (audio feedback) – do you think it’s –

GONZÁLEZ: You’re good.

Q:  Do you think it would help with the problem of fake news if, for example, every media outlet had to open up all the funding it receives, including the advertising funding it receives from private companies?

For Mr. Ambassador, I have a question about something in your country.  Do you have to register to vote?  Do you think we could also address the problem of influence on people voting by requiring a quick test to be able to vote?  Because now we have to take tests for driving, to work with a lot of machines, to get a lot of certifications, but nobody talks about a test – an exam – to be able to vote.

GONZÁLEZ: That’s a great historical question.  So we have one on transparency and we have one on taking a test before you vote.  Go ahead, David.
ALANDETE: OK.  So, actually, I think this type of criticism of, like, these dark interests behind every media company, you know, like advertisement, that serves the purpose of disinformation, because, I mean, I don’t need to know who funds RT, Sputnik, HispanTV, or other disinformation endeavors because it’s so obvious and so transparent.  When it comes to media, you know, like, The Washington Post has one owner.  New York Times has many owners.  CNN, the same.  But they actually comply with standards, and that’s what doesn’t happen on the other end of the spectrum, you know.  Like, when CNN makes a mistake, when The Washington Post makes a mistake, I see corrections, letters to the editor, an ombudsman or ombudswoman.  These other outlets don’t do that, you know?  Like, I’ve told you 20 different versions of how Skripal was poisoned.  Twenty different versions of, like, the Ukrainian war.  So when you actually comply with this, why?  Like, you know, like, there are public and private media.  If you actually comply, I don’t think you should be, you know, like, held accountable to some type of, like, investigation of, like, dark money and dark interests being behind every newspaper, because that’s the end of journalism.
GONZÁLEZ: Nico, do you want to add anything on the dark money piece of that?
DE PEDRO: Oh, yes, I want to emphasize what David was saying about this transparency.  Transparency’s fine, but I have been working the last 15 years in the think tank sector and there is always this understanding that there is a clear and well-established connection between who is giving the funds and what think tanks or the media are producing, and I challenge this idea.  So there is no direct connection.  Actually, in my career, which is not that long but in these last 15 years, the only occasion in which I have faced problems related to someone manipulating my text has been when I published with this Russia Beyond the Headlines in 2010 or ’11.

(Audio feedback.)  Oh, it’s too cold, I mean.  (Laughter.)

They basically – they changed the title of – when they asked me for a piece, so I wrote a piece, and they changed the title and it was completely misleading.  And normally people read the title and maybe two or three highlighted sentences, but not the whole article.  This happened with them.  And the other occasion was when a correspondent of one Russian media outlet asked me a few questions, and then she was unhappy with the answers and they told me, oh, look, this is too critical of the Kremlin, I cannot publish this, so can you change your answers?  (Laughter.)  And, sorry, no, I’m not – I’m not changing the answers.  And she said, look, then we will not publish this.  I said, fine, that’s up to you; it’s not my business.  But it never happened to me anywhere else, and I have published in almost all Spanish newspapers and in Spain’s main think tanks.  And I never faced this problem of someone telling me what I can say or not.  But yes, that’s the money.  Sorry.

And one quick point related to this media transparency.  This is getting more sophisticated.  Now what we are seeing is that Russia Today and Sputnik are the two big aircraft carriers, yeah?  And then they have subordinate media or connected media, and the financial connection is not that clear.  But there is an organic connection: people working at RT and then, all of a sudden, plop, they have a startup media outlet starting in Germany.  And all these people are coming from there; it is unclear where the money’s coming from.  But all of a sudden you have young people, normally, with very sophisticated and very professional equipment, et cetera, traveling to some interesting places, and they are reporting very much in the line of Russia Today or Sputnik – basically, on the line of the Kremlin.  But this connection isn’t clear.  So this is a serious problem.

And the second one, of course, is that the West is vulnerable to money, and this is a problem affecting everyone.  Money coming from authoritarian regimes channeled through to think tanks or – that, of course, is a problem.  Dirty money in Londongrad is a serious problem.  And it’s very difficult to tackle.  So the British have a clear understanding that this is a problem, but how to tackle it is very difficult.  And of course, there are significant interests around that.

What we are seeing also from our institute and our research is how many law firms they are hiring.  And then – (laughs) – this is a part of their lawfare endeavor.  So you hire the best or very good professional law firms, and then you start playing and knowing others in the country.  So it’s really challenging.

So I don’t have answers – good answers.  (Laughs.)  Sorry.  I’m sorry.

GONZÁLEZ: That’s really bad.  I told them we were going to solve this today.  (Laughter.)

Ambassador Fried, let’s talk about transparency – oh, sorry, talk about voting and whether or not a test on voting.

FRIED: Well, tests on voting have a long and bad history in the United States because they were used, essentially, to disenfranchise African-Americans starting after the end of Reconstruction following our Civil War.  For that reason, there is an allergy in our system against any such tests, and there is now a domestic debate in the United States about how much identification voters should be required to submit in order to vote.  And that debate is colored by our bad history.  So I doubt that we’re going to go very far down that line, and there is evidence that there is still bias in some of the demands for documentation for individual voters.  So you will have a fierce debate in the United States – in fact, there is a fierce debate about some of these issues right now, but it’s colored by our history.

With respect to transparency, I think the issue of – I think the issue of sites that are actually controlled by RT or Sputnik in a hidden way ought to be the focus of transparency requirements.  I think regulation to force disclosure of the actual identity of the ultimate funders of a site ought to be part of a normal digital and maybe ordinary media climate.  So I think transparency rather than content controls.

Look, I don’t disagree with a thing you said about RT.  And you’re absolutely right, I think the alternative Russian explanations for Skripal peaked at about 22 or 23, right?  But rather than ban RT, I think labeling them – as we have – as a foreign agent or doing what Macron did is the way to go.  They ought – they can be anathematized without being banned.  There can simply be – they can have a kind of mental red line around them the way Pravda did during the Cold War.  It didn’t happen at once, but it happened eventually.  And I think Alexandre Alaphilippe has told the story of how the exposure in France of the Russian hack of the Macron campaign turned into a bigger story than the actual hack.
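
(Editor's illustration of the "label, don't ban" approach described here: items from outlets on a hypothetical registry of state-funded media stay in the feed but carry a disclosure label.  The registry entries, domains, and label text are placeholders, not real policy.)

```python
# Illustration of "label, don't ban": items from outlets on a hypothetical
# registry of state-funded media stay in the feed but carry a disclosure label.
STATE_FUNDED_REGISTRY = {
    "example-state-outlet.test": "Funded in whole or in part by a foreign government",
}

def annotate(feed_items):
    """Attach a disclosure label where the source is on the registry; remove nothing."""
    labeled = []
    for item in feed_items:
        label = STATE_FUNDED_REGISTRY.get(item["source_domain"])  # None if no match
        labeled.append({**item, "disclosure": label})
    return labeled

feed = [
    {"title": "Story A", "source_domain": "example-state-outlet.test"},
    {"title": "Story B", "source_domain": "independent-paper.test"},
]
for item in annotate(feed):
    print(item["title"], "->", item["disclosure"])
```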

Which brings me to another point:  transparency is not going to be simply derived by regulation; it will also be uncovered by active and even aggressive civil society.  Private groups will be able to expose Russian and other disinformation campaigns.  EU DisinfoLab is private.  East StratCom from the EU ought to be supported.

And in case you think I’m naive, I don’t think the Commission’s voluntary code of practice is the end; I think it’s a decent beginning.  You can look at the holes, and you’re probably right to point out the weaknesses.  But I’m lucky – I feel, myself, lucky that anybody’s doing anything.  Maybe it’s a question of expectations.  (Laughter.)  Anyway, that’s another discussion.

GONZÁLEZ: We’re going to keep taking questions, but I want to make sure that we don’t forget the demand side, right?  Because we were talking – you know, there are studies that say if you label something and it says it’s false, we’re five times more likely to click on it because we want to find out what the false news is, right?  Or if it’s sensational, you’re curious, so you click on it.  That’s what drives, you know, some of these things.  So, you know, we can – we can implement some solutions, but still think about everything else that we have around it.

So we have a question here and I’ve got you third.  And I think I’ll take a couple of questions.  So there’s one back here.  Oh, OK.  Go ahead.

Q:  First of all, I would like to thank everyone for the panel.

Disinformation is obviously a very big problem, but what I’m saying is more of a – somewhere between a remark and a – and a question.

GONZÁLEZ: Questions only.

Q:  It’s a question.  I end with a question.

GONZÁLEZ: OK.  OK.  I will – I will stop you.  OK.

Q:  (Laughs.)  I’ve got the feeling that it might have sometimes been a bit one-sided.  We talked about Russia, the USSR, Putin.  I think almost everyone mentioned them at some point, which led me to conclude that perhaps it’s more about hybrid warfare than about disinformation.  Because, obviously, while a problem – you’ll never hear me say that Putin or that Russian disinformation isn’t a problem.  I think we can all agree that it is.  You only need to look at our elections, as I think you said, the Trump report – or the Mueller report and the congressional reports in the U.S. show it.  But no one mentioned Trump.  I don’t want to make this into a political discussion at all.  I think we would do well to avoid it.  But it’s pretty clear that he disinforms on a consistent basis as well, spreads lies that are easily verifiable.

And so my question to you – and you don’t have to stop me – is:  How do you see us regulating ourselves?  I don’t know if it’s an expression in English; I know it is in Dutch:  How do you see looking in our own bosom about addressing this problem rather than only focusing on the foreign aspect?

ALANDETE: I have something to say.  I work as a journalist now in Washington, D.C., and whenever the president of the United States or anybody in the United States capital, the State Department, tells a lie, I can write about it and I can fact-check it.  That doesn’t happen in Russia, period.  It doesn’t happen.  It doesn’t happen in Venezuela right now.  It doesn’t happen in Iran.  And I think, like, a political statement is different from disinformation.  The president may tell lies, may, like, you know, like, manipulate the facts.  Then we can fact-check it.  Like, The Washington Post and other newspapers are experiencing the Trump bump.  You know, like, a lot of people are reading them and a lot of people are going there.  But it’s – I think it’s fairly different, you know.

And, yes, hybrid warfare, you may correct me, Kadri, but you know, like, there’s a full, like, Russian doctrine about it, right, like from 2013.  But I think there’s a difference there.  Sorry.

GONZÁLEZ: No.  Go ahead, Alex.
ALAPHILIPPE: Yeah, I think it’s a very good question.  Thank you.

I think right now the question we have to ask ourselves is how we as a society – and when I say “we” it’s everyone in the room – individually and collectively can deal with this.  I invented this this afternoon, so I’m testing it with you.  It’s like the five fingers of the hand, you know?  A democratic society is made up of NGOs, public powers, academics, journalists, and companies – basically, those are the five main pillars.  And these pillars can work together or not, but at least they are independent.  If we – if we shut down everything, we don’t have a hand; we have a fist.  And a fist is made to fight.  I don’t think we want to fight with each other.  So I think we need to be sure that in the way we are acting together we are keeping this discussion alive and we are keeping checks and balances between everyone, because if we don’t do this we’ll not be able to be resilient enough, as Ambassador Fried said.  We need to be resilient together inside societies because we might face threats from outside – and I don’t name any powers, because everyone can play on this, from China to Iran to Venezuela to the U.S. to anyone – or from particular internal powers.  You have now movements that are ready to lie and to cheat to win at the expense of democracy, and that’s also a big issue.

So we just need to be sure that these five pillars are well together, and that’s the most difficult part for us.  We need to be sure that we can trust each other because if we don’t have this, we don’t have trust, and then we are just manipulated by emotion again.

GONZÁLEZ: Mmm hmm.  Kadri, do you have anything to add on this in terms of the resilience piece?
KASKA: Yeah, I was thinking about the issue of disinformation versus hybrid warfare – isn’t hybrid warfare rather a fig leaf of a sort to justify interference in our democratic processes and systems rather than a genuine – (audio feedback, inaudible)?  Because I’m rather glad to see that the – (audio feedback) – sorry.  (Laughs.)
GONZÁLEZ: Keep going.
KASKA: – that the West hasn’t been overly keen to buy into sort of the concept of hybrid warfare.  We are pretty certain in our – in the – you know, the rules of the game.  We know what international law considers as conflict and the rules that apply in conflict and in peacetime, and we have stuck to our values in that regard and not gotten carried away too much by whatever way Russia is labeling it.
GONZÁLEZ: Wonderful.  Thank you, Kadri.

I have a question all the way in the back and then I’ll move – I had you first and then you third and fourth.  Don’t worry, I’m keeping track.

Q:  (In Spanish.)

GONZÁLEZ: (Translated.)  Yes, there is no problem.

Q:  (In Spanish.)

GONZÁLEZ: (In Spanish.)

(Continues in English.)  A reminder that you can use #DisinfoWeek to follow the discussion.

Q:  (In Spanish.)

GONZÁLEZ: (Translated.)  Yes.

Q:  (In Spanish.)

GONZÁLEZ: (In Spanish.)

(Continues in English.)  Does anyone –

DE PEDRO: (In Spanish.)
GONZÁLEZ: Yeah.  All right.  We’ll have to say it in English.  I mean, yes, the conversation has – you’ve mentioned Russia quite a bit.  But we’ve also mentioned resilience a lot, because the goal here is not to, you know, just address the actors.  It’s actually to become a more resilient society so that, whether it be state or nonstate actors – you know, China, Iran, or far right, far left – we are prepared for any challenge that comes our way.  It’s not about going the whack-a-mole way.  It’s about, you know, making sure that we’re stronger.  We have one over there and then you – I’ve got you, and then I’ve still got you.

Q:  It works?  Yeah.

GONZÁLEZ: Yeah.

Q:  Good evening.  Today we have heard and seen that, basically, the producers of fake news appeal to people’s emotions, and we have to say that the outcomes of their work are pretty successful, because people actually trust this information, they consider this information, because it seems that fake news is on hype.  It is attractive.  It is extravagant.  It actually attracts the attention of the recipients.  And in this light, it seems to me, at least after this conversation today, that reliable information and the truth is just boring for the final recipients, let’s say.

So my question would be: do you know, or maybe could we invent, some ways of making truth and reliable information attractive to the recipients?  And maybe the techniques that producers of fake news apply to their work we could apply to providing and disseminating reliable information.  Thank you.

GONZÁLEZ: Thank you so much.  I will take a second question, if we can go – (translated) – please, yes, here.

(Continues in English.)  Making truth attractive again kind of thing, sexy again.

Q:  Thank you so much.

GONZÁLEZ: Yeah.  Yeah.

Q:  Yeah, I go a little bit on the line of the previous question.  I mean, I think the world is getting much faster in the sense of massive information we are having.  So the attention span of the people is getting shorter and shorter and shorter.  So the battlefield is changing, in that sense.  So I don’t know – I mean, it seems these bots are attacking our lizard brain or the lizard part of our brains, so getting out the – what you said before about the amount or the reflection and all that.  So how are you thinking that we can fight against that?  Because it seems that’s going that way and seeing the younger generation – our kids – it’s getting even shorter term than longer term.  Thank you.

GONZÁLEZ: Alex, do you want to kick off some?
ALAPHILIPPE: Yeah.  So two things.  I think on the – on the first question about how to make truth reliable again, I think we have right now a problem that we don’t have enough news for the news consumption we have and we are connected online 24/7 and you are following things and you don’t have enough things to consume.  Even if you’re the greatest Real Madrid fan of, like, all the planet, you will never have enough information for you.

So I think the strong issue that we face right now, especially on big and complex issues, is that we cannot have fresh news every 30 seconds just by pulling up the app.  So we need to understand this.  So maybe we need to find also – I don’t know, that’s a – (inaudible) – open to – (inaudible) – how to open a reward for sharing better-quality news.  That’s something we could – we can invent and try.  If we can’t get rid of the – of the reptilian brain, maybe we can use it in a better way.  So I think that’s maybe one other thing.

The second question was about –

GONZÁLEZ: You know, you get so much attention, your span is very short, and so how do you actually battle against people, you know –
ALAPHILIPPE: Yeah.  So one other thing that I have read and I try to do sometimes when I’m seeing something really awful: I try not to click, and I try to do this real painful thing, which is to wait 30 seconds before my screen not doing anything.  I can assure you 30 seconds before your screen not doing anything is a long time.  And then you try to ask yourself, OK, why do I see this?  Why am I curious about this?  Why do I want to click?  Why do I want to react?  How much do I know about this issue and how much do I trust my own opinion about this issue?  And who can I trust on this issue?  Who can I ask something?  Where can I check the information?  If you do this, you already have, I think, a built-in resilience that you can use and that can help.  And if you are doing this for every news item that you find curious, I’m pretty sure there will be fewer retweets and fewer clicks for some things, and that might be the first step to apply individually.  So I think there are solutions that need to be taken collectively, and we need to do research.  We need to have more journalists, more fact-checking, and many more initiatives, and I think society is the right place for this because that means ordinary citizens are taking a stand on this and they are developing new solutions.  That’s very good.  But I think individually we also need to understand what is going on with our brains and try to behave and try to have ethics in how we behave online.  There is this super quote I found today:  You don’t do things on the internet.  You do things.  The internet is not a different space.  It’s just the same space.
GONZÁLEZ: Mmm hmm.  Yeah.  Go ahead, Kadri.
KASKA: Regarding your question of making truth attractive, I have unwavering faith in truth being attractive in its own right, I mean, at least in the longer perspective, because there are limits to how much people are willing to consume fast food before their bodies start to break down, and, I mean, you cannot make or build a life upon entertainment, and I would say that, still, society by and large gets this.  And in particular, looking at younger people – I have two nearing adulthood at home – I would say that they are surprisingly capable of distinguishing crap from the legit stuff.

So the challenge, I would say, is making sure that the skills to produce and consume legitimate stuff remain while the masses are sort of running loose on consuming entertainment or information that qualifies as fast food – making sure that, even in the era of the attention economy, platforms where you can get reliable news and reliable information remain.  And that can be seen as a public function as well.

GONZÁLEZ: Alex, you had a short intervention.
ALAPHILIPPE: Just a very short thing, because we talk a lot about youngsters and media literacy.  I’m not going to have many friends around here, but if you know people that are over 60 on social media platforms, I think in terms of fake news they’re the worst.  The email chains and WhatsApp chains I receive from people who are between 60 and 70, 80, are full of disinformation, and I’m sorry, but the age range between 60 and 80 goes to the polling station at every election.  So the risk for democracy is very high.  So I would encourage everyone to take a media literacy class, from 5-year-olds to 95-year-olds.
GONZÁLEZ: There are challenges with that.  David, do you want to – yeah.
ALANDETE: I actually – there is really good research on the reasons for disinformation and the spread of disinformation from the Massachusetts Institute of Technology.  It’s from last year and it’s probably the best piece I’ve read on how bots are not that important.  I’m sure you’ve read about it – like, 90 percent of disinformation travels through psychological means, and there are two important things that journalists should take into consideration.  One is novelty in the headlines – you know, something new is something that you’re going to share.  And, of course, like, it has to be new but true because, I mean, if I say, like, tanks in the streets of Barcelona then – like, you know, that’s a real piece that was published – it’s new but it’s a lie, right.

And the second one is status, and that has to do with your question.  You know, like, you actually show your followers in social networks that you have a high status because you know something that they don’t know, and then you share it because it makes you look good, and that has a lot to do with education and how the social media are built.  You know, like, by searching – that’s why Google is, like, better – not perfect, but better, like, at containing disinformation, because there is no status there.  But when you go to Facebook, and especially when you go to WhatsApp and the information that you forward, you know, that element of, like, showing status and, like, showing things that other people don’t know explains a lot, you know, and explains, like, how disinformation travels that fast.  And I think we should bring the big platforms into this conversation because they have a responsibility, you know.  And I’m very happy that Mark Zuckerberg went to testify to, you know, the Capitol.  He didn’t go to the U.K. Parliament but he went to the European Commission.  But they should be involved in policymaking, you know.  I know.  (Laughter.)

GONZÁLEZ: OK.  Now, we started a little bit late so I’ll take two more questions and that’s it.  So I have one here and one here, and I’ll take them one after the other, please.

Q:  (In Spanish.)

GONZÁLEZ: OK.  You got that?  But wait.  Wait.  Wait right here.  Yeah.

Q:  Yes.  Thank you.  I think David just spoke a bit about the question that I had, and it goes to: would we even be here if it weren’t for Facebook and Twitter having had so much success over the past six, seven years?

GONZÁLEZ: OK.  So we start with “libertad de expresión,” so freedom of expression, and then the different lines.
ALANDETE: Just a brief comment on that.  Like, that’s the ministry of truth, right, and, of course, I’m a journalist.  Freedom of speech has to be protected but, you know, like, not all speech is the same.  You know, like, there are limits, you know.  You cannot just publish anything.  I mean, you need to be held accountable somehow, you know.  Like, if I commit perjury or if I, you know, spread not only disinformation but, like, hate speech, you know, like, I think, of course, there are limits.  You know, like, I think that’s a fact.  And Twitter and Facebook – I honestly think it’s not one social network or the other.  It’s not the content.  It’s the format, you know.  Like, it’s the fact not that the internet was invented but that the cell phone was introduced and we have, like, the internet in our pocket 24/7.  So I think you could change Twitter and Facebook but, you know, there’s Snapchat.  There’s, like, whatever.  There would always be something, I think.
GONZÁLEZ: Wonderful.  Anyone else have anything to add on this?  Ambassador Fried, do you want to close us out?
FRIED: I think that the social media companies existed for a long time in their version of a post-national paradise where they were going to usher in a new age of a completely democratic world.  Well, they may have thought of themselves as creators of a new – of a new utopia.  But to the Russians, they look like suckers, and they look like suckers to me.  They need to be responsible for what appears on their – on their platforms.  That does not mean content controls.  It does mean transparency and authenticity, and they should be made to account for what they do.

The social media companies have gone from denial of the problem and ridiculing those who thought there was a problem to wanting to be seen as part of the solution.  I’m not sure to what degree this is sincere.  I think it is mixed.  But they’re responding to an incentive structure which – over which we have some control.  The combined leverage of the European Commission and the United States is sufficient that social media companies will respond to incentives.  In other words, if we get our act together we have the combined power to make them behave better, which, in my mind, means working in the integrity of service space rather than the content control space.

It means exposing the front companies for RT and Sputnik.  It means labeling bots as bots, labeling trolls, labeling fake sites that are pretending to be American or Spanish or French – but are, in fact, Russian – as such.  And then there are the more complicated issues, as I said earlier, of getting into algorithmic bias, pushing things to our lizard brain.  OK.  That’s a tricky issue.  But if we start breaking down the problem we can deal with it piece by piece, and we need – we have the ability to do so, combining our knowledge and experience on both sides of the Atlantic.

And, finally, the heroes of this are going to be the independent civil society activists.  Alexandre Alaphilippe and like-minded groups in the United States, in Europe, the Baltic Elves, StopFake – these groups are the ones who are going to be much better than old bureaucrats such as myself at uncovering and exposing disinformation operations.  And then the task is to get society sensitive enough so that a disinformation op is rendered anathema and people start to turn away from it, start to shun it.

Now, easier said than done.  But I will argue, and have been arguing, that the policy solutions do exist.  It is the purveyors of disinformation or those who profit from disinformation who want to convince us that there are no solutions.

GONZÁLEZ: Wonderful.  And with that, thank you so much for joining us.  (Applause.)  A brief – a brief reminder that this is just the beginning of the conversation.  We are headed to Brussels next, where we will have two full days in which we’ll pick up on every single one of these elements and expand on them.

Thank you for joining us, thank you for listening to us, and come talk to the speakers and get more questions.  Thank you so much again.  (Applause.)

(END)