Mapping the last decade of Russia’s disinformation and influence campaign in Ukraine


Event transcript

Uncorrected transcript: Check against delivery

Speakers

Andy Carvin
Senior Resident Fellow, Managing Director, DFRLab, Atlantic Council

Ksenia Iliuk
Co-founder, LetsData

Roman Osadchuk
Research Associate, DFRLab, Atlantic Council

ANDY CARVIN: I’m really excited to have all of you here today. Sorry if there was a bit of an echo there for a moment. Today I’m honored to have Ksenia Iliuk joining us. She is co-founder of LetsData, a Ukrainian research company that uses AI to detect and monitor influence operations. Previously, she was head of Detector Media. We’re also joined today by one of my colleagues from the DFRLab, Roman Osadchuk, a research associate who focuses on Ukraine.

So today we’re getting together to discuss the case study of Russian disinformation and influence operations in Ukraine. We’re now about fifteen or sixteen months since the February 2022 invasion of Ukraine, but of course it goes back much further than that, with Russia annexing Crimea and invading eastern Ukraine in 2014. And in the years between 2014 and the reinvasion last year, we saw countless instances of Russian media, Russian influencers, and Kremlin politicians painting Ukraine as the aggressor, painting Ukrainians as Nazis, presenting Ukraine as a country that needs to be stopped.

And in the months leading up to February of last year, despite an enormous amount of mounting evidence that Russia was moving troops and armaments into place to conduct an invasion, Russia continued to amplify these narratives, basically trying to justify a war of aggression while denying any responsibility for what was about to happen. After the invasion, Russian information operations have of course continued in a variety of forms, not only targeting Ukraine but also targeting Russian citizens to secure and maintain domestic support for the war, as well as targeting countries and regions all over the world to undermine support for Ukraine and undermine Ukraine’s morale.

And to some extent, Russia has had successes, but they’ve also had a lot of duds. They’ve also had a number of instances where these campaigns clearly have not worked. And so today we’re going to take a look at Russia’s efforts to undermine Ukraine since the invasion started, but also before that as well. And I’d love to start by giving the floor to Ksenia to talk a bit about her work and some of her findings.

Ksenia, the floor is yours.

KSENIA ILIUK: Thank you so much for having me today. I’ll start by saying that through all these years of analyzing malign information campaigns, and especially since the start of the full-scale invasion, what became very clear to me, as an analyst in this field and as a Ukrainian, and to fellow Ukrainians, is that malign information campaigns are not something far away from us. It’s not just about politics. This is something that can literally kill. That very tough realization dawned on a lot of Ukrainians. As of now, according to different studies conducted by Detector Media and other Ukrainian nongovernmental organizations, over 80 percent of Ukrainians consider disinformation a threat. And that is because people saw how these threats play out in their everyday lives.

We see how, through all of these years, from the first invasion and the annexation of Crimea to the full-scale invasion, Russia has been using its information influence activities to pursue its geopolitical and military goals. It’s hard to admit, but all these years of analysis showed that Russia had been preparing different audiences in different countries, from the information perspective, for the full-scale invasion.

I will focus a bit more on the start of the full-scale invasion and the time since, and on how Russia has been targeting different countries worldwide with its malign influence. Here we have data: we’ve been analyzing the information space around Ukraine in forty countries, from Brazil to Japan, and we have noticed some quite interesting things that help us better understand how malign information influence operates, what its strong and weak sides are, and how we, as democratic countries, can build a way to resist that is based on democratic principles.

The first thing we saw quite clearly is a geographic pattern: today, the farther a country is from Ukraine, the more intense the Russian malign information campaigns. With the first wave, at the start of the full-scale invasion, Russia focused specifically on Europe, the US, Canada, and partly Australia.

We’ve seen different polls showing that support for Ukraine among the wider populations of those countries is quite high. On the other side, most of what was done in terms of regulation, like banning [Russia Today], and most of the social media platforms’ own measures were focused on Europe and North America. That is why we see Russia acting differently across the rest of the world.

What they are doing elsewhere is continuing to develop their information infrastructure, which has remained essentially untouched since the start of the full-scale invasion. For example, we reported that about 20 percent of media publications concerning Ukraine were using Russian state-affiliated media as their primary source of information, the exact same outlets that were publishing articles with open genocidal rhetoric, with open calls to basically slaughter Ukrainians and justifications for doing so.

So we see the information infrastructure, and unfortunately, in that regard Russia remains quite strong, especially the farther into the Global South we go, where RT functions as if nothing happened, continuing to push open lies. It’s no longer even the subtle malign influence where they manipulate the context a bit; to this day, most of RT’s content just outrageously portrays reality in a completely different way.

When we look into the details, on one side we have how Russia uses its information infrastructure to push its material into the discourse of the media, and on the other side we have social media. What we see there, in terms of topics and how Russian malign influence uses them, is that they go very hyperlocal.

What they are good at is exploiting the historical background of countries, basically the pains of different countries, and attacking those. For example, in most parts of South America, Russia is exploiting anti-US sentiment, building numerous conspiracy theories to create the image that the Russian war against Ukraine was allegedly started by the US.

And it’s very interesting, because they tailor different messaging to different audiences within those countries. For one audience, they would claim that the US wants to dominate the world again and wants to use Ukraine to destroy Russia. To another audience, they would push the claim that US elites are simply profiting from the war and that this is allegedly the main reason for it. So they constantly bring in these various conspiracies built on a general anti-US sentiment that is, in reality, quite present and widespread in the region.

The other quite interesting thing, again in most countries in Latin America, particularly Argentina and Brazil, is that Russia is heavily pushing the topic of corruption. The topic of corruption is something people in these countries know and have faced; they consider it a big threat on their way to prosperity and development. So Russia takes any case of small, minor corruption, or invents cases of corruption, and promotes them to this audience. And in the way they distort reality, they will take one case and try to pin it to a very big conspiracy: you see the level of corruption in Ukraine; that means Ukraine is a failed state; it cannot exist… They lead the audience toward the conclusion that Ukraine cannot exist, so why should Ukrainians fight? They had better just give up and go to Russia; they are not a country anyway. These are the kinds of speculations being pushed over there.

However, there is also a rather positive side here: in media monitoring in Brazil, for instance, analytical materials and various pieces about how Ukraine is actually fighting corruption, in the middle of an invasion no less, got lots of traction. So we see how much depends on how you shape and frame the topic. And unfortunately, Russian malign actors frame it with very ill intent, to distort reality and push very simplified conclusions about what should be done about Ukraine.

Overall, we will talk more about the tools for resisting this. But I would say that what we see so far, especially in the Global South, is that Russian information infrastructure has remained untouched; moreover, they are spreading it and developing it even [farther]. We see some very concerning developments in the usage of Telegram, with anonymous Telegram channels being replicated in different countries under different models by malign actors. And in terms of content, we see a heavy reliance on audiences’ lack of context. Combining those two things makes the situation very dangerous for different audiences out there.

Thank you.

ANDY CARVIN: Thanks. Thanks, Ksenia. I really appreciate it.

Roman, let’s turn it over to you now.

ROMAN OSADCHUK: Thank you, Andy. It’s a great pleasure to be here with all of you today.

So I will touch on a few things that we at the DFRLab have done, including work from a while back. We presented two big reports that look at the whole year since the invasion. One of the reports was called “Narrative Warfare.” It investigated the rhetoric in pro-Kremlin media in the months around the start of the full-scale invasion. What we found was that they were amplifying the messages of Russian officials and amplifying negative rhetoric, building up this negative intent toward, and perception of, Ukrainians. They also amplified multiple false-flag operations, because Russia was in desperate search of a casus belli. They didn’t find one, but they tried a lot: different operations, different disinformation campaigns trying to portray Ukraine as the aggressor when, in reality, we all now know that Russia is the aggressor that started the invasion, annexed Crimea, and was in eastern Ukraine in 2014.

What we also found was that there was no actual evidence of coordination per se. When a media outlet published messages citing a specific official, it doesn’t mean they coordinated on that; they were simply amplifying their officials. But because they kept using more escalatory rhetoric, it became more evident that something was truly underway.

And what is even more interesting is that many of those disinformation narratives that started back in 2014 got amplified during the winter of 2021-2022, and many of them ended up in Putin’s speech. Basically, the things the media had vocalized and amplified for their audiences in Russia ended up in those speeches, and the whole world saw them. It is really striking that those narratives were the basis of the speech itself.

The second report, Undermining Ukraine, is slightly different, because it looked at the different tactics that Kremlin and pro-Kremlin actors used all over the world, including in Russia, Ukraine, Europe, South America, and Africa. What we found, first of all, is that the Russians did not abandon their old tactics and toolkit. Take, for instance, the overinflation of the information space, the same thing they did after the March 2018 Skripal poisoning; they continue doing it now. When there is evidence of war crimes by Russians, they try to come up with as many explanations of what happened as you could possibly imagine, so that people who are not following closely see the truth as contested and never actually reach a final judgment or an understanding of what’s going on. That is the main thing Russia aims for.

The second thing is the usage of conspiracies. These are actually gaining traction in some parts of the world among conspiracy-leaning audiences, for example the claim that some parts of the war were filmed in a film studio. It’s not true, but such claims gain traction; they are amplified, and they are readily consumed by some audiences.

Now, on atrocities. I already touched on this, but there is a clear pattern: Russia tries to avoid any responsibility and to deny basically everything its troops have done on Ukrainian soil, from bombing maternity hospitals to mass killings of civilians, and the list goes on and on, unfortunately. They always try to escape this by shifting attention to something else or shifting the blame onto Ukrainians.

Another thing is the usage of state outlets, as Ksenia said. RT and Sputnik might not be that active in Europe, but they are still pretty active in other parts of the world; they are still working there and still have their audiences. Moreover, in some places we found that even where Sputnik or RT are banned, some actors and smaller channels reappear and rebroadcast what those outlets are trying to promote and show their audiences.

But there are also some new tricks. One is false fact-checking: the notorious “War on Fakes,” which uses the fact-checker’s toolbox to falsify the truth and promote propaganda, runs a website in multiple languages, and it is shared quite a lot by Russian embassies and the Russian ministry of defense, for instance.

Another thing they do is simply invent things, presenting, say, an information campaign as something Ukrainians launched and marketed when in reality it never existed. They have also hacked media websites to gain credibility. Ukrainian media websites were hacked, and in at least one documented case they planted a story, archived it, and then used it as evidence: look, Ukrainians reached the same conclusions we did. In reality, that story was not published by Ukrainians; it appeared because their website was breached.

Another thing is the multistep approach. Narratives are told for months, some even for years, so that they can be used again and again, sharpened, and given a more solid-looking basis, becoming more serious in the eyes of a person who sees them for the first or second time. This building on previous messages creates the illusion of a well-developed topic, when in reality it is based on bogus claims. When you see fourteen different links to different stories, you might perceive that there is something behind it. In the end, none of them are credible, but that is hard to check.

And the main finding, probably their main success, is that they aim to reach mainstream media elsewhere, not in Russia but among other audiences, in order to reach even more of the population in the targeted countries. They cannot reach everybody themselves, but the mainstream media in the respective target countries can, and that is one of their objectives. Another thing we’ve seen is identity theft from well-known Western media. They steal the identity and visuals of the BBC, Frankfurter Allgemeine Zeitung, or Al Jazeera to promote their conspiracy stories and videos under Western branding. In reality, those outlets never prepared the materials at all; they are forgeries.

Russians post everything from military plans to supposed dark-web pictures claiming that Ukraine is selling Western-donated weapons, when in reality that wasn’t the case… And as Ksenia said, they use a region-specific approach, crafting their messages for specific audiences in specific countries. And, again, whataboutism is a widespread thing.

They are always trying to put the blame on somebody else, claiming they are not the first to have done something, and therefore they should bear no responsibility. They also try to find useful actors on the ground to serve as their foot soldiers, in the sense that those actors start narratives that Russia can then use further. So they look for local actors who will spread the initial claim, or who will simply amplify their messages for specifically targeted audiences.

And it continues now; it doesn’t end with our reports. We are continuing our work, as are many other researchers, and there are many more campaigns ongoing at the moment. As of now, they are actively trying to undermine Ukrainians’ trust in their military and political leadership, so that Ukrainians stop supporting the government and become discouraged from fighting and resisting.

Another thing is that they are trying to undermine support for Ukraine. They are launching ads on Facebook, surprisingly, with caricatures and links to websites that are copycats of the real websites of established media, claiming that you should not help. Two days ago I saw a thread about this targeting Israel; there were ads targeting Israel. And I’ve seen similar ads in Ukraine, targeting Ukrainians. So there is definitely a pattern there.

They also amplify the supposed futility of Ukrainian resistance, claiming that any fight against the Russian army would be futile, so you should give up. And, unfortunately, the covering up of war crimes continues. It usually runs on cui bono: who benefits from this? For any horrible event that ends up in the media, they will create up to twenty different stories about who would benefit from it: it is not beneficial for Russians to commit atrocities in Bucha or bomb a maternity hospital, so it was probably staged. The reality is far different, and the evidence tells otherwise.

There will be more campaigns in the future, so we need to keep watch, because the Russians are playing a long game. My main point is that we need to continue this work and keep a close eye on what they do in the near future as well.

Thank you.

ANDY CARVIN: Thank you, Roman.

So both of you discussed how Russia weaponizes information in different ways around the world. I’ve sometimes heard media pundits discuss the information war taking place between Russia and Ukraine as if there were a singular argument between the two countries and a single group of narratives targeting each other. But it seems like what we’re really talking about are theaters of operations around the world where different types of information warfare are taking place. In Western Europe, you might see campaigns attempting to get people living in EU or NATO member states to see supporting Ukraine as economically detrimental; while, as Ksenia mentioned, in South America anti-imperialism and America’s history in Latin America are often used to frame the debate; and we’ve seen the same thing in West Africa as well, targeting French support for Ukraine.

So there really isn’t a single information war there. They’re really tailoring it globally. How do you combat that?

KSENIA ILIUK: That’s a million-dollar question, I would say.

So, first of all, I think the very important thing that is lacking across all of the geographies we analyzed is threat awareness: threat awareness and the readiness to confront the threat. It may sound like, oh, but we do acknowledge that Russia is running malign information campaigns. No; true threat awareness has to extend across different decision-makers in a country, from state institutions down to the everyday lives of everyday citizens, because the modern information space we all live in, with the development of technologies, requires everyone to be aware of these things and everyone to have the skillsets, different types of skillsets, to navigate it.

And when we look at this, that is what’s lacking. For example, when we look at the media: Roman mentioned that Russians try to get into mainstream media as much as possible. How do they manage it? Usually because there is very low threat awareness in the journalistic community, which thinks that RT is, oh, just a media outlet. I am very cautious about even using the word “media” when talking about RT, because analyses by researchers worldwide show that it has nothing to do with journalism and media. But threat awareness is so low that people co-opt it and consider it a credible source. RT is not banned everywhere, and Russia manages to keep its manipulations going: oh, that’s freedom of speech, we just want to express our opinions, and so on. This all comes from the lack of threat awareness and the readiness to confront it.

So I think that should be the very first building block of everything, because after that we can go into the details of many tools. There are various tools for building resilience to malign information campaigns, starting with developing policies with advertisers to create regulation and self-regulation, because, beyond the Facebook ads Roman mentioned, there were also numerous cases where Russian-affiliated malign actors were simply buying banner space, you know, the Google ads on different websites, and putting the notorious lies right out there. So that is one set of tools. We also have educational tools, we have prebunking, and so many other tools out there. But none of these tools works just by itself; we need to combine them all. And the very first step should be acknowledging the threat, confronting it, and starting discussions at different levels: OK, what should we do about it?

ROMAN OSADCHUK: Yeah. I would maybe start by saying that information is the essence of life, in the sense that it influences every decision we make, and this is definitely information warfare. It’s not one single fight going on; there is a multitude of different things happening simultaneously, and it’s incredibly hard to track all of them at once.

But as Ksenia said, more should indeed be done to raise awareness that these things exist. Another thing: there should be more skills training for journalists, and for wider audiences too, because media literacy has become a necessary skill. It is basically an essential skill of our time, like reading was a few centuries ago.

It becomes more and more necessary for people to understand what type of information they are consuming and how to work with it. And I would echo Ksenia’s point that some journalists also need to understand that equivalence and balance between two positions is not always the right approach: if one side bases its claims not on truth but on pure fantasy or disinformation, treating the two as equal is not the way to go.

And the final thing I would say builds on our colleague Jakub Kalenský’s four lines of defense: it should be harder for disinformers to do their job. So something should be regulated, as Ksenia said, maybe the advertising industry.

Actors should be named and shamed: audiences need to be informed about how these actors promote information, what they write about, and what tools they use, so that their effect might be lessened.

Another problem is that even if we do not see an immediate effect, these things work slowly. The information builds up step by step, because repetition works, as a number of different studies show.

So even if you see that RT’s audience in a particular country is not large, it doesn’t mean the effect will not build up and turn even that small part of the country into really eager fans of pro-Kremlin narratives and messages.

ANDY CARVIN: Throughout the war, and preceding it as well, it has often felt like the Kremlin is throwing every idea it has at the wall like spaghetti to see what sticks and what doesn’t; certain narratives resonate, take on a life of their own, and spread, while others don’t.

What factors do you think cause certain narratives to spread and be successful versus ones that don’t? Are there any particular patterns you’ve noticed?

ROMAN OSADCHUK: I could start. I think the main issue here is whether they hit a nerve with a specific audience, whether they hit the cleavage between polarized audiences or groups, let’s say. If their messaging actually hits on the problems that a specific group cares about, then it works out.

For instance, there is a group of people who believe the conspiracy theory that the West is preparing a plot for a worldwide government or something like that. This audience would be extremely receptive to the conspiracy theory that the war was induced by the West: that it’s not a war between Ukraine and Russia, it’s a war between the West and Russia. That slightly explains the mechanics here: if a message coincides with the interests of a specific group, then it will probably work. Another factor is whether it involves, again, things that people care about.

For instance, in Ukraine, when there were a lot of blackouts after the Russian attacks on energy infrastructure, many people lived without electricity for hours, some of them for a few days, and Russians injected messages claiming that those energy blackouts were not caused by the shelling but happened because Ukraine was selling electricity to other places and other cities.

And it is extremely emotional, as you can imagine, because people were desperate and in not the best position, and those messages resonated, not because they were fantastically crafted, but because most of them hit the nerve of the people and of the exact situation at the moment.

ANDY CARVIN: Ksenia, anything you want to add to that?

KSENIA ILIUK: I very much agree with Roman on that. I will just add that when we look at the narratives of Russian malign influence, it’s important to understand that a lot of them, and the various messages that fuel the narratives themselves, are not always there to stick. They are very often there to completely disorient the audience. Here I am talking about the more sophisticated Russian malign influence, not the campaigns that [claim] Ukraine has bioweapon mosquitoes, but the more elaborate ones.

They can often promote narratives that at first sight do not look beneficial to them, and that is what makes them very hard for different audiences to spot, because when you see them you think, oh, that doesn’t look like something it would be beneficial for Russia to promote.

But that’s part of the strategy, because when they send out any piece of malign information, whether it’s a fake, a manipulation, or a bigger message, a bigger story they’re spinning out, each has its goal, and very often this goal is not about making people believe the narrative and letting it shape their decision-making, but about distracting, disorienting, sometimes completely focusing attention on a different thing.

In Ukraine, for instance, with the start of the full-scale invasion there were numerous fakes that Russia spread basically to divert the attention of Ukrainian volunteers who were doing very productive work, not productive for Russia, obviously. They were trying to sway them into doing other, useless things, just to make sure they were busy with them.

So this is also very important: when we see any piece of information, we should ask ourselves, OK, what do we feel about it? Who can benefit from it?… Because sometimes it’s just there to distract.

ANDY CARVIN: So we have a number of questions coming in from the audience, from attendees at RightsCon. So I want to start with a question from Mais, and I’m hoping I’m pronouncing that right. Mais writes: There are striking similarities with the tactics Russia employed in Syria, from amplifying disinformation to vilifying rescue workers responding to war crimes and appealing to anti-West sentiments. What crucial lessons can be learned from Russia’s playbook in Syria to develop more effective resilience against information warfare in Ukraine?

Now, I know neither of you are necessarily Syria experts. But I think it’s fair to say that this isn’t the first time Russia has deployed these techniques in a conflict, or simply to undermine its adversaries. So I guess I would ask: do you see similarities with other episodes in Russian history, or going further back into Soviet history, in how they’ve weaponized information?

KSENIA ILIUK: I would definitely say that there is nothing new under the sun when we look at the essence of malign influence, not its shape or format but its essence; the essence is the same.

About Syria, I would note that there are indeed a lot of strikingly similar or even identical patterns being applied, and I think the biggest thing we failed at as democratic societies worldwide with regard to Syria is actually analyzing the situation and learning from it.

I would personally say that the situation in Syria is much more complicated. Information-wise, in the situation in Ukraine… there is very clear unity across many efforts in civil society and state institutions. In that regard, the situation in Syria is much more complex.

But I think we failed to learn from it. We failed because we again say, oh, these techniques, we already knew them before. If we knew them before, why haven’t we [prevented] them from spreading again? Most of the malign information campaigns happening right now are not identical to earlier ones, but most of them could be easily anticipated and prevented. And here we have a very strong preventive tool: we can inform people in advance about the tactics and the manipulation that will be used. In this regard, in the context of Syria, we unfortunately very much failed to do that.

ROMAN OSADCHUK: I would echo that we probably failed to learn from the tactics. They are definitely not new. If you read about active measures, there is a wonderful book of the same name that describes different operations that took place in the twentieth century. It’s all in there. You could also look at Soviet defectors to the United States who acknowledged what they had done in different parts of the KGB, how they did it, and what they spread to different audiences. So it’s definitely not new.

But we did indeed fail to learn more from the example of Syria. At the same time, I would say this case is slightly different because of the sheer amount of footage and files that can be documented. This war is probably more captured, meaning on photo and video; there is so much evidence of basically anything that happens. In a sense, that makes it easier to collect all of this data. So hopefully this time it will be easier to prove the involvement of specific troops in particular war crimes, or to establish what actually happened on the ground, because having those materials is a real advantage.

But again, I completely agree that it’s not new; the Russians are using similar tactics again and again.

ANDY CARVIN: We have another question, from Andrew, who writes: Although RT and Sputnik are banned in most Western countries now, Russia has recruited a notable number of so-called citizen journalists and whistleblowers. Tara Reade’s defection to Russia recently made the news, but she had been using pro-Russian talking points online since at least 2018. Could you provide some perspective on how these disinformation actors are cultivated and amplified by the Russian state?

Do we have any insight on that, not necessarily her specifically? Just more broadly, how does Russia exploit outside actors to promote its ideas?

KSENIA ILIUK: We have actually just finished a quite interesting analysis of the information space in the countries of the Eastern Partnership, plus Georgia and Armenia. One of the things we saw is that Russia has more and more started focusing, as you mentioned, on these kinds of amplifiers: independent people, independent journalists, who echo the rhetoric completely. What is very interesting, especially in these countries, is that a lot of them are moving heavily to Telegram and creating Telegram channels; some of them already have over a million subscribers. Most of those we discovered during this research focus on the Russian-speaking population, but we see more and more of a tendency to run the same recipe, the same playbook, in the national languages of these countries. And it is especially concerning that they don’t just go to Telegram, where it’s understandable why they go, since there is little fear of content moderation, so they are very open in their thoughts and their expressions are very sharp; they usually do not write on Twitter, even if they have a Twitter page, what they allow themselves to write on Telegram. We also see a tendency of them going to TikTok. Just the other day we uncovered one such influencer, we don’t know what category to put him in, who was targeting [the] Ukrainian audience, and his videos on TikTok each had almost [a] million views.

And he was doing it in such a sophisticated way that it took us quite some time of following him to understand that he was actually echoing Russian narratives. But what was most striking is that the Facebook advertisements Roman mentioned today were using his profiles, which is how we understood that he is part of the network. So this is indeed something. Again, there was already an infrastructure for this; a lot of these actors existed well before… but we’ve seen more new faces popping up.

ROMAN OSADCHUK: I would add that there are quite a lot of actors. There was a surge of Russian darlings, or amplifiers, in Europe after the annexation of Crimea and the invasion of 2014. So… I agree that now there are more actors, and I think we could roughly divide them into a few groups. Some of them sit beneath a few layers of separation.

At the beginning of the invasion, for example, there were some information campaigns against Ukrainian refugees, let’s say in Moldova, and these were mostly TikToks. Those campaigns were fact-checked by local fact-checking organizations and reported on, and when people spotted them, at some point the authors would just delete them. They seemed to follow the same kind of script: there was no direct link with Russia, but the same campaign had taken place in Russia one week before. So the timing coincided.

Another interesting thing is that there are a lot of think tanks already partnering with Russian ones. In the well-known Global Engagement Center report on the pillars of Russia’s disinformation and propaganda ecosystem, quite a few platforms are identified… that amplify Russian messages through local actors and partners all over Europe, at least, and not only Europe.

And finally, there are some people who are either sympathizers with the Russian cause or just hardcore anti-West, anti-Western-imperialism folks. Those people will share anything that goes in line with what Russia injects. I can give an example from one of my investigations. There was a French politician, or really just a French user on Twitter, who published the news that French howitzers had been acquired intact by the Russians so that they could reverse engineer them, something like this. But what’s interesting is that when people asked him what his source was, he claimed the FSB.

This claim was then used by Russian media again and again, and the story led to other publications as well. But how it started was with a person abroad writing something on the basis of a message from the FSB. So, yes, there are definitely a lot of people being used by the Russian disinformation and propaganda machine.

ANDY CARVIN: So we have a pair of questions that complement each other, one from Jonah and one from Marti, that I’ll read.

First, from Jonah: Do Ukraine and its allies have any means of replicating Russia’s tactics within Russia, and perhaps other authoritarian states? To elaborate, is it feasible to reach ordinary Russians and expose propaganda on social media, for example? Or is the Russian state’s control of the information space too powerful?

And Marti writes: Do you know of any efforts to counter Russian disinformation campaigns by replying to each post and comment? It seems like the West has accepted that employing AI and troll armies is something Russia does, but why don’t the US and Ukraine use the same tools in order to respond with the truth?

So, first of all, are there any efforts we’re aware of to directly counter these campaigns or messages in real time? And the next-level question: why isn’t Ukraine or the US replicating Russian strategies? Do you think it would be in Ukraine’s interest to do so, to essentially participate in actively spreading disinformation as well? Or would that be counterproductive?

ROMAN OSADCHUK: I could start. I think it would be immoral to do that. We should be better than the Russians, because many of these tactics are, quite frankly, deep in a dark area when we look at the ethics side of things. So we should not replicate them; we should find cleverer ways to fight them. To answer the question of whether people have tried to reach Russians: at the beginning of the invasion, many Ukrainians started reaching out to Russians, even relatives and families.

Family members reached out to their parents who lived in Russia, and many people came to understand that those attempts were futile, because there are quite a few examples, you can find them in the media, of people talking to their parents and the parents telling them: Don’t worry, the Russian Army is only aiming at military facilities. We will de-Nazify you and you will be totally fine. They are not shelling residential areas. This while the person was literally showing them videos of nearby residential areas being shelled.

So those attempts were made, but they definitely didn’t work out as intended. And if you look at some interviews with Russians inside Russia, where people tried to poll them and ask whether they know what’s happening, it seems that they actually do know. The hope of many Ukrainians was that once Russians realized what was happening, they would go and try to do something and stand against this war. But in reality, that didn’t happen.

There were protests at the beginning, but they quickly stopped, and that is actually a big problem. Another thing I wanted to raise is that Telegram, at least, is available in Russia, which means that even opposition platforms, literally any media outlet that is on Telegram, are available in Russia. So they can read and see what’s happening. It’s not that they are living behind an iron curtain and cannot reach Western media. They can get it. Thank you.

KSENIA ILIUK: I really agree with Roman that we have to be aware of the fact that they do have access to information, and it is their choice whether to look or not. A lot of them choose not to, because looking would mean realizing that all of them are responsible, by their action or inaction, for all the atrocities happening in Ukraine, and that is something they would rather not face. They choose instead to live in the world of Russian propaganda, which tells them they are a great nation that is saving the world…

So I think this is one very important point to understand. The second, about their tactics: should we use them or not? I think we definitely shouldn’t use the same tools. Again, the principles and values we operate by, or at least strive to operate by, are very different from the ones authoritarian regimes operate by. So I would definitely say it is not the way for us to go. On the other hand, I believe we have various tools based on the principles of democracy and democratic values that can be applied, and there is nothing stopping us from doing that.

And in terms of whether there are groups doing this: of course, there are some groups and organizations, and even some state initiatives, trying to reach so-called average Russians in Russia. But I don’t know that many of them are successful. As you see, not much progress is happening in terms of resistance inside Russia. We can certainly share and spread information, but I don’t think this should be our top priority as of now, because it seems that the people are clearly making their own choice.

ANDY CARVIN: So we have about five minutes left, which might be enough for one or two more questions before we begin to wrap up.

We have a question from Levisa, who asks: With the maturity and accessibility of AI tools, what do you recommend we do to identify disinfo created by these tools? Would showing provenance of the content be useful? Is there anything out there yet that you feel allows people to detect AI-generated content? And, arguably, the question adds, for real content the entire open-source community is providing provenance by researching things in real time.

KSENIA ILIUK: Yeah, I would start here, because I’m a big supporter of using AI for that. We use it for our everyday analytics, and it helps us analyze, on a daily basis, a few million publications in multiple languages and be much more efficient at spotting Russian malign information campaigns at the very beginning, when we are still able, to some extent, to kill them before they gain momentum. So in that regard, sure.
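[To make the flagging step concrete: below is a minimal, hypothetical sketch of the kind of multilingual text-classification pipeline such monitoring could rest on, shown here with scikit-learn. The sample texts, labels, and model choice are invented for illustration; LetsData’s actual system is not public, and a production pipeline analyzing millions of daily publications would be far more elaborate.]

```python
# Minimal, hypothetical sketch of a "campaign-likeness" classifier.
# Sample texts, labels, and the model choice are illustrative only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

# Hypothetical labeled examples: 1 = resembles known campaign messaging.
texts = [
    "The war was secretly started by Western elites",
    "US elites are profiting from prolonging the war",
    "Local council approves new school budget",
    "Rain expected across the region this weekend",
]
labels = [1, 1, 0, 0]

model = Pipeline([
    # Character n-grams degrade more gracefully across many languages
    # than word-level tokens, which matters for a forty-country feed.
    ("tfidf", TfidfVectorizer(analyzer="char_wb", ngram_range=(3, 5))),
    ("clf", LogisticRegression()),
])
model.fit(texts, labels)

# Score a new publication; a high score routes it to a human analyst.
new_post = ["The blackouts happen because electricity is sold abroad"]
score = model.predict_proba(new_post)[0][1]
print(f"campaign-likeness score: {score:.2f}")
```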

In terms of using AI tools to verify content that was itself created by AI, again, there are a lot of tools out there. I think the bigger question is what the overall policy for doing that would be. We see right now that people do not go and check information. It’s media literacy 101: you have to go and check your sources. People are not doing that, and they are not going to, because everyone has a family, a dog, a job, and everything else; we don’t have time for that. So in the era of AI, when any text, any video, any picture can be generated, if we want people to go and check, even when they are told about this… I don’t think people are going to do that.

So that’s a question for policy. Are we requiring social-media platforms to apply these kinds of tools? Or, for example, are we fostering some kind of culture around provenance? There are a few startups, actually, doing digital signatures for content. For example, if you post a video, you could put an invisible digital signature on it to attest that this is content with me, with my presence, and not a deepfake or whatever. So I think the key question here sits at the broader policy and regulation level.
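[As a rough illustration of the signing idea described above, here is a minimal sketch in Python, assuming the third-party cryptography package. The file name and key handling are hypothetical; real provenance schemes, such as C2PA-style content credentials, bind signatures to richer metadata rather than just raw bytes.]

```python
# Minimal sketch of content signing and verification, assuming the
# third-party "cryptography" package (pip install cryptography).
# The file name and key handling are hypothetical illustrations only.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

# Creator side: generate a keypair and sign the raw bytes of the video.
private_key = ed25519.Ed25519PrivateKey.generate()
public_key = private_key.public_key()

with open("video.mp4", "rb") as f:  # hypothetical file
    content = f.read()
signature = private_key.sign(content)

# Verifier side: given the creator's public key, confirm the bytes were
# not altered. verify() raises InvalidSignature on any mismatch.
try:
    public_key.verify(signature, content)
    print("Content matches the creator's signature.")
except InvalidSignature:
    print("Content was altered or signed by a different key.")
```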

ROMAN OSADCHUK: I would agree; it’s a highly complicated issue. If we talk about fakes being generated by AI in, say, bot farms, they are extremely hard to spot. Even if you use AI tools to identify whether something was AI-generated, there are always AI tools to rewrite what was generated so that you cannot spot it. It’s a constant game, as with the disinformation field as a whole: there is always some malign tool being prepared by malign actors, and then fact-checkers need to figure out how to work with that, how to identify that something happened. We are still at the stage of understanding how these tools can be used against us, against societies and democratic institutions, and we have yet to figure out how to effectively use AI against those operations…

Maybe there are some other avenues, perhaps in detection, but we’ll see.

ANDY CARVIN: We have barely a minute left, so if you could each summarize in maybe twenty seconds or so: what big lessons or takeaways should the rest of the world draw from Ukraine’s resilience to disinformation?

KSENIA ILIUK: … The thing I have been saying today: threat awareness. What Ukrainians realized through firsthand experience is that malign information influence actually kills people, and that it has an enormous capacity to affect our everyday lives, the lives of businesses and companies… Governments are suffering; democratic regimes are suffering. So I think the first and most important key to every other step is actually acknowledging and confronting the threat at every level…

ROMAN OSADCHUK: Yeah. I would say that there is no such thing as winning or losing an information war. It will always be with us; information will always be with us and will remain the essence of our decision-making process. So critically evaluate every piece of information you receive. It might be hard, it’s painful, it’s not pleasant to do, but try to analyze everything, because so much manipulation could be coming your way: the kinds we already know, and the many more that AI brings. We just have to stay vigilant and continue our fight against these malign influences on democratic institutions.

ANDY CARVIN: Well, on behalf of RightsCon and the DFRLab, I’d like to thank both of you for joining us. I wish we could continue talking, because this has been an absolutely fascinating conversation that has barely scratched the surface of some of these issues. But sincerely, Roman and Ksenia, I do appreciate you taking the time to talk with us today and to join everyone virtually at the conference. Thank you.

Thanks, everyone. Enjoy the rest of your event.


Image: A man watches an annual nationwide televised phone-in show with Russian President Vladimir Putin at an appliance and electronics store in the Black Sea port of Yevpatoriya, Crimea on June 20, 2019. Photo via REUTERS/Alexey Pavlishak.