Are autocrats winning the internet? Leading activists describe the role of big tech on the ground

Watch the full event

Event transcript

Speakers

Glacier Chung Ching Kwong,
Project Manager, The Committee for Freedom in Hong Kong; Ph.D. Candidate, University of Hamburg

Leonid Volkov,
Head, Network of Regional Headquarters for Alexei Navalny; Founder, Internet Protection Society

Moderator

Moira Whelan,
Director for Democracy and Technology,
National Democratic Institute

MOIRA WHELAN: It’s an honor to moderate this panel today with two heroes of democracy who are joining us. So I want to get started [on] our conversation and introduce them.

First, we have Glacier Chung Ching Kwong, who is the project manager for the Committee for Freedom in Hong Kong as well as a PhD candidate at the University of Hamburg. We thank you for being with us today. And for those of you who don’t know, Glacier is really responsible for the viral videos that we all know today about what activists are experiencing in Hong Kong. So thanks for joining us.

And also joining us is Leonid Volkov, who is the head of the Network of Regional Headquarters for Alexei Navalny and the founder of the Internet Protection Society. And again, it’s thanks to Leonid and his work that we know the experiences of activists in Russia and the amount of effort that is going into, as Graham said, the double-edged sword of combining the efforts of democratic movements and technology.

So I want to turn the floor over to our participants first to give us an understanding of what is happening on the ground. And walk us through the current political landscape if you would, Glacier, in Hong Kong, and then, Leonid, we’ll go to you about the conditions in Russia. Glacier?

GLACIER CHUNG CHING KWONG: Thank you very much. First of all, thank you very much for having me here. It is really an honor to be able to talk about the use of tech and the situation in Hong Kong right now.

And I think, as all of you know, Hong Kong is in a really difficult situation in terms of politics because, in 2020, the national-security law was implemented in Hong Kong, bypassing the local legislature. And this law criminalizes almost every form of activism. Like, if you are protesting on the ground, that means you’re committing “local terrorism.” Or if you are advocating for international support, like the way that I’m speaking to all of you right now, it’s “colluding with foreign forces.” And if you are trying to run in elections or organize primaries so that you would maximize the chance of winning [a] majority in the legislature in Hong Kong, then you are trying to “subvert the state.” So, basically, the government is trying to crack down on the movement, as we saw in 2019.

But technology basically helped sustain the movement, in a way, just not as it looked before. Technology enabled the movement in 2019 to be leaderless. Technology gives us a platform and gives us tools and methods so that we can organize ourselves in a leaderless way. That is, there is no one to tell us what to do. People are organizing among themselves on online platforms, on Telegram channels, on Signal chats, without actually having a decision-making body, which makes the movement so sustainable, because there is no one person or certain group of people that the government can arrest and that will be the end of the movement. This is how it worked in 2019.

But after the national-security law, the use of technology has shifted in that sense because, as I said, a lot of things weren’t possible anymore. But technology still enables resistance and prevents Hong Kong [from] turning into a black-box situation. That is, information is still able to get into Hong Kong and get out of Hong Kong. For example, take the status of a lot of prominent activists as they are arrested and kept behind bars: without technology, without platforms like social media, like Patreon, like Twitter, there is actually no way for the world to know what is happening to them—unless you actually get to visit them in Hong Kong, which is nearly impossible under the very strict COVID-19 rules right now.

And they have been writing letters. They have been passing messages through their lawyers or through their families, and [posting] them onto social media, onto platforms, so that people know that they’re still there, and so that we can keep the spotlight on them so that they will receive better treatment or, at least, less inhumane treatment, in a way. And it also provides an alternative source of the truth. That is because the Beijing government has been trying very hard to persuade the world that under the national-security law they have restored peace and stability in Hong Kong, in contrast to what happened in the 2019 movement. And with the aid of social media platforms, I, being in Europe, was able to reach out to [a] global audience and tell them that that’s not true. Human-rights crackdowns are happening on a daily basis in Hong Kong. And this is how the platforms and the internet enabled us to do that.

At the same time, technology gives us methods to protect ourselves. That is, the way we use technology will cause us to leave a lot of digital traces. And for many reasons technology is so convenient, but at the same time it is convenient for the authorities as well, because they can just gather a lot of information about us on the internet. But with the aid of VPNs and a lot of encryption tools, we’re able to protect ourselves. So there are a lot of ways that technology has helped us.

And it also made sure that the political movement remained more or less visible in the local context. Things happening in Hong Kong were still able to get out, because political actions have to be visible to be meaningful and to gain political momentum. And I will round up quickly so that we have more time for an actual conversation and discussion. But the point I’m trying to make is that, in the face of the Summit for Democracy happening tomorrow, it is of vital importance that global governments make regulations so that platforms will not be directly or indirectly [involved] in human-rights violations, given how much of a role they play in our movement in Hong Kong, as I’ve explained. And I think it stands true for all the movements happening around the world as well.

For example, these big tech companies are turning over user data to the authorities in Hong Kong. Google was found to have breached its promise that it would never turn over data to the Hong Kong government after the national-security law was passed. But they did. Under some circumstances, they did turn over user data. And they’re actually complying with the government’s requests for content removal. For example, are they removing the slogan behind me if I posted it on social media, because it is banned under the national-security law? And are they hindering access to technology on their own platforms? For example, Apple has actually been very notorious for assisting the Beijing government in hindering access to certain technologies, for example VPNs, or even homosexual dating apps, or even apps related to [the] Dalai Lama and foreign media.

Are we allowing these companies to get away with being indirectly, or even directly, involved in human-rights violations? So my message to the world leaders that are meeting tomorrow is: please make sure that you will hold these tech companies accountable, so that they won’t be indirectly or directly participating in human-rights [violations]. Thank you.

MOIRA WHELAN: Thanks so much for that. And we have so much of this to cover. And many of the things you just said I think we want to dig a little deeper into. But before we do that, Leonid, can you please give us a sense of the current state of play in Russia and, especially with what we’re seeing today playing out on the world stage, how activists are organizing, as Glacier indicated, both inside of Russia and outside so that we do have a sense of what’s going on?

LEONID VOLKOV: Thank you so much. Thank you for the invitation. It’s a big honor for me to be part of this event.

It all started ten years ago. It all started with an election. Exactly ten years ago, in early December, there was a rigged election to the Duma, to the federal parliament, and people went out into the streets protesting, hundreds of thousands of people all over the country, suddenly. Election fraud had happened in Russia before. It was not news in December 2011. What was news was [the] Internet. Videos of ballot stuffing and everything like this on YouTube sparked, ignited this protest. And from 2011 onwards, the Russian government started to realize that [the] Internet is, well, a threat to them. They started to realize that people can organize themselves using the Internet, that those videos could be dangerous for Putin’s regime. And, well, they started to build their infrastructure to restrict Internet freedom. The parliament that had been elected through a rigged election ten years ago, in December 2011, actually adopted a number of laws that enabled Internet censorship in Russia, and then the next parliament, which [had] been elected in an even more rigged election in 2016, also imposed many requirements on international media platforms and international tech giants.

Still, the next turning point came with one more election, which happened just three months ago, in September 2021. We have a Duma reelected every five years, so there is actually an important turning point in Russian politics every five years. In September 2021, there was an election once again, and the next step of pressure against the free Internet was taken. By 2021, no Russian opposition politician was allowed to participate in an election anymore, and still, they were actually participating. For instance, for Alexei Navalny, my friend and the leader of our movement, the election of the mayor of Moscow in 2013 was the last one. He finished second, nearly forcing the incumbent mayor of Moscow into a runoff. Putin decided it was too dangerous and Alexei Navalny was disqualified from participating in elections for many years, and then he was poisoned, and then he was imprisoned. The world knows.

Still, we consider elections very important, and we tried to find our way to participate in the election, even without being on the ballot. We created an application, so-called Smart Voting, which we used to endorse independent candidates who had, well, the best chances to defeat United Russia, Putin’s party, in their district. At the end of the day, the Smart Voting application and the Smart Voting website was just a list of names: a list of names of officially registered candidates in official elections—a list of endorsements. “The Navalny movement thinks, the Navalny movement supports: in district number one you vote for this guy, and in district number two you vote for this guy,” and so on, up to district 225.

What happened in August 2021, on the eve of the election? The Russian government declared our organization an extremist organization. They outlawed us just like al-Qaeda or [the] Taliban, and they declared all the content that we produced extremist content, prohibited content, banned content, as [if] it were a Molotov cocktail recipe or something like this. So they declared the list of 225 names of the independent candidates that we endorsed, officially registered candidates in an official election, to be extremist content, and demanded that tech platforms remove it from their servers, from their infrastructure.

They also blocked our websites. Of course, Russia has a sophisticated blacklist internet-censorship toolkit in place, so they made all our websites inaccessible from within the country without a VPN or other sophisticated technology. But we relied on the application. An application, well, is something that is much harder for them to block. It requires much better technology for the Russian government to block it. So they decided to pursue a different strategy.

On September 16, three days before the day of the election, representatives of Apple and Google in Russia were summoned to the Federation Council of the Russian Federation, the upper house of the parliament. And, as the New York Times and Bloomberg reported, they were presented with a list of their employees in Russia and they were told these employees would be arrested if Google and Apple did not delete our Smart Voting application from the App Store and Google Play, respectively. And they caved. They deleted it.

So once again, on one hand, it’s nothing new. I mean, there was once again an issue of human rights versus legal compliance. In Russia, this type of content is illegal. In Russia, the Russian government declared that our endorsements, our political recommendations, are extremist content, are dangerous, and it’s “illegal” to support those candidates that we supported. So in this respect, well, this was a legitimate requirement.

But here is a question that I would ask. OK, tomorrow the Russian government passes a law that 2+2=5 or something like this. Will Apple or Google, in this case, adjust their calculator application accordingly? Well, there is a law. There is a legal requirement. Why shouldn’t they?

On the other hand, of course, there are, well, human rights. The voters have the right to know. We have the right to endorse whatever candidates we want to endorse. Despite the formal extremist designation, there is, of course, nothing extremist in endorsing candidates in parliamentary elections. People do it. That’s what elections are about.

Well, so Google and Apple were presented with a choice. They wanted to balance compliance and human rights, but then more pressure was applied, and they caved. OK. I can imagine, well, it was a situation where their managers had to face a very hard choice. They had to protect their employees, who were taken hostage, and it was a very unpleasant situation for them. Still, I’m not happy, and we are not happy, with the way they dealt with this crisis.

They didn’t consider us to be part of it. They didn’t give us advance warning so that we could probably do something: publish our list, our endorsements, elsewhere; try to find some other ways to bring our voting recommendations to our supporters all over the country. We weren’t even aware of the possibility that the application could actually just be deleted from the stores.

We have never been [given] an official explanation, not to mention an apology. Actually, what Google and Apple did, well, they caved to Putin’s censorship request, and they also tried to sweep it under the carpet like nothing happened. After the election, Google just reinstated the application without saying a word. And Apple didn’t even reinstate the application; it’s still unavailable for Russian users.

Well, I believe this is not the way you deal with terrorists who take hostages. Putin was victorious. He managed. He applied pressure and he got what he wanted. Well, this only makes all the other Putins around the world—like in Iran, in Turkey, elsewhere, you name them—jealous. They want to follow suit. They want to do the same. A-ha, it’s possible. So you can apply enough pressure on tech platforms and they will crack, and they will cave, and they will delete the content, they will censor. They will do whatever you want. It’s a terrible precedent, actually.

And in my opinion, the way they treated this precedent should have been very different. They should have been very vocal. They should have said, OK, some guy is threatening the employees of American companies. Like, he’s taking hostages. There should be some punishment for this action. Putin’s behavior should be, well, punished somehow, so it doesn’t repeat. But what the tech companies did was, I think, really the worst possible way to deal with an incident like this. And we can’t be sure it’s not going to happen again.

Now, on the practical [side], what do I suggest? Well, there has been the Foreign Corrupt Practices Act in the US since [1977]. For nearly fifty years it has been illegal for US companies to engage in bribery and similar activities abroad that are illegal in the US. OK, why don’t we consider something similar about values, about censorship, about freedom of speech? We can’t imagine that an American company would, I don’t know, delete a US senator’s voting endorsements for Congress in his state. That’s clearly impossible. That’s the First Amendment. That’s a very basic thing about the political process.

Why [don’t American companies] do the same abroad? Why is it possible for them to trade values and human rights for legal compliance with dictators’ requirements? Probably we need to think about a framework that would protect values and human rights on the tech platforms, using law enforcement, using legal tools similar to the Foreign Corrupt Practices Act. That’s a practical [example] that I would like to share based on our experience of three months ago. Thank you.

MOIRA WHELAN: … I want to remind our audience that what we’re talking about here are efforts of both communities—in Russia and in Hong Kong—to gather and have an opinion about how communities want to be governed, how they want to be led, how their opinions will be heard, and how their governments will be responsive to them. And what has changed over the past decade is really the regulation, the use of tools that were once used [for] the simple act of democracy, the simple act of having an opinion and voting, being threatened with regulations—everything from making flags illegal, to making VPNs illegal, to surveillance. And the real threat here is people wanting to have an opinion.

And so I want to start with one simple question, which is for Leonid. We can start with you on this one. Is it working? Are they winning? Is Putin winning?

LEONID VOLKOV: Putin is winning the battle, not the war. He is a smart tactician. He is very opportunistic. He knows how to use weaknesses in, like, international law and [the] legal system, how to use the weak spots of his adversaries, and so on. He is not good at strategy. And I believe he knows, or maybe he doesn’t know but still, that in general the clock is not ticking in his favor. He still relies on TV propaganda. He is not able to dominate the internet. And last year was the first year in modern Russian history when fewer than 50 percent of Russian voters told pollsters they were getting their political news mainly from television. So the internet is now more important than television, which makes Putin’s propaganda machine obsolete.

And… we believe that at the end of the day, even with their censorship, even with all their attempts to put the message on the internet under control, we will be able to find ways—both political and technical—to push our message through. [The] Internet gives more flexibility here. And we see how people learn how to install VPNs, how to get around the blacklist, how to circumvent, and so on and so on… [To] some extent, we are back to where we were fifty years ago.

Like, in the 1970s everyone in Russia, in the Soviet Union, knew how to tweak their radio so that they were able to listen to those foreign voices. So now, fifty years later, it looks like soon everyone will know how to install and set up a VPN to be able to reach uncensored information. At the end of the day, technology will prevail because, well, technology is progress, and Putin is the opposite of progress. But, well, it demands a lot of patience, of course.

MOIRA WHELAN: And over to you, Glacier.

GLACIER CHUNG CHING KWONG: I’d say yes and no to that question. Yes, it seems like they’re winning—especially in the context of Hong Kong because, frankly, from an outsider’s perspective, if you don’t know people that are still on the ground, you basically feel like nothing is happening. There is this status quo where everything is stuck. Like, there are things happening on a daily basis. People are being brought to court. People are being persecuted. People are being sent behind bars. Or there are people leaving Hong Kong. There are activists outside of Hong Kong making noise. But none of these things are, like, groundbreaking things that are creating immediate and visible change, in that sense.

So it’s kind of a “yes” if you look at it in that direction. And especially when China is so good at playing along with Western narratives—and by Western, I mean the free world. Like, the trend of the world is talking about data protection, right? So China basically introduced a new data-protection law. And the wording in it is amazing if you don’t understand the context of Chinese politics. You basically feel like, oh, they have a very comprehensive data-protection law that basically catches up with the GDPR. This is the impression you’ll get if you’re simply looking at the text.

But you also have to take into consideration that in China they basically use a very different dictionary; that is, words [don’t] carry the same meaning as we know them. For example, there won’t be a data-protection law in democratic countries that says basically all [data] will have to be turned over to government authorities, and once they request it, you don’t actually have a chance to refuse under any circumstances. And the legislature is basically in full compliance with the government. Even if you require a court warrant to turn over data, they can just print a court warrant like, I don’t know, printing a book or something. They can just print it, and then you will have a warrant, so you will have to turn over the data. And so they’re really good at playing this narrative, again, to fool us into believing that they are actually doing really well in terms of data protection and stuff.

And then the other factor is the market share. For example, Apple relies so much on the Chinese market, on one hand to manufacture its products—iPhones, MacBooks, and so on—and on the other side to sell them, because there is a huge market in China. And if you only get, like, 10 percent of the market share, it’s already a huge part of the revenue for the company. And the last thing that makes it a “yes” is that the use of surveillance technology in China is so comprehensive that it basically feels like Black Mirror in real life. The social credit system, the use of facial recognition all around, the use of tracking and AI technology, that is just scary. So from that angle, and all three points that I mentioned, it’s definitely a “yes”: they seem like they’re winning.

But at the same time, it’s a “no” too because, like, every action force will create an equal reaction force, right? It’s a law of physics. And it works the same in resistance and in oppression. That is, there are people fighting very hard to counteract this kind of oppression coming from China.

For example, like, on Twitter we have this hashtag #MilkTeaAlliance, and recently we have the hashtag #WhereIsPengShuai. And more recently, we gained a huge success in the boycott of the Beijing Olympics that is happening next year in February. A lot of governments are announcing diplomatic boycotts. And a lot of these campaigns sustain momentum on the internet and on platforms, through [hashtags], through people vigorously retweeting stuff and talking about things.

And there are a lot of things happening when it comes to technology. It’s not the same way that the governments are using it to crack down on us. It’s more like a very creative use of tools that were never designed for [those] specific purposes. And at the same time, there are laws being made in different countries trying to better protect human rights in terms of data protection, or even the US recently passing laws to try to shore up internet freedom in Hong Kong: that is, providing funding to develop new tools, supporting people to train themselves and equip themselves with basic tools and knowledge to defend themselves. These efforts are happening locally, in diaspora communities and in the local community of Hong Kong. People are learning: How do we properly use a VPN? How do we protect the information that is on our devices? And so on.

And these very simple and creative [uses] of certain tools are actually generating a lot of momentum in the local and diaspora communities to very actively combat the things that I mentioned, that is, the Beijing government’s narrative and the leverage of its huge market. Like, me talking here about how we have to hold tech companies accountable… and we’re using a platform that [enables] me to talk to all of you as well. So there is this “yes” and “no” at the same time.

So I don’t have a very concrete answer. I’m sorry for that.

MOIRA WHELAN: No, I think it’s a very honest answer. And I think, you know, before we get to some of the accountability questions that are coming up in the chat, one thing we do know is sort of this idea that there’s more of us than there are of them, right? There’s more people in the world who want to have a role in how they’re governed. And I wonder if you can—one of the questions in the chat, Leonid, was for you about the activists in Ukraine, but I think this also goes for activists in Myanmar, in Belarus, around the world that we’re seeing who are facing this authoritarian oppression.

You mentioned the creativity of technology. If you could talk in a general sense—and, Glacier, we can start with you—about how these ideas are being exchanged, how people are—as authoritarians are scaling with laws and regulations, we see activists scaling with circumvention techniques, with social movement and amplification. And I wonder if you can talk a little bit about that.

GLACIER CHUNG CHING KWONG: In my experience and my observation, these skills and techniques of [the] creative use of technology get translated to different contexts a lot. For example, I mentioned the hashtag #MilkTeaAlliance. This is actually a term coined by a few places, like Hong Kong, Taiwan, India, Myanmar, and Thailand, that all have milk tea in our culture, so we named ourselves the Milk Tea Alliance. We are all, like, facing a fight for democracy and freedom in our own context.

And for sure, these movements cannot be directly compared because the context and the historical background are so different, but we do see certain skills being shared. For example, how do we organize protests leaderlessly? How do we make use of tools? For example, Burma and Thailand, they’re basically fighting for democracy, and they actively deploy some of the skills that are being developed by Hong Kong activists and Hong Kong protesters in general.

And when it comes to the use of technology, they are always very creative. For example, if you have an iPhone and you are on the tube, you can just open your AirDrop and start to AirDrop people photos about the movement or something sweet. For example, I myself in Hong Kong received AirDropped photos saying remember to drink more water because you need to be healthy in order to win this fight, those [kinds] of messages. And it provides so much, like, solidarity and support among all of us. Even though we’re strangers, you can still feel that. And I know that a lot of these small skills and small techniques are being deployed in other places as well.

And there is this constant exchange of methods of [protesting], how do we organize ourselves, that is happening online. For example, a few months ago I was on a Twitter Space discussion with protesters from different places in the Milk Tea Alliance and we were talking about how do we facilitate this so-called alliance. How do we work together more? How do we exchange ideas? How do we make sure that the things that we’re talking about get, like, accurately translated into their context so that it can be useful? How do we develop, like, for example, a self-defense guide in terms of digital tools and digital security for everyone that is part of the bigger struggle for democracy in the world? How do we do that? What are the things that we have to bear in mind? Like, there are different regulations and different contexts, so we have to adapt.

And I’d say this is something truly amazing to see, because the tools are fixed. Like, Signal wouldn’t change a feature for Hong Kongers or for the Myanmar people that are using it. But in general, we have figured out very creative uses. For example, in Hong Kong we used a lot of Telegram channels to spread news and spread information. But maybe in some places Telegram isn’t available, so they will have to use different tools but the same mechanism to spread messages. So these are the translations and the very interesting and creative uses of technology that I’ve seen in other movements.

MOIRA WHELAN: Leonid, did you want to jump in here? Especially as we see this activism hitting the headlines—though I should say it’s not new in Ukraine—is there coordination that you’re seeing happening from a technology standpoint among activists?

LEONID VOLKOV: No, there is not, and I think that’s still a good thing. Countries are different. Situations are different. And blindly trying to copy practices and approaches from one country to another could be misleading or even dangerous, and we’ve seen examples of this. Even when the protests in Belarus sparked, like, over a year ago, they tried to do things that, for instance, we did in Russia before, but as the Lukashenko regime was actually even worse than Putin’s regime, these things didn’t end well.

And so activists in each country who are well aware of the specific political situation and political process in their country should, at the end of the day, find their own way and not believe that some inherited experience or some help from abroad will arrive. That’s a hard lesson that we have learned on several occasions. We tried to use some approaches and some ideas from abroad. We read a lot. We listen a lot. We communicate. We participate in discussions like this. But at the end of the day, to make something successful you really need to go through it with your own experience, first of all, and apply it to your own situation, because every country is, fortunately, very unique, and every political situation is very unique.

MOIRA WHELAN: No, thanks for that. And I think the other big point here is, you know, we often talk about circumvention and a number of these efforts. The burden can’t be on activists, right? The burden needs to be on those that we need to hold accountable.

So let’s talk about that. And this gets to two of the questions that we have in the chat but also the big question that we’re here to discuss, which is you both talked about social media companies and the steps and actions they’ve taken. One big thing that we advocate with tech companies at NDI is really getting them to realize that, you know, as far as dictators are concerned, they’re on our team, and they need to sort of wake up and realize that they have obligations here. But one thing that neither one of you touched on so strongly, and I want to give you a chance to do, is, here we are on the eve of the Summit for Democracy. We have countries around the world gathering to talk about what they’re going to do for democracy. And I want to give you both a chance to put accountability on the countries, and what steps do nations—the United States, Europe, around the world—need to take in order to help you? This is your opportunity to really tell these governments what you need from them.

Leonid, why don’t we start with you?

LEONID VOLKOV: Well, I would say, first of all, let’s bring values back onto [the] agenda. I mean, we are fed up with the realpolitik where there are lofty words about the values that we stand for, and then we explore, like, practical opportunities to do business. This doesn’t work.

Like, I want to go back for a second to this issue of, like, human rights versus legal compliance, because it’s kind of the last resort that tech companies fall back on: Well, we have to comply with local laws; we can’t break them. And it’s a very hypocritical approach, actually.

So, for instance, Russia passed a personal-data localization law back in 2015. All personal data of Russian users on every international tech platform has to be stored on Russian soil. Google, Apple, Facebook, and all the others have been in violation of this law since 2015, for six years already, well, because it’s actually something they can’t properly comply with for economic reasons. Well, they store their users’ data somewhere in the cloud. They can’t split it into pieces. They can’t actually implement this law and, well, support it practically; like, they can’t actually split off the Russian part of their database and put it into some data center in Russia. So they ignore it. There is the so-called Yarovaya Law on decryption, which says that all messages should be decrypted and all encryption keys for encrypted messaging should be passed to the, well, Russian security services. Well, I hope they ignore it as well, and of course, they realize that if they comply with it, they lose all the audience. They lose all the customers. No one will use their messengers, like WhatsApp, if people know that encryption doesn’t work, and so on and so on and so on.

So for many years, they actually ignored Russian domestic censorship laws and, yeah, they were right. It’s actually very good that they ignored them. But then suddenly, on the eve of the election, being put under slightly more pressure, they started complying. OK, so now legal compliance suddenly has become something important. Well, that’s what we call hypocrisy. They ignored these stupid and ridiculous regulations when compliance was, like, economically impossible, when they would lose customers by complying with these regulations, but once the Russian government pressured them with some blackmail, they changed their position. We really don’t need and don’t want them to act like this.

I believe that, well, they have to have transparent policies based on human rights, and channels of communication with civic organizations, with civil society. Google and Apple and Facebook and all the others have been in violation of Russian domestic laws on the Internet and censorship for six years. Under those laws they should have been blocked. Putin is not going to block them, though, because it’s just politically impossible. Putin doesn’t want to antagonize nonpolitical Internet users. Like, Russia has one hundred million YouTube users and only fifteen to twenty million of them actually follow political content on YouTube. The rest use it for entertainment, sports, cartoons, whatever. Putin knows perfectly well that if he actually blocks YouTube in the country, all those people will get angry, all these people will come to dislike the Kremlin, will come to dislike Putin… And so he tries to lure tech platforms into self-censorship, into deleting the content voluntarily, and that’s a very bad thing. And to avoid it, direct contact, a dialogue with civil society, is necessary.

MOIRA WHELAN: Before I turn to you, Glacier, I want to dig on this for one second, Leonid, because the case you raise with Apple and Google—[I] 100 percent agree that there is an accountability that they have to human-rights activists. But one thing we also didn’t see much of were other Western governments standing up in support of human-rights activists, and I want to give you a chance to address that aspect in the particular election scenario. What do governments need to do differently in a scenario like that?

LEONID VOLKOV: Well, that’s all the same. That’s all about values and not realpolitik. Putin has been so victorious in the last three years because they actually try to play on his field. They try to play his game, the game of political intrigue, the game of agreements behind closed doors, and so on. It is a game where Putin is so good. He is weak at values because he has none. He is weak at, well, protecting human rights, and so on, and so on, and so on.

So I believe that all these appeasement policies that we have seen over the last few years have failed. And we have witnessed enough of these failures. They just have to, actually, well, take a much stronger position and reinforce those values.

MOIRA WHELAN: Glacier, same question over to you. You know, we see this appeasement strategy, if you can call it that, coming from companies and coming from governments. I guess just one nuance on the question of do you think there is really a strategy here? Or do you think that governments and companies are sort of feeling their way through this? And what do they need to know to be better at it?

GLACIER CHUNG CHING KWONG: I’d say in general the [strategy] of governments and companies, as far as I can see, is to choose money over conscience, to be very, very honest. This is my general feeling about governments and big tech companies when it comes to the Chinese market or complying with Chinese laws in general. And I think the one thing that they have to do is they really do have to make laws to make sure that no tech companies are indirectly or directly [involved] in human-rights violations.

And I think this responsibility should also go to the companies themselves, because they have always been branding themselves as upholding freedom of speech, freedom of expression, giving people access to, like, the wonders of the internet, and so on. But, as Leonid said, they’re just hypocrites in that sense, because on one hand they are helping governments in Russia and in Hong Kong to crack down on activists, on the people who are fighting for the things that they uphold. And at the same time, they can brand themselves as the defenders of the same values? I just don’t feel comfortable with that at all.

And for governments, I think the solutions are coming in the near future. It will be a patchwork system, but we have to bear in mind that we need solutions to these problems. I’m not asking for something like waving a magic wand so that everything will be solved overnight. That’s not something that will happen.

But there are ways to hold these companies accountable, like holding hearings and making patchwork regulations on different aspects, for example on data requests, content moderation, and antiterrorism content laws. These things do have implications for human-rights activists and freedom fighters like me, like Leonid, and all of our colleagues worldwide, because the Chinese authoritarian state is so good at this; they use the same playbook, basically. They adopt wordings, as I said. They adopt the same narratives that Western countries and the free world use, like terrorism, like misinformation. They love these words because they carry different meanings in their context.

But for a lot of people that are not familiar with that context, it just feels right. Like, for sure China needs misinformation laws. For sure they need antiterrorism laws. But antiterrorism means cracking down on human rights in Hong Kong or in Russia. So we have to bear in mind when we are talking about these things that there is a different context in which legal compliance by these big tech companies becomes a problem. And governments have to hold them accountable.

At the same time, I hope that governments will bear in mind that their own regulations will have an impact on different parts of the world. For example, the European Union is now talking about a Digital Markets Act that will basically try to break Apple’s and Google’s app-store monopoly on the apps market. Because if an app is not out there on the app stores, basically it doesn’t exist at all in any sense, unless you jailbreak your iPhone, maybe. But that’s not ideal either. And so these laws, once applied in the European context, will have an impact across the whole market for Apple’s products. And it might increase the chance of users in other regions of the world having more access to technology.

And if the United States is considering antitrust laws for these big tech companies, maybe the same implication applies. And are we aware of that? Are policymakers and lawmakers aware of that? And when it comes to that legal-compliance problem, I find it very frustrating as well. I have this, like, firsthand experience where a lot of activists and I put up a website called the 2021 Hong Kong Charter, which detailed, like, the principles of the diaspora community and what we were trying to fight for. And we hosted it with a company in Israel called Wix.

We thought it was going to be safe because it’s, like, outside of Hong Kong, right? But then Wix received a letter from the Hong Kong police requesting that the website be taken down, because it was infringing the national-security law. And Wix did that. And we had to make a huge fuss on Twitter in order to get it back up. But then, two days later, it got blocked again locally in Hong Kong; there was this site-blocking technique that they used. And they blocked our website, so it has to be accessed with a VPN in Hong Kong.

But this kind of incident reflects that foreign companies just don’t have enough knowledge about what is actually happening in different parts of the world. I understand they might not all be, like, as big as Google or Facebook, with a whole legal team and a lot of policy people trying to work out how to make these things work. But I really think that there is a lot of agitation work to be done there, so that a lot more companies will be educated about what is happening in other places of the world, and these things will be flagged when something similar happens. Because we are lucky that we have some very prominent activists who are on these channels, so people will see it when we tweet about it or talk about it. But there are a lot of people who are less lucky and not able to do that.

And at the same time, I think when it comes to content moderation it’s so controversial, because there are a lot of [types of] content being blocked under community standards, especially on Facebook, that were not supposed to be banned. For example, Facebook took down a page of Uyghurs testifying [about] their experiences being held in reeducation camps. The whole page was taken down. And that was the only archive they ever had.

And I can’t remember exactly under which community standard they removed it, but there are repeated occasions where Hong Kong users are being blocked because they said something supposedly racist, but it’s not exactly racism. It’s more about criticizing political situations in China. But because of the flagging system and how the content-moderation system works on Facebook, these messages get deleted. And it’s basically hindering our freedom of expression.

And there are a lot of these very delicate and controversial things happening on platforms. And so that’s why I say the solutions will have to be patchwork-style at first, until we can develop them into something more comprehensive and more coherent. But I do think governments have to start to make laws to hold big tech companies accountable for indirectly and directly [being involved in] human-rights violations. And companies themselves have to be aware of these situations that are happening in different regions of the world, and stop being hypocrites in general.

MOIRA WHELAN: Yeah. Thank you for that. And I will say, you know, just speaking on behalf of the democracy community, you have us, right? So where you’re busy in the fight, both of you, there is a global network of organizations that are here to help any tech companies that may not have those giant legal teams, to help them understand your experience and help them to design in a way that makes democracy possible and enables human rights to be respected.

We’re running out of time. And I want to give you one last chance to sort of walk through any messages you want to communicate. But, you know, there is one thing that we’ve seen, and here we are not only on the eve of the Summit [for] Democracy, but also on the eve of Dmitry Muratov and Maria Ressa receiving the Nobel Peace Prize in honor of a lot of the work they’ve done to bring light to the issues we’re discussing, and also the personal attacks that they’ve faced in doing that. So I want to give you a chance to talk about yourselves and about your colleagues and your own personal safety, and what we can do to help keep you safe, and keep your ideas at the forefront of our attention as we go about our work.

So, Glacier, let’s start with you. And then Leonid, if you’d like to wrap us up.

GLACIER CHUNG CHING KWONG: I will try to make it very brief. I was born one year before the 1997 British handover of Hong Kong to China. That means I grew up in a Hong Kong that was still relatively free. I was allowed to think critically, to say whatever I saw fit, in general. And I just want to really urge every one of you who is watching this and thinking about this: Please don’t take the freedom of expression that you are enjoying on the internet for granted, because this might change. And I think we have all experienced that impact on various occasions, like Cambridge Analytica, and a lot of, like, fake news, and a lot of echo-chamber issues, and so on.

And you actually have a role to play in that. You as a user, being a responsible user, or urging your governments to do something, will definitely make a difference. And if that’s, like, beyond your reach as well, maybe consider retweeting things when they are happening in Hong Kong, for example. Tomorrow, Jimmy Lai, the Apple Daily founder, will be receiving his verdict in the June 4 case, and there will be a lot of people receiving their verdicts in the coming days. So if you see that news on your Twitter feed, retweet it, because that will make a difference in the algorithm and more people will see it.

And when you are thinking about different issues, try to think about what will happen to other people in the world. So, for example, when we’re talking about climate and we’re talking about solar energy and stuff, these things do have implications on human rights as well. Like, all of these very complicated topics are all interlinked and interconnected. So give that a thought and make very conscious decisions about what you are doing in daily life.

And to keep me safe, I think just make sure if I cry for help on Twitter or on social media, please retweet that because that means I definitely need help. And casting more spotlight on Hong Kong and on China-related issues and other human-rights defenders in the world will help give us a platform to be heard, and it means a lot to us personally as well. We feel supported and we feel like people are standing in solidarity with us. So that’s my call for all of you. Thank you.

MOIRA WHELAN: Thanks for that. And thank you for your heroism and sort of living out loud.

Leonid, over to you. Why don’t you wrap us up?

LEONID VOLKOV: Yeah. Thank you once again for doing this event. And it’s been a great pleasure to be here along with Glacier, who’s so patient and outspoken. I can’t say it better than she did, and I can actually just second it.

Publicity helps. Publicity protects, if anything, as do attention and being vocal and not being silent and being able to communicate. Like, building communication channels, which I was talking about, channels to tech companies, to governments, and so on, that’s probably, like, the only tool available to us. Nothing can guarantee full protection for civic activists, nothing. There is no such thing as safety and full protection. But at least we can do whatever is possible. And this comes, of course, through publicity, through taking care of what’s going on in countries like Russia, Hong Kong, and many others.

MOIRA WHELAN: Yeah. Thank you for that. And again, thank you both for your heroism and your bravery. And I would just urge us all to remember: imagine what we could do if everyone could just express their opinions online, have the ability to govern themselves, and achieve these basic rights that we’re all working very, very hard for.

360/StratCom 2022

360/StratCom is DFRLab’s annual, premier government-to-government forum focused on working with allies and partners to align free and open societies in an era of contested information.

Image: Protesters marching with yellow umbrellas to show support towards Hong Kong during the demonstration. Photo by Viola Kam/SOPA Images/SIPA USA via Reuters.