Maria Ressa, Nanjala Nyabola, and Katherine Maher on why social media is—and is not—a global public square

360/Open Summit: The world in motion

June 22 – 25, 2021

The Atlantic Council’s Digital Forensic Research Lab (DFRLab) hosts 360/Open Summit: The World in Motion online on June 22–25, 2021.

Event transcript

Speakers
Maria Ressa,
CEO and President, Rappler

Nanjala Nyabola,
Independent Writer, Researcher, and Political Analyst

Katherine Maher,
Nonresident Fellow, Digital Forensic Research Lab, Atlantic Council

Moderator
Erica Kochi,
Co-founder, UNICEF Innovation

ERICA KOCHI: Hi, everyone. Great to be here. I’m super excited to introduce this panel today.

First, we have Nanjala Nyabola, who is a writer, independent researcher, and political analyst based in Nairobi. Her work focuses on the intersection between politics, society, and technology. She’s a frequent contributor to publications like The Nation and Foreign Affairs, and is the author of “Digital Democracy, Analogue Politics: How the Internet Era is Transforming Politics in Kenya” and “Travelling While Black: Essays Inspired by a Life on the Move.”

Second of all we have Katherine Maher, who was with all of us on day one as well. She’s the former CEO of the Wikimedia Foundation. And prior to Wikimedia, she held leadership roles at Access Now, the World Bank, the National Democratic Institute, and UNICEF. And she shapes innovation agendas for international development, human rights, and democratic participation.

And then last but not least we have Maria Ressa, who has been a journalist in Asia for over thirty-five years and is the co-founder of Rappler, the top digital-only news site that is leading the fight for press freedom in the Philippines. As Rappler’s CEO and president, Maria has endured constant political harassment and arrest by the Duterte government. For her courage and her work on disinformation and fake news, Maria has received many accolades.

And I’m so excited to have all of these wonderful women here with us today to talk about the public square—the digital public square.

So, to kick us off, I would like to ask each of the participants, starting with Maria, do you think it’s accurate to describe social media platforms as public squares?

MARIA RESSA: I wish. I mean, you know, it is the default place where we gather, but, especially with these microtargeting tools that it uses, I would compare it more to a behavior modification system. And those of us who voluntarily enter are part of an experiment that makes us, like, Pavlov’s dogs. We’re experimented on in real time. And the consequences are disastrous. I mean, if you look at it, Facebook is the world’s largest distributor of news, and yet all the studies—I mean, this is for social media in general—have shown that lies laced with anger and hate spread faster and further than facts.

So you can actually argue that the social media platforms that deliver the facts to you are actually biased against facts and they’re biased against journalists. And this is, I think, what’s turned our world upside down because if you don’t have facts, you can’t have truths. Without truth, you can’t have trust. If you don’t have trust, you don’t have a shared reality, you can’t have a foundation for democracy. You can’t—any meaningful human endeavor becomes impossible. And this is the crisis that we’re facing today.

ERICA KOCHI: Yeah. Nanjala, over to you. I know we had a discussion about this earlier on, but I’d love to hear your views on this. Do you think it’s accurate to describe social media platforms as public squares?

NANJALA NYABOLA: I think it’s accurate to describe them as part of the public sphere. I think it’s accurate to describe them as places where people go to have their opinions heard, and places where people go to engage with their governments, to engage with public services. And, you know, I think especially in countries where the social-media companies don’t necessarily see the publics there as natural audiences or as natural extensions of their markets, you actually find people taking on these tools that were designed for something else and applying them to their political lives, to their political realities.

So we’ve had massive protests, for example, resistances against the excesses of power—Rhodes Must Fall, My Dress My Choice—in many African countries. Because there’s been this massive retreat of the traditional media (you don’t really have a robust media), social media starts to play this role where people go, as Maria was saying, to get their political information and to get their connection to the political space.

But people are also using these tools and going beyond that, using them as a primary way to reshape and organize. So maybe not a one-to-one substitution, but certainly an extension of a lot of the characteristics that we see in the analogue public square. And that’s just because of the agency, the creativity, that people have applied to the social networking sites, over and above what the makers of the sites might have had in mind.

ERICA KOCHI: Yeah. Katherine, what about you? What are your thoughts on this?

KATHERINE MAHER: I mean, I think that there’s sort of a differentiation between the legal question of whether social media platforms are public squares and understanding that, you know, for all intents and purposes they are not. They’re private spaces. And the platforms have the rights and also the responsibility to do what they would like to do to uphold their policies and community standards in those spaces. But in some very practical ways I think I would second what Nanjala and Maria have already said, which is that these are the venues in which we do express both discontent and elation in our lives, and in the context of the way that we are governed.

I think the analogue of the public square takes on sort of a more pressing resonance at this point in time given that even within our public squares, in many places in the world, we are facing, you know, real threats to freedom of expression, threats that are compelled by force. We’re seeing public squares cleared, both in the digital metaphorical space and in the offline real world. So to the extent that there is a distinction between sort of the nature of public versus private and the function that we use these squares for, I think that there are a lot of similarities and probably much to learn from the tactics of resistance to efforts to censor and to silence.

ERICA KOCHI: Can you bring a real-life example to that, Katherine? I think what you’re saying is really important, but I think having it resonate with people in, you know, their memory—recent memory of what’s been happening, as well as, like, Maria or Nanjala, if you want to jump in. I think, you know, what you’re talking about is still kind of abstract, and I’d really like to bring it down to what’s happening around the world, and what could be—what could happen in the future.

KATHERINE MAHER: I actually would defer to Maria and Nanjala. I think that they have some immediate examples that would probably really help bring this out of the Zoom boxes.

NANJALA NYABOLA: Sure. So one example that I spend a lot of time with in my book is about the women’s movements in Kenya. And we’re talking about a sociopolitical context in which it’s not outright marginalization the way that people think about it. Like, women work, and women own businesses, and do all of these things. But patriarchy is very strong in Kenyan society. So, for example, in a lot of communities, women are not allowed to own land or publish in their own names. And one place where it manifests, actually, is in the analogue public sphere, where we have these unspoken morality codes about how women should dress in public, and the consequences are not legal penalties but actual harassment, abuse, and violence.

And in response to one of these incidents, in 2014, we had a massive online mobilization by the women’s movement in Kenya that resulted in the hashtag #MyDressMyChoice. #MyDressMyChoice happened on Twitter; it happened on Facebook; and—you know, WhatsApp was still very nascent, but it was also a site where people, women especially, were organizing around harassment and violence, not as a, quote/unquote, “morality” issue but as a public safety issue: this is violence and this is assault and it’s abuse. And it’s the first time, certainly in living memory, that we’ve seen the government responding and charging the men who were perpetrating these acts with assault, as opposed to treating it as some kind of violation of, as I said, quote/unquote, “morality.”

But the big thing is that, for radical feminists in Kenya, there was no space in the analogue public sphere. Feminism was always discussed as a dirty word. Some of the iconic Kenyan women that you think about—Wangari Maathai—are women who faced tremendous backlash from everyone in the society because they were women living outside the norms, the unspoken codes, in the society. So feminism, identifying as a feminist, was a huge personal risk—right?—and you would end up being treated as some kind of stranger.

So for radical feminism in Kenya, the digital space has been tremendous, not just in advocating for women’s rights and women’s safety, but in articulating radical feminism as a political discourse. What are we against? What do we stand against? We’re making it a broad tent, bringing in LGBTQ+ rights in a country where homosexual acts are still punishable by fourteen years’ imprisonment. It’s allowed people to find each other in an analogue context where you couldn’t even say out loud that you were gay; you couldn’t even say out loud that you were a radical feminist. And being able to communicate somewhat freely in online spaces gives momentum to these movements, and these movements result in very key changes in the analogue public sphere.

In the same vein—you know, we had the effort to repeal Section 162, which is the section that prohibits homosexuality in Kenya. The fact that LGBTQ+ Kenyans were able to organize, put out statements, and actually be in conversation with the rest of the public, in a country where the traditional media would never publish anybody who has homosexuality anywhere in their bio, would never put them on television, never put them on radio, was a tremendous boost to the movement. And it allowed for space, in both the digital and the analogue public sphere, for a conversation that the traditional public sphere had been stifling since probably Kenya began.

MARIA RESSA: I would almost argue—

ERICA KOCHI: Maria, yeah.

MARIA RESSA: So I would almost argue the opposite, Nanjala, in the sense that yes, I agree, it was empowering at the beginning, but I think that the tide turned, at least in the Philippines. For the sixth year in a row, Filipinos spend the most time online and on social media globally. So, firsthand: in less than two years the Philippine government has filed ten arrest warrants against me, so I’ve posted bail ten times, just to be able to do my job as a journalist. And, you know, beyond that, it’s the same methodology—attacks bottom up.

In 2017, government propagandists tried to trend the hashtag #ArrestMariaRessa. It didn’t trend, so they just kept at it. And two years later, I was arrested, twice in a little more than a month. So, you know, in 2016, we wrote investigative pieces showing you how the weaponization of social media happened, and we called out the impunity on two fronts, President Duterte and his brutal drug war. Human rights activists put the death toll in the tens of thousands. That violence in the drug war was facilitated and fueled by social media, by American social media companies. Right? So based on big data analysis that we did, we reported the networks that were manipulating us online; they were targeting and attacking not just journalists but human rights activists, truth tellers—hounding into silence (and this is the goal) anyone challenging power, right? Well, it’s only gotten worse today, and Silicon Valley’s sins came home to roost on January 6 with mob violence on Capitol Hill. What happens on social media doesn’t stay on social media, and online violence leads to real-world violence.

I’d add just one last thing, which is how this is all connected to geopolitical power play, right? Because as we’re talking about the coronavirus, there’s this equally dangerous and insidious—I’ve started calling it the virus of lies that’s been unleashed in our information ecosystem. It’s seeded by power wanting to stay in power, spread by algorithms that are motivated by profit. It’s a business model Shoshana Zuboff calls surveillance capitalism. The reward is our attention. And that is linked to the geopolitical power at play.

You know, the EU has slammed Russia and China for their intensified vaccine disinformation campaigns. And then this, again, becomes personal. Last September, Facebook took down information operations from China that were campaigning for the daughter of Duterte for president next year in our presidential elections. It was creating fake accounts for US elections. And it was attacking me. I’m just one journalist, right? So all of this is connected. Anyway, I’ll—so it’s very personal and also scary.

NANJALA NYABOLA: Yeah. If I—if I may, I would by no means categorize myself as entirely optimistic, because I do agree with you. And one of the points that I like to make is that the reason why we’ve had this use of social media in this way in a lot of African countries is because the social media companies did not see Africa as a site, did not see it as a market, did not see it as a place that was worth paying attention to. And because people were not paying attention to it, you didn’t have this investment in the negative for a long time, and that allowed the positive to flourish.

The tide is definitely turning. The tide has definitely changed. And in fact, I end “Digital Democracy” with the 2017 election because of Cambridge Analytica, which is a name that everybody’s aware of right now; their first operations were India 2011, Kenya 2013, Kenya 2017, right? So we’ve had the same thing: massive investments in misinformation campaigns, a lot of it—you know, Cambridge Analytica, a British company; Harris Media, an American company—a lot of money coming from outside to influence political conversations in Kenya, because of this realization that Africa is a place where there’s money to be made. I mean, to put it quite simply, where there’s money to be made.

And so until the beginning of this year, Twitter’s Africa office was in Dublin. Last month, Jack Dorsey announced that he’s moving to Ghana for six months. Until 2015, Facebook’s Africa office was a sales office in Johannesburg. At the beginning of 2019, Facebook went on a tremendous hiring spree and has expanded its operations in Africa, so that right now the Africa office is actually a fully fledged regional office, the same way as the ones that exist in Germany and the U.K. All of this is changing and it’s super dynamic.

And I think what you said about how the people in the Philippines use the internet, use social media, more than people in any other country—I think it’s that lag. When I started, right, in 2007 there were 100,000 Kenyans on Twitter; now there’s almost two million, and it’s growing exponentially. And I think as that happens, the good will also intensify, but so will the bad. So will the opportunity for predation. And that’s why it’s really important to understand what’s happening in other parts of the world, because it’s all connected, as you said. A lot of the things that made [January 6] happen in the United States are things that were practiced, seeded with misinformation campaigns and tactics that were being tested in other parts of the world before they were perfected in the United States.

ERICA KOCHI: Yeah. Thank you so much for that. I think, you know, one of the things that you’re really hinting at here—really pushing us towards—is how not just financial reasons but also power drives a lot of the narratives that are happening in the digital online world. And Maria and Nanjala and Katherine, I know you all have some riffs and thoughts about the power and money dynamic that’s driving these online platforms.

Katherine, I’d like to start with you because I know you have some great examples from your time at Wikipedia to share.

KATHERINE MAHER: Oh. I mean, I think that—Nanjala, I’m so glad that you brought up this issue of where the offices are based because I think that this is absolutely key to really thinking about whether products and platforms are serving the communities that are now using these products globally in a way that is meaningful and sort of tailored to the needs of those communities.

I think that we’re all very familiar—or perhaps the listeners who are here today are; I hope that, if not, you’re familiar by the end of this conversation—with this idea that most of these platforms have been built out of Silicon Valley. Most of the policies have been built out of Silicon Valley. Most of the policies and practices that are utilized to govern these platforms—both policies internal to the platforms but also the ways in which they are governed from a legal standpoint—are really based off US laws and norms. And when those are expanded out globally, you know, they do sometimes run into very real-world challenges of not being applicable, not being enforceable, not being appropriate—all these sorts of questions.

Now, I think that the connection here is something, Nanjala, you started to speak to, but I’m going to make it really explicit, which is that historically most of these platforms did not see the rest of the world as a market. The market was the United States. The market was Europe. It was where you could make money. It was where you could sell ads. So it was where the infrastructure was developed and it was where the investments were made. And basically, the rest of the world—you know, good luck.

And so I think that one of the things that is really this transformational change—and, you know, we can talk about whether it’s good or whether it’s bad, to the point of whether, you know, late-stage capitalism is good or bad—is that we are now increasingly seeing platforms be aware of the market power of other regions of the world, the increasing economic power of individuals living in different countries outside of Western Europe and North America.

And what is interesting—and I think, Erica, what you’re trying to tease out—is this idea that all of the product decisions that are made are really ultimately in the service of sort of the maximum revenue per user, the maximum time on site. And those lead to certain decisions around what kinds of services these companies build, but also the decisions that they make relative to enabling censorship, or enabling privacy, or disallowing privacy on their platforms.

And so a great example of this that we’ve seen play out time and time again is when governments go to platforms and say: You know, we don’t like this content. We don’t like the way that you’re enabling people to speak about something that is a challenge to the current government in power. We don’t like the fact that this says something perhaps accurate but defamatory—no, that’s not correct. Accurate but unflattering about a leader in power. And so we’re going to ask you to take this down.

And if you don’t comply and don’t take this down, we’re going to shut off market access to ten million people, eighty million people, two hundred million people, who would otherwise bring value to you and your advertisers. And so those connections, the value per user, the value per market, are driving the policy decisions. And that’s where we start to run into real questions about whether it is a public square. And the answer is: Whenever you have a capital incentive, it is not a public square.

And so I see Maria and Nanjala nodding and I want to make sure that I can give them space to take the floor as well, because I’m sure they have much more to add.

ERICA KOCHI: Yeah. Maria, do you want to jump in first? And then Nanjala.

MARIA RESSA: Yeah, I mean, just picking up from what Katherine said, right. Shoshana Zuboff called it surveillance capitalism. But I think, you know, what we have to look at is that the very platforms that deliver the news, the facts, the information we need to make accurate decisions in our lives are by design dividing us and radicalizing us. Because they are largely American companies, you know, it’s positioned as a free speech issue. It isn’t. And the users are blamed for a lot of the problems. Saying, well, people are really bad. No. These platforms are not mirroring our humanity. In fact, they’re making all of us our worst selves, by design, because it brings in the most revenue.

And here’s the part that I worry about the most. And, you know, I studied social networks because I was looking first at how the virulent ideology of terrorism—you know, how did that spread through social networks? How did it spread through social media? And that’s when you go from the psychology of an individual to group psychology, and how oftentimes the group exerts its own pressure. So an individual in the group will behave differently.

And finally, when you have it at scale—which is what these platforms have—it’s called emergent behavior, right? How we behave at scale right now is actually based on violence, on fear, on the very things that power the money that the platforms need, right? Violence, fear, uncertainty. And frankly, at least in my case, right, again, I could go to jail for the rest of my life, enabled by the dystopia in our information ecosystem. So this is a problem that we need to solve.

ERICA KOCHI: Nanjala, do you want to jump in here?

NANJALA NYABOLA: I’m not sure that I can be more profound than what both Katherine and Maria have just said. There is a level of growing consciousness that we are at the mercy, really, of commercial policies—that our political processes, our public conversations, are at the mercy of commercial decisions. So one of the things that I’ve been looking at the last year has been language, and how language makes things possible and makes things not possible—how we name things affects what we think they are. And the word that has been really stuck in my brain is the idea of community standards.

And when a company says “community,” what do they actually mean? Are we actually talking about community in the way that we understand it in the local sense, or are we talking about an abdication of legal responsibility—I think of what Maria was saying—making it our fault? You know, it’s your fault that you didn’t catch that incident. It’s your fault that this thing went the way that it did. And it’s sort of twisting what we mean when we think about community (we think about interdependence, we think about relation, we think about, you know, solidarity) and actually flipping that to: We’re not going to do anything. If you want us, the site, to do better, then you have to do it on your own.

And it’s become really apparent, for example, when we talk about content moderation. Content moderation is a really interesting thing because most of Facebook’s content moderation for the rest of the world is actually run through the Philippines. And so we have people who are putting up posts in languages that are not even official languages in many of the African countries. This has been a huge problem in Ethiopia. The official language in Ethiopia is Amharic. But the majority of Ethiopians don’t speak Amharic, because they have other languages.

And so the content moderation that’s done in English in the Philippines isn’t going to catch the hate speech that people are spreading in… Somali or in Tigrinya. And this is stuff that’s leading to conflict. You know, sixty-eight people died in October of the year before last because of hate speech that was posted on Facebook and stayed up for about three days. People died—sixty people died over the weekend. And the community moderation fails because it’s not really a community. You’re not really in the same relation as you are with your neighbors, the people that you see and have to share physical space with.

And so what it really means is that no one is responsible. That failure to invest in content moderation in African countries before rolling out the business practice was, to me, a red flag about this whole tension—you know, what Katherine was talking about. When the capitalist interest, the profit-making interest, supersedes the community interest, the well-being of the people, the well-being of the society, can we really call it a public square in the tradition of the word? And that’s why I think that it mimics the outlines of the public square and it takes certain aspects of it. It’s maybe not a one-to-one substitution in the way that the people who make the site want…

ERICA KOCHI: Yeah. One of the things that, you know, all of your points are leading me to is really that when bad things happen—and, you know, not just online, but when they jump into the real world, the analogue world—there’s very, very little opportunity for redress. And then there’s also very little accountability, you know, with these in-real-life implications of online public discourse.

So, I mean, one of the things that bothers me personally is that the sort of structures that we have for redress in a justice system, or even within our governments with elected officials, just don’t exist in this online global public square. I’d love your thoughts on, well, A, how it doesn’t exist, and any examples that you can bring to the table here. But also, what do you think should be in place beyond some of the regulation that’s starting to come out of the EU and, you know, other places in the world?

Katherine, shall we start with you again?

KATHERINE MAHER: I was worried you were going to call on me first. No, I think that—it’s something that Nanjala said that I think is really important, which is this idea that we’re really not talking about communities, and this parallel between content moderation and community moderation, and what a community actually is. And I think that—I’m just going to go out there and I’m going to stand up for the platforms just a tiny bit, which is to say that I don’t think we know what we want them to do. And I’m not saying that human rights advocates don’t have very specific understandings of how we defend human rights, or that journalists don’t have very specific understandings of what they would like to see, but as a whole, as societies, as even just the three of us on this panel, I think we would struggle to come up with a set of sort of solution-oriented asks of platforms that would feel comprehensive and meaningful to address the diverse set of challenges that we have.

And the reason I am saying this is because when we talk about sort of the norms of the public square, each of our public squares, the proverbial public square existed in a community, to Nanjala’s point, and that community had its own norms. It had its own norms about power sharing, you know, space sharing around voice, around what was appropriate language, what was incitement. Those norms did not work perfectly. In the offline space of governance, they still do not work perfectly. We’ve had lots of conversations probably throughout this week about the real challenges and threats that exist within the governance space of democracies—robust democracies today around the world, let alone these conversations that are happening in the online space. And so although they were imperfect, they were community-based practices and norms that became the basis for all of our laws, our, you know, legislative bodies, the ways that we think about jurisprudence and precedent.

The internet has existed for the thirty years that we’ve been on the Web, but the network at the scale of three billion people has really only existed for the last few years that we’ve all been connected. And I think we struggle to be able to articulate what kind of community norms we want at that scale. And so I’m a big advocate of going smaller on these issues and trying to scale back to the size of community-based standards and community-based norms.

I think that kind of gets a little bit to what should be in place—not so much around redress, although I do think that there are some interesting questions there, but really around community-based standards and norms. And I think that platforms have not done a very good job of spending, and internalizing into their own cost structures, what it means to draw out what the community expectations and community norms are for a lot of these platforms. Instead, they try to apply sort of one-size-fits-all solutions. You know, and I understand that many of those are actually truly and meaningfully grounded in the human rights space, but they tend to break down when they are all subject to individual tests, such as the way that the Facebook Oversight Board seems to be moving around individual tests of rights rather than understanding how these apply at the scale of the network and what sort of precedent it sets.

So that’s sort of what I would add to the conversation. And again, I’m seeing Maria nodding—but I’m also seeing Nanjala with a—with maybe a little bit of disagreement, and so I think that that’s a healthy place for us to be. So I’d love to hear from both of them.

ERICA KOCHI: OK. Nanjala, you want to jump in here? And then we’ll go to you, Maria.

NANJALA NYABOLA: It’s not so much a disagreement as it is that I do think there is something to almost breaking up our idea of social media—to stop thinking of it at a global level and start thinking about it at a smaller level. It doesn’t necessarily have to be national. It doesn’t have to be tied to national boundaries. But certainly disaggregate—you know, going back to the antitrust laws of the Great Depression period, and sort of moving away from that Standard Oil model of let’s go take over the whole world, and starting to think small—I think there’s something to that. I think that’s what’s starting to happen, but it isn’t coming organically. It’s coming because a lot of countries are starting to threaten nationalization: you have to have a local office; there has to be someone in this country that we can sue for all the things that you do that we’re upset about… I think that the companies are trying to get ahead of that energy because it’s definitely coming down the pike, and certainly in a lot of African countries it’s definitely coming down the pike.

I do think, though, that there is precedent. And I think that one of the things that the companies are going to have to sift through is that they will have to scale back their profitability in the interest of securing the public good. Investing in well-resourced content moderation, for example, will require having well-resourced content moderation in every single country that you operate in. That’s a tremendous amount of money. But what’s happening right now is that the costs are being externalized as non-monetary costs, so they are not being measured, but we’re still living with them. And so what looks like a savings is actually a set of non-monetary costs that are not being put into the accounting.

And so thinking about it—I mean, I know in the environmental movement this is something that’s happened, where people are starting to think more about externalities, about, you know, what climate change means. Even if you made XYZ money and you put this many tons of CO2 in the air, how much is that CO2 costing us? And it doesn’t have to be in financial terms. I think that’s maybe something that we need to start thinking about. Is it a question of—I know Twitter started doing this a while back but has not been as vocal about it—making more robust the idea of accounting for the non-monetary costs of having these things?

And from a philosophical perspective, I’ll tell you one thing that I struggle with: maybe we’re not supposed to be communicating as much as we are, right? And I really struggle with articulating what that means. It’s really at this point just an instinct. But again, when you have to pay for a text message—when you have to pay fifteen cents for a text message—then you try to make sure that whatever is in that text message is really important, right? It’s something that is worth that fifteen cents, because that adds up. You send ten text messages a day, you spend $1.50. Because there is no financial cost for a lot of the communication that we’re doing that we are conscious of, I think that we are absorbing those costs in other ways. And so we think that it’s not costing us anything to put this message up and to send out this tweet, but as Maria was talking about, with the attention economy, the surveillance economy, there are all these other costs that need to be accounted for. And so it is a question of getting better, as both a digital rights movement and the people who are working with the platforms, at accounting for the non-monetary costs of what it is that we’re building and what it is that we participate in.

So that’s kind of where I would start: with thinking about effective regulation and sort of curbing, I guess, the excesses. What would that look like?

ERICA KOCHI: Yeah. Maria, on to you.

MARIA RESSA: I’m going to take—I’m going to take bits and pieces, I think, you know, of both Katherine and Nanjala.

Look, Katherine, you mentioned the Facebook Oversight Board, right, and it’s set up like it’s a supreme court, and it is trying to put a system of justice in place at a glacial speed while the platform it’s attempting to regulate moves at the speed of light. So you’re talking about, like, the birthrate of humans versus Drosophila fruit flies. It doesn’t work.

And so, you know, I sit on the Real Facebook Oversight Board. We try to put pressure so that they actually do—I guess the basic assumption is this: Why is it OK to move at a glacial speed to deal with the problems that you created at lightning speed, at warp speed? That’s assumption number one.

Assumption number two is: OK, so you have billions of people. So why would you not be held accountable for what happens to all those people? Why isn’t there anything like a consumer protection board for all the users like me who are abused—the women, the LGBTQ+ community, the women journalists who are attacked? In the Philippines, women are attacked at least ten times more than men. It brings out sexism, misogyny. It brings out the worst of human nature. So that’s the other one.

I think the third one is it goes back to—and this is because I come from mass media. We could only expand at the rate of what we could be accountable for. It took me a year to set up the Jakarta bureau for CNN because I had to go learn the culture, the language, and the laws. And when I set up that bureau in Indonesia, I was accountable for it. So why would tech be any different from that?

And that leads right to, like, I guess this last part, which is: If you can’t handle the technology, why would you be making $29 billion in net income?

I sound like I’m really against the tech platforms. Let me say I am the biggest fan of technology. Rappler remains a Facebook partner in the Philippines. We’re one of only two Filipino fact-checking partners. I believe in the tech. We created Rappler in 2012 precisely because I was hoping technology would help jumpstart development in countries like ours. But the excesses and the greed of the people who are running it have now created—have now destroyed democracy, and again, this is documented.

So I guess I’ll end with just that one thing. I think it was E.O. Wilson, a biologist, who said this, right: that the greatest crisis we’re facing is that we have Paleolithic emotions (this is where we’re being manipulated, insidiously so, without our knowledge), medieval institutions that can’t regulate it, and god-like technology.

So I go back—and I’ll agree with you in saying that maybe we should go back to smaller, but how do you get there? I don’t think it’s that hard, not if you go right back to the principles that have always been there in the Universal Declaration of Human Rights. A lot of these things happened after news organizations lost the gatekeeping powers to the public sphere. And I guess this is where we go to the public sphere, right? The public sphere, if you’re a news organization, is governed by a set of standards and ethics, the mission of journalism, that holds you accountable. And I guess that’s what we’re missing. The new gatekeepers get the money, get the power, but they’re not accountable at any level, and we are all suffering for that.

ERICA KOCHI: Absolutely. One of the things that, you know, I’ve been thinking about a lot over the past few years is, well, how do you create that accountability? And, you know, obviously regulation, you know, moves at a glacial pace, so I don’t think that alone is the right approach. But one of the things that’s actually inspired me is other industries and how they’ve done it. Looking at, for example, the food safety industry, which I think has done some really interesting things: you have everyone from the farmer that’s, you know, raising the cow to the meat-processing plant, and when something goes wrong, how do you really delineate where accountability lies? And obviously, that’s, you know, a very different sector, but I think that there are a lot of other sectors that we can draw inspiration from.

Do you see any other sectors or any places where you think we can really start thinking about how you put that accountability into action? OK, Maria—

MARIA RESSA: Katherine, I want to hear from you. You know why—the other thing we haven’t mentioned, of course, is Wikipedia, right? Like, part of what we have lost is the excitement that we had before the Arab Spring became the Arab Winter—right?—when it really was empowering, when technology empowered these voices. And I think Wikipedia is one of the last holdouts—they’re holding out for the wisdom of crowds. But when you lose independence of thought, when you’re able to be pummeled and you can’t make decisions anymore, you lose the wisdom of crowds and it becomes mob rule. So I guess Wikipedia is still holding out, right; I don’t know if you agree with that, but I’d love to hear from Katherine and then I’ll dump some thoughts also. But I’ll shut up.

ERICA KOCHI: Yeah, absolutely, Katherine. But then I will come back to your specific recommendations on accountability, so think over that. Katherine.

KATHERINE MAHER: I think one of the things that makes Wikipedia so incredibly different relative to these other platforms is the decentralization of the interpretation of policy and the decentralization of accountability and decision making. And what I mean by that, for those who don’t know how Wikipedia works, is that it is a volunteer platform that is edited by people around the world. It exists in about three hundred languages; some of the largest languages, like English, have six million articles and hundreds of thousands of contributors, while the smaller languages may have just a few dozen. That actually speaks also to the quality model: the more people participate, the higher quality it is. Particularly, the more ideological diversity or life-experience diversity that you have, the more comprehensive and the more robust it is—and then, specifically at the article level, the more neutral or balanced the article actually is, because you end up with people really having to negotiate around complex topics that people may have genuine and legitimate differing viewpoints about, including historical incidents or understandings of culture or religion in which there is no one accuracy.

One of the things that I think makes Wikipedia interesting, potentially, and one of the reasons why I think a lot about some of the challenges of scale, is that it doesn’t scale. There’s no centralized decision making or authority on the fifty-five million articles that exist. There are a series of principles around what kind of purpose we want the platform to have that enable people to interpret them in the local context—and I don’t just mean local in a way that says geographical; localized could be in the context of medical information; it could be in the context of information about literature—interpret them in the local context and negotiate what the norms are within the community that cares about that context and is responsive to that context. And because all of this happens in public and anyone can participate, there is a built-in accountability mechanism around the decisions that Wikipedians make, because anyone can become a Wikipedian and participate in those decisions. And so the public has a direct line of accountability, even if they choose never to participate in it. I think it’s really baked into the norms that have emerged for the platform over time and has become a sort of closed-loop system that enables every decision to be scrutinized and then inform the next set of decisions.

I do want to say, though, because I think it’s really important to say: It is imperfect. There are lots of issues. I don’t want to make this a Wikipedia advertisement, but there are lots of issues with that system relative to who participates, who has power. You know, we see a lot of exclusion of people from the global south, we see a lot of exclusion of women, and, of course, the platform is working actively to change that. But, by and large, I tend to agree with Maria. I think that it is better as a model than many models, and I think there’s a lot for other platforms potentially to be able to learn.

NANJALA NYABOLA: But of course a big reason why Wikipedia is successful is that it’s mostly dependent on people putting money in, the non-profit-making model. And this aggressive pursuit of profit is part of what colors the other platforms’ decisions… you’re always cutting corners: instead of doing content moderation, you do content moderation lite; instead of, you know, investing in understanding local culture and local history, you skip over that and don’t invest in understanding the communities that you work in.

I mean, the obvious answer or starting point for me is, you know, in the early twentieth century, media, newspapers, basically meant that people would be in their houses typing out flyers and handing them out, you know, to whoever would take them. There was no central organization. There was no central regulation. People would just write whatever they wanted. And libel laws, defamation laws, have their histories in the early twentieth century, and the nineteenth century as well, because of that, because you could literally say whatever you wanted, put it in a flyer, and hand it out. And it was a combined effort of self-regulation—so you have media guilds and unions coming up and establishing standards for themselves—but also public regulation, you know, defining the hard limits. What is libel? What is defamation? What is slander? What is, you know, abuse? What is incitement to genocide? Characterizing and describing that. And so a combination of that and self-regulation created a context in which the norms could become part of what we think is normal for a newspaper, right? We think it’s normal for a newspaper to have fact-checking. We think it’s normal for a newspaper to have, you know, all of this backstopping that happens at the back of news production. But it actually wasn’t integral to the system; it was something that was built.

And you know, to go back to the idea that, you know, social—this is social media 4.0. Social media 1.0 you think about Friendster, you think about all of these old platforms. Then you had MySpace. You had—I think the only surviving site from social media 2.0 is LinkedIn. Everything else has gone by the wayside. And now we have 5.0, sort of, you know, beating down the door. It’s TikTok, you know, all of these new sites coming up and bringing another different way of thinking about things.

And so in terms of thinking about a model, I think it’s still very nascent and there’s still an opportunity for us to define it through a combination of, you know, conversations like this—conversations with people who are active on the platforms, who are enduring the worst of the platforms and who are seeing what the ugliness could look like, and people who are benefiting from the platforms—meaning dialogue, but also having some overarching hard limits established: these are things that will be unacceptable regardless of what happens. The ability to entertain the political processes, political discourses… that’s a hard limit. Abuse, sexual assault, you know, incitement to violence: these are hard limits.

And I think—sort of just to add on—one of the challenges is this idea of free speech. Because these are Silicon Valley companies that are rooted in American legal practice, there’s this idea of the absolutism of free speech, and the absolutism of free speech is not even a universal value. A lot of us live in countries where there are actually many limits to speech, because we’ve seen what the excesses are. We have lived through genocide, and we are honest about having lived through genocide, about how we endured all of these excesses. And so, you know, there are things that you can’t say in Kenya because of what happened in 2007. There are things that you can’t say in Rwanda in a newspaper, in a magazine, whatever, because of the 1994 genocide. There are things that you can’t say in Germany because of the Holocaust. And I think, you know, having an honest conversation about the hard limits to what should be permissible on these platforms is not [an] anti-free speech process. It’s a recognition that the American approach is unique to American history, and actually the rest of the world has always had a different way of thinking about this—you can’t just say whatever you want in the newspaper, because of certain histories that are attached to incitement and public discourse.

ERICA KOCHI: Yeah, absolutely. I think, you know, balancing of various rights is, obviously, key in this space.

I’m a bit conscious of time. We only have ten minutes left, so I really want to get to all of you about what you think is the most important step that we need to be taking, and who needs to be taking these steps specifically. Maria, I know you’ve done a lot of work out of the Infodemics Working Group. I’d love for you to talk about some of the recommendations that came out of there. And then I’ll pass to you, Katherine, and then Nanjala.

MARIA RESSA: OK, sure. So I look at this as three main pillars that will kind of restore some sense of a shared reality, because this virus of lies that I talk about infects real people and changes the way they look at the world, playing on cognitive biases. I guess I didn’t expect that whole behavioral economics side, the way we are being manipulated. But once you’re infected, it’s just like getting the coronavirus. You’re changed, and you could literally—well, OK, let me not go there. So what is the solution?

In November last year, a group of us in the Forum on Information and Democracy formed an Infodemics Working Group. I co-chaired it along with former European Parliament member Marietje Schaake. We came up with, like, a dozen principles. Katherine talked about principles, right? A pet peeve of mine when we talk about content moderation is the kind of atomization into meaninglessness of how you moderate the public sphere. You don’t moderate the public sphere by deciding on how much—how much breast do you show. You moderate it through principles, something like the Universal Declaration of Human Rights. You agree on standards and ethics, right?

And then—anyway, so let me go back. Twelve kind of systemic solutions. And they are aligned with, like, not manipulating us insidiously, not taking machine learning and taking every post and building a model of each of us so that they can then find our most vulnerable point—so they can sell it, right? It’s kind of like—I think it was Tristan Harris who said this—it’s like you went to a psychologist, and then that psychologist just said, hey, I’ve got this weak point of Maria’s, and I’m going to sell it to you. How much will you give me for it? That’s what’s happening to each of us, each of the users on the platforms. So, those twelve systemic solutions. And then we came up with 250 others, principles that are easily used.

I think beyond that—that’s just the tech. The second is you have to strengthen independent journalism because as the business model crumbled the people who were challenging power, who were holding power to account, have come under more pressure than ever. It’s been a decade of decline all around the world. Jimmy Lai and Apple Daily just closed shop in Hong Kong. Jimmy Lai is in jail, right? And then finally, the third is we talked a little bit in the panel about community. And you have some of the definitions. But right now the community is global and it is truly sick. And that’s where I appeal—I go back.

All of these things require time. The EU’s Democracy Action Plan requires time. Section 230 requires time. It is back to the social media platforms. This is where I continue to appeal for enlightened self-interest. Anyway, that’s a long-winded answer. Let me kick it back to you—to you guys.

ERICA KOCHI: Yeah, Katherine, do you want to jump in here very quickly? And then Nanjala. And then I have a couple questions from the audience.

KATHERINE MAHER: Oh, OK. Very quickly then. I would say improve legislation regarding the transparency of advertising—not just on social media, but on ad networks in particular. I think that that’s a real issue, and sort of the dark advertising Web needs better accountability.

Number two is localized, community-driven standards. It’s a hard thing to apply, but I would say building on this idea that we’ve talked about—that we’re not one community, we have different communities. And I would say autonomy for the implementation of those community-driven standards in the localized context, whether that be linguistic, geographic, or otherwise, but with accountability back to the central entities. So people need to be able to make decisions more quickly, with more contextual understanding, but then there also need to be sort of chains of accountability if those decisions go wrong.

And then, finally, I would say clearer escalation pathways and case tracking. We talked about the glacial pace of appeals at places like the Facebook Oversight Board. People need to understand where things are in the pathway, not just for the high-profile cases but for everyone.

Those are just a few.

ERICA KOCHI: Mmm hmm.

Nanjala, to you.

NANJALA NYABOLA: I think I would just recommend one thing that is over and above what has already been discussed, which is: if these are going to be global platforms, then the conversations and regulation have to be global also. And we have to have people from other parts of the world in the conversation, because, as I said in the beginning, a lot of the things that the West is dealing with now are things that have been refined, things that have been perfected, in countries that have a much weaker legislative regime before they’re rolled out in the US. Like I said, think of Kenya 2013, 2017: same practice. Nigeria 2015. India 2011. All of these things are things that we have had experience with, but we’re not in the conversation when it comes to deciding what the regulations look like…

ERICA KOCHI: OK. Thank you very much.

So I have a couple questions from the audience…. I’ll go to Katherine first.

Katherine, an audience member had a question on your comment on going small. He—or she—would like to know what your thoughts are about building alternatives to the major platforms to restore the more local public squares that existed not so long ago, before Facebook.

KATHERINE MAHER: I mean, I think we’re already seeing alternatives to major public platforms. I think we—particularly people of my generation—need to be thoughtful when we’re talking about what social media is. You know, we’re too frequently just talking about legacy social media. And in reality, that’s not necessarily the behavioral pattern of a new generation. I will say, I’m not even on one of the larger social media platforms that we continuously talk about. Like, I’ve got an account, but I never use it.

And so when we talk about sort of the smaller platforms, there are spaces that exist in parallel to these large platforms. There are community-driven spaces. There are ways in which people are building their own community and messaging sites. There are ways in which these newer emergent technologies, like TikTok, are becoming the locus of conversations on critical issues for distributed communities. So I don’t think we want to over-rotate on the what-are-we-going-to-do-about-Facebook question, because I think we’re going to miss a lot of the ways in which emergent behaviors of digital connectivity are sort of coming along.

But what I would say is, like, are small social media platforms going to come along and save the day from these large legacy ones? I’m going to be a little bit skeptical, because I think that it’s very hard to take on the network effects. There are conversations around interoperability, data portability: can you actually have sort of the networks locally on the phone as a way of having ownership over your own network graph? I think these are interesting, but I’ve yet to see anything really break through at this point. So by all means, continue to experiment and innovate. But I think it’s less about trying to have a Facebook killer and more about what are the ways in which we meet some of the needs that this omnibus product is trying to serve, so that we’re not trying to do absolutely everything on one platform.

ERICA KOCHI: Yeah. I think, Nanjala, perhaps we can go to you just for the final minute on making sure businesses track non-monetary costs.

NANJALA NYABOLA: Yeah. I think that, like I said, the environmental movement has been very good about this in thinking about environmental audits in ways that are not just about the money. When we think about offsetting and all of these other practices, it’s about translating some of these non-monetary costs into language that can be understood even if the restitution/resolution isn’t necessarily going to come from money. So, first of all, it would be a question about identifying harms and spending a lot of time systematically thinking about what are the harms, what are the opportunities, what are the gains and the losses that we’re making? How do we quantify these?

And a lot of people think about—there’s already a lot of political science research on, you know, the costs of civil war: exit, voice, loyalty. Like, what are some of these knock-on effects? How can we translate them in a way that, even if it’s not money, makes sense for the people who are doing business and people who are in industry?

And so, yeah, I would say just, you know, to wrap up the conversation, looking at what the environmental movement is doing in terms of thinking about externalities is a great way to start thinking about the externalities of social networking and all of these platforms that we’ve built.

ERICA KOCHI: Yeah. We are just at time now. I’m sorry to cut you off like that, Nanjala. But I wanted to thank all of you so much for your time and your insights, and for, I think, what is a very, very important conversation. So thank you so much.

Image: A woman uses her phone as security forces members look on in Santiago, Chile on December 16, 2019. Photo via REUTERS/Ivan Alvarado.