The international community must protect women politicians from abuse online. Here’s how.

Jun 7, 2023

Activists and experts assemble in Costa Rica to protect human rights in the digital age

By Digital Forensic Research Lab

Our Digital Forensic Research Lab is convening top tech thinkers and human-rights defenders at RightsCon to collaborate on an agenda for advancing rights globally.

Event transcript

Uncorrected transcript: Check against delivery

Speakers

Tracy Chou
Chief Executive Officer, Block Party

Julie Inman Grant
eSafety Commissioner, Australian Government

Neema Lugangira
Member of Parliament, Tanzania

Fernanda Martins
Director, Internet Lab

Moira Whelan
Director, Democracy and Technology, National Democratic Institute

MOIRA WHELAN: Hi, everybody, and thanks for joining us [for] a conversation about women’s political participation and the consequences of harassment. And before we get started today and I introduce our fantastic panelists, I just wanted to express my thanks to Access Now but especially also to DFRLab, who is cosponsoring this panel in particular. And what we’re going to do today is we’re going to walk through a short introduction, I’ll open the conversation to our participants, and then we’re happy to take your questions online.

So just to get us started, I first wanted to acknowledge that this panel is really a representation of a lot of the incredible work that’s been going on in our community for a really long time. And I would point to organizations that we’ve worked with such as DanChurchAid, Internews, Pollicy, and many, many others. Here at RightsCon, there are more than thirty sessions happening to address these issues of online violence against women in politics.

And you know, so first, acknowledging that others are doing the work. And then, saying that, an expectation we now have, for ourselves and for the organizations we work with, is that if we’re doing this work, we face that harassment and that abuse as a community and as an organization, and that includes the organizations that have helped organize this panel.

So first I want to say a little bit about NDI and how we came to this work. NDI is a democracy organization that trains women around the world to help them run for office and prepare for their lives in civil society and the public sphere. And this issue has become blinking red for us. The number of women who are self-censoring, who are pulling out of politics, who are deciding on another path is probably the biggest threat to democracy that we face today.

So we really started down the path of using our traditional models of working on the information space and bringing actors together to address this issue. But we also believe it’s a solvable problem, and I want to note that part of what we’re talking about today, and the reason we’ve talked about building the community we want to build with our guests, is that we want to talk about solutions but also some of the setbacks.

So without further ado, our panelists are Julie Inman Grant, who is the eSafety commissioner of Australia; and also Tracy Chou, who is the founder of Block Party and also an entrepreneur and is—we’re really thrilled to have her; as well as Fernanda Martins, who is the director at Internet Lab; and, finally, Neema Lugangira, who is a member of parliament from Tanzania.

So welcome, all of you. And, Neema, I want to start with you. The thing that we have noticed in doing this work is that it’s very rare for active female politicians to speak up, because, to use your words, this is not the agenda, right? You have other issues as a parliamentarian you want to address.

So I wonder if you can walk us through your personal experience of being so outspoken on the harassment you face and also what that’s done for your political experience.

NEEMA LUGANGIRA: Thank you very much. I, first, want to sincerely thank yourself, Moira, and NDI Tech for facilitating and enabling me to be here at RightsCon. So thank you, once again.

As you rightly said, being a female in politics, unfortunately, the more outspoken, popular, and well known you are, the more abuse you get. And oftentimes on social media platforms the abuse that we tend to get comes from a group of people who want to disqualify you, discredit you, belittle you.

So instead of focusing on the issue that you’re presenting, instead of focusing on their agenda, they shift the issue and start focusing on the gender. And, unfortunately, being a female politician, what they do is they sexualize the issue. So they will sexualize everything that you’ve presented. If it’s a photo, they’ll sexualize that. If you happen to take a photo with a guy in a meeting, they’ll probably change the background just to shift the narrative and to kind of belittle you and kind of shut you up.

And what that has done, unfortunately, in Africa—and I believe it’s probably the same even in the Global North—is that the number of women in politics, or female members of parliament, who are active online is very, very small.

For example, in Tanzania we have about 146 female MPs, and probably less than 5 percent are active on social media, using it for their work. And that has a huge detrimental effect because, one, it limits our own visibility, and if you’re not visible as a politician it limits your own reelection.

But it also takes us a step back. You know, organizations like NDI are making strides to increase the number of women in politics, but young women, aspiring women, see us women in politics, who are supposedly in power, being abused and helpless, and nobody comes to the defense of women in politics.

Like, I’ve seen it over and over again: when a female in politics is being abused, nobody comes to her defense. Actually, more people join the mob attack. It’s almost as if it comes with the territory.

And just to sum up: I decided that since we’re a group that nobody speaks for, I’m going to speak for members of parliament. I’m going to speak for women in politics. And as a result of that, yes, it brings about more abuse, but some of us have to go through it so that we can address this issue. I want to see more women in politics visible, and to strengthen their visibility, because we are doing a lot of incredible work and it needs to be seen.

MOIRA WHELAN: I couldn’t agree with you more. And I want to shift quickly to you, Julie, because, you know, there is that issue of full participation, and it’s something you’ve really focused on at eSafety in Australia: moving us from the research that we’ve worked on to the solutions.

I wonder if you can walk everyone here through this example of addressing in Australia some of the concerns that Neema has raised.

JULIE INMAN GRANT: [For those] who don’t know what an eSafety commissioner is, we’re the first national independent regulator for online harms and online safety. We were established in 2015, and there is an Online Safety Act that enables me to take action when Australians report all forms of abuse to social media platforms, gaming sites, dating sites, you name it, and it isn’t taken down. So we serve as that safety net to advocate on behalf of our citizens when things go wrong online. We know tons fall through the cracks. And so we can bridge that inherent power imbalance that exists.

So I deal with everything from child sexual exploitation to image-based abuse, the non-consensual sharing of intimate images and videos. And I can say that recently we’ve been getting reports of deepfake videos of female politicians and other prominent women. We have a cyberbullying scheme for youth, and an adult cyber abuse scheme, which is at a much higher threshold to make sure that freedom of expression isn’t undermined. But we all realize here that targeted misogynistic abuse is designed to silence voices. And, as you say, women will self-censor.

Now, beyond these laws, we focus on prevention, in the first instance; protection, through these regulatory schemes; and then what I call proactive change. So part of that has to do with putting responsibility back on the platforms themselves through initiatives like Safety by Design. You know, AI is a perfect use case for how the collective brilliance of the technology industry should be used to tackle this at scale and prevent hateful, misogynistic, and homophobic content from being shared.

So on the prevention side, well, first of all, I should say all of these forms of abuse are gendered. Ninety-six percent of the child sexual abuse material we look at—which happens, sorry to say, at toddler age—is of girls. Eighty-five percent of our image-based abuse reports are from women and girls. And then when you get to the pointy end, we know that women experiencing domestic and family violence are also experiencing an extension of that through technology-facilitated abuse in 99.3 percent of cases.

So 89 percent of our adult cyber abuse cases come from women, many of whom are being cyber-stalked and doxed as [an] extension of domestic and family violence, or targeted by perpetrators who specifically go after women. And as Neema said, the way that online abuse manifests against women is different from the way it does against men. It’s sexualized. It’s violent. It talks about rape, fertility, supposed virtue, and appearance. It just manifests in very, very different ways. So I’ve had so many politicians tell me their male counterparts will say: Well, just toughen up, sweetheart, this is politics. Well, it is different.

So I actually tried to start a program called Women in the Spotlight to provide social media self-defense to women politicians, to journalists, to anyone in the public eye. And I was told by a previous government, we can’t fund that; that’s protecting privileged women. So I set up the program anyway and started to do the training. And we can’t keep up with demand for social media self-defense training. And I don’t need to tell any of you that if being a woman receiving misogynistic abuse isn’t enough, if you have a disability, identify as LGBTQI+, or are from a diverse background, that kind of abuse is compounded.

So again, I think we’ll continue to persevere. We need these prevention programs. We also know that the average professional woman in Australia is receiving online abuse—one in three women. And 25 percent of them won’t take a job opportunity or a promotion if it requires them to be online. So we’re starting to see the normalization of this kind of abuse across the population. And that’s why I’m trying to use my powers much more strongly, to send a message that you cannot abuse people with total impunity. This also involves penalties and fines for perpetrators, as well as for the platforms themselves that refuse to remove content. We always try to work informally first, but I have used my formal powers. And if the platforms don’t comply, I can take them to court and fine them as well.

MOIRA WHELAN: Well, we are going to wing our way to Silicon Valley when we get to Tracy, but I wanted to stop in Brazil first and give Fernanda a chance. Because I think one of the things you said, Julie, was really about the intersectional issues that are linked to this. But also, the successes that you’ve had as civil society at Internet Lab: first having to prove to governments that this is a problem; second, getting them to pay attention and to work through the process. And I’m wondering if you can tell us a little bit about your involvement working with the government of Brazil.

FERNANDA MARTINS: Yeah, sure. Thank you, Moira, for this question. And thank you, DFRLab, for organizing it.

At Internet Lab, we have been working to improve the way that political gender-based violence is treated by governments, whichever government is in power at the moment. This moment is also different because we have a progressive government, but at the same time we have parliamentarians who are not defenders of human rights. So the context is our fragile democracy, and we have this challenge of understanding how we can contribute to this issue in Brazil.

So at this moment we have the fake news bill trying to address the problems related to platforms, but it is important to mention that the bill doesn’t have any mention of gender, any mention of the LGBTQIA+ community, and only a brief note about Brazil’s political violence law and racism law. It’s like we are running in parallel avenues; it’s not connected. So we are trying to talk to government, talk to the private sector, and understand how we can bring different social sectors together to address the problem. We had a law approved in 2021 addressing political violence, but its enforcement only started in the last election, and it was really weak. We need to expand the comprehension of the problem and not focus only on penal answers. We need education and other things in this context.

MOIRA WHELAN: Well, and I think that’s really important, especially as Julie was talking so much about the value of implementation, and needing to see that it’s not just legal frameworks that are going to get us there.

But all of you have talked about the platforms. All of you have talked about tech. And I want to turn to Tracy now because I do have to tell you a story. Tracy was with us when DFRLab hosted us in Brussels to really introduce this issue and to really put it on the center stage, literally. And we’re big fans of Block Party. But, Tracy, we have a different panel here today. So we were here celebrating the success of Block Party, but I think you should maybe tell us about the current status.

TRACY CHOU: Yes. So, hi, everyone. I’m Tracy. I’m the founder and CEO of Block Party. We build technology to fight harassment online and make the internet safe for everyone. Until last week, our flagship product was available on Twitter to combat harassment, and it is now sadly on hiatus thanks to platform changes.

Before we get to that, maybe some context. I started my career as an early engineer at social media companies that are now very big platforms—Facebook, Pinterest, and Quora—so I kind of understand how platforms are built and what their incentives are, not just at the high level for the companies but also for the individual people working at those companies.

And separately, I became an activist for diversity, equity, and inclusion in the tech industry, seeing how the people that are in the room really matter for the product that we’re building. That led to me getting a lot of harassment, and so I set out to solve that problem blending together the different parts of my experience…

So what we built on top of Twitter was something to solve my own problem: essentially a sort of spam folder where you can choose who you want to hear from. Everything you might not want to see gets filtered into that folder. You can review it later and take action later, and involve your community for help. And it works really well. Like, it was great for me.

Silicon Valley talks about “dogfooding” your own products, building things that you use yourself. And it was great for me to experience the mental health impact of not having to see all of that terrible stuff. It’s not just me; it’s a lot of other folks that we’ve already heard from on this panel: people working in politics, activists, academics. It’s been really sad to have to shut down, or, hopefully, just go on hiatus. We’re really hopeful that we can bring it back in some capacity in the future. We’re already seeing an outpouring from folks who were using our product on Twitter who are really sad to see it go. There are people tweeting every day now saying, like, I miss Block Party, literally every day, because I’m now getting all this harassment that is no longer filtered. So lots more to share on that. That is the current status.
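The mechanism Tracy describes, choosing who you want to hear from and routing everything else into a folder for optional later review, can be sketched in a few lines. This is an illustrative sketch only; the names and structure here (Mention, FilteredInbox, receive) are invented for the example and are not Block Party’s actual implementation.

```python
from dataclasses import dataclass, field


@dataclass
class Mention:
    author: str
    text: str


@dataclass
class FilteredInbox:
    # Accounts the user has chosen to hear from directly.
    allowed_authors: set
    main_feed: list = field(default_factory=list)
    review_folder: list = field(default_factory=list)

    def receive(self, mention: Mention) -> None:
        # Mentions from approved accounts surface immediately;
        # everything else is held out of sight for optional later review.
        if mention.author in self.allowed_authors:
            self.main_feed.append(mention)
        else:
            self.review_folder.append(mention)


inbox = FilteredInbox(allowed_authors={"colleague"})
inbox.receive(Mention("colleague", "Great point today."))
inbox.receive(Mention("troll_account", "abusive reply"))
# The abusive reply never reaches the main feed.
```

The point of the design is that the harmful content is not deleted, so the user can still review it with community help, but it never interrupts them by default.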

MOIRA WHELAN: Well, Tracy, I’m not going to—I’m going to stay with you for a second, because you should know that here in this room, I have heard repeatedly people saying they miss Block Party. We wish you could be here with us so that you could feel it directly, but we’re sending it to you virtually, because we need products like this. And I think the other aspect of this story that we would love if you could—if you could share it, if you can channel your rage into helping this room help you. You’re an entrepreneur. You’ve been building.

And yet, it should be very obvious to all of us, the business case for creating safe spaces for all people to fully participate online. And yet, your experience in Silicon Valley has been decidedly different. And I wonder if you can give us an insight into the experience of going through your fundraising rounds, and what happened when you walked into rooms with funders. Because I think people here need to know just how challenging the environment is from beginning to end. It’s not just about fixing the existing giant platforms; we have a fundamental challenge here.

TRACY CHOU: Yeah. First, I might back up a little bit and talk about the decision to create Block Party as a for-profit entity. That was because I believe that there is a business case, and also that going for a capitalist approach, building solutions for people who pay for the value that they’re getting, is the best way to align the incentives. Building really compelling technology, and being able to hire the best people in design, product, and engineering, also requires being able to pay those salaries. And so VC money, venture capital money, made the most sense to me as aligning all of those things together. There’s a big opportunity there, and we needed that initial capital to get going to build the technology.

So when I went out to raise, I felt like I had a pretty good shot at making this case. I’m a technical founder with deep experience at top companies. I have two engineering degrees from Stanford, where I graduated with top honors. This is a resume that Silicon Valley typically likes. I’m solving my own problem, which they also talk about as a great thing: if you know the problem intimately because you experience it, then you’re very motivated to solve it, and you know all the ins and outs of it. Again, usually something that’s very positive.

I did not have a good experience. There were a lot of people who were skeptical. You might imagine the typical demographic of VC: very white, very male. People were dubious that there was a market. So I was told that this was very niche, and also that it was already a solved problem, that it will be solved by machine learning, that the platforms are already addressing it, so, like, no issue anymore. I suspect some of this has to do with the lack of diversity in the VC industry: even though our products are for everyone, they do disproportionately serve women and people from marginalized communities, who are more targeted by abuse.

I think there’s also latent sexism in there: even the people who thought that there might be a market told me that they didn’t think that I could solve it, which is very frustrating. By comparison, I saw a number of men trying to tackle the same problem, with fewer credentials, building poor copycats of my product, raising exorbitant sums of money, in some cases ten times as much. I talked with some of these founders, and they would say things like, oh, well, it’s just because I used to work at Google, so, you know, I had the credibility. And I would just think to myself: Well, I worked at Google, and Facebook, and Pinterest, and Quora, and I also have engineering degrees. But I guess that doesn’t matter when I’m a woman.

So it was a very frustrating experience. I had to power through that, and ultimately I did raise money. I’m very glad that I was able to raise the seed round last year and can actually hire people to keep tackling these problems. But, to the point that Moira’s trying to draw out here, there are really systemic issues. If we want to be able to solve these problems, we also need the funding to do so. And when there are systemic biases in the funders and they don’t believe that there is a problem here, we’re going to have additional challenges in trying to create these solutions.

MOIRA WHELAN: Well, thank you for that, Tracy. And I have to say, when we talk about the thing we’ve all been told, to put on a thicker skin: really, does it get any thicker than Tracy’s, having walked through that?

And Julie, I want to talk about these systemic issues, right? We actually had a question come in on Slido, so please, all, participate. But it gets to the next question I wanted to ask you, which was around the barriers. Is one of the barriers freedom of expression: where we allow freedom of expression and what counts as abuse? I think, you know, you’re at the forefront of how we define the digital experience for people, and I wonder if you can talk a little bit about whether that is a barrier. And then my second part is: Why aren’t more countries doing what Australia’s doing, and how do we help them?

JULIE INMAN GRANT: No, that’s—thank you so much.

And I want to thank Tracy for her perseverance. I’ve been watching her journey from afar, all this stuff about funding and tech bros. This just shows you how gender inequality can manifest in so many different ways and at so many different levels, and we have to support technologists and entrepreneurs like Tracy in creating and building these incredible products. Because I can say, having worked at Microsoft and Twitter and Adobe, that not enough is being done inside, and safety is always an afterthought. I mean, even if you look at the patterns of layoffs happening at companies like Twitter and Meta and Microsoft, the trust and safety people go first.

But I guess one thing that we have learned is that we’ll never regulate or wrest our way out of online harms, given the speed, the scale, and the volume of content online. It’s always going to be a game of Whac-a-Mole, or Whac-a-Troll, if you will.

But we are also talking about fundamental human behavior and the societal ills that work underneath. And that was my experience at Twitter. I joined right after the Arab Spring with the belief that it was going to be a great leveler and people would be able to speak truth to power, but what I started to see very clearly is that women and those from marginalized communities were being silenced. So if you don’t draw a line about what constitutes online hate and online harm, and you allow it to fester, then you’re actually suppressing freedom of expression. So it’s a difficult line to tread.

In our parliament in Australia, online safety is very bipartisan. There are different approaches that, of course, different parties would want to take, but collectively the government decided that it wanted to draw a line: if online speech turns into online invective and is designed with a serious intent to harm, menace, or harass, we would draw a line and have an investigative process, with lots of transparency and accountability and multiple ways to challenge any decision I make. That’s the right thing. I’ve never been challenged on any decision. And we’re actually helping to remediate the harm to individuals.

So the good news is there are more countries coming onboard with online harms regulators. Ireland and Fiji both have online safety commissioners now. Of course, the online safety bill in the U.K. is pending, but that again is a much more polarized debate. Canada’s looking at this. I’m not sure where we’ll get to in the United States.

But we do want tech companies to start stepping up and protecting, empowering, and supporting people online. And that’s why five years ago we started the Safety by Design Initiative with industry, to ask them to start providing the tools to do just that—to think about the design process, the development and deployment process, the maintenance and refresh process, rather than retrofitting safety protections after the damage has been done. There will always be room for specialist tools like Block Party and [Privacy] Party, and we want to facilitate that—you know, let a thousand innovative flowers bloom so that we can all have safer, more positive experiences online.

We also have to keep an eye on the future. I’m very concerned about the power of generative AI and these large language models, you know, conversational models with the ability to manipulate young people, for extortion, for grooming, for deepfakes and misinformation and disinformation. We need to think about immersive technologies and the Metaverse.

When we’re in high-sensory, hyper-realistic environments, the online harassment we’re feeling now will be much more extreme and much more visceral. Think about haptics, and headsets that are picking up, you know, your retinal scans and flushing: what can that technology tell these major companies about you? Bring neurotechnology into that, and it’s a toxic mix.

If we don’t start putting the onus back on these technology companies to think about the risks and how their technologies can be misused, and have them doing this at the forefront, we’re never going to be able to get ahead of this.

So I do hope that more governments come on board. We’ve just established a global online safety regulators network with members who are independent statutory authorities who can demonstrate a track record on human rights and independence. But we’re also making room for observers for governments and other organizations that want to consider best practice in terms of setting up online harms regulators.

And with the DSA and other developments, I expect in the next five or ten years we will have a network of online harms regulators, and we in Australia will no longer feel like we’re at the head of the peloton going up [a mountain] with no one drafting behind us.

I think governments need to get together with the civil society sector and start to counter the stealth, the wealth, and the power of the technology industry. It’s the only way we’re going to get ahead of this.

MOIRA WHELAN: Well, I couldn’t agree more, and I should say I think we all want to live in Julie Inman Grant’s internet. You know, that’s definitely the space we want to go.

I’d also point to the global partnership that Australia, the United States, and others have founded to address online abuse, which NDI is very happy to support; we like the direction it’s going. But I think you made one really important point, and that was the really clear leadership of civil society in both identifying this issue and making it a global issue instead of a personal issue that each politician is facing.

And, Fernanda, you talked a little bit about the barriers you were facing, tech versus government, and I wonder if you can expand on that a little bit and tell us, like, where do you spend your time? How do you prioritize both of those needs, and who needs to change first? Who needs to change in what way? You know, this is what civil society does: you put yourself in the middle and you change it.

Please tell us a little bit more about how you’re doing that in Brazil.

FERNANDA MARTINS: Yeah, sure. It was great to hear from Julie, because I was thinking about similar things here. We are living through a shift in the concept of violence: a few years ago, when you talked to platforms about gender-based violence online, we were talking mainly about the dissemination of [non-consented materials].

And now, when we try to talk about political violence, it’s like we are testing the tension between freedom of expression and the limits that need to exist. So it’s interesting to note that when we look at the Brazilian context, in the legislative context we have some laws directed at domestic violence. And when we talk to platforms, they tell us about the necessity of protecting women in relation to these issues, such as violence from ex-partners, for example.

But it’s difficult. It is a challenge for governments, platforms, and everyone involved in this issue to make what we face in public fair. And not just for women; we are talking about marginalized groups in general. So our effort at this moment is to demonstrate that, OK, we demonstrated before that the violence exists, so now what can we do? When we talk about what needs to be excluded from platforms, or flagged as insulting content, we also have platforms whose policy is that public figures need to be more tolerant of attacks and insults, as on Meta’s platforms. So how can we educate society in general if the example on the platforms is that a woman candidate can be attacked, that women and the LGBTQI+ community can be attacked?

So we need to change the policies, and we need to strengthen our laws and their relationships globally. I think that is a little of what we’re trying to do.

MOIRA WHELAN: And I think it’s an excellent point. When you were working with NDI on our program to identify interventions, we identified twenty-six. We have colleagues at Web Foundation, at CG, and at other places who were coming up with theirs. We just did an inventory, and we have something like 450 identified opportunities for change.

But I want to turn to Neema, because it all comes back to politics, right? A lot of those changes weren’t just with platforms or governments; they were also within political parties, and in how media outlets cover it. Because even though we’re talking about these major global issues, as a politician this is still a very personal experience, and it’s hard to look at fixing the whole tech system when you’re going through this every day. I wonder if you can bring us a little closer to home: what do we need to do, and what are the barriers getting in the way of fixing it, in your own political experience?

NEEMA LUGANGIRA: Thank you. There are different moving blocks here. The first one is the social media platforms. And exactly as was just said, the expectation is that because we’re in politics we should have thick skin. But why should I have thick skin? Why should I tolerate abuse? If you’re not able to abuse me offline, why should you abuse me online? So on the challenges with the social media platforms: although Julie gave positive feedback on AI, at the same time artificial intelligence also has an issue.

In the sense that myself and my colleagues have reported abuse a number of times. You report the abuse, and it’s written in Kiswahili, for example, or the local language, and you try to even go further and translate it. But still, someone replies and says: This doesn’t violate our rules. And you’re thinking, what rules? This violates every kind of rule. So on the social media platforms, there’s a lot of work that needs to be done. And I think one of the things organizations like NDI can do is give us the opportunity, as women in politics, to be in the same room with the decisionmakers at the social media platforms. Because we need to tell them these issues, and they need to hear these issues from us. Not from someone else; they need to hear these issues from us.

Secondly, when it comes to media, in a lot of countries, unfortunately, the way the media do the gender profiling of women in politics also results in abuse. You may find that you’ve been in a meeting, and there were several pictures that a particular media outlet took of you. And they decide to use the picture that accidentally shows some part of the body. You know, maybe your dress slipped a little, so your shoulder is showing, or the cleavage is showing. And they would use that picture and say: Honorable Neema said such and such, such a brilliant thing. But because of the image they chose to use, it totally shifts the issue and it results in abuse. So sometimes the gender profiling is also an issue.

But the other thing that I’m currently working on in Tanzania is this: there are a lot of existing laws that address bits and pieces of online abuse, but none are specific to women in politics. So right now I’m pushing for a regulatory reform of our political parties act and election act, so that these two acts recognize online abuse as an offense. Because there are a number of offenses in the political parties act whereby—let’s say you’re a male vying for a position—if it can be proven that you’ve committed a gender-based violence offense, you can be taken off the candidates list.

So I’m pushing for online abuse against women in politics to be recognized in the same way, because a lot of the abuse that we get is also related to politics. That can at least deter a certain group of people, those who are aspiring to get into politics. And it can give us the power to start documenting this. So if you hear that, say, Gregory has been nominated for something, you can go and use that particular law and say: This person has been abusing women online. So I’m trying to push the political parties act and the election act to do so.

But at the same time, I set up an NGO called Omuka Hub. What we are trying to do is strengthen the online visibility of women in politics, and continentally we are trying to do that through the African Parliamentary Network on Internet Governance—again, to strengthen the visibility of women in politics. But to do that, we need support. Organizations that have funding, or that are talking about digital development and digital gender gaps, oftentimes don’t remember that there’s a group of women in politics. So I would like to stress that whenever we are having interventions, we should have funding allocated to support training and capacity building, exactly as Julie said. A lot of us are online, but we don’t know how to protect ourselves.

Very recently, I experienced the most horrific abuse through WhatsApp. I have experienced it a lot on other platforms, but it was the first time experiencing it on WhatsApp. These are people I know in a WhatsApp group. And it went on for, like, four days. I didn’t want to leave the group, because I didn’t want to be seen to be running away, but I didn’t want to be seeing them either. And you can’t help it, because they’re there. I actually got to learn that you can archive the group so you don’t see it. I just learned this, like, two weeks ago. So I can tell you.

But that was about three or four days of excruciating emotional rage. And you can’t do anything about it. You want to respond, but people are calling you saying: You know, you’re an MP. Don’t respond. So you’re keeping quiet. At the same time, you have to show up in Parliament and make your contributions. You have to show your face and do all of that. But why should I have to do that, you know?

MOIRA WHELAN: Absolutely. I want to back up to one thing. We have less than five minutes, and I want us to do two things. One, we got a question from online. And I think one of the things we really tried to do here was show the completely different environments that we’re dealing with, right? We have Australia, we have Brazil, we have Tanzania.

And we got a question asking: We’ve all cited social media regulation as an opportunity here, but that’s a challenge, right? How do you regulate social media from all the different perspectives and all the different countries, recognizing cultural differences and the responsibilities platforms have to localize? So I don’t know who wants to pick up on the regulation question. Maybe Julie and Neema, quickly.

And then after that, what we’re going to do is you have a captive audience. We have the entire digital rights community here. We need to send them out with something to do. We’re all good at that. We’re going to give them a job. So be thinking quickly about what your job is for everyone in this room. But, Neema, and then Julie, and then we can kind of go around.

NEEMA LUGANGIRA: So very quickly, in terms of the regulation, I think, one, we cannot avoid regulating social media; the issue is how to regulate, because we still want an open environment—you don’t want it to be too stringent. And we can learn from other countries who have done it. But the bottom line is, especially for Global South countries who don’t have the muscle that the Global North has: when Global North countries are negotiating with social media companies and getting into agreements, they should insert requirements that the companies behave the same way in Africa as they do in their bloc—in the EU or the US, Canada, Australia. We’re seeing the same thing with data protection. They are doing a great job in the EU and a horrible job in Africa.

MOIRA WHELAN: That’s a good point.

We’re going to flip it over really quick to Julie and then, Tracy, you’re up with your pitch. So go ahead, Julie, if you want to jump in on that one.

JULIE INMAN GRANT: I was just going to say, you know, the challenge is that laws are national and local and the internet is global.

Moira, you’re aware that we just issued a number of mandatory codes and are working on standards that will apply to eight different sectors of the technology industry. These have to do with illegal and harmful content, specifically child sexual abuse material and terrorist and violent extremist content. But it isn’t very easy for these global technology companies to quarantine their activities just to Australia, and that applies to safety as well. So the hope is that, with the European Commission deploying the Digital Services Act and possibly the AI Act, as we’ve seen with GDPR, there will be systemic changes and reforms.

But again, bringing different countries together with different needs, different levels of resourcing and funding, and even different political systems and approaches to regulation is going to be challenging. And one of the reasons we set up this global network is to prevent a splinternet, so that countries coming onboard can learn from what is best practice.

You know, we did not have a playbook. We had to write it as we went along, and we’re happy to share those learnings. And there will be others who will engage and will try something different that will be successful. So, again, it has to be a whole-of-society approach to tackling this.

MOIRA WHELAN: Absolutely.

So, Tracy, you have, ironically, about a tweet’s worth of time, because we have less than a minute and we’re going to try to get around. So Tracy, then Fernanda: What’s the pitch for everybody here?

TRACY CHOU: I actually want to comment on the regulation side, which is that regulation can also create the space for more solutions. So it doesn’t just have to be about the content or behaviors that are happening. The reason Block Party had to shut down our classic product on Twitter was that there was no openness in the APIs, the programming interfaces. What regulation can do here is require that openness so that we can have these consumer solutions. There’s a bill in the New York State Senate introduced this legislative session, S.6686, which introduces this concept. So I just want to put that pitch out there for what we can do on the regulation side.

The other one-line pitch is that Block Party has a new product called Privacy Party, which teaches people what they should do to be safe online and also helps automate it. So we have automated playbooks for you to lock down your social media settings. Check it out. Give us feedback. We want to keep building these tools to help keep people safe.

MOIRA WHELAN: Thank you so much, Tracy.

And Fernanda, last word.

FERNANDA MARTINS: I think the next step is to change the way that we are looking at indigenous people, women, Black people, and the LGBTQIA+ community, because we have been seen as a problem to solve, but we are part of the solution. So we need to be included. The digital rights field needs to include these people, these communities, to solve the problem together.

MOIRA WHELAN: Absolutely. And I would also say, none of us have mentioned it, but we need more male allies. So to any of you out there: we need men in all of these companies, in government, and in civil society joining us in this conversation. That’s a mantle I would take up.

So thank you all for joining us today. Have a great RightsCon. Really appreciate everyone being so brave to share your individual stories.

FERNANDA MARTINS: Thank you.

JULIE INMAN GRANT: Thank you.

TRACY CHOU: Thank you.

NEEMA LUGANGIRA: Thank you.


Image: A woman holds a placard during the Women March which is dedicated to International Women's Day in downtown Kyiv. Feminists, LGBT representatives, and human rights activists marched on International Women's Day (IWD) protesting against sexism and violence against women, demanding to the Ukrainian authorities to ratify the Istanbul Convention. (Photo by Pavlo Gonchar / SOPA Images/Sipa USA)