360/OS June 7, 2023

Activists and experts assemble in Costa Rica to protect human rights in the digital age

By Digital Forensic Research Lab

Will the world’s human-rights defenders be able to match the pace of quickly moving technological challenges arising from artificial intelligence, information wars, and more?

Rights activists, tech leaders, and other stakeholders are meeting at RightsCon Costa Rica on June 5-8 to collectively set an agenda for advancing human rights in this digital age.

Our experts at the Digital Forensic Research Lab are coordinating part of that effort, with a slate of RightsCon events as part of their 360/Open Summit: Around the World global programming. Below are highlights from the events at RightsCon, which cover digital frameworks in Africa, disinformation in Ukraine, online harassment of women globally, and more.


The latest from San José

Rethinking transparency reporting

Human rights must be central in the African Union’s Digital Transformation Strategy

Day two wraps with a warning about dangerous threats, from militant accelerationism to violence toward women

What’s behind today’s militant accelerationism?

The digital ecosystem’s impact on women’s political participation

Day one wraps with recommendations for Africa’s digital transformation, Venezuela’s digital connectivity, and an inclusionary web

What does a trustworthy web look like?

Mapping—and addressing—Venezuela’s information desert

Where open-source intelligence meets human-rights advocacy


Rethinking transparency reporting

On day three of RightsCon Costa Rica, Rose Jackson, director of the DFRLab’s Democracy & Tech Initiative, joined panelists Frederike Kaltheuner, director for technology and human rights at Human Rights Watch, and David Greene, civil liberties director at the Electronic Frontier Foundation, for a panel on rethinking transparency reporting. The discussion was moderated by Gemma Shields, online safety policy lead at the United Kingdom’s Office of Communications (Ofcom).

Shields opened the session by describing the Online Safety Bill currently making its way through the UK Parliament and Ofcom’s role in its implementation. The bill will give Ofcom new powers to test mandatory platform transparency reporting requirements. Through these efforts, Ofcom hopes that “good, effective meaningful transparency reporting might encourage proactive action from the platforms,” Shields explained.

During the discussion, the panelists examined what will be central to implementing the Online Safety Bill, including what effective transparency reporting looks like. Kaltheuner emphasized the complexity of defining meaningful transparency when use cases vary across end users, regulators, civil society, journalists, and academics. Greene underscored the importance of centering user needs in the conversation and the need to tailor reporting mandates to specific platforms.

Jackson noted that it is a strategic imperative for the UK government to consult experts from the global majority and to consider how regulations and norms could potentially be used for harm by non-democratic actors. As Jackson put it, “what happens in the most unprotected spaces is the beta test for what will show up in your backyard.” She also highlighted the importance of global civil society engaging with the UK Online Safety Bill and European transparency regulations, such as the Digital Services Act, because these policies are first movers in codifying more regulation, and future policies will refer back to these efforts.

Human rights must be central in the African Union’s Digital Transformation Strategy

The DFRLab gathered stakeholders from the policy-making, democracy, rights, and tech communities across the African continent to discuss the African Union’s Digital Transformation Strategy. Participants compared notes and identified opportunities for increasing the strategy’s human-rights focus as it approaches its mid-mandate review. Participants also agreed that trusted conveners, such as watchdog agencies within national governments, can play a critical facilitating role in ensuring effective communication between experts, users, and civil society on one hand and policymakers and elected officials on the other. Discussion of specific concerns with the strategy, and of recommendations to center human rights more fully within it, will continue in future gatherings.

Day two wraps with a warning about dangerous threats, from militant accelerationism to violence toward women

The DFRLab kicked off day two at RightsCon with a conversation on how Russian information operations, deployed ahead of the full-scale invasion of Ukraine, were used to build false justifications for the war, deny responsibility for the war of aggression, and mask Russia’s military build-up. The panel also highlighted two DFRLab reports, released in February 2023, that examine Russia’s justifications for the war and Russia’s attempts to undermine Ukraine’s resistance and support from the international community.

Read more: “Mapping the last decade of Russia’s disinformation and influence campaign in Ukraine” (Atlantic Council, June 8, 2023). Since its full-scale invasion of Ukraine, Russia has continued its information operations, targeting more than just Ukraine, say speakers at a RightsCon event hosted by the Digital Forensic Research Lab.

While at RightsCon, the DFRLab participated in a discussion on militant accelerationism, its impact on minority communities, and how bad actors can be held accountable. The event, hosted by the United Kingdom’s Office of Communications and Slovakia’s Council of Media Services, featured panelists who discussed how policy can hold all voices accountable, including those of the powerful. During the panel, DFRLab Research Fellow Meghan Conroy discussed how violent accelerationist narratives have become increasingly commonplace in some American ideologies and how extremist individuals and groups sympathetic to these narratives have been mobilized.

To close out the day, the DFRLab and the National Democratic Institute co-hosted a panel featuring global experts from civil society, government, and industry on how the threat of online violence and harassment has impacted women’s ability to participate in politics. As the panelists noted, online abuse is meant strictly to intimidate and silence those who want to get involved; it is therefore all the more important that these women, and those already established in public life, stand up and speak out, serving as role models and protecting diversity and equity in politics, tech, and beyond.

What’s behind today’s militant accelerationism?

By Meghan Conroy

While at RightsCon, I—a DFRLab research fellow and co-founder of the Accelerationism Research Consortium—joined an event hosted by the UK Office of Communications and Slovakia’s Council of Media Services on militant accelerationism.

My co-panelists and I provided an overview of militant accelerationism and an explanation of the marginalized groups that have been targets of militant accelerationist violence. I discussed accelerationist narratives that have not only permeated mainstream discourse but have also mobilized extremists to violence. Hannah Rose, research fellow and PhD candidate at King’s College London’s International Centre for the Study of Radicalization, zeroed in on the role of conspiracy theories in enabling the propagation of these extreme worldviews.

Stanislav Matějka, head of the Analytical Department at the Slovakian Council of Media Services, delved into the October 2022 attack in Bratislava. He flagged the role of larger, more mainstream platforms, as well as filesharing services, in enabling the spread of harmful content preceding the attack. Murtaza Shaikh, principal for illegal harms, hate, and terrorism at the UK Office of Communications, highlighted the office’s work on the May 2022 attack in Buffalo, New York. He noted that these attacks result, in part, from majority populations framing themselves as under threat from minority populations, and then taking up arms against those populations.

Attendees then broke into groups to discuss regulatory solutions and highlight obstacles that may stand in the way of those solutions’ implementation or effectiveness. Key takeaways included the following:

  • Powerful voices need to be held to account. Politicians, influencers, and large platforms have played an outsized role in enabling the mainstreaming and broad reach of these worldviews.
  • Bad actors will accuse platforms and regulators of censorship, regardless of the extent to which content is moderated. As noted above, they often position themselves as victims of oppression, and doing so in the context of content moderation policies is no different—even if the accusations are not rooted in reality.
  • Regulators must capitalize on existing expertise. A host of experts who monitor these actors, groups, and narratives across platforms, as well as their offline activities, can help regulators and platforms craft creative, adaptive, and effective policies to tackle the nebulous set of problems linked to militant accelerationism.

This conversation spurred some initial ideas geared toward generating more substantial discussion. Introducing understudied and misunderstood concepts like militant accelerationism to those unfamiliar with them is of the utmost importance for combatting online harms and their offline manifestations more effectively—especially those that have proven deadly.

Meghan Conroy is a US research fellow with the Atlantic Council’s Digital Forensic Research Lab.

The digital ecosystem’s impact on women’s political participation

By Abigail Wollam

The DFRLab and the National Democratic Institute (NDI) co-hosted a panel that brought together four global experts from civil society, government, and industry to discuss a shared and prevalent issue: the threat of digital violence and harassment that women face online, and the impact that it has on women’s participation in political life.

The panel was facilitated by Moira Whelan, director for democracy and technology at NDI; she opened the conversation by highlighting how critical these conversations are, outlining the threat to democracy posed by digital violence. She noted that as online harassment towards women becomes more prevalent, women are self-censoring and removing themselves from online spaces. “Targeted misogynistic abuse is designed to silence voices,” added panelist Julie Inman Grant, the eSafety commissioner of Australia.

Both Neema Lugangira (chairperson for the African Parliamentary Network on Internet Governance and member of parliament in Tanzania) and Tracy Chou (founder and chief executive officer of Block Party) spoke about their experiences with online harassment and how those experiences spurred their actions in the space. Lugangira found, through her experience as a female politician in Tanzania, that the more outspoken or visible a woman is, the more abuse she gets. She observed that women might be less inspired to participate in political life because they see the abuse other women face—and the lack of defense or support these women get from other people. “I decided that since we’re a group that nobody speaks for… I’m going to speak for women in politics,” said Lugangira.

Chou said that she faced online harassment when she became an activist for diversity, equity, and inclusion in the tech community. She wanted to address the problem that she was facing herself and founded Block Party, a company that builds tools to combat online harassment.  

Despite these challenges, the panelists discussed potential solutions and ways forward. Australia is leading by example with its eSafety commissioner and Online Safety Act, which provide Australians with an avenue through which to report online abuses and receive assistance. Fernanda Martins, director of InternetLab, discussed the need to change how marginalized communities that face gendered abuse are seen and talked about; instead of talking about the community as a problem, it’s important to see them as part of the solution and bring them into the discussions.

Abigail Wollam is an assistant director at the Atlantic Council’s DFRLab.

Read more: “The international community must protect women politicians from abuse online. Here’s how.” (Atlantic Council, June 8, 2023). At RightsCon, human-rights advocates and tech leaders who have faced harassment online detail their experiences—and ways the international community can support women moving forward.

Day one wraps with recommendations for Africa’s digital transformation, Venezuela’s digital connectivity, and an inclusionary web

This year at RightsCon Costa Rica, the DFRLab previewed its forthcoming Task Force for a Trustworthy Future Web report and gathered human-rights defenders and tech leaders to talk about digital frameworks in Africa, disinformation in Latin America and Ukraine, the impact online harassment has on women in political life, and what’s to come with the European Union’s Digital Services Act.

Read more: “The European Commission’s Rita Wezenbeek on what comes next in implementing the Digital Services Act and Digital Markets Act” (Atlantic Council, June 8, 2023). At a DFRLab RightsCon event, Wezenbeek spoke about the need to get everyone involved in the implementation of the DSA and DMA.

The programming kicked off on June 5 with the Digital Sherlocks training program in San José, which marked the first time the session was conducted in both English and Spanish. The workshop aimed to provide human-rights defenders with the tools and skills they need to build movements that are resilient to disinformation.  

On June 6, the programming opened with a meeting on centering human rights in the African Union’s Digital Transformation Strategy. The DFRLab gathered stakeholders from democracy, rights, and tech communities across the African continent to discuss the African Union’s Digital Transformation Strategy. Participants compared notes and identified opportunities for impact as the strategy approaches its mid-mandate review. 

Next, the DFRLab, Venezuela Inteligente, and Access Now hosted a session on strengthening Venezuela’s digital information ecosystem, a coalition-building meeting with twenty organizations. The discussion drew from a DFRLab analysis of Venezuela’s needs and capabilities related to the country’s media ecosystems and digital security, literacy, and connectivity. The speakers emphasized ways to serve vulnerable groups.

Following these discussions, the DFRLab participated in a dialogue previewing findings from the Task Force for a Trustworthy Future Web. The DFRLab’s Task Force is convening a broad cross-section of industry, civil-society, and government leaders to set a clear and action-oriented agenda for future online ecosystems. As the Task Force wraps up its report, members discussed one of the group’s major findings: the importance of inclusionary design in product, policy, and regulatory development. To close out the first day of DFRLab programming at RightsCon Costa Rica, the Task Force announced that it will launch its report in the coming weeks.

What does a trustworthy web look like?

By Jacqueline Malaret and Abigail Wollam

The DFRLab’s Task Force for a Trustworthy Future Web is charting a clear and action-oriented roadmap for future online ecosystems to protect users’ rights, support innovation, and center trust and safety principles. As the Task Force is wrapping up its report, members joined Task Force Director Kat Duffy to discuss one of the Task Force’s major findings—the importance of inclusionary design in product, policy, and regulatory development—on the first day of RightsCon Costa Rica.

In just eight weeks, Elon Musk took over Twitter, the cryptocurrency market crashed, ChatGPT launched, and major steps were made in the development of augmented reality and virtual reality, fundamentally shifting the landscape of how we engage with technology. Framing the panel, Duffy highlighted how not only has technology changed at a breakneck pace, but the development and professionalization of the trust and safety industry have unfolded rapidly in tandem, bringing risks, harms, and opportunities to make the digital world safer for all.

Read more: Task Force for a Trustworthy Future Web, which will chart a clear and action-oriented roadmap for future online ecosystems to protect users’ rights, support innovation, and center trust and safety principles.

The three panelists—Agustina del Campo, director of the Center for Studies on Freedom of Expression; Nighat Dad, executive director of the Digital Rights Foundation; and Victoire Rio, a digital-rights advocate—agreed that the biggest risk, which could yield the greatest harm, is shaping industry practices through a Western-centric lens, without allowing space for the global majority. Excluding populations from the conversation around tech only solidifies the mistakes of the past and risks creating a knowledge gap. Additionally, the conversation touched on the risk of losing sight of the role of government, entrenching self-regulation as an industry norm, and absolving both companies and the state of responsibility for harms that can occur because of the adoption of these technologies.

Where there is risk, there is also an opportunity to build safer and rights-respecting technologies. Panelists said that they found promise in the professionalization and organization of industry, which can create a space for dialogue and for civil society to engage and innovate in the field. They are also encouraged that more and more industry engagements are taking place within the structures of international law and universal human rights. The speakers were encouraged by new opportunities to shape regulation in a way that coalesces action around systemic and forward-looking solutions.

But how can industry, philanthropy, and civil society maximize these opportunities? There is an inherent need to support civil society that is already deeply engaged in this work and to help develop this field, particularly in the global majority. There is also a need to pursue research that can shift the narrative to incentivize investment in trust and safety teams and articulate a clear case for the existence of this work.

Jacqueline Malaret is an assistant director at the Atlantic Council’s DFRLab.

Abigail Wollam is an assistant director at the Atlantic Council’s DFRLab.

Mapping—and addressing—Venezuela’s information desert

By Iria Puyosa and Daniel Suárez Pérez

On June 6, the DFRLab, Venezuela Inteligente, and Access Now (which runs RightsCon) hosted a coalition-building meeting with twenty organizations that are currently working on strengthening Venezuela’s digital information ecosystem. The discussion was built on an analysis, conducted by the DFRLab, of the country’s media ecosystems and digital security, literacy, and connectivity; the speakers focused on ways to serve vulnerable groups such as grassroots activists, human-rights defenders, border populations, and populations in regions afflicted by irregular armed groups. 

Participants discussed the idea of developing a pilot project in an information desert that combines four dimensions: connectivity, relevant information, security, and literacy. They agreed that projects should combine technical solutions to increase access to connectivity and generate relevant information for communities, with a human-rights focus. In addition, projects should include a digital- and media-literacy component and continuous support for digital security.

Iria Puyosa is a senior research fellow at the Atlantic Council’s DFRLab.

Daniel Suárez Pérez is a research associate for Latin America at the Atlantic Council’s DFRLab.

Where open-source intelligence meets human-rights advocacy

By Ana Arriagada

On June 5, the DFRLab hosted a Digital Sherlocks workshop on strengthening human-rights advocacy through open-source intelligence (OSINT) and countering disinformation.

I co-led the workshop with DFRLab Associate Researchers Jean le Roux, Daniel Suárez Pérez, and Esteban Ponce de León.

In the session, attendees discussed the worrying rise of antidemocratic governments in Latin America—such as in Nicaragua and Guatemala—that are using open-source tools for digital surveillance and criminalizing the work of journalists and human-rights defenders. Faced with these challenges, it is imperative for civil-society organizations to acquire and use investigative skills to produce well-documented reports and investigations.

During the workshop, DFRLab researchers shared their experiences investigating paid campaigns that spread disinformation or promote violence or online harassment. They recounted having used an array of tools to analyze the origin and behavior of these paid advertisements. 

DFRLab researchers also discussed tools that helped them detect suspicious activity on platforms such as YouTube, where, for example, some gamer channels spread videos related to disinformation campaigns or political violence. The workshop attendees also discussed how policy changes at Twitter have made the platform increasingly challenging to investigate, but they added that open-source researchers are still investigating, thanks to the help of available tools and the researchers’ creative methodologies. 

The workshop also showcased the DFRLab’s work with the Action Coalition on Meaningful Transparency (ACT). Attendees received a preview of ACT’s upcoming portal launch, for which the DFRLab has been offering guidance. The new resource will offer access to a repository of transparency reporting, policy documents, and analysis from companies, governments, and civil society. It will also include a registry of relevant actors and initiatives, and it will allow users to establish links between entries to see the connections between organizations, the initiatives they are involved in, and the reports they have published. 

The workshop ended with the DFRLab explaining that social network analysis—the study of social relationships and structures using graph theory—is important because it allows researchers to investigate suspicious activity or unnatural behavior by users on social media platforms.
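As a rough illustration of how graph theory supports that kind of analysis, the sketch below models accounts as nodes and links two accounts when they amplified many of the same posts; tightly connected clusters then stand out as candidates for coordinated behavior. All account names, counts, and the threshold here are invented for the example, not drawn from any DFRLab investigation.

```python
# Minimal sketch of the graph-based analysis described above: accounts
# become nodes, and an edge links two accounts that amplified many of
# the same posts. All names and numbers are hypothetical.
from collections import defaultdict, deque

# (account, account, number of posts amplified in common)
co_amplification = [
    ("acct_a", "acct_b", 14),
    ("acct_a", "acct_c", 12),
    ("acct_b", "acct_c", 15),
    ("acct_d", "acct_e", 2),   # low overlap: likely organic behavior
]

THRESHOLD = 10  # minimum shared posts to count as suspicious overlap

# Build an adjacency list, keeping only strong ties.
graph = defaultdict(set)
for u, v, shared in co_amplification:
    if shared >= THRESHOLD:
        graph[u].add(v)
        graph[v].add(u)

def clusters(adj):
    """Connected components of the strong-tie graph, via breadth-first
    search; each component is a candidate cluster of coordinated accounts."""
    seen, out = set(), []
    for start in adj:
        if start in seen:
            continue
        queue, component = deque([start]), set()
        while queue:
            node = queue.popleft()
            if node in component:
                continue
            component.add(node)
            queue.extend(adj[node] - component)
        seen |= component
        out.append(sorted(component))
    return out

print(clusters(graph))  # accounts whose joint behavior warrants review
```

In practice, investigators would derive the edge weights from observed platform data, and a flagged cluster would be only a starting point for manual review, not proof of coordination.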

Ana Arriagada is an assistant director for Latin America at the Atlantic Council’s DFRLab.

Further reading

Related Experts: Layla Mashkoor, Jacqueline Malaret, Abigail Wollam, Iria Puyosa, Daniel Suárez Pérez, and Ana Arriagada
