As part of its fifth annual 360/Open Summit, held on June 6 and 7 in Brussels, the Atlantic Council’s Digital Forensic Research Lab (DFRLab) assembled a cohort of Digital Sherlocks. In total, around eighty Digital Sherlocks from more than forty countries attended.
Combating disinformation and building a resilient information ecosystem cannot be done in isolation; ultimately, they require a growing movement of individuals taking the lead around the world and in their own communities. It’s these #DigitalSherlocks who are at the heart of our movement. In 2020, as the COVID-19 pandemic limited travel and in-person convening, the DFRLab adapted its training and capacity-building program for online facilitation. The result was the launch of the 360/Digital Sherlocks program, a quarterly series of free, online trainings for cohorts drawn from civil society, journalism, and academia committed to monitoring and protecting the information environment in their respective regions.
In the past two years alone, the DFRLab has trained almost 1,400 people across more than one hundred countries as part of this program. In addition to the hands-on training in everything from geospatial analysis to API access, we convene interactive conversations on tech policy and facilitate an ongoing network. The Digital Sherlocks have access to dedicated resources and private social media groups, which they can use to share resources, promote their own research, and pursue collaboration with other participants.
To build on this astounding growth, the DFRLab chose to expand its usual 360/OS program to include a cohort of #DigitalSherlocks in Brussels, in part through sponsorships and scholarships. Those selected participated in innovative trainings on open-source methodologies, dedicated briefings with experts and policymakers, and other tailored opportunities throughout the two days of the summit.
Before the 360/OS conference officially kicked off, Sherlocks joined the DFRLab team for an exclusive gathering welcoming the cohort to Brussels.
Monday, June 6, 2022
The official program for the Sherlocks kicked off on June 6, with DFRLab Senior Director Graham Brookie welcoming the cohort. Lukas Andriukaitis, associate director for the DFRLab, then gave the assembled Sherlocks some logistical details before Associate Researcher Jean le Roux presented them with an open-source investigative challenge.
The interactive program began with a session on the Challenges of Open-Source Ethics, led by DFRLab Managing Editor Andy Carvin and Associate Editor Layla Mashkoor. Open-source information can be a powerful tool for identifying the truth or helping find justice. In some cases, however, open-source data might infringe on individuals’ online privacy and personal data security, if not used with caution. Since there is no code of conduct overseeing how open-source investigators operate, this presentation and the subsequent discussion sought to determine where the line should be drawn. What if the information is being collected for a good cause? What if data used to identify war crimes was collected without consent? This training session raised and addressed questions that all open-source researchers should keep in mind as they go about their work.
As would happen throughout the two days, the Sherlocks were then invited to view the 360/OS mainstage sessions until lunch. Following lunch, they reconvened for a Sherlocks-only session led by Bellingcat researcher Aiganysh Aidarbekova on Using Telegram for EU QAnon Research and More. Aidarbekova discussed how Bellingcat understands Telegram, how to effectively find active groups and networks, and how to archive valuable content.
Aidarbekova focused on how Telegram has continued to host exiled fringe figures after mainstream social media platforms blocked them for spreading QAnon and COVID-19 conspiracy theories, as well as neo-Nazi and far-right messaging. Telegram, which functions much like WhatsApp but with additional features, is therefore one of the most valuable platforms for conducting open-source research into these communities. Exploring how such networks use Telegram is an important piece in understanding the wider information environment.
In a subsequent session on digital security, led by Raoof, participants learned about emerging surveillance patterns and evolving digital attacks targeting civil society organizations and voices of political dissent over the past decade. Raoof presented cases of digital attacks from different countries and explored digital security tips and practices, including which tools and systems are actually safe and how to better engage with technology with privacy in mind. Raoof spoke specifically about the Pegasus malware attacks against activists and nonprofit organizations and considered how free and open-source apps and tools could help researchers in their work.
For their last session of the first day, Sherlocks heard from Dan Arnaudo, advisor for information strategies, and Julia Brothers, senior advisor for elections, both from the National Democratic Institute, who presented the session Introduction to Becoming an Elections Super Sleuth.
The deployment of false, manipulated, or disorienting information during elections undermines fundamental principles for democratic elections across the globe, amplifying voter confusion, galvanizing social cleavages, suppressing participation, and degrading trust in electoral institutions and, in some cases, election outcomes themselves.
The National Democratic Institute representatives discussed how to harness social media monitoring techniques to safeguard the electoral process through identifying, exposing, and responding to election-related threats and building resilience in democratic systems more broadly. This session addressed some of the unique disinformation trends and vulnerabilities around elections, the various roles of electoral actors in the information environment, how to conduct an electoral integrity risk analysis, and practical avenues and tactics for intervention by open-source researchers. The session used election monitoring as a method of talking about wider research and methodology. In addition to a brief presentation on core themes and guidance, participants also engaged in practical exercises, shared experiences, and discussed future opportunities.
Tuesday, June 7, 2022
The Sherlocks’ second day started with the talk Dark Social Investigations: How to track election-focused mis/disinformation and narratives on WhatsApp and Telegram and invisible mobilization from public social to dark social channels, delivered by Allan Cheboi, senior investigations manager with Code for Africa’s iLAB. Cheboi presented how to investigate narratives being propagated on dark or closed social media platforms such as WhatsApp and Telegram. Dark social groups may be invisible, but they can quickly reach a significant scale. For example, conversations in some of the groups iLAB is monitoring ahead of the upcoming Kenyan elections, scheduled for August, hit upwards of fifteen thousand messages during peak debates but tend to average one thousand messages per group every two weeks.
Much of the content within these conversations is grassroots planning or logistics for local rallies or other events, but organizers regularly use inflammatory content to incite or mobilize followers. Encrypted messaging apps are a tried and tested tool for mobilizing voters across Africa, with step-by-step playbooks. Identical strategies are already apparent in different countries, from “hub-and-spoke” networks operating in cells to “command centers” that craft messages for deputies to share on linked groups. The modular system means that nuanced messaging can be customized for specific local or regional audiences or can be harnessed for mass amplification.
Following a coffee break, Meta’s Ben Nimmo and Olga Belogolova presented on the company’s concept of “coordinated inauthentic behavior” (CIB), first coined by its security team in 2017. In the panel, titled “To CIB or Not to CIB,” Nimmo and Belogolova presented the Digital Sherlocks with practical guidance on what to look for in cases of suspected CIB and how to distinguish complex influence operations from simpler, spammier networks. Additionally, they discussed the origins of the term, the policy, other inauthentic behavior protocols, the evolution of the team’s work in studying adversarial influence activities online, and the spectrum of complexity that exists across the practice of digital influence.
Heading into the afternoon, the Sherlocks joined the main stage audience for a panel on ending violence against women online.
After lunch, Digital Sherlocks heard from Amaury Lesplingart, chief technology officer with CheckFirst. Lesplingart’s presentation, titled CrossOver: An analysis of recommendation algorithms on social media platforms against disinformation in Belgium, looked at CrossOver—a joint project from CheckFirst, EU DisinfoLab, Apache, and Savoir Devenir that tracks and measures the influence of content recommendation algorithms on social media in Belgium. The initiative’s work analyzes how algorithms contribute to the spread of mis- and disinformation. The project’s team monitors and investigates disinformation operations in both Dutch and French, working with the media to expose malicious actors and raise awareness among the public and policymakers. The training gave the Sherlocks an opportunity to explore the project’s dashboards, experiment with results for specific keywords, and compare findings across social media platforms.
Before closing out the day and the conference as a whole, the cohort again joined the 360/OS main stage audience to watch Nobel Peace Prize recipient Maria Ressa interview US Secretary of State Antony Blinken.
Andriukaitis and le Roux brought this year’s summit to a close by reviewing the Sherlocks’ work on the open-source challenge presented on day one. Participants shared their findings, methods, and approaches to solving the challenge. A question-and-answer session about the challenge and the broader Digital Sherlocks training program followed.
The DFRLab would like to thank all the Digital Sherlocks who took the time to attend this year’s gathering, and we cannot wait to see all the work you get up to.
Iain Robertson is a deputy managing editor at the Digital Forensic Research Lab.
Layla Mashkoor is an associate editor at the Digital Forensic Research Lab.