May 5, 2020 – The Atlantic Council’s Digital Forensic Research Lab (DFRLab) is at the forefront of the study of disinformation, foreign influence operations, and – more broadly – social media’s role in free and open societies.

In its April 2020 monthly report on coordinated inauthentic behavior (CIB) and foreign or government interference (FGI) across its platforms, Facebook announced the removal of eight unrelated networks of accounts, pages, and groups. Prior to removal, the DFRLab conducted independent analysis of the assets originating in Russia and the Donbas region of Ukraine, in Georgia, and in Myanmar.

These assets highlighted a number of trends, detailed below. For follow-up inquiries, please reach out to dfrlab@atlanticcouncil.org.

Facebook removes propaganda outlets linked to Russian security services

What you need to know: The existence of this network is not necessarily new, but evidence of its amplification through coordinated and inauthentic behavior is. Its removal is a significant enforcement and content-moderation step by a platform like Facebook.

  • Facebook pages and groups in this set had clear connections to News Front and South Front, two Crimea-based media organizations with ties to the FSB.
  • The pages, groups, and accounts broadly focused on topics and narratives that favored the Kremlin’s geopolitical agenda: denial of Russia’s role in the downing of Flight MH-17, false equivalencies comparing the conflict in the Donbas region of Ukraine to the U.K.’s Brexit, and discrediting Europe’s response to the COVID-19 pandemic.
  • User accounts linked to a third outlet, Anna News, an Abkhazian news agency that pushed a pro-Kremlin agenda, also promoted News Front and South Front content.
  • There were clear attempts to use sockpuppet accounts to push content, but no evidence that the accounts involved used inauthentic means at scale. While the DFRLab could not corroborate Facebook’s finding of CIB, it also found no evidence to contradict it.
  • A defining feature of this operation was its scope: the removed assets disseminated pro-Kremlin propaganda in an array of languages, including Russian, English, Spanish, and Dutch, indicating that they were attempting to reach a diverse, international audience beyond Russia.

Inauthentic Facebook network shut down in Georgia

What you need to know: This case comes at a significant moment for Georgia, which holds elections this fall. Coordinated inauthentic activity has apparently come from both ends of the political spectrum, but in this case it was tied directly to officials connected to the current ruling party, a particular cause for concern ahead of the vote.

  • This subset of assets was openly linked to the Georgian Dream-affiliated media platform Espersona, an attribution the DFRLab was able to independently corroborate via open-source analysis.
  • The network targeted a domestic Georgian audience with posts about politics, elections, and government policies and attempted to discredit or criticize the opposition and local activist organizations.
  • Some of the assets masqueraded as news outlets, while others impersonated opposition leaders and health authorities; despite these attempts at obfuscation, many promoted a specifically anti-opposition, pro-Georgian Dream political agenda.

Inauthentic anti-Rohingya Facebook assets in Myanmar removed

What you need to know: Human rights violations against the Rohingya minority in Myanmar are grievous and well-documented. The most significant component of this disclosure was the extent to which official social media accounts linked directly to the Myanmar Police Force encouraged further abuse online and in the real world, even though the assets’ audience reach was minimal.

  • Facebook attributed this smaller subset of assets to members of the Myanmar Police Force (MPF). The DFRLab could not corroborate the direct links, although the assets it had access to did demonstrate a heavy pro-MPF bias.
  • The content of this operation was highly divisive and sectarian in nature. Pages, accounts, and groups inflamed anti-Rohingya sentiment by presenting the Rohingya as terrorists, denying that atrocities against the Rohingya took place, amplifying reports of violence by the Rohingya against other groups, and dismissing the existence of the Rohingya in Myanmar.
  • Many of the accounts involved took advantage of strong user privacy settings and other means to shield their identity; this limited the amount of open-source evidence available. Nonetheless, minor signs of coordinated inauthentic behavior, such as the use of stock photos as profile pictures and common “liking” patterns across accounts, were apparent.

Inauthentic behavior linked to Iran’s state broadcaster

What you need to know: Facebook also removed assets targeting a global audience that it attributed to Islamic Republic of Iran Broadcasting (IRIB), Iran’s state broadcaster. For further analysis of the removed Iranian assets, see the latest report by the DFRLab’s colleagues at Graphika, including DFRLab Nonresident Senior Fellow Ben Nimmo. The Graphika report characterized the network as “public diplomacy, but covert” and noted that it used a large number of fake accounts to promote its content. The DFRLab has conducted analysis for previous takedowns related to the same network of threat actors, including in March 2019 and April 2019.