The verdict: Back to you, Facebook. The company’s Oversight Board ruled Wednesday that former US President Donald Trump will remain banned from the platform for encouraging the January 6 insurrection at the US Capitol. But the board—a group of twenty activists, lawyers, and journalists appointed by the company—said Facebook was wrong to institute an “indefinite” suspension and called on the company to decide on a better-defined penalty within six months. So what should we make of the decision? And what consequences is it likely to have for online radicalization and the use and abuse of social media around the world? Our experts from the Atlantic Council’s Digital Forensic Research Lab break it all down.
How to read the decision
- Emerson notes that the Oversight Board recently made the right call in restoring content that criticized India’s ruling Bharatiya Janata Party. “Although the Oversight Board’s decision created new complications for Facebook’s India-focused expansion, Facebook enacted it without complaint,” Emerson says. But the Trump case, he adds, was the board’s “great test.”
- And Emerson thinks the board didn’t pass that test, arguing that it “has essentially given up on ruling on the most consequential case of its tenure.”
- Yet in Rose’s view, it’s too early to tell: “By temporarily upholding the ban but demanding the company clarify its fairly haphazard policy on influential people (called ‘newsworthiness’) and indefinite suspensions, the board is testing its ability to take a question posed and throw back a better one. Now it is up to Facebook to respond.”
- Still, Rose adds, given how the board’s power remains undefined, Wednesday’s decision does little to settle “huge questions about the role a board like this, even when functioning at its best, plays in the bigger conversation on aligning tech, connectivity, and democracy.”
A decision heard ’round the world
- A close read of the board’s recommendations reveals a researchers’ “transparency wishlist,” Rose says, including more details on actions such as take-downs of content and “requests to preserve information in some form that could be used in the prosecution of serious human-rights abuses and violations of international law. Right now once content is removed, it’s gone.”
- Policing harmful content outside the United States has long been a difficult and controversial aim for major platforms, including Facebook, from Myanmar to Libya. The board weighs in here, too, as Rose points out, calling on Facebook to invest in “staff and processes that would enable the company to provide the same level of attention and seriousness” to enforcing its policies “in Sri Lanka as it might in the US.”
- If Facebook does enact the recommendations, it “could impact the trajectory of regulatory debates in the US and Europe” around tech platforms, Rose notes. But we shouldn’t forget that “Facebook is a company, not a government,” she adds. “These decisions matter, but they do not stand in for representative governments grappling with the role of companies in society and democracy.”
A coming shift for online radicalization
- Jared predicts that the board’s decision could result in Facebook receding in importance among “the social-media influencers, groups, and information sources that built their brands and followings on the coattails of Trump.” But the former president could decide to launch his own platform or embrace an existing “alternative” site such as Parler or Gab. “Extremists were among the first regular users of many alternative platforms and enjoy a home-field advantage in those venues as a result,” Jared says. “A Trump bump on one of those sites could greatly increase the threat surface of radicalization.”
- But no other site can match Facebook’s three-billion-strong worldwide user base anytime soon. And that’s why this decision will likely tamp down extremism overall, Jared says. “Many extremist movements in digital spaces were able to boost their visibility and strengthen their propaganda by attaching themselves to statements and attitudes shared by Trump. Often, groups would look to Trump for direction in conversation and action. Without Trump, these groups will lose some of that cover.”
- Jared notes that Facebook has notched some successes against extremist groups. Recent actions against QAnon and militia groups, for example, “affected a majority of those respective ecosystems on the platform.” Where the company has struggled is when high-profile politicians amplify that content. “The board’s decision to uphold the [Trump] ban gets Facebook a small step closer to consistency,” he says, “but it will be imperative for Facebook to address the glaring flaws in its moderation and enforcement against extremist content.”