With autocratic regimes more aggressively restricting freedom of speech on the internet, it is all the more important for the European Union (EU) and the United States to put forward a positive, alternative model of online regulation, said two European Commission policy officials Wednesday at the 360/Open Summit, hosted by the Atlantic Council’s Digital Forensic Research Lab.
Prabhat Agarwal, head of the Commission’s Digital Services and Platforms unit, and Gerard de Graaf, director for the digital transformation in the Commission’s Communications Networks, Content and Technology directorate-general, were the leading drafters of the Digital Services Act (DSA). The bill is a first-of-its-kind, comprehensive regulatory framework for governing digital services proposed by the European Commission to EU lawmakers in December. Aimed at making the internet safer while protecting fundamental human rights and freedoms, the DSA takes on modern digital challenges from content moderation to transparent data reporting and oversight.
The DSA is currently being considered by the European Parliament and European Council for revision, with the goal of passing it in early 2022. And its wide-ranging scope makes it “more than just an EU regulation; it’s a potential model and the only fulsome democratic standard with which to engage at the moment,” said moderator Rose Jackson, director of the Democracy & Tech Policy Initiative at the Digital Forensic Research Lab.
Below are some of the highlights from their discussion.
What does the DSA do?
- The DSA aims to protect users’ rights to freedom of expression while also empowering them to report illegal content, protecting their privacy, and allowing them to see why certain online ads or content are shown to them, its framers said. Authorities will receive unprecedented amounts of data for better public supervision. And platforms will receive clearer instruction on liability while facing just one point of contact for regulatory oversight: the EU, as opposed to each of its twenty-seven member states.
- The proposed legislation builds on the EU’s existing e-Commerce Directive, bolstering liability protection for intermediary services like hosting sites and caching services, and it expands to new areas, with a common framework for enforcement and additional due diligence obligations that can include environmental and human-rights checks. That includes a “Good Samaritan” clause, which shields platforms from liability as long as they engage in good-faith efforts to remove illegal content expeditiously.
- The e-Commerce Directive mostly focused on cloud infrastructure and web-hosting services. The DSA adds new categories for online platforms that cover marketplaces (such as app stores or sharing/gig economy platforms) as well as large-scale social-media sites. Infrastructure intermediaries, such as domain registries or Wi-Fi hotspots, bear the smallest regulatory responsibilities, while online platforms face increased scrutiny based on the size of their audiences. “We are, of course, dealing with some very powerful platforms that are, in some cases, so powerful that they can set the rules of the game,” de Graaf said. “Since so many companies and users depend on these platforms, it is in [their] interest that competition works.”
Shaping the ‘fire exits’ of the digital world
- By clarifying expectations as well as liability issues related to their content, platforms could benefit from the DSA, its drafters said. “It is difficult to scale in Europe, because Europe is fragmented,” de Graaf pointed out. “The rules are not the same. We have twenty-seven member states.” Even small platforms that find success in one state may struggle to adjust to the individual rules of another: “You need to adjust your business model. You need to check out what are the rules that apply to [you], and that slows you down. And the internet is all about scale and speed.”
- Some human-rights experts are concerned that further regulation could limit free speech. “This is not an instrument for authoritarian regimes to dictate how people can express themselves online,” Agarwal said. “An analogy I often use is that this is regulating the fire exits, the alarm buttons, the safety features that we would expect if you go to a shopping mall or a concert hall.” Other pieces of European law define what types of speech are legal and not, and bad actors will abuse regulations regardless of safeguards written into the law, Agarwal said: “A feature of authoritarian regimes is that they are not rights-respecting in the first place.”
- The introduction of required, robust data-reporting is critical. “This kind of data is going to be generated, so if it is abused by some authority, it will leave an unmistakable trace,” Agarwal said. The transparency provisions could also greatly increase knowledge of user behavior on those platforms, allowing independent researchers—typically housed in academic institutions—to create novel studies and shape future governance based on data-driven evidence.
A necessary conversation
- The e-Commerce Directive was enacted in June of 2000. As de Graaf noted, online life has shifted considerably since. In 2002, 9 percent of Europeans shopped online; now 70 percent do—and 40 percent of businesses sell through online platforms. “Platforms have become much more important to our lives in terms of social media, in terms of marketplaces. They are also very important vehicles for small and medium-sized businesses in Europe to reach their customers, so it is timely now to look at that framework,” de Graaf said.
- As of 2018, there were nearly 10,000 high-growth social and hosting-service platforms in Europe. “Most of them are small,” de Graaf said. He noted that the current EU structure makes it difficult for them to grow, citing Spotify as an example. “How did it grow? It started in Europe, then it went to the US. It got scaled in the US, and then it came back to Europe.”
- But the sheer size of platforms like Facebook—which counts 423 million monthly users in Europe, out of a population of about 750 million—illustrates the need to create a framework that is versatile enough to regulate both large- and small-scale platforms effectively. “Very large” online platforms, defined as reaching 45 million users (or 10 percent of the EU population), face more obligations under the proposed DSA. They are required to have compliance officers, independent audits, increased data access, and enhanced transparency reporting, among other responsibilities. “It’s an asymmetric obligation,” de Graaf said. “If you are in a user-, consumer-facing position, like a social-media company, you have greater responsibilities. These due-diligence obligations are the hard core of the Digital Services Act.”
Nick Fouriezos is an Atlanta-based writer with bylines from every US state and six continents. Follow him on Twitter @nick4iezos.