Event transcript
Uncorrected transcript: Check against delivery
Speakers
Prabhat Agarwal
Head of Unit, Digital Services and Platforms, DG Connect, European Commission
Gerard de Graaf
Director for Digital Transformation, DG Connect, European Commission
Moderator
Kate Klonick
Professor, St. John’s University School of Law
KATE KLONICK: Thank you so much. Thank you, Rose. Good afternoon. My name is Kate Klonick. I’m a professor at St. John’s Law School and a fellow at the Brookings Institution and the Yale Information Society Project. And my research and writing for the last decade have focused on content moderation and online speech governance on private platforms.
So I am especially excited to be here to moderate what should be a fascinating panel, The Digital Services and Markets Act Package: What happened and what comes next, with two of the leading drafters of the regulations: Prabhat Agarwal, head of Unit of Digital Services and Platforms at DG Connect, European Commission, and Gerard de Graaf, director for digital transformation at DG Connect at the European Commission. So welcome to you both, and thank you for being here to talk to us about the [Digital Services Act (DSA)] and the [Digital Markets Act (DMA)].
I wanted to start with a little bit of framing and perspective. So it was just a year ago, as Rose mentioned, that you had joined this conference virtually to discuss the process of framing the DSA. For those uninitiated, the Digital Services Act, or the DSA, aims to protect users’ rights to freedom of expression, while also empowering them to report illegal content, protecting their privacy, and allowing them to see why certain online ads or content are shown to them. Its companion act, the Digital Markets Act, or DMA, which establishes a set of narrowly defined, objective criteria for qualifying a large online platform as a so-called gatekeeper, is a type of competition-protecting bill.
And these EU bills are the first of their kind—a comprehensive regulatory framework for governing digital services. The DMA and DSA provide rules across a range of topics, from liability to content moderation, from transparency reporting to competition, with global implications as numerous other countries attempt to tackle the same issues. I wanted to start out by asking both of you, having watched these acts rocket through drafting, what your thoughts have been on the process generally, but in the last year in particular since you joined this conference.
And I’ll start with you, Prabhat.
PRABHAT AGARWAL: Thanks, Kate. Well, it’s been a rollercoaster, I should say. You know, so first of all, I would say there’s been an amazing amount of support for the kind of initiatives and ideas that we put forward. Since we spoke last summer we took the proposals to the Council of the EU, which is where all twenty-seven member states come together. And both the Digital Services Act and the Digital Markets Act were supported unanimously by all member states, in a relatively short period of time after we spoke—less than twelve months after we presented the proposals, in November of last year.
And in a very short period after that, in December of last year, the European Parliament voted its position on these two acts, again with overwhelming majorities. So I think the highlight for me in the last twelve months really has been the amazing amount of political support that we had, across all political parties and across all member states, for the kind of ideas that we were putting forward. And of course, it then led to a very quick agreement early in the first quarter. Now we’re in the finalization process.
So the main takeaway: when I spoke to you, together with Gerard, last time here, I think we knew that there was a lot of support. We didn’t quite anticipate that there was this much support. That’s how I would put it.
GERARD DE GRAAF: I would fully agree with that. I think what was particularly heartening was to see that the approach which we had taken, a systematic approach, that platforms needed to be regulated on the systemic risks that they pose to society, was strongly endorsed. That approach was, in a way, actually inspired by banking regulation: it’s important that platforms have risk management in place, that they have due diligence obligations.
So the discussion wasn’t on what we call the architecture, on the fundamentals of the proposals. It was more about the level of ambition—and that was another interesting fact. We thought we had made an ambitious proposal in December 2020. And typically what you have in a negotiation with the Council and the Parliament is that they try to water it down a little bit, and then you end up somewhere around the original proposal. Here we had both the Council and the Parliament saying it’s ambitious, but it’s not yet ambitious enough. So, actually, the end result, the measure as it was adopted at the end, is even more ambitious than what we proposed in December 2020. And I think that’s unique. I mean, I don’t think we have experienced that very often.
On the DMA, it was a bit the same approach: to say, look, we cannot rely on antitrust rules alone. The market is moving very fast. We need to equip ourselves with an instrument that can actually address these issues upfront, in an ex-ante way. There, too, I think there was a lot of support.
And it was similar to what happened on the DSA. We put eighteen unfair practices in the original proposal of the DMA, and Prabhat and I often asked ourselves the question, well, is this going to fly politically? Is this what the market can bear? And interestingly, yes. And some of these provisions were further reinforced in the process.
KATE KLONICK: So can we talk a little bit about how the DSA is going to be implemented? People have described it, and you’ve described it, as having a multilayer process, with many different parties and stakeholders implementing different parts of it. Explain how you got to that solution and how you envision it playing out.
PRABHAT AGARWAL: I can start. First of all, there is a fundamental difference between the two acts. The DSA deals more with content moderation and speech-related issues, with illegal content and disinformation. Naturally, language issues are a big factor there; you know, disinformation in one member state is very different from disinformation in another. So the role the member states play in the Digital Services Act is very different from the role they play in the Digital Markets Act, where we’re talking about unfair practices, and what’s unfair in one country is also unfair in another country.
So from the outset, the Digital Markets Act foresaw centralized enforcement by the European Commission of the rules on unfair trading, but the Digital Services Act foresaw decentralized enforcement, giving member states across the European Union the main power.
Now, like Gerard was saying in the previous intervention, during the negotiations the member states actually said, well, we would really like to bundle this power in the European Commission as well and have the European Commission be the primary enforcer. So that’s one layer. Of course, we still have to work with member states’ authorities, because we don’t speak all the languages. We don’t understand all the national contexts and the nuances. And we see, particularly in the field of disinformation, enormous sophistication of actors in spreading disinformation, which requires local and cultural context.
The second element to this is that multilayer enforcement means action by the regulator, but it also means actually empowering third parties, like civil-society actors, to uncover things. And we’ve seen journalists’ investigations or civil-society investigations in the United States; you know, organizations such as ProPublica have shone a light on some of the shortcomings, some of the problems out there.
We have seen that these are very powerful levers for action and for change. And so we’ve built into the Digital Services Act, and also into the Digital Markets Act, powerful transparency and accountability levers that actually activate this. This is the second layer, I would say, of enforcement. And then there are, of course, the new powers for users that we put in as well, Gerard, huh?
GERARD DE GRAAF: Yes. I mean, notice and action. So all of us will have a role. If we see something on a platform and we think it’s illegal or it’s disinformation, we can notify that. Then the platform becomes aware; it will have actual knowledge, which in the European Union triggers liability, or at least removes the liability exemption. And then there’s an interaction, and the platform will have to explain what it has done. Has it removed it or not removed it? Also, the person who posted it will need to be brought into this discussion.
So that’s certainly something that is empowering, that allows all of us to play an active role in ensuring that what is on the internet is safe, while at the same time our fundamental rights are preserved.
There are other elements there; the access for researchers, for example. There’s a legal basis that gives vetted researchers a right of access. They can look under the hood of the platform. They can uncover situations that so far have escaped our attention. So a platform can’t say, sorry, but I’m not going to give you access. There’s a legal ground for access.
We will have independent audits. At least once a year a platform will need to undergo an independent audit, where the auditor comes in and, a bit like what an auditor does in a company or in a financial institution, in a bank, looks at all the systems. The audit will also find certain vulnerabilities that will then need to be addressed by the platform.
There will be reporting. At the moment there is reporting, but is it the kind of meaningful reporting that we as regulators, civil society, and the member states would like to receive? There will be clear templates for what we think is meaningful reporting. And then you can also see from the reports where platforms stand in terms of content moderation: for example, how much they are investing in content moderation in minority languages.
So this is multilayered. It’s not just the European Commission, which, of course, will be the central enforcement authority; it is a multilayer, multistakeholder enforcement structure that, I think, can work if we all contribute to making it work.
KATE KLONICK: And I know we are on a quick timeline today because we have something coming up on this shortly, so I’m going to skip ahead to the DMA. If we come back, I have a couple of follow-up questions on the DSA.
But one critique of the DSA was that it effectively forfeits competition and consumer choice as ways of shaping platform behavior, in favor of having more heavily regulated entities—and, obviously, the DMA is, in part, an answer to that.
So can you talk a little bit more about the DMA and, in particular, explain the gatekeeper function and the gatekeeper label, and how that will work for the companies?
PRABHAT AGARWAL: So the notion of a gatekeeper is simply meant to reflect the fact that there are certain situations where platforms intermediate access between an enormous number of end users—you and me—and a large number of businesses.
You know, one of the clearest examples is app stores. With app stores you have millions of developers and billions of users, but you really only have two app stores, at least in the Western world. So the gatekeeper function means that somebody sets the rules of the game and, at the same time, leaves no opportunity for people to go around them.
So we have this notion that there’s a dependency relationship between business users and end users through a gatekeeper, and that there is a certain amount of financial power associated with this relationship. So a company needs to have a certain amount of turnover to qualify as a gatekeeper, and the situation needs to be entrenched: it’s not just a quick, flash-in-the-pan situation, but one that persists over multiple years.
And these criteria are spelled out in the law—in black-letter law. They’re backed by an impact assessment where we looked at the different market characteristics at play. What is really meant to be captured here is the unusual network effects—data-driven network effects that lead to this particular situation of lock-in or dependency, and that characterize the platform economy.
KATE KLONICK: OK.
Gerard, would you characterize how the DMA thinks about platforms: as utilities, as common carriers, as we’ve heard proposed in the US, or as some type of basic function that is necessary to regulate rather than leave wholly up to competition?
GERARD DE GRAAF: Well, I mean, they are gatekeepers. So if you are a small business or a small hotel, it’s very difficult to be successful if you don’t partner with Booking.com, and it’s very hard to be successful if you ignore Amazon as a marketplace, because you basically forgo a very important part of your potential turnover.
And then we have observed practices like self-preferencing, tying, and certain conditions that are being imposed—for example, that you cannot offer better deals for your hotel outside of the platform—which the political decision-makers in the European Union have now defined as unfair and which, therefore, should be prohibited. Those are the don’ts. Then there are the do’s. Companies say, look, I’m providing services, for example, through the app store. I’m a publisher; I sell a newspaper through the app store. I have no idea who the customers are. I have no relationship with the customer. I can’t get the data, even in a way consistent with the GDPR. I cannot find out who the customers are and then maybe tailor my product a bit more to their expectations. So that’s a do.
Giving access to data is a requirement. Interoperability is also foreseen. So those practices now need to be implemented, and that will have a fundamental effect on the business models of these gatekeepers. Think about sideloading in the app store: in the future you will be able to download apps that do not come through the app store, if you have an iPhone. So, in a way, we break open that ecosystem. These will be fundamental changes. But we think these fundamental changes are necessary.
And as for the argument that regulation is, per se, bad for competition—well, it’s the other way around. We don’t see another way, through competition policy, for example, to get rid of what we consider these unfair practices. We believe that this will unleash a lot of innovation, a lot of competition: benefits for users, benefits for businesses, benefits for app developers. So the argument that, oh, this is regulation and therefore must be constraining and reducing innovation, we reject completely out of hand.
KATE KLONICK: So, finally—I think this will be our last question—one of the major critiques of the GDPR, and now of the DSA and DMA, is that the EU is essentially regulating for the world, from this place of market power and also of capacity: you actually have a mostly functioning legislative process, unlike other countries I won’t name. Can you say a little bit about the criticism around the Brussels effect, and whether that’s even necessarily a bad thing in this context?
PRABHAT AGARWAL: I think the fact that other regions are struggling with similar problems is not a secret. People around the world share a problem analysis: How do we fight, you know, fake news or disinformation campaigns while preserving freedom of expression? How do we ensure that in markets dominated by platforms with very strong network effects, we maintain possibilities for competitive entry and fair practices? You know, we’re not the only jurisdiction that is struggling with this.
So I would say that the problem definition is pretty widely shared across the globe. Now, not everybody is necessarily going to come to the same solution. I think that’s also normal, because there are different legal systems out there. For the DSA, where fundamental rights and freedom of expression are at stake, I think it’s very important that we orient ourselves toward international human rights norms. That’s what we tried to do in the drafting process. But even looking beyond that, I think these are also opportunities for cooperation on a global stage—you know, in the next stage, during the implementation.
And one thing I often say is that the DSA in particular is going to be a huge data generation machine. And I think we’ll need to cooperate across borders to harness that data, and to create insights. You know, and this is a little bit how we view, in this context, the Brussels effect. It’s not necessarily us imposing rules on everyone else, but just creating a platform for collaboration on the important issues.
KATE KLONICK: Gerard, do you want to—
GERARD DE GRAAF: I mean, our mandate is to regulate for Europe. We don’t regulate for the world. Even though some of the companies are outside of the European Union, they target the European Union, and therefore they are within scope. But when we were making the proposals, and when they were negotiated, we were mindful that the DSA and the DMA could become a reference point for other countries around the world.
And I think if you look at it rather broadly, you see three models, particularly in terms of regulating the internet—less so for the DMA. One model is the Chinese, Russian, Turkish model, which is a very repressive, very authoritarian model. You have the European model. And then you have a model which hopefully will be changing, though of course that depends on political developments: the laissez-faire model, which is the US model, at least until quite recently.
We think that, as Prabhat said, there are a lot of countries—we’ve been approached very, very intensively by countries around the world who want to know and who ask many of the questions that no doubt cross your mind. Why this? Why not that? Et cetera. So we are spending a lot of time on explaining, because these countries are also looking to regulate, looking to legislate. They’re looking for a source of inspiration. We therefore also have a very important responsibility to make sure that it works, that it can work. So the implementation is going to be very critical.
But, as democratic societies—and we work a lot with the US in the TTC, where the problem analysis I think is shared—we will need to offer an attractive alternative. Some countries, like China and Russia, are already implementing repressive policies, but there are a lot of countries on the fence, who in the next couple of months and years are going to decide which type of regulatory model they are going to join. Is it the Chinese, Russian, Turkish model? Or is it the European model, hopefully with more interventions on the US side? I think that is the key question for the next couple of years. And if that’s called the Brussels effect, that’s fine for us.
KATE KLONICK: Thank you so much. This was fast, but wonderful. And I think we got a very high-level understanding of both the drafting process and the multistakeholder nature of implementation. And we will see where the DSA is next year, hopefully. Thank you.