The European Commission’s Rita Wezenbeek on what comes next in implementing the Digital Services Act and Digital Markets Act



Jun 7, 2023

Activists and experts assemble in Costa Rica to protect human rights in the digital age

By Digital Forensic Research Lab

Our Digital Forensic Research Lab is convening top tech thinkers and human-rights defenders at RightsCon to collaborate on an agenda for advancing rights globally.


Event transcript

Uncorrected transcript: Check against delivery


Rita Wezenbeek
Director, Platforms, DG CNECT, European Commission

RITA WEZENBEEK: My name is Rita Wezenbeek, and I am the director in charge of the implementation of the new legislation in the European Union concerning tech platforms, so this is the Digital Services Act—the DSA—and the Digital Markets Act—the DMA.

So the Digital Services Act addresses a wide range of potential societal harms on online platforms, ranging from the sale of illegal goods to disinformation, and from child pornography to terrorist content online.

Providers of online platforms will be subject to democratically adopted rules, which set a comprehensive accountability and transparency framework. The first obligations under the Digital Services Act already started to apply in February this year, and on [April 25] the commission designated seventeen very large online platforms and two very large online search engines that reach at least forty-five million active users on a monthly basis in the European Union, which is equivalent to more than 10 percent of the EU’s population. These [very large online platforms and search engines] fall under the direct supervision of the European Commission.

The effects of these rules will be felt soon. Designated [very large online platforms and search engines] will have to provide the EU with risk assessments at the end of August and the beginning of September. In addition to that, under the Digital Markets Act, which is much more an act on market contestability, the designations under that act will follow at the latest by the beginning of September.

For both sets of regulation, the commission will become the regulator for the large platforms and search engines. Under the DSA, the commission will supervise that the online platforms put in place systems to tackle illegal content and disinformation, uphold users’ rights, and protect users’ health and well-being. In order to do so, the commission is equipped with wide-ranging investigatory and supervisory powers, including the power to impose sanctions and remedies.

That being said, making the implementation of the DSA work in practice is not something the commission is going to do alone. Many actors will contribute to the success of this regulation, and we of course stand ready to share our first regulatory experiences. We have to act decisively to safeguard universal principles, and we do so in a way that does not exclude adopting a global approach to platform regulation. Our platform rules optimize fundamental-rights protections by giving agency back to society, which leads to an informed and effective choice for safety and contestability.

Other rules are also relevant in this context. For instance, the UNESCO draft guidelines for platform regulation reflect a similar architecture, involving proportionate, risk-based, all-of-society approaches. Under such a global approach, it is important to exchange views on standards for the key building blocks of human rights-based platform regulation: risk assessments for systemic platforms, auditing cycles, and data access for researchers.

This means that in the EU, we need input from stakeholders around the world to make the most of this opportunity. We need to set out, for instance, how platforms should conduct such a risk assessment. Also, how they can give access to data to researchers in a secure and privacy-preserving manner. And also how third-party auditors should be involved. If we achieve a degree of consistency in implementing these systems globally, we will mutually be more resilient.

We need auditors that are truly independent of online platforms and that have sufficient expertise, with full awareness of civil society’s understanding of systemic risks and their drivers. In addition to these procedural questions, we need a global debate about our priority research questions regarding the systemic risks caused by online platforms. We can already link global academic teams to investigate different priority risks. And we can set coherent standards for vetting these researchers, so that they are both independent and able to secure funding. We also need to identify proportionate and effective risk mitigation measures.

Now, you can make your voice heard. State-of-the-art research, as well as technology, will shape our collective responses, both directly, for instance through implementing acts, and indirectly, for instance because auditors look at your evidence. We are currently seeking feedback on a number of rules. First, on how to organize data access for researchers in a user-friendly yet safe manner: a short consultation already ended on [May 31], but there will be a new consultation on a draft delegated act, probably at the beginning of 2024. We are also consulting on a draft delegated act on independent audits; that consultation will close on [June 2].

And finally, there will be a big multistakeholder event in Brussels on [June 22], when many issues addressed by the Digital Services Act will be discussed. You are welcome to join online, and you can find information on this event on our website. Lastly, our European Centre for Algorithmic Transparency is setting up a global network of researchers, and there, too, you are welcome to express your interest. Further guidance may follow as the need arises.

On the Digital Markets Act, the legislation concerning the contestability of markets, we have already held four workshops involving competitors, consumers, and regulators, and more will follow. These are crucial meeting points where all specialist actors can publicly discuss and challenge the gatekeepers’ proposed compliance remedies, including on technical matters such as end-to-end encryption and interoperability obligations.

These workshops reflect that both the Digital Services Act and the Digital Markets Act make platforms truly regulated entities, similar to systemic banks under the supervision of the European Central Bank. Compliance with these rules has to occur on an ongoing basis, and it will be adapted over time. In a way, relative to today, the tables will be turned: platforms have to proactively propose mitigation measures and remedies that are proven to work in practice. This proof of concept should come from the broader stakeholder community. So you need to be involved on an ongoing basis too.

Let my key message to you today, therefore, be that your involvement is not a one-off request such as today’s. We will need to collectively drive solutions that represent the state of the art in optimizing the protection of all fundamental rights online.

This also goes to the question of how we define success under the new legislation. Success will be the consistent implementation of the rules on the one hand and their use by society on the other.

So let me end by saying that I hope I can rely on your support and participation. Thank you very much.


Image: In this photo illustration, the European Commission (EC) logo is seen on a smartphone screen and the European Union (EU) flag in the background. Photo by Pavlo Gonchar/SOPA Images/Sipa USA.