Issue Brief • January 14, 2026

Transatlantic cooperation on protecting minors online

By Michèle Ledger

Bottom lines up front

  • While US and EU policies differ in their approaches to the regulation of the internet, recent policy roundtables made clear that there is agreement on the need to protect children online.
  • Areas of commonality include the use of primary legislation, an emphasis on platform design rather than censoring content, and the need to balance protection of children with other fundamental rights.
  • Further dialogue between the United States and the EU on these questions could help facilitate faster and more efficient rollout of services and technologies to protect users.

Executive summary

While US and European Union (EU) policies differ in their approaches to online safety and the regulation of the internet, there is agreement about the need to protect children online. That is one high-level takeaway from a recent round of US-EU dialogue hosted by the Centre on Regulation in Europe (CERRE) and the Atlantic Council.

Such dialogue helps to identify common policy approaches for the protection of minors and common approaches to enforcing rules. Ultimately, it can also help facilitate faster and more efficient rollout of technologies to protect users. Dialogue will also help global platforms develop services to comply with rules and expectations on both sides of the Atlantic.

At the recent roundtable hosted by CERRE and the Atlantic Council, the synergies and differences in regulatory approaches and philosophies on both sides of the Atlantic centered on four themes. For each theme, some common threads seemed ripe for further discussion and cooperation.

  • New legislation and approaches to enforcement: In terms of the overall governance landscape, legislation has a key role to play in Europe and in the United States, where long-standing federal rules have been supported by an increasing number of state laws. The bulk of legislation in the EU—such as the Digital Services Act (DSA)—is adopted at the EU level, while some member states are adopting supplementary rules. In the United States, most legislation is now being adopted at the state level. Public enforcement by regulators plays a big role in the EU and the United Kingdom (UK). In the United States, state attorneys general are taking action to enforce rules, with powers similar to those of regulators in Europe. More alignment and cooperation on enforcement would be beneficial. Private enforcement through courts is also possible but, while this is already widespread in the United States, it is just emerging in Europe.
  • The harms from which children should be protected: On both sides of the Atlantic, there is a large degree of alignment on the harms from which children need to be protected. A strong commonality is that rules in Europe and the United States both require compliance by design to avoid particularly harmful conduct, such as unwanted contact by unknown adults. Other common design elements include data minimization, which is a central component of the European Commission’s guidelines on protecting minors under Article 28 of the DSA, of the UK Information Commissioner’s Office’s age-appropriate design code, and of the Office of Communications’ (Ofcom) guidance under the Online Safety Act (OSA).
  • Balancing rights: On balancing the protection of fundamental rights (in particular, privacy and freedom of expression) against the need to protect children, there is widespread agreement that everyone—not just children—deserves protection online. The EU, UK, and United States are all cautious about dictating which content is acceptable online and are instead converging on approaches that require platforms to use processes and systems to ensure safety by design. Ensuring the protection of fundamental rights is a common concern and, ultimately, a matter of balance, including at the enforcement level.
  • Age verification: Current debates about banning access to social media and about age verification are critical in Europe and in the United States, both in general and in relation to certain types of platforms (particularly those that host pornographic content). There is no agreement on a single type of technology that should be used, but there are prototypes and guidance on the high-level principles that the technologies should reflect. There are similar discussions on both sides of the Atlantic about how to attribute responsibility for age assurance across the supply chain—i.e., where in the supply chain age verification should take place—and how the division of responsibilities between players in supply chains could work in practice.

Introduction

The EU has put in place important legal building blocks to protect children online. These include the DSA and the European Commission’s guidelines on Article 28 of the DSA, which require providers of platforms accessible to minors to “put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors.”1“Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market for Digital Services and Amending Directive 2000/31/EC,” European Union, October 19, 2022, https://eur-lex.europa.eu/eli/reg/2022/2065/oj; “Communication from the Commission—Guidelines on Measures to Ensure a High Level of Privacy, Safety and Security for Minors Online, Pursuant to Article 28(4) of Regulation (EU) 2022/2065,” European Union, 2025, https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=OJ:C_202505519. They also include the Audiovisual Media Services Directive (AVMSD), which contains rules to safeguard minors’ personal data and to protect children online, and the General Data Protection Regulation (GDPR), which provides rules on the collection and processing of minors’ data. Proposals yet to be finalized include the Digital Fairness Act (DFA) and the proposed Regulation on Child Sexual Abuse Material (CSAM).2“Proposal for a Regulation of the European Parliament and of the Council Laying Down Rules to Prevent and Combat Child Sexual Abuse,” European Union, May 11, 2022, https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=celex:52022PC0209; “Digital Fairness Act,” European Commission, last visited December 22, 2025, https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/14622-Digital-Fairness-Act_en. Member states retain certain powers to enact national laws to protect minors online.3Miriam Buiten, Michèle Ledger, and Christoph Busch, “DSA Implementation Forum: Protection of Minors,” Centre on Regulation in Europe, March 25, 2025, https://cerre.eu/publications/dsa-implementation-forum-protection-of-minors/.

In the United States, the protection of minors online is an important consideration at both the federal and state levels. At the federal level, the Kids Online Safety Act (KOSA) proposal, the Children’s Online Privacy Protection Act (COPPA) and the COPPA 2.0 proposal all seek to address certain aspects of children’s safety online (in particular, privacy, advertising, and CSAM).4A new version of the KOSA has been introduced in Congress with changes in an attempt to clarify that KOSA does not censor, limit, or remove content from the internet. “Blumenthal, Blackburn, Thune & Schumer Introduce the Kids Online Safety Act,” Office of Senator Richard Blumenthal, press release, May 14, 2025, https://www.blumenthal.senate.gov/newsroom/press/release/blumenthal-blackburn-thune-and-schumer-introduce-the-kids-online-safety-act; “Children’s Online Privacy Protection Rule,” Federal Trade Commission, April 22, 2025, https://www.federalregister.gov/documents/2025/04/22/2025-05904/childrens-online-privacy-protection-rule; “S.1418—Children and Teens’ Online Privacy Protection Act,” US Congress, July 27, 2023, https://www.congress.gov/bill/118th-congress/senate-bill/1418/text. At the state level, California’s Age-Appropriate Design Code (CAADCA) has been challenged in court on First Amendment grounds.5“AB-2273: The California Age-Appropriate Design Code Act,” California Legislative Information, November 18, 2022, https://leginfo.legislature.ca.gov/faces/billCompareClient.xhtml?bill_id=202120220AB2273&showamends=false; “NetChoice v. Rob Bonta, Attorney General of the State of California, D.C. No. 5:22-cv-08861- BLF,” US Court of Appeals for the Ninth Circuit, August 16, 2024, https://cdn.ca9.uscourts.gov/datastore/opinions/2024/08/16/23-2969.pdf. Other states, including Nebraska and Vermont, have recently adopted similar codes that they hope will withstand First Amendment scrutiny.6For a comparison between both initiatives see: Bailey Sanchez, “Vermont and Nebraska: Diverging Experiments in State Age-Appropriate Design Codes,” Future of Privacy Forum, June 4, 2025, https://fpf.org/blog/vermont-and-nebraska-diverging-experiments-in-state-age-appropriate-design-codes. Utah has also recently enacted a law to protect content-creating minors from financial exploitation and privacy violations.7“Child Actor Regulation,” State of Utah, 2025, https://le.utah.gov/Session/2025/bills/enrolled/HB0322.pdf.

News headlines focus on apparent differences between US and European policies, which have fed growing transatlantic tension. However, there is a large degree of alignment on the need to protect children online while also safeguarding fundamental rights such as privacy and freedom of expression.

The overall governance landscape

The European and US approaches are fairly aligned on some governance aspects of regulating child protection online. Since the adoption of its rules for video sharing platforms in 2018, the EU has embraced a legislative path to protect minors online.8“Directive (EU) 2018/1808 of the European Parliament and of the Council of 14 November 2018 Amending Directive 2010/13/EU on the Coordination of Certain Provisions Laid Down by Law, Regulation or Administrative Action in Member States Concerning the Provision of Audiovisual Media Services (Audiovisual Media Services Directive) in View of Changing Market Realities,” Article 28b, https://eur-lex.europa.eu/eli/dir/2018/1808/oj/eng. This legislative framework was strengthened in 2022 with the adoption of the DSA. Both the video sharing platform rules and the DSA are largely principle based and rely on a form of collaboration with the industry, placing the onus on the platforms themselves to decide what constitutes an appropriate and proportionate level of protection for minors. The UK has also adopted a legislative path with the OSA.9“Online Safety Regulatory Documents and Guidance,” Ofcom, last updated December 15, 2025, https://www.ofcom.org.uk/online-safety/online-safety-regulatory-documents. Like the DSA, the OSA adopts a risk-based approach, with the larger and riskier platforms subject to stricter measures, and the UK regulator, Ofcom, has supplemented the legislation with detailed guidance.

The European Commission recently adopted guidelines to help online platforms understand and comply with their obligations under Article 28 of the DSA, including setting out a list of recommendations for platforms, but these are nonbinding. Safety by design is at the heart of the guidelines. The EU’s legislative approach focuses on ensuring platforms put in place systems and processes, while steering away from regulating the type of content that should be outlawed.

So far, the EU’s legislative framework has not led to a full harmonization of approaches to protect minors, and some member states have adopted more restrictive approaches. For example, France, Germany, Ireland, and Italy have adopted supplementary legislation to protect minors from harmful content such as online pornography.10Michèle Ledger, “Protection of Minors: Age Assurance,” Centre on Regulation in Europe, March 2025, https://cerre.eu/wp-content/uploads/2025/03/CERRE-DSA-Forum-Age-Assurance.pdf.

In the United States, the federal government has adopted legislation such as COPPA to tackle specific problems, such as the need to protect minors’ personal data.11“Part 312—Children’s Online Privacy Protection Rule (COPPA Rule),” Code of Federal Regulations, last updated April 22, 2025, https://www.ecfr.gov/current/title-16/chapter-I/subchapter-C/part-312. Despite heightened partisanship in Congress, leaders of both the Republican and Democratic Parties have expressed interest in supporting additional bipartisan legislation to protect children online.12“Chairmen Guthrie and Bilirakis Announce Legislative Hearing on Protecting Children and Teens Online,” Office of Energy and Commerce Chairman Brett Guthrie, press release, November 25, 2025, https://energycommerce.house.gov/posts/chairmen-guthrie-and-bilirakis-announce-legislative-hearing-on-protections-for-children-and-teens-online. Although there is less appetite at the federal level for legislation imposing binding platform-liability obligations, there is appetite at the state level to embrace the legislative path, and safety by design is the cornerstone of many of these initiatives.13“Public Interest Privacy Center Releases Updated State Law Maps,” Public Interest Privacy Center, press release, May 29, 2025, https://publicinterestprivacy.org/state-law-maps. That being said, the Kids Online Safety Act (a federal initiative) received the support of sixty co-sponsors, which shows that this is an area with some bipartisan support. The EU and the United States are also converging on some important aspects: more obligations are placed on larger platforms; there is an emphasis on protection and safety by design; and there is no “one size fits all” solution.

There is broad consensus among experts that, irrespective of geopolitical tensions, there has never been so much space for alignment at the policy level between different jurisdictions—and between Europe and the United States in particular. This is partly because Europe (with the DSA at the EU level and the OSA in the UK) takes a systemic risk approach and does not focus on moderating individual pieces of content. That places responsibility on the platforms to have processes and systems in place to design safe spaces at the outset.

There are also similarities in public and private enforcement of norms. In the EU and the UK, regulators play an important role in making sure that industry complies with the DSA, the AVMSD, and the OSA. In the United States, even if new federal laws are adopted, the creation of a dedicated federal regulator to publicly enforce the legislation is unlikely, though existing agencies such as the US Federal Trade Commission already have a remit over some of these issues. At the state level, attorneys general are empowered to enforce COPPA via civil actions despite it being a federal law. State attorneys general have many enforcement tools at their disposal, including the power to undertake industry-wide investigations. These are broadly in line with the enforcement powers of national competent authorities and the European Commission under the DSA (and Ofcom under the OSA). On both sides of the Atlantic, private enforcement through courts is also set to play an important role, though, to date, it has been more common in the United States than in either the EU or UK.

Harms against which children should be protected

In the EU, the harms against which children should be protected are potentially very wide and are not specifically defined in the DSA, which refers only to protecting minors’ “privacy, safety and security.”14“Article 71 Commitments—the Digital Services Act,” European Union, last visited January 3, 2025, https://www.eu-digital-services-act.com/Digital_Services_Act_Article_71.html. Furthermore, member states are free to set their own rules provided they are in line with EU legislation.

Some harms are outlawed at the EU level, such as the sharing of child sexual abuse material, dark patterns (i.e., deceptive techniques used by online platforms to manipulate users’ behavior), the processing of minors’ personal data without the consent of parents, and the sending of targeted advertising to children based on profiling.15The European Commission defines dark patterns as unfair commercial practices deployed through the structure, design, or functionalities of digital interfaces or system architecture that can influence consumers to take decisions they would not have taken otherwise. “Questions and Answers on the Digital Fairness Fitness Check,” European Commission, October 2, 2024, https://ec.europa.eu/commission/presscorner/detail/fi/qanda_24_4909. US policy initiatives at the state and federal levels also identify these harms as targets for regulation. The dissemination of child sexual abuse material, for example, is already a criminal offense.

A strong focus of legislation to protect minors on both sides of the Atlantic is to make sure that children cannot be contacted on platforms by unknown adults. At the state level (in Vermont in particular), lawmakers frame these as safety bills to avoid framing them as content regulation, which could invite challenges on First Amendment grounds. These design architecture elements, such as default settings that prevent children from being findable, are also central in the European Commission’s guidelines on Article 28 of the DSA, in the UK Information Commissioner’s Office’s age-appropriate design code, and in Ofcom’s guidance under the OSA.16“Age Appropriate Design: A Code of Practice for Online Services,” Information Commissioner’s Office, last visited December 22, 2025, https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/childrens-information/childrens-code-guidance-and-resources/age-appropriate-design-a-code-of-practice-for-online-services/.
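For illustration, the minimal sketch below shows what such safe-by-default design could look like in code. The field names are hypothetical; they mirror the design measures described above and are not drawn from any platform's actual implementation or from the regulations discussed here.

```python
# A minimal sketch of safety-by-design defaults for a minor's account.
# All field names are hypothetical illustrations, not taken from any
# real platform, code of practice, or regulation.
from dataclasses import dataclass

@dataclass
class MinorAccountDefaults:
    profile_searchable: bool = False         # account is not findable by strangers
    accepts_messages_from: str = "contacts"  # no unsolicited contact by unknown adults
    profile_visibility: str = "private"      # profile details hidden by default
    geolocation_sharing: bool = False        # no location broadcasting
    personalized_ads: bool = False           # no advertising based on profiling

# The safe values apply automatically at account creation; loosening them
# would require an explicit, age-appropriate choice (e.g., with parental
# involvement) rather than being the default.
print(MinorAccountDefaults())
```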

Data minimization (meaning that only the minimum amount of data needed may be gathered and processed) is seen as critical to mitigating harms in general, because there is a strong correlation between collecting vast amounts of data about children’s behavior online and using that data to target minors with harmful content. Data minimization could also lead to stronger protection for all users. While enforcing data minimization principles is a challenge, it can be done. In the UK, for example, Ofcom is required to work closely with the data protection authority. Operational coherence and cooperation between regulators are crucial in this area.
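As a concrete illustration, the sketch below enforces data minimization at the point of collection, keeping only the fields a service strictly needs. The allowed-field list is a hypothetical example, not a list drawn from the DSA, the GDPR, or any regulator's guidance.

```python
# A minimal sketch of data minimization: only fields strictly needed to
# provide the service are kept; everything else is dropped at collection
# time and never stored. The allowed-field list is hypothetical.
ALLOWED_FIELDS = {"age_band", "language", "content_preferences"}

def minimize(profile: dict) -> dict:
    """Keep only the allowed fields; discard the rest before storage."""
    return {k: v for k, v in profile.items() if k in ALLOWED_FIELDS}

raw_signup = {
    "age_band": "13-15",
    "language": "en",
    "content_preferences": ["sports"],
    "gps_location": (50.85, 4.35),      # dropped: not needed for the service
    "contact_list": ["alice", "bob"],   # dropped: enables unwanted profiling
}
print(minimize(raw_signup))
# {'age_band': '13-15', 'language': 'en', 'content_preferences': ['sports']}
```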

Balancing fundamental rights

The debate about balancing the need to protect children against the protection of certain fundamental rights (especially privacy, freedom of expression, and the rights of the child) is critical in the United States and in Europe. Initiatives in Europe and the United States tend to focus on tools and processes to protect minors, but steer away from regulating content on the platforms. Despite this, there is mounting debate regarding whether laws are creating a form of censorship, unlawfully constraining free speech, limiting users’ choices, or infringing on the rights of children. The question is wider than the protection of children online: content that is inherently dangerous for one person (minor or adult) might be harmless for another. This need to protect users from harmful but legal content is the most difficult to reconcile with the protection of freedom of speech and the requirement of data minimization.

In the United States, the question is being argued in court. Some federal courts have ruled that laws requiring age verification are unconstitutional because they violate the First Amendment and threaten privacy rights.17Ibid. Age verification laws are being challenged by NetChoice (a coalition of tech companies) and by free speech groups such as the Free Speech Coalition. The Supreme Court recently ruled that the age verification law in Texas does not violate the First Amendment because it only requires proof of age to access content that is obscene to minors; it does not directly regulate adults’ speech.18Texas Legislature, Relating to the publication or distribution of sexual material harmful to minors on an Internet website; providing a civil penalty, HB 1181, Passed June 12, 2023, https://capitol.texas.gov/billlookup/History.aspx?LegSess=88R&Bill=HB1181; “Free Speech Coalition, Inc., et al. v. Paxton, Attorney General of Texas,” US Supreme Court, June 17, 2025, https://www.supremecourt.gov/opinions/24pdf/23-1122_3e04.pdf. In both the EU and the United States, a considerable amount of policy work and research is being conducted on how to balance safety and privacy, especially in the context of age assurance requirements.19Stephen Balkam and Andrew Zack, “Balancing Safety and Privacy: A Proportionate Age Assurance Approach,” Family Online Safety Institute, October 10, 2025, https://fosi.org/policy/balancing-safety-and-privacy-a-proportionate-age-assurance-approach/.

At the EU level, the debate about balancing rights was not prominent while the DSA and the AVMSD were being adopted, probably because the rules were principle based and did not mention bans or age verification per se. Furthermore, the DSA contains safeguards to protect fundamental rights, such as giving users the right to challenge content moderation decisions (such as removals of posts, demotions of content, and account suspensions). The central article on the protection of minors in the DSA (Article 28) assumes that there cannot be safety for minors unless other rights, such as privacy, are protected as well.

Now that the DSA is being enforced, the protection of minors has become an enforcement priority for the European Commission. Some member states are calling for bans on children accessing social media platforms, while some political parties are questioning the legislation and the push for age verification solutions on free speech grounds. This debate is particularly intense in the context of the regulation on the fight against CSAM, which the European Parliament and the Council of the EU are amending in an attempt to reduce the impact of CSAM detection mechanisms on privacy, particularly in the context of end-to-end encryption.

The ultimate goal should be to protect everyone online, not just minors. This would avoid the need to put in place age assurance and age verification.

The debates on getting the balance right between the need to protect minors online and the need to protect certain fundamental rights are crystallizing around age verification and around proposals for an outright ban on children’s access to social media.

To date, there is no outright ban at the EU level on children accessing social media. Commission President Ursula von der Leyen had pledged to examine the question with the help of a panel of experts originally scheduled to be set up before the end of 2025.20“2025 State of the Union Address by President von der Leyen,” European Commission, September 9, 2025, https://ec.europa.eu/commission/presscorner/detail/ov/SPEECH_25_2053. Some member states are also discussing the option of a social media ban for children.21In particular, these states include Denmark, Greece, France, Spain, Italy, Ireland, and Poland. There is a strong call in the commission’s recently adopted guidelines under the DSA for certain platforms (such as adult content platforms) to prevent children from accessing them. Also, the Danish presidency of the EU and ministers from twenty-five member states recently adopted the Jutland Declaration, which welcomed “assessments” of a digital majority age.22“The Jutland Declaration: Shaping a Safe Online World for Minors,” Danish Presidency, Council of the European Union, October 10, 2025, https://www.digmin.dk/Media/638956829775203140/DIGMIN_The%20Jutland%20Declaration%20Shaping%20a%20Safe%20Online%20World%20for%20Minors%20101025.pdf. This assessment could help to determine the age at which minors should be allowed access to social media and other digital services—“giving them more time to enjoy life without an invasive online presence.”23Ibid., 2. This question is also high on the agenda in the United States, with some states requiring social media platforms to bar minors from accessing them (or requiring parental consent for a minor to have an account).24These states include Arkansas, Florida, Georgia, Ohio, and Utah.

On age verification, there is no mandatory technology at the EU level, but the EU guidelines on the protection of minors adopted under the DSA set out principles that age verification technology used by online platforms should meet.25These principles concern accuracy, reliability, robustness, privacy and data protection safeguards, and non-discrimination. In particular, the systems should be based on the “double anonymity” principle. According to this principle, the platform knows the age of users without identifying them, whereas an external site—which carries out the age verification by issuing a token—does not know which site the user will visit. The EU is also about to launch an EU mini-wallet as a temporary solution, pending the adoption of national solutions.26“Communication from the Commission.” Some member states have also set requirements on age verification that are enforced by national regulators.
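The "double anonymity" flow described above can be made concrete with a short sketch. The token format and function names below are illustrative assumptions, not any standardized EU design; for simplicity the sketch uses a symmetric HMAC key shared between verifier and platform, whereas a real deployment would use asymmetric signatures so that platforms can verify tokens without being able to mint them.

```python
# A minimal sketch of the "double anonymity" age-token flow: the verifier
# attests to the user's age without learning the destination site, and the
# platform checks the attestation without learning the user's identity.
# Names and token format are illustrative; real systems would use asymmetric
# signatures rather than a shared HMAC key.
import hmac, hashlib, json, os, time

VERIFIER_KEY = os.urandom(32)  # stands in for the verifier's signing key

def issue_age_token(is_over_threshold: bool) -> dict:
    """Age-verification service: checks age (e.g., against an ID document)
    and issues a short-lived token carrying only the age attribute.
    It never learns which platform the token will be presented to."""
    payload = {
        "over_threshold": is_over_threshold,
        "nonce": os.urandom(16).hex(),      # one-time value to prevent replay
        "expires": int(time.time()) + 300,  # valid for five minutes
    }
    body = json.dumps(payload, sort_keys=True).encode()
    tag = hmac.new(VERIFIER_KEY, body, hashlib.sha256).hexdigest()
    return {"payload": payload, "tag": tag}

def platform_accepts(token: dict) -> bool:
    """Platform: verifies the token is genuine and unexpired. It learns the
    user's age status but receives no identifying information."""
    body = json.dumps(token["payload"], sort_keys=True).encode()
    expected = hmac.new(VERIFIER_KEY, body, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, token["tag"]):
        return False  # forged or tampered token
    if token["payload"]["expires"] < time.time():
        return False  # expired token
    return token["payload"]["over_threshold"]

token = issue_age_token(is_over_threshold=True)
print(platform_accepts(token))  # True
```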

In the UK, the OSA has just entered into force, and the biggest and most popular adult platforms, such as Pornhub, must now deploy age checks for users based in the UK. Other platforms—including Bluesky, Discord, Reddit, and X—have also announced that they will deploy age assurance in the UK as a result of the act. This has led to a surge in virtual private network (VPN) downloads as users seek to bypass the checks, which underscores the importance of global alignment where possible.

In the United States, as noted above, state legislation imposing age verification is subject to frequent court challenges.27“Age Assurance & Age Verification Laws in the United States,” Centre for Information Policy Leadership, September 2024, https://www.informationpolicycentre.com/uploads/5/7/1/0/57104281/cipl_age_assurance_in_the_us_sept24.pdf. As in Europe, there is little agreement among the states on the methods and tools to use when verifying the age of online users, and states likewise seem to recognize that age assurance alone is not the solution.

On both sides of the Atlantic, the debates are similar in practice, including on how to attribute responsibility for age assurance across the supply chain (i.e., at what level age verification should take place: at the app store layer or by individual applications or websites). Where verification happens raises further questions about the extent to which other players in the chain can rely on it, and whether a single point of verification could undermine safety by discouraging applications and websites from making their own assessments.

About the author

Michèle Ledger is a researcher at the Research Centre in Information, Law and Society (CRIDS) of the University of Namur, where she also lectures on the regulatory aspects of online platforms in a post-master’s degree course. She has worked for more than twenty years at Cullen International, where she leads the company’s media regulatory intelligence service.

This issue brief benefits from the insights of discussants at an online roundtable on EU-US regulatory cooperation hosted jointly by CERRE and the Atlantic Council. However, the contents of this brief are attributable only to the author.

About CERRE

The Centre on Regulation in Europe (CERRE) is a not-for-profit think tank that provides high-quality studies and dissemination activities. It promotes robust and consistent regulation in Europe’s network, digital industry, and service sectors. CERRE’s members are regulatory authorities and companies operating in these sectors, as well as universities.

CERRE’s added value is based on

  • its original, multidisciplinary, and cross-sector approach covering a variety of markets (e.g., energy, mobility, sustainability, technology, media, and telecommunications);
  • the widely acknowledged academic credentials and policy experience of its research team and associated staff members;
  • its scientific independence and impartiality; and
  • the direct relevance and timeliness of its contributions to the policy and regulatory development process impacting network industry players and the markets for their goods and services.

CERRE’s activities include contributions to the development of norms, standards, and policy recommendations related to the regulation of service providers, to the specification of market rules, and to improvements in the management of infrastructure in a changing political, economic, technological, and social environment. CERRE’s work also aims to clarify the respective roles of market operators, governments, and regulatory authorities, as well as contribute to the enhancement of those organizations’ expertise in addressing regulatory issues of relevance to their activities.

About the Atlantic Council

The Atlantic Council promotes constructive leadership and engagement in international affairs based on the Atlantic community’s central role in meeting global challenges. The council provides an essential forum for navigating the dramatic economic and political changes defining the twenty-first century by informing and galvanizing its uniquely influential network of global leaders. The Atlantic Council—through the papers it publishes, the ideas it generates, the future leaders it develops, and the communities it builds—shapes policy choices and strategies to create a more free, secure, and prosperous world.

The Atlantic Council’s Europe Center conducts research and uses real-time analysis to inform the actions and strategies of key transatlantic decision-makers in the face of great-power competition and a geopolitical rewiring of Europe. The center convenes US and European leaders to promote dialogue and make the case for the US-EU partnership as a key asset for the United States and Europe alike. The center’s Transatlantic Digital Marketplace Initiative seeks to foster greater US-EU understanding and collaboration on digital policy matters and makes recommendations for building cooperation and ameliorating differences in this fast-growing area of the transatlantic economy.
