EU Intensifies Child Safety Enforcement, Flags Gaps in Meta Age Checks
Ramsha Jahangir / Apr 29, 2026
Henna Virkkunen, Executive Vice-President for Tech Sovereignty, Security and Democracy of the European Commission.
The European Commission has preliminarily found that Meta may be in breach of the Digital Services Act over how Facebook and Instagram handle access by users under 13. The finding focuses on whether the company has “diligently” identified, assessed and mitigated systemic risks to minors.
According to the Commission, Meta’s safeguards do not effectively enforce its stated minimum age of 13. Users can enter a false date of birth at sign-up without meaningful verification, while systems to detect or remove underage users after account creation appear limited. Reporting tools are also described as difficult to access and inconsistently trigger follow-up action.
Executive Vice-President Henna Virkkunen said Meta's measures for enforcing its own minimum-age rules "are doing very little." "The DSA requires platforms to enforce their own rules: terms and conditions should not be mere written statements, but rather the basis for concrete action to protect users — including children," she said.
If the findings are confirmed, the Commission may issue a non-compliance decision and a fine of up to 6% of Meta's global annual turnover. Periodic penalty payments may also be imposed to compel compliance. The findings do not prejudge the final outcome.
In a statement to The New York Times, Meta said it disagreed with the preliminary findings. It said Instagram and Facebook are intended for users aged 13 and above and that it already uses systems to detect and remove underage accounts. The spokesperson said the company would share more next week about "additional measures rolling out soon" and added: "Understanding age is an industry-wide challenge, which requires an industry-wide solution, and we will continue to engage constructively with the European Commission on this important issue."
A divided regulatory landscape
The finding comes as the EU expands enforcement under the DSA on minors’ online safety. It is the second preliminary finding this year. In February, the Commission accused TikTok of exposing teenagers to risks linked to addictive design features, signaling a broader focus on platform design and its impact on younger users.
These efforts are taking place amid diverging approaches within Europe. Several member states are advancing national restrictions on minors’ access to social media, including proposals for outright bans for younger users, while the Commission has so far focused on enforcing existing obligations under the DSA.
On the same day as the Meta findings, the Commission adopted a recommendation encouraging member states to make its EU age verification app available by the end of 2026, either as a standalone tool or integrated into the European Digital Identity Wallet. The system uses zero-knowledge proof cryptography, allowing users to prove they meet an age threshold using official identification while platforms receive only a binary confirmation rather than personal data.
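The privacy property described here can be illustrated with a short sketch. This is not an actual zero-knowledge proof — real ZKP schemes involve substantially more cryptographic machinery — and all names, keys and thresholds below are hypothetical; the sketch only demonstrates the data-minimization principle the Commission describes: an issuer checks the user's birth date and the platform receives a signed yes/no, never the date itself.

```python
import hmac, hashlib, json, secrets
from datetime import date

# Illustrative only: a real deployment would use zero-knowledge proofs or
# public-key attestations, not a shared HMAC key. The point of the sketch
# is that the platform verifies a boolean claim without seeing the birth date.

ISSUER_KEY = secrets.token_bytes(32)  # held by the (hypothetical) ID issuer

def issue_age_token(birth_date: date, threshold: int, today: date) -> dict:
    """Issuer checks the user's birth date and signs only the result."""
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day))
    claim = {"meets_threshold": age >= threshold, "threshold": threshold,
             "nonce": secrets.token_hex(8)}  # nonce prevents token reuse
    payload = json.dumps(claim, sort_keys=True).encode()
    sig = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "sig": sig}

def platform_verify(token: dict) -> bool:
    """Platform checks the signature; it never learns the birth date."""
    payload = json.dumps(token["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, token["sig"]):
        raise ValueError("invalid attestation")
    return token["claim"]["meets_threshold"]

# A 10-year-old fails the threshold; the platform learns only "no".
token = issue_age_token(date(2015, 6, 1), threshold=13, today=date(2026, 4, 29))
print(platform_verify(token))  # False
```

The design choice the Commission highlights is visible in `platform_verify`: the only user attribute crossing the boundary to the platform is a boolean.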
Virkkunen described effective age verification as “the next piece of the puzzle” in protecting children online. While use of the app is voluntary, platforms that do not adopt it are expected to demonstrate that their own age assurance systems are equally effective.
Taken together, the measures reflect an increasing focus on child safety within EU digital enforcement. They also raise a broader question of whether the EU is converging on a single model for protecting minors online, or whether DSA enforcement and the parallel development of age verification infrastructure represent diverging regulatory approaches aimed at the same objective.
Officials from eight EU member states told Politico they are either hesitant or not planning to adopt the EU app, with several indicating a preference for national alternatives.
Jessica Galissaire, senior policy researcher for the digital public sphere at Interface, said the Meta finding demonstrates the DSA operating as intended. “Right now, we’re at a crossroads. The Commission is showing it’s committed to enforcing existing rules, but there is also growing momentum for blanket bans on social media for young people. Those are two very different approaches, and this case once again shows we already have the tools to make online platforms safer.”
Risk reports under scrutiny
A central element of the Commission's case is Meta's internal risk analysis, which it describes as "incomplete and arbitrary." Commission spokesperson Thomas Regnier told Tech Policy Press that the case is based on monitoring "across risk assessment cycles," drawing on Meta's risk assessment reports, observations of how the platforms operate, and replies to information requests. The Commission added that civil society reports on under-13 access to Instagram and Facebook have informed its analysis.
The Commission argues that Meta's risk assessment does not adequately reflect the scale of under-13 access in the EU, citing external evidence suggesting that approximately 10% to 12% of children under 13 use Facebook or Instagram. It also points to research on children's heightened vulnerability to online harms, which it says is not sufficiently reflected in Meta's evaluation.
A Eurochild analysis of the first cycle of VLOP risk assessments, published in April 2025, concluded that platform reports across the board "lack transparency, downplay serious risks for children, and propose insufficient safeguarding measures," with mitigations frequently "described generically" and platform-specific risks like beauty filters, popularity metrics and engagement-maximizing recommender design "rarely examined."
Christian Cirhigiri, director of the online expression and civic space program at CDT Europe, said the finding reflected "a recurring theme of the inadequacy of risk assessments and mitigation measures that we have seen in several preliminary findings and enforcement decisions."
He added: "This shows on one hand the need for greater coherence in Meta's approach with well-established risk assessment methodologies in terms of upholding rights-respecting conduct for all users, including minors." Cirhigiri said the case also "speaks to a recurring demand from civil society organizations who have been for a long time advocating for guidelines on risk assessments from the EU Commission to address the lack of consistency by companies in addressing all systemic risks, including for child safety online."
What counts as adequate?
In response to questions, the Commission set out specific expectations for how Meta should approach a revised risk assessment. The Commission said it expects platforms to apply "a clear and objective methodology for the assessment of risks specific to each service" and to consult with civil society organizations and researchers. It also pointed to existing transparency obligations under Articles 34, 35, 37 and 42(4) of the DSA, which require very large online platforms to publish their risk assessment reports, mitigation measures, and audit reports annually.
While the DSA does not prescribe specific mitigation measures, the Commission says platforms must demonstrate effectiveness in practice. It suggests Meta could strengthen age assurance systems; improve internal processes, resources, testing, and documentation for addressing underage access; and better evaluate how well existing safeguards prevent, detect and remove underage users.
The Commission’s approach is grounded in its 2025 Guidelines on the Protection of Minors, published last July, which set out expectations under Article 28 of the DSA. These include making minors’ accounts private by default, adapting recommender systems to reduce exposure to harmful content and “rabbit hole” effects, disabling features associated with excessive use such as streaks, autoplay and push notifications, and strengthening moderation and reporting tools. The guidelines also state that age assurance methods must be accurate, reliable, robust, non-intrusive and non-discriminatory, and make clear that self-declared age alone does not meet that threshold.
On age assurance specifically, the Commission said that Meta can use its EU age verification app "or adopt an equivalent in terms of standards." Where social media platforms fall within the broader age-assurance framework — and what implementation counts as "effective" — has been a continuing source of debate.
Galissaire said the Commission's existing guidance distinguishes between contexts in which different approaches are appropriate. "The Commission's guidelines on article 28 of the DSA are quite clear: age verification (ID or official database based) for high-risk services (e.g. access to porn platforms, online gambling, sale of alcohol, etc.), and age estimation for medium risk services, under which social media platforms tend to fall," she said, adding that "the only exception would be if serious systemic risks for minors are identified, which hasn't been the case until now, although the recent preliminary findings on TikTok and Meta may reshuffle the cards in this regard."
The Commission's response sets out what it expects of Meta, but the question of process and accountability remains less defined. If Meta submits a new or updated risk assessment in response to the preliminary findings, it is not yet clear how the Commission will evaluate it in practice. It remains to be seen whether the Commission will publish detailed criteria on what a compliant Article 34 risk assessment should include for child safety, or whether expectations will continue to be defined through case-by-case enforcement.
The Commission's preliminary view indicates that compliance with Article 34 extends beyond having policies in place, requiring platforms to demonstrate that their measures are effective in practice. Meta now has the opportunity to examine the Commission's investigation files and respond formally in writing before any final decision is taken.
Meta did not respond to a request for comment.