Analysis

What Europe’s Digital Services Act Says About Age Assurance

Joan Barata / Sep 26, 2025

European Commission President Ursula von der Leyen delivers the 2025 State of the Union address at the European Parliament in Strasbourg on September 10, 2025. Source: X

The Digital Services Act (DSA) establishes a broad regulatory framework applicable to intermediary services, and particularly to online platforms, in the European Union (EU). A key feature of the DSA is the establishment of a series of obligations regarding content moderation by online platforms, as well as the recognition of procedural rights for users. In practice, this means that platforms must establish and implement a series of tools, procedures, and practices to address the dissemination of illegal content and to prevent certain types of harm.

Paragraph 1 of Article 28 DSA obliges providers of online platforms accessible to minors to put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors on their service. Paragraph 2 of the same article grants the European Commission the authority to issue guidelines to assist providers of online platforms in implementing paragraph 1, following prior consultation with the European Board for Digital Services.

On 14 July, the Commission published its guidelines on the protection of minors under the DSA to ensure a safe online experience for children and young people. In order to prepare the guidelines, the Commission launched a call for evidence, engaged with experts and children and young people, sought feedback through a public consultation, and held several meetings with the European Board for Digital Services. As acknowledged by the Commission, the guidelines set out a non-exhaustive list of proportionate and appropriate measures to protect children from online risks such as grooming, harmful content, problematic and addictive behaviours, as well as cyberbullying and harmful commercial practices.

Human rights standards

Before analyzing the mentioned guidelines in more detail, it is necessary to provide a general overview of the implications that regulations aimed at protecting children online may have in terms of fundamental rights.

Above all, it is important to underscore the fact that children, like any other individual, have their right to express ideas and opinions protected under international human rights standards. Article 19 of the International Covenant on Civil and Political Rights (ICCPR) establishes that “everyone” has the right to seek, receive and impart information and ideas of all kinds. According to Article 12 of the United Nations Convention on the Rights of the Child (UNCRC), children have the right to express their views, feelings, and wishes in all matters affecting them. Article 16 of the Convention protects the specific right of children to privacy. This protection operates not only vis-à-vis possible interferences from state or private actors (such as Big Tech companies and media) but also in cases of unnecessary and disproportionate intrusions from parents and guardians regarding, for example, their online behavior.

As acknowledged by UNICEF, the Internet provides tremendous opportunities for children to express themselves and presents children with a vast quantity of useful information. While children may need assistance to safely exercise their rights to freedom of expression and access to information, protective measures may become unduly restrictive as children’s capacities to navigate the digital world develop. This particularly requires avoiding disproportionate monitoring by governments or parents, as well as unwarranted limitations on anonymity. Children must therefore be able to explore the digital world without encountering overly restrictive filters, whether at a network or device level, or other systems or mechanisms that restrict access to potentially beneficial content.

With specific regard to social media and online platforms, while certain digital services can pose risks to minors’ privacy, security, and physical and mental development, the unavoidable role that contact with these resources plays in the acquisition of digital skills, as well as in access to education and knowledge in accordance with children’s age and maturity, must not be ignored. The digital environment provides children with unprecedented opportunities to express their opinions and points of view, access diverse communities, and acquire skills and confidence that will be key to their full adult citizenship.

Apart from protecting everyone’s right to freedom of expression in Article 11, the EU Charter of Fundamental Rights establishes in Article 24 that children are entitled to such “protection and care as is necessary for their well-being,” as well as recognizing their right to “express their views freely.” According to the second paragraph of that article, in all actions taken by public or private institutions, the best interests of the child must be a primary consideration.

The Commission guidelines and age assurance

General approach

From a legal perspective, the guidelines should not be viewed as a rigid regulatory document. However, their normative value should not be overlooked, since the Commission describes them as a “significant and meaningful benchmark” on which it will base itself when applying Article 28 DSA and determining the compliance of providers of online platforms accessible to minors with that provision (paragraph 12 of the guidelines). They may also “inspire” the oversight activities of Digital Services Coordinators and national authorities when enforcing these provisions of the DSA. Furthermore, the measures in the guidelines are without prejudice to other conditions and obligations that may be established at the national level via the corresponding legal and regulatory instruments.

The guidelines are based on a series of relevant principles. Particular references are made to proportionality and appropriateness, which require a case-by-case analysis by each provider regarding the adoption of specific measures to tackle concrete and diverse types of risks; to the protection of all children’s rights, including freedom of expression, non-discrimination, privacy, inclusion, political participation and access to information and education, together with the primary consideration of their best interests; to privacy, safety and security by design, which means embedding protections by default into the design, operation and management of organisations, as well as into products and services; and to age-appropriate design, which requires service providers to align their services with the developmental, cognitive and emotional needs of minors, while ensuring their safety, privacy, and security with attention to the specific age or stage of development.

In addition to the above, the guidelines also establish criteria for platforms to properly assess a series of significant risks when protecting minors online. These include, among others, the likelihood that minors will access the service; the actual or potential impact on the privacy, safety and security of minors that the online platform may pose or give rise to; and the potential positive and negative effects of any measure on children’s or other users’ rights, ensuring that these rights are not disproportionately or unduly restricted and that positive effects can be maximised.

Age assurance and age restrictions

Within this general context, a specific section of the guidelines is dedicated to age assurance. This concept refers to the measures implemented by service providers to establish, with varying degrees of accuracy, reliability, and robustness, the age of their users. Age assurance methods may be used to restrict access by users below a certain age to certain types of content and services, to prevent adults from accessing certain platforms that are designed for minors, or to ensure that children only have access to certain content, features, or activities that are appropriate for their consumption, taking into account their age and evolving capacities. Age assurance measures may include various types of tools, ranging from self-declaration to age estimation and strong age verification based on physical identifiers or verified sources of information. Age assurance measures and age restrictions are therefore two different notions, where the former is instrumental to the latter.

Restrictions based on age obviously have important implications that need to be considered not only at the regulatory level but also by online platforms when establishing specific limitations. The guidelines incorporate, in this sense, a series of very relevant recommendations:

A) Age restrictions and age assurance may, in many cases, complement other possible measures also contemplated in the guidelines.

B) A proper assessment must be made to ensure that any restriction on the exercise of fundamental rights and freedoms of the recipients, especially minors, is proportionate.

C) Age estimation methods that require the processing of personal data require that data protection principles, especially data minimization, are properly implemented.

D) Children must be involved in the design, implementation, and evaluation of age restrictions and age assurance methods.

E) Instead of age-restricting the service as a whole, providers of online platforms should assess which content, sections, or functions on their platform carry risks for minors and implement access restrictions supported by age assurance methods to reduce these risks for minors in proportionate and appropriate ways.

F) In this sense, age-restricting the service as a whole, supported by age verification, should be reserved for cases where the risks to minors' privacy, safety, or security cannot be mitigated by less restrictive measures (for example, access to any type of pornographic content), or where applicable legislation establishes a minimum age to access certain types of products and services (sale of alcohol or tobacco, for example).

As mentioned already, the implementation of these criteria will require the use of age verification technologies adapted to the characteristics of each case. This is a particularly sensitive area where the protection of the rights of both children and adults (particularly when it comes to privacy, but also freedom of expression and free access to information) may face significant tensions and challenges. This is why the Commission itself has committed to supplement, “in due course,” the existing guidelines with a technical analysis of the main existing methods of age estimation. The guidelines particularly contemplate that “age estimation methods can complement age verification technologies” and can be used as a temporary alternative, in particular in cases where verification measures that meet the criteria of effectiveness, accuracy, and protection of users’ right to privacy and data protection are not yet readily available. It is, in any case, established that methods relying on verified and trusted government-issued IDs, without providing the platform with additional personal data, “may constitute an effective age verification method, insofar as they are based on anonymized age tokens.”
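To picture what an “anonymized age token” amounts to in practice, consider a simple attestation flow: a trusted issuer (such as an identity wallet) checks a government ID once and then signs a token asserting only that the holder is above a given age threshold, which a platform can verify without ever receiving identity data. The Python sketch below is purely illustrative and reflects this article’s reading of the concept rather than any specification from the Commission or a wallet provider; the function names, token fields, and signature scheme are hypothetical assumptions.

```python
# Hypothetical sketch of an anonymized age token: the issuer signs a claim
# containing only an age-threshold assertion and a random nonce, so the
# verifying platform learns "over 18" but no identity data.
import json
import os

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

# Issuer side (e.g., an identity wallet): key pair created once; the ID check
# itself happens out of band and is not part of this sketch.
issuer_key = ed25519.Ed25519PrivateKey.generate()
issuer_public_key = issuer_key.public_key()


def issue_age_token(over_18: bool) -> dict:
    """Create a minimal token asserting only an age threshold."""
    claim = {"over_18": over_18, "nonce": os.urandom(16).hex()}
    payload = json.dumps(claim, sort_keys=True).encode()
    return {"claim": claim, "signature": issuer_key.sign(payload).hex()}


# Platform side: verify the issuer's signature and read only the threshold claim.
def verify_age_token(token: dict) -> bool:
    payload = json.dumps(token["claim"], sort_keys=True).encode()
    try:
        issuer_public_key.verify(bytes.fromhex(token["signature"]), payload)
    except InvalidSignature:
        return False
    return bool(token["claim"]["over_18"])


token = issue_age_token(over_18=True)
print(verify_age_token(token))  # True: age threshold proven, no identity shared
```

Even in this stripped-down form, design choices such as whether a token is single-use or reusable, and whether the issuer can observe which platform requests verification, determine how much linkability remains, which is precisely where the tracking and privacy concerns discussed below arise.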

In this context, the EU Digital Identity Wallets have been presented as safe, reliable, and private means of electronic identification, embedding the possibility of receiving an age token, and are to be provided by member states of the Union by the end of 2026. However, depending on specific technical characteristics, tokenization may not completely eliminate tracking risks and may still raise privacy issues, as noted, for example, by French regulators. Furthermore, the organization AI Forensics has disclosed how a purportedly secure “double verification” solution deployed for major pornographic platforms was “potentially leaking information about people visiting pornographic websites.”

Regarding these aspects, it is also worth noting that the Australian government has recently issued an official report on the so-called Age Assurance Trial, which provides a comprehensive evaluation of age assurance technologies, assessing their performance against a wide range of internationally recognized criteria. Regarding age verification, the report establishes that it can be done in Australia privately, efficiently, and effectively.

However, the report also stresses that there is no single solution to age verification, since a range of valid models exist, shaped by different contexts, needs, and expectations. It also points to the need to use opportunities to “enhance risk management and system capability, especially regarding real-time detection of lost or stolen documents”. Regarding age estimation methods, the report emphasizes the need to understand that “there will be a margin of error” which may require “applying a buffer age to reduce that margin to an acceptable level,” acknowledging that false negatives “will then be inevitable and alternative methods will be required to correct them.” Last but not least, the report also contains interesting findings regarding the use of parental controls to manage, restrict or monitor a child’s access to digital content, services or device functions. While such controls may be useful to indicate the presence of a supervised child profile or device without needing to collect identity data, it is also established that “most systems are static and do not adapt easily to children’s evolving maturity, preferences or rights to participate in decisions about their digital lives.” The latter may lead to subjecting children to restrictions “without visibility or recourse, raising important questions around dignity, fairness and transparency.”

There are also critical voices warning about the risks of using these methods, particularly when, due to a lack of official IDs, minors may be forced to submit invasive biometric data, adults may be subjected to potential surveillance and censorship, and certain enterprises may be forced to restrict access or compromise privacy under the threat of significant fines. Within the context of EU legislation, it has also been pointed out that, since age estimation may in many cases constitute profiling, generic assertions about age assurance methods, such as those included in the guidelines, are problematic from a data protection perspective (thus showing once again the problematic interplay between the DSA and the General Data Protection Regulation).

To sum up, the guidelines contain relevant standards for the interpretation and implementation of the provisions included in article 28 DSA. It is also important to underscore that they promote a flexible, tailored, contextual, and proportional risk-based approach, which includes the proper consideration and prioritization of the interests and fundamental rights of children. That being said, it is also true that many substantial decisions are still left to the discretion of service providers, based on sometimes broad and vague categorizations. The success of the guidelines will therefore depend on further efforts by both the Commission and the industry to achieve a common understanding around how to better protect children in different contexts, as well as on the development of proper technical solutions aligned with the aim of ensuring a high level of privacy, safety, and security of minors online. In this context, the role of member states will also be very relevant.

What lies ahead

There is no doubt that protecting children online is not only an important public policy area but also a matter of political struggle. It is, particularly, the territory of strong political statements, which quite often are not followed by clear and articulated proposals. Very recently, the president of the European Commission, Ursula von der Leyen, used her annual State of the Union speech to indicate that the Commission will examine the current “unfettered access to social media” by minors and consider tougher restrictions with the aim of preventing bullying, access to adult content, and self-harm online. Similar remarks were also made ahead of the United Nations General Assembly. It was not made clear whether this would involve introducing new regulations beyond the provisions of the DSA and the implementation of the guidelines. It seems that further decisions may depend on the views of a future panel of experts that will advise von der Leyen on these matters.

This statement also coincides with the echoes of recent legal reforms in Australia, which impose what essentially constitutes an outright ban on children’s use of social media. The measure has nevertheless received severe criticism due to its disproportionate impact on the most fundamental rights of minors, the risks it creates in terms of free access to certain content by adults, as well as the threats to privacy that the imposition of age verification mechanisms can generate. It is worth highlighting the statements made in this regard by organizations such as Amnesty International (“a ban that isolates young people will not meet the government’s objective of improving young people’s lives”), Save the Children (the focus should be on mechanisms to hold social media corporations to account and on responses that would address root causes of harm), and UNICEF (“the proposed changes won’t fix the problems young people face online (…) it’s more important to make social media platforms safer and to listen to young people to make sure any changes actually help”). Furthermore, recent proposals in countries such as France, Spain, and Greece, among others, clearly show the risk of oversimplification of the debate and the use of arbitrary measures in the absence of comprehensive policies.

Policymakers, legislators and regulators must focus their efforts on establishing appropriate regulatory frameworks that establish the duties, obligations, and safeguards that must be respected and implemented by the various actors involved (most importantly, companies operating in the field of digital services). They must also take into account the need for new and better public policies in the areas of education, the elimination of social risk factors and inequality, and adequate awareness-raising among those responsible for minors.

Radical, disproportionate measures such as outright bans are simplistic solutions incompatible with emphatic declarations regarding the comprehensive protection of minors in the digital world. Furthermore, they may end up favoring those willing to act maliciously within the margins of an environment formally reserved for adults.

Authors

Joan Barata
Dr. Joan Barata is an international human rights expert who particularly works on freedom of expression, media regulation, and intermediary liability issues. He is currently Visiting Professor at the School of Law at Católica University in Porto. He was a Senior Fellow at The Future Free Speech proj...
