Decoding Article 28 of the DSA: Age assurance and service design for online platforms

  • This article has been produced jointly by the Spanish National Markets and Competition Commission (CNMC) and the Spanish Data Protection Authority (AEPD)


With children rapidly adopting digital technologies, the European Commission’s Guidelines under Article 28(4) of the Digital Services Act (DSA) address how providers of online platforms accessible to minors shall put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors on their service. 

This post, published jointly by the CNMC and the AEPD, in their respective roles as Digital Services Coordinator and as the competent authority for the application of Article 28(2) of the DSA in Spain, specifically explores age assurance. By navigating the interplay between DSA protection mandates and GDPR principles and requirements, the Guidelines promote non-linkable, privacy-preserving solutions, such as anonymous tokens and the EU Digital Identity Wallet, to ensure the best interests of the child are secured without compromising the rights and freedoms of all users.

 

The Digital Services Act (DSA) aims to contribute to the proper functioning of the internal market and create a safer, more predictable, and more reliable digital environment. To this end, it establishes harmonised rules for intermediary services, particularly online platforms.

A key part of this framework is Article 28(1), which specifically mandates that platforms accessible to minors must implement “appropriate and proportionate” measures to ensure a high level of privacy, safety, and security for them. Crucially, Article 28(3) clarifies that this safety obligation does not require platforms to collect additional personal data to prove a user’s age, effectively resolving the “privacy-safety paradox” by encouraging solutions that are safe and private by design and by default. In practice, this means platforms must prioritise the best interests of the child when designing their interfaces, ensuring that younger users are protected without compromising the fundamental rights and freedoms of all users.

The Guidelines on measures to ensure a high level of privacy, safety and security for minors online aim to support providers of online platforms in addressing the different risks younger users face in the digital environment. To that end, they provide a comprehensive overview of the types of risks that may threaten minors and of the measures that help these providers comply with their obligations under Article 28(1) of the DSA. Additionally, the Guidelines aim to help Digital Services Coordinators (DSCs), such as the CNMC, and other competent national authorities, such as the AEPD, when they apply, interpret, or enforce the rules regarding the protection of minors under the DSA.

The Guidelines cover several key operational areas for platforms, including risk reviews, service design, user reporting, feedback and complaints, and internal governance. While the measures are specifically tailored to minors, the Commission encourages platforms to adopt them for all users, creating a safer online ecosystem overall.

A cornerstone of the Guidelines is Section 6.1, which addresses age assurance: the process of determining a user’s age to ensure adequate protection. “Age assurance” is an umbrella term for three different technical approaches (a short code sketch follows the list):

  • Self-declaration: Solutions asking the user simply to state their age or confirm their age range. While easy to implement and to use, they are unreliable and easily bypassed.
  • Age estimation: Algorithmic methods (like facial analysis or behavioural profiling) that establish a likelihood that a user falls within an age range or is above or below an age threshold.
  • Age verification: Solutions that rely on verified and trusted sources (such as government IDs or driver's licenses) to establish age with a high degree of certainty.
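
To make the contrast concrete, the following minimal Python sketch models the three approaches behind one shared, hypothetical interface. Every name and confidence value here is an illustrative assumption, not something defined in the Guidelines.

```python
# A sketch of the three age assurance approaches behind one shared,
# hypothetical interface. Confidence values are illustrative placeholders.
from abc import ABC, abstractmethod

class AgeAssuranceMethod(ABC):
    @abstractmethod
    def is_over(self, threshold: int) -> tuple[bool, float]:
        """Return (meets the threshold?, confidence in [0, 1])."""

class SelfDeclaration(AgeAssuranceMethod):
    """User states their own age; cheap to run, but trivially bypassed."""
    def __init__(self, declared_age: int):
        self.declared_age = declared_age

    def is_over(self, threshold: int) -> tuple[bool, float]:
        return self.declared_age >= threshold, 0.2  # low confidence

class AgeEstimation(AgeAssuranceMethod):
    """Algorithmic likelihood, e.g. from facial analysis."""
    def __init__(self, estimated_age: float, margin_of_error: float):
        self.estimated_age = estimated_age
        self.margin_of_error = margin_of_error

    def is_over(self, threshold: int) -> tuple[bool, float]:
        # Subtract the model's error margin so borderline users fail safe.
        return self.estimated_age - self.margin_of_error >= threshold, 0.8

class AgeVerification(AgeAssuranceMethod):
    """Backed by a verified, trusted source such as a government ID."""
    def __init__(self, verified_age: int):
        self.verified_age = verified_age

    def is_over(self, threshold: int) -> tuple[bool, float]:
        return self.verified_age >= threshold, 0.99  # near-certain
```

The point of the shared interface is that each method answers the same question, above or below a threshold, with very different levels of certainty and intrusiveness.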

As mentioned before, Article 28(1) of the DSA requires platforms accessible to minors to implement appropriate and proportionate measures to ensure a high level of privacy, safety, and security. This obligation does not mean they need to know who the user is.

On the other hand, the GDPR mandates the principle of data minimisation (Article 5(1)(c)), meaning platforms should collect only the personal data that is necessary in relation to the purposes for which it is processed. To comply with this principle, as well as with other requirements related to the protection of personal data, the Guidelines recommend taking into account the European Data Protection Board (EDPB) statement on Age Assurance.

If a platform starts collecting IDs or scanning faces under the pretext of protecting minors, without the necessary proportionality justification grounded in the risks its service poses to minors, and without appropriate privacy safeguards, it risks violating the fundamental rights and freedoms of all its users, as well as Article 28(3) of the DSA.

As already mentioned, Article 28(3) of the DSA explicitly states that platforms are not obliged to process additional personal data solely to determine whether someone is a minor. Instead of requiring users to reveal their identities or processing children’s personal data, the Guidelines encourage age assurance solutions that allow users to prove they are above an age threshold without revealing any other information.

According to the Guidelines, access restrictions supported by age verification solutions are necessary in the following cases:

  • when users access high-risk products or content (for example drugs, alcohol, tobacco, nicotine products, pornographic content, or gambling);
  • when a platform’s Terms and Conditions require users to be at least 18 years old;
  • when a risk review identifies significant content, conduct, contact, or consumer risks that less intrusive measures cannot effectively manage; and
  • when Union or national law prescribes a specific minimum age to access certain products or services displayed on the platform.

In these high-risk scenarios, age estimation methods can complement age verification solutions or serve as a temporary alternative (this transitory period should not extend beyond the first review of the Guidelines). Estimation methods are also suitable when a risk review identifies medium risks that are not high enough to mandate strict verification but cannot be addressed by simple self-declaration, or when a platform’s Terms and Conditions require a minimum age lower than 18 (e.g., 13 or 16) based on the provider’s assessment of risks.

Online platforms accessible to minors where only some content, sections, or functions pose a risk to them do not need to apply a blanket age limit to the entire site. Platforms should identify the specific risky areas, such as adult sections, restricted advertisements, or influencer product placement, and apply age verification only to those parts. In this way, children are protected by default, and risky areas are made available only to adult users whose age has been verified accordingly.
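
A minimal sketch of this tiered, per-section approach follows. The risk labels, section names, and method mapping are hypothetical illustrations of the logic described above, not values taken from the Guidelines.

```python
# Illustrative per-section gating: risk is assessed per area of the
# service, so a blanket age gate on the entire platform is avoided.
from enum import Enum

class Risk(Enum):
    LOW = "low"        # self-declaration may suffice
    MEDIUM = "medium"  # age estimation
    HIGH = "high"      # age verification (estimation only as a stopgap)

# Hypothetical outcome of a risk review, section by section.
SECTION_RISK = {
    "general_feed": Risk.LOW,
    "influencer_product_placement": Risk.MEDIUM,
    "adult_content": Risk.HIGH,
}

# Least intrusive method that is still adequate for each tier.
REQUIRED_METHOD = {
    Risk.LOW: "self_declaration",
    Risk.MEDIUM: "age_estimation",
    Risk.HIGH: "age_verification",
}

def method_required(section: str) -> str:
    """Return the assurance method this section's risk tier calls for."""
    return REQUIRED_METHOD[SECTION_RISK[section]]

assert method_required("adult_content") == "age_verification"
assert method_required("general_feed") == "self_declaration"
```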

Once a platform determines that an age assurance solution, whether verification or estimation, is necessary, it must evaluate its chosen method against five rigorous criteria (a sketch of such an assessment follows the list):

1. Accuracy: How correctly does the solution determine if the user is above the age threshold?

2. Reliability: Does the solution work consistently in real-world conditions, not just in a lab?

3. Robustness: How hard is it for a tech-savvy teen to circumvent the age restriction?

4. Non-intrusiveness: Does it respect users’ rights and freedoms?

5. Non-discrimination: Does it work for all users, regardless of skin tone, disability, or socio-economic status (e.g., lacking a passport)?
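
As one way a provider might document such an evaluation, here is a minimal sketch of the five criteria as a scored checklist. The scoring scale, threshold, and the rule that no criterion may compensate for another are illustrative assumptions, not values prescribed by the Guidelines.

```python
# A hypothetical scored checklist for the five evaluation criteria.
from dataclasses import dataclass, fields

@dataclass
class CriteriaAssessment:
    accuracy: float            # correctness against the age threshold
    reliability: float         # consistency in real-world conditions
    robustness: float          # resistance to circumvention
    non_intrusiveness: float   # respect for users' rights and freedoms
    non_discrimination: float  # works regardless of skin tone, disability, ...

    def acceptable(self, minimum: float = 0.7) -> bool:
        # Assumption: every criterion must clear the bar on its own;
        # a strength in one cannot compensate for a weakness in another.
        return all(getattr(self, f.name) >= minimum for f in fields(self))

assessment = CriteriaAssessment(0.95, 0.9, 0.8, 0.85, 0.6)
print(assessment.acceptable())  # False: fails on non-discrimination
```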

The Guidelines establish that self-declaration is generally insufficient for the high- and medium-risk scenarios identified above because it lacks the accuracy and robustness required to ensure a high level of safety. For the remaining age assurance methods, the Guidelines promote tokenised, double-blind methods that minimise the processing of users’ personal data.

These methods are based on the age assurance provider issuing an anonymised age token: essentially a digital “YES” received by the platform on which an age threshold must be demonstrated. The platform obtains the guarantee it needs without accessing an identity document or learning anything else about the user, while the provider that verifies or estimates the age and generates the token knows the user’s age but not which specific platform the user is accessing.
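
The sketch below illustrates this double-blind pattern with a toy signed token in Python, using the third-party cryptography package. All names are hypothetical, and a production scheme would rely on standardised token formats and, ideally, zero-knowledge proofs rather than this simplified flow.

```python
# A toy version of the double-blind token flow described above.
# Requires: pip install cryptography
import json
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.exceptions import InvalidSignature

class AgeAssuranceIssuer:
    """Verifies or estimates the user's age, then issues an age token.

    The issuer never learns which platform will consume the token."""
    def __init__(self):
        self._key = ed25519.Ed25519PrivateKey.generate()

    def public_key(self):
        return self._key.public_key()

    def issue_token(self, user_age: int, threshold: int) -> bytes | None:
        if user_age < threshold:
            return None
        # The token carries only the claim "over threshold": no name,
        # no date of birth, nothing that identifies the user.
        claim = json.dumps({"over": threshold}).encode()
        return claim + b"." + self._key.sign(claim)

class Platform:
    """Accepts the token without ever learning who the user is."""
    def __init__(self, issuer_public_key):
        self._issuer_pk = issuer_public_key

    def grant_access(self, token: bytes, threshold: int) -> bool:
        claim, _, signature = token.partition(b".")
        try:
            self._issuer_pk.verify(signature, claim)  # raises if forged
        except InvalidSignature:
            return False
        return json.loads(claim).get("over") == threshold

issuer = AgeAssuranceIssuer()
platform = Platform(issuer.public_key())
token = issuer.issue_token(user_age=21, threshold=18)
assert token is not None and platform.grant_access(token, threshold=18)
```

The privacy guarantee comes from the separation of knowledge: the issuer signs a bare over-threshold claim without learning where it will be presented, and the platform verifies the signature without learning anything beyond that claim. A real deployment would additionally prevent token replay and linkability across presentations, which this toy omits.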

The EU Digital Identity Wallet, expected to be available to all citizens by 2026, is designed to be the “gold standard” for privacy-preserving age assurance. In the interim, the Commission is promoting a standalone EU age verification solution, providing a provisional harmonised approach across Member States that can eventually be integrated into the EU Digital Identity Wallet.

Finally, age assurance is not a “set it and forget it” task. Section 8 of the Guidelines mandates that platforms appoint a dedicated safety team with direct access to senior management and conduct regular Child Rights Impact Assessments (CRIAs) to evaluate how design changes affect younger users. The EDPB statement also establishes that age assurance should operate under a governance framework, ensuring that all processes and systems are designed, implemented, revised, documented, assessed, used, maintained, tested or audited in a way that meets data protection regulations and other legal requirements.

For platforms, the message is simple: always prioritise the best interests of the child; safety cannot be achieved at the expense of privacy, so protect both by design and by default. The goal is to design digital services in which minors can thrive, without compromising their safety, protection, or rights.

Since access restrictions and age assurance alone cannot serve as substitutes for other measures recommended in the Guidelines, we will explain the content and scope of the remaining measures in future joint posts.

 

 
