Analysis

How Have Platforms Addressed Addictive Design Under the DSA?

Cecilia Isola / Feb 23, 2026

This piece is part of a series with the DSA Observatory, featuring articles adapted from selected papers presented at the second DSA and Platform Regulations Conference, marking two years since the Digital Services Act came into full effect.

Leo Lau, Wheel of Progress, CC-BY 4.0.

This year, the European Commission has intensified its scrutiny of addictive design under the Digital Services Act (DSA). On Tuesday, the European Commission opened formal proceedings against Shein, focusing on risks linked to engagement-based design mechanisms, including systems that reward user activity through points or similar incentives, and on whether the company’s mitigation measures were adequate.

Earlier this month, in its preliminary findings regarding TikTok’s alleged addictive design practices, the Commission identified design features such as “infinite scroll,” “autoplay,” persistent push notifications, and “highly personalized recommender systems” as elements capable of harming the physical and mental well-being of its users, including minors and vulnerable adults.

The Commission’s recent proceedings signal that addictive design has become a central focus of regulatory scrutiny in the EU.

What is addictive design?

What, then, is addictive design?

Although the term is increasingly used in policy and regulatory discourse, it still lacks a precise definition. It is often referred to as a specific type of “dark pattern,” namely an interface design technique aimed at influencing user behavior in the provider’s interest. Yet, this description captures only part of the phenomenon.

Addictive design is better understood as a configuration of the service as a whole. In this sense, addictive design emerges where design features, user data, and algorithmic systems are combined within a broader logic of engagement maximization, designed to keep users interacting with the service for as long as possible. This is particularly evident in some popular online platforms, especially social networks and, more recently, online marketplaces. Within this configuration, specific design features ensure that users’ interaction with the service remains continuous and frictionless (e.g., infinite scroll, autoplay), while also stimulating re-engagement (e.g., push notifications).

At the same time, user data is collected and processed by algorithmic systems to identify which types of content are more likely to attract and retain a particular user’s attention and to organize content exposure accordingly, selecting and prioritizing material with a higher probability of being perceived as rewarding.

Here, “reward” refers to the positive outcome that may follow a user’s action within the platform, for example, social recognition resulting from a like, or interesting content following continued scrolling. Crucially, users do not know when, or even whether, such a reward will occur. It is precisely this uncertainty, comparable to the variable reward mechanisms of slot machines, that increases the likelihood that interaction will continue.

A growing body of empirical research indicates that such dynamics may significantly affect mental well-being. From a psychological perspective, intermittent and unpredictable rewarding stimuli activate dopaminergic processes involved in anticipation and behavioral reinforcement. Over time, this exposure increases the likelihood that users will repeatedly perform the same action in anticipation of a positive outcome. Such repeated engagement, reinforced under conditions of uncertainty, may gradually reduce users’ capacity to disengage, contributing to excessive or compulsive use and, in some cases, to addiction.

Addictive design as a “systemic risk” under the DSA

The Digital Services Act constitutes the central pillar of the European Union’s new regulatory framework for digital services. Adopted in 2022 and fully applicable since February 2024, it lays down harmonized rules on liability, due diligence obligations and enforcement for online intermediary services provided in the EU.

The DSA adopts a layered approach under which obligations become more severe according to the role, size and societal impact of the service. The most demanding obligations (Articles 33-43 DSA) apply to Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), defined as intermediary services that reach at least 45 million monthly active recipients within the Union.

Particularly relevant in the context of addictive design are Articles 25 and 34–35 DSA. Article 25 introduces a general prohibition applicable to providers of online platforms, preventing them from designing or organizing their online user interfaces in a manner that deceives or manipulates users or otherwise materially distorts or impairs their ability to make free and informed decisions. By contrast, Articles 34 and 35 apply exclusively to VLOPs and VLOSEs and concern the assessment and mitigation of systemic risks that may arise from the design, functioning and use of their services. In particular, they require these providers to identify such risks and to adopt appropriate, effective and proportionate mitigation measures.

Although “systemic risk” is not defined, Article 34 DSA highlights four categories: dissemination of illegal content or illegal activities; impacts on fundamental rights; risks to democratic processes and public security; and risks to public health, minors, and users’ physical and mental well-being. While addictive design is not explicitly mentioned, the latter category strongly indicates that engagement-driven architectures capable of producing addictive effects fall within its scope.

Moreover, concerns regarding addictive design have been explicitly articulated at the political level. In its Resolution of December 12, 2023, the European Parliament raised specific concerns about the growing prevalence of online platforms’ digital architectures with addictive potential. The Resolution identifies design features such as “endless scrolling,” “pull-to-refresh” page reload, “never-ending auto-play” video features, personalized recommendations, and “recapture notifications” as capable of fostering excessive or compulsive use, thereby generating risks for mental health and well-being.

Significantly, it observed that, despite the entry into force of the DSA and other regulatory instruments, addictive design remains insufficiently addressed in EU law and, among its recommendations, called on the European Commission to address addictive design within the framework of Articles 34-35 DSA.

Is addictive design sufficiently addressed in VLOPs’ risk assessment reports?

The risk assessment reports of Instagram and Facebook (operated by Meta) and TikTok (operated by ByteDance) offer a concrete lens through which to observe how VLOPs interpret and operationalize addictive design as a systemic risk in practice, and how corresponding mitigation measures are articulated pursuant to Article 35 DSA.

A comparative reading of the 2023–2025 reports reveals both convergence and divergence. All three platforms adopt broadly similar procedural frameworks for risk assessment, structured around risk identification, evaluation of severity and likelihood, tiering, mitigation, and engagement with experts, drawing on normative sources (e.g., the DSA, the GDPR) as well as other reference frameworks, such as international standards (e.g., ISO/IEC). However, significant differences emerge in the substantive categorization of risks, particularly with regard to potential harms to mental well-being and addiction.

Instagram and Facebook do not address addictive design as a distinct systemic risk in their reports. The two reports, which appear substantially identical on this point, refer to mental well-being only in the introductory section entitled “Systemic Risk Landscape” (Instagram & Facebook Risk Assessment Reports, 2025).

There, it is stated that: “Physical and Mental Well-Being is not listed as its own Systemic Risk Area in our Systemic Risk Landscape, as potential impacts to physical and mental health are applicable to all Problem Areas and risks. Our Severity Principles consider Harm Type, which encompasses impacts on users' physical and psychological health, and we evaluate these risks as part of our Inherent Risk calculation within our Severity Rubric. Higher severity scores are assigned to risks deemed to have a potentially elevated impact on an individual’s physical and psychological well-being. Meta also continuously develops resources to help support our users' well-being (...).”

However, the reports do not subsequently develop a dedicated analysis of mental well-being risks specifically linked to engagement-driven design features, nor do they outline concrete mitigation measures addressing harms potentially arising from excessive or compulsive use (e.g., screen-time management mechanisms).

On the other hand, even without explicitly referring to addictive design, TikTok has progressively integrated risks to mental well-being into its systemic risk taxonomy. In its 2023 report, TikTok identified only 10 systemic risks and made no explicit reference to risks related to excessive use or addictive patterns of engagement.

In 2024, following the opening of the European Commission’s investigation into its alleged addictive design practices (the Commission initiated parallel proceedings against Meta’s platforms that same year), TikTok introduced a new risk category entitled “Risk related to age-appropriate content and online well-being.” To mitigate this risk, TikTok proposed several measures, including default screen-time management tools set at 60 minutes per day for minors (and optional for adults), time-delay interventions, updates to TikTok’s Community Guidelines, the introduction of informational feeds, and the establishment of youth-oriented initiatives such as a Youth Council.

In the 2025 report, TikTok expanded its taxonomy to 12 risks, adding a new category relating to illegal and unsafe products, and reformulated the mental well-being risk under the heading “Youth Safety and Online Engagement: Risk related to online engagement.” However, apart from the introduction of additional specialized teams tasked with monitoring safety and well-being and contributing to policy development and risk prevention, the mitigation measures related to engagement-driven harms remained substantially unchanged.

Formal proceedings initiated by the European Commission are currently ongoing regarding the assessment and mitigation of risks associated with addictive design by TikTok, Facebook, Instagram and Shein. To date, preliminary findings have been published only in relation to TikTok. The Commission found that TikTok did not adequately assess how its addictive features could harm the physical and mental well-being of its users, including minors and vulnerable adults.

According to the Commission, certain design features of TikTok fuel the urge to keep scrolling by constantly “rewarding” users with new content and may place users in an “autopilot” mode. Additionally, in its assessment, TikTok allegedly disregarded important indicators of compulsive use of the app, such as the amount of time minors spend on TikTok at night, the frequency with which users open the app, and other relevant behavioral signals.

On that basis, the Commission preliminarily concluded that TikTok failed to comply with Articles 34–35 DSA, as the company “seems to fail to implement reasonable, proportionate and effective measures to mitigate risks stemming from its addictive design.”

The tension between the DSA and VLOPs’ business models

At its core, this analysis reveals a significant tension between what the DSA requires and how VLOPs approach addictive design. That tension can only be understood by looking at the business model of online platforms.

Most online platforms, and social media in particular, operate under advertising-based business models. Users do not pay directly; revenue derives from the sale of advertising space. In this model, attention is the key resource. The longer users remain engaged, the more advertising can be delivered and the more behavioral data can be collected, increasing the value of targeted advertising. The very features identified as generating systemic risks are, at the same time, the features that sustain the platform’s business model.

This tension must be assessed in light of Article 16 of the EU Charter of Fundamental Rights, which protects the freedom to conduct a business. Economic freedom, however, is not absolute. It must be balanced against other protected interests, including public health, consumer protection, data protection, and fundamental rights more broadly. That said, any restriction must satisfy Article 52 of the Charter: it must be provided for by law, respect the essence of the right, and comply with the principle of proportionality. Accordingly, no right can automatically be presumed to outrank another; conflicts must be resolved through proportionality balancing.

The Commission’s 2025 decision against X illustrates this balancing exercise. X argued that the Commission’s finding of a violation of Article 25 DSA, in relation to its verification system (“blue checks”), constituted an unjustified interference with its freedom to conduct a business. Relying on CJEU case law, the Commission rejected this claim, emphasizing that entrepreneurial freedom must be understood in light of its social function and may be limited where the restriction is lawful, necessary, and proportionate. Article 25 DSA was thus framed as a proportionate constraint aimed at safeguarding users’ rights without unduly interfering with economic freedom.

Key takeaways

Overall, the analysis suggests that addictive design, although not explicitly defined in the DSA, qualifies as a systemic risk within its regulatory framework and must therefore be addressed under Articles 34–35. Yet, it remains substantively under-addressed in VLOPs’ systemic risk assessments.

It must be acknowledged that platforms increasingly refer to mental well-being and prolonged use concerns; however, their compliance often appears largely symbolic rather than genuinely responsive to the risks generated by potentially addictive design features, as reflected in the Commission’s preliminary findings in the TikTok case.

The Commission’s findings are significant in this regard for two reasons. First, they explicitly confirm that addictive design should be considered a systemic risk under Article 34 DSA and therefore falls within the scope of the risk assessment obligation imposed on VLOPs and VLOSEs. Second, they clarify the requirements of Article 35 by giving concrete substance to the standard of mitigation measures applicable to addictive design.

In this respect, TikTok’s methodology was found to be inadequate by the Commission, despite its efforts over the years to refine its assessment framework, including by following risk assessment methodologies established by international standards such as ISO/IEC. The Commission’s preliminary findings indicate that generic or user-side tools, such as default screen-time limits for minors, time-management prompts, informational resources, and the establishment of specialized internal teams tasked with safety and mental well-being oversight, are insufficient where the risk is embedded in the very configuration of the service.

The Commission’s enforcement action is therefore beginning to narrow the gap between formal compliance and substantive mitigation of engagement-driven harm. However, this enforcement trajectory remains incomplete, and it is still uncertain whether it will ultimately lead to concrete changes in platform design.

Authors

Cecilia Isola
Cecilia Isola is a legal scholar, policy expert, and computer programmer specializing in EU law, consumer protection and digital regulation. She is a Research Fellow at SERICS (SEcurity and RIghts in the CyberSpace) and at the University of Genoa, Italy. She has published on dark patterns, unfair co...
