The European Commission (EC) has preliminarily found that TikTok’s addictive design features violate the bloc’s Digital Services Act (DSA).
The preliminary decision outlines how addictive design features on the platform, such as infinite scroll and autoplay, send users into “autopilot mode”, which the EC said may lead to “compulsive use”.
The DSA, which sets out rules for online services used by European citizens, is designed to strengthen consumer rights and consumer choice, while also minimising the risk of harm. The act also requires platforms to carry out a risk assessment of negative effects on children’s mental health and present it to the EC.
TikTok is one of the 17 companies defined as Very Large Online Platforms under the act, which means it has to comply with the most stringent rules of the DSA because the size of its user base means there is greater potential for systemic harms to occur.
In its preliminary findings, the EC stated that TikTok had failed to implement reasonable and effective measures to mitigate the risks from its addictive design features, arguing that minors and vulnerable adults are at particular risk of harm.
The EC’s investigation also revealed that TikTok’s risk assessment had not adequately addressed how its design features and dark patterns could cause harm to the physical and mental health of its users.
On the protective measures that are in place, including screen time management and parental control tools, the EC noted they “do not seem to effectively reduce the risks stemming from TikTok’s addictive design” due to being easy to dismiss or overlook.
“At this stage, the Commission considers that TikTok needs to change the basic design of its service. For instance, by disabling key addictive features such as ‘infinite scroll’ over time, implementing effective ‘screen time breaks’, including during the night, and adapting its recommender system,” it said.
European Union (EU) tech chief Henna Virkkunen told reporters that minors are more at risk because “they don’t have the same tools” to avoid compulsive behaviour.
The decision could force the app, which has more than one billion users globally, to make design changes to avoid penalties, the European Commission said.
If it fails to make the necessary changes, TikTok could face fines of up to 6% of its annual global revenue, which the company reportedly hoped would reach $186bn last year. TikTok was also accused of breaching digital advertising transparency rules in May 2025.
The preliminary decision marks the first time the EC has taken a legal stance on the design features of a social media company, tackling what many online safety advocates and campaigners describe as addictive design.
Responding to the EC’s preliminary findings, a TikTok spokesperson said they “present a categorically false and entirely meritless depiction of our platform, and we will take whatever steps are necessary to challenge these findings through every means available to us”.
However, many hope the decision could set a precedent for future action against recommender systems that amplify allegedly harmful and illegal content. In December, X received the first-ever fine for breaching the DSA, totalling £104m.
“These findings confirm what people have been saying for years: TikTok’s addictive design is not an accident, it’s a business model. We need [EC president] Ursula von der Leyen to stand up for European citizens and show the political leadership this moment calls for,” said Ava Lee, executive director of People vs Big Tech.
US states have already pursued a case against TikTok for its addictive design features, with the lawsuit alleging the product is damaging children’s mental health.
The move by the EC comes at a time when concerns around the safety of social media platforms are growing across the continent.
France, for example, has voted on a social media ban for children under the age of 15, while Spain has proposed to criminalise algorithms that amplify illegal content. In his speech to the World Governments Summit in Dubai, Spanish prime minister Pedro Sánchez vowed to protect children “from the Digital Wild West”.
In January 2026, the UK’s House of Lords voted to back a social media ban for under-16s by 261 votes to 150, with the government launching a national consultation to discuss next steps for online safety and digital well-being.
Von der Leyen has expressed support for an EU-wide age limit, after Australia became the first country in the world to ban under-16s from accessing social media in December last year.
“Amidst current discussions of restrictions on children’s access to social media platforms, governments must remember they also have a duty to protect children’s right to participate in the digital world,” said Lisa Dittmer, Amnesty International researcher on children and young people’s digital rights.
“To do so, their focus must be on tackling the toxic design of leading social media platforms, including through effectively enforcing laws like the Digital Services Act,” she added.
Research by Amnesty International has previously found that despite risk mitigation measures announced by TikTok since 2024, the platform continues to expose vulnerable users to content that normalises self-harm, despair and suicidal ideation.
The European Commission’s announcement of its preliminary findings comes just days after the US House Judiciary Committee published a report titled The foreign censorship threat, Part II: Europe’s decade-long campaign to censor the global internet and how it harms American speech in the United States.
The publication states that the European Commission has “pressured major social media platforms to change their global content moderation rules”, revealing growing discontent from the Trump administration towards EU tech regulation.
“For the first time, the European Commission is critically examining the recommender mechanisms through which platform operators manipulate the free choice of users,” said German MEP Alexandra Geese.
“Coupled with high personalisation, this system distorts the idea of freedom online. I expect these recommender mechanisms to be scrutinised on other platforms, too. There are better algorithms for ensuring choice online. It’s not the users who want disinformation, hate and violence, but the platforms.”