In preliminary findings from an investigation into TikTok's compliance with the European Union (EU) Digital Services Act (DSA), the European Commission said the short-video platform did not "fully assess" whether its design decisions could harm the health of users, especially adolescents and vulnerable adults.
The EU regulator said TikTok ignored "important indicators of compulsive use of the application", such as the time users spend on the app at night and how frequently they open it.
"By continuously 'rewarding' users with new content, some of TikTok's design features encourage compulsive scrolling and put the user's brain into 'automatic mode'. Scientific research shows that this can lead to compulsive behavior and reduce users' self-control," the European Commission said in a statement.
The Commission said TikTok must change the "basic design" of its user interface by disabling features such as infinite scrolling, imposing screen-time limits, and changing its recommendation system.
Meanwhile, TikTok has denied the accusations.
"The Commission's preliminary findings give a completely false and unfounded description of our platform, and we will take all necessary measures to refute these findings by all means available," a TikTok spokesperson said in a statement emailed to TechCrunch.
TikTok provides tools for managing screen time and parental controls, but the European Commission said those tools are not enough to mitigate the risks posed by the app's addictive design.
"Time-management tools appear ineffective in helping users limit and control their TikTok usage, because they are easily turned off and present little friction. Similarly, parental controls may also be ineffective, because they require time and skill from parents to set up," the European Commission said.
The allegations against TikTok come as social media platforms face closer scrutiny worldwide, with some governments pushing to bar young users from social networks entirely.
In November 2025, Australia required social media sites to disable the accounts of users under 16. The UK and Spain are said to be considering similar measures.
Meanwhile, France, Denmark, Italy and Norway are studying similar age-restriction measures for social media platforms. In the US, 24 states have so far enacted age-verification laws.
Confirmed violations of the Digital Services Act carry a range of severe sanctions, including fines of up to 6% of a company's global annual revenue.