NPR recently published details from the original documents of the lawsuit filed by the Kentucky Attorney General against TikTok. According to these documents, TikTok's leaders and employees knew that the app's features were addictive and caused psychological problems for users, especially teenagers.
Specifically, according to TikTok’s internal research, excessive use of the app is associated with a number of negative mental health impacts, including reduced analytical thinking, memory, contextual thinking, conversational and empathetic abilities, and increased anxiety. Additionally, TikTok’s leaders are aware that excessive use of the app can impact sleep, learning, work, and even connecting with loved ones.
Another point revealed in the documents is that TikTok learned that its time-limiting tool is largely ineffective at helping teens curb their time on the app. Despite the tool’s default limit of 60 minutes per day, young users in the US still spend an average of 107 minutes per day on the app, down just 1.5 minutes from before the tool was introduced.
TikTok is also aware of the existence of “filter bubbles”—where the algorithm pushes users into a loop of negative content. According to the documents, when TikTok employees conducted internal research, they quickly became sucked into “painful” and “sad” content bubbles after only a short time following certain accounts.
In addition to content issues, TikTok has also struggled with moderation. An internal investigation found that underage girls on the platform were receiving “gifts” and “coins” for performing inappropriate acts during live streams.
TikTok’s senior managers have even instructed moderators not to delete accounts of users reported as being under 13 unless the account explicitly states that its owner is under that age.
Responding to the allegations, TikTok spokesperson Alex Haurek said the complaints from the Kentucky Attorney General “cherry-picked misleading quotes and took old documents out of context to misrepresent our commitment to community safety.”
He also asserted that TikTok has robust safeguards in place, including proactively removing accounts suspected of belonging to underage users, and that the platform has voluntarily implemented safety features such as default screen time limits, Family Pairing, and privacy settings enabled by default for users under 16.