Nav Toor @heynavtoor
If you use TikTok, you should read this once.
In October 2024, a court clerk in Kentucky uploaded the lawsuit against TikTok with the confidential sections still visible. NPR downloaded it before anyone caught the mistake. By the time the court resealed it, the internet had a copy.
What was inside: TikTok's own engineers, in their own words, describing what their app does to a human brain.
Not a critic's brain. Yours.
Here is what they wrote down.
— TikTok ran the math on how long it takes to develop "compulsive use" of the app. The number is 260 videos. With videos as short as 8 seconds played in rapid-fire succession, that works out to just under 35 minutes. The company's internal documents treat this as the compulsive-use threshold.
— TikTok's own research describes what compulsive use causes: "diminished analytical ability, impaired memory, contextual reasoning, conversational depth, empathy, and heightened anxiety." That is not a quote from a critic. That is TikTok's own language, in its own internal documents.
— A team inside the company called "TikTank" wrote in an internal report that compulsive use on the platform was "rampant."
— After 30 minutes of continuous use in a single session, the company's own documents state that users are placed into "filter bubbles" — algorithmic loops the user did not choose and cannot easily escape.
Then there is the screen-time tool — the one TikTok publicly markets as proof it cares.
— TikTok ran an experiment on the 60-minute screen-time prompt. Daily teen usage dropped from 108.5 minutes to 107 minutes: a reduction of about 1.5 minutes.
— Internally, the screen-time tool was not measured by whether it reduced screen time. Its top success metric, in writing, was "improving public trust in the TikTok platform via media coverage."
— A project manager wrote in internal chat: "Our goal is not to reduce the time spent." Another employee added that the goal was "to contribute to daily active users and retention."
— A TikTok executive approved the screen-time feature only on the condition that its impact on the company's "core metrics" was minimal. The lawsuit alleges the company planned to "revisit the design" if the tool ever reduced usage by more than 10%.
The "Are you still scrolling?" break videos? An executive admitted in an internal meeting they were "useful talking points" for lawmakers, but "not altogether effective."
Then there is the algorithm itself.
— An internal report flagged that the For You feed was showing what the company called "a high volume of not attractive subjects." TikTok then retooled the algorithm to suppress those users. Kentucky authorities wrote: "By changing the TikTok algorithm to show fewer 'not attractive subjects' in the For You feed, [TikTok] took active steps to promote a narrow beauty norm even though it could negatively impact their Young Users."
That sentence is the entire pitch of the platform, said out loud.
— Internally, TikTok also acknowledged that its publicly reported content moderation metrics were "mostly misleading," because they only measured the content the company successfully moderated — never the content it missed.
Now read those bullet points again as one continuous case.
The company knows the addiction threshold. The company measured it. The company ranked engagement over mental health in writing. The company built a screen-time tool whose internal success metric was PR. The company suppressed people it deemed unattractive to keep you scrolling. The company called its own moderation numbers misleading.
None of this is a leaked rumor. None of this is a journalist's interpretation. This is a court filing. The documents are TikTok's. The words are TikTok's. The math is TikTok's.
The 14 attorneys general who brought these coordinated lawsuits aren't fringe activists. They're a bipartisan coalition.
Sources: NPR, CNN, AP, Mashable, OPB, The Independent. All citing the same accidentally unsealed Kentucky filing from October 11, 2024.
The next time the company tells you it cares about your wellbeing — the screen-time prompts, the break videos, the safety features, the careful PR statements — remember that its own employees wrote down, in documents now quoted in a court filing, that the safeguards were never meant to work.
The app is not broken.
It is performing exactly as designed.
You were the spec.