TikTok was aware that its design features were harmful to young users and that publicly touted tools intended to limit children’s time on the site were largely ineffective, according to internal documents and communications revealed in a lawsuit filed by the state of Kentucky.
The details are among redacted portions of Kentucky’s lawsuit, which includes internal communications and documents unearthed during a more than two-year investigation into the company by several states across the country.
Kentucky’s lawsuit was filed this week alongside separate complaints brought by attorneys general in a dozen states and the District of Columbia. TikTok is also facing another lawsuit from the Department of Justice and is itself suing the Department of Justice under a federal law that could ban it in the US by mid-January.
The redacted information — which was inadvertently revealed by the Kentucky attorney general’s office and first reported by Kentucky Public Radio — touches on a range of topics, especially the extent to which TikTok knew how much time young users spent on the platform and how sincere it was in rolling out tools to combat excessive use.
Beyond TikTok’s use among minors, the complaint alleges that the short-video sharing app has prioritized “beautiful people” on its platform and internally noted that some of the company’s published content moderation metrics are “largely misleading.”
The unredacted complaint, which was seen by The Associated Press, was sealed by a state judge in Kentucky on Wednesday after state officials filed an emergency motion to seal the complaint.
When reached for comment, TikTok spokesperson Alex Haurek said: “It is highly irresponsible of the Associated Press to publish information that is under a judicial seal. Unfortunately, this complaint uses misleading quotes and takes outdated documents out of context, misrepresenting our commitment to community safety.”
“We have robust safety measures in place, including proactively removing suspected underage users, and we have voluntarily launched safety features such as default screen time limits, family pairing, and default privacy for minors under 16,” Haurek said in a prepared statement. “We support these efforts.”
The complaint alleges that TikTok has quantified how long it takes for young users to become addicted to the platform, and shared the findings internally in presentations aimed at increasing user retention. The “habit moment,” as TikTok calls it, occurs when users watch 260 videos or more during the first week of having a TikTok account. This can happen in as little as 35 minutes, as some TikTok videos are as short as 8 seconds, the complaint said.
The Kentucky lawsuit also cites a spring 2020 presentation from TikTok that concluded the platform had already “hit a ceiling” among young users. At the time, the company’s estimates showed that at least 95% of smartphone users under the age of 17 used TikTok at least monthly, the complaint said.
TikTok tracks statistics for young users, including how long young users watch videos and how many of them use the platform every day. The company uses the information it gets from these reviews to feed its algorithm, which tailors content to people’s interests and drives user engagement, the complaint said.
TikTok conducts its own internal studies to find out what impact the platform has on users. The lawsuit cites a group within the company called “TikTank,” which noted in an internal report that compulsive use was “rampant” on the platform. It also quotes an unnamed executive who said kids watch TikTok because the algorithm is “really good.”
“But I think we need to be aware of what it could mean for other opportunities. And when I say other opportunities, I literally mean sleeping, eating, moving around the room and looking someone in the eye,” the unnamed executive said, according to the complaint.
TikTok has a daily screen time limit of 60 minutes for minors, a feature that rolled out in March 2023 with the aim of helping teens manage their time on the platform. But Kentucky’s complaint states that the time limit — which users can easily bypass or disable — was intended more as a public relations tool than anything else.
The lawsuit says TikTok measured the success of the time limit feature not by whether it reduced the time teens spent on the platform, but by three other metrics — the first of which was “increasing audience trust in the TikTok platform through media coverage.”
Reducing screen time among teens was not included as a success measure, the lawsuit said. The complaint also alleges that the company planned to “rethink the design” of the feature if the time limit caused teens to cut their TikTok use by more than 10%.
TikTok conducted an experiment and found that the time limit notifications shaved only about a minute and a half off the average time teens spent on the app — from 108.5 to 107 minutes per day, according to the complaint. Despite the lack of movement, TikTok did not try to make the feature more effective, Kentucky officials say. They argue that the feature’s ineffectiveness was in many ways by design.
The complaint says a TikTok executive named Zhu Wenjia approved the feature only on the condition that its impact on TikTok’s “core metrics” would be minimal.
TikTok – including CEO Shou Chew – has publicly touted the app’s various time-management tools, including videos that TikTok sends users to encourage them to get off the platform. But a TikTok executive said during an internal meeting that these videos are “useful” talking points but “not entirely effective.”
In a section detailing the negative impact of TikTok’s facial filters on users, Kentucky claims that TikTok’s algorithm “has prioritized beautiful people” despite knowing internally that content on the platform “could perpetuate a narrow beauty standard.”
The complaint alleges that TikTok changed its algorithm after an internal report found that the app was showing a high “volume of… unappealing topics” in the app’s main “For You” feed.
“By changing TikTok’s algorithm to show fewer ‘unappealing topics’ in the For You feed, Defendants have taken active steps to promote a narrow beauty standard, even though it could negatively impact their young users,” the complaint said.
The lawsuit also takes aim at TikTok’s content moderation practices.
It quotes internal communications in which the company notes that the moderation metrics are “largely misleading” because “we are good at moderating the content we capture, but these metrics do not account for the content we miss.”
The complaint notes that TikTok knows it has significant “leakage” rates — content that violates the site’s community guidelines but is not removed or moderated — yet does not disclose them publicly. Other social media companies face similar issues on their platforms.
For TikTok, the complaint notes that the “leakage” rates include approximately 36% of content that normalizes pedophilia and 50% of content that glorifies minor sexual assaults.
The lawsuit also accuses the company of misleading the public about its moderation and allowing a number of popular creators, who were considered “high-quality,” to post content that violates the site’s guidelines.