More than a dozen US states recently filed lawsuits against TikTok, accusing the short-video app of getting children and young people addicted. TikTok rejects the allegations, but internal documents never intended for the public, which have now been leaked, show that the company has probably been aware of the danger posed by its algorithm for years.
A few days ago, the attorneys general of 13 US states and the capital Washington, D.C., filed lawsuits against the subsidiary of the Chinese ByteDance group, as krone.at reported. They accuse TikTok of deliberately designing its platform so that children and young people spend more and more time on it, pointing to features such as endless scrolling and videos that start playing automatically. These features, they argue, make children in particular susceptible to addiction.
TikTok, which is currently also fighting a US law that would force a change of ownership, rejected the allegations, responding that it has robust safety measures and limits on the time young users spend on the platform. However, the app, which according to the Youth Internet Monitor is used by almost two-thirds of children and young people in Austria, may have misled the public about the dangers and risks it poses.
Hooked within 35 minutes
This is evident from internal documents that are part of the more than two-year investigation into TikTok by the 14 attorneys general’s offices. The broadcaster NPR (National Public Radio) gained access to them through faulty redactions in the lawsuit filed by the US state of Kentucky and published them despite a confidentiality agreement between the US states and TikTok.
The approximately 30 pages of now not-so-secret documents – mainly summaries of internal studies and communications – show that TikTok was aware of the highly addictive potential of its platform and of the dangers and risks that potentially harmful content poses to its underage users, but did little about them.
According to the documents, TikTok determined the exact number of video views needed to form a habit and keep users glued to the smartphone screen: 260. After that, according to the state investigators, it is “likely that a user will become addicted to the platform.”
In the previously redacted portion of the lawsuit, the Kentucky authorities write: “While this may seem substantial, TikTok videos can be as short as eight seconds and automatically play for viewers in rapid succession. Thus, in under 35 minutes, an average user is likely to become addicted to the platform,” NPR quotes. The two figures are consistent: 260 videos of eight seconds each add up to just under 35 minutes of viewing.
Compulsive use and its consequences
Another internal document shows that the company was aware that the many features designed to keep young people on the app resulted in a constant and irresistible urge to open the app again and again. TikTok’s own research shows that “compulsive use correlates with a range of negative mental health consequences, such as loss of analytical skills, memory formation, contextual thinking, depth of conversation, empathy, and increased anxiety,” according to the lawsuit.
Additionally, the documents show that TikTok was aware that “compulsive use also interferes with essential personal responsibilities, such as getting enough sleep, work/school, and connection with loved ones.”
Countermeasures “not that useful”
“Our goal is not to reduce the time spent,” a TikTok project manager says in the documents. In a chat message echoing this sentiment, another employee said the goal was to “help retain users.”
Videos that TikTok itself produced to encourage users to interrupt their endless scrolling and take a break were accordingly not widely promoted. According to the report, one senior manager admitted that while these videos were “a good talking point” with policymakers, they were otherwise “not that useful.”
The same applies to a tool TikTok developed that limits the app’s usage time to 60 minutes per day by default. The leaked documents reveal, however, that TikTok measured the success of this tool by how it “increases public trust in the TikTok platform through media reporting” – rather than by whether it actually reduced the time teens spend on the app.
Internal tests showed that the tool had little impact on actual usage time: it fell by only about 1.5 minutes, from roughly 108.5 minutes per day on TikTok before the tool to about 107 minutes with it. According to NPR, the attorneys general’s complaint states that TikTok subsequently “did not revisit the matter.”
Beautiful people are favored
The multi-state lawsuit against TikTok also takes aim at the company’s beauty filters, which users can apply to videos to look slimmer and younger or to get fuller lips and bigger eyes. One of them, the so-called “Bold Glamour” filter, uses artificial intelligence to alter people’s faces so that they resemble models with high cheekbones and a pronounced jawline.
As the documents show, TikTok is aware of the harm these beauty filters can cause to young users. Internally, employees advocated “providing users with educational materials on self-esteem disorders” and launching a campaign “to raise awareness of issues related to low self-esteem (caused by filter overuse and other issues).”
They also suggested that the filters be preceded by a banner or video containing “an educational statement about filters and the importance of positive body image/mental health.” However, this did not happen – on the contrary: the documents show that the app’s algorithm specifically favors beautiful people.
An internal report analyzing TikTok’s main video feed found that “a large number of (…) unattractive subjects” were filling all users’ feeds. In response, Kentucky investigators found, TikTok revised its algorithm to give priority to users the company considered beautiful.
“By changing the TikTok algorithm to show fewer ‘unattractive topics’ in the ‘For You’ feed, [TikTok] took active steps to promote a narrow beauty standard, even though it could negatively impact its young users,” the Kentucky authorities write. Meanwhile, TikTok publicly stated that one of its “key commitments is to support the safety and well-being of teens.”
Deprived of sleep, food and eye contact
According to the documents, an unnamed company executive described in drastic terms the impact the TikTok algorithm can have on young users: the reason kids watch TikTok is the power of the app’s algorithm, “but I think that we need to be aware of what this could mean for other opportunities. And when I talk about other opportunities, I literally mean sleeping, eating, moving around the room, and looking someone in the eye.”
“Arms race for attention”
TikTok’s own research found that children are the most vulnerable to being sucked into the app’s endless video feed. “As expected, the younger the user, the better the performance on most engagement metrics,” says a document from the short-video app, which, according to an internal presentation from 2019, is in an “arms race for attention.”
Despite this, the company has been hesitant to remove accounts of users suspected of being under 13. An internal document on “younger users/U13” states that TikTok instructed its moderators not to take action on reports of underage users unless their account explicitly identifies them as under 13.
Harmful content
This is especially problematic because, according to the documents, TikTok is also slow to remove potentially harmful content. A separate study points to self-harm videos that were viewed more than 75,000 times before TikTok identified and removed them. At the same time, the documents show that a lot of harmful content – relating to eating disorders, drug use, dangerous driving, or gore and violence – is effectively “permitted,” contrary to the platform’s official community guidelines.
Often the content remains findable on TikTok and is simply not “recommended,” meaning it doesn’t show up in users’ “For You” feeds or is given lower priority by the algorithm, NPR writes, citing the improperly redacted documents. Accordingly, TikTok internally admits that there are significant “leak rates” of rule-violating content that is not removed.
According to the report, these “leak rates” include the “normalization of pedophilia” (35.71 percent), “sexual harassment of minors” (33.33 percent), “physical abuse of minors” (39.13 percent), the “glorification of sexual abuse of minors” (50 percent) and the “fetishization of minors” (100 percent).
TikTok: publication “highly irresponsible”
TikTok itself again rejected the accusations, citing “robust safety measures,” including the proactive removal of suspected underage users, as well as “voluntary safety features” such as the ability to limit screen time. The short-video app also denounced NPR’s publication of the court-sealed documents as “highly irresponsible.”
“Unfortunately,” the broadcaster quoted a company spokesperson as saying, “this complaint uses misleading quotes and takes outdated documents out of context to misrepresent our commitment to public safety.” Now the courts will have to provide clarity.
Source: Krone
