Internal TikTok Documents Reveal Moderation Failures and Concerns

Confidential internal TikTok documentation disclosed in a US court filing, which was obtained by RTÉ, indicates that certain senior employees were aware of potential risks associated with compulsive app usage by young users.

The documents also specify the percentage of content that breaches TikTok’s own guidelines but goes unmoderated or is not removed.

According to the documents cited in the court filings, these include 35.71% of content classified as ‘Normalization of Pedophilia’; 33.33% of ‘Minor Sexual Solicitation’ content; 39.13% of ‘Minor Physical Abuse’ content; 50% of ‘Glorification of Minor Sexual Assault’; and 100% of content categorized as ‘Fetishizing Minors.’

The details were inadvertently disclosed due to an error in digital redaction procedures during the publication of court documents related to a case filed by the Kentucky Attorney General’s Office.

The filings have since been resealed.

Kentucky is among 14 US states that have independently sued TikTok, alleging that the app is “designed to addict and otherwise harm minors.”

Not all of the internal documents referenced in the filings are dated, but some are as recent as May 2022.

Within the various lawsuits, information from numerous internal TikTok communications, documents, and research data has been disclosed, with identifiers redacted under confidentiality agreements.

TikTok communicated to Prime Time that the “complaint cherry-picks misleading quotes and takes outdated documents out of context to misrepresent our commitment to community safety.”

WATCH: Internal TikTok documents reveal moderation failures and concerns


The social media giant, with its EU headquarters located in Dublin, is also facing an inquiry from the European Commission regarding the “protection of minors” and “risk management of addictive designs and harmful content.”

Among the information disclosed in the US case filings is data from an internal TikTok presentation about a study examining moderation of suicide and self-harm content.

The study revealed that videos related to suicide and self-harm, referred to as ‘SSH,’ often pass through TikTok’s initial moderation stages, but “unmoderated or incorrectly moderated videos can spread widely before being identified.”

The moderation phases are denoted as ‘R1’ and ‘R2.’

“The SSH videos that passed R1 and R2 gathered an average of 75,370 views on TikTok before being detected and removed,” according to the study referenced in the filings.

Responding to inquiries regarding the case, TikTok informed Prime Time that “of the content we remove for violating our policies, 99% has fewer than 10,000 views when it is deleted.”

The court filings also reference a presentation by TikTok’s Trust and Safety group, which indicated that approximately “42% [of users] are ‘comment only’ users,” yet human review of comments is “disproportionately low.”

“Human moderation for comment review is at 0.25%,” as stated in the presentation, indicating that the vast majority of concerning comments do not undergo human review.


Furthermore, the filings quote an internal document highlighting concerns about how content related to unhealthy eating and weight loss is moderated.

One document indicates that certain content is labeled as ‘not recommended’ within the app instead of being removed. Consequently, it doesn’t appear in users’ feeds, yet remains accessible via the search function.

A TikTok user’s feed is an algorithmically driven stream of content, presented by TikTok based on what users have previously engaged with or viewed, rather than being selected by the user.

Regulators and policymakers in both the US and Europe are alarmed that TikTok’s potent algorithm may lead young users toward increasingly radical and extreme content via their feeds.

There are also concerns regarding the app’s addictive nature. The combination of overuse and exposure to progressively extreme content is sometimes referred to as “the rabbit hole effect.”

Such processes were acknowledged by TikTok executives, as indicated in the communications cited in the court documents.

“The reason kids watch TikTok is because the algo[rithm] is really good,” one executive is quoted as stating in internal messages, adding: “But I think we need to be aware of what it might mean for other opportunities. And by other opportunities, I literally mean sleep, eating, moving around the room, and making eye contact with someone.”

Due to concerns surrounding the rabbit hole effect, the potential impact on users’ mental health, and to assess how younger users experience the app, TikTok conducted internal experiments where employees created new accounts.

One employee noted, “after following several ‘painhub’ and ‘sadnotes’ accounts, it took me 20 mins to drop into a ‘negative’ filter bubble. The dense concentration of negative content lowers my mood and increases my feelings of sadness, despite my generally positive outlook on life.”

Similarly, the filings reference an internal TikTok report known as the TikTank Report, which found that “compulsive usage correlates with a variety of negative mental health effects such as diminished analytical skills, difficulties in memory formation, loss of contextual thinking, decreased conversational depth, diminished empathy, and heightened anxiety.”

TikTok recently told Prime Time that “safety remains one of our core priorities.”

Publicly, TikTok’s response to criticisms regarding the app’s addictive characteristics and the rabbit hole effect has been to assert that it implements ‘well-being measures’ and is increasingly ‘dispersing’ content for users.

‘Dispersion of content’ refers to presenting users with a broader range of content topics within their feeds.

For example, the company indicated that it has expanded these measures following concerns raised in a Prime Time report in June.

READ: 13 on TikTok: Self-harm and suicide content shown shocks experts

READ: TikTok completes review into harmful content following RTÉ story

However, the Kentucky court filings claim that experts consulted by TikTok in recent years “unanimously” advised adopting a different strategy rather than dispersion to address dangerous rabbit holes.

The experts recommended an approach focused on “increasing user agency and implementing algorithm changes that enable users to discover other interesting content that diverges from a given rabbit hole,” as stated in the documents.

In addition, TikTok says it has introduced ‘Screen Time Management’ features to counteract the app’s addictiveness for younger users, including prompts to take breaks after an hour of use.

Citing TikTok documents, the Kentucky AG’s court filing claims this measure “proved to have negligible impact.”

“Following an experiment, the company discovered that default screen time prompts only slightly reduced the average daily time teens spent on TikTok, from approximately 108.5 minutes to about 107 minutes,” the filing states.

Quotes from internal TikTok discussions regarding this measure are also detailed. In one internal message, a senior employee indicated that it was not anticipated to significantly affect how long young users remained on the app.

“After discussing these potential tradeoffs with [a senior TikTok executive], he suggested that we could tolerate a 5% drop in stay time for Screen Time Management features aimed at special user groups like minors and excessive users.”

“However, this shouldn’t come at the cost of retention. Nonetheless, we don’t expect significant impacts on stay time from this feature since it primarily raises awareness rather than acting as an intervention.”

Another employee remarked: “Our goal isn’t to reduce time spent but to enhance user experience satisfaction, ultimately contributing to DAU [daily active users] and retention.”

TikTok pointed out to Prime Time that it has “robust safeguards, including proactively removing suspected underage users.”

The documents filed by the Kentucky AG accuse TikTok of measuring the success of its strategies “not by whether it has actually reduced the time teens spend on the platform to mitigate this harm, but by three unrelated ‘success metrics,’” the first of which is “improving public trust in the TikTok platform via media coverage.”

In response to inquiries from Prime Time regarding the contents of the filings, TikTok asserted that it has “robust safeguards, which include proactively removing suspected underage users, and we have voluntarily introduced safety features such as default screen-time limits, Family Pairing, and privacy by default for minors under 16.”

The social media giant, currently grappling with significant cases in the US and Europe concerning the societal harms posed by the content on its platform, also criticized named media outlets for publishing “information that is under a court seal.”

Kate McDonald’s report on TikTok airs on the 17 October edition of Prime Time at 9.35pm on RTÉ One.
