The Popular Video Platform Allegedly Directs Children's Profiles to Pornographic Content In Just a Few Taps

According to a recent investigation, TikTok has been directing children's accounts to pornographic content within just a few clicks.

Research Methodology

Global Witness created fake accounts using a 13-year-old's birth date and enabled the platform's content restriction feature, which is designed to limit exposure to sexually suggestive content.

The researchers found that TikTok suggested sexually charged search terms to seven test accounts, each set up on a previously unused smartphone with no search history.

Troubling Search Prompts

Search phrases proposed under the "recommended for you" feature included "provocative attire" and "explicit content featuring women", and then escalated to keywords such as "hardcore pawn [sic] clips".

For three of the accounts, the sexualized searches were suggested immediately.

Rapid Access to Explicit Content

After a small number of clicks, the investigators found explicit material ranging from revealing content to explicit intercourse.

Global Witness reported that the content attempted to bypass moderation filters, often by embedding the video within a benign image or clip.

For one profile, reaching the material took just two interactions after signing in: one tap on the search function and one on the proposed query.

Compliance Requirements

Global Witness, whose remit includes researching big tech's impact on public safety, said it conducted two batches of tests.

The first batch of tests took place before safeguarding regulations under the United Kingdom's digital protection law came into force on July 25th; the second was conducted after the rules took effect.

Alarming Results

The organization added that multiple clips appeared to feature someone below the age of consent; those clips were reported to the Internet Watch Foundation, which monitors exploitative content.

The research organization asserted that the social media app was in breach of the UK safety legislation, which obliges digital platforms to prevent children from encountering harmful material such as pornography.

Regulatory Response

An official representative for Ofcom, which is responsible for overseeing the act, stated: "We acknowledge the effort behind this research and will examine its findings."

Official requirements for complying with the act indicate that tech companies posing a significant risk of displaying harmful material must "adjust their systems to remove harmful content from young users' timelines".

The app's policies forbid pornographic content.

Platform Response

TikTok said that after being notified by the research group, it had taken down the offending videos and made changes to its search suggestion feature.

"As soon as we were made aware of these allegations, we responded quickly to look into the matter, remove content that contravened our rules, and launch improvements to our search suggestion feature," stated a spokesperson.

Sergio Parks

A passionate writer and life coach dedicated to helping others achieve their full potential through actionable advice.