A shocking new investigation reveals that TikTok’s powerful algorithm may be putting children in harm’s way. Despite promises of safety, the platform is reportedly recommending pornographic and highly sexualized content directly to accounts set up as 13-year-old children.
This isn’t a case of kids seeking out this content. Researchers from the human rights group Global Witness created fake child profiles with safety settings turned on. Without making a single search, the app’s “you may like” feature began suggesting explicit search terms. Following these recommendations led to a disturbing range of sexual content, from suggestive material to videos of penetrative sex.
Safety Features Bypassed
The researchers specifically activated TikTok’s “Restricted Mode,” which the platform states is designed to block “mature or complex themes, such as… sexually suggestive content.” Despite this, the recommendations and videos appeared, often hidden within innocent-looking content to evade moderation.
A Pattern of Problems
This is the second time Global Witness has uncovered this issue. After the group first found the problem in April and alerted TikTok, the company promised action. However, a follow-up test in late July and August, conducted after new UK “Children’s Code” laws came into force, found the same failures were still happening.
One user’s confused comment on the platform echoes the concern of many: “can someone explain to me what is up w my search recs pls?”
TikTok’s Response and the Path Forward
TikTok states it is “fully committed to providing safe and age-appropriate experiences” and has over 50 safety features. Upon being notified, the platform said it removed the violating content and made improvements to its search suggestions.
However, this investigation raises critical questions about the effectiveness of self-regulation. With new laws now legally requiring platforms to protect children, advocates are urging regulators to step in and ensure companies like TikTok are held accountable.