Is TikTok’s ‘For You’ Page Failing Our Kids?

A shocking new investigation reveals that TikTok’s powerful algorithm may be putting children in harm’s way. Despite promises of safety, the platform is reportedly recommending pornographic and highly sexualized content directly to accounts set up as belonging to 13-year-old children.

This isn’t a case of kids seeking out this content. Researchers from the human rights group Global Witness created fake child profiles with safety settings turned on. Before the researchers made a single search, the app’s “you may like” feature began suggesting explicit search terms. Following these recommendations led to a disturbing range of sexual content, from suggestive material to videos of penetrative sex.

Safety Features Bypassed

The researchers specifically activated TikTok’s “Restricted Mode,” which the platform states is designed to block “mature or complex themes, such as… sexually suggestive content.” Despite this, the recommendations and videos appeared, often hidden within innocent-looking content to evade moderation.

A Pattern of Problems

This is the second time Global Witness has uncovered this issue. After first finding the problem in April and alerting TikTok, the company promised action. However, a follow-up test in late July and August – after new UK “Children’s Code” laws came into force – found the same failures were still happening.

One user’s confused comment on the platform echoes the concern of many: “can someone explain to me what is up w my search recs pls?”

TikTok’s Response and The Path Forward

TikTok states it is “fully committed to providing safe and age-appropriate experiences” and has over 50 safety features. Upon being notified, the platform said it removed the violating content and made improvements to its search suggestions.

However, this investigation raises critical questions about the effectiveness of self-regulation. With new laws now legally requiring platforms to protect children, advocates are urging regulators to step in and ensure companies like TikTok are held accountable.
