A new report accuses TikTok of recommending pornography and sexualised videos to child users. Researchers created fake child accounts, enabled safety settings, and still received sexually explicit search suggestions. Those suggestions led to clips showing simulated masturbation and, in some cases, penetrative sex. TikTok says it acted quickly once alerted and insists it remains committed to providing safe experiences online.
Researchers uncover disturbing results
In late July and early August, Global Witness researchers set up four TikTok accounts posing as 13-year-olds, entering false dates of birth; the app did not request any further identity checks. Investigators also switched on TikTok's "restricted mode", a feature the platform claims blocks mature or suggestive content. Despite this, the accounts were shown overtly sexual search suggestions in the "you may like" section. These terms led to videos of women flashing underwear, exposing breasts and simulating masturbation. Some videos contained explicit pornography, and many were embedded within innocent-looking clips to evade moderation systems.
Global Witness warns of danger
Ava Lee from Global Witness described the findings as a "huge shock". She stressed that the platform not only fails to protect children but actively pushes harmful material towards them. Global Witness usually focuses on how major tech companies affect democracy, climate change and human rights; the group stumbled upon TikTok's issue by accident during unrelated research in April.
TikTok promises tighter control
Researchers reported the issue to TikTok earlier this year. The company said it removed the offending videos and adjusted its systems. But when Global Witness repeated the test in late July, it found the same problem. TikTok says it offers more than 50 safety features for teenagers and claims nine out of ten guideline-violating videos are removed before they are ever viewed. The company also said it improved its search suggestions following the latest warnings.
New law increases pressure
On 25 July, the Children's Codes within the Online Safety Act came into force. These rules require platforms to use effective age checks and to prevent children from seeing pornography. Algorithms must also block harmful content that encourages self-harm, suicide or eating disorders. Global Witness repeated its research after the rules took effect. Ava Lee urged regulators to step in, saying children's online safety must come first.
Users voice frustration
During the investigation, researchers also observed reactions from other users, some of whom questioned why their search recommendations had suddenly become sexual. One user wrote: "can someone explain to me what is up with my search recs pls?" Another commented: "what's wrong with this app?"