A human rights group has accused TikTok of recommending pornography and sexualised videos to child users. Researchers built fake child accounts, enabled safety settings, and still received sexually explicit search prompts. These led to clips of simulated masturbation and, in the most extreme cases, pornographic sex acts. TikTok says it acted quickly after being alerted and remains committed to providing safe online experiences.
Researchers set up child profiles
In late July and early August, Global Witness researchers created four TikTok accounts, posing as 13-year-olds with false birth dates. The platform did not request any further identification. Investigators also activated TikTok's "restricted mode", a feature the company says prevents access to mature or sexual content. Despite this, search suggestions in the "you may like" section promoted sexual terms, which linked to videos of underwear flashing, breast exposure and masturbation. At the most extreme, investigators found explicit pornography hidden inside ordinary-looking clips to evade moderation.
Global Witness issues warning
Ava Lee from Global Witness described the results as a “huge shock”. She said TikTok not only fails to block harmful content but also recommends it to children. Global Witness usually studies how technology influences democracy, climate change and human rights. The organisation first noticed this problem during unrelated research in April.
TikTok responds to criticism
Researchers informed TikTok of their findings earlier this year. The company said it had deleted the inappropriate material and fixed the issue, but when Global Witness repeated its test in late July, sexual videos appeared again. TikTok says it offers more than 50 protective features for teenagers, and claims nine out of ten violating videos are removed before anyone watches them. Following the report, TikTok says it upgraded its search tools and removed additional harmful material.
New safety rules increase pressure
On 25 July, the Children's Codes under the Online Safety Act came into force. These rules require platforms to use stronger age checks and to prevent minors from seeing pornography. Algorithms must also block content linked to suicide, eating disorders and self-harm. Global Witness carried out its second investigation after the new regulations took effect. Ava Lee said regulators must step in to guarantee children's online safety.
Users question recommendations
During the investigation, researchers also observed comments from regular TikTok users. Some asked why their search suggestions had become sexual. One wrote: “can someone explain to me what is up with my search recs pls?” Another commented: “what’s wrong with this app?”