A new report has found that TikTok has directed young users toward sexually explicit content through its suggested search terms, according to an investigation by UK nonprofit watchdog Global Witness, as tech companies face mounting pressure to strengthen age verification.
As part of the investigation published on Oct. 3, Global Witness said it had set up seven new TikTok accounts in the UK posing as 13-year-olds – the minimum age required to create an account – on factory-reset phones with no search histories.
Global Witness said TikTok’s search suggestions were “highly sexualized” for users who both reported being 13 years old and browsed the app using “restricted mode,” which “limits exposure to content that may not be comfortable for everyone,” including “sexually suggestive content,” according to TikTok’s help page.
The report comes amid a broader push in both the United Kingdom and the United States to better protect children online, and as TikTok faces allegations in lawsuits filed last year that it is harmful to young users’ mental health.
When NCS reached out about the report, a TikTok spokesperson said the company was committed to keeping its users’ experience safe.
“As soon as we were made aware of these claims, we took immediate action to investigate them, remove content that violated our policies, and launch improvements to our search suggestion feature,” the spokesperson said in a statement, adding that TikTok has “more than 50 features and settings designed specifically to support the safety and well-being of teens.”
The statement also said TikTok is “fully committed to providing safe and age-appropriate experiences” and that it removes “9 in 10 violative videos before they are ever viewed.”
Sexualized searches, however, were recommended “the very first time that the user clicked into the search bar” for three of the test accounts Global Witness created, according to the report. TikTok surfaced pornographic content to all seven test users “just a small number of clicks after setting up the account.”
“Our point isn’t just that TikTok shows pornographic content to minors. It is that TikTok’s search algorithms actively push minors towards pornographic content,” Global Witness said in its report.
TikTok’s community guidelines prohibit content containing nudity, sexual activity and sexual services, as well as any content containing sexually suggestive acts and significant body exposure involving youth.
TikTok said in its transparency report covering January through March 2025 that roughly 30% of the content removed from the platform for policy violations was taken down over sensitive and mature themes.
TikTok removes around 6 million underage accounts globally every month using various age detection methods, including technology to detect when an account may be used by a child under the age of 13, according to a spokesperson. It also trains moderation teams to spot signs that children under 13 may be using TikTok.
The report comes after additional rules from the UK’s Online Safety Act pertaining to child safety took effect in late July. Media lawyer Mark Stephens said in Global Witness’ report that the findings “represent a clear breach” of the act.
TikTok did not immediately respond to Stephens’ comment when asked by NCS.
The Online Safety Act 2023 is a set of laws intended to improve internet safety by imposing new rules that require tech companies to regulate certain types of content, such as implementing age checks to prevent children from accessing material deemed harmful, including pornography and posts related to self-harm.
The act also applies to online platforms outside the UK that have a significant UK user base or can be accessed by UK users. But critics of the act, such as the Electronic Frontier Foundation, have said the age verification rules could threaten the privacy of users of all ages.
Global Witness said it conducted the first few tests before the UK’s Online Safety Act child safety rules fully applied to TikTok and other online platforms on July 25, and ran additional tests after that date.
TikTok approaches Online Safety Act compliance “with a robust set of safeguards,” the spokesperson said, having been regulated by UK communications services regulator Ofcom since 2020 under Ofcom’s Video-Sharing Platform regime, which includes provisions to protect those under 18 from inappropriate content.
TikTok has introduced safety features for teens in recent years, such as a “guided meditation” feature to help young users cut back on scrolling, and the disabling of late-night notifications.
TikTok is one of many tech giants facing increased pressure to better protect children online. YouTube, for example, launched a system in August that uses artificial intelligence to estimate a user’s age and turn on age-specific protections if necessary. Instagram implemented teen account settings last year that automatically make teen accounts private.