TikTok Promotes Sexual Content, Drugs and Alcohol to Children, Investigation Finds

by Ailan Evans

 

Video-sharing platform TikTok promotes sexual content to underage users through its suggestion algorithm, according to an investigation by The Wall Street Journal.

Investigators for The Wall Street Journal set up 31 fake TikTok accounts registered as users aged 13 to 15 and studied their “For You” feeds, which consist of videos recommended by TikTok’s suggestion algorithm.

TikTok showed the underage accounts thousands of videos depicting drug and alcohol use, pornography and other sexual content, including more than 100 videos promoting pornography subscription services and sex products, according to the investigation, which took place over several months.

One account, registered to a 13-year-old, was shown at least 569 videos related to drug use, including videos about meth and cocaine addiction. Another was recommended sexual role-play videos until 90% of its feed consisted of content about bondage, sadomasochism and other sexual practices, according to the WSJ.

TikTok did not immediately respond to the Daily Caller News Foundation’s request for comment, but told the WSJ that it does not separate content intended for adult users from content safe for children. A spokeswoman said the company was looking into technology that would filter out adult content for underage users.

“Protecting minors is vitally important, and TikTok has taken industry-first steps to promote a safe and age-appropriate experience for teens,” the spokeswoman told the WSJ.

Several of the accounts were shown videos containing links directing viewers to sign up for OnlyFans.com, a subscription-based site featuring pornography and sexual content. However, a TikTok spokeswoman told the WSJ that the app removes links that lead directly to sexual services, including OnlyFans.

TikTok tracks users across the app and measures how much time they spend on each video, the WSJ reported. The app then uses this information to select which videos to recommend, keeping users engaged.

“All the problems we have seen on YouTube are due to engagement-based algorithms, and on TikTok it’s exactly the same — but it’s worse,” Guillaume Chaslot, a former YouTube engineer, told the WSJ. “TikTok’s algorithm can learn much faster.”
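To illustrate what an engagement-based recommender of this general kind does in principle, here is a minimal sketch in Python. It is not TikTok’s actual system; the topic labels, the use of watch-time fractions as the engagement signal, and all names are assumptions introduced purely for illustration.

```python
from collections import defaultdict

class EngagementRecommender:
    """Toy engagement-based recommender: ranks candidate videos by how
    long a user has historically watched videos on the same topic."""

    def __init__(self):
        # topic -> [total watch fraction, number of views]
        self.stats = defaultdict(lambda: [0.0, 0])

    def record_view(self, topic, watch_fraction):
        """Record what fraction of a video (0.0 to 1.0) the user watched."""
        total, count = self.stats[topic]
        self.stats[topic] = [total + watch_fraction, count + 1]

    def score(self, topic):
        """Average watch fraction for a topic; 0.5 prior for unseen topics."""
        total, count = self.stats[topic]
        return total / count if count else 0.5

    def recommend(self, candidates, k=5):
        """Return the k (video_id, topic) pairs the user is predicted
        to watch longest, i.e. the most 'engaging' ones."""
        return sorted(candidates, key=lambda c: self.score(c[1]), reverse=True)[:k]

# Example: a user who lingers on one topic quickly sees the feed
# dominated by that topic.
rec = EngagementRecommender()
rec.record_view("cooking", 0.2)
rec.record_view("extreme_diets", 0.95)
rec.record_view("extreme_diets", 0.9)
candidates = [("v1", "cooking"), ("v2", "extreme_diets"),
              ("v3", "news"), ("v4", "extreme_diets")]
print(rec.recommend(candidates, k=3))  # "extreme_diets" videos rank first
```

The sketch makes the same point Chaslot describes: a system optimized purely for watch time keeps narrowing a feed toward whatever a user lingers on, regardless of whether that content is age-appropriate.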

TikTok users must be at least 13 years old, and users under 18 need parental permission to use the app, according to its terms of service. However, there is no mechanism to prevent underage users from signing up or to verify parental approval, the DCNF found.

Apple was the subject of a similar investigation last month that found its App Store had few controls in place preventing minors from accessing adult hookup apps and apps with sexual content. An investigation released in June by the Human Trafficking Institute found that 59% of sex trafficking victims were recruited through Facebook.

– – –

Ailan Evans is a reporter at Daily Caller News Foundation.
Photo “TikTok” by 8268513.

Content created by The Daily Caller News Foundation is available without charge to any eligible news publisher that can provide a large audience. For licensing opportunities of our original content, please contact [email protected].
