Wall Street Journal Uses Fake Kid Accounts To Prove TikTok Suggests Sex Videos & More To Minors


Is TikTok bad for kids? Do you remember when TikTok first came about, and there was all this buzz that the app was actually a ploy by the Chinese government to steal our personal data?


Well, here we are a few years later, and it turns out a Trojan horse for the Chinese government may have been a dreamboat compared to what The Wall Street Journal recently discovered about the app.

While TikTok’s earliest days were dominated by trendy little dance numbers, it didn’t take long for more unsavory elements to creep in. Videos encouraging drug use, disordered eating, and pornography quickly became rampant on the app.

But okay, fine, whatever.

If you’re a consenting adult and that’s what you choose to use the app for, I guess that’s your business.

But, is TikTok bad for kids? 

The problem is: it’s NOT just consenting adults. According to a sting of sorts launched by the Wall Street Journal, TikTok is actively targeting its young users with inappropriate, harmful content.

That’s right. Just when you thought the app couldn’t sink lower than encouraging students around the world to actively deface and destroy school property, it did.

At least that horrible, destructive “challenge” (under the hashtag “#deviouslicks”) was user-generated stupidity.

The Wall Street Journal discovered that the TikTok app itself, through its algorithm, is actively serving extremely inappropriate content to users IT KNOWS are children.

The Wall Street Journal created a multitude of dummy accounts.

All the accounts were registered as children between the ages of 13 and 15. Using these accounts, the reporters searched for “OnlyFans,” a site known for explicit content, and then viewed a few of the videos from the search results.

And what happened next should be a crime. Actually, it probably is a crime.

The app began populating the “For You” browsing feed for these teenage users with more and more inappropriate content.

Sure, there were the viral trending videos of funny dances, pranks, puppies, etc. But there were also seriously inappropriate videos, including role-playing videos in which people pretend to be in sexual relationships with their caregivers.

(Gee, what could POSSIBLY go wrong by suggesting to impressionable kids that these types of relationships are okay. It’s not like we have a horrific problem with sexual abuse of minors in this country by people they know. Oh wait…) 

According to the Wall Street Journal,

“As the user scrolled through the videos appearing in the feed, lingering on the more sexually oriented ones while moving more quickly past others, the For You feed was soon almost entirely dominated by TikToks involving sexual power dynamics and violence.

The app’s algorithm had pushed the user into a rabbit hole that many users call “Kinktok,” featuring whips, chains and torture devices.

Some of the content is banned by the platform.”


And as if that wasn’t bad enough, of course, there’s more. The Wall Street Journal went on to explain, 

“TikTok served one account registered as a 13-year-old at least 569 videos about drug use, references to cocaine and meth addiction, and promotional videos for online sales of drug products and paraphernalia. Hundreds of similar videos appeared in the feeds of the Journal’s other minor accounts.”

But don’t worry. You can rest easy knowing TikTok CLEARLY has its priorities in order.

TikTok recently banned a hashtag for the milk crate challenge (you can read more about that stupid challenge here). As part of that ban, the app also posted official warning language.



I see… so, let me sum this up. According to TikTok:

People climbing up milk crate towers they constructed = Bad. Dangerous. Must discourage.

Deliberately suggesting and showing young teens videos of sex and drugs in their browsing feed = Good. Actively encourage.

I think the most appalling thing here (and it’s tough to pin that down, since ALL of it is appalling) is TikTok’s official response.

The Wall Street Journal shared their findings with TikTok.

They identified nearly 1,000 inappropriate videos served to the fake-teenage user accounts.

At that point, a spokeswoman for the app told the WSJ that TikTok removed some of the videos after the Journal’s accounts viewed them and restricted the distribution of other videos to stop the app from recommending them to other users, but declined to say how many.

Oh, well, that makes us feel so much better… NOT!

Even worse, the spokeswoman went on to reveal that the app doesn’t differentiate between videos it serves to adults and minors, though she said the platform is looking to create a tool that filters content for young users.


It’s bad enough there’s literally NO filter in place to prevent young users from actively searching for and finding this type of content.

But that’s not what this spokeswoman is saying.

She’s saying that there’s not even a filter for what the app, through its own algorithm, RECOMMENDS to children!

I’m sorry, but if this company gave even the tiniest little rat’s ass about kids or their well-being, that feature would’ve been baked into the software from Day 1.

But of course, they don’t care. Not one bit.

I really hesitate to tell parents how to parent. It’s a hard enough gig without us all judging one another. With that said, if your kids have access to TikTok, I implore you to cut it off. I know there’s plenty of entertaining, innocuous content on the app.

(I even wrote this piece on one of my favorite packing-lunch videos, which came courtesy of TikTok!)

But the entertaining possibilities pale in comparison to the horrible things your kids could be exposed to, not just by specifically looking for it, but by browsing a feed of suggested videos.

If your kid asks to download this app, it’s time to resurrect a line from First Lady Nancy Reagan and “Just Say No.”

To see the article in the Wall Street Journal, go here.

