Mom Flags YouTube Kids Videos That Feature Suicide Instructions & School Shootings

YouTube has responded to the Florida mom's concerns, but she and others believe the platform isn't taking the issue seriously enough. 

February 26, 2019

When it comes to allowing kids to watch content online, utilizing platforms that are geared to little ones seems like a no-brainer. YouTube Kids is one of those platforms that parents should be able to trust. But as one mom and pediatrician recently pointed out, the app is host to a bevy of inappropriate, violent videos that are far too easy for children to access.

Dr. Free Hess of Gainesville, Florida, says she found content on the YouTube Kids app depicting suicide, school shootings, and violence against female characters. She flagged nearly a dozen clips that have since been deleted, but she believes there are far more that require the platform's prompt attention.

Upon finding "about 10 [videos] very quickly and very easily," the concerned mom turned her attention to writing a blog post about the situation on her site PediMom.com. She told BuzzFeed News that she stopped at the 10 "simply because I wanted to get the blog post out, not because there weren’t more."

The videos Hess found included a cartoon inspired by Minecraft's graphics called “Monster School: SLENDERMAN HORROR GAME,” which features a school shooting. BuzzFeed points out that TellBite, the parent account behind the clip, was first created on regular YouTube and boasts 167,000 subscribers. Any content that originates on regular YouTube and makes its way over to YouTube Kids is supposed to be specially curated for children "to make it safer and simpler for kids to explore the world through online video," according to the company.

And back in July, Hess spotted another concerning video on YouTube Kids, this one containing suicide instructions: footage of a former YouTube personality telling children how to slit their wrists. After she put out a call to action, it took YouTube a week to remove the video. This past month, she noticed it had resurfaced on YouTube, so she repeated the reporting process in an effort to have it pulled again.

Last week, YouTube confirmed it had removed the video, a popular children’s Splatoon-style cartoon with the suicide instructions spliced in.

"It makes me angry and sad and frustrated," Hess told CNN. "I'm a pediatrician, and I'm seeing more and more kids coming in with self harm and suicide attempts. I don't doubt that social media and things such as this is contributing."

And as if that wasn't bad enough, Hess found additional videos glorifying sexual exploitation and abuse, human trafficking, gun violence, and domestic violence. 

YouTube responded to Hess' concerns in a statement to BuzzFeed News: The company insists that it “take[s] feedback very seriously” and is working to “ensure the videos in YouTube Kids are family-friendly.”

“We appreciate people drawing problematic content to our attention, and make it possible for anyone to flag a video," a spokesperson noted. "Flagged videos are manually reviewed 24/7 and any videos that don’t belong in the app are removed."

The company added that while it is “making constant improvements to our systems,” it acknowledges that “there’s more work to do.”

Back in November, the company announced a policy that put age restrictions on videos featuring "inappropriate use of family entertainment characters." One example that fits this description: amateur-created videos of Peppa Pig, like one in which she is “basically tortured” at a dentist's office. The company also encouraged parents like Hess to flag inappropriate content.

Hess acknowledges that the platform is quicker to pull problematic videos from YouTube Kids than from the main YouTube app, but she says taking action needs to be a team effort between tech companies and parents. She is encouraging fellow parents to be even more vigilant in monitoring their children's content consumption.

"Once someone reports it, it's too late, because a kid has already seen it," she told CNN. "There is this disconnect between what kids know about technology and what their parents know because the parents didn't grow up with it. The kids are the digital natives, and the parents are digital immigrants."

She believes the solution will come in the form of a team effort among concerned parents: "We need to fix this," Hess said. "And we all need to fix this together."


