The company announced the safeguard in response to accusations that the site's algorithm encourages the creation of inappropriate videos aimed at children.


Whether you're on a long car trip or hanging out at home on a cold weekend morning, YouTube can be a major saving grace for quick, super-accessible kids' entertainment. But as any parent with a smartphone-savvy kiddo knows, the app presents a host of problems. Namely, that your L.O. can stumble on an inappropriate clip all too easily. In fact, YouTube has been accused of enabling "infrastructural violence" due to its role in the creation of tons of low-quality, seriously disturbing content aimed at preschoolers. Thankfully, help may finally be here.

The Guardian reports that YouTube announced a new policy on Thursday, November 9, which will put age restrictions on videos that feature "inappropriate use of family entertainment characters." One example that fits this description: amateur-created videos of Peppa Pig, like one in which "she is basically tortured" at a dentist's office.

YouTube had previously stated that videos like this couldn't bring in advertising revenue, a separate rule the company had hoped would discourage their production altogether. But since that move hasn't cut down on them completely, the age restriction wall is clearly necessary. It prevents users who aren't logged in, or who are registered as under 18, from watching the clip. What's more, age-restricted videos are barred from YouTube Kids, which is geared toward kids under 13.

“Earlier this year, we updated our policies to make content featuring inappropriate use of family entertainment characters ineligible for monetization,” Juniper Downs, YouTube’s director of policy, said in a statement from the company. “We’re in the process of implementing a new policy that age restricts this content in the YouTube main app when flagged. Age-restricted content is automatically not allowed in YouTube Kids. The YouTube team is made up of parents who are committed to improving our apps and getting this right.”

According to Mashable, YouTube will be leaning on users to flag content that should be restricted in this way. The company told the site that "this is an added layer of protection and not the only process that can keep a video from migrating into the kids app from YouTube main." It also pointed out that it uses machine learning and algorithms to find appropriate content for kids, and that the "system is constantly evolving to block inappropriate content."

Relying on algorithms and other users to report content certainly sounds like an imperfect system. But at the very least, it's heartening to see that YouTube has committed to improving its practices. Let's hope it continues to do so!