"As challenges to our platform evolve and change, our enforcement methods must and will evolve to respond to them. But no matter what challenges emerge, our commitment to combat them will be sustained and unwavering," wrote CEO of YouTube, Susan Wojcicki.
The internet can be a scary place for parents of young kids, and the popular video-sharing platform YouTube has become a focal point of that concern. The Google-owned website continues to make headlines for disturbing videos aimed at children, which has resulted in the shutdown of channels and the removal of thousands of exploitative videos. After what seemed to be a continual lack of resolution from YouTube, CEO Susan Wojcicki has announced that in 2018 the company will have 10,000 human moderators responsible for reviewing policy-violating content, in an effort to protect children.
News of the company's plan to expand its team of human moderators came on Monday, December 4, in an official YouTube blog post.
"As the CEO of YouTube, I’ve seen how our open platform has been a force for creativity, learning and access to information," Wojcicki wrote. She continued, "But I’ve also seen up-close that there can be another, more troubling, side of YouTube’s openness. I’ve seen how some bad actors are exploiting our openness to mislead, manipulate, harass or even harm."
The blog post explains how human reviewers help YouTube confidently remove unwanted content as well as train its machine learning systems. The company found human judgment to be a critical component in detecting and removing violent extremist content, leaving it optimistic that the same approach will create a safer space for kids on the internet.
In an effort toward greater transparency, YouTube has compiled Community Guidelines so users can clearly understand what content is not allowed on the website and how those rules will be enforced. The guidelines affect not only users but also advertisers. "We want advertisers to have peace of mind that their ads are running alongside content that reflects their brand’s values. Equally, we want to give creators confidence that their revenue won’t be hurt by the actions of bad actors," Wojcicki explained.
YouTube has already begun terminating hundreds of accounts and removing comments as it works closely with the National Center for Missing and Exploited Children and the Internet Watch Foundation to ensure the safety of its underage users. It will be interesting to see how these changes play out in 2018. Hopefully, YouTube's action will encourage other websites to be equally mindful of child endangerment online.