YouTube Will Grow Video Moderation Team to Over 10,000 in 2018

Google-owned YouTube has vowed to step up its policing of inappropriate content.  AFP

More than 400 hours of videos are uploaded to YouTube every minute, and company executives have long said that monitoring that fire hose of content is hard.

YouTube is now under fire for hosting videos aimed at children that feature sexual and violent themes, and the company says it will substantially increase the number of people reviewing content in 2018.

Some 150,000 offending videos have been purged from YouTube and two million videos screened for violent extremist content since June, and Google is ramping up its efforts to sweep away extremist and hateful content.

YouTube has also repeatedly sparked outrage for its role in perpetuating misinformation and harassing videos in the wake of mass shootings and other national tragedies.

YouTube said machine learning was helping its human moderators remove almost five times as many videos as they did previously, and that 98% of videos removed for violent extremism are now flagged by algorithms.

YouTube has grappled with a series of controversies this year concerning videos available on its platform.

Recently, British media reported that big-brand advertisements were appearing alongside videos of children and teens that drew inappropriate comments from the public.

In a Monday blog post, YouTube CEO Susan Wojcicki said Google will grow its content review teams to more than 10,000 workers in 2018.

The company will also focus on training its machine-learning algorithms to help human reviewers identify and remove accounts and comments that violate the site's rules.

In a separate post on the YouTube Creator Blog, Wojcicki also warned about a growing number of "bad actors" who share extremist content and disseminate videos "that masquerade as family-friendly content but are not".

Ms Wojcicki moved to reassure video-makers that they won't be adversely affected by any changes, saying: "We've heard loud and clear from creators that we have to be more accurate when it comes to reviewing content, so we don't demonetise videos by mistake".

"Because we have seen these positive results, we have begun training machine-learning technology across other challenging content areas, including child safety and hate speech", she added.

"We believe this requires a new approach to advertising on YouTube, carefully considering which channels and videos are eligible for advertising".
