Google pledges 10,000 staff to tackle extremist content

Google expanding moderation team to 10,000 amid concerns over 'inappropriate' YouTube videos

YouTube announced on Monday that it will expand its content moderation and rule enforcement team to more than 10,000 people by the end of 2018, a 25 percent increase, according to BuzzFeed.

YouTube CEO Susan Wojcicki said "some bad actors are exploiting" the Google-owned service to "mislead, manipulate, harass or even harm". The company was forced to adopt additional screening measures last month on its kid-friendly platform, YouTube Kids, after reports showed that numerous videos there contained profanity and violence. Following the furore over unsuitable videos, YouTube announced measures such as removing adverts from videos that depict family entertainment characters engaged in violent behaviour, and blocking all comments on videos aimed at minors if inappropriate comments are posted.

She said that adding more people to identify inappropriate content will supply more data to train, and potentially improve, its machine learning software.

The effort also includes taking "aggressive action" on comment moderation, and the company is testing new systems to help fight emerging threats.

"We want advertisers to have peace of mind that their ads are running alongside content that reflects their brand's values," Wojcicki wrote.

Previous efforts to tackle "problematic" content have prompted regular YouTube creators to complain about ads being removed from their videos, and major companies pulling their ads is also likely to have a knock-on effect on smaller channels looking to earn advertising revenue. Moreover, employees who must personally view extremely disturbing content in order to screen it properly can suffer lasting psychological injuries; a group of moderators sued Microsoft in January, alleging that they developed PTSD as a result of watching child abuse and other sadistic acts.

"I've also seen up-close that there can be another, more troubling, side of YouTube's openness", Wojcicki wrote.

Machine learning is part of that effort: Wojcicki said it "is helping our human reviewers remove almost five times as many videos than they were previously". Half of violent extremist content is removed by machine learning in under two hours, and 70 percent is removed within eight hours. However, the platform's openness has also drawn a range of objectionable material, from videos promoting conspiracy theories to violent extremism. Sources say the video site will ask brands to commit more money to purchase exclusive space next to vetted content.

Wojcicki, on behalf of YouTube, also pledged to find a "new approach to advertising on YouTube" that works for both advertisers and content creators.
