Under-fire YouTube to ramp up efforts to tackle inappropriate content

YouTube to counter extremist content with 10,000 staff

In a blog post, Wojcicki said the company was already taking "aggressive action" on comments and was testing new systems that combine human and automated checks to counter threats.

Google, which owns YouTube, announced on Monday that next year it would expand to more than 10,000 the total workforce responsible for reviewing content that could violate its policies. To date, 98% of the videos removed for violent extremism have been flagged by machine-learning algorithms, and 70% of violent extremist videos are taken down within eight hours of upload.

"Human reviewers remain essential to both removing content and training machine learning systems because human judgement is critical to making contextualised decisions on content", she said.

In recent weeks there have been reports of disturbing videos aimed at children, and of paedophiles posting comments on children's videos.

Reports also found that adverts for major brands were appearing alongside some of these videos, leading several big brands, including Mars and Adidas, to pull advertising from the site.

In the same blog post, the CEO said moderators had manually reviewed almost 2m videos for violent extremist content since June, helping train machine-learning systems to identify similar footage in the future. "Equally, we want to give creators confidence that their revenue won't be hurt by the actions of bad actors," she added.

The technology has reviewed and flagged content that would have taken 180,000 people working 40 hours a week to assess, according to Wojcicki.

"We believe this requires a new approach to advertising on YouTube, carefully considering which channels and videos are eligible for advertising," Wojcicki wrote.

The YouTube CEO also said the company had developed machine-learning technology capable of weeding out radical content on the platform, where hundreds of hours of video are uploaded every minute.

To combat the issue, the video-hosting platform intends to "apply stricter criteria and conduct more manual curation" while boosting its team of human reviewers "to ensure ads are only running where they should".
