YouTube creating team of 10K to moderate, purge dicey videos


In a tacit admission that its much-hyped artificial intelligence still lags behind humans, Google said it would increase the number of people it has monitoring YouTube for offensive and extremist content to 10,000.

"Some bad actors are exploiting our openness to mislead, manipulate, harass or even harm," YouTube CEO Susan Wojcicki said, adding that YouTube's trust and safety teams have reviewed almost 2 million videos for violent extremist content over the past six months.

Wojcicki said the company would take "aggressive action" by launching new comment moderation tools.

Google will have more than 10,000 workers addressing the problem by next year, though Wojcicki's blog post Monday did not say how many the company already employs.

YouTube last week updated its recommendation feature to spotlight videos users are likely to find the most gratifying, brushing aside concerns that such an approach can trap people in bubbles of misinformation and like-minded opinions.

The reports led several big brands including Mars and Adidas to pull advertising from the site.


Adding more reviewers will also help YouTube supply more training data to, and thereby improve, its machine learning software. Those systems, which flag 98% of the videos removed for violent extremism, now help human reviewers take down nearly five times as many videos as before.

Earlier this year, advertisers fled the site after ads appeared next to extremist content.

She said that since June, when YouTube deployed new technology to flag violent extremist content for human review, the platform had manually reviewed almost 2 million videos and removed 150,000 of them.

"We are planning to apply stricter criteria, conduct more manual curation, while also significantly ramping up our team of ad reviewers to ensure ads are only running where they should," Wojcicki wrote.

Previous efforts to tackle "problematic" content prompted regular YouTube creators to complain about ads being stripped from their videos, and major companies pulling their ads is also likely to have a knock-on effect on channels that depend on advertising revenue.

Wojcicki said it would have taken 180,000 people working 40 hours a week to assess the same amount of content.