YouTube’s 10K Moderator Army Attacks “Extremist” Videos

YouTube is hiring an army of ‘moderators’ to weed out ‘extremist’ content in an ostensible effort to keep advertisers attracted to its platform. Will the gambit work? Will it please the advertisers? And if it does, will it destroy the very content makers who made advertising on YouTube possible in the first place?

YouTube Is Fighting Its Extremist, Child Exploitation Problem With Over 10,000 Content Police Officers

YouTube has a ton of problems: a surplus of extremist content on its site that goes undetected, countless instances of child exploitation and abuse that parade around without penalty, not to mention the utter vitriol of its comments section. Those tumors keep growing, at a rate much faster than the site is able to curb.

YouTube’s Content Problem

Just last week it was reported that YouTube removed 150,000 videos over predatory comments targeting children, and disabled comments on more than 625,000 videos because of the same problem.

To try to mitigate this, YouTube is adding more human moderators and improving its machine learning, CEO Susan Wojcicki said in a blog post. In 2018, YouTube will grow its content moderation workforce to more than 10,000 people, all of whom will be tasked with screening videos for trouble spots while simultaneously training YouTube’s machine learning algorithms to find and remove problematic child-centric content.

According to reports, that number represents a 25 percent increase over the company’s current moderation workforce.

Read More at Tech Times
