YouTube is hiring an army of ‘moderators’ to weed out ‘extremist’ content in an ostensible effort to keep advertisers attracted to its platform. Will the gambit work? Will it please the advertisers? And if it does, will it destroy the very content makers that made advertising on YouTube possible in the first place?
YouTube has a ton of problems: a surplus of extremist content that goes undetected, countless instances of child exploitation and abuse that parade around without penalty, and the utter vitriol of its comments section. Those tumors keep growing at a rate much faster than the site is able to curb.
YouTube’s Content Problem
Just last week it was reported that YouTube removed 150,000 videos over predatory comments targeting children, and also disabled comments on more than 625,000 videos because of the same problem.
To mitigate this, YouTube is adding more human moderators and improving its machine learning, CEO Susan Wojcicki said in a blog post. In 2018, YouTube will grow its content moderation workforce to over 10,000 employees, who will be tasked with screening videos for trouble spots while simultaneously training YouTube’s machine learning algorithm to find and remove problematic children-centric content.
According to reports, that number represents a 25 percent increase over the company’s current moderation workforce.
Read More at Tech Times