YouTube Is Fighting Its Extremist Content And Child Exploitation Problems With Over 10,000 Content Moderators

YouTube has a ton of problems: extremist content that goes undetected, countless instances of child exploitation and abuse that go unpunished, and the utter vitriol of its comments section. Those problems keep growing far faster than the site can curb them.

YouTube's Content Problem

Just last week, it was reported that YouTube had removed 150,000 videos over predatory comments targeting children and disabled comments on more than 625,000 videos because of the same problem.

To try to mitigate this, YouTube is adding more human moderators and improving its machine learning systems, said CEO Susan Wojcicki in a blog post. In 2018, YouTube will raise its content moderation workforce to over 10,000 employees, all of whom will be tasked with screening videos for trouble spots while simultaneously training YouTube's machine learning algorithms to find and remove problematic content involving children.

According to reports, that figure represents a 25 percent increase over the company's current moderation workforce.

"Human reviewers remain essential to both removing content and training machine learning systems because human judgment is critical to making contextualized decisions on content," said Wojcicki.

Over the last two weeks, YouTube has removed hundreds of thousands of videos showing children in potentially exploitive situations, including children duct-taped to walls, forced into washing machines, and other more serious scenarios. The company said it will employ the same method it used this summer when it moved to get rid of extremist content.

YouTube's New Approach To Advertising

Wojcicki also wants to find a new approach to advertising on the platform. The company said that in the last two weeks, it has pulled ads from about 2 million videos that pose as family-friendly content. Going forward, she said that YouTube would be "carefully considering which channels and videos are eligible for advertising." In addition, YouTube would also "apply stricter criteria, conduct more manual curation, while also significantly ramping up our team of ad reviewers."

Several brands have previously run into trouble after their ads appeared alongside inappropriate, extremist, or offensive videos, prompting some of them to pull their advertising from the platform.

Exactly when these changes will be implemented is hard to say for now, but it's an important pledge regardless. YouTube's finances, for the record, aren't broken out when Google's parent company, Alphabet, reports quarterly earnings, making it hard to determine which advertisers are funding which videos.

Here's hoping for a safer YouTube going forward.
