YouTube is adding more human moderators and increasing its use of machine learning in an attempt to curb its child exploitation problem, the company’s CEO, Susan Wojcicki, said in a blog post on Monday evening.
The company plans to grow its content moderation workforce to more than 10,000 employees in 2018 to help screen videos and train the platform’s machine learning algorithms to spot and remove problematic children’s content. Sources familiar with YouTube’s workforce numbers say this represents a 25% increase from where the company is today.
In the last two weeks, YouTube has removed hundreds of thousands of videos featuring children in disturbing and possibly exploitative situations, including being duct-taped to walls, mock-abducted, and even forced into washing machines. The company said it will employ the same approach it used this summer as it worked to eradicate violent extremist content from the platform.
Though it’s unclear whether machine learning can adequately catch and limit disturbing children’s content — much of which is creepy in ways that may be difficult for a moderation algorithm to discern — Wojcicki touted the company’s machine learning capabilities, when paired with human moderators, in its fight against violent extremism.
According to YouTube, it has used machine learning to remove more than 150,000 videos for violent extremism since June; such an effort “would have taken 180,000 people working 40 hours a week,” according to the company. The company also claimed its algorithms were getting increasingly better at identifying violent extremism — in October the company said that 83% of the videos it removed for extremist content were originally flagged by machine learning; just one month later, it says that number is now 98%.
Wojcicki, on behalf of YouTube, also pledged to find a “new approach to advertising on YouTube” for both advertisers and content creators. In the last two weeks, YouTube said it has removed ads from nearly 2 million videos and more than 50,000 channels “masquerading as family-friendly content.” The crackdown has come after numerous media reports revealed that many of the videos — often with millions of views — ran with pre-roll advertisements for major brands, some of which suspended their advertising business with the platform in November.
Though Wojcicki offered no concrete plans for advertising going forward, she said the company would be “carefully considering which channels and videos are eligible for advertising.” The blog post also said the company would “apply stricter criteria, conduct more manual curation, while also significantly ramping up our team of ad reviewers.”
It’s unclear when the advertising adjustments will go into effect. For now, controversial videos still appear to be running alongside advertisements. In a review of videos masquerading as family-friendly content, BuzzFeed News found advertisements running on numerous popular “flu shot” videos, a genre that typically features infants and young children screaming and crying.
On Monday afternoon, two flu shot videos on a family account called “Shot Of The Yeagers” were found running advertisements for Lyft, Adidas — which had previously told the Times of London it had suspended advertising on the platform — Phillips, Pfizer, and others. When BuzzFeed News contacted Adidas and Lyft about their ads running near the videos, both companies said they would look into the matter.
“A Lyft ad should not have been served on this video,” a Lyft spokesperson told BuzzFeed News. “We have worked with platforms to create safeguards to prevent our ads from appearing on such content. We are working with YouTube to determine what happened in this case.”
Adidas offered BuzzFeed News a statement dated November 23 and added, “we recognize that this situation is clearly unacceptable and have taken immediate action, working closely with Google on all necessary steps to avoid any reoccurrences of this situation.” Less than one hour after their initial response, the flu shot videos appeared to have been deleted from YouTube entirely.
Charlie Warzel is a senior writer for BuzzFeed News and is based in New York. Warzel reports on and writes about the intersection of tech and culture.
Remy Smidt is a reporter with BuzzFeed News and is based in New York.