Let’s face it, algorithms and artificial intelligence are still far from replacing human eyes and judgment. That’s why YouTube will be hiring more people to keep tabs on questionable content. Over the past few months, there has been an uproar over questionable content migrating to YouTube Kids, and many have blamed YouTube’s algorithm for being fooled by inappropriate videos disguised as kid-friendly content.
YouTube CEO Susan Wojcicki praised the platform for fostering creativity and ushering in new and exciting content, but she also admitted there are some issues.
But I’ve also seen up-close that there can be another, more troubling, side of YouTube’s openness. I’ve seen how some bad actors are exploiting our openness to mislead, manipulate, harass or even harm.
Now, we are applying the lessons we’ve learned from our work fighting violent extremism content over the last year in order to tackle other problematic content. Our goal is to stay one step ahead of bad actors, making it harder for policy-violating content to surface or remain on YouTube.
Wojcicki goes on to say that human eyes are a critical part of the reviewing process.
Human reviewers remain essential to both removing content and training machine learning systems because human judgment is critical to making contextualized decisions on content.
We will continue the significant growth of our teams into next year, with the goal of bringing the total number of people across Google working to address content that might violate our policies to over 10,000 in 2018.
Wojcicki says YouTube reviewers have manually reviewed nearly 2 million videos for questionable content. There is no doubt that the job is immense, considering how many YouTube videos are uploaded every day. Even with 10,000-plus human reviewers in 2018, I’m sure it’s not going to be perfect. Wojcicki also says that YouTube will be more transparent about how it deals with issues on the platform.
We understand that people want a clearer view of how we’re tackling problematic content. Our Community Guidelines give users notice about what we do not allow on our platforms and we want to share more information about how these are enforced. That’s why in 2018 we will be creating a regular report where we will provide more aggregate data about the flags we receive and the actions we take to remove videos and comments that violate our content policies. We are looking into developing additional tools to help bring even more transparency around flagged content.
It will be interesting to see how well this goes and whether YouTube can get a better handle on the “bad actors.” What do you think of YouTube hiring more content moderators? Let us know in the comments below or on Google+, Twitter, or Facebook.

Source: YouTube Blog