YouTube hiring to get more human eyes reviewing questionable content

Let’s face it: algorithms and artificial intelligence are still far from replacing human eyes and judgment. That’s why YouTube will be hiring to get even more human eyes keeping tabs on questionable content. Over the past few months, there has been an uproar over questionable content migrating to YouTube Kids, with many blaming YouTube’s algorithms for being fooled by videos disguised as kid-friendly content.

YouTube CEO Susan Wojcicki praised the platform for fostering creativity and ushering in new and exciting content. But she also admitted there are some issues.

But I’ve also seen up-close that there can be another, more troubling, side of YouTube’s openness. I’ve seen how some bad actors are exploiting our openness to mislead, manipulate, harass or even harm.

Now, we are applying the lessons we’ve learned from our work fighting violent extremism content over the last year in order to tackle other problematic content. Our goal is to stay one step ahead of bad actors, making it harder for policy-violating content to surface or remain on YouTube.

Wojcicki goes on to say that human eyes are a critical part of the reviewing process.

Human reviewers remain essential to both removing content and training machine learning systems because human judgment is critical to making contextualized decisions on content.

We will continue the significant growth of our teams into next year, with the goal of bringing the total number of people across Google working to address content that might violate our policies to over 10,000 in 2018.

Wojcicki says YouTube reviewers have manually reviewed nearly 2 million videos for questionable content. There is no doubt that the job is immense, considering how many videos are uploaded to YouTube every day. Even with 10,000-plus human reviewers in 2018, I’m sure it’s not going to be perfect. Wojcicki also says that YouTube will be more transparent about how it’s dealing with issues on the platform.

We understand that people want a clearer view of how we’re tackling problematic content. Our Community Guidelines give users notice about what we do not allow on our platforms and we want to share more information about how these are enforced. That’s why in 2018 we will be creating a regular report where we will provide more aggregate data about the flags we receive and the actions we take to remove videos and comments that violate our content policies. We are looking into developing additional tools to help bring even more transparency around flagged content.

It will be interesting to see how well this goes and if they can get a better handle on the “bad actors.” What do you think of YouTube hiring more content moderators? Let us know in the comments below or on Google+, Twitter, or Facebook.

Source: YouTube Blog (https://youtube.googleblog.com/2017/12/expanding-our-work-against-abuse-of-our.html)