Streaming violence on Facebook Live and other platforms is increasing, and Facebook recognizes that. It’s one thing for users to watch Hollywood violence and quite another to witness real violence streamed live. We’re not linking to any of the recent cases, as they are deeply disturbing, and most people are familiar with them anyway. Platforms like YouTube Live, Facebook Live, and Snapchat make it easy to publish content instantly. Now these companies have to figure out how to police live streams so they don’t host the kind of disturbing content that has been getting through.
Facebook Inc (FB.O) will hire 3,000 more people over the next year to respond to reports of inappropriate material on the social media network and to speed up the removal of videos showing murder, suicide, and other violent acts, Chief Executive Mark Zuckerberg said on Wednesday.
The hiring spree is an acknowledgment by Facebook that, at least for now, it needs more than automated software to improve monitoring of posts. Facebook Live, a service that allows any user to broadcast live, has been marred since its launch last year by instances of people streaming violence.
Zuckerberg, the company’s co-founder, said in a Facebook post that the workers will be in addition to the 4,500 people who already review posts that may violate its terms of service.
While this step alone is unlikely to stop streaming violence, it should help catch such content quickly enough to limit its reach.
Zuckerberg said: “We’re working to make these videos easier to report so we can take the right action sooner – whether that’s responding quickly when someone needs help or taking a post down.”
Last Updated on May 3, 2017.