Facebook adding 3,000 people to filter out violent content


Facebook said Wednesday it would add 3,000 people to screen out violent content as the social media giant faces scrutiny for a series of killings and suicides broadcast on its platform.
"If we're going to build a safe community, we need to respond quickly," chief executive Mark Zuckerberg said on his Facebook page.

"We're working to make these videos easier to report so we can take the right action sooner—whether that's responding quickly when someone needs help or taking a post down."

The 3,000 new hires, to be added over the coming year, will expand Facebook's community operations team, which currently numbers 4,500, by two-thirds.
