Facebook’s 3,000 new human content monitors won’t prevent violent content

Ahead of Facebook’s (FB) first-quarter earnings report on Wednesday, CEO Mark Zuckerberg announced that the company will add 3,000 people to its community operations team, which reviews user requests to take down content.

Since launching roughly a year ago, Facebook Live, once seen as a platform for Chewbacca Mom and BuzzFeed’s exploding watermelon, has become an outlet for disturbing acts of violence. Users have streamed at least 50 acts of violence, including murder and suicide.

“These reviewers will also help us get better at removing things we don’t allow on Facebook like hate speech and child exploitation. And we’ll keep working with local community groups and law enforcement who are in the best position to help someone if they need it — either because they’re about to harm themselves, or because they’re in danger from someone else,” Zuckerberg wrote on his Facebook page.

But during last quarter’s conference call, Zuckerberg acknowledged that humans alone cannot solve the problem. He promised to “build AI systems that can watch a video and understand that it’s going to be problematic and violates the policies of our community and that people aren’t going to want to see it and then just not show it to people.”

Currently, any Facebook user can report content for one of three reasons: “it’s annoying or not interesting,” “I think it shouldn’t be on Facebook,” or “it’s spam.”

ARTIFICIAL INTELLIGENCE IN THE WORKS?

Zuckerberg appears to be shifting the burden of monitoring Facebook Live content, at least partially, from users to his own employees. With more people reviewing flagged content, he anticipates, Facebook can remove negative and harmful posts as quickly and effectively as possible.

He did not give an update in his Facebook post on potential technology to screen for violent content, but investors will be watching to see whether it is addressed during the company’s Q1 earnings call, set for after the bell on Wednesday.

Facebook COO Sheryl Sandberg commented on Zuckerberg’s post: “Keeping people safe is our top priority. We won’t stop until we get it right.”

The community operations team currently has 4,500 people, so the new hires would bring the department’s total to 7,500. During last quarter’s conference call, CFO David Wehner said 2017 would be a year of “aggressive investment” in headcount and R&D.

Melody Hahm is a writer at Yahoo Finance, covering entrepreneurship, technology and real estate. Follow her on Twitter @melodyhahm.
