Facebook can’t police Live video, and neither can anyone else
Mark Zuckerberg speaking at F8 in 2015. REUTERS/Robert Galbraith

Facebook (FB) forgot one thing before it began exhorting everybody and anybody to share moments from their lives as video streams on the social network: The “everybody and anybody” demographic, statistically speaking, will include some awful people.

We got a terrible reminder of that this weekend when Steve Stephens used Facebook to share a video of himself killing Robert Godwin Sr., a retiree and great-grandparent, on Easter Sunday.

Stephens uploaded the video at 2:09 p.m. and it stayed up until Facebook took it down at 4:22 p.m., according to Facebook’s timeline of events. Two days later, Stephens shot himself after police cornered him in Erie, Pennsylvania.

This abuse of Facebook’s video-sharing feature was upsetting, but it wasn’t surprising given the recent history of Facebook being used to live-stream suicides, rapes and worse. Unfortunately, neither human moderators nor artificial intelligence can readily stop users from broadcasting these crimes, yet Facebook still seems to be in a state of denial about that.

What Facebook says it will do

Facebook recognizes it has a problem with the video streaming option it constantly pushes on its users. But it’s not clear that management recognizes the extent of the issue.

“We have a lot more to do here,” founder Mark Zuckerberg said about the Cleveland killing video at the start of his keynote Tuesday opening the company’s F8 developer conference in San Jose. “We have a lot of work, and we will keep doing all we can to prevent tragedies like this from happening.”

And what exactly is that? Facebook has traditionally relied on users to call out abusive content — meaning somebody has to see something they can’t unsee and then tell Facebook about it.
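
Neither Facebook nor the article spells out the mechanics, but a flag-driven flow like this roughly amounts to a report queue: a stream enters human review only after a viewer reports it. Here is a minimal Python sketch of that structure; the class and field names are hypothetical, not Facebook's actual internals.

```python
import heapq
import time
from dataclasses import dataclass, field

@dataclass(order=True)
class Report:
    # Only `priority` is compared, so the most-reported stream pops first.
    priority: int
    stream_id: str = field(compare=False)
    first_reported_at: float = field(compare=False)

class FlagQueue:
    """Toy model of flag-driven moderation: nothing enters human
    review until at least one viewer reports the stream."""

    def __init__(self):
        self._heap = []
        self._counts = {}

    def flag(self, stream_id: str) -> None:
        count = self._counts.get(stream_id, 0) + 1
        self._counts[stream_id] = count
        # Negate the count so higher report totals sort to the top.
        heapq.heappush(self._heap, Report(-count, stream_id, time.time()))

    def next_for_review(self):
        return heapq.heappop(self._heap) if self._heap else None
```

The structural flaw is visible in the sketch itself: detection latency is the time until the first viewer flags a stream plus the wait in the review queue, and a stream nobody flags is never reviewed at all.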

That will never work on any large social network, much less one with Facebook’s 1.15 billion mobile daily active users, according to an expert on social-network abuse.

“No live streaming service that relies on user flags to trigger the moderation process can possibly keep rapes, suicides and murders out of public view,” explained Mary Anne Franks, a professor at the University of Miami School of Law and vice president of the Cyber Civil Rights Initiative. “A suicide, rape or murder video only needs a few seconds to go viral, at which point removal by the platform has limited impact.”

We asked Facebook for comment Tuesday afternoon; a publicist pointed to a company statement pledging improvements to its reporting and review process and noting its use of software to prevent a horrific video from being shared in its entirety again.

“In addition to improving our reporting flows, we are constantly exploring ways that new technologies can help us make sure Facebook is a safe environment,” Facebook’s vice president of global operations Justin Osofsky said in the statement.
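
Facebook's statement doesn't say how that software works. One plausible reading is fingerprint matching: once moderators remove a video, its fingerprint goes on a blocklist, and later uploads are checked against it. The Python sketch below uses exact chunk hashes for brevity; production systems rely on perceptual hashes that survive re-encoding and cropping, which a cryptographic hash does not, and every function name here is illustrative.

```python
import hashlib

CHUNK_SIZE = 1 << 20  # hash the file in 1 MiB chunks

def chunk_hashes(path: str) -> set:
    """Fingerprint a video file as the set of its chunk hashes.
    Note: SHA-256 only matches byte-identical content; real systems
    use perceptual hashes robust to re-encoding."""
    hashes = set()
    with open(path, "rb") as f:
        while chunk := f.read(CHUNK_SIZE):
            hashes.add(hashlib.sha256(chunk).hexdigest())
    return hashes

# Fingerprints of videos moderators have already removed.
blocklist = set()

def register_removed_video(path: str) -> None:
    blocklist.update(chunk_hashes(path))

def is_reupload(path: str, threshold: float = 0.5) -> bool:
    """Flag an upload if enough of its chunks match removed content."""
    hashes = chunk_hashes(path)
    if not hashes:
        return False
    return len(hashes & blocklist) / len(hashes) >= threshold
```

Even under that charitable reading, matching only helps after the first copy has been seen, reported and removed, which is exactly the window Franks describes: by the time a fingerprint exists, the video may already have gone viral.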