Facebook CEO Mark Zuckerberg in his trademark gray tee. (image: Getty)
It shouldn’t be news that fake news is a problem on Facebook and other social networks: We’ve had decades of practice with hoaxes, urban legends and other fraudulent “facts” floating around the Internet.
But this election season saw fake news run amok. A BuzzFeed study found that over the last three months of the campaign, the 20 most-shared fake election stories got a slightly larger audience than the top 20 election stories from legitimate sites.
Commence the blamestorming
Now, some people are wondering if Facebook’s role as a vector for fake news played a part in Donald J. Trump’s shocking Election Night win.
Gizmodo’s recent report that Facebook (FB) quashed a News Feed update to call out hoaxes because it would have nailed too many conservative sites — something Facebook denies — amped up that angst. So did the Washington Post interview with fake-news entrepreneur Paul Horner featuring this quote: “I think Trump is in the White House because of me.”
Facebook has responded with a stages-of-denial sequence, first dismissing the problem as too small to have made much of a difference, then taking steps to address the issue.
Last Thursday, founder Mark Zuckerberg called the fake-news issue “small” and inconsequential, then expanded on his thoughts in a Facebook post.
“Of all the content on Facebook, more than 99% of what people see is authentic,” Zuckerberg said. “The hoaxes that do exist are not limited to one partisan view, or even to politics.”
He added that “we don’t want any hoaxes on Facebook” and said the company is helping users to flag fake content.
Friday night, Zuckerberg posted another note, saying that Facebook was working on automatic classification and third-party verification of stories and was “exploring” adding warning labels for fake news.
But fake news can still hurt Facebook. When I see obvious nonsense overrunning the site — like when a click on a link about Trump’s possible tech policy led Facebook to suggest a story from a fake-news factory called “Ending the Fed” — I have to wonder how stupid it thinks I am.
If Facebook is serious about making itself less of an accelerant for lies, it’s got no shortage of advice about what to do next.
Thursday, a group of 20 fact-checking organizations posted an open letter to Zuckerberg urging that the social network “strengthen users’ ability to identify fake posts and false news by themselves.”
Meanwhile, Facebook’s existing tools for flagging fake stories are too obscure: You must click or tap the arrow in the top right corner of a News Feed post, select “Report post,” select “I think it shouldn’t be on Facebook” and then choose “It’s a false news story.”
Then your vote apparently vanishes. If friends see the same link, they won’t know you called it out.
(This function doesn’t work on the related-stories links that Facebook displays after you follow a shared link. Zuckerberg’s Friday-night note said the site was “raising the bar” for those links.)
Facebook and other social-media sites also need to take a proactive approach to fake-news publications that exploit their rules and norms.
But when it comes to the sort of social engineering that can make a network unfriendly or unlivable, the managers of our social networks remain shocked, shocked, to find that trolling is going on in here.
It’s past time for them to start doing the same sort of aggressive searching for vulnerabilities that their “infosec” experts already undertake.
“I’ve long thought that this is something they should be doing,” said Sarah Jeong, author of “The Internet of Garbage,” a book chronicling how social networks treat abuse as an outsourceable cleanup problem. She has yet to see any such efforts.
Facebook says it’s factored in the possibility of abuse when launching features like Facebook Live. But in too many other cases, it’s shown itself to be as reactive as every other social network — and the way it’s gotten played by fake-news sites and now must pledge that it’s “committed to getting this right,” as Zuckerberg wrote Friday, fits into that pattern.
We now have yet another example of how optimizing for users of goodwill doesn’t work. The trolls and the con artists and the fraudsters aren’t going to stop coming, any more than the hackers will let up. We need to stop being surprised by their arrival.