Facebook is 'the 800-pound gorilla in the misinformation market'

Facebook (FB) recently announced that it would ban advertisements that discourage vaccination and remove content tied to the QAnon conspiracy theory.

But according to Imran Ahmed, the CEO of the Center for Countering Digital Hate, it’s too little, too late — especially considering that the platform won’t touch existing anti-vaccine posts.

“Facebook, they are the 800-pound gorilla in the misinformation market,” Ahmed said on Yahoo Finance’s The First Trade (video above). “That’s the truth. … Facebook is the company that can change the lens through which we see the world by repeated misinformation being spread to people. It can actually persuade people the world is a different way.”

Facebook Chairman and CEO Mark Zuckerberg testifies before the House Financial Services Committee in Washington, DC, on October 23, 2019. (Photo: Nicholas Kamm/AFP via Getty Images)

It’s not just about vaccine misinformation or QAnon, Ahmed added.

“We saw it with coronavirus and people believing that masks were dangerous or that masks weren’t necessary,” he said. “We see it with identity-based hate, which is what my organization typically looks at. In fact, it’s Facebook that has the biggest problem and has done the least, ironically, to deal with that problem.”

Facebook did not respond to a request for comment.

‘Concerted misinformation actors ... growing rapidly’

After the 2016 election season, during which foreign actors exploited Facebook’s weak content moderation policies, the company developed a “fact-checking” system used to label videos and posts it deems factually inaccurate.

Ahmed argued that the company’s moderation response has not been enough.

“What they’ve done essentially is they’ve looked at a very, very dirty apartment and they plumped up the cushions and not cleaned anything else,” he said.

A man opens the Facebook page on his computer to fact-check coronavirus disease (COVID-19) information, in Abuja, Nigeria, March 19, 2020. REUTERS/Afolabi Sotunde

For example, research by the Center for Countering Digital Hate found that the “individuals, of groups, of pages” spreading anti-vaccine misinformation count tens of millions of followers.

“38 million we found just on Facebook across the U.K. and the U.S., and that’s been growing rapidly, so the organic reach by concerted misinformation actors has been growing rapidly,” Ahmed said. “In fact, dealing with the ads problem — it was small in terms of numbers but it was extraordinary that there was any point at which Facebook had a business that was based on taking adverts from misinformation actors and spreading misinformation about vaccines into millions of news feeds.”

Other platforms have taken steps to curb misinformation: Twitter increasingly labels content it deems factually inaccurate, most notably posts from President Donald Trump’s 87 million-follower account.

“Twitter is, in technical terms, we would say that it’s more about the tactical adjustment of what’s on the agenda,” Ahmed said.