Meta’s ‘bonfire’ of safety policies a danger to children, charity says


Meta’s recent “bonfire of safety measures” risks taking Facebook and Instagram back to where they were when Molly Russell died, the charity set up in her name has warned.

The Molly Rose Foundation said new online safety regulator Ofcom must strengthen incoming regulation in order to ensure teenagers are protected from harmful content online.

The charity was set up by Molly’s family after her death in 2017. Molly was 14 when she ended her life after viewing harmful content on social media sites, including Meta-owned Instagram.

Molly Russell (Family handout/PA)

Earlier this month, Meta boss Mark Zuckerberg announced sweeping changes to the company’s policies in the name of “free expression”, including plans to scale back content moderation under which the firm will stop automatically scanning for some types of posts and instead rely on user reports to remove certain sorts of content.

Campaigners called the move “chilling” and said they were “dismayed” by the decision, which has been attributed to Mr Zuckerberg’s desire to forge a positive relationship with new US President Donald Trump.

Andy Burrows, chief executive of the Molly Rose Foundation, said: “Meta’s bonfire of safety measures is hugely concerning and Mark Zuckerberg’s increasingly cavalier choices are taking us back to what social media looked like at the time that Molly died.

“Ofcom must send a clear signal it is willing to act in the interests of children and urgently strengthen its requirements on tech platforms.

“If Ofcom fails to keep pace with the irresponsible actions of tech companies the Prime Minister must intervene.

“Amid a strategic rollback of their safety commitments, preventable harm is being driven by Silicon Valley but the decision to stop it in its tracks now sits with the regulator and Government.”

In a letter to Ofcom, the foundation urged the regulator to strengthen the Online Safety Act by bolstering requirements around content moderation, including requiring firms to proactively scan for all types of intense depression, suicide and self-harm content.

It also urges the regulator to ensure that Meta’s newly loosened policies on hate speech are not allowed to apply to children, and to seek clarification on whether Meta can change its rules without going through its usual internal processes, after reports suggested Mr Zuckerberg made the policy changes himself, leaving internal teams “blindsided”. The foundation said Ofcom should ensure this cannot happen again.

In a statement, a Meta spokesperson said: “There is no change to how we define and treat content that encourages suicide, self-injury and eating disorders.