
Meta’s ‘bonfire’ of safety policies a danger to children, charity says

The Molly Rose Foundation has warned social media risks becoming a haven for harmful content if safety regulation is not bolstered.

By Martyn Landi, PA Technology Correspondent
Mark Zuckerberg announced sweeping changes to Meta’s policies (Alamy/PA)

Meta’s recent “bonfire of safety measures” risks taking Facebook and Instagram back to where they were when Molly Russell died, the charity set up in her name has warned.

The Molly Rose Foundation said the new online safety regulator, Ofcom, must strengthen incoming regulation to ensure teenagers are protected from harmful content online.

The charity was set up by Molly’s family after she took her own life in 2017, aged 14, having viewed harmful content on social media sites, including Meta-owned Instagram.

Molly Russell (Family handout/PA)

Earlier this month, Meta boss Mark Zuckerberg announced sweeping changes to the company’s policies in the name of “free expression”, including plans to scale back content moderation that will see the firm end automated scanning for some types of posts and instead rely on user reports to remove certain sorts of content.

Campaigners called the move “chilling” and said they were “dismayed” by the decision, which has been attributed to Mr Zuckerberg’s desire to forge a positive relationship with new US President Donald Trump.

Andy Burrows, chief executive of the Molly Rose Foundation, said: “Meta’s bonfire of safety measures is hugely concerning and Mark Zuckerberg’s increasingly cavalier choices are taking us back to what social media looked like at the time that Molly died.

“Ofcom must send a clear signal it is willing to act in the interests of children and urgently strengthen its requirements on tech platforms.

“If Ofcom fails to keep pace with the irresponsible actions of tech companies the Prime Minister must intervene.

“Amid a strategic rollback of their safety commitments, preventable harm is being driven by Silicon Valley but the decision to stop it in its tracks now sits with the regulator and Government.”

In a letter to Ofcom, the foundation has urged the regulator to strengthen the Online Safety Act by bolstering requirements around content moderation, including requiring firms to proactively scan for all types of intense depression, suicide and self-harm content.

It also urges the regulator to ensure that Meta’s newly loosened policies around hate speech are not allowed to apply to children, and to seek clarification on whether Meta can change its rules without going through its usual internal processes, after reports suggested Mr Zuckerberg made the policy changes himself, leaving internal teams “blindsided”, something the foundation said Ofcom should ensure cannot happen again.

In a statement, a Meta spokesperson said: “There is no change to how we define and treat content that encourages suicide, self-injury and eating disorders.

“We don’t allow it and we’ll continue to use our automated systems to proactively identify and remove it.

“We continue to have Community Standards, around 40,000 people working on safety and security to help enforce them, and Teen Accounts in the UK, which automatically limit who can contact teens and the types of content they see.”

Earlier this month, Molly’s father, Ian Russell, the chairman of the Molly Rose Foundation, told the Prime Minister that the UK was “going backwards” on online safety.

Mr Russell said in a letter to Sir Keir Starmer that Ofcom’s approach to implementing the Online Safety Act had “fundamentally failed to grasp the urgency and scale of its mission”, and that changes were needed to bolster the legislation.

The Molly Rose Foundation has also previously warned that Meta’s approach to tackling suicide and self-harm content is not fit for purpose, after research found the social media giant was responsible for just 2% of industry-wide takedowns of such content.

An Ofcom spokesperson said: “All platforms operating in the UK – including Meta – must comply with the UK’s online safety laws, once in force.

“Under the Online Safety Act, tech firms must assess the risks they pose, including to children, and take significant steps to protect them.

“That involves acting swiftly to take down illegal content – including illegal suicide and self-harm material – and ensuring harmful posts and videos are filtered out from children’s feeds.

“We’ll soon put forward additional measures for consultation on the use of automated content moderation systems to proactively detect this kind of egregious content.

“We are in contact with social media companies, including Meta, about the safety measures they have in place now, and what more they will have to do to comply once the duties are fully in force.

“No one should be in any doubt about Ofcom’s resolve to hold tech firms to account, using the full force of our enforcement powers where necessary.”
