Critics say Ofcom is too weak on illegal social media content as new rules start
From Monday, Ofcom will start enforcing the Online Safety Act’s illegal content codes.

Technology firms must tackle illegal content on their platforms under new rules, but there are concerns that the changes are too weak.
From Monday, Ofcom will start enforcing the Online Safety Act’s illegal content codes, requiring social media companies to find and remove content such as child sexual abuse material.
The Government says the move represents a “major step forward”, but critics say the media regulator’s approach is “timid”, arguing it fails to protect children from harmful content and placates online firms.
Ian Russell, whose daughter Molly killed herself aged 14 in November 2017 after viewing harmful content on social media, said Monday “should have been a watershed moment” but that children and families have been “let down by Ofcom’s timidity and lack of ambition”.

The chairman of the Molly Rose Foundation said Ofcom “appears to have lost sight of the fundamental purpose of regulation”, which is preventing harm.
Mr Russell said “it is increasingly apparent” that the regulator’s timid approach has been “dominated by their fear of legal challenge and their eagerness to placate the tech companies”.
He added: “Worried parents across the country are dismayed by yet more half measures and are calling for the Prime Minister to commit to urgent and decisive action.
“Adding further government dither and delay before improving digital protection is unacceptable, the life-threatening gaps in the Online Safety Act need to be fixed now.”
However, Ofcom said it had “moved quickly and set out robust safety measures” and “won’t hesitate to take action against platforms that fall short”.
Technology Secretary Peter Kyle said the changes “represent a major step forward in creating a safer online world”.

Mr Kyle said that “for too long” child abuse material, terrorist content and intimate image abuse have been “easy to find online”, but that social media platforms now have a legal duty to prevent and remove such material.
He added: “In recent years, tech companies have treated safety as an afterthought. That changes today.
“This is just the beginning. I’ve made it clear that where new threats emerge, we will act decisively.
“The Online Safety Act is not the end of the conversation, it’s the foundation.”
The illegal content codes relate to material such as child sexual exploitation and abuse, terrorism, hate crimes, content encouraging or assisting suicide, and fraud.
New duties require social media firms to detect and remove such content, using tools such as automated hash-matching alongside robust moderation and reporting mechanisms.
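In outline, hash-matching compares a digital fingerprint of each upload against a database of fingerprints of material already confirmed as illegal. The short Python sketch below is an illustration only, with a hypothetical blocklist value; real deployments use perceptual hashes such as Microsoft’s PhotoDNA or Meta’s PDQ, which survive resizing and re-encoding, rather than the exact cryptographic hash shown here.

    import hashlib

    # Hypothetical blocklist of SHA-256 digests of known illegal files
    # (placeholder value), standing in for shared industry hash databases.
    KNOWN_ILLEGAL_HASHES = {
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    }

    def fingerprint(file_bytes: bytes) -> str:
        """Return the hex SHA-256 digest of an uploaded file."""
        return hashlib.sha256(file_bytes).hexdigest()

    def should_block(file_bytes: bytes) -> bool:
        """Flag an upload whose digest matches a known-illegal hash."""
        return fingerprint(file_bytes) in KNOWN_ILLEGAL_HASHES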
Ofcom has the power to fine non-compliant firms up to £18 million or 10% of their qualifying global turnover under the Online Safety Act – whichever is greater – and in very serious cases can apply for sites to be blocked in the UK.
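As a worked illustration of the “whichever is greater” rule (figures as reported above; the function is a hypothetical sketch, not Ofcom’s method):

    def max_fine_gbp(qualifying_turnover_gbp: float) -> float:
        """Ceiling on an Online Safety Act fine: the greater of
        £18 million or 10% of qualifying worldwide turnover."""
        return max(18_000_000, 0.10 * qualifying_turnover_gbp)

    # Above £180 million in turnover the percentage cap applies: a firm
    # with £1 billion qualifying turnover faces up to £100 million, while
    # one with £50 million turnover still faces the £18 million ceiling.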
An Ofcom spokesperson said: “Platforms must now act quickly to come into compliance with their legal duties, and our codes are designed to help them do that.
“But, make no mistake, any provider who fails to introduce the necessary protections can expect to face the full force of our enforcement action.”
On the changes, Andy Burrows, chief executive of the Molly Rose Foundation, said “there is not one single measure to target suicide and self-harm offences”.
Ofcom said it has set out “several specific measures that services can take to protect adults and children from suicide and self-harm content”.
Mr Burrows went on: “As global law enforcement agencies queue up to warn of deeply disturbing new threats, including children being groomed for acts of suicide and self-harm, UK children remain at palpable yet wholly preventable risk.
“The Government needs to urgently intervene with decisive action, not piecemeal proposals that would mean many more years of sticking plasters.”
Chris Sherwood, chief executive of the NSPCC, said that while the children’s charity is “hopeful” the changes will help keep young people safe, it is “concerned that Ofcom’s final codes of practice are not yet strong enough”.
He said there was an “unacceptable loophole” whereby illegal content need only be removed where doing so is technically feasible, which he said “lets tech platforms off the hook”.
Ofcom said it expects the “vast majority” of platforms to be able to take content down, and that it will investigate any claim that removal is not technically feasible.
Mr Sherwood added: “Today marks an important step forward.
“But the government and Ofcom must work together and significantly strengthen the codes of practice to ensure this legislation results in meaningful change for children.”
Pepe Di’Iasio, general secretary of the Association of School and College Leaders (ASCL), welcomed progress on the Online Safety Act but also questioned its effectiveness.
He highlighted the ongoing risks social media poses to young people, such as bullying, harmful content and poorly enforced age requirements, and called for “decisive steps” to be taken.
Rocio Concha, Which? director of policy and advocacy, said that while it is “positive” that platforms must do more to stop user-generated fraud, the new rules do not extend to other prevalent scams, such as many paid-for fraudulent ads.
She added: “Under the current timetable for the Online Safety Act, firms in scope of the fraudulent advertising duties in the Act will not be held accountable for breach of those duties until 2027.
“This is simply not good enough and leaves consumers unnecessarily exposed to countless scam ads.
“The Online Safety Act must be implemented in full as soon as possible or the government risks letting millions more fall victim to ruthless online fraudsters.”
Last month, Sir Keir Starmer defended the Online Safety Act after US Vice President JD Vance suggested that the UK had seen “infringements on free speech” which affected American technology firms.
Asked if the Act was trying to censor speech, Sir Keir told Fox News: “No, we don’t believe in censoring speech, but of course we do need to deal with terrorism. We need to deal with paedophiles and issues like that.”