Controversial measures in Online Safety Bill scrapped

The measures would have forced social media sites to take down "legal but harmful" content

Author: Martyn Landi, PA Technology Correspondent / Abi Simpson
Published 29th Nov 2022
Last updated 29th Nov 2022

Measures that would have forced social media sites to take down "legal but harmful" material are being taken out of the Online Safety Bill.

The Government argues firms would have "over-removed" content, which could damage free speech.

Under the original Bill’s plans, the biggest platforms would have been compelled to not only remove illegal content, but also any material which had been named in the legislation as legal but potentially harmful.

These measures drew criticism from free speech campaigners, who claimed that governments or tech platforms could use the Bill to censor certain content.

Now the key requirements of the Bill are being redefined.

Platforms will be required to remove illegal content, as well as take down any material that is in breach of their own terms of service.

And instead of the legal but harmful duties, there will now be a greater requirement for firms to provide adults with tools to hide certain content they do not wish to see – including types of content that do not meet the criminal threshold but could be harmful to see, such as the glorification of eating disorders, misogyny and some other forms of abuse.

The Government is calling this approach a “triple shield” of online protection which also allows for freedom of speech.

Under the Bill, social media companies could also be fined by Ofcom up to 10% of annual turnover if they fail to uphold their policies on tackling racist or homophobic content on their platforms.

Updates to strengthen accountability and transparency will also be introduced to boost child online safety, it was confirmed. Tech firms will be required to publish summaries of risk assessments of potential harm to children on their sites, show how they enforce user age limits, and publish details of enforcement action taken against them by Ofcom, the new regulator for the tech sector.

The updated rules will also prohibit a platform from removing a user or account unless they have clearly broken the site’s terms of service or the law.

“...a hugely backward step”

Julie Bentley, chief executive of Samaritans, described dropping the requirement to remove “legal but harmful” content as “a hugely backward step”.

“Of course children should have the strongest protection but the damaging impact that this type of content has doesn’t end on your 18th birthday,” she said.

“Increasing the controls that people have is no replacement for holding sites to account through the law and this feels very much like the Government snatching defeat from the jaws of victory.”

Shadow culture secretary Lucy Powell said it was a “major weakening” of the Bill, adding: “Replacing the prevention of harm with an emphasis on free speech undermines the very purpose of this Bill, and will embolden abusers, Covid deniers, hoaxers, who will feel encouraged to thrive online.”

The Online Safety Bill is due to return to Parliament next week after being repeatedly delayed.

“Unregulated social media has damaged our children for too long and it must end,” Culture Secretary Michelle Donelan said.

“I will bring a strengthened Online Safety Bill back to Parliament which will allow parents to see and act on the dangers sites pose to young people.

“It is also freed from any threat that tech firms or future governments could use the laws as a licence to censor legitimate views.

“Young people will be safeguarded, criminality stamped out and adults given control over what they see and engage with online.

“We now have a binary choice: to get these measures into law and improve things or squabble in the status quo and leave more young lives at risk.”

The latest changes come in the wake of other updates to the Bill, including criminalising the encouragement of self-harm, as well as “downblousing” and the sharing of pornographic deepfakes.

The Government also confirmed further amendments will be tabled shortly aimed at boosting protections for women and girls online.

In addition, the Victims’ Commissioner, Domestic Abuse Commissioner and Children’s Commissioner will be added as statutory consultees to the Bill, meaning that Ofcom must consult them when drafting the new codes of conduct that tech firms must follow in order to comply with the Bill.

Children’s Commissioner for England, Dame Rachel de Souza, said this would ensure “children’s views and experiences are fully understood”.

“We cannot allow any more children to suffer. The loss of children by suicide, after exposure to hideous self-harm and suicide content, are tragic reminders of the powerful consequences of harmful online material,” she said.

“I am determined to see this Bill pass through Parliament and pleased to see an enhanced focus on protecting children.

“I will work to ensure that children’s voices and needs underpin each stage of the legislative process. I look forward to us all getting behind such a crucial moment to protect children online.”
