Social media companies will have to detect child sexual abuse content under new bill

The Government is to amend the new Online Safety Bill to protect children online

Author: Sonia Nyathi
Published: 6th Jul 2022
Last updated: 6th Jul 2022

Social media companies could be made to develop new technologies to find and prevent the spread of child sexual abuse content under new proposals by the Government.

The Government's proposed amendments to the Online Safety Bill would give greater powers to Ofcom to demand that tech companies show they are making reasonable efforts to find and tackle harmful content on their platforms.

That would include developing and introducing new technology which can help find harmful content as well as stop it spreading.

The Government says the new technology could be deployed even on encrypted platforms while still protecting user privacy.

Ofcom can impose big fines

Ofcom would have the power under the Bill to impose fines of up to £18 million or 10% of a company's global annual turnover - whichever is higher.

Officials said the amendment is not an attempt to stop the rollout of further end-to-end encryption services - technology the Government said it broadly supports, provided it is implemented with assurances that children and others are protected from harmful material.

Facebook, WhatsApp and Instagram plan to make changes in 2023

Meta, which owns Facebook, WhatsApp and Instagram, has previously announced plans to roll out end-to-end encryption across all its messaging platforms at some point in 2023.

This would allow secure communication that prevents third parties from accessing data while it's transferred from one end system or device to another.

The amendment is the latest put forward for the landmark internet safety laws and will be considered later this month as part of the report stage of the Bill's passage through Parliament.

We must not allow criminals to "run rampant online"

Home Secretary Priti Patel said: "Child sexual abuse is a sickening crime. We must all work to ensure criminals are not allowed to run rampant online and technology companies must play their part and take responsibility for keeping our children safe.

"Privacy and security are not mutually exclusive - we need both, and we can have both and that is what this amendment delivers."

The Government said its self-funded Safety Tech Challenge Fund - which has awarded five firms at least £85,000 to further develop prototype products capable of detecting child abuse material within encrypted settings - showed it was possible to find solutions.

Culture Secretary Nadine Dorries said tech firms "have a responsibility not to provide safe spaces for horrendous images of child abuse to be shared online".

"Nor should they blind themselves to these awful crimes happening on their sites," she said.

The amendment has been backed by child safety campaigners, with NSPCC chief executive Sir Peter Wanless saying it "will strengthen protections around private messaging and ensure companies have a responsibility to build products with child safety in mind".

"This positive step shows there doesn't have to be a trade-off between privacy and detecting and disrupting child abuse material and grooming," he said.
