Tech firms must take action on illegal activity as new rules come into force
New Ofcom rules kick in today that could see fines handed out worth tens of millions of pounds
Tech firms that don't protect people from illegal activity online could face tougher punishments under new rules that kick in today.
Ofcom says if firms don't show they're taking action on things like grooming, fraud or terrorism content, they'll face multi-million pound fines - and could even be shut down.
These new rules are part of a roll-out of changes under the government's Online Safety Act.
The Online Safety Act
Dame Melanie Dawes, Ofcom’s Chief Executive, said: "For too long, sites and apps have been unregulated, unaccountable and unwilling to prioritise people’s safety over profits. That changes from today.
"The safety spotlight is now firmly on tech firms and it’s time for them to act. We’ll be watching the industry closely to ensure firms match up to the strict safety standards set for them under our first codes and guidance, with further requirements to follow swiftly in the first half of next year.
"Those that come up short can expect Ofcom to use the full extent of our enforcement powers against them."
Tech firms must respond to the Online Safety Act
Under the rule changes, firms, including social media companies, must name a senior person from their organisation who is responsible for compliance.
They also need to make sure moderation teams are "appropriately resourced" and are set "robust performance targets."
The changes also aim to tackle online child grooming by preventing strangers from accessing a child's location or friends lists, or from sending them direct messages. Children's accounts should also no longer appear in lists of suggested people to add to networks.
Ofcom says robust action will be taken if firms don't act on the new rules, stating: "While we will offer support to providers to help them to comply with these new duties, we are gearing up to take early enforcement action against any platforms that ultimately fall short.
"We have the power to fine companies up to £18m or 10% of their qualifying worldwide revenue, whichever is greater, and in very serious cases we can apply for a court order to block a site in the UK."
Rules around tackling harmful content online are due to be rolled out later this spring.
Campaigning families say more needs to be done
Ellen Roome's 14-year-old son Jools died in 2022.
She believes something that he saw online led to his death, and she has been calling for an overhaul of the rules around tech firms.
While she broadly supports Ofcom taking action, she believes the roll-out could be significantly quicker.
She said: "It's not doing enough, it really needs to be stronger in terms of tightening things up otherwise how many more children are going to suffer seeing harmful content, and in a worst case scenario, like mine, a child dies.
"How many more children are going to be affected before we say 'enough'?
"If this was an unsafe car we wouldn't allow it on the road, yet we're still allowing children to access social media sites that are harmful.
"I'm sure if this happened to someone in Parliament's child things might happen a lot quicker, funnily enough, it would get switched off. I don't want this to happen to anyone else.
"Wake up world, we need to do more about this."