Spike in cases of online grooming across the South West
The NSPCC has revealed online grooming cases have risen by 70% across the South West
Last updated 1st Nov 2024
Across the South West, new figures from the NSPCC reveal that cases of online child grooming have risen by more than 70% in seven years.
Online safety experts are now calling for tighter regulation of social media companies, with apps like Snapchat identified as the platforms where many of these incidents take place.
In fact, 48% of all recorded incidents of the sexual exploitation of young people are said to have taken place on Snapchat - something the social media firm says it has a zero-tolerance approach towards.
The NSPCC is now urging Ofcom to strengthen its regulation of social media platforms in order to tackle child sexual abuse, as more than 1,800 grooming cases reported in 2023/24 involved a tech platform.
Meta-owned platforms were also found to be popular with offenders: WhatsApp was named in 12% of those cases, Facebook and Messenger in 12%, and Instagram in 6%.
Here in the South West, the data revealed:
- Avon and Somerset Police recorded 283 incidents of online grooming in 2023/24 - compared to 229 in 2017/18
- Devon and Cornwall Police recorded 108 incidents of online grooming in 2023/24 - compared to 58 in 2017/18
- Dorset Police recorded 74 incidents of online grooming in 2023/24 - compared to 37 in 2017/18
- Gloucestershire Police recorded 99 incidents of online grooming in 2023/24 - compared to 25 in 2017/18
- Wiltshire Police recorded 76 incidents of online grooming in 2023/24 - compared to 28 in 2017/18
The NSPCC says Ofcom puts more focus on acting after a crime has taken place than on preventative measures, such as ensuring the design features of social media apps do not contribute to abuse.
The charity is now urging the online regulator Ofcom to strengthen the Online Safety Act.
'It shouldn't be children's responsibility to keep themselves safe'
NSPCC senior policy research officer Toni Brunton-Douglas said: “If social media platforms took steps to make sure children were safe on their platforms the situation wouldn’t need to be like this.
“We don’t think it should be children’s responsibility to keep themselves safe, we think these platforms should be safe by design so that children don’t have to go through all this."
Sir Peter Wanless, NSPCC chief executive, said: "One year since the Online Safety Act became law and we are still waiting for tech companies to make their platforms safe for children.
"We need ambitious regulation by Ofcom who must significantly strengthen their current approach to make companies address how their products are being exploited by offenders.
"It is clear that much of this abuse is taking place in private messaging, which is why we also need the Government to strengthen the Online Safety Act to give Ofcom more legal certainty to tackle child sexual abuse on the likes of Snapchat and WhatsApp."
'Social media companies have a responsibility'
Minister for safeguarding and violence against women and girls, Jess Phillips, said: "Child sexual abuse is a vile crime that inflicts long-lasting trauma on victims and the law is clear - the creation, possession and distribution of child sexual abuse images, and grooming a child is illegal.
"I met with law enforcement leads and the NCA (National Crime Agency) only last week to hear about the tremendous work they do to bring these offenders to justice.
"Social media companies have a responsibility to stop this vile abuse from happening on their platforms.
"Under the Online Safety Act they will have to stop this kind of illegal content being shared on their sites, including on private and encrypted messaging services, or face significant fines.
"The shocking case involving Alexander McCartney, who alone groomed over 3,500 children, demonstrates more clearly than ever that they should act now and not wait for enforcement by the regulator."
'Any sexual exploitation of young people is horrific and illegal'
A Snapchat spokesperson said: "Any sexual exploitation of young people is horrific and illegal and we have zero tolerance for it on Snapchat.
"If we identify such activity, or it is reported to us, we remove the content, disable the account, take steps to prevent the offender from creating additional accounts, and report them to the authorities.
"We have extra protections including in-app warnings to make it difficult for teens to be contacted by strangers, and our in-app Family Centre lets parents see who their teens are talking to, and who their friends are."
An Ofcom spokesperson said: "From December, tech firms will be legally required to start taking action under the Online Safety Act, and they'll have to do far more to protect children.
"Our draft codes of practice include robust measures that will help prevent grooming by making it harder for perpetrators to contact children.
"We're prepared to use the full extent of our enforcement powers against any companies that come up short when the time comes."
The NSPCC offers support to children who have been affected across the UK - and you can find plenty of advice and support on their website.