Child abuse image crimes in the North West rise as NSPCC calls for effective action in Online Safety Bill
Child abuse image offences recorded by UK police have increased by two thirds in five years
Last updated 28th Nov 2023
Following an investigation, the NSPCC has revealed that child abuse image offences recorded by police in the North West have risen by 1,469 (70%) in the last five years.
In 2021/22, Cheshire Constabulary, Greater Manchester Police, Lancashire Constabulary and Merseyside Police recorded a total of 3,576 offences, compared with 2,107 in 2016/17.
Across the UK, the figure has jumped by two thirds in the last half decade. According to freedom of information data collected by the children's charity, police forces have uncovered more than 30,000 crimes involving the possession and sharing of illegal images.
Offences grew during the Covid-19 pandemic and have continued to rise, reaching 30,925, suggesting that the spread of these disturbing images isn't slowing down.
According to the NSPCC, social media platforms are not doing enough to prevent these crimes. Each offender can have multiple victims, who are revictimised every time the images are shared.
Offenders are grooming children into sending images of themselves, and social media companies are failing to find an effective way of stopping them. Groomers are also using the same platforms to share the images.
New research shows that Snapchat is the platform most used by sex offenders, with criminals taking advantage of the teenagers on the app. Police say a staggering 43% of these cases occur via the platform.
As part of the Online Safety Bill, the NSPCC is asking the government to give children, including victims of sexual abuse, a voice in future regulation by creating a child safety advocate.
The aim would be to ensure that victims' experiences are at the forefront of decision making and to help build an effective approach to safeguarding children.
Mark Zuckerberg's Meta platforms, Facebook, Instagram and WhatsApp, account for a third of instances where a site was flagged.
And in further alarming news, Oculus headsets are being used to share the vile images through virtual reality in the metaverse.
The NSPCC stressed that committing to a child safety advocate is vital. The role would act as an early warning system, spotting emerging risks and ensuring that companies and Ofcom deal with them in the right way.
This advocate would reflect the experiences of children and counterbalance the power of the big tech lobby, encouraging a corporate culture that aims to prevent abuse.
Holly, who was abused at 14, called Childline. She said: “I am feeling sick with fear. I was talking with this guy online and trusted him. I sent him quite a lot of nude pictures of myself and now he is threatening to send them to my friends and family unless I send him more nudes or pay him.
“I reported it to Instagram, but they still haven’t got back. I don’t want to tell the police because my parents would then know what I did and would be so disappointed.”
Roxy Longworth was only 13 when a 17-year-old boy coerced her into sending images via Snapchat.
The offender forwarded the images to his friends, which resulted in Roxy being blackmailed and manipulated into sending more images to another older boy, who illegally shared them on social media.
Roxy said: “I sat on the floor and cried. I’d lost all control and there was no one to talk to about it. I blocked him on everything and prayed he wouldn’t show anyone the pictures because of how young I was.
“After that, I was just waiting to see what would happen. Eventually someone in my year sent me some of the pictures and that’s when I knew they were out.”
Sir Peter Wanless, Chief Executive of the NSPCC, said: “These new figures are incredibly alarming but reflect just the tip of the iceberg of what children are experiencing online.
“We hear from young people who feel powerless and let down as online sexual abuse risks becoming normalised for a generation of children.
“By creating a child safety advocate that stands up for children and families the Government can ensure the Online Safety Bill systemically prevents abuse.
“It would be inexcusable if in five years’ time we are still playing catch-up to pervasive abuse that has been allowed to proliferate on social media.”
In a bid to improve how these crimes are handled, the NSPCC is seeking amendments to the Online Safety Bill as it passes through the House of Lords.
The charity's proposed child safety advocate would mirror the statutory user advocacy arrangements already in place in other regulated sectors.
This would give Ofcom access to children's voices and experiences through the advocate, much as Citizens Advice acts for energy and postal consumers.
Under the proposals, the Government would hold senior managers liable if their products contribute to serious harm to children.
Bosses responsible for child safety would be held criminally liable if their platforms expose young people to preventable abuse.
In response to the figures relating to Meta, the NSPCC has called on the company to pull the plug on rolling out end-to-end encryption on Facebook and Instagram.
The charity says Meta is turning a blind eye to child abuse by making it easy for predators to continue their horrendous activity without detection, which makes external bodies such as a child safety advocate all the more important.
It stresses that the Online Safety Bill is a huge opportunity to motivate companies to invest in technological solutions that prevent sexual abuse and keep children safe.