Peterborough Charity Raises Concerns over Deepfake Abuse

Women’s Aid warns children and teenagers are especially at risk from fake sexual images and online blackmail

Author: Aaliyah Dublin | Published 9th Jan 2026

A Peterborough charity says new technology is making it easier for people to create fake sexual images, and warns it is a growing problem.

It comes after Ofcom raised concerns with Elon Musk’s social media platform X over claims its AI chatbot Grok was used to make sexualised pictures of people, including children.

Mandy Gerarty from Peterborough Women’s Aid says current protections are not enough, and the impact on victims can be devastating.

“What we would like to happen is stronger, deepfake-specific laws, faster takedown requirements, and really clear criminal penalties for this type of abuse.”

“We know that children and teenagers are especially vulnerable to this kind of manipulation, and we see fake images being used in blackmail.”

“For example, ‘pay me or I’ll share this’. They can often feel violated, humiliated, unsafe, and it can cause long-term damage, anxiety, depression, because people can believe that these images are real, and there’s a real fear of those going online.”

She says fake images can also reinforce harmful attitudes towards women and girls, and make it harder for victims to be believed.

“These fake images can normalise sexual violence. And in turn, this can make it harder for victims to be believed.”

Ofcom says it has contacted X to ask what steps are being taken to protect users, but has not started a formal investigation.

Under new laws, social media firms must remove child sexual abuse material as soon as they know about it.

X says ‘improvements are ongoing’.
