"96% of deep fakes are pornographic" says sexual abuse charity
Survivors call for new law to stop image-based abuse
A woman working for sexual abuse specialist support services in Essex has said image-based sexual abuse is "widespread" and a matter of "serious concern".
We're one week on from the launch of a campaign to stop image-based abuse.
The End Violence Against Women Coalition and a survivor of deepfake sexual abuse have created a campaign calling for a dedicated Image-Based Abuse law to protect women’s rights online and offline.
Kat Hymas works for ICENA, a training and consultancy service specialising in "sexual violence, sexual harassment and sexual misconduct".
She told us image-based sexual abuse is a "widespread, serious concern", and that "we know the wide-ranging impacts it can have on survivors' wellbeing, their confidence, self-blame etc".
It can come in six forms:
- Voyeurism - "secret-filming" which may include "up-skirting" or "down-blousing", popularly referred to as "spy-camming"
- The non-consensual distribution of sexually-explicit material
- Recording sexual violence
- Sexual Extortion or "sex-tortion" - the use of sexual images or videos to blackmail an individual
- Deepfakes - the alteration of an image or recording of someone to make them appear sexual
- Cyber-flashing - receiving sexually-explicit images without consent.
Ms Hymas told us the most common forms amongst children are 'cyber-flashing' and 'voyeurism'.
"We know that 41% of women aged 18-36 have received a sexually-explicit image of someone else without their consent," says Ms Hymas.
She added the campaign mainly focuses on "deepfakes".
"96% of deepfakes - or AI generated fake images - are pornographic", she added.
The End Violence Against Women Coalition (EVAW), GLAMOUR UK, #NotYourPorn, Professor Clare McGlynn, and survivors are calling for a comprehensive law that tackles image-based abuse.
On 13th September 2024, the government announced a “crackdown on intimate image abuse”.
They stated that new changes to the Online Safety Act will make image-based abuse a priority offence and force tech firms to clamp down on it.
However, the End Violence Against Women Coalition have said there are "loopholes" in this legislation.
"Image-based abuse is already a priority offence under the law. The changes being announced are of an administrative nature and do not contain any substantive measures that would strengthen the law or provide any meaningful impact on survivors.
Elena Michael, Director of #NotYourPorn, said: "The five 'asks' in this campaign are shaped from the needs and experiences of survivors."
These are the "five asks" of their campaign:
- Strengthen criminal laws about creating, taking and sharing intimate images without consent (including sexually explicit deepfakes)
- Improve civil laws for survivors to take action against perpetrators and tech companies (including taking down abusive content)
- Prevent image-based abuse through comprehensive relationships, sex and health education
- Fund specialist services that provide support to victims and survivors of image-based abuse
- Create an online abuse commission to hold tech companies accountable for image-based abuse
Mrs Michael added: "This is what they need as an absolute minimum."
"We will keep calling for it until the Government listens. The Online Safety Act doesn't go far enough, although it is a starting piece in the jigsaw puzzle. Survivors can't be expected to do all the work to protect themselves, even though this is essentially what they are having to do because of the gaps in the law."
Ms Hymas told us, "The impacts of image-based abuse exactly reflect those of sexual abuse victims, but we're not talking about it with that language."
"We have to fund the specialist services as well as changing the laws in order that survivors are holistically supported."