Victims of ‘deepfake pornography’ are calling for tougher measures to prevent women being targeted.
Creators of deepfake porn use artificial intelligence to superimpose one person’s likeness onto another’s body, most often in videos.
A perpetrator needs only a digital photograph of their victim, which can then be transposed onto pornographic content.
One victim of deepfake porn stated, “It really makes you feel powerless, like you’re being put in your place…punished for being a woman with a public voice of any kind. That’s the best way I can describe it. It’s saying, ‘Look: we can always do this to you.’”
British academic and cyber civil rights researcher Sophie Maddocks commented that much of the concern has focused overly on the potential for political deepfakes, when in fact, “Deepfakes were being used predominantly as a form of cyber sexual abuse…” Adam Dodge, founder of EndTAB, a non-profit that educates people about technology-enabled abuse, similarly argued, “This is a violence-against-women issue.”
Kate Isaacs, leader of anti-porn campaign group Not Your Porn, and Cara Hunter, a 26-year-old Northern Irish politician, were both recent victims of deepfake porn.
Isaacs, reflecting on the profoundly distressing offline repercussions she experienced, stated, “There are real-life ramifications of being a victim of image-based sexual abuse. It has real-life consequences. It’s scary and it’s embarrassing and it’s humiliating.” Similarly, Hunter highlighted the “psychological warfare” she experienced when she was targeted during the late stages of her election campaign.
At present, there is no single criminal offence in England and Wales regulating the sharing of non-consensual intimate images.
Isaacs acknowledged, “The first problem is that our legislative process isn’t fit for purpose when it comes to online harms … It moves far too slowly and cannot keep up with the technology. Our legislative process in the UK cannot safely and proactively protect people online as it is too slow.”
In July 2022, the Law Commission recommended that the Government change the law so that sharing deepfake pornography without consent is a criminal offence.