In 2015, Keeley Richards-Shaw found her name and personal life splashed all over the media. Her photo, her job and links to her Facebook page were all published. She’d been in court seeing her ex-partner sentenced for harassment and sharing sexual images without her consent. After being stalked by him, she said she was now being “stalked by the media”.
Having her private, intimate images shared was devastating. But the media's invasion of her privacy made it even worse, leaving her distraught and humiliated after what should have been a legal victory.
The law had just been changed to criminalise sharing images without consent. But it was not classed as a sexual offence and, crucially for survivors like Richards-Shaw, there was therefore no automatic anonymity for victims.
This act is often called “revenge porn”. But this is a problematic term for many reasons, including that it places the blame on victims by suggesting they have done something worthy of “revenge”.
Keeley Richards-Shaw became the first of many survivors to raise their voices and demand a change in the law. I gladly worked with her and her local police and crime commissioner to raise this issue with the government. I conducted a survey and found that 75% of the public supported automatic anonymity for victims.
I spoke to more victims a few years later, with one telling me and my colleagues that she would never report this crime to the police due to the lack of anonymity. Another who’d been to the police said the publicity was “worse” than the crime itself.
At last, victims’ voices are being heard. The government has announced it will amend the online safety bill to include significant changes to the law on the sharing and posting of intimate images.
First, it will make sharing intimate images without consent a sexual offence, guaranteeing victims automatic anonymity. It also removes a cumbersome legal barrier that requires proof of an offender’s intent to cause distress. Finally, it criminalises the sharing of deepfake porn, a vital step as AI technology is making it easier to create deepfakes.
Long overdue changes
Classifying the sharing of intimate images as a sexual offence in law is a significant step that victims, campaigners and researchers have been demanding for years. In 2014, the actor Jennifer Lawrence described the hacking and sharing of her nudes as a “sex crime”. As one victim, Deborah, whose images were shared without consent, told my colleagues and me in 2019: “It’s a type of rape, it’s just the digital version.”
Our research also recommended removing the “motivation threshold” – a legal requirement to prove that perpetrators acted with the intention of causing distress. This standard was one factor leading to low prosecution rates and police refusing to progress cases.
Georgie Matthews learned this when, after reporting her ex-partner to police, he sent her a text saying he hadn’t meant to cause her distress – case dropped. Matthews has campaigned tirelessly for this requirement to be removed from the law.
Hearing these and other experiences, my colleague Erika Rackley and I developed the term “image-based sexual abuse” to better describe how the non-consensual taking and sharing of intimate images can affect victims. We hoped to stop people using the term “revenge porn” – in addition to victim-blaming, it is one reason why the law was limited to those deliberately causing distress.
Removing the motive threshold also means many more types of abuse can be prosecuted, such as “collector culture” cases, where men take and trade intimate images in online groups. These private WhatsApp groups and internet forums are on the rise in schools, universities and workplaces, with men striving to boost their status and gain kudos by trading and sharing non-consensual, explicit photos.
The desire for non-consensual material is also driving the phenomenon of “deepfake porn”, where ordinary images are altered using AI technology to make them explicit and pornographic. Sharing such images – though not creating them – will also now be criminalised in the online safety bill.
Sarah shared with us her devastating experience when her ex-partner photoshopped images of her to make them sexual. Despite the adverse impacts on her mental health, she has continued to speak out, to try to help other survivors. Artist and campaigner Helen Mort has also shared her experiences of deepfake porn, calling for the changes in the law that have now been announced.
Only the start
It is very welcome to see reform happening in a way that validates survivors’ experiences. We’ve come a long way since Erika Rackley and I started calling for change. But there remain some big gaps in protection.
These changes do not cover situations where a victim’s images are not considered “intimate”, yet sharing them non-consensually could be threatening for the subject. For example, a Muslim woman photographed not wearing a headscarf that her family or community expects her to wear.
Ultimately, what victims want first and foremost is for material to be removed from the internet. Yet these reforms do not include court powers to make platforms take down material, or insist perpetrators delete imagery, as is the case in many other countries.
The UK needs a comprehensive, holistic response that helps victims get justice and rebuild their lives. The changes to the online safety bill are a great start, but not the end of the fight.
Clare McGlynn received funding from the Australian Research Council.