Deepfake sexual abuse is not ‘porn’. Demand action to stop image-based abuse!


The Issue

Countless women and girls are victims of deepfake sexual abuse without even knowing it.

I will never forget the moment I discovered that someone had turned photos of me into sexually explicit deepfake ‘porn’.

In January 2021, I received an anonymous email directing me to a series of images of myself that had been shared online, alongside information about who I was and where I lived.

When I received another email just a few weeks later, sending me to an online forum where users had created and published sexually explicit deepfakes of me, I thought my life was over.

My life was turned upside down. I had no idea who was doing this to me and why. I spent every minute of every day looking over my shoulder, questioning everything and everyone. Eventually, I discovered the person doing this to me was my male best friend.

When I reported this to the police, I was told there was nothing they could do, and I was turned away. The police told me it was up to Reddit and the porn site to take the photos down. I had to take on the work of tracking down the images and videos myself, collecting all the evidence while desperately trying to find ways to get the images removed.

When I returned to the police to seek justice, they told me the only charges they could bring against the perpetrator were for the foul language being used online.

No laws had been broken by the solicitation, creation and sharing of the sexually explicit deepfake images that had caused me so much suffering.

I felt alone. The emotional toll was enormous. There were points where I was crying so much I burst blood vessels in my eyes. I couldn’t sleep, and when I did, I had nightmares. It felt impossible not to fall into a dark and hopeless place, knowing that someone so close to me could do this and face no consequences.

Even now, I have no idea where these images of me have been shared, and who has had access to them. It makes me feel sick.

There are so many ways I could have and should have been helped and supported. Until a specific law on this is introduced, what happened to me could happen to anyone. That’s why I’m campaigning for a change in the law with the End Violence Against Women Coalition, #NotYourPorn, Professor Clare McGlynn and GLAMOUR UK, so that no one else has to go through what I did.

To create sexually explicit deepfakes, a perpetrator needs just one photo in which the person they are targeting is looking straight into the camera – something most of us have shared online. They can then use AI apps to create depictions of their target having sex or being abused. While the imagery is fake, it looks very real.

As this technology becomes more widely available and accessible, the threat to women – who are disproportionately affected by this abuse – continues to grow at an alarming rate. Now is the time for change.

We urgently need better criminal and civil laws to deter perpetrators and ensure images are removed. Tech companies must also be held accountable for hosting, encouraging, and profiting from image-based abuse – through better regulation and the courts. We need funding for specialist support for women like me. And importantly, we must educate people about the harm caused by this abuse.

For too long the government’s approach to tackling image-based abuse has been piecemeal and ineffective. This crisis demands more.

Together, we can end image-based abuse. Please help me by signing and sharing this petition.

Jodie

Jodie, Petition Starter


The Decision Makers

Peter Kyle
Secretary of State for Science, Innovation and Technology

Keir Starmer
Prime Minister
