Sitting in front of my laptop I watched a naked woman with my face having explicit penetrative sex in different positions with a naked man. The pornographic video was three minutes and 32 seconds long, and as grotesque as I found it, I forced myself to watch the whole thing. I had to understand exactly how realistic these images were, and how easy it was for people to access them online.
Because as seamless as the images seemed, it wasn’t me at all: my face had been attached to another woman’s body using artificial intelligence (AI) to create so-called ‘deepfake’ pornography.
The video was unearthed by my colleagues at Channel 4 while investigating the exponential and alarming rise of deepfake porn for a special report broadcast last month.
Of the 4,000 celebrities they found in deepfake porn videos online, 250 were British – and one of them was me.
None of the celebrities we contacted was willing to comment publicly. Although it was disappointing, I understood: they did not want to perpetuate the abuse they had suffered by drawing more attention to it.
But for our research to have maximum impact, I knew I had to speak out.
In my 18 years as a Channel 4 journalist, I have unfortunately seen many disturbing images of sexual violence. So while I was nervous about being part of the story, I assumed I would be inured to the content of the video itself.
But in fact it left me disturbed and haunted. I had been abused by a perpetrator whom, as far as I know, I have never met, and I was the victim of a very modern crime that threatens to have a corrosive effect on future generations of women.
I also felt vindicated in my decision to make this public as the government announced earlier this month that creating these sexually explicit deepfakes will be made a criminal offense in England and Wales.
I understand that Laura Farris, the Minister for Victims and Safeguarding, was partly motivated to take action after reviewing our research. This follows the ban on sharing this type of content introduced under the Online Safety Act last year.
My colleagues were already investigating deepfake pornography when fake explicit images of singer Taylor Swift went viral on X/Twitter in January, with one image being viewed 47 million times before being deleted.
Suddenly the alarming scale of the problem became clear. We found that the four most popular deepfake porn sites hosting doctored images and videos of celebrities had almost 100 million views in just three months, with more deepfake porn videos created in 2023 than in all previous years since 2017 combined.
The videos have been viewed more than 4.2 billion times in total.
You might think that creating them requires a certain degree of technical expertise, but it’s incredibly simple and is usually done using ‘nudify’ apps for smartphones – there are more than 200 available. Users submit a photo – a single image of someone’s face, culled from social media, is all that is required – and it is used to create a gruesomely realistic explicit image.
Because of the sheer number of celebrity photos online, high-profile figures are among the most likely victims. They include US Congresswoman Alexandria Ocasio-Cortez, who this month described the trauma of discovering she had been targeted during a meeting with aides in February, and Italian Prime Minister Giorgia Meloni, who is seeking damages after deepfake videos of her were uploaded online.
But the biggest victims are undoubtedly the hundreds of thousands of women without a public platform from which to expose the images as deepfakes – the women who might be sitting in a meeting or job interview, not knowing whether the people in front of them have seen, and been taken in by, the false images.
I spoke to one such victim, Sophie Parrish, 31, a florist and mother of two from Merseyside, whose deepfake pornographic images were uploaded to a website by someone in her family; men then photographed themselves masturbating over them. She was physically sick when she found out, and the impact on her since then has been profound.
A beautiful woman, she has lost confidence and now does not want to put on makeup for fear of attracting attention. She almost blames herself, even though she is clearly not at fault. And yet she had the courage to speak out last February, petitioning the Ministry of Justice to make it illegal to take and share explicit images without consent.
To be honest, I wasn’t entirely surprised when my colleagues told me about the existence of my video, since as a woman in the public eye I had been mercilessly trolled for years.
After my interview with Jordan Peterson, the Canadian psychologist infamous for his divisive views on political correctness, freedom of speech, gender identity and racial privilege, went viral in 2018, I received death threats. I was called a ‘c***’, ‘b****’ and ‘w****’, and my eldest daughter, then 13, was upset when she came across a meme on Instagram of my head superimposed on a pornographic image.
So it’s understandable that my colleagues were keen that I not feel any pressure to watch the video made of me, while my editor was concerned about its emotional impact. But I felt I owed it to every victim of this crime—especially Sophie Parrish, whom I had interviewed the day before—to understand for myself what it felt like to be targeted, and to speak out.
Of course, I have access to professionals who can help me process the material, but many women – and 98 percent of deepfake pornography victims are women – do not. I was concerned about the reaction of my daughters, who are now 19 and 15 years old, but like all teenagers they are aware of the kind of AI content spreading online and were interested in how we can navigate it.
After watching the report, they told me they were proud of me. So did my husband – although he understandably didn’t want to watch the unedited video of me, and neither did I.
While the pornographic meme my daughter saw in 2018 was gross, I discovered that six years later, the digital terrain has changed and the lines between what is real and what is not have blurred.
The only saving grace of my surprisingly sophisticated deepfake video was that AI can’t replicate my curly hair (yet), and the bleached blonde bob clearly wasn’t mine. Nevertheless, the images of me having sex with a man who had also presumably not consented to the use of his image felt incredibly invasive.
But I also wanted to be filmed watching it, to show in our report how much of an impact it had on me.
Even though it was clearly done from a distance, by a perpetrator whose motives I can only speculate about, I felt violated.
Anyone who knows me will realize that I would never be involved in making a porn video, and one benefit of getting older is that you are less easily wounded by this kind of abuse. But its existence undermines and dehumanizes women. It is a deliberate attempt to belittle and humiliate. Even when they know they are watching deepfake porn, men don’t seem to care.
Seventy percent of viewers visit deepfake porn sites via search engines. When we contacted Google, a spokesperson said they understood how disturbing the images can be, that they are developing additional safeguards to help people protect themselves, and that victims can have pages with this content removed from search results.
Since our investigation, two of the largest deepfake sites – including the site hosting my video – have banned UK users from accessing their content. But the video is still available through a virtual private network – a VPN – that hides a user’s location.
The Government’s legislation to ban the making of these videos – which will result in a criminal record, a fine and a possible prison sentence, and will be introduced as an amendment to the Criminal Justice Bill – is groundbreaking, but experts I have spoken to have already warned of possible loopholes in the law.
Victims will have to prove that the video was made with the intention of causing distress, which can be difficult, and there is a question over whether someone who asks an app to create the explicit content, rather than making it themselves, would fall foul of the law.
Another limitation is that many of these videos are made outside Britain, where our laws do not apply, so global action is also needed.
Then there’s the matter of timing: Ofcom, the broadcasting watchdog, is still consulting on how the law that made sharing these videos illegal will be enforced. It won’t come into effect until the end of this year, by which time hundreds of thousands more women will have been victims.
Regulation also lags far behind the technology that enables this crime, so ultimately responsibility falls to the big tech companies that distribute this explicit content – and profit from the viewers and advertisers it draws to their platforms.
They are far more powerful than individual jurisdictions, and I see no evidence that they are addressing the issue with the urgency that is needed.
I believe it is in their power to immediately stop the distribution of these videos, but it is not in their interest to do so.
I worried about a possible backlash if I made myself part of this story, but the overwhelming response – on social media, in my email inbox and on the street – was supportive.
And a month later, as depressed as I am about AI’s corrosive effect on future generations of women, I’m glad I went public.