EXCLUSIVE: Channel 4 star and victim of deepfake porn on her chilling fear after sex vid found online
Four victims – including Channel 4 presenter Cathy Newman and former Love Island contestant Cally Jane Beech – have spoken of their disbelief at finding the harrowing images and footage online.
Deepfake pornographic images of TV presenter Holly Willoughby were found on a phone owned by Gavin Plumb – the monster convicted last week of plotting to kidnap, rape and murder the star.
Holly is the latest high-profile woman whose image has been shockingly manipulated. But women and girls all over the UK are being terrorised by bullies, blackmailers and sexual predators using their images to create deepfake porn. In January, the Online Safety Act made the sharing of intimate deepfakes – created by using AI to superimpose someone else’s face onto existing pornographic material – illegal, punishable by up to six months in prison. This rises to two years if intent to cause distress, alarm, humiliation or to obtain sexual gratification can be proved.
Police say it is still too early to assess the impact of the law change. But the new Labour Government included a proposal to criminalise the creation of sexually explicit deepfakes in its manifesto, so further legislation is on the cards. A Ministry of Justice spokesperson said: “Sexually explicit deepfakes are degrading and harmful. We refuse to tolerate the violence against women and girls that stains our society, which is why we’re looking at all options to ban their creation as quickly as possible.”
Channel 4 presenter Cathy Newman said the images were ‘incredibly invasive’ (Image: Getty Images)
Cathy watched the footage after being alerted by colleagues at Channel 4 (Image: Sunday Mirror)
It can’t come soon enough, according to campaigners, who are becoming increasingly concerned about the implications of deepfake pornography for younger girls. Just last month, a police investigation was launched after claims that deepfakes were created at a boys’ school by someone manipulating images taken from social media accounts of pupils at a girls’ school.
Now Channel 4 News anchor Cathy Newman, 49, a mum-of-two who was herself a victim of deepfake pornography, says it is teenage girls she fears for most. “I worry, having two teenage girls, that this is the world we live in,” she tells The Mirror. “We talk about it very frankly. I did ask them before I shared my story and they were really supportive. They were keen for me to talk about the issue because they want action. Their generation doesn’t want to live in this way.”
Cathy was alerted to a three-minute, 32-second deepfake pornographic video – digitally altered to look like her – earlier this year by the team investigating the issue for a segment on Channel 4 News. Her face had been superimposed onto the body of a woman having graphic sex with a man.
“You get used to online abuse, so my first instinct was to laugh about it,” she says. “It wasn’t until I sat down to watch it that I realised how invasive and dehumanising it was.

“I found it very disturbing how realistic the facial expressions and movements were. You wouldn’t know that it’s not real. It disturbed me that someone might stumble across this and believe it’s me. You just wonder the motive – why have they done this? The conclusion you reach is that it’s done to degrade women.”
And when she approached major search engines to have the footage removed, she was shocked to learn it couldn’t be taken down. She continues: “They said they would derank it, which means they will move it further down the search results, but they can’t remove it. So, they can make sure people can’t find it as easily, but it’s still there.
“Part of the problem is these sites (where the deepfakes are shown) are hosted outside of the UK, so even if the UK bans the creation of deepfakes, you need international action for anything to take effect. I do feel things are changing but it shouldn’t be okay for women to be targeted online in a way that would be illegal offline. It’s abuse.”
Shocking statistics from End Violence Against Women show non-consensual pornography accounts for 96% of all deepfakes found online, with 99% featuring women. Clare McGlynn, Professor of Law at Durham University, who specialises in pornography, sexual violence and online abuse, is advocating for a more “straightforward and comprehensive law” that covers “all forms of creating a deepfake altered image without consent”.
She says: “I welcome the new proposal that will make creating sexually explicit deepfakes of adults illegal, but we also need to be taking action against these platforms and internet service providers (where they are shown). Search engines like Google have an awful lot to be accountable for, because they provide access to the websites allowing this content to be created. We’re having to live with the risk that someone will make deepfake porn of us. It’s a real threat, and certainly students are well aware of it.”
Sophie Parrish discovered numerous images on a range of sites online
Sophie campaigned for the law change
Mum-of-two Sophie Parrish, 32, who discovered deepfake sexual images of herself in October 2022, which had been created by a family friend, launched a petition asking the government to “make it illegal to create and share explicit images without consent”.
The florist from Merseyside says: “We found numerous pictures on a whole range of threads online. We knew who’d done it because some of the pictures showed items of clothing that he’d stolen from my house, as he had access to our home. There were pictures on these websites of men pleasuring themselves over pictures taken from my personal Facebook and posted online by this family friend. I felt hatred and anger at that moment. This person destroyed the family friendship. We felt betrayed.”
The former friend was arrested in January 2023, but Sophie says: “There was no law about AI then. It was decided that there was not enough evidence, so he was released without charge.” And while she contacted search engines asking for the pictures to be removed, they said they couldn’t take them down, as they were AI-generated and, therefore, not nude pictures of her. She says: “I have two young children. My biggest fear is that they’ll find these images online one day. And I might have grandchildren one day. I want to know they are protected from these monsters. These websites need to be shut down.”
Despite difficulties in both getting images removed and catching the perpetrator, the police still urge victims of intimate deepfakes to come forward. A spokesperson for the National Police Chiefs’ Council says: “Help is there for you. Threatening to share intimate images (whether genuine or deepfaked) is a criminal offence.
“It’s important to report it either in person or online. Police are committed to supporting victims and survivors with respect, compassion and empathy.”

The proliferation of websites enabling people to create deepfakes is alarming, with a quick online search revealing sites like ‘See Anyone Nude’ and the ‘Deepnude Nudifier App’. UK-based freelance writer Becca Caddy, in her 30s, was threatened with blackmail by the person who created deepfake pornographic pictures of her.
Becca Caddy was blackmailed with deepfake pornographic images
Former Love Island contestant Cally Jane Beech went to Parliament in April (Image: Collect)
Refusing to give them money, she responded by posting the images on social media and telling the world they were not real. She also reported it to the police, who she says are investigating. She says: “I received an email that had the subject line ‘Photoshoot – Becca’. I opened it and saw a photo of me that was taken a few years ago, but it had been edited so I was topless. When I scrolled further down the email, I saw another photo that was taken during a day out in December in a coffee shop. My face had been attached to a body that wasn’t mine. There was a follow-up email just minutes later, but this one was malicious.”
Demanding money, the email said: “You can figure out the impact this will have on you, your family, your mental health, your relationships and your professional life.” The author further claimed to be a “professional” who had been blackmailing people for “more than a decade”. “I knew these photos couldn’t be held over me if I shared them myself,” says Becca. “It felt important to warn people that this kind of sextortion was happening.”
When former Love Island contestant Cally Jane Beech, 32, was told deepfake pornographic images of her were circulating in January, she told her Instagram followers that they were not real. And in April she was part of a parliamentary roundtable discussing the threat deepfake technology poses to women. “These apps that allow people to create these images should not be allowed to exist,” Cally says. “It’s a complete violation of people’s privacy.
“It comes down to consent. I might choose to post pictures in my underwear but I shouldn’t have to worry my privacy will be violated. I felt sick when I saw my deepfake image. I’ve never shared an intimate image of myself with anyone – even ex-boyfriends. So the fact that it was now out there on the internet made me feel really disappointed.”
Mum to a seven-year-old girl, she also worries how schoolchildren will cope with this new form of bullying and abuse. Cally, of Hull, says: “I had floods of messages from my followers who had also been deepfaked. This is happening to everyone – schoolchildren as well.
Cally worries for the future of young children (Image: Collect)
“You can drag any picture into these sites and it will generate fake breasts and genitals. There’s still more work to be done and the current law change is only the start.”

We approached a number of websites and search engines for a response to criticisms levelled by victims of deepfake pornography.
A Google spokesperson said: “We’ve been actively developing new protections in Search to help people affected by this content, building on our existing policies. We’re continuing to decrease the visibility of involuntary synthetic pornography in Search and develop more safeguards as this space evolves.”
A Reddit spokesperson said they have partnered with SWGfL, a charity devoted to the safe and secure use of technology, in a bid to find and remove this imagery. Their internal safety teams use tools to detect and remove such AI-generated content before anyone sees it. They added: “Reddit’s sitewide policies strictly prohibit any non-consensual sharing of intimate or sexually explicit media, including depictions that may have been AI-generated or faked.”