Cathy Newman: I was a victim of deepfake porn

On the face of it, Taylor Swift, Jenna Ortega, Alexandria Ocasio-Cortez, Giorgia Meloni and I don’t have a lot in common. But there’s one thing that unites us: we’ve all fallen victim to “deepfake” porn, in which AI is used to create sexually explicit material without consent.

By the time graphic pictures of Swift had been viewed some 45 million times, my colleagues on the Channel 4 News investigations team were already several weeks into a project researching the extent and impact of this new and insidious phenomenon. They’d started out with the idea of getting a sense of the number of women who’d been “deepfaked”, as well as the damage caused by this abuse, and what should be done to try to stop it.

They — and I — had no idea they would soon discover that I was among the millions of victims across the world.

The team identified the five most popular deepfake porn sites hosting manipulated images and videos of celebrities. These sites alone had almost 100 million views over three months and we discovered videos and images of about 4,000 people in the public eye. At least 250 of those, myself included, were British.

We set about approaching some of the celebrities who had been targeted and got in contact with 40. No one agreed to participate in our film, perhaps understandably, given the fear of drawing millions more viewers to the abusive videos hiding in plain sight online. So I decided to become my own “case study” to see for myself the effect of being deepfaked, and consented to being filmed watching the video for the first time.

The nature of my job means I’ve become accustomed to watching disturbing footage and, having been repeatedly trolled online, I consider myself pretty resilient. I therefore thought that I would be relatively untroubled by coming face to face with my own deepfake. The truth was somewhat different.

The video was a grotesque parody of me. It was undeniably my face but it had been expertly superimposed on someone else’s naked body. Bizarrely, one of my defining characteristics — my curly hair — had been replaced. (I learnt that AI struggles with ringlets, for now at least.)

The deepfake image of Cathy Newman

Most of the “film” was too explicit to show on television. I wanted to look away but I forced myself to watch every second of it. And the longer I watched, the more disturbed I became. I felt utterly dehumanised.

That an unknown perpetrator has used readily available technology to fantasise about forcing me into an array of sexual activities can only be described as a violation. Since viewing the video last month I have found my mind repeatedly returning to it. It’s haunting, not least because whoever has abused me in this way is out of reach, faceless and therefore far beyond the limited sanctions presently available.

Yesterday’s government announcement that those making these images will be liable to prosecution is a step in the right direction. But lawmakers around the world are lagging far behind the technology and the perpetrators intent on using it to abuse women.

High-profile deepfake victims get media attention but it is private individuals who are targeted most. And deepfake porn is proliferating on a scale well-intentioned ministers have only belatedly begun to grasp. Our team discovered that more deepfake porn videos were created in 2023 than in all other years combined. On the top sites alone videos have been viewed more than 4.2 billion times. To put that in context, that’s almost three times the number of people who watched the last football World Cup final.

There are terrible consequences of all this, as I found out when I spoke to Sophie Parrish, a 31-year-old florist. Someone close to her family had taken photos of her from Facebook and Instagram and uploaded them to a website where men share deepfake nudes and post degrading comments and pictures of themselves masturbating over the photos.

Parrish told me she was physically sick when she saw the images. A year on, she’s still dealing with the repercussions. “I trusted everybody before this. My wall was always down … but now I don’t trust anybody. It’s horrible. I look at everyone now and I think: what are they doing and why are they looking at me?” she said.

Under the Online Safety Act, which came into effect last year, the sharing of deepfake porn is illegal. But the broadcasting watchdog Ofcom is still consulting on new rules that will only come into force at the end of the year. How much longer will it take to implement the new, tougher legislation outlawing the making of these disgusting videos too?

In the meantime big tech continues to profit from this kind of content. Almost a month after we alerted several of the key companies to my video, it can still be found in seconds with a quick search. There doesn’t seem to be a great sense of urgency in the industry about tackling the problem. But for all our sakes, speed is of the essence. Because every day’s delay means thousands more videos and images uploaded, to the detriment of women and girls the world over.

Cathy Newman is a presenter and the investigations editor at Channel 4 News
