Real women whose faces have been used by AI to create deepfake images
Taylor Swift fans were recently left horrified after deepfake pornographic images of the singer were circulated online, prompting a group of US senators to introduce a bill to criminalise the sharing of non-consensual sexualised images generated by artificial intelligence (AI).
This comes amid widespread concern among women from many walks of life about the rapid advances of AI, in a world where new laws have yet to catch up. On October 27, a spokesperson for the UK government confirmed the newly introduced Online Safety Bill recognised deepfake porn as harmful, with the sending or sharing of such images regarded as a criminal offence.
Under this long-awaited bill, those who share explicit deepfakes could face up to six months behind bars. However, it remains notoriously difficult to pin down the real identities of those who commit this offence, leaving many traumatised victims in a state of limbo.
During an interview with the Mail Online, Kate Isaacs, 30, recalled how her 'stomach dropped' when she came across sexually explicit footage of herself while scrolling through X, previously known as Twitter. The footage in question showed her face superimposed onto the body of another woman, who was performing a sex act. Kate, who founded the #NotYourPorn campaign, recalled: "It was so convincing, it even took me a few minutes to realise that it wasn't me. Anyone who knew me would think the same. It was devastating. I felt violated, and it was out there for everyone to see."
The professional researcher never found out who created the footage, but believes she was targeted because she'd previously spoken out against the rise of 'non-consensual porn'. The footage, which was shared in 2020, had been created using completely unrelated images of Kate's face taken from the internet.
Another woman, referred to by the pseudonym Courtney, opened up to the Mail Online about her horror after a stranger turned her ordinary mirror selfie into a deepfake nude photo. This unknown person, who sent Courtney the photo over Instagram with a smirking emoji, had used one of the many sinister websites that will 'virtually undress' individuals in any photo.
Courtney told the publication: "I hadn't heard from this person in two years and I woke up one morning, last Friday. I woke up at 7am to go to college but I received this extremely graphic image that definitely wasn't real. It had been taken from my Instagram and edited with AI. I was in shock. No way did that just happen. But I realised the severity of it later in the day... who they could have sent that to." The aspiring social worker, who is in her 20s, blocked the account and took down all pictures of herself from social media. She now lives in constant fear of the violation being repeated.
During an interview with the i, Northern Irish politician Cara Hunter spoke of her distressing ordeal after fake porn footage was made of her during her election campaign in April 2022. She became aware of the video, in which she appeared to be engaging in an oral sex act, while attending her grandmother's 90th birthday. Cara, 26, recalled: "I was surrounded by family and my phone was just going ding, ding, ding. And over the next couple of weeks, it continued like that."
The clip was shared across various social media platforms, and Cara even experienced harassment on the street as a result. She shared: "Two days after the video started doing the rounds, a man stopped me in the street when I was walking by myself, and asked for oral sex. He brought up the video. It was horrible, it felt like everybody believed it. People would laugh and jeer and mock me in the street."
Although police were sympathetic when she reported the incident, they admitted there was little they could do to help, as their 'cyber team had very limited powers'. A report published in 2023 found there had been a 550 per cent increase in the number of deepfake videos found online that year when compared with 2019. The report, published by homesecurityheroes.com, revealed 98 per cent of all deepfake videos were of a pornographic nature, while 99 per cent of those targeted were women.
If you've been the victim of sexual assault, you can access help and resources via www.rapecrisis.org.uk or by calling the national telephone helpline on 0808 802 9999.