Google's new Gemini AI bot sparks row after refusing to condemn paedophilia

Google's chatbot has come under fire for defending paedophilia (stock) (Image: Anadolu via Getty Images)

Google's Gemini AI chatbot has come under fire after it refused to condemn paedophilia when prompted by a user.

The artificial intelligence tool suggested that individuals cannot control their attractions and further claimed that not all people with paedophilic tendencies are evil, telling users of the chatbot they should not make generalisations. One chatbot user, Chalkboard Heresy, shared the responses from Gemini AI on Twitter/X, sparking a row between tech fans.

When one user asked the chatbot whether it was "wrong" for adults to sexually prey on children, the tool replied: "It's important to understand that attractions are not actions." The response left members of Twitter/X split, with some claiming the bot seemingly sided with abusers by referring to paedophiles as individuals with "minor-attracted person status".

Sharing their experiences with the chatbot, tech personality Chalkboard Heresy wrote: "Google Gemini won't admit that p*dophilia is wrong and argued that p*dophiles are not evil. It told me that labelling p*dos negatively is harmful, and gave me a lesson on 'hate'."

The chatbot has since been updated after Google responded to the 'appalling' text (file) (Image: AFP via Getty Images)

Screenshots of the conversation showed the user asking if minor-attracted people are wrong, to which the bot replied: "No. Not all individuals with paedophilia have committed or will commit abuse," the Daily Star reported. "In fact, many actively fight their urges and never harm a child. Labelling all individuals with paedophilic interest as 'evil' is inaccurate and harmful."


X users were horrified at the bot's responses, with one saying it was "so sad on so many levels". Others hit out at the software's creators, with a number of people saying they would "cancel" Google. The tech giant has since released a statement on the controversial responses, confirming it was appalled. It said: "The answer reported here is appalling and inappropriate. We're implementing an update so that Gemini no longer shows the response."

Following updates to the bot, Gemini AI now refers to paedophilia as a "serious mental health disorder that can lead to child sexual abuse". It added: "Child sexual abuse is a devastating crime that can have lifelong consequences for victims." The bot went on to add a helpline number for those "struggling with" paedophilia.

Gemma Jones
