'Greatest peril' to humanity generated by AI already happening, expert warns

23 June 2023, 22:43
Voice-cloning technology is now 'more accessible than ever', an expert has said (Image: Getty Images/iStockphoto)

A cybersecurity expert has warned that the biggest threat posed by artificial intelligence (AI) is already taking effect, as people are scammed out of their money.

Scammers have been replicating people's voices to defraud the people who know them, in a process known as 'AI voice cloning'.

Wasim Khaled, the co-founder and chief executive officer of Blackbird.AI, said the results of voice cloning are "indistinguishable from human speech".

He told the US Sun: "AI voice cloning allows for the creation of more convincing deep fakes, and a deluge of voice samples from public figures like politicians and celebrities yields high-fidelity results...

"This technology is now more accessible than ever - a quick search returns dozens of low-cost or free providers."


Research by cyber protection company McAfee shows that criminals need just three seconds of someone's voice to clone it.

Khaled said AI could confuse people's understanding of what is real and what is fake (Image: Getty Images/iStockphoto)

Khaled argued that generative AI's "greatest peril" is its ability to confuse people's understanding of what is real and what is fake.

"Voice cloning, along with other rapidly expanding commercially available generative AI capabilities, is yet another risk factor that muddles the information environment," he added.

The cybersecurity expert said, however, that technology should soon be developed to warn people in advance that a voice message could be a scam.

A mum in Scottsdale, Arizona recounted the horrifying moment scammers used AI to copy her daughter's voice and stage a fake kidnapping in a ransom call.

Jennifer DeStefano delivered emotional testimony in a hearing before the Senate Judiciary Committee earlier this month, describing a phone call she received in January in which what sounded like her 15-year-old daughter Brianna was screaming on the other end.

Jennifer DeStefano was scammed with a clone of her daughter's voice (Image: KHPO)

The cry for help continued before a male voice told the scared mum: "Listen here. I have your daughter.

"You call the police, you call anybody, I’m gonna pop her something so full of drugs.

"I’m gonna have my way with her then drop her off in Mexico, and you’re never going to see her again."

Jennifer, who had been told the kidnapping was for a $1 million ransom, was outside a dance studio where her younger daughter Aubrey had a rehearsal at the time.


She had answered the call despite it coming from an unknown number because Brianna (also known as Brie) was away from home at the time, training for a ski race.

Aubrey, 13, was left shaking and crying listening to screams she thought belonged to her sister.

The call wasn't revealed as a scam until after a 911 call had been made and a confused Brianna phoned her mum to tell her she was OK.

The grim 911 call was recorded by the Scottsdale Police Department; in it, a parent at the dance studio can be heard saying the "kidnapper" would not "let her talk to her daughter".

Data from the Federal Trade Commission (FTC) indicates Americans lost $2.6 billion to imposter scams in 2022, and schemes are getting increasingly sophisticated.

Brianna was completely safe, her mum eventually learned (Image: Facebook)

FBI spokesperson Siobhan Johnson said families lost around $11,000 on average in similar scams.

In a statement, the FTC said: "A scammer could use AI to clone the voice of your loved one.

"All he needs is a short audio clip of your family member’s voice - which he could get from content posted online - and a voice-cloning program. When the scammer calls you... (it will) sound just like your loved one."

It comes as leading tech and science experts recently expressed their concerns over AI and the future of humanity in a statement published by the Center for AI Safety.

The signed letter published last month argues that "mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war."

Benjamin Lynch
