Hackers have found a shocking new way to steal your data

Soon you may not be able to trust your friends and loved ones over the phone, thanks to hackers (Image: trumzz / Getty)

We have all seen texts and emails claiming to be from the tax office, or from someone urgently requesting payment. Most of us are savvy enough to see the scam for what it is and simply ignore it.

However, fraudsters have come up with a terrifying new way of tricking victims using AI voice cloning, according to a report from iiDENTIFii, a remote biometric digital authentication and automated onboarding technology platform.

Cyber crooks have found a scary way to coax information out of you and your family. Imagine receiving an unexpected call from a loved one saying they need help, or from your boss asking for data about a new project. Most of us wouldn't hesitate, right?

Scammers have already posed as victims' loved ones (Getty Images/iStockphoto)

Think again: the latest AI technology has opened the door to software that can realistically imitate voices, and it could make spotting scams much harder.

AI-generated clones of the voices of singers Drake and The Weeknd have already gone viral online, tricking many into believing that a new track had been leaked by the artists and leaving streaming services Spotify, Apple Music and Deezer scrambling to remove it.

Microsoft has recently piloted an AI tool called VALL-E. It can clone a person's voice from just a few seconds of an audio clip and then generate speech in a wide range of different languages.

The criminals only need a short audio clip of a family member's voice, often stolen from social media, which is then fed into a voice-cloning program to stage an attack.

The threat is a serious concern: the US Federal Trade Commission recently issued a warning urging consumers to be vigilant for calls in which scammers sound exactly like their loved ones. What sounds like something from sci-fi is now all too real.

We take the safety of our voices for granted, assuming that, however good an imitation, most of us would notice any changes to the voices we hear often.

Voice cloning is a new frontier for fraudsters targeting consumers (Getty Images/iStockphoto)

Thankfully, VALL-E has not been released for public use, but it does illustrate how easily voice can be manipulated as a medium.

Worse still, impersonation attempts are on the rise. Gur Geva, founder and CEO of iiDENTIFii, says:

“The technology required to impersonate an individual has become cheaper, easier to use and more accessible. This means that it is simpler than ever before for a criminal to assume one aspect of a person’s identity.”

Audio recognition technology has been a standard security solution for many businesses and financial services companies around the world.



Barclays, for example, integrated Apple's Siri to facilitate mobile banking payments without the need to open or log into the banking app. Visa partnered with Abu Dhabi Islamic Bank to introduce a voice-based biometric authentication platform for e-commerce, which uses the biometric sensors built into a standard smartphone.

The rise of voice-cloning helps remind us of the importance of using multi-layered biometric authentication.
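The principle behind multi-layered authentication can be sketched in a few lines of Python. This is an illustrative assumption of how such a check might be structured, not any bank's or vendor's real API; the function and factor names are hypothetical:

```python
# Minimal sketch of multi-layered authentication: no single factor,
# such as a voice match, is trusted on its own. Several independent
# checks must all pass before access is approved.

def authenticate(voice_match: bool, device_trusted: bool, liveness_passed: bool) -> bool:
    """Approve only when every independent layer agrees."""
    layers = [voice_match, device_trusted, liveness_passed]
    return all(layers)

# A cloned voice alone (voice_match=True) is not enough:
assert authenticate(True, False, False) is False
assert authenticate(True, True, True) is True
```

The design point is simply that a fraudster who clones one factor, such as a voice, still fails the other independent layers.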

Geva adds, “Our experience, research and global insight at iiDENTIFii has led us to create a remote biometric digital verification technology that can authenticate a person in under 30 seconds, but more importantly it triangulates the person’s identity, with their verified documentation and their liveness.”

We might feel our voice is safe, but new AI-enhanced tech proves that isn't the case anymore (iiDENTIFii)

iiDENTIFii uses biometrics with liveness detection, protecting against impersonation and deep fake attacks.
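The "triangulation" idea described above can be sketched as follows. The names, scores and threshold here are illustrative assumptions for the sake of the example, not iiDENTIFii's actual implementation:

```python
# Hypothetical sketch of identity triangulation: verification passes
# only when the live biometric, the identity document, and a liveness
# (anti-spoof) check all corroborate one another.

from dataclasses import dataclass

@dataclass
class Evidence:
    face_match_score: float   # live capture vs. document photo, 0..1
    document_valid: bool      # security features on the ID check out
    liveness_score: float     # anti-spoof confidence, 0..1

def triangulate(e: Evidence, threshold: float = 0.9) -> bool:
    return (e.document_valid
            and e.face_match_score >= threshold
            and e.liveness_score >= threshold)

# A deepfake replay might match the face but fail the liveness check:
assert triangulate(Evidence(0.95, True, 0.2)) is False
assert triangulate(Evidence(0.95, True, 0.97)) is True
```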

“Even voice recognition with motion requirements is no longer enough to ensure that you are dealing with a real person. Without high-security liveness detection, synthetic fraudsters can use voice cloning, along with photos or videos, to spoof the authentication process.”

While this new technology is impressive and can pull the wool over our eyes, users and businesses need to remain vigilant and develop more sophisticated methods of authentication. We might also need to think before handing over our data, asking questions like: why does Gran need my PIN?

James Ide
