AI voice cloning scams ‘will come to Australia’ – so how do you guard against them?


The next phone call you receive from a loved one may not actually be from your loved one.

That's the warning from an Australian artificial intelligence expert, who says voice cloning algorithms are becoming more sophisticated and are increasingly being used by scammers.

9news.com.au investigated the accessibility, ease of use and efficacy of AI voice-cloning tools, and was able to convincingly recreate the voice of one of our journalists using an online voice-cloning tool.

You can listen to the two voices – real and AI-generated – in the video above.

So, how does the technology actually work?



Dr Diep N Nguyen, Associate Professor with the School of Electrical and Data Engineering at the University of Technology Sydney, said AI models can recreate vocal frequencies from a relatively short voice clip and string snippets into coherent sentences.

"Some AI models and algorithms need as little as a minute or less of recording," he told 9news.com.au.

"That's good enough for them to synthesise a reasonably high-quality voice clone.

"(But) the more a person talks the better."

Nguyen added that advanced AI models and algorithms can synthesise a voice so well "it is hard for an ordinary person to differentiate the cloned and the authentic one".


"Some AI models and algorithms boast that they can achieve 99 per cent accuracy in comparison with the authentic voice," he said.

"No doubt it would be possible.

"With this one per cent difference, for humans, it would be very difficult to recognise the difference."

The rise of AI voice cloning scams

Scammers are increasingly using emerging technology to swindle victims out of large sums of money.

"(There's been) many cases overseas, in the United Kingdom, United Arab Emirates, Hong Kong," Nguyen said.

"No doubt it will come to Australia.

"I believe Australia is vulnerable to many cyber-attacks and voice cloning is for sure a high risk."



A mother in the US, Jennifer DeStefano, claimed she nearly fell victim to an AI voice scam last month after receiving a panicked call – allegedly from her daughter – who said she had been kidnapped.

The 15-year-old, who was on a ski trip, never said any of the words Jennifer DeStefano heard.

What to watch out for

Nguyen said scammers could theoretically use existing social media videos and run them through an AI voice cloner to recreate a voice.

"Whatever we share online via Facebook, TikTok, YouTube – including our voice, photos and videos – can be potentially used by deep fake technologies to train AI models for cloning purposes," he said.

"With the ever-fast advances of AI models, less and less training data will be required to synthesise voices that are sufficiently close to the authentic ones for fraudulent behaviours.

"So be wary of your personal digital 'profiling/data' that you may share."



He added that people should second-guess any call that requests a money transfer, whether it appears to come from a loved one or not.

"The best thing people can do is say, 'thank you very much', and hang up the phone," Nguyen said.

"Then re-call the person who allegedly just called you to confirm.

"Let's say it's from my mum, I should call my mum and ask 'did you just call me?' and if she confirms, then yes it's probably ok."
