Have you ever heard of AI voice cloning? It’s a new technology that uses the power of artificial intelligence to create realistic copies of a person’s voice. And, inevitably, scammers are already using the technology to commit fraud.

What is AI voice cloning and how does it work?

If you have ever experimented with ChatGPT, you know AI does a good job of writing human-sounding text. And Microsoft Copilot does a passable impression of a human artist, generating realistic pictures from text descriptions.

AI voice cloning works in a similar way. Using just a small snippet of audio, the artificial intelligence engine can mimic a speaker’s voice. The algorithm analyzes the sample to build a soundalike voice model. You can then supply the AI with a line of text (like a script), and it generates a sound bite that is almost indistinguishable from the original speaker. The more samples the algorithm can analyze, the more accurate the final audio clip becomes.

AI-generated sound clips can be used for all kinds of legitimate activities, from automated phone menu systems to recreating the voices of long-dead celebrities for movies and presentations.

How is AI voice cloning being abused?

Unfortunately, AI voice cloning can also be used for crime. Hackers are collecting audio clips from online video sites like YouTube and TikTok, then using them to train AI models. They can then use these cloned voices to defraud victims.

Imagine your phone rings. You answer the call and hear your boss speaking, sounding worried. Something has gone wrong at work, and they urgently need your logon details to fix it.

Or perhaps it’s one of your children experiencing some kind of emergency. They ask you to urgently transfer some money to their online bank account to help them out.

Because the cloned voices sound so realistic, you do what the caller asks. The trouble is, the whole call is fake – and when you transfer the money or hand over your logon details, the hackers have successfully defrauded you.

How to protect yourself

The key to protecting yourself against AI voice cloning is awareness. Never heard of AI voice cloning? You’re not alone – only around half of people (54%) have heard of this cyberattack technique.

In the same way that you check all emails to avoid falling victim to phishing, you should carefully consider every phone call you receive. Would your boss really ask for your logon details? Wouldn’t they call the IT support team first? Would your kids really call asking for a bank transfer to an unknown account? Could they wait five minutes while you check the details and call them back?

In every case, taking a moment or two to consider what you are being asked is an effective way to protect yourself against AI voice cloning attacks.