Voice Clone AI Scams: What You Need To Know About This New Scam


As if telephone, email and messenger scams weren’t bad enough, a new breed of scammers can now use artificial intelligence to clone the voices of loved ones. No, this is not dystopian science fiction. News outlets like Fortune are reporting that it’s happening now.

But what are voice clone AI scams? And how can you protect yourself?

What Are Voice Clone AI Scams?

In January 2023, Microsoft unveiled a text-to-speech AI tool that could simulate a person’s voice based on an audio clip as short as three seconds. Microsoft called it VALL-E and said that the technology “may carry potential risks in misuse… such as spoofing voice identification or impersonating a specific speaker,” Fortune reported.

Bad actors saw an opportunity in this kind of technology and have used similar tools to spoof the voices of people's friends and family, convincing victims to wire money, send gift cards or hand over personal account information.

How To Avoid Voice Clone AI Scams

Since no consumer software can reliably detect AI voice cloning, common sense is your best defense against scammers.

If you receive a phone call from a person claiming to be a close friend or family member in distress, pause and think before taking action. Do not send money under any circumstances.

Ask yourself these questions:

  • Is the situation your loved one is describing plausible?
  • Are you able to physically locate them and verify the story?  

In other words, if your grandson claims to be calling from another part of the world, can you phone his parents to verify the story? Does the story even make sense? If not, hang up immediately.

Do not try to "trap" the scam artist by asking personal questions only your loved one would know. The longer they keep you on the phone, the more time they have to convince you that they really are your loved one in a tough situation. Voice scammers are professionals and know exactly what to say to get you to send cash or gift cards.

Protect Yourself by Never Sending Money or Gift Cards

Do not, under any circumstances, send any cash, cryptocurrency or gift cards by any method, including mail, email or wire transfer.

Hang up the phone and try to reach your loved one directly. If you call their phone, use a number you already know, not one the scammer provides.

How Quickly Voice Clone AI Developed

Several years ago, cloning an individual's voice required a substantial amount of audio. Now, experts say, a 30-second TikTok clip or Facebook Reel can give voice clone AI scammers all they need to build a convincing conversation.

AI software can convincingly fill in the conversation based on the victim's responses.

What To Do If You’ve Been a Victim of AI Scammers

Unfortunately, if you do act impulsively — out of fear or concern — you are not likely to get your money back. However, the faster you act, the better your odds are. File a report with the FBI's Internet Crime Complaint Center (IC3) as soon as possible.


What AI Developers Are Doing To Help

Aware of the growing problem, developers are stepping in to limit the use of their tools.

ElevenLabs, for instance, tweeted in late January 2023 that it would be rescinding capabilities for voice cloning in the free version of VoiceLab. The company acknowledged, “Almost all of the malicious content was generated by free, anonymous accounts.”

The paid tiers require identity verification, which may deter some scammers. Realistically, though, a bad actor willing to use AI to voice scam people isn't likely to balk at using stolen credit card information to open an account.

ElevenLabs also announced new technology that would allow people to verify if a sample was generated using VoiceLab. If it was, and was used for malicious purposes, people are encouraged to report the incident to ElevenLabs as well as to local authorities. That capability does not seem to be available yet on the ElevenLabs website, at least not without creating an account.

Resemble AI, another voice clone developer, has proposed an “invisible watermark” to identify AI-generated speech. Resemble says the watermark is “imperceptible” and “difficult to remove.”

The drawback? It will only detect AI-generated audio made with Resemble's own tools. That may keep scammers away from the company, but it won't halt the growing problem.

Final Note

Voice AI scams are particularly terrifying because they prey on our most powerful emotions: love and fear. If you do fall victim to an AI voice scammer, don’t feel ashamed.

Report the incident to your local police and the FBI. You can also file a report to the Federal Trade Commission. The FTC tracks cases of fraud and shares fraud reports with law enforcement partners. This helps reduce fraud and catch scammers, including AI scammers who are manipulating other people’s voices for their own gains.


Our in-house research team and on-site financial experts work together to create content that’s accurate, impartial, and up to date. We fact-check every single statistic, quote and fact using trusted primary resources to make sure the information we provide is correct. You can learn more about GOBankingRates’ processes and standards in our editorial policy.
