AI Kidnapping Scam: How To Protect Your Family and Money


Scams have come a long way from the likes of three-card monte. Fraud powered by artificial intelligence, or AI, is harder to see coming: high-quality voice cloning software and other technologies can make it nearly impossible to tell what is legitimate and what is a scam in real time. The next time your phone rings from an unknown number, the voice on the other end could be much more than spam.


The AI Kidnapping Scam: Quick Take

In January, while Jennifer DeStefano's 15-year-old daughter Briana was away on a ski trip, DeStefano received a terrifying call from an unknown phone number in what sounded like her daughter's voice. Briana sounded panicked and begged her mother to save her from kidnappers.

Luckily, the call turned out to be a hoax: Briana was safe and had not been kidnapped. Unfortunately, it was symptomatic of a larger trend of AI voice scams in which scammers call family members and demand money for the safe return of a loved one who was never actually kidnapped. It is jarring to hear what you think is your child or spouse begging for help, and scammers count on exactly that.


How AI Can Copy Your Voice and Face

Much of what you see or hear nowadays passes through some kind of filter. Whether it's a robocall or an image on social media, it raises the question of how much of what you perceive is real. With advances in AI, it is not only your perception that is at risk but also your identity.

AI Voice Cloning 

Voice synthesis or voice mimicry technology, often referred to as AI voice cloning, uses text-to-speech and machine learning to simulate a specific person's voice. To produce a believable impersonation, the technology trains learning algorithms on recordings of the target's voice, capturing the vocal characteristics it needs to duplicate the intricacies of the person it is trying to mimic. Once it has enough data, it can reproduce nuances such as:

  • Tone
  • Cadence
  • Pitch
  • Laughter
  • Pauses

With these capabilities, you can see how easy it would be to believe you were talking to someone you know, even a family member. To avoid falling for an AI kidnapping scam, keep these warning signs in mind:

  • The call is coming from a phone number you don’t recognize.
  • The person claiming to be someone you know cannot answer questions directly or doesn’t seem to know answers to questions they should.
  • The person repeats the same message over and over without pausing, and then someone else takes over the call to demand money.
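
To illustrate how accessible voice cloning has become, here is a minimal sketch in Python of what generating speech in a copied voice can look like with one publicly available, open-source tool. It assumes the Coqui TTS package and its XTTS v2 model, which can synthesize speech in a target voice from a short reference recording; the package, model name and file names below are specific to that example and are not the only way this is done.

    # Minimal sketch: synthesize speech in a cloned voice with the open-source
    # Coqui TTS package (pip install TTS). Assumes a short recording of the
    # target speaker is saved as target_voice.wav (a hypothetical file name).
    from TTS.api import TTS

    # Load the multilingual XTTS v2 model, which supports cloning a voice
    # from a brief audio sample of the target speaker.
    tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

    # Generate arbitrary text in the cloned voice and save it to a file.
    tts.tts_to_file(
        text="This is what a cloned voice can sound like.",
        speaker_wav="target_voice.wav",
        language="en",
        file_path="cloned_voice.wav",
    )

The point is not this particular library but how short the distance is from a few seconds of someone's recorded voice to convincing synthetic audio, which is why the warning signs above matter more than trying to judge the voice itself.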

AI Face Swapping

AI face swapping is an advanced computer vision technique that maps one person's face onto photos or videos of another. The generative adversarial networks behind it, built on deep learning algorithms, can be fun as a Snapchat filter but dangerous when they are used for fraud. Deepfakes are becoming harder to spot, and if the technology keeps advancing, it could carry much larger legal consequences and implications.

Final Take To GO 

Scam artists know that preying on your fears is one of the easiest ways to get you to hand over money without thinking it through. Unfortunately, as quickly as AI is advancing, so are the people who will take advantage of it. Familiarizing yourself with the warning signs of a potential AI voice scam call could save you both time and heartache.

FAQ

Here are the answers to some of the most frequently asked questions about AI voice scam calls.
  • How can you recognize an AI kidnapping scam?
    • AI technology is getting harder to keep up with, but here are some signs the call you just received is part of an AI kidnapping scam:
      • The call is coming from a phone number you don't recognize.
      • The person claiming to be someone you know cannot answer questions directly or doesn't seem to know answers to questions they should.
      • The person repeats the same message over and over without pausing, and then someone else takes over the call to demand money.
  • Is a loved one calling for help a scam?
    • Not every phone call you receive from a loved one is a scam. However, AI voice cloning can duplicate a person's voice, and scammers use it in schemes like the AI kidnapping scam, in which a scammer clones your loved one's voice to convince you they've been kidnapped and extort money from you.

