Reviewed against FTC.gov and FBI.gov guidance · AI-assisted content, see our editorial standards
Quick answer
AI voice cloning scams use a few seconds of someone's recorded voice (from social media or voicemail) to create a fake call that sounds exactly like a family member in distress. The protection is simple and reliable: establish a family "safe word" that anyone can use to prove their identity in an emergency, and always hang up and call back on a number you already have before sending any money.
In 2023, the FTC received reports of consumers losing hundreds of millions of dollars to impersonation scams, and voice cloning technology is making these scams more convincing at a pace that has alarmed federal law enforcement.
The technology itself is remarkable: modern AI voice cloning requires as little as three seconds of audio to generate a near-perfect replica of someone's voice. That audio can come from anywhere: a social media video, a voicemail, a podcast appearance, a family video call recording. Scammers use it to call grandparents, parents, or spouses in the voice of a loved one who is "in trouble."
What you're about to read will make you a harder target and help you protect the family members who matter to you.
3 sec
of audio needed to clone a voice with AI
Source: Sensity AI 2024
$4.8B
lost by adults 60+ to fraud in 2023
Source: FTC Consumer Sentinel 2024
#1
adults over 60 are the age group most targeted by financial fraud
Source: FTC 2024
How voice cloning scams work
The mechanics are more straightforward than most people expect.
Step 1: The scammer finds audio. They search social media (Facebook, Instagram, TikTok, YouTube) for videos of the person whose voice they want to clone. If the target has a grandchild who posts videos, that audio is the raw material.
Step 2: They clone the voice. Using one of several commercially available AI voice cloning services (some costing as little as a few dollars), they generate a voice model they can make say anything.
Step 3: They call a family member. The call comes to a grandparent, parent, or spouse. The voice they hear sounds genuinely like their loved one. The script usually involves:
- An urgent, emotional situation (accident, arrest, hospital)
- A request for immediate money, often through wire transfer, gift cards, or cryptocurrency
- A reason to keep the call secret ("Don't tell Mom and Dad, I'm so embarrassed")
- An "authority figure" who gets on the line after the fake family member: someone posing as a lawyer, police officer, or bail bondsman
Step 4: The money is sent. The victim, convinced they're helping a loved one in genuine distress, sends the money before verifying.
The FBI reports that the typical loss in these scams ranges from several hundred to tens of thousands of dollars.
Why these scams work on intelligent people
This is the question victims most often ask themselves: How did I fall for this? The honest answer is that these scams are engineered specifically to defeat your rational defenses.
The voice sounds real. Modern voice cloning is convincing enough that victims who've heard it can't reliably distinguish it from the real person, especially when they're already emotional from the distressing content of the call.
Emotion overrides reasoning. The belief that a grandchild is in danger activates a protective response that is faster and stronger than skepticism. Scammers know this and deliberately escalate the emotional stakes.
The request for secrecy isolates the victim. Asking the target not to tell other family members prevents the most natural protective behavior: calling another family member to verify.
Time pressure prevents verification. "The bail hearing is in two hours" or "the hospital needs payment right now" are constructed to prevent the target from taking the time to verify.
Per the FTC, this combination of emotional urgency, social isolation, and time pressure is the consistent architecture of the most effective financial scams.
What AI can fake, and what it still can't
Voice cloning technology is impressive, but it has real limits. Understanding them helps you spot a scam even when the voice sounds right.
AI can convincingly fake
- The sound of a familiar voice from just a few seconds of audio
- Regional accent, speaking cadence, and word choice
- Emotional tone: panic, crying, urgency
- Short phrases used in a pre-scripted conversation
- Background noise like traffic or a hospital waiting room
AI still can't fake
- A real-time, unscripted conversation with specific shared memories
- Answers to questions only the real person would know
- A pre-agreed safe word that was never shared online
- Letting you hang up and call the person back directly
- Waiting patiently while you verify through another channel
🚫 Rule
No legitimate caller will refuse to give the safe word or insist you send money immediately. Urgency plus secrecy plus money equals scam, every time.
The protection that actually works: the family safe word
The most reliable protection against voice cloning scams is one that requires no technology and costs nothing to implement.
Establish a family safe word or phrase. Choose a word or short phrase unusual enough that it wouldn't come up in ordinary conversation, and share it with the family members most likely to be targeted (older parents and grandparents) and those whose voices might be cloned (adult children and grandchildren).
The protocol: if someone receives a distressing call from a person claiming to be a family member, they ask for the safe word. If the caller can't provide it, hang up immediately and call the real family member directly at a known number.
Important: The safe word should not be shared publicly, should not be easily guessable, and should be updated occasionally. Don't send it in a message or email; share it in person or on a voice call to a number you already know.
⚠ Important
Never send money via gift card, wire transfer, or cryptocurrency to anyone, no matter how urgent or convincing the situation sounds. No legitimate emergency ever requires these payment methods.
Step-by-step: what to do when you get a suspicious call
Step 1: Don't panic. The scammer's goal is to get you to act before you think. Take a breath.
Step 2: Ask for the safe word. If the caller claims to be a family member, ask for the family safe word. If they don't know it or give a reason why they can't say it, hang up.
Step 3: Hang up and call back directly. Call the family member at a number you already have, not a number the caller gave you. If the emergency is real, you'll reach them; if they're fine, you'll know immediately that the call was a scam.
Step 4: Never send money before verifying. No matter how convincing the call, no matter how urgent the stated situation, do not send money (wire transfer, gift card, cryptocurrency, or any other form) before you've verified the situation through a separate channel.
Step 5: If someone asks you to keep it secret, that's the scam. Legitimate emergencies don't require secrecy. A family member who is genuinely in trouble would want you to call other family members, not hide it from them.
A real example
Patricia, 76, from Ohio, received a call from what sounded exactly like her 23-year-old grandson Jason. He was crying, said he'd been in a car accident, and was being held by police because the other driver said he'd been drinking. He begged her not to call his parents; he said he was embarrassed and scared.
Then a "lawyer" got on the line and said Jason could be released if $4,200 in bail was paid via gift cards before 5 PM.
Patricia was about to drive to CVS to buy the gift cards when her daughter happened to call for an unrelated reason. She told her daughter what was happening. Her daughter immediately called Jason, who was at work, completely unaware of the call.
Patricia had not yet established a family safe word. After this experience, her family did, and she shared this story at her church group, where three other members recognized the exact same script.
How to protect younger family members too
The voice used in these scams is usually a younger family member's. You can help protect them by:
Reviewing their social media privacy settings together. Accounts set to "friends only" make it harder (though not impossible) for scammers to access voice recordings. Public TikTok or Instagram videos are the most accessible source material.
Sharing this information with them. Many younger people don't realize that their casual social media presence can be used to harm their grandparents. The knowledge often motivates them to review their privacy settings.
Establishing the safe word as a family. Make it a clear, spoken agreement that everyone understands and will use.
The bottom line
Voice cloning scams work by exploiting the most powerful human impulse: the desire to protect a family member in danger. The technology is sophisticated, but the protection is simple.
Establish a family safe word. Hang up and call back on a number you know. Never send money before verifying.
Share this article with family members who could be targeted, particularly parents and grandparents. The most effective way to stop these scams is making sure people know what they look like before they receive the call.
For more on protecting yourself against financial scams, also visit our network partner RetirementScamGuide.com.
Sources & further reading
- FTC Consumer Alerts, consumer.ftc.gov, Voice cloning scam reports and statistics
- FBI Internet Crime Complaint Center, ic3.gov, Elder fraud and impersonation data
- AARP Fraud Watch Network, aarp.org/money/scams-fraud, Elder fraud prevention resources