Protecting Your Family from AI Voice Cloning Scams

This report includes descriptions of an AI-enabled extortion scam involving threats of violence against a minor. Some readers may find the scenario distressing, especially parents or caregivers. Reader discretion is advised.

Jennifer DeStefano was at her youngest daughter’s dance rehearsal when her phone rang. Unknown number. She almost didn’t answer, but on the final ring she picked up, thinking it might be a hospital or doctor.
“Hello,” she said, putting the call on speaker as she walked through the parking lot.
Her 15-year-old daughter Brianna was sobbing on the other end. “Mom. Mom, I messed up.”
Jennifer thought she’d hit the ski gates during training. That had happened before. “Ok, what happened?”
Then a man barked at Brianna: “Lay down and put your head back.”
“MOM THESE BAD MEN HAVE ME. HELP ME, HELP ME!!”
A different man grabbed the phone and threatened to drug Brianna, assault her, and dump her in Mexico if Jennifer didn’t pay $1 million. All the while, Brianna was screaming in the background [1].
Jennifer was certain she was hearing her daughter. Not just the voice, but the specific way Brianna cried when she was terrified. The sobs that were unique to her. She told the Senate months later that it was impossible to fake [1].
Brianna was asleep at the ski lodge. Safe. But in those minutes before Jennifer found out, she had zero doubt her daughter had been kidnapped.
This happened in April 2023, but cases just like it have kept piling up ever since.
The FTC reported fraud losses exceeding $10 billion in 2023, with AI voice cloning flagged as an emerging threat [2]. In March 2025, Consumer Reports tested six major voice cloning services and found that four of the six lacked meaningful safeguards to prevent this exact type of fraud [3].
Then there’s the May 2023 McAfee survey, which found that 77% of AI voice scam victims lost money, and that 25% of adults had either experienced an AI voice scam themselves or knew someone who had [4].
The technology works with just three seconds of audio [5]. That means any TikTok your teenager posted and any cringy voicemail greeting you recorded can be used against you. Any video where your voice is audible, really. Scammers can feed those clips into AI systems that capture pitch, tone, accent, and speech rhythm to create a clone that sounds exactly like you.
I’ve dug into this issue to find the best strategies for staying protected, and I’m sharing what I found with you.
First, I’ll explain how the technology actually works and where scammers get voice samples. Then we’ll look at the verification systems that security researchers and banks are now recommending to families.
So, to start, let’s look at exactly how voice cloning works, and why it has become so easy to pull off.
How Voice Cloning Went From Science Fiction to Three-Second Reality
Cloning a voice used to require hours of recordings and serious programming knowledge. The barrier collapsed in the last few years.
The technology relies on neural networks called autoencoders. These systems compress audio into two key pieces: what was said (the content) and how it was said (the speaker’s unique vocal characteristics). The AI captures pitch, tone, accent, rhythm, and even emotional inflection, then combines those elements to generate new audio in that person’s voice saying things they never actually said [5]. Spooky, right?
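The split described above can be sketched in a few lines of toy Python. This is not real voice synthesis — the “encoders” here just slice an array, where a real system learns the split from data — but it shows the core idea: pull the content out of one clip, the speaker characteristics out of another, and recombine them into a clip that was never recorded.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for the two learned encoders:
# one extracts WHAT was said, the other WHO said it.
def content_encoder(audio):
    # pretend the first half of the signal carries the words
    return audio[:4]

def speaker_encoder(audio):
    # pretend the second half carries pitch, tone, and accent
    return audio[4:]

def decoder(content, speaker):
    # recombine the two pieces into a new "clip"
    return np.concatenate([content, speaker])

alice = rng.normal(size=8)   # a short clip of Alice speaking
bob   = rng.normal(size=8)   # a short clip of Bob speaking

# The clone: Alice's words, rendered in Bob's voice
fake = decoder(content_encoder(alice), speaker_encoder(bob))

print(np.allclose(fake[:4], alice[:4]))  # words came from Alice
print(np.allclose(fake[4:], bob[4:]))    # voice came from Bob
```

In a real cloner, both encoders are neural networks trained on thousands of voices, which is why a three-second sample of “who said it” is enough: the system already knows what human speech looks like in general and only needs to fill in one speaker’s specifics.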