A man in Canada got a phone call from his son.

His son sounded panicked. He said he had been in a car accident and needed money immediately for legal fees.

The father recognised the voice instantly. It sounded exactly like his son.

So he sent the money.

The problem: his son had never called.

The voice was generated using AI.

This is called a voice cloning scam, and it’s becoming one of the fastest-growing forms of fraud. Scammers can now recreate someone’s voice using only a short audio clip taken from social media, videos, or voice messages.

And the people most likely to fall for it are often parents and grandparents.

How the Scam Actually Works

The process is surprisingly simple.

Step 1: They find your voice.

Most people have short clips of their voice online without even realising it:

  • Instagram stories

  • TikTok videos

  • YouTube clips

  • voice messages on apps like WhatsApp

Modern voice cloning tools can recreate a voice using less than 30 seconds of audio.

Step 2: They clone it.

Using AI voice software, scammers generate a model that can speak in your tone, accent, and rhythm.

To the human ear, it can sound extremely convincing.

Step 3: They call your family.

The scammer then phones a parent or relative pretending to be you.

The call is built on panic and urgency.

Common stories include:

  • “I’ve been arrested”

  • “I’ve been in an accident”

  • “My phone was stolen”

  • “I need money urgently”

The goal is simple: create panic so the person sends money before thinking.

Why This Scam Works

For years, people relied on a simple rule:

If I hear their voice, it must be them.

That rule used to be safe.

It isn’t anymore.

AI voice cloning breaks the last layer of trust people have on the phone.

The Simple Defence Most Families Don’t Use

There is a simple way to protect your family against this.

Create a family verification rule.

It can be either:

A safe word, for example: “Purple lighthouse.”

Or a private question only your family knows the answer to, like:

  • “What was the name of our first dog?”

  • “Where did we go on holiday in 2015?”

  • “What was my childhood nickname?”

If someone calls asking for urgent help, ask the question.

If they can’t answer it, assume something is wrong.

AI can copy a voice, but it can’t guess a personal memory it has never heard before.

One More Rule That Stops Most Scams

Never send money during a panic call.

Instead:

  1. Hang up

  2. Call the person back on a number you already have saved

  3. Or reach them on another app, like WhatsApp or FaceTime

If the situation is real, they will confirm.

If it’s a scam, the story usually falls apart very quickly.

Your Task This Week

Take 30 seconds and warn your parents or family about AI voice scams.

Then pick a family verification question or safe word.

It might feel unnecessary today.

But as AI gets better at copying voices, this simple habit could stop a very expensive phone call before it starts.

Have a great week, everyone 🙂

Jamie

88% resolved. 22% stayed loyal. What went wrong?

That's the AI paradox hiding in your CX stack. Tickets close. Customers leave. And most teams don't see it coming because they're measuring the wrong things.

Efficiency metrics look great on paper. Handle time down. Containment rate up. But customer loyalty? That's a different story — and it's one your current dashboards probably aren't telling you.

Gladly's 2026 Customer Expectations Report surveyed thousands of real consumers to find out exactly where AI-powered service breaks trust, and what separates the platforms that drive retention from the ones that quietly erode it.

If you're architecting the CX stack, this is the data you need to build it right. Not just fast. Not just cheap. Built to last.