
Deepfake Scams – When It Looks Real, But Isn’t

April 21, 2025

Scams have always relied on deception, but now they’re entering a new era—one where seeing and hearing isn't believing. Deepfake scams use artificial intelligence to create fake videos, audio, or images that look and sound real. The results can be shockingly convincing—and dangerous.

From impersonating CEOs to mimicking loved ones, scammers are using these fabricated voices and faces to steal money, access sensitive information, and manipulate emotions. Here's how to stay ahead of the deception.

How It Works

Deepfake scams use AI to generate realistic audio or video that mimics someone's voice or face, often trained on clips pulled from social media or other public videos. Here's how the scam might unfold:

  1. The scammer creates audio or video of a trusted person—like a boss, family member, or public figure.
  2. They send that fake recording, requesting money, access, or sensitive information, typically through email, text, or messaging apps.
  3. Because the message looks and sounds real, people are more likely to follow through without questioning it.
  4. The scammer vanishes after receiving payment or data, leaving the victim confused and betrayed.

These scams are especially effective in corporate settings, where urgency and hierarchy make it hard to say no to what appears to be a direct request from leadership.

What It Looks Like

You get a voicemail from your child saying they've been in a car wreck or are in jail. They need money immediately. You panic. But when you call them back, they're fine. That call? A sophisticated ruse.

or

You’re sitting at work when you get a video message from your CEO on Slack. She says there’s a surprise acquisition happening and she needs you to wire $10,000 to a partner ASAP. Her tone is serious. Her face and voice seem real. You do it—only to find out hours later she never sent that message.

How to Spot Them

  • Unusual requests for money, gift cards, or sensitive access—even from a “trusted” source—should raise red flags.
  • Urgency or secrecy ("Don't tell anyone else") is a manipulation tactic.
  • Verify requests through a second method—call or text the real person directly before acting.
  • Check email or message headers for spoofed addresses (a short example follows this list).
  • Be cautious about what you post online—the more video and audio of you that’s public, the easier you are to mimic.
  • Agree on a secret phrase with your loved ones, one that outsiders wouldn't know, so you can verify each other in a real emergency.
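
If you're comfortable with a little code, the header check above can be made concrete. The Python sketch below is only an illustration, not part of the original advice: it assumes you've saved the suspicious email from your mail client as a file named suspicious.eml (that file name and the simple domain-mismatch heuristic are assumptions for the example). It reads the From, Reply-To, and Return-Path headers and flags domains that don't match, a common sign of spoofing.

    # Minimal, illustrative sketch: inspect a saved email's headers for
    # common signs of spoofing. "suspicious.eml" is a placeholder; save
    # the message from your mail client first (e.g. "Show original").
    from email import policy
    from email.parser import BytesParser
    from email.utils import parseaddr

    with open("suspicious.eml", "rb") as f:
        msg = BytesParser(policy=policy.default).parse(f)

    display_name, from_addr = parseaddr(str(msg.get("From", "")))
    _, reply_to = parseaddr(str(msg.get("Reply-To", "")))
    _, return_path = parseaddr(str(msg.get("Return-Path", "")))

    print("Display name:", display_name)
    print("From address:", from_addr)
    print("Reply-To:    ", reply_to or "(none)")
    print("Return-Path: ", return_path or "(none)")

    # Red flags: a trusted-looking display name paired with an unfamiliar
    # address, or a Reply-To / Return-Path pointing to a different domain
    # than the From address.
    from_domain = from_addr.rsplit("@", 1)[-1].lower() if "@" in from_addr else ""
    for label, addr in [("Reply-To", reply_to), ("Return-Path", return_path)]:
        if addr and "@" in addr and addr.rsplit("@", 1)[-1].lower() != from_domain:
            print(f"Warning: {label} domain does not match the From domain.")

You can do the same check by eye: open the message details in your mail client and confirm that the actual sending address, not just the display name, belongs to the person it claims to be.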

As deepfake technology evolves, staying skeptical is your best defense. Don’t trust appearances alone—verify before you act. A few extra minutes could save you from a costly scam.
