That’s Not a Real Soldier: AI Scams Targeting Veterans and Families

A new awareness campaign is raising alarms about how easily criminals can fabricate faces and voices to pose as military personnel — targeting veterans and their families. Using AI-powered deepfakes, scammers impersonate soldiers or loved ones to manipulate victims into giving up money, sensitive information, or access to accounts. (Ground News, via militarytimes.com coverage)

Why This Matters for Disabled Veterans

  • Trust Undermined: Veterans and their families are often emotionally vulnerable, and AI deepfakes let scammers weaponize that trust in ways older schemes never could.
  • Sophistication Not Limited to High-Tech Users: These aren’t random phishing emails; cloned voices and fabricated video make these scams feel disturbingly real. In one widely publicized deepfake case, a single phone call defrauded a company of tens of millions of dollars. (Barron’s analysis, Military Times discussion)

Real Cases and Broader Concern

  • “Safe Word” Saves Lives: A highly publicized case involved a family whose grandchildren’s voices were cloned to fake a kidnapping. Experts recommend establishing a unique, shared “safety code” phrase, something only family members would know, to confirm identity.

What You Can Do Right Now

  • Establish a “safe phrase.” Ask for it if you get a call that seems urgent and personal.
  • Stop and verify. Never act on a message that pressures you. Pause, then call the person back on a known, trusted line.
  • Watch for AI red flags: flat, monotone voices, oddly perfect grammar, over-the-top emotion, and artificial urgency are all warning signs.
  • Protect your digital footprint. Restrict sharing of voice or video clips online — scammers can train AI on even brief clips.
  • Report suspicious incidents. Notify your VA benefits office or your command leadership about fraudulent content or impersonations.

Final Thoughts

Let’s not let the technology that saved us — through medical advances, logistics, and accessibility — be used to hurt our community. AI can mimic nearly anything, but it can’t replicate loyalty, values, or your instincts.

Protect your digital lane. Stay skeptical. And when in doubt: verify.

For more veteran-focused updates on digital safety, visit DisabledVeterans.org.
