That’s Not a Real Soldier: AI Scams Targeting Veterans and Families
A new awareness campaign is raising alarms about how easily criminals can fabricate faces and voices to pose as military personnel — targeting veterans and their families. Using AI-powered deepfakes, scammers impersonate soldiers or loved ones to manipulate victims into giving up money, sensitive information, or access to accounts. (Ground News, via militarytimes.com coverage)
Why This Matters for Disabled Veterans
- Trust Undermined: Veterans and their families are often emotionally vulnerable, and AI deepfakes let scammers weaponize that trust in ways ordinary cons cannot.
- Sophistication Not Limited to High-Tech Users: These aren’t random phishing emails; cloned voices and fabricated videos make these scams feel disturbingly real. In one widely reported case, a deepfake used on a single phone call defrauded a company of tens of millions of dollars. (Barron’s analysis, Military Times discussion)
- Everyone Is a Target: JPMorgan Chase warns that AI-driven scams caused a record $16.6 billion in losses in 2024, a 33% jump from 2023. No one is immune. (Military Times, Investopedia via King’s warning)
Real Cases and Broader Concern …
- “Safe Word” Saves Lives: A highly publicized case involved a family whose grandchildren’s voices were cloned to fake a kidnapping. Experts recommend establishing a unique, shared “safety code” phrase, something only family members would know, to confirm identity.
- At-Risk Veterans: Congressional testimony in the 2020 “Hijacking Our Heroes” report highlighted that veterans are especially targeted online, not just for scams but for disinformation and identity theft.
What You Can Do Right Now …
- Establish a “safe phrase.” Ask for it if you get a call that seems urgent and personal.
- Stop and verify. Never act on a message that pressures you to move fast. Pause, then call the person back on a known, trusted line.
- Watch for AI red flags. Flat-tone voices, oddly perfect grammar, over-the-top emotion, and manufactured urgency are all warning signs.
- Protect your digital footprint. Limit the voice and video clips you share online; scammers can clone a voice from even a few seconds of audio.
- Report suspicious incidents. Notify the VA, your benefits office, or your command leadership about fraudulent content or impersonations.
Final Thoughts …
Let’s not let the technology that saved us, through medical advances, logistics, and accessibility, be used to hurt our community. AI can mimic nearly anything, but it can’t replicate loyalty, values, or your instincts.
Protect your digital lane. Stay skeptical. And when in doubt: verify.
For more veteran-focused updates on digital safety, visit DisabledVeterans.org.