Be skeptical and know what to look for.
I’ve been doing a lot of reading and listening to essays and podcasts discussing scams that are tricking people out of scary amounts of money.
With AI entering the realm, it’s only getting worse. The risk and danger are higher than ever.
The best defense? Well, besides a healthy dose of skepticism, it’s education: knowing what to look for.
Here’s a summary of a more extensive list of scams from the FBI.
Protect yourself from AI scams
AI is making scams harder to spot by creating fake messages, images, voices, and videos that look real. Protect yourself by staying skeptical, verifying claims, and recognizing red flags like urgency or untraceable payments. If scammed, report it to authorities and share your experience to help others stay vigilant.
AI and Fraud
Criminals use AI to make scams more believable and reach more victims. This is incredibly important to be aware of. Even if things look legit, they may not be.
Fake text. AI is used to create fake messages for phishing emails, romance scams, investment fraud, and more. It’s used to make fake websites and social media profiles look real. Even if the usual red flags of poor spelling and grammar are missing, that doesn’t mean the text you’re reading is legitimate.
Fake images. AI can create realistic photos for social media, fake IDs, or other scams. Scammers use fake photos to get donations, employ blackmail, or legitimize other types of scams. “Seeing is believing” is no longer true.
Fake voices. Using a technique called audio cloning, scammers use AI to mimic voices to sound like loved ones or famous people to trick you into sending or spending money. This is often combined with urgent telephone calls using your loved one’s voice to ask for help or access to your accounts. More on urgency below.
Fake videos. AI is being used to create videos of fake authority figures, or “proof” that someone is real. These are then used to bolster fake investment or cryptocurrency scams and fraudulent job offers.
And as we’ve seen, AI is just getting better and better.
If anything can be fake, what can we do?
Protect yourself from AI scams
Here are steps to take and things to look for.
- Create a family password to confirm you’re talking to who you think you are.
- Look closely at photos and videos for details that look “off”, like odd hands or fake-looking faces.
- It’s not always practical, but if you can, limit sharing your voice or photos online so they can’t be copied.
- Hang up and verify incoming calls by contacting the company or person directly yourself.
- Don’t answer calls from numbers you don’t recognize.
- Don’t trust Caller ID; it can easily be spoofed.
- Don’t send money or share personal info with people you’ve only met online.
- Never ever trust urgency. The more desperate they sound and the more insistent the timeline, the more suspicious you should be. It’s the #1 sign of a potential scam.
- Never make payments using gift cards, cryptocurrency, gold bars [1], or other methods that cannot be challenged and reversed.
In short: familiarize yourself with the signs and be skeptical.
If it happens to you
Even the best-prepared can fall victim. Someday it may be you.
Do not be ashamed. This stuff is happening. As you can see from the lists above, it can be easy to let your guard down.
Instead of being embarrassed and silent, in the United States, report it to the FBI’s IC3 (Internet Crime Complaint Center) at ic3.gov. They’re overwhelmed, as you might imagine, and you may not get a response, but your details still help the FBI track down scammers. In other countries, look for a similar authority to report to.
Consider sharing your story with friends. It can be the wake-up call people need to understand they could be scammed, too.
Do this
Familiarize yourself with the signs and techniques above.
Be skeptical, even of things you’ve never had to be suspicious about before.
Report it if it happens to you.
Footnotes & References
[1]: Yes. Apparently, it’s a thing.
Reference: This article was based primarily on this recent and extensive Public Service Announcement from the FBI: Criminals Use Generative Artificial Intelligence to Facilitate Financial Fraud.