Our contestants had a blast playing “Claims Jeopardy” at the Alabama State Bar Meeting. This month, we will highlight some of the categories, clues, and questions from the game. Check them out and see how well you play.
Category: Hacked
Clue: “A phone call from a guy who sounds like my client. He’s changing wiring instructions on a deal we are doing.”
Question: “What is an AI Audio Scam?”
An AI voice scam is a sophisticated hoax in which criminals use artificial intelligence to replicate human voices, impersonating trusted individuals or organizations to deceive victims into revealing sensitive information or sending money. When conducted by telephone, it is known as “vishing” (voice phishing). This is similar to the scam that may have fooled your grandmother into sending money to someone posing as you when you were supposedly in jail and needed bail money. But it’s much more sophisticated than that now. And far more convincing. With AI, voice cloning can make the caller sound just like you. Or your Managing Partner. Or your controller. Or someone at your bank. Or a family member. Maybe they tell you to transfer money, or they change wiring instructions. Or maybe they get you to provide a password, bank account number, or Social Security number.
Bad guys are using AI-generated deepfakes to steal REAL money. Already in 2024, fraudsters posed as a company’s CFO and other staff on a video conference call, using faked likenesses and voices to trick a finance employee in Hong Kong into transferring $25 million to them. A Brooklyn, New York, couple was awakened after midnight by a call that appeared to come from the husband’s parents but in fact used AI-generated snippets of the parents’ voices sounding frantic. Then an unknown male voice demanded a Venmo transfer, threatening to kill the parents. Fortunately, the parents were completely safe and asleep in their beds (and Venmo refunded the money). Fraudsters are also using AI-generated images to create fake ID cards that can fool facial recognition software. These are very dangerous scams that can target literally anyone or any organization – and lawyers are not immune.
How do the bad guys do it? With inexpensive and readily accessible software called an AI voice generator. The technology itself is legitimate; it becomes a weapon only in the hands of scammers, who can access an app, pay a small monthly fee, and clone a voice in a few seconds.
Know what to look for to detect an AI Voice Scam:
Are there unnatural pauses or robotic-sounding speech?
Is the caller making unexpected requests for personal or financial information?
Are there inconsistencies in the caller’s story?
Can you verify the caller's identity through a trusted source?
Protect yourself from AI voice cloning scams:
Be wary of unsolicited calls requesting personal information.
Verify the caller’s identity through a trusted, independent channel, especially if they are seeking sensitive data.
Avoid sharing personal or financial information over the phone unless you are positive the caller is legitimate.
Add an extra layer of security by using two-factor authentication for sensitive accounts.
If in doubt, hang up, and independently verify the caller’s identity before providing any information.
How to avoid being cloned:
Use a spam-blocking service on your cell phone, and don’t answer calls identified as “potential spam.”
If you answer and no one is there, don’t keep repeating “hello, hello, hello” – a scammer may be recording your voice.
Be mindful of sharing voice recordings or personal information online. Think about your firm or cellular voicemail – a bad actor can capture your voicemail greeting from your office, home, or cell phone and replicate your voice from it in a few seconds. Also, think about the videos you have posted to social media – your voice is out there for scammers to steal.
Regularly review privacy settings on social media platforms and limit access to your personal data.
Avoid participating in voice challenges or quizzes that require recording your voice.
Consider using voice authentication or biometric security measures where available.