Scammers Are More Sophisticated than Ever — Be Vigilant about the New AI Voice Scam

Q. I heard about a new type of “emergency scam,” where scammers are using artificial intelligence to mimic voices and scam loved ones into sending money. How do these scams work, and what can I do to stop something like this from happening to me or my elderly parents? Thanks for your help!

A. “Emergency scams” have been around for years, but artificial intelligence (AI) has made them far more sophisticated and convincing. In the past, scammers impersonating a loved one had to blame a poor connection or a bad accident to explain why the “loved one’s voice” sounded different. Today, with AI technology, it is almost impossible for the human ear to tell that the voice on the other end of the phone is not the person it purports to be. Let’s look at an example of a grandmother who was nearly the victim of a sophisticated AI emergency scam:

Mary’s 18-year-old granddaughter, Katie, visited her for her 82nd birthday. After the surprise visit, Mary sent Katie off with some of her favorite homemade sweets and wished her safe travels. As a film student who wanted to practice her craft, Katie posted a video tribute to her grandmother with some memories from their visit on social media.

Mary relaxed on her porch that evening, smiling as she watched the beautiful tribute. Later that evening, she answered a phone call from an unknown number and was horrified to hear what sounded exactly like the panicked voice of her granddaughter, who was begging to be saved from kidnappers who said they picked her up along the road.

In reality, Mary’s granddaughter was safe and sound. Scammers had used artificial intelligence to mimic Katie’s voice (captured from the social media video she posted) to try and extort money out of her grandmother in exchange for Katie’s safety. The scammer demanded a hefty sum from Mary to let her granddaughter go.

Mary was so panicked that she nearly had a heart attack. She opened her front door, put the phone on mute, and started screaming for help! A neighbor heard the commotion and called 911. The dispatcher said it sounded like Mary was being targeted by a common scam and asked whether she had spoken to her granddaughter directly.

Mary was finally able to get through to her daughter, who confirmed that her granddaughter was okay and had arrived home safely a few hours earlier!

A New Alarming Trend Emerges

The situation described above is an example of an alarming trend: scammers are becoming more cunning and ruthless through the use of AI. Check Point Software Technologies, one of the largest cybersecurity firms in the world, reports a substantial increase in AI-based scams and attacks over just the past year. Phone and cyber scams, in total, took approximately $10 billion from Americans in 2022, according to the FBI’s Internet Crime Complaint Center. Imposter scams similar to the one described above robbed Americans of $2.6 billion in 2022, according to the Federal Trade Commission.

Eva Velasquez, president and CEO of the Identity Theft Resource Center, says that the center receives reports of AI scam calls that convince victims their relative needs money to pay for damages from a car accident or other incident or to send money for bail. Other scams include using a manager or executive’s voice in a voicemail instructing someone to pay a fake invoice.

How Do These New AI Scams Work?

As AI technology has become readily accessible over the past year or so, criminals are more and more frequently using it to impersonate the voices of our friends and loved ones to trick us into sending them money. A clever scammer with a good AI program doesn’t need much more than a few-second recording of a loved one’s voice to be able to clone the person’s voice and apply their own script. From there, they can play the audio over the phone to convince their victims that someone they love is in a desperate situation and needs money immediately.

Here are the typical steps a scammer would take in an AI phone scam:

  1. Collect the recording: To carry out an AI scam call, criminals first must find a three- to ten-second audio recording of your loved one’s voice, such as a clip from YouTube or a post on Facebook or Instagram.
  2. Feed it to an artificial intelligence tool: Scammers use an AI tool that learns the person’s voice patterns, pitch, and tone—and simulates their voice.
    • Voice-cloning AI models, such as Microsoft’s VALL-E, need to listen to only three seconds of an audio “training” clip of someone speaking to create a replica of their voice.
    • These tools are widely available and cheap or even free to use, which makes them even more dangerous, according to experts.
  3. Put together a script: Once the AI software learns the person’s voice, con artists can tell it to create an audio file of that cloned voice saying anything they want.
  4. Call the potential victim and play the AI-generated clip: The calls might use a spoofed local area code to convince you to answer the phone, but don’t be fooled. Many phone-based fraud scams originate from countries with large call-center operations, such as India, the Philippines, or Russia, according to Velasquez.
  5. Set the trap: The scammer, using your loved one’s AI-generated voice, will claim to be in urgent danger (such as having been kidnapped and needing ransom money, or being in jail and needing bail money) and try to convince you that you must send money immediately in an untraceable way, such as with cash, via a wire transfer, or using gift cards.

Sadly, many victims panic and send the money without independently confirming the truth of what the AI-generated voice told them. “The nature of these scams plays off of fear, so in the moment of panic these scams create for their victims, it is also emotional and challenging to take the extra moment to consider that it might not be real,” says Nico Dekens, director of intelligence and collection innovation at ShadowDragon (a software company).

How Can You Avoid AI Scams?

Unfortunately, AI scams will continue to increase as the technology improves, making it easier for criminals to locate targets and clone voices. And because these capabilities are so new, many people don’t even know they exist and can therefore easily be targeted. That’s why it’s important to take proper precautions to boost your online security, and to spread the word about these scams, for instance by forwarding this newsletter to friends and loved ones to alert them about this dangerous trend. Here are a few suggestions from the experts to avoid being scammed:

Make Your Social Media Accounts Private

Before sharing audio or video clips of yourself on Facebook, Instagram, YouTube, or any other social media platform, adjust your privacy settings so that only people you know and trust can see your posts. Users who keep their posts open to everyone should review and remove audio and video recordings of themselves and loved ones from social media platforms to thwart scammers seeking to capture their voices.

Assign a Secret Phrase

Determine a secret phrase or code word that you can exchange with your loved one ahead of time. That way, if you receive a call alleging that they have been kidnapped or need money right away, you can authenticate that you are indeed speaking to the real person.

Erase Your Digital Footprint

Scammers often rely on the trail of bread crumbs you leave about yourself online, from your pet’s name to your high school mascot, to learn about your life and build a scam around it. Limit the amount of information you share about yourself publicly to lower the risk of these types of scams. Online tools such as DeleteMe can automatically remove your name, address, and other personal details from data brokers, which will make it more difficult for scammers to target you.

Google is working on an updated version of the Results About You tool that’ll alert you when your personal info appears in its search results and will make it easy to request their removal. Read more about this privacy tool in today’s Critter Corner!

What Should You Do if You Receive an AI Scam Call?

If you are on the phone with a purported loved one who is insisting they need you to send money, don’t panic, and don’t believe it without verifying it first through a third party. Experts recommend taking these steps before agreeing to send money to anyone based on an “emergency” phone call:

  • Call your loved one directly using a trusted phone number. If you can’t reach them, try to contact them or find out about them through a family member, friend, or colleague.
  • Ask the caller to verify a detail that only they would know, such as the secret phrase mentioned above.
  • Listen for any audio abnormalities, such as unusual voice modulation or synthetic-sounding voices, to possibly identify a scammer.
  • Write down or screenshot the phone number that called you, so that if you determine that the call is a scam, you can report it to government authorities.
  • Block the number on your phone to avoid receiving another call from it, but understand that scammers can always call you again from a different, spoofed phone number; never let down your guard.
  • Alert law enforcement. They can help you verify whether the call you received is legitimate or a scam.
  • Report the call to your mobile phone carrier so the company can take appropriate action.

Scams, such as AI phone scams, are specially designed to catch you off guard, and they can happen to anyone. There’s nothing to be ashamed of if you think you’re a victim. Keep handy the phone numbers of resources that can help, including the local police, your bank (if money has been taken from your accounts), and Adult Protective Services. You can also report scams online to the FTC. Sharing your experience can help prevent it from happening to another older adult.

Planning to Protect Loved Ones

Protecting seniors from scams is very important, which is why we continually share information about new scams and how you can protect yourself. It is also very important to plan for your future and to help plan for the future of your loved ones. If you or your loved ones have not done Incapacity Planning, Estate Planning, or Long-Term Care Planning, or if you have a loved one who is nearing the need for long-term care or already receiving long-term care, please contact us to make an appointment:

Fairfax Elder Law: 703-691-1888
Fredericksburg Elder Law: 540-479-1435
Rockville Elder Law: 301-519-8041
Annapolis Elder Law: 410-216-0703
DC Elder Law: 202-587-2797

About Evan H Farr, CELA, CAP

Evan H. Farr is a four-time best-selling author in the field of Elder Law and Estate Planning. In addition to being one of approximately 500 Certified Elder Law Attorneys in the country, Evan is one of approximately 100 members of the Council of Advanced Practitioners of the National Academy of Elder Law Attorneys and is a Charter Member of the Academy of Special Needs Planners.
