AI ‘VOICE CLONING’ SCAMS

Dr. Sherry McCoy, PhD – Stop Senior Scams ℠ Acting Program

 

The rapid development of AI (Artificial Intelligence) scams, especially those involving 'voice cloning,' is a serious cause for concern for seniors. Let's take a look at some of those developments and consider ways to protect ourselves from criminals who are literally intent upon using our voices, and the voices of our loved ones, against us.

 

AI ‘VOICE CLONING’ SCAMS

In his March 5, 2023, Washington Post article, "They Thought Loved Ones Were Calling. It Was An AI Scam" (see REFERENCES below), Pranshu Verma outlines the damage that can result from imposter scams that use AI 'voice cloning.' A popular scam of this sort is the "grandparents' scam," an imposter scheme in which the con artist pretends to be your grandchild, claims to be in trouble (a car accident, jail, etc.), and urgently needs you to send money, usually via wire transfer, gift cards, or cryptocurrency, payment methods that are difficult or nearly impossible for law enforcement to track. Please note that criminals use this same approach to impersonate a spouse, a friend, or another family member. So, if you receive a desperate phone call in the middle of the night from someone claiming to be your grandchild, spouse, or friend, someone who sounds just like them and uses the same expressions and cadence, you are likely to respond with panic, believing your loved one is in serious trouble and needs your immediate help! That is exactly what scammers hope for, and it's why they've upped their game by using AI 'voice cloning' technology to imitate your voice or the voices of your loved ones.

 

According to Hany Farid, professor of digital forensics at the University of California at Berkeley, AI voice-generating software analyzes what makes a person's voice unique, including age, gender, and accent, then searches a vast database of voices to find similar ones and predict patterns. It can then re-create the pitch, timbre, and individual sounds of a person's voice to produce an overall effect that is similar, and it needs only a short sample of audio, taken from places such as YouTube, podcasts, commercials, TikTok, Instagram, or Facebook videos. As Farid noted, "Two years ago, even a year ago, you needed a lot of audio to clone a person's voice. Now … if you have a Facebook page … or if you've recorded a TikTok and your voice is in there for 30 seconds, people can clone your voice." (See Washington Post under REFERENCES below.)

 

Sadly, impersonation scams are on the rise in the U.S., and cheap, readily available AI technology is making it easier than ever for fraudsters to commit these crimes. Powered by AI, a slew of inexpensive online tools can turn an audio file into a replica of a voice, allowing a swindler to make it "speak" whatever they type. According to the Federal Trade Commission (FTC), there were over 36,000 reports in 2022 of people victimized by scammers pretending to be friends or family members in distress. FTC officials indicated that over 5,100 of those incidents occurred over the phone and accounted for more than $11 million in losses. In addition, the courts, law enforcement, and federal regulators are ill-prepared to handle these scams. Victims of AI voice-cloning scams often have few leads to help identify the perpetrators, and because scammers operate from locations around the world, it is difficult for law enforcement to trace the funds and the calls. Finally, there is little legal precedent for courts to hold the companies that make these tools accountable for their misuse. (See Washington Post under REFERENCES below.)

 

If you're interested in a powerful but short video (2 minutes, 17 seconds) on AI 'voice cloning' technology, you might want to watch Gadi Schwartz's March 13, 2023, spot on NBC Nightly News, "Artificial intelligence can realistically replicate voices, raising new tech concerns." It discusses how the new AI 'voice cloning' technology can be used for good or for bad, including misinformation, abuse, and fraud. (See NBC News under REFERENCES below for the link.)

 

HOW TO PROTECT YOURSELF

Please note that scammers are applying AI 'voice cloning' technology to other scams besides the grandparents' scam, including the 'Virtual Kidnapping Ransom' scam, IRS scams, Social Security scams, and 'Can You Hear Me Now?' scams. Here are some tips to help you steer clear of AI 'voice cloning' scams.

 

MAKE sure your privacy settings on social media accounts allow only people you know to see your posts, comments, and pictures. Scammers are always on the lookout for ways to glean your personal information and use it against you.

 

IF you get a call from someone claiming to be a grandchild (or spouse or friend) asking for money to help with an alleged emergency of some kind, just hang up! Immediately call the grandchild (spouse or friend) on a phone number you know is legit, to make sure they are OK and safe. Also, contact other family members to make sure there is no actual emergency.

 

SET UP a "code word" between you and your grandchild (spouse or friend), a word that only the two of you know. Then, if you get a phone call from someone claiming to be your grandchild (spouse or friend), or a friend of your loved one, asking for money to help with an alleged emergency, you can ask the caller what the code word is. If the caller doesn't know the code word, you will know the call is not legit. Hang up!

 

DON’T volunteer personal information. Scammers fish for information they can use to make their impersonation more authentic. For example, if a caller says something like “Hi grandma! It’s me!” don’t say the name of your grandchild. Let the caller say it.

 

REMEMBER: Be discerning in what you hear. “Just because you hear it, doesn’t mean you should believe it.” – Gadi Schwartz (“Artificial intelligence can realistically replicate voices, raising new tech concerns,” NBC Nightly News, March 13, 2023.)

 

DON'T wire money, send cash, or buy gift cards for someone claiming to be your grandchild (spouse or friend). Con artists ask for money via these methods because such payments are very difficult to track.

 

TRUST your instincts. As the American Bar Association puts it, "If something doesn't feel right, it probably isn't!"

 

NOTE: If you sent money through Western Union (WU) to someone you suspect may be a scammer, call their hotline immediately at 800-448-1492. Likewise, if you used MoneyGram (MG) to send money to a suspected scammer, call their hotline ASAP at 800-926-9400. If the money has not yet been picked up, WU or MG may be able to stop the transfer and refund your money.

 

Be Empowered. Find Your Voice. Speak Out About Fraud!

 

WHERE TO REPORT SCAMS

 

Federal Trade Commission at 877-382-4357 or online at https://www.ftccomplaintassistant.gov/#crnt&panel1-1.

For questions about Medicare fraud / abuse, contact the Senior Medicare Patrol (SMP) at 1-855-613-7080.

U.S. Senate Special Committee on Aging’s Fraud Hotline at 1-855-303-9470.

 

REFERENCES

 

“They Thought Loved Ones Were Calling. It Was An AI Scam,” Washington Post, Pranshu Verma, March 5, 2023.

https://www.washingtonpost.com/technology/2023/03/05/ai-voice-scam/?pwapi_token=eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJzdWJpZCI6IjQzOTMxNzkyIiwicmVhc29uIjoiZ2lmdCIsIm5iZiI6MTY3Nzk5MjQwMCwiaXNzIjoic3Vic2NyaXB0aW9ucyIsImV4cCI6MTY3OTI4NDc5OSwiaWF0IjoxNjc3OTkyNDAwLCJqdGkiOiI1MGVlMjdiNC1hNDZhLTQ0ZWUtOGJlNS0wMTUyZDM1ODQ0YjgiLCJ1cmwiOiJodHRwczovL3d3dy53YXNoaW5ndG9ucG9zdC5jb20vdGVjaG5vbG9neS8yMDIzLzAzLzA1L2FpLXZvaWNlLXNjYW0vIn0.GUZ5rfYH_eK59PD5c9ydrIHj0moZgHIAMopsbiPvfF4

 

“Artificial intelligence can realistically replicate voices, raising new tech concerns,” NBC News, Gadi Schwartz, March 13, 2023.

https://www.nbcnews.com/nightly-news/video/artificial-intelligence-can-realistically-replicate-voices-raising-new-tech-concerns-165109829933

 

Remember: You may be a target, but you don't have to be a victim!

Dr. Sherry McCoy, PhD is a freelance writer & actor for the Stop Senior Scams ℠ Acting Program (SSSAP) in Los Angeles. Follow SSSAP on Facebook at https://www.facebook.com/SSSAP2016/?fref=ts. For more info re: SSSAP, contact Adrienne Omansky at SSSAP4U@gmail.com. Questions for the writer should be directed to “Dear Sherry” at Not Born Yesterday! P.O. Box 722, Brea, CA 92822 or nbynews@juno.com.

 

SAVE THE DATE!

 

May is National Senior Fraud Awareness Month. Check back for our program date in our May NOT BORN YESTERDAY issue or on the SSSAP Facebook page at https://www.facebook.com/SSSAP2016.

 

Photo below: The SSSAP 2023 ensemble cast rehearsing a new program for Senior Fraud Awareness Month. We encourage senior organizations to hold a program for this important designation.

 
