Watch out for “voiceprinting”: Scammers record your voice for “deepfake” impersonations

Last Updated: September 6, 2024

Fraudsters are using all sorts of AI technology to steal money, data, and more. One of the latest schemes is “voiceprinting”: thieves capture a recording of your voice and use it to create “deepfake” impersonations, which they then use to contact your bank and other financial institutions. “That voiceprint can be used to access your insurance or financial institution,” says Michael Bruemmer, vice president of data breach resolution and consumer protection at Experian. Specifically, scammers have been using deepfake voice copies to call banks and request that funds be transferred to accounts under their control.

Scammers might record any “yes” response and use it to voice-authorize charges to a specific account, or present that recording as “proof” that you opted in to the call in the first place. They might also go a step further, drawing you into light conversation to obtain a longer voice sample that they can then clone and manipulate with AI software.

These criminals also use this scheme to create deepfake voices that imitate trusted individuals: family members, friends, corporate executives, public officials, celebrities, or even representatives of banks and other institutions.

What you can do to protect yourself

Sadly, because of these new AI technologies, we must all accept that the era of casual phone conversations with strangers is pretty much over (even a seemingly innocent wrong-number call could be an attempt to capture your voice). This doesn’t mean you have to become hyper-paranoid, but try to limit phone conversations to family members and close friends. When you do need to contact a company’s customer service department, make sure you are calling the correct number, taken from the company’s official website and/or printed packaging.

“If someone outside your circle needs to get hold of you, they can text you,” Bruemmer explains. He warns that even calls that appear to be from known numbers could be coming from a phone that’s been stolen or had its SIM card cloned.

From the FCC:

Deep-Fake Audio and Video Links Make Robocalls and Scam Texts Harder to Spot

You have probably heard about Artificial Intelligence, commonly known as AI. Well, scammers have too, and they are now using AI to make it sound like celebrities, elected officials, or even your own friends and family are calling.

Also known as voice cloning, these technologies emulate recognizable voices for robocalls to consumers and are often used in imposter scams that spread misinformation, endorse products, or steal money and personal information. Scammers may try to fool an unsuspecting grandparent into believing that a grandchild is in trouble and needs immediate financial assistance, or solicit donations to a fake charity endorsed by what sounds like a trusted celebrity.

Consumers should also watch out for text messages that may include links to AI-generated “deep fake” videos featuring celebrities and political figures. The Better Business Bureau (BBB) reports that consumers have been pushed by deep-faked celebrities Gordon Ramsay, Taylor Swift, and Jennifer Garner to counterfeit websites offering cookware. While the videos look real, the celebrity in each video is generated using AI and did not actually film the message or request for support.
