Worried About AI Voice Clone Scams? Create a Family Password


Your grandfather receives a call late at night from a person pretending to be you. The caller says that you are in jail or have been kidnapped and that they need money urgently to get you out of trouble. Perhaps they then bring on a fake police officer or kidnapper to heighten the tension. The money, of course, should be wired right away to an unfamiliar account at an unfamiliar bank. 

It’s a classic and common scam, and like many scams it relies on a scary, urgent scenario to override the victim’s common sense and make them more likely to send money. Now, scammers are reportedly experimenting with a way to further heighten that panic by playing a simulated recording of “your” voice. Fortunately, there’s an easy and old-school trick you can use to preempt the scammers: creating a shared verbal password with your family.
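The password can be anything memorable that a scammer is unlikely to guess, and picking it at random is better than picking a word everyone already associates with your family. As a minimal sketch of one way to do that, the following Python snippet draws words using the standard library's secrets module; the word list here is just an illustrative sample, not a recommended list:

    import secrets

    # Illustrative sample word list; in practice, draw from a much larger
    # list (or simply open a dictionary to a random page together).
    WORDS = ["marmalade", "porcupine", "trombone", "glacier", "sassafras",
             "lighthouse", "tangerine", "bulldozer", "nutmeg", "periwinkle"]

    # secrets.choice uses a cryptographically strong random source, unlike
    # random.choice, so the result can't be predicted from a seed.
    password = " ".join(secrets.choice(WORDS) for _ in range(2))
    print("Family password:", password)

However you pick it, the password only works if everyone agrees to ask for it whenever an urgent call demands money, and never to share it over an unexpected call.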

Creating audio deepfakes of a person's voice with machine learning, using just minutes of recorded speech, has become relatively cheap and easy. There are myriad websites that will let you make voice clones. Some offer a roster of celebrity voices that will say whatever you type, while others let you upload a recording to create a voice clone of anyone you have audio of. Scammers have figured out that they can use this to clone the voices of regular people. Suddenly your relative isn't hearing someone who sounds like a complete stranger; they're hearing your own voice. This makes the scam much more convincing.

Voice generation scams aren’t widespread yet, but they do seem to be happening. There have been news stories and even congressional testimony[…]
