AI voice clones can be used to impersonate loved ones and scam you and your family. Be careful, and think of strategies to defend against it.
Journalist Evan Ratliff cloned his voice with AI and had it talk to scammers. On Big Technology Podcast, he talks about the dangers of AI voice scams, and what you might want to consider to fend them off:
Spotify: https://spoti.fi/32aZGZx
Apple: https://apple.co/3AebxCK
Etc. https://lnkd.in/g4pQA-qD
This is the greatest scamming technology that has ever been invented. It's already being deployed for scams as we speak, including volume scams, where you can just use AIs to call people all the time, narrow down the number of marks, and then send them to a human operator to close the deal, basically. And these kinds of personalized scams, where you can clone someone's voice off of their Instagram, or anywhere they've appeared in video where their voice is there. All you need is a few seconds and you can clone their voice. You can look up their relatives. You can call a relative and use your AI to say, in that voice, "I'm in trouble. I need a lawyer," or "I have a lawyer, and the lawyer needs money," or "I've been in an accident." It's called the grandparent scam, and these are happening every day, all over the country. And that's just the very first level of scamming that people are attempting.
So I think people have to be aware now. The great thing is, if you're aware of it, you can actually prevent it: if you talk to people about it, if you tell your relatives, "I'm not going to call you in this way," or "If you get a call like this, watch out for it," or "If you get a call like this, text me and ask me if it's really me." There are ways around it, but it's just the tip of the iceberg in terms of the way this technology will be used to try to separate people from their money.
Yeah. You know, obviously there's a concern in my family, because my voice is all out there.
You're quotable, man.
I have been cloned. My podcast audio was used to clone me, with ElevenLabs, yeah. I once embraced the technology, but I also know the risks.
And so with my family, we have a rule: if any of us ever calls and says, "I'm in a distressed situation, I need help, I need money," we have a code word, created, you know, in the privacy of our own home, and you have to use that code word. That's when we know it's real. There's no way for the AI to know that.
I hope so. Yeah. Until you train up an AI to be like you, and then it shares it with other AIs.