Scammers are constantly looking for new ways to rip people off, and with the emergence of artificial intelligence-powered tools, a new kind of scam is making the rounds.

Nefarious actors are always searching for new ways to pull the wool over unsuspecting victims' eyes in hopes of manipulating them into sending over thousands of dollars. Scams come in various forms, and it appears the next big con going around involves artificial intelligence-powered tools that can clone an individual's voice. The scam is relatively simple.
Nefarious actors pick a target family, making sure one of the family members has audio of their voice uploaded to the internet. This audio could live on any public platform: YouTube, Facebook, TikTok, Instagram, etc.
The scammers then rip that audio from its source platform, isolate the parts of the file that contain the family member's voice, and feed those clips into an AI tool designed to clone voices. With the cloned voice in hand, the scammers call unsuspecting members of the family, claiming their relative is in some sort of trouble that requires an urgent transfer of funds. In one case, a family received a call claiming a loved one was in jail and needed money for bail. The scammers then "put on" the family member by playing the AI-cloned voice, convincing the victim that the call was authentic.
This is exactly what happened to 73-year-old Ruth Card, who believed she was speaking to her grandson Brandon, who told her he needed money for bail. Convinced by the voice on the other end of the line, Card rushed to her bank and withdrew 3,000 Canadian dollars. She then went to a second bank to withdraw more but was luckily stopped by a bank manager, who told her that a similar situation had happened to another customer at the bank and that the phone call was likely phony.
However, the parents of 39-year-old Benjamin Perkin weren't as lucky. They received a phone call from someone impersonating a lawyer, who informed them that their son had killed a US diplomat in a car crash and needed money for legal fees. Perkin's parents said the fake lawyer then put their son on the phone, which convinced them the call was legitimate. The impersonator then requested 21,000 Canadian dollars in legal fees, which Perkin's parents sent via Bitcoin.
Horror stories like these need to be shared with family members, especially with the rise of AI-based tools that make voice impersonation so easy. A simple rule of thumb for dodging this particular scam: hang up after hearing the story about why your family member needs money, then call that family member directly using the contact already saved in your phone. They'll likely pick up, thoroughly confused by your wild accusations, but you'll be relieved to hear none of it is true.