Florida Investor's Voice Cloned in New AI Bank Scam

Aug 30, 2023
In a recent high-tech scam, Florida investor Clive Kabatznik's voice was artificially replicated by software in an attempt to deceive his Bank of America representative into transferring funds. This emerging threat, known as voice deepfakes, uses artificial intelligence to mimic real voices, alarming cybersecurity experts. While comprehensive data on the frequency of such scams is lacking, companies like Pindrop, which monitors audio traffic for major U.S. banks, have noticed an uptick in these sophisticated voice fraud attempts.
We just showed you how generative A.I. systems can create a voice deepfake with just three seconds of sampled audio. Make sure you have multiple safeguards, including two-factor authentication (2FA), enabled on your bank accounts.
The vast amount of voice recordings available online, combined with stolen customer data, makes these scams especially concerning. Vijay Balasubramaniyan, CEO of Pindrop, highlighted that a generative A.I. system can create a voice deepfake with just three seconds of sampled audio. Despite the growing sophistication of these attacks, they still originate from a long-standing cybersecurity issue: data breaches revealing personal bank customer information.