WARNING: AI-powered DEEPFAKE VOICE SCAMS are now coming for your bank balance
09/06/2023 // Laura Harris

Deepfake voice scams are now targeting your bank balance. Scammers have started to use artificial intelligence (AI) to generate voices in real time.

Clive Kabatznik, a Florida-based investor, recounted how he found himself at the center of a high-tech fraud attempt last spring. Kabatznik called his local Bank of America representative twice. On the first call, he discussed plans to move a large sum of money. Then the banker received a second call, apparently from Kabatznik, asking to transfer the money elsewhere. Except the second caller wasn't him.

According to the banker, the second caller tried to persuade her to transfer the funds to a different account. But the voice on the other end kept repeating itself, talking over her and lapsing into incomprehensible phrases. The banker ended the call and promptly reported the incident to her security team.

To protect the account, the banker then declined all calls and emails from Kabatznik, authentic or not. It took about 10 days for the two to reestablish contact and arrange a meeting at a Bank of America office.

The data of wealthy customers is now widely available on the dark web, making it easier for scammers to execute their plans. And Kabatznik, whose public appearances and speeches are readily accessible online, is a perfect target for this kind of deepfake scam. (Related: Free AI voice generation software successfully hacked into bank accounts using simulated voices.)

"There's a lot of audio content out there," explained Vijay Balasubramaniyan, the CEO and founder of Pindrop, a company that monitors audio traffic for major U.S. banks.

Deepfake voice scams are on the rise

Pindrop, which reviews automatic voice-verification systems for eight of the largest U.S. lenders, reported a surge in deepfake voice scams after fake voices created by AI programs began appearing in 2022. Nuance, another voice authentication vendor, also noted its first successful deepfake attack on a financial services client late last year.

According to an article published by Gadget, a personal technology magazine, deepfake technology is harmless in itself, until it lands in the wrong hands.

As technology advances, scamming techniques evolve with it. Criminals can now quickly turn their own voice into the voice of a target with the help of generative AI systems like Microsoft's VALL-E. In some instances, these attacks can be executed with just a few seconds of sampled audio.

Dmitry Anikin, a senior data scientist at Kaspersky, said the technology required to produce high-quality deepfakes isn't yet accessible enough for widespread use. However, he foresees a future in which the technology becomes more widely available, potentially leading to a significant increase in related fraud.

According to Anikin, scammers might then attempt to generate convincing voices in real time, for example by impersonating a family member to trick a victim into handing over money.

"Such a scenario is not realistic for now because creating high-quality deepfakes involves a lot of limited resources. However, to make low-quality audio fake, fewer resources are required," Anikin said.

Visit FutureTech.news for more news related to artificial intelligence-powered platforms.

Watch the video below, which talks about how the ElevenLabs AI system cloned Health Ranger Mike Adams' voice.

This video is from the Health Ranger Report channel on Brighteon.com.

More related stories:

AI-powered bot successfully requested refund from Wells Fargo using FAKE voice.

East Palestine resident develops “Mickey Mouse” helium-like voice following train derailment chemical exposure.

AI likely to WIPE OUT humanity, Oxford and Google researchers warn.

AI startup under fire after trolls used its voice cloning tool to make celebrities say “offensive things.”

Google worshipers applaud their own total enslavement as Google AI unveils near-perfect human voice mimicry tech.

Sources include: 

DNYUZ.com

Gadget.co.za

Brighteon.com


