AI voice cloning turns biometric authentication into a liability, amplifying fraud risk and forcing banks and regulators to tighten security while educating vulnerable users.
The video warns that AI‑generated voice clones are being weaponised in the United Kingdom to bypass bank authentication and steal money from vulnerable consumers.
Criminal groups first conduct seemingly innocuous lifestyle surveys, harvesting health, financial and personal details. Using that data, they train voice-cloning models that mimic the victim's speech, then call banks or service providers to authorise direct debits. National Trading Standards (NTS) reports that UK adults receive an average of seven scam calls per month, with 21% encountering scams daily, and that the agency blocked nearly 21 million fraudulent calls and shut down 2,000 numbers in the past six months.
The head of NTS's scams team said, "We've seen a deeply disturbing combination of old and new techniques… criminals are using AI not just to deceive victims but to trick legitimate systems into processing fraudulent payments." The scammers also deploy avatar software with British accents to disguise overseas call-centre operators, making calls more convincing to targets.
The episode underscores the fragility of voice‑based authentication, urging banks to reassess biometric security and prompting families to monitor statements and discuss scam awareness. Without rapid countermeasures, AI‑driven impersonation could erode consumer trust across financial services.