Cybersecurity • AI • FinTech • Banking

They’re Scamming with AI Clones of YOUR Voice

David Bombal • February 18, 2026

Why It Matters

AI voice cloning turns biometric authentication into a liability, amplifying fraud risk and forcing banks and regulators to tighten security while educating vulnerable users.

Key Takeaways

  • AI voice clones enable fraudsters to bypass bank authentication.
  • Older adults are targeted through surveys that harvest personal data.
  • Scammers set up unauthorized direct debits using cloned voices.
  • UK regulators blocked 21 million scam calls in six months.
  • Families are urged to monitor statements and discuss scam awareness.

Summary

The video warns that AI‑generated voice clones are being weaponised in the United Kingdom to bypass bank authentication and steal money from vulnerable consumers.

Criminal groups first conduct seemingly innocuous lifestyle surveys, harvesting health, financial and personal details. Using that data they train voice‑cloning models that mimic the victim’s speech, then call banks or service providers to authorise direct debits. National Trading Standards (NTS) reports that UK adults receive an average of seven scam calls per month, with 21% hearing scams daily, and that the agency blocked nearly 21 million fraudulent calls and shut down 2,000 numbers in the past six months.

The head of NTS’s scams team said, “We’ve seen a deeply disturbing combination of old and new techniques… criminals are using AI not just to deceive victims but to trick legitimate systems into processing fraudulent payments.” The scammers also deploy avatar software with British accents to disguise Indian call‑centre operators, further convincing targets.

The episode underscores the fragility of voice‑based authentication, urging banks to reassess biometric security and prompting families to monitor statements and discuss scam awareness. Without rapid countermeasures, AI‑driven impersonation could erode consumer trust across financial services.

Original Description

Criminals are using AI voice cloning to bypass bank security and set up unauthorized direct debits. Here is how the new 2026 scam works and why you must stop using voice authorization immediately.
#ai #aivoice #security
