
By delivering on‑device deepfake detection, Avast improves user privacy while reducing the risk of fraud that exploits convincing AI‑generated media, a growing threat across both consumer and enterprise environments.
The proliferation of AI‑generated media has turned deepfakes into a preferred weapon for fraudsters. In the last quarter of 2025, Gen Threat Labs recorded more than 159,000 unique deepfake scams, many embedded in everyday video streams on platforms such as YouTube, Facebook and X. These manipulations combine convincing audio‑visual cues with classic social‑engineering tactics, making it difficult for users to distinguish legitimate content from malicious intent. Avast’s launch of Deepfake Guard directly addresses this threat by scanning the audio of video streams in real time and alerting users before they act on deceptive material.
Deepfake Guard runs entirely on the user’s Windows PC, leveraging on‑device AI accelerators found in the latest Intel and Qualcomm processors. By processing data locally, the solution eliminates latency associated with cloud analysis and preserves user privacy, as no video streams are transmitted off‑device. The engine employs neural‑network models trained on millions of synthetic and authentic samples, enabling it to flag subtle audio anomalies that indicate manipulation. Real‑time alerts appear as unobtrusive notifications, giving users a moment to verify content without disrupting their workflow.
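Avast has not published Deepfake Guard's internals, so the following is only an illustrative sketch of the general pattern the paragraph describes: analyzing audio frames locally and flagging suspicious ones without sending data off‑device. Every name here (`spectral_flatness`, `scan_stream`, the frame size and threshold) is hypothetical, and a crude spectral‑flatness heuristic stands in for the trained neural‑network model a real engine would use.

```python
import math

FRAME_SIZE = 256  # samples per analysis frame (hypothetical value)

def spectral_flatness(frame):
    # Naive DFT magnitude spectrum in pure Python; a production engine
    # would use an FFT and feed features to a trained neural network.
    n = len(frame)
    mags = []
    for k in range(1, n // 2):
        re = sum(frame[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(frame[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        mags.append(math.hypot(re, im) + 1e-12)  # guard against log(0)
    geo = math.exp(sum(math.log(m) for m in mags) / len(mags))
    arith = sum(mags) / len(mags)
    return geo / arith  # near 1.0 = noise-like spectrum, near 0 = tonal

def scan_stream(frames, threshold=0.5):
    # Stand-in for the on-device detector: everything happens locally,
    # and only frame indices (not audio) leave this function.
    return [i for i, frame in enumerate(frames)
            if spectral_flatness(frame) > threshold]
```

The design point the sketch mirrors is that the raw audio never crosses a network boundary: the caller passes frames in, and only lightweight verdicts come out, which is what makes an unobtrusive real‑time notification possible.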
The expansion of Avast Scam Guardian to Android and iOS completes a cross‑platform defense against scam communications, from text messages and calls to video‑based deception. This unified ecosystem positions Avast as a rare provider offering end‑to‑end protection across mobile and desktop environments. Enterprises can now extend the same safeguards to remote workforces, reducing exposure to credential theft and financial loss. As deepfake technology continues to evolve, the market will likely see increased demand for on‑device, privacy‑first solutions, and Avast’s early move could set a new industry standard.