AI

Security Researchers Catch "Privacy" Browser Extensions Siphoning AI Chats and Selling Them via a Data Broker

THE DECODER • December 29, 2025

Companies Mentioned

  • Microsoft (MSFT)
  • Google (GOOG)

Why It Matters

The covert extraction of AI conversation data threatens user privacy and may violate emerging data‑protection regulations, prompting scrutiny of extension marketplaces and third‑party data brokers.

Key Takeaways

  • Eight extensions harvest AI chat data.
  • Over 8 million combined users affected.
  • Data sold via Urban VPN to affiliates.
  • Google, Microsoft badges mislead users.

Pulse Analysis

The discovery by Koi underscores a growing blind spot in the browser extension ecosystem. Eight popular extensions—most notably Urban VPN Proxy and its sibling tools—have silently added code that intercepts prompts and responses from leading large‑language‑model services such as ChatGPT, Claude, Gemini, and Microsoft Copilot. The feature was pushed in a July 2025 auto‑update, meaning users never consented to the extra data collection. Even when the VPN toggle is disabled, the extensions continue to pipe raw conversation strings to Urban’s backend, where they are aggregated with standard browsing telemetry.

From a regulatory perspective, the practice collides with emerging privacy frameworks in the U.S., EU, and Asia that demand explicit user consent before harvesting personal or behavioral data. By bundling AI‑prompt harvesting with a generic “browsing data” clause, the extensions skirt the spirit of the GDPR and the California Consumer Privacy Act, exposing both developers and the affiliated broker BiScience to potential enforcement actions. Moreover, the discrepancy between the Chrome Web Store’s claim of “no data sales” and the privacy policy’s admission of affiliate sharing erodes consumer trust in platform vetting processes.

Enterprises and individual users can mitigate exposure by auditing installed extensions and disabling or removing any that claim VPN or ad‑blocking functions without transparent data practices. Security teams should incorporate extension monitoring into their endpoint protection policies and educate staff about the false sense of security conferred by “Featured” badges from Google or Microsoft. As AI assistants become integral to workflows, the market will likely see stricter oversight of third‑party tools that access conversational data, prompting developers to adopt clearer consent mechanisms and auditors to demand independent privacy certifications.
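As a rough sketch of what such an extension audit could look like, the Python script below scans a local Chrome profile for installed extensions and flags any whose manifest requests permissions broad enough to read page content, which would include AI chat sessions. The profile path and the set of "risky" permissions are illustrative assumptions, not details from the article, and real triage would also review each extension's code and network behavior.

```python
import json
from pathlib import Path

# Assumed default Chrome extensions directory on Linux; adjust per OS
# (e.g. macOS: ~/Library/Application Support/Google/Chrome/Default/Extensions)
EXT_DIR = Path.home() / ".config/google-chrome/Default/Extensions"

# Illustrative set of permissions that allow reading page content or traffic
BROAD_PERMISSIONS = {"<all_urls>", "webRequest", "tabs", "scripting"}

def audit(ext_dir: Path = EXT_DIR):
    """Return (extension name, risky permissions) for each flagged extension."""
    findings = []
    if not ext_dir.is_dir():
        return findings
    # Layout is <extension-id>/<version>/manifest.json
    for manifest in ext_dir.glob("*/*/manifest.json"):
        try:
            data = json.loads(manifest.read_text(encoding="utf-8"))
        except (json.JSONDecodeError, OSError):
            continue  # skip unreadable or malformed manifests
        perms = set(data.get("permissions", []))
        perms |= set(data.get("host_permissions", []))  # Manifest V3 key
        risky = perms & BROAD_PERMISSIONS
        if risky:
            # Name may be a localized "__MSG_*__" placeholder
            findings.append((data.get("name", "?"), sorted(risky)))
    return findings

if __name__ == "__main__":
    for name, perms in audit():
        print(f"{name}: {', '.join(perms)}")
```

Broad permissions alone do not prove misuse (a legitimate VPN genuinely needs them), so a listing like this is a starting point for review, not a verdict.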
