
Abuse of Telegram’s native login process hands criminals fully authorized sessions that look legitimate to both the platform and the victim, raising the risk of large-scale account compromise across messaging platforms. It forces providers to rethink authentication design and user education.
Phishing attacks have moved beyond fake login pages to co-opting genuine authentication mechanisms, and the latest Telegram campaign exemplifies the shift. By embedding a Telegram-style QR code or a manual phone-number form on a malicious site, the attackers trigger a real login request through their own API credentials. When the victim scans the code with the official app, Telegram treats the attempt as legitimate and creates a session that the attacker controls. The approach sidesteps password theft entirely: instead of stolen credentials, the criminals walk away with a fully authorized session, and because the attacker’s device is treated as one of the victim’s own, encryption offers no protection against it.
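The core weakness is easiest to see in a toy model. The sketch below is a simplified simulation, not Telegram’s actual protocol, and every name in it (`LoginServer`, `request_qr_token`, the client IDs) is hypothetical. It illustrates the one property the attackers exploit: a QR login token is bound to the client that generated it, not to the device that scans and approves it.

```python
import secrets

class LoginServer:
    """Toy auth server issuing QR login tokens (illustrative only)."""

    def __init__(self):
        self._pending = {}   # login token -> id of the client that requested it
        self.sessions = {}   # session token -> (client id, account)

    def request_qr_token(self, client_id):
        # Any registered client may ask for a login token; the phishing
        # site renders this token as a QR code for the victim to scan.
        token = secrets.token_hex(8)
        self._pending[token] = client_id
        return token

    def approve(self, token, approving_account):
        # The victim's official app scans the code and approves the login.
        client_id = self._pending.pop(token)
        session = secrets.token_hex(16)
        # Crucially, the new session belongs to the client that *requested*
        # the token, not to the device that approved it.
        self.sessions[session] = (client_id, approving_account)
        return session

server = LoginServer()
qr = server.request_qr_token("attacker-client")   # shown on the phishing page
session = server.approve(qr, "victim-account")    # victim scans with real app
owner, account = server.sessions[session]
print(owner, account)  # attacker-client victim-account
```

Under this model the server’s bookkeeping is internally consistent, which is exactly why the resulting session looks legitimate: from the server’s point of view, an authorized account approved a normal new-device login.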
The operation relies on a pair of Telegram API credentials (api_id and api_hash) that the criminal group obtained by registering its own application, allowing it to communicate directly with Telegram’s servers. Once the victim supplies a phone number, one-time password (OTP), or two-step verification password, the data is forwarded in real time, and Telegram sends an in-app confirmation asking the user to verify the new device. The phishing page frames this prompt as a routine security check and nudges the user to tap “Yes.” Because the approval originates from the official client, traditional anti-phishing filters struggle to flag the activity.
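The manual-entry path can be sketched the same way. The simulation below is hypothetical (the class names, API-credential strings, and sample values are all invented for illustration): the phishing page never builds a credential database, it simply pipes each factor to the attacker’s client the moment the victim types it, and that client replays the factors against the real service under its own API credentials.

```python
class FakeTelegramServer:
    """Stand-in for the real service: checks phone, OTP, and 2FA password."""

    def __init__(self, phone, otp, password):
        self._expected = (phone, otp, password)
        self.authorized_clients = []

    def login(self, client_id, phone, otp, password):
        if (phone, otp, password) == self._expected:
            # The session is granted to whichever client submitted the factors.
            self.authorized_clients.append(client_id)
            return True
        return False

class AttackerClient:
    """Relays captured factors live, using the attacker's own credentials."""

    def __init__(self, server, api_id="attacker-api-id"):
        self.server = server
        self.api_id = api_id
        self.captured = {}

    def relay(self, field, value):
        self.captured[field] = value
        if len(self.captured) == 3:  # phone, otp, and password all collected
            return self.server.login(self.api_id,
                                     self.captured["phone"],
                                     self.captured["otp"],
                                     self.captured["password"])
        return None

server = FakeTelegramServer("+15550100", "83911", "hunter2")
attacker = AttackerClient(server)
# The victim types each value into the phishing form; it is forwarded live.
attacker.relay("phone", "+15550100")
attacker.relay("otp", "83911")
ok = attacker.relay("password", "hunter2")
print(ok, server.authorized_clients)  # True ['attacker-api-id']
```

Because each factor is replayed within its validity window, time-limited OTPs offer no protection here; the attacker is simply a faster, automated second keyboard.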
Messaging services now face a dilemma: tightening login flows could frustrate users, while leaving the current design in place exposes them to session hijacking. Industry experts recommend adding contextual cues, such as the originating IP address or device name, whenever a new session is requested, and requiring secondary verification for QR-code logins. For users, the safest practice remains to deny any unexpected in-app authorization and to verify URLs before entering credentials. As attackers continue to weaponize legitimate APIs, platforms must expand their threat models to cover abuse of native authentication features.
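The experts’ recommendation can be made concrete with a short sketch. Everything below is a hypothetical design, not any platform’s actual API: a new-session prompt that surfaces where the request came from and flags QR-initiated logins for an extra verification step instead of a bare “Yes” button.

```python
from dataclasses import dataclass

@dataclass
class SessionRequest:
    ip: str        # originating IP address of the login attempt
    device: str    # device or client name reported by the requester
    via_qr: bool   # whether the login was initiated via a QR code

def build_prompt(req: SessionRequest) -> dict:
    """Builds a contextual confirmation prompt for a new-session request."""
    return {
        "message": (f"New login from {req.device} at {req.ip}. "
                    "If this wasn't you, tap Deny."),
        # QR-initiated logins get a mandatory secondary verification step.
        "requires_secondary_verification": req.via_qr,
    }

req = SessionRequest(ip="203.0.113.7", device="Desktop (Linux)", via_qr=True)
prompt = build_prompt(req)
print(prompt["message"])
print(prompt["requires_secondary_verification"])  # True
```

Even this minimal cue changes the victim’s calculus: a prompt naming an unfamiliar device and IP reads very differently from a generic “confirm login” dialog that a phishing page can pre-frame as routine.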