
Eliminating forced scanning protects end‑to‑end encryption and fundamental privacy rights, while the pending age‑verification rules could reshape how citizens access private messaging services.
The EU’s Child Sexual Abuse (CSA) Regulation has been a flashpoint between child‑protection objectives and digital‑rights safeguards. Early drafts called for systematic scanning of encrypted chats, prompting fierce opposition from privacy advocates and tech firms that champion end‑to‑end encryption. Denmark’s surprise proposal in November 2025 to scrap forced detection and embed encryption protections signalled a dramatic policy reversal, aligning the Council more closely with the European Parliament’s emphasis on proportionality and fundamental rights.
Despite the Council’s new voluntary‑detection stance, the legislative battle is far from settled. The European Parliament continues to argue for limited mandatory scanning where warrant‑based suspicion exists, while the European Commission still presses elements of its original “Chat Control” vision. With the Council presidency now held by Cyprus, the final trilogue is scheduled for 29 June 2026, but lingering issues—such as the lack of independent national authorities and a contested search‑engine delisting clause—threaten to delay consensus. These technical and procedural gaps could shape the regulatory landscape for messaging platforms across the bloc.
Perhaps the most consequential proposal is the age‑verification requirement, which would compel services to collect facial data or government‑issued IDs before allowing private communication. This could effectively lock out vulnerable groups—children, the elderly, undocumented migrants—and create a chilling effect on free expression. By tying identity checks to private messaging, the regulation risks establishing a new surveillance infrastructure that undermines the EU’s own Digital Services Act safeguards. Stakeholders, from civil‑society groups to tech companies, must therefore mobilise to ensure that the final CSA framework balances child‑safety goals with the preservation of digital civil liberties.