How Can the EU Protect Children Online While Dismantling the Very Rules Designed to Keep Them Safe?

EDRi — Apr 15, 2026

Key Takeaways

  • EU Digital Omnibus may exempt pseudonymised data from GDPR
  • Weakening automated‑decision rules could expand AI profiling of minors
  • Age‑gating shifts data‑collection burden onto children
  • 300+ civil‑society groups warn deregulation harms child rights
  • Data reuse may entrench educational inequality for disadvantaged youths

Pulse Analysis

Across Europe, protecting children in the digital sphere has become a political rallying cry, prompting the European Commission to launch the Special Panel on child safety and to embed safeguards in the Digital Services Act, the AI Act, and the GDPR. These statutes treat personal data and algorithmic decision‑making as core rights issues, holding platforms accountable for the content and services they offer to minors. The upcoming Digital Omnibus package, however, seeks to streamline regulation by carving out exemptions for pseudonymised datasets and diluting automated‑decision thresholds. Civil‑society groups argue this could undo years of progress in child‑focused data protection.

The practical implications of the proposed changes are stark. Allowing large, pseudonymised datasets to fall outside GDPR oversight would let companies reuse behavioural signals for targeted advertising, recommendation engines, and AI training without explicit consent. For a teenager searching for mental‑health resources, this could mean a feedback loop of increasingly narrow content that reinforces harmful narratives. In education, weakened safeguards risk biased algorithmic assessments misclassifying students from disadvantaged backgrounds, entrenching inequality. Age‑gating, touted as a quick fix, merely shifts the burden of data disclosure onto children, often demanding biometric or identity documents that further erode their privacy.

Experts contend that the solution lies not in restricting access but in reinforcing structural safeguards. Robust data‑protection rules, transparent platform accountability and a rights‑based approach to AI can mitigate manipulative design practices that disproportionately affect young users. The pending Digital Fairness Act offers a legislative window to address profiling, enforce purpose limitation, and ensure that digital environments remain safe, fair, and inclusive. Maintaining strong EU digital frameworks is essential to protect the next generation’s online wellbeing and to prevent a regulatory vacuum that could amplify existing harms.
