GovTech · Legal

Online Harms: Millions Could Be Forced to Use Unregulated Age Verification

Open Rights Group Blog • February 16, 2026

Why It Matters

Forcing millions to provide biometric data threatens privacy and civil liberties while granting tech firms unprecedented control over digital identity without robust parliamentary scrutiny.

Key Takeaways

  • Government seeks Henry VIII powers for a swift under‑16 social media ban.
  • Age verification could require biometric data for everyday online services.
  • Open Rights Group urges regulation of private age‑assurance providers.
  • Lack of parliamentary scrutiny raises democratic accountability concerns.
  • Proposed AI chatbot duties aim to remove illegal content.

Pulse Analysis

The UK’s latest online safety push marks a decisive shift toward delegated legislation, using Henry VIII clauses to sidestep full parliamentary debate. By allowing a statutory instrument to enforce an under‑16 social‑media ban, the government can act quickly, but it also reduces transparency and public input. Coupled with new consultations on age‑gating VPNs and AI chatbots, the policy expands the reach of the Online Safety Act, embedding age‑verification mechanisms across a broader swath of digital services.

At the heart of the controversy is the reliance on private age‑assurance providers to verify users’ identities. Companies such as Persona, backed by investors linked to surveillance firms, already collect facial scans and other biometric data for platforms like Roblox, Reddit and Discord. This data flows into global commercial ecosystems, where it can be repurposed for targeted advertising or sold to third parties. The lack of a regulatory framework means users often hand over irreversible identifiers without clear consent, raising profound privacy and security concerns.

Open Rights Group’s call for mandatory privacy and security standards seeks to fill this regulatory vacuum. By involving the ICO and Ofcom, the government could enforce data‑minimisation, encryption, and independent oversight of age‑verification services. Such safeguards would protect users from potential misuse while preserving the intended goal of protecting children online. Without them, the expansion of biometric age‑gating risks entrenching a new layer of digital infrastructure that consolidates power in the hands of a few private entities, undermining democratic accountability and digital rights.


Read Original Article