
The rules could cement platform dominance in creative markets by limiting creators’ ability to contest algorithmic bias, reshaping revenue distribution across the UK’s cultural economy.
The Data (Use and Access) Act 2025 marks a pivotal shift in the UK’s data‑privacy landscape. By replacing the near‑total ban on solely automated decisions with a nuanced test for "meaningful human involvement" and a significance threshold, the legislation grants controllers broader leeway while retaining consent‑based safeguards for special‑category data. The Secretary of State retains the power to fine‑tune these definitions through secondary regulations, creating a flexible but uncertain regulatory environment for businesses deploying AI.
In the music‑streaming sector, the Act’s narrow focus on personal‑data consent leaves a gap around the recommendation engines that drive revenue. Spotify’s monetised ecosystem, which pairs ChatGPT‑powered discovery with its Discovery Mode pay‑for‑placement model, relies on behavioural signals such as skips and volume changes, effectively turning listening behaviour into proxies for creator performance. Because these signals are treated as ordinary user‑generated data rather than special‑category data, platforms can sidestep the Act’s redress mechanisms, leaving independent artists with limited recourse when algorithmic classifications depress streaming royalties.
Policymakers and industry groups are therefore urging a move beyond the Act’s privacy‑centric lens. The EU Digital Services Act mandates systemic risk assessments, transparency registers and diversity audits for large platforms, tools that could expose entrenched bias in music recommendation pipelines. Introducing comparable audit obligations or public registers in the UK would give creators collective leverage, shift accountability from individual complaints to systemic oversight, and help preserve cultural pluralism in an increasingly AI‑driven market.