
UK Report on Copyright and Artificial Intelligence Published
Key Takeaways
- Government opts for no immediate copyright reforms on AI
- Section 9(3) likely to be repealed, limiting protection for AI‑generated works
- No new text‑and‑data‑mining exception approved
- Transparency and labeling guidance left to industry working groups
- Digital replica “personality right” under consideration
Summary
The UK Department for Science, Innovation and Technology and the Department for Culture, Media and Sport have released the long‑awaited Report on Copyright and Artificial Intelligence, mandated by section 136 of the Data (Use and Access) Act 2025. The report signals a hands‑off approach: no immediate legislative changes, no new text‑and‑data‑mining exception, and a tentative review of the contentious s.9(3) authorship provision. It also notes the government’s reluctance to impose transparency or labeling rules, preferring industry‑led best‑practice working groups, while flagging a possible new personality right for digital replicas. Overall, the government is deferring major decisions to the courts and future evidence.
Pulse Analysis
The newly published UK Report on Copyright and Artificial Intelligence provides a rare glimpse into the government’s strategic calculus as it grapples with the clash between emerging AI technologies and a copyright framework rooted in the 1988 Act. By invoking s.136 of the Data (Use and Access) Act 2025, ministers fulfill a parliamentary commitment while signaling that they view litigation, market dynamics, and foreign regulation as the primary drivers of policy evolution. The decision to keep s.9(3), the provision that attributes authorship of a computer‑generated work to the person who made the arrangements for its creation, under review with a likely view to repeal underscores a preference for protecting human creativity over extending protection to purely algorithmic output.
For AI developers, the report’s refusal to introduce a broad text‑and‑data‑mining exception or mandatory transparency disclosures means the UK remains a rights‑holder‑friendly jurisdiction. This stance could deter firms from training large models on UK‑based data, especially given the pending Getty Images v Stability AI appeal, which may set a precedent on cross‑border training liability. Conversely, creative industries welcome the strong default protections, but they also face practical challenges in monitoring and enforcing rights when AI‑assisted works blur the line between human and machine contribution. The government’s reliance on industry working groups for best‑practice guidance offers a limited, collaborative pathway, but leaves many legal uncertainties unresolved.
Looking ahead, the report hints at two potential reforms: a possible new personality right for digital replicas and a future clarification of s.9(3). Both could reshape the balance of power between innovators and rights‑holders, influencing investment decisions and the UK’s position in the global AI race. Stakeholders should monitor forthcoming court decisions and any legislative drafts, as incremental changes—whether through judicial interpretation or targeted statutes—are likely to define the practical landscape for AI‑driven content creation in the coming years.