
A unified stance against unchecked AI use could force new contractual norms, protecting performers’ rights and reshaping production costs across the UK entertainment sector.
The rise of artificial intelligence in film and television has moved beyond speculative debate to practical implementation, with studios increasingly employing digital scanning to create reusable digital doubles. While the technology promises cost efficiencies and creative flexibility, it also raises profound questions about ownership of a performer’s likeness, voice, and movement. In the United Kingdom, the lack of clear legal frameworks has left actors vulnerable to contracts that grant studios perpetual, platform‑agnostic rights, a scenario that threatens both creative agency and future earnings.
Equity’s recent indicative ballot underscores the depth of industry unease. With 99% of balloted members voting to refuse scanning and a robust 75% turnout, the union now wields a powerful mandate to negotiate with Pact, the principal producers’ association. The proposed safeguards include explicit consent clauses, fair compensation for digital reuse, and the right to opt out without jeopardising employment. Although these measures remain non‑binding until formalised, the sheer scale of member support signals that production schedules could be disrupted if producers ignore the demand for contractual reform.
The broader implications extend beyond the UK. The vote echoes the 2023 Hollywood strikes, in which protections against AI replication of writers’ and performers’ work were central demands, and suggests a global shift toward protecting creative labour in the age of machine learning. Should Equity secure enforceable standards, it could set a precedent that other jurisdictions emulate, compelling studios worldwide to renegotiate talent agreements. Ultimately, the outcome will influence how quickly AI can be integrated into mainstream production without eroding the rights and livelihoods of the performers who bring stories to life.