Unchecked AI Could Harm The Creative Sector

The UK government’s plan to let AI companies use copyrighted works unless creators opt out has sparked criticism from academics. They argue it prioritises tech industry profits over the rights of artists, musicians, and writers.

A new report from experts at the University of Cambridge warns that shifting the burden onto creators is unfair, especially for those without the legal knowledge or resources to navigate the system. The report also urges the government to clarify that only humans can hold copyright, even when AI is involved in the creative process.

Harming creativity

The study, produced by the Minderoo Centre for Technology and Democracy, the Bennett Institute for Public Policy, and ai@cam, argues that unchecked AI use will not guarantee economic growth and could harm the UK’s creative sector.

If the UK moves to an opt-out model, it could allow AI companies to scrape UK-made content for profit without compensating its creators, the researchers warn. They argue that this approach weakens protections for emerging artists and enables foreign firms to exploit British work.

“By making opt-out the default, the government is telling UK creatives that tech companies’ profits matter more than their work,” the report states.

Supporters of AI claim it could boost creativity, but researchers note there is little data on how AI is actually being used in creative industries. They call for studies that directly involve creators to inform policies that balance innovation with fair treatment.

Bypassing the law

Current copyright law automatically protects creators’ work, but some AI firms have tried to use “fair dealing” rules to justify using copyrighted material. Some now negotiate licenses with publishers, which researchers say could be a model for fairer compensation.

Performers face additional risks. AI tools create composites of their voices and likenesses, often using recordings made under contracts that never anticipated such technology. The researchers urge the UK to implement the Beijing Treaty on Audiovisual Performances, which would extend performers’ rights to all forms of reproduction and distribution.

At present, legal uncertainty serves neither creators nor AI firms. Artists are not properly paid when their work is used in AI training, and AI firms hesitate to invest in the UK because the rules are unclear.

Mandatory transparency

The researchers propose mandatory transparency on AI training data and standardised licensing agreements. Without clear policies, the UK risks damaging its creative industries while gaining little from AI developments.

The report also examines AI-generated content and whether prompting an AI system should confer copyright. It concludes that AI cannot hold copyright and calls for policies to compensate artists when their names or works are used as prompts.

The opt-out model, the researchers say, goes against the spirit of copyright law and is difficult to enforce. Even if artists opt out, it is unclear how their work would be tracked, labelled, or compensated.

Without proper safeguards, the UK risks allowing foreign AI firms to exploit British creative works without fair payment. “AI policies should help UK creatives, not undermine them,” the researchers conclude.
