Express & Star

COMMENT: Artificial intelligence offers many benefits to society – but raises challenges to news organisations like ours

Today marks the last day of the Government’s consultation on changing the UK’s gold standard copyright laws to make it easier for AI companies to use British creative content without payment or permission.

Midlands News Association, which publishes the Express & Star, Shropshire Star and associated websites, weekly titles and magazines, has joined other media organisations across the UK in a call for rules that “make it fair”.

It is an important issue when considering the future of the press in this country, and its role in covering important issues that matter to us all.

The Creative Rights in AI Coalition has been set up to call collectively on the Government to protect copyright, by giving creators across the UK’s vibrant creative industries control over how their content is used and by requiring transparency from the AI companies using it.

This is the only way to drive long-term growth across the UK for both the creative and tech sectors.

The UK’s unique status as the home of world-leading creative and tech sectors puts us in pole position to lead the way globally in the age of AI.

However, advances in generative AI are entirely reliant on the high-quality, human-created data that the creative industries produce, such as the many exclusive news stories, features, sport and video reports that we fill our websites and newspapers with every day.

This unique content is the essential fuel of the AI products we see and use today.

There is a huge potential market for licensing the content produced by the UK’s creators that our country could take the lead in.

But this will only happen if creators have proper control of the content they make and fair payment for its use.

Tech firms are happy to pay for the huge quantities of electricity that power their data centres, and the right incentives will ensure they compensate creators as well.

Yet the Government’s consultation proposes to weaken copyright law and stymie the development of this market, sweeping the rug from under the creative industries that contribute £126 billion to the UK economy and are a key component of our soft power abroad.

Without fair payment, high-quality creative content will become harder to make, and generative AI innovation will stall as a result, going against the Government’s own ambitions for growth in this sector.

Much has been said by the Government about the ‘uncertainty’ surrounding UK copyright law.

But the law is clear: text and data mining - the method used to train generative AI models - is not allowed for commercial purposes without a licence.

The only uncertainty is precisely who has already used the UK’s creative crown jewels as training material without paying for that use.

We at the Creative Rights in AI Coalition instead urge the Government to enforce existing copyright law with meaningful transparency.

This approach will drive a dynamic licensing market by preserving and upholding our copyright framework, giving creatives exclusive control over how their work is used.

Transparency will enable those in the creative industries to hold AI firms accountable, incentivising tech firms to comply with the law and fostering a mutually beneficial partnership.

As the consultation closes, MPs are also debating measures introduced by Baroness Kidron to the Data Bill.

These would introduce robust transparency measures to make existing copyright law enforceable, rather than this transparency being offered as a ‘trade-off’ for the degradation of copyright protections the Government proposes.

Proper control for creators like Midlands News Association is the only route that will allow them to continue producing the creative works that generative AI firms could then access through licensing.

We invite the Government and the tech sector to partner with us in shaping a future that prioritises, safeguards, and enhances the role of human creativity in AI.

To find out more, go to Make It Fair.