Chris Vallance
Senior Technology Reporter
Millions of websites - including Sky News, The Associated Press and Buzzfeed - will now be able to block artificial intelligence (AI) bots from accessing their content without permission.
The new system is being rolled out by internet infrastructure firm Cloudflare, which hosts around a fifth of the internet.
Eventually, sites will be able to ask for payment from AI firms in return for having their content scraped.
Many prominent writers, artists, musicians and actors have accused AI firms of training systems on their work without permission or payment.
In the UK, it led to a furious row between the government and artists including Sir Elton John over how to protect copyright.
Cloudflare's tech targets AI firms' bots - also known as crawlers - which explore the web, indexing and collecting data as they go. They are important to the way AI firms build, train and operate their systems.
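For readers curious what that looks like in practice, the sketch below is a minimal, illustrative crawler written in Python: it fetches a single page and collects the links it finds, which a real crawler would then queue up and visit in turn. The URL and user-agent string are placeholders rather than those of any actual AI firm's bot.

```python
# A minimal sketch of what a web crawler does: fetch a page, pull out the
# links, and return them so they can be queued for the next fetch.
# The start URL and user-agent below are illustrative placeholders.
from html.parser import HTMLParser
from urllib.request import Request, urlopen


class LinkCollector(HTMLParser):
    """Collect href values from anchor tags on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(url, user_agent="ExampleCrawler/0.1"):
    """Fetch one page and return the links found on it."""
    request = Request(url, headers={"User-Agent": user_agent})
    with urlopen(request) as response:
        html = response.read().decode("utf-8", errors="replace")
    parser = LinkCollector()
    parser.feed(html)
    return parser.links


if __name__ == "__main__":
    for link in crawl("https://example.com"):
        print(link)
```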
So far, Cloudflare says its tech is active on a million websites.
Roger Lynch, chief executive of Condé Nast, whose print titles include GQ, Vogue, and The New Yorker, said the move was "a game-changer" for publishers.
"This is a critical step toward creating a fair value exchange on the Internet that protects creators, supports quality journalism and holds AI companies accountable", he wrote in a statement.
However, other experts say stronger legal protections will still be needed.
'Surviving the age of AI'
Initially the system will apply by default to new users of Cloudflare services, plus sites that participated in an earlier effort to block crawlers.
Many publishers accuse AI firms of using their content without permission.
Recently the BBC threatened to take legal action against US-based AI firm Perplexity, demanding it immediately stop using BBC content and pay compensation for material already used.
However, publishers are generally happy to allow crawlers from search engines such as Google to access their sites, so that the search companies can, in return, direct people to their content.
Perplexity accused the BBC of seeking to preserve "Google's monopoly".
But Cloudflare argues AI breaks the unwritten agreement between publishers and crawlers. AI crawlers, it says, collect content such as text, articles and images to generate answers without sending visitors to the original source - depriving content creators of revenue.
"If the Internet is going to survive the age of AI, we need to give publishers the control they deserve and build a new economic model that works for everyone," wrote the firm's chief executive Matthew Prince.
To that end, the company is developing a "Pay Per Crawl" system, which would give content creators the option to request payment from AI companies for utilising their original content.
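Cloudflare has not spelled out the full technical detail here, but one way such a scheme could work at the web-server level is sketched below: known AI crawlers that have not agreed to pay are answered with an HTTP 402 "Payment Required" response instead of the content. The crawler names, header and price in this sketch are hypothetical, included purely to illustrate the idea.

```python
# A hypothetical sketch of a "pay per crawl" check, assuming a site answers
# unpaid AI crawlers with HTTP 402 (Payment Required).
# The crawler names, header name and price are illustrative only.
KNOWN_AI_CRAWLERS = {"ExampleAIBot", "AnotherAICrawler"}   # hypothetical bot names
PAID_UP_CRAWLERS = {"ExampleAIBot"}                        # bots with a payment deal in place
PRICE_PER_REQUEST_USD = "0.01"                             # illustrative price


def respond_to_crawler(user_agent: str) -> tuple[int, dict]:
    """Return an HTTP status code and extra headers for a crawler request."""
    if user_agent not in KNOWN_AI_CRAWLERS:
        return 200, {}                      # ordinary visitor or search crawler
    if user_agent in PAID_UP_CRAWLERS:
        return 200, {}                      # AI crawler that has agreed to pay
    # Unpaid AI crawler: refuse the request and advertise a price.
    return 402, {"X-Crawl-Price-USD": PRICE_PER_REQUEST_USD}


if __name__ == "__main__":
    print(respond_to_crawler("AnotherAICrawler"))   # (402, {'X-Crawl-Price-USD': '0.01'})
    print(respond_to_crawler("ExampleAIBot"))       # (200, {})
```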
Battle the bots
According to Cloudflare there has been an explosion of AI bot activity.
"AI Crawlers generate more than 50 billion requests to the Cloudflare network every day", the company wrote in March.
And there is growing concern that some AI crawlers are disregarding existing protocols for excluding bots.
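The best known of those protocols is robots.txt, a file in which a site lists which bots may fetch which pages. A well-behaved crawler checks it before requesting anything, as in the short Python sketch below; the site address and user-agent name are placeholders.

```python
# A sketch of how a well-behaved crawler checks robots.txt before fetching,
# using Python's standard library. The site and user-agent are placeholders.
from urllib.robotparser import RobotFileParser

robots = RobotFileParser()
robots.set_url("https://example.com/robots.txt")
robots.read()   # download and parse the site's robots.txt

# Only fetch the page if the rules allow this user-agent to do so.
if robots.can_fetch("ExampleAIBot", "https://example.com/articles/some-story"):
    print("Allowed to crawl")
else:
    print("Blocked by robots.txt")
```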
In an effort to counter the worst offenders, Cloudflare previously developed a system in which miscreant crawlers are sent to a "Labyrinth" of web pages filled with AI-generated junk.
The new system attempts to use technology to protect the content of websites and to give sites the option to charge AI firms a fee to access it.
In the UK there is an intense legislative battle between government, creators and the AI firms over the extent to which the creative industries should be protected from AI firms using their works to train systems without permission or payment.
And, on both sides of the Atlantic, content creators, licensors and owners have gone to court in an effort to prevent what they see as AI firms' encroachment on creative rights.
Ed Newton-Rex, the founder of Fairly Trained, which certifies that AI companies have trained their systems on properly licensed data, said it was a welcome development - but there was "only so much" one company could do.
"This is really only a sticking plaster when what's required is major surgery," he told the BBC.
"It will only offer protection for people on websites they control - it's like having body armour that stops working when you leave your house," he added.
"The only real way to protect people's content from theft by AI companies is through the law."
Filmmaker Baroness Beeban Kidron, who is campaigning for more protection for the creative industries, welcomed the news, saying the company had shown leadership.
"Cloudflare sits at the heart of the digital world and it is exciting to see them take decisive action," she told the BBC.
"If we want a vibrant public sphere we need AI companies to contribute to the communities in which they operate, that means paying their fair share of tax, settling with those whose work they have stolen to build their products, and, as Cloudflare has just shown, using tech creatively to ensure equity between digital and human creators on an ongoing basis."