As generative AI continues to disrupt online search, publishers are weighing strategies to protect their revenue amid the industry-wide change.
One thing is clear: the rapid evolution of generative AI has already had a profound, largely negative impact on publishers’ search traffic. A recent Pew Research Center study suggests the consequences are even more dire than many expected: only about 8% of Google searches that display an AI Overview end in a user clicking a result link, a little more than half the click-through rate of searches without an AI summary.
Worse still, according to experts, this trend not only cuts publishers’ organic search traffic by 20–60%, depending on the market vertical, but could also cost the industry as much as $2 billion in overall ad revenue.
And as if that weren’t enough…
Google Pushes Forward the AI-Driven Search Evolution
Google recently introduced several new features in its Search Labs that promise an even greater transformation of the search experience for end users.
In particular, the announced “Web Guide” feature uses generative AI to reorganize search results into relevance-based groups, while the “Preferred Sources” option lets users personalize their news feed by marking specific outlets as preferred sources within Google’s Top Stories.
Even though both features are still experimental, their market-wide launch appears to be right around the corner, and so is the next chapter of what some analysts already dare to call “an AI-driven dystopian age.”
Not to sound too pessimistic, but one thing is obvious: things will never be as they were, and it’s high time publishers acted if they wish to maintain their traffic and ad revenue.
News Licensing: Fast Remedy or Foe
While many reputable media outlets are still fighting AI scraping practices in court, some publishers, particularly in the small and mid-size segments, appear ready for a faster resolution: licensing their content for Google’s AI use through exclusive licensing agreements with the company. According to Bloomberg, for instance, a Google pilot project involves deals with roughly 20 outlets so far, with more publishers expected to join later.
However, even though a licensing deal with Google may seem like an effective way for struggling media businesses to sustain revenue amid the loss of ad partners and paying readers, chances are it will be just a temporary fix for a much bigger problem.
Not only can Google change its mind about the scale and scope of future publisher deals, but the inevitable search transformation is likely to hit traffic so hard that, even if such deals remain in place for a long time, the licensing income won’t compensate for the overall revenue decline.
When GEO Is No Longer Enough: Here Comes the LLM Framework
Predictably, facing the prospect of never winning back their organic reader audience from search engines, many publishers have already embraced the newly emerged practice of generative engine optimization (GEO), aiming to stay discoverable by AI agents like ChatGPT, Google’s Gemini, and others.
But GEO tactics alone may not be enough in the long run. This is where IAB Tech Lab’s newly introduced LLM Framework comes in.
Launched in June 2025, the initiative aims to help publishers find new, effective ways to keep their content visible to AI systems while bringing that access under control, including in ways that enable monetizing it with AI vendors if needed.
Specifically, in addition to covering bot access management via robots.txt and WAF (Web Application Firewall) methods, the guidance explains why and how publishers can implement llms.txt: a separate /llms.txt file in the root folder of a website domain that gives LLMs an easier path to understanding the site’s content through background information, basic content guidance, links to detailed markdown files, and more.
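On the access-management side, a minimal robots.txt sketch might look like the following. The user-agent tokens shown (GPTBot, Google-Extended, CCBot) are real, vendor-published crawler names, but which bots to block, and whether to block them at all, is each publisher’s own policy call:

```
# Block selected AI crawlers site-wide
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: CCBot
Disallow: /

# Leave regular search crawlers unaffected
User-agent: *
Allow: /
```

And here is what an /llms.txt file could look like, following the publicly proposed llms.txt format (an H1 title, a short blockquote summary, then sections of markdown links). The site name and URLs below are purely illustrative:

```
# Example News
> Independent daily coverage of technology and media.

Articles older than 30 days move to the archive section.

## Docs
- [Editorial guidelines](https://example.com/editorial.md): how stories are sourced and fact-checked
- [Content licensing](https://example.com/licensing.md): terms for AI reuse of our content

## Optional
- [Full article archive](https://example.com/archive.md)
```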
And there’s more.
The IAB’s LLM Framework also outlines the basis of publishers’ monetization capabilities via AI, including the newly launched LLM Content Ingest API. In brief, the API lets a publisher set a price for content crawling based on customer demand and grant LLMs access to that content via the API in response to particular end-user prompts.
From a technical perspective, an LLM vendor passes the price per token it is willing to pay for a specific publisher’s content; the publisher then either returns the content (or an authorization code with content paths) if the price meets its expectations, or declines the offer if not.
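To make that flow concrete, here is a minimal sketch of the price check in Python. It assumes a simple offer/response shape of our own invention (the actual LLM Content Ingest API defines its own schema, authentication, and transport), so treat every field and function name below as hypothetical:

```python
# Illustrative sketch of a pay-per-crawl price check, based on the
# article's description of the LLM Content Ingest API. All names and
# numbers here are hypothetical, not the framework's real schema.
from dataclasses import dataclass

@dataclass
class CrawlOffer:
    vendor_id: str
    price_per_token: float  # what the LLM vendor is willing to pay
    requested_path: str     # content the vendor wants to ingest

# Hypothetical per-section price floors set by the publisher (USD/token).
PRICE_FLOORS = {
    "/news/": 0.002,
    "/archive/": 0.0005,
}

def evaluate_offer(offer: CrawlOffer) -> dict:
    """Accept the offer if it meets the publisher's floor, else decline."""
    floor = next(
        (price for prefix, price in PRICE_FLOORS.items()
         if offer.requested_path.startswith(prefix)),
        None,
    )
    if floor is None:
        return {"status": "declined", "reason": "path not licensed"}
    if offer.price_per_token >= floor:
        # Per the article, acceptance returns the content itself or an
        # authorization code plus content paths; the code is a placeholder.
        return {
            "status": "accepted",
            "auth_code": f"token-for-{offer.vendor_id}",
            "content_paths": [offer.requested_path],
        }
    return {"status": "declined", "reason": "price below floor"}

# Example: an offer above the /news/ floor is accepted.
print(evaluate_offer(CrawlOffer("llm-vendor-1", 0.003, "/news/ai-search")))
```

The sketch mirrors the exchange described above: an accepted offer yields an authorization code with content paths, while an offer below the publisher’s floor is declined.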
What the Future of AI Search Holds
Like it or not, the evolution of generative AI is irreversible, and so is the global transformation of digital search.
This means publishers need to rework their traffic strategies as soon as possible in order to stay afloat, whether they implement llms.txt, adopt a pay-per-crawl model, or choose a completely different monetization path.
In any case, the online publishing industry is clearly entering a new, disruptive era, and 2026 will likely reveal what it actually looks like.