Web Discoverability Is Entering Its Forked-Path Era

Rob Knight, Managing Director of Web Design & Development agency, Kitty


In the beginning, web design was about one audience – humans. People with browsers, thumbs, short attention spans, and feelings. We crafted the look, the flow, the subtle cues that build trust.

Then came the first great shift in discoverability. Search engines – from the early efforts of the likes of ESpotting and later the dominance of Google after 2001 – began indexing the web. For more than two decades, these search bots became the gatekeepers of online visibility. Brands spent millions optimising for them, writing content not for people, but for algorithms scanning keywords, phrases, and metadata. It wasn’t pretty, but it worked.

Now, we are experiencing the second great shift in discoverability – and this one is seismic.
Artificial intelligence has emerged as a third audience for our websites, alongside humans and traditional search engines. But unlike the first two, AI is omnivorous. It consumes, interprets, and synthesises content in a way that makes the old keyword-matching era look almost naïve.

And the uncomfortable truth is that AI is quickly becoming the most influential audience of all.

For brands, this shift is exciting. If an LLM can instantly surface your product, your service list, your credentials – brilliant. That’s the new shop front. It’s the new “position 1” on Google. We’re already seeing clients explicitly ask how to “appear inside AI answers” when prospects ask for recommendations or comparisons.

But for content-driven businesses – publishers, newsrooms, specialist media – the same shift is brutal. AI sucks up their work, summarises it, and leaves. Users get their answers inside the platform and never click through. And for publishers who rely on traffic for ad impressions, the numbers are staggering, with some reporting drops of up to 40%.

Traffic didn’t just move. It evaporated into the models.

Which is why the conversation in web development is rapidly morphing from “How do we design a great site?” into “How do we design content that serves several audiences with very different requirements?”

The Multi-Layered Web

Modern websites increasingly require three modes of content thinking:

  1. Human Layer:
    Clean journeys. Clear storytelling. Strong design. The familiar front-end experience.
  2. Search Engine Layer:
    Structured schema, crawlable architecture, metadata, and long-tail content that historically existed to catch obscure search terms.
  3. AI Layer:
    Deep, richly detailed, semantically meaningful information – FAQs, context, insights, explanations – crafted so LLMs can accurately learn, summarise, and represent your brand.

We’ve done this ourselves: hidden, unlinked pages designed purely for AI indexing. Lists, comparisons, deep dives – the sort of content no real person needs to read, but every AI model loves. And many brands are drifting toward the same solution. Not publicly. Not proudly. But pragmatically.

In development terms, this means we’re no longer building “a website.” We’re building a multi-purpose information architecture living on the same domain, each optimised for a completely different mode of consumption.

And that makes web design, in the true sense, more complex than it has been in years.

The Technical Reality 

AI doesn’t care about aesthetics when it reads content from your site. It doesn’t care whether the font or colours are beautiful or the spacing is calming. Machines need structure.

There are, however, cases where users send AI agents to your site – these navigate it as a real user would, via a browser running in the cloud. Here, a well-structured, accessible site helps the agents find their way.

This means a lot of traditional SEO fundamentals are suddenly back in fashion – clean markup, structured data, schema, purposeful internal linking, crawlable hierarchies, content depth, factual clarity. Interestingly, some argue that optimising for AI is essentially the same as optimising for search – a new version of the same three pillars: technical foundations, on-page clarity, and authority.

Others, ourselves included, think it’s a fork. SEO was about ranking. GEO (generative engine optimisation) is about influence. Different stakes. Different patterns of reward. Different expectations of content volume.

Either way, the technical side of web development is about to matter more than ever. WordPress 7 is baking AI awareness into its core. Plugins are emerging that prepare content for model consumption. Cloudflare is exploring ways to broker AI-to-publisher access. This is only the beginning.

The Legal and Ethical Layer 

The industry knows what’s coming – rules. Permissions. Standards. AI companies won’t be allowed to scrape everything forever. There’ll be lawsuits, compensation models, and, eventually, global-ish agreements on what AI is allowed to ingest.

In the meantime, developers are left in the peculiar position of building sites for machines that might, one day, no longer be allowed to read them. That’s new territory.
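Until formal rules arrive, the main lever publishers have today is robots.txt, which several AI crawlers (OpenAI's GPTBot and Google-Extended among them) state they honour voluntarily. A small sketch using Python's standard-library parser shows how such a policy reads – the rules here are illustrative, not a recommendation:

```python
from urllib.robotparser import RobotFileParser

# An illustrative robots.txt: block named AI crawlers from the whole
# site while leaving ordinary search bots unrestricted.
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: *
Allow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Ordinary crawlers may fetch; the named AI bots may not.
print(parser.can_fetch("Googlebot", "https://example.com/pricing"))  # True
print(parser.can_fetch("GPTBot", "https://example.com/pricing"))     # False
```

The catch, of course, is that this is a request rather than an enforcement mechanism – which is precisely why the industry expects lawsuits and standards to follow.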

The Design Question That’s Keeping Teams Up at Night 

With all this machine-targeted content work, how do we stop websites losing their soul?

This is the anxiety I hear most often from design leads. If we over-rotate toward AI optimisation, do we lose the craft? The originality? The thing that brings a brand to life online?

Here’s where I’m at with it. AI isn’t replacing the need for human design; it’s amplifying it.

Because if you let AI design too much of your site, you get sameness – the flavourless consensus of what the model thinks “works.” That’s the danger. A thousand websites converging toward the same layout, the same tone, the same patterns, because models optimise for generalised correctness, not distinctive expression.

The counterargument is that prompts and brand guidelines would create variation. Perhaps. But the early signs aren’t convincing. AI design tools tend to converge, not diverge.

Which is why we continue, deliberately, to use human designers for bespoke interfaces. Real designers interpret nuance. They understand the delicate psychology of interaction. They can extend a brand, not merely replicate a pattern library.

In an AI-heavy web, that human element becomes a differentiator.

The New Build Brief

Web briefs are changing. Clients now ask, explicitly, how a site will perform in AI search. They want reassurance that the architecture, the content approach, and even the hidden indexing strategy will satisfy the models.

This isn’t something to bolt on at the end. It must be part of the initial planning.

Web design now begins with two questions:

  1. What emotional and experiential needs does the human visitor have? 
  2. What information and structure do machines require – from traditional search bots to the LLMs behind generative engines?

And the answers rarely overlap cleanly.

The job of modern web teams is to balance them without letting either dominate.

The Forked-Path Future

AI isn’t destroying web design. It’s splitting it. Human-first design and machine-first design will co-exist. Sometimes elegantly, sometimes awkwardly. Both matter. Both are essential.

The web is growing up, becoming a multi-audience medium. And our role as designers and developers is to build something that works for the people who buy things. And for the machines that increasingly decide what those people see.

It’s not the future we expected. But it’s definitely the one we’re building.

Source: Kitty
