
For years, companies have assumed the web was built for people.
Websites have been designed to capture human attention, explain, persuade, reassure, and ultimately convert. SEO, user experience, digital merchandising, and checkout design all rested on the same basic premise: the user was a person sitting in front of a screen.
That premise is starting to crack.
Not because people are disappearing, but because they're starting to delegate. Increasingly often, the first system reading your site, evaluating your offer, interpreting your policies, and even initiating a purchase won't be a human being. It will be a software agent acting on someone's behalf. That's the direction implied by Anthropic's Model Context Protocol, by Google's Agent2Agent protocol, its guide to agent protocols and its Universal Commerce Protocol, by OpenAI's Operator and Agents SDK, and by the growing work from companies such as Visa, Mastercard, and Cloudflare to make agentic commerce trustworthy and operational at scale.
This isn't just a story about better chatbots or prettier interfaces. It's a story about the web acquiring a second interface: one for humans, and another for machines.
From pages to actions
The old web revolved around pages. You published information, people found it, and then clicked through a sequence you controlled. The emerging web revolves increasingly around actions. Agents don't care much about your homepage, your visual hierarchy, or the emotional arc of your funnel. They care about whether they can understand your catalog, verify your policies, access reliable data, and complete a task without unnecessary friction.
That is why the most consequential developments in AI are increasingly not just models, but protocols. Anthropic describes MCP as "a universal, open standard for connecting AI systems with data sources," intended to replace fragmented integrations with a single protocol. Google's A2A describes a world in which agents advertise capabilities through an "Agent Card," discover one another, and collaborate on tasks. Google's own commerce work goes one step further: UCP is explicitly designed to integrate checkout logic directly with Google AI Mode and Gemini, with "native checkout" framed as the default path for unlocking "full agentic potential." In other words, the stack is moving from content to execution.
The next SEO is not SEO
For two decades, companies learned that visibility depended on being legible to search engines. What is now emerging is more demanding. It's no longer enough to be indexable. You have to become usable.
That's why ideas such as llms.txt matter. As I argued in a recent piece, websites were built for humans, while language models are better served by a concise, "fat-free" entry point that reduces ambiguity and strips away the noise of menus, scripts, repeated elements, and layout. The llms.txt proposal is simple: place a markdown file at /llms.txt that acts as a curated map for language models, exposing what matters, what's canonical, and where the useful resources live. The official proposal frames it as a way to "provide information to help LLMs use a website at inference time," precisely because context windows are limited and converting complex HTML into useful plain text is often difficult and imprecise.
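To make the idea concrete, here is a minimal sketch of an llms.txt following the proposal's conventions (H1 title, blockquote summary, H2 sections of annotated links, an "Optional" section for lower-priority material). The retailer, paths, and descriptions are invented for illustration, not taken from any real deployment:

```markdown
# Example Retailer

> Example Retailer sells apparel online and in stores across 40 markets.

## Catalog

- [Product feed](https://example.com/feeds/products.md): full catalog with
  prices, sizes, and stock status, refreshed hourly

## Policies

- [Returns](https://example.com/policies/returns.md): 30-day return window,
  conditions and exclusions
- [Shipping](https://example.com/policies/shipping.md): delivery options and
  promised delivery dates by market

## Optional

- [Company history](https://example.com/about.md): background material,
  rarely needed at inference time
```

The point of the structure is triage: a model with a limited context window can read the summary and section headings first, then fetch only the linked resources the task actually requires.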
That doesn't make llms.txt some magical ranking hack. It's not. It's closer to digital housekeeping for a world in which more and more discovery, summarization, and recommendation are mediated by AI systems. The goal is not to game a ranking algorithm. The goal is to reduce machine confusion. That distinction matters.
The same logic applies to newer, more experimental ideas such as identity.txt. The site describes it as "a portable identity file that tells AI tools who you are, how you think, and on what terms," adding that "llms.txt tells AI about websites. identity.txt tells AI about people." Whether identity.txt itself becomes widely adopted is almost secondary. What matters is the direction of travel: the web is beginning to offer machine-readable self-descriptions on purpose, rather than leaving models and agents to infer everything from noisy HTML, metadata fragments, and guesswork.
And this is unlikely to stop with these two examples. Google's agent protocol guide explains that each A2A agent can publish an Agent Card describing its name, capabilities, and endpoint. The point is plain: systems are starting to announce themselves to other systems in standardized ways. Once that logic takes hold, it's easy to imagine a broader ecosystem of machine-readable files for policies, permissions, provenance, fulfillment, pricing logic, returns, and authenticated identity.
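As a rough sketch of what such a self-description looks like, here is a hypothetical A2A Agent Card built around the fields the guide mentions (name, capabilities, endpoint). The merchant, URL, and skill are invented, and the exact field set is defined by the A2A specification rather than this example:

```json
{
  "name": "Example Retailer Shopping Agent",
  "description": "Answers catalog questions and initiates checkout for Example Retailer.",
  "url": "https://agents.example.com/a2a",
  "version": "1.0.0",
  "capabilities": {
    "streaming": true,
    "pushNotifications": false
  },
  "skills": [
    {
      "id": "product-search",
      "name": "Product search",
      "description": "Finds products by attribute, price, size, and availability."
    }
  ]
}
```

A file like this does for agents what a storefront does for shoppers: it announces, in a predictable place and format, what the system is and what it can do, so that other agents can discover and call it without human mediation.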
Brands will still matter. But brands will no longer be enough
Many companies still treat AI as something layered on top of the web: a chatbot in customer service, some generated copy in marketing, an assistant in the app. That view is too shallow.
What is actually happening is that a machine-facing layer is being added beneath the visible web and, in some contexts, in front of it. When a user asks an agent to find the best black blazer under a certain price, with fast delivery, decent return conditions, and a fit similar to previous purchases, the interaction doesn't begin with a homepage visit. It begins with machine interpretation.
That changes the basis of competition. Strong brands will still matter because trust still matters. But trust will increasingly need to be expressed in forms machines can process: structured attributes, current inventory, clear return rules, delivery promises, verified merchant identity, and payment systems that can distinguish a legitimate agent from a malicious bot. Visa says its aim is to "ensure only authorized AI agents transact," while Mastercard argues that protocols are essential to scaling agentic commerce because they support clear user intent, secure credentials, and verifiable agent identity. Cloudflare, working with the payment ecosystem, has made the same point more bluntly: merchants will need ways to grant access to trustworthy AI agents while stopping fraudulent traffic at the front door.
What this means for companies: the case of Inditex
A global leader such as Inditex makes this shift easier to understand, because it sits right at the intersection of brand, logistics, e-commerce, and scale.
Inditex started relatively late in e-commerce compared with digital natives, but it eventually built one of the most effective integrated retail systems in the market. In its FY2025 results, the company reported sales of €39.9 billion, online sales of €10.7 billion, and explicitly highlighted that the integration of store and online operations enables a "seamless global omnichannel experience."
That gives Inditex a meaningful advantage in an agent-mediated environment. Zara and the rest of the group already possess many of the things agents are likely to value: strong brand recognition, rapid inventory rotation, integrated logistics, broad geographic coverage, and operational coordination between physical and digital channels.
But there is also a risk. Fashion has historically relied on presentation, aspiration, curation, and friction that was often commercially useful. Agents compress all of that. They reduce merchandising to a decision layer in which price, availability, size confidence, delivery date, returns, and trusted identity can become more visible than the atmosphere of the site itself. In that world, the question is no longer "Is your site compelling?" It becomes: "Can an agent use you efficiently?" For Inditex, the strategic response is not cosmetic. It's structural.
So what should Inditex do?
- First, it should start treating its websites not only as destinations for humans, but as structured surfaces for agents. That means richer machine-readable catalogs, more explicit size and fit signals, clearer inventory and delivery metadata, cleaner policy exposure, and more robust authentication layers.
- Second, it should seriously experiment with machine-oriented descriptive files. A well-designed llms.txt at group and brand level would make sense, especially for clarifying what's canonical, how content is organized, how fast product information changes, and which resources are official. It wouldn't be an SEO trick. It would be an agent usability layer.
- Third, it should prepare for protocol-driven commerce rather than assuming that all transactions will continue to begin inside its own interface. If Google is building UCP to support commerce inside AI-native environments, and if payment networks and infrastructure companies are building trust layers for agentic commerce, then large retailers should assume that agent-facing checkout, verification, and discovery will become strategically important.
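The first point, machine-readable catalogs, already has a widely deployed building block: schema.org Product markup in JSON-LD, which exposes exactly the attributes an agent comparing blazers would need. A sketch for a hypothetical listing (product, SKU, price, and policy values are all invented for illustration):

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Wool-blend blazer",
  "sku": "BLZ-0421",
  "color": "Black",
  "offers": {
    "@type": "Offer",
    "price": "89.95",
    "priceCurrency": "EUR",
    "availability": "https://schema.org/InStock",
    "hasMerchantReturnPolicy": {
      "@type": "MerchantReturnPolicy",
      "merchantReturnDays": 30,
      "returnPolicyCategory": "https://schema.org/MerchantReturnFiniteReturnWindow"
    }
  }
}
```

Markup like this is already consumed by search engines; the shift described here is that agents would read the same structured attributes to rank, filter, and ultimately transact, making their completeness and freshness a competitive variable rather than an SEO nicety.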
Inditex could be unusually well positioned for that transition. But the companies that win in the next phase of commerce won't necessarily be those with the prettiest interfaces. They will be the ones that make themselves easiest for agents to understand, trust, and use.
The web is starting to expose its machine layer
There is an understandable temptation to dismiss things like llms.txt, identity.txt, Agent Cards, or machine-readable policy layers as marginal technical curiosities. That would be a mistake.
They are early signposts.
No, llms.txt is not yet a universally adopted standard. And no, adding it won't magically transform a company overnight. But that misses the point. Small files and lightweight conventions matter because they reveal where the infrastructure is going. The web spent decades perfecting interfaces for human eyes. Now it is beginning, awkwardly but unmistakably, to expose interfaces for software agents.
That’s the deeper shift.
The original web connected documents. The platform web connected users and services. The next one will increasingly connect agents, tools, merchants, payment systems, and authenticated identities. And when that happens, the strategic question changes.
It's no longer just, "How do I get people to visit my website?"
It becomes, "How do I make my company understandable, trustworthy, and actionable to the systems that increasingly stand between me and my customers?"
That isn't a design tweak. It's a new layer of digital strategy.