TL;DR
- llms.txt = readability (agents understand you). WebMCP = capability (agents act on your site). They solve different problems.
- SDF optimizes for token efficiency; ARW bundles discovery, efficiency, and tooling into one framework. CAP standardizes commerce transactions. All are complementary.
- For most businesses today, the first step is readability: get an AI Website Profile deployed.
A landscape, not a competition
The machine-readable web is emerging through multiple complementary standards, not a single winner. Each protocol solves a different layer of the agent-website interaction. Understanding the landscape helps you prioritize what to implement now versus what to prepare for.
The agent-ready web stack
- Commerce (CAP): agents discover, cart, and purchase
- Capability (WebMCP): agents invoke tools on your site
- Efficiency (SDF): 99% token reduction
- Readability (llms.txt): agents understand your business. Start here.
llms.txt - the readability layer
llms.txt is a plain markdown file at your domain root that gives AI agents a structured summary of your business. It answers the question: who are you and what do you do? Over 844,000 websites have adopted it, including Stripe, Cloudflare, and Vercel. Its strength is simplicity: no API, no JavaScript, no vendor dependency. You host a text file and agents can find it at a predictable URL.
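To make the format concrete, here is a minimal illustrative llms.txt for a hypothetical business (the business, URLs, and section names are placeholders; the convention is roughly an H1 title, a blockquote summary, and H2 sections of annotated links):

```markdown
# Acme Dental
> Family-owned dental clinic in Austin, TX. Cleanings, orthodontics, implants, and emergency care.

## Services
- [Cleanings and exams](https://acmedental.example/services/cleanings): Routine preventive care
- [Orthodontics](https://acmedental.example/services/orthodontics): Braces and clear aligners

## About
- [Contact and hours](https://acmedental.example/contact): Location, phone, and opening hours
```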
For most businesses, this is where to start. If an agent cannot understand your business, no amount of transaction infrastructure will help. Readability is the prerequisite for everything else.
WebMCP - the capability layer
WebMCP (Web Model Context Protocol) is a W3C-standardized JavaScript API that lets websites register tool contracts for AI agents. Instead of agents guessing how to interact with your site, you explicitly declare capabilities: search inventory, book an appointment, check availability. Chrome 146 shipped a preview implementation in early 2026.
WebMCP answers a different question than llms.txt: not who you are, but what agents can do on your site. For businesses with interactive services - booking, purchasing, account management - WebMCP will become essential. For businesses that primarily need to be discovered and recommended, llms.txt is sufficient today.
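As a rough sketch of the idea (the registration entry point and option names below are assumptions for illustration, not the confirmed WebMCP API; consult the spec and the Chrome preview for the real surface), a site might declare a booking tool like this:

```typescript
// Illustrative only: "modelContext.registerTool" is a placeholder name, not a
// confirmed WebMCP API. The shape of the contract is the point: a name, a
// description agents can read, a JSON Schema for inputs, and a handler that
// runs the site's own logic instead of having the agent drive the UI.
interface AgentTool {
  name: string;
  description: string;
  inputSchema: object;
  execute(input: unknown): Promise<unknown>;
}

const bookingTool: AgentTool = {
  name: "book_appointment",
  description: "Book an appointment for a given service and time slot.",
  inputSchema: {
    type: "object",
    properties: {
      service: { type: "string", enum: ["cleaning", "orthodontics"] },
      slot: { type: "string", format: "date-time" },
    },
    required: ["service", "slot"],
  },
  async execute(input) {
    // Call the site's existing booking endpoint (hypothetical URL).
    const res = await fetch("/api/bookings", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(input),
    });
    return res.json();
  },
};

// Hypothetical registration call; the real browser API may differ.
(navigator as any).modelContext?.registerTool(bookingTool);
```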
SDF Protocol - the efficiency layer
The Structured Data Format protocol converts raw HTML into canonical, schema-validated semantic representations. It reduces a typical 89KB webpage to approximately 750 tokens for AI processing - a 99 percent token reduction. SDF pre-extracts entities, claims, and relationships across over 50 content subtypes.
SDF addresses the same efficiency problem as llms.txt but from a different angle: llms.txt is authored by the business, while SDF is a conversion protocol that can be applied to any existing page. Both serve the goal of making the web cheaper for agents to process.
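Conceptually, the result of that conversion looks something like the sketch below: a compact, typed record in place of raw markup. The field names here are illustrative, not the actual SDF schema.

```typescript
// Conceptual sketch only; not the real SDF schema. The idea is that an agent
// receives a small, pre-validated semantic record instead of ~89KB of HTML.
interface SemanticPage {
  url: string;
  contentType: string; // one of the protocol's content subtypes
  entities: { name: string; type: string }[];
  claims: { subject: string; predicate: string; object: string }[];
}

const page: SemanticPage = {
  url: "https://acmedental.example/services/orthodontics",
  contentType: "ServicePage",
  entities: [
    { name: "Acme Dental", type: "LocalBusiness" },
    { name: "Clear aligners", type: "Service" },
  ],
  claims: [
    { subject: "Clear aligners", predicate: "offeredBy", object: "Acme Dental" },
  ],
};
```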
ARW - the discovery layer
The Agent-Ready Web standard provides a full stack: discovery via /llms.txt, machine-readable content via markdown views (85 percent token reduction), OAuth-protected tool endpoints, and observability headers so you can track agent interactions. ARW claims 10x faster agent discovery versus full-site crawling.
ARW is the most comprehensive framework, but also the most complex to implement. For businesses looking for a practical starting point, the discovery layer - your llms.txt file - delivers the highest value with the lowest effort.
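From the agent's side, the discovery flow is roughly: fetch /llms.txt, then request the linked pages in a lighter machine-readable form. A hedged sketch, assuming a hypothetical content-negotiation convention (ARW's actual endpoints and headers may differ):

```typescript
// Hedged sketch of ARW-style discovery from an agent's perspective.
// The Accept-header convention below is an assumption, not ARW's spec.
async function discoverSite(origin: string): Promise<string[]> {
  // Step 1: read the llms.txt profile instead of crawling the whole site.
  const profile = await fetch(`${origin}/llms.txt`).then((r) => r.text());

  // Step 2: pull the linked URLs out of the markdown link list.
  return [...profile.matchAll(/\((https?:\/\/[^)\s]+)\)/g)].map((m) => m[1]);
}

async function fetchMachineView(url: string): Promise<string> {
  // Step 3: prefer a markdown view of the page, falling back to HTML.
  const res = await fetch(url, {
    headers: { Accept: "text/markdown, text/html;q=0.5" },
  });
  return res.text();
}
```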
CAP - the commerce layer
The Commerce Agent Protocol standardizes how agents discover products, manage carts, and complete purchases. CAP Lite uses existing web standards (HTML5, JSON-LD, Schema.org) for immediate adoption. CAP Full Profile adds real-time interactive capabilities through Google's Agent2Agent Protocol for operations like cart management and order tracking.
If you sell products or services online, CAP will matter for your transaction layer. But again: agents must be able to read your catalog before they can add items to a cart. Readability comes first.
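CAP Lite's building blocks are ones many sites already have. For example, standard Schema.org Product markup in JSON-LD (shown here with hypothetical values) gives agents a machine-readable catalog entry to discover before any cart or checkout step:

```html
<!-- Standard Schema.org Product markup in JSON-LD; values are hypothetical. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Clear aligner starter kit",
  "description": "At-home impression kit for clear aligner treatment.",
  "offers": {
    "@type": "Offer",
    "price": "79.00",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```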
The implementation priority
For the vast majority of businesses in 2026, the priority is clear. Start with readability: deploy an AI Website Profile at /llms.txt. This is the foundation that every other standard builds on. As your industry's agent ecosystem matures, layer in capability (WebMCP), efficiency (SDF), and commerce (CAP) as they become relevant to your customer interactions.
Frequently asked questions
Do I need to implement all of these?
No. Start with llms.txt for readability. The other standards serve more advanced use cases - interactive capabilities, token optimization, and transactional commerce - that most businesses will adopt incrementally as the agentic ecosystem matures.
Will these standards converge into one?
Unlikely in the near term. They solve different problems at different layers. The web has always had multiple complementary standards - HTML, CSS, HTTP, DNS, TLS - and the agentic web will follow the same pattern.
Which standard do AI agents actually use today?
Agents from major platforms crawl for llms.txt and structured data. WebMCP is in early preview (Chrome 146). CAP and SDF are gaining adoption among e-commerce platforms. The practical answer for most businesses is: start with what agents can use today, which is llms.txt.
Start with the foundation: readability
Run a free Site Scan to assess your current agent readability. Then get a production-ready AI Website Profile built with your industry blueprint.