This article is based on a single primary source and has not been independently corroborated. DeepBrief is monitoring for additional confirmation.

Cloudflare has introduced isitagentready.com, a scoring tool that evaluates whether websites support emerging standards for AI agents, alongside a new dataset on Cloudflare Radar tracking adoption of those standards across the web.

The company announced the tool in a blog post on April 17, 2026, describing it as an analog to performance auditors like Google Lighthouse, but focused on AI agent compatibility rather than browser performance.

Source: https://blog.cloudflare.com/agent-readiness/

How the score is calculated

According to Cloudflare, the tool makes requests to a submitted URL and grades the site across four dimensions. Discoverability covers robots.txt, sitemap.xml, and Link headers (RFC 8288). Content covers Markdown for Agents. Bot Access Control covers Content Signals, AI bot rules in robots.txt, and Web Bot Auth. Capabilities covers Agent Skills, the API Catalog standard (RFC 9727), OAuth server discovery (RFC 8414 and RFC 9728), MCP Server Cards, and WebMCP.
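To make the four dimensions concrete, the sketch below maps each one to the URLs an auditor might probe. The well-known paths come from the standards named above (RFC 9727, RFC 8414, RFC 9728, and the MCP Server Card location mentioned later in this article); the grouping and checker are illustrative assumptions, not Cloudflare's implementation.

```python
# Hypothetical sketch: map each scoring dimension to paths an auditor
# might request. The paths are taken from the standards the article
# names; the checker itself is illustrative, not Cloudflare's code.
from urllib.parse import urljoin

DIMENSION_PROBES = {
    "discoverability": ["/robots.txt", "/sitemap.xml"],  # plus Link headers (RFC 8288)
    "content": ["/"],                                    # Accept: text/markdown negotiation
    "bot_access_control": ["/robots.txt"],               # Content Signals, AI bot rules
    "capabilities": [
        "/.well-known/api-catalog",                      # API Catalog (RFC 9727)
        "/.well-known/oauth-authorization-server",       # OAuth discovery (RFC 8414)
        "/.well-known/oauth-protected-resource",         # OAuth discovery (RFC 9728)
        "/.well-known/mcp.json",                         # MCP Server Card
    ],
}

def probe_urls(base: str) -> dict[str, list[str]]:
    """Expand each dimension's paths into absolute URLs for one site."""
    return {dim: [urljoin(base, p) for p in paths]
            for dim, paths in DIMENSION_PROBES.items()}
```

A real auditor would then fetch each URL and apply per-standard pass/fail logic; this sketch only shows how the checks fan out from the four dimensions.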

Cloudflare says the tool additionally checks for support of agentic commerce standards including x402, the Universal Commerce Protocol, and the Agentic Commerce Protocol, though these checks do not currently factor into the score.

For each failing check, Cloudflare says the tool generates a prompt that site owners can paste into a coding agent to implement the missing standard.

Radar dataset on adoption

To measure how prepared the web is for AI agents, Cloudflare says it scanned the 200,000 most visited domains tracked by Cloudflare Radar, filtering out categories such as redirects, ad servers, and tunneling services before running the readiness checks.

Cloudflare characterizes adoption as still early, arguing there is significant opportunity for sites to stand out by being among the first to adopt the new standards and work well with agents.

Cloudflare reports that 78% of the scanned sites have a robots.txt file, but states that most are configured for traditional search engine crawlers rather than AI agents. The company says 4% of sites have declared AI usage preferences using the Content Signals standard, and that 3.9% of sites pass the Markdown content negotiation check, which tests whether a site serves text/markdown when requested via the Accept header.
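The Markdown content negotiation check described above can be sketched as follows. The pass criterion here, matching on the response's media type, is an assumption based on the article's description, not Cloudflare's exact logic.

```python
# Illustrative sketch of the Markdown content-negotiation check: request
# a page with "Accept: text/markdown" and test whether the server
# answers with a Markdown content type. The criterion is an assumption.

def passes_markdown_check(content_type: str) -> bool:
    """True if a response Content-Type header indicates Markdown."""
    media_type = content_type.split(";", 1)[0].strip().lower()
    return media_type == "text/markdown"

# A real check would issue a request like:
#   GET / HTTP/1.1
#   Accept: text/markdown
# and feed the response's Content-Type header to the function above.

print(passes_markdown_check("text/markdown; charset=utf-8"))  # → True
print(passes_markdown_check("text/html; charset=utf-8"))      # → False
```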

Cloudflare states that newer standards, including MCP Server Cards and API Catalogs (RFC 9727), appear on fewer than 15 sites across the entire 200,000-domain dataset combined.

The results are published in a new "Adoption of AI agent standards" chart on the Cloudflare Radar AI Insights page, which Cloudflare says will be updated weekly. The company says the underlying data is also accessible through its Data Explorer and the Radar API.

Standards covered by the checks

The tool evaluates a mix of established and emerging specifications. According to Cloudflare, robots.txt, which dates to 1994, serves two purposes for agents: defining crawl rules and pointing to sitemaps. The Link response header defined in RFC 8288 allows agents to discover resources directly from HTTP responses without parsing HTML, Cloudflare says, and can be used to advertise resources such as an API catalog at /.well-known/api-catalog.
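The Link-header discovery flow can be illustrated with a minimal parser. This handles only the common one-URI-per-entry case; a full RFC 8288 parser is more involved, and the example header below is hypothetical.

```python
import re

# Minimal sketch of RFC 8288 Link header parsing, showing how an agent
# could discover an API catalog from an HTTP response without parsing
# HTML. Handles simple single-URI entries only; not a full parser.

def parse_link_header(value: str) -> dict[str, str]:
    """Map each rel value to its target URI."""
    links = {}
    for target, params in re.findall(r'<([^>]*)>([^,]*)', value):
        m = re.search(r'rel="?([^";]+)"?', params)
        if m:
            links[m.group(1)] = target
    return links

# Hypothetical response header advertising an API catalog (RFC 9727
# registers the "api-catalog" link relation):
header = '</.well-known/api-catalog>; rel="api-catalog", </docs.md>; rel="alternate"'
print(parse_link_header(header)["api-catalog"])  # → /.well-known/api-catalog
```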

Cloudflare notes that llms.txt, proposed in September 2024 as a way to provide LLM-friendly content summaries, is among the content accessibility standards tracked in the broader ecosystem.

The company lists Content Signals, a recently introduced standard, and Web Bot Auth, currently an IETF draft, as part of the Bot Access Control category. Agent Skills, tracked under Capabilities, is linked to a Cloudflare RFC repository on GitHub.

Self-referential implementation

Cloudflare says isitagentready.com itself is built to pass the checks it runs. The company states that the site exposes a stateless MCP server at https://isitagentready.com/.well-known/mcp.json with a scan_site tool over Streamable HTTP, allowing any MCP-compatible agent to scan websites programmatically without using the web interface.
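An MCP client calling the scan_site tool would POST a JSON-RPC 2.0 request to the server. The tool name comes from the article; the "url" argument name is an assumption, since a real client would first read the server card and use initialize and tools/list to discover the exact schema.

```python
import json

# Sketch of the JSON-RPC 2.0 body an MCP client would POST over
# Streamable HTTP to invoke a tool. "scan_site" is the tool named in
# the article; the "url" argument is an assumed parameter name.

def build_tool_call(tool: str, arguments: dict, request_id: int = 1) -> str:
    payload = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }
    return json.dumps(payload)

body = build_tool_call("scan_site", {"url": "https://example.com"})
print(body)
```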

Cloudflare also says the site publishes an Agent Skills index at /.well-known/agent-skills/index.json with skill documents for each standard it checks, so agents can retrieve implementation guidance directly from the service.

Developer documentation overhaul

Alongside the tool, Cloudflare says it has reworked its Developer Documentation site at developers.cloudflare.com to support the agent standards it tracks. The company describes the result as optimized for AI agent access and says the changes reduce latency and cost for AI tools retrieving content from the documentation.

Cloudflare compares the new tool's role to that of Google Lighthouse, which scores websites on performance, accessibility, and best practices and has been used to guide adoption of web platform standards. The company says it wants a similar mechanism to exist for agent-oriented standards.