Score: 0
Needs Work

mcdonalds.com

Scanned just now (1061ms total)

Issues (4)

llms.txt Check (high impact)

No llms.txt found at /llms.txt or /.well-known/llms.txt. Create one following the llmstxt.org spec to help AI agents understand your site.

Business Impact

llms.txt is the new robots.txt for AI. It tells AI agents what your site does, what content matters, and where to find it. Without it, AI must guess — and guessing means inaccurate recommendations. Early adopters report better AI visibility.

What We Measured

We checked for /llms.txt and /.well-known/llms.txt files. We also checked for /llms-full.txt which provides your complete content in Markdown for AI consumption.
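The probe described above can be reproduced in a few lines of Python. This is an illustrative sketch, not the scanner's actual code; the helper names (`candidate_urls`, `has_llms_txt`) are invented for this example.

```python
import urllib.request

# Locations the check probes, per the report above
LLMS_PATHS = ["/llms.txt", "/.well-known/llms.txt", "/llms-full.txt"]

def candidate_urls(origin: str) -> list[str]:
    """Build the candidate URLs for a site origin (hypothetical helper)."""
    return [origin.rstrip("/") + path for path in LLMS_PATHS]

def has_llms_txt(origin: str) -> bool:
    """Return True if any candidate URL answers HTTP 200 (makes network calls)."""
    for url in candidate_urls(origin):
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                if resp.status == 200:
                    return True
        except OSError:
            continue  # unreachable or non-2xx response: try the next location
    return False
```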

How to Fix

Create a /llms.txt file with a summary of your site for AI agents, and optionally a /llms-full.txt with your complete content in Markdown (like SXSW does with their full program). See llmstxt.org for the specification.

/llms.txt is a Markdown file. Per the llmstxt.org spec, it starts with an H1 name, followed by a blockquote summary and H2 sections listing key links:

# Your Site Name

> Brief description of your website and what it offers.

## Key Pages

- [Products](https://example.com/products): Product catalog
- [Docs](https://example.com/docs): Documentation
- [Pricing](https://example.com/pricing): Pricing plans

Optionally, also publish /llms-full.txt with your complete site content as Markdown so AI agents can read everything directly (see sxsw.com/sxsw.md for a real-world example).
Structured Data Quality (medium impact)

No JSON-LD / Schema.org structured data found.

Business Impact

Structured data is the language AI agents speak. It lets them understand your business type, products, locations, reviews, and FAQs unambiguously — instead of relying on error-prone HTML parsing. This directly affects whether an AI recommends you.

What We Measured

We parsed your HTML for JSON-LD script tags containing Schema.org structured data. Any valid structured data types found (Organization, Product, FAQ, etc.) are counted.
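This kind of extraction can be sketched in Python. The snippet below is illustrative only — a regex-based approximation, not the scanner's actual parser, and the function name `jsonld_types` is invented for this example.

```python
import json
import re

def jsonld_types(html: str) -> list[str]:
    """Collect Schema.org @type values from JSON-LD script tags (sketch)."""
    types = []
    pattern = r'<script[^>]*type="application/ld\+json"[^>]*>(.*?)</script>'
    for block in re.findall(pattern, html, flags=re.DOTALL | re.IGNORECASE):
        try:
            data = json.loads(block)
        except json.JSONDecodeError:
            continue  # skip malformed JSON-LD blocks
        items = data if isinstance(data, list) else [data]
        for item in items:
            if isinstance(item, dict) and item.get("@type"):
                types.append(item["@type"])
    return types
```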

How to Fix

Add JSON-LD structured data for your main entity types (Organization, Product, BreadcrumbList, FAQ, etc.).

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Your Company",
  "url": "https://example.com",
  "description": "What your company does"
}
</script>
WebMCP Endpoint (low impact)

No WebMCP endpoint found. Adding one lets AI agents programmatically interact with your site.

Business Impact

WebMCP (Web Model Context Protocol) is an emerging standard that lets AI agents directly call actions on your website — like booking, purchasing, or querying data. Early adopters get native AI agent integration.

What We Measured

We checked /.well-known/webmcp and /webmcp.json for a valid WebMCP configuration with an actions array.

How to Fix

Add a WebMCP endpoint at /.well-known/webmcp or /webmcp.json to let AI agents interact with your site programmatically.
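WebMCP is still an emerging proposal and its schema may change. The only structure this scan confirms is a top-level actions array, so every other field name in the sketch below (name, description, endpoint, method, parameters) is an illustrative assumption, not an official spec:

```json
{
  "name": "Your Site",
  "description": "What AI agents can do here",
  "actions": [
    {
      "name": "search_menu",
      "description": "Search products by keyword (illustrative action)",
      "endpoint": "https://example.com/api/search",
      "method": "GET",
      "parameters": {
        "q": { "type": "string", "description": "Search term" }
      }
    }
  ]
}
```

Check the current WebMCP draft before publishing, since field names in early-stage specs tend to shift.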

agents.json Discovery (low impact)

No agents.json found. This file helps AI agents discover what your site offers.

Business Impact

agents.json is a discovery mechanism that tells AI agents what your website can do — similar to how robots.txt tells crawlers what they can access. It helps agents understand your services, API endpoints, and interaction capabilities.

What We Measured

We checked /agents.json and /.well-known/agents.json for a valid JSON configuration.

How to Fix

Create an /agents.json file that describes your site's capabilities for AI agents.
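There is no single settled schema for agents.json yet. The sketch below only assumes what the scan checks for — a valid JSON object at /agents.json — and all field names are illustrative assumptions:

```json
{
  "name": "Your Company",
  "description": "Brief summary of what the site offers",
  "capabilities": [
    {
      "name": "order_lookup",
      "description": "Look up an order by ID (illustrative capability)",
      "endpoint": "https://example.com/api/orders/{id}"
    }
  ]
}
```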

All Checks (12)

robots.txt AI Bot Check

CAPTCHA Detection

Cookie Consent Wall

Machine-Readable Prices

llms.txt Check

Login-Wall Check

Structured Data Quality

Sitemap Check

JS Dependency

TTFB / Response Time

WebMCP Endpoint

agents.json Discovery

Your basic scan is done — want the full picture?

The Free Basic Scan checks static HTML. The Deep Scan simulates how AI agents actually navigate your site — and finds issues invisible to basic analysis.

Simulated agent navigation

Tests your pages the way an AI agent experiences them — catches CAPTCHAs, cookie walls, and JS-dependent content that basic HTTP scanning misses.

Multi-page analysis

Scans your homepage plus up to 5 subpages from your sitemap. Problems on inner pages won't slip through.

3 additional checks

Tests for API documentation (OpenAPI/Swagger), MCP server endpoints, and form accessibility — the building blocks of true agent interoperability.

One-time payment — no subscription
Results in ~40 seconds
Definitive 13-check score
Secure payment via Stripe
Which pages should we scan? (homepage is always included)
Pages selected: 1/5
mcdonalds.com/

No pages added? We'll auto-detect from your sitemap.
