Score: 0 (Needs Work)

shakeshack.com

Scanned just now (877ms total)

Issues (5)

CAPTCHA Detection (high impact)

Blocking CAPTCHA/challenge page detected; agents cannot access content. Pattern matched: challenge-platform (Cloudflare's challenge script).

Business Impact

AI agents cannot solve CAPTCHAs. A full-page CAPTCHA means no AI assistant can read your content, compare your prices, or recommend your services. Every blocked page is a lost opportunity.

What We Measured

We rendered your page as an AI agent would see it and checked the DOM for CAPTCHA providers (reCAPTCHA, hCaptcha, Cloudflare Turnstile) that block content access.

How to Fix

Move CAPTCHA challenges to specific forms (login, signup) instead of blocking the entire page. Use invisible CAPTCHA or challenge-on-interaction.
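
A minimal sketch of challenge-on-interaction using Google reCAPTCHA v3 (invisible), scoped to a single form rather than the whole page. YOUR_SITE_KEY and the form markup are placeholders:

<script src="https://www.google.com/recaptcha/api.js?render=YOUR_SITE_KEY"></script>
<form id="login-form" action="/login" method="post">
  <input type="email" name="email" required>
  <input type="password" name="password" required>
  <button type="submit">Sign in</button>
</form>
<script>
  // Run the invisible check only when this form is submitted;
  // the rest of the site stays readable to agents and crawlers.
  document.getElementById('login-form').addEventListener('submit', function (e) {
    e.preventDefault();
    grecaptcha.ready(function () {
      grecaptcha.execute('YOUR_SITE_KEY', { action: 'login' }).then(function (token) {
        var field = document.createElement('input');
        field.type = 'hidden';
        field.name = 'g-recaptcha-response'; // verified server-side via /recaptcha/api/siteverify
        field.value = token;
        e.target.appendChild(field);
        e.target.submit(); // native submit; does not re-trigger this handler
      });
    });
  });
</script>

Because the token is generated only on form submission, ordinary page views (including AI agents reading your content) never hit a challenge.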

llms.txt Check (high impact)

No llms.txt found at /llms.txt or /.well-known/llms.txt. Create one following the llmstxt.org spec to help AI agents understand your site.

Business Impact

llms.txt is the new robots.txt for AI. It tells AI agents what your site does, what content matters, and where to find it. Without it, AI must guess, and guessing means inaccurate recommendations. Early adopters report better AI visibility.

What We Measured

We checked for /llms.txt and /.well-known/llms.txt files. We also checked for /llms-full.txt, which provides your complete content in Markdown for AI consumption.

How to Fix

Create a /llms.txt file with a summary of your site for AI agents, and optionally a /llms-full.txt with your complete content in Markdown (like SXSW does with their full program). See llmstxt.org for the specification.

Example /llms.txt (the file is Markdown; per the llmstxt.org spec it starts with an H1 naming the site, then a blockquote summary and H2 sections of links):

# Your Site Name

> Brief description of your website and what it offers.

## Key Pages

- [Products](/products): Product catalog
- [Docs](/docs): Documentation
- [Pricing](/pricing): Pricing plans

## Optional

- [llms-full.txt](/llms-full.txt): Your complete site content as Markdown, so AI agents can read everything directly (see sxsw.com/sxsw.md for a real-world example)

Sitemap Check (medium impact)

No sitemap.xml found at /sitemap.xml.

Business Impact

A sitemap is your site's table of contents for AI crawlers. Without it, agents may miss important pages like product pages, documentation, or pricing — reducing the completeness of their recommendations about you.

What We Measured

We fetched /sitemap.xml and checked for a valid XML sitemap with <url> entries and recent <lastmod> dates.

How to Fix

Create an XML sitemap at /sitemap.xml with all important pages and recent lastmod dates. Reference it in your robots.txt.

# Add to robots.txt
Sitemap: https://example.com/sitemap.xml
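
And a minimal /sitemap.xml sketch; the URLs and dates below are placeholders for your real pages and their last-modified dates:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2025-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/menu</loc>
    <lastmod>2025-01-10</lastmod>
  </url>
</urlset>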

WebMCP Endpoint (low impact)

No WebMCP endpoint found. Adding one lets AI agents programmatically interact with your site.

Business Impact

WebMCP (Web Model Context Protocol) is an emerging standard that lets AI agents directly call actions on your website — like booking, purchasing, or querying data. Early adopters get native AI agent integration.

What We Measured

We checked /.well-known/webmcp and /webmcp.json for a valid WebMCP configuration with an actions array.

How to Fix

Add a WebMCP endpoint at /.well-known/webmcp or /webmcp.json to let AI agents interact with your site programmatically.
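
WebMCP is still emerging, so there is no settled schema to copy. The sketch below shows a /.well-known/webmcp file built around the actions array this scan looks for; every other field name is an illustrative assumption:

{
  "name": "Example Site",
  "description": "One-line summary of what the site offers.",
  "actions": [
    {
      "name": "search_menu",
      "description": "Search menu items by keyword",
      "endpoint": "https://example.com/api/search",
      "method": "GET",
      "parameters": { "q": "string" }
    }
  ]
}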

agents.json Discovery (low impact)

No agents.json found. This file helps AI agents discover what your site offers.

Business Impact

agents.json is a discovery mechanism that tells AI agents what your website can do — similar to how robots.txt tells crawlers what they can access. It helps agents understand your services, API endpoints, and interaction capabilities.

What We Measured

We checked /agents.json and /.well-known/agents.json for a valid JSON configuration.

How to Fix

Create an /agents.json file that describes your site's capabilities for AI agents.
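
agents.json is likewise not yet standardized, and the scan only verifies valid JSON at the expected paths. The field names below are illustrative assumptions about how you might describe your services:

{
  "name": "Example Site",
  "description": "Brief description of the site for AI agents.",
  "capabilities": [
    {
      "name": "browse_menu",
      "description": "Read-only access to the menu and prices",
      "url": "https://example.com/menu"
    }
  ]
}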

All Checks (12)

- robots.txt AI Bot Check
- CAPTCHA Detection
- Cookie Consent Wall
- Machine-Readable Prices
- llms.txt Check
- Login-Wall Check
- Structured Data Quality
- Sitemap Check
- JS Dependency
- TTFB / Response Time
- WebMCP Endpoint
- agents.json Discovery

Your basic scan is done — want the full picture?

The Free Basic Scan checks static HTML. The Deep Scan simulates how AI agents actually navigate your site — and finds issues invisible to basic analysis.

Simulated agent navigation

Tests your pages the way an AI agent experiences them — catches CAPTCHAs, cookie walls, and JS-dependent content that basic HTTP scanning misses.

Multi-page analysis

Scans your homepage plus up to 5 subpages from your sitemap. Problems on inner pages won't slip through.

3 additional checks

Tests for API documentation (OpenAPI/Swagger), MCP server endpoints, and form accessibility — the building blocks of true agent interoperability.

- One-time payment, no subscription
- Results in ~40 seconds
- Definitive 15-check score (12 basic + 3 deep)
- Secure payment via Stripe