ChatGPT, Claude, Perplexity, and Google's AI Overview are becoming the front door to the internet. When someone asks "What's the best project management tool?", these agents synthesize an answer — and your website is either part of that answer, or it doesn't exist. Here's why most sites are invisible, and exactly how to fix it.
For 25 years, websites optimized for one thing: Google's crawler. Keywords, backlinks, page speed, meta tags. An entire industry — SEO — exists to win at this game.
But something fundamental changed. AI agents don't use Google's index. They crawl the web themselves, parse your content, and decide whether to recommend you — all based on signals that have nothing to do with traditional SEO.
Think about what happens when someone asks ChatGPT "What accounting software should I use for my freelance business?" If your accounting SaaS isn't legible to GPTBot — structured data, clear content hierarchy, explicit permissions — you're not in the conversation. Period.
This isn't theoretical. Companies that optimize for AI agents are already seeing measurable traffic from ChatGPT referrals, Perplexity citations, and Claude recommendations. The ones that don't? They're becoming invisible to a growing share of internet discovery.
This is the most common — and most fixable — problem. AI agents have their own crawlers: GPTBot (OpenAI), ClaudeBot (Anthropic), PerplexityBot (Perplexity), and Google-Extended (Google).
If your robots.txt has a blanket Disallow: / under User-agent: *, you're blocking all of them. Many CMS defaults and security plugins do this.
Add explicit allow rules for AI bots:
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /
Even if you allow *, explicitly listing AI bots signals intent. Agents notice.
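To confirm your rules do what you think, Python's standard urllib.robotparser can evaluate a robots.txt the same way a well-behaved crawler would. A minimal sketch — the first file mirrors the allow rules above, the second is the blanket block many CMS defaults ship with:

```python
import urllib.robotparser

# robots.txt with explicit allow rules for the AI crawlers named above.
AI_FRIENDLY = """\
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /
"""

# The blanket block that shuts every bot out.
BLANKET_BLOCK = "User-agent: *\nDisallow: /\n"

def parser_for(robots_txt):
    p = urllib.robotparser.RobotFileParser()
    p.parse(robots_txt.splitlines())
    return p

friendly = parser_for(AI_FRIENDLY)
blocked = parser_for(BLANKET_BLOCK)

for bot in ["GPTBot", "ClaudeBot", "PerplexityBot", "Google-Extended"]:
    print(bot, "allowed:", friendly.can_fetch(bot, "https://example.com/"))

# Under a blanket Disallow, AI bots are shut out along with everything else.
print("GPTBot under blanket block:", blocked.can_fetch("GPTBot", "https://example.com/"))
```

Point `parser.set_url()` at your live `/robots.txt` instead of parsing a string to audit a deployed site.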
AI agents don't "read" your page like a human. They look for machine-readable context: JSON-LD structured data using Schema.org vocabularies.
Without it, your page is a wall of text. With it, an AI agent instantly knows: "This is a SaaS product, it costs $29/mo, it handles invoicing, and here are 47 FAQs about it."
The schema types with the most impact on AI discoverability include FAQPage and Product.
Add a JSON-LD block to your page's <head>:
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "What does [your product] do?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "A clear, concise answer..."
    }
  }]
}
</script>
Start with FAQPage — it has the highest impact-to-effort ratio for AI discoverability.
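If your FAQs live in a CMS or database, generating the block keeps it valid JSON as content changes. A sketch in Python — the question/answer content is placeholder:

```python
import json

def faq_jsonld(pairs):
    """Build a Schema.org FAQPage JSON-LD object from (question, answer) pairs."""
    return {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in pairs
        ],
    }

# Placeholder FAQ content for illustration.
block = faq_jsonld([
    ("What does the product do?", "A clear, concise answer..."),
])

# Serialize and wrap in the script tag that goes in <head>.
script_tag = '<script type="application/ld+json">\n{}\n</script>'.format(
    json.dumps(block, indent=2)
)
print(script_tag)
```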
robots.txt tells bots what they can access. llms.txt tells them what they should know.
It's a plain text file at /llms.txt on your domain — a briefing document for AI agents. Think of it as your elevator pitch, but for machines. What does your product do? Who is it for? What makes it different? What are the key pages an agent should understand?
Less than 3% of websites have one. The ones that do get a significant edge in how accurately AI agents describe them.
Create /llms.txt at your domain root:
# Your Product Name
> One-sentence description of what you do.

## What We Do
Clear explanation of your product/service. Target audience. Key differentiators.

## Key Pages
- /pricing — Plans and pricing
- /features — Full feature list
- /docs — API documentation
- /blog — Industry insights

## FAQ
- Q: Most common question? A: Clear answer.

## Contact
support@yourcompany.com
Keep it under 2000 words. Plain language. No marketing fluff — agents see right through it.
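A template like the one above is easy to assemble programmatically, which also lets you enforce the word budget. A minimal sketch in Python; the section names and contents are placeholders you supply:

```python
def build_llms_txt(name, tagline, sections):
    """Assemble an llms.txt briefing and enforce the ~2000-word budget."""
    lines = [f"# {name}", f"> {tagline}", ""]
    for heading, body in sections.items():
        lines += [f"## {heading}", body, ""]
    text = "\n".join(lines).rstrip() + "\n"
    words = len(text.split())
    if words >= 2000:
        raise ValueError(f"llms.txt is {words} words; keep it under 2000")
    return text

# Placeholder sections matching the template above.
doc = build_llms_txt(
    "Your Product Name",
    "One-sentence description of what you do.",
    {
        "What We Do": "Clear explanation of your product/service.",
        "Key Pages": "- /pricing — Plans and pricing\n- /docs — API documentation",
        "FAQ": "- Q: Most common question? A: Clear answer.",
        "Contact": "support@yourcompany.com",
    },
)
print(doc)
```

Write the returned string to `/llms.txt` at your domain root as part of your build or deploy step.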
AI agents parse heading structure to understand what a page is about. They need: a single H1 stating what the page is, H2s for its major sections, and H3s nested beneath them.
Many modern websites — especially SPAs and heavily designed marketing pages — break heading hierarchy: three H1s, no H2s, headings used for styling instead of structure. Google is forgiving about this. AI agents are not.
Audit your heading structure:
✅ Good:
H1: Your Product — Main Headline
H2: Features
H3: Feature 1
H3: Feature 2
H2: Pricing
H2: FAQ
❌ Bad:
H1: Welcome
H1: Features (second H1!)
H4: Some random heading
H2: Contact us
Use your browser's dev tools: document.querySelectorAll('h1,h2,h3').forEach(h => console.log(h.tagName, h.textContent))
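The same audit can run server-side, for example in a CI check. A sketch using Python's standard html.parser that flags the two failure modes above — multiple H1s and skipped levels:

```python
from html.parser import HTMLParser

class HeadingAudit(HTMLParser):
    """Collect h1-h6 tags and flag common hierarchy problems."""

    def __init__(self):
        super().__init__()
        self.levels = []

    def handle_starttag(self, tag, attrs):
        # Record the numeric level of each heading tag (h1 -> 1, h2 -> 2, ...).
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self.levels.append(int(tag[1]))

    def problems(self):
        issues = []
        if self.levels.count(1) != 1:
            issues.append(f"expected exactly one h1, found {self.levels.count(1)}")
        # Jumping more than one level down (e.g. h1 straight to h4) is a skip.
        for prev, cur in zip(self.levels, self.levels[1:]):
            if cur > prev + 1:
                issues.append(f"skipped level: h{prev} -> h{cur}")
        return issues

# The "bad" example from above: two H1s and an h1 -> h4 jump.
audit = HeadingAudit()
audit.feed("<h1>Welcome</h1><h1>Features</h1><h4>Random</h4><h2>Contact us</h2>")
print(audit.problems())
```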
This one applies to SaaS products and developer tools. If you have an API, AI agents should be able to find and understand it — so they can recommend it as a tool, not just a website.
Two standards matter:
/.well-known/ai-plugin.json — The manifest file that describes your tool to AI agents (originally from ChatGPT plugins, now a de facto standard)

/openapi.json — Your API specification, so agents know what endpoints exist and how to call them

With MCP (Model Context Protocol) gaining adoption, having a machine-readable API description is becoming table stakes for any tool that wants AI agents as a distribution channel.
Create /.well-known/ai-plugin.json:
{
  "schema_version": "v1",
  "name_for_human": "Your Product",
  "name_for_model": "your_product",
  "description_for_human": "What users see.",
  "description_for_model": "Detailed description for AI.",
  "api": {
    "type": "openapi",
    "url": "https://yourdomain.com/openapi.json"
  },
  "logo_url": "https://yourdomain.com/logo.png",
  "contact_email": "support@yourdomain.com"
}
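A quick structural check before deploying the manifest catches missing fields. A sketch in Python — the required-field list is an assumption taken from the example above, not an official schema:

```python
import json

# Assumed required keys, taken from the manifest example above.
REQUIRED = {
    "schema_version", "name_for_human", "name_for_model",
    "description_for_human", "description_for_model", "api",
}

def validate_manifest(raw):
    """Parse ai-plugin.json and check the fields the example above relies on."""
    manifest = json.loads(raw)
    missing = REQUIRED - manifest.keys()
    if missing:
        raise ValueError(f"ai-plugin.json missing fields: {sorted(missing)}")
    if manifest["api"].get("type") == "openapi" and "url" not in manifest["api"]:
        raise ValueError("api.type is openapi but api.url is missing")
    return manifest

manifest = validate_manifest(json.dumps({
    "schema_version": "v1",
    "name_for_human": "Your Product",
    "name_for_model": "your_product",
    "description_for_human": "What users see.",
    "description_for_model": "Detailed description for AI.",
    "api": {"type": "openapi", "url": "https://yourdomain.com/openapi.json"},
}))
print("valid manifest for:", manifest["name_for_model"])
```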
Here's everything in order. Bookmark this.
AEO Checker scans all 7 factors and gives you a concrete score with actionable recommendations. Free. No signup.
Check Your Site →

Traditional search isn't dying overnight. But the share of discovery that happens through AI agents is growing fast — and the window to establish yourself is right now.
Here's the thing about AI recommendations: they compound. When an AI agent successfully recommends your product and the user has a good experience, that signal feeds back. Early movers in AEO get a compounding advantage, just like early movers in SEO did in 2005.
The difference? AEO is still wide open. Your competitors probably haven't heard of it yet. The 30 minutes you spend today could define whether AI agents recommend you or your competitor for the next decade.