March 19, 2026 · 8 min read

Why Your Website Is Invisible to AI Agents (And How to Fix It in 30 Minutes)

ChatGPT, Claude, Perplexity, and Google's AI Overview are becoming the front door to the internet. When someone asks "What's the best project management tool?", these agents synthesize an answer — and your website is either part of that answer, or it doesn't exist. Here's why most sites are invisible, and exactly how to fix it.

  - 68% of websites block at least one AI bot
  - Less than 3% of sites have an llms.txt file
  - 0 ranking signals are shared between SEO and AEO

What you'll fix today

  1. Your robots.txt is blocking AI bots
  2. You have no structured data
  3. There's no llms.txt file
  4. Your content has no hierarchy
  5. Your API has no description

The shift nobody's talking about

For 25 years, websites optimized for one thing: Google's crawler. Keywords, backlinks, page speed, meta tags. An entire industry — SEO — exists to win at this game.

But something fundamental changed. AI agents don't use Google's index. They crawl the web themselves, parse your content, and decide whether to recommend you — all based on signals that have nothing to do with traditional SEO.

Think about what happens when someone asks ChatGPT "What accounting software should I use for my freelance business?" If your accounting SaaS isn't legible to GPTBot — structured data, clear content hierarchy, explicit permissions — you're not in the conversation. Period.

This isn't theoretical. Companies that optimize for AI agents are already seeing measurable traffic from ChatGPT referrals, Perplexity citations, and Claude recommendations. The ones that don't? They're becoming invisible to a growing share of internet discovery.

The 5 reasons your site is invisible

1

Your robots.txt is actively blocking AI bots

This is the most common — and most fixable — problem. AI agents have their own crawlers, each with its own user-agent string:

  - GPTBot (OpenAI / ChatGPT)
  - ClaudeBot (Anthropic / Claude)
  - PerplexityBot (Perplexity)
  - Google-Extended (Google's AI products, including Gemini)

If your robots.txt has a blanket Disallow: / under User-agent: *, you're blocking all of them. Many CMS defaults and security plugins do this.

⚡ Fix — 5 minutes

Add explicit allow rules for AI bots:

User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /

Even if you allow *, explicitly listing AI bots signals intent. Agents notice.
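To confirm the rules behave the way you intend, you can replay them through Python's standard-library robots.txt parser. This is a sketch against a hypothetical file combining the allow rules above with a blanket block; swap in your own robots.txt content:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: AI bots explicitly allowed,
# everyone else blocked by the blanket rule.
ROBOTS_TXT = """\
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: *
Disallow: /
"""

def bot_can_fetch(robots_txt: str, bot: str, path: str = "/") -> bool:
    """Return True if `bot` may fetch `path` under this robots.txt."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(bot, path)

print(bot_can_fetch(ROBOTS_TXT, "GPTBot"))         # explicitly allowed
print(bot_can_fetch(ROBOTS_TXT, "PerplexityBot"))  # falls through to the * block
```

Note how PerplexityBot is blocked here even though GPTBot is allowed: any bot without its own User-agent group inherits the `*` rules.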

2

You have no structured data (or the wrong kind)

AI agents don't "read" your page like a human. They look for machine-readable context: JSON-LD structured data using Schema.org vocabularies.

Without it, your page is a wall of text. With it, an AI agent instantly knows: "This is a SaaS product, it costs $29/mo, it handles invoicing, and here are 47 FAQs about it."

The schema types with the most impact on AI discoverability are FAQPage, Product, SoftwareApplication, Organization, and Article.

⚡ Fix — 10 minutes

Add a JSON-LD block to your page's <head>:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "What does [your product] do?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "A clear, concise answer..."
    }
  }]
}
</script>

Start with FAQPage — it has the highest impact-to-effort ratio for AI discoverability.
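Before shipping, it's worth sanity-checking that the block is valid JSON and shaped like a FAQPage. A minimal checker — the field checks are heuristics based on the example above, not an official validator:

```python
import json

# A minimal FAQPage payload like the one above; values are placeholders.
JSON_LD = """
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "What does the product do?",
    "acceptedAnswer": {"@type": "Answer", "text": "A clear, concise answer."}
  }]
}
"""

def check_faq_schema(raw: str) -> list[str]:
    """Return a list of problems found in an FAQPage JSON-LD string."""
    problems = []
    try:
        data = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]
    if data.get("@type") != "FAQPage":
        problems.append("@type is not FAQPage")
    for q in data.get("mainEntity", []):
        if q.get("@type") != "Question" or not q.get("name"):
            problems.append("malformed Question entry")
        elif not q.get("acceptedAnswer", {}).get("text"):
            problems.append(f"question {q['name']!r} has no answer text")
    return problems

print(check_faq_schema(JSON_LD))  # [] — no problems found
```

An empty list means the block parses cleanly and every question has an answer; anything else tells you exactly what to fix.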

3

There's no llms.txt file telling agents who you are

robots.txt tells bots what they can access. llms.txt tells them what they should know.

It's a plain text file at /llms.txt on your domain — a briefing document for AI agents. Think of it as your elevator pitch, but for machines. What does your product do? Who is it for? What makes it different? What are the key pages an agent should understand?

Less than 3% of websites have one. The ones that do get a significant edge in how accurately AI agents describe them.

⚡ Fix — 5 minutes

Create /llms.txt at your domain root:

# Your Product Name

> One-sentence description of what you do.

## What We Do
Clear explanation of your product/service.
Target audience. Key differentiators.

## Key Pages
- /pricing — Plans and pricing
- /features — Full feature list
- /docs — API documentation
- /blog — Industry insights

## FAQ
- Q: Most common question?
  A: Clear answer.

## Contact
support@yourcompany.com

Keep it under 2000 words. Plain language. No marketing fluff — agents see right through it.
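Those guidelines can become a quick lint pass. The three checks below (a title line, a blockquote summary, the 2000-word cap) are heuristics drawn from the template above, not a formal llms.txt specification:

```python
def lint_llms_txt(text: str) -> list[str]:
    """Flag common llms.txt problems: missing title, missing summary, too long."""
    warnings = []
    lines = text.splitlines()
    if not any(line.startswith("# ") for line in lines):
        warnings.append("no top-level '# Title' line")
    if not any(line.startswith("> ") for line in lines):
        warnings.append("no '> one-sentence description' blockquote")
    if len(text.split()) > 2000:
        warnings.append("over 2000 words — trim it")
    return warnings

# Hypothetical file following the template above.
SAMPLE = """# Acme Invoicing

> Invoicing software for freelancers.

## Key Pages
- /pricing — Plans and pricing
"""

print(lint_llms_txt(SAMPLE))  # [] — passes all three checks
```

Run it against your draft before publishing; it catches the structural omissions that make a file useless to agents.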

4

Your content has no clear hierarchy

AI agents parse heading structure to understand what a page is about. They need:

  - Exactly one H1 stating what the page is
  - H2s for major sections, with H3s nested beneath them
  - Headings that describe content, not headings chosen for visual styling

Many modern websites — especially SPAs and heavily designed marketing pages — break heading hierarchy: three H1s, no H2s, headings used for styling instead of structure. Google is forgiving about this. AI agents are not.

⚡ Fix — 5 minutes

Audit your heading structure:

✅ Good:
H1: Your Product — Main Headline
  H2: Features
    H3: Feature 1
    H3: Feature 2
  H2: Pricing
  H2: FAQ

❌ Bad:
H1: Welcome
H1: Features (second H1!)
H4: Some random heading
H2: Contact us

Use your browser's dev tools: document.querySelectorAll('h1,h2,h3').forEach(h => console.log(h.tagName, h.textContent))
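If you'd rather audit hierarchy in a script than in the console, Python's built-in html.parser can flag the two failure modes shown above: multiple H1s and skipped levels. A sketch:

```python
from html.parser import HTMLParser

class HeadingAuditor(HTMLParser):
    """Collect h1-h6 tags in document order and report hierarchy problems."""

    def __init__(self):
        super().__init__()
        self.levels = []

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self.levels.append(int(tag[1]))

    def problems(self):
        issues = []
        h1_count = self.levels.count(1)
        if h1_count != 1:
            issues.append(f"expected exactly one h1, found {h1_count}")
        # Each heading may go at most one level deeper than the previous one.
        for prev, cur in zip(self.levels, self.levels[1:]):
            if cur > prev + 1:
                issues.append(f"h{prev} jumps to h{cur} (skipped a level)")
        return issues

auditor = HeadingAuditor()
auditor.feed("<h1>Welcome</h1><h1>Features</h1><h4>Random</h4><h2>Contact</h2>")
print(auditor.problems())  # flags the second h1 and the h1 -> h4 jump
```

Feed it your rendered HTML (e.g. the output of `curl`) to catch hierarchy breaks that a visual check misses.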

5

Your API exists but agents can't discover it

This one applies to SaaS products and developer tools. If you have an API, AI agents should be able to find and understand it — so they can recommend it as a tool, not just a website.

Two standards matter:

  - An OpenAPI specification (e.g. /openapi.json) describing your endpoints
  - An ai-plugin.json manifest at /.well-known/ that points agents to it

With MCP (Model Context Protocol) gaining adoption, having a machine-readable API description is becoming table stakes for any tool that wants AI agents as a distribution channel.

⚡ Fix — 10 minutes (if you have an API)

Create /.well-known/ai-plugin.json:

{
  "schema_version": "v1",
  "name_for_human": "Your Product",
  "name_for_model": "your_product",
  "description_for_human": "What users see.",
  "description_for_model": "Detailed description for AI.",
  "api": {
    "type": "openapi",
    "url": "https://yourdomain.com/openapi.json"
  },
  "logo_url": "https://yourdomain.com/logo.png",
  "contact_email": "support@yourdomain.com"
}

The 30-minute AEO checklist

Here's everything in order. Bookmark this.

  1. Allow AI bots in robots.txt (GPTBot, ClaudeBot, PerplexityBot, Google-Extended) (5 min)
  2. Add FAQPage JSON-LD structured data to your key pages (10 min)
  3. Create /llms.txt at your domain root (5 min)
  4. Fix your heading hierarchy: one H1, nested H2s and H3s (5 min)
  5. Publish /.well-known/ai-plugin.json, if you have an API (10 min)

Check your AEO score in 10 seconds

AEO Checker scans all 7 factors and gives you a concrete score with actionable recommendations. Free. No signup.

Check Your Site →

Why this matters now (not later)

Traditional search isn't dying overnight. But the share of discovery that happens through AI agents is growing fast — and the window to establish yourself is right now.

Here's the thing about AI recommendations: they compound. When an AI agent successfully recommends your product and the user has a good experience, that signal feeds back. Early movers in AEO get a compounding advantage, just like early movers in SEO did in 2005.

The difference? AEO is still wide open. Your competitors probably haven't heard of it yet. The 30 minutes you spend today could define whether AI agents recommend you or your competitor for the next decade.
