You've heard the term AEO. You might have even checked your score. Now what?
This is the implementation guide. Seven concrete steps, ordered by impact, with code you can copy-paste. Most of this can be done in an afternoon. The sites that do it now will be the ones AI agents recommend six months from now. The ones that don't will be invisible to the fastest-growing discovery channel in a decade.
No theory. No fluff. Let's fix your site.
Step 1: Audit Your Baseline
Before optimizing anything, you need to know where you stand. Run your homepage through an AEO checker and note your score. The average production site scores between 30 and 50 out of 100. Even well-known companies — Stripe (68), Anthropic (46), OpenAI (41) — have significant gaps.
Pay attention to which categories are weakest. Most sites fail on the same three things: no llms.txt (0 points out of ~15), sparse structured data (5-8 out of ~25), and incomplete bot directives in robots.txt. These three fixes alone can jump your score by 25-35 points.
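To make the audit concrete, here is a rough self-check in Python that mirrors those three categories. The point weights are illustrative, loosely based on the ranges above, and quick_audit is a hypothetical helper, not any real checker's rubric:

```python
import json
import re

def _parses(text: str) -> bool:
    """True if a JSON-LD block is valid JSON."""
    try:
        json.loads(text)
        return True
    except ValueError:
        return False

def quick_audit(html: str, robots_txt: str, has_llms_txt: bool) -> dict:
    """Rough pass/fail checks mirroring the three common gaps.
    Weights are illustrative, not any real checker's rubric."""
    score = {}
    # llms.txt: present or absent (~15 pts in the ranges described above)
    score["llms_txt"] = 15 if has_llms_txt else 0
    # Structured data: count valid JSON-LD blocks (~25 pts ceiling)
    blocks = re.findall(
        r'<script[^>]*type="application/ld\+json"[^>]*>(.*?)</script>',
        html, re.DOTALL)
    valid = sum(1 for b in blocks if _parses(b))
    score["structured_data"] = min(25, valid * 8)
    # robots.txt: named AI bot directives
    bots = ["GPTBot", "ClaudeBot", "PerplexityBot"]
    score["robots_ai_bots"] = sum(5 for b in bots if b in robots_txt)
    score["total"] = sum(v for k, v in score.items() if k != "total")
    return score
```

Run it against your homepage HTML and robots.txt to get a rough baseline before and after the steps below.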
Step 2: Create Your llms.txt
llms.txt is to AI agents what robots.txt is to search engines — a machine-readable map of your site's most important content. It lives at yourdomain.com/llms.txt and gives language models a curated index instead of forcing them to guess which pages matter.
The format is simple. Here's a real example:
# Acme Corp
> Acme builds developer tools for API testing.
## Main Pages
- [Homepage](https://acme.dev): Overview and pricing
- [Product](https://acme.dev/product): Feature breakdown
- [Docs](https://acme.dev/docs): Full API documentation
## Key Resources
- [Changelog](https://acme.dev/changelog): Release notes
- [Blog](https://acme.dev/blog): Technical articles
- [API Reference](https://acme.dev/api): OpenAPI spec
The most common mistake: listing every page on your site. The point isn't completeness — it's curation. List your 10-20 most important pages. Think: "If an AI agent could only read 15 pages on my site, which ones would tell the full story?" Those go in llms.txt.
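If you keep your curated page list in code, generating the file keeps it in sync as pages change. A minimal sketch; build_llms_txt and the sample data are hypothetical:

```python
def build_llms_txt(name: str, summary: str, sections: dict) -> str:
    """Render a curated llms.txt. `sections` maps a heading to a list of
    (title, url, note) tuples. Swap in your own curated pages."""
    lines = [f"# {name}", f"> {summary}"]
    for heading, pages in sections.items():
        lines.append(f"## {heading}")
        for title, url, note in pages:
            lines.append(f"- [{title}]({url}): {note}")
    return "\n".join(lines) + "\n"
```

Write the result to your site root at build time so the file never drifts from reality.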
Step 3: Configure robots.txt for AI Crawlers
Your robots.txt probably handles Googlebot. It almost certainly doesn't handle AI crawlers. Here's what a modern robots.txt should include:
# Search engines
User-agent: Googlebot
Allow: /
# AI Crawlers — explicitly permitted
User-agent: GPTBot
Allow: /
User-agent: ClaudeBot
Allow: /
User-agent: PerplexityBot
Allow: /
User-agent: Googlebot-Extended
Allow: /
User-agent: ChatGPT-User
Allow: /
# Reference files
Sitemap: https://yourdomain.com/sitemap.xml
The key insight: absence is ambiguous, but explicit is clear. Without named AI bot directives, different crawlers interpret default rules differently. Some assume they're allowed. Some assume they're blocked. Adding explicit Allow directives removes all ambiguity.
If you'd rather keep AI crawlers out, the same mechanism works in reverse: Disallow: / per bot. But be intentional. The worst outcome is an ambiguous robots.txt where you think you're being crawled but aren't, or vice versa. Make a decision and be explicit.
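You can sanity-check your directives before deploying with Python's standard-library robots.txt parser. A small sketch, using a hypothetical robots.txt that allows GPTBot and blocks CCBot:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: GPTBot allowed, CCBot blocked.
ROBOTS = """\
User-agent: GPTBot
Allow: /

User-agent: CCBot
Disallow: /
"""

def bot_allowed(robots_txt: str, bot: str, url: str) -> bool:
    """Check whether a named bot may fetch a URL under this robots.txt."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(bot, url)
```

Note that under this parser a bot with no matching entry defaults to allowed, which is exactly the kind of ambiguity explicit directives remove.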
Step 4: Add JSON-LD Structured Data
Structured data is the highest-leverage AEO optimization after llms.txt. It tells AI agents what things are, not just what words are on the page. A product description in plain text is useful. A Product schema with name, price, rating, and availability is machine-actionable.
Priority 1: Organization schema (homepage)
<script type="application/ld+json">
{
"@context": "https://schema.org",
"@type": "Organization",
"name": "Acme Corp",
"url": "https://acme.dev",
"description": "Developer tools for API testing",
"logo": "https://acme.dev/logo.png",
"sameAs": [
"https://twitter.com/acmedev",
"https://github.com/acmedev"
],
"contactPoint": {
"@type": "ContactPoint",
"email": "hello@acme.dev",
"contactType": "customer support"
}
}
</script>
Priority 2: FAQ schema (any page with Q&A content)
<script type="application/ld+json">
{
"@context": "https://schema.org",
"@type": "FAQPage",
"mainEntity": [
{
"@type": "Question",
"name": "What does Acme do?",
"acceptedAnswer": {
"@type": "Answer",
"text": "Acme provides API testing tools..."
}
}
]
}
</script>
Priority 3: Product or SoftwareApplication (product pages)
If you sell a product or SaaS, this schema is essential. Include offers with price, aggregateRating if you have reviews, and a clear description. AI agents use this data to generate product comparisons and recommendations.
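As a sketch, you can assemble the schema programmatically and embed the output in your page template. The field names follow schema.org's SoftwareApplication type, but the helper and all values here are placeholders:

```python
import json

def software_app_jsonld(name, url, price, currency, rating=None, count=None):
    """Assemble a SoftwareApplication JSON-LD script tag.
    Rating fields are optional; include them only with real review data."""
    data = {
        "@context": "https://schema.org",
        "@type": "SoftwareApplication",
        "name": name,
        "url": url,
        "applicationCategory": "DeveloperApplication",
        "offers": {
            "@type": "Offer",
            "price": price,
            "priceCurrency": currency,
        },
    }
    if rating is not None and count is not None:
        data["aggregateRating"] = {
            "@type": "AggregateRating",
            "ratingValue": rating,
            "ratingCount": count,
        }
    body = json.dumps(data, indent=2)
    return f'<script type="application/ld+json">\n{body}\n</script>'
```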
Step 5: Structure Content for Machine Parsing
AI agents don't "read" your page the way humans do. They parse it structurally: headings create sections, the first paragraph of each section gets weighted heavily, and lists/tables get extracted as data. Knowing this changes how you write.
Rules that matter:
- One H1 per page. It should describe what this page is about, not be a tagline. "API Testing Platform — Features & Pricing" beats "Build better APIs" every time.
- H2s = section boundaries. Each H2 should be a standalone topic that makes sense without reading the rest of the page. AI agents extract H2 sections as independent content chunks.
- Front-load key information. The first sentence of each section should contain the core claim or fact. AI summaries pull heavily from opening sentences.
- Use lists for features, pricing, and comparisons. Bullet points and numbered lists get extracted more reliably than prose paragraphs.
- Avoid clever headings. "Why We're Different" tells an AI agent nothing. "3 Features That Make Acme Faster Than Postman" gives it a specific, extractable claim.
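A rough model of that chunking behavior, for intuition only (a simplification, not any specific agent's parser):

```python
import re

def chunk_by_h2(markdown: str) -> dict:
    """Split a document into standalone H2 sections, roughly the way
    agents extract independent content chunks."""
    sections = {}
    current = None
    for line in markdown.splitlines():
        m = re.match(r"^## +(.*)", line)
        if m:
            current = m.group(1).strip()
            sections[current] = []
        elif current is not None:
            sections[current].append(line)
    return {h: "\n".join(body).strip() for h, body in sections.items()}
```

Each chunk is read without the others, which is why every H2 section needs to stand on its own.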
Step 6: Fix Technical Accessibility
The most beautifully structured content in the world is worthless if AI crawlers can't reach it. Technical accessibility is the foundation everything else sits on.
Server-side rendering
If your site is a React/Vue SPA that renders client-side, AI crawlers see a blank page with a <div id="root"></div>. This is the single most common reason for a 0/100 structured data score. Solutions:
- Next.js / Nuxt / Astro: Server-render your marketing pages by default.
- Static export: Pre-render to HTML at build time. Works for any framework.
- Prerender middleware: If you can't change frameworks, tools like Prerender.io serve static HTML to crawlers.
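A quick heuristic for spotting the blank-shell problem: strip the tags from the HTML your server actually sends (curl it, don't view source in a browser) and see whether any real text survives. A sketch; the 10-word threshold is arbitrary:

```python
import re

def looks_client_side_only(html: str) -> bool:
    """Heuristic: a CSR-only shell reduces to almost no text
    once scripts and tags are stripped from the body."""
    body = re.search(r"<body[^>]*>(.*?)</body>", html, re.DOTALL)
    if not body:
        return True
    text = re.sub(r"<script.*?</script>", "", body.group(1), flags=re.DOTALL)
    text = re.sub(r"<[^>]+>", " ", text)
    # Arbitrary threshold: fewer than 10 words of visible text
    return len(text.split()) < 10
```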
Page speed
AI crawlers have timeouts. If your page takes 8 seconds to load because of a 4MB hero image and 12 third-party scripts, crawlers may give up before they see your content. Keep your LCP under 2.5 seconds. Remove unused JavaScript. Compress images.
Meta tags and Open Graph
Every indexable page needs: title, meta description, og:title, og:description, og:type, og:url. These are the first things an AI agent reads when it encounters a URL. Incomplete meta tags signal a low-quality page.
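A sketch of a tag audit using Python's standard-library HTML parser; the required set below matches the list above:

```python
from html.parser import HTMLParser

REQUIRED = {"title", "description", "og:title", "og:description",
            "og:type", "og:url"}

class MetaAudit(HTMLParser):
    """Collect <title>, meta description, and og:* properties."""
    def __init__(self):
        super().__init__()
        self.found = set()
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta":
            if a.get("name") == "description":
                self.found.add("description")
            elif a.get("property", "").startswith("og:"):
                self.found.add(a["property"])

    def handle_data(self, data):
        if self._in_title and data.strip():
            self.found.add("title")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

def missing_meta(html: str) -> set:
    """Return the required tags a page is missing."""
    p = MetaAudit()
    p.feed(html)
    return REQUIRED - p.found
```

Run it over each indexable page; an empty result means the basics are covered.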
Sitemap
Have a sitemap.xml. It takes 5 minutes to generate one. Every static site generator has a plugin for it. Reference it in your robots.txt. This is table stakes — if you don't have one, fix it before you do anything else in this guide.
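If your framework lacks a plugin, generating one by hand is easy. A minimal sketch using the standard library; the URLs and dates are placeholders:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls) -> str:
    """Emit a minimal sitemap.xml for a list of (loc, lastmod) pairs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    root = ET.Element("urlset", xmlns=ns)
    for loc, lastmod in urls:
        u = ET.SubElement(root, "url")
        ET.SubElement(u, "loc").text = loc
        ET.SubElement(u, "lastmod").text = lastmod
    return ET.tostring(root, encoding="unicode")
```

Write the result to your site root and point the Sitemap line in robots.txt at it.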
Step 7: Monitor and Iterate
AEO isn't a one-time setup. AI crawlers re-index frequently. New crawlers emerge. Schema standards evolve. The sites that maintain their AEO scores are the ones that check them regularly.
A monthly cadence that works:
- Re-run your AEO check on your homepage and top 3-5 pages.
- Compare to your baseline. Score should be trending up. If it's flat or down, check what changed.
- Review your llms.txt. Did you launch new pages? Deprecate old ones? Update the map.
- Check for new AI crawlers. The bot landscape changes quarterly. New agents (like Grok, Gemini, Mistral assistants) have their own User-agents.
- Update structured data if your product, pricing, or team changed.
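The crawler-coverage check is easy to script. A sketch; the bot list below is a snapshot that will go stale, which is exactly why it belongs in the monthly review:

```python
# A working list of AI crawler user-agents. This list goes stale;
# refresh it as part of the monthly cadence.
KNOWN_AI_BOTS = [
    "GPTBot", "ClaudeBot", "PerplexityBot",
    "Googlebot-Extended", "ChatGPT-User",
]

def uncovered_bots(robots_txt: str) -> list:
    """Return known AI bots with no explicit User-agent entry."""
    named = {
        line.split(":", 1)[1].strip()
        for line in robots_txt.splitlines()
        if line.lower().startswith("user-agent:")
    }
    return [b for b in KNOWN_AI_BOTS if b not in named]
```

An empty result means every bot you know about has an explicit directive; anything else is a gap to close this month.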
The 30-Minute Quick Start
Don't have an afternoon? Here's the maximum-impact subset you can do in 30 minutes:
- 5 min: Create llms.txt with your top 10 pages. Upload it to your root.
- 5 min: Add AI crawler directives to robots.txt. Copy the template above.
- 10 min: Add Organization JSON-LD to your homepage. Copy the template, fill in your details.
- 5 min: Add FAQ JSON-LD with 3-5 questions about your product.
- 5 min: Run your AEO check again. Compare to your baseline score.
That's a 20-30 point score improvement for half an hour of work. Not bad.
What Comes Next
AI agents are becoming the primary discovery channel for a growing segment of users. The shift from "search a query, click a link" to "ask a question, get a recommendation" is structural, not cyclical. The companies that optimize for this channel now are building a compounding advantage.
The mechanics are not complex. The tools are free. What's rare is prioritization. Most of your competitors haven't heard of AEO yet. By the time they do, you'll already be the site that AI agents recommend.