
Agentic Engine Optimization – The Roadmap for Optimizing Content for AI Agents

Google Cloud AI Engineering Director Addy Osmani published a landmark guide titled “Agentic Engine Optimization (AEO)” on April 11. Analyzed in depth by Search Engine Land, this guide serves as a critical technical roadmap for the evolving SEO industry.

Osmani’s definition is clear: AEO is the practice of structuring, formatting, and presenting technical content so that AI agents can actually use it. We’ve spent years learning to optimize for search engine crawlers. Now we’re applying the same principle to AI agents that autonomously find, parse, and reason about web pages.

This is the technical extension of last week’s Sundar Pichai interview. Pichai said “search will become an agent manager.” Osmani explains how to optimize your content for those agents. Read together, they paint a clear picture: Google is moving the future of search toward an agent-based architecture, and its own engineering director is writing the first technical guide.

The Five Pillars of AEO

Osmani defines AEO not as a single technique but as a layered stack of signals and standards. From the foundation up, the five components are:

  1. Discoverability: Can agents find your content without running JavaScript? Does your robots.txt allow AI bots? Do you have an llms.txt file?

  2. Parsability: Is the content machine-readable without requiring visual layout interpretation? Can you serve Markdown?

  3. Token Efficiency: Does the content fit within typical agent context windows? Can it be consumed without truncation?

  4. Capability Signaling: Does the content tell agents what your site can do? Can an agent understand your site’s purpose without reading every page?

  5. Access Control: Do agents clearly know which content they can access?

Token Count: Now the Primary Optimization Metric

The most striking message in Osmani’s guide: token limits have become the fundamental constraint shaping content performance. If you’re not tracking page token counts, you’re missing a signal that determines whether agents will even attempt to read your content.

Long, bloated pages fail with agents in three ways: truncation (the end of the page never gets read), skipping (the agent passes over the page entirely), or poor chunking (content gets split at the wrong points). All three lead to incomplete answers or hallucinated outputs.

Osmani’s recommended token limits are concrete:

  • Quick-start guides at roughly 15,000 tokens

  • Conceptual guides at roughly 20,000 tokens

  • Individual API references at roughly 25,000 tokens

These numbers are for technical documentation, but the principle is universal. The shorter and more focused your content, the higher the probability of accurate consumption by AI agents.
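You don't need an exact tokenizer to start budgeting. A common rule of thumb for English prose is roughly four characters per token; the sketch below uses that heuristic (real tokenizers give exact counts, and limits vary by model, so treat these numbers as estimates):

```javascript
// Rough token estimate: ~4 characters per token for English prose.
// A real tokenizer (e.g. OpenAI's tiktoken) gives exact counts;
// this is only a budgeting heuristic.
function estimateTokens(text) {
  return Math.ceil(text.length / 4);
}

// Check a page body against one of Osmani's suggested budgets.
function withinBudget(text, budget) {
  return estimateTokens(text) <= budget;
}

const page = "word ".repeat(2000); // ~10,000 characters of sample content
console.log(estimateTokens(page));      // 2500
console.log(withinBudget(page, 15000)); // true: fits a quick-start budget
```

Running this across your templates quickly shows which page types blow past the budgets above and need trimming first.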

The First 500 Tokens: The Golden Zone

Osmani says the first 500 tokens of every page should answer three questions: “What is this?”, “What can it do?”, and “What do I need to get started?” Agents have “limited patience for introductory prose.” Background information, history, or motivation should live at the end of the page, not the beginning.

This is the AI agent version of SEO’s BLUF (Bottom Line Up Front) principle. In GEO, we were already recommending delivering the core message in the first 150 words. Osmani converts this into a more precise metric: 500 tokens.
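As an illustration, a page opening that answers all three questions inside its first few hundred tokens might look like the sketch below (the product, wording, and steps are invented):

```markdown
# Acme Webhooks API

Acme Webhooks delivers real-time event notifications to your server.
It can notify you of order, payment, and refund events via signed
HTTP POST requests.

To get started you need an Acme account, an API key, and a public
HTTPS endpoint:

1. Create an endpoint that accepts POST requests.
2. Register its URL in the Acme dashboard.
3. Verify the signature header on each delivery.

(Background, design history, and migration notes are at the end of
this page.)
```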

Structural Design: Agents Don’t Read Linearly

Agents don’t read a page from top to bottom. They parse the structure. Osmani’s structural design rules:

  • Hierarchy: Use consistent heading hierarchy: H1, H2, H3, no skipping.

  • Conclusion First: Start each section with the conclusion, not the background.

  • Contextual Placement: Place code examples immediately after the claim they support.

  • Data Compression: Present parameter references in tables because tables compress better than prose.

These rules were written for technical documentation, but they apply directly to SEO content. The same principles work for blog posts and product pages: clear heading hierarchy, conclusion first, supporting detail second.
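To illustrate the data-compression rule: the same parameter reference that takes several sentences of prose collapses into a table an agent can scan in a handful of tokens (the parameter names and values here are invented):

```markdown
## Parameters

| Name      | Type    | Required | Default | Description                 |
|-----------|---------|----------|---------|-----------------------------|
| `url`     | string  | yes      | (none)  | Endpoint to deliver events  |
| `retries` | integer | no       | `3`     | Maximum delivery attempts   |
| `secret`  | string  | no       | (none)  | Key for signing payloads    |
```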

llms.txt: The Sitemap for AI Agents

Osmani describes llms.txt as “a sitemap for AI agents”: a plain Markdown file hosted at yourdomain.com/llms.txt that presents the site’s content directory with descriptions. This lets agents determine which content is relevant without crawling the entire site.

While XML sitemaps tell search engines “look at these pages,” llms.txt tells AI agents “here’s what content exists and what you can do with it.” The difference is significant: llms.txt provides not just location but context and capability information.
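A minimal llms.txt sketch, following the commonly proposed convention (an H1 title, a short blockquote summary, then annotated link lists); the site name and URLs are placeholders:

```markdown
# Acme

> Acme provides an order-management API and hosted checkout pages.

## Docs

- [Quick start](https://acme.example/docs/quickstart.md): Install and first request
- [API reference](https://acme.example/docs/api.md): Endpoints and parameters

## Optional

- [Changelog](https://acme.example/changelog.md): Release history
```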

AGENTS.md and SKILL.md: Progressive Disclosure

Beyond llms.txt, Osmani recommends two additional files:

  • AGENTS.md: A machine-readable entry point for codebases. The agent learns what the project is, which files matter, and how to get started from this file.

  • SKILL.md: A capability file for APIs and services. It enables an agent to answer “what can this service do?” without reading all the documentation. It works through progressive disclosure: SKILL.md is the entry point, supporting references load only when needed. Token usage stays minimal.
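SKILL.md conventions are still settling; as a rough sketch, the frontmatter-plus-pointers shape used by Anthropic’s Agent Skills illustrates the progressive-disclosure idea (the skill name, description, and file paths below are invented):

```markdown
---
name: acme-orders
description: Look up, create, and refund orders via the Acme API.
---

# Acme Orders

Use this skill to manage orders. Load only the reference you need:

- `references/lookup.md` - query orders by ID or customer
- `references/refunds.md` - issue full or partial refunds
```

The agent reads only the frontmatter to decide whether the skill is relevant; the detailed references cost tokens only when actually loaded.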

The “Copy for AI” Button and Markdown Serving

One of Osmani’s low-effort, high-impact recommendations: add a “Copy for AI” button to your pages. This button copies a clean Markdown version of the page to the clipboard. Agents and users get clean content without HTML tag noise, navigation menus, and footer clutter.
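A minimal sketch of such a button, assuming the site already serves a Markdown variant of each page at the same path with .md appended (that serving setup is an assumption, covered below):

```html
<button id="copy-for-ai">Copy for AI</button>
<script>
  // Fetch the Markdown variant of the current page and copy it to the
  // clipboard. Requires a secure context (HTTPS) for the Clipboard API.
  document.getElementById("copy-for-ai").addEventListener("click", async () => {
    const res = await fetch(window.location.pathname + ".md");
    const markdown = await res.text();
    await navigator.clipboard.writeText(markdown);
  });
</script>
```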

A more systematic solution: serve Markdown content by appending .md to URLs or via a query parameter. Agents process Markdown at a far lower token cost than HTML. Navigation, menus, and footers unnecessarily consume the token budget.
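One way to wire this up is a small routing check at the server or edge that decides when to return the Markdown variant. A hypothetical sketch (the `format=md` parameter name is an assumption, not from Osmani's guide):

```javascript
// Decide whether a request should get the Markdown variant of a page.
// Two triggers: a ".md" suffix on the path, or an explicit query
// parameter. The base URL is only needed to parse relative paths.
function wantsMarkdown(url) {
  const u = new URL(url, "https://example.com");
  return u.pathname.endsWith(".md") || u.searchParams.get("format") === "md";
}

console.log(wantsMarkdown("/docs/quickstart.md"));        // true
console.log(wantsMarkdown("/docs/quickstart?format=md")); // true
console.log(wantsMarkdown("/docs/quickstart"));           // false
```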

agentic-seo: The Open Source Audit Tool

Alongside the guide, Osmani released an open-source audit tool called agentic-seo. The tool crawls your site and checks: robots.txt configuration, llms.txt presence, AGENTS.md file, SKILL.md file, token counts, Markdown accessibility, and agent-permissions.json file.

This tool offers a parallel approach to the Agent Readiness score Cloudflare announced this week. Used together, they provide a comprehensive assessment of your site’s readiness for the AI agent era.

Osmani’s Recommended Implementation Order

Osmani recommends implementing AEO in phases, not all at once:

  1. Audit your robots.txt (are you blocking AI bots?).

  2. Add an llms.txt file.

  3. Measure token counts and make them visible.

  4. Write SKILL.md files for APIs.

  5. Add “Copy for AI” buttons.

  6. Start monitoring AI traffic.
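For step 1, explicitly allowing AI crawlers in robots.txt might look like the fragment below. GPTBot (OpenAI), ClaudeBot (Anthropic), and PerplexityBot are real published user agents, but verify the current names against each vendor’s documentation before deploying:

```
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /
```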

SEO, GEO, AEO: Three Layers, One Strategy

At this point, it’s worth clarifying the terminology. In 2026, there are three distinct optimization layers:

  • SEO (Search Engine Optimization): Ranking in traditional search engines. Visibility in Google’s organic results.

  • GEO (Generative Engine Optimization) / AEO (Answer Engine Optimization): Being cited and visible in AI search engines (ChatGPT, Gemini, Perplexity, Claude). Having your content used as a source in AI-generated answers.

  • Agentic Engine Optimization (AEO): Having autonomous AI agents find, parse, and use your content to complete tasks. This goes a step further: the agent doesn’t just cite you; it uses your data to take action.

Digiday wrote in February 2026 that “agencies, publishers, and SEO experts are giving different abbreviations to the same trend.” That’s partly true: all three layers share the same foundational principles (structured data, clean content, entity signals). But Osmani’s framework demonstrates that the agentic layer has technical requirements distinct from the others. Token efficiency, llms.txt, AGENTS.md, and Markdown serving are requirements specific to the agentic world, absent from GEO.

HubSpot and Webflow AEO Tools Expand the Ecosystem

While Osmani’s guide laid out the theoretical framework, two major platforms integrated AEO into their products the same week.

HubSpot launched its AEO tool on April 14. It tracks how your brand appears in ChatGPT, Gemini, and Perplexity, scoring across five dimensions: sentiment analysis, entity quality, brand recognition, share of voice, and market position. The standout feature is CRM-powered prompt intelligence: it connects to your HubSpot CRM and, based on real customer data, suggests the prompts your customers are most likely to use in AI searches. HubSpot’s numbers are striking: customer organic traffic dropped 27% year-over-year, but AEO beta users saw 20% growth in referral traffic.

Webflow announced Webflow AEO on April 13: a closed-loop, agentic Answer Engine Optimization system with three components: measure AI visibility, recommend improvements, implement changes. Its AEO agents deliver prioritized recommendations, from broken links to outdated metadata to new content opportunities.

Mert’s Take: Why Agentic Engine Optimization Is This Week’s Most Important Development

The reason I made Osmani’s guide the headline: this is a concrete technical roadmap for the future of search, coming from Google’s own engineering director. Pichai draws the vision, Osmani explains the implementation, HubSpot and Webflow provide measurement tools, Cloudflare delivers the readiness score. All in the same week.

At Stradiji, we’re telling our clients: SEO isn’t dying; it’s layering. Traditional SEO remains the foundation. GEO and AEO (answer engine optimization) add the AI search visibility layer. Agentic Engine Optimization is the third layer: getting AI agents to find, understand, and use your site.

The first thing you should do this week: run Osmani’s agentic-seo tool on your site. If you don’t have an llms.txt file, create one. Measure your token counts. Review your robots.txt for AI bots. These steps are the foundation of preparation for the agent era.

