Why technical SEO now includes AI agents — and what to do about it

Search Engine Land contributor Ludwig Makhyan lays out a practical course for what he calls generative engine optimization (GEO): control how AI agents access your site, make content fragment-ready, and connect facts with schema so models can cite your pages. As Makhyan puts it, “Your site must become the de facto source of truth for the world’s models.” This shift doesn’t replace traditional SEO; it extends technical SEO to include new protocols, extraction-friendly structures, and freshness signals that feed retrieval-augmented generation (RAG).

Technical SEO for generative search: Optimizing for AI agents

Key takeaways at a glance

  • Agent access control matters: use robots.txt and consider llms.txt (but don’t expect universal adoption).
  • Design content for extractability: chunk facts, use semantic HTML, and avoid bloated JS that hides content.
  • Structured data is connective tissue: Organization, FAQPage, HowTo, and newer properties like SignificantLink help models map entities.
  • Freshness and RAG readiness: show last-updated signals and keep performance fast so agents can fetch reliable content.
  • Audit for impact: use log files, citation share, and custom tracking to measure how agents interact with your site.

What the original article recommends

Makhyan’s piece stresses practical controls: robots.txt to allow or block specific agent user-agents, awareness of Perplexity and Claude naming conventions, and adoption of llms.txt as an optional, future-facing protocol. He writes that llms.txt can appear either as a concise map of links or as an llms-full.txt aggregate that reduces agent crawling overhead. The article also advises prioritizing semantic HTML (<article>, <section>, <aside>) to separate core facts from boilerplate and to keep context windows lean for agents.
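The llms.txt proposal uses plain markdown: an H1 site name, a short blockquote summary, and H2 sections of annotated links. A minimal sketch (site name, URLs, and descriptions below are placeholders, not from the original article):

```
# Example Co

> Concise, plain-language summary of what this site covers, written for LLM consumption.

## Docs

- [Getting started](https://example.com/docs/start.md): setup and first steps
- [API reference](https://example.com/docs/api.md): endpoints and authentication

## Optional

- [Changelog](https://example.com/changelog.md): release history
```

An llms-full.txt variant would inline the full text of those pages into one file, trading file size for fewer agent fetches.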

Practical implications for site owners and SEOs

Start with access and visibility. If you want certain agents to use your content for grounding or real-time answers, add clear user-agent rules in robots.txt. For example, to allow a retrieval bot but block a training bot, you might use separate allowances and disallows. Keep in mind the landscape is dynamic: tools and user-agent names change, so build regular rule reviews into your maintenance cadence.
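As an illustration, here is one way to allow a retrieval crawler while blocking a training crawler, using OpenAI's published token names as the example (token names change over time, so verify the current list against each vendor's crawler documentation before deploying):

```
# Allow OpenAI's search/retrieval crawler to fetch pages for answers.
User-agent: OAI-SearchBot
Allow: /

# Block the crawler used to gather training data.
User-agent: GPTBot
Disallow: /

# Default rule for everyone else.
User-agent: *
Allow: /
```

Per the Robots Exclusion Protocol, the most specific matching user-agent group wins, so the blanket `Allow` does not override the `GPTBot` block.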

Next, remove extractability blockers. Audit pages that rely heavily on client-side JavaScript to render critical content. Convert those pages to server-rendered or pre-rendered HTML where possible so agents and crawlers see the same canonical facts users do. Use concise headings, tables, definition lists, and schema to make facts easily parsable.
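To make that concrete, here is a hypothetical pricing page fragment (product names and prices are invented) showing how semantic HTML and a definition list keep core facts separate from boilerplate:

```html
<article>
  <h1>Widget pricing</h1>
  <section>
    <h2>Plans</h2>
    <!-- Definition list: each fact is a labeled, self-contained chunk -->
    <dl>
      <dt>Starter</dt>
      <dd>$19/month, up to 5 seats</dd>
      <dt>Team</dt>
      <dd>$49/month, up to 25 seats</dd>
    </dl>
  </section>
  <aside>
    <!-- Related links, promos, and other boilerplate live here,
         outside the core facts an agent should extract -->
  </aside>
</article>
```

An agent trimming for a lean context window can keep the `<article>`/`<section>` content and discard the `<aside>` without losing any canonical facts.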

On llms.txt: proceed with pragmatic caution

Many teams are experimenting with llms.txt, but adoption across major AI services remains inconsistent. As Google’s John Mueller bluntly put it on social media: “FWIW no AI system currently uses llms.txt.” That doesn’t mean llms.txt is worthless; it can be helpful for future-proofing and for services that decide to adopt it. However, don’t depend on it as a primary delivery mechanism — treat it as a supplemental map rather than a replacement for sitemaps and strong on-page structure.

How to measure GEO work

Makhyan recommends several audit signals that translate into client KPIs. Start with log file analysis to identify agent traffic: who is visiting, which endpoints they read, and how often. Track citation share — mentions and excerpts that cite your content in answers — using brand monitoring and third-party SEO tools. For pages that likely feed RAG, add persistent tracking parameters and measure zero-click referrals and downstream “read more” clicks. Use these signals to create monthly GEO health reports.
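A log-file pass like the one described above can be sketched in a few lines. This is a minimal example, not a full analytics pipeline: it matches combined-log-format lines and counts hits per (agent, path) pair. The user-agent tokens listed are illustrative and will drift over time, so verify them against each vendor's documentation.

```python
import re
from collections import Counter

# Known AI agent user-agent tokens. Illustrative only -- this list
# changes as vendors rename or add crawlers; review it regularly.
AGENT_TOKENS = ["GPTBot", "ClaudeBot", "PerplexityBot", "Google-Extended", "CCBot"]

# Combined log format: we only need the request path and the user-agent string.
LOG_RE = re.compile(
    r'"(?:GET|POST|HEAD) (?P<path>\S+)[^"]*" \d{3} \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

def agent_hits(log_lines):
    """Count (agent_token, path) pairs for known AI crawlers in access-log lines."""
    counts = Counter()
    for line in log_lines:
        m = LOG_RE.search(line)
        if not m:
            continue  # skip lines that don't match the expected format
        ua = m.group("ua")
        for token in AGENT_TOKENS:
            if token in ua:
                counts[(token, m.group("path"))] += 1
    return counts
```

Run weekly, the output shows which agents read which endpoints and how often, which feeds directly into the monthly GEO health report.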

Actionable checklist

  • Update robots.txt with targeted rules for known agent user-agents (maintain a change log).
  • Audit and reduce JavaScript-rendered critical content; provide server-side or pre-rendered fallbacks.
  • Add and validate relevant schema: Organization, sameAs, FAQPage, HowTo, and SignificantLink where applicable.
  • Include a visible last-updated <time datetime=""> and surface freshness in schema.
  • Run weekly log-file checks to detect agent traffic and adjust access rules.
  • Consider llms.txt as a low-effort, optional addition but don’t rely on it exclusively.
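The freshness and schema items in the checklist pair naturally: a visible `<time>` element for readers, and matching structured data for agents. A sketch (publisher name, headline, and URLs are placeholders):

```html
<p>Last updated: <time datetime="2026-03-31">March 31, 2026</time></p>

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Widget pricing explained",
  "dateModified": "2026-03-31",
  "publisher": {
    "@type": "Organization",
    "name": "Example Co",
    "sameAs": ["https://www.linkedin.com/company/example-co"]
  }
}
</script>
```

Keeping the visible date and `dateModified` in sync matters: a mismatch is exactly the kind of inconsistency that undermines trust in your freshness signals.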

Where this changes your strategy

Traditional ranking signals aren’t gone — they’re augmented. GEO asks site owners to think beyond ranking positions to being authoritative, extractable, and current. Technical teams should bake in automation for fragment creation and schema generation, while content teams should structure pages so that key facts are available in short, well-labeled chunks. Makhyan’s closing call is clear: “Start with your robots.txt and work your way up to structure, fragmented data, and extractability. Audit your success over time and keep tweaking your efforts until you see positive results. Then, scale with automation.”

Further reading and attribution

This summary and guidance are based on Ludwig Makhyan’s Search Engine Land article, “Technical SEO for generative search: Optimizing for AI agents” (Mar 31, 2026). For a practical note on llms.txt adoption, we referenced John Mueller’s comment that “FWIW no AI system currently uses llms.txt.”

Original article: https://searchengineland.com/technical-seo-generative-search-optimizing-ai-agents-473039
