Search Engine Land contributor Ludwig Makhyan lays out a practical course for what he calls generative engine optimization (GEO): control how AI agents access your site, make content fragment-ready, and connect facts with schema so models can cite your pages. As Makhyan puts it, “Your site must become the de facto source of truth for the world’s models.” This shift doesn’t replace traditional SEO; it extends technical SEO to include new protocols, extraction-friendly structures, and freshness signals that feed retrieval-augmented generation (RAG).

Makhyan’s piece stresses practical controls: robots.txt to allow or block specific agent user-agents, awareness of Perplexity and Claude naming conventions, and adoption of llms.txt as an optional, future-facing protocol. He writes that llms.txt can appear either as a concise map of links or as an llms-full.txt aggregate that reduces agent crawling overhead. The article also advises prioritizing semantic HTML (<article>, <section>, <aside>) to separate core facts from boilerplate and to keep context windows lean for agents.
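For reference, the llms.txt proposal (per the llmstxt.org draft) describes a Markdown file served at /llms.txt: an H1 site name, a blockquote summary, and H2 sections listing links with short descriptions. A minimal sketch, using a hypothetical site and placeholder URLs:

```markdown
# Example Co

> Example Co publishes product documentation and pricing for its analytics platform.

## Docs

- [Quickstart](https://example.com/docs/quickstart.md): Install and run a first query
- [API reference](https://example.com/docs/api.md): Endpoints and authentication

## Optional

- [Changelog](https://example.com/changelog.md): Release history
```

The llms-full.txt variant Makhyan mentions inlines the full page content into one file instead of linking out, trading file size for fewer agent fetches.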
Start with access and visibility. If you want certain agents to use your content for grounding or real-time answers, add clear user-agent rules in robots.txt. For example, to allow a retrieval bot but block a training bot, you might use separate allowances and disallows. Keep in mind the landscape is dynamic: tools and user-agent names change, so build periodic rule reviews into your maintenance cadence.
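A sketch of such a split policy is below. The tokens shown (OAI-SearchBot, PerplexityBot, GPTBot, ClaudeBot) are vendor-published user-agents as of this writing, but names and crawler purposes change, so verify against each vendor's current documentation before deploying:

```
# Hypothetical policy: allow search/retrieval agents, block training crawlers.

User-agent: OAI-SearchBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /
```

Note that robots.txt is advisory: well-behaved crawlers honor it, but it is not an access control mechanism.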
Next, remove extractability blockers. Audit pages that rely heavily on client-side JavaScript to render critical content. Convert those pages to server-rendered or pre-rendered HTML where possible so agents and crawlers see the same canonical facts users do. Use concise headings, tables, definition lists, and schema to make facts easily parsable.
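A fragment-ready page might look like the following sketch (product name and values are hypothetical): core facts sit in a labeled <section> with a definition list, while promotional material is isolated in <aside> so an agent can skip it cleanly.

```html
<article>
  <h1>Acme Widget 3000: Specifications</h1>
  <p>Updated <time datetime="2025-06-01">June 1, 2025</time></p>
  <section>
    <h2>Key facts</h2>
    <dl>
      <dt>Weight</dt><dd>1.2 kg</dd>
      <dt>Battery life</dt><dd>12 hours</dd>
    </dl>
  </section>
  <aside>
    <!-- Related links and promos live here, outside the core fact block -->
  </aside>
</article>
```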
Many teams are experimenting with llms.txt, but adoption across major AI services remains inconsistent. As Google’s John Mueller bluntly put it on social media: “FWIW no AI system currently uses llms.txt.” That doesn’t mean llms.txt is worthless; it can be helpful for future-proofing and for services that decide to adopt it. However, don’t depend on it as a primary delivery mechanism — treat it as a supplemental map rather than a replacement for sitemaps and strong on-page structure.
Makhyan recommends several audit signals that translate into client KPIs. Start with log file analysis to identify agent traffic: who is visiting, which endpoints they read, and how often. Track citation share — mentions and excerpts that cite your content in answers — using brand monitoring and third-party SEO tools. For pages that likely feed RAG, add persistent tracking parameters and measure zero-click referrals and downstream “read more” clicks. Use these signals to create monthly GEO health reports.
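As a sketch of the log-analysis step, the snippet below counts requests per AI user-agent and path from combined-format access log lines. The agent list, regex, and sample lines are illustrative assumptions; real log formats and user-agent tokens vary, so adapt both to your server before trusting the numbers.

```python
import re
from collections import Counter

# User-agent substrings for known AI agents (names change over time;
# verify against vendor documentation before relying on this list).
AI_AGENTS = ["GPTBot", "OAI-SearchBot", "ChatGPT-User",
             "ClaudeBot", "PerplexityBot", "Google-Extended"]

# Minimal pattern for Apache/Nginx combined log format: capture the
# request path and the final quoted user-agent string.
LOG_LINE = re.compile(r'"(?:GET|POST|HEAD) (?P<path>\S+)[^"]*".*"(?P<ua>[^"]*)"$')

def agent_hits(lines):
    """Count requests per (agent, path) for known AI user-agents."""
    counts = Counter()
    for line in lines:
        m = LOG_LINE.search(line)
        if not m:
            continue
        for agent in AI_AGENTS:
            if agent in m.group("ua"):
                counts[(agent, m.group("path"))] += 1
    return counts

sample = [
    '1.2.3.4 - - [01/Jun/2025:10:00:00 +0000] "GET /docs/api HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; GPTBot/1.0)"',
    '5.6.7.8 - - [01/Jun/2025:10:01:00 +0000] "GET /pricing HTTP/1.1" 200 1024 "-" "PerplexityBot/1.0 (+https://perplexity.ai/bot)"',
]
print(agent_hits(sample))
```

Aggregating these counts weekly gives you the "who is visiting, which endpoints, how often" baseline for the monthly GEO health report.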
Finally, keep freshness visible: mark publication and update dates with <time datetime=""> and surface freshness in schema.

Traditional ranking signals aren't gone; they're augmented. GEO asks site owners to think beyond ranking positions to being authoritative, extractable, and current. Technical teams should bake in automation for fragment creation and schema generation, while content teams should structure pages so that key facts are available in short, well-labeled chunks. Makhyan's closing call is clear: "Start with your robots.txt and work your way up to structure, fragmented data, and extractability. Audit your success over time and keep tweaking your efforts until you see positive results. Then, scale with automation."
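Surfacing freshness in schema can look like the following JSON-LD sketch (headline and dates are placeholders); datePublished and dateModified are standard schema.org Article properties:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Acme Widget 3000: Specifications",
  "datePublished": "2025-03-15",
  "dateModified": "2025-06-01"
}
</script>
```

Keeping dateModified accurate (updated only on substantive changes) matters more than updating it often; stale or inflated dates undermine the freshness signal.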
This summary and guidance are based on Ludwig Makhyan’s Search Engine Land article, “Technical SEO for generative search: Optimizing for AI agents” (Mar 31, 2026). For a practical note on llms.txt adoption, we referenced John Mueller’s comment that “FWIW no AI system currently uses llms.txt.”
Original article: https://searchengineland.com/technical-seo-generative-search-optimizing-ai-agents-473039