AEO vs SEO: 5 Fundamentals That Still Work for AI Search Engines (2026)
AEO didn't replace SEO. The same 5 fundamentals — sitemap, canonical, Open Graph, internal links, title and meta — that rank you on Google also feed ChatGPT, Claude, Perplexity, and Google AI Mode. Here's exactly what shifted in weight, and what's actually new in 2026.

Max Tsygankov · Founder, Crawloria
Published May 8, 2026 · 12 min read
By the time most teams sit down to draft an "AEO strategy" in early 2026, the trade press has already declared SEO dead at least three times — replaced, depending on who's writing, by Answer Engine Optimization, Generative Engine Optimization, AI Optimization, Search Everywhere Optimization, or some other letter combination.
The actual technical reality is calmer than the headlines. ChatGPT, Claude, and Perplexity all crawl the web with traditional bots — GPTBot, ClaudeBot, OAI-SearchBot, PerplexityBot. They read HTML. They parse Schema.org JSON-LD. They follow internal links. They render Open Graph tags when they cite you in an answer. The five fundamentals that ranked you on Google — XML sitemap, canonical URL, Open Graph metadata, internal links with descriptive anchor text, and accurate titles and meta descriptions — are the same five that get you cited in AI search.
What changed is the weight of each signal in a particular surface, plus a small layer of genuinely new things on top. This article walks through both: which SEO fundamentals still work for AI search engines, where the weight shifted in AEO vs SEO, and what's actually new for 2026.
The 5 SEO fundamentals that still work for AI search engines
The five fundamentals below are not new. They have governed Google ranking for over a decade. They also govern AI search visibility today, with predictable shifts in emphasis. We walk through each.
1. XML sitemap — still the canonical way to declare your URLs
GPTBot, OAI-SearchBot, ClaudeBot, and PerplexityBot all read /sitemap.xml the same way Googlebot does. A broken or missing sitemap means new pages get discovered only through external links and crawl chains — which on a young site means weeks of delay.
For AI search, the sitemap matters in two specific ways beyond what it does for Google:
- Discovery latency. Search-index crawlers (Class 2 in the four-classes taxonomy) crawl less frequently than Googlebot. A correct sitemap shaves the time-to-citation for new pages — sometimes by weeks.
- Selective indexing. Your sitemap tells AI crawlers what's important and what isn't. Auto-generated sitemaps that include thin pages, search results, or expired listings dilute the signal. Curate it the way you would for Google.
Declare the sitemap explicitly in robots.txt:
Sitemap: https://your-domain.com/sitemap.xml
This works for every major crawler and costs nothing.
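Curation can be spot-checked mechanically. Here is a minimal sketch in Python that pulls every URL out of a sitemap and flags the common dilution patterns; the heuristics in flag_dilution (parameterized variants and internal search results) are illustrative assumptions, not a standard, so tune them to your site:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text: str) -> list[str]:
    """Extract every <loc> value from a sitemap.xml document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc")]

def flag_dilution(urls: list[str]) -> list[str]:
    """Flag URLs that commonly dilute the signal: parameterized
    filter/sort variants and internal search-results pages."""
    return [u for u in urls if "?" in u or "/search" in u]

sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://your-domain.com/products/widget</loc></url>
  <url><loc>https://your-domain.com/products/widget?sort=price</loc></url>
  <url><loc>https://your-domain.com/search?q=widget</loc></url>
</urlset>"""

urls = sitemap_urls(sitemap)
print(flag_dilution(urls))
# The sort variant and the internal search page are candidates to curate out.
```

Run this against your generated sitemap before it ships; anything it flags is a page you probably don't want an AI crawler treating as important.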
2. Canonical URL — still the answer to duplicate content
When you have five URLs pointing at one product card — filter variants, UTM-tagged versions, sort-order parameters — the canonical tag tells the crawler which one is "real." Without it, AI crawlers index the duplicate they happened to fetch first, then cite that URL when answering. Users land on a stripped-down filter view of your product instead of the canonical page.
The mechanism is unchanged from classic SEO:
<link rel="canonical" href="https://your-domain.com/products/the-real-one" />
Canonical errors have a more visible failure mode in AI search than they do on Google. Google often deduplicates near-identical pages on its own. ChatGPT and Perplexity, building citations from a smaller index, are more likely to surface the wrong one if you don't declare the right one.
3. Open Graph tags — now your most-rendered surface
This is the fundamental whose weight increased the most in the AEO vs SEO shift, and it's worth understanding why. The Open Graph protocol, originally designed at Facebook for social-media previews, defines four required properties per the Open Graph specification at ogp.me: og:title, og:type, og:image, and og:url. AI answer interfaces — ChatGPT, Perplexity, Claude, Bing Copilot — pull these tags when they cite your page.
Three things follow:
- Your og:title and og:description are the first impression for an AI-cited reader. Often they are the only impression — if the answer engine summarizes the page, the user may never click through. Treat OG copy as ad copy, not as an afterthought.
- Your og:image shows up in citation cards in ChatGPT Search, Perplexity, and Bing Copilot. A missing or broken image leaves a gray rectangle next to your brand. A 1200x630 PNG rendered server-side (or generated dynamically per route) is the bare minimum.
- Your og:url should match your canonical. A common bug: canonical is set, but og:url is not — citation cards link to the wrong URL. Fix this once and the impact compounds across every future citation.
The shift here is not technical, it's surface-level. In classic SEO, OG tags affected social-media reach. In AEO, they are the visible chrome around AI citations.
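Both the missing-property case and the og:url-vs-canonical bug can be caught with a small head audit. A sketch using Python's standard-library HTMLParser — the HeadAudit class and the sample page are illustrative, not any tool's internals:

```python
from html.parser import HTMLParser

class HeadAudit(HTMLParser):
    """Collect the canonical link and every og:* meta property."""
    def __init__(self):
        super().__init__()
        self.canonical = None
        self.og = {}

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")
        elif tag == "meta" and (a.get("property") or "").startswith("og:"):
            self.og[a["property"]] = a.get("content")

# The four required properties per the Open Graph spec at ogp.me.
REQUIRED = ("og:title", "og:type", "og:image", "og:url")

def audit(html: str) -> list[str]:
    p = HeadAudit()
    p.feed(html)
    problems = [f"missing {prop}" for prop in REQUIRED if prop not in p.og]
    if p.canonical and p.og.get("og:url") and p.canonical != p.og["og:url"]:
        problems.append("og:url does not match canonical")
    return problems

page = """<head>
<link rel="canonical" href="https://your-domain.com/products/the-real-one" />
<meta property="og:title" content="The Real One" />
<meta property="og:type" content="product" />
<meta property="og:image" content="https://your-domain.com/og/real-one.png" />
<meta property="og:url" content="https://your-domain.com/products/the-real-one?utm=x" />
</head>"""

print(audit(page))
# Flags the og:url/canonical mismatch described above.
```

Wire a check like this into your template tests and the "citation card links to the wrong URL" class of bug never ships.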
4. Internal links with descriptive anchor text — now also clicked by agents
Two distinct AI behaviors depend on internal links:
- Search-index crawlers use them to traverse your site and build citation indices. This is unchanged from classic SEO crawling.
- Autonomous agents (ChatGPT Operator, Claude for Chrome, Perplexity Comet) click them on behalf of users. They cannot guess what "click here" means. They navigate by anchor text alone.
So an anchor that reads "size guide" or "shipping policy" gives an autonomous agent a clear path to the next step in a task. An anchor that reads "click here" or "this article" leaves the agent stuck at the previous page, sometimes abandoning the task entirely.
Two practical rules from this:
- Replace any "click here" / "this link" / "more" anchor with descriptive text. SEO advice has been telling you this for fifteen years. Agents now make the cost of ignoring it visible.
- Keep important conversion paths within 2-3 clicks of the homepage. Each click an agent takes consumes vision-token budget. Longer paths increase the chance the agent abandons before reaching the goal.
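Generic anchors are easy to surface mechanically. A sketch, again with the standard-library HTMLParser; the GENERIC list is a starting assumption, so extend it for your site and locale:

```python
from html.parser import HTMLParser

GENERIC = {"click here", "here", "this link", "this article",
           "more", "read more", "learn more"}

class AnchorAudit(HTMLParser):
    """Collect anchors whose text gives an agent nothing to navigate by."""
    def __init__(self):
        super().__init__()
        self._href = None
        self._text = []
        self.flagged = []  # (anchor_text, href) pairs

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            text = "".join(self._text).strip().lower()
            if text in GENERIC:
                self.flagged.append((text, self._href))
            self._href = None

audit = AnchorAudit()
audit.feed('<p>See our <a href="/size-guide">size guide</a> or '
           '<a href="/shipping">click here</a> for shipping.</p>')
print(audit.flagged)
# Only the "click here" anchor is flagged; "size guide" passes.
```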
5. Title tag and meta description — still the triage signal
Real-time fetchers (ChatGPT-User, Claude-User, Google-Agent — Class 3 in the taxonomy) often preview pages by title and meta description before deciding whether to fetch the full body. An empty <title> or missing <meta name="description"> gets the page skipped at triage.
The pattern is the same as classic SEO:
- Title under 60 characters, primary keyword early. Tested on every result surface for over a decade and unchanged for AEO.
- Meta description under 160 characters, descriptive of what's on the page. Both Google and AI engines penalize description-vs-content mismatch more aggressively in 2026 than they did pre-AI.
- Unique titles per page. Templated "Page Title — Brand Name" strings are now a citation-quality red flag — answer engines deprioritize sites with high template duplication.
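The triage limits above can be enforced in CI before a page ships. A sketch assuming the character limits from this article (60 for titles, 160 for descriptions):

```python
def triage(title: str, description: str) -> list[str]:
    """Flag the title/meta problems that get a page skipped at the
    real-time fetcher's preview stage."""
    problems = []
    if not title.strip():
        problems.append("empty title")
    elif len(title) > 60:
        problems.append(f"title too long ({len(title)} chars)")
    if not description.strip():
        problems.append("missing meta description")
    elif len(description) > 160:
        problems.append(f"description too long ({len(description)} chars)")
    return problems

print(triage(
    "AEO vs SEO: 5 Fundamentals That Still Work",
    "Which SEO fundamentals still feed ChatGPT, Claude, and "
    "Perplexity, and what is actually new in 2026.",
))
# An in-range title and description produce no flags.
```

Pair this with a uniqueness check across all rendered pages to catch the templated-title red flag as well.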
These are still the same five fundamentals. None of them is new. None of them was made obsolete by AEO or GEO. Now we look at what shifted in weight.
AEO vs SEO: where the actual weight shifted
The five fundamentals all still apply. What changed is how heavily each one matters in a given AI surface. Based on observed behavior across ChatGPT Search, Perplexity, Claude, and Google AI Overviews:
| Signal | Classic SEO weight | AI-search weight | Direction |
|---|---|---|---|
| Open Graph metadata | Medium (social previews) | High (citation cards) | ↑ |
| Internal anchor text | Medium (link equity) | High (agent navigation) | ↑ |
| Schema.org JSON-LD | Medium (rich results) | High (entity extraction) | ↑ |
| Author E-E-A-T | Medium (quality raters) | High (citation eligibility) | ↑ |
| Page speed | High (Core Web Vitals) | Medium (bots are patient) | ↓ |
| Backlinks (referring domains) | High (PageRank) | Still high (citation correlation) | → |
| llms.txt | n/a | Low-but-rising | new |
The "Direction" column reflects observed behavior, not formally documented ranking factors. Treat it as a working hypothesis, not a Google-published spec.
Two of these shifts deserve a closer look.
Why Open Graph and internal anchor text moved up
Both signals are now read by surfaces that classic SEO didn't contemplate. OG tags render inside ChatGPT's citation card, inside Perplexity's source list, inside Claude's reply context. Internal anchors are clicked by autonomous agents trying to complete user tasks. In classic SEO these signals served indirect roles — social referral and link equity. In AI search they serve direct, user-facing roles. A weak og:image is now visible to every cited reader.
Why page speed weight softened (a little)
Be careful with this claim. Page speed still matters for human users, and humans land on your site as a result of AI citations. The shift is narrow: AI crawlers and autonomous agents are typically more patient than an impatient mobile user. They will wait several seconds for a page that a human might bounce on. So the part of "page speed" that's about retaining a panicked human still matters fully. The part that's about not getting deindexed for being slow has loosened, because AI bots aren't running tight Core Web Vitals budgets the way Googlebot's mobile-first indexing does.
Don't treat this as license to ship slow pages. Treat it as license to stop optimizing past the threshold of "fast enough that humans don't bounce."
What's actually new in AEO vs SEO for 2026
Two genuinely new things matter, both sitting on top of the SEO fundamentals rather than replacing them.
llms.txt — small adoption today, growing
The /llms.txt convention, proposed by Jeremy Howard at Answer.AI in September 2024, is a markdown file at your domain root that gives AI a curated table of contents of your site. The llms.txt explainer goes deep on what it does and who reads it. The short version:
- It is read today by retrieval pipelines like Cloudflare AutoRAG, Mintlify-generated docs sites, and developer-focused integrators (Cursor, Continue).
- It is not read at query time by ChatGPT, Claude, or Perplexity in regular user sessions. Those operate on their own crawled indices.
- Adoption is growing but uneven. As of 2026, the directory at directory.llmstxt.cloud lists about 1,655 sites. A year ago it listed under 200.
If your site has substantial documentation, a help center, or a developer-tools surface, publishing a curated llms.txt is a 30–60-minute investment that may pay off through AI-driven retrieval. If you run a classic DTC Shopify store or a small marketing site, it is lower priority.
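For a documentation site, a curated llms.txt can be as small as the sketch below. The site, URLs, and descriptions are hypothetical; the shape (an H1, a blockquote summary, sections of annotated links, an Optional section for lower-priority material) follows the llms.txt proposal:

```markdown
# Acme Docs

> Developer documentation for the Acme API (hypothetical example).

## Docs
- [Quickstart](https://acme.example/docs/quickstart): auth, first request, rate limits
- [API reference](https://acme.example/docs/api): every endpoint with request/response examples

## Optional
- [Changelog](https://acme.example/changelog): release notes by month
```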
Bot-class-aware policy in robots.txt and WAF
Classic SEO treated bot management as one decision — block bad bots, allow Google. AI-search bot management has four decisions, one per class.
The four classes of AI bots article goes deep. The short version for AEO vs SEO purposes:
- Training crawlers (GPTBot, ClaudeBot, CCBot, Google-Extended): allow unless you have a legal opt-out reason. Training inclusion drives what the model "knows" about your brand for the next 1–3 years.
- Search-index crawlers (OAI-SearchBot, PerplexityBot): always allow. There is no upside to blocking these — they are how you get cited in answers.
- Real-time fetchers (ChatGPT-User, Claude-User, Google-Agent): robots.txt is an unreliable control here, because the fetch is user-initiated rather than crawler-scheduled and enforcement varies by vendor. Treat these as referral traffic.
- Autonomous agents (Operator, Claude for Chrome, Comet): the only actionable layer is your WAF — Cloudflare Bot Fight Mode and similar. Whitelist verified AI bots above any blocking rule.
This is the bot-management half of AEO. Classic SEO never had to think about it because Googlebot was the only consequential crawler. In 2026 it is seven distinct bots in four classes.
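Put together, a bot-class-aware robots.txt might look like the sketch below. Consecutive User-agent lines form one group that the following rules apply to; the Disallow path is a placeholder, and the agent-class WAF layer has to live outside robots.txt entirely:

```text
# Class 1 — training crawlers: allowed (drives what models know about you)
User-agent: GPTBot
User-agent: ClaudeBot
User-agent: CCBot
User-agent: Google-Extended
Allow: /

# Class 2 — search-index crawlers: always allowed (how you get cited)
User-agent: OAI-SearchBot
User-agent: PerplexityBot
Allow: /

# Everyone else: default policy
User-agent: *
Disallow: /admin/

Sitemap: https://your-domain.com/sitemap.xml
```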
AEO vs SEO vs GEO: clearing up the alphabet soup
Three terms get used interchangeably and shouldn't be.
- SEO (Search Engine Optimization) is the original discipline: optimize a site to rank on a traditional search engine result page. Still applies wherever there are ranked organic results, which includes Google AI Mode's classic-results tab.
- AEO (Answer Engine Optimization) is the variant focused on being the cited source in an AI-generated answer — ChatGPT Search, Perplexity, Google AI Overviews, Bing Copilot. The optimization target is citation, not rank.
- GEO (Generative Engine Optimization) is, in most usage, a near-synonym for AEO with a different rhetorical emphasis ("generative engine" instead of "answer engine"). Some authors stretch GEO to cover all generative AI surfaces, including those without explicit citation. The practical strategies are largely the same. The GEO-vs-SEO debate in marketing blogs is, mostly, a vocabulary debate.
You don't need to pick one term. You need to apply the technical fundamentals (the five above) and add the two new things (llms.txt and bot-class policy) on top. The acronym you use to describe what you're doing is a marketing decision, not a technical one.
What "AI will kill SEO" gets wrong
Three claims show up in the trade press and don't survive contact with how AI engines actually work.
Claim 1: "AI search ignores Google's signals." False. AI bots crawl HTML the same way Google does, parse Schema.org the same way Google does, and follow internal links the same way Google does. They diverge in surface (citation card vs result page) and in some signal weights (OG tags matter more, page speed matters less for the bot itself), but the underlying signals are nearly identical.
Claim 2: "AEO requires entirely new content." Mostly false. The same well-structured article — clear H1, descriptive H2s, FAQ section with FAQPage schema, factual claims with citations, accurate title and meta — that ranks on Google also gets cited by ChatGPT and Perplexity. The largest delta is at the start of the article: an answer-first opening (a one-sentence direct answer in the first paragraph) increases citation eligibility. That is a paragraph-level edit, not a content rewrite.
Claim 3: "Backlinks don't matter anymore for AI." Partially false. Backlinks correlate with AI citation frequency, just less mechanically than they did with PageRank. Recent analysis suggests referring-domain count is one of the stronger predictors of ChatGPT citation eligibility, alongside brand-mention velocity across the open web. Treat backlinks as still load-bearing, not a relic.
The honest framing: AI search added a new surface, shifted some signal weights, and introduced a small layer of net-new infrastructure. It did not invalidate the substrate. Most of the SEO playbook still works, and the parts that don't are smaller than the trade press suggests.
Where to start: a 7-day AEO upgrade for an existing SEO program
If you already have a solid SEO program, the bridge to AEO is a one-week sprint. Here is the order.
Day 1 — Audit the five fundamentals. Crawl your site (Screaming Frog, Sitebulb, or a free tool). Verify: sitemap declared in robots.txt, canonical tags present and correct, Open Graph tags on every public page (with og:url matching canonical), internal anchors descriptive, titles and meta descriptions unique under their character limits.
Day 2 — Verify bot access by class. Run a free Crawloria audit on your homepage. The audit sends real HTTP requests with each AI crawler's documented user-agent — GPTBot, OAI-SearchBot, ClaudeBot, PerplexityBot, CCBot, Google-Extended — and reports which ones get HTTP 200 vs blocked. Most AEO problems trace to a Cloudflare "Block AI Bots" rule someone enabled and forgot.
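Short of sending real requests, you can dry-run your robots.txt against each documented user-agent token with Python's urllib.robotparser. This checks only the robots.txt layer, not WAF rules, which is why a live audit still matters; the robots.txt body below is a deliberately misconfigured example:

```python
from urllib.robotparser import RobotFileParser

AI_BOTS = ["GPTBot", "OAI-SearchBot", "ClaudeBot",
           "PerplexityBot", "CCBot", "Google-Extended"]

# Example file with exactly the kind of forgotten block rule the audit finds.
robots_txt = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

blocked = [bot for bot in AI_BOTS
           if not rp.can_fetch(bot, "https://your-domain.com/")]
print(blocked)
# Only GPTBot is blocked by this file; the other five fall through to "*".
```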
Day 3 — Schema.org pass. Add Article schema to blog posts, Organization to the homepage, Product to product pages, and FAQPage schema to any page with a real FAQ section. Validate at validator.schema.org. Avoid SoftwareApplication schema unless you have real, displayed reviews — Google rejects it otherwise.
Day 4 — Answer-first rewrites on the top 10 informational pages. Move the direct answer to the first paragraph. Before: "There are many factors to consider when…" After: "[Direct one-sentence answer.] [Then expand.]"
Day 5 — Author profiles. Replace anonymous bylines with named authors who have on-site profile pages, linked LinkedIn, and a paragraph of credentials. AI engines weight named authorial expertise the same way Google's E-E-A-T does — sometimes more, because citations carry the author through.
Day 6 — llms.txt where it makes sense. If you publish documentation, a help center, or a developer surface, draft a curated llms.txt (5–25 links, descriptive one-liners). Use the free Crawloria llms.txt generator for a starting point.
Day 7 — Measure baseline. Set up Google Search Console if you haven't. Bookmark Bing Webmaster Tools (it surfaces ChatGPT-Search-adjacent queries since ChatGPT Search uses Bing's index). Run a recurring brand search prompt in ChatGPT and Perplexity once a month, log the citations, and watch the trend.
That is the full bridge. Nothing on this list is exotic. Nothing requires a new content strategy. The work is auditing, completing, and slightly reweighting an SEO program you almost certainly already have.
Test how AI search engines actually see your site
Crawloria runs the bot-access checks above plus 17 more agent-readability signals — robots.txt parsing, JS-render comparison, Schema.org validity, llms.txt presence, Open Graph completeness, canonical alignment, and more — in 20 seconds. Free. No signup. The audit returns a 0–100 score and a prioritized list of fixes ordered by AI-search impact.