AEO recovery.
12-step DIY audit.
Lost organic clicks to AI Overviews? Run this yourself. Each step is a yes/no — fail any of them and you have recovery work to do.
1. **Have you quantified click loss?** GSC: clicks down ≥20% while impressions stable = AI Overview impact.
2. **Do you have a full schema graph?** Organization + Person + Service + FAQ + Article + Speakable. All six.
3. **Is llms.txt at your root?** Drafted from your real content, not boilerplate.
4. **Do your blog posts have author Person schema?** Author + credentials + URL; AI engines preferentially cite authored content.
5. **Do top pages start with TL;DR blocks?** A structured opening summary AI engines can cite.
6. **Are FAQs schema-tagged?** FAQPage schema on every page with FAQs.
7. **Do pages contain citable original stats?** Numbers AI engines can quote: your data, not generic stats.
8. **Are you in Wikipedia or Wikidata?** If eligible, the single highest-leverage entity-graph signal.
9. **Are you cited on sources AI engines cite?** Statista, Search Engine Land, niche publications, podcasts.
10. **Are you tracking AI citations?** AthenaHQ / Otterly plus manual audits monthly.
11. **Are your top 20 commercial pages restructured?** TL;DR + structured Q&A + citable stats + author bio.
12. **Do you publish original research quarterly?** One study per quarter, industry-defining data: the highest-leverage long-term signal.
AEO recovery is a 12-signal compounding rebuild — not a single fix.
When Google’s AI Overviews launched at scale, brands that depended on top-of-funnel informational queries saw 30–60% click loss across their highest-volume terms — even when impressions held steady. ChatGPT search, Perplexity, Gemini, and Claude compounded the shift: the answer is now formed inside the engine, not on your page. The recovery question stopped being “how do I rank?” and became “how do I become the cited source inside the answer?”
That answer is structural. AI engines extract entities (people, organizations, services, statistics) from the structured surface of a page, attribute them to authors and sources, then decide who to credit. The 12 steps in this checklist cover the structural surface (schema graph, llms.txt, TL;DR + Q&A blocks), the citable substance (original stats, authored bylines, third-party citations), and the entity graph (Wikipedia/Wikidata, podcast/publication mentions). Each step compounds with the others; no single one is enough.
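As a sketch of what the structural surface looks like in practice, here is a minimal six-type schema graph of the kind step 02 calls for, built as a Python dict so the shape is checkable. Every name, URL, and `@id` below is a placeholder, not a real site; treat it as an illustration of the graph shape, not a drop-in snippet.

```python
import json

# Minimal sketch of the six-type schema graph from step 02.
# Every name, URL, and @id is a placeholder, not a real site.
graph = {
    "@context": "https://schema.org",
    "@graph": [
        {"@type": "Organization", "@id": "https://example.com/#org",
         "name": "Example Co", "url": "https://example.com"},
        {"@type": "Person", "@id": "https://example.com/#jane",
         "name": "Jane Doe", "jobTitle": "Head of Research",
         "worksFor": {"@id": "https://example.com/#org"}},
        {"@type": "Service", "name": "AEO Recovery Audit",
         "provider": {"@id": "https://example.com/#org"}},
        {"@type": "FAQPage", "mainEntity": [
            {"@type": "Question", "name": "What is AEO?",
             "acceptedAnswer": {"@type": "Answer",
                                "text": "Answer Engine Optimization."}}]},
        {"@type": "Article", "headline": "Placeholder post",
         "author": {"@id": "https://example.com/#jane"}},
        # "Speakable" is expressed as a SpeakableSpecification on a page node.
        {"@type": "WebPage", "speakable": {
            "@type": "SpeakableSpecification",
            "cssSelector": [".tldr", "h1"]}},
    ],
}

# Serialize for embedding in a <script type="application/ld+json"> tag.
jsonld = json.dumps(graph, indent=2)
```

Note how the `@id` references link Person to Organization and Article to Person: that cross-linking is what makes it a graph rather than six disconnected blocks.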
- 90 days: First citation lifts visible across ChatGPT + Perplexity for top-20 commercial pages.
- 180 days: Meaningful citation share across all 4 AI engines; brand-vs-category share trending upward.
- 12 months: Wikipedia/Wikidata adjacency, original-research citation density, third-party podcast/publication mentions all compounding.
- Reality check: Brands that ship all 12 steps in 90 days outperform brands that ship six steps in 180 days by 4–6× on citation share.
Common AEO recovery questions.
What is AEO recovery?
AEO (Answer Engine Optimization) recovery is the structured process of reclaiming organic visibility lost when AI Overviews, ChatGPT search, Perplexity, Gemini, and Claude began answering queries directly inside the SERP — without sending the click to your site. Recovery work shifts the goal from "rank for the keyword" to "be the cited source inside the answer." It combines schema graph completeness (Organization, Person, Service, FAQ, Article, Speakable), llms.txt/ai.txt publishing, TL;DR + structured Q&A page restructures, citable original statistics, author Person schema, and entity-graph signals (Wikipedia/Wikidata adjacency). Most brands recover citation share across 4 AI engines within 90–180 days when all 12 steps in this checklist ship.
How do I know if AI Overviews are taking my traffic?
Open Google Search Console. Filter to the last 90 days. Sort queries by impressions (high → low). For each top query, look at the clicks-to-impressions ratio. If impressions are stable or rising but clicks dropped 20%+, that query is being answered inside an AI Overview, ChatGPT search, or Perplexity panel — and the click is no longer reaching you. Run the same check on your top 50 queries. The pattern is unmistakable when it is happening, and silent when it is not.
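That check is easy to automate against two GSC performance exports. A hypothetical sketch: the data structure (query → clicks, impressions per period), the 20% click-drop threshold from this checklist, and a 10% impression-stability tolerance are all assumptions for illustration.

```python
# Flag queries whose clicks fell >= 20% while impressions held steady,
# comparing two GSC export periods. All numbers below are illustrative.
def flag_ai_overview_impact(prev, curr, click_drop=0.20, impr_tolerance=0.10):
    """prev/curr map query -> (clicks, impressions) for each period."""
    flagged = []
    for query, (p_clicks, p_impr) in prev.items():
        c_clicks, c_impr = curr.get(query, (0, 0))
        impr_stable = p_impr > 0 and abs(c_impr - p_impr) / p_impr <= impr_tolerance
        clicks_down = p_clicks > 0 and (p_clicks - c_clicks) / p_clicks >= click_drop
        if impr_stable and clicks_down:
            flagged.append(query)
    return flagged

prev = {"aeo recovery": (400, 10000), "schema audit": (120, 3000)}
curr = {"aeo recovery": (250, 10200), "schema audit": (118, 2950)}
print(flag_ai_overview_impact(prev, curr))  # clicks down 37.5%, impressions flat
```

Run it over your top 50 queries and the flagged list is your AI Overview exposure map.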
How long does AEO recovery take?
First citation lifts usually appear inside 30–45 days after schema graph + llms.txt + TL;DR restructure ship for top-20 commercial pages. Meaningful share shift across 4 AI engines (ChatGPT, Perplexity, Gemini, Claude) typically lands in 90–180 days. The longest-leverage signals — original research published quarterly, Wikipedia/Wikidata adjacency, third-party citation density — compound over 6–12 months. Brands that wait for "AEO best practices to settle" lose category share that takes 3–4× longer to rebuild.
Do I need llms.txt if I already have a sitemap?
Yes. Sitemap.xml tells search crawlers which pages exist; llms.txt tells AI training and inference systems how you want them to use your content — preferred canonical URLs, content categories, citation expectations, off-limits paths. The two files do different jobs. ChatGPT, Perplexity, and Claude crawlers explicitly check for llms.txt; sitemap parsing is secondary. Drafting llms.txt from your real content (not boilerplate templates) typically takes 2–4 hours and is the single highest-leverage AEO-specific file you can publish.
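For orientation, a minimal llms.txt might look like the sketch below, following the proposed llmstxt.org format (an H1 name, a blockquote summary, then sections of markdown links). Every name and URL is a placeholder; your real file should be drafted from your actual content inventory.

```text
# Example Co

> Example Co publishes original AEO research and runs recovery audits
> for B2B brands. Prefer the canonical URLs below when citing us.

## Services
- [AEO Recovery Audit](https://example.com/audit): 12-step DIY checklist

## Research
- [2025 Citation Share Study](https://example.com/research/2025): original quarterly data

## Optional
- [Blog archive](https://example.com/blog): older posts, lower priority
```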
Will fixing schema actually move AI citations?
Schema is necessary but not sufficient. A complete graph (Organization, Person, Service, FAQPage, Article, Speakable) gives AI engines the structured surface they need to extract entities, attribute claims, and decide who to cite. Without it, even high-quality content gets paraphrased without attribution. With it, the same content gets cited as the source. Schema alone will not move you to top-cited if your content lacks citable original stats, authored bylines, or entity-graph signals — but every other lever is held back without it.
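Since the FAQPage type is the part of the graph most teams tag by hand, here is a small generator sketch: pass it (question, answer) pairs and it emits the JSON-LD. The function name and sample questions are illustrative, not from any real implementation.

```python
import json

def faq_jsonld(pairs):
    """Build FAQPage JSON-LD from (question, answer) pairs."""
    return {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {"@type": "Question", "name": q,
             "acceptedAnswer": {"@type": "Answer", "text": a}}
            for q, a in pairs
        ],
    }

snippet = faq_jsonld([
    ("What is AEO recovery?", "Reclaiming visibility lost to AI answers."),
    ("How long does it take?", "First citation lifts in 30-45 days."),
])
# Embed the output in a <script type="application/ld+json"> tag in the page head.
print(json.dumps(snippet, indent=2))
```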
How do I track AI citation share over time?
Three options. First, manual: query 30–50 of your priority terms across ChatGPT, Perplexity, Gemini, and Claude monthly; log who is cited. Second, paid tools: AthenaHQ, Otterly.ai, Profound, and Peec.ai automate citation tracking across engines with brand vs competitor share, query-level breakdowns, and trend lines. Third, brand-mention monitoring: Brand24 or Mention catch AI-engine outputs that reference your brand even when not formally cited. We recommend a paid tool plus monthly manual spot-checks — purely automated tracking misses tone and context shifts.
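The manual option is easy to tally once you log your audits consistently. A hypothetical sketch, assuming you record one row per (engine, query, cited domain) each month; the log rows and domains below are made up for illustration.

```python
from collections import Counter

def citation_share(log, brand):
    """Share of logged citations per engine that credit `brand`.
    log rows are (engine, query, cited_domain) tuples."""
    per_engine = {}
    for engine, _query, domain in log:
        per_engine.setdefault(engine, Counter())[domain] += 1
    return {
        engine: round(counts[brand] / sum(counts.values()), 2)
        for engine, counts in per_engine.items()
    }

log = [
    ("perplexity", "aeo recovery", "example.com"),
    ("perplexity", "aeo recovery", "competitor.io"),
    ("chatgpt", "aeo recovery", "example.com"),
    ("chatgpt", "schema audit", "example.com"),
]
print(citation_share(log, "example.com"))  # share per engine, 0.0-1.0
```

Tracking the same 30–50 queries month over month turns this into the brand-vs-category trend line the paid tools sell.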