AI Overviews now appear in roughly 48% of tracked Google queries, according to BrightEdge research. ChatGPT, Perplexity, and Claude are answering questions that used to drive clicks to your site.
The real question is: can these AI engines actually find and use your content? Most site owners assume the answer is yes. They’re often wrong.
Enter your URL below. This free tool checks 15 technical signals AI search engines look for, and tells you exactly what to fix. You can also use the standalone AI Visibility Audit Tool for a distraction-free experience.
Checks 15 AI readiness signals including robots.txt, llms.txt, schema, SSL, security headers, content quality, E-E-A-T, FAQ structure, internal linking, page speed, and more. Limited to 5 audits per hour.
Get a detailed report with step-by-step fix instructions sent to your inbox.
The audit runs from our server, so it can check any public website. It does not install anything on your site. Your URL is not stored.
Why AI Visibility Matters
Traditional SEO focused on one search engine: Google. That world is expanding fast. Your content now needs to be discoverable by ChatGPT (via GPTBot and OAI-SearchBot), Perplexity (via PerplexityBot), Google AI Overviews, Claude, and Gemini.
Each of these systems crawls the web differently, but they all rely on the same foundation: accessible content, structured data, and clear signals about what your site offers. Miss any of these, and you’re invisible to a growing share of how people search.
Between 17% and 38% of sources cited in AI Overviews also rank in the organic top 10, depending on the study. The majority of AI citations pull from content that doesn’t rank on page 1. – BrightEdge / Ahrefs, 2026
This means AI engines are surfacing content that traditional SEO might overlook. Your visibility in AI search depends on technical readiness, not just rankings.
What the Tool Checks
The audit evaluates 15 signals that AI crawlers look for. Here’s what each check measures and why it matters.
1. robots.txt AI Bot Access
Your robots.txt controls which crawlers can access your site. Many WordPress security plugins add blanket Disallow rules that accidentally block AI bots. The audit checks for blocks against GPTBot, OAI-SearchBot, PerplexityBot, Google-Extended, ClaudeBot, CCBot, Amazonbot, and others.
A clean robots.txt explicitly allows the bots you want while blocking only the ones you don’t.
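The same kind of check can be sketched with Python’s standard-library robotparser. The bot list and sample rules below are illustrative, not the audit’s exact implementation:

```python
from urllib import robotparser

# Illustrative subset of the AI user agents the audit looks for
AI_BOTS = ["GPTBot", "OAI-SearchBot", "PerplexityBot",
           "Google-Extended", "ClaudeBot", "CCBot"]

def ai_bot_access(robots_txt: str, url: str = "https://example.com/") -> dict:
    """Map each AI crawler to True/False: may it fetch the given URL?"""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return {bot: rp.can_fetch(bot, url) for bot in AI_BOTS}

sample = """\
User-agent: GPTBot
Allow: /

User-agent: CCBot
Disallow: /
"""
access = ai_bot_access(sample)
print(access["GPTBot"], access["CCBot"])  # True False
```

Note that bots with no matching rule default to allowed, which is why a blanket `User-agent: *` Disallow from a security plugin can silently flip every AI bot to blocked.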
2. llms.txt File
The llms.txt file is a Markdown document at your site root that tells AI systems what your site does, what content to prioritize, and how to cite you. Think of it as a curated index for AI. Without it, AI engines guess which pages matter.
The audit checks whether the file exists, has meaningful content, and includes links and structure.
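There is no single enforced format, but a minimal llms.txt following the common convention (an H1 title, a blockquote summary, then sections of annotated links) might look like this, with placeholder names and URLs:

```markdown
# Example Site

> A WordPress blog about technical SEO and AI search readiness.

## Key pages

- [AI Visibility Guide](https://example.com/ai-visibility/): how AI crawlers discover content
- [Schema Markup Guide](https://example.com/schema-guide/): JSON-LD for generative engines

## Optional

- [Archive](https://example.com/archive/): older posts
```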
3. llms-full.txt File
While llms.txt provides an index, llms-full.txt serves your full content as Markdown in a single request. This lets AI systems read your entire site without crawling every page individually. See the llms-full.txt setup guide for WordPress implementation details. Sites that provide one make it significantly easier for AI engines to ingest and cite their pages.
4. Structured Data (Schema Markup)
JSON-LD structured data helps AI engines understand what your content represents. The audit checks your homepage for key schema types: Organization, WebSite, SearchAction, Article, FAQPage, and BreadcrumbList. Sites with rich structured data are more likely to be cited in AI responses. For a breakdown of which schema types matter most for generative engines, see the schema markup for AI search guide.
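As a rough illustration (names and URLs are placeholders, and the search target uses the WordPress `?s=` parameter), the Organization, WebSite, and SearchAction types can be combined in one JSON-LD graph inside a `<script type="application/ld+json">` tag:

```json
{
  "@context": "https://schema.org",
  "@graph": [
    {
      "@type": "Organization",
      "@id": "https://example.com/#org",
      "name": "Example Site",
      "url": "https://example.com/",
      "sameAs": ["https://x.com/example", "https://www.linkedin.com/company/example"]
    },
    {
      "@type": "WebSite",
      "url": "https://example.com/",
      "publisher": { "@id": "https://example.com/#org" },
      "potentialAction": {
        "@type": "SearchAction",
        "target": "https://example.com/?s={search_term_string}",
        "query-input": "required name=search_term_string"
      }
    }
  ]
}
```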
5. XML Sitemap
AI crawlers use your sitemap to discover content efficiently. The audit verifies your sitemap.xml exists, contains valid XML, and is referenced in your robots.txt via a Sitemap: directive. If you haven’t set one up yet, see how to create and submit a sitemap in WordPress.
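For reference, a minimal valid sitemap.xml looks like this (URLs and dates are placeholders; SEO plugins generate richer, paginated versions automatically):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2025-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/ai-visibility/</loc>
    <lastmod>2025-01-10</lastmod>
  </url>
</urlset>
```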
6. Meta Tags for AI
Meta description, Open Graph tags, and canonical URLs help AI systems understand and attribute your content correctly. The audit also checks for noai or noimageai directives in your meta robots tag, which explicitly tell AI engines to ignore your content.
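A homepage `<head>` that passes this kind of check might look like the following sketch (all values are placeholders):

```html
<head>
  <title>Example Site: Technical SEO for AI Search</title>
  <meta name="description" content="Guides on making WordPress sites visible to AI search engines.">
  <link rel="canonical" href="https://example.com/">
  <meta property="og:title" content="Example Site">
  <meta property="og:description" content="Guides on making WordPress sites visible to AI search engines.">
  <meta property="og:image" content="https://example.com/og-image.png">
  <!-- Avoid this unless you explicitly want AI engines to skip the page: -->
  <!-- <meta name="robots" content="noai, noimageai"> -->
</head>
```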
7. Content Accessibility (Server-Side Rendering)
Most AI crawlers don’t execute JavaScript. If your site relies on client-side rendering (React SPAs, heavy JS frameworks), the content may be invisible to AI bots. The audit checks whether your homepage HTML contains meaningful text content, heading tags, and semantic landmarks like <main> and <article>.
WordPress sites generally pass this check since PHP renders HTML server-side. But headless WordPress setups or sites with JS-heavy themes may fail.
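A rough way to approximate this check is to parse the raw HTML without executing any JavaScript and count visible words, headings, and landmarks. The thresholds below (100 words, one heading) are illustrative, not the audit’s actual cutoffs:

```python
from html.parser import HTMLParser

class ContentProbe(HTMLParser):
    """Collect visible word count, headings, and landmarks from raw HTML."""
    SKIP = {"script", "style", "noscript"}

    def __init__(self):
        super().__init__()
        self.words = 0
        self.headings = 0
        self.landmarks = 0
        self._skip_depth = 0  # inside <script>/<style>, text isn't visible

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip_depth += 1
        elif tag in {"h1", "h2", "h3", "h4"}:
            self.headings += 1
        elif tag in {"main", "article"}:
            self.landmarks += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth:
            self.words += len(data.split())

def looks_server_rendered(html: str, min_words: int = 100) -> bool:
    probe = ContentProbe()
    probe.feed(html)
    return probe.words >= min_words and probe.headings >= 1

# A client-rendered SPA shell has almost no text in its raw HTML
spa = '<div id="root"></div><script src="bundle.js"></script>'
print(looks_server_rendered(spa))  # False
```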
8. HTTPS & SSL
HTTPS is a baseline trust signal for both traditional and AI search engines. Sites without SSL certificates may be deprioritized or skipped entirely by AI crawlers. If you haven’t migrated yet, follow the complete HTTPS migration guide.
9. Security Headers
The audit checks for X-Robots-Tag headers that could block AI indexing at the server level. This is easy to miss since it doesn’t appear in your HTML source, only in the HTTP response headers. Learn more about configuring security headers correctly.
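You can view these headers yourself with `curl -I yourdomain.com`. The logic of the check can be sketched like this; the directive list is an assumption about what such an audit would flag:

```python
def x_robots_blocks_ai(headers: dict) -> bool:
    """True if an X-Robots-Tag response header carries a directive that
    would stop AI/search indexing. Header names match case-insensitively."""
    value = next((v for k, v in headers.items()
                  if k.lower() == "x-robots-tag"), "")
    directives = {part.strip().lower() for part in value.split(",")}
    return bool(directives & {"noindex", "none", "noai", "noimageai"})

print(x_robots_blocks_ai({"X-Robots-Tag": "noindex, nofollow"}))  # True
print(x_robots_blocks_ai({"Content-Type": "text/html"}))          # False
```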
10. Content Quality Signals
AI engines prioritize well-structured content when deciding what to cite. The audit analyzes your homepage for word count, heading density, lists, tables, FAQ sections, and content-to-HTML ratio. Pages with structured, extractable content get cited more frequently by AI systems.
A homepage with 500+ words, multiple subheadings, and at least one list or table scores well here. Thin pages with little structure score low.
11. Page Speed (TTFB)
AI crawlers have timeout budgets. If your server takes too long to respond, the crawler may skip your page or only partially process it. The audit measures your server’s Time to First Byte (TTFB) and your homepage response size.
A TTFB under 1 second and a response under 500KB are ideal. Pages over 2MB or with a TTFB over 3 seconds risk incomplete crawls. For a deeper look at performance metrics, see the Core Web Vitals guide. Note that TTFB is measured from our server, so your actual response time may vary by location. The check uses generous thresholds to catch genuine performance issues rather than geographic latency.
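A rough sketch of measuring and classifying TTFB in Python, using the thresholds stated above (under 1 second good, over 3 seconds risky). A single sample from one location is only a rough signal, not a verdict on your server:

```python
import time
from urllib.request import urlopen

def measure_ttfb(url: str, timeout: float = 10.0) -> float:
    """One TTFB sample: seconds from request start to the first response byte."""
    start = time.perf_counter()
    with urlopen(url, timeout=timeout) as resp:
        resp.read(1)  # wait for the first byte of the body
        return time.perf_counter() - start

def ttfb_verdict(seconds: float) -> str:
    """Classify a TTFB sample against the thresholds described above."""
    if seconds < 1.0:
        return "good"
    if seconds <= 3.0:
        return "borderline"
    return "risk of incomplete crawls"

print(ttfb_verdict(0.4))  # good
```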
12. E-E-A-T Signals
Experience, Expertise, Authoritativeness, and Trustworthiness (E-E-A-T) matter for AI citations too. The audit looks for Organization or Person schema with sameAs social profiles, links to your About and Contact pages, and deeper authorship signals.
For personal sites, it checks for author bio sections, headshots, and credentials. For company sites, it looks for team pages, editorial bylines, and organizational trust signals. Sites with visible authorship get significantly more AI citations.
13. Open Graph & Social Tags
AI engines use Open Graph tags to understand content context and language. The audit checks for og:type, og:image, og:site_name, og:locale, twitter:card, and twitter:title. Complete social tags give AI systems more context for accurate attribution.
Most SEO plugins generate these automatically, but it’s common for og:locale or Twitter Card tags to be missing.
14. Structured FAQ Detection
AI engines favor content with clear question-and-answer formatting. FAQ sections, question-phrased headings (H2/H3 containing “?”), and collapsible <details> elements all signal extractable Q&A content. The audit also checks for FAQPage schema markup, which helps AI systems identify and cite individual answers.
Pages that combine Q&A-structured content with proper schema get cited more frequently. Even a few well-phrased question headings make a difference.
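For illustration, a minimal FAQPage schema for a single question-and-answer pair looks like this (the question and answer text are placeholders):

```json
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "Do AI crawlers execute JavaScript?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "Generally no. Content that only appears after client-side rendering may be invisible to AI bots."
      }
    }
  ]
}
```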
15. Internal Linking Depth
AI crawlers follow internal links to discover and contextualize your content. A homepage that links to key content pages creates a clear topic map. The audit counts unique internal content links on your homepage, filtering out non-content URLs like anchors, login pages, and asset paths.
Sites with strong hub-and-spoke architecture – where the homepage connects to topic clusters – get crawled more thoroughly. If your homepage has fewer than 10 internal content links, AI bots may not discover your best pages.
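A simplified version of this count can be sketched in Python. The skip patterns below are illustrative WordPress-flavored assumptions, not the audit’s exact filter:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCollector(HTMLParser):
    """Gather every href from the anchor tags in a page."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.hrefs.append(href)

# Non-content URLs to ignore (illustrative, WordPress-flavored)
SKIP_PATTERNS = ("wp-login", "wp-admin", "wp-content",
                 ".css", ".js", ".png", ".jpg")

def internal_content_links(html: str, site: str = "example.com") -> set:
    """Unique internal link paths, minus anchors, assets, and login URLs."""
    collector = LinkCollector()
    collector.feed(html)
    links = set()
    for href in collector.hrefs:
        if href.startswith("#"):
            continue  # same-page anchor
        parsed = urlparse(href)
        if parsed.netloc and parsed.netloc != site:
            continue  # external link
        path = parsed.path
        if any(pattern in path for pattern in SKIP_PATTERNS):
            continue  # asset or admin URL
        if path and path != "/":
            links.add(path)
    return links

home = ('<a href="/ai-visibility/">Guide</a> '
        '<a href="#top">Top</a> <a href="/wp-login.php">Log in</a>')
print(internal_content_links(home))  # {'/ai-visibility/'}
```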
How to Fix Common Issues
Most audit failures fall into a few categories. Here’s how to address the most common ones:
Allow AI Bots in robots.txt
If your robots.txt blocks AI crawlers, add explicit Allow rules. Be selective about which bots you allow. Training bots (like CCBot) scrape content for model training, while search bots (like OAI-SearchBot, PerplexityBot) power AI search results that link back to your site.
# Allow AI search bots
User-agent: GPTBot
Allow: /
User-agent: OAI-SearchBot
Allow: /
User-agent: PerplexityBot
Allow: /
User-agent: Google-Extended
Allow: /
# Block training-only bots
User-agent: CCBot
Disallow: /

Create llms.txt and llms-full.txt
Both files go in your site root. The llms.txt is a curated summary, while llms-full.txt contains your complete content as Markdown. I’ve covered the exact setup in my llms.txt guide and the llms-full.txt guide.
Add Structured Data
If you use Yoast SEO or Rank Math, you already have basic schema markup. Check that your homepage includes at least Organization, WebSite, and SearchAction schemas. For blog posts, add Article and FAQPage schema. Both plugins handle this automatically when configured correctly.
Add a Sitemap Reference to robots.txt
At the bottom of your robots.txt, add:
Sitemap: https://yourdomain.com/sitemap.xml

Yoast and Rank Math generate sitemaps automatically. Make sure the URL in your robots.txt matches the actual sitemap location.
Fix Missing Meta Tags
Description and Open Graph tags are essential. If your SEO plugin is active and configured, these should be generated automatically. Check that each page has a unique meta description and that og:title, og:description, and og:image are present in your HTML source.
Verify Server-Side Rendering
View your page source (not Inspect Element) and check if your content appears in the raw HTML. If the body is mostly empty with a <div id="root"> and JavaScript bundles, AI crawlers can’t see your content. WordPress with traditional PHP themes doesn’t have this problem, but headless setups and JS-heavy page builders might.
FAQs
Common questions about AI visibility audits:
Is the audit safe to run on my site?
Yes. The tool only fetches public files (robots.txt, llms.txt, sitemap.xml, and your homepage). It works exactly like an AI crawler would. Nothing is installed, no credentials are needed, and your URL is not stored.

What does my audit score mean?
Higher scores mean your site is technically ready for AI search. Mid-range scores usually point to gaps like a missing llms.txt file or incomplete structured data. Below 50 means critical issues like blocked AI crawlers or missing schema.

Why does llms.txt matter?
The llms.txt file gives AI systems a structured overview of your site. Without it, AI engines have to guess which pages are most important. Sites with llms.txt give AI crawlers clear guidance on what to prioritize and cite.

Which AI bots should I allow?
Configure your robots.txt to allow search-oriented bots like GPTBot, OAI-SearchBot, and PerplexityBot (which power AI search and link back to you) while blocking training-only bots like CCBot (which scrape content for model training without attribution). See my guide on blocking AI crawlers with robots.txt for the full list.

How often should I run the audit?
Re-run it after major site changes, plugin updates, or robots.txt edits. AI search is evolving quickly, and a configuration change can accidentally block AI crawlers. A quarterly check is a reasonable baseline.

Summary
AI search is no longer a future trend. Nearly half of Google queries already trigger AI Overviews, and standalone AI tools like ChatGPT and Perplexity are handling millions of searches daily. If your site’s technical foundation blocks or confuses these systems, you’re missing traffic you could be getting.
Run the audit above on your site. Fix what fails. The 15 checks cover the same signals that AI crawlers evaluate when deciding whether to index and cite your content – from basic access rules to content quality, FAQ structure, internal linking, and authorship signals. Results are sorted by impact, with effort estimates for each issue. Most fixes are quick, and the payoff is showing up in a channel that’s growing fast. For a structured approach to all these optimizations, use the WordPress AEO Checklist.

