Free AI Visibility Checker

Check if your website is allowing AI search crawlers to access your content.

Essential Files for AI Visibility

Set up these three files so AI search engines can crawl and index your content.

| File | robots.txt | sitemap.xml | llms.txt |
| --- | --- | --- | --- |
| Primary Goal | 🛑 Access rules | 📍 Discovery | 🧠 Understanding |
| What it says | "Allowed vs. blocked" | "List of all URLs" | "Clean facts & summary" |
| Target Audience | All web crawlers | Google, Bing | Claude, Perplexity |
| File Format | Plain-text rules | XML list | Markdown (text) |

1. robots.txt

Place at your domain root (yoursite.com/robots.txt) to tell AI crawlers which pages they can access.

Sample robots.txt
# Default rules for all crawlers
User-agent: *
Allow: /
Disallow: /admin/
Disallow: /private/
Disallow: /api/

# Explicitly allow specific AI crawlers.
# Note: a named group replaces the * group for that bot,
# so repeat any Disallow rules here if you still need them.
User-agent: GPTBot
Allow: /

User-agent: ChatGPT-User
Allow: /

User-agent: Google-Extended
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: anthropic-ai
Allow: /

User-agent: Claude-Web
Allow: /

# Sitemap location
Sitemap: https://yoursite.com/sitemap.xml

💡 Tip: Replace "yoursite.com" with your domain. Disallow rules must sit inside a User-agent group, and robots.txt is advisory, not a security control — don't rely on it to protect private content.
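
You can test who your robots.txt actually lets in with Python's built-in parser. This is a sketch using inline rules that mirror the sample above (swap in your own file); note that Python's parser applies the first matching rule, so Disallow lines are listed before the catch-all Allow:

```python
# Check which crawlers a robots.txt allows, using the stdlib parser.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
Allow: /

User-agent: GPTBot
Allow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# GPTBot has its own group, so only that group applies to it.
print(parser.can_fetch("GPTBot", "https://yoursite.com/blog"))     # True
print(parser.can_fetch("SomeBot", "https://yoursite.com/admin/x")) # False
```

Run this before deploying a new robots.txt to catch rules that don't do what you expect.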

2. sitemap.xml

Place at your domain root (yoursite.com/sitemap.xml) to help AI crawlers discover all your pages.

Sample sitemap.xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Homepage -->
  <url>
    <loc>https://yoursite.com/</loc>
    <lastmod>2025-12-19</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  
  <!-- About Page -->
  <url>
    <loc>https://yoursite.com/about</loc>
    <lastmod>2025-12-19</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
  
  <!-- Blog Page -->
  <url>
    <loc>https://yoursite.com/blog</loc>
    <lastmod>2025-12-19</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.9</priority>
  </url>
  
  <!-- Blog Posts -->
  <url>
    <loc>https://yoursite.com/blog/your-first-post</loc>
    <lastmod>2025-12-19</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.7</priority>
  </url>
  
  <url>
    <loc>https://yoursite.com/blog/another-post</loc>
    <lastmod>2025-12-18</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.7</priority>
  </url>
  
  <!-- Add more URLs as needed -->
</urlset>

💡 Tip: Update the dates and add your own pages. <priority> ranges from 0.0 (least important) to 1.0 (most important), but note that <priority> and <changefreq> are hints many crawlers ignore — an accurate <lastmod> is the most useful signal.
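
Before submitting your sitemap, it's worth confirming the XML is well-formed and lists the URLs you expect. A minimal sketch with Python's standard library, using a shortened version of the sample above:

```python
# Parse a sitemap and list every <loc> URL; a parse error here means
# the XML is malformed and crawlers will likely reject it.
import xml.etree.ElementTree as ET

SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yoursite.com/</loc>
    <lastmod>2025-12-19</lastmod>
  </url>
  <url>
    <loc>https://yoursite.com/about</loc>
  </url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(SITEMAP)  # raises ParseError on broken XML
urls = [u.findtext("sm:loc", namespaces=NS) for u in root.findall("sm:url", NS)]
print(urls)  # ['https://yoursite.com/', 'https://yoursite.com/about']
```

To check your live sitemap, fetch it first and pass the response body to `ET.fromstring`.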

3. llms.txt

Place at your domain root (yoursite.com/llms.txt) to help AI assistants understand your site's content and structure.

Sample llms.txt
# Your Site Name

> A brief description of what your site does or offers

Your Site is a [describe your product/service]. We help [target audience] to [main value proposition].

## Main Features
- Feature 1: Description of your first key feature
- Feature 2: Description of your second key feature
- Feature 3: Description of your third key feature

## Key Pages
- [Homepage](https://yoursite.com/)
- [About](https://yoursite.com/about)
- [Blog](https://yoursite.com/blog)
- [Contact](https://yoursite.com/contact)

## Target Audience
We serve [describe your ideal customer/user] who need [describe their needs].

## Recent Updates
- [Date]: Brief description of recent update or content
- [Date]: Brief description of another update

## Contact
- Email: contact@yoursite.com
- Twitter: @yoursite
- Support: https://yoursite.com/support

💡 Tip: Keep it concise and factual. AI assistants use this to understand your site when answering questions about it!
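
A quick way to catch formatting slips is a small structural check. This is a sketch, not an official validator: it assumes the llmstxt.org convention of a single H1 title at the top, with `##` for all other sections (the sample text and `check_llms_txt` helper below are illustrative names):

```python
# Minimal structural lint for an llms.txt file.
LLMS_TXT = """\
# Your Site Name

> A brief description of what your site does or offers

## Key Pages
- [About](https://yoursite.com/about)
"""

def check_llms_txt(text: str) -> list[str]:
    """Return a list of problems; empty list means the basics look right."""
    problems = []
    lines = [line for line in text.splitlines() if line.strip()]
    if not lines or not lines[0].startswith("# "):
        problems.append("first line should be an H1 title ('# Site Name')")
    if sum(1 for line in lines if line.startswith("# ")) > 1:
        problems.append("use '##' for sections; only one H1 allowed")
    return problems

print(check_llms_txt(LLMS_TXT))  # [] -> looks good
```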

✅ Next Steps

  1. Upload all three files to your root directory
  2. Verify they're accessible (yoursite.com/robots.txt, /sitemap.xml, /llms.txt)
  3. Submit your sitemap to Google Search Console
  4. Check back in 48-72 hours