Robots.txt Checker

Check and analyze your robots.txt configuration: HTTP status, crawl rules, and any Sitemap declarations the file contains.

You can enter a domain (example.com) or a full URL (https://example.com).

Quick guide

  1. Enter a domain (example.com) or full URL
  2. We fetch /robots.txt and validate HTTP status + content-type
  3. We parse User-agent rules (Allow/Disallow) and highlight risky blocks
  4. We detect Sitemap lines and common mistakes
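Steps 1 and 2 above can be sketched with the Python standard library. This is an illustrative sketch, not the tool's actual implementation; the helper names (robots_url, fetch_robots) are hypothetical:

```python
from urllib.parse import urlparse
from urllib.request import urlopen

def robots_url(site: str) -> str:
    """Normalize a bare domain or full URL to its /robots.txt URL.

    Assumes https:// when no scheme is given (hypothetical helper).
    """
    if not site.startswith(("http://", "https://")):
        site = "https://" + site
    parsed = urlparse(site)
    return f"{parsed.scheme}://{parsed.netloc}/robots.txt"

def fetch_robots(site: str):
    """Fetch robots.txt and return (HTTP status, content-type, body)."""
    with urlopen(robots_url(site), timeout=10) as resp:
        ctype = resp.headers.get_content_type()  # e.g. "text/plain"
        body = resp.read().decode("utf-8", errors="replace")
        return resp.status, ctype, body
```

A checker would typically flag a non-200 status or a content-type other than text/plain before attempting to parse the body.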

robots.txt controls crawler access; it does not hide private content.

What this Robots.txt Checker analyzes

This tool analyzes your website's robots.txt file to show how search engine crawlers are allowed to access, or restricted from accessing, your pages.

  • Detects robots.txt availability and HTTP status
  • Parses Allow / Disallow rules for all bots
  • Simulates crawler access for common paths
  • Detects sitemap declarations inside robots.txt
  • Highlights common SEO mistakes and risks
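The rule parsing, access simulation, and sitemap detection in the list above can be approximated with Python's standard-library urllib.robotparser. A minimal sketch, not the checker's actual engine; the sample robots.txt content is invented for illustration:

```python
from urllib.robotparser import RobotFileParser

# Invented sample robots.txt for illustration. Note: Python's parser
# applies rules in file order (first match wins), so the more specific
# Allow line is listed before the broader Disallow line.
SAMPLE = """\
User-agent: *
Allow: /admin/help
Disallow: /admin/
Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(SAMPLE.splitlines())

# Simulate crawler access for common paths
for path in ("/", "/admin/", "/admin/help"):
    allowed = parser.can_fetch("Googlebot", f"https://example.com{path}")
    print(f"{path}: {'allowed' if allowed else 'blocked'}")

# Sitemap declarations found in the file (Python 3.8+)
print(parser.site_maps())
```

Be aware that Google uses longest-match precedence between Allow and Disallow rules, while urllib.robotparser matches in file order, so results can differ for overlapping rules.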