Robots.txt Tester

Paste the contents of your robots.txt file and test any URL path to see whether Googlebot or other crawlers are allowed or blocked.

How to use this tool

  • Paste your full robots.txt content into the text box.
  • Select the bot you want to test (Googlebot is most common).
  • Enter the URL path you want to check (e.g. /admin/page).
  • Click Test URL to see if the path is allowed or blocked.
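The same kind of check can be reproduced offline with Python's standard `urllib.robotparser` module. One caveat: it evaluates rules in file order per the original robots.txt draft, so its verdict can differ from Google's longest-match behavior when Allow and Disallow rules overlap. The robots.txt content below is a made-up example.

```python
from urllib.robotparser import RobotFileParser

# A made-up robots.txt file for illustration.
robots_txt = """\
User-agent: Googlebot
Disallow: /admin/
Allow: /admin/public/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Test URL paths against the Googlebot rules.
print(parser.can_fetch("Googlebot", "/admin/page"))  # blocked -> False
print(parser.can_fetch("Googlebot", "/blog/post"))   # no matching rule -> True
```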

How robots.txt rules work

Robots.txt uses a simple priority system: the most specific matching rule wins. If both an Allow and a Disallow rule match a path, the rule with the longer (more specific) path takes precedence. If the matching rules are equally long, Google applies the least restrictive rule, so Allow wins the tie.
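The precedence rule can be sketched in a few lines of Python. This is an illustrative simplification, not the tool's actual matcher: rules are plain (directive, path) pairs, `*` and `$` wildcards are not handled, and ties go to Allow as in Google's spec.

```python
def is_allowed(rules, path):
    """Decide whether `path` is crawlable under the longest-match rule.

    `rules` is a list of (directive, path_prefix) tuples, e.g.
    [("Disallow", "/admin/"), ("Allow", "/admin/public/")].
    """
    best = None  # (match_length, allowed)
    for directive, pattern in rules:
        if pattern and path.startswith(pattern):
            candidate = (len(pattern), directive.lower() == "allow")
            # Longer match wins; on a tie, Allow (least restrictive) wins.
            if (best is None
                    or candidate[0] > best[0]
                    or (candidate[0] == best[0] and candidate[1])):
                best = candidate
    return True if best is None else best[1]


rules = [("Disallow", "/admin/"), ("Allow", "/admin/public/")]
print(is_allowed(rules, "/admin/page"))              # -> False (Disallow matches)
print(is_allowed(rules, "/admin/public/index.html")) # -> True (longer Allow wins)
print(is_allowed(rules, "/blog"))                    # -> True (no rule matches)
```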

Important: robots.txt controls crawling, not indexing. A page blocked in robots.txt can still be indexed if other pages link to it. To keep a page out of the index, use a noindex meta tag (or an X-Robots-Tag HTTP header) and leave the page crawlable, since a crawler must be able to fetch the page to see the tag.
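For example, a page can opt out of indexing with a robots meta tag in its HTML head:

```html
<meta name="robots" content="noindex">
```

Remember that this only works if the page is not blocked in robots.txt, because the crawler has to fetch the page to read the tag.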
