Description from extension meta
Instantly audit robots.txt, meta-robots, X-Robots-Tag and canonical headers to verify crawlability & indexability for any page.
Description from store
“Why isn’t this page in Google?” — get the answer in two seconds.
Robots.txt Checker runs a one-click, on-page scan: it reads the URL's robots.txt rules, meta-robots tag, X-Robots-Tag headers, HTTP status, and canonical link, cross-checks them for conflicts, and delivers a blunt Allowed or Blocked verdict with the exact line responsible. Everything happens locally in your browser, and you can copy any rule or header with a tap for instant debugging.
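The extension's cross-check can be approximated in a few lines. The sketch below is a simplified, hypothetical model of the verdict logic (not the extension's actual code): it parses a robots.txt body with Python's standard-library `urllib.robotparser`, then layers `X-Robots-Tag` and meta-robots directives on top, returning the verdict together with the source responsible.

```python
from urllib.robotparser import RobotFileParser

def crawl_verdict(robots_txt: str, path: str,
                  x_robots: str = "", meta_robots: str = "",
                  ua: str = "Googlebot"):
    """Simplified sketch of an Allowed/Blocked verdict.

    Checks robots.txt first (crawlability), then the X-Robots-Tag
    header and meta-robots tag (indexability), and reports which
    source blocked the page, if any.
    """
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    if not rp.can_fetch(ua, path):
        return ("Blocked", "robots.txt disallow rule")

    # A "noindex" or "none" directive in either source blocks indexing.
    for source, value in (("X-Robots-Tag header", x_robots),
                          ("meta-robots tag", meta_robots)):
        directives = {d.strip().lower() for d in value.split(",") if d.strip()}
        if directives & {"noindex", "none"}:
            return ("Blocked", source)

    return ("Allowed", None)
```

For example, with `robots_txt = "User-agent: *\nDisallow: /private/\n"`, the path `/private/page` is blocked by the robots.txt rule, while `/blog/post` is allowed unless a header such as `x_robots="noindex"` overrides it.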
Built for
- SEO specialists & agencies who live in Search Console and need a fast crawlability check on every client page.
- Developers shipping code to production who want to spot index-blocking headers before they merge.
- Content editors & marketers doing a quick “Will Google see this?” test before publishing.
- QA teams running smoke tests on staging sites to be sure nothing is accidentally hidden from search bots.