Technical SEO

robots.txt policy

The robots.txt policy check is one of the public readiness signals included in isitready.dev reports.

Why it matters

A site's crawler policy should be easy to fetch at the standard /robots.txt path, syntactically valid, and aligned with how you intend AI crawlers to access the site.
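As a sketch, a minimal policy that is easy to fetch, valid, and explicit about an AI crawler might look like the following. The crawler name, paths, and sitemap URL are illustrative assumptions, not values required by isitready.dev:

```
# Example robots.txt served at https://example.com/robots.txt
User-agent: GPTBot
Disallow: /private/

User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
```

Here GPTBot (an AI crawler user agent) is kept out of /private/, all other crawlers are allowed everywhere, and a Sitemap directive points crawlers at the sitemap.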

How to improve it

  • Serve the file at the root of the canonical public origin (e.g. https://example.com/robots.txt) so it is easy to fetch.
  • Keep the syntax valid, and use directives such as Sitemap: to point crawlers at sitemap.xml and other discovery surfaces where appropriate.
  • Re-run the scan and confirm the evidence row now reports a passing or informational status.
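Before re-running the scan, you can sanity-check a policy locally with Python's standard-library parser. This is a hedged sketch, not the scanner's actual logic; the policy text and crawler names are illustrative:

```python
# Parse a robots.txt policy and verify it behaves as intended,
# using only the standard library.
from urllib.robotparser import RobotFileParser

# Illustrative policy; in practice, fetch this from your origin.
POLICY = """\
User-agent: GPTBot
Disallow: /private/

User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(POLICY.splitlines())

# GPTBot should be blocked from /private/ but allowed elsewhere;
# every other crawler falls through to the wildcard group.
assert not parser.can_fetch("GPTBot", "https://example.com/private/page")
assert parser.can_fetch("GPTBot", "https://example.com/docs")
assert parser.can_fetch("SomeOtherBot", "https://example.com/private/page")

print("policy parsed and rules behave as expected")
```

If an assertion fails, the policy does not express the intent you thought it did; fix the directives before re-scanning.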