robots.txt Allow/Disallow Conflict

Allow and Disallow directives target the same path. Different crawlers resolve the conflict differently, so the effective rule is unpredictable.

Crawlers resolve conflicts in different ways: Googlebot applies the most specific (longest) matching rule, with Allow winning ties, while some other bots simply use the first matching rule in the file. Overlapping Allow/Disallow directives for the same path therefore produce ambiguous crawl behavior.
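The divergence is easy to reproduce. As a sketch, Python's standard-library `urllib.robotparser` implements first-match semantics with plain prefix matching, so a hypothetical file that disallows `/shop/` and then allows `/shop/sale/` blocks the sale pages under first-match, while Google's documented longest-match rule would allow them:

```python
from urllib import robotparser

# Hypothetical robots.txt with an overlapping Allow/Disallow pair.
rules = """\
User-agent: *
Disallow: /shop/
Allow: /shop/sale/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The stdlib parser applies the FIRST matching rule, so the broader
# Disallow wins here. Googlebot documents longest-match semantics,
# under which Allow: /shop/sale/ would win instead.
print(rp.can_fetch("*", "https://example.com/shop/sale/item"))  # False (first-match)
print(rp.can_fetch("*", "https://example.com/blog/"))           # True (no rule matches)
```

The same file thus means two different things to two standards-conformant parsers, which is exactly why the conflict is flagged.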

Common causes

  • Copy-paste blocks leaving redundant or contradictory lines.
  • Pattern rules using `*` (wildcard) or `$` (end anchor) that overlap a plain prefix rule for the same URLs.
  • Order sensitivity: some bots use the first matching rule, others the most specific.
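For example, in this hypothetical file both rules match a URL like `/downloads/report.pdf` — the Allow by prefix, the Disallow by its wildcard-and-anchor pattern — and which one wins depends on how the crawler ranks a pattern rule against a prefix rule:

```text
User-agent: *
Allow: /downloads/
Disallow: /*.pdf$
```
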

How to fix

  • Use one clear rule per path prefix; remove duplicates.
  • Order rules from most specific to most general when the crawler you target documents first-match behavior.
  • Test the file with a robots.txt tester and verify live URLs.
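To sanity-check a cleaned-up file, the longest-match resolution that Google documents can be sketched in a few lines. This helper is an illustration, not a full parser: rules are plain `("allow"|"disallow", prefix)` pairs and wildcard support is omitted to keep the sketch short.

```python
def effective_rule(rules, path):
    """Pick the winning rule for `path` using longest-match semantics:
    the most specific (longest) matching prefix wins, and Allow beats
    Disallow when two matching prefixes have equal length.
    """
    best = ("allow", "")  # no matching rule means crawling is allowed
    for verdict, prefix in rules:
        if path.startswith(prefix) and (
            len(prefix) > len(best[1])
            or (len(prefix) == len(best[1]) and verdict == "allow")
        ):
            best = (verdict, prefix)
    return best[0]

rules = [("disallow", "/shop/"), ("allow", "/shop/sale/")]
print(effective_rule(rules, "/shop/sale/item"))  # "allow": longer prefix wins
print(effective_rule(rules, "/shop/cart"))       # "disallow"
```

Running each URL you care about through a check like this, alongside a live tester, makes it obvious when a leftover broad rule is still shadowing (or being shadowed by) a more specific one.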

