  1. Aug 18, 2015
      disallow irrelevant pages by default in robots · 595a93ee
      Ben Bodenmiller authored
      Update the default robots.txt rules to disallow irrelevant pages that
      search engines do not need to index. Important pages such as file
      views, commit details, merge requests, issues, and comments will
      still be crawled. (An illustrative sketch of such rules follows the
      commit list below.)
  2. Oct 13, 2011
  3. Oct 08, 2011
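
The change described above amounts to adding Disallow rules to the default robots.txt while leaving content pages crawlable. Below is a minimal sketch of what such a file can look like; the paths shown are assumed examples for illustration, not necessarily the exact rules introduced in commit 595a93ee:

    User-agent: *
    # Assumed, illustrative paths: keep crawlers out of noisy, low-value
    # routes such as sign-in, search, the API, and per-user dashboards.
    Disallow: /users/sign_in
    Disallow: /search
    Disallow: /api
    Disallow: /admin
    Disallow: /dashboard
    # File views, commit details, merge requests, issues, and comments are
    # not listed, so crawlers may still index them by default.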