disallow irrelevant pages by default in robots.txt
Update the default robots.txt rules to disallow irrelevant pages that search engines do not need to index. Important pages such as files, commit details, merge requests, issues, comments, etc. will still be crawlable.
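As an illustrative sketch only (the actual rule set lives in the diff, and these paths are hypothetical examples, not the ones this change adds), a robots.txt that blocks low-value endpoints while leaving content pages crawlable might look like:

```
# Sketch: disallow pages that are irrelevant to search engines.
# The paths below are hypothetical examples for illustration.
User-Agent: *
Disallow: /autocomplete        # internal autocomplete endpoints
Disallow: /search              # search result pages
Disallow: /*/raw               # raw file downloads

# Everything not matched above (files, commits, merge requests,
# issues, comments, ...) remains crawlable by default.
```

Crawlers match `Disallow` rules as path prefixes, so anything not matched by a rule stays allowed by default; that is what keeps the content pages crawlable without listing them explicitly.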