Commit graph

2 commits

595a93ee2c  Ben Bodenmiller  2015-08-17 23:11:16 -07:00

    disallow irrelevant pages by default in robots

    Update default robots.txt rules to disallow irrelevant pages that search
    engines should not care about. This will still allow important pages
    like the files, commit details, merge requests, issues, comments, etc.
    to be crawled.
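A minimal sketch of the kind of robots.txt rules this commit describes — the specific paths below are illustrative assumptions, not the actual defaults shipped in the change:

```text
# Sketch only: hypothetical paths, not the real GitLab defaults.
User-agent: *
# Disallow pages search engines should not care about.
Disallow: /search
Disallow: /admin
# Important pages (files, commit details, merge requests, issues,
# comments) remain crawlable because no Disallow rule matches them.
```

robots.txt is deny-listing only: anything not matched by a Disallow rule stays crawlable, which is how the important pages remain reachable.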
9ba1224867  gitlabhq  2011-10-09 00:36:38 +03:00

    init commit