Commit Graph

7 Commits

Author SHA1 Message Date
Achilleas Pipinellis 71f707bfe8
Add the /help page in robots.txt
The /help page has docs which we don't want to be crawled
as we prefer the docs website instead.

Related https://gitlab.com/gitlab-org/gitlab-ce/issues/44433
2018-03-22 10:55:02 +01:00
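The change described above presumably amounts to a single rule; a minimal sketch of the resulting robots.txt fragment (the exact path is an assumption based on the commit message):

```
User-agent: *
# Keep crawlers on the docs website instead of the bundled /help pages
Disallow: /help
```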
eric sabelhaus 3ac6054bf9
correct User-agent placement in robots.txt
2017-01-18 07:21:16 -05:00
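For context on the fix above: by robots.txt convention, a Disallow rule only takes effect inside a group introduced by a User-agent line; rules outside any group are ignored by most crawlers. A generic sketch (paths are illustrative, not the actual GitLab rules):

```
# Ignored by most parsers: no preceding User-agent line
Disallow: /private

# Effective: the group header comes first
User-agent: *
Disallow: /private
```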
Matt Harrison f1df7b1bc2
update robots.txt disallow
Allows projects in groups starting with "s" while still disallowing the
snippets short urls.
2016-09-23 10:57:44 -04:00
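The fix above reads like a classic robots.txt prefix-matching pitfall: Disallow matches by path prefix, so an unanchored rule would also block every group whose path starts with "s". A sketch of the distinction (the actual GitLab rules may differ):

```
# Too broad: also blocks /staging-group, /sales, ...
Disallow: /s

# Scoped to the snippet short URLs only
Disallow: /s/
```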
Connor Shea d71edf0ded
Disallow search engines from indexing uploads from a GitLab project.
This can sometimes include sensitive information from private projects and confidential issues. It shouldn't be indexed. Resolves #15551.
2016-05-16 15:04:14 -05:00
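Assuming GitLab's URL scheme of /&lt;namespace&gt;/&lt;project&gt;/uploads/..., the rule described above could be expressed with wildcards (supported by major crawlers such as Googlebot, though not part of the original robots.txt convention); the path pattern is an assumption:

```
User-agent: *
# Block project upload attachments, which may contain private or confidential content
Disallow: /*/*/uploads/
```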
Ben Bodenmiller abf18abb52
allow crawling of commit page but not patch/diffs
The commit page has valuable information that search engines should be allowed to crawl; the .patch and .diff pages, however, contain no information that is not already on the commit page.
2015-10-04 23:39:24 -07:00
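Assuming commit pages live at /&lt;namespace&gt;/&lt;project&gt;/commit/&lt;sha&gt;, with the .patch and .diff variants reached by appending an extension, the split described above might look like this (wildcard and $ end-anchoring as implemented by major crawlers; paths are illustrative):

```
User-agent: *
# Commit pages stay crawlable; only their raw patch/diff forms are blocked
Disallow: /*/*/commit/*.patch$
Disallow: /*/*/commit/*.diff$
```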
Ben Bodenmiller 595a93ee2c
disallow irrelevant pages by default in robots
Update default robots.txt rules to disallow irrelevant pages that search
engines should not care about. This will still allow important pages
like the files, commit details, merge requests, issues, comments, etc.
to be crawled.
2015-08-17 23:11:16 -07:00
gitlabhq 9ba1224867
init commit
2011-10-09 00:36:38 +03:00