GitLab Bot
ce493944f4
Add latest changes from gitlab-org/gitlab@master
2020-06-19 18:08:39 +00:00
GitLab Bot
03c3f9f501
Add latest changes from gitlab-org/gitlab@master
2020-05-25 09:08:30 +00:00
GitLab Bot
6755df108b
Add latest changes from gitlab-org/gitlab@master
2020-01-21 00:08:46 +00:00
GitLab Bot
aa0f0e9921
Add latest changes from gitlab-org/gitlab@master
2020-01-16 18:08:46 +00:00
GitLab Bot
c0d8f9f3f9
Add latest changes from gitlab-org/gitlab@master
2020-01-03 09:07:33 +00:00
GitLab Bot
5707f305f4
Add latest changes from gitlab-org/gitlab@master
2019-09-26 12:06:00 +00:00
otheus
e35693475c
Update robots.txt to exclude group_members and project_members pages, which can expose sensitive user information to the web. See https://developers.google.com/search/reference/robots_txt for the correct wildcard format.
2018-11-29 22:06:42 +00:00
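With Google-style wildcards, rules of that shape might look like the following sketch (the exact path patterns are illustrative assumptions, not the committed rules):

    User-agent: *
    # Member listings can reveal names and email addresses of users in a
    # group or project; the leading * matches any namespace.
    Disallow: /*/group_members
    Disallow: /*/project_members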
Moritz Schlarb
2be7ddb704
Update robots.txt (#51167)
2018-09-07 07:27:29 +00:00
Achilleas Pipinellis
71f707bfe8
Add the /help page to robots.txt
The /help page contains docs that we don't want crawled,
since we prefer that search engines index the docs website instead.
Related https://gitlab.com/gitlab-org/gitlab-ce/issues/44433
2018-03-22 10:55:02 +01:00
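The rule itself is a one-line disallow; a minimal sketch, assuming the bundled docs are served from the top-level /help path:

    User-agent: *
    # Crawlers should index https://docs.gitlab.com rather than the
    # docs bundled with each GitLab instance.
    Disallow: /help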
eric sabelhaus
3ac6054bf9
correct User-agent placement in robots.txt
2017-01-18 07:21:16 -05:00
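Placement matters because a Disallow line only takes effect inside a group opened by a User-agent line; a rule that appears before any User-agent belongs to no group and is ignored by conforming crawlers. A minimal sketch of the correct ordering (the path is illustrative):

    # The User-agent line must come first; it opens the group...
    User-agent: *
    # ...and every Disallow that follows applies to that group.
    Disallow: /search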
Matt Harrison
f1df7b1bc2
update robots.txt disallow
Allows projects in groups starting with "s" while still disallowing the
snippets short URLs.
2016-09-23 10:57:44 -04:00
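One way to achieve that is to anchor the rule with a trailing slash; the prefix below is an assumption about the snippets short-URL scheme:

    User-agent: *
    # "Disallow: /s" would also block /sample-group/... and any other
    # path that merely starts with "s"; the trailing slash limits the
    # rule to the snippets short-URL prefix itself.
    Disallow: /s/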
Connor Shea
d71edf0ded
Disallow search engines from indexing uploads from a GitLab project.
This can sometimes include sensitive information from private projects and confidential issues. It shouldn't be indexed. Resolves #15551.
2016-05-16 15:04:14 -05:00
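A sketch of the kind of rule this describes, assuming uploads live under each project's namespace path (the pattern is illustrative):

    User-agent: *
    # Files attached to issues and comments may belong to private
    # projects or confidential issues; keep them out of search indexes.
    Disallow: /*/*/uploads/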
Ben Bodenmiller
abf18abb52
allow crawling of commit page but not patch/diffs
The commit page has valuable information that search engines should be allowed to crawl; however, the .patch and .diff pages contain no information that is not already on the commit page.
2015-10-04 23:39:24 -07:00
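Google-style robots.txt supports the $ end-of-URL anchor, which makes it possible to block only the raw renderings while the commit page itself stays crawlable; a hedged sketch (paths illustrative):

    User-agent: *
    # Block the machine-readable renderings of a commit...
    Disallow: /*/*/commit/*.patch$
    Disallow: /*/*/commit/*.diff$
    # ...while /<group>/<project>/commit/<sha> itself remains crawlable.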
Ben Bodenmiller
595a93ee2c
disallow irrelevant pages by default in robots
Update default robots.txt rules to disallow irrelevant pages that search
engines should not care about. This will still allow important pages
like the files, commit details, merge requests, issues, comments, etc.
to be crawled.
2015-08-17 23:11:16 -07:00
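In practice that means listing the low-value, high-volume endpoints explicitly while leaving everything else crawlable by default; a sketch of the shape such rules could take (the specific paths are assumptions):

    User-agent: *
    # Dynamic endpoints with no durable, indexable content.
    Disallow: /search
    Disallow: /autocomplete/users
    Disallow: /api
    # Account and sign-in pages.
    Disallow: /users/sign_in
    Disallow: /profile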
gitlabhq
9ba1224867
init commit
2011-10-09 00:36:38 +03:00