Merge branch 'correct_robots_txt' into 'master'

correct User-agent placement in robots.txt

Closes #26807

See merge request !8623
Rémy Coutable committed 2017-01-18 19:15:24 +00:00
commit 6b0ea00d84
2 changed files with 7 additions and 2 deletions

Changelog entry (new file):

@@ -0,0 +1,4 @@
+---
+title: "Correct User-agent placement in robots.txt"
+merge_request: 8623
+author: Eric Sabelhaus

public/robots.txt:

@@ -4,13 +4,12 @@
 # User-Agent: *
 # Disallow: /
-User-Agent: *
 # Add a 1 second delay between successive requests to the same server, limits resources used by crawler
 # Only some crawlers respect this setting, e.g. Googlebot does not
 # Crawl-delay: 1
 # Based on details in https://gitlab.com/gitlab-org/gitlab-ce/blob/master/config/routes.rb, https://gitlab.com/gitlab-org/gitlab-ce/blob/master/spec/routing, and using application
+User-Agent: *
 Disallow: /autocomplete/users
 Disallow: /search
 Disallow: /api
@@ -23,12 +22,14 @@ Disallow: /groups/*/edit
 Disallow: /users
 # Global snippets
+User-Agent: *
 Disallow: /s/
 Disallow: /snippets/new
 Disallow: /snippets/*/edit
 Disallow: /snippets/*/raw
 # Project details
+User-Agent: *
 Disallow: /*/*.git
 Disallow: /*/*/fork/new
 Disallow: /*/*/repository/archive*
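Why the placement matters: in the robots.txt format, a group of rules must begin with a `User-agent` line, and a blank line ends the group, so `Disallow` rules that are separated from their `User-agent` line (or have none at all) are silently ignored by conforming parsers. A minimal sketch with Python's standard `urllib.robotparser` (the bot name, URL, and the shortened rule sets are illustrative, not the full file from this diff):

```python
import urllib.robotparser

# Before the fix: a blank line separates "User-Agent: *" from its rules,
# so the group is terminated early and the Disallow line is dropped.
broken = [
    "User-Agent: *",
    "",
    "# Based on details in config/routes.rb",
    "Disallow: /search",
]

# After the fix: the User-Agent line sits directly above the rules it governs.
fixed = [
    "# Based on details in config/routes.rb",
    "User-Agent: *",
    "Disallow: /search",
]

rp_broken = urllib.robotparser.RobotFileParser()
rp_broken.parse(broken)

rp_fixed = urllib.robotparser.RobotFileParser()
rp_fixed.parse(fixed)

# The broken file fails to block /search; the fixed one blocks it.
print(rp_broken.can_fetch("SomeBot", "https://gitlab.example.com/search"))  # True
print(rp_fixed.can_fetch("SomeBot", "https://gitlab.example.com/search"))   # False
```

The same reasoning explains the second hunk: the "# Global snippets" and "# Project details" groups each need their own `User-Agent: *` line, since the comment and the preceding blank line do not carry the earlier declaration forward.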