# Gitaly

[Gitaly](https://gitlab.com/gitlab-org/gitaly) is the service that
provides high-level RPC access to Git repositories. Without it, no other
components can read or write Git data. GitLab components that access Git
repositories (GitLab Rails, GitLab Shell, GitLab Workhorse, and so on) act as clients
to Gitaly. End users do not have direct access to Gitaly.

On this page, *Gitaly server* refers to a standalone node that only runs Gitaly,
and *Gitaly client* is a GitLab Rails app node that runs all other processes
except Gitaly.

## Architecture

Here's a high-level architecture overview of how Gitaly is used.

![Gitaly architecture diagram](img/architecture_v12_4.png)

## Configuring Gitaly

The Gitaly service itself is configured via a [TOML configuration file](reference.md).

If you want to change any of its settings:

**For Omnibus GitLab**

1. Edit `/etc/gitlab/gitlab.rb` and add or change the [Gitaly settings](https://gitlab.com/gitlab-org/omnibus-gitlab/blob/1dd07197c7e5ae23626aad5a4a070a800b670380/files/gitlab-config-template/gitlab.rb.template#L1622-1676).
1. Save the file and [reconfigure GitLab](../restart_gitlab.md#omnibus-gitlab-reconfigure).

**For installations from source**

1. Edit `/home/git/gitaly/config.toml` and add or change the [Gitaly settings](https://gitlab.com/gitlab-org/gitaly/blob/master/config.toml.example).
1. Save the file and [restart GitLab](../restart_gitlab.md#installations-from-source).

## Running Gitaly on its own server

This is an optional way to deploy Gitaly which can benefit GitLab
installations that are larger than a single machine. Most
installations will be better served with the default configuration
used by Omnibus and the GitLab source installation guide.

When transitioning to Gitaly on its own server, note that
[Gitaly servers need to be upgraded before other servers in your cluster](https://docs.gitlab.com/omnibus/update/#upgrading-gitaly-servers).

Starting with GitLab 11.4, Gitaly is able to serve all Git requests without
requiring a shared NFS mount for Git repository data.
Between 11.4 and 11.8, the exception was the
[Elasticsearch indexer](https://gitlab.com/gitlab-org/gitlab-elasticsearch-indexer),
but since 11.8 the indexer uses Gitaly for data access as well. NFS can still
be leveraged for redundancy at the block level of the Git data, but it only has to
be mounted on the Gitaly server.

From GitLab v11.8 to v12.2, it is possible to use Elasticsearch in conjunction with
a Gitaly setup that isn't using NFS. In order to use Elasticsearch in this
scenario, the [new repository indexer](../../integration/elasticsearch.md#elasticsearch-repository-indexer)
needs to be enabled in your GitLab configuration. [Since GitLab v12.3](https://gitlab.com/gitlab-org/gitlab/issues/6481),
the new indexer is the default and no configuration is required.

### Network architecture

The following list depicts the network architecture of Gitaly:

- GitLab Rails shards repositories into [repository storages](../repository_storage_paths.md).
- `/config/gitlab.yml` contains a map from storage names to
  `(Gitaly address, Gitaly token)` pairs.
- The `storage name` -> `(Gitaly address, Gitaly token)` map in
  `/config/gitlab.yml` is the single source of truth for the Gitaly network
  topology (see the example after this list).
- A `(Gitaly address, Gitaly token)` corresponds to a Gitaly server.
- A Gitaly server hosts one or more storages.
- A GitLab server can use one or more Gitaly servers.
- Gitaly addresses must be specified in such a way that they resolve
  correctly for ALL Gitaly clients.
- Gitaly clients are: Unicorn, Sidekiq, GitLab Workhorse,
  GitLab Shell, Elasticsearch Indexer, and Gitaly itself.
- A Gitaly server must be able to make RPC calls **to itself** via its own
  `(Gitaly address, Gitaly token)` pair as specified in `/config/gitlab.yml`.
- Gitaly servers must not be exposed to the public internet as Gitaly's network
  traffic is unencrypted by default. The use of a firewall is highly recommended
  to restrict access to the Gitaly server. Another option is to
  [use TLS](#tls-support).
- Authentication is done through a static token which is shared among the Gitaly
  and GitLab Rails nodes.
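
For illustration, an abbreviated sketch of such a map in `/config/gitlab.yml` (hostnames, port, and token are placeholders; full examples appear later on this page):

```yaml
gitlab:
  repositories:
    storages:
      default:
        gitaly_address: tcp://gitaly1.internal:8075
      storage1:
        gitaly_address: tcp://gitaly2.internal:8075
  gitaly:
    token: 'abc123secret'
```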

Below we describe how to configure two Gitaly servers, one at
`gitaly1.internal` and the other at `gitaly2.internal`,
with secret token `abc123secret`. We assume
your GitLab installation has three repository storages: `default`,
`storage1` and `storage2`. You can use as few as one server with one
repository storage if desired.

NOTE: **Note:** The token referred to throughout the Gitaly documentation is
just an arbitrary password selected by the administrator. It is unrelated to
tokens created for the GitLab API or other similar web API tokens.

### 1. Installation

First install Gitaly on each Gitaly server using either
Omnibus GitLab or install it from source:

- For Omnibus GitLab: [Download/install](https://about.gitlab.com/install/) the Omnibus GitLab
  package you want using **steps 1 and 2** from the GitLab downloads page, but
  **_do not_** provide the `EXTERNAL_URL=` value (see the example after this list).
- From source: [Install Gitaly](../../install/installation.md#install-gitaly).
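
For example, an Omnibus install on a Gitaly server might look like the following. This is a sketch: it assumes a Debian/Ubuntu system and GitLab EE (use `gitlab-ce` for Community Edition):

```shell
# Add the GitLab package repository (Debian/Ubuntu; GitLab EE assumed).
curl -s https://packages.gitlab.com/install/repositories/gitlab/gitlab-ee/script.deb.sh | sudo bash

# Install the package WITHOUT setting EXTERNAL_URL=, since this node
# will only run Gitaly, not the full GitLab stack.
sudo apt-get install gitlab-ee
```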

### 2. Client side token configuration

Configure a token on the instance that runs the GitLab Rails application.

**For Omnibus GitLab**

1. On the client node(s), edit `/etc/gitlab/gitlab.rb`:

   ```ruby
   gitlab_rails['gitaly_token'] = 'abc123secret'
   ```

1. Save the file and [reconfigure GitLab](../restart_gitlab.md#omnibus-gitlab-reconfigure).

**For installations from source**

1. On the client node(s), edit `/home/git/gitlab/config/gitlab.yml`:

   ```yaml
   gitlab:
     gitaly:
       token: 'abc123secret'
   ```

1. Save the file and [restart GitLab](../restart_gitlab.md#installations-from-source).
### 3. Gitaly server configuration

Next, on the Gitaly servers, you need to configure storage paths, enable
the network listener, and configure the token.

NOTE: **Note:** If you want to reduce the risk of downtime when you enable
authentication, you can temporarily disable enforcement. See the
[documentation on configuring Gitaly authentication](https://gitlab.com/gitlab-org/gitaly/blob/master/doc/configuration/README.md#authentication).

Gitaly must trigger some callbacks to GitLab via GitLab Shell. As a result,
the GitLab Shell secret must be the same between the other GitLab servers and
the Gitaly server. The easiest way to accomplish this is to copy `/etc/gitlab/gitlab-secrets.json`
from an existing GitLab server to the Gitaly server. Without this shared secret,
Git operations in GitLab will result in an API error.
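
For example, run the following from an existing GitLab app node. This is a sketch; it assumes root SSH access to the Gitaly server, and the hostname is a placeholder:

```shell
# Copy the GitLab Shell secret to the Gitaly server so callbacks authenticate.
sudo scp /etc/gitlab/gitlab-secrets.json root@gitaly1.internal:/etc/gitlab/gitlab-secrets.json
```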

**For Omnibus GitLab**

1. Edit `/etc/gitlab/gitlab.rb`:

   <!--
   updates to following example must also be made at
   https://gitlab.com/gitlab-org/charts/gitlab/blob/master/doc/advanced/external-gitaly/external-omnibus-gitaly.md#configure-omnibus-gitlab
   -->

   ```ruby
   # /etc/gitlab/gitlab.rb

   # Avoid running unnecessary services on the Gitaly server
   postgresql['enable'] = false
   redis['enable'] = false
   nginx['enable'] = false
   unicorn['enable'] = false
   sidekiq['enable'] = false
   gitlab_workhorse['enable'] = false

   # If you don't want to run monitoring services, uncomment the following (not recommended)
   # alertmanager['enable'] = false
   # gitlab_exporter['enable'] = false
   # grafana['enable'] = false
   # node_exporter['enable'] = false
   # prometheus['enable'] = false

   # Enable Prometheus monitoring - comment out if you disable the monitoring services above.
   # This makes Prometheus listen on all interfaces. You must use firewalls to restrict access to this address/port.
   prometheus['listen_address'] = '0.0.0.0:9090'

   # Prevent database connections during 'gitlab-ctl reconfigure'
   gitlab_rails['rake_cache_clear'] = false
   gitlab_rails['auto_migrate'] = false

   # Configure the gitlab-shell API callback URL. Without this, `git push` will
   # fail. This can be your 'front door' GitLab URL or an internal load
   # balancer.
   # Don't forget to copy `/etc/gitlab/gitlab-secrets.json` from the web server to the Gitaly server.
   gitlab_rails['internal_api_url'] = 'https://gitlab.example.com'

   # Authentication token to ensure only authorized servers can communicate with
   # the Gitaly server
   gitaly['auth_token'] = 'abc123secret'

   # Make Gitaly accept connections on all network interfaces. You must use
   # firewalls to restrict access to this address/port.
   # Comment out the following line if you only want to support TLS connections
   gitaly['listen_addr'] = "0.0.0.0:8075"
   ```

1. Append the following to `/etc/gitlab/gitlab.rb` for each respective server:

   <!--
   updates to following example must also be made at
   https://gitlab.com/gitlab-org/charts/gitlab/blob/master/doc/advanced/external-gitaly/external-omnibus-gitaly.md#configure-omnibus-gitlab
   -->

   On `gitaly1.internal`:

   ```ruby
   git_data_dirs({
     'default' => {
       'path' => '/var/opt/gitlab/git-data'
     },
     'storage1' => {
       'path' => '/mnt/gitlab/git-data'
     },
   })
   ```

   On `gitaly2.internal`:

   ```ruby
   git_data_dirs({
     'storage2' => {
       'path' => '/srv/gitlab/git-data'
     },
   })
   ```

1. Save the file and [reconfigure GitLab](../restart_gitlab.md#omnibus-gitlab-reconfigure).

**For installations from source**

1. On the Gitaly server node(s), edit `/home/git/gitaly/config.toml`:

   ```toml
   listen_addr = '0.0.0.0:8075'
   internal_socket_dir = '/var/opt/gitlab/gitaly'

   [auth]
   token = 'abc123secret'

   [logging]
   format = 'json'
   level = 'info'
   dir = '/var/log/gitaly'
   ```

1. Append the following to `/home/git/gitaly/config.toml` for each respective server:

   On `gitaly1.internal`:

   ```toml
   [[storage]]
   name = 'default'
   path = '/var/opt/gitlab/git-data/repositories'

   [[storage]]
   name = 'storage1'
   path = '/mnt/gitlab/git-data/repositories'
   ```

   On `gitaly2.internal`:

   ```toml
   [[storage]]
   name = 'storage2'
   path = '/srv/gitlab/git-data/repositories'
   ```

1. Save the file and [restart GitLab](../restart_gitlab.md#installations-from-source).

### 4. Converting clients to use the Gitaly server

As the final step, you need to update the client machines to switch from using
their local Gitaly service to the new Gitaly server you just configured. This
is a risky step because if there is any sort of network, firewall, or name
resolution problem preventing your GitLab server from reaching the Gitaly server,
then all Gitaly requests will fail.

Additionally, you need to
[disable Rugged if previously manually enabled](../high_availability/nfs.md#improving-nfs-performance-with-gitlab).

We assume that your `gitaly1.internal` Gitaly server can be reached at
`gitaly1.internal:8075` from your GitLab server, and that Gitaly server
can read and write to `/var/opt/gitlab/git-data` and `/mnt/gitlab/git-data`.
We assume also that your `gitaly2.internal` Gitaly server can be reached at
`gitaly2.internal:8075` from your GitLab server, and that Gitaly server
can read and write to `/srv/gitlab/git-data`.

**For Omnibus GitLab**

1. Edit `/etc/gitlab/gitlab.rb`:

   ```ruby
   git_data_dirs({
     'default' => { 'gitaly_address' => 'tcp://gitaly1.internal:8075' },
     'storage1' => { 'gitaly_address' => 'tcp://gitaly1.internal:8075' },
     'storage2' => { 'gitaly_address' => 'tcp://gitaly2.internal:8075' },
   })

   gitlab_rails['gitaly_token'] = 'abc123secret'
   ```

1. Save the file and [reconfigure GitLab](../restart_gitlab.md#omnibus-gitlab-reconfigure).
1. Tail the logs to see the requests:

   ```shell
   sudo gitlab-ctl tail gitaly
   ```

**For installations from source**

1. Edit `/home/git/gitlab/config/gitlab.yml`:

   ```yaml
   gitlab:
     repositories:
       storages:
         default:
           gitaly_address: tcp://gitaly1.internal:8075
           path: /some/dummy/path
         storage1:
           gitaly_address: tcp://gitaly1.internal:8075
           path: /some/dummy/path
         storage2:
           gitaly_address: tcp://gitaly2.internal:8075
           path: /some/dummy/path

     gitaly:
       token: 'abc123secret'
   ```

   NOTE: **Note:**
   `/some/dummy/path` should be set to a local folder that exists, however no
   data will be stored in this folder. This will no longer be necessary after
   [this issue](https://gitlab.com/gitlab-org/gitaly/issues/1282) is resolved.

1. Save the file and [restart GitLab](../restart_gitlab.md#installations-from-source).
1. Tail the logs to see the requests:

   ```shell
   tail -f /home/git/gitlab/log/gitaly.log
   ```

When you tail the Gitaly logs on your Gitaly server you should see requests
coming in. One sure way to trigger a Gitaly request is to clone a repository
from your GitLab server over HTTP.
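
For example (the project URL is a placeholder):

```shell
# Any Git HTTP operation against the GitLab server exercises Gitaly.
git clone https://gitlab.example.com/group/project.git
```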

DANGER: **Danger:**
If you have [Server hooks](../server_hooks.md) configured,
either per repository or globally, you must move these to the Gitaly node.
If you have multiple Gitaly nodes, copy your server hook(s) to all nodes,
as in the sketch below.
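
A sketch of copying one repository's hooks; all paths, the group/project placeholders, and the hostname are illustrative, so adjust them to your storage layout:

```shell
# Copy a repository's custom server hooks from the old GitLab server
# to the Gitaly node (assumes Omnibus default repository paths and root SSH).
sudo rsync -av \
  /var/opt/gitlab/git-data/repositories/<group>/<project>.git/custom_hooks/ \
  root@gitaly1.internal:/var/opt/gitlab/git-data/repositories/<group>/<project>.git/custom_hooks/
```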
### Disabling the Gitaly service in a cluster environment

If you are running Gitaly [as a remote
service](#running-gitaly-on-its-own-server) you may want to disable
the local Gitaly service that runs on your GitLab server by default.
Disabling Gitaly only makes sense when you run GitLab in a custom
cluster configuration, where different services run on different
machines. Disabling Gitaly on all machines in the cluster is not a
valid configuration.

To disable Gitaly on a client node:

**For Omnibus GitLab**

1. Edit `/etc/gitlab/gitlab.rb`:

   ```ruby
   gitaly['enable'] = false
   ```

1. Save the file and [reconfigure GitLab](../restart_gitlab.md#omnibus-gitlab-reconfigure).

**For installations from source**

1. Edit `/etc/default/gitlab`:

   ```shell
   gitaly_enabled=false
   ```

1. Save the file and [restart GitLab](../restart_gitlab.md#installations-from-source).

## TLS support

> [Introduced](https://gitlab.com/gitlab-org/gitlab-foss/-/merge_requests/22602) in GitLab 11.8.

Gitaly supports TLS encryption. To be able to communicate
with a Gitaly instance that listens for secure connections, you will need to use the `tls://` URL
scheme in the `gitaly_address` of the corresponding storage entry in the GitLab configuration.

You will need to bring your own certificates as this isn't provided automatically.
The certificate to be used needs to be installed on all Gitaly nodes, and the
certificate (or the CA of the certificate) on all
client nodes that communicate with it, following the procedure described in
[GitLab custom certificate configuration](https://docs.gitlab.com/omnibus/settings/ssl.html#install-custom-public-certificates).

NOTE: **Note:**
The self-signed certificate must specify the address you use to access the
Gitaly server. If you are addressing the Gitaly server by a hostname, you can
either use the Common Name field for this, or add it as a Subject Alternative
Name. If you are addressing the Gitaly server by its IP address, you must add it
as a Subject Alternative Name to the certificate.
[gRPC does not support using an IP address as Common Name in a certificate](https://github.com/grpc/grpc/issues/2691).
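
For example, a suitable self-signed certificate could be generated as follows. This is a sketch: `-addext` requires OpenSSL 1.1.1 or newer, and `gitaly1.internal` is a placeholder hostname:

```shell
# Generate a self-signed key and certificate with the Gitaly hostname
# in both the Common Name and a Subject Alternative Name.
openssl req -x509 -newkey rsa:4096 -sha256 -days 365 -nodes \
  -keyout key.pem -out cert.pem \
  -subj "/CN=gitaly1.internal" \
  -addext "subjectAltName=DNS:gitaly1.internal"
```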

NOTE: **Note:**
It is possible to configure Gitaly servers with both an
unencrypted listening address `listen_addr` and an encrypted listening
address `tls_listen_addr` at the same time. This allows you to do a
gradual transition from unencrypted to encrypted traffic, if necessary.

To configure Gitaly with TLS:

**For Omnibus GitLab**

1. On the client node(s), edit `/etc/gitlab/gitlab.rb` as follows:

   ```ruby
   git_data_dirs({
     'default' => { 'gitaly_address' => 'tls://gitaly1.internal:9999' },
     'storage1' => { 'gitaly_address' => 'tls://gitaly1.internal:9999' },
     'storage2' => { 'gitaly_address' => 'tls://gitaly2.internal:9999' },
   })

   gitlab_rails['gitaly_token'] = 'abc123secret'
   ```

1. Save the file and [reconfigure GitLab](../restart_gitlab.md#omnibus-gitlab-reconfigure) on client node(s).
1. On the Gitaly server, create the `/etc/gitlab/ssl` directory and copy your key and certificate there:

   ```shell
   sudo mkdir -p /etc/gitlab/ssl
   sudo chmod 755 /etc/gitlab/ssl
   sudo cp key.pem cert.pem /etc/gitlab/ssl/
   ```

1. On the Gitaly server node(s), edit `/etc/gitlab/gitlab.rb` and add:

   <!--
   updates to following example must also be made at
   https://gitlab.com/gitlab-org/charts/gitlab/blob/master/doc/advanced/external-gitaly/external-omnibus-gitaly.md#configure-omnibus-gitlab
   -->

   ```ruby
   gitaly['tls_listen_addr'] = "0.0.0.0:9999"
   gitaly['certificate_path'] = "/etc/gitlab/ssl/cert.pem"
   gitaly['key_path'] = "/etc/gitlab/ssl/key.pem"
   ```

1. Save the file and [reconfigure GitLab](../restart_gitlab.md#omnibus-gitlab-reconfigure) on Gitaly server node(s).
1. (Optional) After [verifying that all Gitaly traffic is being served over TLS](#observe-type-of-gitaly-connections),
   you can improve security by disabling non-TLS connections by commenting out
   or deleting `gitaly['listen_addr']` in `/etc/gitlab/gitlab.rb`, saving the file,
   and [reconfiguring GitLab](../restart_gitlab.md#omnibus-gitlab-reconfigure)
   on Gitaly server node(s).

**For installations from source**

1. On the client node(s), edit `/home/git/gitlab/config/gitlab.yml` as follows:

   ```yaml
   gitlab:
     repositories:
       storages:
         default:
           gitaly_address: tls://gitaly1.internal:9999
           path: /some/dummy/path
         storage1:
           gitaly_address: tls://gitaly1.internal:9999
           path: /some/dummy/path
         storage2:
           gitaly_address: tls://gitaly2.internal:9999
           path: /some/dummy/path

     gitaly:
       token: 'abc123secret'
   ```

   NOTE: **Note:**
   `/some/dummy/path` should be set to a local folder that exists, however no
   data will be stored in this folder. This will no longer be necessary after
   [this issue](https://gitlab.com/gitlab-org/gitaly/issues/1282) is resolved.

1. Save the file and [restart GitLab](../restart_gitlab.md#installations-from-source) on client node(s).
1. Create the `/etc/gitlab/ssl` directory and copy your key and certificate there:

   ```shell
   sudo mkdir -p /etc/gitlab/ssl
   sudo chmod 700 /etc/gitlab/ssl
   sudo cp key.pem cert.pem /etc/gitlab/ssl/
   ```

1. On the Gitaly server node(s), edit `/home/git/gitaly/config.toml` and add:

   ```toml
   tls_listen_addr = '0.0.0.0:9999'

   [tls]
   certificate_path = '/etc/gitlab/ssl/cert.pem'
   key_path = '/etc/gitlab/ssl/key.pem'
   ```

1. Save the file and [restart GitLab](../restart_gitlab.md#installations-from-source) on Gitaly server node(s).
1. (Optional) After [verifying that all Gitaly traffic is being served over TLS](#observe-type-of-gitaly-connections),
   you can improve security by disabling non-TLS connections by commenting out
   or deleting `listen_addr` in `/home/git/gitaly/config.toml`, saving the file,
   and [restarting GitLab](../restart_gitlab.md#installations-from-source)
   on Gitaly server node(s).

### Observe type of Gitaly connections

To observe what type of connections are actually being used in a
production environment, you can use the following Prometheus query:

```prometheus
sum(rate(gitaly_connections_total[5m])) by (type)
```

## `gitaly-ruby`

Gitaly was developed to replace the Ruby application code in GitLab.
In order to save time and/or avoid the risk of rewriting existing
application logic, in some cases we chose to copy some application code
from GitLab into Gitaly almost as-is. To be able to run that code,
`gitaly-ruby` was created, which is a "sidecar" process for the main Gitaly Go
process. Some examples of things that are implemented in `gitaly-ruby` are
RPCs that deal with wikis, and RPCs that create commits on behalf of
a user, such as merge commits.

### Number of `gitaly-ruby` workers

`gitaly-ruby` has much less capacity than Gitaly itself. If your Gitaly
server has to handle a lot of requests, the default setting of having
just one active `gitaly-ruby` sidecar might not be enough. If you see
`ResourceExhausted` errors from Gitaly, it's very likely that you don't have
enough `gitaly-ruby` capacity.

You can increase the number of `gitaly-ruby` processes on your Gitaly
server with the following settings.

**For Omnibus GitLab**

1. Edit `/etc/gitlab/gitlab.rb`:

   ```ruby
   # Default is 2 workers. The minimum is 2; 1 worker is always reserved as
   # a passive stand-by.
   gitaly['ruby_num_workers'] = 4
   ```

1. Save the file and [reconfigure GitLab](../restart_gitlab.md#omnibus-gitlab-reconfigure).

**For installations from source**

1. Edit `/home/git/gitaly/config.toml`:

   ```toml
   [gitaly-ruby]
   num_workers = 4
   ```

1. Save the file and [restart GitLab](../restart_gitlab.md#installations-from-source).

## Limiting RPC concurrency

It can happen that CI clone traffic puts a large strain on your Gitaly
service. The bulk of the work gets done in the SSHUploadPack (for Git
SSH) and PostUploadPack (for Git HTTP) RPCs. To prevent such workloads
from overcrowding your Gitaly server, you can set concurrency limits in
Gitaly's configuration file.

```ruby
# in /etc/gitlab/gitlab.rb
gitaly['concurrency'] = [
  {
    'rpc' => "/gitaly.SmartHTTPService/PostUploadPack",
    'max_per_repo' => 20
  },
  {
    'rpc' => "/gitaly.SSHService/SSHUploadPack",
    'max_per_repo' => 20
  }
]
```
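
For installations from source, the equivalent limits can be expressed in Gitaly's `config.toml`; a sketch mirroring the example above:

```toml
# in /home/git/gitaly/config.toml
[[concurrency]]
rpc = "/gitaly.SmartHTTPService/PostUploadPack"
max_per_repo = 20

[[concurrency]]
rpc = "/gitaly.SSHService/SSHUploadPack"
max_per_repo = 20
```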

This will limit the number of in-flight RPC calls for the given RPCs.
The limit is applied per repository. In the example above, each repository on the
Gitaly server can have at most 20 simultaneous PostUploadPack calls in
flight, and the same for SSHUploadPack. If another request comes in for
a repository that has used up its 20 slots, that request will get
queued.

You can observe the behavior of this queue via the Gitaly logs and via
Prometheus. In the Gitaly logs, you can look for the string (or
structured log field) `acquire_ms`. Messages that have this field are
reporting about the concurrency limiter. In Prometheus, look for the
`gitaly_rate_limiting_in_progress`, `gitaly_rate_limiting_queued`, and
`gitaly_rate_limiting_seconds` metrics.

The name of the Prometheus metric is not quite right because this is a
concurrency limiter, not a rate limiter. If a client makes 1000 requests
in a row in a very short timespan, the concurrency will not exceed 1,
and this mechanism (the concurrency limiter) will do nothing.
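
For example, the following query (a sketch built on the metrics named above) shows how many requests are currently waiting in the concurrency queue:

```prometheus
sum(gitaly_rate_limiting_queued)
```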

## Rotating a Gitaly authentication token

Rotating credentials in a production environment often either requires
downtime, or causes outages, or both. If you are careful, though, you
*can* rotate Gitaly credentials without a service interruption.

This procedure also works if you are running GitLab on a single server.
In that case, "Gitaly server" and "Gitaly client" refer to the same
machine.

### 1. Monitor current authentication behavior

Use Prometheus to see what the current authentication behavior of your
GitLab installation is:

```prometheus
sum(rate(gitaly_authentications_total[5m])) by (enforced, status)
```

In a system where authentication is configured correctly, and where you
have live traffic, you will see something like this:

```prometheus
{enforced="true",status="ok"} 4424.985419441742
```

There may also be other numbers with rate 0, but we only care about the
non-zero numbers.

The only non-zero number should have `enforced="true",status="ok"`. If
you have other non-zero numbers, something is wrong in your
configuration.

The `status="ok"` number reflects your current request rate. In the example
above, Gitaly is handling about 4000 requests per second.

Now you have established that you can monitor the Gitaly authentication
behavior of your GitLab installation.

### 2. Reconfigure all Gitaly servers to be in "auth transitioning" mode

The second step is to temporarily disable authentication enforcement on the Gitaly servers:

```ruby
# in /etc/gitlab/gitlab.rb
gitaly['auth_transitioning'] = true
```

After you have applied this, your Prometheus query should return
something like this:

```prometheus
{enforced="false",status="would be ok"} 4424.985419441742
```

Because `enforced="false"`, it will be safe to start rolling out the new
token.

### 3. Update Gitaly token on all clients and servers

```ruby
# in /etc/gitlab/gitlab.rb
gitaly['auth_token'] = 'my new secret token'
```

Remember to apply this on both your Gitaly clients *and* servers; the
client-side setting is shown below. If you
check your Prometheus query while this change is being rolled out, you
will see non-zero values for the `enforced="false",status="denied"` counter.
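
On the Gitaly clients (the GitLab Rails nodes), the corresponding Omnibus setting is the one from the client-side token configuration earlier on this page:

```ruby
# in /etc/gitlab/gitlab.rb on Gitaly client (GitLab Rails) nodes
gitlab_rails['gitaly_token'] = 'my new secret token'
```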

### 4. Use Prometheus to ensure there are no authentication failures

After you have applied the Gitaly token change everywhere, and all services
involved have been restarted, you will temporarily see a mix of
`status="would be ok"` and `status="denied"`.

After the new token has been picked up by all Gitaly clients and
servers, the **only non-zero rate** should be
`enforced="false",status="would be ok"`.

### 5. Disable "auth transitioning" mode

Now we turn off the "auth transitioning" mode. These final steps are
important: without them, you have **no authentication**.

Update the configuration on your Gitaly servers:

```ruby
# in /etc/gitlab/gitlab.rb
gitaly['auth_transitioning'] = false
```

### 6. Verify that authentication is enforced again

Refresh your Prometheus query. You should now see the same kind of
result as you did in the beginning:

```prometheus
{enforced="true",status="ok"} 4424.985419441742
```

Note that `enforced="true"`, meaning that authentication is being enforced.

## Direct Git access in GitLab Rails

Also known as "the Rugged patches".

### History

Before Gitaly existed, the things that are now Gitaly clients used to
access Git repositories directly. Either on a local disk in the case of
e.g. a single-machine Omnibus GitLab installation, or via NFS in the
case of a horizontally scaled GitLab installation.

Besides running plain `git` commands, in GitLab Rails we also used to
use a Ruby gem (library) called
[Rugged](https://github.com/libgit2/rugged). Rugged is a wrapper around
[libgit2](https://libgit2.org/), a stand-alone implementation of Git in
the form of a C library.

Over time it has become clear to us that Rugged, and particularly
Rugged in combination with the [Unicorn](https://bogomips.org/unicorn/)
web server, is extremely efficient. Because libgit2 is a *library* and
not an external process, there was very little overhead between GitLab
application code that tried to look up data in Git repositories, and the
Git implementation itself.

Because Rugged+Unicorn was so efficient, GitLab's application code ended
up with lots of duplicate Git object lookups (like looking up the
`master` commit a dozen times in one request). We could write
inefficient code without being punished for it.

When we migrated these Git lookups to Gitaly calls, we were suddenly
getting a much higher fixed cost per Git lookup. Even when Gitaly is
able to re-use an already-running `git` process to look up e.g. a commit,
you still have the cost of a network roundtrip to Gitaly, and within
Gitaly a write/read roundtrip on the Unix pipes that connect Gitaly to
the `git` process.

Using GitLab.com performance as our yardstick, we pushed down the number
of Gitaly calls per request until the loss of Rugged's efficiency was no
longer felt. It also helped that we run Gitaly itself directly on the
Git file servers, rather than via NFS mounts: this gave us a speed boost
that counteracted the negative effect of not using Rugged anymore.

Unfortunately, some *other* deployments of GitLab could not ditch NFS
like we did on GitLab.com and they got the worst of both worlds: the
slowness of NFS and the increased inherent overhead of Gitaly.

As a performance band-aid for these stuck-on-NFS deployments, we
re-introduced some of the old Rugged code that got deleted from
GitLab Rails during the Gitaly migration project. These pieces of
re-introduced code are informally referred to as "the Rugged patches".

### Activation of direct Git access in GitLab Rails

The Ruby methods that perform direct Git access are hidden behind [feature
flags](../../development/gitaly.md#legacy-rugged-code). These feature
flags are off by default. It is not good if you need to know about
feature flags to get the best performance, so in a second iteration, we
added an automatic mechanism that will enable direct Git access.

When GitLab Rails calls a function that has a Rugged patch, it performs
two checks. The result of both of these checks is cached.

1. Is the feature flag for this patch set in the database? If so, do
   what the feature flag says.
1. If the feature flag is not set (i.e. neither true nor false), try to
   see if we can access the filesystem underneath the Gitaly server
   directly. If so, use the Rugged patch.

To see if GitLab Rails can access the repository filesystem directly, we use
the following heuristic:

- Gitaly ensures that the filesystem has a metadata file in its root
  with a UUID in it.
- Gitaly reports this UUID to GitLab Rails via the `ServerInfo` RPC.
- GitLab Rails tries to read the metadata file directly. If it exists,
  and if the UUIDs match, assume we have direct access.
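
For example, you can inspect the metadata file manually; the filename `.gitaly-metadata` is an internal implementation detail, and the path shown assumes Omnibus defaults:

```shell
# On a GitLab Rails node: if this file is readable and its UUID matches
# what Gitaly reports, direct Git access is considered possible.
sudo cat /var/opt/gitlab/git-data/repositories/.gitaly-metadata
```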

Because of the way the UUID check works, and because Omnibus GitLab will
fill in the correct repository paths in the GitLab Rails config file
`config/gitlab.yml`, **direct Git access in GitLab Rails is on by default in
Omnibus**.

### Plans to remove direct Git access in GitLab Rails

For the sake of removing complexity, it is desirable that we get rid of
direct Git access in GitLab Rails. For as long as some GitLab installations are stuck
with Git repositories on slow NFS, however, we cannot just remove them.

There are two prongs to our efforts to remove direct Git access in GitLab Rails:

1. Reduce the number of (inefficient) Gitaly queries made by
   GitLab Rails.
1. Persuade everybody who runs a Highly Available / horizontally scaled
   GitLab installation to move off of NFS.

The second prong is the only real solution. For this we need [Gitaly
HA](https://gitlab.com/groups/gitlab-org/-/epics?scope=all&utf8=%E2%9C%93&state=opened&label_name[]=Gitaly%20HA),
which is still under development as of December 2019.

## Troubleshooting Gitaly

### Checking versions when using standalone Gitaly nodes

When using standalone Gitaly nodes, you must make sure they are the same version
as GitLab to ensure full compatibility. Check **Admin Area > Gitaly Servers** on
your GitLab instance and confirm all Gitaly Servers are `Up to date`.

![Gitaly standalone software versions diagram](img/gitlab_gitaly_version_mismatch_v12_4.png)

### `gitaly-debug`

The `gitaly-debug` command provides "production debugging" tools for Gitaly and Git
performance. It is intended to help production engineers and support
engineers investigate Gitaly performance problems.

If you're using GitLab 11.6 or newer, this tool should be installed on
your GitLab / Gitaly server already at `/opt/gitlab/embedded/bin/gitaly-debug`.
If you're investigating an older GitLab version you can compile this
tool offline and copy the executable to your server:

```shell
git clone https://gitlab.com/gitlab-org/gitaly.git
cd gitaly/cmd/gitaly-debug
GOOS=linux GOARCH=amd64 go build -o gitaly-debug
```

To see the help page of `gitaly-debug` for a list of supported sub-commands, run:

```shell
gitaly-debug -h
```

### Commits, pushes, and clones return a 401

```plaintext
remote: GitLab: 401 Unauthorized
```

You will need to sync your `gitlab-secrets.json` file with your GitLab
app nodes.

### Client side gRPC logs

Gitaly uses the [gRPC](https://grpc.io/) RPC framework. The Ruby gRPC
client has its own log file which may contain useful information when
you are seeing Gitaly errors. You can control the log level of the
gRPC client with the `GRPC_LOG_LEVEL` environment variable. The
default level is `WARN`.

You can run a gRPC trace with:

```shell
sudo GRPC_TRACE=all GRPC_VERBOSITY=DEBUG gitlab-rake gitlab:gitaly:check
```

### Observing `gitaly-ruby` traffic

[`gitaly-ruby`](#gitaly-ruby) is an internal implementation detail of Gitaly,
so there's not that much visibility into what goes on inside
`gitaly-ruby` processes.

If you have Prometheus set up to scrape your Gitaly process, you can see
request rates and error codes for individual RPCs in `gitaly-ruby` by
querying `grpc_client_handled_total`. Strictly speaking, this metric does
not differentiate between `gitaly-ruby` and other RPCs, but in practice
(as of GitLab 11.9), all gRPC calls made by Gitaly itself are internal
calls from the main Gitaly process to one of its `gitaly-ruby` sidecars.

Assuming your `grpc_client_handled_total` counter only observes Gitaly,
the following query shows you the RPCs that are (most likely) internally
implemented as calls to `gitaly-ruby`:

```prometheus
sum(rate(grpc_client_handled_total[5m])) by (grpc_method) > 0
```

### Repository changes fail with a `401 Unauthorized` error

If you're running Gitaly on its own server and notice that users can
successfully clone and fetch repositories (via both SSH and HTTPS), but can't
push to them or make changes to the repository in the web UI without getting a
`401 Unauthorized` message, then it's possible Gitaly is failing to authenticate
with the other nodes due to having the [wrong secrets file](#3-gitaly-server-configuration).

Confirm the following are all true:

- When any user performs a `git push` to any repository on this Gitaly node, it
  fails with the following error (note the `401 Unauthorized`):

  ```plaintext
  remote: GitLab: 401 Unauthorized
  To <REMOTE_URL>
  ! [remote rejected] branch-name -> branch-name (pre-receive hook declined)
  error: failed to push some refs to '<REMOTE_URL>'
  ```

- When any user adds or modifies a file from the repository using the GitLab
  UI, it immediately fails with a red `401 Unauthorized` banner.
- Creating a new project and [initializing it with a README](../../gitlab-basics/create-project.md#blank-projects)
  successfully creates the project but doesn't create the README.
- When [tailing the logs](https://docs.gitlab.com/omnibus/settings/logs.html#tail-logs-in-a-console-on-the-server)
  on an app node and reproducing the error, you get `401` errors
  when reaching the `/api/v4/internal/allowed` endpoint:

```shell
2019-08-01 05:27:04 -04:00
# api_json.log
{
"time": "2019-07-18T00:30:14.967Z",
"severity": "INFO",
"duration": 0.57,
"db": 0,
"view": 0.57,
"status": 401,
"method": "POST",
"path": "\/api\/v4\/internal\/allowed",
"params": [
{
"key": "action",
"value": "git-receive-pack"
},
{
"key": "changes",
"value": "REDACTED"
},
{
"key": "gl_repository",
"value": "REDACTED"
},
{
"key": "project",
"value": "\/path\/to\/project.git"
},
{
"key": "protocol",
"value": "web"
},
{
"key": "env",
"value": "{\"GIT_ALTERNATE_OBJECT_DIRECTORIES\":[],\"GIT_ALTERNATE_OBJECT_DIRECTORIES_RELATIVE\":[],\"GIT_OBJECT_DIRECTORY\":null,\"GIT_OBJECT_DIRECTORY_RELATIVE\":null}"
},
{
"key": "user_id",
"value": "2"
},
{
"key": "secret_token",
"value": "[FILTERED]"
}
],
"host": "gitlab.example.com",
"ip": "REDACTED",
"ua": "Ruby",
"route": "\/api\/:version\/internal\/allowed",
"queue_duration": 4.24,
"gitaly_calls": 0,
"gitaly_duration": 0,
"correlation_id": "XPUZqTukaP3"
}
# nginx_access.log
[IP] - - [18/Jul/2019:00:30:14 +0000] "POST /api/v4/internal/allowed HTTP/1.1" 401 30 "" "Ruby"
```

To fix this problem, confirm that your [`gitlab-secrets.json` file](#3-gitaly-server-configuration)
on the Gitaly node matches the one on all other nodes. If it doesn't match,
update the secrets file on the Gitaly node to match the others, then
[reconfigure the node](../restart_gitlab.md#omnibus-gitlab-reconfigure).

### Command line tools cannot connect to Gitaly

If you are having trouble connecting to a Gitaly node with command line (CLI) tools, and certain actions result in a `14: Connect Failed` error message, it means that gRPC cannot reach your Gitaly node.

Verify that you can reach Gitaly via TCP:

```shell
sudo gitlab-rake gitlab:tcp_check[GITALY_SERVER_IP,GITALY_LISTEN_PORT]
```

If the TCP connection fails, check your network settings and your firewall rules. If the TCP connection succeeds, your networking and firewall rules are correct.

If you use proxy servers in your command line environment, such as Bash, these can interfere with your gRPC traffic.

If you use Bash or a compatible command line environment, run the following commands to determine whether you have proxy servers configured:

```shell
echo $http_proxy
echo $https_proxy
```

If either of these variables has a value, your Gitaly CLI connections may be getting routed through a proxy which cannot connect to Gitaly.

To remove the proxy setting, run the following commands (depending on which variables had values):

```shell
unset http_proxy
unset https_proxy
```

### Gitaly not listening on new address after reconfiguring

When updating the `gitaly['listen_addr']` or `gitaly['prometheus_listen_addr']` values, Gitaly may continue to listen on the old address after a `sudo gitlab-ctl reconfigure`.

When this occurs, performing a `sudo gitlab-ctl restart` will resolve the issue. This will no longer be necessary after [this issue](https://gitlab.com/gitlab-org/gitaly/issues/2521) is resolved.
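
To verify which address Gitaly is actually bound to, you can use a tool such as `ss` (a sketch; assumes an Omnibus install where the process is named `gitaly`):

```shell
# List listening TCP sockets and filter for the Gitaly process.
sudo ss -tlnp | grep gitaly
```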
### Praefect

Praefect is an experimental daemon that allows for replication of the Git data.
It can be set up with Omnibus, [as explained here](./praefect.md).