---
stage: Verify
group: Pipeline Authoring
info: To determine the technical writer assigned to the Stage/Group associated with this page, see https://about.gitlab.com/handbook/engineering/ux/technical-writing/#assignments
type: reference
---
# Multi-project pipelines **(FREE)**
> - Moved to GitLab Free in 12.8.
You can set up GitLab CI/CD across multiple projects, so that a pipeline in one project can trigger a pipeline in another project. You can visualize the entire pipeline in one place, including all cross-project interdependencies.
For example, you might deploy your web application from three different projects in GitLab. Each project has its own build, test, and deploy process. With multi-project pipelines you can visualize the entire pipeline, including all build and test stages for all three projects.
For an overview, see the Multi-project pipelines demo.
Multi-project pipelines are also useful for larger products that require cross-project interdependencies, like those with a microservices architecture. Learn more in the Cross-project Pipeline Triggering and Visualization demo at GitLab@learn, in the Continuous Integration section.
If you trigger a pipeline in a downstream private project, on the upstream project's pipelines page, you can view:
- The name of the project.
- The status of the pipeline.
If you have a public project that can trigger downstream pipelines in a private project, make sure there are no confidentiality problems.
## Create multi-project pipelines
To create multi-project pipelines, you can:

- Define them in your `.gitlab-ci.yml` file.
- Use the API.
## Define multi-project pipelines in your `.gitlab-ci.yml` file
> - Moved to GitLab Free in 12.8.
When you create a multi-project pipeline in your `.gitlab-ci.yml` file,
you create what is called a trigger job. For example:
```yaml
rspec:
  stage: test
  script: bundle exec rspec

staging:
  variables:
    ENVIRONMENT: staging
  stage: deploy
  trigger: my/deployment
```
In this example, after the `rspec` job succeeds in the `test` stage,
the `staging` trigger job starts. The initial status of this
job is `pending`. GitLab then creates a downstream pipeline in the
`my/deployment` project and, as soon as the pipeline is created, the
`staging` job succeeds. The full path to the project is `my/deployment`.
You can view the status for the pipeline, or you can display the downstream pipeline's status instead.
The user that creates the upstream pipeline must be able to create pipelines in the
downstream project (`my/deployment`) too. If the downstream project is not found,
or the user does not have permission to create a pipeline there,
the `staging` job is marked as failed.
### Trigger job configuration keywords
Trigger jobs can use only a limited set of the GitLab CI/CD configuration keywords. The keywords available for use in trigger jobs are:
- `trigger`
- `stage`
- `allow_failure`
- `rules`
- `only` and `except`
- `when` (only with a value of `on_success`, `on_failure`, or `always`)
- `extends`
- `needs`
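
For example, a trigger job can combine several of these keywords. The following is a minimal sketch (the `deploy-review` job name is illustrative; `my/deployment` reuses the project path from the earlier example):

```yaml
deploy-review:
  stage: deploy
  # Start the downstream pipeline only for merge request pipelines.
  rules:
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"
      when: on_success
  # Don't fail the upstream pipeline if the downstream pipeline cannot be created.
  allow_failure: true
  trigger: my/deployment
```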
### Specify a downstream pipeline branch
You can specify a branch name for the downstream pipeline to use. GitLab uses the commit on the head of the branch to create the downstream pipeline.
```yaml
rspec:
  stage: test
  script: bundle exec rspec

staging:
  stage: deploy
  trigger:
    project: my/deployment
    branch: stable-11-2
```
Use:

- The `project` keyword to specify the full path to a downstream project.
- The `branch` keyword to specify the name of a branch in the project specified by `project`.
  In GitLab 12.4 and later, variable expansion is supported.
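
Because variable expansion is supported, the branch name can also come from a CI/CD variable. A minimal sketch, assuming a `DOWNSTREAM_BRANCH` variable is defined elsewhere (for example, in the project's CI/CD settings):

```yaml
staging:
  stage: deploy
  trigger:
    project: my/deployment
    # DOWNSTREAM_BRANCH is a hypothetical variable defined outside this snippet.
    branch: $DOWNSTREAM_BRANCH
```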
Pipelines triggered on a protected branch in a downstream project use the role of the user that ran the trigger job in the upstream project. If the user does not have permission to run CI/CD pipelines against the protected branch, the pipeline fails. See pipeline security for protected branches.
### Pass CI/CD variables to a downstream pipeline by using the `variables` keyword
Sometimes you might want to pass CI/CD variables to a downstream pipeline.
You can do that by using the `variables` keyword, just like you would for any other job.
```yaml
rspec:
  stage: test
  script: bundle exec rspec

staging:
  variables:
    ENVIRONMENT: staging
  stage: deploy
  trigger: my/deployment
```
The `ENVIRONMENT` variable is passed to every job defined in a downstream
pipeline. It is available as a variable when GitLab Runner picks a job.
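
For example, a job in the downstream project's `.gitlab-ci.yml` file could use the variable directly. This is only an illustrative sketch of the downstream side; the `deploy` job name is not part of the example above:

```yaml
# .gitlab-ci.yml in the downstream my/deployment project
deploy:
  stage: deploy
  script:
    # ENVIRONMENT is passed in by the upstream trigger job.
    - echo "Deploying to $ENVIRONMENT"
```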
In the following configuration, the `MY_VARIABLE` variable is passed to the downstream pipeline
that is created when the `trigger-downstream` job is queued. This happens because the `trigger-downstream`
job inherits variables declared in the global `variables` block, and then passes those variables to the downstream pipeline.
```yaml
variables:
  MY_VARIABLE: my-value

trigger-downstream:
  variables:
    ENVIRONMENT: something
  trigger: my/project
```
You can stop global variables from reaching the downstream pipeline by using the `inherit` keyword.
In this example, the `MY_GLOBAL_VAR` variable is not available in the triggered pipeline:
```yaml
variables:
  MY_GLOBAL_VAR: value

trigger-downstream:
  inherit:
    variables: false
  variables:
    MY_LOCAL_VAR: value
  trigger: my/project
```
You might want to pass some information about the upstream pipeline by using, for example, predefined variables. To do that, you can use interpolation to pass any variable. For example:
```yaml
downstream-job:
  variables:
    UPSTREAM_BRANCH: $CI_COMMIT_REF_NAME
  trigger: my/project
```
In this scenario, the `UPSTREAM_BRANCH` variable with a value related to the
upstream pipeline is passed to the `downstream-job` job. It is available
in the context of all downstream builds.
Upstream pipelines take precedence over downstream ones. If there are two variables with the same name defined in both upstream and downstream projects, the ones defined in the upstream project take precedence.
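
For example, in the following sketch (job and variable names are illustrative), jobs in the triggered pipeline see `from-upstream`, even though the downstream project defines the same variable itself:

```yaml
# Upstream project: the trigger job passes SHARED_VAR to the downstream pipeline.
trigger-downstream:
  variables:
    SHARED_VAR: "from-upstream"
  trigger: my/project

# Downstream project (my/project): SHARED_VAR is also defined here, but jobs in
# the triggered pipeline receive the upstream value, "from-upstream".
variables:
  SHARED_VAR: "from-downstream"
```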
### Pass CI/CD variables to a downstream pipeline by using variable inheritance
You can pass variables to a downstream pipeline with `dotenv` variable inheritance and cross-project artifact downloads.
In the upstream pipeline:
1. Save the variables in a `.env` file.
1. Save the `.env` file as a `dotenv` report.
1. Trigger the downstream pipeline.

   ```yaml
   build_vars:
     stage: build
     script:
       - echo "BUILD_VERSION=hello" >> build.env
     artifacts:
       reports:
         dotenv: build.env

   deploy:
     stage: deploy
     trigger: my/downstream_project
   ```
1. Set the `test` job in the downstream pipeline to inherit the variables from the `build_vars`
   job in the upstream project with `needs`. The `test` job inherits the variables in the
   `dotenv` report and it can access `BUILD_VERSION` in the script:

   ```yaml
   test:
     stage: test
     script:
       - echo $BUILD_VERSION
     needs:
       - project: my/upstream_project
         job: build_vars
         ref: master
         artifacts: true
   ```
### Use `rules` or `only`/`except` with multi-project pipelines
You can use CI/CD variables or the `rules` keyword to
control job behavior for multi-project pipelines. When a
downstream pipeline is triggered with the `trigger` keyword,
the value of the `$CI_PIPELINE_SOURCE` predefined variable
is `pipeline` for all its jobs.
If you use `only/except` to control job behavior, use the
`pipelines` keyword.
### Mirror status of a triggered pipeline in the trigger job
> - Introduced in GitLab Premium 12.3.
> - Moved to GitLab Free in 12.8.
You can mirror the pipeline status from the triggered pipeline to the source
trigger job by using `strategy: depend`. For example:
```yaml
trigger_job:
  trigger:
    project: my/project
    strategy: depend
```
### Mirror status from upstream pipeline
You can mirror the pipeline status from an upstream pipeline to a bridge job by
using the `needs:pipeline` keyword. The latest pipeline status from the default branch is
replicated to the bridge job.

For example:
```yaml
upstream_bridge:
  stage: test
  needs:
    pipeline: other/project
```
## Create multi-project pipelines by using the API
> - Moved to GitLab Free in 12.4.
When you use the `CI_JOB_TOKEN` to trigger pipelines,
GitLab recognizes the source of the job token. The pipelines become related,
so you can visualize their relationships on pipeline graphs.

These relationships are displayed in the pipeline graph by showing inbound and outbound connections for upstream and downstream pipeline dependencies.
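
For example, a job in the upstream project can call the pipeline trigger API and authenticate with `CI_JOB_TOKEN`. This is a sketch; the project ID `123`, the `main` ref, and the `gitlab.example.com` host are placeholders:

```yaml
trigger-downstream-api:
  stage: deploy
  script:
    # Trigger a pipeline in project 123 on its main branch. Because the request
    # authenticates with the job token, GitLab links the two pipelines together.
    - curl --request POST --form "token=$CI_JOB_TOKEN" --form "ref=main" "https://gitlab.example.com/api/v4/projects/123/trigger/pipeline"
```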
When using:

- CI/CD variables or `rules` to control job behavior, the value of the `$CI_PIPELINE_SOURCE`
  predefined variable is `pipeline` for multi-project pipelines triggered through the API with `CI_JOB_TOKEN`.
- `only/except` to control job behavior, use the `pipelines` keyword.
## Trigger a pipeline when an upstream project is rebuilt **(PREMIUM)**
> - Introduced in GitLab Premium 12.8.
You can trigger a pipeline in your project whenever a pipeline finishes for a new tag in a different project.
Prerequisites:
- The upstream project must be public.
- The user must have the Developer role in the upstream project.
To trigger the pipeline when the upstream project is rebuilt:
1. On the top bar, select **Menu > Projects** and find your project.
1. On the left sidebar, select **Settings > CI/CD**.
1. Expand **Pipeline subscriptions**.
1. Enter the project you want to subscribe to, in the format `<namespace>/<project>`.
   For example, if the project is `https://gitlab.com/gitlab-org/gitlab`, use `gitlab-org/gitlab`.
1. Select **Subscribe**.
Any pipelines that complete successfully for new tags in the subscribed project now trigger a pipeline on the current project's default branch. The maximum number of upstream pipeline subscriptions is 2 by default, for both the upstream and downstream projects. On self-managed instances, an administrator can change this limit.
## Multi-project pipeline visualization **(PREMIUM)**
When you configure GitLab CI/CD for your project, you can visualize the stages of your jobs on a pipeline graph.
In the merge request, on the Pipelines tab, multi-project pipeline mini-graphs are displayed. They expand and are shown adjacent to each other when hovering (or tapping on touchscreen devices).