Merge branch 'refactor-builds-creation-service' into 'master'

Refactor pipeline creation service

## What does this MR do?
This refactors GitLab CI build processing: all builds for a pipeline are pre-created when the pipeline object is created.
The builds are created with a newly introduced status, `created`.
Builds are then automatically promoted to `pending` when the previous stage succeeds.
This significantly simplifies the pipeline processing code and solves many problems caused by the lazy, on-demand build creation of the previous approach.
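Since the key change is easier to see in code, here is a minimal, plain-Ruby sketch (not the actual GitLab classes; `Build`, `statuses_allowing`, and `process_stage!` are illustrative names) of how a pre-created build is either promoted to `pending` or skipped, modeled on `ProcessPipelineService#process_build` in the diff below:

```ruby
# Illustrative sketch of the new lifecycle: every build starts in the
# `created` status and is promoted stage by stage.
Build = Struct.new(:name, :stage_idx, :when_cond, :status)

def statuses_allowing(when_cond)
  case when_cond
  when 'on_success' then %w[success]
  when 'on_failure' then %w[failed]
  when 'always'     then %w[success failed]
  else []
  end
end

# Promote or skip all `created` builds of one stage based on the combined
# status of the prior stages.
def process_stage!(builds, prior_status)
  builds.each do |build|
    next unless build.status == 'created'

    if statuses_allowing(build.when_cond).include?(prior_status)
      build.status = 'pending'   # from here a runner can pick it up
    else
      build.status = 'skipped'
    end
  end
end

builds = [Build.new('rspec',   1, 'on_success', 'created'),
          Build.new('cleanup', 2, 'always',     'created')]

process_stage!(builds.select { |b| b.stage_idx == 1 }, 'success')
p builds.map(&:status)  # => ["pending", "created"]
```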

## Why was this MR needed?
The previous mechanism had a lot of flaws (shown in the related issues), both in how it worked and in its code design, including cross model/service/library dependencies that this MR removes.

The current approach moves build creation to a single place, `CreatePipelineService`, and removes the dynamic dependency on `config_processor`, significantly simplifying build creation and pipeline processing. Pipeline processing itself is implemented in `ProcessPipelineService`.
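For reference, the call sites in this MR now look like this (taken from the pipelines controller and the push services in the diff below; `project`, `current_user`, and the params hashes come from the surrounding controller/service context):

```ruby
# Manual pipeline creation from the UI (Projects::PipelinesController#create):
@pipeline = Ci::CreatePipelineService.
  new(project, current_user, create_params).
  execute(ignore_skip_ci: true, save_on_errors: false)

# Pipeline creation on push (GitPushService / GitTagPushService):
Ci::CreatePipelineService.new(project, current_user, build_push_data).execute

# On success the service saves the pipeline with all builds in the `created`
# status and calls pipeline.process!, which delegates to
# Ci::ProcessPipelineService to enqueue the first stage.
```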

This also makes it easy to extend GitLab with Manual Actions, which is part of the 8.10 direction issue.

## Migration problem
~~This MR removes the on-demand creation of builds in pipelines.
Pipelines that are running and are mid-stage (some stages started, but not all) will not be fully evaluated after an application restart,
because the code responsible for on-demand creation is removed.
There's no easy way to migrate existing pipelines other than doing an offline migration and putting pipeline processing in the migration code (which seems like a really bad idea).~~

To support old pipelines, I added lazy initialization of builds when none are found.
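Concretely, this compatibility path lives in `Ci::ProcessPipelineService#execute` (see the diff below): if a pipeline has no builds in the `created` state, the service first materializes them before processing stages. A condensed sketch:

```ruby
# Condensed from Ci::ProcessPipelineService in this MR: pipelines created
# before this change have no `created` builds yet, so they are created
# lazily on the first processing run.
def execute(pipeline)
  @pipeline = pipeline

  # Backwards compatibility: ensure the pipeline has builds for all stages.
  create_builds! if created_builds.empty?

  # Returns true if any builds were enqueued in this run.
  stage_indexes_of_created_builds.map { |index| process_stage(index) }.
    flatten.any?
end
```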

## What are the relevant issue numbers?
Fixes: https://gitlab.com/gitlab-org/gitlab-ce/issues/12839
Solves: https://gitlab.com/gitlab-org/gitlab-ce/issues/18644 https://gitlab.com/gitlab-org/gitlab-ci-multi-runner/issues/289
Allows us to easily implement: https://gitlab.com/gitlab-org/gitlab-ce/issues/17010

## Does this MR meet the acceptance criteria?

- [x] [CHANGELOG](https://gitlab.com/gitlab-org/gitlab-ce/blob/master/CHANGELOG) entry added
- [x] [Documentation created/updated](https://gitlab.com/gitlab-org/gitlab-ce/blob/master/doc/development/doc_styleguide.md)
- [x] API support added
- Tests
  - [x] Added for this feature/bug
  - [ ] All builds are passing
- [x] Conforms to the [style guides](https://gitlab.com/gitlab-org/gitlab-ce/blob/master/CONTRIBUTING.md#style-guides)
- [x] Branch has no merge conflicts with `master` (if it does, please rebase)
- [ ] [Squashed related commits together](https://git-scm.com/book/en/Git-Tools-Rewriting-History#Squashing-Commits)


See merge request !5295
Rémy Coutable 2016-08-11 14:27:42 +00:00
commit 7e9b41896d
47 changed files with 1049 additions and 973 deletions


@ -20,6 +20,7 @@ v 8.11.0 (unreleased)
- Cache the commit author in RequestStore to avoid extra lookups in PostReceive
- Expand commit message width in repo view (ClemMakesApps)
- Cache highlighted diff lines for merge requests
- Pre-create all builds for a Pipeline when the new Pipeline is created !5295
- Fix of 'Commits being passed to custom hooks are already reachable when using the UI'
- Fix awardable button mutuality loading spinners (ClemMakesApps)
- Add support for using RequestStore within Sidekiq tasks via SIDEKIQ_REQUEST_STORE env variable


@ -6,7 +6,7 @@ class Projects::BuildsController < Projects::ApplicationController
def index
@scope = params[:scope]
@all_builds = project.builds
@all_builds = project.builds.relevant
@builds = @all_builds.order('created_at DESC')
@builds =
case @scope


@ -134,8 +134,8 @@ class Projects::CommitController < Projects::ApplicationController
end
def define_status_vars
@statuses = CommitStatus.where(pipeline: pipelines)
@builds = Ci::Build.where(pipeline: pipelines)
@statuses = CommitStatus.where(pipeline: pipelines).relevant
@builds = Ci::Build.where(pipeline: pipelines).relevant
end
def assign_change_commit_vars(mr_source_branch)


@ -160,7 +160,7 @@ class Projects::MergeRequestsController < Projects::ApplicationController
@diff_notes_disabled = true
@pipeline = @merge_request.pipeline
@statuses = @pipeline.statuses if @pipeline
@statuses = @pipeline.statuses.relevant if @pipeline
@note_counts = Note.where(commit_id: @commits.map(&:id)).
group(:commit_id).count
@ -362,7 +362,7 @@ class Projects::MergeRequestsController < Projects::ApplicationController
@commits_count = @merge_request.commits.count
@pipeline = @merge_request.pipeline
@statuses = @pipeline.statuses if @pipeline
@statuses = @pipeline.statuses.relevant if @pipeline
if @merge_request.locked_long_ago?
@merge_request.unlock_mr


@ -19,7 +19,7 @@ class Projects::PipelinesController < Projects::ApplicationController
end
def create
@pipeline = Ci::CreatePipelineService.new(project, current_user, create_params).execute
@pipeline = Ci::CreatePipelineService.new(project, current_user, create_params).execute(ignore_skip_ci: true, save_on_errors: false)
unless @pipeline.persisted?
render 'new'
return


@ -16,7 +16,7 @@ module Ci
scope :with_artifacts_not_expired, ->() { with_artifacts.where('artifacts_expire_at IS NULL OR artifacts_expire_at > ?', Time.now) }
scope :with_expired_artifacts, ->() { with_artifacts.where('artifacts_expire_at < ?', Time.now) }
scope :last_month, ->() { where('created_at > ?', Date.today - 1.month) }
scope :manual_actions, ->() { where(when: :manual) }
scope :manual_actions, ->() { where(when: :manual).relevant }
mount_uploader :artifacts_file, ArtifactUploader
mount_uploader :artifacts_metadata, ArtifactUploader
@ -65,17 +65,11 @@ module Ci
end
end
state_machine :status, initial: :pending do
state_machine :status do
after_transition pending: :running do |build|
build.execute_hooks
end
# We use around_transition to create builds for next stage as soon as possible, before the `after_*` is executed
around_transition any => [:success, :failed, :canceled] do |build, block|
block.call
build.pipeline.create_next_builds(build) if build.pipeline
end
after_transition any => [:success, :failed, :canceled] do |build|
build.update_coverage
build.execute_hooks
@ -461,7 +455,7 @@ module Ci
def build_attributes_from_config
return {} unless pipeline.config_processor
pipeline.config_processor.build_attributes(name)
end
end


@ -13,11 +13,10 @@ module Ci
has_many :trigger_requests, dependent: :destroy, class_name: 'Ci::TriggerRequest', foreign_key: :commit_id
validates_presence_of :sha
validates_presence_of :ref
validates_presence_of :status
validate :valid_commit_sha
# Invalidate object and save if when touched
after_touch :update_state
after_save :keep_around_commits
# ref can't be HEAD or SHA, can only be branch/tag name
@ -90,12 +89,16 @@ module Ci
def cancel_running
builds.running_or_pending.each(&:cancel)
reload_status!
end
def retry_failed(user)
builds.latest.failed.select(&:retryable?).each do |build|
Ci::Build.retry(build, user)
end
reload_status!
end
def latest?
@ -109,37 +112,6 @@ module Ci
trigger_requests.any?
end
def create_builds(user, trigger_request = nil)
##
# We persist pipeline only if there are builds available
#
return unless config_processor
build_builds_for_stages(config_processor.stages, user,
'success', trigger_request) && save
end
def create_next_builds(build)
return unless config_processor
# don't create other builds if this one is retried
latest_builds = builds.latest
return unless latest_builds.exists?(build.id)
# get list of stages after this build
next_stages = config_processor.stages.drop_while { |stage| stage != build.stage }
next_stages.delete(build.stage)
# get status for all prior builds
prior_builds = latest_builds.where.not(stage: next_stages)
prior_status = prior_builds.status
# build builds for next stage that has builds available
# and save pipeline if we have builds
build_builds_for_stages(next_stages, build.user, prior_status,
build.trigger_request) && save
end
def retried
@retried ||= (statuses.order(id: :desc) - statuses.latest)
end
@ -151,6 +123,14 @@ module Ci
end
end
def config_builds_attributes
return [] unless config_processor
config_processor.
builds_for_ref(ref, tag?, trigger_requests.first).
sort_by { |build| build[:stage_idx] }
end
def has_warnings?
builds.latest.ignored.any?
end
@ -182,10 +162,6 @@ module Ci
end
end
def skip_ci?
git_commit_message =~ /\[(ci skip|skip ci)\]/i if git_commit_message
end
def environments
builds.where.not(environment: nil).success.pluck(:environment).uniq
end
@ -207,39 +183,33 @@ module Ci
Note.for_commit_id(sha)
end
def process!
Ci::ProcessPipelineService.new(project, user).execute(self)
reload_status!
end
def predefined_variables
[
{ key: 'CI_PIPELINE_ID', value: id.to_s, public: true }
]
end
private
def build_builds_for_stages(stages, user, status, trigger_request)
##
# Note that `Array#any?` implements a short circuit evaluation, so we
# build builds only for the first stage that has builds available.
#
stages.any? do |stage|
CreateBuildsService.new(self).
execute(stage, user, status, trigger_request).
any?(&:active?)
end
end
def update_state
def reload_status!
statuses.reload
self.status = if yaml_errors.blank?
statuses.latest.status || 'skipped'
else
'failed'
end
self.status =
if yaml_errors.blank?
statuses.latest.status || 'skipped'
else
'failed'
end
self.started_at = statuses.started_at
self.finished_at = statuses.finished_at
self.duration = statuses.latest.duration
save
end
private
def keep_around_commits
return unless project


@ -5,7 +5,7 @@ class CommitStatus < ActiveRecord::Base
self.table_name = 'ci_builds'
belongs_to :project, class_name: '::Project', foreign_key: :gl_project_id
belongs_to :pipeline, class_name: 'Ci::Pipeline', foreign_key: :commit_id, touch: true
belongs_to :pipeline, class_name: 'Ci::Pipeline', foreign_key: :commit_id
belongs_to :user
delegate :commit, to: :pipeline
@ -25,28 +25,36 @@ class CommitStatus < ActiveRecord::Base
scope :ordered, -> { order(:name) }
scope :ignored, -> { where(allow_failure: true, status: [:failed, :canceled]) }
state_machine :status, initial: :pending do
state_machine :status do
event :queue do
transition skipped: :pending
transition [:created, :skipped] => :pending
end
event :run do
transition pending: :running
end
event :skip do
transition [:created, :pending] => :skipped
end
event :drop do
transition [:pending, :running] => :failed
transition [:created, :pending, :running] => :failed
end
event :success do
transition [:pending, :running] => :success
transition [:created, :pending, :running] => :success
end
event :cancel do
transition [:pending, :running] => :canceled
transition [:created, :pending, :running] => :canceled
end
after_transition pending: :running do |commit_status|
after_transition created: [:pending, :running] do |commit_status|
commit_status.update_attributes queued_at: Time.now
end
after_transition [:created, :pending] => :running do |commit_status|
commit_status.update_attributes started_at: Time.now
end
@ -54,13 +62,20 @@ class CommitStatus < ActiveRecord::Base
commit_status.update_attributes finished_at: Time.now
end
after_transition [:pending, :running] => :success do |commit_status|
after_transition [:created, :pending, :running] => :success do |commit_status|
MergeRequests::MergeWhenBuildSucceedsService.new(commit_status.pipeline.project, nil).trigger(commit_status)
end
after_transition any => :failed do |commit_status|
MergeRequests::AddTodoWhenBuildFailsService.new(commit_status.pipeline.project, nil).execute(commit_status)
end
# We use around_transition to process pipeline on next stages as soon as possible, before the `after_*` is executed
around_transition any => [:success, :failed, :canceled] do |commit_status, block|
block.call
commit_status.pipeline.process! if commit_status.pipeline
end
end
delegate :sha, :short_sha, to: :pipeline


@ -1,18 +1,22 @@
module Statuseable
extend ActiveSupport::Concern
AVAILABLE_STATUSES = %w(pending running success failed canceled skipped)
AVAILABLE_STATUSES = %w[created pending running success failed canceled skipped]
STARTED_STATUSES = %w[running success failed skipped]
ACTIVE_STATUSES = %w[pending running]
COMPLETED_STATUSES = %w[success failed canceled]
class_methods do
def status_sql
builds = all.select('count(*)').to_sql
success = all.success.select('count(*)').to_sql
ignored = all.ignored.select('count(*)').to_sql if all.respond_to?(:ignored)
scope = all.relevant
builds = scope.select('count(*)').to_sql
success = scope.success.select('count(*)').to_sql
ignored = scope.ignored.select('count(*)').to_sql if scope.respond_to?(:ignored)
ignored ||= '0'
pending = all.pending.select('count(*)').to_sql
running = all.running.select('count(*)').to_sql
canceled = all.canceled.select('count(*)').to_sql
skipped = all.skipped.select('count(*)').to_sql
pending = scope.pending.select('count(*)').to_sql
running = scope.running.select('count(*)').to_sql
canceled = scope.canceled.select('count(*)').to_sql
skipped = scope.skipped.select('count(*)').to_sql
deduce_status = "(CASE
WHEN (#{builds})=0 THEN NULL
@ -48,7 +52,8 @@ module Statuseable
included do
validates :status, inclusion: { in: AVAILABLE_STATUSES }
state_machine :status, initial: :pending do
state_machine :status, initial: :created do
state :created, value: 'created'
state :pending, value: 'pending'
state :running, value: 'running'
state :failed, value: 'failed'
@ -57,6 +62,8 @@ module Statuseable
state :skipped, value: 'skipped'
end
scope :created, -> { where(status: 'created') }
scope :relevant, -> { where.not(status: 'created') }
scope :running, -> { where(status: 'running') }
scope :pending, -> { where(status: 'pending') }
scope :success, -> { where(status: 'success') }
@ -68,14 +75,14 @@ module Statuseable
end
def started?
!pending? && !canceled? && started_at
STARTED_STATUSES.include?(status) && started_at
end
def active?
running? || pending?
ACTIVE_STATUSES.include?(status)
end
def complete?
canceled? || success? || failed?
COMPLETED_STATUSES.include?(status)
end
end


@ -1,62 +0,0 @@
module Ci
class CreateBuildsService
def initialize(pipeline)
@pipeline = pipeline
@config = pipeline.config_processor
end
def execute(stage, user, status, trigger_request = nil)
builds_attrs = @config.builds_for_stage_and_ref(stage, @pipeline.ref, @pipeline.tag, trigger_request)
# check when to create next build
builds_attrs = builds_attrs.select do |build_attrs|
case build_attrs[:when]
when 'on_success'
status == 'success'
when 'on_failure'
status == 'failed'
when 'always', 'manual'
%w(success failed).include?(status)
end
end
# don't create the same build twice
builds_attrs.reject! do |build_attrs|
@pipeline.builds.find_by(ref: @pipeline.ref,
tag: @pipeline.tag,
trigger_request: trigger_request,
name: build_attrs[:name])
end
builds_attrs.map do |build_attrs|
build_attrs.slice!(:name,
:commands,
:tag_list,
:options,
:allow_failure,
:stage,
:stage_idx,
:environment,
:when,
:yaml_variables)
build_attrs.merge!(pipeline: @pipeline,
ref: @pipeline.ref,
tag: @pipeline.tag,
trigger_request: trigger_request,
user: user,
project: @pipeline.project)
# TODO: The proper implementation for this is in
# https://gitlab.com/gitlab-org/gitlab-ce/merge_requests/5295
build_attrs[:status] = 'skipped' if build_attrs[:when] == 'manual'
##
# We do not persist new builds here.
# Those will be persisted when @pipeline is saved.
#
@pipeline.builds.new(build_attrs)
end
end
end
end


@ -0,0 +1,42 @@
module Ci
class CreatePipelineBuildsService < BaseService
attr_reader :pipeline
def execute(pipeline)
@pipeline = pipeline
new_builds.map do |build_attributes|
create_build(build_attributes)
end
end
private
def create_build(build_attributes)
build_attributes = build_attributes.merge(
pipeline: pipeline,
project: pipeline.project,
ref: pipeline.ref,
tag: pipeline.tag,
user: current_user,
trigger_request: trigger_request
)
pipeline.builds.create(build_attributes)
end
def new_builds
@new_builds ||= pipeline.config_builds_attributes.
reject { |build| existing_build_names.include?(build[:name]) }
end
def existing_build_names
@existing_build_names ||= pipeline.builds.pluck(:name)
end
def trigger_request
return @trigger_request if defined?(@trigger_request)
@trigger_request ||= pipeline.trigger_requests.first
end
end
end


@ -1,49 +1,100 @@
module Ci
class CreatePipelineService < BaseService
def execute
pipeline = project.pipelines.new(params)
pipeline.user = current_user
attr_reader :pipeline
unless ref_names.include?(params[:ref])
pipeline.errors.add(:base, 'Reference not found')
return pipeline
def execute(ignore_skip_ci: false, save_on_errors: true, trigger_request: nil)
@pipeline = Ci::Pipeline.new(
project: project,
ref: ref,
sha: sha,
before_sha: before_sha,
tag: tag?,
trigger_requests: Array(trigger_request),
user: current_user
)
unless project.builds_enabled?
return error('Pipeline is disabled')
end
if commit
pipeline.sha = commit.id
else
pipeline.errors.add(:base, 'Commit not found')
return pipeline
unless trigger_request || can?(current_user, :create_pipeline, project)
return error('Insufficient permissions to create a new pipeline')
end
unless can?(current_user, :create_pipeline, project)
pipeline.errors.add(:base, 'Insufficient permissions to create a new pipeline')
return pipeline
unless branch? || tag?
return error('Reference not found')
end
unless commit
return error('Commit not found')
end
unless pipeline.config_processor
pipeline.errors.add(:base, pipeline.yaml_errors || 'Missing .gitlab-ci.yml file')
return pipeline
unless pipeline.ci_yaml_file
return error('Missing .gitlab-ci.yml file')
end
return error(pipeline.yaml_errors, save: save_on_errors)
end
pipeline.save!
if !ignore_skip_ci && skip_ci?
return error('Creation of pipeline is skipped', save: save_on_errors)
end
unless pipeline.create_builds(current_user)
pipeline.errors.add(:base, 'No builds for this pipeline.')
unless pipeline.config_builds_attributes.present?
return error('No builds for this pipeline.')
end
pipeline.save
pipeline.process!
pipeline
end
private
def ref_names
@ref_names ||= project.repository.ref_names
def skip_ci?
pipeline.git_commit_message =~ /\[(ci skip|skip ci)\]/i if pipeline.git_commit_message
end
def commit
@commit ||= project.commit(params[:ref])
@commit ||= project.commit(origin_sha || origin_ref)
end
def sha
commit.try(:id)
end
def before_sha
params[:checkout_sha] || params[:before] || Gitlab::Git::BLANK_SHA
end
def origin_sha
params[:checkout_sha] || params[:after]
end
def origin_ref
params[:ref]
end
def branch?
project.repository.ref_exists?(Gitlab::Git::BRANCH_REF_PREFIX + ref)
end
def tag?
project.repository.ref_exists?(Gitlab::Git::TAG_REF_PREFIX + ref)
end
def ref
Gitlab::Git.ref_name(origin_ref)
end
def valid_sha?
origin_sha && origin_sha != Gitlab::Git::BLANK_SHA
end
def error(message, save: false)
pipeline.errors.add(:base, message)
pipeline.reload_status! if save
pipeline
end
end
end


@ -1,20 +1,11 @@
module Ci
class CreateTriggerRequestService
def execute(project, trigger, ref, variables = nil)
commit = project.commit(ref)
return unless commit
trigger_request = trigger.trigger_requests.create(variables: variables)
# check if ref is tag
tag = project.repository.find_tag(ref).present?
pipeline = project.pipelines.create(sha: commit.sha, ref: ref, tag: tag)
trigger_request = trigger.trigger_requests.create!(
variables: variables,
pipeline: pipeline,
)
if pipeline.create_builds(nil, trigger_request)
pipeline = Ci::CreatePipelineService.new(project, nil, ref: ref).
execute(ignore_skip_ci: true, trigger_request: trigger_request)
if pipeline.persisted?
trigger_request
end
end


@ -0,0 +1,77 @@
module Ci
class ProcessPipelineService < BaseService
attr_reader :pipeline
def execute(pipeline)
@pipeline = pipeline
# This method will ensure that our pipeline does have all builds for all stages created
if created_builds.empty?
create_builds!
end
new_builds =
stage_indexes_of_created_builds.map do |index|
process_stage(index)
end
# Return a flag if a when builds got enqueued
new_builds.flatten.any?
end
private
def create_builds!
Ci::CreatePipelineBuildsService.new(project, current_user).execute(pipeline)
end
def process_stage(index)
current_status = status_for_prior_stages(index)
created_builds_in_stage(index).select do |build|
process_build(build, current_status)
end
end
def process_build(build, current_status)
return false unless Statuseable::COMPLETED_STATUSES.include?(current_status)
if valid_statuses_for_when(build.when).include?(current_status)
build.queue
true
else
build.skip
false
end
end
def valid_statuses_for_when(value)
case value
when 'on_success'
%w[success]
when 'on_failure'
%w[failed]
when 'always'
%w[success failed]
else
[]
end
end
def status_for_prior_stages(index)
pipeline.builds.where('stage_idx < ?', index).latest.status || 'success'
end
def stage_indexes_of_created_builds
created_builds.order(:stage_idx).pluck('distinct stage_idx')
end
def created_builds_in_stage(index)
created_builds.where(stage_idx: index)
end
def created_builds
pipeline.builds.created
end
end
end


@ -1,69 +0,0 @@
class CreateCommitBuildsService
def execute(project, user, params)
return unless project.builds_enabled?
before_sha = params[:checkout_sha] || params[:before]
sha = params[:checkout_sha] || params[:after]
origin_ref = params[:ref]
ref = Gitlab::Git.ref_name(origin_ref)
tag = Gitlab::Git.tag_ref?(origin_ref)
# Skip branch removal
if sha == Gitlab::Git::BLANK_SHA
return false
end
@pipeline = Ci::Pipeline.new(
project: project,
sha: sha,
ref: ref,
before_sha: before_sha,
tag: tag,
user: user)
##
# Skip creating pipeline if no gitlab-ci.yml is found
#
unless @pipeline.ci_yaml_file
return false
end
##
# Skip creating builds for commits that have [ci skip]
# but save pipeline object
#
if @pipeline.skip_ci?
return save_pipeline!
end
##
# Skip creating builds when CI config is invalid
# but save pipeline object
#
unless @pipeline.config_processor
return save_pipeline!
end
##
# Skip creating pipeline object if there are no builds for it.
#
unless @pipeline.create_builds(user)
@pipeline.errors.add(:base, 'No builds created')
return false
end
save_pipeline!
end
private
##
# Create a new pipeline and touch object to calculate status
#
def save_pipeline!
@pipeline.save!
@pipeline.touch
@pipeline
end
end


@ -69,7 +69,7 @@ class GitPushService < BaseService
SystemHooksService.new.execute_hooks(build_push_data_system_hook.dup, :push_hooks)
@project.execute_hooks(build_push_data.dup, :push_hooks)
@project.execute_services(build_push_data.dup, :push_hooks)
CreateCommitBuildsService.new.execute(@project, current_user, build_push_data)
Ci::CreatePipelineService.new(project, current_user, build_push_data).execute
ProjectCacheWorker.perform_async(@project.id)
end


@ -11,7 +11,7 @@ class GitTagPushService < BaseService
SystemHooksService.new.execute_hooks(build_system_push_data.dup, :tag_push_hooks)
project.execute_hooks(@push_data.dup, :tag_push_hooks)
project.execute_services(@push_data.dup, :tag_push_hooks)
CreateCommitBuildsService.new.execute(project, current_user, @push_data)
Ci::CreatePipelineService.new(project, current_user, @push_data).execute
ProjectCacheWorker.perform_async(project.id)
true


@ -33,7 +33,7 @@
Cant find HEAD commit for this branch
- stages_status = pipeline.statuses.latest.stages_status
- stages_status = pipeline.statuses.relevant.latest.stages_status
- stages.each do |stage|
%td.stage-cell
- status = stages_status[stage]


@ -46,5 +46,5 @@
- if pipeline.project.build_coverage_enabled?
%th Coverage
%th
- pipeline.statuses.stages.each do |stage|
= render 'projects/commit/ci_stage', stage: stage, statuses: pipeline.statuses.where(stage: stage)
- pipeline.statuses.relevant.stages.each do |stage|
= render 'projects/commit/ci_stage', stage: stage, statuses: pipeline.statuses.relevant.where(stage: stage)


@ -0,0 +1,9 @@
class AddQueuedAtToCiBuilds < ActiveRecord::Migration
include Gitlab::Database::MigrationHelpers
DOWNTIME = false
def change
add_column :ci_builds, :queued_at, :timestamp
end
end


@ -170,6 +170,7 @@ ActiveRecord::Schema.define(version: 20160810142633) do
t.integer "artifacts_size"
t.string "when"
t.text "yaml_variables"
t.datetime "queued_at"
end
add_index "ci_builds", ["commit_id", "stage_idx", "created_at"], name: "index_ci_builds_on_commit_id_and_stage_idx_and_created_at", using: :btree


@ -10,20 +10,22 @@ module SharedBuilds
end
step 'project has a recent build' do
@pipeline = create(:ci_pipeline, project: @project, sha: @project.commit.sha, ref: 'master')
@pipeline = create(:ci_empty_pipeline, project: @project, sha: @project.commit.sha, ref: 'master')
@build = create(:ci_build_with_coverage, pipeline: @pipeline)
@pipeline.reload_status!
end
step 'recent build is successful' do
@build.update(status: 'success')
@build.success
end
step 'recent build failed' do
@build.update(status: 'failed')
@build.drop
end
step 'project has another build that is running' do
create(:ci_build, pipeline: @pipeline, name: 'second build', status: 'running')
@pipeline.reload_status!
end
step 'I visit recent build details page' do


@ -62,7 +62,7 @@ module Ci
# - before script should be a concatenated command
commands: [job[:before_script] || @before_script, job[:script]].flatten.compact.join("\n"),
tag_list: job[:tags] || [],
name: job[:name],
name: job[:name].to_s,
allow_failure: job[:allow_failure] || false,
when: job[:when] || 'on_success',
environment: job[:environment],


@ -7,6 +7,7 @@ FactoryGirl.define do
stage_idx 0
ref 'master'
tag false
status 'pending'
created_at 'Di 29. Okt 09:50:00 CET 2013'
started_at 'Di 29. Okt 09:51:28 CET 2013'
finished_at 'Di 29. Okt 09:53:28 CET 2013'
@ -45,6 +46,10 @@ FactoryGirl.define do
status 'pending'
end
trait :created do
status 'created'
end
trait :manual do
status 'skipped'
self.when 'manual'


@ -18,7 +18,9 @@
FactoryGirl.define do
factory :ci_empty_pipeline, class: Ci::Pipeline do
ref 'master'
sha '97de212e80737a608d939f648d959671fb0a0142'
status 'pending'
project factory: :empty_project


@ -7,6 +7,30 @@ FactoryGirl.define do
started_at 'Tue, 26 Jan 2016 08:21:42 +0100'
finished_at 'Tue, 26 Jan 2016 08:23:42 +0100'
trait :success do
status 'success'
end
trait :failed do
status 'failed'
end
trait :canceled do
status 'canceled'
end
trait :running do
status 'running'
end
trait :pending do
status 'pending'
end
trait :created do
status 'created'
end
after(:build) do |build, evaluator|
build.project = build.pipeline.project
end


@ -29,12 +29,16 @@ feature 'Merge request created from fork' do
include WaitForAjax
given(:pipeline) do
create(:ci_pipeline_with_two_job, project: fork_project,
sha: merge_request.diff_head_sha,
ref: merge_request.source_branch)
create(:ci_pipeline,
project: fork_project,
sha: merge_request.diff_head_sha,
ref: merge_request.source_branch)
end
background { pipeline.create_builds(user) }
background do
create(:ci_build, pipeline: pipeline, name: 'rspec')
create(:ci_build, pipeline: pipeline, name: 'spinach')
end
scenario 'user visits a pipelines page', js: true do
visit_merge_request(merge_request)


@ -33,7 +33,10 @@ describe "Pipelines" do
context 'cancelable pipeline' do
let!(:running) { create(:ci_build, :running, pipeline: pipeline, stage: 'test', commands: 'test') }
before { visit namespace_project_pipelines_path(project.namespace, project) }
before do
pipeline.reload_status!
visit namespace_project_pipelines_path(project.namespace, project)
end
it { expect(page).to have_link('Cancel') }
it { expect(page).to have_selector('.ci-running') }
@ -49,7 +52,10 @@ describe "Pipelines" do
context 'retryable pipelines' do
let!(:failed) { create(:ci_build, :failed, pipeline: pipeline, stage: 'test', commands: 'test') }
before { visit namespace_project_pipelines_path(project.namespace, project) }
before do
pipeline.reload_status!
visit namespace_project_pipelines_path(project.namespace, project)
end
it { expect(page).to have_link('Retry') }
it { expect(page).to have_selector('.ci-failed') }
@ -80,7 +86,10 @@ describe "Pipelines" do
context 'when running' do
let!(:running) { create(:generic_commit_status, status: 'running', pipeline: pipeline, stage: 'test') }
before { visit namespace_project_pipelines_path(project.namespace, project) }
before do
pipeline.reload_status!
visit namespace_project_pipelines_path(project.namespace, project)
end
it 'is not cancelable' do
expect(page).not_to have_link('Cancel')
@ -92,9 +101,12 @@ describe "Pipelines" do
end
context 'when failed' do
let!(:running) { create(:generic_commit_status, status: 'failed', pipeline: pipeline, stage: 'test') }
let!(:failed) { create(:generic_commit_status, status: 'failed', pipeline: pipeline, stage: 'test') }
before { visit namespace_project_pipelines_path(project.namespace, project) }
before do
pipeline.reload_status!
visit namespace_project_pipelines_path(project.namespace, project)
end
it 'is not retryable' do
expect(page).not_to have_link('Retry')
@ -211,7 +223,7 @@ describe "Pipelines" do
context 'for invalid commit' do
before do
fill_in('Create for', with: 'invalid reference')
fill_in('Create for', with: 'invalid-reference')
click_on 'Create pipeline'
end


@ -5,6 +5,7 @@ describe Ci::Charts, lib: true do
before do
@pipeline = FactoryGirl.create(:ci_pipeline)
FactoryGirl.create(:ci_build, pipeline: @pipeline)
@pipeline.reload_status!
end
it 'returns build times in minutes' do


@ -19,7 +19,7 @@ module Ci
expect(config_processor.builds_for_stage_and_ref(type, "master").first).to eq({
stage: "test",
stage_idx: 1,
name: :rspec,
name: "rspec",
commands: "pwd\nrspec",
tag_list: [],
options: {},
@ -433,7 +433,7 @@ module Ci
expect(config_processor.builds_for_stage_and_ref("test", "master").first).to eq({
stage: "test",
stage_idx: 1,
name: :rspec,
name: "rspec",
commands: "pwd\nrspec",
tag_list: [],
options: {
@ -461,7 +461,7 @@ module Ci
expect(config_processor.builds_for_stage_and_ref("test", "master").first).to eq({
stage: "test",
stage_idx: 1,
name: :rspec,
name: "rspec",
commands: "pwd\nrspec",
tag_list: [],
options: {
@ -700,7 +700,7 @@ module Ci
expect(config_processor.builds_for_stage_and_ref("test", "master").first).to eq({
stage: "test",
stage_idx: 1,
name: :rspec,
name: "rspec",
commands: "pwd\nrspec",
tag_list: [],
options: {
@ -837,7 +837,7 @@ module Ci
expect(subject.first).to eq({
stage: "test",
stage_idx: 1,
name: :normal_job,
name: "normal_job",
commands: "test",
tag_list: [],
options: {},
@ -882,7 +882,7 @@ module Ci
expect(subject.first).to eq({
stage: "build",
stage_idx: 0,
name: :job1,
name: "job1",
commands: "execute-script-for-job",
tag_list: [],
options: {},
@ -894,7 +894,7 @@ module Ci
expect(subject.second).to eq({
stage: "build",
stage_idx: 0,
name: :job2,
name: "job2",
commands: "execute-script-for-job",
tag_list: [],
options: {},


@ -96,9 +96,10 @@ describe Gitlab::Badge::Build do
end
def create_build(project, sha, branch)
pipeline = create(:ci_pipeline, project: project,
sha: sha,
ref: branch)
pipeline = create(:ci_empty_pipeline,
project: project,
sha: sha,
ref: branch)
create(:ci_build, pipeline: pipeline, stage: 'notify')
end


@ -764,6 +764,53 @@ describe Ci::Build, models: true do
end
end
describe '#when' do
subject { build.when }
context 'if is undefined' do
before do
build.when = nil
end
context 'use from gitlab-ci.yml' do
before do
stub_ci_pipeline_yaml_file(config)
end
context 'if config is not found' do
let(:config) { nil }
it { is_expected.to eq('on_success') }
end
context 'if config does not have a questioned job' do
let(:config) do
YAML.dump({
test_other: {
script: 'Hello World'
}
})
end
it { is_expected.to eq('on_success') }
end
context 'if config has when' do
let(:config) do
YAML.dump({
test: {
script: 'Hello World',
when: 'always'
}
})
end
it { is_expected.to eq('always') }
end
end
end
end
describe '#retryable?' do
context 'when build is running' do
before do


@ -38,9 +38,6 @@ describe Ci::Pipeline, models: true do
it { expect(pipeline.sha).to start_with(subject) }
end
describe '#create_next_builds' do
end
describe '#retried' do
subject { pipeline.retried }
@ -54,304 +51,20 @@ describe Ci::Pipeline, models: true do
end
end
describe '#create_builds' do
let!(:pipeline) { FactoryGirl.create :ci_pipeline, project: project, ref: 'master', tag: false }
def create_builds(trigger_request = nil)
pipeline.create_builds(nil, trigger_request)
end
def create_next_builds
pipeline.create_next_builds(pipeline.builds.order(:id).last)
end
it 'creates builds' do
expect(create_builds).to be_truthy
pipeline.builds.update_all(status: "success")
expect(pipeline.builds.count(:all)).to eq(2)
expect(create_next_builds).to be_truthy
pipeline.builds.update_all(status: "success")
expect(pipeline.builds.count(:all)).to eq(4)
expect(create_next_builds).to be_truthy
pipeline.builds.update_all(status: "success")
expect(pipeline.builds.count(:all)).to eq(5)
expect(create_next_builds).to be_falsey
end
context 'custom stage with first job allowed to fail' do
let(:yaml) do
{
stages: ['clean', 'test'],
clean_job: {
stage: 'clean',
allow_failure: true,
script: 'BUILD',
},
test_job: {
stage: 'test',
script: 'TEST',
},
}
end
before do
stub_ci_pipeline_yaml_file(YAML.dump(yaml))
create_builds
end
it 'properly schedules builds' do
expect(pipeline.builds.pluck(:status)).to contain_exactly('pending')
pipeline.builds.running_or_pending.each(&:drop)
expect(pipeline.builds.pluck(:status)).to contain_exactly('pending', 'failed')
end
end
context 'properly creates builds when "when" is defined' do
let(:yaml) do
{
stages: ["build", "test", "test_failure", "deploy", "cleanup"],
build: {
stage: "build",
script: "BUILD",
},
test: {
stage: "test",
script: "TEST",
},
test_failure: {
stage: "test_failure",
script: "ON test failure",
when: "on_failure",
},
deploy: {
stage: "deploy",
script: "PUBLISH",
},
cleanup: {
stage: "cleanup",
script: "TIDY UP",
when: "always",
}
}
end
before do
stub_ci_pipeline_yaml_file(YAML.dump(yaml))
end
context 'when builds are successful' do
it 'properly creates builds' do
expect(create_builds).to be_truthy
expect(pipeline.builds.pluck(:name)).to contain_exactly('build')
expect(pipeline.builds.pluck(:status)).to contain_exactly('pending')
pipeline.builds.running_or_pending.each(&:success)
expect(pipeline.builds.pluck(:name)).to contain_exactly('build', 'test')
expect(pipeline.builds.pluck(:status)).to contain_exactly('success', 'pending')
pipeline.builds.running_or_pending.each(&:success)
expect(pipeline.builds.pluck(:name)).to contain_exactly('build', 'test', 'deploy')
expect(pipeline.builds.pluck(:status)).to contain_exactly('success', 'success', 'pending')
pipeline.builds.running_or_pending.each(&:success)
expect(pipeline.builds.pluck(:name)).to contain_exactly('build', 'test', 'deploy', 'cleanup')
expect(pipeline.builds.pluck(:status)).to contain_exactly('success', 'success', 'success', 'pending')
pipeline.builds.running_or_pending.each(&:success)
expect(pipeline.builds.pluck(:status)).to contain_exactly('success', 'success', 'success', 'success')
pipeline.reload
expect(pipeline.status).to eq('success')
end
end
context 'when test job fails' do
it 'properly creates builds' do
expect(create_builds).to be_truthy
expect(pipeline.builds.pluck(:name)).to contain_exactly('build')
expect(pipeline.builds.pluck(:status)).to contain_exactly('pending')
pipeline.builds.running_or_pending.each(&:success)
expect(pipeline.builds.pluck(:name)).to contain_exactly('build', 'test')
expect(pipeline.builds.pluck(:status)).to contain_exactly('success', 'pending')
pipeline.builds.running_or_pending.each(&:drop)
expect(pipeline.builds.pluck(:name)).to contain_exactly('build', 'test', 'test_failure')
expect(pipeline.builds.pluck(:status)).to contain_exactly('success', 'failed', 'pending')
pipeline.builds.running_or_pending.each(&:success)
expect(pipeline.builds.pluck(:name)).to contain_exactly('build', 'test', 'test_failure', 'cleanup')
expect(pipeline.builds.pluck(:status)).to contain_exactly('success', 'failed', 'success', 'pending')
pipeline.builds.running_or_pending.each(&:success)
expect(pipeline.builds.pluck(:status)).to contain_exactly('success', 'failed', 'success', 'success')
pipeline.reload
expect(pipeline.status).to eq('failed')
end
end
context 'when test and test_failure jobs fail' do
it 'properly creates builds' do
expect(create_builds).to be_truthy
expect(pipeline.builds.pluck(:name)).to contain_exactly('build')
expect(pipeline.builds.pluck(:status)).to contain_exactly('pending')
pipeline.builds.running_or_pending.each(&:success)
expect(pipeline.builds.pluck(:name)).to contain_exactly('build', 'test')
expect(pipeline.builds.pluck(:status)).to contain_exactly('success', 'pending')
pipeline.builds.running_or_pending.each(&:drop)
expect(pipeline.builds.pluck(:name)).to contain_exactly('build', 'test', 'test_failure')
expect(pipeline.builds.pluck(:status)).to contain_exactly('success', 'failed', 'pending')
pipeline.builds.running_or_pending.each(&:drop)
expect(pipeline.builds.pluck(:name)).to contain_exactly('build', 'test', 'test_failure', 'cleanup')
expect(pipeline.builds.pluck(:status)).to contain_exactly('success', 'failed', 'failed', 'pending')
pipeline.builds.running_or_pending.each(&:success)
expect(pipeline.builds.pluck(:name)).to contain_exactly('build', 'test', 'test_failure', 'cleanup')
expect(pipeline.builds.pluck(:status)).to contain_exactly('success', 'failed', 'failed', 'success')
pipeline.reload
expect(pipeline.status).to eq('failed')
end
end
context 'when deploy job fails' do
it 'properly creates builds' do
expect(create_builds).to be_truthy
expect(pipeline.builds.pluck(:name)).to contain_exactly('build')
expect(pipeline.builds.pluck(:status)).to contain_exactly('pending')
pipeline.builds.running_or_pending.each(&:success)
expect(pipeline.builds.pluck(:name)).to contain_exactly('build', 'test')
expect(pipeline.builds.pluck(:status)).to contain_exactly('success', 'pending')
pipeline.builds.running_or_pending.each(&:success)
expect(pipeline.builds.pluck(:name)).to contain_exactly('build', 'test', 'deploy')
expect(pipeline.builds.pluck(:status)).to contain_exactly('success', 'success', 'pending')
pipeline.builds.running_or_pending.each(&:drop)
expect(pipeline.builds.pluck(:name)).to contain_exactly('build', 'test', 'deploy', 'cleanup')
expect(pipeline.builds.pluck(:status)).to contain_exactly('success', 'success', 'failed', 'pending')
pipeline.builds.running_or_pending.each(&:success)
expect(pipeline.builds.pluck(:status)).to contain_exactly('success', 'success', 'failed', 'success')
pipeline.reload
expect(pipeline.status).to eq('failed')
end
end
context 'when build is canceled in the second stage' do
it 'does not schedule builds after build has been canceled' do
expect(create_builds).to be_truthy
expect(pipeline.builds.pluck(:name)).to contain_exactly('build')
expect(pipeline.builds.pluck(:status)).to contain_exactly('pending')
pipeline.builds.running_or_pending.each(&:success)
expect(pipeline.builds.running_or_pending).not_to be_empty
expect(pipeline.builds.pluck(:name)).to contain_exactly('build', 'test')
expect(pipeline.builds.pluck(:status)).to contain_exactly('success', 'pending')
pipeline.builds.running_or_pending.each(&:cancel)
expect(pipeline.builds.running_or_pending).to be_empty
expect(pipeline.reload.status).to eq('canceled')
end
end
context 'when listing manual actions' do
let(:yaml) do
{
stages: ["build", "test", "staging", "production", "cleanup"],
build: {
stage: "build",
script: "BUILD",
},
test: {
stage: "test",
script: "TEST",
},
staging: {
stage: "staging",
script: "PUBLISH",
},
production: {
stage: "production",
script: "PUBLISH",
when: "manual",
},
cleanup: {
stage: "cleanup",
script: "TIDY UP",
when: "always",
},
clear_cache: {
stage: "cleanup",
script: "CLEAR CACHE",
when: "manual",
}
}
end
it 'returns only for skipped builds' do
# currently all builds are created
expect(create_builds).to be_truthy
expect(manual_actions).to be_empty
# succeed stage build
pipeline.builds.running_or_pending.each(&:success)
expect(manual_actions).to be_empty
# succeed stage test
pipeline.builds.running_or_pending.each(&:success)
expect(manual_actions).to be_empty
# succeed stage staging and skip stage production
pipeline.builds.running_or_pending.each(&:success)
expect(manual_actions).to be_many # production and clear cache
# succeed stage cleanup
pipeline.builds.running_or_pending.each(&:success)
# after processing a pipeline we should have 6 builds, 5 succeeded
expect(pipeline.builds.count).to eq(6)
expect(pipeline.builds.success.count).to eq(4)
end
def manual_actions
pipeline.manual_actions
end
end
end
context 'when no builds created' do
let(:pipeline) { build(:ci_pipeline) }
before do
stub_ci_pipeline_yaml_file(YAML.dump(before_script: ['ls']))
end
it 'returns false' do
expect(pipeline.create_builds(nil)).to be_falsey
expect(pipeline).not_to be_persisted
end
end
end
describe "#finished_at" do
let(:pipeline) { FactoryGirl.create :ci_pipeline }
it "returns finished_at of latest build" do
build = FactoryGirl.create :ci_build, pipeline: pipeline, finished_at: Time.now - 60
FactoryGirl.create :ci_build, pipeline: pipeline, finished_at: Time.now - 120
pipeline.reload_status!
expect(pipeline.finished_at.to_i).to eq(build.finished_at.to_i)
end
it "returns nil if there is no finished build" do
FactoryGirl.create :ci_not_started_build, pipeline: pipeline
pipeline.reload_status!
expect(pipeline.finished_at).to be_nil
end
@ -359,7 +72,7 @@ describe Ci::Pipeline, models: true do
describe "coverage" do
let(:project) { FactoryGirl.create :empty_project, build_coverage_regex: "/.*/" }
let(:pipeline) { FactoryGirl.create :ci_pipeline, project: project }
let(:pipeline) { FactoryGirl.create :ci_empty_pipeline, project: project }
it "calculates average when there are two builds with coverage" do
FactoryGirl.create :ci_build, name: "rspec", coverage: 30, pipeline: pipeline
@ -426,31 +139,30 @@ describe Ci::Pipeline, models: true do
end
end
describe '#update_state' do
it 'executes update_state after touching object' do
expect(pipeline).to receive(:update_state).and_return(true)
pipeline.touch
end
describe '#reload_status!' do
let(:pipeline) { create :ci_empty_pipeline, project: project }
context 'dependent objects' do
let(:commit_status) { build :commit_status, pipeline: pipeline }
let(:commit_status) { create :commit_status, :pending, pipeline: pipeline }
it 'executes update_state after saving dependent object' do
expect(pipeline).to receive(:update_state).and_return(true)
commit_status.save
it 'executes reload_status! after succeeding dependent object' do
expect(pipeline).to receive(:reload_status!).and_return(true)
commit_status.success
end
end
context 'update state' do
context 'updates' do
let(:current) { Time.now.change(usec: 0) }
let(:build) { FactoryGirl.create :ci_build, :success, pipeline: pipeline, started_at: current - 120, finished_at: current - 60 }
let(:build) { FactoryGirl.create :ci_build, pipeline: pipeline, started_at: current - 120, finished_at: current - 60 }
before do
build
pipeline.reload_status!
end
[:status, :started_at, :finished_at, :duration].each do |param|
it "update #{param}" do
it "#{param}" do
expect(pipeline.send(param)).to eq(build.send(param))
end
end


@ -291,7 +291,8 @@ describe HipchatService, models: true do
end
context 'build events' do
let(:build) { create(:ci_build) }
let(:pipeline) { create(:ci_empty_pipeline) }
let(:build) { create(:ci_build, pipeline: pipeline) }
let(:data) { Gitlab::BuildDataBuilder.build(build) }
context 'for failed' do


@ -238,6 +238,10 @@ describe API::API, api: true do
it { expect(response.headers).to include(download_headers) }
end
before do
pipeline.reload_status!
end
context 'with regular branch' do
before do
pipeline.update(ref: 'master',


@ -89,16 +89,29 @@ describe API::API, api: true do
it "returns nil for commit without CI" do
get api("/projects/#{project.id}/repository/commits/#{project.repository.commit.id}", user)
expect(response).to have_http_status(200)
expect(json_response['status']).to be_nil
end
it "returns status for CI" do
pipeline = project.ensure_pipeline(project.repository.commit.sha, 'master')
pipeline.update(status: 'success')
get api("/projects/#{project.id}/repository/commits/#{project.repository.commit.id}", user)
expect(response).to have_http_status(200)
expect(json_response['status']).to eq(pipeline.status)
end
it "returns status for CI when pipeline is created" do
project.ensure_pipeline(project.repository.commit.sha, 'master')
get api("/projects/#{project.id}/repository/commits/#{project.repository.commit.id}", user)
expect(response).to have_http_status(200)
expect(json_response['status']).to be_nil
end
end
context "unauthorized user" do


@ -50,7 +50,8 @@ describe API::API do
post api("/projects/#{project.id}/trigger/builds"), options.merge(ref: 'master')
expect(response).to have_http_status(201)
pipeline.builds.reload
expect(pipeline.builds.size).to eq(2)
expect(pipeline.builds.pending.size).to eq(2)
expect(pipeline.builds.size).to eq(5)
end
it 'returns bad request with no builds created if there\'s no commit for that ref' do


@ -6,112 +6,102 @@ describe Ci::API::API do
let(:runner) { FactoryGirl.create(:ci_runner, tag_list: ["mysql", "ruby"]) }
let(:project) { FactoryGirl.create(:empty_project) }
before do
stub_ci_pipeline_to_return_yaml_file
end
describe "Builds API for runners" do
let(:shared_runner) { FactoryGirl.create(:ci_runner, token: "SharedRunner") }
let(:shared_project) { FactoryGirl.create(:empty_project, name: "SharedProject") }
let(:pipeline) { create(:ci_pipeline_without_jobs, project: project, ref: 'master') }
before do
FactoryGirl.create :ci_runner_project, project: project, runner: runner
project.runners << runner
end
describe "POST /builds/register" do
it "starts a build" do
pipeline = FactoryGirl.create(:ci_pipeline, project: project, ref: 'master')
pipeline.create_builds(nil)
build = pipeline.builds.first
let!(:build) { create(:ci_build, pipeline: pipeline, name: 'spinach', stage: 'test', stage_idx: 0) }
post ci_api("/builds/register"), token: runner.token, info: { platform: :darwin }
it "starts a build" do
register_builds info: { platform: :darwin }
expect(response).to have_http_status(201)
expect(json_response['sha']).to eq(build.sha)
expect(runner.reload.platform).to eq("darwin")
end
it "returns 404 error if no pending build found" do
post ci_api("/builds/register"), token: runner.token
expect(response).to have_http_status(404)
end
it "returns 404 error if no builds for specific runner" do
pipeline = FactoryGirl.create(:ci_pipeline, project: shared_project)
FactoryGirl.create(:ci_build, pipeline: pipeline, status: 'pending')
post ci_api("/builds/register"), token: runner.token
expect(response).to have_http_status(404)
end
it "returns 404 error if no builds for shared runner" do
pipeline = FactoryGirl.create(:ci_pipeline, project: project)
FactoryGirl.create(:ci_build, pipeline: pipeline, status: 'pending')
post ci_api("/builds/register"), token: shared_runner.token
expect(response).to have_http_status(404)
end
it "returns options" do
pipeline = FactoryGirl.create(:ci_pipeline, project: project, ref: 'master')
pipeline.create_builds(nil)
post ci_api("/builds/register"), token: runner.token, info: { platform: :darwin }
expect(response).to have_http_status(201)
expect(json_response["options"]).to eq({ "image" => "ruby:2.1", "services" => ["postgres"] })
end
it "returns variables" do
pipeline = FactoryGirl.create(:ci_pipeline, project: project, ref: 'master')
pipeline.create_builds(nil)
project.variables << Ci::Variable.new(key: "SECRET_KEY", value: "secret_value")
post ci_api("/builds/register"), token: runner.token, info: { platform: :darwin }
expect(response).to have_http_status(201)
expect(json_response["variables"]).to include(
{ "key" => "CI_BUILD_NAME", "value" => "spinach", "public" => true },
{ "key" => "CI_BUILD_STAGE", "value" => "test", "public" => true },
{ "key" => "DB_NAME", "value" => "postgres", "public" => true },
{ "key" => "SECRET_KEY", "value" => "secret_value", "public" => false }
{ "key" => "DB_NAME", "value" => "postgres", "public" => true }
)
end
it "returns variables for triggers" do
trigger = FactoryGirl.create(:ci_trigger, project: project)
pipeline = FactoryGirl.create(:ci_pipeline, project: project, ref: 'master')
context 'when builds are finished' do
before do
build.success
end
trigger_request = FactoryGirl.create(:ci_trigger_request_with_variables, pipeline: pipeline, trigger: trigger)
pipeline.create_builds(nil, trigger_request)
project.variables << Ci::Variable.new(key: "SECRET_KEY", value: "secret_value")
it "returns 404 error if no builds for specific runner" do
register_builds
post ci_api("/builds/register"), token: runner.token, info: { platform: :darwin }
expect(response).to have_http_status(201)
expect(json_response["variables"]).to include(
{ "key" => "CI_BUILD_NAME", "value" => "spinach", "public" => true },
{ "key" => "CI_BUILD_STAGE", "value" => "test", "public" => true },
{ "key" => "CI_BUILD_TRIGGERED", "value" => "true", "public" => true },
{ "key" => "DB_NAME", "value" => "postgres", "public" => true },
{ "key" => "SECRET_KEY", "value" => "secret_value", "public" => false },
{ "key" => "TRIGGER_KEY_1", "value" => "TRIGGER_VALUE_1", "public" => false }
)
expect(response).to have_http_status(404)
end
end
it "returns dependent builds" do
pipeline = FactoryGirl.create(:ci_pipeline, project: project, ref: 'master')
pipeline.create_builds(nil, nil)
pipeline.builds.where(stage: 'test').each(&:success)
context 'for other project with builds' do
before do
build.success
create(:ci_build, :pending)
end
post ci_api("/builds/register"), token: runner.token, info: { platform: :darwin }
it "returns 404 error if no builds for shared runner" do
register_builds
expect(response).to have_http_status(201)
expect(json_response["depends_on_builds"].count).to eq(2)
expect(json_response["depends_on_builds"][0]["name"]).to eq("rspec")
expect(response).to have_http_status(404)
end
end
context 'for shared runner' do
let(:shared_runner) { create(:ci_runner, token: "SharedRunner") }
it "should return 404 error if no builds for shared runner" do
register_builds shared_runner.token
expect(response).to have_http_status(404)
end
end
context 'for triggered build' do
before do
trigger = create(:ci_trigger, project: project)
create(:ci_trigger_request_with_variables, pipeline: pipeline, builds: [build], trigger: trigger)
project.variables << Ci::Variable.new(key: "SECRET_KEY", value: "secret_value")
end
it "returns variables for triggers" do
register_builds info: { platform: :darwin }
expect(response).to have_http_status(201)
expect(json_response["variables"]).to include(
{ "key" => "CI_BUILD_NAME", "value" => "spinach", "public" => true },
{ "key" => "CI_BUILD_STAGE", "value" => "test", "public" => true },
{ "key" => "CI_BUILD_TRIGGERED", "value" => "true", "public" => true },
{ "key" => "DB_NAME", "value" => "postgres", "public" => true },
{ "key" => "SECRET_KEY", "value" => "secret_value", "public" => false },
{ "key" => "TRIGGER_KEY_1", "value" => "TRIGGER_VALUE_1", "public" => false },
)
end
end
context 'with multiple builds' do
before do
build.success
end
let!(:test_build) { create(:ci_build, pipeline: pipeline, name: 'deploy', stage: 'deploy', stage_idx: 1) }
it "returns dependent builds" do
register_builds info: { platform: :darwin }
expect(response).to have_http_status(201)
expect(json_response["id"]).to eq(test_build.id)
expect(json_response["depends_on_builds"].count).to eq(1)
expect(json_response["depends_on_builds"][0]).to include('id' => build.id, 'name' => 'spinach')
end
end
%w(name version revision platform architecture).each do |param|
@ -121,8 +111,9 @@ describe Ci::API::API do
subject { runner.read_attribute(param.to_sym) }
it do
post ci_api("/builds/register"), token: runner.token, info: { param => value }
expect(response).to have_http_status(404)
register_builds info: { param => value }
expect(response).to have_http_status(201)
runner.reload
is_expected.to eq(value)
end
@ -131,8 +122,7 @@ describe Ci::API::API do
context 'when build has no tags' do
before do
pipeline = create(:ci_pipeline, project: project)
create(:ci_build, pipeline: pipeline, tags: [])
build.update(tags: [])
end
context 'when runner is allowed to pick untagged builds' do
@ -154,17 +144,15 @@ describe Ci::API::API do
expect(response).to have_http_status 404
end
end
end
def register_builds
post ci_api("/builds/register"), token: runner.token,
info: { platform: :darwin }
end
def register_builds(token = runner.token, **params)
post ci_api("/builds/register"), params.merge(token: token)
end
end
describe "PUT /builds/:id" do
let(:pipeline) {create(:ci_pipeline, project: project)}
let(:build) { create(:ci_build, :trace, pipeline: pipeline, runner_id: runner.id) }
let(:build) { create(:ci_build, :pending, :trace, pipeline: pipeline, runner_id: runner.id) }
before do
build.run!
@ -189,7 +177,7 @@ describe Ci::API::API do
end
describe 'PATCH /builds/:id/trace.txt' do
let(:build) { create(:ci_build, :trace, runner_id: runner.id) }
let(:build) { create(:ci_build, :pending, :trace, runner_id: runner.id) }
let(:headers) { { Ci::API::Helpers::BUILD_TOKEN_HEADER => build.token, 'Content-Type' => 'text/plain' } }
let(:headers_with_range) { headers.merge({ 'Content-Range' => '11-20' }) }
@ -237,8 +225,7 @@ describe Ci::API::API do
context "Artifacts" do
let(:file_upload) { fixture_file_upload(Rails.root + 'spec/fixtures/banana_sample.gif', 'image/gif') }
let(:file_upload2) { fixture_file_upload(Rails.root + 'spec/fixtures/dk.png', 'image/gif') }
let(:pipeline) { create(:ci_pipeline, project: project) }
let(:build) { create(:ci_build, pipeline: pipeline, runner_id: runner.id) }
let(:build) { create(:ci_build, :pending, pipeline: pipeline, runner_id: runner.id) }
let(:authorize_url) { ci_api("/builds/#{build.id}/artifacts/authorize") }
let(:post_url) { ci_api("/builds/#{build.id}/artifacts") }
let(:delete_url) { ci_api("/builds/#{build.id}/artifacts") }

View File

@ -42,7 +42,8 @@ describe Ci::API::API do
post ci_api("/projects/#{project.ci_id}/refs/master/trigger"), options
expect(response).to have_http_status(201)
pipeline.builds.reload
expect(pipeline.builds.size).to eq(2)
expect(pipeline.builds.pending.size).to eq(2)
expect(pipeline.builds.size).to eq(5)
end
it 'returns bad request with no builds created if there\'s no commit for that ref' do


@ -1,32 +0,0 @@
require 'spec_helper'
describe Ci::CreateBuildsService, services: true do
let(:pipeline) { create(:ci_pipeline, ref: 'master') }
let(:user) { create(:user) }
describe '#execute' do
# Using stubbed .gitlab-ci.yml created in commit factory
#
subject do
described_class.new(pipeline).execute('test', user, status, nil)
end
context 'next builds available' do
let(:status) { 'success' }
it { is_expected.to be_an_instance_of Array }
it { is_expected.to all(be_an_instance_of Ci::Build) }
it 'does not persist created builds' do
expect(subject.first).not_to be_persisted
end
end
context 'builds skipped' do
let(:status) { 'skipped' }
it { is_expected.to be_empty }
end
end
end


@ -0,0 +1,214 @@
require 'spec_helper'
describe Ci::CreatePipelineService, services: true do
let(:project) { FactoryGirl.create(:project) }
let(:user) { create(:admin) }
before do
stub_ci_pipeline_to_return_yaml_file
end
describe '#execute' do
def execute(params)
described_class.new(project, user, params).execute
end
context 'valid params' do
let(:pipeline) do
execute(ref: 'refs/heads/master',
before: '00000000',
after: project.commit.id,
commits: [{ message: "Message" }])
end
it { expect(pipeline).to be_kind_of(Ci::Pipeline) }
it { expect(pipeline).to be_valid }
it { expect(pipeline).to be_persisted }
it { expect(pipeline).to eq(project.pipelines.last) }
it { expect(pipeline).to have_attributes(user: user) }
it { expect(pipeline.builds.first).to be_kind_of(Ci::Build) }
end
context "skip tag if there is no build for it" do
it "creates commit if there is appropriate job" do
result = execute(ref: 'refs/heads/master',
before: '00000000',
after: project.commit.id,
commits: [{ message: "Message" }])
expect(result).to be_persisted
end
it "creates commit if there is no appropriate job but deploy job has right ref setting" do
config = YAML.dump({ deploy: { script: "ls", only: ["master"] } })
stub_ci_pipeline_yaml_file(config)
result = execute(ref: 'refs/heads/master',
before: '00000000',
after: project.commit.id,
commits: [{ message: "Message" }])
expect(result).to be_persisted
end
end
it 'skips creating pipeline for refs without .gitlab-ci.yml' do
stub_ci_pipeline_yaml_file(nil)
result = execute(ref: 'refs/heads/master',
before: '00000000',
after: project.commit.id,
commits: [{ message: 'Message' }])
expect(result).not_to be_persisted
expect(Ci::Pipeline.count).to eq(0)
end
it 'fails commits if yaml is invalid' do
message = 'message'
allow_any_instance_of(Ci::Pipeline).to receive(:git_commit_message) { message }
stub_ci_pipeline_yaml_file('invalid: file: file')
commits = [{ message: message }]
pipeline = execute(ref: 'refs/heads/master',
before: '00000000',
after: project.commit.id,
commits: commits)
expect(pipeline).to be_persisted
expect(pipeline.builds.any?).to be false
expect(pipeline.status).to eq('failed')
expect(pipeline.yaml_errors).not_to be_nil
end
context 'when commit contains a [ci skip] directive' do
let(:message) { "some message[ci skip]" }
let(:messageFlip) { "some message[skip ci]" }
let(:capMessage) { "some message[CI SKIP]" }
let(:capMessageFlip) { "some message[SKIP CI]" }
before do
allow_any_instance_of(Ci::Pipeline).to receive(:git_commit_message) { message }
end
it "skips builds creation if there is [ci skip] tag in commit message" do
commits = [{ message: message }]
pipeline = execute(ref: 'refs/heads/master',
before: '00000000',
after: project.commit.id,
commits: commits)
expect(pipeline).to be_persisted
expect(pipeline.builds.any?).to be false
expect(pipeline.status).to eq("skipped")
end
it "skips builds creation if there is [skip ci] tag in commit message" do
commits = [{ message: messageFlip }]
pipeline = execute(ref: 'refs/heads/master',
before: '00000000',
after: project.commit.id,
commits: commits)
expect(pipeline).to be_persisted
expect(pipeline.builds.any?).to be false
expect(pipeline.status).to eq("skipped")
end
it "skips builds creation if there is [CI SKIP] tag in commit message" do
commits = [{ message: capMessage }]
pipeline = execute(ref: 'refs/heads/master',
before: '00000000',
after: project.commit.id,
commits: commits)
expect(pipeline).to be_persisted
expect(pipeline.builds.any?).to be false
expect(pipeline.status).to eq("skipped")
end
it "skips builds creation if there is [SKIP CI] tag in commit message" do
commits = [{ message: capMessageFlip }]
pipeline = execute(ref: 'refs/heads/master',
before: '00000000',
after: project.commit.id,
commits: commits)
expect(pipeline).to be_persisted
expect(pipeline.builds.any?).to be false
expect(pipeline.status).to eq("skipped")
end
it "does not skips builds creation if there is no [ci skip] or [skip ci] tag in commit message" do
allow_any_instance_of(Ci::Pipeline).to receive(:git_commit_message) { "some message" }
commits = [{ message: "some message" }]
pipeline = execute(ref: 'refs/heads/master',
before: '00000000',
after: project.commit.id,
commits: commits)
expect(pipeline).to be_persisted
expect(pipeline.builds.first.name).to eq("rspec")
end
it "fails builds creation if there is [ci skip] tag in commit message and yaml is invalid" do
stub_ci_pipeline_yaml_file('invalid: file: fiile')
commits = [{ message: message }]
pipeline = execute(ref: 'refs/heads/master',
before: '00000000',
after: project.commit.id,
commits: commits)
expect(pipeline).to be_persisted
expect(pipeline.builds.any?).to be false
expect(pipeline.status).to eq("failed")
expect(pipeline.yaml_errors).not_to be_nil
end
end
it "creates commit with failed status if yaml is invalid" do
stub_ci_pipeline_yaml_file('invalid: file')
commits = [{ message: "some message" }]
pipeline = execute(ref: 'refs/heads/master',
before: '00000000',
after: project.commit.id,
commits: commits)
expect(pipeline).to be_persisted
expect(pipeline.status).to eq("failed")
expect(pipeline.builds.any?).to be false
end
context 'when there are no jobs for this pipeline' do
before do
config = YAML.dump({ test: { script: 'ls', only: ['feature'] } })
stub_ci_pipeline_yaml_file(config)
end
it 'does not create a new pipeline' do
result = execute(ref: 'refs/heads/master',
before: '00000000',
after: project.commit.id,
commits: [{ message: 'some msg' }])
expect(result).not_to be_persisted
expect(Ci::Build.all).to be_empty
expect(Ci::Pipeline.count).to eq(0)
end
end
context 'with manual actions' do
before do
config = YAML.dump({ deploy: { script: 'ls', when: 'manual' } })
stub_ci_pipeline_yaml_file(config)
end
it 'creates a pipeline with manual actions' do
result = execute(ref: 'refs/heads/master',
before: '00000000',
after: project.commit.id,
commits: [{ message: 'some msg' }])
expect(result).to be_persisted
expect(result.manual_actions).not_to be_empty
end
end
end
end
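
Side note for readers skimming this spec: the `execute` helper above shows the full calling convention of the new service. A minimal usage sketch, assuming a `project` and `user` like the ones set up in the spec (the parameter values are illustrative only, taken from the examples above):

```ruby
# Illustrative sketch only -- it simply mirrors the execute helper in the spec above;
# project and user are assumed to be set up as in the spec.
params = {
  ref:     'refs/heads/master',
  before:  '00000000',
  after:   project.commit.id,
  commits: [{ message: 'Message' }]
}

pipeline = Ci::CreatePipelineService.new(project, user, params).execute

pipeline.persisted?   # => true when the ref has a usable .gitlab-ci.yml
pipeline.builds.any?  # => true -- the builds exist as soon as the pipeline does
```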

View File

@ -1,7 +1,7 @@
require 'spec_helper'
describe Ci::CreateTriggerRequestService, services: true do
let(:service) { Ci::CreateTriggerRequestService.new }
let(:service) { described_class.new }
let(:project) { create(:project) }
let(:trigger) { create(:ci_trigger, project: project) }
@ -27,8 +27,7 @@ describe Ci::CreateTriggerRequestService, services: true do
subject { service.execute(project, trigger, 'master') }
before do
stub_ci_pipeline_yaml_file('{}')
FactoryGirl.create :ci_pipeline, project: project
stub_ci_pipeline_yaml_file('script: { only: [develop], script: hello World }')
end
it { expect(subject).to be_nil }

View File

@ -5,8 +5,8 @@ module Ci
let(:service) { ImageForBuildService.new }
let(:project) { FactoryGirl.create(:empty_project) }
let(:commit_sha) { '01234567890123456789' }
let(:commit) { project.ensure_pipeline(commit_sha, 'master') }
let(:build) { FactoryGirl.create(:ci_build, pipeline: commit) }
let(:pipeline) { project.ensure_pipeline(commit_sha, 'master') }
let(:build) { FactoryGirl.create(:ci_build, pipeline: pipeline) }
describe '#execute' do
before { build }
@ -14,6 +14,7 @@ module Ci
context 'branch name' do
before { allow(project).to receive(:commit).and_return(OpenStruct.new(sha: commit_sha)) }
before { build.run! }
before { pipeline.reload_status! }
let(:image) { service.execute(project, ref: 'master') }
it { expect(image).to be_kind_of(OpenStruct) }
@ -31,6 +32,7 @@ module Ci
context 'commit sha' do
before { build.run! }
before { pipeline.reload_status! }
let(:image) { service.execute(project, sha: build.sha) }
it { expect(image).to be_kind_of(OpenStruct) }

View File

@ -0,0 +1,288 @@
require 'spec_helper'
describe Ci::ProcessPipelineService, services: true do
let(:pipeline) { create(:ci_pipeline, ref: 'master') }
let(:user) { create(:user) }
let(:all_builds) { pipeline.builds }
let(:builds) { all_builds.where.not(status: [:created, :skipped]) }
let(:config) { nil }
before do
allow(pipeline).to receive(:ci_yaml_file).and_return(config)
end
describe '#execute' do
def create_builds
described_class.new(pipeline.project, user).execute(pipeline)
end
def succeed_pending
builds.pending.update_all(status: 'success')
end
context 'start queuing next builds' do
before do
create(:ci_build, :created, pipeline: pipeline, name: 'linux', stage_idx: 0)
create(:ci_build, :created, pipeline: pipeline, name: 'mac', stage_idx: 0)
create(:ci_build, :created, pipeline: pipeline, name: 'rspec', stage_idx: 1)
create(:ci_build, :created, pipeline: pipeline, name: 'rubocop', stage_idx: 1)
create(:ci_build, :created, pipeline: pipeline, name: 'deploy', stage_idx: 2)
end
it 'processes a pipeline' do
expect(create_builds).to be_truthy
succeed_pending
expect(builds.success.count).to eq(2)
expect(create_builds).to be_truthy
succeed_pending
expect(builds.success.count).to eq(4)
expect(create_builds).to be_truthy
succeed_pending
expect(builds.success.count).to eq(5)
expect(create_builds).to be_falsey
end
it 'does not process pipeline if existing stage is running' do
expect(create_builds).to be_truthy
expect(builds.pending.count).to eq(2)
expect(create_builds).to be_falsey
expect(builds.pending.count).to eq(2)
end
end
context 'custom stage with first job allowed to fail' do
before do
create(:ci_build, :created, pipeline: pipeline, name: 'clean_job', stage_idx: 0, allow_failure: true)
create(:ci_build, :created, pipeline: pipeline, name: 'test_job', stage_idx: 1, allow_failure: true)
end
it 'automatically triggers a next stage when build finishes' do
expect(create_builds).to be_truthy
expect(builds.pluck(:status)).to contain_exactly('pending')
pipeline.builds.running_or_pending.each(&:drop)
expect(builds.pluck(:status)).to contain_exactly('failed', 'pending')
end
end
context 'properly creates builds when "when" is defined' do
before do
create(:ci_build, :created, pipeline: pipeline, name: 'build', stage_idx: 0)
create(:ci_build, :created, pipeline: pipeline, name: 'test', stage_idx: 1)
create(:ci_build, :created, pipeline: pipeline, name: 'test_failure', stage_idx: 2, when: 'on_failure')
create(:ci_build, :created, pipeline: pipeline, name: 'deploy', stage_idx: 3)
create(:ci_build, :created, pipeline: pipeline, name: 'production', stage_idx: 3, when: 'manual')
create(:ci_build, :created, pipeline: pipeline, name: 'cleanup', stage_idx: 4, when: 'always')
create(:ci_build, :created, pipeline: pipeline, name: 'clear cache', stage_idx: 4, when: 'manual')
end
context 'when builds are successful' do
it 'properly creates builds' do
expect(create_builds).to be_truthy
expect(builds.pluck(:name)).to contain_exactly('build')
expect(builds.pluck(:status)).to contain_exactly('pending')
pipeline.builds.running_or_pending.each(&:success)
expect(builds.pluck(:name)).to contain_exactly('build', 'test')
expect(builds.pluck(:status)).to contain_exactly('success', 'pending')
pipeline.builds.running_or_pending.each(&:success)
expect(builds.pluck(:name)).to contain_exactly('build', 'test', 'deploy')
expect(builds.pluck(:status)).to contain_exactly('success', 'success', 'pending')
pipeline.builds.running_or_pending.each(&:success)
expect(builds.pluck(:name)).to contain_exactly('build', 'test', 'deploy', 'cleanup')
expect(builds.pluck(:status)).to contain_exactly('success', 'success', 'success', 'pending')
pipeline.builds.running_or_pending.each(&:success)
expect(builds.pluck(:status)).to contain_exactly('success', 'success', 'success', 'success')
pipeline.reload
expect(pipeline.status).to eq('success')
end
end
context 'when test job fails' do
it 'properly creates builds' do
expect(create_builds).to be_truthy
expect(builds.pluck(:name)).to contain_exactly('build')
expect(builds.pluck(:status)).to contain_exactly('pending')
pipeline.builds.running_or_pending.each(&:success)
expect(builds.pluck(:name)).to contain_exactly('build', 'test')
expect(builds.pluck(:status)).to contain_exactly('success', 'pending')
pipeline.builds.running_or_pending.each(&:drop)
expect(builds.pluck(:name)).to contain_exactly('build', 'test', 'test_failure')
expect(builds.pluck(:status)).to contain_exactly('success', 'failed', 'pending')
pipeline.builds.running_or_pending.each(&:success)
expect(builds.pluck(:name)).to contain_exactly('build', 'test', 'test_failure', 'cleanup')
expect(builds.pluck(:status)).to contain_exactly('success', 'failed', 'success', 'pending')
pipeline.builds.running_or_pending.each(&:success)
expect(builds.pluck(:status)).to contain_exactly('success', 'failed', 'success', 'success')
pipeline.reload
expect(pipeline.status).to eq('failed')
end
end
context 'when test and test_failure jobs fail' do
it 'properly creates builds' do
expect(create_builds).to be_truthy
expect(builds.pluck(:name)).to contain_exactly('build')
expect(builds.pluck(:status)).to contain_exactly('pending')
pipeline.builds.running_or_pending.each(&:success)
expect(builds.pluck(:name)).to contain_exactly('build', 'test')
expect(builds.pluck(:status)).to contain_exactly('success', 'pending')
pipeline.builds.running_or_pending.each(&:drop)
expect(builds.pluck(:name)).to contain_exactly('build', 'test', 'test_failure')
expect(builds.pluck(:status)).to contain_exactly('success', 'failed', 'pending')
pipeline.builds.running_or_pending.each(&:drop)
expect(builds.pluck(:name)).to contain_exactly('build', 'test', 'test_failure', 'cleanup')
expect(builds.pluck(:status)).to contain_exactly('success', 'failed', 'failed', 'pending')
pipeline.builds.running_or_pending.each(&:success)
expect(builds.pluck(:name)).to contain_exactly('build', 'test', 'test_failure', 'cleanup')
expect(builds.pluck(:status)).to contain_exactly('success', 'failed', 'failed', 'success')
pipeline.reload
expect(pipeline.status).to eq('failed')
end
end
context 'when deploy job fails' do
it 'properly creates builds' do
expect(create_builds).to be_truthy
expect(builds.pluck(:name)).to contain_exactly('build')
expect(builds.pluck(:status)).to contain_exactly('pending')
pipeline.builds.running_or_pending.each(&:success)
expect(builds.pluck(:name)).to contain_exactly('build', 'test')
expect(builds.pluck(:status)).to contain_exactly('success', 'pending')
pipeline.builds.running_or_pending.each(&:success)
expect(builds.pluck(:name)).to contain_exactly('build', 'test', 'deploy')
expect(builds.pluck(:status)).to contain_exactly('success', 'success', 'pending')
pipeline.builds.running_or_pending.each(&:drop)
expect(builds.pluck(:name)).to contain_exactly('build', 'test', 'deploy', 'cleanup')
expect(builds.pluck(:status)).to contain_exactly('success', 'success', 'failed', 'pending')
pipeline.builds.running_or_pending.each(&:success)
expect(builds.pluck(:status)).to contain_exactly('success', 'success', 'failed', 'success')
pipeline.reload
expect(pipeline.status).to eq('failed')
end
end
context 'when build is canceled in the second stage' do
it 'does not schedule builds after build has been canceled' do
expect(create_builds).to be_truthy
expect(builds.pluck(:name)).to contain_exactly('build')
expect(builds.pluck(:status)).to contain_exactly('pending')
pipeline.builds.running_or_pending.each(&:success)
expect(builds.running_or_pending).not_to be_empty
expect(builds.pluck(:name)).to contain_exactly('build', 'test')
expect(builds.pluck(:status)).to contain_exactly('success', 'pending')
pipeline.builds.running_or_pending.each(&:cancel)
expect(builds.running_or_pending).to be_empty
expect(pipeline.reload.status).to eq('canceled')
end
end
context 'when listing manual actions' do
it 'returns only for skipped builds' do
# currently all builds are created
expect(create_builds).to be_truthy
expect(manual_actions).to be_empty
# succeed stage build
pipeline.builds.running_or_pending.each(&:success)
expect(manual_actions).to be_empty
# succeed stage test
pipeline.builds.running_or_pending.each(&:success)
expect(manual_actions).to be_one # production
# succeed stage deploy
pipeline.builds.running_or_pending.each(&:success)
expect(manual_actions).to be_many # production and clear cache
end
def manual_actions
pipeline.manual_actions
end
end
end
context 'creates builds from .gitlab-ci.yml' do
let(:config) do
YAML.dump({
rspec: {
stage: 'test',
script: 'rspec'
},
rubocop: {
stage: 'test',
script: 'rubocop'
},
deploy: {
stage: 'deploy',
script: 'deploy'
}
})
end
# Uses the .gitlab-ci.yml config defined above and stubbed below
#
before do
stub_ci_pipeline_yaml_file(config)
create(:ci_build, :created, pipeline: pipeline, name: 'linux', stage: 'build', stage_idx: 0)
create(:ci_build, :created, pipeline: pipeline, name: 'mac', stage: 'build', stage_idx: 0)
end
it 'processes the pipeline and creates the remaining builds from .gitlab-ci.yml' do
# Initially there are two builds in the created state
expect(builds.count).to eq(0)
expect(all_builds.count).to eq(2)
# create_builds promotes the created builds to pending
expect(create_builds).to be_truthy
expect(builds.count).to eq(2)
expect(all_builds.count).to eq(2)
# When these builds succeed, the rest of the pipeline is created from .gitlab-ci.yml
# We end up with 2 succeeded and 2 pending (from the test stage), 5 builds in total (one more from the deploy stage)
succeed_pending
expect(create_builds).to be_truthy
expect(builds.success.count).to eq(2)
expect(builds.pending.count).to eq(2)
expect(all_builds.count).to eq(5)
# When the 2 pending builds from the test stage succeed,
# the deploy stage is queued and no new builds are created
succeed_pending
expect(create_builds).to be_truthy
expect(builds.pending.count).to eq(1)
expect(builds.success.count).to eq(4)
expect(all_builds.count).to eq(5)
# When the last pending build succeeds we have 5 succeeded builds in total; no new builds are created
succeed_pending
expect(create_builds).to be_falsey
expect(builds.success.count).to eq(5)
expect(all_builds.count).to eq(5)
end
end
end
end
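
To make the behaviour these specs encode easier to see at a glance, here is a minimal, hypothetical sketch of the stage-by-stage promotion they assert: builds start in `created` and one stage at a time is moved to `pending` once the earlier stages have finished. This is not the actual `Ci::ProcessPipelineService` code, and it deliberately ignores `when:`, `allow_failure` and manual actions, which the real service handles:

```ruby
# Hypothetical sketch -- not the code introduced by this MR.
Build = Struct.new(:name, :stage_idx, :status)

# Returns true if a stage was enqueued (created -> pending), false otherwise,
# matching the truthy/falsey expectations on create_builds in the specs above.
def process(builds)
  builds.group_by(&:stage_idx).sort.each do |idx, stage_builds|
    # Do not start this stage while an earlier stage is still pending or running.
    busy = builds.any? { |b| b.stage_idx < idx && %w[pending running].include?(b.status) }
    return false if busy

    created = stage_builds.select { |b| b.status == 'created' }
    next if created.empty?

    created.each { |b| b.status = 'pending' }  # enqueue the whole stage
    return true
  end
  false # nothing left to schedule
end

builds = [Build.new('linux', 0, 'created'), Build.new('rspec', 1, 'created')]
process(builds)                  # => true, 'linux' becomes pending
process(builds)                  # => false, stage 0 is still pending
builds.first.status = 'success'
process(builds)                  # => true, 'rspec' becomes pending
```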

View File

@ -1,241 +0,0 @@
require 'spec_helper'
describe CreateCommitBuildsService, services: true do
let(:service) { CreateCommitBuildsService.new }
let(:project) { FactoryGirl.create(:empty_project) }
let(:user) { create(:user) }
before do
stub_ci_pipeline_to_return_yaml_file
end
describe '#execute' do
context 'valid params' do
let(:pipeline) do
service.execute(project, user,
ref: 'refs/heads/master',
before: '00000000',
after: '31das312',
commits: [{ message: "Message" }]
)
end
it { expect(pipeline).to be_kind_of(Ci::Pipeline) }
it { expect(pipeline).to be_valid }
it { expect(pipeline).to be_persisted }
it { expect(pipeline).to eq(project.pipelines.last) }
it { expect(pipeline).to have_attributes(user: user) }
it { expect(pipeline.builds.first).to be_kind_of(Ci::Build) }
end
context "skip tag if there is no build for it" do
it "creates commit if there is appropriate job" do
result = service.execute(project, user,
ref: 'refs/tags/0_1',
before: '00000000',
after: '31das312',
commits: [{ message: "Message" }]
)
expect(result).to be_persisted
end
it "creates commit if there is no appropriate job but deploy job has right ref setting" do
config = YAML.dump({ deploy: { script: "ls", only: ["0_1"] } })
stub_ci_pipeline_yaml_file(config)
result = service.execute(project, user,
ref: 'refs/heads/0_1',
before: '00000000',
after: '31das312',
commits: [{ message: "Message" }]
)
expect(result).to be_persisted
end
end
it 'skips creating pipeline for refs without .gitlab-ci.yml' do
stub_ci_pipeline_yaml_file(nil)
result = service.execute(project, user,
ref: 'refs/heads/0_1',
before: '00000000',
after: '31das312',
commits: [{ message: 'Message' }]
)
expect(result).to be_falsey
expect(Ci::Pipeline.count).to eq(0)
end
it 'fails commits if yaml is invalid' do
message = 'message'
allow_any_instance_of(Ci::Pipeline).to receive(:git_commit_message) { message }
stub_ci_pipeline_yaml_file('invalid: file: file')
commits = [{ message: message }]
pipeline = service.execute(project, user,
ref: 'refs/tags/0_1',
before: '00000000',
after: '31das312',
commits: commits
)
expect(pipeline).to be_persisted
expect(pipeline.builds.any?).to be false
expect(pipeline.status).to eq('failed')
expect(pipeline.yaml_errors).not_to be_nil
end
context 'when commit contains a [ci skip] directive' do
let(:message) { "some message[ci skip]" }
let(:messageFlip) { "some message[skip ci]" }
let(:capMessage) { "some message[CI SKIP]" }
let(:capMessageFlip) { "some message[SKIP CI]" }
before do
allow_any_instance_of(Ci::Pipeline).to receive(:git_commit_message) { message }
end
it "skips builds creation if there is [ci skip] tag in commit message" do
commits = [{ message: message }]
pipeline = service.execute(project, user,
ref: 'refs/tags/0_1',
before: '00000000',
after: '31das312',
commits: commits
)
expect(pipeline).to be_persisted
expect(pipeline.builds.any?).to be false
expect(pipeline.status).to eq("skipped")
end
it "skips builds creation if there is [skip ci] tag in commit message" do
commits = [{ message: messageFlip }]
pipeline = service.execute(project, user,
ref: 'refs/tags/0_1',
before: '00000000',
after: '31das312',
commits: commits
)
expect(pipeline).to be_persisted
expect(pipeline.builds.any?).to be false
expect(pipeline.status).to eq("skipped")
end
it "skips builds creation if there is [CI SKIP] tag in commit message" do
commits = [{ message: capMessage }]
pipeline = service.execute(project, user,
ref: 'refs/tags/0_1',
before: '00000000',
after: '31das312',
commits: commits
)
expect(pipeline).to be_persisted
expect(pipeline.builds.any?).to be false
expect(pipeline.status).to eq("skipped")
end
it "skips builds creation if there is [SKIP CI] tag in commit message" do
commits = [{ message: capMessageFlip }]
pipeline = service.execute(project, user,
ref: 'refs/tags/0_1',
before: '00000000',
after: '31das312',
commits: commits
)
expect(pipeline).to be_persisted
expect(pipeline.builds.any?).to be false
expect(pipeline.status).to eq("skipped")
end
it "does not skips builds creation if there is no [ci skip] or [skip ci] tag in commit message" do
allow_any_instance_of(Ci::Pipeline).to receive(:git_commit_message) { "some message" }
commits = [{ message: "some message" }]
pipeline = service.execute(project, user,
ref: 'refs/tags/0_1',
before: '00000000',
after: '31das312',
commits: commits
)
expect(pipeline).to be_persisted
expect(pipeline.builds.first.name).to eq("staging")
end
it "skips builds creation if there is [ci skip] tag in commit message and yaml is invalid" do
stub_ci_pipeline_yaml_file('invalid: file: fiile')
commits = [{ message: message }]
pipeline = service.execute(project, user,
ref: 'refs/tags/0_1',
before: '00000000',
after: '31das312',
commits: commits
)
expect(pipeline).to be_persisted
expect(pipeline.builds.any?).to be false
expect(pipeline.status).to eq("skipped")
expect(pipeline.yaml_errors).to be_nil
end
end
it "skips build creation if there are already builds" do
allow_any_instance_of(Ci::Pipeline).to receive(:ci_yaml_file) { gitlab_ci_yaml }
commits = [{ message: "message" }]
pipeline = service.execute(project, user,
ref: 'refs/heads/master',
before: '00000000',
after: '31das312',
commits: commits
)
expect(pipeline).to be_persisted
expect(pipeline.builds.count(:all)).to eq(2)
pipeline = service.execute(project, user,
ref: 'refs/heads/master',
before: '00000000',
after: '31das312',
commits: commits
)
expect(pipeline).to be_persisted
expect(pipeline.builds.count(:all)).to eq(2)
end
it "creates commit with failed status if yaml is invalid" do
stub_ci_pipeline_yaml_file('invalid: file')
commits = [{ message: "some message" }]
pipeline = service.execute(project, user,
ref: 'refs/tags/0_1',
before: '00000000',
after: '31das312',
commits: commits
)
expect(pipeline).to be_persisted
expect(pipeline.status).to eq("failed")
expect(pipeline.builds.any?).to be false
end
context 'when there are no jobs for this pipeline' do
before do
config = YAML.dump({ test: { script: 'ls', only: ['feature'] } })
stub_ci_pipeline_yaml_file(config)
end
it 'does not create a new pipeline' do
result = service.execute(project, user,
ref: 'refs/heads/master',
before: '00000000',
after: '31das312',
commits: [{ message: 'some msg' }])
expect(result).to be_falsey
expect(Ci::Build.all).to be_empty
expect(Ci::Pipeline.count).to eq(0)
end
end
end
end

View File

@ -110,19 +110,15 @@ describe MergeRequests::MergeWhenBuildSucceedsService do
context 'properly handles multiple stages' do
let(:ref) { mr_merge_if_green_enabled.source_branch }
let(:build) { create(:ci_build, pipeline: pipeline, ref: ref, name: 'build', stage: 'build') }
let(:test) { create(:ci_build, pipeline: pipeline, ref: ref, name: 'test', stage: 'test') }
let!(:build) { create(:ci_build, :created, pipeline: pipeline, ref: ref, name: 'build', stage: 'build') }
let!(:test) { create(:ci_build, :created, pipeline: pipeline, ref: ref, name: 'test', stage: 'test') }
let(:pipeline) { create(:ci_empty_pipeline, ref: mr_merge_if_green_enabled.source_branch, project: project) }
before do
# This behavior of MergeRequest: we instantiate a new object
allow_any_instance_of(MergeRequest).to receive(:pipeline).and_wrap_original do
Ci::Pipeline.find(pipeline.id)
end
# We create test after the build
allow(pipeline).to receive(:create_next_builds).and_wrap_original do
test
end
end
it "doesn't merge if some stages failed" do

View File

@ -53,7 +53,13 @@ describe PostReceive do
subject { PostReceive.new.perform(pwd(project), key_id, base64_changes) }
context "creates a Ci::Pipeline for every change" do
before { stub_ci_pipeline_to_return_yaml_file }
before do
allow_any_instance_of(Ci::CreatePipelineService).to receive(:commit) do
OpenStruct.new(id: '123456')
end
allow_any_instance_of(Ci::CreatePipelineService).to receive(:branch?).and_return(true)
stub_ci_pipeline_to_return_yaml_file
end
it { expect{ subject }.to change{ Ci::Pipeline.count }.by(2) }
end