gitlab-org--gitlab-foss/app/services/merge_requests/refresh_service.rb

# frozen_string_literal: true
module MergeRequests
class RefreshService < MergeRequests::BaseService
include Gitlab::Utils::StrongMemoize
attr_reader :push
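# Entry point for the service. Wraps the pushed refs in a Gitlab::Git::Push
# and refreshes merge requests only for branch pushes; tag pushes are ignored.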
def execute(oldrev, newrev, ref)
@push = Gitlab::Git::Push.new(@project, oldrev, newrev, ref)
return true unless @push.branch_push?
refresh_merge_requests!
end
private
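# Orchestrates the refresh: closes orphaned or manually merged merge requests,
# reloads diffs, then updates pipelines, todos, system notes, notifications
# and hooks for every merge request originating from the pushed branch.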
def refresh_merge_requests!
# n + 1: https://gitlab.com/gitlab-org/gitlab-foss/issues/60289
Gitlab::GitalyClient.allow_n_plus_1_calls(&method(:find_new_commits))
# Be sure to close outstanding MRs before reloading them to avoid generating an
# empty diff during a manual merge
close_upon_missing_source_branch_ref
post_merge_manually_merged
link_forks_lfs_objects
reload_merge_requests
merge_requests_for_source_branch.each do |mr|
outdate_suggestions(mr)
refresh_pipelines_on_merge_requests(mr)
abort_auto_merges(mr)
mark_pending_todos_done(mr)
end
abort_ff_merge_requests_with_when_pipeline_succeeds
cache_merge_requests_closing_issues
merge_requests_for_source_branch.each do |mr|
# Leave a system note if a branch was deleted/added
if branch_added_or_removed?
comment_mr_branch_presence_changed(mr)
end
notify_about_push(mr)
mark_mr_as_wip_from_commits(mr)
execute_mr_web_hooks(mr)
end
true
end
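# Whether this push created or deleted the branch (memoized).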
def branch_added_or_removed?
strong_memoize(:branch_added_or_removed) do
@push.branch_added? || @push.branch_removed?
end
end
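# Close merge requests whose source branch no longer exists.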
def close_upon_missing_source_branch_ref
# MergeRequest#reload_diff ignores MRs that are not open. This means it won't
# create an `empty` diff for `closed` MRs without a source branch, keeping
# the latest diff state as the last _valid_ one.
merge_requests_for_source_branch.reject(&:source_branch_exists?).each do |mr|
MergeRequests::CloseService
.new(mr.target_project, @current_user)
.execute(mr)
end
end
# Collect open merge requests that target the same branch we pushed to,
# and close them if the push includes the merge request's last commit.
# We need this to close (as merged) merge requests that were merged into
# the target branch manually
# rubocop: disable CodeReuse/ActiveRecord
def post_merge_manually_merged
commit_ids = @commits.map(&:id)
merge_requests = @project.merge_requests.opened
.preload(:latest_merge_request_diff)
.where(target_branch: @push.branch_name).to_a
.select(&:diff_head_commit)
.select do |merge_request|
commit_ids.include?(merge_request.diff_head_sha) &&
merge_request.merge_request_diff.state != 'empty'
end
merge_requests = filter_merge_requests(merge_requests)
return if merge_requests.empty?
commit_analyze_enabled = Feature.enabled?(:branch_push_merge_commit_analyze, @project, default_enabled: true)
if commit_analyze_enabled
analyzer = Gitlab::BranchPushMergeCommitAnalyzer.new(
@commits.reverse,
relevant_commit_ids: merge_requests.map(&:diff_head_sha)
)
end
merge_requests.each do |merge_request|
if commit_analyze_enabled
merge_request.merge_commit_sha = analyzer.get_merge_commit(merge_request.diff_head_sha)
end
MergeRequests::PostMergeService
.new(merge_request.target_project, @current_user)
.execute(merge_request)
end
end
# rubocop: enable CodeReuse/ActiveRecord
# Link LFS objects that exist in forks but do not exist in the merge
# request's target project
def link_forks_lfs_objects
return unless @push.branch_updated?
merge_requests_for_forks.find_each do |mr|
LinkLfsObjectsService
.new(mr.target_project)
.execute(mr, oldrev: @push.oldrev, newrev: @push.newrev)
end
end
# Refresh merge request diff if we push to source or target branch of merge request
# Note: we should update merge requests from forks too
def reload_merge_requests
merge_requests = @project.merge_requests.opened
.by_source_or_target_branch(@push.branch_name).to_a
merge_requests += merge_requests_for_forks.to_a
filter_merge_requests(merge_requests).each do |merge_request|
if branch_and_project_match?(merge_request) || @push.force_push?
merge_request.reload_diff(current_user)
# Clear existing merge error if the push was directed at the
# source branch. Clearing the error when the target branch
# changes will hide the error from the user.
merge_request.merge_error = nil
elsif merge_request.merge_request_diff.includes_any_commits?(push_commit_ids)
merge_request.reload_diff(current_user)
end
merge_request.mark_as_unchecked
end
# Upcoming method calls need the refreshed version of
# @source_merge_requests diffs (for MergeRequest#commit_shas for instance).
merge_requests_for_source_branch(reload: true)
end
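# IDs of the commits collected by #find_new_commits, memoized for reuse.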
def push_commit_ids
@push_commit_ids ||= @commits.map(&:id)
end
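# True when the merge request's source project and branch are the ones
# that were pushed to.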
def branch_and_project_match?(merge_request)
merge_request.source_project == @project &&
merge_request.source_branch == @push.branch_name
end
def outdate_suggestions(merge_request)
outdate_service.execute(merge_request)
end
def outdate_service
@outdate_service ||= Suggestions::OutdateService.new
end
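# Create a pipeline for the merge request and schedule an asynchronous
# update of its head pipeline.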
def refresh_pipelines_on_merge_requests(merge_request)
create_pipeline_for(merge_request, current_user)
UpdateHeadPipelineForMergeRequestWorker.perform_async(merge_request.id)
end
def abort_auto_merges(merge_request)
abort_auto_merge(merge_request, 'source branch was updated')
end
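# When the project requires fast-forward merges, a push to the target branch
# means merge requests set to merge when the pipeline succeeds may now need
# a rebase, so their auto merges are aborted.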
def abort_ff_merge_requests_with_when_pipeline_succeeds
return unless @project.ff_merge_must_be_possible?
merge_requests_with_auto_merge_enabled_to(@push.branch_name).each do |merge_request|
next unless merge_request.auto_merge_strategy == AutoMergeService::STRATEGY_MERGE_WHEN_PIPELINE_SUCCEEDS
next unless merge_request.should_be_rebased?
abort_auto_merge_with_todo(merge_request, 'target branch was updated')
end
end
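# Abort the auto merge and, when that succeeds, create a todo noting that
# the merge request became unmergeable.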
def abort_auto_merge_with_todo(merge_request, reason)
response = abort_auto_merge(merge_request, reason)
response = ServiceResponse.new(response)
return unless response.success?
todo_service.merge_request_became_unmergeable(merge_request)
end
def merge_requests_with_auto_merge_enabled_to(target_branch)
@project
.merge_requests
.by_target_branch(target_branch)
.with_auto_merge_enabled
end
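# Mark pending todos on the merge request as done for the user who pushed.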
def mark_pending_todos_done(merge_request)
todo_service.merge_request_push(merge_request, @current_user)
end
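# Populate @commits with the commits introduced by this push: none for a
# deleted branch, the commits since the merge base with an existing merge
# request for a newly added (restored) branch, and the commits between
# oldrev and newrev otherwise.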
def find_new_commits
if @push.branch_added?
@commits = []
merge_request = merge_requests_for_source_branch.first
return unless merge_request
begin
# Since any number of commits could have been made to the restored branch,
# find the common root to see what has been added.
common_ref = @project.repository.merge_base(merge_request.diff_head_sha, @push.newrev)
# If a commit no longer exists in this repo, gitlab_git throws
# a Rugged::OdbError. This is fixed in https://gitlab.com/gitlab-org/gitlab_git/merge_requests/52
@commits = @project.repository.commits_between(common_ref, @push.newrev) if common_ref
rescue
end
elsif @push.branch_removed?
# No commits for a deleted branch.
@commits = []
else
@commits = @project.repository.commits_between(@push.oldrev, @push.newrev)
end
end
# Add a comment to merge requests when their source branch was deleted or added
def comment_mr_branch_presence_changed(merge_request)
presence = @push.branch_added? ? :add : :delete
SystemNoteService.change_branch_presence(
merge_request, merge_request.project, @current_user,
:source, @push.branch_name, presence)
end
# Add comment about pushing new commits to merge requests and send notification emails
def notify_about_push(merge_request)
return unless @commits.present?
mr_commit_ids = Set.new(merge_request.commit_shas)
new_commits, existing_commits = @commits.partition do |commit|
mr_commit_ids.include?(commit.id)
end
SystemNoteService.add_commits(merge_request, merge_request.project,
@current_user, new_commits,
existing_commits, @push.oldrev)
notification_service.push_to_merge_request(merge_request, @current_user, new_commits: new_commits, existing_commits: existing_commits)
end
def mark_mr_as_wip_from_commits(merge_request)
return unless @commits.present?
commit_shas = merge_request.commit_shas
wip_commit = @commits.detect do |commit|
commit.work_in_progress? && commit_shas.include?(commit.sha)
end
if wip_commit && !merge_request.work_in_progress?
merge_request.update(title: merge_request.wip_title)
SystemNoteService.add_merge_request_wip_from_commit(
merge_request,
merge_request.project,
@current_user,
wip_commit
)
end
end
# Call the merge request webhooks with the updated branches
def execute_mr_web_hooks(merge_request)
execute_hooks(merge_request, 'update', old_rev: @push.oldrev)
end
# If the merge request closes any issues, save this information in the
# `MergeRequestsClosingIssues` model (as a performance optimization).
# rubocop: disable CodeReuse/ActiveRecord
def cache_merge_requests_closing_issues
@project.merge_requests.where(source_branch: @push.branch_name).each do |merge_request|
merge_request.cache_merge_request_closes_issues!(@current_user)
end
end
# rubocop: enable CodeReuse/ActiveRecord
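# Deduplicate and drop merge requests whose source project no longer exists.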
def filter_merge_requests(merge_requests)
merge_requests.uniq.select(&:source_project)
end
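# Open merge requests whose source branch is the pushed branch, memoized.
# Pass reload: true to pick up freshly reloaded diffs.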
def merge_requests_for_source_branch(reload: false)
@source_merge_requests = nil if reload
@source_merge_requests ||= merge_requests_for(@push.branch_name)
end
# rubocop: disable CodeReuse/ActiveRecord
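# Open merge requests whose source branch lives in this project but whose
# target is a different project (cross-project / fork merge requests).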
def merge_requests_for_forks
@merge_requests_for_forks ||=
MergeRequest.opened
.where(source_branch: @push.branch_name, source_project: @project)
.where.not(target_project: @project)
end
# rubocop: enable CodeReuse/ActiveRecord
end
end
MergeRequests::RefreshService.prepend_if_ee('EE::MergeRequests::RefreshService')