03b020f2e4
Previously we scheduled a worker to just sum this, but we were running into performance issues when the build table got too large. The code is now updated so that this column is updated immediately: it is incremented or decremented by the correct amount whenever artifacts are created or deleted. As a performance optimization, we do not update this statistic when a project is deleted, because that could result in many updates for a project with many builds.
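For context, a minimal sketch of the callback-based approach described above, written as plain ActiveRecord code. It assumes the artifact model has a size column; the other names (ProjectStatistics, build_artifacts_size, pending_delete?) are illustrative assumptions, not the actual change:

  class JobArtifact < ApplicationRecord
    belongs_to :project

    # Keep the project-level counter in sync from the model itself,
    # instead of recomputing it in a scheduled worker.
    after_create  :add_size_to_project_statistics
    after_destroy :subtract_size_from_project_statistics

    private

    def add_size_to_project_statistics
      update_project_statistics(size)
    end

    def subtract_size_from_project_statistics
      # Skip the per-artifact decrement when the whole project is being
      # removed (pending_delete? is an assumed flag here), so a project
      # with many builds does not fan out into many statistic updates.
      return if project.pending_delete?

      update_project_statistics(-size)
    end

    def update_project_statistics(delta)
      # Atomic SQL increment/decrement; avoids a read-modify-write race.
      ProjectStatistics
        .where(project_id: project_id)
        .update_all(["build_artifacts_size = build_artifacts_size + ?", delta])
    end
  end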
41 lines · 894 B · Ruby
class JobArtifactUploader < GitlabUploader
  extend Workhorse::UploadPath
  include ObjectStorage::Concern

  ObjectNotReadyError = Class.new(StandardError)

  storage_options Gitlab.config.artifacts

  def cached_size
    return model.size if model.size.present? && !model.file_changed?

    size
  end

  def store_dir
    dynamic_segment
  end

  def open
    if file_storage?
      File.open(path, "rb") if path
    else
      ::Gitlab::Ci::Trace::HttpIO.new(url, cached_size) if url
    end
  end

  private

  def dynamic_segment
    raise ObjectNotReadyError, 'JobArtifact is not ready' unless model.id

    creation_date = model.created_at.utc.strftime('%Y_%m_%d')

    File.join(disk_hash[0..1], disk_hash[2..3], disk_hash,
              creation_date, model.job_id.to_s, model.id.to_s)
  end

  def disk_hash
    @disk_hash ||= Digest::SHA2.hexdigest(model.project_id.to_s)
  end
end
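As an illustration of the layout dynamic_segment produces (the values below are hypothetical): an artifact with project_id 1, job_id 42 and id 7, created on 2018-01-15, has disk_hash = Digest::SHA2.hexdigest("1"), so store_dir resolves to:

  6b/86/6b86b273ff34fce19d6b804eff5a3f5747ada4eaa22f1d49c01e52ddb7875b4b/2018_01_15/42/7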