589b2db06c
This sets up all the basics for importing Phabricator tasks into GitLab issues. To import all tasks from a Phabricator instance into GitLab, we import them into a new project that has its repository disabled. The import is hooked into the regular ProjectImport setup but, like the GitHub parallel importer, takes care of all the imports itself. In this iteration we import each page of tasks in a separate Sidekiq job. The first thing we do when requesting a new page of tasks is schedule the next page to be imported, but to avoid deadlocks we only allow a single job per worker type to run at the same time. For now we only import basic Issue information; this should be extended to richer information later.
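A minimal sketch of that paging flow is below. The worker name, the Conduit client, and the helper methods are assumptions for illustration only, not the importer's actual classes; only Sidekiq::Worker and perform_async are real Sidekiq API.

# Illustrative sketch of the paging flow described above -- class and
# helper names are hypothetical, not the importer's real API.
require 'sidekiq'

class ImportTasksWorker
  include Sidekiq::Worker

  def perform(project_id, after_cursor = nil)
    # Hypothetical client wrapping Phabricator's maniphest.search call;
    # assumed to return a Conduit::Response like the one defined below.
    response = conduit_client(project_id).tasks(after: after_cursor)

    # Schedule the next page before importing the current one so paging
    # keeps moving; a separate guard allows only a single job per worker
    # type to run at the same time.
    next_cursor = response.pagination&.next_page
    self.class.perform_async(project_id, next_cursor) if next_cursor

    # Only basic Issue information is mapped in this first iteration.
    response.data.each do |task|
      create_issue_from(project_id, task) # hypothetical mapping helper
    end
  end
end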
60 lines · 1.3 KiB · Ruby
# frozen_string_literal: true

module Gitlab
  module PhabricatorImport
    module Conduit
      class Response
        # Parses a raw HTTP response from the Conduit API, raising a
        # ResponseError for transport failures, invalid JSON, or errors
        # reported by Phabricator itself.
        def self.parse!(http_response)
          unless http_response.success?
            raise Gitlab::PhabricatorImport::Conduit::ResponseError,
                  "Phabricator responded with #{http_response.status}"
          end

          response = new(JSON.parse(http_response.body))

          unless response.success?
            raise ResponseError,
                  "Phabricator Error: #{response.error_code}: #{response.error_info}"
          end

          response
        rescue JSON::JSONError => e
          raise ResponseError.new(e)
        end

        def initialize(json)
          @json = json
        end

        def success?
          error_code.nil?
        end

        def error_code
          json['error_code']
        end

        def error_info
          json['error_info']
        end

        def data
          json_result&.fetch('data')
        end

        # Wraps the cursor information in a Pagination object when the
        # result contains one.
        def pagination
          return unless cursor_info = json_result&.fetch('cursor')

          @pagination ||= Pagination.new(cursor_info)
        end

        private

        attr_reader :json

        def json_result
          json['result']
        end
      end
    end
  end
end
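For context, parsing a Conduit response might look roughly like the following. The endpoint, the token handling, and the use of Gitlab::HTTP are assumptions for illustration; parse! only requires an object responding to success?, status, and body.

# Hypothetical usage -- the URL, token parameter, and HTTP wrapper are
# illustrative assumptions, not taken from the importer itself.
http_response = Gitlab::HTTP.post(
  'https://phabricator.example.com/api/maniphest.search',
  body: { 'api.token' => api_token }
)

response = Gitlab::PhabricatorImport::Conduit::Response.parse!(http_response)

response.data        # the 'data' array from the 'result' payload
response.pagination  # Pagination built from the cursor, or nil without one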