f11030173b
Instead of inserting a row into an external database after each example, we save the CI profiling reports into the `rspec_profiling` directory and insert the data in the update-tests-metadata CI stage. This should make each spec run faster and also reduce the number of PostgreSQL connections needed by concurrent CI builds. `scripts/insert-rspec-profiling-data` also inserts one file at a time via the PostgreSQL COPY command for faster inserts. The one side effect is that the `created_at` and `updated_at` timestamps aren't available, since they aren't generated in the CSV. Closes https://gitlab.com/gitlab-org/gitlab-ee/issues/10154
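The first half of this change is the collection side: each CI job appends its per-example rows to a CSV file under `rspec_profiling/` instead of opening a database connection. A minimal pure-Ruby sketch of that flow (the helper name, job id, and column layout are illustrative assumptions, not the `rspec_profiling` gem's actual schema):

```ruby
require 'csv'
require 'fileutils'
require 'tmpdir'

# Hypothetical helper: append one profiling row for a spec example to a
# per-job CSV under the rspec_profiling directory, creating it on demand.
# No database connection is needed while the suite runs.
def record_example(dir, job_id, row)
  FileUtils.mkdir_p(dir)
  path = File.join(dir, "#{job_id}.csv")
  CSV.open(path, 'a') { |csv| csv << row }
  path
end

dir = File.join(Dir.mktmpdir, 'rspec_profiling')
record_example(dir, 'job-1', ['spec/models/user_spec.rb', 42, 0.31])
record_example(dir, 'job-1', ['spec/models/user_spec.rb', 57, 0.12])

# The update-tests-metadata stage later sweeps up every CSV in one pass.
puts Dir[File.join(dir, '*.csv')].size
```

Deferring the writes this way is what lets each spec job run without holding a PostgreSQL connection open for its whole duration.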
47 lines
1.2 KiB
Ruby
Executable file
#!/usr/bin/env ruby

require 'csv'
require 'rspec_profiling'
require 'postgres-copy'

module RspecProfiling
  module Collectors
    class PSQL
      def establish_connection
        # This disables the automatic creation of the database and
        # table. In the future, we may want a way to specify the host of
        # the database to connect so that we can call #install.
        Result.establish_connection(results_url)
      end

      def prepared?
        connection.data_source_exists?(table)
      end

      def results_url
        ENV['RSPEC_PROFILING_POSTGRES_URL']
      end

      class Result < ActiveRecord::Base
        acts_as_copy_target
      end
    end
  end
end

def insert_data(path)
  puts "#{Time.now} Inserting CI stats..."

  collector = RspecProfiling::Collectors::PSQL.new
  collector.install

  files = Dir[File.join(path, "*.csv")]

  files.each do |filename|
    puts "#{Time.now} Inserting #{filename}..."
    result = RspecProfiling::Collectors::PSQL::Result.copy_from(filename)
    puts "#{Time.now} Inserted #{result.cmd_tuples} lines in #{filename}, DB response: #{result.cmd_status}"
  end
end

insert_data('rspec_profiling') if ENV['RSPEC_PROFILING_POSTGRES_URL'].present?
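The discovery loop in `insert_data` can be exercised without a database by stubbing out the COPY step. A self-contained sketch of the file-discovery and per-file accounting (the directory and CSV contents are made up for illustration):

```ruby
require 'csv'
require 'fileutils'
require 'tmpdir'

# Build a fake rspec_profiling directory holding two CI report files.
path = File.join(Dir.mktmpdir, 'rspec_profiling')
FileUtils.mkdir_p(path)
CSV.open(File.join(path, 'job1.csv'), 'w') { |c| c << ['a_spec.rb', 1, 0.2] }
CSV.open(File.join(path, 'job2.csv'), 'w') { |c| c << ['b_spec.rb', 9, 0.4] }

# Same glob the script uses; in the real script each file becomes one
# COPY command instead of one INSERT per row.
files = Dir[File.join(path, '*.csv')]
total = files.sum { |f| CSV.read(f).size }
puts "#{files.size} files, #{total} rows"  # → 2 files, 2 rows
```

Because each file is loaded with a single COPY, the number of round trips scales with the number of CI jobs rather than the number of spec examples.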