Add latest changes from gitlab-org/gitlab@master

This commit is contained in:
GitLab Bot 2021-07-30 03:08:43 +00:00
parent 420215876f
commit ad7214cbd7
13 changed files with 227 additions and 72 deletions

View File

@ -0,0 +1,7 @@
---
name: vulnerability_finding_replace_metadata
introduced_by_url: https://gitlab.com/gitlab-org/gitlab/-/merge_requests/66868
rollout_issue_url: https://gitlab.com/gitlab-org/gitlab/-/issues/337253
group: group::threat insights
type: development
default_enabled: false

View File

@ -10,6 +10,10 @@ value_type: number
status: data_available
time_frame: all
data_source: redis
instrumentation_class: RedisMetric
options:
counter_class: SourceCodeCounter
event: pushes
distribution:
- ce
- ee

View File

@ -11,7 +11,7 @@ This guide describes how to develop Service Ping metrics using metrics instrumen
## Nomenclature
- **Instrumentation class**:
- Inherits one of the metric classes: `DatabaseMetric`, `RedisMetric`, `RedisHLLMetric` or `GenericMetric`.
- Implements the logic that calculates the value for a Service Ping metric.
- **Metric definition**
@ -24,7 +24,7 @@ This guide describes how to develop Service Ping metrics using metrics instrumen
A metric definition has the [`instrumentation_class`](metrics_dictionary.md) field, which can be set to a class.
The defined instrumentation class should inherit from one of the existing metric classes: `DatabaseMetric`, `RedisMetric`, `RedisHLLMetric`, or `GenericMetric`.
Using the instrumentation classes ensures that metrics can fail safe individually, without breaking the entire
process of Service Ping generation.
@ -51,6 +51,26 @@ module Gitlab
end
```
## Redis metrics
[Example of a merge request that adds a `Redis` metric](https://gitlab.com/gitlab-org/gitlab/-/merge_requests/66582).
Count the total number of `source_code_pushes` events.
Required options:
- `event`: the event name.
- `counter_class`: one of the counter classes from the `Gitlab::UsageDataCounters` namespace; it should implement the `read` method or inherit it from `BaseCounter`.
```yaml
time_frame: all
data_source: redis
instrumentation_class: 'RedisMetric'
options:
event: pushes
counter_class: SourceCodeCounter
```
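As a standalone illustration (not GitLab code), the only contract a `counter_class` must satisfy here is a class-level `read` method. The sketch below backs the counter with an in-memory hash instead of Redis, and the class name is hypothetical:

```ruby
# Hypothetical stand-in for a Gitlab::UsageDataCounters counter class.
# A plain hash replaces Redis; only the `read` interface matters here.
class SourceCodeCounterSketch
  COUNTS = Hash.new(0)

  # Increment the stored total for an event (Redis INCR in a real counter).
  def self.count(event)
    COUNTS[event] += 1
  end

  # Return the stored total for an event (Redis GET in a real counter).
  def self.read(event)
    COUNTS[event]
  end
end

3.times { SourceCodeCounterSketch.count(:pushes) }
puts SourceCodeCounterSketch.read(:pushes) # => 3
```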
## Redis HyperLogLog metrics
[Example of a merge request that adds a `RedisHLL` metric](https://gitlab.com/gitlab-org/gitlab/-/merge_requests/61685).
@ -60,7 +80,7 @@ Count unique values for `i_quickactions_approve` event.
```yaml
time_frame: 28d
data_source: redis_hll
instrumentation_class: 'RedisHLLMetric'
options:
events:
- i_quickactions_approve
@ -91,13 +111,13 @@ end
There is support for:
- `count`, `distinct_count` for [database metrics](#database-metrics).
- [Redis metrics](#redis-metrics).
- [Redis HLL metrics](#redis-hyperloglog-metrics).
- [Generic metrics](#generic-metrics), which are metrics based on settings or configurations.
Currently, there is no support for:
- `add`, `sum`, `histogram`, `estimate_batch_distinct_count` for database metrics.
You can [track the progress to support these](https://gitlab.com/groups/gitlab-org/-/epics/6118).
@ -107,7 +127,7 @@ To create a stub instrumentation for a Service Ping metric, you can use a dedica
The generator takes the class name as an argument and the following options:
- `--type=TYPE` Required. Indicates the metric type. It must be one of: `database`, `generic`, `redis`.
- `--operation` Required for `database` type. It must be one of: `count`, `distinct_count`.
- `--ee` Indicates if the metric is for EE.

View File

@ -232,7 +232,7 @@ module Backup
end
def folders_to_backup
FOLDERS_TO_BACKUP.select { |name| !skipped?(name) && Dir.exist?(File.join(backup_path, name)) }
end
def disabled_features

View File

@ -11,7 +11,8 @@ module Gitlab
ALLOWED_SUPERCLASSES = {
generic: 'Generic',
database: 'Database',
redis: 'Redis'
}.freeze
ALLOWED_OPERATIONS = %w(count distinct_count).freeze
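A rough sketch of how the allowed-superclass table above can drive the generator's class resolution (standalone Ruby; the `metric_superclass` helper is hypothetical, not the generator's actual API):

```ruby
# Mirror of the ALLOWED_SUPERCLASSES mapping from the generator.
ALLOWED_SUPERCLASSES = { generic: 'Generic', database: 'Database', redis: 'Redis' }.freeze

# Resolve a --type option to a metric superclass name, rejecting unknown types.
def metric_superclass(type)
  name = ALLOWED_SUPERCLASSES.fetch(type.to_sym) do
    raise ArgumentError, "Unknown metric type: #{type}"
  end
  "#{name}Metric"
end

puts metric_superclass('redis') # => "RedisMetric"
```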

View File

@ -0,0 +1,49 @@
# frozen_string_literal: true
module Gitlab
module Usage
module Metrics
module Instrumentations
# Usage example
#
# In metric YAML definition:
#
# instrumentation_class: RedisMetric
# options:
# event: pushes
# counter_class: SourceCodeCounter
#
class RedisMetric < BaseMetric
def initialize(time_frame:, options: {})
super
raise ArgumentError, "'event' option is required" unless metric_event.present?
raise ArgumentError, "'counter_class' option is required" unless counter_class_name.present?
end
def metric_event
options[:event]
end
def counter_class_name
options[:counter_class]
end
def counter_class
"Gitlab::UsageDataCounters::#{counter_class_name}".constantize
end
def value
redis_usage_data do
counter_class.read(metric_event)
end
end
def suggested_name
Gitlab::Usage::Metrics::NameSuggestion.for(:redis)
end
end
end
end
end
end
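The lookup-and-read pattern in `value` above can be sketched standalone. Since `constantize` comes from ActiveSupport, this sketch substitutes `Object.const_get`, and the counter module is a stub (assumptions, not GitLab code):

```ruby
# Stub namespace standing in for Gitlab::UsageDataCounters.
module UsageDataCounters
  module SourceCodeCounter
    # Fixed value standing in for a Redis-backed total.
    def self.read(_event)
      42
    end
  end
end

# Resolve the counter class by name and read the event total,
# as RedisMetric#value does with constantize + read.
def metric_value(counter_class:, event:)
  Object.const_get("UsageDataCounters::#{counter_class}").read(event)
end

puts metric_value(counter_class: 'SourceCodeCounter', event: :pushes) # => 42
```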

View File

@ -31,7 +31,7 @@ module Sidebars
private
def packages_registry_menu_item
if packages_registry_disabled?
return ::Sidebars::NilMenuItem.new(item_id: :packages_registry)
end
@ -58,7 +58,7 @@ module Sidebars
end
def infrastructure_registry_menu_item
if Feature.disabled?(:infrastructure_registry_page, context.current_user, default_enabled: :yaml) || packages_registry_disabled?
return ::Sidebars::NilMenuItem.new(item_id: :infrastructure_registry)
end
@ -69,6 +69,10 @@ module Sidebars
item_id: :infrastructure_registry
)
end
def packages_registry_disabled?
!::Gitlab.config.packages.enabled || !can?(context.current_user, :read_package, context.project)
end
end
end
end

View File

@ -1,7 +1,6 @@
# frozen_string_literal: true
require 'octokit'
# rubocop:disable Rails/Pluck
module QA
@ -21,11 +20,10 @@ module QA
let(:user) do
Resource::User.fabricate_via_api! do |resource|
resource.api_client = api_client
resource.hard_delete_on_api_removal = true
end
end
let(:github_repo) { ENV['QA_LARGE_GH_IMPORT_REPO'] || 'rspec/rspec-core' }
let(:github_client) do
Octokit.middleware = Faraday::RackBuilder.new do |builder|
builder.response(:logger, logger, headers: false, bodies: false)
@ -98,13 +96,19 @@ module QA
end
after do |example|
# skip saving data if example is skipped or failed before import finished
next if example.pending?
user.remove_via_api!
next unless defined?(@import_time)
# save data for comparison after run finished
save_json(
"data",
{
import_time: @import_time,
github: {
project_name: github_repo,
branches: gh_branches,
commits: gh_commits,
labels: gh_labels,
@ -113,6 +117,7 @@ module QA
issues: gh_issues
},
gitlab: {
project_name: imported_project.path_with_namespace,
branches: gl_branches,
commits: gl_commits,
labels: gl_labels,
@ -125,6 +130,8 @@ module QA
end
it 'imports large Github repo via api' do
start = Time.now
imported_project # import the project
fetch_github_objects # fetch all objects right after import has started
@ -132,6 +139,7 @@ module QA
duration: 3600,
interval: 30
)
@import_time = Time.now - start
aggregate_failures do
verify_repository_import
@ -146,7 +154,7 @@ module QA
#
# @return [void]
def fetch_github_objects
logger.debug("== Fetching objects for github repo: '#{github_repo}' ==")
gh_repo
gh_branches
@ -161,7 +169,7 @@ module QA
#
# @return [void]
def verify_repository_import
logger.debug("== Verifying repository import ==")
expect(imported_project.description).to eq(gh_repo.description)
# check via include, importer creates more branches
# https://gitlab.com/gitlab-org/gitlab/-/issues/332711
@ -173,7 +181,7 @@ module QA
#
# @return [void]
def verify_merge_requests_import
logger.debug("== Verifying merge request import ==")
verify_mrs_or_issues('mr')
end
@ -181,7 +189,7 @@ module QA
#
# @return [void]
def verify_issues_import
logger.debug("== Verifying issue import ==")
verify_mrs_or_issues('issue')
end
@ -189,15 +197,16 @@ module QA
#
# @return [void]
def verify_labels_import
logger.debug("== Verifying label import ==")
# check via include, additional labels can be inherited from parent group
expect(gl_labels).to include(*gh_labels)
end
# Verify milestones import
#
# @return [void]
def verify_milestones_import
logger.debug("== Verifying milestones import ==")
expect(gl_milestones).to match_array(gh_milestones)
end
@ -217,8 +226,9 @@ module QA
eq(actual.length),
"Expected to contain same amount of #{type}s. Expected: #{expected.length}, actual: #{actual.length}"
)
logger.debug("= Comparing #{type}s =")
actual.each do |title, actual_item|
print "." # indicate that it is still going but don't spam the output with newlines
expected_item = expected[title]
@ -235,34 +245,47 @@ module QA
)
expect(expected_item[:comments]).to match_array(actual_item[:comments])
end
puts # print newline after last print to make output pretty
end
# Imported project branches
#
# @return [Array]
def gl_branches
@gl_branches ||= begin
logger.debug("= Fetching branches =")
imported_project.repository_branches(auto_paginate: true).map { |b| b[:name] }
end
end
# Imported project commits
#
# @return [Array]
def gl_commits
@gl_commits ||= begin
logger.debug("= Fetching commits =")
imported_project.commits(auto_paginate: true).map { |c| c[:id] }
end
end
# Imported project labels
#
# @return [Array]
def gl_labels
@gl_labels ||= begin
logger.debug("= Fetching labels =")
imported_project.labels(auto_paginate: true).map { |label| label.slice(:name, :color) }
end
end
# Imported project milestones
#
# @return [Array]
def gl_milestones
@gl_milestones ||= begin
logger.debug("= Fetching milestones =")
imported_project.milestones(auto_paginate: true).map { |ms| ms.slice(:title, :description) }
end
end
# Imported project merge requests # Imported project merge requests
@ -270,19 +293,17 @@ module QA
# @return [Hash]
def mrs
@mrs ||= begin
logger.debug("= Fetching merge requests =")
imported_mrs = imported_project.merge_requests(auto_paginate: true)
logger.debug("= Transforming merge request objects for comparison =")
imported_mrs.each_with_object({}) do |mr, hash|
resource = Resource::MergeRequest.init do |resource|
resource.project = imported_project
resource.iid = mr[:iid]
resource.api_client = api_client
end
hash[mr[:title]] = {
body: mr[:description],
comments: resource.comments(auto_paginate: true)
# remove system notes
@ -290,13 +311,6 @@ module QA
.map { |c| sanitize(c[:body]) }
}
end
end
end
@ -305,30 +319,21 @@ module QA
# @return [Hash]
def gl_issues
@gl_issues ||= begin
logger.debug("= Fetching issues =")
imported_issues = imported_project.issues(auto_paginate: true)
logger.debug("= Transforming issue objects for comparison =")
imported_issues.each_with_object({}) do |issue, hash|
resource = Resource::Issue.init do |issue_resource|
issue_resource.project = imported_project
issue_resource.iid = issue[:iid]
issue_resource.api_client = api_client
end
hash[issue[:title]] = {
body: issue[:description],
comments: resource.comments(auto_paginate: true).map { |c| sanitize(c[:body]) }
}
end
end
end

View File

@ -55,12 +55,13 @@ module Matchers
def wait_and_check(actual, expectation_name)
attempt = 0
QA::Runtime::Logger.debug("Running eventually matcher with '#{operator_msg}' operator")
QA::Support::Retrier.retry_until(
max_attempts: @attempts,
max_duration: @duration,
sleep_interval: @interval || 0.5
) do
QA::Runtime::Logger.debug("evaluating expectation, attempt: #{attempt += 1}")
public_send(expectation_name, actual)
rescue RSpec::Expectations::ExpectationNotMetError, QA::Resource::ApiFabricator::ResourceNotFoundError

View File

@ -122,3 +122,4 @@ UsageData/InstrumentationSuperclass:
- :DatabaseMetric
- :GenericMetric
- :RedisHLLMetric
- :RedisMetric

View File

@ -12,20 +12,13 @@ RSpec.describe Backup::Manager do
before do before do
allow(progress).to receive(:puts) allow(progress).to receive(:puts)
allow(progress).to receive(:print) allow(progress).to receive(:print)
end
describe '#pack' do
let(:expected_backup_contents) { %w(repositories db uploads.tar.gz builds.tar.gz artifacts.tar.gz pages.tar.gz lfs.tar.gz backup_information.yml) }
let(:tar_file) { '1546300800_2019_01_01_12.3_gitlab_backup.tar' }
let(:tar_system_options) { { out: [tar_file, 'w', Gitlab.config.backup.archive_permissions] } }
let(:tar_cmdline) { ['tar', '-cf', '-', *expected_backup_contents, tar_system_options] }
let(:backup_information) do
{
backup_created_at: Time.zone.parse('2019-01-01'),
@ -36,20 +29,20 @@ RSpec.describe Backup::Manager do
before do
allow(ActiveRecord::Base.connection).to receive(:reconnect!)
allow(Kernel).to receive(:system).and_return(true)
allow(YAML).to receive(:load_file).and_return(backup_information)
::Backup::Manager::FOLDERS_TO_BACKUP.each do |folder|
allow(Dir).to receive(:exist?).with(File.join(Gitlab.config.backup.path, folder)).and_return(true)
end
allow(subject).to receive(:backup_information).and_return(backup_information)
allow(subject).to receive(:upload)
end
it 'executes tar' do
subject.pack
expect(Kernel).to have_received(:system).with(*tar_cmdline)
end
context 'when BACKUP is set' do
@ -62,6 +55,37 @@ RSpec.describe Backup::Manager do
expect(Kernel).to have_received(:system).with(*tar_cmdline)
end
end
context 'when skipped is set in backup_information.yml' do
let(:expected_backup_contents) { %w{db uploads.tar.gz builds.tar.gz artifacts.tar.gz pages.tar.gz lfs.tar.gz backup_information.yml} }
let(:backup_information) do
{
backup_created_at: Time.zone.parse('2019-01-01'),
gitlab_version: '12.3',
skipped: ['repositories']
}
end
it 'executes tar' do
subject.pack
expect(Kernel).to have_received(:system).with(*tar_cmdline)
end
end
context 'when a directory does not exist' do
let(:expected_backup_contents) { %w{db uploads.tar.gz builds.tar.gz artifacts.tar.gz pages.tar.gz lfs.tar.gz backup_information.yml} }
before do
expect(Dir).to receive(:exist?).with(File.join(Gitlab.config.backup.path, 'repositories')).and_return(false)
end
it 'executes tar' do
subject.pack
expect(Kernel).to have_received(:system).with(*tar_cmdline)
end
end
end
describe '#remove_old' do

View File

@ -0,0 +1,23 @@
# frozen_string_literal: true
require 'spec_helper'
RSpec.describe Gitlab::Usage::Metrics::Instrumentations::RedisMetric, :clean_gitlab_redis_shared_state do
before do
4.times do
Gitlab::UsageDataCounters::SourceCodeCounter.count(:pushes)
end
end
let(:expected_value) { 4 }
it_behaves_like 'a correct instrumented metric value', { options: { event: 'pushes', counter_class: 'SourceCodeCounter' } }
it 'raises an exception if event option is not present' do
expect { described_class.new(time_frame: 'all', options: { counter_class: 'SourceCodeCounter' }) }.to raise_error(ArgumentError)
end
it 'raises an exception if counter_class option is not present' do
expect { described_class.new(time_frame: 'all', options: { event: 'pushes' }) }.to raise_error(ArgumentError)
end
end

View File

@ -51,8 +51,8 @@ RSpec.describe Sidebars::Projects::Menus::PackagesRegistriesMenu do
context 'when Container Registry is not visible' do
let(:registry_enabled) { false }
it 'does not display menu link' do
expect(subject.render?).to eq false
end
end
end
@ -130,10 +130,26 @@ RSpec.describe Sidebars::Projects::Menus::PackagesRegistriesMenu do
is_expected.not_to be_nil
end
context 'when config package setting is disabled' do
it 'does not add the menu item to the list' do
stub_config(packages: { enabled: false })
is_expected.to be_nil
end
end
context 'when user cannot read packages' do
let(:user) { nil }
it 'does not add the menu item to the list' do
is_expected.to be_nil
end
end
end
context 'when feature flag :infrastructure_registry_page is disabled' do
it 'does not add the menu item to the list' do
stub_feature_flags(infrastructure_registry_page: false)
is_expected.to be_nil