gitlab-org--gitlab-foss/lib/tasks/cache.rake
Jacob Vosmaer a310901280 Batch size >1000 does not pay off
We did a small experiment to see how a full scan of the Redis keys on
gitlab.com speeds up as we increase the batch size. The values on the
right are times in seconds for a full scan (no delete operations).

count: 10        284.500529021
count: 100        86.21216934
count: 1_000      60.931676195
count: 10_000     60.96355610
count: 100_000    62.378172541

It looks like 1,000 is a good number.
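The scan being benchmarked follows Redis's SCAN contract: each call returns the next cursor plus a batch of keys, and the iteration is complete when the cursor comes back as `'0'`. A larger `count` means fewer round trips for the same key set, which is what the timings above measure. The cursor loop can be sketched in plain Ruby without a live Redis server; `FakeRedis` and its sequential cursor are illustrative stand-ins, not GitLab code or real redis-rb behavior (a real SCAN cursor is opaque, not an array offset):

```ruby
# Stand-in for a Redis connection; simulates SCAN over a fixed key set.
class FakeRedis
  def initialize(keys)
    @keys = keys
  end

  # Mimics SCAN: returns [next_cursor, batch]. A cursor of '0' means
  # the iteration has wrapped around and is complete.
  def scan(cursor, count:)
    start = cursor.to_i
    batch = @keys[start, count] || []
    next_cursor = start + count >= @keys.size ? '0' : (start + count).to_s
    [next_cursor, batch]
  end
end

# The same loop shape as the rake task: start at '0', keep scanning
# until the cursor returns to '0', counting round trips along the way.
def full_scan(redis, batch_size)
  cursor = '0'
  round_trips = 0
  seen = []
  loop do
    cursor, keys = redis.scan(cursor, count: batch_size)
    seen.concat(keys)
    round_trips += 1
    break if cursor == '0'
  end
  [seen, round_trips]
end

redis = FakeRedis.new((1..10_000).map { |i| "cache:gitlab:#{i}" })
keys, trips = full_scan(redis, 1_000)
# 10,000 keys at 1,000 per batch => 10 round trips
```

With 10 keys per batch the same key set would take 1,000 round trips, which is why the smallest batch size above is nearly 5x slower while batches beyond 1,000 buy almost nothing.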
2016-02-25 13:50:08 +01:00


namespace :cache do
  CLEAR_BATCH_SIZE = 1000 # There seems to be no speedup when pushing beyond 1,000
  REDIS_SCAN_START_STOP = '0' # Magic value, see http://redis.io/commands/scan

  desc "GitLab | Clear redis cache"
  task :clear => :environment do
    redis_store = Rails.cache.instance_variable_get(:@data)

    cursor = REDIS_SCAN_START_STOP
    loop do
      cursor, keys = redis_store.scan(
        cursor,
        match: "#{Gitlab::REDIS_CACHE_NAMESPACE}*",
        count: CLEAR_BATCH_SIZE
      )

      redis_store.del(*keys) if keys.any?

      break if cursor == REDIS_SCAN_START_STOP
    end
  end
end