
Description

Sidekiq strategy to support granular queue control: limiting, pausing, blocking and querying.


Installation

Add this line to your application's Gemfile:

gem 'sidekiq-limit_fetch'
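
Then install it with Bundler as usual:

  bundle install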

Requirements

Redis 2.6 or newer is required.

Note that Heroku's Redis To Go add-on uses Redis 2.4 by default; to upgrade, you need to contact their support: https://devcenter.heroku.com/articles/redistogo#redis-26

Usage

Limits

Specify the limits you want to place on queues in sidekiq.yml:

  :limits:
    queue_name1: 5
    queue_name2: 10

Or set it dynamically in your code:

  Sidekiq::Queue['queue_name1'].limit = 5
  Sidekiq::Queue['queue_name2'].limit = 10

In these examples, tasks from queue_name1 will be run by at most 5 workers at the same time, and queue_name2 will have no more than 10 workers running simultaneously.

The ability to set limits dynamically lets you resize the worker distribution among queues at any time.
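
For instance, a worker tied to one of the limited queues could look like this (HeavyWorker is an illustrative name; sidekiq_options is standard Sidekiq):

  class HeavyWorker
    include Sidekiq::Worker
    sidekiq_options queue: 'queue_name1' # capped at 5 concurrent workers above

    def perform(record_id)
      # no more than 5 of these jobs run at the same time,
      # no matter how many Sidekiq processes or threads you have
    end
  end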

Limits per process

If you run multiple Sidekiq processes, you can specify limits per process:

  :process_limits:
    queue_name: 2

Or set it in your code:

  Sidekiq::Queue['queue_name'].process_limit = 2
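
For example, a global limit and a per-process limit can be combined in sidekiq.yml (the queue name and numbers are illustrative):

  :limits:
    queue_name: 6
  :process_limits:
    queue_name: 2

With this configuration and three Sidekiq processes, each process runs at most 2 workers for the queue, and the cluster as a whole runs at most 6.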

Busy workers by queue

You can see how many workers are currently handling a queue:

  Sidekiq::Queue['name'].busy # number of busy workers
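
For example, you could check the busy counts for a couple of queues from a console (the queue names are illustrative):

  %w[queue_name1 queue_name2].each do |name|
    queue = Sidekiq::Queue[name]
    puts "#{name}: #{queue.busy} busy, paused: #{queue.paused?}"
  end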

Pauses

You can also pause your queues temporarily. When they are unpaused, their limits are preserved.

  Sidekiq::Queue['name'].pause # prevents workers from running tasks from this queue
  Sidekiq::Queue['name'].paused? # => true
  Sidekiq::Queue['name'].unpause # allows workers to use the queue
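
A minimal sketch of pausing a queue around a maintenance task (run_maintenance is a hypothetical method):

  queue = Sidekiq::Queue['queue_name1']
  queue.pause
  begin
    run_maintenance   # hypothetical long-running maintenance step
  ensure
    queue.unpause     # workers resume the queue; its limit is unchanged
  end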

Blocking queue mode

If you use strict queue ordering (which is used when you don't specify queue weights), you can mark queues as blocking. While a task from a blocking queue is executing, no new tasks from lower-priority queues will be started. For example:

  :queues:
    - a
    - b
    - c
  :blocking:
    - b

In this case, while a task from queue b is running, no new tasks from queue c will be started.

You can also enable and disable blocking mode for queues on the fly:

  Sidekiq::Queue['name'].block
  Sidekiq::Queue['name'].blocking? # => true
  Sidekiq::Queue['name'].unblock

Advanced blocking queues

You can also block on an array of queues. When any queue in the group is running, only higher-priority queues and the other queues in the blocking group can run. This is easier to understand with an example:

  :queues:
    - a
    - b
    - c
    - d
  :blocking:
    - [b, c]

In this case, tasks from queue d will be blocked while a task from queue b or c is executing.

You can dynamically set exceptions for queue blocking:

  Sidekiq::Queue['queue1'].block_except 'queue2'
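
For example, to let only queue2 keep running while queue1 is busy, and later lift the restriction:

  queue = Sidekiq::Queue['queue1']
  queue.block_except 'queue2' # queue1 blocks lower-priority queues, but queue2 may still run
  queue.unblock               # remove the blocking behaviour again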

Thanks

Sponsored by Evil Martians: http://evilmartians.com/