Commit 9eeaed2382 ("Trim") in mirror of https://github.com/mperham/sidekiq.git
Parent: 4a0432622f
1 changed file with 2 additions and 7 deletions
@@ -24,15 +24,10 @@ end
 require "sidekiq/middleware/current_attributes"
 Sidekiq::CurrentAttributes.persist(Myapp::Current) # Your AS::CurrentAttributes singleton
 ```
-- **FEATURE**: Introduce new method, `.perform_bulk` on `Sidekiq::Worker` that makes enqueuing
-  jobs in bulk adhere to Redis best practices by enqueuing 1,000 jobs per round trip. This
-  shares a similar args syntax to `Sidekiq::Client.push_bulk`. Batch sizes can be configured
-  with the optional `batch_size:` keyword argument.
+- **FEATURE**: Add `Sidekiq::Worker.perform_bulk` for enqueuing jobs in bulk,
+  similar to `Sidekiq::Client.push_bulk` [#5042]
 ```ruby
 MyJob.perform_bulk([[1], [2], [3]])
-
-# With a batch size provided:
-MyJob.perform_bulk([[1], [2], [3]], batch_size: 100)
 ```
 - Implement `queue_as`, `wait` and `wait_until` for ActiveJob compatibility [#5003]
 - Retry Redis operation if we get an `UNBLOCKED` Redis error. [#4985]
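For reference, a minimal sketch of how the bulk API reads in practice, reconstructed from the example text this commit trims; the `MyJob` class below is a placeholder for illustration, not part of the commit.

```ruby
require "sidekiq"

# Placeholder worker used only to illustrate the call shape.
class MyJob
  include Sidekiq::Worker

  def perform(id)
    puts "processing record #{id}"
  end
end

# Each inner array is one job's arguments; jobs are pushed to Redis in bulk
# (the trimmed changelog text cites 1,000 jobs per round trip).
MyJob.perform_bulk([[1], [2], [3]])

# The optional batch_size: keyword caps how many jobs go out per round trip.
MyJob.perform_bulk([[1], [2], [3]], batch_size: 100)
```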