A Ruby gem that makes it easy to convert any method call into a background job using ActiveJob. Simply include the Delayable module and use the delay class method to mark methods that should be available for background execution.
- Easy Background Jobs: Convert any instance or class method into a background job with minimal setup
- Current Attributes Preservation: Automatically preserves Rails `Current` attributes across job boundaries
- Flexible Configuration: Configure queue names, delays, and concurrency limits per method
- Method Parameter Support: Full support for regular arguments, keyword arguments, and blocks
- Bang Method Handling: Automatic support for methods ending with `!`
- ActiveJob Integration: Seamlessly integrates with your existing ActiveJob setup
Add this line to your application's Gemfile:
```ruby
gem "delayable"
```

And then execute:

```shell
$ bundle install
```

Or install it yourself as:

```shell
$ gem install delayable
```

Before using Delayable, ensure you have the following defined in your Rails application:
- `ApplicationJob`: Your base job class
- `Current`: A CurrentAttributes class for maintaining context
```ruby
# app/jobs/application_job.rb
class ApplicationJob < ActiveJob::Base
  # Your job configuration
end
```

```ruby
# app/models/current.rb
class Current < ActiveSupport::CurrentAttributes
  attribute :user, :request_id, :user_agent
  # Add other attributes as needed
end
```

Include Delayable in your class and use the `delay` method to mark methods for background execution:
```ruby
class DataProcessor
  include Delayable

  def process_csv_import(file_path, user_id)
    user = User.find(user_id)
    CSV.foreach(file_path, headers: true) do |row|
      user.records.create!(row.to_h)
    end
  end
  delay :process_csv_import

  def sync_external_data(api_endpoint, last_sync_time)
    # Fetch and process data from external API
    ExternalApiClient.new(api_endpoint).sync_data_since(last_sync_time)
  end
  delay :sync_external_data
end

# Usage
processor = DataProcessor.new
processor.process_csv_import_later("/tmp/users.csv", 123)
processor.sync_external_data_later("https://api.example.com/data", 1.hour.ago)
```

You can also delay class methods by setting the `class_method: true` option:
```ruby
class ReportGenerator
  include Delayable

  def self.generate_monthly_report(month, year)
    # Generate report logic
  end
  delay :generate_monthly_report, class_method: true

  def self.cleanup_old_reports!
    # Cleanup logic
  end
  delay :cleanup_old_reports!, class_method: true
end

# Usage
ReportGenerator.generate_monthly_report_later(12, 2023)
ReportGenerator.cleanup_old_reports_later!
```

Delayable automatically handles methods ending with `!`:
```ruby
class DataProcessor
  include Delayable

  def process_data!
    # Processing logic
  end
  delay :process_data!
end

# Usage - note the bang is preserved in the delayed method name
processor = DataProcessor.new
processor.process_data_later! # Creates ProcessDataBangJob
```

Specify which queue to use for the background job:
```ruby
class DataProcessor
  include Delayable

  def process_csv_import(file_path)
    # Large CSV processing logic
  end
  delay :process_csv_import, queue: :data_processing

  def sync_external_api(endpoint)
    # API synchronization logic
  end
  delay :sync_external_api, queue: :integrations
end
```

Set a default delay for job execution:
```ruby
class CacheManager
  include Delayable

  def warm_cache(cache_key)
    # Pre-populate cache logic
  end
  delay :warm_cache, wait: 5.minutes

  def cleanup_expired_entries
    # Cache cleanup logic
  end
  delay :cleanup_expired_entries, wait: 1.hour
end

# You can also override the delay at call time
manager = CacheManager.new
manager.warm_cache_later("user_stats", wait: 30.seconds) # Overrides the 5.minutes default
```

Limit how many jobs of this type can run concurrently using SolidQueue's concurrency controls:
```ruby
class ResourceIntensiveTask
  include Delayable

  def process_large_file(file_path)
    # Heavy processing that should be limited
  end
  delay :process_large_file, limits_concurrency: { to: 2 }

  def generate_report(report_type)
    # Report generation that should run one at a time
  end
  delay :generate_report, limits_concurrency: { to: 1, key: -> { "report_#{arguments.first}" } }
end
```

Note: The `limits_concurrency` feature requires SolidQueue as your ActiveJob backend. Other job backends (Sidekiq, Resque, etc.) do not support this feature and will ignore the setting.
Delayable automatically preserves Rails Current attributes across job boundaries:
```ruby
class AuditLogger
  include Delayable

  def log_action(action, resource_id)
    AuditLog.create!(
      action: action,
      resource_id: resource_id,
      user: Current.user, # This will be preserved from the original request
      request_id: Current.request_id
    )
  end
  delay :log_action
end

# In a controller
class PostsController < ApplicationController
  def create
    Current.user = current_user
    Current.request_id = request.request_id
    post = Post.create!(post_params)
    # The job will have access to Current.user and Current.request_id
    AuditLogger.new.log_action_later("create", post.id)
  end
end
```

Delayable works seamlessly with ActiveRecord models:
```ruby
class User < ApplicationRecord
  include Delayable

  def update_profile_completeness
    score = calculate_completeness_score
    update!(profile_completeness: score)
  end
  delay :update_profile_completeness

  def calculate_metrics!
    update!(
      total_posts: posts.count,
      total_comments: comments.count,
      last_activity: Time.current
    )
  end
  delay :calculate_metrics!

  def archive_old_data!
    # Archive user's old posts, comments, etc.
    posts.where('created_at < ?', 1.year.ago).update_all(archived: true)
    comments.where('created_at < ?', 1.year.ago).delete_all
  end
  delay :archive_old_data!
end

# Usage
user = User.find(123)
user.update_profile_completeness_later
user.calculate_metrics_later!
user.archive_old_data_later!
```

Delayable automatically generates job class names based on your method names to keep logging and debugging clear and meaningful:
| Method Name | Method Type | Generated Job Class |
|---|---|---|
| `process_data` | instance | `YourClass::ProcessDataJob` |
| `process_data!` | instance | `YourClass::ProcessDataBangJob` |
| `generate_report` | class | `YourClass::ClassGenerateReportJob` |
| `cleanup!` | class | `YourClass::ClassCleanupBangJob` |
This naming convention ensures that when you're monitoring job queues, reading logs, or debugging failed jobs, you can immediately understand what method was being executed and in what context. The job name directly corresponds to your original method, making it easy to trace issues back to your application code.
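The mapping in the table above can be sketched as a small string transformation. The method `job_class_name` below is a hypothetical reimplementation written for illustration only; the gem's actual internals may differ.

```ruby
# Hypothetical sketch of how a method name could be turned into a job
# class name following the table above -- illustrative only, not
# Delayable's actual implementation.
def job_class_name(method_name, class_method: false)
  base = method_name.to_s
  bang = base.end_with?("!")
  base = base.delete_suffix("!")
  camel = base.split("_").map(&:capitalize).join # "process_data" -> "ProcessData"
  camel = "Class#{camel}" if class_method        # class methods get a "Class" prefix
  camel += "Bang" if bang                        # a trailing "!" becomes a "Bang" suffix
  "#{camel}Job"
end

job_class_name(:process_data)                        # => "ProcessDataJob"
job_class_name(:process_data!)                       # => "ProcessDataBangJob"
job_class_name(:generate_report, class_method: true) # => "ClassGenerateReportJob"
job_class_name(:cleanup!, class_method: true)        # => "ClassCleanupBangJob"
```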
Since Delayable creates standard ActiveJob jobs, you can use all the standard ActiveJob error handling mechanisms:
```ruby
# In your ApplicationJob
class ApplicationJob < ActiveJob::Base
  retry_on StandardError, wait: :exponentially_longer, attempts: 5
  discard_on ActiveRecord::RecordNotFound

  rescue_from SomeCustomError do |error|
    # Custom error handling
  end
end
```

Delayable integrates with ActiveJob's testing helpers:
```ruby
require 'test_helper'

class DataProcessorTest < ActiveSupport::TestCase
  include ActiveJob::TestHelper

  test "CSV import is enqueued" do
    processor = DataProcessor.new
    assert_enqueued_jobs 1, only: DataProcessor::ProcessCsvImportJob do
      processor.process_csv_import_later("/tmp/data.csv", 123)
    end
  end

  test "CSV data is imported" do
    processor = DataProcessor.new
    perform_enqueued_jobs do
      processor.process_csv_import_later("/tmp/data.csv", 123)
    end

    # Assert data was imported
    user = User.find(123)
    assert user.records.count > 0
  end

  test "current attributes are preserved" do
    Current.user = users(:admin)

    perform_enqueued_jobs do
      AuditLogger.new.log_action_later("test", 123)
    end

    log = AuditLog.last
    assert_equal users(:admin), log.user
  end
end
```

- Keep Jobs Idempotent: Design your delayed methods to be safely retried
- Use Appropriate Queues: Separate different types of work into different queues
- Handle Failures Gracefully: Use ActiveJob's retry and error handling features
- Monitor Job Performance: Use tools like Sidekiq's web UI or similar for your job backend
- Test Background Behavior: Always test both the enqueuing and execution of your jobs
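To illustrate the first point, here is a minimal, framework-free sketch of an idempotent operation: it records what it has already done and skips repeats, so a retried job causes no duplicate work. `ReminderSender` is a made-up example class, not part of Delayable.

```ruby
# Idempotency sketch: the second call with the same id is a no-op,
# so the operation is safe to retry. (Illustrative example only.)
class ReminderSender
  def initialize
    @sent = {} # tracks which user ids already received a reminder
  end

  # Safe to run twice with the same id.
  def send_reminder(user_id)
    return :already_sent if @sent[user_id]
    @sent[user_id] = true
    # ...deliver the reminder here...
    :sent
  end
end

sender = ReminderSender.new
sender.send_reminder(1) # => :sent
sender.send_reminder(1) # => :already_sent (a retry does no extra work)
```

In a real delayed method, the same guard usually takes the form of a database check (for example, returning early when a `sent_at` timestamp is already set) rather than in-memory state.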
A more complete example combining queue, delay, and concurrency options:

```ruby
class ComplexProcessor
  include Delayable

  def process_batch(batch_id, options = {})
    # Complex processing logic
  end
  delay :process_batch,
        queue: :batch_processing,
        wait: 30.seconds,
        limits_concurrency: { to: 3, key: -> { "batch_#{arguments.first}" } }
end
```

You can also compute the delay dynamically at call time:

```ruby
class NotificationSender
  include Delayable

  def send_reminder(user_id, urgency = :normal)
    # Send reminder logic
  end
  delay :send_reminder

  def send_reminder_with_delay(user_id, urgency = :normal)
    delay_time = case urgency
                 when :urgent then 0.seconds
                 when :normal then 1.hour
                 when :low then 1.day
                 end
    send_reminder_later(user_id, urgency, wait: delay_time)
  end
end
```

To see what jobs are being created, you can inspect the generated job classes:
```ruby
# In rails console
DataProcessor::ProcessCsvImportJob.new.class.name
# => "DataProcessor::ProcessCsvImportJob"

DataProcessor::ProcessCsvImportJob.queue_name
# => "default"
```

- Ruby >= 3.4.0
- Rails >= 7.0.4.3
- ActiveJob configured with a backend (Sidekiq, Resque, etc.)
- SolidQueue (optional) - required only if using the `limits_concurrency` feature
- Fork the repository
- Create your feature branch (`git checkout -b my-new-feature`)
- Add tests for your changes
- Make sure all tests pass (`bundle exec rake test`)
- Commit your changes (`git commit -am 'Add some feature'`)
- Push to the branch (`git push origin my-new-feature`)
- Create a new Pull Request
The gem is available as open source under the terms of the MIT License.
See CHANGELOG.md for details about changes in each version.