# Elixir Comparison
This guide is for developers familiar with Oban for Elixir who want to understand how oban-py compares. It covers architectural differences, feature parity, and where Oban Pro for Python bridges the gap.
## Architectural Differences
The most fundamental difference between oban-py and Oban for Elixir stems from their runtime environments.
### Concurrency Model
Elixir runs on the BEAM virtual machine, which provides lightweight processes with preemptive scheduling. Each Oban queue runs in its own GenServer process, and jobs execute in isolated processes that can run truly in parallel across CPU cores.
Python uses asyncio for concurrency. Jobs run as async tasks within a single event loop, which handles I/O-bound work efficiently but is limited by the Global Interpreter Lock (GIL) for CPU-bound work. All jobs in oban-py share a single Python process by default.
For CPU-intensive workloads, Oban Pro provides multi-process execution that distributes jobs across multiple Python processes, each with its own event loop. This bypasses the GIL and enables true parallelism similar to BEAM processes.
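To see what "handles I/O-bound work efficiently" means in practice, here is a plain-asyncio sketch (stdlib only, not oban-py API): ten simulated I/O-bound jobs share one event loop and finish in roughly the time of one, because their waits overlap.

```python
import asyncio

async def fetch(i: int) -> int:
    # Stand-in for I/O-bound work (an API call, a database query):
    # while one task awaits, the event loop runs the others.
    await asyncio.sleep(0.1)
    return i

async def run_queue() -> list[int]:
    # Ten "jobs" on a single event loop complete in ~0.1s total,
    # not 1s, because their sleeps overlap.
    return await asyncio.gather(*(fetch(i) for i in range(10)))
```

This is the model oban-py's queues build on; it breaks down only when the work holds the GIL instead of awaiting.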
### Supervision and Lifecycle
In Elixir, Oban runs as part of your application’s supervision tree. You configure it in `config.exs` and add it to your application’s children:

```elixir
# Elixir
children = [
  MyApp.Repo,
  {Oban, Application.fetch_env!(:my_app, Oban)}
]
```
In Python, Oban is typically started via the CLI, which handles process lifecycle and signal handling:

```shell
# Python
oban start --queues "default:10"
```
You can also run Oban programmatically using the async context manager:

```python
async with oban:
    await shutdown_event.wait()
```
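When you opt into programmatic control, you are responsible for deciding when that `shutdown_event` fires. A minimal sketch, assuming `oban` is an already-configured instance; the `run_until_signal` helper and its name are illustrative, not part of oban-py's API:

```python
import asyncio
import signal

async def run_until_signal(oban):
    # Hypothetical helper: run an Oban instance until the process
    # receives SIGINT or SIGTERM (Unix-only signal handling).
    shutdown_event = asyncio.Event()
    loop = asyncio.get_running_loop()
    for sig in (signal.SIGINT, signal.SIGTERM):
        loop.add_signal_handler(sig, shutdown_event.set)
    async with oban:
        await shutdown_event.wait()
```

The CLI performs this wiring for you; the sketch only shows what "signal handling" means when you run Oban yourself.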
### Database Integration
Elixir Oban integrates with Ecto and supports multiple databases (PostgreSQL, SQLite3, MySQL) through different engines. Python Oban uses asyncpg directly and currently supports PostgreSQL only.
## API Comparison
### Defining Workers
Elixir workers are modules that implement the `perform/1` callback:

```elixir
# Elixir
defmodule MyApp.EmailWorker do
  use Oban.Worker, queue: :emails, max_attempts: 5

  @impl Oban.Worker
  def perform(%Oban.Job{args: %{"to" => to, "subject" => subject}}) do
    # Send email
    :ok
  end
end
```
Python workers are classes with an async `process` method:

```python
# Python
from oban import worker

@worker(queue="emails", max_attempts=5)
class EmailWorker:
    async def process(self, job):
        to = job.args["to"]
        subject = job.args["subject"]
        # Send email
```
### Inserting Jobs
The patterns are similar, with Elixir using pipe operators and Python using method chaining or direct construction:

```elixir
# Elixir
%{to: "[email protected]", subject: "Hello"}
|> MyApp.EmailWorker.new()
|> Oban.insert()
```

```python
# Python
await oban.insert(
    EmailWorker.new(to="[email protected]", subject="Hello")
)
```
### Scheduling
Both support `schedule_in` and `schedule_at` with similar semantics:

```elixir
# Elixir
MyApp.Worker.new(%{}, schedule_in: 60)
MyApp.Worker.new(%{}, schedule_at: ~U[2024-12-25 00:00:00Z])
```

```python
# Python
from datetime import datetime, timezone

Worker.new(schedule_in=60)
Worker.new(schedule_at=datetime(2024, 12, 25, tzinfo=timezone.utc))
```
## Feature Parity
### Core Features (OSS)
| Feature | Elixir | Python |
|---|---|---|
| Queue-based processing | ✓ | ✓ |
| Configurable concurrency | ✓ | ✓ |
| Job scheduling | ✓ | ✓ |
| Periodic/cron jobs | ✓ | ✓ |
| Retry with backoff | ✓ | ✓ |
| Job lifecycle states | ✓ | ✓ |
| Telemetry/instrumentation | ✓ | ✓ |
| Pruner plugin | ✓ | ✓ |
| Lifeline plugin | ✓ | ✓ |
| CLI | ✓ | ✓ |
| Testing utilities | ✓ | ✓ |
| Unique jobs (basic) | ✓ | — |
| SQLite3 support | ✓ | — |
| MySQL support | ✓ | — |
### Pro Features
| Feature | Elixir Pro | Python Pro |
|---|---|---|
| Multi-process execution | — (BEAM native) | ✓ |
| Global concurrency | ✓ | ✓ |
| Rate limiting | ✓ | ✓ |
| Queue partitioning | ✓ | ✓ |
| Unique jobs (strong) | ✓ | ✓ |
| Workflows | ✓ | ✓ |
| Batches | ✓ | — |
| Chunks | ✓ | — |
| Dynamic plugins | ✓ | — |
| Encrypted args | ✓ | — |
| Recorded output | ✓ | ✓ |
## Parallelism: BEAM vs Multi-Process
The biggest difference for compute-heavy workloads is how parallelism is achieved.
In Elixir, the BEAM scheduler automatically distributes work across CPU cores. A queue with `limit: 20` can run 20 jobs truly in parallel without any additional configuration:

```elixir
# Elixir - true parallelism out of the box
config :my_app, Oban, queues: [heavy: 20]
```
In Python, the GIL prevents true parallelism within a single process. For I/O-bound work (API calls, database queries), asyncio handles concurrency efficiently. But for CPU-bound work, you need Oban Pro’s multi-process execution:
```shell
# Python - requires Pro for CPU parallelism
obanpro start --processes 4 --queues "heavy:20"
```
This spawns 4 worker processes, each with its own Python interpreter and event loop. Jobs are distributed across processes, enabling true parallel execution while respecting queue concurrency limits.
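The underlying idea can be sketched with the standard library alone: running a CPU-bound function through a process pool from the event loop restores parallelism, which is roughly what multi-process execution does at the queue level. This is an illustrative stdlib sketch, not oban-py's implementation; the `"fork"` start method is Unix-only.

```python
import asyncio
import multiprocessing
from concurrent.futures import ProcessPoolExecutor

def fib(n: int) -> int:
    # Deliberately CPU-bound: naive recursion holds the GIL
    # for the whole computation.
    return n if n < 2 else fib(n - 1) + fib(n - 2)

async def run_parallel() -> list[int]:
    # Offload CPU-bound work to separate processes so the GIL in
    # the main process never blocks the event loop.
    loop = asyncio.get_running_loop()
    ctx = multiprocessing.get_context("fork")
    with ProcessPoolExecutor(max_workers=4, mp_context=ctx) as pool:
        return await asyncio.gather(
            *(loop.run_in_executor(pool, fib, 25) for _ in range(4))
        )
```

Run as plain async tasks instead, the four `fib` calls would execute one after another on a single core.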
## Unique Jobs
In Elixir OSS, unique jobs use transactional locks and database queries, which can have race conditions under high concurrency. Elixir Pro’s Smart Engine uses database constraints for stronger guarantees.
In Python, unique jobs are a Pro-only feature that uses the same constraint-based approach as Elixir Pro, providing strong uniqueness guarantees from the start:
```python
# Python Pro
@worker(unique={"period": 300, "keys": ["user_id"]})
class DeduplicatedWorker:
    async def process(self, job):
        ...
```
## Migration Tips
If you’re porting an Elixir application to Python:
- **Workers**: Convert `perform/1` functions to async `process` methods. Pattern matching on args becomes dictionary access.
- **Configuration**: Move from `config.exs` to CLI flags, environment variables, or `oban.toml`.
- **Supervision**: Replace OTP supervision with CLI-managed processes or async context managers.
- **CPU-bound work**: If you rely on BEAM’s parallelism for compute-heavy jobs, consider Oban Pro’s multi-process execution.
- **Unique jobs**: If you use basic uniqueness in Elixir, you’ll need Oban Pro for Python to get equivalent functionality.
- **Testing**: Replace `Oban.Testing` with oban-py’s testing utilities. The patterns are similar but use Python’s pytest conventions.
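As a concrete instance of the first tip: the Elixir pattern match `%{"to" => to, "subject" => subject}` from the worker example above becomes explicit dictionary access in Python. A small sketch (the `extract_args` helper is illustrative, not an oban-py API):

```python
def extract_args(args: dict) -> tuple[str, str]:
    # Mirrors the Elixir pattern match: a missing key raises
    # KeyError, which surfaces as a job error much like a
    # failed match would in Elixir.
    return args["to"], args["subject"]
```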