Asynchronous Job Queues with Python RQ
Ashish Acharya
Thoplo.com
Outline
● Asynchronous job queuing
● Redis
● Redis Queue (RQ)
● django-rq
About Me
● Computer Engineering graduate from Pulchowk
Campus
● Runs a Python shop called Thoplo
● Does a lot of Python/Django
● Uses far too many exec statements
● ashishacharya.com
● Not a designer
Why Asynchronous Job Queues?
● Some problems take too long to solve within a
single process
● Web applications need to provide an immediate
response
● Sending emails, processing images, making API
calls, etc. can take time
● Hence: background processes and job queues
Tools for parallelism in Python
● Celery
● IPython.parallel
● Parallel Python
● pymq
● snakeMQ
● Redis Queue
Challenges with Celery
● Lots of one-off configuration
● Difficult and time-consuming to set up
● Difficult to get right - lots of moving parts
that can fail
● Time-consuming to monitor and maintain
● Needs the right kind of storage backend paired with the right kind of message queue
Main Problem with Celery
It’s OVERKILL for MOST web projects!
Introducing Redis Queue
● RQ (Redis Queue) is a simple Python library
for queueing jobs and processing them in the
background with workers.
● Has a low barrier to entry.
● Has been created as a lightweight alternative
to existing queueing frameworks.
Why Redis Queue?
● Can get it up and running in an afternoon
● No configuration file editing -- or very
little
● Don’t have to configure a message broker --
just uses Redis
About Redis
A server that lets you associate a data structure
with a key
● Written in C
● Bindings in most popular languages
● Easy to access from Python
● Used extensively for caching in web applications
● I know almost nothing about it.
About Redis
import redis

# Connect to a local Redis server, store a value, and read it back
connection = redis.Redis('localhost')
connection.set('key', 'value')
print(connection.get('key'))  # b'value' (redis-py returns bytes)
Does Redis Queue use Redis? Duh!
A Python module for treating Redis like a
queue
● Pure Python
● Provides a clean, well-thought-out API for humans
A Simple Example
from redis import Redis
from rq import Queue
import time

q = Queue(connection=Redis())

# Enqueue a call to operator.add(2, 3); a worker process (e.g. rq worker)
# must be running for the job to actually execute
job = q.enqueue('operator.add', 2, 3)

time.sleep(1)  # give the worker a moment to finish
print(job.result)  # 5
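The enqueue call only places the job on the queue; a separate worker process picks it up and runs it. Normally you start that worker from the shell with rq worker, but as a rough sketch (the queue name and burst mode here are assumptions, not part of the original example) a worker can also be started from Python:

from redis import Redis
from rq import Queue, Worker

redis_conn = Redis()
queue = Queue('default', connection=redis_conn)

# Burst mode: process every job currently on the queue, then exit
worker = Worker([queue], connection=redis_conn)
worker.work(burst=True)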
What’s So Cool About Redis Queue?
...This
Monitoring on a web interface: rq-dashboard
...Also This: django-rq
● Django integration with RQ
● Define queues in settings.py
● Add a few URLs
● Done! You can start queuing jobs immediately (see the sketch below).
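As a minimal sketch of that setup, assuming a local Redis on the default port and a single queue named 'default' (the names and values are illustrative, and django.urls.path assumes a reasonably recent Django):

# settings.py
INSTALLED_APPS = [
    # ...
    'django_rq',
]

RQ_QUEUES = {
    'default': {
        'HOST': 'localhost',
        'PORT': 6379,
        'DB': 0,
    },
}

# urls.py -- also exposes the django-rq dashboard at /django-rq/
from django.urls import include, path

urlpatterns = [
    # ...
    path('django-rq/', include('django_rq.urls')),
]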
django-rq
● Enqueue jobs like so:
import django_rq
django_rq.enqueue(func, foo, bar=baz)
● Has a @job decorator like Celery’s @task (see the sketch below)
● Has management commands for rqworker and
rqscheduler
python manage.py rqworker high default low
python manage.py rqscheduler
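A short sketch of the decorator in use; the function name, queue name, and arguments are assumptions made for illustration:

from django_rq import job

@job('default')  # hypothetical queue name; defaults to 'default' if omitted
def send_welcome_email(user_id):
    # slow work goes here: render and send the email
    pass

# .delay() enqueues the call instead of running it inline,
# much like calling .delay() on a Celery @task
send_welcome_email.delay(42)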
Also has a cool dashboard!
...And This: rq-scheduler
● Procrastinate digitally... schedule a job for later
● Send spam daily... and other periodic tasks
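A minimal sketch of both ideas with rq-scheduler, assuming the job function lives in a module the worker can import and that an rqscheduler process (see the management command above) is running; send_spam, the delay, and the interval are made up for the example:

from datetime import datetime, timedelta
from redis import Redis
from rq_scheduler import Scheduler

def send_spam():
    # placeholder job body
    print('daily spam sent')

scheduler = Scheduler(connection=Redis())

# Procrastinate digitally: run once, ten minutes from now
scheduler.enqueue_in(timedelta(minutes=10), send_spam)

# Periodic task: run every 24 hours
scheduler.schedule(
    scheduled_time=datetime.utcnow(),  # time of the first run
    func=send_spam,
    interval=24 * 60 * 60,             # seconds between runs
    repeat=None,                       # None means repeat forever
)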
...and finally
If you’re already using Redis for caching, you can reuse the same connection information and the same Redis server, so there is no extra infrastructure to run.
“Two birds, one Redis.” -- A.A
Two Scoops approved
Demo
Tips
Break jobs down into atomic units of work that do one thing and nothing else (see the sketch below).
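For instance, a rough sketch of what "atomic" could look like in practice; the function names, arguments, and the assumption that these jobs live in a module the worker can import are all illustrative:

from redis import Redis
from rq import Queue

q = Queue(connection=Redis())

def resize_image(image_path, width, height):
    # does exactly one thing: resize one image
    pass

def notify_user(user_id, message):
    # does exactly one thing: send one notification
    pass

# Two small, independent jobs instead of one monolithic
# "process the upload and notify everyone" job
q.enqueue(resize_image, '/tmp/photo.jpg', 800, 600)
q.enqueue(notify_user, 42, 'Your photo is ready')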
When that stops working for you, it’s definitely time to upgrade to a more powerful, more complex tool -- Celery.
(Or just hire someone to do it for you!)
Questions?
