Scaling Celery - Sending Tasks To Remote Machines

Celery:

Celery is a Python package that implements a task queue with a focus on real-time processing, while also supporting task scheduling.
It has three main components:
Celery Application (or Client): responsible for adding tasks to the queue.
Celery Worker (or Server): responsible for executing the tasks given to it.
Broker: responsible for transporting messages between the client and the server.

What You Should Know:

You should know the basics of Celery and you should be familiar with
creating Celery tasks,

from celery import Celery

app = Celery('tasks', backend='amqp',
             broker='amqp://<user>:<password>@<ip>/<vhost>')

@app.task
def add(x, y):
    return x + y

adding tasks to a queue,

add.apply_async(args=[1, 2])
and consuming tasks with a worker (assuming the module above is saved as tasks.py)

celery worker -A tasks -l info
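
A quick aside: delay is a convenient shortcut for apply_async with positional arguments, so the two calls below queue the same task.

# delay() is shorthand for apply_async() with positional arguments
add.apply_async(args=[1, 2])
add.delay(1, 2)
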
If your tasks don't need many system resources, you can set up all of them on the same machine. But if you have a lot of resource-hungry jobs, you need to spread them across several machines.
In this tutorial, let's move our Celery workers to a remote machine, keeping the client and broker on the same machine.

Sending Tasks To Another Machine:

On Machine A:
  1. Install Celery & RabbitMQ.
  2. Configure RabbitMQ so that Machine B can connect to it.

# add new user
sudo rabbitmqctl add_user <user> <password>

# add new virtual host
sudo rabbitmqctl add_vhost <vhost_name>

# set permissions for user on vhost
sudo rabbitmqctl set_permissions -p <vhost_name> <user> ".*" ".*" ".*"

# restart the rabbitmq server
sudo service rabbitmq-server restart
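
If you want to double-check that the user, vhost, and permissions were created correctly, rabbitmqctl can list them:

# verify the new user, vhost and permissions
sudo rabbitmqctl list_users
sudo rabbitmqctl list_vhosts
sudo rabbitmqctl list_permissions -p <vhost_name>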
  3. Create a new file remote.py with a simple task. Here the broker is installed on machine A, so put the IP address of machine A in the broker URL.

from celery import Celery

app = Celery('tasks', backend='amqp',
             broker='amqp://<user>:<password>@<ip>/<vhost>')

# the @app.task decorator is required; without it, add.delay() won't work
@app.task
def add(x, y):
    return x + y
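
For illustration, here is what a filled-in broker URL might look like. The user, password, IP address, and vhost below are hypothetical; substitute the ones you created on machine A.

# hypothetical values -- replace with your own credentials, IP and vhost
app = Celery('tasks', backend='amqp',
             broker='amqp://remote_user:s3cret@192.168.1.10/remote_vhost')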
  4. Now we have everything set up on machine A. We can put some tasks in the queue.

In [1]: from remote import add

In [2]: add.delay(1, 2)
Out[2]: <AsyncResult: 3eb96a11-aa61-46d3-9b9d-e0e1703438d0>

In [3]: add.delay(2, 3)
Out[3]: <AsyncResult: ec40db1a-a43c-4486-9530-0a3153fe1380>

In [4]: add.delay(3, 4)
Out[4]: <AsyncResult: ca53a4c7-061b-408e-82ee-86c2d43d21a0>
Everything is set up on machine A. Now let's get into machine B.
On Machine B:
  1. Install Celery.
  2. Copy the remote.py file from machine A to this machine.
  3. Run a worker to consume the tasks.

celery worker -l info -A remote
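
Since this tutorial is about scaling, note that you can also control how many worker processes run on the machine with the concurrency flag (it defaults to the number of CPU cores):

celery worker -l info -A remote -c 4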
As soon as you launch the worker, it will receive the tasks you queued up and execute them immediately.

[2014-11-09 00:01:19,168: INFO/MainProcess] Received task: remote.add[c2d2bb27-ff5f-47da-b2b9-6fb11669ee1a]
[2014-11-09 00:01:19,170: INFO/MainProcess] Received task: remote.add[8daa1a5c-17d0-46dc-9c93-faf7fbeccdd9]
[2014-11-09 00:01:19,172: INFO/MainProcess] Received task: remote.add[79603d15-24f1-43f8-b8b7-525b7cd4b9a2]
[2014-11-09 00:01:19,401: INFO/MainProcess] Task remote.add[8daa1a5c-17d0-46dc-9c93-faf7fbeccdd9] succeeded in 0.226168102003s: 3
[2014-11-09 00:01:19,462: INFO/MainProcess] Task remote.add[c2d2bb27-ff5f-47da-b2b9-6fb11669ee1a] succeeded in 0.286503815001s: 5
[2014-11-09 00:01:19,464: INFO/MainProcess] Task remote.add[79603d15-24f1-43f8-b8b7-525b7cd4b9a2] succeeded in 0.288741396998s: 7
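
Because we configured the amqp result backend, you can also read the results back on machine A once the worker has processed the tasks. A minimal sketch (get blocks until the result arrives or the timeout expires):

In [5]: result = add.delay(4, 5)

In [6]: result.get(timeout=10)
Out[6]: 9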
This is just a simple guide on how to send tasks to remote machines.
Depending on your needs, you might have to set up a cluster of servers and route tasks accordingly to scale.
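
As a rough sketch of what routing can look like, you can pin a task to a named queue and start workers that consume only that queue. This uses the Celery 3.x setting name CELERY_ROUTES (newer releases call it task_routes), and the queue name 'math' is just an example.

# route remote.add to a dedicated queue
app.conf.CELERY_ROUTES = {
    'remote.add': {'queue': 'math'},
}

Then start a worker on whichever machine should handle that queue:

celery worker -A remote -l info -Q math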

Need further help with this? Feel free to send a message.