I’d like to add several jobs to the Celery queue and wait for the results. I have several ideas for how I could accomplish this using some type of shared storage (memcached, redis, a database, etc.), but I think this is something Celery can handle automatically; I just can’t find any resources online.

Code example

def do_tasks(b):
    for a in b:
        c.delay(a)  # queue a task for each item
    return c.all_results_some_how()  # wait for and collect every result

For Celery >= 3.0, TaskSet is deprecated in favour of group.

from celery import group
from tasks import add

job = group([
    add.s(2, 2),
    add.s(4, 4),
    add.s(8, 8),
    add.s(16, 16),
    add.s(32, 32),
])

Start the group in the background:

result = job.apply_async()
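
The returned GroupResult can then be polled or joined. A minimal sketch, assuming a result backend is configured (the expected values come from the add calls above):

result.ready()       # True once every subtask has completed
result.successful()  # True if every subtask succeeded
result.get()         # blocks, then returns [4, 8, 16, 32, 64]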



Task.delay returns an AsyncResult. Use AsyncResult.get to get the result of each task.

To do that, you need to keep references to the tasks.

def do_tasks(b):
    tasks = []
    for a in b:
        tasks.append(c.delay(a))  # keep the AsyncResult for each queued task
    return [t.get() for t in tasks]
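
Note that each t.get() blocks until that task finishes, so the results come back in the same order the tasks were queued.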

Or you can use ResultSet:

UPDATE: ResultSet is deprecated, please see @laffuste's answer.

from celery.result import ResultSet

def do_tasks(b):
    rs = ResultSet([])
    for a in b:
        rs.add(c.delay(a))  # add each AsyncResult to the set
    return rs.get()
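
ResultSet also has polling helpers, which are useful if you don't want to block indefinitely. A small sketch (the 60-second limit is just an example value):

rs.completed_count()  # how many subtasks have finished so far
rs.get(timeout=60)    # raises celery.exceptions.TimeoutError if not done in time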

I have a hunch you don't really want delay itself, but Celery's async feature.

I think you really want a TaskSet:

from celery.task.sets import TaskSet
from someapp.tasks import sometask

def do_tasks(b):
    job = TaskSet([sometask.subtask((a,)) for a in b])
    result = job.apply_async()
    # might want to handle result.successful() == False
    return result.join()
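
For illustration, a hypothetical call, assuming sometask is a task that squares its single argument:

do_tasks([1, 2, 3])  # blocks on join() until all subtasks finish, then returns [1, 4, 9]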