I'd like to add several jobs to the Celery queue and wait for the results. I have several ideas about how I could accomplish this using some type of shared storage (memcached, Redis, a database, etc.), but I suspect this is something Celery can handle automatically; I just can't find any resources on it online.
def do_tasks(b):
    for a in b:
        c.delay(a)
    return c.all_results_some_how()
To do that you need to keep references to the tasks:

def do_tasks(b):
    tasks = []
    for a in b:
        tasks.append(c.delay(a))
    return [t.get() for t in tasks]
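Note that .get() only works if a result backend is configured. Here is a minimal self-contained sketch; the Redis broker/backend URLs and the doubling task named c are illustrative assumptions, not part of the original question:

from celery import Celery

# Broker and backend URLs are assumptions for illustration.
app = Celery("tasks",
             broker="redis://localhost:6379/0",
             backend="redis://localhost:6379/0")

@app.task
def c(a):
    # Placeholder work; stands in for whatever your real task does.
    return a * 2

def do_tasks(b):
    tasks = [c.delay(a) for a in b]   # enqueue everything first
    return [t.get() for t in tasks]   # then block for each result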
Or you can use a ResultSet (note that ResultSet is deprecated; please see @laffuste's answer):
from celery.result import ResultSet

def do_tasks(b):
    rs = ResultSet([])
    for a in b:
        rs.add(c.delay(a))
    return rs.get()
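For reference, a minimal sketch of the non-deprecated equivalent using Celery's group primitive, assuming the same illustrative task c as above:

from celery import group

def do_tasks(b):
    job = group(c.s(a) for a in b)  # a group of task signatures
    result = job.apply_async()      # enqueue them all at once
    return result.get()             # block until every result is in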
I have a hunch you don't really want delay itself, but rather Celery's asynchronous features.
I think you really want a TaskSet:
from celery.task.sets import TaskSet
from someapp.tasks import sometask

def do_tasks(b):
    job = TaskSet([sometask.subtask((a,)) for a in b])
    result = job.apply_async()
    # might want to handle result.successful() == False
    return result.join()
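Note that TaskSet was later removed (in Celery 4.0); the group primitive shown earlier is its modern replacement.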