Job.run_all_tasks() — run all queued tasks, and return results

run_all_tasks()
This runs all of the tasks in the job's queue on any available worker. When all of the tasks have finished, this function returns a list of their return values, in the same order in which the tasks were submitted.

Tasks are run in simple round-robin fashion on the available workers. If a worker fails while running a task, that task is automatically resubmitted to another worker. If more tasks are queued than there are workers, and the job type supports it (e.g., SGEQsubJob()), new workers are automatically added to the job.
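
For instance, it is safe to queue more tasks than there are workers; with plain LocalWorker processes, the surplus tasks simply wait until a worker becomes free. A minimal sketch, reusing the MyTask class and PDB codes from the examples below:

from modeller.parallel import Job, LocalWorker
from mytask import MyTask

j = Job()
j.append(LocalWorker())          # only one worker...
for code in ('1fdn', '1b3q', '1blu'):
    j.queue_task(MyTask(code))   # ...but three queued tasks
results = j.run_all_tasks()      # each task waits for the worker in turn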

See also Job.yield_tasks_unordered().
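
If the order of the results does not matter, Job.yield_tasks_unordered() yields each result as soon as its task finishes, in completion order rather than submission order. A minimal sketch, assuming a Job object j with tasks already queued as in the examples below:

for result in j.yield_tasks_unordered():
    print("Got a result: " + str(result))
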
Example: examples/python/mytask.py

from modeller import *
from modeller.parallel import Task

class MyTask(Task):
    """A task to read in a PDB file on the worker, and return the resolution"""
    def run(self, code):
        env = Environ()
        env.io.atom_files_directory = ["../atom_files"]
        mdl = Model(env, file=code)
        return mdl.resolution

Example: examples/python/parallel-task.py

from modeller import *
from modeller.parallel import Job, LocalWorker

# Load in my task from mytask.py (note: needs to be in a separate Python
# module like this, in order for Python's pickle module to work correctly)
from mytask import MyTask

log.minimal()
# Create an empty parallel job, and then add 2 worker processes running
# on the local machine
j = Job()
j.append(LocalWorker())
j.append(LocalWorker())

# Run 'mytask' tasks
j.queue_task(MyTask('1fdn'))
j.queue_task(MyTask('1b3q'))
j.queue_task(MyTask('1blu'))

results = j.run_all_tasks()

print("Got model resolution: " + str(results))