
job.queue_task() -- submit a task to run within the job

queue_task(taskobj)
This adds the given task object to the job's queue. All tasks in the queue can later be run with job.run_all_tasks() or job.yield_tasks_unordered().

The task should be an instance of a class derived from task, which provides a 'run' method. This method is run on the slave node; any arguments to this method are given on the master when the object is created, and are automatically passed to the slave for you. Anything returned from this method is automatically passed back to the master. (Note that Communicator.send_data() is used to send this data, so not all internal MODELLER types can be transferred.)
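For illustration, a minimal task class might look like the following sketch (the class name MyTask, the module name mytask.py, and the argument are hypothetical; only the task base class and the 'run' method convention come from MODELLER):

   # mytask.py -- hypothetical module holding the task class
   from modeller.parallel import task

   class MyTask(task):
       """Trivial example task: square a number on the slave node."""
       def run(self, x):
           # 'x' was given on the master when MyTask(x) was created and is
           # passed to the slave automatically; the return value is sent
           # back to the master.
           return x * x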

Note that you generally need to declare tasks in a separate Python module and load them with the import statement, because tasks are passed using Python's pickle module, which will otherwise fail with an error such as "AttributeError: 'module' object has no attribute 'mytask'".
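A sketch of the master-side usage, assuming the hypothetical mytask.py module above and two slave processes on the local machine:

   from modeller.parallel import job, local_slave
   # Import the task from its own module so that pickle can locate the
   # class when the task is sent to a slave
   from mytask import MyTask

   j = job()
   j.append(local_slave())
   j.append(local_slave())

   # Queue several tasks, then run them all and collect the return values
   j.queue_task(MyTask(3))
   j.queue_task(MyTask(4))
   results = j.run_all_tasks()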
Example: See job.run_all_tasks() command.

