[GE users] how to use Python multiprocessing script with SGE

cguilbert69 cguilbert69 at gmail.com
Wed Apr 28 22:23:06 BST 2010



Hi,

I'd like to write a Python multiprocessing script that will be submitted with SGE qsub.
The idea is that I would reserve, say, 10 cores with qsub and then use/manage those cores to run various tasks in parallel from within Python.
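Concretely, something like this rough sketch is what I am picturing for the reservation side (the "smp" parallel environment name is just a guess; it depends on how the local cluster is configured):

#!/usr/bin/env python
# submit with:  qsub reserve_cores.py
#$ -S /usr/bin/python
#$ -cwd
# ask SGE for 10 slots; "smp" is only a guess at the parallel environment name
#$ -pe smp 10
import os

# SGE exports the number of slots actually granted as $NSLOTS
ncores = int(os.environ.get('NSLOTS', 1))
print('reserved %d cores on %s' % (ncores, os.uname()[1]))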

I am not sure I am using the proper terms here, but here is what I would like to do.
I need to create some sort of "pipes" into which I can send different jobs in parallel. I need to check the status of the job sent down each pipe so I can synchronize with the other jobs and carry my calculation further once they are done, and I need to be able to control and close a pipe when its job is done and start a new one (with a new job).
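Roughly, the pattern I am imagining looks like this (plain multiprocessing, nothing SGE-specific yet; run_task is just a made-up placeholder for my real calculation):

import time
import multiprocessing

def run_task(n):
    # stand-in for one of my real calculations
    return sum(x * x for x in range(n))

if __name__ == '__main__':
    pool = multiprocessing.Pool(processes=10)          # the ten "pipes"
    pending = [pool.apply_async(run_task, (n,)) for n in range(20)]

    # poll the handles so I can tell when each job is done and carry on
    results = []
    while pending:
        for handle in pending[:]:
            if handle.ready():                         # this "pipe" finished?
                results.append(handle.get())
                pending.remove(handle)
        time.sleep(0.5)

    pool.close()
    pool.join()
    print(results)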

I spent a week online trying to find the best method to achieve this, and the net result is that I am totally confused now; there are so many ways of doing it.

DRMAA-python seems to be easy to use, but I have no idea how it works together with SGE qsub: should I submit the Python script itself with qsub, or does the Python script submit the jobs directly (without qsub)?
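From what I could piece together from the docs, a standalone DRMAA-python script would look roughly like this (the path to the task script is made up), but I still don't know whether this script itself is supposed to go through qsub or not:

import drmaa

s = drmaa.Session()
s.initialize()

jt = s.createJobTemplate()
jt.remoteCommand = '/home/chris/bin/my_task.sh'   # made-up path to one task
jt.args = ['input_01']

job_id = s.runJob(jt)
# block until this particular job has finished
info = s.wait(job_id, drmaa.Session.TIMEOUT_WAIT_FOREVER)
print('job %s finished with exit status %s' % (info.jobId, info.exitStatus))

s.deleteJobTemplate(jt)
s.exit()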

Python multiprocessing (included in the Python standard library) seems to be the way to go, but I could not find any example of how to make it work with SGE (e.g. reserve 10 nodes, open 10 pipes, etc.).
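The closest I can imagine is sizing the pool from whatever SGE grants, something like the sketch below (calculate is just a placeholder for my real work), but I don't know if this is the right way to combine the two:

import os
import multiprocessing

def calculate(task_id):
    # stand-in for one of my real calculations
    return task_id, sum(i * i for i in range(100000))

if __name__ == '__main__':
    # size the pool from whatever SGE actually granted (multiprocessing
    # only spans the cores of a single node, so the slots must be on one host)
    nslots = int(os.environ.get('NSLOTS', 1))
    pool = multiprocessing.Pool(processes=nslots)
    for task_id, result in pool.map(calculate, range(50)):
        print('task %d -> %d' % (task_id, result))
    pool.close()
    pool.join()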

There are also many MPI Python modules, which are cool, but again I don't know how to use them with SGE.
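For example, with mpi4py I can write something like this, but I don't see how to tie it to an SGE reservation (the mpirun line in the comment and the script name are just placeholders; the MPI parallel environment depends on the cluster):

# launched inside the qsub job, e.g.:  mpirun -np $NSLOTS python mpi_tasks.py
# (mpi_tasks.py and the MPI parallel environment name are just placeholders)
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

# each rank takes its share of the task list
tasks = range(100)
my_results = [t * t for t in tasks if t % size == rank]   # stand-in calculation

# collect everything back on rank 0
all_results = comm.gather(my_results, root=0)
if rank == 0:
    print('collected %d result lists' % len(all_results))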

Can anyone guide me here? If you could send me any scripts (even uncommented) so I can study them, that would be great.

Thanks so much for any help.

Chris


