[GE users] how to use Python multiprocessing script with SGE

reuti reuti at staff.uni-marburg.de
Mon May 17 10:14:37 BST 2010


Hi,

On 15.05.2010 at 03:17, cguilbert69 wrote:

> Sorry, I am not clear; I am not a hard-core programmer myself and probably use the wrong terms. When I use a Python multiprocessing script alone, Python auto-detects how many cores my CPU has. In this case I have a quad-core, so Python can distribute my jobs on the 4 cores using Python multiprocessing.

4 jobs or one job with 4 threads? I doubt that Python can parallelize a serial script on its own to give you n times the speed of a serial one.
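For what it's worth: the multiprocessing module starts separate worker processes, not threads, and Pool() with no argument simply uses multiprocessing.cpu_count() workers. Nothing gets parallelized unless the script is explicitly structured around calls like pool.map(). A minimal sketch, with run_job as a hypothetical stand-in for one of the serial jobs:

import multiprocessing

def run_job(task_id):
    # hypothetical stand-in for one serial job
    return task_id * task_id

if __name__ == '__main__':
    # Pool() defaults to multiprocessing.cpu_count() worker processes
    print("cores detected:", multiprocessing.cpu_count())
    pool = multiprocessing.Pool()
    results = pool.map(run_job, range(8))  # blocks until all 8 tasks are done
    pool.close()
    pool.join()
    print(results)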


> Maybe I am using the wrong syntax to reserve 10 cores with SGE, but when I said "#$ -pe mpi 10", I meant that I want to reserve 10 cores (as if you did mpirun -n 10), so the SGE queue would, for example, reserve 3 client machines (2 quad-core and 1 dual-core = 10 cores).
> My Python multiprocessing script will distribute all the jobs on the 10 cores defined by the SGE qsub script.
> My program is not doing strictly parallel calculation; all I want to do is send a series of jobs to different cores in parallel from my main Python script.
> 
> 
> When I am running my Python script directly (with no SGE queuing), Python detects and uses the 4 cores of my quad-core (1 CPU)

You checked this in `top` and `ps -e f`?


> , I want to do the same with 20, 100, ... cores; I want Python multiprocessing to see a gigantic 100-core CPU

But still all on a 4-core CPU?


> , those cores being defined by the SGE queuing system. There must be a mechanism to trick Python into pretending that it is using 100 cores on 1 CPU.
> 
> Here is the idea of what I want to do using ssh in a cluster of 100 nodes (1 core per node):
> I run a Python script which sends 100 jobs in parallel using ssh to the 100 nodes; when all the calculations are done, the main Python script continues and may send some other jobs to the nodes...
> So instead of using ssh to submit my jobs

SSH isn't used to submit jobs. You submit the jobs on any machine which has access to the qmaster, and the job is transferred by SGE's internal mechanism to the granted node.
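For illustration, a plain submission script could look like the sketch below (no SSH involved). The PE name and slot count just follow the quoted "#$ -pe mpi 10" line, and myscript.py is a placeholder:

#!/bin/sh
#$ -S /bin/sh
#$ -N pyjobs
#$ -cwd
#$ -pe mpi 10

python myscript.py

You would submit it with `qsub jobscript.sh` on any submit host, and SGE itself transfers the job to the node(s) it grants.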


> , I want to use pool() or process() from the multiprocessing python library

Sure, this you can do right now. I'm not a Python user, but AFAICS Python spawns local worker processes, and you just have to take care not to issue more than n processes at once. That Python detects the core count for you (a value you can ignore and use $NSLOTS instead) is just for your convenience. Hence I would say: access the environment variable $NSLOTS and use it in the Pool() call.
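A minimal sketch along those lines, assuming a job script like the one above; run_job is again a hypothetical placeholder, and outside of SGE it falls back to the local core count:

import os
import multiprocessing

def run_job(task_id):
    # hypothetical placeholder for one serial job
    return task_id * task_id

if __name__ == '__main__':
    # SGE exports the number of granted slots as $NSLOTS;
    # fall back to the local core count outside of SGE
    nslots = int(os.environ.get('NSLOTS', multiprocessing.cpu_count()))
    pool = multiprocessing.Pool(processes=nslots)
    results = pool.map(run_job, range(100))  # never more than nslots at once
    pool.close()
    pool.join()

One caveat: the workers of the multiprocessing module are plain local processes, so this only helps when all granted slots sit on the same host (e.g. a PE with allocation_rule $pe_slots). Slots spread over several machines, as in the 2 quad-core plus 1 dual-core example above, are out of reach of multiprocessing.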

-- Reuti


> . Somehow Python needs to know how to use the CPUs (or cores) reserved by the SGE qsub, right? How?
> 
> If anyone can give me some script to do that, it would be really nice.
> 
> Thanks 
> 
> Chris

------------------------------------------------------
http://gridengine.sunsource.net/ds/viewMessage.do?dsForumId=38&dsMessageId=257570

To unsubscribe from this discussion, e-mail: [users-unsubscribe at gridengine.sunsource.net].



More information about the gridengine-users mailing list