[GE users] dependent jobs on same host?

daireb Daire.Byrne at framestore.com
Tue Nov 11 16:03:41 GMT 2008

You could make each pair hard-request a particular hostname, but that is not very efficient. Perhaps use a custom load sensor instead: have the first job write a marker file to /tmp (say), and have the load sensor detect it and set run_here=1. Then submit all the subsequent jobs with a hard request for "run_here=1". You would also have to make sure all the other jobs on the farm make a soft request NOT to run on that machine. There may be a better way!
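A minimal sketch of what such a load sensor might look like, assuming a custom complex named "run_here" has already been defined (e.g. via `qconf -mc`) and that the first job writes a marker file at a path of your choosing — both the complex name and the marker path here are my own placeholders, not anything Grid Engine defines:

```shell
#!/bin/sh
# Hypothetical SGE load sensor sketch. It follows the standard load
# sensor protocol: read a line from stdin each cycle, exit on "quit",
# otherwise emit a begin/end block of host:complex:value lines.
# "run_here" and the marker path are assumptions for illustration.

HOST=`hostname`
MARKER=${1:-/tmp/run_here_marker}   # hypothetical file written by the step-1 job

report() {
    # Report run_here=1 only while the marker file exists on this host.
    if [ -f "$MARKER" ]; then VAL=1; else VAL=0; fi
    echo "begin"
    echo "$HOST:run_here:$VAL"
    echo "end"
}

# sge_execd writes "quit" to the sensor's stdin when it should stop.
while read input; do
    [ "$input" = "quit" ] && exit 0
    report
done
```

The step-2 jobs would then be submitted with a hard request (`qsub -l run_here=1 ...`) and everything else with a soft request to avoid the machine (`qsub -soft -l run_here=0 ...`), along the lines described above.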


----- "udo waechter" <udo.waechter at uni-osnabrueck.de> wrote:

> Hello again.
> I have many jobs which are split up into 2 steps, where step 2 is
> dependent on step 1.
> I submit those with
> 	qsub -N jstep1_N ...
> and
> 	qsub -N jstep2_N -hold_jid jstep1_N ....
> where N is a unique int. Thus, each of these tuples is unique.
> Are those two jobs executed on the same execution host? If not, is it 
> possible to somehow ensure that this happens?
> Is it possible to do this without parsing qstat output or similar?
> Thanks.
> udo.
> btw.: Thanks to all developers for this great piece of software. I  
> simply love it.
> -- 
> ---[ Institute of Cognitive Science @ University of Osnabrueck
> ---[ Albrechtstrasse 28, D-49076 Osnabrueck, 969-3362
> ---[ Documentation: https://doc.ikw.uni-osnabrueck.de
> ------------------------------------------------------
> http://gridengine.sunsource.net/ds/viewMessage.do?dsForumId=38&dsMessageId=88480
> To unsubscribe from this discussion, e-mail:
> [users-unsubscribe at gridengine.sunsource.net].



More information about the gridengine-users mailing list