[GE users] SGE/6.2 and IBM x3950m2

reuti reuti at staff.uni-marburg.de
Thu Feb 4 21:18:09 GMT 2010


Hi,

On 04.02.2010 at 19:55, jbhoren wrote:

> On Thu, Dec 24, 2009 at 2:12 AM, adary <adary at marvell.com> wrote:
> We use a two-node x3950, each node with 128 GB RAM and 4 quad-core
> CPUs, with RHEL 4.7 installed.
>
> As far as Linux and SGE are concerned, it is just one single host
> with 32 CPUs and 256 GB RAM.
>
> I'm curious about how "smart" the Linux kernel's scheduler is.
> Specifically, given a job that requires multiple CPUs (really,
> multiple cores), will it be assigned to CPUs/cores on the same
> card, or randomly to available CPUs/cores on separate cards?

I would suggest raising this question on the beowulf.org list.

Nevertheless, note the new core-binding facility in SGE 6.2u5 (man
qsub, option -binding):
http://wikis.sun.com/display/gridengine62u5/Using+Job+to+Core+Binding
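
For illustration, a minimal sketch of such a submission (the
parallel environment name "smp" and the script name "job.sh" are
placeholders, not from this thread):

  # Request 4 slots and bind the job to 4 successive cores, so a
  # multithreaded job stays on one socket where possible:
  qsub -pe smp 4 -binding linear:4 job.sh

  # Inside the job script, the effective binding can be checked
  # with standard Linux tools (taskset is util-linux, not SGE):
  taskset -cp $$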

-- Reuti


> Also, a question about resource management: our users (~250) access
> an existing cluster (16 nodes, running SGE/6.0u10) via a web-based
> portal, configured such that all jobs are submitted under the same
> user name -- "apache". When we bring our three-node x3950 online
> (and re-purpose the current cluster), access will continue to be
> via that web-based portal.
>
> Has anyone developed a methodology for allocating resources when
> all jobs run under a single user?
>
> Given that our users (graduate students, faculty, and researchers)  
> generally work on one project at a time, I thought about using  
> "project" in place of "user". Has anyone done this, or something  
> similar? If so, care to share? Thoughts? Suggestions?
>
>
> -- 
> JONATHAN B. HOREN
> Systems Administrator
> UAF Life Science Informatics
> Center for Research Services
> jbhoren at alaska.edu
> http://biotech.inbre.alaska.edu
>

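Regarding the single-user question above, one possible sketch (the
project name and slot limit are placeholders; this assumes SGE
projects plus the functional policy and, for the quota, SGE 6.1 or
later):

  # Add a project; in the editor that opens, set "name" (e.g.
  # projectA) and give "fshare" a non-zero value so the project
  # gets a weight in the functional policy:
  qconf -aprj

  # The portal keeps submitting everything as "apache", but tags
  # each job with the requesting project:
  qsub -P projectA job.sh

  # Optionally cap the slots a single project may hold at once
  # with a resource quota set:
  qconf -arqs per_project
  #   { name    per_project
  #     enabled TRUE
  #     limit   projects {*} to slots=16 }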