Dear experts,
I am trying to make our ARC CE advertise the correct number of CPUs
(both logical and physical),
however, the information system always seems to prefer the
values reported by HTCondor [1].
HTCondor is configured with partitionable slots on 12 worker nodes
(hence 12 slots).
In the documentation I found the option
cpudistribution=Ncpu:M
for the [cluster] section, where N is the number of CPUs per machine
and M the number of nodes with that configuration.
I initially set it to '16cpu:12', but it was not picked up. I also
tried '16:12' (the form that appears in the GLUE information), and I
tried putting the option in the [queue] block instead.
None of this helped.
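For concreteness, the [cluster] fragment I tried looks roughly like
the following (other options elided; the quoting follows the usual
arc.conf style, but I may well be getting the syntax wrong, which is
part of my question):

```ini
[cluster]
# 12 worker nodes, each with 16 CPUs (partitionable slots in HTCondor)
cpudistribution="16cpu:12"
```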
Any ideas?
Cheers,
Luke
[1]
dn: GLUE2ManagerID=urn:ogf:ComputingManager:lcgce01.phy.bris.ac.uk:condor,GLUE
2ServiceID=urn:ogf:ComputingService:lcgce01.phy.bris.ac.uk:arex,GLUE2GroupID=
services,GLUE2DomainID=UKI-SOUTHGRID-BRIS-HEP,o=glue
GLUE2ComputingManagerComputingServiceForeignKey: urn:ogf:ComputingService:lcgc
e01.phy.bris.ac.uk:arex
GLUE2EntityValidity: 60
GLUE2ComputingManagerWorkingAreaFree: 32
GLUE2ManagerProductName: condor
objectClass: GLUE2Manager
objectClass: GLUE2ComputingManager
GLUE2ComputingManagerWorkingAreaGuaranteed: FALSE
GLUE2ComputingManagerWorkingAreaLifeTime: 604800
GLUE2ComputingManagerWorkingAreaTotal: 37
GLUE2ManagerProductVersion: 8.0.3
GLUE2ComputingManagerHomogeneous: FALSE
GLUE2ComputingManagerWorkingAreaShared: FALSE
GLUE2ComputingManagerBulkSubmission: FALSE
GLUE2ManagerServiceForeignKey: urn:ogf:ComputingService:lcgce01.phy.bris.ac.uk
:arex
GLUE2ManagerID: urn:ogf:ComputingManager:lcgce01.phy.bris.ac.uk:condor
GLUE2ComputingManagerSlotsUsedByLocalJobs: 0
GLUE2ComputingManagerTotalLogicalCPUs: 12
GLUE2ComputingManagerLogicalCPUDistribution: 1:12
GLUE2ComputingManagerTotalSlots: 12
GLUE2ComputingManagerSlotsUsedByGridJobs: 0
GLUE2EntityCreationTime: 2013-10-15T12:13:15Z
--
*********************************************************
Dr Lukasz Kreczko +44 (0)117 928 8724
CMS Group
School of Physics
University of Bristol
*********************************************************