How much of the GridPP resources is being used this way? While we
haven't gone into the same level of detail in our analysis, our CPU
efficiency has dropped to 10-20% a couple of times in the last few days
when we have had a lot of biomed jobs running (as opposed to the ~100%
we see when running the CMS/ATLAS/LHCb MC production that has occupied
us recently).
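
For reference, the efficiency figure here is just total CPU time divided
by total wall-clock time, summed over the jobs in question. Below is a
minimal sketch of how one could pull per-user numbers out of a batch
accounting dump - assuming a hypothetical format with one
"user wall_seconds cpu_seconds" line per job, not any particular batch
system's actual log layout:

    import sys
    from collections import defaultdict

    # Accumulate wall-clock and CPU seconds per user.
    wall = defaultdict(float)
    cpu = defaultdict(float)

    for line in sys.stdin:
        user, wall_s, cpu_s = line.split()
        wall[user] += float(wall_s)
        cpu[user] += float(cpu_s)

    for user in sorted(wall):
        # CPU efficiency = CPU time / wall-clock time, as a percentage.
        eff = 100.0 * cpu[user] / wall[user] if wall[user] else 0.0
        print("%-12s %9.1fh %9.1fh %5.1f%%" %
              (user, wall[user] / 3600, cpu[user] / 3600, eff))

As a sanity check, summing the two biomed032 rows in Alex's table below
(2907 + 1751 CPU hours against 90949 + 56169 wall hours) gives the same
~3% he quotes.
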
All the best,
david
Greig A. Cowan wrote:
> Hi,
>
> See the attached plot for the efficiency of all biomed jobs at Edinburgh
> over the past 2 weeks. A couple of points:
>
> 1. The CPU efficiency is very low for all biomed users.
> 2. Many jobs are killed off by the batch system when they reach the 24hr
> limit (exit code 137, i.e. 128 + SIGKILL).
>
> Cheers,
> Greig
>
> Alex Martin wrote, On 24/11/08 11:04:
>> I should add that this same user has submitted ~11K jobs to our HTC
>> in the last 2 weeks with a CPU/wall-clock efficiency of
>> ~4.6K hours / 146K hours =~ 3%:
>>
>> Username   njob   %   wall  user  system   cpu
>> biomed032  7288  20  90949  1466     224  2907
>> biomed032  2741  16  56169  1310     100  1751
>> cheers,
>> Alex
>>
>> On Monday 24 November 2008, Alex Martin wrote:
>>
>>> A biomed user managed to start ~1000 gridftp processes on our
>>> old SE node here last week.
>>>
>>> cheers,
>>> Alex
>>>
>>> On Monday 24 November 2008, Coles, J (Jeremy) wrote:
>>>
>>>> Dear All
>>>>
>>>> In the site reports for last week Durham report:
>>>>
>>>> "A biomed user has been transferring huge amounts of data from our SE
>>>> (>500 requests of the same 2.8GB file) to a variety of worker nodes
>>>> across Europe. Unfortunately the high bandwidth has revealed
>>>> instabilities when transferring at close to the gigabit limit. I
>>>> ticketed the user and they have distributed more replicas - but they
>>>> are not following the grid data-to-CPU model and will therefore cause
>>>> severe bandwidth issues for all sites."
>>>>
>>>> Has any other site seen such a seeding exercise taking place, or
>>>> anything related?
>>>>
>>>> Thanks,
>>>>
>>>> Jeremy
>>>>
>>
>>
>>
>>
>
>
> [attachment: plot of biomed job efficiency at Edinburgh over the past 2 weeks]