Hi Chris,
Yes, I missed those modifications - sorry (I'll change my update to the
PMB). It would be useful to have a note of why a particular test result
is changed, but the underlying problem will often not be known for a
"non-relevant" case. The Replica Management test is particularly
difficult to remark on because there are so many external areas in
which it could have failed, and if it self-corrected your comment is
unlikely to identify which one. It would be possible to check whether
other sites experienced a similar general problem, but that is more the
coordinator's role.
Perhaps we ought to press for better definitions of the terms to include
"unknown". Currently the portal says:
"This state field stand for the relevancy of the SFT result. It means
that a SFT result is relevant if its failure has actually been caused by
a site problem, and it is not relevant if the site was working well at
SFT submission but the SFT failure occurred because of an external
reason as a SFT bug, or any third party failures (RB, BDII, ...), or
whatever else."
Thanks for responding,
Jeremy
-----Original Message-----
From: Testbed Support for GridPP member institutes
[mailto:[log in to unmask]] On Behalf Of Brew, CAJ (Chris)
Sent: 23 January 2006 11:14
To: [log in to unmask]
Subject: Re: Thanks for your responses! (was RE: Site admins - please
review your site pre-report!)
Hi Jeremy,
I reviewed the RALPP report and all the issues were Replica Management
SFT failures that resolved on their own without intervention. I then
marked these as "Not Relevant" and didn't think I needed to comment
further. In previous weeks I have commented where the problem was
actually at my site.
Do I need to do more?
Yours,
Chris.
> -----Original Message-----
> From: Testbed Support for GridPP member institutes
> [mailto:[log in to unmask]] On Behalf Of Coles, J (Jeremy)
> Sent: 23 January 2006 10:54
> To: [log in to unmask]
> Subject: Thanks for your responses! (was RE: Site admins -
> please review your site pre-report!)
>
> Dear All
>
>
>
> I have just been through the online weekly reports and wish
> to thank most of you for your support in this area - I hope
> it will continue. The information and observations are useful
> not only to us but grid-wide in trying to resolve problems.
>
>
>
> I am pleased to say that, of the sites in the report, the
> following responded:
>
> - Grid Ireland (for all sites!), LeSC, Bristol, Glasgow,
> Edinburgh, Brunel, Durham, Liverpool, Oxford, RHUL, RAL,
> UCL-CCC, Lancaster, IC and QMUL
>
>
>
> Unfortunately no information was entered regarding problems at:
>
> - UCL-HEP, RALPP, Birmingham, Sheffield and Cambridge.
>
>
>
> Hopefully we will achieve full feedback from next week
> onwards. The reports can be edited from Friday morning until
> Monday morning and cover the previous 7 days. If any of you
> have comments about the process being used or suggestions for
> improvements (including what you want to get out of this
> investment in time), please let me know.
>
>
>
> Other information:
>
>
>
> 1) The 2.7.0 pre-release tests have mostly completed
> and feedback is being used to prepare the release for next week.
>
> 2) Our resource usage is still not high and there are
> plenty of new VOs that can be enabled. The deployment team
> will aim to get more information online to help you enable
> more VOs as part of the 2.7.0 upgrade. (NB: VOs can be enabled
> to use free cycles and do not need an allocation. LHC VOs at
> GridPP sites are likely to need shares assigned in Maui (or
> similar) as competition for resources grows over the coming years.)
>
> 3) UKI performance has definitely improved over the
> last month:
> https://lxb2001.cern.ch:8443/sft/history_metrics.php?interval=monthly
>
>
>
>
>
> Thanks again for your help,
>
> Jeremy
>
>
>
>
>
> -----Original Message-----
> From: Testbed Support for GridPP member institutes
> [mailto:[log in to unmask]] On Behalf Of Coles, J (Jeremy)
> Sent: 20 January 2006 15:26
> To: [log in to unmask]
> Subject: Site admins - please review your site pre-report!
>
>
>
> Dear Site Admins
>
> As you will be aware, the Regional Operations Centre (ROC)
> reports are now available for site admins to update from
> 07:00 (GMT) Friday to 10:00 (GMT) Monday. Please could I urge
> you once again to login via the CIC portal
> (https://cic.in2p3.fr/index.php?id=rc&subid=rc_report&js_status=2)
> to check *YOUR* site report AND ADD REMARKS about the problem
> encountered if a particular SFT failure is
> understood. If the test result is deemed to not be relevant
> as a site problem please change the appropriate status flag
> from "relevant" to "non-relevant". If you are not sure, there
> is also an "unsure" option! You can also use the text boxes
> in the report to raise particular service problems and issues
> to our attention.
>
> These reports will be viewed by both the deployment team and
> myself, and as usual passed on for review at the EGEE weekly
> operations meeting on Monday at 13:00 (GMT). Notes from this
> meeting can always be found here:
> http://agenda.cern.ch/displayLevel.php?fid=258
>
> I also take this opportunity to inform you again that on
> Monday (23rd January) the UKI ROC helpdesk will be running at
> risk during an upgrade to Footprints 7.0. Although we do not
> expect any problems there is always a risk of wrong
> configuration leading to unexpected behavior (such as tickets
> being incorrectly assigned). If you notice anything of this
> nature please email me or David Spence
> ([log in to unmask]) explaining
> the problem.
>
> Thank you for your help, it is much appreciated.
>
> Have a nice weekend,
>
> Jeremy
>
> --------------------------------------------------------------
>
> Dr Jeremy Coles
>
> GridPP Production Manager
>
> CCLRC eScience Department        Phone: (+44)(0)1235 778256
>
> Rutherford Appleton Laboratory   Fax:   (+44)(0)1235 446626
>
> Chilton, Didcot, Oxon, OX11 0QX, UK Email: [log in to unmask]
>
> --------------------------------------------------------------
>
>