Hi David, All,
Thanks for an interesting discussion. Here is a list of the comments I
wanted to make, as I fear I may not have made them very well in the meeting.
* I fully agree that a lightweight approach centred on sharing
information and knowledge is the right way to go. I can't see a
structure with more committees and boards trying to dictate standards
attracting much participation. (Unless there is lots of money on the
table, of course, but I don't see much sign of that.)
* I would suggest that focussing on sharing expertise, and helping
projects with testing, is a better approach than setting up a big
central infrastructure and expecting everyone to use it. But a central build and
test infrastructure for those without the knowledge or inclination to
set up their own could still be useful. A central issue tracking system
might be less useful, unless it interoperates smoothly with all the
other systems used by individual projects. Experience with Savannah does
not inspire confidence.
* We should demonstrate a lack of arrogance by looking at what other
communities have built, as well as at the wonderful things we have to
offer them.
* It would be well worth talking to Neil Chue Hong or others at the
Software Sustainability Institute (SSI), as they have put considerable
thought and effort into tackling similar issues across a broader range
of research areas. Their
[manifesto](http://www.software.ac.uk/policy/manifesto) states that they
"believe that the full benefits of software in research will only be
realised when software is accepted as a valid research output." This
ties into some of our discussion of ensuring recognition and career
paths for those specialising in software.
* At UCL we (the central Research Software Development Team and others)
are drawing up a discussion paper about career paths in research software
engineering. I'll share this with anyone interested when it is ready for
a wider audience.
* Increasing the efficiency of code is important, and we have to make
use of advances in computer architectures, but *correct* code is even
more important, so training and support in software testing are essential.
* DOIs for software: James Adams has already mentioned that GitHub
repositories can now be associated with DOIs thanks to a collaboration
with [Mozilla Science Lab](http://mozillascience.org/): see [Improving
GitHub for
science](https://github.com/blog/1840-improving-github-for-science). The
[Journal of Open Research
Software](http://openresearchsoftware.metajnl.com/) may also be of interest.
* Analyses depend not only on the massive frameworks built by
collaborations, but on smaller programs and scripts written by
non-specialist programmers. Training for software users and physicists
who don't regard themselves primarily as programmers is valuable,
although this may not be central to the sort of collaboration that is
being discussed at the moment. I would like to throw in a plug for
[Software Carpentry](http://software-carpentry.org/).
* What would be really useful is a small team of experts who can manage
information, provide guidance on issues like copyright and open-source
software licences, and maintain awareness of which requirements are not
yet covered by HEP software, which have been covered many times over,
and which are met by software from other fields that could be reused in
HEP.
Best regards,
Ben
On 22/05/14 14:39, David Colling wrote:
> Just a reminder that we have a meeting tomorrow and one of the things
> that will appear on the agenda is the attached document which is a draft
> response to the meeting on HeP software at CERN
>
--
Dr Ben Waugh Tel. +44 (0)20 7679 7223
Computing and IT Manager Internal: 37223
Dept of Physics and Astronomy
University College London
London WC1E 6BT