==============================================================================
1st Workshop on Understanding Reproducibility in Evolutionary Computation
http://lopez-ibanez.eu/reproducibility-gecco/
co-organised with
Good Benchmarking Practices for Evolutionary Computation (Benchmarking@GECCO-2021)
https://sites.google.com/view/benchmarking-network/home/activities/gecco-2021-workshop
to be held online as part of
The Genetic and Evolutionary Computation Conference (GECCO 2021),
July 10-14 2021.
http://gecco-2021.sigevo.org
==============================================================================
Experimental studies are prevalent in Evolutionary Computation (EC), and
concerns about the reproducibility and replicability of such studies have
increased in recent years, following similar discussions in other scientific
fields. In this workshop, we want to raise awareness of the reproducibility
issue, shed light on the obstacles encountered when trying to reproduce
results, and discuss best practices for making results reproducible as well as
for reporting reproducibility studies.
We invite submissions of papers repeating an empirical study published in a
journal or conference proceedings, whether by re-using, updating or
reimplementing the necessary code and datasets, irrespective of whether that
code was published in some form at the time.
* The original study being reproduced should not be so recent as to make the
reproduction attempt trivial. Ideally, we suggest looking at studies that
are at least 10 years old. However, one of the criteria for acceptance is
what the GECCO community can learn from the reproducibility study.
* At least one of the co-authors of the submitted paper should be a co-author
of the original study. This condition ensures that the attempt at reproducing
the original work is a fair one.
We expect the submitted paper to include:
* Documentation of the process necessary to re-run the experiments. For
example, you may have to retrieve the benchmark problems from the web,
downgrade your system or some libraries, modify your original code because
some library is nowhere to be found, reinstall a specific compiler or
interpreter, etc.
* A discussion on whether you consider your paper reproducible, and why you
think this is the case. If you ran your code with fixed random seeds and you
have recorded them, you may be able to reproduce identical results. If you
haven’t recorded the random seeds, you may need to use statistical tests to
check whether the conclusions still hold. You may even want to try
different problem instances or parameter settings to check whether the
conclusions hold under slightly different experimental conditions.
* Sufficient details to allow an independent reproduction of your experiment
by a third party, including all necessary artifacts used in the attempt to
reproduce results (code, benchmark problems, scripts to generate plots or do
statistical analysis). Artifacts should be made publicly and permanently
available via Zenodo (https://zenodo.org) or other data repository or
submitted together with the paper to be archived in the ACM Digital Library.
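As an illustration of the fixed-seed scenario described above (a minimal
Python sketch, not a submission requirement; `run_experiment` is a
hypothetical stand-in for a stochastic EC experiment):

```python
import random

def run_experiment(seed):
    """Toy stand-in for a stochastic experiment: the outcome depends
    only on the recorded seed, so a re-run is bitwise reproducible."""
    rng = random.Random(seed)  # instance-local RNG: no hidden global state
    return [rng.random() for _ in range(5)]

# Record the seeds alongside the results at experiment time...
seeds = [42, 7, 2021]
original = {s: run_experiment(s) for s in seeds}

# ...so that, years later, re-running with the same seeds gives
# identical results. Without the recorded seeds, one would instead
# compare result distributions with a statistical test.
reproduced = {s: run_experiment(s) for s in seeds}
assert original == reproduced
```

If the seeds were not recorded, the final comparison would be replaced by a
statistical test (e.g. a rank-based two-sample test) on the original and
reproduced result samples.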
In the end, there may be various possible outcomes, and all are acceptable for
a paper: you are unable to run or compile the code; you are able to run the
code, but it does not give you the expected results (or no result at all); the
program crashes regularly before producing results; you do not remember the
parameter settings used; etc. All of these are valid conclusions. We care more
about the description of the process, the challenges of reproducing results,
and the lessons to be learned than about whether you have actually been able
to reproduce the study.
Submission Instructions
------------------------
In addition to the instructions above, authors should refer to the GECCO
Submission Instructions:
https://gecco-2021.sigevo.org/Call-for-Workshop-Papers
Please note that GECCO 2021 will be held as an electronic-only conference. All
accepted papers will be required to be presented in the form of a pre-recorded
talk. More details about this will be provided soon.
Important Dates
---------------
- 11 February 2021: Submissions open
- 12 April 2021: Submissions deadline
- 26 April 2021: Acceptance decisions
- 3 May 2021: Camera-ready papers due and author registration deadline
- July 10-14 2021: Online GECCO conference
Organizers
----------
Jürgen Branke (University of Warwick, UK)
Carola Doerr (CNRS researcher at Sorbonne University, Paris, France)
Tome Eftimov (Jožef Stefan Institute, Ljubljana, Slovenia)
Pascal Kerschke (University of Münster, Germany)
Manuel López-Ibáñez (University of Málaga, Spain)
Boris Naujoks (TH Cologne, Germany)
Luís Paquete (University of Coimbra, Portugal)
Vanessa Volz (modl.ai, Copenhagen, Denmark)
Thomas Weise (Hefei University, China)