Artifact Evaluation Track

"Result and Artifact Review and Badging" policy

In this spirit, the ICPE 2018 Artifacts Track exists to review, promote, share, and catalog the research artifacts produced by any of the full papers accepted to the research track. Apart from supporting repeatability and replicability, cataloging these artifacts also allows other teams to reuse them in reproduction or other studies. Artifacts of interest include (but are not limited to):

  • Tools, libraries or frameworks, which are implementations of systems or algorithms essential for the results described in the associated paper, possibly also useful in other work.

  • Data or repositories, which are essential for the results described in the associated paper, ideally also useful in other work.

The authors must ensure that, by the camera-ready deadline, the artifacts are generally available from a stable URL or DOI with an archival plan, such as the SPEC RG Zenodo repository (a personal web page is not sufficient).

If you require an exception from the conditions above, please e-mail the chairs before submitting.


What do you get out of it?

If your artifact is accepted, it will receive one of the following badges in the text of the paper and in the ACM Digital Library:

  • Artifacts Evaluated - Functional: The artifacts are complete, well documented, and make it possible to obtain the same results as the paper.

  • Artifacts Evaluated - Reusable: As above, but the artifacts are of such high quality that they can be reused as-is on other data sets or for other purposes.

  • Artifacts Available: Awarded to artifacts made permanently available. This badge is only awarded in conjunction with one of the Artifacts Evaluated badges.


In addition to archival storage, all accepted artifacts will be indexed on the conference web site.


How to submit?

Submissions are made via EasyChair by selecting "Artifact Track".

Submission deadlines are listed on the important dates page.


To submit an artifact for your accepted ICPE 2018 full research track paper, keep two things in mind: a) how accessible you are making your artifact to other researchers, and b) that the ICPE artifact evaluators will have very limited time to assess each artifact. Artifacts whose configuration and installation take an undue amount of time may be rejected. If you anticipate difficulties, please provide your artifact in an easily portable form, such as a virtual machine image (https://www.virtualbox.org) or a container image (https://www.docker.com).

In any case, your artifact should be made available as a link to a single archive file in a widely available compressed archive format (preferably zip or tar.gz).
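As a minimal sketch of this packaging step (assuming a hypothetical artifact directory named my-artifact that already contains index.html and all artifact files), the following Python script produces such an archive:

  # pack_artifact.py - a minimal sketch; "my-artifact" is a hypothetical directory name
  import tarfile

  # Pack the artifact directory into a single compressed archive,
  # keeping everything under one top-level folder inside the archive.
  with tarfile.open("my-artifact.tar.gz", "w:gz") as archive:
      archive.add("my-artifact", arcname="my-artifact")
  print("Wrote my-artifact.tar.gz")

Unpacking the archive should then yield exactly the layout described in the checklist below.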

The repository or archive must:

  • be self-contained (with the exception of pointers to external tools or libraries, which we do not consider part of the evaluated artifact but will try to use when evaluating it)
  • contain an HTML file called index.html that fully describes the artifact and includes (relative) links to the files (included in the archive) that constitute the artifact (a layout check is sketched after this list):
    • include a getting started guide that stresses the key elements of your artifact and enables the reviewers to run, execute, or analyze it without technical difficulty
    • where appropriate, include descriptions of and links to files (included in the archive) that represent expected outputs
  • contain the artifact itself, which may include, but is not limited to, source code, executables, data, a virtual machine image, and documents (please use open formats for documents)

  • contain the submitted version of your research track paper

  • optionally contain a link to a short video (maximum 5 minutes) demonstrating the artifact
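As a rough illustration of the checklist above (the directory name my-artifact is again a hypothetical placeholder), the following Python sketch verifies that index.html exists at the archive root and that its relative links resolve to files inside the archive; absolute URLs, such as pointers to external tools, are skipped:

  # check_layout.py - a minimal sketch; adjust ROOT to your unpacked artifact directory
  import os
  import sys
  from html.parser import HTMLParser

  ROOT = "my-artifact"  # hypothetical directory name

  class LinkCollector(HTMLParser):
      # Collect href/src attribute values from index.html.
      def __init__(self):
          super().__init__()
          self.links = []

      def handle_starttag(self, tag, attrs):
          for name, value in attrs:
              if name in ("href", "src") and value:
                  self.links.append(value)

  index_path = os.path.join(ROOT, "index.html")
  if not os.path.isfile(index_path):
      sys.exit("Missing index.html at the archive root")

  parser = LinkCollector()
  with open(index_path, encoding="utf-8") as f:
      parser.feed(f.read())

  # Relative links must resolve to files inside the archive;
  # absolute URLs (external tools and libraries) are not checked.
  for link in parser.links:
      if "://" in link or link.startswith("#"):
          continue
      target = os.path.join(ROOT, link.split("#")[0])
      if not os.path.exists(target):
          print("Broken relative link:", link)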

To facilitate artifact review, include the link to your artifact, together with the other requested information, on the ICPE artifact submission site.


Review Process and Selection Criteria

Submitted artifacts will go through a two-phase evaluation:

  • Kicking the tires: reviewers check the artifact integrity and look for any possible setup problems that may prevent it from being properly evaluated (corrupted or missing files, VM does not start, immediate crashes on the simplest example). Authors are informed of the outcome and, in case of technical problems, they can help solve them during a brief author response period.

  • Artifact assessment: reviewers evaluate the artifacts, checking if they live up to the expectations created by the paper.

Because portability bugs are easy to introduce, the review committee may issue additional requests during the assessment asking authors to fix such bugs in the artifact. The resulting version of the artifact is considered "final" and is the basis for the reviewers' decision on artifact acceptance and badges.
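One way to pass the kick-the-tires phase smoothly is to ship a small smoke test with the artifact. The sketch below is purely illustrative: run_experiment.py, examples/minimal.csv, and expected/minimal.out are hypothetical names standing in for your own entry point, smallest input, and expected output.

  # smoke_test.py - an illustrative sketch; all file names are hypothetical
  import subprocess
  import sys

  # Run the smallest example end to end, with a timeout so that a hang fails fast.
  result = subprocess.run(
      [sys.executable, "run_experiment.py", "examples/minimal.csv"],
      capture_output=True, text=True, timeout=120,
  )
  if result.returncode != 0:
      sys.exit("Smoke test failed:\n" + result.stderr)

  # Compare against the expected output shipped with the artifact.
  with open("expected/minimal.out", encoding="utf-8") as f:
      expected = f.read()
  print("OK" if result.stdout == expected else "Output differs from expected")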

Artifacts will be scored using the following criteria:

  • Artifacts Evaluated - Functional:
    • Documented: Is the artifact accompanied by relevant documentation that makes it easy to use?

    • Consistent: Is the artifact relevant to the associated paper, and does it contribute in some inherent way to the generation of its main results?

    • Complete: To the extent possible, are all components relevant to generating the main results of the associated paper included?

    • Exercisable: If the artifact is executable, is it easy to download, install, and execute? Can the included scripts and/or software used to generate the results in the associated paper be successfully executed, and can the included data be accessed and appropriately manipulated?

  • Artifacts Evaluated - Reusable: In addition to meeting the Functional criteria, is the artifact of such high quality that it can be reused as-is on other data sets or for other purposes?

  • Artifacts Available: Is the artifact permanently available from a stable URL or DOI with an archival plan (see above)?

Artifact Evaluation Chairs

  • Wilhelm Hasselbring, Kiel University, Germany
  • Petr Tuma, Charles University, Czech Republic


Evaluation Committee

  • Holger Eichelberger, University of Hildesheim, Germany
  • Vincenzo Ferme, University of Lugano, Switzerland
  • Alexey S. Ilyushkin, TU Delft, The Netherlands
  • Holger Knoche, b+m Informatik AG, Germany
  • Haiyang Sun, University of Lugano, Switzerland
  • Michael Vierhauser, University of Notre Dame, United States of America
  • Felix Willnecker, fortiss GmbH, Germany


---

Information from previous evaluation tracks reused with permission.