November 01–04, 2015, Dagstuhl Perspectives Workshop 15452

Artifact Evaluation for Publications


Bruce R. Childers (University of Pittsburgh, US)
Grigori Fursin (cTuning – Cachan, FR)
Shriram Krishnamurthi (Brown University – Providence, US)
Andreas Zeller (Universität des Saarlandes, DE)

Information about this Dagstuhl Perspectives Workshop is provided by the

Dagstuhl Service Team


Dagstuhl Report, Volume 5, Issue 11


Computer systems researchers have developed numerous artifacts that encompass a broad collection of software tools, benchmarks, and data sets. These artifacts are used to prototype innovations, evaluate trade-offs, and analyze implications. Unfortunately, the methods used to evaluate computing system innovations are often at odds with sound science and engineering practice. The ever-increasing pressure to publish more results poses an impediment to accountability, which is a key component of the scientific and engineering process. Experimental results are not usually disseminated with sufficient metadata (e.g., software extensions, data sets, benchmarks, test cases, scripts, parameters) to achieve repeatability and/or reproducibility. Without this information, issues surrounding trust, fairness, and building on and comparing with previous ideas become problematic. Efforts are underway in various computer systems research (CSR) sub-communities, including programming languages/compilers, computer architecture, and high-performance computing, to address this challenge.

This Dagstuhl Perspectives Workshop (PW) brought together stakeholders of associated CSR sub-communities to determine synergies and to identify the most promising directions and mechanisms to push the broader community toward accountability. The PW assessed current efforts, shared what does and doesn't work, identified additional processes, and determined possible incentives and mechanisms. The outcomes from the workshop, including recommendations to catalyze the community, are separately documented in an associated Dagstuhl Manifesto.

Summary text license
  Creative Commons BY 3.0 Unported license
  Bruce R. Childers, Grigori Fursin, Shriram Krishnamurthi, and Andreas Zeller


  • Hardware
  • Optimization / Scheduling
  • Software Engineering


  • Empirical Evaluation of Software Tools
  • Documentation of Research Processes
  • Artifact Evaluation
  • Experimental Reproducibility


The Dagstuhl Reports series documents all Dagstuhl Seminars and Dagstuhl Perspectives Workshops. Together with the seminar's collector, the organizers compile a report that summarizes the authors' contributions and adds an overall summary.



Dagstuhl's Impact

Please inform us if a publication arises from your seminar. Such publications are listed separately in the Dagstuhl's Impact section and displayed on the ground floor of the library.


It is also possible to publish a comprehensive collection of peer-reviewed papers in the Dagstuhl Follow-Ups series.