https://www.dagstuhl.de/15452
November 1 – 4, 2015, Dagstuhl Perspectives Workshop 15452
Artifact Evaluation for Publications
Organizers
Bruce R. Childers (University of Pittsburgh, US)
Grigori Fursin (cTuning – Cachan, FR)
Shriram Krishnamurthi (Brown University – Providence, US)
Andreas Zeller (Universität des Saarlandes, DE)
Documents
Dagstuhl Report, Volume 5, Issue 11
Summary
Computer systems researchers have developed numerous artifacts that encompass a broad collection of software tools, benchmarks, and data sets. These artifacts are used to prototype innovations, evaluate trade-offs, and analyze implications. Unfortunately, the methods used to evaluate computing system innovations are often at odds with sound science and engineering practice. The ever-increasing pressure to publish more and more results impedes accountability, a key component of the scientific and engineering process. Experimental results are not usually disseminated with sufficient metadata (e.g., software extensions, data sets, benchmarks, test cases, scripts, and parameters) to achieve repeatability or reproducibility. Without this information, trust, fairness, and the ability to build on and compare with previous ideas all become problematic. Efforts to address this challenge are underway in several computer systems research sub-communities, including programming languages/compilers, computer architecture, and high-performance computing.
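As one illustration of what such accompanying metadata might look like in practice, the following is a minimal sketch of a machine-readable artifact manifest written in Python. All field names and values are hypothetical and are not prescribed by the workshop or by any artifact evaluation process; they merely mirror the kinds of information listed above (software versions, data sets, benchmarks, scripts, and parameters).

# Hypothetical sketch: record the metadata needed to rerun an experiment.
import json

manifest = {
    "artifact": "example-compiler-pass",             # hypothetical artifact name
    "software": {"llvm": "3.7.0", "python": "3.5"},  # toolchain versions used
    "datasets": ["example-input-set"],               # inputs needed to rerun experiments
    "benchmarks": ["bzip2", "mcf"],                  # workloads evaluated
    "scripts": ["run_all.sh", "plot_results.py"],    # entry points for repeating runs
    "parameters": {"opt_level": "-O3", "trials": 10} # settings behind reported numbers
}

with open("artifact_manifest.json", "w") as f:
    json.dump(manifest, f, indent=2)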
This Dagstuhl Perspectives Workshop (PW) brought together stakeholders from these computer systems research (CSR) sub-communities to identify synergies and the most promising directions and mechanisms for pushing the broader community toward accountability. The PW assessed current efforts, shared what does and does not work, identified additional processes, and determined possible incentives and mechanisms. The outcomes of the workshop, including recommendations to catalyze the community, are documented separately in an associated Dagstuhl Manifesto.


Classification
- Hardware
- Optimization / Scheduling
- Software Engineering
Keywords
- Empirical Evaluation of Software Tools
- Documentation of Research Processes
- Artifact Evaluation
- Experimental Reproducibility