
Title: Tackling the Reproducibility Problem in Systems Research with Declarative Experiment Specifications

Validating experimental results in the field of computer systems is a challenging task, mainly due to the many changes in software and hardware that computational environments go through. Determining whether an experiment is reproducible entails two separate tasks: re-executing the experiment and validating the results. Existing reproducibility efforts have focused on the former, devising techniques and infrastructures that make it easier to re-execute an experiment. In this work we focus on the latter by analyzing the validation workflow that a researcher re-executing an experiment goes through. We observe that results are validated on the basis of experiment design and high-level goals, rather than exact quantitative metrics. Based on this insight, we introduce a declarative format for specifying the high-level components of an experiment, as well as for describing generic, testable conditions that serve as the basis for validation. We present a use case in the area of storage systems to illustrate the usefulness of this approach. We also discuss limitations and potential benefits of using this approach in other areas of experimental systems research.
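The idea of validating against design-level, testable conditions rather than exact numbers can be illustrated with a short sketch. The spec layout, field names, metric names, and thresholds below are all hypothetical, invented for illustration; they are not the paper's actual format, only a minimal sketch of the kind of declarative specification and validation check the abstract describes.

```python
# Hypothetical declarative experiment specification. The re-executed
# experiment passes validation if the high-level conditions hold, even
# when the raw numbers differ from the original run.
spec = {
    "goal": "read throughput scales with the number of storage nodes",
    "independent_vars": ["num_nodes"],
    "dependent_vars": ["throughput_mbps"],
    # Conditions are phrased against design expectations (e.g. scaling
    # behavior), not exact quantitative metrics from the original run.
    "validations": [
        # Throughput at 4 nodes should be at least 3.5x that at 1 node.
        lambda r: r["throughput_mbps"][4] / r["throughput_mbps"][1] >= 3.5,
    ],
}

def validate(spec, results):
    """Return True iff every declared condition holds for this re-execution."""
    return all(cond(results) for cond in spec["validations"])

# Made-up re-execution results: throughput measured at 1 and at 4 nodes.
results = {"throughput_mbps": {1: 100.0, 4: 380.0}}
print(validate(spec, results))  # True: 380/100 = 3.8 >= 3.5
```

Because the conditions are ordinary predicates over the collected metrics, the same spec can be evaluated automatically against any re-execution of the experiment.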
Authors: [1]; [1]; [2]; [3]; [3]; [4]; [4]
  1. Univ. of California, Santa Cruz, CA (United States)
  2. Sandia National Lab. (SNL-CA), Livermore, CA (United States)
  3. Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
  4. Univ. of Wisconsin, Madison, WI (United States)
Resource Type: Technical Report
Research Org: Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
Country of Publication: United States