OSTI.GOV · U.S. Department of Energy
Office of Scientific and Technical Information

Title: Data & model conditioning for multivariate systematic uncertainty in model calibration, validation, and extrapolation.

Conference · OSTI ID: 989995

This paper discusses the implications and appropriate treatment of systematic uncertainty in experiments and modeling. Systematic uncertainty exists when experimental conditions, measurement bias errors, and/or biases introduced by post-processing the data are constant over the set of experiments, but the particular values of those conditions and/or biases are unknown to within some specified uncertainty. Unlike random uncertainty, which is revealed when multiple experiments are performed, systematic uncertainties do not automatically show up in the output data. The output data must therefore be properly 'conditioned' to reflect important sources of systematic uncertainty in the experiments. In industrial-scale experiments, the systematic uncertainty in experimental conditions (especially boundary conditions) is often large enough that the error in inferring how the experimental system maps inputs to outputs can be substantial. Any such inference error, and the uncertainty thereof, also has implications for model validation and calibration/conditioning; ignoring systematic uncertainty in experiments can lead to 'Type X' error in these procedures. Apart from any considerations of modeling and simulation, reported uncertainty in experimental results should include the effects of any significant systematic uncertainties in the experiments. This paper describes and illustrates the treatment of multivariate systematic uncertainties of interval and/or probabilistic nature, as well as combined cases. It also outlines a practical and versatile 'real-space' framework and methodology within which experimental and modeling uncertainties (correlated and uncorrelated, systematic and random, aleatory and epistemic) are treated to mitigate risk in model validation, calibration/conditioning, hierarchical modeling, and extrapolative prediction.
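As a toy illustration of the distinction the abstract draws (not the paper's 'real-space' methodology, and with all data and parameters invented for the example), the following sketch shows why a systematic bias must be injected explicitly into repeated-experiment data: random scatter is already visible across replicates, while a bias that is constant over the whole experiment set is not. Here the interval-valued bias is handled by simple uniform sampling for brevity; a rigorous non-probabilistic interval treatment would instead propagate the interval endpoints separately.

```python
import random

random.seed(0)

# Replicate measurements of the same quantity (made-up data).
# Their scatter reveals the *random* uncertainty only.
measurements = [10.2, 9.8, 10.1, 10.4, 9.9]

# Systematic (epistemic) uncertainty on the measurement chain, in two forms:
BIAS_INTERVAL = (-0.5, 0.5)  # interval-valued: bias known only to lie in this range
BIAS_SIGMA = 0.2             # probabilistic: zero-mean normal bias, std dev 0.2

def conditioned_ensembles(data, n_samples=1000):
    """Monte Carlo over the *shared* bias: one bias draw per ensemble member,
    applied identically to every measurement, because a systematic bias is
    constant over the set of experiments."""
    ensembles = []
    for _ in range(n_samples):
        interval_bias = random.uniform(*BIAS_INTERVAL)  # one draw per ensemble
        prob_bias = random.gauss(0.0, BIAS_SIGMA)       # one draw per ensemble
        ensembles.append([y + interval_bias + prob_bias for y in data])
    return ensembles

ens = conditioned_ensembles(measurements)
means = [sum(e) / len(e) for e in ens]
# The spread of the sample mean across ensembles reflects the systematic
# uncertainty that the raw replicate scatter alone would have hidden.
print(min(means), max(means))
```

The key design point is that the bias is drawn once per ensemble member rather than once per measurement; drawing it per measurement would wrongly treat a systematic effect as random and let it average out.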

Research Organization:
Sandia National Laboratories (SNL), Albuquerque, NM, and Livermore, CA (United States)
Sponsoring Organization:
USDOE
DOE Contract Number:
AC04-94AL85000
Report Number(s):
SAND2010-1773C; TRN: US201020%%13
Resource Relation:
Conference: Proposed for presentation at the 12th AIAA Non-Deterministic Approaches Conference held April 12-15, 2010 in Orlando, FL.
Country of Publication:
United States
Language:
English