OSTI.GOV U.S. Department of Energy
Office of Scientific and Technical Information

Title: Validation of simulation codes for future systems: motivations, approach, and the role of nuclear data

Abstract

The validation of advanced simulation tools will continue to play a very significant role in several areas of reactor system analysis. This is the case for reactor physics and neutronics, where nuclear data uncertainties still play a crucial role for many core and fuel cycle parameters. The present paper gives a summary of validation motivations, objectives, and approach. A validation effort is particularly necessary for the assessment and design of advanced reactors (e.g., Generation-IV or GNEP) and their associated fuel cycles. Validation of simulation codes is complementary to the 'verification' process. In fact, 'verification' addresses the question 'are we solving the equations correctly?', while validation addresses the question 'are we solving the correct equations with the correct parameters?'. Verification implies comparisons with 'reference' equation solutions or with analytical solutions, when they exist. Most of what is called 'numerical validation' falls into this category. Validation strategies differ according to the relative weight of the methods and of the parameters that enter into the simulation tools. Most validation is based on experiments, and this paper focuses on neutronics, a field where a 'robust' physics description model exists but is a function of 'input' parameters that are not fully known. In fact, in the case of reactor core, shielding and fuel cycle physics, the model (theory) is well established (the Boltzmann and Bateman equations) and the parameters are the nuclear cross-sections, decay data, etc. Two types of validation approach can be, and have been, used: (a) mock-up experiments ('global' validation), which require a very close experimental simulation of a reference configuration; the resulting bias factors cannot be extrapolated beyond that reference configuration; (b) the use of 'clean', 'representative' integral experiments (the 'bias factor and adjustment' method), which allows bias factors and uncertainties to be defined and can be used for a wide range of applications; it also allows 'adjusted' application libraries or even 'adjusted' data files to be defined. This last approach was used with particular success in the design of SUPERPHENIX: the predicted critical mass was remarkably close to the experimental value observed at reactor start-up (a discrepancy of about 3 out of about 400 core sub-assemblies).
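
The 'bias factor and adjustment' approach summarized in the abstract rests on sensitivity and covariance analysis. The sketch below is a minimal illustration, not material from the paper: it propagates an assumed nuclear-data covariance matrix through assumed sensitivity coefficients to obtain the a-priori uncertainty of an integral parameter, then evaluates how representative an integral experiment is of a design configuration; in the idealized limit of an exact, clean experiment, the residual design uncertainty scales as sqrt(1 - r^2). All variable names and numerical values are hypothetical.

```python
# Illustrative sketch only: sensitivity-based uncertainty propagation and the
# "representativity" idea behind the bias-factor and adjustment method.
# The sensitivities, covariance values, and the exact-experiment limit are
# hypothetical assumptions, not data or formulas taken from the paper.
import numpy as np

# Relative sensitivity coefficients S = (dR/R)/(dsigma/sigma) of an integral
# parameter R (e.g., k_eff) to a few nuclear-data parameters.
S_design = np.array([0.30, -0.15, 0.05])   # target (design) configuration
S_exp    = np.array([0.28, -0.12, 0.04])   # 'representative' integral experiment

# Relative covariance matrix of the nuclear data (hypothetical values).
M = np.array([[4.0e-4, 1.0e-4, 0.0],
              [1.0e-4, 9.0e-4, 0.0],
              [0.0,    0.0,    2.5e-4]])

def rel_uncertainty(S, M):
    """A-priori relative uncertainty of R from data covariances: sqrt(S^T M S)."""
    return float(np.sqrt(S @ M @ S))

u_design = rel_uncertainty(S_design, M)
u_exp    = rel_uncertainty(S_exp, M)

# Representativity factor of the experiment with respect to the design.
r = float(S_exp @ M @ S_design) / (u_exp * u_design)

# Idealized limit (exact experiment, negligible experimental and method
# uncertainties): the residual design uncertainty scales as sqrt(1 - r^2).
u_posterior = u_design * np.sqrt(1.0 - r**2)

print(f"a-priori uncertainty on R: {u_design:.4%}")
print(f"representativity r:        {r:.3f}")
print(f"residual uncertainty:      {u_posterior:.4%}")
```

In this idealized setting, a highly representative experiment (r close to 1) removes most of the nuclear-data-induced uncertainty on the design parameter, which is the rationale for preferring 'clean', 'representative' integral experiments over one-off mock-ups.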

Authors:
Palmiotti, G.; Salvatores, M.; Aliberti, G.; Nuclear Engineering Division; INL; CEA Cadarache
Publication Date:
2007
Research Org.:
Argonne National Lab. (ANL), Argonne, IL (United States)
Sponsoring Org.:
USDOE Office of Science (SC)
OSTI Identifier:
992043
Report Number(s):
ANL/NE/CP-60290
TRN: US1007602
DOE Contract Number:
DE-AC02-06CH11357
Resource Type:
Conference
Resource Relation:
Conference: NEMEA-4 Workshop; Oct. 16, 2007 - Oct. 18, 2007; Prague, Czech Republic
Country of Publication:
United States
Language:
English
Subject:
21 SPECIFIC NUCLEAR REACTORS AND ASSOCIATED PLANTS; ANALYTICAL SOLUTION; CONFIGURATION; CRITICAL MASS; CROSS SECTIONS; DECAY; DESIGN; FORECASTING; FUEL CYCLE; PHYSICS; REACTOR CORES; REACTOR PHYSICS; REACTOR START-UP; SHIELDING; SIMULATION; VALIDATION; VERIFICATION

Citation Formats

Palmiotti, G., Salvatores, M., Aliberti, G., Nuclear Engineering Division, INL, and CEA Cadarache. Validation of simulation codes for future systems: motivations, approach, and the role of nuclear data. United States: N. p., 2007. Web.
Palmiotti, G., Salvatores, M., Aliberti, G., Nuclear Engineering Division, INL, & CEA Cadarache. Validation of simulation codes for future systems: motivations, approach, and the role of nuclear data. United States.
Palmiotti, G., Salvatores, M., Aliberti, G., Nuclear Engineering Division, INL, and CEA Cadarache. 2007. "Validation of simulation codes for future systems: motivations, approach, and the role of nuclear data." United States.
@article{osti_992043,
title = {Validation of simulation codes for future systems: motivations, approach, and the role of nuclear data},
author = {Palmiotti, G. and Salvatores, M. and Aliberti, G. and Nuclear Engineering Division and INL and CEA Cadarache},
abstractNote = {The validation of advanced simulation tools will continue to play a very significant role in several areas of reactor system analysis. This is the case for reactor physics and neutronics, where nuclear data uncertainties still play a crucial role for many core and fuel cycle parameters. The present paper gives a summary of validation motivations, objectives, and approach. A validation effort is particularly necessary for the assessment and design of advanced reactors (e.g., Generation-IV or GNEP) and their associated fuel cycles. Validation of simulation codes is complementary to the 'verification' process. In fact, 'verification' addresses the question 'are we solving the equations correctly?', while validation addresses the question 'are we solving the correct equations with the correct parameters?'. Verification implies comparisons with 'reference' equation solutions or with analytical solutions, when they exist. Most of what is called 'numerical validation' falls into this category. Validation strategies differ according to the relative weight of the methods and of the parameters that enter into the simulation tools. Most validation is based on experiments, and this paper focuses on neutronics, a field where a 'robust' physics description model exists but is a function of 'input' parameters that are not fully known. In fact, in the case of reactor core, shielding and fuel cycle physics, the model (theory) is well established (the Boltzmann and Bateman equations) and the parameters are the nuclear cross-sections, decay data, etc. Two types of validation approach can be, and have been, used: (a) mock-up experiments ('global' validation), which require a very close experimental simulation of a reference configuration; the resulting bias factors cannot be extrapolated beyond that reference configuration; (b) the use of 'clean', 'representative' integral experiments (the 'bias factor and adjustment' method), which allows bias factors and uncertainties to be defined and can be used for a wide range of applications; it also allows 'adjusted' application libraries or even 'adjusted' data files to be defined. This last approach was used with particular success in the design of SUPERPHENIX: the predicted critical mass was remarkably close to the experimental value observed at reactor start-up (a discrepancy of about 3 out of about 400 core sub-assemblies).},
place = {United States},
year = {2007}
}

Other availability
Please see Document Availability for additional information on obtaining the full-text document. Library patrons may search WorldCat to identify libraries that hold this conference proceeding.
