National Library of Energy BETA

Sample records for "uncertainty high uncertainty"

  1. Sensitivity and Uncertainty Analysis

    Broader source: Energy.gov [DOE]

    Summary Notes from 15 November 2007 Generic Technical Issue Discussion on Sensitivity and Uncertainty Analysis and Model Support

  2. Direct Aerosol Forcing Uncertainty

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Mccomiskey, Allison

    2008-01-15

    Understanding sources of uncertainty in aerosol direct radiative forcing (DRF), the difference in a given radiative flux component with and without aerosol, is essential to quantifying changes in Earth's radiation budget. We examine the uncertainty in DRF due to measurement uncertainty in the quantities on which it depends: aerosol optical depth, single scattering albedo, asymmetry parameter, solar geometry, and surface albedo. Direct radiative forcing at the top of the atmosphere and at the surface, as well as sensitivities, the changes in DRF in response to unit changes in individual aerosol or surface properties, are calculated at three locations representing distinct aerosol types and radiative environments. The uncertainty in DRF associated with a given property is computed as the product of the sensitivity and typical measurement uncertainty in the respective aerosol or surface property. Sensitivity and uncertainty values permit estimation of total uncertainty in calculated DRF and identification of properties that most limit accuracy in estimating forcing. Total uncertainties in modeled local diurnally averaged forcing range from 0.2 to 1.3 W m-2 (42 to 20%) depending on location (from tropical to polar sites), solar zenith angle, surface reflectance, aerosol type, and aerosol optical depth. The largest contributor to total uncertainty in DRF is usually single scattering albedo; however, decreasing measurement uncertainties for any property would increase accuracy in DRF. Comparison of two radiative transfer models suggests the contribution of modeling error is small compared to the total uncertainty, although comparable to uncertainty arising from some individual properties.
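
    As a rough illustration of the uncertainty-budget approach described above, the sketch below multiplies a sensitivity by a measurement uncertainty for each property and combines the contributions in quadrature. All names and numbers are hypothetical placeholders, not values from the paper.

      import math

      # Hypothetical sensitivities dDRF/dx (W m-2 per unit of each property) and
      # 1-sigma measurement uncertainties u(x); values are illustrative only.
      properties = {
          "aerosol_optical_depth":    {"sensitivity": 25.0, "meas_unc": 0.01},
          "single_scattering_albedo": {"sensitivity": 30.0, "meas_unc": 0.03},
          "asymmetry_parameter":      {"sensitivity": 10.0, "meas_unc": 0.02},
          "surface_albedo":           {"sensitivity": 15.0, "meas_unc": 0.02},
      }

      # Each property's contribution is sensitivity times measurement uncertainty.
      contrib = {name: p["sensitivity"] * p["meas_unc"] for name, p in properties.items()}

      # Total uncertainty, assuming independent errors (quadrature sum).
      total = math.sqrt(sum(c ** 2 for c in contrib.values()))
      for name, c in sorted(contrib.items(), key=lambda kv: -kv[1]):
          print(f"{name}: {c:.2f} W m-2")
      print(f"total DRF uncertainty: {total:.2f} W m-2")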

  3. Assessing Climate Uncertainty

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    The uncertainty in climate change and in its impacts is of great concern to the international community. While the ever-growing body of scientific evidence substantiates present climate change, the driving concern about this issue lies in the consequences it poses to humanity. Policy makers will most likely need to make decisions about climate policy before climate scientists have quantified all relevant uncertainties about the impacts of climate change. Sandia scientists

  4. Physical Uncertainty Bounds (PUB)

    SciTech Connect (OSTI)

    Vaughan, Diane Elizabeth; Preston, Dean L.

    2015-03-19

    This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.

  5. Measurement uncertainty relations

    SciTech Connect (OSTI)

    Busch, Paul; Lahti, Pekka; Werner, Reinhard F.

    2014-04-15

    Measurement uncertainty relations are quantitative bounds on the errors in an approximate joint measurement of two observables. They can be seen as a generalization of the error/disturbance tradeoff first discussed heuristically by Heisenberg. Here we prove such relations for the case of two canonically conjugate observables like position and momentum, and establish a close connection with the more familiar preparation uncertainty relations constraining the sharpness of the distributions of the two observables in the same state. Both sets of relations are generalized to means of order α rather than the usual quadratic means, and we show that the optimal constants are the same for preparation and for measurement uncertainty. The constants are determined numerically and compared with some bounds in the literature. In both cases, the near-saturation of the inequalities entails that the state (resp. observable) is uniformly close to a minimizing one.
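
    For the familiar quadratic-mean case, the abstract states that the preparation and measurement relations share the same optimal constant; a minimal sketch in standard notation (not the authors' order-α generalization):

      \sigma(Q)\,\sigma(P) \;\ge\; \frac{\hbar}{2} \quad \text{(preparation)},
      \qquad
      \Delta(Q)\,\Delta(P) \;\ge\; \frac{\hbar}{2} \quad \text{(measurement)}.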

  6. A High-Performance Embedded Hybrid Methodology for Uncertainty Quantification With Applications

    SciTech Connect (OSTI)

    Iaccarino, Gianluca

    2014-04-01

    Multiphysics processes modeled by a system of unsteady differential equations are naturally suited for partitioned (modular) solution strategies. We consider such a model where probabilistic uncertainties are present in each module of the system and represented as a set of random input parameters. A straightforward approach to quantifying uncertainties in the predicted solution would be to sample all the input parameters into a single set and treat the full system as a black box. Although this method is easily parallelizable and requires minimal modifications to the deterministic solver, it is blind to the modular structure of the underlying multiphysical model. On the other hand, spectral representations such as polynomial chaos expansions (PCE) can provide richer structural information regarding the dynamics of these uncertainties as they propagate from the inputs to the predicted output, but can be prohibitively expensive to implement in the high-dimensional global space of uncertain parameters. Therefore, we investigated hybrid methodologies wherein each module has the flexibility of using sampling- or PCE-based methods for capturing local uncertainties while maintaining accuracy in the global uncertainty analysis. For the latter case, we use a conditional PCE model which mitigates the curse of dimensionality associated with intrusive Galerkin or semi-intrusive pseudospectral methods. After formalizing the theoretical framework, we demonstrate our proposed method using a numerical viscous flow simulation and benchmark the performance against a solely Monte Carlo method and a solely spectral method.
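
    A minimal sketch of the black-box Monte Carlo baseline that the abstract contrasts with its hybrid approach: sample all uncertain parameters jointly and push each sample through the full coupled system. The two "modules" and their parameter distributions are invented for illustration.

      import numpy as np

      rng = np.random.default_rng(0)

      def module_a(x, p):   # first physics module; p are its uncertain parameters
          return x + p[0] * np.sin(x) + p[1]

      def module_b(y, q):   # second module, fed by the first; q are its parameters
          return q[0] * y ** 2 + q[1]

      # Black-box sampling: draw every uncertain parameter in a single set and
      # treat the coupled system as one input-output map.
      n = 10_000
      p = rng.normal(1.0, 0.1, size=(n, 2))
      q = rng.normal(0.5, 0.05, size=(n, 2))
      out = np.array([module_b(module_a(2.0, p[i]), q[i]) for i in range(n)])
      print(f"output mean = {out.mean():.4f}, std = {out.std():.4f}")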

  7. Calibration Under Uncertainty.

    SciTech Connect (OSTI)

    Swiler, Laura Painton; Trucano, Timothy Guy

    2005-03-01

    This report is a white paper summarizing the literature and different approaches to the problem of calibrating computer model parameters in the face of model uncertainty. Model calibration is often formulated as finding the parameters that minimize the squared difference between the model-computed data (the predicted data) and the actual experimental data. This approach does not allow for explicit treatment of uncertainty or error in the model itself: the model is considered the "true" deterministic representation of reality. While this approach does have utility, it is far from an accurate mathematical treatment of the true model calibration problem in which both the computed data and experimental data have error bars. This year, we examined methods to perform calibration accounting for the error in both the computer model and the data, as well as improving our understanding of its meaning for model predictability. We call this approach Calibration under Uncertainty (CUU). This talk presents our current thinking on CUU. We outline some current approaches in the literature, and discuss the Bayesian approach to CUU in detail.
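
    For concreteness, a sketch of the standard least-squares calibration that the report takes as its starting point (and then critiques): find the parameters minimizing the squared model-data misfit. The exponential model, data, and noise level are hypothetical.

      import numpy as np
      from scipy.optimize import least_squares

      def model(theta, x):                      # hypothetical computer model
          return theta[0] * np.exp(-theta[1] * x)

      rng = np.random.default_rng(1)
      x_data = np.linspace(0.0, 5.0, 20)
      y_data = model([2.0, 0.7], x_data) + rng.normal(0.0, 0.05, x_data.size)

      # Classical calibration: minimize the squared difference between the
      # model-computed (predicted) data and the experimental data.
      fit = least_squares(lambda th: model(th, x_data) - y_data, x0=[1.0, 1.0])
      print("calibrated parameters:", fit.x)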

  8. RUMINATIONS ON NDA MEASUREMENT UNCERTAINTY COMPARED TO DA UNCERTAINTY

    SciTech Connect (OSTI)

    Salaymeh, S.; Ashley, W.; Jeffcoat, R.

    2010-06-17

    It is difficult to overestimate the important role that physical measurements performed with nondestructive assay instruments play throughout the nuclear fuel cycle. They underpin decision making in many areas and support criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete, or indeed meaningful, without a defensible and appropriate accompanying statement of uncertainties and how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail for the subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for non-routine situations, because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to developing an understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice and embracing the continuous-improvement philosophy, a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in nondestructive assay of nuclear materials. This paper outlines the needs analysis, objectives, and ongoing development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community, we hope this article will encourage dialog and sharing of best practice and, furthermore, motivate developers to revisit the treatment of measurement uncertainty.

  9. Entropic uncertainty relations and entanglement

    SciTech Connect (OSTI)

    Guehne, Otfried; Lewenstein, Maciej

    2004-08-01

    We discuss the relationship between entropic uncertainty relations and entanglement. We present two methods for deriving separability criteria in terms of entropic uncertainty relations. In particular, we show how any entropic uncertainty relation on one part of the system results in a separability condition on the composite system. We investigate the resulting criteria using the Tsallis entropy for two and three qubits.
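
    A representative entropic uncertainty relation of the kind this abstract builds on (the Maassen-Uffink bound for Shannon entropies; not necessarily the specific relation used by the authors), together with the Tsallis entropy the paper employs:

      H(A) + H(B) \;\ge\; \log_2 \frac{1}{c},
      \qquad c = \max_{j,k}\, \lvert \langle a_j \vert b_k \rangle \rvert^{2},
      \qquad
      S_q(\rho) = \frac{1 - \operatorname{Tr}\rho^{\,q}}{q - 1}.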

  10. Computational Fluid Dynamics & Large-Scale Uncertainty Quantification...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    ... (CFD) simulations and uncertainty analyses. The project developed new mathematical uncertainty quantification techniques and applied them, in combination with high-fidelity CFD ...

  11. Uncertainty quantification and multiscale mathematics. (Conference...

    Office of Scientific and Technical Information (OSTI)

    Title: Uncertainty quantification and multiscale mathematics. No abstract prepared. ...

  12. Whitepaper on Uncertainty Quantification for MPACT

    SciTech Connect (OSTI)

    Williams, Mark L.

    2015-12-17

    The MPACT code provides the ability to perform high-fidelity deterministic calculations to obtain a wide variety of detailed results for very complex reactor core models. However, MPACT currently does not have the capability to propagate the effects of input data uncertainties to provide uncertainties in the calculated results. This white paper discusses a potential method for MPACT uncertainty quantification (UQ) based on stochastic sampling.
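
    Stochastic-sampling UQ of the sort proposed here is commonly sized with Wilks' formula; the sketch below (an assumption about standard practice, not quoted from the whitepaper) computes the sample count for a one-sided 95%/95% tolerance limit.

      import math

      def wilks_n(coverage=0.95, confidence=0.95):
          # Smallest n with 1 - coverage**n >= confidence (first-order, one-sided).
          return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

      print(wilks_n())  # 59 code runs for a 95%/95% one-sided tolerance limit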

  13. Capturing the uncertainty in adversary attack simulations.

    SciTech Connect (OSTI)

    Darby, John L.; Brooks, Traci N.; Berry, Robert Bruce

    2008-09-01

    This work provides a comprehensive technique to evaluate uncertainty, resulting in a more realistic evaluation of PI, thereby requiring fewer resources to address scenarios and allowing resources to be used across more scenarios. For a given set of adversary resources, two types of uncertainty are associated with PI for a scenario: (1) aleatory (random) uncertainty for detection probabilities and time delays, and (2) epistemic (state of knowledge) uncertainty for the adversary resources applied during an attack. Adversary resources consist of attributes (such as equipment and training) and knowledge about the security system; to date, most evaluations have assumed an adversary with very high resources, adding to the conservatism in the evaluation of PI. The aleatory uncertainty in PI is addressed by assigning probability distributions to detection probabilities and time delays. A numerical sampling technique is used to evaluate PI, addressing the repeated-variable dependence in the equation for PI.

  14. Uncertainty in Integrated Assessment Scenarios

    SciTech Connect (OSTI)

    Mort Webster

    2005-10-17

    The determination of climate policy is a decision under uncertainty. The uncertainty in future climate change impacts is large, as is the uncertainty in the costs of potential policies. Rational and economically efficient policy choices will therefore seek to balance the expected marginal costs with the expected marginal benefits. This approach requires that the risks of future climate change be assessed. The decision process need not be formal or quantitative for descriptions of the risks to be useful. Whatever the decision procedure, a useful starting point is to have as accurate a description of climate risks as possible. Given the goal of describing uncertainty in future climate change, we need to characterize the uncertainty in the main causes of uncertainty in climate impacts. One of the major drivers of uncertainty in future climate change is the uncertainty in future emissions, both of greenhouse gases and other radiatively important species such as sulfur dioxide. In turn, the drivers of uncertainty in emissions are uncertainties in the determinants of the rate of economic growth and in the technologies of production and how those technologies will change over time. This project uses historical experience and observations from a large number of countries to construct statistical descriptions of variability and correlation in labor productivity growth and in AEEI. The observed variability then provides a basis for constructing probability distributions for these drivers. The variance of uncertainty in growth rates can be further modified by expert judgment if it is believed that future variability will differ from the past. But often, expert judgment is more readily applied to projected median or expected paths through time. Analysis of past variance and covariance provides initial assumptions about future uncertainty for quantities that are less intuitive and difficult for experts to estimate, and these variances can be normalized and then applied to mean

  15. Evaluation of machine learning algorithms for prediction of regions of high RANS uncertainty

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Ling, Julia; Templeton, Jeremy Alan

    2015-08-04

    Reynolds Averaged Navier Stokes (RANS) models are widely used in industry to predict fluid flows, despite their acknowledged deficiencies. Not only do RANS models often produce inaccurate flow predictions, but there are very limited diagnostics available to assess RANS accuracy for a given flow configuration. If experimental or higher fidelity simulation results are not available for RANS validation, there is no reliable method to evaluate RANS accuracy. This paper explores the potential of utilizing machine learning algorithms to identify regions of high RANS uncertainty. Three different machine learning algorithms were evaluated: support vector machines, Adaboost decision trees, and random forests. The algorithms were trained on a database of canonical flow configurations for which validated direct numerical simulation or large eddy simulation results were available, and were used to classify RANS results on a point-by-point basis as having either high or low uncertainty, based on the breakdown of specific RANS modeling assumptions. Classifiers were developed for three different basic RANS eddy viscosity model assumptions: the isotropy of the eddy viscosity, the linearity of the Boussinesq hypothesis, and the non-negativity of the eddy viscosity. It is shown that these classifiers are able to generalize to flows substantially different from those on which they were trained. As a result, feature selection techniques, model evaluation, and extrapolation detection are discussed in the context of turbulence modeling applications.
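
    A minimal sketch of the classification workflow described above, using a random forest as in the paper; the flow features, labels, and training rule here are synthetic stand-ins for the DNS/LES-derived database.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(2)

      # Hypothetical training set: rows are flow-field points, columns are local
      # flow features; labels mark where a RANS assumption (e.g., Boussinesq
      # linearity) breaks down according to higher-fidelity truth data.
      X_train = rng.normal(size=(5000, 6))
      y_train = (X_train[:, 0] + 0.5 * X_train[:, 1] ** 2 > 1.0).astype(int)

      clf = RandomForestClassifier(n_estimators=200, random_state=0)
      clf.fit(X_train, y_train)

      # Point-by-point classification of a new RANS solution.
      X_new = rng.normal(size=(10, 6))
      print(clf.predict(X_new))               # 1 = high-uncertainty region
      print(clf.predict_proba(X_new)[:, 1])   # confidence in the "high" label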

  16. Evaluation of machine learning algorithms for prediction of regions of high RANS uncertainty

    SciTech Connect (OSTI)

    Ling, Julia; Templeton, Jeremy Alan

    2015-08-04

    Reynolds Averaged Navier Stokes (RANS) models are widely used in industry to predict fluid flows, despite their acknowledged deficiencies. Not only do RANS models often produce inaccurate flow predictions, but there are very limited diagnostics available to assess RANS accuracy for a given flow configuration. If experimental or higher fidelity simulation results are not available for RANS validation, there is no reliable method to evaluate RANS accuracy. This paper explores the potential of utilizing machine learning algorithms to identify regions of high RANS uncertainty. Three different machine learning algorithms were evaluated: support vector machines, Adaboost decision trees, and random forests. The algorithms were trained on a database of canonical flow configurations for which validated direct numerical simulation or large eddy simulation results were available, and were used to classify RANS results on a point-by-point basis as having either high or low uncertainty, based on the breakdown of specific RANS modeling assumptions. Classifiers were developed for three different basic RANS eddy viscosity model assumptions: the isotropy of the eddy viscosity, the linearity of the Boussinesq hypothesis, and the non-negativity of the eddy viscosity. It is shown that these classifiers are able to generalize to flows substantially different from those on which they were trained. As a result, feature selection techniques, model evaluation, and extrapolation detection are discussed in the context of turbulence modeling applications.

  17. Evaluation of machine learning algorithms for prediction of regions of high Reynolds averaged Navier Stokes uncertainty

    SciTech Connect (OSTI)

    Ling, Julia; Templeton, Jeremy Alan

    2015-08-04

    Reynolds Averaged Navier Stokes (RANS) models are widely used in industry to predict fluid flows, despite their acknowledged deficiencies. Not only do RANS models often produce inaccurate flow predictions, but there are very limited diagnostics available to assess RANS accuracy for a given flow configuration. If experimental or higher fidelity simulation results are not available for RANS validation, there is no reliable method to evaluate RANS accuracy. This paper explores the potential of utilizing machine learning algorithms to identify regions of high RANS uncertainty. Three different machine learning algorithms were evaluated: support vector machines, Adaboost decision trees, and random forests. The algorithms were trained on a database of canonical flow configurations for which validated direct numerical simulation or large eddy simulation results were available, and were used to classify RANS results on a point-by-point basis as having either high or low uncertainty, based on the breakdown of specific RANS modeling assumptions. Classifiers were developed for three different basic RANS eddy viscosity model assumptions: the isotropy of the eddy viscosity, the linearity of the Boussinesq hypothesis, and the non-negativity of the eddy viscosity. It is shown that these classifiers are able to generalize to flows substantially different from those on which they were trained. As a result, feature selection techniques, model evaluation, and extrapolation detection are discussed in the context of turbulence modeling applications.

  18. Evaluation of machine learning algorithms for prediction of regions of high Reynolds averaged Navier Stokes uncertainty

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Ling, Julia; Templeton, Jeremy Alan

    2015-08-04

    Reynolds Averaged Navier Stokes (RANS) models are widely used in industry to predict fluid flows, despite their acknowledged deficiencies. Not only do RANS models often produce inaccurate flow predictions, but there are very limited diagnostics available to assess RANS accuracy for a given flow configuration. If experimental or higher fidelity simulation results are not available for RANS validation, there is no reliable method to evaluate RANS accuracy. This paper explores the potential of utilizing machine learning algorithms to identify regions of high RANS uncertainty. Three different machine learning algorithms were evaluated: support vector machines, Adaboost decision trees, and random forests. The algorithms were trained on a database of canonical flow configurations for which validated direct numerical simulation or large eddy simulation results were available, and were used to classify RANS results on a point-by-point basis as having either high or low uncertainty, based on the breakdown of specific RANS modeling assumptions. Classifiers were developed for three different basic RANS eddy viscosity model assumptions: the isotropy of the eddy viscosity, the linearity of the Boussinesq hypothesis, and the non-negativity of the eddy viscosity. It is shown that these classifiers are able to generalize to flows substantially different from those on which they were trained. As a result, feature selection techniques, model evaluation, and extrapolation detection are discussed in the context of turbulence modeling applications.

  19. Uncertainty quantification of a containment vessel dynamic response subjected to high-explosive detonation impulse loading

    SciTech Connect (OSTI)

    Rodriguez, E. A.; Pepin, J. E.; Thacker, B. H.; Riha, D. S.

    2002-01-01

    Los Alamos National Laboratory (LANL), in cooperation with Southwest Research Institute, has been developing capabilities to provide reliability-based structural evaluation techniques for performing weapon component and system reliability assessments. The development and applications of Probabilistic Structural Analysis Methods (PSAM) is an important ingredient in the overall weapon reliability assessments. Focus, herein, is placed on the uncertainty quantification associated with the structural response of a containment vessel for high-explosive (HE) experiments. The probabilistic dynamic response of the vessel is evaluated through the coupling of the probabilistic code NESSUS with the non-linear structural dynamics code, DYNA-3D. The probabilistic model includes variations in geometry and mechanical properties, such as Young's Modulus, yield strength, and material flow characteristics. Finally, the probability of exceeding a specified strain limit, which is related to vessel failure, is determined.

  20. Reducing Petroleum Dependence in California: Uncertainties About...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Reducing Petroleum Dependence in California: Uncertainties About Light-Duty Diesel. 2002 DEER Conference ...

  21. Uncertainty Analysis Technique for OMEGA Dante Measurements ...

    Office of Scientific and Technical Information (OSTI)

    Title: Uncertainty Analysis Technique for OMEGA Dante Measurements.

  22. Report: Technical Uncertainty and Risk Reduction

    Office of Environmental Management (EM)

    Background: In FY 2007, EMAB was tasked to assess EM's ability to reduce risk and technical uncertainty. Board members explored this topic ...

  23. Statistics, Uncertainty, and Transmitted Variation

    SciTech Connect (OSTI)

    Wendelberger, Joanne Roth

    2014-11-05

    The field of Statistics provides methods for modeling and understanding data and making decisions in the presence of uncertainty. When examining response functions, variation present in the input variables will be transmitted via the response function to the output variables. This phenomenon can potentially have significant impacts on the uncertainty associated with results from subsequent analysis. This presentation will examine the concept of transmitted variation, its impact on designed experiments, and a method for identifying and estimating sources of transmitted variation in certain settings.

  24. Experimental uncertainty estimation and statistics for data having interval uncertainty.

    SciTech Connect (OSTI)

    Kreinovich, Vladik; Oberkampf, William Louis; Ginzburg, Lev; Ferson, Scott; Hajagos, Janos

    2007-05-01

    This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
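
    The flavor of interval statistics the report describes can be seen in the simplest case: bounds on the sample mean follow directly from the interval endpoints. The data below are hypothetical.

      # Each measurement is known only to lie in [lo, hi].
      data = [(1.0, 1.4), (0.8, 1.1), (1.2, 1.9), (0.9, 1.3)]

      lo_mean = sum(lo for lo, _ in data) / len(data)
      hi_mean = sum(hi for _, hi in data) / len(data)
      print(f"sample mean lies in [{lo_mean:.3f}, {hi_mean:.3f}]")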

  25. ARM - PI Product - Direct Aerosol Forcing Uncertainty

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    PI Product: Direct Aerosol Forcing Uncertainty. Understanding sources of uncertainty in aerosol direct radiative forcing (DRF), the difference in a given radiative flux component with and without aerosol, is essential to quantifying changes in Earth's radiation budget. We examine the uncertainty in DRF due to measurement

  26. Uncertainty quantification and multiscale mathematics. (Conference...

    Office of Scientific and Technical Information (OSTI)

    Title: Uncertainty quantification and multiscale mathematics. Authors: Trucano, Timothy Guy ...

  27. Extended Forward Sensitivity Analysis for Uncertainty Quantification

    SciTech Connect (OSTI)

    Haihua Zhao; Vincent A. Mousseau

    2011-09-01

    Verification and validation (V&V) are playing increasingly important roles in quantifying uncertainties and realizing high-fidelity simulations in engineering system analyses, such as transients in a complex nuclear reactor system. Traditional V&V in reactor system analysis focused more on the validation part or did not differentiate verification from validation. The traditional approach to uncertainty quantification is based on a 'black box' approach: the simulation tool is treated as an unknown signal generator, a distribution of inputs according to assumed probability density functions is sent in, and the distribution of the outputs is measured and correlated back to the original input distribution. The 'black box' method mixes numerical errors with all other uncertainties. It is also not efficient for performing sensitivity analysis. Contrary to the 'black box' method, a more efficient sensitivity approach can take advantage of intimate knowledge of the simulation code. In these types of approaches, equations for the propagation of uncertainty are constructed and the sensitivities are directly solved for as variables in the simulation. This paper presents forward sensitivity analysis as a method to help uncertainty quantification. By including the time step, and potentially the spatial step, as special sensitivity parameters, the forward sensitivity method is extended as one method to quantify numerical errors. Note that by integrating local truncation errors over the whole system through the forward sensitivity analysis process, the generated time step and spatial step sensitivity information reflect global numerical errors. The discretization errors can be systematically compared against uncertainties due to other physical parameters. This extension makes the forward sensitivity method a much more powerful tool to help uncertainty quantification. By knowing the relative sensitivity of time and space steps with other interested physical parameters, the simulation is allowed
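
    A minimal sketch of forward sensitivity analysis on a scalar decay model (not the paper's reactor-system implementation): the sensitivity s = dy/dk is integrated alongside the state, and a first-order uncertainty estimate follows from an assumed parameter uncertainty u_k.

      import numpy as np
      from scipy.integrate import solve_ivp

      k = 0.5  # uncertain decay-rate parameter

      def rhs(t, z):
          y, s = z                 # s = dy/dk, the forward sensitivity
          return [-k * y,          # dy/dt = f(y, k)
                  -k * s - y]      # ds/dt = (df/dy) * s + df/dk

      sol = solve_ivp(rhs, (0.0, 5.0), [1.0, 0.0])
      y_end, s_end = sol.y[:, -1]
      u_k = 0.05                   # assumed 1-sigma uncertainty in k
      print(f"y(5) = {y_end:.4f}, dy/dk = {s_end:.4f}, "
            f"propagated u_y = {abs(s_end) * u_k:.4f}")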

  28. Incorporating Forecast Uncertainty in Utility Control Center

    SciTech Connect (OSTI)

    Makarov, Yuri V.; Etingov, Pavel V.; Ma, Jian

    2014-07-09

    Uncertainties in forecasting the output of intermittent resources such as wind and solar generation, as well as system loads, are not adequately reflected in existing industry-grade tools used for transmission system management, generation commitment, dispatch, and market operation. There are other sources of uncertainty such as uninstructed deviations of conventional generators from their dispatch set points, generator forced outages and failures to start up, load drops, losses of major transmission facilities, and frequency variation. These uncertainties can cause deviations from the system balance, which sometimes require inefficient and costly last-minute solutions in the near-real-time timeframe. This chapter considers sources of uncertainty and variability, an overall system uncertainty model, a possible plan for transition from deterministic to probabilistic methods in planning and operations, and two examples of uncertainty-based tools for grid operations. This chapter is based on work conducted at the Pacific Northwest National Laboratory (PNNL).

  29. Direct Aerosol Forcing Uncertainty (Dataset) | Data Explorer

    Office of Scientific and Technical Information (OSTI)

    comparable to uncertainty arising from some individual properties. Authors: Mccomiskey, Allison. Publication Date: 2008-01-15. OSTI Identifier: 1169526. DOE Contract ...

  30. Validation and Uncertainty Characterization for Energy Simulation

    Energy Savers [EERE]

    Validation and Uncertainty Characterization for Energy Simulation (1530). PI: Philip Haves (LBNL); Co-PIs: Ron Judkoff (NREL), Joshua New (ORNL), Ralph Muehleisen (ANL). BTO Merit ...

  31. From Deterministic Inversion to Uncertainty Quantification: Planning...

    Office of Scientific and Technical Information (OSTI)

    Slide excerpts: main sources of uncertainty (problem definition; Glen's Flow Law exponent); U: computed depth-averaged velocity; H: ice thickness; basal sliding ...

  32. Uncertainty quantification for discrimination of nuclear events...

    Office of Scientific and Technical Information (OSTI)

    Title: Uncertainty quantification for discrimination of nuclear events as violations of the comprehensive nuclear-test-ban treaty. Authors: ...

  33. Uncertainty Quantification in Climate Modeling - Discovering...

    Office of Scientific and Technical Information (OSTI)

    Title: Uncertainty Quantification in Climate Modeling - Discovering Sparsity and Building Surrogates. ...

  34. Uncertainties in global aerosol simulations: Assessment using...

    Office of Scientific and Technical Information (OSTI)

    Title: Uncertainties in global aerosol simulations: Assessment using three meteorological data sets Current global aerosol models use different physical and chemical schemes and 4 ...

  35. Uncertainty Quantification for Nuclear Density Functional Theory...

    Office of Scientific and Technical Information (OSTI)

    Title: Uncertainty Quantification for Nuclear Density Functional Theory and Information Content of New Measurements. This content will become publicly...

  36. Uncertainty Quantification and Propagation in Nuclear Density...

    Office of Scientific and Technical Information (OSTI)

    theoretical tools used to study the properties of heavy and ... of model uncertainties and Bayesian inference methods. ... Country of Publication: United States Language: English ...

  37. From Deterministic Inversion to Uncertainty Quantification: Planning...

    Office of Scientific and Technical Information (OSTI)

    Title: From Deterministic Inversion to Uncertainty Quantification: Planning a Long Journey in Ice Sheet Modeling. ...

  38. Uncertainty Estimation of Radiometric Data using a Guide to the Expression of Uncertainty in Measurement (GUM) Method

    SciTech Connect (OSTI)

    Habte, Aron

    2015-06-25

    This presentation summarizes uncertainty estimation of radiometric data using the Guide to the Expression of Uncertainty in Measurement (GUM) method.
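
    The core of the GUM method is the law of propagation of uncertainty: for a measurand y = f(x_1, ..., x_N), the combined standard uncertainty is

      u_c^2(y) \;=\; \sum_{i=1}^{N} \left( \frac{\partial f}{\partial x_i} \right)^{2} u^2(x_i)
      \;+\; 2 \sum_{i=1}^{N-1} \sum_{j=i+1}^{N} \frac{\partial f}{\partial x_i}\,\frac{\partial f}{\partial x_j}\, u(x_i, x_j),

    where u(x_i, x_j) are the input covariances (zero for independent inputs).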

  39. Sensitivity and Uncertainty Analysis Shell

    Energy Science and Technology Software Center (OSTI)

    1999-04-20

    SUNS (Sensitivity and Uncertainty Analysis Shell) is a 32-bit application that runs under Windows 95/98 and Windows NT. It is designed to aid in statistical analyses for a broad range of applications. The class of problems for which SUNS is suitable is generally defined by two requirements: 1. A computer code is developed or acquired that models some processes for which input is uncertain and the user is interested in statistical analysis of the output of that code. 2. The statistical analysis of interest can be accomplished using Monte Carlo analysis. The implementation then requires that the user identify which input to the process model is to be manipulated for statistical analysis. With this information, the changes required to loosely couple SUNS with the process model can be completed. SUNS is then used to generate the required statistical sample, and the user-supplied process model analyses the sample. The SUNS post processor displays statistical results from any existing file that contains sampled input and output values.
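
    The loose-coupling workflow SUNS implements (generate a statistical sample, hand it to the user's process model, post-process the outputs) looks roughly like the sketch below; the sampling design, bounds, and process model are hypothetical, and SUNS itself is a Windows application rather than a Python library.

      import numpy as np
      from scipy.stats import qmc

      # Stratified sample of two uncertain inputs over hypothetical bounds.
      sampler = qmc.LatinHypercube(d=2, seed=0)
      inputs = qmc.scale(sampler.random(n=100), l_bounds=[0.5, 10.0], u_bounds=[1.5, 50.0])

      def process_model(x):        # stand-in for the loosely coupled user code
          return x[0] * np.log(x[1])

      outputs = np.array([process_model(row) for row in inputs])
      print(f"mean = {outputs.mean():.3f}, 95th percentile = {np.percentile(outputs, 95):.3f}")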

  40. Estimating the uncertainty in underresolved nonlinear dynamics

    SciTech Connect (OSTI)

    Chorin, Alexandre; Hald, Ole

    2013-06-12

    The Mori-Zwanzig formalism of statistical mechanics is used to estimate the uncertainty caused by underresolution in the solution of a nonlinear dynamical system. A general approach is outlined and applied to a simple example. The noise term that describes the uncertainty turns out to be neither Markovian nor Gaussian. It is argued that this is the general situation.

  41. Uncertainty Analysis for Photovoltaic Degradation Rates (Poster)

    SciTech Connect (OSTI)

    Jordan, D.; Kurtz, S.; Hansen, C.

    2014-04-01

    Dependable and predictable energy production is the key to the long-term success of the PV industry. Over their lifetime of exposure, PV systems show a gradual decline that depends on many different factors such as module technology, module type, mounting configuration, climate, etc. When degradation rates are determined from continuous data, the statistical uncertainty is easily calculated from the regression coefficients. However, total uncertainty that includes measurement uncertainty and instrumentation drift is far more difficult to determine. A Monte Carlo simulation approach was chosen to investigate a comprehensive uncertainty analysis. The most important factor for degradation rates is to avoid instrumentation that changes over time in the field. For instance, a drifting irradiance sensor can lead to substantially erroneous degradation rates; such drift can be avoided through regular calibration. However, the accuracy of the irradiance sensor has negligible impact on degradation rate uncertainty, emphasizing that precision (relative accuracy) is more important than absolute accuracy.
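
    A sketch of the statistical part described above: when the performance time series is continuous, the degradation rate and its statistical uncertainty fall out of the regression coefficients (instrumentation drift is the harder, separate problem). The synthetic data assume a -0.8%/year trend.

      import numpy as np

      rng = np.random.default_rng(3)

      # Hypothetical monthly performance index over 10 years.
      years = np.arange(0, 10, 1 / 12)
      perf = 100.0 * (1.0 - 0.008 * years) + rng.normal(0.0, 0.5, years.size)

      # Slope and intercept with coefficient covariance from the regression.
      coeffs, cov = np.polyfit(years, perf, 1, cov=True)
      rate = coeffs[0] / coeffs[1] * 100.0              # % of initial output per year
      rate_sd = np.sqrt(cov[0, 0]) / coeffs[1] * 100.0  # approx.; ignores intercept error
      print(f"degradation rate = {rate:.2f} +/- {rate_sd:.2f} %/yr (statistical only)")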

  42. Quantifying uncertainty in stable isotope mixing models

    SciTech Connect (OSTI)

    Davis, Paul; Syme, James; Heikoop, Jeffrey; Fessenden-Rahn, Julianna; Perkins, George; Newman, Brent; Chrystal, Abbey E.; Hagerty, Shannon B.

    2015-05-19

    Mixing models are powerful tools for identifying biogeochemical sources and determining mixing fractions in a sample. However, identification of actual source contributors is often not simple, and source compositions typically vary or even overlap, significantly increasing model uncertainty in calculated mixing fractions. This study compares three probabilistic methods: SIAR [Parnell et al., 2010], a pure Monte Carlo technique (PMC), and the Stable Isotope Reference Source (SIRS) mixing model, a new technique that estimates mixing in systems with more than three sources and/or uncertain source compositions. In this paper, we use nitrate stable isotope examples (δ15N and δ18O), but all methods tested are applicable to other tracers. In Phase I of a three-phase blind test, we compared methods for a set of six-source nitrate problems. PMC was unable to find solutions for two of the target water samples. The Bayesian method, SIAR, experienced anchoring problems, and SIRS calculated mixing fractions that most closely approximated the known mixing fractions. For that reason, SIRS was the only approach used in the next phase of testing. In Phase II, the problem was broadened such that any subset of the six sources could be a possible solution to the mixing problem. Results showed a high rate of Type I errors, where solutions included sources that were not contributing to the sample. In Phase III, eliminating some sources based on assumed site knowledge and assumed nitrate concentrations substantially reduced mixing fraction uncertainties and lowered the Type I error rate. These results demonstrate that valuable insights into stable isotope mixing problems result from probabilistic mixing model approaches like SIRS. The results also emphasize the importance of identifying a minimal set of potential sources and quantifying uncertainties in source isotopic composition, as well as demonstrating the value of additional information in reducing the
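
    The probabilistic methods compared here generalize the deterministic mass balance underlying all mixing models; in the two-source, one-tracer special case the mixing fraction is algebraic. The source and sample values below are hypothetical.

      # Two-source mixing from a single tracer (values in per mil, hypothetical).
      d15n_source_a = 2.0     # e.g., a fertilizer-like end member
      d15n_source_b = 12.0    # e.g., a manure/septic-like end member
      d15n_sample = 8.5

      f_a = (d15n_sample - d15n_source_b) / (d15n_source_a - d15n_source_b)
      print(f"fraction from source A = {f_a:.2f}, from source B = {1 - f_a:.2f}")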

  43. Quantifying uncertainty in stable isotope mixing models

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Davis, Paul; Syme, James; Heikoop, Jeffrey; Fessenden-Rahn, Julianna; Perkins, George; Newman, Brent; Chrystal, Abbey E.; Hagerty, Shannon B.

    2015-05-19

    Mixing models are powerful tools for identifying biogeochemical sources and determining mixing fractions in a sample. However, identification of actual source contributors is often not simple, and source compositions typically vary or even overlap, significantly increasing model uncertainty in calculated mixing fractions. This study compares three probabilistic methods: SIAR [Parnell et al., 2010], a pure Monte Carlo technique (PMC), and the Stable Isotope Reference Source (SIRS) mixing model, a new technique that estimates mixing in systems with more than three sources and/or uncertain source compositions. In this paper, we use nitrate stable isotope examples (δ15N and δ18O), but all methods tested are applicable to other tracers. In Phase I of a three-phase blind test, we compared methods for a set of six-source nitrate problems. PMC was unable to find solutions for two of the target water samples. The Bayesian method, SIAR, experienced anchoring problems, and SIRS calculated mixing fractions that most closely approximated the known mixing fractions. For that reason, SIRS was the only approach used in the next phase of testing. In Phase II, the problem was broadened such that any subset of the six sources could be a possible solution to the mixing problem. Results showed a high rate of Type I errors, where solutions included sources that were not contributing to the sample. In Phase III, eliminating some sources based on assumed site knowledge and assumed nitrate concentrations substantially reduced mixing fraction uncertainties and lowered the Type I error rate. These results demonstrate that valuable insights into stable isotope mixing problems result from probabilistic mixing model approaches like SIRS. The results also emphasize the importance of identifying a minimal set of potential sources and quantifying uncertainties in source isotopic composition, as well as demonstrating the value of additional information in reducing the uncertainty in calculated

  44. Quantifying uncertainty in stable isotope mixing models

    SciTech Connect (OSTI)

    Davis, Paul; Syme, James; Heikoop, Jeffrey; Fessenden-Rahn, Julianna; Perkins, George; Newman, Brent; Chrystal, Abbey E.; Hagerty, Shannon B.

    2015-05-19

    Mixing models are powerful tools for identifying biogeochemical sources and determining mixing fractions in a sample. However, identification of actual source contributors is often not simple, and source compositions typically vary or even overlap, significantly increasing model uncertainty in calculated mixing fractions. This study compares three probabilistic methods: SIAR [Parnell et al., 2010], a pure Monte Carlo technique (PMC), and the Stable Isotope Reference Source (SIRS) mixing model, a new technique that estimates mixing in systems with more than three sources and/or uncertain source compositions. In this paper, we use nitrate stable isotope examples (δ15N and δ18O), but all methods tested are applicable to other tracers. In Phase I of a three-phase blind test, we compared methods for a set of six-source nitrate problems. PMC was unable to find solutions for two of the target water samples. The Bayesian method, SIAR, experienced anchoring problems, and SIRS calculated mixing fractions that most closely approximated the known mixing fractions. For that reason, SIRS was the only approach used in the next phase of testing. In Phase II, the problem was broadened such that any subset of the six sources could be a possible solution to the mixing problem. Results showed a high rate of Type I errors, where solutions included sources that were not contributing to the sample. In Phase III, eliminating some sources based on assumed site knowledge and assumed nitrate concentrations substantially reduced mixing fraction uncertainties and lowered the Type I error rate. These results demonstrate that valuable insights into stable isotope mixing problems result from probabilistic mixing model approaches like SIRS. The results also emphasize the importance of identifying a minimal set of potential sources and quantifying uncertainties in source isotopic composition, as well as demonstrating the value of additional information in reducing the

  45. Numerical uncertainty in computational engineering and physics

    SciTech Connect (OSTI)

    Hemez, Francois M

    2009-01-01

    Obtaining a solution that approximates ordinary or partial differential equations on a computational mesh or grid does not necessarily mean that the solution is accurate or even 'correct'. Unfortunately, assessing the quality of discrete solutions by questioning the role played by spatial and temporal discretizations generally comes as a distant third to test-analysis comparison and model calibration. This publication is contributed to raise awareness of the fact that discrete solutions introduce numerical uncertainty. This uncertainty may, in some cases, overwhelm in complexity and magnitude other sources of uncertainty that include experimental variability, parametric uncertainty and modeling assumptions. The concepts of consistency, convergence and truncation error are overviewed to explain the articulation between the exact solution of continuous equations, the solution of modified equations and discrete solutions computed by a code. The current state of the practice of code and solution verification activities is discussed. An example in the discipline of hydrodynamics illustrates the significant effect that meshing can have on the quality of code predictions. A simple method is proposed to derive bounds of solution uncertainty in cases where the exact solution of the continuous equations, or its modified equations, is unknown. It is argued that numerical uncertainty originating from mesh discretization should always be quantified and accounted for in the overall uncertainty 'budget' that supports decision-making for applications in computational physics and engineering.
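
    One common way to quantify mesh-discretization uncertainty in this spirit (a standard grid-convergence technique, not necessarily the author's exact method) is Richardson extrapolation on three systematically refined meshes; the solution values below are hypothetical.

      import math

      # Quantity of interest on coarse, medium, and fine meshes (hypothetical),
      # with a constant refinement ratio r between meshes.
      f_coarse, f_medium, f_fine = 1.120, 1.052, 1.025
      r = 2.0

      # Observed order of convergence and Richardson-extrapolated estimate.
      p = math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)
      f_exact_est = f_fine + (f_fine - f_medium) / (r ** p - 1.0)
      print(f"observed order p = {p:.2f}")
      print(f"discretization uncertainty estimate = {abs(f_exact_est - f_fine):.4f}")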

  46. The Role of Uncertainty Quantification for Reactor Physics

    SciTech Connect (OSTI)

    Salvatores, Massimo; Palmiotti, Giuseppe; Aliberti, G.

    2015-01-01

    The quantification of uncertainties is a crucial step in design. Comparing a priori uncertainties with the target accuracies allows needs and priorities for uncertainty reduction to be defined. In view of their impact, the uncertainty analysis requires a reliability assessment of the uncertainty data used. The choice of the appropriate approach and the consistency of different approaches are discussed.

  47. An uncertainty principle for unimodular quantum groups

    SciTech Connect (OSTI)

    Crann, Jason; Kalantar, Mehrdad

    2014-08-15

    We present a generalization of Hirschman's entropic uncertainty principle for locally compact Abelian groups to unimodular locally compact quantum groups. As a corollary, we strengthen a well-known uncertainty principle for compact groups, and generalize the relation to compact quantum groups of Kac type. We also establish the complementarity of finite-dimensional quantum group algebras. In the non-unimodular setting, we obtain an uncertainty relation for arbitrary locally compact groups using the relative entropy with respect to the Haar weight as the measure of uncertainty. We also show that when restricted to q-traces of discrete quantum groups, the relative entropy with respect to the Haar weight reduces to the canonical entropy of the random walk generated by the state.

  48. Reducing Petroleum Dependence in California: Uncertainties About Light-Duty Diesel

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Reducing Petroleum Dependence in California: Uncertainties About Light-Duty Diesel. 2002 DEER Conference Presentation: Center for Energy Efficiency and Renewable Technologies. 2002_deer_phillips.pdf (62.04 KB). More Documents & Publications: Diesel Use in California; Future Potential of Hybrid and Diesel Powertrains in the U.S. Light-Duty Vehicle Market; Dumping Dirty Diesels:

  49. Cost Analysis: Technology, Competitiveness, Market Uncertainty

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    As a basis for strategic planning, competitiveness analysis, funding metrics and targets, SunShot supports analysis teams at national laboratories to assess technology costs, location-specific competitive advantages, policy impacts on system financing, and to perform detailed levelized cost of energy (LCOE) analyses. This shows the

  50. Improvements of Nuclear Data and Its Uncertainties by Theoretical...

    Office of Scientific and Technical Information (OSTI)

    Title: Improvements of Nuclear Data and Its Uncertainties by Theoretical Modeling. ...

  51. Early solar mass loss, opacity uncertainties, and the solar abundance...

    Office of Scientific and Technical Information (OSTI)

    Title: Early solar mass loss, opacity uncertainties, and the solar abundance problem. ...

  52. Analysis of the Uncertainty in Wind Measurements from the Atmospheric...

    Office of Scientific and Technical Information (OSTI)

    Title: Analysis of the Uncertainty in Wind Measurements from the Atmospheric Radiation ...

  53. Addressing Uncertainties in Design Inputs: A Case Study of Probabilistic...

    Office of Environmental Management (EM)

    Addressing Uncertainties in Design Inputs: A Case Study of Probabilistic Settlement Evaluations for Soft Zone Collapse at SWPF.

  54. Analysis and Reduction of Chemical Models under Uncertainty ...

    Office of Scientific and Technical Information (OSTI)

    Title: Analysis and Reduction of Chemical Models under Uncertainty. Abstract not ...

  55. Gradient-Enhanced Universal Kriging for Uncertainty Propagation...

    Office of Scientific and Technical Information (OSTI)

    Title: Gradient-Enhanced Universal Kriging for Uncertainty Propagation. Authors: ...

  56. ENHANCED UNCERTAINTY ANALYSIS FOR SRS COMPOSITE ANALYSIS

    SciTech Connect (OSTI)

    Smith, F.; Phifer, M.

    2011-06-30

    The Composite Analysis (CA) performed for the Savannah River Site (SRS) in 2009 (SRS CA 2009) included a simplified uncertainty analysis. The uncertainty analysis in the CA (Smith et al. 2009b) was limited to considering at most five sources in a separate uncertainty calculation performed for each POA. To perform the uncertainty calculations in a reasonable amount of time, the analysis was limited to 400 realizations and 2,000 years of simulated transport time, and the time steps used for the uncertainty analysis were increased from what was used in the CA base case analysis. As part of the CA maintenance plan, the Savannah River National Laboratory (SRNL) committed to improving the CA uncertainty/sensitivity analysis. The previous uncertainty analysis was constrained by the standard GoldSim licensing, which limits the user to running at most four Monte Carlo uncertainty calculations (also called realizations) simultaneously. Some of the limitations on the number of realizations that could be practically run and on the simulation time steps were removed by building a cluster of three HP ProLiant Windows servers with a total of 36 64-bit processors and by licensing the GoldSim DP-Plus distributed processing software. This allowed running as many as 35 realizations simultaneously (one processor is reserved as a master process that controls running the realizations). These enhancements to SRNL computing capabilities made feasible an uncertainty analysis using 1,000 realizations and the time steps employed in the base case CA calculations, with more sources and 10,000 years of simulated radionuclide transport. In addition, an importance screening analysis was performed to identify the class of stochastic variables that have the most significant impact on model uncertainty. This analysis ran the uncertainty model separately, testing the response to variations in the following five sets of model parameters: (a) Kd values (72 parameters for the 36 CA elements in

  57. Error propagation equations for estimating the uncertainty in high-speed wind tunnel test results

    SciTech Connect (OSTI)

    Clark, E.L.

    1994-07-01

    Error propagation equations, based on the Taylor series model, are derived for the nondimensional ratios and coefficients most often encountered in high-speed wind tunnel testing. These include pressure ratio and coefficient, static force and moment coefficients, dynamic stability coefficients, and calibration Mach number. The error equations contain partial derivatives, denoted as sensitivity coefficients, which define the influence of free-stream Mach number, M∞, on various aerodynamic ratios. To facilitate use of the error equations, sensitivity coefficients are derived and evaluated for five fundamental aerodynamic ratios which relate free-stream test conditions to a reference condition.
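
    A sketch of Taylor-series error propagation for one such nondimensional ratio, the pressure coefficient Cp = (p - p_inf) / q_inf; the partial derivatives play the role of the sensitivity coefficients described above. All values and uncertainties are hypothetical.

      import math

      p, p_inf, q_inf = 6500.0, 5000.0, 2000.0   # pressures, Pa (hypothetical)
      u_p, u_pinf, u_q = 15.0, 12.0, 10.0        # 1-sigma uncertainties, Pa

      cp = (p - p_inf) / q_inf

      # Sensitivity coefficients: partial derivatives of Cp w.r.t. each input.
      d_p = 1.0 / q_inf
      d_pinf = -1.0 / q_inf
      d_q = -(p - p_inf) / q_inf ** 2

      u_cp = math.sqrt((d_p * u_p) ** 2 + (d_pinf * u_pinf) ** 2 + (d_q * u_q) ** 2)
      print(f"Cp = {cp:.4f} +/- {u_cp:.4f}")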

  58. Strategies for Application of Isotopic Uncertainties in Burnup Credit

    SciTech Connect (OSTI)

    Gauld, I.C.

    2002-12-23

    Uncertainties in the predicted isotopic concentrations in spent nuclear fuel represent one of the largest sources of overall uncertainty in criticality calculations that use burnup credit. The methods used to propagate the uncertainties in the calculated nuclide concentrations to the uncertainty in the predicted neutron multiplication factor (k_eff) of the system can have a significant effect on the uncertainty in the safety margin in criticality calculations and ultimately affect the potential capacity of spent fuel transport and storage casks employing burnup credit. Methods that can provide a more accurate and realistic estimate of the uncertainty may enable increased spent fuel cask capacity and fewer casks needing to be transported, thereby reducing the regulatory burden on licensees while maintaining safety in transporting spent fuel. This report surveys several different best-estimate strategies for considering the effects of nuclide uncertainties in burnup-credit analyses. The potential benefits of these strategies are illustrated for a prototypical burnup-credit cask design. The subcritical margin estimated using best-estimate methods is discussed in comparison to the margin estimated using conventional bounding methods of uncertainty propagation. To quantify the comparison, each of the strategies for estimating uncertainty has been performed using a common database of spent fuel isotopic assay measurements for pressurized light-water reactor fuels and predicted nuclide concentrations obtained using the current version of the SCALE code system. The experimental database applied in this study has been significantly expanded to include new high-enrichment and high-burnup spent fuel assay data recently published for a wide range of important burnup-credit actinides and fission products. Expanded rare-earth fission-product measurements performed at the Khlopin Radium Institute in Russia that contain the only known publicly available measurement for 103

  59. Risk management & organizational uncertainty implications for the assessment of high consequence organizations

    SciTech Connect (OSTI)

    Bennett, C.T.

    1995-02-23

    Post hoc analyses have demonstrated clearly that macro-system, organizational processes have played important roles in such major catastrophes as Three Mile Island, Bhopal, Exxon Valdez, Chernobyl, and Piper Alpha. How can managers of such high-consequence organizations as nuclear power plants and nuclear explosives handling facilities be sure that similar macro-system processes are not operating in their plants? To date, macro-system effects have not been integrated into risk assessments. Part of the reason for not using macro-system analyses to assess risk may be the impression that standard organizational measurement tools do not provide hard data that can be managed effectively. In this paper, I argue that organizational dimensions, like those in ISO 9000, can be quantified and integrated into standard risk assessments.

  60. A stochastic approach to quantifying the blur with uncertainty estimation for high-energy X-ray imaging systems

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Fowler, Michael J.; Howard, Marylesa; Luttman, Aaron; Mitchell, Stephen E.; Webb, Timothy J.

    2015-06-03

    One of the primary causes of blur in a high-energy X-ray imaging system is the shape and extent of the radiation source, or ‘spot’. It is important to be able to quantify the size of the spot as it provides a lower bound on the recoverable resolution for a radiograph, and penumbral imaging methods – which involve the analysis of blur caused by a structured aperture – can be used to obtain the spot’s spatial profile. We present a Bayesian approach for estimating the spot shape that, unlike variational methods, is robust to the initial choice of parameters. The posterior is obtained from a normal likelihood, which was constructed from a weighted least squares approximation to a Poisson noise model, and prior assumptions that enforce both smoothness and non-negativity constraints. A Markov chain Monte Carlo algorithm is used to obtain samples from the target posterior, and the reconstruction and uncertainty estimates are the computed mean and variance of the samples, respectively. Lastly, synthetic data-sets are used to demonstrate accurate reconstruction, while real data taken with high-energy X-ray imaging systems are used to demonstrate applicability and feasibility.
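
    A toy version of the recipe described above: draw MCMC samples from a target posterior, then report the sample mean as the reconstruction and the sample variance as the uncertainty. A hand-rolled Metropolis sampler on a one-dimensional Gaussian posterior stands in for the paper's high-dimensional spot-shape posterior.

      import numpy as np

      rng = np.random.default_rng(4)

      def log_post(theta):
          # Hypothetical log-posterior; the paper's combines a weighted
          # least-squares likelihood with smoothness/non-negativity priors.
          return -0.5 * ((theta - 3.0) / 0.4) ** 2

      theta, chain = 0.0, []
      for _ in range(20_000):
          proposal = theta + rng.normal(0.0, 0.5)
          if np.log(rng.uniform()) < log_post(proposal) - log_post(theta):
              theta = proposal
          chain.append(theta)

      samples = np.array(chain[2_000:])        # discard burn-in
      print(f"posterior mean = {samples.mean():.3f}, variance = {samples.var():.4f}")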

  1. A stochastic approach to quantifying the blur with uncertainty estimation for high-energy X-ray imaging systems

    SciTech Connect (OSTI)

    Fowler, Michael J.; Howard, Marylesa; Luttman, Aaron; Mitchell, Stephen E.; Webb, Timothy J.

    2015-06-03

    One of the primary causes of blur in a high-energy X-ray imaging system is the shape and extent of the radiation source, or ‘spot’. It is important to be able to quantify the size of the spot as it provides a lower bound on the recoverable resolution for a radiograph, and penumbral imaging methods – which involve the analysis of blur caused by a structured aperture – can be used to obtain the spot’s spatial profile. We present a Bayesian approach for estimating the spot shape that, unlike variational methods, is robust to the initial choice of parameters. The posterior is obtained from a normal likelihood, which was constructed from a weighted least squares approximation to a Poisson noise model, and prior assumptions that enforce both smoothness and non-negativity constraints. A Markov chain Monte Carlo algorithm is used to obtain samples from the target posterior, and the reconstruction and uncertainty estimates are the computed mean and variance of the samples, respectively. Lastly, synthetic data-sets are used to demonstrate accurate reconstruction, while real data taken with high-energy X-ray imaging systems are used to demonstrate applicability and feasibility.
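
    For readers who want a concrete picture of the sampling step, the following minimal sketch applies the same recipe (draw posterior samples with a random-walk Metropolis algorithm, then report the sample mean and variance as the reconstruction and its uncertainty) to a toy one-dimensional problem. The Gaussian forward model, noise level, and step size are illustrative assumptions, not the paper's actual penumbral imaging setup.

        import numpy as np

        rng = np.random.default_rng(0)

        # Toy forward model: blur profile from a source of unknown width w
        def forward(w, x):
            return np.exp(-x**2 / (2.0 * w**2))

        x = np.linspace(-3, 3, 61)
        y = forward(0.8, x) + rng.normal(0, 0.05, x.size)   # synthetic data

        def log_posterior(w):
            if w <= 0:                    # non-negativity enforced by the prior
                return -np.inf
            resid = y - forward(w, x)
            return -0.5 * np.sum(resid**2) / 0.05**2        # normal likelihood

        # Random-walk Metropolis sampling of the target posterior
        w, lp = 1.0, log_posterior(1.0)
        chain = np.empty(20000)
        for i in range(chain.size):
            w_new = w + 0.05 * rng.normal()
            lp_new = log_posterior(w_new)
            if np.log(rng.random()) < lp_new - lp:          # accept/reject step
                w, lp = w_new, lp_new
            chain[i] = w

        samples = chain[5000:]                              # discard burn-in
        print(f"estimate {samples.mean():.3f}, uncertainty {samples.std():.3f}")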

  2. Photovoltaic System Modeling. Uncertainty and Sensitivity Analyses

    SciTech Connect (OSTI)

    Hansen, Clifford W.; Martin, Curtis E.

    2015-08-01

    We report an uncertainty and sensitivity analysis for modeling AC energy from photovoltaic systems. Output from a PV system is predicted by a sequence of models. We quantify uncertainty in the output of each model using empirical distributions of each model's residuals. We propagate uncertainty through the sequence of models by sampling these distributions to obtain an empirical distribution of a PV system's output. We consider models that: (1) translate measured global horizontal, direct, and global diffuse irradiance to plane-of-array irradiance; (2) estimate effective irradiance; (3) predict cell temperature; (4) estimate DC voltage, current, and power; (5) reduce DC power for losses due to inefficient maximum power point tracking or mismatch among modules; and (6) convert DC to AC power. Our analysis considers a notional PV system comprising an array of FirstSolar FS-387 modules and a 250 kW AC inverter; we use measured irradiance and weather at Albuquerque, NM. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy, and we found the uncertainty in the models for POA irradiance and effective irradiance to be the dominant contributors to uncertainty in predicted daily energy. Our analysis indicates that efforts to reduce the uncertainty in PV system output predictions may yield the greatest improvements by focusing on the POA and effective irradiance models.
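
    The propagation scheme described above can be sketched in a few lines: collect residuals for each model stage, then resample them while chaining the models. The two-stage chain, residual spreads, and module parameters below are invented stand-ins, not the FirstSolar array or the empirical distributions from the report.

        import numpy as np

        rng = np.random.default_rng(1)

        # Stand-in empirical residual samples for two model stages
        poa_residuals = rng.normal(0, 20, 500)    # W/m^2, POA irradiance model
        inv_residuals = rng.normal(0, 0.01, 500)  # relative, DC-to-AC conversion

        def dc_power(poa):        # toy module performance model
            return 0.15 * poa

        def ac_power(dc):         # toy inverter model
            return 0.96 * dc

        poa_measured = 800.0      # W/m^2
        n = 10000
        # Sample a residual for each stage on every draw and chain the models
        poa = poa_measured + rng.choice(poa_residuals, n)
        ac = ac_power(dc_power(poa)) * (1 + rng.choice(inv_residuals, n))

        print(f"AC power: mean {ac.mean():.1f}, std {ac.std():.2f} (toy units)")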

  3. Estimating parameters and uncertainty for three-dimensional flow and transport in a highly heterogeneous sand box experiment.

    SciTech Connect (OSTI)

    McKenna, Sean Andrew; Yoon, Hongkyu; Hart, David Blaine

    2010-12-01

    Heterogeneity plays an important role in groundwater flow and contaminant transport in natural systems. Since it is impossible to directly measure spatial variability of hydraulic conductivity, predictions of solute transport based on mathematical models are always uncertain. While in most cases groundwater flow and tracer transport problems are investigated in two-dimensional (2D) systems, it is important to study more realistic and well-controlled 3D systems to fully evaluate inverse parameter estimation techniques and evaluate uncertainty in the resulting estimates. We used tracer concentration breakthrough curves (BTCs) obtained from a magnetic resonance imaging (MRI) technique in a small flow cell (14 × 8 × 8 cm) that was packed with a known pattern of five different sands (i.e., zones) having cm-scale variability. In contrast to typical inversion systems with head, conductivity, and concentration measurements at limited points, the MRI data included BTCs measured at a voxel scale (≈0.2 cm in each dimension) over 13 × 8 × 8 cm with a well-controlled boundary condition, but did not have direct measurements of head and conductivity. Hydraulic conductivity and porosity were conceptualized as spatial random fields and estimated using pilot points along layers of the 3D medium. The steady state water flow and solute transport were solved using MODFLOW and MODPATH. The inversion problem was solved with a nonlinear parameter estimation package, PEST. Two approaches to parameterization of the spatial fields are evaluated: (1) the detailed zone information was used as prior information to constrain the spatial impact of the pilot points and reduce the number of parameters; and (2) highly parameterized inversion at cm scale (e.g., 1664 parameters) using singular value decomposition (SVD) methodology to significantly reduce the run-time demands. Both results will be compared to measured BTCs. With MRI, it is easy to change the averaging scale of the observed…

  4. Model development and data uncertainty integration

    SciTech Connect (OSTI)

    Swinhoe, Martyn Thomas

    2015-12-02

    The effect of data uncertainties is discussed, with the epithermal neutron multiplicity counter as an illustrative example. Simulation using MCNP6, cross-section perturbations, and correlations are addressed, along with the effect of the ²⁴⁰Pu spontaneous fission neutron spectrum, the effect of P(ν) for ²⁴⁰Pu spontaneous fission, and the effect of spontaneous fission and (α,n) intensity. The effect of nuclear data is the product of the initial uncertainty and the sensitivity; both need to be estimated. In conclusion, a multi-parameter variation method has been demonstrated; the most significant parameters are the basic emission rates of the spontaneous fission and (α,n) processes, and the uncertainties and important data depend on the analysis technique chosen.
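
    The report's central point, that the effect of a nuclear-data input is the product of its uncertainty and the response sensitivity, can be written out directly. The sensitivities and uncertainties below are illustrative placeholders, not evaluated values, and uncorrelated inputs are assumed.

        import numpy as np

        # Relative sensitivities d(response)/d(parameter) and relative
        # standard uncertainties for three hypothetical data inputs
        names = ["SF emission rate", "(alpha,n) rate", "P(nu)"]
        sens = np.array([1.0, 0.4, 0.15])
        unc = np.array([0.02, 0.05, 0.10])

        contrib = sens * unc                    # per-input effect on the response
        total = np.sqrt(np.sum(contrib**2))     # quadrature, uncorrelated inputs
        for name, c in zip(names, contrib):
            print(f"{name}: {c:.4f}")
        print(f"total relative uncertainty: {total:.4f}")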

  5. Coping with uncertainties of mercury regulation

    SciTech Connect (OSTI)

    Reich, K.

    2006-09-15

    The thermometer is rising as coal-fired plants cope with the uncertainties of mercury regulation. The paper deals with a diagnosis and a suggested cure. It describes the state of mercury emission rules in the different US states, many of which had laws or rules in place before the Clean Air Mercury Rule (CAMR) was promulgated.

  6. Uncertainty in Simulating Wheat Yields Under Climate Change

    SciTech Connect (OSTI)

    Asseng, S.; Ewert, F.; Rosenzweig, C.; Jones, J.W.; Hatfield, Jerry; Ruane, Alex; Boote, K. J.; Thorburn, Peter; Rotter, R.P.; Cammarano, D.; Brisson, N.; Basso, B.; Martre, P.; Aggarwal, P.K.; Angulo, C.; Bertuzzi, P.; Biernath, C.; Challinor, AJ; Doltra, J.; Gayler, S.; Goldberg, R.; Grant, Robert; Heng, L.; Hooker, J.; Hunt, L.A.; Ingwersen, J.; Izaurralde, Roberto C.; Kersebaum, K.C.; Mueller, C.; Naresh Kumar, S.; Nendel, C.; O'Leary, G.O.; Olesen, JE; Osborne, T.; Palosuo, T.; Priesack, E.; Ripoche, D.; Semenov, M.A.; Shcherbak, I.; Steduto, P.; Stockle, Claudio O.; Stratonovitch, P.; Streck, T.; Supit, I.; Tao, F.; Travasso, M.; Waha, K.; Wallach, D.; White, J.W.; Williams, J.R.; Wolf, J.

    2013-09-01

    Anticipating the impacts of climate change on crop yields is critical for assessing future food security. Process-based crop simulation models are the most commonly used tools in such assessments [1,2]. Analysis of uncertainties in future greenhouse gas emissions and their impacts on future climate change has been increasingly described in the literature [3,4], while assessments of the uncertainty in crop responses to climate change are very rare. Systematic and objective comparisons across impact studies are difficult, and thus have not been fully realized [5]. Here we present the largest coordinated and standardized crop model intercomparison for climate change impacts on wheat production to date. We found that several individual crop models are able to reproduce measured grain yields under current diverse environments, particularly if sufficient details are provided to execute them. However, simulated climate change impacts can vary across models due to differences in model structures and algorithms. The crop-model component of uncertainty in climate change impact assessments was considerably larger than the climate-model component from Global Climate Models (GCMs). Model responses to high temperatures and temperature-by-CO2 interactions are identified as major sources of simulated impact uncertainties. Significant reductions in impact uncertainties through model improvements in these areas and improved quantification of uncertainty through multi-model ensembles are urgently needed for a more reliable translation of climate change scenarios into agricultural impacts in order to develop adaptation strategies and aid policymaking.

  7. Uncertainty in BWR power during ATWS events

    SciTech Connect (OSTI)

    Diamond, D.J.

    1986-01-01

    A study was undertaken to improve our understanding of BWR conditions following the closure of main steam isolation valves and the failure of reactor trip. Of particular interest was the power during the period when the core had reached a quasi-equilibrium condition with a natural circulation flow rate determined by the water level in the downcomer. The uncertainty in the calculation of this power with sophisticated computer codes was quantified using a simple model which relates power to the principal thermal-hydraulic variables and reactivity coefficients, the latter representing the link between the thermal-hydraulics and the neutronics. Assumptions regarding the uncertainty in these variables and coefficients were then used to determine the uncertainty in power.

  8. On solar geoengineering and climate uncertainty

    SciTech Connect (OSTI)

    MacMartin, Douglas; Kravitz, Benjamin S.; Rasch, Philip J.

    2015-09-03

    Uncertainty in the climate system response has been raised as a concern regarding solar geoengineering. Here we show that model projections of regional climate change outcomes may have greater agreement under solar geoengineering than with CO2 alone. We explore the effects of geoengineering on one source of climate system uncertainty by evaluating the inter-model spread across 12 climate models participating in the Geoengineering Model Intercomparison project (GeoMIP). The model spread in regional temperature and precipitation changes is reduced with CO2 and a solar reduction, in comparison to the case with increased CO2 alone. That is, the intermodel spread in predictions of climate change and the model spread in the response to solar geoengineering are not additive but rather partially cancel. Furthermore, differences in efficacy explain most of the differences between models in their temperature response to an increase in CO2 that is offset by a solar reduction. These conclusions are important for clarifying geoengineering risks.

  9. Uncertainty estimates for derivatives and intercepts

    SciTech Connect (OSTI)

    Clark, E.L.

    1994-09-01

    Straight line least squares fits of experimental data are widely used in the analysis of test results to provide derivatives and intercepts. A method for evaluating the uncertainty in these parameters is described. The method utilizes conventional least squares results and is applicable to experiments where the independent variable is controlled, but not necessarily free of error. A Monte Carlo verification of the method is given.
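
    The standard results the method builds on are easy to reproduce. The sketch below computes slope and intercept uncertainties from conventional least-squares quantities and checks them against a Monte Carlo simulation, in the spirit of the verification described in the abstract; the data set and noise level are invented.

        import numpy as np

        rng = np.random.default_rng(2)
        x = np.linspace(0.0, 10.0, 20)
        a0, b0, sigma = 1.0, 2.5, 0.3              # true intercept, slope, noise
        y = a0 + b0 * x + rng.normal(0, sigma, x.size)

        n = x.size
        sxx = np.sum((x - x.mean())**2)
        b = np.sum((x - x.mean()) * (y - y.mean())) / sxx   # slope estimate
        a = y.mean() - b * x.mean()                         # intercept estimate
        s2 = np.sum((y - a - b * x)**2) / (n - 2)           # residual variance
        u_b = np.sqrt(s2 / sxx)
        u_a = np.sqrt(s2 * (1.0 / n + x.mean()**2 / sxx))

        # Monte Carlo check: scatter of fits over many synthetic data sets
        fits = np.array([np.polyfit(x, a0 + b0 * x + rng.normal(0, sigma, n), 1)
                         for _ in range(5000)])
        print(f"u(slope): formula {u_b:.4f}, Monte Carlo {fits[:, 0].std():.4f}")
        print(f"u(intercept): formula {u_a:.4f}, Monte Carlo {fits[:, 1].std():.4f}")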

  10. Interpolation Uncertainties Across the ARM SGP Area

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Interpolation Uncertainties Across the ARM SGP Area J. E. Christy, C. N. Long, and T. R. Shippert Pacific Northwest National Laboratory Richland, Washington Interpolation Grids Across the SGP Network Area The U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Program operates a network of surface radiation measurement sites across north central Oklahoma and south central Kansas. This Southern Great Plains (SGP) network consists of 21 sites unevenly spaced from 95.5 to 99.5

  11. Uncertainties in risk assessment at USDOE facilities

    SciTech Connect (OSTI)

    Hamilton, L.D.; Holtzman, S.; Meinhold, A.F.; Morris, S.C.; Rowe, M.D.

    1994-01-01

    The United States Department of Energy (USDOE) has embarked on an ambitious program to remediate environmental contamination at its facilities. Decisions concerning cleanup goals, choices among cleanup technologies, and funding prioritization should be largely risk-based. Risk assessments will be used more extensively by the USDOE in the future. USDOE needs to develop and refine risk assessment methods and fund research to reduce major sources of uncertainty in risk assessments at USDOE facilities. The terms "risk assessment" and "risk management" are frequently confused. The National Research Council (1983) and the United States Environmental Protection Agency (USEPA, 1991a) described risk assessment as a scientific process that contributes to risk management. Risk assessment is the process of collecting, analyzing and integrating data and information to identify hazards, assess exposures and dose responses, and characterize risks. Risk characterization must include a clear presentation of "... the most significant data and uncertainties..." in an assessment. Significant data and uncertainties are "...those that define and explain the main risk conclusions". Risk management integrates risk assessment information with other considerations, such as risk perceptions, socioeconomic and political factors, and statutes, to make and justify decisions. Risk assessments, as scientific processes, should be made independently of the other aspects of risk management (USEPA, 1991a), but current methods for assessing health risks are based on conservative regulatory principles, causing unnecessary public concern and misallocation of funds for remediation.

  12. Representation of analysis results involving aleatory and epistemic uncertainty.

    SciTech Connect (OSTI)

    Johnson, Jay Dean; Helton, Jon Craig; Oberkampf, William Louis; Sallaberry, Cedric J.

    2008-08-01

    Procedures are described for the representation of results in analyses that involve both aleatory uncertainty and epistemic uncertainty, with aleatory uncertainty deriving from an inherent randomness in the behavior of the system under study and epistemic uncertainty deriving from a lack of knowledge about the appropriate values to use for quantities that are assumed to have fixed but poorly known values in the context of a specific study. Aleatory uncertainty is usually represented with probability and leads to cumulative distribution functions (CDFs) or complementary cumulative distribution functions (CCDFs) for analysis results of interest. Several mathematical structures are available for the representation of epistemic uncertainty, including interval analysis, possibility theory, evidence theory and probability theory. In the presence of epistemic uncertainty, there is not a single CDF or CCDF for a given analysis result. Rather, there is a family of CDFs and a corresponding family of CCDFs that derive from epistemic uncertainty and have an uncertainty structure that derives from the particular uncertainty structure (i.e., interval analysis, possibility theory, evidence theory, probability theory) used to represent epistemic uncertainty. Graphical formats for the representation of epistemic uncertainty in families of CDFs and CCDFs are investigated and presented for the indicated characterizations of epistemic uncertainty.
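
    The family-of-CDFs construction can be mimicked with a nested sampling loop: the outer loop draws one realization of the epistemically uncertain quantities, and the inner loop samples the aleatory variability conditional on that realization. The uniform epistemic range and normal aleatory model below are arbitrary choices for illustration.

        import numpy as np

        rng = np.random.default_rng(3)
        n_epistemic, n_aleatory = 50, 1000
        grid = np.linspace(0.0, 10.0, 200)
        cdfs = np.empty((n_epistemic, grid.size))

        for i in range(n_epistemic):
            mu = rng.uniform(2.0, 4.0)               # fixed but poorly known value
            out = np.sort(rng.normal(mu, 1.0, n_aleatory))   # inherent randomness
            cdfs[i] = np.searchsorted(out, grid) / n_aleatory  # one empirical CDF

        # Pointwise envelope and median across the family of CDFs
        lo, med, hi = np.percentile(cdfs, [5, 50, 95], axis=0)
        print(f"CDF near x=5: median {med[100]:.2f}, "
              f"5-95% band [{lo[100]:.2f}, {hi[100]:.2f}]")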

  13. October 16, 2014 Webinar - Decisional Analysis under Uncertainty

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Department of Energy Webinar - October 16, 2014, 11 am - 12:40 pm EDT: Dr. Paul Black (Neptune, Inc.), Decisional Analysis under Uncertainty. Materials: Agenda - October 16, 2014 - P&RA CoP Webinar (59.42 KB); Presentation - Decision Making under Uncertainty: Introduction to Structured Decision Analysis for Performance Assessments (4.02 MB)

  14. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 1: Main report

    SciTech Connect (OSTI)

    Brown, J.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-06-01

    This volume is the first of a two-volume document that summarizes a joint project conducted by the US Nuclear Regulatory Commission and the European Commission to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This document reports on an ongoing project to assess uncertainty in the MACCS and COSYMA calculations for the offsite consequences of radionuclide releases by hypothetical nuclear power plant accidents. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain variables that affect calculations of offsite consequences. The expert judgment elicitation procedure and its outcomes are described in these volumes. Other panels were formed to consider uncertainty in other aspects of the codes. Their results are described in companion reports. Volume 1 contains background information and a complete description of the joint consequence uncertainty study. Volume 2 contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures for both panels, (3) the rationales and results for the panels on soil and plant transfer and animal transfer, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  15. HPC Analytics Support. Requirements for Uncertainty Quantification Benchmarks

    SciTech Connect (OSTI)

    Paulson, Patrick R.; Purohit, Sumit; Rodriguez, Luke R.

    2015-05-01

    This report outlines techniques for extending benchmark generation products so they support uncertainty quantification by benchmarked systems. We describe how uncertainty quantification requirements can be presented to candidate analytical tools supporting SPARQL. We also describe benchmark data sets for evaluating uncertainty quantification, as well as an approach for using our benchmark generator to produce such data sets.

  16. Entropic uncertainty relations in multidimensional position and momentum spaces

    SciTech Connect (OSTI)

    Huang Yichen

    2011-05-15

    Commutator-based entropic uncertainty relations in multidimensional position and momentum spaces are derived, generalizing previous entropic uncertainty relations for one-mode states in two ways. They provide optimal lower bounds and imply the multidimensional variance-based uncertainty principle. The article concludes with an open conjecture.
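
    For orientation, the relation that results of this kind generalize is the familiar position-momentum entropic bound; in n-dimensional position and momentum spaces (with ħ = 1) it takes the form below, and the Gaussian maximum-entropy argument recovers the variance-based principle from it. This is the standard Bialynicki-Birula and Mycielski form, quoted here for context rather than taken from the article:

        H(\mathbf{x}) + H(\mathbf{p}) \;\ge\; n \ln(e\pi),

    where H denotes the differential Shannon entropy of the respective probability density.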

  17. Tutorial examples for uncertainty quantification methods.

    SciTech Connect (OSTI)

    De Bord, Sarah

    2015-08-01

    This report details the work accomplished during my 2015 SULI summer internship at Sandia National Laboratories in Livermore, CA. During this internship, I worked on multiple tasks with the common goal of making uncertainty quantification (UQ) methods more accessible to the general scientific community. As part of my work, I created a comprehensive numerical integration example to incorporate into the user manual of a UQ software package. Further, I developed examples involving heat transfer through a window to incorporate into tutorial lectures that serve as an introduction to UQ methods.
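
    As an example of the kind of tutorial problem described, a heat-transfer-through-a-window exercise reduces to propagating input uncertainty through Q = U·A·ΔT. The sketch below uses invented values for the U-value and temperature difference, which are not from the tutorial, and compares Monte Carlo sampling with first-order propagation.

        import numpy as np

        rng = np.random.default_rng(4)
        n = 100_000
        U = rng.normal(2.8, 0.2, n)      # W/(m^2 K), uncertain window U-value
        A = 1.5                          # m^2, assumed exactly known
        dT = rng.normal(20.0, 2.0, n)    # K, uncertain temperature difference

        Q = U * A * dT                   # heat flow through the window
        rel_mc = Q.std() / Q.mean()
        rel_fo = np.sqrt((0.2 / 2.8)**2 + (2.0 / 20.0)**2)   # first-order estimate
        print(f"Q = {Q.mean():.0f} W +/- {Q.std():.0f} W")
        print(f"relative uncertainty: MC {rel_mc:.3f}, first-order {rel_fo:.3f}")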

  18. Microsoft Word - Price Uncertainty Supplement .docx

    Gasoline and Diesel Fuel Update (EIA)

    January 2011 Short-Term Energy Outlook, Energy Price Volatility and Forecast Uncertainty, January 11, 2011 Release. Crude Oil Prices. West Texas Intermediate (WTI) crude oil spot prices averaged over $89 per barrel in December, about $5 per barrel higher than the November average. Expectations of higher oil demand, combined with unusually cold weather in both Europe and the U.S. Northeast, contributed to rising prices. EIA has raised the first quarter 2011 WTI spot price forecast by $8 per barrel

  19. Microsoft Word - Price Uncertainty Supplement.doc

    Gasoline and Diesel Fuel Update (EIA)

    April 2010 Short-Term Energy Outlook, Energy Price Volatility and Forecast Uncertainty, April 6, 2010 Release. Crude Oil Prices. WTI crude oil spot prices averaged $81 per barrel in March 2010, almost $5 per barrel above the prior month's average and $3 per barrel higher than forecast in last month's Outlook. Oil prices rose from a low this year of $71.15 per barrel on February 5 to $80 per barrel by the end of February, generally on news of robust economic and energy demand growth in non-OECD

  20. Microsoft Word - Price Uncertainty Supplement.doc

    Gasoline and Diesel Fuel Update (EIA)

    August 2010 Short-Term Energy Outlook, Energy Price Volatility and Forecast Uncertainty, August 10, 2010 Release. WTI crude oil spot prices averaged $76.32 per barrel in July 2010, about $1 per barrel above the prior month's average and close to the $77 per barrel projected in last month's Outlook. EIA projects WTI prices will average about $80 per barrel over the second half of this year and rise to $85 by the end of next year (West Texas Intermediate Crude Oil Price Chart). Energy price

  1. Microsoft Word - Price Uncertainty Supplement.doc

    Gasoline and Diesel Fuel Update (EIA)

    December 2010 Short-Term Energy Outlook, Energy Price Volatility and Forecast Uncertainty, December 7, 2010 Release. Crude Oil Prices. West Texas Intermediate (WTI) crude oil spot prices averaged over $84 per barrel in November, more than $2 per barrel higher than the October average. EIA has raised the average winter 2010-2011 period WTI spot price forecast by $1 per barrel from last month's Outlook to $84 per barrel. WTI spot prices rise to $89 per barrel by the end of next year, $2 per

  2. Microsoft Word - Price Uncertainty Supplement.doc

    Gasoline and Diesel Fuel Update (EIA)

    July 2010 Short-Term Energy Outlook, Energy Price Volatility and Forecast Uncertainty, July 7, 2010 Release. Crude Oil Prices. WTI crude oil spot prices averaged $75.34 per barrel in June 2010 ($1.60 per barrel above the prior month's average), close to the $76 per barrel projected in last month's Outlook. EIA projects WTI prices will average about $79 per barrel over the second half of this year and rise to $84 by the end of next year (West Texas Intermediate Crude Oil Price

  3. Microsoft Word - Price Uncertainty Supplement.doc

    Gasoline and Diesel Fuel Update (EIA)

    June 2010 Short-Term Energy Outlook, Energy Price Volatility and Forecast Uncertainty, June 8, 2010 Release. Crude Oil Prices. WTI crude oil spot prices averaged less than $74 per barrel in May 2010, almost $11 per barrel below the prior month's average and $7 per barrel lower than forecast in last month's Outlook. EIA projects WTI prices will average about $79 per barrel over the second half of this year and rise to $84 by the end of next year, a decrease of about $3 per barrel from the

  4. Microsoft Word - Price Uncertainty Supplement.doc

    Gasoline and Diesel Fuel Update (EIA)

    March 2010 Short-Term Energy Outlook, Energy Price Volatility and Forecast Uncertainty, March 9, 2010 Release. Crude Oil Prices. WTI crude oil spot prices averaged $76.39 per barrel in February 2010, almost $2 per barrel lower than the prior month's average and very near the $76 per barrel forecast in last month's Outlook. Last month, the WTI spot price reached a low of $71.15 on February 5 and peaked at $80.04 on February 22. EIA expects WTI prices to average above $80 per barrel this spring,

  5. Microsoft Word - Price Uncertainty Supplement.doc

    Gasoline and Diesel Fuel Update (EIA)

    May 2010 Short-Term Energy Outlook, Energy Price Volatility and Forecast Uncertainty, May 11, 2010 Release. Crude Oil Prices. WTI crude oil spot prices averaged $84 per barrel in April 2010, about $3 per barrel above the prior month's average and $2 per barrel higher than forecast in last month's Outlook. EIA projects WTI prices will average about $84 per barrel over the second half of this year and rise to $87 by the end of next year, an increase of about $2 per barrel from the previous Outlook

  6. Microsoft Word - Price Uncertainty Supplement.doc

    Gasoline and Diesel Fuel Update (EIA)

    November 2010 Short-Term Energy Outlook, Energy Price Volatility and Forecast Uncertainty, November 9, 2010 Release. Crude Oil Prices. WTI crude oil spot prices averaged almost $82 per barrel in October, about $7 per barrel higher than the September average, as expectations of higher oil demand pushed up prices. EIA has raised the average fourth quarter 2010 WTI spot price forecast to about $83 per barrel compared with $79 per barrel in last month's Outlook. WTI spot prices rise to $87 per

  7. Microsoft Word - Price Uncertainty Supplement.doc

    Gasoline and Diesel Fuel Update (EIA)

    September 2010 Short-Term Energy Outlook, Energy Price Volatility and Forecast Uncertainty, September 8, 2010 Release. Crude Oil Prices. West Texas Intermediate (WTI) crude oil spot prices averaged about $77 per barrel in August 2010, very close to the July average, but $3 per barrel lower than projected in last month's Outlook. WTI spot prices averaged almost $82 per barrel over the first 10 days of August but then fell by $9 per barrel over the next 2 weeks as the market reacted to a series

  8. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    SciTech Connect (OSTI)

    ALVIN,KENNETH F.; OBERKAMPF,WILLIAM L.; RUTHERFORD,BRIAN M.; DIEGERT,KATHLEEN V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.

  9. Adaptive strategies for materials design using uncertainties

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Balachandran, Prasanna V.; Xue, Dezhen; Theiler, James; Hogden, John; Lookman, Turab

    2016-01-21

    Here, we compare several adaptive design strategies using a data set of 223 M2AX family of compounds for which the elastic properties [bulk (B), shear (G), and Young’s (E) modulus] have been computed using density functional theory. The design strategies are decomposed into an iterative loop with two main steps: machine learning is used to train a regressor that predicts elastic properties in terms of elementary orbital radii of the individual components of the materials; and a selector uses these predictions and their uncertainties to choose the next material to investigate. The ultimate goal is to obtain a material with desired elastic properties in as few iterations as possible. We examine how the choice of data set size, regressor, and selector impacts the design. We find that selectors that use information about the prediction uncertainty outperform those that don’t. Our work is a step in illustrating how adaptive design tools can guide the search for new materials with desired properties.
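
    The selector step is where prediction uncertainty enters the loop. A common uncertainty-exploiting choice is expected improvement, sketched below for three hypothetical candidates; the predictions, uncertainties, and incumbent best value are invented, and the abstract does not specify that this exact selector was used.

        import numpy as np
        from scipy.stats import norm

        best = 210.0                             # best property value found so far
        pred = np.array([205.0, 208.0, 202.0])   # regressor predictions
        sigma = np.array([1.0, 6.0, 15.0])       # prediction uncertainties

        z = (pred - best) / sigma
        ei = sigma * (z * norm.cdf(z) + norm.pdf(z))   # expected improvement

        print("greedy pick (ignores uncertainty):", int(np.argmax(pred)))
        print("EI pick (exploits uncertainty):   ", int(np.argmax(ei)))

    Here the greedy selector picks the candidate with the best prediction, while expected improvement favors a slightly worse prediction whose large uncertainty gives it a better chance of beating the incumbent.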

  10. Uncertainty Budget Analysis for Dimensional Inspection Processes (U)

    SciTech Connect (OSTI)

    Valdez, Lucas M.

    2012-07-26

    This paper is intended to provide guidance on preparing an uncertainty analysis of a dimensional inspection process through an uncertainty budget analysis. The uncertainty analysis follows the same methodology as the ISO GUM standard for calibration and testing. A specific distinction is drawn between how Type A and Type B uncertainty analyses are used in general and process-specific settings. Theory and applications are presented both as a generalized approach to estimating measurement uncertainty and as guidance on reporting and presenting these estimates for dimensional measurements in a dimensional inspection process. The analysis of this uncertainty budget shows that a well-controlled dimensional inspection process produces a conservative process uncertainty, which can be attributed to the assumptions necessary for best possible results.
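
    A minimal budget in this style combines Type A components (evaluated statistically from repeated measurements) and Type B components (evaluated by other means, often from assumed rectangular distributions) in quadrature. The component values below are invented for illustration and follow common GUM conventions, not the paper's actual budget.

        import numpy as np

        # Type A: standard deviation of the mean from 10 repeat measurements
        repeatability = 0.0008 / np.sqrt(10)     # mm
        # Type B: rectangular distributions for resolution and thermal effects
        resolution = 0.0005 / np.sqrt(12)        # mm, full-width 0.5 um
        thermal = 0.0012 / np.sqrt(3)            # mm, half-width 1.2 um

        u_c = np.sqrt(repeatability**2 + resolution**2 + thermal**2)
        U = 2.0 * u_c                            # expanded uncertainty, k = 2
        print(f"u_c = {u_c * 1000:.3f} um, expanded U(k=2) = {U * 1000:.3f} um")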

  11. Bayesian Uncertainty Quantification in Predictions of Flows in Highly Heterogeneous Media and Its Applications to the CO2 Sequestration

    SciTech Connect (OSTI)

    Efendiev, Yalchin; Datta-Gupta, Akhil; Jafarpour, Behnam; Mallick, Bani; Vassilevski, Panayot

    2015-11-09

    In this project, we have worked on Bayesian uncertainty quantification for predictions of flows in highly heterogeneous media. The research is broad and includes: prior modeling for heterogeneous permeability fields; effective parametrization of heterogeneous spatial priors; efficient ensemble-level solution techniques; efficient multiscale approximation techniques; study of the regularity of complex posterior distributions and the error estimates due to parameter reduction; efficient sampling techniques; and applications to multi-phase flow and transport. We list our publications below and describe some of our main research activities. Our multi-disciplinary team includes experts from the areas of multiscale modeling, multilevel solvers, Bayesian statistics, spatial permeability modeling, and the application domain.

  12. Survey and Evaluate Uncertainty Quantification Methodologies

    SciTech Connect (OSTI)

    Lin, Guang; Engel, David W.; Eslinger, Paul W.

    2012-02-01

    The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that will develop and deploy state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models with uncertainty quantification, optimization, risk analysis and decision making capabilities. The CCSI Toolset will incorporate commercial and open-source software currently in use by industry and will also develop new software tools as necessary to fill technology gaps identified during execution of the project. The CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. The goal of CCSI is to deliver a toolset that can simulate the scale-up of a broad set of new carbon capture technologies from laboratory scale to full commercial scale. To provide a framework around which the toolset can be developed and demonstrated, we will focus on three Industrial Challenge Problems (ICPs) related to carbon capture technologies relevant to U.S. pulverized coal (PC) power plants. Post combustion capture by solid sorbents is the technology focus of the initial ICP (referred to as ICP A). The goal of the uncertainty quantification (UQ) task (Task 6) is to provide a set of capabilities to the user community for the quantification of uncertainties associated with the carbon…

  13. Lidar arc scan uncertainty reduction through scanning geometry optimization

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Wang, H.; Barthelmie, R. J.; Pryor, S. C.; Brown, G.

    2015-10-07

    Doppler lidars are frequently operated in a mode referred to as arc scans, wherein the lidar beam scans across a sector with a fixed elevation angle and the resulting measurements are used to derive an estimate of the n minute horizontal mean wind velocity (speed and direction). Previous studies have shown that the uncertainty in the measured wind speed originates from turbulent wind fluctuations and depends on the scan geometry (the arc span and the arc orientation). This paper is designed to provide guidance on optimal scan geometries for two key applications in the wind energy industry: wind turbine power performance analysis and annual energy production. We present a quantitative analysis of the retrieved wind speed uncertainty derived using a theoretical model with the assumption of isotropic and frozen turbulence, and observations from three sites that are onshore with flat terrain, onshore with complex terrain and offshore, respectively. The results from both the theoretical model and observations show that the uncertainty is scaled with the turbulence intensity such that the relative standard error on the 10 min mean wind speed is about 30% of the turbulence intensity. The uncertainty in both retrieved wind speeds and derived wind energy production estimates can be reduced by aligning lidar beams with the dominant wind direction, increasing the arc span and lowering the number of beams per arc scan. Large arc spans should be used at sites with high turbulence intensity and/or large wind direction variation when arc scans are used for wind resource assessment.

  14. Geostatistical evaluation of travel time uncertainties

    SciTech Connect (OSTI)

    Devary, J.L.

    1983-08-01

    Data on potentiometric head and hydraulic conductivity, gathered from the Wolfcamp Formation of the Permian System, have exhibited tremendous spatial variability as a result of heterogeneities in the media and the presence of petroleum and natural gas deposits. Geostatistical data analysis and error propagation techniques (kriging and conditional simulation) were applied to determine the effect of potentiometric head uncertainties on radionuclide travel paths and travel times through the Wolfcamp Formation. Block-average kriging was utilized to remove measurement error from potentiometric head data. The travel time calculations have been enhanced by the use of an inverse technique to determine the relative hydraulic conductivity along travel paths. In this way, the spatial variability of the hydraulic conductivity corresponding to streamline convergence and divergence may be included in the analysis. 22 references, 11 figures, 1 table.

  15. Radiotherapy Dose Fractionation under Parameter Uncertainty

    SciTech Connect (OSTI)

    Davison, Matt; Kim, Daero; Keller, Harald

    2011-11-30

    In radiotherapy, radiation is directed to damage a tumor while avoiding surrounding healthy tissue. Tradeoffs ensue because dose cannot be exactly shaped to the tumor. It is particularly important to ensure that sensitive biological structures near the tumor are not damaged more than a certain amount. Biological tissue is known to have a nonlinear response to incident radiation. The linear quadratic dose response model, which requires the specification of two clinically and experimentally observed response coefficients, is commonly used to model this effect. This model yields an optimization problem giving two different types of optimal dose sequences (fractionation schedules). Which fractionation schedule is preferred depends on the response coefficients. These coefficients are uncertainly known and may differ from patient to patient. Because of this, not only the expected outcomes but also the uncertainty around these outcomes are important, and it might not be prudent to select the strategy with the best expected outcome.
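
    The optimization rests on the linear-quadratic response E = n(αd + βd²) for n fractions of dose d, and the preferred schedule hinges on the uncertain α/β ratio. The sketch below samples that ratio and compares two schedules; all coefficient distributions are invented for illustration and are not clinical values.

        import numpy as np

        rng = np.random.default_rng(6)

        def effect(n, d, alpha, beta):
            # Linear-quadratic effect for n fractions of dose d
            return n * (alpha * d + beta * d**2)

        m = 10000
        alpha = rng.normal(0.35, 0.05, m)              # 1/Gy, illustrative spread
        ab = np.clip(rng.normal(3.0, 1.0, m), 0.5, None)   # Gy, uncertain alpha/beta
        beta = alpha / ab

        hypo = effect(5, 6.0, alpha, beta)       # few large fractions: 5 x 6 Gy
        conv = effect(30, 2.0, alpha, beta)      # many small fractions: 30 x 2 Gy

        print(f"P(5x6 Gy effect exceeds 30x2 Gy) = {np.mean(hypo > conv):.2f}")

    Under these assumptions the hypofractionated schedule gives the larger effect only when the sampled α/β falls below 2 Gy, so the printed probability is a direct readout of the parameter uncertainty.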

  16. Intrinsic Uncertainties in Modeling Complex Systems.

    SciTech Connect (OSTI)

    Cooper, Curtis S; Bramson, Aaron L.; Ames, Arlo L.

    2014-09-01

    Models are built to understand and predict the behaviors of both natural and artificial systems. Because it is always necessary to abstract away aspects of any non-trivial system being modeled, we know models can potentially leave out important, even critical elements. This reality of the modeling enterprise forces us to consider the prospective impacts of those effects completely left out of a model, either intentionally or unconsidered. Insensitivity to new structure is an indication of diminishing returns. In this work, we represent a hypothetical unknown effect on a validated model as a finite perturbation whose amplitude is constrained within a control region. We find robustly that without further constraints, no meaningful bounds can be placed on the amplitude of a perturbation outside of the control region. Thus, forecasting into unsampled regions is a very risky proposition. We also present inherent difficulties with proper time discretization of models and with representing inherently discrete quantities. We point out potentially worrisome uncertainties, arising from mathematical formulation alone, which modelers can inadvertently introduce into models of complex systems. Acknowledgements: This work has been funded under early-career LDRD project #170979, entitled "Quantifying Confidence in Complex Systems Models Having Structural Uncertainties", which ran from 04/2013 to 09/2014. We wish to express our gratitude to the many researchers at Sandia who contributed ideas to this work, as well as feedback on the manuscript. In particular, we would like to mention George Barr, Alexander Outkin, Walt Beyeler, Eric Vugrin, and Laura Swiler for providing invaluable advice and guidance through the course of the project. We would also like to thank Steven Kleban, Amanda Gonzales, Trevor Manzanares, and Sarah Burwell for their assistance in managing project tasks and resources.

  17. Entanglement criteria via concave-function uncertainty relations

    SciTech Connect (OSTI)

    Huang Yichen

    2010-07-15

    A general theorem as a necessary condition for the separability of quantum states in both finite and infinite dimensional systems, based on concave-function uncertainty relations, is derived. Two special cases of the general theorem are stronger than two known entanglement criteria based on the Shannon entropic uncertainty relation and the Landau-Pollak uncertainty relation, respectively; other special cases are able to detect entanglement where some famous entanglement criteria fail.

  18. Output-Based Error Estimation and Adaptation for Uncertainty...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Output-Based Error Estimation and Adaptation for Uncertainty Quantification Isaac M. Asher and Krzysztof J. Fidkowski University of Michigan US National Congress on Computational...

  19. Early solar mass loss, opacity uncertainties, and the solar abundance...

    Office of Scientific and Technical Information (OSTI)

    Early solar mass loss, opacity uncertainties, and the solar abundance problem. Citation ...

  20. Estimation of uncertainty for contour method residual stress measurements

    SciTech Connect (OSTI)

    Olson, Mitchell D.; DeWald, Adrian T.; Prime, Michael B.; Hill, Michael R.

    2014-12-03

    This paper describes a methodology for the estimation of measurement uncertainty for the contour method, where the contour method is an experimental technique for measuring a two-dimensional map of residual stress over a plane. Random error sources including the error arising from noise in displacement measurements and the smoothing of the displacement surfaces are accounted for in the uncertainty analysis. The output is a two-dimensional, spatially varying uncertainty estimate such that every point on the cross-section where residual stress is determined has a corresponding uncertainty value. Both numerical and physical experiments are reported, which are used to support the usefulness of the proposed uncertainty estimator. The uncertainty estimator shows the contour method to have larger uncertainty near the perimeter of the measurement plane. For the experiments, which were performed on a quenched aluminum bar with a cross section of 51 × 76 mm, the estimated uncertainty was approximately 5 MPa (σ/E = 7 · 10⁻⁵) over the majority of the cross-section, with localized areas of higher uncertainty, up to 10 MPa (σ/E = 14 · 10⁻⁵).

  1. Uncertainty quantification of US Southwest climate from IPCC...

    Office of Scientific and Technical Information (OSTI)

    Uncertainty quantification of US Southwest climate from IPCC projections. The Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report (AR4) made extensive ...

  2. Measurement uncertainty analysis techniques applied to PV performance measurements

    SciTech Connect (OSTI)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Valid data are defined here as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.

  3. Uncertainty quantification in fission cross section measurements at LANSCE

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Tovesson, F.

    2015-01-09

    Neutron-induced fission cross sections have been measured for several isotopes of uranium and plutonium at the Los Alamos Neutron Science Center (LANSCE) over a wide range of incident neutron energies. The total uncertainties in these measurements are in the range 3–5% above 100 keV of incident neutron energy and result from uncertainties in the target, neutron source, and detector system. The individual sources of uncertainty are assumed to be uncorrelated; however, correlations in the cross section across neutron energy bins are considered. The quantification of the uncertainty contributions is described here.

  4. Uncertainties in Air Exchange using Continuous-Injection, Long...

    Office of Scientific and Technical Information (OSTI)

    people to minimize experimental costs. In this article we will conduct a first-principles error analysis to estimate the uncertainties and then compare that analysis to CILTS...

  5. Improvements to Nuclear Data and Its Uncertainties by Theoretical...

    Office of Scientific and Technical Information (OSTI)

    data evaluation process much more accurately, and lead to a new generation of uncertainty quantification files. ... of AFCI. While in the past the design, construction and operation of ...

  6. Improvements of Nuclear Data and Its Uncertainties by Theoretical...

    Office of Scientific and Technical Information (OSTI)

    Improvements of Nuclear Data and Its Uncertainties by Theoretical Modeling. Talou, Patrick (Los Alamos National Laboratory); Nazarewicz, Witold (University of Tennessee, Knoxville),...

  7. A density-matching approach for optimization under uncertainty...

    Office of Scientific and Technical Information (OSTI)

    A density-matching approach for optimization under uncertainty. Journal Name: Computer Methods in Applied Mechanics and Engineering ...

  8. Uncertainty Estimates for SIRS, SKYRAD, & GNDRAD Data and Reprocessing...

    Office of Scientific and Technical Information (OSTI)

    Uncertainty Estimates for SIRS, SKYRAD, & GNDRAD Data and Reprocessing the Pyrgeometer Data (Presentation). The National Renewable Energy Laboratory (NREL) and the ...

  9. Comparison of Uncertainty of Two Precipitation Prediction Models...

    Office of Scientific and Technical Information (OSTI)

    Comparison of Uncertainty of Two Precipitation Prediction Models at Los Alamos National Lab Technical Area 54 ...

  10. Variable Grid Method for Visualizing Uncertainty Associated with...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    interpretation. In general, NETL's VGM applies a grid system where the size of the cell represents the uncertainty associated with the original point data sources or their...

  11. A Unified Approach for Reporting ARM Measurement Uncertainties...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    A Unified Approach for Reporting ARM Measurement Uncertainties. Technical Report, E. Campos and D.L. Sisterson, October 2015 ...

  12. Estimation of uncertainty for contour method residual stress measurements

    SciTech Connect (OSTI)

    Olson, Mitchell D.; DeWald, Adrian T.; Prime, Michael B.; Hill, Michael R.

    2014-12-03

    This paper describes a methodology for the estimation of measurement uncertainty for the contour method, where the contour method is an experimental technique for measuring a two-dimensional map of residual stress over a plane. Random error sources including the error arising from noise in displacement measurements and the smoothing of the displacement surfaces are accounted for in the uncertainty analysis. The output is a two-dimensional, spatially varying uncertainty estimate such that every point on the cross-section where residual stress is determined has a corresponding uncertainty value. Both numerical and physical experiments are reported, which are used to support the usefulness of the proposed uncertainty estimator. The uncertainty estimator shows the contour method to have larger uncertainty near the perimeter of the measurement plane. For the experiments, which were performed on a quenched aluminum bar with a cross section of 51 × 76 mm, the estimated uncertainty was approximately 5 MPa (σ/E = 7 · 10⁻⁵) over the majority of the cross-section, with localized areas of higher uncertainty, up to 10 MPa (σ/E = 14 · 10⁻⁵).

  13. Estimation of uncertainty for contour method residual stress measurements

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Olson, Mitchell D.; DeWald, Adrian T.; Prime, Michael B.; Hill, Michael R.

    2014-12-03

    This paper describes a methodology for the estimation of measurement uncertainty for the contour method, where the contour method is an experimental technique for measuring a two-dimensional map of residual stress over a plane. Random error sources including the error arising from noise in displacement measurements and the smoothing of the displacement surfaces are accounted for in the uncertainty analysis. The output is a two-dimensional, spatially varying uncertainty estimate such that every point on the cross-section where residual stress is determined has a corresponding uncertainty value. Both numerical and physical experiments are reported, which are used to support the usefulness of the proposed uncertainty estimator. The uncertainty estimator shows the contour method to have larger uncertainty near the perimeter of the measurement plane. For the experiments, which were performed on a quenched aluminum bar with a cross section of 51 × 76 mm, the estimated uncertainty was approximately 5 MPa (σ/E = 7 · 10⁻⁵) over the majority of the cross-section, with localized areas of higher uncertainty, up to 10 MPa (σ/E = 14 · 10⁻⁵).

  14. Characterizing Uncertainty for Regional Climate Change Mitigation and Adaptation Decisions

    SciTech Connect (OSTI)

    Unwin, Stephen D.; Moss, Richard H.; Rice, Jennie S.; Scott, Michael J.

    2011-09-30

    This white paper describes the results of new research to develop an uncertainty characterization process to help address the challenges of regional climate change mitigation and adaptation decisions.

  15. Estimation and Uncertainty Analysis of Impacts of Future Heat...

    Office of Scientific and Technical Information (OSTI)

    However, the estimation of excess mortality attributable to future heat waves is subject to large uncertainties, which have not been examined under the latest greenhouse gas ...

  16. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, main report

    SciTech Connect (OSTI)

    Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The ultimate objective of the joint effort was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. Experts developed their distributions independently. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. To validate the distributions generated for the dispersion code input variables, samples were taken from the distributions and propagated through the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the first of a three-volume document describing the project.
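
    A common aggregation step in such studies is equal-weight pooling of the experts' elicited distributions. The sketch below pools samples from three hypothetical experts' lognormal distributions for one input variable; the parameters are invented, and the actual study used its own formal elicitation and aggregation protocol.

        import numpy as np

        rng = np.random.default_rng(7)

        # (mu, sigma) of a lognormal elicited from each hypothetical expert
        experts = [(1.0, 0.3), (1.4, 0.5), (0.8, 0.2)]

        pooled = np.concatenate(
            [rng.lognormal(mu, s, 10000) for mu, s in experts])  # equal weights

        q05, q50, q95 = np.percentile(pooled, [5, 50, 95])
        print(f"aggregated 5/50/95 percentiles: {q05:.2f}, {q50:.2f}, {q95:.2f}")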

  17. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

    SciTech Connect (OSTI)

    Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project.

  18. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices

    SciTech Connect (OSTI)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Harrison, J.D.; Harper, F.T.; Hora, S.C.

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  19. Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 2: Appendices

    SciTech Connect (OSTI)

    Haskin, F.E.; Harper, F.T.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on early health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  20. Investigation of uncertainty components in Coulomb blockade thermometry

    SciTech Connect (OSTI)

    Hahtela, O. M.; Heinonen, M.; Manninen, A.; Meschke, M.; Savin, A.; Pekola, J. P.; Gunnarsson, D.; Prunnila, M.; Penttilä, J. S.; Roschier, L.

    2013-09-11

    Coulomb blockade thermometry (CBT) has proven to be a feasible method for primary thermometry in everyday laboratory use at cryogenic temperatures from ca. 10 mK to a few tens of kelvins. The operation of CBT is based on single-electron charging effects in normal metal tunnel junctions. In this paper, we discuss the typical error sources and uncertainty components that limit the present absolute accuracy of CBT measurements to the level of about 1% in the optimum temperature range. Identifying the influence of the different uncertainty sources is a good starting point for improving the measurement accuracy to a level that would allow CBT to be more widely used in high-precision low-temperature metrological applications and for realizing thermodynamic temperature in accordance with the upcoming new definition of the kelvin.
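
    As a sketch of the primary-thermometry relation CBT relies on, the conductance dip of an array of N junctions in series has a full width at half minimum of V_half = 5.439*N*k_B*T/e (Pekola et al.), so a measured width gives temperature directly. The values below are illustrative, not taken from this paper.

    ```python
    # Primary CBT relation: V_half = 5.439 * N * k_B * T / e
    K_B = 1.380649e-23   # Boltzmann constant, J/K
    E = 1.602176634e-19  # elementary charge, C

    def cbt_temperature(v_half, n_junctions):
        """Temperature (K) from the measured dip full width at half minimum (V)."""
        return E * v_half / (5.439 * n_junctions * K_B)

    # Illustrative numbers: a 1.0 mV half-width on a 100-junction array
    t = cbt_temperature(v_half=1.0e-3, n_junctions=100)
    u_t = cbt_temperature(1.0e-5, 100)  # T is linear in V_half, so a 1% width
                                        # uncertainty maps to 1% in temperature
    print(f"T = {t * 1e3:.2f} mK +/- {u_t * 1e3:.2f} mK")  # ~21.3 mK
    ```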

  1. Size exclusion deep bed filtration: Experimental and modelling uncertainties

    SciTech Connect (OSTI)

    Badalyan, Alexander; You, Zhenjiang; Aji, Kaiser; Bedrikovetsky, Pavel; Carageorgos, Themis; Zeinijahromi, Abbas

    2014-01-15

    A detailed uncertainty analysis associated with carboxyl-modified latex particle capture in glass bead-formed porous media enabled verification of two theoretical stochastic models for predicting particle retention due to size exclusion. At the beginning of this analysis it is established that size exclusion is the dominant particle capture mechanism in the present study: the calculated significant repulsive Derjaguin-Landau-Verwey-Overbeek potential between latex particles and glass beads indicates their mutual repulsion, thus fulfilling the necessary condition for size exclusion. Applying the linear uncertainty propagation method in the form of a truncated Taylor series expansion, combined standard uncertainties (CSUs) in normalised suspended particle concentrations are calculated using CSUs in experimentally determined parameters such as the inlet volumetric flowrate of suspension, particle number in suspensions, particle concentrations in inlet and outlet streams, and particle and pore throat size distributions. Weathering of glass beads in high alkaline solutions does not appreciably change the particle size distribution and is therefore not considered an additional contributor to the weighted mean particle radius and corresponding weighted mean standard deviation. The weighted mean particle radius and the log-normal mean pore throat radius are characterised by the highest CSUs among all experimental parameters, translating to a high CSU in the jamming ratio factor (dimensionless particle size). Normalised suspended particle concentrations calculated via the two theoretical models are characterised by higher CSUs than those for the experimental data. The model accounting for the fraction of inaccessible flow as a function of latex particle radius excellently predicts normalised suspended particle concentrations for the whole range of jamming ratios. The presented uncertainty analysis can also be used for comparison of intra- and inter-laboratory particle size exclusion data.
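
    The combined-standard-uncertainty machinery described above can be sketched generically: propagate the standard uncertainties of the measured inputs through a first-order (truncated Taylor series) expansion of the model. The model function and numbers below are hypothetical stand-ins, not the authors' filtration model.

    ```python
    import numpy as np

    def combined_standard_uncertainty(f, x, u, eps=1e-6):
        """First-order (truncated Taylor series) propagation:
        u_c(y)^2 = sum_i (df/dx_i)^2 * u_i^2, derivatives by central differences."""
        x, u = np.asarray(x, float), np.asarray(u, float)
        grads = np.empty_like(x)
        for k in range(x.size):
            h = eps * max(abs(x[k]), 1.0)
            xp, xm = x.copy(), x.copy()
            xp[k] += h
            xm[k] -= h
            grads[k] = (f(xp) - f(xm)) / (2.0 * h)
        return float(np.sqrt(np.sum((grads * u) ** 2)))

    # Hypothetical example: normalised concentration c = n_out / n_in from
    # particle counts that carry their own standard uncertainties
    c_norm = lambda p: p[1] / p[0]
    u_c = combined_standard_uncertainty(c_norm, x=[1.0e5, 6.0e4], u=[500.0, 400.0])
    print(f"c = 0.600 +/- {u_c:.4f}")  # ~0.0050
    ```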

  2. Quantifying and Reducing Curve-Fitting Uncertainty in Isc: Preprint

    SciTech Connect (OSTI)

    Campanelli, Mark; Duck, Benjamin; Emery, Keith

    2015-09-28

    Current-voltage (I-V) curve measurements of photovoltaic (PV) devices are used to determine performance parameters and to establish traceable calibration chains. Measurement standards specify localized curve-fitting methods, e.g., straight-line interpolation/extrapolation of the I-V curve points near the short-circuit current, Isc. By treating such fits as statistical linear regressions, uncertainties in the performance parameters are readily quantified. However, the legitimacy of such a computed uncertainty requires that the model be a valid (local) representation of the I-V curve and that the noise be sufficiently well characterized. Using more data points often has the advantage of lowering the uncertainty; however, more data points can make the uncertainty in the fit arbitrarily small, and this fit uncertainty then misses the dominant residual uncertainty due to so-called model discrepancy. Using objective Bayesian linear regression for straight-line fits for Isc, we investigate an evidence-based method to automatically choose data windows of I-V points with reduced model discrepancy. We also investigate noise effects. Uncertainties, aligned with the Guide to the Expression of Uncertainty in Measurement (GUM), are quantified throughout.
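
    A minimal sketch of the localized straight-line fit treated as a statistical regression, so that Isc (the intercept at V = 0) carries a standard error; the paper's objective Bayesian treatment is more involved, and the I-V points below are synthetic. As the abstract cautions, this fit-only uncertainty can understate the total uncertainty when model discrepancy dominates.

    ```python
    import numpy as np

    def isc_from_linear_fit(v, i):
        """OLS straight-line fit I = a + b*V near short circuit;
        Isc is the intercept a, with its regression standard error."""
        X = np.column_stack([np.ones_like(v), v])
        coef, *_ = np.linalg.lstsq(X, i, rcond=None)
        resid = i - X @ coef
        s2 = resid @ resid / (len(v) - 2)     # residual variance
        cov = s2 * np.linalg.inv(X.T @ X)     # parameter covariance matrix
        return coef[0], np.sqrt(cov[0, 0])

    # Synthetic I-V points near V = 0 (illustrative only)
    v = np.linspace(0.0, 0.05, 8)
    i = 9.20 - 0.5 * v + np.random.default_rng(0).normal(0.0, 2e-3, v.size)
    isc, u_isc = isc_from_linear_fit(v, i)
    print(f"Isc = {isc:.4f} A +/- {u_isc:.4f} A")
    ```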

  3. Uncertainty and sensitivity analysis in the 2008 performance assessment for the proposed repository for high-level radioactive waste at Yucca Mountain, Nevada.

    SciTech Connect (OSTI)

    Helton, Jon Craig; Sallaberry, Cedric M.; Hansen, Clifford W.

    2010-05-01

    Extensive work has been carried out by the U.S. Department of Energy (DOE) in the development of a proposed geologic repository at Yucca Mountain (YM), Nevada, for the disposal of high-level radioactive waste. As part of this development, an extensive performance assessment (PA) for the YM repository was completed in 2008 [1] and supported a license application by the DOE to the U.S. Nuclear Regulatory Commission (NRC) for the construction of the YM repository [2]. This presentation provides an overview of the conceptual and computational structure of the indicated PA (hereafter referred to as the 2008 YM PA) and the roles that uncertainty analysis and sensitivity analysis play in this structure.

  4. Avoiding climate change uncertainties in Strategic Environmental Assessment

    SciTech Connect (OSTI)

    Larsen, Sanne Vammen; Kørnøv, Lone; Driscoll, Patrick

    2013-11-15

    This article is concerned with how Strategic Environmental Assessment (SEA) practice handles climate change uncertainties within the Danish planning system. First, a hypothetical model is set up for how uncertainty is handled, or not handled, in decision-making. The model incorporates the strategies of reduction and resilience, as well as denying, ignoring and postponing. Second, 151 Danish SEAs are analysed with a focus on the extent to which climate change uncertainties are acknowledged and presented, and the empirical findings are discussed in relation to the model. The findings indicate that, despite incentives to do so, climate change uncertainties were systematically avoided or downplayed in all but 5 of the 151 SEAs that were reviewed. Finally, two possible explanatory mechanisms are proposed: conflict avoidance and a need to quantify uncertainty.

  5. Lidar arc scan uncertainty reduction through scanning geometry optimization

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Wang, Hui; Barthelmie, Rebecca J.; Pryor, Sara C.; Brown, Gareth.

    2016-04-13

    Doppler lidars are frequently operated in a mode referred to as arc scans, wherein the lidar beam scans across a sector with a fixed elevation angle and the resulting measurements are used to derive an estimate of the n-minute horizontal mean wind velocity (speed and direction). Previous studies have shown that the uncertainty in the measured wind speed originates from turbulent wind fluctuations and depends on the scan geometry (the arc span and the arc orientation). This paper is designed to provide guidance on optimal scan geometries for two key applications in the wind energy industry: wind turbine power performance analysis and annual energy production prediction. We present a quantitative analysis of the retrieved wind speed uncertainty derived using a theoretical model, with the assumption of isotropic and frozen turbulence, and observations from three sites that are onshore with flat terrain, onshore with complex terrain, and offshore, respectively. The results from both the theoretical model and the observations show that the uncertainty scales with the turbulence intensity, such that the relative standard error on the 10-minute mean wind speed is about 30% of the turbulence intensity. The uncertainty in both retrieved wind speeds and derived wind energy production estimates can be reduced by aligning lidar beams with the dominant wind direction, increasing the arc span, and lowering the number of beams per arc scan. As a result, large arc spans should be used at sites with high turbulence intensity and/or large wind direction variation.

  7. Error and uncertainty in Raman thermal conductivity measurements

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Beechem, Thomas Edwin; Yates, Luke; Graham, Samuel

    2015-04-22

    We investigated error and uncertainty in Raman thermal conductivity measurements via finite-element-based numerical simulation of two geometries often employed -- Joule-heating of a wire and laser-heating of a suspended wafer. Using this methodology, the accuracy and precision of the Raman-derived thermal conductivity are shown to depend on (1) assumptions within the analytical model used in the deduction of thermal conductivity, (2) uncertainty in the quantification of heat flux and temperature, and (3) the evolution of thermomechanical stress during testing. Apart from the influence of stress, errors of 5% coupled with uncertainties of ±15% are achievable for most materials under conditions typical of Raman thermometry experiments. Error can increase to >20%, however, for materials having highly temperature-dependent thermal conductivities or, in some materials, when thermomechanical stress develops concurrent with the heating. A dimensionless parameter -- termed the Raman stress factor -- is derived to identify when stress effects will induce large levels of error. Together, the results compare the utility of Raman-based conductivity measurements relative to more established techniques while at the same time identifying situations where its use is most efficacious.

  9. Nuclear Data Uncertainty Propagation in Depletion Calculations Using Cross Section Uncertainties in One-group or Multi-group

    SciTech Connect (OSTI)

    Díez, C.J.; Cabellos, O.; Martínez, J.S.

    2015-01-15

    Several approaches have been developed in recent decades to tackle nuclear data uncertainty propagation in burn-up calculations. One proposed approach is the Hybrid Method, in which uncertainties in nuclear data are propagated only through the depletion part of a burn-up problem. Because only depletion is addressed, only one-group cross sections are necessary, and hence only their collapsed one-group uncertainties. This approach has been applied successfully to several advanced reactor systems, such as EFIT (an ADS-like reactor) and ESFR (a sodium fast reactor), to assess uncertainties in the isotopic composition. However, a comparison with multi-group energy structures was not carried out, and it must be performed in order to analyse the limitations of using one-group uncertainties.
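
    A minimal sketch in the spirit of the Hybrid Method: sample a collapsed one-group cross section from an assumed normal distribution and push each sample through a single-nuclide depletion equation dN/dt = -sigma*phi*N. All values are illustrative, not taken from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    sigma_mean, sigma_rel_u = 2.7, 0.05   # one-group capture XS (barns), 5% uncertainty (assumed)
    phi = 3.0e14                          # neutron flux, n/cm^2/s (assumed constant)
    t = 365.0 * 24 * 3600                 # one-year burn, s
    BARN = 1e-24                          # cm^2

    # Sample the one-group cross section, then solve dN/dt = -sigma*phi*N
    sigma = rng.normal(sigma_mean, sigma_rel_u * sigma_mean, 10_000) * BARN
    n_ratio = np.exp(-sigma * phi * t)    # N(t)/N(0) for each sample

    print(f"N(t)/N0 = {n_ratio.mean():.4f} +/- {n_ratio.std():.4f}")
    ```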

  10. Measurement uncertainty analysis techniques applied to PV performance measurements

    SciTech Connect (OSTI)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value, or an experiment's final result, within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.

  11. Calibration and Measurement Uncertainty Estimation of Radiometric Data: Preprint

    SciTech Connect (OSTI)

    Habte, A.; Sengupta, M.; Reda, I.; Andreas, A.; Konings, J.

    2014-11-01

    Evaluating the performance of photovoltaic cells, modules, and arrays that form large solar deployments relies on accurate measurements of the available solar resource. Therefore, determining the accuracy of these solar radiation measurements provides a better understanding of investment risks. This paper provides guidelines and recommended procedures for estimating the uncertainty in calibrations and measurements by radiometers, using methods that follow the International Bureau of Weights and Measures Guide to the Expression of Uncertainty in Measurement (GUM). Standardized analysis based on these procedures ensures that the uncertainty quoted is well documented.
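
    A sketch of the GUM-style bookkeeping such procedures standardize: a Type A component from repeated readings, a Type B component from an assumed rectangular calibration limit, combined in quadrature and expanded with coverage factor k = 2. The numbers are illustrative, not NREL's.

    ```python
    import numpy as np

    # Type A: standard deviation of the mean from repeated readings (W/m^2)
    readings = np.array([1000.2, 999.8, 1000.5, 1000.1, 999.9])
    u_a = readings.std(ddof=1) / np.sqrt(readings.size)

    # Type B: an assumed +/-0.5% calibration limit, treated as rectangular
    u_b = 0.005 * readings.mean() / np.sqrt(3)

    u_c = np.hypot(u_a, u_b)   # combined standard uncertainty
    U = 2.0 * u_c              # expanded uncertainty, k = 2 (~95% coverage)
    print(f"{readings.mean():.1f} +/- {U:.1f} W/m^2 (k=2)")
    ```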

  12. Preliminary Results on Uncertainty Quantification for Pattern Analytics

    SciTech Connect (OSTI)

    Stracuzzi, David John; Brost, Randolph; Chen, Maximillian Gene; Malinas, Rebecca; Peterson, Matthew Gregor; Phillips, Cynthia A.; Robinson, David G.; Woodbridge, Diane

    2015-09-01

    This report summarizes preliminary research into uncertainty quantification for pattern analytics within the context of the Pattern Analytics to Support High-Performance Exploitation and Reasoning (PANTHER) project. The primary focus of PANTHER was to make large quantities of remote sensing data searchable by analysts. The work described in this report adds nuance to both the initial data preparation steps and the search process. Search queries are transformed from "does the specified pattern exist in the data?" to "how certain is the system that the returned results match the query?" We show example results for both data processing and search, and discuss a number of possible improvements for each.

  13. PDF uncertainties at large x and gauge boson production

    SciTech Connect (OSTI)

    Accardi, Alberto

    2012-10-01

    I discuss how global QCD fits of parton distribution functions can make the somewhat separated fields of high-energy particle physics and lower energy hadronic and nuclear physics interact to the benefit of both. In particular, I will argue that large rapidity gauge boson production at the Tevatron and the LHC has the highest short-term potential to constrain the theoretical nuclear corrections to DIS data on deuteron targets necessary for up/down flavor separation. This in turn can considerably reduce the PDF uncertainty on cross section calculations of heavy mass particles such as W' and Z' bosons.

  14. Risk Analysis and Decision-Making Under Uncertainty: A Strategy...

    Office of Environmental Management (EM)

  15. Integration of Uncertainty Information into Power System Operations

    SciTech Connect (OSTI)

    Makarov, Yuri V.; Lu, Shuai; Samaan, Nader A.; Huang, Zhenyu; Subbarao, Krishnappa; Etingov, Pavel V.; Ma, Jian; Hafen, Ryan P.; Diao, Ruisheng; Lu, Ning

    2011-10-10

    Contemporary power systems face uncertainties coming from multiple sources, including forecast errors of load, wind, and solar generation, uninstructed deviation and forced outage of traditional generators, loss of transmission lines, and others. With increasing amounts of wind and solar generation being integrated into the system, these uncertainties have been growing significantly. It is critically important to build knowledge of the major sources of uncertainty, learn how to simulate them, and then incorporate this information into decision-making processes and power system operations, for better reliability and efficiency. This paper gives a comprehensive view of the sources of uncertainty in power systems, their important characteristics, available models, and ways of integrating them into system operations. It is primarily based on previous work conducted at the Pacific Northwest National Laboratory (PNNL).

  16. Probabilistic Accident Consequence Uncertainty - A Joint CEC/USNRC Study

    SciTech Connect (OSTI)

    Gregory, Julie J.; Harper, Frederick T.

    1999-07-28

    The joint USNRC/CEC consequence uncertainty study was chartered after the development of two new probabilistic accident consequence codes, MACCS in the U.S. and COSYMA in Europe. Both the USNRC and CEC had a vested interest in expanding the knowledge base of the uncertainty associated with consequence modeling, and teamed up to co-sponsor a consequence uncertainty study. The information acquired from the study was expected to provide understanding of the strengths and weaknesses of current models as well as a basis for direction of future research. This paper looks at the elicitation process implemented in the joint study and discusses some of the uncertainty distributions provided by eight panels of experts from the U.S. and Europe that were convened to provide responses to the elicitation. The phenomenological areas addressed by the expert panels include atmospheric dispersion and deposition, deposited material and external doses, food chain, early health effects, late health effects and internal dosimetry.
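
    The aggregation step of such an elicitation can be sketched with simple equal-weight pooling of expert quantiles (the joint study used its own formal aggregation scheme, so this is only a schematic): each hypothetical expert's 50th and 95th percentiles are matched to a lognormal and the samples are pooled.

    ```python
    import numpy as np

    # Hypothetical 5%/50%/95% quantiles from three experts for one parameter
    experts = {"A": (0.10, 0.50, 1.8), "B": (0.05, 0.40, 2.5), "C": (0.20, 0.70, 1.5)}

    rng = np.random.default_rng(2)
    samples = []
    for q05, q50, q95 in experts.values():
        mu = np.log(q50)                      # lognormal matched to the median...
        sigma = (np.log(q95) - mu) / 1.645    # ...and the 95th percentile (q05 unused here)
        samples.append(rng.lognormal(mu, sigma, 20_000))

    pooled = np.concatenate(samples)          # equal-weight mixture of the experts
    print(np.percentile(pooled, [5, 50, 95]))
    ```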

  17. Dasymetric Modeling and Uncertainty (Journal Article) | SciTech...

    Office of Scientific and Technical Information (OSTI)

    Journal Article: Dasymetric Modeling and Uncertainty. DOE Contract Number: DE-AC05-00OR22725.

  18. Problem Solving Environment for Uncertainty Analysis and Design Exploration

    Energy Science and Technology Software Center (OSTI)

    2011-10-26

    PSUADE is a software system used to study the relationships between the inputs and outputs of general simulation models for the purpose of performing uncertainty and sensitivity analyses.

  19. Uncertainty analysis of multi-rate kinetics of uranium desorption...

    Office of Scientific and Technical Information (OSTI)

    in the multi-rate model to simulate U(VI) desorption; 3) however, long-term prediction and its uncertainty may be significantly biased by the lognormal assumption for the ...

  20. PROJECT PROFILE: Reducing PV Performance Uncertainty by Accurately Quantifying the "PV Resource"

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Quantifying the "PV Resource" | Department of Energy Reducing PV Performance Uncertainty by Accurately Quantifying the "PV Resource" PROJECT PROFILE: Reducing PV Performance Uncertainty by Accurately Quantifying the "PV Resource" Funding Opportunity: SuNLaMP SunShot Subprogram: Photovoltaics Location: National Renewable Energy Laboratory, Golden, CO Amount Awarded: $2,500,000 The procedures used today for prediction of the solar resource available to

  1. Direct Aerosol Forcing: Sensitivity to Uncertainty in Measurements of Aerosol Optical and Situational Properties

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    McComiskey, Allison (CIRES/NOAA); Schwartz, Stephen (Brookhaven National Laboratory); Ricchiazzi, Paul (University of California, Santa Barbara); Lewis, Ernie (Brookhaven National Laboratory); Michalsky, Joseph (DOC/NOAA/OAR/ESRL/GMD); Ogren, John (NOAA/CMDL). Category: Radiation. Understanding sources of uncertainty in estimating aerosol direct radiative

  2. Practical uncertainty reduction and quantification in shock physics measurements

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Akin, M. C.; Nguyen, J. H.

    2015-04-20

    We report the development of a simple error analysis sampling method for identifying intersections and inflection points to reduce total uncertainty in experimental data. This technique was used to reduce uncertainties in sound speed measurements by 80% over conventional methods. Here, we focused on its impact on a previously published set of Mo sound speed data and possible implications for phase transition and geophysical studies. However, this technique's application can be extended to a wide range of experimental data.
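
    The abstract does not spell the method out, but the idea of resampling within measurement uncertainty to locate the intersection of two fitted segments can be sketched as follows, with synthetic sound-speed data and an assumed 1-sigma noise level.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def intersection(x1, y1, x2, y2):
        """Intersection of straight-line fits to two data segments."""
        a1, b1 = np.polyfit(x1, y1, 1)
        a2, b2 = np.polyfit(x2, y2, 1)
        return (b2 - b1) / (a1 - a2)

    # Hypothetical sound speed (km/s) vs pressure (GPa) in two phases
    x1, x2 = np.linspace(10, 40, 8), np.linspace(45, 80, 8)
    y1, y2 = 4.0 + 0.020 * x1, 4.5 + 0.008 * x2
    sig = 0.01   # assumed 1-sigma measurement noise, km/s

    # Resample within the noise and collect the intersection locations
    xs = [intersection(x1, y1 + rng.normal(0, sig, 8), x2, y2 + rng.normal(0, sig, 8))
          for _ in range(5000)]
    print(f"transition at {np.mean(xs):.1f} +/- {np.std(xs):.1f} GPa")
    ```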

  3. Gas Exploration Software for Reducing Uncertainty in Gas Concentration Estimates

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Lawrence Berkeley National Laboratory. Estimating reservoir parameters for gas exploration from geophysical data is subject to a large degree of uncertainty. Seismic imaging techniques, such as seismic amplitude versus angle (AVA) analysis, can

  4. Improvements of Nuclear Data and Its Uncertainties by Theoretical Modeling

    Office of Scientific and Technical Information (OSTI)

    Technical Report. Authors: Talou, Patrick [1]; Nazarewicz, Witold [2]; Prinja, Anil [3]; Danon, Yaron [4]. Affiliations: [1] Los Alamos National Laboratory; [2] University of Tennessee, Knoxville, TN 37996, USA; [3] University of New Mexico, USA; [4] Rensselaer

  5. Analysis of the Uncertainty in Wind Measurements from the Atmospheric Radiation Measurement Doppler Lidar during XPIA: Field Campaign Report

    Office of Scientific and Technical Information (OSTI)

    Program Document. In March and April of 2015, the ARM Doppler

  6. Determining Best Estimates and Uncertainties in Cloud Microphysical Parameters from ARM Field Data: Implications for Models, Retrieval Schemes and Aerosol-Cloud-Radiation Interactions

    Office of Scientific and Technical Information (OSTI)

    Technical Report.

  7. Quantifying Uncertainty in Computer Predictions | netl.doe.gov

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    The U.S. Department of Energy has great interest in technologies that will lead to reducing the CO2 emissions of fossil-fuel-burning power plants. Advanced energy technologies such as Integrated Gasification Combined Cycle (IGCC) and Carbon Capture and Storage (CCS) can potentially lead to the clean and efficient use of fossil fuels to power our nation. The development of new energy

  8. Microsoft Word - Documentation - Price Forecast Uncertainty.doc

    U.S. Energy Information Administration (EIA) Indexed Site

    Short-Term Energy Outlook Supplement: Energy Price Volatility and Forecast Uncertainty (October 2009). Summary: It is often noted that energy prices are quite volatile, reflecting market participants' adjustments to new information from physical energy markets and/or markets in energy-related financial derivatives. Price volatility is an indication of the level of uncertainty, or risk, in the market. This paper describes how markets price risk and how the market-clearing process
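
    A minimal sketch of the volatility measure such discussions start from, the annualized standard deviation of daily log price returns; the prices below are illustrative.

    ```python
    import numpy as np

    def annualized_volatility(prices, periods_per_year=252):
        """Historical volatility: std of daily log returns, annualized."""
        log_ret = np.diff(np.log(np.asarray(prices, float)))
        return log_ret.std(ddof=1) * np.sqrt(periods_per_year)

    prices = [70.1, 71.3, 69.8, 72.0, 73.4, 71.9, 74.2]   # illustrative daily prices, $/bbl
    print(f"annualized volatility ~ {annualized_volatility(prices):.0%}")
    ```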

  9. Uncertainties in the Anti-neutrino Production at Nuclear Reactors

    SciTech Connect (OSTI)

    Djurcic, Zelimir; Detwiler, Jason A.; Piepke, Andreas; Foster Jr., Vince R.; Miller, Lester; Gratta, Giorgio

    2008-08-06

    Anti-neutrino emission rates from nuclear reactors are determined from thermal power measurements and fission rate calculations. The uncertainties in these quantities for commercial power plants, and their impact on the calculated interaction rates in ν̄e detectors, are examined. We discuss reactor-to-reactor correlations between the leading uncertainties and their relevance to reactor ν̄e experiments.
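
    At first order, the determination described here can be sketched as follows: the fission rate is the thermal power divided by the mean energy release per fission, and the relative uncertainties of a product combine in quadrature. The ~2% power uncertainty and the per-fission values below are assumed round numbers, not the paper's.

    ```python
    import numpy as np

    MEV = 1.602176634e-13   # J per MeV

    p_th, u_p = 3.0e9, 0.02 * 3.0e9       # thermal power (W) and ~2% uncertainty (assumed)
    e_fis, u_e = 200.0 * MEV, 1.0 * MEV   # mean energy release per fission (assumed)
    nu_fis, u_nu = 6.0, 0.2               # anti-neutrinos per fission (assumed)

    rate = p_th / e_fis * nu_fis          # emitted anti-neutrinos per second, ~5.6e20

    # Relative uncertainties of a product/quotient add in quadrature (first order)
    rel_u = np.sqrt((u_p / p_th) ** 2 + (u_e / e_fis) ** 2 + (u_nu / nu_fis) ** 2)
    print(f"{rate:.2e} /s +/- {100 * rel_u:.1f}%")
    ```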

  10. Lab RFP: Validation and Uncertainty Characterization | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    LBNL's FLEXLAB test facility includes four test cells, each split into two half-cells to enable side-by-side comparative experiments. The cells have one active, reconfigurable facade and individual, reconfigurable single-zone HVAC systems. The cell facing the camera sits on a 270-degree turntable. Photo credit: LBNL. Bottom: ORNL's two-story flexible research platform test building. The building

  11. How incorporating more data reduces uncertainty in recovery predictions

    SciTech Connect (OSTI)

    Campozana, F.P.; Lake, L.W.; Sepehrnoori, K.

    1997-08-01

    From the discovery to the abandonment of a petroleum reservoir, there are many decisions that involve economic risks because of uncertainty in the production forecast. This uncertainty may be quantified by performing stochastic reservoir modeling (SRM); however, it is not practical to apply SRM every time the model is updated to account for new data. This paper suggests a novel procedure to estimate reservoir uncertainty (and its reduction) as a function of the amount and type of data used in the reservoir modeling. Two types of data are analyzed: conditioning data and well-test data. However, the same procedure can be applied to any other data type. Three performance parameters are suggested to quantify uncertainty. SRM is performed for the following typical stages: discovery, primary production, secondary production, and infill drilling. From those results, a set of curves is generated that can be used to estimate (1) the uncertainty for any other situation and (2) the uncertainty reduction caused by the introduction of new wells (with and without well-test data) into the description.

  12. Fuzzy-probabilistic calculations of water-balance uncertainty

    SciTech Connect (OSTI)

    Faybishenko, B.

    2009-10-01

    Hydrogeological systems are often characterized by imprecise, vague, inconsistent, incomplete, or subjective information, which may limit the application of conventional stochastic methods in predicting hydrogeologic conditions and associated uncertainty. Instead, predictions and uncertainty analysis can be made using uncertain input parameters expressed as probability boxes, intervals, and fuzzy numbers. The objective of this paper is to present the theory for, and a case study as an application of, the fuzzy-probabilistic approach, combining probability and possibility theory for simulating soil water balance and assessing associated uncertainty in the components of a simple water-balance equation. The application of this approach is demonstrated using calculations with the RAMAS Risk Calc code, to assess the propagation of uncertainty in calculating potential evapotranspiration, actual evapotranspiration, and infiltration, in a case study at the Hanford site, Washington, USA. Propagation of uncertainty into the results of water-balance calculations was evaluated by changing the types of models of uncertainty incorporated into various input parameters. The results of these fuzzy-probabilistic calculations are compared to the conventional Monte Carlo simulation approach and to estimates from field observations at the Hanford site.
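
    A toy version of the fuzzy-probabilistic idea (not the RAMAS Risk Calc implementation): precipitation is stochastic, evapotranspiration is a triangular fuzzy number, and the simple balance I = P - ET is propagated by pairing each random sample with an alpha-cut interval.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Precipitation as a stochastic (normal) variable, mm/yr (illustrative)
    p_samples = rng.normal(200.0, 20.0, 10_000)

    # Evapotranspiration as a triangular fuzzy number (low, mode, high), mm/yr
    et_low, et_mode, et_high = 120.0, 160.0, 190.0

    def et_alpha_cut(alpha):
        """Interval of the triangular fuzzy ET at membership level alpha."""
        return (et_low + alpha * (et_mode - et_low),
                et_high - alpha * (et_high - et_mode))

    # Hybrid propagation of infiltration I = P - ET at alpha = 0.5
    lo, hi = et_alpha_cut(0.5)
    i_low, i_high = p_samples - hi, p_samples - lo   # interval bounds per sample
    print(np.percentile(i_low, 5), np.percentile(i_high, 95))
    ```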

  13. Uncertainty Estimation Improves Energy Measurement and Verification Procedures

    SciTech Connect (OSTI)

    Walter, Travis; Price, Phillip N.; Sohn, Michael D.

    2014-05-14

    Implementing energy conservation measures in buildings can reduce energy costs and environmental impacts, but such measures cost money to implement, so intelligent investment strategies require the ability to quantify the energy savings by comparing actual energy used to how much energy would have been used in the absence of the conservation measures (known as the baseline energy use). Methods exist for predicting baseline energy use, but a limitation of most statistical methods reported in the literature is inadequate quantification of the uncertainty in baseline energy use predictions. However, estimation of uncertainty is essential for weighing the risks of investing in retrofits. Most commercial buildings have, or soon will have, electricity meters capable of providing data at short time intervals. These data provide new opportunities to quantify uncertainty in baseline predictions, and to do so after shorter measurement durations than are traditionally used. In this paper, we show that uncertainty estimation provides greater measurement and verification (M&V) information and helps to overcome some of the difficulties with deciding how much data are needed to develop baseline models and to confirm energy savings. We also show that cross-validation is an effective method for computing uncertainty. In so doing, we extend a simple regression-based method of predicting energy use using short-interval meter data. We demonstrate the methods by predicting energy use in 17 real commercial buildings. We discuss the benefits of uncertainty estimates, which can provide actionable decision-making information for investing in energy conservation measures.
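
    A sketch of the cross-validation idea for baseline uncertainty: fit a simple regression baseline on each training fold and use the out-of-fold residuals to estimate prediction uncertainty. The temperature-vs-energy model and the data are hypothetical, not the paper's regression.

    ```python
    import numpy as np

    def cv_prediction_uncertainty(x, y, n_folds=5, seed=5):
        """K-fold cross-validation of a linear baseline model:
        out-of-fold residuals estimate the prediction uncertainty."""
        idx = np.random.default_rng(seed).permutation(len(x))
        resid = []
        for fold in np.array_split(idx, n_folds):
            train = np.setdiff1d(idx, fold)
            coef = np.polyfit(x[train], y[train], 1)   # e.g. energy vs outdoor temperature
            resid.extend(y[fold] - np.polyval(coef, x[fold]))
        return np.std(resid, ddof=1)

    # Hypothetical daily data: outdoor temperature (C) vs building energy (kWh)
    rng = np.random.default_rng(6)
    temp = rng.uniform(0.0, 30.0, 120)
    energy = 500.0 + 12.0 * temp + rng.normal(0.0, 25.0, 120)
    print(f"baseline prediction uncertainty ~ {cv_prediction_uncertainty(temp, energy):.0f} kWh")
    ```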

  14. Thermal hydraulic limits analysis using statistical propagation of parametric uncertainties

    SciTech Connect (OSTI)

    Chiang, K. Y.; Hu, L. W.; Forget, B.

    2012-07-01

    The MIT Research Reactor (MITR) is evaluating the conversion from highly enriched uranium (HEU) to low enrichment uranium (LEU) fuel. In addition to the fuel element re-design, a reactor power upgrade from 6 MW to 7 MW is proposed in order to maintain the same reactor performance as the HEU core. The previous approach to analyzing the impact of engineering uncertainties on thermal hydraulic limits, via the use of engineering hot channel factors (EHCFs), was unable to explicitly quantify the uncertainty and confidence level in reactor parameters. The objective of this study is to develop a methodology for MITR thermal hydraulic limits analysis by statistically combining engineering uncertainties, with the aim of eliminating unnecessary conservatism inherent in traditional analyses. This method was employed to analyze the Limiting Safety System Settings (LSSS) for the MITR, defined here as the avoidance of the onset of nucleate boiling (ONB). Key parameters, such as coolant channel tolerances and heat transfer coefficients, were treated as normal distributions using Oracle Crystal Ball to calculate ONB. The LSSS power is determined with a 99.7% confidence level. The LSSS power calculated using this new methodology is 9.1 MW, based on a core outlet coolant temperature of 60 deg. C and a primary coolant flow rate of 1800 gpm, compared to 8.3 MW obtained from the analytical method using the EHCFs with the same operating conditions. The same methodology was also used to calculate the safety limit (SL) for the MITR, conservatively determined using the onset of flow instability (OFI) as the criterion, to verify that an adequate safety margin exists between LSSS and SL. The calculated SL is 10.6 MW, which is 1.5 MW higher than the LSSS. (authors)
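
    The statistical combination described can be sketched as a Monte Carlo percentile calculation: sample the engineering parameters from normal distributions and take the 0.3rd percentile of the resulting ONB power as the 99.7%-confidence LSSS. The stand-in ONB model and the numbers below are invented for illustration; the actual analysis used the MITR thermal hydraulic models.

    ```python
    import numpy as np

    rng = np.random.default_rng(8)
    n = 100_000

    # Hypothetical engineering parameters as normal distributions
    h = rng.normal(3.0e4, 1.5e3, n)        # heat transfer coefficient, W/m^2-K
    gap = rng.normal(2.0e-3, 5.0e-5, n)    # coolant channel gap, m
    margin = rng.normal(55.0, 2.0, n)      # wall superheat margin to ONB, K

    # Toy stand-in for the real ONB power model (MW)
    p_onb = 5.0e-3 * h * gap * margin

    lsss = np.percentile(p_onb, 0.3)       # power avoiding ONB with 99.7% confidence
    print(f"LSSS ~ {lsss:.1f} MW")
    ```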

  15. MONTE-CARLO BURNUP CALCULATION UNCERTAINTY QUANTIFICATION AND PROPAGATION DETERMINATION

    SciTech Connect (OSTI)

    Nichols, T.; Sternat, M.; Charlton, W.

    2011-05-08

    MONTEBURNS is a Monte-Carlo depletion routine utilizing MCNP and ORIGEN 2.2. Uncertainties exist in the MCNP transport calculation, but this information is not passed to the depletion calculation in ORIGEN or saved. To quantify this transport uncertainty and determine how it propagates between burnup steps, a statistical analysis of multiple repeated depletion runs was performed. The reactor model chosen is the Oak Ridge Research Reactor (ORR) in a single-assembly, infinite-lattice configuration. This model was burned for a 25.5-day cycle broken down into three steps. The output isotopics as well as the effective multiplication factor (k-effective) were tabulated, and histograms were created at each burnup step using the Scott method to determine the bin width. It was expected that the gram-quantity and k-effective histograms would be normally distributed, since they were produced from a Monte-Carlo routine, but some of the results are not. The standard deviation at each burnup step was consistent between fission product isotopes, as expected, while the uranium isotopes produced some unique results. The variation in the quantity of uranium was small enough that, from the reaction rate MCNP tally, round-off error occurred, producing a set of repeated results with slight variation. Statistical analyses were performed using the χ² test against a normal distribution for several isotopes and for the k-effective results. While the isotopes failed to reject the null hypothesis of being normally distributed, the χ² statistic grew through the steps in the k-effective test, and the null hypothesis was rejected in the later steps. These results suggest that, for a high-accuracy solution, MCNP cell material quantities of less than 100 grams and larger kcode parameters are needed to minimize uncertainty propagation and minimize round-off effects.

  16. LCOE Uncertainty Analysis for Hydropower using Monte Carlo Simulations

    SciTech Connect (OSTI)

    Chalise, Dol Raj; O'Connor, Patrick W; DeNeale, Scott T; Uria Martinez, Rocio; Kao, Shih-Chieh

    2015-01-01

    Levelized Cost of Energy (LCOE) is an important metric for evaluating the cost and performance of electricity generation alternatives and, combined with other measures, can be used to assess the economics of future hydropower development. Multiple assumptions on input parameters are required to calculate the LCOE, each of which contains some level of uncertainty, in turn affecting the accuracy of LCOE results. This paper explores these uncertainties, their sources, and ultimately the level of variability they introduce at the screening level of project evaluation for non-powered dams (NPDs) across the U.S. Owing to site-specific differences in design, the LCOE for hydropower varies significantly from project to project, unlike technologies with more standardized configurations such as wind and gas. Therefore, to assess the impact of LCOE input uncertainty on the economics of U.S. hydropower resources, these uncertainties must be modeled across the population of potential opportunities. To demonstrate the impact of uncertainty, resource data from a recent nationwide non-powered dam resource assessment (Hadjerioua et al., 2012) and screening-level predictive cost equations (O'Connor et al., 2015) are used to quantify and evaluate uncertainties in project capital and operations and maintenance costs, and in generation potential, at broad scale. LCOE dependence on financial assumptions is also evaluated on a sensitivity basis to explore ownership/investment implications for project economics across the U.S. hydropower fleet. The results indicate that the LCOE for U.S. NPDs varies substantially. The LCOE estimates for potential NPD projects of capacity greater than 1 MW range from 40 to 182 $/MWh, with an average of 106 $/MWh; 4,000 MW could be developed through projects with individual LCOE values below 100 $/MWh. The results also indicate that typically 90% of LCOE uncertainty can be attributed to uncertainties in capital costs and energy production; however
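
    A compact sketch of a screening-level LCOE Monte Carlo of this kind, with the capital charge annualized via the capital recovery factor; the cost and energy distributions are assumed for illustration, not taken from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(9)
    n = 50_000

    def crf(r, years):
        """Capital recovery factor for discount rate r over the project life."""
        return r * (1 + r) ** years / ((1 + r) ** years - 1)

    # Assumed distributions for a hypothetical ~10 MW NPD project
    capex = rng.lognormal(np.log(40e6), 0.25, n)   # capital cost, $
    om = rng.normal(1.2e6, 0.2e6, n)               # O&M, $/yr
    aep = rng.normal(45_000.0, 5_000.0, n)         # annual energy production, MWh/yr
    r, life = 0.05, 30                             # fixed financing assumptions

    lcoe = (crf(r, life) * capex + om) / aep       # $/MWh
    print(np.percentile(lcoe, [5, 50, 95]))
    ```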

  17. Uncertainty and sensitivity analysis for photovoltaic system modeling.

    SciTech Connect (OSTI)

    Hansen, Clifford W.; Pohl, Andrew Phillip; Jordan, Dirk

    2013-12-01

    We report an uncertainty and sensitivity analysis for modeling DC energy from photovoltaic systems. We consider two systems, each comprising a single module using either crystalline silicon or CdTe cells, located either at Albuquerque, NM, or Golden, CO. Output from a PV system is predicted by a sequence of models. Uncertainty in the output of each model is quantified by empirical distributions of each model's residuals. We sample these distributions to propagate uncertainty through the sequence of models to obtain an empirical distribution of each PV system's output. We considered models that: (1) translate measured global horizontal, direct, and global diffuse irradiance to plane-of-array (POA) irradiance; (2) estimate effective irradiance from plane-of-array irradiance; (3) predict cell temperature; and (4) estimate DC voltage, current, and power. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. Four alternative models were considered for the POA irradiance modeling step; we did not find the choice among these models to be of great significance. However, we observed that the POA irradiance model introduced a bias of up to 5% of daily energy, which translates directly into a systematic difference in predicted energy. Sensitivity analyses relate uncertainty in the PV system output to uncertainty arising from each model. We found the residuals arising from the POA irradiance and effective irradiance models to be the dominant contributors to the residuals for daily energy, for either technology and location considered. This analysis indicates that efforts to reduce the uncertainty in PV system output should focus on improvements to the POA and effective irradiance models.

  18. A Method to Estimate Uncertainty in Radiometric Measurement Using the Guide to the Expression of Uncertainty in Measurement (GUM) Method; NREL (National Renewable Energy Laboratory)

    SciTech Connect (OSTI)

    Habte, A.; Sengupta, M.; Reda, I.

    2015-03-01

    Radiometric data with known and traceable uncertainty are essential for climate change studies to better understand cloud-radiation interactions and the earth's radiation budget. Further, adopting a known and traceable method of estimating uncertainty with respect to SI ensures that the uncertainty quoted for radiometric measurements can be compared based on documented methods of derivation. Therefore, statements about the overall measurement uncertainty can only be made on an individual basis, taking all relevant factors into account. This poster provides guidelines and recommended procedures for estimating the uncertainty in calibrations and measurements from radiometers. The approach follows the Guide to the Expression of Uncertainty in Measurement (GUM).

  19. Investment and Upgrade in Distributed Generation under Uncertainty

    SciTech Connect (OSTI)

    Siddiqui, Afzal; Maribu, Karl

    2008-08-18

    The ongoing deregulation of electricity industries worldwide is providing incentives for microgrids to use small-scale distributed generation (DG) and combined heat and power (CHP) applications via heat exchangers (HXs) to meet local energy loads. Although the electric-only efficiency of DG is lower than that of central-station production, relatively high tariff rates and the potential for CHP applications increase the attraction of on-site generation. Nevertheless, a microgrid contemplating the installation of gas-fired DG has to be aware of the uncertainty in the natural gas price. Treatment of uncertainty via real options increases the value of the investment opportunity, which then delays the adoption decision because the opportunity cost of exercising the investment option increases as well. In this paper, we take the perspective of a microgrid that can proceed in a sequential manner with DG capacity and HX investment in order to reduce its exposure to risk from natural gas price volatility. In particular, with the availability of the HX, the microgrid faces a tradeoff between reducing its exposure to the natural gas price and maximising its cost savings. By varying the volatility parameter, we find that the microgrid prefers a direct investment strategy for low levels of volatility and a sequential one for higher levels of volatility.
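
    A sketch of the standard real-options trigger (the McDonald-Siegel form) that produces the qualitative result described: with one geometric-Brownian-motion state variable, higher volatility raises the investment threshold. This is the textbook formula, not necessarily the paper's exact model.

    ```python
    import numpy as np

    def investment_threshold(r, delta, sigma, cost):
        """McDonald-Siegel threshold: invest when the project value V
        exceeds beta/(beta - 1) * cost, for GBM value with volatility sigma."""
        a = 0.5 - (r - delta) / sigma**2
        beta = a + np.sqrt(a**2 + 2.0 * r / sigma**2)
        return beta / (beta - 1.0) * cost

    # Threshold rises with volatility (r = 5%, delta = 3%, unit investment cost)
    for sigma in (0.1, 0.2, 0.4):
        print(sigma, round(investment_threshold(0.05, 0.03, sigma, cost=1.0), 2))
    # -> roughly 2.0, 2.72, 5.0: more uncertainty, higher trigger, later investment
    ```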

  20. Systematic uncertainties from halo asphericity in dark matter searches

    SciTech Connect (OSTI)

    Bernal, Nicolás; Forero-Romero, Jaime E.; Garani, Raghuveer; Palomares-Ruiz, Sergio

    2014-09-01

    Although commonly assumed to be spherical, dark matter halos are predicted to be non-spherical by N-body simulations, and their asphericity has a potential impact on the systematic uncertainties in dark matter searches. The evaluation of these uncertainties is the main aim of this work, where we study the impact of aspherical dark matter density distributions in Milky-Way-like halos on direct and indirect searches. Using data from the large N-body cosmological simulation Bolshoi, we perform a statistical analysis and quantify the systematic uncertainties on the determination of the local dark matter density and the so-called J factors for dark matter annihilations and decays from the galactic center. We find that, due to our ignorance about the extent of the non-sphericity of the Milky Way dark matter halo, systematic uncertainties can be as large as 35%, within the 95% most probable region, for a spherically averaged value for the local density of 0.3-0.4 GeV/cm³. Similarly, systematic uncertainties on the J factors evaluated around the galactic center can be as large as 10% and 15%, within the 95% most probable region, for dark matter annihilations and decays, respectively.

  1. Fuel cycle cost uncertainty from nuclear fuel cycle comparison

    SciTech Connect (OSTI)

    Li, J.; McNelis, D.; Yim, M.S.

    2013-07-01

    This paper examined the uncertainty in fuel cycle cost (FCC) calculations by considering both model and parameter uncertainty. Four fuel cycle options were compared in the analysis: the once-through cycle (OT), the DUPIC cycle, the MOX cycle, and a closed fuel cycle with fast reactors (FR). The model uncertainty was addressed by using three different FCC modeling approaches, with and without consideration of the time value of money. The relative ratios of FCC in comparison to OT did not change much across the different modeling approaches, an observation consistent with the results of the sensitivity study for the discount rate. Two different sets of data with uncertainty ranges for the unit costs were used to address the parameter uncertainty of the FCC calculation. The sensitivity study showed that the dominant contributor to the total variance of FCC is the uranium price. In general, the FCC of OT was found to be the lowest, followed by FR, MOX, and DUPIC; depending on the uranium price, however, the FR cycle can have a lower FCC than OT. The reprocessing cost was also found to have a major impact on FCC.

  2. PIV Uncertainty Methodologies for CFD Code Validation at the MIR Facility

    SciTech Connect (OSTI)

    Piyush Sabharwall; Richard Skifton; Carl Stoots; Eung Soo Kim; Thomas Conder

    2013-12-01

    Currently, computational fluid dynamics (CFD) is widely used in the nuclear thermal hydraulics field for design and safety analyses. To validate CFD codes, high-quality multi-dimensional flow field data are essential. The Matched Index of Refraction (MIR) Flow Facility at Idaho National Laboratory has a unique capability to contribute to the development of validated CFD codes through the use of Particle Image Velocimetry (PIV). The significance of the MIR facility is that it permits non-intrusive velocity measurement techniques, such as PIV, through complex models without requiring probes and other instrumentation that disturb the flow. At the heart of any PIV calculation is the cross-correlation, which is used to estimate the displacement of particles in some small part of the image over the time span between two images; this image displacement is indicated by the location of the largest correlation peak. In the MIR facility, uncertainty quantification is a challenging task due to the use of optical measurement techniques. This study is developing a reliable method to analyze the uncertainty and sensitivity of the measured data, together with a computer code to automate that analysis. The main objective is to establish a well-founded uncertainty quantification method for the MIR Flow Facility, which involves many complicated uncertainty factors. The uncertainty sources are resolved in depth by categorizing them into uncertainties from the MIR flow loop and from the PIV system (including particle motion, image distortion, and data processing); each uncertainty source is then mathematically modeled or adequately defined. Finally, this study will provide a method and procedure to quantify the experimental uncertainty in the MIR Flow Facility with sample test results.
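
    The cross-correlation step described above can be sketched with an FFT-based correlation of two interrogation windows; the location of the largest peak gives the integer-pixel displacement (subpixel refinement and the facility-specific uncertainty terms are omitted).

    ```python
    import numpy as np

    def piv_displacement(win_a, win_b):
        """Displacement between two interrogation windows from the
        location of the circular cross-correlation peak (FFT-based)."""
        a, b = win_a - win_a.mean(), win_b - win_b.mean()
        corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
        peak = np.unravel_index(corr.argmax(), corr.shape)
        # Wrap indices to signed displacements
        dy = peak[0] - a.shape[0] if peak[0] > a.shape[0] // 2 else peak[0]
        dx = peak[1] - a.shape[1] if peak[1] > a.shape[1] // 2 else peak[1]
        return dy, dx

    # Synthetic check: shift a random particle field by (3, -2) pixels
    img = np.random.default_rng(10).random((64, 64))
    shifted = np.roll(np.roll(img, 3, axis=0), -2, axis=1)
    print(piv_displacement(img, shifted))   # -> (3, -2)
    ```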

  3. A Unified Approach for Reporting ARM Measurement Uncertainties Technical Report

    SciTech Connect (OSTI)

    Campos, E; Sisterson, DL

    2015-10-01

    The Atmospheric Radiation Measurement (ARM) Climate Research Facility is observationally based, and quantifying the uncertainty of its measurements is critically important. With over 300 widely differing instruments providing over 2,500 datastreams, concise expression of measurement uncertainty is quite challenging. The ARM Facility currently provides data and supporting metadata (information about the data or data quality) to its users through a number of sources. Because the continued success of the ARM Facility depends on the known quality of its measurements, the Facility relies on instrument mentors and the ARM Data Quality Office (DQO) to ensure, assess, and report measurement quality. Therefore, an easily accessible, well-articulated estimate of ARM measurement uncertainty is needed.

  4. Uncertainty quantification and validation of combined hydrological and macroeconomic analyses.

    SciTech Connect (OSTI)

    Hernandez, Jacquelynne; Parks, Mancel Jordan; Jennings, Barbara Joan; Kaplan, Paul Garry; Brown, Theresa Jean; Conrad, Stephen Hamilton

    2010-09-01

    Changes in climate can lead to instabilities in physical and economic systems, particularly in regions with marginal resources. Global climate models indicate increasing global mean temperatures over the decades to come and uncertainty in the local to national impacts means perceived risks will drive planning decisions. Agent-based models provide one of the few ways to evaluate the potential changes in behavior in coupled social-physical systems and to quantify and compare risks. The current generation of climate impact analyses provides estimates of the economic cost of climate change for a limited set of climate scenarios that account for a small subset of the dynamics and uncertainties. To better understand the risk to national security, the next generation of risk assessment models must represent global stresses, population vulnerability to those stresses, and the uncertainty in population responses and outcomes that could have a significant impact on U.S. national security.

  5. Hybrid processing of stochastic and subjective uncertainty data

    SciTech Connect (OSTI)

    Cooper, J.A.; Ferson, S.; Ginzburg, L.

    1995-11-01

    Uncertainty analyses typically recognize separate stochastic and subjective sources of uncertainty but do not systematically combine the two, although a large amount of data used in analyses is partly stochastic and partly subjective. We have developed a methodology for mathematically combining stochastic and subjective data uncertainty, based on new "hybrid number" approaches. The methodology can be utilized in conjunction with various traditional techniques, such as PRA (probabilistic risk assessment) and risk analysis decision support. Hybrid numbers have previously been examined as a potential method to represent combinations of stochastic and subjective information, but mathematical processing has been impeded by the requirements inherent in the structure of the numbers; e.g., there was no known way to multiply hybrids. In this paper, we demonstrate methods for calculating with hybrid numbers that avoid these difficulties. By formulating a hybrid number as a probability distribution that is only fuzzily known, or alternatively as a random distribution of fuzzy numbers, methods are demonstrated for the full suite of arithmetic operations, permitting complex mathematical calculations. It is shown how information about relative subjectivity (the ratio of subjective to stochastic knowledge about a particular datum) can be incorporated. Techniques are also developed for conveying uncertainty information visually, so that the stochastic and subjective constituents of the uncertainty, as well as the ratio of knowledge about the two, are readily apparent. The techniques demonstrated can process uncertainty information for independent, uncorrelated data and for some types of dependent and correlated data. Example applications are suggested, illustrative problems are worked, and graphical results are given.
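
    A toy illustration of hybrid-number arithmetic (not the authors' formalism): each hybrid is represented as random samples of an interval, i.e. a single alpha-cut of its fuzzy part, and multiplication is interval arithmetic applied to each paired sample.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)

    def interval_mul(lo1, hi1, lo2, hi2):
        """Interval-arithmetic product (handles sign changes)."""
        prods = np.stack([lo1 * lo2, lo1 * hi2, hi1 * lo2, hi1 * hi2])
        return prods.min(axis=0), prods.max(axis=0)

    # Hybrid A: normal random centre with a +/-10% fuzzy spread (one alpha-cut)
    a_c = rng.normal(5.0, 0.5, 10_000)
    a_lo, a_hi = 0.9 * a_c, 1.1 * a_c

    # Hybrid B: lognormal random centre with a +/-0.2 additive fuzzy spread
    b_c = rng.lognormal(0.0, 0.1, 10_000)
    b_lo, b_hi = b_c - 0.2, b_c + 0.2

    p_lo, p_hi = interval_mul(a_lo, a_hi, b_lo, b_hi)   # product, sample by sample
    print(np.percentile(p_lo, 5), np.percentile(p_hi, 95))
    ```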

  6. Uncertainty quantification in lattice QCD calculations for nuclear physics

    SciTech Connect (OSTI)

    Beane, Silas R.; Detmold, William; Orginos, Kostas; Savage, Martin J.

    2015-02-05

    The numerical technique of Lattice QCD holds the promise of connecting the nuclear forces, nuclei, the spectrum and structure of hadrons, and the properties of matter under extreme conditions with the underlying theory of the strong interactions, quantum chromodynamics. A distinguishing, and thus far unique, feature of this formulation is that all of the associated uncertainties, both statistical and systematic can, in principle, be systematically reduced to any desired precision with sufficient computational and human resources. As a result, we review the sources of uncertainty inherent in Lattice QCD calculations for nuclear physics, and discuss how each is quantified in current efforts.

  7. Uncertainty Analysis for RELAP5-3D

    SciTech Connect (OSTI)

    Aaron J. Pawel; Dr. George L. Mesina

    2011-08-01

    In its current state, RELAP5-3D is a 'best-estimate' code; it is one of our most reliable programs for modeling what occurs within reactor systems in transients from given initial conditions. This code, however, remains an estimator. A statistical analysis has been performed that begins to lay the foundation for a full uncertainty analysis. By varying the inputs over assumed probability density functions, the output parameters were shown to vary as well. Using statistical tools such as means, variances, and tolerance intervals, a picture has been obtained of how uncertain the results are, given the uncertainty of the inputs.
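
    A minimal sketch of this style of input-propagation study, with a stand-in algebraic model in place of a RELAP5-3D run and a one-sided 95/95 upper tolerance bound taken from Wilks' order-statistic result (the maximum of 59 random runs); the input distributions and the model are invented for illustration.

      import numpy as np

      rng = np.random.default_rng(7)

      def model(power, flow, gap_conductance):
          """Stand-in for a RELAP5-3D run: returns a scalar figure of merit."""
          return 600.0 + 0.08 * power / flow + 5.0 / gap_conductance

      n = 59                                  # Wilks: the max of 59 random runs
                                              # is a one-sided 95/95 bound
      power = rng.normal(3000.0, 60.0, n)     # assumed input PDFs (illustrative)
      flow = rng.uniform(16.0, 18.0, n)
      gap = rng.lognormal(np.log(0.5), 0.1, n)

      t = model(power, flow, gap)
      print("mean             :", t.mean())
      print("sample variance  :", t.var(ddof=1))
      print("95/95 upper bound:", t.max())

    The choice n = 59 comes from 1 - 0.95**59 being approximately 0.952, so the sample maximum covers the 95th percentile with at least 95% confidence.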

  8. Distributed Generation Investment by a Microgrid under Uncertainty

    SciTech Connect (OSTI)

    Marnay, Chris; Siddiqui, Afzal; Marnay, Chris

    2008-08-11

    This paper examines a California-based microgrid's decision to invest in a distributed generation (DG) unit fuelled by natural gas. While the long-term natural gas generation cost is stochastic, we initially assume that the microgrid may purchase electricity at a fixed retail rate from its utility. Using the real options approach, we find a natural gas generation cost threshold that triggers DG investment. Furthermore, the consideration of operational flexibility by the microgrid increases DG investment, while the option to disconnect from the utility is not attractive. By allowing the electricity price to be stochastic, we next determine an investment threshold boundary and find that high electricity price volatility relative to that of natural gas generation cost delays investment while simultaneously increasing the value of the investment. We conclude by using this result to find the implicit option value of the DG unit when two sources of uncertainty exist.
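
    For orientation, the threshold logic in this kind of real-options analysis follows the standard McDonald-Siegel form (generic notation, not necessarily the paper's exact formulation): if the stochastic payoff V of investing follows a geometric Brownian motion with drift \mu and volatility \sigma, the discount rate is \rho > \mu, and the investment cost is I, then investment is triggered when

      V \ge V^* = \frac{\beta_1}{\beta_1 - 1} I , \qquad
      \beta_1 = \frac{1}{2} - \frac{\mu}{\sigma^2}
              + \sqrt{\left(\frac{\mu}{\sigma^2} - \frac{1}{2}\right)^2 + \frac{2\rho}{\sigma^2}} > 1 .

    Larger \sigma pushes \beta_1 toward 1 and inflates the multiplier \beta_1/(\beta_1 - 1), raising the threshold: higher volatility delays investment while increasing the option's value, which is the qualitative finding reported in the abstract.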

  9. IAEA CRP on HTGR Uncertainty Analysis: Benchmark Definition and Test Cases

    SciTech Connect (OSTI)

    Gerhard Strydom; Frederik Reitsma; Hans Gougar; Bismark Tyobeka; Kostadin Ivanov

    2012-11-01

    Uncertainty and sensitivity studies are essential elements of the reactor simulation code verification and validation process. Although several international uncertainty quantification activities have been launched in recent years in the LWR, BWR and VVER domains (e.g. the OECD/NEA BEMUSE program [1], from which the current OECD/NEA LWR Uncertainty Analysis in Modelling (UAM) benchmark [2] effort was derived), the systematic propagation of uncertainties in cross-section, manufacturing and model parameters for High Temperature Gas-cooled Reactor (HTGR) designs has not been attempted yet. This paper summarises the scope, objectives and exercise definitions of the IAEA Coordinated Research Project (CRP) on HTGR UAM [3]. Note that no results are included here, as the HTGR UAM benchmark was only launched formally in April 2012, and the specification is currently still under development.

  10. Effect of soil property uncertainties on permafrost thaw projections: A calibration-constrained analysis

    SciTech Connect (OSTI)

    Harp, Dylan; Atchley, Adam; Painter, Scott L; Coon, Ethan T.; Wilson, Cathy; Romanovsky, Vladimir E; Rowland, Joel

    2016-01-01

    The effect of soil property uncertainties on permafrost thaw projections is studied using a three-phase subsurface thermal hydrology model and calibration-constrained uncertainty analysis. The Null-Space Monte Carlo method is used to identify soil hydrothermal parameter combinations that are consistent with borehole temperature measurements at the study site, the Barrow Environmental Observatory. Each parameter combination is then used in a forward projection of permafrost conditions for the 21st century (from calendar year 2006 to 2100) using atmospheric forcings from the Community Earth System Model (CESM) in the Representative Concentration Pathway (RCP) 8.5 greenhouse gas concentration trajectory. A 100-year projection allows for the evaluation of intra-annual uncertainty due to soil properties and the inter-annual variability due to year-to-year differences in CESM climate forcings. After calibrating to borehole temperature data at this well-characterized site, soil property uncertainties are still significant and result in significant intra-annual uncertainties in projected active layer thickness (ALT) and annual thaw depth-duration even with a specified future climate. Intra-annual uncertainties in projected soil moisture content and Stefan number are small. A volume- and time-integrated Stefan number decreases significantly in the future climate, indicating that latent heat of phase change becomes more important than heat conduction in future climates. Out of 10 soil parameters, ALT, annual thaw depth-duration, and Stefan number are highly dependent on mineral soil porosity, while annual mean liquid saturation of the active layer is highly dependent on the mineral soil residual saturation and moderately dependent on peat residual saturation. By comparing the ensemble statistics to the spread of projected permafrost metrics using different climate models, we show that the effect of calibration-constrained uncertainty in soil properties, although significant
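
    For reference, the Stefan number invoked here is, in one common form (the paper's volume- and time-integrated version generalizes this pointwise ratio),

      \mathrm{Ste} = \frac{c \, \Delta T}{L} ,

    the ratio of sensible heat (heat capacity c times the driving temperature difference \Delta T) to the latent heat of phase change L; a decreasing Stefan number therefore signals that latent heat increasingly dominates heat conduction, as the abstract notes.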

  11. Effect of soil property uncertainties on permafrost thaw projections: A calibration-constrained analysis

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Harp, D. R.; Atchley, A. L.; Painter, S. L.; Coon, E. T.; Wilson, C. J.; Romanovsky, V. E.; Rowland, J. C.

    2015-06-29

    The effect of soil property uncertainties on permafrost thaw projections is studied using a three-phase subsurface thermal hydrology model and calibration-constrained uncertainty analysis. The Null-Space Monte Carlo method is used to identify soil hydrothermal parameter combinations that are consistent with borehole temperature measurements at the study site, the Barrow Environmental Observatory. Each parameter combination is then used in a forward projection of permafrost conditions for the 21st century (from calendar year 2006 to 2100) using atmospheric forcings from the Community Earth System Model (CESM) in the Representative Concentration Pathway (RCP) 8.5 greenhouse gas concentration trajectory. A 100-year projection allows for the evaluation of intra-annual uncertainty due to soil properties and the inter-annual variability due to year-to-year differences in CESM climate forcings. After calibrating to borehole temperature data at this well-characterized site, soil property uncertainties are still significant and result in significant intra-annual uncertainties in projected active layer thickness (ALT) and annual thaw depth-duration even with a specified future climate. Intra-annual uncertainties in projected soil moisture content and Stefan number are small. A volume- and time-integrated Stefan number decreases significantly in the future climate, indicating that latent heat of phase change becomes more important than heat conduction in future climates. Out of 10 soil parameters, ALT, annual thaw depth-duration, and Stefan number are highly dependent on mineral soil porosity, while annual mean liquid saturation of the active layer is highly dependent on the mineral soil residual saturation and moderately dependent on peat residual saturation. By comparing the ensemble statistics to the spread of projected permafrost metrics using different climate models, we show that the effect of calibration-constrained uncertainty in soil properties, although

  12. Effect of soil property uncertainties on permafrost thaw projections: A calibration-constrained analysis

    SciTech Connect (OSTI)

    Harp, D. R.; Atchley, A. L.; Painter, S. L.; Coon, E. T.; Wilson, C. J.; Romanovsky, V. E.; Rowland, J. C.

    2015-06-29

    The effect of soil property uncertainties on permafrost thaw projections is studied using a three-phase subsurface thermal hydrology model and calibration-constrained uncertainty analysis. The Null-Space Monte Carlo method is used to identify soil hydrothermal parameter combinations that are consistent with borehole temperature measurements at the study site, the Barrow Environmental Observatory. Each parameter combination is then used in a forward projection of permafrost conditions for the 21st century (from calendar year 2006 to 2100) using atmospheric forcings from the Community Earth System Model (CESM) in the Representative Concentration Pathway (RCP) 8.5 greenhouse gas concentration trajectory. A 100-year projection allows for the evaluation of intra-annual uncertainty due to soil properties and the inter-annual variability due to year-to-year differences in CESM climate forcings. After calibrating to borehole temperature data at this well-characterized site, soil property uncertainties are still significant and result in significant intra-annual uncertainties in projected active layer thickness (ALT) and annual thaw depth-duration even with a specified future climate. Intra-annual uncertainties in projected soil moisture content and Stefan number are small. A volume- and time-integrated Stefan number decreases significantly in the future climate, indicating that latent heat of phase change becomes more important than heat conduction in future climates. Out of 10 soil parameters, ALT, annual thaw depth-duration, and Stefan number are highly dependent on mineral soil porosity, while annual mean liquid saturation of the active layer is highly dependent on the mineral soil residual saturation and moderately dependent on peat residual saturation. By comparing the ensemble statistics to the spread of projected permafrost metrics using different climate models, we show that the effect of calibration-constrained uncertainty in soil properties, although significant, is

  13. Effect of soil property uncertainties on permafrost thaw projections: A calibration-constrained analysis

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Harp, Dylan R.; Atchley, Adam L.; Painter, Scott L.; Coon, Ethan T.; Wilson, Cathy J.; Romanovsky, Vladimir E.; Rowland, Joel C.

    2016-02-11

    Here, the effect of soil property uncertainties on permafrost thaw projections is studied using a three-phase subsurface thermal hydrology model and calibration-constrained uncertainty analysis. The Null-Space Monte Carlo method is used to identify soil hydrothermal parameter combinations that are consistent with borehole temperature measurements at the study site, the Barrow Environmental Observatory. Each parameter combination is then used in a forward projection of permafrost conditions for the 21st century (from calendar year 2006 to 2100) using atmospheric forcings from the Community Earth System Model (CESM) in the Representative Concentration Pathway (RCP) 8.5 greenhouse gas concentration trajectory. A 100-year projection allows for the evaluation of intra-annual uncertainty due to soil properties and the inter-annual variability due to year-to-year differences in CESM climate forcings. After calibrating to borehole temperature data at this well-characterized site, soil property uncertainties are still significant and result in significant intra-annual uncertainties in projected active layer thickness (ALT) and annual thaw depth-duration even with a specified future climate. Intra-annual uncertainties in projected soil moisture content and Stefan number are small. A volume- and time-integrated Stefan number decreases significantly in the future climate, indicating that latent heat of phase change becomes more important than heat conduction in future climates. Out of 10 soil parameters, ALT, annual thaw depth-duration, and Stefan number are highly dependent on mineral soil porosity, while annual mean liquid saturation of the active layer is highly dependent on the mineral soil residual saturation and moderately dependent on peat residual saturation. By comparing the ensemble statistics to the spread of projected permafrost metrics using different climate models, we show that the effect of calibration-constrained uncertainty in soil properties

  14. Reducing uncertainty in geostatistical description with well testing pressure data

    SciTech Connect (OSTI)

    Reynolds, A.C.; He, Nanqun; Oliver, D.S.

    1997-08-01

    Geostatistics has proven to be an effective tool for generating realizations of reservoir properties conditioned to static data, e.g., core and log data and geologic knowledge. Due to the lack of closely spaced data in the lateral directions, there will be significant variability in reservoir descriptions generated by geostatistical simulation, i.e., significant uncertainty in the reservoir descriptions. In past work, we have presented procedures based on inverse problem theory for generating reservoir descriptions (rock property fields) conditioned to pressure data and geostatistical information represented as prior means for log-permeability and porosity and variograms. Although we have shown that the incorporation of pressure data reduces the uncertainty below the level contained in the geostatistical model based only on static information (the prior model), our previous results did not explicitly account for uncertainties in the prior means and the parameters defining the variogram model. In this work, we investigate how pressure data can help detect errors in the prior means. If errors in the prior means are large and are not taken into account, realizations conditioned to pressure data represent incorrect samples of the a posteriori probability density function for the rock property fields, whereas, if the uncertainty in the prior mean is incorporated properly into the model, one obtains realistic realizations of the rock property fields.

  15. Improved lower bound on the entropic uncertainty relation

    SciTech Connect (OSTI)

    Jafarpour, Mojtaba; Sabour, Abbass

    2011-09-15

    We present a lower bound on the entropic uncertainty relation for the distinguished measurements of two observables in a d-dimensional Hilbert space for d up to 5. This bound provides an improvement over the best one yet available. Whether the obtained bound also yields an improvement in higher dimensions is discussed as well.
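
    For context, the classical benchmark that such bounds improve upon is the Maassen-Uffink relation: for two observables with eigenbases {|a_i>} and {|b_j>},

      H(A) + H(B) \ge -2 \log_2 c , \qquad c = \max_{i,j} \left| \langle a_i | b_j \rangle \right| ,

    where H denotes the Shannon entropy of the measurement outcome distributions; improved bounds such as the one above tighten the right-hand side for particular dimensions and observable pairs.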

  16. Reducing the uncertainties in particle therapy

    SciTech Connect (OSTI)

    Oancea, C.; Shipulin, K. N.; Mytsin, G. V.; Luchin, Y. I.

    2015-02-24

    The use of fundamental Nuclear Physics in Nuclear Medicine has a significant impact in the fight against cancer. Hadrontherapy is an innovative cancer radiotherapy method using nuclear particles (protons, neutrons and ions) for the treatment of early and advanced tumors. The main goal of proton therapy is to deliver high radiation doses to the tumor volume with minimal damage to healthy tissues and organs. The purpose of this work was to investigate the dosimetric errors in clinical proton therapy dose calculation due to the presence of metallic implants in the treatment plan, and to determine the impact of the errors. The results indicate that the errors introduced by the treatment planning systems are higher than 10% in the prediction of the dose at isocenter when the proton beam is passing directly through a metallic titanium alloy implant. In conclusion, we recommend that pencil-beam algorithms not be used when planning treatment for patients with titanium alloy implants, and to consider implementing methods to mitigate the effects of the implants.

  17. River meander modeling and confronting uncertainty.

    SciTech Connect (OSTI)

    Posner, Ari J.

    2011-05-01

    This study examines the meandering phenomenon as it occurs in media throughout terrestrial, glacial, atmospheric, and aquatic environments. Analysis of the minimum energy principle, Coriolis forces, and random walks as explanations of the meandering phenomenon found that these theories apply at different temporal and spatial scales. Coriolis forces might induce topological changes resulting in meandering planforms. The minimum energy principle might explain how these forces combine to limit the sinuosity to depth and width ratios that are common throughout various media. The study then compares the first-order analytical solutions for the flow field by Ikeda, et al. (1981) and Johannesson and Parker (1989b). Ikeda et al.'s linear bank erosion model was implemented to predict the rate of bank erosion, in which the bank erosion coefficient is treated as a stochastic variable that varies with physical properties of the bank (e.g., cohesiveness, stratigraphy, or vegetation density). The developed model was used to predict the evolution of meandering planforms. The modeling results were then analyzed and compared to the observed data. Since the migration of a meandering channel consists of downstream translation, lateral expansion, and downstream or upstream rotations, several measures are formulated in order to determine which of the resulting planforms is closest to the experimentally measured one. Results from the deterministic model depend strongly on the calibrated erosion coefficient. Since field measurements are always limited, the stochastic model yielded more realistic predictions of meandering planform evolutions. Due to the random nature of the bank erosion coefficient, the meandering planform evolution is a stochastic process that can only be accurately predicted by a stochastic model.
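
    The bank-migration relation at the core of this approach (Ikeda et al., 1981) is linear in the near-bank velocity perturbation; in the stochastic variant described above, the erosion coefficient E becomes a random variable rather than a calibrated constant:

      \dot{\zeta}(s, t) = E \, u_b(s, t) ,

    where \dot{\zeta} is the lateral migration rate of the channel centerline at streamwise coordinate s and u_b is the near-bank excess streamwise velocity.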

  18. The IAEA Coordinated Research Program on HTGR Uncertainty Analysis: Phase I Status and Initial Results

    SciTech Connect (OSTI)

    Strydom, Gerhard; Bostelmann, Friederike; Ivanov, Kostadin

    2014-10-01

    The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high fidelity physics models and robust, efficient, and accurate codes. One way to address the uncertainties in the HTGR analysis tools is to assess the sensitivity of critical parameters (such as the calculated maximum fuel temperature during loss of coolant accidents) to a few important input uncertainties. The input parameters were identified by engineering judgement in the past but are today typically based on a Phenomena Identification Ranking Table (PIRT) process. The input parameters can also be derived from sensitivity studies and are then varied in the analysis to find a spread in the parameter of importance. However, there is often no easy way to compensate for these uncertainties. In engineering system design, a common approach for addressing performance uncertainties is to add compensating margins to the system, but with passive properties credited it is not so clear how to apply this in the case of the modular HTGR heat removal path. Other more sophisticated uncertainty modelling approaches, including Monte Carlo analysis, have also been proposed and applied. Ideally one wishes to apply a more fundamental approach to determine the predictive capability and accuracies of coupled neutronics/thermal-hydraulics and depletion simulations used for reactor design and safety assessment. Today there is a broader acceptance of the use of uncertainty analysis even in safety studies, and it has been accepted by regulators in some cases to replace the traditional conservative analysis. Therefore some safety analysis calculations may use a mixture of these approaches for different parameters, depending upon the particular requirements of the analysis problem involved. Sensitivity analysis can, for example, be used to provide information as part of an uncertainty analysis to determine best estimate plus uncertainty results to the

  19. Reduction in maximum time uncertainty of paired time signals

    DOE Patents [OSTI]

    Theodosiou, G.E.; Dawson, J.W.

    1983-10-04

    Reduction in the maximum time uncertainty (t_max - t_min) of a series of paired time signals t_1 and t_2 varying between two input terminals and representative of a series of single events, where t_1 ≤ t_2 and t_1 + t_2 equals a constant, is carried out with a circuit utilizing a combination of OR and AND gates as signal selecting means and one or more time delays to increase the minimum value (t_min) of the first signal t_1 closer to t_max and thereby reduce the difference. The circuit may utilize a plurality of stages to reduce the uncertainty by factors of 20-800. 6 figs.
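
    A schematic numerical illustration of the select-and-delay principle (an invented toy model, not the patented circuit): because t_1 + t_2 is constant for each event, delaying the earlier signal by d and re-selecting the earlier/later edges (OR gate / AND gate) halves the spread of the earlier signal at every stage when d is sized to the current spread, so a handful of cascaded stages reaches reduction factors in the quoted 20-800 range.

      import numpy as np

      rng = np.random.default_rng(0)
      C = 100.0                                 # t1 + t2 is constant per event
      t1 = rng.uniform(10.0, 50.0, 100_000)     # earlier signal, t1 <= t2
      t2 = C - t1                               # later signal

      def stage(t1, t2, d):
          """Delay the earlier signal by d; the OR gate keeps the earlier
          edge, the AND gate the later edge. The pairwise sum stays constant."""
          a = t1 + d
          return np.minimum(a, t2), np.maximum(a, t2)

      spread = lambda x: x.max() - x.min()
      s0 = spread(t1)
      print("initial spread:", s0)
      for k in range(6):
          t1, t2 = stage(t1, t2, d=spread(t1))  # delay sized to current spread
          print(f"stage {k + 1}: spread = {spread(t1):.4f}, "
                f"reduction factor = {s0 / spread(t1):.0f}")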

  20. Reduction in maximum time uncertainty of paired time signals

    DOE Patents [OSTI]

    Theodosiou, George E.; Dawson, John W.

    1983-01-01

    Reduction in the maximum time uncertainty (t_max - t_min) of a series of paired time signals t_1 and t_2 varying between two input terminals and representative of a series of single events, where t_1 ≤ t_2 and t_1 + t_2 equals a constant, is carried out with a circuit utilizing a combination of OR and AND gates as signal selecting means and one or more time delays to increase the minimum value (t_min) of the first signal t_1 closer to t_max and thereby reduce the difference. The circuit may utilize a plurality of stages to reduce the uncertainty by factors of 20-800.

  1. Reduction in maximum time uncertainty of paired time signals

    DOE Patents [OSTI]

    Theodosiou, G.E.; Dawson, J.W.

    1981-02-11

    Reduction in the maximum time uncertainty (t_max - t_min) of a series of paired time signals t_1 and t_2 varying between two input terminals and representative of a series of single events, where t_1 ≤ t_2 and t_1 + t_2 equals a constant, is carried out with a circuit utilizing a combination of OR and AND gates as signal selecting means and one or more time delays to increase the minimum value (t_min) of the first signal t_1 closer to t_max and thereby reduce the difference. The circuit may utilize a plurality of stages to reduce the uncertainty by factors of 20 to 800.

  2. Uncertainty Quantification and Propagation in Nuclear Density Functional Theory

    SciTech Connect (OSTI)

    Schunck, N; McDonnell, J D; Higdon, D; Sarich, J; Wild, S M

    2015-03-17

    Nuclear density functional theory (DFT) is one of the main theoretical tools used to study the properties of heavy and superheavy elements, or to describe the structure of nuclei far from stability. While ongoing efforts seek to better root nuclear DFT in the theory of nuclear forces, energy functionals remain semi-phenomenological constructions that depend on a set of parameters adjusted to experimental data in finite nuclei. In this paper, we review recent efforts to quantify the related uncertainties, and propagate them to model predictions. In particular, we cover the topics of parameter estimation for inverse problems, statistical analysis of model uncertainties and Bayesian inference methods. Illustrative examples are taken from the literature.
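
    A compact statement of the propagation step described here, in standard least-squares form (generic machinery, not tied to any particular functional): with parameters p fitted by minimizing a chi-square objective with residual Jacobian J and weight matrix W at the optimum \hat{p}, the parameter covariance and its first-order propagation to a model prediction A(p) are

      \mathrm{Cov}(\hat{p}) \approx (J^{\mathsf{T}} W J)^{-1} , \qquad
      \sigma_A^2 \approx \nabla_p A^{\mathsf{T}} \, \mathrm{Cov}(\hat{p}) \, \nabla_p A ,

    while the Bayesian methods mentioned replace the point estimate \hat{p} with a posterior distribution sampled over the same parameter space.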

  3. Charges versus taxes under uncertainty: A look from investment theory

    SciTech Connect (OSTI)

    Kohlhaas, M.

    1996-12-31

    The search for more cost-efficient economic instruments for environmental protection has been conducted with different priorities in the USA and most European countries. Whereas in Europe a preference for charges and taxes seems to prevail, in the USA tradeable permits are preferred. Environmental policy has to take into consideration the trade-off between price uncertainty and ecological effectiveness. In the short run and on a national level, ecological effectiveness is less important, and price uncertainty is smaller with environmental taxes. In the long run and in an international framework, environmental constraints and repercussions on world market prices of energy become more important. Since taxes are politically more accepted in Europe, it would make sense to start with an ecological tax reform, which can gradually be transformed into or integrated with a system of internationally tradeable permits.

  4. Giant dipole resonance parameters with uncertainties from photonuclear cross sections

    SciTech Connect (OSTI)

    Plujko, V.A.; Capote, R.; Gorbachenko, O.M.

    2011-09-15

    Updated values and corresponding uncertainties of isovector giant dipole resonance (IVGDR or GDR) model parameters are presented that are obtained by least-squares fitting of theoretical photoabsorption cross sections to experimental data. The theoretical photoabsorption cross section is taken as a sum of the components corresponding to excitation of the GDR and a quasideuteron contribution to the experimental photoabsorption cross section. The present compilation covers experimental data as of January 2010. Highlights: experimental σ(γ, abs) or a sum of partial cross sections are taken as input to the fitting; data include contributions from photoproton reactions; standard (SLO) or modified (SMLO) Lorentzian approaches are used for formulating GDR models; spherical or axially deformed nuclear shapes are used in the GDR least-squares fit; values and uncertainties of the SLO and SMLO GDR model parameters are tabulated.
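
    The SLO component referred to above is the standard Lorentzian photoabsorption shape (written here per resonance, before the quasideuteron term is added):

      \sigma_{\mathrm{SLO}}(E_\gamma) = \sigma_r \, \frac{E_\gamma^2 \Gamma_r^2}{(E_\gamma^2 - E_r^2)^2 + E_\gamma^2 \Gamma_r^2} ,

    with peak cross section \sigma_r, resonance energy E_r, and width \Gamma_r as the fitted GDR parameters; the SMLO variant replaces the constant \Gamma_r with an energy-dependent width.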

  5. Uncertainties in nuclear transition matrix elements of neutrinoless ββ decay

    SciTech Connect (OSTI)

    Rath, P. K.

    2013-12-30

    To estimate the uncertainties associated with the nuclear transition matrix elements M^(K) (K = 0ν/0N) for the 0+ → 0+ transitions of electron- and positron-emitting modes of the neutrinoless ββ decay, a statistical analysis has been performed by calculating sets of eight (twelve) different nuclear transition matrix elements M^(K) in the PHFB model, employing four different parameterizations of a Hamiltonian with pairing plus multipolar effective two-body interaction and two (three) different parameterizations of Jastrow short-range correlations. The averages, in conjunction with their standard deviations, provide an estimate of the uncertainties associated with the nuclear transition matrix elements M^(K) calculated within the PHFB model, the maximum of which turns out to be 13% and 19% for the exchange of light and heavy Majorana neutrinos, respectively.

  6. Neutron reactions and climate uncertainties earn Los Alamos scientists DOE Early Career awards

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Marian Jandel and Nathan Urban are among the 61 national recipients of the Energy Department's Early Career Research Program awards for 2013. May 10, 2013. Contact: Nancy Ambrosiano, Communications Office, (505) 667-0471. Marian and Nathan are excellent examples of the

  7. Optimization of Complex Energy System Under Uncertainty | Argonne Leadership Computing Facility

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    PI Name: Mihai Anitescu. PI Email: anitescu@mcs.anl.gov. Institution: Argonne National Laboratory. Allocation Program: INCITE. Allocation Hours at ALCF: 10,000,000. Year: 2012. Research Domain: Energy Technologies. The U.S. electrical power system is at a crossroads between its mission to deliver cheap and safe electrical energy, a strategic aim to increase the penetration of renewable energy, an increased

  8. Optimization of Complex Energy System Under Uncertainty | Argonne Leadership Computing Facility

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    On the left: the Illinois power grid system overlaid on fields portraying electricity prices under a deterministic economic dispatch scenario. Dark blue areas have the lowest prices, while red and yellow have the highest. Argonne National Laboratory researchers use a model of the Illinois grid to test algorithms for making power dispatch decisions under uncertainty. On the right: electricity prices in Illinois under a stochastic economic

  9. Comment on ''Improved bounds on entropic uncertainty relations''

    SciTech Connect (OSTI)

    Bosyk, G. M.; Portesi, M.; Plastino, A.; Zozor, S.

    2011-11-15

    We provide an analytical proof of the entropic uncertainty relations presented by J. I. de Vicente and J. Sanchez-Ruiz [Phys. Rev. A 77, 042110 (2008)] and also show that the replacement of Eq. (27) by Eq. (29) in that reference introduces solutions that do not take fully into account the constraints of the problem, which in turn lead to some mistakes in their treatment.

  10. Computational Fluid Dynamics & Large-Scale Uncertainty Quantification for Wind Energy

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


  11. Errors and Uncertainties in Dose Reconstruction for Radiation Effects Research

    SciTech Connect (OSTI)

    Strom, Daniel J.

    2008-04-14

    Dose reconstruction for studies of the health effects of ionizing radiation has been carried out for many decades. Major studies have included Japanese bomb survivors, atomic veterans, downwinders of the Nevada Test Site and Hanford, underground uranium miners, and populations of nuclear workers. For such studies to be credible, significant effort must be put into applying the best science to reconstructing unbiased absorbed doses to tissues and organs as a function of time. In many cases, more and more sophisticated dose reconstruction methods have been developed as studies progressed. For the example of the Japanese bomb survivors, the dose surrogate “distance from the hypocenter” was replaced by slant range, and then by TD65 doses, DS86 doses, and more recently DS02 doses. Over the years, it has become increasingly clear that an equal level of effort must be expended on the quantitative assessment of uncertainty in such doses, and on reducing and managing uncertainty. In this context, this paper reviews difficulties in terminology, explores the nature of Berkson and classical uncertainties in dose reconstruction through examples, and proposes a path forward for Joint Coordinating Committee for Radiation Effects Research (JCCRER) Project 2.4 that requires a reasonably small level of effort for DOSES-2008.
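
    The distinction at the heart of that discussion, in standard error-model notation with X the true dose and W the assigned (reconstructed) dose:

      \text{classical:}\; W = X + \varepsilon , \qquad \text{Berkson:}\; X = W + \varepsilon , \qquad E[\varepsilon] = 0 .

    Classical error is measurement scatter around the truth and tends to attenuate estimated dose-response slopes; Berkson error arises when a single assigned value stands in for a group of true doses scattered around it, and mainly widens confidence intervals rather than biasing the slope.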

  12. Composite Multilinearity, Epistemic Uncertainty and Risk Achievement Worth

    SciTech Connect (OSTI)

    E. Borgonovo; C. L. Smith

    2012-10-01

    Risk Achievement Worth (RAW) is one of the most widely utilized importance measures. RAW is defined as the ratio of the risk metric value attained when a component has failed to the base case value of the risk metric. Traditionally, both the numerator and denominator are point estimates. Relevant literature has shown that inclusion of epistemic uncertainty (i) induces notable variability in the point-estimate ranking and (ii) causes the expected value of the risk metric to differ from its nominal value. We obtain the conditions under which equality holds between the nominal and expected values of a reliability risk metric. Among these conditions, separability and state-of-knowledge independence emerge. We then study how the presence of epistemic uncertainty affects RAW and the associated ranking. We propose an extension of RAW (called ERAW) which allows one to obtain a ranking robust to epistemic uncertainty. We discuss the properties of ERAW and the conditions under which it coincides with RAW. We apply our findings to a probabilistic risk assessment model developed for the safety analysis of NASA lunar space missions.
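
    In symbols, with R the risk metric, q_i = 1 denoting certain failure of component i, and R_0 the base case value, the point-estimate measure is

      \mathrm{RAW}_i = \frac{R(q_i = 1)}{R_0} ,

    and an epistemic extension of the kind proposed replaces the two point values with quantities averaged over the epistemic distribution; the ratio of expectations E[R(q_i = 1)] / E[R_0] is shown here only as the natural generic form, with the exact definition of ERAW given in the paper.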

  13. Adaptive polynomial chaos techniques for uncertainty quantification of a gas cooled fast reactor transient

    SciTech Connect (OSTI)

    Perko, Z.; Gilli, L.; Lathouwers, D.; Kloosterman, J. L.

    2013-07-01

    Uncertainty quantification plays an increasingly important role in the nuclear community, especially with the rise of Best Estimate Plus Uncertainty methodologies. Sensitivity analysis, surrogate models, Monte Carlo sampling and several other techniques can be used to propagate input uncertainties. In recent years, however, polynomial chaos expansion has become a popular alternative, providing high accuracy at affordable computational cost. This paper presents such polynomial chaos (PC) methods using adaptive sparse grids and adaptive basis set construction, together with an application to a Gas Cooled Fast Reactor transient. Comparison is made between a new sparse grid algorithm and the traditionally used technique proposed by Gerstner. An adaptive basis construction method is also introduced and is shown to be advantageous both from an accuracy and a computational point of view. As a demonstration, the uncertainty quantification of a 50% loss of flow transient in the GFR2400 Gas Cooled Fast Reactor design was performed using the CATHARE code system. The results are compared to direct Monte Carlo sampling and show the superior convergence and high accuracy of the polynomial chaos expansion. Since PC techniques are easy to implement, they can offer an attractive alternative to traditional techniques for the uncertainty quantification of large scale problems. (authors)
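
    A one-dimensional, non-adaptive sketch of the projection step behind polynomial chaos (the paper's adaptive sparse grids and basis construction go far beyond this; the test function is invented): expand f(ξ) with ξ ~ N(0,1) in probabilists' Hermite polynomials and read the mean and variance off the coefficients.

      import numpy as np
      from math import factorial
      from numpy.polynomial import hermite_e as He

      f = np.exp                            # stand-in model response
      x, w = He.hermegauss(20)              # Gauss-Hermite_e quadrature nodes
      w = w / np.sqrt(2.0 * np.pi)          # normalize to the N(0,1) density

      # Spectral projection: c_n = E[f(xi) * He_n(xi)] / n!
      P = 8
      c = [np.sum(w * f(x) * He.hermeval(x, [0.0] * n + [1.0])) / factorial(n)
           for n in range(P + 1)]

      mean = c[0]
      var = sum(cn ** 2 * factorial(n) for n, cn in enumerate(c) if n > 0)
      print(mean, np.exp(0.5))                        # exact mean is e^(1/2)
      print(var, np.exp(1.0) * (np.exp(1.0) - 1.0))   # exact variance is e(e-1)

    Both statistics converge rapidly with the expansion order P, which is the convergence advantage over direct Monte Carlo sampling that the abstract reports.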

  14. IAEA Coordinated Research Project on HTGR Reactor Physics, Thermal-hydraulics and Depletion Uncertainty Analysis

    SciTech Connect (OSTI)

    Strydom, Gerhard; Bostelmann, F.

    2015-09-01

    The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high fidelity physics models and robust, efficient, and accurate codes. The predictive capability of coupled neutronics/thermal-hydraulics and depletion simulations for reactor design and safety analysis can be assessed with sensitivity analysis (SA) and uncertainty analysis (UA) methods. Uncertainty originates from errors in physical data, manufacturing uncertainties, and modelling and computational algorithms. (The interested reader is referred to the large body of published SA and UA literature for a more complete overview of the various types of uncertainties, methodologies and results obtained). SA is helpful for ranking the various sources of uncertainty and error in the results of core analyses. SA and UA are required to address cost, safety, and licensing needs and should be applied to all aspects of reactor multi-physics simulation. SA and UA can guide experimental, modelling, and algorithm research and development. Current SA and UA rely either on stochastic sampling methods or on derivative-based methods such as generalized perturbation theory to obtain sensitivity coefficients. Neither approach addresses all needs. In order to benefit from recent advances in modelling and simulation and the availability of new covariance data (nuclear data uncertainties), extensive sensitivity and uncertainty studies are needed for quantification of the impact of different sources of uncertainties on the design and safety parameters of HTGRs. Only a parallel effort in advanced simulation and in nuclear data improvement will be able to provide designers with more robust and well validated calculation tools to meet design target accuracies. In February 2009, the Technical Working Group on Gas-Cooled Reactors (TWG-GCR) of the International Atomic Energy Agency (IAEA) recommended that the proposed Coordinated Research Program (CRP) on

  15. Use of Forward Sensitivity Analysis Method to Improve Code Scaling, Applicability, and Uncertainty (CSAU) Methodology

    SciTech Connect (OSTI)

    Haihua Zhao; Vincent A. Mousseau; Nam T. Dinh

    2010-10-01

    Code Scaling, Applicability, and Uncertainty (CSAU) methodology was developed in the late 1980s by the US NRC to systematically quantify reactor simulation uncertainty. Building on the CSAU methodology, Best Estimate Plus Uncertainty (BEPU) methods have been developed and widely used for new reactor designs and for power uprates of existing LWRs. In spite of these successes, several aspects of CSAU have been criticized as needing improvement: (1) subjective judgement in the PIRT process; (2) high cost, due to heavy reliance on a large experimental database, many expert man-years of work, and very high computational overhead; (3) mixing of numerical errors with other uncertainties; (4) grid dependence and the use of the same numerical grids for both scaled experiments and real plant applications; and (5) user effects. Although a large amount of effort has gone into improving the CSAU methodology, the above issues persist. With the effort to develop next generation safety analysis codes, new opportunities appear to take advantage of new numerical methods, better physical models, and modern uncertainty quantification methods. Forward sensitivity analysis (FSA) directly solves the PDEs for parameter sensitivities (defined as the differential of the physical solution with respect to any constant parameter). When the parameter sensitivities are available in a new advanced system analysis code, CSAU could be significantly improved: (1) quantifying numerical errors: new codes which are totally implicit and of higher order accuracy can run much faster, with numerical errors quantified by FSA; (2) quantitative PIRT (Q-PIRT) to reduce subjective judgement and improve efficiency: treat numerical errors as special sensitivities alongside other physical uncertainties, and consider only parameters whose uncertainties have large effects on the design criteria; (3) greatly reducing computational costs for uncertainty quantification by (a) choosing optimized time steps and spatial sizes; (b) using gradient information
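
    A minimal illustration of forward sensitivity analysis on an ODE stand-in for the PDE case discussed above (the model and all numbers are invented): augment the state with s = dy/dp, which obeys the linearized equation ds/dt = (df/dy) s + df/dp, and integrate both together.

      import numpy as np
      from scipy.integrate import solve_ivp

      p, T = 0.7, 2.0                   # parameter of interest, end time

      def rhs(t, z):
          y, s = z
          f = -p * y                    # physical model: dy/dt = -p * y
          # Forward sensitivity equation for s = dy/dp:
          #   ds/dt = (df/dy) * s + df/dp = -p * s - y
          return [f, -p * s - y]

      sol = solve_ivp(rhs, [0.0, T], [1.0, 0.0], rtol=1e-10, atol=1e-12)
      y_T, s_T = sol.y[:, -1]
      print(s_T, -T * np.exp(-p * T))   # analytic check: dy/dp = -t * exp(-p*t)

      sigma_p = 0.05                    # assumed parameter uncertainty
      print("first-order sigma_y:", abs(s_T) * sigma_p)

    The same augmentation applied to a discretized PDE yields sensitivities of the full solution field at roughly the cost of one extra linear solve per parameter, which is the economy the report builds on.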

  16. Achieving Robustness to Uncertainty for Financial Decision-making

    SciTech Connect (OSTI)

    Barnum, George M.; Van Buren, Kendra L.; Hemez, Francois M.; Song, Peter

    2014-01-10

    This report investigates the concept of robustness analysis to support financial decision-making. Financial models that forecast future stock returns or market conditions depend on assumptions that might be unwarranted and on variables that might exhibit large fluctuations from their last-known values. The analysis of robustness explores these sources of uncertainty and recommends model settings such that the forecasts used for decision-making are as insensitive as possible to the uncertainty. A proof-of-concept is presented with the Capital Asset Pricing Model. The robustness of model predictions is assessed using info-gap decision theory. Info-gaps are models of uncertainty that express the “distance,” or gap of information, between what is known and what needs to be known in order to support the decision. The analysis yields a description of worst-case stock returns as a function of increasing gaps in our knowledge. The analyst can then decide on the best course of action by trading off worst-case performance against “risk”, which is how much uncertainty they think needs to be accommodated in the future. The report also discusses the Graphical User Interface, developed using the MATLAB® programming environment, such that the user can control the analysis through an easy-to-navigate interface. Three directions of future work are identified to enhance the present software. First, the code should be re-written using the Python scientific programming software. This change will achieve greater cross-platform compatibility, better portability, allow for a more professional appearance, and render it independent of a commercial license, which MATLAB® requires. Second, a capability should be developed to allow users to quickly implement and analyze their own models. This will facilitate application of the software to the evaluation of proprietary financial models. The third enhancement proposed is to add the ability to evaluate multiple models simultaneously.
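
    The robustness function used in this kind of info-gap analysis has the generic Ben-Haim form (this is the general template, not a quotation of the report's specific performance function):

      \hat{h}(q, r_c) = \max \left\{ h \;:\; \min_{u \in \mathcal{U}(h, \tilde{u})} R(q, u) \ge r_c \right\} ,

    the largest horizon of uncertainty h at which decision q still meets the critical performance level r_c for every model u in the uncertainty set \mathcal{U}(h, \tilde{u}) centered on the nominal model \tilde{u}; plotting worst-case return against h is exactly the trade-off curve the report describes.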

  17. SOARCA Peach Bottom Atomic Power Station Long-Term Station Blackout Uncertainty Analysis: Convergence of the Uncertainty Results

    SciTech Connect (OSTI)

    Bixler, Nathan E.; Osborn, Douglas M.; Sallaberry, Cedric Jean-Marie; Eckert-Gallup, Aubrey Celia; Mattie, Patrick D.; Ghosh, S. Tina

    2014-02-01

    This paper describes the convergence of MELCOR Accident Consequence Code System, Version 2 (MACCS2) probabilistic results of offsite consequences for the uncertainty analysis of the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout scenario at the Peach Bottom Atomic Power Station. The consequence metrics evaluated are individual latent-cancer fatality (LCF) risk and individual early fatality risk. Consequence results are presented as conditional risk (i.e., assuming the accident occurs, risk per event) to individuals of the public as a result of the accident. In order to verify convergence for this uncertainty analysis, as recommended by the Nuclear Regulatory Commission’s Advisory Committee on Reactor Safeguards, a ‘high’ source term from the original population of Monte Carlo runs has been selected to be used for: (1) a study of the distribution of consequence results stemming solely from epistemic uncertainty in the MACCS2 parameters (i.e., separating the effect from the source term uncertainty), and (2) a comparison between Simple Random Sampling (SRS) and Latin Hypercube Sampling (LHS) in order to validate the original results obtained with LHS. Three replicates (each using a different random seed) of size 1,000 each using LHS and another set of three replicates of size 1,000 using SRS are analyzed. The results show that the LCF risk results are well converged with either LHS or SRS sampling. The early fatality risk results are less well converged at radial distances beyond 2 miles, and this is expected due to the sparse data (predominance of “zero” results).
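
    A toy version of the replicate comparison described above, with a stand-in consequence model in place of MACCS2 (scipy's qmc module supplies the Latin hypercube; the model and all numbers are invented):

      import numpy as np
      from scipy.stats import qmc

      rng = np.random.default_rng(1)
      n, reps = 1000, 3

      def consequence(u):
          """Stand-in consequence model mapping unit-cube inputs to a risk value."""
          return np.exp(2.0 * u[:, 0]) * (0.1 + u[:, 1] ** 2)

      for label in ("SRS", "LHS"):
          means = []
          for _ in range(reps):
              if label == "SRS":
                  u = rng.random((n, 2))          # simple random sampling
              else:
                  u = qmc.LatinHypercube(d=2, seed=rng).random(n)
              means.append(consequence(u).mean())
          print(label, "replicate means:", np.round(means, 4),
                "spread:", np.ptp(means))

    Agreement of the replicate means within a small spread is the convergence check; LHS typically shows the smaller replicate-to-replicate spread at equal sample size, which is why the original LHS results are validated here against SRS.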

  18. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G

    SciTech Connect (OSTI)

    Harper, F.T.; Young, M.L.; Miller, L.A.

    1995-01-01

    Two new probabilistic accident consequence codes, MACCS and COSYMA, completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.

  19. Uncertainty and sensitivity analysis of fission gas behavior in engineering-scale fuel modeling

    SciTech Connect (OSTI)

    Pastore, Giovanni; Swiler, L. P.; Hales, Jason D.; Novascone, Stephen R.; Perez, Danielle M.; Spencer, Benjamin W.; Luzzi, Lelio; Uffelen, Paul Van; Williamson, Richard L.

    2014-10-12

    The role of uncertainties in fission gas behavior calculations as part of engineering-scale nuclear fuel modeling is investigated using the BISON fuel performance code and a recently implemented physics-based model for the coupled fission gas release and swelling. Through the integration of BISON with the DAKOTA software, a sensitivity analysis of the results to selected model parameters is carried out based on UO2 single-pellet simulations covering different power regimes. The parameters are varied within ranges representative of the relative uncertainties and consistent with the information from the open literature. The study leads to an initial quantitative assessment of the uncertainty in fission gas behavior modeling with the parameter characterization presently available. Also, the relative importance of the single parameters is evaluated. Moreover, a sensitivity analysis is carried out based on simulations of a fuel rod irradiation experiment, pointing out a significant impact of the considered uncertainties on the calculated fission gas release and cladding diametral strain. The results of the study indicate that the commonly accepted deviation between calculated and measured fission gas release by a factor of 2 approximately corresponds to the inherent modeling uncertainty at high fission gas release. Nevertheless, higher deviations may be expected for values around 10% and lower. Implications are discussed in terms of directions of research for the improved modeling of fission gas behavior for engineering purposes.

  20. Uncertainty and sensitivity analysis of fission gas behavior in engineering-scale fuel modeling

    SciTech Connect (OSTI)

    G. Pastore; L.P. Swiler; J.D. Hales; S.R. Novascone; D.M. Perez; B.W. Spencer; L. Luzzi; P. Van Uffelen; R.L. Williamson

    2014-10-01

    The role of uncertainties in fission gas behavior calculations as part of engineering-scale nuclear fuel modeling is investigated using the BISON fuel performance code and a recently implemented physics-based model for the coupled fission gas release and swelling. Through the integration of BISON with the DAKOTA software, a sensitivity analysis of the results to selected model parameters is carried out based on UO2 single-pellet simulations covering different power regimes. The parameters are varied within ranges representative of the relative uncertainties and consistent with the information from the open literature. The study leads to an initial quantitative assessment of the uncertainty in fission gas behavior modeling with the parameter characterization presently available. Also, the relative importance of the single parameters is evaluated. Moreover, a sensitivity analysis is carried out based on simulations of a fuel rod irradiation experiment, pointing out a significant impact of the considered uncertainties on the calculated fission gas release and cladding diametral strain. The results of the study indicate that the commonly accepted deviation between calculated and measured fission gas release by a factor of 2 approximately corresponds to the inherent modeling uncertainty at high fission gas release. Nevertheless, higher deviations may be expected for values around 10% and lower. Implications are discussed in terms of directions of research for the improved modeling of fission gas behavior for engineering purposes.

  1. Estimated Uncertainties in the Idaho National Laboratory Matched-Index-of-Refraction Lower Plenum Experiment

    SciTech Connect (OSTI)

    Donald M. McEligot; Hugh M. McIlroy, Jr.; Ryan C. Johnson

    2007-11-01

    The purpose of the fluid dynamics experiments in the MIR (Matched-Index-of-Refraction) flow system at Idaho National Laboratory (INL) is to develop benchmark databases for the assessment of Computational Fluid Dynamics (CFD) solutions of the momentum equations, scalar mixing, and turbulence models for typical Very High Temperature Reactor (VHTR) plenum geometries in the limiting case of negligible buoyancy and constant fluid properties. The experiments use optical techniques, primarily particle image velocimetry (PIV), in the INL MIR flow system. The benefit of the MIR technique is that it permits optical measurements to determine flow characteristics in passages and around objects without locating a disturbing transducer in the flow field and without distortion of the optical paths. The objective of the present report is to develop understanding of the magnitudes of experimental uncertainties in the results to be obtained in such experiments. Unheated MIR experiments are first steps when the geometry is complicated. One does not want to use a computational technique that cannot even handle constant properties properly. This report addresses the general background, requirements for benchmark databases, estimation of experimental uncertainties in mean velocities and turbulence quantities, the MIR experiment, PIV uncertainties, positioning uncertainties, and other contributing measurement uncertainties.

  2. A Two-Step Approach to Uncertainty Quantification of Core Simulators...

    Office of Scientific and Technical Information (OSTI)


  3. Attempt to estimate measurement uncertainty in the Air Force Toxic Chemical Dispersion (AFTOX) model. Master's thesis

    SciTech Connect (OSTI)

    Zettlemoyer, M.D.

    1990-01-01

    The Air Force Toxic Chemical Dispersion (AFTOX) model is a Gaussian puff dispersion model that predicts plumes, concentrations, and hazard distances of toxic chemical spills. A measurement uncertainty propagation formula derived by Freeman et al. (1986) is used within AFTOX to estimate the resulting concentration uncertainties due to the effects of data input uncertainties in wind speed, spill height, emission rate, and the horizontal and vertical Gaussian dispersion parameters, and the results are compared to true uncertainties as estimated by standard deviations computed from Monte Carlo simulations. The measurement uncertainty propagation formula was found to overestimate the measurement uncertainty in AFTOX-calculated concentrations by at least 350 percent, with overestimates worsening with increasing stability and/or increasing measurement uncertainty.
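
    The propagation formula being tested is the usual first-order (Taylor series) expression: for a concentration C = C(x_1, ..., x_n) with independent input uncertainties \sigma_{x_i},

      \sigma_C^2 \approx \sum_{i=1}^{n} \left( \frac{\partial C}{\partial x_i} \right)^2 \sigma_{x_i}^2 ,

    which is exact only for a locally linear model; the strong nonlinearity of a Gaussian puff model in its inputs is a plausible source of the large overestimates reported, with the Monte Carlo standard deviations serving as the reference "true" uncertainties.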

  4. Ideas underlying quantification of margins and uncertainties(QMU): a white paper.

    SciTech Connect (OSTI)

    Helton, Jon Craig; Trucano, Timothy Guy; Pilch, Martin M.

    2006-09-01

    This report describes key ideas underlying the application of Quantification of Margins and Uncertainties (QMU) to nuclear weapons stockpile lifecycle decisions at Sandia National Laboratories. While QMU is a broad process and methodology for generating critical technical information to be used in stockpile management, this paper emphasizes one component, which is information produced by computational modeling and simulation. In particular, we discuss the key principles of developing QMU information in the form of Best Estimate Plus Uncertainty, the need to separate aleatory and epistemic uncertainty in QMU, and the risk-informed decision making that is best suited for decisive application of QMU. The paper is written at a high level, but provides a systematic bibliography of useful papers for the interested reader to deepen their understanding of these ideas.

  5. TRITIUM UNCERTAINTY ANALYSIS FOR SURFACE WATER SAMPLES AT THE SAVANNAH RIVER SITE

    SciTech Connect (OSTI)

    Atkinson, R.

    2012-07-31

    Radiochemical analyses of surface water samples, in the framework of Environmental Monitoring, have associated uncertainties for the radioisotopic results reported. These uncertainty analyses pertain to the tritium results from surface water samples collected at five locations on the Savannah River near the U.S. Department of Energy's Savannah River Site (SRS). Uncertainties can result from the field-sampling routine, can be incurred during transport due to the physical properties of the sample, from equipment limitations, and from the measurement instrumentation used. The uncertainty reported by the SRS in their Annual Site Environmental Report currently considers only the counting uncertainty in the measurements, which is the standard reporting protocol for radioanalytical chemistry results. The focus of this work is to provide an overview of all uncertainty components associated with SRS tritium measurements, estimate the total uncertainty according to ISO 17025, and to propose additional experiments to verify some of the estimated uncertainties. The main uncertainty components discovered and investigated in this paper are tritium absorption or desorption in the sample container, HTO/H{sub 2}O isotopic effect during distillation, pipette volume, and tritium standard uncertainty. The goal is to quantify these uncertainties and to establish a combined uncertainty in order to increase the scientific depth of the SRS Annual Site Environmental Report.
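
    The ISO 17025 / GUM-style combination referred to here: for a result y = f(x_1, ..., x_n) with independent input estimates x_i carrying standard uncertainties u(x_i), the combined standard uncertainty and the expanded uncertainty (coverage factor k, typically k = 2 for approximately 95% coverage) are

      u_c(y) = \sqrt{ \sum_{i=1}^{n} \left( \frac{\partial f}{\partial x_i} \right)^2 u^2(x_i) } , \qquad U = k \, u_c(y) ,

    so each component discussed above (container sorption or desorption, isotopic fractionation during distillation, pipette volume, tritium standard activity, counting statistics) enters as one u(x_i) term rather than counting statistics alone.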

  6. Entropic uncertainty relations for the ground state of a coupled system

    SciTech Connect (OSTI)

    Santhanam, M.S.

    2004-04-01

    There is renewed interest in the uncertainty principle reformulated from the information-theoretic point of view in the form of entropic uncertainty relations. These have been studied for various integrable systems as a function of their quantum numbers. In this work, focussing on the ground state of a nonlinear, coupled Hamiltonian system, we show that approximate eigenstates can be constructed within the framework of adiabatic theory. Using the adiabatic eigenstates, we estimate the information entropies and their sum as a function of the nonlinearity parameter. We also briefly look at the information entropies for the highly excited states in the system.

  7. Identifying and bounding uncertainties in nuclear reactor thermal power calculations

    SciTech Connect (OSTI)

    Phillips, J.; Hauser, E.; Estrada, H.

    2012-07-01

    Determination of the thermal power generated in the reactor core of a nuclear power plant is a critical element in the safe and economic operation of the plant. Direct measurement of the reactor core thermal power is made using neutron flux instrumentation; however, this instrumentation requires frequent calibration due to changes in the measured flux caused by fuel burn-up, flux pattern changes, and instrumentation drift. To calibrate the nuclear instruments, steam plant calorimetry, a process of performing a heat balance around the nuclear steam supply system, is used. There are four basic elements involved in the calculation of thermal power based on steam plant calorimetry: The mass flow of the feedwater from the power conversion system, the specific enthalpy of that feedwater, the specific enthalpy of the steam delivered to the power conversion system, and other cycle gains and losses. Of these elements, the accuracy of the feedwater mass flow and the feedwater enthalpy, as determined from its temperature and pressure, are typically the largest contributors to the calorimetric calculation uncertainty. Historically, plants have been required to include a margin of 2% in the calculation of the reactor thermal power for the licensed maximum plant output to account for instrumentation uncertainty. The margin is intended to ensure a cushion between operating power and the power for which safety analyses are performed. Use of approved chordal ultrasonic transit-time technology to make the feedwater flow and temperature measurements (in place of traditional differential-pressure- based instruments and resistance temperature detectors [RTDs]) allows for nuclear plant thermal power calculations accurate to 0.3%-0.4% of plant rated power. This improvement in measurement accuracy has allowed many plant operators in the U.S. and around the world to increase plant power output through Measurement Uncertainty Recapture (MUR) up-rates of up to 1.7% of rated power, while also
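
    In simplified single-loop form, the heat balance underlying steam plant calorimetry is (a schematic with plant-specific correction terms omitted, not a licensing formula)

      Q_{\mathrm{core}} \approx \dot{m}_{\mathrm{fw}} \left[ h_{\mathrm{steam}} - h_{\mathrm{fw}}(T_{\mathrm{fw}}, P_{\mathrm{fw}}) \right] + Q_{\mathrm{other}} ,

    so the fractional uncertainty in thermal power is driven mainly by the fractional uncertainty in the feedwater mass flow \dot{m}_{\mathrm{fw}} and, through the enthalpy difference, by the feedwater temperature error, which is precisely what the chordal ultrasonic transit-time measurements improve.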

  8. Combined Estimation of Hydrogeologic Conceptual Model and Parameter Uncertainty

    SciTech Connect (OSTI)

    Meyer, Philip D.; Ye, Ming; Neuman, Shlomo P.; Cantrell, Kirk J.

    2004-03-01

    The objective of the research described in this report is the development and application of a methodology for comprehensively assessing the hydrogeologic uncertainties involved in dose assessment, including uncertainties associated with conceptual models, parameters, and scenarios. This report describes and applies a statistical method to quantitatively estimate the combined uncertainty in model predictions arising from conceptual model and parameter uncertainties. The method relies on model averaging to combine the predictions of a set of alternative models. Implementation is driven by the available data. When there is minimal site-specific data, the method can be carried out with prior parameter estimates based on generic data and subjective prior model probabilities. For sites with observations of system behavior (and optionally data characterizing model parameters), the method uses model calibration to update the prior parameter estimates and model probabilities based on the correspondence between model predictions and site observations. The set of model alternatives can contain both simplified and complex models, with the requirement that all models be based on the same set of data. The method was applied to the geostatistical modeling of air permeability at a fractured rock site. Seven alternative variogram models of log air permeability were considered to represent data from single-hole pneumatic injection tests in six boreholes at the site. Unbiased maximum likelihood estimates of variogram and drift parameters were obtained for each model. Standard information criteria provided an ambiguous ranking of the models, which would not justify selecting one of them and discarding all others as is commonly done in practice. Instead, some of the models were eliminated based on their negligibly small updated probabilities and the rest were used to project the measured log permeabilities by kriging onto a rock volume containing the six boreholes. These four
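
    As a schematic of the model-averaging step, the sketch below converts information-criterion values for a set of alternative models into posterior model weights, drops negligible-probability models, and forms a weighted prediction. The seven IC values and per-model predictions are hypothetical placeholders, not the report's variogram results:

        import numpy as np

        # Hypothetical information-criterion values for seven alternative
        # models, with equal prior model probabilities.
        ic = np.array([104.2, 103.8, 110.5, 104.9, 121.3, 105.6, 118.0])
        prior = np.full(ic.size, 1.0 / ic.size)

        # Posterior weight ~ prior * exp(-delta_IC / 2), then normalize.
        w = prior * np.exp(-(ic - ic.min()) / 2.0)
        w /= w.sum()
        print("posterior model weights:", np.round(w, 3))

        keep = w > 0.01                  # eliminate negligible-probability models
        pred = np.array([1.2, 1.5, 0.9, 1.3, 2.0, 1.4, 0.7])  # hypothetical per-model estimates
        avg = np.sum(w[keep] * pred[keep]) / w[keep].sum()
        print("model-averaged prediction:", round(avg, 3))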

  9. PROBABILISTIC SENSITIVITY AND UNCERTAINTY ANALYSIS WORKSHOP SUMMARY REPORT

    SciTech Connect (OSTI)

    Seitz, R

    2008-06-25

    Stochastic or probabilistic modeling approaches are being applied more frequently in the United States and globally to quantify uncertainty and enhance understanding of model response in performance assessments for disposal of radioactive waste. This increased use has resulted in global interest in sharing results of research and applied studies that have been completed to date. This technical report reflects the results of a workshop that was held to share results of research and applied work related to performance assessments conducted at United States Department of Energy sites. Key findings of this research and applied work are discussed and recommendations for future activities are provided.

  10. Molecular nonlinear dynamics and protein thermal uncertainty quantification

    SciTech Connect (OSTI)

    Xia, Kelin [Department of Mathematics, Michigan State University, Michigan 48824 (United States)]; Wei, Guo-Wei, E-mail: wei@math.msu.edu [Department of Mathematics, Michigan State University, Michigan 48824 (United States); Department of Electrical and Computer Engineering, Michigan State University, Michigan 48824 (United States); Department of Biochemistry and Molecular Biology, Michigan State University, Michigan 48824 (United States)]

    2014-03-15

    This work introduces molecular nonlinear dynamics (MND) as a new approach for describing protein folding and aggregation. By using a mode system, we show that the MND of disordered proteins is chaotic while that of folded proteins exhibits intrinsically low dimensional manifolds (ILDMs). The stability of ILDMs is found to strongly correlate with protein energies. We propose a novel method for protein thermal uncertainty quantification based on persistently invariant ILDMs. Extensive comparison with experimental data and the state-of-the-art methods in the field validate the proposed new method for protein B-factor prediction.

  11. Design Features and Technology Uncertainties for the Next Generation Nuclear Plant

    SciTech Connect (OSTI)

    John M. Ryskamp; Phil Hildebrandt; Osamu Baba; Ron Ballinger; Robert Brodsky; Hans-Wolfgang Chi; Dennis Crutchfield; Herb Estrada; Jeane-Claude Garnier; Gerald Gordon; Richard Hobbins; Dan Keuter; Marilyn Kray; Philippe Martin; Steve Melancon; Christian Simon; Henry Stone; Robert Varrin; Werner von Lensa

    2004-06-01

    This report presents the conclusions, observations, and recommendations of the Independent Technology Review Group (ITRG) regarding design features and important technology uncertainties associated with very-high-temperature nuclear system concepts for the Next Generation Nuclear Plant (NGNP). The ITRG performed its reviews during the period November 2003 through April 2004.

  12. Uncertainty Analysis of RELAP5-3D

    SciTech Connect (OSTI)

    Alexandra E Gertman; Dr. George L Mesina

    2012-07-01

    As world-wide energy consumption continues to increase, so does the demand for alternative energy sources, such as nuclear energy. Nuclear power plants currently supply over 370 gigawatts of electricity, and more than 60 new nuclear reactors have been commissioned by 15 different countries. The primary concern for nuclear power plant operation and licensing has been safety. Safe operation of nuclear power plants is no simple matter; it involves the training of operators, the design of the reactor, and equipment and design upgrades throughout the lifetime of the reactor. To safely design, operate, and understand nuclear power plants, industry and government alike have relied upon best-estimate simulation codes, which allow an accurate model of any given plant to be created with well-defined margins of safety. The most widely used of these best-estimate simulation codes in the nuclear power industry is RELAP5-3D. Our project focused on improving the modeling capabilities of RELAP5-3D by developing uncertainty estimates for its calculations. This work involved analyzing high-, medium-, and low-ranked phenomena from an INL PIRT on a small-break loss-of-coolant accident as well as an analysis of a large-break loss-of-coolant accident. Statistical analyses were performed using correlation coefficients. To perform the studies, computer programs were written that modify a template RELAP5 input deck to produce one deck for each combination of key input parameters. Python scripting enabled the running of the generated input files with RELAP5-3D on INL's massively parallel cluster system. Data from the studies were collected and analyzed with SAS. A summary of the results of our studies is presented.
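
    The deck-generation step lends itself to a short sketch: one input file per combination of key parameters, produced by placeholder substitution in a template. The placeholder names, card numbers, and parameter values below are hypothetical illustrations, not RELAP5 syntax taken from the project:

        import itertools

        # Hypothetical template fragment and placeholder names; a real RELAP5
        # deck and the project's parameter set will differ.
        template = ("* RELAP5 input deck (fragment)\n"
                    "20100001  brkarea  @BREAK_AREA@\n"
                    "20200001  decayht  @DECAY_HEAT@\n")
        params = {
            "@BREAK_AREA@": [0.8, 1.0, 1.2],     # multipliers on nominal values
            "@DECAY_HEAT@": [0.95, 1.0, 1.05],
        }

        keys = list(params)
        for i, combo in enumerate(itertools.product(*params.values())):
            deck = template
            for key, value in zip(keys, combo):
                deck = deck.replace(key, str(value))
            with open(f"case_{i:04d}.i", "w") as out:   # one deck per combination
                out.write(deck)

    The figure of merit from each completed run can then be correlated against the inputs (e.g., with numpy.corrcoef) to rank parameter influence, mirroring the correlation-coefficient analysis the abstract describes.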

  13. Quantifying uncertainty in material damage from vibrational data

    SciTech Connect (OSTI)

    Butler, T.; Huhtala, A.; Juntunen, M.

    2015-02-15

    The response of a vibrating beam to a force depends on many physical parameters including those determined by material properties. Damage caused by fatigue or cracks results in local reductions in stiffness parameters and may drastically alter the response of the beam. Data obtained from the vibrating beam are often subject to uncertainties and/or errors typically modeled using probability densities. The goal of this paper is to estimate and quantify the uncertainty in damage modeled as a local reduction in stiffness using uncertain data. We present various frameworks and methods for solving this parameter determination problem. We also describe a mathematical analysis to determine and compute useful output data for each method. We apply the various methods in a specified sequence that allows us to interface the various inputs and outputs of these methods in order to enhance the inferences drawn from the numerical results obtained from each method. Numerical results are presented using both simulated and experimentally obtained data from physically damaged beams.

  14. The Uncertainty in the Local Seismic Response Analysis

    SciTech Connect (OSTI)

    Pasculli, A.; Pugliese, A.; Romeo, R. W.; Sano, T.

    2008-07-08

    This paper shows the influence exerted on local seismic response analysis by accounting for dispersion and uncertainty in the seismic input as well as in the dynamic properties of soils. In a first attempt, a 1D numerical model is developed accounting for both the aleatory nature of the input motion and the stochastic variability of the dynamic properties of soils. The seismic input is introduced in a non-conventional way through a power spectral density, for which an elastic response spectrum, derived, for instance, from a conventional seismic hazard analysis, is required with an appropriate level of reliability. The uncertainty in the geotechnical properties of soils is instead investigated through a well-known simulation technique (the Monte Carlo method) for the construction of statistical ensembles. The result of a conventional local seismic response analysis, given by a deterministic elastic response spectrum, is replaced in our approach by a set of statistical elastic response spectra, each one characterized by an appropriate probability of being reached or exceeded. The analyses have been carried out for a well-documented real case study. Lastly, we anticipate a 2D numerical analysis to investigate the spatial variability of soil properties as well.
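
    A minimal sketch of the Monte Carlo idea: sample uncertain soil properties, evaluate a 1D site amplification for each draw, and summarize the ensemble by percentiles instead of a single deterministic curve. The single-layer damped transfer function and all statistics below are illustrative stand-ins, not the paper's model:

        import numpy as np

        rng = np.random.default_rng(1)
        n_mc = 2000
        f = np.linspace(0.2, 20.0, 200)                # frequency axis, Hz

        H = 30.0                                       # layer thickness, m
        # Stochastic soil properties (hypothetical statistics).
        Vs = rng.lognormal(np.log(250.0), 0.2, n_mc)   # shear-wave velocity, m/s
        xi = rng.uniform(0.03, 0.07, n_mc)             # damping ratio

        # Classic damped single-layer amplification:
        # |A| = 1 / sqrt(cos^2(kH) + (xi*kH)^2), with kH = 2*pi*f*H/Vs.
        kH = 2.0 * np.pi * np.outer(f, 1.0 / Vs) * H   # shape (n_freq, n_mc)
        A = 1.0 / np.sqrt(np.cos(kH)**2 + (xi * kH)**2)

        # A set of statistical spectra replaces the single deterministic one.
        for p in (16, 50, 84):
            print(f"p{p} peak amplification: {np.percentile(A.max(axis=0), p):.2f}")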

  15. The role of the uncertainty in code development

    SciTech Connect (OSTI)

    Barre, F.

    1997-07-01

    From a general point of view, all the results of a calculation should be given with their uncertainty. This is of utmost importance in nuclear safety, where the sizing of the safety systems, and therefore the protection of the population and the environment, essentially depends on the calculation results. Until recent years, safety analysis was performed with conservative tools. Two types of criticism can be made. First, conservative margins can be too large, and it may be possible to reduce the cost of the plant or its operation with a best-estimate approach. Second, some of the conservative hypotheses may not really be conservative over the full range of physical events which can occur during an accident. Simpson gives an interesting example: in some cases, overestimating the residual power during a small-break LOCA can lead to an overprediction of the swell level and thus an overprediction of the core cooling, which is the opposite of a conservative prediction. A last question is: does the accumulation of conservative hypotheses for a problem always give a conservative result? Two-phase flow physics, mainly dealing with situations of mechanical and thermal non-equilibrium, is too complicated to answer these questions with simple engineering judgment. The objective of this paper is to review the quantification of the uncertainties which can be made during code development and validation.

  16. Analysis and Reduction of Complex Networks Under Uncertainty.

    SciTech Connect (OSTI)

    Ghanem, Roger G

    2014-07-31

    This effort was a collaboration with Youssef Marzouk of MIT, Omar Knio of Duke University (at the time at Johns Hopkins University), and Habib Najm of Sandia National Laboratories. The objective of this effort was to develop the mathematical and algorithmic capacity to analyze complex networks under uncertainty. Of interest were chemical reaction networks and smart grid networks. The statements of work for USC focused on the development of stochastic reduced models for uncertain networks. The USC team was led by Professor Roger Ghanem and consisted of one graduate student and a postdoc. The contributions completed by the USC team consisted of: 1) methodology and algorithms to address the eigenvalue problem, a problem of significance to the stability of networks under stochastic perturbations; 2) methodology and algorithms to characterize probability measures on graph structures with random flows, an important problem in characterizing random demand (encountered in smart grids) and random degradation (encountered in infrastructure systems), as well as modeling errors in Markov chains (with ubiquitous relevance!); and 3) methodology and algorithms for treating inequalities in uncertain systems, an important problem in the context of models for material failure and network flows under uncertainty, where conditions of failure or flow are described in the form of inequalities between the state variables.

  17. Data Filtering Impact on PV Degradation Rates and Uncertainty (Poster)

    SciTech Connect (OSTI)

    Jordan, D. C.; Kurtz, S. R.

    2012-03-01

    To sustain the commercial success of photovoltaics (PV), it becomes vital to know how power output decreases with time. In order to predict power delivery, degradation rates must be determined accurately. Data filtering, i.e., any treatment of the data prior to assessment of long-term field behavior, is discussed as part of a more comprehensive uncertainty analysis; it can be one of the greatest sources of uncertainty in long-term performance studies. Several distinct filtering methods, such as outlier removal and inclusion of only sunny days, were examined on several different metrics, such as PVUSA, performance ratio, and DC power to plane-of-array irradiance ratio, both uncorrected and temperature-corrected. PVUSA showed the highest sensitivity, while the temperature-corrected power over irradiance ratio was found to be the least sensitive to data filtering conditions. Using this ratio, it is demonstrated that quantification of degradation rates with a statistical accuracy of +/- 0.2%/year within 4 years of field data is possible on two crystalline silicon and two thin-film systems.
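
    To make the rate-extraction step concrete: a degradation rate is the fitted linear trend in a performance metric, and its statistical uncertainty comes from the fit covariance. The sketch uses synthetic monthly data with an assumed -0.5%/yr trend; it is not NREL data:

        import numpy as np

        rng = np.random.default_rng(0)
        t = np.arange(48) / 12.0                 # four years of monthly data
        # Synthetic temperature-corrected power/irradiance ratio: an assumed
        # -0.5 %/yr trend plus measurement noise.
        metric = 1.0 - 0.005 * t + rng.normal(0.0, 0.004, t.size)

        coeffs, cov = np.polyfit(t, metric, 1, cov=True)
        rate = 100.0 * coeffs[0] / coeffs[1]             # %/yr relative to intercept
        u_rate = 100.0 * np.sqrt(cov[0, 0]) / coeffs[1]  # ~1-sigma from the fit
        print(f"degradation rate: {rate:+.2f} +/- {u_rate:.2f} %/yr")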

  18. WE-B-19A-01: SRT II: Uncertainties in SRT

    SciTech Connect (OSTI)

    Dieterich, S; Schlesinger, D; Geneser, S

    2014-06-15

    SRS delivery has undergone major technical changes in the last decade, transitioning from predominantly frame-based treatment delivery to image-guided, frameless SRS. It is important for medical physicists working in SRS to understand the magnitude and sources of uncertainty involved in delivering SRS treatments for a multitude of technologies (Gamma Knife, CyberKnife, linac-based SRS, and protons). Sources of SRS planning and delivery uncertainty include dose calculation, dose fusion, and intra- and inter-fraction motion. Dose calculations for small fields are particularly difficult because of the lack of electronic equilibrium and the greater effect of inhomogeneities within and near the PTV. Going frameless introduces greater setup uncertainties and allows for potentially increased intra- and inter-fraction motion. The increased use of multiple imaging modalities to determine the tumor volume necessitates (deformable) image and contour fusion, and the resulting uncertainties introduced in the image registration process further contribute to overall treatment planning uncertainties. Each of these uncertainties must be quantified and their impact on treatment delivery accuracy understood. If necessary, the uncertainties may then be accounted for during treatment planning, either through techniques to make the uncertainty explicit or by the appropriate addition of PTV margins. Further complicating matters, the statistics of 1-5 fraction SRS treatments differ from traditional margin recipes relying on Poisson statistics. In this session, we will discuss uncertainties introduced during each step of the SRS treatment planning and delivery process and present margin recipes to appropriately account for such uncertainties. Learning Objectives: To understand the major contributors to the total delivery uncertainty in SRS for Gamma Knife, CyberKnife, and linac-based SRS. Learn the various uncertainties introduced by image fusion, deformable image registration, and contouring

  19. Quantification of initial-data uncertainty on a shock-accelerated gas cylinder

    SciTech Connect (OSTI)

    Tritschler, V. K.; Avdonin, A.; Hickel, S.; Hu, X. Y.; Adams, N. A.

    2014-02-15

    We quantify initial-data uncertainties on a shock-accelerated heavy-gas cylinder by two-dimensional well-resolved direct numerical simulations. A high-resolution compressible multicomponent flow simulation model is coupled with a polynomial chaos expansion to propagate the initial-data uncertainties to the output quantities of interest. The initial flow configuration follows previous experimental and numerical works on the shock-accelerated heavy-gas cylinder. We investigate three main initial-data uncertainties: (i) shock Mach number, (ii) contamination of SF{sub 6} with acetone, and (iii) initial deviations of the heavy-gas region from a perfect cylindrical shape. The impact of initial-data uncertainties on the mixing process is examined. The results suggest that the mixing process is highly sensitive to input variations of shock Mach number and acetone contamination. Additionally, our results indicate that the measured shock Mach number in the experiment of Tomkins et al. [An experimental investigation of mixing mechanisms in shock-accelerated flow, J. Fluid Mech. 611, 131 (2008)] and the estimated contamination of the SF{sub 6} region with acetone [S. K. Shankar, S. Kawai, and S. K. Lele, Two-dimensional viscous flow simulation of a shock accelerated heavy gas cylinder, Phys. Fluids 23, 024102 (2011)] exhibit deviations from those that lead to best agreement between our simulations and the experiment in terms of overall flow evolution.
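
    The propagation machinery can be illustrated with a toy non-intrusive polynomial chaos expansion: project a scalar quantity of interest, regarded as a function of one standard normal input, onto probabilists' Hermite polynomials by Gauss-Hermite quadrature. The exponential QoI below is a hypothetical stand-in for the simulated mixing quantities:

        import numpy as np
        from math import factorial
        from numpy.polynomial import hermite_e as He

        # Toy quantity of interest as a function of a standard normal input xi;
        # in the paper the QoI comes from the flow solver, not a formula.
        f = lambda xi: np.exp(0.3 * xi)

        order = 6
        pts, wts = He.hermegauss(order + 4)      # Gauss-Hermite quadrature nodes
        wts = wts / np.sqrt(2.0 * np.pi)         # normalize to the N(0,1) density

        # Spectral projection onto probabilists' Hermite polynomials He_n.
        c = np.array([np.sum(wts * f(pts) * He.hermeval(pts, [0]*n + [1]))
                      / factorial(n) for n in range(order + 1)])

        mean = c[0]
        var = sum(factorial(n) * c[n]**2 for n in range(1, order + 1))
        print(f"PCE mean {mean:.4f}, std {np.sqrt(var):.4f}")
        # Exact lognormal moments for comparison (a = 0.3).
        a = 0.3
        print(f"exact mean {np.exp(a**2/2):.4f}, "
              f"std {np.sqrt((np.exp(a**2)-1)*np.exp(a**2)):.4f}")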

  20. Remediation of the Faultless Underground Nuclear Test: Moving Forward in the Face of Model Uncertainty

    SciTech Connect (OSTI)

    Chapman, J. B.; Pohlmann, K.; Pohll, G.; Hassan, A.; Sanders, P.; Sanchez, M.; Jaunarajs, S.

    2002-02-26

    The Faultless underground nuclear test, conducted in central Nevada, is the site of an ongoing environmental remediation effort that has successfully progressed through numerous technical challenges due to close cooperation between the U.S. Department of Energy (DOE) National Nuclear Security Administration and the State of Nevada Division of Environmental Protection (NDEP). The challenges faced at this site are similar to those of many other sites of groundwater contamination: substantial uncertainties due to the relative lack of data from a highly heterogeneous subsurface environment. Knowing when, where, and how to devote the often enormous resources needed to collect new data is a common problem, and one that can cause remediators and regulators to disagree and stall progress toward closing sites. For Faultless, a variety of numerical modeling techniques and statistical tools are used to provide the information needed for DOE and NDEP to confidently move forward along the remediation path to site closure. A general framework for remediation was established in an agreement and consent order between DOE and the State of Nevada that recognized that no cost-effective technology currently exists to remove the source of contaminants in nuclear cavities. Rather, the emphasis of the corrective action is on identifying the impacted groundwater resource and ensuring protection of human health and the environment from the contamination through monitoring. As a result, groundwater flow and transport modeling is the linchpin in the remediation effort. An early issue was whether or not new site data should be collected via drilling and testing prior to modeling. After several iterations of the Corrective Action Investigation Plan, all parties agreed that sufficient data existed to support a flow and transport model for the site. Though several aspects of uncertainty were included in the subsequent modeling work, concerns remained regarding uncertainty in individual

  1. Uncertainty and sensitivity analysis of fission gas behavior in engineering-scale fuel modeling

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Pastore, Giovanni; Swiler, L. P.; Hales, Jason D.; Novascone, Stephen R.; Perez, Danielle M.; Spencer, Benjamin W.; Luzzi, Lelio; Uffelen, Paul Van; Williamson, Richard L.

    2014-10-12

    The role of uncertainties in fission gas behavior calculations as part of engineering-scale nuclear fuel modeling is investigated using the BISON fuel performance code and a recently implemented physics-based model for the coupled fission gas release and swelling. Through the integration of BISON with the DAKOTA software, a sensitivity analysis of the results to selected model parameters is carried out based on UO2 single-pellet simulations covering different power regimes. The parameters are varied within ranges representative of the relative uncertainties and consistent with the information from the open literature. The study leads to an initial quantitative assessment of the uncertainty in fission gas behavior modeling with the parameter characterization presently available. Also, the relative importance of the single parameters is evaluated. Moreover, a sensitivity analysis is carried out based on simulations of a fuel rod irradiation experiment, pointing out a significant impact of the considered uncertainties on the calculated fission gas release and cladding diametral strain. The results of the study indicate that the commonly accepted deviation between calculated and measured fission gas release by a factor of 2 approximately corresponds to the inherent modeling uncertainty at high fission gas release. Nevertheless, higher deviations may be expected for values around 10% and lower. Implications are discussed in terms of directions of research for the improved modeling of fission gas behavior for engineering purposes.

  2. Method to Calculate Uncertainty Estimate of Measuring Shortwave Solar Irradiance using Thermopile and Semiconductor Solar Radiometers

    SciTech Connect (OSTI)

    Reda, I.

    2011-07-01

    The uncertainty of measuring solar irradiance is fundamentally important for solar energy and atmospheric science applications. Without an uncertainty statement, the quality of a result, model, or testing method cannot be quantified, the chain of traceability is broken, and confidence cannot be maintained in the measurement. Measurement results are incomplete and meaningless without a statement of the estimated uncertainty with traceability to the International System of Units (SI) or to another internationally recognized standard. This report explains how to use the Guide to the Expression of Uncertainty in Measurement (GUM) to calculate such uncertainty. The report also shows that without appropriate corrections to solar measuring instruments (solar radiometers), the uncertainty of measuring shortwave solar irradiance can exceed 4% using present state-of-the-art pyranometers and 2.7% using present state-of-the-art pyrheliometers. Finally, the report demonstrates that by applying the appropriate corrections, uncertainties may be reduced by at least 50%. The uncertainties, with or without the appropriate corrections, might not be compatible with the needs of solar energy and atmospheric science applications; yet, this report may shed some light on the sources of uncertainties and the means to reduce overall uncertainty in measuring solar irradiance.
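
    A minimal GUM-style sketch of the before/after-correction comparison: combine component standard uncertainties in quadrature and expand with a coverage factor, then shrink the correctable component. Every component name and value is hypothetical, not taken from the report:

        import math

        # GUM-style combination for a pyranometer irradiance measurement.
        u = {
            "calibration":     1.0,  # type B, %
            "zenith response": 2.0,  # type B, %; largely removable by correction
            "spectral":        0.5,
            "temperature":     0.4,
            "data logger":     0.2,
            "repeatability":   0.3,  # type A, from repeated readings
        }

        def expanded(components):
            u_c = math.sqrt(sum(v**2 for v in components.values()))
            return 1.96 * u_c        # coverage factor for ~95% confidence

        print(f"U95 without corrections: {expanded(u):.2f} %")
        u["zenith response"] = 0.3   # residual after applying the correction
        print(f"U95 with corrections:    {expanded(u):.2f} %")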

  3. Science based stockpile stewardship, uncertainty quantification, and fission fragment beams

    SciTech Connect (OSTI)

    Stoyer, M A; McNabb, D; Burke, J; Bernstein, L A; Wu, C Y

    2009-09-14

    Stewardship of this nation's nuclear weapons is predicated on developing a fundamental scientific understanding of the physics and chemistry required to describe weapon performance without the need to resort to underground nuclear testing, and to predict expected future performance as a result of intended or unintended modifications. In order to construct more reliable models, underground nuclear test data are being reanalyzed in novel ways. The extent to which underground experimental data can be matched with simulations is one measure of the credibility of our capability to predict weapon performance. To improve the interpretation of these experiments with quantified uncertainties, improved nuclear data are required. As an example, the fission yield of a device was often determined by measuring fission products. Conversion of the measured fission products to yield was accomplished through explosion code calculations (models) and a good set of nuclear reaction cross-sections. Because of the unique high-fluence environment of an exploding nuclear weapon, many reactions occurred on radioactive nuclides, for which only theoretically calculated cross-sections are available. Inverse kinematics reactions at CARIBU offer the opportunity to measure cross-sections on unstable neutron-rich fission fragments and thus improve the quality of the nuclear reaction cross-section sets. One of the fission products measured was {sup 95}Zr, the accumulation of all mass 95 fission products of Y, Sr, Rb and Kr (see Fig. 1). Subsequent neutron-induced reactions on these short-lived fission products were assumed to cancel out; in other words, the destruction of mass 95 nuclides was more or less equal to the production of mass 95 nuclides. If a {sup 95}Sr was destroyed by an (n,2n) reaction, it was also produced by (n,2n) reactions on {sup 96}Sr, for example. However, since these nuclides all have fairly short half-lives (seconds to minutes or even less), no experimental nuclear reaction

  4. Uncertainty of silicon 1-MeV damage function

    SciTech Connect (OSTI)

    Danjaji, M.B.; Griffin, P.J.

    1997-02-01

    The electronics radiation hardness-testing community uses the ASTM E722-93 Standard Practice to define the energy dependence of the nonionizing neutron damage to silicon semiconductors. This neutron displacement damage response function is defined to be equal to the silicon displacement kerma as calculated from the ORNL Si cross-section evaluation. Experimental work has shown that observed damage ratios at various test facilities agree with the defined response function to within 5%. Here, a covariance matrix for the silicon 1-MeV neutron displacement damage function is developed. This uncertainty data will support the electronic radiation hardness-testing community and will permit silicon displacement damage sensors to be used in least squares spectrum adjustment codes.
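
    A covariance matrix of this kind is used by folding it with a neutron spectrum: for a spectrum-averaged damage response D = sum_i phi_i d_i, the variance follows the sandwich rule var(D) = phi^T C phi. The three-group numbers below are hypothetical, chosen only to show the mechanics:

        import numpy as np

        # Hypothetical 3-group example of folding a damage-function covariance
        # with a spectrum.
        phi = np.array([0.2, 0.5, 0.3])        # group fluences (arbitrary units)
        d = np.array([1.0, 2.5, 4.0])          # damage response per group

        rel_sd = np.array([0.04, 0.03, 0.05])  # relative standard deviations
        corr = np.array([[1.0, 0.6, 0.3],      # assumed inter-group correlations
                         [0.6, 1.0, 0.6],
                         [0.3, 0.6, 1.0]])
        C = np.outer(rel_sd * d, rel_sd * d) * corr   # absolute covariance matrix

        D = phi @ d
        u_D = np.sqrt(phi @ C @ phi)
        print(f"spectrum-averaged damage: {D:.3f} +/- {u_D:.3f} ({100*u_D/D:.1f} %)")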

  5. Uncertainty analysis for probabilistic pipe fracture evaluations in LBB applications

    SciTech Connect (OSTI)

    Rahman, S.; Ghadiali, N.; Wilkowski, G.

    1997-04-01

    During the NRC's Short Cracks in Piping and Piping Welds Program at Battelle, a probabilistic methodology was developed to conduct fracture evaluations of circumferentially cracked pipes for application to leak-rate detection. Later, in the IPIRG-2 program, several parameters that may affect leak-before-break and other pipe flaw evaluations were identified. This paper presents new results from several uncertainty analyses to evaluate the effects of normal operating stresses, normal plus safe-shutdown earthquake stresses, off-centered cracks, restraint of pressure-induced bending, and dynamic and cyclic loading rates on the conditional failure probability of piping systems in BWRs and PWRs. For each parameter, the sensitivity of the conditional failure probability, and hence its importance to probabilistic leak-before-break evaluations, was determined.

  6. Long-term energy generation planning under uncertainty

    SciTech Connect (OSTI)

    Escudero, L.F.; Paradinas, I.; Salmeron, J.; Sanchez, M.

    1998-07-01

    In this work the authors deal with the hydro-thermal coordination problem under uncertainty in generator availability, fuel costs, exogenous water inflow, and energy demand. The objective is to minimize the system operating cost. The decision variables are the fuel procurement for each thermal generation site, the energy generated by each thermal and hydro generator, and the released and spilled water from reservoirs. Control variables are the stored water in reservoirs and the stored fuel in thermal plants at the end of each time period. The main contribution lies in the simultaneous inclusion of the hydro-network and thermal-generation constraints, as well as the stochastic nature of the aforementioned parameters. The authors report their computational experience on real problems drawn from the Spanish hydro-thermal generation system. One test case includes 85 generators (42 thermal plants with a total capacity of 27,084 MW) and 57 reservoirs.

  7. Uncertainty in terahertz time-domain spectroscopy measurement

    SciTech Connect (OSTI)

    Withayachumnankul, Withawat; Fischer, Bernd M.; Lin Hungyen; Abbott, Derek

    2008-06-15

    Measurements of optical constants at terahertz--or T-ray--frequencies have been performed extensively using terahertz time-domain spectroscopy (THz-TDS). Spectrometers, together with physical models explaining the interaction between a sample and T-ray radiation, are progressively being developed. Nevertheless, measurement errors in the optical constants, so far, have not been systematically analyzed. This situation calls for a comprehensive analysis of measurement uncertainty in THz-TDS systems. The sources of error existing in a terahertz spectrometer and throughout the parameter estimation process are identified. The analysis herein quantifies the impact of each source on the output optical constants. The resulting analytical model is evaluated against experimental THz-TDS data.

  8. Effect of fluctuation measures on the uncertainty relations between two observables: Different measures lead to opposite conclusions

    SciTech Connect (OSTI)

    Luis, Alfredo

    2011-09-15

    We show within a very simple framework that different measures of fluctuations lead to uncertainty relations resulting in contradictory conclusions. More specifically, we focus on Tsallis and Renyi entropic uncertainty relations, and we find that the minimum joint uncertainty states for some fluctuation measures are the maximum joint uncertainty states of other fluctuation measures, and vice versa.
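
    The abstract's point can be reproduced with two discrete toy distributions: Shannon entropy (the alpha -> 1 Renyi limit) ranks one as more uncertain, while a higher-order Renyi entropy reverses the ranking. The distributions are hypothetical, chosen only to exhibit the reversal:

        import numpy as np

        def renyi(p, alpha):
            """Renyi entropy of order alpha (Shannon as alpha -> 1)."""
            p = np.asarray(p, dtype=float)
            p = p[p > 0]
            if np.isclose(alpha, 1.0):
                return -np.sum(p * np.log(p))
            return np.log(np.sum(p**alpha)) / (1.0 - alpha)

        # Two hypothetical outcome distributions.
        p = [0.5, 0.5, 0.0]
        q = [0.6, 0.2, 0.2]

        for a in (1.0, 10.0):
            print(f"alpha={a:4}: H(p)={renyi(p, a):.3f}  H(q)={renyi(q, a):.3f}")
        # alpha=1 (Shannon) ranks q as more uncertain than p; alpha=10 reverses
        # the ranking, so the two measures draw opposite conclusions.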

  9. Principles and applications of measurement and uncertainty analysis in research and calibration

    SciTech Connect (OSTI)

    Wells, C.V.

    1992-11-01

    Interest in Measurement Uncertainty Analysis has grown in the past several years as it has spread to new fields of application, and research and development of uncertainty methodologies have continued. This paper discusses the subject from the perspectives of both research and calibration environments. It presents a history of the development and an overview of the principles of uncertainty analysis embodied in the United States National Standard, ANSI/ASME PTC 19.1-1985, Measurement Uncertainty. Examples are presented in which uncertainty analysis was utilized or is needed to gain further knowledge of a particular measurement process and to characterize final results. Measurement uncertainty analysis provides a quantitative estimate of the interval about a measured value or an experiment result within which the true value of that quantity is expected to lie. Years ago, Harry Ku of the United States National Bureau of Standards stated that "The informational content of the statement of uncertainty determines, to a large extent, the worth of the calibrated value." Today, that statement is just as true about calibration or research results as it was in 1968. Why is that true? What kind of information should we include in a statement of uncertainty accompanying a calibrated value? How and where do we get the information to include in an uncertainty statement? How should we interpret and use measurement uncertainty information? This discussion will provide answers to these and other questions about uncertainty in research and in calibration. The methodology to be described has been developed by national and international groups over the past nearly thirty years, and individuals were publishing information even earlier. Yet the work is largely unknown in many science and engineering arenas. I will illustrate various aspects of uncertainty analysis with some examples drawn from the radiometry measurement and calibration discipline from research activities.

  10. Final Report: DOE Project: DE-SC-0005399 Linking the uncertainty...

    Office of Scientific and Technical Information (OSTI)

    Final report for DOE Project DE-SC-0005399, "Linking the uncertainty of low frequency variability in tropical forcing in regional climate change."

  11. Uncertainty Analysis of Certified Photovoltaic Measurements at the National Renewable Energy Laboratory

    SciTech Connect (OSTI)

    Emery, K.

    2009-08-01

    Discusses the NREL Photovoltaic Cell and Module Performance Characterization Group's procedures to achieve the lowest practical uncertainty in measuring PV performance with respect to reference conditions.

  12. MASS MEASUREMENT UNCERTAINTY FOR PLUTONIUM ALIQUOTS ASSAYED BY CONTROLLED-POTENTIAL COULOMETRY

    SciTech Connect (OSTI)

    Holland, M.; Cordaro, J.

    2009-03-18

    Minimizing plutonium measurement uncertainty is essential to nuclear material control and international safeguards. In 2005, the International Organization for Standardization (ISO) published ISO 12183 'Controlled-potential coulometric assay of plutonium', 2nd edition. ISO 12183:2005 recommends a target of {+-}0.01% for the mass of original sample in the aliquot because it is a critical assay variable. Mass measurements in radiological containment were evaluated and uncertainties estimated. The uncertainty estimate for the mass measurement also includes uncertainty in correcting for buoyancy effects from air acting as a fluid and from decreased pressure of heated air from the specific heat of the plutonium isotopes.
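
    For scale, the conventional buoyancy correction compares the densities of the air, the balance calibration weights, and the sample: m = R(1 - rho_air/rho_cal)/(1 - rho_air/rho_sample). All values below are illustrative, not from the report:

        # Conventional buoyancy correction for an aliquot weighing (illustrative).
        R = 1.000000        # balance reading, g
        rho_air = 0.00116   # air density, g/cm^3 (lower if heated by the sample)
        rho_cal = 8.0       # density of balance calibration weights, g/cm^3
        rho_soln = 1.25     # density of the plutonium solution, g/cm^3

        m = R * (1 - rho_air / rho_cal) / (1 - rho_air / rho_soln)
        print(f"corrected mass: {m:.6f} g ({1e6 * (m - R) / R:+.0f} ppm)")
        # The shift (~0.08%) is several times the +/-0.01% mass target, so the
        # correction and its own uncertainty must be handled explicitly.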

  13. Uncertainty analysis of an IGCC system with single-stage entrained-flow gasifier

    SciTech Connect (OSTI)

    Shastri, Y.; Diwekar, U.; Zitney, S.

    2008-01-01

    Integrated Gasification Combined Cycle (IGCC) systems using coal gasification are an attractive option for future energy plants. Consequently, understanding the system operation and optimizing gasifier performance in the presence of uncertain operating conditions is essential to extract the maximum benefits from the system. This work focuses on conducting such a study using an IGCC process simulation and a high-fidelity gasifier simulation coupled with stochastic simulation and multi-objective optimization capabilities. Coal gasifiers are the necessary basis of IGCC systems, and hence effective modeling and uncertainty analysis of the gasification process constitute an important element of overall IGCC process design and operation. In this work, an Aspen Plus{reg_sign} steady-state process model of an IGCC system with carbon capture enables us to conduct simulation studies so that the effect of gasification variability on the whole process can be understood. The IGCC plant design consists of a single-stage entrained-flow gasifier, a physical solvent-based acid gas removal process for carbon capture, two model-7FB combustion turbine generators, two heat recovery steam generators, and one steam turbine generator in a multi-shaft 2x2x1 configuration. In the Aspen Plus process simulation, the gasifier is represented as a simplified lumped-parameter, restricted-equilibrium reactor model. In this work, we also make use of a distributed-parameter FLUENT{reg_sign} computational fluid dynamics (CFD) model to characterize the uncertainty for the entrained-flow gasifier. The CFD-based gasifier model is much more comprehensive, predictive, and hence better suited to understand the effects of uncertainty. The possible uncertain parameters of the gasifier model are identified. These include the input coal composition as well as mass flow rates of coal, slurry water, and oxidant. Using a selected number of random (Monte Carlo) samples for the different parameters, the CFD model is

  14. Cassini Spacecraft Uncertainty Analysis Data and Methodology Review and Update/Volume 1: Updated Parameter Uncertainty Models for the Consequence Analysis

    SciTech Connect (OSTI)

    WHEELER, TIMOTHY A.; WYSS, GREGORY D.; HARPER, FREDERICK T.

    2000-11-01

    Uncertainty distributions for specific parameters of the Cassini General Purpose Heat Source Radioisotope Thermoelectric Generator (GPHS-RTG) Final Safety Analysis Report consequence risk analysis were revised and updated. The revisions and updates were done for all consequence parameters for which relevant information exists from the joint project on Probabilistic Accident Consequence Uncertainty Analysis by the United States Nuclear Regulatory Commission and the Commission of European Communities.

  15. CALiPER Exploratory Study: Accounting for Uncertainty in Lumen Measurements

    SciTech Connect (OSTI)

    Bergman, Rolf; Paget, Maria L.; Richman, Eric E.

    2011-03-31

    With a well-defined and shared understanding of uncertainty in lumen measurements, testing laboratories can better evaluate their processes, contributing to greater consistency and credibility of lighting testing, a key component of the U.S. Department of Energy (DOE) Commercially Available LED Product Evaluation and Reporting (CALiPER) program. Reliable lighting testing is a crucial underlying factor contributing toward the success of many energy-efficient lighting efforts, such as the DOE GATEWAY demonstrations, the Lighting Facts Label, ENERGY STAR energy-efficient lighting programs, and many others. Uncertainty in measurements is inherent to all testing methodologies, including photometric and other lighting-related testing. Uncertainty exists for all equipment, processes, and systems of measurement, in individual as well as combined ways. A major issue with testing and the resulting accuracy of the tests is the uncertainty of the complete process. Individual equipment uncertainties are typically identified, but their relative value in practice and their combined value with other equipment and processes in the same test are elusive concepts, particularly for complex types of testing such as photometry. The total combined uncertainty of a measurement result is important for repeatable and comparative measurements for light-emitting diode (LED) products in comparison with other technologies as well as competing products. This study provides a detailed and step-by-step method for determining uncertainty in lumen measurements, working closely with related standards efforts and key industry experts. This report uses the structure proposed in the Guide to the Expression of Uncertainty in Measurement (GUM) for evaluating and expressing uncertainty in measurements. The steps of the procedure are described and a spreadsheet format adapted for integrating sphere and goniophotometric uncertainty measurements is provided for entering parameters, ordering the information, calculating intermediate

  16. Assessment of uncertainties in radiation-induced cancer risk predictions at clinically relevant doses

    SciTech Connect (OSTI)

    Nguyen, J.; Moteabbed, M.; Paganetti, H.

    2015-01-15

    Purpose: Theoretical dose–response models offer the possibility to assess second cancer induction risks after external beam therapy. The parameters used in these models are determined with limited data from epidemiological studies. Risk estimations are thus associated with considerable uncertainties. This study aims at illustrating the uncertainties when predicting the risk for organ-specific second cancers in the primary radiation field, using selected treatment plans for brain cancer patients as examples. Methods: A widely used risk model was considered in this study. The uncertainties of the model parameters were estimated with reported data of second cancer incidences for various organs. Standard error propagation was then applied to assess the uncertainty in the risk model. Next, second cancer risks of five pediatric patients treated for cancer in the head and neck regions were calculated. For each case, treatment plans for proton and photon therapy were designed to estimate the uncertainties (a) in the lifetime attributable risk (LAR) for a given treatment modality and (b) when comparing risks of two different treatment modalities. Results: Uncertainties in excess of 100% of the risk were found for almost all organs considered. When applied to treatment plans, the calculated LAR values have uncertainties of the same magnitude. A comparison between cancer risks of different treatment modalities, however, does allow statistically significant conclusions. In the studied cases, the patient averaged LAR ratio of proton and photon treatments was 0.35, 0.56, and 0.59 for brain carcinoma, brain sarcoma, and bone sarcoma, respectively. Their corresponding uncertainties were estimated to be potentially below 5%, depending on uncertainties in dosimetry. Conclusions: The uncertainty in the dose–response curve in cancer risk models makes it currently impractical to predict the risk for an individual external beam treatment. On the other hand, the ratio
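
    The closing observation (large uncertainty in absolute risk, small uncertainty in the modality ratio) follows because the dose-response parameters are common to both plans and largely cancel in the ratio. A hypothetical numerical sketch, with made-up dose factors and uncertainty magnitudes:

        import numpy as np

        rng = np.random.default_rng(7)
        n = 100_000

        # The dose-response parameter is shared by both plans (fully correlated),
        # so it cancels in the ratio; small independent dosimetry errors remain.
        alpha = rng.lognormal(0.0, 0.7, n)                   # ~100% model uncertainty
        lar_proton = 0.8 * alpha * rng.normal(1.0, 0.02, n)  # hypothetical dose factors
        lar_photon = 2.0 * alpha * rng.normal(1.0, 0.02, n)

        print("rel. sd of single-plan LAR:",
              round(np.std(lar_proton) / np.mean(lar_proton), 3))
        ratio = lar_proton / lar_photon
        print("ratio mean:", round(np.mean(ratio), 3),
              " rel. sd:", round(np.std(ratio) / np.mean(ratio), 3))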

  17. Area 2: Inexpensive Monitoring and Uncertainty Assessment of CO2 Plume Migration using Injection Data

    SciTech Connect (OSTI)

    Srinivasan, Sanjay

    2014-09-30

    In-depth understanding of the long-term fate of CO₂ in the subsurface requires study and analysis of the reservoir formation, the overlying caprock formation, and adjacent faults. Because there is significant uncertainty in predicting the location and extent of geologic heterogeneity that can impact the future migration of CO₂ in the subsurface, there is a need to develop algorithms that can reliably quantify this uncertainty in plume migration. This project is focused on the development of a model selection algorithm that refines an initial suite of subsurface models representing the prior uncertainty to create a posterior set of subsurface models that reflect injection performance consistent with that observed. Such posterior models can be used to represent uncertainty in the future migration of the CO₂ plume. Because only injection data is required, the method provides a very inexpensive way to map the migration of the plume and the associated uncertainty in migration paths. The model selection method developed as part of this project mainly consists of assessing the connectivity/dynamic characteristics of a large prior ensemble of models, grouping the models on the basis of their expected dynamic response, selecting the subgroup of models whose dynamic response is closest to the observed dynamic data, and finally quantifying the uncertainty in plume migration using the selected subset of models. The main accomplishment of the project is the development of a software module within the SGEMS earth modeling software package that implements the model selection methodology. This software module was subsequently applied to analyze CO₂ plume migration in two field projects – the In Salah CO₂ Injection project in Algeria and CO₂ injection into the Utsira formation in Norway. These applications of the software revealed that the proxies developed in this project for quickly assessing the dynamic characteristics of the reservoir were

  18. Mesh refinement for uncertainty quantification through model reduction

    SciTech Connect (OSTI)

    Li, Jing; Stinis, Panos

    2015-01-01

    We present a novel way of deciding when and where to refine a mesh in probability space in order to facilitate uncertainty quantification in the presence of discontinuities in random space. A discontinuity in random space makes the application of generalized polynomial chaos expansion techniques prohibitively expensive. The reason is that for discontinuous problems, the expansion converges very slowly. An alternative to using higher terms in the expansion is to divide the random space in smaller elements where a lower degree polynomial is adequate to describe the randomness. In general, the partition of the random space is a dynamic process since some areas of the random space, particularly around the discontinuity, need more refinement than others as time evolves. In the current work we propose a way to decide when and where to refine the random space mesh based on the use of a reduced model. The idea is that a good reduced model can monitor accurately, within a random space element, the cascade of activity to higher degree terms in the chaos expansion. In turn, this facilitates the efficient allocation of computational resources to the areas of random space where they are more needed. For the Kraichnan–Orszag system, the prototypical system to study discontinuities in random space, we present theoretical results which show why the proposed method is sound and numerical results which corroborate the theory.

  19. A preliminary study to Assess Model Uncertainties in Fluid Flows

    SciTech Connect (OSTI)

    Marc Oliver Delchini; Jean C. Ragusa

    2009-09-01

    The goal of this study is to assess the impact of various flow models for a simplified primary coolant loop of a light water nuclear reactor. The various fluid flow models are based on the Euler equations with an additional friction term, gravity term, momentum source, and energy source. The geometric model is purposefully chosen simple and consists of a one-dimensional (1D) loop system in order to focus the study on the validity of various fluid flow approximations. The 1D loop system is represented by a rectangle; the fluid is heated up along one of the vertical legs and cooled down along the opposite leg. A pressurizer and a pump are included in the horizontal legs. The amount of energy transferred and removed from the system is equal in absolute value along the two vertical legs. The various fluid flow approximations are compressible vs. incompressible, and the complete momentum equation vs. Darcy's approximation. The ultimate goal is to compute the fluid flow models' uncertainties and, if possible, to generate validity ranges for these models when applied to reactor analysis. We also limit this study to single-phase flows with low Mach numbers. As a result, sound waves carry a very small amount of energy in this particular case. A standard finite volume method is used for the spatial discretization of the system.

  1. Characterization, propagation and analysis of aleatory and epistemic uncertainty in the 2008 performance assessment for the proposed repository for radioactive waste at Yucca Mountain, Nevada.

    SciTech Connect (OSTI)

    Helton, Jon Craig; Sallaberry, Cedric M.; Hansen, Clifford W.

    2010-10-01

    The 2008 performance assessment (PA) for the proposed repository for high-level radioactive waste at Yucca Mountain (YM), Nevada, illustrates the conceptual structure of risk assessments for complex systems. The 2008 YM PA is based on the following three conceptual entities: a probability space that characterizes aleatory uncertainty; a function that predicts consequences for individual elements of the sample space for aleatory uncertainty; and a probability space that characterizes epistemic uncertainty. These entities and their use in the characterization, propagation and analysis of aleatory and epistemic uncertainty are described and illustrated with results from the 2008 YM PA.
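
    The three-entity structure maps naturally onto nested sampling: an outer loop over epistemic parameter vectors and an inner loop over aleatory futures, with the consequence function evaluated inside. The sketch below uses a made-up consequence function and distributions purely to show the mechanics; the actual PA uses Latin hypercube sampling and far richer models:

        import numpy as np

        rng = np.random.default_rng(42)

        # Stand-in consequence function; the PA evaluates full system models here.
        def consequence(events, rate, scale):
            return scale * rate * events

        n_epistemic, n_aleatory = 200, 2000
        exceedance = []
        for _ in range(n_epistemic):                  # epistemic (outer) loop
            rate = rng.uniform(0.5, 2.0)              # uncertain-but-fixed parameters
            scale = rng.lognormal(0.0, 0.5)
            events = rng.exponential(1.0, n_aleatory) # aleatory futures (inner loop)
            doses = consequence(events, rate, scale)
            exceedance.append(np.mean(doses > 1.0))   # P(consequence > threshold)

        # The spread across the outer loop expresses epistemic uncertainty in
        # an aleatory exceedance probability.
        print("median P(exceed):", np.median(exceedance))
        print("5th-95th percentiles:", np.percentile(exceedance, [5, 95]))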

  2. Optimal Extraction of Cosmological Information from Supernova Datain the Presence of Calibration Uncertainties

    SciTech Connect (OSTI)

    Kim, Alex G.; Miquel, Ramon

    2005-09-26

    We present a new technique to extract the cosmological information from high-redshift supernova data in the presence of calibration errors and extinction due to dust. While in the traditional technique the distance modulus of each supernova is determined separately, in our approach we determine all distance moduli at once, in a process that achieves a significant degree of self-calibration. The result is a much reduced sensitivity of the cosmological parameters to the calibration uncertainties. As an example, for a strawman mission similar to that outlined in the SNAP satellite proposal, the increased precision obtained with the new approach is roughly equivalent to a factor of five decrease in the calibration uncertainty.

  3. Uncertainty in Resilience to Climate Change in India and Indian States

    SciTech Connect (OSTI)

    Malone, Elizabeth L.; Brenkert, Antoinette L.

    2008-10-03

    This study builds on an earlier analysis of resilience of India and Indian states to climate change. The previous study (Brenkert and Malone 2005) assessed current resilience; this research uses the Vulnerability-Resilience Indicators Model (VRIM) to project resilience to 2095 and to perform an uncertainty analysis on the deterministic results. Projections utilized two SRES-based scenarios, one with fast-and-high growth, one with delayed growth. A detailed comparison of two states, the Punjab and Orissa, points to the kinds of insights that can be obtained using the VRIM. The scenarios differ most significantly in the timing of the uncertainty in economic prosperity (represented by GDP per capita) as a major factor in explaining the uncertainty in the resilience index. In the fast-and-high growth scenario the states differ most markedly regarding the role of ecosystem sensitivity, land use and water availability. The uncertainty analysis shows, for example, that resilience in the Punjab might be enhanced, especially in the delayed growth scenario, if early attention is paid to the impact of ecosystems sensitivity on environmental well-being of the state. By the same token, later in the century land-use pressures might be avoided if land is managed through intensification rather than extensification of agricultural land. Thus, this methodology illustrates how a policy maker can be informed about where to focus attention on specific issues, by understanding the potential changes at a specific location and time – and, thus, what might yield desired outcomes. Model results can point to further analyses of the potential for resilience-building.

  4. Effect of the Generalized Uncertainty Principle on post-inflation preheating

    SciTech Connect (OSTI)

    Chemissany, Wissam; Das, Saurya; Ali, Ahmed Farag; Vagenas, Elias C. E-mail: saurya.das@uleth.ca E-mail: evagenas@academyofathens.gr

    2011-12-01

    We examine effects of the Generalized Uncertainty Principle, predicted by various theories of quantum gravity to replace Heisenberg's uncertainty principle near the Planck scale, on post-inflation preheating in cosmology, and show that it can predict either an increase or a decrease in parametric resonance and a corresponding change in particle production. Possible implications are considered.

  5. Using Uncertainty Analysis to Guide the Development of Accelerated Stress Tests (Presentation)

    SciTech Connect (OSTI)

    Kempe, M.

    2014-03-01

    Extrapolation of accelerated testing to the long-term results expected in the field has uncertainty associated with the acceleration factors and the range of possible stresses in the field. When multiple stresses (such as temperature and humidity) can be used to increase the acceleration, the uncertainty may be reduced according to which stress factors are used to accelerate the degradation.

  6. Users manual for the FORSS sensitivity and uncertainty analysis code system

    SciTech Connect (OSTI)

    Lucius, J.L.; Weisbin, C.R.; Marable, J.H.; Drischler, J.D.; Wright, R.Q.; White, J.E.

    1981-01-01

    FORSS is a code system used to study relationships between nuclear reaction cross sections, integral experiments, reactor performance parameter predictions and associated uncertainties. This report describes the computing environment and the modules currently used to implement FORSS Sensitivity and Uncertainty Methodology.

  7. TOTAL MEASUREMENT UNCERTAINTY IN HOLDUP MEASUREMENTS AT THE PLUTONIUM FINISHING PLANT (PFP)

    SciTech Connect (OSTI)

    KEELE, B.D.

    2007-07-05

    An approach to determine the total measurement uncertainty (TMU) associated with Generalized Geometry Holdup (GGH) [1,2,3] measurements was developed and implemented in 2004 and 2005 [4]. This paper describes a condensed version of the TMU calculational model, including recent developments. Recent modifications to the TMU calculation model include a change in the attenuation uncertainty, clarifying the definition of the forward background uncertainty, reducing conservatism in the random uncertainty by selecting either a propagation of counting statistics or the standard deviation of the mean, and considering uncertainty in the width and height as a part of the self attenuation uncertainty. In addition, a detection limit is calculated for point sources using equations derived from summary equations contained in Chapter 20 of MARLAP [5]. The Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2007-1 to the Secretary of Energy identified a lack of requirements and a lack of standardization for performing measurements across the U.S. Department of Energy (DOE) complex. The DNFSB also recommended that guidance be developed for a consistent application of uncertainty values. As such, the recent modifications to the TMU calculational model described in this paper have not yet been implemented. The Plutonium Finishing Plant (PFP) is continuing to perform uncertainty calculations as per Reference 4. Publication at this time is so that these concepts can be considered in developing a consensus methodology across the complex.

  8. Uncertainty analysis in geospatial merit matrix–based hydropower resource assessment

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Pasha, M. Fayzul K.; Yeasmin, Dilruba; Saetern, Sen; Yang, Majntxov; Kao, Shih-Chieh; Smith, Brennan T.

    2016-03-30

    Hydraulic head and mean annual streamflow, two main input parameters in hydropower resource assessment, are not measured at every point along the stream. Translation and interpolation are used to derive these parameters, resulting in uncertainties. This study estimates the uncertainties and their effects on model output parameters: the total potential power and the number of potential locations (stream-reach). These parameters are quantified through Monte Carlo Simulation (MCS) linked with a geospatial merit matrix based hydropower resource assessment (GMM-HRA) model. The methodology is applied to flat, mild, and steep terrains. Results show that the uncertainty associated with the hydraulic head is within 20% for mild and steep terrains, and the uncertainty associated with streamflow is around 16% for all three terrains. Output uncertainty increases as input uncertainty increases. However, output uncertainty is around 10% to 20% of the input uncertainty, demonstrating the robustness of the GMM-HRA model. Hydraulic head is more sensitive to output parameters in steep terrain than in flat and mild terrains. Furthermore, mean annual streamflow is more sensitive to output parameters in flat terrain.

  9. TSUNAMI Primer: A Primer for Sensitivity/Uncertainty Calculations with SCALE

    SciTech Connect (OSTI)

    Rearden, Bradley T; Mueller, Don; Bowman, Stephen M; Busch, Robert D.; Emerson, Scott

    2009-01-01

    This primer presents examples in the application of the SCALE/TSUNAMI tools to generate k{sub eff} sensitivity data for one- and three-dimensional models using TSUNAMI-1D and -3D and to examine uncertainties in the computed k{sub eff} values due to uncertainties in the cross-section data used in their calculation. The proper use of unit cell data and need for confirming the appropriate selection of input parameters through direct perturbations are described. The uses of sensitivity and uncertainty data to identify and rank potential sources of computational bias in an application system and TSUNAMI tools for assessment of system similarity using sensitivity and uncertainty criteria are demonstrated. Uses of these criteria in trending analyses to assess computational biases, bias uncertainties, and gap analyses are also described. Additionally, an application of the data adjustment tool TSURFER is provided, including identification of specific details of sources of computational bias.

  10. Global Sampling for Integrating Physics-Specific Subsystems and Quantifying Uncertainties of CO2 Geological Sequestration

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Sun, Y.; Tong, C.; Trainor-Guitten, W. J.; Lu, C.; Mansoor, K.; Carroll, S. A.

    2012-12-20

    The risk of CO2 leakage from a deep storage reservoir into a shallow aquifer through a fault is assessed and studied using physics-specific computer models. The hypothetical CO2 geological sequestration system is composed of three subsystems: a deep storage reservoir, a fault in caprock, and a shallow aquifer, which are modeled respectively by considering sub-domain-specific physics. Supercritical CO2 is injected into the reservoir subsystem with uncertain permeabilities of reservoir, caprock, and aquifer, uncertain fault location, and injection rate (as a decision variable). The simulated pressure and CO2/brine saturation are connected to the fault-leakage model as a boundary condition. CO2 and brine fluxes from the fault-leakage model at the fault outlet are then imposed in the aquifer model as a source term. Moreover, uncertainties are propagated from the deep reservoir model, to the fault-leakage model, and eventually to the geochemical model in the shallow aquifer, thus contributing to risk profiles. To quantify the uncertainties and assess leakage-relevant risk, we propose a global sampling-based method to allocate sub-dimensions of uncertain parameters to sub-models. The risk profiles are defined and related to CO2 plume development for pH value and total dissolved solids (TDS) below the EPA's Maximum Contaminant Levels (MCL) for drinking water quality. A global sensitivity analysis is conducted to select the most sensitive parameters to the risk profiles. The resulting uncertainty of pH- and TDS-defined aquifer volume, which is impacted by CO2 and brine leakage, mainly results from the uncertainty of fault permeability. Subsequently, high-resolution, reduced-order models of risk profiles are developed as functions of all the decision variables and uncertain parameters in all three subsystems.