National Library of Energy BETA

Sample records for uncertainty high uncertainty

  1. Uncertainty Quantification

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    ... Uncertainty Quantification in Electric Grids The analysis and reduction of electric grid models under uncertainty is important to developing robust and reliable smart-grid ...

  2. Sensitivity and Uncertainty Analysis

    Broader source: Energy.gov [DOE]

    Summary Notes from 15 November 2007 Generic Technical Issue Discussion on Sensitivity and Uncertainty Analysis and Model Support

  3. Direct Aerosol Forcing Uncertainty

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    McComiskey, Allison

    2008-01-15

    Understanding sources of uncertainty in aerosol direct radiative forcing (DRF), the difference in a given radiative flux component with and without aerosol, is essential to quantifying changes in Earth's radiation budget. We examine the uncertainty in DRF due to measurement uncertainty in the quantities on which it depends: aerosol optical depth, single scattering albedo, asymmetry parameter, solar geometry, and surface albedo. Direct radiative forcing at the top of the atmosphere and at the surface, as well as sensitivities, the changes in DRF in response to unit changes in individual aerosol or surface properties, are calculated at three locations representing distinct aerosol types and radiative environments. The uncertainty in DRF associated with a given property is computed as the product of the sensitivity and typical measurement uncertainty in the respective aerosol or surface property. Sensitivity and uncertainty values permit estimation of total uncertainty in calculated DRF and identification of properties that most limit accuracy in estimating forcing. Total uncertainties in modeled local diurnally averaged forcing range from 0.2 to 1.3 W m-2 (42 to 20%) depending on location (from tropical to polar sites), solar zenith angle, surface reflectance, aerosol type, and aerosol optical depth. The largest contributor to total uncertainty in DRF is usually single scattering albedo; however, decreasing measurement uncertainties for any property would increase accuracy in DRF. Comparison of two radiative transfer models suggests the contribution of modeling error is small compared to the total uncertainty, although comparable to uncertainty arising from some individual properties.
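
    The recipe in this abstract, per-property uncertainty as the product of a sensitivity and a typical measurement uncertainty, combined into a total, is straightforward to sketch. The Python fragment below is a minimal illustration with invented values, not the paper's numbers:

      import numpy as np

      # Illustrative sensitivities of DRF to each property (W m-2 per unit
      # change) and typical measurement uncertainties; all values made up.
      sensitivities = {"aod": 25.0, "ssa": 40.0, "asym": 10.0, "sfc_albedo": 15.0}
      meas_unc = {"aod": 0.01, "ssa": 0.03, "asym": 0.02, "sfc_albedo": 0.02}

      # Per-property contribution: sensitivity times measurement uncertainty.
      contrib = {k: sensitivities[k] * meas_unc[k] for k in sensitivities}

      # Total uncertainty, combining independent contributions in quadrature.
      total = np.sqrt(sum(c**2 for c in contrib.values()))
      print(contrib)  # identifies which property most limits accuracy
      print(f"total DRF uncertainty ~ {total:.2f} W m-2")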

  4. Physical Uncertainty Bounds (PUB)

    SciTech Connect (OSTI)

    Vaughan, Diane Elizabeth; Preston, Dean L.

    2015-03-19

    This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.

  5. Uncertainty with New Technology

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    an application. By removing uncertainty and hesitation associated with new technology adoption, ES-Select plays a key role in helping to address grid issues by providing the first...

  6. Measurement uncertainty relations

    SciTech Connect (OSTI)

    Busch, Paul; Lahti, Pekka; Werner, Reinhard F.

    2014-04-15

    Measurement uncertainty relations are quantitative bounds on the errors in an approximate joint measurement of two observables. They can be seen as a generalization of the error/disturbance tradeoff first discussed heuristically by Heisenberg. Here we prove such relations for the case of two canonically conjugate observables like position and momentum, and establish a close connection with the more familiar preparation uncertainty relations constraining the sharpness of the distributions of the two observables in the same state. Both sets of relations are generalized to means of order α rather than the usual quadratic means, and we show that the optimal constants are the same for preparation and for measurement uncertainty. The constants are determined numerically and compared with some bounds in the literature. In both cases, the near-saturation of the inequalities entails that the state (resp. observable) is uniformly close to a minimizing one.
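
    For orientation, the preparation relation being generalized is the textbook inequality (stated here for context, not the paper's order-α version):

      \sigma(Q)\,\sigma(P) \ge \hbar/2

    The measurement analogue proved by the authors bounds the errors \epsilon(Q) and \epsilon(P) of an approximate joint measurement by the same constant, \epsilon(Q)\,\epsilon(P) \ge \hbar/2, for suitably calibrated error measures.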

  7. A High-Performance Embedded Hybrid Methodology for Uncertainty Quantification With Applications

    SciTech Connect (OSTI)

    Iaccarino, Gianluca

    2014-04-01

    Multiphysics processes modeled by a system of unsteady differential equations are naturally suited for partitioned (modular) solution strategies. We consider such a model where probabilistic uncertainties are present in each module of the system and represented as a set of random input parameters. A straightforward approach to quantifying uncertainties in the predicted solution would be to sample all the input parameters into a single set, and treat the full system as a black box. Although this method is easily parallelizable and requires minimal modifications to the deterministic solver, it is blind to the modular structure of the underlying multiphysical model. On the other hand, using spectral representations such as polynomial chaos expansions (PCE) can provide richer structural information regarding the dynamics of these uncertainties as they propagate from the inputs to the predicted output, but can be prohibitively expensive to implement in the high-dimensional global space of uncertain parameters. Therefore, we investigated hybrid methodologies wherein each module has the flexibility of using sampling or PCE based methods of capturing local uncertainties while maintaining accuracy in the global uncertainty analysis. For the latter case, we use a conditional PCE model which mitigates the curse of dimension associated with intrusive Galerkin or semi-intrusive pseudospectral methods. After formalizing the theoretical framework, we demonstrate our proposed method using a numerical viscous flow simulation and benchmark the performance against a solely Monte Carlo method and a solely spectral method.
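
    As a minimal sketch of the hybrid idea, the Python fragment below fits a local polynomial chaos surrogate (probabilists' Hermite basis) to one toy module while the rest of the system is still sampled; the modules and functions are invented stand-ins, not the paper's solver:

      import numpy as np
      from numpy.polynomial import hermite_e as He

      rng = np.random.default_rng(0)

      def module_a(x1):           # first physics module (toy)
          return np.exp(0.1 * x1)

      def module_b(y, x2):        # second module, consumes module_a's output
          return y**2 + 0.5 * x2

      # Black-box Monte Carlo over the full system.
      x1, x2 = rng.standard_normal(20000), rng.standard_normal(20000)
      z_mc = module_b(module_a(x1), x2)

      # Local PCE for module_a: fit a degree-4 Hermite expansion by least squares.
      xs = rng.standard_normal(2000)
      coef = He.hermefit(xs, module_a(xs), deg=4)
      z_hyb = module_b(He.hermeval(x1, coef), x2)  # sampling kept in module_b

      print(z_mc.mean(), z_mc.std())
      print(z_hyb.mean(), z_hyb.std())             # should closely agree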

  8. RUMINATIONS ON NDA MEASUREMENT UNCERTAINTY COMPARED TO DA UNCERTAINTY

    SciTech Connect (OSTI)

    Salaymeh, S.; Ashley, W.; Jeffcoat, R.

    2010-06-17

    It is difficult to overestimate the importance that physical measurements performed with nondestructive assay instruments play throughout the nuclear fuel cycle. They underpin decision making in many areas and support: criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete, or indeed meaningful, without a defensible and appropriate accompanying statement of uncertainties and how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail suitable for the subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for non-routine situations, because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to developing understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice and embracing the continuous-improvement philosophy, a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in nondestructive assay of nuclear materials. This paper outlines the needs analysis, objectives, and ongoing development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community, we hope this article will encourage dialog and sharing of best practice, and furthermore motivate developers to revisit the treatment of measurement uncertainty.

  9. Entropic uncertainty relations and entanglement

    SciTech Connect (OSTI)

    Guehne, Otfried; Lewenstein, Maciej

    2004-08-01

    We discuss the relationship between entropic uncertainty relations and entanglement. We present two methods for deriving separability criteria in terms of entropic uncertainty relations. In particular, we show how any entropic uncertainty relation on one part of the system results in a separability condition on the composite system. We investigate the resulting criteria using the Tsallis entropy for two and three qubits.
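
    For context, a standard example of the relations involved (not the paper's specific criteria) is the Maassen-Uffink entropic uncertainty relation for observables A and B with eigenbases {|a_i>} and {|b_j>},

      H(A) + H(B) \ge -2 \log_2 c, \qquad c = \max_{i,j} |\langle a_i | b_j \rangle|,

    while the Tsallis entropy used in the separability criteria is S_q(\rho) = (1 - \mathrm{Tr}\,\rho^q)/(q - 1).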

  10. Evaluation of machine learning algorithms for prediction of regions of high RANS uncertainty

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Ling, Julia; Templeton, Jeremy Alan

    2015-08-04

    Reynolds Averaged Navier Stokes (RANS) models are widely used in industry to predict fluid flows, despite their acknowledged deficiencies. Not only do RANS models often produce inaccurate flow predictions, but there are very limited diagnostics available to assess RANS accuracy for a given flow configuration. If experimental or higher fidelity simulation results are not available for RANS validation, there is no reliable method to evaluate RANS accuracy. This paper explores the potential of utilizing machine learning algorithms to identify regions of high RANS uncertainty. Three different machine learning algorithms were evaluated: support vector machines, Adaboost decision trees, and random forests. The algorithms were trained on a database of canonical flow configurations for which validated direct numerical simulation or large eddy simulation results were available, and were used to classify RANS results on a point-by-point basis as having either high or low uncertainty, based on the breakdown of specific RANS modeling assumptions. Classifiers were developed for three different basic RANS eddy viscosity model assumptions: the isotropy of the eddy viscosity, the linearity of the Boussinesq hypothesis, and the non-negativity of the eddy viscosity. It is shown that these classifiers are able to generalize to flows substantially different from those on which they were trained. As a result, feature selection techniques, model evaluation, and extrapolation detection are discussed in the context of turbulence modeling applications.

  11. Evaluation of machine learning algorithms for prediction of regions of high RANS uncertainty

    SciTech Connect (OSTI)

    Ling, Julia; Templeton, Jeremy Alan

    2015-08-04

    Reynolds Averaged Navier Stokes (RANS) models are widely used in industry to predict fluid flows, despite their acknowledged deficiencies. Not only do RANS models often produce inaccurate flow predictions, but there are very limited diagnostics available to assess RANS accuracy for a given flow configuration. If experimental or higher fidelity simulation results are not available for RANS validation, there is no reliable method to evaluate RANS accuracy. This paper explores the potential of utilizing machine learning algorithms to identify regions of high RANS uncertainty. Three different machine learning algorithms were evaluated: support vector machines, Adaboost decision trees, and random forests. The algorithms were trained on a database of canonical flow configurations for which validated direct numerical simulation or large eddy simulation results were available, and were used to classify RANS results on a point-by-point basis as having either high or low uncertainty, based on the breakdown of specific RANS modeling assumptions. Classifiers were developed for three different basic RANS eddy viscosity model assumptions: the isotropy of the eddy viscosity, the linearity of the Boussinesq hypothesis, and the non-negativity of the eddy viscosity. It is shown that these classifiers are able to generalize to flows substantially different from those on which they were trained. As a result, feature selection techniques, model evaluation, and extrapolation detection are discussed in the context of turbulence modeling applications.
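
    A minimal sketch of this classification workflow, with synthetic features and labels standing in for the canonical-flow database, might look like the following (illustrative only; the feature set and labeling rule are invented):

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(1)

      # Stand-in training data: rows are mesh points from canonical flows,
      # columns are local flow features; the label marks where a RANS
      # assumption (e.g. eddy-viscosity isotropy) breaks down.
      X_train = rng.random((5000, 8))
      y_train = X_train[:, 0] + X_train[:, 3] > 1.0

      clf = RandomForestClassifier(n_estimators=200, random_state=0)
      clf.fit(X_train, y_train)

      # Point-by-point classification of a new RANS flow as high/low uncertainty.
      X_new = rng.random((100, 8))
      high_uncertainty = clf.predict(X_new)
      print(high_uncertainty.mean(), "fraction of points flagged")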

  12. Uncertainty quantification of a containment vessel dynamic response subjected to high-explosive detonation impulse loading

    SciTech Connect (OSTI)

    Rodriguez, E. A.; Pepin, J. E.; Thacker, B. H.; Riha, D. S.

    2002-01-01

    Los Alamos National Laboratory (LANL), in cooperation with Southwest Research Institute, has been developing capabilities to provide reliability-based structural evaluation techniques for performing weapon component and system reliability assessments. The development and applications of Probabilistic Structural Analysis Methods (PSAM) is an important ingredient in the overall weapon reliability assessments. Focus, herein, is placed on the uncertainty quantification associated with the structural response of a containment vessel for high-explosive (HE) experiments. The probabilistic dynamic response of the vessel is evaluated through the coupling of the probabilistic code NESSUS with the non-linear structural dynamics code, DYNA-3D. The probabilistic model includes variations in geometry and mechanical properties, such as Young's Modulus, yield strength, and material flow characteristics. Finally, the probability of exceeding a specified strain limit, which is related to vessel failure, is determined.
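
    The last step, estimating the probability of exceeding a strain limit from sampled variability, is easy to sketch; the response model and distributions below are toy stand-ins for the NESSUS/DYNA-3D coupling, not LANL's values:

      import numpy as np

      rng = np.random.default_rng(2)
      n = 100_000

      # Hypothetical input variability: Young's modulus, yield strength,
      # and a geometric factor vary from vessel to vessel.
      E = rng.normal(200e9, 10e9, n)       # Pa
      sig_y = rng.normal(350e6, 20e6, n)   # Pa
      geom = rng.normal(1.0, 0.05, n)      # dimensionless

      # Toy response: peak strain under a fixed impulse-equivalent stress.
      load = 250e6                         # Pa
      strain = geom * load / E + 0.2 * (load / sig_y) * 1e-3

      limit = 1.55e-3                      # specified strain limit
      print(f"P(strain > limit) ~ {(strain > limit).mean():.4f}")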

  13. Uncertainty Analysis Technique for OMEGA Dante Measurements ...

    Office of Scientific and Technical Information (OSTI)


  14. Report: Technical Uncertainty and Risk Reduction

    Office of Environmental Management (EM)

    Background: In FY 2007, EMAB was tasked to assess EM's ability to reduce risk and technical uncertainty. Board members explored this topic...

  15. Statistics, Uncertainty, and Transmitted Variation

    SciTech Connect (OSTI)

    Wendelberger, Joanne Roth

    2014-11-05

    The field of Statistics provides methods for modeling and understanding data and making decisions in the presence of uncertainty. When examining response functions, variation present in the input variables will be transmitted via the response function to the output variables. This phenomenon can potentially have significant impacts on the uncertainty associated with results from subsequent analysis. This presentation will examine the concept of transmitted variation, its impact on designed experiments, and a method for identifying and estimating sources of transmitted variation in certain settings.
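
    The mechanism can be made explicit with the first-order (delta-method) approximation, a standard result stated here for illustration: for a response function f applied to an input X with mean \mu_X,

      \operatorname{Var}[f(X)] \approx \left[ f'(\mu_X) \right]^2 \operatorname{Var}(X),

    so variation entering where the response function is steep is amplified, while variation entering where it is flat is suppressed.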

  16. Assessing Climate Uncertainty

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    approach a solution to this dilemma with science. We study materials to learn their properties in exquisite detail. We study high-energy interactions in weapons tests,...

  17. Experimental uncertainty estimation and statistics for data having interval uncertainty.

    SciTech Connect (OSTI)

    Kreinovich, Vladik; Oberkampf, William Louis; Ginzburg, Lev; Ferson, Scott; Hajagos, Janos

    2007-05-01

    This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
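
    For the simplest of these statistics the computation is direct, because the mean and median are monotone in each data value, so their extremes occur at the interval endpoints. The sketch below is illustrative, not the report's algorithms:

      import numpy as np

      # Each measurement is an epistemic interval [lo, hi], not a point.
      data = np.array([[1.1, 1.4], [0.9, 1.3], [1.2, 1.2],
                       [0.7, 1.0], [1.0, 1.6]])
      lo, hi = data[:, 0], data[:, 1]

      # Endpoint argument: bounds on the mean and median.
      print("mean in   [", lo.mean(), ",", hi.mean(), "]")
      print("median in [", np.median(lo), ",", np.median(hi), "]")

      # Variance bounds are harder: for overlapping intervals the extrema
      # need not lie at endpoints, which is why the report ties
      # computability to the degree of interval overlap.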

  18. ARM - PI Product - Direct Aerosol Forcing Uncertainty

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    PI Product: Direct Aerosol Forcing Uncertainty. Understanding sources of uncertainty in aerosol direct radiative forcing (DRF), the difference in a given radiative flux component with and without aerosol, is essential to quantifying changes in Earth's radiation budget. We examine the uncertainty in DRF due to measurement

  19. Uncertainty quantification for discrimination of nuclear events...

    Office of Scientific and Technical Information (OSTI)

    Title: Uncertainty quantification for discrimination of nuclear events as violations of the comprehensive nuclear-test-ban treaty ...

  20. Uncertainty quantification methodologies development for storage...

    Office of Scientific and Technical Information (OSTI)

    Uncertainty quantification methodologies development for storage and transportation of used nuclear fuel: Pilot study on stress corrosion cracking of canister welds ...

  1. Uncertainty quantification for evaluating impacts of caprock...

    Office of Scientific and Technical Information (OSTI)

    Uncertainty quantification for evaluating impacts of caprock and reservoir properties on pressure buildup and ground surface displacement during geological CO2 sequestration ...

  2. Uncertainty Quantification and Propagation in Nuclear Density...

    Office of Scientific and Technical Information (OSTI)

    Title: Uncertainty Quantification and Propagation in Nuclear Density Functional Theory ...

  3. Uncertainty Quantification for Nuclear Density Functional Theory...

    Office of Scientific and Technical Information (OSTI)

    Uncertainty Quantification for Nuclear Density Functional Theory and Information Content of New Measurements. This content will become publicly...

  4. Efficient uncertainty propagation for network multiphysics systems...

    Office of Scientific and Technical Information (OSTI)

    Journal Article: Efficient uncertainty propagation for network multiphysics systems. ... DOE Contract Number: AC04-94AL85000 Resource Type: Journal Article Resource Relation: ...

  5. Microsoft Word - Price Uncertainty Supplement.doc

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    an unusually severe hurricane outlook, financial-market uncertainty in Europe and China, ... Mid-way into the month, energy futures, commodities and securities were sold, as financial...

  6. Uncertainty Estimation of Radiometric Data using a Guide to the Expression of Uncertainty in Measurement (GUM) Method

    SciTech Connect (OSTI)

    Habte, Aron

    2015-06-25

    This presentation summarizes uncertainty estimation of radiometric data using the Guide to the Expression of Uncertainty in Measurement (GUM) method.
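
    The core GUM computation is the law of propagation of uncertainty; a minimal Python sketch with invented sensitivity coefficients and standard uncertainties (not values from the presentation):

      import numpy as np

      # For y = f(x1, ..., xn) with uncorrelated inputs:
      # u_c(y)^2 = sum_i (df/dx_i)^2 * u(x_i)^2
      sens = np.array([0.98, 12.5, -0.4])  # df/dx_i at the operating point
      u = np.array([0.5, 0.02, 1.1])       # standard uncertainties u(x_i)

      u_c = np.sqrt(np.sum((sens * u) ** 2))  # combined standard uncertainty
      U = 2.0 * u_c                           # expanded uncertainty, k = 2 (~95%)
      print(u_c, U)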

  7. Sensitivity and Uncertainty Analysis Shell

    Energy Science and Technology Software Center (OSTI)

    1999-04-20

    SUNS (Sensitivity and Uncertainty Analysis Shell) is a 32-bit application that runs under Windows 95/98 and Windows NT. It is designed to aid in statistical analyses for a broad range of applications. The class of problems for which SUNS is suitable is generally defined by two requirements: 1. A computer code is developed or acquired that models some processes for which input is uncertain and the user is interested in statistical analysis of the output of that code. 2. The statistical analysis of interest can be accomplished using Monte Carlo analysis. The implementation then requires that the user identify which input to the process model is to be manipulated for statistical analysis. With this information, the changes required to loosely couple SUNS with the process model can be completed. SUNS is then used to generate the required statistical sample and the user-supplied process model analyses the sample. The SUNS post processor displays statistical results from any existing file that contains sampled input and output values.

  8. Estimating the uncertainty in underresolved nonlinear dynamics

    SciTech Connect (OSTI)

    Chorin, Alexandre; Hald, Ole

    2013-06-12

    The Mori-Zwanzig formalism of statistical mechanics is used to estimate the uncertainty caused by underresolution in the solution of a nonlinear dynamical system. A general approach is outlined and applied to a simple example. The noise term that describes the uncertainty turns out to be neither Markovian nor Gaussian. It is argued that this is the general situation.
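
    In one standard presentation (given for context; the paper's notation may differ), the Mori-Zwanzig projection yields a generalized Langevin equation for the resolved variables \phi,

      \frac{d\phi}{dt} = \Omega\,\phi(t) + \int_0^t K(t-s)\,\phi(s)\,ds + F(t),

    where the memory kernel K encodes the influence of the unresolved degrees of freedom and F(t) is precisely the noise term that, as the abstract notes, is in general neither Markovian nor Gaussian.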

  9. Uncertainty Analysis for Photovoltaic Degradation Rates (Poster)

    SciTech Connect (OSTI)

    Jordan, D.; Kurtz, S.; Hansen, C.

    2014-04-01

    Dependable and predictable energy production is the key to the long-term success of the PV industry. PV systems show a gradual decline in performance over their lifetime of exposure that depends on many different factors such as module technology, module type, mounting configuration, climate, etc. When degradation rates are determined from continuous data, the statistical uncertainty is easily calculated from the regression coefficients. However, total uncertainty that includes measurement uncertainty and instrumentation drift is far more difficult to determine. A Monte Carlo simulation approach was chosen to investigate a comprehensive uncertainty analysis. The most important factor for degradation rates is to avoid instrumentation that changes over time in the field. For instance, a drifting irradiance sensor can lead to substantially erroneous degradation rates; drift can be avoided through regular calibration. However, the accuracy of the irradiance sensor has negligible impact on degradation rate uncertainty, emphasizing that precision (relative accuracy) is more important than absolute accuracy.
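
    A Monte Carlo experiment of this kind is simple to mock up. The sketch below uses synthetic monthly performance data and an assumed drift magnitude; the spread of recovered rates is dominated by the drift term, mirroring the poster's conclusion:

      import numpy as np

      rng = np.random.default_rng(3)
      t = np.arange(0, 10, 1 / 12)        # 10 years of monthly data
      true_rd = -0.5                      # true degradation rate, %/year

      slopes = []
      for _ in range(2000):
          drift = rng.uniform(-0.3, 0.3)  # sensor drift, %/year (assumed)
          noise = rng.normal(0, 1.0, t.size)
          perf = 100 + (true_rd + drift) * t + noise
          slopes.append(np.polyfit(t, perf, 1)[0])

      slopes = np.array(slopes)
      print(slopes.mean(), slopes.std())  # spread >> pure regression error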

  10. Quantifying uncertainty in stable isotope mixing models

    SciTech Connect (OSTI)

    Davis, Paul; Syme, James; Heikoop, Jeffrey; Fessenden-Rahn, Julianna; Perkins, George; Newman, Brent; Chrystal, Abbey E.; Hagerty, Shannon B.

    2015-05-19

    Mixing models are powerful tools for identifying biogeochemical sources and determining mixing fractions in a sample. However, identification of actual source contributors is often not simple, and source compositions typically vary or even overlap, significantly increasing model uncertainty in calculated mixing fractions. This study compares three probabilistic methods: SIAR [Parnell et al., 2010], a pure Monte Carlo technique (PMC), and the Stable Isotope Reference Source (SIRS) mixing model, a new technique that estimates mixing in systems with more than three sources and/or uncertain source compositions. In this paper, we use nitrate stable isotope examples (δ15N and δ18O) but all methods tested are applicable to other tracers. In Phase I of a three-phase blind test, we compared methods for a set of six-source nitrate problems. PMC was unable to find solutions for two of the target water samples. The Bayesian method, SIAR, experienced anchoring problems, and SIRS calculated mixing fractions that most closely approximated the known mixing fractions. For that reason, SIRS was the only approach used in the next phase of testing. In Phase II, the problem was broadened so that any subset of the six sources could be a possible solution to the mixing problem. Results showed a high rate of Type I errors, where solutions included sources that were not contributing to the sample. In Phase III, some sources were eliminated based on assumed site knowledge and assumed nitrate concentrations, which substantially reduced mixing fraction uncertainties and lowered the Type I error rate. These results demonstrate that valuable insights into stable isotope mixing problems result from probabilistic mixing model approaches like SIRS. The results also emphasize the importance of identifying a minimal set of potential sources and quantifying uncertainties in source isotopic composition, as well as demonstrating the value of additional information in reducing the uncertainty in calculated mixing fractions.

  11. Quantifying uncertainty in stable isotope mixing models

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Davis, Paul; Syme, James; Heikoop, Jeffrey; Fessenden-Rahn, Julianna; Perkins, George; Newman, Brent; Chrystal, Abbey E.; Hagerty, Shannon B.

    2015-05-19

    Mixing models are powerful tools for identifying biogeochemical sources and determining mixing fractions in a sample. However, identification of actual source contributors is often not simple, and source compositions typically vary or even overlap, significantly increasing model uncertainty in calculated mixing fractions. This study compares three probabilistic methods: SIAR [Parnell et al., 2010], a pure Monte Carlo technique (PMC), and the Stable Isotope Reference Source (SIRS) mixing model, a new technique that estimates mixing in systems with more than three sources and/or uncertain source compositions. In this paper, we use nitrate stable isotope examples (δ15N and δ18O) but all methods tested are applicable to other tracers. In Phase I of a three-phase blind test, we compared methods for a set of six-source nitrate problems. PMC was unable to find solutions for two of the target water samples. The Bayesian method, SIAR, experienced anchoring problems, and SIRS calculated mixing fractions that most closely approximated the known mixing fractions. For that reason, SIRS was the only approach used in the next phase of testing. In Phase II, the problem was broadened so that any subset of the six sources could be a possible solution to the mixing problem. Results showed a high rate of Type I errors, where solutions included sources that were not contributing to the sample. In Phase III, some sources were eliminated based on assumed site knowledge and assumed nitrate concentrations, which substantially reduced mixing fraction uncertainties and lowered the Type I error rate. These results demonstrate that valuable insights into stable isotope mixing problems result from probabilistic mixing model approaches like SIRS. The results also emphasize the importance of identifying a minimal set of potential sources and quantifying uncertainties in source isotopic composition, as well as demonstrating the value of additional information in reducing the uncertainty in calculated mixing fractions.
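
    The underlying computation, reduced to two sources and one tracer with Monte Carlo sampling over uncertain source signatures, can be sketched as follows (all values invented for illustration, not from the study):

      import numpy as np

      rng = np.random.default_rng(4)
      n = 50_000

      # Uncertain d15N signatures of two sources and a measured sample.
      dA = rng.normal(9.0, 1.0, n)    # source A
      dB = rng.normal(-1.0, 1.5, n)   # source B
      ds = rng.normal(4.0, 0.3, n)    # sample

      f = (ds - dB) / (dA - dB)       # mixing fraction of source A
      f = f[(f >= 0) & (f <= 1)]      # keep physically meaningful mixtures
      print(np.percentile(f, [2.5, 50, 97.5]))  # uncertainty in the fraction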

  12. Numerical uncertainty in computational engineering and physics

    SciTech Connect (OSTI)

    Hemez, Francois M

    2009-01-01

    Obtaining a solution that approximates ordinary or partial differential equations on a computational mesh or grid does not necessarily mean that the solution is accurate or even 'correct'. Unfortunately, assessing the quality of discrete solutions by questioning the role played by spatial and temporal discretizations generally comes as a distant third to test-analysis comparison and model calibration. This publication aims to raise awareness of the fact that discrete solutions introduce numerical uncertainty. This uncertainty may, in some cases, overwhelm in complexity and magnitude other sources of uncertainty that include experimental variability, parametric uncertainty and modeling assumptions. The concepts of consistency, convergence and truncation error are overviewed to explain the articulation between the exact solution of continuous equations, the solution of modified equations and discrete solutions computed by a code. The current state of the practice of code and solution verification activities is discussed. An example in the discipline of hydrodynamics illustrates the significant effect that meshing can have on the quality of code predictions. A simple method is proposed to derive bounds of solution uncertainty in cases where the exact solution of the continuous equations, or its modified equations, is unknown. It is argued that numerical uncertainty originating from mesh discretization should always be quantified and accounted for in the overall uncertainty 'budget' that supports decision-making for applications in computational physics and engineering.
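
    A common device of this kind, standard in solution verification and given here only as an illustration, is Richardson extrapolation: with solutions f_1, f_2, f_3 on meshes refined by a constant ratio r (f_1 finest), the observed order of convergence and an estimate of the exact solution are

      p = \frac{\ln\left( (f_3 - f_2)/(f_2 - f_1) \right)}{\ln r}, \qquad
      f_{\text{exact}} \approx f_1 + \frac{f_1 - f_2}{r^p - 1},

    and the gap between f_1 and the extrapolated value serves as an estimate of the discretization uncertainty on the finest mesh.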

  13. The Role of Uncertainty Quantification for Reactor Physics

    SciTech Connect (OSTI)

    Salvatores, Massimo; Palmiotti, Giuseppe; Aliberti, G.

    2015-01-01

    The quantification of uncertainties is a crucial step in design. The comparison of a-priori uncertainties with the target accuracies allows needs and priorities for uncertainty reduction to be defined. In view of their impact, the uncertainty analysis requires a reliability assessment of the uncertainty data used. The choice of the appropriate approach and the consistency of different approaches are discussed.

  14. An uncertainty principle for unimodular quantum groups

    SciTech Connect (OSTI)

    Crann, Jason; Kalantar, Mehrdad

    2014-08-15

    We present a generalization of Hirschman's entropic uncertainty principle for locally compact Abelian groups to unimodular locally compact quantum groups. As a corollary, we strengthen a well-known uncertainty principle for compact groups, and generalize the relation to compact quantum groups of Kac type. We also establish the complementarity of finite-dimensional quantum group algebras. In the non-unimodular setting, we obtain an uncertainty relation for arbitrary locally compact groups using the relative entropy with respect to the Haar weight as the measure of uncertainty. We also show that when restricted to q-traces of discrete quantum groups, the relative entropy with respect to the Haar weight reduces to the canonical entropy of the random walk generated by the state.

  15. Validation and Uncertainty Characterization for Energy Simulation

    Energy Savers [EERE]

    Validation and Uncertainty Characterization for Energy Simulation (#1530). Philip Haves (LBNL); Co-PIs: Ron Judkoff (NREL), Joshua New (ORNL), Ralph Muehleisen (ANL). BTO Merit Review, April 16/17, 2015. Sources of differences between experiment and simulation: uncertainty (model algorithms, input parameters, modeler decisions) and variability (weather, occupancy, operation). Source: Energy performance of LEED-NC buildings, NBI, 2008, as-built vs.

  16. Efficient uncertainty propagation for network multiphysics systems.

    Office of Scientific and Technical Information (OSTI)

    Journal Article: Efficient uncertainty propagation for network multiphysics systems. Authors: Phipps, Eric T.; Wildey, Timothy Michael; Constantine, Paul G. Publication Date: 2013-01-01. OSTI Identifier: 1063360. Report Number(s): SAND2013-0423J. DOE Contract Number: AC04-94AL85000. Resource Type: Journal Article.

  17. Reducing Petroleum Dependence in California: Uncertainties About Light-Duty Diesel

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Reducing Petroleum Dependence in California: Uncertainties About Light-Duty Diesel. 2002 DEER Conference Presentation: Center for Energy Efficiency and Renewable Technologies (2002_deer_phillips.pdf). More Documents & Publications: Diesel Use in California; Future Potential of Hybrid and Diesel Powertrains in the U.S. Light-Duty Vehicle Market; Dumping Dirty Diesels: The

  18. Cost Analysis: Technology, Competitiveness, Market Uncertainty

    Office of Environmental Management (EM)

    Technology to Market » Cost Analysis: Technology, Competitiveness, Market Uncertainty. As a basis for strategic planning, competitiveness analysis, and funding metrics and targets, SunShot supports analysis teams at national laboratories to assess technology costs, location-specific competitive advantages, and policy impacts on system financing, and to perform detailed levelized cost of energy (LCOE) analyses. This shows the

  19. ENHANCED UNCERTAINTY ANALYSIS FOR SRS COMPOSITE ANALYSIS

    SciTech Connect (OSTI)

    Smith, F.; Phifer, M.

    2011-06-30

    The Composite Analysis (CA) performed for the Savannah River Site (SRS) in 2009 (SRS CA 2009) included a simplified uncertainty analysis. The uncertainty analysis in the CA (Smith et al. 2009b) was limited to considering at most five sources in a separate uncertainty calculation performed for each POA. To perform the uncertainty calculations in a reasonable amount of time, the analysis was limited to 400 realizations and 2,000 years of simulated transport time, and the time steps were increased relative to the CA base case analysis. As part of the CA maintenance plan, the Savannah River National Laboratory (SRNL) committed to improving the CA uncertainty/sensitivity analysis. The previous uncertainty analysis was constrained by the standard GoldSim licensing, which limits the user to running at most four Monte Carlo uncertainty calculations (also called realizations) simultaneously. Some of the limitations on the number of realizations that could be practically run and on the simulation time steps were removed by building a cluster of three HP ProLiant Windows servers with a total of 36 64-bit processors and by licensing the GoldSim DP-Plus distributed processing software. This allowed running as many as 35 realizations simultaneously (one processor is reserved as a master process that controls running the realizations). These enhancements to SRNL computing capabilities made feasible an uncertainty analysis using 1,000 realizations, the time steps employed in the base case CA calculations, more sources, and 10,000 years of simulated radionuclide transport. In addition, an importance screening analysis was performed to identify the class of stochastic variables that has the most significant impact on model uncertainty. This analysis ran the uncertainty model separately, testing the response to variations in the following five sets of model parameters: (a) K{sub d} values (72 parameters for the 36 CA elements in sand and clay), (b) dose parameters (34 parameters), (c) material properties (20 parameters), (d) surface water flows (6 parameters), and (e) vadose and aquifer flow (4 parameters). Results provided an assessment of which group of parameters is most significant in the dose uncertainty. It was found that K{sub d} and the vadose/aquifer flow parameters, both of which affect transport timing, had the greatest impact on dose uncertainty. Dose parameters had an intermediate level of impact, while material properties and surface water flows had little impact on dose uncertainty. Results of the importance analysis are discussed further in Section 7 of this report. The objectives of this work were to address comments received during the CA review on the uncertainty analysis and to demonstrate an improved methodology for CA uncertainty calculations as part of CA maintenance. This report partially addresses the LFRG Review Team issue of producing an enhanced CA sensitivity and uncertainty analysis; Table 1-1 provides specific responses to pertinent CA maintenance items extracted from Section 11 of the SRS CA (2009). As noted above, the original uncertainty analysis looked at each POA separately and only included the effects from at most five sources giving the highest peak doses at each POA. Only 17 of the 152 CA sources were used in the original uncertainty analysis, and the simulation time was reduced from 10,000 to 2,000 years.
    A major constraint on the original uncertainty analysis was the limitation of only being able to use at most four distributed processes. This work expanded the analysis to 10,000 years using 39 of the CA sources, included cumulative dose effects at downstream POAs, and used more realizations (1,000) and finer time steps. This was accomplished by using the GoldSim DP-Plus module and the 36 processors available on a new Windows cluster. The last part of the work looked at the contribution to overall uncertainty from the main categories of uncertainty variables: K{sub d}s, dose parameters, flow parameters, and material properties.

  20. Risk Analysis and Decision-Making Under Uncertainty: A Strategy...

    Office of Environmental Management (EM)

    Risk Analysis and Decision-Making Under Uncertainty: A Strategy and its Applications. Ming Ye...

  1. Addressing Uncertainties in Design Inputs: A Case Study of Probabilist...

    Office of Environmental Management (EM)

    Addressing Uncertainties in Design Inputs: A Case Study of Probabilistic Settlement Evaluations for Soft Zone Collapse at SWPF.

  2. Dasymetric Modeling and Uncertainty (Journal Article) | SciTech...

    Office of Scientific and Technical Information (OSTI)

    Journal Article: Dasymetric Modeling and Uncertainty. Authors: Nagle, Nicholas N.; Buttenfield, ...

  3. Comparison of Uncertainty of Two Precipitation Prediction Models...

    Office of Scientific and Technical Information (OSTI)

    Title: Comparison of Uncertainty of Two Precipitation Prediction Models at Los Alamos National Lab Technical Area 54.

  4. Error propagation equations for estimating the uncertainty in high-speed wind tunnel test results

    SciTech Connect (OSTI)

    Clark, E.L.

    1994-07-01

    Error propagation equations, based on the Taylor series model, are derived for the nondimensional ratios and coefficients most often encountered in high-speed wind tunnel testing. These include pressure ratio and coefficient, static force and moment coefficients, dynamic stability coefficients, and calibration Mach number. The error equations contain partial derivatives, denoted as sensitivity coefficients, which define the influence of free-stream Mach number, M{infinity}, on various aerodynamic ratios. To facilitate use of the error equations, sensitivity coefficients are derived and evaluated for five fundamental aerodynamic ratios which relate free-stream test conditions to a reference condition.
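
    The Taylor-series model behind such equations is the standard propagation formula: for a result r = f(x_1, ..., x_n) with independent input uncertainties u_{x_i},

      u_r^2 = \sum_{i=1}^{n} \left( \frac{\partial f}{\partial x_i} \right)^2 u_{x_i}^2,

    where the partial derivatives are the sensitivity coefficients. For a simple ratio R = p/q this reduces to (u_R/R)^2 = (u_p/p)^2 + (u_q/q)^2.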

  5. A stochastic approach to quantifying the blur with uncertainty estimation for high-energy X-ray imaging systems

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Fowler, Michael J.; Howard, Marylesa; Luttman, Aaron; Mitchell, Stephen E.; Webb, Timothy J.

    2015-06-03

    One of the primary causes of blur in a high-energy X-ray imaging system is the shape and extent of the radiation source, or 'spot'. It is important to be able to quantify the size of the spot as it provides a lower bound on the recoverable resolution for a radiograph, and penumbral imaging methods – which involve the analysis of blur caused by a structured aperture – can be used to obtain the spot's spatial profile. We present a Bayesian approach for estimating the spot shape that, unlike variational methods, is robust to the initial choice of parameters. The posterior is obtained from a normal likelihood, which was constructed from a weighted least squares approximation to a Poisson noise model, and prior assumptions that enforce both smoothness and non-negativity constraints. A Markov chain Monte Carlo algorithm is used to obtain samples from the target posterior, and the reconstruction and uncertainty estimates are the computed mean and variance of the samples, respectively. Lastly, synthetic data sets are used to demonstrate accurate reconstruction, while real data taken with high-energy X-ray imaging systems are used to demonstrate applicability and feasibility.
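
    A stripped-down version of the sampling step, random-walk Metropolis on a one-parameter toy posterior rather than the paper's full spot model, is sketched below:

      import numpy as np

      rng = np.random.default_rng(5)
      data = rng.normal(2.0, 0.3, 50)   # synthetic blur-width estimates

      def log_post(s):
          if s <= 0:                    # non-negativity constraint
              return -np.inf
          return (-0.5 * np.sum((data - s) ** 2) / 0.3**2
                  - 0.5 * (s / 10.0) ** 2)       # weak prior (assumed)

      s, lp, chain = 1.0, log_post(1.0), []
      for _ in range(20000):            # random-walk Metropolis
          prop = s + 0.1 * rng.standard_normal()
          lp_prop = log_post(prop)
          if np.log(rng.random()) < lp_prop - lp:
              s, lp = prop, lp_prop
          chain.append(s)

      chain = np.array(chain[5000:])    # discard burn-in
      print(chain.mean(), chain.var())  # reconstruction and uncertainty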

  6. Photovoltaic System Modeling. Uncertainty and Sensitivity Analyses

    SciTech Connect (OSTI)

    Hansen, Clifford W.; Martin, Curtis E.

    2015-08-01

    We report an uncertainty and sensitivity analysis for modeling AC energy from photovoltaic systems. Output from a PV system is predicted by a sequence of models. We quantify uncertainty in the output of each model using empirical distributions of each model's residuals. We propagate uncertainty through the sequence of models by sampling these distributions to obtain an empirical distribution of a PV system's output. We consider models that: (1) translate measured global horizontal, direct and global diffuse irradiance to plane-of-array irradiance; (2) estimate effective irradiance; (3) predict cell temperature; (4) estimate DC voltage, current and power; (5) reduce DC power for losses due to inefficient maximum power point tracking or mismatch among modules; and (6) convert DC to AC power. Our analysis considers a notional PV system comprising an array of FirstSolar FS-387 modules and a 250 kW AC inverter; we use measured irradiance and weather at Albuquerque, NM. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. We found uncertainty in the models for POA irradiance and effective irradiance to be the dominant contributors to uncertainty in predicted daily energy. Our analysis indicates that efforts to reduce the uncertainty in PV system output predictions may yield the greatest improvements by focusing on the POA and effective irradiance models.

  7. Model development and data uncertainty integration

    SciTech Connect (OSTI)

    Swinhoe, Martyn Thomas

    2015-12-02

    The effect of data uncertainties is discussed, with the epithermal neutron multiplicity counter as an illustrative example. Simulation using MCNP6, cross section perturbations and correlations are addressed, along with the effect of the 240Pu spontaneous fission neutron spectrum, the effect of P(ν) for 240Pu spontaneous fission, and the effect of spontaneous fission and (α,n) intensity. The effect of nuclear data is the product of the initial uncertainty and the sensitivity -- both need to be estimated. In conclusion, a multi-parameter variation method has been demonstrated; the most significant parameters are the basic emission rates of the spontaneous fission and (α,n) processes, and uncertainties and important data depend on the analysis technique chosen.

  8. Improvements to Nuclear Data and Its Uncertainties by Theoretical Modeling

    Office of Scientific and Technical Information (OSTI)

    Technical Report: Improvements to Nuclear Data and Its Uncertainties by Theoretical Modeling. This project addresses three important gaps in existing evaluated nuclear data libraries that represent a significant hindrance against highly advanced modeling and simulation capabilities for the Advanced Fuel Cycle Initiative (AFCI). This project will: Develop

  9. Coping with uncertainties of mercury regulation

    SciTech Connect (OSTI)

    Reich, K.

    2006-09-15

    The thermometer is rising as coal-fired plants cope with the uncertainties of mercury regulation. The paper deals with a diagnosis and a suggested cure. It describes the state of mercury emission rules in the different US states, many of which had laws or rules in place before the Clean Air Mercury Rule (CAMR) was promulgated.

  10. Uncertainty in Simulating Wheat Yields Under Climate Change

    SciTech Connect (OSTI)

    Asseng, S.; Ewert, F.; Rosenzweig, C.; Jones, J.W.; Hatfield, Jerry; Ruane, Alex; Boote, K. J.; Thorburn, Peter; Rotter, R.P.; Cammarano, D.; Brisson, N.; Basso, B.; Martre, P.; Aggarwal, P.K.; Angulo, C.; Bertuzzi, P.; Biernath, C.; Challinor, AJ; Doltra, J.; Gayler, S.; Goldberg, R.; Grant, Robert; Heng, L.; Hooker, J.; Hunt, L.A.; Ingwersen, J.; Izaurralde, Roberto C.; Kersebaum, K.C.; Mueller, C.; Naresh Kumar, S.; Nendel, C.; O'Leary, G.O.; Olesen, JE; Osborne, T.; Palosuo, T.; Priesack, E.; Ripoche, D.; Semenov, M.A.; Shcherbak, I.; Steduto, P.; Stockle, Claudio O.; Stratonovitch, P.; Streck, T.; Supit, I.; Tao, F.; Travasso, M.; Waha, K.; Wallach, D.; White, J.W.; Williams, J.R.; Wolf, J.

    2013-09-01

    Anticipating the impacts of climate change on crop yields is critical for assessing future food security. Process-based crop simulation models are the most commonly used tools in such assessments [1,2]. Analysis of uncertainties in future greenhouse gas emissions and their impacts on future climate change has been increasingly described in the literature [3,4], while assessments of the uncertainty in crop responses to climate change are very rare. Systematic and objective comparisons across impact studies are difficult, and thus have not been fully realized [5]. Here we present the largest coordinated and standardized crop model intercomparison for climate change impacts on wheat production to date. We found that several individual crop models are able to reproduce measured grain yields under current diverse environments, particularly if sufficient details are provided to execute them. However, simulated climate change impacts can vary across models due to differences in model structures and algorithms. The crop-model component of uncertainty in climate change impact assessments was considerably larger than the climate-model component from Global Climate Models (GCMs). Model responses to high temperatures and temperature-by-CO2 interactions are identified as major sources of simulated impact uncertainties. Significant reductions in impact uncertainties through model improvements in these areas and improved quantification of uncertainty through multi-model ensembles are urgently needed for a more reliable translation of climate change scenarios into agricultural impacts in order to develop adaptation strategies and aid policymaking.

  11. On solar geoengineering and climate uncertainty

    SciTech Connect (OSTI)

    MacMartin, Douglas; Kravitz, Benjamin S.; Rasch, Philip J.

    2015-09-03

    Uncertainty in the climate system response has been raised as a concern regarding solar geoengineering. Here we show that model projections of regional climate change outcomes may have greater agreement under solar geoengineering than with CO2 alone. We explore the effects of geoengineering on one source of climate system uncertainty by evaluating the inter-model spread across 12 climate models participating in the Geoengineering Model Intercomparison project (GeoMIP). The model spread in regional temperature and precipitation changes is reduced with CO2 and a solar reduction, in comparison to the case with increased CO2 alone. That is, the intermodel spread in predictions of climate change and the model spread in the response to solar geoengineering are not additive but rather partially cancel. Furthermore, differences in efficacy explain most of the differences between models in their temperature response to an increase in CO2 that is offset by a solar reduction. These conclusions are important for clarifying geoengineering risks.

  12. Uncertainty in BWR power during ATWS events

    SciTech Connect (OSTI)

    Diamond, D.J.

    1986-01-01

    A study was undertaken to improve our understanding of BWR conditions following the closure of main steam isolation valves and the failure of reactor trip. Of particular interest was the power during the period when the core had reached a quasi-equilibrium condition with a natural circulation flow rate determined by the water level in the downcomer. Insights into the uncertainty in the calculation of this power with sophisticated computer codes were quantified using a simple model which relates power to the principal thermal-hydraulic variables and reactivity coefficients, the latter representing the link between the thermal-hydraulics and the neutronics. Assumptions regarding the uncertainty in these variables and coefficients were then used to determine the uncertainty in power.

  13. Uncertainty estimates for derivatives and intercepts

    SciTech Connect (OSTI)

    Clark, E.L.

    1994-09-01

    Straight line least squares fits of experimental data are widely used in the analysis of test results to provide derivatives and intercepts. A method for evaluating the uncertainty in these parameters is described. The method utilizes conventional least squares results and is applicable to experiments where the independent variable is controlled, but not necessarily free of error. A Monte Carlo verification of the method is given.
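
    In Python, the covariance matrix returned by a least-squares fit gives these uncertainties directly, and a small Monte Carlo run can verify them in the spirit of the report (synthetic data for illustration):

      import numpy as np

      rng = np.random.default_rng(6)
      x = np.linspace(0, 10, 25)        # controlled independent variable
      y = 2.0 * x + 1.0 + rng.normal(0, 0.5, x.size)

      coef, cov = np.polyfit(x, y, 1, cov=True)
      u_slope, u_intercept = np.sqrt(np.diag(cov))
      print(f"slope {coef[0]:.3f} +/- {u_slope:.3f}, "
            f"intercept {coef[1]:.3f} +/- {u_intercept:.3f}")

      # Monte Carlo verification: refit many synthetic data sets.
      slopes = [np.polyfit(x, 2.0 * x + 1.0 + rng.normal(0, 0.5, x.size), 1)[0]
                for _ in range(2000)]
      print("MC slope std:", np.std(slopes))  # should match u_slope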

  14. Interpolation Uncertainties Across the ARM SGP Area

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Interpolation Uncertainties Across the ARM SGP Area. J. E. Christy, C. N. Long, and T. R. Shippert, Pacific Northwest National Laboratory, Richland, Washington. Interpolation Grids Across the SGP Network Area: The U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Program operates a network of surface radiation measurement sites across north central Oklahoma and south central Kansas. This Southern Great Plains (SGP) network consists of 21 sites unevenly spaced from 95.5 to 99.5

  15. Representation of analysis results involving aleatory and epistemic uncertainty.

    SciTech Connect (OSTI)

    Johnson, Jay Dean; Helton, Jon Craig; Oberkampf, William Louis; Sallaberry, Cedric J.

    2008-08-01

    Procedures are described for the representation of results in analyses that involve both aleatory uncertainty and epistemic uncertainty, with aleatory uncertainty deriving from an inherent randomness in the behavior of the system under study and epistemic uncertainty deriving from a lack of knowledge about the appropriate values to use for quantities that are assumed to have fixed but poorly known values in the context of a specific study. Aleatory uncertainty is usually represented with probability and leads to cumulative distribution functions (CDFs) or complementary cumulative distribution functions (CCDFs) for analysis results of interest. Several mathematical structures are available for the representation of epistemic uncertainty, including interval analysis, possibility theory, evidence theory and probability theory. In the presence of epistemic uncertainty, there is not a single CDF or CCDF for a given analysis result. Rather, there is a family of CDFs and a corresponding family of CCDFs that derive from epistemic uncertainty and have an uncertainty structure that derives from the particular uncertainty structure (i.e., interval analysis, possibility theory, evidence theory, probability theory) used to represent epistemic uncertainty. Graphical formats for the representation of epistemic uncertainty in families of CDFs and CCDFs are investigated and presented for the indicated characterizations of epistemic uncertainty.
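
    The standard computational pattern behind such families is a two-loop (nested) Monte Carlo, with epistemic parameters sampled in an outer loop and aleatory variability in an inner loop; each outer draw produces one CDF. A minimal sketch with an invented model:

      import numpy as np

      rng = np.random.default_rng(7)
      grid = np.linspace(0, 10, 200)

      family = []
      for _ in range(20):                       # outer loop: epistemic
          theta = rng.uniform(1.0, 3.0)         # fixed but poorly known
          results = rng.gamma(2.0, theta, 2000) # inner loop: aleatory
          family.append((results[:, None] <= grid).mean(axis=0))

      family = np.array(family)
      # The pointwise envelope of the family displays epistemic uncertainty.
      print(family.min(axis=0)[::50])
      print(family.max(axis=0)[::50])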

  16. October 16, 2014 Webinar - Decisional Analysis under Uncertainty

    Energy Savers [EERE]

    Webinar - October 16, 2014, 11 am - 12:40 pm EDT: Dr. Paul Black (Neptune, Inc.), Decisional Analysis under Uncertainty. Agenda - October 16, 2014 - P&RA CoP Webinar. Presentation - Decision Making under Uncertainty: Introduction to Structured Decision Analysis for Performance Assessments. More Documents & Publications: Status

  17. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 1: Main report

    SciTech Connect (OSTI)

    Brown, J.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-06-01

    This volume is the first of a two-volume document that summarizes a joint project conducted by the US Nuclear Regulatory Commission and the European Commission to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This document reports on an ongoing project to assess uncertainty in the MACCS and COSYMA calculations for the offsite consequences of radionuclide releases by hypothetical nuclear power plant accidents. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain variables that affect calculations of offsite consequences. The expert judgment elicitation procedure and its outcomes are described in these volumes. Other panels were formed to consider uncertainty in other aspects of the codes. Their results are described in companion reports. Volume 1 contains background information and a complete description of the joint consequence uncertainty study. Volume 2 contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures for both panels, (3) the rationales and results for the panels on soil and plant transfer and animal transfer, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  18. Entropic uncertainty relations in multidimensional position and momentum spaces

    SciTech Connect (OSTI)

    Huang Yichen

    2011-05-15

    Commutator-based entropic uncertainty relations in multidimensional position and momentum spaces are derived, twofold generalizing previous entropic uncertainty relations for one-mode states. They provide optimal lower bounds and imply the multidimensional variance-based uncertainty principle. The article concludes with an open conjecture.

  19. HPC Analytics Support. Requirements for Uncertainty Quantification Benchmarks

    SciTech Connect (OSTI)

    Paulson, Patrick R.; Purohit, Sumit; Rodriguez, Luke R.

    2015-05-01

    This report outlines techniques for extending benchmark generation products so they support uncertainty quantification by benchmarked systems. We describe how uncertainty quantification requirements can be presented to candidate analytical tools supporting SPARQL. We describe benchmark data sets for evaluating uncertainty quantification, as well as an approach for using our benchmark generator to produce such data sets.

  20. Tutorial examples for uncertainty quantification methods.

    SciTech Connect (OSTI)

    De Bord, Sarah

    2015-08-01

    This report details the work accomplished during my 2015 SULI summer internship at Sandia National Laboratories in Livermore, CA. During this internship, I worked on multiple tasks with the common goal of making uncertainty quantification (UQ) methods more accessible to the general scientific community. As part of my work, I created a comprehensive numerical integration example to incorporate into the user manual of a UQ software package. Further, I developed examples involving heat transfer through a window to incorporate into tutorial lectures that serve as an introduction to UQ methods.
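
    As a flavor of such a tutorial example, a self-contained Monte Carlo integration with its own error estimate (my choice of integrand, not necessarily the manual's):

      import numpy as np

      rng = np.random.default_rng(8)

      # Monte Carlo estimate of I = integral_0^1 exp(x) dx = e - 1.
      n = 100_000
      f = np.exp(rng.random(n))

      estimate = f.mean()
      std_err = f.std(ddof=1) / np.sqrt(n)
      print(f"I ~ {estimate:.5f} +/- {std_err:.5f} (exact {np.e - 1:.5f})")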

  1. Microsoft Word - Price Uncertainty Supplement .docx

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    January 2011 Short-Term Energy Outlook: Energy Price Volatility and Forecast Uncertainty (January 11, 2011 Release). Crude Oil Prices. West Texas Intermediate (WTI) crude oil spot prices averaged over $89 per barrel in December, about $5 per barrel higher than the November average. Expectations of higher oil demand, combined with unusually cold weather in both Europe and the U.S. Northeast, contributed to rising prices. EIA has raised the first quarter 2011 WTI spot price forecast by $8 per barrel

  2. Microsoft Word - Price Uncertainty Supplement.doc

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    April 2010 Short-Term Energy Outlook: Energy Price Volatility and Forecast Uncertainty (April 6, 2010 Release). Crude Oil Prices. WTI crude oil spot prices averaged $81 per barrel in March 2010, almost $5 per barrel above the prior month's average and $3 per barrel higher than forecast in last month's Outlook. Oil prices rose from a low this year of $71.15 per barrel on February 5 to $80 per barrel by the end of February, generally on news of robust economic and energy demand growth in non-OECD

  3. Microsoft Word - Price Uncertainty Supplement.doc

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    August 2010 Short-Term Energy Outlook: Energy Price Volatility and Forecast Uncertainty (August 10, 2010 Release). WTI crude oil spot prices averaged $76.32 per barrel in July 2010, about $1 per barrel above the prior month's average and close to the $77 per barrel projected in last month's Outlook. EIA projects WTI prices will average about $80 per barrel over the second half of this year and rise to $85 by the end of next year (West Texas Intermediate Crude Oil Price Chart). Energy price

  4. Microsoft Word - Price Uncertainty Supplement.doc

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    December 2010 Short-Term Energy Outlook: Energy Price Volatility and Forecast Uncertainty (December 7, 2010 Release). Crude Oil Prices. West Texas Intermediate (WTI) crude oil spot prices averaged over $84 per barrel in November, more than $2 per barrel higher than the October average. EIA has raised the average winter 2010-2011 period WTI spot price forecast by $1 per barrel from the last month's Outlook to $84 per barrel. WTI spot prices rise to $89 per barrel by the end of next year, $2 per

  5. Microsoft Word - Price Uncertainty Supplement.doc

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    July 2010 Short-Term Energy Outlook: Energy Price Volatility and Forecast Uncertainty (July 7, 2010 Release). Crude Oil Prices. WTI crude oil spot prices averaged $75.34 per barrel in June 2010 ($1.60 per barrel above the prior month's average), close to the $76 per barrel projected in the forecast in last month's Outlook. EIA projects WTI prices will average about $79 per barrel over the second half of this year and rise to $84 by the end of next year (West Texas Intermediate Crude Oil Price

  6. Microsoft Word - Price Uncertainty Supplement.doc

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    March 2010 Short-Term Energy Outlook: Energy Price Volatility and Forecast Uncertainty (March 9, 2010 Release). Crude Oil Prices. WTI crude oil spot prices averaged $76.39 per barrel in February 2010, almost $2 per barrel lower than the prior month's average and very near the $76 per barrel forecast in last month's Outlook. Last month, the WTI spot price reached a low of $71.15 on February 5 and peaked at $80.04 on February 22. EIA expects WTI prices to average above $80 per barrel this spring,

  7. Microsoft Word - Price Uncertainty Supplement.doc

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    May 2010 Short-Term Energy Outlook: Energy Price Volatility and Forecast Uncertainty (May 11, 2010 Release). Crude Oil Prices. WTI crude oil spot prices averaged $84 per barrel in April 2010, about $3 per barrel above the prior month's average and $2 per barrel higher than forecast in last month's Outlook. EIA projects WTI prices will average about $84 per barrel over the second half of this year and rise to $87 by the end of next year, an increase of about $2 per barrel from the previous Outlook

  8. Microsoft Word - Price Uncertainty Supplement.doc

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    November 2010 Short-Term Energy Outlook: Energy Price Volatility and Forecast Uncertainty (November 9, 2010 Release). Crude Oil Prices. WTI crude oil spot prices averaged almost $82 per barrel in October, about $7 per barrel higher than the September average, as expectations of higher oil demand pushed up prices. EIA has raised the average fourth quarter 2010 WTI spot price forecast to about $83 per barrel compared with $79 per barrel in last month's Outlook. WTI spot prices rise to $87 per

  9. Microsoft Word - Price Uncertainty Supplement.doc

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    September 2010 Short-Term Energy Outlook: Energy Price Volatility and Forecast Uncertainty (September 8, 2010 Release). Crude Oil Prices. West Texas Intermediate (WTI) crude oil spot prices averaged about $77 per barrel in August 2010, very close to the July average, but $3 per barrel lower than projected in last month's Outlook. WTI spot prices averaged almost $82 per barrel over the first 10 days of August but then fell by $9 per barrel over the next 2 weeks as the market reacted to a series

  10. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    SciTech Connect (OSTI)

    Alvin, Kenneth F.; Oberkampf, William L.; Rutherford, Brian M.; Diegert, Kathleen V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed toward developing methodologies that treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process that impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.

  11. Survey and Evaluate Uncertainty Quantification Methodologies

    SciTech Connect (OSTI)

    Lin, Guang; Engel, David W.; Eslinger, Paul W.

    2012-02-01

    The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that will develop and deploy state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models with uncertainty quantification, optimization, risk analysis and decision making capabilities. The CCSI Toolset will incorporate commercial and open-source software currently in use by industry and will also develop new software tools as necessary to fill technology gaps identified during execution of the project. The CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. The goal of CCSI is to deliver a toolset that can simulate the scale-up of a broad set of new carbon capture technologies from laboratory scale to full commercial scale. To provide a framework around which the toolset can be developed and demonstrated, we will focus on three Industrial Challenge Problems (ICPs) related to carbon capture technologies relevant to U.S. pulverized coal (PC) power plants. Post combustion capture by solid sorbents is the technology focus of the initial ICP (referred to as ICP A). The goal of the uncertainty quantification (UQ) task (Task 6) is to provide a set of capabilities to the user community for the quantification of uncertainties associated with the carbon capture processes. As such, we will develop, as needed and beyond existing capabilities, a suite of robust and efficient computational tools for UQ to be integrated into a CCSI UQ software framework.

  12. Lidar arc scan uncertainty reduction through scanning geometry optimization

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Wang, H.; Barthelmie, R. J.; Pryor, S. C.; Brown, G.

    2015-10-07

    Doppler lidars are frequently operated in a mode referred to as arc scans, wherein the lidar beam scans across a sector with a fixed elevation angle and the resulting measurements are used to derive an estimate of the n minute horizontal mean wind velocity (speed and direction). Previous studies have shown that the uncertainty in the measured wind speed originates from turbulent wind fluctuations and depends on the scan geometry (the arc span and the arc orientation). This paper is designed to provide guidance on optimal scan geometries for two key applications in the wind energy industry: wind turbine power performance analysis and annual energy production. We present a quantitative analysis of the retrieved wind speed uncertainty derived using a theoretical model with the assumption of isotropic and frozen turbulence, and observations from three sites that are onshore with flat terrain, onshore with complex terrain and offshore, respectively. The results from both the theoretical model and observations show that the uncertainty is scaled with the turbulence intensity such that the relative standard error on the 10 min mean wind speed is about 30 % of the turbulence intensity. The uncertainty in both retrieved wind speeds and derived wind energy production estimates can be reduced by aligning lidar beams with the dominant wind direction, increasing the arc span and lowering the number of beams per arc scan. Large arc spans should be used at sites with high turbulence intensity and/or large wind direction variation when arc scans are used for wind resource assessment.
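
    A minimal numerical reading of the 30% scaling result, with illustrative turbulence intensities rather than values from the three sites:

        # Relative standard error of an arc-scan 10-minute mean wind speed,
        # using the approximate scaling reported above: SE/U ~ 0.3 * TI.
        def arc_scan_relative_std_error(turbulence_intensity: float) -> float:
            return 0.3 * turbulence_intensity

        # Rough example values (offshore, flat onshore, complex terrain)
        for ti in (0.05, 0.10, 0.20):
            se = arc_scan_relative_std_error(ti)
            print(f"TI = {ti:.2f} -> relative standard error ~ {se:.1%}")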

  13. Radiotherapy Dose Fractionation under Parameter Uncertainty

    SciTech Connect (OSTI)

    Davison, Matt; Kim, Daero; Keller, Harald

    2011-11-30

    In radiotherapy, radiation is directed to damage a tumor while avoiding surrounding healthy tissue. Tradeoffs ensue because dose cannot be exactly shaped to the tumor. It is particularly important to ensure that sensitive biological structures near the tumor are not damaged more than a certain amount. Biological tissue is known to have a nonlinear response to incident radiation. The linear quadratic dose response model, which requires the specification of two clinically and experimentally observed response coefficients, is commonly used to model this effect. This model yields an optimization problem giving two different types of optimal dose sequences (fractionation schedules). Which fractionation schedule is preferred depends on the response coefficients. These coefficients are uncertainly known and may differ from patient to patient. Because of this, not only the expected outcomes but also the uncertainty around those outcomes is important, and it might not be prudent simply to select the strategy with the best expected outcome.
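
    For reference, the linear quadratic model named in the abstract is conventionally written in terms of the two response coefficients, alpha and beta; this is the standard LQ formalism, spelled out here for concreteness rather than taken from the report:

        S = \exp\!\left[-n\left(\alpha d + \beta d^{2}\right)\right], \qquad
        \mathrm{BED} = n d \left(1 + \frac{d}{\alpha/\beta}\right),

    where S is the surviving fraction after n fractions of dose d and BED is the biologically effective dose. Uncertainty in the ratio alpha/beta is what tips the optimum between a few large fractions and many small ones, which is why patient-to-patient variability in these coefficients matters for schedule selection.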

  14. Intrinsic Uncertainties in Modeling Complex Systems.

    SciTech Connect (OSTI)

    Cooper, Curtis S; Bramson, Aaron L.; Ames, Arlo L.

    2014-09-01

    Models are built to understand and predict the behaviors of both natural and artificial systems. Because it is always necessary to abstract away aspects of any non-trivial system being modeled, we know models can potentially leave out important, even critical elements. This reality of the modeling enterprise forces us to consider the prospective impacts of those effects completely left out of a model, whether intentionally or not. Insensitivity to new structure is an indication of diminishing returns. In this work, we represent a hypothetical unknown effect on a validated model as a finite perturbation whose amplitude is constrained within a control region. We find robustly that without further constraints, no meaningful bounds can be placed on the amplitude of a perturbation outside of the control region. Thus, forecasting into unsampled regions is a very risky proposition. We also present inherent difficulties with proper time discretization of models and with representing inherently discrete quantities. We point out potentially worrisome uncertainties, arising from mathematical formulation alone, which modelers can inadvertently introduce into models of complex systems. Acknowledgements: This work has been funded under early-career LDRD project #170979, entitled "Quantifying Confidence in Complex Systems Models Having Structural Uncertainties", which ran from 04/2013 to 09/2014. We wish to express our gratitude to the many researchers at Sandia who contributed ideas to this work, as well as feedback on the manuscript. In particular, we would like to mention George Barr, Alexander Outkin, Walt Beyeler, Eric Vugrin, and Laura Swiler for providing invaluable advice and guidance through the course of the project. We would also like to thank Steven Kleban, Amanda Gonzales, Trevor Manzanares, and Sarah Burwell for their assistance in managing project tasks and resources.

  15. Entanglement criteria via concave-function uncertainty relations

    SciTech Connect (OSTI)

    Huang Yichen

    2010-07-15

    A general theorem as a necessary condition for the separability of quantum states in both finite and infinite dimensional systems, based on concave-function uncertainty relations, is derived. Two special cases of the general theorem are stronger than two known entanglement criteria based on the Shannon entropic uncertainty relation and the Landau-Pollak uncertainty relation, respectively; other special cases are able to detect entanglement where some famous entanglement criteria fail.

  16. Which Models Matter: Uncertainty and Sensitivity Analysis for Photovoltaic Power Systems

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Clifford W. Hansen and Andrew Pohl, Sandia National Laboratories, Albuquerque, NM, 87185-1033, USA. Abstract - Predicting power for a photovoltaic system from measured irradiance requires a sequence of models, e.g.: translation of measured irradiance to the plane-of-array; estimation of cell temperature; and calculation of module electrical output. Uncertainty in predicted power arises from the aggregate uncertainty
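
    The model-chain idea lends itself to a Monte Carlo sketch. The code below uses deliberately simplified stand-in models (a fixed plane-of-array factor, a NOCT-style cell temperature model, and a linear temperature derate), not the models analyzed in the paper:

        import numpy as np

        rng = np.random.default_rng(1)
        n = 50_000

        # Step 1: measured irradiance -> plane-of-array (POA) irradiance
        ghi = 800.0                             # W/m^2, measured irradiance
        poa = ghi * rng.normal(1.10, 0.03, n)   # transposition factor with uncertainty

        # Step 2: cell temperature from a NOCT-style model
        t_amb = 25.0                            # deg C, ambient temperature
        noct = rng.normal(45.0, 2.0, n)         # deg C, uncertain NOCT rating
        t_cell = t_amb + (noct - 20.0) / 800.0 * poa

        # Step 3: module electrical output with a linear temperature derate
        p_stc = 300.0                           # W at standard test conditions
        gamma = rng.normal(-0.004, 0.0005, n)   # 1/K, power temperature coefficient
        power = p_stc * poa / 1000.0 * (1.0 + gamma * (t_cell - 25.0))

        print(f"predicted power: {power.mean():.0f} W +/- {power.std():.0f} W (1 sigma)")

    Re-running the chain with one input fixed at a time gives a crude sensitivity ranking of which model in the sequence dominates the aggregate uncertainty.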

  17. Managing Uncertainty and Demonstrating Compliance | Department of Energy

    Office of Environmental Management (EM)

    Presentation from the 2015 Annual Performance and Risk Assessment (P&RA) Community of Practice (CoP) Technical Exchange Meeting held in Richland, Washington on December 15-16, 2015.

  18. PMU Uncertainty Quantification in Voltage Stability Analysis

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Argonne National Laboratory. Journal Article, IEEE Transactions on Power Systems, Volume 30, Issue 4, start page 2196, published June 16, 2015. Authors: Chen, C.; Wang, J.; Li, Z.; Sun, H.; Wang, Z. Keywords: phasor measurement unit, recursive least square, uncertainty, voltage stability. Abstract: This letter presents an

  19. Uncertainty Reduction in Power Generation Forecast Using Coupled Wavelet-ARIMA

    Office of Scientific and Technical Information (OSTI)

    Conference paper, SciTech Connect. In this paper, we introduce a new approach that does not assume normality or stationarity of power generation forecast errors. In addition, it is desired to more accurately quantify the forecast uncertainty by reducing prediction

  20. Improvements of Nuclear Data and Its Uncertainties by Theoretical...

    Office of Scientific and Technical Information (OSTI)

    Improvements of Nuclear Data and Its Uncertainties by Theoretical Modeling. Talou, Patrick (Los Alamos National Laboratory); Nazarewicz, Witold (University of Tennessee, Knoxville), ...

  1. Measurement uncertainty analysis techniques applied to PV performance measurements

    SciTech Connect (OSTI)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.
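
    As a sketch of the kind of budget such an analysis produces, assuming independent elemental sources combined in the usual root-sum-square fashion (the component names and values below are illustrative, not from the presentation):

        import math

        # Elemental standard uncertainties for a PV module power measurement, in percent
        components = {
            "irradiance sensor calibration": 0.8,
            "spectral mismatch": 0.5,
            "temperature correction": 0.4,
            "I-V measurement electronics": 0.3,
            "repeatability (random)": 0.2,
        }

        u_c = math.sqrt(sum(u**2 for u in components.values()))  # combined standard uncertainty
        U_95 = 2.0 * u_c                                         # expanded uncertainty, coverage factor k=2

        for name, u in components.items():
            print(f"{name:35s} {u:4.1f} %")
        print(f"{'combined standard uncertainty':35s} {u_c:4.1f} %")
        print(f"{'expanded uncertainty (k=2)':35s} {U_95:4.1f} %")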

  2. THE EFFECT OF UNCERTAINTY IN MODELING COEFFICIENTS USED TO PREDICT...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    UNCERTAINTY IN MODELING COEFFICIENTS USED TO PREDICT ENERGY PRODUCTION USING THE SANDIA ARRAY ... relating voltage and current to solar irradiance, for crystalline silicon modules. ...

  3. Computational Fluid Dynamics & Large-Scale Uncertainty Quantification...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Computational Fluid Dynamics & Large-Scale Uncertainty Quantification for Wind Energy - Sandia Energy

  4. Estimation of uncertainty for contour method residual stress measurements

    SciTech Connect (OSTI)

    Olson, Mitchell D.; DeWald, Adrian T.; Prime, Michael B.; Hill, Michael R.

    2014-12-03

    This paper describes a methodology for the estimation of measurement uncertainty for the contour method, where the contour method is an experimental technique for measuring a two-dimensional map of residual stress over a plane. Random error sources including the error arising from noise in displacement measurements and the smoothing of the displacement surfaces are accounted for in the uncertainty analysis. The output is a two-dimensional, spatially varying uncertainty estimate such that every point on the cross-section where residual stress is determined has a corresponding uncertainty value. Both numerical and physical experiments are reported, which are used to support the usefulness of the proposed uncertainty estimator. The uncertainty estimator shows the contour method to have larger uncertainty near the perimeter of the measurement plane. For the experiments, which were performed on a quenched aluminum bar with a cross section of 51 × 76 mm, the estimated uncertainty was approximately 5 MPa (σ/E = 7 · 10⁻⁵) over the majority of the cross-section, with localized areas of higher uncertainty, up to 10 MPa (σ/E = 14 · 10⁻⁵).

  5. Output-Based Error Estimation and Adaptation for Uncertainty...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Output-Based Error Estimation and Adaptation for Uncertainty Quantification. Isaac M. Asher and Krzysztof J. Fidkowski, University of Michigan. US National Congress on Computational...

  6. Uncertainty quantification for evaluating the impacts of fracture...

    Office of Scientific and Technical Information (OSTI)

    Uncertainty quantification for evaluating the impacts of fracture zone on pressure build-up and ground surface uplift during geological CO₂ sequestration.

  7. Variable Grid Method for Visualizing Uncertainty Associated with...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    interpretation. In general, NETL's VGM applies a grid system where the size of the cell represents the uncertainty associated with the original point data sources or their...

  8. Uncertainty quantification of US Southwest climate from IPCC...

    Office of Scientific and Technical Information (OSTI)

    The Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report (AR4) made extensive ...

  9. Uncertainties in Air Exchange using Continuous-Injection, Long...

    Office of Scientific and Technical Information (OSTI)

    people to minimize experimental costs. In this article we will conduct a first-principles error analysis to estimate the uncertainties and then compare that analysis to CILTS...

  10. Measuring Thermal Conductivity with Raman: Capability Uncertainty...

    Office of Scientific and Technical Information (OSTI)

    Measuring Thermal Conductivity with Raman: Capability Uncertainty and Strain Effects.

  11. An Efficient Surrogate Modeling Approach in Bayesian Uncertainty...

    Office of Scientific and Technical Information (OSTI)

    Conference: An Efficient Surrogate Modeling Approach in Bayesian Uncertainty Analysis.

  12. Estimation of uncertainty for contour method residual stress measurements

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Olson, Mitchell D.; DeWald, Adrian T.; Prime, Michael B.; Hill, Michael R.

    2014-12-03

    This paper describes a methodology for the estimation of measurement uncertainty for the contour method, where the contour method is an experimental technique for measuring a two-dimensional map of residual stress over a plane. Random error sources including the error arising from noise in displacement measurements and the smoothing of the displacement surfaces are accounted for in the uncertainty analysis. The output is a two-dimensional, spatially varying uncertainty estimate such that every point on the cross-section where residual stress is determined has a corresponding uncertainty value. Both numerical and physical experiments are reported, which are used to support the usefulness of the proposed uncertainty estimator. The uncertainty estimator shows the contour method to have larger uncertainty near the perimeter of the measurement plane. For the experiments, which were performed on a quenched aluminum bar with a cross section of 51 × 76 mm, the estimated uncertainty was approximately 5 MPa (σ/E = 7 · 10⁻⁵) over the majority of the cross-section, with localized areas of higher uncertainty, up to 10 MPa (σ/E = 14 · 10⁻⁵).

  13. Characterizing Uncertainty for Regional Climate Change Mitigation and Adaptation Decisions

    SciTech Connect (OSTI)

    Unwin, Stephen D.; Moss, Richard H.; Rice, Jennie S.; Scott, Michael J.

    2011-09-30

    This white paper describes the results of new research to develop an uncertainty characterization process to help address the challenges of regional climate change mitigation and adaptation decisions.

  14. Uncertainty quantification in fission cross section measurements at LANSCE

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Tovesson, F.

    2015-01-09

    Neutron-induced fission cross sections have been measured for several isotopes of uranium and plutonium at the Los Alamos Neutron Science Center (LANSCE) over a wide range of incident neutron energies. The total uncertainties in these measurements are in the range 3–5% above 100 keV of incident neutron energy and result from uncertainties in the target, neutron source, and detector system. The individual sources of uncertainty are assumed to be uncorrelated; however, correlations in the cross section across neutron energy bins are considered. The quantification of the uncertainty contributions is described here.

  15. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, main report

    SciTech Connect (OSTI)

    Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The ultimate objective of the joint effort was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. Experts developed their distributions independently. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. To validate the distributions generated for the dispersion code input variables, samples were taken from the distributions and propagated through the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the first of a three-volume document describing the project.

  16. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

    SciTech Connect (OSTI)

    Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.

    1995-01-01

    MACCS and COSYMA, two new probabilistic accident consequence codes completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters; the experts developed their distributions independently. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project.

  17. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices

    SciTech Connect (OSTI)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Harrison, J.D.; Harper, F.T.; Hora, S.C.

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  18. Investigation of uncertainty components in Coulomb blockade thermometry

    SciTech Connect (OSTI)

    Hahtela, O. M.; Heinonen, M.; Manninen, A.; Meschke, M.; Savin, A.; Pekola, J. P.; Gunnarsson, D.; Prunnila, M.; Penttilä, J. S.; Roschier, L.

    2013-09-11

    Coulomb blockade thermometry (CBT) has proven to be a feasible method for primary thermometry in everyday laboratory use at cryogenic temperatures from ca. 10 mK to a few tens of kelvins. The operation of CBT is based on single-electron charging effects in normal metal tunnel junctions. In this paper, we discuss the typical error sources and uncertainty components that limit the present absolute accuracy of CBT measurements to the level of about 1% in the optimum temperature range. Identifying the influence of different uncertainty sources is a good starting point for improving the measurement accuracy to the level that would allow the CBT to be more widely used in high-precision low-temperature metrological applications and for realizing thermodynamic temperature in accordance with the upcoming new definition of the kelvin.

  19. Size exclusion deep bed filtration: Experimental and modelling uncertainties

    SciTech Connect (OSTI)

    Badalyan, Alexander; You, Zhenjiang; Aji, Kaiser; Bedrikovetsky, Pavel; Carageorgos, Themis; Zeinijahromi, Abbas

    2014-01-15

    A detailed uncertainty analysis associated with carboxyl-modified latex particle capture in glass bead-formed porous media enabled verification of two theoretical stochastic models for prediction of particle retention due to size exclusion. At the beginning of this analysis it is established that size exclusion is the dominant particle capture mechanism in the present study: the calculated significant repulsive Derjaguin-Landau-Verwey-Overbeek potential between latex particles and glass beads indicates their mutual repulsion, thus fulfilling the necessary condition for size exclusion. Applying the linear uncertainty propagation method in the form of a truncated Taylor series expansion, combined standard uncertainties (CSUs) in normalised suspended particle concentrations are calculated from the CSUs in experimentally determined parameters such as the inlet volumetric flowrate of suspension, particle numbers in suspensions, particle concentrations in inlet and outlet streams, and particle and pore throat size distributions. Weathering of glass beads in high alkaline solutions does not appreciably change the particle size distribution and is therefore not considered an additional contributor to the weighted mean particle radius and the corresponding weighted mean standard deviation. The weighted mean particle radius and the log-normal mean pore throat radius are characterised by the highest CSUs among all experimental parameters, translating to a high CSU in the jamming ratio factor (dimensionless particle size). Normalised suspended particle concentrations calculated via the two theoretical models are characterised by higher CSUs than those for experimental data. The model accounting for the fraction of inaccessible flow as a function of latex particle radius excellently predicts normalised suspended particle concentrations for the whole range of jamming ratios. The presented uncertainty analysis can also be used for comparison of intra- and inter-laboratory particle size exclusion data.
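
    The linear propagation step has a compact generic form: u_c^2(y) = sum_i (df/dx_i)^2 u^2(x_i). A small sketch with central-difference sensitivities follows; the two-parameter model is a generic placeholder, not the paper's filtration model:

        import numpy as np

        def combined_std_uncertainty(f, x, u_x, rel_step=1e-6):
            """First-order Taylor (linear) uncertainty propagation with
            central-difference sensitivity coefficients."""
            x = np.asarray(x, dtype=float)
            u_x = np.asarray(u_x, dtype=float)
            sens = np.empty_like(x)
            for i in range(x.size):
                dx = np.zeros_like(x)
                dx[i] = rel_step * max(1.0, abs(x[i]))
                sens[i] = (f(x + dx) - f(x - dx)) / (2.0 * dx[i])
            return float(np.sqrt(np.sum((sens * u_x) ** 2)))

        # Placeholder model: normalised outlet concentration c_out / c_in
        model = lambda p: p[0] / p[1]
        x = [120.0, 400.0]   # illustrative particle counts
        u_x = [6.0, 10.0]    # their standard uncertainties
        y = model(np.asarray(x))
        print(f"c_out/c_in = {y:.3f} +/- {combined_std_uncertainty(model, x, u_x):.3f}")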

  20. Quantifying and Reducing Curve-Fitting Uncertainty in Isc: Preprint

    SciTech Connect (OSTI)

    Campanelli, Mark; Duck, Benjamin; Emery, Keith

    2015-09-28

    Current-voltage (I-V) curve measurements of photovoltaic (PV) devices are used to determine performance parameters and to establish traceable calibration chains. Measurement standards specify localized curve fitting methods, e.g., straight-line interpolation/extrapolation of the I-V curve points near short-circuit current, Isc. By considering such fits as statistical linear regressions, uncertainties in the performance parameters are readily quantified. However, the legitimacy of such a computed uncertainty requires that the model be a valid (local) representation of the I-V curve and that the noise be sufficiently well characterized. Using more data points often has the advantage of lowering the uncertainty. However, more data points can make the uncertainty in the fit arbitrarily small, and this fit uncertainty misses the dominant residual uncertainty due to so-called model discrepancy. Using objective Bayesian linear regression for straight-line fits for Isc, we investigate an evidence-based method to automatically choose data windows of I-V points with reduced model discrepancy. We also investigate noise effects. Uncertainties, aligned with the Guide to the Expression of Uncertainty in Measurement (GUM), are quantified throughout.
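
    A minimal frequentist analogue of the fitting step, using ordinary least squares on synthetic data rather than the preprint's objective Bayesian regression:

        import numpy as np

        rng = np.random.default_rng(2)

        # Synthetic I-V points near short circuit: I = Isc + slope * V + noise
        v = np.linspace(0.0, 2.0, 25)   # V, voltages near V = 0
        i_meas = 9.00 - 0.015 * v + rng.normal(0.0, 0.01, v.size)

        # Straight-line fit I = b*V + a; Isc is the intercept a at V = 0
        (b, a), cov = np.polyfit(v, i_meas, 1, cov=True)
        isc, u_isc = a, np.sqrt(cov[1, 1])

        print(f"Isc = {isc:.4f} A +/- {u_isc:.4f} A (fit standard uncertainty)")
        # Note: this fit uncertainty excludes model discrepancy, which the
        # preprint's evidence-based window selection is designed to address.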

  1. Avoiding climate change uncertainties in Strategic Environmental Assessment

    SciTech Connect (OSTI)

    Larsen, Sanne Vammen; Kørnøv, Lone; Driscoll, Patrick

    2013-11-15

    This article is concerned with how Strategic Environmental Assessment (SEA) practice handles climate change uncertainties within the Danish planning system. First, a hypothetical model is set up for how uncertainty is handled and not handled in decision-making. The model incorporates the strategies of reduction and resilience, and of denying, ignoring and postponing. Second, 151 Danish SEAs are analysed with a focus on the extent to which climate change uncertainties are acknowledged and presented, and the empirical findings are discussed in relation to the model. The findings indicate that despite incentives to do so, climate change uncertainties were systematically avoided or downplayed in all but 5 of the 151 SEAs that were reviewed. Finally, two possible explanatory mechanisms are proposed to explain this: conflict avoidance and a need to quantify uncertainty.

  2. Uncertainty and sensitivity analysis in the 2008 performance assessment for the proposed repository for high-level radioactive waste at Yucca Mountain, Nevada.

    SciTech Connect (OSTI)

    Helton, Jon Craig; Sallaberry, Cedric M.; Hansen, Clifford W.

    2010-05-01

    Extensive work has been carried out by the U.S. Department of Energy (DOE) in the development of a proposed geologic repository at Yucca Mountain (YM), Nevada, for the disposal of high-level radioactive waste. As part of this development, an extensive performance assessment (PA) for the YM repository was completed in 2008 [1] and supported a license application by the DOE to the U.S. Nuclear Regulatory Commission (NRC) for the construction of the YM repository [2]. This presentation provides an overview of the conceptual and computational structure of the indicated PA (hereafter referred to as the 2008 YM PA) and the roles that uncertainty analysis and sensitivity analysis play in this structure.

  3. SOARCA Peach Bottom Atomic Power Station Long-Term Station Blackout Uncertainty Analysis: Convergence of the Uncertainty Results

    SciTech Connect (OSTI)

    Bixler, Nathan E.; Osborn, Douglas M.; Sallaberry, Cedric Jean-Marie; Eckert-Gallup, Aubrey Celia; Mattie, Patrick D.; Ghosh, S. Tina

    2014-02-01

    This paper describes the convergence of MELCOR Accident Consequence Code System, Version 2 (MACCS2) probabilistic results of offsite consequences for the uncertainty analysis of the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout scenario at the Peach Bottom Atomic Power Station. The consequence metrics evaluated are individual latent-cancer fatality (LCF) risk and individual early fatality risk. Consequence results are presented as conditional risk (i.e., assuming the accident occurs, risk per event) to individuals of the public as a result of the accident. In order to verify convergence for this uncertainty analysis, as recommended by the Nuclear Regulatory Commission's Advisory Committee on Reactor Safeguards, a high source term from the original population of Monte Carlo runs has been selected to be used for: (1) a study of the distribution of consequence results stemming solely from epistemic uncertainty in the MACCS2 parameters (i.e., separating the effect from the source term uncertainty), and (2) a comparison between Simple Random Sampling (SRS) and Latin Hypercube Sampling (LHS) in order to validate the original results obtained with LHS. Three replicates (each using a different random seed) of size 1,000 each using LHS and another set of three replicates of size 1,000 using SRS are analyzed. The results show that the LCF risk results are well converged with either LHS or SRS sampling. The early fatality risk results are less well converged at radial distances beyond 2 miles, and this is expected due to the sparse data (predominance of zero results).
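
    The LHS-versus-SRS validation can be illustrated with a toy convergence check using scipy's quasi-Monte Carlo module; the integrand is a stand-in, not MACCS2:

        import numpy as np
        from scipy.stats import qmc

        def toy_consequence_model(u):
            """Stand-in for an expensive consequence calculation on [0,1]^2."""
            return np.exp(u[:, 0]) * np.sin(np.pi * u[:, 1])

        n = 1000  # sample size per replicate

        def srs_sample(seed):
            return np.random.default_rng(seed).random((n, 2))

        def lhs_sample(seed):
            return qmc.LatinHypercube(d=2, seed=seed).random(n)

        # Three replicates per sampling scheme, mirroring the replicated design
        for label, sampler in (("SRS", srs_sample), ("LHS", lhs_sample)):
            means = [toy_consequence_model(sampler(s)).mean() for s in (11, 22, 33)]
            print(f"{label}: replicate means = {np.round(means, 4)}, "
                  f"spread = {np.ptp(means):.5f}")

    The stratification in LHS typically tightens the spread of replicate means for smooth integrands, which is the sense in which the two schemes can be compared for convergence.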

  4. Measurement uncertainty analysis techniques applied to PV performance measurements

    SciTech Connect (OSTI)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.

  5. Calibration and Measurement Uncertainty Estimation of Radiometric Data: Preprint

    SciTech Connect (OSTI)

    Habte, A.; Sengupta, M.; Reda, I.; Andreas, A.; Konings, J.

    2014-11-01

    Evaluating the performance of photovoltaic cells, modules, and arrays that form large solar deployments relies on accurate measurements of the available solar resource. Therefore, determining the accuracy of these solar radiation measurements provides a better understanding of investment risks. This paper provides guidelines and recommended procedures for estimating the uncertainty in calibrations and measurements by radiometers using methods that follow the International Bureau of Weights and Measures Guide to the Expression of Uncertainty in Measurement (GUM). Standardized analysis based on these procedures ensures that the quoted uncertainty is well documented.

  6. PDF uncertainties at large x and gauge boson production

    SciTech Connect (OSTI)

    Accardi, Alberto

    2012-10-01

    I discuss how global QCD fits of parton distribution functions can make the somewhat separated fields of high-energy particle physics and lower energy hadronic and nuclear physics interact to the benefit of both. In particular, I will argue that large rapidity gauge boson production at the Tevatron and the LHC has the highest short-term potential to constrain the theoretical nuclear corrections to DIS data on deuteron targets necessary for up/down flavor separation. This in turn can considerably reduce the PDF uncertainty on cross section calculations of heavy mass particles such as W' and Z' bosons.

  7. Preliminary Results on Uncertainty Quantification for Pattern Analytics

    SciTech Connect (OSTI)

    Stracuzzi, David John; Brost, Randolph; Chen, Maximillian Gene; Malinas, Rebecca; Peterson, Matthew Gregor; Phillips, Cynthia A.; Robinson, David G.; Woodbridge, Diane

    2015-09-01

    This report summarizes preliminary research into uncertainty quantification for pattern analytics within the context of the Pattern Analytics to Support High-Performance Exploitation and Reasoning (PANTHER) project. The primary focus of PANTHER was to make large quantities of remote sensing data searchable by analysts. The work described in this report adds nuance to both the initial data preparation steps and the search process. Search queries are transformed from "does the specified pattern exist in the data?" to "how certain is the system that the returned results match the query?" We show example results for both data processing and search, and discuss a number of possible improvements for each.

  8. Integration of Uncertainty Information into Power System Operations

    SciTech Connect (OSTI)

    Makarov, Yuri V.; Lu, Shuai; Samaan, Nader A.; Huang, Zhenyu; Subbarao, Krishnappa; Etingov, Pavel V.; Ma, Jian; Hafen, Ryan P.; Diao, Ruisheng; Lu, Ning

    2011-10-10

    Contemporary power systems face uncertainties coming from multiple sources, including forecast errors of load, wind and solar generation, uninstructed deviation and forced outage of traditional generators, loss of transmission lines, and others. With increasing amounts of wind and solar generation being integrated into the system, these uncertainties have been growing significantly. It is critically important to build knowledge of the major sources of uncertainty, learn how to simulate them, and then incorporate this information into decision-making processes and power system operations, for better reliability and efficiency. This paper gives a comprehensive view of the sources of uncertainty in power systems, their important characteristics, available models, and ways of integrating them into system operations. It is primarily based on previous work conducted at the Pacific Northwest National Laboratory (PNNL).

  9. Asymptotic and uncertainty analyses of a phase field model for...

    Office of Scientific and Technical Information (OSTI)

    In contrast, the uncertainty resulting from the void surface energy has minimal effect. The analysis also shows that the model is consistent in the sense that its predictions do ...

  10. Research Portfolio Report Ultra-Deepwater: Geologic Uncertainty

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Research Portfolio Report, Ultra-Deepwater: Geologic Uncertainty (DOE/NETL-2015/1694). Cover image: 3D visualization of directionally drilled boreholes in the Gulf of Mexico, field MC109, showing NETL's interpretation of two reservoir sand intervals. Prepared by: Mari Nichols-Haining, Jennifer Funk, Kathy Bruner, John Oelfke, and Christine Rueter, KeyLogic Systems, Inc. National Energy Technology Laboratory (NETL) contact: James Ammer, james.ammer@netl.doe.gov. Contract

  11. Gas Exploration Software for Reducing Uncertainty in Gas Concentration Estimates

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Energy Innovation Portal, Lawrence Berkeley National Laboratory. Technology Marketing Summary: Estimating reservoir parameters for gas exploration from geophysical data is subject to a large degree of uncertainty. Seismic imaging techniques, such as seismic amplitude versus angle (AVA) analysis, can

  12. Modeling Correlations In Prompt Neutron Fission Spectra Uncertainties

    Office of Scientific and Technical Information (OSTI)

    Conference paper, SciTech Connect: Modeling Correlations In Prompt Neutron Fission Spectra Uncertainties. Authors: White, Morgan C.; Rising, Michael E.; Talou, Patrick (Los Alamos National Laboratory). Publication Date: 2012-10-22. OSTI Identifier: 1053899. Report Number(s): LA-UR-12-25665. DOE Contract Number: AC52-06NA25396. Resource Type: Conference.

  13. Improvements of Nuclear Data and Its Uncertainties by Theoretical Modeling

    Office of Scientific and Technical Information (OSTI)

    Technical Report, SciTech Connect: Improvements of Nuclear Data and Its Uncertainties by Theoretical Modeling. Authors: Talou, Patrick (Los Alamos National Laboratory); Nazarewicz, Witold (University of Tennessee, Knoxville, TN 37996, USA); Prinja, Anil (University of New Mexico, USA); Danon, Yaron (Rensselaer Polytechnic Institute, USA).

  14. Measuring Thermal Conductivity with Raman: Capability Uncertainty and Strain Effects

    Office of Scientific and Technical Information (OSTI)

    Conference paper, SciTech Connect: Measuring Thermal Conductivity with Raman: Capability Uncertainty and Strain Effects. Abstract not provided. Authors: Beechem III, Thomas Edwin; Yates, Luke. Publication Date: 2012-11-01. OSTI Identifier: 1116156. Report Number(s): SAND2012-10198C 480178. DOE Contract Number: AC04-94AL85000. Resource Type: Conference.

  15. Comparison of Uncertainty of Two Precipitation Prediction Models at Los Alamos National Lab Technical Area 54

    Office of Scientific and Technical Information (OSTI)

    Technical Report, SciTech Connect. Meteorological inputs are an important part of subsurface flow and transport modeling. The choice of source for meteorological data used as inputs has

  16. Uncertainties in the Anti-neutrino Production at Nuclear Reactors

    SciTech Connect (OSTI)

    Djurcic, Zelimir; Detwiler, Jason A.; Piepke, Andreas; Foster Jr., Vince R.; Miller, Lester; Gratta, Giorgio

    2008-08-06

    Anti-neutrino emission rates from nuclear reactors are determined from thermal power measurements and fission rate calculations. The uncertainties in these quantities for commercial power plants and their impact on the calculated interaction rates in ν̄e detectors are examined. We discuss reactor-to-reactor correlations between the leading uncertainties, and their relevance to reactor ν̄e experiments.

  17. Uncertainty Quantification and Propagation in Nuclear Density Functional Theory

    Office of Scientific and Technical Information (OSTI)

    Conference paper, SciTech Connect: Uncertainty Quantification and Propagation in Nuclear Density Functional Theory. Nuclear density functional theory (DFT) is one of the main theoretical tools used to study the properties of heavy and superheavy elements, or to describe the structure of nuclei far from stability. While on-going efforts seek to better root

  18. Uncertainty quantification for discrimination of nuclear events as violations of the comprehensive nuclear-test-ban treaty

    Office of Scientific and Technical Information (OSTI)

    Journal Article, DOE PAGES: Uncertainty quantification for discrimination of nuclear events as violations of the comprehensive nuclear-test-ban treaty. Authors: Sloan, Jamison; Sun, Yunwei.

  19. Uncertainty quantification for evaluating impacts of caprock and reservoir properties on pressure buildup and ground surface displacement during geological CO2 sequestration

    Office of Scientific and Technical Information (OSTI)

    Journal Article, SciTech Connect: Uncertainty quantification for evaluating impacts of caprock and reservoir properties on pressure buildup and ground surface displacement during geological CO2 sequestration.

  20. Uncertainty quantification for evaluating the impacts of fracture zone on pressure build-up and ground surface uplift during geological CO₂ sequestration

    Office of Scientific and Technical Information (OSTI)

    Journal Article, SciTech Connect. A series of numerical

  1. Uncertainty quantification methodologies development for storage and transportation of used nuclear fuel: Pilot study on stress corrosion cracking of canister welds

    Office of Scientific and Technical Information (OSTI)

    Technical Report, SciTech Connect.

  2. Uncertainty quantification of US Southwest climate from IPCC projections.

    Office of Scientific and Technical Information (OSTI)

    Technical Report, SciTech Connect. The Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report (AR4) made extensive use of coordinated simulations by 18 international modeling groups using a variety of coupled general circulation models (GCMs) with different

  3. Lab RFP: Validation and Uncertainty Characterization | Department of Energy

    Office of Environmental Management (EM)

    LBNL's FLEXLAB test facility includes four test cells, each split into two half-cells to enable side-by-side comparative experiments. The cells have one active, reconfigurable facade and individual, reconfigurable single-zone HVAC systems. The cell facing the camera sits on a 270-degree turntable. Photo credit: LBNL. Bottom: ORNL's two-story flexible research platform test building. The building

  4. Microsoft PowerPoint - Managing Uncertainty and Demonstrating Compliance

    Office of Environmental Management (EM)

    Managing Uncertainty and Demonstrating Compliance. Roger Seitz (SRNL), Performance and Risk Assessment Community of Practice Technical Exchange, December 15, 2015. Introduction: Remediation activities and radioactive waste disposal facilities can go to great lengths to demonstrate safety. Designs and assessments address potential impacts over very long time frames relative to other human activities. Uncertainty is inherent in natural systems and long time frames, but it can be effectively managed to make

  5. How incorporating more data reduces uncertainty in recovery predictions

    SciTech Connect (OSTI)

    Campozana, F.P.; Lake, L.W.; Sepehrnoori, K.

    1997-08-01

    From the discovery to the abandonment of a petroleum reservoir, there are many decisions that involve economic risks because of uncertainty in the production forecast. This uncertainty may be quantified by performing stochastic reservoir modeling (SRM); however, it is not practical to apply SRM every time the model is updated to account for new data. This paper suggests a novel procedure to estimate reservoir uncertainty (and its reduction) as a function of the amount and type of data used in the reservoir modeling. Two types of data are analyzed: conditioning data and well-test data. However, the same procedure can be applied to any other data type. Three performance parameters are suggested to quantify uncertainty. SRM is performed for the following typical stages: discovery, primary production, secondary production, and infill drilling. From those results, a set of curves is generated that can be used to estimate (1) the uncertainty for any other situation and (2) the uncertainty reduction caused by the introduction of new wells (with and without well-test data) into the description.

  6. Fuzzy-probabilistic calculations of water-balance uncertainty

    SciTech Connect (OSTI)

    Faybishenko, B.

    2009-10-01

    Hydrogeological systems are often characterized by imprecise, vague, inconsistent, incomplete, or subjective information, which may limit the application of conventional stochastic methods in predicting hydrogeologic conditions and associated uncertainty. Instead, predictions and uncertainty analysis can be made using uncertain input parameters expressed as probability boxes, intervals, and fuzzy numbers. The objective of this paper is to present the theory for, and a case study as an application of, the fuzzy-probabilistic approach, combining probability and possibility theory for simulating soil water balance and assessing associated uncertainty in the components of a simple water-balance equation. The application of this approach is demonstrated using calculations with the RAMAS Risk Calc code, to assess the propagation of uncertainty in calculating potential evapotranspiration, actual evapotranspiration, and infiltration in a case study at the Hanford site, Washington, USA. Propagation of uncertainty into the results of water-balance calculations was evaluated by changing the types of models of uncertainty incorporated into various input parameters. The results of these fuzzy-probabilistic calculations are compared to the conventional Monte Carlo simulation approach and estimates from field observations at the Hanford site.
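
    As a toy illustration of bounding analysis in this spirit, the sketch below propagates interval-valued inputs through a one-line water balance. The bounds are hypothetical, and the paper's actual machinery (probability boxes, fuzzy numbers, the RAMAS Risk Calc code) is far richer than plain interval arithmetic.

```python
# Minimal sketch: propagating interval-valued inputs through a simple
# annual water-balance relation, I = P - ET (all in mm/yr). Illustrative
# bounding analysis only; all numbers are assumptions, not Hanford data.

def interval_sub(a, b):
    """Subtract interval b from interval a: [a_lo - b_hi, a_hi - b_lo]."""
    return (a[0] - b[1], a[1] - b[0])

P = (160.0, 220.0)   # hypothetical precipitation bounds, mm/yr
ET = (120.0, 200.0)  # hypothetical actual-evapotranspiration bounds, mm/yr

I = interval_sub(P, ET)
print(f"Infiltration bounds: [{I[0]:.0f}, {I[1]:.0f}] mm/yr")
```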

  7. Uncertainty Estimation Improves Energy Measurement and Verification Procedures

    SciTech Connect (OSTI)

    Walter, Travis; Price, Phillip N.; Sohn, Michael D.

    2014-05-14

    Implementing energy conservation measures in buildings can reduce energy costs and environmental impacts, but such measures cost money to implement, so intelligent investment strategies require the ability to quantify the energy savings by comparing actual energy used to how much energy would have been used in the absence of the conservation measures (known as the baseline energy use). Methods exist for predicting baseline energy use, but a limitation of most statistical methods reported in the literature is inadequate quantification of the uncertainty in baseline energy use predictions. However, estimation of uncertainty is essential for weighing the risks of investing in retrofits. Most commercial buildings have, or soon will have, electricity meters capable of providing data at short time intervals. These data provide new opportunities to quantify uncertainty in baseline predictions, and to do so after shorter measurement durations than are traditionally used. In this paper, we show that uncertainty estimation provides greater measurement and verification (M&V) information and helps to overcome some of the difficulties with deciding how much data is needed to develop baseline models and to confirm energy savings. We also show that cross-validation is an effective method for computing uncertainty. In so doing, we extend a simple regression-based method of predicting energy use using short-interval meter data. We demonstrate the methods by predicting energy use in 17 real commercial buildings. We discuss the benefits of uncertainty estimates, which can provide actionable decision-making information for investing in energy conservation measures.
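
    A minimal sketch of the cross-validation idea described above, using synthetic temperature-driven load data and an ordinary least-squares baseline; the data, the linear model form, and the fold count are invented for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic hourly data: energy use driven by outdoor temperature.
# Both the data and the linear baseline form are assumptions.
n = 2000
temp = rng.uniform(0, 35, n)
energy = 50 + 2.5 * temp + rng.normal(0, 8, n)

# Baseline model: ordinary least squares on [1, temp].
X = np.column_stack([np.ones(n), temp])

# k-fold cross-validation: collect out-of-sample residuals.
k = 5
folds = np.array_split(rng.permutation(n), k)
residuals = []
for test_idx in folds:
    train_idx = np.setdiff1d(np.arange(n), test_idx)
    beta, *_ = np.linalg.lstsq(X[train_idx], energy[train_idx], rcond=None)
    residuals.append(energy[test_idx] - X[test_idx] @ beta)
residuals = np.concatenate(residuals)

# The spread of out-of-sample residuals estimates baseline prediction
# uncertainty, here as an empirical 90% interval for hourly predictions.
lo, hi = np.percentile(residuals, [5, 95])
print(f"90% out-of-sample residual interval: [{lo:.1f}, {hi:.1f}] kWh")
```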

  8. Uncertainty and sensitivity analysis for photovoltaic system modeling.

    SciTech Connect (OSTI)

    Hansen, Clifford W.; Pohl, Andrew Phillip; Jordan, Dirk

    2013-12-01

    We report an uncertainty and sensitivity analysis for modeling DC energy from photovoltaic systems. We consider two systems, each comprising a single module using either crystalline silicon or CdTe cells, and located either at Albuquerque, NM, or Golden, CO. Output from a PV system is predicted by a sequence of models. Uncertainty in the output of each model is quantified by empirical distributions of each model's residuals. We sample these distributions to propagate uncertainty through the sequence of models to obtain an empirical distribution for each PV system's output. We considered models that: (1) translate measured global horizontal, direct and global diffuse irradiance to plane-of-array irradiance; (2) estimate effective irradiance from plane-of-array irradiance; (3) predict cell temperature; and (4) estimate DC voltage, current and power. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. Four alternative models were considered for the POA irradiance modeling step; we did not find the choice of one of these models to be of great significance. However, we observed that the POA irradiance model introduced a bias of upwards of 5% of daily energy, which translates directly to a systematic difference in predicted energy. Sensitivity analyses relate uncertainty in the PV system output to uncertainty arising from each model. We found the residuals arising from the POA irradiance and the effective irradiance models to be the dominant contributors to residuals for daily energy, for either technology or location considered. This analysis indicates that efforts to reduce the uncertainty in PV system output should focus on improvements to the POA and effective irradiance models.
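
    The propagation scheme, sampling each model's empirical residual distribution and pushing the samples through the model chain, can be sketched as follows. The two-step chain and the residual pools below are stand-ins, not the paper's validated models or residuals.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical residual pools for two chained models (e.g. POA irradiance
# and effective irradiance), expressed as relative errors collected during
# model validation. Values are illustrative assumptions.
resid_poa = rng.normal(0.00, 0.02, 500)
resid_eff = rng.normal(0.01, 0.01, 500)

def chain(poa_nominal, n_samples=10000):
    """Propagate uncertainty by resampling each model's residual pool."""
    r1 = rng.choice(resid_poa, n_samples)
    r2 = rng.choice(resid_eff, n_samples)
    poa = poa_nominal * (1 + r1)   # model 1 output with sampled error
    eff = poa * (1 + r2)           # model 2 applied to model 1 output
    return eff

samples = chain(1000.0)            # nominal 1000 W/m2 plane-of-array
print(f"mean = {samples.mean():.1f}, std = {samples.std():.1f} W/m2")
```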

  9. A Method to Estimate Uncertainty in Radiometric Measurement Using the Guide to the Expression of Uncertainty in Measurement (GUM) Method; NREL (National Renewable Energy Laboratory)

    SciTech Connect (OSTI)

    Habte, A.; Sengupta, M.; Reda, I.

    2015-03-01

    Radiometric data with known and traceable uncertainty are essential for climate change studies to better understand cloud-radiation interactions and the Earth radiation budget. Further, adopting a known and traceable method of estimating uncertainty with respect to SI ensures that the uncertainty quoted for radiometric measurements can be compared based on documented methods of derivation. Therefore, statements about the overall measurement uncertainty can only be made on an individual basis, taking all relevant factors into account. This poster provides guidelines and recommended procedures for estimating the uncertainty in calibrations and measurements from radiometers. The approach follows the Guide to the Expression of Uncertainty in Measurement (GUM).
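
    A generic GUM-style combination, not NREL's specific procedure, looks like the sketch below: numerically estimated sensitivity coefficients multiply the standard uncertainties, and the terms combine in root-sum-square. The measurement function and all numbers are hypothetical.

```python
import numpy as np

# Hypothetical measurement function y = f(x): an irradiance-like quotient
# (voltage / responsivity * correction factor). Purely illustrative.
def f(x):
    v, r, k = x
    return v / r * k

x0 = np.array([8.0e-3, 8.5e-6, 1.001])   # best estimates of the inputs
u = np.array([2.0e-5, 5.0e-8, 5.0e-4])   # standard uncertainties (Type A/B)

# Sensitivity coefficients c_i = df/dx_i by central finite differences.
c = np.empty_like(x0)
for i in range(len(x0)):
    dx = np.zeros_like(x0)
    dx[i] = 1e-6 * abs(x0[i])
    c[i] = (f(x0 + dx) - f(x0 - dx)) / (2 * dx[i])

u_c = np.sqrt(np.sum((c * u) ** 2))   # combined standard uncertainty
U = 2 * u_c                           # expanded uncertainty, k = 2 (~95%)
print(f"y = {f(x0):.2f}, u_c = {u_c:.2f}, U(k=2) = {U:.2f}")
```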

  10. Investment and Upgrade in Distributed Generation under Uncertainty

    SciTech Connect (OSTI)

    Siddiqui, Afzal; Maribu, Karl

    2008-08-18

    The ongoing deregulation of electricity industries worldwide is providing incentives for microgrids to use small-scale distributed generation (DG) and combined heat and power (CHP) applications via heat exchangers (HXs) to meet local energy loads. Although the electric-only efficiency of DG is lower than that of central-station production, relatively high tariff rates and the potential for CHP applications increase the attraction of on-site generation. Nevertheless, a microgrid contemplating the installation of gas-fired DG has to be aware of the uncertainty in the natural gas price. Treatment of uncertainty via real options increases the value of the investment opportunity, which then delays the adoption decision as the opportunity cost of exercising the investment option increases as well. In this paper, we take the perspective of a microgrid that can proceed in a sequential manner with DG capacity and HX investment in order to reduce its exposure to risk from natural gas price volatility. In particular, with the availability of the HX, the microgrid faces a tradeoff between reducing its exposure to the natural gas price and maximising its cost savings. By varying the volatility parameter, we find that the microgrid prefers a direct investment strategy for low levels of volatility and a sequential one for higher levels of volatility.

  11. Fuel cycle cost uncertainty from nuclear fuel cycle comparison

    SciTech Connect (OSTI)

    Li, J.; McNelis, D.; Yim, M.S.

    2013-07-01

    This paper examined the uncertainty in fuel cycle cost (FCC) calculation by considering both model and parameter uncertainty. Four different fuel cycle options were compared in the analysis, including the once-through cycle (OT), the DUPIC cycle, the MOX cycle, and a closed fuel cycle with fast reactors (FR). The model uncertainty was addressed by using three different FCC modeling approaches with and without the time value of money consideration. The relative ratios of FCC in comparison to OT did not change much by using different modeling approaches. This observation was consistent with the results of the sensitivity study for the discount rate. Two different sets of data with uncertainty ranges of unit costs were used to address the parameter uncertainty of the FCC calculation. The sensitivity study showed that the dominating contributor to the total variance of FCC is the uranium price. In general, the FCC of OT was found to be the lowest, followed by FR, MOX, and DUPIC. But depending on the uranium price, the FR cycle was found to have a lower FCC than OT. The reprocessing cost was also found to have a major impact on FCC.

  12. Systematic uncertainties from halo asphericity in dark matter searches

    SciTech Connect (OSTI)

    Bernal, Nicolás; Forero-Romero, Jaime E.; Garani, Raghuveer; Palomares-Ruiz, Sergio

    2014-09-01

    Although commonly assumed to be spherical, dark matter halos are predicted to be non-spherical by N-body simulations and their asphericity has a potential impact on the systematic uncertainties in dark matter searches. The evaluation of these uncertainties is the main aim of this work, where we study the impact of aspherical dark matter density distributions in Milky-Way-like halos on direct and indirect searches. Using data from the large N-body cosmological simulation Bolshoi, we perform a statistical analysis and quantify the systematic uncertainties on the determination of local dark matter density and the so-called J factors for dark matter annihilations and decays from the galactic center. We find that, due to our ignorance about the extent of the non-sphericity of the Milky Way dark matter halo, systematic uncertainties can be as large as 35%, within the 95% most probable region, for a spherically averaged value for the local density of 0.3-0.4 GeV/cm³. Similarly, systematic uncertainties on the J factors evaluated around the galactic center can be as large as 10% and 15%, within the 95% most probable region, for dark matter annihilations and decays, respectively.

  13. PIV Uncertainty Methodologies for CFD Code Validation at the MIR Facility

    SciTech Connect (OSTI)

    Piyush Sabharwall; Richard Skifton; Carl Stoots; Eung Soo Kim; Thomas Conder

    2013-12-01

    Currently, computational fluid dynamics (CFD) is widely used in the nuclear thermal hydraulics field for design and safety analyses. To validate CFD codes, high-quality multidimensional flow field data are essential. The Matched Index of Refraction (MIR) Flow Facility at Idaho National Laboratory has a unique capability to contribute to the development of validated CFD codes through the use of Particle Image Velocimetry (PIV). The significance of the MIR facility is that it permits non-intrusive velocity measurement techniques, such as PIV, through complex models without requiring probes and other instrumentation that disturb the flow. At the heart of any PIV calculation is the cross-correlation, which is used to estimate the displacement of particles in some small part of the image over the time span between two images. This image displacement is indicated by the location of the largest correlation peak. In the MIR facility, uncertainty quantification is a challenging task due to the use of optical measurement techniques. This study is developing a reliable method to analyze the uncertainty and sensitivity of the measured data, together with a computer code to automate that analysis. The main objective of this study is to develop a well-established uncertainty quantification method for the MIR Flow Facility, which involves many complicated uncertainty factors. The uncertainty sources are resolved in depth by categorizing them into uncertainties from the MIR flow loop and from the PIV system (including particle motion, image distortion, and data processing). Each uncertainty source is then mathematically modeled or adequately defined. Finally, this study will provide a method and procedure to quantify the experimental uncertainty in the MIR Flow Facility with sample test results.
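
    The cross-correlation step described above can be sketched as follows, with synthetic particle images and an FFT-based correlation whose peak gives the integer-pixel displacement. A real PIV chain adds interrogation windowing, subpixel peak fitting, and vector validation; everything here is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(2)

def render(xy, n=64):
    """Render Gaussian particle images at the given (x, y) positions."""
    yy, xx = np.mgrid[0:n, 0:n]
    img = np.zeros((n, n))
    for px, py in xy:
        img += np.exp(-((xx - px) ** 2 + (yy - py) ** 2) / 2.0)
    return img

n = 64
true_shift = np.array([3.0, -2.0])            # (dx, dy) in pixels
xy = rng.uniform(8, n - 8, (40, 2))           # random particle positions
img_a = render(xy, n)
img_b = render(xy + true_shift, n)            # same particles, displaced

# Circular cross-correlation via FFT; the largest peak marks the
# integer-pixel displacement between the two frames.
corr = np.fft.ifft2(np.conj(np.fft.fft2(img_a)) * np.fft.fft2(img_b)).real
corr = np.fft.fftshift(corr)
peak_row, peak_col = np.unravel_index(np.argmax(corr), corr.shape)
dy, dx = peak_row - n // 2, peak_col - n // 2
print(f"estimated shift: dx={dx}, dy={dy}  (true: dx=3, dy=-2)")
```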

  14. Uncertainty quantification and validation of combined hydrological and macroeconomic analyses.

    SciTech Connect (OSTI)

    Hernandez, Jacquelynne; Parks, Mancel Jordan; Jennings, Barbara Joan; Kaplan, Paul Garry; Brown, Theresa Jean; Conrad, Stephen Hamilton

    2010-09-01

    Changes in climate can lead to instabilities in physical and economic systems, particularly in regions with marginal resources. Global climate models indicate increasing global mean temperatures over the decades to come and uncertainty in the local to national impacts means perceived risks will drive planning decisions. Agent-based models provide one of the few ways to evaluate the potential changes in behavior in coupled social-physical systems and to quantify and compare risks. The current generation of climate impact analyses provides estimates of the economic cost of climate change for a limited set of climate scenarios that account for a small subset of the dynamics and uncertainties. To better understand the risk to national security, the next generation of risk assessment models must represent global stresses, population vulnerability to those stresses, and the uncertainty in population responses and outcomes that could have a significant impact on U.S. national security.

  15. A Unified Approach for Reporting ARM Measurement Uncertainties Technical Report

    SciTech Connect (OSTI)

    Campos, E; Sisterson, DL

    2015-10-01

    The Atmospheric Radiation Measurement (ARM) Climate Research Facility is observationally based, and quantifying the uncertainty of its measurements is critically important. With over 300 widely differing instruments providing over 2,500 datastreams, concise expression of measurement uncertainty is quite challenging. The ARM Facility currently provides data and supporting metadata (information about the data or data quality) to its users through a number of sources. Because the continued success of the ARM Facility depends on the known quality of its measurements, the Facility relies on instrument mentors and the ARM Data Quality Office (DQO) to ensure, assess, and report measurement quality. Therefore, an easily-accessible, well-articulated estimate of ARM measurement uncertainty is needed.

  16. Uncertainty Analysis Technique for OMEGA Dante Measurements (Conference) |

    Office of Scientific and Technical Information (OSTI)

    SciTech Connect. The Dante is an 18-channel X-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums) at X-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics.

  17. Uncertainty quantification in lattice QCD calculations for nuclear physics

    SciTech Connect (OSTI)

    Beane, Silas R.; Detmold, William; Orginos, Kostas; Savage, Martin J.

    2015-02-05

    The numerical technique of Lattice QCD holds the promise of connecting the nuclear forces, nuclei, the spectrum and structure of hadrons, and the properties of matter under extreme conditions with the underlying theory of the strong interactions, quantum chromodynamics. A distinguishing, and thus far unique, feature of this formulation is that all of the associated uncertainties, both statistical and systematic, can, in principle, be systematically reduced to any desired precision with sufficient computational and human resources. Here, we review the sources of uncertainty inherent in Lattice QCD calculations for nuclear physics and discuss how each is quantified in current efforts.

  18. Distributed Generation Investment by a Microgrid under Uncertainty

    SciTech Connect (OSTI)

    Siddiqui, Afzal; Marnay, Chris

    2008-08-11

    This paper examines a California-based microgrid's decision to invest in a distributed generation (DG) unit fuelled by natural gas. While the long-term natural gas generation cost is stochastic, we initially assume that the microgrid may purchase electricity at a fixed retail rate from its utility. Using the real options approach, we find a natural gas generation cost threshold that triggers DG investment. Furthermore, the consideration of operational flexibility by the microgrid increases DG investment, while the option to disconnect from the utility is not attractive. By allowing the electricity price to be stochastic, we next determine an investment threshold boundary and find that high electricity price volatility relative to that of natural gas generation cost delays investment while simultaneously increasing the value of the investment. We conclude by using this result to find the implicit option value of the DG unit when two sources of uncertainty exist.

  19. IAEA CRP on HTGR Uncertainty Analysis: Benchmark Definition and Test Cases

    SciTech Connect (OSTI)

    Gerhard Strydom; Frederik Reitsma; Hans Gougar; Bismark Tyobeka; Kostadin Ivanov

    2012-11-01

    Uncertainty and sensitivity studies are essential elements of the reactor simulation code verification and validation process. Although several international uncertainty quantification activities have been launched in recent years in the LWR, BWR and VVER domains (e.g. the OECD/NEA BEMUSE program [1], from which the current OECD/NEA LWR Uncertainty Analysis in Modelling (UAM) benchmark [2] effort was derived), the systematic propagation of uncertainties in cross-section, manufacturing and model parameters for High Temperature Reactor (HTGR) designs has not been attempted yet. This paper summarises the scope, objectives and exercise definitions of the IAEA Coordinated Research Project (CRP) on HTGR UAM [3]. Note that no results will be included here, as the HTGR UAM benchmark was only launched formally in April 2012, and the specification is currently still under development.

  20. Effect of soil property uncertainties on permafrost thaw projections: A calibration-constrained analysis

    SciTech Connect (OSTI)

    Harp, D. R.; Atchley, A. L.; Painter, S. L.; Coon, E. T.; Wilson, C. J.; Romanovsky, V. E.; Rowland, J. C.

    2015-06-29

    The effect of soil property uncertainties on permafrost thaw projections is studied using a three-phase subsurface thermal hydrology model and calibration-constrained uncertainty analysis. The Null-Space Monte Carlo method is used to identify soil hydrothermal parameter combinations that are consistent with borehole temperature measurements at the study site, the Barrow Environmental Observatory. Each parameter combination is then used in a forward projection of permafrost conditions for the 21st century (from calendar year 2006 to 2100) using atmospheric forcings from the Community Earth System Model (CESM) in the Representative Concentration Pathway (RCP) 8.5 greenhouse gas concentration trajectory. A 100-year projection allows for the evaluation of intra-annual uncertainty due to soil properties and the inter-annual variability due to year-to-year differences in CESM climate forcings. After calibrating to borehole temperature data at this well-characterized site, soil property uncertainties are still significant and result in significant intra-annual uncertainties in projected active layer thickness (ALT) and annual thaw depth-duration even with a specified future climate. Intra-annual uncertainties in projected soil moisture content and Stefan number are small. A volume- and time-integrated Stefan number decreases significantly in the future climate, indicating that latent heat of phase change becomes more important than heat conduction in future climates. Out of 10 soil parameters, ALT, annual thaw depth-duration, and Stefan number are highly dependent on mineral soil porosity, while annual mean liquid saturation of the active layer is highly dependent on the mineral soil residual saturation and moderately dependent on peat residual saturation. By comparing the ensemble statistics to the spread of projected permafrost metrics using different climate models, we show that the effect of calibration-constrained uncertainty in soil properties, although significant, is less than that produced by structural climate model uncertainty for this location.
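
    For orientation, the Stefan number invoked above is commonly defined as Ste = c_p ΔT / L, the ratio of sensible to latent heat. The quick calculation below uses textbook properties of water and assumed temperature excursions, not the study's parameters.

```python
# Quick illustration of the Stefan number, Ste = c_p * dT / L. The values
# below are rough textbook figures for thawing soil water and are
# assumptions, not parameters from the study.
c_p = 4186.0      # specific heat of liquid water, J/(kg K)
L = 334000.0      # latent heat of fusion of water, J/kg
for dT in (1.0, 5.0, 10.0):   # assumed temperature excursions above 0 C
    print(f"dT = {dT:4.1f} K  ->  Ste = {c_p * dT / L:.3f}")
# Ste << 1 means latent heat of phase change dominates heat conduction,
# consistent with the projected decrease in Stefan number.
```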

  1. Effect of soil property uncertainties on permafrost thaw projections: A calibration-constrained analysis

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Harp, D. R.; Atchley, A. L.; Painter, S. L.; Coon, E. T.; Wilson, C. J.; Romanovsky, V. E.; Rowland, J. C.

    2015-06-29

    The effect of soil property uncertainties on permafrost thaw projections is studied using a three-phase subsurface thermal hydrology model and calibration-constrained uncertainty analysis. The Null-Space Monte Carlo method is used to identify soil hydrothermal parameter combinations that are consistent with borehole temperature measurements at the study site, the Barrow Environmental Observatory. Each parameter combination is then used in a forward projection of permafrost conditions for the 21st century (from calendar year 2006 to 2100) using atmospheric forcings from the Community Earth System Model (CESM) in the Representative Concentration Pathway (RCP) 8.5 greenhouse gas concentration trajectory. A 100-year projection allows for the evaluation of intra-annual uncertainty due to soil properties and the inter-annual variability due to year-to-year differences in CESM climate forcings. After calibrating to borehole temperature data at this well-characterized site, soil property uncertainties are still significant and result in significant intra-annual uncertainties in projected active layer thickness (ALT) and annual thaw depth-duration even with a specified future climate. Intra-annual uncertainties in projected soil moisture content and Stefan number are small. A volume- and time-integrated Stefan number decreases significantly in the future climate, indicating that latent heat of phase change becomes more important than heat conduction in future climates. Out of 10 soil parameters, ALT, annual thaw depth-duration, and Stefan number are highly dependent on mineral soil porosity, while annual mean liquid saturation of the active layer is highly dependent on the mineral soil residual saturation and moderately dependent on peat residual saturation. By comparing the ensemble statistics to the spread of projected permafrost metrics using different climate models, we show that the effect of calibration-constrained uncertainty in soil properties, although significant, is less than that produced by structural climate model uncertainty for this location.

  2. Improved lower bound on the entropic uncertainty relation

    SciTech Connect (OSTI)

    Jafarpour, Mojtaba; Sabour, Abbass

    2011-09-15

    We present a lower bound on the entropic uncertainty relation for the measurements of two observables in a d-dimensional Hilbert space for d up to 5. This bound provides an improvement over the best one yet available. The feasibility of the obtained bound providing an improvement in higher dimensions is also discussed.
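
    For context, the classic Maassen-Uffink bound, against which improved bounds of this kind are usually compared, states H(A) + H(B) ≥ -2 log₂ c, with c the largest overlap between the two measurement eigenbases. The sketch below evaluates this standard bound (not the paper's improved one) for mutually unbiased bases in the dimensions the paper covers.

```python
import numpy as np

def maassen_uffink_bound(A, B):
    """Maassen-Uffink bound: H(A) + H(B) >= -2*log2(max |<a_i|b_j>|).

    A, B: unitary matrices whose columns are the measurement eigenvectors.
    """
    c = np.abs(A.conj().T @ B).max()
    return -2 * np.log2(c)

for d in range(2, 6):
    A = np.eye(d)                              # computational basis
    B = np.fft.fft(np.eye(d)) / np.sqrt(d)     # Fourier (unbiased) basis
    print(f"d = {d}: H(A) + H(B) >= {maassen_uffink_bound(A, B):.3f} bits")
# For mutually unbiased bases the bound equals log2(d).
```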

  3. Reducing uncertainty in geostatistical description with well testing pressure data

    SciTech Connect (OSTI)

    Reynolds, A.C.; He, Nanqun; Oliver, D.S.

    1997-08-01

    Geostatistics has proven to be an effective tool for generating realizations of reservoir properties conditioned to static data, e.g., core and log data and geologic knowledge. Due to the lack of closely spaced data in the lateral directions, there will be significant variability in reservoir descriptions generated by geostatistical simulation, i.e., significant uncertainty in the reservoir descriptions. In past work, we have presented procedures based on inverse problem theory for generating reservoir descriptions (rock property fields) conditioned to pressure data and geostatistical information represented as prior means for log-permeability and porosity and variograms. Although we have shown that the incorporation of pressure data reduces the uncertainty below the level contained in the geostatistical model based only on static information (the prior model), our previous results did not explicitly account for uncertainties in the prior means and the parameters defining the variogram model. In this work, we investigate how pressure data can help detect errors in the prior means. If errors in the prior means are large and are not taken into account, realizations conditioned to pressure data represent incorrect samples of the a posteriori probability density function for the rock property fields, whereas, if the uncertainty in the prior mean is incorporated properly into the model, one obtains realistic realizations of the rock property fields.

  4. River meander modeling and confronting uncertainty.

    SciTech Connect (OSTI)

    Posner, Ari J.

    2011-05-01

    This study examines the meandering phenomenon as it occurs in media throughout terrestrial, glacial, atmospheric, and aquatic environments. Analysis of the minimum energy principle, along with theories of Coriolis forces (and random walks to explain the meandering phenomenon), found that these theories apply at different temporal and spatial scales. Coriolis forces might induce topological changes resulting in meandering planforms. The minimum energy principle might explain how these forces combine to limit the sinuosity to depth and width ratios that are common throughout various media. The study then compares the first-order analytical solutions for the flow field by Ikeda et al. (1981) and Johannesson and Parker (1989b). The linear bank erosion model of Ikeda et al. was implemented to predict the rate of bank erosion, in which the bank erosion coefficient is treated as a stochastic variable that varies with physical properties of the bank (e.g., cohesiveness, stratigraphy, or vegetation density). The developed model was used to predict the evolution of meandering planforms. The modeling results were then analyzed and compared to the observed data. Since the migration of a meandering channel consists of downstream translation, lateral expansion, and downstream or upstream rotations, several measures are formulated in order to determine which of the resulting planforms is closest to the experimentally measured one. Results from the deterministic model depend strongly on the calibrated erosion coefficient. Since field measurements are always limited, the stochastic model yielded more realistic predictions of meandering planform evolutions. Due to the random nature of the bank erosion coefficient, the meandering planform evolution is a stochastic process that can only be accurately predicted by a stochastic model.

  5. The IAEA Coordinated Research Program on HTGR Uncertainty Analysis: Phase I Status and Initial Results

    SciTech Connect (OSTI)

    Strydom, Gerhard; Bostelmann, Friederike; Ivanov, Kostadin

    2014-10-01

    The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high-fidelity physics models and robust, efficient, and accurate codes. One way to address the uncertainties in the HTGR analysis tools is to assess the sensitivity of critical parameters (such as the calculated maximum fuel temperature during loss-of-coolant accidents) to a few important input uncertainties. The input parameters were identified by engineering judgement in the past but are today typically based on a Phenomena Identification Ranking Table (PIRT) process. The input parameters can also be derived from sensitivity studies and are then varied in the analysis to find a spread in the parameter of importance. However, there is often no easy way to compensate for these uncertainties. In engineering system design, a common approach for addressing performance uncertainties is to add compensating margins to the system, but with passive properties credited it is not so clear how to apply this in the case of the modular HTGR heat removal path. Other, more sophisticated uncertainty modelling approaches, including Monte Carlo analysis, have also been proposed and applied. Ideally one wishes to apply a more fundamental approach to determine the predictive capability and accuracies of coupled neutronics/thermal-hydraulics and depletion simulations used for reactor design and safety assessment. Today there is broader acceptance of the use of uncertainty analysis even in safety studies, and it has been accepted by regulators in some cases to replace the traditional conservative analysis. Some safety analysis calculations may therefore use a mixture of these approaches for different parameters, depending upon the particular requirements of the analysis problem involved. Sensitivity analysis can, for example, be used to provide information as part of an uncertainty analysis to determine best estimate plus uncertainty results to the required confidence level. In order to address uncertainty propagation in analysis and methods in the HTGR community, the IAEA initiated a Coordinated Research Project (CRP) on HTGR Uncertainty Analysis in Modelling (UAM) that officially started in 2013. Although this project focuses specifically on the peculiarities of HTGR designs and their simulation requirements, many lessons can be learned from the LWR community and the significant progress already made towards a consistent uncertainty analysis methodology. In the case of LWRs, the NRC amended 10 CFR 50.46 as early as 1988 to allow best-estimate (plus uncertainties) calculations of emergency core cooling system performance. The Nuclear Energy Agency (NEA) of the Organization for Economic Co-operation and Development (OECD) also established an Expert Group on "Uncertainty Analysis in Modelling", which finally led to the definition of the "Benchmark for Uncertainty Analysis in Modelling (UAM) for Design, Operation and Safety Analysis of LWRs". The CRP on HTGR UAM will follow as far as possible the ongoing OECD Light Water Reactor UAM benchmark activity.

  6. Reduction in maximum time uncertainty of paired time signals

    DOE Patents [OSTI]

    Theodosiou, G.E.; Dawson, J.W.

    1981-02-11

    Reduction in the maximum time uncertainty (t_max − t_min) of a series of paired time signals t_1 and t_2 varying between two input terminals and representative of a series of single events, where t_1 ≤ t_2 and t_1 + t_2 equals a constant, is carried out with a circuit utilizing a combination of OR and AND gates as signal selecting means and one or more time delays to increase the minimum value (t_min) of the first signal t_1 closer to t_max and thereby reduce the difference. The circuit may utilize a plurality of stages to reduce the uncertainty by factors of 20 to 800.

  7. Uncertainty Quantification and Propagation in Nuclear Density Functional Theory

    SciTech Connect (OSTI)

    Schunck, N; McDonnell, J D; Higdon, D; Sarich, J; Wild, S M

    2015-03-17

    Nuclear density functional theory (DFT) is one of the main theoretical tools used to study the properties of heavy and superheavy elements, or to describe the structure of nuclei far from stability. While ongoing efforts seek to better root nuclear DFT in the theory of nuclear forces, energy functionals remain semi-phenomenological constructions that depend on a set of parameters adjusted to experimental data in finite nuclei. In this paper, we review recent efforts to quantify the related uncertainties and propagate them to model predictions. In particular, we cover the topics of parameter estimation for inverse problems, statistical analysis of model uncertainties, and Bayesian inference methods. Illustrative examples are taken from the literature.

  8. Uncertainties in nuclear transition matrix elements of neutrinoless ββ decay

    SciTech Connect (OSTI)

    Rath, P. K.

    2013-12-30

    To estimate the uncertainties associated with the nuclear transition matrix elements M^(K) (K = 0ν/0N) for the 0⁺ → 0⁺ transitions of electron and positron emitting modes of neutrinoless ββ decay, a statistical analysis has been performed by calculating sets of eight (twelve) different nuclear transition matrix elements M^(K) in the PHFB model, employing four different parameterizations of a Hamiltonian with pairing plus multipolar effective two-body interaction and two (three) different parameterizations of Jastrow short-range correlations. The averages, in conjunction with their standard deviations, provide an estimate of the uncertainties associated with the nuclear transition matrix elements M^(K) calculated within the PHFB model, the maximum of which turn out to be 13% and 19% owing to the exchange of light and heavy Majorana neutrinos, respectively.

  9. Giant dipole resonance parameters with uncertainties from photonuclear cross sections

    SciTech Connect (OSTI)

    Plujko, V.A.; Capote, R.; Gorbachenko, O.M.

    2011-09-15

    Updated values and corresponding uncertainties of isovector giant dipole resonance (IVGDR or GDR) model parameters are presented, obtained by least-squares fitting of theoretical photoabsorption cross sections to experimental data. The theoretical photoabsorption cross section is taken as a sum of the components corresponding to excitation of the GDR and the quasideuteron contribution to the experimental photoabsorption cross section. The present compilation covers experimental data as of January 2010. Highlights: Experimental σ(γ, abs) or a sum of partial cross sections are taken as input to the fitting. Data include contributions from photoproton reactions. Standard (SLO) or modified (SMLO) Lorentzian approaches are used for formulating GDR models. Spherical or axially deformed nuclear shapes are used in the GDR least-squares fit. Values and uncertainties of the SLO and SMLO GDR model parameters are tabulated.
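
    A minimal sketch of the least-squares step with a standard Lorentzian (SLO-type) shape, σ(E) = σ_r (EΓ)² / ((E² − E_r²)² + (EΓ)²). The synthetic data, uncertainties, and starting values are assumptions for illustration, not evaluated GDR parameters from the compilation.

```python
import numpy as np
from scipy.optimize import curve_fit

def slo(E, sigma_r, E_r, Gamma):
    """Standard Lorentzian photoabsorption cross section (mb)."""
    return sigma_r * (E * Gamma) ** 2 / ((E**2 - E_r**2) ** 2 + (E * Gamma) ** 2)

rng = np.random.default_rng(3)
E = np.linspace(8, 20, 60)                        # photon energy, MeV
truth = (300.0, 14.0, 4.0)                        # assumed sigma_r, E_r, Gamma
data = slo(E, *truth) + rng.normal(0, 5, E.size)  # synthetic "measurement"

popt, pcov = curve_fit(slo, E, data, p0=(250, 13, 5),
                       sigma=np.full(E.size, 5.0), absolute_sigma=True)
perr = np.sqrt(np.diag(pcov))                     # 1-sigma parameter errors
for name, v, e in zip(("sigma_r", "E_r", "Gamma"), popt, perr):
    print(f"{name} = {v:7.2f} +/- {e:.2f}")
```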

  10. Reduction in maximum time uncertainty of paired time signals

    DOE Patents [OSTI]

    Theodosiou, G.E.; Dawson, J.W.

    1983-10-04

    Reduction in the maximum time uncertainty (t_max − t_min) of a series of paired time signals t_1 and t_2 varying between two input terminals and representative of a series of single events, where t_1 ≤ t_2 and t_1 + t_2 equals a constant, is carried out with a circuit utilizing a combination of OR and AND gates as signal selecting means and one or more time delays to increase the minimum value (t_min) of the first signal t_1 closer to t_max and thereby reduce the difference. The circuit may utilize a plurality of stages to reduce the uncertainty by factors of 20 to 800. 6 figs.

  11. Reduction in maximum time uncertainty of paired time signals

    DOE Patents [OSTI]

    Theodosiou, George E. (West Chicago, IL); Dawson, John W. (Clarendon Hills, IL)

    1983-01-01

    Reduction in the maximum time uncertainty (t_max − t_min) of a series of paired time signals t_1 and t_2 varying between two input terminals and representative of a series of single events, where t_1 ≤ t_2 and t_1 + t_2 equals a constant, is carried out with a circuit utilizing a combination of OR and AND gates as signal selecting means and one or more time delays to increase the minimum value (t_min) of the first signal t_1 closer to t_max and thereby reduce the difference. The circuit may utilize a plurality of stages to reduce the uncertainty by factors of 20 to 800.

  12. Comment on ''Improved bounds on entropic uncertainty relations''

    SciTech Connect (OSTI)

    Bosyk, G. M.; Portesi, M.; Plastino, A.; Zozor, S.

    2011-11-15

    We provide an analytical proof of the entropic uncertainty relations presented by J. I. de Vicente and J. Sanchez-Ruiz [Phys. Rev. A 77, 042110 (2008)] and also show that the replacement of Eq. (27) by Eq. (29) in that reference introduces solutions that do not take fully into account the constraints of the problem, which in turn lead to some mistakes in their treatment.

  13. Computational Fluid Dynamics & Large-Scale Uncertainty Quantification for

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Wind Energy | Sandia Energy

  14. A Unified Approach for Reporting ARM Measurement Uncertainties Technical Report

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    A Unified Approach for Reporting ARM Measurement Uncertainties Technical Report. E. Campos and D.L. Sisterson, October 2015.

  15. Uncertainty quantification for evaluating impacts of caprock and reservoir

    Office of Scientific and Technical Information (OSTI)

    properties on pressure buildup and ground surface displacement during geological CO2 sequestration (Journal Article) | SciTech Connect

  16. Composite Multilinearity, Epistemic Uncertainty and Risk Achievement Worth

    SciTech Connect (OSTI)

    E. Borgonovo; C. L. Smith

    2012-10-01

    Risk Achievement Worth (RAW) is one of the most widely utilized importance measures. RAW is defined as the ratio of the risk metric value attained when a component has failed to the base case value of the risk metric. Traditionally, both the numerator and denominator are point estimates. Relevant literature has shown that inclusion of epistemic uncertainty i) induces notable variability in the point estimate ranking and ii) causes the expected value of the risk metric to differ from its nominal value. We obtain the conditions under which the equality holds between the nominal and expected values of a reliability risk metric. Among these conditions, separability and state-of-knowledge independence emerge. We then study how the presence of epistemic uncertainty affects RAW and the associated ranking. We propose an extension of RAW (called ERAW) which allows one to obtain a ranking robust to epistemic uncertainty. We discuss the properties of ERAW and the conditions under which it coincides with RAW. We apply our findings to a probabilistic risk assessment model developed for the safety analysis of NASA lunar space missions.
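
    A toy sketch of the nominal-versus-expected distinction: for an assumed two-component series system with lognormal epistemic distributions on the failure probabilities, point-estimate RAW and an expectation-based variant (in the spirit of the paper's ERAW, whose exact definition is given there) can differ noticeably. The system model and distributions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

def risk(p1, p2):
    """Toy risk metric: failure probability of a two-component series system."""
    return 1 - (1 - p1) * (1 - p2)

# Epistemic uncertainty: lognormal distributions on the failure probabilities.
p1 = rng.lognormal(np.log(1e-3), 0.5, 100000)
p2 = rng.lognormal(np.log(5e-4), 0.8, 100000)

# RAW_1 = R(component 1 failed) / R(baseline), point estimate vs expectation.
raw_point = risk(1.0, 5e-4) / risk(1e-3, 5e-4)
raw_expect = risk(1.0, p2).mean() / risk(p1, p2).mean()
print(f"point RAW_1 = {raw_point:.0f}, expectation-based RAW_1 = {raw_expect:.0f}")
```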

  17. Adaptive polynomial chaos techniques for uncertainty quantification of a gas cooled fast reactor transient

    SciTech Connect (OSTI)

    Perko, Z.; Gilli, L.; Lathouwers, D.; Kloosterman, J. L.

    2013-07-01

    Uncertainty quantification plays an increasingly important role in the nuclear community, especially with the rise of Best Estimate Plus Uncertainty methodologies. Sensitivity analysis, surrogate models, Monte Carlo sampling, and several other techniques can be used to propagate input uncertainties. In recent years, however, polynomial chaos expansion has become a popular alternative, providing high accuracy at affordable computational cost. This paper presents such polynomial chaos (PC) methods using adaptive sparse grids and adaptive basis set construction, together with an application to a gas-cooled fast reactor transient. A comparison is made between a new sparse grid algorithm and the traditionally used technique proposed by Gerstner. An adaptive basis construction method is also introduced and is shown to be advantageous from both an accuracy and a computational point of view. As a demonstration, the uncertainty quantification of a 50% loss-of-flow transient in the GFR2400 Gas Cooled Fast Reactor design was performed using the CATHARE code system. The results are compared to direct Monte Carlo sampling and show the superior convergence and high accuracy of the polynomial chaos expansion. Since PC techniques are easy to implement, they can offer an attractive alternative to traditional techniques for the uncertainty quantification of large-scale problems. (authors)
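
    A one-dimensional sketch of a regression-based polynomial chaos expansion with probabilists' Hermite polynomials, compared against Monte Carlo. The model function is an arbitrary stand-in, not the GFR2400 transient, and no adaptive sparse grids are involved; this only illustrates the basic PC machinery.

```python
import numpy as np
from math import factorial
from numpy.polynomial import hermite_e as H

rng = np.random.default_rng(5)
f = lambda x: np.exp(0.3 * x) + 0.1 * x**2   # arbitrary smooth model

# Regression-based PCE: fit coefficients c_k of He_k(x), k = 0..P,
# for a standard normal input x ~ N(0, 1).
P = 6
x_train = rng.standard_normal(2000)
V = H.hermevander(x_train, P)                # columns He_0(x) .. He_P(x)
c, *_ = np.linalg.lstsq(V, f(x_train), rcond=None)

# Orthogonality E[He_m He_n] = n! delta_mn gives closed-form statistics:
# mean = c_0, variance = sum_{k>=1} c_k^2 * k!.
fact = np.array([factorial(k) for k in range(P + 1)], dtype=float)
pce_mean = c[0]
pce_var = float(np.sum(c[1:] ** 2 * fact[1:]))

# Monte Carlo reference.
y_mc = f(rng.standard_normal(200000))
print(f"PCE: mean={pce_mean:.4f}, var={pce_var:.4f}")
print(f"MC : mean={y_mc.mean():.4f}, var={y_mc.var():.4f}")
```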

  18. Achieving Robustness to Uncertainty for Financial Decision-making

    SciTech Connect (OSTI)

    Barnum, George M.; Van Buren, Kendra L.; Hemez, Francois M.; Song, Peter

    2014-01-10

    This report investigates the concept of robustness analysis to support financial decision-making. Financial models that forecast future stock returns or market conditions depend on assumptions that might be unwarranted and variables that might exhibit large fluctuations from their last-known values. The analysis of robustness explores these sources of uncertainty and recommends model settings such that the forecasts used for decision-making are as insensitive as possible to the uncertainty. A proof-of-concept is presented with the Capital Asset Pricing Model. The robustness of model predictions is assessed using info-gap decision theory. Info-gaps are models of uncertainty that express the "distance," or gap of information, between what is known and what needs to be known in order to support the decision. The analysis yields a description of worst-case stock returns as a function of increasing gaps in our knowledge. The analyst can then decide on the best course of action by trading off worst-case performance against "risk," which is how much uncertainty they think needs to be accommodated in the future. The report also discusses the Graphical User Interface, developed using the MATLAB® programming environment, which lets the user control the analysis through an easy-to-navigate interface. Three directions of future work are identified to enhance the present software. First, the code should be re-written using the Python scientific programming software. This change will achieve greater cross-platform compatibility and better portability, allow for a more professional appearance, and render the software independent of a commercial license, which MATLAB® requires. Second, a capability should be developed to allow users to quickly implement and analyze their own models. This will facilitate application of the software to the evaluation of proprietary financial models. The third enhancement proposed is to add the ability to evaluate multiple models simultaneously. When two models reflect past data with similar accuracy, the more robust of the two is preferable for decision-making because its predictions are, by definition, less sensitive to the uncertainty.

  19. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G

    SciTech Connect (OSTI)

    Harper, F.T.; Young, M.L.; Miller, L.A.

    1995-01-01

    Two new probabilistic accident consequence codes, MACCS and COSYMA, completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.

  20. Uncertainty and sensitivity analysis of fission gas behavior in engineering-scale fuel modeling

    SciTech Connect (OSTI)

    G. Pastore; L.P. Swiler; J.D. Hales; S.R. Novascone; D.M. Perez; B.W. Spencer; L. Luzzi; P. Van Uffelen; R.L. Williamson

    2014-10-01

    The role of uncertainties in fission gas behavior calculations as part of engineering-scale nuclear fuel modeling is investigated using the BISON fuel performance code and a recently implemented physics-based model for the coupled fission gas release and swelling. Through the integration of BISON with the DAKOTA software, a sensitivity analysis of the results to selected model parameters is carried out based on UO2 single-pellet simulations covering different power regimes. The parameters are varied within ranges representative of the relative uncertainties and consistent with the information from the open literature. The study leads to an initial quantitative assessment of the uncertainty in fission gas behavior modeling with the parameter characterization presently available. Also, the relative importance of the single parameters is evaluated. Moreover, a sensitivity analysis is carried out based on simulations of a fuel rod irradiation experiment, pointing out a significant impact of the considered uncertainties on the calculated fission gas release and cladding diametral strain. The results of the study indicate that the commonly accepted deviation between calculated and measured fission gas release by a factor of 2 approximately corresponds to the inherent modeling uncertainty at high fission gas release. Nevertheless, higher deviations may be expected for values around 10% and lower. Implications are discussed in terms of directions of research for the improved modeling of fission gas behavior for engineering purposes.

  1. Estimated Uncertainties in the Idaho National Laboratory Matched-Index-of-Refraction Lower Plenum Experiment

    SciTech Connect (OSTI)

    Donald M. McEligot; Hugh M. McIlroy, Jr.; Ryan C. Johnson

    2007-11-01

    The purpose of the fluid dynamics experiments in the MIR (Matched-Index-of-Refraction) flow system at Idaho National Laboratory (INL) is to develop benchmark databases for the assessment of Computational Fluid Dynamics (CFD) solutions of the momentum equations, scalar mixing, and turbulence models for typical Very High Temperature Reactor (VHTR) plenum geometries in the limiting case of negligible buoyancy and constant fluid properties. The experiments use optical techniques, primarily particle image velocimetry (PIV), in the INL MIR flow system. The benefit of the MIR technique is that it permits optical measurements to determine flow characteristics in passages and around objects without locating a disturbing transducer in the flow field and without distortion of the optical paths. The objective of the present report is to develop understanding of the magnitudes of experimental uncertainties in the results to be obtained in such experiments. Unheated MIR experiments are first steps when the geometry is complicated; one does not want to use a computational technique that will not even handle constant properties properly. This report addresses the general background, requirements for benchmark databases, estimation of experimental uncertainties in mean velocities and turbulence quantities, the MIR experiment, PIV uncertainties, positioning uncertainties, and other contributing measurement uncertainties.

  2. Attempt to estimate measurement uncertainty in the Air Force Toxic Chemical Dispersion (AFTOX) model. Master's thesis

    SciTech Connect (OSTI)

    Zettlemoyer, M.D.

    1990-01-01

    The Air Force Toxic Chemical Dispersion (AFTOX) model is a Gaussian puff dispersion model that predicts plumes, concentrations, and hazard distances of toxic chemical spills. A measurement uncertainty propagation formula derived by Freeman et al. (1986) is used within AFTOX to estimate resulting concentration uncertainties due to the effects of data input uncertainties in wind speed, spill height, emission rate, and the horizontal and vertical Gaussian dispersion parameters, and the results are compared to true uncertainties as estimated by standard deviations computed by Monte Carlo simulations. The measurement uncertainty propagation formula was found to overestimate measurement uncertainty in AFTOX-calculated concentrations by at least 350 percent, with overestimates worsening with increasing stability and/or increasing measurement uncertainty.
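
    The comparison can be sketched for a simplified ground-level, centerline Gaussian plume, C = Q/(π u σ_y σ_z). The inputs and uncertainties below are hypothetical, and both AFTOX's concentration formula and the Freeman et al. (1986) propagation differ in detail; the point is only how a first-order estimate and a Monte Carlo estimate can diverge for a nonlinear model.

```python
import numpy as np

rng = np.random.default_rng(6)

def conc(Q, u, sy, sz):
    """Simplified ground-level centerline Gaussian plume concentration."""
    return Q / (np.pi * u * sy * sz)

x0 = dict(Q=1.0, u=3.0, sy=25.0, sz=12.0)        # nominal inputs (assumed)
rel_u = dict(Q=0.10, u=0.20, sy=0.30, sz=0.30)   # relative std uncertainties

# First-order propagation: for a pure product/quotient form the relative
# variances simply add.
rel_var = sum(r**2 for r in rel_u.values())
c0 = conc(**x0)
print(f"first-order: C = {c0:.3e} +/- {c0 * np.sqrt(rel_var):.3e}")

# Monte Carlo with independent lognormal inputs (keeps quantities positive).
n = 200000
samples = {k: x0[k] * rng.lognormal(0, rel_u[k], n) for k in x0}
c_mc = conc(**samples)
print(f"Monte Carlo: mean = {c_mc.mean():.3e}, std = {c_mc.std():.3e}")
# For skewed inputs and a nonlinear model the two estimates diverge,
# which is the kind of discrepancy the thesis quantifies.
```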

  3. Ideas underlying quantification of margins and uncertainties (QMU): a white paper.

    SciTech Connect (OSTI)

    Helton, Jon Craig; Trucano, Timothy Guy; Pilch, Martin M.

    2006-09-01

    This report describes key ideas underlying the application of Quantification of Margins and Uncertainties (QMU) to nuclear weapons stockpile lifecycle decisions at Sandia National Laboratories. While QMU is a broad process and methodology for generating critical technical information to be used in stockpile management, this paper emphasizes one component, which is information produced by computational modeling and simulation. In particular, we discuss the key principles of developing QMU information in the form of Best Estimate Plus Uncertainty, the need to separate aleatory and epistemic uncertainty in QMU, and the risk-informed decision making that is best suited for decisive application of QMU. The paper is written at a high level, but provides a systematic bibliography of useful papers for the interested reader to deepen their understanding of these ideas.

  4. Entropic uncertainty relations for the ground state of a coupled system

    SciTech Connect (OSTI)

    Santhanam, M.S.

    2004-04-01

    There is a renewed interest in the uncertainty principle, reformulated from the information theoretic point of view, called the entropic uncertainty relations. They have been studied for various integrable systems as a function of their quantum numbers. In this work, focussing on the ground state of a nonlinear, coupled Hamiltonian system, we show that approximate eigenstates can be constructed within the framework of adiabatic theory. Using the adiabatic eigenstates, we estimate the information entropies and their sum as a function of the nonlinearity parameter. We also briefly look at the information entropies for the highly excited states in the system.

  5. Identifying and bounding uncertainties in nuclear reactor thermal power calculations

    SciTech Connect (OSTI)

    Phillips, J.; Hauser, E.; Estrada, H.

    2012-07-01

    Determination of the thermal power generated in the reactor core of a nuclear power plant is a critical element in the safe and economic operation of the plant. Direct measurement of the reactor core thermal power is made using neutron flux instrumentation; however, this instrumentation requires frequent calibration due to changes in the measured flux caused by fuel burn-up, flux pattern changes, and instrumentation drift. To calibrate the nuclear instruments, steam plant calorimetry, a process of performing a heat balance around the nuclear steam supply system, is used. There are four basic elements involved in the calculation of thermal power based on steam plant calorimetry: the mass flow of the feedwater from the power conversion system, the specific enthalpy of that feedwater, the specific enthalpy of the steam delivered to the power conversion system, and other cycle gains and losses. Of these elements, the accuracy of the feedwater mass flow and the feedwater enthalpy, as determined from its temperature and pressure, are typically the largest contributors to the calorimetric calculation uncertainty. Historically, plants have been required to include a margin of 2% in the calculation of the reactor thermal power for the licensed maximum plant output to account for instrumentation uncertainty. The margin is intended to ensure a cushion between operating power and the power for which safety analyses are performed. Use of approved chordal ultrasonic transit-time technology to make the feedwater flow and temperature measurements (in place of traditional differential-pressure-based instruments and resistance temperature detectors [RTDs]) allows for nuclear plant thermal power calculations accurate to 0.3%-0.4% of plant rated power. This improvement in measurement accuracy has allowed many plant operators in the U.S. and around the world to increase plant power output through Measurement Uncertainty Recapture (MUR) up-rates of up to 1.7% of rated power, while also decreasing the probability of significant over-power events. This paper will examine the basic elements involved in calculation of thermal power using ultrasonic transit-time technology and will discuss the criteria for bounding uncertainties associated with each element in order to achieve reactor thermal power calculations to within 0.3% to 0.4%. (authors)
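
    The core heat balance and its uncertainty combination can be sketched as below: W_th = mdot (h_steam − h_fw), with a root-sum-square of the dominant terms. The numbers are round illustrative values, not plant data or steam-table output, and a licensing-grade calculation accounts for correlations and the remaining cycle gains and losses.

```python
import numpy as np

# Hypothetical round numbers for a large PWR-class heat balance.
mdot, u_mdot = 1900.0, 0.0025 * 1900.0   # feedwater flow, kg/s (0.25%)
h_steam, u_hs = 2770.0e3, 2.0e3          # steam enthalpy, J/kg
h_fw, u_hfw = 950.0e3, 1.5e3             # feedwater enthalpy, J/kg

W = mdot * (h_steam - h_fw)              # thermal power, W

# Sensitivities: dW/dmdot = (h_steam - h_fw), dW/dh_steam = mdot,
# dW/dh_fw = -mdot; combine in root-sum-square (independent inputs).
u_W = np.sqrt(((h_steam - h_fw) * u_mdot) ** 2
              + (mdot * u_hs) ** 2
              + (mdot * u_hfw) ** 2)
print(f"W_th = {W / 1e6:.0f} MWt +/- {100 * u_W / W:.2f}%")
# With these assumed inputs the combined uncertainty lands near 0.3%,
# the order of magnitude quoted in the abstract.
```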

  6. Molecular nonlinear dynamics and protein thermal uncertainty quantification

    SciTech Connect (OSTI)

    Xia, Kelin [Department of Mathematics, Michigan State University, Michigan 48824 (United States)]; Wei, Guo-Wei, E-mail: wei@math.msu.edu [Department of Mathematics, Michigan State University, Michigan 48824 (United States); Department of Electrical and Computer Engineering, Michigan State University, Michigan 48824 (United States); Department of Biochemistry and Molecular Biology, Michigan State University, Michigan 48824 (United States)]

    2014-03-15

    This work introduces molecular nonlinear dynamics (MND) as a new approach for describing protein folding and aggregation. By using a mode system, we show that the MND of disordered proteins is chaotic while that of folded proteins exhibits intrinsically low dimensional manifolds (ILDMs). The stability of ILDMs is found to strongly correlate with protein energies. We propose a novel method for protein thermal uncertainty quantification based on persistently invariant ILDMs. Extensive comparison with experimental data and the state-of-the-art methods in the field validates the proposed new method for protein B-factor prediction.

  7. Combined Estimation of Hydrogeologic Conceptual Model and Parameter Uncertainty

    SciTech Connect (OSTI)

    Meyer, Philip D.; Ye, Ming; Neuman, Shlomo P.; Cantrell, Kirk J.

    2004-03-01

    The objective of the research described in this report is the development and application of a methodology for comprehensively assessing the hydrogeologic uncertainties involved in dose assessment, including uncertainties associated with conceptual models, parameters, and scenarios. This report describes and applies a statistical method to quantitatively estimate the combined uncertainty in model predictions arising from conceptual model and parameter uncertainties. The method relies on model averaging to combine the predictions of a set of alternative models. Implementation is driven by the available data. When site-specific data are minimal, the method can be carried out with prior parameter estimates based on generic data and subjective prior model probabilities. For sites with observations of system behavior (and optionally data characterizing model parameters), the method uses model calibration to update the prior parameter estimates and model probabilities based on the correspondence between model predictions and site observations. The set of model alternatives can contain both simplified and complex models, with the requirement that all models be based on the same set of data. The method was applied to the geostatistical modeling of air permeability at a fractured rock site. Seven alternative variogram models of log air permeability were considered to represent data from single-hole pneumatic injection tests in six boreholes at the site. Unbiased maximum likelihood estimates of variogram and drift parameters were obtained for each model. Standard information criteria provided an ambiguous ranking of the models, which would not justify selecting one of them and discarding all others as is commonly done in practice. Instead, some of the models were eliminated based on their negligibly small updated probabilities, and the remaining four were used to project the measured log permeabilities by kriging onto a rock volume containing the six boreholes. These four projections, and the associated kriging variances, were averaged using the posterior model probabilities as weights. Finally, cross-validation was conducted by eliminating from consideration all data from one borehole at a time, repeating the above process, and comparing the predictive capability of the model-averaged result with that of each individual model. Using two quantitative measures of comparison, the model-averaged result was superior to any individual geostatistical model of log permeability considered.
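
    A minimal sketch of the model-averaging step described above: posterior model probabilities derived from an information criterion weight each model's kriging prediction, and the total variance adds the between-model spread to the within-model kriging variances. All numbers, and the exp(-0.5 * delta_IC) weighting, are illustrative assumptions, not values from the report.

    ```python
    import numpy as np

    # Hypothetical per-model results at one prediction location: information
    # criterion values (lower = better support) and kriging mean/variance of
    # log permeability. Numbers are illustrative only.
    ic   = np.array([12.4, 13.1, 12.8, 15.9])
    pred = np.array([-13.2, -12.9, -13.5, -13.0])
    var  = np.array([0.25, 0.30, 0.22, 0.40])

    # Posterior model probabilities from IC differences (Bayesian-style weights)
    w = np.exp(-0.5 * (ic - ic.min()))
    w /= w.sum()

    # Model-averaged prediction; total variance adds the between-model spread
    mean = np.sum(w * pred)
    total_var = np.sum(w * (var + (pred - mean) ** 2))
    print(f"weights={np.round(w, 3)}, mean={mean:.2f}, var={total_var:.3f}")
    ```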

  8. PROBABILISTIC SENSITIVITY AND UNCERTAINTY ANALYSIS WORKSHOP SUMMARY REPORT

    SciTech Connect (OSTI)

    Seitz, R

    2008-06-25

    Stochastic or probabilistic modeling approaches are being applied more frequently in the United States and globally to quantify uncertainty and enhance understanding of model response in performance assessments for disposal of radioactive waste. This increased use has resulted in global interest in sharing results of research and applied studies that have been completed to date. This technical report reflects the results of a workshop that was held to share results of research and applied work related to performance assessments conducted at United States Department of Energy sites. Key findings of this research and applied work are discussed and recommendations for future activities are provided.

  9. Uncertainty Analysis of RELAP5-3D

    SciTech Connect (OSTI)

    Alexandra E Gertman; Dr. George L Mesina

    2012-07-01

    As world-wide energy consumption continues to increase, so does the demand for the use of alternative energy sources, such as nuclear energy. Nuclear power plants currently supply over 370 gigawatts of electricity, and more than 60 new nuclear reactors have been commissioned by 15 different countries. The primary concern for nuclear power plant operation and licensing has been safety. The safe operation of nuclear power plants is no simple matter; it involves the training of operators and the design of the reactor, as well as equipment and design upgrades throughout the lifetime of the reactor. To safely design, operate, and understand nuclear power plants, industry and government alike have relied upon the use of best-estimate simulation codes, which allow an accurate model of any given plant to be created with well-defined margins of safety. The most widely used of these best-estimate simulation codes in the nuclear power industry is RELAP5-3D. Our project focused on improving the modeling capabilities of RELAP5-3D by developing uncertainty estimates for its calculations. This work involved analyzing high-, medium-, and low-ranked phenomena from an INL PIRT on a small-break Loss-Of-Coolant Accident, as well as an analysis of a large-break Loss-Of-Coolant Accident. Statistical analyses were performed using correlation coefficients. To perform the studies, computer programs were written that modify a template RELAP5 input deck to produce one deck for each combination of key input parameters. Python scripting enabled the running of the generated input files with RELAP5-3D on INL's massively parallel cluster system. Data from the studies were collected and analyzed with SAS. A summary of the results of our studies is presented.
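
    A minimal sketch of the deck-generation step described above. The placeholder tokens (e.g., @BREAK_AREA@), parameter names, and the inline template string are hypothetical stand-ins; in practice the template would be a real RELAP5 input deck and the ranges would come from the PIRT-ranked phenomena.

    ```python
    from itertools import product
    from pathlib import Path

    # Hypothetical parameter multipliers on nominal values (illustrative only)
    params = {
        "BREAK_AREA": [0.9, 1.0, 1.1],
        "CHF_MULT":   [0.8, 1.0, 1.2],
        "GAP_COND":   [0.95, 1.0, 1.05],
    }

    # Stand-in for a real template deck read from disk, e.g. Path("template.i").read_text()
    template = ("break area = @BREAK_AREA@\n"
                "chf multiplier = @CHF_MULT@\n"
                "gap conductance = @GAP_COND@\n")

    # One generated deck per combination of key input parameters
    for i, combo in enumerate(product(*params.values())):
        deck = template
        for name, value in zip(params, combo):
            deck = deck.replace(f"@{name}@", str(value))
        Path(f"run_{i:04d}.i").write_text(deck)
    ```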

  10. Design Features and Technology Uncertainties for the Next Generation Nuclear Plant

    SciTech Connect (OSTI)

    John M. Ryskamp; Phil Hildebrandt; Osamu Baba; Ron Ballinger; Robert Brodsky; Hans-Wolfgang Chi; Dennis Crutchfield; Herb Estrada; Jean-Claude Garnier; Gerald Gordon; Richard Hobbins; Dan Keuter; Marilyn Kray; Philippe Martin; Steve Melancon; Christian Simon; Henry Stone; Robert Varrin; Werner von Lensa

    2004-06-01

    This report presents the conclusions, observations, and recommendations of the Independent Technology Review Group (ITRG) regarding design features and important technology uncertainties associated with very-high-temperature nuclear system concepts for the Next Generation Nuclear Plant (NGNP). The ITRG performed its reviews during the period November 2003 through April 2004.

  11. The Uncertainty in the Local Seismic Response Analysis

    SciTech Connect (OSTI)

    Pasculli, A.; Pugliese, A.; Romeo, R. W.; Sano, T.

    2008-07-08

    This paper shows the influence exerted on local seismic response analysis by considering dispersion and uncertainty in the seismic input as well as in the dynamic properties of soils. In a first attempt, a 1D numerical model is developed accounting for both the aleatory nature of the input motion and the stochastic variability of the dynamic properties of soils. The seismic input is introduced in a non-conventional way through a power spectral density, for which an elastic response spectrum, derived, for instance, by a conventional seismic hazard analysis, is required with an appropriate level of reliability. The uncertainty in the geotechnical properties of soils is instead investigated through a well-known simulation technique (the Monte Carlo method) for the construction of statistical ensembles. The result of a conventional local seismic response analysis, given by a deterministic elastic response spectrum, is replaced, in our approach, by a set of statistical elastic response spectra, each one characterized by an appropriate level of probability to be reached or exceeded. The analyses have been carried out for a well-documented real case study. Lastly, we anticipate a 2D numerical analysis to also investigate the spatial variability of soil properties.
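
    A minimal Monte Carlo sketch in the spirit of the soil-property treatment described above: sample a lognormal shear-wave velocity and report percentiles of the resulting fundamental site frequency f0 = Vs / (4H) for a uniform elastic layer. The distribution parameters and layer thickness are assumptions for illustration, not the paper's values.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n, h = 5000, 30.0                       # realizations, layer thickness [m]

    # Lognormal shear-wave velocity: median 250 m/s, ~20% dispersion (assumed)
    vs = rng.lognormal(mean=np.log(250.0), sigma=0.2, size=n)

    # Fundamental site frequency of a uniform elastic layer over rigid base
    f0 = vs / (4.0 * h)
    for p in (16, 50, 84):
        print(f"f0 at p{p}: {np.percentile(f0, p):.2f} Hz")
    ```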

  12. Analysis and Reduction of Complex Networks Under Uncertainty.

    SciTech Connect (OSTI)

    Ghanem, Roger G

    2014-07-31

    This effort was a collaboration with Youssef Marzouk of MIT, Omar Knio of Duke University (at the time at Johns Hopkins University), and Habib Najm of Sandia National Laboratories. The objective of this effort was to develop the mathematical and algorithmic capacity to analyze complex networks under uncertainty. Of interest were chemical reaction networks and smart grid networks. The statements of work for USC focused on the development of stochastic reduced models for uncertain networks. The USC team was led by Professor Roger Ghanem and consisted of one graduate student and a postdoc. The contributions completed by the USC team consisted of: 1) methodology and algorithms to address the eigenvalue problem, a problem of significance in the stability of networks under stochastic perturbations; 2) methodology and algorithms to characterize probability measures on graph structures with random flows, an important problem in characterizing random demand (encountered in smart grids) and random degradation (encountered in infrastructure systems), as well as modeling errors in Markov chains (with ubiquitous relevance); and 3) methodology and algorithms for treating inequalities in uncertain systems, an important problem in the context of models for material failure and network flows under uncertainty, where conditions of failure or flow are described in the form of inequalities between the state variables.

  13. Quantifying uncertainty in material damage from vibrational data

    SciTech Connect (OSTI)

    Butler, T.; Huhtala, A.; Juntunen, M.

    2015-02-15

    The response of a vibrating beam to a force depends on many physical parameters, including those determined by material properties. Damage caused by fatigue or cracks results in local reductions in stiffness parameters and may drastically alter the response of the beam. Data obtained from the vibrating beam are often subject to uncertainties and/or errors, typically modeled using probability densities. The goal of this paper is to estimate and quantify the uncertainty in damage, modeled as a local reduction in stiffness, using uncertain data. We present various frameworks and methods for solving this parameter determination problem, and we describe a mathematical analysis to determine and compute useful output data for each method. The methods are applied in a specified sequence that interfaces their inputs and outputs, enhancing the inferences drawn from the numerical results of each. Numerical results are presented using both simulated and experimentally obtained data from physically damaged beams.

  14. Data Filtering Impact on PV Degradation Rates and Uncertainty (Poster)

    SciTech Connect (OSTI)

    Jordan, D. C.; Kurtz, S. R.

    2012-03-01

    To sustain the commercial success of photovoltaics (PV), it is vital to know how power output decreases with time. In order to predict power delivery, degradation rates must be determined accurately. Data filtering, that is, any treatment of the data prior to assessing long-term field behavior, is discussed as part of a more comprehensive uncertainty analysis; it can be one of the greatest sources of uncertainty in long-term performance studies. Several distinct filtering methods, such as outlier removal and inclusion of only sunny days, were examined on several different metrics: PVUSA, performance ratio, and the ratio of DC power to plane-of-array irradiance, both uncorrected and temperature-corrected. PVUSA showed the highest sensitivity, while the temperature-corrected power-over-irradiance ratio was found to be the least sensitive to data filtering conditions. Using this ratio, it is demonstrated that quantification of degradation rates with a statistical accuracy of +/- 0.2%/year within 4 years of field data is possible on two crystalline silicon and two thin-film systems.
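
    A minimal sketch of the rate-extraction step: fit a line to an already-filtered monthly performance metric and report the slope with a confidence interval. The synthetic -0.5%/yr trend and the noise level are assumptions for illustration only.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    years = np.linspace(0.0, 4.0, 48)                 # 4 years of monthly points
    # Synthetic filtered metric: -0.5%/yr trend plus measurement noise (assumed)
    ratio = 1.0 - 0.005 * years + rng.normal(0.0, 0.01, years.size)

    # Least-squares slope and its standard error
    A = np.vstack([years, np.ones_like(years)]).T
    coef, res, *_ = np.linalg.lstsq(A, ratio, rcond=None)
    slope = coef[0]
    sigma2 = res[0] / (years.size - 2)                # residual variance
    se = np.sqrt(sigma2 / np.sum((years - years.mean()) ** 2))
    print(f"rate = {100 * slope:+.2f} +/- {100 * 1.96 * se:.2f} %/yr (95% CI)")
    ```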

  15. WE-B-19A-01: SRT II: Uncertainties in SRT

    SciTech Connect (OSTI)

    Dieterich, S; Schlesinger, D; Geneser, S

    2014-06-15

    SRS delivery has undergone major technical changes in the last decade, transitioning from predominantly frame-based treatment delivery to image-guided, frameless SRS. It is important for medical physicists working in SRS to understand the magnitude and sources of uncertainty involved in delivering SRS treatments for a multitude of technologies (Gamma Knife, CyberKnife, linac-based SRS, and protons). Sources of SRS planning and delivery uncertainty include dose calculation, dose fusion, and intra- and inter-fraction motion. Dose calculations for small fields are particularly difficult because of the lack of electronic equilibrium and the greater effect of inhomogeneities within and near the PTV. Going frameless introduces greater setup uncertainties that allow for potentially increased intra- and inter-fraction motion. The increased use of multiple imaging modalities to determine the tumor volume necessitates (deformable) image and contour fusion, and the resulting uncertainties introduced in the image registration process further contribute to overall treatment planning uncertainties. Each of these uncertainties must be quantified and their impact on treatment delivery accuracy understood. If necessary, the uncertainties may then be accounted for during treatment planning, either through techniques to make the uncertainty explicit or by the appropriate addition of PTV margins. Further complicating matters, the statistics of 1-5 fraction SRS treatments differ from traditional margin recipes relying on Poisson statistics. In this session, we will discuss uncertainties introduced during each step of the SRS treatment planning and delivery process and present margin recipes to appropriately account for such uncertainties. Learning Objectives: To understand the major contributors to the total delivery uncertainty in SRS for Gamma Knife, CyberKnife, and linac-based SRS. Learn the various uncertainties introduced by image fusion, deformable image registration, and contouring variation. Learn a variety of strategies for dealing with uncertainty, including margin recipes and explicit visualization of uncertainty. Understand how the assessment of PTV margins differs from regular fractionation (van Herk recipe) for 1-5 fraction deliveries.
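
    For reference, a sketch of the conventional-fractionation margin recipe named in the learning objectives (van Herk), with the session's caveat that it does not transfer directly to 1-5 fraction SRS. The example inputs are hypothetical.

    ```python
    def van_herk_margin_mm(sigma_sys_mm, sigma_rand_mm):
        """PTV margin recipe M = 2.5*Sigma + 0.7*sigma for conventional
        fractionation, where Sigma and sigma are the systematic and random
        setup standard deviations [mm]. Not valid as-is for 1-5 fraction SRS,
        where the uncertainties must be assessed explicitly."""
        return 2.5 * sigma_sys_mm + 0.7 * sigma_rand_mm

    # e.g., 1.0 mm systematic and 1.5 mm random setup error -> 3.55 mm margin
    print(van_herk_margin_mm(1.0, 1.5))
    ```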

  16. Quantification of initial-data uncertainty on a shock-accelerated gas cylinder

    SciTech Connect (OSTI)

    Tritschler, V. K.; Avdonin, A.; Hickel, S.; Hu, X. Y.; Adams, N. A.

    2014-02-15

    We quantify initial-data uncertainties on a shock-accelerated heavy-gas cylinder by two-dimensional well-resolved direct numerical simulations. A high-resolution compressible multicomponent flow simulation model is coupled with a polynomial chaos expansion to propagate the initial-data uncertainties to the output quantities of interest. The initial flow configuration follows previous experimental and numerical works on the shock-accelerated heavy-gas cylinder. We investigate three main initial-data uncertainties: (i) shock Mach number, (ii) contamination of SF{sub 6} with acetone, and (iii) initial deviations of the heavy-gas region from a perfect cylindrical shape. The impact of initial-data uncertainties on the mixing process is examined. The results suggest that the mixing process is highly sensitive to input variations of shock Mach number and acetone contamination. Additionally, our results indicate that the measured shock Mach number in the experiment of Tomkins et al. [An experimental investigation of mixing mechanisms in shock-accelerated flow, J. Fluid. Mech. 611, 131 (2008)] and the estimated contamination of the SF{sub 6} region with acetone [S. K. Shankar, S. Kawai, and S. K. Lele, Two-dimensional viscous flow simulation of a shock accelerated heavy gas cylinder, Phys. Fluids 23, 024102 (2011)] exhibit deviations from those that lead to best agreement between our simulations and the experiment in terms of overall flow evolution.
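
    A minimal sketch of non-intrusive polynomial chaos propagation for a single uncertain input, in the spirit of the coupling described above: project a toy response function of an uncertain shock Mach number onto probabilists' Hermite polynomials by Gauss-Hermite quadrature. The response function and the ~1% input spread are assumptions for illustration, not the paper's flow model.

    ```python
    import math
    import numpy as np
    from numpy.polynomial import hermite_e as He

    def pce_coeffs(f, order, nquad=16):
        """Project f(xi), xi ~ N(0,1), onto probabilists' Hermite polynomials
        He_k via Gauss-Hermite quadrature; E[He_k^2] = k!."""
        x, w = He.hermegauss(nquad)
        w = w / np.sqrt(2.0 * np.pi)      # normalize weights to the Gaussian pdf
        fx = f(x)
        return np.array([np.sum(w * fx * He.hermeval(x, [0.0] * k + [1.0]))
                         / math.factorial(k) for k in range(order + 1)])

    # Toy surrogate response to an uncertain Mach number M = 1.20 + 0.012*xi
    f = lambda xi: np.tanh(5.0 * (1.20 + 0.012 * xi) - 6.0)
    c = pce_coeffs(f, order=4)
    mean = c[0]
    variance = sum(c[k] ** 2 * math.factorial(k) for k in range(1, len(c)))
    print(f"mean = {mean:.4f}, std = {variance ** 0.5:.4f}")
    ```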

  17. Uncertainty and sensitivity analysis of fission gas behavior in engineering-scale fuel modeling

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Pastore, Giovanni; Swiler, L. P.; Hales, Jason D.; Novascone, Stephen R.; Perez, Danielle M.; Spencer, Benjamin W.; Luzzi, Lelio; Uffelen, Paul Van; Williamson, Richard L.

    2014-10-12

    The role of uncertainties in fission gas behavior calculations as part of engineering-scale nuclear fuel modeling is investigated using the BISON fuel performance code and a recently implemented physics-based model for the coupled fission gas release and swelling. Through the integration of BISON with the DAKOTA software, a sensitivity analysis of the results to selected model parameters is carried out based on UO2 single-pellet simulations covering different power regimes. The parameters are varied within ranges representative of the relative uncertainties and consistent with the information from the open literature. The study leads to an initial quantitative assessment of the uncertainty in fission gas behavior modeling with the parameter characterization presently available. Also, the relative importance of the single parameters is evaluated. Moreover, a sensitivity analysis is carried out based on simulations of a fuel rod irradiation experiment, pointing out a significant impact of the considered uncertainties on the calculated fission gas release and cladding diametral strain. The results of the study indicate that the commonly accepted deviation between calculated and measured fission gas release by a factor of 2 approximately corresponds to the inherent modeling uncertainty at high fission gas release. Nevertheless, higher deviations may be expected for values around 10% and lower. Implications are discussed in terms of directions of research for the improved modeling of fission gas behavior for engineering purposes.

  18. Science based stockpile stewardship, uncertainty quantification, and fission fragment beams

    SciTech Connect (OSTI)

    Stoyer, M A; McNabb, D; Burke, J; Bernstein, L A; Wu, C Y

    2009-09-14

    Stewardship of this nation's nuclear weapons is predicated on developing a fundamental scientific understanding of the physics and chemistry required to describe weapon performance without the need to resort to underground nuclear testing and to predict expected future performance as a result of intended or unintended modifications. In order to construct more reliable models, underground nuclear test data is being reanalyzed in novel ways. The extent to which underground experimental data can be matched with simulations is one measure of the credibility of our capability to predict weapon performance. To improve the interpretation of these experiments with quantified uncertainties, improved nuclear data is required. As an example, the fission yield of a device was often determined by measuring fission products. Conversion of the measured fission products to yield was accomplished through explosion code calculations (models) and a good set of nuclear reaction cross-sections. Because of the unique high-fluence environment of an exploding nuclear weapon, many reactions occurred on radioactive nuclides, for which only theoretically calculated cross-sections are available. Inverse kinematics reactions at CARIBU offer the opportunity to measure cross-sections on unstable neutron-rich fission fragments and thus improve the quality of the nuclear reaction cross-section sets. One of the fission products measured was {sup 95}Zr, the accumulation of all mass 95 fission products of Y, Sr, Rb and Kr (see Fig. 1). Subsequent neutron-induced reactions on these short lived fission products were assumed to cancel out - in other words, the destruction of mass 95 nuclides was more or less equal to the production of mass 95 nuclides. If a {sup 95}Sr was destroyed by an (n,2n) reaction it was also produced by (n,2n) reactions on {sup 96}Sr, for example. However, since these nuclides all have fairly short half-lives (seconds to minutes or even less), no experimental nuclear reaction cross-sections exist, and only theoretically modeled cross-sections are available. Inverse kinematics reactions at CARIBU offer the opportunity, should the beam intensity be sufficient, to measure cross-sections on a few important nuclides in order to benchmark the theoretical calculations and significantly improve the nuclear data. The nuclides in Fig. 1 are prioritized by importance factor and displayed in stoplight colors, green the highest and red the lowest priority.

  19. Method to Calculate Uncertainty Estimate of Measuring Shortwave Solar Irradiance using Thermopile and Semiconductor Solar Radiometers

    SciTech Connect (OSTI)

    Reda, I.

    2011-07-01

    The uncertainty of measuring solar irradiance is fundamentally important for solar energy and atmospheric science applications. Without an uncertainty statement, the quality of a result, model, or testing method cannot be quantified, the chain of traceability is broken, and confidence cannot be maintained in the measurement. Measurement results are incomplete and meaningless without a statement of the estimated uncertainty with traceability to the International System of Units (SI) or to another internationally recognized standard. This report explains how to use the Guide to the Expression of Uncertainty in Measurement (GUM) to calculate such uncertainty. The report also shows that without appropriate corrections to solar measuring instruments (solar radiometers), the uncertainty of measuring shortwave solar irradiance can exceed 4% using present state-of-the-art pyranometers and 2.7% using present state-of-the-art pyrheliometers. Finally, the report demonstrates that by applying the appropriate corrections, uncertainties may be reduced by at least 50%. The uncertainties, with or without the appropriate corrections, might not be compatible with the needs of solar energy and atmospheric science applications; yet, this report may shed some light on the sources of uncertainties and the means to reduce overall uncertainty in measuring solar irradiance.
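
    A minimal GUM-style sketch: combine assumed component standard uncertainties in quadrature and report an expanded uncertainty with coverage factor k = 2. The component list and values are hypothetical, chosen only to illustrate the mechanics.

    ```python
    import math

    # Hypothetical component standard uncertainties for a pyranometer-based
    # irradiance measurement, in percent of reading (illustrative values only)
    components = {
        "calibration reference":         0.5,
        "directional (cosine) response": 1.0,
        "temperature dependence":        0.4,
        "zero offsets":                  0.3,
        "data logger voltage":           0.1,
    }

    u_c = math.sqrt(sum(u * u for u in components.values()))  # combined (RSS)
    U = 2.0 * u_c                                             # expanded, k=2 (~95%)
    print(f"u_c = {u_c:.2f}%  U(k=2) = {U:.2f}%")
    ```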

  20. Uncertainty in terahertz time-domain spectroscopy measurement

    SciTech Connect (OSTI)

    Withayachumnankul, Withawat; Fischer, Bernd M.; Lin, Hungyen; Abbott, Derek

    2008-06-15

    Measurements of optical constants at terahertz--or T-ray--frequencies have been performed extensively using terahertz time-domain spectroscopy (THz-TDS). Spectrometers, together with physical models explaining the interaction between a sample and T-ray radiation, are progressively being developed. Nevertheless, measurement errors in the optical constants, so far, have not been systematically analyzed. This situation calls for a comprehensive analysis of measurement uncertainty in THz-TDS systems. The sources of error existing in a terahertz spectrometer and throughout the parameter estimation process are identified. The analysis herein quantifies the impact of each source on the output optical constants. The resulting analytical model is evaluated against experimental THz-TDS data.

  1. Uncertainty of silicon 1-MeV damage function

    SciTech Connect (OSTI)

    Danjaji, M.B.; Griffin, P.J.

    1997-02-01

    The electronics radiation hardness-testing community uses the ASTM E722-93 Standard Practice to define the energy dependence of the nonionizing neutron damage to silicon semiconductors. This neutron displacement damage response function is defined to be equal to the silicon displacement kerma as calculated from the ORNL Si cross-section evaluation. Experimental work has shown that observed damage ratios at various test facilities agree with the defined response function to within 5%. Here, a covariance matrix for the silicon 1-MeV neutron displacement damage function is developed. This uncertainty data will support the electronic radiation hardness-testing community and will permit silicon displacement damage sensors to be used in least squares spectrum adjustment codes.
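
    A schematic sketch of how such a covariance matrix supports uncertainty propagation via the usual sandwich rule u^2 = s^T C s in a least squares adjustment context. The three-group response vector and covariance values below are hypothetical, not the evaluated silicon data.

    ```python
    import numpy as np

    # Schematic 3-group example: s is the spectrum-weighted damage response per
    # energy group; C is its covariance matrix (all values hypothetical)
    s = np.array([1.0, 2.1, 0.7])
    C = np.array([[0.0025, 0.0010, 0.0000],
                  [0.0010, 0.0030, 0.0008],
                  [0.0000, 0.0008, 0.0020]])

    total = s.sum()                 # integral damage metric
    u = np.sqrt(s @ C @ s)          # "sandwich rule" propagated uncertainty
    print(f"damage = {total:.2f}, u = {u:.3f} ({100 * u / total:.1f}%)")
    ```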

  2. Effect of fluctuation measures on the uncertainty relations between two observables: Different measures lead to opposite conclusions

    SciTech Connect (OSTI)

    Luis, Alfredo

    2011-09-15

    We show within a very simple framework that different measures of fluctuations lead to uncertainty relations resulting in contradictory conclusions. More specifically, we focus on Tsallis and Renyi entropic uncertainty relations, and we find that the minimum joint uncertainty states for some fluctuation measures are the maximum joint uncertainty states of other fluctuation measures, and vice versa.
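
    The flavor of the result is easy to reproduce numerically: for two simple probability vectors, the Renyi-entropy ordering flips as the order parameter alpha changes, so different fluctuation measures disagree about which state is more uncertain. The distributions below are illustrative, not the paper's minimum/maximum uncertainty states.

    ```python
    import numpy as np

    def renyi_entropy(p, alpha):
        """Renyi entropy H_a(p) = log(sum p_i^a) / (1 - a); Shannon as a -> 1."""
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        if np.isclose(alpha, 1.0):
            return float(-np.sum(p * np.log(p)))
        return float(np.log(np.sum(p ** alpha)) / (1.0 - alpha))

    p = [0.5, 0.5, 0.0]        # two equally likely outcomes
    q = [0.7, 0.15, 0.15]      # three outcomes, one dominant
    for a in (0.5, 1.0, 2.0, 5.0):
        print(f"alpha={a}: H(p) < H(q) is {renyi_entropy(p, a) < renyi_entropy(q, a)}")
    # The ordering flips between small and large alpha: which state looks
    # "less uncertain" depends on the chosen fluctuation measure.
    ```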

  3. Principles and applications of measurement and uncertainty analysis in research and calibration

    SciTech Connect (OSTI)

    Wells, C.V.

    1992-11-01

    Interest in Measurement Uncertainty Analysis has grown in the past several years as it has spread to new fields of application, and research and development of uncertainty methodologies have continued. This paper discusses the subject from the perspectives of both research and calibration environments. It presents a history of the development and an overview of the principles of uncertainty analysis embodied in the United States National Standard, ANSI/ASME PTC 19.1-1985, Measurement Uncertainty. Examples are presented in which uncertainty analysis was utilized or is needed to gain further knowledge of a particular measurement process and to characterize final results. Measurement uncertainty analysis provides a quantitative estimate of the interval about a measured value or an experiment result within which the true value of that quantity is expected to lie. Years ago, Harry Ku of the United States National Bureau of Standards stated that "The informational content of the statement of uncertainty determines, to a large extent, the worth of the calibrated value." Today, that statement is just as true about calibration or research results as it was in 1968. Why is that true? What kind of information should we include in a statement of uncertainty accompanying a calibrated value? How and where do we get the information to include in an uncertainty statement? How should we interpret and use measurement uncertainty information? This discussion will provide answers to these and other questions about uncertainty in research and in calibration. The methodology to be described has been developed by national and international groups over the past nearly thirty years, and individuals were publishing information even earlier. Yet the work is largely unknown in many science and engineering arenas. I will illustrate various aspects of uncertainty analysis with some examples drawn from the radiometry measurement and calibration discipline from research activities.

  4. Risk Analysis and Decision-Making Under Uncertainty: A Strategy and its Applications

    Office of Environmental Management (EM)

    Ming Ye (mye@fsu.edu), Florida State University; Mary Hill (mchill@ku.edu), University of Kansas. This research is supported by DOE Early Career Award DE-SC0008272 and acknowledges the efforts of ISCMEM Working Group 2, Federal Scientists Working for Coordinated Uncertainty Analysis and Parameter

  5. Asymptotic and uncertainty analyses of a phase field model for void formation under irradiation

    Office of Scientific and Technical Information (OSTI)

    We perform asymptotic analysis and uncertainty quantification of a phase field model for void formation and evolution in materials subject to irradiation.

  6. Final Report: DOE Project DE-SC-0005399: Linking the uncertainty of low frequency variability in tropical forcing in regional climate change

    Office of Scientific and Technical Information (OSTI)

  7. Cassini Spacecraft Uncertainty Analysis Data and Methodology Review and Update/Volume 1: Updated Parameter Uncertainty Models for the Consequence Analysis

    SciTech Connect (OSTI)

    WHEELER, TIMOTHY A.; WYSS, GREGORY D.; HARPER, FREDERICK T.

    2000-11-01

    Uncertainty distributions for specific parameters of the Cassini General Purpose Heat Source Radioisotope Thermoelectric Generator (GPHS-RTG) Final Safety Analysis Report consequence risk analysis were revised and updated. The revisions and updates were done for all consequence parameters for which relevant information exists from the joint project on Probabilistic Accident Consequence Uncertainty Analysis by the United States Nuclear Regulatory Commission and the Commission of European Communities.

  8. CALiPER Exploratory Study: Accounting for Uncertainty in Lumen Measurements

    SciTech Connect (OSTI)

    Bergman, Rolf; Paget, Maria L.; Richman, Eric E.

    2011-03-31

    With a well-defined and shared understanding of uncertainty in lumen measurements, testing laboratories can better evaluate their processes, contributing to greater consistency and credibility of lighting testing, a key component of the U.S. Department of Energy (DOE) Commercially Available LED Product Evaluation and Reporting (CALiPER) program. Reliable lighting testing is a crucial underlying factor contributing toward the success of many energy-efficient lighting efforts, such as the DOE GATEWAY demonstrations, Lighting Facts Label, ENERGY STAR energy efficient lighting programs, and many others. Uncertainty in measurements is inherent to all testing methodologies, including photometric and other lighting-related testing. Uncertainty exists for all equipment, processes, and systems of measurement in individual as well as combined ways. A major issue with testing and the resulting accuracy of the tests is the uncertainty of the complete process. Individual equipment uncertainties are typically identified, but their relative value in practice and their combined value with other equipment and processes in the same test are elusive concepts, particularly for complex types of testing such as photometry. The total combined uncertainty of a measurement result is important for repeatable and comparative measurements for light emitting diode (LED) products in comparison with other technologies as well as competing products. This study provides a detailed, step-by-step method for determining uncertainty in lumen measurements, developed in close coordination with related standards efforts and key industry experts. This report uses the structure proposed in the Guide to the Expression of Uncertainty in Measurement (GUM) for evaluating and expressing uncertainty in measurements. The steps of the procedure are described, and a spreadsheet format adapted for integrating sphere and goniophotometric uncertainty measurements is provided for entering parameters, ordering the information, calculating intermediate values, and, finally, obtaining expanded uncertainties. Using this basis and examining each step of the photometric measurement and calibration methods, mathematical uncertainty models are developed. Determination of estimated values of input variables is discussed. Guidance is provided for the evaluation of the standard uncertainties of each input estimate, the covariances associated with input estimates, and the calculation of the result measurements. With this basis, the combined uncertainty of the measurement results and, finally, the expanded uncertainty can be determined.

  9. Assessment of uncertainties in radiation-induced cancer risk predictions at clinically relevant doses

    SciTech Connect (OSTI)

    Nguyen, J.; Moteabbed, M.; Paganetti, H.

    2015-01-15

    Purpose: Theoretical dose–response models offer the possibility to assess second cancer induction risks after external beam therapy. The parameters used in these models are determined with limited data from epidemiological studies. Risk estimations are thus associated with considerable uncertainties. This study aims at illustrating uncertainties when predicting the risk for organ-specific second cancers in the primary radiation field illustrated by choosing selected treatment plans for brain cancer patients. Methods: A widely used risk model was considered in this study. The uncertainties of the model parameters were estimated with reported data of second cancer incidences for various organs. Standard error propagation was then subsequently applied to assess the uncertainty in the risk model. Next, second cancer risks of five pediatric patients treated for cancer in the head and neck regions were calculated. For each case, treatment plans for proton and photon therapy were designed to estimate the uncertainties (a) in the lifetime attributable risk (LAR) for a given treatment modality and (b) when comparing risks of two different treatment modalities. Results: Uncertainties in excess of 100% of the risk were found for almost all organs considered. When applied to treatment plans, the calculated LAR values have uncertainties of the same magnitude. A comparison between cancer risks of different treatment modalities, however, does allow statistically significant conclusions. In the studied cases, the patient averaged LAR ratio of proton and photon treatments was 0.35, 0.56, and 0.59 for brain carcinoma, brain sarcoma, and bone sarcoma, respectively. Their corresponding uncertainties were estimated to be potentially below 5%, depending on uncertainties in dosimetry. Conclusions: The uncertainty in the dose–response curve in cancer risk models makes it currently impractical to predict the risk for an individual external beam treatment. On the other hand, the ratio of absolute risks between two modalities is less sensitive to the uncertainties in the risk model and can provide statistically significant estimates.

  10. A preliminary study to Assess Model Uncertainties in Fluid Flows

    SciTech Connect (OSTI)

    Marc Oliver Delchini; Jean C. Ragusa

    2009-09-01

    The goal of this study is to assess the impact of various flow models for a simplified primary coolant loop of a light water nuclear reactor. The various fluid flow models are based on the Euler equations with an additional friction term, gravity term, momentum source, and energy source. The geometric model is purposefully chosen to be simple and consists of a one-dimensional (1D) loop system in order to focus the study on the validity of various fluid flow approximations. The 1D loop system is represented by a rectangle; the fluid is heated up along one of the vertical legs and cooled down along the opposite leg. A pressurizer and a pump are included in the horizontal legs. The amount of energy transferred to and removed from the system is equal in absolute value along the two vertical legs. The fluid flow approximations considered are compressible vs. incompressible, and the complete momentum equation vs. Darcy's approximation. The ultimate goal is to compute the fluid flow models' uncertainties and, if possible, to generate validity ranges for these models when applied to reactor analysis. We also limit this study to single-phase flows with low Mach numbers. As a result, sound waves carry a very small amount of energy in this particular case. A standard finite volume method is used for the spatial discretization of the system.

  11. Area 2: Inexpensive Monitoring and Uncertainty Assessment of CO2 Plume Migration using Injection Data

    SciTech Connect (OSTI)

    Srinivasan, Sanjay

    2014-09-30

    In-depth understanding of the long-term fate of CO2 in the subsurface requires study and analysis of the reservoir formation, the overlying caprock formation, and adjacent faults. Because there is significant uncertainty in predicting the location and extent of geologic heterogeneity that can impact the future migration of CO2 in the subsurface, there is a need to develop algorithms that can reliably quantify this uncertainty in plume migration. This project is focused on the development of a model selection algorithm that refines an initial suite of subsurface models representing the prior uncertainty to create a posterior set of subsurface models that reflect injection performance consistent with that observed. Such posterior models can be used to represent uncertainty in the future migration of the CO2 plume. Because only injection data are required, the method provides a very inexpensive way to map the migration of the plume and the associated uncertainty in migration paths. The model selection method developed as part of this project mainly consists of assessing the connectivity/dynamic characteristics of a large prior ensemble of models, grouping the models on the basis of their expected dynamic response, selecting the subgroup of models that yield dynamic response closest to the observed dynamic data, and finally quantifying the uncertainty in plume migration using the selected subset of models. The main accomplishment of the project is the development of a software module within the SGEMS earth modeling software package that implements the model selection methodology. This software module was subsequently applied to analyze CO2 plume migration in two field projects: the In Salah CO2 Injection project in Algeria and CO2 injection into the Utsira formation in Norway. These applications of the software revealed that the proxies developed in this project for quickly assessing the dynamic characteristics of the reservoir were highly efficient and yielded accurate grouping of reservoir models. The plume migration paths probabilistically assessed by the method were confirmed by field observations and auxiliary data. The report also documents the application of the software to answer practical questions, such as the optimum location of monitoring wells to reliably assess the migration of the CO2 plume, the effect of CO2-rock interactions on plume migration and the ability to detect the plume under those conditions, and the effect of a slow, unresolved leak on the predictions of plume migration.
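
    A minimal sketch of the selection step described above, assuming a fast proxy (e.g., a simulated injection-pressure history) is already available for every prior model; the models whose proxy responses lie closest to the observed response form the posterior set. All data here are synthetic.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Hypothetical prior ensemble: a proxy response (24 monthly values of
    # simulated injection pressure) for each of 500 subsurface models
    prior = rng.normal(size=(500, 24)).cumsum(axis=1)
    observed = prior[123] + rng.normal(0.0, 0.1, size=24)  # synthetic field data

    # Rank models by distance of their proxy response to the observation and
    # keep the closest fraction as the posterior (history-matched) subset
    dist = np.linalg.norm(prior - observed, axis=1)
    posterior = prior[np.argsort(dist)[:50]]
    print(f"posterior set: {posterior.shape[0]} of {prior.shape[0]} prior models")
    ```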

  12. Optimal Extraction of Cosmological Information from Supernova Data in the Presence of Calibration Uncertainties

    SciTech Connect (OSTI)

    Kim, Alex G.; Miquel, Ramon

    2005-09-26

    We present a new technique to extract the cosmological information from high-redshift supernova data in the presence of calibration errors and extinction due to dust. While in the traditional technique the distance modulus of each supernova is determined separately, in our approach we determine all distance moduli at once, in a process that achieves a significant degree of self-calibration. The result is a much reduced sensitivity of the cosmological parameters to the calibration uncertainties. As an example, for a strawman mission similar to that outlined in the SNAP satellite proposal, the increased precision obtained with the new approach is roughly equivalent to a factor of five decrease in the calibration uncertainty.

  13. Characterization, propagation and analysis of aleatory and epistemic uncertainty in the 2008 performance assessment for the proposed repository for radioactive waste at Yucca Mountain, Nevada.

    SciTech Connect (OSTI)

    Helton, Jon Craig; Sallaberry, Cedric M.; Hansen, Clifford W.

    2010-10-01

    The 2008 performance assessment (PA) for the proposed repository for high-level radioactive waste at Yucca Mountain (YM), Nevada, illustrates the conceptual structure of risk assessments for complex systems. The 2008 YM PA is based on the following three conceptual entities: a probability space that characterizes aleatory uncertainty; a function that predicts consequences for individual elements of the sample space for aleatory uncertainty; and a probability space that characterizes epistemic uncertainty. These entities and their use in the characterization, propagation and analysis of aleatory and epistemic uncertainty are described and illustrated with results from the 2008 YM PA.
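
    A minimal sketch of the nested-sampling structure implied by these three entities: an outer loop over epistemic parameters and an inner loop over aleatory events, yielding a family of exceedance probabilities whose spread displays the epistemic uncertainty. The toy consequence model and parameter ranges are assumptions for illustration, not the YM PA models.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def consequence(event, theta):
        """Toy consequence of an aleatory event magnitude under an epistemically
        uncertain coefficient theta (purely illustrative)."""
        return theta * event

    n_epistemic, n_aleatory, limit = 100, 5000, 1.0
    exceedance = []
    for _ in range(n_epistemic):                   # outer loop: state of knowledge
        theta = rng.uniform(0.5, 1.5)              # subjective range for theta
        events = rng.exponential(0.4, n_aleatory)  # inner loop: stochastic events
        exceedance.append(np.mean(consequence(events, theta) > limit))

    # Each outer sample gives one exceedance probability (one CCDF point);
    # the spread across outer samples displays the epistemic uncertainty.
    print(np.percentile(exceedance, [5, 50, 95]))
    ```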

  14. Uncertainty in Resilience to Climate Change in India and Indian States

    SciTech Connect (OSTI)

    Malone, Elizabeth L.; Brenkert, Antoinette L.

    2008-10-03

    This study builds on an earlier analysis of resilience of India and Indian states to climate change. The previous study (Brenkert and Malone 2005) assessed current resilience; this research uses the Vulnerability-Resilience Indicators Model (VRIM) to project resilience to 2095 and to perform an uncertainty analysis on the deterministic results. Projections utilized two SRES-based scenarios, one with fast-and-high growth, one with delayed growth. A detailed comparison of two states, the Punjab and Orissa, points to the kinds of insights that can be obtained using the VRIM. The scenarios differ most significantly in the timing of the uncertainty in economic prosperity (represented by GDP per capita) as a major factor in explaining the uncertainty in the resilience index. In the fast-and-high growth scenario the states differ most markedly regarding the role of ecosystem sensitivity, land use and water availability. The uncertainty analysis shows, for example, that resilience in the Punjab might be enhanced, especially in the delayed growth scenario, if early attention is paid to the impact of ecosystems sensitivity on environmental well-being of the state. By the same token, later in the century land-use pressures might be avoided if land is managed through intensification rather than extensification of agricultural land. Thus, this methodology illustrates how a policy maker can be informed about where to focus attention on specific issues, by understanding the potential changes at a specific location and time and, thus, what might yield desired outcomes. Model results can point to further analyses of the potential for resilience-building.

  15. Using Uncertainty Analysis to Guide the Development of Accelerated Stress Tests (Presentation)

    SciTech Connect (OSTI)

    Kempe, M.

    2014-03-01

    Extrapolation of accelerated testing to the long-term results expected in the field has uncertainty associated with the acceleration factors and the range of possible stresses in the field. When multiple stresses (such as temperature and humidity) can be used to increase the acceleration, the uncertainty may be reduced according to which stress factors are used to accelerate the degradation.

  16. Users manual for the FORSS sensitivity and uncertainty analysis code system

    SciTech Connect (OSTI)

    Lucius, J.L.; Weisbin, C.R.; Marable, J.H.; Drischler, J.D.; Wright, R.Q.; White, J.E.

    1981-01-01

    FORSS is a code system used to study relationships between nuclear reaction cross sections, integral experiments, reactor performance parameter predictions and associated uncertainties. This report describes the computing environment and the modules currently used to implement FORSS Sensitivity and Uncertainty Methodology.

  17. TOTAL MEASUREMENT UNCERTAINTY IN HOLDUP MEASUREMENTS AT THE PLUTONIUM FINISHING PLANT (PFP)

    SciTech Connect (OSTI)

    KEELE, B.D.

    2007-07-05

    An approach to determine the total measurement uncertainty (TMU) associated with Generalized Geometry Holdup (GGH) [1,2,3] measurements was developed and implemented in 2004 and 2005 [4]. This paper describes a condensed version of the TMU calculational model, including recent developments. Recent modifications to the TMU calculation model include a change in the attenuation uncertainty, clarifying the definition of the forward background uncertainty, reducing conservatism in the random uncertainty by selecting either a propagation of counting statistics or the standard deviation of the mean, and considering uncertainty in the width and height as a part of the self attenuation uncertainty. In addition, a detection limit is calculated for point sources using equations derived from summary equations contained in Chapter 20 of MARLAP [5]. The Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2007-1 to the Secretary of Energy identified a lack of requirements and a lack of standardization for performing measurements across the U.S. Department of Energy (DOE) complex. The DNFSB also recommended that guidance be developed for a consistent application of uncertainty values. As such, the recent modifications to the TMU calculational model described in this paper have not yet been implemented. The Plutonium Finishing Plant (PFP) is continuing to perform uncertainty calculations as per Reference 4. Publication at this time is so that these concepts can be considered in developing a consensus methodology across the complex.

  18. Global Sampling for Integrating Physics-Specific Subsystems and Quantifying Uncertainties of CO2 Geological Sequestration

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Sun, Y.; Tong, C.; Trainor-Guitten, W. J.; Lu, C.; Mansoor, K.; Carroll, S. A.

    2012-12-20

    The risk of CO2 leakage from a deep storage reservoir into a shallow aquifer through a fault is assessed and studied using physics-specific computer models. The hypothetical CO2 geological sequestration system is composed of three subsystems: a deep storage reservoir, a fault in caprock, and a shallow aquifer, which are modeled respectively by considering sub-domain-specific physics. Supercritical CO2 is injected into the reservoir subsystem with uncertain permeabilities of reservoir, caprock, and aquifer, uncertain fault location, and injection rate (as a decision variable). The simulated pressure and CO2/brine saturation are connected to the fault-leakage model as a boundary condition. CO2 and brine fluxes from the fault-leakage model at the fault outlet are then imposed in the aquifer model as a source term. Moreover, uncertainties are propagated from the deep reservoir model, to the fault-leakage model, and eventually to the geochemical model in the shallow aquifer, thus contributing to risk profiles. To quantify the uncertainties and assess leakage-relevant risk, we propose a global sampling-based method to allocate sub-dimensions of uncertain parameters to sub-models. The risk profiles are defined and related to CO2 plume development for pH value and total dissolved solids (TDS) below the EPA's Maximum Contaminant Levels (MCL) for drinking water quality. A global sensitivity analysis is conducted to select the most sensitive parameters to the risk profiles. The resulting uncertainty of pH- and TDS-defined aquifer volume, which is impacted by CO2 and brine leakage, mainly results from the uncertainty of fault permeability. Subsequently, high-resolution, reduced-order models of risk profiles are developed as functions of all the decision variables and uncertain parameters in all three subsystems.

  19. Climate uncertainty and implications for U.S. state-level risk assessment through 2050.

    SciTech Connect (OSTI)

    Loose, Verne W.; Lowry, Thomas Stephen; Malczynski, Leonard A.; Tidwell, Vincent Carroll; Stamber, Kevin Louis; Kelic, Andjelka; Backus, George A.; Warren, Drake E.; Zagonel, Aldo A.; Ehlen, Mark Andrew; Klise, Geoffrey T.; Vargas, Vanessa N.

    2009-10-01

    Decisions for climate policy will need to take place in advance of climate science resolving all relevant uncertainties. Further, if the concern of policy is to reduce risk, then the best estimate of climate change impacts may not be as important as the currently understood uncertainty associated with realizable conditions having high consequence. This study focuses on one of the most uncertain aspects of future climate change, precipitation, to understand the implications of uncertainty on risk and the near-term justification for interventions to mitigate the course of climate change. We show that the mean risk of damage to the economy from climate change, at the national level, is on the order of one trillion dollars over the next 40 years, with employment impacts of nearly 7 million labor-years. At a 1% exceedance-probability, the impact is over twice the mean-risk value. Impacts at the level of individual U.S. states are then typically in the range of multiple tens of billions of dollars, with employment losses exceeding hundreds of thousands of labor-years. We used results of the Intergovernmental Panel on Climate Change's (IPCC) Fourth Assessment Report (AR4) climate-model ensemble as the referent for climate uncertainty over the next 40 years, mapped the simulated weather hydrologically to the county level for determining the physical consequence to economic activity at the state level, and then performed a detailed, seventy-industry analysis of economic impact among the interacting lower-48 states. We determined industry GDP and employment impacts at the state level, as well as interstate population migration, effect on personal income, and the consequences for the U.S. trade balance.

  1. Uncertainty quantification for nuclear density functional theory and information content of new measurements

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    McDonnell, J. D.; Schunck, N.; Higdon, D.; Sarich, J.; Wild, S. M.; Nazarewicz, W.

    2015-03-24

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. In addition, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.

  2. Uncertainty quantification for nuclear density functional theory and information content of new measurements

    SciTech Connect (OSTI)

    McDonnell, J. D.; Schunck, N.; Higdon, D.; Sarich, J.; Wild, S. M.; Nazarewicz, W.

    2015-03-24

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. In addition, the example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.

  4. Managing Model Data Introduced Uncertainties in Simulator Predictions for Generation IV Systems via Optimum Experimental Design

    SciTech Connect (OSTI)

    Turinsky, Paul J; Abdel-Khalik, Hany S; Stover, Tracy E

    2011-03-31

    An optimization technique has been developed to select experimental design specifications that produce data specifically intended to be assimilated in optimizing a given reactor concept. Data from the optimized experiment are assimilated to generate a posteriori uncertainties on the reactor concept's core attributes, from which the design responses are computed. The reactor concept is then re-optimized with the new data to realize cost savings by reducing margin, and the optimization problem iterates until the experiment that maximizes the savings is found. A new generation of innovative nuclear reactor designs, in particular fast-neutron-spectrum recycle reactors, is being considered for closing the nuclear fuel cycle in the future. Safe and economical design of these reactors will require uncertainty reduction in the basic nuclear data that are input to the reactor design. These data uncertainties propagate to the design responses, which in turn require the reactor designer to incorporate additional safety margin into the design, often increasing its cost. Basic nuclear data therefore need to be improved, and this is accomplished through experimentation. Given the high cost of nuclear experiments, an optimized experiment is desired that provides the data needed for uncertainty reduction, so that a reactor design concept can meet its target accuracies or realize savings by reducing the margin required to cover uncertainty propagated from basic nuclear data. This optimization is coupled to the reactor design itself, however, because with improved data the reactor concept can itself be re-optimized. It is thus desired to find the experiment that yields the best optimized reactor design. Methods are first established to model both the reactor concept and the experiment and to efficiently propagate the basic nuclear data uncertainty through these models to their outputs. The representativity of the experiment with respect to the design concept is quantitatively determined. A technique is then established to assimilate these data and produce a posteriori uncertainties on key attributes and responses of the design concept. Several experiment perturbations based on engineering judgment are used to demonstrate these methods and also serve as an initial generation for the optimization problem. Finally, an optimization technique is developed that simultaneously arrives at an optimized experiment and an optimized reactor design; solution of this problem is made possible by the simulated annealing algorithm. The optimization examined in this work maximizes the reactor cost savings associated with the modified design made possible by the design margin gained through reduced basic nuclear data uncertainties. Cost values for experiment and reactor design specifications are established and used to compute the total savings by comparing the a posteriori reactor cost with the a priori cost plus the cost of the experiment; the optimized solution arrives at the maximum cost savings.
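
    The search strategy named above is simulated annealing; the loop below is a minimal sketch of that algorithm (ours, with a single invented design variable and a made-up net-savings curve standing in for the report's experiment/reactor cost model).

        import math, random

        random.seed(1)

        def net_savings(x):
            # Hypothetical: margin recovered saturates while experiment cost grows,
            # giving an interior maximum; NOT the report's cost model.
            experiment_cost = 2.0 * x
            margin_recovered = 10.0 * (1.0 - math.exp(-3.0 * x))
            return margin_recovered - experiment_cost

        x, best, T = 0.1, 0.1, 1.0
        for step in range(2000):
            cand = min(1.0, max(0.0, x + random.uniform(-0.1, 0.1)))
            d = net_savings(cand) - net_savings(x)
            # Accept uphill moves always, downhill moves with probability e^{d/T}.
            if d > 0 or random.random() < math.exp(d / T):
                x = cand
            if net_savings(x) > net_savings(best):
                best = x
            T *= 0.998                      # geometric cooling schedule
        print(f"best design parameter ~ {best:.3f}, savings ~ {net_savings(best):.2f}")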

  5. Results for Phase I of the IAEA Coordinated Research Program on HTGR Uncertainties

    SciTech Connect (OSTI)

    Strydom, Gerhard; Bostelmann, Friederike; Yoon, Su Jong

    2015-01-01

    The quantification of uncertainties in the design and safety analysis of reactors is today not only broadly accepted but has in many cases become the preferred way to replace traditional conservative analysis for safety and licensing purposes. The use of a more fundamental methodology is also consistent with the reliable high-fidelity physics models and the robust, efficient, and accurate codes available today. To facilitate uncertainty analysis applications, a comprehensive approach and methodology must be developed and applied. High Temperature Gas-cooled Reactors (HTGRs) have their own peculiarities: coated-particle fuel, large graphite quantities, different materials, and high temperatures, all of which impose additional simulation requirements. The IAEA therefore launched a Coordinated Research Project (CRP) on HTGR Uncertainty Analysis in Modeling (UAM) in 2013 to study uncertainty propagation specifically in the HTGR analysis chain. Two benchmark problems are defined, with the prismatic design represented by the General Atomics (GA) MHTGR-350 and a 250 MW modular pebble bed design similar to the HTR-PM (INET, China). This report summarizes the contributions of the HTGR Methods Simulation group at Idaho National Laboratory (INL) to the CRP to date. The activities at INL have so far focused on creating the problem specifications for the prismatic design and on providing reference solutions for the exercises defined for Phase I. An overview is provided of the HTGR UAM objectives and scope, and the detailed specifications for Exercises I-1, I-2, I-3, and I-4 are included for completeness. The main focus of the report is the compilation and discussion of reference results for Phase I (i.e., for input parameters at their nominal or best-estimate values), which is the first step of the uncertainty quantification process. These reference results can be used by other CRP participants for comparison with other codes or with their own reference results. The status of the Monte Carlo modeling of the experimental VHTRC facility is also discussed. Reference results were obtained for the stand-alone neutronics cases (Ex. I-1 and I-2) using the relatively new Monte Carlo code Serpent, and comparisons were performed with the more established Monte Carlo codes MCNP and KENO-VI. For the stand-alone thermal-fluids cases (Ex. I-3 and I-4), the commercial CFD code CFX was used to obtain reference results that can be compared with lower-fidelity tools.

  6. A Two-Step Approach to Uncertainty Quantification of Core Simulators

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Yankov, Artem; Collins, Benjamin; Klein, Markus; Jessee, Matthew A.; Zwermann, Winfried; Velkov, Kiril; Pautz, Andreas; Downar, Thomas

    2012-01-01

    For the multiple sources of error introduced into the standard computational regime for simulating reactor cores, rigorous uncertainty analysis methods are available primarily to quantify the effects of cross section uncertainties. Two methods for propagating cross section uncertainties through core simulators are the XSUSA statistical approach and the “two-step” method. The XSUSA approach, which is based on the SUSA code package, is fundamentally a stochastic sampling method. Alternatively, the two-step method utilizes generalized perturbation theory in the first step and stochastic sampling in the second step. The consistency of these two methods in quantifying uncertainties in the multiplication factor and in the core power distribution was examined in the framework of phase I-3 of the OECD Uncertainty Analysis in Modeling benchmark. With the Three Mile Island Unit 1 core as a base model for analysis, the XSUSA and two-step methods were applied with certain limitations, and the results were compared to those produced by other stochastic sampling-based codes. Based on the uncertainty analysis results, conclusions were drawn as to the method that is currently more viable for computing uncertainties in burnup and transient calculations.
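
    Both approaches ultimately rest on stochastic sampling of nuclear data. The toy propagation below (ours, with an invented two-group "simulator" and covariance matrix) illustrates the sampling step: draw correlated cross-section perturbations from a covariance matrix, run the model on each sample, and collect output statistics.

        import numpy as np

        rng = np.random.default_rng(42)

        nominal = np.array([0.010, 0.100])        # hypothetical nu-fission, absorption
        cov = np.array([[1.0e-8, 2.0e-8],
                        [2.0e-8, 1.0e-6]])        # hypothetical covariance matrix

        def simulator(xs):                        # toy stand-in for a core simulator
            nu_fis, absorb = xs
            return nu_fis / absorb * 10.0         # crude k = production / absorption

        samples = rng.multivariate_normal(nominal, cov, size=1000)
        k_eff = np.array([simulator(s) for s in samples])
        print(f"k mean = {k_eff.mean():.4f}, "
              f"rel. std = {100 * k_eff.std() / k_eff.mean():.2f}%")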

  7. A Probabilistic Framework for Quantifying Mixed Uncertainties in Cyber Attacker Payoffs

    SciTech Connect (OSTI)

    Chatterjee, Samrat; Tipireddy, Ramakrishna; Oster, Matthew R.; Halappanavar, Mahantesh

    2015-12-28

    Quantification and propagation of uncertainties in cyber attacker payoffs is a key aspect of multiplayer, stochastic security games. These payoffs may represent penalties or rewards associated with player actions and are subject to various sources of uncertainty, including: (1) cyber-system state, (2) attacker type, (3) choice of player actions, and (4) cyber-system state transitions over time. Past research has primarily focused on representing defender beliefs about attacker payoffs as point utility estimates. More recently, within the physical security domain, attacker payoff uncertainties have been represented as uniform and Gaussian probability distributions and as mathematical intervals. For cyber systems, probability distributions may help address statistical (aleatory) uncertainties, where the defender may assume inherent variability or randomness in the factors contributing to the attacker payoffs. However, systematic (epistemic) uncertainties may also exist, where the defender does not have sufficient knowledge or information about the attacker's payoff-generation mechanism. Such epistemic uncertainties are more suitably represented as generalizations of probability boxes. This paper explores the mathematical treatment of such mixed payoff uncertainties. A conditional probabilistic reasoning approach is adopted to organize the dependencies between a cyber-system's state, attacker type, player actions, and state transitions. This also enables the application of probabilistic theories to propagate various uncertainties in the attacker payoffs. An example implementation of this probabilistic framework and the resulting attacker payoff distributions are discussed. A goal of this paper is also to highlight this uncertainty quantification problem space to the cyber security research community and to encourage further advancements in this area.
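
    A probability box can be pictured as a pair of bounding CDFs. The snippet below (our illustration, with made-up numbers, not the paper's framework) builds one for a payoff whose aleatory part is Gaussian and whose mean is known only to lie in an epistemic interval.

        import numpy as np
        from math import erf, sqrt

        # Aleatory part: payoff ~ Normal(mu, 1); epistemic part: mu in [2, 4].
        def normal_cdf(x, mu):
            return 0.5 * (1.0 + erf((x - mu) / sqrt(2.0)))

        xs = np.linspace(-2.0, 8.0, 11)
        lower = [normal_cdf(x, mu=4.0) for x in xs]   # lower CDF envelope
        upper = [normal_cdf(x, mu=2.0) for x in xs]   # upper CDF envelope
        for x, lo, hi in zip(xs, lower, upper):
            print(f"P(payoff <= {x:5.1f}) lies in [{lo:.3f}, {hi:.3f}]")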

  8. Statistical Assessment of Proton Treatment Plans Under Setup and Range Uncertainties

    SciTech Connect (OSTI)

    Park, Peter C.; Cheung, Joey P.; Zhu, X. Ronald; Lee, Andrew K.; Sahoo, Narayan; Tucker, Susan L.; Liu, Wei; Li, Heng; Mohan, Radhe; Court, Laurence E.; Dong, Lei

    2013-08-01

    Purpose: To evaluate a method for quantifying the effect of setup errors and range uncertainties on dose distributions and dose-volume histograms using statistical parameters, and to assess existing planning practice in selected treatment sites under setup and range uncertainties. Methods and Materials: Twenty passively scattered proton lung cancer plans, 10 prostate plans, and 1 brain cancer scanning-beam proton plan were analyzed. To account for the dose under uncertainties, we performed a comprehensive simulation in which the dose was recalculated 600 times per plan under the influence of random and systematic setup errors and proton range errors. On the basis of the simulation results, we determined the probability of dose variations and calculated the expected values and standard deviations of the dose-volume histograms. The uncertainties in dose were spatially visualized on the planning CT as a probability map of failure to achieve target coverage or of overdose to critical structures. Results: The expected value of target coverage under the uncertainties was consistently lower than the nominal value determined from clinical target volume coverage without setup error or range uncertainty, with mean differences of -1.1% (-0.9% for breath-hold), -0.3%, and -2.2% for the lung, prostate, and brain cases, respectively. The organs whose doses were most sensitive to the uncertainties were the esophagus and spinal cord for lung, the rectum for prostate, and the brain stem for brain cancer. Conclusions: A clinically feasible robustness plan analysis tool based on direct dose calculation and statistical simulation has been developed. Both the expected value and the standard deviation are useful in evaluating the impact of uncertainties. The existing proton beam planning method used at this institution appears adequate in terms of target coverage. However, structures that are small in volume or located near the target area showed greater sensitivity to uncertainties.
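
    The simulation loop is straightforward to sketch. The fragment below is our stripped-down illustration (the coverage response and error magnitudes are invented), mirroring the recipe of 600 recalculations per plan under combined systematic and random errors.

        import numpy as np

        rng = np.random.default_rng(7)

        def target_coverage(shift_mm, range_err_pct):
            # Hypothetical response: nominal 100% coverage eroded by setup/range error.
            return 100.0 - 0.8 * abs(shift_mm) - 0.5 * abs(range_err_pct)

        systematic = rng.normal(0.0, 2.0)          # one systematic setup error (mm)
        coverages = []
        for _ in range(600):                       # 600 recalculations, as in the paper
            random_shift = rng.normal(0.0, 1.5)    # per-fraction random setup error (mm)
            range_err = rng.normal(0.0, 3.0)       # proton range uncertainty (%)
            coverages.append(target_coverage(systematic + random_shift, range_err))

        coverages = np.array(coverages)
        print(f"expected coverage {coverages.mean():.1f}% +/- {coverages.std():.1f}%")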

  9. Use of SUSA in Uncertainty and Sensitivity Analysis for INL VHTR Coupled Codes

    SciTech Connect (OSTI)

    Gerhard Strydom

    2010-06-01

    The need for a defendable and systematic uncertainty and sensitivity analysis approach that conforms to the Code Scaling, Applicability, and Uncertainty (CSAU) process, and that could be used for a wide variety of software codes, was defined in 2008. The GRS (Gesellschaft für Anlagen- und Reaktorsicherheit) company of Germany has developed one type of CSAU approach that is particularly well suited for legacy coupled core analysis codes, and a trial version of their commercial software product SUSA (Software for Uncertainty and Sensitivity Analyses) was acquired on May 12, 2010. This interim milestone report provides an overview of the current status of the implementation and testing of SUSA at the INL VHTR Project Office.

  10. Dakota uncertainty quantification methods applied to the NEK-5000 SAHEX model.

    SciTech Connect (OSTI)

    Weirs, V. Gregory

    2014-03-01

    This report summarizes the results of a NEAMS project focused on the use of uncertainty and sensitivity analysis methods within the NEK-5000 and Dakota software framework for assessing failure probabilities as part of probabilistic risk assessment. NEK-5000 is a software tool under development at Argonne National Laboratory to perform computational fluid dynamics calculations for applications such as thermohydraulics of nuclear reactor cores. Dakota is a software tool developed at Sandia National Laboratories containing optimization, sensitivity analysis, and uncertainty quantification algorithms. The goal of this work is to demonstrate the use of uncertainty quantification methods in Dakota with NEK-5000.

  11. Uncertainty Analysis Framework - Hanford Site-Wide Groundwater Flow and Transport Model

    SciTech Connect (OSTI)

    Cole, Charles R.; Bergeron, Marcel P.; Murray, Christopher J.; Thorne, Paul D.; Wurstner, Signe K.; Rogers, Phillip M.

    2001-11-09

    Pacific Northwest National Laboratory (PNNL) embarked on a new initiative to strengthen the technical defensibility of the predictions being made with a site-wide groundwater flow and transport model at the U.S. Department of Energy Hanford Site in southeastern Washington State. In FY 2000, the focus of the initiative was on the characterization of major uncertainties in the current conceptual model that would affect model predictions. The long-term goals of the initiative are the development and implementation of an uncertainty estimation methodology in future assessments and analyses using the site-wide model. This report focuses on the development and implementation of an uncertainty analysis framework.

  12. Balance Calibration: A Method for Assigning a Direct-Reading Uncertainty to an Electronic Balance.

    SciTech Connect (OSTI)

    Mike Stears

    2010-07-01

    Paper Title: Balance Calibration: A method for assigning a direct-reading uncertainty to an electronic balance. Intended Audience: Those who calibrate or use electronic balances. Abstract: As a calibration facility, we provide on-site (at the customer's location) calibrations of electronic balances for customers within our company. In our experience, most of our customers are not using their balance as a comparator but simply placing an unknown quantity on the balance and reading the displayed mass value. Manufacturers' specifications for balances typically include readability, repeatability, linearity, and sensitivity temperature drift, but what does this all mean when the balance user simply reads the displayed mass value and accepts the reading as the true value? This paper discusses a method for assigning a direct-reading uncertainty to a balance based upon the observed calibration data and the environment where the balance is used. The method requires input from the customer regarding that environment and encourages discussion with the customer regarding sources of uncertainty and possible means for improvement; the calibration process becomes an educational opportunity for the balance user as well as for calibration personnel. The paper covers the uncertainty analysis applied to the calibration weights used for the field calibration of balances; the uncertainty is calculated over the range of environmental conditions typically encountered in the field and the resulting range of air density. The temperature stability in the area of the balance is discussed with the customer, and the temperature range over which the balance calibration is valid is decided upon; the decision is based upon the uncertainty needs of the customer and the desired rigor in monitoring by the customer. Once the environmental limitations are decided, the calibration is performed and the measurement data are entered into a custom spreadsheet. The spreadsheet uses the measurement results, along with the manufacturer's specifications, to assign a direct-reading measurement uncertainty to the balance. The fact that the assigned uncertainty is a best-case uncertainty is discussed with the customer; the assigned uncertainty contains no allowance for contributions associated with the unknown weighing sample, such as density, static charges, magnetism, etc. The attendee will learn uncertainty considerations associated with balance calibrations along with one method for assigning an uncertainty to a balance used for non-comparison measurements.
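
    A sketch of the kind of combination such a spreadsheet performs, following the usual GUM root-sum-of-squares recipe. All input values below are invented; the real analysis would add terms (e.g., air buoyancy over the agreed environmental range) derived from the customer's conditions.

        import math

        # Hypothetical standard-uncertainty contributions, in grams.
        readability    = 0.10e-3   # manufacturer's spec, rectangular -> /sqrt(3)
        repeatability  = 0.15e-3   # standard deviation from calibration data
        linearity      = 0.20e-3   # manufacturer's spec, rectangular -> /sqrt(3)
        ref_weight_unc = 0.05e-3   # standard uncertainty of the calibration weights
        temp_drift     = 0.10e-3   # sensitivity drift over agreed temperature window

        u = math.sqrt((readability / math.sqrt(3))**2 +
                      repeatability**2 +
                      (linearity / math.sqrt(3))**2 +
                      ref_weight_unc**2 +
                      (temp_drift / math.sqrt(3))**2)
        print(f"direct-reading uncertainty U = {2 * u * 1e3:.3f} mg (k=2, ~95%)")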

  13. Kiwi: An Evaluated Library of Uncertainties in Nuclear Data and Package for Nuclear Sensitivity Studies

    SciTech Connect (OSTI)

    Pruet, J

    2007-06-23

    This report describes Kiwi, a program developed at Livermore to enable mature studies of the relation between imperfectly known nuclear physics and uncertainties in simulations of complicated systems. Kiwi includes a library of evaluated nuclear data uncertainties, tools for modifying data according to these uncertainties, and a simple interface for generating processed data used by transport codes. Kiwi also provides access to calculations of k eigenvalues for critical assemblies, which allows the user to check the implications of data modifications against integral experiments for multiplying systems. Kiwi is written in Python. The uncertainty library has the same format and directory structure as the native ENDL used at Livermore. Calculations for critical assemblies rely on deterministic and Monte Carlo codes developed by B Division.

  14. Fukushima Daiichi unit 1 uncertainty analysis--Preliminary selection of uncertain parameters and analysis methodology

    SciTech Connect (OSTI)

    Cardoni, Jeffrey N.; Kalinich, Donald A.

    2014-02-01

    Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) of the Fukushima Daiichi Unit 1 (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident-reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and the Nuclear Regulatory Commission (NRC). However, that study examined only a handful of model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures of merit (e.g., hydrogen production, vessel lower head failure, etc.). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.

  15. A simplified analysis of uncertainty propagation in inherently controlled ATWS events

    SciTech Connect (OSTI)

    Wade, D.C.

    1987-01-01

    The quasi-static approach can be used to provide useful insight into the propagation of uncertainties in the inherent response to ATWS events. At issue is how uncertainties in the reactivity coefficients and in the thermal-hydraulic and materials properties propagate to yield uncertainties in the asymptotic temperatures attained upon inherent shutdown. The basic notion to be quantified is that many of the same physical phenomena contribute to both the reactivity increase upon power reduction and the reactivity decrease upon core temperature rise. Since these reactivities cancel by definition, a good deal of uncertainty cancellation must also occur of necessity. For example, if the Doppler coefficient is overpredicted, too large a positive reactivity insertion is predicted upon power reduction and collapse of the ΔT across the fuel pin. However, too large a negative reactivity is also predicted upon the compensating increase in the isothermal core average temperature, which includes the fuel Doppler effect.
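
    In the fast-reactor literature this argument is usually organized around the quasi-static reactivity balance; the form below is the standard one (our rendering, not an equation reproduced from this paper), with P and F the normalized power and flow, delta T_in the inlet temperature change, A, B, and C the integral reactivity coefficients, and Delta rho_ext the externally inserted reactivity:

        \[
        0 \;=\; (P - 1)\,A \;+\; \Bigl(\tfrac{P}{F} - 1\Bigr)\,B \;+\; \delta T_{\mathrm{in}}\,C \;+\; \Delta\rho_{\mathrm{ext}}
        \]

    Solving this balance for the asymptotic power and temperatures makes the cancellation explicit: correlated errors in A, B, and C enter both the destabilizing and compensating terms.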

  16. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    SciTech Connect (OSTI)

    Sig Drellack, Lance Prothro

    2007-12-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The simulations are challenged by the distributed sources in each of the Corrective Action Units, by complex mass transfer processes, and by the size and complexity of the field-scale flow models. An efficient methodology utilizing particle tracking results and convolution integrals provides in situ concentrations appropriate for Monte Carlo analysis. Uncertainty in source releases and transport parameters including effective porosity, fracture apertures and spacing, matrix diffusion coefficients, sorption coefficients, and colloid load and mobility are considered. With the distributions of input uncertainties and output plume volumes, global analysis methods including stepwise regression, contingency table analysis, and classification tree analysis are used to develop sensitivity rankings of parameter uncertainties for each model considered, thus assisting a variety of decisions.

  17. Validation and Uncertainty Quantification in the Consortium for Advanced Simulation of Light Water Reactors

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    and Uncertainty Quantification in CASL Michael Pernice Center for Advanced Modeling and Simulation Idaho National Laboratory SAMSI Uncertainty Quantification Transition Workshop May 21-23 2012 CASL-U-2012-0108-000 What Is CASL? * Consortium for Advanced Simulation of LWRs - An Energy Innovation Hub * Objective: predictive simulation of light water reactors - Reduce capital and operating costs * Power uprates * Lifetime extension - Reduce nuclear waste * Higher fuel burnup - Enhance operational

  18. Uncertainty analysis of multi-rate kinetics of uranium desorption from sediments

    Office of Scientific and Technical Information (OSTI)

    (Journal Article) A multi-rate expression for uranyl [U(VI)] surface complexation reactions has been proposed to describe diffusion-limited U(VI) sorption/desorption in heterogeneous subsurface sediments. An

  19. Position-momentum uncertainty relations in the presence of quantum memory

    SciTech Connect (OSTI)

    Furrer, Fabian; Berta, Mario; Tomamichel, Marco; Scholz, Volkher B.; Christandl, Matthias

    2014-12-15

    A prominent formulation of the uncertainty principle identifies the fundamental quantum feature that no particle may be prepared with certain outcomes for both position and momentum measurements. The statistical uncertainties are often measured in terms of entropies, providing a clear operational interpretation in information theory and cryptography. Recently, entropic uncertainty relations have been used to show that the uncertainty can be reduced in the presence of entanglement and to prove the security of quantum cryptographic tasks. However, much of this recent progress has focused on observables with only a finite number of outcomes, not including Heisenberg's original setting of position and momentum observables. Here, we show entropic uncertainty relations for general observables with discrete but infinite or continuous spectrum that take into account the power of an entangled observer. As an illustration, we evaluate the uncertainty relations for position and momentum measurements, which is operationally significant in that it implies the security of a quantum key distribution scheme based on homodyne detection of squeezed Gaussian states.
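
    For orientation, the finite-outcome, memory-assisted relation that this work generalizes to continuous spectra is usually quoted in the form due to Berta et al. (our rendering, not an equation from this paper), where R and S are measurements on system A, B is the quantum memory, and the overlap c measures the incompatibility of the two observables:

        \[
        H(R\,|\,B) + H(S\,|\,B) \;\ge\; \log_2\frac{1}{c} + H(A\,|\,B),
        \qquad
        c = \max_{j,k}\,\bigl|\langle \psi_j | \phi_k \rangle\bigr|^2 .
        \]

    A negative conditional entropy H(A|B), possible only with entanglement, lowers the bound, which is the sense in which uncertainty "can be reduced in the presence of entanglement."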

  20. Advancing Inverse Sensitivity/Uncertainty Methods for Nuclear Fuel Cycle Applications

    SciTech Connect (OSTI)

    Arbanas, Goran; Williams, Mark L; Leal, Luiz C; Dunn, Michael E; Khuwaileh, Bassam A.; Wang, C; Abdel-Khalik, Hany

    2015-01-01

    The inverse sensitivity/uncertainty quantification (IS/UQ) method has recently been implemented in the Inverse Sensitivity/UnceRtainty Estimator (INSURE) module of the AMPX system [1]. The IS/UQ method aims to quantify and prioritize the cross section measurements, along with their uncertainties, needed to yield a given target response uncertainty for a nuclear application, and to do so at minimum cost. Since in some cases the extant uncertainties of the differential cross section data are already near the limits of present-day state-of-the-art measurements, requiring significantly smaller uncertainties may be unrealistic. We have therefore incorporated integral benchmark experiment (IBE) data into the IS/UQ method using the generalized linear least-squares method and have implemented it in the INSURE module. We show how the IS/UQ method can be applied to systematic and statistical uncertainties in a self-consistent way, and how it can be used to optimize the uncertainties of IBEs and differential cross section data simultaneously.
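
    The generalized linear least-squares update mentioned above has a compact closed form. The sketch below (our illustration, with invented sensitivities, covariances, and discrepancy; not AMPX/INSURE code) shows one IBE response adjusting two cross sections and shrinking their covariance.

        import numpy as np

        x  = np.array([1.00, 1.00])           # prior cross sections (normalized)
        Cx = np.diag([0.02**2, 0.05**2])      # prior covariance (2% and 5%)
        S  = np.array([[0.7, 0.3]])           # sensitivity of one IBE response to x
        Vm = np.array([[0.01**2]])            # IBE measurement covariance
        d  = np.array([0.015])                # measured-minus-calculated discrepancy

        G     = Cx @ S.T @ np.linalg.inv(S @ Cx @ S.T + Vm)   # gain matrix
        x_new = x + G @ d                                      # adjusted data
        C_new = Cx - G @ S @ Cx                                # reduced covariance
        print("adjusted x:", x_new)
        print("posterior std dev:", np.sqrt(np.diag(C_new)))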

  1. Grid and basis adaptive polynomial chaos techniques for sensitivity and uncertainty analysis

    SciTech Connect (OSTI)

    Perkó, Zoltán; Gilli, Luca; Lathouwers, Danny; Kloosterman, Jan Leen

    2014-03-01

    The demand for accurate and computationally affordable sensitivity and uncertainty analysis techniques is constantly on the rise and has become especially pressing in the nuclear field with the shift to Best Estimate Plus Uncertainty methodologies in the licensing of nuclear installations. Besides traditional, well-developed methods such as first-order perturbation theory and Monte Carlo sampling, Polynomial Chaos Expansion (PCE) has received growing emphasis in recent years due to its simple application and good performance. This paper presents new developments of the research done at TU Delft on such Polynomial Chaos (PC) techniques. Our work is focused on the Non-Intrusive Spectral Projection (NISP) approach and on adaptive methods for building the PCE of responses of interest. Recent efforts resulted in a new adaptive sparse grid algorithm designed for estimating the PC coefficients. The algorithm is based on Gerstner's procedure for calculating multi-dimensional integrals but proves to be computationally significantly cheaper, while retaining a similar accuracy to the original method. More importantly, the issue of basis adaptivity has been investigated, and two techniques have been implemented for constructing the sparse PCE of quantities of interest. Forgoing the traditional full PC basis set leads to a further reduction in computational time, since the high-order grids necessary for accurately estimating the near-zero expansion coefficients of polynomial basis vectors not needed in the PCE can be excluded from the calculation. Moreover, the sparse PC representation of the response is easier to handle when used for sensitivity analysis or uncertainty propagation, due to the smaller number of basis vectors. The developed grid- and basis-adaptive methods have been implemented in Matlab as the Fully Adaptive Non-Intrusive Spectral Projection (FANISP) algorithm and were tested on four analytical problems. These show consistently good performance, both in the accuracy of the resulting PC representation of quantities and in the computational costs associated with constructing the sparse PCE. Basis adaptivity also seems to make the use of PC techniques feasible for problems with a larger number of input parameters (15-20), alleviating a well-known limitation of the traditional approach. The prospect of larger-scale applicability and the simplicity of implementation make such adaptive PC algorithms particularly appealing for the sensitivity and uncertainty analysis of complex systems and legacy codes.
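
    To make the idea concrete, the fragment below is a bare-bones non-intrusive PCE in one Gaussian variable (our illustration; the FANISP algorithm adds sparse adaptive grids and basis adaptivity on top of this). Coefficients are obtained here by simple regression rather than sparse-grid quadrature.

        import numpy as np

        rng = np.random.default_rng(3)

        def response(xi):                      # stand-in for a code run, xi ~ N(0,1)
            return np.exp(0.3 * xi)

        # Probabilists' Hermite polynomials He_0..He_3 evaluated column-wise.
        def hermite_basis(xi):
            return np.column_stack([np.ones_like(xi), xi,
                                    xi**2 - 1.0, xi**3 - 3.0 * xi])

        xi = rng.standard_normal(200)          # "training" code runs
        coef, *_ = np.linalg.lstsq(hermite_basis(xi), response(xi), rcond=None)

        # Orthogonality gives the moments directly: E[He_k^2] = k!, so
        # mean = c_0 and Var = sum_{k>=1} k! c_k^2.
        fact = [1, 1, 2, 6]
        var = sum(coef[k]**2 * fact[k] for k in range(1, 4))
        print(f"PCE mean = {coef[0]:.4f} (exact {np.exp(0.3**2 / 2):.4f}), "
              f"std = {var**0.5:.4f}")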

  2. MO-C-17A-13: Uncertainty Evaluation of CT Image Deformable Registration for H and N Cancer Adaptive Radiotherapy

    SciTech Connect (OSTI)

    Qin, A; Yan, D

    2014-06-15

    Purpose: To evaluate the uncertainties of organ-specific Deformable Image Registration (DIR) for H and N cancer Adaptive Radiation Therapy (ART). Methods: A commercial DIR evaluation tool, which includes a digital phantom library of 8 patients and the corresponding “Ground-truth Deformable Vector Field” (GT-DVF), was used in the study. Each patient in the phantom library includes a GT-DVF created from a pair of CT images acquired prior to and at the end of the treatment course. Five DIR tools, including 2 commercial tools (CMT1, CMT2), 2 in-house algorithms (IH-FFD1, IH-FFD2), and the classic DEMON algorithm, were applied to the patient images. The resulting DVF was compared to the GT-DVF voxel by voxel. Organ-specific DVF uncertainty was calculated for 10 ROIs: Whole Body, Brain, Brain Stem, Cord, Lips, Mandible, Parotid, Esophagus, and Submandibular Gland. Registration error-volume histograms were constructed for comparison. Results: The uncertainty is relatively small for the brain stem, cord, and lips, and large in the parotid and submandibular gland. CMT1 achieved the best overall accuracy (on whole body, mean vector error over the 8 patients: 0.98±0.29 mm). For the brain, mandible, right parotid, left parotid, and submandibular gland, the classic DEMON algorithm had the lowest uncertainty (0.49±0.09, 0.51±0.16, 0.46±0.11, 0.50±0.11, and 0.69±0.47 mm, respectively). For the brain stem, cord, and lips, the DVF from CMT1 had the best accuracy (0.28±0.07, 0.22±0.08, and 0.27±0.12 mm, respectively). All algorithms had the largest right-parotid uncertainty on patient #7, whose images contain an artifact caused by a tooth implant. Conclusion: The uncertainty of deformable CT image registration depends strongly on the registration algorithm and is organ specific. Large uncertainty most likely appears at soft-tissue organs far from bony structures. Among the 5 DIR methods, the classic DEMON and CMT1 appear best able to limit the uncertainty to within 2 mm for all OARs. Partially supported by a research grant from Elekta.
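
    A sketch of the voxel-by-voxel comparison described above (our illustration; array shapes and fields are synthetic stand-ins for the phantom data): compare a computed deformation vector field to the ground truth, then bin error magnitudes inside an organ mask into an error-volume histogram.

        import numpy as np

        rng = np.random.default_rng(5)
        shape = (40, 40, 40)

        gt_dvf   = rng.normal(0.0, 1.0, size=shape + (3,))           # ground-truth DVF (mm)
        test_dvf = gt_dvf + rng.normal(0.0, 0.4, size=shape + (3,))  # algorithm's DVF
        mask     = rng.uniform(size=shape) < 0.1                     # hypothetical organ ROI

        err = np.linalg.norm(test_dvf - gt_dvf, axis=-1)[mask]       # vector error magnitude
        print(f"ROI mean vector error: {err.mean():.2f} +/- {err.std():.2f} mm")
        for t in (0.5, 1.0, 2.0):                 # error-volume histogram sample points
            print(f"volume fraction with error <= {t} mm: {(err <= t).mean():.2%}")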

  3. The ends of uncertainty: Air quality science and planning in Central California

    SciTech Connect (OSTI)

    Fine, James

    2003-09-01

    Air quality planning in Central California is complicated and controversial despite millions of dollars invested to improve scientific understanding. This research describes and critiques the use of photochemical air quality simulation modeling studies in planning to attain standards for ground-level ozone in the San Francisco Bay Area and the San Joaquin Valley during the 1990s. Data are gathered through documents and interviews with planners, modelers, and policy-makers at public agencies and with representatives from the regulated and environmental communities. Interactions among organizations are diagrammed to identify significant nodes of interaction. Dominant policy coalitions are described through narratives distinguished by their uses of and responses to uncertainty, their exposures to risks, and their responses to the principles of conservatism, civil duty, and caution. Policy narratives are delineated using aggregated respondent statements to describe and understand advocacy coalitions. I found that models impacted the planning process significantly but were not used purely for their scientific capabilities. Modeling results provided justification for decisions based on other constraints and political considerations. Uncertainties were utilized opportunistically by stakeholders instead of being managed explicitly. Ultimately, the process supported the partisan views of those in control of the modeling. Based on these findings, as well as a review of model uncertainty analysis capabilities, I recommend modifying the planning process to allow for the development and incorporation of uncertainty information, while addressing the need for inclusive and meaningful public participation. By documenting an actual air quality planning process, these findings provide insights about the potential for using new scientific information and understanding to achieve environmental goals, most notably the analysis of uncertainties in modeling applications. Concurrently, needed uncertainty information is identified and the capabilities to produce it are assessed. Practices to facilitate incorporation of uncertainty information are suggested based on the research findings, as well as on theory from the literatures of the policy sciences, decision sciences, science and technology studies, consensus-based and communicative planning, and modeling.

  4. Total Measurement Uncertainty for the Plutonium Finishing Plant (PFP) Segmented Gamma Scan Assay System

    SciTech Connect (OSTI)

    WESTSIK, G.A.

    2001-06-06

    This report presents the results of an evaluation of the Total Measurement Uncertainty (TMU) for the Canberra manufactured Segmented Gamma Scanner Assay System (SGSAS) as employed at the Hanford Plutonium Finishing Plant (PFP). In this document, TMU embodies the combined uncertainties due to all of the individual random and systematic sources of measurement uncertainty. It includes uncertainties arising from corrections and factors applied to the analysis of transuranic waste to compensate for inhomogeneities and interferences from the waste matrix and radioactive components. These include uncertainty components for any assumptions contained in the calibration of the system or computation of the data. Uncertainties are propagated at 1 sigma. The final total measurement uncertainty value is reported at the 95% confidence level. The SGSAS is a gamma assay system that is used to assay plutonium and uranium waste. The SGSAS system can be used in a stand-alone mode to perform the NDA characterization of a container, particularly for low to medium density (0-2.5 g/cc) container matrices. The SGSAS system provides a full gamma characterization of the container content. This document is an edited version of the Rocky Flats TMU Report for the Can Scan Segment Gamma Scanners, which are in use for the plutonium residues projects at the Rocky Flats plant. The can scan segmented gamma scanners at Rocky Flats are the same design as the PFP SGSAS system and use the same software (with the exception of the plutonium isotopics software). Therefore, all performance characteristics are expected to be similar. Modifications in this document reflect minor differences in the system configuration, container packaging, calibration technique, etc. These results are supported by the Quality Assurance Objective (QAO) counts, safeguards test data, calibration data, etc. for the PFP SGSAS system. Other parts of the TMU analysis utilize various modeling techniques such as Monte Carlo N-Particle (MCNP) and In Situ Object Counting Software (ISOCS).

  5. State-of-the-Art Solar Simulator Reduces Measurement Time and Uncertainty (Fact Sheet)

    SciTech Connect (OSTI)

    Not Available

    2012-04-01

    One-Sun Multisource Solar Simulator (OSMSS) brings accurate energy-rating predictions that account for the nonlinear behavior of multijunction photovoltaic devices. The National Renewable Energy Laboratory (NREL) is one of only a few International Organization for Standardization (ISO)-accredited calibration labs in the world for primary and secondary reference cells and modules. As such, it is critical to seek new horizons in developing simulators and measurement methods. Current solar simulators are not well suited for accurately measuring multijunction devices. To set the electrical current to each junction independently, simulators must precisely tune the spectral content with no overlap between the wavelength regions. Current simulators do not have this capability, and the overlaps lead to large measurement uncertainties of ±6%. In collaboration with LabSphere, NREL scientists have designed and implemented the One-Sun Multisource Solar Simulator (OSMSS), which enables automatic spectral adjustment with nine independent wavelength regions. This fiber-optic simulator allows researchers and developers to set the current to each junction independently, reducing errors relating to spectral effects. NREL also developed proprietary software that allows this fully automated simulator to rapidly 'build' a spectrum under which all junctions of a multijunction device are current matched and behave as they would under a reference spectrum. The OSMSS will reduce the measurement uncertainty for multijunction devices, while significantly reducing the current-voltage measurement time from several days to minutes. These features will enable highly accurate energy-rating predictions that take into account the nonlinear behavior of multijunction photovoltaic devices.

  6. The IAEA Coordinated Research Program on HTGR Reactor Physics, Thermal-hydraulics and Depletion Uncertainty Analysis: Description of the Benchmark Test Cases and Phases

    SciTech Connect (OSTI)

    Frederik Reitsma; Gerhard Strydom; Bismark Tyobeka; Kostadin Ivanov

    2012-10-01

    The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of design and safety features with reliable high-fidelity physics models and robust, efficient, and accurate codes. The uncertainties in HTGR analysis tools are today typically assessed with sensitivity analysis, and a few important input uncertainties (typically identified through a PIRT process) are then varied in the analysis to find a spread in the parameter of importance. However, one wishes to apply a more fundamental approach to determine the predictive capability and accuracy of the coupled neutronics/thermal-hydraulics and depletion simulations used for reactor design and safety assessment. Today there is broader acceptance of the use of uncertainty analysis even in safety studies, and in some cases regulators have accepted it as a replacement for the traditional conservative analysis. Finally, there is also a renewed focus on supplying reliable covariance data (nuclear data uncertainties) that can then be used in uncertainty methods. Uncertainty and sensitivity studies are therefore becoming an essential component of any significant effort in data and simulation improvement. To address uncertainty in analysis and methods in the HTGR community, the IAEA launched a Coordinated Research Project (CRP) on HTGR Uncertainty Analysis in Modelling early in 2012. The project is built on the experience of the OECD/NEA Light Water Reactor (LWR) Uncertainty Analysis in Best-Estimate Modelling (UAM) benchmark activity, but focuses specifically on the peculiarities of HTGR designs and their simulation requirements. Two benchmark problems were defined, with the prismatic type represented by the MHTGR-350 design from General Atomics (GA), while a 250 MW modular pebble bed design, similar to the INET (China) and indirect-cycle PBMR (South Africa) designs, is also included. The paper presents more detail on the benchmark cases, the specific phases and tasks, and the latest status and plans.

  7. Methodology of Internal Assessment of Uncertainty and Extension to Neutron Kinetics/Thermal-Hydraulics Coupled Codes

    SciTech Connect (OSTI)

    Petruzzi, A.; D'Auria, F.; Giannotti, W.; Ivanov, K.

    2005-02-15

    The best-estimate calculation results from complex system codes are affected by approximations that are unpredictable without the use of computational tools that account for the various sources of uncertainty.The code with (the capability of) internal assessment of uncertainty (CIAU) has been previously proposed by the University of Pisa to realize the integration between a qualified system code and an uncertainty methodology and to supply proper uncertainty bands each time a nuclear power plant (NPP) transient scenario is calculated. The derivation of the methodology and the results achieved by the use of CIAU are discussed to demonstrate the main features and capabilities of the method.In a joint effort between the University of Pisa and The Pennsylvania State University, the CIAU method has been recently extended to evaluate the uncertainty of coupled three-dimensional neutronics/thermal-hydraulics calculations. The result is CIAU-TN. The feasibility of the approach has been demonstrated, and sample results related to the turbine trip transient in the Peach Bottom NPP are shown. Notwithstanding that the full implementation and use of the procedure requires a database of errors not available at the moment, the results give an idea of the errors expected from the present computational tools.

  8. Quantifying sampling noise and parametric uncertainty in atomistic-to-continuum simulations using surrogate models

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Salloum, Maher N.; Sargsyan, Khachik; Jones, Reese E.; Najm, Habib N.; Debusschere, Bert

    2015-08-11

    We present a methodology to assess the predictive fidelity of multiscale simulations by incorporating uncertainty in the information exchanged between the components of an atomistic-to-continuum simulation. We account for both the uncertainty due to finite sampling in molecular dynamics (MD) simulations and the uncertainty in the physical parameters of the model. Using Bayesian inference, we represent the expensive atomistic component by a surrogate model that relates the long-term output of the atomistic simulation to its uncertain inputs. We then present algorithms to solve for the variables exchanged across the atomistic-continuum interface in terms of polynomial chaos expansions (PCEs). We also consider a simple Couette flow where velocities are exchanged between the atomistic and continuum components, while accounting for uncertainty in the atomistic model parameters and the continuum boundary conditions. Results show convergence of the coupling algorithm at a reasonable number of iterations. As a result, the uncertainty in the obtained variables significantly depends on the amount of data sampled from the MD simulations and on the width of the time-averaging window used in the MD simulations.

  9. SOARCA Peach Bottom Atomic Power Station Long-Term Station Blackout Uncertainty Analysis: Knowledge Advancement.

    SciTech Connect (OSTI)

    Gauntt, Randall O.; Mattie, Patrick D.; Bixler, Nathan E.; Ross, Kyle; Cardoni, Jeffrey N; Kalinich, Donald A.; Osborn, Douglas M.; Sallaberry, Cedric Jean-Marie; Ghosh, S. Tina

    2014-02-01

    This paper describes the knowledge advancements from the uncertainty analysis for the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout accident scenario at the Peach Bottom Atomic Power Station. This work assessed key MELCOR and MELCOR Accident Consequence Code System, Version 2 (MACCS2) modeling uncertainties in an integrated fashion to quantify the relative importance of each uncertain input on potential accident progression, radiological releases, and off-site consequences. This quantitative uncertainty analysis provides measures of the effects on consequences of each of the selected uncertain parameters, both individually and in interaction with other parameters. The results measure the model response (e.g., variance in the output) to uncertainty in the selected inputs. Investigation into the important uncertain parameters in turn yields insights into important phenomena for accident progression and off-site consequences. This uncertainty analysis confirmed the known importance of some parameters, such as the failure rate of the safety relief valve in accident progression modeling and the dry deposition velocity in off-site consequence modeling. The analysis also revealed some new insights, such as the dependent effect of cesium chemical form for different accident progressions.

  10. DAKOTA : a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis.

    SciTech Connect (OSTI)

    Eldred, Michael Scott; Vigil, Dena M.; Dalbey, Keith R.; Bohnhoff, William J.; Adams, Brian M.; Swiler, Laura Painton; Lefantzi, Sophia; Hough, Patricia Diane; Eddy, John P.

    2011-12-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the DAKOTA software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of DAKOTA-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of DAKOTA's iterative analysis capabilities.

  11. Survey of sampling-based methods for uncertainty and sensitivity analysis.

    SciTech Connect (OSTI)

    Johnson, Jay Dean; Helton, Jon Craig; Sallaberry, Cedric J. PhD.; Storlie, Curt B.

    2006-06-01

    Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (1) Definition of probability distributions to characterize epistemic uncertainty in analysis inputs, (2) Generation of samples from uncertain analysis inputs, (3) Propagation of sampled inputs through an analysis, (4) Presentation of uncertainty analysis results, and (5) Determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top down coefficient of concordance, and variance decomposition.
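
    Steps (2) through (5) can be made concrete in a few lines. The toy below (ours) draws a Latin hypercube sample over two uncertain inputs, propagates it through an invented model, and computes the rank (Spearman) correlations on which several of the listed techniques build.

        import numpy as np

        rng = np.random.default_rng(11)
        n = 200

        def lhs(n):                              # one stratified U(0,1) draw per bin
            return (np.arange(n) + rng.uniform(size=n)) / n

        x1 = rng.permutation(lhs(n)) * 10.0      # input 1 ~ U(0, 10)
        x2 = rng.permutation(lhs(n)) * 2.0       # input 2 ~ U(0, 2)
        y  = x1 + 5.0 * x2**2 + rng.normal(0.0, 0.5, n)   # hypothetical model

        def spearman(a, b):                      # rank-transformed Pearson correlation
            ra, rb = a.argsort().argsort(), b.argsort().argsort()
            return np.corrcoef(ra, rb)[0, 1]

        print(f"rank correlation y~x1: {spearman(x1, y):.2f}")
        print(f"rank correlation y~x2: {spearman(x2, y):.2f}")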

  12. A framework for optimization and quantification of uncertainty and sensitivity for developing carbon capture systems

    SciTech Connect (OSTI)

    Eslick, John C.; Ng, Brenda; Gao, Qianwen; Tong, Charles H.; Sahinidis, Nikolaos V.; Miller, David C.

    2014-12-31

    Under the auspices of the U.S. Department of Energy's Carbon Capture Simulation Initiative (CCSI), a Framework for Optimization and Quantification of Uncertainty and Sensitivity (FOQUS) has been developed. This tool enables carbon capture systems to be rapidly synthesized and rigorously optimized, in an environment that accounts for and propagates uncertainties in parameters and models. FOQUS currently enables (1) the development of surrogate algebraic models utilizing the ALAMO algorithm, which can be used for superstructure optimization to identify optimal process configurations, (2) simulation-based optimization utilizing derivative free optimization (DFO) algorithms with detailed black-box process models, and (3) rigorous uncertainty quantification through PSUADE. FOQUS utilizes another CCSI technology, the Turbine Science Gateway, to manage the thousands of simulated runs necessary for optimization and UQ. Thus, this computational framework has been demonstrated for the design and analysis of a solid sorbent based carbon capture system.

  13. Uncertainty quantification of a radionuclide release model using an adaptive spectral technique

    SciTech Connect (OSTI)

    Gilli, L.; Hoogwerf, C.; Lathouwers, D.; Kloosterman, J. L.

    2013-07-01

    In this paper we present the application of a non-intrusive spectral technique we recently developed to the evaluation of the uncertainties associated with a radionuclide migration problem. Spectral techniques can be used to reconstruct stochastic quantities of interest by means of a Fourier-like expansion. Their application to uncertainty propagation problems is performed by evaluating a set of realizations that are chosen adaptively; this work presents the main details of how this is done. The uncertainty quantification problem we deal with was first solved in a recent work in which the authors used a spectral technique based on an intrusive approach. In this paper we reproduce the results of that reference work, compare them, and discuss the main numerical aspects.

  14. A framework for optimization and quantification of uncertainty and sensitivity for developing carbon capture systems

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Eslick, John C.; Ng, Brenda; Gao, Qianwen; Tong, Charles H.; Sahinidis, Nikolaos V.; Miller, David C.

    2014-12-31

    Under the auspices of the U.S. Department of Energy’s Carbon Capture Simulation Initiative (CCSI), a Framework for Optimization and Quantification of Uncertainty and Sensitivity (FOQUS) has been developed. This tool enables carbon capture systems to be rapidly synthesized and rigorously optimized, in an environment that accounts for and propagates uncertainties in parameters and models. FOQUS currently enables (1) the development of surrogate algebraic models utilizing the ALAMO algorithm, which can be used for superstructure optimization to identify optimal process configurations, (2) simulation-based optimization utilizing derivative free optimization (DFO) algorithms with detailed black-box process models, and (3) rigorous uncertainty quantification through PSUADE. FOQUS utilizes another CCSI technology, the Turbine Science Gateway, to manage the thousands of simulated runs necessary for optimization and UQ. Thus, this computational framework has been demonstrated for the design and analysis of a solid sorbent based carbon capture system.

  15. Uncertainty Analysis of Spectral Irradiance Reference Standards Used for NREL Calibrations

    SciTech Connect (OSTI)

    Habte, A.; Andreas, A.; Reda, I.; Campanelli, M.; Stoffel, T.

    2013-05-01

    Spectral irradiance produced by lamp standards, such as the National Institute of Standards and Technology (NIST) FEL-type tungsten halogen lamps, is used to calibrate spectroradiometers at the National Renewable Energy Laboratory (NREL). Spectroradiometers are often used to characterize the spectral irradiance of solar simulators, which in turn are used to characterize photovoltaic device performance, e.g., power output and spectral response. Therefore, quantifying the calibration uncertainty of spectroradiometers is critical to understanding photovoltaic system performance. In this study, we attempted to reproduce the NIST-reported input variables, including the calibration uncertainty in spectral irradiance for a standard NIST lamp, and to quantify the uncertainty for the measurement setup at the Optical Metrology Laboratory at NREL.

  16. Impact of Uncertainty on Loss Estimates for a Repeat of the 1908 Messina-Reggio Calabria Earthquake in Southern Italy

    SciTech Connect (OSTI)

    Franco, Guillermo; Shen-Tu, Bing Ming; Bazzurro, Paolo; Goretti, Agostino; Valensise, Gianluca

    2008-07-08

    Increasing sophistication in the insurance and reinsurance market is stimulating the move towards catastrophe models that offer a greater degree of flexibility in the definition of model parameters and model assumptions. This study explores the impact of uncertainty in the input parameters on the loss estimates by departing from the exclusive use of mean values to establish the earthquake event mechanism, the ground motion fields, or the damageability of the building stock. Here the potential losses due to a repeat of the 1908 Messina-Reggio Calabria event are calculated using different plausible alternatives found in the literature, encompassing 12 event scenarios, 2 ground motion prediction equations, and 16 combinations of damage functions for the building stock, for a total of 384 loss scenarios. These results constitute the basis for a sensitivity analysis of the different assumptions on the loss estimates, allowing the model user to estimate the impact of uncertainty in the input parameters and the potential spread of the model results. For the event under scrutiny, average losses would today amount to about 9,000 to 10,000 million euros. The uncertainty in the model parameters is reflected in the high coefficient of variation of this loss, approximately 45%. The choice of ground motion prediction equations and of vulnerability functions for the building stock contributes the most to the uncertainty in the loss estimates. This indicates that the application of non-site-specific information has a great impact on the spread of potential catastrophic losses. To close this uncertainty gap, more exhaustive documentation practices in insurance portfolios will have to go hand in hand with greater flexibility in the model input parameters.
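
    The 12 x 2 x 16 = 384-scenario sweep is schematic enough to sketch directly; in the toy below (ours, with placeholder multipliers rather than values from the study) each loss scenario is the product of an event, a ground-motion relation, and a vulnerability set, and the spread of the results gives the coefficient of variation.

        import numpy as np
        from itertools import product

        rng = np.random.default_rng(1908)
        events    = rng.uniform(0.8, 1.2, 12)    # relative source-scenario factors
        gmpes     = rng.uniform(0.7, 1.3, 2)     # ground-motion prediction equations
        vuln_sets = rng.uniform(0.6, 1.4, 16)    # damage-function combinations
        base_loss = 9500.0                       # hypothetical central loss, million EUR

        losses = np.array([base_loss * e * g * v
                           for e, g, v in product(events, gmpes, vuln_sets)])
        print(f"{losses.size} scenarios, mean = {losses.mean():,.0f} MEUR, "
              f"CV = {100 * losses.std() / losses.mean():.0f}%")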

  17. ROBUSTNESS OF DECISION INSIGHTS UNDER ALTERNATIVE ALEATORY/EPISTEMIC UNCERTAINTY CLASSIFICATIONS

    SciTech Connect (OSTI)

    Unwin, Stephen D.; Eslinger, Paul W.; Johnson, Kenneth I.

    2013-09-22

    The Risk-Informed Safety Margin Characterization (RISMC) pathway is a set of activities defined under the U.S. Department of Energy Light Water Reactor Sustainability Program. The overarching objective of RISMC is to support plant life-extension decision-making by providing a state-of-knowledge characterization of safety margins in key systems, structures, and components (SSCs). A key technical challenge is to establish the conceptual and technical feasibility of analyzing safety margin in a risk-informed way, which, unlike conventionally defined deterministic margin analysis, would be founded on probabilistic characterizations of SSC performance. Evaluation of probabilistic safety margins will in general entail the uncertainty characterization both of the prospective challenge to the performance of an SSC ("load") and of its "capacity" to withstand that challenge. The RISMC framework contrasts sharply with the traditional probabilistic risk assessment (PRA) structure in that the underlying models are not inherently aleatory. Rather, they are largely deterministic physical/engineering models with ambiguities about the appropriate uncertainty classification of many model parameters. The current analysis demonstrates that if the distinction between epistemic and aleatory uncertainties is to be preserved in a RISMC-like modeling environment, then it is unlikely that analysis insights supporting decision-making will in general be robust under recategorization of input uncertainties. If it is believed there is a true conceptual distinction between epistemic and aleatory uncertainty (as opposed to the distinction being primarily a legacy of the PRA paradigm) then a consistent and defensible basis must be established by which to categorize input uncertainties.

  18. Quantification of margins and uncertainty for risk-informed decision analysis.

    SciTech Connect (OSTI)

    Alvin, Kenneth Fredrick

    2010-09-01

    QMU stands for 'Quantification of Margins and Uncertainties'. QMU is a basic framework for consistency in integrating simulation, data, and/or subject matter expertise to provide input into a risk-informed decision-making process. QMU is being applied to a wide range of NNSA stockpile issues, from performance to safety. The implementation of QMU varies with lab and application focus. The Advanced Simulation and Computing (ASC) Program develops validated computational simulation tools to be applied in the context of QMU. QMU provides input into a risk-informed decision making process. The completeness aspect of QMU can benefit from the structured methodology and discipline of quantitative risk assessment (QRA)/probabilistic risk assessment (PRA). In characterizing uncertainties it is important to pay attention to the distinction between those arising from incomplete knowledge ('epistemic' or systematic), and those arising from device-to-device variation ('aleatory' or random). The national security labs should investigate the utility of a probability of frequency (PoF) approach in presenting uncertainties in the stockpile. A QMU methodology is connected if the interactions between failure modes are included. The design labs should continue to focus attention on quantifying uncertainties that arise from epistemic uncertainties such as poorly-modeled phenomena, numerical errors, coding errors, and systematic uncertainties in experiment. The NNSA and design labs should ensure that the certification plan for any RRW is supported by strong, timely peer review and by an ongoing, transparent QMU-based documentation and analysis in order to permit a confidence level necessary for eventual certification.
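
    The bookkeeping behind QMU is often summarized as a margin-to-uncertainty ratio M/U. A minimal sketch, with illustrative numbers and the common (but not universal) convention of combining aleatory and epistemic contributions in quadrature:

        # Sketch of basic QMU bookkeeping: margin M over uncertainty U.
        # All numbers are illustrative.
        import math

        threshold = 100.0      # minimum acceptable performance (arbitrary units)
        best_estimate = 130.0  # best-estimate performance
        margin = best_estimate - threshold

        aleatory = 8.0         # device-to-device variation
        epistemic = 12.0       # lack-of-knowledge contribution
        uncertainty = math.hypot(aleatory, epistemic)  # quadrature convention

        print(f"M = {margin:.1f}, U = {uncertainty:.1f}, "
              f"M/U = {margin / uncertainty:.2f}")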

  19. MANAGING UNCERTAINTIES ASSOCIATED WITH RADIOACTIVE WASTE DISPOSAL: TASK GROUP 4 OF THE IAEA PRISM PROJECT

    SciTech Connect (OSTI)

    Seitz, R.

    2011-03-02

    It is widely recognized that the results of safety assessment calculations provide an important contribution to the safety arguments for a disposal facility, but cannot in themselves adequately demonstrate the safety of the disposal system. The safety assessment and a broader range of arguments and activities need to be considered holistically to justify radioactive waste disposal at any particular site. Many programs are therefore moving towards the production of what has become known as a Safety Case, which includes all of the different activities that are conducted to demonstrate the safety of a disposal concept. Recognizing the growing interest in the concept of a Safety Case, the International Atomic Energy Agency (IAEA) is undertaking an intercomparison and harmonization project called PRISM (Practical Illustration and use of the Safety Case Concept in the Management of Near-surface Disposal). The PRISM project is organized into four Task Groups that address key aspects of the Safety Case concept: Task Group 1 - Understanding the Safety Case; Task Group 2 - Disposal facility design; Task Group 3 - Managing waste acceptance; and Task Group 4 - Managing uncertainty. This paper addresses the work of Task Group 4, which is investigating approaches for managing the uncertainties associated with near-surface disposal of radioactive waste and their consideration in the context of the Safety Case. Emphasis is placed on identifying a wide variety of approaches that can and have been used to manage different types of uncertainties, especially non-quantitative approaches that have not received as much attention in previous IAEA projects. This paper includes discussions of the current results of work on the task on managing uncertainty, including: the different circumstances being considered, the sources/types of uncertainties being addressed and some initial proposals for approaches that can be used to manage different types of uncertainties.

  20. Uncertainty Quantification of Composite Laminate Damage with the Generalized Information Theory

    SciTech Connect (OSTI)

    J. Lucero; F. Hemez; T. Ross; K. Kline; J. Hundhausen; T. Tippetts

    2006-05-01

    This work presents a survey of five theories to assess the uncertainty of projectile impact induced damage on multi-layered carbon-epoxy composite plates. Because the types of uncertainty dealt with in this application are multiple (variability, ambiguity, and conflict) and because the data sets collected are sparse, characterizing the amount of delamination damage with probability theory alone is possible but incomplete. This motivates the exploration of methods contained within a broad Generalized Information Theory (GIT) that rely on less restrictive assumptions than probability theory. Probability, fuzzy sets, possibility, and imprecise probability (probability boxes (p-boxes) and Dempster-Shafer) are used to assess the uncertainty in composite plate damage. Furthermore, this work highlights the usefulness of each theory. The purpose of the study is not to compare directly the different GIT methods but to show that they can be deployed on a practical application and to compare the assumptions upon which these theories are based. The data sets consist of experimental measurements and finite element predictions of the amount of delamination and fiber splitting damage as multilayered composite plates are impacted by a projectile at various velocities. The physical experiments consist of using a gas gun to impact suspended plates with a projectile accelerated to prescribed velocities and then taking ultrasound images of the resulting delamination. The nonlinear, multiple length-scale numerical simulations couple local crack propagation implemented through cohesive zone modeling to global stress-displacement finite element analysis. The assessment of damage uncertainty is performed in three steps: first, considering the test data only; then, considering the simulation data only; and finally, performing an assessment of total uncertainty in which the test and simulation data sets are combined. This study leads to practical recommendations for reducing the uncertainty and improving the prediction accuracy of the damage modeling and finite element simulation.
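
    Of the theories surveyed, Dempster-Shafer evidence theory is the easiest to illustrate compactly: interval estimates of damage carrying mass yield belief and plausibility bounds on an event. The intervals, masses, and event below are invented for illustration.

        # Sketch of a Dempster-Shafer calculation: focal elements are intervals
        # of damage area with associated masses; Bel and Pl bracket the
        # probability of an event. All values are made up.
        intervals = [((10.0, 25.0), 0.40),   # (damage-area interval, mass)
                     ((18.0, 30.0), 0.35),
                     ((5.0, 40.0), 0.25)]

        event = (15.0, 35.0)  # "damage area between 15 and 35 (arbitrary units)"

        def belief(event, focal):
            lo, hi = event  # sum masses of focal elements contained in the event
            return sum(m for (a, b), m in focal if lo <= a and b <= hi)

        def plausibility(event, focal):
            lo, hi = event  # sum masses of focal elements intersecting the event
            return sum(m for (a, b), m in focal if b >= lo and a <= hi)

        print(f"Bel = {belief(event, intervals):.2f}, "
              f"Pl = {plausibility(event, intervals):.2f}")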

  1. A computational framework for uncertainty quantification and stochastic optimization in unit commitment with wind power generation.

    SciTech Connect (OSTI)

    Constantinescu, E. M; Zavala, V. M.; Rocklin, M.; Lee, S.; Anitescu, M.

    2011-02-01

    We present a computational framework for integrating a state-of-the-art numerical weather prediction (NWP) model in stochastic unit commitment/economic dispatch formulations that account for wind power uncertainty. We first enhance the NWP model with an ensemble-based uncertainty quantification strategy implemented in a distributed-memory parallel computing architecture. We discuss computational issues arising in the implementation of the framework and validate the model using real wind-speed data obtained from a set of meteorological stations. We build a simulated power system to demonstrate the developments.

  2. Calibration and Forward Uncertainty Propagation for Large-eddy Simulations of Engineering Flows

    SciTech Connect (OSTI)

    Templeton, Jeremy Alan; Blaylock, Myra L.; Domino, Stefan P.; Hewson, John C.; Kumar, Pritvi Raj; Ling, Julia; Najm, Habib N.; Ruiz, Anthony; Safta, Cosmin; Sargsyan, Khachik; Stewart, Alessia; Wagner, Gregory

    2015-09-01

    The objective of this work is to investigate the efficacy of using calibration strategies from Uncertainty Quantification (UQ) to determine model coefficients for LES. As the target methods are for engineering LES, uncertainty from numerical aspects of the model must also be quantified. The ultimate goal of this research thread is to generate a cost-versus-accuracy curve for LES such that the cost could be minimized given an accuracy prescribed by an engineering need. Realization of this goal would enable LES to serve as a predictive simulation tool within the engineering design process.

  3. Uncertainty analyses of CO2 plume expansion subsequent to wellbore CO2 leakage into aquifers

    Office of Scientific and Technical Information (OSTI)

    In this study, we apply an uncertainty quantification (UQ) framework to CO2 sequestration problems. In one scenario, we look at the risk of wellbore leakage of CO2 into a shallow unconfined aquifer in an urban area; in ...

  4. Rapidity gap survival in central exclusive diffraction: Dynamical mechanisms and uncertainties

    SciTech Connect (OSTI)

    Strikman, Mark; Weiss, Christian

    2009-01-01

    We summarize our understanding of the dynamical mechanisms governing rapidity gap survival in central exclusive diffraction, pp -> p + H + p (H = high-mass system), and discuss the uncertainties in present estimates of the survival probability. The main suppression of diffractive scattering is due to inelastic soft spectator interactions at small pp impact parameters and can be described in a mean-field approximation (independent hard and soft interactions). Moderate extra suppression results from fluctuations of the partonic configurations of the colliding protons. At LHC energies absorptive interactions of hard spectator partons associated with the gg -> H process reach the black-disk regime and cause substantial additional suppression, pushing the survival probability below 0.01.

  5. Uncertainty Estimates for SIRS, SKYRAD, & GNDRAD Data and Reprocessing the Pyrgeometer Data (Presentation)

    SciTech Connect (OSTI)

    Reda, I.; Stoffel, T.; Habte, A.

    2014-03-01

    The National Renewable Energy Laboratory (NREL) and the Atmospheric Radiation Measurement (ARM) Climate Research Facility work together in providing data from strategically located in situ measurement observatories around the world, and in improving and developing new technologies that assist in acquiring high-quality radiometric data. In this presentation we summarize the uncertainty estimates of the ARM data collected at the ARM Solar Infrared Radiation Station (SIRS), Sky Radiometers on Stand for Downwelling Radiation (SKYRAD), and Ground Radiometers on Stand for Upwelling Radiation (GNDRAD); these estimates ultimately improve the existing radiometric data. Three studies are also included: a comparison of pyrgeometer (e.g., Eppley PIR) calibrations using the manufacturer's blackbody versus the interim World Infrared Standard Group (WISG), a pyrgeometer aging study, and the effect of sampling rate on correcting historical data.

  6. Influence of Advanced Fuel Cycles on Uncertainty in the Performance...

    Office of Scientific and Technical Information (OSTI)

    Resource Relation: Conference: Proposed for presentation at the International High-Level Radioactive Waste Management Conference held April 28 - May 2, 2013 in Albuquerque, NM...

  7. A sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory.

    SciTech Connect (OSTI)

    Johnson, J. D.; Oberkampf, William Louis; Helton, Jon Craig (Arizona State University, Tempe, AZ); Storlie, Curtis B. (North Carolina State University, Raleigh, NC)

    2006-10-01

    Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.
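
    A minimal sketch of the strategy, under toy assumptions: each focal element of the input evidence structure is sampled, its image through the model is approximated, and the belief and plausibility of an output event accumulate the focal masses whose images fall entirely or partly inside the event.

        # Sketch: sampling-based propagation of an evidence structure through a
        # model. The model, focal elements, and event are illustrative only.
        import numpy as np

        rng = np.random.default_rng(2)

        def model(x):
            return x ** 2 + 1.0

        focal = [((0.5, 1.0), 0.6), ((0.8, 2.0), 0.4)]  # (interval, mass)
        event = lambda y: y > 2.0                       # output event of interest

        bel = pl = 0.0
        for (lo, hi), mass in focal:
            y = model(rng.uniform(lo, hi, 1000))  # approximate the image by sampling
            hit = event(y)
            if hit.all():   # entire focal element maps into the event
                bel += mass
            if hit.any():   # focal element at least touches the event
                pl += mass
        print(f"Bel = {bel:.2f}, Pl = {pl:.2f}")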

  8. Analysis of Variability and Uncertainty in Wind Power Forecasting: An International Comparison (Presentation)

    SciTech Connect (OSTI)

    Zhang, J.; Hodge, B.; Miettinen, J.; Holttinen, H.; Gomez-Lozaro, E.; Cutululis, N.; Litong-Palima, M.; Sorensen, P.; Lovholm, A.; Berge, E.; Dobschinski, J.

    2013-10-01

    This presentation summarizes the work to investigate the uncertainty in wind forecasting at different times of year and compare wind forecast errors in different power systems using large-scale wind power prediction data from six countries: the United States, Finland, Spain, Denmark, Norway, and Germany.

  9. Uncertainty and Sensitivity of Alternative Rn-222 Flux Density Models Used in Performance Assessment

    SciTech Connect (OSTI)

    Greg J. Shott, Vefa Yucel, Lloyd Desotell; Non-NSTec authors: G. Pyles and Jon Carilli

    2007-06-01

    Performance assessments for the Area 5 Radioactive Waste Management Site on the Nevada Test Site have used three different mathematical models to estimate Rn-222 flux density. This study describes the performance, uncertainty, and sensitivity of the three models, which include the U.S. Nuclear Regulatory Commission Regulatory Guide 3.64 analytical method and two numerical methods. The uncertainty of each model was determined by Monte Carlo simulation using Latin hypercube sampling. The global sensitivity was investigated using the Morris one-at-a-time screening method, sample-based correlation and regression methods, the variance-based extended Fourier amplitude sensitivity test, and Sobol's sensitivity indices. The models were found to produce similar estimates of the mean and median flux density, but to have different uncertainties and sensitivities. When the Rn-222 effective diffusion coefficient was estimated using five different published predictive models, the radon flux density models were found to be most sensitive to the effective diffusion coefficient model selected, the emanation coefficient, and the radionuclide inventory. Using a site-specific measured effective diffusion coefficient significantly reduced the output uncertainty. When a site-specific effective-diffusion coefficient was used, the models were most sensitive to the emanation coefficient and the radionuclide inventory.
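
    The Monte Carlo / Latin hypercube workflow used in the study can be sketched with scipy's quasi-Monte Carlo module; the flux expression and parameter ranges below are illustrative stand-ins, not the three models compared in the report.

        # Sketch: Latin hypercube sampling of three uncertain inputs pushed
        # through a simplified radon-flux proxy.
        import numpy as np
        from scipy.stats import qmc

        sampler = qmc.LatinHypercube(d=3, seed=3)
        u = sampler.random(n=1000)

        # emanation coefficient, effective diffusion coeff (m^2/s), Ra-226 (Bq/kg)
        lo = np.array([0.1, 1e-7, 100.0])   # assumed lower bounds
        hi = np.array([0.4, 5e-6, 500.0])   # assumed upper bounds
        emanation, diff, inventory = qmc.scale(u, lo, hi).T

        # simplified surface-flux proxy: flux ~ inventory * emanation * sqrt(D)
        flux = inventory * emanation * np.sqrt(diff)
        print(f"median flux proxy: {np.median(flux):.3g}")
        print(f"5-95% range: {np.percentile(flux, 5):.3g} .. "
              f"{np.percentile(flux, 95):.3g}")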

  10. ASSESSMENT OF UNCERTAINTY IN THE RADIATION DOSES FOR THE TECHA RIVER DOSIMETRY SYSTEM

    SciTech Connect (OSTI)

    Napier, Bruce A.; Degteva, M. O.; Anspaugh, L. R.; Shagina, N. B.

    2009-10-23

    In order to provide more accurate and precise estimates of individual dose (and thus more precise estimates of radiation risk) for the members of the ETRC, a new dosimetric calculation system, the Techa River Dosimetry System-2009 (TRDS-2009), has been prepared. The deterministic version of the improved dosimetry system, TRDS-2009D, was essentially completed in April 2009. Recent developments in the evaluation of dose-response models in light of uncertain dose have highlighted the importance of different types of uncertainties in the development of individual dose estimates. These include uncertain parameters that may be either shared or unshared within the dosimetric cohort, and also the nature of the type of uncertainty as aleatory or epistemic and either classical or Berkson. This report identifies the nature of the various input parameters and calculational methods incorporated in the Techa River Dosimetry System (based on the TRDS-2009D implementation), with the intention of preparing a stochastic version to estimate the uncertainties in the dose estimates. This report reviews the equations, databases, and input parameters, and then identifies the authors' interpretations of their general nature. It presents the approach selected so that the stochastic, Monte Carlo implementation of the dosimetry system, TRDS-2009MC, will provide useful information regarding the uncertainties of the doses.

  11. Accounting for uncertainty and risk in assessments of impacts for offshore oil and gas leasing proposals

    SciTech Connect (OSTI)

    Wildermann, R.; Beittel, R.

    1993-01-01

    The Minerals Management Service (MMS) of the US Department of the Interior prepares an environmental impact statement (EIS) for each proposal to lease a portion of the Outer Continental Shelf (OCS) for oil and gas exploration and development. The nature, magnitude, and timing of the activities that would ultimately result from leasing are subject to wide speculation, primarily because of uncertainties about the locations and amounts of petroleum hydrocarbons that exist on most potential leases. These uncertainties create challenges in preparing EIS's that meet National Environmental Policy Act requirements and provide information useful to decision-makers. This paper examines the constraints that uncertainty places on the detail and reliability of assessments of impacts from potential OCS development. It further describes how the MMS accounts for uncertainty in developing reasonable scenarios of future events that can be evaluated in the EIS. A process for incorporating the risk of accidental oil spills into assessments of expected impacts is also presented. Finally, the paper demonstrates through examination of case studies how a balance can be achieved between the need for an EIS to present impacts in sufficient detail to allow a meaningful comparison of alternatives and the tendency to push the analysis beyond credible limits.

  12. Distributed Generation Investment by a Microgrid Under Uncertainty

    SciTech Connect (OSTI)

    Siddiqui, Afzal; Marnay, Chris

    2006-06-16

    This paper examines a California-based microgrid's decision to invest in a distributed generation (DG) unit that operates on natural gas. While the long-term natural gas generation cost is stochastic, we initially assume that the microgrid may purchase electricity at a fixed retail rate from its utility. Using the real options approach, we find natural gas generating cost thresholds that trigger DG investment. Furthermore, the consideration of operational flexibility by the microgrid accelerates DG investment, while the option to disconnect entirely from the utility is not attractive. By allowing the electricity price to be stochastic, we next determine an investment threshold boundary and find that high electricity price volatility relative to that of natural gas generating cost delays investment while simultaneously increasing the value of the investment. We conclude by using this result to find the implicit option value of the DG unit.

  13. Implementation of a Bayesian Engine for Uncertainty Analysis

    SciTech Connect (OSTI)

    Leng Vang; Curtis Smith; Steven Prescott

    2014-08-01

    In probabilistic risk assessment, it is important to have an environment where analysts have access to shared and secure high-performance computing and a statistical analysis tool package. As part of the advanced small modular reactor probabilistic risk analysis framework implementation, we have identified the need for advanced Bayesian computations. However, in order to make this technology available to non-specialists, there is also a need for a simplified tool that allows users to author models and evaluate them within this framework. As a proof of concept, we have implemented an advanced open-source Bayesian inference tool, OpenBUGS, within the browser-based cloud risk analysis framework that is under development at the Idaho National Laboratory. This development, the OpenBUGS Scripter, has been implemented as a client-side, visual, web-based integrated development environment for creating OpenBUGS language scripts. It depends on the shared server environment to execute the generated scripts and to transmit results back to the user. The visual models are in the form of linked diagrams, from which we automatically create the applicable OpenBUGS script that matches the diagram. These diagrams can be saved locally or stored on the server environment to be shared with other users.

  14. Hurdling barriers through market uncertainty: Case studies in innovative technology adoption

    SciTech Connect (OSTI)

    Payne, Christopher T.; Radspieler Jr., Anthony; Payne, Jack

    2002-08-18

    The crisis atmosphere surrounding electricity availability in California during the summer of 2001 produced two distinct phenomena in commercial energy consumption decision-making: desires to guarantee energy availability while blackouts were still widely anticipated, and desires to avoid or mitigate significant price increases when higher commercial electricity tariffs took effect. The climate of increased consideration of these factors seems to have led, in some cases, to greater willingness on the part of business decision-makers to consider highly innovative technologies. This paper examines three case studies of innovative technology adoption: retrofit of time-and-temperature signs on an office building; installation of fuel cells to supply power, heating, and cooling to the same building; and installation of a gas-fired heat pump at a microbrewery. We examine the decision process that led to adoption of these technologies. In each case, specific constraints had made more conventional energy-efficient technologies inapplicable. We examine how these barriers to technology adoption developed over time, how the California energy decision-making climate combined with the characteristics of these innovative technologies to overcome the barriers, and what the implications of hurdling these barriers are for future energy decisions within the firms.

  15. SU-E-T-573: The Robustness of a Combined Margin Recipe for Uncertainties During Radiotherapy

    SciTech Connect (OSTI)

    Stroom, J; Vieira, S; Greco, C [Champalimaud Foundation, Lisbon (Portugal)]

    2014-06-01

    Purpose: To investigate the variability of a safety margin recipe that combines CTV and PTV margins quadratically, with several tumor-, treatment-, and user-related factors. Methods: Margin recipes were calculated by Monte Carlo simulations in five steps. (1) A spherical tumor, with or without isotropic microscopic disease, was irradiated with a 5-field dose plan. (2) PTV: geometric uncertainties were introduced using systematic (Sgeo) and random (sgeo) standard deviations. CTV: the microscopic disease distribution was modelled by a semi-Gaussian (Smicro) with a varying number of islets (Ni). (3) For a specific uncertainty set (Sgeo, sgeo, Smicro(Ni)), margins were varied until a pre-defined decrease in TCP or dose coverage was reached. (4) First, margin recipes were calculated for each of the three uncertainties separately. CTV and PTV recipes were then combined quadratically to yield a final recipe M(Sgeo, sgeo, Smicro(Ni)). (5) The final M was verified by simultaneous simulation of the uncertainties. M has now been calculated for various changing parameters, such as margin criteria, penumbra steepness, islet radio-sensitivity, dose conformity, and number of fractions. We subsequently investigated (a) whether the combined recipe still holds in all these situations and (b) what the margin variation was in all these cases. Results: We found that the accuracy of the combined margin recipes remains on average within 1 mm for all situations, confirming the correctness of the quadratic addition. Depending on the specific parameter, margin factors could change such that margins change by over 50%. Margin recipes based on TCP criteria are especially sensitive to more parameters than those based on purely geometric Dmin criteria. Interestingly, measures taken to minimize treatment field sizes (e.g., by optimizing dose conformity) are counteracted by the requirement of larger margins to achieve the same tumor coverage. Conclusion: Margin recipes combining geometric and microscopic uncertainties quadratically are accurate under varying circumstances. However, margins can change by up to 50% in different situations.
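
    The quadratic combination at the center of this abstract is compactly expressed in code. As a stand-in for the paper's fitted recipes, the geometric part below uses the classic van Herk form M = 2.5*Sigma + 0.7*sigma; all coefficients and inputs are illustrative.

        # Sketch: combine a geometric (PTV) margin and a microscopic (CTV)
        # margin in quadrature. Numbers are illustrative.
        import math

        Sigma_geo = 2.0  # systematic geometric SD (mm)
        sigma_geo = 3.0  # random geometric SD (mm)
        m_micro = 4.0    # margin for microscopic spread from its own recipe (mm)

        m_ptv = 2.5 * Sigma_geo + 0.7 * sigma_geo  # van Herk-style stand-in
        m_combined = math.sqrt(m_ptv ** 2 + m_micro ** 2)
        print(f"PTV margin {m_ptv:.1f} mm, "
              f"combined CTV+PTV margin {m_combined:.1f} mm")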

  16. Use of Quantitative Uncertainty Analysis to Support M&V Decisions in ESPCs

    SciTech Connect (OSTI)

    Mathew, Paul A.; Koehling, Erick; Kumar, Satish

    2005-05-11

    Measurement and Verification (M&V) is a critical element of an Energy Savings Performance Contract (ESPC) - without M&V, there is no way to confirm that the projected savings in an ESPC are in fact being realized. For any given energy conservation measure in an ESPC, there are usually several M&V choices, which will vary in terms of measurement uncertainty, cost, and technical feasibility. Typically, M&V decisions are made almost solely based on engineering judgment and experience, with little, if any, quantitative uncertainty analysis (QUA). This paper describes the results of a pilot project initiated by the Department of Energy's Federal Energy Management Program to explore the use of Monte Carlo simulation to assess savings uncertainty and thereby augment the M&V decision-making process in ESPCs. The intent was to use QUA selectively in combination with heuristic knowledge, in order to obtain quantitative estimates of the savings uncertainty without the burden of a comprehensive "bottom-up" QUA. This approach was used to analyze the savings uncertainty in an ESPC for a large federal agency. The QUA was seamlessly integrated into the ESPC development process and the incremental effort was relatively small, with user-friendly tools that are commercially available. As the case study illustrates, in some cases the QUA simply confirms intuitive or qualitative information, while in other cases it provides insight that suggests revisiting the M&V plan. The case study also showed that M&V decisions should be informed by portfolio risk diversification. By providing quantitative uncertainty information, QUA can effectively augment the M&V decision-making process as well as the overall ESPC financial analysis.
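
    A Monte Carlo savings-uncertainty calculation of the kind piloted here fits in a few lines; the retrofit, input distributions, and tariff below are invented for illustration.

        # Sketch: propagate measurement uncertainty into annual savings for one
        # hypothetical lighting retrofit.
        import numpy as np

        rng = np.random.default_rng(4)
        n = 50_000

        kw_saved = rng.normal(120.0, 10.0, n)              # metered demand reduction
        hours = rng.triangular(3200, 3600, 4000, n)        # annual operating hours
        rate = 0.11                                        # $/kWh, contract value

        savings = kw_saved * hours * rate
        lo, hi = np.percentile(savings, [5, 95])
        print(f"median savings ${np.median(savings):,.0f}/yr; "
              f"90% interval ${lo:,.0f} .. ${hi:,.0f}")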

  17. Heat Capacity Uncertainty Calculation for the Eutectic Mixture of Biphenyl/Diphenyl Ether Used as Heat Transfer Fluid: Preprint

    SciTech Connect (OSTI)

    Gomez, J. C.; Glatzmaier, G. C.; Mehos, M.

    2012-09-01

    The main objective of this study was to calculate the uncertainty at 95% confidence for the experimental values of heat capacity of the eutectic mixture of biphenyl/diphenyl ether (Therminol VP-1) determined from 300 to 370 degrees C. Twenty-five samples were evaluated using differential scanning calorimetry (DSC) to obtain the sample heat flow as a function of temperature. The ASTM E-1269-05 standard was used to determine the heat capacity using DSC evaluations. High-pressure crucibles were employed to contain the sample in the liquid state without vaporizing. Sample handling has a significant impact on the random uncertainty. It was determined that the fluid is difficult to handle, and a high variability of the data was produced. The heat capacity of Therminol VP-1 between 300 and 370 degrees C was measured to be cp = 0.0025*T + 0.8672 J/(g*K), with T (temperature) in kelvin, with an uncertainty of +/- 0.074 J/(g*K) (3.09%) at 95% confidence.
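
    The reported fit can be coded directly; the function name and the example temperature below are ours, and the +/- 0.074 J/(g*K) band at 95% confidence is applied uniformly, as stated in the abstract.

        # The reported linear heat-capacity fit for Therminol VP-1.
        def cp_therminol_vp1(t_kelvin):
            """Heat capacity in J/(g*K), valid over ~573-643 K (300-370 C)."""
            return 0.0025 * t_kelvin + 0.8672

        T = 600.0  # K, i.e. about 327 degrees C
        print(f"cp({T:.0f} K) = {cp_therminol_vp1(T):.3f} +/- 0.074 J/(g*K)")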

  18. Comparison of Homogeneous and Heterogeneous CFD Fuel Models for Phase I of the IAEA CRP on HTR Uncertainties Benchmark

    SciTech Connect (OSTI)

    Gerhard Strydom; Su-Jong Yoon

    2014-04-01

    Computational Fluid Dynamics (CFD) evaluation of homogeneous and heterogeneous fuel models was performed as part of the Phase I calculations of the International Atomic Energy Agency (IAEA) Coordinated Research Project (CRP) on High Temperature Reactor (HTR) Uncertainties in Modeling (UAM). This study focused on the nominal localized stand-alone fuel thermal response, as defined in Ex. I-3 and I-4 of the HTR UAM. The aim of the stand-alone thermal unit-cell simulation is to isolate the effect of material and boundary input uncertainties on a very simplified problem, before propagation of these uncertainties is performed in subsequent coupled neutronics/thermal fluids phases of the benchmark. In many previous studies of high-temperature gas-cooled reactors, the volume-averaged homogeneous mixture model of a single fuel compact has been applied. In the homogeneous model, the Tristructural Isotropic (TRISO) fuel particles in the fuel compact are not modeled directly, and an effective thermal conductivity is employed for the thermo-physical properties of the fuel compact. In contrast, in the heterogeneous model, the uranium carbide (UCO), inner and outer pyrolytic carbon (IPyC/OPyC), and silicon carbide (SiC) layers of the TRISO fuel particles are explicitly modeled. The fuel compact is modeled as a heterogeneous mixture of TRISO fuel kernels embedded in H-451 matrix graphite. In this study, steady-state and transient CFD simulations were performed with both the homogeneous and heterogeneous models to compare their thermal characteristics. The nominal values of the input parameters are used for this CFD analysis. In a future study, the effects of input uncertainties in the material properties and boundary parameters will be investigated and reported.

  19. Uncertainty Analysis

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

  1. Uncertainty Quantification

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

  2. Life-cycle assessment of municipal solid waste management alternatives with consideration of uncertainty: SIWMS development and application

    SciTech Connect (OSTI)

    El Hanandeh, Ali; El-Zein, Abbas

    2010-05-15

    This paper describes the development and application of the Stochastic Integrated Waste Management Simulator (SIWMS) model. SIWMS provides a detailed view of the environmental impacts and associated costs of municipal solid waste (MSW) management alternatives under conditions of uncertainty. The model follows a life-cycle inventory approach extended with compensatory systems to provide more equitable bases for comparing different alternatives. Economic performance is measured by the net present value. The model is verified against four publicly available models under deterministic conditions and then used to study the impact of uncertainty on Sydney's MSW management 'best practices'. Uncertainty has a significant effect on all impact categories. The greatest effect is observed in the global warming category where a reversal of impact direction is predicted. The reliability of the system is most sensitive to uncertainties in the waste processing and disposal. The results highlight the importance of incorporating uncertainty at all stages to better understand the behaviour of the MSW system.

  3. Systematic uncertainties associated with the cosmological analysis of the first Pan-STARRS1 type Ia supernova sample

    SciTech Connect (OSTI)

    Scolnic, D.; Riess, A.; Brout, D.; Rodney, S. [Department of Physics and Astronomy, Johns Hopkins University, 3400 North Charles Street, Baltimore, MD 21218 (United States); Rest, A. [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States); Huber, M. E.; Tonry, J. L. [Institute for Astronomy, University of Hawaii, 2680 Woodlawn Drive, Honolulu, HI 96822 (United States); Foley, R. J.; Chornock, R.; Berger, E.; Soderberg, A. M.; Stubbs, C. W.; Kirshner, R. P.; Challis, P.; Czekala, I.; Drout, M. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Narayan, G. [Department of Physics, Harvard University, 17 Oxford Street, Cambridge, MA 02138 (United States); Smartt, S. J.; Botticella, M. T. [Astrophysics Research Centre, School of Mathematics and Physics, Queen's University Belfast, Belfast BT7 1NN (United Kingdom); Schlafly, E. [Max Planck Institute for Astronomy, Königstuhl 17, D-69117 Heidelberg (Germany); and others

    2014-11-01

    We probe the systematic uncertainties from the 113 Type Ia supernovae (SN Ia) in the Pan-STARRS1 (PS1) sample along with 197 SN Ia from a combination of low-redshift surveys. The companion paper by Rest et al. describes the photometric measurements and cosmological inferences from the PS1 sample. The largest systematic uncertainty stems from the photometric calibration of the PS1 and low-z samples. We increase the sample of observed Calspec standards used to define the PS1 calibration system from 7 to 10. The PS1 and SDSS-II calibration systems are compared and discrepancies up to ~0.02 mag are recovered. We find that uncertainties in the proper way to treat intrinsic colors and reddening produce differences in the recovered value of w of up to 3%. We estimate masses of host galaxies of PS1 supernovae and detect an insignificant difference in distance residuals of the full sample of 0.037 +/- 0.031 mag for host galaxies with high and low masses. Assuming flatness and including systematic uncertainties in our analysis of the SNe measurements alone, we find w = -1.120 +0.360/-0.206 (stat) +0.269/-0.291 (sys). With additional constraints from baryon acoustic oscillation, cosmic microwave background (CMB; Planck), and H_0 measurements, we find w = -1.166 +0.072/-0.069 and Omega_m = 0.280 +0.013/-0.012 (statistical and systematic errors added in quadrature). The significance of the inconsistency with w = -1 depends on whether we use Planck or Wilkinson Microwave Anisotropy Probe measurements of the CMB: w(BAO+H0+SN+WMAP) = -1.124 +0.083/-0.065.
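
    The "added in quadrature" statement can be made concrete; treating the upper and lower sides of the asymmetric error bars separately, as below, is a common approximation used here only for illustration.

        # Sketch: combine asymmetric statistical and systematic errors on w in
        # quadrature, side by side.
        import math

        w = -1.120
        stat_up, stat_dn = 0.360, 0.206
        sys_up, sys_dn = 0.269, 0.291

        up = math.hypot(stat_up, sys_up)
        dn = math.hypot(stat_dn, sys_dn)
        print(f"w = {w:.3f} +{up:.3f} / -{dn:.3f} (stat and sys in quadrature)")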

  4. Global Sampling for Integrating Physics-Specific Subsystems and Quantifying Uncertainties of CO2 Geological Sequestration

    SciTech Connect (OSTI)

    Sun, Y.; Tong, C.; Trainor-Guitten, W. J.; Lu, C.; Mansoor, K.; Carroll, S. A.

    2012-12-20

    The risk of CO2 leakage from a deep storage reservoir into a shallow aquifer through a fault is assessed and studied using physics-specific computer models. The hypothetical CO2 geological sequestration system is composed of three subsystems: a deep storage reservoir, a fault in caprock, and a shallow aquifer, which are modeled respectively by considering sub-domain-specific physics. Supercritical CO2 is injected into the reservoir subsystem with uncertain permeabilities of reservoir, caprock, and aquifer, uncertain fault location, and injection rate (as a decision variable). The simulated pressure and CO2/brine saturation are connected to the fault-leakage model as a boundary condition. CO2 and brine fluxes from the fault-leakage model at the fault outlet are then imposed in the aquifer model as a source term. Moreover, uncertainties are propagated from the deep reservoir model, to the fault-leakage model, and eventually to the geochemical model in the shallow aquifer, thus contributing to risk profiles. To quantify the uncertainties and assess leakage-relevant risk, we propose a global sampling-based method to allocate sub-dimensions of uncertain parameters to sub-models. The risk profiles are defined and related to CO2 plume development for pH value and total dissolved solids (TDS) below the EPA's Maximum Contaminant Levels (MCL) for drinking water quality. A global sensitivity analysis is conducted to select the most sensitive parameters to the risk profiles. The resulting uncertainty of pH- and TDS-defined aquifer volume, which is impacted by CO2 and brine leakage, mainly results from the uncertainty of fault permeability. Subsequently, high-resolution, reduced-order models of risk profiles are developed as functions of all the decision variables and uncertain parameters in all three subsystems.

  5. Reducing Uncertainty in the Seismic Design Basis for the Waste Treatment Plant, Hanford, Washington

    SciTech Connect (OSTI)

    Brouns, Thomas M.; Rohay, Alan C.; Reidel, Steve; Gardner, Martin G.

    2007-02-27

    The seismic design basis for the Waste Treatment Plant (WTP) at the Department of Energy's (DOE) Hanford Site near Richland was re-evaluated in 2005, resulting in an increase by up to 40% in the seismic design basis. The original seismic design basis for the WTP was established in 1999 based on a probabilistic seismic hazard analysis completed in 1996. The 2005 analysis was performed to address questions raised by the Defense Nuclear Facilities Safety Board (DNFSB) about the assumptions used in developing the original seismic criteria and the adequacy of the site geotechnical surveys. The updated seismic response analysis used existing and newly acquired seismic velocity data, statistical analysis, expert elicitation, and ground motion simulation to develop interim design ground motion response spectra which enveloped the remaining uncertainties. The uncertainties in these response spectra were enveloped at approximately the 84th percentile to produce conservative design spectra, which contributed significantly to the increase in the seismic design basis.
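
    Enveloping simulated response spectra at approximately the 84th percentile, as described above, reduces to a percentile computation across an ensemble; the spectra below are synthetic placeholders.

        # Sketch: 84th-percentile envelope over an ensemble of synthetic
        # response spectra with lognormal variability about a median shape.
        import numpy as np

        rng = np.random.default_rng(5)
        periods = np.linspace(0.1, 2.0, 20)       # s
        median_sa = 0.8 * np.exp(-periods)        # median spectral accel. (g), assumed

        ln_sigma = 0.35                           # assumed lognormal variability
        spectra = median_sa * rng.lognormal(0.0, ln_sigma, (500, periods.size))

        design_spectrum = np.percentile(spectra, 84, axis=0)
        print(np.round(design_spectrum[:5], 3))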

  6. Advanced computational tools for optimization and uncertainty quantification of carbon capture processes

    SciTech Connect (OSTI)

    Miller, David C.; Ng, Brenda; Eslick, John

    2014-01-01

    Advanced multi-scale modeling and simulation has the potential to dramatically reduce development time, resulting in considerable cost savings. The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and universities that is developing, demonstrating, and deploying a suite of multi-scale modeling and simulation tools. One significant computational tool is FOQUS, a Framework for Optimization and Quantification of Uncertainty and Sensitivity, which enables basic data submodels, including thermodynamics and kinetics, to be used within detailed process models to rapidly synthesize and optimize a process and determine the level of uncertainty associated with the resulting process. The overall approach of CCSI is described with a more detailed discussion of FOQUS and its application to carbon capture systems.

  7. Optimal investment and scheduling of distributed energy resources with uncertainty in electric vehicles driving schedules

    SciTech Connect (OSTI)

    Cardoso, Goncalo; Stadler, Michael; Bozchalui, Mohammed C.; Sharma, Ratnesh; Marnay, Chris; Barbosa-Povoa, Ana; Ferrao, Paulo

    2013-12-06

    The large scale penetration of electric vehicles (EVs) will introduce technical challenges to the distribution grid, but also carries the potential for vehicle-to-grid services. Namely, if available in large enough numbers, EVs can be used as a distributed energy resource (DER) and their presence can influence optimal DER investment and scheduling decisions in microgrids. In this work, a novel EV fleet aggregator model is introduced in a stochastic formulation of DER-CAM [1], an optimization tool used to address DER investment and scheduling problems. This is used to assess the impact of EV interconnections on optimal DER solutions considering uncertainty in EV driving schedules. Optimization results indicate that EVs can have a significant impact on DER investments, particularly if considering short payback periods. Furthermore, results suggest that uncertainty in driving schedules carries little significance to total energy costs, which is corroborated by results obtained using the stochastic formulation of the problem.

  8. Integration of Wind Generation and Load Forecast Uncertainties into Power Grid Operations

    SciTech Connect (OSTI)

    Makarov, Yuri V.; Etingov, Pavel V.; Huang, Zhenyu; Ma, Jian; Chakrabarti, Bhujanga B.; Subbarao, Krishnappa; Loutan, Clyde; Guttromson, Ross T.

    2010-04-20

    In this paper, a new approach to evaluate the uncertainty ranges for the required generation performance envelope, including the balancing capacity, ramping capability and ramp duration is presented. The approach includes three stages: statistical and actual data acquisition, statistical analysis of retrospective information, and prediction of future grid balancing requirements for specified time horizons and confidence intervals. Assessment of the capacity and ramping requirements is performed using a specially developed probabilistic algorithm based on a histogram analysis incorporating all sources of uncertainty and parameters of a continuous (wind forecast and load forecast errors) and discrete (forced generator outages and failures to start up) nature. Preliminary simulations using California Independent System Operator (CAISO) real life data have shown the effectiveness and efficiency of the proposed approach.
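
    The histogram-based assessment can be sketched by forming a net imbalance distribution from wind and load forecast-error samples and reading capacity requirements off its percentiles at a chosen confidence level; error magnitudes below are illustrative.

        # Sketch: percentile-based balancing capacity requirements from
        # forecast-error samples.
        import numpy as np

        rng = np.random.default_rng(6)
        n = 100_000
        wind_err = rng.normal(0.0, 300.0, n)   # MW, wind forecast error (assumed)
        load_err = rng.normal(0.0, 150.0, n)   # MW, load forecast error (assumed)

        imbalance = load_err - wind_err        # net deviation generation must cover
        conf = 0.95
        up_req = np.percentile(imbalance, 100 * conf)
        dn_req = np.percentile(imbalance, 100 * (1 - conf))
        print(f"balancing-up capacity:   {up_req:7.1f} MW at {conf:.0%}")
        print(f"balancing-down capacity: {dn_req:7.1f} MW at {conf:.0%}")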

  9. Uncertainty Quantification Techniques for Sensor Calibration Monitoring in Nuclear Power Plants

    SciTech Connect (OSTI)

    Ramuhalli, Pradeep; Lin, Guang; Crawford, Susan L.; Konomi, Bledar A.; Braatz, Brett G.; Coble, Jamie B.; Shumaker, Brent; Hashemian, Hash

    2013-09-01

    This report describes the status of ongoing research towards the development of advanced algorithms for online calibration monitoring. The objective of this research is to develop the next generation of online monitoring technologies for sensor calibration interval extension and signal validation in operating and new reactors. These advances are expected to improve the safety and reliability of current and planned nuclear power systems as a result of higher accuracies and increased reliability of sensors used to monitor key parameters. The focus of this report is on documenting the outcomes of the first phase of R&D under this project, which addressed approaches to uncertainty quantification (UQ) in online monitoring that are data-driven, and can therefore adjust estimates of uncertainty as measurement conditions change. Such data-driven approaches to UQ are necessary to address changing plant conditions, for example, as nuclear power plants experience transients, or as next-generation small modular reactors (SMR) operate in load-following conditions.

  10. Uncertainty Quantification Techniques for Sensor Calibration Monitoring in Nuclear Power Plants

    SciTech Connect (OSTI)

    Ramuhalli, Pradeep; Lin, Guang; Crawford, Susan L.; Konomi, Bledar A.; Coble, Jamie B.; Shumaker, Brent; Hashemian, Hash

    2014-04-30

    This report describes research towards the development of advanced algorithms for online calibration monitoring. The objective of this research is to develop the next generation of online monitoring technologies for sensor calibration interval extension and signal validation in operating and new reactors. These advances are expected to improve the safety and reliability of current and planned nuclear power systems as a result of higher accuracies and increased reliability of sensors used to monitor key parameters. The focus of this report is on documenting the outcomes of the first phase of R&D under this project, which addressed approaches to uncertainty quantification (UQ) in online monitoring that are data-driven, and can therefore adjust estimates of uncertainty as measurement conditions change. Such data-driven approaches to UQ are necessary to address changing plant conditions, for example, as nuclear power plants experience transients, or as next-generation small modular reactors (SMR) operate in load-following conditions.

  11. Unit commitment with wind power generation: integrating wind forecast uncertainty and stochastic programming.

    SciTech Connect (OSTI)

    Constantinescu, E. M.; Zavala, V. M.; Rocklin, M.; Lee, S.; Anitescu, M.

    2009-10-09

    We present a computational framework for integrating the state-of-the-art Weather Research and Forecasting (WRF) model in stochastic unit commitment/energy dispatch formulations that account for wind power uncertainty. We first enhance the WRF model with adjoint sensitivity analysis capabilities and a sampling technique implemented in a distributed-memory parallel computing architecture. We use these capabilities through an ensemble approach to model the uncertainty of the forecast errors. The wind power realizations are exploited through a closed-loop stochastic unit commitment/energy dispatch formulation. We discuss computational issues arising in the implementation of the framework. In addition, we validate the framework using real wind speed data obtained from a set of meteorological stations. We also build a simulated power system to demonstrate the developments.

  12. Assessing Fatigue and Ultimate Load Uncertainty in Floating Offshore Wind Turbines Due to Varying Simulation Length

    SciTech Connect (OSTI)

    Stewart, G.; Lackner, M.; Haid, L.; Matha, D.; Jonkman, J.; Robertson, A.

    2013-07-01

    With the push towards siting wind turbines farther offshore due to higher wind quality and less visibility, floating offshore wind turbines, which can be located in deep water, are becoming an economically attractive option. The International Electrotechnical Commission's (IEC) 61400-3 design standard covers fixed-bottom offshore wind turbines, but there are a number of new research questions that need to be answered to modify these standards so that they are applicable to floating wind turbines. One issue is the appropriate simulation length needed for floating turbines. This paper will discuss the results from a study assessing the impact of simulation length on the ultimate and fatigue loads of the structure, and will address uncertainties associated with changing the simulation length for the analyzed floating platform. Recommendations of required simulation length based on load uncertainty will be made and compared to current simulation length requirements.

  13. Idealization, uncertainty and heterogeneity : game frameworks defined with formal concept analysis.

    SciTech Connect (OSTI)

    Racovitan, M. T.; Sallach, D. L.; Decision and Information Sciences; Northern Illinois Univ.

    2006-01-01

    The present study begins with Formal Concept Analysis, and undertakes to demonstrate how a succession of game frameworks may, by design, address increasingly complex and interesting social phenomena. We develop a series of multi-agent exchange games, each of which incorporates an additional dimension of complexity. All games are based on coalition patterns in exchanges where diverse cultural markers provide a basis for trust and reciprocity. The first game is characterized by an idealized concept of trust. A second game framework introduces uncertainty regarding the reciprocity of prospective transactions. A third game framework retains idealized trust and uncertainty, and adds additional agent heterogeneity. Cultural markers are not equally salient in conferring or withholding trust, and the result is a richer transactional process.

  14. Considerations for sensitivity analysis, uncertainty quantification, and data assimilation for grid-to-rod fretting

    SciTech Connect (OSTI)

    Michael Pernice

    2012-10-01

    Grid-to-rod fretting is the leading cause of fuel failures in pressurized water reactors, and is one of the challenge problems being addressed by the Consortium for Advanced Simulation of Light Water Reactors to guide its efforts to develop a virtual reactor environment. Prior and current efforts in modeling and simulation of grid-to-rod fretting are discussed. Sources of uncertainty in grid-to-rod fretting are also described.

  15. SWEPP PAN assay system uncertainty analysis: Passive mode measurements of graphite waste

    SciTech Connect (OSTI)

    Blackwood, L.G.; Harker, Y.D.; Meachum, T.R.; Yoon, Woo Y.

    1997-07-01

    The Idaho National Engineering and Environmental Laboratory is being used as a temporary storage facility for transuranic waste generated by the U.S. Nuclear Weapons program at the Rocky Flats Plant (RFP) in Golden, Colorado. Currently, there is a large effort in progress to prepare to ship this waste to the Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico. In order to meet the TRU Waste Characterization Quality Assurance Program Plan nondestructive assay compliance requirements and quality assurance objectives, it is necessary to determine the total uncertainty of the radioassay results produced by the Stored Waste Examination Pilot Plant (SWEPP) Passive Active Neutron (PAN) radioassay system. To this end a modified statistical sampling and verification approach has been developed to determine the total uncertainty of a PAN measurement. In this approach the total performance of the PAN nondestructive assay system is simulated using computer models of the assay system and the resultant output is compared with the known input to assess the total uncertainty. This paper is one of a series of reports quantifying the results of the uncertainty analysis of the PAN system measurements for specific waste types and measurement modes. In particular this report covers passive mode measurements of weapons grade plutonium-contaminated graphite molds contained in 208 liter drums (waste code 300). The validity of the simulation approach is verified by comparing simulated output against results from measurements using known plutonium sources and a surrogate graphite waste form drum. For actual graphite waste form conditions, a set of 50 cases covering a statistical sampling of the conditions exhibited in graphite wastes was compiled using a Latin hypercube statistical sampling approach.

  16. Optimizing your options: Extracting the full economic value of transmission when planning under uncertainty

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Munoz, Francisco D.; Watson, Jean -Paul; Hobbs, Benjamin F.

    2015-06-04

    The anticipated magnitude of needed investments in new transmission infrastructure in the U.S. requires that these investments be allocated in a way that maximizes the likelihood of achieving society's goals for power system operation. The use of state-of-the-art optimization tools can identify cost-effective investment alternatives, extract more benefits out of transmission expansion portfolios, and account for the huge economic, technology, and policy uncertainties that the power sector faces over the next several decades.

  17. Cogeneration: A northwest medical facility's answer to the uncertainties of deregulation

    SciTech Connect (OSTI)

    Almeda, R.; Rivers, J.

    1998-10-01

    Not so long ago, in the good old days, the energy supply to a health care facility was one of the most stable. The local utility provided what was needed at a reasonable cost. Now the energy industry is being deregulated, and major uncertainties exist in all parts of the industry. Since reasonably priced and readily available energy is mandatory for health care facility operation, these uncertainties reverberate through the health care industry. This article reviews how the uncertainty of electric utility deregulation was converted into an opportunity to implement the ultimate energy conservation project: cogeneration. Project development was made essentially risk-free by tailoring it to deregulation. Costs and financial exposure were minimized by taking numerous small steps in sequence. Valley Medical Center, by persevering with the development of a cogeneration plant, has been able to reduce its energy costs and, more importantly, stabilize its energy supply and costs for many years to come. This article reviews activities in two arenas, internal project development and external energy industry developments, periodically updating each and showing how external developments affected the project.

  18. A transform of complementary aspects with applications to entropic uncertainty relations

    SciTech Connect (OSTI)

    Mandayam, Prabha; Wehner, Stephanie; Balachandran, Niranjan

    2010-08-15

    Even though mutually unbiased bases and entropic uncertainty relations play an important role in quantum cryptographic protocols, they remain ill understood. Here, we construct special sets of up to 2n+1 mutually unbiased bases (MUBs) in dimension d = 2^n, which have particularly beautiful symmetry properties derived from the Clifford algebra. More precisely, we show that there exists a unitary transformation that cyclically permutes such bases. This unitary can be understood as a generalization of the Fourier transform, which exchanges two MUBs, to multiple complementary aspects. We proceed to prove a lower bound for min-entropic uncertainty relations for any set of MUBs and show that symmetry plays a central role in obtaining tight bounds. For example, we obtain for the first time a tight bound for four MUBs in dimension d = 4, which is attained by an eigenstate of our complementarity transform. Finally, we discuss the relation to other symmetries obtained by transformations in discrete phase space and note that the extrema of discrete Wigner functions are directly related to min-entropic uncertainty relations for MUBs.
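
    The smallest instance is easy to verify numerically: for one qubit (n = 1, d = 2) the 2n+1 = 3 eigenbases of the Pauli Z, X, and Y operators are mutually unbiased, i.e. |<e_i|f_j>|^2 = 1/d for vectors drawn from distinct bases.

        # Check mutual unbiasedness of the three single-qubit bases; columns of
        # each matrix are the basis vectors.
        import itertools

        import numpy as np

        s = 1 / np.sqrt(2)
        bases = {
            "Z": np.array([[1, 0], [0, 1]], dtype=complex),
            "X": np.array([[s, s], [s, -s]], dtype=complex),
            "Y": np.array([[s, s], [1j * s, -1j * s]]),
        }
        for (na, A), (nb, B) in itertools.combinations(bases.items(), 2):
            overlaps = np.abs(A.conj().T @ B) ** 2   # matrix of |<a_i|b_j>|^2
            print(f"{na} vs {nb}: mutually unbiased = {np.allclose(overlaps, 0.5)}")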

  19. Uncertainty quantification in application of the enrichment meter principle for nondestructive assay of special nuclear material

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Burr, Tom; Croft, Stephen; Jarman, Kenneth D.

    2015-09-05

    The various methods of nondestructive assay (NDA) of special nuclear material (SNM) have applications in nuclear nonproliferation, including detection and identification of illicit SNM at border crossings, and quantifying SNM at nuclear facilities for safeguards. No assay method is complete without “error bars,” which provide one way of expressing confidence in the assay result. Consequently, NDA specialists typically quantify total uncertainty in terms of “random” and “systematic” components, and then specify error bars for the total mass estimate in multiple items. Uncertainty quantification (UQ) for NDA has always been important, but it is recognized that greater rigor is needed and achievable using modern statistical methods. To this end, we describe the extent to which the guideline for expressing uncertainty in measurements (GUM) can be used for NDA. Also, we propose improvements over GUM for NDA by illustrating UQ challenges that it does not address, including calibration with errors in predictors, model error, and item-specific biases. A case study is presented using low-resolution NaI spectra and applying the enrichment meter principle to estimate the U-235 mass in an item. The case study illustrates how to update the current American Society for Testing and Materials guide for application of the enrichment meter principle using gamma spectra from a NaI detector.
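
    As a toy illustration of the random/systematic split mentioned above, the sketch below combines the two components GUM-style for a total over several items; the masses and uncertainty values are hypothetical, not taken from the paper.

        import numpy as np

        # Hypothetical per-item U-235 mass estimates (kg); illustrative only.
        masses = np.array([0.98, 1.02, 1.01, 0.97, 1.00])
        u_rand = 0.02  # relative random (repeatability) uncertainty per item
        u_sys = 0.01   # relative systematic uncertainty, shared by all items

        total = masses.sum()
        # Independent random errors: absolute variances add across items.
        var_rand = np.sum((u_rand * masses) ** 2)
        # A shared systematic error shifts every item together, so it scales
        # with the total rather than averaging down.
        var_sys = (u_sys * total) ** 2
        u_total = np.sqrt(var_rand + var_sys)
        print(f"total = {total:.3f} kg +/- {u_total:.3f} kg (1 sigma)")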

  20. Empirical and physics based mathematical models of uranium hydride decomposition kinetics with quantified uncertainties.

    SciTech Connect (OSTI)

    Salloum, Maher N.; Gharagozloo, Patricia E.

    2013-10-01

    Metal particle beds have recently become a major technique for hydrogen storage. In order to extract hydrogen from such beds, it is crucial to understand the decomposition kinetics of the metal hydride. We are interested in obtaining a better understanding of the uranium hydride (UH3) decomposition kinetics. We first developed an empirical model by fitting data compiled from different experimental studies in the literature and quantified the uncertainty resulting from the scattered data. We found that the decomposition time range predicted by the obtained kinetics was in good agreement with published experimental results. Second, we developed a physics-based mathematical model to simulate the rate of hydrogen diffusion in a hydride particle during the decomposition. We used this model to simulate the decomposition of the particles for temperatures ranging from 300 K to 1000 K while propagating parametric uncertainty, and evaluated the kinetics from the results. We compared the kinetics parameters derived from the empirical and physics-based models and found that the uncertainty in the kinetics predicted by the physics-based model covers the scattered experimental data. Finally, we used the physics-based kinetics parameters to simulate the effects of boundary resistances and powder morphological changes during decomposition in a continuum-level model. We found that the species change within the bed occurring during the decomposition accelerates the hydrogen flow by increasing the bed permeability, while the pressure buildup and the thermal barrier forming at the wall significantly impede the hydrogen extraction.
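
    As a hedged illustration of the empirical-model step, the sketch below fits a simple Arrhenius form to synthetic "literature" data and reads parameter uncertainties off the fit covariance; the functional form and all numbers are assumptions, not the authors' model.

        import numpy as np
        from scipy.optimize import curve_fit

        R = 8.314  # gas constant, J/(mol K)

        # Fit ln k = ln A - Ea / (R T) to scattered rate data.
        def model(T, lnA, Ea):
            return lnA - Ea / (R * T)

        T = np.array([450.0, 500.0, 550.0, 600.0, 650.0])  # K
        lnk = np.array([-9.1, -7.4, -6.2, -5.1, -4.3])     # synthetic data

        popt, pcov = curve_fit(model, T, lnk, p0=(0.0, 5e4))
        perr = np.sqrt(np.diag(pcov))  # 1-sigma parameter uncertainties
        print(f"ln A = {popt[0]:.2f} +/- {perr[0]:.2f}")
        print(f"Ea   = {popt[1] / 1e3:.1f} +/- {perr[1] / 1e3:.1f} kJ/mol")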

  1. A flexible uncertainty quantification method for linearly coupled multi-physics systems

    SciTech Connect (OSTI)

    Chen, Xiao; Ng, Brenda; Sun, Yunwei; Tong, Charles

    2013-09-01

    Highlights: We propose a modular hybrid UQ methodology suitable for independent development of module-based multi-physics simulations. Our algorithmic framework allows each module to have its own UQ method, either intrusive or non-intrusive. Information from each module is combined systematically to propagate global uncertainty. The proposed approach allows new methods to be swapped in for any module without the need to address incompatibilities. We demonstrate the framework on a practical application involving a multi-species reactive transport model. -- Abstract: This paper presents a novel approach to building an integrated uncertainty quantification (UQ) methodology suitable for the modern component-based approach to multi-physics simulation development. Our hybrid UQ methodology supports independent development of the most suitable UQ method, intrusive or non-intrusive, for each physics module by providing an algorithmic framework to couple these stochastic modules for propagating global uncertainties. We address algorithmic and computational issues associated with the construction of this hybrid framework. We demonstrate the utility of such a framework on a practical application involving a linearly coupled multi-species reactive transport model.
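
    A deliberately small sketch of non-intrusive module coupling by sampling: each toy module owns one uncertain parameter, and realizations are passed along a one-way coupling. The modules below are hypothetical stand-ins; the paper's framework also supports intrusive (stochastic-expansion) modules, which this sketch does not show.

        import numpy as np

        rng = np.random.default_rng(0)

        # Two black-box "physics modules", each owning one uncertain parameter.
        def module_a(k):          # stand-in for an expensive solver
            return 1.0 + 0.5 * k

        def module_b(a_out, d):   # consumes module A's output
            return a_out * np.exp(-d)

        # Non-intrusive propagation: sample each module's local uncertainty
        # and pass the realizations through the coupling.
        k = rng.normal(1.0, 0.1, 10_000)
        d = rng.normal(0.5, 0.05, 10_000)
        qoi = module_b(module_a(k), d)
        print(f"QoI mean = {qoi.mean():.3f}, std = {qoi.std():.3f}")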

  2. Uncertainty Studies of Real Anode Surface Area in Computational Analysis for Molten Salt Electrorefining

    SciTech Connect (OSTI)

    Sungyeol Choi; Jaeyeong Park; Robert O. Hoover; Supathorn Phongikaroon; Michael F. Simpson; Kwang-Rag Kim; Il Soon Hwang

    2011-09-01

    This study examines how much the cell potential changes under five different assumptions for the real anode surface area. Determining the real anode surface area is a significant issue that must be resolved for precise modeling of molten salt electrorefining. Based on a three-dimensional electrorefining model, calculated cell potentials are compared with the experimental cell potential variation over 80 hours of operation of the Mark-IV electrorefiner with driver fuel from the Experimental Breeder Reactor II. We achieved good agreement with the overall trend of the experimental data through appropriate selection of a model for the real anode surface area, but there are still local inconsistencies between theoretical calculation and experimental observation. In addition, the results were validated and compared with two-dimensional results to identify possible uncertainty factors that had to be further considered in a computational electrorefining analysis. These uncertainty factors include material properties, heterogeneous material distribution, surface roughness, and current efficiency. Zirconium's abundance and complex behavior have more impact on uncertainty towards the latter period of electrorefining for a given batch of fuel. The benchmark results indicate that anode materials would dissolve in both the axial and radial directions, at least for low burn-up metallic fuels, after the active liquid sodium bonding was dissolved.

  3. Analysis of Variability and Uncertainty in Wind Power Forecasting: An International Comparison: Preprint

    SciTech Connect (OSTI)

    Zhang, J.; Hodge, B. M.; Gomez-Lazaro, E.; Lovholm, A. L.; Berge, E.; Miettinen, J.; Holttinen, H.; Cutululis, N.; Litong-Palima, M.; Sorensen, P.; Dobschinski, J.

    2013-10-01

    One of the critical challenges of wind power integration is the variable and uncertain nature of the resource. This paper investigates the variability and uncertainty in wind forecasting for multiple power systems in six countries. An extensive comparison of wind forecasting is performed among the six power systems by analyzing the following scenarios: (i) wind forecast errors throughout a year; (ii) forecast errors at a specific time of day throughout a year; (iii) forecast errors at peak and off-peak hours of a day; (iv) forecast errors in different seasons; (v) extreme forecasts with large overforecast or underforecast errors; and (vi) forecast errors when wind power generation is at different percentages of the total wind capacity. The kernel density estimation method is adopted to characterize the distribution of forecast errors. The results show that the level of uncertainty and the forecast error distribution vary among different power systems and scenarios. In addition, for most power systems, (i) there is a tendency to underforecast in winter; and (ii) the forecasts in winter generally have more uncertainty than the forecasts in summer.
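
    A minimal sketch of the kernel density estimation step used above to characterize forecast-error distributions; the synthetic Laplace-distributed errors stand in for a real year of forecast data.

        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(1)
        # Stand-in for a year of hourly forecast errors, normalized by capacity.
        errors = rng.laplace(-0.01, 0.05, 8760)

        kde = gaussian_kde(errors)          # kernel density estimate
        grid = np.linspace(-0.3, 0.3, 601)
        density = kde(grid)
        print(f"density peaks near {grid[density.argmax()]:+.3f}")
        print(f"P(|error| > 0.1) = {(np.abs(errors) > 0.1).mean():.3f}")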

  4. Uncertainty quantification in application of the enrichment meter principle for nondestructive assay of special nuclear material

    SciTech Connect (OSTI)

    Burr, Tom; Croft, Stephen; Jarman, Kenneth D.

    2015-09-05

    The various methods of nondestructive assay (NDA) of special nuclear material (SNM) have applications in nuclear nonproliferation, including detection and identification of illicit SNM at border crossings, and quantifying SNM at nuclear facilities for safeguards. No assay method is complete without error bars, which provide one way of expressing confidence in the assay result. Consequently, NDA specialists typically quantify total uncertainty in terms of random and systematic components, and then specify error bars for the total mass estimate in multiple items. Uncertainty quantification (UQ) for NDA has always been important, but it is recognized that greater rigor is needed and achievable using modern statistical methods. To this end, we describe the extent to which the guideline for expressing uncertainty in measurements (GUM) can be used for NDA. Also, we propose improvements over GUM for NDA by illustrating UQ challenges that it does not address, including calibration with errors in predictors, model error, and item-specific biases. A case study is presented using low-resolution NaI spectra and applying the enrichment meter principle to estimate the U-235 mass in an item. The case study illustrates how to update the current American Society for Testing and Materials guide for application of the enrichment meter principle using gamma spectra from a NaI detector.

  5. Minimal position-velocity uncertainty wave packets in relativistic and non-relativistic quantum mechanics

    SciTech Connect (OSTI)

    Al-Hashimi, M.H.; Wiese, U.-J.

    2009-12-15

    We consider wave packets of free particles with a general energy-momentum dispersion relation E(p). The spreading of the wave packet is determined by the velocity v = ∂E/∂p. The position-velocity uncertainty relation Δx Δv ≥ (1/2)|⟨∂²E/∂p²⟩| is saturated by minimal uncertainty wave packets Φ(p) = A exp(−αE(p) + βp). In addition to the standard minimal Gaussian wave packets corresponding to the non-relativistic dispersion relation E(p) = p²/2m, analytic calculations are presented for the spreading of wave packets with minimal position-velocity uncertainty product for the lattice dispersion relation E(p) = −cos(pa)/(ma²) as well as for the relativistic dispersion relation E(p) = √(p² + m²). The boost properties of moving relativistic wave packets as well as the propagation of wave packets in an expanding Universe are also discussed.

  6. Uncertainty and sensitivity analysis of early exposure results with the MACCS Reactor Accident Consequence Model

    SciTech Connect (OSTI)

    Helton, J.C.; Johnson, J.D.; McKay, M.D.; Shiver, A.W.; Sprung, J.L.

    1995-01-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the early health effects associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 34 imprecisely known input variables on the following reactor accident consequences are studied: number of early fatalities, number of cases of prodromal vomiting, population dose within 10 mi of the reactor, population dose within 1000 mi of the reactor, individual early fatality probability within 1 mi of the reactor, and maximum early fatality distance. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: scaling factor for horizontal dispersion, dry deposition velocity, inhalation protection factor for nonevacuees, groundshine shielding factor for nonevacuees, early fatality hazard function alpha value for bone marrow exposure, and scaling factor for vertical dispersion.
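
    A minimal sketch of the sampling side of this workflow using a Latin hypercube design; the variable names, ranges, and stand-in consequence model are illustrative assumptions, not the MACCS variable set.

        import numpy as np
        from scipy.stats import qmc

        # Latin hypercube sample over three imprecisely known inputs.
        sampler = qmc.LatinHypercube(d=3, seed=42)
        X = qmc.scale(sampler.random(n=100),
                      [0.5, 1e-3, 0.1],   # lower bounds (illustrative)
                      [2.0, 1e-2, 1.0])   # upper bounds (illustrative)

        def consequence(x):               # stand-in for a MACCS run
            return 3.0 * x[:, 0] + 200.0 * x[:, 1] + 0.5 * x[:, 2]

        y = consequence(X)
        # Rank inputs by simple correlation with the output.
        for name, col in zip(["dispersion", "deposition", "shielding"], X.T):
            print(f"{name:10s} r = {np.corrcoef(col, y)[0, 1]:+.2f}")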

  7. UNCERTAINTIES OF MODELING GAMMA-RAY PULSAR LIGHT CURVES USING VACUUM DIPOLE MAGNETIC FIELD

    SciTech Connect (OSTI)

    Bai, Xuening; Spitkovsky, Anatoly

    2010-06-01

    Current models of pulsar gamma-ray emission use the magnetic field of a rotating dipole in vacuum as a first approximation to the shape of a plasma-filled pulsar magnetosphere. In this paper, we revisit the question of gamma-ray light curve formation in pulsars in order to ascertain the robustness of the 'two-pole caustic' (TPC) and 'outer gap' (OG) models based on the vacuum magnetic field. We point out an inconsistency in the literature on the use of the relativistic aberration formula, where in several works the shape of the vacuum field was treated as known in the instantaneous corotating frame, rather than in the laboratory frame. With the corrected formula, we find that the peaks in the light curves predicted from the TPC model using the vacuum field are less sharp. The sharpness of the peaks in the OG model is less affected by this change, but the range of magnetic inclination angles and viewing geometries resulting in double-peaked light curves is reduced. In a realistic magnetosphere, the modification of field structure near the light cylinder (LC) due to plasma effects may change the shape of the polar cap and the location of the emission zones. We study the sensitivity of the light curves to different shapes of the polar cap for static and retarded vacuum dipole fields. In particular, we consider polar caps traced by the last open field lines and compare them to circular polar caps. We find that the TPC model is very sensitive to the shape of the polar cap, and a circular polar cap can lead to four peaks of emission. The OG model is less affected by different polar cap shapes, but is subject to large uncertainties from applying the vacuum field near the LC. We conclude that deviations from the vacuum field can lead to large uncertainties in pulse shapes, and a more realistic force-free field should be applied to the study of pulsar high-energy emission.

  8. Signal discovery, limits, and uncertainties with sparse on/off measurements: an objective bayesian analysis

    SciTech Connect (OSTI)

    Knoetig, Max L., E-mail: mknoetig@phys.ethz.ch [Institute for Particle Physics, ETH Zurich, 8093 Zurich (Switzerland)

    2014-08-01

    For decades researchers have studied the on/off counting problem, where a measured rate consists of two parts: one due to a signal process and the other due to a background process, the magnitudes of both of which are unknown. While most frequentist methods are adequate for large number counts, they cannot be applied to sparse data. Here, I present a new objective Bayesian solution that depends on only three parameters: the number of events in the signal region, the number of events in the background region, and the ratio of the exposure for both regions. First, the probability that the counts are due solely to background is derived analytically. Second, the marginalized posterior for the signal parameter is also derived analytically. With this two-step approach it is easy to calculate the signal's significance, strength, uncertainty, or upper limit in a unified way. This approach is valid without restrictions for any number count, including zero, and may be widely applied in particle physics, cosmic-ray physics, and high-energy astrophysics. To demonstrate the performance of this approach, I apply the method to gamma-ray burst data.
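
    The paper's analytic posterior rests on objective priors; the numerical sketch below uses flat priors purely to illustrate the background-marginalization step, so its numbers are illustrative only.

        import numpy as np
        from scipy.integrate import quad
        from scipy.special import gammaln

        # Sparse on/off counts; alpha is the on/off exposure ratio.
        n_on, n_off, alpha = 4, 5, 0.2

        def joint(s, b):
            # Poisson likelihood: on-region mean s + alpha*b, off-region mean b.
            lam_on, lam_off = s + alpha * b, b
            return np.exp(n_on * np.log(lam_on) - lam_on
                          + n_off * np.log(lam_off) - lam_off
                          - gammaln(n_on + 1) - gammaln(n_off + 1))

        # Marginalize the background numerically to get the signal posterior.
        s_grid = np.linspace(1e-3, 15.0, 400)
        post = np.array([quad(lambda b: joint(s, b), 1e-6, 50.0)[0]
                         for s in s_grid])
        ds = s_grid[1] - s_grid[0]
        post /= post.sum() * ds                  # normalize on the grid
        print(f"posterior mean signal = {np.sum(s_grid * post) * ds:.2f}")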

  9. Accounting for Global Climate Model Projection Uncertainty in Modern Statistical Downscaling

    SciTech Connect (OSTI)

    Johannesson, G

    2010-03-17

    Future climate change has emerged as a national and a global security threat. To carry out the needed adaptation and mitigation steps, a quantification of the expected level of climate change is needed, both at the global and the regional scale; in the end, the impact of climate change is felt at the local/regional level. An important part of such climate change assessment is uncertainty quantification. Decision and policy makers are interested not only in 'best guesses' of expected climate change, but in probabilistic quantification (e.g., Rougier, 2007). For example, consider the following question: What is the probability that the average summer temperature will increase by at least 4 °C in region R if global CO₂ emission increases by P% from current levels by time T? It is a simple question, but one that remains very difficult to answer, and answering questions of this kind is the focus of this effort.

    The uncertainty associated with future climate change can be attributed to three major factors: (1) uncertainty about future emission of greenhouse gases (GHG); (2) given a future GHG emission scenario, its impact on the global climate; and (3) given a particular evolution of the global climate, its implications for a particular location/region. In what follows, we assume a particular GHG emission scenario has been selected. Given the GHG emission scenario, the current batch of state-of-the-art global climate models (GCMs) is used to simulate future climate under this scenario, yielding an ensemble of future climate projections (which reflects, to some degree, our uncertainty in simulating future climate given a particular GHG scenario).

    Due to the coarse-resolution nature of the GCM projections, they need to be spatially downscaled for regional impact assessments. Two downscaling methods have emerged: dynamical downscaling and statistical (empirical) downscaling (SDS). Dynamical downscaling involves configuring and running a regional climate model (RCM) nested within a given GCM projection (i.e., the GCM provides boundary conditions for the RCM). Statistical downscaling, on the other hand, aims at establishing a statistical relationship between observed local/regional climate variables of interest and synoptic (GCM-scale) climate predictors; the resulting empirical relationship is then applied to future GCM projections. A comparison of the pros and cons of dynamical versus statistical downscaling is outside the scope of this effort, but has been extensively studied; the reader is referred to Wilby et al. (1998); Murphy (1999); Wood et al. (2004); Benestad et al. (2007); Fowler et al. (2007), and references within those.

    The scope of this effort is to study a methodology, a statistical framework, to propagate and account for GCM uncertainty in regional statistical downscaling assessments. In particular, we explore how to leverage an ensemble of GCM projections to quantify the impact of GCM uncertainty in such an assessment. There are three main components to this effort: (1) gather the necessary climate-related data for a regional SDS study, including multiple GCM projections, (2) carry out SDS, and (3) assess the uncertainty. The first step is carried out using tools written in the Python programming language, while analysis tools were developed in the statistical programming language R; see Figure 1.
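
    A toy regression-based SDS step in the spirit of the approach described: fit on historical pairs, then apply the fit to a future GCM ensemble so that the ensemble spread carries GCM uncertainty down to the local scale. All data below are synthetic.

        import numpy as np

        rng = np.random.default_rng(7)
        # Synthetic historical GCM grid-cell temperatures and local observations.
        gcm_hist = rng.normal(15.0, 2.0, 240)
        local_obs = 0.8 * gcm_hist + 3.0 + rng.normal(0, 0.5, 240)

        # Least-squares fit of the local variable on the GCM-scale predictor.
        A = np.vstack([gcm_hist, np.ones_like(gcm_hist)]).T
        (slope, intercept), *_ = np.linalg.lstsq(A, local_obs, rcond=None)

        # Apply to an ensemble of future projections (10 members).
        gcm_future = rng.normal(17.5, 2.5, (10, 240))
        local_future = slope * gcm_future + intercept
        spread = local_future.mean(axis=1).std()
        print(f"downscaled mean = {local_future.mean():.2f} C, "
              f"ensemble spread = {spread:.2f} C")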

  10. Generalized Uncertainty Quantification for Linear Inverse Problems in X-ray Imaging

    SciTech Connect (OSTI)

    Fowler, Michael James

    2014-04-25

    In industrial and engineering applications, X-ray radiography has attained wide use as a data collection protocol for the assessment of material properties in cases where direct observation is not possible. The direct measurement of nuclear materials, particularly when they are under explosive or implosive loading, is not feasible, and radiography can serve as a useful tool for obtaining indirect measurements. In such experiments, high energy X-rays are pulsed through a scene containing material of interest, and a detector records a radiograph by measuring the radiation that is not attenuated in the scene. One approach to the analysis of these radiographs is to model the imaging system as an operator that acts upon the object being imaged to produce a radiograph. In this model, the goal is to solve an inverse problem to reconstruct the values of interest in the object, which are typically material properties such as density or areal density. The primary objective in this work is to provide quantitative solutions with uncertainty estimates for three separate applications in X-ray radiography: deconvolution, Abel inversion, and radiation spot shape reconstruction. For each problem, we introduce a new hierarchical Bayesian model for determining a posterior distribution on the unknowns and develop efficient Markov chain Monte Carlo (MCMC) methods for sampling from the posterior. A Poisson likelihood, based on a noise model for photon counts at the detector, is combined with a prior tailored to each application: an edge-localizing prior for deconvolution; a smoothing prior with non-negativity constraints for spot reconstruction; and a full covariance sampling prior based on a Wishart hyperprior for Abel inversion. After developing our methods in a general setting, we demonstrate each model on both synthetically generated datasets, including those from a well known radiation transport code, and real high energy radiographs taken at two U. S. Department of Energy laboratories.

  11. Short-Term Energy Outlook Supplement: Uncertainties in the Short-Term Global Petroleum and Other Liquids Supply Forecast

    Gasoline and Diesel Fuel Update (EIA)

    Uncertainties in the Short-Term Global Petroleum and Other Liquids Supply Forecast, February 2014. This report was prepared by the U.S. Energy Information Administration (EIA), the statistical and analytical agency within the U.S. Department of Energy. By law, EIA's data, analyses, and forecasts are independent of approval by any other officer or employee of the United States Government.

  12. Assessing the near-term risk of climate uncertainty : interdependencies among the U.S. states.

    SciTech Connect (OSTI)

    Loose, Verne W.; Lowry, Thomas Stephen; Malczynski, Leonard A.; Tidwell, Vincent Carroll; Stamber, Kevin Louis; Reinert, Rhonda K.; Backus, George A.; Warren, Drake E.; Zagonel, Aldo A.; Ehlen, Mark Andrew; Klise, Geoffrey T.; Vargas, Vanessa N.

    2010-04-01

    Policy makers will most likely need to make decisions about climate policy before climate scientists have resolved all relevant uncertainties about the impacts of climate change. This study demonstrates a risk-assessment methodology for evaluating uncertain future climatic conditions. We estimate the impacts of climate change on U.S. state- and national-level economic activity from 2010 to 2050. To understand the implications of uncertainty on risk and to provide a near-term rationale for policy interventions to mitigate the course of climate change, we focus on precipitation, one of the most uncertain aspects of future climate change. We use results of the climate-model ensemble from the Intergovernmental Panel on Climate Change's (IPCC) Fourth Assessment Report (AR4) as a proxy for representing climate uncertainty over the next 40 years, map the simulated weather from the climate models hydrologically to the county level to determine the physical consequences on economic activity at the state level, and perform a detailed 70-industry analysis of economic impacts among the interacting lower-48 states. We determine the industry-level contribution to the gross domestic product and employment impacts at the state level, as well as interstate population migration, effects on personal income, and consequences for the U.S. trade balance. We show that the mean or average risk of damage to the U.S. economy from climate change, at the national level, is on the order of $1 trillion over the next 40 years, with losses in employment equivalent to nearly 7 million full-time jobs.

  13. Treatment planning for prostate focal laser ablation in the face of needle placement uncertainty

    SciTech Connect (OSTI)

    Cepek, Jeremy; Fenster, Aaron; Lindner, Uri; Trachtenberg, John; Davidson, Sean R. H.; Haider, Masoom A.; Ghai, Sangeet

    2014-01-15

    Purpose: To study the effect of needle placement uncertainty on the expected probability of achieving complete focal target destruction in focal laser ablation (FLA) of prostate cancer. Methods: Using a simplified model of prostate cancer focal target, and focal laser ablation region shapes, Monte Carlo simulations of needle placement error were performed to estimate the probability of completely ablating a region of target tissue. Results: Graphs of the probability of complete focal target ablation are presented over clinically relevant ranges of focal target sizes and shapes, ablation region sizes, and levels of needle placement uncertainty. In addition, a table is provided for estimating the maximum target size that is treatable. The results predict that targets whose length is at least 5 mm smaller than the diameter of each ablation region can be confidently ablated using, at most, four laser fibers if the standard deviation in each component of needle placement error is less than 3 mm. However, targets larger than this (i.e., near to or exceeding the diameter of each ablation region) require more careful planning. This process is facilitated by using the table provided. Conclusions: The probability of completely ablating a focal target using FLA is sensitive to the level of needle placement uncertainty, especially as the target length approaches and becomes greater than the diameter of ablated tissue that each individual laser fiber can achieve. The results of this work can be used to help determine individual patient eligibility for prostate FLA, to guide the planning of prostate FLA, and to quantify the clinical benefit of using advanced systems for accurate needle delivery for this treatment modality.
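
    The geometry below is a hypothetical single-fiber simplification of the Monte Carlo simulation described above; the spherical shapes, radii, and error level are illustrative assumptions, not the paper's model.

        import numpy as np

        rng = np.random.default_rng(3)

        # A spherical target (radius r_t) must stay inside a spherical ablation
        # zone (radius r_a) whose center is displaced by Gaussian needle error
        # with per-axis standard deviation sigma (all lengths in mm).
        def p_complete(r_t, r_a, sigma, n=100_000):
            miss = np.linalg.norm(rng.normal(0.0, sigma, (n, 3)), axis=1)
            # Complete ablation iff the displaced zone still covers the target.
            return np.mean(miss <= (r_a - r_t))

        print(f"P(complete ablation) = {p_complete(5.0, 7.5, 1.5):.3f}")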

  14. Uncertainty quantification of CO₂ saturation estimated from electrical resistance tomography data at the Cranfield site

    SciTech Connect (OSTI)

    Yang, Xianjin; Chen, Xiao; Carrigan, Charles R.; Ramirez, Abelardo L.

    2014-06-03

    A parametric bootstrap approach is presented for uncertainty quantification (UQ) of CO₂ saturation derived from electrical resistance tomography (ERT) data collected at the Cranfield, Mississippi (USA) carbon sequestration site. There are many sources of uncertainty in ERT-derived CO₂ saturation, but we focus on how the ERT observation errors propagate to the estimated CO₂ saturation in a nonlinear inversion process. Our UQ approach consists of three steps. We first estimated the observational errors from a large number of reciprocal ERT measurements. The second step was to invert the pre-injection baseline data and the resulting resistivity tomograph was used as the prior information for nonlinear inversion of time-lapse data. We assigned a 3% random noise to the baseline model. Finally, we used a parametric bootstrap method to obtain bootstrap CO₂ saturation samples by deterministically solving a nonlinear inverse problem many times with resampled data and resampled baseline models. Then the mean and standard deviation of CO₂ saturation were calculated from the bootstrap samples. We found that the maximum standard deviation of CO₂ saturation was around 6% with a corresponding maximum saturation of 30% for a data set collected 100 days after injection began. There was no apparent spatial correlation between the mean and standard deviation of CO₂ saturation but the standard deviation values increased with time as the saturation increased. The uncertainty in CO₂ saturation also depends on the ERT reciprocal error threshold used to identify and remove noisy data and inversion constraints such as temporal roughness. Five hundred realizations requiring 3.5 h on a single 12-core node were needed for the nonlinear Monte Carlo inversion to arrive at stationary variances while the Markov Chain Monte Carlo (MCMC) stochastic inverse approach may expend days for a global search. This indicates that UQ of 2D or 3D ERT inverse problems can be performed on a laptop or desktop PC.
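
    A skeleton of the parametric bootstrap loop described above; invert() is a placeholder stand-in, not the ERT solver, and all numbers are illustrative.

        import numpy as np

        rng = np.random.default_rng(11)

        # Re-invert many noise-perturbed copies of the data and summarize the
        # spread of the resulting estimates.
        def invert(data):
            return 0.3 * data.mean()   # placeholder for the nonlinear inversion

        observed = rng.normal(1.0, 0.03, 200)   # data with ~3% noise
        sigma = 0.03                            # estimated observational error

        estimates = np.array([
            invert(observed + rng.normal(0.0, sigma, observed.shape))
            for _ in range(500)                 # 500 realizations, as reported
        ])
        print(f"saturation estimate: mean = {estimates.mean():.3f}, "
              f"std = {estimates.std():.3f}")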

  15. Uncertainty and sensitivity analysis of food pathway results with the MACCS Reactor Accident Consequence Model

    SciTech Connect (OSTI)

    Helton, J.C.; Johnson, J.D.; Rollstin, J.A.; Shiver, A.W.; Sprung, J.L.

    1995-01-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the food pathways associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 87 imprecisely-known input variables on the following reactor accident consequences are studied: crop growing season dose, crop long-term dose, milk growing season dose, total food pathways dose, total ingestion pathways dose, total long-term pathways dose, area dependent cost, crop disposal cost, milk disposal cost, condemnation area, crop disposal area and milk disposal area. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: fraction of cesium deposition on grain fields that is retained on plant surfaces and transferred directly to grain, maximum allowable ground concentrations of Cs-137 and Sr-90 for production of crops, ground concentrations of Cs-134, Cs-137 and I-131 at which the disposal of milk will be initiated due to accidents that occur during the growing season, ground concentrations of Cs-134, I-131 and Sr-90 at which the disposal of crops will be initiated due to accidents that occur during the growing season, rate of depletion of Cs-137 and Sr-90 from the root zone, transfer of Sr-90 from soil to legumes, transfer of Cs-137 from soil to pasture, transfer of cesium from animal feed to meat, and the transfer of cesium, iodine and strontium from animal feed to milk.

  16. Uncertainty and sensitivity analysis of chronic exposure results with the MACCS reactor accident consequence model

    SciTech Connect (OSTI)

    Helton, J.C.; Johnson, J.D.; Rollstin, J.A.; Shiver, A.W.; Sprung, J.L.

    1995-01-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the chronic exposure pathways associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 75 imprecisely known input variables on the following reactor accident consequences are studied: crop growing season dose, crop long-term dose, water ingestion dose, milk growing season dose, long-term groundshine dose, long-term inhalation dose, total food pathways dose, total ingestion pathways dose, total long-term pathways dose, total latent cancer fatalities, area-dependent cost, crop disposal cost, milk disposal cost, population-dependent cost, total economic cost, condemnation area, condemnation population, crop disposal area and milk disposal area. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: dry deposition velocity, transfer of cesium from animal feed to milk, transfer of cesium from animal feed to meat, ground concentration of Cs-134 at which the disposal of milk products will be initiated, transfer of Sr-90 from soil to legumes, maximum allowable ground concentration of Sr-90 for production of crops, fraction of cesium entering surface water that is consumed in drinking water, groundshine shielding factor, scale factor defining resuspension, dose reduction associated with decontamination, and ground concentration of I-131 at which disposal of crops will be initiated due to accidents that occur during the growing season.

  17. Uncertainty quantification of CO₂ saturation estimated from electrical resistance tomography data at the Cranfield site

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Yang, Xianjin; Chen, Xiao; Carrigan, Charles R.; Ramirez, Abelardo L.

    2014-06-03

    A parametric bootstrap approach is presented for uncertainty quantification (UQ) of CO₂ saturation derived from electrical resistance tomography (ERT) data collected at the Cranfield, Mississippi (USA) carbon sequestration site. There are many sources of uncertainty in ERT-derived CO₂ saturation, but we focus on how the ERT observation errors propagate to the estimated CO₂ saturation in a nonlinear inversion process. Our UQ approach consists of three steps. We first estimated the observational errors from a large number of reciprocal ERT measurements. The second step was to invert the pre-injection baseline data and the resulting resistivity tomograph was used as the prior information for nonlinear inversion of time-lapse data. We assigned a 3% random noise to the baseline model. Finally, we used a parametric bootstrap method to obtain bootstrap CO₂ saturation samples by deterministically solving a nonlinear inverse problem many times with resampled data and resampled baseline models. Then the mean and standard deviation of CO₂ saturation were calculated from the bootstrap samples. We found that the maximum standard deviation of CO₂ saturation was around 6% with a corresponding maximum saturation of 30% for a data set collected 100 days after injection began. There was no apparent spatial correlation between the mean and standard deviation of CO₂ saturation but the standard deviation values increased with time as the saturation increased. The uncertainty in CO₂ saturation also depends on the ERT reciprocal error threshold used to identify and remove noisy data and inversion constraints such as temporal roughness. Five hundred realizations requiring 3.5 h on a single 12-core node were needed for the nonlinear Monte Carlo inversion to arrive at stationary variances while the Markov Chain Monte Carlo (MCMC) stochastic inverse approach may expend days for a global search. This indicates that UQ of 2D or 3D ERT inverse problems can be performed on a laptop or desktop PC.

  18. nCTEQ15 - Global analysis of nuclear parton distributions with uncertainties

    SciTech Connect (OSTI)

    Kusina, A.; Jezo, T.; Clark, D. B.; Keppel, Cynthia; Lyonnet, F.; Morfin, Jorge; Olness, F. I.; Owens, Jeff; Schienbein, I.

    2015-09-01

    We present the first official release of the nCTEQ nuclear parton distribution functions with errors. The main addition to the previous nCTEQ PDFs is the introduction of PDF uncertainties based on the Hessian method. Another important addition is the inclusion of pion production data from RHIC that give us a handle on constraining the gluon PDF. This contribution summarizes our results from arXiv:1509.00792 and concentrates on the comparison with other groups providing nuclear parton distributions.
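
    For reference, the symmetric Hessian "master formula" commonly used to propagate such PDF uncertainties to an observable X from the 2N eigenvector sets S_i^± is (a standard form, not quoted from the nCTEQ15 paper):

        \Delta X = \frac{1}{2} \sqrt{ \sum_{i=1}^{N} \left[ X(S_i^{+}) - X(S_i^{-}) \right]^2 }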

  19. Generalized uncertainty principle in f(R) gravity for a charged black hole

    SciTech Connect (OSTI)

    Said, Jackson Levi; Adami, Kristian Zarb

    2011-02-15

    Using f(R) gravity in the Palatini formalism, the metric for a charged, spherically symmetric black hole is derived, taking the Ricci scalar curvature to be constant. The generalized uncertainty principle is then used to calculate the temperature of the resulting black hole; through this, the entropy is found, correcting the Bekenstein-Hawking entropy in this case. Using the entropy, the tunneling probability and heat capacity are calculated up to the order of the Planck length, which produces an extra factor that becomes important as black holes become small, such as in the case of mini black holes.
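
    For orientation, a commonly used form of the generalized uncertainty principle, with beta a Planck-scale deformation parameter (the paper's exact conventions may differ), is

        \Delta x \, \Delta p \ \ge\ \frac{\hbar}{2} \left[ 1 + \beta (\Delta p)^2 \right],

    which implies a minimal position uncertainty of order \hbar \sqrt{\beta}, i.e., of order the Planck length.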

  20. Influence of Nuclear Fuel Cycles on Uncertainty of Long Term Performance of Geologic Disposal Systems

    Broader source: Energy.gov [DOE]

    Development and implementation of future advanced fuel cycles, including those that recycle fuel materials, use advanced fuels different from current fuels, or partition and transmute actinide radionuclides, will impact the waste management system. The UFD Campaign can reasonably conclude that advanced fuel cycles, in combination with partitioning and transmutation to remove actinides, will not materially alter the performance of a geologic disposal system in the U.S. regulatory environment, the spread in dose results around the mean, the modeling effort needed to include significant features, events, and processes (FEPs) in the performance assessment, or the characterization of the associated uncertainty.

  1. Analysis of sampling plan options for tank 16H from the perspective of statistical uncertainty

    SciTech Connect (OSTI)

    Shine, E. P.

    2013-02-28

    This report develops a concentration variability model for Tank 16H in order to compare candidate sampling plans for assessing the concentrations of analytes in the residual material in the annulus and on the floor of the primary vessel. A concentration variability model is used to compare candidate sampling plans based on the expected upper 95% confidence limit (UCL95) for the mean. The result is expressed as a rank order of candidate sampling plans from lowest to highest expected UCL95, with the lowest being the most desirable from an uncertainty perspective.
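
    The UCL95 ranking criterion is a one-sided Student-t upper confidence limit on the mean; a minimal sketch follows, with illustrative values rather than Tank 16H data.

        import numpy as np
        from scipy import stats

        # One-sided 95% upper confidence limit on a sample mean.
        x = np.array([3.1, 2.8, 3.4, 3.0, 2.9, 3.3])  # analyte concentrations
        n = len(x)
        ucl95 = (x.mean()
                 + stats.t.ppf(0.95, df=n - 1) * x.std(ddof=1) / np.sqrt(n))
        print(f"mean = {x.mean():.2f}, UCL95 = {ucl95:.2f}")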

  2. Effect of soil property uncertainties on permafrost thaw projections: a calibration-constrained analysis: Modeling Archive

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    E.T. Coon; C.J. Wilson; S.L. Painter; V.E. Romanovsky; D.R. Harp; A.L. Atchley; J.C. Rowland

    2016-02-02

    This dataset contains an ensemble of thermal-hydro soil parameters including porosity, thermal conductivity, thermal conductivity shape parameters, and residual saturation of peat and mineral soil. The ensemble was generated using a Null-Space Monte Carlo analysis of parameter uncertainty based on a calibration to soil temperatures collected at the Barrow Environmental Observatory site by the NGEE team. The micro-topography of ice wedge polygons present at the site is included in the analysis using three 1D column models to represent polygon center, rim and trough features. The Arctic Terrestrial Simulator (ATS) was used in the calibration to model multiphase thermal and hydrological processes in the subsurface.

  3. Gamma-Ray Library and Uncertainty Analysis: Passively Emitted Gamma Rays Used in Safeguards Technology

    SciTech Connect (OSTI)

    Parker, W

    2009-09-18

    Non-destructive gamma-ray analysis is a fundamental part of nuclear safeguards, including nuclear energy safeguards technology. Developing safeguards capabilities for nuclear energy will certainly benefit from the advanced use of gamma-ray spectroscopy as well as the ability to model various reactor scenarios. There is currently a wide variety of nuclear data that could be used in computer modeling and gamma-ray spectroscopy analysis. The data can be discrepant (with varying uncertainties), and it may be difficult for a modeler or software developer to determine the best nuclear data set for a particular situation. To use gamma-ray spectroscopy to determine the relative isotopic composition of nuclear materials, the gamma-ray energies and the branching ratios or intensities of the gamma rays emitted from the nuclides in the material must be well known. A variety of computer simulation codes will be used during the development of nuclear energy safeguards, and, to compare the results of various codes, it is essential that all the gamma-ray libraries agree. Assessing our nuclear data needs allows us to create a prioritized list of desired measurements, and provides uncertainties for energies and especially for branching intensities. Of interest are actinides, fission products, and activation products, and most particularly mixtures of all of these radioactive isotopes, including mixtures of actinides and other products.

    Recent work includes the development of new detectors with increased energy resolution, and studies of gamma rays and their lines used in simulation codes. Because new detectors are being developed, there is an increased need for well-known nuclear data for radioactive isotopes of some elements. Safeguards technology should take advantage of all types of gamma-ray detectors, including new super-cooled detectors, germanium detectors, and cadmium zinc telluride detectors. Mixed isotopes, particularly mixed actinides found in nuclear reactor streams, can be especially challenging to identify. The super-cooled detectors have a marked improvement in energy resolution, allowing the possibility of deconvolution of mixtures of gamma rays that was unavailable with high-purity germanium detectors. Isotopic analysis codes require libraries of gamma rays. In certain situations, isotope identification can be made in the field, sometimes with a short turnaround time, depending on the choice of detector and software analysis package. Sodium iodide and high-purity germanium detectors have been successfully used in field scenarios. The newer super-cooled detectors offer dramatically increased resolution, but they have lower efficiency and so can require longer collection times. The different peak shapes require software development for the specific detector type and field application. Libraries can be tailored to specific scenarios; by eliminating isotopes that are certainly not present, the analysis time may be shortened and the accuracy may be increased.

    The intent of this project was to create one accurate library of gamma rays emitted from isotopes of interest, to be used as a reliable reference in safeguards work. All simulation and spectroscopy analysis codes can draw upon this best library to improve accuracy and cross-code consistency. Modeling codes may include MCNP and COG. Gamma-ray spectroscopy analysis codes may include MGA, MGAU, U235, and FRAM. The intent is to give developers and users the tools to use in nuclear energy safeguards work.
    In this project, the library created was limited to a selection of actinide isotopes of immediate interest to reactor technology. These isotopes included U-234 through U-238, Np-237, Pu-238 through Pu-242, Am-241, Am-243, and Cm-244. These isotopes were examined, and the best gamma-ray data, including line energies and relative strengths, were selected.

  4. Improvements to the RELAP5/MOD3 reflood model and uncertainty quantification of reflood peak clad temperature

    SciTech Connect (OSTI)

    Chung, Bub Dong; Lee, Young Lee; Park, Chan Eok; Lee, Sang Yong

    1996-10-01

    Assessment of the original RELAP5/MOD3.1 code against the FLECHT-SEASET series of experiments has identified some weaknesses of the reflood model, such as the lack of a quenching temperature model, the shortcomings of the Chen transition boiling model, and the incorrect prediction of droplet size and interfacial heat transfer. Also, high temperature spikes during the reflood calculation resulted in high steam flow oscillation and liquid carryover. An effort was made to improve the code with respect to the above weaknesses, and the necessary models in the wall heat transfer package and the numerical scheme were modified. Some important FLECHT-SEASET experiments were assessed using both the improved and standard versions. The results from the improved RELAP5/MOD3.1 show that the weaknesses of RELAP5/MOD3.1 were much reduced compared to the standard MOD3.1 code. The prediction of void profile and cladding temperature agreed better with test data, especially for the gravity feed test. A scatter diagram of peak cladding temperatures (PCTs) was made from the comparison of all the calculated PCTs and the corresponding experimental values. The deviations between experimental and calculated PCTs were calculated for 2793 data points. The deviations are shown to be normally distributed, and are used to quantify statistically the PCT uncertainty of the code. The upper limit of PCT uncertainty at the 95% confidence level is evaluated to be about 99 K.
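
    The final statistical step above amounts to fitting a normal distribution to the PCT deviations and taking a one-sided 95% limit; the sketch below uses synthetic deviations with an invented mean and spread, not the 2793 real comparison points.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(5)
        dev = rng.normal(10.0, 54.0, 2793)   # synthetic PCT deviations, K

        # One-sided 95% upper limit assuming normally distributed deviations.
        upper95 = dev.mean() + stats.norm.ppf(0.95) * dev.std(ddof=1)
        print(f"95% upper limit on PCT deviation ~ {upper95:.0f} K")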

  5. Impacts of Variability and Uncertainty in Solar Photovoltaic Generation at Multiple Timescales

    SciTech Connect (OSTI)

    Ela, E.; Diakov, V.; Ibanez, E.; Heaney, M.

    2013-05-01

    The characteristics of variability and uncertainty of PV solar power have been studied extensively. These characteristics can create challenges for system operators who must ensure a balance between generation and demand while obeying power system constraints at the lowest possible cost. A number of studies have looked at the impact of wind power plants, and some recent studies have also included solar PV. The simulations that are used in these studies, however, are typically fixed to one time resolution. This makes it difficult to analyze the variability across several timescales. In this study, we use a simulation tool that has the ability to evaluate both the economic and reliability impacts of PV variability and uncertainty at multiple timescales. This information should help system operators better prepare for increases of PV on their systems and develop improved mitigation strategies to better integrate PV with enhanced reliability. Another goal of this study is to understand how different mitigation strategies and methods can improve the integration of solar power more reliably and efficiently.

  6. Covariant energy-momentum and an uncertainty principle for general relativity

    SciTech Connect (OSTI)

    Cooperstock, F.I.; Dupre, M.J.

    2013-12-15

    We introduce a naturally-defined totally invariant spacetime energy expression for general relativity incorporating the contribution from gravity. The extension links seamlessly to the action integral for the gravitational field. The demand that the general expression for arbitrary systems reduces to the Tolman integral in the case of stationary bounded distributions leads to the matter-localized Ricci integral for energy-momentum in support of the energy localization hypothesis. The role of the observer is addressed and, as an extension of the special relativistic case, the field of observers comoving with the matter is seen to compute the intrinsic global energy of a system. The new localized energy supports the Bonnor claim that the Szekeres collapsing dust solutions are energy-conserving. It is suggested that in the extreme of strong gravity, the Heisenberg Uncertainty Principle be generalized in terms of spacetime energy-momentum. -- Highlights: We present a totally invariant spacetime energy expression for general relativity incorporating the contribution from gravity. The demand that the general expression reduce to the Tolman integral for stationary systems supports the Ricci integral as energy-momentum. Localized energy via the Ricci integral is consistent with the energy localization hypothesis. The new localized energy supports the Bonnor claim that the Szekeres collapsing dust solutions are energy-conserving. We suggest the Heisenberg Uncertainty Principle be generalized in terms of spacetime energy-momentum in the strong gravity extreme.

  7. Cosmological parameter uncertainties from SALT-II type Ia supernova light curve models

    SciTech Connect (OSTI)

    Mosher, J.; Sako, M. [Department of Physics and Astronomy, University of Pennsylvania, 209 South 33rd Street, Philadelphia, PA 19104 (United States); Guy, J.; Astier, P.; Betoule, M.; El-Hage, P.; Pain, R.; Regnault, N. [LPNHE, CNRS/IN2P3, Université Pierre et Marie Curie Paris 6, Université Denis Diderot Paris 7, 4 place Jussieu, F-75252 Paris Cedex 05 (France); Kessler, R.; Frieman, J. A. [Kavli Institute for Cosmological Physics, University of Chicago, 5640 South Ellis Avenue, Chicago, IL 60637 (United States); Marriner, J. [Center for Particle Astrophysics, Fermi National Accelerator Laboratory, P.O. Box 500, Batavia, IL 60510 (United States); Biswas, R.; Kuhlmann, S. [Argonne National Laboratory, 9700 South Cass Avenue, Lemont, IL 60439 (United States); Schneider, D. P., E-mail: kessler@kicp.chicago.edu [Department of Astronomy and Astrophysics, The Pennsylvania State University, University Park, PA 16802 (United States)

    2014-09-20

    We use simulated type Ia supernova (SN Ia) samples, including both photometry and spectra, to perform the first direct validation of cosmology analysis using the SALT-II light curve model. This validation includes residuals from the light curve training process, systematic biases in SN Ia distance measurements, and a bias on the dark energy equation of state parameter w. Using the SN-analysis package SNANA, we simulate and analyze realistic samples corresponding to the data samples used in the SNLS3 analysis: ∼120 low-redshift (z < 0.1) SNe Ia, ∼255 Sloan Digital Sky Survey SNe Ia (z < 0.4), and ∼290 SNLS SNe Ia (z ≲ 1). To probe systematic uncertainties in detail, we vary the input spectral model, the model of intrinsic scatter, and the smoothing (i.e., regularization) parameters used during the SALT-II model training. Using realistic intrinsic scatter models results in a slight bias in the ultraviolet portion of the trained SALT-II model, and w biases (w_input − w_recovered) ranging from −0.005 ± 0.012 to −0.024 ± 0.010. These biases are indistinguishable from each other within the uncertainty; the average bias on w is −0.014 ± 0.007.

  8. Interactive Photochemistry in Earth System Models to Assess Uncertainty in Ozone and Greenhouse Gases. Final report

    SciTech Connect (OSTI)

    Prather, Michael J.; Hsu, Juno; Nicolau, Alex; Veidenbaum, Alex; Smith, Philip Cameron; Bergmann, Dan

    2014-11-07

    Atmospheric chemistry controls the abundances and hence climate forcing of important greenhouse gases including N2O, CH4, HFCs, CFCs, and O3. Attributing climate change to human activities requires, at a minimum, accurate models of the chemistry and circulation of the atmosphere that relate emissions to abundances. This DOE-funded research provided realistic, yet computationally optimized and affordable, photochemical modules to the Community Earth System Model (CESM) that augment the CESM capability to explore the uncertainty in future stratospheric-tropospheric ozone, stratospheric circulation, and thus the lifetimes of chemically controlled greenhouse gases in climate simulations. To this end, we successfully implemented the Fast-J (radiation algorithm determining key chemical photolysis rates) and Linoz v3.0 (linearized photochemistry for interactive O3, N2O, NOy and CH4) packages in LLNL-CESM and for the first time demonstrated how a change in the O2 photolysis rate within its uncertainty range can significantly impact the stratospheric climate and ozone abundances. From the UCI side, this proposal also helped LLNL develop a CAM-Superfast Chemistry model that was implemented for the IPCC AR5 and contributed chemical-climate simulations to CMIP5.

  9. Uncertainty analyses of CO2 plume expansion subsequent to wellbore CO2 leakage into aquifers

    SciTech Connect (OSTI)

    Hou, Zhangshuan; Bacon, Diana H.; Engel, David W.; Lin, Guang; Fang, Yilin; Ren, Huiying; Fang, Zhufeng

    2014-08-01

    In this study, we apply an uncertainty quantification (UQ) framework to CO2 sequestration problems. In one scenario, we look at the risk of wellbore leakage of CO2 into a shallow unconfined aquifer in an urban area; in another scenario, we study the effects of reservoir heterogeneity on CO2 migration. We combine various sampling approaches (quasi-Monte Carlo, probabilistic collocation, and adaptive sampling) in order to reduce the number of forward calculations while trying to fully explore the input parameter space and quantify the input uncertainty. The CO2 migration is simulated using the PNNL-developed simulator STOMP-CO2e (the water-salt-CO2 module). For computationally demanding simulations with 3D heterogeneity fields, we combined the framework with a scalable version, eSTOMP, as the forward modeling simulator. We built response curves and response surfaces of model outputs with respect to input parameters to look at the individual and combined effects, and to identify and rank the significance of the input parameters.

  10. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis

    SciTech Connect (OSTI)

    Adams, Brian M.; Ebeida, Mohamed Salah; Eldred, Michael S.; Jakeman, John Davis; Swiler, Laura Painton; Stephens, John Adam; Vigil, Dena M.; Wildey, Timothy Michael; Bohnhoff, William J.; Eddy, John P.; Hu, Kenneth T.; Dalbey, Keith R.; Bauman, Lara E; Hough, Patricia Diane

    2014-05-01

    The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.

  11. An Uncertainty Quantification Framework for Studying the Effect of Spatial Heterogeneity in Reservoir Permeability on CO2 Sequestration

    SciTech Connect (OSTI)

    Hou, Zhangshuan; Engel, David W.; Lin, Guang; Fang, Yilin; Fang, Zhufeng

    2013-10-01

    In this paper, we introduce an uncertainty quantification (UQ) software framework for carbon sequestration, focused on the effect of spatial heterogeneity of reservoir properties on CO2 migration. We use the sequential Gaussian simulation method (SGSIM) to generate realizations of permeability fields with various spatial statistical attributes. To deal with the computational difficulties, we integrate the following ideas/approaches. First, we use three different sampling approaches (probabilistic collocation, quasi-Monte Carlo, and adaptive sampling) to reduce the number of forward calculations while trying to explore the parameter space and quantify the input uncertainty. Second, we use eSTOMP as the forward modeling simulator. eSTOMP is implemented with the Global Arrays toolkit, which is based on one-sided inter-processor communication and supports a shared-memory programming style on distributed-memory platforms, providing highly scalable performance. Third, we built an adaptive system infrastructure to select the best possible data transfer mechanisms, to optimally allocate system resources to improve performance, and to integrate software packages and data for composing carbon sequestration simulation, computation, analysis, estimation, and visualization. We demonstrate the framework with a given CO2 injection scenario in heterogeneous sandstone reservoirs.

  12. Uncertainties in Life Cycle Greenhouse Gas Emissions from Advanced Biomass Feedstock Logistics Supply Chains in Kansas

    SciTech Connect (OSTI)

    Cafferty, Kara G.; Searcy, Erin M.; Nguyen, Long; Spatari, Sabrina

    2014-11-01

    To meet Energy Independence and Security Act (EISA) cellulosic biofuel mandates, the United States will require an annual domestic supply of about 242 million Mg of biomass by 2022. To improve the feedstock logistics of lignocellulosic biofuels and access available biomass resources from areas with varying yields, commodity systems have been proposed and designed to deliver on-spec biomass feedstocks at preprocessing “depots”, which densify and stabilize the biomass prior to long-distance transport and delivery to centralized biorefineries. The harvesting, preprocessing, and logistics (HPL) of biomass commodity supply chains thus could introduce spatially variable environmental impacts into the biofuel life cycle, owing to the need to harvest, move, and preprocess biomass over varying distances from areas of variable spatial density. This study examines the uncertainty in greenhouse gas (GHG) emissions of corn stover HPL within a bio-ethanol supply chain in the state of Kansas, where sustainable biomass supply varies spatially. Two scenarios were evaluated, each with a different number of depots of varying capacity and location within Kansas relative to a central commodity-receiving biorefinery, to test GHG emissions uncertainty. Monte Carlo simulation was used to estimate the spatial uncertainty in the HPL gate-to-gate sequence. The results show that the transport of densified biomass introduces the highest variability and the largest contribution to the carbon footprint of the HPL supply chain (0.2-13 g CO2e/MJ). Moreover, depending upon the biomass availability, its spatial density, and the surrounding transportation infrastructure (road and rail), HPL processes can increase the variability in life cycle environmental impacts for lignocellulosic biofuels. Within Kansas, life cycle GHG emissions could range from 24 to 41 g CO2e/MJ depending upon the location, size, and number of preprocessing depots constructed. However, this range can be minimized by siting preprocessing depots where ample rail infrastructure exists to supply biomass commodity to a regional biorefinery supply system.
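
    The Monte Carlo step described above can be sketched as follows. All distributions and per-stage emission factors here are invented for illustration and are not the study's inputs; only the structure (sampling stage-level factors and summing to a g CO2e/MJ total, with transport driving most of the spread) mirrors the described approach.

    ```python
    # Monte Carlo propagation of per-stage GHG emission-factor uncertainty;
    # all values are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000

    # Hypothetical per-stage emission factors (g CO2e/MJ of fuel delivered).
    harvest = rng.triangular(3.0, 5.0, 8.0, n)      # harvesting and collection
    preprocess = rng.triangular(4.0, 6.0, 9.0, n)   # depot densification
    # Transport varies most with haul distance and mode, per the study's
    # reported 0.2-13 g CO2e/MJ range.
    transport = rng.uniform(0.2, 13.0, n)

    total = harvest + preprocess + transport
    lo, med, hi = np.percentile(total, [2.5, 50.0, 97.5])
    print(f"HPL GHG: median {med:.1f}, 95% CI [{lo:.1f}, {hi:.1f}] g CO2e/MJ")
    ```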

  13. Soil Segregation technology: reducing uncertainty and increasing efficiency during radiological decommissioning - a case study

    SciTech Connect (OSTI)

    Lombardo, A.J.; Orthen, R.F.; Shonka, J.J.; Scott, L.M.

    2007-07-01

    The regulatory release of sites and facilities (property) for restricted or unrestricted use has evolved beyond prescribed levels to model-derived, dose- and risk-based limits. Dose models for deriving corresponding soil radionuclide concentration guidelines are necessarily simplified representations of complex processes. It is not practical to obtain data that fully or accurately characterize transport and exposure pathway processes, nor is it possible to predict future conditions with certainty absent durable land use restrictions. To compensate for the shortage of comprehensive characterization data and site-specific inputs describing the projected 'as-left' contaminated zone, conservative default values are used to derive acceptance criteria, and the resulting criteria are overly conservative. Furthermore, implementing a remediation plan and performing final surveys to show compliance with these conservative criteria often result in excessive remediation because of the large uncertainty. During a recent decommissioning project at a site contaminated with thorium, a unique approach to dose modeling and remedial action design was implemented to manage end-point uncertainty effectively. The approach used a dynamic feedback dose model and soil segregation technology to characterize impacted material with a precision and accuracy not possible with static control approaches. Using remedial action goal 'over excavation' and subsequent auto-segregation of excavated material for refill, the end-state RESRAD input parameters (the as-left conditions of the refilled excavation) were re-entered to assess the final dose. The segregation process produced separate below- and above-criteria material stockpiles whose volumes were optimized for maximum refill and minimum waste. The below-criteria material was returned to the excavation without further analysis, while the above-criteria material was packaged for offsite disposal. Using the activity concentration data recorded by the segregation system and the as-left configuration of the refilled excavation, the end-state model of the site was prepared with substantially reduced uncertainty. The major projected benefits of this approach are reviewed, as well as the performance of the segregation system and lessons learned, including: 1) total, first-attempt data discovery brought about by simultaneously conducted characterization and final status surveys; 2) lower project costs stemming from efficient analysis and abstraction of impacted material and reduced offsite waste disposal volume; 3) lower project costs due to increased remediation/construction efficiency and decreased survey and radio-analytical expenses; and 4) an improved decommissioning experience under new regulatory guidance. (authors)

  14. Sensitivity of CO2 migration estimation on reservoir temperature and pressure uncertainty

    SciTech Connect (OSTI)

    Jordan, Preston; Doughty, Christine

    2008-11-01

    The density and viscosity of supercritical CO{sub 2} are sensitive to pressure and temperature (PT) while the viscosity of brine is sensitive primarily to temperature. Oil field PT data in the vicinity of WESTCARB's Phase III injection pilot test site in the southern San Joaquin Valley, California, show a range of PT values, indicating either PT uncertainty or variability. Numerical simulation results across the range of likely PT indicate brine viscosity variation causes virtually no difference in plume evolution and final size, but CO{sub 2} density variation causes a large difference. Relative ultimate plume size is almost directly proportional to the relative difference in brine and CO{sub 2} density (buoyancy flow). The majority of the difference in plume size occurs during and shortly after the cessation of injection.

  15. Entropic uncertainty relations and locking: Tight bounds for mutually unbiased bases

    SciTech Connect (OSTI)

    Ballester, Manuel A.; Wehner, Stephanie

    2007-02-15

    We prove tight entropic uncertainty relations for a large number of mutually unbiased measurements. In particular, we show that a bound derived from the result by Maassen and Uffink [Phys. Rev. Lett. 60, 1103 (1988)] for two such measurements can in fact be tight for up to {radical}(d) measurements in mutually unbiased bases. We then show that using more mutually unbiased bases does not always lead to a better locking effect. We prove that the optimal bound for the accessible information using up to {radical}(d) specific mutually unbiased bases is log d/2, which is the same as can be achieved by using only two bases. Our result indicates that merely using mutually unbiased bases is not sufficient to achieve a strong locking effect and we need to look for additional properties.
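
    The two-measurement bound referenced above can be checked numerically. The sketch below verifies the Maassen-Uffink bound H(A) + H(B) >= log2(d) for the computational and discrete-Fourier bases, which are mutually unbiased; it covers only the two-basis case, not the up-to-{radical}(d)-basis regime analyzed in the paper.

    ```python
    # Numerical check of the Maassen-Uffink entropic uncertainty bound
    # H(A) + H(B) >= log2(d) for two mutually unbiased bases.
    import numpy as np

    d = 8
    # Columns of F are the discrete Fourier basis, mutually unbiased with
    # the computational basis: |<j|f_k>| = 1/sqrt(d).
    F = np.exp(2j * np.pi * np.outer(np.arange(d), np.arange(d)) / d) / np.sqrt(d)

    def shannon(p):
        p = p[p > 1e-15]
        return -np.sum(p * np.log2(p))

    rng = np.random.default_rng(0)
    worst = np.inf
    for _ in range(2000):
        psi = rng.normal(size=d) + 1j * rng.normal(size=d)
        psi /= np.linalg.norm(psi)
        p_comp = np.abs(psi) ** 2                # computational-basis outcomes
        p_four = np.abs(F.conj().T @ psi) ** 2   # Fourier-basis outcomes
        worst = min(worst, shannon(p_comp) + shannon(p_four))

    print(f"min H(A)+H(B) over samples: {worst:.3f}  (bound: {np.log2(d):.3f})")
    ```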

  16. Challenges, uncertainties and issues facing gas production from gas hydrate deposits

    SciTech Connect (OSTI)

    Moridis, G.J.; Collett, T.S.; Pooladi-Darvish, M.; Hancock, S.; Santamarina, C.; Boswell, R.; Kneafsey, T.; Rutqvist, J.; Kowalsky, M.; Reagan, M.T.; Sloan, E.D.; Sum, A.K.; Koh, C.

    2010-11-01

    The current paper complements the Moridis et al. (2009) review of the status of the effort toward commercial gas production from hydrates. We aim to describe the concept of the gas hydrate petroleum system; to discuss advances, requirements, and suggested practices in gas hydrate (GH) prospecting and GH deposit characterization; and to review the associated technical, economic, and environmental challenges and uncertainties, including: the accurate assessment of producible fractions of the GH resource, the development of methodologies for identifying suitable production targets, the sampling of hydrate-bearing sediments and sample analysis, the analysis and interpretation of geophysical surveys of GH reservoirs, well testing methods and interpretation of the results, geomechanical and reservoir/well stability concerns, well design, operation and installation, field operations and extending production beyond sand-dominated GH reservoirs, monitoring production and geomechanical stability, laboratory investigations, fundamental knowledge of hydrate behavior, the economics of commercial gas production from hydrates, and the associated environmental concerns.

  17. Theoretical uncertainties in the nuclear matrix elements of neutrinoless double beta decay: The transition operator

    SciTech Connect (OSTI)

    Menéndez, Javier

    2013-12-30

    We explore the theoretical uncertainties related to the transition operator of neutrinoless double-beta (0νββ) decay. The transition operator used in standard calculations is a product of one-body currents that can be obtained phenomenologically, as in Tomoda [1] or Šimkovic et al. [2]. However, corrections to the operator are hard to obtain in the phenomenological approach. Instead, we calculate the 0νββ decay operator in the framework of chiral effective field theory (EFT), which gives a systematic order-by-order expansion of the transition currents. At leading orders in chiral EFT we reproduce the standard one-body currents of Refs. [1] and [2]. Corrections appear as two-body (2b) currents predicted by chiral EFT. We compute the effects of the leading 2b currents on the nuclear matrix elements of 0νββ decay for several transition candidates. The 2b current contributions are related to the quenching of Gamow-Teller transitions found in nuclear structure calculations.

  18. Electromagnetic form factors of the nucleon: New fit and analysis of uncertainties

    SciTech Connect (OSTI)

    Alberico, W. M.; Giunti, C.; Bilenky, S. M.; Graczyk, K. M.

    2009-06-15

    Electromagnetic form factors of the proton and neutron, obtained from a new fit of data, are presented. The proton form factors are obtained from a simultaneous fit to the ratio {mu}{sub p}G{sub Ep}/G{sub Mp} determined from polarization transfer measurements and to ep elastic cross section data. Phenomenological two-photon exchange corrections are taken into account. The present fit for protons was performed in the kinematical region Q{sup 2} is an element of (0,6) GeV{sup 2}. For both protons and neutrons we use the latest available data. For all form factors, the uncertainties and correlations of form factor parameters are investigated with the {chi}{sup 2} method.
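
    As a schematic of such a fit, the snippet below performs a chi-squared (least-squares) fit of a simple dipole form-factor ansatz to synthetic data and reads parameter uncertainties and correlations off the covariance matrix; the paper's actual parameterization, data sets, and two-photon-exchange corrections are not reproduced.

    ```python
    # Least-squares form-factor fit with parameter uncertainties from the
    # covariance matrix; the dipole ansatz and data are synthetic.
    import numpy as np
    from scipy.optimize import curve_fit

    def dipole(Q2, norm, mass2):
        """Dipole form factor G(Q^2) = norm / (1 + Q^2/mass2)^2."""
        return norm / (1.0 + Q2 / mass2) ** 2

    rng = np.random.default_rng(3)
    Q2 = np.linspace(0.1, 6.0, 30)              # GeV^2
    sigma = 0.02 * np.ones_like(Q2)
    data = dipole(Q2, 1.0, 0.71) + rng.normal(0.0, sigma)

    popt, pcov = curve_fit(dipole, Q2, data, p0=[1.0, 0.7], sigma=sigma,
                           absolute_sigma=True)
    perr = np.sqrt(np.diag(pcov))
    corr = pcov[0, 1] / (perr[0] * perr[1])
    print(f"norm  = {popt[0]:.3f} +/- {perr[0]:.3f}")
    print(f"mass2 = {popt[1]:.3f} +/- {perr[1]:.3f} GeV^2, corr = {corr:.2f}")
    ```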

  19. Solar Irradiances Measured using SPN1 Radiometers: Uncertainties and Clues for Development

    SciTech Connect (OSTI)

    Badosa, Jordi; Wood, John; Blanc, Philippe; Long, Charles N.; Vuilleumier, Laurent; Demengel, Dominique; Haeffelin, Martial

    2014-12-08

    The rapid development of solar radiation and energy applications, such as photovoltaic and solar thermodynamic systems, has increased the need for solar radiation measurement and monitoring, not only of the global component but also of the diffuse and direct components. End users look for the best compromise between getting close to state-of-the-art measurements and keeping capital, maintenance, and operating costs to a minimum. Among the existing commercial options, the SPN1 is a relatively low-cost solar radiometer that estimates global and diffuse solar irradiances from seven thermopile sensors under a shading mask, with no moving parts. This work presents a comprehensive study of SPN1 accuracy and sources of uncertainty, drawing on laboratory experiments, numerical modeling, and comparison studies between measurements from this sensor and state-of-the-art instruments at six diverse sites. Several clues are provided for improving SPN1 accuracy and agreement with state-of-the-art measurements.

  20. Computing quality scores and uncertainty for approximate pattern matching in geospatial semantic graphs

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Stracuzzi, David John; Brost, Randolph C.; Phillips, Cynthia A.; Robinson, David G.; Wilson, Alyson G.; Woodbridge, Diane M. -K.

    2015-09-26

    Geospatial semantic graphs provide a robust foundation for representing and analyzing remote sensor data. In particular, they support a variety of pattern search operations that capture the spatial and temporal relationships among the objects and events in the data. However, in the presence of large data corpora, even a carefully constructed search query may return a large number of unintended matches. This work considers the problem of calculating a quality score for each match to the query, given that the underlying data are uncertain. As a result, we present a preliminary evaluation of three methods for determining both match quality scores and associated uncertainty bounds, illustrated in the context of an example based on overhead imagery data.

  1. Computing quality scores and uncertainty for approximate pattern matching in geospatial semantic graphs

    SciTech Connect (OSTI)

    Stracuzzi, David John; Brost, Randolph C.; Phillips, Cynthia A.; Robinson, David G.; Wilson, Alyson G.; Woodbridge, Diane M. -K.

    2015-09-26

    Geospatial semantic graphs provide a robust foundation for representing and analyzing remote sensor data. In particular, they support a variety of pattern search operations that capture the spatial and temporal relationships among the objects and events in the data. However, in the presence of large data corpora, even a carefully constructed search query may return a large number of unintended matches. This work considers the problem of calculating a quality score for each match to the query, given that the underlying data are uncertain. As a result, we present a preliminary evaluation of three methods for determining both match quality scores and associated uncertainty bounds, illustrated in the context of an example based on overhead imagery data.

  2. Formulating Energy Policies Related to Fossil Fuel Use: Critical Uncertainties in the Global Carbon Cycle

    DOE R&D Accomplishments [OSTI]

    Post, W. M.; Dale, V. H.; DeAngelis, D. L.; Mann, L. K.; Mulholland, P. J.; O`Neill, R. V.; Peng, T. -H.; Farrell, M. P.

    1990-02-01

    The global carbon cycle is the dynamic interaction among the earth's carbon sources and sinks. Four reservoirs can be identified: the atmosphere, the terrestrial biosphere, the oceans, and sediments. Atmospheric CO{sub 2} concentration is determined by the characteristics of carbon fluxes among the major reservoirs of the global carbon cycle. The objective of this paper is to document the knowns, unknowns, and uncertainties associated with key questions that, if answered, will increase the understanding of the portion of past, present, and future atmospheric CO{sub 2} attributable to fossil fuel burning. Documented increases in atmospheric CO{sub 2} levels are thought to result primarily from fossil fuel use and, perhaps, deforestation. However, the observed atmospheric CO{sub 2} increase is less than expected from current understanding of the global carbon cycle because of poorly understood interactions among the major carbon reservoirs.

  3. Uncertainties in the characterization of the thermal environment of a solid rocket propellant fire

    SciTech Connect (OSTI)

    Diaz, J.C.

    1993-10-01

    There has been interest in developing models capable of predicting the response of systems to Minuteman (MM) III third-stage solid propellant fires. Input parameters for such an effort include the boundary conditions that describe the fire temperature, heat flux, emissivity, and propellant burn rate. In this study, scanning spectroscopy and pyrometry were used to infer plume temperatures. Each diagnostic system possessed strengths and weaknesses. The intention was to use several mutually supporting methods to infer plume temperature and emissivity, because no single diagnostic had proven capabilities for determining temperature under these conditions. Furthermore, these diagnostics were being used near the limit of their applicability. All of these factors introduced some uncertainty into the data collected.

  4. REVIEW OF MECHANISTIC UNDERSTANDING AND MODELING AND UNCERTAINTY ANALYSIS METHODS FOR PREDICTING CEMENTITIOUS BARRIER PERFORMANCE

    SciTech Connect (OSTI)

    Langton, C.; Kosson, D.

    2009-11-30

    Cementitious barriers for nuclear applications are one of the primary controls for preventing or limiting radionuclide release into the environment. At present, performance and risk assessments do not fully incorporate the effectiveness of engineered barriers because the processes that influence performance are coupled and complicated. Better understanding of the behavior of cementitious barriers is necessary to evaluate and improve the design of materials and structures used for radioactive waste containment, life extension of current nuclear facilities, and design of future nuclear facilities, including those needed for nuclear fuel storage and processing, nuclear power production, and waste management. The focus of the Cementitious Barriers Partnership (CBP) literature review is to document the current level of knowledge with respect to: (1) mechanisms and processes that directly influence the performance of cementitious materials, (2) methodologies for modeling the performance of these mechanisms and processes, and (3) approaches to addressing and quantifying uncertainties associated with performance predictions. This will serve as an important reference document for the professional community responsible for the design and performance assessment of cementitious materials in nuclear applications. The review also provides a multidisciplinary foundation for the identification, research, development, and demonstration of improvements in conceptual understanding, measurements, and performance modeling that would lead to significant reductions in uncertainty and improved confidence in estimating the long-term performance of cementitious materials in nuclear applications. The report identifies: (1) technology gaps that may be filled by the CBP project and (2) information and computational methods that are currently applied in related fields but have not yet been incorporated into performance assessments of cementitious barriers. The various chapters contain both a description of each mechanism or process and a discussion of the current approaches to modeling the phenomena.

  5. FRAP-T6 uncertainty study of LOCA tests LOFT L2-3 and PBF LLR-3. [PWR]

    SciTech Connect (OSTI)

    Chambers, R.; Driskell, W.E.; Resch, S.C.

    1983-01-01

    This paper presents the accuracy and uncertainty of fuel rod behavior calculations performed by the transient Fuel Rod Analysis Program (FRAP-T6) during large break loss-of-coolant accidents. The accuracy of the code was determined primarily through comparisons of code calculations with cladding surface temperature measurements from two loss-of-coolant experiments (LOCEs). These LOCEs were the L2-3 experiment conducted in the Loss-of-Fluid Test (LOFT) Facility and the LOFT Lead Rod 3 (LLR-3) experiment conducted in the Power Burst Facility (PBF). Uncertainties in code calculations resulting from uncertainties in fuel and cladding design variables, material property and heat transfer correlations, and thermal-hydraulic boundary conditions were analyzed.

  6. Quantitative Analysis of Variability and Uncertainty in Environmental Data and Models. Volume 1. Theory and Methodology Based Upon Bootstrap Simulation

    SciTech Connect (OSTI)

    Frey, H. Christopher; Rhodes, David S.

    1999-04-30

    This is Volume 1 of a two-volume set of reports describing work conducted at North Carolina State University sponsored by Grant Number DE-FG05-95ER30250 by the U.S. Department of Energy. The title of the project is Quantitative Analysis of Variability and Uncertainty in Acid Rain Assessments. The work conducted under sponsorship of this grant pertains primarily to two main topics: (1) development of new methods for quantitative analysis of variability and uncertainty applicable to any type of model; and (2) analysis of variability and uncertainty in the performance, emissions, and cost of electric power plant combustion-based NOx control technologies. These two main topics are reported separately in Volumes 1 and 2.
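
    A minimal sketch of the report's core tool, bootstrap simulation, is given below: resampling an observed data set with replacement to obtain a confidence interval for a statistic. The data here are synthetic, and the statistic (the mean) is chosen purely for simplicity.

    ```python
    # Bootstrap confidence interval for the mean of a synthetic data set.
    import numpy as np

    rng = np.random.default_rng(7)
    data = rng.lognormal(mean=0.0, sigma=0.5, size=50)  # e.g., measured emissions

    n_boot = 5000
    means = np.empty(n_boot)
    for b in range(n_boot):
        # Resample the data with replacement and recompute the statistic.
        resample = rng.choice(data, size=data.size, replace=True)
        means[b] = resample.mean()

    lo, hi = np.percentile(means, [2.5, 97.5])
    print(f"sample mean {data.mean():.3f}, 95% bootstrap CI [{lo:.3f}, {hi:.3f}]")
    ```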

  7. Spatially Resolved Estimation of Ozone-related Mortality in the United States under Two Representative Concentration Pathways (RCPs) and their Uncertainty

    SciTech Connect (OSTI)

    Kim, Young-Min; Zhou, Ying; Gao, Yang; Fu, Joshua S.; Johnson, Brent; Huang, Cheng; Liu, Yang

    2015-01-01

    BACKGROUND: The spatial pattern of uncertainty in the health impacts of climate-driven air pollution change has rarely been studied, due to the lack of high-resolution model simulations, especially under the latest Representative Concentration Pathways (RCPs). OBJECTIVES: We estimated county-level ozone (O3) and PM2.5 related excess mortality (EM) and evaluated the associated uncertainties in the continental United States in the 2050s under RCP4.5 and RCP8.5. METHODS: Using dynamically downscaled climate model simulations, we calculated changes in O3 and PM2.5 levels at 12 km resolution between the future (2057-2059) and present (2001-2004) under the two RCP scenarios. Using concentration-response relationships from the literature and projected future populations, we estimated EM attributable to the changes in O3 and PM2.5. Finally, we analyzed the contribution of input variables to the uncertainty in the county-level EM estimates using Monte Carlo simulation. RESULTS: O3-related premature deaths in the continental U.S. were estimated to be 1,082 deaths/year under RCP8.5 (95% confidence interval (CI): -288 to 2,453) and -5,229 deaths/year under RCP4.5 (-7,212 to -3,246). Simulated PM2.5 changes resulted in a significant decrease in EM under the two RCPs. The uncertainty of the O3-related EM estimates was mainly caused by the RCP scenarios, whereas that of the PM2.5-related EMs came mainly from the concentration-response functions. CONCLUSION: EM estimates attributable to climate change-induced air pollution change, as well as the associated uncertainties, vary substantially in space, as do the most influential input variables. Spatially resolved data are crucial to developing effective mitigation and adaptation policy.
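
    The Monte Carlo step can be sketched with the standard log-linear health impact function EM = (1 - exp(-beta * dX)) * y0 * pop; all parameter values below are illustrative assumptions, not the study's inputs.

    ```python
    # Monte Carlo propagation of concentration-response and concentration
    # uncertainty into excess mortality (EM); values are illustrative.
    import numpy as np

    rng = np.random.default_rng(11)
    n = 50_000

    beta = rng.normal(0.00052, 0.00017, n)   # CR coefficient per ppb O3
    d_o3 = rng.normal(3.0, 1.5, n)           # projected change in O3 (ppb)
    y0 = 0.008                               # baseline mortality rate (1/yr)
    pop = 1.0e6                              # exposed population

    em = (1.0 - np.exp(-beta * d_o3)) * y0 * pop
    lo, med, hi = np.percentile(em, [2.5, 50.0, 97.5])
    print(f"EM: median {med:.0f}, 95% CI [{lo:.0f}, {hi:.0f}] deaths/yr")
    ```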

  8. Estimating U.S. Methane Emissions from the Natural Gas Supply Chain. Approaches, Uncertainties, Current Estimates, and Future Studies

    SciTech Connect (OSTI)

    Heath, Garvin; Warner, Ethan; Steinberg, Daniel; Brandt, Adam

    2015-08-01

    A growing number of studies have raised questions regarding uncertainties in our understanding of methane (CH4) emissions from fugitives and venting along the natural gas (NG) supply chain. In particular, a number of measurement studies have suggested that actual levels of CH4 emissions may be higher than estimated by EPA's U.S. GHG Emission Inventory. We reviewed this literature to characterize estimation approaches and their uncertainties, summarize current estimates, and identify needs for future studies.

  9. Uncertainty Quantification in the Reliability and Risk Assessment of Generation IV Reactors: Final Scientific/Technical Report

    SciTech Connect (OSTI)

    Vierow, Karen; Aldemir, Tunc

    2009-09-10

    The project, "Uncertainty Quantification in the Reliability and Risk Assessment of Generation IV Reactors," was conducted as a DOE NERI collaboration between Texas A&M University and The Ohio State University between March 2006 and June 2009. The overall goal of the project was to develop practical approaches and tools by which dynamic reliability and risk assessment techniques can be used to augment the uncertainty quantification process in probabilistic risk assessment (PRA) methods and PRA applications for Generation IV reactors. This report is the Final Scientific/Technical Report summarizing the project.

  10. Determination of electron beam polarization using electron detector in Compton polarimeter with less than 1% statistical and systematic uncertainty

    SciTech Connect (OSTI)

    Narayan, Amrendra

    2015-05-01

    The Q-weak experiment aims to measure the weak charge of the proton with a precision of 4.2%. The proposed precision on the weak charge required a 2.5% measurement of the parity-violating asymmetry in elastic electron-proton scattering. Polarimetry was the largest experimental contribution to this uncertainty, and a new Compton polarimeter was installed in Hall C at Jefferson Lab to make the goal achievable. In this polarimeter the electron beam collides with green laser light in a low-gain Fabry-Perot cavity; the scattered electrons are detected in four planes of a novel diamond micro-strip detector, while the backscattered photons are detected in lead tungstate crystals. This diamond micro-strip detector is the first such device to be used as a tracking detector in a nuclear or particle physics experiment. The diamond detectors are read out using custom-built electronic modules that include a preamplifier, a pulse-shaping amplifier, and a discriminator for each detector micro-strip. We use field-programmable gate array (FPGA) based general-purpose logic modules for event selection and histogramming. Extensive Monte Carlo simulations and data acquisition simulations were performed to estimate the systematic uncertainties. Additionally, the Moller and Compton polarimeters were cross-calibrated at low electron beam currents using a series of interleaved measurements. In this dissertation, we describe all the subsystems of the Compton polarimeter, with emphasis on the electron detector. We focus on the FPGA-based data acquisition system built by the author and the data analysis methods implemented by the author. The simulations of the data acquisition and the polarimeter that helped rigorously establish the systematic uncertainties are also elaborated, resulting in the first sub-1% measurement of low-energy (~1 GeV) electron beam polarization with a Compton electron detector. We have demonstrated that diamond-based micro-strip detectors can be used for tracking in a high-radiation environment, enabling the desired precision in the measurement of the electron beam polarization, which in turn has allowed the most precise determination of the weak charge of the proton.

  11. Wind Energy Management System EMS Integration Project: Incorporating Wind Generation and Load Forecast Uncertainties into Power Grid Operations

    SciTech Connect (OSTI)

    Makarov, Yuri V.; Huang, Zhenyu; Etingov, Pavel V.; Ma, Jian; Guttromson, Ross T.; Subbarao, Krishnappa; Chakrabarti, Bhujanga B.

    2010-01-01

    The power system balancing process, which includes the scheduling, real-time dispatch (load following), and regulation processes, is traditionally based on deterministic models. Since conventional generation needs time to be committed and dispatched to a desired megawatt level, the scheduling and load-following processes use load, wind, and solar power production forecasts to achieve a future balance between conventional generation and energy storage on the one side, and system load, intermittent resources (such as wind and solar generation), and scheduled interchange on the other. Although in practice the forecasting procedures imply some uncertainty around the load and wind/solar forecasts (caused by forecast errors), only the mean values are actually used in the generation dispatch and commitment procedures. Since the actual load and intermittent generation can deviate from their forecasts, it becomes increasingly unclear (especially with the increasing penetration of renewable resources) whether the system will actually be able to meet the conventional generation requirements within the look-ahead horizon, what additional balancing effort will be needed closer to real time, and what additional costs will be incurred by those needs. To improve the system control performance characteristics, maintain system reliability, and minimize expenses related to the system balancing functions, it becomes necessary to incorporate the predicted uncertainty ranges into the scheduling and load-following processes and, to some extent, into the regulation process. It is also important to address the uncertainty problem comprehensively by including all sources of uncertainty (load, intermittent generation, generator forced outages, etc.). All aspects of uncertainty, such as the imbalance size (which equals the capacity needed to mitigate the imbalance) and the generation ramping requirement, must be taken into account. These features make this work a significant step toward incorporating wind, solar, load, and other uncertainties into power system operations. Currently, uncertainties associated with wind and load forecasts, as well as those associated with random generator outages and unexpected disconnection of supply lines, are not taken into account in power grid operation; thus, operators have little means to weigh the likelihood and magnitude of upcoming power imbalance events. In this project, funded by the U.S. Department of Energy (DOE), a framework has been developed for incorporating uncertainties associated with wind and load forecast errors, unpredicted ramps, and forced generation disconnections into the energy management system (EMS) as well as generation dispatch and commitment applications. A new approach is proposed to evaluate the uncertainty ranges for the required generation performance envelope, including balancing capacity, ramping capability, and ramp duration. The approach includes three stages: forecast and actual data acquisition, statistical analysis of retrospective information, and prediction of future grid balancing requirements for specified time horizons and confidence levels. Assessment of the capacity and ramping requirements is performed using a specially developed probabilistic algorithm based on histogram analysis, incorporating all sources of uncertainty of both continuous (wind and load forecast errors) and discrete (forced generator outages and start-up failures) nature. A new method, called the "flying brick" technique, has been developed to evaluate the look-ahead required generation performance envelope for the worst-case scenario within a user-specified confidence level. A self-validation algorithm has been developed to validate the accuracy of the confidence intervals.
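
    A stripped-down sketch of the histogram-based idea follows: combine continuous forecast-error samples with a discrete outage component and read a balancing-capacity envelope off the percentiles for a chosen confidence level. The distributions and the single-unit outage model are illustrative assumptions; the flying brick technique additionally covers ramping capability and ramp duration.

    ```python
    # Percentile-based balancing-capacity envelope from combined continuous
    # and discrete uncertainty sources; all numbers are illustrative.
    import numpy as np

    rng = np.random.default_rng(5)
    n = 200_000

    load_err = rng.normal(0.0, 150.0, n)   # MW, load forecast error
    wind_err = rng.normal(0.0, 120.0, n)   # MW, wind forecast error
    # Discrete component: a 600 MW unit with a 1% chance of forced outage.
    outage = np.where(rng.random(n) < 0.01, 600.0, 0.0)

    imbalance = load_err + wind_err + outage  # MW to be balanced

    conf = 0.95
    lo, hi = np.percentile(imbalance, [(1 - conf) / 2 * 100, (1 + conf) / 2 * 100])
    print(f"{conf:.0%} balancing envelope: {lo:.0f} MW (down) to {hi:.0f} MW (up)")
    ```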

  12. Uncertainty in Modeling Dust Mass Balance and Radiative Forcing from Size Parameterization

    SciTech Connect (OSTI)

    Zhao, Chun; Chen, Siyu; Leung, Lai-Yung R.; Qian, Yun; Kok, Jasper; Zaveri, Rahul A.; Huang, J.

    2013-11-05

    This study examines the uncertainties in simulating the mass balance and radiative forcing of mineral dust that arise from biases in the aerosol size parameterization. Simulations are conducted quasi-globally (180°W-180°E and 60°S-70°N) using the WRF-Chem model with three different approaches to representing the aerosol size distribution (8-bin, 4-bin, and 3-mode). The biases of the 3-mode and 4-bin approaches against the relatively more accurate 8-bin approach in simulating dust mass balance and radiative forcing are identified. Compared to the 8-bin approach, the 4-bin approach simulates similar but coarser size distributions of dust particles in the atmosphere, while the 3-mode approach retains more fine dust particles but fewer coarse dust particles due to the prescribed geometric standard deviation of each mode. Although the 3-mode approach yields up to 10 days longer dust mass lifetime over remote oceanic regions than the 8-bin approach, the three size approaches produce similar dust mass lifetimes (3.2 to 3.5 days) on quasi-global average, reflecting that the global dust mass lifetime is mainly determined by the lifetime near the dust source regions. With the same global dust emission (~6000 Tg yr-1), the 8-bin approach produces a dust mass loading of 39 Tg, while the 4-bin and 3-mode approaches produce 3% (40.2 Tg) and 25% (49.1 Tg) higher dust mass loadings, respectively. The difference in dust mass loading between the 8-bin approach and the 4-bin or 3-mode approaches has large spatial variations, with generally smaller relative differences (<10%) near the surface over the dust source regions. The three size approaches also result in significantly different dry and wet deposition fluxes and number concentrations of dust. The difference in dust aerosol optical depth (AOD) (a factor of 3) among the three size approaches is much larger than their difference (25%) in dust mass loading. Compared to the 8-bin approach, the 4-bin approach yields stronger dust absorptivity, while the 3-mode approach yields weaker dust absorptivity. Overall, on quasi-global average, the three size parameterizations result in a significant difference of a factor of 2-3 in dust surface cooling (-1.02 to -2.87 W m-2) and atmospheric warming (0.39 to 0.96 W m-2), and a difference of a factor of ~10 in dust top-of-atmosphere (TOA) cooling (-0.24 to -2.20 W m-2). An uncertainty of a factor of 2 is quantified in dust emission estimation due to the different size parameterizations. This study highlights the uncertainties in modeled dust mass and number loading, deposition fluxes, and radiative forcing resulting from different size parameterizations, and motivates further investigation of the impact of size parameterizations on modeling dust effects on air quality, climate, and ecosystems.

  13. Addressing Uncertainty in Design Inputs: A Case Study of Probabilistic Settlement Evaluations for Soft Zone Collapse at SWPF

    Office of Environmental Management (EM)

    Addressing Uncertainties in Design Inputs: A Case Study of Probabilistic Settlement Evaluations for Soft Zone Collapse at SWPF. Tom Houston, Greg Mertz, Carl Costantino, Michael Costantino, Andrew Maham (Carl J. Costantino & Associates). DOE NPH Conference, Germantown, Maryland, October 25-26, 2011. Topics: the SWPF settlement problem; deterministic versus probabilistic approaches to settlement profile development; analysis approach; parameters considered.

  14. RECOVERY ACT - Methods for Decision under Technological Change Uncertainty and Risk Assessment for Integrated Assessment of Climate Change

    SciTech Connect (OSTI)

    Webster, Mort David

    2015-03-10

    This report presents the final outcomes and products of the project as performed at the Massachusetts Institute of Technology. The research project consists of three main components: methodology development for decision-making under uncertainty, improving the resolution of the electricity sector to improve integrated assessment, and application of these methods to integrated assessment. Results in each area are described in the report.

  15. Addressing Uncertainties in Design Inputs: A Case Study of Probabilistic Settlement Evaluations for Soft Zone Collapse at SWPF

    Broader source: Energy.gov [DOE]

    Addressing Uncertainties in Design Inputs: A Case Study of Probabilistic Settlement Evaluations for Soft Zone Collapse at SWPF Tom Houston, Greg Mertz, Carl Costantino, Michael Costantino, Andrew Maham Carl J. Costantino & Associates DOE NPH Conference Germantown, Maryland October 25-26 2011

  16. Sensitivity of Surface Flux Simulations to Hydrologic Parameters Based on an Uncertainty Quantification Framework Applied to the Community Land Model

    SciTech Connect (OSTI)

    Hou, Zhangshuan; Huang, Maoyi; Leung, Lai-Yung R.; Lin, Guang; Ricciuto, Daniel M.

    2012-08-10

    Uncertainties in hydrologic parameters could have significant impacts on the simulated water and energy fluxes and land surface states, which will in turn affect atmospheric processes and the carbon cycle. Quantifying such uncertainties is an important step toward better understanding and quantification of uncertainty in integrated earth system models. In this paper, we introduce an uncertainty quantification (UQ) framework to analyze the sensitivity of simulated surface fluxes to selected hydrologic parameters in the Community Land Model (CLM4) through forward modeling. Thirteen flux tower footprints spanning a wide range of climate and site conditions were selected to perform sensitivity analyses by perturbing the identified parameters. In the UQ framework, prior information about the parameters was used to quantify the input uncertainty using the Minimum-Relative-Entropy approach. The quasi-Monte Carlo approach was applied to generate samples of parameters on the basis of the prior pdfs. Simulations corresponding to the sampled parameter sets were used to generate response curves and response surfaces, and statistical tests were used to rank the significance of the parameters for output responses, including latent heat (LH) and sensible heat (SH) fluxes. Overall, the CLM4-simulated LH and SH show the largest sensitivity to subsurface runoff generation parameters. However, study sites with deep-rooted vegetation are also affected by surface runoff parameters, while sites with shallow root zones are also sensitive to the vadose zone soil water parameters. Generally, sites with finer soil texture and shallower rooting systems tend to have larger sensitivity of outputs to the parameters. Our results suggest the necessity of, and possible ways for, parameter inversion/calibration using available measurements of latent/sensible heat fluxes to obtain the optimal parameter set for CLM4. This study also provided guidance on reducing the dimensionality of the parameter set and on the design of a parameter calibration framework for CLM4 and other land surface models under different hydrologic and climatic regimes.
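
    A minimal sketch of the ranking step follows, using Spearman rank correlation between sampled parameters and a model output. The parameter names and the stand-in response are hypothetical, and the study's statistical testing is richer than a single correlation measure.

    ```python
    # Ranking parameter significance by Spearman rank correlation between
    # sampled inputs and a (hypothetical) latent-heat response.
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(2)
    n = 500
    params = {
        "subsurface_runoff": rng.uniform(0.0, 1.0, n),
        "surface_runoff":    rng.uniform(0.0, 1.0, n),
        "vadose_soil_water": rng.uniform(0.0, 1.0, n),
    }
    X = np.column_stack(list(params.values()))

    # Stand-in response dominated by the first parameter, plus noise.
    lh = 3.0 * X[:, 0] + 0.8 * X[:, 1] + 0.3 * X[:, 2] + rng.normal(0, 0.5, n)

    ranking = sorted(((abs(spearmanr(X[:, i], lh)[0]), name)
                      for i, name in enumerate(params)), reverse=True)
    for rho, name in ranking:
        print(f"{name:18s} |rho| = {rho:.2f}")
    ```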

  17. Uncertainty Quantification in CO{sub 2} Sequestration Using Surrogate Models from Polynomial Chaos Expansion

    SciTech Connect (OSTI)

    Zhang, Yan; Sahinidis, Nikolaos V.

    2013-04-06

    In this paper, surrogate models are iteratively built using polynomial chaos expansion (PCE) and detailed numerical simulations of a carbon sequestration system. Output variables from a numerical simulator are approximated as polynomial functions of uncertain parameters. Once generated, PCE representations can be used in place of the numerical simulator and often decrease simulation times by several orders of magnitude. However, PCE models are expensive to derive unless the number of terms in the expansion is moderate, which requires a relatively small number of uncertain variables and a low degree of expansion. To cope with this limitation, instead of using a classical full expansion at each step of an iterative PCE construction method, we introduce a mixed-integer programming (MIP) formulation to identify the best subset of basis terms, which keeps the expansion small. Monte Carlo (MC) simulation is then performed by substituting the values of the uncertain parameters into the closed-form polynomial functions. Based on the results of MC simulation, the uncertainties of injecting CO{sub 2} underground are quantified for a saline aquifer. Moreover, based on the PCE model, we formulate an optimization problem to determine the optimal CO{sub 2} injection rate so as to maximize the gas saturation (residual trapping) during injection, and thereby minimize the chance of leakage.
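
    The sketch below builds a small PCE surrogate by ordinary least squares on a tensor-product Legendre basis (appropriate for uniform inputs) and then runs cheap Monte Carlo on the surrogate. It omits the paper's mixed-integer basis selection; the `expensive_model` stand-in and the degree choices are assumptions.

    ```python
    # Least-squares polynomial chaos surrogate on a Legendre basis, then
    # Monte Carlo on the cheap surrogate instead of the simulator.
    import numpy as np
    from numpy.polynomial import legendre
    from itertools import product

    def expensive_model(x):  # stand-in for the reservoir simulator
        return np.exp(0.3 * x[0]) * (1.0 + 0.5 * x[1] ** 2)

    rng = np.random.default_rng(4)
    X = rng.uniform(-1.0, 1.0, size=(200, 2))   # training inputs on [-1,1]^2
    y = np.apply_along_axis(expensive_model, 1, X)

    # Tensor-product Legendre basis up to degree 3 in each input.
    degs = list(product(range(4), repeat=2))

    def basis_matrix(X):
        cols = []
        for dx, dy in degs:
            cx = legendre.legval(X[:, 0], np.eye(4)[dx])
            cy = legendre.legval(X[:, 1], np.eye(4)[dy])
            cols.append(cx * cy)
        return np.column_stack(cols)

    coef, *_ = np.linalg.lstsq(basis_matrix(X), y, rcond=None)

    # Monte Carlo on the closed-form surrogate.
    Xmc = rng.uniform(-1.0, 1.0, size=(100_000, 2))
    ymc = basis_matrix(Xmc) @ coef
    print(f"surrogate mean {ymc.mean():.4f}, std {ymc.std():.4f}")
    ```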

  18. Dark matter vs. neutrinos: the effect of astrophysical uncertainties and timing information on the neutrino floor

    SciTech Connect (OSTI)

    Davis, Jonathan H.

    2015-03-09

    Future multi-tonne Direct Detection experiments will be sensitive to solar neutrino induced nuclear recoils which form an irreducible background to light Dark Matter searches. Indeed for masses around 6 GeV the spectra of neutrinos and Dark Matter are so similar that experiments are said to run into a neutrino floor, for which sensitivity increases only marginally with exposure past a certain cross section. In this work we show that this floor can be overcome using the different annual modulation expected from solar neutrinos and Dark Matter. Specifically for cross sections below the neutrino floor the DM signal is observable through a phase shift and a smaller amplitude for the time-dependent event rate. This allows the exclusion power to be improved by up to an order of magnitude for large exposures. In addition we demonstrate that, using only spectral information, the neutrino floor exists over a wider mass range than has been previously shown, since the large uncertainties in the Dark Matter velocity distribution make the signal spectrum harder to distinguish from the neutrino background. However for most velocity distributions it can still be surpassed using timing information, and so the neutrino floor is not an absolute limit on the sensitivity of Direct Detection experiments.

  19. Optimal Control of Distributed Energy Resources and Demand Response under Uncertainty

    SciTech Connect (OSTI)

    Siddiqui, Afzal; Stadler, Michael; Marnay, Chris; Lai, Judy

    2010-06-01

    We take the perspective of a microgrid that has installed distributed energy resources (DER) in the form of distributed generation with combined heat and power applications. Given uncertain electricity and fuel prices, the microgrid minimizes its expected annual energy bill for various capacity sizes. In almost all cases, there is an economic and environmental advantage to using DER in conjunction with demand response (DR): the expected annualized energy bill is reduced by 9 percent while CO2 emissions decline by 25 percent. Furthermore, the microgrid's risk is diminished as DER may be deployed depending on prevailing market conditions and local demand. In order to test a policy measure that would place a weight on CO2 emissions, we use a multi-criteria objective function that minimizes a weighted average of expected costs and emissions. We find that greater emphasis on CO2 emissions has a beneficial environmental impact only if DR is available and enough reserve generation capacity exists. Finally, greater uncertainty results in higher expected costs and risk exposure, the effects of which may be mitigated by selecting a larger capacity.

  20. Atmospheric Carbon Dioxide and the Global Carbon Cycle: The Key Uncertainties

    DOE R&D Accomplishments [OSTI]

    Peng, T. H.; Post, W. M.; DeAngelis, D. L.; Dale, V. H.; Farrell, M. P.

    1987-12-01

    The biogeochemical cycling of carbon between its sources and sinks determines the rate of increase in atmospheric CO{sub 2} concentrations. The observed increase in atmospheric CO{sub 2} content is less than the estimated release from fossil fuel consumption and deforestation. This discrepancy can be explained by interactions between the atmosphere and other global carbon reservoirs, such as the oceans and the terrestrial biosphere, including soils. Undoubtedly, the oceans have been the most important sinks for CO{sub 2} produced by man. But the physical, chemical, and biological processes of the oceans are complex, and credible estimates of CO{sub 2} uptake can therefore probably only come from mathematical models. Unfortunately, one- and two-dimensional ocean models do not allow for enough CO{sub 2} uptake to accurately account for known releases; thus, they produce higher concentrations of atmospheric CO{sub 2} than was historically the case. More complex three-dimensional models, currently under development, may make better use of existing tracer data than do one- and two-dimensional models, and will also incorporate climate feedback effects to provide a more realistic view of ocean dynamics and CO{sub 2} fluxes. The inability of current models to estimate oceanic uptake of CO{sub 2} accurately creates one of the key uncertainties in predictions of atmospheric CO{sub 2} increases and climate responses over the next 100 to 200 years.

  1. Modeling of Uncertainties in Major Drivers in U.S. Electricity Markets: Preprint

    SciTech Connect (OSTI)

    Short, W.; Ferguson, T.; Leifman, M.

    2006-09-01

    This paper presents information on the Stochastic Energy Deployment System (SEDS) model. DOE and NREL are developing this new model, intended to address many of the shortcomings of the current suite of energy models. Once fully built, the salient qualities of SEDS will include full probabilistic treatment of the major uncertainties in national energy forecasts; code compactness for desktop application; user-friendly interface for a reasonably trained analyst; run-time within limits acceptable for quick-response analysis; choice of detailed or aggregate representations; and transparency of design, code, and assumptions. Moreover, SEDS development will be increasingly collaborative, as DOE and NREL will be coordinating with multiple national laboratories and other institutions, making SEDS nearly an 'open source' project. The collaboration will utilize the best expertise on specific sectors and problems, and also allow constant examination and review of the model. This paper outlines the rationale for this project and a description of its alpha version, as well as some example results. It also describes some of the expected development efforts in SEDS.

  2. DAKOTA, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis: version 4.0 reference manual

    SciTech Connect (OSTI)

    Griffin, Joshua D. (Sandia National Labs, Livermore, CA); Eldred, Michael Scott; Martinez-Canales, Monica L.; Watson, Jean-Paul; Kolda, Tamara Gibson; Adams, Brian M.; Swiler, Laura Painton; Williams, Pamela J.; Hough, Patricia Diane; Gay, David M.; Dunlavy, Daniel M.; Eddy, John P.; Hart, William Eugene; Giunta, Anthony A.; Brown, Shannon L.

    2006-10-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a reference manual for the commands specification for the DAKOTA software, providing input overviews, option descriptions, and example specifications.

  3. 2009 Technical Risk and Uncertainty Analysis of the U.S. Department of Energy's Solar Energy Technologies Program Concentrating Solar Power and Photovoltaics R&D

    SciTech Connect (OSTI)

    McVeigh, J.; Lausten, M.; Eugeni, E.; Soni, A.

    2010-11-01

    The U.S. Department of Energy (DOE) Solar Energy Technologies Program (SETP) conducted a 2009 Technical Risk and Uncertainty Analysis to better assess its cost goals for concentrating solar power (CSP) and photovoltaic (PV) systems, and to potentially rebalance its R&D portfolio. This report details the methodology, schedule, and results of this technical risk and uncertainty analysis.

  4. Reducing uncertainty in high-resolution sea ice models.

    SciTech Connect (OSTI)

    Peterson, Kara J.; Bochev, Pavel Blagoveston

    2013-07-01

    Arctic sea ice is an important component of the global climate system, reflecting a significant amount of solar radiation, insulating the ocean from the atmosphere and influencing ocean circulation by modifying the salinity of the upper ocean. The thickness and extent of Arctic sea ice have shown a significant decline in recent decades with implications for global climate as well as regional geopolitics. Increasing interest in exploration as well as climate feedback effects make predictive mathematical modeling of sea ice a task of tremendous practical import. Satellite data obtained over the last few decades have provided a wealth of information on sea ice motion and deformation. The data clearly show that ice deformation is focused along narrow linear features and this type of deformation is not well-represented in existing models. To improve sea ice dynamics, we have incorporated an anisotropic rheology into the Los Alamos National Laboratory global sea ice model, CICE. Sensitivity analyses were performed using the Design Analysis Kit for Optimization and Terascale Applications (DAKOTA) to determine the impact of material parameters on sea ice response functions. Two material strength parameters that exhibited the most significant impact on responses were further analyzed to evaluate their influence on quantitative comparisons between model output and data. The sensitivity analysis along with ten-year model runs indicate that while the anisotropic rheology provides some benefit in velocity predictions, additional improvements are required to make this material model a viable alternative for global sea ice simulations.

  5. Preliminary performance assessment for the Waste Isolation Pilot Plant, December 1992. Volume 5, Uncertainty and sensitivity analyses of gas and brine migration for undisturbed performance

    SciTech Connect (OSTI)

    Not Available

    1993-08-01

    Before disposing of transuranic radioactive waste in the Waste Isolation Pilot Plant (WIPP), the United States Department of Energy (DOE) must evaluate compliance with applicable long-term regulations of the United States Environmental Protection Agency (EPA). Sandia National Laboratories is conducting iterative performance assessments (PAs) of the WIPP for the DOE to provide interim guidance while preparing for a final compliance evaluation. This volume of the 1992 PA contains results of uncertainty and sensitivity analyses with respect to migration of gas and brine from the undisturbed repository. Additional information about the 1992 PA is provided in other volumes. Volume 1 contains an overview of WIPP PA and results of a preliminary comparison with 40 CFR 191, Subpart B. Volume 2 describes the technical basis for the performance assessment, including descriptions of the linked computational models used in the Monte Carlo analyses. Volume 3 contains the reference data base and values for input parameters used in consequence and probability modeling. Volume 4 contains uncertainty and sensitivity analyses with respect to the EPA's Environmental Standards for the Management and Disposal of Spent Nuclear Fuel, High-Level and Transuranic Radioactive Wastes (40 CFR 191, Subpart B). Finally, guidance derived from the entire 1992 PA is presented in Volume 6. Results of the 1992 uncertainty and sensitivity analyses indicate that, conditional on the modeling assumptions and the assigned parameter-value distributions, the most important parameters for which uncertainty has the potential to affect gas and brine migration from the undisturbed repository are: initial liquid saturation in the waste, anhydrite permeability, biodegradation-reaction stoichiometry, gas-generation rates for both corrosion and biodegradation under inundated conditions, and the permeability of the long-term shaft seal.

  6. Integrated Risk Assessment for the LaSalle Unit 2 Nuclear Power Plant, Phenomenology and Risk Uncertainty Evaluation Program (PRUEP), MELCOR code calculations. Volume 3

    SciTech Connect (OSTI)

    Shaffer, C.J. [Science and Engineering Associates, Albuquerque, NM (United States); Miller, L.A.; Payne, A.C. Jr.

    1992-10-01

    A Level III Probabilistic Risk Assessment (PRA) has been performed for LaSalle Unit 2 under the Risk Methods Integration and Evaluation Program (RMIEP) and the Phenomenology and Risk Uncertainty Evaluation Program (PRUEP). This report documents the phenomenological calculations and sources of uncertainty in the calculations performed with MELCOR in support of the Level II portion of the PRA. These calculations are an integral part of the Level II analysis since they provide quantitative input to the Accident Progression Event Tree (APET) and Source Term Model (LASSOR). However, the uncertainty associated with the code results must be considered in the use of the results. The MELCOR calculations performed include four integrated calculations: (1) a high-pressure short-term station blackout, (2) a low-pressure short-term station blackout, (3) an intermediate-term station blackout, and (4) a long-term station blackout. Several sensitivity studies investigating the effect of variations in containment failure size and location, as well as hydrogen ignition concentration, are also documented.

  7. Integrated Risk Assessment for the LaSalle Unit 2 Nuclear Power Plant, Phenomenology and Risk Uncertainty Evaluation Program (PRUEP), MELCOR code calculations

    SciTech Connect (OSTI)

    Shaffer, C.J. (Science and Engineering Associates, Albuquerque, NM (United States)); Miller, L.A.; Payne, A.C. Jr.

    1992-10-01

    A Level III Probabilistic Risk Assessment (PRA) has been performed for LaSalle Unit 2 under the Risk Methods Integration and Evaluation Program (RMIEP) and the Phenomenology and Risk Uncertainty Evaluation Program (PRUEP). This report documents the phenomenological calculations and sources of uncertainty in the calculations performed with MELCOR in support of the Level II portion of the PRA. These calculations are an integral part of the Level II analysis since they provide quantitative input to the Accident Progression Event Tree (APET) and Source Term Model (LASSOR). However, the uncertainty associated with the code results must be considered in the use of the results. The MELCOR calculations performed include four integrated calculations: (1) a high-pressure short-term station blackout, (2) a low-pressure short-term station blackout, (3) an intermediate-term station blackout, and (4) a long-term station blackout. Several sensitivity studies investigating the effect of variations in containment failure size and location, as well as hydrogen ignition concentration, are also documented.

  8. PUFF-III: A Code for Processing ENDF Uncertainty Data Into Multigroup Covariance Matrices

    SciTech Connect (OSTI)

    Dunn, M.E.

    2000-06-01

    PUFF-III is an extension of the previous PUFF-II code that was developed in the 1970s and early 1980s. The PUFF codes process the Evaluated Nuclear Data File (ENDF) covariance data and generate multigroup covariance matrices on a user-specified energy grid structure. Unlike its predecessor, PUFF-III can process the new ENDF/B-VI data formats. In particular, PUFF-III has the capability to process the spontaneous fission covariances for fission neutron multiplicity. With regard to the covariance data in File 33 of the ENDF system, PUFF-III has the capability to process short-range variance formats, as well as the lumped reaction covariance data formats that were introduced in ENDF/B-V. In addition to the new ENDF formats, a new directory feature is now available that allows the user to obtain a detailed directory of the uncertainty information in the data files without visually inspecting the ENDF data. Following the correlation matrix calculation, PUFF-III also evaluates the eigenvalues of each correlation matrix and tests each matrix for positive definiteness. Additional new features are discussed in the manual. PUFF-III has been developed for implementation in the AMPX code system, and several modifications were incorporated to improve memory allocation tasks and input/output operations. Consequently, the resulting code has a structure that is similar to other modules in the AMPX code system. With the release of PUFF-III, a new and improved covariance processing code is available to process ENDF covariance formats through Version VI.
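
    The positive-definiteness screen can be illustrated in a few lines: compute the eigenvalues of a correlation matrix and test whether the smallest is positive. The toy matrix below is invented, and the check is done in Python purely for illustration; it mirrors the test PUFF-III applies to each processed multigroup matrix.

    ```python
    # Eigenvalue test for positive definiteness of a (toy) correlation matrix.
    import numpy as np

    corr = np.array([
        [1.00, 0.60, 0.20],
        [0.60, 1.00, 0.45],
        [0.20, 0.45, 1.00],
    ])

    eigvals = np.linalg.eigvalsh(corr)  # symmetric-matrix eigenvalues, ascending
    print("eigenvalues:", np.round(eigvals, 4))
    if eigvals[0] > 0:
        print("matrix is positive definite")
    else:
        print("matrix fails the positive-definiteness test")
    ```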

  9. M dwarf metallicities and giant planet occurrence: Ironing out uncertainties and systematics

    SciTech Connect (OSTI)

    Gaidos, Eric; Mann, Andrew W.

    2014-08-10

    Comparisons between the planet populations around solar-type stars and those orbiting M dwarfs shed light on the possible dependence of planet formation and evolution on stellar mass. However, such analyses must control for other factors, notably metallicity, a stellar parameter that strongly influences the occurrence of gas giant planets. We obtained infrared spectra of 121 M dwarf stars monitored by the California Planet Search and determined metallicities with an accuracy of 0.08 dex. The mean and standard deviation of the sample are 0.05 and 0.20 dex, respectively. We parameterized the metallicity dependence of the occurrence of giant planets on orbits with a period less than two years around solar-type stars and applied this to our M dwarf sample to estimate the expected number of giant planets. The number of detected planets (3) is lower than the predicted number (6.4), but the difference is not very significant (12% probability of finding as many or fewer planets). The three M dwarf planet hosts are not especially metal rich and the most likely value of the power-law index relating planet occurrence to metallicity is 1.06 dex per dex for M dwarfs, compared to 1.80 for solar-type stars; this difference, however, is comparable to the uncertainties. Giant planet occurrence around both types of stars allows, but does not necessarily require, a mass dependence of ~1 dex per dex. The actual planet-mass-metallicity relation may be complex, and elucidating it will require larger surveys like those to be conducted by ground-based infrared spectrographs and the Gaia space astrometry mission.
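
    Assuming the power-law occurrence model described above, the expected planet yield can be sketched as below; the baseline occurrence rate f0 is a placeholder, and the metallicity sample merely mimics the quoted mean and scatter.

      import numpy as np

      rng = np.random.default_rng(0)
      # Hypothetical sample standing in for the 121 M dwarfs:
      # metallicities with the quoted mean 0.05 dex and scatter 0.20 dex.
      feh = rng.normal(0.05, 0.20, size=121)

      beta = 1.80   # power-law index calibrated on solar-type stars (abstract)
      f0 = 0.03     # assumed occurrence at [Fe/H] = 0; illustrative only

      occurrence = f0 * 10.0 ** (beta * feh)   # per-star giant-planet probability
      print("expected giant planets:", occurrence.sum())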

  10. Reconstruction of signals with unknown spectra in information field theory with parameter uncertainty

    SciTech Connect (OSTI)

    Ensslin, Torsten A.; Frommert, Mona [Max-Planck-Institut fuer Astrophysik, Karl-Schwarzschild-Str. 1, 85741 Garching (Germany)

    2011-05-15

    The optimal reconstruction of cosmic metric perturbations and other signals requires knowledge of their power spectra and other parameters. If these are not known a priori, they have to be measured simultaneously from the same data used for the signal reconstruction. We formulate the general problem of signal inference in the presence of unknown parameters within the framework of information field theory. To solve this, we develop a generic parameter-uncertainty renormalized estimation (PURE) technique. As a concrete application, we address the problem of reconstructing Gaussian signals with unknown power spectrum using five different approaches: (i) separate maximum-a-posteriori power-spectrum measurement and subsequent reconstruction, (ii) maximum-a-posteriori reconstruction with marginalized power-spectrum, (iii) maximizing the joint posterior of signal and spectrum, (iv) guessing the spectrum from the variance in the Wiener-filter map, and (v) renormalization flow analysis of the field-theoretical problem providing the PURE filter. In all cases, the reconstruction can be described or approximated as Wiener-filter operations with assumed signal spectra derived from the data according to the same recipe, but with differing coefficients. All of these filters, except the renormalized one, exhibit a perception threshold in the case of a Jeffreys prior for the unknown spectrum. Data modes with variance below this threshold do not affect the signal reconstruction at all. Filter (iv) seems to be similar to the so-called Karhunen-Loève and Feldman-Kaiser-Peacock estimators for galaxy power spectra used in cosmology, which therefore should also exhibit a marginal perception threshold if correctly implemented. We present statistical performance tests and show that the PURE filter is superior to the others, especially if the post-Wiener-filter corrections are included or in case an additional scale-independent spectral smoothness prior can be adopted.
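
    A minimal sketch of the Wiener-filter operation that all five approaches reduce to, assuming for simplicity that the signal and noise spectra are known (the paper's subject is precisely how to infer them from the data):

      import numpy as np

      def wiener_filter(data, signal_power, noise_power):
          """Minimal 1-D Wiener filter in Fourier space:
          s_hat(k) = S(k) / (S(k) + N(k)) * d(k)."""
          d_k = np.fft.rfft(data)
          w = signal_power / (signal_power + noise_power)
          return np.fft.irfft(w * d_k, n=len(data))

      # Toy example: smooth signal plus white noise, with assumed spectra.
      rng = np.random.default_rng(1)
      n = 256
      x = np.linspace(0, 2 * np.pi, n, endpoint=False)
      data = np.sin(3 * x) + rng.normal(0, 0.5, n)

      k = np.fft.rfftfreq(n)
      S = 1.0 / (1.0 + (k / 0.02) ** 2)   # assumed signal power spectrum
      N = 0.25 * np.ones_like(k)          # white-noise level
      recon = wiener_filter(data, S, N)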

  11. Recommendations for probabilistic seismic hazard analysis: Guidance on uncertainty and use of experts

    SciTech Connect (OSTI)

    1997-04-01

    Probabilistic Seismic Hazard Analysis (PSHA) is a methodology that estimates the likelihood that various levels of earthquake-caused ground motion will be exceeded at a given location in a given future time period. Due to large uncertainties in all the geosciences data and in their modeling, multiple model interpretations are often possible. This leads to disagreement among experts, which in the past has led to disagreement on the selection of ground motion for design at a given site. In order to review the present state-of-the-art and improve on the overall stability of the PSHA process, the U.S. Nuclear Regulatory Commission (NRC), the U.S. Department of Energy (DOE), and the Electric Power Research Institute (EPRI) co-sponsored a project to provide methodological guidance on how to perform a PSHA. The project has been carried out by a seven-member Senior Seismic Hazard Analysis Committee (SSHAC) supported by a large number of other experts. The SSHAC reviewed past studies, including the Lawrence Livermore National Laboratory and the EPRI landmark PSHA studies of the 1980s, and examined ways to improve on the present state-of-the-art. The Committee's most important conclusion is that differences in PSHA results are due to procedural rather than technical differences. Thus, in addition to providing detailed documentation on state-of-the-art elements of a PSHA, this report provides a series of procedural recommendations. The role of experts is analyzed in detail. Two entities are formally defined, the Technical Integrator (TI) and the Technical Facilitator Integrator (TFI), to account for the various levels of complexity in the technical issues and different levels of effort needed in a given study.

  12. Mapping and uncertainty analysis of energy and pitch angle phase space in the DIII-D fast ion loss detector

    SciTech Connect (OSTI)

    Pace, D. C.; Fisher, R. K.; Van Zeeland, M. A.; Pipes, R.

    2014-11-15

    New phase space mapping and uncertainty analysis of energetic ion loss data in the DIII-D tokamak provides experimental results that serve as valuable constraints in first-principles simulations of energetic ion transport. Beam ion losses are measured by the fast ion loss detector (FILD) diagnostic system, consisting of two magnetic spectrometers placed independently along the outer wall. Monte Carlo simulations of mono-energetic and single-pitch ions reaching the FILDs are used to determine the expected uncertainty in the measurements. Modeling shows that the variation in gyrophase of 80 keV beam ions at the FILD aperture can produce an apparent measured energy signature spanning 50-140 keV. These calculations compare favorably with experiments in which neutral beam prompt loss provides a well-known energy and pitch distribution.

  13. Calculating Impacts of Energy Standards on Energy Demand in U.S. Buildings with Uncertainty in an Integrated Assessment Model

    SciTech Connect (OSTI)

    Scott, Michael J.; Daly, Don S.; Hathaway, John E.; Lansing, Carina S.; Liu, Ying; McJeon, Haewon C.; Moss, Richard H.; Patel, Pralit L.; Peterson, Marty J.; Rice, Jennie S.; Zhou, Yuyu

    2015-10-01

    In this paper, an integrated assessment model (IAM) uses a newly-developed Monte Carlo analysis capability to analyze the impacts of more aggressive U.S. residential and commercial building-energy codes and equipment standards on energy consumption and energy service costs at the state level, explicitly recognizing uncertainty in technology effectiveness and cost, socioeconomics, presence or absence of carbon prices, and climate impacts on energy demand. The paper finds that aggressive building-energy codes and equipment standards are an effective, cost-saving way to reduce energy consumption in buildings and greenhouse gas emissions in U.S. states. This conclusion is robust to significant uncertainties in population, economic activity, climate, carbon prices, and technology performance and costs.

  14. Validation and uncertainty quantification of Fuego simulations of calorimeter heating in a wind-driven hydrocarbon pool fire.

    SciTech Connect (OSTI)

    Domino, Stefan Paul; Figueroa, Victor G.; Romero, Vicente Jose; Glaze, David Jason; Sherman, Martin P.; Luketa-Hanlin, Anay Josephine

    2009-12-01

    The objective of this work is to perform an uncertainty quantification (UQ) and model validation analysis of simulations of tests in the cross-wind test facility (XTF) at Sandia National Laboratories. In these tests, a calorimeter was subjected to a fire and the thermal response was measured via thermocouples. The UQ and validation analysis pertains to the experimental and predicted thermal response of the calorimeter. The calculations were performed using Sierra/Fuego/Syrinx/Calore, an Advanced Simulation and Computing (ASC) code capable of predicting object thermal response to a fire environment. Based on the validation results at eight diversely representative thermocouple locations on the calorimeter, the predicted calorimeter temperatures effectively bound the experimental temperatures. This post-validates Sandia's first integrated use of fire modeling with thermal response modeling and associated uncertainty estimates in an abnormal-thermal QMU analysis.

  15. Fukushima Daiichi Unit 1 Uncertainty Analysis-Exploration of Core Melt Progression Uncertain Parameters-Volume II.

    SciTech Connect (OSTI)

    Denman, Matthew R.; Brooks, Dusty Marie

    2015-08-01

    Sandia National Laboratories (SNL) has conducted an uncertainty analysis (UA) on the Fukushima Daiichi Unit 1 (1F1) accident progression with the MELCOR code. Volume I of the 1F1 UA discusses the physical modeling details and time history results of the UA. Volume II of the 1F1 UA discusses the statistical viewpoint. The model used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). The goal of this work was to perform a focused evaluation of uncertainty in core damage progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, fraction of intact fuel, vessel lower head failure) and, in doing so, assess the applicability of traditional sensitivity analysis techniques.

  16. Natural Gas Methane Emissions in the United States Greenhouse Gas Inventory: Sources, Uncertainties and Opportunities for Improvement

    SciTech Connect (OSTI)

    Heath, Garvin; Warner, Ethan; Steinberg, Daniel; Brandt, Adam

    2015-11-19

    Presentation summarizing key findings of a Joint Institute for Strategic Energy Analysis Report at an Environmental Protection Agency workshop: 'Stakeholder Workshop on EPA GHG Data on Petroleum and Natural Gas Systems' on November 19, 2015. For additional information see the JISEA report, 'Estimating U.S. Methane Emissions from the Natural Gas Supply Chain: Approaches, Uncertainties, Current Estimates, and Future Studies' NREL/TP-6A50-62820.

  17. Instrument uncertainty effect on calculation of absolute humidity using dewpoint, wet-bulb, and relative humidity sensors

    SciTech Connect (OSTI)

    Slayzak, S.J.; Ryan, J.P.

    1998-04-01

    As part of the US Department of Energy's Advanced Desiccant Technology Program, the National Renewable Energy Laboratory (NREL) is characterizing the state-of-the-art in desiccant dehumidifiers, the key component of desiccant cooling systems. The experimental data will provide industry and end users with independent performance evaluation and help researchers assess the energy savings potential of the technology. Accurate determination of humidity ratio is critical to this work, and an understanding of the capabilities of the available instrumentation is central to its proper application. This paper compares the minimum theoretical random error in humidity ratio calculation for three common measurement methods to give a sense of the relative maximum accuracy possible for each method, assuming systematic errors can be made negligible. A series of experiments was also conducted to illustrate the capabilities of relative humidity sensors as compared to dewpoint sensors in measuring the grain depression of desiccant dehumidifiers. These tests support the results of the uncertainty analysis. At generally available instrument accuracies, uncertainty in calculated humidity ratio for dewpoint sensors is determined to be constant at approximately 2%. Wet-bulb sensors range between 2% and 6% above 10 g/kg (4%-15% below), and relative humidity sensors vary between 4% above 90% rh and 15% at 20% rh. Below 20% rh, uncertainty for rh sensors increases dramatically. Highest currently attainable accuracies bring dewpoint instruments down to 1% uncertainty, wet bulb to a range of 1%-3% above 10 g/kg (1.5%-8% below), and rh sensors between 1% and 5%.
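
    A hedged sketch of the first-order (root-sum-square) propagation underlying such estimates, using a Magnus-type saturation-pressure approximation for the dewpoint route; the constants and operating point are illustrative, not NREL's:

      import numpy as np

      def humidity_ratio_from_dewpoint(td_c, p_kpa=101.325):
          """Humidity ratio (kg water / kg dry air) from dewpoint, via a
          Magnus-type saturation-pressure approximation (illustrative)."""
          e = 0.6112 * np.exp(17.62 * td_c / (243.12 + td_c))  # kPa
          return 0.622 * e / (p_kpa - e)

      def propagated_uncertainty(f, x0, u, eps=1e-4):
          """First-order root-sum-square propagation of independent
          instrument uncertainties u through f at operating point x0."""
          x0, u = np.asarray(x0, float), np.asarray(u, float)
          grads = np.array([
              (f(*(x0 + eps * np.eye(len(x0))[i])) - f(*x0)) / eps
              for i in range(len(x0))
          ])
          return np.sqrt(np.sum((grads * u) ** 2))

      w = humidity_ratio_from_dewpoint(10.0)
      u_w = propagated_uncertainty(lambda td, p: humidity_ratio_from_dewpoint(td, p),
                                   x0=[10.0, 101.325], u=[0.2, 0.1])
      print(f"w = {w:.5f} kg/kg, relative uncertainty = {100*u_w/w:.1f}%")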

  18. Uncertainty in soil-structure interaction analysis of a nuclear power plant due to different analytical techniques

    SciTech Connect (OSTI)

    Chen, J.C.; Chun, R.C.; Goudreau, G.L.; Maslenikov, O.R.; Johnson, J.J.

    1984-01-01

    This paper summarizes the results of the dynamic response analysis of the Zion reactor containment building using three different soil-structure interaction (SSI) analytical procedures which are: the substructure method, CLASSI; the equivalent linear finite element approach, ALUSH; and the nonlinear finite element procedure, DYNA3D. Uncertainties in analyzing a soil-structure system due to SSI analysis procedures were investigated. Responses at selected locations in the structure were compared through peak accelerations and response spectra.

  19. A procedure for the estimation of the numerical uncertainty of CFD calculations based on grid refinement studies

    SciTech Connect (OSTI)

    Eça, L.; Hoekstra, M.

    2014-04-01

    This paper offers a procedure for the estimation of the numerical uncertainty of any integral or local flow quantity as a result of a fluid flow computation; the procedure requires solutions on systematically refined grids. The error is estimated with power series expansions as a function of the typical cell size. These expansions, of which four types are used, are fitted to the data in the least-squares sense. The selection of the best error estimate is based on the standard deviation of the fits. The error estimate is converted into an uncertainty with a safety factor that depends on the observed order of grid convergence and on the standard deviation of the fit. For well-behaved data sets, i.e., monotonic convergence with the expected observed order of grid convergence and no scatter in the data, the method reduces to the well-known Grid Convergence Index. Examples of application of the procedure are included. Highlights: estimation of the numerical uncertainty of any integral or local flow quantity; least-squares fits to power series expansions to handle noisy data; excellent results for manufactured solutions; consistent results for practical CFD calculations; reduction to the well-known Grid Convergence Index for well-behaved data sets.
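
    For well-behaved data the procedure reduces to the classical Grid Convergence Index, which can be sketched as follows (the solution values on the three grids are invented for illustration):

      import math

      def observed_order(f1, f2, f3, r):
          """Observed order of convergence from solutions on three
          systematically refined grids (f1 finest), refinement ratio r."""
          return math.log((f3 - f2) / (f2 - f1)) / math.log(r)

      def gci_fine(f1, f2, r, p, fs=1.25):
          """Grid Convergence Index on the fine grid, safety factor fs."""
          e21 = abs((f2 - f1) / f1)      # relative change, fine -> medium
          return fs * e21 / (r ** p - 1.0)

      # Illustrative drag-coefficient values on three grids (made up).
      f1, f2, f3, r = 0.5000, 0.5080, 0.5240, 2.0
      p = observed_order(f1, f2, f3, r)
      print(f"observed order p = {p:.2f}, GCI = {100*gci_fine(f1, f2, r, p):.2f}%")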

  20. Fukushima Daiichi Unit 1 Accident Progression Uncertainty Analysis and Implications for Decommissioning of Fukushima Reactors - Volume I.

    SciTech Connect (OSTI)

    Gauntt, Randall O.; Mattie, Patrick D.

    2016-01-01

    Sandia National Laboratories (SNL) has conducted an uncertainty analysis (UA) on the Fukushima Daiichi Unit 1 (1F1) accident progression with the MELCOR code. The model used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). That study focused on reconstructing the accident progressions, as postulated by the limited plant data. This work focused on evaluation of uncertainty in core damage progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, reactor damage state, fraction of intact fuel, vessel lower head failure). The primary intent of this study was to characterize the range of predicted damage states in the 1F1 reactor considering state-of-knowledge uncertainties associated with MELCOR modeling of core damage progression, and to generate information that may be useful in informing the decommissioning activities that will be employed to defuel the damaged reactors at the Fukushima Daiichi Nuclear Power Plant. Additionally, core damage progression variability inherent in MELCOR modeling numerics is investigated.

  1. Effect of short-term material balances on the projected uranium measurement uncertainties for the gas centrifuge enrichment plant

    SciTech Connect (OSTI)

    Younkin, J.M.; Rushton, J.E.

    1980-02-05

    A program is under way to design an effective International Atomic Energy Agency (IAEA) safeguards system that could be applied to the Portsmouth Gas Centrifuge Enrichment Plant (GCEP). This system would integrate nuclear material accountability with containment and surveillance. Uncertainties in material balances due to errors in the measurements of the declared uranium streams have been projected on a yearly basis for GCEP under such a system in a previous study. Because of the large uranium flows, the projected balance uncertainties were, in some cases, greater than the IAEA goal quantity of 75 kg of U-235 contained in low-enriched uranium. Therefore, it was decided to investigate the benefits of material balance periods of less than a year in order to improve the sensitivity and timeliness of the nuclear material accountability system. An analysis has been made of projected uranium measurement uncertainties for various short-term material balance periods. To simplify this analysis, only a material balance around the process area is considered and only the major UF{sub 6} stream measurements are included. That is, storage areas are not considered and uranium waste streams are ignored. It is also assumed that variations in the cascade inventory are negligible compared to other terms in the balance so that the results obtained in this study are independent of the absolute cascade inventory. This study is intended to provide information that will serve as the basis for the future design of a dynamic materials accounting component of the IAEA safeguards system for GCEP.
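
    The benefit of shorter balance periods can be sketched with a simplified material-balance (MUF) variance model; the throughput and relative measurement errors below are hypothetical, not the study's values:

      import math

      def muf_sigma(throughput_kg, rel_sigma_in, rel_sigma_out):
          """Standard deviation of a material balance (MUF) when input and
          output stream measurement errors are independent; inventory terms
          are neglected, as in the simplified GCEP analysis."""
          s_in = rel_sigma_in * throughput_kg
          s_out = rel_sigma_out * throughput_kg
          return math.sqrt(s_in**2 + s_out**2)

      annual_u235_kg = 6000.0            # hypothetical annual U-235 throughput
      for periods in (1, 4, 12):         # yearly, quarterly, monthly balances
          sigma = muf_sigma(annual_u235_kg / periods, 0.003, 0.003)
          print(f"{periods:2d} balances/yr: sigma_MUF = {sigma:6.1f} kg U-235")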

  2. SU-E-J-145: Geometric Uncertainty in CBCT Extrapolation for Head and Neck Adaptive Radiotherapy

    SciTech Connect (OSTI)

    Liu, C; Kumarasiri, A; Chetvertkov, M; Gordon, J; Chetty, I; Siddiqui, F; Kim, J

    2014-06-01

    Purpose: One primary limitation of using CBCT images for H&N adaptive radiotherapy (ART) is the limited field of view (FOV) range. We propose a method to extrapolate the CBCT by using a deformed planning CT for the dose of the day calculations. The aim was to estimate the geometric uncertainty of our extrapolation method. Methods: Ten H&N patients, each with a planning CT (CT1) and a subsequent CT (CT2) taken, were selected. Furthermore, a small FOV CBCT (CT2short) was synthetically created by cropping CT2 to the size of a CBCT image. Then, an extrapolated CBCT (CBCTextrp) was generated by deformably registering CT1 to CT2short and resampling with a wider FOV (42 mm more from the CT2short borders), where CT1 is deformed through translation, rigid, affine, and b-spline transformations in order. The geometric error is measured as the distance map ||DVF|| produced by a deformable registration between CBCTextrp and CT2. Mean errors were calculated as a function of the distance away from the CBCT borders. The quality of all the registrations was visually verified. Results: Results were collected based on the average numbers from 10 patients. The extrapolation error increased linearly as a function of the distance (at a rate of 0.7 mm per 1 cm) away from the CBCT borders in the S/I direction. The errors (mean ± SD) at the superior and inferior borders were 0.8 ± 0.5 mm and 3.0 ± 1.5 mm respectively, and increased to 2.7 ± 2.2 mm and 5.9 ± 1.9 mm at 4.2 cm away. The mean error within the CBCT borders was 1.16 ± 0.54 mm. The overall errors within the 4.2 cm expansion region were 2.0 ± 1.2 mm (sup) and 4.5 ± 1.6 mm (inf). Conclusion: The overall error in the inferior direction is larger due to larger, less predictable deformations in the chest. The error introduced by extrapolation is plan dependent. The mean error in the expanded region can be large, and must be considered during implementation. This work is supported in part by Varian Medical Systems, Palo Alto, CA.

  3. A joint analysis of Planck and BICEP2 B modes including dust polarization uncertainty

    SciTech Connect (OSTI)

    Mortonson, Michael J.; Seljak, Uroš E-mail: useljak@berkeley.edu

    2014-10-01

    We analyze BICEP2 and Planck data using a model that includes CMB lensing, gravity waves, and polarized dust. Recently published Planck dust polarization maps have highlighted the difficulty of estimating the amount of dust polarization in low intensity regions, suggesting that the polarization fractions have considerable uncertainties and may be significantly higher than previous predictions. In this paper, we start by assuming nothing about the dust polarization except for the power spectrum shape, which we take to be C{sub l}{sup BB,dust} ∝ l{sup -2.42}. The resulting joint BICEP2+Planck analysis favors solutions without gravity waves, and the upper limit on the tensor-to-scalar ratio is r<0.11, a slight improvement relative to the Planck analysis alone which gives r<0.13 (95% c.l.). The estimated amplitude of the dust polarization power spectrum agrees with expectations for this field based on both HI column density and Planck polarization measurements at 353 GHz in the BICEP2 field. Including the latter constraint on the dust spectrum amplitude in our analysis improves the limit further to r<0.09, placing strong constraints on theories of inflation (e.g., models with r>0.14 are excluded with 99.5% confidence). We address the cross-correlation analysis of BICEP2 at 150 GHz with BICEP1 at 100 GHz as a test of foreground contamination. We find that the null hypothesis of dust and lensing with r=0 gives Δχ{sup 2}<2 relative to the hypothesis of no dust, so the frequency analysis does not strongly favor either model over the other. We also discuss how more accurate dust polarization maps may improve our constraints. If the dust polarization is measured perfectly, the limit can reach r<0.05 (or the corresponding detection significance if the observed dust signal plus the expected lensing signal is below the BICEP2 observations), but this degrades quickly to almost no improvement if the dust calibration error is 20% or larger or if the dust maps are not processed through the BICEP2 pipeline, inducing sampling variance noise.

  4. Large-Scale Uncertainty and Error Analysis for Time-dependent Fluid/Structure Interactions in Wind Turbine Applications

    SciTech Connect (OSTI)

    Alonso, Juan J.; Iaccarino, Gianluca

    2013-08-25

    The following is the final report covering the entire period of this aforementioned grant, June 1, 2011 - May 31, 2013 for the portion of the effort corresponding to Stanford University (SU). SU has partnered with Sandia National Laboratories (PI: Mike S. Eldred) and Purdue University (PI: Dongbin Xiu) to complete this research project and this final report includes those contributions made by the members of the team at Stanford. Dr. Eldred is continuing his contributions to this project under a no-cost extension and his contributions to the overall effort will be detailed at a later time (once his effort has concluded) on a separate project submitted by Sandia National Laboratories. At Stanford, the team is made up of Profs. Alonso, Iaccarino, and Duraisamy, post-doctoral researcher Vinod Lakshminarayan, and graduate student Santiago Padron. At Sandia National Laboratories, the team includes Michael Eldred, Matt Barone, John Jakeman, and Stefan Domino, and at Purdue University, we have Prof. Dongbin Xiu as our main collaborator. The overall objective of this project was to develop a novel, comprehensive methodology for uncertainty quantification by combining stochastic expansions (nonintrusive polynomial chaos and stochastic collocation), the adjoint approach, and fusion with experimental data to account for aleatory and epistemic uncertainties from random variable, random field, and model form sources. The expected outcomes of this activity were detailed in the proposal and are repeated here to set the stage for the results that we have generated during the time period of execution of this project: 1. The rigorous determination of an error budget comprising numerical errors in physical space and statistical errors in stochastic space and its use for optimal allocation of resources; 2. A considerable increase in efficiency when performing uncertainty quantification with a large number of uncertain variables in complex non-linear multi-physics problems; 3. A solution to the long-time integration problem of spectral chaos approaches; 4. A rigorous methodology to account for aleatory and epistemic uncertainties, to emphasize the most important variables via dimension reduction and dimension-adaptive refinement, and to support fusion with experimental data using Bayesian inference; 5. The application of novel methodologies to time-dependent reliability studies in wind turbine applications including a number of efforts relating to the uncertainty quantification in vertical-axis wind turbine applications. In this report, we summarize all accomplishments in the project (during the time period specified) focusing on advances in UQ algorithms and deployment efforts to the wind turbine application area. Detailed publications in each of these areas have also been completed and are available from the respective conference proceedings and journals as detailed in a later section.
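
    As a toy illustration of the non-intrusive polynomial chaos machinery pursued in the project (here for a single standard-normal input, far from the project's high-dimensional, adaptive setting), the sketch below projects a model onto probabilists' Hermite polynomials by Gauss quadrature:

      import numpy as np
      from numpy.polynomial import hermite_e as He
      from math import factorial, sqrt, pi

      def pce_coefficients(f, order, quad_deg=20):
          """Project f(X), X ~ N(0,1), onto probabilists' Hermite
          polynomials He_k by Gauss-HermiteE quadrature."""
          x, w = He.hermegauss(quad_deg)   # nodes/weights for exp(-x^2/2)
          w = w / sqrt(2.0 * pi)           # normalize to the N(0,1) measure
          fx = f(x)
          return np.array([
              np.sum(w * fx * He.hermeval(x, np.eye(order + 1)[k])) / factorial(k)
              for k in range(order + 1)    # c_k = E[f He_k] / k!
          ])

      # Example: f(X) = exp(X); the exact coefficients are e^{1/2} / k!.
      c = pce_coefficients(np.exp, order=4)
      mean = c[0]                                           # PCE mean
      var = sum(factorial(k) * c[k] ** 2 for k in range(1, 5))  # PCE variance
      print(c, mean, var)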

  5. Combined Estimation of Hydrogeologic Conceptual Model, Parameter, and Scenario Uncertainty with Application to Uranium Transport at the Hanford Site 300 Area

    SciTech Connect (OSTI)

    Meyer, Philip D.; Ye, Ming; Rockhold, Mark L.; Neuman, Shlomo P.; Cantrell, Kirk J.

    2007-07-30

    This report to the Nuclear Regulatory Commission (NRC) describes the development and application of a methodology to systematically and quantitatively assess predictive uncertainty in groundwater flow and transport modeling that considers the combined impact of hydrogeologic uncertainties associated with the conceptual-mathematical basis of a model, model parameters, and the scenario to which the model is applied. The methodology is based on an extension of a Maximum Likelihood implementation of Bayesian Model Averaging. Model uncertainty is represented by postulating a discrete set of alternative conceptual models for a site with associated prior model probabilities that reflect a belief about the relative plausibility of each model based on its apparent consistency with available knowledge and data. Posterior model probabilities are computed and parameter uncertainty is estimated by calibrating each model to observed system behavior; prior parameter estimates are optionally included. Scenario uncertainty is represented as a discrete set of alternative future conditions affecting boundary conditions, source/sink terms, or other aspects of the models, with associated prior scenario probabilities. A joint assessment of uncertainty results from combining model predictions computed under each scenario using as weights the posterior model and prior scenario probabilities. The uncertainty methodology was applied to modeling of groundwater flow and uranium transport at the Hanford Site 300 Area. Eight alternative models representing uncertainty in the hydrogeologic and geochemical properties as well as the temporal variability were considered. Two scenarios representing alternative future behavior of the Columbia River adjacent to the site were considered. The scenario alternatives were implemented in the models through the boundary conditions. Results demonstrate the feasibility of applying a comprehensive uncertainty assessment to large-scale, detailed groundwater flow and transport modeling and illustrate the benefits of the methodology in providing better estimates of predictive uncertainty, quantitative results for use in assessing risk, and an improved understanding of the system behavior and the limitations of the models.
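
    The Bayesian Model Averaging combination described above can be sketched as follows; the model and scenario probabilities, predictions, and variances are invented placeholders:

      import numpy as np

      # Hypothetical posterior model probabilities for four alternative
      # conceptual models and prior probabilities for two scenarios.
      p_model = np.array([0.40, 0.35, 0.15, 0.10])     # sums to 1
      p_scen = np.array([0.6, 0.4])

      # pred[m, s] = mean prediction of model m under scenario s;
      # var[m, s] = within-model predictive variance (made-up values).
      pred = np.array([[1.0, 1.4], [1.2, 1.5], [0.8, 1.1], [1.6, 2.0]])
      var = np.full_like(pred, 0.05)

      w = np.outer(p_model, p_scen)                    # joint weights
      mean_bma = np.sum(w * pred)
      # Total variance = weighted within-model variance
      #                + weighted between-model spread of the means.
      var_bma = np.sum(w * var) + np.sum(w * (pred - mean_bma) ** 2)
      print(mean_bma, np.sqrt(var_bma))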

  6. Quantifiably secure power grid operation, management, and evolution : a study of uncertainties affecting the grid integration of renewables.

    SciTech Connect (OSTI)

    Gray, Genetha Anne; Watson, Jean-Paul; Silva Monroy, Cesar Augusto; Gramacy, Robert B.

    2013-09-01

    This report summarizes findings and results of the Quantifiably Secure Power Grid Operation, Management, and Evolution LDRD. The focus of the LDRD was to develop decision-support technologies to enable rational and quantifiable risk management for two key grid operational timescales: scheduling (day-ahead) and planning (month-to-year-ahead). Risk or resiliency metrics are foundational in this effort. The 2003 Northeast Blackout investigative report stressed the criticality of enforceable metrics for system resiliency, the grid's ability to satisfy demands subject to perturbation. However, we neither have well-defined risk metrics for addressing the pervasive uncertainties in a renewable energy era, nor decision-support tools for their enforcement, which severely impacts efforts to rationally improve grid security. For day-ahead unit commitment, decision-support tools must account for topological security constraints, loss-of-load (economic) costs, and supply and demand variability, especially given high renewables penetration. For long-term planning, transmission and generation expansion must ensure realized demand is satisfied for various projected technological, climate, and growth scenarios. The decision-support tools investigated in this project paid particular attention to tail-oriented risk metrics for explicitly addressing high-consequence events. Historically, decision-support tools for the grid consider expected cost minimization, largely ignoring risk and instead penalizing loss-of-load through artificial parameters. The technical focus of this work was the development of scalable solvers for enforcing risk metrics. Advanced stochastic programming solvers were developed to address generation and transmission expansion and unit commitment, minimizing cost subject to pre-specified risk thresholds. Particular attention was paid to renewables, where security critically depends on production and demand prediction accuracy. To address this concern, powerful filtering techniques for spatio-temporal measurement assimilation were used to develop short-term predictive stochastic models. To achieve uncertainty-tolerant solutions, very large numbers of scenarios must be simultaneously considered. One focus of this work was investigating ways of reasonably reducing this number.
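
    A common tail-oriented risk metric of the kind discussed in this work is the conditional value-at-risk (CVaR); a minimal sketch over scenario costs follows (the cost distribution is fabricated for illustration):

      import numpy as np

      def cvar(costs, alpha=0.95):
          """Conditional value-at-risk: mean cost of the worst (1 - alpha)
          fraction of scenarios (discrete approximation)."""
          costs = np.sort(np.asarray(costs, float))
          k = int(np.ceil(alpha * len(costs)))
          return costs[k:].mean() if k < len(costs) else costs[-1]

      rng = np.random.default_rng(42)
      scenario_costs = rng.lognormal(mean=10.0, sigma=0.4, size=1000)
      print("expected cost:", scenario_costs.mean())
      print("CVaR(95%):   ", cvar(scenario_costs, 0.95))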

  7. Final Report: DOE Project: DE-SC-0005399 Linking the uncertainty of low frequency variability in tropical forcing in regional climate change

    SciTech Connect (OSTI)

    Forest, Chris E.; Barsugli, Joseph J.; Li, Wei

    2015-02-20

    Final report for DOE Project: DE-SC-0005399 -- Linking the uncertainty of low frequency variability in tropical forcing in regional climate change. The project utilizes multiple atmospheric general circulation models (AGCMs) to examine the regional climate sensitivity to tropical sea surface temperature forcing through a series of ensemble experiments. The overall goal for this work is to use the global teleconnection operator (GTO) as a metric to assess the impact of model structural differences on the uncertainties in regional climate variability.

  8. Executive summary for assessing the near-term risk of climate uncertainty : interdependencies among the U.S. states.

    SciTech Connect (OSTI)

    Loose, Verne W.; Lowry, Thomas Stephen; Malczynski, Leonard A.; Tidwell, Vincent Carroll; Stamber, Kevin Louis; Reinert, Rhonda K.; Backus, George A.; Warren, Drake E.; Zagonel, Aldo A.; Ehlen, Mark Andrew; Klise, Geoffrey T.; Vargas, Vanessa N.

    2010-04-01

    Policy makers will most likely need to make decisions about climate policy before climate scientists have resolved all relevant uncertainties about the impacts of climate change. This study demonstrates a risk-assessment methodology for evaluating uncertain future climatic conditions. We estimate the impacts of climate change on U.S. state- and national-level economic activity from 2010 to 2050. To understand the implications of uncertainty on risk and to provide a near-term rationale for policy interventions to mitigate the course of climate change, we focus on precipitation, one of the most uncertain aspects of future climate change. We use results of the climate-model ensemble from the Intergovernmental Panel on Climate Change's (IPCC) Fourth Assessment Report (AR4) as a proxy for representing climate uncertainty over the next 40 years, map the simulated weather from the climate models hydrologically to the county level to determine the physical consequences on economic activity at the state level, and perform a detailed 70-industry analysis of economic impacts among the interacting lower-48 states. We determine the industry-level contribution to the gross domestic product and employment impacts at the state level, as well as interstate population migration, effects on personal income, and consequences for the U.S. trade balance. We show that the mean or average risk of damage to the U.S. economy from climate change, at the national level, is on the order of $1 trillion over the next 40 years, with losses in employment equivalent to nearly 7 million full-time jobs.

  9. Assessment of Uncertainties in the Response of the African Monsoon Precipitation to Land Use change simulated by a regional model

    SciTech Connect (OSTI)

    Hagos, Samson M.; Leung, Lai-Yung R.; Xue, Yongkang; Boone, Aaron; de Sales, Fernando; Neupane, Naresh; Huang, Maoyi; Yoon, Jin-Ho

    2014-02-22

    Land use and land cover over Africa have changed substantially over the last sixty years and this change has been proposed to affect monsoon circulation and precipitation. This study examines the uncertainties on the effect of these changes on the African Monsoon system and Sahel precipitation using an ensemble of regional model simulations with different combinations of land surface and cumulus parameterization schemes. Although the magnitude of the response covers a broad range of values, most of the simulations show a decline in Sahel precipitation due to the expansion of pasture and croplands at the expense of trees and shrubs and an increase in surface air temperature.

  10. Validation/Uncertainty Quantification for Large Eddy Simulations of the heat flux in the Tangentially Fired Oxy-Coal Alstom Boiler Simulation Facility

    SciTech Connect (OSTI)

    Smith, P.J.; Eddings, E.G.; Ring, T.; Thornock, J.; Draper, T.; Isaac, B.; Rezeai, D.; Toth, P.; Wu, Y.; Kelly, K.

    2014-08-01

    The objective of this task is to produce predictive capability with quantified uncertainty bounds for the heat flux in commercial-scale, tangentially fired, oxy-coal boilers. Validation data came from the Alstom Boiler Simulation Facility (BSF) for tangentially fired, oxy-coal operation. This task brings together experimental data collected under Alstom’s DOE project for measuring oxy-firing performance parameters in the BSF with this University of Utah project for large eddy simulation (LES) and validation/uncertainty quantification (V/UQ). The Utah work includes V/UQ with measurements in the single-burner facility, where advanced strategies for O2 injection can be more easily controlled and data more easily obtained. Highlights of the work include:
    • Simulations of Alstom’s 15 megawatt (MW) BSF, exploring the uncertainty in thermal boundary conditions. A V/UQ analysis showed consistency between experimental results and simulation results, identifying uncertainty bounds on the quantities of interest for this system (Subtask 9.1).
    • A simulation study of the University of Utah’s oxy-fuel combustor (OFC) focused on heat flux (Subtask 9.2). A V/UQ analysis was used to show consistency between experimental and simulation results.
    • Measurement of heat flux and temperature with new optical diagnostic techniques and comparison with conventional measurements (Subtask 9.3). Various optical diagnostics systems were created to provide experimental data to the simulation team. The final configuration utilized a mid-wave infrared (MWIR) camera to measure heat flux and temperature, synchronized with a high-speed, visible camera using two-color pyrometry to measure temperature and soot concentration.
    • Collection of heat flux and temperature measurements in the University of Utah’s OFC for use in Subtasks 9.2 and 9.3 (Subtask 9.4). Several replicates were carried out to better assess the experimental error. Experiments were specifically designed to generate high-fidelity data from a turbulent oxy-coal flame for the validation of oxy-coal simulation models. Experiments were also conducted on the OFC to determine heat flux profiles using advanced strategies for O2 injection. This is important when considering retrofit applications of advanced O2 injection.

  11. The use of expert elicitation to quantify uncertainty in incomplete sorption data bases for Waste Isolation Pilot Plant performance assessment

    SciTech Connect (OSTI)

    Anderson, D.R.; Trauth, K.M. ); Hora, S.C. )

    1991-01-01

    Iterative, annual performance-assessment calculations are being performed for the Waste Isolation Pilot Plant (WIPP), a planned underground repository in southeastern New Mexico, USA for the disposal of transuranic waste. The performance-assessment calculations estimate the long-term radionuclide releases from the disposal system to the accessible environment. Because direct experimental data in some areas are presently of insufficient quantity to form the basis for the required distributions, expert judgment was used to estimate the concentrations of specific radionuclides in a brine exiting a repository room or drift as it migrates up an intruding borehole, and also the distribution coefficients that describe the retardation of radionuclides in the overlying Culebra Dolomite. The variables representing these concentrations and coefficients have been shown by 1990 sensitivity analyses to be among the set of parameters making the greatest contribution to the uncertainty in WIPP performance-assessment predictions. Utilizing available information, the experts (one panel addressed concentrations and a second addressed retardation) developed an understanding of the problem and were formally elicited to obtain probability distributions that characterize the uncertainty in fixed, but unknown, quantities. The probability distributions developed by the experts are being incorporated into the 1991 performance-assessment calculations. 16 refs., 4 tabs.

  12. DAKOTA, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis: version 4.0 user's manual.

    SciTech Connect (OSTI)

    Griffin, Joshua D.; Eldred, Michael Scott; Martinez-Canales, Monica L.; Watson, Jean-Paul; Kolda, Tamara Gibson; Giunta, Anthony Andrew; Adams, Brian M.; Swiler, Laura Painton; Williams, Pamela J.; Hough, Patricia Diane; Gay, David M.; Dunlavy, Daniel M.; Eddy, John P.; Hart, William Eugene; Brown, Shannon L.

    2006-10-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the DAKOTA software and provides capability overviews and procedures for software execution, as well as a variety of example studies.
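
    As a rough illustration of how such a study is specified, a minimal input deck for an LHS sampling study might look like the sketch below; keyword spellings vary across DAKOTA versions and should be checked against the manual for the version in use, and the analysis driver name is a placeholder:

      # Schematic DAKOTA input: LHS sampling over two uniform variables.
      method
        sampling
          sample_type lhs
          samples = 200
          seed = 52983
      variables
        uniform_uncertain = 2
          lower_bounds  0.0  0.0
          upper_bounds  1.0  1.0
          descriptors   'x1' 'x2'
      interface
        fork
          analysis_drivers = 'my_simulation.sh'   # user-supplied wrapper
      responses
        response_functions = 1
        no_gradients
        no_hessians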

  13. Uncertainty quantification for evaluating the impacts of fracture zone on pressure build-up and ground surface uplift during geological CO{sub 2} sequestration

    SciTech Connect (OSTI)

    Bao, Jie; Hou, Zhangshuan; Fang, Yilin; Ren, Huiying; Lin, Guang

    2015-06-01

    A series of numerical test cases reflecting broad and realistic ranges of geological formation and preexisting fault properties was developed to systematically evaluate the impacts of preexisting faults on pressure buildup and ground surface uplift during CO{sub 2} injection. Numerical test cases were conducted using a coupled hydro-geomechanical simulator, eSTOMP (extreme-scale Subsurface Transport over Multiple Phases). For efficient sensitivity analysis and reliable construction of a reduced-order model, a quasi-Monte Carlo sampling method was applied to effectively sample a high-dimensional input parameter space to explore uncertainties associated with hydrologic, geologic, and geomechanical properties. The uncertainty quantification results show that the impacts on geomechanical response from the pre-existing faults mainly depend on reservoir and fault permeability. When the fault permeability is two to three orders of magnitude smaller than the reservoir permeability, the fault can be considered as an impermeable block that resists fluid transport in the reservoir, which causes pressure increase near the fault. When the fault permeability is close to the reservoir permeability, or higher than 10?? m in this study, the fault can be considered as a conduit that penetrates the caprock, connecting the fluid flow between the reservoir and the upper rock.
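
    The quasi-Monte Carlo sampling step can be sketched with SciPy's Sobol generator; the dimensionality and parameter ranges below are placeholders, not the study's actual design:

      from scipy.stats import qmc

      # Quasi-Monte Carlo design over a hypothetical 5-parameter space
      # (e.g., log10 reservoir/fault permeability, porosity, modulus, ...).
      sampler = qmc.Sobol(d=5, scramble=True, seed=7)
      unit_samples = sampler.random_base2(m=7)        # 2^7 = 128 points

      l_bounds = [-15.0, -18.0, 0.05, 1.0e9, 0.20]    # assumed lower bounds
      u_bounds = [-12.0, -12.0, 0.35, 5.0e10, 0.35]   # assumed upper bounds
      design = qmc.scale(unit_samples, l_bounds, u_bounds)
      print(design.shape)   # (128, 5) -> one simulator run per row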

  14. DAKOTA: a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis. Version 5.0, user's manual.

    SciTech Connect (OSTI)

    Eldred, Michael Scott; Dalbey, Keith R.; Bohnhoff, William J.; Adams, Brian M.; Swiler, Laura Painton; Hough, Patricia Diane; Gay, David M.; Eddy, John P.; Haskell, Karen H.

    2010-05-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the DAKOTA software and provides capability overviews and procedures for software execution, as well as a variety of example studies.

  15. DAKOTA: a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis. Version 5.0, developers manual.

    SciTech Connect (OSTI)

    Eldred, Michael Scott; Dalbey, Keith R.; Bohnhoff, William J.; Adams, Brian M.; Swiler, Laura Painton; Hough, Patricia Diane; Gay, David M.; Eddy, John P.; Haskell, Karen H.

    2010-05-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a developers manual for the DAKOTA software and describes the DAKOTA class hierarchies and their interrelationships. It derives directly from annotation of the actual source code and provides detailed class documentation, including all member functions and attributes.

  16. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis: version 4.0 developers manual.

    SciTech Connect (OSTI)

    Griffin, Joshua D. (Sandia National Laboratories, Livermore, CA); Eldred, Michael Scott; Martinez-Canales, Monica L.; Watson, Jean-Paul; Kolda, Tamara Gibson (Sandia National Laboratories, Livermore, CA); Giunta, Anthony Andrew; Adams, Brian M.; Swiler, Laura Painton; Williams, Pamela J.; Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Gay, David M.; Dunlavy, Daniel M.; Eddy, John P.; Hart, William Eugene; Brown, Shannon L.

    2006-10-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a developers manual for the DAKOTA software and describes the DAKOTA class hierarchies and their interrelationships. It derives directly from annotation of the actual source code and provides detailed class documentation, including all member functions and attributes.

  17. CASL L1 Milestone report : CASL.P4.01, sensitivity and uncertainty analysis for CIPS with VIPRE-W and BOA.

    SciTech Connect (OSTI)

    Sung, Yixing; Adams, Brian M.; Secker, Jeffrey R.

    2011-12-01

    The CASL Level 1 Milestone CASL.P4.01, successfully completed in December 2011, aimed to 'conduct, using methodologies integrated into VERA, a detailed sensitivity analysis and uncertainty quantification of a crud-relevant problem with baseline VERA capabilities (ANC/VIPRE-W/BOA).' The VUQ focus area led this effort, in partnership with AMA, and with support from VRI. DAKOTA was coupled to existing VIPRE-W thermal-hydraulics and BOA crud/boron deposit simulations representing a pressurized water reactor (PWR) that previously experienced crud-induced power shift (CIPS). This work supports understanding of CIPS by exploring the sensitivity and uncertainty in BOA outputs with respect to uncertain operating and model parameters. This report summarizes work coupling the software tools, characterizing uncertainties, and analyzing the results of iterative sensitivity and uncertainty studies. These studies focused on sensitivity and uncertainty of CIPS indicators calculated by the current version of the BOA code used in the industry. Challenges with this kind of analysis are identified to inform follow-on research goals and VERA development targeting crud-related challenge problems.

  18. Impact of parton distribution function and {alpha}{sub s} uncertainties on Higgs boson production in gluon fusion at hadron colliders

    SciTech Connect (OSTI)

    Demartin, Federico; Mariani, Elisa [Dipartimento di Fisica, Universita di Milano, Via Celoria 16, I-20133 Milano (Italy); Forte, Stefano; Vicini, Alessandro [Dipartimento di Fisica, Universita di Milano, Via Celoria 16, I-20133 Milano (Italy); INFN, Sezione di Milano, Via Celoria 16, I-20133 Milano (Italy); Rojo, Juan [INFN, Sezione di Milano, Via Celoria 16, I-20133 Milano (Italy)

    2010-07-01

    We present a systematic study of uncertainties due to parton distributions (PDFs) and the strong coupling on the gluon-fusion production cross section of the standard model Higgs at the Tevatron and LHC colliders. We compare procedures and results when three recent sets of PDFs are used, CTEQ6.6, MSTW08, and NNPDF1.2, and we discuss specifically the way PDF and strong coupling uncertainties are combined. We find that results obtained from different PDF sets are in reasonable agreement if a common value of the strong coupling is adopted. We show that the addition in quadrature of PDF and {alpha}{sub s} uncertainties provides an adequate approximation to the full result with exact error propagation. We discuss a simple recipe to determine a conservative PDF+{alpha}{sub s} uncertainty from available global parton sets, and we use it to estimate this uncertainty on the given process to be about 10% at the Tevatron and 5% at the LHC for a light Higgs.
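
    In symbols, the quadrature combination the authors validate is (schematically):

      \delta\sigma_{\mathrm{PDF}+\alpha_s} \;\simeq\;
        \sqrt{\left(\delta\sigma_{\mathrm{PDF}}\right)^{2}
            + \left(\delta\sigma_{\alpha_s}\right)^{2}}

    For example, with an illustrative 4% PDF uncertainty and 3% strong-coupling uncertainty, the combined uncertainty is 5%.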

  19. Columbia University flow instability experimental program: Volume 2. Single tube uniformly heated tests -- Part 2: Uncertainty analysis and data

    SciTech Connect (OSTI)

    Dougherty, T.; Maciuca, C.; McAssey, E.V. Jr.; Reddy, D.G.; Yang, B.W.

    1990-05-01

    In June 1988, Savannah River Laboratory requested that the Heat Transfer Research Facility (HTRF) modify the flow excursion program, which had been in progress since November 1987, to include testing of single tubes in vertical down-flow over a range of length-to-diameter (L/D) ratios of 100 to 500. The impetus for the request was the desire to obtain experimental data as quickly as possible for code development work. In July 1988, HTRF submitted a proposal to SRL indicating that by modifying a facility already under construction the data could be obtained within three to four months. In January 1990, HTRF issued report CU-HTRF-T4, part 1. This report contained the technical discussion of the results from the single tube uniformly heated tests. The present report is part 2 of CU-HTRF-T4, which contains further discussion of the uncertainty analysis and the complete set of data.

  20. An uncertainty analysis of the hydrogen source term for a station blackout accident in Sequoyah using MELCOR 1.8.5

    SciTech Connect (OSTI)

    Gauntt, Randall O.; Bixler, Nathan E.; Wagner, Kenneth Charles

    2014-03-01

    A methodology for using the MELCOR code with the Latin Hypercube Sampling method was developed to estimate uncertainty in various predicted quantities such as hydrogen generation or release of fission products under severe accident conditions. In this case, the emphasis was on estimating the range of hydrogen sources in station blackout conditions in the Sequoyah Ice Condenser plant, taking into account uncertainties in the modeled physics known to affect hydrogen generation. The method uses user-specified likelihood distributions for uncertain model parameters, which may include uncertainties of a stochastic nature, to produce a collection of code calculations, or realizations, characterizing the range of possible outcomes. Forty MELCOR code realizations of Sequoyah were conducted that included 10 uncertain parameters, producing a range of in-vessel hydrogen quantities. The range of total hydrogen produced was approximately 583 kg ± 131 kg. Sensitivity analyses revealed expected trends with respect to the parameters of greatest importance; however, considerable scatter was observed when results were plotted against any of the uncertain parameters, with no parameter manifesting dominant effects on hydrogen generation. It is concluded that, with respect to the physics parameters investigated, in order to further reduce predicted hydrogen uncertainty, it would be necessary to reduce all physics parameter uncertainties similarly, bearing in mind that some parameters are inherently uncertain within a range. It is suspected that some residual uncertainty associated with modeling complex, coupled, and synergistic phenomena is an inherent aspect of complex systems and cannot be reduced to point value estimates. Probabilistic analyses such as the one demonstrated in this work are important to properly characterize the response of complex systems such as severe accident progression in nuclear power plants.
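
    The Latin Hypercube step of such a methodology can be sketched as follows; the parameter count matches the abstract, but the response function is a fabricated stand-in for an actual MELCOR run:

      import numpy as np
      from scipy.stats import qmc

      # Latin hypercube design over 10 uncertain model parameters
      # (parameter identities and scaling are placeholders).
      n_params, n_runs = 10, 40
      sampler = qmc.LatinHypercube(d=n_params, seed=3)
      design = sampler.random(n=n_runs)               # values in [0, 1)

      # Pretend each row was run through the severe-accident code and
      # produced an in-vessel hydrogen mass; a made-up response stands in.
      h2_kg = 583.0 + 131.0 * (design @ np.linspace(-1, 1, n_params))

      print(f"H2 mass: {h2_kg.mean():.0f} kg +/- {h2_kg.std(ddof=1):.0f} kg")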

  1. Jet energy measurement and its systematic uncertainty in proton–proton collisions at √s = 7 TeV with the ATLAS detector

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Aad, G.

    2015-01-15

    The jet energy scale (JES) and its systematic uncertainty are determined for jets measured with the ATLAS detector using proton–proton collision data with a centre-of-mass energy of √s = 7 TeV, corresponding to an integrated luminosity of 4.7 fb{sup -1}. Jets are reconstructed from energy deposits forming topological clusters of calorimeter cells using the anti-k{sub t} algorithm with distance parameters R = 0.4 or R = 0.6, and are calibrated using MC simulations. A residual JES correction is applied to account for differences between data and MC simulations. This correction and its systematic uncertainty are estimated using a combination of in situ techniques exploiting the transverse momentum balance between a jet and a reference object such as a photon or a Z boson, for 20 ≤ p{sub T}{sup jet} < 1000 GeV and pseudorapidities |η| < 4.5. The effect of multiple proton–proton interactions is corrected for, and an uncertainty is evaluated using in situ techniques. The smallest JES uncertainty of less than 1% is found in the central calorimeter region (|η| < 1.2) for jets with 55 ≤ p{sub T}{sup jet} < 500 GeV. For central jets at lower p{sub T}, the uncertainty is about 3%. A consistent JES estimate is found using measurements of the calorimeter response of single hadrons in proton–proton collisions and test-beam data, which also provide the estimate for p{sub T}{sup jet} > 1 TeV. The calibration of forward jets is derived from dijet p{sub T} balance measurements. The resulting uncertainty reaches its largest value of 6% for low-p{sub T} jets at |η| = 4.5. In addition, JES uncertainties due to specific event topologies, such as close-by jets or selections of event samples with an enhanced content of jets originating from light quarks or gluons, are also discussed. The magnitude of these uncertainties depends on the event sample used in a given physics analysis, but typically amounts to 0.5–3%.

  2. Methods for Quantifying the Uncertainties of LSIT Test Parameters, Test Results, and Full-Scale Mixing Performance Using Models Developed from Scaled Test Data

    SciTech Connect (OSTI)

    Piepel, Gregory F.; Cooley, Scott K.; Kuhn, William L.; Rector, David R.; Heredia-Langner, Alejandro

    2015-05-01

    This report discusses the statistical methods for quantifying uncertainties in 1) test responses and other parameters in the Large Scale Integrated Testing (LSIT), and 2) estimates of coefficients and predictions of mixing performance from models that relate test responses to test parameters. Bechtel National, Inc. and the U.S. Department of Energy (DOE) have committed to testing at a larger scale to “address uncertainties and increase confidence in the projected, full-scale mixing performance and operations” in the Waste Treatment and Immobilization Plant (WTP).

  3. Efficient algorithms for mixed aleatory-epistemic uncertainty quantification with application to radiation-hardened electronics. Part I, algorithms and benchmark results.

    SciTech Connect (OSTI)

    Swiler, Laura Painton; Eldred, Michael Scott

    2009-09-01

    This report documents the results of an FY09 ASC V&V Methods level 2 milestone demonstrating new algorithmic capabilities for mixed aleatory-epistemic uncertainty quantification. Through the combination of stochastic expansions for computing aleatory statistics and interval optimization for computing epistemic bounds, mixed uncertainty analysis studies are shown to be more accurate and efficient than previously achievable. Part I of the report describes the algorithms and presents benchmark performance results. Part II applies these new algorithms to UQ analysis of radiation effects in electronic devices and circuits for the QASPR program.
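
    The nested structure of a mixed aleatory-epistemic analysis can be sketched in a few lines; the brute-force sweep below is only a stand-in for the report's stochastic expansions and interval optimization, and the toy model, distributions, and interval bounds are assumptions for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def response(x, theta):
        """Toy system response: x is aleatory, theta is epistemic."""
        return theta * np.sin(x) + 0.1 * x ** 2

    def aleatory_statistic(theta, n=20_000):
        """Inner loop: Monte Carlo mean over the aleatory variable ~ N(0, 1)."""
        x = rng.normal(0.0, 1.0, size=n)
        return response(x, theta).mean()

    # Outer loop: epistemic parameter known only to lie in [0.5, 2.0].
    # (Interval optimization would search this range more efficiently.)
    thetas = np.linspace(0.5, 2.0, 31)
    stats = [aleatory_statistic(t) for t in thetas]
    print(f"epistemic interval on the mean response: "
          f"[{min(stats):.3f}, {max(stats):.3f}]")
    ```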

  4. Supernova relic neutrinos and the supernova rate problem: Analysis of uncertainties and detectability of ONeMg and failed supernovae

    SciTech Connect (OSTI)

    Mathews, Grant J. [Center for Astrophysics, Department of Physics, University of Notre Dame, Notre Dame, IN 46556 (United States); Hidaka, Jun; Kajino, Toshitaka; Suzuki, Jyutaro [National Astronomical Observatory of Japan, 2-21-1 Osawa, Mitaka, Tokyo 181-8588 (Japan)

    2014-08-01

    Direct measurements of the core collapse supernova rate (R{sub SN}) in the redshift range 0 ≤ z ≤ 1 appear to be about a factor of two smaller than the rate inferred from the measured cosmic massive star formation rate (SFR). This discrepancy would imply that about one-half of the massive stars that have been born in the local observed comoving volume did not explode as luminous supernovae. In this work, we explore the possibility that one could clarify the source of this 'supernova rate problem' by detecting the energy spectrum of supernova relic neutrinos with a next generation 10{sup 6} ton water Čerenkov detector like Hyper-Kamiokande. First, we re-examine the supernova rate problem. We make a conservative alternative compilation of the measured SFR data over the redshift range 0 ≤ z ≤ 7. We show that by only including published SFR data for which the dust obscuration has been directly determined, the ratio of the observed massive SFR to the observed supernova rate R{sub SN} has large uncertainties ∼1.8{sub −0.6}{sup +1.6} and is statistically consistent with no supernova rate problem. If we further consider that a significant fraction of massive stars will end their lives as faint ONeMg SNe or as failed SNe leading to a black hole remnant, then the ratio reduces to ∼1.1{sub −0.4}{sup +1.0} and the rate problem is essentially solved. We next examine the prospects for detecting this solution to the supernova rate problem. We first study the sources of uncertainty involved in the theoretical estimates of the neutrino detection rate and analyze whether the spectrum of relic neutrinos can be used to independently identify the existence of a supernova rate problem and its source. We consider an ensemble of published and unpublished core collapse supernova simulation models to estimate the uncertainties in the anticipated neutrino luminosities and temperatures. We illustrate how the spectrum of detector events might be used to establish the average neutrino temperature and constrain SN models. We also consider supernova ν-process nucleosynthesis to deduce constraints on the temperature of the various neutrino flavors. We study the effects of neutrino oscillations on the detected neutrino energy spectrum and also show that one might distinguish the equation of state (EoS) as well as the cause of the possible missing luminous supernovae from the detection of supernova relic neutrinos. We also analyze a possible enhanced contribution from failed supernovae leading to a black hole remnant as a solution to the supernova rate problem. We conclude that indeed it might be possible (though difficult) to measure the neutrino temperature, neutrino oscillations, and the EoS and confirm this source of missing luminous supernovae by the detection of the spectrum of relic neutrinos.

  5. Uncertainty quantification for evaluating impacts of caprock and reservoir properties on pressure buildup and ground surface displacement during geological CO2 sequestration

    SciTech Connect (OSTI)

    Bao, Jie; Hou, Zhangshuan; Fang, Yilin; Ren, Huiying; Lin, Guang

    2013-08-12

    A series of numerical test cases reflecting broad and realistic ranges of geological formation properties was developed to systematically evaluate and compare the impacts of those properties on geomechanical responses to CO2 injection. A coupled hydro-geomechanical subsurface transport simulator, STOMP (Subsurface Transport over Multiple Phases), was adopted to simulate the CO2 migration process and the geomechanical behavior of the surrounding geological formations. A quasi-Monte Carlo sampling method was applied to efficiently sample a high-dimensional parameter space consisting of injection rate and 14 subsurface formation properties, including porosity, permeability, entry pressure, irreducible gas and aqueous saturation, Young's modulus, and Poisson's ratio for both reservoir and caprock. Generalized cross-validation and analysis of variance methods were used to quantitatively measure the significance of the 15 input parameters. Reservoir porosity, permeability, and injection rate were found to be among the most significant factors affecting the geomechanical responses to CO2 injection. We used a quadrature generalized linear model to build a reduced-order model that can estimate the geomechanical response instantly instead of running computationally expensive numerical simulations. The injection pressure and ground surface displacement are often monitored for injection well safety and are believed to partially reflect the risk of fault reactivation and seismicity. Based on the reduced-order model and response surface, the input parameters can be screened to control the risk of induced seismicity. Because of the uncertainty in subsurface structural properties, a numerical simulation based on a single sample or a few samples does not accurately estimate the geomechanical response at an actual injection site. A probability of risk can instead be used to evaluate and predict the risk of injection when there is great uncertainty in the subsurface properties and operating conditions.
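
    The two ingredients named above, quasi-Monte Carlo sampling and a cheap response-surface surrogate, can be sketched as follows; the stand-in simulator, the three-parameter space, and the ranges are hypothetical rather than the STOMP configuration, and an ordinary quadratic least-squares fit stands in for the quadrature generalized linear model.

    ```python
    import numpy as np
    from itertools import combinations_with_replacement
    from scipy.stats import qmc

    def expensive_simulator(X):
        """Stand-in for a coupled hydro-geomechanical run.
        Columns: porosity, log10 permeability, injection-rate factor."""
        return 2.0 * X[:, 0] + 0.5 * X[:, 1] ** 2 + X[:, 0] * X[:, 2]

    lo, hi = [0.05, -14.0, 0.1], [0.35, -11.0, 1.0]
    sampler = qmc.Sobol(d=3, scramble=True, seed=1)
    X = qmc.scale(sampler.random_base2(m=7), lo, hi)   # 128 quasi-random samples
    y = expensive_simulator(X)

    def quad_features(X):
        """Constant, linear, and quadratic terms for a response surface."""
        cols = [np.ones(len(X))] + [X[:, i] for i in range(X.shape[1])]
        cols += [X[:, i] * X[:, j]
                 for i, j in combinations_with_replacement(range(X.shape[1]), 2)]
        return np.column_stack(cols)

    coef, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)

    X_test = qmc.scale(sampler.random(8), lo, hi)
    err = np.abs(quad_features(X_test) @ coef - expensive_simulator(X_test))
    print(f"surrogate max error on held-out points: {err.max():.2e}")
    ```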

  6. SU-E-J-125: Classification of CBCT Noises in Terms of Their Contribution to Proton Range Uncertainty

    SciTech Connect (OSTI)

    Brousmiche, S; Orban de Xivry, J; Macq, B; Seco, J

    2014-06-01

    Purpose: This study assesses the potential use of CBCT images in adaptive proton therapy by estimating the contribution of the main sources of noise and calibration errors to the proton range uncertainty. Methods: Measurements intended to highlight each particular source have been achieved by adapting either the testbench configuration, e.g. use of filtration, fan-beam collimation, beam stop arrays, phantoms and detector reset light, or the sequence of correction algorithms, including water precorrection. Additional Monte-Carlo simulations have been performed to complement these measurements, especially for the beam hardening and scatter cases. Simulations of proton beam penetration through the resulting images have then been carried out to quantify the range change due to these effects. The particular case of a brain irradiation is considered, mainly because of the multiple effects that the skull bones have on the internal soft tissues. Results: Chief among the range-error sources is the undercorrection of scatter. Its influence has been analyzed from a comparison of fan-beam and full axial FOV acquisitions. In this case, large range errors of about 12 mm can be reached if the assumption is made that the scatter has only a constant contribution over the projection images. Even the detector lag, which a priori induces a much smaller effect, has been shown to contribute up to 2 mm to the overall error if its correction only aims at reducing the skin artefact. This last result can partially be understood by the larger interface between tissues and bones inside the skull. Conclusion: This study has laid the basis for a more systematic analysis of the effect of CBCT noise on range uncertainties based on a combination of measurements, simulations, and theoretical results. With our method, even more subtle effects such as the cone-beam artifact or the detector lag can be assessed. SBR and JOR are financed by iMagX, a public-private partnership between the Walloon Region of Belgium and IBA under convention #1217662.

  7. Uncertainty Quantification and Management for Multi-scale Nuclear Materials Modeling

    SciTech Connect (OSTI)

    McDowell, David; Deo, Chaitanya; Zhu, Ting; Wang, Yan

    2015-10-21

    Understanding and improving microstructural mechanical stability in metals and alloys is central to the development of high-strength and high-ductility materials for cladding and core structures in advanced fast reactors. Design and enhancement of radiation-induced damage tolerant alloys are facilitated by better understanding the connection of various unit processes to collective responses in a multiscale model chain, including: dislocation nucleation, absorption and desorption at interfaces; vacancy production; radiation-induced segregation of Cr and Ni at defect clusters (point defect sinks) in BCC Fe-Cr ferritic/martensitic steels; interaction of interstitials and vacancies with impurities (V, Nb, Ta, Mo, W, Al, Si, P, S); time evolution of swelling (cluster growth) phenomena in irradiated materials; and energetics and kinetics of dislocation bypass of defects formed by interstitial clustering and formation of prismatic loops, informing statistical models of continuum character with regard to processes of dislocation glide, vacancy agglomeration and swelling, climb, and cross slip.

  8. Uncertainty quantification of cinematic imaging for development of predictive simulations of turbulent combustion.

    SciTech Connect (OSTI)

    Lawson, Matthew; Debusschere, Bert J.; Najm, Habib N.; Sargsyan, Khachik; Frank, Jonathan H.

    2010-09-01

    Recent advances in high frame rate complementary metal-oxide-semiconductor (CMOS) cameras coupled with high repetition rate lasers have enabled laser-based imaging measurements of the temporal evolution of turbulent reacting flows. This measurement capability provides new opportunities for understanding the dynamics of turbulence-chemistry interactions, which is necessary for developing predictive simulations of turbulent combustion. However, quantitative imaging measurements using high frame rate CMOS cameras require careful characterization of their noise, non-linear response, and variations in this response from pixel to pixel. We develop a noise model and calibration tools to mitigate these problems and to enable quantitative use of CMOS cameras. We have demonstrated proof of principle for image de-noising using both wavelet methods and Bayesian inference. The results offer new approaches for quantitative interpretation of imaging measurements from noisy data acquired with non-linear detectors. These approaches are potentially useful in many areas of scientific research that rely on quantitative imaging measurements.

  9. Integration of the Uncertainties of Anion and TOC Measurements into the Flammability Control Strategy for Sludge Batch 8 at the DWPF

    SciTech Connect (OSTI)

    Edwards, T. B.

    2013-03-14

    The Savannah River National Laboratory (SRNL) has been working with the Savannah River Remediation (SRR) Defense Waste Processing Facility (DWPF) in the development and implementation of a flammability control strategy for DWPF's melter operation during the processing of Sludge Batch 8 (SB8). SRNL's support has been in response to technical task requests made by SRR's Waste Solidification Engineering (WSE) organization. The flammability control strategy relies on measurements that are performed on Slurry Mix Evaporator (SME) samples by the DWPF Laboratory. Measurements of nitrate, oxalate, formate, and total organic carbon (TOC) standards generated by the DWPF Laboratory are presented in this report, and an evaluation of the uncertainties of these measurements is provided. The impact of the uncertainties of these measurements on DWPF's strategy for controlling melter flammability is also evaluated. The strategy includes monitoring each SME batch for its nitrate content and its TOC content relative to the nitrate content and relative to the antifoam additions made during the preparation of the SME batch. A linearized approach for monitoring the relationship between TOC and nitrate is developed, equations are provided that integrate the measurement uncertainties into the flammability control strategy, and sample calculations for these equations are shown to illustrate the impact of the uncertainties on the flammability control strategy.

  10. Impacts of WRF Physics and Measurement Uncertainty on California Wintertime Model Wet Bias

    SciTech Connect (OSTI)

    Chin, H S; Caldwell, P M; Bader, D C

    2009-07-22

    The Weather and Research Forecast (WRF) model version 3.0.1 is used to explore California wintertime model wet bias. In this study, two wintertime storms are selected from each of four major types of large-scale conditions: Pineapple Express, El Niño, La Niña, and synoptic cyclones. We test the impacts of several model configurations on precipitation bias through comparison with three sets of gridded surface observations: one from the National Oceanographic and Atmospheric Administration, and two variations from the University of Washington (without and with long-term trend adjustment; UW1 and UW2, respectively). To simplify validation, California is divided into four regions (Coast, Central Valley, Mountains, and Southern California). Simulations are driven by North American Regional Reanalysis data to minimize large-scale forcing error. Control simulations are conducted with 12-km grid spacing (low resolution), but additional experiments are performed at 2-km (high) resolution to evaluate the robustness of microphysics and cumulus parameterizations to resolution changes. We find that the choice of validation dataset has a significant impact on the model wet bias, and the forecast skill of model precipitation depends strongly on geographic location and storm type. Simulations with the right physics options agree better with UW1 observations. In 12-km resolution simulations, the Lin microphysics and the Kain-Fritsch cumulus scheme have better forecast skill in the coastal region, while the Goddard, Thompson, and Morrison microphysics and the Grell-Devenyi cumulus scheme perform better in the rest of California. The effect of planetary boundary layer, soil-layer, and radiation physics on model precipitation is weaker than that of microphysics and cumulus processes for short- to medium-range low-resolution simulations. Comparison of 2-km and 12-km resolution runs suggests a need for improvement of cumulus schemes, and supports the use of microphysics schemes in coarser-grid applications.

  11. Uncertainty Analysis of Runoff Simulations and Parameter Identifiability in the Community Land Model: Evidence from MOPEX Basins

    SciTech Connect (OSTI)

    Huang, Maoyi; Hou, Zhangshuan; Leung, Lai-Yung R.; Ke, Yinghai; Liu, Ying; Fang, Zhufeng; Sun, Yu

    2013-12-01

    With the emergence of earth system models as important tools for understanding and predicting climate change and its implications for mitigation and adaptation, it has become increasingly important to assess the fidelity of the land component within earth system models in capturing realistic hydrological processes and their response to the changing climate, and to quantify the associated uncertainties. This study investigates the sensitivity of runoff simulations to major hydrologic parameters in version 4 of the Community Land Model (CLM4) by integrating CLM4 with a stochastic exploratory sensitivity analysis framework at 20 selected watersheds from the Model Parameter Estimation Experiment (MOPEX) spanning a wide range of climate and site conditions. We found that for runoff simulations, the most significant parameters are those related to the subsurface runoff parameterizations. Soil texture related parameters and surface runoff parameters are of secondary significance. Moreover, climate and soil conditions play important roles in the parameter sensitivity. In general, site conditions within water-limited hydrologic regimes and with finer soil texture result in stronger sensitivity of output variables, such as runoff and its surface and subsurface components, to the input parameters in CLM4. This study demonstrated the feasibility of parameter inversion for CLM4 using streamflow observations to improve runoff simulations. By ranking the significance of the input parameters, we showed that the parameter set dimensionality could be reduced for CLM4 parameter calibration under different hydrologic and climatic regimes so that the inverse problem is less ill-posed.

  12. Impact of nucleon matrix element uncertainties on the interpretation of direct and indirect dark matter search results

    SciTech Connect (OSTI)

    Austri, R. Ruiz de

    2013-11-01

    We study in detail the impact of the current uncertainty in nucleon matrix elements on the sensitivity of direct and indirect experimental techniques for dark matter detection. We perform two scans in the framework of the cMSSM: one using recent values of the pion-sigma term obtained from Lattice QCD, and the other using values derived from experimental measurements. The two choices correspond to extreme values quoted in the literature and reflect the current tension between different ways of obtaining information about the structure of the nucleon. All other inputs in the scans, astrophysical and from particle physics, are kept unchanged. We use two experiments, XENON100 and IceCube, as benchmark cases to illustrate our case. We find that the interpretation of dark matter search results from direct detection experiments is more sensitive to the choice of the central values of the hadronic inputs than the results of indirect search experiments. The allowed regions of cMSSM parameter space after including XENON100 constraints strongly differ depending on the assumptions on the hadronic matrix elements used. On the other hand, the constraining potential of IceCube is almost independent of the choice of these values.

  13. Sparing Healthy Tissue and Increasing Tumor Dose Using Bayesian Modeling of Geometric Uncertainties for Planning Target Volume Personalization

    SciTech Connect (OSTI)

    Herschtal, Alan; Te Marvelde, Luc; Mengersen, Kerrie; Foroudi, Farshad; Eade, Thomas; Pham, Daniel; Caine, Hannah; Kron, Tomas

    2015-06-01

    Objective: To develop a mathematical tool that can update a patient's planning target volume (PTV) partway through a course of radiation therapy to more precisely target the tumor for the remainder of treatment and reduce dose to surrounding healthy tissue. Methods and Materials: Daily on-board imaging was used to collect large datasets of displacements for patients undergoing external beam radiation therapy for solid tumors. Bayesian statistical modeling of these geometric uncertainties was used to optimally trade off between displacement data collected from previously treated patients and the progressively accumulating data from a patient currently partway through treatment, to optimally predict future displacements for that patient. These predictions were used to update the PTV position and margin width for the remainder of treatment, such that the clinical target volume (CTV) was more precisely targeted. Results: Software simulation of dose to CTV and normal tissue for 2 real prostate displacement datasets consisting of 146 and 290 patients treated with a minimum of 30 fractions each showed that re-evaluating the PTV position and margin width after 8 treatment fractions reduced healthy tissue dose by 19% and 17%, respectively, while maintaining CTV dose. Conclusion: Incorporating patient-specific displacement patterns from early in a course of treatment allows PTV adaptation for the remainder of treatment. This substantially reduces the dose to healthy tissues and thus can reduce radiation therapy–induced toxicities, improving patient outcomes.
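
    A minimal sketch of the shrinkage idea behind such adaptation, assuming a one-dimensional conjugate normal-normal model: a population prior on the patient's systematic displacement is blended with the shifts accumulated over the first treatment fractions. The numbers, variances, and single-axis setup are illustrative assumptions, not the paper's full Bayesian model.

    ```python
    import numpy as np

    # Population prior on systematic displacement (mm, mm^2) -- assumed values.
    prior_mean, prior_var = 0.0, 4.0
    # Day-to-day (random) displacement variance (mm^2) -- assumed known.
    daily_var = 9.0

    # Shifts observed for this patient over the first 8 fractions (mm).
    shifts = np.array([2.1, 3.4, 1.8, 2.9, 3.1, 2.5, 2.2, 3.0])

    # Conjugate normal-normal update for the patient's systematic shift.
    n = len(shifts)
    post_var = 1.0 / (1.0 / prior_var + n / daily_var)
    post_mean = post_var * (prior_mean / prior_var + shifts.sum() / daily_var)

    print(f"updated systematic shift: {post_mean:.2f} mm "
          f"(sd {np.sqrt(post_var):.2f} mm)")
    # post_mean would re-centre the PTV; post_var would feed a narrower margin.
    ```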

  14. REVISITING THE COSMIC STAR FORMATION HISTORY: CAUTION ON THE UNCERTAINTIES IN DUST CORRECTION AND STAR FORMATION RATE CONVERSION

    SciTech Connect (OSTI)

    Kobayashi, Masakazu A. R.; Inoue, Yoshiyuki; Inoue, Akio K.

    2013-01-20

    The cosmic star formation rate density (CSFRD) has been observationally investigated out to redshift z ≈ 10. However, most of the theoretical models for galaxy formation underpredict the CSFRD at z ≳ 1. Since the theoretical models reproduce the observed luminosity functions (LFs), luminosity densities (LDs), and stellar mass density at each redshift, this inconsistency does not simply imply that theoretical models should incorporate some missing unknown physical processes in galaxy formation. Here, we examine the cause of this inconsistency at UV wavelengths by using a mock catalog of galaxies generated by a semi-analytic model of galaxy formation. We find that this inconsistency is due to two observational uncertainties: the dust obscuration correction and the conversion from UV luminosity to star formation rate (SFR). The methods for correction of obscuration and SFR conversion used in observational studies result in the overestimation of the CSFRD by ≈0.1–0.3 dex and ≈0.1–0.2 dex, respectively, compared to the results obtained directly from our mock catalog. We present new empirical calibrations for dust attenuation and conversion from observed UV LFs and LDs into the CSFRD.

  15. Techno-economic and uncertainty analysis of in situ and ex situ fast pyrolysis for biofuel production

    SciTech Connect (OSTI)

    Li, Boyan; Ou, Longwen; Dang, Qi; Meyer, Pimphan A.; Jones, Susanne B.; Brown, Robert C.; Wright, Mark

    2015-11-01

    This study evaluates the techno-economic uncertainty in cost estimates for two emerging biorefinery technologies for biofuel production: in situ and ex situ catalytic pyrolysis. Stochastic simulations based on process and economic parameter distributions are applied to calculate biorefinery performance and production costs. The probability distributions for the minimum fuel-selling price (MFSP) indicate that in situ catalytic pyrolysis has an expected MFSP of $4.20 per gallon with a standard deviation of $1.15 per gallon, while ex situ catalytic pyrolysis has a similar MFSP with a smaller deviation ($4.27 and $0.79 per gallon, respectively). These results suggest that a biorefinery based on ex situ catalytic pyrolysis could have a lower techno-economic risk than in situ pyrolysis despite a slightly higher MFSP estimate. Analysis of how each parameter affects the NPV indicates that internal rate of return, feedstock price, total project investment, electricity price, biochar yield, and bio-oil yield are the significant parameters with substantial impact on the MFSP for both in situ and ex situ catalytic pyrolysis.
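
    A stochastic cost simulation of this kind can be sketched in a few lines of Python; the parameter distributions and the toy MFSP relation below are assumptions for illustration, not the study's process model.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n = 100_000

    # Assumed input distributions (illustrative only).
    feedstock = rng.triangular(60.0, 80.0, 110.0, n)     # $/dry ton
    oil_yield = rng.normal(0.55, 0.05, n)                # fuel yield factor
    capital = rng.lognormal(np.log(4.0e8), 0.15, n)      # total investment ($)

    # Toy MFSP relation: costs scaled by yield; not the paper's cash-flow model.
    mfsp = (feedstock / 40.0 + capital / 2.0e8) / np.clip(oil_yield, 0.3, None)

    print(f"MFSP: ${mfsp.mean():.2f}/gal expected, sd ${mfsp.std():.2f}/gal")
    print(f"90% interval: ${np.percentile(mfsp, 5):.2f} "
          f".. ${np.percentile(mfsp, 95):.2f}")
    ```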

  16. Cellular and molecular research to reduce uncertainties in estimates of health effects from low-level radiation

    SciTech Connect (OSTI)

    Elkind, M.M.; Bedford, J.; Benjamin, S.A.; Waldren, C.A. ); Gotchy, R.L. )

    1990-10-01

    A study was undertaken by five radiation scientists to examine the feasibility of reducing the uncertainties in the estimation of risk due to protracted low doses of ionizing radiation. In addressing the question of feasibility, a review was made by the study group: of the cellular, molecular, and mammalian radiation data that are available; of the way in which altered oncogene properties could be involved in the loss of growth control that culminates in tumorigenesis; and of the progress that had been made in the genetic characterizations of several human and animal neoplasms. On the basis of this analysis, the study group concluded that, at the present time, it is feasible to mount a program of radiation research directed at the mechanism(s) of radiation-induced cancer with special reference to risk of neoplasia due to protracted, low doses of sparsely ionizing radiation. To implement a program of research, a review was made of the methods, techniques, and instruments that would be needed. This review was followed by a survey of the laboratories and institutions where scientific personnel and facilities are known to be available. A research agenda of the principal and broad objectives of the program is also discussed. 489 refs., 21 figs., 14 tabs.

  17. Calculating Individual Resources Variability and Uncertainty Factors Based on Their Contributions to the Overall System Balancing Needs

    SciTech Connect (OSTI)

    Makarov, Yuri V.; Du, Pengwei; Pai, M. A.; McManus, Bart

    2014-01-14

    The variability and uncertainty of wind power production require increased flexibility in power systems, or more operational reserves, to maintain a satisfactory level of reliability. The incremental increase in reserve requirements caused by wind power is often studied separately from the effects of loads. Accordingly, the cost of procuring reserves is allocated based on this simplification rather than on a fair and transparent calculation of the different resources' contributions to the reserve requirement. This work proposes a new allocation mechanism for the intermittency and variability of resources regardless of their type. It is based on a new formula, called the grid balancing metric (GBM). The proposed GBM has several distinct features: 1) it is directly linked to control performance standard (CPS) scores and interconnection frequency performance, 2) it provides scientifically defined allocation factors for individual resources, 3) the sum of allocation factors within any group of resources is equal to the group's collective allocation factor (linearity), and 4) it distinguishes helpers and harmers. The paper illustrates and provides results of the new approach based on actual transmission system operator (TSO) data.

  18. Dispelling Clouds of Uncertainty

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Lewis, Ernie; Teixeira, João

    2015-06-15

    How do you build a climate model that accounts for cloud physics and the transitions between cloud regimes? Use MAGIC.

  19. Dispelling Clouds of Uncertainty

    SciTech Connect (OSTI)

    Lewis, Ernie; Teixeira, João

    2015-06-15

    How do you build a climate model that accounts for cloud physics and the transitions between cloud regimes? Use MAGIC.

  20. Calculating Confidence, Uncertainty, and Numbers of Samples When Using Statistical Sampling Approaches to Characterize and Clear Contaminated Areas

    SciTech Connect (OSTI)

    Piepel, Gregory F.; Matzke, Brett D.; Sego, Landon H.; Amidan, Brett G.

    2013-04-27

    This report discusses the methodology, formulas, and inputs needed to make characterization and clearance decisions for Bacillus anthracis-contaminated and uncontaminated (or decontaminated) areas using a statistical sampling approach. Specifically, the report includes the methods and formulas for calculating 1) the number of samples required to achieve a specified confidence in characterization and clearance decisions, and 2) the confidence in making characterization and clearance decisions for a specified number of samples, for two common statistically based environmental sampling approaches. In particular, the report addresses an issue raised by the Government Accountability Office by providing methods and formulas to calculate the confidence that a decision area is uncontaminated (or successfully decontaminated) if all samples collected according to a statistical sampling approach have negative results. Key to addressing this topic is the probability that an individual sample result is a false negative, which is commonly referred to as the false negative rate (FNR). The two statistical sampling approaches currently discussed in this report are 1) hotspot sampling to detect small isolated contaminated locations during the characterization phase, and 2) combined judgment and random (CJR) sampling during the clearance phase. Typically, if contamination is widely distributed in a decision area, it will be detectable via judgment sampling during the characterization phase. Hotspot sampling is appropriate for characterization situations where contamination is not widely distributed and may not be detected by judgment sampling. CJR sampling is appropriate during the clearance phase when it is desired to augment judgment samples with statistical (random) samples. The hotspot and CJR statistical sampling approaches are discussed in the report for four situations: 1) qualitative data (detect and non-detect) when the FNR = 0 or when using statistical sampling methods that account for FNR > 0; 2) qualitative data when the FNR > 0 but statistical sampling methods are used that assume the FNR = 0; 3) quantitative data (e.g., contaminant concentrations expressed as CFU/cm2) when the FNR = 0 or when using statistical sampling methods that account for FNR > 0; and 4) quantitative data when the FNR > 0 but statistical sampling methods are used that assume the FNR = 0. For Situation 2, the hotspot sampling approach provides for stating with Z% confidence that a hotspot of specified shape and size with detectable contamination will be found. Also for Situation 2, the CJR approach provides for stating with X% confidence that at least Y% of the decision area does not contain detectable contamination. Forms of these statements for the other three situations are discussed in Section 2.2. Statistical methods that account for FNR > 0 currently exist only for the hotspot sampling approach with qualitative data (or quantitative data converted to qualitative data). This report documents the current status of methods and formulas for the hotspot and CJR sampling approaches. Limitations of these methods are identified. Extensions of the methods that are applicable when FNR = 0 to account for FNR > 0, or to address other limitations, will be documented in future revisions of this report if future funding supports the development of such extensions. For quantitative data, this report also presents statistical methods and formulas for 1) quantifying the uncertainty in measured sample results; 2) estimating the true surface concentration corresponding to a surface sample; and 3) quantifying the uncertainty of the estimate of the true surface concentration. All of the methods and formulas discussed in the report were applied to example situations to illustrate application of the methods and interpretation of the results.
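
    As a worked example of this style of calculation, the textbook simple-random-sampling formula for the FNR = 0, all-negative-results case gives the number of samples needed to state with X% confidence that at least Y% of a decision area is free of detectable contamination; the report's CJR formulas additionally credit judgment samples, so this unweighted sketch is only a stand-in for them.

    ```python
    import math

    def n_required(confidence_pct: float, clean_pct: float) -> int:
        """Samples needed so that, if all are negative (FNR = 0), one can state
        with confidence_pct% confidence that at least clean_pct% of the area
        contains no detectable contamination: n >= ln(1 - X) / ln(Y)."""
        X = confidence_pct / 100.0
        Y = clean_pct / 100.0
        return math.ceil(math.log(1.0 - X) / math.log(Y))

    print(n_required(95, 99))   # 299 samples: 95% confidence that >=99% is clean
    print(n_required(90, 95))   # 45 samples for a looser statement
    ```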

  1. Mineral dissolution and precipitation during CO2 injection at the Frio-I Brine Pilot: Geochemical modeling and uncertainty analysis

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Ilgen, A. G.; Cygan, R. T.

    2015-12-07

    During the Frio-I Brine Pilot CO2 injection experiment in 2004, distinct geochemical changes in response to the injection of 1600 tons of CO2 were recorded in samples collected from the monitoring well. Previous geochemical modeling studies have considered dissolution of calcite and iron oxyhydroxides, or release of adsorbed iron, as the most likely sources of the increased ion concentrations. In this modeling study we explore possible alternative sources of the increasing calcium and iron, based on data from the detailed petrographic characterization of the Upper Frio Formation “C”. In particular, we evaluate whether dissolution of pyrite and oligoclase (anorthite component) can account for the observed geochemical changes. Due to kinetic limitations, dissolution of pyrite and anorthite cannot account for the increased iron and calcium concentrations on the time scale of the field test (10 days). However, dissolution of these minerals contributes to carbonate and clay mineral precipitation on longer time scales (1000 years). The one-dimensional reactive transport model predicts that carbonate minerals, dolomite and ankerite, as well as the clay minerals kaolinite, nontronite, and montmorillonite, will precipitate in the Frio Formation “C” sandstone as the system progresses towards chemical equilibrium during a 1000-year period. Cumulative uncertainties associated with using different thermodynamic databases, activity correction models (Pitzer vs. B-dot), and extrapolating to reservoir temperature are manifested in the difference in the predicted mineral phases. Furthermore, these models are consistent with regard to the total volume of mineral precipitation and porosity values, which are predicted to within 0.002%.

  2. MCNP6 Results for the Phase III Sensitivity Benchmark of the OECD/NEA Expert Group on Uncertainty Analysis for Criticality Safety Assessment

    SciTech Connect (OSTI)

    Kiedrowski, Brian C.

    2012-06-19

    Within the last decade, there has been increasing interest in the calculation of cross section sensitivity coefficients of k{sub eff} for integral experiment design and uncertainty analysis. The OECD/NEA has an Expert Group devoted to Sensitivity and Uncertainty Analysis within the Working Party for Nuclear Criticality Safety. This expert group has developed benchmarks to assess code capabilities and performance for doing sensitivity and uncertainty analysis. Phase III of a set of sensitivity benchmarks evaluates capabilities for computing sensitivity coefficients. MCNP6 has the capability to compute cross section sensitivities for k{sub eff} using continuous-energy physics. To help verify this capability, results for the Phase III benchmark cases are generated and submitted to the Expert Group for comparison. The Phase III benchmark has three cases: III.1, an array of MOX fuel pins; III.2, a series of infinite lattices of MOX fuel pins with varying pitches; and III.3, two spheres with homogeneous mixtures of UF{sub 4} and polyethylene with different enrichments.

  3. Evaluating sub-national building-energy efficiency policy options under uncertainty: Efficient sensitivity testing of alternative climate, technological, and socioeconomic futures in a regional integrated-assessment model.

    SciTech Connect (OSTI)

    Scott, Michael J.; Daly, Don S.; Zhou, Yuyu; Rice, Jennie S.; Patel, Pralit L.; McJeon, Haewon C.; Kyle, G. Page; Kim, Son H.; Eom, Jiyong; Clarke, Leon E.

    2014-05-01

    Improving the energy efficiency of the building stock, commercial equipment and household appliances can have a major impact on energy use, carbon emissions, and building services. Subnational regions such as U.S. states wish to increase their energy efficiency, reduce carbon emissions or adapt to climate change. Evaluating subnational policies to reduce energy use and emissions is difficult because of the uncertainties in socioeconomic factors, technology performance and cost, and energy and climate policies. Climate change may undercut such policies. Assessing these uncertainties can be a significant modeling and computation burden. As part of this uncertainty assessment, this paper demonstrates how a decision-focused sensitivity analysis strategy using fractional factorial methods can be applied to reveal the important drivers for detailed uncertainty analysis.
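
    As an illustration of the fractional factorial machinery mentioned above, the sketch below constructs a standard two-level 2^(7-4) resolution III design in eight runs, generating four factor columns from interactions of a base 2^3 design; the mapping of columns to actual socioeconomic or technology drivers is hypothetical.

    ```python
    import numpy as np
    from itertools import product

    # Full two-level design in 3 base factors: 8 runs.
    base = np.array(list(product((-1, 1), repeat=3)))
    A, B, C = base.T

    # Saturated 2^(7-4) resolution III design: the remaining four factors are
    # aliased with interactions of the base factors (D=AB, E=AC, F=BC, G=ABC).
    design = np.column_stack([A, B, C, A * B, A * C, B * C, A * B * C])

    print(design)   # 8 runs x 7 factors, entries in {-1, +1}
    # Each row sets the low/high level of 7 uncertain drivers for one model run;
    # main effects are estimated by contrasts between the +1 and -1 groups.
    ```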

  4. Robustness of RISMC Insights under Alternative Aleatory/Epistemic Uncertainty Classifications: Draft Report under the Risk-Informed Safety Margin Characterization (RISMC) Pathway of the DOE Light Water Reactor Sustainability Program

    SciTech Connect (OSTI)

    Unwin, Stephen D.; Eslinger, Paul W.; Johnson, Kenneth I.

    2012-09-20

    The Risk-Informed Safety Margin Characterization (RISMC) pathway is a set of activities defined under the U.S. Department of Energy (DOE) Light Water Reactor Sustainability Program. The overarching objective of RISMC is to support plant life-extension decision-making by providing a state-of-knowledge characterization of safety margins in key systems, structures, and components (SSCs). A technical challenge at the core of this effort is to establish the conceptual and technical feasibility of analyzing safety margin in a risk-informed way, which, unlike conventionally defined deterministic margin analysis, would be founded on probabilistic characterizations of uncertainty in SSC performance. In the context of probabilistic risk assessment (PRA) technology, there has arisen a general consensus about the distinctive roles of two types of uncertainty: aleatory and epistemic, where the former represents irreducible, random variability inherent in a system, whereas the latter represents a state of knowledge uncertainty on the part of the analyst about the system which is, in principle, reducible through further research. While there is often some ambiguity about how any one contributing uncertainty in an analysis should be classified, there has nevertheless emerged a broad consensus on the meanings of these uncertainty types in the PRA setting. However, while RISMC methodology shares some features with conventional PRA, it will nevertheless be a distinctive methodology set. Therefore, the paradigms for classification of uncertainty in the PRA setting may not fully port to the RISMC environment. Yet the notion of risk-informed margin is based on the characterization of uncertainty, and it is therefore critical to establish a common understanding of uncertainty in the RISMC setting.

  5. ADVANCED NUCLEAR FUEL CYCLE EFFECTS ON THE TREATMENT OF UNCERTAINTY IN THE LONG-TERM ASSESSMENT OF GEOLOGIC DISPOSAL SYSTEMS - EBS INPUT

    SciTech Connect (OSTI)

    Sutton, M; Blink, J A; Greenberg, H R; Sharma, M

    2012-04-25

    The Used Fuel Disposition (UFD) Campaign within the Department of Energy's Office of Nuclear Energy (DOE-NE) Fuel Cycle Technology (FCT) program has been tasked with investigating the disposal of the nation's spent nuclear fuel (SNF) and high-level nuclear waste (HLW) for a range of potential waste forms and geologic environments. The planning, construction, and operation of a nuclear disposal facility is a long-term process that involves engineered barriers that are tailored to both the geologic environment and the waste forms being emplaced. The UFD Campaign is considering a range of fuel cycles that in turn produce a range of waste forms. The UFD Campaign is also considering a range of geologic media. These ranges could be thought of as adding uncertainty to what the disposal facility design will ultimately be; however, it may be preferable to thinking about the ranges as adding flexibility to design of a disposal facility. For example, as the overall DOE-NE program and industrial actions result in the fuel cycles that will produce waste to be disposed, and the characteristics of those wastes become clear, the disposal program retains flexibility in both the choice of geologic environment and the specific repository design. Of course, other factors also play a major role, including local and State-level acceptance of the specific site that provides the geologic environment. In contrast, the Yucca Mountain Project (YMP) repository license application (LA) is based on waste forms from an open fuel cycle (PWR and BWR assemblies from an open fuel cycle). These waste forms were about 90% of the total waste, and they were the determining waste form in developing the engineered barrier system (EBS) design for the Yucca Mountain Repository design. About 10% of the repository capacity was reserved for waste from a full recycle fuel cycle in which some actinides were extracted for weapons use, and the remaining fission products and some minor actinides were encapsulated in borosilicate glass. Because the heat load of the glass was much less than the PWR and BWR assemblies, the glass waste form was able to be co-disposed with the open cycle waste, by interspersing glass waste packages among the spent fuel assembly waste packages. In addition, the Yucca Mountain repository was designed to include some research reactor spent fuel and naval reactor spent fuel, within the envelope that was set using the commercial reactor assemblies as the design basis waste form. This milestone report supports Sandia National Laboratory milestone M2FT-12SN0814052, and is intended to be a chapter in that milestone report. The independent technical review of this LLNL milestone was performed at LLNL and is documented in the electronic Information Management (IM) system at LLNL. The objective of this work is to investigate what aspects of quantifying, characterizing, and representing the uncertainty associated with the engineered barrier are affected by implementing different advanced nuclear fuel cycles (e.g., partitioning and transmutation scenarios) together with corresponding designs and thermal constraints.

  6. REDUCING UNCERTAINTIES IN MODEL PREDICTIONS VIA HISTORY MATCHING OF CO2 MIGRATION AND REACTIVE TRANSPORT MODELING OF CO2 FATE AT THE SLEIPNER PROJECT

    SciTech Connect (OSTI)

    Zhu, Chen

    2015-03-31

    An important question for the Carbon Capture, Storage, and Utility program is “can we adequately predict the CO2 plume migration?” For tracking CO2 plume development, the Sleipner project in the Norwegian North Sea provides more time-lapse seismic monitoring data than any other site, but significant uncertainties still exist for some of the reservoir parameters. In Part I, we assessed model uncertainties by applying two multi-phase compositional simulators to the Sleipner Benchmark model for the uppermost layer (Layer 9) of the Utsira Sand and calibrated our model against the time-lapse seismic monitoring data for the site from 1999 to 2010. An approximate match with the observed plume was achieved by introducing lateral permeability anisotropy, adding CH4 into the CO2 stream, and adjusting the reservoir temperatures. Model-predicted gas saturation, CO2 accumulation thickness, and CO2 solubility in brine—none of which were used as calibration metrics—were all comparable with the interpretations of the seismic data in the literature. In Parts II and III, we evaluated the uncertainties of the predicted long-term CO2 fate up to 10,000 years due to uncertain reaction kinetics. Under four scenarios of kinetic rate laws, the temporal and spatial evolution of CO2 partitioning into the four trapping mechanisms (hydrodynamic/structural, solubility, residual/capillary, and mineral) was simulated with ToughReact, taking into account the CO2-brine-rock reactions and the multi-phase reactive flow and mass transport. Modeling results show that different rate laws for mineral dissolution and precipitation reactions resulted in different predicted amounts of CO2 trapped by carbonate minerals, with scenarios using the conventional linear rate law for feldspar dissolution having twice as much mineral trapping (21% of the injected CO2) as scenarios with a Burch-type or Alekseyev et al.–type rate law for feldspar dissolution (11%). So far, most reactive transport modeling (RTM) studies for CCUS have used the conventional rate law and have therefore simulated the upper bound of mineral trapping. However, neglecting the regional flow after injection, as most previous RTM studies have done, artificially limits the extent of geochemical reactions as if the system were a batch system. By replenishing undersaturated groundwater from upstream, the Utsira Sand is reactive over a time scale of 10,000 years. The results from this project have been communicated via five peer-reviewed journal articles, four conference proceeding papers, and 19 invited and contributed presentations at conferences and seminars.

  7. SU-F-BRD-09: Is It Sufficient to Use Only Low Density Tissue-Margin to Compensate Inter-Fractionation Setup Uncertainties in Lung Treatment?

    SciTech Connect (OSTI)

    Nie, K; Yue, N; Chen, T; Millevoi, R; Qin, S; Guo, J

    2014-06-15

    Purpose: In lung radiation treatment, the PTV is formed with a margin around the GTV (or CTV/ITV). Although the GTV is most likely of water-equivalent density, the PTV margin may be formed with the surrounding low-density tissues, which may lead to an unrealistic dosimetric plan. This study evaluates whether the concern about dose calculation inside a PTV with only a low-density margin is justified in lung treatment. Methods: Three SBRT cases were analyzed. The PTV from the original plan (Plan-O) was created with a 5–10 mm margin outside the ITV to incorporate setup errors and all mobility from 10 respiratory phases. Test plans were generated with the GTV shifted to the PTV edge to simulate the extreme situations with maximum setup uncertainties. Two representative positions, at the very posterior-superior (Plan-PS) and anterior-inferior (Plan-AI) edges, were considered. The virtual GTV was assigned a density of 1.0 g/cm3, and the surrounding lung, including the PTV margin, was assigned a density of 0.25 g/cm3. An additional plan with a 1 mm tissue margin instead of a full lung margin (Plan-Comp) was created to evaluate whether a composite margin gives a better approximation for dose calculation. All plans were generated on the average CT using the Analytical Anisotropic Algorithm with heterogeneity correction on, and all planning parameters/monitor units remained unchanged. DVH analyses were performed for comparisons. Results: Despite the non-static dose distribution, the high-dose region synchronized with the tumor positions. This might be due to scatter conditions, as greater doses are absorbed in the solid tumor than in the surrounding low-density lung tissue. However, the plans still showed missed target coverage in general. A certain level of composite margin might give a better approximation for the dose calculation. Conclusion: Our exploratory results suggest that with a lung-only margin, the planning dose to the PTV might overestimate the coverage of the target during treatment. The significance of this overestimation might warrant further investigation.

  8. Uncertainty in least-squares fits to the thermal noise spectra of nanomechanical resonators with applications to the atomic force microscope

    SciTech Connect (OSTI)

    Sader, John E.; Yousefi, Morteza; Friend, James R.; Melbourne Centre for Nanofabrication, Clayton, Victoria 3800

    2014-02-15

    Thermal noise spectra of nanomechanical resonators are used widely to characterize their physical properties. These spectra typically exhibit a Lorentzian response, with additional white noise due to extraneous processes. Least-squares fits of these measurements enable extraction of key parameters of the resonator, including its resonant frequency, quality factor, and stiffness. Here, we present general formulas for the uncertainties in these fit parameters due to sampling noise inherent in all thermal noise spectra. Good agreement with Monte Carlo simulation of synthetic data and measurements of an Atomic Force Microscope (AFM) cantilever is demonstrated. These formulas enable robust interpretation of thermal noise spectra measurements commonly performed in the AFM and adaptive control of fitting procedures with specified tolerances.
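
    A minimal sketch of the fit in question, assuming the usual damped simple-harmonic-oscillator form for the thermal noise spectrum plus a white-noise floor: synthetic data stand in for a measured spectrum, the exponential scatter mimics the sampling noise of periodogram bins, and the parameter values are arbitrary.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def psd_model(f, white, amp, f0, Q):
        """White noise floor plus a damped-SHO thermal noise peak."""
        return white + amp / ((1.0 - (f / f0) ** 2) ** 2 + (f / (f0 * Q)) ** 2)

    rng = np.random.default_rng(3)
    f = np.linspace(1.0e3, 1.0e5, 2000)                  # frequency bins (Hz)
    truth = (1.0e-6, 5.0e-5, 3.2e4, 150.0)               # assumed true parameters
    data = psd_model(f, *truth) * rng.exponential(1.0, f.size)  # sampling noise

    popt, pcov = curve_fit(psd_model, f, data,
                           p0=(1.0e-6, 4.0e-5, 3.0e4, 100.0), maxfev=20000)
    perr = np.sqrt(np.diag(pcov))                        # 1-sigma uncertainties
    for name, v, e in zip(("white", "amp", "f0", "Q"), popt, perr):
        print(f"{name}: {v:.3g} +/- {e:.3g}")
    ```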

  9. SIMULATION MODEL ANALYSIS OF THE MOST PROMISING GEOLOGIC SEQUESTRATION FORMATION CANDIDATES IN THE ROCKY MOUNTAIN REGION, USA, WITH FOCUS ON UNCERTAINTY ASSESSMENT

    SciTech Connect (OSTI)

    Lee, Si-Yong; Zaluski, Wade; Will, Robert; Eisinger, Chris; Matthews, Vince; McPherson, Brian

    2013-09-01

    The purpose of this report is to present results of reservoir model simulation analyses for forecasting subsurface CO2 storage capacity for the most promising formations in the Rocky Mountain region of the USA. A particular emphasis of this project was to assess the uncertainty of the simulation-based forecasts. Results illustrate how local-scale data, including well information, the number of wells, and the location of wells, affect storage capacity estimates, and what degree of well density (number of wells over a fixed area) may be required to estimate capacity within a specified degree of confidence. A major outcome of this work was the development of a new simulation analysis workflow that accommodates the addition of random pseudo-wells to represent virtual characterization wells.

  10. FURTHER STUDIES ON UNCERTAINTY, CONFOUNDING, AND VALIDATION OF THE DOSES IN THE TECHA RIVER DOSIMETRY SYSTEM: Concluding Progress Report on the Second Phase of Project 1.1

    SciTech Connect (OSTI)

    Degteva, M. O.; Anspaugh, L. R.; Napier, Bruce A.

    2009-10-23

    This is the concluding Progress Report for Project 1.1 of the U.S./Russia Joint Coordinating Committee on Radiation Effects Research (JCCRER). The overwhelming majority of our work this period has been to complete our primary obligation of providing a new version of the Techa River Dosimetry System (TRDS), which we call TRDS-2009D; the D denotes deterministic. This system provides estimates of individual doses to members of the Extended Techa River Cohort (ETRC) and post-natal doses to members of the Techa River Offspring Cohort (TROC). The latter doses were calculated with use of the TRDS-2009D. The doses for the members of the ETRC were made available to the American and Russian epidemiologists in September for their studies in deriving radiogenic risk factors. Doses for members of the TROC are being provided to European and Russian epidemiologists as partial input for studies of risk in this population. Two of our original goals for the completion of this nine-year phase of Project 1.1 were not completed. These are the completion of TRDS-2009MC, a Monte Carlo version of TRDS-2009 that could be used for more explicit analysis of the impact of uncertainty in doses on uncertainty in radiogenic risk factors, and the provision of household-specific external doses (rather than village averages). This second task was far along, but had to be delayed due to the lead investigators' work on consideration of a revised source term.

  11. TH-C-BRD-05: Reducing Proton Beam Range Uncertainty with Patient-Specific CT HU to RSP Calibrations Based On Single-Detector Proton Radiography

    SciTech Connect (OSTI)

    Doolan, P; Sharp, G; Testa, M; Lu, H-M; Bentefour, E; Royle, G

    2014-06-15

    Purpose: Beam range uncertainty in proton treatment comes primarily from converting the patient's X-ray CT (xCT) dataset to relative stopping power (RSP). Current practice uses a single curve for this conversion, produced by a stoichiometric calibration based on tissue composition data for average, healthy, adult humans, but not for the individual in question. Proton radiographs produce water-equivalent path length (WEPL) maps, dependent on the RSP of tissues within the specific patient. This work investigates the use of such WEPL maps to optimize patient-specific calibration curves for reducing beam range uncertainty. Methods: The optimization procedure works on the principle of minimizing the difference between the known WEPL map, obtained from a proton radiograph, and a digitally-reconstructed WEPL map (DRWM) through an RSP dataset, by altering the calibration curve that is used to convert the xCT into an RSP dataset. DRWMs were produced with Plastimatch, an in-house-developed software package, and an optimization procedure was implemented in Matlab. Tests were made on a range of systems, including simulated datasets with computed WEPL maps and phantoms (anthropomorphic and real biological tissue) with WEPL maps measured by single-detector proton radiography. Results: For the simulated datasets, the optimizer showed excellent results. It was able to either completely eradicate or significantly reduce the root-mean-square error (RMSE) in the WEPL for the homogeneous phantoms (to zero for individual materials, or from 1.5% to 0.2% for the simultaneous optimization of multiple materials). For the heterogeneous phantom the RMSE was reduced from 1.9% to 0.3%. Conclusion: An optimization procedure has been designed to produce patient-specific calibration curves. Test results on a range of systems with different complexities and sizes have been promising for accurate beam range control in patients. This project was funded equally by the Engineering and Physical Sciences Research Council (UK) and Ion Beam Applications (Louvain-La-Neuve, Belgium).

  12. Quantitative Analysis of Variability and Uncertainty in Environmental Data and Models. Volume 2. Performance, Emissions, and Cost of Combustion-Based NOx Controls for Wall and Tangential Furnace Coal-Fired Power Plants

    SciTech Connect (OSTI)

    Frey, H. Christopher; Tran, Loan K.

    1999-04-30

    This is Volume 2 of a two-volume set of reports describing work conducted at North Carolina State University sponsored by Grant Number DE-FG05-95ER30250 by the U.S. Department of Energy. The title of the project is Quantitative Analysis of Variability and Uncertainty in Acid Rain Assessments. The work conducted under sponsorship of this grant pertains primarily to two main topics: (1) development of new methods for quantitative analysis of variability and uncertainty applicable to any type of model; and (2) analysis of variability and uncertainty in the performance, emissions, and cost of electric power plant combustion-based NOx control technologies. These two main topics are reported separately in Volumes 1 and 2.

  13. Development, sensitivity analysis, and uncertainty quantification of high-fidelity arctic sea ice models.

    SciTech Connect (OSTI)

    Peterson, Kara J.; Bochev, Pavel Blagoveston; Paskaleva, Biliana S.

    2010-09-01

    Arctic sea ice is an important component of the global climate system and due to feedback effects the Arctic ice cover is changing rapidly. Predictive mathematical models are of paramount importance for accurate estimates of the future ice trajectory. However, the sea ice components of Global Climate Models (GCMs) vary significantly in their prediction of the future state of Arctic sea ice and have generally underestimated the rate of decline in minimum sea ice extent seen over the past thirty years. One of the contributing factors to this variability is the sensitivity of the sea ice to model physical parameters. A new sea ice model that has the potential to improve sea ice predictions incorporates an anisotropic elastic-decohesive rheology and dynamics solved using the material-point method (MPM), which combines Lagrangian particles for advection with a background grid for gradient computations. We evaluate the variability of the Los Alamos National Laboratory CICE code and the MPM sea ice code for a single year simulation of the Arctic basin using consistent ocean and atmospheric forcing. Sensitivities of ice volume, ice area, ice extent, root mean square (RMS) ice speed, central Arctic ice thickness, and central Arctic ice speed with respect to ten different dynamic and thermodynamic parameters are evaluated both individually and in combination using the Design Analysis Kit for Optimization and Terascale Applications (DAKOTA). We find similar responses for the two codes and some interesting seasonal variability in the strength of the parameters on the solution.

  14. Calculating Impacts of Energy Standards on Energy Demand in U.S. Buildings under Uncertainty with an Integrated Assessment Model: Technical Background Data

    SciTech Connect (OSTI)

    Scott, Michael J.; Daly, Don S.; Hathaway, John E.; Lansing, Carina S.; Liu, Ying; McJeon, Haewon C.; Moss, Richard H.; Patel, Pralit L.; Peterson, Marty J.; Rice, Jennie S.; Zhou, Yuyu

    2014-12-06

    This report presents data and assumptions employed in an application of PNNL's Global Change Assessment Model with a newly developed Monte Carlo analysis capability. The model is used to analyze the impacts of more aggressive U.S. residential and commercial building-energy codes and equipment standards on energy consumption and energy service costs at the state level, explicitly recognizing uncertainty in technology effectiveness and cost, socioeconomics, presence or absence of carbon prices, and climate impacts on energy demand. The report provides a summary of how residential and commercial buildings are modeled, together with assumptions made for the distributions of state-level population, Gross Domestic Product (GDP) per worker, efficiency and cost of residential and commercial energy equipment by end use, and efficiency and cost of residential and commercial building shells. The cost and performance of equipment and of building shells are reported separately for current building and equipment efficiency standards and for more aggressive standards. The report also details assumptions concerning future improvements brought about by projected trends in technology.

  15. Radiogenic p-isotopes from type Ia supernova, nuclear physics uncertainties, and galactic chemical evolution compared with values in primitive meteorites

    SciTech Connect (OSTI)

    Travaglio, C.; Gallino, R.; Rauscher, T.; Dauphas, N.; Röpke, F. K.; Hillebrandt, W. E-mail: claudia.travaglio@b2fh.org

    2014-11-10

    The nucleosynthesis of proton-rich isotopes is calculated for multi-dimensional Chandrasekhar-mass models of Type Ia supernovae (SNe Ia) with different metallicities. The predicted abundances of the short-lived radioactive isotopes {sup 92}Nb, {sup 97,98}Tc, and {sup 146}Sm are given in this framework. The abundance seeds are obtained by calculating s-process nucleosynthesis in the material accreted onto a carbon-oxygen white dwarf from a binary companion. A fine grid of s-seeds at different metallicities and {sup 13}C-pocket efficiencies is considered. A galactic chemical evolution model is used to predict the contribution of SNe Ia to the solar system p-nuclei composition measured in meteorites. Nuclear physics uncertainties are critical in determining the role of SNe Ia in the production of {sup 92}Nb and {sup 146}Sm. We find that, if standard Chandrasekhar-mass SNe Ia constitute at least 50% of all SNe Ia, they are strong candidates for reproducing the radiogenic p-process signature observed in meteorites.

  16. Dealing with natural gas uncertainties

    SciTech Connect (OSTI)

    Clements, J.; Graeber, D.

    1991-04-01

    Natural gas is, and over the next two decades will continue to be, the fuel of choice for new power generation: it is plentiful, environmentally acceptable, and relatively inexpensive. This paper reports that gas reserves on the North American continent continue to be discovered in amounts that may keep the gas bubble inflated far longer than currently estimated, and that new gas transportation capacity is actively being developed to overcome capacity bottlenecks and deliverability shortfalls. Natural gas prices will probably remain stable (with expected CPI-related increases) in the short run (2-4 years), and thereafter will probably rise faster than the CPI.

  17. Market Prices and Uncertainty Report

    Reports and Publications (EIA)

    2016-01-01

    Monthly analysis of crude oil, petroleum products, natural gas, and propane prices is released as a regular supplement to the Short-Term Energy Outlook.

  18. Image registration with uncertainty analysis

    DOE Patents [OSTI]

    Simonson, Katherine M. (Cedar Crest, NM)

    2011-03-22

    In an image registration method, edges are detected in a first image and a second image. A percentage of edge pixels in a subset of the second image that are also edges in the first image shifted by a translation is calculated. A best registration point is calculated based on a maximum percentage of edges matched. In a predefined search region, all registration points other than the best registration point are identified that are not significantly worse than the best registration point according to a predetermined statistical criterion.
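
    A hedged sketch of the edge-matching search the abstract describes: exhaustive integer shifts, scored by the fraction of edge pixels that coincide. The edge detector, search window, and demo image are illustrative choices, and the patent's statistical test for registration points that are "not significantly worse" is omitted.

        import numpy as np

        def edge_map(img, thresh=0.2):
            # Crude gradient-magnitude edge detector (a stand-in for the
            # method's actual edge-detection step).
            gy, gx = np.gradient(img.astype(float))
            mag = np.hypot(gx, gy)
            return mag > thresh * mag.max()

        def register(ref, test, search=5):
            # Score each candidate shift by the fraction of `test` edge pixels
            # that land on `ref` edge pixels; keep the best-matching shift.
            e_ref, e_test = edge_map(ref), edge_map(test)
            best_score, best_shift = -1.0, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    shifted = np.roll(e_ref, (dy, dx), axis=(0, 1))
                    score = (e_test & shifted).sum() / max(e_test.sum(), 1)
                    if score > best_score:
                        best_score, best_shift = score, (dy, dx)
            return best_shift, best_score

        rng = np.random.default_rng(0)
        img = rng.random((64, 64))
        shift, score = register(img, np.roll(img, (2, 3), axis=(0, 1)))
        print(shift, round(float(score), 3))  # expect a shift of (2, 3)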

  19. A New Approach to Modeling Aerosol Effects on East Asian Climate: Parametric Uncertainties Associated with Emissions, Cloud Microphysics and their Interactions

    SciTech Connect (OSTI)

    Yan, Huiping; Qian, Yun; Zhao, Chun; Wang, Hailong; Wang, Minghuai; Yang, Ben; Liu, Xiaohong; Fu, Qiang

    2015-09-16

    In this study, we adopt a parametric sensitivity analysis framework that integrates a quasi-Monte Carlo parameter sampling approach and a surrogate model to examine aerosol effects on the East Asian Monsoon climate simulated in the Community Atmosphere Model (CAM5). A total of 256 CAM5 simulations are conducted to quantify the model responses to uncertain parameters associated with cloud microphysics parameterizations and aerosol (e.g., sulfate, black carbon (BC), and dust) emission factors, and to their interactions. Results show that the interaction terms among parameters are important for quantifying the sensitivity of fields of interest, especially precipitation, to the parameters. The relative importance of cloud-microphysics parameters and emission factors depends on the evaluation metrics or model fields of interest, and the presence of uncertainty in cloud microphysics imposes an additional challenge in quantifying the impact of aerosols on cloud and climate. Due to their different optical and microphysical properties and spatial distributions, sulfate, BC, and dust aerosols have very different impacts on the East Asian Monsoon through aerosol-cloud-radiation interactions. The climatic effects of aerosols do not always respond monotonically to changes in emission factors. The spatial patterns of both the sign and magnitude of aerosol-induced changes in radiative fluxes, cloud, and precipitation can differ, depending on the aerosol type, when parameters are sampled in different ranges of values. We also identify the cloud microphysical parameters that have the most significant impact on the climatic effects induced by sulfate, BC, and dust, respectively, in East Asia.
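
    A minimal sketch of the quasi-Monte Carlo design plus surrogate fit, using scipy's Sobol sampler. The parameter names, ranges, toy model, and linear-plus-interaction surrogate are illustrative assumptions, not the CAM5 study's configuration (256 points echoes its simulation count).

        import numpy as np
        from scipy.stats import qmc

        sampler = qmc.Sobol(d=3, scramble=True, seed=1)
        x = qmc.scale(sampler.random_base2(m=8),        # 2**8 = 256 design points
                      [0.5, 0.1, 0.1], [2.0, 10.0, 10.0])

        def model(p):
            # Stand-in for a simulated climate metric; note the interaction term.
            return 1.2 * p[:, 0] + 0.4 * np.log(p[:, 1]) + 0.8 * p[:, 0] * np.log(p[:, 2])

        y = model(x)

        # Linear-plus-interaction surrogate fitted by least squares.
        f = np.column_stack([np.ones(len(x)), x,
                             x[:, 0] * x[:, 1], x[:, 0] * x[:, 2], x[:, 1] * x[:, 2]])
        coef, *_ = np.linalg.lstsq(f, y, rcond=None)
        print("surrogate coefficients:", np.round(coef, 3))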

  20. Mineral dissolution and precipitation during CO2 injection at the Frio-I Brine Pilot: Geochemical modeling and uncertainty analysis

    SciTech Connect (OSTI)

    Ilgen, A. G.; Cygan, R. T.

    2015-12-07

    During the Frio-I Brine Pilot CO2 injection experiment in 2004, distinct geochemical changes in response to the injection of 1600 tons of CO2 were recorded in samples collected from the monitoring well. Previous geochemical modeling studies have considered dissolution of calcite and iron oxyhydroxides, or release of adsorbed iron, as the most likely sources of the increased ion concentrations. In this modeling study we explore possible alternative sources of the increasing calcium and iron, based on data from the detailed petrographic characterization of the Upper Frio Formation "C". In particular, we evaluate whether dissolution of pyrite and oligoclase (anorthite component) can account for the observed geochemical changes. Due to kinetic limitations, dissolution of pyrite and anorthite cannot account for the increased iron and calcium concentrations on the time scale of the field test (10 days). However, dissolution of these minerals does contribute to carbonate and clay mineral precipitation on longer time scales (1000 years). The one-dimensional reactive transport model predicts that carbonate minerals (dolomite and ankerite) as well as clay minerals (kaolinite, nontronite, and montmorillonite) will precipitate in the Frio Formation "C" sandstone as the system progresses toward chemical equilibrium over a 1000-year period. Cumulative uncertainties associated with using different thermodynamic databases, activity correction models (Pitzer vs. B-dot), and extrapolating to reservoir temperature are manifested in differences in the predicted mineral phases. Nevertheless, these models are consistent with regard to the total volume of mineral precipitation and porosity, which are predicted to within 0.002%.

  1. Quantifying the Uncertainties of Aerosol Indirect Effects and Impacts on Decadal-Scale Climate Variability in NCAR CAM5 and CESM1

    SciTech Connect (OSTI)

    Park, Sungsu

    2014-12-12

    The main goal of this project is to systematically quantify the major uncertainties of aerosol indirect effects due to the treatment of the moist turbulent processes that drive aerosol activation, cloud macrophysics, and microphysics in response to anthropogenic aerosol perturbations, using CAM5/CESM1. To achieve this goal, the P.I. hired a postdoctoral research scientist (Dr. Anna Fitch), who started work on November 1, 2012. The first task was to quantify the role of subgrid vertical velocity variance in the activation and nucleation of cloud liquid droplets and ice crystals, and its impact on the aerosol indirect effect in CAM5. First, we analyzed various LES cases (from dry stable to cloud-topped PBL) to check whether the isotropic turbulence assumption used in CAM5 is valid. It turned out that this assumption is not universally valid. Consequently, from the LES analysis we derived an empirical formulation relaxing the isotropic turbulence assumption used for CAM5 aerosol activation and ice nucleation, implemented it in CAM5/CESM1, tested it in single-column and global simulation modes, and examined how it changed aerosol indirect effects. These results were reported in the poster session of the 18th Annual CESM Workshop, held in Breckenridge, CO, June 17-20, 2013. Because the empirical formulation was obtained from a limited number of LES simulations, its general applicability was questionable. The second task was therefore to derive a more fundamental analytical formulation relating vertical velocity variance to TKE, starting from basic physical principles. This was a challenging subject, but one that, if successful, could be implemented directly in CAM5 as a practical parameterization and contribute substantially to the project goal. Through about one year of intensive research, we found an appropriate mathematical formulation and worked to implement it in the CAM5 PBL and activation routines as parameterized numerical code. During this period, however, the postdoc accepted a position in Sweden and left NCAR in August 2014. Dr. Fitch is continuing this work part time, planning to finalize the research and write a paper in the near future.
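
    The isotropic assumption mentioned above has a simple closed form: with TKE e = (u'^2 + v'^2 + w'^2)/2 and all three variances equal, the subgrid vertical-velocity variance is 2e/3. A relaxed version with a regime-dependent fraction is sketched below; the coefficient c_w is a hypothetical placeholder, not the project's actual formulation.

        import numpy as np

        def sigma_w_isotropic(tke):
            # Isotropic turbulence: w'^2 = (2/3) * TKE, so sigma_w = sqrt(2e/3).
            return np.sqrt(2.0 * np.asarray(tke) / 3.0)

        def sigma_w_relaxed(tke, c_w):
            # Hypothetical relaxed form: a regime-dependent fraction c_w
            # replaces the fixed 2/3 used under isotropy.
            return np.sqrt(c_w * np.asarray(tke))

        print(sigma_w_isotropic(0.3))     # sigma_w in m/s for TKE = 0.3 m2/s2
        print(sigma_w_relaxed(0.3, 0.4))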

  2. SU-E-T-493: Analysis of the Impact of Range and Setup Uncertainties On the Dose to Brain Stem and Whole Brain in the Passively Scattered Proton Therapy Plans

    SciTech Connect (OSTI)

    Sahoo, N; Zhu, X; Zhang, X; Poenisch, F; Li, H; Wu, R; Lii, M; Umfleet, W; Gillin, M; Mahajan, A; Grosshans, D

    2014-06-01

    Purpose: To quantify the impact of range and setup uncertainties on various dosimetric indices used to assess normal tissue toxicities of patients receiving passive scattering proton beam therapy (PSPBT). Methods: Robust analysis was performed on sample treatment plans of six brain cancer patients treated with PSPBT at our facility for whom the maximum brain stem dose exceeded 5800 CGE. The DVH of each plan was calculated in the Eclipse treatment planning system (TPS) version 11, applying ±3.5% range uncertainty and ±3 mm shifts of the isocenter in the x, y, and z directions to account for setup uncertainties. Worst-case dose indices for the brain stem and whole brain were compared to their values in the nominal plan to determine the average change in their values. For the brain stem, the maximum dose to 1 cc of volume, dose to 10%, 50%, and 90% of volume (D10, D50, D90), and volume receiving 6000, 5400, 5000, 4500, and 4000 CGE (V60, V54, V50, V45, V40) were evaluated. For the whole brain, the maximum dose to 1 cc of volume and volume receiving 5400, 5000, 4500, 4000, and 3000 CGE (V54, V50, V45, V40, V30) were assessed. Results: The average changes in these indices in the worst-case scenarios relative to the nominal plan were as follows. Brain stem; maximum dose to 1 cc of volume: 1.1%, D10: 1.4%, D50: 8.0%, D90: 73.3%, V60: 116.9%, V54: 27.7%, V50: 21.2%, V45: 16.2%, V40: 13.6%. Whole brain; maximum dose to 1 cc of volume: 0.3%, V54: 11.4%, V50: 13.0%, V45: 13.6%, V40: 14.1%, V30: 13.5%. Conclusion: Large to modest changes in the dosimetric indices for brain stem and whole brain relative to the nominal plan were observed due to range and setup uncertainties. Such potential changes should be taken into account when using any dosimetric parameter for outcome evaluation of patients receiving proton therapy.
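
    The worst-case bookkeeping above reduces to: evaluate each dose index under every range/setup scenario, take the worst value, and report the change from the nominal plan. A minimal sketch with hypothetical per-voxel doses (not Eclipse TPS output):

        import numpy as np

        def dose_at_volume(doses, frac):
            # D_frac: minimum dose received by the hottest `frac` of the structure.
            return np.quantile(doses, 1.0 - frac)

        def worst_case_change(nominal, scenarios, frac=0.10):
            # Percent change of D10 in the worst of the perturbed scenarios.
            d_nom = dose_at_volume(nominal, frac)
            d_worst = max(dose_at_volume(s, frac) for s in scenarios)
            return 100.0 * (d_worst - d_nom) / d_nom

        rng = np.random.default_rng(3)
        nominal = rng.normal(5000.0, 300.0, 10_000)          # toy brainstem voxel doses
        scenarios = [nominal * f for f in (0.965, 1.035)]    # crude +/-3.5% range surrogate
        print(f"worst-case D10 change: {worst_case_change(nominal, scenarios):+.1f}%")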

  3. Neutronics qualification of the Jules Horowitz reactor fuel by interpretation of the VALMONT experimental program - Transposition of the uncertainties on the reactivity of JHR with JEF2.2 and JEFF3.1.1

    SciTech Connect (OSTI)

    Leray, O.; Hudelot, J. P.; Antony, M.; Doederlein, C.; Santamarina, A.; Bernard, D.; Vaglio-Gaudard, C.

    2011-07-01

    The new European material testing Jules Horowitz Reactor (JHR), currently under construction at the Cadarache center (CEA, France), will use LEU fuels (20% enrichment in {sup 235}U; U{sub 3}Si{sub 2} for the start-up and UMoAl in the future) which are quite different from the industrial oxide fuel for which an extensive neutronics qualification database has been established. The HORUS3D/N neutronics calculation scheme, used for the design and safety studies of the JHR, is being developed within the framework of a rigorous verification-validation-qualification methodology. In this framework, the experimental VALMONT (Validation of Aluminium Molybdenum uranium fuel for Neutronics) program has been performed in the MINERVE facility of CEA Cadarache (France) in order to qualify the capability of HORUS3D/N to accurately calculate the reactivity of the JHR. The MINERVE facility, using the oscillation technique, provides accurate measurements of the reactivity effect of samples. The VALMONT program includes oscillations of samples of UAl{sub x}/Al and UMo/Al with enrichments ranging from 0.2% to 20% and uranium densities from 2.2 to 8 g/cm{sup 3}. The geometry of the samples and the pitch of the experimental lattice ensure maximum representativeness of the neutron spectrum expected for the JHR. By comparing the effect of a sample with that of a known fuel specimen, the reactivity effect can be measured in absolute terms and compared to computational results. Special attention was paid to the rigorous determination and reduction of the experimental uncertainties. The calculational analysis of the VALMONT results was performed with the French deterministic code APOLLO2. A comparison of the impact of the different calculation methods, data libraries, and energy meshes that were tested is presented. The interpretation of the VALMONT experimental program allowed the qualification of the JHR fuel UMoAl8 (with an enrichment of 19.75% {sup 235}U) by the MINERVE-dedicated interpretation tool PIMS. The study of energy meshes and evaluations favored the JEFF3.1.1/SHEM scheme, which leads to a better calculation of the reactivity effect of the VALMONT samples. Then, in order to quantify the impact of the uncertainties linked to the basic nuclear data, their propagation from the cross-section measurements to the final computational result was analysed rigorously using a nuclear data re-estimation method based on Gauss-Newton iterations. This study concludes that the prior uncertainties due to nuclear data (uranium, aluminium, beryllium, and water) on the beginning-of-cycle (BOC) reactivity of the JHR core reach 1217 pcm at 2{sigma}. The largest single contribution to the uncertainty on the JHR reactivity is due to aluminium. (authors)
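
    First-order propagation of nuclear-data covariances to reactivity is the standard "sandwich rule", var(rho) = S M S^T, with S the sensitivity vector and M the data covariance matrix. The numbers below are illustrative, not the VALMONT/JHR sensitivities or covariances.

        import numpy as np

        S = np.array([[1200.0, -400.0, 150.0]])   # pcm per unit relative change in each datum
        M = np.diag([0.02, 0.05, 0.04]) ** 2      # relative covariance (uncorrelated here)
        sigma_rho = np.sqrt((S @ M @ S.T).item())
        print(f"1-sigma reactivity uncertainty: {sigma_rho:.0f} pcm")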

  4. Reducing Transaction Costs for Energy Efficiency Investments and Analysis of Economic Risk Associated With Building Performance Uncertainties: Small Buildings and Small Portfolios Program

    SciTech Connect (OSTI)

    Langner, R.; Hendron, B.; Bonnema, E.

    2014-08-01

    The small buildings and small portfolios (SBSP) sector faces a number of barriers that inhibit SBSP owners from adopting energy efficiency solutions. This pilot project focused on overcoming two of the largest barriers to financing energy efficiency in small buildings: disproportionately high transaction costs and unknown or unacceptable risk. Solutions to these barriers can be at odds with each other, because inexpensive turnkey solutions are often not sufficiently tailored to the unique circumstances of each building, reducing confidence that the expected energy savings will be achieved. To address these barriers, NREL worked with two innovative, forward-thinking lead partners, Michigan Saves and Energi, to develop technical solutions that provide a quick and easy process to encourage energy efficiency investments while managing risk. The pilot project was broken into two stages: the first focused on reducing transaction costs, the second on reducing performance risk. In the first stage, NREL worked with the nonprofit organization Michigan Saves to analyze the effects of 8 energy efficiency measures (EEMs) on 81 different baseline small office building models in Holland, Michigan (climate zone 5A). The results of this analysis (totaling over 30,000 cases) are summarized in a simple spreadsheet tool that enables users to easily sort through the results and find appropriate small office EEM packages that meet a particular energy savings threshold and are likely to be cost-effective.

  5. Monitoring CO2 Storage at Cranfield, Mississippi with Time-Lapse Offset VSP Using Integration and Modeling to Reduce Uncertainty

    SciTech Connect (OSTI)

    Daley, Thomas M.; Hendrickson, Joel; Queen, John H.

    2014-12-31

    A time-lapse Offset Vertical Seismic Profile (OVSP) data set was acquired as part of a subsurface monitoring program for geologic sequestration of CO2. The storage site at Cranfield, near Natchez, Mississippi, is part of a detailed area study (DAS) site for geologic carbon sequestration operated by the U.S. Dept. of Energy's Southeast Regional Carbon Sequestration Partnership (SECARB). The DAS site includes three boreholes, an injection well and two monitoring wells. The project team selected the DAS site to examine CO2 sequestration multiphase fluid flow and pressure at the interwell scale in a brine reservoir. The time-lapse (TL) OVSP was part of an integrated monitoring program that included well logs, crosswell seismic, electrical resistance tomography and 4D surface seismic. The goals of the OVSP were to detect the CO2 induced change in seismic response, give information about the spatial distribution of CO2 near the injection well and to help tie the high-resolution borehole monitoring to the 4D surface data. The VSP data were acquired in well CFU 31-F1, which is the ~3200 m deep CO2 injection well at the DAS site. A preinjection survey was recorded in late 2009, with injection beginning in December 2009, and a post injection survey was conducted in Nov 2010 following injection of about 250 kT of CO2. The sensor array for both surveys was a 50-level, 3-component, Sercel MaxiWave system with 15 m (49 ft) spacing between levels. The source for both surveys was an accelerated weight drop, with different source trucks used for the two surveys. Consistent time-lapse processing was applied to both data sets. Time-lapse processing generated difference corridor stacks to investigate CO2 induced reflection amplitude changes from each source point. Corridor stacks were used for amplitude analysis to maximize the signal-to-noise ratio (S/N) for each shot point. Spatial variation in reflectivity (used to 'map' the plume) was similar in magnitude to the corridor stacks but, due to relatively lower S/N, the results were less consistent and more sensitive to processing and therefore are not presented. We examined the overall time-lapse repeatability of the OVSP data using three methods: the NRMS and Predictability (Pred) measures of Kragh and Christie (2002) and the signal-to-distortion ratio (SDR) method of Cantillo (2011). Because time-lapse noise was comparable to the observed change, multiple methods were used to analyze data reliability. The reflections from the top and base reservoir were identified on the corridor stacks by correlation with a synthetic response generated from the well logs. A consistent change in the corridor stack amplitudes from pre- to post-CO2 injection was found for both the top and base reservoir reflections on all ten shot locations analyzed. In addition to the well-log synthetic response, a finite-difference elastic wave propagation model was built based on rock/fluid properties obtained from well logs, with CO2 induced changes guided by time-lapse crosswell seismic tomography (Ajo-Franklin, et al., 2013) acquired at the DAS site. Time-lapse seismic tomography indicated that two reservoir zones were affected by the flood. The modeling established that interpretation of the VSP trough and peak event amplitudes as reflectivity from the top and bottom of the reservoir is appropriate even with possible tuning effects. Importantly, this top/base change gives confidence in an interpretation that these changes arise from within the reservoir, not from bounding lithology. The modeled time-lapse change and the observed field data change from 10 shotpoints are in agreement for both magnitude and polarity of amplitude change for the top and base of the reservoir. Therefore, we conclude the stored CO2 has been successfully detected and, furthermore, the observed seismic reflection change can be applied to Cranfield's 4D surface seismic for spatially delineating the CO2/brine interface.
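
    The NRMS repeatability metric cited above (Kragh and Christie, 2002) has a compact definition: 200 times the RMS of the trace difference divided by the sum of the individual RMS amplitudes, so identical traces score 0% and uncorrelated noise about 141%. A sketch on toy traces (the Predictability and SDR measures are omitted here):

        import numpy as np

        def nrms(a, b):
            # NRMS = 200 * RMS(a - b) / (RMS(a) + RMS(b)), in percent.
            rms = lambda x: np.sqrt(np.mean(np.square(x)))
            return 200.0 * rms(a - b) / (rms(a) + rms(b))

        t = np.linspace(0.0, 1.0, 500)
        base = np.sin(2 * np.pi * 30 * t) * np.exp(-3 * t)   # toy baseline trace
        monitor = base + 0.05 * np.random.default_rng(7).normal(size=t.size)
        print(f"NRMS = {nrms(base, monitor):.1f}%")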

  6. Monitoring CO2 Storage at Cranfield, Mississippi with Time-Lapse Offset VSP – Using Integration and Modeling to Reduce Uncertainty

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Daley, Thomas M.; Hendrickson, Joel; Queen, John H.

    2014-12-31

    A time-lapse Offset Vertical Seismic Profile (OVSP) data set was acquired as part of a subsurface monitoring program for geologic sequestration of CO2. The storage site at Cranfield, near Natchez, Mississippi, is part of a detailed area study (DAS) site for geologic carbon sequestration operated by the U.S. Dept. of Energy's Southeast Regional Carbon Sequestration Partnership (SECARB). The DAS site includes three boreholes, an injection well and two monitoring wells. The project team selected the DAS site to examine CO2 sequestration multiphase fluid flow and pressure at the interwell scale in a brine reservoir. The time-lapse (TL) OVSP was part of an integrated monitoring program that included well logs, crosswell seismic, electrical resistance tomography and 4D surface seismic. The goals of the OVSP were to detect the CO2 induced change in seismic response, give information about the spatial distribution of CO2 near the injection well and to help tie the high-resolution borehole monitoring to the 4D surface data. The VSP data were acquired in well CFU 31-F1, which is the ~3200 m deep CO2 injection well at the DAS site. A preinjection survey was recorded in late 2009 with injection beginning in December 2009, and a post injection survey was conducted in Nov 2010 following injection of about 250 kT of CO2. The sensor array for both surveys was a 50-level, 3-component, Sercel MaxiWave system with 15 m (49 ft) spacing between levels. The source for both surveys was an accelerated weight drop, with different source trucks used for the two surveys. Consistent time-lapse processing was applied to both data sets. Time-lapse processing generated difference corridor stacks to investigate CO2 induced reflection amplitude changes from each source point. Corridor stacks were used for amplitude analysis to maximize the signal-to-noise ratio (S/N) for each shot point. Spatial variation in reflectivity (used to 'map' the plume) was similar in magnitude to the corridor stacks but, due to relatively lower S/N, the results were less consistent and more sensitive to processing and therefore are not presented. We examined the overall time-lapse repeatability of the OVSP data using three methods, the NRMS and Predictability (Pred) measures of Kragh and Christie (2002) and the signal-to-distortion ratio (SDR) method of Cantillo (2011). Because time-lapse noise was comparable to the observed change, multiple methods were used to analyze data reliability. The reflections from the top and base reservoir were identified on the corridor stacks by correlation with a synthetic response generated from the well logs. A consistent change in the corridor stack amplitudes from pre- to post-CO2 injection was found for both the top and base reservoir reflections on all ten shot locations analyzed. In addition to the well-log synthetic response, a finite-difference elastic wave propagation model was built based on rock/fluid properties obtained from well logs, with CO2 induced changes guided by time-lapse crosswell seismic tomography (Ajo-Franklin, et al., 2013) acquired at the DAS site. Time-lapse seismic tomography indicated that two reservoir zones were affected by the flood. The modeling established that interpretation of the VSP trough and peak event amplitudes as reflectivity from the top and bottom of reservoir is appropriate even with possible tuning effects. Importantly, this top/base change gives confidence in an interpretation that these changes arise from within the reservoir, not from bounding lithology. The modeled time-lapse change and the observed field data change from 10 shotpoints are in agreement for both magnitude and polarity of amplitude change for top and base of reservoir. Therefore, we conclude the stored CO2 has been successfully detected and, furthermore, the observed seismic reflection change can be applied to Cranfield's 4D surface seismic for spatially delineating the CO2/brine interface.

  7. explicit representation of uncertainty in solar generation

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


  8. explicit representation of uncertainty in system load

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


  9. explicit representation of uncertainty in wind generation

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


  10. Characterizing Uncertainties in Ice Particle Size Distributions

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    For image(s), see ARM Research Highlights (http://www.arm.gov/science/highlights). In many parameterization schemes for numerical models or remote sensing...

  11. Validation and Uncertainty Characterization for Energy Simulation

    Broader source: Energy.gov [DOE]

    Lead Performers: -- Lawrence Berkeley National Laboratory (LBNL) Berkeley, CA -- Oak Ridge National Laboratory (ORNL) Oak Ridge, TN -- National Renewable Energy Laboratory (NREL) Golden, CO -- Argonne National Laboratory (ANL) Argonne, IL

  12. PMU Uncertainty Quantification in Voltage Stability Analysis...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Year of Publication: 2015. Authors: Chen, C.; Wang, J.; Li, Z.; Sun, H.; Wang, Z. Journal: IEEE Transactions on Power Systems, Volume 30, Issue 4, Start Page 2196. Date...

  13. Uncertainty quantification methodologies development for storage...

    Office of Scientific and Technical Information (OSTI)


  14. Reducing Petroleum Dependence in California: Uncertainties About...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site


  15. Validation and Uncertainty Characterization for Energy Simulation...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    reinforce the accuracy and confidence benefits. CONTACTS DOE Technology Manager: Amir Roth, Amir.Roth@ee.doe.gov Lead Performers: Philip Haves, LBNL (phaves@lbl.gov); Joshua...

  16. Approaches for Uncertainty Quantification and Sensitivity Analysis

    Broader source: Energy.gov [DOE]

    Presentation from the 2015 Annual Performance and Risk Assessment (P&RA) Community of Practice (CoP) Technical Exchange Meeting held in Richland, Washington on December 15-16, 2015.

  17. Microsoft Word - Price Uncertainty Supplement.doc

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    ... This can be seen in the spread between the green (July 1 forwards) and black curves ... Source: U.S. EIA, CME Group. U.S. Energy Information Administration, Short-Term Energy Outlook ...

  18. Microsoft Word - Price Uncertainty Supplement.doc

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    ... In Figure 3, the then-prompt February 2009 futures contract (green dashed curve in Figure ... Figure 4 shows the comparable 2007 period and the same effect. U.S. Energy Information ...

  19. H. R. 2998: A bill to amend the Natural Gas Act to permit the development of coalbed methane gas in areas where its development has been impeded or made impossible by uncertainty and litigation over ownership rights, and for other purposes, introduced in the US House of Representatives, One Hundred Second Congress, First Session, July 23, 1991

    SciTech Connect (OSTI)

    Not Available

    1991-01-01

    This bill would direct the Secretary of Energy to compile a list of affected states, determined to be states in which disputes, uncertainty, or litigation exist or potentially exist regarding the ownership of coalbed methane; in which the development of significant deposits of coalbed methane may be impeded by such disputes; in which statutory or regulatory procedures permitting and encouraging development of coalbed methane prior to final resolution of disputes are not in place; and in which extensive development of coalbed methane does not exist. Colorado, Montana, New Mexico, Wyoming, Utah, Virginia, and Alabama are excluded from such a list since they currently have development of coalbed methane. Until the Secretary of Energy publishes a different list, the affected states are West Virginia, Pennsylvania, Kentucky, Ohio, Tennessee, Indiana, and Illinois, effective on the date of enactment of this bill.

  20. THE EFFECT OF UNCERTAINTY IN MODELING COEFFICIENTS USED TO PREDICT...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    likely to vary for systems in different climates, depending on the distribution of temperature and irradiance, as well as for different technology types. As previously reported,...

  1. Malaria and global change: Insights, uncertainties and possible surprises

    SciTech Connect (OSTI)

    Martin, P.H.; Steel, A.

    1996-12-31

    Malaria may change with global change. Indeed, global change may affect malaria risk and malaria epidemiology. Malaria risk may change in response to a greenhouse warming; malaria epidemiology, in response to the social, economic, and political developments which a greenhouse warming may trigger. To date, malaria receptivity and epidemiology futures have been explored within the context of equilibrium studies. Equilibrium studies of climate change postulate an equilibrium present climate (the starting point) and a doubled-carbon dioxide climate (the end point), simulate conditions in both instances, and compare the two. What happens while climate changes, i.e., between the starting point and the end point, is ignored. The present paper focuses on malaria receptivity and addresses what equilibrium studies miss, namely transient malaria dynamics.

  2. Uncertainty quantification of US Southwest climate from IPCC...

    Office of Scientific and Technical Information (OSTI)

    The Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report (AR4) made extensive use of coordinated simulations by 18 international...

  3. A Regulator’s Perspective on Assessing Temporal Uncertainty

    Broader source: Energy.gov [DOE]

    Presentation from the 2015 Annual Performance and Risk Assessment (P&RA) Community of Practice (CoP) Technical Exchange Meeting held in Richland, Washington on December 15-16, 2015.

  4. Invited Review Article: Error and uncertainty in Raman thermal...

    Office of Scientific and Technical Information (OSTI)


  5. Optimizing weak lensing mass estimates for cluster profile uncertainty

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Gruen, D.; Bernstein, G. M.; Lam, T. Y.; Seitz, S.

    2011-09-11

    Weak lensing measurements of cluster masses are necessary for calibrating mass-observable relations (MORs) to investigate the growth of structure and the properties of dark energy. However, the measured cluster shear signal varies at fixed mass M200m due to inherent ellipticity of background galaxies, intervening structures along the line of sight, and variations in the cluster structure due to scatter in concentrations, asphericity and substructure. We use N-body simulated halos to derive and evaluate a weak lensing circular aperture mass measurement Map that minimizes the mass estimate variance <(Map - M200m){sup 2}> in the presence of all these forms of variability. Depending on halo mass and observational conditions, the resulting mass estimator improves on Map filters optimized for circular NFW-profile clusters in the presence of uncorrelated large scale structure (LSS) about as much as the latter improve on an estimator that only minimizes the influence of shape noise. Optimizing for uncorrelated LSS while ignoring the variation of internal cluster structure puts too much weight on the profile near the cores of halos, and under some circumstances can even be worse than not accounting for LSS at all. As a result, we discuss the impact of variability in cluster structure and correlated structures on the design and performance of weak lensing surveys intended to calibrate cluster MORs.

  6. Monte Carlo Solution for Uncertainty Propagation in Particle...

    Office of Scientific and Technical Information (OSTI)

    Resource Relation: Conference: International Conference on Math. and Comp. Methods Applied ... presentation at the International Conference on Math. and Comp. Methods Applied to Nucl. ...

  7. Improvements to Nuclear Data and Its Uncertainties by Theoretical...

    Office of Scientific and Technical Information (OSTI)

    multiplicity, and produce new evaluated files of U and Pu ... global and microscopic model input parameters, leading ... While in the past the design, construction and ...

  8. Estimation and Uncertainty Analysis of Impacts of Future Heat...

    Office of Scientific and Technical Information (OSTI)

    DOE Contract Number: AC05-76RL01830 Resource Type: Journal Article Resource Relation: Journal Name: Environmental Health Perspectives, 122(1):10-16 Research Org: Pacific Northwest ...

  9. Uncertainty Quantification with Monte Carlo Hauser-Feshbach Calculatio...

    Office of Scientific and Technical Information (OSTI)

    LANL Country of Publication: United States Language: English Subject: Atomic and Nuclear Physics; Nuclear Fuel Cycle & Fuel Materials(11); Nuclear Physics & Radiation Physics(73)...

  10. Addressing Uncertainty in Design Inputs: A Case Study of Probabilisti...

    Office of Environmental Management (EM)

    used in settlement profile development - Soft zone location and depth - Thickness and shape of the soft zone - Beta angle - Consolidation strain - Subgrade modulus *...

  11. Modeling Correlations In Prompt Neutron Fission Spectra Uncertainties...

    Office of Scientific and Technical Information (OSTI)

    Country of Publication: United States Language: English Subject: General Studies of Nuclear Reactors(22); Instrumentation Related to Nuclear Science & Technology(46); Military ...

  12. Optimized Uncertainty Quantification Algorithm Within a Dynamic Event Tree Framework

    SciTech Connect (OSTI)

    Nielsen, J. W.; Tokuhiro, Akira; Hiromoto, Robert

    2014-06-01

    Methods for developing Phenomenological Identification and Ranking Tables (PIRT) for nuclear power plants have been a useful tool in providing insight into modeling aspects that are important to safety. These methods combine expert knowledge of reactor plant transients and thermal-hydraulic codes to identify the areas of highest importance. Quantified PIRT (QPIRT) provides a rigorous method for quantifying the phenomena that can have the greatest impact. The transients that are evaluated, and the timing of those events, are typically developed in collaboration with Probabilistic Risk Analysis (PRA). Though quite effective in evaluating risk, traditional PRA methods lack the capability to evaluate complex dynamic systems where end states may vary as a function of the transition time from physical state to physical state. Dynamic PRA (DPRA) methods provide a more rigorous analysis of such systems. A limitation of DPRA is its potential for state (combinatorial) explosion, which grows with the number of components as well as with the sampling of state-to-state transition times for the entire system. This paper presents a method for performing QPIRT within a dynamic event tree framework, such that the timing events that result in the highest probabilities of failure are captured while the discrete dynamic event tree evaluation is performed, yielding a formal QPIRT for each end state. Because the use of dynamic event trees leads to state explosion as the number of possible component states increases, this paper utilizes a branch-and-bound algorithm to optimize the solution of the dynamic event trees and summarizes the methods used to implement it.
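
    The pruning idea can be illustrated on a toy dynamic event tree: a partial path's probability bounds every completion of it (branch probabilities are at most one), so subtrees below a probability floor can be discarded. This sketch is a generic best-first branch-and-bound, not the paper's QPIRT algorithm.

        import heapq

        def prune_event_tree(branch_probs, depth, prob_floor=1e-3):
            # Best-first enumeration of event-tree paths; prune any branch
            # whose path probability (an upper bound on its leaves) is below
            # the floor. Every level offers the same branching for simplicity.
            done, heap = [], [(-1.0, ())]
            while heap:
                neg_p, path = heapq.heappop(heap)
                if len(path) == depth:
                    done.append((path, -neg_p))
                    continue
                for outcome, bp in branch_probs.items():
                    p = -neg_p * bp
                    if p >= prob_floor:    # bound step: drop improbable subtrees
                        heapq.heappush(heap, (-p, path + (outcome,)))
            return done

        for path, p in prune_event_tree({"ok": 0.9, "fail": 0.1}, depth=4):
            print(path, round(p, 5))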

  13. Uncertainty analysis of multi-rate kinetics of uranium desorption...

    Office of Scientific and Technical Information (OSTI)

    In this paper, a Bayes-based, Differential Evolution Markov Chain method was used to ... Country of Publication: United States Language: English Subject: Markov Chain, ...

  14. Enabling Efficient Uncertainty Quantification Using Adjoint-based...

    Office of Scientific and Technical Information (OSTI)

    not provided. Authors: Wildey, Timothy Michael 1 ; Shadid, John N. 1 ; Cyr, Eric C 1 ; Constantine, Paul 1 + Show Author Affiliations Sandia National Lab. (SNL-NM),...

  15. Monte Carlo Solution for Uncertainty Propagation in Particle Transport with a Stochastic Galerkin Method (Conference)

    Office of Scientific and Technical Information (OSTI)

    Authors: Franke, Brian C.; Prinja, Anil K. Publication Date: 2013-01-01. OSTI Identifier: 1063492. Report Number(s): SAND2013-0204C. DOE Contract Number: AC04-94AL85000. Resource Type: Conference. Resource Relation: Conference: Proposed for presentation at the International Conference on Math. and Comp. Methods Applied to Nucl. Sci. and Engg. (M&C 2013) held May 5-9, 2013 in Sun Valley, ID. Research Org: Sandia National...

  16. Monte Carlo Solution for Uncertainty Propagation in Particle Transport with a Stochastic Galerkin Method (Conference)

    Office of Scientific and Technical Information (OSTI)

    Abstract not provided. Authors: Franke, Brian C.; Prinja, Anil K. Publication Date: 2013-04-01. OSTI Identifier: 1078905. Report Number(s): SAND2013-3409C; 448625. DOE Contract Number: AC04-94AL85000. Resource Type: Conference. Resource Relation: Conference: International Conference on Math. and Comp. Methods Applied to Nucl. Sci. and Engg. (M&C 2013) held May 5-9, 2013 in Sun Valley, ID.; Related Information: Proposed for...

  17. Influence of Nuclear Fuel Cycles on Uncertainty of Long Term...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    transmutation, which remove actinides, will not materially alter the performance, the spread in dose results around the mean, the modeling effort to include significant features,...

  18. Modeling Correlations In Prompt Neutron Fission Spectra Uncertainties...

    Office of Scientific and Technical Information (OSTI)

    Country of Publication: United States Language: English Subject: General Studies of Nuclear Reactors(22); Instrumentation Related to Nuclear Science & Technology(46); Military...

  19. Uncertainties in Air Exchange using Continuous-Injection, Long...

    Office of Scientific and Technical Information (OSTI)

    Technologies Division Country of Publication: United States Language: English Subject: 32 ENERGY CONSERVATION, CONSUMPTION, AND UTILIZATION Word Cloud More Like This Full Text...

  20. Impact of Advanced Fuel Cycles on Uncertainty Associated with...

    Office of Scientific and Technical Information (OSTI)

    ASME 15th International Conference on Environmental Remediation and Radioactive Waste Management held September 8-12, 2013 in Brussels, Belgium.; Related Information:...

  1. Comparison of Uncertainty of Two Precipitation Prediction Models...

    Office of Scientific and Technical Information (OSTI)

    Lab Technical Area 54 Meteorological inputs are an important part of subsurface flow and transport modeling. The choice of source for meteorological data used as inputs has...

  2. Neutron reactions and climate uncertainties earn Los Alamos scientists...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    examples of the intellectual vitality here at Los Alamos," said Director Charlie McMillan. "The early career awards are a great honor for them and the Laboratory. I...

  3. Planning under uncertainty solving large-scale stochastic linear programs

    SciTech Connect (OSTI)

    Infanger, G. (Dept. of Operations Research; Technische Univ., Vienna, Inst. fuer Energiewirtschaft)

    1992-12-01

    For many practical problems, solutions obtained from deterministic models are unsatisfactory because they fail to hedge against certain contingencies that may occur in the future. Stochastic models address this shortcoming, but until recently seemed intractable due to their size. Recent advances in both solution algorithms and computer technology now allow us to solve important and general classes of practical stochastic problems. We show how large-scale stochastic linear programs can be efficiently solved by combining classical decomposition and Monte Carlo (importance) sampling techniques. We discuss the methodology for solving two-stage stochastic linear programs with recourse, present numerical results for large problems with numerous stochastic parameters, show how to efficiently implement the methodology on a parallel multicomputer, and derive the theory for solving a general class of multi-stage problems with dependency of the stochastic parameters within a stage and between different stages.
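
    The core idea, stripped of the decomposition and importance-sampling machinery, fits in a toy two-stage problem: commit a first-stage decision, then pay a recourse cost once the random outcome is revealed, estimating the expectation by Monte Carlo. A minimal sketch with hypothetical costs and demand distribution:

        import numpy as np

        rng = np.random.default_rng(11)
        demand = rng.lognormal(mean=3.0, sigma=0.4, size=20_000)   # sampled scenarios

        c_build, c_short, c_surplus = 1.0, 4.0, 0.5

        def expected_cost(x):
            # First-stage cost plus a Monte Carlo estimate of expected recourse.
            recourse = (c_short * np.maximum(demand - x, 0.0)
                        + c_surplus * np.maximum(x - demand, 0.0))
            return c_build * x + recourse.mean()

        grid = np.linspace(5.0, 60.0, 221)
        costs = [expected_cost(x) for x in grid]
        best = grid[int(np.argmin(costs))]
        print(f"best first-stage decision ~ {best:.1f}, expected cost ~ {min(costs):.2f}")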

  4. The CAIR vacatur raises uncertainty in the power generation industry

    SciTech Connect (OSTI)

    Dan Weiss; John Kinsman

    2008-12-15

    On 11 July 2008, the U.S. Court of Appeals for the District of Columbia issued a unanimous decision vacating the entire Clean Air Interstate Rule (CAIR) and the associated federal implementation plan. The upset of this program to reduce power plant sulfur dioxide (SO{sub 2}) and nitrogen oxides (NOx) emissions in the eastern United States was a great surprise, creating operational and planning turmoil in the industry. 4 refs.

  5. Energy Department Seeks Information on Geothermal Risk and Uncertainty Management

    Broader source: Energy.gov [DOE]

    The Energy Department's Office of Energy Efficiency and Renewable Energy (EERE) has issued a Request for Information (RFI) to help quantify and manage risk in geothermal exploration, in an effort...

  6. Uncertainty analyses of CO2 plume expansion subsequent to wellbore...

    Office of Scientific and Technical Information (OSTI)

    in another scenario, we study the effects of reservoir heterogeneity on CO2 migration. ... The CO2 migration is simulated using the PNNL-developed simulator STOMP-CO2e (the ...

  7. Improvements of Nuclear Data and Its Uncertainties by Theoretical...

    Office of Scientific and Technical Information (OSTI)


  8. Improvements of Nuclear Data and Its Uncertainties by Theoretical...

    Office of Scientific and Technical Information (OSTI)


  9. Uncertainty Reduction in Power Generation Forecast Using Coupled...

    Office of Scientific and Technical Information (OSTI)

    Resource Type: Conference Resource Relation: Conference: IEEE PES General Meeting, Conference & Exposition, July 27-31, 2014, National Harbor, MD Publisher: IEEE, Piscataway, NJ, ...

  10. Enabling Efficient Uncertainty Quantification Using Adjoint-based...

    Office of Scientific and Technical Information (OSTI)


  11. Error and uncertainty in Raman thermal conductivity measurements...

    Office of Scientific and Technical Information (OSTI)

    Using this methodology, the accuracy and precision of the Raman-derived thermal ... Publisher: American Institute of Physics (AIP) Research Org: Sandia National Laboratories ...

  12. Direct Aerosol Forcing: Sensitivity to Uncertainty in Measurements...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Brookhaven National Laboratory; Ricchiazzi, Paul (University of California, Santa Barbara); Lewis, Ernie (Brookhaven National Laboratory); Michalsky, Joseph (DOC/NOAA/OAR/ESRL/GMD); Ogren,...

  13. Microsoft Word - feb10-Price Uncertainty Supplement.doc

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    monthly supplement to the EIA Short-Term Energy Outlook (http://www.eia.doe.gov/emeu...) in mid-January, as is seen in the green curve in Figure 1, going to a level of ...

  14. Microsoft Word - Documentation - Price Forecast Uncertainty.doc

    U.S. Energy Information Administration (EIA) Indexed Site

    growth, Organization of Petroleum ... Journal of Environmental Economics and Management, Vol. 46 (2003) pp. 52 - 71. Ogawa, ... volatility, John Wiley & Sons Ltd. (2005). ...

  15. Electricity Plant Cost Uncertainties (released in AEO2009)

    Reports and Publications (EIA)

    2009-01-01

    Construction costs for new power plants have increased at an extraordinary rate over the past several years. One study, published in mid-2008, reported that construction costs had more than doubled since 2000, with most of the increase occurring since 2005. Construction costs have increased for plants of all types, including coal, nuclear, natural gas, and wind.

  16. ACCOUNTING FOR COSMIC VARIANCE IN STUDIES OF GRAVITATIONALLY LENSED HIGH-REDSHIFT GALAXIES IN THE HUBBLE FRONTIER FIELD CLUSTERS

    SciTech Connect (OSTI)

    Robertson, Brant E.; Stark, Dan P.; Ellis, Richard S.; Dunlop, James S.; McLure, Ross J.; McLeod, Derek

    2014-12-01

    Strong gravitational lensing provides a powerful means for studying faint galaxies in the distant universe. By magnifying the apparent brightness of background sources, massive clusters enable the detection of galaxies fainter than the usual sensitivity limit for blank fields. However, this gain in effective sensitivity comes at the cost of a reduced survey volume and, in this Letter, we demonstrate that there is an associated increase in the cosmic variance uncertainty. As an example, we show that the cosmic variance uncertainty of the high-redshift population viewed through the Hubble Space Telescope Frontier Field cluster Abell 2744 increases from ~35% at redshift z ~ 7 to ~65% at z ~ 10. Previous studies of high-redshift galaxies identified in the Frontier Fields have underestimated the cosmic variance uncertainty that will affect the ultimate constraints on both the faint-end slope of the high-redshift luminosity function and the cosmic star formation rate density, key goals of the Frontier Field program.

  17. Department of Defense High Performance Computing Modernization Program

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    DoD High Performance Computing Science and Engineering Applications, Larry Davis, 25 April 2007, Department of Defense High Performance Computing Modernization Program. Overview: DoD High Performance Computing Modernization Program (HPCMP); DoD science and engineering applications; use of modeling and simulation for aircraft certification; HPCMP benchmarking for acquisitions; overall acquisition process; validated vendor benchmarking results; uncertainty analysis in performance

  18. Robust emergent climate phenomena associated with the high-sensitivity tail

    Office of Scientific and Technical Information (OSTI)

    (Conference) Because the potential effects of climate change are more severe than had previously been thought, increasing focus on uncertainty quantification is required for risk assessment needed by policy makers. Current scientific efforts focus almost exclusively on...

  19. Benchmark Evaluation of Start-Up and Zero-Power Measurements at the High-Temperature Engineering Test Reactor

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Bess, John D.; Fujimoto, Nozomu

    2014-10-09

    Benchmark models were developed to evaluate six cold-critical and two warm-critical, zero-power measurements of the HTTR. Additional measurements of a fully-loaded subcritical configuration, core excess reactivity, shutdown margins, six isothermal temperature coefficients, and axial reaction-rate distributions were also evaluated as acceptable benchmark experiments. Insufficient information is publicly available to develop finely-detailed models of the HTTR as much of the design information is still proprietary. However, the uncertainties in the benchmark models are judged to be of sufficient magnitude to encompass any biases and bias uncertainties incurred through the simplification process used to develop the benchmark models. Dominant uncertainties in the experimental keff for all core configurations come from uncertainties in the impurity content of the various graphite blocks that comprise the HTTR. Monte Carlo calculations of keff are between approximately 0.9 % and 2.7 % greater than the benchmark values. Reevaluation of the HTTR models as additional information becomes available could improve the quality of this benchmark and possibly reduce the computational biases. High-quality characterization of graphite impurities would significantly improve the quality of the HTTR benchmark assessment. Simulation of the other reactor physics measurements are in good agreement with the benchmark experiment values. The complete benchmark evaluation details are available in the 2014 edition of the International Handbook of Evaluated Reactor Physics Benchmark Experiments.

  20. Benchmark Evaluation of Start-Up and Zero-Power Measurements at the High-Temperature Engineering Test Reactor

    SciTech Connect (OSTI)

    Bess, John D.; Fujimoto, Nozomu

    2014-10-09

    Benchmark models were developed to evaluate six cold-critical and two warm-critical, zero-power measurements of the HTTR. Additional measurements of a fully-loaded subcritical configuration, core excess reactivity, shutdown margins, six isothermal temperature coefficients, and axial reaction-rate distributions were also evaluated as acceptable benchmark experiments. Insufficient information is publicly available to develop finely-detailed models of the HTTR as much of the design information is still proprietary. However, the uncertainties in the benchmark models are judged to be of sufficient magnitude to encompass any biases and bias uncertainties incurred through the simplification process used to develop the benchmark models. Dominant uncertainties in the experimental keff for all core configurations come from uncertainties in the impurity content of the various graphite blocks that comprise the HTTR. Monte Carlo calculations of keff are between approximately 0.9 % and 2.7 % greater than the benchmark values. Reevaluation of the HTTR models as additional information becomes available could improve the quality of this benchmark and possibly reduce the computational biases. High-quality characterization of graphite impurities would significantly improve the quality of the HTTR benchmark assessment. Simulation of the other reactor physics measurements are in good agreement with the benchmark experiment values. The complete benchmark evaluation details are available in the 2014 edition of the International Handbook of Evaluated Reactor Physics Benchmark Experiments.
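
    The bias bookkeeping quoted above (calculated keff 0.9% to 2.7% above the benchmark) is simple arithmetic once the benchmark value and its uncertainty are in hand. The numbers below are illustrative values chosen inside that range, not HTTR results.

        # Computational bias for one criticality configuration.
        k_benchmark, u_benchmark = 1.0000, 0.0120   # benchmark keff and 1-sigma
        k_calc, u_calc = 1.0150, 0.0002             # calculated keff and stat. error

        bias = k_calc - k_benchmark
        u_bias = (u_benchmark**2 + u_calc**2) ** 0.5
        print(f"bias = {1e5 * bias:+.0f} pcm ({100 * bias / k_benchmark:.2f}%), "
              f"1-sigma = {1e5 * u_bias:.0f} pcm")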

  1. An adaptive sparse-grid high-order stochastic collocation method for Bayesian inference in groundwater reactive transport modeling

    Office of Scientific and Technical Information (OSTI)

    (Technical Report) Although Bayesian analysis has become vital to the quantification of prediction uncertainty in...

  2. EXTENDING THE REALM OF OPTIMIZATION FOR COMPLEX SYSTEMS: UNCERTAINTY, COMPETITION, AND DYNAMICS

    SciTech Connect (OSTI)

    Shanbhag, Uday V; Basar, Tamer; Meyn, Sean; Mehta, Prashant

    2013-10-08

    The research reported here addressed the following topics: the development of analytical and algorithmic tools for distributed computation of Nash equilibria; synchronization in mean-field oscillator games, with an emphasis on learning and efficiency analysis; questions that combine learning and computation, including stochastic and mean-field games; and modeling and control in the context of power markets.

  3. Quantification of uncertainty in machining operations for on-machine acceptance.

    SciTech Connect (OSTI)

    Claudet, Andre A.; Tran, Hy D.; Su, Jiann-Cherng

    2008-09-01

    Manufactured parts are designed with acceptance tolerances, i.e., allowable deviations from ideal design conditions, because of unavoidable errors in the manufacturing process. It is necessary to measure and evaluate the manufactured part against the nominal design to determine whether the part meets design specifications. The scope of this research project is dimensional acceptance of machined parts; specifically, parts machined using numerically controlled (NC, or CNC for Computer Numerically Controlled) machines. In the design/build/accept cycle, the designer specifies both a nominal value and an acceptable tolerance, and typical business practice requires verifying that the part meets acceptable values prior to acceptance. Manufacturing cost must therefore include not only raw materials and added labor but also the cost of ensuring conformance to specifications, which is a substantial portion of the cost of manufacturing. In this project, the cost of measurements was approximately 50% of the cost of the machined part; in production, the cost of measurement would be smaller, but still a substantial proportion of manufacturing cost. The results of this research project point to a science-based approach to reducing the cost of ensuring conformance to specifications. The approach we take is to determine, a priori, how well a CNC machine can manufacture a particular geometry from stock. Based on knowledge of the manufacturing process, we can then distinguish features that need further measurement from features that can be accepted 'as is' from the CNC. By calibrating the machine tool and establishing a machining accuracy ratio, we can validate the ability of the CNC machine to fabricate to a particular level of tolerance, eliminating the cost of checking conformance for relatively large tolerances.
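
    An accuracy-ratio acceptance test of the kind described can be sketched as a single comparison: accept a feature "as is" when the design tolerance is large relative to the calibrated process spread. The threshold k and the normal-process assumption are illustrative, not the report's actual acceptance rule.

        def accept_as_is(tolerance, process_sigma, k=4.0):
            # Accept without further measurement when the tolerance band is at
            # least k times the +/-1-sigma process spread.
            return tolerance / (2.0 * process_sigma) >= k

        print(accept_as_is(tolerance=0.10, process_sigma=0.01))  # True: loose tolerance
        print(accept_as_is(tolerance=0.02, process_sigma=0.01))  # False: inspect feature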

  4. An Optimized Autoregressive Forecast Error Generator for Wind and Load Uncertainty Study

    SciTech Connect (OSTI)

    De Mello, Phillip; Lu, Ning; Makarov, Yuri V.

    2011-01-17

    This paper presents a first-order autoregressive algorithm that generates real-time (RT), hour-ahead (HA), and day-ahead (DA) wind and load forecast errors. The methodology aims to produce random wind and load forecast time series that reflect the autocorrelation and cross-correlation of historical forecast data sets. The statistical characteristics considered are the means, standard deviations, autocorrelations, and cross-correlations. A stochastic optimization routine is developed to minimize the differences between the statistical characteristics of the generated time series and the targeted ones. An optimal set of parameters is obtained and used to produce the RT, HA, and DA forecasts in succession. Although implemented here as a first-order autoregressive error generator, the method can be extended to higher orders. Results show that the methodology produces random series with the desired statistics, derived from real data sets provided by the California Independent System Operator (CAISO). The wind and load forecast error generator is currently used in wind integration studies to generate wind and load inputs for stochastic planning processes. Future studies will focus on reflecting the diurnal and seasonal differences of the wind and load statistics and implementing them in the random forecast generator.
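
    A minimal sketch of a first-order autoregressive error generator. For an AR(1) process the lag-1 autocorrelation equals rho and the stationary standard deviation is sigma_w / sqrt(1 - rho^2), so those two targets can be hit in closed form; the paper instead fits its parameters by stochastic optimization against historical CAISO data, which this sketch does not reproduce, and the target values below are invented:

    ```python
    import numpy as np

    def ar1_errors(n: int, mean: float, std: float, rho: float, seed: int = 0):
        """Generate n forecast errors with target mean, std, and lag-1 autocorrelation."""
        rng = np.random.default_rng(seed)
        sigma_w = std * np.sqrt(1.0 - rho ** 2)   # innovation std for the target variance
        e = np.empty(n)
        e[0] = rng.normal(0.0, std)               # draw from the stationary distribution
        for t in range(1, n):
            e[t] = rho * e[t - 1] + rng.normal(0.0, sigma_w)
        return mean + e

    errs = ar1_errors(n=8760, mean=0.0, std=150.0, rho=0.85)   # e.g., hour-ahead MW errors
    print(errs.std(), np.corrcoef(errs[:-1], errs[1:])[0, 1])  # ~150, ~0.85
    ```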

  5. Mobile source emission control cost-effectiveness: Issues, uncertainties, and results

    SciTech Connect (OSTI)

    Wang, M.Q.

    1994-12-01

    Emissions from mobile sources undoubtedly contribute to US urban air pollution problems. Consequently, mobile source control measures, ranging from vehicle emission standards to reductions in vehicle travel, have been adopted or proposed to help attain air quality standards. To rank these measures, government agencies and private organizations calculate cost-effectiveness in dollars per ton of emissions reduced, and arguments for or against particular measures are often made on the basis of the results. Yet different studies may yield significantly different cost-effectiveness results because of differences in methodology and in assumed costs and emission reductions. Because of these methodological differences, the results may not be comparable across studies, and use of incomparable results may lead to the adoption of ineffective control measures. This paper first discusses some important methodological issues in cost-effectiveness calculation for mobile sources and proposes appropriate, systematic methods for dealing with them. Several recent studies have evaluated the cost-effectiveness of mobile source emission control measures, with wide variation in the results for the same control measures. The methodological assumptions of each study are presented and, based on the proposed calculation methods, adjustments are applied to the original estimates to correct inappropriate assumptions and make the studies comparable. Finally, mobile source control measures are ranked on the basis of the adjusted cost-effectiveness estimates.
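
    To make the methodological stakes concrete, a minimal dollars-per-ton sketch; the cash flows, lifetime, and discount rate are invented, and one contested methodological choice, whether to discount the emission reductions along with the costs, visibly changes the answer:

    ```python
    def dollars_per_ton(annual_cost, annual_tons, years, rate, discount_tons=True):
        """Present-value cost per (optionally present-value) ton of emissions reduced."""
        pv = lambda x, t: x / (1.0 + rate) ** t
        cost = sum(pv(annual_cost, t) for t in range(1, years + 1))
        tons = sum(pv(annual_tons, t) if discount_tons else annual_tons
                   for t in range(1, years + 1))
        return cost / tons

    # A $12M/yr program reducing 800 tons/yr over 10 years at a 5% discount rate
    print(f"${dollars_per_ton(12e6, 800, 10, 0.05, True):,.0f}/ton")   # ~$15,000
    print(f"${dollars_per_ton(12e6, 800, 10, 0.05, False):,.0f}/ton")  # ~$11,583
    ```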

  6. Resource adequacy, capital adequacy and investment uncertainty in the Australian power market

    SciTech Connect (OSTI)

    Simshauser, Paul

    2010-01-15

    Ignoring the importance of capital markets risks overlooking one of the most fundamental drivers of investment and price in the utilities industry. While the worst effects of the financial crisis are beginning to subside, the residual fallout will be more than a passing fad for energy utilities.

  7. Impacts of Uncertainty in Energy Project Costs (released in AEO2008)

    Reports and Publications (EIA)

    2008-01-01

    From the late 1970s through 2002, steel, cement, and concrete prices followed a general downward trend. Since then, however, iron and steel prices increased by 8% in 2003, 10% in 2004, and 31% in 2005. Although iron and steel prices declined in 2006, early data for 2007 show another increase. Cement and concrete prices, as well as the composite cost index for all construction commodities, have shown similar trends, but with smaller increases in 2004 and 2005.
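
    A quick arithmetic check of how the cited year-on-year increases compound (the percentages are those quoted above; the cumulative figure is our calculation):

    ```python
    # Cumulative effect of the 2003-2005 iron and steel price increases
    increases = [0.08, 0.10, 0.31]
    factor = 1.0
    for r in increases:
        factor *= 1.0 + r
    print(f"cumulative increase, 2002-2005: {factor - 1:.0%}")  # about 56%
    ```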

  8. Mastering Uncertainty and Risk at Multiple Time Scales in the Future Electrical Grid

    SciTech Connect (OSTI)

    Chertkov, Michael; Bent, Russell W.; Backhaus, Scott N.

    2012-07-10

    Today's electrical grids enjoy a relatively clean separation of spatio-temporal scales, yielding a compartmentalization of grid design, optimization, control, and risk assessment that allows conventional mathematical tools to be used within each area. In contrast, the future grid will incorporate time-intermittent renewable generation, operate via faster electrical markets, and tap latent control capability at finer grid modeling scales, creating a fundamentally new set of couplings across spatio-temporal scales and requiring revolutionary advances in mathematical techniques to bridge them. One example is found in decade-scale grid expansion planning, in which today's algorithms assume accurate load forecasts and well-controlled generation. Incorporating intermittent renewable generation creates fluctuating network flows at the hourly time scale, inherently linking the ability of a transmission line to deliver electrical power to hourly operational decisions; new operations-based planning algorithms are required, creating new mathematical challenges. Spatio-temporal scales are also crossed when the future grid's minute-scale fluctuations in network flows (due to intermittent generation) create a disordered state upon which second-scale transient grid dynamics propagate, effectively invalidating today's on-line dynamic stability analyses. Addressing this challenge requires new on-line algorithms that use large data streams from new grid sensing technologies and aggregate physically across many spatial scales to create responsive, data-driven dynamic models. Here, we sketch the mathematical foundations of these problems and potential solutions.

  9. Measurement Uncertainty Analysis of an Accelerometer Calibration Using a POC Electromagnetic Launcher

    SciTech Connect (OSTI)

    Timpson, Erik J.; Engel, T. G.

    2012-06-12

    A pulse-forming network (PFN), helical electromagnetic launcher (HEML), command module (CM), and calibration table (CT) were built and evaluated for their combined ability to calibrate an accelerometer. The PFN has a maximum stored energy of 19.25 kJ and is fired by a silicon controlled rectifier (SCR), with appropriate safety precautions. The HEML is constructed of G-10 fiberglass-reinforced epoxy and is designed to accelerate a 600 gram mass to a velocity of 10 meters per second. The CM is microcontroller-based and runs Arduino software; it takes keypad input and displays the PFN voltage and the desired charging voltage on seven-segment outputs. After a desired PFN voltage is entered, the CM controls the charging of the PFN; when the two voltages are equal, it sends a pulse to the SCR to fire the PFN and, in turn, the HEML. The HEML projectile's tip strikes a target held by the CT. The CT consists of a table holding the PFN and HEML, a vacuum chuck, an air bearing, a velocimeter, and a catch pot. The target is held by the vacuum chuck awaiting impact; after impact, the air bearing allows the target to fall freely so that the velocimeter can read accurately. A known acceleration is determined from the measured change in velocity of the target. Thus, if an accelerometer were attached to the target, its measured value could be compared to the known value.
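
    A minimal sketch of the reference acceleration implied by the target's measured velocity change. The impact duration below is a made-up figure; the paper derives the known acceleration from its own measured quantities:

    ```python
    # Average acceleration from a measured velocity change over an impact duration
    def mean_acceleration(delta_v_mps: float, duration_s: float) -> float:
        return delta_v_mps / duration_s

    dv = 10.0      # m/s, velocity change measured by the velocimeter (illustrative)
    dt = 2.0e-3    # s, assumed impact duration (hypothetical)
    a = mean_acceleration(dv, dt)
    print(f"{a:.0f} m/s^2  (~{a / 9.81:.0f} g)")   # 5000 m/s^2, ~510 g
    ```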

  10. PROJECT PROFILE: Reducing PV Performance Uncertainty by Accurately Quantifying the “PV Resource”

    Broader source: Energy.gov [DOE]

    Funding Opportunity: SuNLaMP. SunShot Subprogram: Photovoltaics. Location: National Renewable Energy Laboratory, Golden, CO. Amount Awarded: $2,500,000

  11. The effect of 12C + 12C rate uncertainties on the weak s-process component

    SciTech Connect (OSTI)

    Fryer, Christopher Lee; Hungerford, Aimee L; Hirschi, Raphael; Pignatari, Marco; Bennett, Michael E; Diehl, Steven; Herwig, Falk; Hillary, William; Richman, Debra; Rockefeller, Gabriel; Timmes, Frank X; Wiescher, Michael

    2010-09-10

    The contribution by massive stars (M > 15 M⊙) to the weak s-process component of the solar system abundances is primarily due to the ²²Ne neutron source, which is activated near the end of helium-core burning. The residual ²²Ne left over from helium-core burning is then reignited during carbon burning, initiating further s-processing that modifies the isotopic distribution. This modification is sensitive to the stellar structure and the carbon-burning reaction rate. Recent work on the ¹²C + ¹²C reaction suggests that resonances located within the Gamow peak may exist, causing a strong increase in the astrophysical S-factor and consequently the reaction rate. To investigate the effect of such a rate, 25 M⊙ stellar models with different carbon-burning rates, at solar metallicity, were generated using the Geneva Stellar Evolution Code (GENEC), with nucleosynthesis post-processing calculated using the NuGrid Multi-zone Post-Processing Network code (MPPNP). A strongly enhanced rate can cause carbon burning to occur in a convective core rather than a radiative one, and the convective core mixes the matter synthesized there up into the carbon shell, significantly altering the initial composition of the carbon shell. In addition, an enhanced rate causes carbon-shell burning episodes to ignite earlier in the evolution of the star, igniting the ²²Ne source at lower temperatures and reducing the neutron density.

  12. Climate Change Technology R&D Portfolio Decision Making Under Uncertainty

    SciTech Connect (OSTI)

    Baker, E.; Keisler, J.; Chon, H.

    2008-11-17

    In this project we have completed, or are in the process of completing, the collection and analysis of information on seven energy technologies, including solar photovoltaics, nuclear power, carbon capture and storage, electricity from biomass, liquid biofuels, and batteries, with regard to their potential impact on reducing greenhouse gas emissions. We collected expert elicitations relating U.S. government funding trajectories to probabilities of success. We then used MiniCAM, a technologically detailed integrated assessment model, to determine the impact on the marginal cost of reducing greenhouse gas emissions if the technologies were successful. Finally, we performed initial analysis on portfolios of technologies. This project has partially supported nine papers, published, under review, or in preparation, for journals such as Energy Economics, The Energy Journal, Climatic Change, Management Science, and Transportation Research.
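
    A minimal sketch of the portfolio logic: each technology succeeds with some funding-dependent probability, and success lowers the marginal abatement cost. The probabilities and cost impacts below are invented, not the project's elicited values, and impacts are assumed to combine additively, a simplification an integrated assessment model would avoid:

    ```python
    from itertools import product

    techs = {                        # name: (P(success), abatement-cost reduction, $/tCO2)
        "solar PV":  (0.4, 12.0),
        "CCS":       (0.3, 18.0),
        "batteries": (0.5,  6.0),
    }

    # Expected cost reduction under additive impacts
    expected = sum(p * impact for p, impact in techs.values())
    print(f"expected abatement-cost reduction: ${expected:.2f}/tCO2")   # $13.20

    # Full outcome distribution over success/failure combinations
    for outcome in product([0, 1], repeat=len(techs)):
        prob, total = 1.0, 0.0
        for s, (p, impact) in zip(outcome, techs.values()):
            prob *= p if s else (1 - p)
            total += impact if s else 0.0
        print(outcome, f"p={prob:.3f}", f"reduction=${total:.0f}/tCO2")
    ```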

  13. Microsoft PowerPoint - uncertainty_hh_2009_2010.ppt [Compatibility Mode]

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    [Slide deck residue: monthly charts of past Henry Hub prices with 95% NYMEX confidence intervals, January 2009 through December 2010 editions; only slide titles and axis ticks survived extraction.]

  14. Microsoft PowerPoint - uncertainty_hh_2011_2012.ppt [Compatibility Mode]

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    [Slide deck residue: monthly charts of past Henry Hub prices with 95% NYMEX confidence intervals, January 2011 through December 2012 editions; only slide titles and axis ticks survived extraction.]

  15. Microsoft PowerPoint - uncertainty_hh_2013_2014.ppt [Compatibility Mode]

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    [Slide deck residue: Short-Term Energy Outlook monthly charts of past Henry Hub prices with 95% NYMEX confidence intervals, January 2013 through December 2014 editions; only slide titles and axis ticks survived extraction.]

  16. Microsoft PowerPoint - uncertainty_past_hh.ppt [Compatibility Mode]

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    [Slide deck residue: Short-Term Energy Outlook monthly charts of past Henry Hub prices with 95% NYMEX confidence intervals, January 2015 through March 2016 editions; only slide titles and axis ticks survived extraction.]

  17. Microsoft PowerPoint - uncertainty_past_wti.ppt [Compatibility Mode]

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    [Slide deck residue: Short-Term Energy Outlook monthly charts of historical WTI prices with 95% NYMEX confidence intervals, January 2015 through March 2016 editions; only slide titles and axis ticks survived extraction.]

  18. Microsoft PowerPoint - uncertainty_wti_2009_2010.ppt [Compatibility Mode]

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    [Slide deck residue: monthly charts of historical WTI prices with 95% NYMEX confidence intervals, January 2009 through December 2010 editions; only slide titles and axis ticks survived extraction.]

  19. Microsoft PowerPoint - uncertainty_wti_2011_2012.ppt [Compatibility Mode]

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    [Slide deck residue: monthly charts of historical WTI prices with 95% NYMEX confidence intervals, January 2011 through December 2012 editions; only slide titles and axis ticks survived extraction.]

  20. Microsoft PowerPoint - uncertainty_wti_2013_2014.ppt [Compatibility Mode]

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    [Slide deck residue: Short-Term Energy Outlook monthly charts of historical WTI prices with 95% NYMEX confidence intervals, January 2013 through December 2014 editions; only slide titles and axis ticks survived extraction.]