Assessor Training Measurement Uncertainty
NVLAP Assessor Training 2009: Measurement Uncertainty. Topics include calibration and testing labs performing… and when the nature of the test precludes…
Uncertainties in Gapped Graphene
Eylee Jung; Kwang S. Kim; DaeKil Park
2012-03-20T23:59:59.000Z
Motivated by graphene-based quantum computing, we examine the time dependence of the position-momentum and position-velocity uncertainties in monolayer gapped graphene. The effect of the energy gap on the uncertainties is shown to appear via the Compton-like wavelength $\lambda_c$. The uncertainties in graphene are contributed mainly by two phenomena, spreading and zitterbewegung. While the former determines the uncertainties at long times, the latter imposes rapid oscillations on the uncertainties at short times. The uncertainties in graphene are compared with the corresponding values for the usual free Hamiltonian $\hat{H}_{free} = (p_1^2 + p_2^2) / 2M$. It is shown that the uncertainties can be kept under control within the quantum mechanical law if one can choose the gap parameter $\lambda_c$ freely.
Uncertainty of calorimeter measurements at NREL's high flux solar furnace
Bingham, C.E.
1991-12-01T23:59:59.000Z
The uncertainties of the calorimeter and concentration measurements at the High Flux Solar Furnace (HFSF) at the National Renewable Energy Laboratory (NREL) are discussed. Two calorimeter types have been used to date. One is an array of seven commercially available circular-foil calorimeters (Gardon or heat-flux gages) for primary-concentrator peak flux (up to 250 W/cm²). The second is a cold-water calorimeter designed and built by the University of Chicago to measure the average exit power of the reflective compound parabolic secondary concentrator used at the HFSF (over 3.3 kW across a 1.6-cm² exit aperture, corresponding to a flux of about 2 kW/cm²). This paper discusses the uncertainties of the calorimeter and pyrheliometer measurements and the resulting concentration calculations. The measurement uncertainty analysis is performed according to the ASME/ANSI standard PTC 19.1 (1985). Random and bias errors for each portion of the measurement are analyzed. The results show that as either the power or the flux is reduced, the uncertainties increase. Another calorimeter is being designed for a new refractive secondary, which will use a refractive material to produce a higher average flux (5 kW/cm²) than the reflective secondary. The new calorimeter will use a time derivative of the fluid temperature as a key measurement of the average power out of the secondary. A description of this calorimeter and test procedure is also presented, along with a pre-test estimate of major sources of uncertainty. 8 refs., 4 figs., 3 tabs.
Direct Aerosol Forcing Uncertainty
DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]
Mccomiskey, Allison
Understanding sources of uncertainty in aerosol direct radiative forcing (DRF), the difference in a given radiative flux component with and without aerosol, is essential to quantifying changes in Earth's radiation budget. We examine the uncertainty in DRF due to measurement uncertainty in the quantities on which it depends: aerosol optical depth, single scattering albedo, asymmetry parameter, solar geometry, and surface albedo. Direct radiative forcing at the top of the atmosphere and at the surface as well as sensitivities, the changes in DRF in response to unit changes in individual aerosol or surface properties, are calculated at three locations representing distinct aerosol types and radiative environments. The uncertainty in DRF associated with a given property is computed as the product of the sensitivity and typical measurement uncertainty in the respective aerosol or surface property. Sensitivity and uncertainty values permit estimation of total uncertainty in calculated DRF and identification of properties that most limit accuracy in estimating forcing. Total uncertainties in modeled local diurnally averaged forcing range from 0.2 to 1.3 W m-2 (42 to 20%) depending on location (from tropical to polar sites), solar zenith angle, surface reflectance, aerosol type, and aerosol optical depth. The largest contributor to total uncertainty in DRF is usually single scattering albedo; however decreasing measurement uncertainties for any property would increase accuracy in DRF. Comparison of two radiative transfer models suggests the contribution of modeling error is small compared to the total uncertainty although comparable to uncertainty arising from some individual properties.
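The propagation described above, per-property uncertainty as the product of a sensitivity and a typical measurement uncertainty, can be sketched in a few lines of Python. The sensitivity and uncertainty values below are illustrative placeholders, not numbers from the study, and combining contributions in quadrature assumes independent errors:

```python
import math

# Hypothetical sensitivities dDRF/dx (W m^-2 per unit of each property)
# and typical measurement uncertainties; all values are illustrative only.
sensitivities = {
    "aerosol_optical_depth": 30.0,
    "single_scattering_albedo": 20.0,
    "asymmetry_parameter": 8.0,
    "surface_albedo": 5.0,
}
meas_uncertainty = {
    "aerosol_optical_depth": 0.01,
    "single_scattering_albedo": 0.03,
    "asymmetry_parameter": 0.02,
    "surface_albedo": 0.02,
}

# Per-property DRF uncertainty = sensitivity * measurement uncertainty,
# combined in quadrature (root-sum-square) assuming independent errors.
contrib = {k: sensitivities[k] * meas_uncertainty[k] for k in sensitivities}
total = math.sqrt(sum(c ** 2 for c in contrib.values()))
```

With these placeholder numbers the single-scattering-albedo term dominates, mirroring the abstract's finding that it is usually the largest contributor.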
Environmental Modeling: Coping with Uncertainty
Politècnica de Catalunya, Universitat
Environmental Modeling: Coping with Uncertainty, Daniel M. Tartakovsky (dmt@lanl.gov). Covers statistical heterogeneity and complex correlation structures of environmental processes (classical models: additive noise), parametric uncertainty, current approaches to uncertainty quantification, and random…
Uncertainty Analysis Economic Evaluations
Bhulai, Sandjai
Covers the uncertainties in typical oil and gas projects: 1. the oil price; 2. the investments (capex) and operating… Contents include 4.1 Oil Prices and 4.1.1 Analysis of historical oil prices.
A High-Performance Embedded Hybrid Methodology for Uncertainty Quantification With Applications
Iaccarino, Gianluca
2014-04-01T23:59:59.000Z
Multiphysics processes modeled by a system of unsteady differential equations are naturally suited for partitioned (modular) solution strategies. We consider such a model where probabilistic uncertainties are present in each module of the system and represented as a set of random input parameters. A straightforward approach to quantifying uncertainties in the predicted solution would be to sample all the input parameters as a single set and treat the full system as a black box. Although this method is easily parallelizable and requires minimal modifications to the deterministic solver, it is blind to the modular structure of the underlying multiphysical model. On the other hand, spectral representations such as polynomial chaos expansions (PCE) can provide richer structural information about how these uncertainties propagate from the inputs to the predicted output, but they can be prohibitively expensive to implement in the high-dimensional global space of uncertain parameters. We therefore investigate hybrid methodologies wherein each module has the flexibility of using sampling- or PCE-based methods to capture local uncertainties while maintaining accuracy in the global uncertainty analysis. For the latter case, we use a conditional PCE model which mitigates the curse of dimension associated with intrusive Galerkin or semi-intrusive pseudospectral methods. After formalizing the theoretical framework, we demonstrate the proposed method on a numerical viscous flow simulation and benchmark its performance against a purely Monte Carlo method and a purely spectral method.
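The black-box sampling baseline that the abstract contrasts with PCE can be illustrated on a toy two-module system. The modules, parameter distributions, and numbers below are invented for illustration:

```python
import random
import statistics

random.seed(0)

# Two coupled "modules", each carrying its own uncertain parameter.
def module_a(x, a):          # first physics module, abstracted as a map
    return a * x + 1.0

def module_b(y, b):          # second module, fed by module_a's output
    return b * y ** 2

# Black-box Monte Carlo: sample all uncertain parameters jointly and
# push each sample through the full coupled system, ignoring its
# modular structure entirely.
samples = []
for _ in range(20000):
    a = random.gauss(1.0, 0.1)   # uncertainty local to module A
    b = random.gauss(2.0, 0.1)   # uncertainty local to module B
    samples.append(module_b(module_a(0.5, a), b))

mean = statistics.fmean(samples)
stdev = statistics.stdev(samples)
```

This is the "easily parallelizable but structure-blind" approach: every added uncertain parameter simply widens the joint sampling space, which is exactly the cost the hybrid method aims to avoid.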
Optimal Uncertainty Quantification
Owhadi, Houman; Sullivan, Timothy John; McKerns, Mike; Ortiz, Michael
2010-01-01T23:59:59.000Z
We propose a rigorous framework for Uncertainty Quantification (UQ) in which the UQ objectives and the assumptions/information set are brought to the forefront. This framework, which we call \emph{Optimal Uncertainty Quantification} (OUQ), is based on the observation that, given a set of assumptions and information about the problem, there exist optimal bounds on uncertainties: these are obtained as extreme values of well-defined optimization problems corresponding to extremizing probabilities of failure, or of deviations, subject to the constraints imposed by the scenarios compatible with the assumptions and information. In particular, this framework does not implicitly impose inappropriate assumptions, nor does it repudiate relevant information. Although OUQ optimization problems are extremely large, we show that under general conditions, they have finite-dimensional reductions. As an application, we develop \emph{Optimal Concentration Inequalities} (OCI) of Hoeffding and McDiarmid type. Surprisingly, contr...
Uncertainty and calibration analysis
Coutts, D.A.
1991-03-01T23:59:59.000Z
All measurements contain some deviation from the true value which is being measured. In the common vernacular this deviation between the true value and the measured value is called an inaccuracy, an error, or a mistake. Since all measurements contain errors, it is necessary to accept that there is a limit to how accurate a measurement can be. The uncertainty interval, combined with the confidence level, is one measure of the accuracy of a measurement or value. Without a statement of uncertainty (or a similar parameter) it is not possible to evaluate whether the accuracy of the measurement, or data, is appropriate. The preparation of technical reports, calibration evaluations, and design calculations should consider the accuracy of measurements and data being used. There are many methods to accomplish this. This report provides a consistent method for the handling of measurement tolerances, calibration evaluations, and uncertainty calculations. The SRS Quality Assurance (QA) Program requires that the uncertainty of technical data and instrument calibrations be acknowledged and estimated. The QA Program makes some specific technical requirements related to the subject but does not provide a philosophy or method for how uncertainty should be estimated. This report was prepared to provide a technical basis to support the calculation of uncertainties and the calibration of measurement and test equipment for any activity within the Experimental Thermal-Hydraulics (ETH) Group. The methods proposed in this report provide a graded approach for estimating the uncertainty of measurements, data, and calibrations. The method is based on the national consensus standard, ANSI/ASME PTC 19.1.
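The PTC 19.1 approach referenced above combines a bias (systematic) error limit with a precision (random) index. A minimal sketch of the common root-sum-square combination, with illustrative numbers rather than values from the report:

```python
import math

def uncertainty_rss(bias_limit, precision_index, t95=2.0):
    """Root-sum-square uncertainty in the style of ANSI/ASME PTC 19.1.

    bias_limit      -- estimate of the fixed (systematic) error limit, B
    precision_index -- sample standard deviation of the mean, S
    t95             -- Student-t coverage factor (about 2 for large samples)
    """
    return math.sqrt(bias_limit ** 2 + (t95 * precision_index) ** 2)

# Illustrative inputs, not from the report:
U = uncertainty_rss(bias_limit=0.5, precision_index=0.2)
```

The result is then stated as a symmetric interval, measured value ± U, at the chosen confidence level, which is exactly the "uncertainty interval combined with the confidence level" the report describes.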
Aerosols and climate: uncertainties
Aerosols and climate: uncertainties and the need for standardization (Frank Raes et al.). Industry, public-stakeholder, and government outlook on key aerosol metrology issues critical to furthering our understanding of aerosols in the climate theatre. From the letter of invitation: "We are more…"
Uncertainty Principle Respects Locality
Dongsheng Wang
2015-04-19T23:59:59.000Z
The notion of nonlocality implicitly implies there might be some kind of spooky action at a distance in nature; however, the validity of quantum mechanics has been well tested up to now. In this work it is argued that the notion of nonlocality is physically improper, and that the basic principle of locality in nature is well respected by quantum mechanics, namely via the uncertainty principle. We show that the quantum bound on the Clauser-Horne-Shimony-Holt (CHSH) inequality can be recovered from the uncertainty relation in a multipartite setting. We further argue that the super-quantum correlation demonstrated by the nonlocal box is not physically comparable with the quantum one. The origin of the quantum structure of nature still remains to be explained; some post-quantum theory, more complete in some sense than quantum mechanics, is possible and need not be a hidden variable theory.
Predicting System Performance with Uncertainty
Yan, B.; Malkawi, A.
2012-01-01T23:59:59.000Z
The main purpose of this research is to include uncertainty that lies in the modeling process and that arises from input values when predicting system performance, and to incorporate uncertainty related to system controls in a computationally...
Calibration Under Uncertainty.
Swiler, Laura Painton; Trucano, Timothy Guy
2005-03-01T23:59:59.000Z
This report is a white paper summarizing the literature and different approaches to the problem of calibrating computer model parameters in the face of model uncertainty. Model calibration is often formulated as finding the parameters that minimize the squared difference between the model-computed data (the predicted data) and the actual experimental data. This approach does not allow for explicit treatment of uncertainty or error in the model itself: the model is considered the "true" deterministic representation of reality. While this approach does have utility, it is far from an accurate mathematical treatment of the true model calibration problem in which both the computed data and experimental data have error bars. This year, we examined methods to perform calibration accounting for the error in both the computer model and the data, as well as improving our understanding of its meaning for model predictability. We call this approach Calibration under Uncertainty (CUU). This talk presents our current thinking on CUU. We outline some current approaches in the literature, and discuss the Bayesian approach to CUU in detail.
Sandia National Laboratories: Uncertainty Analysis
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Topics: Experimental Testing, Phenomenological Modeling, Risk and Safety Assessment, Cyber-Based Vulnerability Assessments, Uncertainty Analysis, Transportation Safety, Fire Science, Human...
The impact of uncertainty and risk measures
Jo, Soojin; Jo, Soojin
2012-01-01T23:59:59.000Z
First, it recovers a more reliable historical oil price uncertainty series… As a result, the historical oil price uncertainty series is…
Uncertainty with New Technology
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Predicting System Performance with Uncertainty
Yan, B.; Malkawi, A.
2012-01-01T23:59:59.000Z
…on uncertainty in input values for predictions. The input values associated with predictions can come from estimations or from measurements corrupted with noise. Therefore, it is more reasonable to assign probability distributions over their domains of plausible… As the number of uncertain inputs increases, the number of simulations required increases significantly. The time cost limits the extension of uncertainty analysis. Current studies have not covered uncertainty related to system controls in operations. Measurements in system operations...
Uncertainty in emissions projections for climate models
Webster, Mort David.; Babiker, Mustafa H.M.; Mayer, Monika.; Reilly, John M.; Harnisch, Jochen.; Hyman, Robert C.; Sarofim, Marcus C.; Wang, Chien.
Future global climate projections are subject to large uncertainties. Major sources of this uncertainty are projections of anthropogenic emissions. We evaluate the uncertainty in future anthropogenic emissions using a ...
Structural model uncertainty in stochastic simulation
McKay, M.D.; Morrison, J.D. [Los Alamos National Lab., NM (United States). Technology and Safety Assessment Div.
1997-09-01T23:59:59.000Z
Prediction uncertainty in stochastic simulation models can be described by a hierarchy of components: stochastic variability at the lowest level, input and parameter uncertainty at a higher level, and structural model uncertainty at the top. It is argued that the usual paradigm for analysis of input uncertainty is not suitable for application to structural model uncertainty. An approach more likely to produce an acceptable methodology for analyzing structural model uncertainty is one that uses characteristics specific to the particular family of models.
Policy Uncertainty and Household Savings
Giavazzi, Francesco
Using German microdata and a quasi-natural experiment, we provide evidence on how households respond to an increase in uncertainty. We find that household saving increases significantly following the increase in political ...
Uncertainty, investment, and industry evolution
Caballero, Ricardo J.
1992-01-01T23:59:59.000Z
We study the effects of aggregate and idiosyncratic uncertainty on the entry of firms, total investment, and prices in a competitive industry with irreversible investment. We first use standard dynamic programming methods ...
Reducing Petroleum Dependence in California: Uncertainties About...
Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site
Reducing Petroleum Dependence in California: Uncertainties About Light-Duty Diesel, 2002 DEER Conference...
Uncertainty quantification approaches for advanced reactor analyses.
Briggs, L. L.; Nuclear Engineering Division
2009-03-24T23:59:59.000Z
The original approach to nuclear reactor design or safety analyses was to make very conservative modeling assumptions so as to ensure meeting the required safety margins. Traditional regulation, as established by the U.S. Nuclear Regulatory Commission, required conservatisms which have subsequently been shown to be excessive. The Commission has therefore moved away from excessively conservative evaluations and has determined best-estimate calculations to be an acceptable alternative to conservative models, provided the best-estimate results are accompanied by an uncertainty evaluation which can demonstrate that, when a set of analysis cases which statistically account for uncertainties of all types is generated, there is a 95% probability that at least 95% of the cases meet the safety margins. To date, nearly all published work addressing uncertainty evaluations of nuclear power plant calculations has focused on light water reactors and on large-break loss-of-coolant accident (LBLOCA) analyses. However, there is nothing in the uncertainty evaluation methodologies that is limited to a specific type of reactor or to specific types of plant scenarios. These same methodologies can be applied equally well to analyses for high-temperature gas-cooled reactors and liquid metal reactors, and to steady-state calculations, operational transients, or severe accident scenarios. This report reviews and compares both statistical and deterministic uncertainty evaluation approaches. Recommendations are given for selecting an uncertainty methodology and for considerations to be factored into the process of evaluating uncertainties for advanced reactor best-estimate analyses.
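One standard way to meet such a 95%/95% criterion, widely used in best-estimate-plus-uncertainty analyses though not singled out by this report, is Wilks' nonparametric tolerance-limit formula: choose the number of code runs n so that the largest observed result bounds 95% of the population with 95% confidence, i.e. 1 - 0.95**n >= 0.95. A minimal sketch:

```python
def wilks_sample_size(coverage=0.95, confidence=0.95):
    """Smallest n such that the maximum of n independent runs is a
    one-sided tolerance bound covering `coverage` of the population
    with probability `confidence`: 1 - coverage**n >= confidence."""
    n = 1
    while 1 - coverage ** n < confidence:
        n += 1
    return n

runs = wilks_sample_size()  # first-order 95/95 criterion
```

The appeal of this formula is that the required number of runs is independent of how many uncertain input parameters the analysis statistically samples.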
The impact of uncertainty and risk measures
Jo, Soojin; Jo, Soojin
2012-01-01T23:59:59.000Z
…historical uncertainty series by incorporating a realized volatility series from daily oil price data.
Davis, C B
1987-08-01T23:59:59.000Z
The uncertainties of calculations of loss-of-feedwater transients at Davis-Besse Unit 1 were determined to address concerns of the US Nuclear Regulatory Commission relative to the effectiveness of feed and bleed cooling. Davis-Besse Unit 1 is a pressurized water reactor of the raised-loop Babcock and Wilcox design. A detailed, quality-assured RELAP5/MOD2 model of Davis-Besse was developed at the Idaho National Engineering Laboratory. The model was used to perform an analysis of the loss-of-feedwater transient that occurred at Davis-Besse on June 9, 1985. A loss-of-feedwater transient followed by feed and bleed cooling was also calculated. The evaluation of uncertainty was based on the comparisons of calculations and data, comparisons of different calculations of the same transient, sensitivity calculations, and the propagation of the estimated uncertainty in initial and boundary conditions to the final calculated results.
Review on Generalized Uncertainty Principle
Tawfik, Abdel Nasser
2015-01-01T23:59:59.000Z
Based on string theory, black hole physics, doubly special relativity and some "thought" experiments, minimal distance and/or maximum momentum are proposed. As alternatives to the generalized uncertainty principle (GUP), the modified dispersion relation, the space noncommutativity, the Lorentz invariance violation, and the quantum-gravity-induced birefringence effects are summarized. The origin of minimal measurable quantities and the different GUP approaches are reviewed and the corresponding observations are analysed. Bounds on the GUP parameter are discussed and implemented in understanding recent PLANCK observations on the cosmic inflation. The higher-order GUP approaches predict minimal length uncertainty with and without maximum momenta.
Review on Generalized Uncertainty Principle
Abdel Nasser Tawfik; Abdel Magied Diab
2015-09-22T23:59:59.000Z
Based on string theory, black hole physics, doubly special relativity and some "thought" experiments, minimal distance and/or maximum momentum are proposed. As alternatives to the generalized uncertainty principle (GUP), the modified dispersion relation, the space noncommutativity, the Lorentz invariance violation, and the quantum-gravity-induced birefringence effects are summarized. The origin of minimal measurable quantities and the different GUP approaches are reviewed and the corresponding observations are analysed. Bounds on the GUP parameter are discussed and implemented in understanding recent PLANCK observations on the cosmic inflation. The higher-order GUP approaches predict minimal length uncertainty with and without maximum momenta.
Uncertainty of Pyrometers in a Casting Facility
Mee, D.K.; Elkins, J.E.; Fleenor, R.M.; Morrision, J.M.; Sherrill, M.W.; Seiber, L.E.
2001-12-07T23:59:59.000Z
This work has established uncertainty limits for the EUO filament pyrometers, digital pyrometers, two-color automatic pyrometers, and the standards used to certify these instruments (Table 1). If symmetrical limits are used, filament pyrometers calibrated in Production have certification uncertainties of not more than ±20.5 °C traceable to NIST over the certification period. Uncertainties of these pyrometers were roughly ±14.7 °C before introduction of the working standard that allowed certification in the field. Digital pyrometers addressed in this report have symmetrical uncertainties of not more than ±12.7 °C or ±18.1 °C when certified on a Y-12 Standards Laboratory strip lamp or in a production-area tube furnace, respectively. Uncertainty estimates for automatic two-color pyrometers certified in Production are ±16.7 °C. Additional uncertainty and bias are introduced when measuring production melt temperatures. A -19.4 °C bias was measured in a large 1987 data set, which is believed to be caused primarily by use of Pyrex™ windows (not present in the current configuration) and window fogging. Large variability (2σ = 28.6 °C) exists in the first 10 min of the hold period. This variability is attributed to emissivity variation across the melt and reflection from hot surfaces. For runs with hold periods extending to 20 min, the uncertainty approaches the calibration uncertainty of the pyrometers. When certifying pyrometers on a strip lamp at the Y-12 Standards Laboratory, it is important to limit ambient temperature variation (23 ± 4 °C), to order calibration points from high to low temperatures, to allow 6 min for the lamp to reach thermal equilibrium (12 min for certifications below 1200 °C) to minimize pyrometer bias, and to calibrate the pyrometer if error exceeds vendor specifications. A procedure has been written to assure conformance.
Uncertainty analysis, 5, 4507–4543, 2005
Paris-Sud XI, Université de
…the parameterization, the representation of the in-cloud updraft velocity, the relationship between effective radius… Uncertainty in the estimation of the indirect aerosol effect arises from the aerosol chemical and physical properties and cloud microphysics. Aerosols influence cloud radiative…
Uncertainty quantification and error analysis
Higdon, Dave M [Los Alamos National Laboratory; Anderson, Mark C [Los Alamos National Laboratory; Habib, Salman [Los Alamos National Laboratory; Klein, Richard [Los Alamos National Laboratory; Berliner, Mark [OHIO STATE UNIV.; Covey, Curt [LLNL; Ghattas, Omar [UNIV OF TEXAS; Graziani, Carlo [UNIV OF CHICAGO; Seager, Mark [LLNL; Sefcik, Joseph [LLNL; Stark, Philip [UC/BERKELEY; Stewart, James [SNL
2010-01-01T23:59:59.000Z
UQ studies all sources of error and uncertainty, including: systematic and stochastic measurement error; ignorance; limitations of theoretical models; limitations of numerical representations of those models; limitations on the accuracy and reliability of computations, approximations, and algorithms; and human error. A more precise definition for UQ is suggested below.
Pharmaceutical Waste Management Under Uncertainty
Linninger, Andreas A.
Pharmaceutical Waste Management Under Uncertainty, Andreas A. Linninger and Aninda Chakraborty. …of their benefits and costs constitutes a formidable task. Designing plant-wide waste management policies assuming… This article addresses the problem of finding optimal waste management policies for entire manufacturing sites.
Estimating uncertainty of inference for validation
Booker, Jane M [Los Alamos National Laboratory; Langenbrunner, James R [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Ross, Timothy J [UNM
2010-09-30T23:59:59.000Z
We present a validation process based upon the concept that validation is an inference-making activity. This has always been true, but the association has not been as important before as it is now. Previously, theory had been confirmed by more data, and predictions were possible based on data. The process today is to infer from theory to code and from code to prediction, making the role of prediction somewhat automatic, a machine function. Validation is defined as determining the degree to which a model and code are an accurate representation of experimental test data. Embedded in validation is the intention to use the computer code to predict. To predict is to accept the conclusion that an observable final state will manifest; therefore, prediction is an inference whose goodness relies on the validity of the code. Quantifying the uncertainty of a prediction amounts to quantifying the uncertainty of validation, and this involves the characterization of uncertainties inherent in theory/models/codes and the corresponding data. An introduction to inference making and its associated uncertainty is provided as a foundation for the validation problem. A mathematical construction for estimating the uncertainty in the validation inference is then presented, including a possibility distribution constructed to represent the inference uncertainty for validation under uncertainty. The estimation of inference uncertainty for validation is illustrated using data and calculations from Inertial Confinement Fusion (ICF). The ICF measurements of neutron yield and ion temperature were obtained for direct-drive inertial fusion capsules at the Omega laser facility. The glass capsules, containing the fusion gas, were systematically selected with the intent of establishing a reproducible baseline of high-yield 10¹³–10¹⁴ neutron output. The deuterium-tritium ratio in these experiments was varied to study its influence upon yield.
This paper on validation inference is the first in a series of inference uncertainty estimations. While the methods demonstrated are primarily statistical, these do not preclude the use of nonprobabilistic methods for uncertainty characterization. The methods presented permit accurate determinations for validation and eventual prediction. It is a goal that these methods establish a standard against which best practice may evolve for determining degree of validation.
Experimental uncertainty estimation and statistics for data having interval uncertainty.
Kreinovich, Vladik (Applied Biomathematics, Setauket, New York); Oberkampf, William Louis (Applied Biomathematics, Setauket, New York); Ginzburg, Lev (Applied Biomathematics, Setauket, New York); Ferson, Scott (Applied Biomathematics, Setauket, New York); Hajagos, Janos (Applied Biomathematics, Setauket, New York)
2007-05-01T23:59:59.000Z
This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
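For the descriptive statistics the report discusses, the simplest case, bounds on the sample mean of interval-valued data, can be computed exactly; the data below are invented for illustration (variance and other statistics are harder, and some interval-statistics problems are NP-hard in general, which is why the report studies their computability):

```python
# Each measurement is an epistemic interval [lo, hi] rather than a
# point estimate; the values are illustrative only.
data = [(1.0, 1.4), (2.1, 2.5), (0.9, 1.1), (3.0, 3.6)]

# Over all possible point selections within the intervals, the set of
# achievable sample means is itself an interval:
# [mean of lower endpoints, mean of upper endpoints].
n = len(data)
mean_lo = sum(lo for lo, _ in data) / n
mean_hi = sum(hi for _, hi in data) / n
```

The width of the resulting mean interval directly reflects the tradeoff the report explores: tighter measurement intervals or larger samples both narrow the bounds on the statistic.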
Greenhouse Gas Inventory Uncertainties Need Characterization
Post, Wilfred M.
Contact: Gregg Marland, 865-… Inventories of greenhouse gases (GHGs) emitted to and removed from the atmosphere are essential for understanding global… U.S. Department of Energy. Abstract: The assessment…
The impact of uncertainty and risk measures
Jo, Soojin; Jo, Soojin
2012-01-01T23:59:59.000Z
Chapter 1: The Effects of Oil Price Uncertainty on the… List of figures includes oil price… and oil price uncertainty with and without realized…
Theoretical Uncertainties in Inflationary Predictions
William H. Kinney; Antonio Riotto
2006-03-09T23:59:59.000Z
With present and future observations becoming of higher and higher quality, it is timely and necessary to investigate the most significant theoretical uncertainties in the predictions of inflation. We show that our ignorance of the entire history of the Universe, including the physics of reheating after inflation, translates to considerable errors in observationally relevant parameters. Using the inflationary flow formalism, we estimate that for a spectral index $n$ and tensor/scalar ratio $r$ in the region favored by current observational constraints, the theoretical errors are of order $\Delta n / | n - 1| \sim 0.1 - 1$ and $\Delta r /r \sim 0.1 - 1$. These errors represent the dominant theoretical uncertainties in the predictions of inflation, and are generically of the order of or larger than the projected uncertainties in future precision measurements of the Cosmic Microwave Background. We also show that the lowest-order classification of models into small field, large field, and hybrid breaks down when higher order corrections to the dynamics are included. Models can flow from one region to another.
Quantum Mechanics and the Generalized Uncertainty Principle
Jang Young Bang; Micheal S. Berger
2006-11-30T23:59:59.000Z
The generalized uncertainty principle has been described as a general consequence of incorporating a minimal length from a theory of quantum gravity. We consider a simple quantum mechanical model where the operator corresponding to position has discrete eigenvalues and show how the generalized uncertainty principle results for minimum uncertainty wave packets.
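The minimum-uncertainty statement can be made concrete with the most common one-parameter GUP form (a standard form in the literature; the paper's exact model may differ):

```latex
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\Bigl(1 + \beta\,(\Delta p)^2\Bigr),
\qquad
\Delta x_{\min}
  \;=\; \min_{\Delta p}\,\frac{\hbar}{2}\Bigl(\frac{1}{\Delta p} + \beta\,\Delta p\Bigr)
  \;=\; \hbar\sqrt{\beta}
  \quad\text{at}\quad \Delta p = \frac{1}{\sqrt{\beta}} .
```

Saturating the bound and minimizing over $\Delta p$ shows that any nonzero $\beta$ enforces a smallest attainable position uncertainty, the minimal length.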
Robot Motion Planning with Uncertainty
Whitton, Mary C.
We present the Stochastic Motion Roadmap (SMR), a new motion planning framework that explicitly considers uncertainty in robot motion. Our framework builds on the highly successful approach used in Probabilistic Roadmaps (PRMs): a set of discrete states is selected in the state space, and a roadmap is built that represents their collision-free connectivity.
Extended Forward Sensitivity Analysis for Uncertainty Quantification
Haihua Zhao; Vincent A. Mousseau
2011-09-01T23:59:59.000Z
Verification and validation (V&V) play increasingly important roles in quantifying uncertainties and realizing high-fidelity simulations in engineering system analyses, such as transients in a complex nuclear reactor system. Traditional V&V in reactor system analysis focused more on the validation part or did not differentiate verification from validation. The traditional approach to uncertainty quantification is based on a 'black box' approach: the simulation tool is treated as an unknown signal generator, a distribution of inputs according to assumed probability density functions is sent in, and the distribution of the outputs is measured and correlated back to the original input distribution. The 'black box' method mixes numerical errors with all other uncertainties, and it is also inefficient for sensitivity analysis. In contrast, a more efficient sensitivity approach can take advantage of intimate knowledge of the simulation code: equations for the propagation of uncertainty are constructed, and the sensitivities are solved for directly as variables in the simulation. This paper presents forward sensitivity analysis as a method to aid uncertainty quantification. By including the time step, and potentially the spatial step, as special sensitivity parameters, the forward sensitivity method is extended to quantify numerical errors. Note that by integrating local truncation errors over the whole system through the forward sensitivity analysis process, the generated time step and spatial step sensitivity information reflects global numerical errors. The discretization errors can then be systematically compared against uncertainties due to other physical parameters. This extension makes the forward sensitivity method a much more powerful tool for uncertainty quantification.
By knowing the relative sensitivity of the time and space steps compared with the physical parameters of interest, the simulation can be run at optimized time and space steps without affecting confidence in the physical parameter sensitivity results. The time and space step forward sensitivity analysis method can also replace the traditional time step and grid convergence study at much lower computational cost. Several well-defined benchmark problems with manufactured solutions are used to demonstrate the extended forward sensitivity analysis method. All the physical solutions and parameter sensitivity solutions, and even the time step sensitivity in one case, have analytical forms, which allows the verification to be performed in the strictest sense.
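A minimal sketch of the forward sensitivity idea described in this abstract, assuming a single decay ODE dy/dt = -k*y integrated with forward Euler (an illustration of the general technique, not the authors' code): the sensitivity s = dy/dk obeys its own ODE, ds/dt = (df/dy)*s + df/dk, solved alongside the state.

```python
# Forward sensitivity sketch (hypothetical example): integrate the state
# y and its parameter sensitivity s = dy/dk together.

def forward_sensitivity(y0, k, t_end, n_steps):
    """Integrate dy/dt = -k*y and its sensitivity to k with forward Euler."""
    dt = t_end / n_steps
    y, s = y0, 0.0          # the initial condition does not depend on k, so s(0) = 0
    for _ in range(n_steps):
        dydt = -k * y       # f(y, k)
        dsdt = -k * s - y   # (df/dy)*s + df/dk = (-k)*s + (-y)
        y += dt * dydt
        s += dt * dsdt
    return y, s

y, s = forward_sensitivity(y0=1.0, k=0.5, t_end=2.0, n_steps=10000)
# Analytic solution for comparison: y = exp(-k*t), s = -t*exp(-k*t)
```

Comparing the computed sensitivity against the analytic value -t*exp(-k*t) is itself a verification check in the spirit of the manufactured-solution benchmarks mentioned above.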
Monte Carlo Methods for Uncertainty Quantification
Mathematical Institute, University of Oxford
Giles, Mike
Examples of applications with uncertainty: reservoir modelling (considerable uncertainty about rock porosity); astronomy; uncertainty in modelling parameters, in geometry, and in initial conditions; long-term climate modelling (many sources of uncertainty, including the effects ...)
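The basic Monte Carlo approach named in this title can be sketched in a few lines: draw uncertain inputs from assumed distributions, push each sample through the model, and summarize the output distribution. The toy "reservoir" model and all distribution parameters below are illustrative assumptions, not material from the lecture.

```python
# Monte Carlo uncertainty propagation sketch (toy model, assumed inputs).
import random
import statistics

def model(porosity, thickness):
    # Hypothetical model: stored volume per unit area = porosity * thickness.
    return porosity * thickness

random.seed(0)
samples = []
for _ in range(10000):
    porosity = random.gauss(0.20, 0.03)   # assumed mean 0.20, std 0.03
    thickness = random.gauss(50.0, 5.0)   # assumed mean 50 m, std 5 m
    samples.append(model(porosity, thickness))

mean = statistics.fmean(samples)          # close to 0.20 * 50 = 10
std = statistics.stdev(samples)           # spread induced by the input uncertainty
```

For this product of independent Gaussians, first-order propagation predicts a standard deviation near sqrt((0.03*50)^2 + (0.20*5)^2) ≈ 1.8, which the sample statistics reproduce.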
Incorporating Forecast Uncertainty in Utility Control Center
Makarov, Yuri V.; Etingov, Pavel V.; Ma, Jian
2014-07-09T23:59:59.000Z
Uncertainties in forecasting the output of intermittent resources such as wind and solar generation, as well as system loads, are not adequately reflected in existing industry-grade tools used for transmission system management, generation commitment, dispatch, and market operation. There are other sources of uncertainty, such as uninstructed deviations of conventional generators from their dispatch set points, generator forced outages and failures to start up, load drops, losses of major transmission facilities, and frequency variation. These uncertainties can cause deviations from the system balance, which sometimes require inefficient and costly last-minute solutions in the near-real-time timeframe. This chapter considers sources of uncertainty and variability, an overall system uncertainty model, a possible plan for transition from deterministic to probabilistic methods in planning and operations, and two examples of uncertainty-based tools for grid operations. This chapter is based on work conducted at the Pacific Northwest National Laboratory (PNNL).
Uncertainty Quantification for Nuclear Density Functional Theory and Information Content of New Measurements
Office of Scientific and Technical Information (OSTI)
Short communication: Smooth pursuit under stimulus-response uncertainty
Bar, Moshe
Simple reaction times (RTs) are typically faster than choice reaction times and increase with stimulus-response (S-R) uncertainty. The relation between S-R uncertainty and reaction times is RT = a + b*log2(N), where a is the simple RT and b is the slope of the increase. ... an eye velocity criterion of 1.5° s^-1, which was equivalent to 25% of the stimulus velocity.
Dealing with Uncertainty in the Semantic Web
Theune, Mariët
Dealing with Uncertainty in the Semantic Web. Tjitze Rienstra, M.Sc. Thesis, November 8, 2009. Paul van der Vet, Dr. Maarten Fokkinga. Abstract: Some aspects of the Semantic Web are yet to be standardized; one of these is dealing with uncertainty. Like classical logic ...
Multivariate Receptor Models and Model Uncertainty
University of Washington at Seattle
Multivariate Receptor Models and Model Uncertainty. Eun Sug Park, Man-Suk Oh, Peter Guttorp (NRCSE). ... composition profiles, and the source contributions is the main interest in multivariate receptor modeling. Due ...
Estimating the uncertainty in underresolved nonlinear dynamics
Chorin, Alexandre; Hald, Ole
2013-06-12T23:59:59.000Z
The Mori-Zwanzig formalism of statistical mechanics is used to estimate the uncertainty caused by underresolution in the solution of a nonlinear dynamical system. A general approach is outlined and applied to a simple example. The noise term that describes the uncertainty turns out to be neither Markovian nor Gaussian. It is argued that this is the general situation.
Quantifying uncertainty in stable isotope mixing models
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Davis, Paul; Syme, James; Heikoop, Jeffrey; Fessenden-Rahn, Julianna; Perkins, George; Newman, Brent; Chrystal, Abbey E.; Hagerty, Shannon B.
2015-05-19T23:59:59.000Z
Mixing models are powerful tools for identifying biogeochemical sources and determining mixing fractions in a sample. However, identification of actual source contributors is often not simple, and source compositions typically vary or even overlap, significantly increasing model uncertainty in calculated mixing fractions. This study compares three probabilistic methods: SIAR [Parnell et al., 2010], a pure Monte Carlo technique (PMC), and the Stable Isotope Reference Source (SIRS) mixing model, a new technique that estimates mixing in systems with more than three sources and/or uncertain source compositions. In this paper, we use nitrate stable isotope examples (δ15N and δ18O), but all methods tested are applicable to other tracers. In Phase I of a three-phase blind test, we compared methods for a set of six-source nitrate problems. PMC was unable to find solutions for two of the target water samples. The Bayesian method, SIAR, experienced anchoring problems, and SIRS calculated mixing fractions that most closely approximated the known mixing fractions. For that reason, SIRS was the only approach used in the next phase of testing. In Phase II, the problem was broadened so that any subset of the six sources could be a possible solution to the mixing problem. Results showed a high rate of Type I errors, where solutions included sources that were not contributing to the sample. In Phase III, eliminating some sources based on assumed site knowledge and assumed nitrate concentrations substantially reduced mixing fraction uncertainties and lowered the Type I error rate. These results demonstrate that valuable insights into stable isotope mixing problems result from probabilistic mixing model approaches like SIRS.
The results also emphasize the importance of identifying a minimal set of potential sources and quantifying uncertainties in source isotopic composition, as well as demonstrating the value of additional information in reducing the uncertainty in calculated mixing fractions.
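The pure Monte Carlo (PMC) idea can be sketched for a two-source version of the mixing problem: sample the uncertain source signatures and the measurement, solve the mass balance for the mixing fraction, and keep physically valid solutions. The isotope values, uncertainties, and two-source restriction below are illustrative assumptions, not the study's data.

```python
# Two-source Monte Carlo mixing sketch (toy values): for a conservative tracer,
# d_mix = f*d_a + (1-f)*d_b, so f = (d_mix - d_b) / (d_a - d_b).
import random
import statistics

random.seed(1)
fractions = []
for _ in range(20000):
    d_a = random.gauss(5.0, 1.0)      # assumed source A delta15N signature
    d_b = random.gauss(15.0, 1.0)     # assumed source B delta15N signature
    d_mix = random.gauss(9.0, 0.5)    # assumed measured sample value
    f_a = (d_mix - d_b) / (d_a - d_b) # mass-balance mixing fraction of source A
    if 0.0 <= f_a <= 1.0:             # keep only physically valid fractions
        fractions.append(f_a)

f_mean = statistics.fmean(fractions)  # near (9-15)/(5-15) = 0.6
f_std = statistics.stdev(fractions)   # spread driven by source uncertainty
```

The spread of `fractions` shows directly how uncertainty in source composition inflates the uncertainty of the calculated mixing fraction, which is the effect the abstract emphasizes.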
Generalized Uncertainty Principle: Approaches and Applications
Abdel Nasser Tawfik; Abdel Magied Diab
2014-11-23T23:59:59.000Z
We review highlights from string theory, black hole physics, and doubly special relativity, and some "thought" experiments which were suggested to probe the shortest distance and/or the maximum momentum at the Planck scale. The models designed to implement a minimal length scale and/or a maximum momentum in different physical systems have entered the literature as the Generalized Uncertainty Principle (GUP); we compare them. The existence of a minimal length and a maximum momentum accuracy is preferred by various physical observations. Furthermore, assuming a modified dispersion relation allows for a wide range of applications in estimating, for example, the inflationary parameters, Lorentz invariance violation, black hole thermodynamics, the Salecker-Wigner inequalities, the entropic nature of the gravitational laws, the Friedmann equations, minimal time measurement, and the thermodynamics of high-energy collisions. One of the higher-order GUP approaches gives predictions for the minimal length uncertainty; another predicts a maximum momentum and a minimal length uncertainty simultaneously. An extensive comparison between the different GUP approaches is summarized. We also discuss the GUP impacts on the equivalence principles, including the universality of the gravitational redshift and of free fall and the law of reciprocal action, and on the kinetic energy of composite systems; the compatibility of the GUP with these principles should be addressed. We conclude that the values of the GUP parameters remain a puzzle to be verified.
Majorization formulation of uncertainty in quantum mechanics
Partovi, M. Hossein [Department of Physics and Astronomy, California State University, Sacramento, California 95819-6041 (United States)
2011-11-15T23:59:59.000Z
Heisenberg's uncertainty principle is formulated for a set of generalized measurements within the framework of majorization theory, resulting in a partial uncertainty order on probability vectors that is stronger than those based on quasientropic measures. The theorem that emerges from this formulation guarantees that the uncertainty of the results of a set of generalized measurements without a common eigenstate has an inviolable lower bound which depends on the measurement set but not the state. A corollary to this theorem yields a parallel formulation of the uncertainty principle for generalized measurements corresponding to the entire class of quasientropic measures. Optimal majorization bounds for two and three mutually unbiased bases in two dimensions are calculated. Similarly, the leading term of the majorization bound for position and momentum measurements is calculated which provides a strong statement of Heisenberg's uncertainty principle in direct operational terms. Another theorem provides a majorization condition for the least-uncertain generalized measurement of a given state with interesting physical implications.
Uncertainty Analysis for Photovoltaic Degradation Rates (Poster)
Jordan, D.; Kurtz, S.; Hansen, C.
2014-04-01T23:59:59.000Z
Dependable and predictable energy production is key to the long-term success of the PV industry. Over their exposure lifetime, PV systems show a gradual decline that depends on many different factors such as module technology, module type, mounting configuration, climate, etc. When degradation rates are determined from continuous data, the statistical uncertainty is easily calculated from the regression coefficients. However, the total uncertainty, which includes measurement uncertainty and instrumentation drift, is far more difficult to determine. A Monte Carlo simulation approach was chosen to perform a comprehensive uncertainty analysis. The most important factor for degradation rates is to avoid instrumentation that changes over time in the field: for instance, a drifting irradiance sensor can lead to substantially erroneous degradation rates, and such drift can be avoided through regular calibration. However, the accuracy of the irradiance sensor has negligible impact on degradation rate uncertainty, emphasizing that precision (relative accuracy) is more important than absolute accuracy.
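A Monte Carlo study of this kind can be sketched as follows: generate a synthetic performance series with a known true degradation rate, add measurement noise plus an assumed sensor drift, and watch how the fitted slope spreads and shifts across realizations. The rate, noise, and drift values are made-up assumptions, and `fitted_rate` is a hypothetical helper, not NREL's code.

```python
# Monte Carlo sketch of degradation-rate uncertainty with sensor drift.
import random
import statistics

def fitted_rate(true_rate=-0.5, years=5, noise=0.5, drift_per_year=0.2):
    """Fit a line to monthly normalized performance (%) and return slope in %/yr."""
    t = [m / 12.0 for m in range(years * 12)]
    y = [100.0 + true_rate * ti          # true degradation signal
         + drift_per_year * ti           # instrument drift masquerading as recovery
         + random.gauss(0.0, noise)      # measurement noise
         for ti in t]
    tbar = sum(t) / len(t)
    ybar = sum(y) / len(y)
    return (sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, y))
            / sum((ti - tbar) ** 2 for ti in t))

random.seed(2)
rates = [fitted_rate() for _ in range(2000)]
mean_rate = statistics.fmean(rates)   # biased from -0.5 toward -0.3 by the drift
std_rate = statistics.stdev(rates)    # statistical spread from the noise alone
```

The systematic shift of `mean_rate` away from the true value illustrates why a drifting sensor is so damaging: noise only widens the distribution of fitted rates, while drift moves its center.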
Numerical uncertainty in computational engineering and physics
Hemez, Francois M [Los Alamos National Laboratory]
2009-01-01T23:59:59.000Z
Obtaining a solution that approximates ordinary or partial differential equations on a computational mesh or grid does not necessarily mean that the solution is accurate or even 'correct'. Unfortunately, assessing the quality of discrete solutions by questioning the role played by spatial and temporal discretizations generally comes as a distant third to test-analysis comparison and model calibration. This publication aims to raise awareness of the fact that discrete solutions introduce numerical uncertainty. This uncertainty may, in some cases, overwhelm in complexity and magnitude other sources of uncertainty, including experimental variability, parametric uncertainty, and modeling assumptions. The concepts of consistency, convergence, and truncation error are reviewed to explain the articulation between the exact solution of the continuous equations, the solution of the modified equations, and discrete solutions computed by a code. The current state of the practice of code and solution verification activities is discussed. An example in the discipline of hydrodynamics illustrates the significant effect that meshing can have on the quality of code predictions. A simple method is proposed to derive bounds on solution uncertainty in cases where the exact solution of the continuous equations, or of its modified equations, is unknown. It is argued that numerical uncertainty originating from mesh discretization should always be quantified and accounted for in the overall uncertainty 'budget' that supports decision-making for applications in computational physics and engineering.
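The link between mesh refinement and truncation error discussed above is commonly checked with the generic textbook technique of grid-convergence analysis and Richardson extrapolation; the sketch below uses toy numbers, not the paper's hydrodynamics example.

```python
# Grid-convergence sketch: estimate the observed order of accuracy p and a
# Richardson-extrapolated solution from three uniformly refined mesh levels.
import math

def observed_order(f_coarse, f_medium, f_fine, r=2.0):
    """Observed convergence order p from three solutions, refinement ratio r."""
    return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)

def richardson(f_medium, f_fine, p, r=2.0):
    """Error-corrected estimate of the mesh-converged solution."""
    return f_fine + (f_fine - f_medium) / (r ** p - 1.0)

# Toy data consistent with f(h) = 1 + 0.5*h^2 at h = 0.4, 0.2, 0.1
f1, f2, f3 = 1.08, 1.02, 1.005
p = observed_order(f1, f2, f3)    # ~2 for a second-order discretization
f_exact = richardson(f2, f3, p)   # ~1.0, with the leading error term removed
```

The gap between `f_fine` and `f_exact` is one simple way to put a number on the "numerical uncertainty budget" the abstract argues for.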
Dose uncertainty due to computed tomography (CT) slice thickness in CT-based high dose rate brachytherapy
Pouliot, Jean
DOI: 10.1118/1.1785454. Key words: high dose rate brachytherapy, computed tomography, prostate. ... organs at risk (OARs) by providing three-dimensional (3D) anatomical information from computed tomography (CT) ...
Uncertainty Quantification on Prompt Fission Neutrons Spectra
Talou, P. [T-16, Nuclear Physics Group, Los Alamos National Laboratory, NM 87545 (United States)], E-mail: talou@lanl.gov; Madland, D.G.; Kawano, T. [T-16, Nuclear Physics Group, Los Alamos National Laboratory, NM 87545 (United States)
2008-12-15T23:59:59.000Z
Uncertainties in the evaluated prompt fission neutron spectra present in ENDF/B-VII.0 are assessed in the framework of the Los Alamos model. The methodology used to quantify the uncertainties on an evaluated spectrum is introduced. We also briefly review the Los Alamos model and single out the parameters that have the largest influence on the calculated results. Using a Kalman filter, experimental data and uncertainties are introduced to constrain model parameters and construct an evaluated covariance matrix for the prompt neutron spectrum. Preliminary results are shown for neutron-induced fission of {sup 235}U from thermal up to 15 MeV incident energies.
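The Kalman-filter constraint on model parameters mentioned above can be illustrated with a scalar update (the textbook linear formula; the prior, measurement, and sensitivity values are invented for illustration):

```python
# Scalar Kalman-style parameter update: a model parameter x with prior
# variance p_prior is constrained by one measurement y = h*x + noise.
def kalman_update(x_prior, p_prior, y_meas, r_meas, h):
    """Return posterior mean and variance after one linear measurement update."""
    k = p_prior * h / (h * h * p_prior + r_meas)   # Kalman gain
    x_post = x_prior + k * (y_meas - h * x_prior)  # correct toward the data
    p_post = (1.0 - k * h) * p_prior               # reduced posterior variance
    return x_post, p_post

# Prior parameter 1.0 with variance 0.04; measurement 2.2 of y = 2*x, variance 0.01
x, p = kalman_update(1.0, 0.04, 2.2, 0.01, 2.0)
```

Chaining such updates over many measured spectrum points, with vector parameters and a full covariance matrix, is the multidimensional analogue of what the evaluation describes.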
Error Detection and Recovery for Robot Motion Planning with Uncertainty
Donald, Bruce Randall
1987-07-01T23:59:59.000Z
Robots must plan and execute tasks in the presence of uncertainty. Uncertainty arises from sensing errors, control errors, and uncertainty in the geometry of the environment. The last, which is called model error, has ...
EFFICIENT UNCERTAINTY ANALYSIS METHODS FOR MULTIDISCIPLINARY ROBUST DESIGN
Chen, Wei
A robust design procedure that utilizes efficient methods for uncertainty analysis is developed (Xiaoping Du and Wei Chen). The efficient techniques used for uncertainty analysis significantly reduce the number of design evaluations ...
DAMAGE ASSESSMENT OF COMPOSITE PLATE STRUCTURES WITH UNCERTAINTY
Boyer, Edmond
Uncertainties associated with a structural model and measured vibration data may lead to unreliable damage assessment (Chandrashekhar M. and Ranjan ...). It is shown that material uncertainties in composite structures cause considerable problems in damage assessment, which can ...
Analysis of S-Circuit Uncertainty
Ahmed, Taahir
2011-08-08T23:59:59.000Z
The theory of sensori-computational circuits provides a capable framework for the description and optimization of robotic systems, including on-line optimizations. This theory, however, is inadequate in that it does not account for uncertainty in a...
Multifidelity methods for multidisciplinary design under uncertainty
Christensen, Daniel Erik
2012-01-01T23:59:59.000Z
For computational design and analysis tasks, scientists and engineers often have available many different simulation models. The output of each model has an associated uncertainty that is a result of the modeling process. ...
Uncertainty in climate change policy analysis
Jacoby, Henry D.; Prinn, Ronald G.
Achieving agreement about whether and how to control greenhouse gas emissions would be difficult enough even if the consequences were fully known. Unfortunately, choices must be made in the face of great uncertainty, about ...
Uncertainty Quantification in ocean state estimation
Kalmikov, Alexander G
2013-01-01T23:59:59.000Z
Quantifying uncertainty and error bounds is a key outstanding challenge in ocean state estimation and climate research. It is particularly difficult due to the large dimensionality of this nonlinear estimation problem and ...
A note on competitive investment under uncertainty
Pindyck, Robert S.
1991-01-01T23:59:59.000Z
This paper clarifies how uncertainty affects irreversible investment in a competitive market equilibrium. With free entry, irreversibility affects the distribution of future prices, and thereby creates an opportunity cost ...
Handling uncertainty in DEX methodology
Bohanec, Marko
URPDM2010. Martin Znidarsic, Jozef Stefan Institute, Jamova cesta 39 (martin.znidarsic@ijs.si); Marko Bohanec, Jozef Stefan Institute, Jamova cesta 39.
Uncertainty Quantification Techniques of SCALE/TSUNAMI
Rearden, Bradley T [ORNL]; Mueller, Don [ORNL]
2011-01-01T23:59:59.000Z
The Standardized Computer Analysis for Licensing Evaluation (SCALE) code system developed at Oak Ridge National Laboratory (ORNL) includes Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI). The TSUNAMI code suite can quantify the predicted change in system responses, such as k{sub eff}, reactivity differences, or ratios of fluxes or reaction rates, due to changes in the energy-dependent, nuclide-reaction-specific cross-section data. Where uncertainties in the neutron cross-section data are available, the sensitivity of the system to the cross-section data can be applied to propagate the uncertainties in the cross-section data to an uncertainty in the system response. Uncertainty quantification is useful for identifying potential sources of computational bias and highlighting parameters important to code validation. Traditional validation techniques often examine one or more average physical parameters to characterize a system and identify applicable benchmark experiments. With TSUNAMI, however, correlation coefficients are developed by propagating the uncertainties in neutron cross-section data to uncertainties in the computed responses for experiments and safety applications through sensitivity coefficients. The bias in the experiments, as a function of their correlation coefficient with the intended application, is extrapolated to predict the bias and bias uncertainty in the application through trending analysis or generalized linear least squares techniques, often referred to as 'data adjustment.' Even with advanced tools to identify benchmark experiments, analysts occasionally find that the application models include some feature or material for which adequately similar benchmark experiments do not exist to support validation. For example, a criticality safety analyst may want to take credit for the presence of fission products in spent nuclear fuel.
In such cases, analysts sometimes rely on 'expert judgment' to select an additional administrative margin to account for gaps in the validation data or to conclude that the impact on the calculated bias and bias uncertainty is negligible. As a result of advances in computer programs and the evolution of cross-section covariance data, analysts can use the sensitivity and uncertainty analysis tools in the TSUNAMI codes to estimate the potential impact on the application-specific bias and bias uncertainty resulting from nuclides not represented in available benchmark experiments. This paper presents the application of methods described in a companion paper.
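The propagation of cross-section covariances through sensitivity coefficients described above follows the standard "sandwich rule", var(R)/R² = S·C·Sᵀ for relative sensitivities S and a relative covariance matrix C. The sketch below uses invented numbers, not SCALE/TSUNAMI data or output.

```python
# Sandwich-rule sketch: fold a (relative) covariance matrix of nuclear data
# through a response's sensitivity coefficients to get a response uncertainty.

def response_rel_variance(s, c):
    """s: list of relative sensitivities; c: relative covariance matrix."""
    n = len(s)
    return sum(s[i] * c[i][j] * s[j] for i in range(n) for j in range(n))

s = [0.30, -0.10]                    # assumed sensitivities of k-eff to two reactions
c = [[4.0e-4, 1.0e-4],               # assumed relative covariance matrix
     [1.0e-4, 9.0e-4]]
rel_var = response_rel_variance(s, c)
rel_std_pct = 100.0 * rel_var ** 0.5  # percent uncertainty in the response
```

The off-diagonal covariance terms matter: here the anticorrelated sensitivities (0.30 and -0.10) make the cross terms subtract, slightly reducing the total response uncertainty.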
Sandia Energy - Computational Fluid Dynamics & Large-Scale Uncertainty Quantification for Wind Energy
Improvements of Nuclear Data and Its Uncertainties by Theoretical Modeling
Office of Scientific and Technical Information (OSTI)
Risk Analysis and Decision-Making Under Uncertainty: A Strategy and its Applications
Office of Environmental Management (EM)
Ming Ye ...
Programmatic methods for addressing contaminated volume uncertainties.
DURHAM, L.A.; JOHNSON, R.L.; RIEMAN, C.R.; SPECTOR, H.L.; Environmental Science Division; U.S. ARMY CORPS OF ENGINEERS BUFFALO DISTRICT
2007-01-01T23:59:59.000Z
Accurate estimates of the volumes of contaminated soils or sediments are critical to effective program planning and to successfully designing and implementing remedial actions. Unfortunately, data available to support the preremedial design are often sparse and insufficient for accurately estimating contaminated soil volumes, resulting in significant uncertainty associated with these volume estimates. The uncertainty in the soil volume estimates significantly contributes to the uncertainty in the overall project cost estimates, especially since excavation and off-site disposal are the primary cost items in soil remedial action projects. The Army Corps of Engineers Buffalo District's experience has been that historical contaminated soil volume estimates developed under the Formerly Utilized Sites Remedial Action Program (FUSRAP) often underestimated the actual volume of subsurface contaminated soils requiring excavation during the course of a remedial activity. In response, the Buffalo District has adopted a variety of programmatic methods for addressing contaminated volume uncertainties. These include developing final status survey protocols prior to remedial design, explicitly estimating the uncertainty associated with volume estimates, investing in predesign data collection to reduce volume uncertainties, and incorporating dynamic work strategies and real-time analytics in predesign characterization and remediation activities. This paper describes some of these experiences in greater detail, drawing from the knowledge gained at Ashland1, Ashland2, Linde, and Rattlesnake Creek. In the case of Rattlesnake Creek, these approaches provided the Buffalo District with an accurate predesign contaminated volume estimate and resulted in one of the first successful FUSRAP fixed-price remediation contracts for the Buffalo District.
Uncertainties and severe-accident management
Kastenberg, W.E. (Univ. of California, Los Angeles (United States))
1991-01-01T23:59:59.000Z
Severe-accident management can be defined as the use of existing and/or alternative resources, systems, and actions to prevent or mitigate a core-melt accident. Together with risk management (e.g., changes in plant operation and/or addition of equipment) and emergency planning (off-site actions), accident management provides an extension of the defense-in-depth safety philosophy for severe accidents. A significant number of probabilistic safety assessments have been completed, which yield the principal plant vulnerabilities, and can be categorized as (a) dominant sequences with respect to core-melt frequency, (b) dominant sequences with respect to various risk measures, (c) dominant threats that challenge safety functions, and (d) dominant threats with respect to failure of safety systems. Severe-accident management strategies can be generically classified as (a) use of alternative resources, (b) use of alternative equipment, and (c) use of alternative actions. For each sequence/threat and each combination of strategy, there may be several options available to the operator. Each strategy/option involves phenomenological and operational considerations regarding uncertainty. These include (a) uncertainty in key phenomena, (b) uncertainty in operator behavior, (c) uncertainty in system availability and behavior, and (d) uncertainty in information availability (i.e., instrumentation). This paper focuses on phenomenological uncertainties associated with severe-accident management strategies.
ENHANCED UNCERTAINTY ANALYSIS FOR SRS COMPOSITE ANALYSIS
Smith, F.; Phifer, M.
2011-06-30T23:59:59.000Z
The Composite Analysis (CA) performed for the Savannah River Site (SRS) in 2009 (SRS CA 2009) included a simplified uncertainty analysis. The uncertainty analysis in the CA (Smith et al. 2009b) was limited to considering at most five sources in a separate uncertainty calculation performed for each POA. To perform the uncertainty calculations in a reasonable amount of time, the analysis was limited to 400 realizations and 2,000 years of simulated transport time, and the time steps used for the uncertainty analysis were increased from those used in the CA base case analysis. As part of the CA maintenance plan, the Savannah River National Laboratory (SRNL) committed to improving the CA uncertainty/sensitivity analysis. The previous uncertainty analysis was constrained by the standard GoldSim licensing, which limits the user to running at most four Monte Carlo uncertainty calculations (also called realizations) simultaneously. Some of the limitations on the number of realizations that could be practically run and on the simulation time steps were removed by building a cluster of three HP ProLiant Windows servers with a total of 36 64-bit processors and by licensing the GoldSim DP-Plus distributed processing software. This allowed running as many as 35 realizations simultaneously (one processor is reserved as a master process that controls running the realizations). These enhancements to SRNL computing capabilities made feasible an uncertainty analysis using 1,000 realizations, the time steps employed in the base case CA calculations, more sources, and 10,000 years of simulated radionuclide transport. In addition, an importance screening analysis was performed to identify the class of stochastic variables that have the most significant impact on model uncertainty.
This analysis ran the uncertainty model separately testing the response to variations in the following five sets of model parameters: (a) K{sub d} values (72 parameters for the 36 CA elements in sand and clay), (b) Dose Parameters (34 parameters), (c) Material Properties (20 parameters), (d) Surface Water Flows (6 parameters), and (e) Vadose and Aquifer Flow (4 parameters). Results provided an assessment of which group of parameters is most significant in the dose uncertainty. It was found that K{sub d} and the vadose/aquifer flow parameters, both of which impact transport timing, had the greatest impact on dose uncertainty. Dose parameters had an intermediate level of impact while material properties and surface water flows had little impact on dose uncertainty. Results of the importance analysis are discussed further in Section 7 of this report. The objectives of this work were to address comments received during the CA review on the uncertainty analysis and to demonstrate an improved methodology for CA uncertainty calculations as part of CA maintenance. This report partially addresses the LFRG Review Team issue of producing an enhanced CA sensitivity and uncertainty analysis. This is described in Table 1-1 which provides specific responses to pertinent CA maintenance items extracted from Section 11 of the SRS CA (2009). As noted above, the original uncertainty analysis looked at each POA separately and only included the effects from at most five sources giving the highest peak doses at each POA. Only 17 of the 152 CA sources were used in the original uncertainty analysis and the simulation time was reduced from 10,000 to 2,000 years. A major constraint on the original uncertainty analysis was the limitation of only being able to use at most four distributed processes. This work expanded the analysis to 10,000 years using 39 of the CA sources, included cumulative dose effects at downstream POAs, with more realizations (1,000) and finer time steps. 
This was accomplished by using the GoldSim DP-Plus module and the 36 processors available on a new Windows cluster. The last part of the work examined the contribution to overall uncertainty from the main categories of uncertainty variables: K{sub d}s, dose parameters, flow parameters, and material properties.
Uncertainties in Galactic Chemical Evolution Models
Côté, Benoit; O'Shea, Brian W; Herwig, Falk; Pignatari, Marco; Jones, Samuel; Fryer, Chris
2015-01-01T23:59:59.000Z
We use a simple one-zone galactic chemical evolution model to quantify the uncertainties generated by the input parameters in numerical predictions, for a galaxy with properties similar to those of the Milky Way. We compiled several studies from the literature to gather the current constraints for our simulations regarding the typical value and uncertainty of seven basic parameters, which are: the lower and upper mass limit of the stellar initial mass function (IMF), the slope of the high-mass end of the stellar IMF, the slope of the delay-time distribution function of Type Ia supernovae (SNe Ia), the number of SNe Ia per solar mass formed, the total stellar mass formed, and the initial mass of gas of the galaxy. We derived a probability distribution function to express the range of likely values for every parameter, which were then included in a Monte Carlo code to run several hundred simulations with randomly selected input parameters. This approach enables us to analyze the predicted chemical evolution of ...
Error propagation equations for estimating the uncertainty in high-speed wind tunnel test results
Clark, E.L.
1994-07-01T23:59:59.000Z
Error propagation equations, based on the Taylor series model, are derived for the nondimensional ratios and coefficients most often encountered in high-speed wind tunnel testing. These include pressure ratio and coefficient, static force and moment coefficients, dynamic stability coefficients, and calibration Mach number. The error equations contain partial derivatives, denoted as sensitivity coefficients, which define the influence of free-stream Mach number, M{infinity}, on various aerodynamic ratios. To facilitate use of the error equations, sensitivity coefficients are derived and evaluated for five fundamental aerodynamic ratios which relate free-stream test conditions to a reference condition.
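The Taylor-series pattern behind these error equations can be illustrated for the simplest case, a ratio r = a/b of two measured quantities; the sensitivity coefficients are the partial derivatives, and the numbers are hypothetical.

```python
import math

# First-order Taylor-series error propagation for a ratio r = a / b:
#   sigma_r^2 = (dr/da * sigma_a)^2 + (dr/db * sigma_b)^2
def ratio_uncertainty(a, sigma_a, b, sigma_b):
    r = a / b
    dr_da = 1.0 / b        # sensitivity coefficient w.r.t. the numerator
    dr_db = -a / b**2      # sensitivity coefficient w.r.t. the denominator
    sigma_r = math.sqrt((dr_da * sigma_a)**2 + (dr_db * sigma_b)**2)
    return r, sigma_r

# Example: a pressure ratio with 1% uncertainty on each measurement.
r, sr = ratio_uncertainty(50.0, 0.5, 100.0, 1.0)
```

For uncorrelated 1% inputs the relative uncertainty of the ratio comes out to sqrt(2) x 1%, which is the kind of result the report's tabulated sensitivity coefficients let one read off directly.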
Bennett, C.T.
1995-02-23T23:59:59.000Z
Post hoc analyses have demonstrated clearly that macro-system, organizational processes have played important roles in such major catastrophes as Three Mile Island, Bhopal, Exxon Valdez, Chernobyl, and Piper Alpha. How can managers of such high-consequence organizations as nuclear power plants and nuclear explosives handling facilities be sure that similar macro-system processes are not operating in their plants? To date, macro-system effects have not been integrated into risk assessments. Part of the reason for not using macro-system analyses to assess risk may be the impression that standard organizational measurement tools do not provide hard data that can be managed effectively. In this paper, I argue that organizational dimensions, like those in ISO 9000, can be quantified and integrated into standard risk assessments.
Uncertainty and sampling issues in tank characterization
Liebetrau, A.M.; Pulsipher, B.A.; Kashporenko, D.M. [and others]
1997-06-01T23:59:59.000Z
A defensible characterization strategy must recognize that uncertainties are inherent in any measurement or estimate of interest and must employ statistical methods for quantifying and managing those uncertainties. Estimates of risk and therefore key decisions must incorporate knowledge about uncertainty. This report focuses on statistical methods that should be employed to ensure confident decision making and appropriate management of uncertainty. Sampling is a major source of uncertainty that deserves special consideration in the tank characterization strategy. The question of whether sampling will ever provide the reliable information needed to resolve safety issues is explored. The issue of sample representativeness must be resolved before sample information is reliable. Representativeness is a relative term but can be defined in terms of bias and precision. Currently, precision can be quantified and managed through an effective sampling and statistical analysis program. Quantifying bias is more difficult and is not being addressed under the current sampling strategies. Bias could be bounded by (1) employing new sampling methods that can obtain samples from other areas in the tanks, (2) putting in new risers on some worst case tanks and comparing the results from existing risers with new risers, or (3) sampling tanks through risers under which no disturbance or activity has previously occurred. With some bound on bias and estimates of precision, various sampling strategies could be determined and shown to be either cost-effective or infeasible.
INCORPORATING UNCERTAINTY INTO DAM SAFETY RISK
Chauhan, Sanjay S.
Describes an approach to incorporating input uncertainties into a dam safety risk analysis model and demonstrates some useful formats for presenting the results of uncertainty analysis in dam safety risk assessment; on the topic of uncertainty in quantitative risk and policy analysis the reader is referred to Morgan ...
Bounded Uncertainty Roadmaps for Path Planning
Guibas, Leonidas J.
We introduce the notion of a bounded uncertainty roadmap (BURM) to account for uncertainty during planning; it is not much slower than classic probabilistic roadmap planning algorithms, which ignore uncertainty ...
Programmatic methods for addressing contaminated volume uncertainties
Rieman, C.R.; Spector, H.L. [U.S. Army Corps of Engineers Buffalo District, Buffalo, NY (United States); Durham, L.A.; Johnson, R.L. [Argonne National Laboratory, Environmental Science Div., IL (United States)
2007-07-01T23:59:59.000Z
Accurate estimates of the volumes of contaminated soils or sediments are critical to effective program planning and to successfully designing and implementing remedial actions. Unfortunately, data available to support the pre-remedial design are often sparse and insufficient for accurately estimating contaminated soil volumes, resulting in significant uncertainty associated with these volume estimates. The uncertainty in the soil volume estimates significantly contributes to the uncertainty in the overall project cost estimates, especially since excavation and off-site disposal are the primary cost items in soil remedial action projects. The U.S. Army Corps of Engineers Buffalo District's experience has been that historical contaminated soil volume estimates developed under the Formerly Utilized Sites Remedial Action Program (FUSRAP) often underestimated the actual volume of subsurface contaminated soils requiring excavation during the course of a remedial activity. In response, the Buffalo District has adopted a variety of programmatic methods for addressing contaminated volume uncertainties. These include developing final status survey protocols prior to remedial design, explicitly estimating the uncertainty associated with volume estimates, investing in pre-design data collection to reduce volume uncertainties, and incorporating dynamic work strategies and real-time analytics in pre-design characterization and remediation activities. This paper describes some of these experiences in greater detail, drawing from the knowledge gained at Ashland 1, Ashland 2, Linde, and Rattlesnake Creek. In the case of Rattlesnake Creek, these approaches provided the Buffalo District with an accurate pre-design contaminated volume estimate and resulted in one of the first successful FUSRAP fixed-price remediation contracts for the Buffalo District. (authors)
Is the Heisenberg uncertainty relation really violated?
Masao Kitano
2008-03-31T23:59:59.000Z
It has been pointed out that for some types of measurement the Heisenberg uncertainty relation seems to be violated. In order to save the situation a new uncertainty relation was proposed by Ozawa. Here we introduce revised definitions of error and disturbance taking into account the gain associated with generalized measurement interactions. With these new definitions, the validity of the Heisenberg inequality is recovered for continuous linear measurement interactions. We also examine the changes in distribution functions caused by the general measurement interaction and clarify the physical meanings of infinitely large errors and disturbances.
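For reference, the inequalities at issue are commonly written as follows, with root-mean-square error ε(A), disturbance η(B), and standard deviations σ(A), σ(B). This is the textbook form of the Heisenberg-type and Ozawa relations, not necessarily the revised definitions proposed in the abstract:

$$\varepsilon(A)\,\eta(B) \;\ge\; \frac{1}{2}\bigl|\langle[\hat{A},\hat{B}]\rangle\bigr| \qquad \text{(Heisenberg-type noise-disturbance relation)}$$

$$\varepsilon(A)\,\eta(B) + \varepsilon(A)\,\sigma(B) + \sigma(A)\,\eta(B) \;\ge\; \frac{1}{2}\bigl|\langle[\hat{A},\hat{B}]\rangle\bigr| \qquad \text{(Ozawa's relation)}$$

The first can be violated for some measurement interactions, while Ozawa's relation holds generally; the abstract argues that gain-corrected definitions of error and disturbance restore the first for continuous linear measurement interactions.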
Uncertainty in Simulating Wheat Yields Under Climate Change
Asseng, S.; Ewert, F.; Rosenzweig, C.; Jones, J.W.; Hatfield, Jerry; Ruane, Alex; Boote, K. J.; Thorburn, Peter; Rotter, R.P.; Cammarano, D.; Brisson, N.; Basso, B.; Martre, P.; Aggarwal, P.K.; Angulo, C.; Bertuzzi, P.; Biernath, C.; Challinor, AJ; Doltra, J.; Gayler, S.; Goldberg, R.; Grant, Robert; Heng, L.; Hooker, J.; Hunt, L.A.; Ingwersen, J.; Izaurralde, Roberto C.; Kersebaum, K.C.; Mueller, C.; Naresh Kumar, S.; Nendel, C.; O'Leary, G.O.; Olesen, JE; Osborne, T.; Palosuo, T.; Priesack, E.; Ripoche, D.; Semenov, M.A.; Shcherbak, I.; Steduto, P.; Stockle, Claudio O.; Stratonovitch, P.; Streck, T.; Supit, I.; Tao, F.; Travasso, M.; Waha, K.; Wallach, D.; White, J.W.; Williams, J.R.; Wolf, J.
2013-09-01T23:59:59.000Z
Anticipating the impacts of climate change on crop yields is critical for assessing future food security. Process-based crop simulation models are the most commonly used tools in such assessments1,2. Analysis of uncertainties in future greenhouse gas emissions and their impacts on future climate change has been increasingly described in the literature3,4 while assessments of the uncertainty in crop responses to climate change are very rare. Systematic and objective comparisons across impact studies are difficult, and thus have not been fully realized5. Here we present the largest coordinated and standardized crop model intercomparison for climate change impacts on wheat production to date. We found that several individual crop models are able to reproduce measured grain yields under current diverse environments, particularly if sufficient details are provided to execute them. However, simulated climate change impacts can vary across models due to differences in model structures and algorithms. The crop-model component of uncertainty in climate change impact assessments was considerably larger than the climate-model component from Global Climate Models (GCMs). Model responses to high temperatures and temperature-by-CO2 interactions are identified as major sources of simulated impact uncertainties. Significant reductions in impact uncertainties through model improvements in these areas and improved quantification of uncertainty through multi-model ensembles are urgently needed for a more reliable translation of climate change scenarios into agricultural impacts in order to develop adaptation strategies and aid policymaking.
Visual Exploration of Uncertainty in Remote-Sensing Classification
Utrecht, Universiteit
Visual analysis of remotely-sensed data aims at acquiring insight into the stability of possible classifications. Remote sensing yields an overwhelming flow of data on the appearance and condition of our planet ...
Time Crystals from Minimum Time Uncertainty
Mir Faizal; Mohammed M. Khalil; Saurya Das
2014-12-29T23:59:59.000Z
Motivated by the Generalized Uncertainty Principle, covariance, and a minimum measurable time, we propose a deformation of the Heisenberg algebra, and show that this leads to corrections to all quantum mechanical systems. We also demonstrate that such a deformation implies a discrete spectrum for time. In other words, time behaves like a crystal.
Model Uncertainty in Discrete Event Systems
Garg, Vijay
Prior work assumed that a correct model of the system to be controlled was available. A goal of this work is to use the finite state machine model with controllable and uncontrollable events to control systems in the presence of uncertainty in the model of the system and the environment in which it operates ...
Uncertainty in climate science and climate policy
This essay, written by a statistician and a climate scientist, describes our view of the gap that exists between current practice in mainstream climate science and the practical ...
Outage Probability Under Channel Distribution Uncertainty
Loyka, Sergey
Outage probability of a class of block-fading (MIMO) channels is considered, with outage probability defined as a min (over the input distribution)-max (over the channel distribution class) ...
IJCAI-05 Workshop Reasoning with Uncertainty in
Pineau, Joelle
Bayesian Environmetrics: Uncertainty and Sensitivity Analysis
Draper, David
Two case studies, both involving risk assessment for nuclear waste disposal, illustrate uncertainty and sensitivity analysis and inverse problems in fluid dynamics; Bayesian methods are argued to have better repeated-sampling properties than maximum likelihood. Case Study 1 concerns GESAMAC ...
Dealing with Uncertainties During Heat Exchanger Design
Polley, G. T.; Pugh, S. J.
2001-01-01T23:59:59.000Z
Over the last thirty years much progress has been made in heat exchanger design methodology. Even so, the design engineer still has to deal with a great deal of uncertainty. Whilst the methods used to predict heat transfer coefficients are now quite...
Nuclear power expansion: thinking about uncertainty
Holt, Lynne; Sotkiewicz, Paul; Berg, Sanford
2010-06-15T23:59:59.000Z
Nuclear power is one of many options available to achieve reduced carbon dioxide emissions. The real-option value model can help explain the uncertainties facing prospective nuclear plant developers in developing mitigation strategies for the development, construction, and operation of new nuclear plants. (author)
UAV mission planning under uncertainty
Sakamoto, Philemon
2006-01-01T23:59:59.000Z
With the continued development of high endurance Unmanned Aerial Vehicles (UAV) and Unmanned Combat Aerial Vehicles (UCAV) that are capable of performing autonomous functions across the spectrum of military operations, ...
Supporting qualified database for uncertainty evaluation
Petruzzi, A.; Fiori, F.; Kovtonyuk, A.; D'Auria, F. [Nuclear Research Group of San Piero A Grado, Univ. of Pisa, Via Livornese 1291, 56122 Pisa (Italy)
2012-07-01T23:59:59.000Z
Uncertainty evaluation constitutes a key feature of the BEPU (Best Estimate Plus Uncertainty) process. The uncertainty can be the result of a Monte Carlo type analysis involving input uncertainty parameters or the outcome of a process involving the use of experimental data and connected code calculations. Those uncertainty methods are discussed in several papers and guidelines (IAEA-SRS-52, OECD/NEA BEMUSE reports). The present paper aims at discussing the role and the depth of the analysis required for merging suitable experimental data on one side and qualified code calculation results on the other. This aspect is mostly connected with the second approach for uncertainty mentioned above, but it can be used also in the framework of the first approach. Namely, the paper discusses the features and structure of the database that includes the following kinds of documents: 1. The 'RDS-facility' (Reference Data Set for the selected facility): this includes the description of the facility, the geometrical characterization of any component of the facility, the instrumentation, the data acquisition system, the evaluation of pressure losses, the physical properties of the material and the characterization of pumps, valves and heat losses; 2. The 'RDS-test' (Reference Data Set for the selected test of the facility): this includes the description of the main phenomena investigated during the test, the configuration of the facility for the selected test (possible new evaluation of pressure and heat losses if needed) and the specific boundary and initial conditions; 3. The 'QR' (Qualification Report) of the code calculation results: this includes the description of the nodalization developed following a set of homogeneous techniques, the achievement of the steady state conditions and the qualitative and quantitative analysis of the transient with the characterization of the Relevant Thermal-Hydraulic Aspects (RTA); 4.
The 'EH' (Engineering Handbook) of the input nodalization: this includes the rationale adopted for each part of the nodalization, the user choices, and the systematic derivation and justification of any value present in the code input with respect to the values indicated in the RDS-facility and in the RDS-test. (authors)
Representation of analysis results involving aleatory and epistemic uncertainty.
Johnson, Jay Dean (ProStat, Mesa, AZ); Helton, Jon Craig (Arizona State University, Tempe, AZ); Oberkampf, William Louis; Sallaberry, Cedric J.
2008-08-01T23:59:59.000Z
Procedures are described for the representation of results in analyses that involve both aleatory uncertainty and epistemic uncertainty, with aleatory uncertainty deriving from an inherent randomness in the behavior of the system under study and epistemic uncertainty deriving from a lack of knowledge about the appropriate values to use for quantities that are assumed to have fixed but poorly known values in the context of a specific study. Aleatory uncertainty is usually represented with probability and leads to cumulative distribution functions (CDFs) or complementary cumulative distribution functions (CCDFs) for analysis results of interest. Several mathematical structures are available for the representation of epistemic uncertainty, including interval analysis, possibility theory, evidence theory and probability theory. In the presence of epistemic uncertainty, there is not a single CDF or CCDF for a given analysis result. Rather, there is a family of CDFs and a corresponding family of CCDFs that derive from epistemic uncertainty and have an uncertainty structure that derives from the particular uncertainty structure (i.e., interval analysis, possibility theory, evidence theory, probability theory) used to represent epistemic uncertainty. Graphical formats for the representation of epistemic uncertainty in families of CDFs and CCDFs are investigated and presented for the indicated characterizations of epistemic uncertainty.
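The nested-sampling construction behind a "family of CDFs" can be sketched in a few lines: an outer loop samples the epistemic quantities (fixed but poorly known), an inner loop samples the aleatory variability, and each outer sample yields one CDF. The exponential failure-time model and the parameter range below are invented for illustration.

```python
import random

random.seed(1)

def empirical_cdf(samples):
    # Return the empirical CDF as (value, cumulative probability) pairs.
    xs = sorted(samples)
    return [(x, (i + 1) / len(xs)) for i, x in enumerate(xs)]

family_of_cdfs = []
for _ in range(10):                              # epistemic loop
    failure_rate = random.uniform(0.5, 2.0)      # fixed-but-unknown constant
    # aleatory loop: inherent randomness for this epistemic realization
    outcomes = [random.expovariate(failure_rate) for _ in range(200)]
    family_of_cdfs.append(empirical_cdf(outcomes))
```

Plotting all ten curves together gives the family of CDFs the paper discusses; replacing the outer probability distribution with intervals, possibility measures, or belief functions yields the alternative epistemic representations.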
The Time's Arrow within the Uncertainty Quantum
Zhen Wang
1998-06-22T23:59:59.000Z
A generalized framework is developed which uses a set description instead of wavefunction to emphasize the role of the observer. Such a framework is found to be very effective in the study of the measurement problem and time's arrow. Measurement in classical and quantum theory is given a unified treatment. With the introduction of the concept of uncertainty quantum which is the basic unit of measurement, we show that the time's arrow within the uncertainty quantum is just opposite to the time's arrow in the observable reality. A special constant is discussed which explains our sensation of time and provides a permanent substrate for all change. It is shown that the whole spacetime connects together in a delicate structure.
Uncertainty in BWR power during ATWS events
Diamond, D.J.
1986-01-01T23:59:59.000Z
A study was undertaken to improve our understanding of BWR conditions following the closure of main steam isolation valves and the failure of reactor trip. Of particular interest was the power during the period when the core had reached a quasi-equilibrium condition with a natural circulation flow rate determined by the water level in the downcomer. Insights into the uncertainty in the calculation of this power with sophisticated computer codes were quantified using a simple model which relates power to the principal thermal-hydraulic variables and reactivity coefficients; the latter representing the link between the thermal-hydraulics and the neutronics. Assumptions regarding the uncertainty in these variables and coefficients were then used to determine the uncertainty in power.
A Framework for Modeling Uncertainty in Regional Climate Change
Monier, Erwan
In this study, we present a new modeling framework and a large ensemble of climate projections to investigate the uncertainty in regional climate change over the US associated with four dimensions of uncertainty. The sources ...
Uncertainties in Energy Consumption Introduced by Building Operations and Weather for a Medium ...
Differences between predicted and actual building energy consumption can be attributed to uncertainties in energy consumption introduced by actual weather and building operational practices, using a simulation ...
Uncertainty analysis of climate change and policy response
Webster, Mort David.; Forest, Chris Eliot.; Reilly, John M.; Babiker, Mustafa H.M.; Kicklighter, David W.; Mayer, Monika.; Prinn, Ronald G.; Sarofim, Marcus C.; Sokolov, Andrei P.; Stone, Peter H.; Wang, Chien.
To aid climate policy decisions, accurate quantitative descriptions of the uncertainty in climate outcomes under various possible policies are needed. Here, we apply an earth systems model to describe the uncertainty in ...
Adaptive control of hypersonic vehicles in presence of actuation uncertainties
Somanath, Amith
2010-01-01T23:59:59.000Z
The thesis develops a new class of adaptive controllers that guarantee global stability in presence of actuation uncertainties. Actuation uncertainties culminate to linear plants with a partially known input matrix B. ...
Spectral Representations of Uncertainty: Algorithms and Applications
George Em Karniadakis
2005-04-24T23:59:59.000Z
The objectives of this project were: (1) Develop a general algorithmic framework for stochastic ordinary and partial differential equations. (2) Place the polynomial chaos method and its generalization on firm theoretical ground. (3) Quantify uncertainty in large-scale simulations involving CFD, MHD and microflows. The overall goal of this project was to provide DOE with an algorithmic capability that is more accurate and three to five orders of magnitude more efficient than Monte Carlo simulation.
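A minimal non-intrusive polynomial chaos projection can make the idea concrete, assuming a single standard-normal input and probabilists' Hermite polynomials He_k; the toy output f(ξ) = ξ² is chosen so the exact coefficients are known (f = 1·He0 + 0·He1 + 1·He2).

```python
import random

random.seed(2)

# Probabilists' Hermite polynomials via the recurrence
# He_k(x) = x*He_{k-1}(x) - (k-1)*He_{k-2}(x).
def he(k, x):
    if k == 0:
        return 1.0
    if k == 1:
        return x
    return x * he(k - 1, x) - (k - 1) * he(k - 2, x)

def factorial(k):
    return 1 if k == 0 else k * factorial(k - 1)

# Non-intrusive projection: c_k = E[f(xi) He_k(xi)] / k!  (orthogonality
# E[He_j He_k] = k! * delta_jk under the standard normal measure),
# estimated here by plain Monte Carlo sampling of the "model" f.
f = lambda x: x * x
n = 200_000
xi = [random.gauss(0.0, 1.0) for _ in range(n)]
coeffs = [sum(f(x) * he(k, x) for x in xi) / (n * factorial(k))
          for k in range(3)]
```

The intrusive alternative would instead substitute the expansion into the governing equations and Galerkin-project, as discussed in the reacting-flow entry below; the sampling step here is exactly what the project contrasts with (and aims to beat by orders of magnitude).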
Uncertainty estimates for derivatives and intercepts
Clark, E.L.
1990-01-01T23:59:59.000Z
Straight line least squares fits of experimental data are widely used in the analysis of test results to provide derivatives and intercepts. A method for evaluating the uncertainty in these parameters is described. The method utilizes conventional least squares results and is applicable to experiments where the independent variable is controlled, but not necessarily free of error. A Monte Carlo verification of the method is given. 7 refs., 2 tabs.
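The conventional least-squares results the method builds on look like this for a straight-line fit: slope and intercept plus their standard uncertainties from the residual variance. The data are made up.

```python
import math

def linfit_with_uncertainty(x, y):
    # Ordinary least squares y = intercept + slope*x, with the textbook
    # standard-uncertainty estimates for both parameters.
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    slope = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
    intercept = ybar - slope * xbar
    resid = [yi - (intercept + slope * xi) for xi, yi in zip(x, y)]
    s2 = sum(r * r for r in resid) / (n - 2)          # residual variance
    se_slope = math.sqrt(s2 / sxx)
    se_intercept = math.sqrt(s2 * (1.0 / n + xbar ** 2 / sxx))
    return slope, intercept, se_slope, se_intercept

x = [0.0, 1.0, 2.0, 3.0, 4.0]
y = [0.1, 2.1, 3.9, 6.2, 7.9]    # roughly y = 2x with small noise
m, b, se_m, se_b = linfit_with_uncertainty(x, y)
```

The report's contribution is extending this picture to controlled-but-not-error-free independent variables; the formulas above assume error-free x, which is the baseline case.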
Information-Disturbance theorem and Uncertainty Relation
Takayuki Miyadera; Hideki Imai
2007-07-31T23:59:59.000Z
It has been shown that the Information-Disturbance theorem can play an important role in security proofs of quantum cryptography. The theorem is by itself interesting since it can be regarded as an information theoretic version of the uncertainty principle. It has, however, only been able to treat restricted situations. In this paper, the restriction on the source is abandoned, and a general information-disturbance theorem is obtained. The theorem relates information gain by Eve with information gain by Bob.
Uncertainty and Complementarity Relations in Weak Measurement
Arun Kumar Pati; Junde Wu
2014-11-26T23:59:59.000Z
We prove uncertainty relations that quantitatively express the impossibility of jointly sharp preparation of pre- and post-selected quantum states for measuring incompatible observables during the weak measurement. By defining a suitable operator whose average in the pre-selected quantum state gives the weak value, we show that one can have new uncertainty relations for variances of two such operators corresponding to two non-commuting observables. These generalize the recent stronger uncertainty relations that give non-trivial lower bounds for the sum of variances of two observables which fully capture the concept of incompatible observables. Furthermore, we show that weak values for two non-commuting projection operators obey a complementarity relation. Specifically, we show that for a pre-selected state if we measure a projector corresponding to an observable $A$ weakly followed by the strong measurement of another observable $B$ (for the post-selection) and, for the same pre-selected state we measure a projector corresponding to an observable $B$ weakly followed by the strong measurement of the observable $A$ (for the post-selection), then the product of these two weak values is always less than one. This shows that even though individually they are complex and can be large, their product is always bounded.
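The "stronger uncertainty relations" referred to are, in their commonly cited form for a state $|\psi\rangle$ and any state $|\psi^{\perp}\rangle$ orthogonal to it (quoted from memory, so treat this as a sketch of the standard statement rather than the paper's exact generalization):

$$\Delta A^2 + \Delta B^2 \;\ge\; \pm i\,\langle\psi|[A,B]|\psi\rangle \;+\; \bigl|\langle\psi|\,A \pm iB\,|\psi^{\perp}\rangle\bigr|^2$$

Unlike the Robertson bound, the right-hand side is nontrivial even when $|\psi\rangle$ is an eigenstate of one observable, which is what "fully capture the concept of incompatible observables" refers to.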
Dynamic diagnostic and decision procedures under uncertainty
Baranov, V.V.
1995-01-01T23:59:59.000Z
In this paper, we consider uncertainty that arises when the true state x ∈ E is not accessible to direct observation and remains unknown. Instead, we observe some features θ ∈ Θ that carry a certain information about the true state. This information is described by the conditional distribution P(Θ|E), which we call the linkage distribution. Regarding this distribution we assume that it exists but is unknown. This leads to uncertainty with respect to states from E and the linkage distribution P(Θ|E), which we denote by NEP. The substantive problem can be stated as follows: from observations of the features θ ∈ Θ made at each time instant n = 1,2,..., recognize the state x ∈ E, identify the linkage distribution P, and use the results of recognition and identification to choose a decision y ∈ Y so that the decision process is optimal in some sense. State recognition is the subject of diagnostics. The uncertainty NEP thus generates a problem of diagnostics and dynamic decision making.
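The recognition step reduces to Bayes' rule once a linkage distribution is in hand; in the paper the linkage distribution must itself be identified from data, but the update it feeds looks like this. The states, features, and probabilities below are invented for illustration.

```python
# Recognize the hidden state x in E from an observed feature theta,
# given a prior P(x) and an (assumed known) linkage distribution P(theta|x).
prior = {"healthy": 0.9, "faulty": 0.1}              # P(x) over states E
linkage = {                                           # P(theta | x)
    "healthy": {"low_vibration": 0.8, "high_vibration": 0.2},
    "faulty":  {"low_vibration": 0.3, "high_vibration": 0.7},
}

def recognize(theta):
    # Bayes' rule: P(x | theta) = P(x) P(theta | x) / sum_x' P(x') P(theta | x')
    joint = {x: prior[x] * linkage[x][theta] for x in prior}
    z = sum(joint.values())
    return {x: p / z for x, p in joint.items()}

posterior = recognize("high_vibration")
```

Repeating this update at each time instant n = 1, 2, ... while re-estimating the linkage probabilities from the accumulating observations gives the combined recognition-and-identification loop the abstract describes.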
Uncertainty analyses of CO2 plume expansion subsequent to wellbore...
Office of Scientific and Technical Information (OSTI)
Sponsoring Org: USDOE Country of Publication: United States Language: English Subject: Carbon sequestration, uncertainty quantification, wellbore leakage, reservoir...
Uncertainty, Performance, and Model Dependency in Approximate Adaptive Nonlinear Control
Szepesvari, Csaba
The performance of a class of approximate model based adaptive controllers is studied. An upper performance bound is given in terms of an uncertainty model (control effort bounds require both L2 and L1 uncertainty models), and various structural ...
Uncertainty, Subjectivity, Trust and Risk: How It All Fits Together
Stølen, Ketil
Keywords: uncertainty, subjective, objective, trust, risk, trust management. Covers aleatory vs. epistemic uncertainty, where the latter can be reduced by narrowing the interval and thereby making a more precise prediction, and objective vs. subjective ...
Bayesian-network Confirmation of Software Testing Uncertainties
Ziv, Hadar
Discusses the representation of uncertainty in software testing, proposes that a specific technique, known as Bayesian Belief Networks, be used to model software testing uncertainties, and demonstrates the use of Bayesian ...
Estimating uncertainty of streamflow simulation using Bayesian neural networks
Liang, Faming
Bayesian neural networks (BNNs) are powerful tools for providing reliable hydrologic prediction; methods quantifying the uncertainties related to parameters (neural network's weights) and model structures were applied for uncertainty ...
Robust Nonlinear State Feedback for Power Systems under Structured Uncertainty
Pota, Himanshu Roy
Modern electric power systems are highly nonlinear systems with constantly varying loads. Structured uncertainty is modeled as perturbations to the state model of power systems, and the control law is applied through the excitation system. For a control scheme to be effective, some details of the system ...
A Bayesian approach to simultaneously quantify assignments and linguistic uncertainty
Chavez, Gregory M [Los Alamos National Laboratory; Booker, Jane M [BOOKER SCIENTIFIC FREDERICKSBURG; Ross, Timothy J [UNM
2010-10-07T23:59:59.000Z
Subject matter expert assessments can include both assignment and linguistic uncertainty. This paper examines assessments containing linguistic uncertainty associated with a qualitative description of a specific state of interest and the assignment uncertainty associated with assigning a qualitative value to that state. A Bayesian approach is examined to simultaneously quantify both assignment and linguistic uncertainty in the posterior probability. The approach is applied to a simplified damage assessment model involving both assignment and linguistic uncertainty. The utility of the approach and the conditions under which the approach is feasible are examined and identified.
Measurement uncertainty in surface flatness measurement
H. L. Thang
2011-11-29T23:59:59.000Z
Flatness of a plate is a parameter that has long been under consideration. Factors influencing the accuracy of this parameter have been recognized and examined carefully, but the results are scattered across the literature. Besides, those reports have not always been in harmony with the Guide to the Expression of Uncertainty in Measurement (GUM). Furthermore, mathematical equations clearly describing the flatness measurement have not appeared in those reports either. We have collected those influencing factors for systematic reference, re-written the equation describing the profile measurement of the plate topography, and proposed an equation for flatness determination. An illustrative numerical example is also shown.
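A common way to make the flatness computation concrete is peak-to-valley deviation from a least-squares reference plane; whether this matches the paper's proposed equation is not established here, the data are illustrative, and a full GUM-style budget would additionally propagate each input's variance.

```python
def fit_plane(points):
    # Least-squares plane z = a + b*x + c*y via the 3x3 normal equations,
    # solved by Gauss-Jordan elimination on the augmented matrix.
    n = len(points)
    sx = sum(p[0] for p in points); sy = sum(p[1] for p in points)
    sz = sum(p[2] for p in points)
    sxx = sum(p[0] ** 2 for p in points); syy = sum(p[1] ** 2 for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    sxz = sum(p[0] * p[2] for p in points); syz = sum(p[1] * p[2] for p in points)
    A = [[n, sx, sy, sz], [sx, sxx, sxy, sxz], [sy, sxy, syy, syz]]
    for i in range(3):
        pivot = A[i][i]
        A[i] = [v / pivot for v in A[i]]
        for j in range(3):
            if j != i:
                A[j] = [vj - A[j][i] * vi for vj, vi in zip(A[j], A[i])]
    return A[0][3], A[1][3], A[2][3]   # a, b, c

# Illustrative probe points (x, y, z) on a plate, heights in mm.
points = [(0, 0, 0.00), (1, 0, 0.02), (0, 1, 0.01), (1, 1, 0.05), (0.5, 0.5, 0.01)]
a, b, c = fit_plane(points)
deviations = [p[2] - (a + b * p[0] + c * p[1]) for p in points]
flatness = max(deviations) - min(deviations)   # peak-to-valley deviation
```

The profile-measurement equation in the paper supplies the z values; the uncertainty of `flatness` then follows from the uncertainties of those inputs.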
Uncertainty quantification for large-scale ocean circulation predictions.
Safta, Cosmin; Debusschere, Bert J.; Najm, Habib N.; Sargsyan, Khachik
2010-09-01T23:59:59.000Z
Uncertainty quantification in climate models is challenged by the sparsity of the available climate data due to the high computational cost of the model runs. Another feature that prevents classical uncertainty analyses from being easily applicable is the bifurcative behavior in the climate data with respect to certain parameters. A typical example is the Meridional Overturning Circulation in the Atlantic Ocean. The maximum overturning stream function exhibits discontinuity across a curve in the space of two uncertain parameters, namely climate sensitivity and CO{sub 2} forcing. We develop a methodology that performs uncertainty quantification in the presence of limited data that have discontinuous character. Our approach is two-fold. First we detect the discontinuity location with a Bayesian inference, thus obtaining a probabilistic representation of the discontinuity curve location in presence of arbitrarily distributed input parameter values. Furthermore, we developed a spectral approach that relies on Polynomial Chaos (PC) expansions on each side of the discontinuity curve, leading to an averaged-PC representation of the forward model that allows efficient uncertainty quantification and propagation. The methodology is tested on synthetic examples of discontinuous data with adjustable sharpness and structure.
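The two-sided surrogate idea can be sketched in one dimension, with linear surrogates standing in for the PC expansions and a largest-jump heuristic standing in for the Bayesian discontinuity detection; the synthetic data and jump location are invented.

```python
def fit_line(pts):
    # Least-squares line through (x, y) points: returns (slope, intercept).
    n = len(pts)
    sx = sum(p[0] for p in pts); sy = sum(p[1] for p in pts)
    sxx = sum(p[0] ** 2 for p in pts); sxy = sum(p[0] * p[1] for p in pts)
    m = (n * sxy - sx * sy) / (n * sxx - sx ** 2)
    return m, (sy - m * sx) / n

# Synthetic model output with a discontinuity: y = 0.5x below the jump,
# y = 3 + 0.1x above it.
data = [(x / 10.0, (0.5 * x / 10.0) if x < 6 else (3.0 + 0.1 * x / 10.0))
        for x in range(12)]

# Crude discontinuity detection: split at the largest jump between
# neighbouring samples (the paper does this probabilistically, via Bayes).
jumps = [abs(data[i + 1][1] - data[i][1]) for i in range(len(data) - 1)]
split = jumps.index(max(jumps)) + 1

# Separate surrogate on each side of the detected discontinuity.
left_model = fit_line(data[:split])
right_model = fit_line(data[split:])
```

A single global polynomial fit through `data` would oscillate around the jump; fitting each side separately recovers both branches exactly, which is the point of the averaged-PC construction.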
Uncertainty quantification in reacting flow modeling.
Le Maître, Olivier P. (Université d'Evry Val d'Essonne, Evry, France); Reagan, Matthew T.; Knio, Omar M. (Johns Hopkins University, Baltimore, MD); Ghanem, Roger Georges (Johns Hopkins University, Baltimore, MD); Najm, Habib N.
2003-10-01T23:59:59.000Z
Uncertainty quantification (UQ) in the computational modeling of physical systems is important for scientific investigation, engineering design, and model validation. In this work we develop techniques for UQ based on spectral and pseudo-spectral polynomial chaos (PC) expansions, and we apply these constructions in computations of reacting flow. We develop and compare both intrusive and non-intrusive spectral PC techniques. In the intrusive construction, the deterministic model equations are reformulated using Galerkin projection into a set of equations for the time evolution of the field variable PC expansion mode strengths. The mode strengths relate specific parametric uncertainties to their effects on model outputs. The non-intrusive construction uses sampling of many realizations of the original deterministic model, and projects the resulting statistics onto the PC modes, arriving at the PC expansions of the model outputs. We investigate and discuss the strengths and weaknesses of each approach, and identify their utility under different conditions. We also outline areas where ongoing and future research are needed to address challenges with both approaches.
Construction and Availability Uncertainty in the Regional ... Overview: supply and technology availability, construction costs, economic retirement, and variable capacity for existing units.
Monte Carlo Methods for Uncertainty Quantification Mathematical Institute, University of Oxford
Giles, Mike
PDEs with uncertainty; examples include long-term climate modelling (many sources of uncertainty), repository and oil reservoir modelling (considerable uncertainty about the porosity of rock), and astronomy. Handling uncertainty: uncertainty in modelling parameters, in geometry, and in initial conditions.
Survey and Evaluate Uncertainty Quantification Methodologies
Lin, Guang; Engel, David W.; Eslinger, Paul W.
2012-02-01T23:59:59.000Z
The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that will develop and deploy state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models with uncertainty quantification, optimization, risk analysis and decision making capabilities. The CCSI Toolset will incorporate commercial and open-source software currently in use by industry and will also develop new software tools as necessary to fill technology gaps identified during execution of the project. The CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. The goal of CCSI is to deliver a toolset that can simulate the scale-up of a broad set of new carbon capture technologies from laboratory scale to full commercial scale. To provide a framework around which the toolset can be developed and demonstrated, we will focus on three Industrial Challenge Problems (ICPs) related to carbon capture technologies relevant to U.S. pulverized coal (PC) power plants. Post combustion capture by solid sorbents is the technology focus of the initial ICP (referred to as ICP A). 
The goal of the uncertainty quantification (UQ) task (Task 6) is to provide a set of capabilities to the user community for the quantification of uncertainties associated with the carbon capture processes. As such, we will develop, as needed and beyond existing capabilities, a suite of robust and efficient computational tools for UQ to be integrated into a CCSI UQ software framework.
Incorporating uncertainty in RADTRAN 6.0 input files.
Dennis, Matthew L.; Weiner, Ruth F.; Heames, Terence John (Alion Science and Technology)
2010-02-01T23:59:59.000Z
Uncertainty may be introduced into RADTRAN analyses by distributing input parameters. The MELCOR Uncertainty Engine (Gauntt and Erickson, 2004) has been adapted for use in RADTRAN to determine the parameter shape and minimum and maximum of the distribution, to sample on the distribution, and to create an appropriate RADTRAN batch file. Coupling input parameters is not possible in this initial application. It is recommended that the analyst be very familiar with RADTRAN and able to edit or create a RADTRAN input file using a text editor before implementing the RADTRAN Uncertainty Analysis Module. Installation of the MELCOR Uncertainty Engine is required for incorporation of uncertainty into RADTRAN. Gauntt and Erickson (2004) provides installation instructions as well as a description and user guide for the uncertainty engine.
Adaptively Addressing Uncertainty in Estuarine and Near Coastal Restoration Projects
Thom, Ronald M.; Williams, Greg D.; Borde, Amy B.; Southard, John A.; Sargeant, Susan L.; Woodruff, Dana L.; Laufle, Jeffrey C.; Glasoe, Stuart
2005-03-01T23:59:59.000Z
Restoration projects have an uncertain outcome because of a lack of information about current site conditions, historical disturbance levels, effects of landscape alterations on site development, unpredictable trajectories or patterns of ecosystem structural development, and many other factors. A poor understanding of the factors that control the development and dynamics of a system, such as hydrology, salinity, and wave energy, can also lead to an unintended outcome. Finally, lack of experience in restoring certain types of systems (e.g., rare or very fragile habitats) or systems in highly modified situations (e.g., highly urbanized estuaries) makes project outcomes uncertain. Because of these uncertainties, project costs can rise dramatically in an attempt to come closer to project goals. All of the potential sources of error can be addressed to a certain degree through adaptive management. The first step is acknowledging that these uncertainties exist, and addressing as many of them as possible through planning and directed research prior to implementing the project. The second step is to evaluate uncertainties through hypothesis-driven experiments during project implementation. The third step is to use the monitoring program to evaluate and adjust the project as needed to improve the probability that the project reaches its goal. The fourth and final step is to use the information gained in the project to improve future projects. A framework that includes a clear goal statement, a conceptual model, and an evaluation framework can help in this adaptive restoration process. Projects and programs vary in their application of adaptive management in restoration, and it is very difficult to be highly prescriptive in applying adaptive management to projects that necessarily vary widely in scope, goal, ecosystem characteristics, and uncertainties.
Very large ecosystem restoration programs in the Mississippi River delta (Coastal Wetlands Planning, Protection, and Restoration Act; CWPPRA) have incorporated very specific and detailed elements in a more active adaptive management effort. In Puget Sound, the Puget Sound Action Team uses site-specific case studies, monitoring, and public involvement to direct actions to reduce microbial contamination of harvestable shellfish. Small-scale projects can also be improved through application of adaptive management. For example, directed research and site assessments resulted in successful restoration of seagrasses near a ferry terminal in Puget Sound. It is recommended that all restoration programs be conducted in an adaptive management framework, and where appropriate, a more active adaptive management approach be applied. The net effect should be less uncertainty, improved project success, advancement of the science of restoration, and cost savings.
Quantum Limits of Measurements and Uncertainty Principle
Masanao Ozawa
2015-05-19T23:59:59.000Z
In this paper, we show how the Robertson uncertainty relation gives certain intrinsic quantum limits of measurements in the most general and rigorous mathematical treatment. A general lower bound for the product of the root-mean-square measurement errors arising in joint measurements of noncommuting observables is established. We give a rigorous condition for the standard quantum limit (SQL) for repeated measurements to hold, and prove that if a measuring instrument has no larger root-mean-square preparational error than its root-mean-square measurement errors then it obeys the SQL. As shown previously, we can even construct many linear models of position measurement which circumvent this condition for the SQL.
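For reference, the Robertson relation underlying these bounds can be written as

```latex
\sigma_A \,\sigma_B \;\ge\; \frac{1}{2}\left|\langle [\hat{A},\hat{B}] \rangle\right|,
\qquad \text{e.g.}\quad \sigma_x \,\sigma_p \;\ge\; \frac{\hbar}{2},
```

where $\sigma_A$, $\sigma_B$ are standard deviations of the observables in the given state; the paper's contribution is to relate such state-preparation bounds to measurement-error quantities.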
UNCERTAINTY EVALUATION OF AVAILABLE ENERGY AND POWER
Jon P. Christophersen; John L. Morrison
2006-05-01T23:59:59.000Z
The Idaho National Laboratory does extensive testing and evaluation of advanced technology batteries and ultracapacitors for applications in electric and hybrid vehicles. The testing is essentially acquiring time records of voltage, current and temperature from a variety of charge and discharge time profiles. From these three basic measured parameters, a complex assortment of derived parameters (resistance, power, etc.) is computed. Derived parameters are in many cases functions of multiple layers of other derived parameters that eventually work back to the three basic measured parameters. The purpose of this paper is to document the methodology used for the uncertainty analysis of the most complicated derived parameters broadly grouped as available energy and available power. This work is an analytical derivation. Future work will report the implementation of algorithms based upon this effort.
Intrinsic Uncertainties in Modeling Complex Systems.
Cooper, Curtis S.; Bramson, Aaron L.; Ames, Arlo L.
2014-09-01T23:59:59.000Z
Models are built to understand and predict the behaviors of both natural and artificial systems. Because it is always necessary to abstract away aspects of any non-trivial system being modeled, we know models can potentially leave out important, even critical elements. This reality of the modeling enterprise forces us to consider the prospective impacts of those effects completely left out of a model, either intentionally or unconsidered. Insensitivity to new structure is an indication of diminishing returns. In this work, we represent a hypothetical unknown effect on a validated model as a finite perturbation whose amplitude is constrained within a control region. We find robustly that without further constraints, no meaningful bounds can be placed on the amplitude of a perturbation outside of the control region. Thus, forecasting into unsampled regions is a very risky proposition. We also present inherent difficulties with proper time discretization of models and with representing inherently discrete quantities. We point out potentially worrisome uncertainties, arising from mathematical formulation alone, which modelers can inadvertently introduce into models of complex systems. Acknowledgements: This work has been funded under early-career LDRD project #170979, entitled "Quantifying Confidence in Complex Systems Models Having Structural Uncertainties", which ran from 04/2013 to 09/2014. We wish to express our gratitude to the many researchers at Sandia who contributed ideas to this work, as well as feedback on the manuscript. In particular, we would like to mention George Barr, Alexander Outkin, Walt Beyeler, Eric Vugrin, and Laura Swiler for providing invaluable advice and guidance through the course of the project. We would also like to thank Steven Kleban, Amanda Gonzales, Trevor Manzanares, and Sarah Burwell for their assistance in managing project tasks and resources.
Which Models Matter: Uncertainty and Sensitivity Analysis for Photovoltaic Power Systems
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Clifford W. Hansen and Andrew Pohl, Sandia National Laboratories, Albuquerque, NM 87185-1033, USA...
Optimization Online - The impact of wind uncertainty on the strategic ...
Pedro Crespo Del Granado
2015-01-14T23:59:59.000Z
Jan 14, 2015 ... Abstract: The intermittent nature of wind energy generation has introduced a new degree of uncertainty to the tactical planning of energy ...
Characterizing Uncertainty for Regional Climate Change Mitigation and Adaptation Decisions
Unwin, Stephen D.; Moss, Richard H.; Rice, Jennie S.; Scott, Michael J.
2011-09-30T23:59:59.000Z
This white paper describes the results of new research to develop an uncertainty characterization process to help address the challenges of regional climate change mitigation and adaptation decisions.
The importance of covariance in nuclear data uncertainty propagation studies
Benstead, J. [AWE Plc, Aldermaston, Berkshire (United Kingdom)
2012-07-01T23:59:59.000Z
A study has been undertaken to investigate what proportion of the uncertainty propagated through plutonium critical assembly calculations is due to the covariances between the fission cross sections in different neutron energy groups. The calculated uncertainties on k{sub eff} show that covariances between the cross sections in different neutron energy groups account for approximately 27-37% of the propagated uncertainty due to the plutonium fission cross section. This study also confirmed the validity of employing the sandwich equation, with associated sensitivity and covariance data, instead of a Monte Carlo sampling approach, for calculating uncertainties in linearly varying systems. (authors)
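The "sandwich equation" referred to above is the first-order propagation var(k_eff) ≈ SᵀCS, where S holds the group-wise sensitivities and C their covariance matrix. A minimal sketch with hypothetical 3-group numbers (not evaluated nuclear data) shows how zeroing the off-diagonal covariances isolates their contribution:

```python
import numpy as np

# Illustrative 3-group sensitivities of k_eff to the fission cross section
S = np.array([0.30, 0.45, 0.25])
# Illustrative relative covariance matrix between energy groups (hypothetical)
C = np.array([[4.0, 1.5, 0.5],
              [1.5, 3.0, 1.0],
              [0.5, 1.0, 2.5]]) * 1e-6

var_full = S @ C @ S                        # sandwich rule with covariances
var_diag = S @ np.diag(np.diag(C)) @ S      # same rule, covariances zeroed
share = 1.0 - var_diag / var_full           # fraction due to inter-group terms
```

With these made-up numbers the off-diagonal terms contribute roughly 39% of the propagated variance, comparable in spirit to the 27-37% reported in the study.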
Special Issue on Integrated Uncertainty Management for Decision Making
Applications include ranking and recommendation systems, supply chain management, and decision making under soft constraints. Topics include but are not limited to methodology: uncertainty formalisms (Bayesian ...
Model simplification of chemical kinetic systems under uncertainty
Coles, Thomas Michael Kyte
2011-01-01T23:59:59.000Z
This thesis investigates the impact of uncertainty on the reduction and simplification of chemical kinetics mechanisms. Chemical kinetics simulations of complex fuels are very computationally expensive, especially when ...
Worst-case-expectation approach to optimization under uncertainty
Alexander Shapiro
2012-10-30T23:59:59.000Z
Oct 30, 2012 ... Worst-case-expectation approach to optimization under uncertainty ... approximation, risk neutral and risk averse approaches, case studies.
Estimation of uncertainty for contour method residual stress measurements
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Olson, Mitchell D.; DeWald, Adrian T.; Prime, Michael B.; Hill, Michael R.
2015-03-01T23:59:59.000Z
This paper describes a methodology for the estimation of measurement uncertainty for the contour method, where the contour method is an experimental technique for measuring a two-dimensional map of residual stress over a plane. Random error sources, including the error arising from noise in displacement measurements and the smoothing of the displacement surfaces, are accounted for in the uncertainty analysis. The output is a two-dimensional, spatially varying uncertainty estimate such that every point on the cross-section where residual stress is determined has a corresponding uncertainty value. Both numerical and physical experiments are reported, which are used to support the usefulness of the proposed uncertainty estimator. The uncertainty estimator shows the contour method to have larger uncertainty near the perimeter of the measurement plane. For the experiments, which were performed on a quenched aluminum bar with a cross section of 51 × 76 mm, the estimated uncertainty was approximately 5 MPa (σ/E ≈ 7 · 10⁻⁵) over the majority of the cross-section, with localized areas of higher uncertainty, up to 10 MPa (σ/E ≈ 14 · 10⁻⁵).
Measurement uncertainty analysis techniques applied to PV performance measurements
Wells, C.
1992-10-01T23:59:59.000Z
The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. We define valid data as data having known and documented paths of: origin, including theory; measurements; traceability to measurement standards; computations; and uncertainty analysis of results.
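One common PTC 19.1-style way to form such an interval, sketched here with hypothetical PV numbers, is to root-sum-square the systematic (bias) limit with the random component of the mean; the function name and values below are illustrative assumptions, not the presentation's own worked example.

```python
import math

def uncertainty_rss(B, S, n, t95=2.0):
    """Sketch of an RSS uncertainty interval for the mean of n readings:
    U = sqrt(B^2 + (t95 * S/sqrt(n))^2), with B the bias limit, S the
    sample standard deviation, and t95 ~ 2 for large n (assumption)."""
    s_mean = S / math.sqrt(n)          # standard deviation of the mean
    return math.sqrt(B**2 + (t95 * s_mean) ** 2)

# Hypothetical PV power reading: 1.0 W bias limit, 2.5 W sample std dev, n = 25
U = uncertainty_rss(B=1.0, S=2.5, n=25)
```

Here the random term t95·S/√n equals 1.0 W, so the combined interval is √2 ≈ 1.41 W, illustrating why averaging more readings shrinks only the random part, not the bias.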
Asymptotic and uncertainty analyses of a phase field model for...
Office of Scientific and Technical Information (OSTI)
analysis and uncertainty quantification of a phase field model for void formation and evolution in materials subject to irradiation. The parameters of the phase field model are...
Quantifying Uncertainty in Computer Predictions | netl.doe.gov
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
initiated work on the verification, validation and uncertainty quantification of multiphase computational fluid dynamics (CFD) models that underpin the simulation of several...
DRAFT REPORT HIERARCHY OF METHODS TO CHARACTERIZE UNCERTAINTY
Frey, H. Christopher
Contents include: 1.2.3 Risk, Cost, Uncertainty and Decisions; 1.4.3 Other Quantitative Methods; 1.4.5 Sensitivity Analysis
Decision making under epistemic uncertainty : an application to seismic design
Agarwal, Anna
2008-01-01T23:59:59.000Z
The problem of accounting for epistemic uncertainty in risk management decisions is conceptually straightforward, but is riddled with practical difficulties. Simple approximations are often used whereby future variations ...
Uncertainty quantification in fission cross section measurements at LANSCE
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Tovesson, F.
2015-01-01T23:59:59.000Z
Neutron-induced fission cross sections have been measured for several isotopes of uranium and plutonium at the Los Alamos Neutron Science Center (LANSCE) over a wide range of incident neutron energies. The total uncertainties in these measurements are in the range 3-5% above 100 keV of incident neutron energy, resulting from uncertainties in the target, neutron source, and detector system. The individual sources of uncertainty are assumed to be uncorrelated; however, correlations in the cross section across neutron energy bins are considered. The quantification of the uncertainty contributions is described here.
Helton, Jon Craig; Sallaberry, Cedric M.; Hansen, Clifford W.
2010-05-01T23:59:59.000Z
Extensive work has been carried out by the U.S. Department of Energy (DOE) in the development of a proposed geologic repository at Yucca Mountain (YM), Nevada, for the disposal of high-level radioactive waste. As part of this development, an extensive performance assessment (PA) for the YM repository was completed in 2008 [1] and supported a license application by the DOE to the U.S. Nuclear Regulatory Commission (NRC) for the construction of the YM repository [2]. This presentation provides an overview of the conceptual and computational structure of the indicated PA (hereafter referred to as the 2008 YM PA) and the roles that uncertainty analysis and sensitivity analysis play in this structure.
Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States); Lui, C.H. [Nuclear Regulatory Commission, Washington, DC (United States); Goossens, L.H.J.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Paesler-Sauer, J. [Research Center, Karlsruhe (Germany); Helton, J.C. [and others
1995-01-01T23:59:59.000Z
The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The ultimate objective of the joint effort was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. Experts developed their distributions independently. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. To validate the distributions generated for the dispersion code input variables, samples were taken from the distributions and propagated through the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the first of a three-volume document describing the project.
Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States); Lui, C.H. [Nuclear Regulatory Commission, Washington, DC (United States); Goossens, L.H.J.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Paesler-Sauer, J. [Research Center, Karlsruhe (Germany); Helton, J.C. [and others
1995-01-01T23:59:59.000Z
The two new probabilistic accident consequence codes MACCS and COSYMA, completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters; experts developed their distributions independently. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project.
Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harrison, J.D. [National Radiological Protection Board (United Kingdom); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)
1998-04-01T23:59:59.000Z
The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.
Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Boardman, J. [AEA Technology (United Kingdom); Jones, J.A. [National Radiological Protection Board (United Kingdom); Harper, F.T.; Young, M.L. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)
1997-12-01T23:59:59.000Z
The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.
Haskin, F.E. [Univ. of New Mexico, Albuquerque, NM (United States); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)
1997-12-01T23:59:59.000Z
The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on early health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.
Copyright Uncertainty in the Geoscience Community: Part I, What's Free for the Taking?
Clement, Gail
2012-01-01T23:59:59.000Z
Pre-print submitted for publication: Clement, Gail, 2012. "Copyright Uncertainty in the Geoscience Community: Part I, What's Free for the Taking?" Proceedings - Geoscience Information Society 42, forthcoming... and Scholarly Communication, Texas A&M University Libraries, Mailstop 5000, College Station, TX 77843, gclement@tamu.edu. Introduction: Why does copyright uncertainty exist in the geoscience information community? Geoscience information is highly...
Examining Uncertainty in Demand Response Baseline Models and
LBNL-5096E. Examining Uncertainty in Demand Response Baseline Models and Variability in Automated ... (i.e. dynamic prices). Using a regression-based baseline model, we define several Demand Response (DR ...
Climate Change Uncertainty and Skepticism: A Cross-Country Analysis
Hall, Sharon J.
Skepticism and uncertainty are related but different aspects of climate change perceptions. In the literature, skepticism often relates to whether people believe climate change is happening.
Using Storage To Control Uncertainty in Power Systems
Glyn Eggar, Department of Actuarial ... Motivation: Can we use storage to control uncertainty in the electricity network? What sort of storage are we even talking about here? How much ...
Uncertainties and assessments of chemistry-climate models of the stratosphere
Wirosoetisno, Djoko
Published in Atmospheric Chemistry and Physics, 3(1), pp. 1-27, 2003. www.atmos-chem-phys.org/acp/3/1/
Sensitivity, Approximation and Uncertainty in Power System Dynamic Simulation
Ian A. Hiskens, Fellow, IEEE; Jassim Alseddiqui, Student Member, IEEE. Abstract: Parameters of power system models ... the influence of uncertainty in simulations of power system dynamic behaviour. It is shown that trajectory ...
Chronological information and uncertainty Radiocarbon dating & calibration -Paula Reimer
Sengun, Mehmet Haluk
Uncertainty: · Estimated ±1-2 (±8-32 years) · Measured off-line ±0.2 (±2), OK in gas counting systems · Lab error multiplier · 14C calibration uncertainties (animation: M. Blaauw) · Constant initial 14C ... interpolation: too much weight on individual dates? · Which age-depth model? 800 cm core, 9 14C dates
Applying uncertainty quantification to multiphase flow computational fluid dynamics
Gel, A.; Garg, R.; Tong, C.; Shahnam, M.; Guenther, C.
2013-07-01T23:59:59.000Z
Multiphase computational fluid dynamics plays a major role in design and optimization of fossil fuel based reactors. There is a growing interest in accounting for the influence of uncertainties associated with physical systems to increase the reliability of computational simulation based engineering analysis. The U.S. Department of Energy's National Energy Technology Laboratory (NETL) has recently undertaken an initiative to characterize uncertainties associated with computer simulation of reacting multiphase flows encountered in energy producing systems such as a coal gasifier. The current work presents the preliminary results in applying non-intrusive parametric uncertainty quantification and propagation techniques with NETL's open-source multiphase computational fluid dynamics software MFIX. For this purpose an open-source uncertainty quantification toolkit, PSUADE developed at the Lawrence Livermore National Laboratory (LLNL) has been interfaced with MFIX software. In this study, the sources of uncertainty associated with numerical approximation and model form have been neglected, and only the model input parametric uncertainty with forward propagation has been investigated by constructing a surrogate model based on data-fitted response surface for a multiphase flow demonstration problem. Monte Carlo simulation was employed for forward propagation of the aleatory type input uncertainties. Several insights gained based on the outcome of these simulations are presented such as how inadequate characterization of uncertainties can affect the reliability of the prediction results. Also a global sensitivity study using Sobol' indices was performed to better understand the contribution of input parameters to the variability observed in response variable.
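The non-intrusive forward-propagation workflow described above (run the expensive simulator at a few design points, fit a data-fitted response surface, then Monte Carlo sample the cheap surrogate) can be sketched as follows; the quadratic model and every name below are illustrative stand-ins, not MFIX or PSUADE code:

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in for an expensive simulation run (illustrative only).
def expensive_model(x1, x2):
    return 1.0 + 0.8 * x1 - 0.5 * x2 + 0.3 * x1 * x2

# A small set of design runs (a UQ toolkit would choose these, e.g. by LHS).
X = rng.random((30, 2))
y = expensive_model(X[:, 0], X[:, 1])

def features(X):
    """Quadratic response-surface basis."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2,
                            x1 * x2, x1**2, x2**2])

# Fit the surrogate by least squares, then sample it cheaply.
beta, *_ = np.linalg.lstsq(features(X), y, rcond=None)
Xs = rng.random((100_000, 2))        # aleatory input samples
ys = features(Xs) @ beta             # forward-propagated responses
print(ys.mean(), ys.std())
```

The 100,000 surrogate evaluations cost almost nothing, which is exactly why the study builds a response surface before running Monte Carlo propagation.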
Closure for Production Planning under Power Uncertainty Project
Grossmann, Ignacio E.
... the power is recovered; production occurs at reduced rate. The new uncertainty set requires more binary ... Lehigh University: Pietro Belotti, Çağrı Latifoğlu, Fay Li, Larry Snyder. Air Products and Chemicals, Inc.: Jim Hutton, Peter Connard. September ...
Hyperparameter estimation for uncertainty quantification in mesoscale carbon dioxide inversions
Paris-Sud XI, Université de
... Generalized cross-validation (GCV) and the χ² test are compared for the first time under a realistic setting in a mesoscale CO2 inversion. Keywords: hyperparameter estimation, uncertainty quantification, mesoscale carbon dioxide inversions. 1. Introduction: The atmosphere ...
Impact of PV forecasts uncertainty in batteries management in microgrids
Paris-Sud XI, Université de
Andrea Michiorri, Arthur ... Abstract: This paper is motivated by the question of the impact that uncertainty in PV forecasts has in ... forecast models. This situation can be expected to be frequent with new PV installations. A probabilistic PV ...
Quantification of Variability and Uncertainty in Emission ...
Frey, H. Christopher
... in feedstock composition, design, maintenance, and operation. Uncertainty refers to lack of knowledge regarding ... and intra-plant variability in operation and maintenance. Uncertainty typically arises due to ... (statistical random sampling error), and non-representativeness (which can introduce additional errors) ...
A REACTIVE APPROACH FOR MINING PROJECT EVALUATION UNDER PRICE UNCERTAINTY
Duffy, Ken
Meimei Zhang. ... This method often undervalues a mining project since it ignores future price uncertainty and does not allow ... on metal price. This paper also demonstrates that the "reactive" approach can estimate the mine project ...
A Genetic Programming Methodology for Strategy Optimization Under Uncertainty
Fernandez, Thomas
the Missile Countermeasures Optimization (MCO) problem as an instance of a strategy optimization problem; describes various types and degrees of uncertainty that may be introduced into the MCO problem; and develops a new methodology for solving the MCO problem under conditions of uncertainty. The new methodology
Production Optimization; System Identification and Uncertainty Estimation
Johansen, Tor Arne
Steinar M. Elgsaeter, Olav Slupphaug, Tor Arne Johansen. Abstract: Real-time optimization of oil and gas production requires a production model, which must be fitted to data for accuracy. A certain amount of uncertainty must typically ...
Representing Thermal Vibrations and Uncertainty in Molecular Surfaces
Varshney, Amitabh
Chang Ha Lee and Amitabh Varshney. ... The position of an atom in a molecule is fuzzy because of uncertainty in protein structure determination and because of its thermal energy. Therefore, the smooth molecular surface will also vibrate. Also in protein ...
Holographic position uncertainty and the quantum-classical transition
C. L. Herzenberg
2010-04-16T23:59:59.000Z
Arguments based on general principles of quantum mechanics have suggested that a minimum length associated with Planck-scale unification may, in the context of the holographic principle, entail a new kind of observable uncertainty in the transverse position of macroscopically separated objects. Here, we address potential implications of such a position uncertainty for establishing an additional threshold between quantum and classical behavior.
Appendix P: Risk and Uncertainty (September 2006)
This appendix deals with the representation ... In the section on "Uncertainties," beginning on page P-18, it describes in detail how the regional portfolio ... One of these files is a compressed file, L24X-DW02-P.zip, containing the workbook files that Appendix L uses.
Uncertainty and Predictability in Geophysics: Chaos and Multifractal Insights
Lovejoy, Shaun
Daniel Schertzer ... McGill University, Montreal, Canada. Uncertainty and error growth are crosscutting geophysical ... extremes. The focus is now on time-space geophysical scaling behavior: their multifractality. It is found ...
Energy Uncertainties: Supply Chain Impacts in the Upper Midwest. A Summary Report
Minnesota, University of
... Club of Minneapolis and St. Paul. Fluctuating energy costs and economic uncertainties worldwide are disrupting Upper Midwest supply chains. While facing these challenges, participants throughout a supply chain need to ...
Offshore Oilfield Development Planning under Uncertainty and Fiscal Considerations
Grossmann, Ignacio E.
Vijay Gupta ... of uncertainty and complex fiscal rules in the development planning of offshore oil and gas fields, which involve ... Planning under perfect information; include fiscal calculations within the basic model; ...
Measurement uncertainty of adsorption testing of desiccant materials
Bingham, C E; Pesaran, A A
1988-12-01T23:59:59.000Z
The technique of measurement uncertainty analysis as described in the current ANSI/ASME standard is applied to the testing of desiccant materials in SERI's Sorption Test Facility. This paper estimates the elemental precision and systematic errors in these tests and propagates them separately to obtain the resulting uncertainty of the test parameters, including relative humidity (±0.03) and sorption capacity (±0.002 g/g). Errors generated by instrument calibration, data acquisition, and data reduction are considered. Measurement parameters that would improve the uncertainty of the results are identified. Using the uncertainty in the moisture capacity of a desiccant, the design engineer can estimate the uncertainty in performance of a dehumidifier for desiccant cooling systems with confidence. 6 refs., 2 figs., 8 tabs.
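The separate propagation of precision (random) and systematic (bias) errors described in this abstract follows the ANSI/ASME measurement-uncertainty pattern: root-sum-square the elemental errors of each type, then combine the two totals. A minimal sketch, with illustrative numbers rather than the report's actual error budget:

```python
import math

def combine_elemental(errors):
    """Root-sum-square combination of elemental error contributions."""
    return math.sqrt(sum(e * e for e in errors))

# Hypothetical elemental errors for a relative-humidity measurement
# (illustrative values, not taken from the SERI test report).
bias = combine_elemental([0.01, 0.02])        # systematic (bias) limits
precision = combine_elemental([0.005, 0.01])  # precision (random) indices

# ASME/ANSI-style RSS uncertainty model, t ~= 2 for ~95% coverage
t95 = 2.0
U_rss = math.sqrt(bias**2 + (t95 * precision)**2)
print(round(U_rss, 4))
```

Propagating the two error types separately, as the paper does, lets the analyst see whether calibration (bias) or repeatability (precision) dominates before combining them.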
Avoiding climate change uncertainties in Strategic Environmental Assessment
Larsen, Sanne Vammen (sannevl@plan.aau.dk) [The Danish Centre for Environmental Assessment, Aalborg University-Copenhagen, A.C. Meyers Vænge 15, 2450 København SV (Denmark)]; Kørnøv, Lone (lonek@plan.aau.dk) [The Danish Centre for Environmental Assessment, Aalborg University, Skibbrogade 5, 1. Sal, 9000 Aalborg (Denmark)]; Driscoll, Patrick (patrick@plan.aau.dk) [The Danish Centre for Environmental Assessment, Aalborg University-Copenhagen, A.C. Meyers Vænge 15, 2450 København SV (Denmark)]
2013-11-15T23:59:59.000Z
This article is concerned with how Strategic Environmental Assessment (SEA) practice handles climate change uncertainties within the Danish planning system. First, a hypothetical model is set up for how uncertainty is handled and not handled in decision-making. The model incorporates the strategies ‘reduction’ and ‘resilience’, ‘denying’, ‘ignoring’ and ‘postponing’. Second, 151 Danish SEAs are analysed with a focus on the extent to which climate change uncertainties are acknowledged and presented, and the empirical findings are discussed in relation to the model. The findings indicate that despite incentives to do so, climate change uncertainties were systematically avoided or downplayed in all but 5 of the 151 SEAs that were reviewed. Finally, two possible explanatory mechanisms are proposed to explain this: conflict avoidance and a need to quantify uncertainty.
Evaluating uncertainty in stochastic simulation models
McKay, M.D.
1998-02-01T23:59:59.000Z
This paper discusses fundamental concepts of uncertainty analysis relevant to both stochastic simulation models and deterministic models. A stochastic simulation model, called a simulation model, is a stochastic mathematical model that incorporates random numbers in the calculation of the model prediction. Queuing models are familiar simulation models in which random numbers are used for sampling interarrival and service times. Another example of simulation models is found in probabilistic risk assessments where atmospheric dispersion submodels are used to calculate movement of material. For these models, randomness comes not from the sampling of times but from the sampling of weather conditions, which are described by a frequency distribution of atmospheric variables like wind speed and direction as a function of height above ground. A common characteristic of simulation models is that single predictions, based on one interarrival time or one weather condition, for example, are not nearly as informative as the probability distribution of possible predictions induced by sampling the simulation variables like time and weather condition. The language of model analysis is often general and vague, with terms having mostly intuitive meaning. The definition and motivations for some of the commonly used terms and phrases offered in this paper lead to an analysis procedure based on prediction variance. In the following mathematical abstraction the authors present a setting for model analysis, relate practical objectives to mathematical terms, and show how two reasonable premises lead to a viable analysis strategy.
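The paper's central point, that the distribution of possible predictions is far more informative than any single prediction, can be illustrated with a queuing simulation, one of the paper's own example classes. A sketch with arbitrary parameter values (not from the report):

```python
import random
import statistics

def single_run(n_customers, lam, mu, rng):
    """One realization of a simple M/M/1-style queue via the Lindley recursion."""
    wait, total = 0.0, 0.0
    for _ in range(n_customers):
        inter = rng.expovariate(lam)     # sampled interarrival time
        service = rng.expovariate(mu)    # sampled service time
        wait = max(0.0, wait + service - inter)  # wait carried to next arrival
        total += wait
    return total / n_customers           # one model prediction: mean wait

rng = random.Random(1)
# Many replications: the spread of predictions induced by sampling the
# simulation variables is the informative output, not any single run.
preds = [single_run(500, lam=0.8, mu=1.0, rng=rng) for _ in range(200)]
print(statistics.mean(preds), statistics.variance(preds))
```

The replication variance printed here is exactly the "prediction variance" the paper builds its analysis procedure around.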
Uncertainty quantification of limit-cycle oscillations
Beran, Philip S. [Multidisciplinary Technologies Center, Air Vehicles Directorate, AFRL/VASD, Building 146, 2210 Eighth Street, WPAFB, OH 45433 (United States)]. E-mail: philip.beran@wpafb.af.mil; Pettit, Chris L. [United States Naval Academy, 590 Holloway Rd., MS 11-B, Annapolis, MD 21402 (United States)]. E-mail: pettitcl@usna.edu; Millman, Daniel R. [USAF TPS/EDT, 220 South Wolfe Ave, Bldg. 1220, Rm. 131, Edwards AFB, CA 93524-6485 (United States)]. E-mail: daniel.millman@edwards.af.mil
2006-09-01T23:59:59.000Z
Different computational methodologies have been developed to quantify the uncertain response of a relatively simple aeroelastic system in limit-cycle oscillation, subject to parametric variability. The aeroelastic system is that of a rigid airfoil, supported by pitch and plunge structural coupling, with nonlinearities in the component in pitch. The nonlinearities are adjusted to permit the formation of either a subcritical or a supercritical branch of limit-cycle oscillations. Uncertainties are specified in the cubic coefficient of the torsional spring and in the initial pitch angle of the airfoil. Stochastic projections of the time-domain and cyclic equations governing system response are carried out, leading to both intrusive and non-intrusive computational formulations. Non-intrusive formulations are examined using stochastic projections derived from Wiener expansions involving Haar wavelet and B-spline bases, while Wiener-Hermite expansions of the cyclic equations are employed intrusively and non-intrusively. Application of the B-spline stochastic projection is extended to the treatment of aerodynamic nonlinearities, as modeled through the discrete Euler equations. The methodologies are compared in terms of computational cost, convergence properties, ease of implementation, and potential for application to complex aeroelastic systems.
Uncertainty analysis of steady state incident heat flux measurements in hydrocarbon fuel fires.
Nakos, James Thomas
2005-12-01T23:59:59.000Z
The objective of this report is to develop uncertainty estimates for three heat flux measurement techniques used for the measurement of incident heat flux in a combined radiative and convective environment. This is related to the measurement of heat flux to objects placed inside hydrocarbon fuel (diesel, JP-8 jet fuel) fires, which is very difficult to make accurately (e.g., to less than 10%). Three methods are discussed: a Schmidt-Boelter heat flux gage; a calorimeter and inverse heat conduction method; and a thin plate and energy balance method. Steady state uncertainties were estimated for two types of fires (calm wind and high winds) at three times (early in the fire, late in the fire, and at an intermediate time). Results showed a large uncertainty for all three methods. Typical uncertainties for a Schmidt-Boelter gage ranged from ±23% for high wind fires to ±39% for low wind fires. For the calorimeter/inverse method the uncertainties were ±25% to ±40%. For the thin plate/energy balance method the uncertainties ranged from ±21% to ±42%. The 23-39% uncertainties for the Schmidt-Boelter gage are much larger than the quoted uncertainty for a radiative-only environment (±3%). This large difference is due to the convective contribution and because the gage sensitivities to radiative and convective environments are not equal. All these values are larger than desired, which suggests the need for improvements in heat flux measurements in fires.
A method to estimate the effect of deformable image registration uncertainties on daily dose mapping
Murphy, Martin J.; Salguero, Francisco J.; Siebers, Jeffrey V.; Staub, David; Vaman, Constantin [Department of Radiation Oncology, Virginia Commonwealth University, Richmond Virginia 23298 (United States)
2012-02-15T23:59:59.000Z
Purpose: To develop a statistical sampling procedure for spatially-correlated uncertainties in deformable image registration and then use it to demonstrate their effect on daily dose mapping. Methods: Sequential daily CT studies are acquired to map anatomical variations prior to fractionated external beam radiotherapy. The CTs are deformably registered to the planning CT to obtain displacement vector fields (DVFs). The DVFs are used to accumulate the dose delivered each day onto the planning CT. Each DVF has spatially-correlated uncertainties associated with it. Principal components analysis (PCA) is applied to measured DVF error maps to produce decorrelated principal component modes of the errors. The modes are sampled independently and reconstructed to produce synthetic registration error maps. The synthetic error maps are convolved with dose mapped via deformable registration to model the resulting uncertainty in the dose mapping. The results are compared to the dose mapping uncertainty that would result from uncorrelated DVF errors that vary randomly from voxel to voxel. Results: The error sampling method is shown to produce synthetic DVF error maps that are statistically indistinguishable from the observed error maps. Spatially-correlated DVF uncertainties modeled by our procedure produce patterns of dose mapping error that are different from that due to randomly distributed uncertainties. Conclusions: Deformable image registration uncertainties have complex spatial distributions. The authors have developed and tested a method to decorrelate the spatial uncertainties and make statistical samples of highly correlated error maps. The sample error maps can be used to investigate the effect of DVF uncertainties on daily dose mapping via deformable image registration. An initial demonstration of this methodology shows that dose mapping uncertainties can be sensitive to spatial patterns in the DVF uncertainties.
Measurement uncertainties for vacuum standards at Korea Research Institute of Standards and Science
Hong, S. S.; Shin, Y. H.; Chung, K. H. [Vacuum Center, Korea Research Institute of Standards and Science, Daejeon 305-600 (Korea, Republic of)
2006-09-15T23:59:59.000Z
The Korea Research Institute of Standards and Science has three major vacuum systems: an ultrasonic interferometer manometer (UIM) (Sec. II, Figs. 1 and 2) for low vacuum, a static expansion system (SES) (Sec. III, Figs. 3 and 4) for medium vacuum, and an orifice-type dynamic expansion system (DES) (Sec. IV, Figs. 5 and 6) for high and ultrahigh vacuum. For each system, explicit measurement model equations with multiple variables are given. According to ISO standards, all of these system variable errors were used to calculate the expanded uncertainty (U). For each system the expanded uncertainties (k=1, confidence level=95%) and relative expanded uncertainties (expanded uncertainty/generated pressure) are summarized in Table IV and are estimated as follows. For the UIM, at 2.5-300 Pa generated pressure, the expanded uncertainty is <4.17×10^-2 Pa and the relative expanded uncertainty is <1.18×10^-2; at 1-100 kPa generated pressure, the expanded uncertainty is <7.87 Pa and the relative expanded uncertainty is <7.84×10^-5. For the SES, at 3-100 Pa generated pressure, the expanded uncertainty is <1.77×10^-1 Pa and the relative expanded uncertainty is <1.81×10^-3. For the DES, at 4.6×10^-3-1.3×10^-2 Pa generated pressure, the expanded uncertainty is <1.04×10^-4 Pa and the relative expanded uncertainty is <8.37×10^-3; at 3.0×10^-6-9.0×10^-4 Pa generated pressure, the expanded uncertainty is <8.21×10^-6 Pa and the relative expanded uncertainty is <1.37×10^-2. Within uncertainty limits, our bilateral and key comparisons [CCM.P-K4 (10 Pa-1 kPa)] are extensive and in good agreement with those of other nations (Fig. 8 and Table V).
Bixler, Nathan E.; Osborn, Douglas M.; Sallaberry, Cedric Jean-Marie; Eckert-Gallup, Aubrey Celia; Mattie, Patrick D.; Ghosh, S. Tina
2014-02-01T23:59:59.000Z
This paper describes the convergence of MELCOR Accident Consequence Code System, Version 2 (MACCS2) probabilistic results of offsite consequences for the uncertainty analysis of the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout scenario at the Peach Bottom Atomic Power Station. The consequence metrics evaluated are individual latent-cancer fatality (LCF) risk and individual early fatality risk. Consequence results are presented as conditional risk (i.e., assuming the accident occurs, risk per event) to individuals of the public as a result of the accident. In order to verify convergence for this uncertainty analysis, as recommended by the Nuclear Regulatory Commission’s Advisory Committee on Reactor Safeguards, a ‘high’ source term from the original population of Monte Carlo runs has been selected to be used for: (1) a study of the distribution of consequence results stemming solely from epistemic uncertainty in the MACCS2 parameters (i.e., separating the effect from the source term uncertainty), and (2) a comparison between Simple Random Sampling (SRS) and Latin Hypercube Sampling (LHS) in order to validate the original results obtained with LHS. Three replicates (each using a different random seed) of size 1,000 each using LHS and another set of three replicates of size 1,000 using SRS are analyzed. The results show that the LCF risk results are well converged with either LHS or SRS sampling. The early fatality risk results are less well converged at radial distances beyond 2 miles, and this is expected due to the sparse data (predominance of “zero” results).
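The LHS-versus-SRS comparison at the heart of the convergence check above can be reproduced in miniature: for smooth, roughly additive responses, replicate means computed from Latin Hypercube samples have much lower variance than those from simple random samples. The response function here is an invented stand-in, not a MACCS2 quantity:

```python
import numpy as np

rng = np.random.default_rng(42)

def srs(n, d, rng):
    """Simple Random Sampling on the unit hypercube."""
    return rng.random((n, d))

def lhs(n, d, rng):
    """Basic Latin hypercube: one sample per equal-probability stratum."""
    strata = rng.permuted(np.tile(np.arange(n), (d, 1)), axis=1).T
    return (strata + rng.random((n, d))) / n

f = lambda x: x.sum(axis=1)  # additive response; stratification helps most here

def replicate_var(sampler, n=100, d=3, reps=200):
    """Variance of the sample-mean estimator across independent replicates."""
    means = [f(sampler(n, d, rng)).mean() for _ in range(reps)]
    return np.var(means)

v_srs = replicate_var(srs)
v_lhs = replicate_var(lhs)
print(v_lhs < v_srs)  # prints True: LHS means converge faster here
```

This mirrors the paper's replicate-based validation: three independent replicates per sampling scheme, judged by the spread of the resulting estimates.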
Brown, J. [National Radiological Protection Board (United Kingdom)]; Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)]; and others
1997-06-01T23:59:59.000Z
This volume is the second of a two-volume document that summarizes a joint project by the US Nuclear Regulatory Commission and the Commission of European Communities to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This two-volume report, which examines mechanisms and uncertainties of transfer through the food chain, is the first in a series of five such reports. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain transfer that affect calculations of offsite radiological consequences. Seven of the experts reported on transfer into the food chain through soil and plants, nine reported on transfer via food products from animals, and two reported on both. The expert judgment elicitation procedure and its outcomes are described in these volumes. This volume contains seven appendices. Appendix A presents a brief discussion of the MACCS and COSYMA model codes. Appendix B is the structure document and elicitation questionnaire for the expert panel on soils and plants. Appendix C presents the rationales and responses of each of the members of the soils and plants expert panel. Appendix D is the structure document and elicitation questionnaire for the expert panel on animal transfer. The rationales and responses of each of the experts on animal transfer are given in Appendix E. Brief biographies of the food chain expert panel members are provided in Appendix F. Aggregated results of expert responses are presented in graph format in Appendix G.
Measurement uncertainty analysis techniques applied to PV performance measurements
Wells, C.
1992-10-01T23:59:59.000Z
The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Valid data are defined as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.
Calibration and Measurement Uncertainty Estimation of Radiometric Data: Preprint
Habte, A.; Sengupta, M.; Reda, I.; Andreas, A.; Konings, J.
2014-11-01T23:59:59.000Z
Evaluating the performance of photovoltaic cells, modules, and arrays that form large solar deployments relies on accurate measurements of the available solar resource. Therefore, determining the accuracy of these solar radiation measurements provides a better understanding of investment risks. This paper provides guidelines and recommended procedures for estimating the uncertainty in calibrations and measurements by radiometers using methods that follow the International Bureau of Weights and Measures Guide to the Expression of Uncertainty (GUM). Standardized analysis based on these procedures ensures that the uncertainty quoted is well documented.
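The GUM procedure followed above combines the standard uncertainties of the inputs through the measurement equation's sensitivity coefficients, then multiplies by a coverage factor to obtain the expanded uncertainty. A minimal sketch with hypothetical pyranometer numbers (the measurement equation and all values are illustrative, not taken from the paper):

```python
import math

# Hypothetical pyranometer reading: irradiance E = V / S
# (thermopile voltage V over calibration sensitivity S).
V, u_V = 8.5e-3, 0.02e-3     # volts; standard uncertainty of V
S, u_S = 8.6e-6, 0.1e-6      # V per (W/m^2); standard uncertainty of S

E = V / S                    # measured irradiance, W/m^2

# GUM law of propagation for uncorrelated inputs, written in relative form
# (valid for a pure quotient): (u_c/E)^2 = (u_V/V)^2 + (u_S/S)^2
u_rel = math.sqrt((u_V / V) ** 2 + (u_S / S) ** 2)

U = 2.0 * u_rel * E          # expanded uncertainty, coverage factor k = 2
print(round(E), round(U, 1))
```

As in the paper's radiometric examples, the calibration sensitivity dominates the budget here; shrinking `u_S` would reduce `U` almost proportionally.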
A recipe for EFT uncertainty quantification in nuclear physics
R. J. Furnstahl; D. R. Phillips; S. Wesolowski
2014-12-16T23:59:59.000Z
The application of effective field theory (EFT) methods to nuclear systems provides the opportunity to rigorously estimate the uncertainties originating in the nuclear Hamiltonian. Yet this is just one source of uncertainty in the observables predicted by calculations based on nuclear EFTs. We discuss the goals of uncertainty quantification in such calculations and outline a recipe to obtain statistically meaningful error bars for their predictions. We argue that the different sources of theory error can be accounted for within a Bayesian framework, as we illustrate using a toy model.
Neural network uncertainty assessment using Bayesian statistics with application to remote sensing
Aires, Filipe
... for many inversion problems in remote sensing; however, uncertainty estimates are rarely provided. ... Index terms: Meteorology and Atmospheric Dynamics: General or miscellaneous. Keywords: remote sensing, uncertainty, neural networks.
Quantification of Uncertainties Due to Opacities in a Laser-Driven Radiative-Shock Problem
Hetzler, Adam C
2013-03-28T23:59:59.000Z
... it is infeasible to apply standard uncertainty-propagation techniques to the O(10^5) uncertain opacities in a realistic simulation. The new approach toward uncertainty quantification applies the uncertainty analysis to the physical parameters in the underlying ...
The Uncertainty Principle in Software Engineering. Hadar Ziv and Debra J. Richardson
Ziv, Hadar
... explore in detail uncertainty in software testing, including test planning, test enactment, error ... Keywords: uncertainty principles, software testing, uncertainty modeling, Bayesian networks. Introduction: Today's software ... followed by more detailed expositions of uncertainty in software testing, including test planning, test ...
Informatively optimal levels of confidence for measurement uncertainty
David
2011-10-13T23:59:59.000Z
... expanded uncertainty of measurement as a historical artifact, and not as a strictly substantiated value. ... where a value of 1.5 holds for the most uncertain classification situation (50% confidence) about allowing or ... as a power of exponent (n).
Some methods of estimating uncertainty in accident reconstruction
Milan Batista
2011-07-20T23:59:59.000Z
In this paper, four methods for estimating uncertainty in accident reconstruction are discussed: the total differential method, the extreme values method, the Gauss statistical method, and the Monte Carlo simulation method. The methods are described and the program solutions are given.
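Two of the four methods the paper lists, the total differential (first-order propagation) method and Monte Carlo simulation, can be compared on the classic skid-to-stop speed formula v = sqrt(2·mu·g·d); the input values and their uncertainties below are illustrative, not taken from the paper:

```python
import math
import random

# Skid-to-stop speed v = sqrt(2 * mu * g * d); mu and d are uncertain.
g = 9.81
mu, u_mu = 0.7, 0.05   # tire-road friction coefficient
d, u_d = 30.0, 1.0     # skid mark length, metres

v = math.sqrt(2 * mu * g * d)

# Total differential method: first-order propagation of the two inputs.
dv_dmu = v / (2 * mu)
dv_dd = v / (2 * d)
u_v_lin = math.sqrt((dv_dmu * u_mu) ** 2 + (dv_dd * u_d) ** 2)

# Monte Carlo method: sample the inputs, examine the spread of v.
rng = random.Random(0)
samples = [math.sqrt(2 * rng.gauss(mu, u_mu) * g * rng.gauss(d, u_d))
           for _ in range(20000)]
mean_v = sum(samples) / len(samples)
u_v_mc = math.sqrt(sum((s - mean_v) ** 2 for s in samples)
                   / (len(samples) - 1))
print(round(u_v_lin, 2), round(u_v_mc, 2))
```

For mildly nonlinear formulas like this one, the two estimates nearly coincide; Monte Carlo earns its keep when input distributions are skewed or the model is strongly nonlinear.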
Analysis and Reduction of Complex Networks Under Uncertainty
Knio, Omar M
2014-04-09T23:59:59.000Z
This is a collaborative proposal that aims at developing new methods for the analysis and reduction of complex multiscale networks under uncertainty. The approach is based on combining methods of computational singular perturbation (CSP) and probabilistic uncertainty quantification. In deterministic settings, CSP yields asymptotic approximations of reduced-dimensionality “slow manifolds” on which a multiscale dynamical system evolves. Introducing uncertainty raises fundamentally new issues, particularly concerning its impact on the topology of slow manifolds, and means to represent and quantify associated variability. To address these challenges, this project uses polynomial chaos (PC) methods to reformulate uncertain network models, and to analyze them using CSP in probabilistic terms. Specific objectives include (1) developing effective algorithms that can be used to illuminate fundamental and unexplored connections among model reduction, multiscale behavior, and uncertainty, and (2) demonstrating the performance of these algorithms through applications to model problems.
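The polynomial chaos reformulation mentioned in the abstract can be illustrated in one uncertain dimension: project a response onto probabilists' Hermite polynomials and read the output mean and variance directly off the coefficients. A toy sketch (the response `f` and sample sizes are invented, and Monte Carlo quadrature stands in for the project's actual projection machinery):

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermeval

rng = np.random.default_rng(0)

# Non-intrusive PC projection for X ~ N(0,1): expand f(X) in probabilists'
# Hermite polynomials He_n, orthogonal under the Gaussian measure with
# E[He_n^2] = n!, so c_n = E[f(X) He_n(X)] / n!.
f = lambda x: x ** 2          # toy model response
x = rng.standard_normal(200_000)
fx = f(x)

coeffs = [np.mean(fx * hermeval(x, [0.0] * n + [1.0])) / math.factorial(n)
          for n in range(4)]

mean_pc = coeffs[0]                                  # E[f] = c_0
var_pc = sum(c * c * math.factorial(n)               # Var[f] = sum c_n^2 n!
             for n, c in enumerate(coeffs) if n > 0)
print(round(mean_pc, 2), round(var_pc, 2))           # analytic values: 1, 2
```

Once a network model is rewritten in such PC coordinates, deterministic tools like CSP can be applied to the expansion coefficients, which is the combination the project proposes.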
Modeling aviation's global emissions, uncertainty analysis, and applications to policy
Lee, Joosung Joseph, 1974-
2005-01-01T23:59:59.000Z
... fuel burn results below 3000 ft. For emissions, the emission indices were the most influential uncertainties for the variance in model outputs. By employing the model, this thesis examined three policy options for ...
Including model uncertainty in risk-informed decision-making
Reinert, Joshua M
2005-01-01T23:59:59.000Z
Model uncertainties can have a significant impact on decisions regarding licensing basis changes. We present a methodology to identify basic events in the risk assessment that have the potential to change the decision and ...
Do the uncertainty relations really have crucial significances for physics?
Spiridon Dumitru
2010-05-03T23:59:59.000Z
It is proved that the idea that the Uncertainty Relations (UR) have crucial significance for physics is false. Additionally, we argue for the necessity of an UR-disconnected quantum philosophy.
Creating value from uncertainty : a study of ocean transportation contracting
Pálsson, Sigurjón
2005-01-01T23:59:59.000Z
How can financial tools like real options and hedging mitigate and create value from uncertainty in transportation? This paper describes these concepts and identifies research on them that has relevance to transportation. ...
Uncertainty in Greenhouse Emissions and Costs of Atmospheric Stabilization
Webster, Mort D.
We explore the uncertainty in projections of emissions, and costs of atmospheric stabilization applying the MIT Emissions Prediction and Policy Analysis model, a computable general equilibrium model of the global economy. ...
HANDLING UNCERTAINTY IN PRODUCTION ACTIVITY CONTROL USING PROACTIVE SIMULATION
Paris-Sud XI, Université de
Olivier Cardin ... Keywords: Production Control, Manufacturing Systems, Proactive, Real-time. 1. Introduction: In today's complex ... of a product. Real-world planning and scheduling problems are generally complex, constrained and multi ...
Maintaining artificial recharge ponds under uncertainty: a probabilistic approach for
Politècnica de Catalunya, Universitat
Maintaining artificial recharge ponds under uncertainty: a probabilistic approach for engineering surface ponds (SP). Clogging: What is clogging? Mathematical models for clogging; Risk formulation; Monte Carlo analysis; Conclusions. Surface ponds (SP) collect selected external water (e.g. regenerated
The effects of incorporating dynamic data on estimates of uncertainty
Mulla, Shahebaz Hisamuddin
2004-09-30T23:59:59.000Z
analysis. The results were compared with the uncertainty predicted using only static data. We also investigated approaches for best selecting a smaller number of models from a larger set of realizations to be history matched for quantification...
History matching and uncertainty quantification using sampling method
Ma, Xianlin
2009-05-15T23:59:59.000Z
Uncertainty quantification involves sampling the reservoir parameters correctly from a posterior probability function that is conditioned to both static and dynamic data. Rigorous sampling methods like Markov Chain Monte ...
Robust decision-making with model uncertainty in aerospace systems
Bertuccelli, Luca Francesco, 1981-
2008-01-01T23:59:59.000Z
Actual performance of sequential decision-making problems can be extremely sensitive to errors in the models, and this research addressed the role of robustness in coping with this uncertainty. The first part of this thesis ...
Chance Constrained RRT for Probabilistic Robustness to Environmental Uncertainty
Luders, Brandon Douglas
For motion planning problems involving many or unbounded forms of uncertainty, it may not be possible to identify a path guaranteed to be feasible, requiring consideration of the trade-off between planner conservatism and ...
Uncertainty in future carbon emissions : a preliminary exploration
Webster, Mort David.
In order to analyze competing policy approaches for addressing global climate change, a wide variety of economic-energy models are used to project future carbon emissions under various policy scenarios. Due to uncertainties ...
Integration of Uncertainty Information into Power System Operations
Makarov, Yuri V.; Lu, Shuai; Samaan, Nader A.; Huang, Zhenyu; Subbarao, Krishnappa; Etingov, Pavel V.; Ma, Jian; Hafen, Ryan P.; Diao, Ruisheng; Lu, Ning
2011-10-10T23:59:59.000Z
Contemporary power systems face uncertainties from multiple sources, including forecast errors of load, wind, and solar generation; uninstructed deviation and forced outage of traditional generators; loss of transmission lines; and others. With increasing amounts of wind and solar generation being integrated into the system, these uncertainties have been growing significantly. It is critically important to build knowledge of the major sources of uncertainty, learn how to simulate them, and then incorporate this information into decision-making processes and power system operations, for better reliability and efficiency. This paper gives a comprehensive view of the sources of uncertainty in power systems, their important characteristics, available models, and ways of integrating them into system operations. It is primarily based on previous work conducted at the Pacific Northwest National Laboratory (PNNL).
Uncertainty of microwave radiative transfer computations in rain
Hong, Sung Wook
2009-06-02T23:59:59.000Z
Currently, the effect of the vertical resolution on the brightness temperature (BT) has not been examined in depth. The uncertainty of the freezing level (FL) retrieved using two different satellites' data is large. Various ...
Essays on Voluntary Contribution with Private Information and Threshold Uncertainty
Peng, Hui-Chun
2014-07-08T23:59:59.000Z
This dissertation concerns individual voluntary contributions in the subscription game with three important model considerations: private information on public good valuations, threshold uncertainty and the timing of the contribution — simultaneous...
Infill location determination and assessment of corresponding uncertainty
Senel, Ozgur
2009-05-15T23:59:59.000Z
, generated probabilistic distributions of incremental field production and, finally, used descriptive statistical analysis to evaluate results. I quantified the uncertainty associated with infill location selection in terms of incremental field production...
Real options approach to capacity planning under uncertainty
Mittal, Geetanjali, 1979-
2004-01-01T23:59:59.000Z
This thesis highlights the effectiveness of Real Options Analysis (ROA) in capacity planning decisions for engineering projects subject to uncertainty. This is in contrast to the irreversible decision-making proposed by ...
Investment and Upgrade in Distributed Generation under Uncertainty
Siddiqui, Afzal
2008-01-01T23:59:59.000Z
Treatment of uncertainty via real options increases the value of the investment opportunity. Presented at the 2007 Real Options Conference in Berkeley, CA. Keywords: distributed generation, real options. JEL Codes: D81, Q40
System architecture decisions under uncertainty : a case study on automotive battery system design
Renzi, Matthew Joseph
2012-01-01T23:59:59.000Z
Flexibility analysis using the Real Options framework is typically utilized on high-level architectural decisions. Using Real Options, a company may develop strategies to mitigate downside risk for future uncertainties ...
An algorithm for U-Pb isotope dilution data reduction and uncertainty propagation
McLean, Noah M.; Bowring, J.F.; Bowring, S.A.
2011-06-01T23:59:59.000Z
High-precision U-Pb geochronology by isotope dilution-thermal ionization mass spectrometry is integral to a variety of Earth science disciplines, but its ultimate resolving power is quantified by the uncertainties of ...
An algorithm for U-Pb isotope dilution data reduction and uncertainty propagation
McLean, Noah Morgan
High-precision U-Pb geochronology by isotope dilution-thermal ionization mass spectrometry is integral to a variety of Earth science disciplines, but its ultimate resolving power is quantified by the uncertainties of ...
Clark, E.L.
1993-08-01T23:59:59.000Z
Error propagation equations, based on the Taylor series model, are derived for the nondimensional ratios and coefficients most often encountered in high-speed wind tunnel testing. These include pressure ratio and coefficient, static force and moment coefficients, dynamic stability coefficients, calibration Mach number and Reynolds number. The error equations contain partial derivatives, denoted as sensitivity coefficients, which define the influence of free-stream Mach number, $M_\infty$, on various aerodynamic ratios. To facilitate use of the error equations, sensitivity coefficients are derived and evaluated for nine fundamental aerodynamic ratios, most of which relate free-stream test conditions (pressure, temperature, density or velocity) to a reference condition. Tables of the ratios, $R$, absolute sensitivity coefficients, $\partial R / \partial M_\infty$, and relative sensitivity coefficients, $(M_\infty / R)(\partial R / \partial M_\infty)$, are provided as functions of $M_\infty$.
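First-order Taylor-series propagation of this kind can be sketched in a few lines. The isentropic total-to-static pressure ratio used below is a standard aerodynamic ratio, but the Mach number, its uncertainty, and the finite-difference step are illustrative choices, not values from the report.

```python
def pressure_ratio(M, gamma=1.4):
    """Isentropic total-to-static pressure ratio p0/p as a function of Mach number M."""
    return (1.0 + 0.5 * (gamma - 1.0) * M**2) ** (gamma / (gamma - 1.0))

def sensitivity(f, M, h=1e-6):
    """Central-difference estimate of the sensitivity coefficient dR/dM."""
    return (f(M + h) - f(M - h)) / (2.0 * h)

def propagated_uncertainty(f, M, u_M):
    """First-order (Taylor series) propagation: u_R = |dR/dM| * u_M."""
    return abs(sensitivity(f, M)) * u_M

M, u_M = 2.0, 0.01                  # free-stream Mach number and its standard uncertainty
R = pressure_ratio(M)               # the aerodynamic ratio R
u_R = propagated_uncertainty(pressure_ratio, M, u_M)
rel = (M / R) * sensitivity(pressure_ratio, M)   # relative sensitivity (M/R)(dR/dM)
```

The same pattern extends to multiple inputs by summing the squared per-input contributions before taking the square root.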
Uncertainties in the Anti-neutrino Production at Nuclear Reactors
Djurcic, Zelimir; Detwiler, Jason A.; Piepke, Andreas; Foster Jr., Vince R.; Miller, Lester; Gratta, Giorgio
2008-08-06T23:59:59.000Z
Anti-neutrino emission rates from nuclear reactors are determined from thermal power measurements and fission rate calculations. The uncertainties in these quantities for commercial power plants and their impact on the calculated interaction rates in $\bar{\nu}_e$ detectors are examined. We discuss reactor-to-reactor correlations between the leading uncertainties and their relevance to reactor $\bar{\nu}_e$ experiments.
Entropic uncertainty relation for power-law wave packets
Sumiyoshi Abe; S. Martinez; F. Pennini; A. Plastino
2002-06-06T23:59:59.000Z
For the power-law quantum wave packet in configuration space, the variance of the position observable may be divergent. Accordingly, the information-entropic formulation of the uncertainty principle becomes more appropriate than the Heisenberg-type formulation, since it involves only finite quantities. It is found that the total amount of entropic uncertainty converges to its lower bound in the limit of a large value of the exponent.
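The information-entropic formulation the abstract refers to is the standard Białynicki-Birula–Mycielski relation; in units with $\hbar = 1$ it reads

```latex
S_x + S_p \;\geq\; 1 + \ln\pi,
\qquad
S_x = -\int |\psi(x)|^2 \ln|\psi(x)|^2 \, dx,
\quad
S_p = -\int |\tilde{\psi}(p)|^2 \ln|\tilde{\psi}(p)|^2 \, dp .
```

Unlike the variance product $\Delta x \, \Delta p$, both differential entropies can remain finite even when $\langle x^2 \rangle$ diverges, which is why the entropic form is the natural one for power-law packets.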
Living With Radical Uncertainty. The Exemplary case of Folding Protein
Ignazio Licata
2010-04-21T23:59:59.000Z
Laplace's demon still makes a strong impact on contemporary science, in spite of the fact that the outcomes of mathematical logic, the advent of quantum physics, and more recently complexity science have pointed out the crucial role of uncertainty in descriptions of the world. We focus here on the typical problem of protein folding as an example of uncertainty, radical emergence, and a guide to the "simple" principles for studying complex systems.
Uncertainty and sensitivity analysis for long-running computer codes : a critical review
Langewisch, Dustin R
2010-01-01T23:59:59.000Z
This thesis presents a critical review of existing methods for performing probabilistic uncertainty and sensitivity analysis for complex, computationally expensive simulation models. Uncertainty analysis (UA) methods ...
Callaway, Duncan S; Tabone, Michaelangelo D
2015-01-01T23:59:59.000Z
Variability and uncertainty of photovoltaic generation; uncertainty in solar photovoltaic generation at multiple timescales for grid-connected photovoltaic systems.
Optimal Control of Distributed Energy Resources and Demand Response under Uncertainty
Siddiqui, Afzal
2010-01-01T23:59:59.000Z
Optimal Control of Distributed Energy Resources and Demand Response under Uncertainty. DER in conjunction with demand response (DR): the expected
Improvements to Nuclear Data and Its Uncertainties by Theoretical Modeling
Danon, Yaron; Nazarewicz, Witold; Talou, Patrick
2013-02-18T23:59:59.000Z
This project addresses three important gaps in existing evaluated nuclear data libraries that represent a significant hindrance against highly advanced modeling and simulation capabilities for the Advanced Fuel Cycle Initiative (AFCI). This project will: Develop advanced theoretical tools to compute prompt fission neutrons and gamma-ray characteristics well beyond average spectra and multiplicity, and produce new evaluated files of U and Pu isotopes, along with some minor actinides; Perform state-of-the-art fission cross-section modeling and calculations using global and microscopic model input parameters, leading to truly predictive fission cross-section capabilities. Consistent calculations for a suite of Pu isotopes will be performed; Implement innovative data assimilation tools, which will reflect the nuclear data evaluation process much more accurately, and lead to a new generation of uncertainty quantification files. New covariance matrices will be obtained for Pu isotopes and compared to existing ones. The deployment of a fleet of safe and efficient advanced reactors that minimize radiotoxic waste and are proliferation-resistant is a clear and ambitious goal of AFCI. While in the past the design, construction and operation of a reactor were supported through empirical trials, this new phase in nuclear energy production is expected to rely heavily on advanced modeling and simulation capabilities. To be truly successful, a program for advanced simulations of innovative reactors will have to develop advanced multi-physics capabilities, to be run on massively parallel supercomputers, and to incorporate adequate and precise underlying physics. And all these areas have to be developed simultaneously to achieve those ambitious goals. Of particular interest are reliable fission cross-section uncertainty estimates (including important correlations) and evaluations of prompt fission neutrons and gamma-ray spectra and uncertainties.
How incorporating more data reduces uncertainty in recovery predictions
Campozana, F.P.; Lake, L.W.; Sepehrnoori, K. [Univ. of Texas, Austin, TX (United States)
1997-08-01T23:59:59.000Z
From the discovery to the abandonment of a petroleum reservoir, there are many decisions that involve economic risks because of uncertainty in the production forecast. This uncertainty may be quantified by performing stochastic reservoir modeling (SRM); however, it is not practical to apply SRM every time the model is updated to account for new data. This paper suggests a novel procedure to estimate reservoir uncertainty (and its reduction) as a function of the amount and type of data used in the reservoir modeling. Two types of data are analyzed: conditioning data and well-test data. However, the same procedure can be applied to any other data type. Three performance parameters are suggested to quantify uncertainty. SRM is performed for the following typical stages: discovery, primary production, secondary production, and infill drilling. From those results, a set of curves is generated that can be used to estimate (1) the uncertainty for any other situation and (2) the uncertainty reduction caused by the introduction of new wells (with and without well-test data) into the description.
Uncertainty Estimation Improves Energy Measurement and Verification Procedures
Walter, Travis; Price, Phillip N.; Sohn, Michael D.
2014-05-14T23:59:59.000Z
Implementing energy conservation measures in buildings can reduce energy costs and environmental impacts, but such measures cost money to implement so intelligent investment strategies require the ability to quantify the energy savings by comparing actual energy used to how much energy would have been used in absence of the conservation measures (known as the baseline energy use). Methods exist for predicting baseline energy use, but a limitation of most statistical methods reported in the literature is inadequate quantification of the uncertainty in baseline energy use predictions. However, estimation of uncertainty is essential for weighing the risks of investing in retrofits. Most commercial buildings have, or soon will have, electricity meters capable of providing data at short time intervals. These data provide new opportunities to quantify uncertainty in baseline predictions, and to do so after shorter measurement durations than are traditionally used. In this paper, we show that uncertainty estimation provides greater measurement and verification (M&V) information and helps to overcome some of the difficulties with deciding how much data is needed to develop baseline models and to confirm energy savings. We also show that cross-validation is an effective method for computing uncertainty. In so doing, we extend a simple regression-based method of predicting energy use using short-interval meter data. We demonstrate the methods by predicting energy use in 17 real commercial buildings. We discuss the benefits of uncertainty estimates which can provide actionable decision making information for investing in energy conservation measures.
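The cross-validation idea described above can be sketched roughly: fit a baseline model on most of the data, predict the held-out portion, and take the spread of the out-of-fold errors as the prediction uncertainty. The linear temperature-to-energy model and all numbers here are hypothetical illustrations, not the 17-building dataset or regression method from the paper.

```python
import random
import statistics

random.seed(0)

# Hypothetical hourly data: outdoor temperature -> building energy use, with noise.
temps = [random.uniform(0, 30) for _ in range(200)]
energy = [50.0 + 2.0 * t + random.gauss(0, 5) for t in temps]

def fit_linear(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def cv_residuals(xs, ys, k=5):
    """k-fold cross-validation: collect out-of-fold prediction errors."""
    idx = list(range(len(xs)))
    random.shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    resid = []
    for fold in folds:
        hold = set(fold)
        tr_x = [xs[i] for i in idx if i not in hold]
        tr_y = [ys[i] for i in idx if i not in hold]
        a, b = fit_linear(tr_x, tr_y)
        resid += [ys[i] - (a + b * xs[i]) for i in fold]
    return resid

resid = cv_residuals(temps, energy)
u = statistics.stdev(resid)   # cross-validated prediction uncertainty
```

Because every residual is computed on data the model never saw, `u` estimates out-of-sample prediction error rather than in-sample fit quality.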
The Role of Uncertainty Quantification for Reactor Physics
Massimo Salvatores; Giuseppe Palmiotti; G. Aliberti
2015-01-01T23:59:59.000Z
The role of uncertainty quantification has been stressed and has been the object of several assessments in the past (see, e.g., References 1 and 2, among many others), in particular in relation to design requirements for safety assessments and to the definition and optimization of design margins, both for the reactor core and for the associated fuel cycles. The use of integral experiments has been advocated for many years, and recently re-assessed, in order to reduce uncertainties and to define new, reduced "a-posteriori" uncertainties. While uncertainty quantification in the case of existing power plants benefits from a large database of operating-reactor experimental results, innovative reactor systems (reactors and associated fuel cycles) must rely on limited power-reactor experiment databases and on a number of past integral experiments that should be shown to be representative enough. Moreover, in some cases, in particular those related to innovative fuel cycle performance and feasibility assessment, nuclear data uncertainties are the only available information. Uncertainty quantification in that case becomes a tool for detecting potential show-stoppers associated with specific fuel cycle strategies, besides the challenges related to fuel properties, fuel processing chemistry, and material performance issues.
Fuzzy-probabilistic calculations of water-balance uncertainty
Faybishenko, B.
2009-10-01T23:59:59.000Z
Hydrogeological systems are often characterized by imprecise, vague, inconsistent, incomplete, or subjective information, which may limit the application of conventional stochastic methods in predicting hydrogeologic conditions and associated uncertainty. Instead, predictions and uncertainty analysis can be made using uncertain input parameters expressed as probability boxes, intervals, and fuzzy numbers. The objective of this paper is to present the theory for, and a case study as an application of, the fuzzy-probabilistic approach, combining probability and possibility theory for simulating soil water balance and assessing associated uncertainty in the components of a simple water-balance equation. The application of this approach is demonstrated using calculations with the RAMAS Risk Calc code, to assess the propagation of uncertainty in calculating potential evapotranspiration, actual evapotranspiration, and infiltration in a case study at the Hanford site, Washington, USA. Propagation of uncertainty into the results of water-balance calculations was evaluated by changing the types of models of uncertainty incorporated into various input parameters. The results of these fuzzy-probabilistic calculations are compared to the conventional Monte Carlo simulation approach and estimates from field observations at the Hanford site.
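One building block of such an analysis, interval-valued inputs propagated through a simple water-balance equation, can be sketched as follows. The interval bounds are illustrative, not the Hanford values, and real fuzzy-probabilistic tools such as RAMAS Risk Calc layer possibility distributions and probability boxes on top of this basic interval arithmetic.

```python
class Interval:
    """Closed interval [lo, hi] with the arithmetic needed for a water balance."""
    def __init__(self, lo, hi):
        assert lo <= hi
        self.lo, self.hi = lo, hi
    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __sub__(self, other):
        # Subtraction widens: smallest result minus largest, largest minus smallest.
        return Interval(self.lo - other.hi, self.hi - other.lo)
    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

# Illustrative annual values in mm (hypothetical, not from the paper):
P  = Interval(150, 210)   # precipitation
ET = Interval(100, 160)   # actual evapotranspiration
R  = Interval(0, 20)      # runoff

# Water balance: net infiltration N = P - ET - R
N = P - ET - R
```

Note how the output interval is wider than any input: interval subtraction accumulates uncertainty, which is exactly the conservatism that motivates combining intervals with probabilistic information.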
Systematic uncertainties of artificial neural-network pulse-shape discrimination for $0\
Abt, I; Cossavella, F; Majorovits, B; Palioselitis, D; Volynets, O
2014-01-01T23:59:59.000Z
A pulse-shape discrimination method based on artificial neural networks was applied to pulses simulated for different background, signal and signal-like interactions inside a germanium detector. The simulated pulses were used to investigate the systematic uncertainties of the method. It is verified that neural networks are well-suited to identify background pulses in true-coaxial high-purity germanium detectors. The systematic uncertainty on the signal recognition efficiency derived using signal-like samples from calibration measurements is estimated to be 5%. This uncertainty is due to differences between signal and calibration samples.
Comparison of nuclear data uncertainty propagation methodologies for PWR burn-up simulations
Carlos Javier Diez; Oliver Buss; Axel Hoefer; Dieter Porsch; Oscar Cabellos
2014-11-04T23:59:59.000Z
Several methodologies using different levels of approximations have been developed for propagating nuclear data uncertainties in nuclear burn-up simulations. Most methods fall into the two broad classes of Monte Carlo approaches, which are exact apart from statistical uncertainties but require additional computation time, and first order perturbation theory approaches, which are efficient for not too large numbers of considered response functions but only applicable for sufficiently small nuclear data uncertainties. Some methods neglect isotopic composition uncertainties induced by the depletion steps of the simulations, others neglect neutron flux uncertainties, and the accuracy of a given approximation is often very hard to quantify. In order to get a better sense of the impact of different approximations, this work aims to compare results obtained based on different approximate methodologies with an exact method, namely the NUDUNA Monte Carlo based approach developed by AREVA GmbH. In addition, the impact of different covariance data is studied by comparing two of the presently most complete nuclear data covariance libraries (ENDF/B-VII.1 and SCALE 6.0), which reveals a high dependency of the uncertainty estimates on the source of covariance data. The burn-up benchmark Exercise I-1b proposed by the OECD expert group "Benchmarks for Uncertainty Analysis in Modeling (UAM) for the Design, Operation and Safety Analysis of LWRs" is studied as an example application. The burn-up simulations are performed with the SCALE 6.0 tool suite.
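The two method classes compared in the paper can be illustrated on a toy depletion response: Monte Carlo sampling of an uncertain cross-section versus a first-order perturbation estimate. Magnitudes here are deliberately non-physical (order unity) and the single-exponential response is a stand-in for illustration, not the NUDUNA or SCALE models.

```python
import math
import random
import statistics

random.seed(1)

# Toy depletion response: N = N0 * exp(-sigma * F), where F plays the role of
# fluence (flux * time) and sigma carries the nuclear-data uncertainty.
N0, F = 1.0, 3.0
sigma_mean, sigma_u = 1.0, 0.05   # 5% absolute = relative uncertainty on sigma

def response(sigma):
    return N0 * math.exp(-sigma * F)

# Monte Carlo propagation: sample sigma, evaluate the response each time.
# Exact apart from statistical noise, but costs one model run per sample.
samples = [response(random.gauss(sigma_mean, sigma_u)) for _ in range(20000)]
mc_u = statistics.stdev(samples)

# First-order perturbation estimate: u_N = |dN/dsigma| * u_sigma.
# Cheap, but valid only for sufficiently small input uncertainties.
lin_u = F * response(sigma_mean) * sigma_u
```

For this mildly nonlinear response the two estimates nearly agree; as `sigma_u` grows, the linearized estimate drifts away from the Monte Carlo spread, which is the accuracy trade-off the paper quantifies.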
PIV Uncertainty Methodologies for CFD Code Validation at the MIR Facility
Piyush Sabharwall; Richard Skifton; Carl Stoots; Eung Soo Kim; Thomas Conder
2013-12-01T23:59:59.000Z
Currently, computational fluid dynamics (CFD) is widely used in the nuclear thermal hydraulics field for design and safety analyses. To validate CFD codes, high-quality multi-dimensional flow field data are essential. The Matched Index of Refraction (MIR) Flow Facility at Idaho National Laboratory has a unique capability to contribute to the development of validated CFD codes through the use of Particle Image Velocimetry (PIV). The significance of the MIR facility is that it permits non-intrusive velocity measurement techniques, such as PIV, through complex models without requiring probes and other instrumentation that disturb the flow. At the heart of any PIV calculation is the cross-correlation, which is used to estimate the displacement of particles in some small part of the image over the time span between two images. This image displacement is indicated by the location of the largest correlation peak. In the MIR facility, uncertainty quantification is a challenging task due to the use of optical measurement techniques. This study is developing a reliable method to analyze the uncertainty and sensitivity of the measured data, together with a computer code to automate that analysis. The main objective of this study is to develop a well-established uncertainty quantification method for the MIR Flow Facility, which involves many complicated uncertainty factors. The uncertainty sources are resolved in depth by categorizing them into uncertainties from the MIR flow loop and from the PIV system (including particle motion, image distortion, and data processing). Each uncertainty source is then mathematically modeled or adequately defined. Finally, this study will provide a method and procedure to quantify the experimental uncertainty in the MIR Flow Facility with sample test results.
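The cross-correlation step described above can be sketched in one dimension: correlate two intensity profiles over a range of lags and take the lag of the largest peak as the particle displacement. The Gaussian "particle image" and the 3-pixel shift are invented for illustration; production PIV codes work on 2-D interrogation windows with FFT-based correlation and sub-pixel peak fitting.

```python
import math

def gaussian(n, center, width=2.0):
    """A 1-D Gaussian intensity profile standing in for a particle image."""
    return [math.exp(-((i - center) / width) ** 2) for i in range(n)]

def cross_correlate(a, b, max_lag):
    """Direct cross-correlation; returns the lag of the largest peak."""
    best_lag, best_val = 0, float("-inf")
    n = len(a)
    for lag in range(-max_lag, max_lag + 1):
        val = sum(a[i] * b[i + lag] for i in range(n) if 0 <= i + lag < n)
        if val > best_val:
            best_lag, best_val = lag, val
    return best_lag

frame1 = gaussian(64, center=20)
frame2 = gaussian(64, center=23)   # particles moved +3 pixels between exposures

displacement = cross_correlate(frame1, frame2, max_lag=10)
```

Dividing the recovered pixel displacement by the inter-frame time and the image magnification yields velocity, and the sharpness of the correlation peak is one handle on the measurement uncertainty.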
Investment and Upgrade in Distributed Generation under Uncertainty
Siddiqui, Afzal; Maribu, Karl
2008-08-18T23:59:59.000Z
The ongoing deregulation of electricity industries worldwide is providing incentives for microgrids to use small-scale distributed generation (DG) and combined heat and power (CHP) applications via heat exchangers (HXs) to meet local energy loads. Although the electric-only efficiency of DG is lower than that of central-station production, relatively high tariff rates and the potential for CHP applications increase the attraction of on-site generation. Nevertheless, a microgrid contemplating the installation of gas-fired DG has to be aware of the uncertainty in the natural gas price. Treatment of uncertainty via real options increases the value of the investment opportunity, which then delays the adoption decision as the opportunity cost of exercising the investment option increases as well. In this paper, we take the perspective of a microgrid that can proceed in a sequential manner with DG capacity and HX investment in order to reduce its exposure to risk from natural gas price volatility. In particular, with the availability of the HX, the microgrid faces a tradeoff between reducing its exposure to the natural gas price and maximising its cost savings. By varying the volatility parameter, we find that the microgrid prefers a direct investment strategy for low levels of volatility and a sequential one for higher levels of volatility.
IAEA CRP on HTGR Uncertainty Analysis: Benchmark Definition and Test Cases
Gerhard Strydom; Frederik Reitsma; Hans Gougar; Bismark Tyobeka; Kostadin Ivanov
2012-11-01T23:59:59.000Z
Uncertainty and sensitivity studies are essential elements of the reactor simulation code verification and validation process. Although several international uncertainty quantification activities have been launched in recent years in the LWR, BWR and VVER domains (e.g. the OECD/NEA BEMUSE program [1], from which the current OECD/NEA LWR Uncertainty Analysis in Modelling (UAM) benchmark [2] effort was derived), the systematic propagation of uncertainties in cross-section, manufacturing and model parameters for High Temperature Reactor (HTGR) designs has not been attempted yet. This paper summarises the scope, objectives and exercise definitions of the IAEA Coordinated Research Project (CRP) on HTGR UAM [3]. Note that no results will be included here, as the HTGR UAM benchmark was only launched formally in April 2012, and the specification is currently still under development.
Determination of the uncertainty in assembly average burnup
Cacciapouti, R.J.; Lam, G.M.; Theriault, P.A.; Delmolino, P.M.
1998-12-31T23:59:59.000Z
Pressurized water reactors maintain records of the assembly average burnup for each fuel assembly at the plant. The reactor records are currently used by commercial reactor operators and vendors for (a) special nuclear accountability, (b) placement of spent fuel in storage pools, and (c) dry storage cask design and analysis. A burnup credit methodology has been submitted to the US Nuclear Regulatory Commission (NRC) by the US Department of Energy. In order to support this application, utilities are requested to provide burnup uncertainty as part of their reactor records. The collected burnup data are used for the development of a plant correction to the cask vendor supplied burnup credit loading curve. The objective of this work is to identify a feasible methodology for determining the 95/95 uncertainty in the assembly average burnup. Reactor records are based on the core neutronic analysis coupled with measured in-core detector data. The uncertainty of particular burnup records depends mainly on the uncertainty associated with the methods used to develop the records. The methodology adopted for this analysis utilizes current neutronic codes for the determination of the uncertainty in assembly average burnup.
Direct tests of measurement uncertainty relations: what it takes
Paul Busch; Neil Stevens
2015-01-17T23:59:59.000Z
The uncertainty principle being a cornerstone of quantum mechanics, it is surprising that in nearly 90 years there have been no direct tests of measurement uncertainty relations. This lacuna was due to the absence of two essential ingredients: appropriate measures of measurement error (and disturbance), and precise formulations of such relations that are {\em universally valid} and {\em directly testable}. We formulate two distinct forms of direct tests, based on different measures of error. We present a prototype protocol for a direct test of measurement uncertainty relations in terms of {\em value deviation errors} (hitherto considered nonfeasible), highlighting the lack of universality of these relations. This shows that the formulation of universal, directly testable measurement uncertainty relations for {\em state-dependent} error measures remains an important open problem. Recent experiments that were claimed to constitute invalidations of Heisenberg's error-disturbance relation are shown to conform with the spirit of Heisenberg's principle if interpreted as direct tests of measurement uncertainty relations for error measures that quantify {\em distances between observables}.
SCALE-6 Sensitivity/Uncertainty Methods and Covariance Data
Williams, Mark L [ORNL; Rearden, Bradley T [ORNL
2008-01-01T23:59:59.000Z
Computational methods and data used for sensitivity and uncertainty analysis within the SCALE nuclear analysis code system are presented. The methodology used to calculate sensitivity coefficients and similarity coefficients and to perform nuclear data adjustment is discussed. A description is provided of the SCALE-6 covariance library based on ENDF/B-VII and other nuclear data evaluations, supplemented by 'low-fidelity' approximate covariances. SCALE (Standardized Computer Analyses for Licensing Evaluation) is a modular code system developed by Oak Ridge National Laboratory (ORNL) to perform calculations for criticality safety, reactor physics, and radiation shielding applications. SCALE calculations typically use sequences that execute a predefined series of executable modules to compute particle fluxes and responses like the critical multiplication factor. SCALE also includes modules for sensitivity and uncertainty (S/U) analysis of calculated responses. The S/U codes in SCALE are collectively referred to as TSUNAMI (Tools for Sensitivity and UNcertainty Analysis Methodology Implementation). SCALE-6, scheduled for release in 2008, contains significant new capabilities, including important enhancements in S/U methods and data. The main functions of TSUNAMI are to (a) compute nuclear data sensitivity coefficients and response uncertainties, (b) establish similarity between benchmark experiments and design applications, and (c) reduce uncertainty in calculated responses by consolidating integral benchmark experiments. TSUNAMI includes easy-to-use graphical user interfaces for defining problem input and viewing three-dimensional (3D) geometries, as well as an integrated plotting package.
Fuel cycle cost uncertainty from nuclear fuel cycle comparison
Li, J.; McNelis, D. [Institute for the Environment, University of North Carolina, Chapel Hill (United States); Yim, M.S. [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology (Korea, Republic of)
2013-07-01T23:59:59.000Z
This paper examined the uncertainty in fuel cycle cost (FCC) calculation by considering both model and parameter uncertainty. Four different fuel cycle options were compared in the analysis: the once-through cycle (OT), the DUPIC cycle, the MOX cycle, and a closed fuel cycle with fast reactors (FR). The model uncertainty was addressed by using three different FCC modeling approaches, with and without consideration of the time value of money. The relative ratios of FCC in comparison to OT did not change much across modeling approaches. This observation was consistent with the results of the sensitivity study for the discount rate. Two different sets of data with uncertainty ranges of unit costs were used to address the parameter uncertainty of the FCC calculation. The sensitivity study showed that the dominating contributor to the total variance of FCC is the uranium price. In general, the FCC of OT was found to be the lowest, followed by FR, MOX, and DUPIC. But depending on the uranium price, the FR cycle was found to have a lower FCC than OT. The reprocessing cost was also found to have a major impact on FCC.
Approaches to uncertainty analysis in probabilistic risk assessment
Bohn, M.P.; Wheeler, T.A.; Parry, G.W.
1988-01-01T23:59:59.000Z
An integral part of any probabilistic risk assessment (PRA) is the performance of an uncertainty analysis to quantify the uncertainty in the point estimates of the risk measures considered. While a variety of classical methods of uncertainty analysis exist, the application of these methods, and the development of new techniques consistent with existing PRA databases and the need for expert (subjective) input, have been areas of considerable interest since the pioneering Reactor Safety Study (WASH-1400) in 1975. This report presents the results of a critical review of existing methods for performing uncertainty analyses for PRAs, with special emphasis on identifying database limitations of the various methods. Both classical and Bayesian approaches have been examined. This work was funded by the US Nuclear Regulatory Commission in support of its ongoing full-scope PRA of the LaSalle nuclear power station. Thus, in addition to the review, this report contains recommendations for a suitable uncertainty analysis methodology for the LaSalle PRA.
Solar irradiance forecasting at multiple time horizons and novel methods to evaluate uncertainty
Marquez, Ricardo
2012-01-01T23:59:59.000Z
Solar irradiance data; accuracy; solar resource; uncertainty in solar resource forecasting.
Mills, Andrew
2010-01-01T23:59:59.000Z
and Uncertainty of Photovoltaics for Integration with the ... models and datasets. Photovoltaics fall under the broader
4. Uncertainty D. Keil Artificial Intelligence 7/13 David Keil, Framingham State University
Keil, David M.
CSCI 300 Artificial Intelligence. 4. Uncertainty and belief: 1. Acting under uncertainty; 2. Probability. Inquiry: is probabilistic reasoning part
Harp, D. R. [Los Alamos National Laboratory, Los Alamos, NM, USA; Atchley, A. L. [Los Alamos National Laboratory, Los Alamos, NM, USA; Painter, S. L. [Oak Ridge National Laboratory, Oak Ridge, TN, USA; Coon, E. T. [Los Alamos National Laboratory, Los Alamos, NM, USA; Wilson, C. J. [Los Alamos National Laboratory, Los Alamos, NM, USA; Romanovsky, V. E. [University of Alaska, Fairbanks, USA] (ORCID:0000000295152087); Rowland, J. C. [Los Alamos National Laboratory, Los Alamos, NM, USA
2015-01-01T23:59:59.000Z
The effect of soil property uncertainties on permafrost thaw projections is studied using a three-phase subsurface thermal hydrology model and calibration-constrained uncertainty analysis. The Null-Space Monte Carlo method is used to identify soil hydrothermal parameter combinations that are consistent with borehole temperature measurements at the study site, the Barrow Environmental Observatory. Each parameter combination is then used in a forward projection of permafrost conditions for the 21st century (from calendar year 2006 to 2100) using atmospheric forcings from the Community Earth System Model (CESM) in the Representative Concentration Pathway (RCP) 8.5 greenhouse gas concentration trajectory. A 100-year projection allows for the evaluation of intra-annual uncertainty due to soil properties and the inter-annual variability due to year-to-year differences in CESM climate forcings. After calibrating to borehole temperature data at this well-characterized site, soil property uncertainties are still significant and result in significant intra-annual uncertainties in projected active layer thickness (ALT) and annual thaw depth-duration even with a specified future climate. Intra-annual uncertainties in projected soil moisture content and Stefan number are small. A volume- and time-integrated Stefan number decreases significantly in the future climate, indicating that latent heat of phase change becomes more important than heat conduction in future climates. Out of 10 soil parameters, ALT, annual thaw depth-duration, and Stefan number are highly dependent on mineral soil porosity, while annual mean liquid saturation of the active layer is highly dependent on the mineral soil residual saturation and moderately dependent on peat residual saturation.
By comparing the ensemble statistics to the spread of projected permafrost metrics using different climate models, we show that the effect of calibration-constrained uncertainty in soil properties, although significant, is less than that produced by structural climate model uncertainty for this location.
LDRD Final Report: Capabilities for Uncertainty in Predictive Science.
Phipps, Eric T.; Eldred, Michael S.; Salinger, Andrew G.; Webster, Clayton G.
2008-10-01T23:59:59.000Z
Predictive simulation of systems comprised of numerous interconnected, tightly coupled components promises to help solve many problems of scientific and national interest. However, predictive simulation of such systems is extremely challenging due to the coupling of a diverse set of physical and biological length and time scales. This report investigates uncertainty quantification methods for such systems that attempt to exploit their structure to gain computational efficiency. The traditional layering of uncertainty quantification around nonlinear solution processes is inverted to allow heterogeneous uncertainty quantification methods to be applied to each component in a coupled system. Moreover, this approach allows stochastic dimension reduction techniques to be applied at each coupling interface. The mathematical feasibility of these ideas is investigated in this report, and mathematical formulations for the resulting stochastically coupled nonlinear systems are developed.
Uncertainty quantification and validation of combined hydrological and macroeconomic analyses.
Hernandez, Jacquelynne; Parks, Mancel Jordan; Jennings, Barbara Joan; Kaplan, Paul Garry; Brown, Theresa Jean; Conrad, Stephen Hamilton
2010-09-01T23:59:59.000Z
Changes in climate can lead to instabilities in physical and economic systems, particularly in regions with marginal resources. Global climate models indicate increasing global mean temperatures over the decades to come, and uncertainty in the local-to-national impacts means perceived risks will drive planning decisions. Agent-based models provide one of the few ways to evaluate the potential changes in behavior in coupled social-physical systems and to quantify and compare risks. The current generation of climate impact analyses provides estimates of the economic cost of climate change for a limited set of climate scenarios that account for a small subset of the dynamics and uncertainties. To better understand the risk to national security, the next generation of risk assessment models must represent global stresses, population vulnerability to those stresses, and the uncertainty in population responses and outcomes that could have a significant impact on U.S. national security.
Incorporating uncertainty into electric utility projections and decisions
Hanson, D.A.
1992-01-01T23:59:59.000Z
This paper focuses on how electric utility companies can respond in their decision making to uncertain variables. Here we take a mean-variance type of approach. The "mean" value is an expected cost, on a discounted value basis. We assume that management has risk preferences incorporating a tradeoff between the mean and variance of the utility's net income. Decisions that utilities face can be classified into two types: ex ante and ex post. Ex ante decisions must be made prior to the uncertainty being revealed, and ex post decisions can be postponed until after the uncertainty is revealed. Intuitively, the ex ante decisions provide a hedge against the uncertainties, and the ex post decisions allow the negative outcomes of uncertain variables to be partially mitigated, dampening the losses. An example of an ex post decision is how the system is operated, i.e., unit dispatch and, in some cases, switching among types of fuels, say with different sulfur contents. For example, if gas prices go up, natural gas combined cycle units are likely to be dispatched at lower capacity factors. If SO{sub 2} emission allowance prices go up, a utility may seek to switch to a lower sulfur coal. Here we assume that regulated electric utilities do have some incentive to lower revenue requirements and hence an incentive to lower the electric rates needed for the utility to break even, thereby earning a fair return on invested capital. This paper presents the general approach first, including applications to capacity expansion and system dispatch. Then a case study is presented focusing on the 1990 Clean Air Act Amendments, including SO{sub 2} emissions abatement and banking of allowances under uncertainty. It is concluded that emission banking decisions should not be made in isolation; rather, all the uncertainties in demand, fuel prices, technology performance, etc., should be included in the uncertainty analysis affecting emission banking.
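The mean-variance decision rule sketched in this abstract can be illustrated with a small simulation. This is a hypothetical sketch, not Hanson's actual model: the option names, cost figures, and fuel-price distribution below are invented for illustration, and the criterion scored is simply E[cost] + lambda * Var[cost] for a risk-aversion weight lambda.

```python
import random

random.seed(42)

# Hypothetical example: a utility chooses between two ex ante capacity
# options under an uncertain future fuel price. Each option's discounted
# cost is simulated over random price scenarios.

def simulate_costs(fixed_cost, fuel_sensitivity, n=10000):
    """Discounted total cost of one option across random fuel-price draws."""
    return [fixed_cost + fuel_sensitivity * random.gauss(4.0, 1.5)
            for _ in range(n)]

def mean_variance_score(costs, risk_aversion):
    """Mean-variance criterion: lower scores are preferred."""
    n = len(costs)
    mean = sum(costs) / n
    var = sum((c - mean) ** 2 for c in costs) / n
    return mean + risk_aversion * var

options = {
    # cheap to build but heavily exposed to fuel-price risk
    "gas_combined_cycle": simulate_costs(fixed_cost=40.0, fuel_sensitivity=12.0),
    # expensive to build but nearly insensitive to fuel prices
    "coal_with_scrubber": simulate_costs(fixed_cost=90.0, fuel_sensitivity=2.0),
}

for lam in (0.0, 0.05):
    best = min(options, key=lambda k: mean_variance_score(options[k], lam))
    print(f"risk aversion {lam}: choose {best}")
```

With these illustrative numbers, a risk-neutral planner (lambda = 0) picks the cheaper-on-average gas option, while a risk-averse one (lambda = 0.05) pays a premium for the low-variance coal option: the hedging intuition the abstract describes.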
Distributed Generation Investment by a Microgrid under Uncertainty
Marnay, Chris; Siddiqui, Afzal; Marnay, Chris
2008-08-11T23:59:59.000Z
This paper examines a California-based microgrid's decision to invest in a distributed generation (DG) unit fuelled by natural gas. While the long-term natural gas generation cost is stochastic, we initially assume that the microgrid may purchase electricity at a fixed retail rate from its utility. Using the real options approach, we find a natural gas generation cost threshold that triggers DG investment. Furthermore, the consideration of operational flexibility by the microgrid increases DG investment, while the option to disconnect from the utility is not attractive. By allowing the electricity price to be stochastic, we next determine an investment threshold boundary and find that high electricity price volatility relative to that of natural gas generation cost delays investment while simultaneously increasing the value of the investment. We conclude by using this result to find the implicit option value of the DG unit when two sources of uncertainty exist.
Solar WIMPs Unraveled: Experiments, astrophysical uncertainties, and interactive Tools
Danninger, Matthias
2015-01-01T23:59:59.000Z
The absence of a neutrino flux from self-annihilating dark matter captured in the Sun has tightly constrained some leading particle dark matter scenarios. The impact of astrophysical uncertainties on the capture process of dark matter in the Sun, and hence also on the constraints derived by neutrino telescopes, needs to be taken into account. In this review we have explored relevant uncertainties in solar WIMP searches, summarized results from leading experiments, and provided an outlook into upcoming searches and future experiments. We have created an interactive plotting tool that allows the user to view current limits and projected sensitivities of major experiments under changing astrophysical conditions.
Effective field theory for nuclear vibrations with quantified uncertainties
Pérez, E A Coello
2015-01-01T23:59:59.000Z
We develop an effective field theory (EFT) for nuclear vibrations. The key ingredients - quadrupole degrees of freedom, rotational invariance, and a breakdown scale around the three-phonon level - are taken from data. The EFT is developed for spectra and electromagnetic moments and transitions. We employ tools from Bayesian statistics for the quantification of theoretical uncertainties. The EFT consistently describes spectra and electromagnetic transitions for $^{62}$Ni, $^{98,100}$Ru, $^{106,108}$Pd, $^{110,112,114}$Cd, and $^{118,120,122}$Te within the theoretical uncertainties. This suggests that these nuclei can be viewed as anharmonic vibrators.
Reactor Neutrino Flux Uncertainty Suppression on Multiple Detector Experiments
Andi Cucoanes; Pau Novella; Anatael Cabrera; Muriel Fallot; Anthony Onillon; Michel Obolensky; Frederic Yermia
2015-01-02T23:59:59.000Z
This publication provides a coherent treatment of reactor neutrino flux uncertainty suppression, especially focused on the latest $\\theta_{13}$ measurement. The treatment starts with a single detector at a single reactor site, most relevant for all reactor experiments beyond $\\theta_{13}$. We demonstrate that there is no trivial error cancellation; thus the flux systematic error can remain dominant even after the adoption of multi-detector configurations. However, three mechanisms for flux error suppression have been identified and calculated in the context of the Double Chooz, Daya Bay and RENO sites. Our analysis computes the error suppression fraction using simplified scenarios to maximise relative comparison among experiments. We have validated the only mechanism exploited so far by experiments to improve the precision of the published $\\theta_{13}$. The other two newly identified mechanisms could lead to total flux error cancellation under specific conditions and are expected to have major implications for the global $\\theta_{13}$ knowledge today. First, Double Chooz, in its final configuration, is the only experiment benefiting from a negligible reactor flux error due to a $\\sim$90\\% geometrical suppression. Second, Daya Bay and RENO could benefit from their partial geometrical cancellation, yielding a potential $\\sim$50\\% error suppression and thus significantly improving the global $\\theta_{13}$ precision today. Third, we illustrate the rationale behind further error suppression upon the exploitation of the inter-reactor error correlations, so far neglected. Our publication is thus a key step forward in the context of high-precision reactor neutrino experiments, providing insight on the suppression of their intrinsic flux error, affecting past and current experimental results as well as the design of future experiments.
Partition Testing versus Random Testing: the Influence of Uncertainty
Gutjahr, Walter
Walter J. Gutjahr, Department of Statistics, O.R. and Computer Science, University of Vienna. Abstract: The paper compares partition testing and random testing on the assumption that program failure rates are not known with certainty before testing
Reducing uncertainty in geostatistical description with well testing pressure data
Reynolds, A.C.; He, Nanqun [Univ. of Tulsa, OK (United States); Oliver, D.S. [Chevron Petroleum Technology Company, La Habra, CA (United States)
1997-08-01T23:59:59.000Z
Geostatistics has proven to be an effective tool for generating realizations of reservoir properties conditioned to static data, e.g., core and log data and geologic knowledge. Due to the lack of closely spaced data in the lateral directions, there will be significant variability in reservoir descriptions generated by geostatistical simulation, i.e., significant uncertainty in the reservoir descriptions. In past work, we have presented procedures based on inverse problem theory for generating reservoir descriptions (rock property fields) conditioned to pressure data and geostatistical information represented as prior means for log-permeability and porosity and variograms. Although we have shown that the incorporation of pressure data reduces the uncertainty below the level contained in the geostatistical model based only on static information (the prior model), our previous results did not explicitly account for uncertainties in the prior means and the parameters defining the variogram model. In this work, we investigate how pressure data can help detect errors in the prior means. If errors in the prior means are large and are not taken into account, realizations conditioned to pressure data represent incorrect samples of the a posteriori probability density function for the rock property fields, whereas, if the uncertainty in the prior mean is incorporated properly into the model, one obtains realistic realizations of the rock property fields.
Optimal Storage Policies with Wind Forecast Uncertainties [Extended Abstract
Dalang, Robert C.
Nicolas Gast, EPFL IC. ... wind generation. The use of energy storage compensates to some extent these negative effects; it plays a buffer role between demand and production. We revisit a model of real storage proposed by Bejan et al. [1].
Oil and Gas Production Optimization; Lost Potential due to Uncertainty
Johansen, Tor Arne
Steinar M. Elgsaeter (ntnu.no). Abstract: The information content in measurements of offshore oil and gas production is often low. Production, in the context of offshore oil and gas fields, can be considered the total output of production wells, a mass
Representing Uncertainty in Decision Support Harvey J. Greenberg
Greenberg, Harvey J.
Overlapping factors: · Source of uncertainty. Is it a future event, such as product demand? Is it ignorance, such as what defines a proven reserve? · Time scale. Is the decision a strategic one, to cover a planning horizon? Is it a major setback, such as the loss of a company? Is it catastrophic, such as the loss of a city? · Decision
Uncertainties analysis of fission fraction for reactor antineutrino experiments
X. B. Ma; F. Lu; L. Z. Wang; Y. X. Chen; W. L. Zhong; F. P. An
2015-03-17T23:59:59.000Z
Reactor antineutrino experiments are used to study neutrino oscillation, search for signatures of nonstandard neutrino interactions, and monitor reactor operation for safeguard applications. Reactor simulation is an important source of uncertainty for a reactor neutrino experiment. A commercial code is used for reactor simulation to evaluate the fission fractions in the Daya Bay neutrino experiment, but its source code is not open to our researchers because of commercial secrecy. In this study, the open-source code DRAGON was improved to calculate the fission rates of the four most important isotopes in fissions, $^{235}$U, $^{238}$U, $^{239}$Pu and $^{241}$Pu, and was then validated for PWRs using the Takahama-3 benchmark. The fission fraction results are consistent with MIT's results. The fission fractions of the Daya Bay reactor core were then calculated using the improved DRAGON code and agreed well with those calculated by SCIENCE, with an average deviation of less than 5\\% for all four isotopes. The correlation coefficient matrix between $^{235}$U, $^{238}$U, $^{239}$Pu and $^{241}$Pu was also studied using DRAGON, and the uncertainty of the antineutrino flux due to the fission fractions was calculated using this correlation coefficient matrix. The uncertainty of the antineutrino flux from the fission fraction simulation is 0.6\\% per core for the Daya Bay antineutrino experiment. The sources of uncertainty in the fission fraction calculation need further study.
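The last propagation step this abstract describes, combining per-isotope fission-fraction uncertainties through a correlation coefficient matrix into a single flux uncertainty, is the standard quadratic form sigma_flux^2 = sum_ij w_i w_j sigma_i sigma_j rho_ij. The sketch below uses illustrative sensitivities, uncertainties, and correlations only; it does not reproduce the actual Daya Bay values.

```python
import math

# Four fissioning isotopes, as in the abstract.
isotopes = ["U235", "U238", "Pu239", "Pu241"]

# Illustrative placeholders (NOT the Daya Bay numbers):
w = [1.92, 0.31, 1.45, 0.56]          # flux sensitivity to each fission fraction
sigma = [0.020, 0.004, 0.015, 0.006]  # fission-fraction uncertainties
rho = [                               # correlation coefficient matrix (symmetric)
    [ 1.0, -0.2, -0.8, -0.6],
    [-0.2,  1.0,  0.1,  0.1],
    [-0.8,  0.1,  1.0,  0.7],
    [-0.6,  0.1,  0.7,  1.0],
]

def flux_variance(w, sigma, rho):
    """Quadratic error propagation: sum_ij w_i w_j sigma_i sigma_j rho_ij."""
    n = len(w)
    return sum(w[i] * w[j] * sigma[i] * sigma[j] * rho[i][j]
               for i in range(n) for j in range(n))

sigma_flux = math.sqrt(flux_variance(w, sigma, rho))
print(f"flux uncertainty from fission fractions: {sigma_flux:.4f}")
```

Note how the negative U-Pu correlations (fission fractions sum to one, so one isotope's share grows at another's expense) partially cancel the diagonal terms, which is why the correlation matrix matters for the final flux error.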
The Geometry of Uncertainty in Moving Objects Databases
Wolfson, Ouri E.
Goce Trajcevski, Ouri Wolfson. Abstract: This work addresses the problem of querying moving objects databases ... capturing, constructing, and querying a trajectories database. We propose to model a trajectory as a 3D
Managing Uncertainty in Operational Control of Water Distribution Systems
Bargiela, Andrzej
A. Bargiela. Operation of water distribution systems requires a variety of decisions to be made. There is a distinction between system management regulations and operational decisions. The difference is in the possible
Polynomial regression with derivative information in nuclear reactor uncertainty quantification*
Anitescu, Mihai
parameters on the performance of a model of a sodium-cooled fast reactor. We construct a surrogate model as a goal-oriented projection onto an incomplete space.
Early Power Grid Verification Under Circuit Current Uncertainties
Najm, Farid N.
Imad A. Ferzli, Department of ECE. ABSTRACT: As power grid safety becomes increasingly important in modern integrated circuits, so does the need to start power grid verification early in the design cycle
Developments of the Price equation and natural selection under uncertainty
Grafen, Alan
success, following Darwin (1859). Here, this project is pursued by developing the Price equation, first ... to employ these approaches. A new theoretical development arising from the Price equation provides
AN OPTIMIZATION APPROACH TO UNCERTAINTY PROPAGATION IN BOUNDARY LOAD FLOW
Stankoviæ, Aleksandar
(e.g., SCADA) and network parameters. The method is potentially applicable to large-scale power systems ... possible errors in measurements (e.g., SCADA). Topological uncertainties are linked with the fidelity ... On the other hand, errors in network parameters and in SCADA measurements tend to be smaller in size
Plant-Wide Waste Management. 2. Decision Making under Uncertainty
Linninger, Andreas A.
Aninda Chakraborty and Andreas A. Linninger, University of Illinois at Chicago, Chicago, Illinois 60607. The synthesis and optimization of a plant-wide waste management flowsheet produces a superstructure that embeds all plant-wide waste management policies. In the subsequent
Assessing Uncertainty in Simulation Based Maritime Risk Assessment
van Dorp, Johan René
such as nuclear powered vessels [6], vessels transporting liquefied natural gas [7], and offshore oil and gas ... the decision-makers were unsure whether the evidence was sufficient to assess specific risks and benefits. The first ... techniques to propagate uncertainty throughout the analysis. The conclusions drawn in the original study
What makes information valuable: signal reliability and environmental uncertainty
Stephens, David W.
uncertainty in animal signal use. We developed a simple model that predicted when animals should switch ... Many aspects ... An example of a reliable signal is the carotenoid-based plumage coloration in male house finches
Resilience and Water Governance Addressing Fragmentation and Uncertainty in Water Allocation
Review Advisory Commission 1998). The water quality impairment is caused both by chemical pollution ... "waters" (CWA § 1251(a)). Diffuse sources of pollution resulting from land use activities, including agriculture ...
Coal supply and cost under technological and environmental uncertainty
Submitted in partial fulfillment ... chapters. My conversations with Kurt Walzer at Clean Air Task Force and Rory McIlmoil at Coal Valley Wind Technology Laboratory. I did not complete this work alone. I had a lot of help along the way. I would like
Quantification of nuclear uncertainties in nucleosynthesis of elements beyond Iron
T. Rauscher
2014-12-22T23:59:59.000Z
Nucleosynthesis beyond Fe poses additional challenges not encountered when studying astrophysical processes involving light nuclei. Generally higher temperatures and nuclear level densities lead to stronger contributions of transitions on excited target states. This may prevent cross section measurements from determining stellar reaction rates, and theory contributions remain important. Furthermore, measurements often are not feasible in the astrophysically relevant energy range. Sensitivity analysis not only allows one to determine the contributing nuclear properties but is also a handy tool for experimentalists to interpret the impact of their data on predicted cross sections and rates. It can also speed up future input variation studies of nucleosynthesis by simplifying an intermediate step in the full calculation sequence. Large-scale predictions of sensitivities and ground-state contributions to the stellar rates are presented, allowing an estimate of how well rates can be directly constrained by experiment. The reactions 185W(n,gamma) and 186W(gamma,n) are discussed as application examples. Studies of uncertainties in abundances predicted in nucleosynthesis simulations rely on the knowledge of reaction rate errors. An improved treatment of uncertainty analysis is presented, as well as a recipe for combining experimental data and theory to arrive at a new reaction rate and its uncertainty. As an example, it is applied to neutron capture rates for the s-process, leading to larger uncertainties than previously assumed.
ASME PTC 47, gasification combined cycle performance -- Uncertainty
Archer, D.H.; Horazak, D.A.; Bannister, R.L.
1998-07-01T23:59:59.000Z
Determining the uncertainty of measured and calculated performance parameters is required in all Performance Test Codes of the ASME. This determination begins with the equations defining the performance parameters of the equipment or system--input, useful output, and effectiveness (an input/output ratio). The variables in these equations are: plant operating conditions measured throughout a test, and corrections that compensate for deviations of other significant measured plant and ambient operating conditions from the values specified for the test. PTC 47, Gasification Combined Cycle Plant Performance, will define procedures for the performance testing of overall gasification combined cycle plants and for each of three major sections that may comprise such a plant. The Committee is now defining the performance parameters for these tests. Performance factor computations include uncertainty calculations in order to provide preliminary estimates of the accuracy expected from the test methods proposed in this Code. Uncertainty calculations will also be used to explore energy balance methods for calculating the energy input to various gasifiers--entrained flow, fluidized bed, and moving bed. Such methods would be important as possible alternatives to direct measurements of flows and heating values of the various fuels fed to the gasifiers. Uncertainty calculations will also be used to assist in identifying those measurements of ambient, imposed, and controlled operating conditions that significantly affect test results and for which correction factors should be determined.
A Genetic Programming Methodology for Missile Countermeasures Optimization Under Uncertainty
Fernandez, Thomas
The missile countermeasures optimization (MCO) problem has been the subject of intensive research [12], [6]. Aircraft maneuvers and the introduction of uncertainty make the MCO problem extremely difficult to solve ... states of the MCO system. The SAM's current state may be uncertain as a result of incomplete data
Phoenix: A Reactor Burnup Code With Uncertainty Quantification
Spence, Grant R
2014-12-15T23:59:59.000Z
Table of contents (excerpt): 3.1.3 Case 3: Xe-135 Type Isotopes; 3.2 Fuel Enrichment Perturbations & Linear Regression; 4. PHOENIX Theory and Operation; 4.2 Implementation of Uncertainty Quantification in PHOENIX; 4.3 Calculated Values; 4.3.1 Recoverable Energy per Fission
Complementary Space for Enhanced Uncertainty and Dynamics Visualization
Texas at Austin, University of
Chandrajit Bajaj, Andrew ... of complementary space, i.e. the space exterior to but still "near" the surface in question. (Figure caption: a primal space visualization of the first time step with the heme group identified.)
POWER SCHEDULING IN A HYDRO-THERMAL SYSTEM UNDER UNCERTAINTY
Römisch, Werner
C.C. Carøe, M.P. Nowak, W. Römisch. A model including (coal-burning) thermal units, pumped-storage hydro plants and delivery contracts is developed, and we describe an optimization model. For its computational solution, two different decomposition ...
Reducing Uncertainty: A Formal Theory of Organizations in Action 1
Amsterdam, University of
Jaap Kamps and László Pólos. This paper presents a formal reconstruction of James D. Thompson's classic contribution to organization theory ... a heretofore unknown implication of the theory. 1 Introduction: Thompson's Organizations in Action
Reducing Uncertainty: A Formal Theory of Organizations in Action1
Kamps, Jaap
Jaap Kamps and László Pólos ... contribution to organization theory, Organizations in Action. The reconstruction explicates the underlying ... this theory explains why Thompson's propositions do not hold for noncomplex or "atomic" organizations
Constructing Bayesian-Network Models of Software Testing and Maintenance Uncertainties
Ziv, Hadar
To appear. Keywords: software requirements, software testing, software maintenance. 1 Introduction. Development-to-market constraints ... In real-life software projects, testing is often done after coding is complete
Managing Uncertainty in Sound based Control for an Autonomous Helicopter
Hopgood, Adrian
Benjamin N. Passow, Mario ... research using a multi-purpose, small and low-cost autonomous helicopter platform (Flyper). We use a supervised method to localise the indoor helicopter and extract meaningful information to enable
Uncertainties and Ambiguities in Percentiles and how to Avoid Them
Schreiber, Michael
2012-01-01T23:59:59.000Z
The recently proposed fractional scoring scheme is used to attribute publications to percentile rank classes. It is shown that in this way uncertainties and ambiguities in the evaluation of percentile ranks do not occur. Using the fractional scoring the total score of all papers exactly reproduces the theoretical value.
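The boundary behavior this abstract alludes to can be made concrete with a small sketch. This is not Schreiber's exact algorithm, just a minimal illustration of the fractional-attribution idea: papers whose citation counts tie at the class boundary share the remaining quota equally, so the total score assigned to the top-p class matches p times the number of papers regardless of ties.

```python
def fractional_top_scores(citations, p=0.10):
    """Per-paper scores in [0, 1] for membership in the top-p percentile class.

    Papers are ranked by citation count; a group tied at the class boundary
    splits the leftover quota equally, so sum(scores) == p * len(citations)
    (up to float rounding) with no ambiguity from ties.
    """
    n = len(citations)
    quota = p * n                      # total credit available for the class
    order = sorted(range(n), key=lambda i: -citations[i])
    scores = [0.0] * n
    filled = 0.0
    i = 0
    while i < n and filled < quota:
        # find the group of papers tied at the same citation count
        j = i
        while j < n and citations[order[j]] == citations[order[i]]:
            j += 1
        tied = order[i:j]
        # each tied paper gets an equal share of whatever quota remains
        credit = min(len(tied), quota - filled) / len(tied)
        for k in tied:
            scores[k] = credit
        filled += credit * len(tied)
        i = j
    return scores

counts = [30, 25, 25, 25, 10, 8, 5, 3, 2, 0]   # ten papers, tie at the boundary
scores = fractional_top_scores(counts, p=0.20)
print(scores)       # top paper gets 1.0; the three tied papers get 1/3 each
print(sum(scores))  # p * n, up to float rounding
```

An integer-percentile assignment would have to put either all three or none of the tied papers in the top-20% class; the fractional scheme sidesteps that choice entirely.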
On quantifying the space of uncertainty of stochastic simulations
Froyland, Gary
Gary Froyland, School of Mining Engineering, The University ... March 15, 2011. Abstract: Distance computation as a technique to measure dissimilarity ... a common approach to assessing geological uncertainty, and one of the most common practical methods, can create multiple realizations that honor the original ...
Gerhard Strydom; Friederike Bostelmann; Kostadin Ivanov
2014-10-01T23:59:59.000Z
The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high-fidelity physics models and robust, efficient, and accurate codes. One way to address the uncertainties in the HTGR analysis tools is to assess the sensitivity of critical parameters (such as the calculated maximum fuel temperature during loss-of-coolant accidents) to a few important input uncertainties. The input parameters were identified by engineering judgement in the past but are today typically based on a Phenomena Identification and Ranking Table (PIRT) process. The input parameters can also be derived from sensitivity studies and are then varied in the analysis to find a spread in the parameter of importance. However, there is often no easy way to compensate for these uncertainties. In engineering system design, a common approach for addressing performance uncertainties is to add compensating margins to the system, but with passive properties credited it is not so clear how to apply this in the case of the modular HTGR heat removal path. Other, more sophisticated uncertainty modelling approaches, including Monte Carlo analysis, have also been proposed and applied. Ideally one wishes to apply a more fundamental approach to determine the predictive capability and accuracy of the coupled neutronics/thermal-hydraulics and depletion simulations used for reactor design and safety assessment. Today there is broader acceptance of uncertainty analysis even in safety studies, and in some cases regulators have accepted it as a replacement for the traditional conservative analysis. Therefore some safety analysis calculations may use a mixture of these approaches for different parameters, depending upon the particular requirements of the analysis problem involved.
Sensitivity analysis can, for example, be used to provide information as part of an uncertainty analysis to determine best-estimate-plus-uncertainty results to the required confidence level. In order to address uncertainty propagation in the analyses and methods of the HTGR community, the IAEA initiated a Coordinated Research Project (CRP) on HTGR Uncertainty Analysis in Modelling (UAM) [6] that officially started in 2013. Although this project focuses specifically on the peculiarities of HTGR designs and their simulation requirements, many lessons can be learned from the LWR community and the significant progress already made towards a consistent uncertainty analysis methodology. In the case of LWRs, the NRC amended 10 CFR 50.46 as early as 1988 to allow best-estimate (plus uncertainties) calculations of emergency core cooling system performance. The Nuclear Energy Agency (NEA) of the Organisation for Economic Co-operation and Development (OECD) also established an Expert Group on "Uncertainty Analysis in Modelling", which led to the definition of the "Benchmark for Uncertainty Analysis in Modelling (UAM) for Design, Operation and Safety Analysis of LWRs" [7]. The CRP on HTGR UAM will follow as far as possible the ongoing OECD Light Water Reactor UAM benchmark activity.
The Economic Slide Abates but Uncertainty Abounds
Manufacturing continued to be the leading edge of decline, losing 27,500 jobs. The state housed approximately 900,000 manufacturing jobs in 1969 (33 percent of total employment). The rate reached as high as 17.2 percent during the first quarter of 2002. In addition, Somerset County ...
Perko, Z. [Section Physics of Nuclear Reactors, Department of Radiation, Radionuclides and Reactors, TU Delft, Mekelweg 15, 2629 JB, Delft (Netherlands); Gilli, L.; Lathouwers, D.; Kloosterman, J. L. [Section Physics of Nuclear Reactors, Department of Radiation, Radionuclides and Reactors, Delft University of Technology, Mekelweg 15, 2629 JB, Delft (Netherlands)
2013-07-01T23:59:59.000Z
Uncertainty quantification plays an increasingly important role in the nuclear community, especially with the rise of Best Estimate Plus Uncertainty methodologies. Sensitivity analysis, surrogate models, Monte Carlo sampling and several other techniques can be used to propagate input uncertainties. In recent years however polynomial chaos expansion has become a popular alternative providing high accuracy at affordable computational cost. This paper presents such polynomial chaos (PC) methods using adaptive sparse grids and adaptive basis set construction, together with an application to a Gas Cooled Fast Reactor transient. Comparison is made between a new sparse grid algorithm and the traditionally used technique proposed by Gerstner. An adaptive basis construction method is also introduced and is proved to be advantageous both from an accuracy and a computational point of view. As a demonstration the uncertainty quantification of a 50% loss of flow transient in the GFR2400 Gas Cooled Fast Reactor design was performed using the CATHARE code system. The results are compared to direct Monte Carlo sampling and show the superior convergence and high accuracy of the polynomial chaos expansion. Since PC techniques are easy to implement, they can offer an attractive alternative to traditional techniques for the uncertainty quantification of large scale problems. (authors)
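The projection step behind a polynomial chaos expansion can be illustrated with a minimal sketch (hypothetical code, not the adaptive sparse-grid/CATHARE machinery of the paper): for a scalar model of one standard normal input, the PC coefficients are projections onto probabilists' Hermite polynomials, and the output mean and variance follow directly from the coefficients.

```python
import math
import random

def hermite_e(k, x):
    """Probabilists' Hermite polynomial He_k(x): He_{n+1} = x*He_n - n*He_{n-1}."""
    if k == 0:
        return 1.0
    h_prev, h = 1.0, x
    for n in range(1, k):
        h_prev, h = h, x * h - n * h_prev
    return h

def pce_project(f, order, n=100_000, seed=1):
    """Non-intrusive PCE by sampling: c_k = E[f(X) He_k(X)] / k! for X ~ N(0,1),
    using the orthogonality E[He_j He_k] = k! * delta_jk."""
    rng = random.Random(seed)
    xs = [rng.gauss(0.0, 1.0) for _ in range(n)]
    fx = [f(x) for x in xs]
    return [sum(fi * hermite_e(k, x) for fi, x in zip(fx, xs)) / n
            / math.factorial(k) for k in range(order + 1)]

# toy model with known statistics: f(x) = exp(a*x) has mean exp(a^2/2)
a = 0.3
c = pce_project(lambda x: math.exp(a * x), order=4)
mean_pce = c[0]
var_pce = sum(c[k] ** 2 * math.factorial(k) for k in range(1, 5))
```

For this toy model the exact mean is exp(a^2/2) and the exact variance is exp(a^2)(exp(a^2) - 1), so the truncated expansion can be checked directly; the sampled projection plays the role that quadrature on a sparse grid plays in the paper.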
River meander modeling and confronting uncertainty.
Posner, Ari J. (University of Arizona Tucson, AZ)
2011-05-01T23:59:59.000Z
This study examines the meandering phenomenon as it occurs in media throughout terrestrial, glacial, atmospheric, and aquatic environments. Analysis of the minimum energy principle, along with theories of Coriolis forces and random walks proposed to explain the meandering phenomenon, found that these theories apply at different temporal and spatial scales. Coriolis forces might induce topological changes resulting in meandering planforms, while the minimum energy principle might explain how these forces combine to limit sinuosity to the depth and width ratios that are common throughout various media. The study then compares the first-order analytical solutions for the flow field by Ikeda et al. (1981) and Johannesson and Parker (1989b). Ikeda et al.'s linear bank erosion model was implemented to predict the rate of bank erosion, in which the bank erosion coefficient is treated as a stochastic variable that varies with physical properties of the bank (e.g., cohesiveness, stratigraphy, or vegetation density). The developed model was used to predict the evolution of meandering planforms, and the modeling results were analyzed and compared to the observed data. Since the migration of a meandering channel consists of downstream translation, lateral expansion, and downstream or upstream rotation, several measures are formulated to determine which of the resulting planforms is closest to the experimentally measured one. Results from the deterministic model depend strongly on the calibrated erosion coefficient. Since field measurements are always limited, the stochastic model yielded more realistic predictions of meandering planform evolution: due to the random nature of the bank erosion coefficient, the evolution is a stochastic process that can only be accurately predicted by a stochastic model.
Haihua Zhao; Vincent A. Mousseau; Nam T. Dinh
2010-10-01T23:59:59.000Z
The Code Scaling, Applicability, and Uncertainty (CSAU) methodology was developed in the late 1980s by the US NRC to systematically quantify reactor simulation uncertainty. Based on the CSAU methodology, Best Estimate Plus Uncertainty (BEPU) methods have been developed and widely used for new reactor designs and power uprates of existing LWRs. In spite of these successes, several aspects of CSAU have been criticized: (1) subjective judgement in the PIRT process; (2) high cost, due to heavy reliance on a large experimental database, many expert man-years of work, and very high computational overhead; (3) mixing of numerical errors with other uncertainties; (4) grid dependence, and use of the same numerical grids for both scaled experiments and real plant applications; (5) user effects. Although a large amount of effort has gone into improving the CSAU methodology, the above issues still exist. With the effort to develop next-generation safety analysis codes, new opportunities appear to take advantage of new numerical methods, better physical models, and modern uncertainty quantification methods. Forward sensitivity analysis (FSA) directly solves the PDEs for parameter sensitivities (defined as the derivative of the physical solution with respect to any constant parameter). When the parameter sensitivities are available in a new advanced system analysis code, CSAU could be significantly improved: (1) quantifying numerical errors: new codes that are fully implicit and of higher-order accuracy can run much faster, with numerical errors quantified by FSA; (2) quantitative PIRT (Q-PIRT) to reduce subjective judgement and improve efficiency: treat numerical errors as special sensitivities alongside other physical uncertainties, and consider only parameters whose uncertainties have large effects on the design criteria.
(3) Greatly reducing the computational cost of uncertainty quantification by (a) choosing optimized time steps and spatial sizes and (b) using gradient information (sensitivity results) to reduce the number of samples. (4) Allowing grid independence for scaled integral effect test (IET) simulations and real plant applications, which would (a) eliminate numerical uncertainty in scaling, (b) reduce experimental cost by allowing smaller scaled IETs, and (c) eliminate user effects. This paper reviews the issues related to the current CSAU, introduces FSA, discusses a potential Q-PIRT process, and shows simple examples of performing FSA. Finally, the general research direction and the requirements for using FSA in a system analysis code are discussed.
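The idea of solving for parameter sensitivities alongside the solution can be sketched on a scalar ODE (an illustrative toy, not the paper's system code): differentiating dy/dt = -k*y with respect to k gives an auxiliary equation for the sensitivity s = dy/dk, which is integrated together with y.

```python
import math

def forward_sensitivity(k=0.5, y0=1.0, t_end=2.0, n=20_000):
    """Forward sensitivity analysis sketch: integrate dy/dt = -k*y together
    with s = dy/dk, obtained by differentiating the ODE in k:
        ds/dt = -y - k*s,  s(0) = 0  (since y0 does not depend on k)."""
    dt = t_end / n
    y, s = y0, 0.0
    for _ in range(n):
        dy, ds = -k * y, -y - k * s
        y, s = y + dt * dy, s + dt * ds   # explicit Euler, for clarity only
    return y, s

y, s = forward_sensitivity()
# analytic solution: y = y0*exp(-k*t), dy/dk = -t*y0*exp(-k*t)
```

The sensitivity s comes out of a single forward integration, which is the efficiency argument made for FSA above: no finite-difference rerun of the code per parameter is needed.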
Donald M. McEligot; Hugh M. McIlroy, Jr.; Ryan C. Johnson
2007-11-01T23:59:59.000Z
The purpose of the fluid dynamics experiments in the MIR (Matched-Index-of-Refraction) flow system at Idaho National Laboratory (INL) is to develop benchmark databases for the assessment of Computational Fluid Dynamics (CFD) solutions of the momentum equations, scalar mixing, and turbulence models for typical Very High Temperature Reactor (VHTR) plenum geometries in the limiting case of negligible buoyancy and constant fluid properties. The experiments use optical techniques, primarily particle image velocimetry (PIV) in the INL MIR flow system. The benefit of the MIR technique is that it permits optical measurements to determine flow characteristics in passages and around objects to be obtained without locating a disturbing transducer in the flow field and without distortion of the optical paths. The objective of the present report is to develop understanding of the magnitudes of experimental uncertainties in the results to be obtained in such experiments. Unheated MIR experiments are first steps when the geometry is complicated. One does not want to use a computational technique, which will not even handle constant properties properly. This report addresses the general background, requirements for benchmark databases, estimation of experimental uncertainties in mean velocities and turbulence quantities, the MIR experiment, PIV uncertainties, positioning uncertainties, and other contributing measurement uncertainties.
Generalized Uncertainty Relations and Long Time Limits for Quantum Brownian Motion Models
C. Anastopoulos; J. J. Halliwell
1994-07-27T23:59:59.000Z
We study the time evolution of the reduced Wigner function for a class of quantum Brownian motion models. We derive two generalized uncertainty relations. The first consists of a sharp lower bound on the uncertainty function, $U = (\\Delta p)^2 (\\Delta q)^2 $, after evolution for time $t$ in the presence of an environment. The second, a stronger and simpler result, consists of a lower bound at time $t$ on a modified uncertainty function, essentially the area enclosed by the $1-\\sigma$ contour of the Wigner function. In both cases the minimizing initial state is a non-minimal Gaussian pure state. These generalized uncertainty relations supply a measure of the comparative size of quantum and thermal fluctuations. We prove two simple inequalities, relating uncertainty to von Neumann entropy, and the von Neumann entropy to linear entropy. We also prove some results on the long-time limit of the Wigner function for arbitrary initial states. For the harmonic oscillator the Wigner function for all initial states becomes a Gaussian at large times (often, but not always, a thermal state). We derive the explicit forms of the long-time limit for the free particle (which does not in general go to a Gaussian), and also for more general potentials in the approximation of high temperature.
Uncertainty and sensitivity analysis of fission gas behavior in engineering-scale fuel modeling
G. Pastore; L.P. Swiler; J.D. Hales; S.R. Novascone; D.M. Perez; B.W. Spencer; L. Luzzi; P. Van Uffelen; R.L. Williamson
2014-10-01T23:59:59.000Z
The role of uncertainties in fission gas behavior calculations as part of engineering-scale nuclear fuel modeling is investigated using the BISON fuel performance code and a recently implemented physics-based model for the coupled fission gas release and swelling. Through the integration of BISON with the DAKOTA software, a sensitivity analysis of the results to selected model parameters is carried out based on UO2 single-pellet simulations covering different power regimes. The parameters are varied within ranges representative of the relative uncertainties and consistent with the information from the open literature. The study leads to an initial quantitative assessment of the uncertainty in fission gas behavior modeling with the parameter characterization presently available. Also, the relative importance of the single parameters is evaluated. Moreover, a sensitivity analysis is carried out based on simulations of a fuel rod irradiation experiment, pointing out a significant impact of the considered uncertainties on the calculated fission gas release and cladding diametral strain. The results of the study indicate that the commonly accepted deviation between calculated and measured fission gas release by a factor of 2 approximately corresponds to the inherent modeling uncertainty at high fission gas release. Nevertheless, higher deviations may be expected for values around 10% and lower. Implications are discussed in terms of directions of research for the improved modeling of fission gas behavior for engineering purposes.
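The parameter-variation step described above can be sketched as a simple one-at-a-time scan (hypothetical code; BISON/DAKOTA use more sophisticated sampling): each parameter is pushed to the ends of its relative-uncertainty range while the others stay at nominal, and the induced output swing ranks the parameters by importance.

```python
def oat_swings(model, nominal, rel_unc):
    """One-at-a-time sensitivity sketch: perturb each parameter to the ends of
    its relative-uncertainty range and record the swing it induces in the
    model output; larger swing = more important parameter."""
    swings = {}
    for name, u in rel_unc.items():
        lo = dict(nominal, **{name: nominal[name] * (1.0 - u)})
        hi = dict(nominal, **{name: nominal[name] * (1.0 + u)})
        swings[name] = model(hi) - model(lo)
    return swings

# toy response standing in for a fuel-performance output: proportional to a*b
swings = oat_swings(lambda p: p["a"] * p["b"],
                    {"a": 2.0, "b": 3.0},
                    {"a": 0.1, "b": 0.5})   # b has the larger relative uncertainty
```

Here parameter b dominates purely because its uncertainty range is wider, which mirrors the abstract's point that importance depends on the parameter characterization, not only on the model's local sensitivity.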
Uncertainty quantification for CO2 sequestration and enhanced oil recovery
Dai, Zhenxue; Fessenden-Rahn, Julianna; Middleton, Richard; Pan, Feng; Jia, Wei; Lee, Si-Yong; McPherson, Brian; Ampomah, William; Grigg, Reid
2014-01-01T23:59:59.000Z
This study develops a statistical method to perform uncertainty quantification for understanding CO2 storage potential within an enhanced oil recovery (EOR) environment at the Farnsworth Unit of the Anadarko Basin in northern Texas. A set of geostatistical-based Monte Carlo simulations of CO2-oil-water flow and reactive transport in the Morrow formation are conducted for global sensitivity and statistical analysis of the major uncertainty metrics: net CO2 injection, cumulative oil production, cumulative gas (CH4) production, and net water injection. A global sensitivity and response surface analysis indicates that reservoir permeability, porosity, and thickness are the major intrinsic reservoir parameters that control net CO2 injection/storage and oil/gas recovery rates. The well spacing and the initial water saturation also have large impact on the oil/gas recovery rates. Further, this study has revealed key insights into the potential behavior and the operational parameters of CO2 sequestration at CO2-EOR s...
Cumulative theoretical uncertainties in lithium depletion boundary age
Tognelli, Emanuele; Degl'Innocenti, Scilla
2015-01-01T23:59:59.000Z
We performed a detailed analysis of the main theoretical uncertainties affecting the age at the lithium depletion boundary (LDB). To do that we computed almost 12000 pre-main sequence models with mass in the range [0.06, 0.4] M_sun by varying input physics (nuclear reaction cross-sections, plasma electron screening, outer boundary conditions, equation of state, and radiative opacity), initial chemical elements abundances (total metallicity, helium and deuterium abundances, and heavy elements mixture), and convection efficiency (mixing length parameter, alpha_ML). As a first step, we studied the effect of varying these quantities individually within their extreme values. Then, we analysed the impact of simultaneously perturbing the main input/parameters without an a priori assumption of independence. Such an approach allowed us to build for the first time the cumulative error stripe, which defines the edges of the maximum uncertainty region in the theoretical LDB age. We found that the cumulative error stripe ...
Reduction in maximum time uncertainty of paired time signals
Theodosiou, George E. (West Chicago, IL); Dawson, John W. (Clarendon Hills, IL)
1983-01-01T23:59:59.000Z
Reduction in the maximum time uncertainty (t_max - t_min) of a series of paired time signals t_1 and t_2, varying between two input terminals and representative of a series of single events where t_1 <= t_2 and t_1 + t_2 equals a constant, is carried out with a circuit utilizing a combination of OR and AND gates as signal-selecting means and one or more time delays to increase the minimum value (t_min) of the first signal t_1 closer to t_max and thereby reduce the difference. The circuit may utilize a plurality of stages to reduce the uncertainty by factors of 20-800.
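A toy numeric model (an idealization for illustration, not the patented circuit) shows how gate selection plus a fixed delay can shrink the spread of the first signal when t_1 + t_2 is constant: an OR gate fires on the earliest edge (min), an AND gate on the latest (max), and delaying the earlier edge before reselecting compresses the range.

```python
def stage(t1, t2, d):
    """Idealized selecting stage: delay the earlier edge by d, then reselect
    with an OR gate (fires on the earliest edge -> min) and an AND gate
    (fires on the latest edge -> max)."""
    a, b = min(t1, t2) + d, max(t1, t2)
    return min(a, b), max(a, b)

# pairs satisfy t1 + t2 = C, with the first signal ranging over [t_min, C/2]
C, t_min, d = 10.0, 2.0, 3.0          # illustrative values, not from the patent
firsts = [t_min + i * (C / 2 - t_min) / 1000 for i in range(1001)]
after = [stage(f, C - f, d)[0] for f in firsts]
spread_before = max(firsts) - min(firsts)   # 3.0
spread_after = max(after) - min(after)      # 1.5: spread halved by one stage
```

Cascading such stages compounds the compression, which is consistent with the abstract's claim that a plurality of stages yields large reduction factors.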
Reduction in maximum time uncertainty of paired time signals
Theodosiou, G.E.; Dawson, J.W.
1983-10-04T23:59:59.000Z
Reduction in the maximum time uncertainty (t_max - t_min) of a series of paired time signals t_1 and t_2, varying between two input terminals and representative of a series of single events where t_1 <= t_2 and t_1 + t_2 equals a constant, is carried out with a circuit utilizing a combination of OR and AND gates as signal-selecting means and one or more time delays to increase the minimum value (t_min) of the first signal t_1 closer to t_max and thereby reduce the difference. The circuit may utilize a plurality of stages to reduce the uncertainty by factors of 20-800. 6 figs.
Reduction in maximum time uncertainty of paired time signals
Theodosiou, G.E.; Dawson, J.W.
1981-02-11T23:59:59.000Z
Reduction in the maximum time uncertainty (t_max - t_min) of a series of paired time signals t_1 and t_2, varying between two input terminals and representative of a series of single events where t_1 <= t_2 and t_1 + t_2 equals a constant, is carried out with a circuit utilizing a combination of OR and AND gates as signal-selecting means and one or more time delays to increase the minimum value (t_min) of the first signal t_1 closer to t_max and thereby reduce the difference. The circuit may utilize a plurality of stages to reduce the uncertainty by factors of 20 to 800.
Systematic Uncertainties in the Analysis of the Reactor Neutrino Anomaly
A. C. Hayes; J. L. Friar; G. T. Garvey; G. Jungman; Guy Jonkmans
2014-04-05T23:59:59.000Z
We examine uncertainties in the analysis of the reactor neutrino anomaly, wherein it is suggested that only about 94% of the emitted antineutrino flux was detected in short baseline experiments. We find that the form of the corrections that lead to the anomaly is very uncertain for the 30% of the flux that arises from forbidden decays. This uncertainty was estimated in four ways, is as large as the size of the anomaly, and is unlikely to be reduced without accurate direct measurements of the antineutrino flux. Given the present lack of detailed knowledge of the structure of the forbidden transitions, it is not possible to convert the measured aggregate fission beta spectra to antineutrino spectra to the accuracy needed to infer an anomaly. Neutrino physics conclusions based on the original anomaly need to be revisited, as do oscillation analyses that assumed that the antineutrino flux is known to better than approximately 4%.
Computer-aided-design of agricultural drains under uncertainty
Garcia, Luis Alfredo
1985-01-01T23:59:59.000Z
The model solves for the drain depth, D, and spacing, L, with D, L >= 0 (Eq. 4.5). This mathematical programming problem (MPP) did not include uncertainty; in the next section uncertainty is incorporated into the MPP. The full deterministic MPP minimizes total cost (C1 + C2 + C3 + C4) subject to drainage constraints relating D, L, the equivalent depth d', and the drain radius r, with separate equivalent-depth expressions for 0 < d/L <= 0.31 and d/L > 0.31, together with water-table and depth constraints involving NDWT and ND, and D, L >= 0. An example of drainage design using the deterministic MPP is then shown.
Rivington, Michael
2011-06-28T23:59:59.000Z
This thesis explored a range of approaches to study the uncertainty and impacts associated with climate change at the farm scale in Scotland. The research objective was to use a process of uncertainty evaluation and simulation modelling to provide...
Application of price uncertainty quantification models and their impacts on project evaluations
Fariyibi, Festus Lekan
2006-10-30T23:59:59.000Z
This study presents an analysis of several recently published methods for quantifying the uncertainty in economic evaluations due to uncertainty in future oil prices. Conventional price forecasting methods used in the ...
Holmes, Christopher D.
Reactive greenhouse gas scenarios: systematic exploration of uncertainties and the role of atmospheric chemistry. Understanding the chemistry of reactive greenhouse gases is needed to accurately quantify the relationship between human activities and climate, and to incorporate uncertainty in our projections of greenhouse gas abundances.
Hubbard, Susan
Calibration of a Distributed Flood Forecasting Model with Input Uncertainty Using a Bayesian Approach. Berkeley, CA, United States. In the process of calibrating distributed hydrological models, accounting for input uncertainty matters both in calibrating the GBHM parameters and in estimating their associated uncertainty. Calibration ignoring input ...
Dealing with parameter uncertainty in the calculation of water surface profiles
Vargas-Cruz, Ruben F.
1998-01-01T23:59:59.000Z
Hydrologic and hydraulic variables common to many water resources engineering problems are known to contain considerable uncertainty. The description of the underlying uncertainty of these parameters is extremely important...
Continuous reservoir simulation incorporating uncertainty quantification and real-time data
Holmes, Jay Cuthbert
2009-05-15T23:59:59.000Z
A significant body of work has demonstrated both the promise and difficulty of quantifying uncertainty in reservoir simulation forecasts. It is generally accepted that accurate and complete quantification of uncertainty should lead to better...
Effects of Data Uncertainties on Estimated Soil Organic Carbon in the Sudan
Ardö, Jonas
Effects of Data Uncertainties on Estimated Soil Organic Carbon in the Sudan, by Jean-Nicolas Poussart. The study area was situated in central Sudan and dominated by subsistence agroecosystems. Uncertainties in the modeling ...
A decomposition approach for commodity pickup and delivery with time-windows under uncertainty
Marla, Lavanya
We consider a special class of large-scale, network-based, resource allocation problems under uncertainty, namely that of multi-commodity flows with time-windows under uncertainty. In this class, we focus on problems ...
RIVERWARE'S INTEGRATED MODELING AND ANALYSIS TOOLS FOR LONG-TERM PLANNING UNDER UNCERTAINTY
Long-term planning of river and reservoir operations under hydrologic uncertainty benefits from modeling capabilities that include (1) a river and reservoir modeling tool that can represent various planning alternatives and easily run ...
Monte Carlo Methods for Uncertainty Quantification Mathematical Institute, University of Oxford
Giles, Mike
Mike Giles, Mathematical Institute, University of Oxford. ERCOFTAC course on Mathematical Methods and Tools in Uncertainty Management and Quantification. Lecture 1: Introduction and Monte Carlo basics; topics include model applications, random number generation, and Monte Carlo estimation.
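The "Monte Carlo basics" of the lecture reduce to an estimator plus a standard error; a minimal sketch (illustrative code, not the course material):

```python
import math
import random

def mc_estimate(f, draw, n=100_000, seed=0):
    """Plain Monte Carlo estimation of E[f(X)]: the sample mean is the
    estimate, and sqrt(sample_var / n) is its standard error, the basic
    uncertainty statement attached to any Monte Carlo result."""
    rng = random.Random(seed)
    ys = [f(draw(rng)) for _ in range(n)]
    mean = sum(ys) / n
    var = sum((y - mean) ** 2 for y in ys) / (n - 1)
    return mean, math.sqrt(var / n)

# toy quantity with a known answer: E[exp(X)] = exp(1/2) for X ~ N(0,1)
est, se = mc_estimate(math.exp, lambda rng: rng.gauss(0.0, 1.0))
# a 95% confidence interval is approximately est +/- 1.96 * se
```

The O(n^{-1/2}) decay of the standard error is the convergence rate that motivates the variance-reduction and multilevel techniques treated later in such courses.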
Using Cross-Section Uncertainty Data to Estimate Biases
Mueller, Don [ORNL; Rearden, Bradley T [ORNL
2008-01-01T23:59:59.000Z
Ideally, computational method validation is performed by modeling critical experiments that are very similar, neutronically, to the model used in the safety analysis. Similar, in this context, means that the neutron multiplication factors (k{sub eff}) of the safety analysis model and critical experiment model are affected in the same way to the same degree by variations (or errors) in the same nuclear data. Where similarity is demonstrated, the computational bias calculated using the critical experiment model results is 'applicable' to the safety analysis model. Unfortunately, criticality safety analysts occasionally find that the safety analysis models include some feature or material for which adequately similar well-defined critical experiments do not exist to support validation. For example, the analyst may want to take credit for the presence of fission products in spent nuclear fuel. In such cases, analysts sometimes rely on 'expert judgment' to assign an additional administrative margin to compensate for the validation weakness or to conclude that the impact on the calculated bias and bias uncertainty is negligible. Due to advances in computer programs and the evolution of cross-section uncertainty data, analysts can use the sensitivity and uncertainty analyses tools implemented in the SCALE TSUNAMI codes to estimate the potential impact on the application-specific bias and bias uncertainty resulting from nuclides that are under-represented or not present in the critical experiments. This paper discusses the method, computer codes, and data used to estimate the potential contribution toward the computational bias of individual nuclides. The results from application of the method to fission products in a burnup credit model are presented.
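The similarity assessment described above can be sketched as a sensitivity-weighted correlation (written in the spirit of TSUNAMI's c_k index; the actual SCALE implementation and data formats differ): two systems are similar when their sensitivity vectors pick up the same nuclear-data covariances.

```python
import math

def similarity_index(s_a, s_b, cov):
    """Correlation between two systems' data-induced k_eff uncertainties:
    c = (S_a C S_b) / sqrt((S_a C S_a) (S_b C S_b)), where S are sensitivity
    vectors over nuclide/reaction pairs and C is the data covariance matrix."""
    def quad(u, v):
        return sum(u[i] * cov[i][j] * v[j]
                   for i in range(len(u)) for j in range(len(v)))
    return quad(s_a, s_b) / math.sqrt(quad(s_a, s_a) * quad(s_b, s_b))

cov = [[1.0, 0.0], [0.0, 1.0]]   # toy uncorrelated data covariance (2 data)
same = similarity_index([0.4, 0.1], [0.4, 0.1], cov)   # identical sensitivities
diff = similarity_index([1.0, 0.0], [0.0, 1.0], cov)   # no shared uncertainty
```

An index near 1 indicates the benchmark stresses the same data as the application, which is the quantitative meaning of "similar, neutronically" in the abstract.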
Climate dynamics and fluid mechanics: Natural variability and related uncertainties
Michael Ghil; Mickaël D. Chekroun; Eric Simonnet
2010-06-15T23:59:59.000Z
The purpose of this review-and-research paper is twofold: (i) to review the role played in climate dynamics by fluid-dynamical models; and (ii) to contribute to the understanding and reduction of the uncertainties in future climate-change projections. To illustrate the first point, we focus on the large-scale, wind-driven flow of the mid-latitude oceans, which contributes in a crucial way to Earth's climate, and to changes therein. We study the low-frequency variability (LFV) of the wind-driven, double-gyre circulation in mid-latitude ocean basins, via the bifurcation sequence that leads from steady states through periodic solutions and on to the chaotic, irregular flows documented in the observations. This sequence involves local, pitchfork and Hopf bifurcations, as well as global, homoclinic ones. The natural climate variability induced by the LFV of the ocean circulation is but one of the causes of uncertainties in climate projections. Another major cause of such uncertainties could reside in the structural instability, in the topological sense, of the equations governing climate dynamics, including but not restricted to those of atmospheric and ocean dynamics. We propose a novel approach to understand, and possibly reduce, these uncertainties, based on the concepts and methods of random dynamical systems theory. As a very first step, we study the effect of noise on the topological classes of the Arnol'd family of circle maps, a paradigmatic model of frequency locking as occurring in the nonlinear interactions between the El Niño-Southern Oscillation (ENSO) and the seasonal cycle. It is shown that the maps' fine-grained resonant landscape is smoothed by the noise, thus permitting their coarse-grained classification. This result is consistent with the stabilizing effects of stochastic parametrization obtained in modeling the ENSO phenomenon via general circulation models.
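The circle-map experiment can be reproduced in a few lines (a sketch using the standard Arnol'd map; parameter values here are illustrative, not the paper's):

```python
import math
import random

def rotation_number(omega, K, sigma=0.0, n=20_000, seed=2):
    """Noisy Arnol'd circle map on the lift:
        x_{n+1} = x_n + omega - (K / 2*pi) * sin(2*pi*x_n) + sigma * xi_n.
    The mean increment over a long orbit estimates the rotation number, whose
    locking plateaus form the resonant landscape that noise smooths."""
    rng = random.Random(seed)
    x = 0.0
    for _ in range(n):
        x += (omega - K / (2 * math.pi) * math.sin(2 * math.pi * x)
              + sigma * rng.gauss(0.0, 1.0))
    return x / n

unlocked = rotation_number(0.3, 0.0)   # no coupling: rotation number = omega
locked = rotation_number(0.01, 0.5)    # inside the 0/1 tongue: locks near 0
```

Sweeping omega with sigma = 0 traces the devil's staircase of locking plateaus; increasing sigma blurs the narrow tongues, which is the smoothing effect the abstract describes.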
Methodologies for Estimating Building Energy Savings Uncertainty: Review and Comparison
Baltazar, J.C.; Sun, Y.; Claridge, D.
2014-01-01T23:59:59.000Z
Presented at the 14th International Conference for Enhanced Building Operations, Tsinghua University, Beijing, China, September 14-17, 2014 (ESL-IC-14-09-11a). Authors: Juan-Carlos Baltazar, PhD, PE; Yifu Sun, EIT; David Claridge, PhD, PE.
Optimal Reservoir Management and Well Placement Under Geologic Uncertainty
Taware, Satyajit Vijay
2012-10-19T23:59:59.000Z
Doctor of Philosophy dissertation, August 2012. Major subject: Petroleum Engineering. Satyajit Vijay Taware, B.E., Pune University, India; M.E., Pune University, India. Co-chairs of committee: Akhil Datta-Gupta and Michael King; committee members: Ding Zhu and Wolfgang Bangerth; head of department: Dan Hill.
New agegraphic dark energy model with generalized uncertainty principle
Yong-Wan Kim; Hyung Won Lee; Yun Soo Myung; Mu-In Park
2008-08-07T23:59:59.000Z
We investigate the new agegraphic dark energy models with the generalized uncertainty principle (GUP). It turns out that although the GUP affects the early universe, it does not change the current and future dark energy-dominated universe significantly. Furthermore, this model could describe the matter-dominated universe in the past only when the parameter $n$ is chosen to be $n>n_c$, where the critical value is determined to be $n_c=2.799531478$.
The Method of Manufactured Universes for Testing Uncertainty Quantification Methods
Stripling, Hayes Franklin
2011-02-22T23:59:59.000Z
Nomenclature: MARS, Multivariate Adaptive Regression Splines; GP, Gaussian process; GPMSA, Gaussian Process Model for Simulation Analysis; LANL, Los Alamos National Laboratory; MMU, Method of Manufactured Universes; PS&E, predictive science and engineering; QOIs, quantities of interest; UQ, uncertainty quantification. Two methods are exercised: first, the Gaussian Process Model for Simulation Analysis (GPMSA) developed at Los Alamos National Laboratory (LANL), and second, a Bayesian Multivariate Adaptive Regression Spline (BMARS) technique combined with a filtering/weighting method. The conclusion drawn from these results is that MMU is a powerful technique that can help identify problems in UQ software...
Sampling-based Uncertainty Quantification in Deconvolution of X-ray Radiographs
Howard, M. [NSTec; Luttman, A. [NSTec; Fowler, M. [NSTec
2014-11-01T23:59:59.000Z
In imaging applications that focus on quantitative analysis, such as X-ray radiography in the security sciences, it is necessary to be able to reliably estimate the uncertainties in the processing algorithms applied to the image data, and deconvolving the system blur out of the image is usually an essential step. In this work we solve the deconvolution problem within a Bayesian framework for edge-enhancing reconstruction with uncertainty quantification. The likelihood is a normal approximation to the Poisson likelihood, and the prior is generated from a classical total variation regularized Poisson deconvolution. Samples from the corresponding posterior distribution are computed using a Markov chain Monte Carlo approach, giving a pointwise measure of uncertainty in the final, deconvolved signal. We demonstrate the results on real data used to calibrate a high-energy X-ray source and show that this approach gives reconstructions as good as classical regularization methods, while mitigating many of their drawbacks.
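The sampling approach described in this abstract can be sketched in miniature: a synthetic 1D blurred signal is deconvolved by random-walk Metropolis sampling of a Bayesian posterior, and the sample spread gives a pointwise uncertainty estimate. This sketch uses a quadratic smoothness prior for simplicity rather than the paper's total-variation-based prior, and all data, kernel widths, and weights below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic 1D "radiograph": a step signal blurred by a Gaussian kernel plus noise.
n = 30
x_true = np.where(np.arange(n) < n // 2, 1.0, 3.0)
offsets = np.arange(-3, 4)
kern = np.exp(-0.5 * (offsets / 1.5) ** 2)
kern /= kern.sum()
K = np.zeros((n, n))
for i in range(n):
    for o, w in zip(offsets, kern):
        if 0 <= i + o < n:
            K[i, i + o] = w
sigma = 0.05
y = K @ x_true + rng.normal(0.0, sigma, n)

def log_post(x):
    # Gaussian likelihood + quadratic smoothness prior
    # (a simple stand-in for the paper's total-variation-based prior).
    misfit = np.sum((K @ x - y) ** 2) / (2.0 * sigma ** 2)
    smooth = 5.0 * np.sum(np.diff(x) ** 2)
    return -(misfit + smooth)

# Start the chain at the regularized (MAP) solution of the quadratic posterior,
# then run random-walk Metropolis to draw posterior samples.
D = np.diff(np.eye(n), axis=0)                    # first-difference operator
x = np.linalg.solve(K.T @ K / sigma ** 2 + 10.0 * D.T @ D, K.T @ y / sigma ** 2)
lp = log_post(x)
samples = []
for it in range(20000):
    prop = x + rng.normal(0.0, 0.02, n)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:       # Metropolis accept/reject
        x, lp = prop, lp_prop
    if it >= 10000 and it % 10 == 0:              # discard burn-in, then thin
        samples.append(x.copy())

samples = np.array(samples)
post_mean = samples.mean(axis=0)   # deconvolved signal estimate
post_std = samples.std(axis=0)     # pointwise uncertainty in the reconstruction
```

The posterior standard deviation plays the role of the paper's pointwise uncertainty measure: locations near the step typically show larger spread than the flat regions.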
Ideas underlying quantification of margins and uncertainties(QMU): a white paper.
Helton, Jon Craig; Trucano, Timothy Guy; Pilch, Martin M.
2006-09-01T23:59:59.000Z
This report describes key ideas underlying the application of Quantification of Margins and Uncertainties (QMU) to nuclear weapons stockpile lifecycle decisions at Sandia National Laboratories. While QMU is a broad process and methodology for generating critical technical information to be used in stockpile management, this paper emphasizes one component, which is information produced by computational modeling and simulation. In particular, we discuss the key principles of developing QMU information in the form of Best Estimate Plus Uncertainty, the need to separate aleatory and epistemic uncertainty in QMU, and the risk-informed decision making that is best suited for decisive application of QMU. The paper is written at a high level, but provides a systematic bibliography of useful papers for the interested reader to deepen their understanding of these ideas.
Trapped Between Two Tails: Trading Off Scientific Uncertainties via Climate Targets
Lemoine, Derek M.; McJeon, Haewon C.
2013-08-20T23:59:59.000Z
Climate change policies must trade off uncertainties about future warming, about the social and ecological impacts of warming, and about the cost of reducing greenhouse gas emissions. We show that laxer carbon targets produce broader distributions for climate damages, skewed towards severe outcomes. However, if potential low-carbon technologies fill overlapping niches, then more stringent carbon targets produce broader distributions for the cost of reducing emissions, skewed towards high-cost outcomes. We use the technology-rich GCAM integrated assessment model to assess the robustness of 450 ppm and 500 ppm carbon targets to each uncertain factor. The 500 ppm target provides net benefits across a broad range of futures. The 450 ppm target provides net benefits only when impacts are greater than conventionally assumed, when multiple technological breakthroughs lower the cost of abatement, or when evaluated with a low discount rate. Policy evaluations are more sensitive to uncertainty about abatement technology and impacts than to uncertainty about warming.
Generalized Uncertainty Principle: Implications for Black Hole Complementarity
Pisin Chen; Yen Chin Ong; Dong-han Yeom
2014-12-10T23:59:59.000Z
At the heart of the black hole information loss paradox and the firewall controversy lies the conflict between quantum mechanics and general relativity. Much has been said about quantum corrections to general relativity, but much less in the opposite direction. It is therefore crucial to examine possible corrections to quantum mechanics due to gravity. Indeed, the Heisenberg Uncertainty Principle is one profound feature of quantum mechanics, which nevertheless may receive correction when gravitational effects become important. Such a generalized uncertainty principle (GUP) has been motivated not only by quite general considerations of quantum mechanics and gravity, but also by string theoretic arguments. We examine the role of GUP in the context of black hole complementarity. We find that while complementarity can be violated by large N rescaling if one assumes only Heisenberg's Uncertainty Principle, the application of GUP may save complementarity, but only if certain N-dependence is also assumed. This raises two important questions beyond the scope of this work, i.e., whether GUP really has the proposed form of N-dependence, and whether black hole complementarity is indeed correct.
On Uncertainty Quantification of Lithium-ion Batteries
Hadigol, Mohammad; Doostan, Alireza
2015-01-01T23:59:59.000Z
In this work, a stochastic, physics-based model for Lithium-ion batteries (LIBs) is presented in order to study the effects of model uncertainties on the cell capacity, voltage, and concentrations. To this end, the proposed uncertainty quantification (UQ) approach, based on sparse polynomial chaos expansions, relies on a small number of battery simulations. Within this UQ framework, the identification of most important uncertainty sources is achieved by performing a global sensitivity analysis via computing the so-called Sobol' indices. Such information aids in designing more efficient and targeted quality control procedures, which consequently may result in reducing the LIB production cost. An LiC$_6$/LiCoO$_2$ cell with 19 uncertain parameters discharged at 0.25C, 1C and 4C rates is considered to study the performance and accuracy of the proposed UQ approach. The results suggest that, for the considered cell, the battery discharge rate is a key factor affecting not only the performance variability of the ce...
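The global sensitivity analysis via Sobol' indices mentioned in this abstract can be sketched with a Saltelli-type Monte Carlo estimator. The three-input additive model below is a hypothetical stand-in for the battery simulation (it is not the paper's model), chosen because its first-order indices have known analytic values for comparison.

```python
import numpy as np

rng = np.random.default_rng(1)

def model(x):
    # Hypothetical additive stand-in for an expensive simulation output.
    return 4.0 * x[:, 0] + 2.0 * x[:, 1] + 1.0 * x[:, 2]

n, d = 100_000, 3
A = rng.standard_normal((n, d))       # base sample matrix
B = rng.standard_normal((n, d))       # independent resample matrix
fA, fB = model(A), model(B)
total_var = np.var(np.concatenate([fA, fB]))

sobol = []
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]               # replace only input i with B's column
    fABi = model(ABi)
    # Saltelli first-order estimator: Var(E[Y|X_i]) / Var(Y)
    sobol.append(np.mean(fB * (fABi - fA)) / total_var)

# Analytic first-order indices for this model: 16/21, 4/21, 1/21
print([round(s, 3) for s in sobol])
```

The largest index identifies the input whose variation alone explains the most output variance, which is exactly the information used in the paper to target quality control at the most influential battery parameters.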
Uncertainty in calculated surface temperature and surface heat flux of THTF heater rods
Childs, K.W.
1980-12-01T23:59:59.000Z
This report presents a procedure for determining the uncertainty in the output of a complex computer code resulting from uncertainties in its input variables. This method is applied to ORINC (Oak Ridge Inverse Code) to estimate the uncertainty in the calculated surface temperature and surface heat flux of a THTF heater during a blowdown transient. The significant input variables are identified and 95% confidence bands are calculated for the code outputs based on the uncertainty in these input variables. 21 refs., 43 figs.
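The procedure this report describes, propagating input uncertainties through a code and reporting 95% confidence bands on the outputs, can be sketched with simple Monte Carlo sampling. The surrogate function and the input distributions below are hypothetical stand-ins for ORINC and its inputs, not the report's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

def surrogate_code(k, h):
    # Hypothetical stand-in for a complex code output (e.g., surface temperature).
    return 500.0 + 80.0 * k - 15.0 * h

# Assumed input uncertainties: normal distributions around nominal values.
k_samples = rng.normal(1.0, 0.05, 20000)   # conductivity-like input
h_samples = rng.normal(2.0, 0.20, 20000)   # heat-transfer-like input

outputs = surrogate_code(k_samples, h_samples)
lo, hi = np.percentile(outputs, [2.5, 97.5])   # 95% confidence band on the output
print(f"95% band: [{lo:.1f}, {hi:.1f}]")
```

Each sample is one run of the code with perturbed inputs; the 2.5th and 97.5th percentiles of the resulting output ensemble form the 95% band.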
Allan Tameshtit
2012-04-09T23:59:59.000Z
High temperature and white noise approximations are frequently invoked when deriving the quantum Brownian equation for an oscillator. Even if this white noise approximation is avoided, it is shown that if the zero point energies of the environment are neglected, as they often are, the resultant equation will violate not only the basic tenet of quantum mechanics that requires the density operator to be positive, but also the uncertainty principle. When the zero-point energies are included, asymptotic results describing the evolution of the oscillator are obtained that preserve positivity and, therefore, the uncertainty principle.
Chaubey, Indrajeet
A simplified approach to quantifying predictive and parametric uncertainty in artificial neural network hydrologic models, Water Resour. Res., 43, W10407. There has been considerable interest in developing methods for uncertainty analysis of artificial neural network (ANN) models...
Uncertainty analysis of river flooding and dam failure risks using local sensitivity computations.
Paris-Sud XI, Université de
Local sensitivity computations are applied for uncertainty analysis with respect to two major types of risk in river hydrodynamics: flash flood and dam failure. Local sensitivity analysis (LSA) is compared to a Global Uncertainty Analysis (GUA) consisting in running Monte Carlo...
Analysis and Reduction of Power Grid Models under Uncertainty Sandia National Laboratories
Levi, Anthony F. J.
Analysis and Reduction of Power Grid Models under Uncertainty. Habib Najm, Sandia National Laboratories. Abstract: The increased utilization of alternative energy sources requires that evolving power grid...
Water Management, Risk, and Uncertainty: Things We Wish We Knew in the 21st
Shaw, W. Douglass
Water Management, Risk, and Uncertainty: Things We Wish We Knew in the 21st Century [forthcoming]. A survey is offered of the most difficult and challenging issues facing water managers in the 21st Century, focusing on the economics of risk and, more so, uncertainty. Risk and uncertainty are addressed...
Bayesian-network Confirmation of Software Testing Uncertainties
Ziv, Hadar; Richardson, Debra J.
... presentation of uncertainty in software testing. We then propose that a specific technique, known as Bayesian Belief Networks, be used to model software testing uncertainties. We demonstrate the use of Bayesian...
TRITIUM UNCERTAINTY ANALYSIS FOR SURFACE WATER SAMPLES AT THE SAVANNAH RIVER SITE
Atkinson, R.
2012-07-31T23:59:59.000Z
Radiochemical analyses of surface water samples, in the framework of Environmental Monitoring, have associated uncertainties for the radioisotopic results reported. These uncertainty analyses pertain to the tritium results from surface water samples collected at five locations on the Savannah River near the U.S. Department of Energy's Savannah River Site (SRS). Uncertainties can result from the field-sampling routine, can be incurred during transport due to the physical properties of the sample, from equipment limitations, and from the measurement instrumentation used. The uncertainty reported by the SRS in their Annual Site Environmental Report currently considers only the counting uncertainty in the measurements, which is the standard reporting protocol for radioanalytical chemistry results. The focus of this work is to provide an overview of all uncertainty components associated with SRS tritium measurements, estimate the total uncertainty according to ISO 17025, and to propose additional experiments to verify some of the estimated uncertainties. The main uncertainty components discovered and investigated in this paper are tritium absorption or desorption in the sample container, HTO/H{sub 2}O isotopic effect during distillation, pipette volume, and tritium standard uncertainty. The goal is to quantify these uncertainties and to establish a combined uncertainty in order to increase the scientific depth of the SRS Annual Site Environmental Report.
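The combination of independent uncertainty components into a combined and expanded uncertainty, as required by ISO/IEC 17025 and the GUM, can be sketched as a root-sum-of-squares calculation. The component magnitudes below are illustrative assumptions, not the SRS values.

```python
import math

# Hypothetical 1-sigma uncertainty components for a tritium result (relative, %).
components = {
    "counting statistics": 2.5,
    "container absorption/desorption": 1.0,
    "HTO/H2O distillation isotopic effect": 0.8,
    "pipette volume": 0.5,
    "tritium standard": 1.5,
}

# Combined standard uncertainty: root-sum-of-squares of independent components.
u_c = math.sqrt(sum(u ** 2 for u in components.values()))
# Expanded uncertainty with coverage factor k = 2 (~95% coverage).
U = 2.0 * u_c
print(f"combined: {u_c:.2f}%, expanded (k=2): {U:.2f}%")
```

The root-sum-of-squares form assumes the components are independent; correlated components would require covariance terms in the sum.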
Bayes Linear Uncertainty Analysis for Oil Reservoirs Based on Multiscale Computer Experiments
Oakley, Jeremy
... of the input parameters for a reservoir model; therefore, an uncertainty analysis for the model often proceeds ... for the efficient management of the reservoir. In a Bayesian analysis, all of our uncertainties are incorporated
Achieving Robustness to Uncertainty for Financial Decision-making
Barnum, George M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Van Buren, Kendra L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hemez, Francois M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Song, Peter [Univ. of Pennsylvania, Philadelphia, PA (United States)
2014-01-10T23:59:59.000Z
This report investigates the concept of robustness analysis to support financial decision-making. Financial models that forecast future stock returns or market conditions depend on assumptions that might be unwarranted and variables that might exhibit large fluctuations from their last-known values. The analysis of robustness explores these sources of uncertainty, and recommends model settings such that the forecasts used for decision-making are as insensitive as possible to the uncertainty. A proof-of-concept is presented with the Capital Asset Pricing Model. The robustness of model predictions is assessed using info-gap decision theory. Info-gaps are models of uncertainty that express the “distance,” or gap of information, between what is known and what needs to be known in order to support the decision. The analysis yields a description of worst-case stock returns as a function of increasing gaps in our knowledge. The analyst can then decide on the best course of action by trading off worst-case performance with “risk”, which is how much uncertainty they think needs to be accommodated in the future. The report also discusses the Graphical User Interface, developed using the MATLAB® programming environment, such that the user can control the analysis through an easy-to-navigate interface. Three directions of future work are identified to enhance the present software. First, the code should be rewritten in Python. This change will achieve greater cross-platform compatibility, better portability, allow for a more professional appearance, and render it independent from a commercial license, which MATLAB® requires. Second, a capability should be developed to allow users to quickly implement and analyze their own models. This will facilitate application of the software to the evaluation of proprietary financial models. The third enhancement proposed is to add the ability to evaluate multiple models simultaneously.
When two models reflect past data with similar accuracy, the more robust of the two is preferable for decision-making because its predictions are, by definition, less sensitive to the uncertainty.
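The info-gap robustness curve described above, worst-case performance as a function of the horizon of uncertainty, can be sketched with a toy example. The CAPM-style forecast and the nominal values `rf`, `beta_hat`, and `premium_hat` below are hypothetical, not taken from the report.

```python
# Hypothetical CAPM-style forecast: r = rf + beta * market_premium
rf, beta_hat, premium_hat = 0.02, 1.1, 0.06

def forecast(beta, premium):
    return rf + beta * premium

def worst_case_return(alpha):
    """Worst-case forecast when parameters may deviate from their nominal
    values by a fraction alpha (the info-gap horizon of uncertainty)."""
    betas = (beta_hat * (1 - alpha), beta_hat * (1 + alpha))
    prems = (premium_hat * (1 - alpha), premium_hat * (1 + alpha))
    # For a model monotone in each parameter, the worst case sits at a
    # corner of the uncertainty box, so checking the corners suffices.
    return min(forecast(b, p) for b in betas for p in prems)

# Robustness curve: worst-case performance vs. horizon of uncertainty.
for alpha in (0.0, 0.1, 0.2, 0.3):
    print(alpha, round(worst_case_return(alpha), 4))
```

The curve is monotone: accepting a larger gap between what is known and what needs to be known always weakens the guaranteed worst-case return, which is the trade-off the analyst navigates.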
Hub, Martina; Thieke, Christian; Kessler, Marc L.; Karger, Christian P. [Department of Medical Physics in Radiation Oncology and Clinical Cooperation Unit Radiation Oncology, German Cancer Research Center (DKFZ), 69120 Heidelberg, Germany; Department of Radiation Oncology, University Clinic Heidelberg, 69120 Heidelberg, Germany; Department of Radiation Oncology, University of Michigan, Ann Arbor, Michigan 48109, United States]
2012-04-15T23:59:59.000Z
Purpose: In fractionated radiation therapy, image guidance with daily tomographic imaging becomes more and more clinical routine. In principle, this allows for daily computation of the delivered dose and for accumulation of these daily dose distributions to determine the actually delivered total dose to the patient. However, uncertainties in the mapping of the images can translate into errors of the accumulated total dose, depending on the dose gradient. In this work, an approach to estimate the uncertainty of mapping between medical images is proposed that identifies areas bearing a significant risk of inaccurate dose accumulation. Methods: This method accounts for the geometric uncertainty of image registration and the heterogeneity of the dose distribution, which is to be mapped. Its performance is demonstrated in context of dose mapping based on b-spline registration. It is based on evaluation of the sensitivity of dose mapping to variations of the b-spline coefficients combined with evaluation of the sensitivity of the registration metric with respect to the variations of the coefficients. It was evaluated based on patient data that was deformed based on a breathing model, where the ground truth of the deformation, and hence the actual true dose mapping error, is known. Results: The proposed approach has the potential to distinguish areas of the image where dose mapping is likely to be accurate from other areas of the same image, where a larger uncertainty must be expected. Conclusions: An approach to identify areas where dose mapping is likely to be inaccurate was developed and implemented. This method was tested for dose mapping, but it may be applied in context of other mapping tasks as well.
Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States)] [and others]
1995-01-01T23:59:59.000Z
Two new probabilistic accident consequence codes, MACCS and COSYMA, whose development was completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.
Design Features and Technology Uncertainties for the Next Generation Nuclear Plant
John M. Ryskamp; Phil Hildebrandt; Osamu Baba; Ron Ballinger; Robert Brodsky; Hans-Wolfgang Chi; Dennis Crutchfield; Herb Estrada; Jeane-Claude Garnier; Gerald Gordon; Richard Hobbins; Dan Keuter; Marilyn Kray; Philippe Martin; Steve Melancon; Christian Simon; Henry Stone; Robert Varrin; Werner von Lensa
2004-06-01T23:59:59.000Z
This report presents the conclusions, observations, and recommendations of the Independent Technology Review Group (ITRG) regarding design features and important technology uncertainties associated with very-high-temperature nuclear system concepts for the Next Generation Nuclear Plant (NGNP). The ITRG performed its reviews during the period November 2003 through April 2004.
Energy-saving technology adoption under uncertainty in the residential sector
Paris-Sud XI, Université de
Dorothée Charlier. Households decide whether to invest in energy-saving technology, to save, or to consume energy goods and non-energy goods. ... to be a highly effective means for households to lower expenditures on energy. In this sense, home renovation...
While future changes in emissions are the largest uncertainty in future climate change, another
Allan, Richard P.
While future changes in emissions are the largest uncertainty in future climate change, another... [Figure caption: the thick lines show different possible future scenarios (Representative Concentration Pathways), ranging from low emission (RCP2.6) up to high emission (RCP8.5), which explain a large range in future...]
DEVELOPMENT OF REAL TIME FLOOD PREDICTION CAPABILITIES IN PUERTO RICO TO EVALUATE UNCERTAINTIES
Gilbes, Fernando
Department of Electrical and Computer Engineering, University of Puerto Rico at Mayagüez. Abstract: Due to the complex terrain and the tropical influence, Puerto Rico is characterized by small watersheds and high rainfall
CASMO5/TSUNAMI-3D spent nuclear fuel reactivity uncertainty analysis
Ferrer, R.; Rhodes, J. [Studsvik Scandpower, Inc., 504 Shoup Ave., Idaho Falls, ID 83402 (United States); Smith, K. [Dept. of Nuclear Science and Engineering, Massachusetts Inst. of Technology, 77 Massachusetts Avenue, Cambridge, MA 02139 (United States)
2012-07-01T23:59:59.000Z
The CASMO5 lattice physics code is used in conjunction with the TSUNAMI-3D sequence in ORNL's SCALE 6 code system to estimate the uncertainties in hot-to-cold reactivity changes due to cross-section uncertainty for PWR assemblies at various burnup points. The goal of the analysis is to establish the multiplication factor uncertainty similarity between various fuel assemblies at different conditions in a quantifiable manner and to obtain a bound on the hot-to-cold reactivity uncertainty over the various assembly types and burnup attributed to fundamental cross-section data uncertainty. (authors)
Myers, D. R.; Reda, I. M.; Wilcox, S. M.; Stoffel, T. L.
2004-04-01T23:59:59.000Z
The measurement of broadband solar radiation has grown in importance since the advent of solar renewable energy technologies in the 1970s and the concern about the Earth's radiation balance related to climate change in the 1990s. In parallel, standardized methods of uncertainty analysis and reporting have been developed. Historical and updated uncertainties are based on the current international standardized uncertainty analysis method. Despite the fact that new and sometimes overlooked sources of uncertainty have been identified over the period 1988 to 2004, uncertainty in broadband solar radiometric instrumentation remains at 3% to 5% for pyranometers and 2% to 3% for pyrheliometers. Improvements in characterizing correction functions for radiometer data may reduce total uncertainty. We analyze the theoretical standardized uncertainty sensitivity coefficients for the instrumentation calibration measurement equation and highlight the single parameter (thermal offset voltages) that contributes the most to the observed calibration responsivities.
Quantification of initial-data uncertainty on a shock-accelerated gas cylinder
Tritschler, V. K., E-mail: volker.tritschler@aer.mw.tum.de; Avdonin, A.; Hickel, S.; Hu, X. Y.; Adams, N. A. [Institute of Aerodynamics and Fluid Mechanics, Technische Universität München, 85747 Garching (Germany)] [Institute of Aerodynamics and Fluid Mechanics, Technische Universität München, 85747 Garching (Germany)
2014-02-15T23:59:59.000Z
We quantify initial-data uncertainties on a shock accelerated heavy-gas cylinder by two-dimensional well-resolved direct numerical simulations. A high-resolution compressible multicomponent flow simulation model is coupled with a polynomial chaos expansion to propagate the initial-data uncertainties to the output quantities of interest. The initial flow configuration follows previous experimental and numerical works of the shock accelerated heavy-gas cylinder. We investigate three main initial-data uncertainties, (i) shock Mach number, (ii) contamination of SF{sub 6} with acetone, and (iii) initial deviations of the heavy-gas region from a perfect cylindrical shape. The impact of initial-data uncertainties on the mixing process is examined. The results suggest that the mixing process is highly sensitive to input variations of shock Mach number and acetone contamination. Additionally, our results indicate that the measured shock Mach number in the experiment of Tomkins et al. [“An experimental investigation of mixing mechanisms in shock-accelerated flow,” J. Fluid. Mech. 611, 131 (2008)] and the estimated contamination of the SF{sub 6} region with acetone [S. K. Shankar, S. Kawai, and S. K. Lele, “Two-dimensional viscous flow simulation of a shock accelerated heavy gas cylinder,” Phys. Fluids 23, 024102 (2011)] exhibit deviations from those that lead to best agreement between our simulations and the experiment in terms of overall flow evolution.
Stochastic methods for uncertainty quantification in radiation transport
Fichtl, Erin D [Los Alamos National Laboratory; Prinja, Anil K [Los Alamos National Laboratory; Warsa, James S [Los Alamos National Laboratory
2009-01-01T23:59:59.000Z
The use of generalized polynomial chaos (gPC) expansions is investigated for uncertainty quantification in radiation transport. The gPC represents second-order random processes in terms of an expansion of orthogonal polynomials of random variables and is used to represent the uncertain input(s) and unknown(s). We assume a single uncertain input, the total macroscopic cross section, although this does not represent a limitation of the approaches considered here. Two solution methods are examined: the Stochastic Finite Element Method (SFEM) and the Stochastic Collocation Method (SCM). The SFEM entails taking Galerkin projections onto the orthogonal basis, which, for fixed source problems, yields a linear system of fully coupled equations for the PC coefficients of the unknown. For k-eigenvalue calculations, the SFEM system is non-linear and a Newton-Krylov method is employed to solve it. The SCM utilizes a suitable quadrature rule to compute the moments or PC coefficients of the unknown(s), thus the SCM solution involves a series of independent deterministic transport solutions. The accuracy and efficiency of the two methods are compared and contrasted. The PC coefficients are used to compute the moments and probability density functions of the unknown(s), which are shown to be accurate by comparing with Monte Carlo results. Our work demonstrates that stochastic spectral expansions are a viable alternative to sampling-based uncertainty quantification techniques since both provide a complete characterization of the distribution of the flux and the k-eigenvalue. Furthermore, it is demonstrated that, unlike perturbation methods, SFEM and SCM can handle large parameter uncertainty.
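The Stochastic Collocation Method described above can be sketched in one random dimension: each quadrature node is an independent deterministic solve, and the weighted node values give the output moments. The exponential-attenuation "transport solve" and the Normal cross-section distribution below are illustrative assumptions, not the paper's problem.

```python
import numpy as np

# Uncertain input: total cross section Sigma_t ~ Normal(mu, sd) (illustrative).
mu, sd = 1.0, 0.1

def attenuation(sigma_t, depth=2.0):
    # Toy deterministic "transport solve": uncollided flux exp(-Sigma_t * depth).
    return np.exp(-sigma_t * depth)

# Gauss-Hermite quadrature (probabilists' weight exp(-x^2/2)) in the random
# dimension; each node corresponds to one independent deterministic solve.
nodes, weights = np.polynomial.hermite_e.hermegauss(10)
norm = np.sqrt(2.0 * np.pi)                 # total weight of the Gaussian measure
vals = attenuation(mu + sd * nodes)

mean = np.sum(weights * vals) / norm        # E[flux]
second = np.sum(weights * vals ** 2) / norm
var = second - mean ** 2                    # Var[flux]

# Analytic check: E[exp(-d*Sigma)] = exp(-d*mu + d^2 sd^2 / 2) with d = 2.
print(mean, np.exp(-2.0 * mu + 2.0 * sd ** 2))
```

With ten nodes the quadrature mean matches the analytic lognormal moment to high precision, illustrating why SCM converges much faster than random sampling for smooth response functions.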
Uncertainties in Life Cycle Greenhouse Gas Emissions from Advanced
Kara G. Cafferty; Erin M. Searcy; Long Nguyen; Sabrina Spatari
2014-11-01T23:59:59.000Z
To meet Energy Independence and Security Act (EISA) cellulosic biofuel mandates, the United States will require an annual domestic supply of about 242 million Mg of biomass by 2022. To improve the feedstock logistics of lignocellulosic biofuels and access available biomass resources from areas with varying yields, commodity systems have been proposed and designed to deliver on-spec biomass feedstocks at preprocessing “depots”, which densify and stabilize the biomass prior to long-distance transport and delivery to centralized biorefineries. The harvesting, preprocessing, and logistics (HPL) of biomass commodity supply chains could thus introduce spatially variable environmental impacts into the biofuel life cycle, owing to the need to harvest, preprocess, and move biomass over distances and spatial densities that vary. This study examines the uncertainty in greenhouse gas (GHG) emissions of corn stover HPL within a bio-ethanol supply chain in the state of Kansas, where sustainable biomass supply varies spatially. Two scenarios were evaluated, each having a different number of depots of varying capacity and location within Kansas relative to a central commodity-receiving biorefinery, to test GHG emissions uncertainty. Monte Carlo simulation was used to estimate the spatial uncertainty in the HPL gate-to-gate sequence. The results show that the transport of densified biomass introduces the highest variability and contribution to the carbon footprint of the HPL supply chain (0.2-13 g CO2e/MJ). Moreover, depending upon the biomass availability, its spatial density, and the surrounding transportation infrastructure (road and rail), HPL processes can increase the variability in life cycle environmental impacts for lignocellulosic biofuels. Within Kansas, life cycle GHG emissions could range from 24 to 41 g CO2e/MJ depending upon the location, size, and number of preprocessing depots constructed.
However, this range can be minimized through optimizing the siting of preprocessing depots where ample rail infrastructure exists to supply biomass commodity to a regional biorefinery supply system.
Quantification of Uncertainties in Nuclear Density Functional theory
N. Schunck; J. D. McDonnell; D. Higdon; J. Sarich; S. Wild
2014-09-17T23:59:59.000Z
Reliable predictions of nuclear properties are needed as much to answer fundamental science questions as in applications such as reactor physics or data evaluation. Nuclear density functional theory (DFT) is currently the only microscopic, global approach to nuclear structure that is applicable throughout the nuclear chart. In the past few years, a lot of effort has been devoted to setting up a general methodology to assess theoretical uncertainties in nuclear DFT calculations. In this paper, we summarize some of the recent progress in this direction. Most of the new material discussed here will be published in separate articles.
PROBABILISTIC SENSITIVITY AND UNCERTAINTY ANALYSIS WORKSHOP SUMMARY REPORT
Seitz, R
2008-06-25T23:59:59.000Z
Stochastic or probabilistic modeling approaches are being applied more frequently in the United States and globally to quantify uncertainty and enhance understanding of model response in performance assessments for disposal of radioactive waste. This increased use has resulted in global interest in sharing results of research and applied studies that have been completed to date. This technical report reflects the results of a workshop that was held to share results of research and applied work related to performance assessments conducted at United States Department of Energy sites. Key findings of this research and applied work are discussed and recommendations for future activities are provided.
An efficient method for treating uncertainties in structural dynamics
Wang, Bing
1992-01-01T23:59:59.000Z
[Front-matter and figure residue removed. Recoverable captions: Fig. 1, multi-story planar shear building excited by a ground acceleration; Fig. 2, locating elements from a tensor to a vector; Fig. 3, El Centro S00E acceleration record of the Imperial Valley Earthquake, May 18, 1940.] ... the structural uncertainties; (3) to investigate...
Molecular nonlinear dynamics and protein thermal uncertainty quantification
Xia, Kelin [Department of Mathematics, Michigan State University, Michigan 48824 (United States)] [Department of Mathematics, Michigan State University, Michigan 48824 (United States); Wei, Guo-Wei, E-mail: wei@math.msu.edu [Department of Mathematics, Michigan State University, Michigan 48824 (United States) [Department of Mathematics, Michigan State University, Michigan 48824 (United States); Department of Electrical and Computer Engineering, Michigan State University, Michigan 48824 (United States); Department of Biochemistry and Molecular Biology, Michigan State University, Michigan 48824 (United States)
2014-03-15T23:59:59.000Z
This work introduces molecular nonlinear dynamics (MND) as a new approach for describing protein folding and aggregation. By using a mode system, we show that the MND of disordered proteins is chaotic while that of folded proteins exhibits intrinsically low dimensional manifolds (ILDMs). The stability of ILDMs is found to strongly correlate with protein energies. We propose a novel method for protein thermal uncertainty quantification based on persistently invariant ILDMs. Extensive comparison with experimental data and the state-of-the-art methods in the field validate the proposed new method for protein B-factor prediction.
Combined Estimation of Hydrogeologic Conceptual Model and Parameter Uncertainty
Meyer, Philip D.; Ye, Ming; Neuman, Shlomo P.; Cantrell, Kirk J.
2004-03-01T23:59:59.000Z
The objective of the research described in this report is the development and application of a methodology for comprehensively assessing the hydrogeologic uncertainties involved in dose assessment, including uncertainties associated with conceptual models, parameters, and scenarios. This report describes and applies a statistical method to quantitatively estimate the combined uncertainty in model predictions arising from conceptual model and parameter uncertainties. The method relies on model averaging to combine the predictions of a set of alternative models. Implementation is driven by the available data. When there is minimal site-specific data the method can be carried out with prior parameter estimates based on generic data and subjective prior model probabilities. For sites with observations of system behavior (and optionally data characterizing model parameters), the method uses model calibration to update the prior parameter estimates and model probabilities based on the correspondence between model predictions and site observations. The set of model alternatives can contain both simplified and complex models, with the requirement that all models be based on the same set of data. The method was applied to the geostatistical modeling of air permeability at a fractured rock site. Seven alternative variogram models of log air permeability were considered to represent data from single-hole pneumatic injection tests in six boreholes at the site. Unbiased maximum likelihood estimates of variogram and drift parameters were obtained for each model. Standard information criteria provided an ambiguous ranking of the models, which would not justify selecting one of them and discarding all others as is commonly done in practice. Instead, some of the models were eliminated based on their negligibly small updated probabilities and the rest were used to project the measured log permeabilities by kriging onto a rock volume containing the six boreholes. 
These four projections, and associated kriging variances, were averaged using the posterior model probabilities as weights. Finally, cross-validation was conducted by eliminating from consideration all data from one borehole at a time, repeating the above process, and comparing the predictive capability of the model-averaged result with that of each individual model. Using two quantitative measures of comparison, the model-averaged result was superior to any individual geostatistical model of log permeability considered.
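The posterior-weighted combination of predictions and variances described above can be sketched generically; this is a textbook Bayesian-model-averaging illustration, not the report's code, and all numbers below are hypothetical.

```python
import numpy as np

# Posterior model probabilities p_k weight each alternative model's kriged
# prediction mu_k and kriging variance var_k (all numbers hypothetical).
def model_average(mu, var, p):
    mu, var, p = (np.asarray(a, dtype=float) for a in (mu, var, p))
    p = p / p.sum()                      # normalize posterior probabilities
    mean = np.sum(p * mu)                # model-averaged prediction
    # Total variance = within-model part + between-model part.
    total_var = np.sum(p * var) + np.sum(p * (mu - mean) ** 2)
    return mean, total_var

mean, total_var = model_average(
    mu=[1.0, 1.2, 0.9], var=[0.04, 0.05, 0.06], p=[0.5, 0.3, 0.2]
)
```

The between-model term is what penalizes disagreement among the alternative models, so the averaged variance exceeds any single model's kriging variance when the models diverge.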
Uncertainty Analysis of RELAP5-3D
Alexandra E Gertman; Dr. George L Mesina
2012-07-01T23:59:59.000Z
As world-wide energy consumption continues to increase, so does the demand for the use of alternative energy sources, such as nuclear energy. Nuclear power plants currently supply over 370 gigawatts of electricity, and more than 60 new nuclear reactors have been commissioned by 15 different countries. The primary concern for nuclear power plant operation and licensing has been safety. The safe operation of nuclear power plants is no simple matter: it involves the training of operators and the design of the reactor, as well as equipment and design upgrades throughout the lifetime of the reactor. To safely design, operate, and understand nuclear power plants, industry and government alike have relied upon the use of best-estimate simulation codes, which allow an accurate model of any given plant to be created with well-defined margins of safety. The most widely used of these best-estimate simulation codes in the nuclear power industry is RELAP5-3D. Our project focused on improving the modeling capabilities of RELAP5-3D by developing uncertainty estimates for its calculations. This work involved analyzing high-, medium-, and low-ranked phenomena from an INL PIRT on a small-break loss-of-coolant accident, as well as an analysis of a large-break loss-of-coolant accident. Statistical analyses were performed using correlation coefficients. To perform the studies, computer programs were written that modify a template RELAP5 input deck to produce one deck for each combination of key input parameters. Python scripting enabled the running of the generated input files with RELAP5-3D on INL's massively parallel cluster system. Data from the studies were collected and analyzed with SAS. A summary of the results of our studies is presented.
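The deck-generation and correlation idea described above can be illustrated with a toy sketch; the parameter names, ranges, and stand-in model below are hypothetical, and a real study would launch RELAP5-3D runs in place of the stub.

```python
import itertools

import numpy as np

# Hypothetical key parameters and their perturbation multipliers.
params = {"break_area": [0.9, 1.0, 1.1], "decay_heat": [0.95, 1.0, 1.05]}

# One "input deck" per combination of key input parameters.
cases = [dict(zip(params, vals)) for vals in itertools.product(*params.values())]

def run_model(case):
    # Stand-in for running RELAP5-3D on one generated input deck.
    return 2.0 * case["break_area"] + 0.5 * case["decay_heat"]

outputs = np.array([run_model(c) for c in cases])

# Rank parameter influence by Pearson correlation with the output metric.
correlations = {}
for name in params:
    x = np.array([c[name] for c in cases])
    correlations[name] = np.corrcoef(x, outputs)[0, 1]
```

With a full-factorial design like this, the correlation coefficient directly reflects each parameter's relative leverage on the figure of merit.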
Information Uncertainty to Compare Qualitative Reasoning Security Risk Assessment Results
Chavez, Gregory M [Los Alamos National Laboratory; Key, Brian P [Los Alamos National Laboratory; Zerkle, David K [Los Alamos National Laboratory; Shevitz, Daniel W [Los Alamos National Laboratory
2009-01-01T23:59:59.000Z
The security risk associated with malevolent acts, such as those of terrorism, is often void of the historical data required for a traditional PRA. Most information available to conduct security risk assessments for these malevolent acts is obtained from subject matter experts as subjective judgments. Qualitative reasoning approaches such as approximate reasoning and evidential reasoning are useful for modeling the predicted risk from information provided by subject matter experts. Absent from these approaches is a consistent means to compare the security risk assessment results. Associated with each predicted risk reasoning result is a quantifiable amount of information uncertainty which can be measured and used to compare the results. This paper explores using entropy measures to quantify the information uncertainty associated with conflict and non-specificity in the predicted reasoning results. The measured quantities of conflict and non-specificity can ultimately be used to compare qualitative reasoning results, which is important in triage studies and ultimately resource allocation. Straightforward extensions of previous entropy measures are presented here to quantify the non-specificity and conflict associated with security risk assessment results obtained from qualitative reasoning models.
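As one concrete instance of such an entropy-style measure (a textbook example, not necessarily the paper's exact formulation), the generalized Hartley measure quantifies the non-specificity of a Dempster-Shafer body of evidence:

```python
import math

# A Dempster-Shafer body of evidence: basic probability masses assigned to
# focal sets (here, levels of a hypothetical security-risk variable).
def nonspecificity(masses):
    # Generalized Hartley measure: N(m) = sum over focal sets A of m(A)*log2(|A|).
    assert abs(sum(masses.values()) - 1.0) < 1e-9
    return sum(m * math.log2(len(A)) for A, m in masses.items())

m = {frozenset({"low"}): 0.6, frozenset({"low", "high"}): 0.4}
print(nonspecificity(m))  # 0.6*log2(1) + 0.4*log2(2) = 0.4
```

The measure is zero when all mass sits on singletons (fully specific evidence) and grows as mass spreads over larger focal sets.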
Energy-Time Uncertainty Relations in Quantum Measurements
Takayuki Miyadera
2015-05-14T23:59:59.000Z
Quantum measurement is a physical process. A system and an apparatus interact for a certain time period (the measurement time). During this interaction, information about an observable is transferred from the system to the apparatus. In this study, we examine the amount of energy fluctuation of the apparatus that is required for this physical process to occur. To do so, the interface between the quantum and classical worlds (the "Heisenberg cut") must be carefully chosen so that the quantum side is large enough to autonomously switch on the interaction. In this setting we prove that a trade-off relation (an energy-time uncertainty relation) holds between the energy fluctuation of the apparatus and the measurement time. We use this trade-off relation to discuss the spacetime uncertainty relation, questioning the operational meaning of the microscopic structure of spacetime. In addition, we derive another trade-off inequality between the measurement time and the strength of interaction between the system and the apparatus. The larger the information carried by an observable to be measured, the stronger the restriction the trade-off relations give.
Analysis and Reduction of Complex Networks Under Uncertainty.
Ghanem, Roger G [University of Southern California
2014-07-31T23:59:59.000Z
This effort was a collaboration with Youssef Marzouk of MIT, Omar Knio of Duke University (at the time at Johns Hopkins University) and Habib Najm of Sandia National Laboratories. The objective of this effort was to develop the mathematical and algorithmic capacity to analyze complex networks under uncertainty. Of interest were chemical reaction networks and smart grid networks. The statements of work for USC focused on the development of stochastic reduced models for uncertain networks. The USC team was led by Professor Roger Ghanem and consisted of one graduate student and a postdoc. The contributions completed by the USC team consisted of: 1) methodology and algorithms to address the eigenvalue problem, a problem of significance in the stability of networks under stochastic perturbations; 2) methodology and algorithms to characterize probability measures on graph structures with random flows, an important problem in characterizing random demand (encountered in smart grids) and random degradation (encountered in infrastructure systems), as well as modeling errors in Markov chains (with ubiquitous relevance); and 3) methodology and algorithms for treating inequalities in uncertain systems, an important problem in the context of models for material failure and network flows under uncertainty, where conditions of failure or flow are described in the form of inequalities between the state variables.
Data Filtering Impact on PV Degradation Rates and Uncertainty (Poster)
Jordan, D. C.; Kurtz, S. R.
2012-03-01T23:59:59.000Z
To sustain the commercial success of photovoltaics (PV), it is vital to know how power output decreases with time. In order to predict power delivery, degradation rates must be determined accurately. Data filtering, that is, any treatment of the data prior to assessing long-term field behavior, is discussed as part of a more comprehensive uncertainty analysis; it can be one of the greatest sources of uncertainty in long-term performance studies. Several distinct filtering methods, such as outlier removal and inclusion of only sunny days, were examined on several different metrics, such as PVUSA, performance ratio, and the DC power to plane-of-array irradiance ratio, both uncorrected and temperature-corrected. PVUSA showed the highest sensitivity, while the temperature-corrected power over irradiance ratio was found to be the least sensitive to data filtering conditions. Using this ratio, it is demonstrated that quantification of degradation rates with a statistical accuracy of +/- 0.2%/year within 4 years of field data is possible on two crystalline silicon and two thin-film systems.
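A minimal sketch of the rate-extraction step, assuming synthetic monthly data with a known -0.5%/year trend; the filter and metric here are illustrative, not NREL's actual pipeline.

```python
import numpy as np

# Synthetic stand-in for a monthly, temperature-corrected power/irradiance
# ratio with a known -0.5 %/year degradation trend (illustrative data only).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 4.0, 48)                              # 4 years, monthly
metric = 1.0 - 0.005 * t + rng.normal(0.0, 0.002, t.size)  # trend + noise

# Crude outlier filter: drop points far from the median.
keep = np.abs(metric - np.median(metric)) < 3.0 * metric.std()

# Linear fit; the slope relative to the intercept gives the rate in %/year.
slope, intercept = np.polyfit(t[keep], metric[keep], 1)
rate_pct_per_year = 100.0 * slope / intercept
```

Changing the filter threshold shifts which points survive, which is exactly how filtering choices propagate into the fitted degradation rate and its uncertainty.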
Helton, Jon Craig; Sallaberry, Cedric M.; Hansen, Clifford W.
2010-10-01T23:59:59.000Z
The 2008 performance assessment (PA) for the proposed repository for high-level radioactive waste at Yucca Mountain (YM), Nevada, illustrates the conceptual structure of risk assessments for complex systems. The 2008 YM PA is based on the following three conceptual entities: a probability space that characterizes aleatory uncertainty; a function that predicts consequences for individual elements of the sample space for aleatory uncertainty; and a probability space that characterizes epistemic uncertainty. These entities and their use in the characterization, propagation and analysis of aleatory and epistemic uncertainty are described and illustrated with results from the 2008 YM PA.
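The three-entity structure above (aleatory probability space, consequence function, epistemic probability space) is commonly implemented as a double-loop Monte Carlo; the sketch below is a generic illustration with placeholder distributions, not the YM PA model.

```python
import numpy as np

# Outer loop: epistemic uncertainty (imperfect knowledge of a parameter).
# Inner loop: aleatory uncertainty (random future events).
rng = np.random.default_rng(1)

def consequence(epistemic_param, aleatory_event):
    # Placeholder consequence function for one element of the aleatory
    # sample space, evaluated under one epistemic parameter value.
    return epistemic_param * aleatory_event

epistemic_samples = rng.uniform(0.5, 1.5, size=100)
results = []
for e in epistemic_samples:
    events = rng.exponential(scale=1.0, size=200)   # aleatory sample space
    results.append(np.mean(consequence(e, events)))
# The spread of `results` expresses epistemic uncertainty in the
# expected (aleatory-averaged) consequence.
```

Each inner loop produces one expected consequence conditional on a single epistemic state of knowledge; the distribution across the outer loop is what gets displayed as a family of risk curves.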
Lacey, Ronald; Faulkner, William
2015-01-01T23:59:59.000Z
uncertainty is covered by several professional and international standards (American Society of Mechanical Engineers [ASME], 1985; American National Standards Institute [ANSI], 1997; Joint Committee for Guides in Metrology [JCGM], 2008; Taylor and Kuyatt, 1994... on the publisher’s website. References American National Standards Institute (ANSI). 1997. American National Standard for Expressing Uncertainty—U.S. Guide to the Expression of Uncertainty in Measurement. ANSI, ANSI/NCSL Z540-2-1997. Gaithersburg, MD: National...
Principles and applications of measurement and uncertainty analysis in research and calibration
Wells, C.V.
1992-11-01T23:59:59.000Z
Interest in Measurement Uncertainty Analysis has grown in the past several years as it has spread to new fields of application, and research and development of uncertainty methodologies have continued. This paper discusses the subject from the perspectives of both research and calibration environments. It presents a history of the development and an overview of the principles of uncertainty analysis embodied in the United States National Standard, ANSI/ASME PTC 19.1-1985, Measurement Uncertainty. Examples are presented in which uncertainty analysis was utilized or is needed to gain further knowledge of a particular measurement process and to characterize final results. Measurement uncertainty analysis provides a quantitative estimate of the interval about a measured value or an experiment result within which the true value of that quantity is expected to lie. Years ago, Harry Ku of the United States National Bureau of Standards stated that "The informational content of the statement of uncertainty determines, to a large extent, the worth of the calibrated value." Today, that statement is just as true about calibration or research results as it was in 1968. Why is that true? What kind of information should we include in a statement of uncertainty accompanying a calibrated value? How and where do we get the information to include in an uncertainty statement? How should we interpret and use measurement uncertainty information? This discussion will provide answers to these and other questions about uncertainty in research and in calibration. The methodology to be described has been developed by national and international groups over the past nearly thirty years, and individuals were publishing information even earlier. Yet the work is largely unknown in many science and engineering arenas. I will illustrate various aspects of uncertainty analysis with some examples drawn from the radiometry measurement and calibration discipline and from research activities.
Generalized uncertainty principle and thermostatistics: a semiclassical approach
Abbasiyan-Motlaq, M
2015-01-01T23:59:59.000Z
We present an exact treatment of the thermodynamics of physical systems in the framework of the generalized uncertainty principle (GUP). Our purpose is to study and compare the consequences of two GUPs: one implies a minimal length, while the other predicts a minimal length and a maximal momentum. Using a semiclassical method, we exactly calculate the modified internal energies and heat capacities in the presence of generalized commutation relations. We show that the total shift in these quantities depends only on the deformed algebra, not on the system under study. Finally, the modified internal energy for a specific physical system, such as the ideal gas, is obtained in the framework of the two different GUPs.
Reaction rate uncertainties and the νp-process
Froehlich, C.; Rauscher, T. [Department of Physics, North Carolina State University, Raleigh, NC 27695 (United States); Dept. of Physics, University of Basel, 4056 Basel (Switzerland)
2012-11-12T23:59:59.000Z
Current hydrodynamical simulations of core collapse supernovae find proton-rich early ejecta. At the same time, the models fail to eject neutron-rich matter, thus leaving the origin of the main r-process elements unsolved. However, the proton-rich neutrino-driven winds from supernovae have been identified as a possible production site for light n-capture elements beyond iron (such as Ge, Sr, Y, Zr) through the νp-process. The detailed nucleosynthesis patterns of the νp-process depend on the hydrodynamic conditions and the nuclear reaction rates of key reactions. We investigate the impact of reaction rate uncertainties on the νp-process nucleosynthesis.
Epistemologies of uncertainty : governing CO2 capture and storage science and technology
Evar, Benjamin
2014-11-27T23:59:59.000Z
This thesis progresses from a ‘science and technology studies’ (STS) perspective to consider the ways that expert stakeholders perceive and communicate uncertainties and risks attached to carbon ...
ImpactMap: Designing Sustainable Supply Chains by Incorporating Data Uncertainty
Fuge, Mark; McKinstry, Katherine; Ninomiya, Kevin
2013-01-01T23:59:59.000Z
Analyzing Uncertainty in Life-Cycle Assessment: A Survey of; issues in data quality analysis in life cycle assessment. International Journal of Life Cycle Assessment, 17 (1): pp.
Nuclear data uncertainties by the PWR MOX/UO₂ core rod ejection benchmark
Pasichnyk, I.; Klein, M.; Velkov, K.; Zwermann, W.; Pautz, A. [Boltzmannstr. 14, D-85748 Garching b. Muenchen (Germany)
2012-07-01T23:59:59.000Z
Rod ejection transient of the OECD/NEA and U.S. NRC PWR MOX/UO₂ core benchmark is considered under the influence of nuclear data uncertainties. Using the GRS uncertainty and sensitivity software package XSUSA, the propagation of the uncertainties in nuclear data up to the transient calculations is considered. A statistically representative set of transient calculations is analyzed, and both integral and local output quantities are compared with the benchmark results of different participants. It is shown that the uncertainties in nuclear data play a crucial role in the interpretation of the results of the simulation. (authors)
Li, M.
2013-01-01T23:59:59.000Z
error model for calibration and uncertainty estimation of; and T. Wagener (2005), Model calibration and uncertainty; and A. Mailhot (2008), Calibration of hydrological model
Carbon Accounting and Economic Model Uncertainty of Emissions from Biofuels-Induced Land Use Change
Plevin, Richard J; Beckman, Jayson; Golub, Alla A; Witcover, Julie; O'??Hare, Michael
2015-01-01T23:59:59.000Z
uncertainty of full carbon accounting of forest ecosystems; A.; Hopson, E., Proper accounting for time increases crop-; use change modeling in GTEM: Accounting for forest sinks.
Zhang, Xuesong; Zhao, Kaiguang
2012-06-01T23:59:59.000Z
Bayesian Neural Networks (BNNs) have been shown to be useful tools for analyzing the modeling uncertainty of Neural Networks (NNs). This research focuses on the comparison of two BNNs. The first BNN (BNN-I) uses statistical methods to describe the characteristics of different uncertainty sources (input, parameter, and model structure) and integrates these uncertainties into a Markov Chain Monte Carlo (MCMC) framework to estimate total uncertainty. The second BNN (BNN-II) lumps all uncertainties into a single error term (i.e., the residual between model prediction and measurement). In this study, we propose a simple BNN-II, which uses Genetic Algorithms (GA) and Bayesian Model Averaging (BMA) to calibrate Neural Networks with different structures (numbers of hidden units) and combines the predictions from the different NNs to derive predictions and uncertainty estimates. We tested these two BNNs in two watersheds for daily and monthly hydrologic simulation. The BMA-based BNN developed in this study outperforms BNN-I in the two watersheds in terms of both accurate prediction and uncertainty estimation. These results show that, given incomplete understanding of the characteristics associated with each uncertainty source, the simple lumped error approach may yield better prediction and uncertainty estimation.
Life Cycle Regulation of Transportation Fuels: Uncertainty and its Policy Implications
Plevin, Richard Jay
2010-01-01T23:59:59.000Z
tainty in quantitative risk and policy analysis. Cambridge; quantitative and qualitative measures of uncertainty in model-based environmental assessment: The NUSAP system. Risk Analysis
Framework for Modeling the Uncertainty of Future Events in Life Cycle Assessment
Chen, Yi-Fen; Simon, Rachel; Dornfeld, David
2013-01-01T23:59:59.000Z
event scenarios could alter LCA results. References: Schweimer; Economic-balance hybrid LCA extended with uncertainty; Life Cycle Assessment (LCA) is a leading technique used to
Emery, K.
2009-08-01T23:59:59.000Z
Discusses NREL Photovoltaic Cell and Module Performance Characterization Group's procedures to achieve lowest practical uncertainty in measuring PV performance with respect to reference conditions.
Kim, Alex G.; Miquel, Ramon
2005-09-26T23:59:59.000Z
We present a new technique to extract the cosmological information from high-redshift supernova data in the presence of calibration errors and extinction due to dust. While in the traditional technique the distance modulus of each supernova is determined separately, in our approach we determine all distance moduli at once, in a process that achieves a significant degree of self-calibration. The result is a much reduced sensitivity of the cosmological parameters to the calibration uncertainties. As an example, for a strawman mission similar to that outlined in the SNAP satellite proposal, the increased precision obtained with the new approach is roughly equivalent to a factor of five decrease in the calibration uncertainty.
Srinivasan, Sanjay [Univ. of Texas, Austin, TX (United States)
2014-09-30T23:59:59.000Z
In-depth understanding of the long-term fate of CO₂ in the subsurface requires study and analysis of the reservoir formation, the overlying caprock formation, and adjacent faults. Because there is significant uncertainty in predicting the location and extent of geologic heterogeneity that can impact the future migration of CO₂ in the subsurface, there is a need to develop algorithms that can reliably quantify this uncertainty in plume migration. This project is focused on the development of a model selection algorithm that refines an initial suite of subsurface models representing the prior uncertainty to create a posterior set of subsurface models that reflect injection performance consistent with that observed. Such posterior models can be used to represent uncertainty in the future migration of the CO₂ plume. Because only injection data are required, the method provides a very inexpensive way to map the migration of the plume and the associated uncertainty in migration paths. The model selection method developed as part of this project mainly consists of assessing the connectivity/dynamic characteristics of a large prior ensemble of models, grouping the models on the basis of their expected dynamic response, selecting the subgroup of models that yield dynamic responses closest to the observed dynamic data, and finally quantifying the uncertainty in plume migration using the selected subset of models. The main accomplishment of the project is the development of a software module within the SGEMS earth modeling software package that implements the model selection methodology. This software module was subsequently applied to analyze CO₂ plume migration in two field projects: the In Salah CO₂ Injection project in Algeria and CO₂ injection into the Utsira formation in Norway.
These applications of the software revealed that the proxies developed in this project for quickly assessing the dynamic characteristics of the reservoir were highly efficient and yielded accurate grouping of reservoir models. The plume migration paths probabilistically assessed by the method were confirmed by field observations and auxiliary data. The report also documents the application of the software to answer practical questions such as the optimum location of monitoring wells to reliably assess the migration of the CO₂ plume, the effect of CO₂-rock interactions on plume migration and the ability to detect the plume under those conditions, and the effect of a slow, unresolved leak on the predictions of plume migration.
CALiPER Exploratory Study: Accounting for Uncertainty in Lumen Measurements
Bergman, Rolf; Paget, Maria L.; Richman, Eric E.
2011-03-31T23:59:59.000Z
With a well-defined and shared understanding of uncertainty in lumen measurements, testing laboratories can better evaluate their processes, contributing to greater consistency and credibility of lighting testing, a key component of the U.S. Department of Energy (DOE) Commercially Available LED Product Evaluation and Reporting (CALiPER) program. Reliable lighting testing is a crucial underlying factor contributing toward the success of many energy-efficient lighting efforts, such as the DOE GATEWAY demonstrations, the Lighting Facts Label, ENERGY STAR® energy-efficient lighting programs, and many others. Uncertainty in measurements is inherent to all testing methodologies, including photometric and other lighting-related testing. Uncertainty exists for all equipment, processes, and systems of measurement, in individual as well as combined ways. A major issue with testing and the resulting accuracy of the tests is the uncertainty of the complete process. Individual equipment uncertainties are typically identified, but their relative value in practice and their combined value with other equipment and processes in the same test are elusive concepts, particularly for complex types of testing such as photometry. The total combined uncertainty of a measurement result is important for repeatable and comparative measurements of light emitting diode (LED) products in comparison with other technologies as well as competing products. This study provides a detailed and step-by-step method for determining uncertainty in lumen measurements, developed in close coordination with related standards efforts and key industry experts. This report uses the structure proposed in the Guide to the Expression of Uncertainty in Measurement (GUM) for evaluating and expressing uncertainty in measurements.
The steps of the procedure are described and a spreadsheet format adapted for integrating sphere and goniophotometric uncertainty measurements is provided for entering parameters, ordering the information, calculating intermediate values and, finally, obtaining expanded uncertainties. Using this basis and examining each step of the photometric measurement and calibration methods, mathematical uncertainty models are developed. Determination of estimated values of input variables is discussed. Guidance is provided for the evaluation of the standard uncertainties of each input estimate, covariances associated with input estimates and the calculation of the result measurements. With this basis, the combined uncertainty of the measurement results and finally, the expanded uncertainty can be determined.
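The combine-then-expand arithmetic at the heart of the GUM procedure follows the standard root-sum-square form; the sensitivity coefficients and component uncertainties below are made-up numbers for illustration only.

```python
import math

# Hypothetical uncertainty budget: (sensitivity coefficient c_i, standard
# uncertainty u_i) for independent (uncorrelated) input estimates.
budget = [(1.0, 0.5), (2.0, 0.3)]

def combined_standard_uncertainty(components):
    # GUM law of propagation for uncorrelated inputs:
    # u_c = sqrt( sum_i (c_i * u_i)^2 )
    return math.sqrt(sum((c * u) ** 2 for c, u in components))

u_c = combined_standard_uncertainty(budget)
U = 2.0 * u_c   # expanded uncertainty with coverage factor k = 2 (~95 %)
```

Correlated inputs would add covariance cross-terms to the sum, which is exactly why the spreadsheet procedure described above asks for covariances associated with the input estimates.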
Amand Faessler; G. L. Fogli; E. Lisi; V. Rodin; A. M. Rotunno; F. Simkovic
2009-03-06T23:59:59.000Z
The variances and covariances associated to the nuclear matrix elements (NME) of neutrinoless double beta decay are estimated within the quasiparticle random phase approximation (QRPA). It is shown that correlated NME uncertainties play an important role in the comparison of neutrinoless double beta decay rates for different nuclei, and that they are degenerate with the uncertainty in the reconstructed Majorana neutrino mass.
Protat, Alain
Assessment of uncertainty in cloud radiative effects and heating rates through the retrieval algorithm. The effect of uncertainty in retrieved quantities on the cloud radiative effect and radiative heating rates translates into sometimes large differences in cloud shortwave radiative effect (CRE), though the majority
Energy, Exergy and Uncertainty Analyses of the Thermal Response Test for a Ground Heat Exchanger
Al-Shayea, Naser Abdul-Rahman
This paper presents energy, exergy and uncertainty analyses for the thermal response tests of a ground heat exchanger conducted in Saudi Arabia from September 2007 to April 2008, covering the energy and exergy transports of these thermal response tests.
Hawking temperature for various kinds of black holes from Heisenberg uncertainty principle
Fabio Scardigli
2006-07-04T23:59:59.000Z
Hawking temperature is computed for a large class of black holes (with spherical, toroidal and hyperboloidal topologies) using only laws of classical physics plus the "classical" Heisenberg Uncertainty Principle. This principle is shown to be fully sufficient to get the result, with no need, for this purpose, of a Generalized Uncertainty Principle.
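For scale, the standard Schwarzschild result that such derivations reproduce, T_H = ħc³/(8πGMk_B), gives roughly 6×10⁻⁸ K for one solar mass; the snippet below simply evaluates that textbook formula and is not the paper's derivation.

```python
import math

# SI constants (CODATA, rounded) and one solar mass in kg.
hbar = 1.055e-34   # J*s
c = 2.998e8        # m/s
G = 6.674e-11      # m^3 kg^-1 s^-2
k_B = 1.381e-23    # J/K
M_sun = 1.989e30   # kg

def hawking_temperature(M):
    # Hawking temperature of a Schwarzschild black hole of mass M.
    return hbar * c**3 / (8.0 * math.pi * G * M * k_B)

T_H = hawking_temperature(M_sun)   # ~6e-8 K for a solar-mass black hole
```

The inverse dependence on M means only very light black holes would be measurably hot, which is why such temperatures are discussed theoretically rather than observed.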
ASSESSING THE UNCERTAINTY OF WIND POWER PREDICTIONS WITH REGARD TO SPECIFIC WEATHER SITUATIONS
Heinemann, Detlev
Existing weather classification schemes are used to investigate the impact of the overall weather situation on the prediction error. While for a number of sites the prediction uncertainty is significantly lower in weather
The hedge value of international emissions trading under uncertainty
Webster, Mort; Sergey
Keywords: Climate change; Emissions trading; Uncertainty
This paper estimates the value of international emissions trading, focusing on a heretofore neglected component: its value as a hedge against
Uncertainty in Scenarios of Human-Caused Climate Change
Mantua, Nathan
greenhouse gas emissions and atmospheric concentrations, and second is the uncertainty associated for eliminating, or even vastly reducing, environmental uncertainty for the purpose of improved natural resource emerged on key aspects of global climate change: humans have unquestionably altered the composition
Huang, Yinlun
Sustainable distributed biodiesel manufacturing under uncertainty: An interval
A sophisticated biodiesel manufacturing study demonstrated methodological efficacy.
Keywords: Simulation; Uncertainty
Biodiesel, a clean-burning alternative fuel, can be produced using
A Review of Uncertainty in Data Visualization Ken Brodlie, Rodolfo Allendes Osorio and Adriano Lopes
Brodlie, Ken
Many researchers have studied how uncertainty can be incorporated in data visualization, and in this article we review their work. We place the work in the context of a reference model for data visualization, so that a perspective of the discipline is maintained. Key words: visualization; uncertainty; errors.
Variation and Uncertainty in Evaporation from a Subtropical Estuary: Florida Bay
Miami, University of
René M. Price et al. Evaporation from Florida Bay was estimated with both vapor flux and energy budget methods. The results were placed into a long-term context using a 33-yr record. The overall uncertainty in monthly evaporation ranged from 9% to 26%. Over a 33-yr period (1970 …)
A Unified Treatment of Uncertainties
Wang, Pei (Center for Research on Concepts and Cognition, Indiana University)
Reasoning under uncertainty is an active research field, where several approaches have been suggested and studied for dealing with various types of uncertainty. However, it is hard to rank the approaches in general, because each of them …
©1997-2001 by M. Kostic, Ch. 5: Uncertainty/Error Analysis
Kostic, Milivoje M.
Ch. 5: Uncertainty/Error Analysis covers: Introduction; Bias and Precision errors; Uncertainty Summation/Propagation (Expanded Combined Uncertainty); Problem 5-30. Remember: u = d_%P = t_%P · S (at the corresponding probability %P); z = t = d/S.
Leonard, John J.
Lagoon of Venice ecosystem: Seasonal dynamics and environmental guidance with uncertainty analyses. The model analyses the seasonal ecosystem dynamics of the Lagoon of Venice and provides guidance on monitoring and management; stochastic ecosystem modeling components are developed to represent prior uncertainties in the Lagoon.
TOTAL MEASUREMENT UNCERTAINTY IN HOLDUP MEASUREMENTS AT THE PLUTONIUM FINISHING PLANT (PFP)
KEELE, B.D.
2007-07-05T23:59:59.000Z
An approach to determine the total measurement uncertainty (TMU) associated with Generalized Geometry Holdup (GGH) [1,2,3] measurements was developed and implemented in 2004 and 2005 [4]. This paper describes a condensed version of the TMU calculational model, including recent developments. Recent modifications to the TMU calculation model include a change in the attenuation uncertainty, clarifying the definition of the forward background uncertainty, reducing conservatism in the random uncertainty by selecting either a propagation of counting statistics or the standard deviation of the mean, and considering uncertainty in the width and height as a part of the self-attenuation uncertainty. In addition, a detection limit is calculated for point sources using equations derived from summary equations contained in Chapter 20 of MARLAP [5]. The Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2007-1 to the Secretary of Energy identified a lack of requirements and a lack of standardization for performing measurements across the U.S. Department of Energy (DOE) complex. The DNFSB also recommended that guidance be developed for a consistent application of uncertainty values. As such, the recent modifications to the TMU calculational model described in this paper have not yet been implemented. The Plutonium Finishing Plant (PFP) is continuing to perform uncertainty calculations as per Reference 4. These concepts are published at this time so that they can be considered in developing a consensus methodology across the complex.
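The selection described above between propagating counting statistics and taking the standard deviation of the mean can be illustrated with a small sketch (all numbers invented; the "take the larger" rule at the end is an assumption for illustration, not the PFP selection logic):

```python
import math

def random_uncertainty(counts, live_time_s):
    """Estimate the random uncertainty of a mean count rate two ways:
    (a) propagation of Poisson counting statistics, (b) standard
    deviation of the mean of the repeated measurements."""
    n = len(counts)
    rates = [c / live_time_s for c in counts]
    mean_rate = sum(rates) / n

    # (a) Poisson: sigma of a count N is sqrt(N), so the rate formed from
    # the summed counts has sigma = sqrt(total) / (n * live_time_s)
    total = sum(counts)
    sigma_poisson = math.sqrt(total) / (n * live_time_s)

    # (b) standard deviation of the mean of the individual rates
    var = sum((r - mean_rate) ** 2 for r in rates) / (n - 1)
    sigma_sdom = math.sqrt(var / n)

    # illustrative choice: keep the larger (more conservative) of the two
    return mean_rate, max(sigma_poisson, sigma_sdom)
```

For three 10-second counts of 100, 110 and 90, the two estimates happen to nearly coincide at about 0.58 counts/s on a mean rate of 10 counts/s; in general they differ, which is what makes the selection matter.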
Life Cycle Regulation of Transportation Fuels: Uncertainty and its Policy Implications
Kammen, Daniel M.
Life Cycle Regulation of Transportation Fuels: Uncertainty and its Policy Implications, by Richard J. Plevin, Fall 2010 (copyright 2010 by Richard J. Plevin).
Risk Assessment & Management. This chapter presents the Council's approach to addressing uncertainty and managing risk, reviewing the reasons for addressing uncertainty in the Council's Fifth Power Plan. In this plan, the Council further integrates risk assessment and management into its planning.
Interaction of loading pattern and nuclear data uncertainties in reactor core calculations
Klein, M.; Gallner, L.; Krzykacz-Hausmann, B.; Pautz, A.; Velkov, K.; Zwermann, W. [Gesellschaft fuer Anlagen- und Reaktorsicherheit GRS MbH, Boltzmannstr. 14, D- 85748 Garching b. Muenchen (Germany)
2012-07-01T23:59:59.000Z
Along with best-estimate calculations for design and safety analysis, understanding uncertainties is important to determine appropriate design margins. In this framework, nuclear data uncertainties and their propagation to full core calculations are a critical issue. To deal with this task, different error propagation techniques, deterministic and stochastic, are currently being developed to evaluate the uncertainties in the output quantities. Among these is the sampling-based uncertainty and sensitivity software XSUSA, which is able to quantify the influence of nuclear data covariance on reactor core calculations. In the present work, this software is used to investigate systematically the uncertainties in the power distributions of two PWR core loadings specified in the OECD UAM-Benchmark suite. With the help of a statistical sensitivity analysis, the main contributors to the uncertainty are determined. Using this information, a method is studied with which loading patterns of reactor cores can be optimized with regard to minimizing power distribution uncertainties. It is shown that this technique is able to halve the calculation uncertainties of a MOX/UOX core configuration. (authors)
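Sampling-based propagation of the kind described above can be sketched in a few lines: draw correlated perturbations of the input data from their covariance matrix, re-evaluate the model for each sample, and read the output uncertainty off the resulting distribution. Everything below (the two-parameter "cross-section" vector, its covariance, and the toy response `k_model`) is invented for illustration; XSUSA itself perturbs full evaluated covariance data through real core calculations.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical two-parameter nuclear data with an (invented) covariance
mean = np.array([1.50, 0.04])               # e.g. nu-fission, capture
cov = np.array([[2.5e-4, 1.0e-5],
                [1.0e-5, 4.0e-6]])

def k_model(nu_fission, capture):
    # Toy response standing in for a full core calculation
    return nu_fission / (1.0 + 25.0 * capture)

# Draw correlated samples, run the "model" for each, collect statistics
samples = rng.multivariate_normal(mean, cov, size=2000)
k = np.array([k_model(nf, cap) for nf, cap in samples])
k_mean, k_std = k.mean(), k.std(ddof=1)
```

The attraction of the approach is that the model is treated as a black box: no sensitivity derivatives are needed, at the cost of one model run per sample.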
Paris-Sud XI, Université de
CAPACITY INVESTMENT UNDER DEMAND UNCERTAINTY. AN EMPIRICAL STUDY OF THE US CEMENT INDUSTRY, 1994-2006, by Jean-Pierre Ponssard … This paper tests insights of the theory literature on this topic in an empirical study of the US cement industry between 1994 and 2006.
Navigation Planning in Probabilistic Roadmaps with Uncertainty Michael Kneebone and Richard Dearden
Yao, Xin
Probabilistic Roadmaps (PRM) are a commonly used class of algorithms for robot navigation tasks. This work extends PRM planning to handle uncertainty and observations (Probabilistic Roadmaps: Kavraki …).
An Architecture of a Multi-Agent System for SCADA -dealing with uncertainty, plans and actions
Liu, Weiru
Keywords: autonomous agents, multi-agent systems, sensors, SCADA, uncertainty, plans, actions, fusion. … in traditional SCADA systems deployed in critical environments such as electrical power generation and transmission.
4. Uncertainty D. Keil Artificial Intelligence 1/12 CSCI 400 Artificial Intelligence
Keil, David M.
CSCI 400 Artificial Intelligence, David Keil, 1/12. Topic 4: Uncertainty. How may agents act in partially observable and non-deterministic environments? Objectives: 4a. Describe ways to operate …
Quantification of Variability and Uncertainty in Hourly NOx Emissions from Coal-Fired Power Plants
Frey, H. Christopher
Regression models were developed to quantify variability and uncertainty for NOx emissions from coal-fired power plants. Data for hourly NOx emissions, heat rate, gross load and capacity factor of 32 units from 9 different power plants were analyzed. Keywords: uncertainty, variability, emission factors, coal-fired power plants, NOx emissions, regression models.
Users manual for the FORSS sensitivity and uncertainty analysis code system
Lucius, J.L.; Weisbin, C.R.; Marable, J.H.; Drischler, J.D.; Wright, R.Q.; White, J.E.
1981-01-01T23:59:59.000Z
FORSS is a code system used to study relationships between nuclear reaction cross sections, integral experiments, reactor performance parameter predictions and associated uncertainties. This report describes the computing environment and the modules currently used to implement FORSS Sensitivity and Uncertainty Methodology.
Using Uncertainty Analysis to Guide the Development of Accelerated Stress Tests (Presentation)
Kempe, M.
2014-03-01T23:59:59.000Z
Extrapolation of accelerated testing to the long-term results expected in the field has uncertainty associated with the acceleration factors and the range of possible stresses in the field. When multiple stresses (such as temperature and humidity) can be used to increase the acceleration, the uncertainty may be reduced according to which stress factors are used to accelerate the degradation.
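The sensitivity of such an extrapolation to the acceleration factor can be made concrete with a first-order sketch for a single thermal stress. The Arrhenius form and the Boltzmann constant are standard; the activation energy, its uncertainty, and the temperatures below are invented for illustration.

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def arrhenius_af(ea_ev, t_use_k, t_test_k):
    """Arrhenius acceleration factor between use and test temperature."""
    return math.exp(ea_ev / K_B * (1.0 / t_use_k - 1.0 / t_test_k))

def af_rel_uncertainty(sigma_ea_ev, t_use_k, t_test_k):
    """First-order propagation of activation-energy uncertainty:
    d(ln AF)/dEa = (1/T_use - 1/T_test)/k_B, so the relative
    uncertainty of AF is that slope times sigma(Ea)."""
    return abs(1.0 / t_use_k - 1.0 / t_test_k) / K_B * sigma_ea_ev

af = arrhenius_af(0.7, 298.15, 358.15)       # 25 C use, 85 C test
rel = af_rel_uncertainty(0.1, 298.15, 358.15)
```

With these invented numbers the acceleration factor is about 96, but a 0.1 eV uncertainty in the activation energy already produces a roughly 65% relative uncertainty in it, which is why constraining the stress model matters so much for field extrapolation.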
Accounting for Source Uncertainties in Analyses of Astronomical Survey Data (arXiv:astro-ph/0409387v1, 15 Sep 2004)
Masci, Frank
A common task in analyzing data from astronomical surveys is accounting for measurement uncertainties in the properties of the sources. A key ingredient in such analyses is accounting for the volume in the incidental parameter space via marginalization.
Uncertainties in Nuclear Matrix Elements for Neutrinoless Double-Beta Decay
Engel, Jonathan
I briefly review calculations of the matrix elements governing neutrinoless double-beta decay, and progress in reducing theoretical error.
Assessing Early Investments in Low Carbon Technologies under Uncertainty: The Case of Carbon Capture and Storage. By Eleanor Ereira; submitted to the Engineering Systems Division, Technology and Policy Program.
Threat Assessment for Safe Navigation in Environments with Uncertainty in Predictability
How, Jonathan P.
Thesis by Georges Salim …, submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy. This thesis develops threat assessment for safe navigation in environments with uncertainty in predictability.
SPATIAL DATA UNCERTAINTY IN THE VGI WORLD: GOING FROM CONSUMER TO PRODUCER (Geomatica)
Joel …
We discuss the concept of "perceived qualities" in a Volunteered Geographic Information (VGI) context, and a classification framework is proposed for various types of spatial data usage. Then we address uncertainty and VGI …
Stochastic Formulation for Uncertainty Analysis of Two-Phase Flow in
Zhang, Dongxiao
We address uncertainty in flow performance predictions due to uncertainty in the reservoir description, and solve moment equations. Accurate modeling of the physics that governs complex multiphase reservoir flows requires a detailed …
Uncertainty Quantification in Modeling HIV Viral Mechanics H.T. Banks1,2
By H.T. Banks, Robert Baraldi, … Uncertainty is quantified for the resulting parameter estimates. Key words: in-host HIV-1 progression models, uncertainty quantification. A major motivation for revisiting this model is its potential to be readily …
Uncertainty Discounting for Land-Based Carbon Sequestration Man-Keun Kim
McCarl, Bruce A.
Uncertainty Discounting for Land-Based Carbon Sequestration, by Man-Keun Kim, Post Doctoral Fellow. The effect of various stochastic factors, like weather, … from …% to 10% for the East Texas region.
Climate uncertainty and implications for U.S. state-level risk assessment through 2050.
Loose, Verne W.; Lowry, Thomas Stephen; Malczynski, Leonard A.; Tidwell, Vincent Carroll; Stamber, Kevin Louis; Kelic, Andjelka; Backus, George A.; Warren, Drake E.; Zagonel, Aldo A.; Ehlen, Mark Andrew; Klise, Geoffrey T.; Vargas, Vanessa N.
2009-10-01T23:59:59.000Z
Decisions for climate policy will need to take place in advance of climate science resolving all relevant uncertainties. Further, if the concern of policy is to reduce risk, then the best-estimate of climate change impacts may not be so important as the currently understood uncertainty associated with realizable conditions having high consequence. This study focuses on one of the most uncertain aspects of future climate change - precipitation - to understand the implications of uncertainty on risk and the near-term justification for interventions to mitigate the course of climate change. We show that the mean risk of damage to the economy from climate change, at the national level, is on the order of one trillion dollars over the next 40 years, with employment impacts of nearly 7 million labor-years. At a 1% exceedance-probability, the impact is over twice the mean-risk value. Impacts at the level of individual U.S. states are then typically in the multiple tens of billions of dollars range with employment losses exceeding hundreds of thousands of labor-years. We used results of the Intergovernmental Panel on Climate Change's (IPCC) Fourth Assessment Report (AR4) climate-model ensemble as the referent for climate uncertainty over the next 40 years, mapped the simulated weather hydrologically to the county level for determining the physical consequence to economic activity at the state level, and then performed a detailed, seventy-industry, analysis of economic impact among the interacting lower-48 states. We determined industry GDP and employment impacts at the state level, as well as interstate population migration, effect on personal income, and the consequences for the U.S. trade balance.
J. D. McDonnell; N. Schunck; D. Higdon; J. Sarich; S. M. Wild; W. Nazarewicz
2015-01-15T23:59:59.000Z
Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models; to estimate model errors and thereby improve predictive capability; to extrapolate beyond the regions reached by experiment; and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. The example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
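The emulator-based propagation step can be sketched generically: fit a cheap Gaussian-process surrogate to a handful of expensive model runs, then push posterior parameter samples through the surrogate instead of the full model. The one-parameter "model", training grid, and assumed posterior below are all invented; the actual study uses multi-parameter Skyrme functionals and a full Bayesian posterior.

```python
import numpy as np

rng = np.random.default_rng(0)

def expensive_model(theta):
    # Stand-in for a costly model evaluation (e.g. a DFT mass calculation)
    return np.sin(theta) + 0.5 * theta

# 1. Train a tiny GP emulator on a few model runs
x_train = np.linspace(0.0, 3.0, 8)
y_train = expensive_model(x_train)

def rbf(a, b, length=1.0, var=1.0):
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

K = rbf(x_train, x_train) + 1e-6 * np.eye(len(x_train))  # jitter for stability
alpha = np.linalg.solve(K, y_train)

def emulate(x):
    # GP posterior mean: noise-free interpolation of the training runs
    return rbf(np.atleast_1d(x), x_train) @ alpha

# 2. Propagate an (assumed) posterior over theta through the cheap emulator
theta_post = rng.normal(1.5, 0.2, size=5000)
pred = emulate(theta_post)
pred_mean, pred_std = pred.mean(), pred.std(ddof=1)
```

The design point is that step 2 costs microseconds per sample, so the posterior can be propagated with tens of thousands of draws even when each true model run takes hours.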
Comparison of nuclear data uncertainty propagation methodologies for PWR burn-up simulations
Diez, Carlos Javier; Hoefer, Axel; Porsch, Dieter; Cabellos, Oscar
2014-01-01T23:59:59.000Z
Several methodologies using different levels of approximations have been developed for propagating nuclear data uncertainties in nuclear burn-up simulations. Most methods fall into the two broad classes of Monte Carlo approaches, which are exact apart from statistical uncertainties but require additional computation time, and first order perturbation theory approaches, which are efficient for not too large numbers of considered response functions but only applicable for sufficiently small nuclear data uncertainties. Some methods neglect isotopic composition uncertainties induced by the depletion steps of the simulations, others neglect neutron flux uncertainties, and the accuracy of a given approximation is often very hard to quantify. In order to get a better sense of the impact of different approximations, this work aims to compare results obtained based on different approximate methodologies with an exact method, namely the NUDUNA Monte Carlo based approach developed by AREVA GmbH. In addition, the impact ...
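For small uncertainties, the first-order perturbation alternative mentioned above reduces to the familiar "sandwich rule": the response variance is S C Sᵀ, with S the vector of relative sensitivities and C the relative covariance matrix of the nuclear data. A minimal sketch with invented numbers:

```python
import numpy as np

# Sandwich rule: var(R) = S @ C @ S.T (all values illustrative)
S = np.array([[0.8, -0.3, 0.1]])            # sensitivities (dR/R per dsigma/sigma)
C = np.array([[4.0e-4, 1.0e-5, 0.0],
              [1.0e-5, 9.0e-4, 2.0e-5],
              [0.0,    2.0e-5, 1.0e-4]])    # relative covariance matrix

var = (S @ C @ S.T)[0, 0]
rel_std = float(np.sqrt(var))               # relative std dev of the response
```

The Monte Carlo class of methods estimates the same variance by sampling from C and re-running the simulation, at the cost of many model evaluations but without the small-uncertainty linearization.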
Thorough approach to measurement uncertainty analysis applied to immersed heat exchanger testing
Farrington, R.B.; Wells, C.V.
1986-04-01T23:59:59.000Z
This paper discusses the value of an uncertainty analysis and how to determine measurement uncertainty, then details the sources of error in instrument calibration, data acquisition, and data reduction for a particular experiment. Methods are presented to determine both the systematic (or bias) error in an experiment and the random (or precision) error. The detailed analysis is applied to two sets of conditions in measuring the effectiveness of an immersed coil heat exchanger. It shows the value of such analysis as well as an approach to reduce overall measurement uncertainty and to improve the experiment. This paper outlines how to perform an uncertainty analysis and then provides a detailed example of how to apply the methods discussed in the paper. The authors hope this paper will encourage researchers and others to become more concerned with their measurement processes and to report measurement uncertainty with all of their test results.
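The combination step can be sketched in the style of ASME/ANSI PTC 19.1: root-sum-square the elemental bias limits and precision indices from calibration, acquisition, and reduction, then combine them as U = sqrt(B² + (tS)²). The error values below are invented; the standard also defines an additive combination and rules for choosing t.

```python
import math

def combined_uncertainty(bias_limits, precision_indices, t=2.0):
    """PTC 19.1-style RSS combination of elemental error sources.
    t is the Student-t coverage factor (about 2 at 95% for large samples)."""
    B = math.sqrt(sum(b * b for b in bias_limits))        # overall bias limit
    S = math.sqrt(sum(s * s for s in precision_indices))  # overall precision index
    return math.sqrt(B ** 2 + (t * S) ** 2)               # U_RSS

# e.g. one temperature channel: calibration and DAQ bias, two precision sources
U = combined_uncertainty(bias_limits=[0.1, 0.05],
                         precision_indices=[0.03, 0.04], t=2.0)
```

Keeping the bias and precision totals separate until the last line is what lets an experimenter see which class of error dominates and where to spend effort reducing it.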
Results for Phase I of the IAEA Coordinated Research Program on HTGR Uncertainties
Strydom, Gerhard; Bostelmann, Friederike; Yoon, Su Jong
2015-01-01T23:59:59.000Z
The quantification of uncertainties in design and safety analysis of reactors is today not only broadly accepted, but has in many cases become the preferred replacement for traditional conservative analysis in safety and licensing work. The use of a more fundamental methodology is also consistent with the reliable high fidelity physics models and robust, efficient, and accurate codes available today. To facilitate uncertainty analysis applications, a comprehensive approach and methodology must be developed and applied. High Temperature Gas-cooled Reactors (HTGRs) have their own peculiarities: coated particle design, large graphite quantities, different materials, and high temperatures, which impose additional simulation requirements. The IAEA has therefore launched a Coordinated Research Project (CRP) on HTGR Uncertainty Analysis in Modeling (UAM) in 2013 to study uncertainty propagation specifically in the HTGR analysis chain. Two benchmark problems are defined, with the prismatic design represented by the General Atomics (GA) MHTGR-350 and a 250 MW modular pebble bed design similar to the HTR-PM (INET, China). This report summarizes the contributions of the HTGR Methods Simulation group at Idaho National Laboratory (INL) up to this point of the CRP. The activities at INL have been focused so far on creating the problem specifications for the prismatic design, as well as providing reference solutions for the exercises defined for Phase I. An overview is provided of the HTGR UAM objectives and scope, and the detailed specifications for Exercises I-1, I-2, I-3 and I-4 are also included here for completeness. The main focus of the report is the compilation and discussion of reference results for Phase I (i.e. for input parameters at their nominal or best-estimate values), which is defined as the first step of the uncertainty quantification process.
These reference results can be used by other CRP participants for comparison with other codes or their own reference results. The status of the Monte Carlo modeling of the experimental VHTRC facility is also discussed. Reference results were obtained for the neutronics stand-alone cases (Ex. I-1 and Ex. I-2) using the (relatively new) Monte Carlo code Serpent, and comparisons were performed with the more established Monte Carlo codes MCNP and KENO-VI. For the thermal-fluids stand-alone cases (Ex. I-3 and I-4) the commercial CFD code CFX was utilized to obtain reference results that can be compared with lower fidelity tools.
SENSITIVITY AND UNCERTAINTY ANALYSIS OF COMMERCIAL REACTOR CRITICALS FOR BURNUP CREDIT
Radulescu, Georgeta [ORNL; Mueller, Don [ORNL; Wagner, John C [ORNL
2009-01-01T23:59:59.000Z
The purpose of this study is to provide insights into the neutronic similarities that may exist between a generic cask containing typical spent nuclear fuel assemblies and commercial reactor critical (CRC) state-points. Forty CRC state-points from five pressurized-water reactors were selected for the study and the type of CRC state-points that may be applicable for validation of burnup credit criticality safety calculations for spent fuel transport/storage/disposal systems are identified. The study employed cross-section sensitivity and uncertainty analysis methods developed at Oak Ridge National Laboratory and the TSUNAMI set of tools in the SCALE code system as a means to investigate system similarity on an integral and nuclide-reaction specific level. The results indicate that, except for the fresh fuel core configuration, all analyzed CRC state-points are either highly similar, similar, or marginally similar to a generic cask containing spent nuclear fuel assemblies with burnups ranging from 10 to 60 GWd/MTU. Based on the integral system parameter, C{sub k}, approximately 30 of the 40 CRC state-points are applicable to validation of burnup credit in the generic cask containing typical spent fuel assemblies with burnups ranging from 10 to 60 GWd/MTU. The state-points providing the highest similarity (C{sub k} > 0.95) were attained at or near the end of a reactor cycle. The C{sub k} values are dominated by neutron reactions with major actinides and hydrogen, as the sensitivities of these reactions are much higher than those of the minor actinides and fission products. On a nuclide-reaction specific level, the CRC state-points provide significant similarity for most of the actinides and fission products relevant to burnup credit. A comparison of energy-dependent sensitivity profiles shows a slight shift of the CRC K{sub eff} sensitivity profiles toward higher energies in the thermal region as compared to the K{sub eff} sensitivity profile of the generic cask. 
Parameters representing coverage of the application by the CRCs on an energy-dependent, nuclide-reaction specific level (i.e., effectiveness of the CRCs for validating the cross sections as used in the application) were also examined. Based on the CRCs with C{sub k} > 0.8 and an assumed relative standard deviation for uncovered covariance data of 25%, the relative standard deviation of K{sub eff} due to uncovered sensitivity data varies from 0.79% to 0.95% for cask burnups ranging from 10 to 60 GWd/MTU. As expected, this uncertainty in K{sub eff} is largely dominated by noncoverage of sensitivities from major actinides and hydrogen. The contributions from fission products and minor actinides are very small and comparable to statistical uncertainties in K{sub eff} results. These results (again, assuming a 25% uncertainty for uncovered covariance data) indicate that there could be approximately 1% uncertainty in the calculated application K{sub eff} due to incomplete neutronic testing (validation) of the software by the CRCs. However, this conclusion also assumes all other uncertainties in the complex CRC configurations (e.g., isotopic compositions of burned fuel, operation history, data) are well known. Thus, an evaluation of the uncertainties in the CRC configurations is needed prior to the use of CRCs for code validation (i.e., quantifying code bias and bias uncertainty).
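The integral similarity parameter c_k used above can be written as the correlation coefficient of the nuclear-data-induced uncertainties of the two systems. A sketch with invented sensitivities and a diagonal covariance (real TSUNAMI evaluations use full energy-dependent sensitivity profiles and covariance libraries):

```python
import numpy as np

Sa = np.array([0.9, -0.2, 0.05])       # sensitivities, application (cask)
Sb = np.array([0.85, -0.25, 0.1])      # sensitivities, CRC state-point
C = np.diag([4.0e-4, 1.0e-4, 2.5e-5])  # illustrative covariance data

# c_k = shared data-induced covariance / product of the individual std devs
ck = (Sa @ C @ Sb) / np.sqrt((Sa @ C @ Sa) * (Sb @ C @ Sb))
```

Values of c_k near 1 indicate that the two systems respond to the same data uncertainties in the same way, which is the sense in which a state-point with c_k > 0.95 is "highly similar" to the cask.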
Calibration under uncertainty for finite element models of masonry monuments
Atamturktur, Sezer; Hemez, Francois; Unal, Cetin
2010-02-01T23:59:59.000Z
Historical unreinforced masonry buildings often include features such as load bearing unreinforced masonry vaults and their supporting framework of piers, fill, buttresses, and walls. The masonry vaults of such buildings are among the most vulnerable structural components and certainly among the most challenging to analyze. The versatility of finite element (FE) analyses in incorporating various constitutive laws, as well as practically all geometric configurations, has resulted in the widespread use of the FE method for the analysis of complex unreinforced masonry structures over the last three decades. However, an FE model is only as accurate as its input parameters, and there are two fundamental challenges while defining FE model input parameters: (1) material properties and (2) support conditions. The difficulties in defining these two aspects of the FE model arise from the lack of knowledge in the common engineering understanding of masonry behavior. As a result, engineers are unable to define these FE model input parameters with certainty, and, inevitably, uncertainties are introduced to the FE model.
Clocking in the face of unpredictability beyond quantum uncertainty
F. Hadi Madjid; John M. Myers
2015-04-16T23:59:59.000Z
In earlier papers we showed unpredictability beyond quantum uncertainty in atomic clocks, ensuing from a proven gap between given evidence and explanations of that evidence. Here we reconceive a clock, not as an isolated entity, but as enmeshed in a self-adjusting communications network adapted to one or another particular investigation, in contact with an unpredictable environment. From the practical uses of clocks, we abstract a clock enlivened with the computational capacity of a Turing machine, modified to transmit and to receive numerical communications. Such "live clocks" phase the steps of their computations to mesh with the arrival of transmitted numbers. We lift this phasing, known in digital communications, to a principle of logical synchronization, distinct from the synchronization defined by Einstein in special relativity. Logical synchronization elevates digital communication to a topic in physics, including applications to biology. One explores how feedback loops in clocking affect numerical signaling among entities functioning in the face of unpredictable influences, making the influences themselves into subjects of investigation. The formulation of communications networks in terms of live clocks extends information theory by expressing the need to actively maintain communications channels, and potentially, to create or drop them. We show how networks of live clocks are presupposed by the concept of coordinates in a spacetime. A network serves as an organizing principle, even when the concept of the rigid body that anchors a special-relativistic coordinate system is inapplicable, as is the case, for example, in a generic curved spacetime.
A preliminary study to Assess Model Uncertainties in Fluid Flows
Marc Oliver Delchini; Jean C. Ragusa
2009-09-01T23:59:59.000Z
The goal of this study is to assess the impact of various flow models for a simplified primary coolant loop of a light water nuclear reactor. The various fluid flow models are based on the Euler equations with an additional friction term, gravity term, momentum source, and energy source. The geometric model is purposefully chosen simple and consists of a one-dimensional (1D) loop system in order to focus the study on the validity of various fluid flow approximations. The 1D loop system is represented by a rectangle; the fluid is heated up along one of the vertical legs and cooled down along the opposite leg. A pressurizer and a pump are included in the horizontal legs. The amount of energy transferred and removed from the system is equal in absolute value along the two vertical legs. The various fluid flow approximations are compressible vs. incompressible, and complete momentum equation vs. Darcy’s approximation. The ultimate goal is to compute the fluid flow models’ uncertainties and, if possible, to generate validity ranges for these models when applied to reactor analysis. We also limit this study to single phase flows with low-Mach numbers. As a result, sound waves carry a very small amount of energy in this particular case. A standard finite volume method is used for the spatial discretization of the system.
Le Pallec, J. C.; Crouzet, N.; Bergeaud, V.; Delavaud, C. [CEA/DEN/DM2S, CEA/Saclay, 91191 Gif sur Yvette Cedex (France)
2012-07-01T23:59:59.000Z
The control of uncertainties in the field of reactor physics and their propagation in best-estimate modeling is a major issue in safety analysis. In this framework, the CEA develops a methodology to perform multi-physics simulations including uncertainty analysis. The present paper aims to present and apply this methodology to the analysis of an accidental situation such as REA (Rod Ejection Accident). This accident is characterized by a strong interaction between the different areas of reactor physics (neutronics, fuel thermal behavior, and thermal hydraulics). The modeling is performed with the CRONOS2 code. The uncertainty analysis was conducted with the URANIE platform developed by the CEA: for each identified response of the modeling (output), and considering a set of key parameters with their uncertainties (input), a surrogate model in the form of a neural network was produced. The set of neural networks is then used to carry out a sensitivity analysis, which consists of a global variance analysis with determination of the Sobol indices for all responses. The sensitivity indices are obtained for the input parameters by an approach based on the use of polynomial chaos. The present exercise helped to develop a methodological flow scheme and to consolidate the use of the URANIE tool in the framework of parallel calculations. Finally, the use of polynomial chaos allowed computing high-order sensitivity indices and thus highlighting and classifying the influence of the identified uncertainties on each response of the analysis (single and interaction effects). (authors)
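Once a cheap surrogate exists, first-order Sobol indices can also be estimated directly by Monte Carlo "pick-freeze" sampling, which is what polynomial-chaos coefficients encode analytically. The toy surrogate below is invented (a real case would be the trained neural network); the estimator is a standard Saltelli-type one.

```python
import numpy as np

rng = np.random.default_rng(1)

def surrogate(x):
    # Invented stand-in for the trained surrogate model
    return x[:, 0] + 2.0 * x[:, 1] + x[:, 0] * x[:, 1]

n, d = 200_000, 2
A = rng.uniform(-1.0, 1.0, (n, d))   # two independent sample matrices
B = rng.uniform(-1.0, 1.0, (n, d))
yA, yB = surrogate(A), surrogate(B)
total_var = yA.var()

sobol = []
for i in range(d):
    AB = B.copy()
    AB[:, i] = A[:, i]               # freeze input i, resample the others
    yAB = surrogate(AB)
    sobol.append(np.mean(yA * (yAB - yB)) / total_var)  # first-order index
```

For this toy function the exact first-order indices are 3/16 and 12/16, with the remaining 1/16 attributable to the interaction term, so the indices both rank the inputs and reveal how much variance is interaction-driven.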
Uncertainty evaluation for the matrix 'solidified state' of fissionable elements
Iliescu, Elena; Iancso, Georgeta [National Institute of R and D for Physics and Nuclear Engineering-'Horia Hulubei', Str.Reactorului 30, P.O. BOX MG-6, Magurele (Romania)
2012-09-06T23:59:59.000Z
In the analysis of radioactive liquid samples, whatever relative physical analysis method is used, two impediments arise from the behavior in time of the dispersion state of the liquid samples to be analyzed and of the standard used in the analysis. The first concerns the state of the sample at the moment of sampling, which alters during the time elapsed between sampling and analysis. The other impediment is the natural change of the dispersion state of the standard radioactive solutions, due to the occurrence and evolution in time of radiocolloidal and pseudo-radiocolloidal states. These radiocolloidal states are states of aggregation, and they lead to the destruction of the homogeneity of the solutions. Taking into consideration the advantages offered by relative physical methods of analysis over chemical or radiochemical ones, different ways of eliminating these impediments have been tried. We eliminated them by processing the liquid reference materials (the solutions calibrated in radionuclides of interest) immediately after preparation. This processing changes the liquid physical state of the reference materials into a 'solidified state'. Through this procedure the dispersion states of the samples can practically no longer be modified in time, and the uniform distribution of the radionuclides of interest in the elemental matrix of the 'solidified state' samples is ensured. The homogeneity of the distribution of the radionuclide atoms in the 'solidified state' samples was checked by the track micromapping technique for alpha particles. With this technique, micromaps of alpha tracks were obtained from chemically etched track detectors that had been kept in direct contact with the sample for a determined period of time (the alpha exposure time of the detectors).
These micromaps are retorts through tracks of the distributions atoms of fissionable elements (Thorium e.g.), of which, heavy charged particles, in this case the alpha radiations naturally emitted, were registered in the CR-39 track detectors. The density of alpha track from the obtained track micromaps was studied through common optic microscopy. Micromaps were studied counting the tracks on equal areas, in different measurement points. For the study of the foils prepared within the paper, the studied area was of 4.9 mm2, formed of 10 fields of 0.49 mm2 area each. The estimation of the uncertainty was carried out for all the sizes that were measured within the paper, no matter if they participate, directly or indirectly, in the estimation of the uncertainty regarding the homogeneity of the Thorium atoms distribution in the 'solidified state' foils of the standard solution calibrated in Thorium, such as: i) the weighted masses, ii) the dropped volumes of solution, iii) the alpha duration of exposure of the detectors, iv) the area studied on the surface of the micromap and v) the densities of alpha tracks. The procedure suggested allowed us to considerate that the homogeneity of alpha tracks distribution, on the surface and in thickness, is within the limits of 3.1%.
Analysis of uncertainties in CRAC2 calculations: the inhalation pathway
Killough, G.G.; Dunning, D.E. Jr.
1984-01-01T23:59:59.000Z
CRAC2 is a computer code for estimating the health effects and economic costs that might result from a release of radioactivity from a nuclear reactor to the environment. This paper describes tests of sensitivity of the predicted health effects to uncertainties in parameters associated with inhalation of the released radionuclides. These parameters are the particle size of the carrier aerosol and, for each element in the release, the clearance parameters for the lung model on which the code's dose conversion factors for inhalation are based. CRAC2 uses hourly meteorological data and a straight-line Gaussian plume model to predict the transport of airborne radioactivity; it includes models for plume depletion and population evacuation, and data for the distributions of population and land use. The code can compute results for single weather sequences, or it can perform random sampling of weather sequences from the meteorological data file and compute results for each weather sequence in the sample. For the work described in this paper, we concentrated on three fixed weather sequences that represent a range of conditions. For each fixed weather sequence, we applied random sampling to joint distributions of the inhalation parameters in order to estimate the sensitivity of the predicted health effects. All sampling runs produced coefficients of variation that were less than 50%, but some differences of means between weather sequences were substantial, as were some differences between means and the corresponding CRAC2 results without random sampling. Early injuries showed differences of as much as 1 to 2 orders of magnitude, while the differences in early fatalities were less than a factor of 2. Latent cancer fatalities varied by less than 10%. 19 references, 6 figures, 3 tables.
Generalized Uncertainty Principle and Recent Cosmic Inflation Observations
Abdel Nasser Tawfik; Abdel Magied Diab
2014-10-29T23:59:59.000Z
The recent Background Imaging of Cosmic Extragalactic Polarization (BICEP2) observations are believed to be evidence for cosmic inflation. BICEP2 provided a first direct evidence for inflation, determined its energy scale, and offered hints of quantum gravitational processes. The tensor-to-scalar fluctuation ratio $r$, which is the canonical measurement of the gravitational waves, was estimated as $r=0.2_{-0.05}^{+0.07}$. Apparently, this value agrees with the corresponding upper bound from PLANCK, $r\\leq 0.12$, and from the WMAP9 experiment, $r=0.2$. It is believed that the existence of a minimal length is one of the greatest predictions leading to modifications of the Heisenberg uncertainty principle, i.e. a generalized uncertainty principle (GUP), at the Planck scale. In the present work, we investigate the possibility of interpreting the recent BICEP2 observations through quantum gravity or the GUP. We estimate the slow-roll parameters and the tensorial and scalar density fluctuations, which are characterized by the scalar field $\\phi$. Taking into account the background (matter and radiation) energy density, $\\phi$ is assumed to interact with gravity and with itself. We first review the Friedmann-Lemaitre-Robertson-Walker (FLRW) Universe and then suggest a modification of the Friedmann equation due to the GUP. Using a single potential for a chaotic inflation model, various inflationary parameters are estimated and compared with the PLANCK and BICEP2 observations. While the GUP is conjectured to modify the expansion of the early Universe (Hubble parameter and scale factor), two inflation potentials based on a certain minimal supersymmetric extension of the standard model yield $r$ and a spectral index matching the observations well. Corresponding to the BICEP2 observations, our estimate of $r$ depends on the inflation potential and the scalar field; a power-law inflation potential does not match.
Dark Energy from Quantum Uncertainty of Distant Clocks
M. J. Luo
2015-05-04T23:59:59.000Z
The observed cosmic acceleration has been attributed to an exotic dark energy in the framework of classical general relativity. This dark energy behaves very similarly to vacuum energy in quantum mechanics. However, once quantum effects are taken seriously into account, the prediction is completely wrong and leads to a severe fine-tuning problem. To solve the problem, the exact meaning of time in quantum mechanics is reexamined. We abandon the standard interpretation that time is just a global parameter and replace it by a quantum dynamical variable playing the role of a physical clock. We find that synchronization of two spatially separated clocks cannot be precisely realized at the quantum level. There is an intrinsic quantum uncertainty of distant simultaneity, which implies an apparent vacuum energy fluctuation and gives an observed dark energy density $\\rho_{de}=\\frac{6}{\\pi}L_{P}^{-2}L_{H}^{-2}$ at tree-level approximation, where $L_{P}$ and $L_{H}$ are the Planck and Hubble scale cutoffs. The fraction of the dark energy is given by $\\Omega_{de}=\\frac{2}{\\pi}$, which does not evolve with the internal clock time. The "dark energy", as a quantum cosmic variance, is always seen as comparable with the matter energy density by an observer using the internal clock time. The corrected distance-redshift relation of cosmic observations due to the distant-clock effect is also discussed, which again gives a redshift-independent fraction $\\Omega_{de}=\\frac{2}{\\pi}$. The theory is consistent with current cosmic observations.
Aghamohammadi, Aliakbar
2014-05-07T23:59:59.000Z
This dissertation addresses the problem of stochastic optimal control with imperfect measurements. The main application of interest is robot motion planning under uncertainty. In the presence of process uncertainty and imperfect measurements...
Flores, Alejandro N.
Representation of model input uncertainty is critical in ensemble-based data assimilation. Monte Carlo sampling of model inputs produces uncertainty in the hydrologic state through the model dynamics. Small Monte Carlo ...
Jun, Mina
2007-01-01T23:59:59.000Z
Estimating, presenting, and assessing uncertainties are important parts in assessment of a complex system. This thesis focuses on the assessment of uncertainty in the price module and the climate module in the Aviation ...
Miller, C.; Little, C.A.
1982-08-01T23:59:59.000Z
The purpose is to summarize estimates, based on currently available data, of the uncertainty associated with radiological assessment models. The models examined herein are those recommended previously for use in breeder reactor assessments. Uncertainty estimates are presented for models of atmospheric and hydrologic transport, terrestrial and aquatic food-chain bioaccumulation, and internal and external dosimetry. Both long-term and short-term release conditions are discussed. The uncertainty estimates presented in this report indicate that, for many sites, generic models and representative parameter values may be used to calculate doses from annual average radionuclide releases when the calculated doses are on the order of one-tenth or less of a relevant dose limit. For short-term, accidental releases, especially those from breeder reactors located at sites dominated by complex terrain and/or coastal meteorology, the uncertainty in the dose calculations may be much larger than an order of magnitude. As a result, it may be necessary to incorporate site-specific information into the dose calculation under these circumstances to reduce this uncertainty. However, even using site-specific information, natural variability and the uncertainties in the dose conversion factor will likely result in an overall uncertainty of greater than an order of magnitude for predictions of dose or concentration in environmental media following short-term releases.
A Two-Step Approach to Uncertainty Quantification of Core Simulators
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Yankov, Artem; Collins, Benjamin; Klein, Markus; Jessee, Matthew A.; Zwermann, Winfried; Velkov, Kiril; Pautz, Andreas; Downar, Thomas
2012-01-01T23:59:59.000Z
For the multiple sources of error introduced into the standard computational regime for simulating reactor cores, rigorous uncertainty analysis methods are available primarily to quantify the effects of cross section uncertainties. Two methods for propagating cross section uncertainties through core simulators are the XSUSA statistical approach and the “two-step” method. The XSUSA approach, which is based on the SUSA code package, is fundamentally a stochastic sampling method. Alternatively, the two-step method utilizes generalized perturbation theory in the first step and stochastic sampling in the second step. The consistency of these two methods in quantifying uncertainties in the multiplication factor and in the core power distribution was examined in the framework of phase I-3 of the OECD Uncertainty Analysis in Modeling benchmark. With the Three Mile Island Unit 1 core as a base model for analysis, the XSUSA and two-step methods were applied with certain limitations, and the results were compared to those produced by other stochastic sampling-based codes. Based on the uncertainty analysis results, conclusions were drawn as to the method that is currently more viable for computing uncertainties in burnup and transient calculations.
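The stochastic-sampling idea behind the XSUSA approach can be illustrated with a toy one-group model in which a multiplication factor is recomputed for many randomly perturbed cross sections. All nominal values and relative uncertainties below are invented for illustration; a real analysis perturbs full multigroup libraries and runs the actual core simulator.

```python
import random
import statistics

random.seed(1)

# Toy one-group model: k_inf = nu_sigma_f / sigma_a, with assumed
# relative 1-sigma uncertainties on each cross section (hypothetical).
NU_SIGMA_F, SIGMA_A = 0.0104, 0.0100
REL_U = {"nu_sigma_f": 0.010, "sigma_a": 0.015}

def sample_k():
    # Draw one perturbed cross-section set and evaluate the model.
    nsf = random.gauss(NU_SIGMA_F, NU_SIGMA_F * REL_U["nu_sigma_f"])
    sa = random.gauss(SIGMA_A, SIGMA_A * REL_U["sigma_a"])
    return nsf / sa

ks = [sample_k() for _ in range(1000)]
k_mean = statistics.mean(ks)   # close to the nominal 1.04
k_std = statistics.stdev(ks)   # propagated cross-section uncertainty
```

The spread `k_std` is the sampled analogue of the multiplication-factor uncertainty the benchmark exercise compares across codes.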
Gerhard Strydom
2011-01-01T23:59:59.000Z
The need for a defendable and systematic uncertainty and sensitivity approach that conforms to the Code Scaling, Applicability, and Uncertainty (CSAU) process, and that could be used for a wide variety of software codes, was defined in 2008. The GRS (Gesellschaft für Anlagen und Reaktorsicherheit) company of Germany has developed one type of CSAU approach that is particularly well suited for legacy coupled core analysis codes, and a trial version of their commercial software product SUSA (Software for Uncertainty and Sensitivity Analyses) was acquired on May 12, 2010. This report summarizes the results of the initial investigations performed with SUSA, utilizing a typical High Temperature Reactor benchmark (the IAEA CRP-5 PBMR 400MW Exercise 2) and the PEBBED-THERMIX suite of codes. The following steps were performed as part of the uncertainty and sensitivity analysis: 1. Eight PEBBED-THERMIX model input parameters were selected for inclusion in the uncertainty study: the total reactor power, inlet gas temperature, decay heat, and the specific heat capacity and thermal conductivity of the fuel, pebble bed and reflector graphite. 2. The input parameter variations and probability density functions were specified, and a total of 800 PEBBED-THERMIX model calculations were performed, divided into 4 sets of 100 and 2 sets of 200 Steady State and Depressurized Loss of Forced Cooling (DLOFC) transient calculations each. 3. The steady state and DLOFC maximum fuel temperatures, as well as the daily pebble fuel load rate data, were supplied to SUSA as model output parameters of interest. The 6 data sets were statistically analyzed to determine the 5% and 95% percentile values for each of the 3 output parameters with a 95% confidence level, and typical statistical indicators were also generated (e.g. Kendall, Pearson and Spearman coefficients). 4. A SUSA sensitivity study was performed to obtain correlation data between the input and output parameters, and to identify the primary contributors to the output data uncertainties. It was found that the uncertainties in the decay heat and in the pebble bed and reflector thermal conductivities were responsible for the bulk of the propagated uncertainty in the DLOFC maximum fuel temperature. It was also determined that the two-standard-deviation (2σ) uncertainty on the maximum fuel temperature was between ±58 °C (3.6%) and ±76 °C (4.7%) on a mean value of 1604 °C. These values depended mostly on the selection of the distribution types, and not on the number of model calculations above the required Wilks criteria (a (95%,95%) statement would usually require 93 model runs).
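The run-count figure cited above comes from Wilks' nonparametric tolerance-limit formula. A minimal, generic sketch (not tied to SUSA) that recovers both the familiar first-order 59-run figure and the 93-run second-order figure for a one-sided (95%, 95%) statement:

```python
from math import comb

def wilks_n(gamma=0.95, beta=0.95, order=1):
    """Smallest sample size n such that the order-th largest of n random
    runs bounds the gamma-quantile of the output with confidence beta
    (one-sided Wilks tolerance limit)."""
    n = order
    while True:
        n += 1
        # Confidence that at least a gamma fraction of outcomes lies
        # below the order-th largest sampled value.
        conf = 1.0 - sum(comb(n, j) * (1 - gamma) ** j * gamma ** (n - j)
                         for j in range(order))
        if conf >= beta:
            return n

n1 = wilks_n(order=1)  # first-order (95%, 95%): 59 runs
n2 = wilks_n(order=2)  # second-order (95%, 95%): 93 runs, as cited above
```

Using the second-highest run (order 2) instead of the maximum costs more runs but gives a less conservative bound.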
Statistical Assessment of Proton Treatment Plans Under Setup and Range Uncertainties
Park, Peter C.; Cheung, Joey P.; Zhu, X. Ronald [Department of Radiation Physics, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Lee, Andrew K. [Department of Radiation Oncology, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Sahoo, Narayan [Department of Radiation Physics, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Tucker, Susan L. [Department of Bioinformatics and Computational Biology, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Liu, Wei; Li, Heng; Mohan, Radhe; Court, Laurence E. [Department of Radiation Physics, University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Dong, Lei, E-mail: dong.lei@scrippshealth.org [Scripps Proton Therapy Center, San Diego, California (United States)
2013-08-01T23:59:59.000Z
Purpose: To evaluate a method for quantifying the effect of setup errors and range uncertainties on dose distribution and dose–volume histogram using statistical parameters, and to assess existing planning practice in selected treatment sites under setup and range uncertainties. Methods and Materials: Twenty passively scattered proton lung cancer plans, 10 prostate plans, and 1 brain cancer scanning-beam proton plan were analyzed. To account for the dose under uncertainties, we performed a comprehensive simulation in which the dose was recalculated 600 times per given plan under the influence of random and systematic setup errors and proton range errors. On the basis of simulation results, we determined the probability of dose variations and calculated the expected values and standard deviations of dose–volume histograms. The uncertainties in dose were spatially visualized on the planning CT as a probability map of failure to cover the target or of overdose to critical structures. Results: The expected value of target coverage under the uncertainties was consistently lower than the nominal value determined from the clinical target volume coverage without setup error or range uncertainty, with a mean difference of −1.1% (−0.9% for breath-hold), −0.3%, and −2.2% for the lung, prostate, and brain cases, respectively. The organs whose dose was most sensitive to uncertainties were the esophagus and spinal cord for lung, the rectum for prostate, and the brain stem for brain cancer. Conclusions: A clinically feasible robust plan analysis tool based on direct dose calculation and statistical simulation has been developed. Both the expected value and the standard deviation are useful for evaluating the impact of uncertainties. The existing proton beam planning method used in this institution appears adequate in terms of target coverage. However, structures that are small in volume or located near the target area showed greater sensitivity to uncertainties.
Balance Calibration – A Method for Assigning a Direct-Reading Uncertainty to an Electronic Balance.
Mike Stears
2010-07-01T23:59:59.000Z
Intended audience: those who calibrate or use electronic balances. As a calibration facility, we provide on-site (at the customer's location) calibrations of electronic balances for customers within our company. In our experience, most of our customers are not using their balance as a comparator, but simply putting an unknown quantity on the balance and reading the displayed mass value. Manufacturer's specifications for balances typically include readability, repeatability, linearity, and sensitivity temperature drift, but what does all this mean when the balance user simply reads the displayed mass value and accepts the reading as the true value? This paper discusses a method for assigning a direct-reading uncertainty to a balance based upon the observed calibration data and the environment where the balance is being used. The method requires input from the customer regarding the environment where the balance is used and encourages discussion with the customer regarding sources of uncertainty and possible means for improvement; the calibration process becomes an educational opportunity for the balance user as well as for calibration personnel. This paper covers the uncertainty analysis applied to the calibration weights used for the field calibration of balances; the uncertainty is calculated over the range of environmental conditions typically encountered in the field and the resulting range of air density. The temperature stability in the area of the balance is discussed with the customer, and the temperature range over which the balance calibration is valid is decided upon; the decision is based upon the uncertainty needs of the customer and the desired rigor in monitoring by the customer.
Once the environmental limitations are decided, the calibration is performed and the measurement data is entered into a custom spreadsheet. The spreadsheet uses measurement results, along with the manufacturer’s specifications, to assign a direct-read measurement uncertainty to the balance. The fact that the assigned uncertainty is a best-case uncertainty is discussed with the customer; the assigned uncertainty contains no allowance for contributions associated with the unknown weighing sample, such as density, static charges, magnetism, etc. The attendee will learn uncertainty considerations associated with balance calibrations along with one method for assigning an uncertainty to a balance used for non-comparison measurements.
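The kind of spreadsheet arithmetic described above can be sketched as a GUM-style root-sum-square combination of manufacturer specifications into an expanded direct-read uncertainty. Every specification value below is hypothetical (not from the paper), and the treatment of each term as a rectangular or normal distribution is an assumption:

```python
from math import sqrt

# Hypothetical specs for a 200 g balance, all in mg:
READABILITY = 0.1          # display resolution d
REPEATABILITY = 0.1        # quoted as a standard deviation
LINEARITY = 0.2            # maximum deviation over the range
TEMP_DRIFT_PPM_PER_K = 2.0 # sensitivity temperature drift
TEMP_RANGE_K = 4.0         # agreed environmental window with the customer

def direct_read_uncertainty_mg(load_g):
    """Expanded (k=2) direct-read uncertainty at a given load, combining
    spec terms by root-sum-of-squares (rectangular terms divided by
    sqrt(3) per the usual Type B treatment)."""
    u_read = READABILITY / (2 * sqrt(3))      # half-interval, rectangular
    u_rep = REPEATABILITY                     # already a std deviation
    u_lin = LINEARITY / sqrt(3)               # rectangular
    drift_mg = load_g * 1000 * TEMP_DRIFT_PPM_PER_K * 1e-6 * TEMP_RANGE_K
    u_temp = drift_mg / sqrt(3)               # rectangular
    u_c = sqrt(u_read**2 + u_rep**2 + u_lin**2 + u_temp**2)
    return 2 * u_c                            # expanded, coverage factor k=2

u_100g = direct_read_uncertainty_mg(100.0)
```

Note this is the "best-case" uncertainty the paper warns about: nothing here accounts for the unknown sample's density, static charge, or magnetism.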
Dakota uncertainty quantification methods applied to the NEK-5000 SAHEX model.
Weirs, V. Gregory
2014-03-01T23:59:59.000Z
This report summarizes the results of a NEAMS project focused on the use of uncertainty and sensitivity analysis methods within the NEK-5000 and Dakota software framework for assessing failure probabilities as part of probabilistic risk assessment. NEK-5000 is a software tool under development at Argonne National Laboratory to perform computational fluid dynamics calculations for applications such as thermohydraulics of nuclear reactor cores. Dakota is a software tool developed at Sandia National Laboratories containing optimization, sensitivity analysis, and uncertainty quantification algorithms. The goal of this work is to demonstrate the use of uncertainty quantification methods in Dakota with NEK-5000.
Jet energy and missing ET systematic uncertainties for early Run 2 data with the ATLAS detector
Alkire, Steven Patrick; The ATLAS collaboration
2015-01-01T23:59:59.000Z
The jet energy scale and resolution and their systematic uncertainties are determined for jets measured with the ATLAS detector using proton-proton collision data with a centre-of-mass energy of \\sqrt{s}=13 TeV. Jets are clustered with the anti-kt algorithm with R=0.4 and calibrated using MC simulations. Uncertainties are based on measurements using 2012 data, measurements using early 2015 data and dedicated studies using MC simulations. In addition, systematic uncertainties on soft activity and its contributions to the missing transverse energy are determined using MC simulations and validated using early 2015 data.
Multiplicative scale uncertainties in the unified approach for constructing confidence intervals
E. S. Smith
2009-03-31T23:59:59.000Z
We have investigated how uncertainties in the estimation of the detection efficiency affect the 90% confidence intervals in the unified approach for constructing confidence intervals. The study has been conducted for experiments where the number of detected events is large and can be described by a Gaussian probability density function. We also assume the detection efficiency has a Gaussian probability density and study the range of the relative uncertainties $\\sigma_\\epsilon$ between 0 and 30%. We find that the confidence intervals provide proper coverage over a wide signal range and increase smoothly and continuously from the intervals that ignore scale uncertainties with a quadratic dependence on $\\sigma_\\epsilon$.
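The quadratic growth of the intervals with $\sigma_\epsilon$ can already be seen in plain Gaussian error propagation, used here as a simplified stand-in for the unified construction studied in the paper (the counts and efficiency below are illustrative):

```python
from math import sqrt

Z90 = 1.645  # Gaussian quantile for a 90% central interval

def interval_halfwidth(n, eff, sigma_eff_rel):
    """Approximate 90% CL half-width on the signal mu = n/eff when the
    count n is Gaussian (sigma = sqrt(n)) and the detection efficiency
    carries a relative Gaussian uncertainty sigma_eff_rel.
    Simple propagation only -- not the full unified-approach interval."""
    mu = n / eff
    var = n / eff**2 + (mu * sigma_eff_rel) ** 2
    return Z90 * sqrt(var)

w_exact = interval_halfwidth(10000, 0.5, 0.00)   # efficiency known exactly
w_scale = interval_halfwidth(10000, 0.5, 0.30)   # 30% scale uncertainty
```

The width grows smoothly from the no-scale-uncertainty case, with the extra variance term quadratic in `sigma_eff_rel`, mirroring the behavior reported for the exact construction.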
Use of SUSA in Uncertainty and Sensitivity Analysis for INL VHTR Coupled Codes
Gerhard Strydom
2010-06-01T23:59:59.000Z
The need for a defendable and systematic Uncertainty and Sensitivity approach that conforms to the Code Scaling, Applicability, and Uncertainty (CSAU) process, and that could be used for a wide variety of software codes, was defined in 2008. The GRS (Gesellschaft für Anlagen und Reaktorsicherheit) company of Germany has developed one type of CSAU approach that is particularly well suited for legacy coupled core analysis codes, and a trial version of their commercial software product SUSA (Software for Uncertainty and Sensitivity Analyses) was acquired on May 12, 2010. This interim milestone report provides an overview of the current status of the implementation and testing of SUSA at the INL VHTR Project Office.
Determining the Uncertainty Associated with Retrospective Air Sampling for Optimization Purposes
Hadlock, D.J.
2003-10-03T23:59:59.000Z
NUREG 1400 contains an acceptable methodology for determining the uncertainty associated with retrospective air sampling. The method is a fairly simple one in which both the systematic and random uncertainties, usually expressed as percent errors, are propagated using the square root of the sum of the squares. Historically, many people involved in air sampling have focused on the statistical counting error as the deciding factor in the overall uncertainty of retrospective air sampling. This paper looks at not only the counting error but also the other errors associated with the performance of retrospective air sampling.
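For independent error components expressed as percent errors, the square-root-of-the-sum-of-the-squares propagation described above reduces to one line (the component magnitudes below are hypothetical, not from NUREG 1400):

```python
from math import sqrt

def combined_uncertainty(percent_errors):
    """Root-sum-of-squares propagation of independent percent errors."""
    return sqrt(sum(e ** 2 for e in percent_errors))

# Hypothetical components of a retrospective air-sample result:
# counting statistics 10%, flow-rate calibration 5%, collection
# efficiency 8%.
total = combined_uncertainty([10.0, 5.0, 8.0])  # ~13.7%
```

The example makes the paper's point numerically: the counting error alone (10%) understates the combined uncertainty once the other components are included.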
Uncertainty Analysis Framework - Hanford Site-Wide Groundwater Flow and Transport Model
Cole, Charles R.; Bergeron, Marcel P.; Murray, Christopher J.; Thorne, Paul D.; Wurstner, Signe K.; Rogers, Phillip M.
2001-11-09T23:59:59.000Z
Pacific Northwest National Laboratory (PNNL) embarked on a new initiative to strengthen the technical defensibility of the predictions being made with a site-wide groundwater flow and transport model at the U.S. Department of Energy Hanford Site in southeastern Washington State. In FY 2000, the focus of the initiative was on the characterization of major uncertainties in the current conceptual model that would affect model predictions. The long-term goals of the initiative are the development and implementation of an uncertainty estimation methodology in future assessments and analyses using the site-wide model. This report focuses on the development and implementation of an uncertainty analysis framework.
The uncertainties of a Pd3-PGV onsite earthquake early warning system
Wang, Jui-Pin; Wu, Yih-Min
Onsite earthquake early warning systems must decide whether an alert should be issued, based on factors such as the uncertainty of the early warning system and the earthquake magnitude. This study examines the uncertainty of one Pd3-PGV onsite system.
Fitelson, Branden
Chapter 2: Representing Uncertainty. "Do not expect to arrive at certainty in every subject ... your best light and reasoning will reach no farther." --Isaac Watts. How should uncertainty be represented? In this chapter, I discuss some difficulties that probability has in representing uncertainty.
Boyer, Edmond
Probabilistic model identification of the bit-rock-interaction-model uncertainties in the nonlinear dynamics of a drill string. Simple models are usually considered in the analysis; this is an important constraint when uncertainties in a bit-rock interaction model must be taken into account.
Practical reliability and uncertainty quantification in complex systems : final report.
Grace, Matthew D.; Ringland, James T.; Marzouk, Youssef M. (Massachusetts Institute of Technology, Cambridge, MA); Boggs, Paul T.; Zurn, Rena M.; Diegert, Kathleen V. (Sandia National Laboratories, Albuquerque, NM); Pebay, Philippe Pierre; Red-Horse, John Robert (Sandia National Laboratories, Albuquerque, NM)
2009-09-01T23:59:59.000Z
The purpose of this project was to investigate the use of Bayesian methods for the estimation of the reliability of complex systems. The goals were to find methods for dealing with continuous data, rather than simple pass/fail data; to avoid assumptions of specific probability distributions, especially Gaussian, or normal, distributions; to compute not only an estimate of the reliability of the system, but also a measure of the confidence in that estimate; to develop procedures to address time-dependent or aging aspects in such systems; and to use these models and results to derive optimal testing strategies. The system is assumed to be a system of systems, i.e., a system with discrete components that are themselves systems. Furthermore, the system is 'engineered' in the sense that each node is designed to do something and that we have a mathematical description of that process. In the time-dependent case, the assumption is that we have a general, nonlinear, time-dependent function describing the process. The major results of the project are described in this report. In summary, we developed a sophisticated mathematical framework based on modern probability theory and Bayesian analysis. This framework encompasses all aspects of epistemic uncertainty and easily incorporates steady-state and time-dependent systems. Based on Markov chain Monte Carlo methods, we devised a computational strategy for general probability density estimation in the steady-state case. This enabled us to compute a distribution of the reliability from which many questions, including confidence, could be addressed. We then extended this to the time domain and implemented procedures to estimate the reliability over time, including the use of the method to predict the reliability at a future time.
Finally, we used certain aspects of Bayesian decision analysis to create a novel method for determining an optimal testing strategy, e.g., we can estimate the 'best' location to take the next test to minimize the risk of making a wrong decision about the fitness of a system. We conclude this report by proposing additional fruitful areas of research.
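A much-simplified illustration of the Monte Carlo flavor of this approach is a series system with pass/fail Beta posteriors (precisely the simple data model the report goes beyond, but it shows how a reliability *distribution*, not just a point estimate, falls out). All test counts below are invented:

```python
import random

random.seed(0)

# Hypothetical series system of three components; each component's
# reliability gets a Beta posterior from assumed (successes, failures)
# test counts under a uniform Beta(1,1) prior.
data = [(48, 2), (95, 5), (29, 1)]

def sample_system_reliability():
    # A series system works only if every component works.
    r = 1.0
    for successes, failures in data:
        r *= random.betavariate(successes + 1, failures + 1)
    return r

draws = sorted(sample_system_reliability() for _ in range(10000))
median = draws[len(draws) // 2]
lo95, hi95 = draws[250], draws[-251]  # approximate 95% credible interval
```

The interval `(lo95, hi95)` is the "measure of confidence in the estimate" the project sought, here obtained by direct posterior sampling rather than the report's more general MCMC machinery.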
Huang, Yinlun
Technology Evaluation and Decision Making for Sustainability Enhancement of Industrial Systems. A case study on sustainable development of biodiesel manufacturing demonstrates methodological efficacy. Keywords: sustainability enhancement, decision making, uncertainty, interval-parameter-based analysis, technology evaluation.
Clements, Emily Baker
2013-01-01T23:59:59.000Z
Most space programs experience significant cost and schedule growth over the course of program development. Poor uncertainty management has been identified as one of the leading causes of program cost and schedule overruns. ...
Climate Change Impacts on Extreme Events in the United States: An Uncertainty Analysis
Monier, Erwan
Extreme weather and climate events, such as heat waves, droughts and severe precipitation events, have substantial impacts on ecosystems and the economy. However, future climate simulations display large uncertainty in ...
Lynch, Sharon G.; Kroencke, Dawn C.; Denney, Douglas R.
2001-12-01T23:59:59.000Z
The relationship between disability and depression was studied in 188 patients with clinically definite multiple sclerosis (MS). Patients were administered the Zung Self-Rating Depression Scale, Ways of Coping, Uncertainty of Illness Scale, and Hope...
Effects of the Uncertainty about Global Economic Recovery on Energy Transition and CO2 Price
Durand-Lasserve, Olivier
This paper examines the impact that uncertainty over economic growth may have on global energy transition and CO2 prices. We use a general-equilibrium model derived from MERGE, and define several stochastic scenarios for ...
Seelhof, Michael
2014-01-01T23:59:59.000Z
A computer model was developed to find optimal long-term investment strategies for the electric power sector under uncertainty with respect to future regulatory regimes and market conditions. The model is based on a ...
University of Maryland, College Park
ABSTRACT. Title of Dissertation: Students' Understanding of Measurement and Uncertainty in the Physics Laboratory: Social Construction, Underlying Concepts, and Quantitative Analysis. Rebecca Faith Lippmann, Doctor of Philosophy, 2003. Dissertation directed by: Professor Edward F. Redish, Department of Physics.
Quantification of the impact of climate uncertainty on regional air quality
Liao, K.-J.
Uncertainties in calculated impacts of climate forecasts on future regional air quality are investigated using downscaled MM5 meteorological fields from the NASA GISS and MIT IGSM global models and the CMAQ model in 2050 ...
Sobes, Vladimir
2014-01-01T23:59:59.000Z
A new methodology has been developed that couples differential cross section data evaluation with integral benchmark analysis for improved uncertainty quantification. The new methodology was applied to the two new copper ...
Influence of air quality model resolution on uncertainty associated with health impacts
Thompson, Tammy M.
We use regional air quality modeling to evaluate the impact of model resolution on uncertainty associated with the human health benefits resulting from proposed air quality regulations. Using a regional photochemical model ...
AlMisnad, Abdulla
2014-01-01T23:59:59.000Z
The development of new infrastructure projects is a key part of global efforts to meet the demands of growing populations in times of increasing uncertainty. The deterministic approaches commonly used for the development ...
Botea, Adi
"Crisis, Uncertainty and Democracy": opening address, Australian Political Studies Association (APSA). ... in government or public policy around the world, our discipline and sub-disciplines are at the heart ...
The Value of Assessing Uncertainty in Oil and Gas Portfolio Optimization
Hdadou, Houda
2013-07-25T23:59:59.000Z
It has been shown in the literature that the oil and gas industry deals with a substantial number of biases that impact project evaluation and portfolio performance. Previous studies concluded that properly estimating uncertainties...
Determination of uncertainty in reserves estimate from analysis of production decline data
Wang, Yuhong
2007-09-17T23:59:59.000Z
Analysts increasingly have used probabilistic approaches to evaluate the uncertainty in reserves estimates based on a decline curve analysis. This is because the results represent statistical analysis of historical data that usually possess...
Bei, Naifang
The purpose of the present study is to investigate the sensitivity of ozone (O3)[(O subscript 3)] predictions in the Mexico City Metropolitan Area (MCMA) to meteorological initial uncertainties and planetary boundary layer ...
Pruet, J
2007-06-23T23:59:59.000Z
This report describes Kiwi, a program developed at Livermore to enable mature studies of the relation between imperfectly known nuclear physics and uncertainties in simulations of complicated systems. Kiwi includes a library of evaluated nuclear data uncertainties, tools for modifying data according to these uncertainties, and a simple interface for generating processed data used by transport codes. As well, Kiwi provides access to calculations of k eigenvalues for critical assemblies. This allows the user to check implications of data modifications against integral experiments for multiplying systems. Kiwi is written in python. The uncertainty library has the same format and directory structure as the native ENDL used at Livermore. Calculations for critical assemblies rely on deterministic and Monte Carlo codes developed by B division.
Morris, Jennifer F. (Jennifer Faye)
2013-01-01T23:59:59.000Z
The electric power sector, which accounts for approximately 40% of U.S. carbon dioxide emissions, will be a critical component of any policy the U.S. government pursues to confront climate change. In the context of uncertainty ...
Uncertainty Analysis in Upscaling Well Log data By Markov Chain Monte Carlo Method
Hwang, Kyubum
2010-01-16T23:59:59.000Z
To quantify the uncertainties, a Bayesian framework could be a useful tool, providing posterior information that yields a better estimate for a chosen model via conditional probability. In addition, the likelihood of a Bayesian framework plays an important role...
Sensitivity and uncertainty analyses for thermo-hydraulic calculation of research reactor
Hartini, Entin; Andiwijayakusuma, Dinan [Center for Development of Nuclear Informatics - National Nuclear Energy Agency PUSPIPTEK, Serpong, Tangerang, Banten (Indonesia)]; Isnaeni, Muh Darwis [Center for Reactor Technology and Nuclear Safety - National Nuclear Energy Agency PUSPIPTEK, Serpong, Tangerang, Banten (Indonesia)]
2013-09-09T23:59:59.000Z
A sensitivity and uncertainty analysis of input parameters for the thermohydraulic calculation of a research reactor was successfully carried out in this work. The uncertainty analysis was applied to the input parameters of a sub-channel thermohydraulic calculation using the code COOLOD-N. The input parameters include the radial peaking factor, the increase in bulk coolant temperature, the heat flux factor, and the increase in cladding and fuel-meat temperature for a research reactor using plate fuel elements. Input uncertainties of 1%-4% were used in the nominal power calculation. The bubble detachment parameters were computed for the S ratio (the safety margin against the onset of flow instability), which was used to determine the safety level in line with the design of the 'Reactor Serba Guna-G. A. Siwabessy' (RSG-GA Siwabessy). It was concluded from the calculation results that input uncertainties of more than 3% were beyond the safety margin of reactor operation.
Reducing Uncertainty in Fisheries Management: The Time for Fishers' Ecological Knowledge
Carr, Liam
2012-07-16T23:59:59.000Z
This dissertation work presents a novel method for addressing system uncertainty to improve management of a small-scale fishery in St. Croix, United States Virgin Islands. Using fishers' ecological knowledge (FEK), this research examines existing...
Webster, Mort David.; Tatang, Menner A.; McRae, Gregory J.
This paper presents the probabilistic collocation method as a computationally efficient method for performing uncertainty analysis on large complex models such as those used in global climate change research. The collocation ...
Lyons, Jeffrey M. (Jeffrey Michael), 1973-
2000-01-01T23:59:59.000Z
As the use of distributed engineering models becomes more prevalent, engineers need tools to evaluate the quality of these models and understand how subsystem uncertainty affects predictions of system behavior. This thesis ...
Cardoni, Jeffrey N. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kalinich, Donald A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2014-02-01T23:59:59.000Z
Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) on the Fukushima Daiichi unit (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). However, that study only examined a handful of various model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, vessel lower head failure, etc.). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.
Seo, Sangtaek
2006-04-12T23:59:59.000Z
Agricultural producers face uncertain agricultural production and market conditions. Much of the uncertainty faced by agricultural producers cannot be controlled by the producer, but can be managed. Several risk management programs are available...
Hurdling Barriers Through Market Uncertainty: Case Studies in Innovative Technology Adoption
This paper examines three case studies of technology adoption under market uncertainty, circumstances it terms "innovative technology adoption."
CMPSCI 240: Reasoning about Uncertainty
McGregor, Andrew
Lecture 1: Introduction; Course Logistics; Sets and Elements. Course description: developing arguments and using mathematical concepts.
Yurko, Joseph P.
Propagating parameter uncertainty for a nuclear reactor system code is a very challenging problem. Numerous parameters influence the system response in complicated and often non-linear fashions, in addition to sometimes ...
A rock physics strategy for quantifying uncertainty in common hydrocarbon indicators
Mavko, G.M.; Mukerji, T.
1995-12-31T23:59:59.000Z
We present a strategy for hydrocarbon detection and for quantifying the uncertainty in hydrocarbon indicators, by combining statistical techniques with deterministic rock physics relations derived from the laboratory and theory. A simple example combines Gassmann's deterministic equation for fluid substitution with statistics inferred from log and core data, to detect hydrocarbons from observed seismic velocities. The formulation gives the most likely estimate of the pore fluid modulus, corresponding to each observed velocity, and also the uncertainty of that interpretation. The variances of seismic velocity and porosity in the calibration data determine the uncertainty of the pore fluid interpretation. As expected, adding information about shear wave velocity, from AVO for example, narrows the uncertainty of the hydrocarbon indicator. The formulation offers a convenient way to implement deterministic fluid substitution equations in the realistic case when the reference porosity and velocity span a range of values.
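For context, the deterministic fluid-substitution relation this abstract refers to is usually written in the following standard form of Gassmann's equation (the symbols here are conventional notation, not necessarily the paper's):

```latex
\frac{K_{\mathrm{sat}}}{K_0 - K_{\mathrm{sat}}}
  = \frac{K_{\mathrm{dry}}}{K_0 - K_{\mathrm{dry}}}
  + \frac{K_{\mathrm{fl}}}{\phi\,(K_0 - K_{\mathrm{fl}})},
\qquad
\mu_{\mathrm{sat}} = \mu_{\mathrm{dry}},
```

where $K_{\mathrm{sat}}$, $K_{\mathrm{dry}}$, $K_0$, and $K_{\mathrm{fl}}$ are the saturated-rock, dry-rock, mineral, and pore-fluid bulk moduli, $\phi$ is porosity, and the shear modulus $\mu$ is unaffected by the fluid.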
Northern winter climate change: Assessment of uncertainty in CMIP5 projections related
Gerber, Edwin
Stratospheric circulation could have an important impact on northern winter tropospheric climate change, given coherent variations in troposphere-stratosphere circulation. Here we assess northern winter stratospheric...
The Low energy structure of the Nucleon-Nucleon interaction: Statistical vs Systematic Uncertainties
R. Navarro Perez; J. E. Amaro; E. Ruiz Arriola
2015-05-15T23:59:59.000Z
We analyze the low energy NN interaction by confronting statistical vs systematic uncertainties. This is based on the analysis of 6 different potentials fitted to the Granada-2013 database where a statistically meaningful partial wave analysis comprising a total of $6713$ np and pp published scattering data from 1950 till 2013 below pion production threshold has been made. This required the design and implementation of three new interactions which are introduced here. We extract threshold parameters uncertainties from the coupled channel effective range expansion up to $j \\le 5$. We find that for threshold parameters systematic uncertainties are generally at least an order of magnitude larger than statistical uncertainties. Similar results are found for np phase-shifts and amplitude parameters.
A simplified analysis of uncertainty propagation in inherently controlled ATWS events
Wade, D.C.
1987-01-01T23:59:59.000Z
The quasi-static approach can be used to provide useful insight concerning the propagation of uncertainties in the inherent response to ATWS events. At issue is how uncertainties in the reactivity coefficients and in the thermal-hydraulics and materials properties propagate to yield uncertainties in the asymptotic temperatures attained upon inherent shutdown. The basic notion to be quantified is that many of the same physical phenomena contribute to both the reactivity increase from power reduction and the reactivity decrease from core temperature rise. Since these reactivities cancel by definition, a good deal of uncertainty cancellation must also occur of necessity. For example, if the Doppler coefficient is overpredicted, too large a positive reactivity insertion is predicted upon power reduction and collapse of the ΔT across the fuel pin. However, too large a negative reactivity is also predicted upon the compensating increase in the isothermal core average temperature, which includes the fuel Doppler effect.
System level assessment of uncertainty in aviation environmental policy impact analysis
Liem, Rhea Patricia
2010-01-01T23:59:59.000Z
This thesis demonstrates the assessment of uncertainty of a simulation model at the system level, which takes into account the interaction between the modules that comprise the system. Results from this system level ...
Continuous reservoir simulation incorporating uncertainty quantification and real-time data
Holmes, Jay Cuthbert
2009-05-15T23:59:59.000Z
of uncertainty in resulting forecasts. A new technology to allow this process to run continuously with little human interaction is real-time production and pressure data, which can be automatically integrated into runs. Two tests of this continuous simulation...
PDF uncertainties on the W boson mass measurement from the lepton transverse momentum distribution
Giuseppe Bozzi; Luca Citelli; Alessandro Vicini
2015-05-22T23:59:59.000Z
We study the charged current Drell-Yan process and we evaluate the proton parton densities uncertainties on the lepton transverse momentum distribution and their impact on the determination of the W-boson mass. We consider the global PDF sets CT10, MSTW2008CPdeut, NNPDF2.3, NNPDF3.0, MMHT2014, and apply the PDF4LHC recipe to combine the individual results, obtaining an uncertainty on MW that ranges between +-18 and +-24 MeV, depending on the final state, collider energy and kind. We discuss the dependence of the uncertainty on the acceptance cuts and the role of the individual parton densities in the final result. We remark that some PDF sets predict an uncertainty on MW of O(10 MeV); this encouraging result is spoiled, in the combined analysis of the different sets, by an important spread of the central values predicted by each group.
Ereira, Eleanor Charlotte
2010-01-01T23:59:59.000Z
Climate change is a threat that could be mitigated by introducing new energy technologies into the electricity market that emit fewer greenhouse gas (GHG) emissions. We face many uncertainties that would affect the demand ...
Reduction of Wellbore Positional Uncertainty During Directional Drilling
University of Calgary
This thesis addresses wellbore positional accuracy in directional drilling operations, as measured by Measurement While Drilling (MWD) surveys, and develops an MWD-survey correction for compensating drilling assembly magnetic interference to solve the problem of wellbore positional uncertainty.
Quantitative Analysis of Variability and Uncertainty in Environmental Data and Models
Frey, H. Christopher
Quantitative Analysis of Variability and Uncertainty in Environmental Data and Models, Volume 2. Water Resources and Environmental Engineering Program, Department of Civil Engineering. Topics include the chemistry of NOx combustion and the atmosphere.
Strategic investment in power generation under uncertainty : Electric Reliability Council of Texas
Chiyangwa, Diana Kudakwashe
2010-01-01T23:59:59.000Z
The purpose of this study is to develop a strategy for investment in power generation technologies in the future given the uncertainties in climate policy and fuel prices. First, such studies are commonly conducted using ...
Compensating for model uncertainty in the control of cooperative field robots
Sujan, Vivek Anand, 1972-
2002-01-01T23:59:59.000Z
Current control and planning algorithms are largely unsuitable for mobile robots in unstructured field environment due to uncertainties in the environment, task, robot models and sensors. A key problem is that it is often ...
Threat assessment for safe navigation in environments with uncertainty in predictability
Aoudé, Georges Salim
2011-01-01T23:59:59.000Z
This thesis develops threat assessment algorithms to improve the safety of the decision making of autonomous and human-operated vehicles navigating in dynamic and uncertain environments, where the source of uncertainty is ...
Quark mass uncertainties revive Kim-Shifman-Vainshtein-Zakharov axion dark matter
Buckley, Matthew R.; Murayama, Hitoshi
2007-01-01T23:59:59.000Z
Bounds on the QCD axion, specifically KSVZ axions, are revisited in light of quark mass uncertainties. The as-yet-unseen axion was originally proposed to...
Uncertainties in Estimating Moisture Fluxes over the Intra-Americas Sea
Mestas-Nuñez, Alberto M.
Single-year sounding observations bear large uncertainties because of interannual variability. The Great Plains low-level jet (GPLLJ) flows along the lee (east) side of the Rocky Mountain range, and southerly flow...
Quantitative Analysis of Variability and Uncertainty in Environmental Data and Models
Frey, H. Christopher
Quantitative Analysis of Variability and Uncertainty in Environmental Data and Models, Volume 1. Topics include whether a probabilistic analysis is necessary and previous work in probabilistic risk assessment.
Zhang, Xuesong
2009-05-15T23:59:59.000Z
This study focuses on developing and evaluating efficient and effective parameter calibration and uncertainty methods for hydrologic modeling. Five single objective optimization algorithms and six multi-objective optimization ...
Joint location of microseismic events in the presence of velocity uncertainty
Poliannikov, Oleg V.
The locations of seismic events are used to infer reservoir properties and to guide future production activity, as well as to determine and understand the stress field. Thus, locating seismic events with uncertainty ...
Investment Timing and Capacity Choice for Small-Scale Wind Power Under Uncertainty
Fleten, Stein-Erik; Maribu, Karl Magnus
2004-01-01T23:59:59.000Z
Power production from wind power has stochastic inflows, and...
Life Cycle Regulation of Transportation Fuels: Uncertainty and its Policy Implications
Plevin, Richard Jay
2010-01-01T23:59:59.000Z
Topics include methodological issues with LCA, attributional versus consequential LCA, economic input-output LCA, and uncertainty in...
Hansen, K.M.
1992-10-01T23:59:59.000Z
Sequential indicator simulation (SIS) is a geostatistical technique designed to aid in the characterization of uncertainty about the structure or behavior of natural systems. This report discusses a simulation experiment designed to study the quality of uncertainty bounds generated using SIS. The results indicate that, while SIS may produce reasonable uncertainty bounds in many situations, factors like the number and location of available sample data, the quality of variogram models produced by the user, and the characteristics of the geologic region to be modeled, can all have substantial effects on the accuracy and precision of estimated confidence limits. It is recommended that users of SIS conduct validation studies for the technique on their particular regions of interest before accepting the output uncertainty bounds.
Mikaelian, Tsoline
2009-01-01T23:59:59.000Z
Complex systems and enterprises, such as those typical in the aerospace industry, are subject to uncertainties that may lead to suboptimal performance or even catastrophic failures if unmanaged. This work focuses on ...
Miller, Bruno, 1974-
2005-01-01T23:59:59.000Z
Real options analysis is being increasingly used as a tool to evaluate investments under uncertainty; however, traditional real options methodologies have some shortcomings that limit their utility, such as the use of the ...
Gregor, Jeffrey Allen
2003-01-01T23:59:59.000Z
The United States Navy is facing a need for a novel surface combatant capability. This new system of ships must be deigned to meet the uncertainty associated with constantly changing required mission capabilities, threats, ...
Managing uncertainty in systems with a Valuation Approach for Strategic Changeability
Fitzgerald, Matthew Edward
2012-01-01T23:59:59.000Z
Complex engineering systems are frequently exposed to large amounts of uncertainty, as many exogenous, uncontrollable conditions change over time and can affect the performance and value delivery of a system. Engineering ...
Ghanem, Roger
2013-03-25T23:59:59.000Z
Methods and algorithms are developed to enable the accurate analysis of problems that exhibit interacting physical processes with uncertainties. These uncertainties can pertain either to each of the physical processes or to the manner in which they depend on each other. These problems are cast within a polynomial chaos framework, and their solution then involves either solving a large system of algebraic equations or a high-dimensional numerical quadrature. In both cases, the curse of dimensionality is manifested. Procedures are developed for the efficient evaluation of the resulting linear equations that take advantage of the block sparse structure of these equations, resulting in a block recursive Schur complement construction. In addition, embedded quadratures are constructed that permit the evaluation of very high-dimensional integrals using low-dimensional quadratures adapted to particular quantities of interest. The low-dimensional integration is carried out in a transformed measure space in which the quantity of interest is low-dimensional. Finally, a procedure is also developed to discover a low-dimensional manifold, embedded in the initial high-dimensional one, in which scalar quantities of interest exist. This approach permits the functional expression of the reduced space in terms of the original space, thus permitting cross-scale sensitivity analysis.
Study of the systematic uncertainty of tracking from $J/\\psi \\to p \\overline{p} \\pi^+ \\pi^-$
Wenlong Yuan; Xiaocong Ai; Xiaobin Ji; Shenjian Chen; Yao Zhang; Linghui Wu; Liangliang Wang; Ye Yuan
2015-07-13T23:59:59.000Z
Based on the $J/\\psi$ events collected with the BESIII detector, with corresponding Monte Carlo samples, the systematic uncertainties of tracking are studied using the control sample of $J/\\psi \\to p \\overline{p} \\pi^+ \\pi^-$. Method validation and the different factors influencing the tracking efficiency are studied in detail. The tracking efficiency of protons and pions, and its systematic uncertainty, are also discussed as functions of transverse momentum and polar angle.
Copyright Uncertainty in the Geoscience Community: Part I, What's Free for the Taking?
Clement, Gail
2012-01-01T23:59:59.000Z
Pre-print submitted for publication: Clement, Gail, 2012. "Copyright Uncertainty in the Geoscience Community: Part I, What's Free for the Taking?" Proceedings - Geoscience Information Society 42, forthcoming. This work is distributed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License.
Advancing Inverse Sensitivity/Uncertainty Methods for Nuclear Fuel Cycle Applications
Arbanas, Goran [ORNL; Williams, Mark L [ORNL; Leal, Luiz C [ORNL; Dunn, Michael E [ORNL; Khuwaileh, Bassam A. [North Carolina State University; Wang, C [North Carolina State University; Abdel-Khalik, Hany [North Carolina State University
2015-01-01T23:59:59.000Z
The inverse sensitivity/uncertainty quantification (IS/UQ) method has recently been implemented in the Inverse Sensitivity/UnceRtainty Estimator (INSURE) module of the AMPX system [1]. The IS/UQ method aims to quantify and prioritize the cross section measurements, along with the uncertainties needed to yield a given nuclear application's target response uncertainty, at a minimum cost. Since in some cases the extant uncertainties of the differential cross section data are already near the limits of present-day state-of-the-art measurements, requiring significantly smaller uncertainties may be unrealistic. Therefore we have incorporated integral benchmark experiment (IBE) data into the IS/UQ method using the generalized linear least-squares method, and have implemented it in the INSURE module. We show how the IS/UQ method can be applied to systematic and statistical uncertainties in a self-consistent way, and how it can be used to optimize uncertainties of IBEs and differential cross section data simultaneously.
Two-dimensional cross-section and SED uncertainty analysis for the Fusion Engineering Device (FED)
Embrechts, M.J.; Urban, W.T.; Dudziak, D.J.
1982-01-01T23:59:59.000Z
The theory of two-dimensional cross-section and secondary-energy-distribution (SED) sensitivity was implemented by developing a two-dimensional sensitivity and uncertainty analysis code, SENSIT-2D. Analyses of the Fusion Engineering Device (FED) conceptual inboard shield indicate that, although the calculated uncertainties in the 2-D model are of the same order of magnitude as those resulting from the 1-D model, there might be severe differences. The more complex the geometry, the more compulsory a 2-D analysis becomes. Specific results show that the uncertainty for the integral heating of the toroidal field (TF) coil for the FED is 114.6%. The main contributors to the cross-section uncertainty are chromium and iron. Contributions to the total uncertainty were smaller for nickel, copper, hydrogen and carbon. All analyses were performed with the Los Alamos 42-group cross-section library generated from ENDF/B-V data, and the COVFILS covariance matrix library. The large uncertainties due to chromium result mainly from large covariances for the chromium total and elastic scattering cross sections.
Position-Momentum Uncertainty Relations in the Presence of Quantum Memory
Fabian Furrer; Mario Berta; Marco Tomamichel; Volkher B. Scholz; Matthias Christandl
2015-01-05T23:59:59.000Z
A prominent formulation of the uncertainty principle identifies the fundamental quantum feature that no particle may be prepared with certain outcomes for both position and momentum measurements. Often the statistical uncertainties are thereby measured in terms of entropies providing a clear operational interpretation in information theory and cryptography. Recently, entropic uncertainty relations have been used to show that the uncertainty can be reduced in the presence of entanglement and to prove security of quantum cryptographic tasks. However, much of this recent progress has been focused on observables with only a finite number of outcomes not including Heisenberg's original setting of position and momentum observables. Here we show entropic uncertainty relations for general observables with discrete but infinite or continuous spectrum that take into account the power of an entangled observer. As an illustration, we evaluate the uncertainty relations for position and momentum measurements, which is operationally significant in that it implies security of a quantum key distribution scheme based on homodyne detection of squeezed Gaussian states.
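For reference, the position-momentum relations this abstract generalizes include Heisenberg's variance bound and the entropic (Białynicki-Birula–Mycielski) bound; the memory-assisted relations of the paper reduce to bounds of this second type in the absence of entanglement:

```latex
\sigma_x\,\sigma_p \;\ge\; \frac{\hbar}{2},
\qquad
h(X) + h(P) \;\ge\; \ln(e \pi \hbar),
```

where $\sigma_x$, $\sigma_p$ are the standard deviations of position and momentum and $h$ denotes the differential entropy of the corresponding measurement distribution.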
Zhang, Xuesong; Liang, Faming; Yu, Beibei; Zong, Ziliang
2011-11-09T23:59:59.000Z
Estimating uncertainty of hydrologic forecasting is valuable to water resources and other relevant decision making processes. Recently, Bayesian Neural Networks (BNNs) have been proved powerful tools for quantifying uncertainty of streamflow forecasting. In this study, we propose a Markov Chain Monte Carlo (MCMC) framework to incorporate the uncertainties associated with input, model structure, and parameter into BNNs. This framework allows the structure of the neural networks to change by removing or adding connections between neurons and enables scaling of input data by using rainfall multipliers. The results show that the new BNNs outperform the BNNs that only consider uncertainties associated with parameter and model structure. Critical evaluation of posterior distribution of neural network weights, number of effective connections, rainfall multipliers, and hyper-parameters show that the assumptions held in our BNNs are not well supported. Further understanding of characteristics of different uncertainty sources and including output error into the MCMC framework are expected to enhance the application of neural networks for uncertainty analysis of hydrologic forecasting.
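The MCMC machinery described in this abstract can be illustrated with a minimal random-walk Metropolis sampler, the kind of kernel used to draw posterior samples of model parameters such as BNN weights. This is a generic sketch on a toy one-dimensional posterior, not the authors' framework; the target density and step size are illustrative choices.

```python
import math
import random

def metropolis(log_post, x0, n_steps, step=0.5, seed=1):
    """Random-walk Metropolis sampler for a 1-D parameter.

    log_post: log posterior density (up to an additive constant).
    Returns the chain of sampled parameter values.
    """
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    chain = []
    for _ in range(n_steps):
        prop = x + rng.gauss(0.0, step)       # symmetric Gaussian proposal
        lp_prop = log_post(prop)
        delta = lp_prop - lp
        # Accept with probability min(1, exp(delta)); guard against overflow.
        if delta >= 0 or rng.random() < math.exp(delta):
            x, lp = prop, lp_prop
        chain.append(x)
    return chain

# Toy posterior: standard normal log-density (up to a constant).
chain = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_steps=20000)
posterior_mean = sum(chain) / len(chain)
```

In a BNN setting the scalar `x` would be replaced by the full weight vector (plus rainfall multipliers and structure indicators in the paper's framework), but the accept/reject logic is the same.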
Monte Carlo uncertainty reliability and isotope production calculations for a fast reactor
Miles, T.L.
1992-01-01T23:59:59.000Z
Statistical uncertainties in Monte Carlo calculations are typically determined by the first and second moments of the tally. For certain types of calculations, there is concern that the uncertainty estimate is significantly non-conservative. This is typically seen in reactor eigenvalue problems where the uncertainty estimate is aggravated by the generation-to-generation fission source. It has been speculated that optimization of the random walk, through biasing techniques, may increase the non-conservative nature of the uncertainty estimate. A series of calculations are documented here which quantify the reliability of the Monte Carlo Neutron and Photon (MCNP) mean and uncertainty estimates by comparing these estimates to the true mean. These calculations were made with a liquid metal fast reactor model, but every effort was made to isolate the statistical nature of the uncertainty estimates so that the analysis of the reliability of the MCNP estimates should be relevant for small thermal reactors as well. Also, preliminary reactor physics calculations for two different special isotope production test assemblies for irradiation in the Fast Flux Test Facility (FFTF) were performed using MCNP and are documented here. The effect of an yttrium-hydride moderator to tailor the neutron flux incident on the targets to maximize isotope production for different designs in different locations within the reactor is discussed. These calculations also demonstrate the useful application of MCNP in design iterations by utilizing many of the codes features.
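The first- and second-moment tally estimate this abstract refers to can be sketched as follows. This is a minimal illustration of the statistics, not the MCNP implementation; the exponential "particle scores" are a hypothetical surrogate tally.

```python
import random

def tally_mean_and_error(samples):
    """Estimate the mean and its one-sigma statistical uncertainty
    from the first and second moments of a Monte Carlo tally."""
    n = len(samples)
    first = sum(samples) / n                            # first moment: sample mean
    second = sum(x * x for x in samples) / n            # second moment
    variance = (second - first * first) * n / (n - 1)   # unbiased sample variance
    std_error = (variance / n) ** 0.5                   # uncertainty of the mean
    return first, std_error

random.seed(0)
scores = [random.expovariate(1.0) for _ in range(10000)]  # surrogate particle scores
mean, err = tally_mean_and_error(scores)
```

The report's concern is precisely that for correlated samples, such as generation-to-generation fission source iterates in an eigenvalue calculation, this independent-sample error estimate can be non-conservative.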
Da Cruz, D. F.; Rochman, D.; Koning, A. J. [Nuclear Research and Consultancy Group NRG, Westerduinweg 3, 1755 ZG Petten (Netherlands)
2012-07-01T23:59:59.000Z
This paper discusses the uncertainty analysis on reactivity and inventory for a typical PWR fuel element as a result of uncertainties in {sup 235,238}U nuclear data. A typical Westinghouse 3-loop fuel assembly fuelled with UO{sub 2} fuel with 4.8% enrichment has been selected. The Total Monte-Carlo method has been applied using the deterministic transport code DRAGON. This code allows the generation of the few-groups nuclear data libraries by directly using data contained in the nuclear data evaluation files. The nuclear data used in this study is from the JEFF3.1 evaluation, and the nuclear data files for {sup 238}U and {sup 235}U (randomized for the generation of the various DRAGON libraries) are taken from the nuclear data library TENDL. The total uncertainty (obtained by randomizing all {sup 238}U and {sup 235}U nuclear data in the ENDF files) on the reactor parameters has been split into different components (different nuclear reaction channels). Results show that the TMC method in combination with a deterministic transport code constitutes a powerful tool for performing uncertainty and sensitivity analysis of reactor physics parameters. (authors)
Frederik Reitsma; Gerhard Strydom; Bismark Tyobeka; Kostadin Ivanov
2012-10-01T23:59:59.000Z
The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of design and safety features with reliable high-fidelity physics models and robust, efficient, and accurate codes. The uncertainties in the HTR analysis tools are today typically assessed with sensitivity analysis, and then a few important input uncertainties (typically based on a PIRT process) are varied in the analysis to find a spread in the parameter of importance. However, one wishes to apply a more fundamental approach to determine the predictive capability and accuracy of the coupled neutronics/thermal-hydraulics and depletion simulations used for reactor design and safety assessment. Today there is broader acceptance of the use of uncertainty analysis even in safety studies, and in some cases regulators have accepted it as a replacement for the traditional conservative analysis. Finally, there is also a renewed focus on supplying reliable covariance data (nuclear data uncertainties) that can then be used in uncertainty methods. Uncertainty and sensitivity studies are therefore becoming an essential component of any significant effort in data and simulation improvement. In order to address uncertainty in analysis and methods in the HTGR community, the IAEA launched a Coordinated Research Project (CRP) on HTGR Uncertainty Analysis in Modelling early in 2012. The project builds on the experience of the OECD/NEA Light Water Reactor (LWR) Uncertainty Analysis in Best-Estimate Modelling (UAM) benchmark activity, but focuses specifically on the peculiarities of HTGR designs and their simulation requirements. Two benchmark problems were defined: the prismatic type is represented by the MHTGR-350 design from General Atomics (GA), while the pebble bed type is represented by a 250 MW modular design similar to the INET (China) and indirect-cycle PBMR (South Africa) designs.
The paper presents more detail on the benchmark cases, the specific phases and tasks, and the latest status and plans.
Grid and basis adaptive polynomial chaos techniques for sensitivity and uncertainty analysis
Perkó, Zoltán, E-mail: Z.Perko@tudelft.nl; Gilli, Luca, E-mail: Gilli@nrg.eu; Lathouwers, Danny, E-mail: D.Lathouwers@tudelft.nl; Kloosterman, Jan Leen, E-mail: J.L.Kloosterman@tudelft.nl
2014-03-01T23:59:59.000Z
The demand for accurate and computationally affordable sensitivity and uncertainty techniques is constantly on the rise and has become especially pressing in the nuclear field with the shift to Best Estimate Plus Uncertainty methodologies in the licensing of nuclear installations. Besides traditional, already well developed methods – such as first order perturbation theory or Monte Carlo sampling – Polynomial Chaos Expansion (PCE) has been given a growing emphasis in recent years due to its simple application and good performance. This paper presents new developments of the research done at TU Delft on such Polynomial Chaos (PC) techniques. Our work is focused on the Non-Intrusive Spectral Projection (NISP) approach and adaptive methods for building the PCE of responses of interest. Recent efforts resulted in a new adaptive sparse grid algorithm designed for estimating the PC coefficients. The algorithm is based on Gerstner's procedure for calculating multi-dimensional integrals but proves to be computationally significantly cheaper, while at the same time retaining an accuracy similar to that of the original method. More importantly, the issue of basis adaptivity has been investigated and two techniques have been implemented for constructing the sparse PCE of quantities of interest. Forgoing the traditional full PC basis set leads to a further reduction in computational time, since the high-order grids necessary for accurately estimating the near-zero expansion coefficients of polynomial basis vectors not needed in the PCE can be excluded from the calculation. Moreover, the sparse PC representation of the response is easier to handle when used for sensitivity analysis or uncertainty propagation due to the smaller number of basis vectors. The developed grid and basis adaptive methods have been implemented in Matlab as the Fully Adaptive Non-Intrusive Spectral Projection (FANISP) algorithm and were tested on four analytical problems.
These tests show consistently good performance, both in the accuracy of the resulting PC representation of quantities and in the computational costs associated with constructing the sparse PCE. Basis adaptivity also appears to make the use of PC techniques feasible for problems with a higher number of input parameters (15–20), alleviating a well-known limitation of the traditional approach. The prospect of larger-scale applicability and the simplicity of implementation make such adaptive PC algorithms particularly appealing for the sensitivity and uncertainty analysis of complex systems and legacy codes.
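The NISP projection step underlying such methods can be sketched in one dimension. This is a minimal illustration, not the FANISP algorithm: a standard normal input, probabilists' Hermite polynomials, and a tensor-free Gauss-Hermite rule replace the adaptive sparse grid; the response f(X) = X² is chosen because its expansion is known exactly.

```python
import numpy as np
from math import factorial, sqrt, pi

def nisp_pce_coeffs(f, order, n_quad=8):
    """Non-Intrusive Spectral Projection: project f(X), X ~ N(0,1), onto
    probabilists' Hermite polynomials He_k via Gauss-Hermite quadrature.
    c_k = E[f(X) He_k(X)] / E[He_k(X)^2], with E[He_k^2] = k!."""
    x, w = np.polynomial.hermite_e.hermegauss(n_quad)
    w = w / sqrt(2.0 * pi)  # normalize the weights to a probability measure
    coeffs = []
    for k in range(order + 1):
        he_k = np.polynomial.hermite_e.hermeval(x, [0.0] * k + [1.0])
        coeffs.append(float(np.sum(w * f(x) * he_k)) / factorial(k))
    return coeffs

# f(X) = X^2 has the exact expansion He_0(X) + He_2(X)
c = nisp_pce_coeffs(lambda x: x ** 2, order=3)
```

The basis-adaptive methods described above amount to dropping the terms whose coefficients come out near zero, so the expensive high-order quadrature for them can be skipped.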
David Brizuela
2014-11-03T23:59:59.000Z
The classical and quantum evolution of a generic probability distribution is analyzed. To that end, a formalism based on the decomposition of the distribution in terms of its statistical moments is used, which makes explicit the differences between the classical and quantum dynamics. In particular, there are two different sources of quantum effects. Distributional effects, which are also present in the classical evolution of an extended distribution, are due to the fact that not all moments can vanish, because of the Heisenberg uncertainty principle. In addition, the non-commutativity of the basic quantum operators adds terms to the quantum equations of motion that explicitly depend on the Planck constant and are not present in the classical setting; these are thus purely quantum effects. Some particular Hamiltonians with very special properties regarding the evolution they generate in the classical and quantum sectors are analyzed. In addition, a large class of inequalities obeyed by high-order statistical moments is derived, in particular uncertainty relations that bound the information it is possible to obtain from a quantum system.
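The moment decomposition can be illustrated numerically: compute central moments of a sampled distribution and check one of the high-order moment inequalities of the kind derived above. The Gaussian sample is illustrative; the μ₄ ≥ μ₂² inequality (a Cauchy-Schwarz consequence) holds for any distribution.

```python
import random
import statistics

def central_moments(samples, max_order=4):
    """Central statistical moments mu_k = E[(x - <x>)^k] for k = 0..max_order."""
    mean = statistics.fmean(samples)
    n = len(samples)
    return [sum((x - mean) ** k for x in samples) / n for k in range(max_order + 1)]

# Illustrative Gaussian sample; any distribution obeys the inequality below
random.seed(2)
data = [random.gauss(0.0, 1.0) for _ in range(50_000)]
mu = central_moments(data)

# Cauchy-Schwarz-type moment inequality: mu_4 >= mu_2^2
assert mu[4] >= mu[2] ** 2
```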
Eldred, Michael Scott; Vigil, Dena M.; Dalbey, Keith R.; Bohnhoff, William J.; Adams, Brian M.; Swiler, Laura Painton; Lefantzi, Sophia (Sandia National Laboratories, Livermore, CA); Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Eddy, John P.
2011-12-01T23:59:59.000Z
The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the DAKOTA software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of DAKOTA-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of DAKOTA's iterative analysis capabilities.
Yurko, J. P.; Buongiorno, J. [MIT, 77 Massachusetts Avenue, Cambridge, MA 02139 (United States)
2012-07-01T23:59:59.000Z
Propagating parameter uncertainty through a nuclear reactor system code is a challenging problem, due to the often non-linear system response to the numerous parameters involved and the lengthy computational times; these issues compound when a statistical sampling procedure is adopted, since the code must be run many times. The number of parameters sampled must therefore be limited to as few as possible that still accurately characterize the uncertainty in the system response. A Quantitative Phenomena Identification and Ranking Table (QPIRT) was developed to accomplish this goal. The QPIRT consists of two steps: a 'Top-Down' step focusing on identifying the dominant physical phenomena controlling the system response, and a 'Bottom-Up' step focusing on determining the correlations from those key physical phenomena that significantly contribute to the response uncertainty. The Top-Down step evaluates phenomena using the governing equations of the system code at nominal parameter values, providing a 'fast' screening step. The Bottom-Up step then analyzes the correlations and models for the phenomena identified in the Top-Down step to find which parameters to sample. Through the Top-Down and Bottom-Up steps, the QPIRT thus provides a systematic approach to determining the limited set of physically relevant parameters that influence the uncertainty of the system response. This strategy was demonstrated through an application to the RELAP5-based analysis of a PWR Total Loss of main Feedwater Flow (TLOFW) accident, also known as a 'feed and bleed' scenario. Ultimately, this work is the first component in a larger task of building a calibrated uncertainty propagation framework. The QPIRT is an essential piece because the uncertainty of the selected parameters will be calibrated to data from both Separate and Integral Effect Tests (SETs and IETs). The system response uncertainty will therefore incorporate the knowledge gained from the database of past large IETs. (authors)
The ends of uncertainty: Air quality science and planning in Central California
Fine, James
2003-09-01T23:59:59.000Z
Air quality planning in Central California is complicated and controversial despite millions of dollars invested to improve scientific understanding. This research describes and critiques the use of photochemical air quality simulation modeling studies in planning to attain standards for ground-level ozone in the San Francisco Bay Area and the San Joaquin Valley during the 1990s. Data are gathered through documents and interviews with planners, modelers, and policy-makers at public agencies and with representatives from the regulated and environmental communities. Interactions amongst organizations are diagrammed to identify significant nodes of interaction. Dominant policy coalitions are described through narratives distinguished by their uses of and responses to uncertainty, their exposures to risks, and their responses to the principles of conservatism, civil duty, and caution. Policy narratives are delineated using aggregated respondent statements to describe and understand advocacy coalitions. I found that models impacted the planning process significantly, but were not used purely for their scientific capabilities. Modeling results provided justification for decisions based on other constraints and political considerations. Uncertainties were utilized opportunistically by stakeholders instead of managed explicitly. Ultimately, the process supported the partisan views of those in control of the modeling. Based on these findings, as well as a review of model uncertainty analysis capabilities, I recommend modifying the planning process to allow for the development and incorporation of uncertainty information, while addressing the need for inclusive and meaningful public participation. By documenting an actual air quality planning process, these findings provide insights about the potential for using new scientific information and understanding to achieve environmental goals, most notably the analysis of uncertainties in modeling applications.
Concurrently, needed uncertainty information is identified and capabilities to produce it are assessed. Practices to facilitate incorporation of uncertainty information are suggested based on research findings, as well as theory from the literatures of the policy sciences, decision sciences, science and technology studies, consensus-based and communicative planning, and modeling.
F. Herwig; Sam M. Austin; John C. Lattanzio
2005-11-14T23:59:59.000Z
Calculations that demonstrate the influence of three key nuclear reaction rates on the evolution of Asymptotic Giant Branch stars have been carried out. We study the case of a star with an initial mass of 2Msun and a metallicity of Z=0.01, somewhat less than the solar metallicity. The dredge-up of nuclear processed material from the interior of the star, and the yield predictions for carbon, are sensitive to the rate of the N14(p,gamma)O15 and triple-alpha reactions. These reactions dominate the H- and He-burning shells of stars in this late evolutionary phase. Published uncertainty estimates for each of these two rates propagated through stellar evolution calculations cause uncertainties in carbon enrichment and yield predictions of about a factor of two. The other important He-burning reaction C12(alpha,gamma)O16, although associated with the largest uncertainty in our study, does not have a significant influence on the abundance evolution compared to other modelling uncertainties. This finding remains valid when the entire evolution from the main-sequence to the tip of the AGB is considered. We discuss the experimental sources of the rate uncertainties addressed here, and give some outlook for future work.
Asymptotic and uncertainty analyses of a phase field model for void formation under irradiation
Nan Wang; Srujan Rokkam; Thomas Hochrainer; Michael Pernice; Anter El-Azab
2014-06-01T23:59:59.000Z
We perform asymptotic analysis and uncertainty quantification of a phase field model for void formation and evolution in materials subject to irradiation. The parameters of the phase field model are obtained in terms of the underlying material-specific quantities by matching the sharp interface limit of the phase field model with the corresponding sharp interface theory for void growth. To evaluate the sensitivity of phase field simulations to uncertainties in input parameters, we quantify the predictions using the stochastic collocation method. Uncertainties arising from material parameters are investigated based on available experimental and atomic scale data. The results of our analysis suggest that the uncertainties in the formation and migration energies of vacancies have a strong influence on the void volume fraction (or porosity). In contrast, the uncertainty in the void surface energy has a minimal effect. The analysis also shows that the model is consistent in the sense that its predictions do not drastically change as a result of small variations of the model input parameters.
Optimization under Uncertainty for Water Consumption in a Pulverized Coal Power Plant
Juan M. Salazar; Stephen E. Zitney; Urmila Diwekar
2009-01-01T23:59:59.000Z
Pulverized coal (PC) power plants are widely recognized as major water consumers whose operability has started to be affected by drought conditions across some regions of the country. Water availability will further restrict the retrofitting of existing PC plants with water-expensive carbon capture technologies. Therefore, national efforts to reduce water withdrawal and consumption have been intensified. Water consumption in PC plants is strongly associated with losses from the cooling water cycle, particularly water evaporation from cooling towers. Accurate estimation of these water losses requires realistic cooling tower models, as well as the inclusion of uncertainties arising from atmospheric conditions. In this work, the cooling tower for a supercritical PC power plant was modeled as a humidification operation and used for optimization under uncertainty. Characterization of the uncertainty (air temperature and humidity) was based on available weather data. Process characteristics including boiler conditions, reactant ratios, and pressure ratios in turbines were calculated to obtain the minimum water consumption under the above-mentioned uncertainties. In this study, the calculated conditions predicted up to a 12% reduction in the average water consumption for a 548 MW supercritical PC power plant simulated using Aspen Plus. Optimization under uncertainty for these large-scale PC plants cannot be solved with conventional stochastic programming algorithms because of the computational expense involved. In this work, we discuss the use of the novel Better Optimization of Nonlinear Uncertain Systems (BONUS) algorithm, which dramatically decreases the computational requirements of the stochastic optimization.
Gauntt, Randall O.; Mattie, Patrick D.; Bixler, Nathan E.; Ross, Kyle; Cardoni, Jeffrey N; Kalinich, Donald A.; Osborn, Douglas M.; Sallaberry, Cedric Jean-Marie; Ghosh, S. Tina
2014-02-01T23:59:59.000Z
This paper describes the knowledge advancements from the uncertainty analysis for the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout accident scenario at the Peach Bottom Atomic Power Station. This work assessed key MELCOR and MELCOR Accident Consequence Code System, Version 2 (MACCS2) modeling uncertainties in an integrated fashion to quantify the relative importance of each uncertain input on potential accident progression, radiological releases, and off-site consequences. This quantitative uncertainty analysis provides measures of the effects on consequences of each of the selected uncertain parameters, both individually and in interaction with other parameters. The results measure the model response (e.g., variance in the output) to uncertainty in the selected input. Investigation into the important uncertain parameters in turn yields insights into important phenomena for accident progression and off-site consequences. This uncertainty analysis confirmed the known importance of some parameters, such as the failure rate of the Safety Relief Valve in accident progression modeling and the dry deposition velocity in off-site consequence modeling. The analysis also revealed some new insights, such as the dependent effect of cesium chemical form for different accident progressions. (auth)
Uncertainty Analysis for a Virtual Flow Meter Using an Air-Handling Unit Chilled Water Valve
Song, Li; Wang, Gang; Brambley, Michael R.
2013-04-28T23:59:59.000Z
A virtual water flow meter is developed that uses the chilled water control valve on an air-handling unit as a measurement device. The flow rate of water through the valve is calculated using the differential pressure across the valve and its associated coil, the valve command, and an empirically determined valve characteristic curve. Thus, the probability of error in the measurements is significantly greater than for conventionally manufactured flow meters. In this paper, mathematical models are developed and used to conduct uncertainty analysis for the virtual flow meter, and the results from the virtual meter are compared to measurements made with an ultrasonic flow meter. Theoretical uncertainty analysis shows that the total uncertainty in flow rates from the virtual flow meter is 1.46% with 95% confidence; comparison of virtual flow meter results with measurements from an ultrasonic flow meter yielded an uncertainty of 1.46% with 99% confidence. The comparable results from the theoretical uncertainty analysis and the empirical comparison with the ultrasonic flow meter corroborate each other, and tend to validate the approach to computationally estimating uncertainty for virtual sensors introduced in this study.
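The kind of uncertainty propagation involved can be sketched with a simplified valve model, Q = Cv(u)·√ΔP, and root-sum-square combination of the relative input uncertainties. The model and the 1%/2% uncertainty figures below are illustrative assumptions, not the paper's actual valve characteristic or inputs.

```python
from math import sqrt

def virtual_flow(cv, dp):
    """Simplified valve-based virtual flow meter: Q = Cv(u) * sqrt(dP),
    where Cv(u) comes from the empirically determined characteristic curve."""
    return cv * sqrt(dp)

def relative_flow_uncertainty(cv, u_cv, dp, u_dp):
    """Propagate input uncertainties through Q = Cv * sqrt(dP) by the
    root-sum-square of relative sensitivities: since
    dQ/Q = dCv/Cv + (1/2) dP/P, we get
    u_Q/Q = sqrt((u_Cv/Cv)^2 + (0.5 * u_dP/dP)^2)."""
    return sqrt((u_cv / cv) ** 2 + (0.5 * u_dp / dp) ** 2)

# Illustrative numbers, not from the paper: 1% valve-curve uncertainty
# and 2% differential-pressure uncertainty
rel_u = relative_flow_uncertainty(cv=10.0, u_cv=0.10, dp=40.0, u_dp=0.8)
```

Note how the square-root law halves the contribution of the pressure measurement, which is why the valve characteristic curve tends to dominate the total uncertainty.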
Franco, Guillermo; Shen-Tu, Bing Ming; Bazzurro, Paolo [AIR Worldwide Corporation, 131 Dartmouth Street, Boston, MA 02116 (United States); Goretti, Agostino [Seismic Risk Office, CPD, Rome (Italy); Valensise, Gianluca [Institute of Geophysics and Vulcanology (INGV), Rome (Italy)
2008-07-08T23:59:59.000Z
Increasing sophistication in the insurance and reinsurance market is stimulating the move towards catastrophe models that offer a greater degree of flexibility in the definition of model parameters and model assumptions. This study explores the impact of uncertainty in the input parameters on the loss estimates by departing from the exclusive usage of mean values to establish the earthquake event mechanism, the ground motion fields, or the damageability of the building stock. Here the potential losses due to a repeat of the 1908 Messina-Reggio Calabria event are calculated using different plausible alternatives found in the literature that encompass 12 event scenarios, 2 different ground motion prediction equations, and 16 combinations of damage functions for the building stock, a total of 384 loss scenarios. These results constitute the basis for a sensitivity analysis of the different assumptions on the loss estimates that allows the model user to estimate the impact of the uncertainty on input parameters and the potential spread of the model results. For the event under scrutiny, average losses would amount today to about 9,000 to 10,000 million Euros. The uncertainty in the model parameters is reflected in the high coefficient of variation of this loss, reaching approximately 45%. The choice of ground motion prediction equations and vulnerability functions of the building stock contributes the most to the uncertainty in loss estimates. This indicates that the application of non-local-specific information has a great impact on the spread of potential catastrophic losses. In order to close this uncertainty gap, more exhaustive documentation practices in insurance portfolios will have to go hand in hand with greater flexibility in the model input parameters.
Degeneracies of particle and nuclear physics uncertainties in neutrinoless double beta decay
Lisi, E; Simkovic, F
2015-01-01T23:59:59.000Z
Theoretical estimates for the half life of neutrinoless double beta decay in candidate nuclei are affected by both particle and nuclear physics uncertainties, which may complicate the interpretation of decay signals or limits. We study such uncertainties and their degeneracies in the following context: three nuclei of great interest for large-scale experiments (76-Ge, 130-Te, 136-Xe), two representative particle physics mechanisms (light and heavy Majorana neutrino exchange), and a large set of nuclear matrix elements (NME), computed within the quasiparticle random phase approximation (QRPA). It turns out that the main theoretical uncertainties, associated with the effective axial coupling g_A and with the nucleon-nucleon potential, can be parametrized in terms of NME rescaling factors, up to small residuals. From this parametrization, the following QRPA features emerge: (1) the NME dependence on g_A is milder than quadratic; (2) in each of the two mechanisms, the relevant lepton flavor violating parameter is...
Survey of sampling-based methods for uncertainty and sensitivity analysis.
Johnson, Jay Dean; Helton, Jon Craig; Sallaberry, Cedric J.; Storlie, Curt B. (Colorado State University, Fort Collins, CO)
2006-06-01T23:59:59.000Z
Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (1) Definition of probability distributions to characterize epistemic uncertainty in analysis inputs, (2) Generation of samples from uncertain analysis inputs, (3) Propagation of sampled inputs through an analysis, (4) Presentation of uncertainty analysis results, and (5) Determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top down coefficient of concordance, and variance decomposition.
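Two of the reviewed ingredients, generation of samples (here Latin hypercube sampling) and rank-transformed sensitivity measures (here the Spearman correlation), can be sketched as follows. The two-parameter toy model is illustrative and not from the survey.

```python
import random

def latin_hypercube(n, dims, rng):
    """Latin hypercube sample on [0,1]^dims: exactly one point falls in
    each of the n equal strata along every dimension."""
    cols = []
    for _ in range(dims):
        strata = list(range(n))
        rng.shuffle(strata)
        cols.append([(s + rng.random()) / n for s in strata])
    return list(zip(*cols))

def ranks(values):
    """Rank of each value (0-based), assuming no ties."""
    order = sorted(range(len(values)), key=values.__getitem__)
    r = [0] * len(values)
    for i, idx in enumerate(order):
        r[idx] = i
    return r

def spearman(xs, ys):
    """Rank (Spearman) correlation: Pearson correlation of the ranks,
    a standard sampling-based sensitivity measure."""
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mean = (n - 1) / 2.0
    num = sum((a - mean) * (b - mean) for a, b in zip(rx, ry))
    den = sum((a - mean) ** 2 for a in rx)  # same for rx and ry (permutations)
    return num / den

rng = random.Random(3)
pts = latin_hypercube(200, 2, rng)
ys = [10.0 * x1 + 0.1 * x2 for x1, x2 in pts]  # toy model: x1 dominates
s1 = spearman([p[0] for p in pts], ys)  # strong monotone influence of x1
s2 = spearman([p[1] for p in pts], ys)  # weak influence of x2
```

Ranking the absolute correlations across inputs gives the kind of sensitivity ordering the survey's regression- and correlation-based procedures produce.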
Uncertainty Analysis of Spectral Irradiance Reference Standards Used for NREL Calibrations
Habte, A.; Andreas, A.; Reda, I.; Campanelli, M.; Stoffel, T.
2013-05-01T23:59:59.000Z
Spectral irradiance produced by lamp standards such as the National Institute of Standards and Technology (NIST) FEL-type tungsten halogen lamps is used to calibrate spectroradiometers at the National Renewable Energy Laboratory. Spectroradiometers are often used to characterize the spectral irradiance of solar simulators, which in turn are used to characterize photovoltaic device performance, e.g., power output and spectral response. Therefore, quantifying the calibration uncertainty of spectroradiometers is critical to understanding photovoltaic system performance. In this study, we attempted to reproduce the NIST-reported input variables, including the calibration uncertainty in spectral irradiance for a standard NIST lamp, and to quantify the uncertainty for the measurement setup at the Optical Metrology Laboratory at the National Renewable Energy Laboratory.
Uncertainty quantification of a radionuclide release model using an adaptive spectral technique
Gilli, L.; Hoogwerf, C.; Lathouwers, D.; Kloosterman, J. L. [Delft University of Technology, Faculty of Applied Sciences Radiation Science and Technology, Department Nuclear Energy and Radiation Applications, Mekelweg 15, 2629 JB Delft (Netherlands)
2013-07-01T23:59:59.000Z
In this paper we present the application of a non-intrusive spectral technique we recently developed to the evaluation of the uncertainties associated with a radionuclide migration problem. Spectral techniques can be used to reconstruct stochastic quantities of interest by means of a Fourier-like expansion. Their application to uncertainty propagation problems can be performed by evaluating a set of realizations which are chosen adaptively; the main details of how this is done are presented in this work. The uncertainty quantification problem we deal with was first solved in a recent work in which the authors used a spectral technique based on an intrusive approach. In this paper we reproduce the results of that reference work, compare them, and discuss the main numerical aspects. (authors)
Stochastic modelling of landfill processes incorporating waste heterogeneity and data uncertainty
Zacharof, A.I.; Butler, A.P
2004-07-01T23:59:59.000Z
A landfill is a very complex heterogeneous environment and as such it presents many modelling challenges. Attempts to develop models that reproduce these complexities generally involve the use of large numbers of spatially dependent parameters that cannot be properly characterised in the face of data uncertainty. An alternative method is presented, which couples a simplified microbial degradation model with a stochastic hydrological and contaminant transport model. This provides a framework for incorporating the complex effects of spatial heterogeneity within the landfill in a simplified manner, along with other key variables. A methodology for handling data uncertainty is also integrated into the model structure. Illustrative examples of the model's output are presented to demonstrate effects of data uncertainty on leachate composition and gas volume prediction.
Intermittent optical frequency measurements to reduce the dead time uncertainty of frequency link
Hachisu, Hidekazu
2015-01-01T23:59:59.000Z
The absolute frequency of the $^{87}{\\rm Sr}$ lattice clock transition was evaluated with an uncertainty of $1.1\\times 10^{-15}$ using a frequency link to International Atomic Time (TAI). The frequency uncertainty of a hydrogen maser used as a transfer oscillator was reduced by homogeneously distributed intermittent measurements over a five-day grid of TAI. Three sets of four- or five-day measurements, together with a systematic uncertainty of the clock of $8.6\\times 10^{-17}$, yielded an absolute frequency of the $^{87}{\\rm Sr}\\ {}^1S_0 - {}^3P_0$ clock transition of 429 228 004 229 872.85 (47) Hz.