Assessor Training Measurement Uncertainty
NVLAP Assessor Training 2009: Measurement Uncertainty. · Calibration and testing labs performing ... · When the nature of the test precludes ...
Uncertainties in Gapped Graphene
Eylee Jung; Kwang S. Kim; DaeKil Park
2012-03-20
Motivated by graphene-based quantum computing, we examine the time dependence of the position-momentum and position-velocity uncertainties in monolayer gapped graphene. The effect of the energy gap on the uncertainties is shown to appear via the Compton-like wavelength $\lambda_c$. The uncertainties in graphene arise mainly from two phenomena, spreading and zitterbewegung. While the former determines the uncertainties at long times, the latter imposes rapid oscillations on the uncertainties at short times. The uncertainties in graphene are compared with the corresponding values for the usual free Hamiltonian $\hat{H}_{free} = (p_1^2 + p_2^2)/2M$. It is shown that the uncertainties can be controlled within the quantum mechanical law if one can choose the gap parameter $\lambda_c$ freely.
Sandia Energy - Uncertainty Analysis
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Uncertainty of calorimeter measurements at NREL's high flux solar furnace
Bingham, C.E.
1991-12-01
The uncertainties of the calorimeter and concentration measurements at the High Flux Solar Furnace (HFSF) at the National Renewable Energy Laboratory (NREL) are discussed. Two calorimeter types have been used to date. One is an array of seven commercially available circular foil calorimeters (Gardon or heat flux gages) for primary concentrator peak flux (up to 250 W/cm{sup 2}). The second is a cold-water calorimeter designed and built by the University of Chicago to measure the average exit power of the reflective compound parabolic secondary concentrator used at the HFSF (over 3.3 kW across a 1.6 cm{sup 2} exit aperture, corresponding to a flux of about 2 kW/cm{sup 2}). This paper discusses the uncertainties of the calorimeter and pyrheliometer measurements and the resulting concentration calculations. The measurement uncertainty analysis is performed according to the ASME/ANSI standard PTC 19.1 (1985). Random and bias errors for each portion of the measurement are analyzed. The results show that as either the power or the flux is reduced, the uncertainties increase. Another calorimeter is being designed for a new, refractive secondary which will use a refractive material to produce a higher average flux (5 kW/cm{sup 2}) than the reflective secondary. The new calorimeter will use a time derivative of the fluid temperature as a key measurement of the average power out of the secondary. A description of this calorimeter and test procedure is also presented, along with a pre-test estimate of major sources of uncertainty. 8 refs., 4 figs., 3 tabs.
Direct Aerosol Forcing Uncertainty
DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]
Mccomiskey, Allison
2008-01-15
Understanding sources of uncertainty in aerosol direct radiative forcing (DRF), the difference in a given radiative flux component with and without aerosol, is essential to quantifying changes in Earth's radiation budget. We examine the uncertainty in DRF due to measurement uncertainty in the quantities on which it depends: aerosol optical depth, single scattering albedo, asymmetry parameter, solar geometry, and surface albedo. Direct radiative forcing at the top of the atmosphere and at the surface, as well as sensitivities, the changes in DRF in response to unit changes in individual aerosol or surface properties, are calculated at three locations representing distinct aerosol types and radiative environments. The uncertainty in DRF associated with a given property is computed as the product of the sensitivity and typical measurement uncertainty in the respective aerosol or surface property. Sensitivity and uncertainty values permit estimation of total uncertainty in calculated DRF and identification of properties that most limit accuracy in estimating forcing. Total uncertainties in modeled local diurnally averaged forcing range from 0.2 to 1.3 W m{sup -2} (42 to 20%) depending on location (from tropical to polar sites), solar zenith angle, surface reflectance, aerosol type, and aerosol optical depth. The largest contributor to total uncertainty in DRF is usually single scattering albedo; however, decreasing measurement uncertainties for any property would increase accuracy in DRF. Comparison of two radiative transfer models suggests the contribution of modeling error is small compared to the total uncertainty, although comparable to the uncertainty arising from some individual properties.
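The combination rule this abstract describes — per-property uncertainty as sensitivity times measurement uncertainty, summed into a total — can be sketched as follows. The property names and numbers are illustrative assumptions, not values from the paper, and the independent contributions are combined in quadrature:

```python
import math

# Illustrative only: hypothetical sensitivities of DRF (W m^-2 per unit
# change in each property) and typical measurement uncertainties; the
# paper's actual values vary by site and aerosol type.
sensitivities = {"aod": 25.0, "ssa": 60.0, "asym": 15.0, "albedo": 10.0}
uncertainties = {"aod": 0.01, "ssa": 0.02, "asym": 0.02, "albedo": 0.02}

# Per-property DRF uncertainty = sensitivity * measurement uncertainty
contributions = {k: sensitivities[k] * uncertainties[k] for k in sensitivities}

# Total: root-sum-square of the (assumed independent) contributions
total = math.sqrt(sum(c ** 2 for c in contributions.values()))
```

With these invented numbers the single-scattering-albedo term dominates the total, mirroring the abstract's finding.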
Sandia Energy - Uncertainty Quantification
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
rate-expression parameters. It is important, both for model validation and design optimization purposes, that these uncertainties be adequately quantified and propagated to...
Physical Uncertainty Bounds (PUB)
Vaughan, Diane Elizabeth; Preston, Dean L.
2015-03-19
This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.
Uncertainty Analysis Economic Evaluations
Bhulai, Sandjai
uncertainties in typical oil and gas projects: 1. the oil price; 2. the investments (capex) and operating expenses (opex) for the project; 3. the number of wells and associated capex to recover the reserves; 4. ...
A High-Performance Embedded Hybrid Methodology for Uncertainty Quantification With Applications
Iaccarino, Gianluca
2014-04-01
Multiphysics processes modeled by a system of unsteady differential equations are naturally suited for partitioned (modular) solution strategies. We consider such a model where probabilistic uncertainties are present in each module of the system and represented as a set of random input parameters. A straightforward approach to quantifying uncertainties in the predicted solution would be to sample all the input parameters into a single set and treat the full system as a black box. Although this method is easily parallelizable and requires minimal modifications to the deterministic solver, it is blind to the modular structure of the underlying multiphysical model. On the other hand, spectral representations such as polynomial chaos expansions (PCE) can provide richer structural information regarding the dynamics of these uncertainties as they propagate from the inputs to the predicted output, but can be prohibitively expensive to implement in the high-dimensional global space of uncertain parameters. Therefore, we investigated hybrid methodologies wherein each module has the flexibility of using sampling- or PCE-based methods for capturing local uncertainties while maintaining accuracy in the global uncertainty analysis. For the latter case, we use a conditional PCE model which mitigates the curse of dimensionality associated with intrusive Galerkin or semi-intrusive pseudospectral methods. After formalizing the theoretical framework, we demonstrate our proposed method using a numerical viscous flow simulation and benchmark the performance against a solely Monte Carlo method and a solely spectral method.
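The "black-box" baseline the abstract contrasts with can be sketched as plain Monte Carlo over a toy two-module system; the modules and input distributions below are invented for illustration:

```python
import random
import statistics

random.seed(0)

# Toy two-module system (invented): module A maps input a to an
# intermediate state; module B maps (state, b) to the output.
def module_a(a):
    return 2.0 * a + 1.0

def module_b(state, b):
    return state ** 2 + b

def full_system(a, b):
    return module_b(module_a(a), b)

# Black-box Monte Carlo: sample all uncertain inputs jointly and observe
# only the final output, ignoring the modular structure entirely.
samples = [full_system(random.gauss(0.0, 0.1), random.gauss(1.0, 0.2))
           for _ in range(10_000)]
mean = statistics.fmean(samples)
std = statistics.stdev(samples)
```

A hybrid scheme of the kind the abstract proposes would instead represent each module's local uncertainty separately (by sampling or PCE) rather than sampling the composed system as one opaque function.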
Optimal Uncertainty Quantification
Owhadi, Houman; Sullivan, Timothy John; McKerns, Mike; Ortiz, Michael
2010-01-01
We propose a rigorous framework for Uncertainty Quantification (UQ) in which the UQ objectives and the assumptions/information set are brought to the forefront. This framework, which we call \emph{Optimal Uncertainty Quantification} (OUQ), is based on the observation that, given a set of assumptions and information about the problem, there exist optimal bounds on uncertainties: these are obtained as extreme values of well-defined optimization problems corresponding to extremizing probabilities of failure, or of deviations, subject to the constraints imposed by the scenarios compatible with the assumptions and information. In particular, this framework does not implicitly impose inappropriate assumptions, nor does it repudiate relevant information. Although OUQ optimization problems are extremely large, we show that under general conditions, they have finite-dimensional reductions. As an application, we develop \emph{Optimal Concentration Inequalities} (OCI) of Hoeffding and McDiarmid type. Surprisingly, contr...
Uncertainty and calibration analysis
Coutts, D.A.
1991-03-01
All measurements contain some deviation from the true value which is being measured. In the common vernacular this deviation between the true value and the measured value is called an inaccuracy, an error, or a mistake. Since all measurements contain errors, it is necessary to accept that there is a limit to how accurate a measurement can be. The uncertainty interval, combined with the confidence level, is one measure of the accuracy of a measurement or value. Without a statement of uncertainty (or a similar parameter) it is not possible to evaluate whether the accuracy of the measurement, or data, is appropriate. The preparation of technical reports, calibration evaluations, and design calculations should consider the accuracy of the measurements and data being used. There are many methods to accomplish this. This report provides a consistent method for the handling of measurement tolerances, calibration evaluations, and uncertainty calculations. The SRS Quality Assurance (QA) Program requires that the uncertainty of technical data and instrument calibrations be acknowledged and estimated. The QA Program imposes some specific technical requirements related to the subject but does not provide a philosophy or method for how uncertainty should be estimated. This report was prepared to provide a technical basis to support the calculation of uncertainties and the calibration of measurement and test equipment for any activity within the Experimental Thermal-Hydraulics (ETH) Group. The methods proposed in this report provide a graded approach for estimating the uncertainty of measurements, data, and calibrations. The method is based on the national consensus standard, ANSI/ASME PTC 19.1.
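A minimal sketch of a PTC 19.1-style combination of bias and precision errors, assuming the common root-sum-square form with a large-sample Student's t of 2; the numbers are illustrative, not from the report:

```python
import math

# Illustrative PTC 19.1-style root-sum-square combination: bias_limit is
# the bias (systematic) error limit, std_dev the precision index (sample
# standard deviation) of n repeated readings, and t ~ 2 approximates the
# 95% Student's t for large n.
def rss_uncertainty(bias_limit, std_dev, n, t=2.0):
    precision_of_mean = std_dev / math.sqrt(n)
    return math.sqrt(bias_limit ** 2 + (t * precision_of_mean) ** 2)

u = rss_uncertainty(bias_limit=0.5, std_dev=0.8, n=16)
```

The root-sum-square form treats the bias and precision components as independent, which is the usual justification for combining them in quadrature.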
Optimization Under Generalized Uncertainty
Lodwick, Weldon
Optimization Under Generalized Uncertainty. Optimization Modeling, Math 4794/5794, Spring 2013. Weldon A. Lodwick (Weldon.Lodwick@ucdenver.edu), 2/14/2013. ... in the context of optimization problems. The theoretical framework for these notes is interval analysis. ...
Essays on pricing under uncertainty
Escobari Urday, Diego Alfonso
2008-10-10
This dissertation analyzes pricing under uncertainty focusing on the U.S. airline industry. It sets to test theories of price dispersion driven by uncertainty in the demand by taking advantage of very detailed information about the dynamics...
Predicting System Performance with Uncertainty
Yan, B.; Malkawi, A.
2012-01-01
The main purpose of this research is to include uncertainty that lies in the modeling process and that arises from input values when predicting system performance, and to incorporate uncertainty related to system controls in a computationally...
The impact of uncertainty and risk measures
Jo, Soojin
2012-01-01
First, it recovers a more reliable historical oil price uncertainty series ... As a result, the historical oil price uncertainty series is ...
Rodriguez, E. A.; Pepin, J. E.; Thacker, B. H.; Riha, D. S.
2002-01-01
Los Alamos National Laboratory (LANL), in cooperation with Southwest Research Institute, has been developing capabilities to provide reliability-based structural evaluation techniques for performing weapon component and system reliability assessments. The development and applications of Probabilistic Structural Analysis Methods (PSAM) is an important ingredient in the overall weapon reliability assessments. Focus, herein, is placed on the uncertainty quantification associated with the structural response of a containment vessel for high-explosive (HE) experiments. The probabilistic dynamic response of the vessel is evaluated through the coupling of the probabilistic code NESSUS with the non-linear structural dynamics code, DYNA-3D. The probabilistic model includes variations in geometry and mechanical properties, such as Young's Modulus, yield strength, and material flow characteristics. Finally, the probability of exceeding a specified strain limit, which is related to vessel failure, is determined.
Uncertainty in emissions projections for climate models
Webster, Mort David; Babiker, Mustafa H.M.; Mayer, Monika; Reilly, John M.; Harnisch, Jochen; Hyman, Robert C.; Sarofim, Marcus C.; Wang, Chien
Future global climate projections are subject to large uncertainties. Major sources of this uncertainty are projections of anthropogenic emissions. We evaluate the uncertainty in future anthropogenic emissions using a ...
Policy Uncertainty and Household Savings
Giavazzi, Francesco
Using German microdata and a quasi-natural experiment, we provide evidence on how households respond to an increase in uncertainty. We find that household saving increases significantly following the increase in political ...
Game Theory for auctions Uncertainty
Dignum, Frank
The world is uncertain: 60% chance of oil, 40% chance of no oil. [Garbled payoff matrices for the wide / narrow / ¬drill choices under exogenous uncertainty in static games.]
Uncertainty Quantification for Nuclear Density Functional Theory...
Office of Scientific and Technical Information (OSTI)
Uncertainty Quantification for Nuclear Density Functional Theory and Information Content of New Measurements Citation Details In-Document Search Title: Uncertainty Quantification...
Uncertainty quantification approaches for advanced reactor analyses.
Briggs, L. L.; Nuclear Engineering Division
2009-03-24
The original approach to nuclear reactor design or safety analyses was to make very conservative modeling assumptions so as to ensure meeting the required safety margins. Traditional regulation, as established by the U.S. Nuclear Regulatory Commission, required conservatisms which have subsequently been shown to be excessive. The Commission has therefore moved away from excessively conservative evaluations and has determined best-estimate calculations to be an acceptable alternative to conservative models, provided the best-estimate results are accompanied by an uncertainty evaluation which can demonstrate that, when a set of analysis cases which statistically account for uncertainties of all types are generated, there is a 95% probability that at least 95% of the cases meet the safety margins. To date, nearly all published work addressing uncertainty evaluations of nuclear power plant calculations has focused on light water reactors and on large-break loss-of-coolant accident (LBLOCA) analyses. However, there is nothing in the uncertainty evaluation methodologies that is limited to a specific type of reactor or to specific types of plant scenarios. These same methodologies can be equally well applied to analyses for high-temperature gas-cooled reactors and to liquid metal reactors, and they can be applied to steady-state calculations, operational transients, or severe accident scenarios. This report reviews and compares both statistical and deterministic uncertainty evaluation approaches. Recommendations are given for selection of an uncertainty methodology and for considerations to be factored into the process of evaluating uncertainties for advanced reactor best-estimate analyses.
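The 95/95 criterion mentioned here is commonly met with a Wilks-style nonparametric sample size; a sketch for the first-order, one-sided case (an assumption for illustration — the report itself surveys several methodologies):

```python
# First-order, one-sided Wilks formula: with n runs, the sample maximum
# bounds the 95th percentile with confidence 1 - 0.95**n. Find the
# smallest n reaching 95% confidence (the classic 95/95 criterion).
def wilks_sample_size(coverage=0.95, confidence=0.95):
    n = 1
    while 1.0 - coverage ** n < confidence:
        n += 1
    return n

print(wilks_sample_size())  # → 59
```

For the 95/95 criterion this yields the well-known 59 code runs; higher-order formulations or two-sided intervals require more.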
The impact of uncertainty and risk measures
Jo, Soojin; Jo, Soojin
2012-01-01
historical uncertainty series by incorporating a realized volatility series from daily oil price data
Davis, C B
1987-08-01
The uncertainties of calculations of loss-of-feedwater transients at Davis-Besse Unit 1 were determined to address concerns of the US Nuclear Regulatory Commission relative to the effectiveness of feed and bleed cooling. Davis-Besse Unit 1 is a pressurized water reactor of the raised-loop Babcock and Wilcox design. A detailed, quality-assured RELAP5/MOD2 model of Davis-Besse was developed at the Idaho National Engineering Laboratory. The model was used to perform an analysis of the loss-of-feedwater transient that occurred at Davis-Besse on June 9, 1985. A loss-of-feedwater transient followed by feed and bleed cooling was also calculated. The evaluation of uncertainty was based on the comparisons of calculations and data, comparisons of different calculations of the same transient, sensitivity calculations, and the propagation of the estimated uncertainty in initial and boundary conditions to the final calculated results.
Review on Generalized Uncertainty Principle
Tawfik, Abdel Nasser
2015-01-01
Based on string theory, black hole physics, doubly special relativity and some "thought" experiments, minimal distance and/or maximum momentum are proposed. As alternatives to the generalized uncertainty principle (GUP), the modified dispersion relation, the space noncommutativity, the Lorentz invariance violation, and the quantum-gravity-induced birefringence effects are summarized. The origin of minimal measurable quantities and the different GUP approaches are reviewed and the corresponding observations are analysed. Bounds on the GUP parameter are discussed and implemented in understanding recent PLANCK observations on the cosmic inflation. The higher-order GUP approaches predict minimal length uncertainty with and without maximum momenta.
Review on Generalized Uncertainty Principle
Abdel Nasser Tawfik; Abdel Magied Diab
2015-09-22
Uncertainty of Pyrometers in a Casting Facility
Mee, D.K.; Elkins, J.E.; Fleenor, R.M.; Morrision, J.M.; Sherrill, M.W.; Seiber, L.E.
2001-12-07
This work has established uncertainty limits for the EUO filament pyrometers, digital pyrometers, two-color automatic pyrometers, and the standards used to certify these instruments (Table 1). If symmetrical limits are used, filament pyrometers calibrated in Production have certification uncertainties of not more than {+-}20.5 C traceable to NIST over the certification period. Uncertainties of these pyrometers were roughly {+-}14.7 C before introduction of the working standard that allowed certification in the field. Digital pyrometers addressed in this report have symmetrical uncertainties of not more than {+-}12.7 C or {+-}18.1 C when certified on a Y-12 Standards Laboratory strip lamp or in a production area tube furnace, respectively. Uncertainty estimates for automatic two-color pyrometers certified in Production are {+-}16.7 C. Additional uncertainty and bias are introduced when measuring production melt temperatures. A -19.4 C bias was measured in a large 1987 data set which is believed to be caused primarily by use of Pyrex{trademark} windows (not present in current configuration) and window fogging. Large variability (2{sigma} = 28.6 C) exists in the first 10 min of the hold period. This variability is attributed to emissivity variation across the melt and reflection from hot surfaces. For runs with hold periods extending to 20 min, the uncertainty approaches the calibration uncertainty of the pyrometers. When certifying pyrometers on a strip lamp at the Y-12 Standards Laboratory, it is important to limit ambient temperature variation (23{+-}4 C), to order calibration points from high to low temperatures, to allow 6 min for the lamp to reach thermal equilibrium (12 min for certifications below 1200 C) to minimize pyrometer bias, and to calibrate the pyrometer if error exceeds vendor specifications. A procedure has been written to assure conformance.
Pharmaceutical Waste Management Under Uncertainty
Linninger, Andreas A.
... of their benefits and costs constitutes a formidable task. Designing plant-wide waste management policies ... This article addresses the problem of finding optimal waste management policies for entire manufacturing sites.
Estimating uncertainty of inference for validation
Booker, Jane M [Los Alamos National Laboratory; Langenbrunner, James R [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Ross, Timothy J [UNM
2010-09-30
We present a validation process based upon the concept that validation is an inference-making activity. This has always been true, but the association has not been as important before as it is now. Previously, theory had been confirmed by more data, and predictions were possible based on data. The process today is to infer from theory to code and from code to prediction, making the role of prediction somewhat automatic, and a machine function. Validation is defined as determining the degree to which a model and code is an accurate representation of experimental test data. Imbedded in validation is the intention to use the computer code to predict. To predict is to accept the conclusion that an observable final state will manifest; therefore, prediction is an inference whose goodness relies on the validity of the code. Quantifying the uncertainty of a prediction amounts to quantifying the uncertainty of validation, and this involves the characterization of uncertainties inherent in theory/models/codes and the corresponding data. An introduction to inference making and its associated uncertainty is provided as a foundation for the validation problem. A mathematical construction for estimating the uncertainty in the validation inference is then presented, including a possibility distribution constructed to represent the inference uncertainty for validation under uncertainty. The estimation of inference uncertainty for validation is illustrated using data and calculations from Inertial Confinement Fusion (ICF). The ICF measurements of neutron yield and ion temperature were obtained for direct-drive inertial fusion capsules at the Omega laser facility. The glass capsules, containing the fusion gas, were systematically selected with the intent of establishing a reproducible baseline of high-yield 10{sup 13}-10{sup 14} neutron output. The deuterium-tritium ratio in these experiments was varied to study its influence upon yield. 
This paper on validation inference is the first in a series of inference uncertainty estimations. While the methods demonstrated are primarily statistical, these do not preclude the use of nonprobabilistic methods for uncertainty characterization. The methods presented permit accurate determinations for validation and eventual prediction. It is a goal that these methods establish a standard against which best practice may evolve for determining degree of validation.
Experimental uncertainty estimation and statistics for data having interval uncertainty.
Kreinovich, Vladik; Oberkampf, William Louis; Ginzburg, Lev; Ferson, Scott; Hajagos, Janos
2007-05-01
This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
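The report's starting point — descriptive statistics over interval-valued data — can be sketched for the sample mean, which is simply the interval of endpoint means. The data below are invented; variance, percentile, and regression bounds require the more involved algorithms the report describes:

```python
# Interval-valued measurements (invented data): each reading is a
# (lo, hi) pair capturing epistemic uncertainty. The sample mean of
# interval data is itself an interval whose endpoints are the means of
# the lower and upper endpoints, respectively.
data = [(1.0, 1.4), (2.1, 2.3), (0.9, 1.5), (1.8, 2.0)]

n = len(data)
mean_lo = sum(lo for lo, _ in data) / n
mean_hi = sum(hi for _, hi in data) / n
interval_mean = (mean_lo, mean_hi)
```

The mean is the easy case because it is monotone in every endpoint; statistics such as the variance are NP-hard to bound exactly for general overlapping intervals, which is why the report analyzes computability as a function of interval overlap and width.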
Greenhouse Gas Inventory Uncertainties Need Characterization
Post, Wilfred M.
Contact: Gregg Marland, 865 ... Abstract: The assessment ... of greenhouse gases (GHGs) emitted to and removed from the atmosphere are essential for understanding global ... U.S. Department of Energy.
December 2008 Uncertainty and Production Planning
Graves, Stephen C.
Stephen C. Graves, MIT, 77 Massachusetts Avenue, E... ... how uncertainty is handled in production planning. We describe and critique current practices ... decisions that are critical to the proper handling of uncertainty in production planning. We observe ...
Quantum Mechanics and the Generalized Uncertainty Principle
Jang Young Bang; Micheal S. Berger
2006-11-30
The generalized uncertainty principle has been described as a general consequence of incorporating a minimal length from a theory of quantum gravity. We consider a simple quantum mechanical model where the operator corresponding to position has discrete eigenvalues and show how the generalized uncertainty principle results for minimum uncertainty wave packets.
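A minimal-length result of the kind described can be seen from the common one-parameter GUP form (an assumed convention for illustration; the paper's own model instead uses an operator with discrete position eigenvalues):

```latex
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\bigl(1+\beta\,(\Delta p)^2\bigr)
\quad\Longrightarrow\quad
\Delta x \;\ge\; \frac{\hbar}{2}\Bigl(\frac{1}{\Delta p}+\beta\,\Delta p\Bigr),
```

The right-hand side is minimized at $\Delta p = 1/\sqrt{\beta}$, giving a nonzero minimal length $\Delta x_{\min} = \hbar\sqrt{\beta}$, in contrast to the ordinary Heisenberg relation, for which $\Delta x$ can be made arbitrarily small.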
Robot Motion Planning with Uncertainty The Challenge
Whitton, Mary C.
Roadmap (SMR), a new motion planning framework that explicitly considers uncertainty in robot motion. Our framework builds on the highly successful approach used in Probabilistic Roadmaps (PRMs): a set of discrete states is selected in the state space, and a roadmap is built that represents their collision ...
Extended Forward Sensitivity Analysis for Uncertainty Quantification
Haihua Zhao; Vincent A. Mousseau
2011-09-01
Verification and validation (V&V) are playing more important roles in quantifying uncertainties and realizing high-fidelity simulations in engineering system analyses, such as transients in a complex nuclear reactor system. Traditional V&V in reactor system analysis focused more on the validation part or did not differentiate verification from validation. The traditional approach to uncertainty quantification is based on a 'black box' approach: the simulation tool is treated as an unknown signal generator, a distribution of inputs according to assumed probability density functions is sent in, and the distribution of the outputs is measured and correlated back to the original input distribution. The 'black box' method mixes numerical errors with all other uncertainties. It is also inefficient for sensitivity analysis. Contrary to the 'black box' method, a more efficient sensitivity approach can take advantage of intimate knowledge of the simulation code. In these types of approaches, equations for the propagation of uncertainty are constructed and the sensitivities are directly solved for as variables in the simulation. This paper presents forward sensitivity analysis as a method to help uncertainty quantification. By including the time step, and potentially the spatial step, as special sensitivity parameters, the forward sensitivity method is extended as one method to quantify numerical errors. Note that by integrating local truncation errors over the whole system through the forward sensitivity analysis process, the generated time-step and spatial-step sensitivity information reflects global numerical errors. The discretization errors can be systematically compared against uncertainties due to other physical parameters. This extension makes the forward sensitivity method a much more powerful tool to help uncertainty quantification.
By knowing the relative sensitivity of the time and space steps compared with other physical parameters of interest, the simulation can be run at optimized time and space steps without affecting confidence in the physical parameter sensitivity results. The time and space step forward sensitivity analysis method can also replace the traditional time step and grid convergence study at much lower computational cost. Several well-defined benchmark problems with manufactured solutions are utilized to demonstrate the extended forward sensitivity analysis method. All the physical solutions, the parameter sensitivity solutions, and even the time step sensitivity in one case have analytical forms, which allows the verification to be performed in the strictest sense.
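The core idea above, solving for sensitivities as extra variables alongside the physical solution, can be sketched in a few lines. The toy ODE, parameter value, and step size below are invented for illustration and are not the authors' reactor model:

```python
# Minimal forward sensitivity sketch (hypothetical toy model): integrate
# dy/dt = -p*y together with its parameter sensitivity s = dy/dp, which
# satisfies the companion equation ds/dt = -y - p*s.
import math

def forward_euler_sensitivity(p=0.5, t_end=1.0, dt=1e-4):
    y, s = 1.0, 0.0                  # y(0) = 1, s(0) = d y(0)/dp = 0
    steps = int(round(t_end / dt))
    for _ in range(steps):
        # Tuple assignment uses the old y in the sensitivity update,
        # as forward Euler requires.
        y, s = y + dt * (-p * y), s + dt * (-y - p * s)
    return y, s

y, s = forward_euler_sensitivity()
# Analytic check at t = 1, p = 0.5: y = exp(-p*t), s = -t*exp(-p*t).
assert abs(y - math.exp(-0.5)) < 1e-3
assert abs(s + math.exp(-0.5)) < 1e-3
```

The sensitivity arrives in the same integration pass as the solution, which is the efficiency gain over rerunning the 'black box' with perturbed inputs.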
Habte, Aron
2015-06-25
This presentation summarizes uncertainty estimation of radiometric data using the Guide to the Expression of Uncertainty in Measurement (GUM) method.
Monte Carlo Methods for Uncertainty Quantification Mathematical Institute, University of Oxford
Giles, Mike
Examples of uncertainty: reservoir modelling, with considerable uncertainty about the porosity of rock; astronomy; "random" spatial fields; uncertainty in modelling parameters, in geometry, and in initial conditions; long-term climate modelling, with many sources of uncertainty including the effects
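The basic Monte Carlo recipe behind examples like these can be sketched as follows; the model function and input distribution are invented for illustration, not taken from the lecture notes:

```python
# Illustrative Monte Carlo uncertainty propagation: push a normally
# distributed input through a nonlinear model and estimate the output
# mean together with its Monte Carlo standard error.
import math
import random

def model(x):
    return math.exp(-x * x)          # hypothetical quantity of interest

random.seed(0)                       # fixed seed for reproducibility
N = 100_000
samples = [model(random.gauss(1.0, 0.2)) for _ in range(N)]

mean = sum(samples) / N
var = sum((v - mean) ** 2 for v in samples) / (N - 1)
std_err = math.sqrt(var / N)         # sampling error decays as 1/sqrt(N)
```

The 1/sqrt(N) decay is the practical pain point these methods address: halving the error costs four times the samples, which motivates variance-reduction and multilevel variants.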
The impact of uncertainty and risk measures
Jo, Soojin
2012-01-01
peak, and finds that this nonlinear transformation of the oil and oil price growth rates. As seen in the above illustration, uncertainty is at its peak
Uncertainty Quantification for Nuclear Density Functional Theory and Information Content of New Measurements
Office of Scientific and Technical Information (OSTI)
Uncertainty Quantification for Nuclear Density Functional Theory and Information Content of New Measurements. This content will become publicly...
Pricing Cloud Bandwidth Reservations under Demand Uncertainty
Li, Baochun
Di Niu, Chen Feng, Baochun Li. A tenant's utility depends not only on its bandwidth usage, but more importantly on the portion of its demand that can be met by all tenants and the cloud provider, even in the presence of demand uncertainty
Dealing with Uncertainty in the Semantic Web
Theune, Mariët
Tjitze Rienstra, M.Sc. Thesis, November 8, 2009. Paul van der Vet, Dr. Maarten Fokkinga. Abstract: Standardizing the Semantic Web is still in progress; parts of the Semantic Web are yet to be standardized. One of these is dealing with uncertainty. Like classical logic
Estimating the uncertainty in underresolved nonlinear dynamics
Chorin, Alexandre; Hald, Ole
2013-06-12
The Mori-Zwanzig formalism of statistical mechanics is used to estimate the uncertainty caused by underresolution in the solution of a nonlinear dynamical system. A general approach is outlined and applied to a simple example. The noise term that describes the uncertainty turns out to be neither Markovian nor Gaussian. It is argued that this is the general situation.
Multivariate Receptor Models and Model Uncertainty
Washington at Seattle, University of
Eun Sug Park, Man-Suk Oh, Peter Guttorp. Estimating the source composition profiles and the source contributions is the main interest in multivariate receptor modeling. Due
Short communication: Smooth pursuit under stimulus-response uncertainty
Bar, Moshe
Simple reaction times (RTs) are typically faster than choice reaction times and increase with stimulus-response (S-R) uncertainty according to RT = a + b·log2(N), where a is the simple RT and b is the slope of the increase. An eye velocity criterion of 1.5° s⁻¹ was used, equivalent to 25% of the stimulus velocity
Quantifying uncertainty in stable isotope mixing models
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Davis, Paul; Syme, James; Heikoop, Jeffrey; Fessenden-Rahn, Julianna; Perkins, George; Newman, Brent; Chrystal, Abbey E.; Hagerty, Shannon B.
2015-05-19
Mixing models are powerful tools for identifying biogeochemical sources and determining mixing fractions in a sample. However, identification of actual source contributors is often not simple, and source compositions typically vary or even overlap, significantly increasing model uncertainty in calculated mixing fractions. This study compares three probabilistic methods: SIAR [Parnell et al., 2010], a pure Monte Carlo technique (PMC), and the Stable Isotope Reference Source (SIRS) mixing model, a new technique that estimates mixing in systems with more than three sources and/or uncertain source compositions. In this paper, we use nitrate stable isotope examples (δ15N and δ18O), but all methods tested are applicable to other tracers. In Phase I of a three-phase blind test, we compared methods for a set of six-source nitrate problems. PMC was unable to find solutions for two of the target water samples. The Bayesian method, SIAR, experienced anchoring problems, and SIRS calculated mixing fractions that most closely approximated the known mixing fractions. For that reason, SIRS was the only approach used in the next phase of testing. In Phase II, the problem was broadened so that any subset of the six sources could be a possible solution to the mixing problem. Results showed a high rate of Type I errors, where solutions included sources that were not contributing to the sample. In Phase III, eliminating some sources based on assumed site knowledge and assumed nitrate concentrations substantially reduced mixing fraction uncertainties and lowered the Type I error rate. These results demonstrate that valuable insights into stable isotope mixing problems result from probabilistic mixing model approaches like SIRS.
The results also emphasize the importance of identifying a minimal set of potential sources and quantifying uncertainties in source isotopic composition, as well as demonstrating the value of additional information in reducing the uncertainty in calculated mixing fractions.
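The pure Monte Carlo (PMC) style of mixing analysis discussed above can be sketched for a two-source case; the δ15N source means, spreads, and sample value below are invented for illustration and are not data from this study:

```python
# PMC-style two-source mixing sketch: draw uncertain source compositions,
# solve the mass balance for the mixing fraction, and keep only
# physically valid mixtures.
import random

random.seed(1)
d_sample = 5.0                         # measured d15N of the mixture
fractions = []
for _ in range(50_000):
    d_a = random.gauss(10.0, 1.0)      # uncertain source A composition
    d_b = random.gauss(0.0, 1.0)       # uncertain source B composition
    f = (d_sample - d_b) / (d_a - d_b) # mass-balance fraction of source A
    if 0.0 <= f <= 1.0:                # reject unphysical mixtures
        fractions.append(f)

mean_f = sum(fractions) / len(fractions)
```

The spread of the retained fractions, not just their mean, is the output of interest: it shows directly how source-composition uncertainty inflates mixing-fraction uncertainty.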
Generalized Uncertainty Principle: Approaches and Applications
Abdel Nasser Tawfik; Abdel Magied Diab
2014-11-23
We review highlights from string theory, black hole physics, and doubly special relativity, together with some "thought" experiments suggested to probe the shortest distance and/or the maximum momentum at the Planck scale. Models designed to implement a minimal length scale and/or a maximum momentum in different physical systems are analysed; these entered the literature as the Generalized Uncertainty Principle (GUP), and we compare them. The existence of a minimal length and a maximum momentum accuracy is preferred by various physical observations. Furthermore, assuming a modified dispersion relation allows for a wide range of applications in estimating, for example, the inflationary parameters, Lorentz invariance violation, black hole thermodynamics, Salecker-Wigner inequalities, the entropic nature of the gravitational laws, the Friedmann equations, minimal time measurement, and the thermodynamics of high-energy collisions. One of the higher-order GUP approaches gives predictions for the minimal length uncertainty. Another predicts a maximum momentum and a minimal length uncertainty simultaneously. An extensive comparison between the different GUP approaches is summarized. We also discuss the GUP impacts on the equivalence principles, including the universality of the gravitational redshift, free fall, and the law of reciprocal action, and on the kinetic energy of a composite system; the compatibility of the GUP with these principles should be addressed. We conclude that the values of the GUP parameters remain a puzzle to be verified.
Uncertainty Analysis for Photovoltaic Degradation Rates (Poster)
Jordan, D.; Kurtz, S.; Hansen, C.
2014-04-01
Dependable and predictable energy production is the key to the long-term success of the PV industry. Over their exposure lifetime, PV systems show a gradual decline that depends on many different factors such as module technology, module type, mounting configuration, climate, etc. When degradation rates are determined from continuous data, the statistical uncertainty is easily calculated from the regression coefficients. However, total uncertainty, which includes measurement uncertainty and instrumentation drift, is far more difficult to determine. A Monte Carlo simulation approach was chosen to carry out a comprehensive uncertainty analysis. The most important factor for degradation rates is to avoid instrumentation that changes over time in the field; for instance, a drifting irradiance sensor can lead to substantially erroneous degradation rates, and drift can be prevented through regular calibration. By contrast, the absolute accuracy of the irradiance sensor has negligible impact on degradation rate uncertainty, emphasizing that precision (relative accuracy) is more important than absolute accuracy.
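The drifting-sensor effect described above can be illustrated with a toy least-squares fit; the drift rate and sampling schedule are invented numbers, not NREL data:

```python
# Fit a linear trend to a perfectly stable "performance" signal measured
# with a sensor that drifts; the fitted slope is pure artifact and is
# indistinguishable from a real degradation rate.
years = [0.5 * i for i in range(21)]           # 10 years, semiannual points
drift = -0.5                                   # sensor drift, % per year
measured = [100.0 + drift * t for t in years]  # true performance is flat

n = len(years)
mean_t = sum(years) / n
mean_y = sum(measured) / n
slope = sum((t - mean_t) * (y - mean_y) for t, y in zip(years, measured)) \
        / sum((t - mean_t) ** 2 for t in years)
# slope recovers the drift: a spurious "degradation rate" of -0.5 %/yr
```

An offset (pure accuracy) error, by contrast, shifts every point equally and leaves the fitted slope untouched, which is the precision-versus-accuracy point the abstract makes.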
Majorization formulation of uncertainty in quantum mechanics
Partovi, M. Hossein [Department of Physics and Astronomy, California State University, Sacramento, California 95819-6041 (United States)
2011-11-15
Heisenberg's uncertainty principle is formulated for a set of generalized measurements within the framework of majorization theory, resulting in a partial uncertainty order on probability vectors that is stronger than those based on quasientropic measures. The theorem that emerges from this formulation guarantees that the uncertainty of the results of a set of generalized measurements without a common eigenstate has an inviolable lower bound which depends on the measurement set but not the state. A corollary to this theorem yields a parallel formulation of the uncertainty principle for generalized measurements corresponding to the entire class of quasientropic measures. Optimal majorization bounds for two and three mutually unbiased bases in two dimensions are calculated. Similarly, the leading term of the majorization bound for position and momentum measurements is calculated which provides a strong statement of Heisenberg's uncertainty principle in direct operational terms. Another theorem provides a majorization condition for the least-uncertain generalized measurement of a given state with interesting physical implications.
The Role of Uncertainty Quantification for Reactor Physics
Salvatores, Massimo; Palmiotti, Giuseppe; Aliberti, G.
2015-01-01
The quantification of uncertainties is a crucial step in design. Comparing a-priori uncertainties with the target accuracies allows one to define needs and priorities for uncertainty reduction. In view of their impact, the uncertainty analysis requires a reliability assessment of the uncertainty data used. The choice of the appropriate approach and the consistency of different approaches are discussed.
Pouliot, Jean
Brachytherapy of prostate cancer. Yongbok Kim, I-Chow Joe Hsu, Etienne Lessard, and Jean Pouliot. In computed tomography (CT)-based high dose rate (HDR) brachytherapy, the uncertainty in the localization is examined. DOI: 10.1118/1.1785454. Key words: high dose rate brachytherapy, computed tomography, prostate
EFFICIENT UNCERTAINTY ANALYSIS METHODS FOR MULTIDISCIPLINARY ROBUST DESIGN
Chen, Wei
Xiaoping Du and Wei Chen. A multidisciplinary robust design procedure that utilizes efficient methods for uncertainty analysis is developed. The efficient techniques used for uncertainty analysis significantly reduce the number of design evaluations
DAMAGE ASSESSMENT OF COMPOSITE PLATE STRUCTURES WITH UNCERTAINTY
Boyer, Edmond
Chandrashekhar M., Ranjan. Uncertainties associated with a structural model and measured vibration data may lead to unreliable damage assessment; it is shown that material uncertainties in composite structures cause considerable problems in damage assessment
Analysis of S-Circuit Uncertainty
Ahmed, Taahir
2011-08-08
The theory of sensori-computational circuits provides a capable framework for the description and optimization of robotic systems, including on-line optimizations. This theory, however, is inadequate in that it does not account for uncertainty in a...
Uncertainty in climate change policy analysis
Jacoby, Henry D.; Prinn, Ronald G.
Achieving agreement about whether and how to control greenhouse gas emissions would be difficult enough even if the consequences were fully known. Unfortunately, choices must be made in the face of great uncertainty, about ...
An uncertainty principle for unimodular quantum groups
Crann, Jason; Kalantar, Mehrdad E-mail: mkalanta@math.carleton.ca
2014-08-15
We present a generalization of Hirschman's entropic uncertainty principle for locally compact Abelian groups to unimodular locally compact quantum groups. As a corollary, we strengthen a well-known uncertainty principle for compact groups, and generalize the relation to compact quantum groups of Kac type. We also establish the complementarity of finite-dimensional quantum group algebras. In the non-unimodular setting, we obtain an uncertainty relation for arbitrary locally compact groups using the relative entropy with respect to the Haar weight as the measure of uncertainty. We also show that when restricted to q-traces of discrete quantum groups, the relative entropy with respect to the Haar weight reduces to the canonical entropy of the random walk generated by the state.
Estimating uncertainties in integrated reservoir studies
Zhang, Guohong
2004-09-30
The integrated mismatch method tends to generate smaller ranges of uncertainty than many existing methods. When starting from nonoptimal reservoir models, in some cases the integrated mismatch method is able to bracket the true reserves...
Multifidelity methods for multidisciplinary design under uncertainty
Christensen, Daniel Erik
2012-01-01
For computational design and analysis tasks, scientists and engineers often have available many different simulation models. The output of each model has an associated uncertainty that is a result of the modeling process. ...
Uncertainty Quantification Techniques of SCALE/TSUNAMI
Rearden, Bradley T. [ORNL]; Mueller, Don [ORNL]
2011-01-01
The Standardized Computer Analysis for Licensing Evaluation (SCALE) code system developed at Oak Ridge National Laboratory (ORNL) includes Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI). The TSUNAMI code suite can quantify the predicted change in system responses, such as k{sub eff}, reactivity differences, or ratios of fluxes or reaction rates, due to changes in the energy-dependent, nuclide-reaction-specific cross-section data. Where uncertainties in the neutron cross-section data are available, the sensitivity of the system to the cross-section data can be applied to propagate the uncertainties in the cross-section data to an uncertainty in the system response. Uncertainty quantification is useful for identifying potential sources of computational biases and highlighting parameters important to code validation. Traditional validation techniques often examine one or more average physical parameters to characterize a system and identify applicable benchmark experiments. However, with TSUNAMI correlation coefficients are developed by propagating the uncertainties in neutron cross-section data to uncertainties in the computed responses for experiments and safety applications through sensitivity coefficients. The bias in the experiments, as a function of their correlation coefficient with the intended application, is extrapolated to predict the bias and bias uncertainty in the application through trending analysis or generalized linear least squares techniques, often referred to as 'data adjustment.' Even with advanced tools to identify benchmark experiments, analysts occasionally find that the application models include some feature or material for which adequately similar benchmark experiments do not exist to support validation. For example, a criticality safety analyst may want to take credit for the presence of fission products in spent nuclear fuel. 
In such cases, analysts sometimes rely on 'expert judgment' to select an additional administrative margin to account for the gap in the validation data or to conclude that the impact on the calculated bias and bias uncertainty is negligible. As a result of advances in computer programs and the evolution of cross-section covariance data, analysts can use the sensitivity and uncertainty analysis tools in the TSUNAMI codes to estimate the potential impact on the application-specific bias and bias uncertainty resulting from nuclides not represented in available benchmark experiments. This paper presents the application of methods described in a companion paper.
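The sensitivity-based propagation described above is often written as the "sandwich rule" u² = S C Sᵀ. A minimal sketch with invented sensitivity and covariance values (illustrative numbers, not SCALE/TSUNAMI data):

```python
# Sandwich rule for sensitivity-based uncertainty propagation: combine a
# response's relative sensitivities S with the relative covariance C of
# the underlying cross-section parameters to get u^2 = S C S^T.
import math

S = [0.8, -0.3]               # relative sensitivities of the response
C = [[0.0004, 0.0001],        # relative covariance matrix of two
     [0.0001, 0.0009]]        # cross-section parameters

u_sq = sum(S[i] * C[i][j] * S[j] for i in range(2) for j in range(2))
u_rel = math.sqrt(u_sq)       # relative uncertainty in the response
```

The off-diagonal covariance terms matter: here the correlated contribution partially cancels the diagonal terms because the two sensitivities have opposite signs.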
Risk Analysis and Decision-Making Under Uncertainty: A Strategy and its Applications
Office of Environmental Management (EM)
Risk Analysis and Decision-Making Under Uncertainty: A Strategy and its Applications. Ming Ye...
Improvements of Nuclear Data and Its Uncertainties by Theoretical Modeling
Office of Scientific and Technical Information (OSTI)
Improvements of Nuclear Data and Its Uncertainties by Theoretical Modeling.
Addressing Uncertainties in Design Inputs: A Case Study of Probabilistic Settlement Evaluations for Soft Zone Collapse at SWPF
Office of Environmental Management (EM)
Addressing Uncertainties in Design Inputs: A Case Study of Probabilistic Settlement Evaluations for Soft Zone Collapse at SWPF.
Asymptotic and uncertainty analyses of a phase field model for void formation under irradiation
Office of Scientific and Technical Information (OSTI)
Asymptotic and uncertainty analyses of a phase field model for void formation under irradiation.
Uncertainties and severe-accident management
Kastenberg, W.E. (Univ. of California, Los Angeles (United States))
1991-01-01
Severe-accident management can be defined as the use of existing and/or alternative resources, systems, and actions to prevent or mitigate a core-melt accident. Together with risk management (e.g., changes in plant operation and/or addition of equipment) and emergency planning (off-site actions), accident management provides an extension of the defense-in-depth safety philosophy for severe accidents. A significant number of probabilistic safety assessments have been completed, which yield the principal plant vulnerabilities, and these can be categorized as (a) dominant sequences with respect to core-melt frequency, (b) dominant sequences with respect to various risk measures, (c) dominant threats that challenge safety functions, and (d) dominant threats with respect to failure of safety systems. Severe-accident management strategies can be generically classified as (a) use of alternative resources, (b) use of alternative equipment, and (c) use of alternative actions. For each sequence/threat and each combination of strategy, there may be several options available to the operator. Each strategy/option involves phenomenological and operational considerations regarding uncertainty. These include (a) uncertainty in key phenomena, (b) uncertainty in operator behavior, (c) uncertainty in system availability and behavior, and (d) uncertainty in information availability (i.e., instrumentation). This paper focuses on phenomenological uncertainties associated with severe-accident management strategies.
Error propagation equations for estimating the uncertainty in high-speed wind tunnel test results
Clark, E.L.
1994-07-01
Error propagation equations, based on the Taylor series model, are derived for the nondimensional ratios and coefficients most often encountered in high-speed wind tunnel testing. These include pressure ratio and coefficient, static force and moment coefficients, dynamic stability coefficients, and calibration Mach number. The error equations contain partial derivatives, denoted as sensitivity coefficients, which define the influence of free-stream Mach number, M{infinity}, on various aerodynamic ratios. To facilitate use of the error equations, sensitivity coefficients are derived and evaluated for five fundamental aerodynamic ratios which relate free-stream test conditions to a reference condition.
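A minimal worked example of the Taylor-series propagation described above, for a simple pressure ratio r = p/q; the measured values and uncertainties are invented for illustration, not wind tunnel data:

```python
# Taylor-series error propagation for a pressure ratio r = p/q: evaluate
# the sensitivity coefficients (partial derivatives) at the measured
# values, then combine independent error sources in root-sum-square.
import math

p, q = 150.0, 100.0           # measured pressures
u_p, u_q = 0.5, 0.4           # standard uncertainties of p and q

r = p / q
dr_dp = 1.0 / q               # sensitivity coefficient dr/dp
dr_dq = -p / q ** 2           # sensitivity coefficient dr/dq

# Root-sum-square combination for independent error sources.
u_r = math.sqrt((dr_dp * u_p) ** 2 + (dr_dq * u_q) ** 2)
```

The same pattern extends term by term to force coefficients and calibration Mach number: one partial derivative per measured input, then a root-sum-square.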
ENHANCED UNCERTAINTY ANALYSIS FOR SRS COMPOSITE ANALYSIS
Smith, F.; Phifer, M.
2011-06-30
The Composite Analysis (CA) performed for the Savannah River Site (SRS) in 2009 (SRS CA 2009) included a simplified uncertainty analysis. The uncertainty analysis in the CA (Smith et al. 2009b) was limited to considering at most five sources in a separate uncertainty calculation performed for each POA. To perform the uncertainty calculations in a reasonable amount of time, the analysis was limited to 400 realizations and 2,000 years of simulated transport time, and the time steps used for the uncertainty analysis were increased from those used in the CA base case analysis. As part of the CA maintenance plan, the Savannah River National Laboratory (SRNL) committed to improving the CA uncertainty/sensitivity analysis. The previous uncertainty analysis was constrained by the standard GoldSim licensing, which limits the user to running at most four Monte Carlo uncertainty calculations (also called realizations) simultaneously. Some of the limitations on the number of realizations that could be practically run and on the simulation time steps were removed by building a cluster of three HP ProLiant Windows servers with a total of 36 64-bit processors and by licensing the GoldSim DP-Plus distributed processing software. This allowed running as many as 35 realizations simultaneously (one processor is reserved as a master process that controls running the realizations). These enhancements to SRNL computing capabilities made feasible an uncertainty analysis using 1,000 realizations and the time steps employed in the base case CA calculations, with more sources, and simulating radionuclide transport for 10,000 years. In addition, an importance screening analysis was performed to identify the class of stochastic variables that have the most significant impact on model uncertainty.
This analysis ran the uncertainty model separately testing the response to variations in the following five sets of model parameters: (a) K{sub d} values (72 parameters for the 36 CA elements in sand and clay), (b) Dose Parameters (34 parameters), (c) Material Properties (20 parameters), (d) Surface Water Flows (6 parameters), and (e) Vadose and Aquifer Flow (4 parameters). Results provided an assessment of which group of parameters is most significant in the dose uncertainty. It was found that K{sub d} and the vadose/aquifer flow parameters, both of which impact transport timing, had the greatest impact on dose uncertainty. Dose parameters had an intermediate level of impact while material properties and surface water flows had little impact on dose uncertainty. Results of the importance analysis are discussed further in Section 7 of this report. The objectives of this work were to address comments received during the CA review on the uncertainty analysis and to demonstrate an improved methodology for CA uncertainty calculations as part of CA maintenance. This report partially addresses the LFRG Review Team issue of producing an enhanced CA sensitivity and uncertainty analysis. This is described in Table 1-1 which provides specific responses to pertinent CA maintenance items extracted from Section 11 of the SRS CA (2009). As noted above, the original uncertainty analysis looked at each POA separately and only included the effects from at most five sources giving the highest peak doses at each POA. Only 17 of the 152 CA sources were used in the original uncertainty analysis and the simulation time was reduced from 10,000 to 2,000 years. A major constraint on the original uncertainty analysis was the limitation of only being able to use at most four distributed processes. This work expanded the analysis to 10,000 years using 39 of the CA sources, included cumulative dose effects at downstream POAs, with more realizations (1,000) and finer time steps. 
This was accomplished by using the GoldSim DP-Plus module and the 36 processors available on a new Windows cluster. The last part of the work looked at the contribution to overall uncertainty from the main categories of uncertainty variables: K{sub d}s, dose parameters, flow parameters, and material properties.
Uncertainties in Galactic Chemical Evolution Models
Côté, Benoit; O'Shea, Brian W.; Herwig, Falk; Pignatari, Marco; Jones, Samuel; Fryer, Chris
2015-01-01
We use a simple one-zone galactic chemical evolution model to quantify the uncertainties generated by the input parameters in numerical predictions, for a galaxy with properties similar to those of the Milky Way. We compiled several studies from the literature to gather the current constraints for our simulations regarding the typical value and uncertainty of seven basic parameters, which are: the lower and upper mass limit of the stellar initial mass function (IMF), the slope of the high-mass end of the stellar IMF, the slope of the delay-time distribution function of Type Ia supernovae (SNe Ia), the number of SNe Ia per solar mass formed, the total stellar mass formed, and the initial mass of gas of the galaxy. We derived a probability distribution function to express the range of likely values for every parameter, which were then included in a Monte Carlo code to run several hundred simulations with randomly selected input parameters. This approach enables us to analyze the predicted chemical evolution of ...
Dutta, Parikshit
2012-10-19
Recently there has been growing interest in characterizing and reducing uncertainty in stochastic dynamical systems. This drive arises out of the need to manage uncertainty in complex, high-dimensional physical systems. Traditional ...
Quantifying the Objective Cost of Uncertainty in Complex Dynamical Systems
Yoon, Byung-Jun
The mean objective cost of uncertainty (MOCU) quantifies the uncertainty in a given system based on the expected increase of the operational cost, with applications in translational genomics. Index terms: mean objective cost of uncertainty (MOCU), objective-based uncertainty quantification.
Bounded Uncertainty Roadmaps for Path Planning Leonidas J. Guibas1
Guibas, Leonidas J.
Leonidas J. Guibas, David Hsu, Hanna Kurniawati. The approach accounts for uncertainty during planning: we introduce the notion of a bounded uncertainty roadmap (BURM), and it is not much slower than classic probabilistic roadmap planning algorithms, which ignore uncertainty
Uncertainty in Simulating Wheat Yields Under Climate Change
Asseng, S.; Ewert, F.; Rosenzweig, C.; Jones, J.W.; Hatfield, Jerry; Ruane, Alex; Boote, K. J.; Thorburn, Peter; Rotter, R.P.; Cammarano, D.; Brisson, N.; Basso, B.; Martre, P.; Aggarwal, P.K.; Angulo, C.; Bertuzzi, P.; Biernath, C.; Challinor, AJ; Doltra, J.; Gayler, S.; Goldberg, R.; Grant, Robert; Heng, L.; Hooker, J.; Hunt, L.A.; Ingwersen, J.; Izaurralde, Roberto C.; Kersebaum, K.C.; Mueller, C.; Naresh Kumar, S.; Nendel, C.; O'Leary, G.O.; Olesen, JE; Osborne, T.; Palosuo, T.; Priesack, E.; Ripoche, D.; Semenov, M.A.; Shcherbak, I.; Steduto, P.; Stockle, Claudio O.; Stratonovitch, P.; Streck, T.; Supit, I.; Tao, F.; Travasso, M.; Waha, K.; Wallach, D.; White, J.W.; Williams, J.R.; Wolf, J.
2013-09-01
Anticipating the impacts of climate change on crop yields is critical for assessing future food security. Process-based crop simulation models are the most commonly used tools in such assessments1,2. Analysis of uncertainties in future greenhouse gas emissions and their impacts on future climate change has been increasingly described in the literature3,4, while assessments of the uncertainty in crop responses to climate change are very rare. Systematic and objective comparisons across impact studies are difficult, and thus have not been fully realized5. Here we present the largest coordinated and standardized crop model intercomparison for climate change impacts on wheat production to date. We found that several individual crop models are able to reproduce measured grain yields under current diverse environments, particularly if sufficient details are provided to execute them. However, simulated climate change impacts can vary across models due to differences in model structures and algorithms. The crop-model component of uncertainty in climate change impact assessments was considerably larger than the climate-model component from Global Climate Models (GCMs). Model responses to high temperatures and temperature-by-CO2 interactions are identified as major sources of simulated impact uncertainties. Significant reductions in impact uncertainties through model improvements in these areas, and improved quantification of uncertainty through multi-model ensembles, are urgently needed for a more reliable translation of climate change scenarios into agricultural impacts in order to develop adaptation strategies and aid policymaking.
Uncertainty in climate science and climate policy
Jonathan Rougier, University of Bristol, UK; Michel. Introduction: This essay, written by a statistician and a climate scientist, describes our view of the gap that exists between current practice in mainstream climate science, and the practical
Outage Probability Under Channel Distribution Uncertainty
Loyka, Sergey
Ioanna Ioannou, Charalambos D. Charalambous, and Sergey Loyka. Abstract: the outage probability of a class of block-fading (MIMO) channels is studied, with the outage probability defined as a min (over the input distribution) - max (over the channel distribution class)
Time Crystals from Minimum Time Uncertainty
Mir Faizal; Mohammed M. Khalil; Saurya Das
2014-12-29
Motivated by the Generalized Uncertainty Principle, covariance, and a minimum measurable time, we propose a deformation of the Heisenberg algebra, and show that this leads to corrections to all quantum mechanical systems. We also demonstrate that such a deformation implies a discrete spectrum for time. In other words, time behaves like a crystal.
Bayesian Environmetrics: Uncertainty and Sensitivity Analysis
Draper, David
Outline: two case studies, both involving risk assessment for nuclear waste disposal. The method has better repeated-sampling properties than maximum likelihood. Fluid dynamics: uncertainty and sensitivity analysis and inverse problems. Case Study 1: GESAMAC nuclear fission
Nuclear power expansion: thinking about uncertainty
Holt, Lynne; Sotkiewicz, Paul; Berg, Sanford
2010-06-15
Nuclear power is one of many options available to achieve reduced carbon dioxide emissions. The real-option value model can help explain the uncertainties facing prospective nuclear plant developers and inform mitigation strategies for the development, construction, and operation of new nuclear plants. (author)
Coping with uncertainties of mercury regulation
Reich, K.
2006-09-15
The thermometer is rising as coal-fired plants cope with the uncertainties of mercury regulation. The paper deals with a diagnosis and a suggested cure. It describes the state of mercury emission rules in the different US states, many of which had laws or rules in place before the Clean Air Mercury Rule (CAMR) was promulgated.
Dealing with Uncertainties During Heat Exchanger Design
Polley, G. T.; Pugh, S. J.
2001-01-01
Over the last thirty years much progress has been made in heat exchanger design methodology. Even so, the design engineer still has to deal with a great deal of uncertainty. Whilst the methods used to predict heat transfer coefficients are now quite...
CLIMATE AND CLIMATE CHANGE CERTAINTIES AND UNCERTAINTIES
Schwartz, Stephen E.
Stephen E. Schwartz, http://www.ecd.bnl.gov/steve/schwartz.html. December 4, 2001; updated March 4, 2002. Outline: overview of the Earth climate system; increased concentrations of "greenhouse gases"; radiative forcing of climate change; climate system response.
Visual Exploration of Uncertainty in Remotesensing Classification
Utrecht, Universiteit
Frans J.M. van der Wel. The analysis of remotely sensed data aims at acquiring insight into the stability of possible classifications amid an overwhelming flow of data on the appearance and condition of our planet. The data yielded by remote sensing can...
COMPARISON OF UNCERTAINTY PARAMETERISATIONS FOR H∞ ROBUST CONTROL OF TURBOCHARGED DIESEL ENGINES
Cambridge, University of
Mismatches arose during model validation for a turbocharged diesel engine (cf. Sections 2 and 3). The plant to be controlled is a turbocharged passenger-car diesel engine equipped with exhaust...
UAV mission planning under uncertainty
Sakamoto, Philemon
2006-01-01
With the continued development of high endurance Unmanned Aerial Vehicles (UAV) and Unmanned Combat Aerial Vehicles (UCAV) that are capable of performing autonomous functions across the spectrum of military operations, ...
Holian, Gary L.; Sokolov, Andrei P.; Prinn, Ronald G.
Key uncertainties in the global carbon cycle are explored with a 2-D model for the oceanic carbon sink. By calibrating the key parameters of this ocean carbon sink model to widely referenced values, it produces an average ...
Supporting qualified database for uncertainty evaluation
Petruzzi, A.; Fiori, F.; Kovtonyuk, A.; D'Auria, F. [Nuclear Research Group of San Piero A Grado, Univ. of Pisa, Via Livornese 1291, 56122 Pisa (Italy)
2012-07-01
Uncertainty evaluation constitutes a key feature of the BEPU (Best Estimate Plus Uncertainty) process. The uncertainty can be the result of a Monte Carlo type analysis involving input uncertainty parameters, or the outcome of a process involving the use of experimental data and connected code calculations. These uncertainty methods are discussed in several papers and guidelines (IAEA-SRS-52, OECD/NEA BEMUSE reports). The present paper aims at discussing the role and the depth of the analysis required for merging, on one side, suitable experimental data and, on the other side, qualified code calculation results. This aspect is mostly connected with the second approach for uncertainty mentioned above, but it can also be used in the framework of the first approach. Namely, the paper discusses the features and structure of a database that includes the following kinds of documents: 1. The 'RDS-facility' (Reference Data Set for the selected facility): this includes the description of the facility, the geometrical characterization of any component of the facility, the instrumentation, the data acquisition system, the evaluation of pressure losses, the physical properties of the materials, and the characterization of pumps, valves and heat losses; 2. The 'RDS-test' (Reference Data Set for the selected test of the facility): this includes the description of the main phenomena investigated during the test, the configuration of the facility for the selected test (with a new evaluation of pressure and heat losses if needed) and the specific boundary and initial conditions; 3. The 'QR' (Qualification Report) of the code calculation results: this includes the description of the nodalization developed following a set of homogeneous techniques, the achievement of the steady-state conditions, and the qualitative and quantitative analysis of the transient with the characterization of the Relevant Thermal-Hydraulic Aspects (RTA); 4. The 'EH' (Engineering Handbook) of the input nodalization: this includes the rationale adopted for each part of the nodalization, the user choices, and the systematic derivation and justification of any value present in the code input with respect to the values indicated in the RDS-facility and in the RDS-test. (authors)
Representation of analysis results involving aleatory and epistemic uncertainty.
Johnson, Jay Dean; Helton, Jon Craig; Oberkampf, William Louis; Sallaberry, Cedric J.
2008-08-01
Procedures are described for the representation of results in analyses that involve both aleatory uncertainty and epistemic uncertainty, with aleatory uncertainty deriving from an inherent randomness in the behavior of the system under study and epistemic uncertainty deriving from a lack of knowledge about the appropriate values to use for quantities that are assumed to have fixed but poorly known values in the context of a specific study. Aleatory uncertainty is usually represented with probability and leads to cumulative distribution functions (CDFs) or complementary cumulative distribution functions (CCDFs) for analysis results of interest. Several mathematical structures are available for the representation of epistemic uncertainty, including interval analysis, possibility theory, evidence theory and probability theory. In the presence of epistemic uncertainty, there is not a single CDF or CCDF for a given analysis result. Rather, there is a family of CDFs and a corresponding family of CCDFs that derive from epistemic uncertainty and have an uncertainty structure that derives from the particular uncertainty structure (i.e., interval analysis, possibility theory, evidence theory, probability theory) used to represent epistemic uncertainty. Graphical formats for the representation of epistemic uncertainty in families of CDFs and CCDFs are investigated and presented for the indicated characterizations of epistemic uncertainty.
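The two-level sampling structure described above, an outer loop over epistemic parameters and an inner loop over aleatory variables, yields exactly the "family of CDFs" for an analysis result. A minimal sketch; the toy model, distributions, and sample sizes are illustrative assumptions, not taken from the report:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy model: the analysis result depends on an aleatory
# variable a (inherent randomness) and an epistemic parameter theta
# (fixed but poorly known). Names and distributions are illustrative.
def model(a, theta):
    return theta * a + 1.0

n_epistemic = 20      # outer loop: samples of the poorly known parameter
n_aleatory = 1000     # inner loop: samples of the random variable

family_of_cdfs = []
for theta in rng.uniform(0.5, 1.5, n_epistemic):   # epistemic uncertainty
    results = model(rng.normal(0.0, 1.0, n_aleatory), theta)  # aleatory
    xs = np.sort(results)                               # empirical CDF support
    ps = np.arange(1, n_aleatory + 1) / n_aleatory      # CDF values
    family_of_cdfs.append((xs, ps))

# Each epistemic sample yields one CDF; plotting all of them displays the
# family of CDFs. The corresponding CCDF is simply 1 - CDF.
print(len(family_of_cdfs))
```

Here epistemic uncertainty is represented probabilistically (a uniform prior on theta); interval analysis, possibility theory, or evidence theory would replace the outer sampling loop with their own structure over theta.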
On solar geoengineering and climate uncertainty
MacMartin, Douglas; Kravitz, Benjamin S.; Rasch, Philip J.
2015-09-03
Uncertainty in the climate system response has been raised as a concern regarding solar geoengineering. Here we show that model projections of regional climate change outcomes may have greater agreement under solar geoengineering than with CO2 alone. We explore the effects of geoengineering on one source of climate system uncertainty by evaluating the inter-model spread across 12 climate models participating in the Geoengineering Model Intercomparison project (GeoMIP). The model spread in regional temperature and precipitation changes is reduced with CO2 and a solar reduction, in comparison to the case with increased CO2 alone. That is, the intermodel spread in predictions of climate change and the model spread in the response to solar geoengineering are not additive but rather partially cancel. Furthermore, differences in efficacy explain most of the differences between models in their temperature response to an increase in CO2 that is offset by a solar reduction. These conclusions are important for clarifying geoengineering risks.
Uncertainty in BWR power during ATWS events
Diamond, D.J.
1986-01-01
A study was undertaken to improve our understanding of BWR conditions following the closure of main steam isolation valves and the failure of reactor trip. Of particular interest was the power during the period when the core had reached a quasi-equilibrium condition with a natural circulation flow rate determined by the water level in the downcomer. The uncertainty in the calculation of this power with sophisticated computer codes was quantified using a simple model which relates power to the principal thermal-hydraulic variables and reactivity coefficients, the latter representing the link between the thermal-hydraulics and the neutronics. Assumptions regarding the uncertainty in these variables and coefficients were then used to determine the uncertainty in power.
Adaptive control of hypersonic vehicles in presence of actuation uncertainties
Somanath, Amith
2010-01-01
The thesis develops a new class of adaptive controllers that guarantee global stability in the presence of actuation uncertainties. Actuation uncertainties lead to linear plants with a partially known input matrix B. ...
Depression during exacerbations in multiple sclerosis: the importance of uncertainty
Kroencke, Dawn C.; Denney, Douglas R.; Lynch, Sharon G.
2001-08-01
...(a) whether or not they were currently experiencing an exacerbation of their symptoms; (b) their level of uncertainty concerning their illness; and (c) their strategies for coping with their illness. A current exacerbation in symptoms, greater uncertainty of illness...
A Framework for Modeling Uncertainty in Regional Climate Change
Monier, Erwan
In this study, we present a new modeling framework and a large ensemble of climate projections to investigate the uncertainty in regional climate change over the US associated with four dimensions of uncertainty. The sources ...
Uncertainty analysis of climate change and policy response
Webster, Mort David.; Forest, Chris Eliot.; Reilly, John M.; Babiker, Mustafa H.M.; Kicklighter, David W.; Mayer, Monika.; Prinn, Ronald G.; Sarofim, Marcus C.; Sokolov, Andrei P.; Stone, Peter H.; Wang, Chien.
To aid climate policy decisions, accurate quantitative descriptions of the uncertainty in climate outcomes under various possible policies are needed. Here, we apply an earth systems model to describe the uncertainty in ...
Uncertainties in Energy Consumption Introduced by Building Operations and Weather for a Medium-Size Office Building
Wang, Liping; Mathew, Paul; Pang, Xiufeng (Environmental Energy Technologies Division)
Differences between predicted and actual building energy consumption can be attributed to uncertainties introduced by building operations and weather...
Information-Disturbance theorem and Uncertainty Relation
Takayuki Miyadera; Hideki Imai
2007-07-31
It has been shown that the Information-Disturbance theorem can play an important role in security proofs of quantum cryptography. The theorem is interesting in itself, since it can be regarded as an information-theoretic version of the uncertainty principle; until now, however, it has been able to treat only restricted situations. In this paper, the restriction on the source is abandoned, and a general information-disturbance theorem is obtained. The theorem relates the information gain by Eve to the information gain by Bob.
Uncertainty estimates for derivatives and intercepts
Clark, E.L.
1990-01-01
Straight line least squares fits of experimental data are widely used in the analysis of test results to provide derivatives and intercepts. A method for evaluating the uncertainty in these parameters is described. The method utilizes conventional least squares results and is applicable to experiments where the independent variable is controlled, but not necessarily free of error. A Monte Carlo verification of the method is given. 7 refs., 2 tabs.
Uncertainty estimates for derivatives and intercepts
Clark, E.L.
1994-09-01
Straight line least squares fits of experimental data are widely used in the analysis of test results to provide derivatives and intercepts. A method for evaluating the uncertainty in these parameters is described. The method utilizes conventional least squares results and is applicable to experiments where the independent variable is controlled, but not necessarily free of error. A Monte Carlo verification of the method is given.
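The approach described above, conventional least-squares standard errors for slope and intercept checked by Monte Carlo, can be sketched in a few lines. The synthetic data, coefficient values, and sample counts below are our own illustrative assumptions, not values from the report:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "experiment": y = b0 + b1*x + noise (values are illustrative).
x = np.linspace(0.0, 10.0, 20)
b0_true, b1_true, sigma = 2.0, 0.5, 0.3

def fit_with_uncertainty(x, y):
    """Ordinary least-squares intercept/slope with standard errors."""
    n = len(x)
    X = np.column_stack([np.ones(n), x])            # design matrix
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)    # (intercept, slope)
    resid = y - X @ beta
    s2 = resid @ resid / (n - 2)                    # residual variance
    cov = s2 * np.linalg.inv(X.T @ X)               # parameter covariance
    return beta, np.sqrt(np.diag(cov))              # estimates and their SEs

beta, se = fit_with_uncertainty(
    x, b0_true + b1_true * x + rng.normal(0, sigma, x.size))

# Monte Carlo verification: the scatter of fitted parameters over many
# synthetic data sets should match the analytic standard errors.
fits = np.array([
    fit_with_uncertainty(
        x, b0_true + b1_true * x + rng.normal(0, sigma, x.size))[0]
    for _ in range(2000)
])
print(beta, se)           # one fit and its standard errors
print(fits.std(axis=0))   # Monte Carlo spread, comparable to se
```

The empirical standard deviations of the 2000 fitted (intercept, slope) pairs should agree with the analytic standard errors to within sampling noise, which is the spirit of the Monte Carlo check the abstract mentions.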
Dynamic diagnostic and decision procedures under uncertainty
Baranov, V.V.
1995-01-01
In this paper, we consider uncertainty that arises when the true state x ∈ E is not accessible to direct observation and remains unknown. Instead, we observe some features θ ∈ Θ that carry a certain information about the true state. This information is described by the conditional distribution P(Θ|E), which we call the linkage distribution. Regarding this distribution we assume that it exists but is unknown. This leads to uncertainty with respect to states from E and the linkage distribution P(Θ|E), which we denote by NEP. The substantive problem can be stated as follows: from observations of the features θ ∈ Θ made at each time instant n = 1, 2, ..., recognize the state x ∈ E, identify the linkage distribution P, and use the results of recognition and identification to choose a decision y ∈ Y so that the decision process is optimal in some sense. State recognition is the subject of diagnostics. The uncertainty NEP thus generates a problem of diagnostics and dynamic decision making.
Uncertainty and Complementarity Relations in Weak Measurement
Arun Kumar Pati; Junde Wu
2014-11-26
We prove uncertainty relations that quantitatively express the impossibility of jointly sharp preparation of pre- and post-selected quantum states for measuring incompatible observables during the weak measurement. By defining a suitable operator whose average in the pre-selected quantum state gives the weak value, we show that one can have new uncertainty relations for variances of two such operators corresponding to two non-commuting observables. These generalize the recent stronger uncertainty relations that give non-trivial lower bounds for the sum of variances of two observables which fully capture the concept of incompatible observables. Furthermore, we show that weak values for two non-commuting projection operators obey a complementarity relation. Specifically, we show that for a pre-selected state if we measure a projector corresponding to an observable $A$ weakly followed by the strong measurement of another observable $B$ (for the post-selection) and, for the same pre-selected state we measure a projector corresponding to an observable $B$ weakly followed by the strong measurement of the observable $A$ (for the post-selection), then the product of these two weak values is always less than one. This shows that even though individually they are complex and can be large, their product is always bounded.
Bayesian-network Confirmation of Software Testing Uncertainties
Ziv, Hadar; Richardson, Debra J.
We discuss the representation of uncertainty in software testing and propose that a specific technique, known as Bayesian Belief Networks, be used to model software testing uncertainties. We demonstrate the use of Bayesian...
Estimating uncertainty of streamflow simulation using Bayesian neural networks
Liang, Faming
Bayesian neural networks (BNNs) are powerful tools for providing reliable hydrologic prediction and quantifying prediction uncertainty. BNN schemes accounting for the uncertainties related to parameters (the neural network's weights) and model structures were applied for uncertainty...
Aperture Photometry Uncertainties assuming Priors and Correlated Noise
Masci, Frank
F. Masci, version 2.0. Uncertainties are derived for aperture photometry assuming (i) prior pixel-flux uncertainties are available for the image (e.g., computed ...). One way to validate the estimates is to compare the uncertainties with the local RMS pixel...
UNCERTAINTY IN PALEOECOLOGICAL STUDIES OF MERCURY IN SEDIMENT CORES
Gottgens, Hans
Johan F. Gottgens, Brian E. ... Results may differ by as much as ±48%. Uncertainty in paleoecological studies of mercury needs to be documented. Keywords: paleoecology, sediment cores, uncertainty. Paleoecological studies of lakes, wetlands...
Uncertainty, Subjectivity, Trust and Risk: How It All Fits Together
Stølen, Ketil
Bjørnar Solhaug and Ketil Stølen. Keywords: uncertainty, subjective, objective, trust, risk, trust management. The paper distinguishes aleatory from epistemic uncertainty (epistemic uncertainty can be reduced by narrowing the interval, thereby making a more precise prediction) and objective from subjective...
Brown, J.; Goossens, L.H.J.; Kraan, B.C.P.
1997-06-01
This volume is the first of a two-volume document that summarizes a joint project conducted by the US Nuclear Regulatory Commission and the European Commission to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This document reports on an ongoing project to assess uncertainty in the MACCS and COSYMA calculations for the offsite consequences of radionuclide releases by hypothetical nuclear power plant accidents. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain variables that affect calculations of offsite consequences. The expert judgment elicitation procedure and its outcomes are described in these volumes. Other panels were formed to consider uncertainty in other aspects of the codes. Their results are described in companion reports. Volume 1 contains background information and a complete description of the joint consequence uncertainty study. Volume 2 contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures for both panels, (3) the rationales and results for the panels on soil and plant transfer and animal transfer, (4) short biographies of the experts, and (5) the aggregated results of their responses.
Methodology for characterizing modeling and discretization uncertainties in computational simulation
ALVIN,KENNETH F.; OBERKAMPF,WILLIAM L.; RUTHERFORD,BRIAN M.; DIEGERT,KATHLEEN V.
2000-03-01
This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.
Measurement uncertainty in surface flatness measurement
H. L. Thang
2011-11-29
The flatness of a plate is a parameter that has long been under consideration. Factors influencing the accuracy of this parameter have been recognized and examined carefully, but the results are scattered across the literature. Moreover, those reports have not always been in harmony with the Guide to the Expression of Uncertainty in Measurement (GUM), and mathematical equations clearly describing the flatness measurement have not appeared in them. We have collected the influencing factors for systematic reference, re-written the equation describing the profile measurement of the plate topography, and proposed an equation for flatness determination. An illustrative numerical example is also shown.
Gravitational tests of the Generalized Uncertainty Principle
Fabio Scardigli; Roberto Casadio
2014-07-01
We compute the corrections to the Schwarzschild metric necessary to reproduce the Hawking temperature derived from a Generalized Uncertainty Principle (GUP), so that the GUP deformation parameter is directly linked to the deformation of the metric. Using this modified Schwarzschild metric, we compute corrections to the standard General Relativistic predictions for the light deflection and perihelion precession, both for planets in the solar system and for binary pulsars. This analysis allows us to set bounds for the GUP deformation parameter from well-known astronomical measurements.
Characterizing Uncertainties in Ice Particle Size Distributions
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Lidar arc scan uncertainty reduction through scanning geometry optimization
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Wang, H.; Barthelmie, R. J.; Pryor, S. C.; Brown, G.
2015-10-07
Doppler lidars are frequently operated in a mode referred to as arc scans, wherein the lidar beam scans across a sector with a fixed elevation angle and the resulting measurements are used to derive an estimate of the n minute horizontal mean wind velocity (speed and direction). Previous studies have shown that the uncertainty in the measured wind speed originates from turbulent wind fluctuations and depends on the scan geometry (the arc span and the arc orientation). This paper is designed to provide guidance on optimal scan geometries for two key applications in the wind energy industry: wind turbine power performance analysis and annual energy production. We present a quantitative analysis of the retrieved wind speed uncertainty derived using a theoretical model with the assumption of isotropic and frozen turbulence, and observations from three sites that are onshore with flat terrain, onshore with complex terrain and offshore, respectively. The results from both the theoretical model and observations show that the uncertainty is scaled with the turbulence intensity such that the relative standard error on the 10 min mean wind speed is about 30 % of the turbulence intensity. The uncertainty in both retrieved wind speeds and derived wind energy production estimates can be reduced by aligning lidar beams with the dominant wind direction, increasing the arc span and lowering the number of beams per arc scan. Large arc spans should be used at sites with high turbulence intensity and/or large wind direction variation when arc scans are used for wind resource assessment.
Uncertainty quantification for large-scale ocean circulation predictions.
Safta, Cosmin; Debusschere, Bert J.; Najm, Habib N.; Sargsyan, Khachik
2010-09-01
Uncertainty quantification in climate models is challenged by the sparsity of the available climate data due to the high computational cost of the model runs. Another feature that prevents classical uncertainty analyses from being easily applicable is the bifurcative behavior in the climate data with respect to certain parameters. A typical example is the Meridional Overturning Circulation in the Atlantic Ocean. The maximum overturning stream function exhibits discontinuity across a curve in the space of two uncertain parameters, namely climate sensitivity and CO{sub 2} forcing. We develop a methodology that performs uncertainty quantification in the presence of limited data that have discontinuous character. Our approach is two-fold. First we detect the discontinuity location with a Bayesian inference, thus obtaining a probabilistic representation of the discontinuity curve location in presence of arbitrarily distributed input parameter values. Second, we develop a spectral approach that relies on Polynomial Chaos (PC) expansions on each side of the discontinuity curve, leading to an averaged-PC representation of the forward model that allows efficient uncertainty quantification and propagation. The methodology is tested on synthetic examples of discontinuous data with adjustable sharpness and structure.
Construction and Availability Uncertainty in the Regional Model: technology availability, construction costs, economic retirement, and variable capacity for existing units; scenarios may include policies for particular resources or other actions affecting the construction of power plants.
Modeling Technology Innovation: Uncertainties in Costs and Preferred Technologies
E.S. Rubin, Carnegie Mellon. Focus on innovation in technology for achieving climate change policy goals at the lowest cost, in conjunction with other mitigation...
Monte Carlo Methods for Uncertainty Quantification Mathematical Institute, University of Oxford
Giles, Mike
Examples of PDEs with uncertainty include repository and oil reservoir modelling (considerable uncertainty about the porosity of rock) and long-term climate modelling (many sources of uncertainty); uncertainty enters through modelling parameters, geometry, and initial conditions.
Survey and Evaluate Uncertainty Quantification Methodologies
Lin, Guang; Engel, David W.; Eslinger, Paul W.
2012-02-01
The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that will develop and deploy state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models with uncertainty quantification, optimization, risk analysis and decision making capabilities. The CCSI Toolset will incorporate commercial and open-source software currently in use by industry and will also develop new software tools as necessary to fill technology gaps identified during execution of the project. The CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. The goal of CCSI is to deliver a toolset that can simulate the scale-up of a broad set of new carbon capture technologies from laboratory scale to full commercial scale. To provide a framework around which the toolset can be developed and demonstrated, we will focus on three Industrial Challenge Problems (ICPs) related to carbon capture technologies relevant to U.S. pulverized coal (PC) power plants. Post combustion capture by solid sorbents is the technology focus of the initial ICP (referred to as ICP A). 
The goal of the uncertainty quantification (UQ) task (Task 6) is to provide a set of capabilities to the user community for the quantification of uncertainties associated with the carbon capture processes. As such, we will develop, as needed and beyond existing capabilities, a suite of robust and efficient computational tools for UQ to be integrated into a CCSI UQ software framework.
Quantum Limits of Measurements and Uncertainty Principle
Masanao Ozawa
2015-05-19
In this paper, we show how the Robertson uncertainty relation gives certain intrinsic quantum limits of measurements in the most general and rigorous mathematical treatment. A general lower bound for the product of the root-mean-square measurement errors arising in joint measurements of noncommuting observables is established. We give a rigorous condition for holding of the standard quantum limit (SQL) for repeated measurements, and prove that if a measuring instrument has no larger root-mean-square preparational error than the root-mean-square measurement errors then it obeys the SQL. As shown previously, we can even construct many linear models of position measurement which circumvent this condition for the SQL.
Radiotherapy Dose Fractionation under Parameter Uncertainty
Davison, Matt; Kim, Daero; Keller, Harald
2011-11-30
In radiotherapy, radiation is directed to damage a tumor while avoiding surrounding healthy tissue. Tradeoffs ensue because dose cannot be exactly shaped to the tumor. It is particularly important to ensure that sensitive biological structures near the tumor are not damaged more than a certain amount. Biological tissue is known to have a nonlinear response to incident radiation. The linear quadratic dose response model, which requires the specification of two clinically and experimentally observed response coefficients, is commonly used to model this effect. This model yields an optimization problem giving two different types of optimal dose sequences (fractionation schedules). Which fractionation schedule is preferred depends on the response coefficients. These coefficients are uncertainly known and may differ from patient to patient. Because of this not only the expected outcomes but also the uncertainty around these outcomes are important, and it might not be prudent to select the strategy with the best expected outcome.
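The linear quadratic model mentioned above is commonly summarized through the biologically effective dose, BED = n·d·(1 + d/(α/β)), where n is the number of fractions, d the dose per fraction, and α/β the ratio of the two response coefficients. A minimal sketch of why fractionation schedules with the same total physical dose differ biologically; the coefficient values and schedules below are purely illustrative, not clinical recommendations:

```python
# Linear-quadratic (LQ) model sketch. The alpha/beta ratio is the
# uncertain, patient-dependent quantity; 10.0 Gy is illustrative only.
def biologically_effective_dose(n_fractions, dose_per_fraction, alpha_beta):
    """BED = n*d * (1 + d / (alpha/beta)) under the LQ model."""
    d = dose_per_fraction
    return n_fractions * d * (1.0 + d / alpha_beta)

# Same total physical dose (60 Gy) delivered two ways:
few_large = biologically_effective_dose(3, 20.0, 10.0)     # hypofractionated
many_small = biologically_effective_dose(30, 2.0, 10.0)    # conventional
print(few_large, many_small)  # 180.0 72.0
```

Because BED is nonlinear in d, which schedule is optimal flips depending on the (uncertain) alpha/beta values of tumor and healthy tissue, which is exactly why the abstract argues that the uncertainty around outcomes, not just the expected outcome, matters.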
Intrinsic Uncertainties in Modeling Complex Systems.
Cooper, Curtis S; Bramson, Aaron L.; Ames, Arlo L.
2014-09-01
Models are built to understand and predict the behaviors of both natural and artificial systems. Because it is always necessary to abstract away aspects of any non-trivial system being modeled, we know models can potentially leave out important, even critical elements. This reality of the modeling enterprise forces us to consider the prospective impacts of those effects completely left out of a model, either intentionally or unconsidered. Insensitivity to new structure is an indication of diminishing returns. In this work, we represent a hypothetical unknown effect on a validated model as a finite perturbation whose amplitude is constrained within a control region. We find robustly that without further constraints, no meaningful bounds can be placed on the amplitude of a perturbation outside of the control region. Thus, forecasting into unsampled regions is a very risky proposition. We also present inherent difficulties with proper time discretization of models and with representing inherently discrete quantities. We point out potentially worrisome uncertainties, arising from mathematical formulation alone, which modelers can inadvertently introduce into models of complex systems. Acknowledgements: This work has been funded under early-career LDRD project #170979, entitled "Quantifying Confidence in Complex Systems Models Having Structural Uncertainties", which ran from 04/2013 to 09/2014. We wish to express our gratitude to the many researchers at Sandia who contributed ideas to this work, as well as feedback on the manuscript. In particular, we would like to mention George Barr, Alexander Outkin, Walt Beyeler, Eric Vugrin, and Laura Swiler for providing invaluable advice and guidance through the course of the project. We would also like to thank Steven Kleban, Amanda Gonzales, Trevor Manzanares, and Sarah Burwell for their assistance in managing project tasks and resources.
Planning Wireless Networks with Demand Uncertainty using Robust ...
2011-03-04
as OFDMA. Current methods for this task require a static model of the problem. However, uncertainty of data arises frequently in wireless networks, e.g., fluctuat-
Investment and Upgrade in Distributed Generation under Uncertainty
Siddiqui, Afzal
2008-01-01
tax on microgrid combined heat and power adoption ... distributed generation (DG) and combined heat and power (CHP) applications via heat ... uncertainty. Keywords: combined heat and power applications, ...
Estimation of uncertainty for contour method residual stress measurements
Olson, Mitchell D.; DeWald, Adrian T.; Prime, Michael B.; Hill, Michael R.
2014-12-03
This paper describes a methodology for the estimation of measurement uncertainty for the contour method, where the contour method is an experimental technique for measuring a two-dimensional map of residual stress over a plane. Random error sources including the error arising from noise in displacement measurements and the smoothing of the displacement surfaces are accounted for in the uncertainty analysis. The output is a two-dimensional, spatially varying uncertainty estimate such that every point on the cross-section where residual stress is determined has a corresponding uncertainty value. Both numerical and physical experiments are reported, which are used to support the usefulness of the proposed uncertainty estimator. The uncertainty estimator shows the contour method to have larger uncertainty near the perimeter of the measurement plane. For the experiments, which were performed on a quenched aluminum bar with a cross section of 51 × 76 mm, the estimated uncertainty was approximately 5 MPa (σ/E = 7 · 10⁻⁵) over the majority of the cross-section, with localized areas of higher uncertainty, up to 10 MPa (σ/E = 14 · 10⁻⁵).
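As an illustrative aside (not code from the paper), a spatially varying uncertainty map of the kind described above is commonly formed by combining independent error contributions in quadrature at each point of the plane; the function and array names below are our own:

```python
import numpy as np

def combined_uncertainty_map(noise_err, model_err):
    """Combine two independent, spatially varying error estimates in quadrature.

    noise_err, model_err: 2-D arrays (same shape) holding each error
    contribution to the stress uncertainty (e.g. in MPa) at every point.
    Returns the combined standard uncertainty map.
    """
    noise_err = np.asarray(noise_err, dtype=float)
    model_err = np.asarray(model_err, dtype=float)
    return np.sqrt(noise_err**2 + model_err**2)

# Toy 2 x 2 map: 3 and 4 MPa contributions combine to 5 MPa everywhere
u_map = combined_uncertainty_map([[3.0, 3.0], [3.0, 3.0]],
                                 [[4.0, 4.0], [4.0, 4.0]])
```

Larger contributions near the perimeter of the measurement plane would show up directly as larger values in the corresponding region of `u_map`.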
Model simplification of chemical kinetic systems under uncertainty
Coles, Thomas Michael Kyte
2011-01-01
This thesis investigates the impact of uncertainty on the reduction and simplification of chemical kinetics mechanisms. Chemical kinetics simulations of complex fuels are very computationally expensive, especially when ...
Decision making under epistemic uncertainty : an application to seismic design
Agarwal, Anna
2008-01-01
The problem of accounting for epistemic uncertainty in risk management decisions is conceptually straightforward, but is riddled with practical difficulties. Simple approximations are often used whereby future variations ...
Addressing Uncertainty in Design Inputs: A Case Study of Probabilisti...
Office of Environmental Management (EM)
Addressing Uncertainties in Design Inputs: A Case Study of Probabilistic Settlement Evaluations for Soft Zone Collapse at SWPF. Tom Houston, Greg Mertz, Carl Costantino, Michael...
Measurement uncertainty analysis techniques applied to PV performance measurements
Wells, C.
1992-10-01
The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of: origin, including theory; measurements; traceability to measurement standards; computations; and uncertainty analysis of results.
Uncertainty quantification in fission cross section measurements at LANSCE
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Tovesson, F.
2015-01-09
Neutron-induced fission cross sections have been measured for several isotopes of uranium and plutonium at the Los Alamos Neutron Science Center (LANSCE) over a wide range of incident neutron energies. The total uncertainties in these measurements are in the range 3–5% above 100 keV of incident neutron energy, which results from uncertainties in the target, neutron source, and detector system. The individual sources of uncertainty are assumed to be uncorrelated; however, correlations in the cross section across neutron energy bins are considered. The quantification of the uncertainty contributions is described here.
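A common way to encode such cross-bin correlations is to split each bin's uncertainty into a statistical part (uncorrelated bin to bin) and a shared normalisation part (fully correlated across bins). The decomposition and all numbers below are our illustrative assumptions, not the LANSCE analysis itself:

```python
import numpy as np

def cross_section_covariance(stat_u, norm_u):
    """Covariance of a measured cross section across energy bins.

    stat_u: per-bin statistical uncertainties, uncorrelated bin to bin
    norm_u: per-bin uncertainty from a common normalisation (e.g. target
            mass), taken as fully correlated across all bins
    Returns the full covariance matrix: diagonal statistical variances
    plus a rank-one block from the shared normalisation.
    """
    stat_u = np.asarray(stat_u, dtype=float)
    norm_u = np.asarray(norm_u, dtype=float)
    return np.diag(stat_u ** 2) + np.outer(norm_u, norm_u)

# Two bins: 3% and 4% statistical, 2% shared normalisation (invented values)
cov = cross_section_covariance([0.03, 0.04], [0.02, 0.02])
```

The off-diagonal entries of `cov` carry exactly the bin-to-bin correlation that a total-uncertainty quote alone cannot convey.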
Modeling Correlations In Prompt Neutron Fission Spectra Uncertainties...
Office of Scientific and Technical Information (OSTI)
Optimization Online - The impact of wind uncertainty on the strategic ...
Pedro Crespo Del Granado
2015-01-14
Jan 14, 2015 ... Abstract: The intermittent nature of wind energy generation has introduced a new degree of uncertainty to the tactical planning of energy ...
Improvements of Nuclear Data and Its Uncertainties by Theoretical...
Office of Scientific and Technical Information (OSTI)
Improvements of Nuclear Data and Its Uncertainties by Theoretical Modeling. Talou, Patrick (Los Alamos National Laboratory); Nazarewicz, Witold (University of Tennessee, Knoxville), ...
The importance of covariance in nuclear data uncertainty propagation studies
Benstead, J. [AWE Plc, Aldermaston, Berkshire (United Kingdom)
2012-07-01
A study has been undertaken to investigate what proportion of the uncertainty propagated through plutonium critical assembly calculations is due to the covariances between the fission cross section in different neutron energy groups. The calculated uncertainties on k_eff show that the presence of covariances between the cross section in different neutron energy groups accounts for approximately 27-37% of the propagated uncertainty due to the plutonium fission cross section. This study also confirmed the validity of employing the sandwich equation, with associated sensitivity and covariance data, instead of a Monte Carlo sampling approach, for calculating uncertainties in linearly varying systems. (authors)
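The sandwich equation itself is standard: u² = sᵀ C s, where s is the vector of sensitivities of the response (here k_eff) to the group cross sections and C is their covariance matrix. A minimal sketch, with all numerical values invented for illustration:

```python
import numpy as np

def sandwich_uncertainty(sensitivities, covariance):
    """Standard uncertainty of a response via the sandwich equation u^2 = s^T C s."""
    s = np.asarray(sensitivities, dtype=float)
    C = np.asarray(covariance, dtype=float)
    return float(np.sqrt(s @ C @ s))

# Two energy groups with positive cross-group covariance (values invented)
s = [0.1, 0.2]                    # sensitivity of k_eff to each group's cross section
C = [[0.04, 0.02],
     [0.02, 0.04]]                # covariance of the group cross sections
u_full = sandwich_uncertainty(s, C)
u_diag = sandwich_uncertainty(s, np.diag(np.diag(C)))  # same calculation, covariances zeroed
```

With these made-up numbers the off-diagonal terms contribute (u_full² - u_diag²) / u_full² ≈ 29% of the variance, the same kind of covariance share the study quantifies for the plutonium fission cross section.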
Improvements to Nuclear Data and Its Uncertainties by Theoretical...
Office of Scientific and Technical Information (OSTI)
Technical Report: Improvements to Nuclear Data and Its Uncertainties by Theoretical Modeling
Characterizing Uncertainty for Regional Climate Change Mitigation and Adaptation Decisions
Unwin, Stephen D.; Moss, Richard H.; Rice, Jennie S.; Scott, Michael J.
2011-09-30
This white paper describes the results of new research to develop an uncertainty characterization process to help address the challenges of regional climate change mitigation and adaptation decisions.
Helton, Jon Craig; Sallaberry, Cedric M.; Hansen, Clifford W.
2010-05-01
Extensive work has been carried out by the U.S. Department of Energy (DOE) in the development of a proposed geologic repository at Yucca Mountain (YM), Nevada, for the disposal of high-level radioactive waste. As part of this development, an extensive performance assessment (PA) for the YM repository was completed in 2008 [1] and supported a license application by the DOE to the U.S. Nuclear Regulatory Commission (NRC) for the construction of the YM repository [2]. This presentation provides an overview of the conceptual and computational structure of the indicated PA (hereafter referred to as the 2008 YM PA) and the roles that uncertainty analysis and sensitivity analysis play in this structure.
Haskin, F.E. [Univ. of New Mexico, Albuquerque, NM (United States); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)
1997-12-01
The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on early health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.
Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Boardman, J. [AEA Technology (United Kingdom); Jones, J.A. [National Radiological Protection Board (United Kingdom); Harper, F.T.; Young, M.L. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)
1997-12-01
The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.
Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Harrison, J.D.; Harper, F.T.; Hora, S.C.
1998-04-01
The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.
Size exclusion deep bed filtration: Experimental and modelling uncertainties
Badalyan, Alexander; You, Zhenjiang; Aji, Kaiser; Bedrikovetsky, Pavel; Carageorgos, Themis; Zeinijahromi, Abbas
2014-01-15
A detailed uncertainty analysis associated with carboxyl-modified latex particle capture in glass bead-formed porous media enabled verification of the two theoretical stochastic models for prediction of particle retention due to size exclusion. At the beginning of this analysis it is established that size exclusion is the dominant particle capture mechanism in the present study: the calculated significant repulsive Derjaguin-Landau-Verwey-Overbeek potential between latex particles and glass beads indicates their mutual repulsion, thus fulfilling the necessary condition for size exclusion. Applying the linear uncertainty propagation method in the form of a truncated Taylor series expansion, combined standard uncertainties (CSUs) in normalised suspended particle concentrations are calculated from CSUs in experimentally determined parameters such as the inlet volumetric flowrate of suspension, particle number in suspensions, particle concentrations in inlet and outlet streams, and particle and pore throat size distributions. Weathering of glass beads in high alkaline solutions does not appreciably change the particle size distribution and is therefore not considered an additional contributor to the weighted mean particle radius and the corresponding weighted mean standard deviation. The weighted mean particle radius and the log-normal mean pore throat radius are characterised by the highest CSUs among all experimental parameters, translating to a high CSU in the jamming ratio factor (dimensionless particle size). Normalised suspended particle concentrations calculated via the two theoretical models are characterised by higher CSUs than those for the experimental data. The model accounting for the fraction of inaccessible flow as a function of latex particle radius excellently predicts normalised suspended particle concentrations for the whole range of jamming ratios. The presented uncertainty analysis can also be used for comparison of intra- and inter-laboratory particle size exclusion data.
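The linear (truncated Taylor series) propagation method mentioned above has a compact generic form: the combined standard uncertainty is the root sum of squares of sensitivity coefficients times input uncertainties. This sketch, with numerically estimated sensitivities and invented example numbers, is ours rather than the authors' code:

```python
def combined_standard_uncertainty(f, x, u, h=1e-6):
    """First-order propagation of uncorrelated standard uncertainties.

    f: function taking a list of input values
    x: nominal input values; u: their standard uncertainties
    Sensitivity coefficients df/dx_i are estimated by central differences.
    """
    total = 0.0
    for i in range(len(x)):
        step = h * max(abs(x[i]), 1.0)
        xp, xm = list(x), list(x)
        xp[i] += step
        xm[i] -= step
        dfdx = (f(xp) - f(xm)) / (2 * step)  # sensitivity coefficient
        total += (dfdx * u[i]) ** 2
    return total ** 0.5

# Normalised concentration c = outlet / inlet, with invented uncertainties:
# outlet 50 +/- 1, inlet 100 +/- 2 (arbitrary units)
u_c = combined_standard_uncertainty(lambda v: v[0] / v[1], [50.0, 100.0], [1.0, 2.0])
```

For this ratio the analytic result is c · sqrt((1/50)² + (2/100)²) = 0.5 · 0.0283 ≈ 0.0141, which the numerical propagation reproduces.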
Copyright Uncertainty in the Geoscience Community: Part I, What's Free for the Taking?
Clement, Gail
2012-01-01
Pre-print submitted for publication: Clement, Gail, 2012. "Copyright Uncertainty in the Geoscience Community: Part I, What's Free for the Taking?" Proceedings - Geoscience Information Society 42, forthcoming... and Scholarly Communication, Texas A&M University Libraries, Mailstop 5000, College Station, TX 77843, gclement@tamu.edu. Introduction: Why does copyright uncertainty exist in the geoscience information community? Geoscience information is highly...
Enclosure 2 DOE's Position on Dose Rate "Measurement Uncertainty"
... system. Therefore, some knowledge of the uncertainty is important in determining the overall uncertainty, and instruments should be used within their intended measurement range. This practice is sufficient to satisfy the nuclear safety requirements associated with the management of the containers of waste prior to disposal. The 200 mrem ...
Uncertainty and Predictability in Geophysics: Chaos and Multifractal Insights
Lovejoy, Shaun
Daniel Schertzer ... McGill University, Montreal, Canada. Uncertainty and error growth are crosscutting geophysical extremes. The focus is now on time-space geophysical scaling behavior: their multifractality. It is found ...
Holographic position uncertainty and the quantum-classical transition
C. L. Herzenberg
2010-04-16
Arguments based on general principles of quantum mechanics have suggested that a minimum length associated with Planck-scale unification may in the context of the holographic principle entail a new kind of observable uncertainty in the transverse position of macroscopically separated objects. Here, we address potential implications of such a position uncertainty for establishing an additional threshold between quantum and classical behavior.
Managing Wind Power Forecast Uncertainty in Electric Grids
Brandon Keith Mauch; Paulina Jaramillo; Paul Fischbeck. 2012. ... Electricity generated from wind power is both variable and uncertain. Wind forecasts provide valuable information ...
Quantifying Uncertainties Associated with Reservoir Simulation ...
Mohaghegh, Shahab
Shahab D. Mohaghegh, WVU, ISI. Quantifying Uncertainties Associated with Reservoir Simulation ... Solutions, Inc. SPE 102492. Outline: Reservoir Simulation ... Uncertainty, Using Surrogate Reservoir Model ... Sources ...
Uncertainties and assessments of chemistry-climate models of the
Wirosoetisno, Djoko
Uncertainties and assessments of chemistry-climate models of the stratosphere. Atmospheric Chemistry and Physics, 3(1), pp. 1-27, 2003. www.atmos-chem-phys.org/acp/3/1/
Uncertainties in the Anti-neutrino Production at Nuclear Reactors
Z. Djurcic; J. A. Detwiler; A. Piepke; V. R. Foster Jr.; L. Miller; G. Gratta
2008-08-06
Anti-neutrino emission rates from nuclear reactors are determined from thermal power measurements and fission rate calculations. The uncertainties in these quantities for commercial power plants and their impact on the calculated interaction rates in electron anti-neutrino detectors are examined. We discuss reactor-to-reactor correlations between the leading uncertainties and their relevance to reactor anti-neutrino experiments.
Representing Thermal Vibrations and Uncertainty in Molecular Surfaces
Varshney, Amitabh
Chang Ha Lee and Amitabh Varshney. ... An atom's position in a molecule is fuzzy because of uncertainty in protein structure determination and because of its thermal energy. Therefore, the smooth molecular surface will also vibrate. ...
Energy Management Problems Under Uncertainties for Grid-Connected Microgrids
Zhang, Wei
Energy Management Problems Under Uncertainties for Grid-Connected Microgrids: a Chance-Constrained ... energy management problems under uncertainties for a grid-connected microgrid. The problems are motivated by practical microgrid applications such as peak power shaving and frequency regulation. The problems require constraints ...
Appendix P: Risk and Uncertainty (September 2006)
This appendix deals with the representation of risk and uncertainty. In the section on "Uncertainties," beginning on page P-18, it describes in detail how the regional portfolio ... One of these files is a compressed file, L24X-DW02-P.zip, containing the workbook files that Appendix L uses.
An entropy measure of uncertainty in vote choice
Gill, Jeff
An entropy measure of uncertainty in vote choice. Jeff Gill, University of California, Davis, Political Science, One Shields Avenue, Davis, CA 95616, USA. Abstract: We examine voters' uncertainty ... in the bill's language, and the bill is disingenuously entitled the "California Civil Rights Act" ...
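An entropy measure of this kind is typically a (normalised) Shannon entropy over a voter's probabilities across the available choices; the function below is our sketch of that general idea, not code or notation from the paper:

```python
import math

def vote_choice_entropy(probs, normalise=True):
    """Shannon entropy (bits) of a voter's choice probabilities.

    probs: probabilities over the available choices, summing to 1.
    With normalise=True the entropy is divided by log2(k) for k choices,
    giving 0 for complete certainty up to 1 for a maximally uncertain voter.
    """
    h = -sum(p * math.log2(p) for p in probs if p > 0)
    if normalise and len(probs) > 1:
        h /= math.log2(len(probs))
    return h
```

A voter certain of their choice (probabilities like [1, 0]) scores 0, while an even toss-up ([0.5, 0.5]) scores the maximum of 1.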
Utility Maximization under Model Uncertainty in Discrete Time
Nutz, Marcel
Utility Maximization under Model Uncertainty in Discrete Time Marcel Nutz January 14, 2014 Abstract We give a general formulation of the utility maximization problem under nondominated model uncertainty in discrete time and show that an optimal portfolio exists for any utility function
Chronological information and uncertainty Radiocarbon dating & calibration -Paula Reimer
Sengun, Mehmet Haluk
Radiocarbon dating and calibration uncertainty: estimated ±1-2 (±8-32 years); measured off-line ±0.2 (±2), ok in gas counting systems ... (lab error multiplier) ... 14C calibration uncertainties (animation: M. Blaauw) ... constant initial 14C ... Too much weight on individual dates? Which age-depth model? 800 cm core, 9 14C dates
Offshore Oilfield Development Planning under Uncertainty and Fiscal Considerations
Grossmann, Ignacio E.
Vijay Gupta ... uncertainty and complex fiscal rules in the development planning of offshore oil and gas fields ... Keywords: Offshore Oil and Gas, Multistage Stochastic, Endogenous, Production Sharing Agreements (PSAs)
Closure for Production Planning under Power Uncertainty Project
Grossmann, Ignacio E.
Closure for Production Planning under Power Uncertainty Project. Lehigh University: Pietro Belotti, Cagri Latifoglu, Fay Li, Larry Snyder; Air Products and Chemicals, Inc.: Jim Hutton, Peter Connard. September ... the power is recovered; production occurs at reduced rate ... the new uncertainty set requires more binary ...
Filament Estimation and Uncertainty Measures Yen-Chi Chen
Yen-Chi Chen, Larry Wasserman, Christopher Genovese, Carnegie Mellon University. April 21, 2014. Outline: Introduction to Filaments; Filament Estimation; Uncertainty Measures; Future Work.
Climate Change Uncertainty and Skepticism: A Cross-Country Analysis
Hall, Sharon J.
Skepticism about climate change ... for other countries. Skepticism and uncertainty are related but different aspects of climate change perceptions. In the literature, skepticism often relates to whether people believe climate change is happening ...
A REACTIVE APPROACH FOR MINING PROJECT EVALUATION UNDER PRICE UNCERTAINTY
Duffy, Ken
Meimei Zhang ... This method often undervalues a mining project since it ignores future price uncertainty and does not allow ... on metal price. This paper also demonstrates that the "reactive" approach can estimate the mine project ...
Treatment of Uncertainty in Long-Term Planning 1 Introduction
McCalley, James D.
... attribute to lie. For example, we could specify the price of natural gas in one of the following ways ... Figure 2: Specification of uncertainty in natural gas price ... (a) the expected price will increase with time, and (b) the uncertainty will also increase with time.
Examining Uncertainty in Demand Response Baseline Models and
LBNL-5096E. Examining Uncertainty in Demand Response Baseline Models and Variability in Automated ... (i.e. dynamic prices). Using a regression-based baseline model, we define several Demand Response (DR ...
Measurement uncertainty of adsorption testing of desiccant materials
Bingham, C E; Pesaran, A A
1988-12-01
The technique of measurement uncertainty analysis as described in the current ANSI/ASME standard is applied to the testing of desiccant materials in SERI's Sorption Test Facility. This paper estimates the elemental precision and systematic errors in these tests and propagates them separately to obtain the resulting uncertainty of the test parameters, including relative humidity (±0.03) and sorption capacity (±0.002 g/g). Errors generated by instrument calibration, data acquisition, and data reduction are considered. Measurement parameters that would improve the uncertainty of the results are identified. Using the uncertainty in the moisture capacity of a desiccant, the design engineer can estimate the uncertainty in performance of a dehumidifier for desiccant cooling systems with confidence. 6 refs., 2 figs., 8 tabs.
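The separate treatment of precision (random) and systematic (bias) errors, followed by a combined interval, mirrors the ANSI/ASME approach the abstract cites. This is a generic sketch under our own naming and with invented example readings, not SERI's code:

```python
import math

def uncertainty_interval(samples, bias_limit, t=2.0):
    """Result and combined uncertainty from repeated readings plus a bias limit.

    samples: repeated readings of the measurand
    bias_limit: estimated systematic (bias) error limit, same units
    t: coverage factor (about 2 for ~95% coverage with many readings)
    Returns (mean, U): the true value is believed to lie in mean +/- U,
    with U the root-sum-square of the bias limit and the random component.
    """
    n = len(samples)
    mean = sum(samples) / n
    s = math.sqrt(sum((x - mean) ** 2 for x in samples) / (n - 1))
    random_u = t * s / math.sqrt(n)                 # precision (random) component
    U = math.sqrt(bias_limit ** 2 + random_u ** 2)  # RSS combination
    return mean, U

# Four identical relative-humidity readings: the precision component vanishes
# and the whole interval comes from the assumed bias limit of 0.03 RH
m, U = uncertainty_interval([0.500, 0.500, 0.500, 0.500], bias_limit=0.03)
```

With scattered readings, both components contribute; reducing either one alone only helps until the other dominates, which is what the "parameters that would improve the uncertainty" identification is about.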
Avoiding climate change uncertainties in Strategic Environmental Assessment
Larsen, Sanne Vammen; Kørnøv, Lone; Driscoll, Patrick
2013-11-15
This article is concerned with how Strategic Environmental Assessment (SEA) practice handles climate change uncertainties within the Danish planning system. First, a hypothetical model is set up for how uncertainty is handled and not handled in decision-making. The model incorporates the strategies ‘reduction’ and ‘resilience’, ‘denying’, ‘ignoring’ and ‘postponing’. Second, 151 Danish SEAs are analysed with a focus on the extent to which climate change uncertainties are acknowledged and presented, and the empirical findings are discussed in relation to the model. The findings indicate that despite incentives to do so, climate change uncertainties were systematically avoided or downplayed in all but 5 of the 151 SEAs that were reviewed. Finally, two possible explanatory mechanisms are proposed to explain this: conflict avoidance and a need to quantify uncertainty.
Uncertainty analysis of steady state incident heat flux measurements in hydrocarbon fuel fires.
Nakos, James Thomas
2005-12-01
The objective of this report is to develop uncertainty estimates for three heat flux measurement techniques used for the measurement of incident heat flux in a combined radiative and convective environment. This is related to the measurement of heat flux to objects placed inside hydrocarbon fuel (diesel, JP-8 jet fuel) fires, which is very difficult to make accurately (e.g., less than 10%). Three methods will be discussed: a Schmidt-Boelter heat flux gage; a calorimeter and inverse heat conduction method; and a thin plate and energy balance method. Steady state uncertainties were estimated for two types of fires (i.e., calm wind and high winds) at three times (early in the fire, late in the fire, and at an intermediate time). Results showed a large uncertainty for all three methods. Typical uncertainties for a Schmidt-Boelter gage ranged from ±23% for high wind fires to ±39% for low wind fires. For the calorimeter/inverse method the uncertainties were ±25% to ±40%. For the thin plate/energy balance method the uncertainties ranged from ±21% to ±42%. The 23-39% uncertainties for the Schmidt-Boelter gage are much larger than the quoted uncertainty for a radiative-only environment (i.e., ±3%). This large difference is due to the convective contribution and because the gage sensitivities to radiative and convective environments are not equal. All these values are larger than desired, which suggests the need for improvements in heat flux measurements in fires.
Bixler, Nathan E.; Osborn, Douglas M.; Sallaberry, Cedric Jean-Marie; Eckert-Gallup, Aubrey Celia; Mattie, Patrick D.; Ghosh, S. Tina
2014-02-01
This paper describes the convergence of MELCOR Accident Consequence Code System, Version 2 (MACCS2) probabilistic results of offsite consequences for the uncertainty analysis of the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout scenario at the Peach Bottom Atomic Power Station. The consequence metrics evaluated are individual latent-cancer fatality (LCF) risk and individual early fatality risk. Consequence results are presented as conditional risk (i.e., assuming the accident occurs, risk per event) to individuals of the public as a result of the accident. In order to verify convergence for this uncertainty analysis, as recommended by the Nuclear Regulatory Commission’s Advisory Committee on Reactor Safeguards, a ‘high’ source term from the original population of Monte Carlo runs has been selected to be used for: (1) a study of the distribution of consequence results stemming solely from epistemic uncertainty in the MACCS2 parameters (i.e., separating the effect from the source term uncertainty), and (2) a comparison between Simple Random Sampling (SRS) and Latin Hypercube Sampling (LHS) in order to validate the original results obtained with LHS. Three replicates (each using a different random seed) of size 1,000 each using LHS and another set of three replicates of size 1,000 using SRS are analyzed. The results show that the LCF risk results are well converged with either LHS or SRS sampling. The early fatality risk results are less well converged at radial distances beyond 2 miles, and this is expected due to the sparse data (predominance of “zero” results).
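For readers unfamiliar with the sampling comparison above: a Latin hypercube sample stratifies every input dimension so that each of n equal-probability bins is hit exactly once, unlike simple random sampling, which can leave bins empty. This small sampler over the unit hypercube is our illustration only, not the MACCS2/SOARCA implementation:

```python
import random

def latin_hypercube(n, dims, rng=None):
    """Latin hypercube sample of n points in the dims-dimensional unit cube.

    In every dimension, each of the n strata [i/n, (i+1)/n) receives
    exactly one point; strata are paired across dimensions at random.
    """
    rng = rng or random.Random(0)  # fixed seed for a reproducible demo
    cols = []
    for _ in range(dims):
        perm = list(range(n))      # one stratum index per point
        rng.shuffle(perm)          # random pairing across dimensions
        cols.append(perm)
    return [[(cols[d][i] + rng.random()) / n for d in range(dims)]
            for i in range(n)]

# Five points in 2-D: each row and each column stratum is hit exactly once
pts = latin_hypercube(5, 2)
```

Mapping each unit-cube coordinate through an input's inverse CDF turns this into a sample of the epistemic parameters; the paper's convergence check compares replicates of such LHS designs against plain SRS.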
Brown, J. [National Radiological Protection Board (United Kingdom); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)] [and others
1997-06-01
This volume is the second of a two-volume document that summarizes a joint project by the US Nuclear Regulatory Commission and the Commission of the European Communities to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This two-volume report, which examines mechanisms and uncertainties of transfer through the food chain, is the first in a series of five such reports. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain transfer that affect calculations of offsite radiological consequences. Seven of the experts reported on transfer into the food chain through soil and plants, nine reported on transfer via food products from animals, and two reported on both. The expert judgment elicitation procedure and its outcomes are described in these volumes. This volume contains seven appendices. Appendix A presents a brief discussion of the MACCS and COSYMA model codes. Appendix B is the structure document and elicitation questionnaire for the expert panel on soils and plants. Appendix C presents the rationales and responses of each of the members of the soils and plants expert panel. Appendix D is the structure document and elicitation questionnaire for the expert panel on animal transfer. The rationales and responses of each of the experts on animal transfer are given in Appendix E. Brief biographies of the food chain expert panel members are provided in Appendix F. Aggregated results of expert responses are presented in graph format in Appendix G.
Measurement uncertainty analysis techniques applied to PV performance measurements
Wells, C.
1992-10-01
The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Valid data are defined as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.
Calibration and Measurement Uncertainty Estimation of Radiometric Data: Preprint
Habte, A.; Sengupta, M.; Reda, I.; Andreas, A.; Konings, J.
2014-11-01
Evaluating the performance of photovoltaic cells, modules, and arrays that form large solar deployments relies on accurate measurements of the available solar resource. Therefore, determining the accuracy of these solar radiation measurements provides a better understanding of investment risks. This paper provides guidelines and recommended procedures for estimating the uncertainty in calibrations and measurements by radiometers using methods that follow the International Bureau of Weights and Measures Guide to the Expression of Uncertainty in Measurement (GUM). Standardized analysis based on these procedures ensures that the uncertainty quoted is well documented.
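The GUM procedure referenced above combines Type A (statistical) and Type B (other) standard uncertainties in quadrature and applies a coverage factor. A hedged sketch with made-up radiometer numbers, not NREL's actual budget:

```python
import math

# Illustrative GUM-style uncertainty budget for a radiometer reading.
# All component values below are invented examples, not NREL figures.

# Type A: standard deviation of the mean of repeated readings.
readings = [1000.2, 999.8, 1000.5, 999.9, 1000.1]  # W/m^2, hypothetical
n = len(readings)
mean = sum(readings) / n
s = math.sqrt(sum((x - mean) ** 2 for x in readings) / (n - 1))
u_typeA = s / math.sqrt(n)

# Type B: non-statistical, e.g. a rectangular (uniform) spec of +/- a.
a = 2.0  # W/m^2, hypothetical manufacturer spec limit
u_typeB = a / math.sqrt(3)  # standard uncertainty of a rectangular distribution

# Combined standard uncertainty (root sum of squares, independent components),
# then expanded uncertainty with coverage factor k = 2 (~95 % coverage).
u_c = math.sqrt(u_typeA ** 2 + u_typeB ** 2)
U = 2 * u_c
print(f"u_c = {u_c:.3f} W/m^2, U(k=2) = {U:.3f} W/m^2")
```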
A recipe for EFT uncertainty quantification in nuclear physics
R. J. Furnstahl; D. R. Phillips; S. Wesolowski
2014-12-16
The application of effective field theory (EFT) methods to nuclear systems provides the opportunity to rigorously estimate the uncertainties originating in the nuclear Hamiltonian. Yet this is just one source of uncertainty in the observables predicted by calculations based on nuclear EFTs. We discuss the goals of uncertainty quantification in such calculations and outline a recipe to obtain statistically meaningful error bars for their predictions. We argue that the different sources of theory error can be accounted for within a Bayesian framework, as we illustrate using a toy model.
PDF uncertainties at large x and gauge boson production
Accardi, Alberto
2012-10-01
I discuss how global QCD fits of parton distribution functions can make the somewhat separated fields of high-energy particle physics and lower energy hadronic and nuclear physics interact to the benefit of both. In particular, I will argue that large rapidity gauge boson production at the Tevatron and the LHC has the highest short-term potential to constrain the theoretical nuclear corrections to DIS data on deuteron targets necessary for up/down flavor separation. This in turn can considerably reduce the PDF uncertainty on cross section calculations of heavy mass particles such as W' and Z' bosons.
The Uncertainty Principle in Software Engineering
Ziv, Hadar; Richardson, Debra J.
Explores in detail uncertainty in software testing, including test planning and test enactment. Keywords: error principles, software testing, uncertainty modeling, Bayesian networks.
Neural network uncertainty assessment using Bayesian statistics with application to remote sensing
Aires, Filipe
Neural networks are used for many inversion problems in remote sensing; however, uncertainty estimates are rarely provided. Index terms: Meteorology and Atmospheric Dynamics: General or miscellaneous. Keywords: remote sensing, uncertainty, neural networks.
Clark, E.L.
1993-08-01
Error propagation equations, based on the Taylor series model, are derived for the nondimensional ratios and coefficients most often encountered in high-speed wind tunnel testing. These include pressure ratio and coefficient, static force and moment coefficients, dynamic stability coefficients, calibration Mach number and Reynolds number. The error equations contain partial derivatives, denoted as sensitivity coefficients, which define the influence of free-stream Mach number, M{infinity}, on various aerodynamic ratios. To facilitate use of the error equations, sensitivity coefficients are derived and evaluated for nine fundamental aerodynamic ratios, most of which relate free-stream test conditions (pressure, temperature, density or velocity) to a reference condition. Tables of the ratios, R, absolute sensitivity coefficients, {partial_derivative}R/{partial_derivative}M{infinity}, and relative sensitivity coefficients, (M{infinity}/R) ({partial_derivative}R/{partial_derivative}M{infinity}), are provided as functions of M{infinity}.
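As a rough illustration of the sensitivity coefficients described above, the sketch below evaluates dR/dM and the relative coefficient (M/R)(dR/dM) numerically for one common ratio, the isentropic static-to-total pressure ratio. The choice of ratio, gamma = 1.4, and the 0.5 % Mach-number uncertainty are assumptions for illustration, not values taken from the report (whose coefficients are analytic):

```python
import math

GAMMA = 1.4  # ratio of specific heats for air (assumed)

def pressure_ratio(M):
    """Isentropic static-to-total pressure ratio p/p0 as a function of Mach number."""
    return (1.0 + 0.5 * (GAMMA - 1.0) * M * M) ** (-GAMMA / (GAMMA - 1.0))

def sensitivity(R, M, h=1e-6):
    """Absolute sensitivity coefficient dR/dM by central difference."""
    return (R(M + h) - R(M - h)) / (2.0 * h)

M = 2.0
R = pressure_ratio(M)
dRdM = sensitivity(pressure_ratio, M)
rel = (M / R) * dRdM  # relative sensitivity coefficient (M/R)(dR/dM)

# First-order (Taylor-series) error propagation: u_R/R ~= |rel| * u_M/M
u_M_over_M = 0.005  # hypothetical 0.5 % uncertainty in Mach number
u_R_over_R = abs(rel) * u_M_over_M
print(R, rel, u_R_over_R)
```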
Application Form Certificate Program in Risk, Uncertainty, and Decision Analysis
Van Veen, Barry D.
Application form for the Certificate Program in Risk, Uncertainty, and Decision Analysis. Includes Industrial and Systems Engineering 574, Methods for Probabilistic Risk Analysis (3 credits).
Integration of Uncertainty Information into Power System Operations
Makarov, Yuri V.; Lu, Shuai; Samaan, Nader A.; Huang, Zhenyu; Subbarao, Krishnappa; Etingov, Pavel V.; Ma, Jian; Hafen, Ryan P.; Diao, Ruisheng; Lu, Ning
2011-10-10
Contemporary power systems face uncertainties coming from multiple sources, including forecast errors of load, wind, and solar generation; uninstructed deviation and forced outage of traditional generators; loss of transmission lines; and others. With increasing amounts of wind and solar generation being integrated into the system, these uncertainties have been growing significantly. It is critically important to build knowledge of major sources of uncertainty, learn how to simulate them, and then incorporate this information into decision-making processes and power system operations, for better reliability and efficiency. This paper gives a comprehensive view of the sources of uncertainty in power systems, their important characteristics, available models, and ways of integrating them into system operations. It is primarily based on previous work conducted at the Pacific Northwest National Laboratory (PNNL).
Time-Dependent Utility and Action Under Uncertainty Eric Horvitz
Horvitz, Eric
Rockwell Palo Alto Laboratory. References include Doyle (1988), Horvitz (1988), Boddy and Dean (1989), Russell and Wefald (1989), and Breese and Horvitz.
Modeling aviation's global emissions, uncertainty analysis, and applications to policy
Lee, Joosung Joseph, 1974-
2005-01-01
(cont.) fuel burn results below 3000 ft. For emissions, the emissions indices were the most influential uncertainties for the variance in model outputs. By employing the model, this thesis examined three policy options for ...
The effects of incorporating dynamic data on estimates of uncertainty
Mulla, Shahebaz Hisamuddin
2004-09-30
analysis. The results were compared with the uncertainty predicted using only static data. We also investigated approaches for best selecting a smaller number of models from a larger set of realizations to be history matched for quantification...
History matching and uncertainty quantification using sampling method
Ma, Xianlin
2009-05-15
Uncertainty quantification involves sampling the reservoir parameters correctly from a posterior probability function that is conditioned to both static and dynamic data. Rigorous sampling methods like Markov Chain Monte ...
Uncertainty of microwave radiative transfer computations in rain
Hong, Sung Wook
2009-06-02
Currently, the effect of the vertical resolution on the brightness temperature (BT) has not been examined in depth. The uncertainty of the freezing level (FL) retrieved using two different satellites' data is large. Various ...
Investment and Upgrade in Distributed Generation under Uncertainty
Siddiqui, Afzal
2008-01-01
Treatment of uncertainty via real options increases the value of distributed generation. Presented at the 2007 Real Options Conference in Berkeley, CA. Keywords: distributed generation, real options. JEL codes: D81, Q40.
Real options approach to capacity planning under uncertainty
Mittal, Geetanjali, 1979-
2004-01-01
This thesis highlights the effectiveness of Real Options Analysis (ROA) in capacity planning decisions for engineering projects subject to uncertainty. This is in contrast to the irreversible decision-making proposed by ...
Infill location determination and assessment of corresponding uncertainty
Senel, Ozgur
2009-05-15
, generated probabilistic distributions of incremental field production and, finally, used descriptive statistical analysis to evaluate results. I quantified the uncertainty associated with infill location selection in terms of incremental field production...
Data-driven models for uncertainty and behavior
Gupta, Vishal, Ph. D. Massachusetts Institute of Technology
2014-01-01
The last decade has seen an explosion in the availability of data. In this thesis, we propose new techniques to leverage these data to tractably model uncertainty and behavior. Specifically, this thesis consists of three ...
Managing Sensor Data Uncertainty: a data quality approach
Servigne, Sylvie
Provides users with data quality information. By Claudia C. Gutiérrez Rodríguez (I3S). Sensor infrastructures support many current and promising environmental applications. Keywords: data quality, sensor data, metadata, environmental applications.
Uncertainty in Greenhouse Emissions and Costs of Atmospheric Stabilization
Webster, Mort D.
We explore the uncertainty in projections of emissions, and costs of atmospheric stabilization applying the MIT Emissions Prediction and Policy Analysis model, a computable general equilibrium model of the global economy. ...
Control of systems subject to uncertainty and constraints
Villota Cerna, Elizabeth Roxana
2009-05-15
Abstract front matter: Control of Systems Subject to Uncertainty and Constraints (December 2007). Elizabeth Roxana Villota Cerna, B.S., Universidad Nacional de Ingeniería; M.S., Pontifícia Universidade Católica do Rio de Janeiro.
Maintaining artificial recharge ponds under uncertainty: a probabilistic approach for
Politècnica de Catalunya, Universitat
A probabilistic approach for engineering surface ponds (SP). Topics: what is clogging; mathematical models for clogging; risk formulation; Monte Carlo analysis; conclusions. Surface ponds collect selected external water (e.g. regenerated ...
HANDLING UNCERTAINTY IN PRODUCTION ACTIVITY CONTROL USING PROACTIVE SIMULATION
Paris-Sud XI, Université de
Keywords: production control, manufacturing systems, proactive, real-time. Real-world planning and scheduling problems are generally complex, constrained, and multi...
Phoenix: A Reactor Burnup Code With Uncertainty Quantification
Spence, Grant R
2014-12-15
, and will continue to be, an area of great interest in nuclear research. Several methods have been developed to simulate reactor burnup; however, quantifying the uncertainty in reactor burnup simulations is in its relative infancy. This research developed a...
Including model uncertainty in risk-informed decision-making
Reinert, Joshua M
2005-01-01
Model uncertainties can have a significant impact on decisions regarding licensing basis changes. We present a methodology to identify basic events in the risk assessment that have the potential to change the decision and ...
Some methods of estimating uncertainty in accident reconstruction
Milan Batista
2011-07-20
In the paper four methods for estimating uncertainty in accident reconstruction are discussed: total differential method, extreme values method, Gauss statistical method, and Monte Carlo simulation method. The methods are described and the program solutions are given.
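Two of the four methods named in the abstract can be compared on the classic skid-to-stop formula v = sqrt(2 mu g d); the formula choice and all input values and uncertainties below are hypothetical illustrations, not taken from the paper:

```python
import numpy as np

g = 9.81               # m/s^2
mu, u_mu = 0.7, 0.05   # friction coefficient and its std. uncertainty (hypothetical)
d, u_d = 30.0, 1.0     # skid length [m] and its std. uncertainty (hypothetical)

def speed(mu, d):
    """Skid-to-stop speed: v = sqrt(2 * mu * g * d)."""
    return np.sqrt(2.0 * mu * g * d)

v = speed(mu, d)

# 1) Total differential (first-order) method:
#    v = sqrt(2g) * mu^(1/2) * d^(1/2)  =>  u_v/v = 0.5*sqrt((u_mu/mu)^2 + (u_d/d)^2)
u_v_td = v * 0.5 * np.hypot(u_mu / mu, u_d / d)

# 2) Monte Carlo method: sample the inputs, look at the spread of outputs.
rng = np.random.default_rng(7)
samples = speed(rng.normal(mu, u_mu, 200_000), rng.normal(d, u_d, 200_000))
u_v_mc = samples.std()

print(v, u_v_td, u_v_mc)  # the two uncertainty estimates agree closely here
```

For small relative input uncertainties the linearized (total differential) estimate tracks the Monte Carlo spread well; the extreme-values method from the paper would instead bound the output by evaluating the formula at the input extremes.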
Do the uncertainty relations really have crucial significances for physics ?
Spiridon Dumitru
2010-05-03
The falsity of the idea that the Uncertainty Relations (UR) have crucial significance for physics is proved. Additionally, one argues for the necessity of a UR-disconnected quantum philosophy.
Induced Matter Theory & Heisenberg-like Uncertainty Relations
James R. Bogan
2005-04-01
We show that a differential variant of the Heisenberg uncertainty relations emerges naturally from induced matter theory, as a sum of line elements in both momentum and Minkowski spaces.
Uncertainty quantification using multiscale methods for porous media flows
Dostert, Paul Francis
2009-05-15
In this dissertation we discuss numerical methods used for uncertainty quantifi- cation applications to flow in porous media. We consider stochastic flow equations that contain both a spatial and random component which ...
Creating value from uncertainty : a study of ocean transportation contracting
Pálsson, Sigurjón
2005-01-01
How can financial tools like real options and hedging mitigate and create value from uncertainty in transportation? This paper describes these concepts and identifies research on them that has relevance to transportation. ...
Investment and Upgrade in Distributed Generation under Uncertainty
Siddiqui, Afzal
2008-01-01
of the uncertainty in the natural gas price. Treat- ment ofits exposure to risk from natural gas price volatility. Inits exposure to the natural gas price and maximising its
Assessing uncertainty in models of the ocean carbon cycle
Scott, Vivian
2010-01-01
In this thesis I explore the effect of parameter uncertainty in ocean biogeochemical models on the calculation of carbon uptake by the ocean. The ocean currently absorbs around a quarter of the annual anthropogenic CO2 ...
A Framework for Modeling Uncertainty in Regional Climate Change
The modeling framework revolves around the Massachusetts Institute of Technology (MIT) ... climate models. Erwan Monier, Xiang Gao, Jeffery ... To support the processes of policy development and implementation, climate change research needs to focus on improving ...
Precipitation Research at UMN: Multiscale variability and Uncertainty in
Foufoula-Georgiou, Efi
Developing nonparametric schemes for merging multisensor precipitation products. Gupta, R., V. Venugopal, and E. Foufoula-Georgiou, A methodology for merging multisensor precipitation estimates based on expectation...
An algorithm for U-Pb isotope dilution data reduction and uncertainty propagation
McLean, Noah M.; Bowring, J.F.; Bowring, S.A.
2011-06-01
High-precision U-Pb geochronology by isotope dilution-thermal ionization mass spectrometry is integral to a variety of Earth science disciplines, but its ultimate resolving power is quantified by the uncertainties of calculated U-Pb dates...
System architecture decisions under uncertainty : a case study on automotive battery system design
Renzi, Matthew Joseph
2012-01-01
Flexibility analysis using the Real Options framework is typically utilized on high-level architectural decisions. Using Real Options, a company may develop strategies to mitigate downside risk for future uncertainties ...
Entropic uncertainty relation for power-law wave packets
Sumiyoshi Abe; S. Martinez; F. Pennini; A. Plastino
2002-06-06
For the power-law quantum wave packet in configuration space, the variance of the position observable may be divergent. Accordingly, the information-entropic formulation of the uncertainty principle becomes more appropriate than the Heisenberg-type formulation, since it involves only the finite quantities. It is found that the total amount of entropic uncertainty converges to its lower bound in the limit of a large value of the exponent.
POWER SCHEDULING IN A HYDROTHERMAL SYSTEM UNDER UNCERTAINTY
Römisch, Werner
C.C. Carøe, M.P. Nowak, W. Römisch: A multi-stage stochastic programming model for power scheduling under uncertainty in a generation system comprising thermal and pumped-storage hydro units is developed. For its computational solution...
Uncertainty Quantification and Calibration in Well Construction Cost Estimates
Valdes Machado, Alejandro
2013-08-05
, it is possible to assess the probabilities of cost overruns, understand the accuracy of the estimates and provide identification of risk factors in the well construction operations. Capen (1976) claimed we have not learned to successfully deal with uncertainty... of the project. Both issues contribute to inefficient capital budgeting. Using a probabilistic methodology is not sufficient to assess the uncertainty reliably. Look-backs, which are based on the analysis of past estimations, are required to provide more...
Uncertainties in the Anti-neutrino Production at Nuclear Reactors
Djurcic, Zelimir; Detwiler, Jason A.; Piepke, Andreas; Foster Jr., Vince R.; Miller, Lester; Gratta, Giorgio
2008-08-06
Anti-neutrino emission rates from nuclear reactors are determined from thermal power measurements and fission rate calculations. The uncertainties in these quantities for commercial power plants, and their impact on the calculated interaction rates in {bar {nu}}{sub e} detectors, are examined. We discuss reactor-to-reactor correlations between the leading uncertainties, and their relevance to reactor {bar {nu}}{sub e} experiments.
Living With Radical Uncertainty. The Exemplary case of Folding Protein
Ignazio Licata
2010-04-21
Laplace's demon still makes a strong impact on contemporary science, despite the fact that results in mathematical logic, the advent of quantum physics, and more recently complexity science have pointed out the crucial role of uncertainty in descriptions of the world. We focus here on the typical problem of protein folding as an example of uncertainty, radical emergence, and a guide to the "simple" principles for studying complex systems.
Tabone, Michaelangelo D; Callaway, Duncan S
2015-01-01
Concerns variability and uncertainty of photovoltaic generation for grid-connected photovoltaic systems, and uncertainty in solar photovoltaic generation at multiple ...
Optimal Control of Distributed Energy Resources and Demand Response under Uncertainty
Siddiqui, Afzal
2010-01-01
Distributed energy resources (DER) in conjunction with demand response (DR): the expected ...
Uncertainty and sensitivity analysis for long-running computer codes : a critical review
Langewisch, Dustin R
2010-01-01
This thesis presents a critical review of existing methods for performing probabilistic uncertainty and sensitivity analysis for complex, computationally expensive simulation models. Uncertainty analysis (UA) methods ...
How incorporating more data reduces uncertainty in recovery predictions
Campozana, F.P.; Lake, L.W.; Sepehrnoori, K.
1997-08-01
From the discovery to the abandonment of a petroleum reservoir, there are many decisions that involve economic risks because of uncertainty in the production forecast. This uncertainty may be quantified by performing stochastic reservoir modeling (SRM); however, it is not practical to apply SRM every time the model is updated to account for new data. This paper suggests a novel procedure to estimate reservoir uncertainty (and its reduction) as a function of the amount and type of data used in the reservoir modeling. Two types of data are analyzed: conditioning data and well-test data. However, the same procedure can be applied to any other data type. Three performance parameters are suggested to quantify uncertainty. SRM is performed for the following typical stages: discovery, primary production, secondary production, and infill drilling. From those results, a set of curves is generated that can be used to estimate (1) the uncertainty for any other situation and (2) the uncertainty reduction caused by the introduction of new wells (with and without well-test data) into the description.
Fuzzy-probabilistic calculations of water-balance uncertainty
Faybishenko, B.
2009-10-01
Hydrogeological systems are often characterized by imprecise, vague, inconsistent, incomplete, or subjective information, which may limit the application of conventional stochastic methods in predicting hydrogeologic conditions and associated uncertainty. Instead, predictions and uncertainty analysis can be made using uncertain input parameters expressed as probability boxes, intervals, and fuzzy numbers. The objective of this paper is to present the theory for, and a case study as an application of, the fuzzy-probabilistic approach, combining probability and possibility theory for simulating soil water balance and assessing associated uncertainty in the components of a simple water-balance equation. The application of this approach is demonstrated using calculations with the RAMAS Risk Calc code, to assess the propagation of uncertainty in calculating potential evapotranspiration, actual evapotranspiration, and infiltration in a case study at the Hanford site, Washington, USA. Propagation of uncertainty into the results of water-balance calculations was evaluated by changing the types of models of uncertainty incorporated into various input parameters. The results of these fuzzy-probabilistic calculations are compared to the conventional Monte Carlo simulation approach and estimates from field observations at the Hanford site.
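A stripped-down illustration of the contrast drawn above between possibility-style (interval) propagation and conventional Monte Carlo, using a toy water-balance subtraction; the bounds are invented and RAMAS Risk Calc is not involved:

```python
import numpy as np

# Toy water balance: infiltration I = P - ET (annual totals, mm).
# Interval (possibility-style) inputs: only bounds are known (hypothetical values).
P_lo, P_hi = 150.0, 220.0     # precipitation
ET_lo, ET_hi = 100.0, 180.0   # actual evapotranspiration

# Interval arithmetic for subtraction: [a, b] - [c, d] = [a - d, b - c]
I_lo, I_hi = P_lo - ET_hi, P_hi - ET_lo

# Conventional Monte Carlo with uniform distributions over the same bounds.
rng = np.random.default_rng(3)
P = rng.uniform(P_lo, P_hi, 100_000)
ET = rng.uniform(ET_lo, ET_hi, 100_000)
I = P - ET

# Every Monte Carlo sample falls inside the interval result; the interval
# is the wider, more conservative representation of the same ignorance.
print((I_lo, I_hi), I.min(), I.max())
```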
Uncertainty Estimation Improves Energy Measurement and Verification Procedures
Walter, Travis; Price, Phillip N.; Sohn, Michael D.
2014-05-14
Implementing energy conservation measures in buildings can reduce energy costs and environmental impacts, but such measures cost money to implement, so intelligent investment strategies require the ability to quantify energy savings by comparing actual energy use to how much energy would have been used in the absence of the conservation measures (known as the baseline energy use). Methods exist for predicting baseline energy use, but a limitation of most statistical methods reported in the literature is inadequate quantification of the uncertainty in baseline energy use predictions. However, estimation of uncertainty is essential for weighing the risks of investing in retrofits. Most commercial buildings have, or soon will have, electricity meters capable of providing data at short time intervals. These data provide new opportunities to quantify uncertainty in baseline predictions, and to do so after shorter measurement durations than are traditionally used. In this paper, we show that uncertainty estimation provides greater measurement and verification (M&V) information and helps to overcome some of the difficulties with deciding how much data is needed to develop baseline models and to confirm energy savings. We also show that cross-validation is an effective method for computing uncertainty. In so doing, we extend a simple regression-based method of predicting energy use using short-interval meter data. We demonstrate the methods by predicting energy use in 17 real commercial buildings. We discuss the benefits of uncertainty estimates, which can provide actionable decision-making information for investing in energy conservation measures.
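The cross-validation idea described above can be sketched on synthetic meter data: hold out each fold, predict it from the rest, and use the spread of out-of-sample residuals as the baseline uncertainty. Everything below (the linear temperature model, the noise level) is a simulated stand-in, not the paper's method in detail:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic hourly meter data: load depends linearly on outdoor temperature
# plus noise (all values here are simulated, not real building data).
n = 2000
temp = rng.uniform(0.0, 35.0, n)
load = 50.0 + 2.0 * temp + rng.normal(0.0, 5.0, n)  # kW

def fit_predict(x_train, y_train, x_test):
    """Ordinary least squares on [1, x]."""
    A = np.column_stack([np.ones_like(x_train), x_train])
    coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)
    return coef[0] + coef[1] * x_test

# k-fold cross-validation: hold out each fold, predict it, collect residuals.
k = 5
idx = rng.permutation(n)
residuals = []
for fold in np.array_split(idx, k):
    train = np.setdiff1d(idx, fold)
    pred = fit_predict(temp[train], load[train], temp[fold])
    residuals.append(load[fold] - pred)
residuals = np.concatenate(residuals)

# The CV residual spread estimates out-of-sample baseline uncertainty;
# here it should recover the simulated noise level (~5 kW).
cv_rmse = np.sqrt(np.mean(residuals ** 2))
print(cv_rmse)
```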
Comparison of nuclear data uncertainty propagation methodologies for PWR burn-up simulations
Carlos Javier Diez; Oliver Buss; Axel Hoefer; Dieter Porsch; Oscar Cabellos
2014-11-04
Several methodologies using different levels of approximations have been developed for propagating nuclear data uncertainties in nuclear burn-up simulations. Most methods fall into the two broad classes of Monte Carlo approaches, which are exact apart from statistical uncertainties but require additional computation time, and first order perturbation theory approaches, which are efficient for not too large numbers of considered response functions but only applicable for sufficiently small nuclear data uncertainties. Some methods neglect isotopic composition uncertainties induced by the depletion steps of the simulations, others neglect neutron flux uncertainties, and the accuracy of a given approximation is often very hard to quantify. In order to get a better sense of the impact of different approximations, this work aims to compare results obtained based on different approximate methodologies with an exact method, namely the NUDUNA Monte Carlo based approach developed by AREVA GmbH. In addition, the impact of different covariance data is studied by comparing two of the presently most complete nuclear data covariance libraries (ENDF/B-VII.1 and SCALE 6.0), which reveals a high dependency of the uncertainty estimates on the source of covariance data. The burn-up benchmark Exercise I-1b proposed by the OECD expert group "Benchmarks for Uncertainty Analysis in Modeling (UAM) for the Design, Operation and Safety Analysis of LWRs" is studied as an example application. The burn-up simulations are performed with the SCALE 6.0 tool suite.
PIV Uncertainty Methodologies for CFD Code Validation at the MIR Facility
Piyush Sabharwall; Richard Skifton; Carl Stoots; Eung Soo Kim; Thomas Conder
2013-12-01
Currently, computational fluid dynamics (CFD) is widely used in the nuclear thermal hydraulics field for design and safety analyses. To validate CFD codes, high-quality multi-dimensional flow field data are essential. The Matched Index of Refraction (MIR) Flow Facility at Idaho National Laboratory has a unique capability to contribute to the development of validated CFD codes through the use of Particle Image Velocimetry (PIV). The significance of the MIR facility is that it permits non-intrusive velocity measurement techniques, such as PIV, through complex models without requiring probes and other instrumentation that disturb the flow. At the heart of any PIV calculation is the cross-correlation, which is used to estimate the displacement of particles in some small part of the image over the time span between two images. This image displacement is indicated by the location of the largest correlation peak. In the MIR facility, uncertainty quantification is a challenging task due to the use of optical measurement techniques. This study is developing a reliable method to analyze the uncertainty and sensitivity of the measured data, along with a computer code to automate that analysis. The main objective of this study is to develop a well-established uncertainty quantification method for the MIR Flow Facility, which involves many complicated uncertainty factors. In this study, the uncertainty sources are resolved in depth by categorizing them into uncertainties from the MIR flow loop and from the PIV system (including particle motion, image distortion, and data processing). Then, each uncertainty source is mathematically modeled or adequately defined. Finally, this study will provide a method and procedure to quantify the experimental uncertainty in the MIR Flow Facility with sample test results.
Uncertainty and sensitivity analysis for photovoltaic system modeling.
Hansen, Clifford W.; Pohl, Andrew Phillip; Jordan, Dirk
2013-12-01
We report an uncertainty and sensitivity analysis for modeling DC energy from photovoltaic systems. We consider two systems, each comprising a single module using either crystalline silicon or CdTe cells, and located either at Albuquerque, NM, or Golden, CO. Output from a PV system is predicted by a sequence of models. Uncertainty in the output of each model is quantified by empirical distributions of each model's residuals. We sample these distributions to propagate uncertainty through the sequence of models to obtain an empirical distribution for each PV system's output. We considered models that: (1) translate measured global horizontal, direct and global diffuse irradiance to plane-of-array irradiance; (2) estimate effective irradiance from plane-of-array irradiance; (3) predict cell temperature; and (4) estimate DC voltage, current and power. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. Four alternative models were considered for the POA irradiance modeling step; we did not find the choice of one of these models to be of great significance. However, we observed that the POA irradiance model introduced a bias of upwards of 5% of daily energy, which translates directly to a systematic difference in predicted energy. Sensitivity analyses relate uncertainty in the PV system output to uncertainty arising from each model. We found the residuals arising from the POA irradiance and the effective irradiance models to be the dominant contributors to residuals for daily energy, for either technology or location considered. This analysis indicates that efforts to reduce the uncertainty in PV system output should focus on improvements to the POA and effective irradiance models.
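The residual-resampling scheme described above can be sketched with a two-step toy chain; the model coefficients and residual spreads are invented placeholders for the paper's irradiance and power models:

```python
import numpy as np

rng = np.random.default_rng(5)

# Two toy models standing in for the PV modeling chain (illustrative only):
#   step 1: plane-of-array (POA) irradiance from horizontal irradiance
#   step 2: DC power from POA irradiance
def model_poa(ghi):
    return 1.15 * ghi

def model_power(poa):
    return 0.18 * poa

# Empirical residuals for each step. Here they are drawn synthetically; in
# the paper they come from comparing each model against measurements.
poa_residuals = rng.normal(0.0, 20.0, 500)    # W/m^2
power_residuals = rng.normal(0.0, 3.0, 500)   # W

# Propagate: resample a residual for each step and push it through the chain.
ghi = 800.0  # W/m^2, a single input condition
n = 10_000
poa = model_poa(ghi) + rng.choice(poa_residuals, n)
power = model_power(poa) + rng.choice(power_residuals, n)

# The spread of `power` is the propagated uncertainty for this condition.
print(power.mean(), power.std())
```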
Fuel cycle cost uncertainty from nuclear fuel cycle comparison
Li, J.; McNelis, D. [Institute for the Environment, University of North Carolina, Chapel Hill (United States); Yim, M.S. [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology (Korea, Republic of)
2013-07-01
This paper examined the uncertainty in fuel cycle cost (FCC) calculation by considering both model and parameter uncertainty. Four different fuel cycle options were compared in the analysis including the once-through cycle (OT), the DUPIC cycle, the MOX cycle and a closed fuel cycle with fast reactors (FR). The model uncertainty was addressed by using three different FCC modeling approaches with and without the time value of money consideration. The relative ratios of FCC in comparison to OT did not change much by using different modeling approaches. This observation was consistent with the results of the sensitivity study for the discount rate. Two different sets of data with uncertainty range of unit costs were used to address the parameter uncertainty of the FCC calculation. The sensitivity study showed that the dominating contributor to the total variance of FCC is the uranium price. In general, the FCC of OT was found to be the lowest followed by FR, MOX, and DUPIC. However, depending on the uranium price, the FR cycle could have a lower FCC than OT. The reprocessing cost was also found to have a major impact on FCC.
Systematic uncertainties from halo asphericity in dark matter searches
Bernal, Nicolás; Forero-Romero, Jaime E.; Garani, Raghuveer; Palomares-Ruiz, Sergio E-mail: je.forero@uniandes.edu.co E-mail: sergio.palomares.ruiz@ific.uv.es
2014-09-01
Although commonly assumed to be spherical, dark matter halos are predicted to be non-spherical by N-body simulations and their asphericity has a potential impact on the systematic uncertainties in dark matter searches. The evaluation of these uncertainties is the main aim of this work, where we study the impact of aspherical dark matter density distributions in Milky-Way-like halos on direct and indirect searches. Using data from the large N-body cosmological simulation Bolshoi, we perform a statistical analysis and quantify the systematic uncertainties on the determination of the local dark matter density and the so-called J factors for dark matter annihilations and decays from the galactic center. We find that, due to our ignorance about the extent of the non-sphericity of the Milky Way dark matter halo, systematic uncertainties can be as large as 35%, within the 95% most probable region, for a spherically averaged value for the local density of 0.3-0.4 GeV/cm{sup 3}. Similarly, systematic uncertainties on the J factors evaluated around the galactic center can be as large as 10% and 15%, within the 95% most probable region, for dark matter annihilations and decays, respectively.
SCALE-6 Sensitivity/Uncertainty Methods and Covariance Data
Williams, Mark L [ORNL; Rearden, Bradley T [ORNL
2008-01-01
Computational methods and data used for sensitivity and uncertainty analysis within the SCALE nuclear analysis code system are presented. The methodology used to calculate sensitivity coefficients and similarity coefficients and to perform nuclear data adjustment is discussed. A description is provided of the SCALE-6 covariance library based on ENDF/B-VII and other nuclear data evaluations, supplemented by 'low-fidelity' approximate covariances. SCALE (Standardized Computer Analyses for Licensing Evaluation) is a modular code system developed by Oak Ridge National Laboratory (ORNL) to perform calculations for criticality safety, reactor physics, and radiation shielding applications. SCALE calculations typically use sequences that execute a predefined series of executable modules to compute particle fluxes and responses like the critical multiplication factor. SCALE also includes modules for sensitivity and uncertainty (S/U) analysis of calculated responses. The S/U codes in SCALE are collectively referred to as TSUNAMI (Tools for Sensitivity and UNcertainty Analysis Methodology Implementation). SCALE-6, scheduled for release in 2008, contains significant new capabilities, including important enhancements in S/U methods and data. The main functions of TSUNAMI are to (a) compute nuclear data sensitivity coefficients and response uncertainties, (b) establish similarity between benchmark experiments and design applications, and (c) reduce uncertainty in calculated responses by consolidating integral benchmark experiments. TSUNAMI includes easy-to-use graphical user interfaces for defining problem input and viewing three-dimensional (3D) geometries, as well as an integrated plotting package.
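The response-uncertainty computation in function (a) is, to first order, the "sandwich rule": the relative variance of a response R is S C Sᵀ, where S holds relative sensitivity coefficients and C is the relative covariance matrix of the nuclear data. The numbers below are invented for illustration and do not come from any SCALE covariance library:

```python
import numpy as np

# Hypothetical relative sensitivities dR/R per dsigma/sigma for three
# nuclear-data parameters, and a hypothetical relative covariance matrix.
S = np.array([0.4, -0.1, 0.25])
C = np.array([
    [0.0004, 0.0001, 0.0],
    [0.0001, 0.0009, 0.0],
    [0.0,    0.0,    0.0016],
])

var_R = S @ C @ S         # sandwich rule: relative variance of response R
rel_unc = np.sqrt(var_R)  # about 1.3% relative uncertainty in this toy case
```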
Direct tests of measurement uncertainty relations: what it takes
Paul Busch; Neil Stevens
2015-01-17
The uncertainty principle being a cornerstone of quantum mechanics, it is surprising that in nearly 90 years there have been no direct tests of measurement uncertainty relations. This lacuna was due to the absence of two essential ingredients: appropriate measures of measurement error (and disturbance), and precise formulations of such relations that are {\\em universally valid} and {\\em directly testable}. We formulate two distinct forms of direct tests, based on different measures of error. We present a prototype protocol for a direct test of measurement uncertainty relations in terms of {\\em value deviation errors} (hitherto considered nonfeasible), highlighting the lack of universality of these relations. This shows that the formulation of universal, directly testable measurement uncertainty relations for {\\em state-dependent} error measures remains an important open problem. Recent experiments that were claimed to constitute invalidations of Heisenberg's error-disturbance relation are shown to conform with the spirit of Heisenberg's principle if interpreted as direct tests of measurement uncertainty relations for error measures that quantify {\\em distances between observables}.
IAEA CRP on HTGR Uncertainty Analysis: Benchmark Definition and Test Cases
Gerhard Strydom; Frederik Reitsma; Hans Gougar; Bismark Tyobeka; Kostadin Ivanov
2012-11-01
Uncertainty and sensitivity studies are essential elements of the reactor simulation code verification and validation process. Although several international uncertainty quantification activities have been launched in recent years in the LWR, BWR and VVER domains (e.g. the OECD/NEA BEMUSE program [1], from which the current OECD/NEA LWR Uncertainty Analysis in Modelling (UAM) benchmark [2] effort was derived), the systematic propagation of uncertainties in cross-section, manufacturing and model parameters for High Temperature Reactor (HTGR) designs has not been attempted yet. This paper summarises the scope, objectives and exercise definitions of the IAEA Coordinated Research Project (CRP) on HTGR UAM [3]. Note that no results will be included here, as the HTGR UAM benchmark was only launched formally in April 2012, and the specification is currently still under development.
Mills, Andrew
2010-01-01
… and Uncertainty of Photovoltaics for Integration with the … models and datasets. Photovoltaics fall under the broader …
4. Uncertainty and belief (D. Keil, Artificial Intelligence, Framingham State University)
Keil, David M.
CSCI 300 Artificial Intelligence, 4. Uncertainty and belief: 1. Acting under uncertainty; 2. Probability. Inquiry: Is probabilistic reasoning part …
Solar irradiance forecasting at multiple time horizons and novel methods to evaluate uncertainty
Marquez, Ricardo
2012-01-01
Table-of-contents fragments: Solar irradiance data; Accuracy; Solar Resource; Uncertainty in Solar Resource: Forecasting …
Harp, D. R. [Los Alamos National Laboratory, Los Alamos, NM, USA; Atchley, A. L. [Los Alamos National Laboratory, Los Alamos, NM, USA; Painter, S. L. [Oak Ridge National Laboratory, Oak Ridge, TN, USA; Coon, E. T. [Los Alamos National Laboratory, Los Alamos, NM, USA; Wilson, C. J. [Los Alamos National Laboratory, Los Alamos, NM, USA; Romanovsky, V. E. [University of Alaska, Fairbanks, USA] (ORCID:0000000295152087); Rowland, J. C. [Los Alamos National Laboratory, Los Alamos, NM, USA
2015-01-01
The effect of soil property uncertainties on permafrost thaw projections is studied using a three-phase subsurface thermal hydrology model and calibration-constrained uncertainty analysis. The Null-Space Monte Carlo method is used to identify soil hydrothermal parameter combinations that are consistent with borehole temperature measurements at the study site, the Barrow Environmental Observatory. Each parameter combination is then used in a forward projection of permafrost conditions for the 21st century (from calendar year 2006 to 2100) using atmospheric forcings from the Community Earth System Model (CESM) in the Representative Concentration Pathway (RCP) 8.5 greenhouse gas concentration trajectory. A 100-year projection allows for the evaluation of intra-annual uncertainty due to soil properties and the inter-annual variability due to year-to-year differences in CESM climate forcings. After calibrating to borehole temperature data at this well-characterized site, soil property uncertainties are still significant and result in significant intra-annual uncertainties in projected active layer thickness (ALT) and annual thaw depth-duration even with a specified future climate. Intra-annual uncertainties in projected soil moisture content and Stefan number are small. A volume- and time-integrated Stefan number decreases significantly in the future climate, indicating that latent heat of phase change becomes more important than heat conduction in future climates. Out of 10 soil parameters, ALT, annual thaw depth-duration, and Stefan number are highly dependent on mineral soil porosity, while annual mean liquid saturation of the active layer is highly dependent on the mineral soil residual saturation and moderately dependent on peat residual saturation.
By comparing the ensemble statistics to the spread of projected permafrost metrics using different climate models, we show that the effect of calibration-constrained uncertainty in soil properties, although significant, is less than that produced by structural climate model uncertainty for this location.
Uncertainty quantification and validation of combined hydrological and macroeconomic analyses.
Hernandez, Jacquelynne; Parks, Mancel Jordan; Jennings, Barbara Joan; Kaplan, Paul Garry; Brown, Theresa Jean; Conrad, Stephen Hamilton
2010-09-01
Changes in climate can lead to instabilities in physical and economic systems, particularly in regions with marginal resources. Global climate models indicate increasing global mean temperatures over the decades to come and uncertainty in the local to national impacts means perceived risks will drive planning decisions. Agent-based models provide one of the few ways to evaluate the potential changes in behavior in coupled social-physical systems and to quantify and compare risks. The current generation of climate impact analyses provides estimates of the economic cost of climate change for a limited set of climate scenarios that account for a small subset of the dynamics and uncertainties. To better understand the risk to national security, the next generation of risk assessment models must represent global stresses, population vulnerability to those stresses, and the uncertainty in population responses and outcomes that could have a significant impact on U.S. national security.
Incorporating uncertainty into electric utility projections and decisions
Hanson, D.A.
1992-01-01
This paper focuses on how electric utility companies can respond in their decision making to uncertain variables. We take a mean-variance type of approach: the "mean" value is an expected cost, on a discounted-value basis, and we assume that management has risk preferences incorporating a tradeoff between the mean and variance of the utility's net income. Decisions that utilities face can be classified into two types: ex ante and ex post. Ex ante decisions must be made before the uncertainty is revealed; ex post decisions can be postponed until after it is revealed. Intuitively, ex ante decisions provide a hedge against the uncertainties, while ex post decisions allow the negative outcomes of uncertain variables to be partially mitigated, dampening the losses. An example of an ex post decision is how the system is operated, i.e., unit dispatch and, in some cases, switching among fuel types, say with different sulfur contents. For example, if gas prices go up, natural gas combined-cycle units are likely to be dispatched at lower capacity factors; if SO{sub 2} emission allowance prices go up, a utility may seek to switch to a lower-sulfur coal. We assume that regulated electric utilities have some incentive to lower revenue requirements, and hence to lower the electric rates needed for the utility to break even while earning a fair return on invested capital. This paper presents the general approach first, including applications to capacity expansion and system dispatch. A case study is then presented focusing on the 1990 Clean Air Act Amendments, including SO{sub 2} emissions abatement and banking of allowances under uncertainty. It is concluded that emission banking decisions should not be made in isolation; rather, all the uncertainties in demand, fuel prices, technology performance, etc., should be included in the uncertainty analysis affecting emission banking.
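The mean-variance tradeoff described above can be made concrete with a toy scenario table; the choices, probabilities, costs, and risk-aversion weight below are hypothetical:

```python
# Ex-ante choices evaluated over three fuel-price scenarios.
probs = [0.3, 0.4, 0.3]
costs = {
    "build_gas":  [85.0, 105.0, 145.0],   # lower expected cost, wide spread
    "build_coal": [108.0, 112.0, 116.0],  # a hedge: slightly dearer, little spread
}

def mean_var_score(c, p, risk_aversion):
    """Expected cost plus a penalty on variance (mean-variance criterion)."""
    mean = sum(pi * ci for pi, ci in zip(p, c))
    var = sum(pi * (ci - mean) ** 2 for pi, ci in zip(p, c))
    return mean + risk_aversion * var

risk_neutral = min(costs, key=lambda k: mean_var_score(costs[k], probs, 0.0))
risk_averse = min(costs, key=lambda k: mean_var_score(costs[k], probs, 0.01))
```

A risk-neutral planner picks the cheaper-on-average option; penalizing variance flips the choice toward the hedge, which is the point about ex ante decisions.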
Effective field theory for nuclear vibrations with quantified uncertainties
Pérez, E A Coello
2015-01-01
We develop an effective field theory (EFT) for nuclear vibrations. The key ingredients (quadrupole degrees of freedom, rotational invariance, and a breakdown scale around the three-phonon level) are taken from data. The EFT is developed for spectra and electromagnetic moments and transitions. We employ tools from Bayesian statistics for the quantification of theoretical uncertainties. The EFT consistently describes spectra and electromagnetic transitions for $^{62}$Ni, $^{98,100}$Ru, $^{106,108}$Pd, $^{110,112,114}$Cd, and $^{118,120,122}$Te within the theoretical uncertainties. This suggests that these nuclei can be viewed as anharmonic vibrators.
Solar WIMPs Unraveled: Experiments, astrophysical uncertainties, and interactive Tools
Danninger, Matthias
2015-01-01
The absence of a neutrino flux from self-annihilating dark matter captured in the Sun has tightly constrained some leading particle dark matter scenarios. The impact of astrophysical uncertainties on the capture process of dark matter in the Sun and hence also the derived constraints by neutrino telescopes need to be taken into account. In this review we have explored relevant uncertainties in solar WIMP searches, summarized results from leading experiments, and provided an outlook into upcoming searches and future experiments. We have created an interactive plotting tool that allows the user to view current limits and projected sensitivities of major experiments under changing astrophysical conditions.
Zachary Lewis; Tatsu Takeuchi
2011-09-13
We analyze the position and momentum uncertainties of the energy eigenstates of the harmonic oscillator in the context of a deformed quantum mechanics, namely, one in which the commutator between the position and momentum operators is given by [x,p]=i\\hbar(1+\\beta p^2). This deformed commutation relation leads to the minimal length uncertainty relation \\Delta x > (\\hbar/2)(1/\\Delta p +\\beta\\Delta p), which implies that \\Delta x ~ 1/\\Delta p at small \\Delta p while \\Delta x ~ \\Delta p at large \\Delta p. We find that the uncertainties of the energy eigenstates of the normal harmonic oscillator (m>0), derived in Ref. [1], populate only the \\Delta x ~ 1/\\Delta p branch. The other branch, \\Delta x ~ \\Delta p, is found to be populated by the energy eigenstates of the `inverted' harmonic oscillator (m<0), whose uncertainties satisfy \\Delta x, \\Delta p > \\sqrt{2} [\\hbar^2/k|m|]^{1/4}. Correspondence with the classical limit is also discussed.
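The minimal length implied by the quoted relation follows from a one-line minimization of its right-hand side (a consistency check on the abstract's inequality, not a result taken from the paper):

```latex
\Delta x \;\ge\; \frac{\hbar}{2}\left(\frac{1}{\Delta p}+\beta\,\Delta p\right),
\qquad
\frac{d}{d\,\Delta p}\left(\frac{1}{\Delta p}+\beta\,\Delta p\right)=0
\;\Longrightarrow\;
\Delta p=\frac{1}{\sqrt{\beta}},
\qquad
(\Delta x)_{\min}=\hbar\sqrt{\beta}.
```

The two quoted branches are the asymptotes of the same bound: \hbar/(2\Delta p) at small \Delta p and (\hbar\beta/2)\Delta p at large \Delta p.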
Reactor Neutrino Flux Uncertainty Suppression on Multiple Detector Experiments
Andi Cucoanes; Pau Novella; Anatael Cabrera; Muriel Fallot; Anthony Onillon; Michel Obolensky; Frederic Yermia
2015-01-02
This publication provides a coherent treatment of reactor neutrino flux uncertainty suppression, especially focused on the latest $\\theta_{13}$ measurement. The treatment starts with a single detector at a single reactor site, most relevant for all reactor experiments beyond $\\theta_{13}$. We demonstrate that there is no trivial error cancellation; the flux systematic error can remain dominant even after the adoption of multi-detector configurations. However, three mechanisms for flux error suppression have been identified and calculated in the context of the Double Chooz, Daya Bay and RENO sites. Our analysis computes the error {\\it suppression fraction} using simplified scenarios to maximise relative comparison among experiments. We have validated the only mechanism exploited so far by experiments to improve the precision of the published $\\theta_{13}$. The other two newly identified mechanisms could lead to total flux error cancellation under specific conditions and are expected to have major implications for the global $\\theta_{13}$ knowledge today. First, Double Chooz, in its final configuration, is the only experiment benefiting from a negligible reactor flux error, due to a $\\sim$90\\% geometrical suppression. Second, Daya Bay and RENO could benefit from their partial geometrical cancellation, yielding a potential $\\sim$50\\% error suppression and thus significantly improving the global $\\theta_{13}$ precision today. Third, we illustrate the rationale behind further error suppression upon exploitation of the inter-reactor error correlations, so far neglected. Our publication is thus a key step forward in the context of high-precision reactor neutrino experiments, providing insight on the suppression of their intrinsic flux uncertainty, affecting past and current experimental results as well as the design of future experiments.
The Harmonic Oscillator and the Uncertainty Principle Frank Rioux
Rioux, Frank
The Harmonic Oscillator and the Uncertainty Principle. Frank Rioux, Emeritus Professor of Chemistry, CSB|SJU. Schrödinger's equation in atomic units (h = 2π) for the harmonic oscillator has an exact … is that tunneling occurs in the simple harmonic oscillator. The classical turning point is that position at which …
Reducing uncertainty in geostatistical description with well testing pressure data
Reynolds, A.C.; He, Nanqun; Oliver, D.S.
1997-08-01
Geostatistics has proven to be an effective tool for generating realizations of reservoir properties conditioned to static data, e.g., core and log data and geologic knowledge. Due to the lack of closely spaced data in the lateral directions, there will be significant variability in reservoir descriptions generated by geostatistical simulation, i.e., significant uncertainty in the reservoir descriptions. In past work, we have presented procedures based on inverse problem theory for generating reservoir descriptions (rock property fields) conditioned to pressure data and geostatistical information represented as prior means for log-permeability and porosity and variograms. Although we have shown that the incorporation of pressure data reduces the uncertainty below the level contained in the geostatistical model based only on static information (the prior model), our previous results did not explicitly account for uncertainties in the prior means and the parameters defining the variogram model. In this work, we investigate how pressure data can help detect errors in the prior means. If errors in the prior means are large and are not taken into account, realizations conditioned to pressure data represent incorrect samples of the a posteriori probability density function for the rock property fields, whereas, if the uncertainty in the prior mean is incorporated properly into the model, one obtains realistic realizations of the rock property fields.
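The uncertainty-reduction mechanism at work here is, in its simplest form, a Bayesian update: conditioning on data shrinks the prior variance. The one-dimensional conjugate-Gaussian sketch below is a toy stand-in (with invented numbers) for conditioning an uncertain prior mean of log-permeability on a pressure-derived estimate:

```python
def gaussian_update(prior_mean, prior_var, obs, obs_var):
    """Conjugate Gaussian update for a scalar parameter with one linear,
    Gaussian observation. Posterior variance is always below the prior's."""
    gain = prior_var / (prior_var + obs_var)
    post_mean = prior_mean + gain * (obs - prior_mean)
    post_var = (1.0 - gain) * prior_var
    return post_mean, post_var

# Uncertain prior mean of log-permeability; one pressure-derived estimate.
post_mean, post_var = gaussian_update(3.0, 0.5, obs=3.8, obs_var=0.25)
```

The datum both pulls the mean toward the observation and cuts the variance, mirroring the paper's point that properly incorporating prior-mean uncertainty yields realistic conditioned realizations.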
Environmental Health Policy Decisions: The Role of Uncertainty in
Washington at Seattle, University of
in quantitative risk assessments and associated regulatory outcomes. In representing a broad spectrum of … this work is to underscore the role of irreversibility and uncertainty for benefit-cost analysis … implications for those health professionals, particularly in the area of quantitative risk assessment, whose …
Investment and Upgrade in Distributed Generation under Uncertainty
Guillas, Serge
for microgrids to use small-scale distributed generation (DG) and combined heat and power (CHP) applications … Keywords: combined heat … heat exchangers (HXs) to meet local energy loads. Although the electric-only efficiency of DG is lower …
A Framework for Analysis of the Uncertainty of Socioeconomic Growth
A Framework for Analysis of the Uncertainty of Socioeconomic Growth and Climate Change on the Risk of Water Stress: a Case Study in Asia. Charles Fant, C. Adam …
Constructing Bayesian-network Models of Software Testing and Maintenance Uncertainties
Ziv, Hadar
To appear. Keywords: software requirements, software testing, software maintenance. 1 Introduction: Development-to-market constraints … In real-life software projects, testing is often done after coding is complete, does …
Partition Testing versus Random Testing: the Influence of Uncertainty
Gutjahr, Walter
Walter J. Gutjahr, Department … Keywords: … detection, partition testing, program testing, random testing, software testing. I. Introduction: Few topics in software testing methodology seem to be more controversial than the question whether …
H∞ Optimization with Plant Uncertainty and Semidefinite Programming
Helton, J. William
J. William Helton, Dept. … worst-case frequency-domain specifications. This is a non-smooth optimization problem that underlies … in Qualitative Feedback Theory, for example. It is shown in this article how the fundamental H∞ optimization …
Impact of PV forecasts uncertainty in batteries management in microgrids
Paris-Sud XI, Université de
Andrea Michiorri, Arthur … battery schedule optimisation in microgrids in the presence of network constraints. We examine a specific case … production forecast algorithm is used in combination with a battery schedule optimisation algorithm. The size …
Bayesian System Identification and Response Predictions Robust to Modeling Uncertainty
Beck, James L.
(e.g. system ID, structural health monitoring, robust control, state and/or parameter estimation) … Outline: … seismic ground acceleration; finite element model with uncertain parameters; posterior analysis … System performance measure in the presence of uncertainty: failure probability …
Resilience and Water Governance: Addressing Fragmentation and Uncertainty in Water Allocation and Water Quality Law
FIVE. Barbara A. Cosens and Craig A. Stow. The U.S. EPA reports that almost half … the Federal Water Pollution Control Act, commonly known as the Clean Water Act (CWA), to clean up point source discharges from …
Managing Uncertainty in Operational Control of Water Distribution Systems
Bargiela, Andrzej
A. Bargiela, Department … Operation of water distribution systems requires a variety of decisions to be made. There are system management decisions concerning the regulatory measures such as water pricing principles, effluent …
Forecasting Uncertainty Related to Ramps of Wind Power Production
Boyer, Edmond
Arthur Bossavy, Robin Girard … The continuous improvement of the accuracy of wind power forecasts is motivated by increasing wind power … This paper presents two methods focusing on forecasting large and sharp variations in the power output of a wind …
Post-doc: Uncertainty Quantification of Offshore Wind Farms
Jansen, Erik
Faculty/department: Aerospace … systems, from small wind turbines to large offshore wind farms, are the focus of research in the Wind … research activities; there is a focus on large, multi-megawatt offshore wind turbines and offshore wind farms …
Stochastic Optimization for Operating Chemical Processes under Uncertainty
Henrion, René
Mathematical optimization techniques are on their way to becoming a standard tool in chemical … René Henrion, Pu Li … for the explicit treatment of uncertainties by stochastic optimization. 1 Operating Chemical Processes: Chemical …
Distributed Generation Investment by a Microgrid under Uncertainty. Afzal Siddiqui
Guillas, Serge
Afzal Siddiqui, University … , CA 94720-8163, USA, c_marnay@lbl.gov. This paper examines a California-based microgrid … As the long-term natural gas generation cost is stochastic, we initially assume that the microgrid may purchase electricity …
ASME PTC 47, gasification combined cycle performance -- Uncertainty
Archer, D.H.; Horazak, D.A.; Bannister, R.L.
1998-07-01
Determining the uncertainty of measured and calculated performance parameters is required in all Performance Test Codes of the ASME. This determination begins with the equations defining the performance parameters of the equipment or system: input, useful output, and effectiveness (an input/output ratio). The variables in these equations are: plant operating conditions measured throughout a test; and corrections that compensate for deviations of other significant measured plant and ambient operating conditions from the values specified for the test. PTC 47, Gasification Combined Cycle Plant Performance, will define procedures for the performance testing of overall gasification combined cycle plants and of each of the three major sections that may comprise such a plant. The Committee is now defining the performance parameters for these tests. Performance factor computations include uncertainty calculations in order to provide preliminary estimates of the accuracy expected from the test methods proposed in this Code. Uncertainty calculations will also be used to explore energy-balance methods for calculating the energy input to various gasifiers (entrained flow, fluidized bed, and moving bed). Such methods would be important as possible alternatives to direct measurements of the flows and heating values of the various fuels fed to the gasifiers. Uncertainty calculations will also be used to help identify those measurements of ambient, imposed, and controlled operating conditions that significantly affect test results and for which correction factors should be determined.
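PTC-style test uncertainty combines bias and random components by root-sum-square. The sketch below shows only that final combination step with invented numbers; it is in the spirit of ASME PTC 19.1, not the standard's full procedure:

```python
import math

def total_uncertainty(bias_limits, random_std, n, t95=2.0):
    """Combine elemental bias limits and the random uncertainty of the
    mean by root-sum-square (the U_RSS form). Illustrative only."""
    B = math.sqrt(sum(b * b for b in bias_limits))  # combined bias limit
    S_mean = random_std / math.sqrt(n)              # std dev of the mean
    return math.sqrt(B ** 2 + (t95 * S_mean) ** 2)

# Hypothetical flux measurement: two bias sources, 16 repeated readings.
U = total_uncertainty(bias_limits=[0.5, 0.3], random_std=0.8, n=16)
```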
Managing Uncertainty in Sound based Control for an Autonomous Helicopter
Hopgood, Adrian
Benjamin N. Passow, Mario … research using a multi-purpose, small and low-cost autonomous helicopter platform (Flyper). We … supervised method to localise the indoor helicopter and extract meaningful information to enable …
Comment on "Coherence and Uncertainty in Nanostructured Organic Photovoltaics"
Mukamel, Shaul
Shaul Mukamel. … of the eigenfunctions of the Schrödinger equation that describes the nanostructured organic photovoltaic blend. … universal "coherence effects" in photovoltaics is misguided. There are two types of coherences in many …
Leveraging Uncertainty in Design-to-Criteria Scheduling Thomas Wagner
Massachusetts at Amherst, University of
Thomas Wagner, Computer Science Department … Technical Report 1997-11, January 20, 1996. Design-to-Criteria scheduling is the process of custom … that describes alternate ways to achieve tasks and subtasks. Formerly, Design-to-Criteria scheduling relied on simple expected value …
Incorporating Uncertainty into Dam Safety Risk Assessment
Bowles, David S.
Sanjay S. Chauhan and David S. Bowles. Risk assessment is becoming more widely used to supplement traditional approaches to dam safety decision-making. Dam owners throughout Australia, the U.S. Army Corps of Engineers, and the U.S. Bureau …
Oil and Gas Production Optimization; Lost Potential due to Uncertainty
Johansen, Tor Arne
Steinar M. Elgsaeter, Olav … The information content in measurements of offshore oil and gas production is often low, and … Production, in the context of offshore oil and gas fields, can be considered the total output of production wells, a mass …
Chaos: a bridge from microscopic uncertainty to macroscopic randomness
S. J. Liao
2012-01-09
It is traditionally believed that macroscopic randomness has nothing to do with micro-level uncertainty. Moreover, the sensitive dependence on initial conditions (SDIC) of Lorenz chaos has never been considered together with the so-called continuum assumption of fluids (on which the Lorenz equations are based) from physical and statistical viewpoints. A very fine numerical technique (Liao, 2009) with negligible truncation and round-off errors, called here "clean numerical simulation" (CNS), is applied to investigate the propagation of the micro-level, unavoidable uncertain fluctuation (caused by the continuum assumption of fluids) of initial conditions for the Lorenz equations with chaotic solutions. Our statistical analysis, based on CNS computation of 10,000 samples, shows that, due to the SDIC, the uncertainty of the micro-level statistical fluctuation of initial conditions transfers into the macroscopic randomness of chaos. This suggests that chaos might be a bridge from micro-level uncertainty to macroscopic randomness, and thus an origin of macroscopic randomness. We reveal in this article that, due to the SDIC of chaos and the inherent uncertainty of initial data, accurate long-term prediction of a chaotic solution is not only impossible mathematically but also has no physical meaning. This might provide us with a new, different viewpoint from which to deepen and enrich our understanding of the SDIC of chaos.
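The sensitive dependence on initial conditions that the abstract leans on is easy to reproduce numerically (though with ordinary floating-point arithmetic, not the CNS technique the paper uses). The forward-Euler sketch below perturbs one Lorenz trajectory by 1e-10 and watches the separation grow by many orders of magnitude:

```python
import numpy as np

def lorenz_step(s, dt=0.001, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz equations (a crude integrator,
    adequate for demonstrating divergence, unlike the paper's CNS)."""
    x, y, z = s
    return np.array([
        x + dt * sigma * (y - x),
        y + dt * (x * (rho - z) - y),
        z + dt * (x * y - beta * z),
    ])

a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-10, 0.0, 0.0])   # micro-level perturbation
for _ in range(30_000):               # 30 time units at dt = 0.001
    a, b = lorenz_step(a), lorenz_step(b)

separation = np.linalg.norm(a - b)    # far larger than the initial 1e-10
```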
Observability and Estimation Uncertainty Analysis for PMU Placement Alternatives
Pollefeys, Marc
Observability and Estimation Uncertainty Analysis for PMU Placement Alternatives. Jinghe Zhang. The phasor measurement unit (PMU), developed in the 1980s, is considered to be one of the most important devices in the future of power systems. While PMU measurements currently cover fewer than 1% of the nodes in the U.S. power grid
Quantification of nuclear uncertainties in nucleosynthesis of elements beyond Iron
T. Rauscher
2014-12-22
Nucleosynthesis beyond Fe poses additional challenges not encountered when studying astrophysical processes involving light nuclei. Generally higher temperatures and nuclear level densities lead to stronger contributions from transitions on excited target states. This may prevent cross-section measurements from determining stellar reaction rates, so theory contributions remain important. Furthermore, measurements often are not feasible in the astrophysically relevant energy range. Sensitivity analysis not only allows the contributing nuclear properties to be determined but is also a handy tool for experimentalists to interpret the impact of their data on predicted cross sections and rates. It can also speed up future input-variation studies of nucleosynthesis by simplifying an intermediate step in the full calculation sequence. Large-scale predictions of sensitivities and of ground-state contributions to the stellar rates are presented, allowing an estimate of how well rates can be directly constrained by experiment. The reactions 185W(n,gamma) and 186W(gamma,n) are discussed as application examples. Studies of uncertainties in abundances predicted by nucleosynthesis simulations rely on knowledge of reaction-rate errors. An improved treatment of uncertainty analysis is presented, as well as a recipe for combining experimental data and theory to arrive at a new reaction rate and its uncertainty. As an example, it is applied to neutron-capture rates for the s-process, leading to larger uncertainties than previously assumed.
Representing Uncertainty in Decision Support Harvey J. Greenberg
Greenberg, Harvey J.
overlapping factors. · Source of uncertainty. Is it a future event, such as product demand? Is it ignorance, such as what defines a proven reserve? · Time scale. Is the decision a strategic one, to cover a planning horizon? Is it a major setback, such as the loss of a company? Is it catastrophic, such as the loss of a city? · Decision
Medium Term Planning & Scheduling under Uncertainty for BP Chemicals
Grossmann, Ignacio E.
Medium Term Planning & Scheduling under Uncertainty for BP Chemicals. Progress Report, Murat Kurt. Outline: Products & Applications; Models; Results & Future Research. Products: PTA (Purified Terephthalic Acid). By-products: Benzene, used elsewhere by other BP companies; Styrene
UNCERTAINTY MEASURES AND LIMITING DISTRIBUTIONS FOR FILAMENT ESTIMATION
UNCERTAINTY MEASURES AND LIMITING DISTRIBUTIONS FOR FILAMENT ESTIMATION. By Yen-Chi Chen. 1. Introduction. A filament is a one-dimensional, smooth, connected structure embedded in a multi-dimensional space. Filaments arise in many applications. For example, matter in the universe tends to concentrate
Assessing Uncertainty in Simulation Based Maritime Risk Assessment
van Dorp, Johan René
Assessing Uncertainty in Simulation Based Maritime Risk Assessment. Abstract: Recent work in the assessment of risk in maritime transportation systems has used simulation-based probabilistic risk assessment techniques. In the Prince William Sound and Washington State Ferries risk assessments, the studies
Early Power Grid Verification Under Circuit Current Uncertainties
Najm, Farid N.
Early Power Grid Verification Under Circuit Current Uncertainties. Imad A. Ferzli, Department of ECE; Eindhoven, The Netherlands, lars@magma-da.com. ABSTRACT: As power grid safety becomes increasingly important in modern integrated circuits, so does the need to start power grid verification early in the design cycle
Coal supply and cost under technological and environmental uncertainty
Coal supply and cost under technological and environmental uncertainty. Submitted in partial fulfillment. From the acknowledgements: my conversations with Kurt Walzer at Clean Air Task Force and Rory McIlmoil helped shape several chapters. I did not complete this work alone; I had a lot of help along the way.
Planning with Concurrency under Resources and Time Uncertainty
Kabanza, Froduald
and François Michaud. Abstract: Planning with action concurrency under resource and time uncertainty is considered; the duration of a navigation task and the energy required to fulfill it are probabilistic. Prior approaches include the Generate, Test, and Debug (GTD) framework [11, 6]. Others include the Factory Policy Gradient
What makes information valuable: signal reliability and environmental uncertainty
Stephens, David W.
uncertainty in animal signal use. We developed a simple model that predicts when animals should switch. © The Association for the Study of Animal Behaviour. Published by Elsevier Ltd. All rights reserved. An example of a reliable signal is the carotenoid-based plumage coloration in male house finches
Plant-Wide Waste Management. 2. Decision Making under Uncertainty
Linninger, Andreas A.
Plant-Wide Waste Management. 2. Decision Making under Uncertainty. Aninda Chakraborty and Andreas A. Linninger, University of Illinois at Chicago, Chicago, Illinois 60607. The synthesis and optimization of plant-wide waste management flowsheets produces a superstructure that embeds all plant-wide waste management policies. In the subsequent
Developments of the Price equation and natural selection under uncertainty
Grafen, Alan
Developments of the Price equation and natural selection under uncertainty. Alan Grafen. ...success, following Darwin (1859). Here, this project is pursued by developing the Price equation, first... Here, a new theoretical development arising from the Price equation provides the means to employ these approaches.
POWER SCHEDULING IN A HYDRO-THERMAL SYSTEM UNDER UNCERTAINTY
Römisch, Werner
POWER SCHEDULING IN A HYDRO-THERMAL SYSTEM UNDER UNCERTAINTY. C.C. Carøe, M.P. Nowak, W. Römisch. A model including pumped-storage hydro units is developed; for its computational solution, two different decomposition approaches are used. We consider thermal units, pumped-storage hydro plants and delivery contracts, and describe an optimization model
Submitted to Building and Environment UNCERTAINTY IN AIR FLOW CALCULATIONS
LBL-25415. Submitted to Building and Environment. UNCERTAINTY IN AIR FLOW CALCULATIONS USING TRACER GAS MEASUREMENTS. Tracer gas techniques are becoming widely used to measure the ventilation rates in buildings. As more detailed information... Supported by the Assistant Secretary for Conservation and Renewable Energy, Office of Building and Community Systems
A Computational Framework for Uncertainty Quantification and ...
2010-03-05
in stochastic unit commitment/energy dispatch formulations that ... presents many challenges to the operation of the electrical power ... In [12], [15], artificial neural network (ANN) models ... is highly flexible and allows us to consider these extensions ... real size with hundreds of generators, transmission constraints, ...
Gerhard Strydom; Friederike Bostelmann; Kostadin Ivanov
2014-10-01
The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable, high-fidelity physics models and robust, efficient, and accurate codes. One way to address the uncertainties in the HTGR analysis tools is to assess the sensitivity of critical parameters (such as the calculated maximum fuel temperature during loss-of-coolant accidents) to a few important input uncertainties. The input parameters were identified by engineering judgement in the past, but today they are typically based on a Phenomena Identification and Ranking Table (PIRT) process. The input parameters can also be derived from sensitivity studies and are then varied in the analysis to find a spread in the parameter of importance. However, there is often no easy way to compensate for these uncertainties. In engineering system design, a common approach for addressing performance uncertainties is to add compensating margins to the system, but with passive properties credited it is not so clear how to apply this in the case of the modular HTGR heat removal path. Other, more sophisticated uncertainty modelling approaches, including Monte Carlo analysis, have also been proposed and applied. Ideally, one wishes to apply a more fundamental approach to determine the predictive capability and accuracy of the coupled neutronics/thermal-hydraulics and depletion simulations used for reactor design and safety assessment. Today there is broader acceptance of the use of uncertainty analysis even in safety studies, and in some cases regulators have accepted it as a replacement for the traditional conservative analysis. Therefore some safety analysis calculations may use a mixture of these approaches for different parameters, depending upon the particular requirements of the analysis problem involved.
Sensitivity analysis can, for example, be used to provide information as part of an uncertainty analysis to determine best-estimate-plus-uncertainty results to the required confidence level. In order to address uncertainty propagation in analysis and methods in the HTGR community, the IAEA initiated a Coordinated Research Project (CRP) on HTGR Uncertainty Analysis in Modelling (UAM) [6] that officially started in 2013. Although this project focuses specifically on the peculiarities of HTGR designs and their simulation requirements, many lessons can be learned from the LWR community and the significant progress already made towards a consistent uncertainty analysis methodology. In the case of LWRs, the NRC amended 10 CFR 50.46 as early as 1988 to allow best-estimate (plus uncertainties) calculations of emergency core cooling system performance. The Nuclear Energy Agency (NEA) of the Organization for Economic Co-operation and Development (OECD) also established an Expert Group on "Uncertainty Analysis in Modelling", which finally led to the definition of the "Benchmark for Uncertainty Analysis in Modelling (UAM) for Design, Operation and Safety Analysis of LWRs" [7]. The CRP on HTGR UAM will follow as far as possible the ongoing OECD Light Water Reactor UAM benchmark activity.
Perko, Z.; Gilli, L.; Lathouwers, D.; Kloosterman, J. L.
2013-07-01
Uncertainty quantification plays an increasingly important role in the nuclear community, especially with the rise of Best Estimate Plus Uncertainty methodologies. Sensitivity analysis, surrogate models, Monte Carlo sampling and several other techniques can be used to propagate input uncertainties. In recent years however polynomial chaos expansion has become a popular alternative providing high accuracy at affordable computational cost. This paper presents such polynomial chaos (PC) methods using adaptive sparse grids and adaptive basis set construction, together with an application to a Gas Cooled Fast Reactor transient. Comparison is made between a new sparse grid algorithm and the traditionally used technique proposed by Gerstner. An adaptive basis construction method is also introduced and is proved to be advantageous both from an accuracy and a computational point of view. As a demonstration the uncertainty quantification of a 50% loss of flow transient in the GFR2400 Gas Cooled Fast Reactor design was performed using the CATHARE code system. The results are compared to direct Monte Carlo sampling and show the superior convergence and high accuracy of the polynomial chaos expansion. Since PC techniques are easy to implement, they can offer an attractive alternative to traditional techniques for the uncertainty quantification of large scale problems. (authors)
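The projection step behind a polynomial chaos expansion can be shown in a toy sketch, entirely unrelated to CATHARE or the GFR2400 transient: expand f(x) = exp(x) of a standard-normal input in probabilists' Hermite polynomials, and compare the PC mean and variance with exact values and with plain Monte Carlo sampling. The function, order, and sample sizes are illustrative choices.

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as H

# Gauss quadrature for the weight exp(-x^2/2); the weights sum to sqrt(2*pi)
nodes, weights = H.hermegauss(20)
norm = math.sqrt(2.0*math.pi)

f = np.exp              # toy model response of a standard-normal input x

# PC coefficients c_k = E[f(x) He_k(x)] / k!, since E[He_k(x)^2] = k!
order = 8
c = np.array([
    np.sum(weights * f(nodes) * H.hermeval(nodes, [0.0]*k + [1.0])) / norm / math.factorial(k)
    for k in range(order + 1)
])

pc_mean = c[0]                                                        # exact: exp(1/2)
pc_var = sum(math.factorial(k)*c[k]**2 for k in range(1, order + 1))  # exact: e^2 - e

# Plain Monte Carlo reference, converging only like n^(-1/2)
rng = np.random.default_rng(7)
mc_mean = f(rng.standard_normal(100_000)).mean()
```

With only 20 quadrature nodes and 9 coefficients, the PC mean and variance match the analytic values to several digits, while the Monte Carlo estimate still carries sampling noise, which is the accuracy-versus-cost trade-off the abstract refers to.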
Haihua Zhao; Vincent A. Mousseau; Nam T. Dinh
2010-10-01
The Code Scaling, Applicability, and Uncertainty (CSAU) methodology was developed in the late 1980s by the US NRC to systematically quantify reactor simulation uncertainty. Based on the CSAU methodology, Best Estimate Plus Uncertainty (BEPU) methods have been developed and widely used for new reactor designs and for power uprates of existing LWRs. In spite of these successes, several aspects of CSAU have been criticized: (1) subjective judgement in the PIRT process; (2) high cost, due to heavy reliance on a large experimental database, many expert man-years of work, and very high computational overhead; (3) mixing of numerical errors with other uncertainties; (4) grid dependence and the use of the same numerical grids for both scaled experiments and real plant applications; (5) user effects. Although a large amount of effort has gone into improving the CSAU methodology, the above issues still exist. With the effort to develop next-generation safety analysis codes, new opportunities appear to take advantage of new numerical methods, better physical models, and modern uncertainty quantification methods. Forward sensitivity analysis (FSA) directly solves the PDEs for parameter sensitivities (defined as the differential of the physical solution with respect to any constant parameter). When the parameter sensitivities are available in a new advanced system analysis code, CSAU could be significantly improved: (1) Quantifying numerical errors: new codes that are fully implicit and of higher-order accuracy can run much faster, with numerical errors quantified by FSA. (2) Quantitative PIRT (Q-PIRT) to reduce subjective judgement and improve efficiency: numerical errors are treated as special sensitivities alongside other physical uncertainties, and only parameters with large uncertainty effects on design criteria are considered.
(3) Greatly reducing computational costs for uncertainty quantification by (a) choosing optimized time steps and spatial sizes and (b) using gradient information (sensitivity results) to reduce the number of samples. (4) Allowing grid independence for scaled integral effect test (IET) simulations and real plant applications: (a) eliminating numerical uncertainty in scaling; (b) reducing experimental cost by allowing smaller scaled IETs; (c) eliminating user effects. This paper will review the issues related to the current CSAU, introduce FSA, discuss a potential Q-PIRT process, and show simple examples of performing FSA. Finally, the general research direction and the requirements for using FSA in a system analysis code will be discussed.
Uncertainty and sensitivity analysis of fission gas behavior in engineering-scale fuel modeling
G. Pastore; L.P. Swiler; J.D. Hales; S.R. Novascone; D.M. Perez; B.W. Spencer; L. Luzzi; P. Van Uffelen; R.L. Williamson
2014-10-01
The role of uncertainties in fission gas behavior calculations as part of engineering-scale nuclear fuel modeling is investigated using the BISON fuel performance code and a recently implemented physics-based model for the coupled fission gas release and swelling. Through the integration of BISON with the DAKOTA software, a sensitivity analysis of the results to selected model parameters is carried out based on UO2 single-pellet simulations covering different power regimes. The parameters are varied within ranges representative of the relative uncertainties and consistent with the information from the open literature. The study leads to an initial quantitative assessment of the uncertainty in fission gas behavior modeling with the parameter characterization presently available. Also, the relative importance of the single parameters is evaluated. Moreover, a sensitivity analysis is carried out based on simulations of a fuel rod irradiation experiment, pointing out a significant impact of the considered uncertainties on the calculated fission gas release and cladding diametral strain. The results of the study indicate that the commonly accepted deviation between calculated and measured fission gas release by a factor of 2 approximately corresponds to the inherent modeling uncertainty at high fission gas release. Nevertheless, higher deviations may be expected for values around 10% and lower. Implications are discussed in terms of directions of research for the improved modeling of fission gas behavior for engineering purposes.
Donald M. McEligot; Hugh M. McIlroy, Jr.; Ryan C. Johnson
2007-11-01
The purpose of the fluid dynamics experiments in the MIR (Matched-Index-of-Refraction) flow system at Idaho National Laboratory (INL) is to develop benchmark databases for the assessment of Computational Fluid Dynamics (CFD) solutions of the momentum equations, scalar mixing, and turbulence models for typical Very High Temperature Reactor (VHTR) plenum geometries in the limiting case of negligible buoyancy and constant fluid properties. The experiments use optical techniques, primarily particle image velocimetry (PIV), in the INL MIR flow system. The benefit of the MIR technique is that it permits optical measurements of flow characteristics in passages and around objects without placing a disturbing transducer in the flow field and without distortion of the optical paths. The objective of the present report is to develop an understanding of the magnitudes of experimental uncertainties in the results to be obtained in such experiments. Unheated MIR experiments are first steps when the geometry is complicated; one does not want to use a computational technique that cannot even handle constant properties properly. This report addresses the general background, requirements for benchmark databases, estimation of experimental uncertainties in mean velocities and turbulence quantities, the MIR experiment, PIV uncertainties, positioning uncertainties, and other contributing measurement uncertainties.
Generalized Uncertainty Relations and Long Time Limits for Quantum Brownian Motion Models
C. Anastopoulos; J. J. Halliwell
1994-07-27
We study the time evolution of the reduced Wigner function for a class of quantum Brownian motion models. We derive two generalized uncertainty relations. The first consists of a sharp lower bound on the uncertainty function, $U = (\Delta p)^2 (\Delta q)^2$, after evolution for time $t$ in the presence of an environment. The second, a stronger and simpler result, consists of a lower bound at time $t$ on a modified uncertainty function, essentially the area enclosed by the $1\sigma$ contour of the Wigner function. In both cases the minimizing initial state is a non-minimal Gaussian pure state. These generalized uncertainty relations supply a measure of the comparative size of quantum and thermal fluctuations. We prove two simple inequalities, relating uncertainty to von Neumann entropy, and the von Neumann entropy to linear entropy. We also prove some results on the long-time limit of the Wigner function for arbitrary initial states. For the harmonic oscillator the Wigner function for all initial states becomes a Gaussian at large times (often, but not always, a thermal state). We derive the explicit form of the long-time limit for the free particle (which does not in general go to a Gaussian), and also for more general potentials in the approximation of high temperature.
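The uncertainty function U = (Δp)²(Δq)² that the bound applies to is easy to evaluate numerically. The sketch below uses a plain discretized Gaussian wave function, not the paper's open-system evolution, and checks that a minimum-uncertainty Gaussian gives U = ħ²/4 (with ħ = 1); the width and grid are arbitrary choices.

```python
import numpy as np

hbar = 1.0
sigma = 0.7                            # position spread of the Gaussian (arbitrary)
x = np.linspace(-20.0, 20.0, 4001)
dx = x[1] - x[0]

# Normalized Gaussian wave function (real-valued, so <p> = 0)
psi = (2.0*np.pi*sigma**2)**(-0.25) * np.exp(-x**2/(4.0*sigma**2))

prob = psi**2
norm = prob.sum()*dx                   # should be ~1
mean_q = (x*prob).sum()*dx
var_q = ((x - mean_q)**2 * prob).sum()*dx   # (Delta q)^2 = sigma^2

# <p^2> = hbar^2 * integral |psi'(x)|^2 dx for a real wave function
dpsi = np.gradient(psi, dx)
var_p = hbar**2 * (dpsi**2).sum()*dx        # (Delta p)^2 = hbar^2/(4 sigma^2)

U = var_p * var_q                           # minimum-uncertainty value: hbar^2/4
```

The same recipe applied to the evolving reduced state would trace how the environment pushes U above this quantum floor toward thermal values, the comparison the abstract highlights.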
Reduction in maximum time uncertainty of paired time signals
Theodosiou, G.E.; Dawson, J.W.
1981-02-11
Reduction in the maximum time uncertainty (t_max - t_min) of a series of paired time signals t_1 and t_2 varying between two input terminals and representative of a series of single events, where t_1 <= t_2 and t_1 + t_2 equals a constant, is carried out with a circuit utilizing a combination of OR and AND gates as signal-selecting means and one or more time delays to increase the minimum value (t_min) of the first signal t_1 closer to t_max and thereby reduce the difference. The circuit may utilize a plurality of stages to reduce the uncertainty by factors of 20 to 800.
Modeling Heterogeneity in Networks using Uncertainty Quantification Tools
Rajendran, Karthikeyan; Siettos, Constantinos I; Laing, Carlo R; Kevrekidis, Ioannis G
2015-01-01
Using the dynamics of information propagation on a network as our illustrative example, we present and discuss a systematic approach to quantifying heterogeneity and its propagation that borrows established tools from Uncertainty Quantification. The crucial assumption underlying this mathematical and computational "technology transfer" is that the evolving states of the nodes in a network quickly become correlated with the corresponding node "identities": features of the nodes imparted by the network structure (e.g. the node degree, the node clustering coefficient). The node dynamics thus depend on heterogeneous (rather than uncertain) parameters, whose distribution over the network results from the network structure. Knowing these distributions allows us to obtain an efficient coarse-grained representation of the network state in terms of the expansion coefficients in suitable orthogonal polynomials. This representation is closely related to mathematical/computational tools for uncertainty quantification (th...
Uncertainties in nuclear transition matrix elements of neutrinoless ββ decay
Rath, P. K.
2013-12-30
To estimate the uncertainties associated with the nuclear transition matrix elements M^(K) (K = 0ν/0N) for the 0+ → 0+ transitions of electron- and positron-emitting modes of the neutrinoless ββ decay, a statistical analysis has been performed by calculating sets of eight (twelve) different nuclear transition matrix elements M^(K) in the PHFB model, employing four different parameterizations of a Hamiltonian with pairing plus multipolar effective two-body interaction and two (three) different parameterizations of Jastrow short-range correlations. The averages, in conjunction with their standard deviations, provide an estimate of the uncertainties associated with the nuclear transition matrix elements M^(K) calculated within the PHFB model, the maximum of which turn out to be 13% and 19% for the exchange of light and heavy Majorana neutrinos, respectively.
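The statistical recipe described (average a small set of matrix elements computed under different parameterizations and quote the spread as the uncertainty) is simple to state in code. The numbers below are made up for illustration; they are not PHFB results.

```python
import numpy as np

# Hypothetical set of matrix elements M^(K), one per combination of
# Hamiltonian parameterization and short-range-correlation choice
m_k = np.array([4.2, 3.8, 4.6, 4.0, 3.9, 4.4, 4.1, 3.7])

mean = m_k.mean()
std = m_k.std(ddof=1)                 # sample standard deviation over the set
rel_uncertainty = 100.0*std/mean      # percent uncertainty, as quoted in the abstract
```

For this made-up sample the relative uncertainty comes out in the single-digit-percent range; the paper's analogous figures for its computed sets are 13% and 19%.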
Cumulative theoretical uncertainties in lithium depletion boundary age
Tognelli, Emanuele; Degl'Innocenti, Scilla
2015-01-01
We performed a detailed analysis of the main theoretical uncertainties affecting the age at the lithium depletion boundary (LDB). To do so, we computed almost 12000 pre-main-sequence models with mass in the range [0.06, 0.4] M_sun, varying the input physics (nuclear reaction cross-sections, plasma electron screening, outer boundary conditions, equation of state, and radiative opacity), the initial chemical element abundances (total metallicity, helium and deuterium abundances, and heavy-element mixture), and the convection efficiency (mixing length parameter, alpha_ML). As a first step, we studied the effect of varying these quantities individually within their extreme values. Then, we analysed the impact of simultaneously perturbing the main inputs/parameters without an a priori assumption of independence. Such an approach allowed us to build for the first time the cumulative error stripe, which defines the edges of the maximum uncertainty region in the theoretical LDB age. We found that the cumulative error stripe ...
Uncertainty quantification for CO2 sequestration and enhanced oil recovery
Dai, Zhenxue; Fessenden-Rahn, Julianna; Middleton, Richard; Pan, Feng; Jia, Wei; Lee, Si-Yong; McPherson, Brian; Ampomah, William; Grigg, Reid
2014-01-01
This study develops a statistical method to perform uncertainty quantification for understanding CO2 storage potential within an enhanced oil recovery (EOR) environment at the Farnsworth Unit of the Anadarko Basin in northern Texas. A set of geostatistical-based Monte Carlo simulations of CO2-oil-water flow and reactive transport in the Morrow formation are conducted for global sensitivity and statistical analysis of the major uncertainty metrics: net CO2 injection, cumulative oil production, cumulative gas (CH4) production, and net water injection. A global sensitivity and response surface analysis indicates that reservoir permeability, porosity, and thickness are the major intrinsic reservoir parameters that control net CO2 injection/storage and oil/gas recovery rates. The well spacing and the initial water saturation also have large impact on the oil/gas recovery rates. Further, this study has revealed key insights into the potential behavior and the operational parameters of CO2 sequestration at CO2-EOR s...
RIVERWARE'S INTEGRATED MODELING AND ANALYSIS TOOLS FOR LONG-TERM PLANNING UNDER UNCERTAINTY
RIVERWARE'S INTEGRATED MODELING AND ANALYSIS TOOLS FOR LONG-TERM PLANNING UNDER UNCERTAINTY. Planning river and reservoir operations under hydrologic uncertainty benefits from modeling capabilities that include: (1) a river and reservoir modeling tool that can represent various planning alternatives and easily run
Monte Carlo Methods for Uncertainty Quantification Mathematical Institute, University of Oxford
Giles, Mike
Monte Carlo Methods for Uncertainty Quantification. Mike Giles, Mathematical Institute, University of Oxford. ERCOFTAC course on Mathematical Methods and Tools in Uncertainty Management and Quantification. Lecture 1: Introduction and Monte Carlo basics; some model applications; random number generation; Monte Carlo estimation
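The Lecture 1 topics reduce to a few lines of code. Below is a minimal sketch (the integrand and sample size are arbitrary choices, not taken from the course): a Monte Carlo estimate of an expectation together with its standard error, which shrinks like n^(-1/2).

```python
import numpy as np

rng = np.random.default_rng(42)   # random number generation

def g(x):
    # Arbitrary quantity of interest; analytically E[g(X)] = 1/sqrt(3) for X ~ N(0,1)
    return np.exp(-x**2)

n = 100_000
values = g(rng.standard_normal(n))        # sample X ~ N(0, 1), evaluate the model
estimate = values.mean()                  # Monte Carlo estimator of E[g(X)]
stderr = values.std(ddof=1)/np.sqrt(n)    # standard error, O(n^(-1/2))
```

Quadrupling n halves the standard error; that slow but dimension-independent convergence is what motivates the variance-reduction and multilevel techniques covered later in such courses.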
de Weck, Olivier L.
Technology Decisions Under Architectural Uncertainty: Informing Investment Decisions. Given this architectural uncertainty, it is difficult to define the value proposition of technology investments. This paper proposes a method for evaluating technology across a tradespace defined by architectural decisions. Main
Quantification of Uncertainties Due to Opacities in a Laser-Driven Radiative-Shock Problem
Hetzler, Adam C
2013-03-28
This research presents new physics-based methods to estimate predictive uncertainty stemming from uncertainty in the material opacities in radiative transfer computations of key quantities of interest (QOIs). New methods ...
Application of price uncertainty quantification models and their impacts on project evaluations
Fariyibi, Festus Lekan
2006-10-30
This study presents an analysis of several recently published methods for quantifying the uncertainty in economic evaluations due to uncertainty in future oil prices. Conventional price forecasting methods used in the industry typically...
Dealing with parameter uncertainty in the calculation of water surface profiles
Vargas-Cruz, Ruben F.
1998-01-01
for the reliability of the analysis results. Uncertainty analysis should be adopted as another important step within the hydrologic and hydraulic analysis. Unfortunately, the incorporation of uncertainty analysis in many engineering studies is not a common practice...
Financial reporting quality and uncertainty about credit risk among the ratings agencies
Akins, Brian Keith
2012-01-01
I study whether financial reporting quality resolves uncertainty about credit risk by examining how it affects disagreement between rating agencies. I find better reporting quality is associated with less uncertainty about ...
Rivington, Michael
2011-06-28
This Thesis explored a range of approaches to study the uncertainty and impacts associated with climate change at the farm scale in Scotland. The research objective was to use a process of uncertainty evaluation and simulation modelling to provide...
Uncertainty quantification and prediction for non-autonomous linear and nonlinear systems
Phadnis, Akash
2013-01-01
The science of uncertainty quantification has gained a lot of attention over recent years. This is because models of real processes always contain some elements of uncertainty, and also because real systems can be better ...
Holmes, Christopher D.
Reactive greenhouse gas scenarios: Systematic exploration of uncertainties and the role of chemistry. Understanding the chemistry of reactive greenhouse gases is needed to accurately quantify the relationship between human activities and climate, and to incorporate uncertainty into our projections of greenhouse gas abundances. We
Zettlemoyer, M.D.
1990-01-01
The Air Force Toxic Chemical Dispersion (AFTOX) model is a Gaussian puff dispersion model that predicts plumes, concentrations, and hazard distances of toxic chemical spills. A measurement uncertainty propagation formula derived by Freeman et al. (1986) is used within AFTOX to estimate the concentration uncertainties that result from data input uncertainties in wind speed, spill height, emission rate, and the horizontal and vertical Gaussian dispersion parameters, and the results are compared to true uncertainties as estimated by standard deviations computed from Monte Carlo simulations. The measurement uncertainty propagation formula was found to overestimate the measurement uncertainty in AFTOX-calculated concentrations by at least 350 percent, with overestimates worsening with increasing stability and/or increasing measurement uncertainty.
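The comparison the study performs, an analytic first-order propagation formula checked against Monte Carlo standard deviations, can be sketched on a toy model. This is not the AFTOX model or the Freeman et al. formula; c = Q/u is a hypothetical plume-like concentration (emission rate over wind speed) with made-up input uncertainties.

```python
import numpy as np

# Hypothetical inputs: emission rate Q and wind speed u, each with 1-sigma uncertainty
Q, sig_Q = 10.0, 1.0
u, sig_u = 5.0, 0.5

# First-order (Taylor-series) propagation through c = Q/u:
# sig_c^2 = (dc/dQ)^2 sig_Q^2 + (dc/du)^2 sig_u^2
sig_c_formula = np.sqrt((sig_Q/u)**2 + (Q*sig_u/u**2)**2)

# Monte Carlo reference: sample the inputs, push them through the model,
# and take the standard deviation of the outputs
rng = np.random.default_rng(1)
n = 200_000
c = rng.normal(Q, sig_Q, n)/rng.normal(u, sig_u, n)
sig_c_mc = c.std(ddof=1)
```

For these mild (10%) input uncertainties the two estimates nearly agree; the formula's accuracy degrades as the model becomes more nonlinear over the input spread, which is the kind of discrepancy the AFTOX study quantifies.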
Climate dynamics and fluid mechanics: Natural variability and related uncertainties
Michael Ghil; Mickaël D. Chekroun; Eric Simonnet
2010-06-15
The purpose of this review-and-research paper is twofold: (i) to review the role played in climate dynamics by fluid-dynamical models; and (ii) to contribute to the understanding and reduction of the uncertainties in future climate-change projections. To illustrate the first point, we focus on the large-scale, wind-driven flow of the mid-latitude oceans, which contributes in a crucial way to Earth's climate and to changes therein. We study the low-frequency variability (LFV) of the wind-driven, double-gyre circulation in mid-latitude ocean basins, via the bifurcation sequence that leads from steady states through periodic solutions and on to the chaotic, irregular flows documented in the observations. This sequence involves local, pitchfork and Hopf bifurcations, as well as global, homoclinic ones. The natural climate variability induced by the LFV of the ocean circulation is but one of the causes of uncertainties in climate projections. Another major cause of such uncertainties could reside in the structural instability, in the topological sense, of the equations governing climate dynamics, including but not restricted to those of atmospheric and ocean dynamics. We propose a novel approach to understanding, and possibly reducing, these uncertainties, based on the concepts and methods of random dynamical systems theory. As a very first step, we study the effect of noise on the topological classes of the Arnol'd family of circle maps, a paradigmatic model of frequency locking as occurring in the nonlinear interactions between the El Nino-Southern Oscillation (ENSO) and the seasonal cycle. It is shown that the maps' fine-grained resonant landscape is smoothed by the noise, thus permitting their coarse-grained classification. This result is consistent with the stabilizing effects of stochastic parametrization obtained in modeling the ENSO phenomenon via some general circulation models.
The Method of Manufactured Universes for Testing Uncertainty Quantification Methods
Stripling, Hayes Franklin
2011-02-22
Two UQ approaches were exercised: first, the Gaussian Process Model for Simulation Analysis (GPMSA) code from Los Alamos National Laboratory (LANL), and second, a Bayesian Multivariate Adaptive Regression Spline (BMARS) technique combined with a filtering/weighting method. The conclusion drawn from these results is that MMU is a powerful technique that can help identify problems in UQ software...
Methodologies for Estimating Building Energy Savings Uncertainty: Review and Comparison
Baltazar, J.C.; Sun, Y.; Claridge, D.
2014-01-01
Juan-Carlos Baltazar, PhD, PE; Yifu Sun, EIT; and David Claridge, PhD, PE. Proceedings of the 14th International Conference for Enhanced Building Operations, Tsinghua University, Beijing, China, September 14-17, 2014. ESL-IC-14-09-11a.
Using Cross-Section Uncertainty Data to Estimate Biases
Mueller, Don [ORNL]; Rearden, Bradley T. [ORNL]
2008-01-01
Ideally, computational method validation is performed by modeling critical experiments that are very similar, neutronically, to the model used in the safety analysis. Similar, in this context, means that the neutron multiplication factors (k{sub eff}) of the safety analysis model and critical experiment model are affected in the same way to the same degree by variations (or errors) in the same nuclear data. Where similarity is demonstrated, the computational bias calculated using the critical experiment model results is 'applicable' to the safety analysis model. Unfortunately, criticality safety analysts occasionally find that the safety analysis models include some feature or material for which adequately similar well-defined critical experiments do not exist to support validation. For example, the analyst may want to take credit for the presence of fission products in spent nuclear fuel. In such cases, analysts sometimes rely on 'expert judgment' to assign an additional administrative margin to compensate for the validation weakness or to conclude that the impact on the calculated bias and bias uncertainty is negligible. Due to advances in computer programs and the evolution of cross-section uncertainty data, analysts can use the sensitivity and uncertainty analyses tools implemented in the SCALE TSUNAMI codes to estimate the potential impact on the application-specific bias and bias uncertainty resulting from nuclides that are under-represented or not present in the critical experiments. This paper discusses the method, computer codes, and data used to estimate the potential contribution toward the computational bias of individual nuclides. The results from application of the method to fission products in a burnup credit model are presented.
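The cross-section-driven uncertainty estimates described above rest on first-order "sandwich rule" propagation, which sensitivity/uncertainty tools such as TSUNAMI implement: the relative variance of k_eff is S C S^T, with S the vector of k_eff sensitivities to the nuclear data and C their relative covariance matrix. The sketch below illustrates only the arithmetic, with invented numbers; real applications use sensitivity profiles resolved by nuclide, reaction, and energy group.

```python
import numpy as np

# "Sandwich rule" for first-order uncertainty propagation: the relative
# variance of k_eff is S C S^T. All numbers below are made up for
# illustration; they are not actual nuclear-data covariances.
S = np.array([0.35, -0.12, 0.08])            # dk/k per dX/X, 3 parameters
C = np.array([[4.0e-4, 1.0e-5, 0.0],
              [1.0e-5, 9.0e-4, 0.0],
              [0.0,    0.0,    2.5e-3]])     # relative covariance matrix

rel_var = S @ C @ S                          # scalar S C S^T
u_keff = np.sqrt(rel_var)                    # relative 1-sigma uncertainty
print(f"relative k_eff uncertainty: {100 * u_keff:.3f} %")
```

The same quadratic form also exposes which parameters drive the bias uncertainty: the i-th diagonal contribution S[i]**2 * C[i, i] identifies under-represented nuclides worth additional margin or validation effort.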
On Uncertainty Quantification of Lithium-ion Batteries
Hadigol, Mohammad; Doostan, Alireza
2015-01-01
In this work, a stochastic, physics-based model for Lithium-ion batteries (LIBs) is presented in order to study the effects of model uncertainties on the cell capacity, voltage, and concentrations. To this end, the proposed uncertainty quantification (UQ) approach, based on sparse polynomial chaos expansions, relies on a small number of battery simulations. Within this UQ framework, the identification of most important uncertainty sources is achieved by performing a global sensitivity analysis via computing the so-called Sobol' indices. Such information aids in designing more efficient and targeted quality control procedures, which consequently may result in reducing the LIB production cost. An LiC$_6$/LiCoO$_2$ cell with 19 uncertain parameters discharged at 0.25C, 1C and 4C rates is considered to study the performance and accuracy of the proposed UQ approach. The results suggest that, for the considered cell, the battery discharge rate is a key factor affecting not only the performance variability of the ce...
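The global sensitivity step in this abstract ranks inputs by their Sobol' indices. The paper extracts them from a sparse polynomial chaos expansion; the hedged sketch below instead uses the plain Monte Carlo (Saltelli-type) estimator on a toy model, which conveys the same quantity with far less machinery. The toy model and sample sizes are illustrative only.

```python
import numpy as np

def sobol_first_order(model, d, n=100_000, seed=0):
    """Monte Carlo (Saltelli-type) estimate of first-order Sobol' indices
    for i.i.d. Uniform(0, 1) inputs. A plain-sampling sketch; the paper
    itself builds the indices from a sparse polynomial chaos expansion."""
    rng = np.random.default_rng(seed)
    A = rng.random((n, d))
    B = rng.random((n, d))
    yA, yB = model(A), model(B)
    var = yA.var()
    S = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                      # swap in column i from B
        S[i] = np.mean(yB * (model(ABi) - yA)) / var
    return S

# Toy stand-in for the cell model: a linear response whose exact
# first-order indices are 0.8 and 0.2.
model = lambda X: 2.0 * X[:, 0] + X[:, 1]
print(sobol_first_order(model, d=2))             # ~ [0.8, 0.2]
```

In the battery setting, `model` would map the 19 uncertain cell parameters to a quantity of interest such as capacity; the indices then point quality control at the few parameters that matter.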
Composite Multilinearity, Epistemic Uncertainty and Risk Achievement Worth
E. Borgonovo; C. L. Smith
2012-10-01
Risk Achievement Worth (RAW) is one of the most widely utilized importance measures. RAW is defined as the ratio of the risk metric value attained when a component has failed over the base case value of the risk metric. Traditionally, both the numerator and denominator are point estimates. Relevant literature has shown that inclusion of epistemic uncertainty i) induces notable variability in the point estimate ranking and ii) causes the expected value of the risk metric to differ from its nominal value. We obtain the conditions under which the equality holds between the nominal and expected values of a reliability risk metric. Among these conditions, separability and state-of-knowledge independence emerge. We then study how the presence of epistemic uncertainty affects RAW and the associated ranking. We propose an extension of RAW (called ERAW) which allows one to obtain a ranking robust to epistemic uncertainty. We discuss the properties of ERAW and the conditions under which it coincides with RAW. We apply our findings to a probabilistic risk assessment model developed for the safety analysis of NASA lunar space missions.
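The nominal-versus-expected gap discussed above is easy to reproduce on a toy risk metric. The sketch below computes a point-estimate RAW and then shows the expected risk under an epistemic (lognormal) state of knowledge drifting away from the nominal value. The system model and all numbers are invented, and the paper's ERAW is defined more carefully than this illustration.

```python
import numpy as np

def risk(p):
    """Toy risk metric: a component in series with a two-train parallel
    system, R = p3 + (1 - p3) * p1 * p2. Illustrative, not from the paper."""
    p1, p2, p3 = p
    return p3 + (1 - p3) * p1 * p2

nominal = np.array([0.01, 0.02, 0.001])      # nominal failure probabilities

# Point-estimate RAW of component 1: set its failure probability to 1.
raw1 = risk([1.0, nominal[1], nominal[2]]) / risk(nominal)

# Epistemic uncertainty: lognormal state of knowledge on each probability.
# Because the metric is multilinear in the p's, the positive skew of the
# lognormal pushes the expected risk above the nominal-point risk.
rng = np.random.default_rng(1)
samples = nominal * rng.lognormal(mean=0.0, sigma=0.5, size=(20000, 3))
expected_risk = np.mean([risk(p) for p in samples])

print(f"RAW(comp 1) = {raw1:.1f}")
print(f"nominal risk {risk(nominal):.2e} vs expected risk {expected_risk:.2e}")
```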
Sampling-based Uncertainty Quantification in Deconvolution of X-ray Radiographs
Howard, M. [NSTec; Luttman, A. [NSTec; Fowler, M. [NSTec
2014-11-01
In imaging applications that focus on quantitative analysis, such as X-ray radiography in the security sciences, it is necessary to be able to reliably estimate the uncertainties in the processing algorithms applied to the image data, and deconvolving the system blur out of the image is usually an essential step. In this work we solve the deconvolution problem within a Bayesian framework for edge-enhancing reconstruction with uncertainty quantification. The likelihood is a normal approximation to the Poisson likelihood, and the prior is generated from a classical total variation regularized Poisson deconvolution. Samples from the corresponding posterior distribution are computed using a Markov chain Monte Carlo approach, giving a pointwise measure of uncertainty in the final, deconvolved signal. We demonstrate the results on real data used to calibrate a high-energy X-ray source and show that this approach gives reconstructions as good as classical regularization methods, while mitigating many of their drawbacks.
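A minimal version of this sample-then-summarize workflow can be sketched on a synthetic 1-D problem. Everything below is invented for illustration: the blur kernel, noise level, and a simple TV-type prior standing in for the paper's empirically constructed prior. The point is only the pattern: run a random-walk Metropolis chain on the posterior and report the pointwise standard deviation as the uncertainty map.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy forward model: a 1-D "edge" signal blurred by a Gaussian kernel + noise.
n = 16
i = np.arange(n)
A = np.exp(-0.5 * ((i[:, None] - i[None, :]) / 1.5) ** 2)
A /= A.sum(axis=1, keepdims=True)                     # row-normalized blur
x_true = (i >= n // 2).astype(float)                  # step (edge) signal
sigma = 0.02
y = A @ x_true + sigma * rng.normal(size=n)

# Log-posterior: Gaussian likelihood plus a TV-type prior (a stand-in for
# the empirically constructed prior described in the abstract).
lam = 5.0
def log_post(x):
    resid = A @ x - y
    return -0.5 * resid @ resid / sigma**2 - lam * np.abs(np.diff(x)).sum()

# Random-walk Metropolis; the pointwise posterior std is the uncertainty map.
x = y.copy()
lp = log_post(x)
chain, accept = [], 0
for it in range(20000):
    prop = x + 0.01 * rng.normal(size=n)              # small isotropic step
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:           # Metropolis accept test
        x, lp = prop, lp_prop
        accept += 1
    if it >= 5000:                                    # discard burn-in
        chain.append(x.copy())
chain = np.array(chain)
pointwise_std = chain.std(axis=0)                     # uncertainty per pixel
print("acceptance rate:", accept / 20000)
print("max pointwise std:", pointwise_std.max())
```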
Ideas underlying quantification of margins and uncertainties(QMU): a white paper.
Helton, Jon Craig; Trucano, Timothy Guy; Pilch, Martin M.
2006-09-01
This report describes key ideas underlying the application of Quantification of Margins and Uncertainties (QMU) to nuclear weapons stockpile lifecycle decisions at Sandia National Laboratories. While QMU is a broad process and methodology for generating critical technical information to be used in stockpile management, this paper emphasizes one component, which is information produced by computational modeling and simulation. In particular, we discuss the key principles of developing QMU information in the form of Best Estimate Plus Uncertainty, the need to separate aleatory and epistemic uncertainty in QMU, and the risk-informed decision making that is best suited for decisive application of QMU. The paper is written at a high level, but provides a systematic bibliography of useful papers for the interested reader to deepen their understanding of these ideas.
Allan Tameshtit
2012-04-09
High temperature and white noise approximations are frequently invoked when deriving the quantum Brownian equation for an oscillator. Even if this white noise approximation is avoided, it is shown that if the zero point energies of the environment are neglected, as they often are, the resultant equation will violate not only the basic tenet of quantum mechanics that requires the density operator to be positive, but also the uncertainty principle. When the zero-point energies are included, asymptotic results describing the evolution of the oscillator are obtained that preserve positivity and, therefore, the uncertainty principle.
Lacey, Ronald; Faulkner, William
2015-01-01
of the instrument parameters contributed significantly to the overall uncertainty: the uncertainty in the pressure drop measurement across the orifice meter during both calibration and testing and the uncertainty of the airflow standard used during calibration... of the orifice meter. Five environmental parameters occurring during field measurements were considered for their effect on overall uncertainty: ambient TSP concentration, volumetric airflow rate, ambient temperature, ambient pressure, and ambient relative...
Bayes Linear Uncertainty Analysis for Oil Reservoirs Based on Multiscale Computer Experiments
Oakley, Jeremy
...of the input parameters for a reservoir model. Therefore, an uncertainty analysis for the model often proceeds... for the efficient management of the reservoir. In a Bayesian analysis, all of our uncertainties are incorporated...
Mohaghegh, Shahab
SPE 125959: Reservoir Simulation and Uncertainty Analysis of Enhanced CBM Production Using... ...-cellular model (reservoir parameters), uncertainty analysis becomes an important task that is required for making... (the foundation of any reservoir simulation), comprehensive analysis and uncertainty quantification of ECBM and CO...
ResGrid: A Grid-Aware Toolkit for Reservoir Uncertainty Analysis
Allen, Gabrielle
Z. Lei, D. Huang, A. Kulshrestha... Slide topics: anisotropy ratio (SGSIM); reservoir uncertainty analysis (responses to factors are obtained by models); performance prediction; sensitivity analysis and uncertainty assessment; history matching (model verification)...
Quantification of Variability and Uncertainty in Hourly NOx Emissions from Coal-Fired Power Plants
Frey, H. Christopher
...to quantify variability and uncertainty for NOx emissions from coal-fired power plants. Data for hourly NOx... Keywords: uncertainty, variability, emission factors, coal-fired power plants, NOx emissions, regression models.
Bayesian-network Confirmation of Software Testing Uncertainties
Ziv, Hadar
Hadar Ziv and Debra J. Richardson... presentation of uncertainty in software testing. We then propose that a specific technique, known as Bayesian Belief Networks, be used to model software testing uncertainties. We demonstrate the use of Bayesian...
Water Management, Risk, and Uncertainty: Things We Wish We Knew in the 21st Century
Shaw, W. Douglass
[Forthcoming] A survey is offered of the most difficult and challenging issues facing water managers in the 21st century, focusing on the economics of risk and, even more so, uncertainty. Risk and uncertainty are addressed...
Uncertainty-aware geospatial system for mapping and visualizing underground utilities
Kamat, Vineet R.
Shuai Li... Accepted 6 March 2015; available online 21 March 2015. Keywords: GPR; GPS; GIS; 3D underground utility mapping; uncertainty modeling; uncertainty-aware visualization; accuracy assessment. Underground utility lines being...
Analysis and Reduction of Power Grid Models under Uncertainty Sandia National Laboratories
Levi, Anthony F. J.
Habib Najm, Sandia National Laboratories. Abstract: The increased utilization of alternative energy sources requires that evolving power grid... Keywords: uncertainty; eigenproblem; closure.
Achieving Robustness to Uncertainty for Financial Decision-making
Barnum, George M.; Van Buren, Kendra L.; Hemez, Francois M.; Song, Peter
2014-01-10
This report investigates the concept of robustness analysis to support financial decision-making. Financial models that forecast future stock returns or market conditions depend on assumptions that might be unwarranted and variables that might exhibit large fluctuations from their last-known values. The analysis of robustness explores these sources of uncertainty, and recommends model settings such that the forecasts used for decision-making are as insensitive as possible to the uncertainty. A proof-of-concept is presented with the Capital Asset Pricing Model. The robustness of model predictions is assessed using info-gap decision theory. Info-gaps are models of uncertainty that express the “distance,” or gap of information, between what is known and what needs to be known in order to support the decision. The analysis yields a description of worst-case stock returns as a function of increasing gaps in our knowledge. The analyst can then decide on the best course of action by trading off worst-case performance with “risk,” which is how much uncertainty they think needs to be accommodated in the future. The report also discusses the Graphical User Interface, developed using the MATLAB® programming environment, such that the user can control the analysis through an easy-to-navigate interface. Three directions of future work are identified to enhance the present software. First, the code should be re-written using the Python scientific programming software. This change will achieve greater cross-platform compatibility, better portability, allow for a more professional appearance, and render it independent from a commercial license, which MATLAB® requires. Second, a capability should be developed to allow users to quickly implement and analyze their own models. This will facilitate application of the software to the evaluation of proprietary financial models. The third enhancement proposed is to add the ability to evaluate multiple models simultaneously.
When two models reflect past data with similar accuracy, the more robust of the two is preferable for decision-making because its predictions are, by definition, less sensitive to the uncertainty.
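The robustness logic in this entry can be illustrated with the simplest possible info-gap model. In the sketch below, uncertainty about a forecast return grows as an interval of half-width alpha, and robustness is the largest alpha at which the worst case still meets the decision-maker's requirement. The one-line toy deliberately replaces the report's CAPM machinery; all numbers are hypothetical.

```python
def robustness(forecast, required):
    """Info-gap robustness for the simplest interval uncertainty model
    U(alpha) = {r : |r - forecast| <= alpha}: the largest horizon of
    uncertainty alpha at which the worst case, forecast - alpha, still
    meets the requirement. A minimal stand-in for the CAPM-based analysis."""
    return max(forecast - required, 0.0)

# Two hypothetical models with similar fits to past data:
req = 0.03                       # decision-maker's required annual return
print(robustness(0.08, req))     # model A tolerates a 0.05 forecast gap
print(robustness(0.06, req))     # model B tolerates only 0.03: A is more robust
```

This is exactly the trade-off described above: the preferred model is the one whose forecast can absorb the larger gap of information before the decision fails.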
Hub, Martina; Thieke, Christian; Kessler, Marc L.; Karger, Christian P. [Department of Medical Physics in Radiation Oncology, German Cancer Research Center (DKFZ), 69120 Heidelberg (Germany); Clinical Cooperation Unit Radiation Oncology, German Cancer Research Center (DKFZ), 69120 Heidelberg, Germany and Department of Radiation Oncology, University Clinic Heidelberg, 69120 Heidelberg (Germany); Department of Radiation Oncology, University of Michigan, Ann Arbor, Michigan 48109 (United States); Department of Medical Physics in Radiation Oncology, German Cancer Research Center (DKFZ), 69120 Heidelberg (Germany)
2012-04-15
Purpose: In fractionated radiation therapy, image guidance with daily tomographic imaging is becoming clinical routine. In principle, this allows for daily computation of the delivered dose and for accumulation of these daily dose distributions to determine the total dose actually delivered to the patient. However, uncertainties in the mapping of the images can translate into errors of the accumulated total dose, depending on the dose gradient. In this work, an approach to estimate the uncertainty of mapping between medical images is proposed that identifies areas bearing a significant risk of inaccurate dose accumulation. Methods: This method accounts for the geometric uncertainty of image registration and the heterogeneity of the dose distribution that is to be mapped. Its performance is demonstrated in the context of dose mapping based on b-spline registration. It is based on evaluating the sensitivity of the dose mapping to variations of the b-spline coefficients, combined with evaluating the sensitivity of the registration metric with respect to variations of the coefficients. It was evaluated on patient data deformed according to a breathing model, for which the ground truth of the deformation, and hence the actual true dose mapping error, is known. Results: The proposed approach has the potential to distinguish areas of the image where dose mapping is likely to be accurate from other areas of the same image where a larger uncertainty must be expected. Conclusions: An approach to identify areas where dose mapping is likely to be inaccurate was developed and implemented. This method was tested for dose mapping, but it may be applied in the context of other mapping tasks as well.
Handling risk of uncertainty in model-based production optimization: a
Van den Hof, Paul
Abstract: Model-based economic optimization of oil production suffers from high levels of uncertainty... the secondary objective is aimed at maximizing the speed of oil production to mitigate risk. This multi-... and production data about the true values of the model parameters. Furthermore, economic variables such as oil...
Design Features and Technology Uncertainties for the Next Generation Nuclear Plant
John M. Ryskamp; Phil Hildebrandt; Osamu Baba; Ron Ballinger; Robert Brodsky; Hans-Wolfgang Chi; Dennis Crutchfield; Herb Estrada; Jean-Claude Garnier; Gerald Gordon; Richard Hobbins; Dan Keuter; Marilyn Kray; Philippe Martin; Steve Melancon; Christian Simon; Henry Stone; Robert Varrin; Werner von Lensa
2004-06-01
This report presents the conclusions, observations, and recommendations of the Independent Technology Review Group (ITRG) regarding design features and important technology uncertainties associated with very-high-temperature nuclear system concepts for the Next Generation Nuclear Plant (NGNP). The ITRG performed its reviews during the period November 2003 through April 2004.
While future changes in emission are the largest uncertainty on future climate change, another
Allan, Richard P.
Above, the thick lines show different possible future scenarios (Representative Concentration Pathways), ranging from low emission (RCP2.6) up to high emission (RCP8.5), which explain a large range in future...
Scanlon, Bridget R.
...low-level and high-level radioactive waste disposal sites in the United States, and groundwater... (...,000 to 105,000 years to 25 m depth) were generally corroborated by residence times estimated from radioactive... uncertainties in transport processes, Cl input, and Cl output. Although the CMB approach assumes one...
Energy-saving technology adoption under uncertainty in the residential sector
Université Paris-Sud XI
Dorothée Charlier, Alejandro Mosiño and Aude Pommeret. October 5, 2009. Abstract: Home renovation is generally asserted to be a highly effective means for households to lower expenditures on energy. In this sense, home renovation...
CASMO5/TSUNAMI-3D spent nuclear fuel reactivity uncertainty analysis
Ferrer, R.; Rhodes, J. [Studsvik Scandpower, Inc., 504 Shoup Ave., Idaho Falls, ID 83402 (United States); Smith, K. [Dept. of Nuclear Science and Engineering, Massachusetts Inst. of Technology, 77 Massachusetts Avenue, Cambridge, MA 02139 (United States)
2012-07-01
The CASMO5 lattice physics code is used in conjunction with the TSUNAMI-3D sequence in ORNL's SCALE 6 code system to estimate the uncertainties in hot-to-cold reactivity changes due to cross-section uncertainty for PWR assemblies at various burnup points. The goal of the analysis is to establish the multiplication factor uncertainty similarity between various fuel assemblies at different conditions in a quantifiable manner and to obtain a bound on the hot-to-cold reactivity uncertainty over the various assembly types and burnup attributed to fundamental cross-section data uncertainty. (authors)
Myers, D. R.; Reda, I. M.; Wilcox, S. M.; Stoffel, T. L.
2004-04-01
The measurement of broadband solar radiation has grown in importance since the advent of solar renewable energy technologies in the 1970s and the concern about the Earth's radiation balance related to climate change in the 1990s. In parallel, standardized methods of uncertainty analysis and reporting have been developed. Historical and updated uncertainties are based on the current international standardized uncertainty analysis method. Despite the fact that new and sometimes overlooked sources of uncertainty have been identified over the period 1988 to 2004, uncertainty in broadband solar radiometric instrumentation remains at 3% to 5% for pyranometers and 2% to 3% for pyrheliometers. Improvements in characterizing correction functions for radiometer data may reduce total uncertainty. We analyze the theoretical standardized uncertainty sensitivity coefficients for the instrumentation calibration measurement equation and highlight the single parameter (thermal offset voltages) that contributes the most to the observed calibration responsivities.
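The sensitivity-coefficient analysis referenced above follows standard (GUM-style) root-sum-square propagation: each input's standard uncertainty is weighted by the partial derivative of the measurement equation. The sketch below applies this to a deliberately simplified responsivity equation, R = (V - V_off) / E, with invented numbers. The real calibration equation has more terms, but the offset-voltage-dominated budget below echoes the abstract's finding.

```python
import numpy as np

# GUM-style propagation for a simplified pyranometer calibration equation,
# R = (V - V_off) / E: responsivity from thermopile signal V, thermal
# offset voltage V_off, and reference irradiance E. Values are illustrative.
V, u_V       = 8.0e-3, 5.0e-6      # signal [V] and its standard uncertainty
V_off, u_off = 0.1e-3, 6.0e-5      # thermal offset voltage [V]
E, u_E       = 1000.0, 4.0         # reference irradiance [W/m^2]

R = (V - V_off) / E                # responsivity [V per W/m^2]

# Sensitivity coefficients dR/dx of the measurement equation:
c_V   = 1.0 / E
c_off = -1.0 / E
c_E   = -(V - V_off) / E**2

# Combined standard uncertainty: root sum of squares of weighted terms.
u_R = np.sqrt((c_V * u_V)**2 + (c_off * u_off)**2 + (c_E * u_E)**2)
budget = {
    "V":     (c_V * u_V)**2 / u_R**2,
    "V_off": (c_off * u_off)**2 / u_R**2,
    "E":     (c_E * u_E)**2 / u_R**2,
}
print(f"R = {R:.3e} V/(W/m^2), u_R/R = {100 * u_R / R:.2f} %")
print("variance budget:", budget)  # thermal offset dominates this budget
```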
Quantification of initial-data uncertainty on a shock-accelerated gas cylinder
Tritschler, V. K.; Avdonin, A.; Hickel, S.; Hu, X. Y.; Adams, N. A.
2014-02-15
We quantify initial-data uncertainties on a shock accelerated heavy-gas cylinder by two-dimensional well-resolved direct numerical simulations. A high-resolution compressible multicomponent flow simulation model is coupled with a polynomial chaos expansion to propagate the initial-data uncertainties to the output quantities of interest. The initial flow configuration follows previous experimental and numerical works of the shock accelerated heavy-gas cylinder. We investigate three main initial-data uncertainties, (i) shock Mach number, (ii) contamination of SF{sub 6} with acetone, and (iii) initial deviations of the heavy-gas region from a perfect cylindrical shape. The impact of initial-data uncertainties on the mixing process is examined. The results suggest that the mixing process is highly sensitive to input variations of shock Mach number and acetone contamination. Additionally, our results indicate that the measured shock Mach number in the experiment of Tomkins et al. [“An experimental investigation of mixing mechanisms in shock-accelerated flow,” J. Fluid. Mech. 611, 131 (2008)] and the estimated contamination of the SF{sub 6} region with acetone [S. K. Shankar, S. Kawai, and S. K. Lele, “Two-dimensional viscous flow simulation of a shock accelerated heavy gas cylinder,” Phys. Fluids 23, 024102 (2011)] exhibit deviations from those that lead to best agreement between our simulations and the experiment in terms of overall flow evolution.
Quantification of Uncertainties in Nuclear Density Functional theory
N. Schunck; J. D. McDonnell; D. Higdon; J. Sarich; S. Wild
2014-09-17
Reliable predictions of nuclear properties are needed as much to answer fundamental science questions as in applications such as reactor physics or data evaluation. Nuclear density functional theory (DFT) is currently the only microscopic, global approach to nuclear structure that is applicable throughout the nuclear chart. In the past few years, a lot of effort has been devoted to setting up a general methodology to assess theoretical uncertainties in nuclear DFT calculations. In this paper, we summarize some of the recent progress in this direction. Most of the new material discussed here will be published in separate articles.
PROBABILISTIC SENSITIVITY AND UNCERTAINTY ANALYSIS WORKSHOP SUMMARY REPORT
Seitz, R
2008-06-25
Stochastic or probabilistic modeling approaches are being applied more frequently in the United States and globally to quantify uncertainty and enhance understanding of model response in performance assessments for disposal of radioactive waste. This increased use has resulted in global interest in sharing results of research and applied studies that have been completed to date. This technical report reflects the results of a workshop that was held to share results of research and applied work related to performance assessments conducted at United States Department of Energy sites. Key findings of this research and applied work are discussed and recommendations for future activities are provided.
Molecular nonlinear dynamics and protein thermal uncertainty quantification
Xia, Kelin [Department of Mathematics, Michigan State University, Michigan 48824 (United States)]; Wei, Guo-Wei, E-mail: wei@math.msu.edu [Department of Mathematics; Department of Electrical and Computer Engineering; Department of Biochemistry and Molecular Biology, Michigan State University, Michigan 48824 (United States)]
2014-03-15
This work introduces molecular nonlinear dynamics (MND) as a new approach for describing protein folding and aggregation. By using a mode system, we show that the MND of disordered proteins is chaotic while that of folded proteins exhibits intrinsically low dimensional manifolds (ILDMs). The stability of ILDMs is found to strongly correlate with protein energies. We propose a novel method for protein thermal uncertainty quantification based on persistently invariant ILDMs. Extensive comparison with experimental data and the state-of-the-art methods in the field validate the proposed new method for protein B-factor prediction.
Research Portfolio Report Ultra-Deepwater: Geologic Uncertainty
Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)
Uncertainty Analysis of RELAP5-3D
Alexandra E Gertman; Dr. George L Mesina
2012-07-01
As world-wide energy consumption continues to increase, so does the demand for the use of alternative energy sources, such as nuclear energy. Nuclear power plants currently supply over 370 gigawatts of electricity, and more than 60 new nuclear reactors have been commissioned by 15 different countries. The primary concern for nuclear power plant operation and licensing has been safety. The safe operation of nuclear power plants is no simple matter: it involves the training of operators, the design of the reactor, and equipment and design upgrades throughout the lifetime of the reactor, among other factors. To safely design, operate, and understand nuclear power plants, industry and government alike have relied upon the use of best-estimate simulation codes, which allow for an accurate model of any given plant to be created with well-defined margins of safety. The most widely used of these best-estimate simulation codes in the nuclear power industry is RELAP5-3D. Our project focused on improving the modeling capabilities of RELAP5-3D by developing uncertainty estimates for its calculations. This work involved analyzing high-, medium-, and low-ranked phenomena from an INL PIRT on a small-break loss-of-coolant accident, as well as an analysis of a large-break loss-of-coolant accident. Statistical analyses were performed using correlation coefficients. To perform the studies, computer programs were written that modify a template RELAP5 input deck to produce one deck for each combination of key input parameters. Python scripting enabled the running of the generated input files with RELAP5-3D on INL's massively parallel cluster system. Data from the studies were collected and analyzed with SAS. A summary of the results of our studies is presented.
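The deck-generation and correlation workflow above lends itself to a compact sketch. Below, a full-factorial set of "input decks" is generated from a few key parameters, and a made-up surrogate function stands in for a RELAP5-3D run. The parameter names, values, and surrogate are all invented; only the pattern (enumerate combinations, run each, correlate inputs with the figure of merit) reflects the study.

```python
import itertools
import numpy as np

# One "input deck" per combination of key parameters (names/values invented).
params = {
    "break_area_m2":  [0.001, 0.002, 0.004],
    "power_fraction": [0.98, 1.00, 1.02],
    "chf_multiplier": [0.8, 1.0, 1.2],
}
names = list(params)
combos = list(itertools.product(*params.values()))   # 27 "decks"

def surrogate_pct(break_area, power, chf):
    """Stand-in for a RELAP5-3D run: a peak clad temperature [K] that is
    an entirely made-up linear response to the three inputs."""
    return 600 + 2.0e5 * break_area + 400 * (power - 1.0) - 150 * (chf - 1.0)

X = np.array(combos)
y = np.array([surrogate_pct(*c) for c in combos])

# Pearson correlation of each input with the figure of merit, as a simple
# stand-in for the study's correlation-coefficient analysis.
for j, name in enumerate(names):
    r = np.corrcoef(X[:, j], y)[0, 1]
    print(f"{name:>15s}: r = {r:+.2f}")
```

In the actual study each combination became a full RELAP5-3D input file dispatched to the cluster; the correlation step afterward is the same.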
Analysis and Reduction of Complex Networks Under Uncertainty.
Ghanem, Roger G
2014-07-31
This effort was a collaboration with Youssef Marzouk of MIT, Omar Knio of Duke University (at the time at Johns Hopkins University) and Habib Najm of Sandia National Laboratories. The objective of this effort was to develop the mathematical and algorithmic capacity to analyze complex networks under uncertainty. Of interest were chemical reaction networks and smart grid networks. The statements of work for USC focused on the development of stochastic reduced models for uncertain networks. The USC team was led by Professor Roger Ghanem and consisted of one graduate student and a postdoc. The contributions completed by the USC team consisted of 1) methodology and algorithms to address the eigenvalue problem, a problem of significance in the stability of networks under stochastic perturbations; 2) methodology and algorithms to characterize probability measures on graph structures with random flows, an important problem in characterizing random demand (encountered in smart grids) and random degradation (encountered in infrastructure systems), as well as modeling errors in Markov chains (with ubiquitous relevance!); and 3) methodology and algorithms for treating inequalities in uncertain systems, an important problem in the context of models for material failure and network flows under uncertainty, where conditions of failure or flow are described in the form of inequalities between the state variables.
Data Filtering Impact on PV Degradation Rates and Uncertainty (Poster)
Jordan, D. C.; Kurtz, S. R.
2012-03-01
To sustain the commercial success of photovoltaics (PV), it becomes vital to know how power output decreases with time. In order to predict power delivery, degradation rates must be determined accurately. Data filtering, i.e., any treatment of the data prior to assessment of long-term field behavior, is discussed as part of a more comprehensive uncertainty analysis and can be one of the greatest sources of uncertainty in long-term performance studies. Several distinct filtering methods, such as outlier removal and inclusion of only sunny days, were examined on several different metrics, such as PVUSA, performance ratio, and the DC power to plane-of-array irradiance ratio, both uncorrected and temperature-corrected. PVUSA showed the highest sensitivity, while the temperature-corrected power over irradiance ratio was found to be the least sensitive to data filtering conditions. Using this ratio, it is demonstrated that quantification of degradation rates with a statistical accuracy of +/- 0.2%/year within 4 years of field data is possible on two crystalline silicon and two thin-film systems.
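The core estimate behind such studies can be sketched as a linear fit: the degradation rate is the slope of a performance metric versus time, and its standard error is the statistical part of the uncertainty. The data below are synthetic (a hypothetical temperature-corrected metric with assumed noise), so the numbers illustrate the method only, not the paper's results.

```python
import numpy as np

# Synthetic 4-year monthly record of a temperature-corrected performance
# metric (values in %, noise level assumed for illustration).
rng = np.random.default_rng(2)
years = np.linspace(0, 4, 48)
true_rate = -0.5                               # %/year degradation (assumed)
metric = 100 + true_rate * years + rng.normal(0, 0.4, years.size)

# Least-squares slope and its standard error.
A = np.vstack([years, np.ones_like(years)]).T
coef, res, *_ = np.linalg.lstsq(A, metric, rcond=None)
slope = coef[0]
dof = years.size - 2
s2 = res[0] / dof                              # residual variance
se_slope = np.sqrt(s2 / np.sum((years - years.mean()) ** 2))
print(f"degradation rate: {slope:.2f} +/- {se_slope:.2f} %/year")
```

Note that this captures only the regression uncertainty; the filtering choices discussed in the abstract shift which points enter the fit and can dominate the total uncertainty.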
Information Uncertainty to Compare Qualitative Reasoning Security Risk Assessment Results
Chavez, Gregory M [Los Alamos National Laboratory; Key, Brian P [Los Alamos National Laboratory; Zerkle, David K [Los Alamos National Laboratory; Shevitz, Daniel W [Los Alamos National Laboratory
2009-01-01
The security risk associated with malevolent acts such as those of terrorism is often void of the historical data required for a traditional PRA. Most information available to conduct security risk assessments for these malevolent acts is obtained from subject matter experts as subjective judgements. Qualitative reasoning approaches such as approximate reasoning and evidential reasoning are useful for modeling the predicted risk from information provided by subject matter experts. Absent from these approaches is a consistent means to compare the security risk assessment results. Associated with each predicted risk reasoning result is a quantifiable amount of information uncertainty which can be measured and used to compare the results. This paper explores using entropy measures to quantify the information uncertainty associated with conflict and non-specificity in the predicted reasoning results. The measured quantities of conflict and non-specificity can ultimately be used to compare qualitative reasoning results, which is important in triage studies and ultimately resource allocation. Straightforward extensions of previous entropy measures are presented here to quantify the non-specificity and conflict associated with security risk assessment results obtained from qualitative reasoning models.
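Two standard evidence-theory measures give a feel for what "non-specificity" and "conflict" quantify; these are textbook measures from the literature, not necessarily the paper's own extensions, and the expert assessments below are invented. Non-specificity grows when mass sits on larger subsets of the frame; conflict is the mass that Dempster's combination would assign to the empty set.

```python
from math import log2

def nonspecificity(bpa):
    """Dubois-Prade non-specificity N(m) = sum_A m(A) * log2 |A|, in bits,
    for a basic probability assignment given as {tuple-of-elements: mass}."""
    return sum(m * log2(len(A)) for A, m in bpa.items() if m > 0)

def dempster_conflict(bpa1, bpa2):
    """Conflict K: total mass on pairs of focal elements with empty
    intersection when combining two basic probability assignments."""
    return sum(m1 * m2
               for A, m1 in bpa1.items()
               for B, m2 in bpa2.items()
               if not (set(A) & set(B)))

# Hypothetical expert assessments over a frame of risk levels:
expert1 = {("low",): 0.6, ("low", "medium"): 0.3,
           ("low", "medium", "high"): 0.1}
expert2 = {("high",): 0.7, ("medium", "high"): 0.3}

print("non-specificity, expert 1:", nonspecificity(expert1))
print("non-specificity, expert 2:", nonspecificity(expert2))
print("conflict K:", dempster_conflict(expert1, expert2))
```

A result with lower non-specificity is more informative; a high K between experts flags disagreement that a simple point-estimate comparison would hide.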
Energy-Time Uncertainty Relations in Quantum Measurements
Takayuki Miyadera
2015-05-14
Quantum measurement is a physical process. A system and an apparatus interact for a certain time period (the measurement time). During this interaction, information about an observable is transferred from the system to the apparatus. In this study, we examine the amount of energy fluctuation of the apparatus that is required for this physical process to occur. To do so, the interface between the quantum and classical worlds (the "Heisenberg cut") must be carefully chosen so that the quantum side is large enough to autonomously switch on the interaction. In this setting we prove that a trade-off relation (an energy-time uncertainty relation) holds between the energy fluctuation of the apparatus and the measurement time. We use this trade-off relation to discuss the spacetime uncertainty relation, questioning the operational meaning of the microscopic structure of spacetime. In addition, we derive another trade-off inequality between the measurement time and the strength of the interaction between the system and the apparatus. The larger the information carried by the observable to be measured, the stronger the restriction the trade-off relations impose.
Neutron skin uncertainties of Skyrme energy density functionals
M. Kortelainen; J. Erler; W. Nazarewicz; N. Birge; Y. Gao; E. Olsen
2013-07-16
Background: Neutron-skin thickness is an excellent indicator of isovector properties of atomic nuclei. As such, it correlates strongly with observables in finite nuclei that depend on neutron-to-proton imbalance and the nuclear symmetry energy that characterizes the equation of state of neutron-rich matter. A rich worldwide experimental program involving studies with rare isotopes, parity violating electron scattering, and astronomical observations is devoted to pinning down the isovector sector of nuclear models. Purpose: We assess the theoretical systematic and statistical uncertainties of neutron-skin thickness and relate them to the equation of state of nuclear matter, and in particular to nuclear symmetry energy parameters. Methods: We use the nuclear superfluid Density Functional Theory with several Skyrme energy density functionals and density dependent pairing. To evaluate statistical errors and their budget, we employ the statistical covariance technique. Results: We find that the errors on neutron skin increase with neutron excess. Statistical errors due to uncertain coupling constants of the density functional are found to be larger than systematic errors, the latter not exceeding 0.06 fm in most neutron-rich nuclei across the nuclear landscape. The single major source of uncertainty is the poorly determined slope L of the symmetry energy that parametrizes its density dependence. Conclusions: To provide essential constraints on the symmetry energy of the nuclear energy density functional, next-generation measurements of neutron skins are required to deliver precision better than 0.06 fm.
Reda, I.
2011-07-01
The uncertainty of measuring solar irradiance is fundamentally important for solar energy and atmospheric science applications. Without an uncertainty statement, the quality of a result, model, or testing method cannot be quantified, the chain of traceability is broken, and confidence cannot be maintained in the measurement. Measurement results are incomplete and meaningless without a statement of the estimated uncertainty with traceability to the International System of Units (SI) or to another internationally recognized standard. This report explains how to use the Guide to the Expression of Uncertainty in Measurement (GUM) to calculate such uncertainty. The report also shows that without appropriate corrections to solar measuring instruments (solar radiometers), the uncertainty of measuring shortwave solar irradiance can exceed 4% using present state-of-the-art pyranometers and 2.7% using present state-of-the-art pyrheliometers. Finally, the report demonstrates that by applying the appropriate corrections, uncertainties may be reduced by at least 50%. The uncertainties, with or without the appropriate corrections, might not be compatible with the needs of solar energy and atmospheric science applications; yet, this report may shed some light on the sources of uncertainty and the means to reduce overall uncertainty in measuring solar irradiance.
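The GUM calculation the report describes reduces, for independent inputs, to a root-sum-square combination followed by expansion with a coverage factor. A minimal sketch with made-up contribution values (in W/m^2):

```python
import math

# Minimal GUM-style sketch (hypothetical numbers): combine independent standard
# uncertainties u_i with sensitivity coefficients c_i in quadrature, then expand
# with coverage factor k = 1.96 for roughly 95% coverage.

def combined_uncertainty(contributions):
    """contributions: list of (sensitivity c_i, standard uncertainty u_i)."""
    return math.sqrt(sum((c * u) ** 2 for c, u in contributions))

# e.g. calibration, zenith-angle response, data-logger accuracy (invented values, W/m^2)
u_c = combined_uncertainty([(1.0, 8.0), (1.0, 6.0), (1.0, 2.0)])
U95 = 1.96 * u_c   # expanded uncertainty at ~95% coverage
```

The radiometer corrections the report advocates act by shrinking individual u_i terms, which the quadrature sum then rewards more than proportionally.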
An Information-Theoretic Measure of Uncertainty due to Quantum and Thermal Fluctuations
Arlen Anderson; Jonathan J. Halliwell
1993-04-28
We study an information-theoretic measure of uncertainty for quantum systems. It is the Shannon information $I$ of the phase space probability distribution $\\langle z | \\rho | z \\rangle$, where $|z \\rangle$ are coherent states and $\\rho$ is the density matrix. The uncertainty principle is expressed in this measure as $I \\ge 1$. For a harmonic oscillator in a thermal state, $I$ coincides with the von Neumann entropy, $-\\mathrm{Tr}(\\rho \\ln \\rho)$, in the high-temperature regime, but unlike entropy, it is non-zero at zero temperature. It therefore supplies a non-trivial measure of uncertainty due to both quantum and thermal fluctuations. We study $I$ as a function of time for a class of non-equilibrium quantum systems consisting of a distinguished system coupled to a heat bath. We derive an evolution equation for $I$. For the harmonic oscillator, in the Fokker-Planck regime, we show that $I$ increases monotonically. For more general Hamiltonians, $I$ settles down to monotonic increase in the long run, but may suffer an initial decrease for certain initial states that undergo ``reassembly'' (the opposite of quantum spreading). Our main result is to prove, for linear systems, that $I$ at each moment of time has a lower bound $I_t^{min}$ over all possible initial states. This bound is a generalization of the uncertainty principle to include thermal fluctuations in non-equilibrium systems, and represents the least amount of uncertainty the system must suffer after evolution in the presence of an environment for time $t$.
Science based stockpile stewardship, uncertainty quantification, and fission fragment beams
Stoyer, M A; McNabb, D; Burke, J; Bernstein, L A; Wu, C Y
2009-09-14
Stewardship of this nation's nuclear weapons is predicated on developing a fundamental scientific understanding of the physics and chemistry required to describe weapon performance without the need to resort to underground nuclear testing and to predict expected future performance as a result of intended or unintended modifications. In order to construct more reliable models, underground nuclear test data is being reanalyzed in novel ways. The extent to which underground experimental data can be matched with simulations is one measure of the credibility of our capability to predict weapon performance. To improve the interpretation of these experiments with quantified uncertainties, improved nuclear data is required. As an example, the fission yield of a device was often determined by measuring fission products. Conversion of the measured fission products to yield was accomplished through explosion code calculations (models) and a good set of nuclear reaction cross-sections. Because of the unique high-fluence environment of an exploding nuclear weapon, many reactions occurred on radioactive nuclides, for which only theoretically calculated cross-sections are available. Inverse kinematics reactions at CARIBU offer the opportunity to measure cross-sections on unstable neutron-rich fission fragments and thus improve the quality of the nuclear reaction cross-section sets. One of the fission products measured was {sup 95}Zr, the accumulation of all mass 95 fission products of Y, Sr, Rb and Kr (see Fig. 1). Subsequent neutron-induced reactions on these short lived fission products were assumed to cancel out - in other words, the destruction of mass 95 nuclides was more or less equal to the production of mass 95 nuclides. If a {sup 95}Sr was destroyed by an (n,2n) reaction it was also produced by (n,2n) reactions on {sup 96}Sr, for example. 
However, since these nuclides all have fairly short half-lives (seconds to minutes or even less), no experimental nuclear reaction cross-sections exist, and only theoretically modeled cross-sections are available. Inverse kinematics reactions at CARIBU offer the opportunity, should the beam intensity be sufficient, to measure cross-sections on a few important nuclides in order to benchmark the theoretical calculations and significantly improve the nuclear data. The nuclides in Fig. 1 are prioritized by importance factor and displayed in stoplight colors, green the highest and red the lowest priority.
Uncertainty in terahertz time-domain spectroscopy measurement
Withayachumnankul, Withawat; Fischer, Bernd M.; Lin Hungyen; Abbott, Derek
2008-06-15
Measurements of optical constants at terahertz--or T-ray--frequencies have been performed extensively using terahertz time-domain spectroscopy (THz-TDS). Spectrometers, together with physical models explaining the interaction between a sample and T-ray radiation, are progressively being developed. Nevertheless, measurement errors in the optical constants, so far, have not been systematically analyzed. This situation calls for a comprehensive analysis of measurement uncertainty in THz-TDS systems. The sources of error existing in a terahertz spectrometer and throughout the parameter estimation process are identified. The analysis herein quantifies the impact of each source on the output optical constants. The resulting analytical model is evaluated against experimental THz-TDS data.
Beating Landauer's limit by trading energy with uncertainty
Gammaitoni, Luca
2011-01-01
According to the International Technology Roadmap for Semiconductors, in the next 10-15 years the limits imposed by the physics of switch operation will be the major roadblock for future scaling of CMOS technology. Among these limits the most fundamental is represented by the so-called Shannon-von Neumann-Landauer limit, which sets a lower bound on the minimum heat dissipated per bit-erasing operation. Here we show that in a nanoscale switch operated at finite temperature T, this limit can be beaten by trading the dissipated energy against the uncertainty in the distinguishability of the switch logic states. We establish a general relation between the minimum required energy and the maximum error rate in the switch operation and briefly discuss potential applications in the design of future switches.
Generalized uncertainty principle and thermostatistics: a semiclassical approach
Abbasiyan-Motlaq, M
2015-01-01
We present an exact treatment of the thermodynamics of physical systems in the framework of the generalized uncertainty principle (GUP). Our purpose is to study and compare the consequences of two GUPs, one of which implies a minimal length while the other predicts a minimal length and a maximal momentum. Using a semiclassical method, we exactly calculate the modified internal energies and heat capacities in the presence of generalized commutation relations. We show that the total shift in these quantities depends only on the deformed algebra, not on the system under study. Finally, the modified internal energy for a specific physical system, such as the ideal gas, is obtained in the framework of the two different GUPs.
Uncertainty of silicon 1-MeV damage function
Danjaji, M.B.; Griffin, P.J.
1997-02-01
The electronics radiation hardness-testing community uses the ASTM E722-93 Standard Practice to define the energy dependence of the nonionizing neutron damage to silicon semiconductors. This neutron displacement damage response function is defined to be equal to the silicon displacement kerma as calculated from the ORNL Si cross-section evaluation. Experimental work has shown that observed damage ratios at various test facilities agree with the defined response function to within 5%. Here, a covariance matrix for the silicon 1-MeV neutron displacement damage function is developed. This uncertainty data will support the electronic radiation hardness-testing community and will permit silicon displacement damage sensors to be used in least squares spectrum adjustment codes.
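Such a covariance matrix enters least-squares spectrum adjustment and uncertainty propagation through the standard sandwich rule, var(R) = s^T C s. A small illustrative sketch (two energy groups, made-up numbers, not the actual damage-function data):

```python
# Sketch of how a covariance matrix supports uncertainty propagation (the
# "sandwich rule"): var(R) = s^T C s, for response sensitivities s and a
# relative covariance matrix C of the damage function. Values are invented.

def sandwich(s, C):
    n = len(s)
    return sum(s[i] * C[i][j] * s[j] for i in range(n) for j in range(n))

s = [0.5, 1.0]                       # sensitivities to two group responses
C = [[0.04, 0.01],                   # diagonal: variances; off-diagonal: covariance
     [0.01, 0.09]]
var = sandwich(s, C)                 # relative variance of the response
```

The off-diagonal terms are exactly what a covariance evaluation adds beyond per-group error bars; ignoring them here would give 0.10 instead of 0.11.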
Reaction rate uncertainties and the {nu}p-process
Froehlich, C.; Rauscher, T. [Department of Physics, North Carolina State University, Raleigh, NC 27695 (United States); Dept. of Physics, University of Basel, 4056 Basel (Switzerland)
2012-11-12
Current hydrodynamical simulations of core collapse supernovae find proton-rich early ejecta. At the same time, the models fail to eject neutron-rich matter, thus leaving the origin of the main r-process elements unsolved. However, the proton-rich neutrino-driven winds from supernovae have been identified as a possible production site for light n-capture elements beyond iron (such as Ge, Sr, Y, Zr) through the {nu}p-process. The detailed nucleosynthesis patterns of the {nu}p-process depend on the hydrodynamic conditions and the nuclear reaction rates of key reactions. We investigate the impact of reaction rate uncertainties on the {nu}p-process nucleosynthesis.
Impact of nuclear mass uncertainties on the $r$-process
Martin, Dirk; Nazarewicz, Witold; Olsen, Erik
2015-01-01
Nuclear masses play a fundamental role in understanding how the heaviest elements in the Universe are created in the $r$-process. We predict $r$-process nucleosynthesis yields using neutron capture and photodissociation rates that are based on nuclear density functional theory. Using six Skyrme energy density functionals based on different optimization protocols, we determine for the first time systematic uncertainty bands -- related to mass modeling -- for $r$-process abundances in realistic astrophysical scenarios. We find that features of the underlying microphysics make an imprint on abundances especially in the vicinity of neutron shell closures: abundance peaks and troughs are reflected in trends of neutron separation energy. Further advances in nuclear theory and experiments, when linked to observations, will help in the understanding of astrophysical conditions in extreme $r$-process sites.
Helton, Jon Craig; Sallaberry, Cedric M.; Hansen, Clifford W.
2010-10-01
The 2008 performance assessment (PA) for the proposed repository for high-level radioactive waste at Yucca Mountain (YM), Nevada, illustrates the conceptual structure of risk assessments for complex systems. The 2008 YM PA is based on the following three conceptual entities: a probability space that characterizes aleatory uncertainty; a function that predicts consequences for individual elements of the sample space for aleatory uncertainty; and a probability space that characterizes epistemic uncertainty. These entities and their use in the characterization, propagation and analysis of aleatory and epistemic uncertainty are described and illustrated with results from the 2008 YM PA.
Principles and applications of measurement and uncertainty analysis in research and calibration
Wells, C.V.
1992-11-01
Interest in measurement uncertainty analysis has grown in the past several years as it has spread to new fields of application, and research and development of uncertainty methodologies have continued. This paper discusses the subject from the perspectives of both research and calibration environments. It presents a history of the development and an overview of the principles of uncertainty analysis embodied in the United States National Standard, ANSI/ASME PTC 19.1-1985, Measurement Uncertainty. Examples are presented in which uncertainty analysis was utilized or is needed to gain further knowledge of a particular measurement process and to characterize final results. Measurement uncertainty analysis provides a quantitative estimate of the interval about a measured value or an experiment result within which the true value of that quantity is expected to lie. Years ago, Harry Ku of the United States National Bureau of Standards stated that "The informational content of the statement of uncertainty determines, to a large extent, the worth of the calibrated value." Today, that statement is just as true about calibration or research results as it was in 1968. Why is that true? What kind of information should we include in a statement of uncertainty accompanying a calibrated value? How and where do we get the information to include in an uncertainty statement? How should we interpret and use measurement uncertainty information? This discussion will provide answers to these and other questions about uncertainty in research and in calibration. The methodology to be described has been developed by national and international groups over the past nearly thirty years, and individuals were publishing information even earlier. Yet the work is largely unknown in many science and engineering arenas. I will illustrate various aspects of uncertainty analysis with some examples drawn from the radiometry measurement and calibration discipline and from research activities.
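ANSI/ASME PTC 19.1-1985 combines a bias limit B and a precision index S (with Student's t, approximately 2 for large samples) into either an additive or a root-sum-square uncertainty. A sketch with illustrative values:

```python
import math

# PTC 19.1-style sketch (invented values): combine a bias limit B with a
# precision index S into the standard's two uncertainty models.

def uncertainty_rss(B, S, t=2.0):
    """Root-sum-square (RSS) model, ~95% coverage for large samples."""
    return math.sqrt(B ** 2 + (t * S) ** 2)

def uncertainty_add(B, S, t=2.0):
    """Additive (ADD) model, ~99% coverage; more conservative."""
    return B + t * S

U_rss = uncertainty_rss(0.3, 0.2)   # sqrt(0.09 + 0.16) = 0.5
U_add = uncertainty_add(0.3, 0.2)   # 0.7
```

Reporting which model was used is part of the uncertainty statement the paper argues for.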
Zhang, Xuesong; Zhao, Kaiguang
2012-06-01
Bayesian Neural Networks (BNNs) have been shown to be useful tools for analyzing the modeling uncertainty of Neural Networks (NNs). This research focuses on the comparison of two BNNs. The first (BNN-I) uses statistical methods to describe the characteristics of different uncertainty sources (input, parameter, and model structure) and integrates these uncertainties into a Markov Chain Monte Carlo (MCMC) framework to estimate total uncertainty. The second (BNN-II) lumps all uncertainties into a single error term (i.e., the residual between model prediction and measurement). In this study, we propose a simple BNN-II that uses Genetic Algorithms (GA) and Bayesian Model Averaging (BMA) to calibrate neural networks with different structures (numbers of hidden units) and combines the predictions from the different NNs to derive predictions and uncertainty estimates. We tested these two BNNs in two watersheds for daily and monthly hydrologic simulation. The BMA-based BNNs developed in this study outperform BNN-I in the two watersheds in terms of both accurate prediction and uncertainty estimation. These results show that, given incomplete understanding of the characteristics associated with each uncertainty source, the simple lumped-error approach may yield better prediction and uncertainty estimation.
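The BMA combination step can be sketched as follows; the BIC-based weights and the within/between variance split are common BMA conventions we assume here, not necessarily the exact scheme of the study:

```python
import math

# Hypothetical BMA sketch: weight each candidate network's prediction by
# exp(-0.5 * BIC_k), normalized, then form the weighted mean and a variance
# that adds within-model variance and between-model spread.

def bma_combine(preds, bics, variances):
    raw = [math.exp(-0.5 * (b - min(bics))) for b in bics]   # shift for stability
    total = sum(raw)
    ws = [r / total for r in raw]
    mean = sum(w * p for w, p in zip(ws, preds))
    var = sum(w * (v + (p - mean) ** 2) for w, p, v in zip(ws, preds, variances))
    return mean, var

# three NNs with different hidden-unit counts (all numbers invented)
mean, var = bma_combine(preds=[1.0, 1.2, 0.9],
                        bics=[10.0, 12.0, 10.0],
                        variances=[0.01, 0.01, 0.02])
```

The between-model term (p - mean)^2 is what lets the ensemble's disagreement widen the predictive uncertainty, which a single lumped error term cannot do.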
Epistemologies of uncertainty : governing CO2 capture and storage science and technology
Evar, Benjamin
2014-11-27
This thesis progresses from a ‘science and technology studies’ (STS) perspective to consider the ways that expert stakeholders perceive and communicate uncertainties and risks attached to carbon ...
Final Report: DOE Project: DE-SC-0005399 Linking the uncertainty...
Office of Scientific and Technical Information (OSTI)
Linking the uncertainty of low frequency variability in tropical forcing in regional climate change Citation Details In-Document Search Title: Final Report: DOE Project:...
Framework for Modeling the Uncertainty of Future Events in Life Cycle Assessment
Chen, Yi-Fen; Simon, Rachel; Dornfeld, David
2013-01-01
… event scenarios could alter LCA results … economic-balance hybrid LCA extended with uncertainty … Life Cycle Assessment (LCA) is a leading technique used to …
Clay, Michael J; Johnston, Robert A.
2008-01-01
Large Real Estate Developments, Spatial Uncertainty, and … Large real estate construction projects … need to model large real estate developments, several types …
Carbon Accounting and Economic Model Uncertainty of Emissions from Biofuels-Induced Land Use Change
Plevin, Richard J; Beckman, Jayson; Golub, Alla A; Witcover, Julie; O'??Hare, Michael
2015-01-01
… uncertainty of full carbon accounting of forest ecosystems … proper accounting for time increases crop … use change modeling in GTEM: accounting for forest sinks.
Tabone, Michaelangelo D; Callaway, Duncan S
2015-01-01
Modeling Variability and Uncertainty of Photovoltaic … for grid-connected photovoltaic systems based on advanced … many photovoltaic power generation systems dispersed in …
ImpactMap: Designing Sustainable Supply Chains by Incorporating Data Uncertainty
Fuge, Mark; McKinstry, Katherine; Ninomiya, Kevin
2013-01-01
"Analyzing Uncertainty in Life-Cycle Assessment: A Survey of …" … issues in data quality analysis in life cycle assessment. International Journal of Life Cycle Assessment, 17(1).
Kim, Alex G.; Miquel, Ramon
2005-09-26
We present a new technique to extract the cosmological information from high-redshift supernova data in the presence of calibration errors and extinction due to dust. While in the traditional technique the distance modulus of each supernova is determined separately, in our approach we determine all distance moduli at once, in a process that achieves a significant degree of self-calibration. The result is a much reduced sensitivity of the cosmological parameters to the calibration uncertainties. As an example, for a strawman mission similar to that outlined in the SNAP satellite proposal, the increased precision obtained with the new approach is roughly equivalent to a factor of five decrease in the calibration uncertainty.
Srinivasan, Sanjay
2014-09-30
In-depth understanding of the long-term fate of CO2 in the subsurface requires study and analysis of the reservoir formation, the overlaying caprock formation, and adjacent faults. Because there is significant uncertainty in predicting the location and extent of geologic heterogeneity that can impact the future migration of CO2 in the subsurface, there is a need to develop algorithms that can reliably quantify this uncertainty in plume migration. This project is focused on the development of a model selection algorithm that refines an initial suite of subsurface models representing the prior uncertainty to create a posterior set of subsurface models that reflect injection performance consistent with that observed. Such posterior models can be used to represent uncertainty in the future migration of the CO2 plume. Because only injection data is required, the method provides a very inexpensive method to map the migration of the plume and the associated uncertainty in migration paths. The model selection method developed as part of this project mainly consists of assessing the connectivity/dynamic characteristics of a large prior ensemble of models, grouping the models on the basis of their expected dynamic response, selecting the subgroup of models that yield dynamic response closest to the observed dynamic data, and finally quantifying the uncertainty in plume migration using the selected subset of models. The main accomplishment of the project is the development of a software module within the SGEMS earth modeling software package that implements the model selection methodology. This software module was subsequently applied to analyze CO2 plume migration in two field projects – the In Salah CO2 Injection project in Algeria and CO2 injection into the Utsira formation in Norway.
These applications of the software revealed that the proxies developed in this project for quickly assessing the dynamic characteristics of the reservoir were highly efficient and yielded accurate grouping of reservoir models. The plume migration paths probabilistically assessed by the method were confirmed by field observations and auxiliary data. The report also documents the application of the software to answer practical questions such as the optimum location of monitoring wells to reliably assess the migration of the CO2 plume, the effect of CO2-rock interactions on plume migration and the ability to detect the plume under those conditions, and the effect of a slow, unresolved leak on the predictions of plume migration.
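The core model-selection idea, stripped of the proxies and grouping, can be sketched as ranking prior models by misfit to the observed injection response (model names and series below are invented):

```python
# Hypothetical sketch of the model-selection step: rank an ensemble of prior
# reservoir models by the misfit between each model's simulated injection
# response and the observed data, and keep the closest subset as the posterior set.

def select_models(ensemble, observed, keep):
    """ensemble: {name: simulated time series}; keep: number of models to retain."""
    def misfit(sim):
        return sum((s - o) ** 2 for s, o in zip(sim, observed))
    ranked = sorted(ensemble, key=lambda name: misfit(ensemble[name]))
    return ranked[:keep]

observed = [1.0, 2.1, 2.9]   # e.g. injection pressure response at three times
ensemble = {"m1": [1.0, 2.0, 3.0],
            "m2": [0.5, 1.0, 1.5],
            "m3": [1.1, 2.1, 3.0]}
posterior = select_models(ensemble, observed, keep=2)
```

The retained subset then drives the plume-migration forecasts; spread across the subset is the residual uncertainty.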
CALiPER Exploratory Study: Accounting for Uncertainty in Lumen Measurements
Bergman, Rolf; Paget, Maria L.; Richman, Eric E.
2011-03-31
With a well-defined and shared understanding of uncertainty in lumen measurements, testing laboratories can better evaluate their processes, contributing to greater consistency and credibility of lighting testing, a key component of the U.S. Department of Energy (DOE) Commercially Available LED Product Evaluation and Reporting (CALiPER) program. Reliable lighting testing is a crucial underlying factor contributing to the success of many energy-efficient lighting efforts, such as the DOE GATEWAY demonstrations, the Lighting Facts Label, ENERGY STAR® energy-efficient lighting programs, and many others. Uncertainty in measurements is inherent to all testing methodologies, including photometric and other lighting-related testing. Uncertainty exists for all equipment, processes, and systems of measurement, individually as well as in combination. A major issue with testing and the resulting accuracy of the tests is the uncertainty of the complete process. Individual equipment uncertainties are typically identified, but their relative value in practice and their combined value with other equipment and processes in the same test are elusive concepts, particularly for complex types of testing such as photometry. The total combined uncertainty of a measurement result is important for repeatable and comparative measurements of light-emitting diode (LED) products in comparison with other technologies as well as competing products. This study provides a detailed, step-by-step method for determining uncertainty in lumen measurements, developed in close coordination with related standards efforts and key industry experts. This report uses the structure proposed in the Guide to the Expression of Uncertainty in Measurement (GUM) for evaluating and expressing uncertainty in measurements.
The steps of the procedure are described, and a spreadsheet format adapted for integrating-sphere and goniophotometric uncertainty measurements is provided for entering parameters, ordering the information, calculating intermediate values and, finally, obtaining expanded uncertainties. Using this basis and examining each step of the photometric measurement and calibration methods, mathematical uncertainty models are developed. Determination of estimated values of input variables is discussed. Guidance is provided for evaluating the standard uncertainties of each input estimate, the covariances associated with input estimates, and the calculation of the measurement result. On this basis, the combined uncertainty of the measurement results and, finally, the expanded uncertainty can be determined.
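When input estimates are correlated, the GUM combination gains covariance terms, which is what the spreadsheet's covariance entries capture. A sketch with one correlated pair (all values illustrative, not CALiPER data):

```python
import math

# GUM combination with one correlated input pair (invented values):
# u_c^2 = sum_i (c_i * u_i)^2 + 2 * c1 * c2 * r12 * u1 * u2, then U = k * u_c.

c = [1.0, 1.0, 0.5]   # sensitivity coefficients
u = [0.6, 0.8, 0.4]   # standard uncertainties of the input estimates
r12 = 0.5             # correlation between inputs 1 and 2 (e.g. a shared calibration source)

u_c2 = sum((ci * ui) ** 2 for ci, ui in zip(c, u)) + 2 * c[0] * c[1] * r12 * u[0] * u[1]
U = 2.0 * math.sqrt(u_c2)   # expanded uncertainty with coverage factor k = 2
```

Dropping the covariance term here would understate u_c^2 by almost a third, which is why the procedure asks for it explicitly.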
Geomatica: Spatial Data Uncertainty in the VGI World: Going from Consumer to Producer
Joel …
We discuss the concept of "perceived qualities" as, in a Volunteered Geographic Information (VGI) context … a classification framework is proposed of various types of spatial data usage. Then, we address uncertainty and VGI …
Stochastic Formulation for Uncertainty Analysis of Two-Phase Flow in Heterogeneous Reservoirs
Zhang, Dongxiao
… in flow performance predictions due to uncertainty in the reservoir description. We solve moment equations. Accurate modeling of the physics that govern complex multiphase reservoir flows requires a detailed …
Uncertainty Quantification in Modeling HIV Viral Mechanics
H.T. Banks; Robert Baraldi
… for the resulting parameter estimates. Key words: in-host HIV-1 progression models, uncertainty quantification … of [7] for further analysis. A major motivation for revisiting this model is its potential to be readily …
Casting a Polyhedron with Directional Uncertainty
Hee-kap Ahn; Otfried Cheong; René van Oostrum
Universiteit Utrecht
Casting is a manufacturing process in which molten material is poured into a cast (mould), which … with imperfect control of the casting machinery. In this paper, we consider directional uncertainty: given a 3 …
ASSESSING THE UNCERTAINTY OF WIND POWER PREDICTIONS WITH REGARD TO SPECIFIC WEATHER SITUATIONS
Heinemann, Detlev
The growing share of wind energy in electrical grids demands new strategies to improve the integration … (lange@mail.uni-oldenburg.de, www.physik.uni-oldenburg.de/ehf). The uncertainty of a short-term wind power prediction is commonly …
The hedge value of international emissions trading under uncertainty
Mort Webster; Sergey …
Keywords: climate change, emissions trading, uncertainty. This paper estimates the value of international emissions trading, focusing on a heretofore neglected component: its value as a hedge against …
A New Cauchy-Based Black-Box Technique for Uncertainty in Risk Analysis
Kreinovich, Vladik
A New Cauchy-Based Black-Box Technique for Uncertainty in Risk Analysis V. Kreinovich a,, S information in risk analysis. Several such techniques have been presented, often on a heuristic basis. Key words: Uncertainty, Risk analysis, Monte-Carlo, black-box techniques PACS: 02.70.Tt, 02.70.Uu, 02
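The Cauchy-based black-box technique referenced in the record above exploits the stability of the Cauchy distribution: if each input is perturbed by a Cauchy deviate whose scale equals its interval half-width Δ_i, the output deviation of an (approximately linear) black-box f is itself Cauchy-distributed with scale Δ = Σ_i |∂f/∂x_i| Δ_i, which can then be recovered by maximum likelihood. A minimal sketch of this idea, not the authors' implementation (the function name and test values are invented for illustration):

```python
import numpy as np

def cauchy_deviate_halfwidth(f, x0, deltas, n_samples=10000, seed=0):
    """Estimate the half-width Delta = sum_i |df/dx_i| * deltas_i of the
    range of a black-box f under interval input uncertainties
    [x0_i - deltas_i, x0_i + deltas_i], via Cauchy-distributed perturbations."""
    rng = np.random.default_rng(seed)
    x0 = np.asarray(x0, dtype=float)
    deltas = np.asarray(deltas, dtype=float)
    f0 = f(x0)
    # Output deviations are Cauchy-distributed with the scale we want.
    d = np.array([f(x0 + deltas * rng.standard_cauchy(x0.size)) - f0
                  for _ in range(n_samples)])
    # Maximum-likelihood scale: solve sum_k 1/(1 + (d_k/Delta)^2) = n/2
    # by bisection (the left-hand side increases monotonically in Delta).
    lo, hi = 1e-12, float(np.max(np.abs(d))) + 1e-12
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if np.sum(1.0 / (1.0 + (d / mid) ** 2)) > n_samples / 2:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# For a linear map f(x) = 2*x0 + 3*x1 with half-widths (0.1, 0.2),
# the exact half-width is 2*0.1 + 3*0.2 = 0.8.
est = cauchy_deviate_halfwidth(lambda x: 2*x[0] + 3*x[1], [1.0, 1.0], [0.1, 0.2])
```

The appeal for risk analysis is that the number of f-evaluations is independent of the number of inputs, unlike finite-difference sensitivity estimates.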
Life Cycle Regulation of Transportation Fuels: Uncertainty and its Policy Implications
Kammen, Daniel M.
Life Cycle Regulation of Transportation Fuels: Uncertainty and its Policy Implications. ...Richard J. Friedman ... Fall 2010. Copyright 2010 by Richard J. Plevin. Abstract: Life Cycle Regulation of Transportation Fuels ...
Users manual for the FORSS sensitivity and uncertainty analysis code system
Lucius, J.L.; Weisbin, C.R.; Marable, J.H.; Drischler, J.D.; Wright, R.Q.; White, J.E.
1981-01-01
FORSS is a code system used to study relationships between nuclear reaction cross sections, integral experiments, reactor performance parameter predictions and associated uncertainties. This report describes the computing environment and the modules currently used to implement FORSS Sensitivity and Uncertainty Methodology.
An Architecture of a Multi-Agent System for SCADA -dealing with uncertainty, plans and actions
Liu, Weiru
An Architecture of a Multi-Agent System for SCADA: dealing with uncertainty, plans and actions. ...loughlin@ecit.qub.ac.uk. Keywords: autonomous agents, multi-agent systems, sensors, SCADA, uncertainty, plans, actions, fusion. ...in traditional SCADA systems deployed in critical environments such as electrical power generation, transmission ...
4. Uncertainty D. Keil Artificial Intelligence 1/12 CSCI 400 Artificial Intelligence
Keil, David M.
CSCI 400 Artificial Intelligence, David Keil. 4. Uncertainty (1/12). ...in partially observable and non-deterministic environments? Objectives: 4a. Describe ways to operate ...
Modeling Uncertainty and its Implications to Sophisticated Control in TÆMS Agents
Raja, Anita
The scheduler used to control TÆMS agents is the Design-to-Criteria (DTC) agent scheduler. ...that describe alternate ways to achieve tasks and subtasks. Recent advances in Design-to-Criteria control and the incorporation of uncertainty in the scheduling process. As we show, the use of uncertainty in TÆMS and Design-to-Criteria ...
Amand Faessler; G. L. Fogli; E. Lisi; V. Rodin; A. M. Rotunno; F. Simkovic
2009-03-06
The variances and covariances associated with the nuclear matrix elements (NME) of neutrinoless double beta decay are estimated within the quasiparticle random phase approximation (QRPA). It is shown that correlated NME uncertainties play an important role in the comparison of neutrinoless double beta decay rates for different nuclei, and that they are degenerate with the uncertainty in the reconstructed Majorana neutrino mass.
1997-2001 by M. Kostic Ch.5: Uncertainty/Error Analysis
Kostic, Milivoje M.
©1997-2001 by M. Kostic. Ch. 5: Uncertainty/Error Analysis. Outline: Introduction; Bias and Precision; Summation/Propagation (Expanded Combined Uncertainty); Problem 5-30. ...at corresponding probability (%P). Remember: u = d(%P) = t(%P)·S (at %P); z = t = d/S.
Navigation Planning in Probabilistic Roadmaps with Uncertainty Michael Kneebone and Richard Dearden
Yao, Xin
Navigation Planning in Probabilistic Roadmaps with Uncertainty. Michael Kneebone and Richard Dearden. ...rwd@cs.bham.ac.uk. Abstract: Probabilistic Roadmaps (PRM) are a commonly used class of algorithms for robot navigation tasks. ...Probabilistic Roadmap (PRM) planning to handle uncertainty and observations. Probabilistic Roadmaps (Kavraki ...
Uncertainties in Nuclear Matrix Elements for Neutrinoless Double-Beta Decay
Engel, Jonathan
Uncertainties in Nuclear Matrix Elements for Neutrinoless Double-Beta Decay. Jonathan Engel. Abstract: I briefly review calculations of the matrix elements governing neutrinoless double-beta decay. ...in reducing theoretical error ...
A Unified Treatment of Uncertainties Center for Research on Concepts and Cognition
Indiana University
A Unified Treatment of Uncertainties. Pei Wang, Center for Research on Concepts and Cognition, Indiana University. ...is an active research field, where several approaches have been suggested and studied for dealing with various types of uncertainty. However, it is hard to rank the approaches in general, because each of them ...
arXiv:astro-ph/0409387v1, 15 Sep 2004. Accounting for Source Uncertainties in Analyses
Masci, Frank
arXiv:astro-ph/0409387v1, 15 Sep 2004. Accounting for Source Uncertainties in Analyses of Astronomical ... in analyzing data from astronomical surveys: accounting for measurement uncertainties in the properties ... ingredient in such analyses is accounting for the volume in the incidental parameter space via marginalization ...
Influence of air quality model resolution on uncertainty associated with health impacts*
Influence of air quality model resolution on uncertainty associated with health impacts. Tammy M... ...interactions among natural and human climate system components; objectively assess uncertainty in economic ... monitor and verify greenhouse gas emissions and climatic impacts. This reprint is one of a series intended ...
Uncertainty in Scenarios of Human-Caused Climate ... Nathan J. Mantua
Mantua, Nathan
...greenhouse gas emissions and atmospheric concentrations, and second is the uncertainty associated ... for eliminating, or even vastly reducing, environmental uncertainty for the purpose of improved natural resource ... emerged on key aspects of global climate change: humans have unquestionably altered the composition ...
A Discussion on Heisenberg Uncertainty Principle in the Picture of Special Relativity
Luca Nanni
2015-01-09
In this note the formulation of the Heisenberg uncertainty principle (HUP) in the picture of special relativity is given. The inequality shows that the product of the uncertainties of quantum conjugate variables is greater than an amount that is no longer a constant but depends on the speed of the system on which the measurement is taken.
Huang, Yinlun
Sustainable distributed biodiesel manufacturing under uncertainty: an interval ... A sophisticated biodiesel manufacturing study demonstrated methodological efficacy. Keywords: simulation, uncertainty. Abstract: Biodiesel, a clean-burning alternative fuel, can be produced using ...
Using Uncertainty Analysis to Guide the Development of Accelerated Stress Tests (Presentation)
Kempe, M.
2014-03-01
Extrapolation of accelerated testing to the long-term results expected in the field has uncertainty associated with the acceleration factors and the range of possible stresses in the field. When multiple stresses (such as temperature and humidity) can be used to increase the acceleration, the uncertainty may be reduced according to which stress factors are used to accelerate the degradation.
POWER MANAGEMENT IN A HYDRO-THERMAL SYSTEM UNDER UNCERTAINTY BY LAGRANGIAN
Römisch, Werner
POWER MANAGEMENT IN A HYDRO-THERMAL SYSTEM UNDER UNCERTAINTY BY LAGRANGIAN RELAXATION. Nicole Gr... ...power in a hydro-thermal system under uncertainty in load, inflow to reservoirs, and prices for fuel. ...successive decomposition into single thermal and hydro unit subproblems that are solved by dynamic ...
A Review of Uncertainty in Data Visualization Ken Brodlie, Rodolfo Allendes Osorio and Adriano Lopes
Brodlie, Ken
A Review of Uncertainty in Data Visualization. Ken Brodlie, Rodolfo Allendes Osorio and Adriano Lopes. ...and in this article we review their work. We place the work in the context of a reference model for data visualization ... of the discipline is maintained. Key words: visualization; uncertainty; errors. Ken Brodlie, University of Leeds
TOTAL MEASUREMENT UNCERTAINTY IN HOLDUP MEASUREMENTS AT THE PLUTONIUM FINISHING PLANT (PFP)
KEELE, B.D.
2007-07-05
An approach to determine the total measurement uncertainty (TMU) associated with Generalized Geometry Holdup (GGH) [1,2,3] measurements was developed and implemented in 2004 and 2005 [4]. This paper describes a condensed version of the TMU calculational model, including recent developments. Recent modifications to the TMU calculation model include a change in the attenuation uncertainty, clarifying the definition of the forward background uncertainty, reducing conservatism in the random uncertainty by selecting either a propagation of counting statistics or the standard deviation of the mean, and considering uncertainty in the width and height as a part of the self attenuation uncertainty. In addition, a detection limit is calculated for point sources using equations derived from summary equations contained in Chapter 20 of MARLAP [5]. The Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2007-1 to the Secretary of Energy identified a lack of requirements and a lack of standardization for performing measurements across the U.S. Department of Energy (DOE) complex. The DNFSB also recommended that guidance be developed for a consistent application of uncertainty values. As such, the recent modifications to the TMU calculational model described in this paper have not yet been implemented. The Plutonium Finishing Plant (PFP) is continuing to perform uncertainty calculations as per Reference 4. Publication at this time is so that these concepts can be considered in developing a consensus methodology across the complex.
Towing tank PIV measurement system, data and uncertainty assessment for DTMB Model 5512
Gui, Lichuan
...Longo, F. Stern. Abstract: A towed PIV system designed by DANTEC Measurement Technology for the Iowa ... 5-hole pitot probe data. Uncertainty assessment following standard procedures is used to quantify ... Quantitative comparisons with 5-hole pitot data show that PIV uncertainties are about 1% lower than those ...
Climate uncertainty and implications for U.S. state-level risk assessment through 2050.
Loose, Verne W.; Lowry, Thomas Stephen; Malczynski, Leonard A.; Tidwell, Vincent Carroll; Stamber, Kevin Louis; Kelic, Andjelka; Backus, George A.; Warren, Drake E.; Zagonel, Aldo A.; Ehlen, Mark Andrew; Klise, Geoffrey T.; Vargas, Vanessa N.
2009-10-01
Decisions for climate policy will need to take place in advance of climate science resolving all relevant uncertainties. Further, if the concern of policy is to reduce risk, then the best-estimate of climate change impacts may not be so important as the currently understood uncertainty associated with realizable conditions having high consequence. This study focuses on one of the most uncertain aspects of future climate change - precipitation - to understand the implications of uncertainty on risk and the near-term justification for interventions to mitigate the course of climate change. We show that the mean risk of damage to the economy from climate change, at the national level, is on the order of one trillion dollars over the next 40 years, with employment impacts of nearly 7 million labor-years. At a 1% exceedance-probability, the impact is over twice the mean-risk value. Impacts at the level of individual U.S. states are then typically in the multiple tens of billions dollar range with employment losses exceeding hundreds of thousands of labor-years. We used results of the Intergovernmental Panel on Climate Change's (IPCC) Fourth Assessment Report 4 (AR4) climate-model ensemble as the referent for climate uncertainty over the next 40 years, mapped the simulated weather hydrologically to the county level for determining the physical consequence to economic activity at the state level, and then performed a detailed, seventy-industry, analysis of economic impact among the interacting lower-48 states. We determined industry GDP and employment impacts at the state level, as well as interstate population migration, effect on personal income, and the consequences for the U.S. trade balance.
J. D. McDonnell; N. Schunck; D. Higdon; J. Sarich; S. M. Wild; W. Nazarewicz
2015-01-15
Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models; to estimate model errors and thereby improve predictive capability; to extrapolate beyond the regions reached by experiment; and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. The example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
Thorough approach to measurement uncertainty analysis applied to immersed heat exchanger testing
Farrington, R.B.; Wells, C.V.
1986-04-01
This paper discusses the value of an uncertainty analysis, discusses how to determine measurement uncertainty, and then details the sources of error in instrument calibration, data acquisition, and data reduction for a particular experiment. Methods are discussed to determine both the systematic (or bias) error in an experiment as well as to determine the random (or precision) error in the experiment. The detailed analysis is applied to two sets of conditions in measuring the effectiveness of an immersed coil heat exchanger. It shows the value of such analysis as well as an approach to reduce overall measurement uncertainty and to improve the experiment. This paper outlines how to perform an uncertainty analysis and then provides a detailed example of how to apply the methods discussed in the paper. The authors hope this paper will encourage researchers and others to become more concerned with their measurement processes and to report measurement uncertainty with all of their test results.
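The systematic/random split described in the abstract above is conventionally combined by root-sum-square, in the spirit of the ANSI/ASME PTC 19.1 approach the paper follows: elemental bias limits combine into an overall bias B, elemental precision indices into an overall precision S, and the two combine as U = sqrt(B² + (t·S)²) with a Student-t coverage factor. A minimal sketch (the function name and the numbers are illustrative, not the paper's values):

```python
import math

def overall_uncertainty(bias_limits, precision_indices, t=2.0):
    """Root-sum-square elemental bias limits into B and elemental precision
    indices into S, then combine as U = sqrt(B^2 + (t*S)^2), where t is the
    Student-t coverage factor (t ~ 2 for ~95% coverage, large samples)."""
    B = math.sqrt(sum(b * b for b in bias_limits))        # systematic part
    S = math.sqrt(sum(s * s for s in precision_indices))  # random part
    return math.sqrt(B * B + (t * S) ** 2)

# Two bias sources (0.3, 0.4) and two precision indices (0.06, 0.08),
# all in the same units: B = 0.5, S = 0.1, U = sqrt(0.25 + 0.04) ~ 0.539.
U = overall_uncertainty([0.3, 0.4], [0.06, 0.08])
```

Because the terms add in quadrature, the largest elemental error dominates U, which is why such an analysis points directly at where to improve the experiment.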
Comparison of nuclear data uncertainty propagation methodologies for PWR burn-up simulations
Diez, Carlos Javier; Hoefer, Axel; Porsch, Dieter; Cabellos, Oscar
2014-01-01
Several methodologies using different levels of approximations have been developed for propagating nuclear data uncertainties in nuclear burn-up simulations. Most methods fall into the two broad classes of Monte Carlo approaches, which are exact apart from statistical uncertainties but require additional computation time, and first order perturbation theory approaches, which are efficient for not too large numbers of considered response functions but only applicable for sufficiently small nuclear data uncertainties. Some methods neglect isotopic composition uncertainties induced by the depletion steps of the simulations, others neglect neutron flux uncertainties, and the accuracy of a given approximation is often very hard to quantify. In order to get a better sense of the impact of different approximations, this work aims to compare results obtained based on different approximate methodologies with an exact method, namely the NUDUNA Monte Carlo based approach developed by AREVA GmbH. In addition, the impact ...
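The two broad classes contrasted in the abstract above can be illustrated on a toy scalar response: first-order perturbation theory pushes an input covariance Σ through the response Jacobian (the "sandwich rule", Var ≈ J Σ Jᵀ), while the Monte Carlo approach samples the inputs and is exact up to statistical noise. A minimal sketch under assumed Gaussian inputs (this is not the NUDUNA code; all names are illustrative):

```python
import numpy as np

def first_order_variance(jac, cov):
    """Sandwich rule: Var(R) ~ J @ Sigma @ J^T, valid for small input
    uncertainties / nearly linear responses."""
    jac = np.asarray(jac, dtype=float)
    return float(jac @ np.asarray(cov, dtype=float) @ jac)

def monte_carlo_variance(f, mean, cov, n=20000, seed=1):
    """Brute-force propagation: sample inputs, evaluate the response,
    take the sample variance (exact up to statistical uncertainty)."""
    rng = np.random.default_rng(seed)
    x = rng.multivariate_normal(mean, cov, size=n)
    y = np.apply_along_axis(f, 1, x)
    return float(np.var(y, ddof=1))

# Linear response R = x0 + 2*x1 with independent input variances (0.01, 0.04):
# both methods should agree on Var(R) = 0.01 + 4*0.04 = 0.17.
cov = np.diag([0.01, 0.04])
v_lin = first_order_variance([1.0, 2.0], cov)
v_mc = monte_carlo_variance(lambda x: x[0] + 2*x[1], [0.0, 0.0], cov)
```

For a nonlinear response or large input uncertainties the two estimates diverge, which is exactly the regime where the approximate methodologies compared in the paper become hard to trust.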
Results for Phase I of the IAEA Coordinated Research Program on HTGR Uncertainties
Strydom, Gerhard; Bostelmann, Friederike; Yoon, Su Jong
2015-01-01
The quantification of uncertainties in design and safety analysis of reactors is today not only broadly accepted, but has in many cases become the preferred way to replace traditional conservative analysis for safety and licensing purposes. The use of a more fundamental methodology is also consistent with the reliable high-fidelity physics models and robust, efficient, and accurate codes available today. To facilitate uncertainty analysis applications, a comprehensive approach and methodology must be developed and applied. High Temperature Gas-cooled Reactors (HTGRs) have their own peculiarities: coated-particle fuel design, large graphite quantities, different materials, and high temperatures, all of which impose additional simulation requirements. The IAEA therefore launched a Coordinated Research Project (CRP) on HTGR Uncertainty Analysis in Modeling (UAM) in 2013 to study uncertainty propagation specifically in the HTGR analysis chain. Two benchmark problems are defined, with the prismatic design represented by the General Atomics (GA) MHTGR-350 and a 250 MW modular pebble bed design similar to the HTR-PM (INET, China). This report summarizes the contributions of the HTGR Methods Simulation group at Idaho National Laboratory (INL) up to this point of the CRP. The activities at INL have so far focused on creating the problem specifications for the prismatic design, as well as providing reference solutions for the exercises defined for Phase I. An overview is provided of the HTGR UAM objectives and scope, and the detailed specifications for Exercises I-1, I-2, I-3 and I-4 are also included here for completeness. The main focus of the report is the compilation and discussion of reference results for Phase I (i.e., for input parameters at their nominal or best-estimate values), which is defined as the first step of the uncertainty quantification process.
These reference results can be used by other CRP participants for comparison with other codes or their own reference results. The status on the Monte Carlo modeling of the experimental VHTRC facility is also discussed. Reference results were obtained for the neutronics stand-alone cases (Ex. I-1 and Ex. I-2) using the (relatively new) Monte Carlo code Serpent, and comparisons were performed with the more established Monte Carlo codes MCNP and KENO-VI. For the thermal-fluids stand-alone cases (Ex. I-3 and I-4) the commercial CFD code CFX was utilized to obtain reference results that can be compared with lower fidelity tools.
A preliminary study to Assess Model Uncertainties in Fluid Flows
Marc Oliver Delchini; Jean C. Ragusa
2009-09-01
The goal of this study is to assess the impact of various flow models for a simplified primary coolant loop of a light water nuclear reactor. The fluid flow models are based on the Euler equations with additional friction, gravity, momentum-source, and energy-source terms. The geometric model is purposely kept simple and consists of a one-dimensional (1D) loop system in order to focus the study on the validity of various fluid flow approximations. The 1D loop system is represented by a rectangle; the fluid is heated along one of the vertical legs and cooled along the opposite leg. A pressurizer and a pump are included in the horizontal legs. The amount of energy transferred to and removed from the system is equal in absolute value along the two vertical legs. The fluid flow approximations considered are compressible vs. incompressible, and the complete momentum equation vs. Darcy's approximation. The ultimate goal is to compute the fluid flow models' uncertainties and, if possible, to generate validity ranges for these models when applied to reactor analysis. We also limit this study to single-phase flows with low Mach numbers. As a result, sound waves carry a very small amount of energy in this particular case. A standard finite volume method is used for the spatial discretization of the system.
Clocking in the face of unpredictability beyond quantum uncertainty
F. Hadi Madjid; John M. Myers
2015-04-16
In earlier papers we showed unpredictability beyond quantum uncertainty in atomic clocks, ensuing from a proven gap between given evidence and explanations of that evidence. Here we reconceive a clock, not as an isolated entity, but as enmeshed in a self-adjusting communications network adapted to one or another particular investigation, in contact with an unpredictable environment. From the practical uses of clocks, we abstract a clock enlivened with the computational capacity of a Turing machine, modified to transmit and to receive numerical communications. Such "live clocks" phase the steps of their computations to mesh with the arrival of transmitted numbers. We lift this phasing, known in digital communications, to a principle of \\emph{logical synchronization}, distinct from the synchronization defined by Einstein in special relativity. Logical synchronization elevates digital communication to a topic in physics, including applications to biology. One explores how feedback loops in clocking affect numerical signaling among entities functioning in the face of unpredictable influences, making the influences themselves into subjects of investigation. The formulation of communications networks in terms of live clocks extends information theory by expressing the need to actively maintain communications channels, and potentially, to create or drop them. We show how networks of live clocks are presupposed by the concept of coordinates in a spacetime. A network serves as an organizing principle, even when the concept of the rigid body that anchors a special-relativistic coordinate system is inapplicable, as is the case, for example, in a generic curved spacetime.
Calibration under uncertainty for finite element models of masonry monuments
Atamturktur, Sezer,; Hemez, Francois,; Unal, Cetin
2010-02-01
Historical unreinforced masonry buildings often include features such as load bearing unreinforced masonry vaults and their supporting framework of piers, fill, buttresses, and walls. The masonry vaults of such buildings are among the most vulnerable structural components and certainly among the most challenging to analyze. The versatility of finite element (FE) analyses in incorporating various constitutive laws, as well as practically all geometric configurations, has resulted in the widespread use of the FE method for the analysis of complex unreinforced masonry structures over the last three decades. However, an FE model is only as accurate as its input parameters, and there are two fundamental challenges while defining FE model input parameters: (1) material properties and (2) support conditions. The difficulties in defining these two aspects of the FE model arise from the lack of knowledge in the common engineering understanding of masonry behavior. As a result, engineers are unable to define these FE model input parameters with certainty, and, inevitably, uncertainties are introduced to the FE model.
Turinsky, Paul J; Abdel-Khalik, Hany S; Stover, Tracy E
2011-03-31
An optimization technique has been developed to select optimized experimental design specifications to produce data specifically designed to be assimilated to optimize a given reactor concept. Data from the optimized experiment are assimilated to generate a posteriori uncertainties on the reactor concept's core attributes, from which the design responses are computed. The reactor concept is then optimized with the new data to realize cost savings by reducing margin. The optimization problem iterates until an optimal experiment is found that maximizes the savings. A new generation of innovative nuclear reactor designs, in particular fast neutron spectrum recycle reactors, is being considered for the application of closing the nuclear fuel cycle in the future. Safe and economical design of these reactors will require uncertainty reduction in the basic nuclear data which are input to the reactor design. These data uncertainties propagate to design responses, which in turn require the reactor designer to incorporate additional safety margin into the design, which often increases the cost of the reactor. Therefore, basic nuclear data need to be improved, and this is accomplished through experimentation. Considering the high cost of nuclear experiments, it is desirable to have an optimized experiment which will provide the data needed for uncertainty reduction such that a reactor design concept can meet its target accuracies, or to allow savings to be realized by reducing the margin required due to uncertainty propagated from basic nuclear data. However, this optimization is coupled to the reactor design itself, because with improved data the reactor concept can be re-optimized. It is thus desired to find the experiment that gives the best optimized reactor design. Methods are first established to model both the reactor concept and the experiment and to efficiently propagate the basic nuclear data uncertainty through these models to outputs.
The representativity of the experiment to the design concept is quantitatively determined. A technique is then established to assimilate this data and produce posteriori uncertainties on key attributes and responses of the design concept. Several experiment perturbations based on engineering judgment are used to demonstrate these methods and also serve as an initial generation of the optimization problem. Finally, an optimization technique is developed which will simultaneously arrive at an optimized experiment to produce an optimized reactor design. Solution of this problem is made possible by the use of the simulated annealing algorithm for solution of optimization problems. The optimization examined in this work is based on maximizing the reactor cost savings associated with the modified design made possible by using the design margin gained through reduced basic nuclear data uncertainties. Cost values for experiment design specifications and reactor design specifications are established and used to compute a total savings by comparing the posteriori reactor cost to the a priori cost plus the cost of the experiment. The optimized solution arrives at a maximized cost savings.
SENSITIVITY AND UNCERTAINTY ANALYSIS OF COMMERCIAL REACTOR CRITICALS FOR BURNUP CREDIT
Radulescu, Georgeta [ORNL; Mueller, Don [ORNL; Wagner, John C [ORNL
2009-01-01
The purpose of this study is to provide insights into the neutronic similarities that may exist between a generic cask containing typical spent nuclear fuel assemblies and commercial reactor critical (CRC) state-points. Forty CRC state-points from five pressurized-water reactors were selected for the study and the type of CRC state-points that may be applicable for validation of burnup credit criticality safety calculations for spent fuel transport/storage/disposal systems are identified. The study employed cross-section sensitivity and uncertainty analysis methods developed at Oak Ridge National Laboratory and the TSUNAMI set of tools in the SCALE code system as a means to investigate system similarity on an integral and nuclide-reaction specific level. The results indicate that, except for the fresh fuel core configuration, all analyzed CRC state-points are either highly similar, similar, or marginally similar to a generic cask containing spent nuclear fuel assemblies with burnups ranging from 10 to 60 GWd/MTU. Based on the integral system parameter, C{sub k}, approximately 30 of the 40 CRC state-points are applicable to validation of burnup credit in the generic cask containing typical spent fuel assemblies with burnups ranging from 10 to 60 GWd/MTU. The state-points providing the highest similarity (C{sub k} > 0.95) were attained at or near the end of a reactor cycle. The C{sub k} values are dominated by neutron reactions with major actinides and hydrogen, as the sensitivities of these reactions are much higher than those of the minor actinides and fission products. On a nuclide-reaction specific level, the CRC state-points provide significant similarity for most of the actinides and fission products relevant to burnup credit. A comparison of energy-dependent sensitivity profiles shows a slight shift of the CRC K{sub eff} sensitivity profiles toward higher energies in the thermal region as compared to the K{sub eff} sensitivity profile of the generic cask. 
Parameters representing coverage of the application by the CRCs on an energy-dependent, nuclide-reaction specific level (i.e., effectiveness of the CRCs for validating the cross sections as used in the application) were also examined. Based on the CRCs with C{sub k} > 0.8 and an assumed relative standard deviation for uncovered covariance data of 25%, the relative standard deviation of K{sub eff} due to uncovered sensitivity data varies from 0.79% to 0.95% for cask burnups ranging from 10 to 60 GWd/MTU. As expected, this uncertainty in K{sub eff} is largely dominated by noncoverage of sensitivities from major actinides and hydrogen. The contributions from fission products and minor actinides are very small and comparable to statistical uncertainties in K{sub eff} results. These results (again, assuming a 25% uncertainty for uncovered covariance data) indicate that there could be approximately 1% uncertainty in the calculated application K{sub eff} due to incomplete neutronic testing (validation) of the software by the CRCs. However, this conclusion also assumes all other uncertainties in the complex CRC configurations (e.g., isotopic compositions of burned fuel, operation history, data) are well known. Thus, an evaluation of the uncertainties in the CRC configurations is needed prior to the use of CRCs for code validation (i.e., quantifying code bias and bias uncertainty).
John N. Bahcall; Aldo M. Serenelli
2005-03-11
We show that uncertainties in the values of the surface heavy element abundances of the Sun are the largest source of the theoretical uncertainty in calculating the p-p, pep, 8B, 13N, 15O, and 17F solar neutrino fluxes. We evaluate for the first time the sensitivity (partial derivative) of each solar neutrino flux with respect to the surface abundance of each element. We then calculate the uncertainties in each neutrino flux using `conservative (preferred)' and `optimistic' estimates for the uncertainties in the element abundances. The total conservative (optimistic) composition uncertainty in the predicted 8B neutrino flux is 11.6% (5.0%) when sensitivities to individual element abundances are used. The traditional method that lumps all abundances into a single quantity (total heavy element to hydrogen ratio, Z/X) yields a larger uncertainty, 20%. The uncertainties in the carbon, oxygen, neon, silicon, sulphur, and iron abundances all make significant contributions to the uncertainties in calculating solar neutrino fluxes; the uncertainties of different elements are most important for different neutrino fluxes. The uncertainty in the iron abundance is the largest source of the estimated composition uncertainties of the important 7Be and 8B solar neutrinos. Carbon is the largest contributor to the uncertainty in the calculation of the p-p, 13N, and 15O neutrino fluxes. However, for all neutrino fluxes, several elements contribute comparable amounts to the total composition uncertainty.
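The element-by-element sensitivities described in the abstract above combine into a flux uncertainty by standard error propagation: with logarithmic sensitivities α_i = ∂ln Φ / ∂ln X_i and independent fractional abundance uncertainties σ_i, the fractional flux uncertainty is σ_Φ = sqrt(Σ_i (α_i σ_i)²). A minimal sketch (the sensitivities and uncertainties below are made-up illustrative numbers, not the paper's values):

```python
import math

def fractional_flux_uncertainty(log_sensitivities, frac_abundance_errors):
    """Quadrature combination of independent fractional abundance errors,
    each weighted by the log-log sensitivity of the neutrino flux to that
    element's surface abundance."""
    return math.sqrt(sum((a * s) ** 2
                         for a, s in zip(log_sensitivities,
                                         frac_abundance_errors)))

# Two hypothetical elements with sensitivities 0.8 and 0.6 and fractional
# abundance uncertainties of 10% and 5%:
sigma_phi = fractional_flux_uncertainty([0.8, 0.6], [0.10, 0.05])
```

Treating each element separately in this way can only tighten the result relative to lumping all abundances into a single Z/X ratio, which is the effect the abstract reports (11.6% vs. 20% for the 8B flux).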
Le Pallec, J. C.; Crouzet, N.; Bergeaud, V.; Delavaud, C. [CEA/DEN/DM2S, CEA/Saclay, 91191 Gif sur Yvette Cedex (France)
2012-07-01
The control of uncertainties in the field of reactor physics and their propagation in best-estimate modeling are a major issue in safety analysis. In this framework, the CEA is developing a methodology for performing multi-physics simulations that include uncertainty analysis. The present paper aims to present and apply this methodology to the analysis of an accidental situation such as a REA (Rod Ejection Accident). This accident is characterized by a strong interaction between the different areas of reactor physics (neutronics, fuel thermal behavior, and thermal hydraulics). The modeling is performed with the CRONOS2 code. The uncertainty analysis was conducted with the URANIE platform developed by the CEA: for each identified response of the model (output), and considering a set of key parameters with their uncertainties (input), a surrogate model in the form of a neural network was produced. The set of neural networks is then used to carry out a sensitivity analysis, which consists of a global variance analysis with the determination of the Sobol indices for all responses. The sensitivity indices are obtained for the input parameters by an approach based on the use of polynomial chaos. The present exercise helped to develop a methodological flow scheme and to consolidate the use of the URANIE tool in the framework of parallel calculations. Finally, the use of polynomial chaos allowed computing high-order sensitivity indices, thus highlighting and classifying the influence of the identified uncertainties on each response of the analysis (single and interaction effects). (authors)
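The global variance (Sobol) analysis mentioned above can be sketched with a pick-freeze Monte Carlo estimator of first-order indices. The surrogate below is a toy additive model standing in for the paper's neural networks; its coefficients are invented so that the exact indices are known analytically:

```python
import numpy as np

rng = np.random.default_rng(0)

def surrogate(x):
    # Toy stand-in for a trained surrogate model (assumption, not URANIE's):
    # an additive model whose exact Sobol indices are known analytically.
    return 4.0 * x[:, 0] + 1.0 * x[:, 1]

def first_order_sobol(model, dim, n=200_000):
    """Pick-freeze Monte Carlo estimator of first-order Sobol indices
    for uniform inputs on [0, 1]^dim."""
    a = rng.random((n, dim))
    b = rng.random((n, dim))
    ya = model(a)
    var = ya.var()
    f0 = ya.mean()
    indices = []
    for i in range(dim):
        ab = b.copy()
        ab[:, i] = a[:, i]  # freeze coordinate i from sample A
        indices.append(((ya * model(ab)).mean() - f0 ** 2) / var)
    return indices

s1, s2 = first_order_sobol(surrogate, 2)
# Analytical indices for 4*x1 + x2 with uniform inputs: 16/17 and 1/17.
print(f"S1 ~ {s1:.3f}, S2 ~ {s2:.3f}")
```

In practice a polynomial chaos expansion, as used in the paper, yields the same indices (including higher orders) directly from the expansion coefficients rather than by sampling.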
Generalized Uncertainty Principle and Recent Cosmic Inflation Observations
Abdel Nasser Tawfik; Abdel Magied Diab
2014-10-29
The recent Background Imaging of Cosmic Extragalactic Polarization (BICEP2) observations are regarded as evidence for cosmic inflation. BICEP2 provided the first direct evidence for inflation, determined its energy scale, and offered hints of quantum gravitational processes. The tensor-to-scalar fluctuation ratio $r$, which is the canonical measurement of the gravitational waves, was estimated as $r=0.2_{-0.05}^{+0.07}$. Apparently, this value agrees well with the upper bound corresponding to PLANCK, $r\\leq 0.012$, and with the WMAP9 experiment, $r=0.2$. It is believed that the existence of a minimal length is one of the greatest predictions leading to modifications of the Heisenberg uncertainty principle, i.e., a generalized uncertainty principle (GUP), at the Planck scale. In the present work, we investigate the possibility of interpreting the recent BICEP2 observations through quantum gravity or the GUP. We estimate the slow-roll parameters and the tensorial and scalar density fluctuations, which are characterized by the scalar field $\\phi$. Taking into account the background (matter and radiation) energy density, $\\phi$ is assumed to interact with gravity and with itself. We first review the Friedmann-Lemaitre-Robertson-Walker (FLRW) Universe and then suggest a modification of the Friedmann equation due to the GUP. By using a single potential for a chaotic inflation model, various inflationary parameters are estimated and compared with the PLANCK and BICEP2 observations. While the GUP is conjectured to modify the expansion of the early Universe (Hubble parameter and scale factor), two inflation potentials based on a certain minimal supersymmetric extension of the standard model result in $r$ and a spectral index matching well with the observations. Relative to the BICEP2 observations, our estimate of $r$ depends on the inflation potential and the scalar field; a power-law inflation potential does not match the observations.
Miller, C.; Little, C.A.
1982-08-01
The purpose is to summarize estimates based on currently available data of the uncertainty associated with radiological assessment models. The models being examined herein are those recommended previously for use in breeder reactor assessments. Uncertainty estimates are presented for models of atmospheric and hydrologic transport, terrestrial and aquatic food-chain bioaccumulation, and internal and external dosimetry. Both long-term and short-term release conditions are discussed. The uncertainty estimates presented in this report indicate that, for many sites, generic models and representative parameter values may be used to calculate doses from annual average radionuclide releases when these calculated doses are on the order of one-tenth or less of a relevant dose limit. For short-term, accidental releases, especially those from breeder reactors located in sites dominated by complex terrain and/or coastal meteorology, the uncertainty in the dose calculations may be much larger than an order of magnitude. As a result, it may be necessary to incorporate site-specific information into the dose calculation under these circumstances to reduce this uncertainty. However, even using site-specific information, natural variability and the uncertainties in the dose conversion factor will likely result in an overall uncertainty of greater than an order of magnitude for predictions of dose or concentration in environmental media following short-term releases.
A Two-Step Approach to Uncertainty Quantification of Core Simulators
DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)
Yankov, Artem; Collins, Benjamin; Klein, Markus; Jessee, Matthew A.; Zwermann, Winfried; Velkov, Kiril; Pautz, Andreas; Downar, Thomas
2012-01-01
For the multiple sources of error introduced into the standard computational regime for simulating reactor cores, rigorous uncertainty analysis methods are available primarily to quantify the effects of cross section uncertainties. Two methods for propagating cross section uncertainties through core simulators are the XSUSA statistical approach and the “two-step” method. The XSUSA approach, which is based on the SUSA code package, is fundamentally a stochastic sampling method. Alternatively, the two-step method utilizes generalized perturbation theory in the first step and stochastic sampling in the second step. The consistency of these two methods in quantifying uncertainties in the multiplication factor and in the core power distribution was examined in the framework of phase I-3 of the OECD Uncertainty Analysis in Modeling benchmark. With the Three Mile Island Unit 1 core as a base model for analysis, the XSUSA and two-step methods were applied with certain limitations, and the results were compared to those produced by other stochastic sampling-based codes. Based on the uncertainty analysis results, conclusions were drawn as to the method that is currently more viable for computing uncertainties in burnup and transient calculations.
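As a rough illustration of the stochastic-sampling side of these methods (not the actual XSUSA/SUSA implementation), one can draw perturbed cross sections from an assumed covariance matrix and propagate each sample through a model of the multiplication factor. The one-group "simulator" and covariance below are invented:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical one-group cross-section means and covariance (illustrative,
# NOT evaluated nuclear data): 2% relative standard deviations with a
# 0.5 correlation between nu*Sigma_f and Sigma_a.
mean_xs = np.array([1.50, 0.40])
rel, corr = 0.02, 0.5
cov = np.array([[rel**2, corr * rel**2],
                [corr * rel**2, rel**2]]) * np.outer(mean_xs, mean_xs)

# Toy infinite-medium "simulator": k = nu*Sigma_f / Sigma_a.
samples = rng.multivariate_normal(mean_xs, cov, size=50_000)
k = samples[:, 0] / samples[:, 1]
print(f"k = {k.mean():.4f} +/- {k.std():.4f}")
```

The two-step method would instead compute sensitivity coefficients of k to each cross section by generalized perturbation theory and then sample only in that reduced sensitivity space.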
Jun, Mina
2007-01-01
Estimating, presenting, and assessing uncertainties are important parts in assessment of a complex system. This thesis focuses on the assessment of uncertainty in the price module and the climate module in the Aviation ...
Aghamohammadi, Aliakbar
2014-05-07
This dissertation addresses the problem of stochastic optimal control with imperfect measurements. The main application of interest is robot motion planning under uncertainty. In the presence of process uncertainty and imperfect measurements...
Statistical Assessment of Proton Treatment Plans Under Setup and Range Uncertainties
Park, Peter C.; Cheung, Joey P.; Zhu, X. Ronald; Lee, Andrew K.; Sahoo, Narayan; Tucker, Susan L.; Liu, Wei; Li, Heng; Mohan, Radhe; Court, Laurence E.; Dong, Lei
2013-08-01
Purpose: To evaluate a method for quantifying the effect of setup errors and range uncertainties on dose distribution and dose–volume histogram using statistical parameters; and to assess existing planning practice in selected treatment sites under setup and range uncertainties. Methods and Materials: Twenty passively scattered proton lung cancer plans, 10 prostate plans, and 1 brain cancer scanning-beam proton plan were analyzed. To account for the dose under uncertainties, we performed a comprehensive simulation in which the dose was recalculated 600 times per given plan under the influence of random and systematic setup errors and proton range errors. On the basis of simulation results, we determined the probability of dose variations and calculated the expected values and standard deviations of dose–volume histograms. The uncertainties in dose were spatially visualized on the planning CT as a probability map of failure to achieve target coverage or of overdose of critical structures. Results: The expected value of target coverage under the uncertainties was consistently lower than the nominal value determined from the clinical target volume coverage without setup error or range uncertainty, with mean differences of −1.1% (−0.9% for breath-hold), −0.3%, and −2.2% for the lung, prostate, and brain cases, respectively. The organs whose dose was most sensitive to the uncertainties were the esophagus and spinal cord for lung, the rectum for prostate, and the brain stem for brain cancer. Conclusions: A clinically feasible robustness plan analysis tool based on direct dose calculation and statistical simulation has been developed. Both the expected value and the standard deviation are useful for evaluating the impact of uncertainties. The existing proton beam planning method used in this institution seems to be adequate in terms of target coverage. However, structures that are small in volume or located near the target area showed greater sensitivity to uncertainties.
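The repeated-recalculation idea can be illustrated in one dimension: shift a toy dose profile by sampled systematic (per-course) and random (per-fraction) setup errors and collect the distribution of target coverage. All geometry and error magnitudes below are invented for illustration, not the paper's clinical data:

```python
import numpy as np

rng = np.random.default_rng(2)

def dose(x, shift):
    # Toy 1-D dose profile: 100% dose for |x - shift| <= 5 cm, falling
    # linearly to zero over a 1 cm penumbra (stand-in for a 3-D recalculation).
    return np.clip(6.0 - np.abs(x - shift), 0.0, 1.0)

target = np.linspace(-5.0, 5.0, 201)  # 1-D "clinical target volume"

def coverage(shift):
    """Fraction of the target receiving at least 95% of prescription."""
    return float(np.mean(dose(target, shift) >= 0.95))

# Simulate 600 courses of 30 fractions each, with systematic and random
# setup errors of 3 mm standard deviation (assumed values).
n_courses, n_fracs = 600, 30
cov_samples = []
for _ in range(n_courses):
    systematic = rng.normal(0.0, 0.3)                  # cm, per course
    shifts = systematic + rng.normal(0.0, 0.3, n_fracs)  # cm, per fraction
    cov_samples.append(np.mean([coverage(s) for s in shifts]))
cov_samples = np.array(cov_samples)

print(f"expected coverage {cov_samples.mean():.3f} "
      f"(nominal {coverage(0.0):.3f}), sd {cov_samples.std():.3f}")
```

The expected coverage falls below the nominal (error-free) value, mirroring the paper's observation, and the spread across courses plays the role of the standard deviation attached to each DVH point.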
Use of SUSA in Uncertainty and Sensitivity Analysis for INL VHTR Coupled Codes
Gerhard Strydom
2010-06-01
The need for a defendable and systematic Uncertainty and Sensitivity analysis approach that conforms to the Code Scaling, Applicability, and Uncertainty (CSAU) process, and that could be used for a wide variety of software codes, was defined in 2008. The GRS (Gesellschaft für Anlagen- und Reaktorsicherheit) company of Germany has developed one type of CSAU approach that is particularly well suited for legacy coupled core analysis codes, and a trial version of their commercial software product SUSA (Software for Uncertainty and Sensitivity Analyses) was acquired on May 12, 2010. This interim milestone report provides an overview of the current status of the implementation and testing of SUSA at the INL VHTR Project Office.
Uncertainty Analysis Framework - Hanford Site-Wide Groundwater Flow and Transport Model
Cole, Charles R.; Bergeron, Marcel P.; Murray, Christopher J.; Thorne, Paul D.; Wurstner, Signe K.; Rogers, Phillip M.
2001-11-09
Pacific Northwest National Laboratory (PNNL) embarked on a new initiative to strengthen the technical defensibility of the predictions being made with a site-wide groundwater flow and transport model at the U.S. Department of Energy Hanford Site in southeastern Washington State. In FY 2000, the focus of the initiative was on the characterization of major uncertainties in the current conceptual model that would affect model predictions. The long-term goals of the initiative are the development and implementation of an uncertainty estimation methodology in future assessments and analyses using the site-wide model. This report focuses on the development and implementation of an uncertainty analysis framework.
Dakota uncertainty quantification methods applied to the NEK-5000 SAHEX model.
Weirs, V. Gregory
2014-03-01
This report summarizes the results of a NEAMS project focused on the use of uncertainty and sensitivity analysis methods within the NEK-5000 and Dakota software framework for assessing failure probabilities as part of probabilistic risk assessment. NEK-5000 is a software tool under development at Argonne National Laboratory to perform computational fluid dynamics calculations for applications such as thermohydraulics of nuclear reactor cores. Dakota is a software tool developed at Sandia National Laboratories containing optimization, sensitivity analysis, and uncertainty quantification algorithms. The goal of this work is to demonstrate the use of uncertainty quantification methods in Dakota with NEK-5000.
Jet energy and missing ET systematic uncertainties for early Run 2 data with the ATLAS detector
Alkire, Steven Patrick; The ATLAS collaboration
2015-01-01
The jet energy scale and resolution and their systematic uncertainties are determined for jets measured with the ATLAS detector using proton-proton collision data with a centre-of-mass energy of \\sqrt{s}=13 TeV. Jets are clustered with the anti-kt algorithm with R=0.4 and calibrated using MC simulations. Uncertainties are based on measurements using 2012 data, measurements using early 2015 data and dedicated studies using MC simulations. In addition, systematic uncertainties on soft activity and its contributions to the missing transverse energy are determined using MC simulations and validated using early 2015 data.
Balance Calibration – A Method for Assigning a Direct-Reading Uncertainty to an Electronic Balance.
Mike Stears
2010-07-01
Paper Title: Balance Calibration – A method for assigning a direct-reading uncertainty to an electronic balance. Intended Audience: Those who calibrate or use electronic balances. Abstract: As a calibration facility, we provide on-site (at the customer’s location) calibrations of electronic balances for customers within our company. In our experience, most of our customers are not using their balance as a comparator, but simply putting an unknown quantity on the balance and reading the displayed mass value. Manufacturer’s specifications for balances typically include specifications such as readability, repeatability, linearity, and sensitivity temperature drift, but what does this all mean when the balance user simply reads the displayed mass value and accepts the reading as the true value? This paper discusses a method for assigning a direct-reading uncertainty to a balance based upon the observed calibration data and the environment where the balance is being used. The method requires input from the customer regarding the environment where the balance is used and encourages discussion with the customer regarding sources of uncertainty and possible means for improvement; the calibration process becomes an educational opportunity for the balance user as well as calibration personnel. This paper will cover the uncertainty analysis applied to the calibration weights used for the field calibration of balances; the uncertainty is calculated over the range of environmental conditions typically encountered in the field and the resulting range of air density. The temperature stability in the area of the balance is discussed with the customer and the temperature range over which the balance calibration is valid is decided upon; the decision is based upon the uncertainty needs of the customer and the desired rigor in monitoring by the customer. 
Once the environmental limitations are decided, the calibration is performed and the measurement data is entered into a custom spreadsheet. The spreadsheet uses measurement results, along with the manufacturer’s specifications, to assign a direct-read measurement uncertainty to the balance. The fact that the assigned uncertainty is a best-case uncertainty is discussed with the customer; the assigned uncertainty contains no allowance for contributions associated with the unknown weighing sample, such as density, static charges, magnetism, etc. The attendee will learn uncertainty considerations associated with balance calibrations along with one method for assigning an uncertainty to a balance used for non-comparison measurements.
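One plausible sketch of the spreadsheet's combination is a GUM-style quadrature sum of the manufacturer's specification terms and the observed repeatability, expanded with a coverage factor. The spec values, the rectangular-distribution treatment, and the parameter names below are assumptions for illustration, not the paper's actual spreadsheet:

```python
import math

def direct_reading_uncertainty(readability, repeatability_sd,
                               linearity, temp_drift_per_K, temp_range_K,
                               k=2.0):
    """Combine balance spec/calibration terms in quadrature (GUM-style).
    Rectangular-distribution terms (readability half-interval, linearity,
    temperature drift over the agreed range) are divided by sqrt(3) to get
    standard uncertainties; repeatability is already a standard deviation.
    Returns the expanded uncertainty U = k * u_c."""
    u_read = (readability / 2.0) / math.sqrt(3.0)
    u_lin = linearity / math.sqrt(3.0)
    u_drift = (temp_drift_per_K * temp_range_K) / math.sqrt(3.0)
    u_c = math.sqrt(u_read**2 + repeatability_sd**2 + u_lin**2 + u_drift**2)
    return k * u_c

# Hypothetical specs for a 0.1 mg-readability analytical balance used over
# an agreed +/- 2 K temperature window (illustrative numbers, all in grams):
U = direct_reading_uncertainty(readability=0.1e-3,
                               repeatability_sd=0.1e-3,
                               linearity=0.2e-3,
                               temp_drift_per_K=0.05e-3,
                               temp_range_K=2.0)
print(f"expanded uncertainty (k=2): {U * 1000:.3f} mg")
```

As the abstract notes, such a figure is a best-case uncertainty: it carries no allowance for sample-dependent effects such as density (air buoyancy on the unknown), static charges, or magnetism.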
Boyer, Edmond
Probabilistic model identification of the uncertainties in a bit-rock interaction model for the nonlinear dynamics of a drill ... Simple models are usually considered in the analysis; this is an important constraint when uncertainties in the bit-rock interaction model must be accounted for.
Joint location of microseismic events in the presence of velocity uncertainty
Poliannikov, Oleg V.
The locations of seismic events are used to infer reservoir properties and to guide future production activity, as well as to determine and understand the stress field. Thus, locating seismic events with uncertainty ...
Clements, Emily Baker
2013-01-01
Most space programs experience significant cost and schedule growth over the course of program development. Poor uncertainty management has been identified as one of the leading causes of program cost and schedule overruns. ...
Morris, J.
The electric power sector, which accounts for approximately 40% of U.S. carbon dioxide emissions, will be a critical component of any policy the U.S. government pursues to confront climate change. In the context of uncertainty ...
Cao, Jianshu
Linear and nonlinear response functions of the Morse oscillator: Classical divergence and the uncertainty principle. The goal is to calculate the linear and nonlinear quantum response functions for microcanonical Morse systems and to demonstrate the linear divergence in the corresponding classical response function. On the basis of the uncertainty ...
Managing uncertainty in systems with a Valuation Approach for Strategic Changeability
Fitzgerald, Matthew Edward
2012-01-01
Complex engineering systems are frequently exposed to large amounts of uncertainty, as many exogenous, uncontrollable conditions change over time and can affect the performance and value delivery of a system. Engineering ...
Transport of exotic anti-nuclei: II- Antiproton and Antideuteron astrophysical uncertainties
D. Maurin; R. Taillet; C. Combet
2009-01-21
We use a 1D propagation model to study the dependence of the pbar and dbar exotic fluxes on the transport parameters. The simple analytical solutions allow us i) to clarify the origin of the astrophysical uncertainties, and ii) to compare two models used for {\\em signal} predictions, namely the constant and the linear Galactic wind models. We also study how these uncertainties should be reduced using forthcoming nuclear cosmic ray data. We confirm that the degeneracy of the transport parameters for a given propagation model leads to very different fluxes for primary antinuclei (~10^2). However, we show that with forthcoming data, these uncertainties could be greatly reduced (~2). As the precision will increase, the astrophysical uncertainty could then be dominated by our ignorance of the correct spatial dependence for some of the transport parameters: for instance, the constant and the linear wind models do not predict the same amount of exotic pbar at low energy.
Threat assessment for safe navigation in environments with uncertainty in predictability
Aoudé, Georges Salim
2011-01-01
This thesis develops threat assessment algorithms to improve the safety of the decision making of autonomous and human-operated vehicles navigating in dynamic and uncertain environments, where the source of uncertainty is ...
Pruet, J
2007-06-23
This report describes Kiwi, a program developed at Livermore to enable mature studies of the relation between imperfectly known nuclear physics and uncertainties in simulations of complicated systems. Kiwi includes a library of evaluated nuclear data uncertainties, tools for modifying data according to these uncertainties, and a simple interface for generating processed data used by transport codes. As well, Kiwi provides access to calculations of k eigenvalues for critical assemblies. This allows the user to check implications of data modifications against integral experiments for multiplying systems. Kiwi is written in python. The uncertainty library has the same format and directory structure as the native ENDL used at Livermore. Calculations for critical assemblies rely on deterministic and Monte Carlo codes developed by B division.
Investment Timing and Capacity Choice for Small-Scale Wind Power Under Uncertainty
Fleten, Stein-Erik; Maribu, Karl Magnus
2004-01-01
Power production from wind power has stochastic inflows, and ...
Quantification of the impact of climate uncertainty on regional air quality
Liao, K.-J.
Uncertainties in calculated impacts of climate forecasts on future regional air quality are investigated using downscaled MM5 meteorological fields from the NASA GISS and MIT IGSM global models and the CMAQ model in 2050 ...
Factors affecting remotely sensed snow water equivalent uncertainty
Jiarui Dong; Jeffrey P. Walker; Paul R. Houser
2005-04-24
State-of-the-art passive microwave remote sensing-based snow water ...
Sobes, Vladimir
2014-01-01
A new methodology has been developed that couples differential cross section data evaluation with integral benchmark analysis for improved uncertainty quantification. The new methodology was applied to the two new copper ...
Horvitz, Eric
Uncertainty, Utility, and Misunderstanding: A Decision-Theoretic Perspective on Grounding. ... referred to as grounding. We explore representations and control strategies for grounding utterances, founded on performing explicit probabilistic inference about failures in communication. The methods are informed by psychological ...
Sensitivity and uncertainty analyses for thermo-hydraulic calculation of research reactor
Hartini, Entin; Andiwijayakusuma, Dinan [Center for Development of Nuclear Informatics - National Nuclear Energy Agency PUSPIPTEK, Serpong, Tangerang, Banten (Indonesia)]; Isnaeni, Muh Darwis [Center for Reactor Technology and Nuclear Safety - National Nuclear Energy Agency PUSPIPTEK, Serpong, Tangerang, Banten (Indonesia)]
2013-09-09
The sensitivity and uncertainty analysis of input parameters for thermohydraulic calculations of a research reactor was successfully performed in this research. The uncertainty analysis was carried out on input parameters for the thermohydraulic sub-channel calculation using the COOLOD-N code. The input parameters include the radial peaking factor, the increase in bulk coolant temperature, the heat flux factor, and the increase in cladding and fuel meat temperature for a research reactor using plate fuel elements. Input uncertainties of 1%–4% were used in the nominal power calculation. The bubble detachment parameters were computed for the S ratio (the safety margin against the onset of flow instability), which was used to determine the safety level in line with the design of the 'Reactor Serba Guna-G. A. Siwabessy' (RSG-GA Siwabessy). It was concluded from the calculation results that using input uncertainties of more than 3% goes beyond the safety margin of reactor operation.
UNIVERSITY OF CALGARY Reduction of Wellbore Positional Uncertainty During Directional Drilling
Calgary, University of
This thesis improves the wellbore positional accuracy of directional drilling operations using Measurement While Drilling (MWD) survey correction, compensating for drilling-assembly magnetic interference to solve the problem of wellbore ...
A rock physics strategy for quantifying uncertainty in common hydrocarbon indicators
Mavko, G.M.; Mukerji, T.
1995-12-31
We present a strategy for hydrocarbon detection and for quantifying the uncertainty in hydrocarbon indicators, by combining statistical techniques with deterministic rock physics relations derived from the laboratory and theory. A simple example combines Gassmann's deterministic equation for fluid substitution with statistics inferred from log and core data, to detect hydrocarbons from observed seismic velocities. The formulation gives the most likely estimate of the pore fluid modulus, corresponding to each observed velocity, and also the uncertainty of that interpretation. The variances of seismic velocity and porosity in the calibration data determine the uncertainty of the pore fluid interpretation. As expected, adding information about shear wave velocity, from AVO for example, narrows the uncertainty of the hydrocarbon indicator. The formulation offers a convenient way to implement deterministic fluid substitution equations in the realistic case when the reference porosity and velocity span a range of values.
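Gassmann's fluid-substitution equation, the deterministic core of this strategy, can be sketched directly. The rock and fluid moduli below are illustrative textbook-style values, not the paper's calibration data:

```python
def gassmann_ksat(k_dry, k_mineral, k_fluid, phi):
    """Gassmann's equation: saturated-rock bulk modulus from the dry-frame,
    mineral, and pore-fluid bulk moduli and porosity (moduli in GPa)."""
    a = (1.0 - k_dry / k_mineral) ** 2
    b = phi / k_fluid + (1.0 - phi) / k_mineral - k_dry / k_mineral**2
    return k_dry + a / b

# Illustrative sandstone values (assumed): quartz mineral modulus 36 GPa,
# dry frame 8 GPa, porosity 0.25; brine ~2.8 GPa vs gas ~0.1 GPa pore fluid.
k_brine = gassmann_ksat(k_dry=8.0, k_mineral=36.0, k_fluid=2.8, phi=0.25)
k_gas = gassmann_ksat(k_dry=8.0, k_mineral=36.0, k_fluid=0.1, phi=0.25)
print(f"K_sat brine ~ {k_brine:.2f} GPa, gas ~ {k_gas:.2f} GPa")
```

The statistical layer of the paper's strategy inverts this relation: given an observed velocity (hence a saturated modulus) and distributions for porosity and the dry frame from log and core data, it returns the most likely pore-fluid modulus together with its uncertainty.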
Framework for Modeling the Uncertainty of Future Events in Life Cycle Assessment
Chen, Yi-Fen; Simon, Rachel; Dornfeld, David
2013-01-01
The uncertainty of Life Cycle Assessment is a very important ...
Compensating for model uncertainty in the control of cooperative field robots
Sujan, Vivek Anand, 1972-
2002-01-01
Current control and planning algorithms are largely unsuitable for mobile robots in unstructured field environments due to uncertainties in the environment, task, robot models, and sensors. A key problem is that it is often ...
Effects of the Uncertainty about Global Economic Recovery on Energy Transition and CO2 Price
Durand-Lasserve, Olivier
This paper examines the impact that uncertainty over economic growth may have on global energy transition and CO2 prices. We use a general-equilibrium model derived from MERGE, and define several stochastic scenarios for ...
Lynch, Sharon G.; Kroencke, Dawn C.; Denney, Douglas R.
2001-12-01
The relationship between disability and depression was studied in 188 patients with clinically definite multiple sclerosis (MS). Patients were administered the Zung Self-Rating Depression Scale, Ways of Coping, Uncertainty of Illness Scale, and Hope...
Dealing with uncertainty in estimating average annual flood damage for ungaged watersheds
Toneatti, Silvana Victoria
1996-01-01
decades. A new risk-based analysis approach, currently being adopted in the United States, is based on modifying the conventional procedures to explicitly model the uncertainties involved in developing the required hydrologic, hydraulic, and economic...
A simplified analysis of uncertainty propagation in inherently controlled ATWS events
Wade, D.C.
1987-01-01
The quasi-static approach can be used to provide useful insight concerning the propagation of uncertainties in the inherent response to ATWS events. At issue is how uncertainties in the reactivity coefficients and in the thermal-hydraulics and materials properties propagate to yield uncertainties in the asymptotic temperatures attained upon inherent shutdown. The basic notion to be quantified is that many of the same physical phenomena contribute to both the reactivity increase upon power reduction and the reactivity decrease upon core temperature rise. Since these reactivities cancel by definition, a good deal of uncertainty cancellation must also occur of necessity. For example, if the Doppler coefficient is overpredicted, too large a positive reactivity insertion is predicted upon power reduction and collapse of the ΔT across the fuel pin. However, too large a negative reactivity is also predicted upon the compensating increase in the isothermal core average temperature, which includes the fuel Doppler effect.
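The cancellation argument can be made concrete with a small Monte Carlo: sample a Doppler coefficient once and let it enter both the power-defect term and the temperature-feedback term of a simplified quasi-static balance, then compare against naive propagation that treats the two occurrences as independent. All coefficients and scalings are invented for illustration, not Wade's actual parameters:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simplified quasi-static balance for a power reduction to ~zero power:
#   0 = A + B + C * dT_in   =>   dT_in = -(A + B) / C,
# where the Doppler coefficient contributes to both the power-defect
# reactivity A and the temperature coefficient C (illustrative values).
n = 100_000
doppler = rng.normal(-0.8, 0.16, n)   # 20% relative uncertainty (assumed)
other_a = rng.normal(0.4, 0.04, n)    # non-Doppler power-defect terms
other_c = rng.normal(-0.6, 0.06, n)   # non-Doppler temperature feedback

A = -doppler * 1.0 + other_a          # Doppler gain on power reduction
B = 0.2                               # other fixed reactivity contribution
C = doppler * 0.5 + other_c           # Doppler also in temperature feedback

dT_correlated = -(A + B) / C          # same Doppler sample in A and C

# Naive propagation that ignores the common Doppler term:
C_indep = rng.normal(-0.8, 0.16, n) * 0.5 + other_c
dT_independent = -(A + B) / C_indep

print(f"sd with cancellation: {dT_correlated.std():.3f}, "
      f"ignoring it: {dT_independent.std():.3f}")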
Seelhof, Michael
2014-01-01
A computer model was developed to find optimal long-term investment strategies for the electric power sector under uncertainty with respect to future regulatory regimes and market conditions. The model is based on a ...
Uncertainties in Estimating Moisture Fluxes over the Intra-Americas Sea
Alberto M. Mestas-Nuñez
Single-year sounding observations bear large uncertainties because of interannual variability. ... Great Plains low-level jet (GPLLJ) along the lee (east) side of the Rocky Mountain range and southerly flow ...
Grossmann, Ignacio E.
Pittsburgh, PA 15213; Vikas Goel, ExxonMobil Upstream Research Company, Houston, TX 77098. In many problems that involve uncertainty in the data, which are represented by probability distributions, there are two broad ...
Mikaelian, Tsoline
2009-01-01
Complex systems and enterprises, such as those typical in the aerospace industry, are subject to uncertainties that may lead to suboptimal performance or even catastrophic failures if unmanaged. This work focuses on ...
Miller, Bruno, 1974-
2005-01-01
Real options analysis is being increasingly used as a tool to evaluate investments under uncertainty; however, traditional real options methodologies have some shortcomings that limit their utility, such as the use of the ...
Gregor, Jeffrey Allen
2003-01-01
The United States Navy is facing a need for a novel surface combatant capability. This new system of ships must be designed to meet the uncertainty associated with constantly changing required mission capabilities, threats, ...
Ereira, Eleanor Charlotte
2010-01-01
Climate change is a threat that could be mitigated by introducing new energy technologies into the electricity market that emit fewer greenhouse gas (GHG) emissions. We face many uncertainties that would affect the demand ...
Life Cycle Regulation of Transportation Fuels: Uncertainty and its Policy Implications
Plevin, Richard Jay
2010-01-01
2.3.2 Methodological issues with LCA; 2.3.3 Attributional versus consequential LCA; 2.3.4 Economic Input-Output LCA; 2.4 Uncertainty in ...
Seo, Sangtaek
2006-04-12
Agricultural producers face uncertain agricultural production and market conditions. Much of the uncertainty faced by agricultural producers cannot be controlled by the producer, but can be managed. Several risk management programs are available...
Arumugam, Sankar
CAREER: Climate Informed Uncertainty Analyses for Integrated Water Resources Sustainability: the relative roles of climate variability in modulating seasonal streamflow and water quality variability; the role of forecasts in improving water supply and water quality management and in developing adaptive water management ...
Cardoni, Jeffrey N.; Kalinich, Donald A.
2014-02-01
Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) on the Fukushima Daiichi Unit 1 (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). However, that study only examined a handful of various model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, vessel lower head failure, etc.). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.
Reducing Uncertainty in Fisheries Management: The Time for Fishers' Ecological Knowledge
Carr, Liam
2012-07-16
This dissertation work presents a novel method for addressing system uncertainty to improve management of a small-scale fishery in St. Croix, United States Virgin Islands. Using fishers' ecological knowledge (FEK), this research examines existing...
Quantitative Analysis of Variability and Uncertainty in Environmental Data and Models
Frey, H. Christopher
Performance, Emissions, and Cost of Combustion-Based NOx Controls for Wall and Tangential Furnace Coal ...; 3. Variability and Uncertainty in NOx Emission Measurements; 3.1 General; 1.2 NOx Regulations
Determination of uncertainty in reserves estimate from analysis of production decline data
Wang, Yuhong
2007-09-17
Analysts increasingly have used probabilistic approaches to evaluate the uncertainty in reserves estimates based on a decline curve analysis. This is because the results represent statistical analysis of historical data ...
PDF uncertainties on the W boson mass measurement from the lepton transverse momentum distribution
Giuseppe Bozzi; Luca Citelli; Alessandro Vicini
2015-05-22
We study the charged-current Drell-Yan process and evaluate the proton parton density uncertainties on the lepton transverse momentum distribution and their impact on the determination of the W-boson mass. We consider the global PDF sets CT10, MSTW2008CPdeut, NNPDF2.3, NNPDF3.0, and MMHT2014, and apply the PDF4LHC recipe to combine the individual results, obtaining an uncertainty on MW that ranges between +-18 and +-24 MeV, depending on the final state, collider energy, and collider type. We discuss the dependence of the uncertainty on the acceptance cuts and the role of the individual parton densities in the final result. We remark that some PDF sets predict an uncertainty on MW of O(10 MeV); this encouraging result is spoiled, in the combined analysis of the different sets, by a sizeable spread of the central values predicted by the different groups.
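The envelope-style combination the abstract alludes to can be sketched in a few lines. This is a minimal illustration of the idea, not the paper's calculation: the set names, shifts, and uncertainties below are purely hypothetical numbers chosen to show how a spread of central values inflates the combined uncertainty beyond any single set's own error.

```python
# Sketch: envelope-style combination of M_W determinations from several
# PDF sets, in the spirit of the (older) PDF4LHC envelope recipe.
# All numbers are illustrative, not the paper's results.

def combine_envelope(results):
    """results: list of (central, symmetric_uncertainty) tuples in MeV.
    Returns (combined_central, combined_uncertainty): the midpoint and
    half-width of the envelope spanned by all central +/- unc bands."""
    lo = min(c - u for c, u in results)
    hi = max(c + u for c, u in results)
    return 0.5 * (hi + lo), 0.5 * (hi - lo)

# Hypothetical shifts of M_W (MeV) from a reference value, one per PDF set:
sets = [(0.0, 10.0), (8.0, 12.0), (-5.0, 9.0)]
central, unc = combine_envelope(sets)
```

Even though each individual set quotes an uncertainty of about 10 MeV, the spread of the central values (-5 to +8 MeV here) widens the envelope to 17 MeV, which is the effect the abstract remarks on.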
Continuous reservoir simulation incorporating uncertainty quantification and real-time data
Holmes, Jay Cuthbert
2009-05-15
of uncertainty in resulting forecasts. A new technology to allow this process to run continuously with little human interaction is real-time production and pressure data, which can be automatically integrated into runs. Two tests of this continuous simulation...
Investment Timing and Capacity Choice for Small-Scale Wind Power Under Uncertainty
Fleten, Stein-Erik; Maribu, Karl Magnus
2004-01-01
A 20-year industry plan for small wind turbine technology has estimated that small wind turbines could contribute to ...
System level assessment of uncertainty in aviation environmental policy impact analysis
Liem, Rhea Patricia
2010-01-01
This thesis demonstrates the assessment of uncertainty of a simulation model at the system level, which takes into account the interaction between the modules that comprise the system. Results from this system level ...
Strategic investment in power generation under uncertainty : Electric Reliability Council of Texas
Chiyangwa, Diana Kudakwashe
2010-01-01
The purpose of this study is to develop a strategy for investment in power generation technologies in the future given the uncertainties in climate policy and fuel prices. First, such studies are commonly conducted using ...
Lyons, Jeffrey M. (Jeffrey Michael), 1973-
2000-01-01
As the use of distributed engineering models becomes more prevalent, engineers need tools to evaluate the quality of these models and understand how subsystem uncertainty affects predictions of system behavior. This thesis ...
Climate Change Impacts on Extreme Events in the United States: An Uncertainty Analysis
Monier, Erwan
Extreme weather and climate events, such as heat waves, droughts and severe precipitation events, have substantial impacts on ecosystems and the economy. However, future climate simulations display large uncertainty in ...
Bei, Naifang
The purpose of the present study is to investigate the sensitivity of ozone (O3)[(O subscript 3)] predictions in the Mexico City Metropolitan Area (MCMA) to meteorological initial uncertainties and planetary boundary layer ...
The Low energy structure of the Nucleon-Nucleon interaction: Statistical vs Systematic Uncertainties
R. Navarro Perez; J. E. Amaro; E. Ruiz Arriola
2015-05-15
We analyze the low energy NN interaction by confronting statistical vs systematic uncertainties. This is based on the analysis of 6 different potentials fitted to the Granada-2013 database, where a statistically meaningful partial wave analysis comprising a total of $6713$ np and pp published scattering data from 1950 to 2013 below the pion production threshold has been made. This required the design and implementation of three new interactions, which are introduced here. We extract threshold-parameter uncertainties from the coupled channel effective range expansion up to $j \le 5$. We find that for threshold parameters the systematic uncertainties are generally at least an order of magnitude larger than the statistical uncertainties. Similar results are found for np phase-shifts and amplitude parameters.
Huang, Yinlun
Technology Evaluation and Decision Making for Sustainability Enhancement of Industrial Systems. A case study on sustainable development of biodiesel manufacturing demonstrates methodological efficacy. Keywords: sustainability enhancement, decision making, uncertainty, interval-parameter-based analysis, technology ...
Northern winter climate change: Assessment of uncertainty in CMIP5 projections related
Gerber, Edwin
Stratospheric circulation could have an important impact on northern winter tropospheric climate change, given coherent variations in troposphere-stratosphere circulation. Here we assess northern winter stratospheric ...
Theory uncertainties for Higgs mass and other searches using jet bins
Stewart, Iain
Bounds on the Higgs mass from the Tevatron and LHC are determined using exclusive jet bins to maximize sensitivity. Scale variation in exclusive fixed-order predictions underestimates the perturbative uncertainty for these ...
AlMisnad, Abdulla
2014-01-01
The development of new infrastructure projects is a key part of global efforts to meet the demands of growing populations in times of increasing uncertainty. The deterministic approaches commonly used for the development ...
CMPSCI 240: Reasoning about Uncertainty (Lecture 1: Introduction, Course Logistics, Sets and Elements)
McGregor, Andrew
CMPSCI 240: Reasoning about Uncertainty, Lecture 1: Course Logistics; Sets and Elements. Course description: developing arguments and using mathematical concepts; introduction to probability.
Morris, Jennifer F. (Jennifer Faye)
2013-01-01
The electric power sector, which accounts for approximately 40% of U.S. carbon dioxide emissions, will be a critical component of any policy the U.S. government pursues to confront climate change. In the context of uncertainty ...
Rate Optimization for Polymer and CO2 Flooding Under Geologic Uncertainty
Sharma, Mohan
2012-10-19
Rate Optimization for Polymer and CO2 Flooding Under Geologic Uncertainty. M.S. thesis, Petroleum Engineering, Texas A&M University, August 2011.
Using a Monte-Carlo-based approach to evaluate the uncertainty on fringe projection technique
Molimard, Jérôme
2013-01-01
A complete uncertainty analysis of a given fringe projection set-up has been performed using a Monte-Carlo approach. In particular, the calibration procedure is taken into account. Two applications are given: at a macroscopic scale, phase noise is predominant, whilst at a microscopic scale both phase noise and calibration errors are important. Finally, the uncertainty found at the macroscopic scale is close to some experimental tests (~100 μm).
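The Monte-Carlo propagation used in such studies (in the style of GUM Supplement 1) can be sketched with a toy phase-to-height model. The model form, the 30-degree angle, and all the noise magnitudes below are illustrative assumptions, not the paper's actual set-up or calibration values.

```python
# Sketch: Monte-Carlo uncertainty propagation for a measurement chain,
# in the spirit of the fringe-projection analysis above. All numbers
# and the simplified model are illustrative assumptions.
import random, math, statistics

random.seed(0)

def height_from_phase(phase, period, angle):
    # Simplified phase-to-height conversion for fringe projection.
    return phase * period / (2.0 * math.pi * math.tan(angle))

N = 20000
samples = []
for _ in range(N):
    phase  = random.gauss(1.00, 0.02)                   # phase (rad) + noise
    period = random.gauss(2.0, 0.01)                    # fringe period (mm)
    angle  = random.gauss(math.radians(30), math.radians(0.1))  # calib. angle
    samples.append(height_from_phase(phase, period, angle))

h_mean = statistics.mean(samples)
h_std  = statistics.stdev(samples)   # standard uncertainty on the height
```

The spread of the sampled outputs directly estimates the combined standard uncertainty, with the calibration parameters (period, angle) contributing alongside the phase noise, as in the paper's two regimes.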
Study of the systematic uncertainty of tracking from $J/\psi \to p \overline{p} \pi^+ \pi^-$
Wenlong Yuan; Xiaocong Ai; Xiaobin Ji; Shenjian Chen; Yao Zhang; Linghui Wu; Liangliang Wang; Ye Yuan
2015-07-13
Based on the $J/\psi$ events collected with the BESIII detector and the corresponding Monte Carlo samples, the systematic uncertainties of tracking are studied using the control sample of $J/\psi \to p \overline{p} \pi^+ \pi^-$. Method validations and the different factors influencing the tracking efficiency are studied in detail. The tracking efficiencies of the proton and pion, and their systematic uncertainties as functions of transverse momentum and polar angle, are also discussed.
Position-Momentum Uncertainty Relations in the Presence of Quantum Memory
Fabian Furrer; Mario Berta; Marco Tomamichel; Volkher B. Scholz; Matthias Christandl
2015-01-05
A prominent formulation of the uncertainty principle identifies the fundamental quantum feature that no particle may be prepared with certain outcomes for both position and momentum measurements. Often the statistical uncertainties are thereby measured in terms of entropies providing a clear operational interpretation in information theory and cryptography. Recently, entropic uncertainty relations have been used to show that the uncertainty can be reduced in the presence of entanglement and to prove security of quantum cryptographic tasks. However, much of this recent progress has been focused on observables with only a finite number of outcomes not including Heisenberg's original setting of position and momentum observables. Here we show entropic uncertainty relations for general observables with discrete but infinite or continuous spectrum that take into account the power of an entangled observer. As an illustration, we evaluate the uncertainty relations for position and momentum measurements, which is operationally significant in that it implies security of a quantum key distribution scheme based on homodyne detection of squeezed Gaussian states.
Zhang, Xuesong; Liang, Faming; Yu, Beibei; Zong, Ziliang
2011-11-09
Estimating the uncertainty of hydrologic forecasting is valuable to water resources and other relevant decision making processes. Recently, Bayesian Neural Networks (BNNs) have proved to be powerful tools for quantifying the uncertainty of streamflow forecasting. In this study, we propose a Markov Chain Monte Carlo (MCMC) framework to incorporate the uncertainties associated with input, model structure, and parameters into BNNs. This framework allows the structure of the neural networks to change by removing or adding connections between neurons, and enables scaling of input data by using rainfall multipliers. The results show that the new BNNs outperform BNNs that only consider uncertainties associated with parameters and model structure. Critical evaluation of the posterior distributions of neural network weights, number of effective connections, rainfall multipliers, and hyper-parameters shows that the assumptions held in our BNNs are not well supported. Further understanding of the characteristics of different uncertainty sources, and including output error in the MCMC framework, are expected to enhance the application of neural networks for uncertainty analysis of hydrologic forecasting.
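The MCMC machinery behind such Bayesian uncertainty schemes can be illustrated with a minimal Metropolis sampler. A one-parameter Gaussian toy model stands in here for the Bayesian neural network; the data, proposal width, and chain length are all arbitrary illustrative choices.

```python
# Sketch: a minimal random-walk Metropolis sampler for parameter
# uncertainty, the core machinery behind Bayesian (neural-network)
# frameworks like the one described above. Toy model, not the paper's.
import random, math, statistics

random.seed(1)
data = [random.gauss(3.0, 1.0) for _ in range(50)]   # synthetic observations

def log_post(mu):
    # Gaussian likelihood with known sigma = 1 and a flat prior on mu.
    return -0.5 * sum((x - mu) ** 2 for x in data)

chain, mu = [], 0.0
for _ in range(20000):
    prop = mu + random.gauss(0.0, 0.3)               # random-walk proposal
    if math.log(random.random()) < log_post(prop) - log_post(mu):
        mu = prop                                    # accept the move
    chain.append(mu)

post = chain[5000:]                                  # discard burn-in
mu_hat = statistics.mean(post)                       # posterior mean
mu_sd  = statistics.stdev(post)                      # posterior uncertainty
```

The posterior standard deviation quantifies the parameter uncertainty given the data; a BNN applies the same accept/reject logic to network weights, connections, and rainfall multipliers jointly.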
Advancing Inverse Sensitivity/Uncertainty Methods for Nuclear Fuel Cycle Applications
Arbanas, Goran; Williams, Mark L; Leal, Luiz C; Dunn, Michael E; Khuwaileh, Bassam A.; Wang, C; Abdel-Khalik, Hany
2015-01-01
The inverse sensitivity/uncertainty quantification (IS/UQ) method has recently been implemented in the Inverse Sensitivity/UnceRtainty Estimator (INSURE) module of the AMPX system [1]. The IS/UQ method aims to quantify and prioritize the cross section measurements, along with the uncertainties needed to yield a given target response uncertainty for a nuclear application, at minimum cost. Since in some cases the extant uncertainties of the differential cross section data are already near the limits of present-day state-of-the-art measurements, requiring significantly smaller uncertainties may be unrealistic. Therefore we have incorporated integral benchmark experiment (IBE) data into the IS/UQ method using the generalized linear least-squares method, and have implemented it in the INSURE module. We show how the IS/UQ method could be applied to systematic and statistical uncertainties in a self-consistent way, and how it could be used to optimize the uncertainties of IBEs and differential cross section data simultaneously.
Da Cruz, D. F.; Rochman, D.; Koning, A. J. [Nuclear Research and Consultancy Group NRG, Westerduinweg 3, 1755 ZG Petten (Netherlands)
2012-07-01
This paper discusses the uncertainty analysis on reactivity and inventory for a typical PWR fuel element as a result of uncertainties in {sup 235,238}U nuclear data. A typical Westinghouse 3-loop fuel assembly fuelled with UO{sub 2} fuel with 4.8% enrichment has been selected. The Total Monte-Carlo method has been applied using the deterministic transport code DRAGON. This code allows the generation of the few-groups nuclear data libraries by directly using data contained in the nuclear data evaluation files. The nuclear data used in this study is from the JEFF3.1 evaluation, and the nuclear data files for {sup 238}U and {sup 235}U (randomized for the generation of the various DRAGON libraries) are taken from the nuclear data library TENDL. The total uncertainty (obtained by randomizing all {sup 238}U and {sup 235}U nuclear data in the ENDF files) on the reactor parameters has been split into different components (different nuclear reaction channels). Results show that the TMC method in combination with a deterministic transport code constitutes a powerful tool for performing uncertainty and sensitivity analysis of reactor physics parameters. (authors)
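The Total Monte-Carlo idea of splitting a total output uncertainty into per-channel contributions can be sketched with a toy response. The linear "k_eff-like" response and the channel sigmas below are illustrative assumptions standing in for the DRAGON transport calculation and the randomized TENDL files.

```python
# Sketch: Total Monte-Carlo splitting of an output uncertainty into
# contributions from different randomized input channels, as in the
# study above. A toy response stands in for the transport calculation.
import random, statistics

random.seed(2)

def response(capture, fission):
    # Illustrative smooth response to two "nuclear data" perturbations.
    return 1.0 + 0.8 * fission - 0.5 * capture

def spread(vary_capture, vary_fission, n=20000):
    out = []
    for _ in range(n):
        c = random.gauss(0.0, 0.02) if vary_capture else 0.0
        f = random.gauss(0.0, 0.03) if vary_fission else 0.0
        out.append(response(c, f))
    return statistics.stdev(out)

total   = spread(True, True)    # all channels randomized together
capture = spread(True, False)   # capture data only
fission = spread(False, True)   # fission data only
# For independent channels: total**2 ~= capture**2 + fission**2
```

Randomizing one channel at a time reproduces the paper's decomposition of the total uncertainty into reaction-channel components, and for independent inputs the component variances add up to the total.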
Frederik Reitsma; Gerhard Strydom; Bismark Tyobeka; Kostadin Ivanov
2012-10-01
The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of design and safety features with reliable high fidelity physics models and robust, efficient, and accurate codes. The uncertainties in the HTR analysis tools are today typically assessed with sensitivity analysis, and then a few important input uncertainties (typically based on a PIRT process) are varied in the analysis to find a spread in the parameter of importance. However, one wishes to apply a more fundamental approach to determine the predictive capability and accuracies of coupled neutronics/thermal-hydraulics and depletion simulations used for reactor design and safety assessment. Today there is a broader acceptance of the use of uncertainty analysis even in safety studies, and it has been accepted by regulators in some cases to replace the traditional conservative analysis. Finally, there is also a renewed focus on supplying reliable covariance data (nuclear data uncertainties) that can then be used in uncertainty methods. Uncertainty and sensitivity studies are therefore becoming an essential component of any significant effort in data and simulation improvement. In order to address uncertainty in analysis and methods in the HTGR community, the IAEA launched a Coordinated Research Project (CRP) on HTGR Uncertainty Analysis in Modelling early in 2012. The project is built on the experience of the OECD/NEA Light Water Reactor (LWR) Uncertainty Analysis in Best-Estimate Modelling (UAM) benchmark activity, but focuses specifically on the peculiarities of HTGR designs and their simulation requirements. Two benchmark problems were defined, with the prismatic type represented by the MHTGR-350 design from General Atomics (GA), while a 250 MW modular pebble bed design, similar to the INET (China) and indirect-cycle PBMR (South Africa) designs, is also included.
The paper presents more detail on the benchmark cases, the specific phases and tasks, and the latest status and plans.
Grid and basis adaptive polynomial chaos techniques for sensitivity and uncertainty analysis
Perkó, Zoltán Gilli, Luca Lathouwers, Danny Kloosterman, Jan Leen
2014-03-01
The demand for accurate and computationally affordable sensitivity and uncertainty techniques is constantly on the rise and has become especially pressing in the nuclear field with the shift to Best Estimate Plus Uncertainty methodologies in the licensing of nuclear installations. Besides traditional, already well developed methods – such as first order perturbation theory or Monte Carlo sampling – Polynomial Chaos Expansion (PCE) has been given a growing emphasis in recent years due to its simple application and good performance. This paper presents new developments of the research done at TU Delft on such Polynomial Chaos (PC) techniques. Our work is focused on the Non-Intrusive Spectral Projection (NISP) approach and adaptive methods for building the PCE of responses of interest. Recent efforts resulted in a new adaptive sparse grid algorithm designed for estimating the PC coefficients. The algorithm is based on Gerstner's procedure for calculating multi-dimensional integrals but proves to be computationally significantly cheaper, while at the same time retaining a similar accuracy to the original method. More importantly, the issue of basis adaptivity has been investigated and two techniques have been implemented for constructing the sparse PCE of quantities of interest. Not using the traditional full PC basis set leads to a further reduction in computational time, since the high order grids necessary for accurately estimating the near-zero expansion coefficients of polynomial basis vectors not needed in the PCE can be excluded from the calculation. Moreover, the sparse PC representation of the response is easier to handle when used for sensitivity analysis or uncertainty propagation due to the smaller number of basis vectors. The developed grid and basis adaptive methods have been implemented in Matlab as the Fully Adaptive Non-Intrusive Spectral Projection (FANISP) algorithm and were tested on four analytical problems.
These tests show consistently good performance, both in terms of the accuracy of the resulting PC representation of quantities and the computational costs associated with constructing the sparse PCE. Basis adaptivity also seems to make the employment of PC techniques possible for problems with a higher number of input parameters (15–20), alleviating a well known limitation of the traditional approach. The prospect of larger scale applicability and the simplicity of implementation make such adaptive PC algorithms particularly appealing for the sensitivity and uncertainty analysis of complex systems and legacy codes.
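The non-intrusive spectral projection step at the heart of such PCE methods can be shown in one dimension. This is a textbook-style sketch, not the FANISP algorithm: the toy response f(x) = x^2 with x ~ N(0,1) has the exact expansion f = 1·He0 + 0·He1 + 1·He2 in probabilists' Hermite polynomials, so the projected coefficients, mean, and variance can be checked against known values.

```python
# Sketch: non-intrusive spectral projection of a model response onto a
# 1-D probabilists'-Hermite polynomial chaos basis, the building block
# of the adaptive PCE machinery described above.
import math

def He(k, x):                       # probabilists' Hermite polynomials
    if k == 0: return 1.0
    if k == 1: return x
    return x * He(k - 1, x) - (k - 1) * He(k - 2, x)

# 3-point Gauss-Hermite rule for the standard normal measure
nodes   = [-math.sqrt(3.0), 0.0, math.sqrt(3.0)]
weights = [1.0 / 6.0, 2.0 / 3.0, 1.0 / 6.0]

f = lambda x: x ** 2                # the "model" being projected

# c_k = E[f(x) He_k(x)] / E[He_k(x)^2], with E[He_k(x)^2] = k!
coeffs = []
for k in range(3):
    num = sum(w * f(x) * He(k, x) for w, x in zip(weights, nodes))
    coeffs.append(num / math.factorial(k))

mean     = coeffs[0]                               # PCE mean
variance = sum(c * c * math.factorial(k)
               for k, c in enumerate(coeffs) if k > 0)
```

Once the coefficients are known, the mean and variance of the response follow directly from them with no further model evaluations, which is why sparse, adaptive coefficient estimation pays off so well.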
State-of-the-Art Solar Simulator Reduces Measurement Time and Uncertainty (Fact Sheet)
Not Available
2012-04-01
One-Sun Multisource Solar Simulator (OSMSS) brings accurate energy-rating predictions that account for the nonlinear behavior of multijunction photovoltaic devices. The National Renewable Energy Laboratory (NREL) is one of only a few International Organization for Standardization (ISO)-accredited calibration labs in the world for primary and secondary reference cells and modules. As such, it is critical to seek new horizons in developing simulators and measurement methods. Current solar simulators are not well suited for accurately measuring multijunction devices. To set the electrical current to each junction independently, simulators must precisely tune the spectral content with no overlap between the wavelength regions. Current simulators do not have this capability, and the overlaps lead to large measurement uncertainties of {+-}6%. In collaboration with LabSphere, NREL scientists have designed and implemented the One-Sun Multisource Solar Simulator (OSMSS), which enables automatic spectral adjustment with nine independent wavelength regions. This fiber-optic simulator allows researchers and developers to set the current to each junction independently, reducing errors relating to spectral effects. NREL also developed proprietary software that allows this fully automated simulator to rapidly 'build' a spectrum under which all junctions of a multijunction device are current matched and behave as they would under a reference spectrum. The OSMSS will reduce the measurement uncertainty for multijunction devices, while significantly reducing the current-voltage measurement time from several days to minutes. These features will enable highly accurate energy-rating predictions that take into account the nonlinear behavior of multijunction photovoltaic devices.
David Brizuela
2014-11-03
The classical and quantum evolution of a generic probability distribution is analyzed. To that end, a formalism based on the decomposition of the distribution in terms of its statistical moments is used, which makes explicit the differences between the classical and quantum dynamics. In particular, there are two different sources of quantum effects. Distributional effects, which are also present in the classical evolution of an extended distribution, are due to the fact that not all moments can vanish, because of the Heisenberg uncertainty principle. In addition, the non-commutativity of the basic quantum operators adds terms to the quantum equations of motion that explicitly depend on the Planck constant and are not present in the classical setting; these are thus purely quantum effects. Some particular Hamiltonians are analyzed that have very special properties regarding the evolution they generate in the classical and quantum sectors. In addition, a large class of inequalities obeyed by high-order statistical moments, and in particular uncertainty relations that bound the information it is possible to obtain from a quantum system, are derived.
Eldred, Michael Scott; Vigil, Dena M.; Dalbey, Keith R.; Bohnhoff, William J.; Adams, Brian M.; Swiler, Laura Painton; Lefantzi, Sophia; Hough, Patricia Diane; Eddy, John P.
2011-12-01
The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the DAKOTA software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of DAKOTA-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of DAKOTA's iterative analysis capabilities.
WESTSIK, G.A.
2001-06-06
This report presents the results of an evaluation of the Total Measurement Uncertainty (TMU) for the Canberra manufactured Segmented Gamma Scanner Assay System (SGSAS) as employed at the Hanford Plutonium Finishing Plant (PFP). In this document, TMU embodies the combined uncertainties due to all of the individual random and systematic sources of measurement uncertainty. It includes uncertainties arising from corrections and factors applied to the analysis of transuranic waste to compensate for inhomogeneities and interferences from the waste matrix and radioactive components. These include uncertainty components for any assumptions contained in the calibration of the system or computation of the data. Uncertainties are propagated at 1 sigma. The final total measurement uncertainty value is reported at the 95% confidence level. The SGSAS is a gamma assay system that is used to assay plutonium and uranium waste. The SGSAS system can be used in a stand-alone mode to perform the NDA characterization of a container, particularly for low to medium density (0-2.5 g/cc) container matrices. The SGSAS system provides a full gamma characterization of the container content. This document is an edited version of the Rocky Flats TMU Report for the Can Scan Segment Gamma Scanners, which are in use for the plutonium residues projects at the Rocky Flats plant. The can scan segmented gamma scanners at Rocky Flats are the same design as the PFP SGSAS system and use the same software (with the exception of the plutonium isotopics software). Therefore, all performance characteristics are expected to be similar. Modifications in this document reflect minor differences in the system configuration, container packaging, calibration technique, etc. These results are supported by the Quality Assurance Objective (QAO) counts, safeguards test data, calibration data, etc. for the PFP SGSAS system. 
Other parts of the TMU analysis utilize various modeling techniques such as Monte Carlo N-Particle (MCNP) and In Situ Object Counting Software (ISOCS).
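The TMU report's "propagate at 1 sigma, report at 95 % confidence" recipe can be sketched as a quadrature sum of independent components followed by a coverage-factor expansion. The component names and values below are purely illustrative, not the actual SGSAS uncertainty budget.

```python
# Sketch: combining independent 1-sigma uncertainty components in
# quadrature and expanding to ~95 % confidence with coverage factor
# k = 2, mirroring the TMU approach described above. Values are
# illustrative, not the SGSAS budget.
import math

components = {                       # relative 1-sigma uncertainties
    "calibration":         0.03,
    "matrix_correction":   0.05,
    "counting_statistics": 0.02,
    "geometry":            0.04,
}

u_combined = math.sqrt(sum(u * u for u in components.values()))  # 1 sigma
U95 = 2.0 * u_combined               # expanded uncertainty, k = 2 (~95 %)
```

The quadrature sum assumes the components are independent; correlated corrections (e.g., matrix and geometry effects driven by the same container property) would need covariance terms added to the sum.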
The ends of uncertainty: Air quality science and planning in Central California
Fine, James
2003-09-01
Air quality planning in Central California is complicated and controversial despite millions of dollars invested to improve scientific understanding. This research describes and critiques the use of photochemical air quality simulation modeling studies in planning to attain standards for ground-level ozone in the San Francisco Bay Area and the San Joaquin Valley during the 1990s. Data are gathered through documents and interviews with planners, modelers, and policy-makers at public agencies and with representatives from the regulated and environmental communities. Interactions amongst organizations are diagrammed to identify significant nodes of interaction. Dominant policy coalitions are described through narratives distinguished by their uses of and responses to uncertainty, their exposures to risks, and their responses to the principles of conservatism, civil duty, and caution. Policy narratives are delineated using aggregated respondent statements to describe and understand advocacy coalitions. I found that models impacted the planning process significantly, but were used not purely for their scientific capabilities. Modeling results provided justification for decisions based on other constraints and political considerations. Uncertainties were utilized opportunistically by stakeholders instead of managed explicitly. Ultimately, the process supported the partisan views of those in control of the modeling. Based on these findings, as well as a review of model uncertainty analysis capabilities, I recommend modifying the planning process to allow for the development and incorporation of uncertainty information, while addressing the need for inclusive and meaningful public participation. By documenting an actual air quality planning process these findings provide insights about the potential for using new scientific information and understanding to achieve environmental goals, most notably the analysis of uncertainties in modeling applications.
Concurrently, needed uncertainty information is identified and capabilities to produce it are assessed. Practices to facilitate incorporation of uncertainty information are suggested based on research findings, as well as theory from the literatures of the policy sciences, decision sciences, science and technology studies, consensus-based and communicative planning, and modeling.
Gauntt, Randall O.; Mattie, Patrick D.; Bixler, Nathan E.; Ross, Kyle; Cardoni, Jeffrey N; Kalinich, Donald A.; Osborn, Douglas M.; Sallaberry, Cedric Jean-Marie; Ghosh, S. Tina
2014-02-01
This paper describes the knowledge advancements from the uncertainty analysis for the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout accident scenario at the Peach Bottom Atomic Power Station. This work assessed key MELCOR and MELCOR Accident Consequence Code System, Version 2 (MACCS2) modeling uncertainties in an integrated fashion to quantify the relative importance of each uncertain input on potential accident progression, radiological releases, and off-site consequences. This quantitative uncertainty analysis provides measures of the effects on consequences of each of the selected uncertain parameters, both individually and in interaction with other parameters. The results measure the model response (e.g., variance in the output) to uncertainty in the selected input. Investigation into the important uncertain parameters in turn yields insights into important phenomena for accident progression and off-site consequences. This uncertainty analysis confirmed the known importance of some parameters, such as the failure rate of the safety relief valve in accident progression modeling and the dry deposition velocity in off-site consequence modeling. The analysis also revealed some new insights, such as the dependence of the effect of cesium chemical form on the accident progression.
F. Herwig; Sam M. Austin; John C. Lattanzio
2005-11-14
Calculations that demonstrate the influence of three key nuclear reaction rates on the evolution of Asymptotic Giant Branch stars have been carried out. We study the case of a star with an initial mass of 2Msun and a metallicity of Z=0.01, somewhat less than the solar metallicity. The dredge-up of nuclear processed material from the interior of the star, and the yield predictions for carbon, are sensitive to the rate of the N14(p,gamma)O15 and triple-alpha reactions. These reactions dominate the H- and He-burning shells of stars in this late evolutionary phase. Published uncertainty estimates for each of these two rates propagated through stellar evolution calculations cause uncertainties in carbon enrichment and yield predictions of about a factor of two. The other important He-burning reaction C12(alpha,gamma)O16, although associated with the largest uncertainty in our study, does not have a significant influence on the abundance evolution compared to other modelling uncertainties. This finding remains valid when the entire evolution from the main-sequence to the tip of the AGB is considered. We discuss the experimental sources of the rate uncertainties addressed here, and give some outlook for future work.
Uncertainty Analysis for a Virtual Flow Meter Using an Air-Handling Unit Chilled Water Valve
Song, Li; Wang, Gang; Brambley, Michael R.
2013-04-28
A virtual water flow meter is developed that uses the chilled water control valve on an air-handling unit as a measurement device. The flow rate of water through the valve is calculated using the differential pressure across the valve and its associated coil, the valve command, and an empirically determined valve characteristic curve. Thus, the probability of error in the measurements is significantly greater than for conventionally manufactured flow meters. In this paper, mathematical models are developed and used to conduct uncertainty analysis for the virtual flow meter, and the results from the virtual meter are compared to measurements made with an ultrasonic flow meter. Theoretical uncertainty analysis shows that the total uncertainty in flow rates from the virtual flow meter is 1.46% with 95% confidence; comparison of virtual flow meter results with measurements from an ultrasonic flow meter yielded an uncertainty of 1.46% with 99% confidence. The comparable results from the theoretical uncertainty analysis and the empirical comparison with the ultrasonic flow meter corroborate each other and tend to validate the approach to computationally estimating uncertainty for virtual sensors introduced in this study.
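The first-order (GUM-style) propagation such a virtual-meter analysis rests on can be sketched for a valve-based flow estimate of the form Q = Cv(cmd)·sqrt(dP). The equal-percentage characteristic curve, rangeability, and input uncertainties below are illustrative assumptions, not the paper's empirically fitted values.

```python
# Sketch: first-order uncertainty propagation for a virtual flow meter
# that infers flow from valve command and differential pressure,
# Q = Cv(cmd) * sqrt(dP). All parameter values are illustrative.
import math

def cv(cmd):
    # Assumed equal-percentage characteristic: full-open Cv = 10,
    # rangeability 50 (both hypothetical).
    return 10.0 * 50.0 ** (cmd - 1.0)

cmd, u_cmd = 0.8, 0.01        # valve command (fraction) and its uncertainty
dp,  u_dp  = 40.0, 0.5        # differential pressure (kPa) and uncertainty

Q = cv(cmd) * math.sqrt(dp)   # inferred flow

# Sensitivity coefficients (analytic partial derivatives)
dQ_dcmd = cv(cmd) * math.log(50.0) * math.sqrt(dp)
dQ_ddp  = cv(cmd) / (2.0 * math.sqrt(dp))

u_Q = math.hypot(dQ_dcmd * u_cmd, dQ_ddp * u_dp)   # combined 1-sigma
rel = u_Q / Q                                      # relative uncertainty
```

With these assumed inputs the command uncertainty dominates, because the equal-percentage curve makes Q exponentially sensitive to the valve position; this is the kind of term-by-term breakdown the paper's theoretical analysis provides.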
Optimization Under Uncertainty for Water Consumption in a Pulverized Coal Power Plant
Juan M. Salazar; Stephen E. Zitney; Urmila M. Diwekar
2009-01-01
Pulverized coal (PC) power plants are widely recognized as major water consumers whose operability has started to be affected by drought conditions across some regions of the country. Water availability will further restrict the retrofitting of existing PC plants with water-expensive carbon capture technologies. Therefore, national efforts to reduce water withdrawal and consumption have been intensified. Water consumption in PC plants is strongly associated with losses from the cooling water cycle, particularly water evaporation from cooling towers. Accurate estimation of these water losses requires realistic cooling tower models, as well as the inclusion of uncertainties arising from atmospheric conditions. In this work, the cooling tower for a supercritical PC power plant was modeled as a humidification operation and used for optimization under uncertainty. Characterization of the uncertainty (air temperature and humidity) was based on available weather data. Process characteristics including boiler conditions, reactant ratios, and pressure ratios in turbines were calculated to obtain the minimum water consumption under the above-mentioned uncertainties. In this study, the calculated conditions predicted up to a 12% reduction in the average water consumption for a 548 MW supercritical PC power plant simulated using Aspen Plus. Optimization under uncertainty for these large-scale PC plants cannot be solved with conventional stochastic programming algorithms because of the computational expenses involved. In this work, we discuss the use of a novel better optimization of nonlinear uncertain systems (BONUS) algorithm which dramatically decreases the computational requirements of the stochastic optimization.
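The key idea that makes this kind of stochastic optimization tractable, reusing one set of expensive model runs as the input distribution changes, can be illustrated with a density-ratio reweighting sketch. Everything here is an illustrative assumption: the toy quadratic stand-in for the cooling-tower model, the Gaussian temperature distribution, and the sample counts are not from the paper, and this sketch omits the kernel density estimation used in the actual BONUS algorithm.

```python
from statistics import NormalDist

# One set of "expensive" model runs at stratified samples from a base distribution.
base = NormalDist(mu=25.0, sigma=5.0)            # assumed base air-temperature distribution
samples = [base.inv_cdf((i + 0.5) / 200) for i in range(200)]
outputs = [0.02 * t * t for t in samples]        # toy stand-in for the water-loss model

def reweighted_mean(new_mu, new_sigma):
    """Estimate the mean model output under a shifted input distribution by
    reweighting the stored runs with the density ratio p_new/p_base,
    instead of re-running the model (the reuse idea behind BONUS)."""
    new = NormalDist(mu=new_mu, sigma=new_sigma)
    w = [new.pdf(t) / base.pdf(t) for t in samples]
    return sum(wi * y for wi, y in zip(w, outputs)) / sum(w)
```

With the new distribution equal to the base, the weights are all one and the plain sample mean is recovered; shifting the mean temperature upward raises the estimated loss without any new model evaluations.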
Franco, Guillermo; Shen-Tu, Bing Ming; Bazzurro, Paolo [AIR Worldwide Corporation, 131 Dartmouth Street, Boston, MA 02116 (United States); Goretti, Agostino [Seismic Risk Office, CPD, Rome (Italy); Valensise, Gianluca [Institute of Geophysics and Vulcanology (INGV), Rome (Italy)
2008-07-08
Increasing sophistication in the insurance and reinsurance market is stimulating the move towards catastrophe models that offer a greater degree of flexibility in the definition of model parameters and model assumptions. This study explores the impact of uncertainty in the input parameters on the loss estimates by departing from the exclusive usage of mean values to establish the earthquake event mechanism, the ground motion fields, or the damageability of the building stock. Here the potential losses due to a repeat of the 1908 Messina-Reggio Calabria event are calculated using different plausible alternatives found in the literature that encompass 12 event scenarios, 2 different ground motion prediction equations, and 16 combinations of damage functions for the building stock, a total of 384 loss scenarios. These results constitute the basis for a sensitivity analysis of the different assumptions on the loss estimates that allows the model user to estimate the impact of the uncertainty in input parameters and the potential spread of the model results. For the event under scrutiny, average losses would amount today to about 9,000 to 10,000 million Euros. The uncertainty in the model parameters is reflected in the high coefficient of variation of this loss, reaching approximately 45%. The choice of ground motion prediction equations and vulnerability functions of the building stock contribute the most to the uncertainty in loss estimates. This indicates that the application of non-local-specific information has a great impact on the spread of potential catastrophic losses. In order to close this uncertainty gap, more exhaustive documentation practices in insurance portfolios will have to go hand in hand with greater flexibility in the model input parameters.
Eslick, John C.; Ng, Brenda; Gao, Qianwen; Tong, Charles H.; Sahinidis, Nikolaos V.; Miller, David C.
2014-12-31
Under the auspices of the U.S. Department of Energy’s Carbon Capture Simulation Initiative (CCSI), a Framework for Optimization and Quantification of Uncertainty and Sensitivity (FOQUS) has been developed. This tool enables carbon capture systems to be rapidly synthesized and rigorously optimized, in an environment that accounts for and propagates uncertainties in parameters and models. FOQUS currently enables (1) the development of surrogate algebraic models utilizing the ALAMO algorithm, which can be used for superstructure optimization to identify optimal process configurations, (2) simulation-based optimization utilizing derivative free optimization (DFO) algorithms with detailed black-box process models, and (3) rigorous uncertainty quantification through PSUADE. FOQUS utilizes another CCSI technology, the Turbine Science Gateway, to manage the thousands of simulated runs necessary for optimization and UQ. Thus, this computational framework has been demonstrated for the design and analysis of a solid sorbent based carbon capture system.
Shogo Tanimura
2015-04-06
The uncertainty relation between angle and orbital angular momentum has not been formulated in a form similar to the uncertainty relation between position and linear momentum, because the angle variable is not represented by a quantum mechanical self-adjoint operator. Instead of an angle variable operator, we introduce the complex position operator $ \\hat{Z} = \\hat{x}+i \\hat{y} $ and interpret the order parameter $ \\mu = \\langle \\hat{Z} \\rangle / \\sqrt{ \\langle \\hat{Z}^\\dagger \\hat{Z} \\rangle} $ as a measure of certainty of the angle distribution. We prove a relation between the uncertainty of angular momentum and the angle order parameter. We also prove its generalizations and discuss experimental methods for testing these relations.
nCTEQ15 - Global analysis of nuclear parton distributions with uncertainties in the CTEQ framework
Kovarik, K; Jezo, T; Clark, D B; Keppel, C; Lyonnet, F; Morfin, J G; Olness, F I; Owens, J F; Schienbein, I; Yu, J Y
2015-01-01
We present the new nCTEQ15 set of nuclear parton distribution functions with uncertainties. This fit extends the CTEQ proton PDFs to include the nuclear dependence using data on nuclei all the way up to $^{208}$Pb. The uncertainties are determined using the Hessian method with an optimal rescaling of the eigenvectors to accurately represent the uncertainties for the chosen tolerance criteria. In addition to the Deep Inelastic Scattering (DIS) and Drell-Yan (DY) processes, we also include inclusive pion production data from RHIC to help constrain the nuclear gluon PDF. Furthermore, we investigate the correlation of the data sets with specific nPDF flavor components, and assess the impact of individual experiments. We also provide comparisons of the nCTEQ15 set with recent fits from other groups.
Survey of sampling-based methods for uncertainty and sensitivity analysis.
Johnson, Jay Dean; Helton, Jon Craig; Sallaberry, Cedric J. PhD.; Storlie, Curt B.
2006-06-01
Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (1) Definition of probability distributions to characterize epistemic uncertainty in analysis inputs, (2) Generation of samples from uncertain analysis inputs, (3) Propagation of sampled inputs through an analysis, (4) Presentation of uncertainty analysis results, and (5) Determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top down coefficient of concordance, and variance decomposition.
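Steps (1) through (3) of the pipeline above, plus the rank-transformation sensitivity measure from step (5), can be sketched in a few lines. The two-input linear model, the uniform input distributions, and the sample size are illustrative assumptions, not examples from the survey itself.

```python
import random

def rankdata(values):
    """1-based ranks, no tie handling -- adequate for continuous samples."""
    order = sorted(range(len(values)), key=values.__getitem__)
    ranks = [0] * len(values)
    for r, idx in enumerate(order, start=1):
        ranks[idx] = r
    return ranks

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

random.seed(1)
# (1)-(3): characterize inputs, sample them, propagate through the analysis.
a = [random.uniform(0.0, 1.0) for _ in range(500)]    # influential input
b = [random.uniform(0.0, 1.0) for _ in range(500)]    # weak input
y = [10.0 * ai + 0.1 * bi for ai, bi in zip(a, b)]    # toy analysis model

# (5): rank-transformed (Spearman-type) correlations as sensitivity measures.
rcc_a = pearson(rankdata(a), rankdata(y))
rcc_b = pearson(rankdata(b), rankdata(y))
```

The rank correlation for the influential input comes out near one, while the weak input's stays near zero, which is the basic screening pattern the survey's rank-transformation techniques formalize.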
Banaszuk, Andrzej; Frewen, Thomas A; Kobilarov, Marin; Mathew, George; Mezic, Igor; Pinto, Alessandro; Sahai, Tuhin; Sane, Harshad; Speranzon, Alberto; Surana, Amit
2011-01-01
Development of robust dynamical systems and networks, such as autonomous aircraft systems capable of accomplishing complex missions, faces challenges due to dynamically evolving uncertainties arising from model uncertainties, the need to operate in hostile, cluttered urban environments, and the distributed and dynamic nature of communication and computation resources. Model-based robust design is difficult because of the complexity of the hybrid dynamic models, which include continuous vehicle dynamics and discrete models of computation and communication, and because of the size of the problem. We overview recent advances in methodology and tools to model, analyze, and design robust autonomous aerospace systems operating in uncertain environments, with emphasis on efficient uncertainty quantification and robust design, using case studies of missions that include model-based target tracking and search, and trajectory planning in uncertain urban environments. To show that the methodology is generally applicable to...
Jet energy scale uncertainty correlations between ATLAS and CMS at 8 TeV
CMS Collaboration
2015-01-01
An evaluation of the correlations between ATLAS and CMS jet energy scale uncertainties is presented for $\\sqrt{s}=8$ TeV $pp$ collisions recorded in 2012. Uncertainties within each experiment are grouped based on the general type of systematic effect they are intended to cover and the means by which they are derived. Inter-experimental correlation value ranges are established for each corresponding group of uncertainty components. This correlation range is intended to cover the possible correlation values when performing combinations between the two experiments, where the most conservative value obtained from scanning over the correlation range should be used for the final combined measurement. The procedure described here is primarily aimed at single-observable analyses, and has limitations when applied to multi-observable measurements.
Uncertainty Analysis of Spectral Irradiance Reference Standards Used for NREL Calibrations
Habte, A.; Andreas, A.; Reda, I.; Campanelli, M.; Stoffel, T.
2013-05-01
Spectral irradiance produced by lamp standards such as the National Institute of Standards and Technology (NIST) FEL-type tungsten halogen lamps are used to calibrate spectroradiometers at the National Renewable Energy Laboratory. Spectroradiometers are often used to characterize spectral irradiance of solar simulators, which in turn are used to characterize photovoltaic device performance, e.g., power output and spectral response. Therefore, quantifying the calibration uncertainty of spectroradiometers is critical to understanding photovoltaic system performance. In this study, we attempted to reproduce the NIST-reported input variables, including the calibration uncertainty in spectral irradiance for a standard NIST lamp, and quantify uncertainty for measurement setup at the Optical Metrology Laboratory at the National Renewable Energy Laboratory.
Uncertainty quantification of a radionuclide release model using an adaptive spectral technique
Gilli, L.; Hoogwerf, C.; Lathouwers, D.; Kloosterman, J. L.
2013-07-01
In this paper we present the application of a non-intrusive spectral technique we recently developed to the evaluation of the uncertainties associated with a radionuclide migration problem. Spectral techniques can be used to reconstruct stochastic quantities of interest by means of a Fourier-like expansion. Their application to uncertainty propagation problems is performed by evaluating a set of adaptively chosen realizations; the main details of how this is done are presented in this work. The uncertainty quantification problem we deal with was first solved in a recent work in which the authors used a spectral technique based on an intrusive approach. In this paper we reproduce the results of that reference work, compare them, and discuss the main numerical aspects. (authors)
Degeneracies of particle and nuclear physics uncertainties in neutrinoless double beta decay
Lisi, E; Simkovic, F
2015-01-01
Theoretical estimates for the half life of neutrinoless double beta decay in candidate nuclei are affected by both particle and nuclear physics uncertainties, which may complicate the interpretation of decay signals or limits. We study such uncertainties and their degeneracies in the following context: three nuclei of great interest for large-scale experiments (76-Ge, 130-Te, 136-Xe), two representative particle physics mechanisms (light and heavy Majorana neutrino exchange), and a large set of nuclear matrix elements (NME), computed within the quasiparticle random phase approximation (QRPA). It turns out that the main theoretical uncertainties, associated with the effective axial coupling g_A and with the nucleon-nucleon potential, can be parametrized in terms of NME rescaling factors, up to small residuals. From this parametrization, the following QRPA features emerge: (1) the NME dependence on g_A is milder than quadratic; (2) in each of the two mechanisms, the relevant lepton flavor violating parameter is...
Intermittent optical frequency measurements to reduce the dead time uncertainty of frequency link
Hachisu, Hidekazu
2015-01-01
The absolute frequency of the $^{87}{\\rm Sr}$ lattice clock transition was evaluated with an uncertainty of $1.1\\times 10^{-15}$ using a frequency link to International Atomic Time (TAI). The frequency uncertainty of a hydrogen maser used as a transfer oscillator was reduced by homogeneously distributing intermittent measurements over a five-day grid of TAI. Three sets of four- or five-day measurements, together with a systematic clock uncertainty of $8.6\\times 10^{-17}$, yield an absolute frequency of the $^{87}{\\rm Sr}\\ {}^1S_0 - {}^3P_0$ clock transition of 429 228 004 229 872.85 (47) Hz.
ROBUSTNESS OF DECISION INSIGHTS UNDER ALTERNATIVE ALEATORY/EPISTEMIC UNCERTAINTY CLASSIFICATIONS
Unwin, Stephen D.; Eslinger, Paul W.; Johnson, Kenneth I.
2013-09-22
The Risk-Informed Safety Margin Characterization (RISMC) pathway is a set of activities defined under the U.S. Department of Energy Light Water Reactor Sustainability Program. The overarching objective of RISMC is to support plant life-extension decision-making by providing a state-of-knowledge characterization of safety margins in key systems, structures, and components (SSCs). A key technical challenge is to establish the conceptual and technical feasibility of analyzing safety margin in a risk-informed way, which, unlike conventionally defined deterministic margin analysis, would be founded on probabilistic characterizations of SSC performance. Evaluation of probabilistic safety margins will in general entail the uncertainty characterization both of the prospective challenge to the performance of an SSC ("load") and of its "capacity" to withstand that challenge. The RISMC framework contrasts sharply with the traditional probabilistic risk assessment (PRA) structure in that the underlying models are not inherently aleatory. Rather, they are largely deterministic physical/engineering models with ambiguities about the appropriate uncertainty classification of many model parameters. The current analysis demonstrates that if the distinction between epistemic and aleatory uncertainties is to be preserved in a RISMC-like modeling environment, then it is unlikely that analysis insights supporting decision-making will in general be robust under recategorization of input uncertainties. If it is believed there is a true conceptual distinction between epistemic and aleatory uncertainty (as opposed to the distinction being primarily a legacy of the PRA paradigm) then a consistent and defensible basis must be established by which to categorize input uncertainties.
Quantification of margins and uncertainty for risk-informed decision analysis.
Alvin, Kenneth Fredrick
2010-09-01
QMU stands for 'Quantification of Margins and Uncertainties'. QMU is a basic framework for consistency in integrating simulation, data, and/or subject matter expertise to provide input into a risk-informed decision-making process. QMU is being applied to a wide range of NNSA stockpile issues, from performance to safety. The implementation of QMU varies with lab and application focus. The Advanced Simulation and Computing (ASC) Program develops validated computational simulation tools to be applied in the context of QMU. QMU provides input into a risk-informed decision making process. The completeness aspect of QMU can benefit from the structured methodology and discipline of quantitative risk assessment (QRA)/probabilistic risk assessment (PRA). In characterizing uncertainties it is important to pay attention to the distinction between those arising from incomplete knowledge ('epistemic' or systematic), and those arising from device-to-device variation ('aleatory' or random). The national security labs should investigate the utility of a probability of frequency (PoF) approach in presenting uncertainties in the stockpile. A QMU methodology is connected if the interactions between failure modes are included. The design labs should continue to focus attention on quantifying uncertainties that arise from epistemic uncertainties such as poorly-modeled phenomena, numerical errors, coding errors, and systematic uncertainties in experiment. The NNSA and design labs should ensure that the certification plan for any RRW is supported by strong, timely peer review and by an ongoing, transparent QMU-based documentation and analysis in order to permit a confidence level necessary for eventual certification.
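The core QMU screening quantity is a margin-to-uncertainty ratio, which reduces to a one-line calculation. The generic M/U formulation and the M/U > 1 screening convention sketched here are standard usage, not this report's specific methodology, and real applications decompose U into its aleatory and epistemic parts rather than taking a single combined number.

```python
def qmu_confidence_ratio(threshold, best_estimate, uncertainty):
    """QMU margin-to-uncertainty ratio M/U.

    M is the distance from the best-estimate performance to the
    requirement threshold; U is the combined (aleatory + epistemic)
    uncertainty in that margin. M/U > 1 is the usual screening
    criterion for adequate confidence.
    """
    margin = best_estimate - threshold
    return margin / uncertainty
```

For example, a best estimate of 3.0 against a threshold of 1.0 with a combined uncertainty of 1.0 gives M/U = 2, comfortably above the screening criterion; shrinking the margin or growing the uncertainty erodes the ratio.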
Seitz, R.
2011-03-02
It is widely recognized that the results of safety assessment calculations provide an important contribution to the safety arguments for a disposal facility, but cannot in themselves adequately demonstrate the safety of the disposal system. The safety assessment and a broader range of arguments and activities need to be considered holistically to justify radioactive waste disposal at any particular site. Many programs are therefore moving towards the production of what has become known as a Safety Case, which includes all of the different activities that are conducted to demonstrate the safety of a disposal concept. Recognizing the growing interest in the concept of a Safety Case, the International Atomic Energy Agency (IAEA) is undertaking an intercomparison and harmonization project called PRISM (Practical Illustration and use of the Safety Case Concept in the Management of Near-surface Disposal). The PRISM project is organized into four Task Groups that address key aspects of the Safety Case concept: Task Group 1 - Understanding the Safety Case; Task Group 2 - Disposal facility design; Task Group 3 - Managing waste acceptance; and Task Group 4 - Managing uncertainty. This paper addresses the work of Task Group 4, which is investigating approaches for managing the uncertainties associated with near-surface disposal of radioactive waste and their consideration in the context of the Safety Case. Emphasis is placed on identifying a wide variety of approaches that can and have been used to manage different types of uncertainties, especially non-quantitative approaches that have not received as much attention in previous IAEA projects. This paper includes discussions of the current results of work on the task on managing uncertainty, including: the different circumstances being considered, the sources/types of uncertainties being addressed and some initial proposals for approaches that can be used to manage different types of uncertainties.
Uncertainty Quantification of Composite Laminate Damage with the Generalized Information Theory
J. Lucero; F. Hemez; T. Ross; K.Kline; J.Hundhausen; T. Tippetts
2006-05-01
This work presents a survey of five theories to assess the uncertainty of projectile impact induced damage on multi-layered carbon-epoxy composite plates. Because the types of uncertainty dealt with in this application are multiple (variability, ambiguity, and conflict) and because the data sets collected are sparse, characterizing the amount of delamination damage with probability theory alone is possible but incomplete. This motivates the exploration of methods contained within a broad Generalized Information Theory (GIT) that rely on less restrictive assumptions than probability theory. Probability, fuzzy sets, possibility, and imprecise probability (probability boxes (p-boxes) and Dempster-Shafer) are used to assess the uncertainty in composite plate damage. Furthermore, this work highlights the usefulness of each theory. The purpose of the study is not to compare directly the different GIT methods but to show that they can be deployed on a practical application and to compare the assumptions upon which these theories are based. The data sets consist of experimental measurements and finite element predictions of the amount of delamination and fiber splitting damage as multilayered composite plates are impacted by a projectile at various velocities. The physical experiments consist of using a gas gun to impact suspended plates with a projectile accelerated to prescribed velocities, then, taking ultrasound images of the resulting delamination. The nonlinear, multiple length-scale numerical simulations couple local crack propagation implemented through cohesive zone modeling to global stress-displacement finite element analysis. The assessment of damage uncertainty is performed in three steps by, first, considering the test data only; then, considering the simulation data only; finally, performing an assessment of total uncertainty where test and simulation data sets are combined. 
This study leads to practical recommendations for reducing the uncertainty and improving the prediction accuracy of the damage modeling and finite element simulation.
Reda, I.; Stoffel, T.; Habte, A.
2014-03-01
The National Renewable Energy Laboratory (NREL) and the Atmospheric Radiation Measurement (ARM) Climate Research Facility work together to provide data from strategically located in situ measurement observatories around the world, and to improve and develop new technologies that assist in acquiring high-quality radiometric data. In this presentation we summarize the uncertainty estimates of the ARM data collected at the ARM Solar Infrared Radiation Station (SIRS), Sky Radiometers on Stand for Downwelling Radiation (SKYRAD), and Ground Radiometers on Stand for Upwelling Radiation (GNDRAD), which ultimately improve the existing radiometric data. Three studies are also included to show the difference between calibrating pyrgeometers (e.g., Eppley PIR) using the manufacturer blackbody versus the interim World Infrared Standard Group (WISG), a pyrgeometer aging study, and the sampling rate effect of correcting historical data.
Chance Constrained Optimal Power Flow: Risk-Aware Network Control under Uncertainty
Bienstock, Daniel; Harnett, Sean
2012-01-01
When uncontrollable resources fluctuate, Optimal Power Flow (OPF), routinely used by the electric power industry to re-dispatch hourly controllable generation (coal, gas and hydro plants) over control areas of transmission networks, can result in grid instability and, potentially, cascading outages. This risk arises because OPF dispatch is computed without awareness of major uncertainty, in particular fluctuations in renewable output. As a result, grid operation under OPF with renewable variability can lead to frequent conditions where power line flow ratings are significantly exceeded. Such a condition, which is borne out by simulations of real grids, would likely result in automatic line tripping to protect lines from thermal stress, a risky and undesirable outcome which compromises stability. Smart grid goals include a commitment to large penetration of highly fluctuating renewables, thus calling for reconsideration of current practices, in particular the use of standard OPF. Our Chance Constrained (CC) OPF correct...
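The Gaussian reformulation that typically underlies chance-constrained formulations like this can be sketched for a single line flow. This is a hedged, one-sided illustration with assumed Gaussian fluctuations and illustrative numbers; an actual CC-OPF treats all network constraints jointly and is not reproduced here.

```python
from statistics import NormalDist

def deterministic_limit(line_limit, flow_sigma, epsilon=0.01):
    """Deterministic equivalent of a one-sided chance constraint.

    For a line flow with Gaussian fluctuation of standard deviation
    flow_sigma around its scheduled mean, P(flow > line_limit) <= epsilon
    becomes: schedule the mean flow below
        line_limit - z_{1-epsilon} * flow_sigma,
    where z is the standard normal quantile.
    """
    z = NormalDist().inv_cdf(1.0 - epsilon)
    return line_limit - z * flow_sigma
```

For a 100 MW rating with 10 MW of renewable-driven flow deviation and a 1% violation tolerance, the scheduled flow must stay below roughly 77 MW; with no fluctuation the limit reduces to the deterministic rating.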