National Library of Energy BETA

Sample records for "analysis price uncertainty"

  1. Market Prices and Uncertainty Report

    Reports and Publications (EIA)

    2016-01-01

    Monthly analysis of crude oil, petroleum products, natural gas, and propane prices is released as a regular supplement to the Short-Term Energy Outlook.

  2. Microsoft Word - Price Uncertainty Supplement.doc

    Gasoline and Diesel Fuel Update (EIA)

    November 2010 Short-Term Energy Outlook, Energy Price Volatility and Forecast Uncertainty, November 9, 2010 Release. Crude Oil Prices. WTI crude oil spot prices averaged almost $82 per barrel in October, about $7 per barrel higher than the September average, as expectations of higher oil demand pushed up prices. EIA has raised the average fourth quarter 2010 WTI spot price forecast to about $83 per barrel compared with $79 per barrel in last month's Outlook. WTI spot prices rise to $87 per

  3. Microsoft Word - Price Uncertainty Supplement .docx

    Gasoline and Diesel Fuel Update (EIA)

    January 2011 Short-Term Energy Outlook, Energy Price Volatility and Forecast Uncertainty, January 11, 2011 Release. Crude Oil Prices. West Texas Intermediate (WTI) crude oil spot prices averaged over $89 per barrel in December, about $5 per barrel higher than the November average. Expectations of higher oil demand, combined with unusually cold weather in both Europe and the U.S. Northeast, contributed to prices. EIA has raised the first quarter 2011 WTI spot price forecast by $8 per barrel

  4. Microsoft Word - Price Uncertainty Supplement.doc

    Gasoline and Diesel Fuel Update (EIA)

    August 2010 Short-Term Energy Outlook, Energy Price Volatility and Forecast Uncertainty, August 10, 2010 Release. WTI crude oil spot prices averaged $76.32 per barrel in July 2010, about $1 per barrel above the prior month's average and close to the $77 per barrel projected in last month's Outlook. EIA projects WTI prices will average about $80 per barrel over the second half of this year and rise to $85 by the end of next year (West Texas Intermediate Crude Oil Price Chart). Energy price

  5. Microsoft Word - Price Uncertainty Supplement.doc

    Gasoline and Diesel Fuel Update (EIA)

    December 2010 Short-Term Energy Outlook, Energy Price Volatility and Forecast Uncertainty, December 7, 2010 Release. Crude Oil Prices. West Texas Intermediate (WTI) crude oil spot prices averaged over $84 per barrel in November, more than $2 per barrel higher than the October average. EIA has raised the average winter 2010-2011 period WTI spot price forecast by $1 per barrel from last month's Outlook to $84 per barrel. WTI spot prices rise to $89 per barrel by the end of next year, $2 per

  6. Microsoft Word - Price Uncertainty Supplement.doc

    Gasoline and Diesel Fuel Update (EIA)

    July 2010 Short-Term Energy Outlook, Energy Price Volatility and Forecast Uncertainty, July 7, 2010 Release. Crude Oil Prices. WTI crude oil spot prices averaged $75.34 per barrel in June 2010 ($1.60 per barrel above the prior month's average), close to the $76 per barrel projected in last month's Outlook. EIA projects WTI prices will average about $79 per barrel over the second half of this year and rise to $84 by the end of next year (West Texas Intermediate Crude Oil Price

  7. Microsoft Word - Price Uncertainty Supplement.doc

    Gasoline and Diesel Fuel Update (EIA)

    March 2010 Short-Term Energy Outlook, Energy Price Volatility and Forecast Uncertainty, March 9, 2010 Release. Crude Oil Prices. WTI crude oil spot prices averaged $76.39 per barrel in February 2010, almost $2 per barrel lower than the prior month's average and very near the $76 per barrel forecast in last month's Outlook. Last month, the WTI spot price reached a low of $71.15 on February 5 and peaked at $80.04 on February 22. EIA expects WTI prices to average above $80 per barrel this spring,

  8. Microsoft Word - Price Uncertainty Supplement.doc

    Gasoline and Diesel Fuel Update (EIA)

    April 2010 Short-Term Energy Outlook, Energy Price Volatility and Forecast Uncertainty, April 6, 2010 Release. Crude Oil Prices. WTI crude oil spot prices averaged $81 per barrel in March 2010, almost $5 per barrel above the prior month's average and $3 per barrel higher than forecast in last month's Outlook. Oil prices rose from a low this year of $71.15 per barrel on February 5 to $80 per barrel by the end of February, generally on news of robust economic and energy demand growth in non-OECD

  9. Microsoft Word - Price Uncertainty Supplement.doc

    Gasoline and Diesel Fuel Update (EIA)

    June 2010 Short-Term Energy Outlook, Energy Price Volatility and Forecast Uncertainty, June 8, 2010 Release. Crude Oil Prices. WTI crude oil spot prices averaged less than $74 per barrel in May 2010, almost $11 per barrel below the prior month's average and $7 per barrel lower than forecast in last month's Outlook. EIA projects WTI prices will average about $79 per barrel over the second half of this year and rise to $84 by the end of next year, a decrease of about $3 per barrel from the

  10. Microsoft Word - Price Uncertainty Supplement.doc

    Gasoline and Diesel Fuel Update (EIA)

    May 2010 Short-Term Energy Outlook, Energy Price Volatility and Forecast Uncertainty, May 11, 2010 Release. Crude Oil Prices. WTI crude oil spot prices averaged $84 per barrel in April 2010, about $3 per barrel above the prior month's average and $2 per barrel higher than forecast in last month's Outlook. EIA projects WTI prices will average about $84 per barrel over the second half of this year and rise to $87 by the end of next year, an increase of about $2 per barrel from the previous Outlook

  11. Microsoft Word - Price Uncertainty Supplement.doc

    Gasoline and Diesel Fuel Update (EIA)

    September 2010 Short-Term Energy Outlook, Energy Price Volatility and Forecast Uncertainty, September 8, 2010 Release. Crude Oil Prices. West Texas Intermediate (WTI) crude oil spot prices averaged about $77 per barrel in August 2010, very close to the July average but $3 per barrel lower than projected in last month's Outlook. WTI spot prices averaged almost $82 per barrel over the first 10 days of August but then fell by $9 per barrel over the next 2 weeks as the market reacted to a series

  12. Uncertainty Analysis

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


  13. Microsoft Word - Documentation - Price Forecast Uncertainty.doc

    U.S. Energy Information Administration (EIA) Indexed Site

    October 2009 Short-Term Energy Outlook Supplement: Energy Price Volatility and Forecast Uncertainty. Summary: It is often noted that energy prices are quite volatile, reflecting market participants' adjustments to new information from physical energy markets and/or markets in energy-related financial derivatives. Price volatility is an indication of the level of uncertainty, or risk, in the market. This paper describes how markets price risk and how the market-clearing process

  14. Sensitivity and Uncertainty Analysis

    Broader source: Energy.gov [DOE]

    Summary Notes from 15 November 2007 Generic Technical Issue Discussion on Sensitivity and Uncertainty Analysis and Model Support

  15. Volatile coal prices reflect supply, demand uncertainties

    SciTech Connect (OSTI)

    Ryan, M.

    2004-12-15

    Coal mine owners and investors say that supply and demand are now finally in balance. But coal consumers find that both spot tonnage and new contract coal come at a much higher price.

  16. Uncertainty Analysis Technique for OMEGA Dante Measurements ...

    Office of Scientific and Technical Information (OSTI)


  17. ENHANCED UNCERTAINTY ANALYSIS FOR SRS COMPOSITE ANALYSIS

    SciTech Connect (OSTI)

    Smith, F.; Phifer, M.

    2011-06-30

    The Composite Analysis (CA) performed for the Savannah River Site (SRS) in 2009 (SRS CA 2009) included a simplified uncertainty analysis. The uncertainty analysis in the CA (Smith et al. 2009b) was limited to considering at most five sources in a separate uncertainty calculation performed for each POA. To perform the uncertainty calculations in a reasonable amount of time, the analysis was limited to 400 realizations and 2,000 years of simulated transport time, and the time steps used for the uncertainty analysis were increased from those used in the CA base case analysis. As part of the CA maintenance plan, the Savannah River National Laboratory (SRNL) committed to improving the CA uncertainty/sensitivity analysis. The previous uncertainty analysis was constrained by standard GoldSim licensing, which limits the user to running at most four Monte Carlo uncertainty calculations (also called realizations) simultaneously. Some of the limitations on the number of realizations that could practically be run, and on the simulation time steps, were removed by building a cluster of three HP ProLiant Windows servers with a total of 36 64-bit processors and by licensing the GoldSim DP-Plus distributed processing software. This allowed running as many as 35 realizations simultaneously (one processor is reserved as a master process that controls running the realizations). These enhancements to SRNL computing capabilities made it feasible to run the uncertainty analysis using 1,000 realizations, using the time steps employed in the base case CA calculations, with more sources, and simulating radionuclide transport for 10,000 years. In addition, an importance screening analysis was performed to identify the class of stochastic variables that have the most significant impact on model uncertainty. This analysis ran the uncertainty model separately, testing the response to variations in the following five sets of model parameters: (a) Kd values (72 parameters for the 36 CA elements in

  18. Sensitivity and Uncertainty Analysis Shell

    Energy Science and Technology Software Center (OSTI)

    1999-04-20

    SUNS (Sensitivity and Uncertainty Analysis Shell) is a 32-bit application that runs under Windows 95/98 and Windows NT. It is designed to aid in statistical analyses for a broad range of applications. The class of problems for which SUNS is suitable is generally defined by two requirements: 1. A computer code is developed or acquired that models some process for which input is uncertain, and the user is interested in statistical analysis of the output of that code. 2. The statistical analysis of interest can be accomplished using Monte Carlo analysis. The implementation then requires that the user identify which input to the process model is to be manipulated for statistical analysis. With this information, the changes required to loosely couple SUNS with the process model can be completed. SUNS is then used to generate the required statistical sample, and the user-supplied process model analyzes the sample. The SUNS post-processor displays statistical results from any existing file that contains sampled input and output values.
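
    The loose-coupling pattern the record describes — a sampler draws uncertain inputs, a separate user-supplied process model consumes them, and a post-processor summarizes the outputs — can be sketched as follows. This is a minimal illustration of the pattern, not SUNS itself; the `process_model` function and its distributions are invented stand-ins.

```python
import random
import statistics

def process_model(x, y):
    # Stand-in for the user-supplied process model (a simple response here).
    return 3.0 * x + x * y

def monte_carlo(n_samples, seed=0):
    """Generate the statistical sample and run the coupled model on it."""
    rng = random.Random(seed)
    outputs = []
    for _ in range(n_samples):
        # Sample each uncertain input from its assumed distribution.
        x = rng.gauss(10.0, 1.0)     # normally distributed input
        y = rng.uniform(0.5, 1.5)    # uniformly distributed input
        outputs.append(process_model(x, y))
    return outputs

# "Post-processor": summary statistics of the sampled outputs.
out = monte_carlo(5000)
print(statistics.mean(out), statistics.stdev(out))
```

The key design point is that the sampler knows nothing about the model's internals: only the list of uncertain inputs and their distributions couples the two.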

  19. Cost Analysis: Technology, Competitiveness, Market Uncertainty | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    As a basis for strategic planning, competitiveness analysis, and funding metrics and targets, SunShot supports analysis teams at national laboratories to assess technology costs, location-specific competitive advantages, and policy impacts on system financing, and to perform detailed levelized cost of energy (LCOE) analyses. This shows the

  20. Uncertainty Analysis for Photovoltaic Degradation Rates (Poster)

    SciTech Connect (OSTI)

    Jordan, D.; Kurtz, S.; Hansen, C.

    2014-04-01

    Dependable and predictable energy production is the key to the long-term success of the PV industry. Over their lifetime of exposure, PV systems show a gradual decline that depends on many factors such as module technology, module type, mounting configuration, and climate. When degradation rates are determined from continuous data, the statistical uncertainty is easily calculated from the regression coefficients. However, total uncertainty, which includes measurement uncertainty and instrumentation drift, is far more difficult to determine. A Monte Carlo simulation approach was chosen to investigate a comprehensive uncertainty analysis. The most important factor for degradation rates is to avoid instrumentation that changes over time in the field; for instance, a drifting irradiance sensor can lead to substantially erroneous degradation rates, which can be avoided through regular calibration. However, the accuracy of the irradiance sensor has negligible impact on degradation rate uncertainty, emphasizing that precision (relative accuracy) is more important than absolute accuracy.
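
    The effect the abstract describes — a regression-derived degradation rate that is biased by instrument drift but robust to absolute calibration error — can be sketched with a Monte Carlo experiment. This is an illustrative toy, not the poster's analysis; the rates, noise level, and record length are invented.

```python
import random
import statistics

def fit_slope(xs, ys):
    # Ordinary least-squares slope (degradation rate per year).
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

def simulated_rate(true_rate, drift_per_year, noise, rng):
    # Ten years of monthly performance data with measurement noise and an
    # (undesired) linear instrument drift superimposed on the true decline.
    years = [m / 12.0 for m in range(120)]
    ys = [1.0 + (true_rate + drift_per_year) * t + rng.gauss(0.0, noise)
          for t in years]
    return fit_slope(years, ys)

rng = random.Random(1)
# Monte Carlo: distribution of fitted rates with and without sensor drift.
no_drift = [simulated_rate(-0.005, 0.0, 0.01, rng) for _ in range(200)]
drifted = [simulated_rate(-0.005, -0.004, 0.01, rng) for _ in range(200)]
print(statistics.mean(no_drift), statistics.mean(drifted))
```

With no drift, the fitted rates scatter around the true -0.5%/yr; the drifting sensor biases every fit, which is why drift, not absolute accuracy, dominates degradation-rate uncertainty.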

  1. Analysis of the Uncertainty in Wind Measurements from the Atmospheric...

    Office of Scientific and Technical Information (OSTI)


  2. Analysis and Reduction of Chemical Models under Uncertainty ...

    Office of Scientific and Technical Information (OSTI)


  3. Extended Forward Sensitivity Analysis for Uncertainty Quantification

    SciTech Connect (OSTI)

    Haihua Zhao; Vincent A. Mousseau

    2011-09-01

    Verification and validation (V&V) are playing increasingly important roles in quantifying uncertainties and realizing high-fidelity simulations in engineering system analyses, such as transients occurring in a complex nuclear reactor system. Traditional V&V in reactor system analysis focused more on the validation part or did not differentiate verification from validation. The traditional approach to uncertainty quantification is based on a 'black box' approach: the simulation tool is treated as an unknown signal generator, a distribution of inputs according to assumed probability density functions is sent in, and the distribution of the outputs is measured and correlated back to the original input distribution. The 'black box' method mixes numerical errors with all other uncertainties, and it is not efficient for performing sensitivity analysis. In contrast, a more efficient sensitivity approach can take advantage of intimate knowledge of the simulation code. In such approaches, equations for the propagation of uncertainty are constructed and the sensitivities are directly solved for as variables in the simulation. This paper presents forward sensitivity analysis as a method to help uncertainty quantification. By including the time step, and potentially the spatial step, as special sensitivity parameters, the forward sensitivity method is extended as a method to quantify numerical errors. Note that by integrating local truncation errors over the whole system through the forward sensitivity analysis process, the generated time step and spatial step sensitivity information reflects global numerical errors. The discretization errors can then be systematically compared against uncertainties due to other physical parameters. This extension makes the forward sensitivity method a much more powerful tool for uncertainty quantification. By knowing the relative sensitivity of time and space steps with other interested physical parameters, the simulation is allowed
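
    The forward-sensitivity idea — augmenting the simulation with equations for the sensitivities and solving for them directly — can be illustrated on a toy decay equation dy/dt = -k*y, whose parameter sensitivity s = dy/dk obeys ds/dt = -y - k*s. This is a hand-rolled explicit-Euler sketch of the general technique, not the reactor-code implementation described in the paper.

```python
import math

def forward_sensitivity(k=0.5, y0=1.0, t_end=2.0, dt=1e-4):
    """Integrate dy/dt = -k*y together with its parameter sensitivity
    ds/dt = -y - k*s, where s = dy/dk and s(0) = 0."""
    y, s, t = y0, 0.0, 0.0
    while t < t_end - 1e-12:
        # Forward Euler on the augmented (state + sensitivity) system.
        y, s = y + dt * (-k * y), s + dt * (-y - k * s)
        t += dt
    return y, s

y, s = forward_sensitivity()
# Analytic solution for comparison: y = exp(-k*t), s = -t * exp(-k*t).
print(y, s)
```

The sensitivity comes out of the same time integration as the state, so its numerical error is governed by the same step size — which is exactly why treating the step size itself as a sensitivity parameter exposes discretization error.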

  4. October 16, 2014 Webinar - Decisional Analysis under Uncertainty | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Webinar - October 16, 2014, 11 am - 12:40 pm EDT: Dr. Paul Black (Neptune, Inc.), Decisional Analysis under Uncertainty. Agenda - October 16, 2014 - P&RA CoP Webinar (59.42 KB). Presentation - Decision Making under Uncertainty: Introduction to Structured Decision Analysis for Performance Assessments (4.02 MB).

  5. Examining Uncertainty in Demand Response Baseline Models and Variability in Automated Response to Dynamic Pricing

    SciTech Connect (OSTI)

    Mathieu, Johanna L.; Callaway, Duncan S.; Kiliccote, Sila

    2011-08-15

    Controlling electric loads to deliver power system services presents a number of interesting challenges. For example, changes in electricity consumption of Commercial and Industrial (C&I) facilities are usually estimated using counterfactual baseline models, and model uncertainty makes it difficult to precisely quantify control responsiveness. Moreover, C&I facilities exhibit variability in their response. This paper seeks to understand baseline model error and demand-side variability in responses to open-loop control signals (i.e. dynamic prices). Using a regression-based baseline model, we define several Demand Response (DR) parameters, which characterize changes in electricity use on DR days, and then present a method for computing the error associated with DR parameter estimates. In addition to analyzing the magnitude of DR parameter error, we develop a metric to determine how much observed DR parameter variability is attributable to real event-to-event variability versus simply baseline model error. Using data from 38 C&I facilities that participated in an automated DR program in California, we find that DR parameter errors are large. For most facilities, observed DR parameter variability is likely explained by baseline model error, not real DR parameter variability; however, a number of facilities exhibit real DR parameter variability. In some cases, the aggregate population of C&I facilities exhibits real DR parameter variability, resulting in implications for the system operator with respect to both resource planning and system stability.
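
    The core construction in this record — a regression-style baseline estimated from non-event days, with the demand-response (DR) parameter taken as the gap between baseline and observed load on an event day — can be sketched as follows. This is a simplified hourly-mean baseline on synthetic data, not the paper's regression model; the load shape, noise level, and 80 kW shed are invented.

```python
import random
import statistics

rng = random.Random(42)
HOURS = range(24)

def load_profile(noise):
    # Simple C&I load shape: low overnight, high during business hours (kW).
    return [200 + (300 if 8 <= h <= 18 else 0) + rng.gauss(0, noise)
            for h in HOURS]

# Baseline model: hourly mean load over 30 non-event ("training") days.
train_days = [load_profile(noise=20) for _ in range(30)]
baseline = [statistics.mean(day[h] for day in train_days) for h in HOURS]

# Event day: the facility actually sheds 80 kW from 14:00 through 16:59.
event_day = [v - (80 if 14 <= h <= 16 else 0)
             for h, v in enumerate(load_profile(noise=20))]

# DR parameter: average demand reduction over the event window.
shed = statistics.mean(baseline[h] - event_day[h] for h in range(14, 17))
print(round(shed, 1))
```

Even in this toy, the estimated shed differs from the true 80 kW because of baseline model error — the effect the paper quantifies and separates from real event-to-event variability.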

  6. Representation of analysis results involving aleatory and epistemic uncertainty.

    SciTech Connect (OSTI)

    Johnson, Jay Dean; Helton, Jon Craig; Oberkampf, William Louis; Sallaberry, Cedric J.

    2008-08-01

    Procedures are described for the representation of results in analyses that involve both aleatory uncertainty and epistemic uncertainty, with aleatory uncertainty deriving from an inherent randomness in the behavior of the system under study and epistemic uncertainty deriving from a lack of knowledge about the appropriate values to use for quantities that are assumed to have fixed but poorly known values in the context of a specific study. Aleatory uncertainty is usually represented with probability and leads to cumulative distribution functions (CDFs) or complementary cumulative distribution functions (CCDFs) for analysis results of interest. Several mathematical structures are available for the representation of epistemic uncertainty, including interval analysis, possibility theory, evidence theory and probability theory. In the presence of epistemic uncertainty, there is not a single CDF or CCDF for a given analysis result. Rather, there is a family of CDFs and a corresponding family of CCDFs that derive from epistemic uncertainty and have an uncertainty structure that derives from the particular uncertainty structure (i.e., interval analysis, possibility theory, evidence theory, probability theory) used to represent epistemic uncertainty. Graphical formats for the representation of epistemic uncertainty in families of CDFs and CCDFs are investigated and presented for the indicated characterizations of epistemic uncertainty.
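
    The "family of CDFs" representation described above is typically produced with a nested sampling loop: an outer (epistemic) loop fixes a value for each poorly-known parameter, and an inner (aleatory) loop generates one empirical CDF for that fixed value. The sketch below illustrates the probability-theory characterization of epistemic uncertainty on a toy response; the model and distributions are invented.

```python
import random

def response(a, e):
    # Toy system response: aleatory input a, epistemic parameter e.
    return e * a

def empirical_cdf(samples):
    xs = sorted(samples)
    return lambda q: sum(1 for x in xs if x <= q) / len(xs)

rng = random.Random(0)
family = []
for _ in range(10):                       # epistemic (outer) loop
    e = rng.uniform(0.8, 1.2)             # fixed-but-poorly-known value
    runs = [response(rng.gauss(10, 2), e) for _ in range(1000)]  # aleatory
    family.append(empirical_cdf(runs))

# Each member of `family` is one CDF; the spread across members at a fixed
# threshold displays the epistemic uncertainty.
probs = [cdf(10.0) for cdf in family]
print(min(probs), max(probs))
```

Plotting all ten CDFs on one axis gives exactly the graphical format the report investigates: a band of curves whose width is epistemic and whose shape is aleatory.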

  7. Uncertainty Budget Analysis for Dimensional Inspection Processes (U)

    SciTech Connect (OSTI)

    Valdez, Lucas M.

    2012-07-26

    This paper is intended to provide guidance on how to prepare an uncertainty analysis of a dimensional inspection process through an uncertainty budget analysis. The uncertainty analysis follows the methodology of the ISO GUM standard for calibration and testing. A specific distinction is drawn between how Type A and Type B uncertainty analyses are used, both in general and in this specific process. Theory and applications are used to present both a generalized approach to estimating measurement uncertainty and how to report these estimates for dimensional measurements in a dimensional inspection process. The analysis of this uncertainty budget shows that a well-controlled dimensional inspection process produces a conservative process uncertainty, which can be attributed to the necessary assumptions in place for best possible results.
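
    A GUM-style uncertainty budget combines a Type A (statistical) component from repeated measurements with Type B components from other knowledge, in root-sum-square, and then applies a coverage factor. The sketch below follows that standard recipe; the readings and the calibration-certificate limit are invented example values.

```python
import math
import statistics

# Type A: statistical, from repeated dimensional measurements (mm).
readings = [10.002, 9.998, 10.001, 10.000, 9.999, 10.003]
u_type_a = statistics.stdev(readings) / math.sqrt(len(readings))

# Type B: from other knowledge, e.g. a calibration certificate quoting
# +/-0.002 mm with an assumed rectangular (uniform) distribution.
u_type_b = 0.002 / math.sqrt(3)

# Combined standard uncertainty (root-sum-square of the components)
# and expanded uncertainty with coverage factor k = 2 (~95% level).
u_c = math.sqrt(u_type_a**2 + u_type_b**2)
U = 2 * u_c
print(round(u_c, 5), round(U, 5))
```

A full budget lists every component (repeatability, calibration, thermal expansion, probing, ...) as a row; the code shows the combination rule for two of them.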

  8. Subject: Cost and Price Analysis | Department of Energy

    Broader source: Energy.gov (indexed) [DOE]


  9. Analysis of Price Volatility in Natural Gas Markets

    Reports and Publications (EIA)

    2007-01-01

    This article presents an analysis of price volatility in the spot natural gas market, with particular emphasis on the Henry Hub in Louisiana.

  10. Measurement uncertainty analysis techniques applied to PV performance measurements

    SciTech Connect (OSTI)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis, but uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Valid data are defined as data having known and documented paths of origin, including theory; measurements; traceability to measurement standards; computations; and uncertainty analysis of results.


  12. Risk Analysis and Decision-Making Under Uncertainty: A Strategy...

    Office of Environmental Management (EM)


  13. Analysis of the Uncertainty in Wind Measurements from the Atmospheric Radiation Measurement Doppler Lidar during XPIA: Field Campaign Report

    Office of Scientific and Technical Information (OSTI)

    In March and April of 2015, the ARM Doppler

  14. Marginal pricing of transmission services: An analysis of cost recovery

    SciTech Connect (OSTI)

    Perez-Arriaga, I.J.; Rubio, F.J.; Puerta, J.F.; Arceluz, J.; Marin, J.

    1995-02-01

    This paper presents an in-depth analysis of network revenues computed with marginal pricing, and in particular it investigates the reasons why marginal prices fail to recover the total incurred network costs in actual power systems. The basic theoretical results are presented and the major causes of the mismatch between network costs and marginal revenues are identified and illustrated with numerical examples, some tutorial and others of realistic size. The regulatory implications of marginal network pricing in the context of competitive electricity markets are analyzed, and suggestions are provided for the meaningful allocation of the costs of the network among its users.

  15. Uncertainty Analysis for RELAP5-3D

    SciTech Connect (OSTI)

    Aaron J. Pawel; Dr. George L. Mesina

    2011-08-01

    In its current state, RELAP5-3D is a 'best-estimate' code; it is one of our most reliable programs for modeling what occurs within reactor systems in transients from given initial conditions. This code, however, remains an estimator. A statistical analysis has been performed that begins to lay the foundation for a full uncertainty analysis. By varying the inputs over assumed probability density functions, the output parameters were shown to vary. Using such statistical tools as means, variances, and tolerance intervals, a picture of how uncertain the results are based on the uncertainty of the inputs has been obtained.
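
    The statistical summary described above (varying inputs over assumed probability density functions and reporting means, variances, and tolerance intervals) is often done nonparametrically in best-estimate-plus-uncertainty work. For example, by Wilks' formula, 59 random code runs suffice for the largest observed output to be a one-sided 95%/95% tolerance bound. The sketch below shows that generic recipe, not RELAP5-3D itself; the Gaussian "code output" is a placeholder.

```python
import math
import random

def wilks_sample_size(coverage=0.95, confidence=0.95):
    # Smallest n with 1 - coverage**n >= confidence
    # (first-order, one-sided tolerance bound).
    return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

def tolerance_bound(code_runs):
    # First-order one-sided upper bound: the largest observed output.
    return max(code_runs)

n = wilks_sample_size()               # 59 runs for 95%/95%
rng = random.Random(3)
runs = [rng.gauss(500.0, 25.0) for _ in range(n)]  # stand-in code outputs
print(n, tolerance_bound(runs))
```

The appeal of the order-statistic bound is that it needs no assumption about the output distribution — only that the input sampling was random.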

  16. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, main report

    SciTech Connect (OSTI)

    Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The ultimate objective of the joint effort was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. Experts developed their distributions independently. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. To validate the distributions generated for the dispersion code input variables, samples were taken from those distributions and propagated through the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the first of a three-volume document describing the project.

  17. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

    SciTech Connect (OSTI)

    Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.

    1995-01-01

    The two new probabilistic accident consequence codes MACCS and COSYMA, completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters; experts developed their distributions independently. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model, along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data, along with short biographies of the 16 experts who participated in the project.

  18. Uncertainty and sensitivity analysis for photovoltaic system modeling.

    SciTech Connect (OSTI)

    Hansen, Clifford W.; Pohl, Andrew Phillip; Jordan, Dirk

    2013-12-01

We report an uncertainty and sensitivity analysis for modeling DC energy from photovoltaic systems. We consider two systems, each comprising a single module using either crystalline silicon or CdTe cells, located at either Albuquerque, NM, or Golden, CO. Output from a PV system is predicted by a sequence of models. Uncertainty in the output of each model is quantified by empirical distributions of each model's residuals. We sample these distributions to propagate uncertainty through the sequence of models and obtain an empirical distribution for each PV system's output. We considered models that: (1) translate measured global horizontal, direct, and global diffuse irradiance to plane-of-array (POA) irradiance; (2) estimate effective irradiance from plane-of-array irradiance; (3) predict cell temperature; and (4) estimate DC voltage, current, and power. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. Four alternative models were considered for the POA irradiance modeling step; we did not find the choice among these models to be of great significance. However, we observed that the POA irradiance model introduced a bias of upwards of 5% of daily energy, which translates directly to a systematic difference in predicted energy. Sensitivity analyses relate uncertainty in the PV system output to the uncertainty arising from each model. We found the residuals arising from the POA irradiance and effective irradiance models to be the dominant contributors to the residuals for daily energy, for either technology or location considered. This analysis indicates that efforts to reduce the uncertainty in PV system output should focus on improvements to the POA and effective irradiance models.
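    The propagation scheme described here, resampling each model's empirical residuals and pushing them through the model chain, can be illustrated in miniature. The step names, residual spreads, and 5 kWh nominal energy below are hypothetical stand-ins, not the study's data:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical empirical residuals (relative errors) for each modeling step;
    # in the study these come from comparing each model against measurements.
    residuals = {
        "poa_irradiance": rng.normal(0.0, 0.03, 500),
        "effective_irradiance": rng.normal(0.0, 0.02, 500),
        "cell_temperature": rng.normal(0.0, 0.01, 500),
        "dc_power": rng.normal(0.0, 0.01, 500),
    }

    def propagate(nominal_energy_kwh, n_samples=10_000):
        """Propagate uncertainty by resampling each step's empirical residuals."""
        energy = np.full(n_samples, nominal_energy_kwh, dtype=float)
        for step_residuals in residuals.values():
            draws = rng.choice(step_residuals, size=n_samples)  # empirical bootstrap
            energy *= 1.0 + draws                               # apply relative error
        return energy

    samples = propagate(5.0)  # hypothetical 5 kWh nominal daily energy
    print(round(samples.mean(), 2), round(100 * samples.std() / samples.mean(), 1))
    ```

    The empirical distribution of `samples` plays the role of the study's output distribution, from which percentiles or a coefficient of variation can be read off.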

  19. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices

    SciTech Connect (OSTI)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Harrison, J.D.; Harper, F.T.; Hora, S.C.

    1998-04-01

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  20. Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 2: Appendices

    SciTech Connect (OSTI)

    Haskin, F.E.; Harper, F.T.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-12-01

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on early health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  1. Analysis of Federal Subsidies: Implied Price of Carbon

    SciTech Connect (OSTI)

    D. Craig Cooper; Thomas Foulke

    2010-10-01

For informed climate change policy, it is important for decision makers to be able to assess how the costs and benefits of federal energy subsidies are distributed and to have some measure by which to compare them. One way to do this is to evaluate the implied price of carbon (IPC) for a federal subsidy, or set of subsidies, where the IPC is the cost of the subsidy to the U.S. Treasury divided by the emissions reductions it generated. Subsidies with a lower IPC are more cost effective at reducing greenhouse gas emissions, while subsidies with a negative IPC act to increase emissions. While simple in concept, the IPC is difficult to calculate in practice. Calculation of the IPC requires knowledge of (i) the amount of energy associated with the subsidy, (ii) the amount and type of energy that would have been produced in the absence of the subsidy, and (iii) the greenhouse gas emissions associated with both the subsidized energy and the potential replacement energy. These pieces of information are not consistently available for federal subsidies, and there is considerable uncertainty in cases where the information is available. Thus, exact values for the IPC based upon fully consistent standards cannot be calculated with available data. However, it is possible to estimate a range of potential values sufficient for initial comparisons. This study employed static methods, which do not account for the dynamics of supply and demand, to generate “first-order” estimates of the IPC for a range of federal subsidies. The study demonstrates that, while the IPC value depends upon how the inquiry is framed and the IPC cannot be calculated in a “one size fits all” manner, IPC calculations can provide a valuable perspective for climate policy analysis. IPC values are most useful when calculated within the perspective of a case study, with the method and parameters of the calculation determined by the case. The IPC of different policy measures can
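    The IPC definition in this abstract reduces to a one-line calculation once the three inputs are known. The function and example numbers below are illustrative, not values from the study:

    ```python
    def implied_price_of_carbon(subsidy_cost_usd,
                                subsidized_mwh,
                                subsidized_tco2_per_mwh,
                                replacement_tco2_per_mwh):
        """IPC = subsidy cost / emissions avoided versus the counterfactual mix.

        A negative IPC means the subsidized energy emits more than the energy
        it replaced, i.e. the subsidy acts to increase emissions.
        """
        avoided_tco2 = subsidized_mwh * (replacement_tco2_per_mwh
                                         - subsidized_tco2_per_mwh)
        if avoided_tco2 == 0:
            raise ValueError("no emissions change; IPC undefined")
        return subsidy_cost_usd / avoided_tco2

    # Illustrative numbers only: a $50M subsidy for 1 TWh of zero-emission
    # generation displacing gas at roughly 0.4 tCO2/MWh.
    ipc = implied_price_of_carbon(50e6, 1e6, 0.0, 0.4)
    print(ipc)  # 125.0 $/tCO2
    ```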

  2. The Uncertainty in the Local Seismic Response Analysis

    SciTech Connect (OSTI)

    Pasculli, A.; Pugliese, A.; Romeo, R. W.; Sano, T.

    2008-07-08

This paper shows how accounting for dispersion and uncertainty, both in the seismic input and in the dynamic properties of soils, influences local seismic response analysis. As a first step, a 1D numerical model is developed that accounts for both the aleatory nature of the input motion and the stochastic variability of the dynamic properties of soils. The seismic input is introduced in a non-conventional way through a power spectral density, for which an elastic response spectrum, derived for instance from a conventional seismic hazard analysis, is required with an appropriate level of reliability. The uncertainty in the geotechnical properties of soils is instead investigated through a well-known simulation technique (the Monte Carlo method) for the construction of statistical ensembles. The result of a conventional local seismic response analysis, a single deterministic elastic response spectrum, is replaced in our approach by a set of statistical elastic response spectra, each characterized by an appropriate probability of being reached or exceeded. The analyses have been carried out for a well-documented real case study. Lastly, we outline a planned 2D numerical analysis to also investigate the spatial variability of soil properties.
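    The replacement of one deterministic response spectrum by a statistical ensemble can be sketched as follows. The toy amplification model, the Vs distribution, and the reference spectrum are all hypothetical, not the paper's case-study data:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical 1D site: amplification of a reference spectrum depends on the
    # soil's shear-wave velocity Vs, treated as lognormally distributed.
    periods = np.array([0.1, 0.2, 0.5, 1.0, 2.0])       # s
    reference_sa = np.array([0.8, 1.0, 0.7, 0.4, 0.2])  # g, from hazard analysis

    def site_spectrum(vs):
        # Toy amplification model: softer soil (lower Vs) amplifies more.
        amplification = (760.0 / vs) ** 0.4
        return reference_sa * amplification

    vs_samples = rng.lognormal(mean=np.log(300.0), sigma=0.2, size=2000)
    ensemble = np.array([site_spectrum(vs) for vs in vs_samples])

    # Replace the single deterministic spectrum with statistical spectra at
    # chosen probability levels of being reached or exceeded.
    median = np.percentile(ensemble, 50, axis=0)
    p84 = np.percentile(ensemble, 84, axis=0)
    print(median.shape, bool(np.all(p84 >= median)))
    ```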

  3. Mandatory petroleum price and allocation regulations: a history and analysis

    SciTech Connect (OSTI)

    Lane, W.C. Jr.

    1981-05-05

    This study is a history and analysis of Federal controls over petroleum price and allocation decisions throughout the 1970's. It relies in large part on the many analyses and comments written by academics, government officials and others who have studied the topic over the past nine years. The first part of this study outlines the conditions in the domestic petroleum market that led to the imposition of first price, and then allocation controls on the oil industry. The second part explores the operation and effectiveness of the regulations during periods of oil supply disruptions - the Arab embargo of 1973-1974 and the 1979 shortfall attendant to the revolution in Iran. The third part describes many of the more important changes made to the regulations during non-disrupted periods, and examines the effect the regulations had on the operation and structure of the domestic oil industry. The last part analyzes the overall effects of the regulations on consumer prices and national welfare.

  4. Microsoft Word - Price Probabilities Supplement.doc

    Gasoline and Diesel Fuel Update (EIA)

    0 1 April 2010 Short-Term Energy Outlook Supplement: Probabilities of Possible Future Prices 1 EIA introduced a monthly analysis of energy price volatility and forecast uncertainty in the October 2009 Short-Term Energy Outlook (STEO). Included in the analysis were charts portraying confidence intervals around the New York Mercantile Exchange (NYMEX) futures prices of West Texas Intermediate (equivalent to light sweet crude oil) and Henry Hub natural gas contracts. The March 2010 STEO added

  5. Thermal hydraulic limits analysis using statistical propagation of parametric uncertainties

    SciTech Connect (OSTI)

    Chiang, K. Y.; Hu, L. W.; Forget, B.

    2012-07-01

The MIT Research Reactor (MITR) is evaluating conversion from highly enriched uranium (HEU) to low-enrichment uranium (LEU) fuel. In addition to the fuel element re-design, an upgrade of reactor power from 6 MW to 7 MW is proposed in order to maintain the same reactor performance as the HEU core. The previous approach to analyzing the impact of engineering uncertainties on thermal hydraulic limits, via the use of engineering hot channel factors (EHCFs), was unable to explicitly quantify the uncertainty and confidence level in reactor parameters. The objective of this study is to develop a methodology for MITR thermal hydraulic limits analysis that statistically combines engineering uncertainties, with the aim of eliminating unnecessary conservatism inherent in traditional analyses. This method was employed to analyze the Limiting Safety System Settings (LSSS) for the MITR, defined as the avoidance of the onset of nucleate boiling (ONB). Key parameters, such as coolant channel tolerances and heat transfer coefficients, were modeled as normal distributions, and Oracle Crystal Ball was used to calculate ONB. The LSSS power is determined with a 99.7% confidence level. The LSSS power calculated using this new methodology is 9.1 MW, based on a core outlet coolant temperature of 60 deg. C and a primary coolant flow rate of 1800 gpm, compared to 8.3 MW obtained from the analytical method using the EHCFs with the same operating conditions. The same methodology was also used to calculate the safety limit (SL) for the MITR, conservatively determined using onset of flow instability (OFI) as the criterion, to verify that an adequate safety margin exists between the LSSS and SL. The calculated SL is 10.6 MW, which is 1.5 MW higher than the LSSS. (authors)
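    The gain from statistical combination over worst-case stacking can be sketched in a few lines. The power model and parameter distributions below are invented for illustration and are not MITR values:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000

    # Hypothetical normal distributions for two key inputs:
    channel_gap = rng.normal(2.0, 0.05, n)  # mm, coolant channel tolerance
    htc = rng.normal(20.0, 1.5, n)          # kW/m^2-K, heat transfer coefficient

    # Toy ONB-limited power model: the limit grows with gap and heat transfer.
    onb_power = 8.0 * (channel_gap / 2.0) * (htc / 20.0) ** 0.5  # MW

    # Statistical combination: the limiting power exceeded in all but 0.3% of
    # sampled cases, i.e. satisfied with 99.7% confidence.
    lsss = np.percentile(onb_power, 0.3)

    # A hot-channel-factor style bound instead stacks 3-sigma worst cases:
    ehcf_power = 8.0 * ((2.0 - 3 * 0.05) / 2.0) * ((20.0 - 3 * 1.5) / 20.0) ** 0.5
    print(round(lsss, 2), round(ehcf_power, 2))
    ```

    Because the worst cases of independent parameters rarely coincide, the sampled bound sits above the stacked bound, mirroring the 9.1 MW versus 8.3 MW result.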

  6. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 1: Main report

    SciTech Connect (OSTI)

    Brown, J.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-06-01

    This volume is the first of a two-volume document that summarizes a joint project conducted by the US Nuclear Regulatory Commission and the European Commission to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This document reports on an ongoing project to assess uncertainty in the MACCS and COSYMA calculations for the offsite consequences of radionuclide releases by hypothetical nuclear power plant accidents. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain variables that affect calculations of offsite consequences. The expert judgment elicitation procedure and its outcomes are described in these volumes. Other panels were formed to consider uncertainty in other aspects of the codes. Their results are described in companion reports. Volume 1 contains background information and a complete description of the joint consequence uncertainty study. Volume 2 contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures for both panels, (3) the rationales and results for the panels on soil and plant transfer and animal transfer, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  7. Technical analysis in short-term uranium price forecasting

    SciTech Connect (OSTI)

    Schramm, D.S.

    1990-03-01

As market participants anticipate the end of the current uranium price decline and its subsequent reversal, increased attention will be focused on forecasting future price movements. Although uranium is economically similar to other mineral commodities, it is questionable whether methodologies used to forecast the price movements of such commodities can be successfully applied to uranium.

  8. Global analysis of energy prices and agriculture. Staff report

    SciTech Connect (OSTI)

    McDonald, B.J.; Martinez, S.W.; Otradovsky, M.; Stout, J.V.

    1991-09-01

A multiregion computable general equilibrium (CGE) model was used to assess the long-run effects of higher energy prices on agricultural production, prices, and trade. An increase in the price of energy enters farmers' cost functions through direct energy use and through the indirect influence of energy prices on intermediate inputs, especially fertilizers. The multiregion feature of the model allows us to include the effects of energy price shocks on economies of other regions and to assess price changes in a global context. Because farming is highly energy-intensive, agricultural output falls more than output in the manufacturing and services sectors of each region of the model. Real returns to farmland, a good indicator of farm welfare, fall in each of the four regions. The U.S. land price declines by 3.5 percent, a drop comparable to that resulting from a 20-percent multilateral agricultural policy liberalization in a similar model.

  9. Cassini Spacecraft Uncertainty Analysis Data and Methodology Review and Update/Volume 1: Updated Parameter Uncertainty Models for the Consequence Analysis

    SciTech Connect (OSTI)

    WHEELER, TIMOTHY A.; WYSS, GREGORY D.; HARPER, FREDERICK T.

    2000-11-01

    Uncertainty distributions for specific parameters of the Cassini General Purpose Heat Source Radioisotope Thermoelectric Generator (GPHS-RTG) Final Safety Analysis Report consequence risk analysis were revised and updated. The revisions and updates were done for all consequence parameters for which relevant information exists from the joint project on Probabilistic Accident Consequence Uncertainty Analysis by the United States Nuclear Regulatory Commission and the Commission of European Communities.

  10. Estimation and Uncertainty Analysis of Impacts of Future Heat...

    Office of Scientific and Technical Information (OSTI)

    However, the estimation of excess mortality attributable to future heat waves is subject to large uncertainties, which have not been examined under the latest greenhouse gas ...

  11. Electricity prices in a competitive environment: Marginal cost pricing of generation services and financial status of electric utilities. A preliminary analysis through 2015

    SciTech Connect (OSTI)

    1997-08-01

The emergence of competitive markets for electricity generation services is changing the way that electricity is and will be priced in the United States. This report presents the results of an analysis that focuses on two questions: (1) How are prices for competitive generation services likely to differ from regulated prices if competitive prices are based on marginal costs rather than regulated “cost-of-service” pricing? (2) What impacts will the competitive pricing of generation services (based on marginal costs) have on electricity consumption patterns, production costs, and the financial integrity of electricity suppliers? This study is not intended to be a cost-benefit analysis of wholesale or retail competition, nor does this report include an analysis of the macroeconomic impacts of competitive electricity prices.

  12. Fukushima Daiichi unit 1 uncertainty analysis--Preliminary selection of uncertain parameters and analysis methodology

    SciTech Connect (OSTI)

    Cardoni, Jeffrey N.; Kalinich, Donald A.

    2014-02-01

Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) on the Fukushima Daiichi unit 1 (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and the Nuclear Regulatory Commission (NRC). However, that study examined only a handful of model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, vessel lower head failure, etc.). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.

  13. An Analysis of Residential PV System Price Differences between...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Residential photovoltaic (PV) systems were twice as expensive in the United States as in ... Tracking the Sun VIII: The Installed Price of Residential and Non-Residential Photovoltaic ...

  14. Problem Solving Environment for Uncertainty Analysis and Design Exploration

    Energy Science and Technology Software Center (OSTI)

    2011-10-26

PSUADE is a software system used to study the relationships between the inputs and outputs of general simulation models for the purpose of performing uncertainty and sensitivity analyses on those models.

  15. Uncertainty analysis of multi-rate kinetics of uranium desorption...

    Office of Scientific and Technical Information (OSTI)

in the multi-rate model to simulate U(VI) desorption; 3) however, long-term prediction and its uncertainty may be significantly biased by the lognormal assumption for the ...

  16. Techno-economic and uncertainty analysis of in situ and ex situ fast pyrolysis for biofuel production

    SciTech Connect (OSTI)

    Li, Boyan; Ou, Longwen; Dang, Qi; Meyer, Pimphan A.; Jones, Susanne B.; Brown, Robert C.; Wright, Mark

    2015-11-01

This study evaluates the techno-economic uncertainty in cost estimates for two emerging biorefinery technologies for biofuel production: in situ and ex situ catalytic pyrolysis. Stochastic simulations based on process and economic parameter distributions are applied to calculate biorefinery performance and production costs. The probability distributions for the minimum fuel-selling price (MFSP) indicate that in situ catalytic pyrolysis has an expected MFSP of $4.20 per gallon with a standard deviation of 1.15, while ex situ catalytic pyrolysis has a similar MFSP with a smaller deviation ($4.27 per gallon and 0.79, respectively). These results suggest that a biorefinery based on ex situ catalytic pyrolysis could have a lower techno-economic risk than in situ pyrolysis despite a slightly higher MFSP estimate. Analysis of how each parameter affects the net present value indicates that internal rate of return, feedstock price, total project investment, electricity price, biochar yield, and bio-oil yield are the parameters with the most substantial impact on the MFSP for both in situ and ex situ catalytic pyrolysis.
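    A stochastic MFSP estimate of this kind can be sketched with a few sampled cost drivers. The triangular distributions and the toy MFSP model below are assumptions for illustration, not the study's process model:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n = 50_000

    # Hypothetical triangular distributions for three cost drivers:
    feedstock = rng.triangular(60, 80, 110, n)         # $/dry ton
    capital_factor = rng.triangular(0.8, 1.0, 1.4, n)  # relative to base estimate
    fuel_yield = rng.triangular(55, 65, 72, n)         # gal/dry ton

    # Toy MFSP model: capital recovery per gallon plus feedstock cost per gallon.
    base_capital_per_gal = 2.6  # $/gal at the nominal design
    mfsp = base_capital_per_gal * capital_factor + feedstock / fuel_yield

    # Mean and standard deviation of the MFSP distribution, as reported
    # in the abstract for the two pyrolysis configurations.
    print(round(mfsp.mean(), 2), round(mfsp.std(), 2))
    ```

    Ranking the correlation of each sampled driver with `mfsp` then reproduces the kind of parameter-importance analysis the abstract describes.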

  17. LCOE Uncertainty Analysis for Hydropower using Monte Carlo Simulations

    SciTech Connect (OSTI)

    Chalise, Dol Raj; O'Connor, Patrick W; DeNeale, Scott T; Uria Martinez, Rocio; Kao, Shih-Chieh

    2015-01-01

Levelized Cost of Energy (LCOE) is an important metric for evaluating the cost and performance of electricity generation alternatives and, combined with other measures, can be used to assess the economics of future hydropower development. Calculating the LCOE requires multiple assumptions on input parameters, each of which carries some level of uncertainty, in turn affecting the accuracy of LCOE results. This paper explores these uncertainties, their sources, and ultimately the level of variability they introduce at the screening level of project evaluation for non-powered dams (NPDs) across the U.S. Owing to site-specific differences in design, the LCOE for hydropower varies significantly from project to project, unlike technologies with more standardized configurations such as wind and gas. Therefore, to assess the impact of LCOE input uncertainty on the economics of U.S. hydropower resources, these uncertainties must be modeled across the population of potential opportunities. To demonstrate the impact of uncertainty, resource data from a recent nationwide non-powered dam (NPD) resource assessment (Hadjerioua et al., 2012) and screening-level predictive cost equations (O'Connor et al., 2015) are used to quantify and evaluate uncertainties in project capital and operations & maintenance costs, and in generation potential at broad scale. LCOE dependence on financial assumptions is also evaluated on a sensitivity basis to explore ownership/investment implications on project economics for the U.S. hydropower fleet. The results indicate that the LCOE for U.S. NPDs varies substantially. LCOE estimates for potential NPD projects with capacity greater than 1 MW range from 40 to 182 $/MWh, with an average of 106 $/MWh. 4,000 MW could be developed through projects with individual LCOE values below 100 $/MWh. The results also indicate that typically 90% of LCOE uncertainty can be attributed to uncertainties in capital costs and energy production; however
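    At screening level, LCOE is typically annualized capital cost plus O&M divided by annual generation. The function below is a generic sketch of that calculation, and the example project is hypothetical, not a value from the resource assessment:

    ```python
    def lcoe_per_mwh(capital_cost, om_cost_per_year, annual_mwh,
                     discount_rate=0.07, lifetime_years=30):
        """Screening-level LCOE ($/MWh): annualized capital plus O&M over energy."""
        # Capital recovery factor converts upfront cost to an annual payment.
        crf = (discount_rate * (1 + discount_rate) ** lifetime_years /
               ((1 + discount_rate) ** lifetime_years - 1))
        return (capital_cost * crf + om_cost_per_year) / annual_mwh

    # Illustrative NPD retrofit: $40M capital, $0.8M/yr O&M,
    # 10 MW at a 50% capacity factor.
    annual_mwh = 10 * 8760 * 0.5
    print(round(lcoe_per_mwh(40e6, 0.8e6, annual_mwh), 1))  # roughly $92/MWh
    ```

    A Monte Carlo treatment like the paper's would draw `capital_cost`, `om_cost_per_year`, and `annual_mwh` from distributions and build the LCOE distribution from repeated calls.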

  18. Uncertainty Analysis of RELAP5-3D

    SciTech Connect (OSTI)

    Alexandra E Gertman; Dr. George L Mesina

    2012-07-01

As world-wide energy consumption continues to increase, so does the demand for alternative energy sources, such as nuclear energy. Nuclear Power Plants currently supply over 370 gigawatts of electricity, and more than 60 new nuclear reactors have been commissioned by 15 different countries. The primary concern for Nuclear Power Plant operation and licensing has been safety. The safe operation of Nuclear Power Plants is no simple matter: it involves the training of operators and the design of the reactor, as well as equipment and design upgrades throughout the lifetime of the reactor. To safely design, operate, and understand nuclear power plants, industry and government alike have relied upon best-estimate simulation codes, which allow an accurate model of any given plant to be created with well-defined margins of safety. The most widely used of these best-estimate simulation codes in the Nuclear Power industry is RELAP5-3D. Our project focused on improving the modeling capabilities of RELAP5-3D by developing uncertainty estimates for its calculations. This work involved analyzing high-, medium-, and low-ranked phenomena from an INL PIRT on a small-break Loss-Of-Coolant Accident, as well as an analysis of a large-break Loss-Of-Coolant Accident. Statistical analyses were performed using correlation coefficients. To perform the studies, computer programs were written that modify a template RELAP5 input deck to produce one deck for each combination of key input parameters. Python scripting enabled running the generated input files with RELAP5-3D on INL’s massively parallel cluster system. Data from the studies were collected and analyzed with SAS. A summary of the results of our studies is presented.
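    The deck-generation step, producing one input file per combination of key parameters, can be sketched as follows. The template fragment and placeholder names are hypothetical for this sketch, not real RELAP5 card syntax:

    ```python
    import itertools
    from string import Template

    # Hypothetical deck fragment; $break_size and $htc_mult are placeholders
    # substituted into a template deck, one output deck per combination.
    deck_template = Template(
        "* SBLOCA study deck\n"
        "break_size $break_size\n"
        "htc_mult   $htc_mult\n"
    )

    break_sizes = [0.01, 0.02, 0.05]  # hypothetical break areas, ft^2
    htc_mults = [0.8, 1.0, 1.2]       # hypothetical heat-transfer multipliers

    decks = []
    for i, (bs, hm) in enumerate(itertools.product(break_sizes, htc_mults)):
        name = f"case_{i:03d}.i"
        decks.append((name, deck_template.substitute(break_size=bs, htc_mult=hm)))

    print(len(decks))  # one input deck per combination of key parameters
    ```

    Each generated deck would then be submitted as a separate RELAP5-3D run, with the results gathered for the correlation analysis.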

  19. Principles and applications of measurement and uncertainty analysis in research and calibration

    SciTech Connect (OSTI)

    Wells, C.V.

    1992-11-01

Interest in Measurement Uncertainty Analysis has grown in the past several years as it has spread to new fields of application, and research and development of uncertainty methodologies have continued. This paper discusses the subject from the perspectives of both research and calibration environments. It presents a history of the development and an overview of the principles of uncertainty analysis embodied in the United States National Standard, ANSI/ASME PTC 19.1-1985, Measurement Uncertainty. Examples are presented in which uncertainty analysis was utilized or is needed to gain further knowledge of a particular measurement process and to characterize final results. Measurement uncertainty analysis provides a quantitative estimate of the interval about a measured value or an experiment result within which the true value of that quantity is expected to lie. Years ago, Harry Ku of the United States National Bureau of Standards stated that “The informational content of the statement of uncertainty determines, to a large extent, the worth of the calibrated value.” Today, that statement is just as true about calibration or research results as it was in 1968. Why is that true? What kind of information should we include in a statement of uncertainty accompanying a calibrated value? How and where do we get the information to include in an uncertainty statement? How should we interpret and use measurement uncertainty information? This discussion will provide answers to these and other questions about uncertainty in research and in calibration. The methodology to be described has been developed by national and international groups over the past nearly thirty years, and individuals were publishing information even earlier. Yet the work is largely unknown in many science and engineering arenas. I will illustrate various aspects of uncertainty analysis with some examples drawn from the radiometry measurement and calibration discipline from research activities.

  1. SOARCA Peach Bottom Atomic Power Station Long-Term Station Blackout Uncertainty Analysis: Convergence of the Uncertainty Results

    SciTech Connect (OSTI)

    Bixler, Nathan E.; Osborn, Douglas M.; Sallaberry, Cedric Jean-Marie; Eckert-Gallup, Aubrey Celia; Mattie, Patrick D.; Ghosh, S. Tina

    2014-02-01

    This paper describes the convergence of MELCOR Accident Consequence Code System, Version 2 (MACCS2) probabilistic results of offsite consequences for the uncertainty analysis of the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout scenario at the Peach Bottom Atomic Power Station. The consequence metrics evaluated are individual latent-cancer fatality (LCF) risk and individual early fatality risk. Consequence results are presented as conditional risk (i.e., assuming the accident occurs, risk per event) to individuals of the public as a result of the accident. In order to verify convergence for this uncertainty analysis, as recommended by the Nuclear Regulatory Commission’s Advisory Committee on Reactor Safeguards, a ‘high’ source term from the original population of Monte Carlo runs has been selected to be used for: (1) a study of the distribution of consequence results stemming solely from epistemic uncertainty in the MACCS2 parameters (i.e., separating the effect from the source term uncertainty), and (2) a comparison between Simple Random Sampling (SRS) and Latin Hypercube Sampling (LHS) in order to validate the original results obtained with LHS. Three replicates (each using a different random seed) of size 1,000 each using LHS and another set of three replicates of size 1,000 using SRS are analyzed. The results show that the LCF risk results are well converged with either LHS or SRS sampling. The early fatality risk results are less well converged at radial distances beyond 2 miles, and this is expected due to the sparse data (predominance of “zero” results).
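    The LHS-versus-SRS comparison rests on the fact that stratified sampling reduces estimator variance across replicates of the same size. A minimal 1D illustration, using a toy integrand rather than MACCS2 output:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def srs(n):
        """Simple random sampling of U(0,1)."""
        return rng.random(n)

    def lhs(n):
        """Latin hypercube sampling: one draw in each of n equal strata."""
        strata = (np.arange(n) + rng.random(n)) / n
        return rng.permutation(strata)

    # Compare replicate-to-replicate spread of a mean estimate, analogous to
    # the convergence study's replicates of size 1,000.
    def replicate_means(sampler, n=1000, replicates=50):
        return np.array([np.exp(sampler(n)).mean() for _ in range(replicates)])

    srs_spread = replicate_means(srs).std()
    lhs_spread = replicate_means(lhs).std()
    print(srs_spread > lhs_spread)  # stratification shrinks estimator variance
    ```

    For a smooth output like this, LHS replicates agree far more tightly than SRS replicates; for sparse tail quantities like early fatality risk, the advantage narrows, which is why the study validates LHS against SRS directly.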

  2. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G

    SciTech Connect (OSTI)

    Harper, F.T.; Young, M.L.; Miller, L.A.

    1995-01-01

    Two new probabilistic accident consequence codes, MACCS and COSYMA, completed in 1990, estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.

  3. Energy Price Indices and Discount Factors for Life-Cycle Cost Analysis - 2015

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    This handbook describes the annual supplements to the NIST Handbook 135 and NBS Special Publication 709: energy price indices and discount factors for life-cycle cost analysis, 2015 edition.

  4. Survey of sampling-based methods for uncertainty and sensitivity analysis.

    SciTech Connect (OSTI)

    Johnson, Jay Dean; Helton, Jon Craig; Sallaberry, Cedric J. PhD.; Storlie, Curt B. (Colorado State University, Fort Collins, CO)

    2006-06-01

    Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (1) Definition of probability distributions to characterize epistemic uncertainty in analysis inputs, (2) Generation of samples from uncertain analysis inputs, (3) Propagation of sampled inputs through an analysis, (4) Presentation of uncertainty analysis results, and (5) Determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top down coefficient of concordance, and variance decomposition.
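Several of the listed procedures (scatterplot examination, correlation analysis, rank transformation) can be illustrated with a toy model in which only one input matters strongly. The inputs, coefficients, and sample size below are invented for illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 500
x1 = rng.uniform(0.0, 1.0, n)  # strongly influential, nonlinear effect
x2 = rng.uniform(0.0, 1.0, n)  # weakly influential
x3 = rng.uniform(0.0, 1.0, n)  # no influence at all
y = np.exp(4.0 * x1) + 0.5 * x2 + rng.normal(0.0, 0.1, n)

for name, x in [("x1", x1), ("x2", x2), ("x3", x3)]:
    pcc, _ = stats.pearsonr(x, y)   # linear association
    rcc, _ = stats.spearmanr(x, y)  # rank (monotone) association
    print(f"{name}: pearson={pcc:+.2f}  spearman={rcc:+.2f}")
```

Because the influential input acts through a nonlinear but monotone function, the rank (Spearman) correlation exceeds the linear (Pearson) one, which is exactly the motivation for rank transformations in sampling-based sensitivity analysis.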

  5. Building-Integrated Photovoltaics (BIPV) in the Residential Sector: An Analysis of Installed Rooftop Prices (Presentation)

    SciTech Connect (OSTI)

    James, T.; Goodrich, A.; Woodhouse, M.; Margolis, R.; Ong, S.

    2012-06-01

    This PowerPoint presentation, presented at the World Renewable Energy Forum on May 17, 2012, in Denver, Colorado, discusses building-integrated photovoltaics (BIPV) in the residential sector and includes an analysis of installed rooftop prices.

  6. An Analysis of Residential PV System Price Differences between the United States and Germany

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Residential photovoltaic (PV) systems were twice as expensive in the United States as in Germany (median of $5.29/W vs. $2.59/W) in 2012. This price discrepancy stems primarily from differences in non-hardware, or "soft," costs between the two countries.

  7. 2007 Wholesale Power Rate Case Final Proposal : Market Price Forecast Study.

    SciTech Connect (OSTI)

    United States. Bonneville Power Administration.

    2006-07-01

    This study presents BPA's market price forecasts for the Final Proposal, which are based on AURORA modeling. AURORA calculates the variable cost of the marginal resource in a competitively priced energy market. In competitive market pricing, the marginal cost of production is equivalent to the market-clearing price. Market-clearing prices are important factors for informing BPA's power rates. AURORA was used as the primary tool for (a) estimating the forward price for the IOU REP Settlement benefits calculation for fiscal years (FY) 2008 and 2009, (b) estimating the uncertainty surrounding DSI payments and IOU REP Settlements benefits, (c) informing the secondary revenue forecast and (d) providing a price input used for the risk analysis. For information about the calculation of the secondary revenues, uncertainty regarding the IOU REP Settlement benefits and DSI payment uncertainty, and the risk run, see Risk Analysis Study WP-07-FS-BPA-04.
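The pricing principle AURORA applies (the market-clearing price equals the variable cost of the marginal resource) can be sketched with a hypothetical merit-order dispatch; the plant names, costs, and capacities below are made up:

```python
# Hypothetical fleet: (name, variable cost in $/MWh, capacity in MW)
FLEET = [
    ("hydro", 5.0, 300.0),
    ("coal", 25.0, 400.0),
    ("gas_combined_cycle", 45.0, 500.0),
    ("gas_peaker", 90.0, 200.0),
]

def clearing_price(demand_mw):
    """Dispatch plants cheapest-first; price = variable cost of the marginal unit."""
    remaining = demand_mw
    for name, cost, capacity in sorted(FLEET, key=lambda p: p[1]):
        if remaining <= capacity:
            return cost
        remaining -= capacity
    raise ValueError("demand exceeds installed capacity")

print(clearing_price(200.0))   # hydro alone covers demand, so it sets the price
print(clearing_price(1000.0))  # the combined-cycle unit is marginal here
```

At low demand the cheap hydro sets the price; as demand rises, progressively more expensive units become marginal, which is why demand expectations move the forward-price forecast.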

  8. Uncertainty Analysis for a Virtual Flow Meter Using an Air-Handling Unit Chilled Water Valve

    Office of Scientific and Technical Information (OSTI)


  9. PROBABILISTIC SENSITIVITY AND UNCERTAINTY ANALYSIS WORKSHOP SUMMARY REPORT

    SciTech Connect (OSTI)

    Seitz, R

    2008-06-25

    Stochastic or probabilistic modeling approaches are being applied more frequently in the United States and globally to quantify uncertainty and enhance understanding of model response in performance assessments for disposal of radioactive waste. This increased use has resulted in global interest in sharing results of research and applied studies that have been completed to date. This technical report reflects the results of a workshop that was held to share results of research and applied work related to performance assessments conducted at United States Department of Energy sites. Key findings of this research and applied work are discussed and recommendations for future activities are provided.

  10. Analysis and Reduction of Complex Networks Under Uncertainty.

    SciTech Connect (OSTI)

    Ghanem, Roger G

    2014-07-31

    This effort was a collaboration with Youssef Marzouk of MIT, Omar Knio of Duke University (at the time at Johns Hopkins University) and Habib Najm of Sandia National Laboratories. The objective of this effort was to develop the mathematical and algorithmic capacity to analyze complex networks under uncertainty. Of interest were chemical reaction networks and smart grid networks. The statements of work for USC focused on the development of stochastic reduced models for uncertain networks. The USC team was led by Professor Roger Ghanem and consisted of one graduate student and a postdoc. The contributions completed by the USC team consisted of 1) methodology and algorithms to address the eigenvalue problem, a problem of significance in the stability of networks under stochastic perturbations, 2) methodology and algorithms to characterize probability measures on graph structures with random flows. This is an important problem in characterizing random demand (encountered in smart grid) and random degradation (encountered in infrastructure systems), as well as modeling errors in Markov Chains (with ubiquitous relevance!). 3) methodology and algorithms for treating inequalities in uncertain systems. This is an important problem in the context of models for material failure and network flows under uncertainty where conditions of failure or flow are described in the form of inequalities between the state variables.

  11. An Analysis of Price Determination and Markups in the Air-Conditioning and Heating Equipment Industry

    SciTech Connect (OSTI)

    Dale, Larry; Millstein, Dev; Coughlin, Katie; Van Buskirk, Robert; Rosenquist, Gregory; Lekov, Alex; Bhuyan, Sanjib

    2004-01-30

    In this report we calculate the change in final consumer prices due to minimum efficiency standards, focusing on a standard economic model of the air-conditioning and heating equipment (ACHE) wholesale industry. The model examines the relationship between the marginal cost to distribute and sell equipment and the final consumer price in this industry. The model predicts that the impact of a standard on the final consumer price is conditioned by its impact on marginal distribution costs. For example, if a standard raises the marginal cost to distribute and sell equipment a small amount, the model predicts that the standard will raise the final consumer price a small amount as well. Statistical analysis suggests that standards do not increase the amount of labor needed to distribute equipment; the same employees needed to sell lower-efficiency equipment can sell high-efficiency equipment. Labor is a large component of the total marginal cost to distribute and sell air-conditioning and heating equipment. We infer from this that standards have a relatively small impact on ACHE marginal distribution and sale costs. Thus, our model predicts that a standard will have a relatively small impact on final ACHE consumer prices. Our statistical analysis of U.S. Census Bureau wholesale revenue tends to confirm this model prediction. Generalizing, we find that the ratio of manufacturer price to final consumer price prior to a standard tends to exceed the ratio of the change in manufacturer price to the change in final consumer price resulting from a standard. The appendix expands our analysis through a typical distribution chain for commercial and residential air-conditioning and heating equipment.

  12. SSL Pricing and Efficacy Trend Analysis for Utility Program Planning

    SciTech Connect (OSTI)

    Tuenge, Jason R.

    2013-10-01

    An LED lamp or luminaire can generally be found that matches or exceeds the efficacy of benchmark technologies in a given product category, and LED products continue to expand into ever-higher lumen output niches. However, the price premium for LED continues to pose a barrier to adoption in many applications, in spite of expected savings from reduced energy use and maintenance. Other factors—such as dimmability and quality of light—can also present challenges. The appropriate type, timing, and magnitude of energy efficiency activities will vary from organization to organization based on local variables and the method of evaluation. A number of factors merit consideration when prioritizing activities for development. Category-specific projections for pricing and efficacy are provided herein to assist in efficiency program planning efforts.

  13. Uncertainty analysis for probabilistic pipe fracture evaluations in LBB applications

    SciTech Connect (OSTI)

    Rahman, S.; Ghadiali, N.; Wilkowski, G.

    1997-04-01

    During the NRC's Short Cracks in Piping and Piping Welds Program at Battelle, a probabilistic methodology was developed to conduct fracture evaluations of circumferentially cracked pipes for application to leak-rate detection. Later, in the IPIRG-2 program, several parameters that may affect leak-before-break and other pipe flaw evaluations were identified. This paper presents new results from several uncertainty analyses to evaluate the effects of normal operating stresses, normal plus safe-shutdown earthquake stresses, off-centered cracks, restraint of pressure-induced bending, and dynamic and cyclic loading rates on the conditional failure probability of piping systems in BWR and PWR plants. For each parameter, the sensitivity of the conditional probability of failure, and hence its importance to probabilistic leak-before-break evaluations, was determined.

  14. The IAEA Coordinated Research Program on HTGR Uncertainty Analysis: Phase I Status and Initial Results

    SciTech Connect (OSTI)

    Strydom, Gerhard; Bostelmann, Friederike; Ivanov, Kostadin

    2014-10-01

    The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high fidelity physics models and robust, efficient, and accurate codes. One way to address the uncertainties in the HTGR analysis tools is to assess the sensitivity of critical parameters (such as the calculated maximum fuel temperature during loss of coolant accidents) to a few important input uncertainties. The input parameters were identified by engineering judgement in the past but are today typically based on a Phenomena Identification Ranking Table (PIRT) process. The input parameters can also be derived from sensitivity studies and are then varied in the analysis to find a spread in the parameter of importance. However, there is often no easy way to compensate for these uncertainties. In engineering system design, a common approach for addressing performance uncertainties is to add compensating margins to the system, but with passive properties credited it is not so clear how to apply this in the case of the modular HTGR heat removal path. Other more sophisticated uncertainty modelling approaches, including Monte Carlo analysis, have also been proposed and applied. Ideally one wishes to apply a more fundamental approach to determine the predictive capability and accuracies of coupled neutronics/thermal-hydraulics and depletion simulations used for reactor design and safety assessment. Today there is a broader acceptance of the use of uncertainty analysis even in safety studies, and it has been accepted by regulators in some cases to replace the traditional conservative analysis. Therefore some safety analysis calculations may use a mixture of these approaches for different parameters, depending upon the particular requirements of the analysis problem involved. Sensitivity analysis can, for example, be used to provide information as part of an uncertainty analysis to determine best-estimate-plus-uncertainty results.

  15. Use of SUSA in Uncertainty and Sensitivity Analysis for INL VHTR Coupled Codes

    SciTech Connect (OSTI)

    Gerhard Strydom

    2010-06-01

    The need for a defendable and systematic Uncertainty and Sensitivity approach that conforms to the Code Scaling, Applicability, and Uncertainty (CSAU) process, and that could be used for a wide variety of software codes, was defined in 2008. The GRS (Gesellschaft für Anlagen- und Reaktorsicherheit) company of Germany has developed one type of CSAU approach that is particularly well suited for legacy coupled core analysis codes, and a trial version of their commercial software product SUSA (Software for Uncertainty and Sensitivity Analyses) was acquired on May 12, 2010. This interim milestone report provides an overview of the current status of the implementation and testing of SUSA at the INL VHTR Project Office.

  16. Uncertainty Analysis Framework - Hanford Site-Wide Groundwater Flow and Transport Model

    SciTech Connect (OSTI)

    Cole, Charles R.; Bergeron, Marcel P.; Murray, Christopher J.; Thorne, Paul D.; Wurstner, Signe K.; Rogers, Phillip M.

    2001-11-09

    Pacific Northwest National Laboratory (PNNL) embarked on a new initiative to strengthen the technical defensibility of the predictions being made with a site-wide groundwater flow and transport model at the U.S. Department of Energy Hanford Site in southeastern Washington State. In FY 2000, the focus of the initiative was on the characterization of major uncertainties in the current conceptual model that would affect model predictions. The long-term goals of the initiative are the development and implementation of an uncertainty estimation methodology in future assessments and analyses using the site-wide model. This report focuses on the development and implementation of an uncertainty analysis framework.

  17. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    SciTech Connect (OSTI)

    Drellack, Sig; Prothro, Lance

    2007-12-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty, including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty, in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. With multiple alternative flow models advanced, the sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations.

  18. IAEA CRP on HTGR Uncertainty Analysis: Benchmark Definition and Test Cases

    SciTech Connect (OSTI)

    Gerhard Strydom; Frederik Reitsma; Hans Gougar; Bismark Tyobeka; Kostadin Ivanov

    2012-11-01

    Uncertainty and sensitivity studies are essential elements of the reactor simulation code verification and validation process. Although several international uncertainty quantification activities have been launched in recent years in the LWR, BWR and VVER domains (e.g. the OECD/NEA BEMUSE program [1], from which the current OECD/NEA LWR Uncertainty Analysis in Modelling (UAM) benchmark [2] effort was derived), the systematic propagation of uncertainties in cross-section, manufacturing and model parameters for High Temperature Reactor (HTGR) designs has not been attempted yet. This paper summarises the scope, objectives and exercise definitions of the IAEA Coordinated Research Project (CRP) on HTGR UAM [3]. Note that no results will be included here, as the HTGR UAM benchmark was only launched formally in April 2012, and the specification is currently still under development.

  19. SOARCA Peach Bottom Atomic Power Station Long-Term Station Blackout Uncertainty Analysis: Knowledge Advancement.

    SciTech Connect (OSTI)

    Gauntt, Randall O.; Mattie, Patrick D.; Bixler, Nathan E.; Ross, Kyle; Cardoni, Jeffrey N; Kalinich, Donald A.; Osborn, Douglas M.; Sallaberry, Cedric Jean-Marie; Ghosh, S. Tina

    2014-02-01

    This paper describes the knowledge advancements from the uncertainty analysis for the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout accident scenario at the Peach Bottom Atomic Power Station. This work assessed key MELCOR and MELCOR Accident Consequence Code System, Version 2 (MACCS2) modeling uncertainties in an integrated fashion to quantify the relative importance of each uncertain input on potential accident progression, radiological releases, and off-site consequences. This quantitative uncertainty analysis provides measures of the effects on consequences of each of the selected uncertain parameters, both individually and in interaction with other parameters. The results measure the model response (e.g., variance in the output) to uncertainty in the selected inputs. Investigation into the important uncertain parameters in turn yields insights into important phenomena for accident progression and off-site consequences. This uncertainty analysis confirmed the known importance of some parameters, such as the failure rate of the safety relief valve in accident progression modeling and the dry deposition velocity in off-site consequence modeling. The analysis also revealed some new insights, such as the dependent effect of the cesium chemical form under different accident progressions.

  20. Determining Price Reasonableness in Federal ESPCs

    SciTech Connect (OSTI)

    Shonder, J.A.

    2005-03-08

    This document reports the findings and implementation recommendations of the Price Reasonableness Working Group to the Federal ESPC Steering Committee. The working group was formed to address concerns of agencies and oversight organizations related to pricing and fair and reasonable price determination in federal energy savings performance contracts (ESPCs). This report comprises the working group's recommendations and is the proposed draft of a training curriculum on fair and reasonable price determination for users of federal ESPCs. The report includes: (1) A review of federal regulations applicable to determining price reasonableness of federal ESPCs (section 2), (2) Brief descriptions of the techniques described in Federal Acquisition Regulations (FAR) 15.404-1 and their applicability to ESPCs (section 3), and (3) Recommended strategies and procedures for cost-effectively completing price reasonableness determinations (section 4). Agencies have struggled with fair and reasonable price determinations in their ESPCs primarily because this alternative financing vehicle is relatively new and relatively rare in the federal sector. The methods of determining price reasonableness most familiar to federal contracting officers (price competition based on the government's design and specifications, in particular) are generally not applicable to ESPCs. The regulatory requirements for determining price reasonableness in federal ESPCs have also been misunderstood, as federal procurement professionals who are inexperienced with ESPCs are further confused by multiple directives, including Executive Order 13123, which stresses life-cycle cost-effectiveness. Uncertainty about applicable regulations and inconsistent practice and documentation among agencies have fueled claims that price reasonableness determinations have not been sufficiently rigorous in federal ESPCs or that the prices paid in ESPCs are generally higher than the prices paid for similar goods and services.

  1. Implementation of a Bayesian Engine for Uncertainty Analysis

    SciTech Connect (OSTI)

    Leng Vang; Curtis Smith; Steven Prescott

    2014-08-01

    In probabilistic risk assessment, it is important to have an environment where analysts have access to shared, secure high-performance computing and a statistical analysis tool package. As part of the advanced small modular reactor probabilistic risk analysis framework implementation, we have identified the need for advanced Bayesian computations. However, in order to make this technology available to non-specialists, there is also a need for a simplified tool that allows users to author models and evaluate them within this framework. As a proof-of-concept, we have implemented an advanced open source Bayesian inference tool, OpenBUGS, within the browser-based cloud risk analysis framework that is under development at the Idaho National Laboratory. This development, the “OpenBUGS Scripter,” has been implemented as a client-side, visual, web-based and integrated development environment for creating OpenBUGS language scripts. It depends on the shared server environment to execute the generated scripts and to transmit results back to the user. The visual models are in the form of linked diagrams, from which we automatically create the applicable OpenBUGS script that matches the diagram. These diagrams can be saved locally or stored on the server environment to be shared with other users.
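OpenBUGS itself is an external MCMC engine; as a stand-in, the kind of Bayesian update it performs can be sketched with a conjugate Beta-Binomial model for a component failure probability. The prior and the failure data below are hypothetical:

```python
from scipy import stats

# Hypothetical data: 2 failures observed in 100 demands
failures, demands = 2, 100
alpha0, beta0 = 1.0, 1.0  # uniform Beta(1, 1) prior on the failure probability

# Conjugate update: posterior is Beta(alpha0 + failures, beta0 + successes)
posterior = stats.beta(alpha0 + failures, beta0 + demands - failures)
mean = posterior.mean()                # posterior mean failure probability
lo, hi = posterior.ppf([0.05, 0.95])   # 90% credible interval
print(f"mean={mean:.4f}, 90% interval=({lo:.4f}, {hi:.4f})")
```

For non-conjugate models (the common case in PRA) no closed form exists, which is where an MCMC engine such as OpenBUGS earns its keep.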

  2. Probabilities of Possible Future Prices (Released in the STEO April 2010)

    Reports and Publications (EIA)

    2010-01-01

    The Energy Information Administration introduced a monthly analysis of energy price volatility and forecast uncertainty in the October 2009 Short-Term Energy Outlook (STEO). Included in the analysis were charts portraying confidence intervals around the New York Mercantile Exchange (NYMEX) futures prices of West Texas Intermediate (equivalent to light sweet crude oil) and Henry Hub natural gas contracts.

  3. Uncertainty Analysis for a Virtual Flow Meter Using an Air-Handling Unit Chilled Water Valve

    SciTech Connect (OSTI)

    Song, Li; Wang, Gang; Brambley, Michael R.

    2013-04-28

    A virtual water flow meter is developed that uses the chilled water control valve on an air-handling unit as a measurement device. The flow rate of water through the valve is calculated using the differential pressure across the valve and its associated coil, the valve command, and an empirically determined valve characteristic curve. Thus, the probability of error in the measurements is significantly greater than for conventionally manufactured flow meters. In this paper, mathematical models are developed and used to conduct uncertainty analysis for the virtual flow meter, and the results from the virtual meter are compared to measurements made with an ultrasonic flow meter. Theoretical uncertainty analysis shows that the total uncertainty in flow rates from the virtual flow meter is 1.46% with 95% confidence; comparison of virtual flow meter results with measurements from an ultrasonic flow meter yielded an uncertainty of 1.46% with 99% confidence. The comparable results from the theoretical uncertainty analysis and empirical comparison with the ultrasonic flow meter corroborate each other, and tend to validate the approach to computationally estimating uncertainty for virtual sensors introduced in this study.
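A minimal sketch of the propagation step, assuming an equal-percentage valve characteristic and first-order (root-sum-square) uncertainty combination. The rangeability, operating point, and input uncertainties below are invented for illustration, not taken from the paper:

```python
import math

R = 50.0  # assumed valve rangeability for an equal-percentage characteristic

def valve_flow(cv_max, command, dp):
    """Flow (arbitrary units) from valve command (0-1) and differential pressure."""
    cv = cv_max * R ** (command - 1.0)  # equal-percentage flow coefficient
    return cv * math.sqrt(dp)

def flow_with_uncertainty(cv_max, command, dp, u_command, u_dp):
    """First-order (root-sum-square) propagation via finite-difference sensitivities."""
    h = 1e-6
    q = valve_flow(cv_max, command, dp)
    dq_dcmd = (valve_flow(cv_max, command + h, dp) - q) / h
    dq_ddp = (valve_flow(cv_max, command, dp + h) - q) / h
    return q, math.hypot(dq_dcmd * u_command, dq_ddp * u_dp)

q, u = flow_with_uncertainty(cv_max=10.0, command=0.8, dp=50.0,
                             u_command=0.01, u_dp=1.0)
print(f"flow={q:.2f}, uncertainty={u:.2f} ({100.0 * u / q:.1f}%)")
```

The steep equal-percentage characteristic makes the flow estimate far more sensitive to valve-command error than to pressure error, which is one reason a virtual meter carries more uncertainty than a dedicated instrument.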

  4. IAEA Coordinated Research Project on HTGR Reactor Physics, Thermal-hydraulics and Depletion Uncertainty Analysis

    SciTech Connect (OSTI)

    Strydom, Gerhard; Bostelmann, F.

    2015-09-01

    The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high fidelity physics models and robust, efficient, and accurate codes. The predictive capability of coupled neutronics/thermal-hydraulics and depletion simulations for reactor design and safety analysis can be assessed with sensitivity analysis (SA) and uncertainty analysis (UA) methods. Uncertainty originates from errors in physical data, manufacturing uncertainties, modelling and computational algorithms. (The interested reader is referred to the large body of published SA and UA literature for a more complete overview of the various types of uncertainties, methodologies and results obtained). SA is helpful for ranking the various sources of uncertainty and error in the results of core analyses. SA and UA are required to address cost, safety, and licensing needs and should be applied to all aspects of reactor multi-physics simulation. SA and UA can guide experimental, modelling, and algorithm research and development. Current SA and UA rely either on derivative-based methods such as stochastic sampling methods or on generalized perturbation theory to obtain sensitivity coefficients. Neither approach addresses all needs. In order to benefit from recent advances in modelling and simulation and the availability of new covariance data (nuclear data uncertainties) extensive sensitivity and uncertainty studies are needed for quantification of the impact of different sources of uncertainties on the design and safety parameters of HTGRs. Only a parallel effort in advanced simulation and in nuclear data improvement will be able to provide designers with more robust and well validated calculation tools to meet design target accuracies. 
In February 2009, the Technical Working Group on Gas-Cooled Reactors (TWG-GCR) of the International Atomic Energy Agency (IAEA) recommended that the proposed Coordinated Research Program (CRP) on

  5. Effect of soil property uncertainties on permafrost thaw projections: A calibration-constrained analysis

    SciTech Connect (OSTI)

    Harp, Dylan; Atchley, Adam; Painter, Scott L; Coon, Ethan T.; Wilson, Cathy; Romanovsky, Vladimir E; Rowland, Joel

    2016-01-01

    The effect of soil property uncertainties on permafrost thaw projections are studied using a three-phase subsurface thermal hydrology model and calibration-constrained uncertainty analysis. The Null-Space Monte Carlo method is used to identify soil hydrothermal parameter combinations that are consistent with borehole temperature measurements at the study site, the Barrow Environmental Observatory. Each parameter combination is then used in a forward projection of permafrost conditions for the 21st century (from calendar year 2006 to 2100) using atmospheric forcings from the Community Earth System Model (CESM) in the Representative Concentration Pathway (RCP) 8.5 greenhouse gas concentration trajectory. A 100-year projection allows for the evaluation of intra-annual uncertainty due to soil properties and the inter-annual variability due to year to year differences in CESM climate forcings. After calibrating to borehole temperature data at this well-characterized site, soil property uncertainties are still significant and result in significant intra-annual uncertainties in projected active layer thickness and annual thaw depth-duration even with a specified future climate. Intra-annual uncertainties in projected soil moisture content and Stefan number are small. A volume and time integrated Stefan number decreases significantly in the future climate, indicating that latent heat of phase change becomes more important than heat conduction in future climates. Out of 10 soil parameters, ALT, annual thaw depth-duration, and Stefan number are highly dependent on mineral soil porosity, while annual mean liquid saturation of the active layer is highly dependent on the mineral soil residual saturation and moderately dependent on peat residual saturation.
By comparing the ensemble statistics to the spread of projected permafrost metrics using different climate models, we show that the effect of calibration-constrained uncertainty in soil properties, although significant
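The calibration-constrained ensemble idea can be sketched with a GLUE-style rejection filter standing in for Null-Space Monte Carlo; the one-parameter surrogate model, tolerances, and forcing values below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)

def thaw_depth(porosity, forcing):
    """Toy surrogate: thaw depth shrinks as mineral-soil porosity rises."""
    return forcing / (1.0 + 5.0 * porosity)

# Synthetic "borehole" calibration target generated with porosity = 0.4
observed = thaw_depth(0.4, forcing=1.0)

# Keep only prior samples whose simulated present-day thaw depth matches the data
prior = rng.uniform(0.1, 0.7, 5000)
misfit = np.abs(thaw_depth(prior, 1.0) - observed)
behavioural = prior[misfit < 0.01]

# Forward projection of the calibration-consistent ensemble under stronger forcing
projection = thaw_depth(behavioural, forcing=1.5)
print(behavioural.size, round(projection.mean(), 3), round(projection.std(), 3))
```

The spread of `projection` is the residual uncertainty that survives calibration: many parameter sets fit the observations equally well, yet they do not all project identical future thaw.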

  6. Effect of soil property uncertainties on permafrost thaw projections: A calibration-constrained analysis

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Harp, D. R.; Atchley, A. L.; Painter, S. L.; Coon, E. T.; Wilson, C. J.; Romanovsky, V. E.; Rowland, J. C.

    2015-06-29

    The effect of soil property uncertainties on permafrost thaw projections is studied using a three-phase subsurface thermal hydrology model and calibration-constrained uncertainty analysis. The Null-Space Monte Carlo method is used to identify soil hydrothermal parameter combinations that are consistent with borehole temperature measurements at the study site, the Barrow Environmental Observatory. Each parameter combination is then used in a forward projection of permafrost conditions for the 21st century (from calendar year 2006 to 2100) using atmospheric forcings from the Community Earth System Model (CESM) in the Representative Concentration Pathway (RCP) 8.5 greenhouse gas concentration trajectory. A 100-year projection allows for the evaluation of intra-annual uncertainty due to soil properties and inter-annual variability due to year-to-year differences in CESM climate forcings. After calibrating to borehole temperature data at this well-characterized site, soil property uncertainties are still significant and result in significant intra-annual uncertainties in projected active layer thickness (ALT) and annual thaw depth-duration even with a specified future climate. Intra-annual uncertainties in projected soil moisture content and Stefan number are small. A volume- and time-integrated Stefan number decreases significantly in the future climate, indicating that latent heat of phase change becomes more important than heat conduction in future climates. Out of 10 soil parameters, ALT, annual thaw depth-duration, and Stefan number are highly dependent on mineral soil porosity, while annual mean liquid saturation of the active layer is highly dependent on the mineral soil residual saturation and moderately dependent on peat residual saturation. By comparing the ensemble statistics to the spread of projected permafrost metrics using different climate models, we show that the effect of calibration-constrained uncertainty in soil properties, although

  7. Effect of soil property uncertainties on permafrost thaw projections: A calibration-constrained analysis

    SciTech Connect (OSTI)

    Harp, D. R.; Atchley, A. L.; Painter, S. L.; Coon, E. T.; Wilson, C. J.; Romanovsky, V. E.; Rowland, J. C.

    2015-06-29

    The effect of soil property uncertainties on permafrost thaw projections is studied using a three-phase subsurface thermal hydrology model and calibration-constrained uncertainty analysis. The Null-Space Monte Carlo method is used to identify soil hydrothermal parameter combinations that are consistent with borehole temperature measurements at the study site, the Barrow Environmental Observatory. Each parameter combination is then used in a forward projection of permafrost conditions for the 21st century (from calendar year 2006 to 2100) using atmospheric forcings from the Community Earth System Model (CESM) in the Representative Concentration Pathway (RCP) 8.5 greenhouse gas concentration trajectory. A 100-year projection allows for the evaluation of intra-annual uncertainty due to soil properties and inter-annual variability due to year-to-year differences in CESM climate forcings. After calibrating to borehole temperature data at this well-characterized site, soil property uncertainties are still significant and result in significant intra-annual uncertainties in projected active layer thickness (ALT) and annual thaw depth-duration even with a specified future climate. Intra-annual uncertainties in projected soil moisture content and Stefan number are small. A volume- and time-integrated Stefan number decreases significantly in the future climate, indicating that latent heat of phase change becomes more important than heat conduction in future climates. Out of 10 soil parameters, ALT, annual thaw depth-duration, and Stefan number are highly dependent on mineral soil porosity, while annual mean liquid saturation of the active layer is highly dependent on the mineral soil residual saturation and moderately dependent on peat residual saturation. By comparing the ensemble statistics to the spread of projected permafrost metrics using different climate models, we show that the effect of calibration-constrained uncertainty in soil properties, although significant, is

  8. Effect of soil property uncertainties on permafrost thaw projections: A calibration-constrained analysis

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Harp, Dylan R.; Atchley, Adam L.; Painter, Scott L.; Coon, Ethan T.; Wilson, Cathy J.; Romanovsky, Vladimir E.; Rowland, Joel C.

    2016-02-11

    Here, the effect of soil property uncertainties on permafrost thaw projections is studied using a three-phase subsurface thermal hydrology model and calibration-constrained uncertainty analysis. The Null-Space Monte Carlo method is used to identify soil hydrothermal parameter combinations that are consistent with borehole temperature measurements at the study site, the Barrow Environmental Observatory. Each parameter combination is then used in a forward projection of permafrost conditions for the 21st century (from calendar year 2006 to 2100) using atmospheric forcings from the Community Earth System Model (CESM) in the Representative Concentration Pathway (RCP) 8.5 greenhouse gas concentration trajectory. A 100-year projection allows for the evaluation of intra-annual uncertainty due to soil properties and inter-annual variability due to year-to-year differences in CESM climate forcings. After calibrating to borehole temperature data at this well-characterized site, soil property uncertainties are still significant and result in significant intra-annual uncertainties in projected active layer thickness (ALT) and annual thaw depth-duration even with a specified future climate. Intra-annual uncertainties in projected soil moisture content and Stefan number are small. A volume- and time-integrated Stefan number decreases significantly in the future climate, indicating that latent heat of phase change becomes more important than heat conduction in future climates. Out of 10 soil parameters, ALT, annual thaw depth-duration, and Stefan number are highly dependent on mineral soil porosity, while annual mean liquid saturation of the active layer is highly dependent on the mineral soil residual saturation and moderately dependent on peat residual saturation. By comparing the ensemble statistics to the spread of projected permafrost metrics using different climate models, we show that the effect of calibration-constrained uncertainty in soil properties
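    For orientation, the Stefan number these permafrost abstracts refer to is the ratio of sensible heat to latent heat of phase change; values well below 1 mean latent heat dominates, consistent with the decrease the authors report. A minimal sketch follows, using generic ice/water property values for illustration only, not the papers' calibrated soil parameters.

```python
# Illustrative Stefan number calculation: St = c * dT / L, the ratio of
# sensible heat to latent heat of fusion. Small St means phase change
# (latent heat) dominates over heat conduction. Values are generic, not
# taken from the calibrated soil ensemble in the paper.

def stefan_number(heat_capacity_j_per_kg_k, delta_t_k, latent_heat_j_per_kg):
    """Ratio of sensible heat to latent heat of fusion."""
    return heat_capacity_j_per_kg_k * delta_t_k / latent_heat_j_per_kg

# Ice heat capacity ~2100 J/(kg K), latent heat of fusion ~334,000 J/kg,
# and a 5 K seasonal temperature swing:
st = stefan_number(2100.0, 5.0, 334000.0)
print(round(st, 4))  # → 0.0314, i.e. latent heat strongly dominates
```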

  9. Use of Forward Sensitivity Analysis Method to Improve Code Scaling, Applicability, and Uncertainty (CSAU) Methodology

    SciTech Connect (OSTI)

    Haihua Zhao; Vincent A. Mousseau; Nam T. Dinh

    2010-10-01

    The Code Scaling, Applicability, and Uncertainty (CSAU) methodology was developed in the late 1980s by the US NRC to systematically quantify reactor simulation uncertainty. Building on the CSAU methodology, Best Estimate Plus Uncertainty (BEPU) methods have been developed and widely used for new reactor designs and power uprates of existing LWRs. Despite these successes, several aspects of CSAU have been criticized: (1) subjective judgment in the PIRT process; (2) high cost, due to heavy reliance on large experimental databases, many expert man-years of work, and very high computational overhead; (3) mixing of numerical errors with other uncertainties; (4) grid dependence, with the same numerical grids used for both scaled experiments and real-plant applications; and (5) user effects. Although a large amount of effort has gone into improving the CSAU methodology, these issues persist. With the effort to develop next-generation safety analysis codes, new opportunities appear to take advantage of new numerical methods, better physical models, and modern uncertainty quantification methods. Forward sensitivity analysis (FSA) directly solves the PDEs for parameter sensitivities (defined as the derivative of the physical solution with respect to any constant parameter). When parameter sensitivities are available in a new advanced system analysis code, CSAU could be significantly improved: (1) quantifying numerical errors: new codes that are fully implicit and of higher-order accuracy can run much faster, with numerical errors quantified by FSA; (2) quantitative PIRT (Q-PIRT) to reduce subjective judgment and improve efficiency: numerical errors are treated as special sensitivities alongside other physical uncertainties, and only parameters with large uncertainty effects on design criteria are considered; (3) greatly reducing the computational cost of uncertainty quantification by (a) choosing optimized time steps and spatial sizes; (b) using gradient information
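    The forward sensitivity idea in the abstract above can be sketched on a toy problem: differentiate the governing equation with respect to the parameter and integrate the resulting sensitivity equation alongside the state. The decay ODE, parameter values, and explicit Euler integrator below are illustrative assumptions, not the safety-code implementation the authors describe.

```python
# Forward sensitivity analysis (FSA) sketch: for dy/dt = f(y, p), the
# sensitivity s = dy/dp evolves as ds/dt = (df/dy)*s + df/dp. Here
# f(y, p) = -p*y, so df/dy = -p and df/dp = -y.
import math

def fsa_decay(y0, p, t_end, n_steps):
    """Integrate dy/dt = -p*y and its parameter sensitivity s = dy/dp
    with explicit Euler steps (s(0) = 0 since y0 is independent of p)."""
    dt = t_end / n_steps
    y, s = y0, 0.0
    for _ in range(n_steps):
        dy = -p * y
        ds = -y - p * s        # chain rule: d/dt (dy/dp)
        y += dt * dy
        s += dt * ds
    return y, s

y, s = fsa_decay(1.0, 0.5, 2.0, 100000)
# Analytic check: y(t) = e^{-p t}, so dy/dp = -t * e^{-p t}
print(abs(y - math.exp(-1.0)) < 1e-4, abs(s + 2.0 * math.exp(-1.0)) < 1e-4)
# → True True
```

    The same augmentation carries over to PDE codes, where the sensitivity field reuses the Jacobian already assembled by an implicit solver.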

  10. Uncertainty and sensitivity analysis of fission gas behavior in engineering-scale fuel modeling

    SciTech Connect (OSTI)

    Pastore, Giovanni; Swiler, L. P.; Hales, Jason D.; Novascone, Stephen R.; Perez, Danielle M.; Spencer, Benjamin W.; Luzzi, Lelio; Uffelen, Paul Van; Williamson, Richard L.

    2014-10-12

    The role of uncertainties in fission gas behavior calculations as part of engineering-scale nuclear fuel modeling is investigated using the BISON fuel performance code and a recently implemented physics-based model for the coupled fission gas release and swelling. Through the integration of BISON with the DAKOTA software, a sensitivity analysis of the results to selected model parameters is carried out based on UO2 single-pellet simulations covering different power regimes. The parameters are varied within ranges representative of the relative uncertainties and consistent with the information from the open literature. The study leads to an initial quantitative assessment of the uncertainty in fission gas behavior modeling with the parameter characterization presently available. Also, the relative importance of the single parameters is evaluated. Moreover, a sensitivity analysis is carried out based on simulations of a fuel rod irradiation experiment, pointing out a significant impact of the considered uncertainties on the calculated fission gas release and cladding diametral strain. The results of the study indicate that the commonly accepted deviation between calculated and measured fission gas release by a factor of 2 approximately corresponds to the inherent modeling uncertainty at high fission gas release. Nevertheless, higher deviations may be expected for values around 10% and lower. Implications are discussed in terms of directions of research for the improved modeling of fission gas behavior for engineering purposes.

  11. Uncertainty and sensitivity analysis of fission gas behavior in engineering-scale fuel modeling

    SciTech Connect (OSTI)

    G. Pastore; L.P. Swiler; J.D. Hales; S.R. Novascone; D.M. Perez; B.W. Spencer; L. Luzzi; P. Van Uffelen; R.L. Williamson

    2014-10-01

    The role of uncertainties in fission gas behavior calculations as part of engineering-scale nuclear fuel modeling is investigated using the BISON fuel performance code and a recently implemented physics-based model for the coupled fission gas release and swelling. Through the integration of BISON with the DAKOTA software, a sensitivity analysis of the results to selected model parameters is carried out based on UO2 single-pellet simulations covering different power regimes. The parameters are varied within ranges representative of the relative uncertainties and consistent with the information from the open literature. The study leads to an initial quantitative assessment of the uncertainty in fission gas behavior modeling with the parameter characterization presently available. Also, the relative importance of the single parameters is evaluated. Moreover, a sensitivity analysis is carried out based on simulations of a fuel rod irradiation experiment, pointing out a significant impact of the considered uncertainties on the calculated fission gas release and cladding diametral strain. The results of the study indicate that the commonly accepted deviation between calculated and measured fission gas release by a factor of 2 approximately corresponds to the inherent modeling uncertainty at high fission gas release. Nevertheless, higher deviations may be expected for values around 10% and lower. Implications are discussed in terms of directions of research for the improved modeling of fission gas behavior for engineering purposes.

  12. SSL Pricing and Efficacy Trend Analysis for Utility Program Planning

    SciTech Connect (OSTI)

    Tuenge, J. R.

    2013-10-01

    Report to help utilities and energy efficiency organizations forecast the order in which important SSL applications will become cost-effective and estimate when each "tipping point" will be reached. Includes performance trend analysis from DOE's LED Lighting Facts® and CALiPER programs plus cost analysis from various sources.

  13. Uncertainty Analysis

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


  14. SENSIT-2D: a two-dimensional cross-section sensitivity and uncertainty analysis code

    SciTech Connect (OSTI)

    Embrechts, M.J.

    1982-10-01

    SENSIT-2D is a computer program that calculates the sensitivity and/or uncertainty for an integral response (e.g., heating, radiation damage), obtained from the two-dimensional discrete ordinates transport code TRIDENT-CTR, to the cross sections and cross-section uncertainties. A design-sensitivity option allows one to calculate the integral response when the cross sections in certain regions are changed. A secondary-energy-distribution sensitivity- and uncertainty-analysis capability is included. SENSIT-2D incorporates all the essential features of TRIDENT-CTR (r,z geometry option, triangular mesh, nonorthogonal boundaries, group-dependent quadrature sets) and is aimed at the needs of the fusion community. The structure of SENSIT-2D is similar to the structure of the SENSIT code, a one-dimensional sensitivity- and uncertainty-analysis code. This report covers the theory used in SENSIT-2D, outlines the code structure, and gives detailed input specifications. Where appropriate, parts of the SENSIT report are taken over in this write-up. Two sample problems which illustrate the use of SENSIT-2D are explained.

  15. Uncertainty and sensitivity analysis of fission gas behavior in engineering-scale fuel modeling

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Pastore, Giovanni; Swiler, L. P.; Hales, Jason D.; Novascone, Stephen R.; Perez, Danielle M.; Spencer, Benjamin W.; Luzzi, Lelio; Uffelen, Paul Van; Williamson, Richard L.

    2014-10-12

    The role of uncertainties in fission gas behavior calculations as part of engineering-scale nuclear fuel modeling is investigated using the BISON fuel performance code and a recently implemented physics-based model for the coupled fission gas release and swelling. Through the integration of BISON with the DAKOTA software, a sensitivity analysis of the results to selected model parameters is carried out based on UO2 single-pellet simulations covering different power regimes. The parameters are varied within ranges representative of the relative uncertainties and consistent with the information from the open literature. The study leads to an initial quantitative assessment of the uncertainty in fission gas behavior modeling with the parameter characterization presently available. Also, the relative importance of the single parameters is evaluated. Moreover, a sensitivity analysis is carried out based on simulations of a fuel rod irradiation experiment, pointing out a significant impact of the considered uncertainties on the calculated fission gas release and cladding diametral strain. The results of the study indicate that the commonly accepted deviation between calculated and measured fission gas release by a factor of 2 approximately corresponds to the inherent modeling uncertainty at high fission gas release. Nevertheless, higher deviations may be expected for values around 10% and lower. Implications are discussed in terms of directions of research for the improved modeling of fission gas behavior for engineering purposes.
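    The BISON/DAKOTA studies above rest on stratified parameter sampling. A schematic Latin hypercube sampler is sketched below: each uncertain parameter's range is split into N equal-probability strata, one draw is taken per stratum, and the strata are randomly paired across parameters. The two-parameter bounds are invented stand-ins, not the fuel-model ranges from the papers.

```python
# Minimal Latin hypercube sampling (LHS) in the spirit of a DAKOTA study:
# every stratum of every parameter is sampled exactly once, giving better
# space coverage than plain random sampling for the same budget.
import random

def latin_hypercube(n_samples, bounds, seed=0):
    """Return n_samples points; bounds is a list of (lo, hi) per parameter."""
    rng = random.Random(seed)
    columns = []
    for lo, hi in bounds:
        # One uniform draw inside each of the n equal strata of [0, 1).
        strata = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(strata)                 # decorrelate pairing across params
        columns.append([lo + u * (hi - lo) for u in strata])
    return list(zip(*columns))              # rows = sample points

samples = latin_hypercube(8, [(0.5, 2.0), (10.0, 20.0)])
print(len(samples))  # → 8
# Every stratum of the first parameter is hit exactly once:
first = sorted(s[0] for s in samples)
print(all(0.5 + i * (1.5 / 8) <= v < 0.5 + (i + 1) * (1.5 / 8)
          for i, v in enumerate(first)))  # → True
```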

  16. SENSIT: a cross-section and design sensitivity and uncertainty analysis code. [In FORTRAN for CDC-7600, IBM 360]

    SciTech Connect (OSTI)

    Gerstl, S.A.W.

    1980-01-01

    SENSIT computes the sensitivity and uncertainty of a calculated integral response (such as a dose rate) due to input cross sections and their uncertainties. Sensitivity profiles are computed for neutron and gamma-ray reaction cross sections of standard multigroup cross section sets and for secondary energy distributions (SEDs) of multigroup scattering matrices. In the design sensitivity mode, SENSIT computes changes in an integral response due to design changes and gives the appropriate sensitivity coefficients. Cross section uncertainty analyses are performed for three types of input data uncertainties: cross-section covariance matrices for pairs of multigroup reaction cross sections, spectral shape uncertainty parameters for secondary energy distributions (integral SED uncertainties), and covariance matrices for energy-dependent response functions. For all three types of data uncertainties SENSIT computes the resulting variance and estimated standard deviation in an integral response of interest, on the basis of generalized perturbation theory. SENSIT attempts to be more comprehensive than earlier sensitivity analysis codes, such as SWANLAKE.

  17. Quantification of margins and uncertainty for risk-informed decision analysis.

    SciTech Connect (OSTI)

    Alvin, Kenneth Fredrick

    2010-09-01

    QMU stands for 'Quantification of Margins and Uncertainties'. QMU is a basic framework for consistency in integrating simulation, data, and/or subject matter expertise to provide input into a risk-informed decision-making process. QMU is being applied to a wide range of NNSA stockpile issues, from performance to safety. The implementation of QMU varies with lab and application focus. The Advanced Simulation and Computing (ASC) Program develops validated computational simulation tools to be applied in the context of QMU. The completeness aspect of QMU can benefit from the structured methodology and discipline of quantitative risk assessment (QRA) and probabilistic risk assessment (PRA). In characterizing uncertainties it is important to pay attention to the distinction between those arising from incomplete knowledge ('epistemic' or systematic) and those arising from device-to-device variation ('aleatory' or random). The national security labs should investigate the utility of a probability-of-frequency (PoF) approach in presenting uncertainties in the stockpile. A QMU methodology is connected if the interactions between failure modes are included. The design labs should continue to focus attention on quantifying uncertainties that arise from epistemic sources such as poorly modeled phenomena, numerical errors, coding errors, and systematic uncertainties in experiment. The NNSA and design labs should ensure that the certification plan for any RRW is supported by strong, timely peer review and by ongoing, transparent QMU-based documentation and analysis in order to permit a confidence level necessary for eventual certification.

  18. Regional price targets appropriate for advanced coal extraction. [Forecasting to 1985 and 2000; USA; Regional analysis]

    SciTech Connect (OSTI)

    Terasawa, K.L.; Whipple, D.W.

    1980-12-01

    The object of the study is to provide a methodology for predicting coal prices in regional markets for the target time frames 1985 and 2000 that could subsequently be used to guide the development of an advanced coal extraction system. The model constructed for the study is a supply and demand model that focuses on underground mining, since the advanced technology is expected to be developed for these reserves by the target years. The supply side of the model is based on coal reserve data generated by Energy and Environmental Analysis, Inc. (EEA). Given this data and the cost of operating a mine (data from US Department of Energy and Bureau of Mines), the Minimum Acceptable Selling Price (MASP) is obtained. The MASP is defined as the smallest price that would induce the producer to bring the mine into production, and is sensitive to the current technology and to assumptions concerning miner productivity. Based on this information, market supply curves can then be generated. On the demand side of the model, demand by region is calculated based on an EEA methodology that emphasizes demand by electric utilities and demand by industry. The demand and supply curves are then used to obtain the price targets. This last step is accomplished by allocating the demands among the suppliers so that the combined cost of producing and transporting coal is minimized.

  19. An Analysis of the Price Elasticity of Demand for Household Appliances

    SciTech Connect (OSTI)

    Fujita, Kimberly; Dale, Larry; Fujita, K. Sydny

    2008-01-25

    This report summarizes our study of the price elasticity of demand for home appliances, including refrigerators, clothes washers, and dishwashers. In the context of increasingly stringent appliance standards, we are interested in what kind of impact the increased manufacturing costs caused by higher efficiency requirements will have on appliance sales. We begin with a review of existing economics literature describing the impact of economic variables on the sale of durable goods. We then describe the market for home appliances and changes in this market over the past 20 years, performing regression analysis on the shipments of home appliances and relevant economic variables, including changes to operating cost and household income. Based on our analysis, we conclude that the demand for home appliances is price inelastic.
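    The elasticity estimate in the abstract above comes from regression analysis; the standard form is a log-log regression, in which the OLS slope of ln(quantity) on ln(price) is the elasticity, and an absolute value below 1 is what "price inelastic" means. A minimal sketch with made-up data follows; the report's actual shipment data and model are not reproduced.

```python
# Illustrative log-log OLS for price elasticity of demand. The synthetic
# demand curve q = 100 * p**(-0.4) has a constant elasticity of -0.4,
# which the regression slope recovers.
import math

def elasticity(prices, quantities):
    """OLS slope of ln(q) on ln(p): the (constant) price elasticity."""
    xs = [math.log(p) for p in prices]
    ys = [math.log(q) for q in quantities]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

prices = [200.0, 300.0, 400.0, 500.0, 600.0]          # hypothetical prices
quantities = [100.0 * p ** -0.4 for p in prices]       # inelastic demand
print(round(elasticity(prices, quantities), 3))  # → -0.4
```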

  20. Uncertainty and sensitivity analysis of early exposure results with the MACCS Reactor Accident Consequence Model

    SciTech Connect (OSTI)

    Helton, J.C.; Johnson, J.D.; McKay, M.D.; Shiver, A.W.; Sprung, J.L.

    1995-01-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the early health effects associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 34 imprecisely known input variables on the following reactor accident consequences are studied: number of early fatalities, number of cases of prodromal vomiting, population dose within 10 mi of the reactor, population dose within 1000 mi of the reactor, individual early fatality probability within 1 mi of the reactor, and maximum early fatality distance. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: scaling factor for horizontal dispersion, dry deposition velocity, inhalation protection factor for nonevacuees, groundshine shielding factor for nonevacuees, early fatality hazard function alpha value for bone marrow exposure, and scaling factor for vertical dispersion.
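    The MACCS study above samples imprecisely known inputs and ranks them by their contribution to output uncertainty. As a toy stand-in (a plain correlation ranking rather than the paper's partial correlation and stepwise regression, and an invented three-input surrogate rather than MACCS itself), the workflow looks like this:

```python
# Toy input-importance ranking: sample inputs, evaluate a surrogate
# "consequence" function, rank inputs by |correlation| with the output.
import random

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

rng = random.Random(42)
samples = [(rng.random(), rng.random(), rng.random()) for _ in range(2000)]
# Surrogate consequence: driven strongly by input 0, weakly by 1, not by 2.
outputs = [5.0 * a + 0.5 * b + rng.gauss(0, 0.1) for a, b, _ in samples]

ranking = sorted(range(3),
                 key=lambda i: -abs(pearson([s[i] for s in samples], outputs)))
print(ranking[0])  # → 0, the dominant input
```

    Real studies replace the plain correlation with partial correlation or stepwise regression to control for correlated inputs.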

  1. Uncertainty analysis of an IGCC system with single-stage entrained-flow gasifier

    SciTech Connect (OSTI)

    Shastri, Y.; Diwekar, U.; Zitney, S.

    2008-01-01

    Integrated Gasification Combined Cycle (IGCC) systems using coal gasification are an attractive option for future energy plants. Consequently, understanding the system operation and optimizing gasifier performance in the presence of uncertain operating conditions is essential to extract the maximum benefits from the system. This work focuses on conducting such a study using an IGCC process simulation and a high-fidelity gasifier simulation coupled with stochastic simulation and multi-objective optimization capabilities. Coal gasifiers are the necessary basis of IGCC systems, and hence effective modeling and uncertainty analysis of the gasification process constitutes an important element of overall IGCC process design and operation. In this work, an Aspen Plus® steady-state process model of an IGCC system with carbon capture enables us to conduct simulation studies so that the effect of gasification variability on the whole process can be understood. The IGCC plant design consists of a single-stage entrained-flow gasifier, a physical solvent-based acid gas removal process for carbon capture, two model-7FB combustion turbine generators, two heat recovery steam generators, and one steam turbine generator in a multi-shaft 2x2x1 configuration. In the Aspen Plus process simulation, the gasifier is represented as a simplified lumped-parameter, restricted-equilibrium reactor model. In this work, we also make use of a distributed-parameter FLUENT® computational fluid dynamics (CFD) model to characterize the uncertainty for the entrained-flow gasifier. The CFD-based gasifier model is much more comprehensive, predictive, and hence better suited to understand the effects of uncertainty. The possible uncertain parameters of the gasifier model are identified. These include input coal composition as well as mass flow rates of coal, slurry water, and oxidant. Using a selected number of random (Monte Carlo) samples for the different parameters, the CFD model is

  2. Use of Quantitative Uncertainty Analysis to Support M&V Decisions in ESPCs

    SciTech Connect (OSTI)

    Mathew, Paul A.; Koehling, Erick; Kumar, Satish

    2005-05-11

    Measurement and Verification (M&V) is a critical element of an Energy Savings Performance Contract (ESPC) - without M&V, there is no way to confirm that the projected savings in an ESPC are in fact being realized. For any given energy conservation measure in an ESPC, there are usually several M&V choices, which will vary in terms of measurement uncertainty, cost, and technical feasibility. Typically, M&V decisions are made almost solely based on engineering judgment and experience, with little, if any, quantitative uncertainty analysis (QUA). This paper describes the results of a pilot project initiated by the Department of Energy's Federal Energy Management Program to explore the use of Monte-Carlo simulation to assess savings uncertainty and thereby augment the M&V decision-making process in ESPCs. The intent was to use QUA selectively in combination with heuristic knowledge, in order to obtain quantitative estimates of the savings uncertainty without the burden of a comprehensive "bottom-up" QUA. This approach was used to analyze the savings uncertainty in an ESPC for a large federal agency. The QUA was seamlessly integrated into the ESPC development process, and the incremental effort was relatively small with user-friendly tools that are commercially available. As the case study illustrates, in some cases the QUA simply confirms intuitive or qualitative information, while in other cases it provides insight that suggests revisiting the M&V plan. The case study also showed that M&V decisions should be informed by portfolio risk diversification. By providing quantitative uncertainty information, QUA can effectively augment the M&V decision-making process as well as the overall ESPC financial analysis.
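    The core Monte-Carlo idea in the M&V paper above is to propagate measurement uncertainty in baseline and post-retrofit energy use into a distribution of savings. A minimal sketch follows; the normal error model, consumption figures, and 2% metering error are hypothetical, not drawn from the ESPC case study.

```python
# Monte-Carlo savings-uncertainty sketch: savings = baseline - post,
# with each term metered subject to (assumed) normally distributed error.
import random
import statistics

def savings_distribution(n, baseline_kwh, post_kwh, meter_rel_err, seed=1):
    """Sample annual savings under a Gaussian metering-error assumption."""
    rng = random.Random(seed)
    draws = []
    for _ in range(n):
        base = rng.gauss(baseline_kwh, meter_rel_err * baseline_kwh)
        post = rng.gauss(post_kwh, meter_rel_err * post_kwh)
        draws.append(base - post)
    return draws

draws = savings_distribution(20000, 1_000_000, 800_000, 0.02)
mean = statistics.mean(draws)
sd = statistics.stdev(draws)
# Mean savings ~200,000 kWh; sd ~sqrt(20000^2 + 16000^2) ~= 25,600 kWh
print(190_000 < mean < 210_000, 20_000 < sd < 32_000)  # → True True
```

    The resulting spread, not just the point estimate, is what informs the choice among M&V options of differing rigor and cost.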

  3. Effect of soil property uncertainties on permafrost thaw projections: a calibration-constrained analysis: Modeling Archive

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    E.T. Coon; C.J. Wilson; S.L. Painter; V.E. Romanovsky; D.R. Harp; A.L. Atchley; J.C. Rowland

    2016-02-02

    This dataset contains an ensemble of thermal-hydro soil parameters including porosity, thermal conductivity, thermal conductivity shape parameters, and residual saturation of peat and mineral soil. The ensemble was generated using a Null-Space Monte Carlo analysis of parameter uncertainty based on a calibration to soil temperatures collected at the Barrow Environmental Observatory site by the NGEE team. The micro-topography of ice wedge polygons present at the site is included in the analysis using three 1D column models to represent polygon center, rim and trough features. The Arctic Terrestrial Simulator (ATS) was used in the calibration to model multiphase thermal and hydrological processes in the subsurface.

  4. The price of electricity from private power producers: Stage 2, Expansion of sample and preliminary statistical analysis

    SciTech Connect (OSTI)

    Comnes, G.A.; Belden, T.N.; Kahn, E.P.

    1995-02-01

    The market for long-term bulk power is becoming increasingly competitive and mature. Given the number of privately developed power projects completed or under way in the US, it is now possible to begin evaluating the market's performance by analyzing its revealed prices. Using a consistent method, this paper presents levelized contract prices for a sample of privately developed US generation properties. The sample includes 26 projects with a total capacity of 6,354 MW. Contracts are described in terms of their choice of technology, choice of fuel, treatment of fuel price risk, geographic location, dispatchability, expected dispatch niche, and size. The contract price analysis shows that gas technologies clearly stand out as the most attractive. At an 80% capacity factor, coal projects have an average 20-year levelized price of $0.092/kWh, whereas natural gas combined cycle and/or cogeneration projects have an average price of $0.069/kWh. Within each technology type subsample, however, there is considerable variation. Prices for natural gas combustion turbines and one wind project are also presented. A preliminary statistical analysis is conducted to understand the relationship between price and four categories of explanatory factors, including product heterogeneity, geographic heterogeneity, economic and technological change, and other buyer attributes (including avoided costs). Because of residual price variation, we are unable to accept the hypothesis that electricity is a homogeneous product. Instead, the analysis indicates that buyer value still plays an important role in the determination of price for competitively-acquired electricity.
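    The 20-year levelized prices quoted above are the constant price whose present value equals that of the contract's actual price stream. A hedged sketch follows: the discount rate and escalating price path are invented for illustration (the calculation also implicitly assumes constant annual energy delivery), and the paper's figures come from actual contract terms.

```python
# Levelized price sketch: present-value-weighted constant price over the
# contract term. Discount rate (8%) and price paths are illustrative only.

def levelized_price(annual_prices, discount_rate):
    """Constant price with the same PV as the given annual price stream."""
    pv_prices = sum(p / (1 + discount_rate) ** t
                    for t, p in enumerate(annual_prices, start=1))
    pv_unit = sum(1 / (1 + discount_rate) ** t
                  for t in range(1, len(annual_prices) + 1))
    return pv_prices / pv_unit

# A flat price stream levelizes to itself:
flat = levelized_price([0.069] * 20, 0.08)
print(round(flat, 3))  # → 0.069
# A 3%/yr escalating stream levelizes to a value strictly between its
# first- and last-year prices:
esc = levelized_price([0.05 * 1.03 ** t for t in range(20)], 0.08)
print(0.05 < esc < 0.05 * 1.03 ** 19)  # → True
```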

  5. Idealization, uncertainty and heterogeneity : game frameworks defined with formal concept analysis.

    SciTech Connect (OSTI)

    Racovitan, M. T.; Sallach, D. L.; Decision and Information Sciences; Northern Illinois Univ.

    2006-01-01

    The present study begins with Formal Concept Analysis, and undertakes to demonstrate how a succession of game frameworks may, by design, address increasingly complex and interesting social phenomena. We develop a series of multi-agent exchange games, each of which incorporates an additional dimension of complexity. All games are based on coalition patterns in exchanges where diverse cultural markers provide a basis for trust and reciprocity. The first game is characterized by an idealized concept of trust. A second game framework introduces uncertainty regarding the reciprocity of prospective transactions. A third game framework retains idealized trust and uncertainty, and adds additional agent heterogeneity. Cultural markers are not equally salient in conferring or withholding trust, and the result is a richer transactional process.

  6. AEP Ohio gridSMART Demonstration Project Real-Time Pricing Demonstration Analysis

    SciTech Connect (OSTI)

    Widergren, Steven E.; Subbarao, Krishnappa; Fuller, Jason C.; Chassin, David P.; Somani, Abhishek; Marinovici, Maria C.; Hammerstrom, Janelle L.

    2014-02-01

This report contributes initial findings from an analysis of significant aspects of the gridSMART® Real-Time Pricing (RTP) – Double Auction demonstration project. Over the course of four years, Pacific Northwest National Laboratory (PNNL) worked with AEP Ohio and Battelle Memorial Institute to design, build, and operate an innovative system that engages residential consumers and their end-use resources in a participatory approach to electric system operations, an incentive-based approach that promises greater efficiency under normal operating conditions and greater flexibility to react under system stress. The material in this report supplements the findings documented by AEP Ohio in the main body of the gridSMART report. It delves into three main areas: impacts on system operations, impacts on households, and observations about the sensitivity of load to price changes.

  7. Uncertainty Studies of Real Anode Surface Area in Computational Analysis for Molten Salt Electrorefining

    SciTech Connect (OSTI)

    Sungyeol Choi; Jaeyeong Park; Robert O. Hoover; Supathorn Phongikaroon; Michael F. Simpson; Kwang-Rag Kim; Il Soon Hwang

    2011-09-01

This study examines how much the cell potential changes under five different assumptions for the real anode surface area. Determining the real anode surface area is a significant issue that must be resolved for precise modeling of molten salt electrorefining. Based on a three-dimensional electrorefining model, calculated cell potentials are compared with the experimental cell potential variation over 80 hours of operation of the Mark-IV electrorefiner with driver fuel from the Experimental Breeder Reactor II. With an appropriate choice of model for the real anode surface area, we achieved good agreement with the overall trend of the experimental data, although local inconsistencies between the theoretical calculations and experimental observations remain. In addition, the results were validated against and compared with two-dimensional results to identify uncertainty factors that must be considered further in a computational electrorefining analysis. These uncertainty factors include material properties, heterogeneous material distribution, surface roughness, and current efficiency. Zirconium's abundance and complex behavior have a greater impact on uncertainty toward the latter period of electrorefining for a given batch of fuel. The benchmark results indicate that anode materials dissolve from both the axial and radial directions, at least for low-burnup metallic fuels, after the active liquid sodium bonding has dissolved.

  8. Probabilistic accident consequence uncertainty analysis -- Late health effects uncertainty assessment. Volume 2: Appendices

    SciTech Connect (OSTI)

    Little, M.P.; Muirhead, C.R.; Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Harper, F.T.; Hora, S.C.

    1997-12-01

The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the expert panel on late health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  9. SWEPP PAN assay system uncertainty analysis: Passive mode measurements of graphite waste

    SciTech Connect (OSTI)

    Blackwood, L.G.; Harker, Y.D.; Meachum, T.R.; Yoon, Woo Y.

    1997-07-01

    The Idaho National Engineering and Environmental Laboratory is being used as a temporary storage facility for transuranic waste generated by the U.S. Nuclear Weapons program at the Rocky Flats Plant (RFP) in Golden, Colorado. Currently, there is a large effort in progress to prepare to ship this waste to the Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico. In order to meet the TRU Waste Characterization Quality Assurance Program Plan nondestructive assay compliance requirements and quality assurance objectives, it is necessary to determine the total uncertainty of the radioassay results produced by the Stored Waste Examination Pilot Plant (SWEPP) Passive Active Neutron (PAN) radioassay system. To this end a modified statistical sampling and verification approach has been developed to determine the total uncertainty of a PAN measurement. In this approach the total performance of the PAN nondestructive assay system is simulated using computer models of the assay system and the resultant output is compared with the known input to assess the total uncertainty. This paper is one of a series of reports quantifying the results of the uncertainty analysis of the PAN system measurements for specific waste types and measurement modes. In particular this report covers passive mode measurements of weapons grade plutonium-contaminated graphite molds contained in 208 liter drums (waste code 300). The validity of the simulation approach is verified by comparing simulated output against results from measurements using known plutonium sources and a surrogate graphite waste form drum. For actual graphite waste form conditions, a set of 50 cases covering a statistical sampling of the conditions exhibited in graphite wastes was compiled using a Latin hypercube statistical sampling approach.

  10. Uncertainty and sensitivity analysis of food pathway results with the MACCS Reactor Accident Consequence Model

    SciTech Connect (OSTI)

    Helton, J.C.; Johnson, J.D.; Rollstin, J.A.; Shiver, A.W.; Sprung, J.L.

    1995-01-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the food pathways associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 87 imprecisely-known input variables on the following reactor accident consequences are studied: crop growing season dose, crop long-term dose, milk growing season dose, total food pathways dose, total ingestion pathways dose, total long-term pathways dose, area dependent cost, crop disposal cost, milk disposal cost, condemnation area, crop disposal area and milk disposal area. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: fraction of cesium deposition on grain fields that is retained on plant surfaces and transferred directly to grain, maximum allowable ground concentrations of Cs-137 and Sr-90 for production of crops, ground concentrations of Cs-134, Cs-137 and I-131 at which the disposal of milk will be initiated due to accidents that occur during the growing season, ground concentrations of Cs-134, I-131 and Sr-90 at which the disposal of crops will be initiated due to accidents that occur during the growing season, rate of depletion of Cs-137 and Sr-90 from the root zone, transfer of Sr-90 from soil to legumes, transfer of Cs-137 from soil to pasture, transfer of cesium from animal feed to meat, and the transfer of cesium, iodine and strontium from animal feed to milk.
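The machinery used in this study and the companion chronic-exposure study (Latin hypercube sampling of imprecisely known inputs, then regression-based ranking of their contributions) can be sketched on a toy model. The response function and its coefficients below are invented purely for illustration; standardized regression coefficients stand in here for the full stepwise regression and partial correlation analysis:

```python
import numpy as np

rng = np.random.default_rng(0)

def latin_hypercube(n, d, rng):
    """LHS on [0,1]^d: each variable gets exactly one sample per 1/n stratum."""
    return np.column_stack([(rng.permutation(n) + rng.random(n)) / n
                            for _ in range(d)])

# Toy response with a known importance ordering (coefficients are invented):
# x0 dominates, x1 matters a little, x2 is nearly irrelevant.
X = latin_hypercube(200, 3, rng)
y = 5.0 * X[:, 0] + 1.0 * X[:, 1] + 0.1 * X[:, 2] + 0.05 * rng.standard_normal(200)

# Standardized regression coefficients rank each input's contribution to
# output uncertainty, the role regression analysis plays in the study.
design = np.column_stack([np.ones(len(y)), X])
beta = np.linalg.lstsq(design, y, rcond=None)[0][1:]
src = beta * X.std(axis=0) / y.std()
```

For strongly nonlinear consequence models, rank-transformed variants (partial rank correlation) are the usual substitute for the raw coefficients shown here.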

  11. Uncertainty and sensitivity analysis of chronic exposure results with the MACCS reactor accident consequence model

    SciTech Connect (OSTI)

    Helton, J.C.; Johnson, J.D.; Rollstin, J.A.; Shiver, A.W.; Sprung, J.L.

    1995-01-01

Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the chronic exposure pathways associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 75 imprecisely known input variables on the following reactor accident consequences are studied: crop growing season dose, crop long-term dose, water ingestion dose, milk growing season dose, long-term groundshine dose, long-term inhalation dose, total food pathways dose, total ingestion pathways dose, total long-term pathways dose, total latent cancer fatalities, area-dependent cost, crop disposal cost, milk disposal cost, population-dependent cost, total economic cost, condemnation area, condemnation population, crop disposal area and milk disposal area. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: dry deposition velocity, transfer of cesium from animal feed to milk, transfer of cesium from animal feed to meat, ground concentration of Cs-134 at which the disposal of milk products will be initiated, transfer of Sr-90 from soil to legumes, maximum allowable ground concentration of Sr-90 for production of crops, fraction of cesium entering surface water that is consumed in drinking water, groundshine shielding factor, scale factor defining resuspension, dose reduction associated with decontamination, and ground concentration of I-131 at which disposal of crops will be initiated due to accidents that occur during the growing season.

  12. Grid and basis adaptive polynomial chaos techniques for sensitivity and uncertainty analysis

    SciTech Connect (OSTI)

Perkó, Zoltán; Gilli, Luca; Lathouwers, Danny; Kloosterman, Jan Leen

    2014-03-01

The demand for accurate and computationally affordable sensitivity and uncertainty techniques is constantly on the rise and has become especially pressing in the nuclear field with the shift to Best Estimate Plus Uncertainty methodologies in the licensing of nuclear installations. Besides traditional, already well-developed methods such as first-order perturbation theory or Monte Carlo sampling, Polynomial Chaos Expansion (PCE) has received growing emphasis in recent years due to its simple application and good performance. This paper presents new developments in the research done at TU Delft on such Polynomial Chaos (PC) techniques. Our work is focused on the Non-Intrusive Spectral Projection (NISP) approach and adaptive methods for building the PCE of responses of interest. Recent efforts resulted in a new adaptive sparse grid algorithm designed for estimating the PC coefficients. The algorithm is based on Gerstner's procedure for calculating multi-dimensional integrals but proves to be computationally significantly cheaper, while retaining a similar accuracy to the original method. More importantly, the issue of basis adaptivity has been investigated and two techniques have been implemented for constructing the sparse PCE of quantities of interest. Not using the traditional full PC basis set leads to a further reduction in computational time, since the high-order grids necessary for accurately estimating the near-zero expansion coefficients of polynomial basis vectors not needed in the PCE can be excluded from the calculation. Moreover, the sparse PC representation of the response is easier to handle when used for sensitivity analysis or uncertainty propagation, due to the smaller number of basis vectors. The developed grid and basis adaptive methods have been implemented in Matlab as the Fully Adaptive Non-Intrusive Spectral Projection (FANISP) algorithm and were tested on four analytical problems. These show consistently good performance both in
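The core NISP idea the abstract describes (estimating PC coefficients by quadrature rather than by modifying the solver) can be shown in one dimension. This is a generic sketch, not the FANISP algorithm: it projects a response of a standard normal input onto probabilists' Hermite polynomials with Gauss quadrature, where FANISP instead builds adaptive sparse grids and basis sets for many dimensions.

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He

def nisp_pce(f, order, n_quad):
    """Non-intrusive spectral projection of f(X), X ~ N(0,1), onto
    probabilists' Hermite polynomials: f(x) ~ sum_n c_n He_n(x)."""
    x, w = He.hermegauss(n_quad)       # Gauss nodes/weights for weight exp(-x^2/2)
    w = w / np.sqrt(2.0 * np.pi)       # renormalize to a standard-normal expectation
    fx = f(x)
    return np.array([np.sum(w * fx * He.hermeval(x, [0.0] * n + [1.0]))
                     / math.factorial(n)            # E[He_n^2] = n!
                     for n in range(order + 1)])

# Example response f(x) = x^2, whose exact expansion is He_0 + He_2.
c = nisp_pce(lambda x: x**2, order=4, n_quad=8)
mean = c[0]                                         # PCE mean = c_0
var = sum(c[n]**2 * math.factorial(n) for n in range(1, len(c)))
```

Once the coefficients are in hand, means, variances, and variance-based sensitivities fall out algebraically, which is why the sparse representation is convenient for uncertainty propagation.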

  13. Effect of soil property uncertainties on permafrost thaw projections: a calibration-constrained analysis: Modeling Archive

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    J.C. Rowland; D.R. Harp; C.J. Wilson; A.L. Atchley; V.E. Romanovsky; E.T. Coon; S.L. Painter

    2016-02-02

    This Modeling Archive is in support of an NGEE Arctic publication available at doi:10.5194/tc-10-341-2016. This dataset contains an ensemble of thermal-hydro soil parameters including porosity, thermal conductivity, thermal conductivity shape parameters, and residual saturation of peat and mineral soil. The ensemble was generated using a Null-Space Monte Carlo analysis of parameter uncertainty based on a calibration to soil temperatures collected at the Barrow Environmental Observatory site by the NGEE team. The micro-topography of ice wedge polygons present at the site is included in the analysis using three 1D column models to represent polygon center, rim and trough features. The Arctic Terrestrial Simulator (ATS) was used in the calibration to model multiphase thermal and hydrological processes in the subsurface.

  14. Development code for sensitivity and uncertainty analysis of input on the MCNPX for neutronic calculation in PWR core

    SciTech Connect (OSTI)

Hartini, Entin; Andiwijayakusuma, Dinan

    2014-09-30

This research develops a code for uncertainty analysis based on a statistical approach to assessing uncertain input parameters. For the fuel burn-up calculation, uncertainty analysis was performed on the input parameters fuel density, coolant density, and fuel temperature. The calculation is performed over the irradiation period using the Monte Carlo N-Particle (MCNPX) transport code. The uncertainty method is based on probability density functions. The code was developed as a Python script coupled to MCNPX for the criticality and burn-up calculations. The simulation models the PWR core geometry in MCNPX at a power of 54 MW with UO2 pellet fuel, using the continuous-energy cross-section library ENDF/B-VI. MCNPX requires nuclear data in ACE format; interfaces were developed to generate ACE-format data from ENDF via NJOY processing for temperature changes over a given range.

  15. TRITIUM UNCERTAINTY ANALYSIS FOR SURFACE WATER SAMPLES AT THE SAVANNAH RIVER SITE

    SciTech Connect (OSTI)

    Atkinson, R.

    2012-07-31

    Radiochemical analyses of surface water samples, in the framework of Environmental Monitoring, have associated uncertainties for the radioisotopic results reported. These uncertainty analyses pertain to the tritium results from surface water samples collected at five locations on the Savannah River near the U.S. Department of Energy's Savannah River Site (SRS). Uncertainties can result from the field-sampling routine, can be incurred during transport due to the physical properties of the sample, from equipment limitations, and from the measurement instrumentation used. The uncertainty reported by the SRS in their Annual Site Environmental Report currently considers only the counting uncertainty in the measurements, which is the standard reporting protocol for radioanalytical chemistry results. The focus of this work is to provide an overview of all uncertainty components associated with SRS tritium measurements, estimate the total uncertainty according to ISO 17025, and to propose additional experiments to verify some of the estimated uncertainties. The main uncertainty components discovered and investigated in this paper are tritium absorption or desorption in the sample container, HTO/H{sub 2}O isotopic effect during distillation, pipette volume, and tritium standard uncertainty. The goal is to quantify these uncertainties and to establish a combined uncertainty in order to increase the scientific depth of the SRS Annual Site Environmental Report.
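The ISO 17025 / GUM-style combination the abstract aims at is a root-sum-square of independent standard uncertainty components. The component names mirror those listed in the abstract, but the numeric values below are hypothetical placeholders, not the report's actual uncertainty budget:

```python
import math

# Hypothetical relative (1-sigma) uncertainty components, for illustration only.
components = {
    "counting statistics": 0.032,
    "tritium standard calibration": 0.015,
    "pipette volume": 0.008,
    "container absorption/desorption": 0.010,
    "HTO/H2O distillation isotopic effect": 0.012,
}

# GUM combination: independent relative uncertainties add in quadrature.
u_combined = math.sqrt(sum(u ** 2 for u in components.values()))
U_expanded = 2 * u_combined   # coverage factor k = 2, roughly 95% confidence
```

The point the paper makes falls straight out of the arithmetic: counting statistics alone understate the combined uncertainty, since every additional component can only increase the quadrature sum.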

  16. Hydropower generation management under uncertainty via scenario analysis and parallel computation

    SciTech Connect (OSTI)

    Escudero, L.F.; Garcia, C.; Fuente, J.L. de la; Prieto, F.J.

    1996-05-01

    The authors present a modeling framework for the robust solution of hydroelectric power management problems with uncertainty in the values of the water inflows and outflows. A deterministic treatment of the problem provides unsatisfactory results, except for very short time horizons. The authors describe a model based on scenario analysis that allows a satisfactory treatment of uncertainty in the model data for medium and long-term planning problems. Their approach results in a huge model with a network submodel per scenario plus coupling constraints. The size of the problem and the structure of the constraints are adequate for the use of decomposition techniques and parallel computation tools. The authors present computational results for both sequential and parallel implementation versions of the codes, running on a cluster of workstations. The codes have been tested on data obtained from the reservoir network of Iberdrola, a power utility owning 50% of the total installed hydroelectric capacity of Spain, and generating 40% of the total energy demand.

  17. Hydropower generation management under uncertainty via scenario analysis and parallel computation

    SciTech Connect (OSTI)

    Escudero, L.F.; Garcia, C.; Fuente, J.L. de la; Prieto, F.J.

    1995-12-31

The authors present a modeling framework for the robust solution of hydroelectric power management problems with uncertainty in the values of the water inflows and outflows. A deterministic treatment of the problem provides unsatisfactory results, except for very short time horizons. The authors describe a model based on scenario analysis that allows a satisfactory treatment of uncertainty in the model data for medium and long-term planning problems. This approach results in a huge model with a network submodel per scenario plus coupling constraints. The size of the problem and the structure of the constraints are adequate for the use of decomposition techniques and parallel computation tools. The authors present computational results for both sequential and parallel implementation versions of the codes, running on a cluster of workstations. The codes have been tested on data obtained from the reservoir network of Iberdrola, a power utility owning 50% of the total installed hydroelectric capacity of Spain, and generating 40% of the total energy demand.

  18. Report on INL Activities for Uncertainty Reduction Analysis of FY11

    SciTech Connect (OSTI)

G. Palmiotti; H. Hiruta; M. Salvatores

    2011-09-01

This report presents the status of activities performed at INL under the ARC Work Package on 'Uncertainty Reduction Analyses,' whose main goal is the reduction of uncertainties, associated with nuclear data, in the neutronic integral parameters of interest for the design of advanced fast reactors under consideration by the ARC program. First, an analysis of experiments was carried out. For both JOYO (the first Japanese fast reactor) and ZPPR-9 (a large zero-power plutonium-fueled experiment performed at ANL-W in Idaho), the performance of ENDF/B-VII.0 is quite satisfactory, except for the sodium-void configurations of ZPPR-9, for which the approximations of the modeling have to be taken into account. In fact, when a more detailed model is used (calculations performed at ANL in a companion WP), more reasonable results are obtained. A large effort was devoted to the analysis of the irradiation experiments PROFIL-1 and -2 and TRAPU, performed at the French fast reactor PHENIX. For these experiments a pre-release of the ENDF/B-VII.1 cross-section files was also used, in order to provide validation feedback to the CSEWG nuclear data evaluation community. In the PROFIL experiments, improvements can be observed for the ENDF/B-VII.1 capture data in 238Pu, 241Am, 244Cm, 97Mo, 151Sm, and 153Eu, and for 240Pu(n,2n). On the other hand, the 240,242Pu, 95Mo, 133Cs, and 145Nd capture C/E results are worse. For the major actinides 235U and especially 239Pu, the capture C/E's are underestimated. For fission products, 105,106Pd, 143,144Nd, and 147,149Sm are significantly underestimated, while 101Ru and 151Sm are overestimated. Other C/E deviations from unity are within the combined experimental and calculated statistical uncertainty. From the TRAPU analysis, the major improvement is in the predicted 243Cm build-up, presumably due to an improved 242Cm capture evaluation. The COSMO experiment was also analyzed in order to provide useful feedback on fission cross sections. It was found out that ENDF

  19. Identifying the Oil Price-Macroeconomy Relationship: An Empirical Mode Decomposition Analysis of U.S. Data

    SciTech Connect (OSTI)

    Oladosu, Gbadebo A

    2009-01-01

    This work applies the empirical mode decomposition (EMD) method to data on real quarterly oil price (West Texas Intermediate - WTI) and U.S. gross domestic product (GDP). This relatively new method is adaptive and capable of handling non-linear and non-stationary data. Correlation analysis of the decomposition results was performed and examined for insights into the oil-macroeconomy relationship. Several components of this relationship were identified. However, the principal one is that the medium-run cyclical component of the oil price exerts a negative and exogenous influence on the main cyclical component of the GDP. This can be interpreted as the supply-driven or supply-shock component of the oil price-GDP relationship. In addition, weak correlations suggesting a lagging demand-driven, an expectations-driven, and a long-run supply-driven component of the relationship were also identified. Comparisons of these findings with significant oil supply disruption and recession dates were supportive. The study identified a number of lessons applicable to recent oil market events, including the eventuality of persistent economic and price declines following a long oil price run-up. In addition, it was found that oil-market related exogenous events are associated with short- to medium-run price implications regardless of whether they lead to actual supply disruptions.

  20. Uncertainty and Sensitivity Analysis Results Obtained in the 1996 Performance Assessment for the Waste Isolation Pilot Plant

    SciTech Connect (OSTI)

    Bean, J.E.; Berglund, J.W.; Davis, F.J.; Economy, K.; Garner, J.W.; Helton, J.C.; Johnson, J.D.; MacKinnon, R.J.; Miller, J.; O'Brien, D.G.; Ramsey, J.L.; Schreiber, J.D.; Shinta, A.; Smith, L.N.; Stockman, C.; Stoelzel, D.M.; Vaughn, P.

    1998-09-01

The Waste Isolation Pilot Plant (WIPP) is located in southeastern New Mexico and is being developed by the U.S. Department of Energy (DOE) for the geologic (deep underground) disposal of transuranic (TRU) waste. A detailed performance assessment (PA) for the WIPP was carried out in 1996 and supports an application by the DOE to the U.S. Environmental Protection Agency (EPA) for the certification of the WIPP for the disposal of TRU waste. The 1996 WIPP PA uses a computational structure that maintains a separation between stochastic (i.e., aleatory) and subjective (i.e., epistemic) uncertainty, with stochastic uncertainty arising from the many possible disruptions that could occur over the 10,000 yr regulatory period that applies to the WIPP and subjective uncertainty arising from the imprecision with which many of the quantities required in the PA are known. Important parts of this structure are (1) the use of Latin hypercube sampling to incorporate the effects of subjective uncertainty, (2) the use of Monte Carlo (i.e., random) sampling to incorporate the effects of stochastic uncertainty, and (3) the efficient use of the necessarily limited number of mechanistic calculations that can be performed to support the analysis. The use of Latin hypercube sampling generates a mapping from imprecisely known analysis inputs to analysis outcomes of interest that provides both a display of the uncertainty in analysis outcomes (i.e., uncertainty analysis) and a basis for investigating the effects of individual inputs on these outcomes (i.e., sensitivity analysis). The sensitivity analysis procedures used in the PA include examination of scatterplots, stepwise regression analysis, and partial correlation analysis. Uncertainty and sensitivity analysis results obtained as part of the 1996 WIPP PA are presented and discussed. Specific topics considered include two-phase flow in the vicinity of the repository, radionuclide release from the repository, fluid flow and radionuclide
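The separation the PA maintains between subjective and stochastic uncertainty is usually implemented as nested sampling: an outer Latin hypercube loop over epistemic parameters and an inner Monte Carlo loop over aleatory events. The consequence model below is a toy invented for illustration, not any of the WIPP codes:

```python
import numpy as np

rng = np.random.default_rng(42)

def lhs(n, d, rng):
    """Latin hypercube sample on [0,1]^d."""
    return np.column_stack([(rng.permutation(n) + rng.random(n)) / n
                            for _ in range(d)])

# Toy consequence model (hypothetical): release depends on an epistemic
# parameter p (imprecisely known) and an aleatory variable t (random
# disturbance time over the regulatory period).
def release(p, t):
    return p * np.exp(-t / 5000.0)

n_epistemic, n_aleatory = 50, 1000
# Outer loop: Latin hypercube sampling over subjective (epistemic) uncertainty.
P = 1e-3 + 9e-3 * lhs(n_epistemic, 1, rng)[:, 0]
curves = []
for p in P:
    # Inner loop: random (Monte Carlo) sampling over stochastic uncertainty.
    t = rng.uniform(0.0, 10000.0, n_aleatory)
    curves.append(release(p, t).mean())   # expected release given p
# The spread across `curves` displays epistemic uncertainty in the
# aleatory-mean release; regressing curves on P gives the sensitivities.
```

Keeping the two loops distinct is what lets the analysis report "uncertainty about a probability" rather than a single blended distribution.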

  1. Quantitative Analysis of Variability and Uncertainty in Environmental Data and Models. Volume 1. Theory and Methodology Based Upon Bootstrap Simulation

    SciTech Connect (OSTI)

    Frey, H. Christopher; Rhodes, David S.

    1999-04-30

    This is Volume 1 of a two-volume set of reports describing work conducted at North Carolina State University sponsored by Grant Number DE-FG05-95ER30250 by the U.S. Department of Energy. The title of the project is “Quantitative Analysis of Variability and Uncertainty in Acid Rain Assessments.” The work conducted under sponsorship of this grant pertains primarily to two main topics: (1) development of new methods for quantitative analysis of variability and uncertainty applicable to any type of model; and (2) analysis of variability and uncertainty in the performance, emissions, and cost of electric power plant combustion-based NOx control technologies. These two main topics are reported separately in Volumes 1 and 2.

  2. An inquiry into the potential of scenario analysis for dealing with uncertainty in strategic environmental assessment in China

    SciTech Connect (OSTI)

Zhu, Zhixi; Bai, Hongtao; Xu, He; Zhu, Tan

    2011-11-15

Strategic environmental assessment (SEA) inherently needs to address greater levels of uncertainty in the formulation and implementation of strategic decisions than project environmental impact assessment does. The range of uncertainties includes factors both internal and external to the complex system the strategy concerns. Scenario analysis is increasingly being used to cope with uncertainty in SEA. Following a brief introduction to scenarios and scenario analysis, this paper examines the rationale for scenario analysis in SEA in the context of China. The state of the art in scenario analysis applied to SEA in China was reviewed through four SEA case analyses. Lessons learned from these cases indicated that the word 'scenario' is often abused and scenario-based methods misused, owing to a limited understanding of uncertain futures and of scenario analysis. However, useful experience was also drawn on regarding how to integrate scenario analysis into the SEA process in China, how to cope with driving forces including uncertainties, how to combine qualitative scenario storylines with quantitative impact predictions, and how to conduct assessments and propose recommendations based on scenarios. Additionally, ways to improve the application of this tool in SEA are suggested. We conclude by calling for further methodological research on this issue and more practice.

  3. DAKOTA : a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis.

    SciTech Connect (OSTI)

    Eldred, Michael Scott; Vigil, Dena M.; Dalbey, Keith R.; Bohnhoff, William J.; Adams, Brian M.; Swiler, Laura Painton; Lefantzi, Sophia; Hough, Patricia Diane; Eddy, John P.

    2011-12-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the DAKOTA software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of DAKOTA-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of DAKOTA's iterative analysis capabilities.

  4. Uncertainty Analysis of Certified Photovoltaic Measurements at the National Renewable Energy Laboratory

    SciTech Connect (OSTI)

    Emery, K.

    2009-08-01

Discusses the NREL Photovoltaic Cell and Module Performance Characterization Group's procedures for achieving the lowest practical uncertainty in measuring PV performance with respect to reference conditions.

  5. Japan's Solar Photovoltaic (PV) Market: An Analysis of Residential System Prices (Presentation)

    SciTech Connect (OSTI)

    James, T.

    2014-03-01

    This presentation summarizes market and policy factors influencing residential solar photovoltaic system prices in Japan, and compares these factors to related developments in the United States.

  6. Using Uncertainty Analysis to Guide the Development of Accelerated Stress Tests (Presentation)

    SciTech Connect (OSTI)

    Kempe, M.

    2014-03-01

    Extrapolation of accelerated testing to the long-term results expected in the field has uncertainty associated with the acceleration factors and the range of possible stresses in the field. When multiple stresses (such as temperature and humidity) can be used to increase the acceleration, the uncertainty may be reduced according to which stress factors are used to accelerate the degradation.
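One common way such extrapolation uncertainty arises is through an imperfectly known Arrhenius activation energy: a modest spread in Ea produces a wide spread in the acceleration factor. The temperatures and the 0.7 ± 0.1 eV activation energy below are illustrative assumptions, not NREL's stress-test parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

K_B = 8.617e-5                      # Boltzmann constant, eV/K
T_USE, T_TEST = 298.15, 358.15      # 25 C field vs 85 C chamber (illustrative)

# Hypothetical input: activation energy known only as 0.7 +/- 0.1 eV (1-sigma).
Ea = rng.normal(0.7, 0.1, 100_000)

# Arrhenius acceleration factor for each sampled activation energy.
AF = np.exp(Ea / K_B * (1.0 / T_USE - 1.0 / T_TEST))

# Percentiles of AF bound the field lifetime inferred from chamber hours;
# an additional, well-characterized stress axis would tighten this band.
lo, med, hi = np.percentile(AF, [5, 50, 95])
```

With these numbers the 5th-95th percentile band spans nearly an order of magnitude, which is exactly the kind of spread the presentation argues should steer which stress factors are used.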

  7. Users manual for the FORSS sensitivity and uncertainty analysis code system

    SciTech Connect (OSTI)

    Lucius, J.L.; Weisbin, C.R.; Marable, J.H.; Drischler, J.D.; Wright, R.Q.; White, J.E.

    1981-01-01

    FORSS is a code system used to study relationships between nuclear reaction cross sections, integral experiments, reactor performance parameter predictions and associated uncertainties. This report describes the computing environment and the modules currently used to implement FORSS Sensitivity and Uncertainty Methodology.
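The core relationship such sensitivity/uncertainty systems exploit is the first-order "sandwich rule": the relative variance of a response is the sensitivity vector contracted with the relative covariance matrix of the cross sections. The sensitivities and covariances below are hypothetical numbers, not FORSS data:

```python
import numpy as np

# Sandwich rule: var(R)/R^2 = S . C . S^T, with S the relative sensitivities
# of response R to each cross section and C their relative covariance matrix.
S = np.array([0.8, -0.3, 0.1])          # hypothetical sensitivity coefficients
C = np.array([[0.04, 0.01, 0.00],       # hypothetical relative covariances
              [0.01, 0.09, 0.00],
              [0.00, 0.00, 0.25]])

rel_var = S @ C @ S
rel_std = np.sqrt(rel_var)              # fractional 1-sigma uncertainty on R
```

The same contraction, run experiment by experiment, is what lets integral measurements feed back into reduced cross-section uncertainties.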

  8. Uncertainty analysis in geospatial merit matrix–based hydropower resource assessment

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Pasha, M. Fayzul K.; Yeasmin, Dilruba; Saetern, Sen; Yang, Majntxov; Kao, Shih -Chieh; Smith, Brennan T.

    2016-03-30

Hydraulic head and mean annual streamflow, two main input parameters in hydropower resource assessment, are not measured at every point along the stream. Translation and interpolation are used to derive these parameters, resulting in uncertainties. This study estimates those uncertainties and their effects on the model output parameters: the total potential power and the number of potential locations (stream-reaches). The uncertainties are quantified through Monte Carlo Simulation (MCS) linked to a geospatial merit matrix–based hydropower resource assessment (GMM-HRA) model. The methodology is applied to flat, mild, and steep terrains. Results show that the uncertainty associated with the hydraulic head is within 20% for mild and steep terrains, and the uncertainty associated with streamflow is around 16% for all three terrains. Output uncertainty increases as input uncertainty increases. However, output uncertainty is around 10% to 20% of the input uncertainty, demonstrating the robustness of the GMM-HRA model. Output parameters are more sensitive to hydraulic head in steep terrain than in flat and mild terrains; they are more sensitive to mean annual streamflow in flat terrain.
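
    As an illustrative aside, the kind of Monte Carlo propagation this record describes can be sketched in a few lines. The power formula, nominal values, and uncertainty levels below are invented for illustration (only the rough 20% head and 16% streamflow figures echo the abstract); this is not the GMM-HRA model itself.

```python
# Illustrative Monte Carlo propagation of input uncertainty into hydropower
# potential. All nominal values are hypothetical, not from the cited study.
import numpy as np

rng = np.random.default_rng(42)
n = 20000

# Nominal inputs for a hypothetical stream reach
H, Q, eta = 10.0, 50.0, 0.85        # head [m], flow [m^3/s], efficiency
rho, g = 1000.0, 9.81               # water density [kg/m^3], gravity [m/s^2]

# Relative input uncertainties roughly matching those quoted in the abstract
H_s = H * (1 + 0.20 * rng.standard_normal(n))   # ~20% head uncertainty
Q_s = Q * (1 + 0.16 * rng.standard_normal(n))   # ~16% streamflow uncertainty

P = eta * rho * g * Q_s * H_s / 1e6             # potential power [MW]

rel_std = P.std() / P.mean()
print(f"mean potential: {P.mean():.1f} MW, relative uncertainty: {rel_std:.0%}")
```

    For independent multiplicative errors like these, the output's relative spread is roughly the root-sum-square of the input spreads, which the sampled estimate reproduces.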

  9. Electricity Prices in a Competitive Environment: Marginal Cost Pricing

    Reports and Publications (EIA)

    1997-01-01

    Presents the results of an analysis that focuses on two questions: (1) How are prices for competitive generation services likely to differ from regulated prices if competitive prices are based on marginal costs rather than regulated cost-of-service pricing? (2) What impacts will the competitive pricing of generation services (based on marginal costs) have on electricity consumption patterns, production costs, and the financial integrity of electricity suppliers?

  10. Gamma-Ray Library and Uncertainty Analysis: Passively Emitted Gamma Rays Used in Safeguards Technology

    SciTech Connect (OSTI)

    Parker, W

    2009-09-18

Non-destructive gamma-ray analysis is a fundamental part of nuclear safeguards, including nuclear energy safeguards technology. Developing safeguards capabilities for nuclear energy will certainly benefit from the advanced use of gamma-ray spectroscopy as well as the ability to model various reactor scenarios. There is currently a wide variety of nuclear data that could be used in computer modeling and gamma-ray spectroscopy analysis. The data can be discrepant (with varying uncertainties), and it may be difficult for a modeler or software developer to determine the best nuclear data set for a particular situation. To use gamma-ray spectroscopy to determine the relative isotopic composition of nuclear materials, the gamma-ray energies and the branching ratios or intensities of the gamma rays emitted from the nuclides in the material must be well known. A variety of computer simulation codes will be used during the development of nuclear energy safeguards, and, to compare the results of various codes, it will be essential to have all the gamma-ray libraries agree. Assessing our nuclear data needs allows us to create a prioritized list of desired measurements, and provides uncertainties for energies and especially for branching intensities. Of interest are actinides, fission products, and activation products, and most particularly mixtures of all of these radioactive isotopes, including mixtures of actinides and other products. Recent work includes the development of new detectors with increased energy resolution, and studies of gamma rays and their lines used in simulation codes. Because new detectors are being developed, there is an increased need for well-known nuclear data for radioactive isotopes of some elements. Safeguards technology should take advantage of all types of gamma-ray detectors, including new super-cooled detectors, germanium detectors, and cadmium zinc telluride detectors.

  11. SU-E-J-159: Analysis of Total Imaging Uncertainty in Respiratory-Gated Radiotherapy

    SciTech Connect (OSTI)

    Suzuki, J; Okuda, T; Sakaino, S; Yokota, N

    2015-06-15

Purpose: In respiratory-gated radiotherapy, the gating phase during treatment delivery needs to coincide with the corresponding phase determined during the treatment plan. However, because radiotherapy is performed based on the images obtained for the treatment plan, the time delay, motion artifact, volume effect, and resolution in the images are uncertain. Imaging uncertainty is thus the most basic factor affecting localization accuracy, and these uncertainties should be analyzed. This study aims to analyze the total imaging uncertainty in respiratory-gated radiotherapy. Methods: Two factors of imaging uncertainty related to respiratory-gated radiotherapy were analyzed. First, a CT image was used to determine the target volume and 4D treatment planning for the Varian Real-time Position Management (RPM) system. Second, an X-ray image was acquired for image-guided radiotherapy (IGRT) with the BrainLAB ExacTrac system. These factors were measured using a respiratory gating phantom. The conditions applied during phantom operation were as follows: respiratory waveform, sine curve; respiratory cycle, 4 s; phantom target motion amplitude, 10, 20, and 29 mm (the maximum phantom longitudinal motion). The coverage of the target and of a cylindrical marker implanted in the phantom was measured on the CT images and compared with the coverage calculated theoretically from the phantom motion. The theoretical position of the cylindrical marker was compared with that acquired from the X-ray image. The total imaging uncertainty was analyzed from these two factors. Results: In the CT image, the uncertainty between the actual coverage of the target and cylindrical marker and the coverage on the CT images was 1.19 mm and 2.50 mm, respectively. In the X-ray image, the uncertainty was 0.39 mm. The total imaging uncertainty from the two factors was 1.62 mm. Conclusion: The total imaging uncertainty in respiratory-gated radiotherapy was clinically acceptable.

  12. WE-D-BRE-07: Variance-Based Sensitivity Analysis to Quantify the Impact of Biological Uncertainties in Particle Therapy

    SciTech Connect (OSTI)

    Kamp, F.; Brueningk, S.C.; Wilkens, J.J.

    2014-06-15

Purpose: In particle therapy, treatment planning and evaluation are frequently based on biological models to estimate the relative biological effectiveness (RBE) or the equivalent dose in 2 Gy fractions (EQD2). In the context of the linear-quadratic model, these quantities depend on biological parameters (α, β) for ions as well as for the reference radiation, and on the dose per fraction. The needed biological parameters, as well as their dependence on ion species and ion energy, are typically subject to large relative uncertainties of up to 20–40% or more. It is therefore necessary to estimate the resulting uncertainties in, e.g., RBE or EQD2 caused by the uncertainties of the relevant input parameters. Methods: We use a variance-based sensitivity analysis (SA) approach, in which uncertainties in input parameters are modeled by random number distributions. The evaluated function is executed 10^4 to 10^6 times, each run with a different set of input parameters randomly varied according to their assigned distributions. The sensitivity S is a variance-based ranking (from S = 0, no impact, to S = 1, the only influential part) of the impact of input uncertainties. The SA approach is implemented for carbon ion treatment plans on 3D patient data, providing information about variations (and their origin) in RBE and EQD2. Results: The quantification enables 3D sensitivity maps, showing dependencies of RBE and EQD2 on different input uncertainties. The high number of runs allows displaying the interplay between different input uncertainties. The SA identifies input parameter combinations that result in extreme deviations of the result, and the input parameters for which an uncertainty reduction is most rewarding. Conclusion: The presented variance-based SA provides advantageous properties in terms of visualization and quantification of (biological) uncertainties and their impact. The method is very flexible, model independent, and enables a broad assessment.
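
    As background, the variance-based sensitivity S_i = Var(E[Y|X_i]) / Var(Y) described in this abstract can be estimated purely by sampling. The sketch below uses a toy linear model in place of an RBE/EQD2 evaluation and a standard "pick-freeze" estimator; nothing here is taken from the cited work.

```python
# Minimal variance-based (Sobol-type) sensitivity analysis by sampling.
# The toy model y = 2*a + b stands in for a biological-dose calculation;
# its analytic first-order sensitivities are S = [0.8, 0.2].
import numpy as np

def model(a, b):
    # hypothetical stand-in for an RBE/EQD2 evaluation
    return 2.0 * a + b

rng = np.random.default_rng(0)
n = 100_000
A = rng.standard_normal((n, 2))   # first sample matrix
B = rng.standard_normal((n, 2))   # second, independent sample matrix

yA = model(A[:, 0], A[:, 1])
yB = model(B[:, 0], B[:, 1])
var_y = yA.var()

S = []
for i in range(2):
    # "pick-freeze": column i from A, the other column from B
    AB = B.copy()
    AB[:, i] = A[:, i]
    yAB = model(AB[:, 0], AB[:, 1])
    S.append(np.mean(yA * (yAB - yB)) / var_y)

print(S)  # estimates should land close to the analytic values [0.8, 0.2]
```

    With unit-variance inputs the estimator is unbiased, so the two estimates rank the inputs exactly as the analytic indices do.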

  13. Why Do Motor Gasoline Prices Vary Regionally? California Case Study

    Reports and Publications (EIA)

    1998-01-01

    Analysis of the difference between the retail gasoline prices in California and the average U.S. retail prices.

  14. Uncertainty analysis of integrated gasification combined cycle systems based on Frame 7H versus 7F gas turbines

    SciTech Connect (OSTI)

    Yunhua Zhu; H. Christopher Frey

    2006-12-15

    Integrated gasification combined cycle (IGCC) technology is a promising alternative for clean generation of power and coproduction of chemicals from coal and other feedstocks. Advanced concepts for IGCC systems that incorporate state-of-the-art gas turbine systems, however, are not commercially demonstrated. Therefore, there is uncertainty regarding the future commercial-scale performance, emissions, and cost of such technologies. The Frame 7F gas turbine represents current state-of-practice, whereas the Frame 7H is the most recently introduced advanced commercial gas turbine. The objective of this study was to evaluate the risks and potential payoffs of IGCC technology based on different gas turbine combined cycle designs. Models of entrained-flow gasifier-based IGCC systems with Frame 7F (IGCC-7F) and 7H gas turbine combined cycles (IGCC-7H) were developed in ASPEN Plus. An uncertainty analysis was conducted. Gasifier carbon conversion and project cost uncertainty are identified as the most important uncertain inputs with respect to system performance and cost. The uncertainties in the difference of the efficiencies and costs for the two systems are characterized. Despite uncertainty, the IGCC-7H system is robustly preferred to the IGCC-7F system. Advances in gas turbine design will improve the performance, emissions, and cost of IGCC systems. The implications of this study for decision-making regarding technology selection, research planning, and plant operation are discussed. 38 refs., 11 figs., 5 tabs.

  15. Information on Hydrologic Conceptual Models, Parameters, Uncertainty Analysis, and Data Sources for Dose Assessments at Decommissioning Sites

    SciTech Connect (OSTI)

    Meyer, Philip D.; Gee, Glendon W.; Nicholson, Thomas J.

    2000-02-28

    This report addresses issues related to the analysis of uncertainty in dose assessments conducted as part of decommissioning analyses. The analysis is limited to the hydrologic aspects of the exposure pathway involving infiltration of water at the ground surface, leaching of contaminants, and transport of contaminants through the groundwater to a point of exposure. The basic conceptual models and mathematical implementations of three dose assessment codes are outlined along with the site-specific conditions under which the codes may provide inaccurate, potentially nonconservative results. In addition, the hydrologic parameters of the codes are identified and compared. A methodology for parameter uncertainty assessment is outlined that considers the potential data limitations and modeling needs of decommissioning analyses. This methodology uses generic parameter distributions based on national or regional databases, sensitivity analysis, probabilistic modeling, and Bayesian updating to incorporate site-specific information. Data sources for best-estimate parameter values and parameter uncertainty information are also reviewed. A follow-on report will illustrate the uncertainty assessment methodology using decommissioning test cases.

  16. A joint analysis of Planck and BICEP2 B modes including dust polarization uncertainty

    SciTech Connect (OSTI)

Mortonson, Michael J.; Seljak, Uroš (E-mail: useljak@berkeley.edu)

    2014-10-01

We analyze BICEP2 and Planck data using a model that includes CMB lensing, gravity waves, and polarized dust. Recently published Planck dust polarization maps have highlighted the difficulty of estimating the amount of dust polarization in low intensity regions, suggesting that the polarization fractions have considerable uncertainties and may be significantly higher than previous predictions. In this paper, we start by assuming nothing about the dust polarization except for the power spectrum shape, which we take to be C_l^{BB,dust} ∝ l^{-2.42}. The resulting joint BICEP2+Planck analysis favors solutions without gravity waves, and the upper limit on the tensor-to-scalar ratio is r < 0.11, a slight improvement relative to the Planck analysis alone, which gives r < 0.13 (95% c.l.). The estimated amplitude of the dust polarization power spectrum agrees with expectations for this field based on both HI column density and Planck polarization measurements at 353 GHz in the BICEP2 field. Including the latter constraint on the dust spectrum amplitude in our analysis improves the limit further to r < 0.09, placing strong constraints on theories of inflation (e.g., models with r > 0.14 are excluded with 99.5% confidence). We address the cross-correlation analysis of BICEP2 at 150 GHz with BICEP1 at 100 GHz as a test of foreground contamination. We find that the null hypothesis of dust and lensing with r = 0 gives Δχ² < 2 relative to the hypothesis of no dust, so the frequency analysis does not strongly favor either model over the other. We also discuss how more accurate dust polarization maps may improve our constraints. If the dust polarization is measured perfectly, the limit can reach r < 0.05 (or the corresponding detection significance if the observed dust signal plus the expected lensing signal is below the BICEP2 observations), but this degrades quickly to almost no improvement if the dust calibration error is 20% or larger.

  17. Recommendations for probabilistic seismic hazard analysis: Guidance on uncertainty and use of experts

    SciTech Connect (OSTI)

    Budnitz, R.J.; Apostolakis, G.; Boore, D.M.

    1997-04-01

Probabilistic Seismic Hazard Analysis (PSHA) is a methodology that estimates the likelihood that various levels of earthquake-caused ground motion will be exceeded at a given location in a given future time period. Due to large uncertainties in all the geosciences data and in their modeling, multiple model interpretations are often possible. This leads to disagreement among experts, which in the past has led to disagreement on the selection of ground motion for design at a given site. In order to review the present state of the art and improve the overall stability of the PSHA process, the U.S. Nuclear Regulatory Commission (NRC), the U.S. Department of Energy (DOE), and the Electric Power Research Institute (EPRI) co-sponsored a project to provide methodological guidance on how to perform a PSHA. The project has been carried out by a seven-member Senior Seismic Hazard Analysis Committee (SSHAC) supported by a large number of other experts. The SSHAC reviewed past studies, including the landmark Lawrence Livermore National Laboratory and EPRI PSHA studies of the 1980s, and examined ways to improve on the present state of the art. The Committee's most important conclusion is that differences in PSHA results are due to procedural rather than technical differences. Thus, in addition to providing detailed documentation of the state-of-the-art elements of a PSHA, this report provides a series of procedural recommendations. The role of experts is analyzed in detail. Two entities are formally defined, the Technical Integrator (TI) and the Technical Facilitator Integrator (TFI), to account for the various levels of complexity in the technical issues and the different levels of effort needed in a given study.
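
    For orientation, the core quantity a PSHA produces, the annual probability that ground motion exceeds a given level, can be sketched schematically. The earthquake rate, median ground motion, and lognormal spread below are invented placeholders, not values endorsed by the SSHAC report.

```python
# Schematic single-source PSHA hazard curve: Poisson event occurrence
# combined with a lognormal ground-motion model. All numbers are toy values.
import math

def annual_exceedance_prob(a, rate=0.05, median_gm=0.1, beta=0.6):
    """P(at least one exceedance of acceleration a [g] in one year).

    rate      : mean annual rate of significant earthquakes (Poisson)
    median_gm : median peak ground acceleration per event [g]
    beta      : lognormal standard deviation of the ground motion
    """
    # P(GM > a | event) under the lognormal ground-motion model
    z = (math.log(a) - math.log(median_gm)) / beta
    p_exceed_given_event = 0.5 * math.erfc(z / math.sqrt(2))
    # Poisson occurrence of exceedances over one year
    return 1.0 - math.exp(-rate * p_exceed_given_event)

for a in (0.05, 0.1, 0.2, 0.4):
    print(f"PGA > {a:.2f} g: annual probability {annual_exceedance_prob(a):.2e}")
```

    The hazard curve falls monotonically with the acceleration level; expert disagreement of the kind the report addresses enters through the rate and ground-motion parameters.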

  18. Uncertainty in soil-structure interaction analysis of a nuclear power plant due to different analytical techniques

    SciTech Connect (OSTI)

    Chen, J.C.; Chun, R.C.; Goudreau, G.L.; Maslenikov, O.R.; Johnson, J.J.

    1984-01-01

    This paper summarizes the results of the dynamic response analysis of the Zion reactor containment building using three different soil-structure interaction (SSI) analytical procedures which are: the substructure method, CLASSI; the equivalent linear finite element approach, ALUSH; and the nonlinear finite element procedure, DYNA3D. Uncertainties in analyzing a soil-structure system due to SSI analysis procedures were investigated. Responses at selected locations in the structure were compared through peak accelerations and response spectra.

  19. A simplified analysis of uncertainty propagation in inherently controlled ATWS events

    SciTech Connect (OSTI)

    Wade, D.C.

    1987-01-01

The quasi-static approach can be used to provide useful insight concerning the propagation of uncertainties in the inherent response to ATWS events. At issue is how uncertainties in the reactivity coefficients and in the thermal-hydraulic and materials properties propagate to yield uncertainties in the asymptotic temperatures attained upon inherent shutdown. The basic notion to be quantified is that many of the same physical phenomena contribute to both the reactivity increase from power reduction and the reactivity decrease from core temperature rise. Since these reactivities cancel by definition, a good deal of uncertainty cancellation must also occur of necessity. For example, if the Doppler coefficient is overpredicted, too large a positive reactivity insertion is predicted upon power reduction and collapse of the ΔT across the fuel pin. However, too large a negative reactivity is also predicted upon the compensating increase in the isothermal core average temperature, which includes the fuel Doppler effect.
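
    The cancellation argument above can be made concrete with a toy reactivity balance. The coefficient magnitudes, fuel ΔT, and 20% uncertainty below are invented for illustration; the point is only that the asymptotic temperature rise is less uncertain than the Doppler coefficient itself.

```python
# Numerical sketch of uncertainty cancellation in an inherent-shutdown
# reactivity balance. All coefficient magnitudes and values are illustrative.
import numpy as np

rng = np.random.default_rng(1)
n = 10000

dT_fuel = 500.0                 # collapsed fuel-to-coolant delta-T [K] (toy)
alpha_c = 2.0e-5                # non-Doppler temperature coefficient [1/K] (toy)

# Doppler coefficient magnitude with a 20% relative uncertainty
alpha_D = 2.0e-5 * (1 + 0.20 * rng.standard_normal(n))

# Asymptotic balance: the insertion alpha_D*dT_fuel from power reduction is
# cancelled by the feedback (alpha_D + alpha_c)*dT_rise from core heat-up
dT_rise = alpha_D * dT_fuel / (alpha_D + alpha_c)

rel_in = 0.20
rel_out = dT_rise.std() / dT_rise.mean()
print(f"input rel. uncertainty {rel_in:.0%} -> output rel. uncertainty {rel_out:.1%}")
```

    Because an overpredicted Doppler coefficient inflates both the numerator and the denominator, the output spread is roughly halved relative to the input spread in this toy configuration.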

  20. Validation and quantification of uncertainty in coupled climate models using network analysis

    SciTech Connect (OSTI)

    Bracco, Annalisa

    2015-08-10

We developed a fast, robust, and scalable methodology to examine, quantify, and visualize climate patterns and their relationships. It is based on a set of notions, algorithms, and metrics used in the study of graphs, referred to as complex network analysis. This approach can be applied to explain known climate phenomena in terms of an underlying network structure and to uncover regional and global linkages in the climate system, while comparing general circulation model outputs with observations. The proposed method is based on a two-layer network representation and is substantially new among the available network methodologies developed for climate studies. At the first layer, gridded climate data are used to identify "areas", i.e., geographical regions that are highly homogeneous in terms of the given climate variable. At the second layer, the identified areas are interconnected with links of varying strength, forming a global climate network. The robustness of the method (i.e., the ability to separate topologically distinct fields while correctly identifying similarities) has been extensively tested, and the method has been shown to provide a reliable, fast framework for comparing and ranking the ability of climate models to reproduce observed climate patterns and their connectivity. We further developed the methodology to account for lags in the connectivity between climate patterns and refined our area identification algorithm to account for autocorrelation in the data. The new methodology based on complex network analysis has been applied to state-of-the-art climate model simulations that participated in the last IPCC (Intergovernmental Panel on Climate Change) assessment to verify their performance, quantify uncertainties, and uncover changes in global linkages between past and future projections. Network properties of modeled sea surface temperature and rainfall over 1956–2005 have been constrained towards observations or reanalysis data sets.
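
    To illustrate the second-layer idea only, the sketch below links synthetic "areas" by correlation strength and thresholds the result into a network. It skips the area-identification step entirely, and all time series are fabricated; it is not the cited methodology.

```python
# Toy area-level climate network: link "areas" whose time series correlate
# strongly. All data are synthetic; the area-identification layer is omitted.
import numpy as np

rng = np.random.default_rng(3)
t = 200                                     # time steps
common = rng.standard_normal(t)             # shared climate signal

# Four "areas": two driven by the common signal, two independent
areas = np.stack([
    common + 0.3 * rng.standard_normal(t),
    common + 0.3 * rng.standard_normal(t),
    rng.standard_normal(t),
    rng.standard_normal(t),
])

corr = np.corrcoef(areas)                   # link strengths between areas
adjacency = (np.abs(corr) > 0.5) & ~np.eye(4, dtype=bool)
print(adjacency.astype(int))                # 1 = link between two areas
```

    The two areas sharing the common signal end up linked, while the independent areas do not, which is the kind of connectivity structure the method compares between models and observations.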

  1. 2009 Technical Risk and Uncertainty Analysis of the U.S. Department of Energy's Solar Energy Technologies Program Concentrating Solar Power and Photovoltaics R&D

    SciTech Connect (OSTI)

    McVeigh, J.; Lausten, M.; Eugeni, E.; Soni, A.

    2010-11-01

    The U.S. Department of Energy (DOE) Solar Energy Technologies Program (SETP) conducted a 2009 Technical Risk and Uncertainty Analysis to better assess its cost goals for concentrating solar power (CSP) and photovoltaic (PV) systems, and to potentially rebalance its R&D portfolio. This report details the methodology, schedule, and results of this technical risk and uncertainty analysis.

  2. The IAEA coordinated research programme on HTGR uncertainty analysis: Phase I status and Ex. I-1 prismatic reference results

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Bostelmann, Friederike; Strydom, Gerhard; Reitsma, Frederik; Ivanov, Kostadin

    2016-01-11

The quantification of uncertainties in design and safety analysis of reactors is today not only broadly accepted but, in many cases, has become the preferred replacement for traditional conservative analysis in safety and licensing studies. The use of a more fundamental methodology is also consistent with the reliable high-fidelity physics models and robust, efficient, and accurate codes available today. To facilitate uncertainty analysis applications, a comprehensive approach and methodology must be developed and applied, in contrast to the historical approach in which sensitivity analyses were performed and uncertainties then determined by a simplified statistical combination of a few important input parameters. New methodologies are currently under development in the OECD/NEA Light Water Reactor (LWR) Uncertainty Analysis in Best-Estimate Modelling (UAM) benchmark activity. High Temperature Gas-cooled Reactor (HTGR) designs require specific treatment of the double-heterogeneous fuel design and large graphite quantities at high temperatures. The IAEA therefore launched a Coordinated Research Project (CRP) on HTGR Uncertainty Analysis in Modelling (UAM) in 2013 to study uncertainty propagation specifically in the HTGR analysis chain. Two benchmark problems are defined, with the prismatic design represented by the General Atomics (GA) MHTGR-350 and a 250 MW modular pebble bed design similar to the Chinese HTR-PM. Work has started on the first phase, and the current CRP status is reported in the paper. A comparison of the Serpent and SCALE/KENO-VI reference Monte Carlo results for Ex. I-1 of the MHTGR-350 design is also included. It was observed that the SCALE/KENO-VI Continuous Energy (CE) k∞ values were 395 pcm (Ex. I-1a) to 803 pcm (Ex. I-1b) higher than the respective Serpent lattice calculations, and that within the set of SCALE results, the KENO-VI 238 Multi-Group (MG) k∞ values were up to 800 pcm lower than the KENO-VI CE values.

  3. nCTEQ15 - Global analysis of nuclear parton distributions with uncertainties in the CTEQ framework

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Kovarik, K.; Kusina, A.; Jezo, T.; Clark, D. B.; Keppel, C.; Lyonnet, F.; Morfin, J. G.; Olness, F. I.; Owens, J. F.; Schienbein, I.; et al

    2016-04-28

We present the new nCTEQ15 set of nuclear parton distribution functions with uncertainties. This fit extends the CTEQ proton PDFs to include nuclear dependence using data on nuclei all the way up to 208Pb. The uncertainties are determined using the Hessian method with an optimal rescaling of the eigenvectors to accurately represent the uncertainties for the chosen tolerance criteria. In addition to the Deep Inelastic Scattering (DIS) and Drell-Yan (DY) processes, we also include inclusive pion production data from RHIC to help constrain the nuclear gluon PDF. We investigate the correlation of the data sets with specific nPDF flavor components and assess the impact of individual experiments. We also provide comparisons of the nCTEQ15 set with recent fits from other groups.
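
    For readers unfamiliar with the Hessian method mentioned here, the sketch below shows its basic mechanics on a toy two-parameter fit (it omits the eigenvector rescaling the abstract describes). Eigen-directions of the χ² Hessian define error sets, and an observable's uncertainty follows from the symmetric master formula ΔF = ½ √Σₖ (F(Sₖ⁺) − F(Sₖ⁻))². The Hessian, tolerance, and observable are invented inputs.

```python
# Sketch of Hessian error propagation as used in PDF fits. For a linear
# observable, the master formula reproduces exact linear propagation.
import numpy as np

H = np.array([[4.0, 1.0],
              [1.0, 3.0]])      # chi^2 Hessian at the minimum (toy)
p0 = np.array([1.0, 2.0])       # best-fit parameters (toy)
T = 2.0                         # tolerance: allowed chi^2 increase

def F(p):
    # toy observable, linear in the parameters
    return 1.0 * p[0] + 2.0 * p[1]

lam, V = np.linalg.eigh(H)      # eigenvalues and orthonormal eigenvectors

# Eigenvector error sets: displacements giving chi^2 = T along each direction
diffs = []
for k in range(len(lam)):
    step = np.sqrt(T / lam[k]) * V[:, k]
    diffs.append(F(p0 + step) - F(p0 - step))

dF_hessian = 0.5 * np.sqrt(sum(d * d for d in diffs))

# Exact linear propagation for comparison: dF = sqrt(T * g^T H^-1 g)
g = np.array([1.0, 2.0])        # gradient of F
dF_exact = np.sqrt(T * g @ np.linalg.solve(H, g))
print(dF_hessian, dF_exact)
```

    The two numbers agree to machine precision for a linear observable, which is why the eigenvector error sets suffice to propagate fit uncertainties to any prediction.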

  4. Uncertainty Analysis of Spectral Irradiance Reference Standards Used for NREL Calibrations

    SciTech Connect (OSTI)

    Habte, A.; Andreas, A.; Reda, I.; Campanelli, M.; Stoffel, T.

    2013-05-01

Spectral irradiance produced by lamp standards such as the National Institute of Standards and Technology (NIST) FEL-type tungsten halogen lamps is used to calibrate spectroradiometers at the National Renewable Energy Laboratory (NREL). Spectroradiometers are often used to characterize the spectral irradiance of solar simulators, which in turn are used to characterize photovoltaic device performance, e.g., power output and spectral response. Therefore, quantifying the calibration uncertainty of spectroradiometers is critical to understanding photovoltaic system performance. In this study, we attempted to reproduce the NIST-reported input variables, including the calibration uncertainty in spectral irradiance for a standard NIST lamp, and to quantify the uncertainty for the measurement setup at the Optical Metrology Laboratory at NREL.

  5. Fukushima Daiichi Unit 1 Uncertainty Analysis-Exploration of Core Melt Progression Uncertain Parameters-Volume II.

    SciTech Connect (OSTI)

    Denman, Matthew R.; Brooks, Dusty Marie

    2015-08-01

Sandia National Laboratories (SNL) has conducted an uncertainty analysis (UA) on the Fukushima Daiichi Unit 1 (1F1) accident progression with the MELCOR code. Volume I of the 1F1 UA discusses the physical modeling details and time history results of the UA. Volume II of the 1F1 UA discusses the statistical viewpoint. The model used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). The goal of this work was to perform a focused evaluation of uncertainty in core damage progression behavior and its effect on key figures of merit (e.g., hydrogen production, fraction of intact fuel, vessel lower head failure) and, in doing so, assess the applicability of traditional sensitivity analysis techniques.

  6. Summary Notes from 15 November 2007 Generic Technical Issue Discussion on Sensitivity and Uncertainty Analysis and Model Support

    Office of Environmental Management (EM)

15 November 2007 Generic Technical Issue Discussion on Sensitivity and Uncertainty Analysis and Model Support. Attendees: Representatives from Department of Energy-Headquarters (DOE-HQ) and the U.S. Nuclear Regulatory Commission (NRC) met at the DOE offices in Germantown, Maryland on 15 November 2007. Representatives from Department of Energy-Savannah River (DOE-SR) and the South Carolina Department of Health and Environmental Control (SCDHEC) participated in the meeting via a teleconference link.

  7. Response model and activity analysis of the revenue reconciliation problem in the marginal cost pricing of electricity

    SciTech Connect (OSTI)

    Hassig, N.L.

    1980-01-01

The objective of the research was to determine whether feasible reconciliation procedures exist that meet the multiple (and sometimes competing) goals of the electricity pricing problem while staying within the constraints of the problem. The answer was that such procedures do exist. Selection among the alternative feasible procedures depends on the weighting factors placed on the goals. No single procedure universally satisfied all the goals; the various procedures satisfied the alternative goals to varying degrees. The selection process was sensitive to the initial conditions of the model and to the band width of the constraint boundary conditions. Discriminant analysis was used to identify the variables that contribute the most to the optimal selection process. The results of the research indicated that the most effective variables for selecting among the various procedures were the following: the ratio of peak to off-peak prices, the amount of revenue adjustment required, the constraint on equity, the constraint on peak price stability, and the constraint on meeting the revenue requirement. The policy recommendations that can be derived from this research are very relevant in light of today's energy problems. Time-of-use pricing of electricity is needed in order to signal to the consumer the true cost of electricity by season and by time of day. Marginal costs capture such costs, and rates should be based on them. Revenue reconciliation procedures make marginal cost-based rates feasible from a regulatory perspective. This research showed that such procedures are available and that selection among alternative procedures depends on the preference rankings placed on the multiple, and sometimes competing, goals of electricity pricing.

  8. Analysis of Variability and Uncertainty in Wind Power Forecasting: An International Comparison (Presentation)

    SciTech Connect (OSTI)

    Zhang, J.; Hodge, B.; Miettinen, J.; Holttinen, H.; Gomez-Lozaro, E.; Cutululis, N.; Litong-Palima, M.; Sorensen, P.; Lovholm, A.; Berge, E.; Dobschinski, J.

    2013-10-01

    This presentation summarizes the work to investigate the uncertainty in wind forecasting at different times of year and compare wind forecast errors in different power systems using large-scale wind power prediction data from six countries: the United States, Finland, Spain, Denmark, Norway, and Germany.

  9. Microsoft Word - Price Uncertainty Supplement.doc

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    ... In Figure 3, the then-prompt February 2009 futures contract (green dashed curve in Figure ... Figure 4 shows the comparable 2007 period and the same effect. U.S. Energy Information ...

  10. Microsoft Word - Price Uncertainty Supplement.doc

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    ... This can be seen in the spread between the green (July 1 forwards) and black curves ... Source: U.S. EIA, CME Group Energy Information AdministrationShort-Term Energy Outlook ...

  11. Characterization, propagation and analysis of aleatory and epistemic uncertainty in the 2008 performance assessment for the proposed repository for radioactive waste at Yucca Mountain, Nevada.

    SciTech Connect (OSTI)

    Helton, Jon Craig; Sallaberry, Cedric M.; Hansen, Clifford W.

    2010-10-01

    The 2008 performance assessment (PA) for the proposed repository for high-level radioactive waste at Yucca Mountain (YM), Nevada, illustrates the conceptual structure of risk assessments for complex systems. The 2008 YM PA is based on the following three conceptual entities: a probability space that characterizes aleatory uncertainty; a function that predicts consequences for individual elements of the sample space for aleatory uncertainty; and a probability space that characterizes epistemic uncertainty. These entities and their use in the characterization, propagation and analysis of aleatory and epistemic uncertainty are described and illustrated with results from the 2008 YM PA.
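
    The three conceptual entities listed in this record map naturally onto a two-loop Monte Carlo: an outer loop over epistemic (state-of-knowledge) parameters and an inner loop over aleatory (stochastic) events. The sketch below illustrates only that structure; all distributions and quantities are fabricated, not from the YM PA.

```python
# Two-loop (nested) Monte Carlo separating epistemic and aleatory uncertainty.
# The outer loop yields a family of outcome distributions, not a single one.
import numpy as np

rng = np.random.default_rng(7)
n_epistemic, n_aleatory = 50, 2000

# Outer loop: epistemically uncertain model parameter (e.g., a release rate)
rates = rng.uniform(0.5, 2.0, n_epistemic)

mean_doses = []
for rate in rates:
    # Inner loop: aleatory variability (e.g., event magnitudes) at fixed rate
    doses = rate * rng.exponential(1.0, n_aleatory)
    mean_doses.append(doses.mean())

mean_doses = np.array(mean_doses)
# Epistemic uncertainty appears as spread across the outer-loop results
print(f"expected dose ranges over [{mean_doses.min():.2f}, {mean_doses.max():.2f}]")
```

    Each outer-loop sample corresponds to one "consequence function" evaluated over the aleatory sample space, so the spread of the 50 means expresses epistemic uncertainty about the expected outcome.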

  12. Fuel cycle cost uncertainty from nuclear fuel cycle comparison

    SciTech Connect (OSTI)

    Li, J.; McNelis, D.; Yim, M.S.

    2013-07-01

This paper examined the uncertainty in fuel cycle cost (FCC) calculations by considering both model and parameter uncertainty. Four different fuel cycle options were compared in the analysis: the once-through cycle (OT), the DUPIC cycle, the MOX cycle, and a closed fuel cycle with fast reactors (FR). The model uncertainty was addressed by using three different FCC modeling approaches, with and without consideration of the time value of money. The relative ratios of FCC in comparison to OT did not change much across the different modeling approaches, an observation consistent with the results of the sensitivity study for the discount rate. Two different sets of data with uncertainty ranges for the unit costs were used to address the parameter uncertainty of the FCC calculation. The sensitivity study showed that the dominant contributor to the total variance of FCC is the uranium price. In general, the FCC of OT was found to be the lowest, followed by FR, MOX, and DUPIC; depending on the uranium price, however, the FR cycle was found to have a lower FCC than OT. The reprocessing cost was also found to have a major impact on FCC.
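
    To show how the time value of money enters an FCC comparison of this kind, the sketch below levelizes the same cash flows with and without discounting. The cash flows and discount rate are illustrative inventions, not data from the paper.

```python
# Levelized unit cost with and without discounting. All numbers are toy values
# for a hypothetical once-through fuel batch.
def levelized_cost(costs, energy, r):
    """Levelized unit cost: discounted costs / discounted energy output."""
    pv_cost = sum(c / (1 + r) ** t for t, c in enumerate(costs))
    pv_energy = sum(e / (1 + r) ** t for t, e in enumerate(energy))
    return pv_cost / pv_energy

# Fuel purchased up front (year 0); electricity delivered over four years
costs = [100.0, 5.0, 5.0, 5.0, 5.0]     # M$ per year
energy = [0.0, 10.0, 10.0, 10.0, 10.0]  # TWh per year

no_discount = levelized_cost(costs, energy, 0.0)
discounted = levelized_cost(costs, energy, 0.05)
print(no_discount, discounted)  # discounting raises the unit cost here,
                                # since the costs lead the energy in time
```

    Because the relative timing of costs and output differs between fuel cycles, including or omitting discounting can shift absolute FCC values even when, as the paper found, the cycle-to-cycle ratios stay fairly stable.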

  13. Mapping and uncertainty analysis of energy and pitch angle phase space in the DIII-D fast ion loss detector

    SciTech Connect (OSTI)

    Pace, D. C.; Fisher, R. K.; Van Zeeland, M. A.; Pipes, R.

    2014-11-15

    New phase space mapping and uncertainty analysis of energetic ion loss data in the DIII-D tokamak provides experimental results that serve as valuable constraints in first-principles simulations of energetic ion transport. Beam ion losses are measured by the fast ion loss detector (FILD) diagnostic system, consisting of two magnetic spectrometers placed independently along the outer wall. Monte Carlo simulations of mono-energetic and single-pitch ions reaching the FILDs are used to determine the expected uncertainty in the measurements. Modeling shows that the variation in gyrophase of 80 keV beam ions at the FILD aperture can produce an apparent measured energy signature spanning 50-140 keV. These calculations compare favorably with experiments in which neutral beam prompt loss provides a well-known energy and pitch distribution.

  14. Considerations for sensitivity analysis, uncertainty quantification, and data assimilation for grid-to-rod fretting

    SciTech Connect (OSTI)

    Michael Pernice

    2012-10-01

    Grid-to-rod fretting is the leading cause of fuel failures in pressurized water reactors, and is one of the challenge problems being addressed by the Consortium for Advanced Simulation of Light Water Reactors to guide its efforts to develop a virtual reactor environment. Prior and current efforts in modeling and simulation of grid-to-rod fretting are discussed. Sources of uncertainty in grid-to-rod fretting are also described.

  15. The IAEA Coordinated Research Program on HTGR Reactor Physics, Thermal-hydraulics and Depletion Uncertainty Analysis: Description of the Benchmark Test Cases and Phases

    SciTech Connect (OSTI)

    Frederik Reitsma; Gerhard Strydom; Bismark Tyobeka; Kostadin Ivanov

    2012-10-01

    The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of design and safety features with reliable high-fidelity physics models and robust, efficient, and accurate codes. The uncertainties in HTGR analysis tools are today typically assessed with sensitivity analysis: a few important input uncertainties (typically identified through a PIRT process) are varied to find the spread in a parameter of importance. However, a more fundamental approach is desirable to determine the predictive capability and accuracy of the coupled neutronics/thermal-hydraulics and depletion simulations used for reactor design and safety assessment. Today there is broader acceptance of uncertainty analysis even in safety studies, and regulators have in some cases accepted it as a replacement for traditional conservative analysis. There is also a renewed focus on supplying reliable covariance data (nuclear data uncertainties) that can then be used in uncertainty methods. Uncertainty and sensitivity studies are therefore becoming an essential component of any significant effort in data and simulation improvement. To address uncertainty in analysis and methods in the HTGR community, the IAEA launched a Coordinated Research Project (CRP) on HTGR Uncertainty Analysis in Modelling early in 2012. The project builds on the experience of the OECD/NEA Light Water Reactor (LWR) Uncertainty Analysis in Best-Estimate Modelling (UAM) benchmark activity, but focuses specifically on the peculiarities of HTGR designs and their simulation requirements. Two benchmark problems were defined: the prismatic type is represented by the MHTGR-350 design from General Atomics (GA), and a 250 MW modular pebble bed design, similar to the INET (China) and indirect-cycle PBMR (South Africa) designs, is also included. The paper provides more detail on the benchmark cases, the specific phases and tasks, and the latest status.

  16. Analysis of Variability and Uncertainty in Wind Power Forecasting: An International Comparison: Preprint

    SciTech Connect (OSTI)

    Zhang, J.; Hodge, B. M.; Gomez-Lazaro, E.; Lovholm, A. L.; Berge, E.; Miettinen, J.; Holttinen, H.; Cutululis, N.; Litong-Palima, M.; Sorensen, P.; Dobschinski, J.

    2013-10-01

    One of the critical challenges of wind power integration is the variable and uncertain nature of the resource. This paper investigates the variability and uncertainty in wind forecasting for multiple power systems in six countries. An extensive comparison of wind forecasting is performed among the six power systems by analyzing the following scenarios: (i) wind forecast errors throughout a year; (ii) forecast errors at a specific time of day throughout a year; (iii) forecast errors at peak and off-peak hours of a day; (iv) forecast errors in different seasons; (v) extreme forecasts with large overforecast or underforecast errors; and (vi) forecast errors when wind power generation is at different percentages of the total wind capacity. The kernel density estimation method is adopted to characterize the distribution of forecast errors. The results show that the level of uncertainty and the forecast error distribution vary among different power systems and scenarios. In addition, for most power systems, (i) there is a tendency to underforecast in winter; and (ii) the forecasts in winter generally have more uncertainty than the forecasts in summer.
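
    The kernel density estimation step described above can be sketched directly. The Gaussian kernel, bandwidth, and synthetic error sample below are illustrative assumptions, not the paper's data or method details.

```python
# Gaussian KDE sketch for characterizing a forecast-error distribution.
# The error sample is synthetic; the bandwidth choice is illustrative.
import numpy as np

def gaussian_kde(samples, grid, bandwidth):
    """Evaluate a Gaussian kernel density estimate of `samples` on `grid`."""
    z = (grid[:, None] - samples[None, :]) / bandwidth
    kernel = np.exp(-0.5 * z**2) / np.sqrt(2.0 * np.pi)
    return kernel.sum(axis=1) / (len(samples) * bandwidth)

rng = np.random.default_rng(0)
errors = rng.normal(-0.02, 0.10, size=2000)  # synthetic normalized errors
grid = np.linspace(-0.5, 0.5, 201)
density = gaussian_kde(errors, grid, bandwidth=0.03)

# A density estimate should integrate to ~1 over a wide enough grid.
area = density.sum() * (grid[1] - grid[0])
```

    Comparing such densities across systems or seasons (e.g., winter vs. summer errors) is the kind of analysis the paper reports.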

  17. Draft regulatory analysis: Notice of proposed rulemaking for the allocation and pricing of gasohol

    SciTech Connect (OSTI)

    None,

    1980-05-01

    The three principal problem areas addressed are: how to price unleaded blend stock and gasohol; how blenders are to obtain unleaded blend stock to blend with ethanol to produce gasohol; and how gasohol suppliers may distribute gasohol to purchasers. The proposed pricing and allocation rules, if adopted as final rules, would be in effect for about a year, because the statutory authority for gasoline price and allocation controls expires on September 30, 1981. The principal issues addressed are: what volume of ethanol and gasohol production can be expected between now and the end of 1981; what prices these products are likely to reach, independent of the rule and its alternative; what effect the rule and its alternative may have on the price and distribution of ethanol and gasohol; and what effect the rule and its alternative may have on motor vehicle misfueling and competition in the motor gasoline industry. On supply issues, it is concluded that by December 1981, ethanol and gasohol production should increase by a factor of 3 or 4 above present levels, enough to meet the President's goals without requiring additional corn acreage or adversely affecting food production. Ethanol production should increase from its present level of about 92 million gallons per year (6,062 B/D) to the 300, 400, and 700 million gallon per year levels (20,000, 30,000, and 45,000 B/D) necessary to produce gasohol at year-end rates of 200,000 B/D in 1980, 300,000 B/D in 1981, and 450,000 B/D in 1982. In 1980 gasohol will represent about 3.2 percent of the total gasoline market and 7.9 percent of the total unleaded market. Gasohol should help extend, rather than adversely affect, unleaded supplies. 30 references, 8 tables.

  18. Fukushima Daiichi Unit 1 Accident Progression Uncertainty Analysis and Implications for Decommissioning of Fukushima Reactors - Volume I.

    SciTech Connect (OSTI)

    Gauntt, Randall O.; Mattie, Patrick D.

    2016-01-01

    Sandia National Laboratories (SNL) has conducted an uncertainty analysis (UA) on the Fukushima Daiichi Unit 1 (1F1) accident progression with the MELCOR code. The model used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). That study focused on reconstructing the accident progression, as constrained by the limited plant data. This work focused on evaluating uncertainty in core damage progression behavior and its effect on key figures of merit (e.g., hydrogen production, reactor damage state, fraction of intact fuel, vessel lower head failure). The primary intent of this study was to characterize the range of predicted damage states in the 1F1 reactor considering state-of-knowledge uncertainties associated with MELCOR modeling of core damage progression, and to generate information that may be useful in informing the decommissioning activities that will be employed to defuel the damaged reactors at the Fukushima Daiichi Nuclear Power Plant. Additionally, core damage progression variability inherent in MELCOR modeling numerics is investigated.

  19. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis :

    SciTech Connect (OSTI)

    Adams, Brian M.; Ebeida, Mohamed Salah; Eldred, Michael S.; Jakeman, John Davis; Swiler, Laura Painton; Stephens, John Adam; Vigil, Dena M.; Wildey, Timothy Michael; Bohnhoff, William J.; Eddy, John P.; Hu, Kenneth T.; Dalbey, Keith R.; Bauman, Lara E; Hough, Patricia Diane

    2014-05-01

    The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.

  20. Measuring and Explaining Electricity Price Changes in Restructured States

    SciTech Connect (OSTI)

    Fagan, Mark L.

    2006-06-15

    An effort to determine the effect of restructuring on prices finds that, on average, prices for industrial customers in restructured states were lower, relative to predicted prices, than prices for industrial customers in non-restructured states. This preliminary analysis also finds that these price changes are explained primarily by high pre-restructuring prices, not whether or not a state restructured. (author)

  1. Systematic uncertainties associated with the cosmological analysis of the first Pan-STARRS1 type Ia supernova sample

    SciTech Connect (OSTI)

    Scolnic, D.; Riess, A.; Brout, D.; Rodney, S. [Department of Physics and Astronomy, Johns Hopkins University, 3400 North Charles Street, Baltimore, MD 21218 (United States); Rest, A. [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States); Huber, M. E.; Tonry, J. L. [Institute for Astronomy, University of Hawaii, 2680 Woodlawn Drive, Honolulu, HI 96822 (United States); Foley, R. J.; Chornock, R.; Berger, E.; Soderberg, A. M.; Stubbs, C. W.; Kirshner, R. P.; Challis, P.; Czekala, I.; Drout, M. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Narayan, G. [Department of Physics, Harvard University, 17 Oxford Street, Cambridge, MA 02138 (United States); Smartt, S. J.; Botticella, M. T. [Astrophysics Research Centre, School of Mathematics and Physics, Queens University Belfast, Belfast BT7 1NN (United Kingdom); Schlafly, E. [Max Planck Institute for Astronomy, Konigstuhl 17, D-69117 Heidelberg (Germany); and others

    2014-11-01

    We probe the systematic uncertainties from the 113 Type Ia supernovae (SN Ia) in the Pan-STARRS1 (PS1) sample along with 197 SN Ia from a combination of low-redshift surveys. The companion paper by Rest et al. describes the photometric measurements and cosmological inferences from the PS1 sample. The largest systematic uncertainty stems from the photometric calibration of the PS1 and low-z samples. We increase the sample of observed Calspec standards used to define the PS1 calibration system from 7 to 10. The PS1 and SDSS-II calibration systems are compared and discrepancies of up to ~0.02 mag are recovered. We find that uncertainties in the proper treatment of intrinsic colors and reddening produce differences in the recovered value of w of up to 3%. We estimate masses of the host galaxies of PS1 supernovae and detect an insignificant difference of 0.037 +/- 0.031 mag in the distance residuals of the full sample between host galaxies with high and low masses. Assuming flatness and including systematic uncertainties in our analysis of only the SNe measurements, we find w = -1.120 +0.360/-0.206 (stat) +0.269/-0.291 (sys). With additional constraints from baryon acoustic oscillation (BAO), cosmic microwave background (CMB; Planck), and H_0 measurements, we find w = -1.166 +0.072/-0.069 and Omega_m = 0.280 +0.013/-0.012 (statistical and systematic errors added in quadrature). The significance of the inconsistency with w = -1 depends on whether we use Planck or Wilkinson Microwave Anisotropy Probe measurements of the CMB: w(BAO+H0+SN+WMAP) = -1.124 +0.083/-0.065.
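
    The constrained result quotes statistical and systematic errors added in quadrature. The arithmetic for combining two error components this way, applied separately to the upper and lower bars and illustrated with the stat/sys components quoted for the SNe-only w fit, is simply:

```python
# Combining statistical and systematic uncertainties in quadrature,
# applied separately to the upper and lower error bars. Inputs are the
# stat/sys components quoted in the abstract for the SNe-only w fit.
import math

def quadrature(stat, sys):
    return math.sqrt(stat**2 + sys**2)

upper = quadrature(0.360, 0.269)  # combined +stat, +sys component
lower = quadrature(0.206, 0.291)  # combined -stat, -sys component
```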

  2. nCTEQ15 - Global analysis of nuclear parton distributions with uncertainties

    SciTech Connect (OSTI)

    Kusina, A.; Jezo, T.; Clark, D. B.; Keppel, Cynthia; Lyonnet, F.; Morfin, Jorge; Olness, F. I.; Owens, Jeff; Schienbein, I.

    2015-09-01

    We present the first official release of the nCTEQ nuclear parton distribution functions with errors. The main addition to the previous nCTEQ PDFs is the introduction of PDF uncertainties based on the Hessian method. Another important addition is the inclusion of pion production data from RHIC that give us a handle on constraining the gluon PDF. This contribution summarizes our results from arXiv:1509.00792 and concentrates on the comparison with other groups providing nuclear parton distributions.

  3. Analysis of sampling plan options for tank 16H from the perspective of statistical uncertainty

    SciTech Connect (OSTI)

    Shine, E. P.

    2013-02-28

    This report develops a concentration variability model for Tank 16H in order to compare candidate sampling plans for assessing the concentrations of analytes in the residual material in the annulus and on the floor of the primary vessel. A concentration variability model is used to compare candidate sampling plans based on the expected upper 95% confidence limit (UCL95) for the mean. The result is expressed as a rank order of candidate sampling plans from lowest to highest expected UCL95, with the lowest being the most desirable from an uncertainty perspective.
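
    The UCL95 figure of merit can be sketched in a few lines. The one-sided critical value 1.645 below is a large-sample normal approximation, and the sample values are placeholders; the report's concentration variability model is not reproduced here.

```python
# Upper 95% confidence limit (UCL95) on a mean, the quantity used to
# rank candidate sampling plans. Uses a one-sided z = 1.645 normal
# approximation; sample values are illustrative placeholders.
import math

def ucl95(samples):
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)
    return mean + 1.645 * math.sqrt(var / n)

# More samples shrink the standard error, pulling UCL95 toward the mean,
# which is why plan comparisons reward additional sampling.
few = ucl95([1.0, 1.2, 0.9, 1.1])
many = ucl95([1.0, 1.2, 0.9, 1.1] * 5)  # same spread, five times the samples
```

    Ranking candidate plans by their expected UCL95, lowest first, mirrors the comparison the report performs.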

  4. Electromagnetic form factors of the nucleon: New fit and analysis of uncertainties

    SciTech Connect (OSTI)

    Alberico, W. M.; Giunti, C.; Bilenky, S. M.; Graczyk, K. M.

    2009-06-15

    Electromagnetic form factors of the proton and neutron, obtained from a new fit of data, are presented. The proton form factors are obtained from a simultaneous fit to the ratio μ_p G_Ep/G_Mp determined from polarization transfer measurements and to ep elastic cross section data. Phenomenological two-photon exchange corrections are taken into account. The present fit for protons was performed in the kinematical region Q² ∈ (0, 6) GeV². For both protons and neutrons we use the latest available data. For all form factors, the uncertainties and correlations of the form factor parameters are investigated with the χ² method.
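
    As an illustration of the χ² method for parameter uncertainties (not the paper's parametrization, corrections, or data), one can fit a simple dipole form factor to synthetic data and read a 1σ interval off the Δχ² = 1 contour:

```python
# Chi-square fit of a dipole form factor G(Q2) = (1 + Q2/L2)^-2 to
# synthetic data, with the 1-sigma parameter uncertainty taken from the
# delta-chi2 <= 1 interval. Model, data, and errors are all illustrative.
import numpy as np

def dipole(q2, lam2):
    return 1.0 / (1.0 + q2 / lam2) ** 2

rng = np.random.default_rng(1)
q2 = np.linspace(0.1, 6.0, 30)   # GeV^2, mimicking the fit range
sigma = 0.01                      # assumed uniform measurement error
data = dipole(q2, 0.71) + rng.normal(0.0, sigma, q2.size)

lam2_grid = np.linspace(0.5, 1.0, 2001)
chi2 = np.array([np.sum(((data - dipole(q2, l)) / sigma) ** 2)
                 for l in lam2_grid])
best = lam2_grid[chi2.argmin()]
inside = lam2_grid[chi2 <= chi2.min() + 1.0]   # delta-chi2 <= 1 region
err = 0.5 * (inside[-1] - inside[0])           # symmetrized 1-sigma error
```

    Parameter correlations, which the paper also reports, would come from the curvature of χ² in more than one parameter.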

  5. REVIEW OF MECHANISTIC UNDERSTANDING AND MODELING AND UNCERTAINTY ANALYSIS METHODS FOR PREDICTING CEMENTITIOUS BARRIER PERFORMANCE

    SciTech Connect (OSTI)

    Langton, C.; Kosson, D.

    2009-11-30

    Cementitious barriers for nuclear applications are one of the primary controls for preventing or limiting radionuclide release into the environment. At the present time, performance and risk assessments do not fully incorporate the effectiveness of engineered barriers because the processes that influence performance are coupled and complicated. A better understanding of the behavior of cementitious barriers is necessary to evaluate and improve the design of materials and structures used for radioactive waste containment, life extension of current nuclear facilities, and design of future nuclear facilities, including those needed for nuclear fuel storage and processing, nuclear power production, and waste management. The focus of the Cementitious Barriers Partnership (CBP) literature review is to document the current level of knowledge with respect to: (1) mechanisms and processes that directly influence the performance of cementitious materials; (2) methodologies for modeling the performance of these mechanisms and processes; and (3) approaches to addressing and quantifying uncertainties associated with performance predictions. It will serve as an important reference document for the professional community responsible for the design and performance assessment of cementitious materials in nuclear applications. This review also provides a multi-disciplinary foundation for identification, research, development, and demonstration of improvements in conceptual understanding, measurements, and performance modeling that would lead to significant reductions in the uncertainties and improved confidence in estimating the long-term performance of cementitious materials in nuclear applications. This report identifies: (1) technology gaps that may be filled by the CBP project and (2) information and computational methods that are currently being applied in related fields but have not yet been incorporated into performance assessments of cementitious barriers.

  6. An Analysis of the Effects of Residential Photovoltaic Energy Systems on Home Sales Prices in California

    SciTech Connect (OSTI)

    Hoen, Ben; Cappers, Peter; Wiser, Ryan; Thayer, Mark

    2011-04-19

    An increasing number of homes in the U.S. have sold with photovoltaic (PV) energy systems installed at the time of sale, yet relatively little research exists that estimates the marginal impacts of those PV systems on home sale prices. A clearer understanding of these possible impacts might influence the decisions of homeowners considering the installation of a PV system, homebuyers considering the purchase of a home with PV already installed, and new home builders considering including PV as an optional or standard product on their homes. This research analyzes a large dataset of California homes that sold from 2000 through mid-2009 with PV installed. It finds strong evidence that homes with PV systems sold for a premium over comparable homes without PV systems during this time frame. Estimates for this premium expressed in dollars per watt of installed PV range, on average, from roughly $4 to $5.5/watt across a large number of hedonic and repeat sales model specifications and robustness tests. When expressed as a ratio of the sales price premium of PV to estimated annual energy cost savings associated with PV, an average ratio of 14:1 to 19:1 can be calculated; these results are consistent with those of the more-extensive existing literature on the impact of energy efficiency on sales prices. When the data are split among new and existing homes, however, PV system premiums are markedly affected. New homes with PV show premiums of $2.3-2.6/watt, while existing homes with PV show premiums of more than $6/watt. Reasons for this discrepancy are suggested, yet further research is warranted. A number of other areas where future research would be useful are also highlighted.

  7. Signal discovery, limits, and uncertainties with sparse on/off measurements: an objective bayesian analysis

    SciTech Connect (OSTI)

    Knoetig, Max L., E-mail: mknoetig@phys.ethz.ch [Institute for Particle Physics, ETH Zurich, 8093 Zurich (Switzerland)

    2014-08-01

    For decades, researchers have studied the On/Off counting problem, in which a measured rate consists of two parts: one due to a signal process and the other due to a background process, the magnitudes of both being unknown. While most frequentist methods are adequate for large number counts, they cannot be applied to sparse data. Here, I present a new objective Bayesian solution that depends on only three parameters: the number of events in the signal region, the number of events in the background region, and the ratio of the exposures for the two regions. First, the probability of the counts being due to background alone is derived analytically. Second, the marginalized posterior for the signal parameter is also derived analytically. With this two-step approach it is easy to calculate the signal's significance, strength, uncertainty, or upper limit in a unified way. The approach is valid without restriction for any number count, including zero, and may be widely applied in particle physics, cosmic-ray physics, and high-energy astrophysics. To demonstrate its performance, I apply the method to gamma-ray burst data.
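
    The paper's solution is analytic; as a rough numerical sketch of the same setup (N_on ~ Poisson(s + αb), N_off ~ Poisson(b), flat priors), one can marginalize the background on a grid. This is an illustration, not the paper's closed-form posterior, and the counts below are invented.

```python
# Grid-based marginal posterior for the On/Off problem with flat priors:
# n_on ~ Poisson(s + alpha*b), n_off ~ Poisson(b). This numerical
# marginalization is an illustration, not the paper's analytic solution.
import numpy as np
from math import lgamma

def poisson_pmf(k, mu):
    # mu must be strictly positive on the grids used below
    return np.exp(k * np.log(mu) - mu - lgamma(k + 1))

def signal_posterior(n_on, n_off, alpha, s_grid, b_grid):
    """Marginal posterior density p(s | n_on, n_off) evaluated on s_grid."""
    S, B = np.meshgrid(s_grid, b_grid, indexing="ij")
    joint = poisson_pmf(n_on, S + alpha * B) * poisson_pmf(n_off, B)
    post = joint.sum(axis=1) * (b_grid[1] - b_grid[0])    # integrate out b
    return post / (post.sum() * (s_grid[1] - s_grid[0]))  # normalize in s

s_grid = np.linspace(1e-3, 30.0, 600)
b_grid = np.linspace(1e-3, 30.0, 600)
post = signal_posterior(n_on=10, n_off=6, alpha=1.0,
                        s_grid=s_grid, b_grid=b_grid)
mode = s_grid[post.argmax()]  # peaks near n_on - alpha * n_off
```

    Significance, point estimate, uncertainty, and upper limits can all be read from this one posterior, which is the unification the abstract describes.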

  8. Approximate option pricing

    SciTech Connect (OSTI)

    Chalasani, P.; Saias, I.; Jha, S.

    1996-04-08

    As increasingly large volumes of sophisticated options (called derivative securities) are traded in world financial markets, determining a fair price for these options has become an important and difficult computational problem. Many valuation codes use the binomial pricing model, in which the stock price is driven by a random walk. In this model, the value of an n-period option on a stock is the expected time-discounted value of the future cash flow on an n-period stock price path. Path-dependent options are particularly difficult to value since the future cash flow depends on the entire stock price path rather than on just the final stock price. Currently such options are approximately priced by Monte Carlo methods with error bounds that hold only with high probability and which are reduced by increasing the number of simulation runs. In this paper the authors show that pricing an arbitrary path-dependent option is #P-hard. They show that certain types of path-dependent options can be valued exactly in polynomial time. Asian options are path-dependent options that are particularly hard to price, and for these they design deterministic polynomial-time approximate algorithms. They show that the value of a perpetual American put option (which can be computed in constant time) is in many cases a good approximation to the value of an otherwise identical n-period American put option. In contrast to Monte Carlo methods, the algorithms have guaranteed error bounds that are polynomially small (and in some cases exponentially small) in the maturity n. For the error analysis they derive large-deviation results for random walks that may be of independent interest.
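
    The binomial pricing model the abstract builds on can be sketched for the easy, path-independent case (a European put); the model parameters below are illustrative.

```python
# n-period binomial valuation sketch: the stock price follows a
# multiplicative random walk, and the option value is the discounted
# risk-neutral expectation of the terminal payoff. Parameters are
# illustrative. Path-dependent payoffs (the hard case in the abstract)
# would require summing over entire price paths, not just endpoints.
from math import comb

def binomial_european_put(s0, strike, up, down, r, n):
    q = (1 + r - down) / (up - down)   # risk-neutral up-move probability
    value = 0.0
    for k in range(n + 1):             # k = number of up moves in n periods
        s_t = s0 * up**k * down**(n - k)
        payoff = max(strike - s_t, 0.0)
        value += comb(n, k) * q**k * (1 - q)**(n - k) * payoff
    return value / (1 + r) ** n

price = binomial_european_put(s0=100.0, strike=100.0,
                              up=1.1, down=0.9, r=0.02, n=3)
```

    For a path-dependent (e.g., Asian) payoff the sum would run over all 2^n paths, which is why exact valuation becomes intractable and approximation algorithms matter.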

  9. Risk-Informed Safety Margin Characterization (RISMC): Integrated Treatment of Aleatory and Epistemic Uncertainty in Safety Analysis

    SciTech Connect (OSTI)

    R. W. Youngblood

    2010-10-01

    The concept of “margin” has a long history in nuclear licensing and in the codification of good engineering practices. However, some traditional applications of “margin” have been carried out for surrogate scenarios (such as design basis scenarios), without regard to the actual frequencies of those scenarios, and have been carried out in a systematically conservative fashion. This means that the effectiveness of the application of the margin concept is determined in part by the original choice of surrogates, and is limited in any case by the degree of conservatism imposed on the evaluation. In the RISMC project, which is part of the Department of Energy’s “Light Water Reactor Sustainability Program” (LWRSP), we are developing a risk-informed characterization of safety margin. Beginning with the traditional discussion of “margin” in terms of a “load” (a physical challenge to system or component function) and a “capacity” (the capability of that system or component to accommodate the challenge), we are developing the capability to characterize probabilistic load and capacity spectra, reflecting both aleatory and epistemic uncertainty in system response. For example, the probabilistic load spectrum will reflect the frequency of challenges of a particular severity. Such a characterization is required if decision-making is to be informed optimally. However, in order to enable the quantification of probabilistic load spectra, existing analysis capability needs to be extended. Accordingly, the INL is working on a next-generation safety analysis capability whose design will allow for much more efficient parameter uncertainty analysis, and will enable a much better integration of reliability-related and phenomenology-related aspects of margin.

  10. Columbia University flow instability experimental program: Volume 2. Single tube uniformly heated tests -- Part 2: Uncertainty analysis and data

    SciTech Connect (OSTI)

    Dougherty, T.; Maciuca, C.; McAssey, E.V. Jr.; Reddy, D.G.; Yang, B.W.

    1990-05-01

    In June 1988, Savannah River Laboratory requested that the Heat Transfer Research Facility modify the flow excursion program, which had been in progress since November 1987, to include testing of single tubes in vertical down-flow over a range of length to diameter (L/D) ratios of 100 to 500. The impetus for the request was the desire to obtain experimental data as quickly as possible for code development work. In July 1988, HTRF submitted a proposal to SRL indicating that by modifying a facility already under construction the data could be obtained within three to four months. In January 1990, HTRF issued report CU-HTRF-T4, part 1. This report contained the technical discussion of the results from the single tube uniformly heated tests. The present report is part 2 of CU-HTRF-T4, which contains further discussion of the uncertainty analysis and the complete set of data.

  11. HOW TO DEAL WITH WASTE ACCEPTANCE UNCERTAINTY USING THE WASTE ACCEPTANCE CRITERIA FORECASTING AND ANALYSIS CAPABILITY SYSTEM (WACFACS)

    SciTech Connect (OSTI)

    Redus, K. S.; Hampshire, G. J.; Patterson, J. E.; Perkins, A. B.

    2002-02-25

    The Waste Acceptance Criteria Forecasting and Analysis Capability System (WACFACS) is used to plan for, evaluate, and control the supply of approximately 1.8 million yd3 of low-level radioactive, TSCA, and RCRA hazardous wastes from over 60 environmental restoration projects between FY02 through FY10 to the Oak Ridge Environmental Management Waste Management Facility (EMWMF). WACFACS is a validated decision support tool that propagates uncertainties inherent in site-related contaminant characterization data, disposition volumes during EMWMF operations, and project schedules to quantitatively determine the confidence that risk-based performance standards are met. Trade-offs in schedule, volumes of waste lots, and allowable concentrations of contaminants are performed to optimize project waste disposition, regulatory compliance, and disposal cell management.

  12. Natural Gas Wellhead Price

    U.S. Energy Information Administration (EIA) Indexed Site

    Pipeline and Distribution Use Price; City Gate Price; Residential Price; Commercial Price; Industrial Price; Vehicle Fuel Price; Electric Power Price. For the residential, commercial, and industrial series, the percentage of total deliveries included in prices is also reported. Monthly and annual series histories are available.

  13. Sandia Energy - Price Premiums for Solar Home Sales

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


  14. Utility-Scale Solar 2014. An Empirical Analysis of Project Cost, Performance, and Pricing Trends in the United States

    SciTech Connect (OSTI)

    Bolinger, Mark; Seel, Joachim

    2015-09-01

    Other than the nine Solar Energy Generation Systems (“SEGS”) parabolic trough projects built in the 1980s, virtually no large-scale or “utility-scale” solar projects – defined here to include any ground-mounted photovoltaic (“PV”), concentrating photovoltaic (“CPV”), or concentrating solar thermal power (“CSP”) project larger than 5 MW (AC) – existed in the United States prior to 2007. By 2012 – just five years later – utility-scale had become the largest sector of the overall PV market in the United States, a distinction that was repeated in both 2013 and 2014 and that is expected to continue for at least the next few years. Over this same short period, CSP also experienced a bit of a renaissance in the United States, with a number of large new parabolic trough and power tower systems – some including thermal storage – achieving commercial operation. With this critical mass of new utility-scale projects now online and in some cases having operated for a number of years (generating not only electricity, but also empirical data that can be mined), the rapidly growing utility-scale sector is ripe for analysis. This report, the third edition in an ongoing annual series, meets this need through in-depth, annually updated, data-driven analysis of not just installed project costs or prices – i.e., the traditional realm of solar economics analyses – but also operating costs, capacity factors, and power purchase agreement (“PPA”) prices from a large sample of utility-scale solar projects in the United States. Given its current dominance in the market, utility-scale PV also dominates much of this report, though data from CPV and CSP projects are presented where appropriate.

  15. Crude Oil and Gasoline Price Monitoring

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    What drives crude oil prices? July 12, 2016 | Washington, DC An analysis of 7 factors that influence oil markets, with chart data updated monthly and quarterly price per barrel ...

  16. Development, sensitivity analysis, and uncertainty quantification of high-fidelity arctic sea ice models.

    SciTech Connect (OSTI)

    Peterson, Kara J.; Bochev, Pavel Blagoveston; Paskaleva, Biliana S.

    2010-09-01

    Arctic sea ice is an important component of the global climate system and due to feedback effects the Arctic ice cover is changing rapidly. Predictive mathematical models are of paramount importance for accurate estimates of the future ice trajectory. However, the sea ice components of Global Climate Models (GCMs) vary significantly in their prediction of the future state of Arctic sea ice and have generally underestimated the rate of decline in minimum sea ice extent seen over the past thirty years. One of the contributing factors to this variability is the sensitivity of the sea ice to model physical parameters. A new sea ice model that has the potential to improve sea ice predictions incorporates an anisotropic elastic-decohesive rheology and dynamics solved using the material-point method (MPM), which combines Lagrangian particles for advection with a background grid for gradient computations. We evaluate the variability of the Los Alamos National Laboratory CICE code and the MPM sea ice code for a single year simulation of the Arctic basin using consistent ocean and atmospheric forcing. Sensitivities of ice volume, ice area, ice extent, root mean square (RMS) ice speed, central Arctic ice thickness, and central Arctic ice speed with respect to ten different dynamic and thermodynamic parameters are evaluated both individually and in combination using the Design Analysis Kit for Optimization and Terascale Applications (DAKOTA). We find similar responses for the two codes and some interesting seasonal variability in the strength of the parameters on the solution.
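
    One-at-a-time parameter sensitivities of the kind evaluated here can be sketched generically with finite differences. The toy model and parameter names below are invented for illustration and have nothing to do with the CICE or MPM codes or with DAKOTA internals.

```python
# One-at-a-time forward-difference sensitivities of a model output with
# respect to its parameters. The toy model and parameter names are
# purely illustrative, not the sea ice codes or DAKOTA.
def sensitivities(model, params, rel_step=0.01):
    """Return d(output)/d(param) estimates via forward differences."""
    base = model(params)
    out = {}
    for name, value in params.items():
        bumped = dict(params)
        bumped[name] = value * (1 + rel_step)
        out[name] = (model(bumped) - base) / (value * rel_step)
    return out

def toy_ice_volume(p):  # hypothetical smooth response surface
    return p["albedo"] ** 2 + 3.0 * p["drag"]

sens = sensitivities(toy_ice_volume, {"albedo": 0.6, "drag": 0.002})
```

    Frameworks like DAKOTA automate exactly this kind of parameter sweep (and richer designs) around an expensive simulation in place of the toy function.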

  17. CASL L1 Milestone report : CASL.P4.01, sensitivity and uncertainty analysis for CIPS with VIPRE-W and BOA.

    SciTech Connect (OSTI)

    Sung, Yixing; Adams, Brian M.; Secker, Jeffrey R.

    2011-12-01

    The CASL Level 1 Milestone CASL.P4.01, successfully completed in December 2011, aimed to 'conduct, using methodologies integrated into VERA, a detailed sensitivity analysis and uncertainty quantification of a crud-relevant problem with baseline VERA capabilities (ANC/VIPRE-W/BOA).' The VUQ focus area led this effort, in partnership with AMA, and with support from VRI. DAKOTA was coupled to existing VIPRE-W thermal-hydraulics and BOA crud/boron deposit simulations representing a pressurized water reactor (PWR) that previously experienced crud-induced power shift (CIPS). This work supports understanding of CIPS by exploring the sensitivity and uncertainty in BOA outputs with respect to uncertain operating and model parameters. This report summarizes work coupling the software tools, characterizing uncertainties, and analyzing the results of iterative sensitivity and uncertainty studies. These studies focused on sensitivity and uncertainty of CIPS indicators calculated by the current version of the BOA code used in the industry. Challenges with this kind of analysis are identified to inform follow-on research goals and VERA development targeting crud-related challenge problems.

  18. Technology choice in a least-cost expansion analysis framework: The impact of gas prices, planning horizon, and system characteristics

    SciTech Connect (OSTI)

    Guziel, K.A.; South, D.W.

    1990-01-01

    The current outlook for new capacity additions by electric utilities is uncertain and tenuous. Regardless of the amount, it is inevitable that new capacity will be needed in the 1990s and beyond. The fundamental question about the additional capacity requirements centers on technology choice and the factors influencing the decision process. We examined technology choices in 10 representative power pools with a dynamic optimization expansion model, the Wien Automatic System Planning (WASP) Package. These 10 power pools were determined to be representative on the basis of a cluster analysis conducted on all 26 power pools in the United States. A least-cost expansion plan was determined for each power pool with three candidate technologies--natural gas combustion turbine (CT), natural gas combined cycle (NGCC), and integrated gasification combined cycle (IGCC)--three alternative gas price tracks, and two planning horizons between the years 1995 and 2020. This paper summarizes the analysis framework and presents results for Power Pool 1, the American Electric Power (AEP) service territory. 7 refs., 9 figs., 1 tab.

  19. Uncertainty Analysis of Runoff Simulations and Parameter Identifiability in the Community Land Model Evidence from MOPEX Basins

    SciTech Connect (OSTI)

    Huang, Maoyi; Hou, Zhangshuan; Leung, Lai-Yung R.; Ke, Yinghai; Liu, Ying; Fang, Zhufeng; Sun, Yu

    2013-12-01

    With the emergence of earth system models as important tools for understanding and predicting climate change and implications to mitigation and adaptation, it has become increasingly important to assess the fidelity of the land component within earth system models to capture realistic hydrological processes and their response to the changing climate and quantify the associated uncertainties. This study investigates the sensitivity of runoff simulations to major hydrologic parameters in version 4 of the Community Land Model (CLM4) by integrating CLM4 with a stochastic exploratory sensitivity analysis framework at 20 selected watersheds from the Model Parameter Estimation Experiment (MOPEX) spanning a wide range of climate and site conditions. We found that for runoff simulations, the most significant parameters are those related to the subsurface runoff parameterizations. Soil texture related parameters and surface runoff parameters are of secondary significance. Moreover, climate and soil conditions play important roles in the parameter sensitivity. In general, site conditions within water-limited hydrologic regimes and with finer soil texture result in stronger sensitivity of output variables, such as runoff and its surface and subsurface components, to the input parameters in CLM4. This study demonstrated the feasibility of parameter inversion for CLM4 using streamflow observations to improve runoff simulations. By ranking the significance of the input parameters, we showed that the parameter set dimensionality could be reduced for CLM4 parameter calibration under different hydrologic and climatic regimes so that the inverse problem is less ill posed.

  20. DAKOTA, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis: version 4.0 reference manual

    SciTech Connect (OSTI)

    Griffin, Joshua D. (Sandia National Labs, Livermore, CA); Eldred, Michael Scott; Martinez-Canales, Monica L.; Watson, Jean-Paul; Kolda, Tamara Gibson; Adams, Brian M.; Swiler, Laura Painton; Williams, Pamela J.; Hough, Patricia Diane; Gay, David M.; Dunlavy, Daniel M.; Eddy, John P.; Hart, William Eugene; Giunta, Anthony A.; Brown, Shannon L.

    2006-10-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a reference manual for the commands specification for the DAKOTA software, providing input overviews, option descriptions, and example specifications.
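
    As a rough illustration of the sampling-based uncertainty quantification DAKOTA provides, the sketch below implements Latin hypercube sampling by hand and propagates the samples through a toy response function. The function names and the algebraic response are assumptions for illustration only; a real study would couple the sampler to a simulation code.

```python
import random
import statistics

def latin_hypercube(n_samples, n_dims, rng):
    """Draw one stratified sample per equal-probability bin in each
    dimension, shuffling bin order independently per dimension."""
    cols = []
    for _ in range(n_dims):
        perm = list(range(n_samples))
        rng.shuffle(perm)
        cols.append([(p + rng.random()) / n_samples for p in perm])
    return [tuple(col[i] for col in cols) for i in range(n_samples)]

def response(x, y):
    # Hypothetical response standing in for a coupled simulation code.
    return x ** 2 + 0.5 * y

rng = random.Random(42)
points = latin_hypercube(200, 2, rng)
outputs = [response(x, y) for x, y in points]
print(f"mean={statistics.mean(outputs):.3f}  stdev={statistics.stdev(outputs):.3f}")
```

    The stratification is the point of the design: with 200 samples the estimated mean lands very close to the analytic value (1/3 + 1/4), far tighter than plain random sampling of the same size would give.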

  1. Average Residential Price

    U.S. Energy Information Administration (EIA) Indexed Site

    Data Series: Average Residential Price Residential Price - Local Distribution Companies Residential Price - Marketers Residential % Sold by Local Distribution Companies Average Commercial Price Commercial Price - Local Distribution Companies Commercial Price - Marketers Commercial % Sold by Local Distribution Companies Period: Monthly Annual Download Series History Definitions, Sources & Notes Show Data By: Data Series Area 2010 2011

  2. FERC's acceptance of market-based pricing: An antitrust analysis. [Federal Energy Regulatory Commission

    SciTech Connect (OSTI)

    Harris, B.C.; Frankena, M.W. )

    1992-06-01

    In large part, FERC's determination of market power is based on an analysis that focuses on the ability of power suppliers to 'foreclose' other potential power suppliers by withholding transmission access to the buyer. The authors believe that this analysis is flawed because the conditions it considers are neither necessary nor sufficient for the existence of market power. That is, it is possible that market-based rates can be subject to market power even if no transmission supplier has the ability to foreclose some power suppliers; conversely, it is possible that no market power exists despite the ability to foreclose other suppliers. This paper provides a critical analysis of FERC's market-power determinations. The concept of market power is defined and its relationship to competition is discussed in Section 1, while a framework for evaluating the existence of market power is presented in Section 2. In Section 3, FERC's recent order in Terra Comfort is examined using this framework. A brief preview of FERC's order in TECO Power Services comprises Section 4. Overall conclusions are presented in Section 5.

  3. Understanding the Impact of Higher Corn Prices on Consumer Food Prices

    SciTech Connect (OSTI)

    none,

    2007-04-18

    In an effort to assess the true effects of higher corn prices, the National Corn Growers Association (NCGA) commissioned an analysis on the impact of increased corn prices on retail food prices. This paper summarizes key results of the study and offers additional analysis based on information from a variety of other sources.

  4. Optimization of Complex Energy System Under Uncertainty | Argonne

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Leadership Computing Facility On the left: This shows the Illinois power grid system overlaid on fields portraying electricity prices under a deterministic economic dispatch scenario. Dark blue areas have the lowest prices while red and yellow have the highest. Argonne National Laboratory researchers use a model of the Illinois grid to test algorithms for making power dispatch decisions under uncertainty. On the right: This shows electricity prices in Illinois under a stochastic economic

  5. Selling Into the Sun: Price Premium Analysis of a Multi-State Dataset of Solar Homes

    Broader source: Energy.gov [DOE]

    Homes with solar photovoltaic (PV) systems have multiplied in the United States recently, reaching more than half a million in 2014, in part due to plummeting PV costs and innovative financing options. As PV systems become an increasingly common feature of U.S. homes, the ability to assess the value of these homes appropriately will become increasingly important. At the same time, capturing the value of PV to homes will be important for facilitating a robust residential PV market. Appraisers and real estate agents have made strides toward valuing PV homes, and several limited studies have suggested the presence of PV home premiums; however, gaps remain in understanding these premiums for housing markets nationwide. To fill these gaps, researchers from Lawrence Berkeley National Laboratory (LBNL) and their collaborators from other institutions conducted the most comprehensive PV home premium analysis to date. The study more than doubles the number of PV home sales previously analyzed, examines transactions in eight states, and spans the years 2002–2013. The results impart confidence that PV consistently adds value across a variety of states, housing and PV markets, and home types.

  6. Natural Gas Citygate Price

    U.S. Energy Information Administration (EIA) Indexed Site

    Pipeline and Distribution Use Price Citygate Price Residential Price Commercial Price Industrial Price Vehicle Fuel Price Electric Power Price Proved Reserves as of 12/31 Reserves Adjustments Reserves Revision Increases Reserves Revision Decreases Reserves Sales Reserves Acquisitions Reserves Extensions Reserves New Field Discoveries New Reservoir Discoveries in Old Fields Estimated Production Number of Producing Gas Wells Gross Withdrawals Gross Withdrawals From Gas Wells Gross Withdrawals From

  7. Average Commercial Price

    U.S. Energy Information Administration (EIA) Indexed Site

    Pipeline and Distribution Use Price Citygate Price Residential Price Commercial Price Industrial Price Vehicle Fuel Price Electric Power Price Proved Reserves as of 12/31 Reserves Adjustments Reserves Revision Increases Reserves Revision Decreases Reserves Sales Reserves Acquisitions Reserves Extensions Reserves New Field Discoveries New Reservoir Discoveries in Old Fields Estimated Production Number of Producing Gas Wells Gross Withdrawals Gross Withdrawals From Gas Wells Gross Withdrawals From

  8. Average Residential Price

    U.S. Energy Information Administration (EIA) Indexed Site

    Pipeline and Distribution Use Price Citygate Price Residential Price Commercial Price Industrial Price Vehicle Fuel Price Electric Power Price Proved Reserves as of 12/31 Reserves Adjustments Reserves Revision Increases Reserves Revision Decreases Reserves Sales Reserves Acquisitions Reserves Extensions Reserves New Field Discoveries New Reservoir Discoveries in Old Fields Estimated Production Number of Producing Gas Wells Gross Withdrawals Gross Withdrawals From Gas Wells Gross Withdrawals From

  9. Reducing Transaction Costs for Energy Efficiency Investments and Analysis of Economic Risk Associated With Building Performance Uncertainties: Small Buildings and Small Portfolios Program

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Reducing Transaction Costs for Energy Efficiency Investments and Analysis of Economic Risk Associated With Building Performance Uncertainties Small Buildings and Small Portfolios Program Rois Langner, Bob Hendron, and Eric Bonnema Technical Report NREL/TP-5500-60976 August 2014 NREL is a national laboratory of the U.S. Department of Energy Office of Energy Efficiency & Renewable Energy Operated by the Alliance for Sustainable Energy, LLC This report is available at no cost from the National

  10. Price controls and international petroleum product prices

    SciTech Connect (OSTI)

    Deacon, R.T.; Mead, W.J.; Agarwal, V.B.

    1980-02-01

    The effects of Federal refined-product price controls upon the price of motor gasoline in the United States through 1977 are examined. A comparison of domestic and foreign gasoline prices is made, based on the prices of products actually moving in international trade. There is also an effort to ascribe US/foreign market price differentials to identifiable cost factors. Primary emphasis is on price comparisons at the wholesale level, although some retail comparisons are presented. The study also examines the extent to which product price controls are binding, and attempts to estimate what the price of motor gasoline would have been in the absence of controls. The time period under consideration is from 1969 through 1977, with primary focus on price relationships in 1970-1971 (just before US controls) and 1976-1977. The foreign-domestic comparisons are made with respect to four major US cities, namely, Boston, New York, New Orleans, and Los Angeles. 20 figures, 14 tables.

  11. MCNP6 Results for the Phase III Sensitivity Benchmark of the OECD/NEA Expert Group on Uncertainty Analysis for Criticality Safety Assessment

    SciTech Connect (OSTI)

    Kiedrowski, Brian C.

    2012-06-19

    Within the last decade, there has been increasing interest in the calculation of cross section sensitivity coefficients of k{sub eff} for integral experiment design and uncertainty analysis. The OECD/NEA has an Expert Group devoted to Sensitivity and Uncertainty Analysis within the Working Party for Nuclear Criticality Safety. This expert group has developed benchmarks to assess code capabilities and performance for doing sensitivity and uncertainty analysis. Phase III of a set of sensitivity benchmarks evaluates capabilities for computing sensitivity coefficients. MCNP6 has the capability to compute cross section sensitivities for k{sub eff} using continuous-energy physics. To help verify this capability, results for the Phase III benchmark cases are generated and submitted to the Expert Group for comparison. The Phase III benchmark has three cases: III.1, an array of MOX fuel pins; III.2, a series of infinite lattices of MOX fuel pins with varying pitches; and III.3, two spheres with homogeneous mixtures of UF{sub 4} and polyethylene with different enrichments.

  12. The Role of Uncertainty Quantification for Reactor Physics

    SciTech Connect (OSTI)

    Salvatores, Massimo; Palmiotti, Giuseppe; Aliberti, G.

    2015-01-01

    The quantification of uncertainties is a crucial step in design. Comparing a-priori uncertainties with the target accuracies makes it possible to define needs and priorities for uncertainty reduction. In view of their impact, the uncertainty analysis requires a reliability assessment of the uncertainty data used. The choice of the appropriate approach and the consistency of different approaches are discussed.

  13. Real Time Pricing as a Default or Optional Service for C&ICustomers: A Comparative Analysis of Eight Case Studies

    SciTech Connect (OSTI)

    Barbose, Galen; Goldman, Charles; Bharvirkar, Ranjit; Hopper,Nicole; Ting, Michael; Neenan, Bernie

    2005-08-01

    Demand response (DR) has been broadly recognized to be an integral component of well-functioning electricity markets, although currently underdeveloped in most regions. Among the various initiatives undertaken to remedy this deficiency, public utility commissions (PUC) and utilities have considered implementing dynamic pricing tariffs, such as real-time pricing (RTP), and other retail pricing mechanisms that communicate an incentive for electricity consumers to reduce their usage during periods of high generation supply costs or system reliability contingencies. Efforts to introduce DR into retail electricity markets confront a range of basic policy issues. First, a fundamental issue in any market context is how to organize the process for developing and implementing DR mechanisms in a manner that facilitates productive participation by affected stakeholder groups. Second, in regions with retail choice, policymakers and stakeholders face the threshold question of whether it is appropriate for utilities to offer a range of dynamic pricing tariffs and DR programs, or just ''plain vanilla'' default service. Although positions on this issue may be based primarily on principle, two empirical questions may have some bearing--namely, what level of price response can be expected through the competitive retail market, and whether establishing RTP as the default service is likely to result in an appreciable level of DR? Third, if utilities are to have a direct role in developing DR, what types of retail pricing mechanisms are most appropriate and likely to have the desired policy impact (e.g., RTP, other dynamic pricing options, DR programs, or some combination)? Given a decision to develop utility RTP tariffs, three basic implementation issues require attention. First, should it be a default or optional tariff, and for which customer classes? 
    Second, what type of tariff design is most appropriate, given prevailing policy objectives, wholesale market structure, and ratemaking

  14. Analysis of changes in OPEC's crude oil prices, current account, and surplus investments, with emphasis upon oil-revenue purchasing power - 1973 through 1980

    SciTech Connect (OSTI)

    Tadayon, S.

    1984-01-01

    The study sought to provide a comprehensive investigation of changes in the Organization of the Petroleum Exporting Countries (OPEC) crude oil prices, current-account balance, and current-account surplus investments abroad. The study emphasized analysis and, to some extent, quantification of the real value, or purchasing power, of OPEC oil revenues. The research approach was descriptive-elemental to expand upon characteristics of variables identified for the study. Research questions were answered by direct findings for each question. The method utilized for the study included document research and statistical analyses of the derived data. The aim was to obtain complete and accurate information. The study compiled documented data regarding OPEC's crude oil prices, current-account balance, and current-account surplus investments abroad and analyzed the purchasing power of oil revenues as time passed and events occurred over the eight years from 1973 through 1980.

  15. Price Quotes and Isotope Ordering

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Price Elasticities for Energy Use in Buildings of the United States October 2014 Independent Statistics & Analysis www.eia.gov U.S. Department of Energy Washington, DC 20585 U.S. Energy Information Administration | Price Elasticities for Energy Use in Buildings of the United States i This report was prepared by the U.S. Energy Information Administration (EIA), the statistical and analytical agency within the U.S. Department of Energy. By law, EIA's data, analyses, and forecasts are

  16. Technology choice in a least-cost expansion analysis framework: Effects of gas price, planning period, and system characteristics

    SciTech Connect (OSTI)

    Guziel, K.A.; South, D.W.; Bhatarakamol, S.; Poch, L.A.

    1990-04-01

    The current outlook for new capacity additions by electric utilities is uncertain and tenuous. The fundamental question about the additional capacity requirements centers on technology choice and the factors influencing the decision process. Instead of building capital-intensive power plants, utilities have begun relying on natural gas technologies, which permit rapid construction and deployment and low capital investment. Of concern to policymakers and utility planners are the following questions: (1) What is the impact of alternative gas price projections on technology choice? (2) What influence does the planning horizon have on technology choice? (3) How important are existing system characteristics to technology choice? (4) What effect does capital cost, when combined with other technology characteristics in a capacity expansion framework, have on technology choice? In this study Argonne National Laboratory examined the impact of these concerns on technology choices in 10 representative power pools with a dynamic optimization expansion model, the Wien Automatic System Planning Package (WASP). A least-cost expansion plan was determined for each power pool with three candidate technologies--natural gas combustion turbine technology (GT), natural gas combined-cycle technology (NGCC), and integrated gasification combined-cycle technology (IGCC)--three alternative fuel price tracks, and two planning periods (10-yr versus 30-yr optimization) between the years 1995 and 2025. The three fuel price tracks represented scenarios for low, medium, and high gas prices. Sensitivity analyses were conducted on IGCC capital cost and unserved energy costs. 21 refs., 79 figs., 21 tabs.

  17. Large-Scale Uncertainty and Error Analysis for Time-dependent Fluid/Structure Interactions in Wind Turbine Applications

    SciTech Connect (OSTI)

    Alonso, Juan J.; Iaccarino, Gianluca

    2013-08-25

    The following is the final report covering the entire period of this aforementioned grant, June 1, 2011 - May 31, 2013 for the portion of the effort corresponding to Stanford University (SU). SU has partnered with Sandia National Laboratories (PI: Mike S. Eldred) and Purdue University (PI: Dongbin Xiu) to complete this research project and this final report includes those contributions made by the members of the team at Stanford. Dr. Eldred is continuing his contributions to this project under a no-cost extension and his contributions to the overall effort will be detailed at a later time (once his effort has concluded) on a separate project submitted by Sandia National Laboratories. At Stanford, the team is made up of Profs. Alonso, Iaccarino, and Duraisamy, post-doctoral researcher Vinod Lakshminarayan, and graduate student Santiago Padron. At Sandia National Laboratories, the team includes Michael Eldred, Matt Barone, John Jakeman, and Stefan Domino, and at Purdue University, we have Prof. Dongbin Xiu as our main collaborator. The overall objective of this project was to develop a novel, comprehensive methodology for uncertainty quantification by combining stochastic expansions (nonintrusive polynomial chaos and stochastic collocation), the adjoint approach, and fusion with experimental data to account for aleatory and epistemic uncertainties from random variable, random field, and model form sources. The expected outcomes of this activity were detailed in the proposal and are repeated here to set the stage for the results that we have generated during the time period of execution of this project: 1. The rigorous determination of an error budget comprising numerical errors in physical space and statistical errors in stochastic space and its use for optimal allocation of resources; 2. A considerable increase in efficiency when performing uncertainty quantification with a large number of uncertain variables in complex non-linear multi-physics problems; 3. A

  18. Uncertainty in Integrated Assessment Scenarios

    SciTech Connect (OSTI)

    Mort Webster

    2005-10-17

    The determination of climate policy is a decision under uncertainty. The uncertainty in future climate change impacts is large, as is the uncertainty in the costs of potential policies. Rational and economically efficient policy choices will therefore seek to balance the expected marginal costs with the expected marginal benefits. This approach requires that the risks of future climate change be assessed. The decision process need not be formal or quantitative for descriptions of the risks to be useful. Whatever the decision procedure, a useful starting point is to have as accurate a description of climate risks as possible. Given the goal of describing uncertainty in future climate change, we need to characterize the uncertainty in the main causes of uncertainty in climate impacts. One of the major drivers of uncertainty in future climate change is the uncertainty in future emissions, both of greenhouse gases and other radiatively important species such as sulfur dioxide. In turn, the drivers of uncertainty in emissions are uncertainties in the determinants of the rate of economic growth and in the technologies of production and how those technologies will change over time. This project uses historical experience and observations from a large number of countries to construct statistical descriptions of variability and correlation in labor productivity growth and in the autonomous energy efficiency improvement (AEEI) rate. The observed variability then provides a basis for constructing probability distributions for these drivers. The variance of uncertainty in growth rates can be further modified by expert judgment if it is believed that future variability will differ from the past. But often, expert judgment is more readily applied to projected median or expected paths through time.
Analysis of past variance and covariance provides initial assumptions about future uncertainty for quantities that are less intuitive and difficult for experts to estimate, and these variances can be normalized and then applied to mean

  19. Domestic petroleum-product prices around the world. Survey: free market or government price controls

    SciTech Connect (OSTI)

    Not Available

    1983-01-27

    In this issue, Energy Detente draws from their regular Western and Eastern Hemisphere Fuel Price/Tax Series, each produced monthly, and adds other survey data and analysis for a broad view of 48 countries around the world. They find that seven Latin American nations, including OPEC members Venezuela and Ecuador, are among the ten countries with lowest gasoline prices. In this Fourth Special Price Report, Energy Detente provides a first-time presentation of which prices are government-controlled, and which are free to respond to market forces. South Korea, with fixed prices since 1964, has the highest premium-grade gasoline price in our survey, US $5.38 per gallon. Paraguay, with prices fixed by PETROPAR, the national oil company, has the second highest premium gasoline price, US $4.21 per gallon. Nicaragua, also with government price controls, ranks third highest in the survey, with US $3.38 per gallon for premium gasoline. Kuwait shows the lowest price at US $0.55 per gallon. Several price changes from the previous survey reflect changes in currency exchange as all prices are converted to US dollars. The Energy Detente fuel price/tax series is presented for Western Hemisphere countries.

  20. Calculating impacts of energy standards on energy demand in U.S. buildings with uncertainty in an integrated assessment model

    SciTech Connect (OSTI)

    Scott, Michael J.; Daly, Don S.; Hathaway, John E.; Lansing, Carina S.; Liu, Ying; McJeon, Haewon C.; Moss, Richard H.; Patel, Pralit L.; Peterson, Marty J.; Rice, Jennie S.; Zhou, Yuyu

    2015-10-01

    In this paper, an integrated assessment model (IAM) uses a newly-developed Monte Carlo analysis capability to analyze the impacts of more aggressive U.S. residential and commercial building-energy codes and equipment standards on energy consumption and energy service costs at the state level, explicitly recognizing uncertainty in technology effectiveness and cost, socioeconomics, presence or absence of carbon prices, and climate impacts on energy demand. The paper finds that aggressive building-energy codes and equipment standards are an effective, cost-saving way to reduce energy consumption in buildings and greenhouse gas emissions in U.S. states. This conclusion is robust to significant uncertainties in population, economic activity, climate, carbon prices, and technology performance and costs.
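
    The Monte Carlo approach the abstract describes can be sketched in a few lines: draw each uncertain driver from an assumed distribution, evaluate the outcome for every draw, and summarize the resulting distribution. All distributions, parameter ranges, and the savings formula below are illustrative assumptions, not values from the study.

```python
import random
import statistics

rng = random.Random(1)

def annual_savings(effectiveness, floor_area, intensity):
    # Hypothetical model: savings = code effectiveness x floor area
    # x baseline energy-use intensity (all units illustrative).
    return effectiveness * floor_area * intensity

draws = []
for _ in range(10_000):
    effectiveness = rng.uniform(0.10, 0.30)  # fraction of baseline use avoided
    floor_area = rng.gauss(100.0, 10.0)      # socioeconomic uncertainty
    intensity = rng.uniform(0.8, 1.2)        # demand incl. climate effects
    draws.append(annual_savings(effectiveness, floor_area, intensity))

draws.sort()
lo, hi = draws[len(draws) // 20], draws[-len(draws) // 20]
print(f"median={statistics.median(draws):.1f}  90% interval=({lo:.1f}, {hi:.1f})")
```

    Reporting a median plus an interval, rather than a single point estimate, is what lets a conclusion like "the standards are cost-saving" be called robust to the input uncertainties.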

  1. DAKOTA, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis: version 4.0 user's manual.

    SciTech Connect (OSTI)

    Griffin, Joshua D.; Eldred, Michael Scott; Martinez-Canales, Monica L.; Watson, Jean-Paul; Kolda, Tamara Gibson; Giunta, Anthony Andrew; Adams, Brian M.; Swiler, Laura Painton; Williams, Pamela J.; Hough, Patricia Diane; Gay, David M.; Dunlavy, Daniel M.; Eddy, John P.; Hart, William Eugene; Brown, Shannon L.

    2006-10-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the DAKOTA software and provides capability overviews and procedures for software execution, as well as a variety of example studies.

  2. DAKOTA : a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis. Version 5.0, user's manual.

    SciTech Connect (OSTI)

    Eldred, Michael Scott; Dalbey, Keith R.; Bohnhoff, William J.; Adams, Brian M.; Swiler, Laura Painton; Hough, Patricia Diane; Gay, David M.; Eddy, John P.; Haskell, Karen H.

    2010-05-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the DAKOTA software and provides capability overviews and procedures for software execution, as well as a variety of example studies.

  3. DAKOTA: a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis. Version 5.0, developers manual.

    SciTech Connect (OSTI)

    Eldred, Michael Scott; Dalbey, Keith R.; Bohnhoff, William J.; Adams, Brian M.; Swiler, Laura Painton; Hough, Patricia Diane; Gay, David M.; Eddy, John P.; Haskell, Karen H.

    2010-05-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a developers manual for the DAKOTA software and describes the DAKOTA class hierarchies and their interrelationships. It derives directly from annotation of the actual source code and provides detailed class documentation, including all member functions and attributes.

  4. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis: version 4.0 developers manual.

    SciTech Connect (OSTI)

    Griffin, Joshua D. (Sandia National Laboratories, Livermore, CA); Eldred, Michael Scott; Martinez-Canales, Monica L.; Watson, Jean-Paul; Kolda, Tamara Gibson (Sandia National Laboratories, Livermore, CA); Giunta, Anthony Andrew; Adams, Brian M.; Swiler, Laura Painton; Williams, Pamela J.; Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Gay, David M.; Dunlavy, Daniel M.; Eddy, John P.; Hart, William Eugene; Brown, Shannon L.

    2006-10-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a developers manual for the DAKOTA software and describes the DAKOTA class hierarchies and their interrelationships. It derives directly from annotation of the actual source code and provides detailed class documentation, including all member functions and attributes.

  5. Appendix C: Price case comparisons

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    [Table: price case comparisons across High oil price, Low oil price, and Reference cases — Production, Crude oil and lease condensate ... 13.87 19.06 20.36 ...]

  6. World Crude Oil Prices

    U.S. Energy Information Administration (EIA) Indexed Site

    World Crude Oil Prices (Dollars per Barrel) The data on this page are no longer available.

  7. Natural Gas Citygate Price

    U.S. Energy Information Administration (EIA) Indexed Site

    Citygate Price Residential Price Commercial Price Industrial Price Electric Power Price Gross Withdrawals Gross Withdrawals From Gas Wells Gross Withdrawals From Oil Wells Gross Withdrawals From Shale Gas Wells Gross Withdrawals From Coalbed Wells Repressuring Nonhydrocarbon Gases Removed Vented and Flared Marketed Production NGPL Production, Gaseous Equivalent Dry Production Imports By Pipeline LNG Imports Exports Exports By Pipeline LNG Exports Underground Storage Capacity Gas in Underground

  8. Natural Gas Industrial Price

    U.S. Energy Information Administration (EIA) Indexed Site


  9. Average Commercial Price

    U.S. Energy Information Administration (EIA) Indexed Site


  10. Average Residential Price

    U.S. Energy Information Administration (EIA) Indexed Site


  11. State energy price and expenditure report 1993

    SciTech Connect (OSTI)

    1995-12-01

    The State Energy Price and Expenditure Report (SEPER) presents energy price and expenditure estimates individually for the 50 states and the District of Columbia and in aggregate for the US. The five economic sectors used in SEPER correspond to those used in SEDR and are residential, commercial, industrial, transportation, and electric utility. Documentation in the appendices describes how the price estimates are developed, provides conversion factors for measures used in the energy analysis, and includes a glossary. 65 tabs.

  12. Crude Oil and Gasoline Price Monitoring

    U.S. Energy Information Administration (EIA) Indexed Site

    Petroleum Product Price Formation, September 7, 2016 | Washington, DC. An analysis of the factors that influence product prices, with chart data updated monthly, quarterly, and annually. [Chart: gasoline spot prices, dollars per gallon, x-axis years beginning 2002; series: Chicago CBOB, New York Harbor conventional gasoline, Gulf Coast conventional gasoline, Los Angeles CARBOB, Northwest Europe gasoline, Singapore gasoline. Sources: U.S. Energy Information Administration, Bloomberg L.P.]

  13. Techno-Economic Analysis of the Deacetylation and Disk Refining Process. Characterizing the Effect of Refining Energy and Enzyme Usage on Minimum Sugar Selling Price and Minimum Ethanol Selling Price

    SciTech Connect (OSTI)

    Chen, Xiaowen; Shekiro, Joseph; Pschorn, Thomas; Sabourin, Marc; Tucker, Melvin P.; Tao, Ling

    2015-10-29

    A novel, highly efficient deacetylation and disk refining (DDR) process to liberate fermentable sugars from biomass was recently developed at the National Renewable Energy Laboratory (NREL). The DDR process consists of a mild, dilute alkaline deacetylation step followed by low-energy-consumption disk refining. The DDR corn stover substrates achieved high process sugar conversion yields, at low to modest enzyme loadings, and also produced high sugar concentration syrups at high initial insoluble solid loadings. The sugar syrups derived from corn stover are highly fermentable due to low concentrations of fermentation inhibitors. The objective of this work is to evaluate the economic feasibility of the DDR process through a techno-economic analysis (TEA). A large array of experiments designed using a response surface methodology was carried out to investigate the two major cost-driven operational parameters of the novel DDR process: refining energy and enzyme loadings. The boundary conditions for refining energy (128–468 kWh/ODMT), cellulase (Novozyme’s CTec3) loading (11.6–28.4 mg total protein/g of cellulose), and hemicellulase (Novozyme’s HTec3) loading (0–5 mg total protein/g of cellulose) were chosen to cover the most commercially practical operating conditions. The sugar and ethanol yields were modeled with good adequacy, showing a positive linear correlation between those yields and refining energy and enzyme loadings. The ethanol yields ranged from 77 to 89 gallons/ODMT of corn stover. The minimum sugar selling price (MSSP) ranged from $0.191 to $0.212 per lb of 50% concentrated monomeric sugars, while the minimum ethanol selling price (MESP) ranged from $2.24 to $2.54 per gallon of ethanol. The MSSP and MESP of the DDR process fall within a range similar to that found with the deacetylation/dilute acid pretreatment process modeled in NREL’s 2011 design report.
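As a sketch of the response-surface idea described above, a linear surface can be fit to yield observations by ordinary least squares; the design points and yields below are illustrative placeholders within the stated parameter ranges, not the paper's measurements:

```python
import numpy as np

# Hypothetical design points: (refining energy, kWh/ODMT; enzyme loading, mg/g cellulose)
# and ethanol yields (gal/ODMT). Values are made-up placeholders spanning the stated ranges.
energy = np.array([128, 128, 300, 300, 468, 468], dtype=float)
enzyme = np.array([11.6, 28.4, 11.6, 28.4, 11.6, 28.4])
yield_ = np.array([77, 82, 80, 85, 83, 89], dtype=float)

# Linear response surface: yield ~ b0 + b1*energy + b2*enzyme
X = np.column_stack([np.ones_like(energy), energy, enzyme])
coef, *_ = np.linalg.lstsq(X, yield_, rcond=None)
b0, b1, b2 = coef
print(f"yield ≈ {b0:.2f} + {b1:.4f}*energy + {b2:.3f}*enzyme")

# Predicted yield at mid-range operating conditions
pred = b0 + b1 * 300 + b2 * 20
print(f"predicted yield at (300 kWh/ODMT, 20 mg/g): {pred:.1f} gal/ODMT")
```

With a balanced design like this, the fitted slopes b1 and b2 come out positive, matching the positive linear correlation the abstract reports.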

  14. Techno-Economic Analysis of the Deacetylation and Disk Refining Process. Characterizing the Effect of Refining Energy and Enzyme Usage on Minimum Sugar Selling Price and Minimum Ethanol Selling Price

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Chen, Xiaowen; Shekiro, Joseph; Pschorn, Thomas; Sabourin, Marc; Tucker, Melvin P.; Tao, Ling

    2015-10-29

    A novel, highly efficient deacetylation and disk refining (DDR) process to liberate fermentable sugars from biomass was recently developed at the National Renewable Energy Laboratory (NREL). The DDR process consists of a mild, dilute alkaline deacetylation step followed by low-energy-consumption disk refining. The DDR corn stover substrates achieved high process sugar conversion yields, at low to modest enzyme loadings, and also produced high sugar concentration syrups at high initial insoluble solid loadings. The sugar syrups derived from corn stover are highly fermentable due to low concentrations of fermentation inhibitors. The objective of this work is to evaluate the economic feasibility of the DDR process through a techno-economic analysis (TEA). A large array of experiments designed using a response surface methodology was carried out to investigate the two major cost-driven operational parameters of the novel DDR process: refining energy and enzyme loadings. The boundary conditions for refining energy (128–468 kWh/ODMT), cellulase (Novozyme’s CTec3) loading (11.6–28.4 mg total protein/g of cellulose), and hemicellulase (Novozyme’s HTec3) loading (0–5 mg total protein/g of cellulose) were chosen to cover the most commercially practical operating conditions. The sugar and ethanol yields were modeled with good adequacy, showing a positive linear correlation between those yields and refining energy and enzyme loadings. The ethanol yields ranged from 77 to 89 gallons/ODMT of corn stover. The minimum sugar selling price (MSSP) ranged from $0.191 to $0.212 per lb of 50% concentrated monomeric sugars, while the minimum ethanol selling price (MESP) ranged from $2.24 to $2.54 per gallon of ethanol. The MSSP and MESP of the DDR process fall within a range similar to that found with the deacetylation/dilute acid pretreatment process modeled in NREL’s 2011 design report.

  15. Crude Oil and Gasoline Price Monitoring

    U.S. Energy Information Administration (EIA) Indexed Site

    What drives crude oil prices? September 7, 2016 | Washington, DC. An analysis of 7 factors that influence oil markets, with chart data updated monthly and quarterly. [Chart: price per barrel (real 2010 dollars), 1970-2015, scale 0-150; series: imported refiner acquisition cost of crude oil, WTI crude oil price. Caption: "Crude oil prices react to a variety of geopolitical and economic events." Annotated events include: low spare capacity, Iraq invades Kuwait, Saudis abandon swing ...]

  16. Supernova relic neutrinos and the supernova rate problem: Analysis of uncertainties and detectability of ONeMg and failed supernovae

    SciTech Connect (OSTI)

    Mathews, Grant J. [Center for Astrophysics, Department of Physics, University of Notre Dame, Notre Dame, IN 46556 (United States); Hidaka, Jun; Kajino, Toshitaka; Suzuki, Jyutaro [National Astronomical Observatory of Japan, 2-21-1 Osawa, Mitaka, Tokyo 181-8588 (Japan)

    2014-08-01

    Direct measurements of the core collapse supernova rate (R{sub SN}) in the redshift range 0 ≤ z ≤ 1 appear to be about a factor of two smaller than the rate inferred from the measured cosmic massive star formation rate (SFR). This discrepancy would imply that about one-half of the massive stars that have been born in the local observed comoving volume did not explode as luminous supernovae. In this work, we explore the possibility that one could clarify the source of this 'supernova rate problem' by detecting the energy spectrum of supernova relic neutrinos with a next generation 10{sup 6} ton water Čerenkov detector like Hyper-Kamiokande. First, we re-examine the supernova rate problem. We make a conservative alternative compilation of the measured SFR data over the redshift range 0 ≤ z ≤ 7. We show that by only including published SFR data for which the dust obscuration has been directly determined, the ratio of the observed massive SFR to the observed supernova rate R{sub SN} has large uncertainties, ∼1.8{sub −0.6}{sup +1.6}, and is statistically consistent with no supernova rate problem. If we further consider that a significant fraction of massive stars will end their lives as faint ONeMg SNe or as failed SNe leading to a black hole remnant, then the ratio reduces to ∼1.1{sub −0.4}{sup +1.0} and the rate problem is essentially solved. We next examine the prospects for detecting this solution to the supernova rate problem. We first study the sources of uncertainty involved in the theoretical estimates of the neutrino detection rate and analyze whether the spectrum of relic neutrinos can be used to independently identify the existence of a supernova rate problem and its source. We consider an ensemble of published and unpublished core collapse supernova simulation models to estimate the uncertainties in the anticipated neutrino luminosities and temperatures. We illustrate how the spectrum of detector events might be used to establish the average neutrino ...
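The effect of faint and failed supernovae on the rate ratio reduces to simple arithmetic; the faint/failed fraction below is an illustrative assumption chosen to roughly reproduce the corrected central value quoted in the abstract, not a number the abstract states:

```python
# Central value of the ratio of SFR-inferred to observed luminous SN rates (from the abstract)
ratio_raw = 1.8

# If a fraction f of massive stars end as faint ONeMg SNe or failed SNe (black hole remnants),
# only (1 - f) produce luminous supernovae. f = 0.4 is an illustrative assumption.
f_dark = 0.4
ratio_corrected = ratio_raw * (1 - f_dark)
print(f"corrected ratio ~ {ratio_corrected:.1f}")
```

Under this assumption the corrected ratio lands near the ∼1.1 central value the abstract reports, though the quoted uncertainties on both ratios are large.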

  17. Uncertainty and sensitivity analysis in the 2008 performance assessment for the proposed repository for high-level radioactive waste at Yucca Mountain, Nevada.

    SciTech Connect (OSTI)

    Helton, Jon Craig; Sallaberry, Cedric M.; Hansen, Clifford W.

    2010-05-01

    Extensive work has been carried out by the U.S. Department of Energy (DOE) in the development of a proposed geologic repository at Yucca Mountain (YM), Nevada, for the disposal of high-level radioactive waste. As part of this development, an extensive performance assessment (PA) for the YM repository was completed in 2008 [1] and supported a license application by the DOE to the U.S. Nuclear Regulatory Commission (NRC) for the construction of the YM repository [2]. This presentation provides an overview of the conceptual and computational structure of the indicated PA (hereafter referred to as the 2008 YM PA) and the roles that uncertainty analysis and sensitivity analysis play in this structure.
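Sampling-based sensitivity analysis in assessments like this commonly ranks inputs by their rank (Spearman) correlation with the sampled output; a minimal self-contained sketch with a hypothetical two-input model, not the YM PA's actual models or variables:

```python
import random

def ranks(values):
    # 1-based rank positions; ties are not handled, which is fine for continuous samples
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman(x, y):
    # Spearman rank correlation = Pearson correlation of the rank vectors
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

random.seed(1)
x1 = [random.uniform(0, 1) for _ in range(500)]   # influential input
x2 = [random.uniform(0, 1) for _ in range(500)]   # weakly influential input
y = [10 * a + 0.1 * b for a, b in zip(x1, x2)]    # hypothetical model response

print(spearman(x1, y))  # near 1: x1 dominates the output uncertainty
print(spearman(x2, y))  # near 0: x2 contributes little
```

Ranking inputs this way is what lets a PA report which uncertain parameters drive the dose or release estimates.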

  18. Diesel prices flat

    U.S. Energy Information Administration (EIA) Indexed Site

    Diesel prices flat The U.S. average retail price for on-highway diesel fuel saw no movement from last week. Prices remained flat at $3.89 a gallon on Monday, based on the weekly price survey by the U.S. Energy Information Administration. Diesel prices were highest in the West Coast states at $4.05 a gallon, up 2-tenths of a penny from a week ago. Prices were lowest in the Gulf Coast region at $3.80 a gallon, up 3-tenths of a penny.

  19. ,"Texas Natural Gas Prices"

    U.S. Energy Information Administration (EIA) Indexed Site

    [Spreadsheet export header: "Data 1: Texas Natural Gas Prices," monthly series, sourcekey N3050TX3]

  20. California Natural Gas Prices

    U.S. Energy Information Administration (EIA) Indexed Site

    California natural gas prices, 2010-2015 (prices in dollars per thousand cubic feet; last column is the available history range):

        Series                                    2010   2011   2012   2013   2014   2015   History
        Wellhead Price                            4.87     --     --     --     --     --   1967-2010
        Imports Price                             4.76   3.57     --   3.59     --     --   2007-2014
        Exports Price                             4.51   4.18   2.90   3.89   4.56     --   1997-2014
        Pipeline and Distribution Use Price         --     --     --     --     --     --   1967-2005
        Citygate Price                            4.86   4.47   3.46   4.18   4.88   3.27   1984-2015
        Residential Price                         9.92   9.93   9.14   9.92  11.51  11.38   1967-2015
        Pct. of Residential Deliveries in Prices  98.5   98.3   97.5   96.1   94.8   94.9   1989-2015
        Commercial Price                          8.30   8.29   7.05   7.81   9.05   7.98   1967-2015
        Percentage of ...

  1. Texas Natural Gas Prices

    U.S. Energy Information Administration (EIA) Indexed Site

    Texas natural gas prices, 2010-2015 (prices in dollars per thousand cubic feet; last column is the available history range):

        Series                                    2010   2011   2012   2013   2014   2015   History
        Wellhead Price                            4.70     --     --     --     --     --   1967-2010
        Imports Price                             6.72   6.78  10.09  12.94  11.79     --   1993-2014
        Exports Price                             4.68   4.44   3.14   3.94   4.67     --   1989-2014
        Pipeline and Distribution Use Price         --     --     --     --     --     --   1967-2005
        Citygate Price                            5.89   5.39   4.30   4.89   5.77   4.20   1984-2015
        Residential Price                        10.82  10.21  10.55  10.50  11.16  10.65   1967-2015
        Pct. of Residential Deliveries in Prices 100.0   99.7   99.7   99.7   99.8   99.9   1989-2015
        Commercial Price                          7.90   7.07   6.63   7.25   8.26     NA   1967-2015
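A quick way to read a series like the Texas citygate prices above is year-over-year percent change; a short sketch using the 2010-2015 citygate values from the record:

```python
# Texas citygate natural gas prices, 2010-2015, taken from the series above
years = [2010, 2011, 2012, 2013, 2014, 2015]
citygate = [5.89, 5.39, 4.30, 4.89, 5.77, 4.20]

# Year-over-year percent change for each consecutive pair of years
changes = [100 * (b - a) / a for a, b in zip(citygate, citygate[1:])]
for y, pct in zip(years[1:], changes):
    print(f"{y}: {pct:+.1f}% vs prior year")
```

The swings (roughly -20% into 2012, +18% into 2014, -27% into 2015) illustrate the price volatility these records track.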

  2. Residential propane price decreases

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    propane price decreases The average retail price for propane is $2.32 per gallon, down 2 cents from last week, based on the residential heating fuel survey by the U.S. Energy ...

  3. Residential propane price increases

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    propane price increases The average retail price for propane is $1.98 per gallon, up 1.1 cents from last week, based on the residential heating fuel survey by the U.S. Energy ...

  4. Residential propane price decreases

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    05, 2014 Residential propane price decreases The average retail price for propane fell to $2.40 per gallon, down 1.2 cents from a week ago, based on the residential heating fuel ...

  5. Residential propane prices increase

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    propane prices increase The average retail price for propane rose 3.9 cents from a week ago to $2.80 per gallon. That's up 53.7 cents from a year ago, based on the residential ...

  6. Residential propane prices stable

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    propane prices stable The average retail price for propane is $2.37 per gallon. That's down 4-tenths of a penny from a week ago, based on the U.S. Energy Information ...

  7. Residential propane price

    Gasoline and Diesel Fuel Update (EIA)

    propane price increases The average retail price for propane is $2.39 per gallon, up 3.9 cents from last week, based on the residential heating fuel survey by the U.S. Energy ...

  8. Residential propane price decreases

    Gasoline and Diesel Fuel Update (EIA)

    propane price decreases The average retail price for propane is $2.38 per gallon, down 1.1 cents from last week, based on the residential heating fuel survey by the U.S. Energy ...

  9. Residential propane price decreases

    U.S. Energy Information Administration (EIA) Indexed Site

    6, 2014 Residential propane price decreases The average retail price for propane fell to $3.48 per gallon, down 15.9 cents from a week ago, based on the residential heating fuel ...

  10. Residential propane price decreases

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    8, 2015 Residential propane price decreases The average retail price for propane is $2.34 per gallon, down 1.7 cents from last week, based on the residential heating fuel survey by ...

  11. Residential propane prices stable

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    propane price decreases The average retail price for propane is $2.40 per gallon, down 9-tenths of a cent from last week, based on the residential heating fuel survey by the U.S. ...

  12. Residential propane prices available

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    4, 2015 Residential propane price increases The average retail price for propane is $1.92 per gallon, up 1.4 cents from last week, based on the residential heating fuel survey by ...

  13. Residential propane price increases

    U.S. Energy Information Administration (EIA) Indexed Site

    Residential propane price decreases The average retail price for propane is $2.03 per gallon, down 2-tenths of a cent from last week, based on the residential heating fuel survey ...

  14. Residential propane price

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    propane price increases The average retail price for propane is $2.29 per gallon, down 3.1 cents from last week, based on the residential heating fuel survey by the U.S. Energy ...

  15. Residential propane prices available

    U.S. Energy Information Administration (EIA) Indexed Site

    8, 2015 Residential propane price increases The average retail price for propane is $1.91 per gallon, up 1.4 cents from last week, based on the residential heating fuel survey by ...

  16. Residential propane prices surge

    Gasoline and Diesel Fuel Update (EIA)

    propane prices surge The average retail price for propane rose to an all-time high of $4.01 a gallon, up $1.05 from a week ago, based on the residential heating fuel survey ...

  17. Residential propane price increases

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    propane price increases The average retail price for propane is $1.96 per gallon, up 1.8 cents from last week, based on the residential heating fuel survey by the U.S. Energy ...

  18. Residential propane prices surge

    Gasoline and Diesel Fuel Update (EIA)

    5, 2014 Residential propane price decreases The average retail price for propane fell to $3.30 per gallon, down 17.5 cents from a week ago, based on the residential heating fuel ...

  19. Residential propane price decreases

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    propane price decreases The average retail price for propane is $2.36 per gallon, down 7-tenths of a cent from last week, based on the residential heating fuel survey by the U.S. ...

  20. Residential propane price

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    propane price decreases The average retail price for propane is $2.35 per gallon, down 1.1 cents from last week, based on the residential heating fuel survey by the U.S. Energy ...

  1. Residential propane price decreases

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    propane price decreases The average retail price for propane is $2.39 per gallon, down 2.2 cents from last week, based on the residential heating fuel survey by the U.S. Energy ...

  2. Residential propane price increases

    Gasoline and Diesel Fuel Update (EIA)

    Residential propane price decreases The average retail price for propane is $2.02 per gallon, down 5-tenths of a cent from last week, based on the residential heating fuel survey ...

  3. Residential propane prices decrease

    Annual Energy Outlook [U.S. Energy Information Administration (EIA)]

    5, 2014 Residential propane prices decrease The average retail price for propane fell to $3.89 per gallon, down 11.9 cents from a week ago, based on the residential heating ...

  4. Residential propane prices available

    U.S. Energy Information Administration (EIA) Indexed Site

    Residential propane price decreases The average retail price for propane is $1.91 per gallon, down 6.7 cents from last week, based on the residential heating fuel survey by the ...

  5. Residential propane price increases

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    propane price increases The average retail price for propane is $2.03 per gallon, up 1 cent from last week, based on the residential heating fuel survey by the U.S. Energy ...

  6. Residential propane prices increase

    U.S. Energy Information Administration (EIA) Indexed Site

    propane prices increase The average retail price for propane rose to $2.40 per gallon, up 1.1 cents from a week ago, based on the residential heating fuel survey by the U.S. Energy ...

  7. Residential propane prices surge

    Gasoline and Diesel Fuel Update (EIA)

    2, 2014 Residential propane price decreases The average retail price for propane fell to $3.17 per gallon, down 13.1 cents from a week ago, based on the residential heating fuel ...

  8. Residential propane price decreases

    Gasoline and Diesel Fuel Update (EIA)

    propane price decreases The average retail price for propane is $2.36 per gallon, down 6-tenths of a cent from last week, based on the residential heating fuel survey by the U.S. ...

  9. Residential propane price decreases

    Gasoline and Diesel Fuel Update (EIA)

    propane price decreases The average retail price for propane is $2.36 per gallon, down 1.1 cents from last week, based on the residential heating fuel survey by the U.S. Energy ...

  10. Residential propane price increases

    U.S. Energy Information Administration (EIA) Indexed Site

    propane price increases The average retail price for propane is $2.41 per gallon, up 6-tenths of a cent from last week, based on the residential heating fuel survey by the U.S. ...