National Library of Energy BETA

Sample records for uec estimates based

  1. 09/14/2012 UEC Lunch Meeting Attended by: All 2012 UEC members...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    UEC Lunch Meeting Attended by: All 2012 UEC members, Sean Smith, Peter Cummings, Tony ... Jeff Smith Deputy for Operations Internal Audit Gail Lewis, Director Information ...

  2. CNMS UEC Agenda, Friday, November 6, 2015

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    for session chairs' - UEC Discussion leaders should contribute (Molly, Eric, Ray, Rafael); Eric volunteered to assemble - Nazanin will poll all 2015 roundtable leaders for...

  3. Microsoft Word - UEC-CC_120415_min_teh.docx

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    members: Vivek, Milan, Zheng, Martyn, Ray * 2015 User Satisfaction Survey (Eric with Rafael, Milan) o Draft of UEC summary for triennial review and recommendations was...

  4. Knowledge Based Estimation of Material Release Transients

    Energy Science and Technology Software Center (OSTI)

    1998-07-29

    KBERT is an easy-to-use desktop decision support tool for estimating public and in-facility worker doses and consequences of radioactive material releases in non-reactor nuclear facilities. It automatically calculates release and respirable fractions based on published handbook data, and calculates material transport concurrently with personnel evacuation simulations. Any facility layout can be modeled easily using the intuitive graphical user interface.

  5. The ARM Best Estimate Station-based Surface (ARMBESTNS) Data...

    Office of Scientific and Technical Information (OSTI)

    Title: The ARM Best Estimate Station-based Surface (ARMBESTNS) Data set ...

  6. Output-Based Error Estimation and Adaptation for Uncertainty...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Output-Based Error Estimation and Adaptation for Uncertainty Quantification Isaac M. Asher and Krzysztof J. Fidkowski University of Michigan US National Congress on Computational...

  7. CALIBRATING C-IV-BASED BLACK HOLE MASS ESTIMATORS

    SciTech Connect (OSTI)

    Park, Daeseong; Woo, Jong-Hak; Shin, Jaejin [Astronomy Program, Department of Physics and Astronomy, Seoul National University, Seoul 151-742 (Korea, Republic of); Denney, Kelly D., E-mail: pds2001@astro.snu.ac.kr, E-mail: woo@astro.snu.ac.kr, E-mail: jjshin@astro.snu.ac.kr, E-mail: kelly@dark-cosmology.dk [Dark Cosmology Centre, Niels Bohr Institute, Juliane Maries Vej 30, DK-2100 Copenhagen O (Denmark)

    2013-06-20

    We present single-epoch black hole mass estimators based on the C IV λ1549 broad emission line, using the updated sample of reverberation-mapped active galactic nuclei and high-quality UV spectra. By performing a multi-component spectral fitting analysis, we measure the C IV line widths (FWHM_CIV and line dispersion σ_CIV) and the continuum luminosity at 1350 Å (L_1350) to calibrate the C IV-based mass estimators. By comparing with the Hβ reverberation-based masses, we provide new mass estimators with the best-fit relationships M_BH ∝ L_1350^(0.50±0.07) σ_CIV² and M_BH ∝ L_1350^(0.52±0.09) FWHM_CIV^(0.56±0.48). The new C IV-based mass estimators show a significant mass-dependent systematic difference compared to the estimators commonly used in the literature. Using the published Sloan Digital Sky Survey QSO catalog, we show that the black hole mass of high-redshift QSOs decreases on average by ≈0.25 dex if our recipe is adopted.

  8. Synchrophasor Measurement-Based Wind Plant Inertia Estimation: Preprint

    SciTech Connect (OSTI)

    Zhang, Y.; Bank, J.; Wan, Y. H.; Muljadi, E.; Corbus, D.

    2013-05-01

    The total inertia stored in all rotating masses that are connected to power systems, such as synchronous generators and induction motors, is an essential force that keeps the system stable after disturbances. To ensure bulk power system stability, there is a need to estimate the equivalent inertia available from a renewable generation plant. An equivalent inertia constant analogous to that of conventional rotating machines can be used to provide a readily understandable metric. This paper explores a method that utilizes synchrophasor measurements to estimate the equivalent inertia that a wind plant provides to the system.
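
    The swing-equation relationship behind such an estimate can be illustrated with a short numerical sketch. All values below are assumed placeholders, not results from the preprint: in per unit on the plant base, 2H·(df/dt)/f_nom ≈ -ΔP, so a measured rate of change of frequency together with the observed power change yields an equivalent inertia constant H.

        # Minimal sketch (assumed values, not from the preprint): equivalent inertia
        # from the per-unit swing equation (2*H / f_nom) * df/dt = -delta_p.
        f_nom = 60.0       # nominal system frequency, Hz
        delta_p = 0.05     # per-unit power imbalance attributed to the plant (assumed)
        rocof = -0.30      # measured rate of change of frequency, Hz/s (assumed)

        h_est = -delta_p * f_nom / (2.0 * rocof)   # equivalent inertia constant, s
        print(f"H ~ {h_est:.1f} s")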

  9. Cellulose triacetate based novel optical sensor for uranium estimation

    SciTech Connect (OSTI)

    Joshi, J.M.; Pathak, P.N.; Pandey, A.K.; Manchanda, V.K.

    2008-07-01

    A cellulose triacetate (CTA) based optode has been developed by immobilizing tricapryl-methyl ammonium chloride (Aliquat 336) as the extractant and 2-(5-bromo-2-pyridylazo)-5-diethyl-aminophenol (Br-PADAP) as the chromophore. The optode changes color (from yellow to magenta) due to uranium uptake in bicarbonate medium (≈10⁻⁴ M) at pH 7-8 in the presence of triethanolamine (TEA) buffer. The detection limit of the optode film (dimensions: 3 cm × 1 cm) was determined to be ≈0.3 μg/mL for a 15 mL pure uranium sample at pH 7-8 (in TEA buffer). The effects of experimental parameters have been evaluated in terms of maximum uptake of U(VI), minimum response time, and reproducibility and stability of the Br-PADAP-U(VI) complex formed in the optode matrix. The applicability of the optimized optode has been examined in the effluent samples obtained during the magnesium diuranate precipitation step following the TBP purification cycle. (authors)

  10. Estimating Methods

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    1997-03-28

    Based on the project's scope, the purpose of the estimate, and the availability of estimating resources, the estimator can choose one or a combination of techniques when estimating an activity or project. Estimating methods, estimating indirect and direct costs, and other estimating considerations are discussed in this chapter.

  11. CNMS UEC Agenda, Wednesday, September 3, 2014

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    (Imperial College, London); Megan Robertson (U. Houston); Ray Unocic (ORNLCNMS); Rafael Verduzco (Rice U.) Past Chair, ex officio member - Tony Hmelo (Vanderbilt U.) CNMS...

  12. CNMS UEC Agenda, Tuesday, July 7, 2015

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    (Imperial College, London); Megan Robertson (U. Houston); Ray Unocic (ORNLCNMS); Rafael Verduzco (Rice U.) Past Chair, ex officio member - Vivek Prabhu (NIST) CNMS...

  13. CNMS UEC Agenda, Wednesday, February 5, 2014

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    (Imperial College, London); Megan Robertson (U. Houston); Ray Unocic (ORNLCNMS); Rafael Verduzco (Rice U.) Past Chair, ex officio member - Tony Hmelo (Vanderbilt U.) CNMS...

  14. CNMS UEC Agenda, Wednesday, July 2, 2014

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    (Imperial College, London); Megan Robertson (U. Houston); Ray Unocic (ORNLCNMS); Rafael Verduzco (Rice U.) Past Chair, ex officio member - Tony Hmelo (Vanderbilt U.) CNMS...

  15. CNMS UEC Agenda, Thursday, January 9, 2014

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    (Imperial College, London); Megan Robertson (U. Houston); Ray Unocic (ORNLCNMS); Rafael Verduzco (Rice U.) Past Chair, ex officio member - Tony Hmelo (Vanderbilt U.) CNMS...

  16. CNMS UEC Agenda, Wednesday, August 6, 2014

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    (Imperial College, London); Megan Robertson (U. Houston); Ray Unocic (ORNLCNMS); Rafael Verduzco (Rice U.) Past Chair, ex officio member - Tony Hmelo (Vanderbilt U.) CNMS...

  17. CNMS UEC Agenda, Wednesday, October 1, 2014

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    as unique "destination" attractions vs regional * Review percentages and demographics from outside Tennessee (Haynes will distribute) * Accessibility - shuttle service...

  18. The ARM Best Estimate Station-based Surface (ARMBESTNS) Data set

    SciTech Connect (OSTI)

    Qi,Tang; Xie,Shaocheng

    2015-08-06

    The ARM Best Estimate Station-based Surface (ARMBESTNS) data set merges together key surface measurements from the Southern Great Plains (SGP) sites. It is a twin data product of the ARM Best Estimate 2-dimensional Gridded Surface (ARMBE2DGRID) data set. Unlike the 2DGRID data set, the STNS data are reported at the original site locations and show the original information, except for the interpolation over time. Therefore, users have the flexibility to process the data with the approach more suitable for their applications.

  19. Advanced digital PWR plant protection system based on optimal estimation theory

    SciTech Connect (OSTI)

    Tylee, J.L.

    1981-04-01

    An advanced plant protection system for the Loss-of-Fluid Test (LOFT) reactor plant is described and evaluated. The system, based on a Kalman filter estimator, is capable of providing on-line estimates of such critical variables as fuel and cladding temperature, departure from nucleate boiling ratio, and maximum linear heat generation rate. The Kalman filter equations are presented, as is a description of the LOFT plant dynamic model inherent in the filter. Simulation results demonstrate the performance of the advanced system.
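
    A scalar Kalman filter conveys the estimator structure described above. The sketch below is a generic textbook form with made-up noise statistics, not the LOFT plant dynamic model referenced in the report.

        import numpy as np

        # Minimal scalar Kalman filter sketch (generic textbook form, not the LOFT model):
        # estimate a slowly varying plant variable x from noisy measurements z.
        rng = np.random.default_rng(0)
        n = 100
        x_true = np.cumsum(rng.normal(0.0, 0.05, n)) + 500.0   # temperature-like signal
        z = x_true + rng.normal(0.0, 2.0, n)                   # noisy sensor readings

        q, r = 0.05**2, 2.0**2      # process and measurement noise variances (assumed)
        x_hat, p = z[0], 1.0        # initial state estimate and covariance
        estimates = []
        for zk in z:
            # predict with a random-walk model, then update with the new measurement
            p = p + q
            k = p / (p + r)                  # Kalman gain
            x_hat = x_hat + k * (zk - x_hat)
            p = (1.0 - k) * p
            estimates.append(x_hat)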

  20. Reconnaissance Estimates of Recharge Based on an Elevation-dependent Chloride Mass-balance Approach

    SciTech Connect (OSTI)

    Charles E. Russell; Tim Minor

    2002-08-31

    Significant uncertainty is associated with efforts to quantify recharge in arid regions such as southern Nevada. However, accurate estimates of groundwater recharge are necessary for understanding the long-term sustainability of groundwater resources and for predictions of groundwater flow rates and directions. Currently, the most widely accepted method for estimating recharge in southern Nevada is the Maxey and Eakin method. This method has been applied to most basins within Nevada and has been independently verified as a reconnaissance-level estimate of recharge through several studies. Recharge estimates derived from the Maxey and Eakin and other recharge methodologies ultimately based upon measures or estimates of groundwater discharge (outflow methods) should be augmented by a tracer-based aquifer-response method. The objective of this study was to improve an existing aquifer-response method that was based on the chloride mass-balance approach. Improvements were designed to incorporate spatial variability within recharge areas (rather than treating recharge as a lumped parameter), develop a more defendable lower limit of recharge, and differentiate local recharge from recharge emanating as interbasin flux. Seventeen springs, located in the Sheep Range, Spring Mountains, and on the Nevada Test Site, were sampled during the course of this study and their discharge was measured. The chloride and bromide concentrations of the springs were determined. Discharge and chloride concentrations from these springs were compared to estimates provided by previously published reports. A literature search yielded previously published estimates of chloride flux to the land surface. ³⁶Cl/Cl ratios and discharge rates of the three largest springs in the Amargosa Springs discharge area were compiled from various sources. This information was utilized to determine an effective chloride concentration for recharging precipitation and its associated uncertainty via Monte Carlo simulations.
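
    The chloride mass-balance relation at the heart of the approach is R ≈ P · Cl_precip / Cl_groundwater. The sketch below propagates assumed inputs through a Monte Carlo loop in the same spirit; the numbers are placeholders, not values from the report.

        import numpy as np

        # Illustrative chloride mass-balance sketch: recharge = P * Cl_p / Cl_gw,
        # with uncertainty propagated by Monte Carlo (all inputs are assumed).
        rng = np.random.default_rng(42)
        n_trials = 10_000
        precip_mm = rng.normal(250.0, 30.0, n_trials)    # annual precipitation, mm
        cl_precip = rng.normal(0.4, 0.1, n_trials)       # effective Cl in precipitation, mg/L
        cl_spring = rng.normal(15.0, 2.0, n_trials)      # Cl in spring discharge, mg/L

        recharge_mm = precip_mm * cl_precip / cl_spring  # annual recharge, mm
        lo, med, hi = np.percentile(recharge_mm, [5, 50, 95])
        print(f"recharge ~ {med:.1f} mm/yr (90% interval {lo:.1f}-{hi:.1f})")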

  1. Critical analysis of the Hanford spent nuclear fuel project activity based cost estimate

    SciTech Connect (OSTI)

    Warren, R.N.

    1998-09-29

    In 1997, the SNFP developed a baseline change request (BCR) and submitted it to DOE-RL for approval. The schedule was formally evaluated to have a 19% probability of success [Williams, 1998]. In December 1997, DOE-RL Manager John Wagoner approved the BCR contingent upon a subsequent independent review of the new baseline. The SNFP took several actions during the first quarter of 1998 to prepare for the independent review. The project developed the Estimating Requirements and Implementation Guide [DESH, 1998] and trained cost account managers (CAMs) and other personnel involved in the estimating process in activity-based cost (ABC) estimating techniques. The SNFP then applied ABC estimating techniques to develop the basis for the December Baseline (DB) and documented that basis in Basis of Estimate (BOE) books. These BOEs were provided to DOE in April 1998. DOE commissioned Professional Analysis, Inc. (PAI) to perform a critical analysis (CA) of the DB. PAI's review formally began on April 13. PAI performed the CA, provided three sets of findings to the SNFP contractor, and initiated reconciliation meetings. During the course of PAI's review, DOE directed the SNFP to develop a new baseline with a higher probability of success. The contractor transmitted the new baseline, which is referred to as the High Probability Baseline (HPB), to DOE on April 15, 1998 [Williams, 1998]. The HPB was estimated to approach a 90% confidence level on the start of fuel movement [Williams, 1998]. This high probability resulted in an increased cost and a schedule extension. To implement the new baseline, the contractor initiated 26 BCRs with supporting BOEs. PAI's scope was revised on April 28 to add reviewing the HPB and the associated BCRs and BOEs.

  2. MM-Estimator and Adjusted Super Smoother based Simultaneous Prediction Confidence Intervals

    Energy Science and Technology Software Center (OSTI)

    2002-07-19

    A novel application of regression analysis (MM-estimator) with simultaneous prediction confidence intervals is proposed to detect up- or down-regulated genes, which appear as outliers in scatter plots of log-transformed red (Cy5 fluorescent dye) versus green (Cy3 fluorescent dye) intensities. Advantages of the application: 1) the robust and resistant MM-estimator is a reliable method for fitting a linear regression in the presence of outliers; 2) exploratory data analysis tools (boxplots, averaged shifted histograms, quantile-quantile normal plots, and scatter plots) are used to visually test the underlying assumptions of linearity and contaminated normality in microarray data; 3) simultaneous prediction confidence intervals (SPCIs) guarantee a desired confidence level across the whole range of data points used for the scatter plots. The result of the outlier detection procedure is a set of significantly differentially expressed genes extracted from the employed microarray data set. A scatter plot smoother (super smoother or locally weighted regression) is used to quantify heteroscedasticity in the residual variance, which commonly occurs in the lower- and higher-intensity regions. The set of differentially expressed genes is quantified using interval estimates for p-values as a probabilistic measure of being an outlier by chance. Monte Carlo simulations are used to adjust the super smoother-based SPCIs.
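
    A simplified sketch of the screening idea follows; a Huber-type iteratively reweighted least-squares fit stands in for the MM-estimator, and a plain residual cutoff stands in for the simultaneous prediction confidence intervals. The data are synthetic, not a real microarray set.

        import numpy as np

        # Robust-regression outlier screening on simulated log intensities.
        rng = np.random.default_rng(1)
        log_green = rng.uniform(6, 14, 500)
        log_red = 1.02 * log_green + 0.1 + rng.normal(0, 0.15, 500)
        log_red[:10] += rng.choice([-1.5, 1.5], 10)   # a few "regulated" genes

        # Huber-type IRLS fit of log_red on log_green (stand-in for the MM-estimator)
        X = np.column_stack([np.ones_like(log_green), log_green])
        beta = np.linalg.lstsq(X, log_red, rcond=None)[0]
        for _ in range(20):
            resid = log_red - X @ beta
            scale = 1.4826 * np.median(np.abs(resid - np.median(resid)))
            u = resid / (1.345 * scale)
            w = np.where(np.abs(u) <= 1.0, 1.0, 1.0 / np.abs(u))
            sw = np.sqrt(w)
            beta = np.linalg.lstsq(X * sw[:, None], log_red * sw, rcond=None)[0]

        # flag points far outside the fit (crude stand-in for simultaneous intervals)
        outliers = np.abs(log_red - X @ beta) > 3.0 * scale
        print(f"{outliers.sum()} candidate up/down-regulated genes flagged")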

  3. Evaluation of Clear Sky Models for Satellite-Based Irradiance Estimates

    SciTech Connect (OSTI)

    Sengupta, M.; Gotseff, P.

    2013-12-01

    This report describes an intercomparison of three popular broadband clear sky solar irradiance model results with measured data, as well as satellite-based model clear sky results compared to measured clear sky data. The authors conclude that one of the popular clear sky models (the Bird clear sky model developed by Richard Bird and Roland Hulstrom) could serve as a more accurate replacement for current satellite-model clear sky estimations. Additionally, the analysis of the model results with respect to model input parameters indicates that rather than climatological, annual, or monthly mean input data, higher-time-resolution input parameters improve the general clear sky model performance.

  4. A BIM-based system for demolition and renovation waste estimation and planning

    SciTech Connect (OSTI)

    Cheng, Jack C.P., E-mail: cejcheng@ust.hk [Department of Civil and Environmental Engineering, The Hong Kong University of Science and Technology (Hong Kong); Ma, Lauren Y.H., E-mail: yingzi@ust.hk [Department of Civil and Environmental Engineering, The Hong Kong University of Science and Technology (Hong Kong)

    2013-06-15

    Highlights: • We developed a waste estimation system leveraging BIM technology. • The system can calculate waste disposal charging fees and pick-up truck demand. • We present an example scenario demonstrating the system. • Automatic operation, time savings, and wide applicability are the features of the system. - Abstract: Due to the rising worldwide awareness of the green environment, both governments and contractors have to consider effective construction and demolition (C and D) waste management practices. The last two decades have witnessed the growing importance of demolition and renovation (D and R) works and the growing amount of D and R waste disposed to landfills every day, especially in developed cities like Hong Kong. Quantitative waste prediction is crucial for waste management: it can enable contractors to pinpoint critical waste generation processes and to plan waste control strategies. In addition, waste estimation could also facilitate some government waste management policies, such as the waste disposal charging scheme in Hong Kong. Currently, tools that can accurately and conveniently estimate the amount of waste from construction, renovation, and demolition projects are lacking. In light of this research gap, this paper presents a building information modeling (BIM) based system that we have developed for the estimation and planning of D and R waste. BIM allows multi-disciplinary information to be superimposed within one digital building model. Our system can extract material and volume information through the BIM model and integrate the information for detailed waste estimation and planning. Waste recycling and reuse are also considered in our system. Extracted material information can be provided to recyclers before demolition or renovation to make the recycling stage more cooperative and more efficient. Pick-up truck requirements and the waste disposal charging fee for different waste facilities will also be predicted through our system. The results could ...

  5. Reliable clock estimation using linear weighted fusion based on pairwise broadcast synchronization

    SciTech Connect (OSTI)

    Shi, Xin; Zhao, Xiangmo; Hui, Fei; Ma, Junyan; Yang, Lan

    2014-10-06

    Clock synchronization in wireless sensor networks (WSNs) has been studied extensively in recent years, and many protocols have been put forward from the standpoint of statistical signal processing, which is an effective way to optimize accuracy. However, the accuracy derived from statistical data can be improved mainly by exchanging a sufficient number of packets, which greatly consumes the limited power resources. In this paper, a reliable clock estimation using linear weighted fusion based on pairwise broadcast synchronization is proposed to optimize sync accuracy without expending additional sync packets. As a contribution, a linear weighted fusion scheme for multiple clock deviations is constructed with the collaborative sensing of clock timestamps, and the fusion weight is defined by the covariance of sync errors for the different clock deviations. Extensive simulation results show that the proposed approach can achieve better performance in terms of sync overhead and sync accuracy.
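
    The fusion step itself reduces to a weighted average of the individual clock-deviation estimates. The sketch below uses inverse-variance weights as one covariance-based choice; the numbers are placeholders, not the paper's weighting scheme.

        import numpy as np

        # Illustrative linear weighted fusion of several clock-offset estimates,
        # with weights taken as inverse error variances (assumed values).
        offsets = np.array([12.4e-6, 11.9e-6, 12.8e-6])   # clock deviation estimates, s
        variances = np.array([4e-12, 9e-12, 16e-12])      # error variance of each estimate

        weights = (1.0 / variances) / np.sum(1.0 / variances)
        fused_offset = np.sum(weights * offsets)
        fused_variance = 1.0 / np.sum(1.0 / variances)    # variance of the fused estimate
        print(fused_offset, fused_variance)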

  6. Comparison of Historical Satellite-Based Estimates of Solar Radiation Resources with Recent Rotating Shadowband Radiometer Measurements: Preprint

    SciTech Connect (OSTI)

    Myers, D. R.

    2009-03-01

    The availability of rotating shadow band radiometer measurement data at several new stations provides an opportunity to compare historical satellite-based estimates of solar resources with measurements. We compare mean monthly daily total (MMDT) solar radiation data from eight years of NSRDB and 22 years of NASA hourly global horizontal and direct beam solar estimates with measured data from three stations, collected after the end of the available resource estimates.

  7. Geothermal resource base of the world: a revision of the Electric Power Research Institute's estimate

    SciTech Connect (OSTI)

    Aldrich, M.J.; Laughlin, A.W.; Gambill, D.T.

    1981-04-01

    Review of the Electric Power Research Institute's (EPRI) method for calculating the geothermal resource base of a country shows that modifications are needed for several of the assumptions used in the calculation. These modifications include: (1) separating geothermal belts into volcanic types with a geothermal gradient of 50 °C/km and complex types in which 80% of the area has a temperature gradient of 30 °C/km and 20% has a gradient of 45 °C/km, (2) using the actual mean annual temperature of a country rather than an assumed 15 °C average ambient temperature, and (3) making separate calculations for the resource stored in water/brine and that stored in rock. Comparison of this method (Revised EPRI) for calculating a geothermal resource base with other resource base estimates made from a heat flow map of Europe indicates that the technique yields reasonable values. The calculated geothermal resource bases, stored in water and rock to a depth of 5 km, for each country in the world are given. Approximately five times as much energy is stored in rock as is stored in water.

  8. Precipitation Estimate Using NEXRAD Ground-Based Radar Images: Validation, Calibration and Spatial Analysis

    SciTech Connect (OSTI)

    Zhang, Xuesong

    2012-12-17

    Precipitation is an important input variable for hydrologic and ecological modeling and analysis. Next Generation Radar (NEXRAD) can provide precipitation products that cover most of the continental United States at a high spatial resolution of approximately 4 km × 4 km. Two major issues concerning the applications of NEXRAD data are (1) the lack of a NEXRAD geo-processing and geo-referencing program and (2) bias correction of NEXRAD estimates. In this chapter, a geographic information system (GIS) based software package that can automatically support processing of NEXRAD data for hydrologic and ecological models is presented. Some geostatistical approaches to calibrating NEXRAD data using rain gauge data are introduced, and two case studies on evaluating the accuracy of the NEXRAD Multisensor Precipitation Estimator (MPE) and calibrating MPE with rain-gauge data are presented. The first case study examines the performance of MPE in a mountainous region versus the southern plains and in the cold season versus the warm season, as well as the effect of sub-grid variability and temporal scale on NEXRAD performance. From the results of the first case study, the performance of MPE was found to be influenced by complex terrain, frozen precipitation, sub-grid variability, and temporal scale. Overall, the assessment of MPE indicates the importance of removing the bias of the MPE precipitation product before its application, especially in the complex mountainous region. The second case study examines the performance of three MPE calibration methods using rain gauge observations in the Little River Experimental Watershed in Georgia. The comparison results show that no one method performs better than the others in terms of all evaluation coefficients and for all time steps. For practical estimation of the precipitation distribution, implementation of multiple methods to predict spatial precipitation is suggested.

  9. Development of the town data base: Estimates of exposure rates and times of fallout arrival near the Nevada Test Site

    SciTech Connect (OSTI)

    Thompson, C.B.; McArthur, R.D.; Hutchinson, S.W.

    1994-09-01

    As part of the U.S. Department of Energy's Off-Site Radiation Exposure Review Project, the time of fallout arrival and the H+12 exposure rate were estimated for populated locations in Arizona, California, Nevada, and Utah that were affected by fallout from one or more nuclear tests at the Nevada Test Site. Estimates of exposure rate were derived from measured values recorded before and after each test by fallout monitors in the field. The estimate for a given location was obtained by retrieving from a data base all measurements made in the vicinity, decay-correcting them to H+12, and calculating an average. Estimates were also derived from maps produced after most events that show isopleths of exposure rate and time of fallout arrival. Both sets of isopleths on these maps were digitized, and kriging was used to interpolate values at the nodes of a 10-km grid covering the pattern. The values at any location within the grid were then estimated from the values at the surrounding grid nodes. Estimates of dispersion (standard deviation) were also calculated. The Town Data Base contains the estimates for all combinations of location and nuclear event for which the estimated mean H+12 exposure rate was greater than three times background. A listing of the data base is included as an appendix. The information was used by other project task groups to estimate the radiation dose that off-site populations and individuals may have received as a result of exposure to fallout from Nevada nuclear tests.

  10. Estimating present climate in a warming world: a model-based approach

    SciTech Connect (OSTI)

    Raeisaenen, J.; Ruokolainen, L. [University of Helsinki (Finland). Division of Atmospheric Sciences and Geophysics

    2008-09-30

    Weather services base their operational definitions of 'present' climate on past observations, using a 30-year normal period such as 1961-1990 or 1971-2000. In a world with ongoing global warming, however, past data give a biased estimate of the actual present-day climate. Here we propose to correct this bias with a 'delta change' method, in which model-simulated climate changes and observed global mean temperature changes are used to extrapolate past observations forward in time, to make them representative of present or future climate conditions. In a hindcast test for the years 1991-2002, the method works well for temperature, with a clear improvement in verification statistics compared to the case in which the hindcast is formed directly from the observations for 1961-1990. However, no improvement is found for precipitation, for which the signal-to-noise ratio between expected anthropogenic changes and interannual variability is much lower than for temperature. An application of the method to the present (around the year 2007) climate suggests that, as a geographical average over land areas excluding Antarctica, 8-9 months per year and 8-9 years per decade can be expected to be warmer than the median for 1971-2000. Along with the overall warming, a substantial increase in the frequency of warm extremes at the expense of cold extremes of monthly-to-annual temperature is expected.
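
    The delta-change correction amounts to shifting each past observation by the model-simulated local change scaled to the observed global-mean warming. A minimal sketch with hypothetical arrays (not the study's model output or station data):

        import numpy as np

        # Minimal delta-change sketch (hypothetical inputs): shift past monthly
        # temperature observations forward so they better represent present climate.
        rng = np.random.default_rng(3)
        obs_1971_2000 = rng.normal(10.0, 4.0, (30, 12))   # °C, 30 years x 12 months
        local_change_per_degC = np.full(12, 1.2)          # model-simulated local change per
                                                          # 1 °C of global warming (assumed)
        global_warming = 0.5                              # observed global-mean change, °C (assumed)

        present_day_sample = obs_1971_2000 + local_change_per_degC * global_warming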

  11. Moment-Based Probability Modeling and Extreme Response Estimation, The FITS Routine Version 1.2

    SciTech Connect (OSTI)

    Manuel, Lance; Kashef, Tina; Winterstein, Steven R.

    1999-11-01

    This report documents the use of the FITS routine, which provides automated fits of various analytical, commonly used probability models from input data. It is intended to complement the previously distributed FITTING routine documented in RMS Report 14 (Winterstein et al., 1994), which implements relatively complex four-moment distribution models whose parameters are fit with numerical optimization routines. Although these four-moment fits can be quite useful and faithful to the observed data, their complexity can make them difficult to automate within standard fitting algorithms. In contrast, FITS provides more robust (lower moment) fits of simpler, more conventional distribution forms. For each database of interest, the routine estimates the distribution of annual maximum response based on the data values and the duration, T, over which they were recorded. To focus on the upper tails of interest, the user can also supply an arbitrary lower-bound threshold, χ_low, above which a shifted distribution model (exponential or Weibull) is fit.
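
    The tail-fit idea can be sketched as follows (this is not the FITS code itself): fit a shifted Weibull to exceedances above a user-supplied threshold χ_low, then raise the resulting peak distribution to the number of peaks per year to approximate the annual maximum. All inputs below are synthetic.

        import numpy as np
        from scipy import stats

        # Threshold-based tail fit and annual-maximum extrapolation (synthetic data).
        rng = np.random.default_rng(7)
        response = rng.gumbel(2.0, 0.5, 5000)     # simulated response peaks (assumed)
        duration_hours = 5000.0                   # record duration T (assumed)
        x_low = np.quantile(response, 0.90)       # lower-bound threshold chi_low

        tail = response[response > x_low] - x_low
        c, _, scale = stats.weibull_min.fit(tail, floc=0.0)
        p_exceed = tail.size / response.size      # P(peak exceeds chi_low)

        peaks_per_year = response.size * 8766.0 / duration_hours
        x = np.linspace(x_low, response.max() * 1.5, 200)
        sf_peak = p_exceed * stats.weibull_min.sf(x - x_low, c, loc=0.0, scale=scale)
        cdf_annual_max = (1.0 - sf_peak) ** peaks_per_year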

  12. Enhancing adaptive sparse grid approximations and improving refinement strategies using adjoint-based a posteriori error estimates

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Jakeman, J. D.; Wildey, T.

    2015-01-01

    In this paper we present an algorithm for adaptive sparse grid approximations of quantities of interest computed from discretized partial differential equations. We use adjoint-based a posteriori error estimates of the interpolation error in the sparse grid to enhance the sparse grid approximation and to drive adaptivity. We show that utilizing these error estimates provides significantly more accurate functional values for random samples of the sparse grid approximation. We also demonstrate that alternative refinement strategies based upon a posteriori error estimates can lead to further increases in accuracy in the approximation over traditional hierarchical surplus based strategies. Throughout this paper we also provide and test a framework for balancing the physical discretization error with the stochastic interpolation error of the enhanced sparse grid approximation.

  13. Enhancing adaptive sparse grid approximations and improving refinement strategies using adjoint-based a posteriori error estimates

    SciTech Connect (OSTI)

    Jakeman, J.D.; Wildey, T.

    2015-01-01

    In this paper we present an algorithm for adaptive sparse grid approximations of quantities of interest computed from discretized partial differential equations. We use adjoint-based a posteriori error estimates of the physical discretization error and the interpolation error in the sparse grid to enhance the sparse grid approximation and to drive adaptivity of the sparse grid. Utilizing these error estimates provides significantly more accurate functional values for random samples of the sparse grid approximation. We also demonstrate that alternative refinement strategies based upon a posteriori error estimates can lead to further increases in accuracy in the approximation over traditional hierarchical surplus based strategies. Throughout this paper we also provide and test a framework for balancing the physical discretization error with the stochastic interpolation error of the enhanced sparse grid approximation.

  14. Remote sensing-based estimation of annual soil respiration at two contrasting forest sites

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Gu, Lianhong; Huang, Ni; Black, T. Andrew; Wang, Li; Niu, Zheng

    2015-11-23

    Soil respiration (Rs), an important component of the global carbon cycle, can be estimated using remotely sensed data, but the accuracy of this technique has not been thoroughly investigated. In this article, we proposed a methodology for the remote estimation of annual Rs at two contrasting FLUXNET forest sites (a deciduous broadleaf forest and an evergreen needleleaf forest).

  15. Analysis of In-Use Fuel Economy Shortfall Based on Voluntarily Reported MPG Estimates

    SciTech Connect (OSTI)

    Greene, David L; Goeltz, Rick; Hopson, Dr Janet L; Tworek, Elzbieta

    2007-01-01

    The usefulness of the Environmental Protection Agency's (EPA) passenger car and light truck fuel economy estimates has been the subject of debate for the past three decades. For the labels on new vehicles and the fuel economy information given to the public, the EPA adjusts dynamometer test results downward by 10% for the city cycle and 22% for the highway cycle to better reflect real world driving conditions. These adjustment factors were developed in 1984 and their continued validity has repeatedly been questioned. In March of 2005 the U.S. Department of Energy (DOE) and EPA's fuel economy information website, www.fueleconomy.gov, began allowing users to voluntarily share fuel economy estimates. This paper presents an initial statistical analysis of more than 3,000 estimates submitted by website users. The analysis suggests two potentially important results: (1) adjusted, combined EPA fuel economy estimates appear to be approximately unbiased estimators of the average fuel economy consumers will experience in actual driving, and (2) the EPA estimates are highly imprecise predictors of any given individual's in-use fuel economy, an approximate 95% confidence interval being +/-7 MPG. These results imply that what is needed is not less biased adjustment factors for the EPA estimates but rather more precise methods of predicting the fuel economy individual consumers will achieve in their own driving.

  16. Remote sensing-based estimation of annual soil respiration at two contrasting forest sites

    SciTech Connect (OSTI)

    Gu, Lianhong; Huang, Ni; Black, T. Andrew; Wang, Li; Niu, Zheng

    2015-11-23

    Soil respiration (Rs), an important component of the global carbon cycle, can be estimated using remotely sensed data, but the accuracy of this technique has not been thoroughly investigated. In this article, we proposed a methodology for the remote estimation of annual Rs at two contrasting FLUXNET forest sites (a deciduous broadleaf forest and an evergreen needleleaf forest).

  17. A class of error estimators based on interpolating the finite element solutions for reaction-diffusion equations

    SciTech Connect (OSTI)

    Lin, T.; Wang, H.

    1995-12-31

    The swift improvement of computational capabilities enables us to apply finite element methods to simulate more and more problems arising from various applications. A fundamental question associated with finite element simulations is their accuracy. In other words, before we can make any decisions based on the numerical solutions, we must be sure that they are acceptable in the sense that their errors are within the given tolerances. Various estimators have been developed to assess the accuracy of finite element solutions, and they can be classified basically into two types: a priori error estimates and a posteriori error estimates. While a priori error estimates can give us asymptotic convergence rates of numerical solutions in terms of the grid size before the computations, they depend on certain Sobolev norms of the true solutions which are not known, in general. Therefore, it is difficult, if not impossible, to use a priori estimates directly to decide whether a numerical solution is acceptable or whether a finer partition (and so a new numerical solution) is needed. In contrast, a posteriori error estimates depend only on the numerical solutions, and they usually give computable quantities about the accuracy of the numerical solutions.

  18. Estimates of Refrigerator Loads in Public Housing Based on Metered Consumption Data

    SciTech Connect (OSTI)

    Miller, JD; Pratt, RG

    1998-09-11

    The New York Power Authority (NYPA), the New York City Housing Authority (NYCHA), and the U.S. Departments of Housing and Urban Development (HUD) and Energy (DOE) have joined in a project to replace refrigerators in New York City public housing with new, highly energy-efficient models. This project laid the groundwork for the Consortium for Energy Efficiency (CEE) and DOE to enable housing authorities throughout the United States to bulk-purchase energy-efficient appliances. DOE helped develop and plan the program through the ENERGY STAR® Partnerships program conducted by its Pacific Northwest National Laboratory (PNNL). PNNL was subsequently asked to conduct the savings evaluations for 1996 and 1997. PNNL designed the metering protocol and occupant survey, supplied and calibrated the metering equipment, and managed and analyzed the data. The 1996 metering study of refrigerator energy usage in New York City public housing (Pratt and Miller 1997) established the need and justification for a regression-model-based approach to an energy savings estimate. The need originated in logistical difficulties associated with sampling the population and performing a stratified analysis. Commonly, refrigerators with high representation in the population were missed in the sampling schedule, leaving significant holes in the sample and difficulties for the stratified analysis. The justification was found in the fact that strata (distinct groups of identical refrigerators) were not statistically distinct in terms of their label ratio (ratio of metered consumption to label rating). This finding suggested a general regression model could be used to represent the consumption of all refrigerators in the population. In 1996 a simple two-coefficient regression model, a function of only the refrigerator label rating, was developed and used to represent the existing population of refrigerators. A key concept used in the 1997 study grew from findings in a small number of apartments ...
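
    The two-coefficient regression model referred to above is simply metered consumption expressed as a linear function of the label rating. A minimal sketch with synthetic numbers (not the NYCHA metered sample):

        import numpy as np

        # Two-coefficient regression of metered refrigerator consumption on the
        # label rating, plus the label ratio used to compare strata (synthetic data).
        rng = np.random.default_rng(5)
        label_kwh = rng.uniform(600, 1200, 200)                  # nameplate rating, kWh/yr
        metered_kwh = 0.85 * label_kwh + 120 + rng.normal(0, 60, 200)

        slope, intercept = np.polyfit(label_kwh, metered_kwh, 1)
        predicted_kwh = intercept + slope * label_kwh
        label_ratio = metered_kwh / label_kwh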

  19. Dynamic State Estimation and Parameter Calibration of DFIG based on Ensemble Kalman Filter

    SciTech Connect (OSTI)

    Fan, Rui; Huang, Zhenyu; Wang, Shaobu; Diao, Ruisheng; Meng, Da

    2015-07-30

    With the growing interest in the application of wind energy, the doubly fed induction generator (DFIG) plays an essential role in the industry nowadays. To deal with the increasing stochastic variations introduced by intermittent wind resources and responsive loads, dynamic state estimation (DSE) is introduced in any power system associated with DFIGs. However, this dynamic analysis sometimes cannot work because the parameters of the DFIGs are not accurate enough. To solve this problem, an ensemble Kalman filter (EnKF) method is proposed for the state estimation and parameter calibration tasks. In this paper, a DFIG is modeled and implemented with the EnKF method. Sensitivity analysis is demonstrated regarding the measurement noise, initial state errors, and parameter errors. The results indicate that this EnKF method has robust performance in the state estimation and parameter calibration of DFIGs.
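
    A generic joint state/parameter EnKF on a toy first-order model illustrates the calibration mechanism; the DFIG model in the paper is far more detailed, and everything below is an assumed stand-in.

        import numpy as np

        # Toy ensemble Kalman filter for joint state/parameter estimation on
        # x_{k+1} = a*x_k + w, where the uncertain coefficient 'a' plays the role
        # of a machine parameter calibrated from noisy measurements of x.
        rng = np.random.default_rng(11)
        a_true, n_steps, n_ens = 0.95, 200, 100
        x, zs = 1.0, []
        for _ in range(n_steps):
            x = a_true * x + rng.normal(0, 0.1)
            zs.append(x + rng.normal(0, 0.05))

        ens = np.vstack([rng.normal(1.0, 0.5, n_ens),     # row 0: state ensemble
                         rng.normal(0.80, 0.10, n_ens)])  # row 1: parameter ensemble
        r = 0.05 ** 2                                      # measurement noise variance
        for z in zs:
            # forecast: propagate states, let the parameter do a small random walk
            ens[0] = ens[1] * ens[0] + rng.normal(0, 0.1, n_ens)
            ens[1] = ens[1] + rng.normal(0, 0.005, n_ens)
            # analysis: update both rows using the measurement of the state
            hx = ens[0]
            anom = ens - ens.mean(axis=1, keepdims=True)
            h_anom = hx - hx.mean()
            gain = (anom @ h_anom) / (h_anom @ h_anom + (n_ens - 1) * r)
            ens += np.outer(gain, (z + rng.normal(0, 0.05, n_ens)) - hx)

        print(f"calibrated parameter ~ {ens[1].mean():.3f} (true {a_true})")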

  20. Shielding and activity estimator for template-based nuclide identification methods

    DOE Patents [OSTI]

    Nelson, Karl Einar

    2013-04-09

    According to one embodiment, a method for estimating an activity of one or more radio-nuclides includes receiving one or more templates, the one or more templates corresponding to one or more radio-nuclides which contribute to a probable solution, receiving one or more weighting factors, each weighting factor representing a contribution of one radio-nuclide to the probable solution, computing an effective areal density for each of the one more radio-nuclides, computing an effective atomic number (Z) for each of the one more radio-nuclides, computing an effective metric for each of the one or more radio-nuclides, and computing an estimated activity for each of the one or more radio-nuclides. In other embodiments, computer program products, systems, and other methods are presented for estimating an activity of one or more radio-nuclides.

  1. GPU Acceleration of Mean Free Path Based Kernel Density Estimators for Monte Carlo Neutronics Simulations

    SciTech Connect (OSTI)

    Burke, Timothy P.; Kiedrowski, Brian C.; Martin, William R.; Brown, Forrest B.

    2015-11-19

    Kernel Density Estimators (KDEs) are a non-parametric density estimation technique that has recently been applied to Monte Carlo radiation transport simulations. Kernel density estimators are an alternative to histogram tallies for obtaining global solutions in Monte Carlo simulations. With KDEs, a single event, either a collision or particle track, can contribute to the score at multiple tally points, with the uncertainty at those points being independent of the desired resolution of the solution. Thus, KDEs show potential for obtaining estimates of a global solution with reduced variance when compared to a histogram. Previously, KDEs have been applied to neutronics for one-group reactor physics problems and fixed-source shielding applications; however, little work was done to obtain reaction rates using KDEs. This paper introduces a new form of the MFP KDE that is capable of handling general geometries. Furthermore, extending the MFP KDE to 2-D problems in continuous energy introduces inaccuracies to the solution. An ad hoc solution to these inaccuracies is introduced that produces errors smaller than 4% at material interfaces.
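
    The kernel-tally idea, in which each collision contributes a kernel-weighted score to nearby tally points rather than a count in a single histogram bin, can be sketched in 1-D with a fixed-bandwidth Gaussian kernel. The paper's kernels are mean-free-path based and the geometry is more general; this is only an illustrative stand-in.

        import numpy as np

        # Toy 1-D collision-based kernel density tally with a fixed Gaussian kernel.
        rng = np.random.default_rng(2)
        collision_sites = rng.exponential(3.0, 20_000)          # cm, toy free-flight sampling
        collision_sites = collision_sites[collision_sites < 10.0]
        weights = np.ones_like(collision_sites)                 # particle weights

        tally_points = np.linspace(0.0, 10.0, 101)
        h = 0.25                                                # kernel bandwidth, cm (assumed)
        kernel = np.exp(-0.5 * ((tally_points[:, None] - collision_sites[None, :]) / h) ** 2)
        kernel /= h * np.sqrt(2.0 * np.pi)
        kde_tally = kernel @ weights / weights.size             # collision-density estimate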

  2. Model-based PSF and MTF estimation and validation from skeletal clinical CT images

    SciTech Connect (OSTI)

    Pakdel, Amirreza; Mainprize, James G.; Robert, Normand; Fialkov, Jeffery; Whyne, Cari M.

    2014-01-15

    Purpose: A method was developed to correct for systematic errors in estimating the thickness of thin bones due to image blurring in CT images, using bone interfaces to estimate the point-spread function (PSF). This study validates the accuracy of the PSFs estimated using said method from various clinical CT images featuring cortical bones. Methods: Gaussian PSFs, characterized by a different extent in the z (scan) direction than in the x and y directions, were obtained using our method from 11 clinical CT scans of a cadaveric craniofacial skeleton. These PSFs were estimated for multiple combinations of scanning parameters and reconstruction methods. The actual PSF for each scan setting was measured using the slanted-slit technique within the image slice plane and along the longitudinal axis. The Gaussian PSF and the corresponding modulation transfer function (MTF) are compared against the actual PSF and MTF for validation. Results: The differences (errors) between the actual and estimated full-width half-max (FWHM) of the PSFs were 0.09 ± 0.05 and 0.14 ± 0.11 mm for the xy and z axes, respectively. The overall errors in the predicted frequencies measured at the 75%, 50%, 25%, 10%, and 5% MTF levels were 0.06 ± 0.07 and 0.06 ± 0.04 cycles/mm for the xy and z axes, respectively. The accuracy of the estimates depended on whether the images were reconstructed with a standard kernel (Toshiba's FC68, mean error of 0.06 ± 0.05 mm, MTF mean error 0.02 ± 0.02 cycles/mm) or a high-resolution bone kernel (Toshiba's FC81, PSF FWHM error 0.12 ± 0.03 mm, MTF mean error 0.09 ± 0.08 cycles/mm). Conclusions: The method is accurate in 3D for an image reconstructed using a standard reconstruction kernel, which conforms to the Gaussian PSF assumption, but less accurate when using a high-resolution bone kernel. The method is a practical and self-contained means of estimating the PSF in clinical CT images featuring cortical bones, without the need for phantoms or any prior knowledge about the scanner.
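
    For a Gaussian PSF the FWHM and MTF follow directly from the standard deviation, FWHM = 2·sqrt(2·ln 2)·σ and MTF(f) = exp(-2π²σ²f²). A small sketch of that conversion, with an assumed σ rather than a fitted value from the study:

        import numpy as np

        # Standard Gaussian PSF/MTF relations: FWHM = 2*sqrt(2*ln2)*sigma,
        # MTF(f) = exp(-2*pi^2*sigma^2*f^2).
        sigma_mm = 0.35                                  # assumed Gaussian PSF sigma, mm

        fwhm_mm = 2.0 * np.sqrt(2.0 * np.log(2.0)) * sigma_mm
        freqs = np.linspace(0.0, 2.0, 201)               # spatial frequency, cycles/mm
        mtf = np.exp(-2.0 * (np.pi * sigma_mm * freqs) ** 2)

        # frequencies at which the MTF drops to 50% and 10%
        f50 = np.sqrt(-np.log(0.50)) / (np.pi * sigma_mm * np.sqrt(2.0))
        f10 = np.sqrt(-np.log(0.10)) / (np.pi * sigma_mm * np.sqrt(2.0))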

  3. A procedure for the estimation of the numerical uncertainty of CFD calculations based on grid refinement studies

    SciTech Connect (OSTI)

    Eça, L.; Hoekstra, M.

    2014-04-01

    This paper offers a procedure for the estimation of the numerical uncertainty of any integral or local flow quantity as a result of a fluid flow computation; the procedure requires solutions on systematically refined grids. The error is estimated with power series expansions as a function of the typical cell size. These expansions, of which four types are used, are fitted to the data in the least-squares sense. The selection of the best error estimate is based on the standard deviation of the fits. The error estimate is converted into an uncertainty with a safety factor that depends on the observed order of grid convergence and on the standard deviation of the fit. For well-behaved data sets, i.e. monotonic convergence with the expected observed order of grid convergence and no scatter in the data, the method reduces to the well known Grid Convergence Index. Examples of application of the procedure are included. - Highlights: • Estimation of the numerical uncertainty of any integral or local flow quantity. • Least squares fits to power series expansions to handle noisy data. • Excellent results obtained for manufactured solutions. • Consistent results obtained for practical CFD calculations. • Reduces to the well known Grid Convergence Index for well-behaved data sets.
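
    The core of such a procedure, fitting an expansion phi(h) = phi0 + alpha*h**p to solutions on systematically refined grids and converting the fit into an uncertainty with a safety factor, can be sketched for three grids. The numbers and the simplified safety-factor rule below are illustrative, not the paper's full procedure.

        import numpy as np
        from scipy.optimize import curve_fit

        # Grid-refinement error estimate: fit phi(h) = phi0 + alpha*h**p to results
        # from systematically refined grids (illustrative values only).
        h = np.array([0.25, 0.5, 1.0])                  # relative cell sizes (fine to coarse)
        phi = np.array([1.0040, 1.0165, 1.0655])        # computed flow quantity on each grid

        def expansion(h, phi0, alpha, p):
            return phi0 + alpha * h**p

        (phi0, alpha, p), _ = curve_fit(expansion, h, phi, p0=[phi[0], 0.1, 2.0])
        error_fine = abs(phi[0] - phi0)                 # estimated discretization error
        safety_factor = 1.25 if 0.5 <= p <= 2.1 else 3.0   # assumed GCI-like choice
        uncertainty = safety_factor * error_fine
        print(f"observed order p ~ {p:.2f}, U ~ {uncertainty:.4f}")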

  4. ESTIMATING PHOTOMETRIC REDSHIFTS OF QUASARS VIA THE k-NEAREST NEIGHBOR APPROACH BASED ON LARGE SURVEY DATABASES

    SciTech Connect (OSTI)

    Zhang Yanxia; Ma He; Peng Nanbo; Zhao Yongheng; Wu Xuebing

    2013-08-01

    We apply one of the lazy learning methods, the k-nearest neighbor (kNN) algorithm, to estimate the photometric redshifts of quasars based on various data sets from the Sloan Digital Sky Survey (SDSS), the UKIRT Infrared Deep Sky Survey (UKIDSS), and the Wide-field Infrared Survey Explorer (WISE): the SDSS sample, the SDSS-UKIDSS sample, the SDSS-WISE sample, and the SDSS-UKIDSS-WISE sample. The influence of the k value and of different input patterns on the performance of kNN is discussed; kNN performs best when k and the input pattern are chosen separately for each data set. The best result belongs to the SDSS-UKIDSS-WISE sample. The experimental results generally show that the more bands of information that are used, the better the performance of photometric redshift estimation with kNN. The results also demonstrate that kNN using multiband data can effectively solve the catastrophic-failure problem of photometric redshift estimation that is met by many machine learning methods. Compared with the performance of various other methods of estimating the photometric redshifts of quasars, kNN based on a KD-tree shows superiority, exhibiting the best accuracy.
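
    A compact sketch of kNN photometric-redshift regression on synthetic colours follows; scikit-learn is shown as one possible implementation, and real inputs would be SDSS/UKIDSS/WISE magnitudes with spectroscopic redshifts as training labels.

        import numpy as np
        from sklearn.neighbors import KNeighborsRegressor
        from sklearn.model_selection import train_test_split

        # kNN regression of redshift on synthetic colours (placeholder data).
        rng = np.random.default_rng(8)
        n = 5000
        colours = rng.normal(size=(n, 4))            # e.g. u-g, g-r, r-i, i-z
        z_spec = 1.5 + 0.4 * colours[:, 0] - 0.2 * colours[:, 1] + rng.normal(0, 0.05, n)

        X_train, X_test, y_train, y_test = train_test_split(colours, z_spec, random_state=0)
        knn = KNeighborsRegressor(n_neighbors=17, weights="distance")
        knn.fit(X_train, y_train)
        z_phot = knn.predict(X_test)
        print("rms error:", np.sqrt(np.mean((z_phot - y_test) ** 2)))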

  5. SU-E-J-01: 3D Fluoroscopic Image Estimation From Patient-Specific 4DCBCT-Based Motion Models

    SciTech Connect (OSTI)

    Dhou, S; Hurwitz, M; Lewis, J; Mishra, P

    2014-06-01

    Purpose: 3D motion models derived from 4DCT images, taken days or weeks before treatment, cannot reliably represent patient anatomy on the day of treatment. We develop a method to generate motion models based on 4DCBCT acquired at the time of treatment, and apply the model to estimate 3D time-varying images (referred to as 3D fluoroscopic images). Methods: Motion models are derived through deformable registration between each 4DCBCT phase and principal component analysis (PCA) on the resulting displacement vector fields. 3D fluoroscopic images are estimated based on cone-beam projections simulating kV treatment imaging. PCA coefficients are optimized iteratively through comparison of these cone-beam projections and projections estimated based on the motion model. Digital phantoms reproducing ten patient motion trajectories, and a physical phantom with regular and irregular motion derived from measured patient trajectories, are used to evaluate the method in terms of tumor localization and the global voxel intensity difference compared to ground truth. Results: Experiments included: 1) assuming no anatomic or positioning changes between 4DCT and treatment time; and 2) simulating positioning and tumor baseline shifts at the time of treatment compared to 4DCT acquisition. 4DCBCT images were reconstructed from the anatomy as seen at treatment time. In case 1) the tumor localization error and the intensity differences in ten patients were smaller using the 4DCT-based motion model, possibly due to superior image quality. In case 2) the tumor localization error and intensity differences were 2.85 and 0.15, respectively, using 4DCT-based motion models, and 1.17 and 0.10 using 4DCBCT-based models. 4DCBCT performed better due to its ability to reproduce daily anatomical changes. Conclusion: The study showed an advantage of 4DCBCT-based motion models in the context of 3D fluoroscopic image estimation. Positioning and tumor baseline shift uncertainties were mitigated by the 4DCBCT-based motion models.

  6. Waste Estimates for a Future Recycling Plant in the US Based Upon AREVA Operating Experience - 13206

    SciTech Connect (OSTI)

    Foare, Genevieve; Meze, Florian; Bader, Sven; McGee, Don; Murray, Paul; Prud'homme, Pascal

    2013-07-01

    Estimates of process and secondary wastes produced by a recycling plant built in the U.S., which is composed of a used nuclear fuel (UNF) reprocessing facility and a mixed oxide (MOX) fuel fabrication facility, are performed as part of a U.S. Department of Energy (DOE) sponsored study [1]. In this study, a set of common inputs, assumptions, and constraints were identified to allow for comparison of these wastes between different industrial teams. AREVA produced a model of a reprocessing facility, an associated fuel fabrication facility, and waste treatment facilities to develop the results for this study. These facilities were divided into a number of discrete functional areas for which inlet and outlet flow streams were clearly identified to allow for an accurate determination of the radionuclide balance throughout the facility and the waste streams. AREVA relied primarily on its decades of experience and feedback from its La Hague (reprocessing) and MELOX (MOX fuel fabrication) commercial operating facilities in France to support this assessment. However, to perform these estimates for a U.S. facility with different regulatory requirements and to take advantage of some technological advancements, such as in the potential treatment of off-gases, some deviations from this experience were necessary. A summary of AREVA's approach and results for the recycling of 800 metric tonnes of initial heavy metal (MTIHM) of LWR UNF per year into MOX fuel under the assumptions and constraints identified for this DOE study are presented. (authors)

  7. Hawaii demand-side management resource assessment. Final report, Reference Volume 2: Final residential and commercial building prototypes and DOE-2.1E developed UECs and EUIs; Part 3

    SciTech Connect (OSTI)

    1995-04-01

    This section contains the detailed measured impact results and market segment data for each DSM case examined for this building type. A complete index of all base and measure cases defined for this building type is shown first. This index represents an expansion of the base and measure matrix presented in Table 1 (residential) or Table 2 (commercial) for the applicable sector. Following this index, a summary report sheet is provided for each DSM measure case in the order shown in the index. The summary report sheet contains a host of information and selected graphs which define and depict the measure impacts and outline the market segment data assumptions utilized for each case in the DBEDT DSM Forecasting models. The variables and figures included in the summary report sheet are described. Numerous tables and figures are included.

  8. The Role of Mathematical Methods in Efficiency Calibration and Uncertainty Estimation in Gamma Based Non-Destructive Assay - 12311

    SciTech Connect (OSTI)

    Venkataraman, R.; Nakazawa, D.

    2012-07-01

    Mathematical methods are being increasingly employed in the efficiency calibration of gamma based systems for non-destructive assay (NDA) of radioactive waste and for the estimation of the Total Measurement Uncertainty (TMU). Recently, ASTM (American Society for Testing and Materials) released a standard guide for use of modeling passive gamma measurements. This is a testimony to the common use and increasing acceptance of mathematical techniques in the calibration and characterization of NDA systems. Mathematical methods offer flexibility and cost savings in terms of rapidly incorporating calibrations for multiple container types, geometries, and matrix types in a new waste assay system or a system that may already be operational. Mathematical methods are also useful in modeling heterogeneous matrices and non-uniform activity distributions. In compliance with good practice, if a computational method is used in waste assay (or in any other radiological application), it must be validated or benchmarked using representative measurements. In this paper, applications involving mathematical methods in gamma based NDA systems are discussed with several examples. The application examples are from NDA systems that were recently calibrated and performance tested. Measurement based verification results are presented. Mathematical methods play an important role in the efficiency calibration of gamma based NDA systems. This is especially true when the measurement program involves a wide variety of complex item geometries and matrix combinations for which the development of physical standards may be impractical. Mathematical methods offer a cost effective means to perform TMU campaigns. Good practice demands that all mathematical estimates be benchmarked and validated using representative sets of measurements. (authors)

  9. Adjusting lidar-derived digital terrain models in coastal marshes based on estimated aboveground biomass density

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Medeiros, Stephen; Hagen, Scott; Weishampel, John; Angelo, James

    2015-03-25

    Digital elevation models (DEMs) derived from airborne lidar are traditionally unreliable in coastal salt marshes due to the inability of the laser to penetrate the dense grasses and reach the underlying soil. To that end, we present a novel processing methodology that uses ASTER Band 2 (visible red), an interferometric SAR (IfSAR) digital surface model, and lidar-derived canopy height to classify biomass density using both a three-class scheme (high, medium and low) and a two-class scheme (high and low). Elevation adjustments associated with these classes using both median and quartile approaches were applied to adjust lidar-derived elevation values closer to true bare earth elevation. The performance of the method was tested on 229 elevation points in the lower Apalachicola River Marsh. The two-class quartile-based adjusted DEM produced the best results, reducing the RMS error in elevation from 0.65 m to 0.40 m, a 38% improvement. The raw mean errors for the lidar DEM and the adjusted DEM were 0.61 ± 0.24 m and 0.32 ± 0.24 m, respectively, thereby reducing the high bias by approximately 49%.

  10. Adjusting lidar-derived digital terrain models in coastal marshes based on estimated aboveground biomass density

    SciTech Connect (OSTI)

    Medeiros, Stephen; Hagen, Scott; Weishampel, John; Angelo, James

    2015-03-25

    Digital elevation models (DEMs) derived from airborne lidar are traditionally unreliable in coastal salt marshes due to the inability of the laser to penetrate the dense grasses and reach the underlying soil. To that end, we present a novel processing methodology that uses ASTER Band 2 (visible red), an interferometric SAR (IfSAR) digital surface model, and lidar-derived canopy height to classify biomass density using both a three-class scheme (high, medium and low) and a two-class scheme (high and low). Elevation adjustments associated with these classes using both median and quartile approaches were applied to adjust lidar-derived elevation values closer to true bare earth elevation. The performance of the method was tested on 229 elevation points in the lower Apalachicola River Marsh. The two-class quartile-based adjusted DEM produced the best results, reducing the RMS error in elevation from 0.65 m to 0.40 m, a 38% improvement. The raw mean errors for the lidar DEM and the adjusted DEM were 0.61 ± 0.24 m and 0.32 ± 0.24 m, respectively, thereby reducing the high bias by approximately 49%.
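
    The adjustment step itself reduces to subtracting a per-class vertical offset from the lidar elevations. A minimal sketch with placeholder offsets; the study derives its offsets from medians or quartiles of the lidar error within each biomass-density class.

        import numpy as np

        # Class-based DEM correction: subtract an assumed per-class offset from
        # lidar-derived elevations (values are placeholders, not from the paper).
        lidar_dem = np.array([1.10, 0.95, 1.40, 0.80])      # m, lidar elevations
        biomass_class = np.array(["high", "low", "high", "low"])

        class_offset_m = {"high": 0.45, "low": 0.20}        # assumed quartile-based offsets
        adjusted_dem = lidar_dem - np.array([class_offset_m[c] for c in biomass_class])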

  11. Experimental data base for estimating the consequences from a hypothetical sabotage attack on a spent fuel shipping cask

    SciTech Connect (OSTI)

    Sandoval, R.P.; Luna, R.E.

    1986-01-01

    This paper describes the results of a program conducted at Sandia National Laboratories for the US Department of Energy to provide an experimental data base for estimating the radiological health effects that could result from the sabotage of a light water reactor spent fuel shipping cask. The primary objectives of the program were limited to: (1) evaluating the effectiveness of selected high energy devices (HED) in breaching full-scale spent fuel shipping casks, (2) quantifying and characterizing relevant aerosol and radiological properties of the released fuel, and (3) using the resulting experimental data to evaluate the radiological health effects resulting from a hypothetical attack on a spent fuel shipping cask in a densely populated urban area. 3 refs.

  12. Microsoft Word - UEC-CC_020315_min_TEH.docx

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    (Imperial College, London); Megan Robertson (U. Houston); Ray Unocic (ORNL/CNMS); Rafael Verduzco (Rice U.) Past Chair, ex officio member - Vivek Prabhu (NIST) CNMS...

  13. Microsoft Word - UEC-CC_120314_min_TEH.docx

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    (Imperial College, London); Megan Robertson (U. Houston); Ray Unocic (ORNL/CNMS); Rafael Verduzco (Rice U.) Past Chair, ex officio member - Tony Hmelo (Vanderbilt U.) CNMS...

  14. Microsoft Word - UEC-CC_040715_min_TEH.docx

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    (Imperial College, London); Megan Robertson (U. Houston); Ray Unocic (ORNL/CNMS); Rafael Verduzco (Rice U.) Past Chair, ex officio member - Vivek Prabhu (NIST) CNMS...

  15. Microsoft Word - UEC-CC_010615_min.docx

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    (Imperial College, London); Megan Robertson (U. Houston); Ray Unocic (ORNL/CNMS); Rafael Verduzco (Rice U.) Past Chair, ex officio member - Vivek Prabhu (NIST) CNMS...

  16. Microsoft Word - UEC-CC_100215_min_TEH.docx

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    (Imperial College, London); Megan Robertson (U. Houston); Ray Unocic (ORNL/CNMS); Rafael Verduzco (Rice U.) Past Chair, ex officio member - Vivek Prabhu (NIST) CNMS...

  17. Microsoft Word - UEC-CC_060414_min_TEH.docx

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    (Imperial College, London); Megan Robertson (U. Houston); Ray Unocic (ORNL/CNMS); Rafael Verduzco (Rice U.) Past Chair, ex officio member - Tony Hmelo (Vanderbilt U.) CNMS...

  18. Location: SNS CLO, Room C-156 UEC Members attending: Nazanin...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Milan Buncick, Eric Formo, Zheng Gai, Molly Kennedy, Vivek Prabhu, Ray Unocic, Rafael Verduzco Invited guests: Hans Christen, Bobby Sumpter, Tony Haynes, Brad Lokitz...

  19. Microsoft Word - UEC-CC_030514_min.docx

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    (Imperial College, London); Megan Robertson (U. Houston); Ray Unocic (ORNL/CNMS); Rafael Verduzco (Rice U.) Past Chair, ex officio member - Tony Hmelo (Vanderbilt U.) CNMS...

  20. Microsoft Word - UEC-CC_050515_min_TEH.docx

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    (Imperial College, London); Megan Robertson (U. Houston); Ray Unocic (ORNL/CNMS); Rafael Verduzco (Rice U.) Past Chair, ex officio member - Vivek Prabhu (NIST) CNMS...

  1. Microsoft Word - UEC-CC_040214_min_TEH.docx

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    (Imperial College, London); Megan Robertson (U. Houston); Ray Unocic (ORNL/CNMS); Rafael Verduzco (Rice U.) Past Chair, ex officio member - Tony Hmelo (Vanderbilt U.) CNMS...

  2. Microsoft Word - UEC-CC_050714_min_TEH.docx

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    (Imperial College, London); Megan Robertson (U. Houston); Ray Unocic (ORNL/CNMS); Rafael Verduzco (Rice U.) Past Chair, ex officio member - Tony Hmelo (Vanderbilt U.) CNMS...

  3. Microsoft Word - UEC-CC_080415_min_TEH.docx

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    (Imperial College, London); Megan Robertson (U. Houston); Ray Unocic (ORNL/CNMS); Rafael Verduzco (Rice U.) Past Chair, ex officio member - Vivek Prabhu (NIST) CNMS...

  4. Microsoft Word - UEC-CC_030315_min_TEH.docx

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    (Imperial College, London); Megan Robertson (U. Houston); Ray Unocic (ORNL/CNMS); Rafael Verduzco (Rice U.) Past Chair, ex officio member - Vivek Prabhu (NIST) CNMS...

  5. Microsoft Word - UEC-CC_111714_min.docx

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    (Imperial College, London); Megan Robertson (U. Houston); Ray Unocic (ORNL/CNMS); Rafael Verduzco (Rice U.) Past Chair, ex officio member - Tony Hmelo (Vanderbilt U.) CNMS...

  6. Microsoft Word - UEC Town_Hall_Meeting_notes.docx

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    * John Lange, Legislative Assistant to Congressman Robert Hurt, 5th District VA * Xan Fishman, Legislative Assistant to Congressman John Delaney, 6th District MD * Erika Appel,...

  7. Microsoft Word - UEC-Dinner-meeting-2014_Final.docx

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    electron materials - Soft matter theory/simulation Providing access to state-of-the-art nanoscience research capabilities by building on CNMS foundational strengths and the...

  8. UEC Members attending: Nina Balke, Nazanin Bassiri-Gharb, Milan...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Prabhu, Megan Robertson, Ichiro Takeuchi, Rafael Verduzco CNMS representatives: Sean Smith, Tony Haynes, Viviane Schwartz DOE Guest: Linda Horton Meeting convened approximately...

  9. CNMS_UEC_19_Sep_2011_Smith.pptx

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Peter Cummings, Laura Edwards, Tony Haynes, Sean Smith Meeting Convened 12:15P CNMS Update Provided by Sean Smith (slides attached) Discussion Points * It was noticed ...

  10. Lung motion estimation using dynamic point shifting: An innovative model based on a robust point matching algorithm

    SciTech Connect (OSTI)

    Yi, Jianbing; Yang, Xuan; Li, Yan-Ran; Chen, Guoliang

    2015-10-15

    Purpose: Image-guided radiotherapy is an advanced 4D radiotherapy technique that has been developed in recent years. However, respiratory motion causes significant uncertainties in image-guided radiotherapy procedures. To address these issues, an innovative lung motion estimation model based on a robust point matching is proposed in this paper. Methods: An innovative robust point matching algorithm using dynamic point shifting is proposed to estimate patient-specific lung motion during free breathing from 4D computed tomography data. The correspondence of the landmark points is determined from the Euclidean distance between the landmark points and the similarity between the local images that are centered at points at the same time. To ensure that the points in the source image correspond to the points in the target image during other phases, the virtual target points are first created and shifted based on the similarity between the local image centered at the source point and the local image centered at the virtual target point. Second, the target points are shifted by the constrained inverse function mapping the target points to the virtual target points. The source point set and shifted target point set are used to estimate the transformation function between the source image and target image. Results: The performances of the authors’ method are evaluated on two publicly available DIR-lab and POPI-model lung datasets. For computing target registration errors on 750 landmark points in six phases of the DIR-lab dataset and 37 landmark points in ten phases of the POPI-model dataset, the mean and standard deviation by the authors’ method are 1.11 and 1.11 mm, but they are 2.33 and 2.32 mm without considering image intensity, and 1.17 and 1.19 mm with sliding conditions. For the two phases of maximum inhalation and maximum exhalation in the DIR-lab dataset with 300 landmark points of each case, the mean and standard deviation of target registration errors on the

  11. The Application of Traits-Based Assessment Approaches to Estimate the Effects of Hydroelectric Turbine Passage on Fish Populations

    SciTech Connect (OSTI)

    Cada, Glenn F; Schweizer, Peter E

    2012-04-01

    One of the most important environmental issues facing the hydropower industry is the adverse impact of hydroelectric projects on downstream fish passage. Fish that migrate long distances as part of their life cycle include not only important diadromous species (such as salmon, shads, and eels) but also strictly freshwater species. The hydropower reservoirs that downstream-moving fish encounter differ greatly from free-flowing rivers. Many of the environmental changes that occur in a reservoir (altered water temperature and transparency, decreased flow velocities, increased predation) can reduce survival. Upon reaching the dam, downstream-migrating fish may suffer increased mortality as they pass through the turbines, spillways and other bypasses, or turbulent tailraces. Downstream from the dam, insufficient environmental flow releases may slow downstream fish passage rates or decrease survival. There is a need to refine our understanding of the relative importance of causative factors that contribute to turbine passage mortality (e.g., strike, pressure changes, turbulence) so that turbine design efforts can focus on mitigating the most damaging components. Further, present knowledge of the effectiveness of turbine improvements is based on studies of only a few species (mainly salmon and American shad). These data may not be representative of turbine passage effects for the hundreds of other fish species that are susceptible to downstream passage at hydroelectric projects. For example, there are over 900 species of fish in the United States. In Brazil there are an estimated 3,000 freshwater fish species, of which 30% are believed to be migratory (Viana et al. 2011). Worldwide, there are some 14,000 freshwater fish species (Magurran 2009), of which significant numbers are susceptible to hydropower impacts. By comparison, in a compilation of fish entrainment and turbine survival studies from over 100 hydroelectric projects in the United States, Winchell et al. (2000

  12. Assumptions to the Annual Energy Outlook 2015

    U.S. Energy Information Administration (EIA) Indexed Site

    3 Residential Demand Module The NEMS Residential Demand Module projects future residential sector energy requirements based on projections of the number of households and the stock, efficiency, and intensity of energy-consuming equipment. The Residential Demand Module projections begin with a base year estimate of the housing stock, the types and numbers of energy-consuming appliances servicing the stock, and the "unit energy consumption" (UEC) by appliance (in million Btu per
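
    The "unit energy consumption" bookkeeping can be illustrated with a toy calculation; the household count, appliance saturations, UECs, and efficiency indexes below are placeholders, not NEMS inputs:

      # residential energy = sum over appliances of households x saturation x UEC x efficiency index
      households = 120e6
      appliances = {
          # name: (saturation, UEC in million Btu per household-year, efficiency index vs. base year)
          "refrigerator": (1.20, 4.0, 0.85),
          "water_heater": (0.95, 16.0, 0.90),
      }
      total_trillion_btu = sum(
          households * sat * uec * eff for sat, uec, eff in appliances.values()
      ) / 1e6   # million Btu -> trillion Btu
      print(round(total_trillion_btu), "trillion Btu")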

  13. Reservoir Temperature Estimator

    Energy Science and Technology Software Center (OSTI)

    2014-12-08

    The Reservoir Temperature Estimator (RTEst) is a program that can be used to estimate deep geothermal reservoir temperature and chemical parameters such as CO2 fugacity based on the water chemistry of shallower, cooler reservoir fluids. This code uses the plugin features provided in The Geochemist’s Workbench (Bethke and Yeakel, 2011) and interfaces with the model-independent parameter estimation code Pest (Doherty, 2005) to provide for optimization of the estimated parameters based on the minimization of the weighted sum of squares of a set of saturation indexes from a user-provided mineral assemblage.
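
    The objective being minimized can be sketched as a weighted sum of squared saturation indices over the chosen mineral assemblage; the minerals, weights, and index values below are illustrative, and the real indices come from the geochemical model at each trial temperature:

      def weighted_ssq(saturation_indices, weights):
          """saturation_indices: mineral -> SI at the trial reservoir temperature / CO2 fugacity."""
          return sum(weights[m] * saturation_indices[m] ** 2 for m in saturation_indices)

      # At the optimum the assemblage is near equilibrium, so every SI is driven toward zero.
      print(weighted_ssq({"quartz": 0.05, "calcite": -0.10}, {"quartz": 1.0, "calcite": 1.0}))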

  14. A wedge-based approach to estimating health co-benefits of climate change mitigation activities in the United States

    SciTech Connect (OSTI)

    Balbus, John M.; Greenblatt, Jeffery B.; Chari, Ramya; Millstein, Dev; Ebi, Kristie L.

    2015-02-01

    While it has been recognized that actions reducing greenhouse gas (GHG) emissions can have significant positive and negative impacts on human health through reductions in ambient fine particulate matter (PM2.5) concentrations, these impacts are rarely taken into account when analyzing specific policies. This study presents a new framework for estimating the change in health outcomes resulting from implementation of specific carbon dioxide (CO2) reduction activities, allowing comparison of different sectors and options for climate mitigation activities. Our estimates suggest that in the year 2020, the reductions in adverse health outcomes from lessened exposure to PM2.5 would yield economic benefits in the range of $6 to $14 billion (in 2008 USD), depending on the specific activity. This equates to between $40 and $93 per metric ton of CO2 in health benefits. Specific climate interventions will vary in the health co-benefits they provide as well as in potential harms that may result from their implementation. Rigorous assessment of these health impacts is essential for guiding policy decisions as efforts to reduce GHG emissions increase in scope and intensity.

  15. Electrochemical state and internal variables estimation using a reduced-order physics-based model of a lithium-ion cell and an extended Kalman filter

    SciTech Connect (OSTI)

    Stetzel, KD; Aldrich, LL; Trimboli, MS; Plett, GL

    2015-03-15

    This paper addresses the problem of estimating the present value of electrochemical internal variables in a lithium-ion cell in real time, using readily available measurements of cell voltage, current, and temperature. The variables that can be estimated include any desired set of reaction flux and solid and electrolyte potentials and concentrations at any set of one-dimensional spatial locations, in addition to more standard quantities such as state of charge. The method uses an extended Kalman filter along with a one-dimensional physics-based reduced-order model of cell dynamics. Simulations show excellent and robust predictions having dependable error bounds for most internal variables. (C) 2014 Elsevier B.V. All rights reserved.
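
    The filtering step can be illustrated with a one-state extended Kalman filter in which simple coulomb counting and a toy open-circuit-voltage curve stand in for the paper's reduced-order electrochemical model; the capacity, resistance, and noise values are assumptions:

      def ocv(soc):                        # toy open-circuit-voltage curve (assumption)
          return 3.0 + 1.2 * soc

      capacity_As = 2.0 * 3600             # 2 Ah cell (assumption)
      Q, R = 1e-7, 1e-3                    # process / measurement noise variances (assumptions)

      def ekf_step(x, P, current_A, v_meas, dt=1.0):
          # Predict: coulomb counting (discharge current positive)
          x_pred = x - current_A * dt / capacity_As
          P_pred = P + Q
          # Update: linearize the voltage model around the prediction
          H = 1.2                          # d(ocv)/d(soc) for the toy curve
          innovation = v_meas - (ocv(x_pred) - current_A * 0.05)   # 0.05 ohm series resistance (assumption)
          K = P_pred * H / (H * P_pred * H + R)
          return x_pred + K * innovation, (1.0 - K * H) * P_pred

      x, P = ekf_step(0.5, 0.01, current_A=1.0, v_meas=3.55)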

  16. How Well Can We Estimate Areal-Averaged Spectral Surface Albedo from Ground-Based Transmission in an Atlantic Coastal Area?

    SciTech Connect (OSTI)

    Kassianov, Evgueni I.; Barnard, James C.; Flynn, Connor J.; Riihimaki, Laura D.; Marinovici, Maria C.

    2015-10-15

    Areal-averaged albedos are particularly difficult to measure in coastal regions, because the surface is not homogenous, consisting of a sharp demarcation between land and water. With this difficulty in mind, we evaluate a simple retrieval of areal-averaged surface albedo using ground-based measurements of atmospheric transmission alone under fully overcast conditions. To illustrate the performance of our retrieval, we find the areal-averaged albedo using measurements from the Multi-Filter Rotating Shadowband Radiometer (MFRSR) at five wavelengths (415, 500, 615, 673, and 870 nm). These MFRSR data are collected at a coastal site in Graciosa Island, Azores supported by the U.S. Department of Energy’s (DOE’s) Atmospheric Radiation Measurement (ARM) Program. The areal-averaged albedos obtained from the MFRSR are compared with collocated and coincident Moderate Resolution Imaging Spectroradiometer (MODIS) white-sky albedo at four nominal wavelengths (470, 560, 670 and 860 nm). These comparisons are made during a 19-month period (June 2009 - December 2010). We also calculate composite-based spectral values of surface albedo by a weighted-average approach using estimated fractions of major surface types observed in an area surrounding this coastal site. Taken as a whole, these three methods of finding albedo show spectral and temporal similarities, and suggest that our simple, transmission-based technique holds promise, but with estimated errors of about ±0.03. Additional work is needed to reduce this uncertainty in areas with inhomogeneous surfaces.
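
    The composite-based (weighted-average) albedo mentioned above reduces to a short calculation; the surface-type fractions and per-type albedos here are placeholders rather than the values used in the study:

      fractions = {"water": 0.55, "vegetation": 0.30, "bare_soil": 0.15}
      albedo_500nm = {"water": 0.06, "vegetation": 0.08, "bare_soil": 0.20}
      composite = sum(fractions[s] * albedo_500nm[s] for s in fractions)   # areal-averaged estimate at 500 nm
      print(round(composite, 3))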

  17. Rapid estimation of 4DCT motion-artifact severity based on 1D breathing-surrogate periodicity

    SciTech Connect (OSTI)

    Li, Guang; Caraveo, Marshall; Wei, Jie; Rimner, Andreas; Wu, Abraham J.; Goodman, Karyn A.; Yorke, Ellen

    2014-11-01

    Purpose: Motion artifacts are common in patient four-dimensional computed tomography (4DCT) images, leading to an ill-defined tumor volume with large variations for radiotherapy treatment and a poor foundation with low imaging fidelity for studying respiratory motion. The authors developed a method to estimate 4DCT image quality by establishing a correlation between the severity of motion artifacts in 4DCT images and the periodicity of the corresponding 1D respiratory waveform (1DRW) used for phase binning in 4DCT reconstruction. Methods: Discrete Fourier transformation (DFT) was applied to analyze 1DRW periodicity. The breathing periodicity index (BPI) was defined as the sum of the largest five Fourier coefficients, ranging from 0 to 1. Distortional motion artifacts (excluding blurring) of cine-scan 4DCT at the junctions of adjacent couch positions around the diaphragm were classified in three categories: incomplete, overlapping, and duplicate anatomies. To quantify these artifacts, discontinuity of the diaphragm at the junctions was measured in distance and averaged along six directions in three orthogonal views. Artifacts per junction (APJ) across the entire diaphragm were calculated in each breathing phase and phase-averaged APJ, defined as motion-artifact severity (MAS), was obtained for each patient. To make MAS independent of patient-specific motion amplitude, two new MAS quantities were defined: MAS^D is normalized to the maximum diaphragmatic displacement and MAS^V is normalized to the mean diaphragmatic velocity (the breathing period was obtained from DFT analysis of 1DRW). Twenty-six patients' free-breathing 4DCT images and corresponding 1DRW data were studied. Results: Higher APJ values were found around midventilation and full inhalation while the lowest APJ values were around full exhalation. The distribution of MAS is close to a Poisson distribution with a mean of 2.2 mm. The BPI among the 26 patients was calculated with a value
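
    The breathing periodicity index lends itself to a short sketch: take the DFT of the 1D respiratory waveform and sum the five largest Fourier magnitudes. Normalizing by the total non-DC spectral magnitude is an assumption made here so the index falls between 0 and 1:

      import numpy as np

      def breathing_periodicity_index(waveform, n_peaks=5):
          spectrum = np.abs(np.fft.rfft(waveform - np.mean(waveform)))[1:]   # drop the DC term
          top = np.sort(spectrum)[-n_peaks:]
          return top.sum() / spectrum.sum()

      t = np.arange(0, 100, 0.1)
      print(breathing_periodicity_index(np.sin(2 * np.pi * 0.25 * t)))   # near-periodic signal -> close to 1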

  18. SU-E-T-388: Estimating the Radioactivity Inventory of a Cyclotron Based Pencil Beam Proton Therapy Facility

    SciTech Connect (OSTI)

    Langen, K; Chen, S

    2014-06-01

    Purpose: Parts of the cyclotron and energy degrader are incidentally activated by protons lost during the acceleration and transport of protons for radiation therapy. An understanding of the radioactive material inventory is needed when regulatory requirements are assessed. Methods: First, the tumor dose and volume are used to determine the required energy deposition. For spot scanning, the tumor length along the beam path determines the number of required energy layers. For each energy layer the energy deposition per proton can be calculated from the residual proton range within the tumor. Assuming a typical layer weighting, an effective energy deposition per proton can then be calculated. The total number of required protons and the number of protons per energy layer can then be calculated. For each energy layer, proton losses in the energy degrader are calculated separately since its transmission efficiency, and hence the amount of protons lost, is energy dependent. The degrader efficiency also determines the number of protons requested from the cyclotron. The cyclotron extraction efficiency allows a calculation of the proton losses within the cyclotron. The saturation activity induced in the cyclotron and the degrader is equal to the production rate R for isotopes whose half-life is shorter than the projected cyclotron lifetime. R can be calculated from the proton loss rate and published production cross sections. Results: About 1/3 of the saturation activity is produced in the cyclotron and 2/3 in the energy degrader. For a projected case mix and a patient load of 1100 fractions per week at 1.8 Gy per fraction a combined activity of 180 mCi was estimated at saturation. Conclusion: Calculations were used to support the application for a radioactive materials license for the possession of 200 mCi of activity for isotopes with atomic numbers ranging from 1-83.

  19. SU-E-J-63: Estimating the Effects of Respiratory Motion On Dose Heterogeneity for ITV-Based Treatments

    SciTech Connect (OSTI)

    Williams, C; Lewis, J

    2014-06-01

    Purpose: To quantify the relationship between the amount of dose heterogeneity in a treatment plan that uses an internal target volume (ITV) to account for respiratory motion and the true amount of heterogeneity in the dose delivered to the tumor contained within that ITV. Methods: We develop a convolution-based framework for calculating dose delivered to a tumor moving inside an ITV according to a common sinusoid-based breathing model including asymmetry. We model the planned ITV dose distribution as a centrally peaked analytic function approximating the profile of clinical stereotactic body radiotherapy treatments. Expressions for the minimum and maximum dose received by the tumor are derived and evaluated for a range of clinically relevant parameters. Results of the model are validated with phantom measurements using an ion chamber array. Results: An analytic expression is presented for the maximum and minimum doses received by the tumor relative to the planned ITV dose. The tumor dose heterogeneity depends solely on the ratio of tumor size to ITV size, the peak dose in the planned ITV dose distribution, and the respiratory asymmetry parameter. Under the assumptions of this model, using a typical breathing asymmetry parameter and a dose distribution with a fixed size ITV covered by the 100% line and with a 130% hotspot, the maximum dose to the tumor varies between 113%-130%, and the minimum dose varies between 100%-116% depending on the amount of tumor motion. Conclusion: This modeling exercise demonstrates the interplay between motion and dose heterogeneity. Tumors that exhibit large amounts of respiratory motion relative to their size will receive a more homogeneous dose and a larger minimum dose than would be inferred from the ITV dose distribution. This effect is not captured in current clinical treatment planning methods unless 4D dose calculation techniques are used. This work was partially supported by a Varian Medical Systems research grant.

  20. Real-time global flood estimation using satellite-based precipitation and a coupled land surface and routing model

    SciTech Connect (OSTI)

    Wu, Huan; Adler, Robert F.; Tian, Yudong; Huffman, George J.; Li, Hongyi; Wang, JianJian

    2014-03-01

    A widely used land surface model, the Variable Infiltration Capacity (VIC) model, is coupled with a newly developed hierarchical dominant river tracing-based runoff-routing model to form the Dominant river tracing-Routing Integrated with VIC Environment (DRIVE) model, which serves as the new core of the real-time Global Flood Monitoring System (GFMS). The GFMS uses real-time satellite-based precipitation to derive flood monitoring parameters for the latitude band 50°N–50°S at relatively high spatial (~12 km) and temporal (3 hourly) resolution. Examples of model results for recent flood events are computed using the real-time GFMS (http://flood.umd.edu). To evaluate the accuracy of the new GFMS, the DRIVE model is run retrospectively for 15 years using both research-quality and real-time satellite precipitation products. Evaluation results are slightly better for the research-quality input and significantly better for longer duration events (3 day events versus 1 day events). Basins with fewer dams tend to provide lower false alarm ratios. For events longer than three days in areas with few dams, the probability of detection is ~0.9 and the false alarm ratio is ~0.6. In general, these statistical results are better than those of the previous system. Streamflow was evaluated at 1121 river gauges across the quasi-global domain. Validation using real-time precipitation across the tropics (30°S–30°N) gives positive daily Nash-Sutcliffe Coefficients for 107 out of 375 (28%) stations with a mean of 0.19 and 51% of the same gauges at monthly scale with a mean of 0.33. Finally, there were poorer results in higher latitudes, probably due to larger errors in the satellite precipitation input.
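
    The detection statistics quoted above come from a standard 2x2 contingency table of predicted versus observed flood events; a minimal sketch with placeholder counts:

      hits, misses, false_alarms = 90, 10, 135
      pod = hits / (hits + misses)                 # probability of detection, ~0.9 in this example
      far = false_alarms / (hits + false_alarms)   # false alarm ratio, ~0.6 in this example
      print(round(pod, 2), round(far, 2))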

  1. SU-E-CAMPUS-T-02: Can Pre-Treatment 4DCT-Based Motion Margins Estimates Be Trusted for Proton Radiotherapy?

    SciTech Connect (OSTI)

    Seco, J; Koybasi, O; Mishra, P; James, S St.; Lewis, J

    2014-06-15

    Purpose: Radiotherapy motion margins are generated using pre-treatment 4DCT data. The purpose of this study is to assess if pre-treatment 4DCT is sufficient in proton therapy to provide accurate estimate of motion margins. A dosimetric assessment is performed comparing pre-treatment margins with daily-customized margins. Methods: Gold fiducial markers implanted in lung tumors of patients were used to track the tumor. A spherical tumor of diameter 20 mm is inserted into a realistic digital respiratory phantom, where the tumor motion is based on real patient lung tumor trajectories recorded over multiple days. Using Day 1 patient data, 100 ITVs were generated with 1 s interval between consecutive scan start times. Each ITV was made up by the union of 10 tumor positions obtained from 6 s scan time. Two ITV volumes were chosen for treatment planning: ITVmean-σ and ITVmean+σ. The delivered dose was computed on i) 10 phases forming the planning ITV (10-phase - simulating dose calculation based on 4DCT) and ii) 50 phantoms produced from 100 s of data from any other day with tumor positions sampled every 2 s (dynamic - simulating the dose that would actually be delivered). Results: For similar breathing patterns between Day 1 and any other Day N(>1), the 95% volume coverage (D95) for dynamic case was 8.13% lower than the 10-phase case for ITVmean+σ. For breathing patterns that were very different between Day 1 and any other Day N(>1), this difference was as high as 24.5% for ITVmean-σ. Conclusion: Proton treatment planning based on pre-treatment 4DCT can lead to under-dosage of the tumor and over-dosage of the surrounding tissues, because of inadequate estimate of the range of motion of the tumor. This is due to the shift of the Bragg peak compared to photon therapy in which the tumor is surrounded by an electron bath.

  2. A Pilot Evaluation of a 4-Dimensional Cone-Beam Computed Tomographic Scheme Based on Simultaneous Motion Estimation and Image Reconstruction

    SciTech Connect (OSTI)

    Dang, Jun; Gu, Xuejun; Pan, Tinsu; Wang, Jing

    2015-02-01

    Purpose: To evaluate the performance of a 4-dimensional (4-D) cone-beam computed tomographic (CBCT) reconstruction scheme based on simultaneous motion estimation and image reconstruction (SMEIR) through patient studies. Methods and Materials: The SMEIR algorithm contains 2 alternating steps: (1) motion-compensated CBCT reconstruction using projections from all phases to reconstruct a reference phase 4D-CBCT by explicitly considering the motion models between each different phase and (2) estimation of motion models directly from projections by matching the measured projections to the forward projection of the deformed reference phase 4D-CBCT. Four lung cancer patients were scanned for 4 to 6 minutes to obtain approximately 2000 projections for each patient. To evaluate the performance of the SMEIR algorithm on a conventional 1-minute CBCT scan, the number of projections at each phase was reduced by a factor of 5, 8, or 10 for each patient. Then, 4D-CBCTs were reconstructed from the down-sampled projections using Feldkamp-Davis-Kress, total variation (TV) minimization, prior image constrained compressive sensing (PICCS), and SMEIR. Using the 4D-CBCT reconstructed from the fully sampled projections as a reference, the relative error (RE) of reconstructed images, root mean square error (RMSE), and maximum error (MaxE) of estimated tumor positions were analyzed to quantify the performance of the SMEIR algorithm. Results: The SMEIR algorithm can achieve results consistent with the reference 4D-CBCT reconstructed with many more projections per phase. With an average of 30 to 40 projections per phase, the MaxE in tumor position detection is less than 1 mm in SMEIR for all 4 patients. Conclusion: The results from a limited number of patients show that SMEIR is a promising tool for high-quality 4D-CBCT reconstruction and tumor motion modeling.

  3. MO-E-17A-08: Attenuation-Based Size Adjusted, Scanner-Independent Organ Dose Estimates for Head CT Exams: TG 204 for Head CT

    SciTech Connect (OSTI)

    McMillan, K; Bostani, M; Cagnon, C; McNitt-Gray, M; Zankl, M; DeMarco, J

    2014-06-15

    Purpose: AAPM Task Group 204 described size specific dose estimates (SSDE) for body scans. The purpose of this work is to use a similar approach to develop patient-specific, scanner-independent organ dose estimates for head CT exams using an attenuation-based size metric. Methods: For eight patient models from the GSF family of voxelized phantoms, dose to brain and lens of the eye was estimated using Monte Carlo simulations of contiguous axial scans for 64-slice MDCT scanners from four major manufacturers. Organ doses were normalized by scanner-specific 16 cm CTDIvol values and averaged across all scanners to obtain scanner-independent CTDIvol-to-organ-dose conversion coefficients for each patient model. Head size was measured at the first slice superior to the eyes; patient perimeter and effective diameter (ED) were measured directly from the GSF data. Because the GSF models use organ identification codes instead of Hounsfield units, water equivalent diameter (WED) was estimated indirectly. Using the image data from 42 patients ranging from 2 weeks old to adult, the perimeter, ED and WED size metrics were obtained and correlations between each metric were established. Applying these correlations to the GSF perimeter and ED measurements, WED was calculated for each model. The relationship between the various patient size metrics and CTDIvol-to-organ-dose conversion coefficients was then described. Results: The analysis of patient images demonstrated the correlation between WED and ED across a wide range of patient sizes. When applied to the GSF patient models, an exponential relationship between CTDIvol-to-organ-dose conversion coefficients and the WED size metric was observed with correlation coefficients of 0.93 and 0.77 for the brain and lens of the eye, respectively. Conclusion: Strong correlation exists between CTDIvol normalized brain dose and WED. For the lens of the eye, a lower correlation is observed, primarily due to surface dose variations. Funding
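
    The exponential relationship between the conversion coefficient and water-equivalent diameter can be fitted in log space; the (WED, coefficient) pairs below are invented for illustration and are not the task group or study values:

      import numpy as np

      wed = np.array([10.0, 14.0, 18.0, 22.0])       # water-equivalent diameter, cm (hypothetical)
      coeff = np.array([1.6, 1.3, 1.05, 0.85])       # organ dose / CTDIvol (hypothetical)
      b, log_a = np.polyfit(wed, np.log(coeff), 1)   # ln(coeff) = ln(a) + b * WED
      a = np.exp(log_a)
      print(round(a * np.exp(b * 16.0), 2))          # predicted coefficient for a 16 cm WED head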

  4. Cost Estimating, Analysis, and Standardization

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    1984-11-02

    To establish policy and responsibilities for: (a) developing and reviewing project cost estimates; (b) preparing independent cost estimates and analysis; (c) standardizing cost estimating procedures; and (d) improving overall cost estimating and analytical techniques, cost data bases, cost and economic escalation models, and cost estimating systems. Cancels DOE O 5700.2B, dated 8-5-1983; DOE O 5700.8, dated 5-27-1981; and HQ 1130.1A, dated 12-30-1981. Canceled by DOE O 5700.2D, dated 6-12-1992

  5. Parametric Hazard Function Estimation.

    Energy Science and Technology Software Center (OSTI)

    1999-09-13

    Version 00 Phaze performs statistical inference calculations on a hazard function (also called a failure rate or intensity function) based on reported failure times of components that are repaired and restored to service. Three parametric models are allowed: the exponential, linear, and Weibull hazard models. The inference includes estimation (maximum likelihood estimators and confidence regions) of the parameters and of the hazard function itself, testing of hypotheses such as increasing failure rate, and checking of the model assumptions.
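
    For the time-truncated single-component case, the maximum likelihood estimators for two of these hazard models are standard and compact; the sketch below uses the homogeneous-Poisson and power-law-process formulas as stand-ins for PHAZE's own implementation, with hypothetical failure times:

      import math

      def exponential_hazard_mle(failure_times, T):
          return len(failure_times) / T                 # constant failure rate

      def weibull_hazard_mle(failure_times, T):
          n = len(failure_times)
          beta = n / sum(math.log(T / t) for t in failure_times)    # shape
          theta = T / n ** (1.0 / beta)                             # scale
          return beta, theta, lambda t: (beta / theta) * (t / theta) ** (beta - 1.0)

      times = [120.0, 480.0, 700.0, 950.0]   # hypothetical failure times (hours), observed over 1000 h
      print(exponential_hazard_mle(times, 1000.0))
      print(weibull_hazard_mle(times, 1000.0)[:2])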

  6. Estimation of the electron density and radiative energy losses in a calcium plasma source based on an electron cyclotron resonance discharge

    SciTech Connect (OSTI)

    Potanin, E. P. Ustinov, A. L.

    2013-06-15

    The parameters of a calcium plasma source based on an electron cyclotron resonance (ECR) discharge were calculated. The analysis was performed as applied to an ion cyclotron resonance system designed for separation of calcium isotopes. The plasma electrons in the source were heated by gyrotron microwave radiation in the zone of the inhomogeneous magnetic field. It was assumed that, in such a combined trap, the energy of the extraordinary microwave propagating from the high-field side was initially transferred to a small group of resonance electrons. As a result, two electron components with different transverse temperatures-the hot resonance component and the cold nonresonance component-were created in the plasma. The longitudinal temperatures of both components were assumed to be equal. The entire discharge space was divided into a narrow ECR zone, where resonance electrons acquired transverse energy, and the region of the discharge itself, where the gas was ionized. The transverse energy of resonance electrons was calculated by solving the equations for electron motion in an inhomogeneous magnetic field. Using the law of energy conservation and the balance condition for the number of hot electrons entering the discharge zone and cooled due to ionization and elastic collisions, the density of hot electrons was estimated and the dependence of the longitudinal temperature Te∥ of the main (cold) electron component on the energy fraction β lost for radiation was obtained.

  7. Microsoft Word - UEC meeting with SHUG reps 09-17-09.doc

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Attendees- Facility Staff Lynn Kzsos (SNS) Al Ekkebus (SNS) Tony Haynes (CNMS) Laura Edwards (CNMS) Discussion of combining proposal calls, pros & cons Pros: expectations for...

  8. Microsoft Word - CNMS UEC Meeting 09-26-08 minutes.doc

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    corrected electron microscope and continuing development of the SNS instrument suite. He also discussed what makes a successful user program - high quality science output and...

  9. Statistical Evaluation of Travel Time Estimation Based on Data from Freeze-Branded Chinook Salmon on the Snake River, 1982-1990.

    SciTech Connect (OSTI)

    Smith, Steven G.; Skalski, J.R.; Giorgi, Albert E.

    1993-10-01

    The purpose of this investigation is to assess the strengths and limitations of existing freeze brand recapture data in describing the migratory dynamics of juvenile salmonids in the mainstem, impounded sections of the Snake and Columbia Rivers. With the increased concern over the threatened status of spring and summer chinook salmon in the Snake River drainage, we used representative stocks for these races as our study populations. However, statistical considerations resulting from these analyses apply to other species and drainages as well. This report describes analyses we conducted using information derived from freeze-branded groups. We examined both index production groups released from hatcheries upstream from Lower Granite Dam (1982-1990) and freeze-branded groups used as controls in smolt transportation evaluations conducted by the National Marine Fisheries Service (1986, 1989). The scope of our analysis was limited to describing travel time estimates and derived relationships, as well as reach survival estimates through the mainstem Snake River from Lower Granite to McNary Dam.

  10. Cost Estimation Package

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    1997-03-28

    This chapter focuses on the components (or elements) of the cost estimation package and their documentation.

  11. Check Estimates and Independent Costs

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    1997-03-28

    Check estimates and independent cost estimates (ICEs) are tools that can be used to validate a cost estimate. Estimate validation entails an objective review of the estimate to ensure that estimate criteria and requirements have been met and a well-documented, defensible estimate has been developed. This chapter describes check estimates and their procedures and various types of independent cost estimates.

  12. State Energy Production Estimates

    U.S. Energy Information Administration (EIA) Indexed Site

    Energy Production Estimates 1960 Through 2012 2012 Summary Tables Table P1. Energy Production Estimates in Physical Units, 2012 Alabama 19,455 215,710 9,525 0 Alaska 2,052 351,259...

  13. Types of Cost Estimates

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    1997-03-28

    The chapter describes the estimates required on government-managed projects for both general construction and environmental management.

  14. Hydrogen Station Cost Estimates: Comparing Hydrogen Station Cost Calculator Results with other Recent Estimates

    SciTech Connect (OSTI)

    Melaina, M.; Penev, M.

    2013-09-01

    This report compares hydrogen station cost estimates conveyed by expert stakeholders through the Hydrogen Station Cost Calculation (HSCC) to a select number of other cost estimates. These other cost estimates include projections based upon cost models and costs associated with recently funded stations.

  15. Estimation of the relationship between remotely sensed anthropogenic...

    Office of Scientific and Technical Information (OSTI)

    of Indianapolis, Indiana, USA. Anthropogenic heat discharge was estimated based on a remote sensing-based surface energy balance model, which was parameterized using land ...

  16. Weldon Spring historical dose estimate

    SciTech Connect (OSTI)

    Meshkov, N.; Benioff, P.; Wang, J.; Yuan, Y.

    1986-07-01

    This study was conducted to determine the estimated radiation doses that individuals in five nearby population groups and the general population in the surrounding area may have received as a consequence of activities at a uranium processing plant in Weldon Spring, Missouri. The study is retrospective and encompasses plant operations (1957-1966), cleanup (1967-1969), and maintenance (1969-1982). The dose estimates for members of the nearby population groups are as follows. Of the three periods considered, the largest doses to the general population in the surrounding area would have occurred during the plant operations period (1957-1966). Dose estimates for the cleanup (1967-1969) and maintenance (1969-1982) periods are negligible in comparison. Based on the monitoring data, if there was a person residing continually in a dwelling 1.2 km (0.75 mi) north of the plant, this person is estimated to have received an average of about 96 mrem/yr (ranging from 50 to 160 mrem/yr) above background during plant operations, whereas the dose to a nearby resident during later years is estimated to have been about 0.4 mrem/yr during cleanup and about 0.2 mrem/yr during the maintenance period. These values may be compared with the background dose in Missouri of 120 mrem/yr.

  17. Estimation of placental and lactational transfer and tissue distribution of atrazine and its main metabolites in rodent dams, fetuses, and neonates with physiologically based pharmacokinetic modeling

    SciTech Connect (OSTI)

    Lin, Zhoumeng; Fisher, Jeffrey W.; Wang, Ran; Ross, Matthew K.; Filipov, Nikolay M.

    2013-11-15

    Atrazine (ATR) is a widely used chlorotriazine herbicide, a ubiquitous environmental contaminant, and a potential developmental toxicant. To quantitatively evaluate placental/lactational transfer and fetal/neonatal tissue dosimetry of ATR and its major metabolites, physiologically based pharmacokinetic models were developed for rat dams, fetuses and neonates. These models were calibrated using pharmacokinetic data from rat dams repeatedly exposed (oral gavage; 5 mg/kg) to ATR followed by model evaluation against other available rat data. Model simulations corresponded well to the majority of available experimental data and suggest that: (1) the fetus is exposed to both ATR and its major metabolite didealkylatrazine (DACT) at levels similar to maternal plasma levels, (2) the neonate is exposed mostly to DACT at levels two-thirds lower than maternal plasma or fetal levels, while lactational exposure to ATR is minimal, and (3) gestational carryover of DACT greatly affects its neonatal dosimetry up until mid-lactation. To test the model's cross-species extrapolation capability, a pharmacokinetic study was conducted with pregnant C57BL/6 mice exposed (oral gavage; 5 mg/kg) to ATR from gestational day 12 to 18. By using mouse-specific parameters, the model predictions fitted well with the measured data, including placental ATR/DACT levels. However, fetal concentrations of DACT were overestimated by the model (10-fold). This overestimation suggests that only around 10% of the DACT that reaches the fetus is tissue-bound. These rodent models could be used in fetal/neonatal tissue dosimetry predictions to help design/interpret early life toxicity/pharmacokinetic studies with ATR and as a foundation for scaling to humans. - Highlights: We developed PBPK models for atrazine in rat dams, fetuses, and neonates. We conducted pharmacokinetic (PK) study with atrazine in pregnant mice. Model predictions were in good agreement with experimental rat and mouse PK data. The

  18. Methodology for Monthly Crude Oil Production Estimates

    U.S. Energy Information Administration (EIA) Indexed Site

    Methodology for Monthly Crude Oil Production Estimates Executive summary The U.S. Energy Information Administration (EIA) relies on data from state and other federal agencies and does not currently collect survey data directly from crude oil producers. Summarizing the estimation process in terms of percent of U.S. production: * 20% is based on state agency data, including North Dakota and

  19. U.S. Uranium Reserves Estimates

    Gasoline and Diesel Fuel Update (EIA)

    Methodology The U.S. uranium ore reserves reported by EIA for specific MFC categories represent the sums of quantities estimated to occur in known deposits on properties where data about the ore grade, configuration, and depth indicate that the quantities estimated could be recovered at or less than the stated costs given current mining and milling technology and regulations. The reserves estimates for year-end (delete: December 31, 2008), are based on historical data for uranium properties

  20. Estimating Specialty Costs

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    1997-03-28

    Specialty costs are those nonstandard, unusual costs that are not typically estimated. Costs for research and development (R&D) projects involving new technologies, costs associated with future regulations, and specialty equipment costs are examples of specialty costs. This chapter discusses those factors that are significant contributors to project specialty costs and methods of estimating costs for specialty projects.

  1. How EIA Estimates Natural Gas Production

    Reports and Publications (EIA)

    2004-01-01

    The Energy Information Administration (EIA) publishes estimates monthly and annually of the production of natural gas in the United States. The estimates are based on data EIA collects from gas producing states and data collected by the U. S. Minerals Management Service (MMS) in the Department of Interior. The states and MMS collect this information from producers of natural gas for various reasons, most often for revenue purposes. Because the information is not sufficiently complete or timely for inclusion in EIA's Natural Gas Monthly (NGM), EIA has developed estimation methodologies to generate monthly production estimates that are described in this document.

  2. HSRL mass estimate based on CALIPSO

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    NASA B-200 King Air ARCTAS/ISDAC Operations and Science Richard Ferrare, Chris Hostetler, ... hours science *5 flights coordinated with NASA DC-8 *3 flights coordinated with NASA P-3 ...

  3. HSRL mass estimate based on CALIPSO

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    NASA B-200 King Air ARCTAS/ISDAC Operations and Science Richard Ferrare, Chris Hostetler, John Hair, Anthony Cook, David Harper, Mike Obland, Ray Rogers, Sharon Burton, Matt Shupe, Dave Turner, Connor Flynn B200/HSRL Deployment During ARCTAS (Spring)  Independently measures aerosol/cloud extinction and backscatter profiles at 532 nm  Includes - Backscatter channels at 1064 nm - Polarization sensitivity at 532 and 1064 nm  Profile Measurement capabilities - Extensive measurements *

  4. HSRL mass estimate based on CALIPSO

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    OBSERVATIONS FROM THE NASA LANGLEY AIRBORNE HIGH SPECTRAL RESOLUTION LIDAR AND PLANS FOR ACTIVE-PASSIVE AEROSOL-CLOUD RETRIEVALS Chris A. Hostetler, Richard A. Ferrare, John W. Hair, Raymond R. Rogers, Mike Obland, Sharon P. Burton, Wenying Su, Anthony L. Cook, David B. Harper NASA HQ Science Mission Directorate Radiation Sciences Program Environmental Protection Agency (EPA) NASA CALIPSO Project Sponsors Department of Energy Atmospheric Science Program Texas Commission on Environmental Quality

  5. Cost Estimating Guide

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    2011-05-09

    This Guide provides uniform guidance and best practices that describe the methods and procedures that could be used in all programs and projects at DOE for preparing cost estimates. No cancellations.

  6. Derived Annual Estimates

    U.S. Energy Information Administration (EIA) Indexed Site

    74-1988 For Methodology Concerning the Derived Estimates Total Consumption of Offsite-Produced Energy for Heat and Power by Industry Group, 1974-1988 Total Energy *** Electricity...

  7. Cost Estimating Guide

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    2011-05-09

    This Guide provides uniform guidance and best practices that describe the methods and procedures that could be used in all programs and projects at DOE for preparing cost estimates.

  8. Cost Estimating Guide

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    1997-03-28

    The objective of this Guide is to improve the quality of cost estimates and further strengthen the DOE program/project management system. The original 25 separate chapters and three appendices have been combined to create a single document.

  9. Independent Cost Estimate (ICE)

    Broader source: Energy.gov [DOE]

    Independent Cost Estimate (ICE). On August 8-12, the Office of Project Management Oversight and Assessments (PM) will conduct an ICE on the NNSA Albuquerque Complex Project (NACP) at Albuquerque, NM. This estimate will support the Critical Decision (CD) for establishing the performance baseline and approval to start construction (CD-2/3). This project is at CD-1, with a total project cost range of $183M to $251M.

  10. REQUESTS FOR RETIREMENT ESTIMATE

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    REQUEST FOR RETIREMENT ANNUITY ESTIMATE Instructions: Please read and answer the following questions thoroughly to include checking all applicable boxes. Unanswered questions may delay processing. Print and Fax back your request form to 202.586.6395 or drop request to GM-169. The request will be assigned to your servicing retirement specialist. They will confirm receipt of your request. SECTION A Request Submitted _____________________ ______________________ ________________________

  11. Weekly Coal Production Estimation Methodology

    Gasoline and Diesel Fuel Update (EIA)

    Weekly Coal Production Estimation Methodology Step 1 (Estimate total amount of weekly U.S. coal production) U.S. coal production for the current week is estimated using a ratio ...
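
    The ratio method can be sketched in a few lines: scale a weekly activity indicator by the ratio of production to that indicator in a recent reference period. The use of railcar loadings and all numbers below are illustrative assumptions, not EIA's published inputs:

      ref_quarter_production_tons = 180e6
      ref_quarter_carloads = 1.5e6
      current_week_carloads = 110e3

      tons_per_carload = ref_quarter_production_tons / ref_quarter_carloads
      weekly_estimate = current_week_carloads * tons_per_carload
      print(f"{weekly_estimate / 1e6:.1f} million short tons")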

  12. Magnetic nanoparticle temperature estimation

    SciTech Connect (OSTI)

    Weaver, John B.; Rauwerdink, Adam M.; Hansen, Eric W.

    2009-05-15

    The authors present a method of measuring the temperature of magnetic nanoparticles that can be adapted to provide in vivo temperature maps. Many of the minimally invasive therapies that promise to reduce health care costs and improve patient outcomes heat tissue to very specific temperatures to be effective. Measurements are required because physiological cooling, primarily blood flow, makes the temperature difficult to predict a priori. The ratio of the fifth and third harmonics of the magnetization generated by magnetic nanoparticles in a sinusoidal field is used to generate a calibration curve and to subsequently estimate the temperature. The calibration curve is obtained by varying the amplitude of the sinusoidal field. The temperature can then be estimated from any subsequent measurement of the ratio. The accuracy was 0.3 deg. K between 20 and 50 deg. C using the current apparatus and half-second measurements. The method is independent of nanoparticle concentration and nanoparticle size distribution.
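
    The calibration-and-lookup step implied above can be sketched as a simple interpolation of temperature against the measured 5th/3rd harmonic ratio; the calibration points are invented and assumed monotonic in temperature, which the interpolation requires:

      import numpy as np

      cal_temperature_K = np.array([293.0, 303.0, 313.0, 323.0])
      cal_ratio = np.array([0.210, 0.195, 0.182, 0.170])   # hypothetical 5th/3rd harmonic ratios

      def estimate_temperature(measured_ratio):
          # np.interp needs ascending x, so interpolate on the reversed (ascending-ratio) arrays
          return np.interp(measured_ratio, cal_ratio[::-1], cal_temperature_K[::-1])

      print(estimate_temperature(0.188))   # roughly 308 K for these made-up calibration points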

  13. State Energy Production Estimates

    Gasoline and Diesel Fuel Update (EIA)

    Production Estimates 1960 Through 2014 2014 Summary Tables U.S. Energy Information Administration | State Energy Data 2014: Production 1 Table P1. Energy Production Estimates in Physical Units, 2014 Alabama 16,377 181,054 9,828 0 Alaska 1,502 345,331 181,175 0 Arizona 8,051 106 56 1,044 Arkansas 94 1,123,678 6,845 0 California 0 252,718 204,269 4,462 Colorado 24,007 1,631,391 95,192 3,133 Connecticut 0 0 0 0 Delaware 0 0 0 0 District of Columbia 0 0 0 0 Florida 0 369 2,227 0 Georgia 0 0 0 2,517

  14. Use of Cost Estimating Relationships

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    1997-03-28

    Cost Estimating Relationships (CERs) are an important tool in an estimator's kit, and in many cases, they are the only tool. Thus, it is important to understand their limitations and characteristics. This chapter discusses considerations of which the estimator must be aware so the Cost Estimating Relationships can be properly used.

  15. A simple method to estimate interwell autocorrelation

    SciTech Connect (OSTI)

    Pizarro, J.O.S.; Lake, L.W.

    1997-08-01

    The estimation of autocorrelation in the lateral or interwell direction is important when performing reservoir characterization studies using stochastic modeling. This paper presents a new method to estimate the interwell autocorrelation based on parameters, such as the vertical range and the variance, that can be estimated with commonly available data. We used synthetic fields that were generated from stochastic simulations to provide data to construct the estimation charts. These charts relate the ratio of areal to vertical variance and the autocorrelation range (expressed variously) in two directions. Three different semivariogram models were considered: spherical, exponential and truncated fractal. The overall procedure is demonstrated using field data. We find that the approach gives the most self-consistent results when it is applied to previously identified facies. Moreover, the autocorrelation trends follow the depositional pattern of the reservoir, which gives confidence in the validity of the approach.

  16. Automated Estimating System

    Energy Science and Technology Software Center (OSTI)

    1996-04-15

    AES6.1 is a PC software package developed to aid in the preparation and reporting of cost estimates. AES6.1 provides an easy means for entering and updating the detailed cost, schedule information, project work breakdown structure, and escalation information contained in a typical project cost estimate through the use of menus and formatted input screens. AES6.1 combines this information to calculate both unescalated and escalated cost for a project which can be reported at varying levels of detail. Following are the major modifications to AES6.0f: Contingency update was modified to provide greater flexibility for user updates, Schedule Update was modified to provide user ability to schedule Bills of Material at the WBS/Participant/Cost Code level, Schedule Plot was modified to graphically show schedule by WBS/Participant/Cost Code, All Fiscal Year reporting has been modified to use the new schedule format, The Schedule 1-B-7, Cost Schedule, and the WBS/Participant reports were modified to determine Phase of Work from the B/M Cost Code, Utility program was modified to allow selection by cost code and update cost code in the Global Schedule update, Generic summary and line item download were added to the utility program, and an option was added to all reports which allows the user to indicate where overhead is to be reported (bottom line or in body of report)

  17. Battery Calendar Life Estimator Manual Modeling and Simulation

    SciTech Connect (OSTI)

    Jon P. Christophersen; Ira Bloom; Ed Thomas; Vince Battaglia

    2012-10-01

    The Battery Life Estimator (BLE) Manual has been prepared to assist developers in their efforts to estimate the calendar life of advanced batteries for automotive applications. Testing requirements and procedures are defined by the various manuals previously published under the United States Advanced Battery Consortium (USABC). The purpose of this manual is to describe and standardize a method for estimating calendar life based on statistical models and degradation data acquired from typical USABC battery testing.

  18. Battery Life Estimator Manual Linear Modeling and Simulation

    SciTech Connect (OSTI)

    Jon P. Christophersen; Ira Bloom; Ed Thomas; Vince Battaglia

    2009-08-01

    The Battery Life Estimator (BLE) Manual has been prepared to assist developers in their efforts to estimate the calendar life of advanced batteries for automotive applications. Testing requirements and procedures are defined by the various manuals previously published under the United States Advanced Battery Consortium (USABC). The purpose of this manual is to describe and standardize a method for estimating calendar life based on statistical models and degradation data acquired from typical USABC battery testing.

  19. Robust and intelligent bearing estimation

    DOE Patents [OSTI]

    Claassen, John P.

    2000-01-01

    A method of bearing estimation comprising quadrature digital filtering of event observations, constructing a plurality of observation matrices each centered on a time-frequency interval, determining for each observation matrix a parameter such as degree of polarization, linearity of particle motion, degree of dyadicy, or signal-to-noise ratio, choosing observation matrices most likely to produce a set of best available bearing estimates, and estimating a bearing for each observation matrix of the chosen set.

  20. Supercooled liquid water Estimation Tool

    Energy Science and Technology Software Center (OSTI)

    2012-05-04

    The Cloud Supercooled liquid water Estimation Tool (SEET) is a user-driven Graphical User Interface (GUI) that estimates cloud supercooled liquid water (SLW) content in terms of vertical column and total mass from Moderate Resolution Imaging Spectroradiometer (MODIS) spatially derived cloud products and realistic vertical cloud parameterizations that are user defined. It also contains functions for post-processing of the resulting data in tabular and graphical form.

  1. Estimation of benchmark dose as the threshold levels of urinary cadmium, based on excretion of total protein, β2-microglobulin, and N-acetyl-β-D-glucosaminidase in cadmium nonpolluted regions in Japan

    SciTech Connect (OSTI)

    Kobayashi, Etsuko . E-mail: ekoba@faculty.chiba-u.jp; Suwazono, Yasushi; Uetani, Mirei; Inaba, Takeya; Oishi, Mitsuhiro; Kido, Teruhiko; Nishijo, Muneko; Nakagawa, Hideaki; Nogawa, Koji

    2006-07-15

    Previously, we investigated the association between urinary cadmium (Cd) concentration and indicators of renal dysfunction, including total protein, β2-microglobulin (β2-MG), and N-acetyl-β-D-glucosaminidase (NAG). In 2778 inhabitants ≥50 years of age (1114 men, 1664 women) in three different Cd nonpolluted areas in Japan, we showed that a dose-response relationship existed between renal effects and Cd exposure in the general environment without any known Cd pollution. However, we could not estimate the threshold levels of urinary Cd at that time. In the present study, we estimated the threshold levels of urinary Cd as the benchmark dose low (BMDL) using the benchmark dose (BMD) approach. Urinary Cd excretion was divided into 10 categories, and an abnormality rate was calculated for each. Cut-off values for urinary substances were defined as corresponding to the 84% and 95% upper limit values of the target population who have not smoked. Then we calculated the BMD and BMDL using a log-logistic model. The values of BMD and BMDL for all urinary substances could be calculated. The BMDL for the 84% cut-off value of β2-MG, setting an abnormal value at 5%, was 2.4 µg/g creatinine (cr) in men and 3.3 µg/g cr in women. In conclusion, the present study demonstrated that the threshold level of urinary Cd could be estimated in people living in the general environment without any known Cd-pollution in Japan, and the value was inferred to be almost the same as that in Belgium, Sweden, and China.
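
    With a log-logistic model for the abnormality rate, P(d) = p0 + (1 - p0) / (1 + exp(-(a + b*ln d))), a benchmark response of 5% extra risk gives a closed-form benchmark dose; the fitted a and b below are hypothetical, and the BMDL would come from the lower confidence limit of the fit, which is not shown:

      import math

      def benchmark_dose(a, b, bmr=0.05):
          # Solve (P(d) - p0) / (1 - p0) = bmr, i.e. a + b*ln(BMD) = ln(bmr / (1 - bmr))
          return math.exp((math.log(bmr / (1.0 - bmr)) - a) / b)

      print(round(benchmark_dose(a=-4.2, b=1.8), 1), "ug/g creatinine")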

  2. The ARM Best Estimate 2-dimensional Gridded Surface

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Xie,Shaocheng; Qi, Tang

    The ARM Best Estimate 2-dimensional Gridded Surface (ARMBE2DGRID) data set merges together key surface measurements at the Southern Great Plains (SGP) sites and interpolates the data to a regular 2D grid to facilitate data application. Data from the original site locations can be found in the ARM Best Estimate Station-based Surface (ARMBESTNS) data set.

  3. Concurrent signal combining and channel estimation in digital communications

    DOE Patents [OSTI]

    Ormesher, Richard C.; Mason, John J.

    2011-08-30

    In the reception of digital information transmitted on a communication channel, a characteristic exhibited by the communication channel during transmission of the digital information is estimated based on a communication signal that represents the digital information and has been received via the communication channel. Concurrently with the estimating, the communication signal is used to decide what digital information was transmitted.

  4. Sensitivity of health risk estimates to air quality adjustment procedure

    SciTech Connect (OSTI)

    Whitfield, R.G.

    1997-06-30

    This letter is a summary of risk results associated with exposure estimates using two-parameter Weibull and quadratic air quality adjustment procedures (AQAPs). New exposure estimates were developed for children and child-occurrences, six urban areas, and five alternative air quality scenarios. In all cases, the Weibull and quadratic results are compared to previous results, which are based on a proportional AQAP.

  5. The ARM Best Estimate 2-dimensional Gridded Surface

    SciTech Connect (OSTI)

    Xie,Shaocheng; Qi, Tang

    2015-06-15

    The ARM Best Estimate 2-dimensional Gridded Surface (ARMBE2DGRID) data set merges together key surface measurements at the Southern Great Plains (SGP) sites and interpolates the data to a regular 2D grid to facilitate data application. Data from the original site locations can be found in the ARM Best Estimate Station-based Surface (ARMBESTNS) data set.

  6. Generalized REGression Package for Nonlinear Parameter Estimation

    Energy Science and Technology Software Center (OSTI)

    1995-05-15

    GREG computes modal (maximum-posterior-density) and interval estimates of the parameters in a user-provided Fortran subroutine MODEL, using a user-provided vector OBS of single-response observations or matrix OBS of multiresponse observations. GREG can also select the optimal next experiment from a menu of simulated candidates, so as to minimize the volume of the parametric inference region based on the resulting augmented data set.

  7. Communications circuit including a linear quadratic estimator

    DOE Patents [OSTI]

    Ferguson, Dennis D.

    2015-07-07

    A circuit includes a linear quadratic estimator (LQE) configured to receive a plurality of measurements of a signal. The LQE is configured to weight the measurements based on their respective uncertainties to produce weighted averages. The circuit further includes a controller coupled to the LQE and configured to selectively adjust at least one data link parameter associated with a communication channel in response to receiving the weighted averages.
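
    The weighting described in this claim is, in spirit, inverse-variance weighting. The sketch below shows that generic calculation only; it is not the patented circuit's implementation, and the measurement values and uncertainties are made up.

```python
# Generic inverse-variance weighting: each measurement is weighted by
# 1/sigma^2, so less certain measurements contribute less to the average.
# Illustrative only; not the patented circuit's implementation.
from typing import Sequence, Tuple

def weighted_average(values: Sequence[float], sigmas: Sequence[float]) -> Tuple[float, float]:
    """Return the inverse-variance weighted mean and its standard deviation."""
    weights = [1.0 / (s * s) for s in sigmas]
    total = sum(weights)
    mean = sum(w * v for w, v in zip(weights, values)) / total
    return mean, (1.0 / total) ** 0.5

# Three hypothetical measurements of the same quantity with different uncertainties.
print(weighted_average([10.2, 9.8, 10.5], [0.5, 0.2, 1.0]))
```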

  8. Low-Temperature Hydrothermal Resource Potential Estimate

    DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

    Katherine Young

    2016-06-30

    Compilation of data (spreadsheet and shapefiles) for several low-temperature resource types, including isolated springs and wells, delineated area convection systems, sedimentary basins and coastal plains sedimentary systems. For each system, we include estimates of the accessible resource base, mean extractable resource and beneficial heat. Data compiled from USGS and other sources. The paper (submitted to GRC 2016) describing the methodology and analysis is also included.

  9. Derived Annual Estimates of Manufacturing Energy Consumption...

    U.S. Energy Information Administration (EIA) Indexed Site

    Derived Annual Estimates of Manufacturing Energy Consumption, 1974-1988: Executive Summary (includes a figure showing the derived estimates). This...

  10. Examples of Cost Estimation Packages

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    1997-03-28

    Estimates can be performed in a variety of ways. Some apply to projects with an undefined scope, to conventional construction projects, or to work performed as a level of effort. Examples of cost estimation packages for these types of projects are described in this appendix.

  11. Emissions Tool Estimates the Impact of Emissions on Smart Grid...

    Energy Savers [EERE]

    The free, web-based calculator aims to estimate the impact of NOx, SO2 and CO2 emissions on smart grid infrastructure investments, taking into account specific context and project ...

  12. Guidelines for Estimating Unmetered Landscaping Water Use

    SciTech Connect (OSTI)

    McMordie Stoughton, Kate

    2010-07-28

    The document lays out step-by-step instructions to estimate landscaping water use with two alternative approaches: the evapotranspiration method and the irrigation audit method. The evapotranspiration method calculates the amount of water needed to maintain a healthy turf or landscaped area for a given location, based on the amount of water transpired and evaporated from the plants. The evapotranspiration method offers a relatively easy one-stop-shop for Federal agencies to develop an initial estimate of annual landscape water use. The document presents annual irrigation factors for 36 cities across the U.S. that represent the gallons of irrigation required per square foot for distinct landscape types. By following the steps outlined in the document, the reader can choose a location that closely matches their location and landscape type to produce a rough estimate of annual irrigation needs without the need to research site-specific data (a sketch of this arithmetic follows). The second option presented in the document is the irrigation audit method, which is the physical measurement of water applied to landscaped areas through irrigation equipment. Steps to perform an irrigation audit are outlined in the document, which follow the Recommended Audit Guidelines produced by the Irrigation Association.[5] An irrigation audit requires some knowledge of the specific procedures to accurately estimate how much water is being consumed by the irrigation equipment.
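
    The evapotranspiration-method arithmetic above reduces to multiplying an irrigation factor by the landscaped area. A minimal sketch follows; the factor and area are hypothetical, and real estimates should use the city- and landscape-specific factors tabulated in the document.

```python
# Evapotranspiration-method arithmetic: annual irrigation (gallons) is
# approximately the irrigation factor (gal per sq ft per year) times the
# landscaped area (sq ft). Factor and area below are hypothetical.

def annual_irrigation_gallons(irrigation_factor_gal_per_sqft: float,
                              landscaped_area_sqft: float) -> float:
    """Rough annual landscape irrigation estimate."""
    return irrigation_factor_gal_per_sqft * landscaped_area_sqft

# Example: an assumed factor of 25 gal/sqft-yr applied to 40,000 sq ft of turf.
print(f"{annual_irrigation_gallons(25.0, 40_000):,.0f} gallons per year")
```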

  13. Module: Estimating Historical Emissions from Deforestation |...

    Open Energy Info (EERE)

    Website: www.leafasia.org/tools/technical-guidance-series-estimating-historical Cost: Free Language: English Module: Estimating Historical Emissions from Deforestation ...

  14. Estimating pixel variances in the scenes of staring sensors

    DOE Patents [OSTI]

    Simonson, Katherine M. (Cedar Crest, NM); Ma, Tian J. (Albuquerque, NM)

    2012-01-24

    A technique for detecting changes in a scene perceived by a staring sensor is disclosed. The technique includes acquiring a reference image frame and a current image frame of a scene with the staring sensor. A raw difference frame is generated based upon differences between the reference image frame and the current image frame. Pixel error estimates are generated for each pixel in the raw difference frame based at least in part upon spatial error estimates related to spatial intensity gradients in the scene. The pixel error estimates are used to mitigate effects of camera jitter in the scene between the current image frame and the reference image frame.
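
    A rough illustration of the gradient-based idea in this abstract is to treat pixels with strong spatial intensity gradients as more sensitive to sub-pixel jitter and to normalize the raw difference frame by a per-pixel error estimate before thresholding. The sketch below is a generic stand-in built on that assumption, not the patented algorithm, and its jitter, noise, and threshold parameters are invented.

```python
# Generic stand-in for gradient-aware change detection: pixels with large
# spatial gradients can change a lot under small jitter, so the raw
# difference is compared against a per-pixel error estimate that grows
# with the local gradient. Not the patented algorithm; parameters invented.
import numpy as np

def significant_changes(reference, current, jitter_px=0.5, noise_sigma=2.0, threshold=4.0):
    raw_diff = current.astype(float) - reference.astype(float)
    gy, gx = np.gradient(reference.astype(float))
    grad_mag = np.hypot(gx, gy)
    # Per-pixel error: sensor noise plus jitter acting through the local gradient.
    pixel_error = np.sqrt(noise_sigma ** 2 + (jitter_px * grad_mag) ** 2)
    return np.abs(raw_diff) > threshold * pixel_error

rng = np.random.default_rng(1)
ref = rng.normal(100.0, 2.0, (64, 64))
cur = ref + rng.normal(0.0, 2.0, (64, 64))
cur[30:34, 30:34] += 50.0          # injected change to be detected
print(significant_changes(ref, cur).sum(), "pixels flagged")
```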

  15. Estimate Radiological Dose for Animals

    Energy Science and Technology Software Center (OSTI)

    1997-12-18

    Estimates radiological dose for animals in an ecological environment using open-literature values for parameters such as body weight, plant and soil ingestion rates, radiological half-life, absorbed energy, biological half-life, gamma energy per decay, soil-to-plant transfer factor, etc.

  16. Estimates of Green potentials. Applications

    SciTech Connect (OSTI)

    Danchenko, V I

    2003-02-28

    Optimal Cartan-type covers by hyperbolic discs of carriers of Green {alpha}-potentials are obtained in a simply connected domain in the complex plane and estimates of the potentials outside the carriers are presented. These results are applied to problems on the separation of singularities of analytic and harmonic functions. For instance, uniform and integral estimates in terms of Green capacities of components of meromorphic functions are obtained.

  17. Development of surface mine cost estimating equations

    SciTech Connect (OSTI)

    Not Available

    1980-09-26

    Cost estimating equations were developed to determine capital and operating costs for five surface coal mine models in Central Appalachia, Northern Appalachia, Mid-West, Far-West, and Campbell County, Wyoming. Engineering equations were used to estimate equipment costs for the stripping function and for the coal loading and hauling function for the base case mine and for several mines with different annual production levels and/or different overburden removal requirements. Deferred costs were then determined through application of the base case depreciation schedules, and direct labor costs were easily established once the equipment quantities (and, hence, manpower requirements) were determined. The data points were then fit with appropriate functional forms, and these were then multiplied by appropriate adjustment factors so that the resulting equations yielded the model mine costs for initial and deferred capital and annual operating cost. (The validity of this scaling process is based on the assumption that total initial and deferred capital costs are proportional to the initial and deferred costs for the primary equipment types that were considered and that annual operating cost is proportional to the direct labor costs that were determined based on primary equipment quantities.) Initial capital costs ranged from $3,910,470 in Central Appalachia to $49,296,785; deferred capital costs ranged from $3,220,000 in Central Appalachia to $30,735,000 in Campbell County, Wyoming; and annual operating costs ranged from $2,924,148 in Central Appalachia to $32,708,591 in Campbell County, Wyoming. (DMC)

  18. o CNMS Status Update

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Town Hall User Meeting (see attached slides) * Agenda: o Intro of UEC o CNMS Status Update o UEC activities o Nominations for open seat on the UEC * Activities o Monthly Telecon o...

  19. Estimates of Savings Achievable from Irrigation Controller

    SciTech Connect (OSTI)

    Williams, Alison; Fuchs, Heidi; Whitehead, Camilla Dunham

    2014-03-28

    This paper performs a literature review and meta-analysis of water savings from several types of advanced irrigation controllers: rain sensors (RS), weather-based irrigation controllers (WBIC), and soil moisture sensors (SMS). The purpose of this work is to derive average water savings per controller type, based to the extent possible on all available data. After a preliminary data scrubbing, we utilized a series of analytical filters to develop our best estimate of average savings. We applied filters to remove data that might bias the sample, such as data self-reported by manufacturers, data resulting from studies focusing on high-water users, or data presented in a non-comparable format (e.g., based on total household water use instead of outdoor water use). Because the resulting number of studies was too small to be statistically significant when broken down by controller type, this paper represents a survey and synthesis of available data rather than a definitive statement regarding whether the estimated water savings are representative.

  20. Quick estimating for thermal conductivity

    SciTech Connect (OSTI)

    Sastri, S.R.S.; Rao, K.K. )

    1993-08-01

    Accurate values for thermal conductivity--an important engineering property used in heat transfer calculations of liquids--are not as readily available as those for other physical properties. Therefore, it often becomes necessary to use estimated data. A new estimating method combines ease of use with an accuracy that is generally better than existing procedures. The paper discusses how to select terms and test correlations, then gives two examples of the use of the method for calculation of the thermal conductivity of propionic acid and chlorobenzene.

  1. Kalman filter data assimilation: Targeting observations and parameter estimation

    SciTech Connect (OSTI)

    Bellsky, Thomas Kostelich, Eric J.; Mahalov, Alex

    2014-06-15

    This paper studies the effect of targeted observations on state and parameter estimates determined with Kalman filter data assimilation (DA) techniques. We first provide an analytical result demonstrating that targeting observations within the Kalman filter for a linear model can significantly reduce state estimation error as opposed to fixed or randomly located observations. We next conduct observing system simulation experiments for a chaotic model of meteorological interest, where we demonstrate that the local ensemble transform Kalman filter (LETKF) with targeted observations based on largest ensemble variance is skillful in providing more accurate state estimates than the LETKF with randomly located observations. Additionally, we find that a hybrid ensemble Kalman filter parameter estimation method accurately updates model parameters within the targeted observation context to further improve state estimation.
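
    A toy version of the targeting idea is to place the observation where the prior ensemble variance is largest and then apply an ensemble Kalman update for that single observation. The sketch below uses a simple perturbed-observation update on a random state vector; the paper itself uses the LETKF with a chaotic forecast model, so this only illustrates the targeting criterion.

```python
# Toy illustration: target the state component with the largest ensemble
# variance, then apply a perturbed-observation ensemble Kalman update for
# that single observation. The paper uses the LETKF on a chaotic model;
# this only demonstrates the variance-based targeting criterion.
import numpy as np

rng = np.random.default_rng(42)
n_state, n_members = 40, 20
truth = rng.normal(0.0, 1.0, n_state)
ensemble = truth + rng.normal(0.0, 1.0, (n_members, n_state))   # prior ensemble
obs_error_sd = 0.3

# 1. Targeting: observe where the prior ensemble spread is largest.
target = int(np.argmax(ensemble.var(axis=0, ddof=1)))

# 2. Perturbed-observation EnKF update for that scalar observation.
y = truth[target] + rng.normal(0.0, obs_error_sd)
anomalies = ensemble - ensemble.mean(axis=0)
hx = ensemble[:, target].copy()
cov_xy = anomalies.T @ (hx - hx.mean()) / (n_members - 1)        # cov(state, observed value)
var_y = hx.var(ddof=1) + obs_error_sd ** 2
gain = cov_xy / var_y                                            # Kalman gain vector
perturbed_obs = y + rng.normal(0.0, obs_error_sd, n_members)
ensemble += np.outer(perturbed_obs - hx, gain)

print("targeted index:", target)
print("analysis RMSE:", np.sqrt(np.mean((ensemble.mean(axis=0) - truth) ** 2)))
```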

  2. Mandatory Photovoltaic System Cost Estimate

    Broader source: Energy.gov [DOE]

    If the customer has a ratio of estimated monthly kilowatt-hour (kWh) usage to line extension mileage that is less than or equal to 1,000, the utility must provide the comparison at no cost. If the...

  3. BLE: Battery Life Estimator | Argonne National Laboratory

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Argonne's Battery Life Estimator (BLE) software is a state-of-the-art tool kit for fitting battery aging data and for ...

  4. RADIATION DOSE ESTIMATES TO ADULTS AND CHILDREN FROM VARIOUS

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    RADIATION DOSE ESTIMATES TO ADULTS AND CHILDREN FROM VARIOUS RADIOPHARMACEUTICALS Latest Revision Date: 4/30/96 Radiation Internal Dose Information Center Oak Ridge Institute for Science and Education P.O. Box 117 Mail Stop 51 Oak Ridge, TN 37831 The radiopharmaceuticals and nuclear medicine studies considered in this report are listed in the Table of Contents on page 2. The radiation dose estimates given in the dose tables are based on the assumptions and models given in Chapter 17, Radiation

  5. Estimating propagation velocity through a surface acoustic wave sensor

    DOE Patents [OSTI]

    Xu, Wenyuan; Huizinga, John S.

    2010-03-16

    Techniques are described for estimating the propagation velocity through a surface acoustic wave sensor. In particular, techniques which measure and exploit a proper segment of phase frequency response of the surface acoustic wave sensor are described for use as a basis of bacterial detection by the sensor. As described, use of velocity estimation based on a proper segment of phase frequency response has advantages over conventional techniques that use phase shift as the basis for detection.

  6. DC Microgrids Scoping Study: Estimate of Technical and Economic Benefits

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    DC Microgrids Scoping Study: Estimate of Technical and Economic Benefits (March 2015). Microgrid demonstrations and deployments have shown the ability of microgrids to provide higher reliability and higher power quality than utility power systems and improved energy utilization. The vast majority of these microgrids are based on AC power, but some manufacturers, power system

  7. Activity Based Costing

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    1997-03-28

    Activity Based Costing (ABC) is a method for developing cost estimates in which the project is subdivided into discrete, quantifiable activities or work units. This chapter outlines the Activity Based Costing method and discusses applicable uses of ABC.

  8. Estimating Waste Inventory and Waste Tank Characterization |...

    Office of Environmental Management (EM)

    Estimating Waste Inventory and Waste Tank Characterization: Summary Notes from 28 May 2008 Generic Technical Issue ...

  9. Estimates and Recommendations for Coincidence Geometry (Technical...

    Office of Scientific and Technical Information (OSTI)

    Citation details for the technical report "Estimates and Recommendations for Coincidence Geometry."

  10. Background estimation in experimental spectra

    SciTech Connect (OSTI)

    Fischer, R.; Hanson, K. M.; Los Alamos National Laboratory, MS P940, Los Alamos, New Mexico 87545 ; Dose, V.; Linden, W. von der

    2000-02-01

    A general probabilistic technique for estimating background contributions to measured spectra is presented. A Bayesian model is used to capture the defining characteristics of the problem, namely, that the background is smoother than the signal. The signal is allowed to have positive and/or negative components. The background is represented in terms of a cubic spline basis. A variable degree of smoothness of the background is attained by allowing the number of knots and the knot positions to be adaptively chosen on the basis of the data. The fully Bayesian approach taken provides a natural way to handle knot adaptivity and allows uncertainties in the background to be estimated. Our technique is demonstrated on a particle induced x-ray emission spectrum from a geological sample and an Auger spectrum from iron, which contains signals with both positive and negative components. (c) 2000 The American Physical Society.

  11. Guidelines for Estimating Unmetered Landscaping Water Use

    SciTech Connect (OSTI)

    None

    2010-07-30

    Guidance to help Federal agencies estimate unmetered landscaping water use as required by Executive Order 13514

  12. Improving Estimation Accuracy of Aggregate Queries on Data Cubes

    SciTech Connect (OSTI)

    Pourabbas, Elaheh; Shoshani, Arie

    2008-08-15

    In this paper, we investigate the problem of estimation of a target database from summary databases derived from a base data cube. We show that such estimates can be derived by choosing a primary database which uses a proxy database to estimate the results. This technique is common in statistics, but an important issue we are addressing is the accuracy of these estimates. Specifically, given multiple primary and multiple proxy databases, that share the same summary measure, the problem is how to select the primary and proxy databases that will generate the most accurate target database estimation possible. We propose an algorithmic approach for determining the steps to select or compute the source databases from multiple summary databases, which makes use of the principles of information entropy. We show that the source databases with the largest number of cells in common provide the more accurate estimates. We prove that this is consistent with maximizing the entropy. We provide some experimental results on the accuracy of the target database estimation in order to verify our results.

  13. Estimation of economic parameters of U.S. hydropower resources

    SciTech Connect (OSTI)

    Hall, Douglas G.; Hunt, Richard T.; Reeves, Kelly S.; Carroll, Greg R.

    2003-06-01

    Tools for estimating the cost of developing and operating and maintaining hydropower resources in the form of regression curves were developed based on historical plant data. Development costs that were addressed included: licensing, construction, and five types of environmental mitigation. It was found that the data for each type of cost correlated well with plant capacity. A tool for estimating the annual and monthly electric generation of hydropower resources was also developed. Additional tools were developed to estimate the cost of upgrading a turbine or a generator. The development and operation and maintenance cost estimating tools, and the generation estimating tool were applied to 2,155 U.S. hydropower sites representing a total potential capacity of 43,036 MW. The sites included totally undeveloped sites, dams without a hydroelectric plant, and hydroelectric plants that could be expanded to achieve greater capacity. Site characteristics and estimated costs and generation for each site were assembled in a database in Excel format that is also included within the EERE Library under the title, “Estimation of Economic Parameters of U.S. Hydropower Resources - INL Hydropower Resource Economics Database.”
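
    Cost tools of this kind are often regression curves of cost against plant capacity. A minimal sketch of fitting a power-law curve, cost = a * capacity^b, by linear regression in log space is shown below with invented data points; the report's actual curves, cost categories, and predictors may differ.

```python
# Power-law cost curve, cost = a * capacity^b, fitted by linear regression
# in log-log space. Data points are invented for illustration only.
import numpy as np

capacity_mw = np.array([5.0, 12.0, 30.0, 75.0, 150.0, 400.0, 900.0])
cost_musd   = np.array([14.0, 28.0, 60.0, 120.0, 210.0, 480.0, 950.0])   # hypothetical

b, log_a = np.polyfit(np.log(capacity_mw), np.log(cost_musd), 1)
a = np.exp(log_a)
print(f"cost ~= {a:.1f} * capacity^{b:.2f}  (million $ vs MW)")
print(f"predicted cost for a 60 MW site: {a * 60.0 ** b:.0f} million $")
```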

  14. Temperature estimates from zircaloy oxidation kinetics and microstructures. [PWR

    SciTech Connect (OSTI)

    Olsen, C.S.

    1982-10-01

    This report reviews state-of-the-art capability to determine peak zircaloy fuel rod cladding temperatures following an abnormal temperature excursion in a nuclear reactor, based on postirradiation metallographic analysis of zircaloy microstructural and oxidation characteristics. Results of a comprehensive literature search are presented to evaluate the suitability of available zircaloy microstructural and oxidation data for estimating anticipated reactor fuel rod cladding temperatures. Additional oxidation experiments were conducted to evaluate low-temperature zircaloy oxidation characteristics for postirradiation estimation of cladding temperature by metallographic examination. Results of these experiments were used to calculate peak cladding temperatures of electrical heater rods and nuclear fuel rods that had been subjected to reactor temperature transients. Comparison of the calculated and measured peak cladding temperatures for these rods indicates that oxidation kinetics is a viable technique for estimating peak cladding temperatures over a broad temperature range. However, further improvement in zircaloy microstructure technology is necessary for precise estimation of peak cladding temperatures by microstructural examination.

  15. A Physically-Based Estimate of Radiative Forcing by Anthropogenic...

    Office of Scientific and Technical Information (OSTI)

    Authors: Ghan, Steven J.; Easter, Richard C.; Chapman, Elaine G.; Abdul-Razzak, Hayder; Zhang, Yang; Leung, Ruby; Laulainen, Nels S.; Saylor, ...

  16. Estimation of net primary productivity using a process-based...

    Office of Scientific and Technical Information (OSTI)

    Net primary productivity (NPP) modeling can help to improve the understanding of the ecosystem, and therefore, improve ecological efficiency. The boreal ecosystem productivity ...

  17. Fast, moment-based estimation methods for delay network tomography...

    Office of Scientific and Technical Information (OSTI)

    Authors: Lawrence, Earl Christophre; Michailidis, George; Nair, Vijayan N. Affiliations: Los Alamos National Laboratory; University of Michigan. Publication Date: ...

  18. Fast, moment-based estimation methods for delay network tomography...

    Office of Scientific and Technical Information (OSTI)

    Much of the previous literature deals with discrete delay distributions by discretizing ...

  19. Surface daytime net radiation estimation using artificial neural networks

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Jiang, Bo; Zhang, Yi; Liang, Shunlin; Zhang, Xiaotong; Xiao, Zhiqiang

    2014-11-11

    Net all-wave surface radiation (Rn) is one of the most important fundamental parameters in various applications. However, conventional Rn measurements are difficult to collect because of the high cost and ongoing maintenance of recording instruments. Therefore, various empirical Rn estimation models have been developed. This study presents the results of two artificial neural network (ANN) models (general regression neural networks (GRNN) and Neuroet) to estimate Rn globally from multi-source data, including remotely sensed products, surface measurements, and meteorological reanalysis products. Rn estimates provided by the two ANNs were tested against in-situ radiation measurements obtained from 251 global sites between 1991–2010 both in global mode (all data were used to fit the models) and in conditional mode (the data were divided into four subsets and the models were fitted separately). Based on the results obtained from extensive experiments, it has been proved that the two ANNs were superior to linear-based empirical models in both global and conditional modes and that the GRNN performed better and was more stable than Neuroet. The GRNN estimates had a determination coefficient (R2) of 0.92, a root mean square error (RMSE) of 34.27 W·m–2, and a bias of –0.61 W·m–2 in global mode based on the validation dataset. In conclusion, ANN methods are a potentially powerful tool for global Rn estimation.
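
    A general regression neural network is essentially Nadaraya-Watson kernel regression with a Gaussian kernel, so a minimal GRNN predictor can be written directly in NumPy, as sketched below. The training inputs and targets are synthetic; the study itself trains on MODIS products, surface measurements, and reanalysis fields to predict Rn, and a production model would also need bandwidth selection and input standardization.

```python
# Minimal GRNN (Gaussian-kernel regression) predictor. Synthetic data only;
# a real Rn model would use standardized geophysical predictors and a
# tuned bandwidth (sigma).
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.5):
    """Predict y at X_query as a Gaussian-kernel-weighted average of training targets."""
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return (w @ y_train) / w.sum(axis=1)

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, (200, 3))                    # stand-in predictors
y = X[:, 0] ** 2 + 0.5 * X[:, 1] + rng.normal(0.0, 0.05, 200)
X_query = rng.uniform(-1.0, 1.0, (5, 3))
print(grnn_predict(X, y, X_query))
```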

  20. Development on electromagnetic impedance function modeling and its estimation

    SciTech Connect (OSTI)

    Sutarno, D.

    2015-09-30

    Today, electromagnetic methods such as magnetotellurics (MT) and controlled-source audio MT (CSAMT) are used in a broad variety of applications. Their usefulness in poor seismic areas and their negligible environmental impact are integral parts of effective exploration at minimum cost. As exploration has been forced into more difficult areas, the importance of MT and CSAMT, in conjunction with other techniques, has tended to grow continuously. However, important and difficult problems obviously remain to be solved concerning our ability to collect, process, and interpret MT as well as CSAMT data in complex 3D structural environments. This talk aims to review and discuss recent developments in MT and CSAMT impedance function modeling, as well as some improvements in estimation procedures for the corresponding impedance functions. In MT impedance modeling, research efforts focus on developing numerical methods for computing the impedance functions of three-dimensional (3-D) earth resistivity models. For that reason, 3-D finite element numerical modeling of the impedances is developed based on the edge element method. In the CSAMT case, the efforts were focused on addressing the non-plane-wave problem in the corresponding impedance functions. Concerning estimation of MT and CSAMT impedance functions, research was focused on improving the quality of the estimates. To that end, a non-linear regression approach based on robust M-estimators and the Hilbert transform operating on the causal transfer functions was used to deal with outliers (abnormal data), which are frequently superimposed on the normal ambient MT and CSAMT noise fields. As validated, the proposed MT impedance modeling method gives acceptable results for standard three-dimensional resistivity models, while the full-solution modeling that accommodates the non-plane-wave effect for CSAMT impedances is applied for all measurement zones, including near-, transition

  1. Dynamic estimator for determining operating conditions in an internal combustion engine

    DOE Patents [OSTI]

    Hellstrom, Erik; Stefanopoulou, Anna; Jiang, Li; Larimore, Jacob

    2016-01-05

    Methods and systems are provided for estimating engine performance information for a combustion cycle of an internal combustion engine. Estimated performance information for a previous combustion cycle is retrieved from memory. The estimated performance information includes an estimated value of at least one engine performance variable. Actuator settings applied to engine actuators are also received. The performance information for the current combustion cycle is then estimated based, at least in part, on the estimated performance information for the previous combustion cycle and the actuator settings applied during the previous combustion cycle. The estimated performance information for the current combustion cycle is then stored to the memory to be used in estimating performance information for a subsequent combustion cycle.

  2. 2007 Estimated International Energy Flows

    SciTech Connect (OSTI)

    Smith, C A; Belles, R D; Simon, A J

    2011-03-10

    An energy flow chart or 'atlas' for 136 countries has been constructed from data maintained by the International Energy Agency (IEA) and estimates of energy use patterns for the year 2007. Approximately 490 exajoules (460 quadrillion BTU) of primary energy are used in aggregate by these countries each year. While the basic structure of the energy system is consistent from country to country, patterns of resource use and consumption vary. Energy can be visualized as it flows from resources (i.e. coal, petroleum, natural gas) through transformations such as electricity generation to end uses (i.e. residential, commercial, industrial, transportation). These flow patterns are visualized in this atlas of 136 country-level energy flow charts.

  3. Bayesian estimation methods in metrology

    SciTech Connect (OSTI)

    Cox, M.G.; Forbes, A.B.; Harris, P.M.

    2004-11-16

    In metrology -- the science of measurement -- a measurement result must be accompanied by a statement of its associated uncertainty. The degree of validity of a measurement result is determined by the validity of the uncertainty statement. In recognition of the importance of uncertainty evaluation, the International Organization for Standardization in 1995 published the Guide to the Expression of Uncertainty in Measurement, and the Guide has been widely adopted. The validity of uncertainty statements is tested in interlaboratory comparisons in which an artefact is measured by a number of laboratories and their measurement results compared. Since the introduction of the Mutual Recognition Arrangement, key comparisons are being undertaken to determine the degree of equivalence of laboratories for particular measurement tasks. In this paper, we discuss the possible development of the Guide to reflect Bayesian approaches and the evaluation of key comparison data using Bayesian estimation methods.

  4. State Energy Price and Expenditure Estimates

    Reports and Publications (EIA)

    2016-01-01

    Energy price and expenditure estimates in dollars per million Btu and in million dollars, by state, 1970-2014.

  5. Cost Model and Cost Estimating Software

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    1997-03-28

    This chapter discusses a formalized methodology, basically a cost model, which forms the basis for cost estimating software.

  6. Distributed Dynamic State Estimator, Generator Parameter Estimation and Stability Monitoring Demonstration

    SciTech Connect (OSTI)

    Meliopoulos, Sakis; Cokkinides, George; Fardanesh, Bruce; Hedrington, Clinton

    2013-12-31

    This is the final report for this project, which was performed in the period October 1, 2009 to June 30, 2013. In this project, a fully distributed high-fidelity dynamic state estimator (DSE) that continuously tracks the real-time dynamic model of a wide-area system with update rates better than 60 times per second was achieved. The proposed technology is based on GPS-synchronized measurements but also utilizes data from all available Intelligent Electronic Devices in the system (numerical relays, digital fault recorders, digital meters, etc.). The distributed state estimator provides the real-time model of the system, not only the voltage phasors. The proposed system provides the infrastructure for a variety of applications, including two very important ones: (a) high-fidelity estimation of generating unit parameters and (b) energy-function-based transient stability monitoring of a wide-area electric power system with predictive capability. The dynamic distributed state estimation results are also stored (the storage scheme includes data and the coincident model), enabling automatic reconstruction and “play back” of a system-wide disturbance. This approach enables complete play-back capability with fidelity equal to that of real time, with the advantage of “playing back” at a user-selected speed. The proposed technologies were developed and tested in the lab during the first 18 months of the project and then demonstrated on two actual systems, the USVI Water and Power Administration system and the New York Power Authority’s Blenheim-Gilboa pumped hydro plant, in the last 18 months of the project. The four main thrusts of this project, mentioned above, are extremely important to the industry. The DSE with the achieved update rates (more than 60 times per second) provides a superior solution to the “grid visibility” question. The generator parameter identification method fills an important and practical need of the industry. The “energy function” based

  7. New Methodology for Natural Gas Production Estimates

    Reports and Publications (EIA)

    2010-01-01

    A new methodology is implemented with the monthly natural gas production estimates from the EIA-914 survey this month. The estimates, to be released April 29, 2010, include revisions for all of 2009. The fundamental changes in the new process include the timeliness of the historical data used for estimation and the frequency of sample updates, both of which are improved.

  8. The Federal Highway Administration Gasohol Consumption Estimation Model

    SciTech Connect (OSTI)

    Hwang, HL

    2003-08-28

    The Federal Highway Administration (FHWA) is responsible for estimating the portion of Federal highway funds attributable to each State. The process involves use of State-reported data (gallons) and a set of estimation models when accurate State data is unavailable. To ensure that the distribution of funds is equitable, FHWA periodically reviews the estimation models. Estimation of the use of gasohol is difficult because of State differences in the definition of gasohol, inability of many States to separate and report gasohol usage from other fuel types, changes in fuel composition in nonattainment areas to address concerns over the use of certain fuel additives, and the lack of a valid State-level surrogate data set for gasohol use. Under the sponsorship of FHWA, Oak Ridge National Laboratory (ORNL) reviewed the regression-based gasohol estimation model that has been in use for several years. Based on an analytical assessment of that model and an extensive review of potential data sets, ORNL developed an improved rule-based model. The new model uses data from Internal Revenue Service, Energy Information Administration, Environmental Protection Agency, Department of Energy, ORNL, and FHWA sources. The model basically consists of three parts: (1) development of a controlled total of national gasohol usage, (2) determination of reliable State gasohol consumption data, and (3) estimation of gasohol usage for all other States. The new model will be employed for the 2004 attribution process. FHWA is currently soliciting comments and inputs from interested parties. Relevant data, as identified, will be pursued and refinements will be made by the research team if warranted.
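
    The three-part structure described above (a national control total, accepted State reports where reliable, and allocation of the remainder) can be sketched as a simple proportional allocation. The snippet below is an illustration under that assumption, with made-up totals and a made-up surrogate measure; the actual FHWA model uses specific IRS, EIA, EPA, DOE, ORNL, and FHWA data sources and more detailed rules.

```python
# Illustrative proportional allocation with three notional States:
# accept reliable reports, then spread the remaining national total over
# the other States in proportion to a surrogate measure.

def allocate_gasohol(national_total, reliable_reports, surrogate):
    """reliable_reports: {state: gallons}; surrogate: {state: allocation basis}."""
    remaining = national_total - sum(reliable_reports.values())
    basis_total = sum(surrogate.values())
    estimates = dict(reliable_reports)
    for state, basis in surrogate.items():
        estimates[state] = remaining * basis / basis_total
    return estimates

print(allocate_gasohol(
    national_total=1_000_000,
    reliable_reports={"IA": 400_000},
    surrogate={"NE": 2.0, "KS": 1.0},
))
```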

  9. $100 billion mistake: is the windfall revenue estimate too high

    SciTech Connect (OSTI)

    Samuelson, R.J.

    1980-04-26

    An economic analysis of the Windfall Profits Tax (as proposed at the time) suggests that the estimate of a $227 billion revenue over the next decade may be as much as $100 billion too high. This judgment is based on provisions in the law allowing states to deduct severance taxes up to 15 percent on oil before federal taxes are paid and offering tax incentives for tertiary projects. The arithmetic, particularly in the case of enhanced oil recovery, illustrates how the incentives could shift more production from a 70% to a 30% tax rate than the Federal government had estimated. (DCK)

  10. Anisotropic parameter estimation using velocity variation with offset analysis

    SciTech Connect (OSTI)

    Herawati, I.; Saladin, M.; Pranowo, W.; Winardhie, S.; Priyono, A.

    2013-09-09

    Seismic anisotropy is defined as velocity that depends upon angle or offset. Knowledge about the anisotropy effect on seismic data is important in amplitude analysis, the stacking process, and time-to-depth conversion. Because of this anisotropic effect, a reflector cannot be flattened using a single velocity based on the hyperbolic moveout equation. Therefore, after normal moveout correction, there will still be residual moveout that relates to velocity information. This research aims to obtain the anisotropic parameters δ and ε using two proposed methods. The first method, called velocity variation with offset (VVO), is based on a simplification of the weak anisotropy equation. In the VVO method, the velocity at each offset is calculated and plotted to obtain the vertical velocity and the parameter δ. The second method is an inversion method using a linear approach in which the vertical velocity, δ, and ε are estimated simultaneously. Both methods are tested on synthetic models using ray-tracing forward modelling. Results show that the δ value can be estimated appropriately using both methods, while the inversion-based method gives a better estimate of ε. This study shows that estimation of anisotropic parameters relies on the accuracy of the normal moveout velocity, residual moveout, and offset-to-angle transformation.
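
    For readers unfamiliar with the parameters, the standard weak-anisotropy (Thomsen) relation Vnmo = Vp0 * sqrt(1 + 2*delta) links the moveout velocity, the vertical velocity, and delta, which is the kind of comparison underlying a VVO-style estimate. The sketch below uses only that textbook relation with hypothetical velocities; it is not claimed to reproduce the paper's exact equations.

```python
# Standard weak-anisotropy relation: Vnmo = Vp0 * sqrt(1 + 2*delta), so
# delta = (Vnmo^2 / Vp0^2 - 1) / 2. Velocities below are hypothetical.

def thomsen_delta(v_nmo: float, v_vertical: float) -> float:
    return 0.5 * ((v_nmo / v_vertical) ** 2 - 1.0)

# Example: Vnmo = 3050 m/s from moveout analysis, vertical Vp0 = 2950 m/s.
print(f"delta ~ {thomsen_delta(3050.0, 2950.0):.3f}")
```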

  11. Estimating Terrorist Risk with Possibility Theory

    SciTech Connect (OSTI)

    J.L. Darby

    2004-11-30

    This report summarizes techniques that use possibility theory to estimate the risk of terrorist acts. These techniques were developed under the sponsorship of the Department of Homeland Security (DHS) as part of the National Infrastructure Simulation Analysis Center (NISAC) project. The techniques have been used to estimate the risk of various terrorist scenarios to support NISAC analyses during 2004. The techniques are based on the Logic Evolved Decision (LED) methodology developed over the past few years by Terry Bott and Steve Eisenhawer at LANL. [LED] The LED methodology involves the use of fuzzy sets, possibility theory, and approximate reasoning. LED captures the uncertainty due to vagueness and imprecision that is inherent in the fidelity of the information available for terrorist acts; probability theory cannot capture these uncertainties. This report does not address the philosophy supporting the development of nonprobabilistic approaches, and it does not discuss possibility theory in detail. The references provide a detailed discussion of these subjects. [Shafer] [Klir and Yuan] [Dubois and Prade] Suffice to say that these approaches were developed to address types of uncertainty that cannot be addressed by a probability measure. An earlier report discussed in detail the problems with using a probability measure to evaluate terrorist risk. [Darby Methodology]. Two related techniques are discussed in this report: (1) a numerical technique, and (2) a linguistic technique. The numerical technique uses traditional possibility theory applied to crisp sets, while the linguistic technique applies possibility theory to fuzzy sets. Both of these techniques as applied to terrorist risk for NISAC applications are implemented in software called PossibleRisk. The techniques implemented in PossibleRisk were developed specifically for use in estimating terrorist risk for the NISAC program. The LEDTools code can be used to perform the same linguistic evaluation as

  12. Robust Bearing Estimation for 3-Component Stations

    SciTech Connect (OSTI)

    Claassen, John P.

    1999-06-03

    A robust bearing estimation process for 3-component stations has been developed and explored. The method, called SEEC for Search, Estimate, Evaluate and Correct, intelligently exploits the inherent information in the arrival at every step of the process to achieve near-optimal results. In particular, the approach uses a consistent framework to define the optimal time-frequency windows on which to make estimates, to make the bearing estimates themselves, to construct metrics helpful in choosing the better estimates or admitting that the bearing is immeasurable, and finally to apply bias corrections when calibration information is available to yield a single final estimate. The method was applied to a small but challenging set of events in a seismically active region. The method demonstrated remarkable utility by providing better estimates and insights than previously available. Various monitoring implications are noted from these findings.

  13. Robust bearing estimation for 3-component stations

    SciTech Connect (OSTI)

    CLAASSEN,JOHN P.

    2000-02-01

    A robust bearing estimation process for 3-component stations has been developed and explored. The method, called SEEC for Search, Estimate, Evaluate and Correct, intelligently exploits the inherent information in the arrival at every step of the process to achieve near-optimal results. In particular the approach uses a consistent framework to define the optimal time-frequency windows on which to make estimates, to make the bearing estimates themselves, to construct metrics helpful in choosing the better estimates or admitting that the bearing is immeasurable, and finally to apply bias corrections when calibration information is available to yield a single final estimate. The algorithm was applied to a small but challenging set of events in a seismically active region. It demonstrated remarkable utility by providing better estimates and insights than previously available. Various monitoring implications are noted from these findings.

  14. Impedance-estimation methods, modeling methods, articles of manufacture, impedance-modeling devices, and estimated-impedance monitoring systems

    DOE Patents [OSTI]

    Richardson, John G.

    2009-11-17

    An impedance estimation method includes measuring three or more impedances of an object having a periphery using three or more probes coupled to the periphery. The three or more impedance measurements are made at a first frequency. Three or more additional impedance measurements of the object are made using the three or more probes. The three or more additional impedance measurements are made at a second frequency different from the first frequency. An impedance of the object at a point within the periphery is estimated based on the impedance measurements and the additional impedance measurements.

  15. Reliability Estimates for Power Supplies

    SciTech Connect (OSTI)

    Lee C. Cadwallader; Peter I. Petersen

    2005-09-01

    Failure rates for large power supplies at a fusion facility are critical knowledge needed to estimate availability of the facility or to set priorities for repairs and spare components. A study of the "failure to operate on demand" and "failure to continue to operate" failure rates has been performed for the large power supplies at DIII-D, which provide power to the magnet coils, the neutral beam injectors, the electron cyclotron heating systems, and the fast wave systems. When one of the power supplies fails to operate, the research program has to be either temporarily changed or halted. If one of the power supplies for the toroidal or ohmic heating coils fails, the operations have to be suspended or the research is continued at de-rated parameters until a repair is completed. If one of the power supplies used in the auxiliary plasma heating systems fails, the research is often temporarily changed until a repair is completed. The power supplies are operated remotely and repairs are only performed when the power supplies are off line, so that failure of a power supply does not cause any risk to personnel. The DIII-D Trouble Report database was used to determine the number of power supply faults (over 1,700 reports), and tokamak annual operations data supplied the number of shots, operating times, and power supply usage for the DIII-D operating campaigns between mid-1987 and 2004. Where possible, these power supply failure rates from DIII-D will be compared to similar work that has been performed for the Joint European Torus equipment. These independent data sets support validation of the fusion-specific failure rate values.
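
    The two measures named above have simple point estimators: a per-demand failure probability and a per-operating-hour failure rate. The sketch below shows that arithmetic with invented counts; the DIII-D study additionally had to extract demands, operating hours, and fault counts from the Trouble Report database and operations records.

```python
# Point estimates for the two failure measures, with invented counts.

def failure_to_start_probability(failures_on_demand: int, demands: int) -> float:
    """'Failure to operate on demand' as a per-demand probability."""
    return failures_on_demand / demands

def failure_to_run_rate(failures_in_operation: int, operating_hours: float) -> float:
    """'Failure to continue to operate' as a rate per operating hour."""
    return failures_in_operation / operating_hours

print(failure_to_start_probability(12, 48_000))     # e.g. 2.5e-4 per demand
print(failure_to_run_rate(30, 15_000.0))            # e.g. 2.0e-3 per hour
```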

  16. Budget estimates. Fiscal year 1998

    SciTech Connect (OSTI)

    1997-02-01

    The U.S. Congress has determined that the safe use of nuclear materials for peaceful purposes is a legitimate and important national goal. It has entrusted the Nuclear Regulatory Commission (NRC) with the primary Federal responsibility for achieving that goal. The NRC's mission, therefore, is to regulate the Nation's civilian use of byproduct, source, and special nuclear materials to ensure adequate protection of public health and safety, to promote the common defense and security, and to protect the environment. The NRC's FY 1998 budget requests new budget authority of $481,300,000 to be funded by two appropriations - one is the NRC's Salaries and Expenses appropriation for $476,500,000, and the other is the NRC's Office of Inspector General appropriation for $4,800,000. Of the funds appropriated to the NRC's Salaries and Expenses, $17,000,000 shall be derived from the Nuclear Waste Fund and $2,000,000 shall be derived from general funds. The proposed FY 1998 appropriation legislation would also exempt the $2,000,000 for regulatory reviews and other assistance provided to the Department of Energy from the requirement that the NRC collect 100 percent of its budget from fees. The sums appropriated to the NRC's Salaries and Expenses and the NRC's Office of Inspector General shall be reduced by the amount of revenues received during FY 1998 from licensing fees, inspection services, and other services and collections, so as to result in a final FY 1998 appropriation for the NRC of an estimated $19,000,000 - the amount appropriated from the Nuclear Waste Fund and from general funds. Revenues derived from enforcement actions shall be deposited to miscellaneous receipts of the Treasury.

  17. Time and Resource Estimation Tool

    Energy Science and Technology Software Center (OSTI)

    2004-06-08

    RESTORE is a computer software tool that allows one to model a complex set of steps required to accomplish a goal (e.g., repair a ruptured natural gas pipeline and restore service to customers). The time necessary to complete each step may be uncertain and may be affected by conditions such as the weather, the time of day, or the day of the week. Therefore, "nature" can influence which steps are taken and the time needed to complete each step. In addition, the tool allows one to model the costs for each step, which also may be uncertain. RESTORE allows the user to estimate the time and cost, both of which may be uncertain, to achieve an intermediate stage of completion, as well as overall completion. The software also makes it possible to model parallel, competing groups of activities (i.e., parallel paths) so that progress at a 'merge point' can proceed before other competing activities are completed. For example, RESTORE permits one to model a workaround and a simultaneous complete repair to determine a probability distribution for the earliest time service can be restored to a critical customer. The tool identifies the 'most active path' through the network of tasks, which is extremely important information for assessing the most effective way to speed up or slow down progress. Unlike other project planning and risk analysis tools, RESTORE provides an intuitive, graphical, and object-oriented environment for structuring a model and setting its parameters.

  18. Method and system to estimate variables in an integrated gasification combined cycle (IGCC) plant

    DOE Patents [OSTI]

    Kumar, Aditya; Shi, Ruijie; Dokucu, Mustafa

    2013-09-17

    System and method to estimate variables in an integrated gasification combined cycle (IGCC) plant are provided. The system includes a sensor suite to measure respective plant input and output variables. An extended Kalman filter (EKF) receives sensed plant input variables and includes a dynamic model to generate a plurality of plant state estimates and a covariance matrix for the state estimates. A preemptive-constraining processor is configured to preemptively constrain the state estimates and covariance matrix to be free of constraint violations. A measurement-correction processor may be configured to correct constrained state estimates and a constrained covariance matrix based on processing of sensed plant output variables. The measurement-correction processor is coupled to update the dynamic model with corrected state estimates and a corrected covariance matrix. The updated dynamic model may be configured to estimate values for at least one plant variable not originally sensed by the sensor suite.
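
    A generic way to keep Kalman-filter state estimates physically meaningful is to follow each measurement update with a projection of the state onto its bounds. The sketch below shows that generic constrained-update pattern for a tiny linear example; it is not the patent's preemptive-constraining or measurement-correction processors, and the states, bounds, and noise values are invented.

```python
# Generic constrained Kalman update: standard linear measurement update,
# then clip the state to physical bounds. States, bounds, and noise values
# are invented; this is not the patented system.
import numpy as np

def kalman_update_with_bounds(x, P, z, H, R, lo, hi):
    """One linear measurement update, then project the estimate onto box constraints."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    x = np.clip(x, lo, hi)          # simple constraint enforcement by projection
    return x, P

x0 = np.array([0.9, 0.2])           # two notional plant states bounded to [0, 1]
P0 = np.diag([0.05, 0.05])
H = np.array([[1.0, 0.0]])          # only the first state is measured
z = np.array([1.15])                # measurement that would push the state out of bounds
R = np.array([[0.01]])
x1, P1 = kalman_update_with_bounds(x0, P0, z, H, R, lo=0.0, hi=1.0)
print("constrained estimate:", x1)
```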

  19. Atmospheric Inverse Estimates of Methane Emissions from Central California

    SciTech Connect (OSTI)

    Zhao, Chuanfeng; Andrews, Arlyn E.; Bianco, Laura; Eluszkiewicz, Janusz; Hirsch, Adam; MacDonald, Clinton; Nehrkorn, Thomas; Fischer, Marc L.

    2008-11-21

    Methane mixing ratios measured at a tall-tower are compared to model predictions to estimate surface emissions of CH{sub 4} in Central California for October-December 2007 using an inverse technique. Predicted CH{sub 4} mixing ratios are calculated based on spatially resolved a priori CH{sub 4} emissions and simulated atmospheric trajectories. The atmospheric trajectories, along with surface footprints, are computed using the Weather Research and Forecast (WRF) coupled to the Stochastic Time-Inverted Lagrangian Transport (STILT) model. An uncertainty analysis is performed to provide quantitative uncertainties in estimated CH{sub 4} emissions. Three inverse model estimates of CH{sub 4} emissions are reported. First, linear regressions of modeled and measured CH{sub 4} mixing ratios obtain slopes of 0.73 {+-} 0.11 and 1.09 {+-} 0.14 using California specific and Edgar 3.2 emission maps respectively, suggesting that actual CH{sub 4} emissions were about 37 {+-} 21% higher than California specific inventory estimates. Second, a Bayesian 'source' analysis suggests that livestock emissions are 63 {+-} 22% higher than the a priori estimates. Third, a Bayesian 'region' analysis is carried out for CH{sub 4} emissions from 13 sub-regions, which shows that inventory CH{sub 4} emissions from the Central Valley are underestimated and uncertainties in CH{sub 4} emissions are reduced for sub-regions near the tower site, yielding best estimates of flux from those regions consistent with 'source' analysis results. The uncertainty reductions for regions near the tower indicate that a regional network of measurements will be necessary to provide accurate estimates of surface CH{sub 4} emissions for multiple regions.
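
    The first inverse step described above can be pictured as a regression of measured enhancements on modeled enhancements, with the slope read as a multiplicative correction to the a priori inventory. The sketch below does exactly that on synthetic time series (an assumed background of 1850 ppb and an assumed true scaling of 1.37); it is a cartoon of the regression step, not the Bayesian source or region analyses.

```python
# Cartoon of the regression step: regress measured CH4 enhancements on
# modeled enhancements; a slope above 1 suggests the a priori emissions
# are low by roughly that factor. Series are synthetic (assumed 1850 ppb
# background, assumed true scaling of 1.37).
import numpy as np

rng = np.random.default_rng(3)
background = 1850.0
measured = background + 40.0 * rng.random(200)                      # ppb, synthetic
modeled = background + (measured - background) / 1.37 + rng.normal(0.0, 1.0, 200)

slope = np.polyfit(modeled - background, measured - background, 1)[0]
print(f"slope ~ {slope:.2f} -> scale a priori emissions up by ~{(slope - 1.0) * 100:.0f}%")
```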

  20. New Methodology for Estimating Fuel Economy by Vehicle Class

    SciTech Connect (OSTI)

    Chin, Shih-Miao; Dabbs, Kathryn; Hwang, Ho-Ling

    2011-01-01

    The authors worked with the Federal Highway Administration's Office of Highway Policy Information to develop a new methodology to generate annual estimates of average fuel efficiency and number of motor vehicles registered by vehicle class for Table VM-1 of the Highway Statistics annual publication. This paper describes the new methodology developed under this effort and compares the results of the existing manual method and the new systematic approach. The methodology developed under this study takes a two-step approach. First, preliminary fuel efficiency rates are estimated based on vehicle stock models for different classes of vehicles. Then, a reconciliation model is used to adjust the initial fuel consumption rates from the vehicle stock models to match the VMT information for each vehicle class and the reported total fuel consumption. This reconciliation model utilizes a systematic approach that produces documentable and reproducible results. The basic framework utilizes a mathematical programming formulation to minimize the deviations between the fuel economy estimates published in the previous year's Highway Statistics and the results from the vehicle stock models, subject to the constraint that fuel consumption for the different vehicle classes must sum to the total fuel consumption estimate published in Table MF-21 of the current year's Highway Statistics (a simplified sketch of this reconciliation step follows). The results generated from this new approach provide a smoother time series for the fuel economies by vehicle class. It also utilizes the most up-to-date and best available data with sound econometric models to generate MPG estimates by vehicle class.
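
    As noted above, the reconciliation step is a small constrained optimization: adjust the preliminary per-class fuel use as little as possible while forcing the classes to sum to the published control total, then recompute MPG as VMT divided by fuel. The sketch below uses the closed-form solution for an unweighted least-squares adjustment with hypothetical numbers; the actual model may weight classes differently and carry additional constraints.

```python
# Closed-form reconciliation: minimize sum((fuel - fuel_prelim)^2) subject
# to sum(fuel) == fuel_total, then recompute MPG = VMT / fuel.
# All numbers are hypothetical.
import numpy as np

vmt = np.array([1_500_000.0, 900_000.0, 300_000.0])     # VMT by vehicle class
mpg_prelim = np.array([23.0, 17.0, 6.5])                # from vehicle stock models
fuel_prelim = vmt / mpg_prelim
fuel_total = 170_000.0                                  # published control total (gallons)

# Lagrange-multiplier solution spreads the shortfall equally across classes.
fuel_adj = fuel_prelim + (fuel_total - fuel_prelim.sum()) / fuel_prelim.size
mpg_adj = vmt / fuel_adj
print("preliminary fuel by class:", fuel_prelim.round(0))
print("reconciled MPG by class  :", mpg_adj.round(2))
```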

  1. Notes on a New Coherence Estimator

    SciTech Connect (OSTI)

    Bickel, Douglas L.

    2016-01-01

    This document discusses some interesting features of the new coherence estimator in [1]. The estimator is derived from a slightly different viewpoint. We discuss a few properties of the estimator, including presenting the probability density function of the denominator of the new estimator, which is a new feature of this estimator. Finally, we present an approximate equation for analysis of the sensitivity of the estimator to the knowledge of the noise value. ACKNOWLEDGEMENTS The preparation of this report is the result of an unfunded research and development activity. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  2. Surface daytime net radiation estimation using artificial neural networks

    SciTech Connect (OSTI)

    Jiang, Bo; Zhang, Yi; Liang, Shunlin; Zhang, Xiaotong; Xiao, Zhiqiang

    2014-11-11

    Net all-wave surface radiation (Rn) is one of the most important fundamental parameters in various applications. However, conventional Rn measurements are difficult to collect because of the high cost and ongoing maintenance of recording instruments. Therefore, various empirical Rn estimation models have been developed. This study presents the results of two artificial neural network (ANN) models (general regression neural networks (GRNN) and Neuroet) to estimate Rn globally from multi-source data, including remotely sensed products, surface measurements, and meteorological reanalysis products. Rn estimates provided by the two ANNs were tested against in-situ radiation measurements obtained from 251 global sites between 1991–2010 both in global mode (all data were used to fit the models) and in conditional mode (the data were divided into four subsets and the models were fitted separately). Based on the results obtained from extensive experiments, it has been proved that the two ANNs were superior to linear-based empirical models in both global and conditional modes and that the GRNN performed better and was more stable than Neuroet. The GRNN estimates had a determination coefficient (R2) of 0.92, a root mean square error (RMSE) of 34.27 W·m–2 , and a bias of –0.61 W·m–2 in global mode based on the validation dataset. In conclusion, ANN methods are a potentially powerful tool for global Rn estimation.

  3. State energy data report 1994: Consumption estimates

    SciTech Connect (OSTI)

    1996-10-01

    This document provides annual time series estimates of State-level energy consumption by major economic sector. The estimates are developed in the State Energy Data System (SEDS), operated by EIA. SEDS provides State energy consumption estimates to members of Congress, Federal and State agencies, and the general public, and provides the historical series needed for EIA's energy models. Division is made for each energy type and end use sector. Nuclear electric power is included.

  4. Interruption Cost Estimate Calculator | Open Energy Information

    Open Energy Info (EERE)

    Cost Estimate (ICE) Calculator This calculator is a tool designed for electric reliability planners at utilities, government organizations or other entities that are...

  5. An Estimator of Propagation of Cascading Failure

    SciTech Connect (OSTI)

    Dobson, Ian; Wierzbicki, Kevin; Carreras, Benjamin A; Lynch, Vickie E; Newman, David E

    2006-01-01

    The authors suggest a statistical estimator to measure the extent to which failures propagate in cascading failures such as large blackouts.

  6. ORISE: Radiation Dose Estimates and Other Compendia

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    rapidly estimate internal and external radiation dose magnitudes that can be used to help ... (PDF) Health Concerns Related to Radiation Exposure of the Female Nuclear Medicine ...

  7. Structure Learning and Statistical Estimation in Distribution...

    Office of Scientific and Technical Information (OSTI)

    Citation Details In-Document Search Title: Structure Learning and Statistical Estimation ... Part I of this paper discusses the problem of learning the operational structure of the ...

  8. Adjusted Estimates of Texas Natural Gas Production

    Reports and Publications (EIA)

    2005-01-01

    The Energy Information Administration (EIA) is adjusting its estimates of natural gas production in Texas for 2004 and 2005 to correctly account for carbon dioxide (CO2) production.

  9. Guidelines for Estimating Unmetered Industrial Water Use

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Guidelines for Estimating Unmetered Industrial Water Use Prepared for U.S. Department of ... PNNL would like to thank the Federal Water Working Group of the Interagency Energy ...

  10. Fuel Cell System for Transportation -- 2005 Cost Estimate

    SciTech Connect (OSTI)

    Wheeler, D.

    2006-10-01

    Independent review report of the methodology used by TIAX to estimate the cost of producing PEM fuel cells using 2005 cell stack technology. The U.S. Department of Energy (DOE) Hydrogen, Fuel Cells and Infrastructure Technologies Program Manager asked the National Renewable Energy Laboratory (NREL) to commission an independent review of the 2005 TIAX cost analysis for fuel cell production. The NREL Systems Integrator is responsible for conducting independent reviews of progress toward meeting the DOE Hydrogen Program (the Program) technical targets. An important technical target of the Program is the proton exchange membrane (PEM) fuel cell cost in terms of dollars per kilowatt ($/kW). The Program's Multi-Year Program Research, Development, and Demonstration Plan established $125/kW as the 2005 technical target. Over the last several years, the Program has contracted with TIAX, LLC (TIAX) to produce estimates of the high volume cost of PEM fuel cell production for transportation use. Since no manufacturer is yet producing PEM fuel cells in the quantities needed for an initial hydrogen-based transportation economy, these estimates are necessary for DOE to gauge progress toward meeting its targets. For a PEM fuel cell system configuration developed by Argonne National Laboratory, TIAX estimated the total cost to be $108/kW, based on assumptions of 500,000 units per year produced with 2005 cell stack technology, vertical integration of cell stack manufacturing, and balance-of-plant (BOP) components purchased from a supplier network. Furthermore, TIAX conducted a Monte Carlo analysis by varying ten key parameters over a wide range of values and estimated with 98% certainty that the mean PEM fuel cell system cost would be below DOE's 2005 target of $125/kW. NREL commissioned DJW TECHNOLOGY, LLC to form an Independent Review Team (the Team) of industry fuel cell experts and to evaluate the cost estimation process and the results reported by TIAX. The results of this

  11. Integration of remote measurement calibration with state estimation; A feasibility study

    SciTech Connect (OSTI)

    Adibi, M.M. ); Clements, K.A. ); Kafka, R.J. ); Stovall, J.P. )

    1992-08-01

    This paper describes the integration of measurement calibration and state estimation methodologies for increasing the confidence level in the real-time database. The objectives are to improve the performance of state estimators and to reduce the system engineering effort that goes into their installation and the related measurement calibrations.

  12. Cost Estimating Handbook for Environmental Restoration

    SciTech Connect (OSTI)

    1990-09-01

    Environmental restoration (ER) projects have presented the DOE and cost estimators with a number of properties that are not comparable to the normal estimating climate within DOE. These properties include: An entirely new set of specialized expressions and terminology. A higher than normal exposure to cost and schedule risk, as compared to most other DOE projects, due to changing regulations, public involvement, resource shortages, and scope of work. A higher than normal percentage of indirect costs to the total estimated cost due primarily to record keeping, special training, liability, and indemnification. More than one estimate for a project, particularly in the assessment phase, in order to provide input into the evaluation of alternatives for the cleanup action. While some aspects of existing guidance for cost estimators will be applicable to environmental restoration projects, some components of the present guidelines will have to be modified to reflect the unique elements of these projects. The purpose of this Handbook is to assist cost estimators in the preparation of environmental restoration estimates for Environmental Restoration and Waste Management (EM) projects undertaken by DOE. The DOE has, in recent years, seen a significant increase in the number, size, and frequency of environmental restoration projects that must be costed by the various DOE offices. The coming years will show the EM program to be the largest non-weapons program undertaken by DOE. These projects create new and unique estimating requirements since historical cost and estimating precedents are meager at best. It is anticipated that this Handbook will enhance the quality of cost data within DOE in several ways by providing: The basis for accurate, consistent, and traceable baselines. Sound methodologies, guidelines, and estimating formats. Sources of cost data/databases and estimating tools and techniques available to DOE cost professionals.

  13. Numerical estimation of the relative entropy of entanglement

    SciTech Connect (OSTI)

    Zinchenko, Yuriy; Friedland, Shmuel; Gour, Gilad

    2010-11-15

    We propose a practical algorithm for the calculation of the relative entropy of entanglement (REE), defined as the minimum relative entropy between a state and the set of states with positive partial transpose. Our algorithm is based on a practical semidefinite cutting plane approach. In low dimensions the implementation of the algorithm in MATLAB provides an estimation for the REE with an absolute error smaller than 10⁻³.

  14. Guidance on Utility Rate Estimations and Weather Normalization...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Utility Rate Estimations and Weather Normalization in an ESPC Guidance on Utility Rate Estimations and Weather Normalization in an ESPC Document explains how to use estimated ...

  15. Cost and Schedule Estimate and Analysis (FPM 207), Amarillo ...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Course topics include: identifying cost and schedule estimates; Basic estimating methods; Group analysis techniques; Applying life-cycle costing technique; Validating estimates; ...

  16. Property:Estimated End Date | Open Energy Information

    Open Energy Info (EERE)

    Estimated End Date Jump to: navigation, search Property Name Estimated End Date Property Type String Pages using the property "Estimated End Date" Showing 4 pages using this...

  17. Request for Retirement Annuity Estimates | Department of Energy

    Office of Environmental Management (EM)

    Request for Retirement Annuity Estimates Request for Retirement Annuity Estimates Upon request, Office of the Chief Human Capital Officer provides retirement estimates for ...

  18. West Virginia Dry Natural Gas Reserves Estimated Production ...

    U.S. Energy Information Administration (EIA) Indexed Site

    Estimated Production (Billion Cubic Feet) West Virginia Dry Natural Gas Reserves Estimated ... Dry Natural Gas Reserves Estimated Production West Virginia Dry Natural Gas Proved ...

  19. Virginia Dry Natural Gas Reserves Estimated Production (Billion...

    U.S. Energy Information Administration (EIA) Indexed Site

    Estimated Production (Billion Cubic Feet) Virginia Dry Natural Gas Reserves Estimated ... Dry Natural Gas Reserves Estimated Production Virginia Dry Natural Gas Proved Reserves Dry ...

  20. New York Dry Natural Gas Reserves Estimated Production (Billion...

    Gasoline and Diesel Fuel Update (EIA)

    Estimated Production (Billion Cubic Feet) New York Dry Natural Gas Reserves Estimated ... Dry Natural Gas Reserves Estimated Production New York Dry Natural Gas Proved Reserves Dry ...

  1. New Mexico Dry Natural Gas Reserves Estimated Production (Billion...

    Gasoline and Diesel Fuel Update (EIA)

    Estimated Production (Billion Cubic Feet) New Mexico Dry Natural Gas Reserves Estimated ... Dry Natural Gas Reserves Estimated Production New Mexico Dry Natural Gas Proved Reserves ...

  2. North Dakota Dry Natural Gas Reserves Estimated Production (Billion...

    U.S. Energy Information Administration (EIA) Indexed Site

    Estimated Production (Billion Cubic Feet) North Dakota Dry Natural Gas Reserves Estimated ... Dry Natural Gas Reserves Estimated Production North Dakota Dry Natural Gas Proved Reserves ...

  3. Systematic Approach for Decommissioning Planning and Estimating

    SciTech Connect (OSTI)

    Dam, A. S.

    2002-02-26

    Nuclear facility decommissioning, satisfactorily completed at the lowest cost, relies on a systematic approach to planning, estimating, and documenting the work. High-quality information is needed to properly perform the planning and estimating. A systematic approach to collecting and maintaining the needed information is recommended, using a knowledgebase system for information management. A systematic approach is also recommended to develop the decommissioning plan, cost estimate, and schedule. A probabilistic project cost and schedule risk analysis is included as part of the planning process. The entire effort is performed by an experienced team of decommissioning planners, cost estimators, schedulers, and facility-knowledgeable owner representatives. The plant data, work plans, cost, and schedule are entered into a knowledgebase. This systematic approach has been used successfully for decommissioning planning and cost estimating for a commercial nuclear power plant. Elements of this approach have been used for numerous cost estimates and estimate reviews. The plan and estimate in the knowledgebase should be a living document, updated periodically, to support decommissioning fund provisioning, with the plan ready for use when the need arises.

  4. Power, Optimization, Waste Estimating, Resourcing Tool

    Energy Science and Technology Software Center (OSTI)

    2009-08-13

    Planning, Optimization, Waste Estimating, Resourcing tool (POWERtool) is a comprehensive relational database software tool that can be used to develop and organize a detailed project scope, plan work tasks, develop bottom-up field cost and waste estimates for facility Deactivation and Decommissioning (D&D), equipment, and environmental restoration (ER) projects, and produce resource-loaded schedules.

  5. Input-output model for MACCS nuclear accident impacts estimation¹

    SciTech Connect (OSTI)

    Outkin, Alexander V.; Bixler, Nathan E.; Vargas, Vanessa N

    2015-01-27

    Since the original economic model for MACCS was developed, better quality economic data (as well as the tools to gather and process it) and better computational capabilities have become available. The update of the economic impacts component of the MACCS legacy model will provide improved estimates of business disruptions through the use of Input-Output based economic impact estimation. This paper presents an updated MACCS model, based on Input-Output methodology, in which economic impacts are calculated using the Regional Economic Accounting analysis tool (REAcct) created at Sandia National Laboratories. This new GDP-based model allows quick and consistent estimation of gross domestic product (GDP) losses due to nuclear power plant accidents. This paper outlines the steps taken to combine the REAcct Input-Output-based model with the MACCS code, describes the GDP loss calculation, and discusses the parameters and modeling assumptions necessary for the estimation of long-term effects of nuclear power plant accidents.

  6. Implementing nonquadratic objective functions for state estimation and bad data rejection

    SciTech Connect (OSTI)

    Baldick, R.; Clements, K.A.; Pinjo-Dzigal, Z.; Davis, P.W.

    1997-02-01

    Using a nonquadratic objective function for network state estimation can combine several estimation and bad data rejection techniques into one algorithm; e.g., the benefits of maximum likelihood least squares estimation can be coupled with the bad data rejection properties of least absolute value estimation. For such estimators, the authors describe an efficient implementation, one that builds naturally on existing least squares software, that is based on an iterative Gauss-Newton solution of the KKT optimality conditions. The authors illustrate the behavior of a quadratic-linear and a quadratic-constant objective function on a set of test networks. The former is closely related to the Huber M-estimator. The latter shows somewhat better bad data rejection properties, perhaps because it arises from a natural model of meter failure.
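
    As a concrete illustration of the quadratic-linear idea, the sketch below runs an iteratively reweighted least-squares loop with Huber weights on a linear measurement model z = Hx + e. It is a generic M-estimation sketch under assumed data and threshold k, not the authors' Gauss-Newton/KKT implementation.

    ```python
    # Generic Huber-type IRLS sketch for z = H x + e (illustrative data and threshold).
    import numpy as np

    def huber_irls(H, z, k=1.5, iters=20):
        x = np.linalg.lstsq(H, z, rcond=None)[0]            # ordinary least-squares start
        for _ in range(iters):
            r = z - H @ x                                    # residuals
            scale = 1.4826 * np.median(np.abs(r)) + 1e-12    # robust scale estimate
            u = np.abs(r / scale)
            w = np.where(u <= k, 1.0, k / u)                 # quadratic core, linear tails
            x = np.linalg.solve(H.T @ (w[:, None] * H), H.T @ (w * z))  # weighted normal equations
        return x

    rng = np.random.default_rng(0)
    H = rng.normal(size=(50, 4))
    x_true = np.array([1.0, -2.0, 0.5, 3.0])
    z = H @ x_true + 0.1 * rng.normal(size=50)
    z[::10] += 5.0                                           # inject gross (bad) measurements
    print(huber_irls(H, z))                                  # should stay close to x_true
    ```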

  7. Estimating vehicle roadside encroachment frequency using accident prediction models

    SciTech Connect (OSTI)

    Miaou, S.-P.

    1996-07-01

    The existing data to support the development of roadside encroachment-based accident models are extremely limited and largely outdated. Under the sponsorship of the Federal Highway Administration and Transportation Research Board, several roadside safety projects have attempted to address this issue by providing rather comprehensive data collection plans and conducting pilot data collection efforts. It is clear from the results of these studies that the required field data collection efforts will be expensive. Furthermore, the validity of any field-collected encroachment data may be questionable because of the technical difficulty of distinguishing intentional from unintentional encroachments. This paper proposes an alternative method for estimating the basic roadside encroachment data without actually collecting them in the field. The method is developed by exploring the probabilistic relationships between a roadside encroachment event and a run-off-the-road event. With some mild assumptions, the method is capable of providing a wide range of basic encroachment data from conventional accident prediction models. To illustrate the concept and use of such a method, some basic encroachment data are estimated for rural two-lane undivided roads. In addition, the estimated encroachment data are compared with the existing collected data. The illustration shows that the method described in this paper can be a viable approach to estimating basic encroachment data without actually collecting them, which can be very costly.

  8. State energy data report 1993: Consumption estimates

    SciTech Connect (OSTI)

    1995-07-01

    The State Energy Data Report (SEDR) provides annual time series estimates of State-level energy consumption by major economic sector. The estimates are developed in the State Energy Data System (SEDS), which is maintained and operated by the Energy Information Administration (EIA). The goal in maintaining SEDS is to create historical time series of energy consumption by State that are defined as consistently as possible over time and across sectors. SEDS exists for two principal reasons: (1) to provide State energy consumption estimates to Members of Congress, Federal and State agencies, and the general public; and (2) to provide the historical series necessary for EIA's energy models.

  9. Parallel State Estimation Assessment with Practical Data

    SciTech Connect (OSTI)

    Chen, Yousu; Jin, Shuangshuang; Rice, Mark J.; Huang, Zhenyu

    2014-10-31

    This paper presents a full-cycle parallel state estimation (PSE) implementation using a preconditioned conjugate gradient algorithm. The developed code is able to solve large-size power system state estimation within 5 seconds using real-world data, comparable to the Supervisory Control And Data Acquisition (SCADA) rate. This achievement allows the operators to know the system status much faster to help improve grid reliability. Case study results of the Bonneville Power Administration (BPA) system with real measurements are presented. The benefits of fast state estimation are also discussed.
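
    For readers unfamiliar with the solver family, the sketch below shows a preconditioned conjugate gradient solve of a symmetric positive definite "gain matrix" system with a simple Jacobi preconditioner, which is the general shape of the computation; it is not the PNNL code, and the matrix and right-hand side here are synthetic stand-ins.

    ```python
    # Illustrative PCG solve of an SPD system, as arises in weighted-least-squares
    # state estimation (synthetic data, Jacobi preconditioner).
    import numpy as np
    from scipy.sparse import csr_matrix, diags
    from scipy.sparse.linalg import cg

    n = 500
    rng = np.random.default_rng(1)
    A = rng.normal(size=(n, n)) * (rng.random((n, n)) < 0.02)   # sparse random pattern
    G = csr_matrix(A @ A.T + n * np.eye(n))   # stands in for the SPD gain matrix H^T R^-1 H
    b = rng.normal(size=n)                    # stands in for H^T R^-1 (z - h(x))

    M = diags(1.0 / G.diagonal())             # Jacobi preconditioner: inverse of the diagonal
    x, info = cg(G, b, M=M)
    print("PCG converged" if info == 0 else f"PCG info = {info}")
    ```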

  10. State Energy Data Report, 1991: Consumption estimates

    SciTech Connect (OSTI)

    Not Available

    1993-05-01

    The State Energy Data Report (SEDR) provides annual time series estimates of State-level energy consumption by major economic sector. The estimates are developed in the State Energy Data System (SEDS), which is maintained and operated by the Energy Information Administration (EIA). The goal in maintaining SEDS is to create historical time series of energy consumption by State that are defined as consistently as possible over time and across sectors. SEDS exists for two principal reasons: (1) to provide State energy consumption estimates to the Government, policy makers, and the public; and (2) to provide the historical series necessary for EIA's energy models.

  11. State energy data report 1995 - consumption estimates

    SciTech Connect (OSTI)

    1997-12-01

    The State Energy Data Report (SEDR) provides annual time series estimates of State-level energy consumption by major economic sectors. The estimates are developed in the State Energy Data System (SEDS), which is maintained and operated by the Energy Information Administration (EIA). SEDS exists for two principal reasons: (1) to provide State energy consumption estimates to Members of Congress, Federal and State agencies, and the general public, and (2) to provide the historical series necessary for EIA's energy models.

  12. Parameter estimation with Sandage-Loeb test

    SciTech Connect (OSTI)

    Geng, Jia-Jia; Zhang, Jing-Fei; Zhang, Xin E-mail: jfzhang@mail.neu.edu.cn

    2014-12-01

    The Sandage-Loeb (SL) test directly measures the expansion rate of the universe in the redshift range 2 ≲ z ≲ 5 by detecting redshift drift in the spectra of the Lyman-α forest of distant quasars. We discuss the impact of the future SL test data on parameter estimation for the ΛCDM, the wCDM, and the w₀wₐCDM models. To avoid the potential inconsistency with other observational data, we take the best-fitting dark energy model constrained by the current observations as the fiducial model to produce 30 mock SL test data. The SL test data provide an important supplement to the other dark energy probes, since they are extremely helpful in breaking the existing parameter degeneracies. We show that the strong degeneracy between Ωₘ and H₀ in all three dark energy models is well broken by the SL test. Compared to the current combined data of type Ia supernovae, baryon acoustic oscillation, cosmic microwave background, and Hubble constant, the 30-yr observation of SL test could improve the constraints on Ωₘ and H₀ by more than 60% for all three models. But the SL test can only moderately improve the constraint on the equation of state of dark energy. We show that a 30-yr observation of SL test could help improve the constraint on constant w by about 25%, and improve the constraints on w₀ and wₐ by about 20% and 15%, respectively. We also quantify the constraining power of the SL test in the future high-precision joint geometric constraints on dark energy. The mock future supernova and baryon acoustic oscillation data are simulated based on the space-based project JDEM. We find that the 30-yr observation of SL test would help improve the measurement precision of Ωₘ, H₀, and wₐ by more than 70%, 20%, and 60%, respectively, for the w₀wₐCDM model.
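
    For context, the redshift drift observable on which the SL test rests is usually written as follows; this is the standard expression from the literature, not a formula quoted from this record.

    ```latex
    \[
      \dot{z} \equiv \frac{\mathrm{d}z}{\mathrm{d}t_0} = (1+z)\,H_0 - H(z),
      \qquad
      \Delta v = \frac{c\,\Delta z}{1+z} = c\,H_0\,\Delta t_0\!\left[1 - \frac{E(z)}{1+z}\right],
    \]
    ```

    where E(z) = H(z)/H₀ and Δt₀ is the observing time span (30 yr in the forecasts above).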

  13. Fuel Cycle Services Needs Estimator v.2.0

    Energy Science and Technology Software Center (OSTI)

    2008-03-18

    The "Fuel Cycle Services Needs Estimator", Version 2.0 allows users to estimate the amount of uranium enrichment services needed and amount of spent nuclear fuel produced by a given fleet of nuclear power reactors through 2050 based on user-determined information about the size of a reactor fleet and average characteristics of reactors in that fleet. The program helps users evaluate the current and future supply of nuclear fuel cycle services. The program also allows usersmore » to compare the enrichment needs and spent fuel production of more up to seven defined nuclear power reactor fleets and to aggregate estimated needs. Version 2.0 of the program has an additions of new graphs to show results of calculations (calculation capabilities and other graphing tools included in version 1.o), maps showing flows of material based on calculation results, and additional calculation capabilities that allow the user to compare supply to demand (demand calculations included in version 1.0). Default values for seven selected nuclear energy programs in East Asia are included for reference and comparison. The program was designed using the dynamic simulation software, Powersim.« less

  14. Estimating electron drift velocities in magnetron discharges...

    Office of Scientific and Technical Information (OSTI)

    Citation Details In-Document Search Title: Estimating ... OSTI Identifier: 1172974 Report Number(s): LBNL-5865E DOE Contract Number: DE-AC02-05CH11231 Resource Type: Journal ...

  15. Estimating Temperature Distributions In Geothermal Areas Using...

    Open Energy Info (EERE)

    "education level" (which depends on the amount and structure of information used for teaching) and (b) the distance of the point at which the estimate is made from the area for...

  16. Forward estimation for game-tree search

    SciTech Connect (OSTI)

    Zhang, Weixiong

    1996-12-31

    It is known that bounds on the minimax values of nodes in a game tree can be used to reduce the computational complexity of minimax search for two-player games. We describe a very simple method to estimate bounds on the minimax values of interior nodes of a game tree, and use the bounds to improve minimax search. The new algorithm, called forward estimation, does not require additional domain knowledge other than a static node evaluation function, and has small constant overhead per node expansion. We also propose a variation of forward estimation, which provides a tradeoff between computational complexity and decision quality. Our experimental results show that forward estimation outperforms alpha-beta pruning on random game trees and the game of Othello.
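
    The toy implementation below is a rough, self-contained sketch of what bound-based forward pruning looks like inside alpha-beta search; it conveys the flavor of forward estimation but is not Zhang's algorithm, and the node format, static estimates, and MARGIN are invented for illustration.

    ```python
    # Rough sketch of bound-based forward pruning in minimax search. Leaves carry
    # an exact 'value'; interior nodes carry a cheap static 'est' plus 'children'.
    # MARGIN is an assumed error bound on the static estimate.
    import math

    MARGIN = 2.0

    def search(node, alpha, beta, maximizing):
        if "value" in node:                        # leaf: exact value
            return node["value"]
        best = -math.inf if maximizing else math.inf
        for child in node["children"]:
            est = child.get("est", child.get("value", 0.0))
            if maximizing:
                if est + MARGIN <= alpha:          # cannot raise alpha; prune forward
                    continue
                best = max(best, search(child, alpha, beta, False))
                alpha = max(alpha, best)
            else:
                if est - MARGIN >= beta:           # cannot lower beta; prune forward
                    continue
                best = min(best, search(child, alpha, beta, True))
                beta = min(beta, best)
            if alpha >= beta:                      # ordinary alpha-beta cutoff
                break
        return best

    leaf = lambda v: {"value": v}
    tree = {"est": 3.0, "children": [
        {"est": 3.0, "children": [leaf(3), leaf(5)]},
        {"est": -4.0, "children": [leaf(-6), leaf(-4)]},   # skipped by forward estimation
    ]}
    print(search(tree, -math.inf, math.inf, True))          # -> 3
    ```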

  17. Lensed CMB simulation and parameter estimation

    SciTech Connect (OSTI)

    Lewis, Antony

    2005-04-15

    Modelling of the weak lensing of the CMB will be crucial to obtain correct cosmological parameter constraints from forthcoming precision CMB anisotropy observations. The lensing affects the power spectrum as well as inducing non-Gaussianities. We discuss the simulation of full-sky CMB maps in the weak lensing approximation and describe a fast numerical code. The series expansion in the deflection angle cannot be used to simulate accurate CMB maps, so a pixel remapping must be used. For parameter estimation accounting for the change in the power spectrum but assuming Gaussianity is sufficient to obtain accurate results up to Planck sensitivity using current tools. A fuller analysis may be required to obtain accurate error estimates and for more sensitive observations. We demonstrate a simple full-sky simulation and subsequent parameter estimation at Planck-like sensitivity. The lensed CMB simulation and parameter estimation codes are publicly available.

  18. Buildings GHG Mitigation Estimator Worksheet, Version 1

    Broader source: Energy.gov [DOE]

    Excel document describing Version 1 of the Buildings GHG Mitigation Estimator tool. This tool assists federal agencies in estimating the greenhouse gas reductions from implementing energy efficiency measures across a portfolio of buildings. It is designed to be applied to groups of office buildings, for example, at a program level (regional or site) that can be summarized at the agency level. While the default savings and cost estimates apply to office buildings, users can define their own efficiency measures, costs, and savings estimates for inclusion in the portfolio assessment. More information on user-defined measures can be found in Step 2 of the buildings emission reduction guidance. The output of this tool is a prioritized set of activities that can help the agency achieve its greenhouse gas reduction targets most cost-effectively.

  19. Hydrogen Production Cost Estimate Using Biomass Gasification...

    Broader source: Energy.gov (indexed) [DOE]

    potential of Hydrogen Production Cost Estimate Using Biomass Gasification The Panel reviewed the current H2A case (Version 2.12, Case 01D) for hydrogen production via ...

  20. Preliminary CBECS End-Use Estimates

    U.S. Energy Information Administration (EIA) Indexed Site

    For the past three CBECS (1989, 1992, and 1995), we used a statistically-adjusted engineering (SAE) methodology to estimate end-use consumption. The core of the SAE methodology...

  1. Budget estimates, fiscal year 1997. Volume 12

    SciTech Connect (OSTI)

    1996-03-01

    This report contains the fiscal year budget justification to Congress. The budget provides estimates for salaries and expenses and for the Office of the Inspector General for fiscal year 1997.

  2. WEATHER PREDICTIONS AND SURFACE RADIATION ESTIMATES

    Office of Legacy Management (LM)

    RADIATION ESTIMATES for the RULISON EVENT, Final Report. Albert H. Stout, Ray E. ... U.S. Atomic Energy Commission, January 1970.

  3. gtp_flow_power_estimator.xlsx

    Broader source: Energy.gov [DOE]

    This simple spreadsheet model estimates either the flow rate required to produce a specified level of power output, or the power output that can be produced from a specified flow rate.
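
    A minimal sketch of the kind of relationship such a spreadsheet typically encodes is shown below, assuming a simple water-flow thermal model (P = ρ·Q·c_p·ΔT) with an assumed conversion efficiency; the constants, efficiency, and function names are illustrative, not taken from the DOE workbook.

    ```python
    # Assumed flow/power relationship for illustration only.
    RHO = 1000.0     # water density, kg/m^3
    CP = 4.18        # specific heat of water, kJ/(kg*K)

    def power_kw(flow_m3_per_s, delta_t_c, efficiency=0.10):
        """Electric power (kW) from a given flow rate and temperature drop."""
        thermal_kw = RHO * flow_m3_per_s * CP * delta_t_c
        return thermal_kw * efficiency

    def flow_for_power(power_kw_target, delta_t_c, efficiency=0.10):
        """Flow rate (m^3/s) required to deliver a target electric power (kW)."""
        return power_kw_target / (RHO * CP * delta_t_c * efficiency)

    print(power_kw(0.05, 60.0))         # ~1254 kW electric (~12,540 kW thermal at 10% conversion)
    print(flow_for_power(500.0, 60.0))  # ~0.02 m^3/s
    ```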

  4. Estimates of US biomass energy consumption 1992

    SciTech Connect (OSTI)

    Not Available

    1994-05-06

    This report is the seventh in a series of publications developed by the Energy Information Administration (EIA) to quantify the biomass-derived primary energy used by the US economy. It presents estimates of 1991 and 1992 consumption. The objective of this report is to provide updated estimates of biomass energy consumption for use by Congress, Federal and State agencies, biomass producers and end-use sectors, and the public at large.

  5. Chapter 3: FY 2005 benefits estimates

    SciTech Connect (OSTI)

    None, None

    2009-01-18

    The Office of Energy Efficiency and Renewable Energy (EERE) estimates expected benefits for its overall portfolio and for each of its 11 programs. Benefits for the FY 2005 budget request are estimated for the midterm (2010-2025) and long term (2030-2050). Two separate models suited to these periods are employed—NEMS-GPRA05 for the midterm and MARKAL-GPRA05 for the long term.

  6. Chapter 3: FY 2006 benefits estimates

    SciTech Connect (OSTI)

    None, None

    2009-01-18

    The Office of Energy Efficiency and Renewable Energy (EERE) estimates expected benefits for its overall portfolio and for each of its 11 programs. Benefits for the FY 2006 budget request are estimated for the midterm (2010-2025) and long term (2030-2050). Two separate models suited to these periods are employed–NEMS-GPRA06 for the midterm and MARKAL-GPRA06 for the long term.

  7. Guidelines for Estimating Unmetered Industrial Water Use

    SciTech Connect (OSTI)

    Boyd, Brian K.

    2010-08-01

    The document provides a methodology to estimate unmetered industrial water use for evaporative cooling systems, steam generating boiler systems, batch process applications, and wash systems. For each category standard mathematical relationships are summarized and provided in a single resource to assist Federal agencies in developing an initial estimate of their industrial water use. The approach incorporates industry norms, general rules of thumb, and industry survey information to provide methodologies for each section.

  8. Structure Learning and Statistical Estimation in Distribution Networks - Part II

    SciTech Connect (OSTI)

    Deka, Deepjyoti; Backhaus, Scott N.; Chertkov, Michael

    2015-02-13

    Limited placement of real-time monitoring devices in the distribution grid, recent trends notwithstanding, has prevented the easy implementation of demand response and other smart grid applications. Part I of this paper discusses the problem of learning the operational structure of the grid from nodal voltage measurements. In this work (Part II), the learning of the operational radial structure is coupled with the problem of estimating nodal consumption statistics and inferring the line parameters in the grid. Based on a Linear-Coupled (LC) approximation of the AC power flow equations, polynomial-time algorithms are designed to identify the structure and estimate nodal load characteristics and/or line parameters in the grid using the available nodal voltage measurements. The structure learning algorithm is then extended to cases with missing data, where available observations are limited to a fraction of the grid nodes. The efficacy of the presented algorithms is demonstrated through simulations on several distribution test cases.

  9. RSMASS: A simple model for estimating reactor and shield masses

    SciTech Connect (OSTI)

    Marshall, A.C.; Aragon, J.; Gallup, D.

    1987-01-01

    A simple mathematical model (RSMASS) has been developed to provide rapid estimates of reactor and shield masses for space-based reactor power systems. Approximations are used rather than correlations or detailed calculations to estimate the reactor fuel mass and the masses of the moderator, structure, reflector, pressure vessel, miscellaneous components, and the reactor shield. The fuel mass is determined either by neutronics limits, thermal/hydraulic limits, or fuel damage limits, whichever yields the largest mass. RSMASS requires the reactor power and energy, 24 reactor parameters, and 20 shield parameters to be specified. This parametric approach should be applicable to a very broad range of reactor types. Reactor and shield masses calculated by RSMASS were found to be in good agreement with the masses obtained from detailed calculations.

  10. Estimated recharge rates at the Hanford Site

    SciTech Connect (OSTI)

    Fayer, M.J.; Walters, T.B.

    1995-02-01

    The Ground-Water Surveillance Project monitors the distribution of contaminants in ground water at the Hanford Site for the U.S. Department of Energy. A subtask called "Water Budget at Hanford" was initiated in FY 1994. The objective of this subtask was to produce a defensible map of estimated recharge rates across the Hanford Site. Methods that have been used to estimate recharge rates at the Hanford Site include measurements (of drainage, water contents, and tracers) and computer modeling. For the simulations of 12 soil-vegetation combinations, the annual rates varied from 0.05 mm/yr for the Ephrata sandy loam with bunchgrass to 85.2 mm/yr for the same soil without vegetation. Water content data from the Grass Site in the 300 Area indicated that annual rates varied from 3.0 to 143.5 mm/yr during an 8-year period. The annual volume of estimated recharge was calculated to be 8.47 × 10⁹ L for the potential future Hanford Site (i.e., the portion of the current Site bounded by Highway 240 and the Columbia River). This total volume is similar to earlier estimates of natural recharge and is 2 to 10 times higher than estimates of runoff and ground-water flow from higher elevations. Not only is the volume of natural recharge significant in comparison to other ground-water inputs, the distribution of estimated recharge is highly skewed to the disturbed sandy soils (i.e., the 200 Areas, where most contaminants originate). The lack of good estimates of the means and variances of the supporting data (i.e., the soil map, the vegetation/land use map, the model parameters) translates into large uncertainties in the recharge estimates. When combined, the significant quantity of estimated recharge, its high sensitivity to disturbance, and the unquantified uncertainty of the data and model parameters suggest that the defensibility of the recharge estimates should be improved.

  11. Icing rate meter estimation of in-cloud cable icing

    SciTech Connect (OSTI)

    McComber, P.; Druez, J.; Laflamme, J.

    1994-12-31

    In many northern countries, the design and reliability of power transmission lines are closely related to atmospheric icing overloads. It is becoming increasingly important to have reliable instrument systems to warn of icing conditions before icing loads become sufficient to damage the power transmission network. Various instruments are presently being developed to provide better monitoring of icing conditions. One such instrument is the icing rate meter (IRM), which counts icing and de-icing cycles per unit time on a standard probe and can be used to estimate the icing rate on nearby cables. The calibration presently used was originally based on experiments conducted in a cold room. Even though this calibration has shown that the IRM estimation already offers an improvement over model prediction based on standard meteorological parameters, it can certainly be improved further with appropriate field data. For this purpose, the instrument was tested at an icing test site at Mt. Valin (altitude 902 m), Quebec, Canada. In this paper, measurements from twelve in-cloud icing events during the 1991-92 winter are divided into one-hour periods of icing to provide the experimental icing rate data. The icing rates measured on 12.5 mm and 35 mm cables are then compared with the number of IRM signals, also for one-hour periods, in relation to initial ice load, temperature, wind velocity, and direction. From this analysis, a better calibration for the IRM instrument is suggested. The improvement in the IRM estimation is illustrated by comparing the icing load estimates obtained with the old and new calibrations against measurements for two complete icing events.

  12. Robust Optical Richness Estimation with Reduced Scatter

    SciTech Connect (OSTI)

    Rykoff, E.S.; Koester, B.P.; Rozo, E.; Annis, J.; Evrard, A.E.; Hansen, S.M.; Hao, J.; Johnston, D.E.; McKay, T.A.; Wechsler, R.H.; /KIPAC, Menlo Park /SLAC

    2012-06-07

    Reducing the scatter between cluster mass and optical richness is a key goal for cluster cosmology from photometric catalogs. We consider various modifications to the red-sequence matched filter richness estimator of Rozo et al. (2009b), and evaluate their impact on the scatter in X-ray luminosity at fixed richness. Most significantly, we find that deeper luminosity cuts can reduce the recovered scatter, finding that σ_{ln Lx|λ} = 0.63 ± 0.02 for clusters with M_500c ≳ 1.6 × 10¹⁴ h₇₀⁻¹ M_⊙. The corresponding scatter in mass at fixed richness is σ_{ln M|λ} ≈ 0.2-0.3 depending on the richness, comparable to that for total X-ray luminosity. We find that including blue galaxies in the richness estimate increases the scatter, as does weighting galaxies by their optical luminosity. We further demonstrate that our richness estimator is very robust. Specifically, the filter employed when estimating richness can be calibrated directly from the data, without requiring a priori calibrations of the red sequence. We also demonstrate that the recovered richness is robust to up to 50% uncertainties in the galaxy background, as well as to the choice of photometric filter employed, so long as the filters span the 4000 Å break of red-sequence galaxies. Consequently, our richness estimator can be used to compare richness estimates of different clusters, even if they do not share the same photometric data. Appendix A includes 'easy-bake' instructions for implementing our optimal richness estimator, and we are releasing an implementation of the code that works with SDSS data, as well as an augmented maxBCG catalog with the λ richness measured for each cluster.

  13. Handbook for cost estimating. A method for developing estimates of costs for generic actions for nuclear power plants

    SciTech Connect (OSTI)

    Ball, J.R.; Cohen, S.; Ziegler, E.Z.

    1984-10-01

    This document provides overall guidance to assist the NRC in preparing the types of cost estimates required by the Regulatory Analysis Guidelines and to assist in the assignment of priorities in resolving generic safety issues. The Handbook presents an overall cost model that allows the cost analyst to develop a chronological series of activities needed to implement a specific regulatory requirement throughout all applicable commercial LWR power plants and to identify the significant cost elements for each activity. References to available cost data are provided along with rules of thumb and cost factors to assist in evaluating each cost element. A suitable code-of-accounts data base is presented to assist in organizing and aggregating costs. Rudimentary cost analysis methods are described to allow the analyst to produce a constant-dollar, lifetime cost for the requirement. A step-by-step example cost estimate is included to demonstrate the overall use of the Handbook.

  14. GAO Cost Estimating and Assessment Guide | Department of Energy

    Energy Savers [EERE]

    GAO Cost Estimating and Assessment Guide: Twelve Steps of a High-Quality Cost Estimating Process, from the first step of defining the estimate's purpose to the last step of updating the estimate to reflect actual costs and changes.

  15. Sub-Second Parallel State Estimation

    SciTech Connect (OSTI)

    Chen, Yousu; Rice, Mark J.; Glaesemann, Kurt R.; Wang, Shaobu; Huang, Zhenyu

    2014-10-31

    This report describes the performance of the Pacific Northwest National Laboratory (PNNL) sub-second parallel state estimation (PSE) tool using utility data from the Bonneville Power Administration (BPA) and discusses the benefits of the fast computational speed for power system applications. The test data were provided by BPA. They are two days' worth of hourly snapshots that include power system data and measurement sets in a commercial tool format. These data are extracted from the commercial tool and fed into the PSE tool. With the help of advanced solvers, the PSE tool is able to solve each BPA hourly state estimation problem within one second, which is more than 10 times faster than today's commercial tool. This improved computational performance can help increase the reliability value of state estimation in many aspects: (1) the shorter the time required for execution of state estimation, the more time remains for operators to take appropriate actions, and/or to apply automatic or manual corrective control actions; this increases the chances of arresting or mitigating the impact of cascading failures; (2) the SE can be executed multiple times within the time allowance, so the robustness of SE can be enhanced by repeating the execution of the SE with adaptive adjustments, including removing bad data and/or adjusting different initial conditions to compute a better estimate within the same time as a traditional state estimator's single estimate. There are other benefits with the sub-second SE, such as that the PSE results can potentially be used in local and/or wide-area automatic corrective control actions that are currently dependent on raw measurements, to minimize the impact of bad measurements, and provide opportunities to enhance the power grid reliability and efficiency. PSE also can enable other advanced tools that rely on SE outputs and could be used to further improve operators' actions and automated controls to mitigate effects

  16. Building unbiased estimators from non-gaussian likelihoods with application to shear estimation

    SciTech Connect (OSTI)

    Madhavacheril, Mathew S.; McDonald, Patrick; Sehgal, Neelima; Slosar, Anze

    2015-01-15

    We develop a general framework for generating estimators of a given quantity which are unbiased to a given order in the difference between the true value of the underlying quantity and the fiducial position in theory space around which we expand the likelihood. We apply this formalism to rederive the optimal quadratic estimator and show how the replacement of the second derivative matrix with the Fisher matrix is a generic way of creating an unbiased estimator (assuming choice of the fiducial model is independent of data). Next we apply the approach to estimation of shear lensing, closely following the work of Bernstein and Armstrong (2014). Our first order estimator reduces to their estimator in the limit of zero shear, but it also naturally allows for the case of non-constant shear and the easy calculation of correlation functions or power spectra using standard methods. Both our first-order estimator and Bernstein and Armstrong's estimator exhibit a bias which is quadratic in true shear. Our third-order estimator is, at least in the realm of the toy problem of Bernstein and Armstrong, unbiased to 0.1% in relative shear errors Δg/g for shears up to |g| = 0.2.

  17. Building unbiased estimators from non-Gaussian likelihoods with application to shear estimation

    SciTech Connect (OSTI)

    Madhavacheril, Mathew S.; Sehgal, Neelima; McDonald, Patrick; Slosar, Anze E-mail: pvmcdonald@lbl.gov E-mail: anze@bnl.gov

    2015-01-01

    We develop a general framework for generating estimators of a given quantity which are unbiased to a given order in the difference between the true value of the underlying quantity and the fiducial position in theory space around which we expand the likelihood. We apply this formalism to rederive the optimal quadratic estimator and show how the replacement of the second derivative matrix with the Fisher matrix is a generic way of creating an unbiased estimator (assuming choice of the fiducial model is independent of data). Next we apply the approach to estimation of shear lensing, closely following the work of Bernstein and Armstrong (2014). Our first order estimator reduces to their estimator in the limit of zero shear, but it also naturally allows for the case of non-constant shear and the easy calculation of correlation functions or power spectra using standard methods. Both our first-order estimator and Bernstein and Armstrong's estimator exhibit a bias which is quadratic in true shear. Our third-order estimator is, at least in the realm of the toy problem of Bernstein and Armstrong, unbiased to 0.1% in relative shear errors Δg/g for shears up to |g|=0.2.

  18. Building unbiased estimators from non-gaussian likelihoods with application to shear estimation

    SciTech Connect (OSTI)

    Madhavacheril, Mathew S.; Slosar, Anze; McDonald, Patrick; Sehgal, Neelima

    2015-01-01

    We develop a general framework for generating estimators of a given quantity which are unbiased to a given order in the difference between the true value of the underlying quantity and the fiducial position in theory space around which we expand the likelihood. We apply this formalism to rederive the optimal quadratic estimator and show how the replacement of the second derivative matrix with the Fisher matrix is a generic way of creating an unbiased estimator (assuming choice of the fiducial model is independent of data). Next we apply the approach to estimation of shear lensing, closely following the work of Bernstein and Armstrong (2014). Our first order estimator reduces to their estimator in the limit of zero shear, but it also naturally allows for the case of non-constant shear and the easy calculation of correlation functions or power spectra using standard methods. Both our first-order estimator and Bernstein and Armstrong's estimator exhibit a bias which is quadratic in true shear. Our third-order estimator is, at least in the realm of the toy problem of Bernstein and Armstrong, unbiased to 0.1% in relative shear errors Δg/g for shears up to |g| = 0.2.

  19. Building unbiased estimators from non-gaussian likelihoods with application to shear estimation

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Madhavacheril, Mathew S.; McDonald, Patrick; Sehgal, Neelima; Slosar, Anze

    2015-01-15

    We develop a general framework for generating estimators of a given quantity which are unbiased to a given order in the difference between the true value of the underlying quantity and the fiducial position in theory space around which we expand the likelihood. We apply this formalism to rederive the optimal quadratic estimator and show how the replacement of the second derivative matrix with the Fisher matrix is a generic way of creating an unbiased estimator (assuming choice of the fiducial model is independent of data). Next we apply the approach to estimation of shear lensing, closely following the work of Bernstein and Armstrong (2014). Our first order estimator reduces to their estimator in the limit of zero shear, but it also naturally allows for the case of non-constant shear and the easy calculation of correlation functions or power spectra using standard methods. Both our first-order estimator and Bernstein and Armstrong’s estimator exhibit a bias which is quadratic in true shear. Our third-order estimator is, at least in the realm of the toy problem of Bernstein and Armstrong, unbiased to 0.1% in relative shear errors Δg/g for shears up to |g| = 0.2.

  20. Estimating exposure of terrestrial wildlife to contaminants

    SciTech Connect (OSTI)

    Sample, B.E.; Suter, G.W. II

    1994-09-01

    This report describes generalized models for the estimation of contaminant exposure experienced by wildlife on the Oak Ridge Reservation. The primary exposure pathway considered is oral ingestion, e.g., the consumption of contaminated food, water, or soil. Exposure through dermal absorption and inhalation are special cases and are not considered herein. Because wildlife are mobile and generally consume diverse diets, and because environmental contamination is not spatially homogeneous, factors to account for variation in diet, movement, and contaminant distribution have been incorporated into the models. To facilitate the use and application of the models, life history parameters necessary to estimate exposure are summarized for 15 common wildlife species. Finally, to display the application of the models, exposure estimates were calculated for four species using data from a source operable unit on the Oak Ridge Reservation.

  1. Reionization history and CMB parameter estimation

    SciTech Connect (OSTI)

    Dizgah, Azadeh Moradinezhad; Kinney, William H.; Gnedin, Nickolay Y. E-mail: gnedin@fnal.edu

    2013-05-01

    We study how uncertainty in the reionization history of the universe affects estimates of other cosmological parameters from the Cosmic Microwave Background. We analyze WMAP7 data and synthetic Planck-quality data generated using a realistic scenario for the reionization history of the universe obtained from high-resolution numerical simulation. We perform parameter estimation using a simple sudden reionization approximation, and using the Principal Component Analysis (PCA) technique proposed by Mortonson and Hu. We reach two main conclusions: (1) Adopting a simple sudden reionization model does not introduce measurable bias into values for other parameters, indicating that detailed modeling of reionization is not necessary for the purpose of parameter estimation from future CMB data sets such as Planck. (2) PCA analysis does not allow accurate reconstruction of the actual reionization history of the universe in a realistic case.

  2. Robust estimation procedure in panel data model

    SciTech Connect (OSTI)

    Shariff, Nurul Sima Mohamad; Hamzah, Nor Aishah

    2014-06-19

    Panel data modeling has received great attention in econometric research recently. This is due to the availability of data sources and the interest in studying cross sections of individuals observed over time. However, problems may arise in modeling the panel in the presence of cross-sectional dependence and outliers. Even though there are a few methods that take the presence of cross-sectional dependence in the panel into consideration, these methods may provide inconsistent parameter estimates and inferences when outliers occur in the panel. As such, an alternative method that is robust to outliers and cross-sectional dependence is introduced in this paper. The properties and construction of the confidence interval for the parameter estimates are also considered in this paper. The robustness of the procedure is investigated and comparisons are made to the existing method via simulation studies. Our results show that the robust approach is able to produce accurate and reliable parameter estimates under the conditions considered.

  3. State energy data report 1996: Consumption estimates

    SciTech Connect (OSTI)

    1999-02-01

    The State Energy Data Report (SEDR) provides annual time series estimates of State-level energy consumption by major economic sectors. The estimates are developed in the Combined State Energy Data System (CSEDS), which is maintained and operated by the Energy Information Administration (EIA). The goal in maintaining CSEDS is to create historical time series of energy consumption by State that are defined as consistently as possible over time and across sectors. CSEDS exists for two principal reasons: (1) to provide State energy consumption estimates to Members of Congress, Federal and State agencies, and the general public and (2) to provide the historical series necessary for EIA's energy models. To the degree possible, energy consumption has been assigned to five sectors: residential, commercial, industrial, transportation, and electric utility sectors. Fuels covered are coal, natural gas, petroleum, nuclear electric power, hydroelectric power, biomass, and other, defined as electric power generated from geothermal, wind, photovoltaic, and solar thermal energy. 322 tabs.

  4. Calibration and Measurement Uncertainty Estimation of Radiometric Data: Preprint

    SciTech Connect (OSTI)

    Habte, A.; Sengupta, M.; Reda, I.; Andreas, A.; Konings, J.

    2014-11-01

    Evaluating the performance of photovoltaic cells, modules, and arrays that form large solar deployments relies on accurate measurements of the available solar resource. Therefore, determining the accuracy of these solar radiation measurements provides a better understanding of investment risks. This paper provides guidelines and recommended procedures for estimating the uncertainty in calibrations and measurements by radiometers using methods that follow the International Bureau of Weights and Measures Guide to the Expression of Uncertainty in Measurement (GUM). Standardized analysis based on these procedures ensures that the uncertainty quoted is well documented.
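
    As a reminder of the calculation the GUM prescribes, the sketch below combines sensitivity-weighted standard uncertainties in quadrature and applies a coverage factor; the component values are placeholders, not the radiometer uncertainty budget from the paper.

    ```python
    # GUM-style combined and expanded uncertainty (illustrative components only).
    import math

    def combined_uncertainty(components, k=1.96):
        """components: list of (sensitivity_coefficient, standard_uncertainty) pairs."""
        u_c = math.sqrt(sum((c * u) ** 2 for c, u in components))
        return u_c, k * u_c          # combined and expanded (coverage factor k) uncertainty

    # e.g. calibration, data-logger, and zero-offset contributions for an irradiance reading
    components = [(1.0, 4.0), (1.0, 1.5), (1.0, 2.0)]   # W/m^2, illustrative
    u_c, U = combined_uncertainty(components)
    print(f"combined: {u_c:.2f} W/m^2, expanded (~95%): {U:.2f} W/m^2")
    ```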

  5. Combined Estimation of Hydrogeologic Conceptual Model and Parameter Uncertainty

    SciTech Connect (OSTI)

    Meyer, Philip D.; Ye, Ming; Neuman, Shlomo P.; Cantrell, Kirk J.

    2004-03-01

    The objective of the research described in this report is the development and application of a methodology for comprehensively assessing the hydrogeologic uncertainties involved in dose assessment, including uncertainties associated with conceptual models, parameters, and scenarios. This report describes and applies a statistical method to quantitatively estimate the combined uncertainty in model predictions arising from conceptual model and parameter uncertainties. The method relies on model averaging to combine the predictions of a set of alternative models. Implementation is driven by the available data. When there is minimal site-specific data the method can be carried out with prior parameter estimates based on generic data and subjective prior model probabilities. For sites with observations of system behavior (and optionally data characterizing model parameters), the method uses model calibration to update the prior parameter estimates and model probabilities based on the correspondence between model predictions and site observations. The set of model alternatives can contain both simplified and complex models, with the requirement that all models be based on the same set of data. The method was applied to the geostatistical modeling of air permeability at a fractured rock site. Seven alternative variogram models of log air permeability were considered to represent data from single-hole pneumatic injection tests in six boreholes at the site. Unbiased maximum likelihood estimates of variogram and drift parameters were obtained for each model. Standard information criteria provided an ambiguous ranking of the models, which would not justify selecting one of them and discarding all others as is commonly done in practice. Instead, some of the models were eliminated based on their negligibly small updated probabilities and the rest were used to project the measured log permeabilities by kriging onto a rock volume containing the six boreholes. These four
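
    A generic sketch of the model-averaging arithmetic is given below: information-criterion differences are converted to model weights, and the model-averaged prediction variance combines within-model and between-model terms. The criterion values, predictions, and variances are placeholders, and this is a simplified stand-in for, not a reproduction of, the report's procedure.

    ```python
    # Generic model averaging with information-criterion weights (illustrative values).
    import numpy as np

    def model_weights(ic_values):
        """Posterior-like model probabilities from information criteria (lower is better)."""
        d = np.asarray(ic_values) - np.min(ic_values)
        w = np.exp(-0.5 * d)
        return w / w.sum()

    ic = [132.4, 130.1, 135.8, 131.0]            # e.g. criterion values for 4 variogram models
    preds = np.array([2.1, 2.4, 1.8, 2.3])       # each model's prediction at a location
    var = np.array([0.30, 0.25, 0.40, 0.28])     # each model's predictive variance

    w = model_weights(ic)
    mean = np.dot(w, preds)                                  # model-averaged prediction
    total_var = np.dot(w, var + (preds - mean) ** 2)         # within- plus between-model variance
    print(w, mean, total_var)
    ```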

  6. Estimating vehicle height using homographic projections

    DOE Patents [OSTI]

    Cunningham, Mark F; Fabris, Lorenzo; Gee, Timothy F; Ghebretati, Jr., Frezghi H; Goddard, James S; Karnowski, Thomas P; Ziock, Klaus-peter

    2013-07-16

    Multiple homography transformations corresponding to different heights are generated in the field of view. A group of salient points within a common estimated height range is identified in a time series of video images of a moving object. Inter-salient point distances are measured for the group of salient points under the multiple homography transformations corresponding to the different heights. Variations in the inter-salient point distances under the multiple homography transformations are compared. The height of the group of salient points is estimated to be the height corresponding to the homography transformation that minimizes the variations.
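
    A loose sketch of that selection logic is shown below (not the patented implementation): tracked salient points are mapped through each candidate height's homography, inter-point distances are computed per frame, and the height whose distances vary least over the frame sequence is returned. The homography dictionary and point tracks are assumed inputs, and the same salient points are assumed to be tracked in every frame.

    ```python
    # Illustrative height selection by minimizing inter-point distance variation.
    import numpy as np

    def apply_homography(H, pts):
        """Map Nx2 pixel points through a 3x3 homography."""
        p = np.hstack([pts, np.ones((len(pts), 1))]) @ H.T
        return p[:, :2] / p[:, 2:3]

    def estimate_height(homographies_by_height, tracks):
        """tracks: list of Nx2 point arrays, one per video frame (same N points each frame)."""
        best_height, best_score = None, np.inf
        for height, H in homographies_by_height.items():
            dists = []
            for pts in tracks:
                mapped = apply_homography(H, np.asarray(pts, dtype=float))
                diff = mapped[:, None, :] - mapped[None, :, :]          # pairwise differences
                d = np.linalg.norm(diff, axis=-1)[np.triu_indices(len(pts), 1)]
                dists.append(d)
            score = np.var(np.stack(dists), axis=0).sum()               # variation across frames
            if score < best_score:
                best_height, best_score = height, score
        return best_height
    ```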

  7. Process Equipment Cost Estimation, Final Report

    SciTech Connect (OSTI)

    H.P. Loh; Jennifer Lyons; Charles W. White, III

    2002-01-01

    This report presents generic cost curves for several equipment types generated using ICARUS Process Evaluator. The curves give Purchased Equipment Cost as a function of a capacity variable. This work was performed to assist NETL engineers and scientists in performing rapid, order of magnitude level cost estimates or as an aid in evaluating the reasonableness of cost estimates submitted with proposed systems studies or proposals for new processes. The specific equipment types contained in this report were selected to represent a relatively comprehensive set of conventional chemical process equipment types.

  8. CONTAMINATED SOIL VOLUME ESTIMATE TRACKING METHODOLOGY

    SciTech Connect (OSTI)

    Durham, L.A.; Johnson, R.L.; Rieman, C.; Kenna, T.; Pilon, R.

    2003-02-27

    The U.S. Army Corps of Engineers (USACE) is conducting a cleanup of radiologically contaminated properties under the Formerly Utilized Sites Remedial Action Program (FUSRAP). The largest cost element for most of the FUSRAP sites is the transportation and disposal of contaminated soil. Project managers and engineers need an estimate of the volume of contaminated soil to determine project costs and schedule. Once excavation activities begin and additional remedial action data are collected, the actual quantity of contaminated soil often deviates from the original estimate, resulting in cost and schedule impacts to the project. The project costs and schedule need to be updated frequently by tracking the actual quantities of excavated soil and the contaminated soil remaining during the life of a remedial action project. A soil volume estimate tracking methodology was developed to give project managers and engineers better control of project costs and schedule. For the FUSRAP Linde site, an estimate of the initial volume of in situ soil above the specified cleanup guidelines was calculated on the basis of discrete soil sample data and other relevant data using indicator geostatistical techniques combined with Bayesian analysis. During the remedial action, updated volume estimates of remaining in situ soils requiring excavation were calculated on a periodic basis. In addition to taking into account the volume of soil that had been excavated, the updated volume estimates incorporated both new gamma walkover surveys and discrete sample data collected as part of the remedial action. A civil survey company provided periodic estimates of the actual in situ soil volumes excavated. By using the results from the civil survey of actual in situ volumes excavated and the updated estimate of the remaining volume of contaminated soil requiring excavation, the USACE Buffalo District was able to forecast and update project costs and schedule. The soil volume
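
    A minimal Python sketch of the tracking idea is given below, with placeholder numbers rather than Linde-site data: each periodic update pairs the surveyed volume excavated to date with the current geostatistical estimate of the volume remaining above the cleanup guidelines, and the forecast of the total contaminated-soil volume is their sum, from which costs can be scaled with an assumed unit disposal rate.

        from dataclasses import dataclass

        @dataclass
        class VolumeUpdate:
            """One periodic update during remediation (volumes in cubic yards)."""
            excavated_to_date: float    # surveyed in situ volume actually excavated
            remaining_estimate: float   # updated estimate of soil still above guidelines

        def forecast_totals(update: VolumeUpdate, unit_cost_per_cy: float):
            """Forecast final soil volume and disposal cost from the latest update."""
            total_volume = update.excavated_to_date + update.remaining_estimate
            return total_volume, total_volume * unit_cost_per_cy

        # Placeholder example, not actual FUSRAP data.
        u = VolumeUpdate(excavated_to_date=12_500.0, remaining_estimate=4_300.0)
        print(forecast_totals(u, unit_cost_per_cy=350.0))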

  9. A semantic characterization of an algorithm for estimating others' beliefs from observation

    SciTech Connect (OSTI)

    Isozaki, Hideki; Katsuno, Hirofumi

    1996-12-31

    Human beings often estimate others' beliefs and intentions when they interact with others. Estimation of others' beliefs will also be useful in controlling the behavior and utterances of artificial agents, especially when lines of communication are unstable or slow. But devising such estimation algorithms and the background theories for them is difficult, because many factors affect one's beliefs. We have proposed an algorithm that estimates others' beliefs from observation in a changing world. Experimental results show that this algorithm returns natural answers to various queries. However, the algorithm is only heuristic, and how it deals with beliefs and their changes is not entirely clear. We propose semantics based on a nonstandard structure for modal logic. Using these semantics, we shed light on the logical meaning of the belief estimation that the algorithm performs. We also discuss how the semantics and the algorithm can be generalized.
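
    The abstract does not spell out the algorithm, so the Python sketch below is only a generic possible-worlds illustration of estimating beliefs from observation, not the authors' method: candidate belief states that are inconsistent with an observed action are filtered out, and whatever holds in all surviving candidates is ascribed to the other agent.

        def estimate_beliefs(candidate_worlds, observations, consistent):
            """Generic possible-worlds filtering, shown only as an illustration.
            candidate_worlds: iterable of hypothesized belief states (sets of propositions).
            observations: sequence of observed actions or events.
            consistent(world, obs): True if that belief state could have produced obs."""
            worlds = list(candidate_worlds)
            for obs in observations:
                worlds = [w for w in worlds if consistent(w, obs)]
            if not worlds:
                return set()
            # Propositions true in every surviving world are ascribed as beliefs.
            return set.intersection(*(set(w) for w in worlds))

        # Toy example with belief states represented as sets of propositions.
        candidates = [{"door_open", "light_on"}, {"door_open"}, {"light_on"}]
        obs = ["walks_through_door"]
        consistent = lambda w, o: "door_open" in w if o == "walks_through_door" else True
        print(estimate_beliefs(candidates, obs))   # {'door_open'}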

  10. Uncertainty Estimation Improves Energy Measurement and Verification Procedures

    SciTech Connect (OSTI)

    Walter, Travis; Price, Phillip N.; Sohn, Michael D.

    2014-05-14

    Implementing energy conservation measures in buildings can reduce energy costs and environmental impacts, but such measures cost money to implement. Intelligent investment strategies therefore require the ability to quantify energy savings by comparing the actual energy used to how much energy would have been used in the absence of the conservation measures (known as the baseline energy use). Methods exist for predicting baseline energy use, but a limitation of most statistical methods reported in the literature is inadequate quantification of the uncertainty in baseline energy use predictions. Estimation of uncertainty, however, is essential for weighing the risks of investing in retrofits. Most commercial buildings have, or soon will have, electricity meters capable of providing data at short time intervals. These data provide new opportunities to quantify uncertainty in baseline predictions, and to do so after shorter measurement durations than are traditionally used. In this paper, we show that uncertainty estimation provides greater measurement and verification (M&V) information and helps to overcome some of the difficulties in deciding how much data are needed to develop baseline models and to confirm energy savings. We also show that cross-validation is an effective method for computing uncertainty. In so doing, we extend a simple regression-based method of predicting energy use from short-interval meter data. We demonstrate the methods by predicting energy use in 17 real commercial buildings. We discuss the benefits of uncertainty estimates, which can provide actionable decision-making information for investing in energy conservation measures.
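
    As a sketch of the cross-validation idea, the Python function below fits a simple ordinary-least-squares baseline (the choice of features, such as outdoor temperature and time-of-week indicators, is an assumption rather than the paper's exact model) and estimates baseline-prediction uncertainty from out-of-sample residuals.

        import numpy as np

        def baseline_cv_uncertainty(X, y, n_folds=10, seed=0):
            """Estimate baseline-prediction uncertainty by k-fold cross-validation.
            X: (n, p) feature matrix (e.g., temperature, time-of-week indicators).
            y: (n,) metered energy use. Returns the RMSE of out-of-sample residuals."""
            rng = np.random.default_rng(seed)
            folds = np.array_split(rng.permutation(len(y)), n_folds)
            residuals = []
            for k in range(n_folds):
                test = folds[k]
                train = np.concatenate([folds[j] for j in range(n_folds) if j != k])
                # Ordinary least squares with an intercept column.
                A_train = np.column_stack([np.ones(len(train)), X[train]])
                A_test = np.column_stack([np.ones(len(test)), X[test]])
                coef, *_ = np.linalg.lstsq(A_train, y[train], rcond=None)
                residuals.append(y[test] - A_test @ coef)
            residuals = np.concatenate(residuals)
            return float(np.sqrt(np.mean(residuals ** 2)))

        # Example with synthetic hourly data: energy ~ temperature + noise.
        rng = np.random.default_rng(1)
        temp = rng.uniform(0.0, 30.0, size=2000)
        energy = 50.0 + 2.5 * temp + rng.normal(0.0, 5.0, size=2000)
        print(baseline_cv_uncertainty(temp.reshape(-1, 1), energy))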