National Library of Energy BETA

Sample records for quality metrics level

  1. Label-invariant Mesh Quality Metrics. (Conference) | SciTech...

    Office of Scientific and Technical Information (OSTI)

    Label-invariant Mesh Quality Metrics. Abstract not provided. Authors: Knupp, Patrick. Publication ...

  2. ARM - Evaluation Product - AERI Data Quality Metric (AERI-QC)

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Evaluation Product: AERI Data Quality Metric (AERI-QC). Ancillary NetCDF file to be used with the regular AERI data files to document times when the data may not be correct. Use the Data File Inventory tool to view data availability at the file level. Data Details contact: David Turner, National Oceanic and

  3. Analysis of Solar Cell Quality Using Voltage Metrics: Preprint

    SciTech Connect (OSTI)

    Toberer, E. S.; Tamboli, A. C.; Steiner, M.; Kurtz, S.

    2012-06-01

    The highest efficiency solar cells provide both excellent voltage and current. Of these, the open-circuit voltage (Voc) is more frequently viewed as an indicator of the material quality. However, since the Voc also depends on the band gap of the material, the difference between the band gap and the Voc is a better metric for comparing material quality of unlike materials. To take this one step further, since Voc also depends on the shape of the absorption edge, we propose to use the ultimate metric: the difference between the measured Voc and the Voc calculated from the external quantum efficiency using a detailed balance approach. This metric is less sensitive to changes in cell design and definition of band gap. The paper defines how to implement this metric and demonstrates how it can be useful in tracking improvements in Voc, especially as Voc approaches its theoretical maximum.
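
    The offset-style metrics described above can be illustrated numerically. The sketch below is not from the paper; the EQE curve, band gap, and measured Jsc/Voc are invented inputs. It computes the conventional band-gap-minus-Voc offset and a detailed-balance (radiative-limit) Voc from the external quantum efficiency, so the proposed metric is the gap between that calculated Voc and the measured one.

```python
import numpy as np

# Physical constants (SI)
q  = 1.602176634e-19   # elementary charge, C
h  = 6.62607015e-34    # Planck constant, J s
c  = 2.99792458e8      # speed of light, m/s
kB = 1.380649e-23      # Boltzmann constant, J/K
T  = 300.0             # cell temperature, K

def blackbody_photon_flux(E_eV):
    """Ambient blackbody photon flux (photons m^-2 s^-1 eV^-1) at temperature T."""
    E = E_eV * q
    return (2.0 * np.pi * E**2 / (h**3 * c**2)) / np.expm1(E / (kB * T)) * q

def radiative_voc(E_eV, eqe, jsc):
    """Detailed-balance Voc implied by an EQE curve and a measured Jsc."""
    flux = eqe * blackbody_photon_flux(E_eV)
    j0_rad = q * np.sum(0.5 * (flux[1:] + flux[:-1]) * np.diff(E_eV))  # A/m^2
    return (kB * T / q) * np.log(jsc / j0_rad + 1.0)

# Hypothetical inputs, for illustration only
E = np.linspace(0.5, 4.0, 4000)                  # photon energy grid, eV
eg = 1.42                                        # nominal band gap, eV
eqe = 0.90 / (1.0 + np.exp(-(E - eg) / 0.02))    # step-like EQE with a soft edge
jsc_meas = 300.0                                 # measured Jsc, A/m^2
voc_meas = 1.05                                  # measured Voc, V

voc_rad = radiative_voc(E, eqe, jsc_meas)
print(f"Eg - Voc offset          : {eg - voc_meas:.3f} V")
print(f"detailed-balance Voc(EQE): {voc_rad:.3f} V")
print(f"proposed metric          : {voc_rad - voc_meas:.3f} V")
```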

  4. Metrics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Los Alamos expands its innovation network by engaging in sponsored research and licensing across technical disciplines. These agreements are the basis of a working relationship with industry and other research institutions and highlight the diversity of our collaborations. Los Alamos has a remarkable 70-year legacy of creating entirely new technologies that have revolutionized the country's understanding of science and engineering. Collaborations data from Fiscal Year 2014 (FY14).

  5. Development of Technology Readiness Level (TRL) Metrics and Risk Measures

    SciTech Connect (OSTI)

    Engel, David W.; Dalton, Angela C.; Anderson, K. K.; Sivaramakrishnan, Chandrika; Lansing, Carina

    2012-10-01

    This is an internal project milestone report to document the CCSI Element 7 team's progress on developing Technology Readiness Level (TRL) metrics and risk measures. In this report, we provide a brief overview of the current technology readiness assessment research, document the development of technology readiness levels (TRLs) specific to carbon capture technologies, describe the risk measures and uncertainty quantification approaches used in our research, and conclude by discussing the next steps that the CCSI Task 7 team aims to accomplish.

  6. GPRA 2003 quality metrics methodology and results: Office of Industrial Technologies

    SciTech Connect (OSTI)

    None, None

    2002-04-19

    This report describes the results, calculations, and assumptions underlying the GPRA 2003 Quality Metrics results for all Planning Units within the Office of Industrial Technologies.

  7. Large-scale seismic waveform quality metric calculation using Hadoop

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Magana-Zook, Steven; Gaylord, Jessie M.; Knapp, Douglas R.; Dodge, Douglas A.; Ruppert, Stanley D.

    2016-05-27

    Here in this work we investigated the suitability of Hadoop MapReduce and Apache Spark for large-scale computation of seismic waveform quality metrics by comparing their performance with that of a traditional distributed implementation. The Incorporated Research Institutions for Seismology (IRIS) Data Management Center (DMC) provided 43 terabytes of broadband waveform data of which 5.1 TB of data were processed with the traditional architecture, and the full 43 TB were processed using MapReduce and Spark. Maximum performance of ~0.56 terabytes per hour was achieved using all 5 nodes of the traditional implementation. We noted that I/O dominated processing, and that I/O performance was deteriorating with the addition of the 5th node. Data collected from this experiment provided the baseline against which the Hadoop results were compared. Next, we processed the full 43 TB dataset using both MapReduce and Apache Spark on our 18-node Hadoop cluster. We conducted these experiments multiple times with various subsets of the data so that we could build models to predict performance as a function of dataset size. We found that both MapReduce and Spark significantly outperformed the traditional reference implementation. At a dataset size of 5.1 terabytes, both Spark and MapReduce were about 15 times faster than the reference implementation. Furthermore, our performance models predict that for a dataset of 350 terabytes, Spark running on a 100-node cluster would be about 265 times faster than the reference implementation. We do not expect that the reference implementation deployed on a 100-node cluster would perform significantly better than on the 5-node cluster because the I/O performance cannot be made to scale. Finally, we note that although Big Data technologies clearly provide a way to process seismic waveform datasets in a high-performance and scalable manner, the technology is still rapidly changing, requires a high degree of investment in personnel, and will
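
    The abstract does not include code, but the MapReduce/Spark pattern it describes, distributing a per-waveform quality-metric computation over many files, can be sketched briefly. Everything concrete below (the .npy file layout, the data path, and the toy metrics: RMS, peak, zero-sample fraction) is an assumption for illustration, not the study's actual metrics or data format.

```python
# Minimal PySpark sketch: map a per-waveform quality-metric function over many files.
import glob
import numpy as np
from pyspark.sql import SparkSession

def waveform_metrics(path):
    """Compute a few simple quality metrics for one waveform file."""
    x = np.load(path).astype(float)
    return {
        "path": path,
        "n_samples": int(x.size),
        "rms": float(np.sqrt(np.mean(x ** 2))),
        "peak": float(np.max(np.abs(x))),
        "zero_fraction": float(np.mean(x == 0.0)),  # crude gap/dead-channel proxy
    }

if __name__ == "__main__":
    spark = SparkSession.builder.appName("waveform-qc").getOrCreate()
    sc = spark.sparkContext

    paths = glob.glob("/data/waveforms/*.npy")      # hypothetical location
    results = (
        sc.parallelize(paths, numSlices=max(1, len(paths) // 100))
          .map(waveform_metrics)
          .collect()
    )
    for r in results[:10]:
        print(r)
    spark.stop()
```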

  8. Quantitative metrics for assessment of chemical image quality and spatial resolution

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Kertesz, Vilmos; Cahill, John F.; Van Berkel, Gary J.

    2016-02-28

    Rationale: Currently objective/quantitative descriptions of the quality and spatial resolution of mass spectrometry derived chemical images are not standardized. Development of these standardized metrics is required to objectively describe chemical imaging capabilities of existing and/or new mass spectrometry imaging technologies. Such metrics would allow unbiased judgment of intra-laboratory advancement and/or inter-laboratory comparison for these technologies if used together with standardized surfaces. Methods: We developed two image metrics, viz., chemical image contrast (ChemIC) based on signal-to-noise related statistical measures on chemical image pixels and corrected resolving power factor (cRPF) constructed from statistical analysis of mass-to-charge chronograms across features of interest in an image. These metrics, quantifying chemical image quality and spatial resolution, respectively, were used to evaluate chemical images of a model photoresist patterned surface collected using a laser ablation/liquid vortex capture mass spectrometry imaging system under different instrument operational parameters. Results: The calculated ChemIC and cRPF metrics determined in an unbiased fashion the relative ranking of chemical image quality obtained with the laser ablation/liquid vortex capture mass spectrometry imaging system. These rankings were used to show that both chemical image contrast and spatial resolution deteriorated with increasing surface scan speed, increased lane spacing and decreasing size of surface features. Conclusions: ChemIC and cRPF, respectively, were developed and successfully applied for the objective description of chemical image quality and spatial resolution of chemical images collected from model surfaces using a laser ablation/liquid vortex capture mass spectrometry imaging system.

  9. Metric Presentation

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    ... MODERN GRID STRATEGY 14 - Value Metrics, Work to date: Reliability - outage duration and frequency, momentary outages; Power Quality measures; Security - ratio of distributed ...

  10. Using research metrics to evaluate the International Atomic Energy Agency guidelines on quality assurance for R&D

    SciTech Connect (OSTI)

    Bodnarczuk, M.

    1994-06-01

    The objective of the International Atomic Energy Agency (IAEA) Guidelines on Quality Assurance for R&D is to provide guidance for developing quality assurance (QA) programs for R&D work on items, services, and processes important to safety, and to support the siting, design, construction, commissioning, operation, and decommissioning of nuclear facilities. The standard approach to writing papers describing new quality guidelines documents is to present a descriptive overview of the contents of the document. I will depart from this approach. Instead, I will first discuss a conceptual framework of metrics for evaluating and improving basic and applied experimental science as well as the associated role that quality management should play in understanding and implementing these metrics. I will conclude by evaluating how well the IAEA document addresses the metrics from this conceptual framework and the broader principles of quality management.

  11. Derivation of a Levelized Cost of Coating (LCOC) metric for evaluation of solar selective absorber materials

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Ho, C. K.; Pacheco, J. E.

    2015-06-05

    A new metric, the Levelized Cost of Coating (LCOC), is derived in this paper to evaluate and compare alternative solar selective absorber coatings against a baseline coating (Pyromark 2500). In contrast to previous metrics that focused only on the optical performance of the coating, the LCOC includes costs, durability, and optical performance for more comprehensive comparisons among candidate materials. The LCOC is defined as the annualized marginal cost of the coating to produce a baseline annual thermal energy production. Costs include the cost of materials and labor for initial application and reapplication of the coating, as well as the cost of additional or fewer heliostats to yield the same annual thermal energy production as the baseline coating. Results show that important factors impacting the LCOC include the initial solar absorptance, thermal emittance, reapplication interval, degradation rate, reapplication cost, and downtime during reapplication. The LCOC can also be used to determine the optimal reapplication interval to minimize the levelized cost of energy production. As a result, similar methods can be applied more generally to determine the levelized cost of component for other applications and systems.

  12. Derivation of a Levelized Cost of Coating (LCOC) metric for evaluation of solar selective absorber materials

    SciTech Connect (OSTI)

    Ho, C. K.; Pacheco, J. E.

    2015-06-05

    A new metric, the Levelized Cost of Coating (LCOC), is derived in this paper to evaluate and compare alternative solar selective absorber coatings against a baseline coating (Pyromark 2500). In contrast to previous metrics that focused only on the optical performance of the coating, the LCOC includes costs, durability, and optical performance for more comprehensive comparisons among candidate materials. The LCOC is defined as the annualized marginal cost of the coating to produce a baseline annual thermal energy production. Costs include the cost of materials and labor for initial application and reapplication of the coating, as well as the cost of additional or fewer heliostats to yield the same annual thermal energy production as the baseline coating. Results show that important factors impacting the LCOC include the initial solar absorptance, thermal emittance, reapplication interval, degradation rate, reapplication cost, and downtime during reapplication. The LCOC can also be used to determine the optimal reapplication interval to minimize the levelized cost of energy production. As a result, similar methods can be applied more generally to determine the levelized cost of component for other applications and systems.
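
    A simplified sketch of an LCOC-style calculation follows. The cost model is a stand-in consistent with the definition quoted above (annualized application and reapplication costs plus the cost of extra or fewer heliostats at fixed baseline energy); the formula details and all numbers are invented for illustration and may differ from the paper's derivation.

```python
# Illustrative levelized-cost-of-coating style calculation (not the paper's exact model).

def capital_recovery_factor(rate, years):
    """Annualize an up-front cost over `years` at discount `rate`."""
    if rate == 0:
        return 1.0 / years
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

def lcoc(apply_cost, reapply_cost, reapply_interval_yr,
         extra_heliostat_cost, discount_rate=0.07, plant_life_yr=30):
    """Annualized marginal cost of a coating ($/yr) at the baseline annual energy.

    apply_cost           : materials + labor for the initial application ($)
    reapply_cost         : materials + labor + downtime cost per reapplication ($)
    reapply_interval_yr  : years between reapplications
    extra_heliostat_cost : cost of additional (or, if negative, fewer) heliostats
                           needed to match the baseline annual thermal energy ($)
    """
    crf = capital_recovery_factor(discount_rate, plant_life_yr)
    annualized_capital = crf * (apply_cost + extra_heliostat_cost)
    annualized_reapplication = reapply_cost / reapply_interval_yr
    return annualized_capital + annualized_reapplication

# Compare a hypothetical selective coating against a Pyromark-like baseline.
baseline = lcoc(apply_cost=2.0e5, reapply_cost=2.5e5, reapply_interval_yr=5,
                extra_heliostat_cost=0.0)
candidate = lcoc(apply_cost=6.0e5, reapply_cost=3.0e5, reapply_interval_yr=8,
                 extra_heliostat_cost=-1.5e6)   # better optics -> fewer heliostats
print(f"baseline  LCOC ~ ${baseline:,.0f}/yr")
print(f"candidate LCOC ~ ${candidate:,.0f}/yr")
```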

  13. Recommendations for mass spectrometry data quality metrics for open access data (corollary to the Amsterdam principles)

    SciTech Connect (OSTI)

    Kinsinger, Christopher R.; Apffel, James; Baker, Mark S.; Bian, Xiaopeng; Borchers, Christoph H.; Bradshaw, Ralph A.; Brusniak, Mi-Youn; Chan, Daniel W.; Deutsch, Eric W.; Domon, Bruno; Gorman, Jeff; Grimm, Rudolf; Hancock, William S.; Hermjakob, Henning; Horn, David; Hunter, Christie; Kolar, Patrik; Kraus, Hans-Joachim; Langen, Hanno; Linding, Rune; Moritz, Robert L.; Omenn, Gilbert S.; Orlando, Ron; Pandey, Akhilesh; Ping, Peipei; Rahbar, Amir; Rivers, Robert; Seymour, Sean L.; Simpson, Richard J.; Slotta, Douglas; Smith, Richard D.; Stein, Stephen E.; Tabb, David L.; Tagle, Danilo; Yates, John R.; Rodriguez, Henry

    2011-12-01

    Policies supporting the rapid and open sharing of proteomic data are being implemented by the leading journals in the field. The proteomics community is taking steps to ensure that data are made publicly accessible and are of high quality, a challenging task that requires the development and deployment of methods for measuring and documenting data quality metrics. On September 18, 2010, the U.S. National Cancer Institute (NCI) convened the 'International Workshop on Proteomic Data Quality Metrics' in Sydney, Australia, to identify and address issues facing the development and use of such methods for open access proteomics data. The stakeholders at the workshop enumerated the key principles underlying a framework for data quality assessment in mass spectrometry data that will meet the needs of the research community, journals, funding agencies, and data repositories. Attendees discussed and agreed upon two primary needs for the wide use of quality metrics: (i) an evolving list of comprehensive quality metrics and (ii) standards accompanied by software analytics. Attendees stressed the importance of increased education and training programs to promote reliable protocols in proteomics. This workshop report explores the historic precedents, key discussions, and necessary next steps to enhance the quality of open access data. By agreement, this article is published simultaneously in Proteomics, Proteomics Clinical Applications, Journal of Proteome Research, and Molecular and Cellular Proteomics, as a public service to the research community. The peer review process was a coordinated effort conducted by a panel of referees selected by the journals.

  14. SU-E-I-71: Quality Assessment of Surrogate Metrics in Multi-Atlas-Based Image Segmentation

    SciTech Connect (OSTI)

    Zhao, T; Ruan, D

    2015-06-15

    Purpose: With the ever-growing data of heterogeneous quality, relevance assessment of atlases becomes increasingly critical for multi-atlas-based image segmentation. However, there is no universally recognized best relevance metric and even a standard to compare amongst candidates remains elusive. This study, for the first time, designs a quantification to assess relevance metrics’ quality, based on a novel perspective of the metric as surrogate for inferring the inaccessible oracle geometric agreement. Methods: We first develop an inference model to relate surrogate metrics in image space to the underlying oracle relevance metric in segmentation label space, with a monotonically non-decreasing function subject to random perturbations. Subsequently, we investigate model parameters to reveal key contributing factors to surrogates’ ability in prognosticating the oracle relevance value, for the specific task of atlas selection. Finally, we design an effective contrast-to-noise ratio (eCNR) to quantify surrogates’ quality based on insights from these analyses and empirical observations. Results: The inference model was specialized to a linear function with normally distributed perturbations, with surrogate metric exemplified by several widely-used image similarity metrics, i.e., MSD/NCC/(N)MI. Surrogates’ behaviors in selecting the most relevant atlases were assessed under varying eCNR, showing that surrogates with high eCNR dominated those with low eCNR in retaining the most relevant atlases. In an end-to-end validation, NCC/(N)MI with eCNR of 0.12 compared to MSD with eCNR of 0.10 resulted in statistically better segmentation with mean DSC of about 0.85 and the first and third quartiles of (0.83, 0.89), compared to MSD with mean DSC of 0.84 and the first and third quartiles of (0.81, 0.89). Conclusion: The designed eCNR is capable of characterizing surrogate metrics’ quality in prognosticating the oracle relevance value. It has been demonstrated to be
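
    The surrogate image-similarity metrics named in the abstract (MSD, NCC, and mutual information) are standard quantities; a generic NumPy sketch of them is shown below using random stand-in images. The eCNR itself is not reproduced here.

```python
# Generic surrogate similarity metrics between two registered image arrays.
import numpy as np

def msd(a, b):
    """Mean squared difference (lower = more similar)."""
    return float(np.mean((a - b) ** 2))

def ncc(a, b):
    """Normalized cross-correlation (higher = more similar)."""
    a0, b0 = a - a.mean(), b - b.mean()
    return float(np.sum(a0 * b0) / (np.sqrt(np.sum(a0**2) * np.sum(b0**2)) + 1e-12))

def mutual_information(a, b, bins=64):
    """Histogram-based mutual information (higher = more similar)."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1, keepdims=True), pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(0)
target = rng.normal(size=(64, 64))
atlas = target + 0.3 * rng.normal(size=(64, 64))   # a "relevant" atlas
print(msd(target, atlas), ncc(target, atlas), mutual_information(target, atlas))
```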

  15. Knowledge-based prediction of plan quality metrics in intracranial stereotactic radiosurgery

    SciTech Connect (OSTI)

    Shiraishi, Satomi; Moore, Kevin L.; Tan, Jun; Olsen, Lindsey A.

    2015-02-15

    Purpose: The objective of this work was to develop a comprehensive knowledge-based methodology for predicting achievable dose–volume histograms (DVHs) and highly precise DVH-based quality metrics (QMs) in stereotactic radiosurgery/radiotherapy (SRS/SRT) plans. Accurate QM estimation can identify suboptimal treatment plans and provide target optimization objectives to standardize and improve treatment planning. Methods: Correlating observed dose as it relates to the geometric relationship of organs-at-risk (OARs) to planning target volumes (PTVs) yields mathematical models to predict achievable DVHs. In SRS, DVH-based QMs such as brain V10Gy (volume receiving 10 Gy or more), gradient measure (GM), and conformity index (CI) are used to evaluate plan quality. This study encompasses 223 linear accelerator-based SRS/SRT treatment plans (SRS plans) using volumetric-modulated arc therapy (VMAT), representing 95% of the institution’s VMAT radiosurgery load from the past four and a half years. Unfiltered models that use all available plans for the model training were built for each category with a stratification scheme based on target and OAR characteristics determined emergently through the initial modeling process. Model predictive accuracy is measured by the mean and standard deviation of the difference between clinical and predicted QMs, δQM = QM_clin − QM_pred, and a coefficient of determination, R². For categories with a large number of plans, refined models are constructed by automatic elimination of suspected suboptimal plans from the training set. Using the refined model as a presumed achievable standard, potentially suboptimal plans are identified. Predictions of QM improvement are validated via standardized replanning of 20 suspected suboptimal plans based on dosimetric predictions. The significance of the QM improvement is evaluated using the Wilcoxon signed rank test. Results: The most accurate predictions are obtained when plans are
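
    The DVH-based quality metrics listed above can be computed directly from a dose grid and structure masks. The sketch below uses common textbook conventions (an RTOG-style conformity index and a gradient measure defined as the difference of equivalent-sphere radii of the 50% and 100% prescription isodose volumes) on a synthetic spherical dose distribution; the paper's exact definitions may differ.

```python
import numpy as np

def v_dose(dose, mask, threshold_gy, voxel_cc):
    """Absolute volume (cc) of `mask` receiving at least `threshold_gy`."""
    return float(np.count_nonzero((dose >= threshold_gy) & mask) * voxel_cc)

def conformity_index(dose, ptv_mask, rx_gy, voxel_cc):
    """RTOG-style CI: prescription isodose volume / target volume."""
    piv = np.count_nonzero(dose >= rx_gy) * voxel_cc
    tv = np.count_nonzero(ptv_mask) * voxel_cc
    return piv / tv

def gradient_measure(dose, rx_gy, voxel_cc):
    """GM = r_eq(50% isodose) - r_eq(100% isodose), in cm."""
    def eq_radius(volume_cc):
        return (3.0 * volume_cc / (4.0 * np.pi)) ** (1.0 / 3.0)
    v100 = np.count_nonzero(dose >= rx_gy) * voxel_cc
    v50 = np.count_nonzero(dose >= 0.5 * rx_gy) * voxel_cc
    return eq_radius(v50) - eq_radius(v100)

# Tiny synthetic example: spherical dose fall-off around a spherical PTV.
n, voxel_cc = 64, 0.001  # 1 mm^3 voxels
zz, yy, xx = np.meshgrid(*([np.arange(n) - n / 2] * 3), indexing="ij")
r = np.sqrt(xx**2 + yy**2 + zz**2)                        # distance in mm
dose = 20.0 * np.exp(-np.maximum(r - 8.0, 0.0) / 4.0)     # 20 Gy inside r = 8 mm
ptv = r <= 8.0
brain = r <= 30.0

print("brain V10Gy (cc):", v_dose(dose, brain, 10.0, voxel_cc))
print("CI:", conformity_index(dose, ptv, 20.0, voxel_cc))
print("GM (cm):", gradient_measure(dose, 20.0, voxel_cc))
```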

  16. Determine metrics and set targets for soil quality on agriculture residue and energy crop pathways

    SciTech Connect (OSTI)

    Ian Bonner; David Muth

    2013-09-01

    There are three objectives for this project: 1) support OBP in meeting MYPP stated performance goals for the Sustainability Platform, 2) develop integrated feedstock production system designs that increase total productivity of the land, decrease delivered feedstock cost to the conversion facilities, and increase environmental performance of the production system, and 3) deliver to the bioenergy community robust datasets and flexible analysis tools for establishing sustainable and viable use of agricultural residues and dedicated energy crops. The key project outcome to date has been the development and deployment of a sustainable agricultural residue removal decision support framework. The modeling framework has been used to produce a revised national assessment of sustainable residue removal potential. The national assessment datasets are being used to update national resource assessment supply curves using POLYSIS. The residue removal modeling framework has also been enhanced to support high fidelity sub-field scale sustainable removal analyses. The framework has been deployed through a web application and a mobile application. The mobile application is being used extensively in the field with industry, research, and USDA NRCS partners to support and validate sustainable residue removal decisions. The results detailed in this report have set targets for increasing soil sustainability by focusing on primary soil quality indicators (total organic carbon and erosion) in two agricultural residue management pathways and a dedicated energy crop pathway. The two residue pathway targets were set to 1) increase residue removal by 50% while maintaining soil quality, and 2) increase soil quality by 5% as measured by Soil Management Assessment Framework indicators. The energy crop pathway was set to increase soil quality by 10% using these same indicators. To demonstrate the feasibility and impact of each of these targets, seven case studies spanning the US are presented

  17. STAR METRICS

    Broader source: Energy.gov [DOE]

    The Department of Energy continues to define Phase II of the STAR METRICS program, a collaborative initiative to track Research and Development expenditures and their outcomes. Visit the STAR METRICS website for...

  18. DOE JGI Quality Metrics; Approaches to Scaling and Improving Metagenome Assembly (Metagenomics Informatics Challenges Workshop: 10K Genomes at a Time)

    ScienceCinema (OSTI)

    Copeland, Alex [DOE JGI]; Brown, C Titus [Michigan State University]

    2013-01-22

    DOE JGI's Alex Copeland on "DOE JGI Quality Metrics" and Michigan State University's C. Titus Brown on "Approaches to Scaling and Improving Metagenome Assembly" at the Metagenomics Informatics Challenges Workshop held at the DOE JGI on October 12-13, 2011.

  19. Levelized cost of energy (LCOE) metric to characterize solar absorber coatings for the CSP industry

    SciTech Connect (OSTI)

    Boubault, Antoine; Ho, Clifford K.; Hall, Aaron; Lambert, Timothy N.; Ambrosini, Andrea

    2015-07-08

    The contribution of each component of a power generation plant to the levelized cost of energy (LCOE) can be estimated and used to increase the power output while reducing system operation and maintenance costs. The LCOE is used in order to quantify solar receiver coating influence on the LCOE of solar power towers. Two new parameters are introduced: the absolute levelized cost of coating (LCOC) and the LCOC efficiency. Depending on the material properties, aging, costs, and temperature, the absolute LCOC enables quantifying the cost-effectiveness of absorber coatings, as well as finding optimal operating conditions. The absolute LCOC is investigated for different hypothetical coatings and is demonstrated on Pyromark 2500 paint. Results show that absorber coatings yield lower LCOE values in most cases, even at significant costs. Optimal reapplication intervals range from one to five years. At receiver temperatures greater than 700 °C, non-selective coatings are not always worthwhile while durable selective coatings consistently reduce the LCOE, up to 12% of the value obtained for an uncoated receiver. Moreover, the absolute LCOC is a powerful tool to characterize and compare different coatings, not only considering their initial efficiencies but also including their durability.

  20. Levelized cost of energy (LCOE) metric to characterize solar absorber coatings for the CSP industry

    SciTech Connect (OSTI)

    Boubault, Antoine; Ho, Clifford K.; Hall, Aaron; Lambert, Timothy N.; Ambrosini, Andrea

    2015-07-08

    The contribution of each component of a power generation plant to the levelized cost of energy (LCOE) can be estimated and used to increase the power output while reducing system operation and maintenance costs. The LCOE is used in order to quantify solar receiver coating influence on the LCOE of solar power towers. Two new parameters are introduced: the absolute levelized cost of coating (LCOC) and the LCOC efficiency. Depending on the material properties, aging, costs, and temperature, the absolute LCOC enables quantifying the cost-effectiveness of absorber coatings, as well as finding optimal operating conditions. The absolute LCOC is investigated for different hypothetical coatings and is demonstrated on Pyromark 2500 paint. Results show that absorber coatings yield lower LCOE values in most cases, even at significant costs. Optimal reapplication intervals range from one to five years. At receiver temperatures greater than 700 °C, non-selective coatings are not always worthwhile while durable selective coatings consistently reduce the LCOE, up to 12% of the value obtained for an uncoated receiver. Moreover, the absolute LCOC is a powerful tool to characterize and compare different coatings, not only considering their initial efficiencies but also including their durability.

  1. Levelized cost of energy (LCOE) metric to characterize solar absorber coatings for the CSP industry

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Boubault, Antoine; Ho, Clifford K.; Hall, Aaron; Lambert, Timothy N.; Ambrosini, Andrea

    2015-07-08

    The contribution of each component of a power generation plant to the levelized cost of energy (LCOE) can be estimated and used to increase the power output while reducing system operation and maintenance costs. The LCOE is used in order to quantify solar receiver coating influence on the LCOE of solar power towers. Two new parameters are introduced: the absolute levelized cost of coating (LCOC) and the LCOC efficiency. Depending on the material properties, aging, costs, and temperature, the absolute LCOC enables quantifying the cost-effectiveness of absorber coatings, as well as finding optimal operating conditions. The absolute LCOC is investigated for different hypothetical coatings and is demonstrated on Pyromark 2500 paint. Results show that absorber coatings yield lower LCOE values in most cases, even at significant costs. Optimal reapplication intervals range from one to five years. At receiver temperatures greater than 700 °C, non-selective coatings are not always worthwhile while durable selective coatings consistently reduce the LCOE, up to 12% of the value obtained for an uncoated receiver. Moreover, the absolute LCOC is a powerful tool to characterize and compare different coatings, not only considering their initial efficiencies but also including their durability.

  2. Resilience Metrics

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    for the Quadrennial Energy Review Technical Workshop on Resilience Metrics for Energy Transmission and Distribution Infrastructure, April 28, 2014. Infrastructure Assurance Center ...

  3. Surveillance metrics sensitivity study.

    SciTech Connect (OSTI)

    Hamada, Michael S.; Bierbaum, Rene Lynn; Robertson, Alix A.

    2011-09-01

    In September of 2009, a Tri-Lab team was formed to develop a set of metrics relating to the NNSA nuclear weapon surveillance program. The purpose of the metrics was to develop a more quantitative and/or qualitative metric(s) describing the results of realized or non-realized surveillance activities on our confidence in reporting reliability and assessing the stockpile. As a part of this effort, a statistical sub-team investigated various techniques and developed a complementary set of statistical metrics that could serve as a foundation for characterizing aspects of meeting the surveillance program objectives. The metrics are a combination of tolerance limit calculations and power calculations, intending to answer level-of-confidence type questions with respect to the ability to detect certain undesirable behaviors (catastrophic defects, margin insufficiency defects, and deviations from a model). Note that the metrics are not intended to gauge product performance but instead the adequacy of surveillance. This report gives a short description of four metrics types that were explored and the results of a sensitivity study conducted to investigate their behavior for various inputs. The results of the sensitivity study can be used to set the risk parameters that specify the level of stockpile problem that the surveillance program should be addressing.
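
    The report's metrics themselves are not reproduced in this snippet, but the flavor of the "level-of-confidence" questions it mentions can be illustrated with standard binomial calculations: the probability that a sample of a given size catches a defect of a given prevalence, the sample size needed for a target detection confidence, and the zero-failure upper confidence bound on a defect rate. These are textbook formulas, not the Tri-Lab team's specific metrics.

```python
import math

def detection_probability(defect_rate, n_samples):
    """P(at least one defective unit appears in a random sample of n)."""
    return 1.0 - (1.0 - defect_rate) ** n_samples

def required_sample_size(defect_rate, confidence=0.95):
    """Smallest n giving at least `confidence` probability of detection."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - defect_rate))

def upper_bound_defect_rate(n_samples, confidence=0.95, n_defects=0):
    """One-sided upper confidence bound on the defect rate when zero defects
    are observed (exact binomial, zero-failure case)."""
    assert n_defects == 0, "closed form shown only for the zero-defect case"
    return 1.0 - (1.0 - confidence) ** (1.0 / n_samples)

print(detection_probability(0.01, 200))      # ~0.87
print(required_sample_size(0.01, 0.95))      # 299
print(upper_bound_defect_rate(200, 0.95))    # ~0.015
```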

  4. performance metrics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    performance metrics - Sandia Energy

  5. Metric Presentation

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    MODERN GRID STRATEGY: Smart Grid Metrics - Monitoring our Progress. Smart Grid Implementation Workshop, Joe Miller (Modern Grid Team), June 19, 2008. Conducted by the National Energy Technology Laboratory; funded by the U.S. Department of Energy, Office of Electricity Delivery and Energy Reliability. Many are working on the Smart Grid: FERC, DOE-OE Grid 2030, GridWise Alliance, EEI, NERC (FM), DOE/NETL Modern Grid

  6. Evaluating IMRT and VMAT dose accuracy: Practical examples of failure to detect systematic errors when applying a commonly used metric and action levels

    SciTech Connect (OSTI)

    Nelms, Benjamin E.; Chan, Maria F.; Jarry, Geneviève; Lemire, Matthieu; Lowden, John; Hampton, Carnell

    2013-11-15

    Purpose: This study (1) examines a variety of real-world cases where systematic errors were not detected by widely accepted methods for IMRT/VMAT dosimetric accuracy evaluation, and (2) drills down to identify failure modes and their corresponding means for detection, diagnosis, and mitigation. The primary goal of detailing these case studies is to explore different, more sensitive methods and metrics that could be used more effectively for evaluating accuracy of dose algorithms, delivery systems, and QA devices. Methods: The authors present seven real-world case studies representing a variety of combinations of the treatment planning system (TPS), linac, delivery modality, and systematic error type. These case studies are typical of what might be used as part of an IMRT or VMAT commissioning test suite, varying in complexity. Each case study is analyzed according to TG-119 instructions for gamma passing rates and action levels for per-beam and/or composite plan dosimetric QA. Then, each case study is analyzed in-depth with advanced diagnostic methods (dose profile examination, EPID-based measurements, dose difference pattern analysis, 3D measurement-guided dose reconstruction, and dose grid inspection) and more sensitive metrics (2% local normalization/2 mm DTA and estimated DVH comparisons). Results: For these case studies, the conventional 3%/3 mm gamma passing rates exceeded 99% for IMRT per-beam analyses and ranged from 93.9% to 100% for composite plan dose analysis, well above the TG-119 action levels of 90% and 88%, respectively. However, all cases had systematic errors that were detected only by using advanced diagnostic techniques and more sensitive metrics. The systematic errors caused variable but noteworthy impact, including estimated target dose coverage loss of up to 5.5% and local dose deviations up to 31.5%. Types of errors included TPS model settings, algorithm limitations, and modeling and alignment of QA phantoms in the TPS. Most of the errors were
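
    For readers unfamiliar with the gamma passing-rate metric referenced above, the following is a deliberately simplified 1-D version (global normalization, brute-force distance-to-agreement search, no interpolation); clinical tools operate on 2-D/3-D dose grids. The synthetic profile carries a 2% systematic dose error, which still passes a 3%/3 mm global criterion, echoing the paper's point that high passing rates can coexist with systematic errors.

```python
import numpy as np

def gamma_1d(ref_dose, eval_dose, spacing_mm, dose_tol=0.03, dta_mm=3.0):
    """Gamma value at each reference point (global dose normalization)."""
    x = np.arange(ref_dose.size) * spacing_mm
    norm = np.max(ref_dose)                       # global normalization dose
    gammas = np.empty(ref_dose.size)
    for i, (xi, di) in enumerate(zip(x, ref_dose)):
        dose_term = (eval_dose - di) / (dose_tol * norm)
        dist_term = (x - xi) / dta_mm
        gammas[i] = np.sqrt(dose_term ** 2 + dist_term ** 2).min()
    return gammas

# Synthetic beam profile with a deliberate 2% systematic dose error.
x_mm = np.arange(0, 100, 1.0)
ref = 100.0 / (1.0 + np.exp((np.abs(x_mm - 50) - 20) / 2.0))   # field edges at +/-20 mm
meas = 1.02 * ref
g = gamma_1d(ref, meas, spacing_mm=1.0)
print(f"3%/3mm global passing rate: {100.0 * np.mean(g <= 1.0):.1f}%")
```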

  7. Instructions for EM Corporate Performance Metrics | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Instructions for EM Corporate Performance Metrics. Quality Program Criteria. Instructions for EM Corporate Performance Metrics (128.47 KB). More Documents & Publications: EM Corporate QA Performance Metrics CPMS Tables; QA Corporate Board Meeting - July 2008

  8. Wind resource quality affected by high levels of renewables

    SciTech Connect (OSTI)

    Diakov, Victor

    2015-06-17

    For solar photovoltaic (PV) and wind resources, the capacity factor is an important parameter describing the quality of the resource. As the share of variable renewable resources (such as PV and wind) on the electric system is increasing, so does curtailment (and the fraction of time when it cannot be avoided). At high levels of renewable generation, curtailments effectively change the practical measure of resource quality from capacity factor to the incremental capacity factor. The latter accounts only for generation during hours of no curtailment and is directly connected with the marginal capital cost of renewable generators for a given level of renewable generation during the year. The Western U.S. wind generation is analyzed hourly for a system with 75% of annual generation from wind, and it is found that the value for the system of resources with equal capacity factors can vary by a factor of 2, which highlights the importance of using the incremental capacity factor instead. Finally, the effect is expected to be more pronounced in smaller geographic areas (or when transmission limitations are imposed) and less pronounced at lower levels of renewable energy in the system with less curtailment.
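
    The distinction between the two quantities can be made concrete with a few lines of NumPy; the hourly generation profile and the 20% curtailment assumption below are synthetic placeholders, not the paper's Western U.S. data.

```python
import numpy as np

rng = np.random.default_rng(1)
hours = 8760
nameplate_mw = 100.0
potential_mw = nameplate_mw * np.clip(rng.normal(0.35, 0.25, hours), 0.0, 1.0)
curtailed_hour = rng.random(hours) < 0.20   # hours with system-wide curtailment

# Conventional capacity factor: all potential generation counts.
capacity_factor = potential_mw.sum() / (nameplate_mw * hours)
# Incremental capacity factor: only generation in non-curtailed hours counts.
incremental_cf = potential_mw[~curtailed_hour].sum() / (nameplate_mw * hours)

print(f"capacity factor:             {capacity_factor:.3f}")
print(f"incremental capacity factor: {incremental_cf:.3f}")
```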

  9. Wind resource quality affected by high levels of renewables

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Diakov, Victor

    2015-06-17

    For solar photovoltaic (PV) and wind resources, the capacity factor is an important parameter describing the quality of the resource. As the share of variable renewable resources (such as PV and wind) on the electric system is increasing, so does curtailment (and the fraction of time when it cannot be avoided). At high levels of renewable generation, curtailments effectively change the practical measure of resource quality from capacity factor to the incremental capacity factor. The latter accounts only for generation during hours of no curtailment and is directly connected with the marginal capital cost of renewable generators for a given level of renewable generation during the year. The Western U.S. wind generation is analyzed hourly for a system with 75% of annual generation from wind, and it is found that the value for the system of resources with equal capacity factors can vary by a factor of 2, which highlights the importance of using the incremental capacity factor instead. Finally, the effect is expected to be more pronounced in smaller geographic areas (or when transmission limitations are imposed) and less pronounced at lower levels of renewable energy in the system with less curtailment.

  10. ARM - 2008 Performance Metrics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

  11. ARM - 2006 Performance Metrics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

  12. ARM - 2007 Performance Metrics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

  13. NIF Target Shot Metrics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    NIF Target Shot Metrics. Exp Cap - Experimental Capability; Natl Sec Appl - National Security Applications; DS - Discovery Science; ICF - Inertial Confinement Fusion; HED - High Energy Density

  14. Quality assurance program plan for low-level waste at the WSCF Laboratory

    SciTech Connect (OSTI)

    Morrison, J.A.

    1994-11-01

    The purpose of this document is to provide guidance for the implementation of the Quality Assurance Program Plan (QAPP) for the management of low-level waste at the Waste Sampling and Characterization Facility (WSCF) Laboratory Complex as required by WHC-CM-4-2, Quality Assurance Manual, which is based on Quality Assurance Program Requirements for Nuclear Facilities, NQA-1 (ASME).

  15. Metric Construction | Open Energy Information

    Open Energy Info (EERE)

    Metric Construction. Name: Metric Construction. Place: Boston, MA. Partnership with NREL: Yes. Partnership Type: Test...

  16. Software quality for 1997 - what works and what doesn't?

    SciTech Connect (OSTI)

    Jones, C.

    1997-11-01

    This presentation provides a view of software quality for 1997 - what works and what doesn't. For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD), the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to a parity with manufacturing quality levels.

  17. Cyber threat metrics.

    SciTech Connect (OSTI)

    Frye, Jason Neal; Veitch, Cynthia K.; Mateski, Mark Elliot; Michalski, John T.; Harris, James Mark; Trevino, Cassandra M.; Maruoka, Scott

    2012-03-01

    Threats are generally much easier to list than to describe, and much easier to describe than to measure. As a result, many organizations list threats. Fewer describe them in useful terms, and still fewer measure them in meaningful ways. This is particularly true in the dynamic and nebulous domain of cyber threats - a domain that tends to resist easy measurement and, in some cases, appears to defy any measurement. We believe the problem is tractable. In this report we describe threat metrics and models for characterizing threats consistently and unambiguously. The purpose of this report is to support the Operational Threat Assessment (OTA) phase of risk and vulnerability assessment. To this end, we focus on the task of characterizing cyber threats using consistent threat metrics and models. In particular, we address threat metrics and models for describing malicious cyber threats to US FCEB agencies and systems.

  18. Software quality in 1997

    SciTech Connect (OSTI)

    Jones, C.

    1997-11-01

    For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD), the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to a parity with manufacturing quality levels. Since software is on the critical path for many engineered products, and for internal business systems as well, the new approaches are starting to affect global competition and attract widespread international interest. It can be hypothesized that success in mastering software quality will be a key strategy for dominating global software markets in the 21st century.

  19. Metrics for Energy Resilience

    SciTech Connect (OSTI)

    Paul E. Roege; Zachary A. Collier; James Mancillas; John A. McDonagh; Igor Linkov

    2014-09-01

    Energy lies at the backbone of any advanced society and constitutes an essential prerequisite for economic growth, social order and national defense. However, there is an Achilles heel to today's energy and technology relationship; namely a precarious intimacy between energy and the fiscal, social, and technical systems it supports. Recently, widespread and persistent disruptions in energy systems have highlighted the extent of this dependence and the vulnerability of increasingly optimized systems to changing conditions. Resilience is an emerging concept that offers to reconcile considerations of performance under dynamic environments and across multiple time frames by supplementing traditionally static system performance measures to consider behaviors under changing conditions and complex interactions among physical, information and human domains. This paper identifies metrics useful to implement guidance for energy-related planning, design, investment, and operation. Recommendations are presented using a matrix format to provide a structured and comprehensive framework of metrics relevant to a system's energy resilience. The study synthesizes previously proposed metrics and emergent resilience literature to provide a multi-dimensional model intended for use by leaders and practitioners as they transform our energy posture from one of stasis and reaction to one that is proactive and which fosters sustainable growth.

  20. Ames Laboratory Metrics | The Ames Laboratory

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Metrics. Document Number: NA. Effective Date: 01/2016. File (public): ameslab_metrics_01-14-16

  1. EM Corporate Performance Metrics, Site Level

    Office of Environmental Management (EM)

    Table excerpt: ... completed 1 1 1 1; Grand Junction - Geographic Sites Eliminated (number completed): 3 2 2 2; Inhalation Toxicology Laboratory - LLLLMW disposed, Legacy (Stored) and NGW (cubic meters): 359...

  2. EM Corporate Performance Metrics, Complex Level

    Office of Environmental Management (EM)

    Table excerpt: ... 98,053 106,526; LLLLMW disposed, Legacy (Stored) and NGW (cubic meters): 1,558,048 1,209,709 1,237,779 1,265,849; MAAs eliminated (number of Material Access Areas): 35 30 30 30; Nuclear...

  3. Variable metric conjugate gradient methods

    SciTech Connect (OSTI)

    Barth, T.; Manteuffel, T.

    1994-07-01

    1.1 Motivation. In this paper we present a framework that includes many well known iterative methods for the solution of nonsymmetric linear systems of equations, Ax = b. Section 2 begins with a brief review of the conjugate gradient method. Next, we describe a broader class of methods, known as projection methods, to which the conjugate gradient (CG) method and most conjugate gradient-like methods belong. The concept of a method having either a fixed or a variable metric is introduced. Methods that have a metric are referred to as either fixed or variable metric methods. Some relationships between projection methods and fixed (variable) metric methods are discussed. The main emphasis of the remainder of this paper is on variable metric methods. In Section 3 we show how the biconjugate gradient (BCG), and the quasi-minimal residual (QMR) methods fit into this framework as variable metric methods. By modifying the underlying Lanczos biorthogonalization process used in the implementation of BCG and QMR, we obtain other variable metric methods. These, we refer to as generalizations of BCG and QMR.
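
    As context for the projection-method framework the paper reviews, here is a compact NumPy implementation of the classical conjugate gradient iteration for a symmetric positive definite system; the paper's variable-metric generalizations (BCG, QMR) are not reproduced here.

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=1000):
    """Solve A x = b for symmetric positive definite A."""
    x = np.zeros_like(b) if x0 is None else x0.copy()
    r = b - A @ x               # residual
    p = r.copy()                # search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

# Quick check on a random SPD system.
rng = np.random.default_rng(0)
M = rng.normal(size=(50, 50))
A = M @ M.T + 50 * np.eye(50)
b = rng.normal(size=50)
x = conjugate_gradient(A, b)
print("residual norm:", np.linalg.norm(A @ x - b))
```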

  4. Daylight metrics and energy savings

    SciTech Connect (OSTI)

    Mardaljevic, John; Heschong, Lisa; Lee, Eleanor

    2009-12-31

    The drive towards sustainable, low-energy buildings has increased the need for simple, yet accurate methods to evaluate whether a daylit building meets minimum standards for energy and human comfort performance. Current metrics do not account for the temporal and spatial aspects of daylight, nor of occupants' comfort or interventions. This paper reviews the historical basis of current compliance methods for achieving daylit buildings, proposes a technical basis for development of better metrics, and provides two case study examples to stimulate dialogue on how metrics can be applied in a practical, real-world context.
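
    One example of the kind of time-resolved metric this line of work argues for is daylight autonomy, the fraction of occupied hours in which daylight illuminance at a point meets a target level. The sketch below uses a synthetic hourly illuminance series; it is illustrative and not a metric taken from the paper itself.

```python
import numpy as np

def daylight_autonomy(illuminance_lux, occupied, target_lux=300.0):
    """Fraction of occupied hours with illuminance >= target."""
    occ = np.asarray(occupied, dtype=bool)
    return float(np.mean(illuminance_lux[occ] >= target_lux))

hours = np.arange(8760)
hour_of_day = hours % 24
# Crude synthetic daylight curve: zero at night, peaking mid-day.
illum = 2000.0 * np.clip(np.sin((hour_of_day - 6) / 12.0 * np.pi), 0.0, None)
occupied = (hour_of_day >= 8) & (hour_of_day < 18)      # 8:00-18:00 occupancy

print(f"Daylight autonomy (300 lux): {daylight_autonomy(illum, occupied):.2f}")
```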

  5. Metrics for border management systems.

    SciTech Connect (OSTI)

    Duggan, Ruth Ann

    2009-07-01

    There are as many unique and disparate manifestations of border systems as there are borders to protect. Border Security is a highly complex system analysis problem with global, regional, national, sector, and border element dimensions for land, water, and air domains. The complexity increases with the multiple, and sometimes conflicting, missions for regulating the flow of people and goods across borders, while securing them for national security. These systems include frontier border surveillance, immigration management and customs functions that must operate in a variety of weather, terrain, operational conditions, cultural constraints, and geopolitical contexts. As part of a Laboratory Directed Research and Development Project 08-684 (Year 1), the team developed a reference framework to decompose this complex system into international/regional, national, and border elements levels covering customs, immigration, and border policing functions. This generalized architecture is relevant to both domestic and international borders. As part of year two of this project (09-1204), the team determined relevant relative measures to better understand border management performance. This paper describes those relative metrics and how they can be used to improve border management systems.

  6. List of SEP Reporting Metrics

    Broader source: Energy.gov [DOE]

    DOE State Energy Program List of Reporting Metrics, which was produced by the Office of Energy Efficiency and Renewable Energy Weatherization and Intergovernmental Program for SEP and the Energy Efficiency and Conservation Block Grants (EECBG) programs.

  7. Quality assurance plan for the High Level Controller for the CBMS Block II

    SciTech Connect (OSTI)

    Reid, R.W.; Robbins, I.F.; Stewart, K.A.; Terry, C.L.; Whitaker, R.A.; Wolf, D.A.; Zager, J.C.

    1997-09-01

    This document establishes the software Quality Assurance Plan (QAP) for the High Level Controller for the Chemical and Biological Mass Spectrometer Block II (HLC/CBMS-II) project activities under the Computing, Robotics, and Education (CRE) Directorate management. It defines the requirements and assigns responsibilities for ensuring, with a high degree of confidence, that project objectives will be achieved as planned. The CBMS Program was awarded to ORNL by the US Army Chemical and Biological Defense command, Aberdeen Proving Ground, Maryland, to design the next version (Block II) mass spectrometer for the detection and identification of chemical and biological warfare agents, to fabricate four engineering prototypes, and to construct eight preproduction units. Section 1 of this document provides an introduction to the HLC/CBMS-II project QAP. Sections 2 and 3 describe the specific aspects of quality assurance as applicable to the project. Section 4 reviews the project approach to risk management. The Risk Management Matrix given in Appendix A is a tool to assess, prioritize, and prevent problems before they occur; therefore, the matrix will be reviewed and revised on a periodic basis. Appendix B shows the quality assurance criteria of the DOE Order 5700.6C and their applicability to this project.

  8. Common Carbon Metric | Open Energy Information

    Open Energy Info (EERE)

    Common Carbon Metric. Tool Summary. Name: Common Carbon Metric. Agency/Company/Organization: United Nations Environment Programme, World...

  9. ARM - Evaluation Product - Barrow Radiation Data (2009 metric)

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Evaluation Product: Barrow Radiation Data (2009 metric). Use the Data File Inventory tool to view data availability at the file level. Observations from a suite of radiometers including Precision Spectral Pyranometers (PSPs), Precision Infrared Radiometers (PIRs), and a Normal Incidence Pyrheliometer (NIP) are

  10. Performance Metrics Tiers | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    The performance metrics defined by the Commercial Buildings Integration Program offer different tiers of information to address the needs of various users. On this page you will find information about the various goals users are trying to achieve by using performance metrics and the tiers of metrics. Goals in Measuring Performance: Many individuals and groups are involved with a building over its lifetime, and all have different interests in and

  11. Thermodynamic Metrics and Optimal Paths

    SciTech Connect (OSTI)

    Sivak, David; Crooks, Gavin

    2012-05-08

    A fundamental problem in modern thermodynamics is how a molecular-scale machine performs useful work, while operating away from thermal equilibrium without excessive dissipation. To this end, we derive a friction tensor that induces a Riemannian manifold on the space of thermodynamic states. Within the linear-response regime, this metric structure controls the dissipation of finite-time transformations, and bestows optimal protocols with many useful properties. We discuss the connection to the existing thermodynamic length formalism, and demonstrate the utility of this metric by solving for optimal control parameter protocols in a simple nonequilibrium model.
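
    The objects named in the abstract can be written out explicitly. The expressions below restate the standard linear-response relations used in this line of work (a friction tensor built from force-fluctuation correlations, excess power as a quadratic form in the control velocities, and the thermodynamic-length bound on excess work); conventions and signs may differ from the paper itself.

```latex
% Friction tensor over control parameters \lambda, from equilibrium
% autocorrelations of the conjugate forces X_i = -\partial H/\partial\lambda^i:
\zeta_{ij}(\boldsymbol{\lambda}) = \beta \int_{0}^{\infty}
  \bigl\langle \delta X_i(t)\,\delta X_j(0) \bigr\rangle_{\boldsymbol{\lambda}}\,dt ,
\qquad \delta X_i = X_i - \langle X_i\rangle_{\boldsymbol{\lambda}} .

% Near-equilibrium excess (dissipated) power along a protocol \lambda(t):
P_{\mathrm{ex}}(t) \approx \frac{d\lambda^{i}}{dt}\,
  \zeta_{ij}\bigl(\boldsymbol{\lambda}(t)\bigr)\,\frac{d\lambda^{j}}{dt} .

% Thermodynamic length and the resulting bound on excess work over duration \tau
% (equality for protocols traversed at constant speed in this metric):
\mathcal{L} = \int_{0}^{\tau}
  \sqrt{\frac{d\lambda^{i}}{dt}\,\zeta_{ij}\,\frac{d\lambda^{j}}{dt}}\;dt ,
\qquad W_{\mathrm{ex}} \ge \frac{\mathcal{L}^{2}}{\tau} .
```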

  12. Resilient Control Systems Practical Metrics Basis for Defining Mission Impact

    SciTech Connect (OSTI)

    Craig G. Rieger

    2014-08-01

    "Resilience” describes how systems operate at an acceptable level of normalcy despite disturbances or threats. In this paper we first consider the cognitive, cyber-physical interdependencies inherent in critical infrastructure systems and how resilience differs from reliability to mitigate these risks. Terminology and metrics basis are provided to integrate the cognitive, cyber-physical aspects that should be considered when defining solutions for resilience. A practical approach is taken to roll this metrics basis up to system integrity and business case metrics that establish “proper operation” and “impact.” A notional chemical processing plant is the use case for demonstrating how the system integrity metrics can be applied to establish performance, and

  13. Metrics for Evaluating the Accuracy of Solar Power Forecasting: Preprint

    SciTech Connect (OSTI)

    Zhang, J.; Hodge, B. M.; Florita, A.; Lu, S.; Hamann, H. F.; Banunarayanan, V.

    2013-10-01

    Forecasting solar energy generation is a challenging task due to the variety of solar power systems and weather regimes encountered. Forecast inaccuracies can result in substantial economic losses and power system reliability issues. This paper presents a suite of generally applicable and value-based metrics for solar forecasting for a comprehensive set of scenarios (i.e., different time horizons, geographic locations, applications, etc.). In addition, a comprehensive framework is developed to analyze the sensitivity of the proposed metrics to three types of solar forecasting improvements using a design of experiments methodology, in conjunction with response surface and sensitivity analysis methods. The results show that the developed metrics can efficiently evaluate the quality of solar forecasts, and assess the economic and reliability impact of improved solar forecasting.
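
    As a concrete example of the simplest members of such a metric suite, the snippet below computes bias, MAE, RMSE, and capacity-normalized RMSE for a synthetic hourly solar series; the paper's full set also covers distributional and economic-value metrics not shown here.

```python
import numpy as np

def forecast_metrics(actual, forecast, capacity):
    """Basic statistical accuracy metrics for a power forecast (MW series)."""
    err = forecast - actual
    return {
        "MBE (MW)": float(err.mean()),                         # bias
        "MAE (MW)": float(np.abs(err).mean()),
        "RMSE (MW)": float(np.sqrt((err ** 2).mean())),
        "nRMSE (% of capacity)": float(100 * np.sqrt((err ** 2).mean()) / capacity),
    }

rng = np.random.default_rng(2)
actual = 50.0 * np.clip(np.sin(np.linspace(0, 365 * 2 * np.pi, 8760)), 0, None)
forecast = np.clip(actual + rng.normal(0, 5.0, actual.size), 0, None)
for name, value in forecast_metrics(actual, forecast, capacity=50.0).items():
    print(f"{name}: {value:.2f}")
```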

  14. Multi-Metric Sustainability Analysis

    SciTech Connect (OSTI)

    Cowlin, S.; Heimiller, D.; Macknick, J.; Mann, M.; Pless, J.; Munoz, D.

    2014-12-01

    A readily accessible framework that allows for evaluating impacts and comparing tradeoffs among factors in energy policy, expansion planning, and investment decision making is lacking. Recognizing this, the Joint Institute for Strategic Energy Analysis (JISEA) funded an exploration of multi-metric sustainability analysis (MMSA) to provide energy decision makers with a means to make more comprehensive comparisons of energy technologies. The resulting MMSA tool lets decision makers simultaneously compare technologies and potential deployment locations.

  15. Comparing Resource Adequacy Metrics: Preprint

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Comparing Resource Adequacy Metrics: Preprint. E. Ibanez and M. Milligan, National Renewable Energy Laboratory. To be presented at the 13th International Workshop on Large-Scale Integration of Wind Power into Power Systems as Well as on Transmission Networks for Offshore Wind Power Plants, Berlin, Germany, November 11-13, 2014. Conference Paper NREL/CP-5D00-62847, September 2014.

  16. Low-Activity Waste and High-Level Waste Feed Processing Data Quality Objectives

    SciTech Connect (OSTI)

    Patello, Gertrude K. ); Truex, Michael J. ); Wiemers, Karyn D.

    1999-04-15

    fallback positions are realized or eliminated early in the planning process. This DQO replaces earlier separate low-activity waste feed data quality objectives (Truex and Wiemers 1998) and high-level waste feed data quality objectives documents (Wiemers et al. 1998). This combined DQO updates the data requirements based on the TWRS Privatization Contract issued August 1998 (DOE-RL 1998). Regulatory compliance for TWRS Privatization is addressed in a separate DQO (Wiemers et al. 1998). Additional characterization of the Phase I waste feed will be performed by DOE's contractors: the M&I contractor and the private contractor. Characterization for feed certification and waste acceptance will be completed before transfer of the feed to the private contractor facility. Characterization requirements for staged feed will be identified in other DQOs consistent with the Feed Certification Plans, ICDs 19 and 20, and applicable permits. Newly obtained analytical data and contract changes that have become available in parallel with or subsequent to preparation of this DQO update will be assessed and incorporated into the data needs optimization in the next revision of this DQO. Data available at the time of the tank waste sample request will be considered in the development of the Tank Sampling and Analysis Plan.

  17. Buildings Performance Metrics Terminology | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Buildings Performance Metrics Terminology. This document provides the terms and definitions used in the Department of Energy's Performance Metrics Research Project. metrics_terminology_20090203.pdf (152.35 KB). More Documents & Publications: Procuring Architectural and Engineering Services for Energy Efficiency and Sustainability; Transmittal Letter for the Statewide Benchmarking Process Evaluation Guide for Benchmarking Residential Energy Efficiency Program

  18. EECBG SEP Attachment 1 - Process metric list

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    10-07B/SEP 10-006A, Attachment 1: Process Metrics List. Columns: Metric Area; Metric; Primary or Optional Metric; Item(s) to Report On. 1. Building Retrofits: 1a. Buildings retrofitted, by sector (number of buildings retrofitted; square footage of buildings retrofitted); 1b. Energy management systems installed, by sector (number of energy management systems installed; square footage of buildings under management); 1c. Building roofs retrofitted, by sector (number of building roofs retrofitted; square footage of building

  19. Definition of GPRA08 benefits metrics

    SciTech Connect (OSTI)

    None, None

    2009-01-18

    Background information for the FY 2007 GPRA methodology review on the definitions of GPRA08 benefits metrics.

  20. Module 6- Metrics, Performance Measurements and Forecasting

    Broader source: Energy.gov [DOE]

    This module reviews metrics such as cost and schedule variance along with cost and schedule performance indices.

  1. Comparing Resource Adequacy Metrics: Preprint

    SciTech Connect (OSTI)

    Ibanez, E.; Milligan, M.

    2014-09-01

    As the penetration of variable generation (wind and solar) increases around the world, there is an accompanying growing interest and importance in accurately assessing the contribution that these resources can make toward planning reserve. This contribution, also known as the capacity credit or capacity value of the resource, is best quantified by using a probabilistic measure of overall resource adequacy. In recognizing the variable nature of these renewable resources, there has been interest in exploring the use of reliability metrics other than loss of load expectation. In this paper, we undertake some comparisons using data from the Western Electricity Coordinating Council in the western United States.
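
    The probabilistic adequacy calculation referred to above (loss-of-load expectation, from which capacity credit is usually derived) can be sketched in a few lines. This is a minimal illustration only; the unit fleet, outage rates, and load series are invented and are not taken from the preprint.

    ```python
    # Minimal loss-of-load-expectation (LOLE) sketch: build a capacity-outage
    # probability table by convolving two-state generating units, then count the
    # expected hours in which available capacity falls short of load.
    import numpy as np

    units = [(200, 0.05), (200, 0.05), (100, 0.08)]  # (capacity in MW, forced outage rate)
    total_cap = sum(cap for cap, _ in units)

    prob = np.zeros(total_cap + 1)   # prob[k] = probability that k MW are on outage
    prob[0] = 1.0
    for cap, efor in units:
        shifted = np.zeros_like(prob)
        shifted[cap:] = prob[:prob.size - cap]       # this unit is out: outage grows by cap
        prob = prob * (1 - efor) + shifted * efor    # mix available / unavailable states

    hourly_load = np.random.default_rng(0).uniform(250, 480, size=8760)  # MW, illustrative

    available = total_cap - np.arange(total_cap + 1)
    lole_hours = sum(prob[available < load].sum() for load in hourly_load)
    print(f"LOLE ~ {lole_hours:.1f} hours/year")
    ```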

  2. Conceptual Soundness, Metric Development, Benchmarking, and Targeting for PATH Subprogram Evaluation

    SciTech Connect (OSTI)

    Mosey, G.; Doris, E.; Coggeshall, C.; Antes, M.; Ruch, J.; Mortensen, J.

    2009-01-01

    The objective of this study is to evaluate the conceptual soundness of the U.S. Department of Housing and Urban Development (HUD) Partnership for Advancing Technology in Housing (PATH) program's revised goals and establish and apply a framework to identify and recommend metrics that are the most useful for measuring PATH's progress. This report provides an evaluative review of PATH's revised goals, outlines a structured method for identifying and selecting metrics, proposes metrics and benchmarks for a sampling of individual PATH programs, and discusses other metrics that potentially could be developed that may add value to the evaluation process. The framework and individual program metrics can be used for ongoing management improvement efforts and to inform broader program-level metrics for government reporting requirements.

  3. Culture, and a Metrics Methodology for Biological Countermeasure Scenarios

    SciTech Connect (OSTI)

    Simpson, Mary J.

    2007-03-15

    Outcome Metrics Methodology defines a way to evaluate outcome metrics associated with scenario analyses related to biological countermeasures. Previous work developed a schema to allow evaluation of common elements of impacts across a wide range of potential threats and scenarios. Classes of metrics were identified that could be used by decision makers to differentiate the common bases among disparate scenarios. Typical impact metrics used in risk calculations include the anticipated number of deaths, casualties, and the direct economic costs should a given event occur. There are less obvious metrics that are often as important and require more intensive initial work to be incorporated. This study defines a methodology for quantifying, evaluating, and ranking metrics other than direct health and economic impacts. As has been observed with the consequences of Hurricane Katrina, impacts to the culture of specific sectors of society are less obvious on an immediate basis but equally important over the ensuing and long term. Culture is used as the example class of metrics within which • requirements for a methodology are explored • likely methodologies are examined • underlying assumptions for the respective methodologies are discussed • the basis for recommending a specific methodology is demonstrated. Culture, as a class of metrics, is shown to consist of political, sociological, and psychological elements that are highly valued by decision makers. In addition, cultural practices, dimensions, and kinds of knowledge offer complementary sets of information that contribute to the context within which experts can provide input. The quantification and evaluation of sociopolitical, socio-economic, and sociotechnical impacts depend predominantly on subjective, expert judgment. Epidemiological data is limited, resulting in samples with statistical limits. Dose response assessments and curves depend on the quality of data and its relevance to human modes of exposure

  4. Efficient Synchronization Stability Metrics for Fault Clearing...

    Office of Scientific and Technical Information (OSTI)

    Title: Efficient Synchronization Stability Metrics for Fault Clearing Authors: Backhaus, Scott N.; Chertkov, Michael; Bent, Russell Whitford; Bienstock, Daniel ...

  5. Module 6 - Metrics, Performance Measurements and Forecasting...

    Broader source: Energy.gov (indexed) [DOE]

    This module reviews metrics such as cost and schedule variance along with cost and schedule performance indices. In addition, this module will outline forecasting tools such as ...

  6. Western Resource Adequacy: Challenges - Approaches - Metrics...

    Energy Savers [EERE]

    Eastern Wind Integration and Transmission Study (EWITS) (Revised) Conceptual Framework for Developing Resilience Metrics for the Electricity, Oil, and Gas Sectors in the United ...

  7. Microsoft Word - QER Resilience Metrics - Technical Workshp ...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Workshop Resilience Metrics for Energy Transmission and Distribution Infrastructure Offices of Electricity Delivery and Energy Reliability (OE) and Energy Policy and Systems ...

  8. Microsoft Word - QER Resilience Metrics - Technical Workshp ...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Quadrennial Energy Review Technical Workshop on Resilience Metrics for Energy Transmission and Distribution Infrastructure April, 29th, 2014 777 North Capitol St NE Ste 300, ...

  9. The image quality of ion computed tomography at clinical imaging dose levels

    SciTech Connect (OSTI)

    Hansen, David C.; Bassler, Niels; Sørensen, Thomas Sangild; Seco, Joao

    2014-11-01

    Purpose: Accurately predicting the range of radiotherapy ions in vivo is important for the precise delivery of dose in particle therapy. Range uncertainty is currently the single largest contribution to the dose margins used in planning and leads to a higher dose to normal tissue. The use of ion CT has been proposed as a method to improve the range uncertainty and thereby reduce dose to normal tissue of the patient. A wide variety of ions have been proposed and studied for this purpose, but no studies evaluate the image quality obtained with different ions in a consistent manner. However, the imaging dose in ion CT is a concern that may limit the obtainable image quality. In addition, the imaging doses reported have not been directly comparable with x-ray CT doses due to the different biological impacts of ion radiation. The purpose of this work is to develop a robust methodology for comparing the image quality of ion CT with respect to particle therapy, taking into account different reconstruction methods and ion species. Methods: A comparison of different ions and energies was made. Ion CT projections were simulated for five different scenarios: Protons at 230 and 330 MeV, helium ions at 230 MeV/u, and carbon ions at 430 MeV/u. Maps of the water equivalent stopping power were reconstructed using a weighted least squares method. The dose was evaluated via a quality factor weighted CT dose index called the CT dose equivalent index (CTDEI). Spatial resolution was measured by the modulation transfer function. This was done by a noise-robust fit to the edge spread function. Second, the image quality as a function of the number of scanning angles was evaluated for protons at 230 MeV. In the resolution study, the CTDEI was fixed to 10 mSv, similar to a typical x-ray CT scan. Finally, scans at a range of CTDEIs were done, to evaluate dose influence on reconstruction error. Results: All ions yielded accurate stopping power estimates, none of which were statistically
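
    The resolution measurement described above can be illustrated with a small sketch: fit a parametric edge model to a noisy edge spread function and convert the fitted blur into an MTF. The model, data, and parameter values are assumptions for demonstration only, not the authors' implementation.

    ```python
    # Estimate the MTF from a noise-robust fit to an edge spread function (ESF).
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.special import erf

    def esf_model(x, amplitude, offset, center, sigma):
        """Error-function edge model; sigma is the Gaussian edge blur."""
        return offset + 0.5 * amplitude * (1 + erf((x - center) / (sigma * np.sqrt(2))))

    x = np.linspace(-5, 5, 501)  # position across the edge, mm
    rng = np.random.default_rng(1)
    measured_esf = esf_model(x, 1.0, 0.1, 0.0, 0.6) + rng.normal(0, 0.02, x.size)

    popt, _ = curve_fit(esf_model, x, measured_esf, p0=[1.0, 0.0, 0.0, 1.0])
    sigma = abs(popt[3])

    # For a Gaussian line spread function the MTF is analytic: exp(-2*pi^2*sigma^2*f^2).
    freq = np.linspace(0, 2, 200)                      # cycles/mm
    mtf = np.exp(-2 * np.pi**2 * sigma**2 * freq**2)
    f10 = freq[np.argmax(mtf < 0.1)]                   # frequency where MTF drops to 10%
    print(f"fitted edge blur = {sigma:.3f} mm, MTF10 ~ {f10:.2f} cycles/mm")
    ```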

  10. Sheet1 Water Availability Metric (Acre-Feet/Yr) Water Cost Metric...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Sheet1 Water Availability Metric (Acre-Feet/Yr) Water Cost Metric (Acre-Foot) Current Water Use (Acre-Feet/Yr) Projected Use in 2030 (Acre-Feet/Yr) HUC8 STATE BASIN SUBBASIN ...

  11. Smart Grid Status and Metrics Report Appendices

    SciTech Connect (OSTI)

    Balducci, Patrick J.; Antonopoulos, Chrissi A.; Clements, Samuel L.; Gorrissen, Willy J.; Kirkham, Harold; Ruiz, Kathleen A.; Smith, David L.; Weimar, Mark R.; Gardner, Chris; Varney, Jeff

    2014-07-01

    A smart grid uses digital power control and communication technology to improve the reliability, security, flexibility, and efficiency of the electric system, from large generation through the delivery systems to electricity consumers and a growing number of distributed generation and storage resources. To convey progress made in achieving the vision of a smart grid, this report uses a set of six characteristics derived from the National Energy Technology Laboratory Modern Grid Strategy. The Smart Grid Status and Metrics Report defines and examines 21 metrics that collectively provide insight into the grid’s capacity to embody these characteristics. This appendix presents papers covering each of the 21 metrics identified in Section 2.1 of the Smart Grid Status and Metrics Report. These metric papers were prepared in advance of the main body of the report and collectively form its informational backbone.

  12. Practical Diagnostics for Evaluating Residential Commissioning Metrics

    SciTech Connect (OSTI)

    Wray, Craig; Walker, Iain; Siegel, Jeff; Sherman, Max

    2002-06-11

    In this report, we identify and describe 24 practical diagnostics that are ready now to evaluate residential commissioning metrics, and that we expect to include in the commissioning guide. Our discussion in the main body of this report is limited to existing diagnostics in areas of particular concern with significant interactions: envelope and HVAC systems. These areas include insulation quality, windows, airtightness, envelope moisture, fan and duct system airflows, duct leakage, cooling equipment charge, and combustion appliance backdrafting with spillage. Appendix C describes the 83 other diagnostics that we have examined in the course of this project, but that are not ready or are inappropriate for residential commissioning. Combined with Appendix B, Table 1 in the main body of the report summarizes the advantages and disadvantages of all 107 diagnostics. We first describe what residential commissioning is, its characteristic elements, and how one might structure its process. Our intent in this discussion is to formulate and clarify these issues, but is largely preliminary because such a practice does not yet exist. Subsequent sections of the report describe metrics one can use in residential commissioning, along with the consolidated set of 24 practical diagnostics that the building industry can use now to evaluate them. Where possible, we also discuss the accuracy and usability of diagnostics, based on recent laboratory work and field studies by LBNL staff and others in more than 100 houses. These studies concentrate on evaluating diagnostics in the following four areas: the DeltaQ duct leakage test, air-handler airflow tests, supply and return grille airflow tests, and refrigerant charge tests. Appendix A describes those efforts in detail. In addition, where possible, we identify the costs to purchase diagnostic equipment and the amount of time required to conduct the diagnostics. Table 1 summarizes these data. Individual equipment costs for the 24

  13. Metrics and Benchmarks for Energy Efficiency in Laboratories

    SciTech Connect (OSTI)

    Mathew, Paul

    2007-10-26

    A wide spectrum of laboratory owners, ranging from universities to federal agencies, have explicit goals for energy efficiency in their facilities. For example, the Energy Policy Act of 2005 (EPACT 2005) requires all new federal buildings to exceed ASHRAE 90.1-2004 1 by at least 30 percent. The University of California Regents Policy requires all new construction to exceed California Title 24 2 by at least 20 percent. A new laboratory is much more likely to meet energy efficiency goals if quantitative metrics and targets are explicitly specified in programming documents and tracked during the course of the delivery process. If efficiency targets are not explicitly and properly defined, any additional capital costs or design time associated with attaining higher efficiencies can be difficult to justify. The purpose of this guide is to provide guidance on how to specify and compute energy efficiency metrics and benchmarks for laboratories, at the whole building as well as the system level. The information in this guide can be used to incorporate quantitative metrics and targets into the programming of new laboratory facilities. Many of these metrics can also be applied to evaluate existing facilities. For information on strategies and technologies to achieve energy efficiency, the reader is referred to Labs21 resources, including technology best practice guides, case studies, and the design guide (available at www.labs21century.gov/toolkit).
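
    As a concrete example of the kind of whole-building metric the guide covers, the sketch below computes site energy use intensity (EUI). The conversion factors are standard site-energy values; the inputs are illustrative, and the guide itself defines which metrics and benchmarks to apply.

    ```python
    # Site energy use intensity (EUI) in kBtu/ft^2/yr from annual utility data.
    def site_eui(annual_electricity_kwh, annual_gas_therms, gross_area_sqft):
        electricity_kbtu = annual_electricity_kwh * 3.412   # 1 kWh = 3.412 kBtu
        gas_kbtu = annual_gas_therms * 100.0                # 1 therm = 100 kBtu
        return (electricity_kbtu + gas_kbtu) / gross_area_sqft

    # Example: a 100,000 ft^2 laboratory using 6 GWh of electricity and 150,000 therms of gas.
    print(f"Site EUI ~ {site_eui(6_000_000, 150_000, 100_000):.0f} kBtu/ft2/yr")
    ```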

  14. Quality assurance of temporal variability of natural decay chain and neutron induced background for low-level NORM analysis

    SciTech Connect (OSTI)

    Yoho, Michael; Porterfield, Donivan R.; Landsberger, Sheldon

    2015-09-22

    In this study, twenty-one high purity germanium (HPGe) background spectra were collected over 2 years at Los Alamos National Laboratory. A quality assurance methodology was developed to monitor spectral background levels from thermal and fast neutron flux levels and naturally occurring radioactive material decay series radionuclides. 238U decay products above 222Rn demonstrated minimal temporal variability beyond that expected from counting statistics. 238U and 232Th progeny below Rn gas displayed at most twice the expected variability. Further, an analysis of the 139 keV 74Ge(n, γ) and 691 keV 72Ge(n, n') spectral features demonstrated temporal stability for both thermal and fast neutron fluxes.
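
    The core of the check described above can be sketched briefly: compare the observed spread of a background line's counts across repeated spectra with the spread expected from counting (Poisson) statistics alone. The counts below are synthetic, not the paper's data.

    ```python
    # Compare observed peak-count variability with the Poisson expectation.
    import numpy as np

    rng = np.random.default_rng(2)
    peak_counts = rng.poisson(lam=400, size=21)   # one background line in 21 spectra (synthetic)

    observed_sd = peak_counts.std(ddof=1)
    expected_sd = np.sqrt(peak_counts.mean())     # Poisson: variance equals the mean
    print(f"observed/expected variability = {observed_sd / expected_sd:.2f}")
    # A ratio near 1 means the variability is consistent with counting statistics;
    # the abstract reports up to about twice the expected value for progeny below radon.
    ```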

  15. Quality assurance of temporal variability of natural decay chain and neutron induced background for low-level NORM analysis

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Yoho, Michael; Porterfield, Donivan R.; Landsberger, Sheldon

    2015-09-22

    In this study, twenty-one high purity germanium (HPGe) background spectra were collected over 2 years at Los Alamos National Laboratory. A quality assurance methodology was developed to monitor spectral background levels from thermal and fast neutron flux levels and naturally occurring radioactive material decay series radionuclides. 238U decay products above 222Rn demonstrated minimal temporal variability beyond that expected from counting statistics. 238U and 232Th progeny below Rn gas displayed at most twice the expected variability. Further, an analysis of the 139 keV 74Ge(n, γ) and 691 keV 72Ge(n, n') spectral features demonstrated temporal stability for both thermal and fast neutron fluxes.

  16. Metrics for comparison of crystallographic maps

    SciTech Connect (OSTI)

    Urzhumtsev, Alexandre; Afonine, Pavel V.; Lunin, Vladimir Y.; Terwilliger, Thomas C.; Adams, Paul D.

    2014-10-01

    Numerical comparison of crystallographic contour maps is used extensively in structure solution and model refinement, analysis and validation. However, traditional metrics such as the map correlation coefficient (map CC, real-space CC or RSCC) sometimes contradict the results of visual assessment of the corresponding maps. This article explains such apparent contradictions and suggests new metrics and tools to compare crystallographic contour maps. The key to the new methods is rank scaling of the Fourier syntheses. The new metrics are complementary to the usual map CC and can be more helpful in map comparison, in particular when only some of their aspects, such as regions of high density, are of interest.
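
    The idea of rank scaling can be illustrated by contrasting a plain map correlation coefficient with a correlation computed after replacing grid values by their ranks. The article's own metrics are more elaborate; the sketch below, with synthetic maps, only conveys the concept.

    ```python
    # Plain map CC versus a correlation computed on rank-scaled map values.
    import numpy as np
    from scipy.stats import rankdata

    rng = np.random.default_rng(3)
    map_a = rng.normal(size=(32, 32, 32))                      # stand-ins for two contour maps
    map_b = map_a + rng.normal(scale=0.5, size=map_a.shape)

    def map_cc(a, b):
        """Standard real-space correlation coefficient over all grid points."""
        return np.corrcoef(a.ravel(), b.ravel())[0, 1]

    def rank_scaled_cc(a, b):
        """Correlation after rank scaling of the grid values."""
        return np.corrcoef(rankdata(a.ravel()), rankdata(b.ravel()))[0, 1]

    print(f"map CC         = {map_cc(map_a, map_b):.3f}")
    print(f"rank-scaled CC = {rank_scaled_cc(map_a, map_b):.3f}")
    ```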

  17. Clean Cities Annual Metrics Report 2009 (Revised)

    SciTech Connect (OSTI)

    Johnson, C.

    2011-08-01

    Document provides Clean Cities coalition metrics about the use of alternative fuels; the deployment of alternative fuel vehicles, hybrid electric vehicles (HEVs), and idle reduction initiatives; fuel economy activities; and programs to reduce vehicle miles driven.

  18. Technical Workshop: Resilience Metrics for Energy Transmission...

    Broader source: Energy.gov (indexed) [DOE]

    List (55.27 KB) Sandia Report: Conceptual Framework for Developing Resilience Metrics for the Electricity, Oil, and Gas Sectors in the United States (14.49 MB) Sandia ...

  19. Business Metrics for High-Performance Homes: A Colorado Springs...

    Office of Scientific and Technical Information (OSTI)

    Technical Report: Business Metrics for High-Performance Homes: A Colorado Springs Case Study Citation Details In-Document Search Title: Business Metrics for High-Performance Homes: ...

  20. FY 2014 Q3 Metric Summary | Department of Energy

    Office of Environmental Management (EM)

    FY 2014 Overall Contract and Project Management Improvement Performance Metrics and Targets FY 2015 Overall Contract and Project Management Improvement Performance Metrics and ...

  1. Texas CO2 Capture Demonstration Project Hits Three Million Metric...

    Office of Environmental Management (EM)

    Texas CO2 Capture Demonstration Project Hits Three Million Metric Ton Milestone Texas CO2 Capture Demonstration Project Hits Three Million Metric Ton Milestone June 30, 2016 - ...

  2. Deep-level emission in ZnO nanowires and bulk crystals: Excitation-intensity dependence versus crystalline quality

    SciTech Connect (OSTI)

    Hou, Dongchao; Voss, Tobias; Ronning, Carsten; Menzel, Andreas; Zacharias, Margit

    2014-06-21

    The excitation-intensity dependence of the excitonic near-band-edge emission (NBE) and deep-level related emission (DLE) bands in ZnO nanowires and bulk crystals is studied; the two bands show distinctly different power laws. The behavior can be well explained with a rate-equation model taking into account deep donor and acceptor levels with certain capture cross sections for electrons from the conduction band and different radiative lifetimes. In addition, a further crucial ingredient of this model is the background n-type doping concentration inherent in almost all ZnO single crystals. The interplay of the deep defects and the background free-electron concentration in the conduction band at room temperature reproduces the experimental results well over a wide range of excitation intensities (almost five orders of magnitude). The results demonstrate that for many ZnO bulk samples and nanostructures, the relative intensity R = I{sub NBE}/I{sub DLE} can be adjusted over a wide range by varying the excitation intensity, thus showing that R should not be taken as an indicator for the crystalline quality of ZnO samples unless absolute photoluminescence intensities under calibrated excitation conditions are compared. On the other hand, the results establish an all-optical technique to determine the relative doping levels in different ZnO samples by measuring the excitation-intensity dependence of the UV and visible luminescence bands.
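
    The power-law analysis mentioned above amounts to fitting the exponent k in I ~ P**k for each emission band from intensity-versus-excitation data. A minimal sketch with synthetic data points (not the paper's measurements) follows.

    ```python
    # Fit the power-law exponent k in I ~ P**k from log-log data for two emission bands.
    import numpy as np

    excitation = np.logspace(-2, 2, 9)            # excitation intensity, arbitrary units
    rng = np.random.default_rng(4)
    nbe_intensity = 5.0 * excitation**1.3 * rng.lognormal(0, 0.05, excitation.size)
    dle_intensity = 2.0 * excitation**0.8 * rng.lognormal(0, 0.05, excitation.size)

    def power_law_exponent(power, intensity):
        """Slope of log(intensity) versus log(power), i.e. the exponent k."""
        k, _ = np.polyfit(np.log(power), np.log(intensity), deg=1)
        return k

    print(f"NBE exponent ~ {power_law_exponent(excitation, nbe_intensity):.2f}")
    print(f"DLE exponent ~ {power_law_exponent(excitation, dle_intensity):.2f}")
    ```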

  3. Implementing the Data Center Energy Productivity Metric

    SciTech Connect (OSTI)

    Sego, Landon H.; Marquez, Andres; Rawson, Andrew; Cader, Tahir; Fox, Kevin M.; Gustafson, William I.; Mundy, Christopher J.

    2012-10-01

    As data centers proliferate in both size and number, their energy efficiency is becoming increasingly important. We discuss the properties of a number of the proposed metrics of energy efficiency and productivity. In particular, we focus on the Data Center Energy Productivity (DCeP) metric, which is the ratio of useful work produced by the data center to the energy consumed performing that work. We describe our approach for using DCeP as the principal outcome of a designed experiment using a highly instrumented, high performance computing data center. We found that DCeP was successful in clearly distinguishing between different operational states in the data center, thereby validating its utility as a metric for identifying configurations of hardware and software that would improve (or even maximize) energy productivity. We also discuss some of the challenges and benefits associated with implementing the DCeP metric, and we examine the efficacy of the metric in making comparisons within a data center and among data centers.
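
    A minimal sketch of the DCeP ratio as the abstract defines it follows. How "useful work" is quantified is workload-specific; the task classes and weights below are purely illustrative.

    ```python
    # DCeP = useful work produced by the data center / energy consumed producing it.
    def dcep(completed_tasks, task_weights, energy_kwh):
        useful_work = sum(n * w for n, w in zip(completed_tasks, task_weights))
        return useful_work / energy_kwh

    # Example: two job classes completed during one assessment window.
    print(f"DCeP = {dcep([1200, 300], [1.0, 2.5], energy_kwh=850.0):.2f} work units per kWh")
    ```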

  4. Shallow ground-water flow, water levels, and quality of water, 1980-84, Cowles Unit, Indiana Dunes National Lakeshore

    SciTech Connect (OSTI)

    Cohen, D.A.; Shedlock, R.J.

    1986-01-01

    The Cowles Unit of Indiana Dunes National Lakeshore in Porter County, northwest Indiana, contains a broad dune-beach complex along the southern shoreline of Lake Michigan and a large wetland, called the Great Marsh, that occupies the lowland between the shoreline dunes and an older dune-beach complex farther inland. Water levels and water quality in the surficial aquifer were monitored from 1977 to 1984 near settling ponds on adjacent industrial property at the western end of the Cowles Unit. Since 1980, when the settling pond bottoms were sealed, these intradunal lowlands contained standing water only during periods of high snowmelt or rainfall. Water level declines following the cessation of seepage ranged from 6 feet at the eastern-most settling pond to nearly 14 feet at the western-most pond. No general pattern of water table decline was observed in the Great Marsh or in the shoreline dune complex at distances > 3,000 ft east or north of the settling ponds. Since the settling ponds were sealed, the concentration of boron has decreased while concentrations of cadmium, arsenic, zinc, and molybdenum in shallow ground-water downgradient of the ponds show no definite trends in time. Arsenic, boron and molybdenum have remained at concentrations above those of shallow groundwater in areas unaffected by settling pond seepage. 11 refs., 10 figs., 1 tab.

  5. Metrics for Evaluating the Accuracy of Solar Power Forecasting (Presentation)

    SciTech Connect (OSTI)

    Zhang, J.; Hodge, B.; Florita, A.; Lu, S.; Hamann, H.; Banunarayanan, V.

    2013-10-01

    This presentation proposes a suite of metrics for evaluating the performance of solar power forecasting.

  6. Metrics for comparison of crystallographic maps

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Urzhumtsev, Alexandre; Afonine, Pavel V.; Lunin, Vladimir Y.; Terwilliger, Thomas C.; Adams, Paul D.

    2014-10-01

    Numerical comparison of crystallographic contour maps is used extensively in structure solution and model refinement, analysis and validation. However, traditional metrics such as the map correlation coefficient (map CC, real-space CC or RSCC) sometimes contradict the results of visual assessment of the corresponding maps. This article explains such apparent contradictions and suggests new metrics and tools to compare crystallographic contour maps. The key to the new methods is rank scaling of the Fourier syntheses. The new metrics are complementary to the usual map CC and can be more helpful in map comparison, in particular when only some of their aspects, such as regions of high density, are of interest.

  7. Enhanced Accident Tolerant LWR Fuels: Metrics Development

    SciTech Connect (OSTI)

    Shannon Bragg-Sitton; Lori Braase; Rose Montgomery; Chris Stanek; Robert Montgomery; Lance Snead; Larry Ott; Mike Billone

    2013-09-01

    The Department of Energy (DOE) Fuel Cycle Research and Development (FCRD) Advanced Fuels Campaign (AFC) is conducting research and development on enhanced Accident Tolerant Fuels (ATF) for light water reactors (LWRs). This mission emphasizes the development of novel fuel and cladding concepts to replace the current zirconium alloy-uranium dioxide (UO2) fuel system. The overall mission of the ATF research is to develop advanced fuels/cladding with improved performance, reliability and safety characteristics during normal operations and accident conditions, while minimizing waste generation. The initial effort will focus on implementation in operating reactors or reactors with design certifications. To initiate the development of quantitative metrics for ATF, an LWR Enhanced Accident Tolerant Fuels Metrics Development Workshop was held in October 2012 in Germantown, MD. This paper summarizes the outcome of that workshop and the current status of metrics development for LWR ATF.

  8. EECBG SEP Attachment 1 - Process metric list | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    SEP Attachment 1 - Process metric list EECBG SEP Attachment 1 - Process metric list Reporting Guidance Process Metric List eecbg_10_07b_sep__10_006a_attachment1_process_metric_list.pdf (93.56 KB) More Documents & Publications EECBG 10-07C/SEP 10-006B Attachment 1: Process Metrics List EECBG Program Notice 10-07A DOE Recovery Act Reporting Requirements for the State Energy Program

  9. Performance Metrics Research Project - Final Report

    SciTech Connect (OSTI)

    Deru, M.; Torcellini, P.

    2005-10-01

    NREL began work for DOE on this project to standardize the measurement and characterization of building energy performance. NREL's primary research objectives were to determine which performance metrics have greatest value for determining energy performance and to develop standard definitions and methods of measuring and reporting that performance.

  10. Clean Cities 2011 Annual Metrics Report

    SciTech Connect (OSTI)

    Johnson, C.

    2012-12-01

    This report details the petroleum savings and vehicle emissions reductions achieved by the U.S. Department of Energy's Clean Cities program in 2011. The report also details other performance metrics, including the number of stakeholders in Clean Cities coalitions, outreach activities by coalitions and national laboratories, and alternative fuel vehicles deployed.

  11. Clean Cities 2010 Annual Metrics Report

    SciTech Connect (OSTI)

    Johnson, C.

    2012-10-01

    This report details the petroleum savings and vehicle emissions reductions achieved by the U.S. Department of Energy's Clean Cities program in 2010. The report also details other performance metrics, including the number of stakeholders in Clean Cities coalitions, outreach activities by coalitions and national laboratories, and alternative fuel vehicles deployed.

  12. Mesh Quality Improvement Toolkit

    Energy Science and Technology Software Center (OSTI)

    2002-11-15

    MESQUITE is a linkable software library to be used by simulation and mesh generation tools to improve the quality of meshes. Mesh quality is improved by node movement and/or local topological modifications. Various aspects of mesh quality such as smoothness, element shape, size, and orientation are controlled by choosing the appropriate mesh quality metric, an objective function template, and a numerical optimization solver to optimize the quality of meshes. MESQUITE uses the TSTT mesh interface specification to provide an interoperable toolkit that can be used by applications which adopt the standard. A flexible code design makes it easy for meshing researchers to add additional mesh quality metrics, templates, and solvers to develop new quality improvement algorithms by making use of the MESQUITE infrastructure.
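
    To illustrate what an element-shape metric of this kind measures, the sketch below computes one simple, widely used triangle quality (normalized so that an equilateral triangle scores 1). It is not necessarily one of MESQUITE's own metrics, which are selected and combined through the library's configuration.

    ```python
    # A simple triangle shape quality: 4*sqrt(3)*area / (sum of squared edge lengths).
    import math

    def triangle_quality(p0, p1, p2):
        """Returns 1.0 for an equilateral triangle, approaching 0 as it degenerates."""
        ax, ay = p1[0] - p0[0], p1[1] - p0[1]
        bx, by = p2[0] - p0[0], p2[1] - p0[1]
        area = 0.5 * abs(ax * by - ay * bx)
        edges_sq = (ax**2 + ay**2) + (bx**2 + by**2) + ((bx - ax)**2 + (by - ay)**2)
        return 4.0 * math.sqrt(3.0) * area / edges_sq

    print(triangle_quality((0, 0), (1, 0), (0.5, math.sqrt(3) / 2)))  # ~1.00 (equilateral)
    print(triangle_quality((0, 0), (1, 0), (0.5, 0.05)))              # ~0.12 (thin sliver)
    ```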

  13. Widget:CrazyEggMetrics | Open Energy Information

    Open Energy Info (EERE)

    CrazyEggMetrics This widget runs javascript code for the Crazy Egg user experience metrics. This should not be on all pages, but on select pages...

  14. Energy Department Project Captures and Stores One Million Metric...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    One Million Metric Tons of Carbon Energy Department Project Captures and Stores One Million Metric Tons of Carbon January 8, 2015 - 11:18am Addthis News Media Contact 202-586-4940 ...

  15. Smart Grid Status and Metrics Report

    SciTech Connect (OSTI)

    Balducci, Patrick J.; Weimar, Mark R.; Kirkham, Harold

    2014-07-01

    To convey progress made in achieving the vision of a smart grid, this report uses a set of six characteristics derived from the National Energy Technology Laboratory Modern Grid Strategy. It measures 21 metrics to provide insight into the grid’s capacity to embody these characteristics. This report looks across a spectrum of smart grid concerns to measure the status of smart grid deployment and impacts.

  16. Financial Metrics Data Collection Protocol, Version 1.0

    SciTech Connect (OSTI)

    Fowler, Kimberly M.; Gorrissen, Willy J.; Wang, Na

    2010-04-30

    Brief description of data collection process and plan that will be used to collect financial metrics associated with sustainable design.

  17. Nonmaximality of known extremal metrics on torus and Klein bottle

    SciTech Connect (OSTI)

    Karpukhin, M A

    2013-12-31

    The El Soufi-Ilias theorem establishes a connection between minimal submanifolds of spheres and extremal metrics for eigenvalues of the Laplace-Beltrami operator. Recently, this connection was used to provide several explicit examples of extremal metrics. We investigate the properties of these metrics and prove that none of them is maximal. Bibliography: 24 titles.

  18. Annex A Metrics for the Smart Grid System Report

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Annex A: Metrics for the Smart Grid System Report. Table of Contents: Introduction (A.1); Metric #1: The Fraction of Customers and Total Load Served by Real-Time Pricing, Critical Peak Pricing, and Time-of-Use Pricing (A.2); Metric #2: Real-Time System Operations Data

  19. On the determination of reference levels for quality assurance of flattening filter free photon beams in radiation therapy

    SciTech Connect (OSTI)

    Clivio, Alessandro; Belosi, Maria Francesca; Cozzi, Luca; Nicolini, Giorgia; Vanetti, Eugenio; Fogliata, Antonella; Bolard, Grégory; Fenoglietto, Pascal; Krauss, Harald

    2014-02-15

    Purpose: New definitions for some dosimetric parameters for use in quality assurance of flattening filter free (FFF) beams generated by medical linear accelerators have been suggested. The present study aims to validate these suggestions and to propose possible reference levels. Methods: The main characteristics of FFF photon beams were described in terms of: field size, penumbra, unflatness, slope, and peak-position parameters. Data were collected for 6 and 10 MV-FFF beams from three different Varian TrueBeam Linacs. Measurements were performed with a 2D-array (Starcheck system from PTW-Freiburg) and with the portal dosimetry method GLAaS utilizing the built-in portal imager of TrueBeam. Data were also compared to ion chamber measurements. A cross check validation has been performed on an FFF beam of 6 MV generated by a Varian Clinac-iX upgraded to FFF capability. Results: All the parameters suggested to characterize the FFF beams resulted easily measurable and little variation was observed among different Linacs. Referring to two reference field sizes of 10 × 10 and 20 × 20 cm{sup 2}, at SDD = 100 cm and d = dmax, from the portal dosimetry data, the following results (averaging X and Y profiles) were obtained. Field size: 9.95 ± 0.02 and 19.98 ± 0.03 cm for 6 MV-FFF (9.94 ± 0.02 and 19.98 ± 0.03 cm for 10 MV-FFF). Penumbra: 2.7 ± 0.3 and 2.9 ± 0.3 mm for 6 MV-FFF (3.1 ± 0.2 and 3.3 ± 0.3 for 10 MV-FFF). Unflatness: 1.11 ± 0.01 and 1.25 ± 0.01 for 6 MV-FFF (1.21 ± 0.01 and 1.50 ± 0.01 for 10 MV-FFF). Slope: 0.320 ± 0.020%/mm and 0.43 ± 0.015%/mm for 6 MV-FFF (0.657 ± 0.023%/mm and 0.795 ± 0.017%/mm for 10 MV-FFF). Peak Position: -0.2 ± 0.2 and -0.4 ± 0.2 mm for 6 MV-FFF (-0.3 ± 0.2 and 0.7 ± 0.3 mm for 10 MV-FFF). Results would depend upon measurement depth. With thresholds set to at least 95% confidence level from the measured data and to account for possible variations between detectors and methods and experimental settings, a tolerance set of: 1 mm for field

  20. Metrics For Comparing Plasma Mass Filters

    SciTech Connect (OSTI)

    Abraham J. Fetterman and Nathaniel J. Fisch

    2012-08-15

    High-throughput mass separation of nuclear waste may be useful for optimal storage, disposal, or environmental remediation. The most dangerous part of nuclear waste is the fission product, which produces most of the heat and medium-term radiation. Plasmas are well-suited to separating nuclear waste because they can separate many different species in a single step. A number of plasma devices have been designed for such mass separation, but there has been no standardized comparison between these devices. We define a standard metric, the separative power per unit volume, and derive it for three different plasma mass filters: the plasma centrifuge, Ohkawa filter, and the magnetic centrifugal mass filter.

  1. Metrics for comparing plasma mass filters

    SciTech Connect (OSTI)

    Fetterman, Abraham J.; Fisch, Nathaniel J.

    2011-10-15

    High-throughput mass separation of nuclear waste may be useful for optimal storage, disposal, or environmental remediation. The most dangerous part of nuclear waste is the fission product, which produces most of the heat and medium-term radiation. Plasmas are well-suited to separating nuclear waste because they can separate many different species in a single step. A number of plasma devices have been designed for such mass separation, but there has been no standardized comparison between these devices. We define a standard metric, the separative power per unit volume, and derive it for three different plasma mass filters: the plasma centrifuge, Ohkawa filter, and the magnetic centrifugal mass filter.
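
    Neither abstract reproduces the definition, but separative power is conventionally built on the value function of isotope-separation theory; a hedged sketch of that standard form is given below, with the understanding that the papers' per-unit-volume normalization and plasma-specific derivations are their own.

    ```latex
    % Conventional separative power for a separator with feed F, product P, and tails W
    % at concentrations N_F, N_P, N_W (Dirac value function); a per-unit-volume metric
    % divides this by the filter volume V_f.
    \delta U = P\,V(N_P) + W\,V(N_W) - F\,V(N_F), \qquad
    V(N) = (2N-1)\,\ln\!\frac{N}{1-N}, \qquad
    \text{metric} \sim \frac{\delta U}{V_f}
    ```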

  2. Clean Cities 2013 Annual Metrics Report

    Alternative Fuels and Advanced Vehicles Data Center [Office of Energy Efficiency and Renewable Energy (EERE)]

    Clean Cities 2013 Annual Metrics Report Caley Johnson and Mark Singer National Renewable Energy Laboratory Technical Report NREL/TP-5400-62838 October 2014 NREL is a national laboratory of the U.S. Department of Energy Office of Energy Efficiency & Renewable Energy Operated by the Alliance for Sustainable Energy, LLC This report is available at no cost from the National Renewable Energy Laboratory (NREL) at www.nrel.gov/publications. Contract No. DE-AC36-08GO28308 National Renewable Energy Laboratory 15013

  3. Clean Cities 2014 Annual Metrics Report

    Alternative Fuels and Advanced Vehicles Data Center [Office of Energy Efficiency and Renewable Energy (EERE)]

    Clean Cities 2014 Annual Metrics Report Caley Johnson and Mark Singer National Renewable Energy Laboratory Technical Report NREL/TP-5400-65265 December 2015 NREL is a national laboratory of the U.S. Department of Energy Office of Energy Efficiency & Renewable Energy Operated by the Alliance for Sustainable Energy, LLC This report is available at no cost from the National Renewable Energy Laboratory (NREL) at www.nrel.gov/publications. Contract No. DE-AC36-08GO28308 National Renewable Energy Laboratory 15013

  4. Metric redefinitions in Einstein-Aether theory

    SciTech Connect (OSTI)

    Foster, Brendan Z.

    2005-08-15

    'Einstein-Aether' theory, in which gravity couples to a dynamical, timelike, unit-norm vector field, provides a means for studying Lorentz violation in a generally covariant setting. Demonstrated here is the effect of a redefinition of the metric and 'aether' fields in terms of the original fields and two free parameters. The net effect is a change of the coupling constants appearing in the action. Using such a redefinition, one of the coupling constants can be set to zero, simplifying studies of solutions of the theory.

  5. Hanford Waste Vitrification Plant Quality Assurance Program description for high-level waste form development and qualification. Revision 3, Part 2

    SciTech Connect (OSTI)

    Not Available

    1993-08-01

    The Hanford Waste Vitrification Plant Project has been established to convert the high-level radioactive waste associated with nuclear defense production at the Hanford Site into a waste form suitable for disposal in a deep geologic repository. The Hanford Waste Vitrification Plant will mix processed radioactive waste with borosilicate material, then heat the mixture to its melting point (vitrification) to form a glass-like substance that traps the radionuclides in the glass matrix upon cooling. The Hanford Waste Vitrification Plant Quality Assurance Program has been established to support the mission of the Hanford Waste Vitrification Plant. This Quality Assurance Program Description has been written to document the Hanford Waste Vitrification Plant Quality Assurance Program.

  6. RAVEN Quality Assurance Activities

    SciTech Connect (OSTI)

    Cogliati, Joshua Joseph

    2015-09-01

    This report discusses the quality assurance activities needed to raise the Quality Level of Risk Analysis in a Virtual Environment (RAVEN) from Quality Level 3 to Quality Level 2. This report also describes the general RAVEN quality assurance activities. For improving the quality, reviews of code changes have been instituted, more parts of testing have been automated, and improved packaging has been created. For upgrading the quality level, requirements have been created and the workflow has been improved.

  7. Defining a Standard Metric for Electricity Savings

    SciTech Connect (OSTI)

    Brown, Marilyn; Akbari, Hashem; Blumstein, Carl; Koomey, Jonathan; Brown, Richard; Calwell, Chris; Carter, Sheryl; Cavanagh, Ralph; Chang, Audrey; Claridge, David; Craig, Paul; Diamond, Rick; Eto, Joseph H.; Fulkerson, William; Gadgil, Ashok; Geller, Howard; Goldemberg, Jose; Goldman, Chuck; Goldstein, David B.; Greenberg, Steve; Hafemeister, David; Harris, Jeff; Harvey, Hal; Heitz, Eric; Hirst, Eric; Hummel, Holmes; Kammen, Dan; Kelly, Henry; Laitner, Skip; Levine, Mark; Lovins, Amory; Masters, Gil; McMahon, James E.; Meier, Alan; Messenger, Michael; Millhone, John; Mills, Evan; Nadel, Steve; Nordman, Bruce; Price, Lynn; Romm, Joe; Ross, Marc; Rufo, Michael; Sathaye, Jayant; Schipper, Lee; Schneider, Stephen H; Sweeney, James L; Verdict, Malcolm; Vorsatz, Diana; Wang, Devra; Weinberg, Carl; Wilk, Richard; Wilson, John; Worrell, Ernst

    2009-03-01

    The growing investment by governments and electric utilities in energy efficiency programs highlights the need for simple tools to help assess and explain the size of the potential resource. One technique that is commonly used in this effort is to characterize electricity savings in terms of avoided power plants, because it is easier for people to visualize a power plant than it is to understand an abstraction such as billions of kilowatt-hours. Unfortunately, there is no standardization around the characteristics of such power plants. In this letter we define parameters for a standard avoided power plant that have physical meaning and intuitive plausibility, for use in back-of-the-envelope calculations. For the prototypical plant this article settles on a 500 MW existing coal plant operating at a 70 percent capacity factor with 7 percent T&D losses. Displacing such a plant for one year would save 3 billion kW h per year at the meter and reduce emissions by 3 million metric tons of CO2 per year. The proposed name for this metric is the Rosenfeld, in keeping with the tradition among scientists of naming units in honor of the person most responsible for the discovery and widespread adoption of the underlying scientific principle in question--Dr. Arthur H. Rosenfeld.
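
    The arithmetic behind the proposed unit follows directly from the parameters in the abstract; the CO2 rate used below is the roughly one metric ton per MWh implied by the stated totals, which is typical for existing coal plants.

    ```python
    # Reproduce the avoided-plant ("Rosenfeld") arithmetic from the stated parameters.
    capacity_mw = 500
    capacity_factor = 0.70
    td_losses = 0.07
    co2_tons_per_mwh = 1.0   # approximate rate implied by 3 Mt CO2 per ~3 billion kWh

    generation_mwh = capacity_mw * 8760 * capacity_factor        # ~3.07 million MWh at the busbar
    delivered_kwh = generation_mwh * (1 - td_losses) * 1000      # ~2.85 billion kWh at the meter
    co2_metric_tons = generation_mwh * co2_tons_per_mwh

    print(f"delivered energy ~ {delivered_kwh / 1e9:.2f} billion kWh/yr (the letter rounds to 3)")
    print(f"avoided CO2      ~ {co2_metric_tons / 1e6:.2f} million metric tons/yr")
    ```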

  8. Bio-oil Quality Improvement and Catalytic Hydrotreating of Bio...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    2.3.1.302 Bio-oil Quality Improvement and Catalytic Hydrotreating of Bio-oils - PNNL ... lifetime Define quality metric for oil feed and intermediate streams Understand ...

  9. Metrics for Measuring Progress Toward Implementation of the Smart Grid

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    (June 2008) | Department of Energy Metrics for Measuring Progress Toward Implementation of the Smart Grid (June 2008) Metrics for Measuring Progress Toward Implementation of the Smart Grid (June 2008) Results of the breakout session discussions at the Smart Grid Implementation Workshop, June 19-20, 2008 Metrics for Measuring Progress Toward Implementation of the Smart Grid (308.23 KB) More Documents & Publications 5th Annual CHP Roadmap Workshop Breakout Group Results, September 2004

  10. Measuring energy efficiency: Opportunities from standardization and common metrics

    U.S. Energy Information Administration (EIA) Indexed Site

    Measuring energy efficiency: Opportunities from standardization and common metrics For 2016 EIA Energy Conference July 11, 2016 | Washington, D.C. By Stacy Angel, Energy Information Portfolio Analyst Carol White, Senior Energy Efficiency Analyst How is the importance of measuring energy efficiency changing? * The number of energy efficiency policies and programs is growing. * Common metrics help measure progress towards multiple objectives. * Clear metrics help consumers make informed energy

  11. Technical Workshop: Resilience Metrics for Energy Transmission and

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Distribution Infrastructure | Department of Energy Resilience Metrics for Energy Transmission and Distribution Infrastructure Technical Workshop: Resilience Metrics for Energy Transmission and Distribution Infrastructure During this workshop, EPSA invited technical experts from industry, national laboratories, academia, and NGOs to discuss the state of play of and need for resilience metrics and how they vary by natural gas, liquid fuels and electric grid infrastructures. Issues important to

  12. Integration of the EM Corporate QA Performance Metrics With Performance

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Analysis Process | Department of Energy the EM Corporate QA Performance Metrics With Performance Analysis Process Integration of the EM Corporate QA Performance Metrics With Performance Analysis Process August 2009 Presenter: Robert Hinds, Savannah River Remediation, LLC Track 9-12 Topics Covered: Implementing CPMS for QA Corporate QA Performance Metrics Contractor Performance Analysis Contractor Assessment Programs Assessment Program Structure CPMS Integration with P/A Process Validating

  13. Clean Cities 2013 Annual Metrics Report

    SciTech Connect (OSTI)

    Johnson, C.; Singer, M.

    2014-10-01

    Each year, the U.S. Department of Energy asks its Clean Cities program coordinators to submit annual reports of their activities and accomplishments for the previous calendar year. Data and information are submitted via an online database that is maintained as part of the Alternative Fuels Data Center (AFDC) at the National Renewable Energy Laboratory (NREL). Coordinators submit a range of data that characterize the membership, funding, projects, and activities of their coalitions. They also submit data about sales of alternative fuels, deployment of alternative fuel vehicles (AFVs) and hybrid electric vehicles (HEVs), idle-reduction (IR) initiatives, fuel economy activities, and programs to reduce vehicle miles traveled (VMT). NREL analyzes the data and translates them into petroleum-use reduction impacts, which are summarized in this 2013 Annual Metrics Report.

  14. Clean Cities 2014 Annual Metrics Report

    SciTech Connect (OSTI)

    Johnson, Caley; Singer, Mark

    2015-12-22

    Each year, the U.S. Department of Energy asks its Clean Cities program coordinators to submit annual reports of their activities and accomplishments for the previous calendar year. Data and information are submitted via an online database that is maintained as part of the Alternative Fuels Data Center (AFDC) at the National Renewable Energy Laboratory (NREL). Coordinators submit a range of data that characterize the membership, funding, projects, and activities of their coalitions. They also submit data about sales of alternative fuels, deployment of alternative fuel vehicles (AFVs) and hybrid electric vehicles (HEVs), idle-reduction (IR) initiatives, fuel economy activities, and programs to reduce vehicle miles traveled (VMT). NREL analyzes the data and translates them into petroleum-use reduction impacts, which are summarized in this 2014 Annual Metrics Report.

  15. Metrics correlation and analysis service (MCAS)

    SciTech Connect (OSTI)

    Baranovski, Andrew; Dykstra, Dave; Garzoglio, Gabriele; Hesselroth, Ted; Mhashilkar, Parag; Levshina, Tanya (Fermilab)

    2009-05-01

    The complexity of Grid workflow activities and their associated software stacks inevitably involves multiple organizations, ownership, and deployment domains. In this setting, important and common tasks such as the correlation and display of metrics and debugging information (fundamental ingredients of troubleshooting) are challenged by the informational entropy inherent to independently maintained and operated software components. Because such an information 'pond' is disorganized, it is a difficult environment for business intelligence analysis, i.e., troubleshooting, incident investigation, and trend spotting. The mission of the MCAS project is to deliver a software solution to help with adaptation, retrieval, correlation, and display of workflow-driven data and of type-agnostic events, generated by disjoint middleware.

  16. Development and evaluation of aperture-based complexity metrics using film and EPID measurements of static MLC openings

    SciTech Connect (OSTI)

    Götstedt, Julia; Karlsson Hauer, Anna; Bäck, Anna

    2015-07-15

    Purpose: Complexity metrics have been suggested as a complement to measurement-based quality assurance for intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT). However, these metrics have not yet been sufficiently validated. This study develops and evaluates new aperture-based complexity metrics in the context of static multileaf collimator (MLC) openings and compares them to previously published metrics. Methods: This study develops the converted aperture metric and the edge area metric. The converted aperture metric is based on small and irregular parts within the MLC opening that are quantified as measured distances between MLC leaves. The edge area metric is based on the relative size of the region around the edges defined by the MLC. Another metric suggested in this study is the circumference/area ratio. Earlier defined aperture-based complexity metrics—the modulation complexity score, the edge metric, the ratio monitor units (MU)/Gy, the aperture area, and the aperture irregularity—are compared to the newly proposed metrics. A set of small and irregular static MLC openings are created which simulate individual IMRT/VMAT control points of various complexities. These are measured with both an amorphous silicon electronic portal imaging device and EBT3 film. The differences between calculated and measured dose distributions are evaluated using a pixel-by-pixel comparison with two global dose difference criteria of 3% and 5%. The extent of the dose differences, expressed in terms of pass rate, is used as a measure of the complexity of the MLC openings and used for the evaluation of the metrics compared in this study. The different complexity scores are calculated for each created static MLC opening. The correlation between the calculated complexity scores and the extent of the dose differences (pass rate) are analyzed in scatter plots and using Pearson’s r-values. Results: The complexity scores calculated by the edge
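
    Of the metrics listed, the circumference/area ratio is simple enough to sketch for a static MLC opening described by per-leaf-pair positions. The geometry handling below assumes every listed leaf pair is open and the opening is contiguous; it is an illustration, not the authors' implementation.

    ```python
    # Circumference/area ratio (1/cm) of an MLC opening given left/right leaf positions.
    def circumference_area_ratio(left, right, leaf_width):
        """left/right: per-leaf-pair positions in cm; leaf_width: projected width in cm."""
        openings = [r - l for l, r in zip(left, right)]
        area = leaf_width * sum(openings)

        perimeter = openings[0] + openings[-1]          # top and bottom caps of the opening
        perimeter += 2 * leaf_width * len(openings)     # leaf-end segments on both banks
        for i in range(len(openings) - 1):              # steps between adjacent leaf pairs
            perimeter += abs(left[i] - left[i + 1]) + abs(right[i] - right[i + 1])
        return perimeter / area

    # A 10 cm x 10 cm opening from twenty 0.5 cm leaf pairs, all open from -5 to +5 cm:
    print(circumference_area_ratio([-5.0] * 20, [5.0] * 20, leaf_width=0.5))  # 0.4 per cm
    ```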

  17. DOE Announces Webinars on Solar Forecasting Metrics, the DOE...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    DOE Announces Webinars on Solar Forecasting Metrics, the DOE ... from adopting the latest energy efficiency and renewable ... to liquids technology, advantages of using natural gas, ...

  18. Integration of the EM Corporate QA Performance Metrics With Performanc...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Integration of the EM Corporate QA Performance Metrics With Performance Analysis Process ... Assessment Program Structure CPMS Integration with PA Process Validating The Process ...

  19. Exploration Cost and Time Metric | Open Energy Information

    Open Energy Info (EERE)

    Exploration Cost and Time Metric Screenshot References: Conference Paper1...

  20. Wave Energy Converter System Requirements and Performance Metrics

    Broader source: Energy.gov [DOE]

    The Energy Department and Wave Energy Scotland are holding a joint workshop on wave energy converter (WEC) system requirements and performance metrics on Friday, February 26.

  1. Conceptual Framework for Developing Resilience Metrics for the...

    Energy Savers [EERE]

    for the Electricity, Oil, and Gas Sectors in the United States (September 2015) Conceptual Framework for Developing Resilience Metrics for the Electricity, Oil, and Gas Sectors in ...

  2. Office of HC Strategy Budget and Performance Metrics (HC-50)

    Broader source: Energy.gov [DOE]

    The Office of Human Capital Strategy, Budget, and Performance Metrics provides strategic direction and advice to its stakeholders through the integration of budget analysis, workforce projections,...

  3. EM Corporate QA Performance Metrics | Department of Energy

    Broader source: Energy.gov (indexed) [DOE]

    QA Corporate Board Meeting - November 2008 Instructions for EM Corporate Performance Metrics FY 2015 SENIOR EXECUTIVE SERVICE (SES) AND SENIOR PROFESSIONAL (SP) PERFORMANCE ...

  4. Performance Metrics and Budget Division (HC-51) | Department...

    Broader source: Energy.gov (indexed) [DOE]

    of the Department of Energy's human capital initiatives and functions through the strategic integration of corporate human capital performance metrics and the budget ...

  5. Energy Department Sponsored Project Captures One Millionth Metric...

    Office of Environmental Management (EM)

    ... | Photo courtesy of Air Products and Chemicals Inc. Energy Department Project Captures and Stores more than One Million Metric Tons of CO2 Carbon Pollution Being Captured, ...

  6. Metrics for Evaluating Conventional and Renewable Energy Technologies (Presentation)

    SciTech Connect (OSTI)

    Mann, M. K.

    2013-01-01

    With numerous options for the future of natural gas, how do we know we're going down the right path? How do we designate a metric to measure and demonstrate change and progress, and how does that metric incorporate all stakeholders and scenarios?

  7. Self-benchmarking Guide for Cleanrooms: Metrics, Benchmarks, Actions

    SciTech Connect (OSTI)

    Mathew, Paul; Sartor, Dale; Tschudi, William

    2009-07-13

    This guide describes energy efficiency metrics and benchmarks that can be used to track the performance of and identify potential opportunities to reduce energy use in laboratory buildings. This guide is primarily intended for personnel who have responsibility for managing energy use in existing laboratory facilities - including facilities managers, energy managers, and their engineering consultants. Additionally, laboratory planners and designers may also use the metrics and benchmarks described in this guide for goal-setting in new construction or major renovation. This guide provides the following information: (1) A step-by-step outline of the benchmarking process. (2) A set of performance metrics for the whole building as well as individual systems. For each metric, the guide provides a definition, performance benchmarks, and potential actions that can be inferred from evaluating this metric. (3) A list and descriptions of the data required for computing the metrics. This guide is complemented by spreadsheet templates for data collection and for computing the benchmarking metrics. This guide builds on prior research supported by the national Laboratories for the 21st Century (Labs21) program, supported by the U.S. Department of Energy and the U.S. Environmental Protection Agency. Much of the benchmarking data are drawn from the Labs21 benchmarking database and technical guides. Additional benchmark data were obtained from engineering experts including laboratory designers and energy managers.

  8. Self-benchmarking Guide for Laboratory Buildings: Metrics, Benchmarks, Actions

    SciTech Connect (OSTI)

    Mathew, Paul; Greenberg, Steve; Sartor, Dale

    2009-07-13

    This guide describes energy efficiency metrics and benchmarks that can be used to track the performance of and identify potential opportunities to reduce energy use in laboratory buildings. This guide is primarily intended for personnel who have responsibility for managing energy use in existing laboratory facilities - including facilities managers, energy managers, and their engineering consultants. Additionally, laboratory planners and designers may also use the metrics and benchmarks described in this guide for goal-setting in new construction or major renovation. This guide provides the following information: (1) A step-by-step outline of the benchmarking process. (2) A set of performance metrics for the whole building as well as individual systems. For each metric, the guide provides a definition, performance benchmarks, and potential actions that can be inferred from evaluating this metric. (3) A list and descriptions of the data required for computing the metrics. This guide is complemented by spreadsheet templates for data collection and for computing the benchmarking metrics. This guide builds on prior research supported by the national Laboratories for the 21st Century (Labs21) program, supported by the U.S. Department of Energy and the U.S. Environmental Protection Agency. Much of the benchmarking data are drawn from the Labs21 benchmarking database and technical guides. Additional benchmark data were obtained from engineering experts including laboratory designers and energy managers.

  9. Self-benchmarking Guide for Data Centers: Metrics, Benchmarks, Actions

    SciTech Connect (OSTI)

    Mathew, Paul; Ganguly, Srirupa; Greenberg, Steve; Sartor, Dale

    2009-07-13

    This guide describes energy efficiency metrics and benchmarks that can be used to track the performance of and identify potential opportunities to reduce energy use in data centers. This guide is primarily intended for personnel who have responsibility for managing energy use in existing data centers - including facilities managers, energy managers, and their engineering consultants. Additionally, data center designers may also use the metrics and benchmarks described in this guide for goal-setting in new construction or major renovation. This guide provides the following information: (1) A step-by-step outline of the benchmarking process. (2) A set of performance metrics for the whole building as well as individual systems. For each metric, the guide provides a definition, performance benchmarks, and potential actions that can be inferred from evaluating this metric. (3) A list and descriptions of the data required for computing the metrics. This guide is complemented by spreadsheet templates for data collection and for computing the benchmarking metrics. This guide builds on prior data center benchmarking studies supported by the California Energy Commission. Much of the benchmarking data are drawn from the LBNL data center benchmarking database that was developed from these studies. Additional benchmark data were obtained from engineering experts including facility designers and energy managers. This guide also builds on recent research supported by the U.S. Department of Energy's Save Energy Now program.
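
    As one example of a whole-facility metric of the kind this guide benchmarks, the sketch below computes power usage effectiveness (PUE); the inputs are illustrative, and the guide itself defines the exact metrics, data requirements, and benchmark values.

    ```python
    # PUE = total facility energy / IT equipment energy over the same period (ideal is 1.0).
    def pue(total_facility_energy_kwh, it_equipment_energy_kwh):
        return total_facility_energy_kwh / it_equipment_energy_kwh

    print(f"PUE = {pue(total_facility_energy_kwh=5_200_000, it_equipment_energy_kwh=3_100_000):.2f}")
    ```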

  10. Measuring solar reflectance Part I: Defining a metric that accurately...

    Office of Scientific and Technical Information (OSTI)

    A widely used solar reflectance metric based on the ASTM Standard E891 beam-normal solar spectral irradiance underestimates the solar heat gain of a spectrally selective 'cool ...

  11. Microsoft Word - followup to Fin Risk Metrics workshop.doc

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    March 21, 2008 Purpose/Subject: Follow-up to Financial Risk Metrics Workshop Differences in Cash Flow between Net Billing and Direct Pay for Energy Northwest Attached...

  12. A Graph Analytic Metric for Mitigating Advanced Persistent Threat

    SciTech Connect (OSTI)

    Johnson, John R.; Hogan, Emilie A.

    2013-06-04

    This paper introduces a novel graph analytic metric that can be used to measure the potential vulnerability of a cyber network to specific types of attacks that use lateral movement and privilege escalation, such as the well-known Pass The Hash (PTH). The metric is computed from an oriented subgraph of the underlying cyber network induced by selecting only those edges for which a given property holds between the two vertices of the edge. The metric with respect to a select node on the subgraph is defined as the likelihood that the select node is reachable from another arbitrary node in the graph. This metric can be calculated dynamically from the authorization and auditing layers during the network security authorization phase and will potentially enable predictive deterrence against attacks such as PTH.
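
    A minimal sketch of the reachability idea described above, assuming the oriented subgraph is supplied as a list of directed edges; the function name and the choice of a reverse breadth-first search are illustrative, not taken from the paper.

        from collections import defaultdict, deque

        def reachability_metric(edges, target):
            # Fraction of the other vertices from which `target` can be reached
            # by following the oriented edges of the induced subgraph.
            nodes, reverse = set(), defaultdict(list)
            for u, v in edges:
                nodes.update((u, v))
                reverse[v].append(u)          # store edges backwards to walk toward sources
            seen, queue = {target}, deque([target])
            while queue:
                for u in reverse[queue.popleft()]:
                    if u not in seen:
                        seen.add(u)
                        queue.append(u)
            others = nodes - {target}
            return len(seen - {target}) / len(others) if others else 0.0

        # Example: C is reachable from A and B but not from D, so the metric is 2/3
        print(reachability_metric([("A", "B"), ("B", "C"), ("C", "D")], "C"))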

  13. Towards Efficient Supercomputing: Searching for the Right Efficiency Metric

    SciTech Connect (OSTI)

    Hsu, Chung-Hsing; Kuehn, Jeffery A; Poole, Stephen W

    2012-01-01

    The efficiency of supercomputing has traditionally been measured in execution time. In the early 2000s, the concept of total cost of ownership was re-introduced, broadening the efficiency measure to include aspects such as energy and space. Yet the supercomputing community has never agreed upon a metric that can cover these aspects altogether and also provide a fair basis for comparison. This paper examines the metrics that have been proposed in the past decade, and proposes a vector-valued metric for efficient supercomputing. Using this metric, the paper presents a study of where the supercomputing industry has been and how it stands today with respect to efficient supercomputing.
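
    As a toy illustration of keeping efficiency vector-valued rather than collapsing it to a single score, the sketch below compares two hypothetical systems component-wise; the particular components (time, energy, floor space) and the names are assumptions for illustration, not the metric proposed in the paper.

        from dataclasses import dataclass

        @dataclass(frozen=True)
        class EfficiencyVector:
            time_to_solution_s: float
            energy_kwh: float
            floor_space_m2: float

            def dominates(self, other):
                # A system is unambiguously more efficient only if it is no worse in every
                # component and strictly better in at least one; otherwise the comparison
                # is left unresolved instead of being forced through a scalar score.
                no_worse = (self.time_to_solution_s <= other.time_to_solution_s
                            and self.energy_kwh <= other.energy_kwh
                            and self.floor_space_m2 <= other.floor_space_m2)
                better = (self.time_to_solution_s < other.time_to_solution_s
                          or self.energy_kwh < other.energy_kwh
                          or self.floor_space_m2 < other.floor_space_m2)
                return no_worse and better

        a = EfficiencyVector(3600.0, 450.0, 40.0)
        b = EfficiencyVector(4000.0, 500.0, 40.0)
        print(a.dominates(b), b.dominates(a))   # True False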

  14. New IEC Specifications Help Define Wind Plant Performance Reporting Metrics

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    | Department of Energy IEC Specifications Help Define Wind Plant Performance Reporting Metrics New IEC Specifications Help Define Wind Plant Performance Reporting Metrics January 6, 2014 - 10:00am This is an excerpt from the Fourth Quarter 2013 edition of the Wind Program R&D Newsletter. The U.S. Department of Energy Wind Program and Sandia National Laboratories have been working with the International Electrotechnical Commission (IEC) Committee on wind turbine availability to

  15. Weatherization Assistance Program Goals and Metrics | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Goals and Metrics Weatherization Assistance Program Goals and Metrics UT - Bettelle - Oak Ridge National Laboratory Logo The U.S. Department of Energy (DOE) Weatherization Assistance Program (WAP) regularly reviews the work of states and grant recipients for effectiveness and for meeting program goals. DOE's Oak Ridge National Laboratory provides technical support to the program and conducts the evaluations. Goals The overall goal of WAP is to reduce the burden of energy prices on the

  16. Conceptual Framework for Developing Resilience Metrics for the Electricity,

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Oil, and Gas Sectors in the United States (September 2015) | Department of Energy Conceptual Framework for Developing Resilience Metrics for the Electricity, Oil, and Gas Sectors in the United States (September 2015) Conceptual Framework for Developing Resilience Metrics for the Electricity, Oil, and Gas Sectors in the United States (September 2015) This report has been written for the Department of Energy's Office of Electricity Delivery and Energy Reliability to support the Office of

  17. Enclosure - FY 2015 Q4 Metrics Report 2015-11-02.xlsx

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Fourth Quarter Overall Root Cause Analysis (RCA)/Corrective Action Plan (CAP) Performance Metrics No. Contract/Project Management Performance Metrics FY 2015 Target Comment No. 2 3 ...

  18. Microsoft Word - 2014-5-27 RCA Qtr 2 Metrics Attachment_R1

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Second Quarter Overall Root Cause Analysis (RCA)/Corrective Action Plan (CAP) Performance Metrics 1 Contract/Project Management Performance Metric FY 2014 Target FY 2014 Projected ...

  19. The International Safeguards Technology Base: How is the Patient Doing? An Exploration of Effective Metrics

    SciTech Connect (OSTI)

    Schanfein, Mark J; Gouveia, Fernando S

    2010-07-01

    The term “Technology Base” is commonly used but what does it mean? Is there a common understanding of the components that comprise a technology base? Does a formal process exist to assess the health of a given technology base? These are important questions the relevance of which is even more pressing given the USDOE/NNSA initiatives to strengthen the safeguards technology base through investments in research & development and human capital development. Accordingly, the authors will establish a high-level framework to define and understand what comprises a technology base. Potential goal-driven metrics to assess the health of a technology base will also be explored, such as linear demographics and resource availability, in the hope that they can be used to better understand and improve the health of the U.S. safeguards technology base. Finally, through the identification of such metrics, the authors will offer suggestions and highlight choices for addressing potential shortfalls.

  20. Metrics Evolution in an Energy Research & Development Program

    SciTech Connect (OSTI)

    Brent Dixon

    2011-08-01

    All technology programs progress through three phases: Discovery, Definition, and Deployment. The form and application of program metrics needs to evolve with each phase. During the discovery phase, the program determines what is achievable. A set of tools is needed to define program goals, to analyze credible technical options, and to ensure that the options are compatible and meet the program objectives. A metrics system that scores the potential performance of technical options is part of this system of tools, supporting screening of concepts and aiding in the overall definition of objectives. During the definition phase, the program defines what specifically is wanted. What is achievable is translated into specific systems and specific technical options are selected and optimized. A metrics system can help with the identification of options for optimization and the selection of the option for deployment. During the deployment phase, the program shows that the selected system works. Demonstration projects are established and classical systems engineering is employed. During this phase, the metrics communicate system performance. This paper discusses an approach to metrics evolution within the Department of Energy's Nuclear Fuel Cycle R&D Program, which is working to improve the sustainability of nuclear energy.

  1. Non-minimal derivative couplings of the composite metric

    SciTech Connect (OSTI)

    Heisenberg, Lavinia

    2015-11-04

    In the context of massive gravity, bi-gravity and multi-gravity non-minimal matter couplings via a specific composite effective metric were investigated recently. Even if these couplings generically reintroduce the Boulware-Deser ghost, this composite metric is unique in the sense that the ghost reemerges only beyond the decoupling limit and the matter quantum loop corrections do not detune the potential interactions. We consider non-minimal derivative couplings of the composite metric to matter fields for a specific subclass of Horndeski scalar-tensor interactions. We first explore these couplings in the mini-superspace and investigate in which scenario the ghost remains absent. We further study these non-minimal derivative couplings in the decoupling-limit of the theory and show that the equation of motion for the helicity-0 mode remains second order in derivatives. Finally, we discuss preliminary implications for cosmology.
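
    For orientation only: the composite effective metric referred to here is commonly written in the massive-gravity literature in the form below (the abstract itself does not display it, so take this as background rather than as the paper's notation):

        g^{\mathrm{eff}}_{\mu\nu} = \alpha^{2} g_{\mu\nu} + 2\alpha\beta\, g_{\mu\rho} X^{\rho}{}_{\nu} + \beta^{2} f_{\mu\nu},
        \qquad X^{\mu}{}_{\nu} = \big(\sqrt{g^{-1} f}\,\big)^{\mu}{}_{\nu},

    with g the dynamical metric, f the second (reference) metric, and alpha, beta constant parameters.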

  2. Stochastic inverse problems: Models and metrics

    SciTech Connect (OSTI)

    Sabbagh, Elias H.; Sabbagh, Harold A.; Murphy, R. Kim; Aldrin, John C.; Annis, Charles; Knopp, Jeremy S.

    2015-03-31

    In past work, we introduced model-based inverse methods, and applied them to problems in which the anomaly could be reasonably modeled by simple canonical shapes, such as rectangular solids. In these cases the parameters to be inverted would be length, width and height, as well as the occasional probe lift-off or rotation. We are now developing a formulation that allows more flexibility in modeling complex flaws. The idea consists of expanding the flaw in a sequence of basis functions, and then solving for the expansion coefficients of this sequence, which are modeled as independent random variables, uniformly distributed over their range of values. There are a number of applications of such modeling: 1. Connected cracks and multiple half-moons, which we have noted in a POD set. Ideally we would like to distinguish connected cracks from one long shallow crack. 2. Cracks of irregular profile and shape which have appeared in cold work holes during bolt-hole eddy-current inspection. One side of such cracks is much deeper than other. 3. L or C shaped crack profiles at the surface, examples of which have been seen in bolt-hole cracks. By formulating problems in a stochastic sense, we are able to leverage the stochastic global optimization algorithms in NLSE, which is resident in VIC-3D®, to answer questions of global minimization and to compute confidence bounds using the sensitivity coefficient that we get from NLSE. We will also address the issue of surrogate functions which are used during the inversion process, and how they contribute to the quality of the estimation of the bounds.

  3. Primer Control System Cyber Security Framework and Technical Metrics

    SciTech Connect (OSTI)

    Wayne F. Boyer; Miles A. McQueen

    2008-05-01

    The Department of Homeland Security National Cyber Security Division supported development of a control system cyber security framework and a set of technical metrics to aid owner-operators in tracking control systems security. The framework defines seven relevant cyber security dimensions and provides the foundation for thinking about control system security. Based on the developed security framework, a set of ten technical metrics are recommended that allow control systems owner-operators to track improvements or degradations in their individual control systems security posture.

  4. Calabi-Yau metrics for quotients and complete intersections

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Braun, Volker; Brelidze, Tamaz; Douglas, Michael R.; Ovrut, Burt A.

    2008-05-22

    We extend previous computations of Calabi-Yau metrics on projective hypersurfaces to free quotients, complete intersections, and free quotients of complete intersections. In particular, we construct these metrics on generic quintics, four-generation quotients of the quintic, Schoen Calabi-Yau complete intersections and the quotient of a Schoen manifold with Z₃ x Z₃ fundamental group that was previously used to construct a heterotic standard model. Various numerical investigations into the dependence of Donaldson's algorithm on the integration scheme, as well as on the Kähler and complex structure moduli, are also performed.

  5. SU-E-T-359: Measurement of Various Metrics to Determine Changes in Megavoltage Photon Beam Energy

    SciTech Connect (OSTI)

    Gao, S; Balter, P; Rose, M; Simon, W

    2014-06-01

    Purpose: To examine the relationship between photon beam energy and various metrics for energy on the flattened and flattening filter free (FFF) beams generated by the Varian TrueBeam. Methods: Energy changes were accomplished by adjusting the bending magnet current 10% from the nominal value for the 4, 6, 8, and 10 MV flattened and 6 and 10 MV FFF beams. Profiles were measured for a 30×30 cm{sup 2} field using a 2D ionization chamber array and a 3D water scanner which was also used to measure PDDs. For flattened beams we compared several energy metrics: PDD at 10 cm depth in water (PDD(10)); the variation over the central 80% of the field (Flat); and the average of the highest reading along each diagonal divided by the CAX value, diagonal normalized flatness (FDN). For FFF beams we examined PDD(10), FDN, and the width of a chosen isodose level in a 30×30 cm{sup 2} field (W(d%)). Results: Changes in PDD(10) were nearly linear with changes in energy for both flattened and FFF beams as were changes in FDN. Changes in W(d%) were also nearly linear with energy for the FFF beams. PDD(10) was not as sensitive to changes in energy compared to the other metrics for either flattened or FFF beams. Flat was not as sensitive to changes in energy compared to FDN for flattened beams and its behavior depends on depth. FDN was the metric that had the highest sensitivity to the changes in energy for flattened beams while W(d%) was the metric that had the highest sensitivity to the changes in energy for FFF beams. Conclusions: The metric FDN was found to be most sensitive to energy changes for flattened beams, while W(d%) was most sensitive to energy changes for FFF beams.
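
    The sketch below shows one way to compute the diagonal normalized flatness (FDN) described above from a square 2D profile array; treat the array handling and the exact normalization details as assumptions of this note rather than the authors' implementation.

        import numpy as np

        def diagonal_normalized_flatness(profile):
            # Average of the highest reading along each of the two diagonals,
            # divided by the central-axis (CAX) reading of the same profile.
            p = np.asarray(profile, dtype=float)
            cax = p[p.shape[0] // 2, p.shape[1] // 2]
            d1_max = np.diagonal(p).max()
            d2_max = np.diagonal(np.fliplr(p)).max()
            return 0.5 * (d1_max + d2_max) / cax

        # Example: a flat 5x5 profile with slightly "horned" corners
        profile = np.ones((5, 5))
        profile[0, 0] = profile[0, 4] = profile[4, 0] = profile[4, 4] = 1.03
        print(diagonal_normalized_flatness(profile))   # 1.03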

  6. Deep Energy Retrofit Performance Metric Comparison: Eight California Case Studies

    SciTech Connect (OSTI)

    Walker, Iain; Fisher, Jeremy; Less, Brennan

    2014-06-01

    In this paper we will present the results of monitored annual energy use data from eight residential Deep Energy Retrofit (DER) case studies using a variety of performance metrics. For each home, the details of the retrofits were analyzed, diagnostic tests to characterize the home were performed and the homes were monitored for total and individual end-use energy consumption for approximately one year. Annual performance in site and source energy, as well as carbon dioxide equivalent (CO2e) emissions were determined on a per house, per person and per square foot basis to examine the sensitivity to these different metrics. All eight DERs showed consistent success in achieving substantial site energy and CO2e reductions, but some projects achieved very little, if any source energy reduction. This problem emerged in those homes that switched from natural gas to electricity for heating and hot water, resulting in energy consumption dominated by electricity use. This demonstrates the crucial importance of selecting an appropriate metric to be used in guiding retrofit decisions. Also, due to the dynamic nature of DERs, with changes in occupancy, size, layout, and comfort, several performance metrics might be necessary to understand a project’s success.
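
    To make the normalization point concrete, the sketch below converts one hypothetical home's annual utility data into site, source, and CO2e figures on per-house, per-person, and per-square-foot bases; every input value and conversion factor here is a placeholder assumption, not data from the eight case studies.

        # Hypothetical post-retrofit annual data for one home
        electric_kwh, gas_therms = 6000.0, 250.0
        occupants, floor_area_sqft = 3, 1800.0

        # Placeholder conversion factors (site-to-source multipliers and emission factors)
        SRC_ELEC, SRC_GAS = 3.15, 1.09            # source energy per unit of site energy
        CO2E_KWH, CO2E_THERM = 0.4, 5.3           # kg CO2e per kWh and per therm

        site_kbtu = electric_kwh * 3.412 + gas_therms * 100.0
        source_kbtu = electric_kwh * 3.412 * SRC_ELEC + gas_therms * 100.0 * SRC_GAS
        co2e_kg = electric_kwh * CO2E_KWH + gas_therms * CO2E_THERM

        metrics = {
            "site_kbtu_per_sqft": site_kbtu / floor_area_sqft,
            "source_kbtu_per_person": source_kbtu / occupants,
            "co2e_kg_per_house": co2e_kg,
        }
        print(metrics)

    Because electricity and natural gas carry very different source-energy and emission factors, a fuel-switching retrofit can move the site, source, and CO2e figures in different directions, which is exactly the sensitivity the study highlights.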

  7. EERE Portfolio. Primary Benefits Metrics for FY09

    SciTech Connect (OSTI)

    none,

    2011-11-01

    This collection of data tables shows the benefits metrics related to energy security, environmental impacts, and economic impacts for both the entire EERE portfolio of renewable energy technologies as well as the individual technologies. Data are presented for the years 2015, 2020, 2030, and 2050, for both the NEMS and MARKAL models.

  8. Comparing Resource Adequacy Metrics and Their Influence on Capacity Value: Preprint

    SciTech Connect (OSTI)

    Ibanez, E.; Milligan, M.

    2014-04-01

    Traditional probabilistic methods have been used to evaluate resource adequacy. The increasing presence of variable renewable generation in power systems presents a challenge to these methods because, unlike thermal units, variable renewable generation levels change over time because they are driven by meteorological events. Thus, capacity value calculations for these resources are often performed according to simple rules of thumb. This paper follows the recommendations of the North American Electric Reliability Corporation's Integration of Variable Generation Task Force to include variable generation in the calculation of resource adequacy and compares different reliability metrics. Examples are provided using the Western Interconnection footprint under different variable generation penetrations.
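
    As a rough sketch of the kind of probabilistic adequacy calculation being compared, the snippet below counts loss-of-load hours from hourly load, firm capacity, and variable-generation samples; the numbers and the simple counting approach are illustrative assumptions, not the task force methodology.

        import numpy as np

        rng = np.random.default_rng(42)
        hours = 8760
        load = 800.0 + 200.0 * rng.random(hours)     # hypothetical hourly load (MW)
        firm_capacity = 950.0                        # thermal capacity treated as always available (MW)
        wind = 150.0 * rng.random(hours)             # hypothetical variable-generation output (MW)

        # Count hours in which load exceeds available capacity (a loss-of-load-hours style metric)
        shortfall_hours = int(np.sum(load > firm_capacity + wind))
        print(shortfall_hours)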

  9. Methodology, Methods, and Metrics for Testing and Evaluating Augmented Cognition Systems

    SciTech Connect (OSTI)

    Greitzer, Frank L.

    2008-09-15

    The augmented cognition research community seeks cognitive neuroscience-based solutions to improve warfighter performance by applying and managing mitigation strategies to reduce workload and improve the throughput and quality of decisions. The focus of augmented cognition mitigation research is to define, demonstrate, and exploit neuroscience and behavioral measures that support inferences about the warfighters cognitive state that prescribe the nature and timing of mitigation. A research challenge is to develop valid evaluation methodologies, metrics and measures to assess the impact of augmented cognition mitigations. Two considerations are external validity, which is the extent to which the results apply to operational contexts; and internal validity, which reflects the reliability of performance measures and the conclusions based on analysis of results. The scientific rigor of the research methodology employed in conducting empirical investigations largely affects the validity of the findings. External validity requirements also compel us to demonstrate operational significance of mitigations. Thus it is important to demonstrate effectiveness of mitigations under specific conditions. This chapter reviews some cognitive science and methodological considerations in designing augmented cognition research studies and associated human performance metrics and analysis methods to assess the impact of augmented cognition mitigations.

  10. Data quality objectives for TWRS privatization phase 1: confirm tank T is an appropriate feed source for high-level waste feed batch X

    SciTech Connect (OSTI)

    NGUYEN, D.M.

    1999-06-01

    The U.S. Department of Energy-Richland Operations Office (DOE-RL) has initiated Phase 1 of a two-phase privatization strategy for treatment and immobilization of high-level waste (HLW) that is currently managed by the Hanford Tank Waste Remediation System (TWRS) Project. In this strategy, DOE will purchase services from a contractor-owned and operated facility under a fixed price. The Phase 1 TWRS privatization contract requires that the Project Hanford Management Contract (PHMC) contractors, on behalf of DOE, deliver HLW feed in specified quantities and composition to the Privatization Contractor in a timely manner (DOE-RL 1996). Additional requirements are imposed by the interface control document (ICD) for HLW feed (PHMC 1997). In response to these requirements, the Tank Waste Remediation System Operation and Utilization Plan (TWRSO and UP) (Kirkbride et al. 1997) was prepared by the PHMC. The TWRSO and UP, as updated by the Readiness-To-Proceed (RTP) deliverable (Payne et al. 1998), establishes the baseline operating scenario for the delivery of HLW feed to the Privatization Contractor. The scenario specifies tanks from which HLW will be provided for each feed batch, the operational activities needed to prepare and deliver each batch, and the timing of these activities. The operating scenario was developed based on current knowledge of waste composition and chemistry, waste transfer methods, and operating constraints such as tank farm logistics and availability of tank space. A project master baseline schedule (PMBS) has been developed to implement the operating scenario. The PMBS also includes activities aimed at reducing programmatic risks. One of the activities, ''Confirm Tank TI is Acceptable for Feed,'' was identified to verify the basis used to develop the scenario Additional data on waste quantity, physical and chemical characteristics, and transfer properties will be needed to support this activity. This document describes the data quality objective

  11. On the existence of certain axisymmetric interior metrics

    SciTech Connect (OSTI)

    Angulo Santacruz, C.; Batic, D.; Nowakowski, M.

    2010-08-15

    One of the effects of noncommutative coordinate operators is that the delta function connected to the quantum mechanical amplitude between states sharp to the position operator gets smeared by a Gaussian distribution. Although this is not the full account of the effects of noncommutativity, this effect is, in particular, important as it removes the point singularities of Schwarzschild and Reissner-Nordstroem solutions. In this context, it seems to be of some importance to probe also into ringlike singularities which appear in the Kerr case. In particular, starting with an anisotropic energy-momentum tensor and a general axisymmetric ansatz of the metric together with an arbitrary mass distribution (e.g., Gaussian), we derive the full set of Einstein equations that the noncommutative geometry inspired Kerr solution should satisfy. Using these equations we prove two theorems regarding the existence of certain Kerr metrics inspired by noncommutative geometry.

  12. Modified Anti-de-Sitter Metric, Light-Front Quantized QCD, and...

    Office of Scientific and Technical Information (OSTI)

    Modified Anti-de-Sitter Metric, Light-Front Quantized QCD, and Conformal Quantum Mechanics Citation Details In-Document Search Title: Modified Anti-de-Sitter Metric, Light-Front...

  13. DOE to Remove 200 Metric Tons of Highly Enriched Uranium from...

    Energy Savers [EERE]

    200 Metric Tons of Highly Enriched Uranium from U.S. Nuclear Weapons Stockpile DOE to Remove 200 Metric Tons of Highly Enriched Uranium from U.S. Nuclear Weapons Stockpile ...

  14. Microsoft Word - 2014-1-1 RCA Qtr 1 Metrics Attachment_R1

    Energy Savers [EERE]

    Contract/Project Management Performance Metric FY 2014 Target FY 2014 Projected FY 2014 ... Contract/Project Management Performance Metrics FY 2014 Target FY 2014 1st Qtr Actual ...

  15. Optimal recovery of linear operators in non-Euclidean metrics

    SciTech Connect (OSTI)

    Osipenko, K Yu

    2014-10-31

    The paper looks at problems concerning the recovery of operators from noisy information in non-Euclidean metrics. A number of general theorems are proved and applied to recovery problems for functions and their derivatives from the noisy Fourier transform. In some cases, a family of optimal methods is found, from which the methods requiring the least amount of original information are singled out. Bibliography: 25 titles.

  16. Microsoft Word - DOE_ANNUAL_METRICS_2009Q3.docx

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    14404 Third Quarter 2009 Modeling Program Metric: Coupled model comparison with observations using improved dynamics at coarse resolution Quantifying the impact of a finite volume dynamical core in CCSM3 on simulated precipitation over major catchment areas July 2009 Peter J. Gleckler and Karl E. Taylor Lawrence Livermore National Laboratory Livermore, CA Work supported by the U.S. Department of Energy, Office of Science, Office of Biological and Environmental Research

  17. Guidebook for ARRA Smart Grid Program Metrics and Benefits | Department of

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Energy Guidebook for ARRA Smart Grid Program Metrics and Benefits The Guidebook for American Recovery and Reinvestment Act (ARRA) Smart Grid Program Metrics and Benefits describes the type of information to be collected from each of the Project Teams and how it will be used by the Department of Energy to communicate overall conclusions to the public. Guidebook for ARRA Smart Grid Program Metrics and Benefits (975.03 KB)

  18. Method and system for assigning a confidence metric for automated determination of optic disc location

    DOE Patents [OSTI]

    Karnowski, Thomas P.; Tobin, Jr., Kenneth W.; Muthusamy Govindasamy, Vijaya Priya; Chaum, Edward

    2012-07-10

    A method for assigning a confidence metric for automated determination of optic disc location that includes analyzing a retinal image and determining at least two sets of coordinates locating an optic disc in the retinal image. The sets of coordinates can be determined using first and second image analysis techniques that are different from one another. An accuracy parameter can be calculated and compared to a primary risk cut-off value. A high confidence level can be assigned to the retinal image if the accuracy parameter is less than the primary risk cut-off value and a low confidence level can be assigned to the retinal image if the accuracy parameter is greater than the primary risk cut-off value. The primary risk cut-off value being selected to represent an acceptable risk of misdiagnosis of a disease having retinal manifestations by the automated technique.
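
    A minimal sketch of the decision logic described in the abstract; the specific accuracy parameter used here (the distance between the two coordinate estimates) and the threshold value are assumptions for illustration, since the record does not define them.

        import math

        def assign_confidence(coords_a, coords_b, primary_risk_cutoff_px=25.0):
            # coords_a and coords_b are (x, y) optic-disc locations produced by two
            # different image-analysis techniques applied to the same retinal image.
            accuracy = math.dist(coords_a, coords_b)
            return "high" if accuracy < primary_risk_cutoff_px else "low"

        print(assign_confidence((312, 240), (318, 236)))   # "high": the two methods agree closely
        print(assign_confidence((312, 240), (120, 410)))   # "low": disagreement flags the image for review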

  19. Assessing the Effects of Data Compression in Simulations Using Physically Motivated Metrics

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Laney, Daniel; Langer, Steven; Weber, Christopher; Lindstrom, Peter; Wegener, Al

    2014-01-01

    This paper examines whether lossy compression can be used effectively in physics simulations as a possible strategy to combat the expected data-movement bottleneck in future high performance computing architectures. We show that, for the codes and simulations we tested, compression levels of 3–5X can be applied without causing significant changes to important physical quantities. Rather than applying signal processing error metrics, we utilize physics-based metrics appropriate for each code to assess the impact of compression. We evaluate three different simulation codes: a Lagrangian shock-hydrodynamics code, an Eulerian higher-order hydrodynamics turbulence modeling code, and an Eulerian coupled laser-plasma interaction code. We compress relevant quantities after each time-step to approximate the effects of tightly coupled compression and study the compression rates to estimate memory and disk-bandwidth reduction. We find that the error characteristics of compression algorithms must be carefully considered in the context of the underlying physics being modeled.
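
    The idea of a physics-based (rather than signal-processing) error metric can be sketched as follows: compare an aggregate physical quantity before and after lossy compression. The conserved-sum choice below is a generic stand-in, not one of the code-specific metrics used in the paper.

        import numpy as np

        def relative_change_in_total(field_before, field_after):
            # Relative change in the domain total of a physically meaningful quantity
            # (e.g., mass or energy density summed over cells), rather than an L2 error.
            before = float(np.sum(field_before))
            after = float(np.sum(field_after))
            return abs(after - before) / abs(before)

        rng = np.random.default_rng(1)
        density = rng.random((64, 64))
        compressed = np.round(density, 2)            # crude stand-in for a lossy compressor
        print(relative_change_in_total(density, compressed))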

  20. En route to Background Independence: Broken split-symmetry, and how to restore it with bi-metric average actions

    SciTech Connect (OSTI)

    Becker, D.; Reuter, M.

    2014-11-15

    The most momentous requirement a quantum theory of gravity must satisfy is Background Independence, necessitating in particular an ab initio derivation of the arena all non-gravitational physics takes place in, namely spacetime. Using the background field technique, this requirement translates into the condition of an unbroken split-symmetry connecting the (quantized) metric fluctuations to the (classical) background metric. If the regularization scheme used violates split-symmetry during the quantization process it is mandatory to restore it in the end at the level of observable physics. In this paper we present a detailed investigation of split-symmetry breaking and restoration within the Effective Average Action (EAA) approach to Quantum Einstein Gravity (QEG) with a special emphasis on the Asymptotic Safety conjecture. In particular we demonstrate for the first time in a non-trivial setting that the two key requirements of Background Independence and Asymptotic Safety can be satisfied simultaneously. Carefully disentangling fluctuation and background fields, we employ a ‘bi-metric’ ansatz for the EAA and project the flow generated by its functional renormalization group equation on a truncated theory space spanned by two separate Einstein–Hilbert actions for the dynamical and the background metric, respectively. A new powerful method is used to derive the corresponding renormalization group (RG) equations for the Newton- and cosmological constant, both in the dynamical and the background sector. We classify and analyze their solutions in detail, determine their fixed point structure, and identify an attractor mechanism which turns out instrumental in the split-symmetry restoration. We show that there exists a subset of RG trajectories which are both asymptotically safe and split-symmetry restoring: In the ultraviolet they emanate from a non-Gaussian fixed point, and in the infrared they loose all symmetry violating contributions inflicted on them by the

  1. Metrics for the National SCADA Test Bed Program

    SciTech Connect (OSTI)

    Craig, Philip A.; Mortensen, J.; Dagle, Jeffery E.

    2008-12-05

    The U.S. Department of Energy Office of Electricity Delivery and Energy Reliability (DOE-OE) National SCADA Test Bed (NSTB) Program is providing valuable inputs into the electric industry by performing topical research and development (R&D) to secure next generation and legacy control systems. In addition, the program conducts vulnerability and risk analysis, develops tools, and performs industry liaison, outreach and awareness activities. These activities will enhance the secure and reliable delivery of energy for the United States. This report will describe metrics that could be utilized to provide feedback to help enhance the effectiveness of the NSTB Program.

  2. User's Guide to the Energy Charting and Metrics Tool (ECAM)

    SciTech Connect (OSTI)

    Taasevigen, Danny J.; Koran, William

    2012-02-28

    The intent of this user guide is to provide a brief description of the functionality of the Energy Charting and Metrics (ECAM) tool, including the expanded building re-tuning functionality developed for Pacific Northwest National laboratory (PNNL). This document describes the tool's general functions and features, and offers detailed instructions for PNNL building re-tuning charts, a feature in ECAM intended to help building owners and operators look at trend data (recommended 15-minute time intervals) in a series of charts (both time series and scatter) to analyze air-handler, zone, and central plant information gathered from a building automation system (BAS).

  3. Water Quality

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Water Quality Water Quality We protect water quality through stormwater control measures and an extensive network of monitoring wells and stations encompassing groundwater, surface...

  5. Career Map: Quality Engineer | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Career Map: Quality Engineer. Position Title: Quality Engineer. Alternate Title(s): Quality Assurance, Quality Control. Education & Training Level: Advanced; bachelor's required, prefer graduate degree or equivalent experience. Education & Training Level Description: Quality engineers need a bachelor's degree in an engineering field, plus experience. Professional certifications may be

  6. Measurement Practices for Reliability and Power Quality

    SciTech Connect (OSTI)

    Kueck, JD

    2005-05-06

    This report provides a distribution reliability measurement ''toolkit'' that is intended to be an asset to regulators, utilities and power users. The metrics and standards discussed range from simple reliability, to power quality, to the new blend of reliability and power quality analysis that is now developing. This report was sponsored by the Office of Electric Transmission and Distribution, U.S. Department of Energy (DOE). Inconsistencies presently exist in commonly agreed-upon practices for measuring the reliability of the distribution systems. However, efforts are being made by a number of organizations to develop solutions. In addition, there is growing interest in methods or standards for measuring power quality, and in defining power quality levels that are acceptable to various industries or user groups. The problems and solutions vary widely among geographic areas and among large investor-owned utilities, rural cooperatives, and municipal utilities; but there is still a great degree of commonality. Industry organizations such as the National Rural Electric Cooperative Association (NRECA), the Electric Power Research Institute (EPRI), the American Public Power Association (APPA), and the Institute of Electrical and Electronics Engineers (IEEE) have made tremendous strides in preparing self-assessment templates, optimization guides, diagnostic techniques, and better definitions of reliability and power quality measures. In addition, public utility commissions have developed codes and methods for assessing performance that consider local needs. There is considerable overlap among these various organizations, and we see real opportunity and value in sharing these methods, guides, and standards in this report. This report provides a ''toolkit'' containing synopses of noteworthy reliability measurement practices. The toolkit has been developed to address the interests of three groups: electric power users, utilities, and regulators. The report will also serve

  7. CEM_Metrics_and_Technical_Note_7_14_10.pdf | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    CEM_Metrics_and_Technical_Note_7_14_10.pdf CEM_Metrics_and_Technical_Note_7_14_10.pdf (129.47 KB) More Documents & Publications SEAD-Fact-Sheet.pdf Schematics of a heat pump clothes dryer - Credit: Oak Ridge National Lab Heat Pump Clothes Dryer CEM_Metrics_and_Technical_Note_7_14_10.pdf Wind Vision: A New Era for Wind Power in the United States

  8. EECBG 10-07C/SEP 10-006B Attachment 1: Process Metrics List |

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Department of Energy 10-07C/SEP 10-006B Attachment 1: Process Metrics List EECBG 10-07C/SEP 10-006B Attachment 1: Process Metrics List eecbg_sep_reporting_guidance_attachment_06242011.pdf (56.65 KB) More Documents & Publications EECBG SEP Attachment 1 - Process metric list EECBG Program Notice 10-07A DOE Recovery Act Reporting Requirements for the State Energy Program

  9. DOE Will Dispose of 34 Metric Tons of Plutonium by Turning it into Fuel for

    National Nuclear Security Administration (NNSA)

    Civilian Reactors | National Nuclear Security Administration (NNSA) Washington, DC Secretary Abraham announced that DOE will dispose of 34 metric tons of surplus weapons grade plutonium by turning the material into mixed oxide fuel (MOX) for use in nuclear reactors. The decision follows an exhaustive Administration review

  10. Variable-metric diffraction crystals for x-ray optics

    SciTech Connect (OSTI)

    Smither, R.K.; Fernandez, P.B.

    1992-02-01

    A variable-metric (VM) crystal is one in which the spacing between the crystalline planes changes with position in the crystal. This variation can be either parallel to the crystalline planes or perpendicular to the crystalline planes of interest and can be produced by either introducing a thermal gradient in the crystal or by growing a crystal made of two or more elements and changing the relative percentages of the two elements as the crystal is grown. A series of experiments were performed in the laboratory to demonstrate the principle of the variable-metric crystal and its potential use in synchrotron beam lines. One of the most useful applications of the VM crystal is to increase the number of photons per unit bandwidth in a diffracted beam without losing any of the overall intensity. In a normal synchrotron beam line that uses a two-crystal monochromator, the bandwidth of the diffracted photon beam is determined by the vertical opening angle of the beam which is typically 0.10--0.30 mrad or 20--60 arcsec. When the VM crystal approach is applied, the bandwidth of the beam can be made as narrow as the rocking curve of the diffracting crystal, which is typically 0.005--0.050 mrad or 1--10 arcsec. Thus a very large increase of photons per unit bandwidth (or per unit energy) can be achieved through the use of VM crystals. When the VM principle is used with bent crystals, new kinds of x-ray optical elements can be generated that can focus and defocus x-ray beams much like simple lenses where the focal length of the lens can be changed to match its application. Thus both large magnifications and large demagnifications can be achieved as well as parallel beams with narrow bandwidths.
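
    The underlying relation is Bragg's law: at a fixed diffraction angle, the diffracted photon energy tracks the local plane spacing d, so a controlled gradient in d across the crystal produces a controlled gradient in the diffracted energy. The equation below is standard background, not taken from the article:

        n\lambda = 2 d \sin\theta \quad\Longrightarrow\quad E = \frac{n h c}{2 d \sin\theta}.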

  11. Metrics for Assessment of Smart Grid Data Integrity Attacks

    SciTech Connect (OSTI)

    Annarita Giani; Miles McQueen; Russell Bent; Kameshwar Poolla; Mark Hinrichs

    2012-07-01

    There is an emerging consensus that the nation’s electricity grid is vulnerable to cyber attacks. This vulnerability arises from the increasing reliance on using remote measurements, transmitting them over legacy data networks to system operators who make critical decisions based on available data. Data integrity attacks are a class of cyber attacks that involve a compromise of information that is processed by the grid operator. This information can include meter readings of injected power at remote generators, power flows on transmission lines, and relay states. These data integrity attacks have consequences only when the system operator responds to compromised data by redispatching generation under normal or contingency protocols. These consequences include (a) financial losses from sub-optimal economic dispatch to service loads, (b) robustness/resiliency losses from placing the grid at operating points that are at greater risk from contingencies, and (c) systemic losses resulting from cascading failures induced by poor operational choices. This paper is focused on understanding the connections between grid operational procedures and cyber attacks. We first offer two examples to illustrate how data integrity attacks can cause economic and physical damage by misleading operators into taking inappropriate decisions. We then focus on unobservable data integrity attacks involving power meter data. These are coordinated attacks where the compromised data are consistent with the physics of power flow, and are therefore passed by any bad data detection algorithm. We develop metrics to assess the economic impact of these attacks under re-dispatch decisions using optimal power flow methods. These metrics can be used to prioritize the adoption of appropriate countermeasures including PMU placement, encryption, hardware upgrades, and advanced attack detection algorithms.
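
    A toy version of the economic-impact idea, using a merit-order dispatch in place of the optimal power flow methods used in the paper; the generator fleet, prices, and the spoofed load value are hypothetical.

        def dispatch_cost(load_mw, units):
            # units: list of (capacity_MW, marginal_cost_per_MWh), dispatched cheapest first
            cost, remaining = 0.0, load_mw
            for capacity, price in sorted(units, key=lambda u: u[1]):
                served = min(capacity, remaining)
                cost += served * price
                remaining -= served
                if remaining <= 0:
                    break
            return cost

        fleet = [(200.0, 20.0), (150.0, 35.0), (100.0, 80.0)]
        true_load, spoofed_load = 300.0, 360.0    # MW: actual demand vs. what compromised meters report

        # Economic-impact metric: extra dispatch cost incurred by responding to falsified data
        impact = dispatch_cost(spoofed_load, fleet) - dispatch_cost(true_load, fleet)
        print(impact)   # 2550.0: 50 MW more mid-cost and 10 MW of peaking generation dispatched unnecessarily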

  12. Building Cost and Performance Metrics: Data Collection Protocol, Revision 1.0

    SciTech Connect (OSTI)

    Fowler, Kimberly M.; Solana, Amy E.; Spees, Kathleen L.

    2005-09-29

    This technical report describes the process for selecting and applying the building cost and performance metrics for measuring sustainably designed buildings in comparison to traditionally designed buildings.

  13. EVMS Training Snippet: 3.2 Schedule Health Metrics | Department of Energy

    Office of Environmental Management (EM)

    2 Schedule Health Metrics EVMS Training Snippet: 3.2 Schedule Health Metrics This EVMS Training Snippet, sponsored by the Office of Project Management (PM), focuses on 'what' the metrics are, 'why' they are important, and what they tell us about schedule health. This Snippet does not focus on 'how' the metrics are calculated, other than to provide a basic understanding of what is being calculated. Link to Video Presentation (21:52)

  14. FY 2015 Q1 Metrics Supporting Documentation 2015-02-09.xls

    Broader source: Energy.gov (indexed) [DOE]

    Contract/Project Management Performance Metrics FY 2015 Target FY 2015 Pre- & Post- CAP* Forecast Comment 1 Capital Asset Project Success: Complete 90% of capital asset projects at ...

  15. Enclosure - FY 2015 Q3 Metrics Report 2015-08-12.xlsx

    Broader source: Energy.gov (indexed) [DOE]

    Contract/Project Management Performance Metrics FY 2015 Target FY 2015 Pre- & Post- CAP* Forecast Comment 1 Capital Asset Project Management Success: Complete 90% of capital asset ...

  16. (SSS)GAO Metrics - Project Success 2015-04-29 1100.xls

    Broader source: Energy.gov (indexed) [DOE]

    Contract/Project Management Performance Metrics FY 2015 Target FY 2015 Pre- & Post- CAP* Forecast Comment 1 Capital Asset Project Success: Complete 90% of capital asset projects at ...

  17. Microsoft PowerPoint - Snippet 3.2 Schedule Health Metrics 20140713...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    ... available software. These metrics can be quickly reviewed each month to identify any schedule health risks on your project, whether you are the contractor or the customer. ...

  18. New Selection Metric for Design of Thin-Film Solar Cell Absorber...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Maximum Efficiency (SLME) is a new and calculable selection metric to identify new andor improved photovoltaic (PV) absorber candidate materials for thin- film solar cells. ...

  19. New risk metrics and mathematical tools for risk analysis: Current and future challenges

    SciTech Connect (OSTI)

    Skandamis, Panagiotis N.; Andritsos, Nikolaos; Psomas, Antonios; Paramythiotis, Spyridon

    2015-01-22

    The current status of the worldwide food safety supply has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports that detail the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health, and governmental decisions. To do that, a series of new Risk Metrics has been established, as follows: (i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, is defined by quantitative risk assessments and used to establish (ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that ALOP is rather a metric of the tolerable public health burden (it addresses the total ‘failure’ that may be handled at a national level), it is difficult to translate into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products have been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve the FSO, tools quantifying the effect of processes and intrinsic properties of foods on the survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to Food Safety Management. Predictive modelling is the basis of exposure assessment and of the development of stochastic and kinetic models, which are also available in the form of Web-based applications (e.g., COMBASE and the Microbial Responses Viewer), or introduced into user
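
    For context, the relationship between these metrics is usually summarized by the ICMSF-style inequality below, expressed in log10 units of hazard level; this formulation is standard background and is not spelled out in the abstract itself:

        H_{0} - \Sigma R + \Sigma I \le \mathrm{FSO},

    where H0 is the initial hazard level, Sigma R the total reduction achieved by processing, and Sigma I the total increase from growth or recontamination up to the point of consumption.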

  20. Enhanced Accident Tolerant LWR Fuels National Metrics Workshop Report

    SciTech Connect (OSTI)

    Lori Braase

    2013-01-01

    Commercialization. The activities performed during the feasibility assessment phase include laboratory scale experiments; fuel performance code updates; and analytical assessment of economic, operational, safety, fuel cycle, and environmental impacts of the new concepts. The development and qualification stage will consist of fuel fabrication and large scale irradiation and safety basis testing, leading to qualification and ultimate NRC licensing of the new fuel. The commercialization phase initiates technology transfer to industry for implementation. Attributes for fuels with enhanced accident tolerance include improved reaction kinetics with steam and slower hydrogen generation rate, while maintaining acceptable cladding thermo-mechanical properties; fuel thermo-mechanical properties; fuel-clad interactions; and fission-product behavior. These attributes provide a qualitative guidance for parameters that must be considered in the development of fuels and cladding with enhanced accident tolerance. However, quantitative metrics must be developed for these attributes. To initiate the quantitative metrics development, a Light Water Reactor Enhanced Accident Tolerant Fuels Metrics Development Workshop was held October 10-11, 2012, in Germantown, Maryland. This document summarizes the structure and outcome of the two-day workshop. Questions regarding the content can be directed to Lori Braase, 208-526-7763, lori.braase@inl.gov.

  1. Description of the Sandia National Laboratories science, technology & engineering metrics process.

    SciTech Connect (OSTI)

    Jordan, Gretchen B.; Watkins, Randall D.; Trucano, Timothy Guy; Burns, Alan Richard; Oelschlaeger, Peter

    2010-04-01

    There has been a concerted effort since 2007 to establish a dashboard of metrics for the Science, Technology, and Engineering (ST&E) work at Sandia National Laboratories. These metrics are to provide a self assessment mechanism for the ST&E Strategic Management Unit (SMU) to complement external expert review and advice and various internal self assessment processes. The data and analysis will help ST&E Managers plan, implement, and track strategies and work in order to support the critical success factors of nurturing core science and enabling laboratory missions. The purpose of this SAND report is to provide a guide for those who want to understand the ST&E SMU metrics process. This report provides an overview of why the ST&E SMU wants a dashboard of metrics, some background on metrics for ST&E programs from existing literature and past Sandia metrics efforts, a summary of work completed to date, specifics on the portfolio of metrics that have been chosen and the implementation process that has been followed, and plans for the coming year to improve the ST&E SMU metrics process.

  2. Air Quality

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Air Quality Air Quality Tour The Laboratory calculates the dose to the maximally exposed individual (MEI) to determine effects of Laboratory operations on the public.

  5. Impact of Different Economic Performance Metrics on the Perceived Value of Solar Photovoltaics

    SciTech Connect (OSTI)

    Drury, E.; Denholm, P.; Margolis, R.

    2011-10-01

    Photovoltaic (PV) systems are installed by several types of market participants, ranging from residential customers to large-scale project developers and utilities. Each type of market participant frequently uses a different economic performance metric to characterize PV value because they are looking for different types of returns from a PV investment. This report finds that different economic performance metrics frequently show different price thresholds for when a PV investment becomes profitable or attractive. Several project parameters, such as financing terms, can have a significant impact on some metrics [e.g., internal rate of return (IRR), net present value (NPV), and benefit-to-cost (B/C) ratio] while having a minimal impact on other metrics (e.g., simple payback time). As such, the choice of economic performance metric by different customer types can significantly shape each customer's perception of PV investment value and ultimately their adoption decision.
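
    To illustrate why the metrics can disagree, the sketch below computes simple payback, NPV, benefit-to-cost ratio, and (by bisection) IRR for a single hypothetical system; the cash-flow assumptions are arbitrary placeholders, not values from the report.

        def npv(rate, cashflows):
            # cashflows[0] is the (negative) up-front cost; later entries are annual net benefits
            return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

        def irr(cashflows, lo=-0.5, hi=1.0, tol=1e-7):
            # simple bisection; assumes NPV is positive at `lo` and negative at `hi`
            while hi - lo > tol:
                mid = 0.5 * (lo + hi)
                if npv(mid, cashflows) > 0:
                    lo = mid
                else:
                    hi = mid
            return 0.5 * (lo + hi)

        cost, annual_saving, years, discount = 15000.0, 1200.0, 25, 0.05
        flows = [-cost] + [annual_saving] * years

        simple_payback = cost / annual_saving                             # 12.5 years
        net_present_value = npv(discount, flows)                          # ~ +1,900 at a 5% discount rate
        benefit_cost_ratio = npv(discount, [0.0] + [annual_saving] * years) / cost
        internal_rate_of_return = irr(flows)                              # ~6.3%
        print(simple_payback, round(net_present_value), round(benefit_cost_ratio, 2),
              round(internal_rate_of_return, 3))

    Note how the same project looks marginal by payback time yet positive by NPV at this discount rate; changing financing terms moves NPV, IRR, and B/C while leaving payback untouched, which is the sensitivity the report describes.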

  6. Proceedings of the 2009 Performance Metrics for Intelligent Systems Workshop

    SciTech Connect (OSTI)

    Madhavan, Raj; Messina, Elena

    2009-09-01

    The Performance Metrics for Intelligent Systems (PerMIS) workshop is dedicated to defining measures and methodologies of evaluating performance of intelligent systems. As the only workshop of its kind, PerMIS has proved to be an excellent forum for sharing lessons learned and discussions as well as fostering collaborations between researchers and practitioners from industry, academia and government agencies. The main theme of the ninth iteration of the workshop, PerMIS'09, seeks to address the question: 'Does performance measurement accelerate the pace of advancement for intelligent systems?' In addition to the main theme, as in previous years, the workshop will focus on applications of performance measures to practical problems in commercial, industrial, homeland security, and military applications. The PerMIS'09 program consists of six plenary addresses and six general and special sessions. The topics that are to be discussed by the speakers cover a wide array of themes centered on many intricate facets of intelligent system research. The presentations will emphasize and showcase the interdisciplinary nature of intelligent systems research and why it is not straightforward to evaluate such interconnected system of systems. The three days of twelve sessions will span themes from manufacturing, mobile robotics, human-system interaction, theory of mind, testing and evaluation of unmanned systems, to name a few.

  7. Sensitivity of Multi-gas Climate Policy to Emission Metrics

    SciTech Connect (OSTI)

    Smith, Steven J.; Karas, Joseph F.; Edmonds, James A.; Eom, Jiyong; Mizrahi, Andrew H.

    2013-04-01

    Multi-gas greenhouse emission targets require that different emissions be combined into an aggregate total. The Global Warming Potential (GWP) index is currently used for this purpose, despite various criticisms of the underlying concept. It is not possible to uniquely define a single metric that perfectly captures the different impacts of emissions of substances with widely disparate atmospheric lifetimes, which leads to a wide range of possible index values. We examine the sensitivity of emissions and climate outcomes to the value of the index used to aggregate methane emissions using a technologically detailed integrated assessment model. We find that the sensitivity to index value is of order 4-14% in terms of methane emissions and 2% in terms of total radiative forcing, using index values between 4 and 70 for methane, with larger regional differences in some cases. The sensitivity to index value is much higher in economic terms, with total 2-gas mitigation cost decreasing 4-5% for a lower index and increasing 10-13% for a larger index, with even larger changes if the emissions reduction targets are small. The sensitivity to index value also depends on the assumed maximum amount of mitigation available in each sector. Evaluation of the maximum mitigation potential for major sources of non-CO2 greenhouse gases would greatly aid analysis
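
    A minimal example of the aggregation step whose index value is being varied: total CO2-equivalent emissions are computed as CO2 plus GWP-weighted CH4. The inventory numbers are made up; the index values 4, 25, and 70 simply span the range examined in the paper.

        def co2_equivalent(co2_mt, ch4_mt, gwp_ch4):
            # Aggregate a two-gas inventory (million metric tons) into CO2-equivalent
            return co2_mt + gwp_ch4 * ch4_mt

        inventory = {"co2_mt": 5000.0, "ch4_mt": 20.0}     # hypothetical annual emissions
        for gwp in (4, 25, 70):
            total = co2_equivalent(inventory["co2_mt"], inventory["ch4_mt"], gwp)
            print(gwp, total)    # the same physical emissions score very differently under each index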

  8. The International Safeguards Technology Base: How is the Patient Doing? An Exploration of Effective Metrics

    SciTech Connect (OSTI)

    Schanfein, Mark; Gouveia, Fernando; Crawford, Cary E.; Pickett, Chris J.; Jay, Jeffrey

    2010-07-15

    The term “Technology Base” is commonly used but what does it mean? Is there a common understanding of the components that comprise a technology base? Does a formal process exist to assess the health of a given technology base? These are important questions the relevance of which is even more pressing given the USDOE/NNSA initiatives to strengthen the safeguards technology base through investments in research & development and human capital development. Accordingly, the authors will establish a high-level framework to define and understand what comprises a technology base. Potential goal-driven metrics to assess the health of a technology base will also be explored, such as linear demographics and resource availability, in the hope that they can be used to better understand and improve the health of the U.S. safeguards technology base. Finally, through the identification of such metrics, the authors will offer suggestions and highlight choices for addressing potential shortfalls. Introduction The U.S. safeguards technology base got its start almost half a century ago in the nuclear weapons program of the U.S. Department of Energy/National Nuclear Security Administration (DOE/NNSA) and their predecessors: AEC & ERDA. Due to nuclear materials’ strategic importance and value, and the risk associated with the public’s and worker’s health and the potential for theft, significant investments were made to develop techniques to measure nuclear materials using both destructive assay (DA) and non-destructive assay (NDA). Major investment within the U.S. DOE Domestic Safeguards Program continued over the next three decades, resulting in continuous improvements in the state-of-the-art of these techniques. This was particularly true in the area of NDA with its ability to use gamma rays, neutrons, and heat to identify and quantify nuclear materials without the need to take direct samples of the material. Most of these techniques were commercialized and transferred to

  9. Modified Anti-de-Sitter Metric, Light-Front Quantized QCD, and...

    Office of Scientific and Technical Information (OSTI)

    Modified Anti-de-Sitter Metric, Light-Front Quantized QCD, and Conformal Quantum Mechanics Dosch, Hans Gunter; U. Heidelberg, ITP; Brodsky, Stanley J.; SLAC; de Teramond, Guy F.;...

  10. 11,202,720 Metric Tons of CO2 Injected as of October 14, 2015...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    This carbon dioxide (CO2) has been injected in the United States as part of DOE's Clean Coal Research, Development, and Demonstration Programs. One million metric tons of CO2 is ...

  11. FY 2014 Q3 RCA CAP Performance Metrics Report 2014-09-05.xlsx

    Energy Savers [EERE]

    Contract/Project Management Performance Metrics FY 2014 Target FY 2014 Pre- & Post- CAP* ... TPC is Total Project Cost. No. FY 2014 Target FY 2014 3rd Qtr Actual 2 95% 92% 3 95% ...

  12. FY 2014 Q4 Metrics Report 2014-11-06.xlsx

    Energy Savers [EERE]

    Contract/Project Management Performance Metrics FY 2014 Target FY 2014 Pre- & Post- CAP* ... TPC is Total Project Cost. No. FY 2014 Target FY 2014 4th Qtr Actual 2 95% 89% 3 95% ...

  13. EAC Presentation: Metrics and Benefits Analysis for the ARRA Smart Grid Programs- March 10, 2011

    Broader source: Energy.gov [DOE]

    PowerPoint presentation by Joe Paladino from the Office of Electricity Delivery and Energy Reliability before the Electricity Advisory Committee (EAC) on metrics and benefits analysis for the...

  14. Enclosure - FY 2016 Q1 Metrics Report 2016-02-11.xlsx

    Broader source: Energy.gov (indexed) [DOE]

    No. Contract/Project Management Performance Metrics FY 2016 Target No. 2 3 4 5 6 7 Comment FY 2016 Forecast Certified Contracting Staff: By the end of FY 2011, 85% of the 1102 ...

  15. 12,877,644 Metric Tons of CO2 Injected as of July 1, 2016

    Broader source: Energy.gov [DOE]

    This carbon dioxide (CO2) has been injected in the United States as part of DOE’s Clean Coal Research, Development, and Demonstration Programs. One million metric tons of CO2 is equivalent to the...

  16. 11,202,720 Metric Tons of CO2 Injected as of October 14, 2015

    Office of Energy Efficiency and Renewable Energy (EERE)

    This carbon dioxide (CO2) has been injected in the United States as part of DOE's Clean Coal Research, Development, and Demonstration Programs. One million metric tons of CO2 is equivalent to the...

  17. Metrics for Developing an Endorsed Set of Radiographic Threat Surrogates for JINII/CAARS

    SciTech Connect (OSTI)

    Wurtz, R; Walston, S; Dietrich, D; Martz, H

    2009-02-11

    CAARS (Cargo Advanced Automated Radiography System) is developing x-ray dual energy and x-ray backscatter methods to automatically detect materials with atomic number greater than Z=72 (hafnium). This works well for simple geometry materials, where most of the radiographic path is through one material. However, this is usually not the case. Instead, the radiographic path includes many materials of different lengths. Single energy can be used to compute μl, which is related to areal density (mass per unit area), while dual energy yields more information. This report describes a set of metrics suitable and sufficient for characterizing the appearance of assemblies as detected by x-ray radiographic imaging systems, such as those being tested by Joint Integrated Non-Intrusive Inspection (JINII) or developed under CAARS. These metrics will be simulated both for threat assemblies and surrogate threat assemblies (such as are found in Roney et al. 2007) using geometrical and compositional information of the assemblies. The imaging systems are intended to distinguish assemblies containing high-Z material from those containing low-Z material, regardless of thickness, density, or compounds and mixtures. The systems in question operate on the principle of comparing images obtained by using two different x-ray end-point energies--so-called 'dual energy' imaging systems. At the direction of the DHS JINII sponsor, this report does not cover metrics that implement scattering, in the form of either forward-scattered radiation or high-Z detection systems operating on the principle of backscatter detection. Such methods and effects will be covered in a later report. The metrics described here are to be used to compare assemblies and not x-ray radiography systems. We intend to use these metrics to determine whether two assemblies do or do not look the same. We are tasked to develop a set of assemblies whose appearance using this class of detection systems is indistinguishable from the
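
    As a rough illustration of the single-energy versus dual-energy quantities mentioned above, the sketch below applies the Beer-Lambert relation to a single ray; the transmission values and function names are illustrative assumptions, not taken from the report.

```python
import numpy as np

# Illustrative sketch (not from the report): Beer-Lambert attenuation along one ray,
# I/I0 = exp(-sum_i mu_i * l_i). A single end-point energy only recovers the summed mu*l
# (proportional to areal density for a single material), while the ratio of mu*l at two
# end-point energies carries additional information about effective Z.

def mu_l_from_transmission(I, I0):
    """Line integral mu*l inferred from the measured transmission I/I0."""
    return -np.log(I / I0)

def dual_energy_ratio(I_low, I_high, I0_low=1.0, I0_high=1.0):
    """Ratio of low- to high-energy attenuation line integrals along the same ray."""
    return mu_l_from_transmission(I_low, I0_low) / mu_l_from_transmission(I_high, I0_high)

# Hypothetical transmissions: 30% at the low end-point energy, 60% at the high one.
print(mu_l_from_transmission(0.30, 1.0))   # ~1.20
print(dual_energy_ratio(0.30, 0.60))       # ~2.36; larger ratios loosely suggest higher effective Z
```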

  18. NNSA Eliminates 100 Metric Tons Of Weapons-Grade Nuclear Material | National Nuclear Security Administration

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    August 25, 2008 - WASHINGTON, D.C. - Today the Department of Energy's National Nuclear Security Administration (NNSA) announced that it successfully eliminated 100 metric tons of U.S. highly enriched uranium (HEU), enough for thousands of nuclear weapons. For the last decade, the U.S. HEU disposition program has eliminated surplus HEU from the nuclear weapons program by downblending

  19. Science as Knowledge, Practice, and Map Making: The Challenge of Defining Metrics for Evaluating and Improving DOE-Funded Basic Experimental Science

    SciTech Connect (OSTI)

    Bodnarczuk, M.

    1993-03-01

    Industrial R&D laboratories have been surprisingly successful in developing performance objectives and metrics that convincingly show that planning, management, and improvement techniques can be value-added to the actual output of R&D organizations. In this paper, I will discuss the more difficult case of developing analogous constructs for DOE-funded non-nuclear, non-weapons basic research, or as I will refer to it - basic experimental science. Unlike most industrial R&D or the bulk of applied science performed at the National Renewable Energy Laboratory (NREL), the purpose of basic experimental science is producing new knowledge (usually published in professional journals) that has no immediate application to the first link (the R) of a planned R&D chain. Consequently, performance objectives and metrics are far more difficult to define. My claim is that if one can successfully define metrics for evaluating and improving DOE-funded basic experimental science (which is the most difficult case), then defining such constructs for DOE-funded applied science should be much less problematic. With the publication of the DOE Standard - Implementation Guide for Quality Assurance Programs for Basic and Applied Research (DOE-ER-STD-6001-92) and the development of a conceptual framework for integrating all the DOE orders, we need to move aggressively toward the threefold next phase: (1) focusing the management elements found in DOE-ER-STD-6001-92 on the main output of national laboratories - the experimental science itself; (2) developing clearer definitions of basic experimental science as practice not just knowledge; and (3) understanding the relationship between the metrics that scientists use for evaluating the performance of DOE-funded basic experimental science, the management elements of DOE-ER-STD-6001-92, and the notion of continuous improvement.

  20. Implementing the Data Center Energy Productivity Metric in a High Performance Computing Data Center

    SciTech Connect (OSTI)

    Sego, Landon H.; Marquez, Andres; Rawson, Andrew; Cader, Tahir; Fox, Kevin M.; Gustafson, William I.; Mundy, Christopher J.

    2013-06-30

    As data centers proliferate in size and number, the improvement of their energy efficiency and productivity has become an economic and environmental imperative. Making these improvements requires metrics that are robust, interpretable, and practical. We discuss the properties of a number of the proposed metrics of energy efficiency and productivity. In particular, we focus on the Data Center Energy Productivity (DCeP) metric, which is the ratio of useful work produced by the data center to the energy consumed performing that work. We describe our approach for using DCeP as the principal outcome of a designed experiment using a highly instrumented, high-performance computing data center. We found that DCeP was successful in clearly distinguishing different operational states in the data center, thereby validating its utility as a metric for identifying configurations of hardware and software that would improve energy productivity. We also discuss some of the challenges and benefits associated with implementing the DCeP metric, and we examine the efficacy of the metric in making comparisons within a data center and between data centers.
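
    As a rough illustration of the DCeP ratio described above, the sketch below divides a weighted task count by the energy consumed; the task weighting is an assumption here, since the study defines its own workload-specific measure of useful work.

```python
# Minimal sketch of the DCeP ratio: useful work produced by the data center divided by
# the energy consumed producing it. The weighting of "useful work" below is illustrative;
# the cited experiment uses its own workload-specific definition.

def dcep(task_counts, task_weights, energy_kwh):
    """Data Center energy Productivity = (sum of weighted completed tasks) / energy consumed."""
    useful_work = sum(n * w for n, w in zip(task_counts, task_weights))
    return useful_work / energy_kwh

# Hypothetical assessment window: two task types and 450 kWh of facility energy.
print(dcep(task_counts=[1200, 300], task_weights=[1.0, 2.5], energy_kwh=450.0))  # ~4.3 per kWh
```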

  1. Quality Assurance Specialist

    Broader source: Energy.gov [DOE]

    Alternate Title(s): Quality Control Technician; Quality Assurance Inspector; Quality Assurance Representative

  2. Effective detective quantum efficiency for two mammography systems: Measurement and comparison against established metrics

    SciTech Connect (OSTI)

    Salvagnini, Elena; Bosmans, Hilde; Marshall, Nicholas W.; Struelens, Lara

    2013-10-15

    Purpose: The aim of this paper was to illustrate the value of the new metric effective detective quantum efficiency (eDQE) in relation to more established measures in the optimization process of two digital mammography systems. The following metrics were included for comparison against eDQE: detective quantum efficiency (DQE) of the detector, signal difference to noise ratio (SdNR), and detectability index (d′) calculated using a standard nonprewhitened observer with eye filter. Methods: The two systems investigated were the Siemens MAMMOMAT Inspiration and the Hologic Selenia Dimensions. The presampling modulation transfer function (MTF) required for the eDQE was measured using two geometries: a geometry containing scattered radiation and a low scatter geometry. The eDQE, SdNR, and d′ were measured for poly(methyl methacrylate) (PMMA) thicknesses of 20, 40, 60, and 70 mm, with and without the antiscatter grid and for a selection of clinically relevant target/filter (T/F) combinations. Figures of merit (FOMs) were then formed from SdNR and d′ using the mean glandular dose as the factor to express detriment. Detector DQE was measured at energies covering the range of typical clinically used spectra. Results: The MTF measured in the presence of scattered radiation showed a large drop at low spatial frequency compared to the low scatter method and led to a corresponding reduction in eDQE. The eDQE for the Siemens system at 1 mm⁻¹ ranged between 0.15 and 0.27, depending on T/F and grid setting. For the Hologic system, eDQE at 1 mm⁻¹ varied from 0.15 to 0.32, again depending on T/F and grid setting. The eDQE results for both systems showed that the grid increased the system efficiency for PMMA thicknesses of 40 mm and above but showed only small sensitivity to T/F setting. While results of the SdNR and d′ based FOMs confirmed the eDQE grid position results, they were also more specific in terms of T/F selection. For the Siemens system at 20 mm PMMA
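
    One common way to fold dose into such a figure of merit is FOM = SdNR²/MGD, which removes the trivial dependence on exposure for quantum-limited imaging; the sketch below assumes that form, since the abstract does not spell out the exact FOM definition used in the study.

```python
# Hedged sketch: a dose-normalized figure of merit of the form FOM = SdNR^2 / MGD.
# The exact FOM used in the cited study is not given in the abstract; pixel values and
# the dose below are illustrative.

def sdnr(mean_signal, mean_background, noise_background):
    """Signal difference to noise ratio for a detail against its background."""
    return abs(mean_signal - mean_background) / noise_background

def figure_of_merit(sdnr_value, mean_glandular_dose_mgy):
    """Dose-normalized detectability surrogate (per mGy)."""
    return sdnr_value ** 2 / mean_glandular_dose_mgy

s = sdnr(mean_signal=980.0, mean_background=1010.0, noise_background=6.0)
print(figure_of_merit(s, mean_glandular_dose_mgy=1.2))  # SdNR = 5.0 -> FOM ~ 20.8 per mGy
```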

  3. Quality Management

    Broader source: Energy.gov [DOE]

    The Office of Quality Management, within the Office of Health, Safety and Security develops policies and procedures to ensure the classification and control of information is effective and...

  4. A Comparison of Model Short-Range Forecasts and the ARM Microbase Data Fourth Quarter ARM Science Metric

    SciTech Connect (OSTI)

    Hnilo, J.

    2006-09-19

    For the fourth quarter ARM metric we will make use of new liquid water data that has become available, called the “Microbase” value-added product (referred to as OBS within the text), at three sites: the North Slope of Alaska (NSA), Tropical West Pacific (TWP), and the Southern Great Plains (SGP), and compare these observations to model forecast data. Two time periods will be analyzed: March 2000 for the SGP and October 2004 for both TWP and NSA. The Microbase data have been averaged to 35 pressure levels (e.g., from 1000 hPa to 100 hPa at 25 hPa increments) and time averaged to 3-hourly data for direct comparison to our model output.
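
    A minimal sketch of the kind of regridding described above, assuming the observations arrive as a simple table; the column names and the use of pandas are assumptions for illustration, not the ARM production code.

```python
import numpy as np
import pandas as pd

# Bin irregular observations onto standard 25 hPa pressure levels (1000-100 hPa) and
# 3-hourly time averages so they can be compared directly with model forecast output.
# Column names ("time", "pressure_hpa", "lwc") are illustrative.

def regrid(obs: pd.DataFrame) -> pd.DataFrame:
    levels = np.arange(1000, 75, -25)  # 1000, 975, ..., 100 hPa
    obs = obs.copy()
    # assign each sample to the nearest standard pressure level
    idx = np.abs(obs["pressure_hpa"].to_numpy()[:, None] - levels).argmin(axis=1)
    obs["level_hpa"] = levels[idx]
    # average within 3-hour windows at each level
    return (obs.set_index("time")
               .groupby("level_hpa")
               .resample("3H")["lwc"]
               .mean()
               .reset_index())
```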

  5. ARM - Data Quality Program

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Quality Program DQ Resources Data Quality Assessment and Control Report (PDF, 747KB) Data Quality Office Data Quality Problem Reporting (DQPR) Contact Us Submit Data Quality ...

  6. 2011 SAPHIRE 8 Software Quality Assurance Status Report

    SciTech Connect (OSTI)

    Kurt G. Vedros

    2011-09-01

    The Software Quality Assurance engineer position was created in fiscal year 2011 to better maintain and improve the quality of the SAPHIRE 8 development program. This year's Software Quality Assurance tasks concentrated on developing the framework of the SQA program. This report reviews the accomplishments and recommendations for each of the subtasks set forth for JCN V6059: (1) Reviews, Tests, and Code Walkthroughs; (2) Data Dictionary; (3) Metrics; (4) Requirements Traceability Matrix; (5) Provide Oversight on SAPHIRE QA Activities; and (6) Support NRC Presentations and Meetings.

  7. Energy Department Project Captures and Stores more than One Million Metric Tons of CO2 | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    June 26, 2014 - 11:30am - Aerial view of Air Products’ existing steam methane reforming facility at Port Arthur, Texas, with new carbon-capture units and central co-gen and CO2 product compressor. | Photo courtesy of Air Products and Chemicals Inc.

  8. DOE to Remove 200 Metric Tons of Highly Enriched Uranium from U.S. Nuclear Weapons Stockpile

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    November 7, 2005 - 12:38pm - Will Be Redirected to Naval Reactors, Down-blended or Used for Space Programs. WASHINGTON, DC - Secretary of Energy Samuel W. Bodman today announced that the Department of Energy's (DOE) National Nuclear Security Administration (NNSA) will

  9. Quality Policy

    Broader source: Energy.gov [DOE]

    Quality Policy It is the policy of the Department of Energy to establish quality requirements to ensure that risks and environmental impacts are minimized and that safety, reliability, and performance are maximized through the application of effective management systems commensurate with the risks posed by the facility or activity and its work. The Department implements this policy through the QA Order and the QA rule directives to ensure quality assurance requirements are clearly specified for the broad spectrum of work performed by DOE and its contractors.

  10. Quality Assurance

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    1999-09-29

    To establish an effective management system [i.e., quality assurance programs (QAPs)] using the performance requirements of this Order, coupled with technical standards where appropriate. Cancels DOE O 414.1.

  11. Quality Assurance

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    2005-06-17

    This Order ensures that the quality of DOE/NNSA products and services meets or exceeds the customers' expectations. Cancels DOE O 414.1B and DOE N 411.1. Canceled by DOE O 414.1D.

  12. Quality Assurance

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    2011-04-25

    The Order defines roles and responsibilities for providing quality assurance for DOE products and services. Admin Chg 1, dated 5-8-13, supersedes DOE O 414.1D.

  13. Quality Program

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Procedure ETA-QP001, Revision 0, effective October 15, 2001: QUALITY PROGRAM. Prepared by Electric Transportation Applications (Jude M. Clark); approved by Donald B. Karner. 2001 Electric Transportation Applications, All Rights Reserved. Table of Contents: 1.0 Objectives; 2.0 Scope; 3.0 Documentation; 4.0 Prerequisites; 5.0 Exclusions; 6.0 Quality

  14. Quality Assurance

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    2004-04-29

    This Order ensures that the quality of DOE/NNSA products and services meets or exceeds the customer's expectations. This Order cancels DOE O 414.1A, Quality Assurance, dated 9-29-99, and Attachment 1, paragraph 8, and Attachment 2, paragraph 22, of DOE O 440.1A, Worker Protection Management for DOE Federal and Contractor Employees, dated 3-27-98. Cancels: DOE O 414.1A and DOE O 440.1A, parts as noted.

  15. Multidimensional metrics for estimating phage abundance, distribution, gene density, and sequence coverage in metagenomes

    SciTech Connect (OSTI)

    Aziz, Ramy K.; Dwivedi, Bhakti; Akhter, Sajia; Breitbart, Mya; Edwards, Robert A.

    2015-05-08

    Phages are the most abundant biological entities on Earth and play major ecological roles, yet the current sequenced phage genomes do not adequately represent their diversity, and little is known about the abundance and distribution of these sequenced genomes in nature. Although the study of phage ecology has benefited tremendously from the emergence of metagenomic sequencing, a systematic survey of phage genes and genomes in various ecosystems is still lacking, and fundamental questions about phage biology, lifestyle, and ecology remain unanswered. To address these questions and improve comparative analysis of phages in different metagenomes, we screened a core set of publicly available metagenomic samples for sequences related to completely sequenced phages using the web tool, Phage Eco-Locator. We then adopted and deployed an array of mathematical and statistical metrics for a multidimensional estimation of the abundance and distribution of phage genes and genomes in various ecosystems. Experiments using those metrics individually showed their usefulness in emphasizing the pervasive, yet uneven, distribution of known phage sequences in environmental metagenomes. Using these metrics in combination allowed us to resolve phage genomes into clusters that correlated with their genotypes and taxonomic classes as well as their ecological properties. Adding this set of metrics to current metaviromic analysis pipelines can provide insight regarding phage mosaicism, habitat specificity, and evolution.

  16. Metrics of closed world of Friedmann, agitated by electric charge (towards a theory electromagnetic Friedmanns)

    SciTech Connect (OSTI)

    Markov, M.A.; Frolov, V.P.

    1986-06-10

    The generalization of the well-known Tolman problem to the case of electrically charged dust-like matter in a centrally symmetric system is considered. The first integrals of the corresponding system of Einstein-Maxwell equations are found. The problem is specified in such a way that, as the full charge of the system goes to zero, the metric of the closed Friedmann world arises. Such a system is considered at the initial moment, that of maximal expansion. For any nonvanishing, however small, value of the electric charge, the metric is not closed. The metric of the almost-Friedmannian part of the world allows continuation through a narrow manhole (at small charge) as the Reissner-Nordstroem metric with parameters satisfying m₀√χ = e₀. The expression for the electric potential in the manhole, φₕ = c²/√χ, does not depend on the value of the electric charge. The radius of the manhole, rₕ = e₀√χ/c², increases with increasing charge. The state of the manhole as given by the classical description appears essentially unstable from the quantum-physics viewpoint. The production of various pairs in the enormous electric fields of the manhole gives rise to polarization of the latter up to an effective charge Z < 137e, irrespective of the initial (no matter how great) charge of the system.

  17. Energy Department Project Captures and Stores One Million Metric Tons of Carbon

    Broader source: Energy.gov [DOE]

    As part of President Obama’s all-of-the-above energy strategy, the Department of Energy announced today that its Illinois Basin-Decatur Project successfully captured and stored one million metric tons of carbon dioxide (CO2) and injected it into a deep saline formation.

  18. Multidimensional metrics for estimating phage abundance, distribution, gene density, and sequence coverage in metagenomes

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Aziz, Ramy K.; Dwivedi, Bhakti; Akhter, Sajia; Breitbart, Mya; Edwards, Robert A.

    2015-05-08

    Phages are the most abundant biological entities on Earth and play major ecological roles, yet the current sequenced phage genomes do not adequately represent their diversity, and little is known about the abundance and distribution of these sequenced genomes in nature. Although the study of phage ecology has benefited tremendously from the emergence of metagenomic sequencing, a systematic survey of phage genes and genomes in various ecosystems is still lacking, and fundamental questions about phage biology, lifestyle, and ecology remain unanswered. To address these questions and improve comparative analysis of phages in different metagenomes, we screened a core set of publicly available metagenomic samples for sequences related to completely sequenced phages using the web tool, Phage Eco-Locator. We then adopted and deployed an array of mathematical and statistical metrics for a multidimensional estimation of the abundance and distribution of phage genes and genomes in various ecosystems. Experiments using those metrics individually showed their usefulness in emphasizing the pervasive, yet uneven, distribution of known phage sequences in environmental metagenomes. Using these metrics in combination allowed us to resolve phage genomes into clusters that correlated with their genotypes and taxonomic classes as well as their ecological properties. Adding this set of metrics to current metaviromic analysis pipelines can provide insight regarding phage mosaicism, habitat specificity, and evolution.

  19. Hydrogen Fuel Quality

    SciTech Connect (OSTI)

    Rockward, Tommy

    2012-07-16

    For the past 6 years, open discussions and/or meetings have been held, and are still ongoing, with OEMs, hydrogen suppliers, other test facilities from the North America Team, and international collaborators regarding experimental results, fuel clean-up cost, modeling, and analytical techniques to help determine levels of constituents for the development of an international standard for hydrogen fuel quality (ISO TC197 WG-12). Significant progress has been made. The process for the fuel standard is entering its final stages as a result of the technical accomplishments. The objectives are to: (1) Determine the allowable levels of hydrogen fuel contaminants in support of the development of science-based international standards for hydrogen fuel quality (ISO TC197 WG-12); and (2) Validate the ASTM test method for determining low levels of non-hydrogen constituents.

  20. Data Quality of Quality Measurement Experiments

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Data Quality of Quality Measurement Experiments S. Bottone and S. Moore Mission Research ... assessment of the quality of incoming data based on internal consistency checks, ...

  1. Development of new VOC exposure metrics and their relationship to ''Sick Building Syndrome'' symptoms

    SciTech Connect (OSTI)

    Ten Brinke, JoAnn

    1995-08-01

    Volatile organic compounds (VOCs) are suspected to contribute significantly to ''Sick Building Syndrome'' (SBS), a complex of subchronic symptoms that occurs during and in general decreases away from occupancy of the building in question. A new approach takes into account individual VOC potencies, as well as the highly correlated nature of the complex VOC mixtures found indoors. The new VOC metrics are statistically significant predictors of symptom outcomes from the California Healthy Buildings Study data. Multivariate logistic regression analyses were used to test the hypothesis that a summary measure of the VOC mixture, other risk factors, and covariates for each worker will lead to better prediction of symptom outcome. VOC metrics based on animal irritancy measures and principal component analysis had the most influence in the prediction of eye, dermal, and nasal symptoms. After adjustment, a water-based paints and solvents source was found to be associated with dermal and eye irritation. The more typical VOC exposure metrics used in prior analyses were not useful in symptom prediction in the adjusted model (total VOC (TVOC), or sum of individually identified VOCs (ΣVOC_i)). Also not useful were three other VOC metrics that took into account potency, but did not adjust for the highly correlated nature of the data set, or the presence of VOCs that were not measured. High TVOC values (2–7 mg m⁻³) due to the presence of liquid-process photocopiers observed in several study spaces significantly influenced symptoms. Analyses without the high TVOC values reduced, but did not eliminate, the ability of the VOC exposure metric based on irritancy and principal component analysis to explain symptom outcome.
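
    A minimal sketch of a potency-weighted, principal-component VOC summary of the kind described above; the potency weights and concentration data are placeholders, not the study's values.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Sketch: (1) scale each VOC concentration by an irritancy-based potency weight,
# (2) standardize, (3) summarize the highly correlated mixture with principal-component
# scores, which can then enter a logistic regression as the exposure covariate.
# All numbers below are illustrative placeholders.

rng = np.random.default_rng(0)
concentrations = rng.lognormal(mean=-1.0, sigma=0.8, size=(50, 4))  # 50 spaces x 4 VOCs, mg/m3
potency = np.array([1.0, 3.0, 0.5, 2.0])                            # hypothetical irritancy weights

weighted = concentrations * potency
scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(weighted))
exposure_metric = scores[:, 0]  # first-component score per workspace
print(exposure_metric[:5])
```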

  2. LANL sponsors Quality New Mexico

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Quality New Mexico performance excellence conference, April 19, 2011. LOS ALAMOS, New Mexico, April 12, 2011 - Want to take your organization to the next level and...

  3. Quality Assurance

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    1981-07-21

    To provide Department of Energy (DOE) policy, set forth principles, and assign responsibilities for establishing, implementing, and maintaining programs of plans and actions to assure quality achievement in DOE programs. Cancels DOE O 5700.6, dated 1-16-1981. Canceled by DOE O 5700.6B, dated 9-23-1986.

  4. Quality Assurance

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    1981-01-16

    To provide Department of Energy (DOE) policy, set forth principles, and assign responsibilities for establishing, implementing, and maintaining programs of plans and actions to assure quality achievement in DOE programs. Canceled by DOE O 5700.6A, dated 7-21-1981.

  5. Service Levels

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Service Levels - NERSC Supported Services Model. NERSC supports various services at various levels of support. This document outlines the different levels of support that can be expected for a given service. Production Services: All production services at NERSC have the following characteristics: Monitored by NERSC Operations with automated tools (Nagios). Outages are announced on the MOTD and must follow the rules defined in the System Outages document. User facing documentation

  6. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Harper, Bryan; Thomas, Dennis G.; Chikkagoudar, Satish; Baker, Nathan A.; Tang, Kaizhi; Heredia-Langner, Alejandro; Lins, Roberto D.; Harper, Stacey

    2015-06-04

    The integration of rapid assays, large data sets, informatics and modeling can overcome current barriers in understanding nanomaterial structure-toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality, were established at realistic exposure levels and used to develop a predictive model of nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a predictive model of gold nanoparticle toxicity to embryonic zebrafish. In addition, our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. These findings reveal the need to expeditiously increase the availability of quantitative measures of nanomaterial hazard and broaden the sharing of that data and knowledge to support predictive modeling. In addition, research should continue to focus on methodologies for developing predictive models of nanomaterial hazard based on sub-lethal responses to low dose exposures.

  7. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

    SciTech Connect (OSTI)

    Harper, Bryan; Thomas, Dennis G.; Chikkagoudar, Satish; Baker, Nathan A.; Tang, Kaizhi; Heredia-Langner, Alejandro; Lins, Roberto D.; Harper, Stacey

    2015-06-04

    The integration of rapid assays, large data sets, informatics and modeling can overcome current barriers in understanding nanomaterial structure-toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality, were established at realistic exposure levels and used to develop a predictive model of nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a predictive model of gold nanoparticle toxicity to embryonic zebrafish. In addition, our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. These findings reveal the need to expeditiously increase the availability of quantitative measures of nanomaterial hazard and broaden the sharing of that data and knowledge to support predictive modeling. In addition, research should continue to focus on methodologies for developing predictive models of nanomaterial hazard based on sub-lethal responses to low dose exposures.

  8. Maintaining System Air Quality; Industrial Technologies Program...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    2 * August 2004 Industrial Technologies Program Suggested Actions: * Review compressed air applications and determine the required level of air quality for each. * Review the ...

  9. Performance Metrics

    Broader source: Energy.gov [DOE]

    RCA/CAP Closure Report 2011 - This RCA/CAP Closure Report presents a status of the Department’s initiatives to address the most significant issues and their corresponding root causes and officially...

  10. Quality Assurance

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    2001-07-12

    To establish an effective management system [i.e., quality assurance programs (QAPs)] using the performance requirements of this Order, coupled with technical standards where appropriate. Change 1, dated 7/12/01, facilitates the Department's organizational transition necessitated by establishment of the NNSA. (Attachment 2 of this Order is canceled by DOE O 470.2B.) Cancels: DOE O 414.1

  11. Dynamical Systems in the Variational Formulation of the Fokker-Planck Equation by the Wasserstein Metric

    SciTech Connect (OSTI)

    Mikami, T.

    2000-07-01

    R. Jordan, D. Kinderlehrer, and F. Otto proposed the discrete-time approximation of the Fokker-Planck equation by the variational formulation. It is determined by the Wasserstein metric, an energy functional, and the Gibbs-Boltzmann entropy functional. In this paper we study the asymptotic behavior of the dynamical systems which describe their approximation of the Fokker-Planck equation and characterize the limit as a solution to a class of variational problems.
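
    The discrete-time scheme referred to here is commonly written as a single minimization step; the following is the standard Jordan-Kinderlehrer-Otto form (with time step h, potential Ψ, and inverse temperature β), stated for orientation rather than quoted from the paper.

```latex
% One step of the discrete-time variational scheme: a Wasserstein penalty plus the
% free energy (potential-energy term plus the Gibbs-Boltzmann entropy).
\[
\rho_{k+1} \;\in\; \operatorname*{arg\,min}_{\rho}\;
  \frac{1}{2h}\, W_2\!\left(\rho_k,\rho\right)^{2}
  \;+\; \int \Psi(x)\,\rho(x)\,dx
  \;+\; \beta^{-1}\!\int \rho(x)\,\log\rho(x)\,dx
\]
```

    As h tends to zero, the interpolated minimizers converge to the solution of the Fokker-Planck equation, which is the approximation whose dynamical behavior is studied above.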

  12. Time delay of light signals in an energy-dependent spacetime metric

    SciTech Connect (OSTI)

    Grillo, A. F.; Luzio, E.; Mendez, F.

    2008-05-15

    In this paper we review the problem of time delay of photons propagating in a spacetime with a metric that explicitly depends on the energy of the particles (gravity-rainbow approach). We show that corrections due to this approach--which is closely related to the double special relativity proposal--produce for small redshifts (z<<1) smaller time delays than in the generic Lorentz invariance violating case.

  13. Microsoft Word - McIntyre-Metrics Report SAND draft9-14.doc

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    2070P Unlimited Release September 2007 Security Metrics for Process Control Systems Annie McIntyre, Blair Becker, Ron Halbgewachs Prepared by Sandia National Laboratories Albuquerque, New Mexico 87185 and Livermore, California 94550 Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under Contract DE-AC04-94AL85000. Approved for public release; further dissemination

  14. Texas CO2 Capture Demonstration Project Hits Three Million Metric Ton Milestone

    Broader source: Energy.gov [DOE]

    On June 30, Allentown, PA-based Air Products and Chemicals, Inc. successfully captured and transported, via pipeline, its 3 millionth metric ton of carbon dioxide (CO2) to be used for enhanced oil recovery. This achievement highlights the ongoing success of a carbon capture and storage (CCS) project sponsored by the U.S. Department of Energy (DOE) and managed by the National Energy Technology Laboratory (NETL).

  15. Performance metrics and life-cycle information management for building performance assurance

    SciTech Connect (OSTI)

    Hitchcock, R.J.; Piette, M.A.; Selkowitz, S.E.

    1998-06-01

    Commercial buildings account for over $85 billion per year in energy costs, using far more energy than is technically necessary. One of the primary reasons buildings do not perform as well as intended is that critical information is lost, through ineffective documentation and communication, leading to building systems that are often improperly installed and operated. A life-cycle perspective on the management of building information provides a framework for improving commercial building energy performance. This paper describes a project to develop strategies and techniques to provide decision-makers with information needed to assure the desired building performance across the complete life cycle of a building project. A key element in this effort is the development of explicit performance metrics that quantitatively represent performance objectives of interest to various building stakeholders. The paper begins with a discussion of key problems identified in current building industry practice, and ongoing work to address these problems. The paper then focuses on the concept of performance metrics and their use in improving building performance during design, commissioning, and on-going operations. The design of a Building Life-cycle Information System (BLISS) is presented. BLISS is intended to provide an information infrastructure capable of integrating a variety of building information technologies that support performance assurance. The use of performance metrics in case study building projects is explored to illustrate current best practice. The application of integrated information technology for improving current practice is discussed.

  16. Specification and implementation of IFC based performance metrics to support building life cycle assessment of hybrid energy systems

    SciTech Connect (OSTI)

    Morrissey, Elmer; O'Donnell, James; Keane, Marcus; Bazjanac, Vladimir

    2004-03-29

    Minimizing building life cycle energy consumption is becoming of paramount importance. Performance metrics tracking offers a clear and concise manner of relating design intent in a quantitative form. A methodology is discussed for storage and utilization of these performance metrics through an Industry Foundation Classes (IFC) instantiated Building Information Model (BIM). The paper focuses on storage of three sets of performance data from three distinct sources. An example of a performance metrics programming hierarchy is displayed for a heat pump and a solar array. Utilizing the sets of performance data, two discrete performance effectiveness ratios may be computed, thus offering an accurate method of quantitatively assessing building performance.

  17. 12,893,780 Metric Tons of CO2 Injected as of July 19, 2016 | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    This carbon dioxide (CO2) has been injected in the United States as part of DOE's Clean Coal Research, Development, and Demonstration Programs. One million metric tons of CO2 is equivalent to the annual greenhouse gas emissions from 210,526 passenger vehicles. The projects currently injecting CO2 within DOE's Regional Carbon Sequestration Partnership Program and the

  18. How Does Your Data Center Measure Up? Energy Efficiency Metrics and Benchmarks for Data Center Infrastructure Systems

    SciTech Connect (OSTI)

    Mathew, Paul; Greenberg, Steve; Ganguly, Srirupa; Sartor, Dale; Tschudi, William

    2009-04-01

    Data centers are among the most energy intensive types of facilities, and they are growing dramatically in terms of size and intensity [EPA 2007]. As a result, in the last few years there has been increasing interest from stakeholders - ranging from data center managers to policy makers - to improve the energy efficiency of data centers, and there are several industry and government organizations that have developed tools, guidelines, and training programs. There are many opportunities to reduce energy use in data centers and benchmarking studies reveal a wide range of efficiency practices. Data center operators may not be aware of how efficient their facility may be relative to their peers, even for the same levels of service. Benchmarking is an effective way to compare one facility to another, and also to track the performance of a given facility over time. Toward that end, this article presents the key metrics that facility managers can use to assess, track, and manage the efficiency of the infrastructure systems in data centers, and thereby identify potential efficiency actions. Most of the benchmarking data presented in this article are drawn from the data center benchmarking database at Lawrence Berkeley National Laboratory (LBNL). The database was developed from studies commissioned by the California Energy Commission, Pacific Gas and Electric Co., the U.S. Department of Energy and the New York State Energy Research and Development Authority.

  19. Light Water Reactor Sustainability Program Operator Performance Metrics for Control Room Modernization: A Practical Guide for Early Design Evaluation

    SciTech Connect (OSTI)

    Ronald Boring; Roger Lew; Thomas Ulrich; Jeffrey Joe

    2014-03-01

    As control rooms are modernized with new digital systems at nuclear power plants, it is necessary to evaluate the operator performance using these systems as part of a verification and validation process. There are no standard, predefined metrics available for assessing what is satisfactory operator interaction with new systems, especially during the early design stages of a new system. This report identifies the process and metrics for evaluating human system interfaces as part of control room modernization. The report includes background information on design and evaluation, a thorough discussion of human performance measures, and a practical example of how the process and metrics have been used as part of a turbine control system upgrade during the formative stages of design. The process and metrics are geared toward generalizability to other applications and serve as a template for utilities undertaking their own control room modernization activities.

  20. Genome Assembly Forensics: Metrics for Assessing Assembly Correctness (Metagenomics Informatics Challenges Workshop: 10K Genomes at a Time)

    ScienceCinema (OSTI)

    Pop, Mihai [University of Maryland

    2013-01-22

    University of Maryland's Mihai Pop on "Genome Assembly Forensics: Metrics for Assessing Assembly Correctness" at the Metagenomics Informatics Challenges Workshop held at the DOE JGI on October 12-13, 2011.

  1. Data Driven Quality Assurance and Quality Control

    Broader source: Energy.gov [DOE]

    "Data Driven Quality Assurance & Quality Control," Patrick Roche, Conservation Services Group. Provides an overview of data QA/QC system design.

  2. SU-F-18C-01: Minimum Detectability Analysis for Comprehensive Sized Based Optimization of Image Quality and Radiation Dose Across CT Protocols

    SciTech Connect (OSTI)

    Smitherman, C; Chen, B; Samei, E

    2014-06-15

    Purpose: This work involved a comprehensive modeling of task-based performance of CT across a wide range of protocols. The approach was used for optimization and consistency of dose and image quality within a large multi-vendor clinical facility. Methods: 150 adult protocols from the Duke University Medical Center were grouped into sub-protocols with similar acquisition characteristics. A size-based image quality phantom (Duke Mercury Phantom) was imaged using these sub-protocols for a range of clinically relevant doses on two CT manufacturer platforms (Siemens, GE). The images were analyzed to extract task-based image quality metrics such as the Task Transfer Function (TTF), Noise Power Spectrum, and Az based on designer nodule task functions. The data were analyzed in terms of the detectability of a lesion size/contrast as a function of dose, patient size, and protocol. A graphical user interface (GUI) was developed to predict image quality and dose to achieve a minimum level of detectability. Results: Image quality trends with variations in dose, patient size, and lesion contrast/size were evaluated, and the calculated data behaved as predicted. The GUI proved effective in predicting the Az values representing radiologist confidence for a targeted lesion, patient size, and dose. As an example, an abdomen pelvis exam for the GE scanner, with a task size/contrast of 5-mm/50-HU, and an Az of 0.9 requires a dose of 4.0, 8.9, and 16.9 mGy for patient diameters of 25, 30, and 35 cm, respectively. For a constant patient diameter of 30 cm, the minimum detected lesion size at those dose levels would be 8.4, 5, and 3.9 mm, respectively. Conclusion: The designed CT protocol optimization platform can be used to evaluate minimum detectability across dose levels and patient diameters. The method can be used to improve individual protocols as well as to improve protocol consistency across CT scanners.
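
    A minimal sketch of the kind of lookup such a platform performs, interpolating the example figures quoted above (dose achieving Az = 0.9 for a 5-mm/50-HU task on the GE scanner versus patient diameter); the log-linear interpolation is an assumption for illustration.

```python
import numpy as np

# Interpolate the dose needed to reach the target detectability (Az = 0.9) as a function
# of patient diameter, using the example values quoted in the abstract. Dose requirements
# grow roughly exponentially with patient diameter, so interpolate in log(dose).

diameters_cm = np.array([25.0, 30.0, 35.0])
dose_mgy = np.array([4.0, 8.9, 16.9])   # dose achieving Az = 0.9 at each diameter

def required_dose(diameter_cm: float) -> float:
    return float(np.exp(np.interp(diameter_cm, diameters_cm, np.log(dose_mgy))))

print(required_dose(28.0))  # ~6.5 mGy for a hypothetical 28 cm patient
```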

  3. CT head-scan dosimetry in an anthropomorphic phantom and associated measurement of ACR accreditation-phantom imaging metrics under clinically representative scan conditions

    SciTech Connect (OSTI)

    Brunner, Claudia C.; Stern, Stanley H.; Chakrabarti, Kish; Minniti, Ronaldo; Parry, Marie I.; Skopec, Marlene

    2013-08-15

    Gy, respectively. The GE Discovery delivers about the same amount of dose (43.7 mGy) when run under similar operating and image-reconstruction conditions, i.e., without tube current modulation and ASIR. The image-metrics analysis likewise showed that the MTF, NPS, and CNR associated with the reconstructed images are mutually comparable when the three scanners are run with similar settings, and differences can be attributed to different edge-enhancement properties of the applied reconstruction filters. Moreover, when the GE scanner was operated with the facility's scanner settings for routine head exams, which apply 50% ASIR and use only approximately half of the 100%-FBP dose, the CNR of the images showed no significant change. Even though the CNR alone is not sufficient to characterize the image quality and justify any dose reduction claims, it can be useful as a constancy test metric. Conclusions: This work presents a straightforward method to connect direct measurements of CT dose with objective image metrics such as high-contrast resolution, noise, and CNR. It demonstrates that OSLD measurements in an anthropomorphic head phantom allow a realistic and locally precise estimation of magnitude and spatial distribution of dose in tissue delivered during a typical CT head scan. Additional objective analysis of the images of the ACR accreditation phantom can be used to relate the measured doses to high contrast resolution, noise, and CNR.

  4. Perfect fluid and scalar field in the Reissner-Nordstroem metric

    SciTech Connect (OSTI)

    Babichev, E. O.; Dokuchaev, V. I. Eroshenko, Yu. N.

    2011-05-15

    We describe the spherically symmetric steady-state accretion of perfect fluid in the Reissner-Nordstroem metric. We present analytic solutions for accretion of a fluid with linear equations of state and of the Chaplygin gas. We also show that under reasonable physical conditions, there is no steady-state accretion of a perfect fluid onto a Reissner-Nordstroem naked singularity. Instead, a static atmosphere of fluid is formed. We discuss a possibility of violation of the third law of black hole thermodynamics for a phantom fluid accretion.

  5. Ultrahard fluid and scalar field in the Kerr-Newman metric

    SciTech Connect (OSTI)

    Babichev, E.; Chernov, S.; Dokuchaev, V.; Eroshenko, Yu.

    2008-11-15

    An analytic solution for the accretion of ultrahard perfect fluid onto a moving Kerr-Newman black hole is found. This solution is a generalization of the previously known solution by Petrich, Shapiro, and Teukolsky for a Kerr black hole. We show that the found solution is applicable for the case of a nonextreme black hole, however it cannot describe the accretion onto an extreme black hole due to violation of the test fluid approximation. We also present a stationary solution for a massless scalar field in the metric of a Kerr-Newman naked singularity.

  6. Table 11.4 Nitrous Oxide Emissions, 1980-2009 (Thousand Metric Tons of Nitrous Oxide)

    U.S. Energy Information Administration (EIA) Indexed Site

    Nitrous Oxide Emissions, 1980-2009 (Thousand Metric Tons of Nitrous Oxide)
    Columns per year: Energy Sources (Mobile Combustion [1], Stationary Combustion [2], Total); Waste Management (Waste Combustion, Human Sewage in Wastewater, Total); Agricultural Sources (Nitrogen Fertilization of Soils, Crop Residue Burning, Solid Waste of Domesticated Animals, Total); Industrial Processes [3]; Total
    1980: 60, 44, 104 | 1, 10, 11 | 364, 1, 75, 440 | 88 | 642
    1981: 63, 44, 106 | 1, 10, 11 | 364, 2, 74, 440 | 84 | 641
    1982: 67, 42, 108 | 1, 10, 11 | 339, 2, 74, 414 | 80 | 614
    1983: 71, 43, 114

  7. Einstein-aether theory, violation of Lorentz invariance, and metric-affine gravity

    SciTech Connect (OSTI)

    Heinicke, Christian; Baekler, Peter; Hehl, Friedrich W.

    2005-07-15

    We show that the Einstein-aether theory of Jacobson and Mattingly (J and M) can be understood in the framework of the metric-affine (gauge theory of) gravity (MAG). We achieve this by relating the aether vector field of J and M to certain post-Riemannian nonmetricity pieces contained in an independent linear connection of spacetime. Then, for the aether, a corresponding geometrical curvature-square Lagrangian with a massive piece can be formulated straightforwardly. We find an exact spherically symmetric solution of our model.

  8. OSTIblog Articles in the metrics Topic | OSTI, US Dept of Energy Office of Scientific and Technical Information

    Office of Scientific and Technical Information (OSTI)

    OSTI's Committee of Visitors, An Update, by Dr. Jeffrey Salmon, 23 May 2011, in Science Communications. "The unexamined life is not worth living." So says Plato's Socrates in the Apology. His self-examination led to extreme humility (or to an extreme irony) when Socrates confessed to his accusers that the only knowledge he had was knowledge of his

  9. Integration of Sustainability Metrics into Design Cases and State of Technology Assessments

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    This presentation does not contain any proprietary, confidential, or otherwise restricted information DOE Bioenergy Technologies Office (BETO) 2015 Project Peer Review Integration of Sustainability Metrics into Design Cases and State of Technology Assessments 2.1.0.100/2.1.0.302 NREL 2.1.0.301 PNNL Mary Biddy On behalf Eric Tan, Abhijit Dutta, Ryan Davis, Mike Talmadge NREL Lesley Snowden-Swan On behalf of Sue Jones, Aye Meyer, Ken Rappe, Kurt Spies PNNL Goal Statement 2 Support the development

  10. Specified assurance level sampling procedure

    SciTech Connect (OSTI)

    Willner, O.

    1980-11-01

    In the nuclear industry, design specifications for certain quality characteristics require that the final product be inspected by a sampling plan which can demonstrate product conformance to stated assurance levels. The Specified Assurance Level (SAL) Sampling Procedure has been developed to permit the direct selection of attribute sampling plans which can meet commonly used assurance levels. The SAL procedure contains sampling plans which yield the minimum sample size at stated assurance levels. The SAL procedure also provides sampling plans with acceptance numbers ranging from 0 to 10, thus making available to the user a wide choice of plans, all designed to comply with a stated assurance level.
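
    A minimal sketch of how such an attribute plan can be selected, assuming lot acceptance is governed by the binomial distribution; the assurance level and limiting nonconforming fraction below are illustrative, not values taken from the SAL tables.

```python
from scipy.stats import binom

# For an acceptance number c, the smallest sample size n demonstrating assurance level AL
# against a limiting nonconforming fraction p_limit satisfies
#   P(accept | p = p_limit) = BinomCDF(c; n, p_limit) <= 1 - AL.

def min_sample_size(assurance_level, p_limit, acceptance_number, n_max=10000):
    for n in range(acceptance_number + 1, n_max + 1):
        if binom.cdf(acceptance_number, n, p_limit) <= 1.0 - assurance_level:
            return n
    raise ValueError("no plan found within n_max")

# Illustrative: 95% assurance against lots with more than 10% nonconforming items.
for c in range(4):
    print(c, min_sample_size(0.95, 0.10, c))   # -> 29, 46, 61, 76 (approximately)
```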