National Library of Energy BETA

Sample records for factors metric prefixes

  1. Metrics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Metrics Los Alamos expands its innovation network by engaging in sponsored research and licensing across technical disciplines. These agreements are the basis of a working relationship with industry and other research institutions and highlight the diversity of our collaborations. Los Alamos has a remarkable 70-year legacy of creating entirely new technologies that have revolutionized the country's understanding of science and engineering. Collaborations Data from Fiscal Year 2014.

  2. Draft Supplemental Environmental Impact Statement for the Production...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    METRIC PREFIXES Prefix Symbol Multiplication factor Scientific notation tera- T 1,000,000,... grams/cubic centimeter pounds/cubic feet 16,025.6 grams/cubic meter inches 2.54 ...
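
    The record above points to a table of metric prefixes and their multiplication factors. As a minimal sketch of how such factors are applied in practice (standard SI prefix values only; the density and length conversions in the truncated snippet are not reproduced, and the function name is illustrative), in Python:

        # Standard SI metric prefixes and their multiplication factors.
        SI_PREFIXES = {
            "tera":  ("T", 1e12),
            "giga":  ("G", 1e9),
            "mega":  ("M", 1e6),
            "kilo":  ("k", 1e3),
            "centi": ("c", 1e-2),
            "milli": ("m", 1e-3),
            "micro": ("u", 1e-6),
            "nano":  ("n", 1e-9),
        }

        def convert(value, from_prefix, to_prefix):
            """Convert a quantity between two prefixed forms of the same base unit."""
            return value * SI_PREFIXES[from_prefix][1] / SI_PREFIXES[to_prefix][1]

        print(convert(2.5, "tera", "giga"))  # 2.5 TW expressed in GW -> 2500.0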

  3. STAR METRICS

    Broader source: Energy.gov [DOE]

    Energy continues to define Phase II of the STAR METRICS program, a collaborative initiative to track Research and Development expenditures and their outcomes. Visit the STAR METRICS website for...

  4. Resilience Metrics

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    for Quadrennial Energy Review Technical Workshop on Resilience Metrics for Energy Transmission and Distribution Infrastructure April 28, 2014 Infrastructure Assurance Center ...

  5. Metric Presentation

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    ... MODERN GRID S T R A T E G Y Value Metrics - Work to date Reliability Outage duration and frequency Momentary outages Power Quality measures Security Ratio of distributed ...

  6. performance metrics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    performance metrics - Sandia Energy Energy & Climate Secure & Sustainable Energy Future Stationary Power Energy Conversion Efficiency Solar Energy Wind Energy Water Power Supercritical CO2 Geothermal Natural Gas Safety, Security & Resilience of the Energy Infrastructure Energy Storage Nuclear Power & Engineering Grid Modernization Battery Testing Nuclear Energy Defense Waste Management Programs Advanced Nuclear

  7. Metric Presentation

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    MODERN GRID S T R A T E G Y Smart Grid Metrics Monitoring our Progress Smart Grid Implementation Workshop Joe Miller - Modern Grid Team June 19, 2008 Conducted by the National Energy Technology Laboratory Funded by the U.S. Department of Energy, Office of Electricity Delivery and Energy Reliability MODERN GRID S T R A T E G Y Many are working on the Smart Grid FERC DOE-OE Grid 2030 GridWise Alliance EEI NERC (FM) DOE/NETL Modern Grid

  8. ARM - 2008 Performance Metrics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Atmospheric System Research (ASR) Earth System Modeling Regional & Global Climate Modeling Terrestrial Ecosystem Science Performance Metrics User Meetings Past ARM Science Team ...

  9. ARM - 2006 Performance Metrics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Atmospheric System Research (ASR) Earth System Modeling Regional & Global Climate Modeling Terrestrial Ecosystem Science Performance Metrics User Meetings Past ARM Science Team ...

  10. ARM - 2007 Performance Metrics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Atmospheric System Research (ASR) Earth System Modeling Regional & Global Climate Modeling Terrestrial Ecosystem Science Performance Metrics User Meetings Past ARM Science Team ...

  11. NIF Target Shot Metrics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    target shot metrics NIF Target Shot Metrics Exp Cap - Experimental Capability Natl Sec Appl - National Security Applications DS - Discovery Science ICF - Inertial Confinement Fusion HED - High Energy Density

  12. Surveillance metrics sensitivity study.

    SciTech Connect (OSTI)

    Hamada, Michael S.; Bierbaum, Rene Lynn; Robertson, Alix A.

    2011-09-01

    In September of 2009, a Tri-Lab team was formed to develop a set of metrics relating to the NNSA nuclear weapon surveillance program. The purpose of the metrics was to describe, quantitatively and/or qualitatively, the effect of realized or non-realized surveillance activities on our confidence in reporting reliability and assessing the stockpile. As a part of this effort, a statistical sub-team investigated various techniques and developed a complementary set of statistical metrics that could serve as a foundation for characterizing aspects of meeting the surveillance program objectives. The metrics are a combination of tolerance limit calculations and power calculations, intended to answer level-of-confidence type questions with respect to the ability to detect certain undesirable behaviors (catastrophic defects, margin insufficiency defects, and deviations from a model). Note that the metrics are not intended to gauge product performance but instead the adequacy of surveillance. This report gives a short description of four metric types that were explored and the results of a sensitivity study conducted to investigate their behavior for various inputs. The results of the sensitivity study can be used to set the risk parameters that specify the level of stockpile problem that the surveillance program should be addressing.
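
    The metrics described above are power-type calculations for detecting undesirable behaviors through sampling. As an illustrative sketch only (a simple binomial detection probability under independent draws, not the tolerance-limit or power metrics defined in the report):

        def detection_probability(defect_rate, sample_size):
            """Probability that a random sample contains at least one defective unit,
            assuming independent draws with a constant defect rate.
            Illustrative only; not the Tri-Lab team's metrics."""
            return 1.0 - (1.0 - defect_rate) ** sample_size

        # Example: a 1% defect rate and 100 sampled units.
        print(round(detection_probability(defect_rate=0.01, sample_size=100), 3))  # ~0.634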

  13. Multi-Metric Sustainability Analysis

    SciTech Connect (OSTI)

    Cowlin, S.; Heimiller, D.; Macknick, J.; Mann, M.; Pless, J.; Munoz, D.

    2014-12-01

    A readily accessible framework that allows for evaluating impacts and comparing tradeoffs among factors in energy policy, expansion planning, and investment decision making is lacking. Recognizing this, the Joint Institute for Strategic Energy Analysis (JISEA) funded an exploration of multi-metric sustainability analysis (MMSA) to provide energy decision makers with a means to make more comprehensive comparisons of energy technologies. The resulting MMSA tool lets decision makers simultaneously compare technologies and potential deployment locations.

  14. Metric Construction | Open Energy Information

    Open Energy Info (EERE)

    Name: Metric Construction Place: Boston, MA Partnership with NREL: Yes Partnership Type: Test...

  15. Cyber threat metrics.

    SciTech Connect (OSTI)

    Frye, Jason Neal; Veitch, Cynthia K.; Mateski, Mark Elliot; Michalski, John T.; Harris, James Mark; Trevino, Cassandra M.; Maruoka, Scott

    2012-03-01

    Threats are generally much easier to list than to describe, and much easier to describe than to measure. As a result, many organizations list threats. Fewer describe them in useful terms, and still fewer measure them in meaningful ways. This is particularly true in the dynamic and nebulous domain of cyber threats - a domain that tends to resist easy measurement and, in some cases, appears to defy any measurement. We believe the problem is tractable. In this report we describe threat metrics and models for characterizing threats consistently and unambiguously. The purpose of this report is to support the Operational Threat Assessment (OTA) phase of risk and vulnerability assessment. To this end, we focus on the task of characterizing cyber threats using consistent threat metrics and models. In particular, we address threat metrics and models for describing malicious cyber threats to US FCEB agencies and systems.

  16. Metrics for Energy Resilience

    SciTech Connect (OSTI)

    Paul E. Roege; Zachary A. Collier; James Mancillas; John A. McDonagh; Igor Linkov

    2014-09-01

    Energy lies at the backbone of any advanced society and constitutes an essential prerequisite for economic growth, social order and national defense. However, there is an Achilles heel to today's energy and technology relationship; namely, a precarious intimacy between energy and the fiscal, social, and technical systems it supports. Recently, widespread and persistent disruptions in energy systems have highlighted the extent of this dependence and the vulnerability of increasingly optimized systems to changing conditions. Resilience is an emerging concept that offers to reconcile considerations of performance under dynamic environments and across multiple time frames by supplementing traditionally static system performance measures to consider behaviors under changing conditions and complex interactions among physical, information and human domains. This paper identifies metrics useful to implement guidance for energy-related planning, design, investment, and operation. Recommendations are presented using a matrix format to provide a structured and comprehensive framework of metrics relevant to a system's energy resilience. The study synthesizes previously proposed metrics and emergent resilience literature to provide a multi-dimensional model intended for use by leaders and practitioners as they transform our energy posture from one of stasis and reaction to one that is proactive and which fosters sustainable growth.

  17. Ames Laboratory Metrics | The Ames Laboratory

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Metrics Document Number: NA Effective Date: 01/2016 File (public): ameslab_metrics_01-14-16

  18. Variable metric conjugate gradient methods

    SciTech Connect (OSTI)

    Barth, T.; Manteuffel, T.

    1994-07-01

    1.1 Motivation. In this paper we present a framework that includes many well known iterative methods for the solution of nonsymmetric linear systems of equations, Ax = b. Section 2 begins with a brief review of the conjugate gradient method. Next, we describe a broader class of methods, known as projection methods, to which the conjugate gradient (CG) method and most conjugate gradient-like methods belong. The concept of a method having either a fixed or a variable metric is introduced. Methods that have a metric are referred to as either fixed or variable metric methods. Some relationships between projection methods and fixed (variable) metric methods are discussed. The main emphasis of the remainder of this paper is on variable metric methods. In Section 3 we show how the biconjugate gradient (BCG), and the quasi-minimal residual (QMR) methods fit into this framework as variable metric methods. By modifying the underlying Lanczos biorthogonalization process used in the implementation of BCG and QMR, we obtain other variable metric methods. These, we refer to as generalizations of BCG and QMR.
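
    For readers unfamiliar with the baseline method the abstract reviews, here is a minimal sketch of the classical conjugate gradient iteration for a symmetric positive-definite system Ax = b (textbook form; the variable-metric BCG and QMR variants discussed in the paper are not shown):

        import numpy as np

        def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
            """Classical CG for symmetric positive-definite A (textbook form)."""
            x = np.zeros_like(b)
            r = b - A @ x          # residual
            p = r.copy()           # search direction
            rs_old = r @ r
            for _ in range(max_iter):
                Ap = A @ p
                alpha = rs_old / (p @ Ap)
                x += alpha * p
                r -= alpha * Ap
                rs_new = r @ r
                if np.sqrt(rs_new) < tol:
                    break
                p = r + (rs_new / rs_old) * p
                rs_old = rs_new
            return x

        A = np.array([[4.0, 1.0], [1.0, 3.0]])
        b = np.array([1.0, 2.0])
        print(conjugate_gradient(A, b))  # approx [0.0909, 0.6364]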

  19. Daylight metrics and energy savings

    SciTech Connect (OSTI)

    Mardaljevic, John; Heschong, Lisa; Lee, Eleanor

    2009-12-31

    The drive towards sustainable, low-energy buildings has increased the need for simple, yet accurate methods to evaluate whether a daylit building meets minimum standards for energy and human comfort performance. Current metrics do not account for the temporal and spatial aspects of daylight, nor of occupants comfort or interventions. This paper reviews the historical basis of current compliance methods for achieving daylit buildings, proposes a technical basis for development of better metrics, and provides two case study examples to stimulate dialogue on how metrics can be applied in a practical, real-world context.

  20. List of SEP Reporting Metrics

    Broader source: Energy.gov [DOE]

    DOE State Energy Program List of Reporting Metrics, which was produced by the Office of Energy Efficiency and Renewable Energy Weatherization and Intergovernmental Program for SEP and the Energy Efficiency and Conservation Block Grants (EECBG) programs.

  1. Common Carbon Metric | Open Energy Information

    Open Energy Info (EERE)

    Common Carbon Metric Tool Summary Name: Common Carbon Metric Agency/Company/Organization: United Nations Environment Programme, World...

  2. Performance Metrics Tiers | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Performance Metrics Tiers The performance metrics defined by the Commercial Buildings Integration Program offer different tiers of information to address the needs of various users. On this page you will find information about the various goals users are trying to achieve by using performance metrics and the tiers of metrics. Goals in Measuring Performance Many individuals and groups are involved with a building over its lifetime, and all have different interests in and

  3. Thermodynamic Metrics and Optimal Paths

    SciTech Connect (OSTI)

    Sivak, David; Crooks, Gavin

    2012-05-08

    A fundamental problem in modern thermodynamics is how a molecular-scale machine performs useful work, while operating away from thermal equilibrium without excessive dissipation. To this end, we derive a friction tensor that induces a Riemannian manifold on the space of thermodynamic states. Within the linear-response regime, this metric structure controls the dissipation of finite-time transformations, and bestows optimal protocols with many useful properties. We discuss the connection to the existing thermodynamic length formalism, and demonstrate the utility of this metric by solving for optimal control parameter protocols in a simple nonequilibrium model.
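
    As schematic context for the friction-tensor and thermodynamic-length language used above (notation is illustrative and not copied from the paper): within linear response, a friction tensor over the control parameters induces a metric whose associated length governs the excess dissipation of a finite-time protocol, roughly

        \[
        \zeta_{ij}(\boldsymbol{\lambda}) \;\approx\; \beta \int_{0}^{\infty}
            \bigl\langle \delta X_i(t)\,\delta X_j(0) \bigr\rangle_{\boldsymbol{\lambda}}\, dt ,
        \qquad
        \mathcal{L} \;=\; \int_{0}^{\tau}
            \sqrt{\dot{\boldsymbol{\lambda}}^{\mathsf T}\,\zeta(\boldsymbol{\lambda})\,\dot{\boldsymbol{\lambda}}}\; dt ,
        \]

    where the X_i are forces conjugate to the control parameters lambda_i and beta = 1/k_B T. In this framework, near-optimal protocols traverse the induced metric at roughly constant speed, which is the sense in which the metric structure yields optimal protocols.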

  4. Comparing Resource Adequacy Metrics: Preprint

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Comparing Resource Adequacy Metrics Preprint E. Ibanez and M. Milligan National Renewable Energy Laboratory To be presented at the 13th International Workshop on Large-Scale Integration of Wind Power into Power Systems as Well as on Transmission Networks for Offshore Wind Power Plants Berlin, Germany November 11-13, 2014 Conference Paper NREL/CP-5D00-62847 September 2014 NOTICE The submitted manuscript has been offered by an employee of the Alliance for Sustainable Energy, LLC (Alliance), a

  5. EECBG SEP Attachment 1 - Process metric list

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    10-07B/SEP 10-006A Attachment 1: Process Metrics List Metric Area Metric Primary or Optional Metric Item(s) to Report On 1. Building Retrofits 1a. Buildings retrofitted, by sector Number of buildings retrofitted Square footage of buildings retrofitted 1b. Energy management systems installed, by sector Number of energy management systems installed Square footage of buildings under management 1c. Building roofs retrofitted, by sector Number of building roofs retrofitted Square footage of building

  6. Module 6- Metrics, Performance Measurements and Forecasting

    Broader source: Energy.gov [DOE]

    This module reviews metrics such as cost and schedule variance along with cost and schedule performance indices.

  7. Definition of GPRA08 benefits metrics

    SciTech Connect (OSTI)

    None, None

    2009-01-18

    Background information for the FY 2007 GPRA methodology review on the definitions of GPRA08 benefits metrics.

  8. Buildings Performance Metrics Terminology | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Buildings Performance Metrics Terminology This document provides the terms and definitions used in the Department of Energy's Performance Metrics Research Project. metrics_terminology_20090203.pdf (152.35 KB) More Documents & Publications Procuring Architectural and Engineering Services for Energy Efficiency and Sustainability Transmittal Letter for the Statewide Benchmarking Process Evaluation Guide for Benchmarking Residential Energy Efficiency Program

  9. Defining a Standard Metric for Electricity Savings

    SciTech Connect (OSTI)

    Brown, Marilyn; Akbari, Hashem; Blumstein, Carl; Koomey, Jonathan; Brown, Richard; Calwell, Chris; Carter, Sheryl; Cavanagh, Ralph; Chang, Audrey; Claridge, David; Craig, Paul; Diamond, Rick; Eto, Joseph H.; Fulkerson, William; Gadgil, Ashok; Geller, Howard; Goldemberg, Jose; Goldman, Chuck; Goldstein, David B.; Greenberg, Steve; Hafemeister, David; Harris, Jeff; Harvey, Hal; Heitz, Eric; Hirst, Eric; Hummel, Holmes; Kammen, Dan; Kelly, Henry; Laitner, Skip; Levine, Mark; Lovins, Amory; Masters, Gil; McMahon, James E.; Meier, Alan; Messenger, Michael; Millhone, John; Mills, Evan; Nadel, Steve; Nordman, Bruce; Price, Lynn; Romm, Joe; Ross, Marc; Rufo, Michael; Sathaye, Jayant; Schipper, Lee; Schneider, Stephen H; Sweeney, James L; Verdict, Malcolm; Vorsatz, Diana; Wang, Devra; Weinberg, Carl; Wilk, Richard; Wilson, John; Worrell, Ernst

    2009-03-01

    The growing investment by governments and electric utilities in energy efficiency programs highlights the need for simple tools to help assess and explain the size of the potential resource. One technique that is commonly used in this effort is to characterize electricity savings in terms of avoided power plants, because it is easier for people to visualize a power plant than it is to understand an abstraction such as billions of kilowatt-hours. Unfortunately, there is no standardization around the characteristics of such power plants. In this letter we define parameters for a standard avoided power plant that have physical meaning and intuitive plausibility, for use in back-of-the-envelope calculations. For the prototypical plant this article settles on a 500 MW existing coal plant operating at a 70% capacity factor with 7% T&D losses. Displacing such a plant for one year would save 3 billion kWh per year at the meter and reduce emissions by 3 million metric tons of CO2 per year. The proposed name for this metric is the Rosenfeld, in keeping with the tradition among scientists of naming units in honor of the person most responsible for the discovery and widespread adoption of the underlying scientific principle in question: Dr. Arthur H. Rosenfeld.
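
    The avoided-plant arithmetic quoted above can be checked with a few lines (a sketch using only the parameters stated in the abstract; the CO2 emissions factor is inferred from the stated result rather than quoted):

        # Back-of-the-envelope check of the "Rosenfeld" avoided-plant figures.
        capacity_mw = 500          # existing coal plant, as stated in the abstract
        capacity_factor = 0.70     # 70% capacity factor
        td_losses = 0.07           # 7% transmission & distribution losses
        hours_per_year = 8760

        generation_kwh = capacity_mw * 1000 * capacity_factor * hours_per_year
        savings_at_meter_kwh = generation_kwh * (1 - td_losses)

        print(f"Generation:     {generation_kwh / 1e9:.2f} billion kWh/yr")        # ~3.07
        print(f"Saved at meter: {savings_at_meter_kwh / 1e9:.2f} billion kWh/yr")  # ~2.85, i.e. ~3
        # The stated 3 million metric tons of CO2/yr implies an emissions factor of
        # roughly 1 kg CO2 per kWh generated (an inference, not a value quoted above).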

  10. Comparing Resource Adequacy Metrics: Preprint

    SciTech Connect (OSTI)

    Ibanez, E.; Milligan, M.

    2014-09-01

    As the penetration of variable generation (wind and solar) increases around the world, there is an accompanying growing interest and importance in accurately assessing the contribution that these resources can make toward planning reserve. This contribution, also known as the capacity credit or capacity value of the resource, is best quantified by using a probabilistic measure of overall resource adequacy. In recognizing the variable nature of these renewable resources, there has been interest in exploring the use of reliability metrics other than loss of load expectation. In this paper, we undertake some comparisons using data from the Western Electricity Coordinating Council in the western United States.
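
    As a rough sketch of the kind of probabilistic adequacy measure named above (a simplified hourly loss-of-load count over placeholder data; actual loss-of-load-expectation studies convolve generator outage probabilities rather than compare a single capacity trace):

        import numpy as np

        def loss_of_load_hours(hourly_load_mw, hourly_available_capacity_mw):
            """Count the hours in which load exceeds available capacity
            (a simplified adequacy measure; illustrative only)."""
            load = np.asarray(hourly_load_mw)
            capacity = np.asarray(hourly_available_capacity_mw)
            return int(np.sum(load > capacity))

        # Placeholder hourly traces for one year (8760 hours), not real system data.
        rng = np.random.default_rng(0)
        load = rng.normal(900, 120, 8760)
        capacity = rng.normal(1100, 80, 8760)
        print(loss_of_load_hours(load, capacity), "loss-of-load hours per year")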

  11. Western Resource Adequacy: Challenges - Approaches - Metrics...

    Energy Savers [EERE]

    Eastern Wind Integration and Transmission Study (EWITS) (Revised) Conceptual Framework for Developing Resilience Metrics for the Electricity, Oil, and Gas Sectors in the United ...

  12. Microsoft Word - QER Resilience Metrics - Technical Workshp ...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Workshop Resilience Metrics for Energy Transmission and Distribution Infrastructure Offices of Electricity Delivery and Energy Reliability (OE) and Energy Policy and Systems ...

  13. Microsoft Word - QER Resilience Metrics - Technical Workshp ...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Quadrennial Energy Review Technical Workshop on Resilience Metrics for Energy Transmission and Distribution Infrastructure April 29, 2014 777 North Capitol St NE Ste 300, ...

  14. Module 6 - Metrics, Performance Measurements and Forecasting...

    Broader source: Energy.gov (indexed) [DOE]

    This module reviews metrics such as cost and schedule variance along with cost and schedule performance indices. In addition, this module will outline forecasting tools such as ...

  15. Efficient Synchronization Stability Metrics for Fault Clearing...

    Office of Scientific and Technical Information (OSTI)

    Title: Efficient Synchronization Stability Metrics for Fault Clearing Authors: Backhaus, Scott N.; Chertkov, Michael; Bent, Russell Whitford; Bienstock, Daniel ...

  16. Sheet1 Water Availability Metric (Acre-Feet/Yr) Water Cost Metric...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Sheet1 Water Availability Metric (Acre-Feet/Yr) Water Cost Metric (Acre-Foot) Current Water Use (Acre-Feet/Yr) Projected Use in 2030 (Acre-Feet/Yr) HUC8 STATE BASIN SUBBASIN ...

  17. Smart Grid Status and Metrics Report Appendices

    SciTech Connect (OSTI)

    Balducci, Patrick J.; Antonopoulos, Chrissi A.; Clements, Samuel L.; Gorrissen, Willy J.; Kirkham, Harold; Ruiz, Kathleen A.; Smith, David L.; Weimar, Mark R.; Gardner, Chris; Varney, Jeff

    2014-07-01

    A smart grid uses digital power control and communication technology to improve the reliability, security, flexibility, and efficiency of the electric system, from large generation through the delivery systems to electricity consumers and a growing number of distributed generation and storage resources. To convey progress made in achieving the vision of a smart grid, this report uses a set of six characteristics derived from the National Energy Technology Laboratory Modern Grid Strategy. The Smart Grid Status and Metrics Report defines and examines 21 metrics that collectively provide insight into the grid’s capacity to embody these characteristics. This appendix presents papers covering each of the 21 metrics identified in Section 2.1 of the Smart Grid Status and Metrics Report. These metric papers were prepared in advance of the main body of the report and collectively form its informational backbone.

  18. Metrics for border management systems.

    SciTech Connect (OSTI)

    Duggan, Ruth Ann

    2009-07-01

    There are as many unique and disparate manifestations of border systems as there are borders to protect. Border Security is a highly complex system analysis problem with global, regional, national, sector, and border element dimensions for land, water, and air domains. The complexity increases with the multiple, and sometimes conflicting, missions for regulating the flow of people and goods across borders, while securing them for national security. These systems include frontier border surveillance, immigration management and customs functions that must operate in a variety of weather, terrain, operational conditions, cultural constraints, and geopolitical contexts. As part of a Laboratory Directed Research and Development Project 08-684 (Year 1), the team developed a reference framework to decompose this complex system into international/regional, national, and border elements levels covering customs, immigration, and border policing functions. This generalized architecture is relevant to both domestic and international borders. As part of year two of this project (09-1204), the team determined relevant relative measures to better understand border management performance. This paper describes those relative metrics and how they can be used to improve border management systems.

  19. Description of the Sandia National Laboratories science, technology & engineering metrics process.

    SciTech Connect (OSTI)

    Jordan, Gretchen B.; Watkins, Randall D.; Trucano, Timothy Guy; Burns, Alan Richard; Oelschlaeger, Peter

    2010-04-01

    There has been a concerted effort since 2007 to establish a dashboard of metrics for the Science, Technology, and Engineering (ST&E) work at Sandia National Laboratories. These metrics are to provide a self assessment mechanism for the ST&E Strategic Management Unit (SMU) to complement external expert review and advice and various internal self assessment processes. The data and analysis will help ST&E Managers plan, implement, and track strategies and work in order to support the critical success factors of nurturing core science and enabling laboratory missions. The purpose of this SAND report is to provide a guide for those who want to understand the ST&E SMU metrics process. This report provides an overview of why the ST&E SMU wants a dashboard of metrics, some background on metrics for ST&E programs from existing literature and past Sandia metrics efforts, a summary of work completed to date, specifics on the portfolio of metrics that have been chosen and the implementation process that has been followed, and plans for the coming year to improve the ST&E SMU metrics process.

  20. Metrics for comparison of crystallographic maps

    SciTech Connect (OSTI)

    Urzhumtsev, Alexandre; Afonine, Pavel V.; Lunin, Vladimir Y.; Terwilliger, Thomas C.; Adams, Paul D.

    2014-10-01

    Numerical comparison of crystallographic contour maps is used extensively in structure solution and model refinement, analysis and validation. However, traditional metrics such as the map correlation coefficient (map CC, real-space CC or RSCC) sometimes contradict the results of visual assessment of the corresponding maps. This article explains such apparent contradictions and suggests new metrics and tools to compare crystallographic contour maps. The key to the new methods is rank scaling of the Fourier syntheses. The new metrics are complementary to the usual map CC and can be more helpful in map comparison, in particular when only some of their aspects, such as regions of high density, are of interest.
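
    The traditional map correlation coefficient discussed above is, in essence, a Pearson correlation computed over the grid points of two maps. A minimal sketch of that baseline follows (not the rank-scaling metrics the article proposes; the map arrays are placeholders):

        import numpy as np

        def map_cc(map1, map2):
            """Real-space map correlation coefficient: Pearson correlation of two
            density maps evaluated over the same grid points."""
            a = np.asarray(map1, dtype=float).ravel()
            b = np.asarray(map2, dtype=float).ravel()
            a -= a.mean()
            b -= b.mean()
            return float(a @ b / np.sqrt((a @ a) * (b @ b)))

        rng1, rng2 = np.random.default_rng(1), np.random.default_rng(2)
        rho_calc = rng1.random((8, 8, 8))
        rho_obs = rho_calc + 0.05 * rng2.normal(size=(8, 8, 8))
        print(round(map_cc(rho_calc, rho_obs), 3))  # close to 1 for similar maps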

  1. Technical Workshop: Resilience Metrics for Energy Transmission...

    Broader source: Energy.gov (indexed) [DOE]

    List (55.27 KB) Sandia Report: Conceptual Framework for Developing Resilience Metrics for the Electricity, Oil, and Gas Sectors in the United States (14.49 MB) Sandia ...

  2. Clean Cities Annual Metrics Report 2009 (Revised)

    SciTech Connect (OSTI)

    Johnson, C.

    2011-08-01

    Document provides Clean Cities coalition metrics about the use of alternative fuels; the deployment of alternative fuel vehicles, hybrid electric vehicles (HEVs), and idle reduction initiatives; fuel economy activities; and programs to reduce vehicle miles driven.

  3. FY 2014 Q3 Metric Summary | Department of Energy

    Office of Environmental Management (EM)

    FY 2014 Overall Contract and Project Management Improvement Performance Metrics and Targets FY 2015 Overall Contract and Project Management Improvement Performance Metrics and ...

  4. Business Metrics for High-Performance Homes: A Colorado Springs...

    Office of Scientific and Technical Information (OSTI)

    Technical Report: Business Metrics for High-Performance Homes: A Colorado Springs Case Study ...

  5. Texas CO2 Capture Demonstration Project Hits Three Million Metric...

    Office of Environmental Management (EM)

    Texas CO2 Capture Demonstration Project Hits Three Million Metric Ton Milestone June 30, 2016 - ...

  6. Label-invariant Mesh Quality Metrics. (Conference) | SciTech...

    Office of Scientific and Technical Information (OSTI)

    Label-invariant Mesh Quality Metrics. Abstract not provided. Authors: Knupp, Patrick Publication ...

  7. Implementing the Data Center Energy Productivity Metric

    SciTech Connect (OSTI)

    Sego, Landon H.; Marquez, Andres; Rawson, Andrew; Cader, Tahir; Fox, Kevin M.; Gustafson, William I.; Mundy, Christopher J.

    2012-10-01

    As data centers proliferate in both size and number, their energy efficiency is becoming increasingly important. We discuss the properties of a number of the proposed metrics of energy efficiency and productivity. In particular, we focus on the Data Center Energy Productivity (DCeP) metric, which is the ratio of useful work produced by the data center to the energy consumed performing that work. We describe our approach for using DCeP as the principal outcome of a designed experiment using a highly instrumented, high performance computing data center. We found that DCeP was successful in clearly distinguishing between different operational states in the data center, thereby validating its utility as a metric for identifying configurations of hardware and software that would improve (or even maximize) energy productivity. We also discuss some of the challenges and benefits associated with implementing the DCeP metric, and we examine the efficacy of the metric in making comparisons within a data center and among data centers.
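
    DCeP, as defined above, is a simple ratio of useful work to energy consumed. A minimal sketch (the definition and units of "useful work" are application-specific, and the numbers below are placeholders, not results from the study):

        def data_center_energy_productivity(useful_work_units, energy_consumed_kwh):
            """DCeP = useful work produced / energy consumed producing that work.
            'Useful work' must be defined per application; here it is just a count."""
            return useful_work_units / energy_consumed_kwh

        # Placeholder numbers, not measurements from the experiment described above.
        print(data_center_energy_productivity(useful_work_units=1.2e6,
                                              energy_consumed_kwh=4.0e4))  # 30.0 units/kWh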

  8. Instructions for EM Corporate Performance Metrics | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Instructions for EM Corporate Performance Metrics Quality Program Criteria Instructions for EM Corporate Performance Metrics (128.47 KB) More Documents & Publications EM Corporate QA Performance Metrics CPMS Tables QA Corporate Board Meeting - July 2008

  9. Metrics for Evaluating the Accuracy of Solar Power Forecasting (Presentation)

    SciTech Connect (OSTI)

    Zhang, J.; Hodge, B.; Florita, A.; Lu, S.; Hamann, H.; Banunarayanan, V.

    2013-10-01

    This presentation proposes a suite of metrics for evaluating the performance of solar power forecasting.

  10. Metrics for comparison of crystallographic maps

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Urzhumtsev, Alexandre; Afonine, Pavel V.; Lunin, Vladimir Y.; Terwilliger, Thomas C.; Adams, Paul D.

    2014-10-01

    Numerical comparison of crystallographic contour maps is used extensively in structure solution and model refinement, analysis and validation. However, traditional metrics such as the map correlation coefficient (map CC, real-space CC or RSCC) sometimes contradict the results of visual assessment of the corresponding maps. This article explains such apparent contradictions and suggests new metrics and tools to compare crystallographic contour maps. The key to the new methods is rank scaling of the Fourier syntheses. The new metrics are complementary to the usual map CC and can be more helpful in map comparison, in particular when only some of their aspects, such as regions of high density, are of interest.

  11. Enhanced Accident Tolerant LWR Fuels: Metrics Development

    SciTech Connect (OSTI)

    Shannon Bragg-Sitton; Lori Braase; Rose Montgomery; Chris Stanek; Robert Montgomery; Lance Snead; Larry Ott; Mike Billone

    2013-09-01

    The Department of Energy (DOE) Fuel Cycle Research and Development (FCRD) Advanced Fuels Campaign (AFC) is conducting research and development on enhanced Accident Tolerant Fuels (ATF) for light water reactors (LWRs). This mission emphasizes the development of novel fuel and cladding concepts to replace the current zirconium alloy-uranium dioxide (UO2) fuel system. The overall mission of the ATF research is to develop advanced fuels/cladding with improved performance, reliability and safety characteristics during normal operations and accident conditions, while minimizing waste generation. The initial effort will focus on implementation in operating reactors or reactors with design certifications. To initiate the development of quantitative metrics for ATF, an LWR Enhanced Accident Tolerant Fuels Metrics Development Workshop was held in October 2012 in Germantown, MD. This paper summarizes the outcome of that workshop and the current status of metrics development for LWR ATF.

  12. EECBG SEP Attachment 1 - Process metric list | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    SEP Attachment 1 - Process metric list EECBG SEP Attachment 1 - Process metric list Reporting Guidance Process Metric List eecbg_10_07b_sep__10_006a_attachment1_process_metric_list.pdf (93.56 KB) More Documents & Publications EECBG 10-07C/SEP 10-006B Attachment 1: Process Metrics List EECBG Program Notice 10-07A DOE Recovery Act Reporting Requirements for the State Energy Program

  13. Performance Metrics Research Project - Final Report

    SciTech Connect (OSTI)

    Deru, M.; Torcellini, P.

    2005-10-01

    NREL began work for DOE on this project to standardize the measurement and characterization of building energy performance. NREL's primary research objectives were to determine which performance metrics have greatest value for determining energy performance and to develop standard definitions and methods of measuring and reporting that performance.

  14. Clean Cities 2011 Annual Metrics Report

    SciTech Connect (OSTI)

    Johnson, C.

    2012-12-01

    This report details the petroleum savings and vehicle emissions reductions achieved by the U.S. Department of Energy's Clean Cities program in 2011. The report also details other performance metrics, including the number of stakeholders in Clean Cities coalitions, outreach activities by coalitions and national laboratories, and alternative fuel vehicles deployed.

  15. Clean Cities 2010 Annual Metrics Report

    SciTech Connect (OSTI)

    Johnson, C.

    2012-10-01

    This report details the petroleum savings and vehicle emissions reductions achieved by the U.S. Department of Energy's Clean Cities program in 2010. The report also details other performance metrics, including the number of stakeholders in Clean Cities coalitions, outreach activities by coalitions and national laboratories, and alternative fuel vehicles deployed.

  16. Energy Department Project Captures and Stores One Million Metric...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Energy Department Project Captures and Stores One Million Metric Tons of Carbon January 8, 2015 - 11:18am News Media Contact 202-586-4940 ...

  17. Widget:CrazyEggMetrics | Open Energy Information

    Open Energy Info (EERE)

    CrazyEggMetrics This widget runs javascript code for the Crazy Egg user experience metrics. This should not be on all pages, but on select pages...

  18. Smart Grid Status and Metrics Report

    SciTech Connect (OSTI)

    Balducci, Patrick J.; Weimar, Mark R.; Kirkham, Harold

    2014-07-01

    To convey progress made in achieving the vision of a smart grid, this report uses a set of six characteristics derived from the National Energy Technology Laboratory Modern Grid Strategy. It measures 21 metrics to provide insight into the grid’s capacity to embody these characteristics. This report looks across a spectrum of smart grid concerns to measure the status of smart grid deployment and impacts.

  19. Financial Metrics Data Collection Protocol, Version 1.0

    SciTech Connect (OSTI)

    Fowler, Kimberly M.; Gorrissen, Willy J.; Wang, Na

    2010-04-30

    Brief description of data collection process and plan that will be used to collect financial metrics associated with sustainable design.

  20. Nonmaximality of known extremal metrics on torus and Klein bottle

    SciTech Connect (OSTI)

    Karpukhin, M A

    2013-12-31

    The El Soufi-Ilias theorem establishes a connection between minimal submanifolds of spheres and extremal metrics for eigenvalues of the Laplace-Beltrami operator. Recently, this connection was used to provide several explicit examples of extremal metrics. We investigate the properties of these metrics and prove that none of them is maximal. Bibliography: 24 titles.
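
    For context, an "extremal metric for eigenvalues of the Laplace-Beltrami operator" on a closed surface refers to a critical point of the scale-invariant eigenvalue functional (stated schematically with the standard normalization, not quoted from the paper):

        \[
        \bar{\lambda}_k(M, g) \;=\; \lambda_k(M, g)\,\operatorname{Area}(M, g),
        \]

    where lambda_k(M, g) is the k-th nonzero eigenvalue of the Laplace-Beltrami operator of the metric g; a metric is called extremal (or maximal) for lambda_k when it is a critical point (respectively, a maximizer) of this normalized functional over metrics on the surface.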

  1. Annex A Metrics for the Smart Grid System Report

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Annex A Metrics for the Smart Grid System Report - Table of Contents: Introduction; Metric #1: The Fraction of Customers and Total Load Served by Real-Time Pricing, Critical Peak Pricing, and Time-of-Use Pricing; Metric #2: Real-Time System Operations Data

  2. Metrics For Comparing Plasma Mass Filters

    SciTech Connect (OSTI)

    Abraham J. Fetterman and Nathaniel J. Fisch

    2012-08-15

    High-throughput mass separation of nuclear waste may be useful for optimal storage, disposal, or environmental remediation. The most dangerous part of nuclear waste is the fission product, which produces most of the heat and medium-term radiation. Plasmas are well-suited to separating nuclear waste because they can separate many different species in a single step. A number of plasma devices have been designed for such mass separation, but there has been no standardized comparison between these devices. We define a standard metric, the separative power per unit volume, and derive it for three different plasma mass filters: the plasma centrifuge, Ohkawa filter, and the magnetic centrifugal mass filter.

  3. Metrics for comparing plasma mass filters

    SciTech Connect (OSTI)

    Fetterman, Abraham J.; Fisch, Nathaniel J.

    2011-10-15

    High-throughput mass separation of nuclear waste may be useful for optimal storage, disposal, or environmental remediation. The most dangerous part of nuclear waste is the fission product, which produces most of the heat and medium-term radiation. Plasmas are well-suited to separating nuclear waste because they can separate many different species in a single step. A number of plasma devices have been designed for such mass separation, but there has been no standardized comparison between these devices. We define a standard metric, the separative power per unit volume, and derive it for three different plasma mass filters: the plasma centrifuge, Ohkawa filter, and the magnetic centrifugal mass filter.

  4. Clean Cities 2013 Annual Metrics Report

    Alternative Fuels and Advanced Vehicles Data Center [Office of Energy Efficiency and Renewable Energy (EERE)]

    2013 Annual Metrics Report Caley Johnson and Mark Singer National Renewable Energy Laboratory Technical Report NREL/TP-5400-62838 October 2014 NREL is a national laboratory of the U.S. Department of Energy Office of Energy Efficiency & Renewable Energy Operated by the Alliance for Sustainable Energy, LLC This report is available at no cost from the National Renewable Energy Laboratory (NREL) at www.nrel.gov/publications. Contract No. DE-AC36-08GO28308 National Renewable Energy Laboratory 15013

  5. Clean Cities 2014 Annual Metrics Report

    Alternative Fuels and Advanced Vehicles Data Center [Office of Energy Efficiency and Renewable Energy (EERE)]

    2014 Annual Metrics Report Caley Johnson and Mark Singer National Renewable Energy Laboratory Technical Report NREL/TP-5400-65265 December 2015 NREL is a national laboratory of the U.S. Department of Energy Office of Energy Efficiency & Renewable Energy Operated by the Alliance for Sustainable Energy, LLC This report is available at no cost from the National Renewable Energy Laboratory (NREL) at www.nrel.gov/publications. Contract No. DE-AC36-08GO28308 National Renewable Energy Laboratory 15013

  6. Metric redefinitions in Einstein-Aether theory

    SciTech Connect (OSTI)

    Foster, Brendan Z.

    2005-08-15

    'Einstein-Aether' theory, in which gravity couples to a dynamical, timelike, unit-norm vector field, provides a means for studying Lorentz violation in a generally covariant setting. Demonstrated here is the effect of a redefinition of the metric and 'aether' fields in terms of the original fields and two free parameters. The net effect is a change of the coupling constants appearing in the action. Using such a redefinition, one of the coupling constants can be set to zero, simplifying studies of solutions of the theory.

  7. Metrics for Measuring Progress Toward Implementation of the Smart Grid

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Metrics for Measuring Progress Toward Implementation of the Smart Grid (June 2008) Results of the breakout session discussions at the Smart Grid Implementation Workshop, June 19-20, 2008 Metrics for Measuring Progress Toward Implementation of the Smart Grid (308.23 KB) More Documents & Publications 5th Annual CHP Roadmap Workshop Breakout Group Results, September 2004

  8. Technical Workshop: Resilience Metrics for Energy Transmission and

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Technical Workshop: Resilience Metrics for Energy Transmission and Distribution Infrastructure During this workshop, EPSA invited technical experts from industry, national laboratories, academia, and NGOs to discuss the state of play of and need for resilience metrics and how they vary by natural gas, liquid fuels and electric grid infrastructures. Issues important to

  9. Measuring energy efficiency: Opportunities from standardization and common metrics

    U.S. Energy Information Administration (EIA) Indexed Site

    Measuring energy efficiency: Opportunities from standardization and common metrics For 2016 EIA Energy Conference July 11, 2016 | Washington, D.C. By Stacy Angel, Energy Information Portfolio Analyst Carol White, Senior Energy Efficiency Analyst How is the importance of measuring energy efficiency changing? * The number of energy efficiency policies and programs is growing. * Common metrics help measure progress towards multiple objectives. * Clear metrics help consumers make informed energy

  10. Integration of the EM Corporate QA Performance Metrics With Performance

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Integration of the EM Corporate QA Performance Metrics With Performance Analysis Process August 2009 Presenter: Robert Hinds, Savannah River Remediation, LLC Track 9-12 Topics Covered: Implementing CPMS for QA Corporate QA Performance Metrics Contractor Performance Analysis Contractor Assessment Programs Assessment Program Structure CPMS Integration with P/A Process Validating

  11. Clean Cities 2013 Annual Metrics Report

    SciTech Connect (OSTI)

    Johnson, C.; Singer, M.

    2014-10-01

    Each year, the U.S. Department of Energy asks its Clean Cities program coordinators to submit annual reports of their activities and accomplishments for the previous calendar year. Data and information are submitted via an online database that is maintained as part of the Alternative Fuels Data Center (AFDC) at the National Renewable Energy Laboratory (NREL). Coordinators submit a range of data that characterize the membership, funding, projects, and activities of their coalitions. They also submit data about sales of alternative fuels, deployment of alternative fuel vehicles (AFVs) and hybrid electric vehicles (HEVs), idle-reduction (IR) initiatives, fuel economy activities, and programs to reduce vehicle miles traveled (VMT). NREL analyzes the data and translates them into petroleum-use reduction impacts, which are summarized in this 2013 Annual Metrics Report.

  12. Metrics correlation and analysis service (MCAS)

    SciTech Connect (OSTI)

    Baranovski, Andrew; Dykstra, Dave; Garzoglio, Gabriele; Hesselroth, Ted; Mhashilkar, Parag; Levshina, Tanya; /Fermilab

    2009-05-01

    The complexity of Grid workflow activities and their associated software stacks inevitably involves multiple organizations, ownership, and deployment domains. In this setting, important and common tasks such as the correlation and display of metrics and debugging information (fundamental ingredients of troubleshooting) are challenged by the informational entropy inherent to independently maintained and operated software components. Because such an information 'pond' is disorganized, it is a difficult environment for business intelligence analysis, i.e., troubleshooting, incident investigation, and trend spotting. The mission of the MCAS project is to deliver a software solution to help with adaptation, retrieval, correlation, and display of workflow-driven data and of type-agnostic events generated by disjoint middleware.

  13. Clean Cities 2014 Annual Metrics Report

    SciTech Connect (OSTI)

    Johnson, Caley; Singer, Mark

    2015-12-22

    Each year, the U.S. Department of Energy asks its Clean Cities program coordinators to submit annual reports of their activities and accomplishments for the previous calendar year. Data and information are submitted via an online database that is maintained as part of the Alternative Fuels Data Center (AFDC) at the National Renewable Energy Laboratory (NREL). Coordinators submit a range of data that characterize the membership, funding, projects, and activities of their coalitions. They also submit data about sales of alternative fuels, deployment of alternative fuel vehicles (AFVs) and hybrid electric vehicles (HEVs), idle-reduction (IR) initiatives, fuel economy activities, and programs to reduce vehicle miles traveled (VMT). NREL analyzes the data and translates them into petroleum-use reduction impacts, which are summarized in this 2014 Annual Metrics Report.

  14. Conceptual Framework for Developing Resilience Metrics for the...

    Energy Savers [EERE]

    for the Electricity, Oil, and Gas Sectors in the United States (September 2015) Conceptual Framework for Developing Resilience Metrics for the Electricity, Oil, and Gas Sectors in ...

  15. Office of HC Strategy Budget and Performance Metrics (HC-50)

    Broader source: Energy.gov [DOE]

    The Office of Human Capital Strategy, Budget, and Performance Metrics provides strategic direction and advice to its stakeholders through the integration of budget analysis, workforce projections,...

  16. EM Corporate QA Performance Metrics | Department of Energy

    Broader source: Energy.gov (indexed) [DOE]

    QA Corporate Board Meeting - November 2008 Instructions for EM Corporate Performance Metrics FY 2015 SENIOR EXECUTIVE SERVICE (SES) AND SENIOR PROFESSIONAL (SP) PERFORMANCE ...

  17. DOE Announces Webinars on Solar Forecasting Metrics, the DOE...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    DOE Announces Webinars on Solar Forecasting Metrics, the DOE ... from adopting the latest energy efficiency and renewable ... to liquids technology, advantages of using natural gas, ...

  18. Integration of the EM Corporate QA Performance Metrics With Performanc...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Integration of the EM Corporate QA Performance Metrics With Performance Analysis Process ... Assessment Program Structure CPMS Integration with PA Process Validating The Process ...

  19. Exploration Cost and Time Metric | Open Energy Information

    Open Energy Info (EERE)

    lt":0,"address":"","icon":"","group":"","inlineLabel":"","visitedicon":"" Hide Map Language: English Exploration Cost and Time Metric Screenshot References: Conference Paper1...

  20. Wave Energy Converter System Requirements and Performance Metrics

    Broader source: Energy.gov [DOE]

    The Energy Department and Wave Energy Scotland are holding a joint workshop on wave energy converter (WEC) system requirements and performance metrics on Friday, February 26.

  1. Performance Metrics and Budget Division (HC-51) | Department...

    Broader source: Energy.gov (indexed) [DOE]

    of the Department of Energy's human capital initiatives and functions through the strategic integration of corporate human capital performance metrics and the budget ...

  2. Energy Department Sponsored Project Captures One Millionth Metric...

    Office of Environmental Management (EM)

    Energy Department Project Captures and Stores more than One Million Metric Tons of CO2 Carbon Pollution Being Captured, ...

  3. Practical Diagnostics for Evaluating Residential Commissioning Metrics

    SciTech Connect (OSTI)

    Wray, Craig; Walker, Iain; Siegel, Jeff; Sherman, Max

    2002-06-11

    In this report, we identify and describe 24 practical diagnostics that are ready now to evaluate residential commissioning metrics, and that we expect to include in the commissioning guide. Our discussion in the main body of this report is limited to existing diagnostics in areas of particular concern with significant interactions: envelope and HVAC systems. These areas include insulation quality, windows, airtightness, envelope moisture, fan and duct system airflows, duct leakage, cooling equipment charge, and combustion appliance backdrafting with spillage. Appendix C describes the 83 other diagnostics that we have examined in the course of this project, but that are not ready or are inappropriate for residential commissioning. Combined with Appendix B, Table 1 in the main body of the report summarizes the advantages and disadvantages of all 107 diagnostics. We first describe what residential commissioning is, its characteristic elements, and how one might structure its process. Our intent in this discussion is to formulate and clarify these issues, but is largely preliminary because such a practice does not yet exist. Subsequent sections of the report describe metrics one can use in residential commissioning, along with the consolidated set of 24 practical diagnostics that the building industry can use now to evaluate them. Where possible, we also discuss the accuracy and usability of diagnostics, based on recent laboratory work and field studies by LBNL staff and others in more than 100 houses. These studies concentrate on evaluating diagnostics in the following four areas: the DeltaQ duct leakage test, air-handler airflow tests, supply and return grille airflow tests, and refrigerant charge tests. Appendix A describes those efforts in detail. In addition, where possible, we identify the costs to purchase diagnostic equipment and the amount of time required to conduct the diagnostics. Table 1 summarizes these data. Individual equipment costs for the 24

  4. Metrics for Evaluating Conventional and Renewable Energy Technologies (Presentation)

    SciTech Connect (OSTI)

    Mann, M. K.

    2013-01-01

    With numerous options for the future of natural gas, how do we know we're going down the right path? How do we designate a metric to measure and demonstrate change and progress, and how does that metric incorporate all stakeholders and scenarios?

  5. Development of new VOC exposure metrics and their relationship to ''Sick Building Syndrome'' symptoms

    SciTech Connect (OSTI)

    Ten Brinke, JoAnn

    1995-08-01

    Volatile organic compounds (VOCs) are suspected to contribute significantly to "Sick Building Syndrome" (SBS), a complex of subchronic symptoms that occurs during and in general decreases away from occupancy of the building in question. A new approach takes into account individual VOC potencies, as well as the highly correlated nature of the complex VOC mixtures found indoors. The new VOC metrics are statistically significant predictors of symptom outcomes from the California Healthy Buildings Study data. Multivariate logistic regression analyses were used to test the hypothesis that a summary measure of the VOC mixture, other risk factors, and covariates for each worker will lead to better prediction of symptom outcome. VOC metrics based on animal irritancy measures and principal component analysis had the most influence in the prediction of eye, dermal, and nasal symptoms. After adjustment, a water-based paints and solvents source was found to be associated with dermal and eye irritation. The more typical VOC exposure metrics used in prior analyses were not useful in symptom prediction in the adjusted model (total VOC (TVOC), or sum of individually identified VOCs (ΣVOC_i)). Also not useful were three other VOC metrics that took into account potency, but did not adjust for the highly correlated nature of the data set, or the presence of VOCs that were not measured. High TVOC values (2-7 mg m^-3) due to the presence of liquid-process photocopiers observed in several study spaces significantly influenced symptoms. Analyses without the high TVOC values reduced, but did not eliminate, the ability of the VOC exposure metric based on irritancy and principal component analysis to explain symptom outcome.
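
    As an illustration of the exposure metrics contrasted in the abstract above, the sketch below computes a simple TVOC sum and a potency-weighted sum over individually identified VOCs (compound names, concentrations, and weights are placeholders, not the animal-irritancy values used in the study):

        # Concentrations in mg/m^3 for individually identified VOCs (placeholder values).
        voc_concentrations = {"toluene": 0.12, "xylene": 0.08, "formaldehyde": 0.03}

        # Placeholder potency weights (not the irritancy measures used in the study).
        potency_weights = {"toluene": 1.0, "xylene": 1.5, "formaldehyde": 8.0}

        tvoc = sum(voc_concentrations.values())                 # simple TVOC metric
        weighted = sum(potency_weights[v] * c                   # potency-weighted sum
                       for v, c in voc_concentrations.items())

        print(f"TVOC = {tvoc:.2f} mg/m^3, potency-weighted sum = {weighted:.2f}")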

  6. Quantitative metrics for assessment of chemical image quality and spatial resolution

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Kertesz, Vilmos; Cahill, John F.; Van Berkel, Gary J.

    2016-02-28

    Rationale: Currently objective/quantitative descriptions of the quality and spatial resolution of mass spectrometry derived chemical images are not standardized. Development of these standardized metrics is required to objectively describe chemical imaging capabilities of existing and/or new mass spectrometry imaging technologies. Such metrics would allow unbiased judgment of intra-laboratory advancement and/or inter-laboratory comparison for these technologies if used together with standardized surfaces. Methods: We developed two image metrics, viz., chemical image contrast (ChemIC) based on signal-to-noise related statistical measures on chemical image pixels and corrected resolving power factor (cRPF) constructed from statistical analysis of mass-to-charge chronograms across features of interest in an image. These metrics, quantifying chemical image quality and spatial resolution, respectively, were used to evaluate chemical images of a model photoresist patterned surface collected using a laser ablation/liquid vortex capture mass spectrometry imaging system under different instrument operational parameters. Results: The calculated ChemIC and cRPF metrics determined in an unbiased fashion the relative ranking of chemical image quality obtained with the laser ablation/liquid vortex capture mass spectrometry imaging system. These rankings were used to show that both chemical image contrast and spatial resolution deteriorated with increasing surface scan speed, increased lane spacing and decreasing size of surface features. Conclusions: ChemIC and cRPF, respectively, were developed and successfully applied for the objective description of chemical image quality and spatial resolution of chemical images collected from model surfaces using a laser ablation/liquid vortex capture mass spectrometry imaging system.

  7. Self-benchmarking Guide for Cleanrooms: Metrics, Benchmarks, Actions

    SciTech Connect (OSTI)

    Mathew, Paul; Sartor, Dale; Tschudi, William

    2009-07-13

    This guide describes energy efficiency metrics and benchmarks that can be used to track the performance of and identify potential opportunities to reduce energy use in laboratory buildings. This guide is primarily intended for personnel who have responsibility for managing energy use in existing laboratory facilities - including facilities managers, energy managers, and their engineering consultants. Additionally, laboratory planners and designers may also use the metrics and benchmarks described in this guide for goal-setting in new construction or major renovation. This guide provides the following information: (1) A step-by-step outline of the benchmarking process. (2) A set of performance metrics for the whole building as well as individual systems. For each metric, the guide provides a definition, performance benchmarks, and potential actions that can be inferred from evaluating this metric. (3) A list and descriptions of the data required for computing the metrics. This guide is complemented by spreadsheet templates for data collection and for computing the benchmarking metrics. This guide builds on prior research supported by the national Laboratories for the 21st Century (Labs21) program, supported by the U.S. Department of Energy and the U.S. Environmental Protection Agency. Much of the benchmarking data are drawn from the Labs21 benchmarking database and technical guides. Additional benchmark data were obtained from engineering experts including laboratory designers and energy managers.

  8. Self-benchmarking Guide for Laboratory Buildings: Metrics, Benchmarks, Actions

    SciTech Connect (OSTI)

    Mathew, Paul; Greenberg, Steve; Sartor, Dale

    2009-07-13

    This guide describes energy efficiency metrics and benchmarks that can be used to track the performance of and identify potential opportunities to reduce energy use in laboratory buildings. This guide is primarily intended for personnel who have responsibility for managing energy use in existing laboratory facilities - including facilities managers, energy managers, and their engineering consultants. Additionally, laboratory planners and designers may also use the metrics and benchmarks described in this guide for goal-setting in new construction or major renovation. This guide provides the following information: (1) A step-by-step outline of the benchmarking process. (2) A set of performance metrics for the whole building as well as individual systems. For each metric, the guide provides a definition, performance benchmarks, and potential actions that can be inferred from evaluating this metric. (3) A list and descriptions of the data required for computing the metrics. This guide is complemented by spreadsheet templates for data collection and for computing the benchmarking metrics. This guide builds on prior research supported by the national Laboratories for the 21st Century (Labs21) program, supported by the U.S. Department of Energy and the U.S. Environmental Protection Agency. Much of the benchmarking data are drawn from the Labs21 benchmarking database and technical guides. Additional benchmark data were obtained from engineering experts including laboratory designers and energy managers.

  9. Self-benchmarking Guide for Data Centers: Metrics, Benchmarks, Actions

    SciTech Connect (OSTI)

    Mathew, Paul; Ganguly, Srirupa; Greenberg, Steve; Sartor, Dale

    2009-07-13

    This guide describes energy efficiency metrics and benchmarks that can be used to track the performance of and identify potential opportunities to reduce energy use in data centers. This guide is primarily intended for personnel who have responsibility for managing energy use in existing data centers - including facilities managers, energy managers, and their engineering consultants. Additionally, data center designers may also use the metrics and benchmarks described in this guide for goal-setting in new construction or major renovation. This guide provides the following information: (1) A step-by-step outline of the benchmarking process. (2) A set of performance metrics for the whole building as well as individual systems. For each metric, the guide provides a definition, performance benchmarks, and potential actions that can be inferred from evaluating this metric. (3) A list and descriptions of the data required for computing the metrics. This guide is complemented by spreadsheet templates for data collection and for computing the benchmarking metrics. This guide builds on prior data center benchmarking studies supported by the California Energy Commission. Much of the benchmarking data are drawn from the LBNL data center benchmarking database that was developed from these studies. Additional benchmark data were obtained from engineering experts including facility designers and energy managers. This guide also builds on recent research supported by the U.S. Department of Energy's Save Energy Now program.
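    As a concrete example of a whole-facility data center metric of the kind such guides track, the sketch below computes Power Usage Effectiveness (PUE), total facility energy divided by IT equipment energy. Whether this guide uses exactly this formulation is not stated in the record above, and the numbers are illustrative.

        def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
            """Power Usage Effectiveness: total facility energy over IT equipment energy."""
            if it_equipment_kwh <= 0:
                raise ValueError("IT equipment energy must be positive")
            return total_facility_kwh / it_equipment_kwh

        # Hypothetical annual figures for a small data center
        print(f"PUE = {pue(total_facility_kwh=4_500_000, it_equipment_kwh=3_000_000):.2f}")  # 1.50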

  10. Measuring solar reflectance Part I: Defining a metric that accurately...

    Office of Scientific and Technical Information (OSTI)

    A widely used solar reflectance metric based on the ASTM Standard E891 beam-normal solar spectral irradiance underestimates the solar heat gain of a spectrally selective 'cool ...

  11. Microsoft Word - followup to Fin Risk Metrics workshop.doc

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    March 21, 2008 Purpose/Subject: Follow-up to Financial Risk Metrics Workshop Differences in Cash Flow between Net Billing and Direct Pay for Energy Northwest Attached...

  12. Analysis of Solar Cell Quality Using Voltage Metrics: Preprint

    SciTech Connect (OSTI)

    Toberer, E. S.; Tamboli, A. C.; Steiner, M.; Kurtz, S.

    2012-06-01

    The highest efficiency solar cells provide both excellent voltage and current. Of these, the open-circuit voltage (Voc) is more frequently viewed as an indicator of the material quality. However, since the Voc also depends on the band gap of the material, the difference between the band gap and the Voc is a better metric for comparing material quality of unlike materials. To take this one step further, since Voc also depends on the shape of the absorption edge, we propose to use the ultimate metric: the difference between the measured Voc and the Voc calculated from the external quantum efficiency using a detailed balance approach. This metric is less sensitive to changes in cell design and definition of band gap. The paper defines how to implement this metric and demonstrates how it can be useful in tracking improvements in Voc, especially as Voc approaches its theoretical maximum.
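    A minimal sketch of the simpler of the voltage metrics discussed above, the band-gap-voltage offset Eg/q - Voc. The paper's preferred metric (measured Voc minus a detailed-balance Voc computed from the external quantum efficiency) additionally requires integrating the EQE against solar and blackbody spectra and is not reproduced here; the numbers below are illustrative, not taken from the paper.

        def bandgap_voltage_offset(eg_ev: float, voc_v: float) -> float:
            """Return Eg/q - Voc in volts (with Eg in eV, Eg/q is numerically equal to Eg)."""
            return eg_ev - voc_v

        # Hypothetical GaAs-like cell: Eg ~ 1.42 eV, measured Voc ~ 1.05 V
        print(f"Woc = {bandgap_voltage_offset(eg_ev=1.42, voc_v=1.05):.2f} V")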

  13. Towards Efficient Supercomputing: Searching for the Right Efficiency Metric

    SciTech Connect (OSTI)

    Hsu, Chung-Hsing; Kuehn, Jeffery A; Poole, Stephen W

    2012-01-01

    The efficiency of supercomputing has traditionally been measured by execution time. In the early 2000s, the concept of total cost of ownership was re-introduced, broadening the efficiency measure to include aspects such as energy and space. Yet the supercomputing community has never agreed upon a metric that can cover these aspects altogether and also provide a fair basis for comparison. This paper examines the metrics that have been proposed in the past decade, and proposes a vector-valued metric for efficient supercomputing. Using this metric, the paper presents a study of where the supercomputing industry has been and how it stands today with respect to efficient supercomputing.
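    A sketch of what a vector-valued efficiency record could look like, compared component-wise rather than collapsed to a single score. The component list (time, energy, floor space) and the Pareto-dominance comparison are assumptions for illustration, not the paper's definition.

        from dataclasses import dataclass

        @dataclass(frozen=True)
        class EfficiencyVector:
            """One run's efficiency record, kept as a vector rather than a single score."""
            time_to_solution_s: float   # execution time
            energy_kwh: float           # energy consumed by the run
            floor_space_m2: float       # machine footprint

        def dominates(a: EfficiencyVector, b: EfficiencyVector) -> bool:
            """True if a is no worse than b in every component (Pareto dominance)."""
            return (a.time_to_solution_s <= b.time_to_solution_s
                    and a.energy_kwh <= b.energy_kwh
                    and a.floor_space_m2 <= b.floor_space_m2)

        run_a = EfficiencyVector(time_to_solution_s=3600, energy_kwh=8000, floor_space_m2=450)
        run_b = EfficiencyVector(time_to_solution_s=4200, energy_kwh=9500, floor_space_m2=450)
        print(dominates(run_a, run_b))  # True: run_a is at least as efficient in every dimension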

  14. Resilient Control Systems Practical Metrics Basis for Defining Mission Impact

    SciTech Connect (OSTI)

    Craig G. Rieger

    2014-08-01

    "Resilience” describes how systems operate at an acceptable level of normalcy despite disturbances or threats. In this paper we first consider the cognitive, cyber-physical interdependencies inherent in critical infrastructure systems and how resilience differs from reliability to mitigate these risks. Terminology and metrics basis are provided to integrate the cognitive, cyber-physical aspects that should be considered when defining solutions for resilience. A practical approach is taken to roll this metrics basis up to system integrity and business case metrics that establish “proper operation” and “impact.” A notional chemical processing plant is the use case for demonstrating how the system integrity metrics can be applied to establish performance, and

  15. A Graph Analytic Metric for Mitigating Advanced Persistent Threat

    SciTech Connect (OSTI)

    Johnson, John R.; Hogan, Emilie A.

    2013-06-04

    This paper introduces a novel graph analytic metric that can be used to measure the potential vulnerability of a cyber network to specific types of attacks that use lateral movement and privilege escalation, such as the well-known Pass the Hash (PTH). The metric is computed from an oriented subgraph of the underlying cyber network induced by selecting only those edges for which a given property holds between the two vertices of the edge. The metric with respect to a select node on the subgraph is defined as the likelihood that the select node is reachable from another arbitrary node in the graph. This metric can be calculated dynamically from the authorization and auditing layers during the network security authorization phase and will potentially enable predictive deterrence against attacks such as PTH.
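    Illustrative sketch of the reachability idea behind the metric: for a chosen target node, the fraction of other nodes from which the target can be reached along directed edges. The toy host graph and edge semantics are assumptions; the paper's exact likelihood definition is not reproduced in this record.

        from collections import deque

        def reach_fraction(edges, target, nodes):
            """Fraction of nodes (excluding target) from which `target` is reachable."""
            # Walk the reversed graph outward from the target.
            rev = {n: set() for n in nodes}
            for u, v in edges:          # edge (u, v): u can move laterally to v
                rev[v].add(u)
            seen, queue = {target}, deque([target])
            while queue:
                for u in rev[queue.popleft()]:
                    if u not in seen:
                        seen.add(u)
                        queue.append(u)
            others = len(nodes) - 1
            return (len(seen) - 1) / others if others else 0.0

        nodes = ["ws1", "ws2", "srv", "dc"]          # hypothetical hosts
        edges = [("ws1", "srv"), ("ws2", "srv"), ("srv", "dc")]
        print(reach_fraction(edges, "dc", nodes))    # 1.0: every other host can reach the domain controller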

  16. ARM - Evaluation Product - Barrow Radiation Data (2009 metric)

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Evaluation Product: Barrow Radiation Data (2009 metric). Observations from a suite of radiometers including Precision Spectral Pyranometers (PSPs), Precision Infrared Radiometers (PIRs), and a Normal Incidence Pyrheliometer (NIP) are

  17. New IEC Specifications Help Define Wind Plant Performance Reporting Metrics

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    January 6, 2014 - This is an excerpt from the Fourth Quarter 2013 edition of the Wind Program R&D Newsletter. The U.S. Department of Energy Wind Program and Sandia National Laboratories have been working with the International Electrotechnical Commission (IEC) Committee on wind turbine availability to

  18. Weatherization Assistance Program Goals and Metrics | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    The U.S. Department of Energy (DOE) Weatherization Assistance Program (WAP) regularly reviews the work of states and grant recipients for effectiveness and for meeting program goals. DOE's Oak Ridge National Laboratory provides technical support to the program and conducts the evaluations. Goals: The overall goal of WAP is to reduce the burden of energy prices on the

  19. Conceptual Framework for Developing Resilience Metrics for the Electricity, Oil, and Gas Sectors in the United States (September 2015) | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    This report has been written for the Department of Energy's Office of Electricity Delivery and Energy Reliability to support the Office of

  20. Enclosure - FY 2015 Q4 Metrics Report 2015-11-02.xlsx

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Fourth Quarter Overall Root Cause Analysis (RCA)/Corrective Action Plan (CAP) Performance Metrics No. Contract/Project Management Performance Metrics FY 2015 Target Comment No. 2 3 ...

  1. Microsoft Word - 2014-5-27 RCA Qtr 2 Metrics Attachment_R1

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Second Quarter Overall Root Cause Analysis (RCA)/Corrective Action Plan (CAP) Performance Metrics 1 Contract/Project Management Performance Metric FY 2014 Target FY 2014 Projected ...

  2. Metrics Evolution in an Energy Research & Development Program

    SciTech Connect (OSTI)

    Brent Dixon

    2011-08-01

    All technology programs progress through three phases: Discovery, Definition, and Deployment. The form and application of program metrics needs to evolve with each phase. During the discovery phase, the program determines what is achievable. A set of tools is needed to define program goals, to analyze credible technical options, and to ensure that the options are compatible and meet the program objectives. A metrics system that scores the potential performance of technical options is part of this system of tools, supporting screening of concepts and aiding in the overall definition of objectives. During the definition phase, the program defines what specifically is wanted. What is achievable is translated into specific systems and specific technical options are selected and optimized. A metrics system can help with the identification of options for optimization and the selection of the option for deployment. During the deployment phase, the program shows that the selected system works. Demonstration projects are established and classical systems engineering is employed. During this phase, the metrics communicate system performance. This paper discusses an approach to metrics evolution within the Department of Energy's Nuclear Fuel Cycle R&D Program, which is working to improve the sustainability of nuclear energy.

  3. Metrics for Evaluating the Accuracy of Solar Power Forecasting: Preprint

    SciTech Connect (OSTI)

    Zhang, J.; Hodge, B. M.; Florita, A.; Lu, S.; Hamann, H. F.; Banunarayanan, V.

    2013-10-01

    Forecasting solar energy generation is a challenging task due to the variety of solar power systems and weather regimes encountered. Forecast inaccuracies can result in substantial economic losses and power system reliability issues. This paper presents a suite of generally applicable and value-based metrics for solar forecasting for a comprehensive set of scenarios (i.e., different time horizons, geographic locations, applications, etc.). In addition, a comprehensive framework is developed to analyze the sensitivity of the proposed metrics to three types of solar forecasting improvements using a design of experiments methodology, in conjunction with response surface and sensitivity analysis methods. The results show that the developed metrics can efficiently evaluate the quality of solar forecasts, and assess the economic and reliability impact of improved solar forecasting.
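    For illustration, the snippet below computes a few standard forecast-error statistics (RMSE, MAE, mean bias) of the kind such metric suites build on. The paper's full value-based suite and its sensitivity framework are not reproduced here, and the sample numbers are made up.

        import numpy as np

        def forecast_errors(forecast_mw, actual_mw):
            """Return common forecast-error statistics for paired forecast/actual series."""
            f, a = np.asarray(forecast_mw, float), np.asarray(actual_mw, float)
            err = f - a
            return {
                "rmse_mw": float(np.sqrt(np.mean(err ** 2))),
                "mae_mw": float(np.mean(np.abs(err))),
                "bias_mw": float(np.mean(err)),
            }

        print(forecast_errors([10.2, 12.8, 9.5, 0.0], [11.0, 12.0, 10.5, 0.0]))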

  4. Non-minimal derivative couplings of the composite metric

    SciTech Connect (OSTI)

    Heisenberg, Lavinia

    2015-11-04

    In the context of massive gravity, bi-gravity and multi-gravity non-minimal matter couplings via a specific composite effective metric were investigated recently. Even if these couplings generically reintroduce the Boulware-Deser ghost, this composite metric is unique in the sense that the ghost reemerges only beyond the decoupling limit and the matter quantum loop corrections do not detune the potential interactions. We consider non-minimal derivative couplings of the composite metric to matter fields for a specific subclass of Horndeski scalar-tensor interactions. We first explore these couplings in the mini-superspace and investigate in which scenario the ghost remains absent. We further study these non-minimal derivative couplings in the decoupling-limit of the theory and show that the equation of motion for the helicity-0 mode remains second order in derivatives. Finally, we discuss preliminary implications for cosmology.
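    For orientation, the composite effective metric referred to above is usually written in this literature as follows; the abstract itself does not display it, so this is quoted from the broader massive-gravity literature rather than from the paper.

        % Composite effective metric built from the two metrics g and f,
        % with constants \alpha, \beta and X the square-root matrix:
        g^{\mathrm{eff}}_{\mu\nu}
          = \alpha^{2} g_{\mu\nu}
          + 2\alpha\beta\, g_{\mu\rho} X^{\rho}{}_{\nu}
          + \beta^{2} f_{\mu\nu},
        \qquad
        X^{\mu}{}_{\nu} = \bigl(\sqrt{g^{-1} f}\,\bigr)^{\mu}{}_{\nu}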

  5. ARM - Evaluation Product - AERI Data Quality Metric (AERI-QC)

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Evaluation Product: AERI Data Quality Metric (AERI-QC). Ancillary NetCDF file to be used with the regular AERI data files to document times when the data may not be correct. Contact: David Turner, National Oceanic and

  6. Calabi-Yau metrics for quotients and complete intersections

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Braun, Volker; Brelidze, Tamaz; Douglas, Michael R.; Ovrut, Burt A.

    2008-05-22

    We extend previous computations of Calabi-Yau metrics on projective hypersurfaces to free quotients, complete intersections, and free quotients of complete intersections. In particular, we construct these metrics on generic quintics, four-generation quotients of the quintic, Schoen Calabi-Yau complete intersections and the quotient of a Schoen manifold with Z₃ x Z₃ fundamental group that was previously used to construct a heterotic standard model. Various numerical investigations into the dependence of Donaldson's algorithm on the integration scheme, as well as on the Kähler and complex structure moduli, are also performed.

  7. Primer Control System Cyber Security Framework and Technical Metrics

    SciTech Connect (OSTI)

    Wayne F. Boyer; Miles A. McQueen

    2008-05-01

    The Department of Homeland Security National Cyber Security Division supported development of a control system cyber security framework and a set of technical metrics to aid owner-operators in tracking control systems security. The framework defines seven relevant cyber security dimensions and provides the foundation for thinking about control system security. Based on the developed security framework, a set of ten technical metrics is recommended that allows control systems owner-operators to track improvements or degradations in their individual control systems security posture.

  8. Culture, and a Metrics Methodology for Biological Countermeasure Scenarios

    SciTech Connect (OSTI)

    Simpson, Mary J.

    2007-03-15

    Outcome Metrics Methodology defines a way to evaluate outcome metrics associated with scenario analyses related to biological countermeasures. Previous work developed a schema to allow evaluation of common elements of impacts across a wide range of potential threats and scenarios. Classes of metrics were identified that could be used by decision makers to differentiate the common bases among disparate scenarios. Typical impact metrics used in risk calculations include the anticipated number of deaths, casualties, and the direct economic costs should a given event occur. There are less obvious metrics that are often as important and require more intensive initial work to be incorporated. This study defines a methodology for quantifying, evaluating, and ranking metrics other than direct health and economic impacts. As has been observed with the consequences of Hurricane Katrina, impacts to the culture of specific sectors of society are less obvious on an immediate basis but equally important over the ensuing and long term. Culture is used as the example class of metrics within which
      • requirements for a methodology are explored,
      • likely methodologies are examined,
      • underlying assumptions for the respective methodologies are discussed, and
      • the basis for recommending a specific methodology is demonstrated.
    Culture, as a class of metrics, is shown to consist of political, sociological, and psychological elements that are highly valued by decision makers. In addition, cultural practices, dimensions, and kinds of knowledge offer complementary sets of information that contribute to the context within which experts can provide input. The quantification and evaluation of sociopolitical, socio-economic, and sociotechnical impacts depend predominantly on subjective, expert judgment. Epidemiological data is limited, resulting in samples with statistical limits. Dose response assessments and curves depend on the quality of data and its relevance to human modes of exposure

  9. Deep Energy Retrofit Performance Metric Comparison: Eight California Case Studies

    SciTech Connect (OSTI)

    Walker, Iain; Fisher, Jeremy; Less, Brennan

    2014-06-01

    In this paper we will present the results of monitored annual energy use data from eight residential Deep Energy Retrofit (DER) case studies using a variety of performance metrics. For each home, the details of the retrofits were analyzed, diagnostic tests to characterize the home were performed and the homes were monitored for total and individual end-use energy consumption for approximately one year. Annual performance in site and source energy, as well as carbon dioxide equivalent (CO2e) emissions were determined on a per house, per person and per square foot basis to examine the sensitivity to these different metrics. All eight DERs showed consistent success in achieving substantial site energy and CO2e reductions, but some projects achieved very little, if any source energy reduction. This problem emerged in those homes that switched from natural gas to electricity for heating and hot water, resulting in energy consumption dominated by electricity use. This demonstrates the crucial importance of selecting an appropriate metric to be used in guiding retrofit decisions. Also, due to the dynamic nature of DERs, with changes in occupancy, size, layout, and comfort, several performance metrics might be necessary to understand a project’s success.
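    A minimal sketch of the kind of normalization described above: annual site energy converted to source energy and CO2e and reported per square foot and per person. The conversion factors and inputs below are placeholder assumptions, not the values used in the study.

        SOURCE_FACTOR = {"electricity_kwh": 3.15, "gas_therm": 1.09}   # site-to-source multipliers (assumed)
        CO2E_KG = {"electricity_kwh": 0.45, "gas_therm": 5.3}          # emission factors, kg CO2e per unit (assumed)
        KBTU = {"electricity_kwh": 3.412, "gas_therm": 100.0}          # site kBtu per unit

        def der_metrics(elec_kwh, gas_therms, occupants, floor_area_ft2):
            """Annual site/source energy intensity and per-person CO2e for one home."""
            site = elec_kwh * KBTU["electricity_kwh"] + gas_therms * KBTU["gas_therm"]
            source = (elec_kwh * KBTU["electricity_kwh"] * SOURCE_FACTOR["electricity_kwh"]
                      + gas_therms * KBTU["gas_therm"] * SOURCE_FACTOR["gas_therm"])
            co2e_kg = elec_kwh * CO2E_KG["electricity_kwh"] + gas_therms * CO2E_KG["gas_therm"]
            return {
                "site_kbtu_per_ft2": site / floor_area_ft2,
                "source_kbtu_per_ft2": source / floor_area_ft2,
                "co2e_kg_per_person": co2e_kg / occupants,
            }

        print(der_metrics(elec_kwh=6500, gas_therms=150, occupants=3, floor_area_ft2=1800))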

  10. Metrics and Benchmarks for Energy Efficiency in Laboratories

    SciTech Connect (OSTI)

    Mathew, Paul

    2007-10-26

    A wide spectrum of laboratory owners, ranging from universities to federal agencies, have explicit goals for energy efficiency in their facilities. For example, the Energy Policy Act of 2005 (EPACT 2005) requires all new federal buildings to exceed ASHRAE 90.1-2004 by at least 30 percent. The University of California Regents Policy requires all new construction to exceed California Title 24 by at least 20 percent. A new laboratory is much more likely to meet energy efficiency goals if quantitative metrics and targets are explicitly specified in programming documents and tracked during the course of the delivery process. If efficiency targets are not explicitly and properly defined, any additional capital costs or design time associated with attaining higher efficiencies can be difficult to justify. The purpose of this guide is to provide guidance on how to specify and compute energy efficiency metrics and benchmarks for laboratories, at the whole building as well as the system level. The information in this guide can be used to incorporate quantitative metrics and targets into the programming of new laboratory facilities. Many of these metrics can also be applied to evaluate existing facilities. For information on strategies and technologies to achieve energy efficiency, the reader is referred to Labs21 resources, including technology best practice guides, case studies, and the design guide (available at www.labs21century.gov/toolkit).

  11. EERE Portfolio. Primary Benefits Metrics for FY09

    SciTech Connect (OSTI)

    none,

    2011-11-01

    This collection of data tables shows the benefits metrics related to energy security, environmental impacts, and economic impacts for both the entire EERE portfolio of renewable energy technologies as well as the individual technologies. Data are presented for the years 2015, 2020, 2030, and 2050, for both the NEMS and MARKAL models.

  12. On the existence of certain axisymmetric interior metrics

    SciTech Connect (OSTI)

    Angulo Santacruz, C.; Batic, D.; Nowakowski, M.

    2010-08-15

    One of the effects of noncommutative coordinate operators is that the delta function connected to the quantum mechanical amplitude between states sharp to the position operator gets smeared by a Gaussian distribution. Although this is not the full account of the effects of noncommutativity, this effect is, in particular, important as it removes the point singularities of Schwarzschild and Reissner-Nordstroem solutions. In this context, it seems to be of some importance to probe also into ringlike singularities which appear in the Kerr case. In particular, starting with an anisotropic energy-momentum tensor and a general axisymmetric ansatz of the metric together with an arbitrary mass distribution (e.g., Gaussian), we derive the full set of Einstein equations that the noncommutative geometry inspired Kerr solution should satisfy. Using these equations we prove two theorems regarding the existence of certain Kerr metrics inspired by noncommutative geometry.

  13. Microsoft Word - 2014-1-1 RCA Qtr 1 Metrics Attachment_R1

    Energy Savers [EERE]

    Contract/Project Management Performance Metric FY 2014 Target FY 2014 Projected FY 2014 ... Contract/Project Management Performance Metrics FY 2014 Target FY 2014 1st Qtr Actual ...

  14. Modified Anti-de-Sitter Metric, Light-Front Quantized QCD, and...

    Office of Scientific and Technical Information (OSTI)

    Title: Modified Anti-de-Sitter Metric, Light-Front Quantized QCD, and Conformal Quantum Mechanics

  15. DOE to Remove 200 Metric Tons of Highly Enriched Uranium from...

    Energy Savers [EERE]

    DOE to Remove 200 Metric Tons of Highly Enriched Uranium from U.S. Nuclear Weapons Stockpile ...

  16. Optimal recovery of linear operators in non-Euclidean metrics

    SciTech Connect (OSTI)

    Osipenko, K Yu

    2014-10-31

    The paper looks at problems concerning the recovery of operators from noisy information in non-Euclidean metrics. A number of general theorems are proved and applied to recovery problems for functions and their derivatives from the noisy Fourier transform. In some cases, a family of optimal methods is found, from which the methods requiring the least amount of original information are singled out. Bibliography: 25 titles.

  17. Development of Technology Readiness Level (TRL) Metrics and Risk Measures

    SciTech Connect (OSTI)

    Engel, David W.; Dalton, Angela C.; Anderson, K. K.; Sivaramakrishnan, Chandrika; Lansing, Carina

    2012-10-01

    This is an internal project milestone report to document the CCSI Element 7 team's progress on developing Technology Readiness Level (TRL) metrics and risk measures. In this report, we provide a brief overview of the current technology readiness assessment research, document the development of technology readiness levels (TRLs) specific to carbon capture technologies, describe the risk measures and uncertainty quantification approaches used in our research, and conclude by discussing the next steps that the CCSI Task 7 team aims to accomplish.

  18. Microsoft Word - DOE_ANNUAL_METRICS_2009Q3.docx

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    14404 Third Quarter 2009 Modeling Program Metric: Coupled model comparison with observations using improved dynamics at coarse resolution Quantifying the impact of a finite volume dynamical core in CCSM3 on simulated precipitation over major catchment areas July 2009 Peter J. Gleckler and Karl E. Taylor Lawrence Livermore National Laboratory Livermore, CA Work supported by the U.S. Department of Energy, Office of Science, Office of Biological and Environmental Research

  19. Guidebook for ARRA Smart Grid Program Metrics and Benefits | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    The Guidebook for American Recovery and Reinvestment Act (ARRA) Smart Grid Program Metrics and Benefits describes the type of information to be collected from each of the Project Teams and how it will be used by the Department of Energy to communicate overall conclusions to the public. Guidebook for ARRA Smart Grid Program Metrics and Benefits (975.03 KB)

  20. Derivation of a Levelized Cost of Coating (LCOC) metric for evaluation of solar selective absorber materials

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Ho, C. K.; Pacheco, J. E.

    2015-06-05

    A new metric, the Levelized Cost of Coating (LCOC), is derived in this paper to evaluate and compare alternative solar selective absorber coatings against a baseline coating (Pyromark 2500). In contrast to previous metrics that focused only on the optical performance of the coating, the LCOC includes costs, durability, and optical performance for more comprehensive comparisons among candidate materials. The LCOC is defined as the annualized marginal cost of the coating to produce a baseline annual thermal energy production. Costs include the cost of materials and labor for initial application and reapplication of the coating, as well as the cost of additional or fewer heliostats to yield the same annual thermal energy production as the baseline coating. Results show that important factors impacting the LCOC include the initial solar absorptance, thermal emittance, reapplication interval, degradation rate, reapplication cost, and downtime during reapplication. The LCOC can also be used to determine the optimal reapplication interval to minimize the levelized cost of energy production. As a result, similar methods can be applied more generally to determine the levelized cost of component for other applications and systems.
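    A simplified sketch in the spirit of the LCOC described above: annualize the initial coating application with a capital recovery factor, then add periodic reapplication and the heliostat-field adjustment cost. The cost terms and numbers are illustrative assumptions; the paper's exact formula is not reproduced in this record.

        def annualized_coating_cost(apply_cost_usd, reapply_cost_usd, reapply_interval_yr,
                                    extra_heliostat_cost_usd_per_yr, discount_rate=0.07, life_yr=30):
            """Rough annualized coating cost: initial application (annualized) + reapplication + field adjustment."""
            # Capital recovery factor annualizes the one-time initial application.
            crf = discount_rate * (1 + discount_rate) ** life_yr / ((1 + discount_rate) ** life_yr - 1)
            annual_initial = apply_cost_usd * crf
            annual_reapply = reapply_cost_usd / reapply_interval_yr
            return annual_initial + annual_reapply + extra_heliostat_cost_usd_per_yr

        cost = annualized_coating_cost(apply_cost_usd=500_000, reapply_cost_usd=300_000,
                                       reapply_interval_yr=5, extra_heliostat_cost_usd_per_yr=40_000)
        print(f"annualized coating cost: ${cost:,.0f}/yr")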

  1. Derivation of a Levelized Cost of Coating (LCOC) metric for evaluation of solar selective absorber materials

    SciTech Connect (OSTI)

    Ho, C. K.; Pacheco, J. E.

    2015-06-05

    A new metric, the Levelized Cost of Coating (LCOC), is derived in this paper to evaluate and compare alternative solar selective absorber coatings against a baseline coating (Pyromark 2500). In contrast to previous metrics that focused only on the optical performance of the coating, the LCOC includes costs, durability, and optical performance for more comprehensive comparisons among candidate materials. The LCOC is defined as the annualized marginal cost of the coating to produce a baseline annual thermal energy production. Costs include the cost of materials and labor for initial application and reapplication of the coating, as well as the cost of additional or fewer heliostats to yield the same annual thermal energy production as the baseline coating. Results show that important factors impacting the LCOC include the initial solar absorptance, thermal emittance, reapplication interval, degradation rate, reapplication cost, and downtime during reapplication. The LCOC can also be used to determine the optimal reapplication interval to minimize the levelized cost of energy production. As a result, similar methods can be applied more generally to determine the levelized cost of component for other applications and systems.

  2. Evaluation of metrics and baselines for tracking greenhouse gas emissions trends: Recommendations for the California climate action registry

    SciTech Connect (OSTI)

    Price, Lynn; Murtishaw, Scott; Worrell, Ernst

    2003-06-01

    Energy Commission (Energy Commission) related to the Registry in three areas: (1) assessing the availability and usefulness of industry-specific metrics, (2) evaluating various methods for establishing baselines for calculating GHG emissions reductions related to specific actions taken by Registry participants, and (3) establishing methods for calculating electricity CO2 emission factors. The third area of research was completed in 2002 and is documented in Estimating Carbon Dioxide Emissions Factors for the California Electric Power Sector (Marnay et al., 2002). This report documents our findings related to the first areas of research. For the first area of research, the overall objective was to evaluate the metrics, such as emissions per economic unit or emissions per unit of production, that can be used to report GHG emissions trends for potential Registry participants. This research began with an effort to identify methodologies, benchmarking programs, inventories, protocols, and registries that use industry-specific metrics to track trends in energy use or GHG emissions in order to determine what types of metrics have already been developed. The next step in developing industry-specific metrics was to assess the availability of data needed to determine metric development priorities. Berkeley Lab also determined the relative importance of different potential Registry participant categories in order to assess the availability of sectoral or industry-specific metrics and then identified industry-specific metrics in use around the world. While a plethora of metrics was identified, no one metric that adequately tracks trends in GHG emissions while maintaining confidentiality of data was identified. As a result of this review, Berkeley Lab recommends the development of a GHG intensity index as a new metric for reporting and tracking GHG emissions trends. Such an index could provide an industry-specific metric for reporting and tracking GHG emissions trends to accurately

  3. Metrics for the National SCADA Test Bed Program

    SciTech Connect (OSTI)

    Craig, Philip A.; Mortensen, J.; Dagle, Jeffery E.

    2008-12-05

    The U.S. Department of Energy Office of Electricity Delivery and Energy Reliability (DOE-OE) National SCADA Test Bed (NSTB) Program is providing valuable inputs into the electric industry by performing topical research and development (R&D) to secure next generation and legacy control systems. In addition, the program conducts vulnerability and risk analysis, develops tools, and performs industry liaison, outreach and awareness activities. These activities will enhance the secure and reliable delivery of energy for the United States. This report will describe metrics that could be utilized to provide feedback to help enhance the effectiveness of the NSTB Program.

  4. User's Guide to the Energy Charting and Metrics Tool (ECAM)

    SciTech Connect (OSTI)

    Taasevigen, Danny J.; Koran, William

    2012-02-28

    The intent of this user guide is to provide a brief description of the functionality of the Energy Charting and Metrics (ECAM) tool, including the expanded building re-tuning functionality developed for Pacific Northwest National Laboratory (PNNL). This document describes the tool's general functions and features, and offers detailed instructions for PNNL building re-tuning charts, a feature in ECAM intended to help building owners and operators look at trend data (recommended 15-minute time intervals) in a series of charts (both time series and scatter) to analyze air-handler, zone, and central plant information gathered from a building automation system (BAS).

  5. SU-E-I-71: Quality Assessment of Surrogate Metrics in Multi-Atlas-Based Image Segmentation

    SciTech Connect (OSTI)

    Zhao, T; Ruan, D

    2015-06-15

    Purpose: With the ever-growing data of heterogeneous quality, relevance assessment of atlases becomes increasingly critical for multi-atlas-based image segmentation. However, there is no universally recognized best relevance metric and even a standard to compare amongst candidates remains elusive. This study, for the first time, designs a quantification to assess relevance metrics’ quality, based on a novel perspective of the metric as surrogate for inferring the inaccessible oracle geometric agreement. Methods: We first develop an inference model to relate surrogate metrics in image space to the underlying oracle relevance metric in segmentation label space, with a monotonically non-decreasing function subject to random perturbations. Subsequently, we investigate model parameters to reveal key contributing factors to surrogates’ ability in prognosticating the oracle relevance value, for the specific task of atlas selection. Finally, we design an effective contrast-to-noise ratio (eCNR) to quantify surrogates’ quality based on insights from these analyses and empirical observations. Results: The inference model was specialized to a linear function with normally distributed perturbations, with surrogate metric exemplified by several widely-used image similarity metrics, i.e., MSD/NCC/(N)MI. Surrogates’ behaviors in selecting the most relevant atlases were assessed under varying eCNR, showing that surrogates with high eCNR dominated those with low eCNR in retaining the most relevant atlases. In an end-to-end validation, NCC/(N)MI with eCNR of 0.12 compared to MSD with eCNR of 0.10 resulted in statistically better segmentation with mean DSC of about 0.85 and the first and third quartiles of (0.83, 0.89), compared to MSD with mean DSC of 0.84 and the first and third quartiles of (0.81, 0.89). Conclusion: The designed eCNR is capable of characterizing surrogate metrics’ quality in prognosticating the oracle relevance value. It has been demonstrated to be

  6. Conceptual Soundness, Metric Development, Benchmarking, and Targeting for PATH Subprogram Evaluation

    SciTech Connect (OSTI)

    Mosey, G.; Doris, E.; Coggeshall, C.; Antes, M.; Ruch, J.; Mortensen, J.

    2009-01-01

    The objective of this study is to evaluate the conceptual soundness of the U.S. Department of Housing and Urban Development (HUD) Partnership for Advancing Technology in Housing (PATH) program's revised goals and establish and apply a framework to identify and recommend metrics that are the most useful for measuring PATH's progress. This report provides an evaluative review of PATH's revised goals, outlines a structured method for identifying and selecting metrics, proposes metrics and benchmarks for a sampling of individual PATH programs, and discusses other metrics that potentially could be developed that may add value to the evaluation process. The framework and individual program metrics can be used for ongoing management improvement efforts and to inform broader program-level metrics for government reporting requirements.

  7. DOE Will Dispose of 34 Metric Tons of Plutonium by Turning it into Fuel for Civilian Reactors | National Nuclear Security Administration (NNSA)

    National Nuclear Security Administration (NNSA)

    Washington, DC - Secretary Abraham announced that DOE will dispose of 34 metric tons of surplus weapons grade plutonium by turning the material into mixed oxide fuel (MOX) for use in nuclear reactors. The decision follows an exhaustive Administration review

  8. EECBG 10-07C/SEP 10-006B Attachment 1: Process Metrics List | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    EECBG 10-07C/SEP 10-006B Attachment 1: Process Metrics List (eecbg_sep_reporting_guidance_attachment_06242011.pdf, 56.65 KB). More Documents & Publications: EECBG SEP Attachment 1 - Process Metric List; EECBG Program Notice 10-07A; DOE Recovery Act Reporting Requirements for the State Energy Program

  9. CEM_Metrics_and_Technical_Note_7_14_10.pdf | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    CEM_Metrics_and_Technical_Note_7_14_10.pdf (129.47 KB). More Documents & Publications: SEAD-Fact-Sheet.pdf; Heat Pump Clothes Dryer; Wind Vision: A New Era for Wind Power in the United States

  10. Variable-metric diffraction crystals for x-ray optics

    SciTech Connect (OSTI)

    Smither, R.K.; Fernandez, P.B.

    1992-02-01

    A variable-metric (VM) crystal is one in which the spacing between the crystalline planes changes with position in the crystal. This variation can be either parallel to the crystalline planes or perpendicular to the crystalline planes of interest and can be produced by either introducing a thermal gradient in the crystal or by growing a crystal made of two or more elements and changing the relative percentages of the two elements as the crystal is grown. A series of experiments were performed in the laboratory to demonstrate the principle of the variable-metric crystal and its potential use in synchrotron beam lines. One of the most useful applications of the VM crystal is to increase the number of photons per unit bandwidth in a diffracted beam without losing any of the overall intensity. In a normal synchrotron beam line that uses a two-crystal monochromator, the bandwidth of the diffracted photon beam is determined by the vertical opening angle of the beam which is typically 0.10--0.30 mrad or 20--60 arcsec. When the VM crystal approach is applied, the bandwidth of the beam can be made as narrow as the rocking curve of the diffracting crystal, which is typically 0.005--0.050 mrad or 1--10 arcsec. Thus a very large increase of photons per unit bandwidth (or per unit energy) can be achieved through the use of VM crystals. When the VM principle is used with bent crystals, new kinds of x-ray optical elements can be generated that can focus and defocus x-ray beams much like simple lenses where the focal length of the lens can be changed to match its application. Thus both large magnifications and large demagnifications can be achieved as well as parallel beams with narrow bandwidths.
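    The bandwidth argument above follows from the Bragg condition; the standard relation and its differential form are quoted below for orientation, not taken verbatim from the article.

        % Bragg condition and its logarithmic differential:
        n\lambda = 2 d \sin\theta,
        \qquad
        \frac{\Delta\lambda}{\lambda} = \frac{\Delta d}{d} + \cot\theta\,\Delta\theta
        % A controlled gradient in d across the crystal (thermal or compositional)
        % can offset the \cot\theta\,\Delta\theta term set by the beam's vertical
        % opening angle, narrowing the diffracted bandwidth.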

  11. Metrics for Assessment of Smart Grid Data Integrity Attacks

    SciTech Connect (OSTI)

    Annarita Giani; Miles McQueen; Russell Bent; Kameshwar Poolla; Mark Hinrichs

    2012-07-01

    There is an emerging consensus that the nation’s electricity grid is vulnerable to cyber attacks. This vulnerability arises from the increasing reliance on using remote measurements, transmitting them over legacy data networks to system operators who make critical decisions based on available data. Data integrity attacks are a class of cyber attacks that involve a compromise of information that is processed by the grid operator. This information can include meter readings of injected power at remote generators, power flows on transmission lines, and relay states. These data integrity attacks have consequences only when the system operator responds to compromised data by redispatching generation under normal or contingency protocols. These consequences include (a) financial losses from sub-optimal economic dispatch to service loads, (b) robustness/resiliency losses from placing the grid at operating points that are at greater risk from contingencies, and (c) systemic losses resulting from cascading failures induced by poor operational choices. This paper is focused on understanding the connections between grid operational procedures and cyber attacks. We first offer two examples to illustrate how data integrity attacks can cause economic and physical damage by misleading operators into taking inappropriate decisions. We then focus on unobservable data integrity attacks involving power meter data. These are coordinated attacks where the compromised data are consistent with the physics of power flow, and are therefore passed by any bad data detection algorithm. We develop metrics to assess the economic impact of these attacks under re-dispatch decisions using optimal power flow methods. These metrics can be used to prioritize the adoption of appropriate countermeasures including PMU placement, encryption, hardware upgrades, and advanced attack detection algorithms.
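    The "unobservable" attacks mentioned above are commonly explained with the linear (DC) state-estimation model; the sketch below follows that standard textbook formulation and is not quoted from the paper itself.

        % Measurements z, state x, measurement matrix H, weight matrix W, residual r:
        z = Hx + e, \qquad \hat{x} = (H^{\top} W H)^{-1} H^{\top} W z, \qquad r = z - H\hat{x}
        % An injected attack a = Hc shifts the estimate but not the residual:
        z_a = z + Hc \;\Rightarrow\; \hat{x}_a = \hat{x} + c, \qquad r_a = z_a - H\hat{x}_a = r
        % so residual-based bad-data detection cannot flag the compromised measurements.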

  12. GPRA 2003 quality metrics methodology and results: Office of Industrial Technologies

    SciTech Connect (OSTI)

    None, None

    2002-04-19

    This report describes the results, calculations, and assumptions underlying the GPRA 2003 Quality Metrics results for all Planning Units within the Office of Industrial Technologies.

  13. Building Cost and Performance Metrics: Data Collection Protocol, Revision 1.0

    SciTech Connect (OSTI)

    Fowler, Kimberly M.; Solana, Amy E.; Spees, Kathleen L.

    2005-09-29

    This technical report describes the process for selecting and applying the building cost and performance metrics for measuring sustainably designed buildings in comparison to traditionally designed buildings.

  14. EVMS Training Snippet: 3.2 Schedule Health Metrics | Department of Energy

    Office of Environmental Management (EM)

    This EVMS Training Snippet, sponsored by the Office of Project Management (PM), focuses on 'what' the metrics are, 'why' they are important, and what they tell us about schedule health. This Snippet does not focus on 'how' the metrics are calculated, other than to provide a basic understanding of what is being calculated. Link to Video Presentation (21:52)

  15. New Selection Metric for Design of Thin-Film Solar Cell Absorber...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Maximum Efficiency (SLME) is a new and calculable selection metric to identify new and/or improved photovoltaic (PV) absorber candidate materials for thin-film solar cells. ...

  16. Microsoft PowerPoint - Snippet 3.2 Schedule Health Metrics 20140713...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    ... available software. These metrics can be quickly reviewed each month to identify any schedule health risks on your project, whether you are the contractor or the customer. ...

  17. FY 2015 Q1 Metrics Supporting Documentation 2015-02-09.xls

    Broader source: Energy.gov (indexed) [DOE]

    Contract/Project Management Performance Metrics FY 2015 Target FY 2015 Pre- & Post- CAP* Forecast Comment 1 Capital Asset Project Success: Complete 90% of capital asset projects at ...

  18. Enclosure - FY 2015 Q3 Metrics Report 2015-08-12.xlsx

    Broader source: Energy.gov (indexed) [DOE]

    Contract/Project Management Performance Metrics FY 2015 Target FY 2015 Pre- & Post- CAP* Forecast Comment 1 Capital Asset Project Management Success: Complete 90% of capital asset ...

  19. (SSS)GAO Metrics - Project Success 2015-04-29 1100.xls

    Broader source: Energy.gov (indexed) [DOE]

    Contract/Project Management Performance Metrics FY 2015 Target FY 2015 Pre- & Post- CAP* Forecast Comment 1 Capital Asset Project Success: Complete 90% of capital asset projects at ...

  20. Enhanced Accident Tolerant LWR Fuels National Metrics Workshop Report

    SciTech Connect (OSTI)

    Lori Braase

    2013-01-01

    Commercialization. The activities performed during the feasibility assessment phase include laboratory scale experiments; fuel performance code updates; and analytical assessment of economic, operational, safety, fuel cycle, and environmental impacts of the new concepts. The development and qualification stage will consist of fuel fabrication and large scale irradiation and safety basis testing, leading to qualification and ultimate NRC licensing of the new fuel. The commercialization phase initiates technology transfer to industry for implementation. Attributes for fuels with enhanced accident tolerance include improved reaction kinetics with steam and slower hydrogen generation rate, while maintaining acceptable cladding thermo-mechanical properties; fuel thermo-mechanical properties; fuel-clad interactions; and fission-product behavior. These attributes provide a qualitative guidance for parameters that must be considered in the development of fuels and cladding with enhanced accident tolerance. However, quantitative metrics must be developed for these attributes. To initiate the quantitative metrics development, a Light Water Reactor Enhanced Accident Tolerant Fuels Metrics Development Workshop was held October 10-11, 2012, in Germantown, Maryland. This document summarizes the structure and outcome of the two-day workshop. Questions regarding the content can be directed to Lori Braase, 208-526-7763, lori.braase@inl.gov.

  1. Impact of Different Economic Performance Metrics on the Perceived Value of Solar Photovoltaics

    SciTech Connect (OSTI)

    Drury, E.; Denholm, P.; Margolis, R.

    2011-10-01

    Photovoltaic (PV) systems are installed by several types of market participants, ranging from residential customers to large-scale project developers and utilities. Each type of market participant frequently uses a different economic performance metric to characterize PV value because they are looking for different types of returns from a PV investment. This report finds that different economic performance metrics frequently show different price thresholds for when a PV investment becomes profitable or attractive. Several project parameters, such as financing terms, can have a significant impact on some metrics [e.g., internal rate of return (IRR), net present value (NPV), and benefit-to-cost (B/C) ratio] while having a minimal impact on other metrics (e.g., simple payback time). As such, the choice of economic performance metric by different customer types can significantly shape each customer's perception of PV investment value and ultimately their adoption decision.
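    A minimal sketch of the metrics named above (NPV, IRR, benefit-to-cost ratio, simple payback) applied to a hypothetical PV cash-flow stream. The inputs and the plain bisection IRR solver are illustrative assumptions, not values or methods from the report.

        def npv(rate, cashflows):
            """Net present value of a cash-flow list, with cashflows[0] at year 0."""
            return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

        def irr(cashflows, lo=-0.99, hi=1.0, tol=1e-6):
            """Internal rate of return by bisection; assumes NPV changes sign on [lo, hi]."""
            for _ in range(200):
                mid = (lo + hi) / 2
                if npv(lo, cashflows) * npv(mid, cashflows) <= 0:
                    hi = mid
                else:
                    lo = mid
                if hi - lo < tol:
                    break
            return (lo + hi) / 2

        def simple_payback(install_cost, annual_savings):
            return install_cost / annual_savings

        install, annual_savings, years = 15_000, 1_400, 25        # hypothetical residential system
        flows = [-install] + [annual_savings] * years
        print(f"NPV@5%  = {npv(0.05, flows):,.0f}")
        print(f"IRR     = {irr(flows):.1%}")
        print(f"B/C@5%  = {npv(0.05, [0] + [annual_savings] * years) / install:.2f}")
        print(f"Payback = {simple_payback(install, annual_savings):.1f} yr")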

  2. Proceedings of the 2009 Performance Metrics for Intelligent Systems Workshop

    SciTech Connect (OSTI)

    Madhavan, Raj; Messina, Elena

    2009-09-01

    The Performance Metrics for Intelligent Systems (PerMIS) workshop is dedicated to defining measures and methodologies of evaluating performance of intelligent systems. As the only workshop of its kind, PerMIS has proved to be an excellent forum for sharing lessons learned and discussions as well as fostering collaborations between researchers and practitioners from industry, academia and government agencies. The main theme of the ninth iteration of the workshop, PerMIS'09, seeks to address the question: 'Does performance measurement accelerate the pace of advancement for intelligent systems?' In addition to the main theme, as in previous years, the workshop will focus on applications of performance measures to practical problems in commercial, industrial, homeland security, and military applications. The PerMIS'09 program consists of six plenary addresses and six general and special sessions. The topics that are to be discussed by the speakers cover a wide array of themes centered on many intricate facets of intelligent system research. The presentations will emphasize and showcase the interdisciplinary nature of intelligent systems research and why it is not straightforward to evaluate such interconnected system of systems. The three days of twelve sessions will span themes from manufacturing, mobile robotics, human-system interaction, theory of mind, testing and evaluation of unmanned systems, to name a few.

  3. Sensitivity of Multi-gas Climate Policy to Emission Metrics

    SciTech Connect (OSTI)

    Smith, Steven J.; Karas, Joseph F.; Edmonds, James A.; Eom, Jiyong; Mizrahi, Andrew H.

    2013-04-01

    Multi-gas greenhouse emission targets require that different emissions be combined into an aggregate total. The Global Warming Potential (GWP) index is currently used for this purpose, despite various criticisms of the underlying concept. It is not possible to uniquely define a single metric that perfectly captures the different impacts of emissions of substances with widely disparate atmospheric lifetimes, which leads to a wide range of possible index values. We examine the sensitivity of emissions and climate outcomes to the value of the index used to aggregate methane emissions using a technologically detailed integrated assessment model. We find that the sensitivity to index value is of order 4-14% in terms of methane emissions and 2% in terms of total radiative forcing, using index values between 4 and 70 for methane, with larger regional differences in some cases. The sensitivity to index value is much higher in economic terms, with total 2-gas mitigation cost decreasing 4-5% for a lower index and increasing 10-13% for a larger index, with even larger changes if the emissions reduction targets are small. The sensitivity to index value also depends on the assumed maximum amount of mitigation available in each sector. Evaluation of the maximum mitigation potential for major sources of non-CO2 greenhouse gases would greatly aid analysis
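    Sketch of the aggregation step that the index value controls: weighting CH4 emissions into a CO2-equivalent total under alternative index values in the 4-70 range examined above. The emission quantities are illustrative, not from the paper.

        def co2_equivalent(co2_mt, ch4_mt, ch4_index):
            """Total CO2e (Mt) when CH4 is weighted by the chosen index value."""
            return co2_mt + ch4_index * ch4_mt

        co2_mt, ch4_mt = 5000.0, 30.0      # hypothetical annual emissions
        for index in (4, 25, 70):
            print(f"CH4 index {index:>2}: {co2_equivalent(co2_mt, ch4_mt, index):,.0f} Mt CO2e")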

  4. Geothermal Plant Capacity Factors

    SciTech Connect (OSTI)

    Greg Mines; Jay Nathwani; Christopher Richard; Hillary Hanson; Rachel Wood

    2015-01-01

    The capacity factors recently provided by the Energy Information Administration (EIA) indicated this plant performance metric had declined for geothermal power plants since 2008. Though capacity factor is a term commonly used by geothermal stakeholders to express the ability of a plant to produce power, it is a term frequently misunderstood and in some instances incorrectly used. In this paper we discuss how this capacity factor is defined and utilized by the EIA, including discussion of the information that the EIA requests from operators in their 923 and 860 forms, which are submitted both monthly and annually by geothermal operators. A discussion is also provided regarding the entities utilizing the information in the EIA reports, and how those entities can misinterpret the data being supplied by the operators. The intent of the paper is to inform facility operators of the importance of the accuracy of the data that they provide, and the implications of not providing the correct information.
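    For reference, the capacity-factor definition discussed above, net generation over a period divided by the energy the plant would produce at nameplate capacity for the whole period, can be computed as in the sketch below (numbers are illustrative, not EIA data).

        def capacity_factor(net_generation_mwh: float, nameplate_mw: float, hours: float) -> float:
            """Capacity factor = actual net generation / (nameplate capacity x hours in period)."""
            return net_generation_mwh / (nameplate_mw * hours)

        # Hypothetical 40 MW geothermal plant over one year (8760 hours)
        print(f"{capacity_factor(net_generation_mwh=280_000, nameplate_mw=40, hours=8760):.1%}")  # ~79.9%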

  5. Large-scale seismic waveform quality metric calculation using Hadoop

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Magana-Zook, Steven; Gaylord, Jessie M.; Knapp, Douglas R.; Dodge, Douglas A.; Ruppert, Stanley D.

    2016-05-27

    In this work we investigated the suitability of Hadoop MapReduce and Apache Spark for large-scale computation of seismic waveform quality metrics by comparing their performance with that of a traditional distributed implementation. The Incorporated Research Institutions for Seismology (IRIS) Data Management Center (DMC) provided 43 terabytes of broadband waveform data, of which 5.1 TB were processed with the traditional architecture, and the full 43 TB were processed using MapReduce and Spark. Maximum performance of ~0.56 terabytes per hour was achieved using all 5 nodes of the traditional implementation. We noted that I/O dominated processing, and that I/O performance was deteriorating with the addition of the 5th node. Data collected from this experiment provided the baseline against which the Hadoop results were compared. Next, we processed the full 43 TB dataset using both MapReduce and Apache Spark on our 18-node Hadoop cluster. We conducted these experiments multiple times with various subsets of the data so that we could build models to predict performance as a function of dataset size. We found that both MapReduce and Spark significantly outperformed the traditional reference implementation. At a dataset size of 5.1 terabytes, both Spark and MapReduce were about 15 times faster than the reference implementation. Furthermore, our performance models predict that for a dataset of 350 terabytes, Spark running on a 100-node cluster would be about 265 times faster than the reference implementation. We do not expect that the reference implementation deployed on a 100-node cluster would perform significantly better than on the 5-node cluster because the I/O performance cannot be made to scale. Finally, we note that although Big Data technologies clearly provide a way to process seismic waveform datasets in a high-performance and scalable manner, the technology is still rapidly changing, requires a high degree of investment in personnel, and will

  6. FY 2014 Q3 RCA CAP Performance Metrics Report 2014-09-05.xlsx

    Energy Savers [EERE]

    Contract/Project Management Performance Metrics FY 2014 Target FY 2014 Pre- & Post- CAP* ... TPC is Total Project Cost. No. FY 2014 Target FY 2014 3rd Qtr Actual 2 95% 92% 3 95% ...

  7. FY 2014 Q4 Metrics Report 2014-11-06.xlsx

    Energy Savers [EERE]

    Contract/Project Management Performance Metrics FY 2014 Target FY 2014 Pre- & Post- CAP* ... TPC is Total Project Cost. No. FY 2014 Target FY 2014 4th Qtr Actual 2 95% 89% 3 95% ...

  8. EAC Presentation: Metrics and Benefits Analysis for the ARRA Smart Grid Programs- March 10, 2011

    Broader source: Energy.gov [DOE]

    PowerPoint presentation by Joe Paladino from the Office of Electricity Delivery and Energy Reliability before the Electricity Advisory Committee (EAC) on metrics and benefits analysis for the...

  9. 11,202,720 Metric Tons of CO2 Injected as of October 14, 2015...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    This carbon dioxide (CO2) has been injected in the United States as part of DOE's Clean Coal Research, Development, and Demonstration Programs. One million metric tons of CO2 is ...

  10. 11,202,720 Metric Tons of CO2 Injected as of October 14, 2015

    Office of Energy Efficiency and Renewable Energy (EERE)

    This carbon dioxide (CO2) has been injected in the United States as part of DOE's Clean Coal Research, Development, and Demonstration Programs. One million metric tons of CO2 is equivalent to the...

  11. Enclosure - FY 2016 Q1 Metrics Report 2016-02-11.xlsx

    Broader source: Energy.gov (indexed) [DOE]

    No. Contract/Project Management Performance Metrics FY 2016 Target No. 2 3 4 5 6 7 Comment FY 2016 Forecast Certified Contracting Staff: By the end of FY 2011, 85% of the 1102 ...

  12. Modified Anti-de-Sitter Metric, Light-Front Quantized QCD, and...

    Office of Scientific and Technical Information (OSTI)

    Modified Anti-de-Sitter Metric, Light-Front Quantized QCD, and Conformal Quantum Mechanics Dosch, Hans Gunter; U. Heidelberg, ITP; Brodsky, Stanley J.; SLAC; de Teramond, Guy F.;...

  13. 12,877,644 Metric Tons of CO2 Injected as of July 1, 2016

    Broader source: Energy.gov [DOE]

    This carbon dioxide (CO2) has been injected in the United States as part of DOE’s Clean Coal Research, Development, and Demonstration Programs. One million metric tons of CO2 is equivalent to the...

  14. Metrics for Developing an Endorsed Set of Radiographic Threat Surrogates for JINII/CAARS

    SciTech Connect (OSTI)

    Wurtz, R; Walston, S; Dietrich, D; Martz, H

    2009-02-11

    CAARS (Cargo Advanced Automated Radiography System) is developing x-ray dual energy and x-ray backscatter methods to automatically detect materials that are greater than Z=72 (hafnium). This works well for simple geometry materials, where most of the radiographic path is through one material. However, this is usually not the case. Instead, the radiographic path includes many materials of different lengths. Single energy can be used to compute {mu}l, which is related to areal density (mass per unit area), while dual energy yields more information. This report describes a set of metrics suitable and sufficient for characterizing the appearance of assemblies as detected by x-ray radiographic imaging systems, such as those being tested by Joint Integrated Non-Intrusive Inspection (JINII) or developed under CAARS. These metrics will be simulated both for threat assemblies and surrogate threat assemblies (such as are found in Roney et al. 2007) using geometrical and compositional information of the assemblies. The imaging systems are intended to distinguish assemblies containing high-Z material from those containing low-Z material, regardless of thickness, density, or compounds and mixtures. The systems in question operate on the principle of comparing images obtained by using two different x-ray end-point energies--so-called 'dual energy' imaging systems. At the direction of the DHS JINII sponsor, this report does not cover metrics that implement scattering, in the form of either forward-scattered radiation or high-Z detection systems operating on the principle of backscatter detection. Such methods and effects will be covered in a later report. The metrics described here are to be used to compare assemblies and not x-ray radiography systems. We intend to use these metrics to determine whether two assemblies do or do not look the same. We are tasked to develop a set of assemblies whose appearance using this class of detection systems is indistinguishable from the
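
    For illustration only, the single-energy quantity mentioned above, the product of linear attenuation coefficient and path length summed along the radiographic path, can be sketched as follows; this is not the report's code, and the segment values are hypothetical.

```python
# Hedged sketch: total attenuation exponent along an x-ray path made of
# several materials. Each segment is (linear attenuation coefficient in 1/cm,
# path length in cm); the numbers below are hypothetical.
import math

def path_attenuation(segments):
    return sum(mu * length for mu, length in segments)

segments = [(0.2, 10.0), (1.7, 1.0)]   # e.g., cargo-like material plus a dense insert
mu_l = path_attenuation(segments)
transmission = math.exp(-mu_l)         # fraction of the beam transmitted
print(mu_l, transmission)
```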

  15. NNSA Eliminates 100 Metric Tons Of Weapons-Grade Nuclear Material

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    August 25, 2008 - WASHINGTON, D.C. - Today the Department of Energy's National Nuclear Security Administration (NNSA) announced that it successfully eliminated 100 metric tons of U.S. highly enriched uranium (HEU), enough for thousands of nuclear weapons. For the last decade, the U.S. HEU disposition program has eliminated surplus HEU from the nuclear weapons program by downblending

  16. Implementing the Data Center Energy Productivity Metric in a High Performance Computing Data Center

    SciTech Connect (OSTI)

    Sego, Landon H.; Marquez, Andres; Rawson, Andrew; Cader, Tahir; Fox, Kevin M.; Gustafson, William I.; Mundy, Christopher J.

    2013-06-30

    As data centers proliferate in size and number, the improvement of their energy efficiency and productivity has become an economic and environmental imperative. Making these improvements requires metrics that are robust, interpretable, and practical. We discuss the properties of a number of the proposed metrics of energy efficiency and productivity. In particular, we focus on the Data Center Energy Productivity (DCeP) metric, which is the ratio of useful work produced by the data center to the energy consumed performing that work. We describe our approach for using DCeP as the principal outcome of a designed experiment using a highly instrumented, high-performance computing data center. We found that DCeP was successful in clearly distinguishing different operational states in the data center, thereby validating its utility as a metric for identifying configurations of hardware and software that would improve energy productivity. We also discuss some of the challenges and benefits associated with implementing the DCeP metric, and we examine the efficacy of the metric in making comparisons within a data center and between data centers.
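
    A trivial sketch of the DCeP ratio as the abstract defines it (useful work divided by the energy consumed producing it); the notion of "useful work" and the numbers below are placeholders, not values from the study.

```python
# Illustrative only: DCeP = useful work produced / energy consumed doing it.
# "Useful work" is whatever the data-center owner defines (e.g., completed jobs).
def dcep(useful_work, energy_consumed_kwh):
    return useful_work / energy_consumed_kwh

print(dcep(useful_work=1.2e6, energy_consumed_kwh=4.8e4))  # hypothetical values
```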

  17. Effective detective quantum efficiency for two mammography systems: Measurement and comparison against established metrics

    SciTech Connect (OSTI)

    Salvagnini, Elena; Bosmans, Hilde; Marshall, Nicholas W.; Struelens, Lara

    2013-10-15

    Purpose: The aim of this paper was to illustrate the value of the new metric effective detective quantum efficiency (eDQE) in relation to more established measures in the optimization process of two digital mammography systems. The following metrics were included for comparison against eDQE: detective quantum efficiency (DQE) of the detector, signal difference to noise ratio (SdNR), and detectability index (d′) calculated using a standard nonprewhitened observer with eye filter.Methods: The two systems investigated were the Siemens MAMMOMAT Inspiration and the Hologic Selenia Dimensions. The presampling modulation transfer function (MTF) required for the eDQE was measured using two geometries: a geometry containing scattered radiation and a low scatter geometry. The eDQE, SdNR, and d′ were measured for poly(methyl methacrylate) (PMMA) thicknesses of 20, 40, 60, and 70 mm, with and without the antiscatter grid and for a selection of clinically relevant target/filter (T/F) combinations. Figures of merit (FOMs) were then formed from SdNR and d′ using the mean glandular dose as the factor to express detriment. Detector DQE was measured at energies covering the range of typical clinically used spectra.Results: The MTF measured in the presence of scattered radiation showed a large drop at low spatial frequency compared to the low scatter method and led to a corresponding reduction in eDQE. The eDQE for the Siemens system at 1 mm{sup −1} ranged between 0.15 and 0.27, depending on T/F and grid setting. For the Hologic system, eDQE at 1 mm{sup −1} varied from 0.15 to 0.32, again depending on T/F and grid setting. The eDQE results for both systems showed that the grid increased the system efficiency for PMMA thicknesses of 40 mm and above but showed only small sensitivity to T/F setting. While results of the SdNR and d′ based FOMs confirmed the eDQE grid position results, they were also more specific in terms of T/F selection. For the Siemens system at 20 mm PMMA

  18. Energy Department Project Captures and Stores more than One Million Metric Tons of CO2

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    June 26, 2014 - 11:30am - Aerial view of Air Products’ existing steam methane reforming facility at Port Arthur, Texas, with new carbon-capture units and central co-gen and CO2 product compressor. | Photo courtesy of Air Products and Chemicals Inc.

  19. DOE to Remove 200 Metric Tons of Highly Enriched Uranium from U.S. Nuclear Weapons Stockpile

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    November 7, 2005 - 12:38pm - Will Be Redirected to Naval Reactors, Down-blended or Used for Space Programs. WASHINGTON, DC - Secretary of Energy Samuel W. Bodman today announced that the Department of Energy's (DOE) National Nuclear Security Administration (NNSA) will

  20. Metrics of closed world of Friedmann, agitated by electric charge (towards a theory electromagnetic Friedmanns)

    SciTech Connect (OSTI)

    Markov, M.A.; Frolov, V.P.

    1986-06-10

    The well-known Tolman problem is generalized to the case of electrically charged dust-like matter in a centrally symmetric system. The first integrals of the corresponding system of Einstein-Maxwell equations are found. The problem is specified in such a way that, as the total charge of the system goes to zero, the metric of the closed Friedmann world is recovered. The system is considered at the initial moment, that of maximal expansion. For any nonvanishing value of the electric charge, however small, the metric is no longer closed. The metric of the almost-Friedmannian part of the world admits continuation through a narrow throat (for small charge) as the Reissner-Nordstroem metric with parameters m{sub 0}√({chi}) = e{sub 0}. The expression for the electric potential in the throat, {phi}{sub h} = c{sup 2}/√({chi}), does not depend on the value of the electric charge. The radius of the throat, r{sub h} = e{sub 0}√({chi})/c{sup 2}, increases with increasing charge. The state of the throat as given by the classical description appears essentially unstable from the quantum-physics viewpoint. The production of various pairs in the enormous electric fields of the throat gives rise to its polarisation, up to an effective charge Z < 137e, irrespective of the initial (no matter how great) charge of the system.

  1. Energy Department Project Captures and Stores One Million Metric Tons of Carbon

    Broader source: Energy.gov [DOE]

    As part of President Obama’s all-of-the-above energy strategy, the Department of Energy announced today that its Illinois Basin-Decatur Project successfully captured and stored one million metric tons of carbon dioxide (CO2) and injected it into a deep saline formation.

  2. Multidimensional metrics for estimating phage abundance, distribution, gene density, and sequence coverage in metagenomes

    SciTech Connect (OSTI)

    Aziz, Ramy K.; Dwivedi, Bhakti; Akhter, Sajia; Breitbart, Mya; Edwards, Robert A.

    2015-05-08

    Phages are the most abundant biological entities on Earth and play major ecological roles, yet the current sequenced phage genomes do not adequately represent their diversity, and little is known about the abundance and distribution of these sequenced genomes in nature. Although the study of phage ecology has benefited tremendously from the emergence of metagenomic sequencing, a systematic survey of phage genes and genomes in various ecosystems is still lacking, and fundamental questions about phage biology, lifestyle, and ecology remain unanswered. To address these questions and improve comparative analysis of phages in different metagenomes, we screened a core set of publicly available metagenomic samples for sequences related to completely sequenced phages using the web tool, Phage Eco-Locator. We then adopted and deployed an array of mathematical and statistical metrics for a multidimensional estimation of the abundance and distribution of phage genes and genomes in various ecosystems. Experiments using those metrics individually showed their usefulness in emphasizing the pervasive, yet uneven, distribution of known phage sequences in environmental metagenomes. Using these metrics in combination allowed us to resolve phage genomes into clusters that correlated with their genotypes and taxonomic classes as well as their ecological properties. We propose adding this set of metrics to current metaviromic analysis pipelines, where they can provide insight regarding phage mosaicism, habitat specificity, and evolution.

  3. Multidimensional metrics for estimating phage abundance, distribution, gene density, and sequence coverage in metagenomes

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Aziz, Ramy K.; Dwivedi, Bhakti; Akhter, Sajia; Breitbart, Mya; Edwards, Robert A.

    2015-05-08

    Phages are the most abundant biological entities on Earth and play major ecological roles, yet the current sequenced phage genomes do not adequately represent their diversity, and little is known about the abundance and distribution of these sequenced genomes in nature. Although the study of phage ecology has benefited tremendously from the emergence of metagenomic sequencing, a systematic survey of phage genes and genomes in various ecosystems is still lacking, and fundamental questions about phage biology, lifestyle, and ecology remain unanswered. To address these questions and improve comparative analysis of phages in different metagenomes, we screened a core set of publicly available metagenomic samples for sequences related to completely sequenced phages using the web tool, Phage Eco-Locator. We then adopted and deployed an array of mathematical and statistical metrics for a multidimensional estimation of the abundance and distribution of phage genes and genomes in various ecosystems. Experiments using those metrics individually showed their usefulness in emphasizing the pervasive, yet uneven, distribution of known phage sequences in environmental metagenomes. Using these metrics in combination allowed us to resolve phage genomes into clusters that correlated with their genotypes and taxonomic classes as well as their ecological properties. We propose adding this set of metrics to current metaviromic analysis pipelines, where they can provide insight regarding phage mosaicism, habitat specificity, and evolution.

  4. Performance Metrics

    Broader source: Energy.gov [DOE]

    RCA/CAP Closure Report 2011 - This RCA/CAP Closure Report presents a status of the Department’s initiatives to address the most significant issues and their corresponding root causes and officially...

  5. The International Safeguards Technology Base: How is the Patient Doing? An Exploration of Effective Metrics

    SciTech Connect (OSTI)

    Schanfein, Mark J; Gouveia, Fernando S

    2010-07-01

    The term “Technology Base” is commonly used, but what does it mean? Is there a common understanding of the components that comprise a technology base? Does a formal process exist to assess the health of a given technology base? These are important questions, and their relevance is even more pressing given the USDOE/NNSA initiatives to strengthen the safeguards technology base through investments in research & development and human capital development. Accordingly, the authors will establish a high-level framework to define and understand what comprises a technology base. Potential goal-driven metrics to assess the health of a technology base will also be explored, such as linear demographics and resource availability, in the hope that they can be used to better understand and improve the health of the U.S. safeguards technology base. Finally, through the identification of such metrics, the authors will offer suggestions and highlight choices for addressing potential shortfalls.

  6. Dynamical Systems in the Variational Formulation of the Fokker-Planck Equation by the Wasserstein Metric

    SciTech Connect (OSTI)

    Mikami, T.

    2000-07-01

    R. Jordan, D. Kinderlehrer, and F. Otto proposed the discrete-time approximation of the Fokker-Planck equation by the variational formulation. It is determined by the Wasserstein metric, an energy functional, and the Gibbs-Boltzmann entropy functional. In this paper we study the asymptotic behavior of the dynamical systems which describe their approximation of the Fokker-Planck equation and characterize the limit as a solution to a class of variational problems.

  7. Time delay of light signals in an energy-dependent spacetime metric

    SciTech Connect (OSTI)

    Grillo, A. F.; Luzio, E.; Mendez, F.

    2008-05-15

    In this paper we review the problem of time delay of photons propagating in a spacetime with a metric that explicitly depends on the energy of the particles (gravity-rainbow approach). We show that corrections due to this approach--which is closely related to the double special relativity proposal--produce for small redshifts (z<<1) smaller time delays than in the generic Lorentz invariance violating case.

  8. Microsoft Word - McIntyre-Metrics Report SAND draft9-14.doc

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    2070P Unlimited Release September 2007 Security Metrics for Process Control Systems Annie McIntyre, Blair Becker, Ron Halbgewachs Prepared by Sandia National Laboratories Albuquerque, New Mexico 87185 and Livermore, California 94550 Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under Contract DE-AC04-94AL85000. Approved for public release; further dissemination

  9. Texas CO2 Capture Demonstration Project Hits Three Million Metric Ton Milestone

    Broader source: Energy.gov [DOE]

    On June 30, Allentown, PA-based Air Products and Chemicals, Inc. successfully captured and transported, via pipeline, its 3 millionth metric ton of carbon dioxide (CO2) to be used for enhanced oil recovery. This achievement highlights the ongoing success of a carbon capture and storage (CCS) project sponsored by the U.S. Department of Energy (DOE) and managed by the National Energy Technology Laboratory (NETL).

  10. Performance metrics and life-cycle information management for building performance assurance

    SciTech Connect (OSTI)

    Hitchcock, R.J.; Piette, M.A.; Selkowitz, S.E.

    1998-06-01

    Commercial buildings account for over $85 billion per year in energy costs, consuming far more energy than is technically necessary. One of the primary reasons buildings do not perform as well as intended is that critical information is lost, through ineffective documentation and communication, leading to building systems that are often improperly installed and operated. A life-cycle perspective on the management of building information provides a framework for improving commercial building energy performance. This paper describes a project to develop strategies and techniques to provide decision-makers with information needed to assure the desired building performance across the complete life cycle of a building project. A key element in this effort is the development of explicit performance metrics that quantitatively represent performance objectives of interest to various building stakeholders. The paper begins with a discussion of key problems identified in current building industry practice, and ongoing work to address these problems. The paper then focuses on the concept of performance metrics and their use in improving building performance during design, commissioning, and on-going operations. The design of a Building Life-cycle Information System (BLISS) is presented. BLISS is intended to provide an information infrastructure capable of integrating a variety of building information technologies that support performance assurance. The use of performance metrics in case study building projects is explored to illustrate current best practice. The application of integrated information technology for improving current practice is discussed.

  11. Specification and implementation of IFC based performance metrics to support building life cycle assessment of hybrid energy systems

    SciTech Connect (OSTI)

    Morrissey, Elmer; O'Donnell, James; Keane, Marcus; Bazjanac, Vladimir

    2004-03-29

    Minimizing building life cycle energy consumption is becoming of paramount importance. Performance metrics tracking offers a clear and concise manner of relating design intent in a quantitative form. A methodology is discussed for storage and utilization of these performance metrics through an Industry Foundation Classes (IFC) instantiated Building Information Model (BIM). The paper focuses on storage of three sets of performance data from three distinct sources. An example of a performance metrics programming hierarchy is displayed for a heat pump and a solar array. Utilizing the sets of performance data, two discrete performance effectiveness ratios may be computed, thus offering an accurate method of quantitatively assessing building performance.

  12. 12,893,780 Metric Tons of CO2 Injected as of July 19, 2016 | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    This carbon dioxide (CO2) has been injected in the United States as part of DOE's Clean Coal Research, Development, and Demonstration Programs. One million metric tons of CO2 is equivalent to the annual greenhouse gas emissions from 210,526 passenger vehicles. The projects currently injecting CO2 within DOE's Regional Carbon Sequestration Partnership Program and the

  13. Development and evaluation of aperture-based complexity metrics using film and EPID measurements of static MLC openings

    SciTech Connect (OSTI)

    Götstedt, Julia; Karlsson Hauer, Anna; Bäck, Anna

    2015-07-15

    Purpose: Complexity metrics have been suggested as a complement to measurement-based quality assurance for intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT). However, these metrics have not yet been sufficiently validated. This study develops and evaluates new aperture-based complexity metrics in the context of static multileaf collimator (MLC) openings and compares them to previously published metrics. Methods: This study develops the converted aperture metric and the edge area metric. The converted aperture metric is based on small and irregular parts within the MLC opening that are quantified as measured distances between MLC leaves. The edge area metric is based on the relative size of the region around the edges defined by the MLC. Another metric suggested in this study is the circumference/area ratio. Earlier defined aperture-based complexity metrics—the modulation complexity score, the edge metric, the ratio monitor units (MU)/Gy, the aperture area, and the aperture irregularity—are compared to the newly proposed metrics. A set of small and irregular static MLC openings is created to simulate individual IMRT/VMAT control points of various complexities. These are measured with both an amorphous silicon electronic portal imaging device and EBT3 film. The differences between calculated and measured dose distributions are evaluated using a pixel-by-pixel comparison with two global dose difference criteria of 3% and 5%. The extent of the dose differences, expressed in terms of pass rate, is used as a measure of the complexity of the MLC openings and used for the evaluation of the metrics compared in this study. The different complexity scores are calculated for each created static MLC opening. The correlation between the calculated complexity scores and the extent of the dose differences (pass rate) is analyzed in scatter plots and using Pearson’s r-values. Results: The complexity scores calculated by the edge
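
    As a rough sketch of the simplest metric named above, the circumference/area ratio of an MLC opening, the code below treats each leaf pair as a rectangle and approximates the perimeter of their union. This is a simplification for illustration, not the published definition, and the leaf positions are hypothetical.

```python
# Simplified sketch of a circumference/area style aperture metric.
# left[i], right[i]: leaf-tip positions (cm) of leaf pair i; leaf_width in cm.
def circumference_area_ratio(left, right, leaf_width=0.5):
    gaps = [r - l for l, r in zip(left, right)]
    area = leaf_width * sum(gaps)
    # Vertical edges: each open leaf pair contributes its two sides.
    perimeter = 2.0 * leaf_width * len(gaps)
    # Horizontal edges: outer top/bottom plus steps between adjacent pairs
    # (assumes adjacent openings overlap in x, a simplification).
    perimeter += gaps[0] + gaps[-1]
    for i in range(len(gaps) - 1):
        perimeter += abs(left[i] - left[i + 1]) + abs(right[i] - right[i + 1])
    return perimeter / area

# Hypothetical 5-pair opening (cm): larger values indicate a more complex aperture.
print(circumference_area_ratio([-2.0, -1.5, -3.0, -1.0, -2.0],
                               [1.0, 2.5, 0.5, 1.5, 2.0]))
```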

  14. Light Water Reactor Sustainability Program Operator Performance Metrics for Control Room Modernization: A Practical Guide for Early Design Evaluation

    SciTech Connect (OSTI)

    Ronald Boring; Roger Lew; Thomas Ulrich; Jeffrey Joe

    2014-03-01

    As control rooms are modernized with new digital systems at nuclear power plants, it is necessary to evaluate the operator performance using these systems as part of a verification and validation process. There are no standard, predefined metrics available for assessing what is satisfactory operator interaction with new systems, especially during the early design stages of a new system. This report identifies the process and metrics for evaluating human system interfaces as part of control room modernization. The report includes background information on design and evaluation, a thorough discussion of human performance measures, and a practical example of how the process and metrics have been used as part of a turbine control system upgrade during the formative stages of design. The process and metrics are geared toward generalizability to other applications and serve as a template for utilities undertaking their own control room modernization activities.

  15. Genome Assembly Forensics: Metrics for Assessing Assembly Correctness (Metagenomics Informatics Challenges Workshop: 10K Genomes at a Time)

    ScienceCinema (OSTI)

    Pop, Mihai [University of Maryland

    2013-01-22

    University of Maryland's Mihai Pop on "Genome Assembly Forensics: Metrics for Assessing Assembly Correctness" at the Metagenomics Informatics Challenges Workshop held at the DOE JGI on October 12-13, 2011.

  16. Table 11.4 Nitrous Oxide Emissions, 1980-2009 (Thousand Metric Tons of Nitrous Oxide)

    U.S. Energy Information Administration (EIA) Indexed Site

    Nitrous Oxide Emissions, 1980-2009 (Thousand Metric Tons of Nitrous Oxide) Year Energy Sources Waste Management Agricultural Sources Industrial Processes 3 Total Mobile Combustion 1 Stationary Combustion 2 Total Waste Combustion Human Sewage in Wastewater Total Nitrogen Fertilization of Soils Crop Residue Burning Solid Waste of Domesticated Animals Total 1980 60 44 104 1 10 11 364 1 75 440 88 642 1981 63 44 106 1 10 11 364 2 74 440 84 641 1982 67 42 108 1 10 11 339 2 74 414 80 614 1983 71 43 114

  17. Einstein-aether theory, violation of Lorentz invariance, and metric-affine gravity

    SciTech Connect (OSTI)

    Heinicke, Christian; Baekler, Peter; Hehl, Friedrich W.

    2005-07-15

    We show that the Einstein-aether theory of Jacobson and Mattingly (J and M) can be understood in the framework of the metric-affine (gauge theory of) gravity (MAG). We achieve this by relating the aether vector field of J and M to certain post-Riemannian nonmetricity pieces contained in an independent linear connection of spacetime. Then, for the aether, a corresponding geometrical curvature-square Lagrangian with a massive piece can be formulated straightforwardly. We find an exact spherically symmetric solution of our model.

  18. Perfect fluid and scalar field in the Reissner-Nordstroem metric

    SciTech Connect (OSTI)

    Babichev, E. O.; Dokuchaev, V. I.; Eroshenko, Yu. N.

    2011-05-15

    We describe the spherically symmetric steady-state accretion of perfect fluid in the Reissner-Nordstroem metric. We present analytic solutions for accretion of a fluid with linear equations of state and of the Chaplygin gas. We also show that under reasonable physical conditions, there is no steady-state accretion of a perfect fluid onto a Reissner-Nordstroem naked singularity. Instead, a static atmosphere of fluid is formed. We discuss a possibility of violation of the third law of black hole thermodynamics for a phantom fluid accretion.

  19. Ultrahard fluid and scalar field in the Kerr-Newman metric

    SciTech Connect (OSTI)

    Babichev, E.; Chernov, S.; Dokuchaev, V.; Eroshenko, Yu.

    2008-11-15

    An analytic solution for the accretion of ultrahard perfect fluid onto a moving Kerr-Newman black hole is found. This solution is a generalization of the previously known solution by Petrich, Shapiro, and Teukolsky for a Kerr black hole. We show that the found solution is applicable for the case of a nonextreme black hole, however it cannot describe the accretion onto an extreme black hole due to violation of the test fluid approximation. We also present a stationary solution for a massless scalar field in the metric of a Kerr-Newman naked singularity.

  20. Comparing Resource Adequacy Metrics and Their Influence on Capacity Value: Preprint

    SciTech Connect (OSTI)

    Ibanez, E.; Milligan, M.

    2014-04-01

    Traditional probabilistic methods have been used to evaluate resource adequacy. The increasing presence of variable renewable generation in power systems presents a challenge to these methods because, unlike thermal units, variable renewable generation levels change over time, driven by meteorological events. Thus, capacity value calculations for these resources are often reduced to simple rules of thumb. This paper follows the recommendations of the North American Electric Reliability Corporation's Integration of Variable Generation Task Force to include variable generation in the calculation of resource adequacy and compares different reliability metrics. Examples are provided using the Western Interconnection footprint under different variable generation penetrations.

  1. OSTIblog Articles in the metrics Topic | OSTI, US Dept of Energy Office of Scientific and Technical Information

    Office of Scientific and Technical Information (OSTI)

    OSTI's Committee of Visitors, An Update, by Dr. Jeffrey Salmon, 23 May 2011, in Science Communications. "The unexamined life is not worth living." So says Plato's Socrates in the Apology. His self-examination led to extreme humility (or to an extreme irony) when Socrates confessed to his accusers that the only knowledge he had was knowledge of his

  2. Integration of Sustainability Metrics into Design Cases and State of Technology Assessments

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    DOE Bioenergy Technologies Office (BETO) 2015 Project Peer Review: Integration of Sustainability Metrics into Design Cases and State of Technology Assessments (2.1.0.100/2.1.0.302 NREL; 2.1.0.301 PNNL). Presented by Mary Biddy on behalf of Eric Tan, Abhijit Dutta, Ryan Davis, and Mike Talmadge (NREL), and by Lesley Snowden-Swan on behalf of Sue Jones, Aye Meyer, Ken Rappe, and Kurt Spies (PNNL). This presentation does not contain any proprietary, confidential, or otherwise restricted information. Goal Statement: Support the development

  3. Methodology, Methods, and Metrics for Testing and Evaluating Augmented Cognition Systems

    SciTech Connect (OSTI)

    Greitzer, Frank L.

    2008-09-15

    The augmented cognition research community seeks cognitive neuroscience-based solutions to improve warfighter performance by applying and managing mitigation strategies to reduce workload and improve the throughput and quality of decisions. The focus of augmented cognition mitigation research is to define, demonstrate, and exploit neuroscience and behavioral measures that support inferences about the warfighter's cognitive state that prescribe the nature and timing of mitigation. A research challenge is to develop valid evaluation methodologies, metrics and measures to assess the impact of augmented cognition mitigations. Two considerations are external validity, which is the extent to which the results apply to operational contexts; and internal validity, which reflects the reliability of performance measures and the conclusions based on analysis of results. The scientific rigor of the research methodology employed in conducting empirical investigations largely affects the validity of the findings. External validity requirements also compel us to demonstrate operational significance of mitigations. Thus it is important to demonstrate effectiveness of mitigations under specific conditions. This chapter reviews some cognitive science and methodological considerations in designing augmented cognition research studies and associated human performance metrics and analysis methods to assess the impact of augmented cognition mitigations.

  4. An Aquatic Acoustic Metrics Interface Utility for Underwater Sound Monitoring and Analysis

    SciTech Connect (OSTI)

    Ren, Huiying; Halvorsen, Michele B.; Deng, Zhiqun; Carlson, Thomas J.

    2012-05-31

    Fishes and marine mammals suffer a range of potential effects from intense sound sources generated by anthropogenic underwater processes such as pile driving, shipping, sonars, and underwater blasting. Several underwater sound recording devices (USRs) were built to monitor the acoustic sound pressure waves generated by those anthropogenic underwater activities, so the relevant processing software becomes indispensable for analyzing the audio files recorded by these USRs. However, existing software packages did not meet performance and flexibility requirements. In this paper, we provide a detailed description of a new software package, named Aquatic Acoustic Metrics Interface (AAMI), which is a Graphical User Interface (GUI) designed for underwater sound monitoring and analysis. In addition to the general functions, such as loading and editing audio files recorded by USRs, the software can compute a series of acoustic metrics in physical units, monitor the sound's influence on fish hearing according to audiograms from different species of fishes and marine mammals, and batch process the sound files. The detailed applications of the software AAMI will be discussed along with several test case scenarios to illustrate its functionality.
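
    The sketch below computes two representative underwater acoustic metrics of the kind AAMI reports, peak sound pressure level and sound exposure level, from a calibrated pressure time series; it is not AAMI's code, and the synthetic signal and reference-pressure convention are assumptions.

```python
# Hedged sketch: peak SPL and SEL from a pressure time series in pascals,
# referenced to 1 micropascal (the usual underwater convention).
import numpy as np

def peak_spl_db(pressure_pa, p_ref=1e-6):
    return 20.0 * np.log10(np.max(np.abs(pressure_pa)) / p_ref)

def sel_db(pressure_pa, fs, p_ref=1e-6):
    exposure = np.sum(pressure_pa ** 2) / fs          # integral of p^2 dt
    return 10.0 * np.log10(exposure / (p_ref ** 2))   # re 1 uPa^2*s

fs = 48_000                                   # hypothetical sample rate (Hz)
t = np.arange(fs) / fs
p = 50.0 * np.sin(2 * np.pi * 200.0 * t)      # synthetic 200 Hz tone, 50 Pa amplitude
print(peak_spl_db(p), sel_db(p, fs))
```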

  5. Assessing the Effects of Data Compression in Simulations Using Physically Motivated Metrics

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Laney, Daniel; Langer, Steven; Weber, Christopher; Lindstrom, Peter; Wegener, Al

    2014-01-01

    This paper examines whether lossy compression can be used effectively in physics simulations as a possible strategy to combat the expected data-movement bottleneck in future high performance computing architectures. We show that, for the codes and simulations we tested, compression levels of 3–5X can be applied without causing significant changes to important physical quantities. Rather than applying signal processing error metrics, we utilize physics-based metrics appropriate for each code to assess the impact of compression. We evaluate three different simulation codes: a Lagrangian shock-hydrodynamics code, an Eulerian higher-order hydrodynamics turbulence modeling code, and an Eulerian coupled laser-plasma interaction code. We compress relevant quantities after each time-step to approximate the effects of tightly coupled compression and study the compression rates to estimate memory and disk-bandwidth reduction. We find that the error characteristics of compression algorithms must be carefully considered in the context of the underlying physics being modeled.

  6. Using research metrics to evaluate the International Atomic Energy Agency guidelines on quality assurance for R&D

    SciTech Connect (OSTI)

    Bodnarczuk, M.

    1994-06-01

    The objective of the International Atomic Energy Agency (IAEA) Guidelines on Quality Assurance for R&D is to provide guidance for developing quality assurance (QA) programs for R&D work on items, services, and processes important to safety, and to support the siting, design, construction, commissioning, operation, and decommissioning of nuclear facilities. The standard approach to writing papers describing new quality guidelines documents is to present a descriptive overview of the contents of the document. I will depart from this approach. Instead, I will first discuss a conceptual framework of metrics for evaluating and improving basic and applied experimental science as well as the associated role that quality management should play in understanding and implementing these metrics. I will conclude by evaluating how well the IAEA document addresses the metrics from this conceptual framework and the broader principles of quality management.

  7. SU-E-T-359: Measurement of Various Metrics to Determine Changes in Megavoltage Photon Beam Energy

    SciTech Connect (OSTI)

    Gao, S; Balter, P; Rose, M; Simon, W

    2014-06-01

    Purpose: To examine the relationship between photon beam energy and various metrics for energy on the flattened and flattening filter free (FFF) beams generated by the Varian TrueBeam. Methods: Energy changes were accomplished by adjusting the bending magnet current ±10% from the nominal value for the 4, 6, 8, and 10 MV flattened and 6 and 10 MV FFF beams. Profiles were measured for a 30×30 cm{sup 2} field using a 2D ionization chamber array and a 3D water scanner, which was also used to measure PDDs. For flattened beams we compared several energy metrics: PDD at 10 cm depth in water (PDD(10)); the variation over the central 80% of the field (Flat); and the average of the highest reading along each diagonal divided by the CAX value, diagonal normalized flatness (FDN). For FFF beams we examined PDD(10), FDN, and the width of a chosen isodose level in a 30×30 cm{sup 2} field (W(d%)). Results: Changes in PDD(10) were nearly linear with changes in energy for both flattened and FFF beams, as were changes in FDN. Changes in W(d%) were also nearly linear with energy for the FFF beams. PDD(10) was not as sensitive to changes in energy compared to the other metrics for either flattened or FFF beams. Flat was not as sensitive to changes in energy compared to FDN for flattened beams, and its behavior depends on depth. FDN was the metric that had the highest sensitivity to the changes in energy for flattened beams, while W(d%) was the metric that had the highest sensitivity to the changes in energy for FFF beams. Conclusions: The metric FDN was found to be most sensitive to energy changes for flattened beams, while W(d%) was most sensitive to energy changes for FFF beams.
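
    A small sketch of the diagonal-normalized flatness (FDN) metric as described above: the mean of the maximum readings along the two diagonals of a 2D profile, divided by the central-axis value. The array here is synthetic, not measured data.

```python
# Hedged sketch of FDN for a square 2D profile of detector readings.
import numpy as np

def fdn(profile):
    n = profile.shape[0]
    cax = profile[n // 2, n // 2]                     # central-axis reading
    d1 = np.diagonal(profile)                         # main diagonal
    d2 = np.diagonal(np.fliplr(profile))              # anti-diagonal
    return 0.5 * (d1.max() + d2.max()) / cax

rng = np.random.default_rng(0)
beam = 1.0 + 0.02 * rng.standard_normal((31, 31))     # synthetic flattened-beam profile
print(fdn(beam))
```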

  8. Table 11.3 Methane Emissions, 1980-2009 (Million Metric Tons of Methane)

    U.S. Energy Information Administration (EIA) Indexed Site

    Methane Emissions, 1980-2009 (Million Metric Tons of Methane) Year Energy Sources Waste Management Agricultural Sources Industrial Processes 9 Total 5 Coal Mining Natural Gas Systems 1 Petroleum Systems 2 Mobile Com- bustion 3 Stationary Com- bustion 4 Total 5 Landfills Waste- water Treatment 6 Total 5 Enteric Fermen- tation 7 Animal Waste 8 Rice Cultivation Crop Residue Burning Total 5 1980 3.06 4.42 NA 0.28 0.45 8.20 10.52 0.52 11.04 5.47 2.87 0.48 0.04 8.86 0.17 28.27 1981 2.81 5.02 NA .27

  9. MULTI-SCALE MORPHOLOGICAL ANALYSIS OF SDSS DR5 SURVEY USING THE METRIC SPACE TECHNIQUE

    SciTech Connect (OSTI)

    Wu Yongfeng; Batuski, David J.; Khalil, Andre

    2009-12-20

    Following the novel development and adaptation of the Metric Space Technique (MST), a multi-scale morphological analysis of the Sloan Digital Sky Survey (SDSS) Data Release 5 (DR5) was performed. The technique was adapted to perform a space-scale morphological analysis by filtering the galaxy point distributions with a smoothing Gaussian function, thus giving quantitative structural information on all size scales between 5 and 250 Mpc. The analysis was performed on a dozen slices of a volume of space containing many newly measured galaxies from the SDSS DR5 survey. Using the MST, observational data were compared to galaxy samples taken from N-body simulations with current best estimates of cosmological parameters and from random catalogs. By using the maximal ranking method among MST output functions, we also develop a way to quantify the overall similarity of the observed samples with the simulated samples.

  10. Quantifying Availability in SCADA Environments Using the Cyber Security Metric MFC

    SciTech Connect (OSTI)

    Aissa, Anis Ben; Rabai, Latifa Ben Arfa; Abercrombie, Robert K; Sheldon, Frederick T; Mili, Ali

    2014-01-01

    Supervisory Control and Data Acquisition (SCADA) systems are distributed networks dispersed over large geographic areas that aim to monitor and control industrial processes from remote areas and/or a centralized location. They are used in the management of critical infrastructures such as electric power generation, transmission and distribution, water and sewage, manufacturing/industrial manufacturing as well as oil and gas production. The availability of SCADA systems is tantamount to assuring safety, security and profitability. SCADA systems are the backbone of the national cyber-physical critical infrastructure. Herein, we explore the definition and quantification of an econometric measure of availability, as it applies to SCADA systems; our metric is a specialization of the generic measure of mean failure cost.

  11. Anomaly metrics to differentiate threat sources from benign sources in primary vehicle screening.

    SciTech Connect (OSTI)

    Cohen, Israel Dov; Mengesha, Wondwosen

    2011-09-01

    Discrimination of benign sources from threat sources at Ports of Entry (POE) is of great importance in efficient screening of cargo and vehicles using Radiation Portal Monitors (RPM). Currently, RPMs' ability to distinguish these radiological sources is seriously hampered by their energy resolution. As naturally occurring radioactive materials (NORM) are ubiquitous in commerce, false alarms are problematic as they require additional resources in secondary inspection in addition to impacts on commerce. To increase the sensitivity of such detection systems without increasing false alarm rates, alarm metrics need to incorporate the ability to distinguish benign and threat sources. Principal component analysis (PCA) and clustering techniques were implemented in the present study. These techniques were investigated for their potential to lower false alarm rates and/or increase sensitivity to weaker threat sources without loss of specificity. Results of the investigation demonstrated improved sensitivity and specificity in discriminating benign sources from threat sources.
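
    The following is a hedged sketch of the general pattern described (PCA followed by clustering) applied to synthetic spectra; it is not the authors' pipeline, and the spectral shapes, cluster count, and library choices are assumptions.

```python
# Synthetic example: 64-channel spectra, with a small "threat-like" group
# carrying an extra peak; PCA scores are then clustered.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
norm_like = rng.poisson(lam=50, size=(200, 64)).astype(float)
threat_like = rng.poisson(lam=50, size=(20, 64)).astype(float)
threat_like[:, 40:44] += rng.poisson(lam=30, size=(20, 4))   # extra spectral peak

X = np.vstack([norm_like, threat_like])
scores = PCA(n_components=3).fit_transform(X)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
print(np.bincount(labels))   # cluster sizes; the small cluster flags the anomaly
```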

  12. Method and system for assigning a confidence metric for automated determination of optic disc location

    DOE Patents [OSTI]

    Karnowski, Thomas P.; Tobin, Jr., Kenneth W.; Muthusamy Govindasamy, Vijaya Priya; Chaum, Edward

    2012-07-10

    A method for assigning a confidence metric for automated determination of optic disc location that includes analyzing a retinal image and determining at least two sets of coordinates locating an optic disc in the retinal image. The sets of coordinates can be determined using first and second image analysis techniques that are different from one another. An accuracy parameter can be calculated and compared to a primary risk cut-off value. A high confidence level can be assigned to the retinal image if the accuracy parameter is less than the primary risk cut-off value and a low confidence level can be assigned to the retinal image if the accuracy parameter is greater than the primary risk cut-off value. The primary risk cut-off value is selected to represent an acceptable risk of misdiagnosis, by the automated technique, of a disease having retinal manifestations.
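
    A minimal sketch of the confidence-assignment logic described above: two independent location estimates are compared, their disagreement serves as the accuracy parameter, and it is tested against a primary risk cut-off. The distance measure and cut-off value here are assumptions, not those of the patent.

```python
# Hedged sketch: assign "high" confidence when two independent optic-disc
# location estimates agree to within a chosen risk cut-off (in pixels).
import math

def confidence(loc_a, loc_b, primary_risk_cutoff=25.0):
    accuracy = math.dist(loc_a, loc_b)     # disagreement between the two methods
    return "high" if accuracy < primary_risk_cutoff else "low"

print(confidence((412, 298), (418, 305)))  # hypothetical pixel coordinates
```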

  13. Interval Data Analysis with the Energy Charting and Metrics Tool (ECAM)

    SciTech Connect (OSTI)

    Taasevigen, Danny J.; Katipamula, Srinivas; Koran, William

    2011-07-07

    Analyzing whole building interval data is an inexpensive but effective way to identify and improve building operations, and ultimately save money. Utilizing the Energy Charting and Metrics Tool (ECAM) add-in for Microsoft Excel, building operators and managers can begin implementing changes to their Building Automation System (BAS) after trending the interval data. The two data components needed for full analyses are whole building electricity consumption (kW or kWh) and outdoor air temperature (OAT). Using these two pieces of information, a series of plots and charts can be created in ECAM to monitor the building's performance over time, gain knowledge of how the building is operating, and make adjustments to the BAS to improve efficiency and start saving money.
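
    A small sketch, not ECAM itself, of the kind of interval-data view the tool produces: whole-building demand plotted against outdoor air temperature, split by an assumed occupancy schedule. The file name and column names are assumptions.

```python
# Hedged sketch of a kW-vs-OAT scatter from whole-building interval data.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("interval_data.csv", parse_dates=["timestamp"])   # assumed file/columns
df["occupied"] = df["timestamp"].dt.hour.between(7, 18)            # assumed schedule

fig, ax = plt.subplots()
for occ, grp in df.groupby("occupied"):
    ax.scatter(grp["oat_f"], grp["kw"], s=4, label="occupied" if occ else "unoccupied")
ax.set_xlabel("Outdoor air temperature (deg F)")
ax.set_ylabel("Whole-building demand (kW)")
ax.legend()
plt.show()
```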

  14. Marker-free registration of forest terrestrial laser scanner data pairs with embedded confidence metrics

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Van Aardt, Jan; Romanczyk, Paul; van Leeuwen, Martin; Kelbe, David; Cawse-Nicholson, Kerry

    2016-04-04

    Terrestrial laser scanning (TLS) has emerged as an effective tool for rapid comprehensive measurement of object structure. Registration of TLS data is an important prerequisite to overcome the limitations of occlusion. However, due to the high dissimilarity of point cloud data collected from disparate viewpoints in the forest environment, adequate marker-free registration approaches have not been developed. The majority of studies instead rely on the utilization of artificial tie points (e.g., reflective tooling balls) placed within a scene to aid in coordinate transformation. We present a technique for generating view-invariant feature descriptors that are intrinsic to the point cloud data and, thus, enable blind marker-free registration in forest environments. To overcome the limitation of initial pose estimation, we employ a voting method to blindly determine the optimal pairwise transformation parameters, without an a priori estimate of the initial sensor pose. To provide embedded error metrics, we developed a set theory framework in which a circular transformation is traversed between disjoint tie point subsets. This provides an upper estimate of the Root Mean Square Error (RMSE) confidence associated with each pairwise transformation. Output RMSE errors are commensurate with the RMSE of input tie point locations. Thus, while the mean output RMSE=16.3cm, improved results could be achieved with a more precise laser scanning system. This study 1) quantifies the RMSE of the proposed marker-free registration approach, 2) assesses the validity of embedded confidence metrics using receiver operator characteristic (ROC) curves, and 3) informs optimal sample spacing considerations for TLS data collection in New England forests. Furthermore, while the implications for rapid, accurate, and precise forest inventory are obvious, the conceptual framework outlined here could potentially be extended to built environments.

  15. Measuring solar reflectance Part I: Defining a metric that accurately predicts solar heat gain

    SciTech Connect (OSTI)

    Levinson, Ronnen; Akbari, Hashem; Berdahl, Paul

    2010-05-14

    Solar reflectance can vary with the spectral and angular distributions of incident sunlight, which in turn depend on surface orientation, solar position and atmospheric conditions. A widely used solar reflectance metric based on the ASTM Standard E891 beam-normal solar spectral irradiance underestimates the solar heat gain of a spectrally selective 'cool colored' surface because this irradiance contains a greater fraction of near-infrared light than typically found in ordinary (unconcentrated) global sunlight. At mainland U.S. latitudes, this metric RE891BN can underestimate the annual peak solar heat gain of a typical roof or pavement (slope {le} 5:12 [23{sup o}]) by as much as 89 W m{sup -2}, and underestimate its peak surface temperature by up to 5 K. Using R{sub E891BN} to characterize roofs in a building energy simulation can exaggerate the economic value N of annual cool-roof net energy savings by as much as 23%. We define clear-sky air mass one global horizontal ('AM1GH') solar reflectance R{sub g,0}, a simple and easily measured property that more accurately predicts solar heat gain. R{sub g,0} predicts the annual peak solar heat gain of a roof or pavement to within 2 W m{sup -2}, and overestimates N by no more than 3%. R{sub g,0} is well suited to rating the solar reflectances of roofs, pavements and walls. We show in Part II that R{sub g,0} can be easily and accurately measured with a pyranometer, a solar spectrophotometer or version 6 of the Solar Spectrum Reflectometer.

  16. Conceptual Framework for Developing Resilience Metrics for the Electricity, Oil, and Gas Sectors in the United States

    SciTech Connect (OSTI)

    Watson, Jean-Paul; Guttromson, Ross; Silva-Monroy, Cesar; Jeffers, Robert; Jones, Katherine; Ellison, James; Rath, Charles; Gearhart, Jared; Jones, Dean; Corbet, Tom; Hanley, Charles; Walker, La Tonya

    2014-09-01

    This report has been written for the Department of Energy’s Energy Policy and Systems Analysis Office to inform their writing of the Quadrennial Energy Review in the area of energy resilience. The topics of measuring and increasing energy resilience are addressed, including definitions, means of measuring, and analytic methodologies that can be used to make decisions for policy, infrastructure planning, and operations. A risk-based framework is presented which provides a standard definition of a resilience metric. Additionally, a process is identified which explains how the metrics can be applied. Research and development is articulated that will further accelerate the resilience of energy infrastructures.

  17. A Year of Radiation Measurements at the North Slope of Alaska Second Quarter 2009 ARM and Climate Change Prediction Program Metric Report

    SciTech Connect (OSTI)

    S.A. McFarlane, Y. Shi, C.N. Long

    2009-04-15

    In 2009, the Atmospheric Radiation Measurement (ARM) Program and the Climate Change Prediction Program (CCPP) have been asked to produce joint science metrics. For CCPP, the second quarter metrics are reported in Evaluation of Simulated Precipitation in CCSM3: Annual Cycle Performance Metrics at Watershed Scales. For ARM, the metrics will produce and make available new continuous time series of radiative fluxes based on one year of observations from Barrow, Alaska, during the International Polar Year and report on comparisons of observations with baseline simulations of the Community Climate System Model (CCSM).

  18. Recommendations for mass spectrometry data quality metrics for open access data(corollary to the Amsterdam principles)

    SciTech Connect (OSTI)

    Kingsinger, Christopher R.; Apffel, James; Baker, Mark S.; Bian, Xiaopeng; Borchers, Christoph H.; Bradshaw, Ralph A.; Brusniak, Mi-Youn; Chan, Daniel W.; Deutsch, Eric W.; Domon, Bruno; Gorman, Jeff; Grimm, Rudolf; Hancock, William S.; Hermjakob, Henning; Horn, David; Hunter, Christie; Kolar, Patrik; Kraus, Hans-Joachim; Langen, Hanno; Linding, Rune; Moritz, Robert L.; Omenn, Gilbert S.; Orlando, Ron; Pandey, Akhilesh; Ping, Peipei; Rahbar, Amir; Rivers, Robert; Seymour, Sean L.; Simpson, Richard J.; Slotta, Douglas; Smith, Richard D.; Stein, Stephen E.; Tabb, David L.; Tagle, Danilo; Yates, John R.; Rodriguez, Henry

    2011-12-01

    Policies supporting the rapid and open sharing of proteomic data are being implemented by the leading journals in the field. The proteomics community is taking steps to ensure that data are made publicly accessible and are of high quality, a challenging task that requires the development and deployment of methods for measuring and documenting data quality metrics. On September 18, 2010, the U.S. National Cancer Institute (NCI) convened the 'International Workshop on Proteomic Data Quality Metrics' in Sydney, Australia, to identify and address issues facing the development and use of such methods for open access proteomics data. The stakeholders at the workshop enumerated the key principles underlying a framework for data quality assessment in mass spectrometry data that will meet the needs of the research community, journals, funding agencies, and data repositories. Attendees discussed and agreed upon two primary needs for the wide use of quality metrics: (i) an evolving list of comprehensive quality metrics and (ii) standards accompanied by software analytics. Attendees stressed the importance of increased education and training programs to promote reliable protocols in proteomics. This workshop report explores the historic precedents, key discussions, and necessary next steps to enhance the quality of open access data. By agreement, this article is published simultaneously in Proteomics, Proteomics Clinical Applications, Journal of Proteome Research, and Molecular and Cellular Proteomics, as a public service to the research community. The peer review process was a coordinated effort conducted by a panel of referees selected by the journals.

  19. Implementation Guide - Performance Indicators (Metrics ) for Use with DOE O 440.2B, Aviation Management and Safety

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    2005-09-19

    The Guide provides information regarding specific provisions of DOE O 440.2B and is intended to be useful in understanding and implementing performance indicators (metrics) required by the Order. Cancels DOE G 440.2B-1. Canceled by DOE N 251.98.

  20. Implementation Guide - Aviation Program Performance Indicators (Metrics) for use with DOE O 440.2B, Aviation Management And Safety

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    2002-12-10

    The Guide provides information regarding Departmental expectations on provisions of DOE 440.2B, identifies acceptable methods of implementing Aviation Program Performance Indicators (Metrics) requirements in the Order, and identifies relevant principles and practices by referencing Government and non-Government standards. Canceled by DOE G 440.2B-1A.

  1. Use of Frequency Response Metrics to Assess the Planning and Operating Requirements for Reliable Integration of Variable Renewable Generation

    SciTech Connect (OSTI)

    Eto, Joseph H.; Undrill, John; Mackin, Peter; Daschmans, Ron; Williams, Ben; Haney, Brian; Hunt, Randall; Ellis, Jeff; Illian, Howard; Martinez, Carlos; O'Malley, Mark; Coughlin, Katie; LaCommare, Kristina Hamachi

    2010-12-20

    An interconnected electric power system is a complex system that must be operated within a safe frequency range in order to reliably maintain the instantaneous balance between generation and load. This is accomplished by ensuring that adequate resources are available to respond to expected and unexpected imbalances and restoring frequency to its scheduled value in order to ensure uninterrupted electric service to customers. Electrical systems must be flexible enough to reliably operate under a variety of "change" scenarios. System planners and operators must understand how other parts of the system change in response to the initial change, and need tools to manage such changes to ensure reliable operation within the scheduled frequency range. This report presents a systematic approach to identifying metrics that are useful for operating and planning a reliable system with increased amounts of variable renewable generation, which builds on existing industry practices for frequency control after unexpected loss of a large amount of generation. The report introduces a set of metrics or tools for measuring the adequacy of frequency response within an interconnection. Based on the concept of the frequency nadir, these metrics take advantage of new information gathering and processing capabilities that system operators are developing for wide-area situational awareness. Primary frequency response is the leading metric that will be used by this report to assess the adequacy of primary frequency control reserves necessary to ensure reliable operation. It measures what is needed to arrest frequency decline (i.e., to establish frequency nadir) at a frequency higher than the highest set point for under-frequency load shedding within an interconnection. These metrics can be used to guide the reliable operation of an interconnection under changing circumstances.
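
    As a rough, hedged sketch of a frequency-nadir style metric in the spirit of those the report discusses (not its actual definitions), the code below finds the nadir of a post-event frequency trace and expresses primary frequency response in MW per 0.1 Hz of decline; the trace and numbers are synthetic.

```python
# Hedged sketch: frequency nadir and a simple MW-per-0.1-Hz response measure.
import numpy as np

def nadir_metrics(freq_hz, mw_lost, f_nominal=60.0):
    nadir = float(np.min(freq_hz))
    decline_hz = f_nominal - nadir
    response_mw_per_01hz = mw_lost / (decline_hz / 0.1)
    return nadir, response_mw_per_01hz

t = np.linspace(0.0, 60.0, 601)
freq = 60.0 - 0.25 * np.exp(-((t - 8.0) / 6.0) ** 2)   # synthetic post-event trace
print(nadir_metrics(freq, mw_lost=2000.0))
```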

  2. Methodological Framework for Analysis of Buildings-Related Programs: The GPRA Metrics Effort

    SciTech Connect (OSTI)

    Elliott, Douglas B.; Anderson, Dave M.; Belzer, David B.; Cort, Katherine A.; Dirks, James A.; Hostick, Donna J.

    2004-06-18

    The requirements of the Government Performance and Results Act (GPRA) of 1993 mandate the reporting of outcomes expected to result from programs of the Federal government. The U.S. Department of Energy’s (DOE’s) Office of Energy Efficiency and Renewable Energy (EERE) develops official metrics for its 11 major programs using its Office of Planning, Budget Formulation, and Analysis (OPBFA). OPBFA conducts an annual integrated modeling analysis to produce estimates of the energy, environmental, and financial benefits expected from EERE’s budget request. Two of EERE’s major programs include the Building Technologies Program (BT) and Office of Weatherization and Intergovernmental Program (WIP). Pacific Northwest National Laboratory (PNNL) supports the OPBFA effort by developing the program characterizations and other market information affecting these programs that is necessary to provide input to the EERE integrated modeling analysis. Throughout the report we refer to these programs as “buildings-related” programs, because the approach is not limited in application to BT or WIP. To adequately support OPBFA in the development of official GPRA metrics, PNNL communicates with the various activities and projects in BT and WIP to determine how best to characterize their activities planned for the upcoming budget request. PNNL then analyzes these projects to determine what the results of the characterizations would imply for energy markets, technology markets, and consumer behavior. This is accomplished by developing nonintegrated estimates of energy, environmental, and financial benefits (i.e., outcomes) of the technologies and practices expected to result from the budget request. These characterizations and nonintegrated modeling results are provided to OPBFA as inputs to the official benefits estimates developed for the Federal Budget. This report documents the approach and methodology used to estimate future energy, environmental, and financial benefits

  3. New risk metrics and mathematical tools for risk analysis: Current and future challenges

    SciTech Connect (OSTI)

    Skandamis, Panagiotis N.; Andritsos, Nikolaos; Psomas, Antonios; Paramythiotis, Spyridon

    2015-01-22

    The current status of food safety worldwide has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports that detail the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To do that, a series of new Risk Metrics has been established as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, defined by quantitative risk assessments, and used to establish ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that ALOP is rather a metric of the tolerable public health burden (it addresses the total ‘failure’ that may be handled at a national level), it is difficult to interpret in terms of control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products have been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve the FSO, tools quantifying the effect of processes and intrinsic properties of foods on survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to Food Safety Management. Predictive modelling is the basis of exposure assessment and of the development of stochastic and kinetic models, which are also available in the form of Web-based applications (e.g., COMBASE and the Microbial Responses Viewer), or introduced into user
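
    The ALOP/FSO relationship described above is usually operationalized with an ICMSF-style inequality, H0 − ΣR + ΣI ≤ FSO, with all terms in log10 units. A minimal sketch, assuming hypothetical hazard levels and process steps:

```python
# Hedged sketch of the ICMSF-style inequality often used with these metrics:
# H0 - sum(reductions) + sum(increases) <= FSO, all in log10 cfu/g.
# The hazard levels and process steps below are hypothetical.

def meets_fso(h0: float, reductions: list[float], increases: list[float], fso: float) -> bool:
    final_level = h0 - sum(reductions) + sum(increases)
    return final_level <= fso

# Hypothetical example: initial contamination of 3 log cfu/g, a 6-log cook step,
# 1 log of growth during distribution, checked against an FSO of -2 log cfu/g.
print(meets_fso(h0=3.0, reductions=[6.0], increases=[1.0], fso=-2.0))  # True
```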

  4. The International Safeguards Technology Base: How is the Patient Doing? An Exploration of Effective Metrics

    SciTech Connect (OSTI)

    Schanfein, Mark; Gouveia, Fernando; Crawford, Cary E.; Pickett, Chris J.; Jay, Jeffrey

    2010-07-15

    The term “Technology Base” is commonly used, but what does it mean? Is there a common understanding of the components that comprise a technology base? Does a formal process exist to assess the health of a given technology base? These are important questions, whose relevance is even more pressing given the USDOE/NNSA initiatives to strengthen the safeguards technology base through investments in research & development and human capital development. Accordingly, the authors will establish a high-level framework to define and understand what comprises a technology base. Potential goal-driven metrics to assess the health of a technology base will also be explored, such as linear demographics and resource availability, in the hope that they can be used to better understand and improve the health of the U.S. safeguards technology base. Finally, through the identification of such metrics, the authors will offer suggestions and highlight choices for addressing potential shortfalls. Introduction: The U.S. safeguards technology base got its start almost half a century ago in the nuclear weapons program of the U.S. Department of Energy/National Nuclear Security Administration (DOE/NNSA) and their predecessors: AEC & ERDA. Due to the strategic importance and value of nuclear materials, the associated risks to public and worker health, and the potential for theft, significant investments were made to develop techniques to measure nuclear materials using both destructive assay (DA) and non-destructive assay (NDA). Major investment within the U.S. DOE Domestic Safeguards Program continued over the next three decades, resulting in continuous improvements in the state-of-the-art of these techniques. This was particularly true in the area of NDA, with its ability to use gamma rays, neutrons, and heat to identify and quantify nuclear materials without the need to take direct samples of the material. Most of these techniques were commercialized and transferred to

  5. ZFS on RBODs - Leveraging RAID Controllers for Metrics and Enclosure Management

    SciTech Connect (OSTI)

    Stearman, D. M.

    2015-03-30

    Traditionally, the Lustre file system has relied on the ldiskfs file system with reliable RAID (Redundant Array of Independent Disks) storage underneath. As of Lustre 2.4, ZFS was added as a backend file system, with built-in software RAID, thereby removing the need for expensive RAID controllers. ZFS was designed to work with JBOD (Just a Bunch Of Disks) storage enclosures under the Solaris Operating System, which provided a rich device management system. Long time users of the Lustre file system have relied on the RAID controllers to provide metrics and enclosure monitoring and management services, with rich APIs and command line interfaces. This paper studies a hybrid approach using an advanced, full-featured RAID enclosure that is presented to the host as a JBOD. This RBOD (RAIDed Bunch Of Disks) allows ZFS to do the RAID protection and error correction, while the RAID controller handles management of the disks and monitors the enclosure. It was hoped that the value of the RAID controller features would offset the additional cost, and that performance would not suffer in this mode. The test results revealed that the hybrid RBOD approach did suffer reduced performance.

  6. Advanced Fuels Campaign Light Water Reactor Accident Tolerant Fuel Performance Metrics Executive Summary

    SciTech Connect (OSTI)

    Shannon Bragg-Sitton

    2014-02-01

    Research and development (R&D) activities on advanced, higher performance Light Water Reactor (LWR) fuels have been ongoing for the last few years. Following the unfortunate March 2011 events at the Fukushima Nuclear Power Plant in Japan, the R&D shifted toward enhancing the accident tolerance of LWRs. Qualitative attributes for fuels with enhanced accident tolerance, such as improved reaction kinetics with steam resulting in slower hydrogen generation rate, provide guidance for the design and development of fuels and cladding with enhanced accident tolerance. A common set of technical metrics should be established to aid in the optimization and down selection of candidate designs on a more quantitative basis. “Metrics” describe a set of technical bases by which multiple concepts can be fairly evaluated against a common baseline and against one another. This report describes a proposed technical evaluation methodology that can be applied to evaluate the ability of each concept to meet performance and safety goals relative to the current UO2 – zirconium alloy system and relative to one another. The resultant ranked evaluation can then inform concept down-selection, such that the most promising accident tolerant fuel design option(s) can continue to be developed toward qualification.

  7. En route to Background Independence: Broken split-symmetry, and how to restore it with bi-metric average actions

    SciTech Connect (OSTI)

    Becker, D. Reuter, M.

    2014-11-15

    The most momentous requirement a quantum theory of gravity must satisfy is Background Independence, necessitating in particular an ab initio derivation of the arena all non-gravitational physics takes place in, namely spacetime. Using the background field technique, this requirement translates into the condition of an unbroken split-symmetry connecting the (quantized) metric fluctuations to the (classical) background metric. If the regularization scheme used violates split-symmetry during the quantization process it is mandatory to restore it in the end at the level of observable physics. In this paper we present a detailed investigation of split-symmetry breaking and restoration within the Effective Average Action (EAA) approach to Quantum Einstein Gravity (QEG) with a special emphasis on the Asymptotic Safety conjecture. In particular we demonstrate for the first time in a non-trivial setting that the two key requirements of Background Independence and Asymptotic Safety can be satisfied simultaneously. Carefully disentangling fluctuation and background fields, we employ a ‘bi-metric’ ansatz for the EAA and project the flow generated by its functional renormalization group equation on a truncated theory space spanned by two separate Einstein–Hilbert actions for the dynamical and the background metric, respectively. A new powerful method is used to derive the corresponding renormalization group (RG) equations for the Newton- and cosmological constant, both in the dynamical and the background sector. We classify and analyze their solutions in detail, determine their fixed point structure, and identify an attractor mechanism which turns out instrumental in the split-symmetry restoration. We show that there exists a subset of RG trajectories which are both asymptotically safe and split-symmetry restoring: In the ultraviolet they emanate from a non-Gaussian fixed point, and in the infrared they lose all symmetry-violating contributions inflicted on them by the

  8. DOE-HDBK-1122-99; Radiological Control Technician Training

    Energy Savers [EERE]

    ... METRIC PREFIXES (excerpt):

    Prefix   Factor   Symbol      Prefix   Factor   Symbol
    yotta    10^24    Y           deci     10^-1    d
    zetta    10^21    Z           centi    10^-2    c
    exa      10^18    E           milli    10^-3    m
    peta     10^15    P           micro    10^-6    µ
    tera     10^12    T           nano     10^-9    n
    ...

  9. Simulation information regarding Sandia National Laboratories' Trinity capability improvement metric.

    SciTech Connect (OSTI)

    Agelastos, Anthony Michael; Lin, Paul T.

    2013-10-01

    Sandia National Laboratories, Los Alamos National Laboratory, and Lawrence Livermore National Laboratory each selected a representative simulation code to be used as a performance benchmark for the Trinity Capability Improvement Metric. Sandia selected SIERRA Low Mach Module: Nalu, which is a fluid dynamics code that solves many variable-density, acoustically incompressible problems of interest spanning from laminar to turbulent flow regimes, since it is fairly representative of implicit codes that have been developed under ASC. The simulations for this metric were performed on the Cielo Cray XE6 platform during dedicated application time and the chosen case utilized 131,072 Cielo cores to perform a canonical turbulent open jet simulation within an approximately 9-billion-element unstructured-hexahedral computational mesh. This report will document some of the results from these simulations as well as provide instructions to perform these simulations for comparison.

  10. Advanced Fuels Campaign Light Water Reactor Accident Tolerant Fuel Performance Metrics

    SciTech Connect (OSTI)

    Brad Merrill; Melissa Teague; Robert Youngblood; Larry Ott; Kevin Robb; Michael Todosow; Chris Stanek; Mitchell Farmer; Michael Billone; Robert Montgomery; Nicholas Brown; Shannon Bragg-Sitton

    2014-02-01

    The safe, reliable and economic operation of the nation’s nuclear power reactor fleet has always been a top priority for the United States’ nuclear industry. As a result, continual improvement of technology, including advanced materials and nuclear fuels, remains central to industry’s success. Decades of research combined with continual operation have produced steady advancements in technology and yielded an extensive base of data, experience, and knowledge on light water reactor (LWR) fuel performance under both normal and accident conditions. In 2011, following the Great East Japan Earthquake, resulting tsunami, and subsequent damage to the Fukushima Daiichi nuclear power plant complex, enhancing the accident tolerance of LWRs became a topic of serious discussion. As a result of direction from the U.S. Congress, the U.S. Department of Energy Office of Nuclear Energy (DOE-NE) initiated an Accident Tolerant Fuel (ATF) Development program. The complex multiphysics behavior of LWR nuclear fuel makes defining specific material or design improvements difficult; as such, establishing qualitative attributes is critical to guide the design and development of fuels and cladding with enhanced accident tolerance. This report summarizes a common set of technical evaluation metrics to aid in the optimization and down selection of candidate designs. As used herein, “metrics” describe a set of technical bases by which multiple concepts can be fairly evaluated against a common baseline and against one another. Furthermore, this report describes a proposed technical evaluation methodology that can be applied to assess the ability of each concept to meet performance and safety goals relative to the current UO2 – zirconium alloy system and relative to one another. The resultant ranked evaluation can then inform concept down-selection, such that the most promising accident tolerant fuel design option(s) can continue to be developed for lead test rod or lead test assembly

  11. Douglas Factors

    Broader source: Energy.gov [DOE]

    The Merit Systems Protection Board in its landmark decision, Douglas vs. Veterans Administration, 5 MSPR 280, established criteria that supervisors must consider in determining an appropriate penalty to impose for an act of employee misconduct. These twelve factors are commonly referred to as “Douglas Factors” and have been incorporated into the Federal Aviation Administration (FAA) Personnel Management System and various FAA Labor Agreements.

  12. New Pathways and Metrics for Enhanced, Reversible Hydrogen Storage in Boron-Doped Carbon Nanospaces

    SciTech Connect (OSTI)

    Pfeifer, Peter; Wexler, Carlos; Hawthorne, M. Frederick; Lee, Mark W.; Jalistegi, Satish S.

    2014-08-14

    This project, since its start in 2007—entitled “Networks of boron-doped carbon nanopores for low-pressure reversible hydrogen storage” (2007-10) and “New pathways and metrics for enhanced, reversible hydrogen storage in boron-doped carbon nanospaces” (2010-13)—is in support of the DOE's National Hydrogen Storage Project, as part of the DOE Hydrogen and Fuel Cells Program’s comprehensive efforts to enable the widespread commercialization of hydrogen and fuel cell technologies in diverse sectors of the economy. Hydrogen storage is widely recognized as a critical enabling technology for the successful commercialization and market acceptance of hydrogen powered vehicles. Storing sufficient hydrogen on board a wide range of vehicle platforms, at energy densities comparable to gasoline, without compromising passenger or cargo space, remains an outstanding technical challenge. Of the main three thrust areas in 2007—metal hydrides, chemical hydrogen storage, and sorption-based hydrogen storage—sorption-based storage, i.e., storage of molecular hydrogen by adsorption on high-surface-area materials (carbons, metal-organic frameworks, and other porous organic networks), has emerged as the most promising path toward achieving the 2017 DOE storage targets of 0.055 kg H2/kg system (“5.5 wt%”) and 0.040 kg H2/liter system. The objective of the project is to develop high-surface-area carbon materials that are boron-doped by incorporation of boron into the carbon lattice at the outset, i.e., during the synthesis of the material. The rationale for boron-doping is the prediction that boron atoms in carbon will raise the binding energy of hydrogen from 4-5 kJ/mol on the undoped surface to 10-14 kJ/mol on a doped surface, and accordingly the hydrogen storage capacity of the material. The mechanism for the increase in binding energy is electron donation from H2 to electron-deficient B atoms, in the form of sp2 boron-carbon bonds. Our team is proud to have

  13. User's Guide to Pre-Processing Data in Universal Translator 2 for the Energy Charting and Metrics Tool (ECAM)

    SciTech Connect (OSTI)

    Taasevigen, Danny J.

    2011-11-30

    This document is a user's guide for the Energy Charting and Metrics (ECAM) tool to facilitate the examination of energy information from buildings, reducing the time spent analyzing trend and utility meter data. This user guide was generated to help pre-process data with the intention of utilizing ECAM to improve building operational efficiency. There are numerous occasions when the metered data received from the building automation system (BAS) is not in a format acceptable to ECAM. This includes, but is not limited to, cases such as inconsistent time-stamps for the trends (e.g., each trend has its own time-stamp), data with holes (e.g., some time-stamps have data and others are missing data), each point in the BAS being trended and exported into an individual .csv or .txt file, a time-stamp that is unrecognizable by ECAM, etc. After reading through this user guide, the user should be able to pre-process all data files and be ready to use this data in ECAM to improve their building operational efficiency.
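
    As an illustration of the kind of pre-processing the guide describes (merging per-point trend files onto one shared time-stamp column and patching short holes), here is a minimal pandas sketch; the directory layout, column name, and 15-minute interval are assumptions, and this is not the Universal Translator or ECAM itself.

```python
# Illustrative pre-processing sketch (not the Universal Translator or ECAM itself):
# merge per-point trend files onto one shared time-stamp index and fill small gaps.
# File and column names are hypothetical.
import glob
import pandas as pd

frames = []
for path in glob.glob("trends/*.csv"):          # one .csv per BAS point (assumed layout)
    df = pd.read_csv(path, parse_dates=["Timestamp"], index_col="Timestamp")
    frames.append(df)

merged = pd.concat(frames, axis=1)               # align every point on a common index
merged = merged.resample("15min").mean()         # regularize inconsistent time-stamps
merged = merged.interpolate(limit=4)             # patch short holes in the data
merged.to_csv("ecam_ready.csv")                  # single file with one shared time-stamp column
```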

  14. Development of Metric for Measuring the Impact of RD&D Funding on GTO's Geothermal Exploration Goals (Presentation)

    SciTech Connect (OSTI)

    Jenne, S.; Young, K. R.; Thorsteinsson, H.

    2013-04-01

    The Department of Energy's Geothermal Technologies Office (GTO) provides RD&D funding for geothermal exploration technologies with the goal of lowering the risks and costs of geothermal development and exploration. In 2012, NREL was tasked with developing a metric to measure the impacts of this RD&D funding on the cost and time required for exploration activities. The development of this metric included collecting cost and time data for exploration techniques, creating a baseline suite of exploration techniques to which future exploration and cost and time improvements could be compared, and developing an online tool for graphically showing potential project impacts (all available at http://en.openei.org/wiki/Gateway:Geothermal). The conference paper describes the methodology used to define the baseline exploration suite of techniques (baseline), as well as the approach that was used to create the cost and time data set that populates the baseline. The resulting product, an online tool for measuring impact, and the aggregated cost and time data are available on the Open EI website for public access (http://en.openei.org).

  15. Knowledge-based prediction of plan quality metrics in intracranial stereotactic radiosurgery

    SciTech Connect (OSTI)

    Shiraishi, Satomi; Moore, Kevin L.; Tan, Jun; Olsen, Lindsey A.

    2015-02-15

    Purpose: The objective of this work was to develop a comprehensive knowledge-based methodology for predicting achievable dose–volume histograms (DVHs) and highly precise DVH-based quality metrics (QMs) in stereotactic radiosurgery/radiotherapy (SRS/SRT) plans. Accurate QM estimation can identify suboptimal treatment plans and provide target optimization objectives to standardize and improve treatment planning. Methods: Correlating observed dose as it relates to the geometric relationship of organs-at-risk (OARs) to planning target volumes (PTVs) yields mathematical models to predict achievable DVHs. In SRS, DVH-based QMs such as brain V10Gy (volume receiving 10 Gy or more), gradient measure (GM), and conformity index (CI) are used to evaluate plan quality. This study encompasses 223 linear accelerator-based SRS/SRT treatment plans (SRS plans) using volumetric-modulated arc therapy (VMAT), representing 95% of the institution’s VMAT radiosurgery load from the past four and a half years. Unfiltered models that use all available plans for the model training were built for each category with a stratification scheme based on target and OAR characteristics determined emergently through the initial modeling process. Model predictive accuracy is measured by the mean and standard deviation of the difference between clinical and predicted QMs, δQM = QM_clin − QM_pred, and a coefficient of determination, R². For categories with a large number of plans, refined models are constructed by automatic elimination of suspected suboptimal plans from the training set. Using the refined model as a presumed achievable standard, potentially suboptimal plans are identified. Predictions of QM improvement are validated via standardized replanning of 20 suspected suboptimal plans based on dosimetric predictions. The significance of the QM improvement is evaluated using the Wilcoxon signed rank test. Results: The most accurate predictions are obtained when plans are
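
    A minimal sketch of the accuracy measures named above, δQM = QM_clin − QM_pred and R², using hypothetical QM values:

```python
# Minimal sketch of the accuracy measures named in the abstract:
# dQM = QM_clin - QM_pred (mean and standard deviation) and a coefficient of
# determination R^2. The QM values below are hypothetical.
import numpy as np

qm_clin = np.array([3.10, 2.85, 3.40, 2.95, 3.20])   # e.g., clinical gradient measures (cm)
qm_pred = np.array([3.05, 2.90, 3.30, 3.00, 3.15])   # model-predicted values

d_qm = qm_clin - qm_pred
ss_res = np.sum(d_qm ** 2)
ss_tot = np.sum((qm_clin - qm_clin.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot

print(f"mean dQM = {d_qm.mean():+.3f}, sd = {d_qm.std(ddof=1):.3f}, R^2 = {r_squared:.3f}")
```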

  16. Impact of ASTM Standard E722 update on radiation damage metrics.

    SciTech Connect (OSTI)

    DePriest, Kendall Russell

    2014-06-01

    The impact of recent changes to ASTM Standard E722 is investigated. The methodological changes in the production of the displacement kerma factors for silicon have a significant impact for some energy regions of the 1-MeV(Si) equivalent fluence response function. When evaluating the integral over all neutron energies in various spectra important to the SNL electronics testing community, the change in the response results in an increase in the total 1-MeV(Si) equivalent fluence of 2-7%. Response functions have been produced and are available for users of both the NuGET and MCNP codes.
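
    The 1-MeV(Si) equivalent fluence referenced above is obtained by folding a neutron spectrum with the silicon displacement-damage response and normalizing to the damage factor at 1 MeV. A coarse-group sketch with hypothetical numbers (not the ASTM E722 tabulation):

```python
# Sketch of the 1-MeV(Si) equivalent fluence construct discussed in the report:
# fold a neutron spectrum with silicon displacement-damage factors and normalize
# to the damage factor at 1 MeV. Group structure and values are hypothetical.
import numpy as np

group_fluence = np.array([1.0e12, 5.0e11, 2.0e11])   # n/cm^2 in three coarse groups (hypothetical)
damage_factor = np.array([30.0, 95.0, 150.0])        # damage response per group (hypothetical)
damage_factor_1mev = 95.0                            # damage factor at 1 MeV (hypothetical)

phi_eq_1mev = np.sum(group_fluence * damage_factor) / damage_factor_1mev
print(f"1-MeV(Si) equivalent fluence ~ {phi_eq_1mev:.3e} n/cm^2")
```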

  17. DOE Safety Metrics Indicator Program (SMIP) Fiscal Year 2000 Annual Report of Packaging- and Transportation-related Occurrences

    SciTech Connect (OSTI)

    Dickerson, L.S.

    2001-07-26

    The Oak Ridge National Laboratory (ORNL) has been charged by the DOE National Transportation Program (NTP) with the responsibility of retrieving reports and information pertaining to packaging and transportation (P&T) incidents from the centralized Occurrence Reporting and Processing System (ORPS) database. These selected reports have been analyzed for trends, impact on P&T operations and safety concerns, and lessons learned (LL) in P&T operations. This task is designed not only to keep the NTP aware of what is occurring at DOE sites on a periodic basis, but also to highlight potential P&T problems that may need management attention and allow dissemination of LL to DOE Operations Offices, with the subsequent flow of information to contractors. The Safety Metrics Indicator Program (SMIP) was established by the NTP in fiscal year (FY) 1998 as an initiative to develop a methodology for reporting occurrences with the appropriate metrics to show rates and trends. One of its chief goals has been to augment historical reporting of occurrence-based information and present more meaningful statistics for comparison of occurrences. To this end, the SMIP established a severity weighting system for the classification of the occurrences, which would allow normalization of the data and provide a basis for trending analyses. The process for application of this methodology is documented in the September 1999 report DOE Packaging and Transportation Measurement Methodology for the Safety Metrics Indicator Program (SMIP). This annual report contains information on those P&T-related occurrences reported to the ORPS during the period from October 1, 1999, through September 30, 2000. Only those incidents that occur in preparation for transport, during transport, and during unloading of hazardous material are considered as packaging- or transportation-related occurrences. Other incidents with P&T significance, but not involving hazardous material (such as vehicle accidents or empty

  18. Table B1. Summary statistics for natural gas in the United States, metric equivalents, 2010-2014

    U.S. Energy Information Administration (EIA) Indexed Site

    Table B1. Summary statistics for natural gas in the United States, metric equivalents, 2010-2014 (excerpt; R = revised; see footnotes at end of table)

                                                  2010       2011       2012       2013         2014
    Number of Wells Producing at End of Year   487,627    514,637    482,822    R 484,994    514,786
    Production (million cubic meters)
      Gross Withdrawals From Gas Wells         375,127    348,044    354,080    R 304,676    294,045
      From Oil Wells                           165,220    167,294    140,617    R 153,044    167,695
      From Coalbed Wells                        54,277     50,377     43,591    R  40,374     36,392
      From Shale Gas Wells                     164,723    240,721    298,257    R 337,891    389,474

  19. A Comparison of Model Short-Range Forecasts and the ARM Microbase Data Fourth Quarter ARM Science Metric

    SciTech Connect (OSTI)

    Hnilo, J.

    2006-09-19

    For the fourth quarter ARM metric we make use of new liquid water data that have become available, called the “Microbase” value-added product (referred to as OBS within the text), at three sites: the North Slope of Alaska (NSA), Tropical West Pacific (TWP) and the Southern Great Plains (SGP), and compare these observations to model forecast data. Two time periods will be analyzed: March 2000 for the SGP and October 2004 for both TWP and NSA. The Microbase data have been averaged to 35 pressure levels (e.g., from 1000 hPa to 100 hPa at 25 hPa increments) and time averaged to 3-hourly data for direct comparison to our model output.

  20. On use of CO{sub 2} chemiluminescence for combustion metrics in natural gas fired reciprocating engines.

    SciTech Connect (OSTI)

    Gupta, S. B.; Bihari, B.; Biruduganti, M.; Sekar, R.; Zigan, J.

    2011-01-01

    Flame chemiluminescence is widely acknowledged to be an indicator of heat release rate in premixed turbulent flames that are representative of gas turbine combustion. Though heat release rate is an important metric for evaluating combustion strategies in reciprocating engine systems, its correlation with flame chemiluminescence is not well studied. To address this gap an experimental study was carried out in a single-cylinder natural gas fired reciprocating engine that could simulate turbocharged conditions with exhaust gas recirculation. Crank angle resolved spectra (266-795 nm) of flame luminosity were measured for various operational conditions by varying the ignition timing for MBT conditions and by holding the speed at 1800 rpm and Brake Mean Effective Pressure (BMEP) at 12 bar. The effect of dilution on CO2* chemiluminescence intensities was studied by varying the global equivalence ratio (0.6-1.0) and by varying the exhaust gas recirculation rate. It was attempted to relate the measured chemiluminescence intensities to thermodynamic metrics of importance to engine research -- in-cylinder bulk gas temperature and heat release rate (HRR) calculated from measured cylinder pressure signals. The peak of the measured CO2* chemiluminescence intensities coincided with peak pressures within ±2 CAD for all test conditions. For each combustion cycle, the peaks of heat release rate, spectral intensity and temperature occurred in that sequence, well separated temporally. The peak heat release rates preceded the peak chemiluminescent emissions by 3.8-9.5 CAD, whereas the peak temperatures trailed by 5.8-15.6 CAD. Such a temporal separation precludes correlations on a crank-angle resolved basis. However, the peak cycle heat release rates and to a lesser extent the peak cycle temperatures correlated well with the chemiluminescent emission from CO2*. Such observations point towards the potential use of flame chemiluminescence to monitor peak bulk gas

  1. Relating fish health and reproductive metrics to contaminant bioaccumulation at the Tennessee Valley Authority Kingston coal ash spill site

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Pracheil, Brenda M.; Marshall Adams, S.; Bevelhimer, Mark S.; Fortner, Allison M.; Greeley, Mark S.; Murphy, Cheryl A.; Mathews, Teresa J.; Peterson, Mark J.

    2016-05-06

    A 4.1 million m3 release of coal ash into the Emory and Clinch rivers in December 2008 at Tennessee Valley Authority's Kingston Fossil Plant has prompted a long-term, large-scale biological monitoring effort to determine if there are chronic effects of this spill on biota. Of concern in this spill were arsenic (As) and selenium (Se), heavy metal constituents of coal ash that can be toxic to fish and wildlife, and also mercury (Hg), a legacy contaminant that can interact with Se in organisms. We used fish filet bioaccumulation data from Bluegill Lepomis macrochirus, Redear Lepomis microlophus, Largemouth Bass Micropterus salmoides, and Channel Catfish Ictalurus punctatus and metrics of fish health including fish condition indices, blood chemistry parameters and liver histopathology data collected from 2009-2013 to determine whether tissue heavy metal burdens relate 1) to each other, 2) to metrics of fish health (e.g., blood chemistry characteristics and liver histopathology) and condition, and 3) whether relationships between fish health characteristics and heavy metals are related to site and ash exposure. We found that burdens of Se and As are generally related to each other between tissues, but burdens of Hg between tissues are not generally positively associated. Taking the analyses together, there appear to be reductions in growth and sublethal liver and kidney dysfunction in Bluegill and Largemouth Bass, as indicated by blood chemistry parameters (elevated blood protein, glucose, phosphorous, blood urea nitrogen and creatinine in ash-affected sites) and related to concentrations of As and Se. Seeing sub-lethal effects in these species of fish is interesting because Redear had the highest filet burdens of Se, but did not have biomarkers indicating disease or dysfunction. We conclude our study by highlighting the complexities inherent in multimetric fish health data and the need for continued monitoring to further untangle contaminant and fish health

  2. DOE JGI Quality Metrics; Approaches to Scaling and Improving Metagenome Assembly (Metagenomics Informatics Challenges Workshop: 10K Genomes at a Time)

    ScienceCinema (OSTI)

    Copeland, Alex [DOE JGI]; Brown, C Titus [Michigan State University]

    2013-01-22

    DOE JGI's Alex Copeland on "DOE JGI Quality Metrics" and Michigan State University's C. Titus Brown on "Approaches to Scaling and Improving Metagenome Assembly" at the Metagenomics Informatics Challenges Workshop held at the DOE JGI on October 12-13, 2011.

  3. SU-E-T-379: Concave Approximations of Target Volume Dose Metrics for Intensity- Modulated Radiotherapy Treatment Planning

    SciTech Connect (OSTI)

    Xie, Y; Chen, Y; Wickerhauser, M; Deasy, J

    2014-06-01

    Purpose: The widely used treatment plan metric Dx (minimum dose to the hottest x% by volume of the target volume) is simple to interpret and use, but is computationally poorly behaved (non-convex), which impedes its use in computationally efficient intensity-modulated radiotherapy (IMRT) treatment planning algorithms. We therefore searched for surrogate metrics that are concave, computationally efficient, and accurately correlated to Dx values in IMRT treatment plans. Methods: To find concave surrogates of D95 (and, more generally, Dx values with variable x values) we tested equations containing one or two generalized equivalent uniform dose (gEUD) functions. Fits were obtained by varying the gEUD a-parameter values, as well as the linear equation coefficients. Fitting was performed using a dataset of dose-volume histograms from 498 de-identified head and neck IMRT treatment plans. Fit characteristics were tested using a cross-validation process. Reported root-mean-square error values were averaged over the cross-validation shuffles. Results: As expected, the two-gEUD formula provided a superior fit, compared to the single-gEUD formula. The best approximation uses two gEUD terms: 16.25 x gEUD[a=0.45] - 15.30 x gEUD[a=1.75] - 0.69. The average root-mean-square error on repeated (70/30) cross validation was 0.94 Gy. In addition, a formula was found that reasonably approximates Dx for x between 80% and 96%. Conclusion: A simple concave function using two gEUD terms was found that correlates well with PTV D95s for these head and neck treatment plans. More generally, a formula was found that represents well the Dx for x values from 80% to 96%, thus providing a computationally efficient formula for use in treatment planning optimization. The formula may need to be adjusted for other institutions with different treatment planning protocols. We conclude that the strategy of replacing Dx values with gEUD-based formulas is promising.
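
    For reference, the generalized equivalent uniform dose is gEUD(a) = (Σ_i v_i d_i^a)^(1/a) over the DVH's fractional volumes v_i. Below is a minimal sketch of the two-term surrogate quoted above (with the dropped operators in the abstract's formula read as subtractions), using a hypothetical differential DVH; the fitted coefficients apply to the study's own head-and-neck plans, so the resulting number is purely illustrative.

```python
# Sketch of the generalized equivalent uniform dose (gEUD) and the two-term
# surrogate for D95 quoted in the abstract. The DVH below is hypothetical.
import numpy as np

def geud(doses: np.ndarray, volumes: np.ndarray, a: float) -> float:
    """gEUD = (sum_i v_i * d_i^a)^(1/a), with v_i the fractional volumes."""
    v = volumes / volumes.sum()
    return float(np.sum(v * doses ** a) ** (1.0 / a))

# Hypothetical differential DVH of a target: dose bins (Gy) and relative volumes.
doses = np.array([66.0, 68.0, 70.0, 72.0, 74.0])
volumes = np.array([0.05, 0.15, 0.55, 0.20, 0.05])

# Two-term surrogate with the coefficients given in the abstract (signs assumed).
d95_surrogate = 16.25 * geud(doses, volumes, 0.45) - 15.30 * geud(doses, volumes, 1.75) - 0.69
print(f"concave surrogate for D95 ~ {d95_surrogate:.1f} Gy")
```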

  4. Geothermal Resource Reporting Metric (GRRM) Developed for the U.S. Department of Energy's Geothermal Technologies Office

    SciTech Connect (OSTI)

    Young, Katherine R.; Wall, Anna M.; Dobson, Patrick F.

    2015-09-02

    This paper reviews a methodology being developed for reporting geothermal resources and project progress. The goal is to provide the U.S. Department of Energy's (DOE) Geothermal Technologies Office (GTO) with a consistent and comprehensible means of evaluating the impacts of its funding programs. This framework will allow the GTO to assess the effectiveness of research, development, and deployment (RD&D) funding, prioritize funding requests, and demonstrate the value of RD&D programs to the U.S. Congress and the public. Standards and reporting codes used in other countries and energy sectors provide guidance to develop the relevant geothermal methodology, but industry feedback and our analysis suggest that the existing models have drawbacks that should be addressed. In order to formulate a comprehensive metric for use by the GTO, we analyzed existing resource assessments and reporting methodologies for the geothermal, mining, and oil and gas industries, and sought input from industry, investors, academia, national labs, and other government agencies. Using this background research as a guide, we describe a methodology for evaluating and reporting on GTO funding according to resource grade (geological, technical and socio-economic) and project progress. This methodology would allow GTO to target funding, measure impact by monitoring the progression of projects, or assess geological potential of targeted areas for development.

  5. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

    SciTech Connect (OSTI)

    Harper, Bryan; Thomas, Dennis G.; Chikkagoudar, Satish; Baker, Nathan A.; Tang, Kaizhi; Heredia-Langner, Alejandro; Lins, Roberto D.; Harper, Stacey

    2015-06-04

    The integration of rapid assays, large data sets, informatics and modeling can overcome current barriers in understanding nanomaterial structure-toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality, were established at realistic exposure levels and used to develop a predictive model of nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a predictive model of gold nanoparticle toxicity to embryonic zebrafish. In addition, our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. These findings reveal the need to expeditiously increase the availability of quantitative measures of nanomaterial hazard and broaden the sharing of that data and knowledge to support predictive modeling. In addition, research should continue to focus on methodologies for developing predictive models of nanomaterial hazard based on sub-lethal responses to low dose exposures.

  6. How Does Your Data Center Measure Up? Energy Efficiency Metrics and Benchmarks for Data Center Infrastructure Systems

    SciTech Connect (OSTI)

    Mathew, Paul; Greenberg, Steve; Ganguly, Srirupa; Sartor, Dale; Tschudi, William

    2009-04-01

    Data centers are among the most energy intensive types of facilities, and they are growing dramatically in terms of size and intensity [EPA 2007]. As a result, in the last few years there has been increasing interest from stakeholders - ranging from data center managers to policy makers - to improve the energy efficiency of data centers, and there are several industry and government organizations that have developed tools, guidelines, and training programs. There are many opportunities to reduce energy use in data centers and benchmarking studies reveal a wide range of efficiency practices. Data center operators may not be aware of how efficient their facility may be relative to their peers, even for the same levels of service. Benchmarking is an effective way to compare one facility to another, and also to track the performance of a given facility over time. Toward that end, this article presents the key metrics that facility managers can use to assess, track, and manage the efficiency of the infrastructure systems in data centers, and thereby identify potential efficiency actions. Most of the benchmarking data presented in this article are drawn from the data center benchmarking database at Lawrence Berkeley National Laboratory (LBNL). The database was developed from studies commissioned by the California Energy Commission, Pacific Gas and Electric Co., the U.S. Department of Energy and the New York State Energy Research and Development Authority.
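
    The abstract does not name specific metrics, but the most widely used infrastructure metric in such benchmarking is power usage effectiveness (PUE), the ratio of total facility energy to IT equipment energy. A minimal sketch with hypothetical annual energy figures:

```python
# Hedged sketch of a common data-center infrastructure metric,
# power usage effectiveness (PUE = total facility energy / IT equipment energy).
# The abstract does not quote specific values; the numbers here are hypothetical.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    return total_facility_kwh / it_equipment_kwh

annual_total_kwh = 8_500_000.0   # hypothetical annual facility energy
annual_it_kwh = 5_000_000.0      # hypothetical annual IT equipment energy
print(f"PUE = {pue(annual_total_kwh, annual_it_kwh):.2f}")   # 1.70
```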

  7. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Harper, Bryan; Thomas, Dennis G.; Chikkagoudar, Satish; Baker, Nathan A.; Tang, Kaizhi; Heredia-Langner, Alejandro; Lins, Roberto D.; Harper, Stacey

    2015-06-04

    The integration of rapid assays, large data sets, informatics and modeling can overcome current barriers in understanding nanomaterial structure-toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality, were established at realistic exposure levels and used to develop a predictive model of nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a predictive model of gold nanoparticle toxicity to embryonic zebrafish. In addition, our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. These findings reveal the need to expeditiously increase the availability of quantitative measures of nanomaterial hazard and broaden the sharing of that data and knowledge to support predictive modeling. In addition, research should continue to focus on methodologies for developing predictive models of nanomaterial hazard based on sub-lethal responses to low dose exposures.

  8. ASR - 2011 Performance Metrics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    climate modeling within BER CESD. The goal of the climate modeling program is the development of climate models that include natural and human systems, which will project...

  9. ARM - 2009 Performance Metrics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    for climate data products and report the Barrow radiation time series data set. ... (PDF). The Barrow radiation time series data set was developed and is available at the ...

  10. Fire Protection Program Metrics

    Broader source: Energy.gov [DOE]

    Presenter: Perry E. D'Antonio, P.E., Acting Sr. Manager, Fire Protection - Sandia National Laboratories

  11. Oil Security Metrics Model

    SciTech Connect (OSTI)

    Greene, David L.; Leiby, Paul N.

    2005-03-06

    A presentation to the IWG GPRA USDOE, March 6, 2005, Washington, DC. OSMM estimates oil security benefits of changes in the U.S. oil market.

  12. FY 2015 METRIC SUMMARY

    Broader source: Energy.gov [DOE]

    The Root Cause Analysis report identifies the key elements necessary to make the meaningful changes required to consistently deliver projects within cost and schedule performance parameters.

  13. Reducing Power Factor Cost

    Broader source: Energy.gov [DOE]

    Low power factor is expensive and inefficient. Many utility companies charge an additional fee if your power factor is less than 0.95. Low power factor also reduces your electrical system’s distribution capacity by increasing current flow and causing voltage drops. This fact sheet describes power factor and explains how you can improve your power factor to reduce electric bills and enhance your electrical system’s capacity.

  14. Land and Water Use, CO2 Emissions, and Worker Radiological Exposure Factors for the Nuclear Fuel Cycle

    SciTech Connect (OSTI)

    Brett W Carlsen; Brent W Dixon; Urairisa Pathanapirom; Eric Schneider; Bethany L. Smith; Timothy M. Ault; Allen G. Croff; Steven L. Krahn

    2013-08-01

    The Department of Energy Office of Nuclear Energy’s Fuel Cycle Technologies program is preparing to evaluate several proposed nuclear fuel cycle options to help guide and prioritize Fuel Cycle Technology research and development. Metrics are being developed to assess performance against nine evaluation criteria that will be used to assess relevant impacts resulting from all phases of the fuel cycle. This report focuses on four specific environmental metrics:

    • land use
    • water use
    • CO2 emissions
    • radiological dose to workers

    Impacts associated with the processes in the front-end of the nuclear fuel cycle, mining through enrichment and deconversion of DUF6, are summarized from FCRD-FCO-2012-000124, Revision 1. Impact estimates are developed within this report for the remaining phases of the nuclear fuel cycle. These phases include fuel fabrication, reactor construction and operations, fuel reprocessing, and storage, transport, and disposal of associated used fuel and radioactive wastes. Impact estimates for each of the phases of the nuclear fuel cycle are given as impact factors normalized per unit process throughput or output. These impact factors can then be re-scaled against the appropriate mass flows to provide estimates for a wide range of potential fuel cycles. A companion report, FCRD-FCO-2013-000213, applies the impact factors to estimate and provide a comparative evaluation of 40 fuel cycles under consideration relative to these four environmental metrics.

  15. Science as Knowledge, Practice, and Map Making: The Challenge of Defining Metrics for Evaluating and Improving DOE-Funded Basic Experimental Science

    SciTech Connect (OSTI)

    Bodnarczuk, M.

    1993-03-01

    Industrial R&D laboratories have been surprisingly successful in developing performance objectives and metrics that convincingly show that planning, management, and improvement techniques can be value-added to the actual output of R&D organizations. In this paper, I will discuss the more difficult case of developing analogous constructs for DOE-funded non-nuclear, non-weapons basic research, or as I will refer to it - basic experimental science. Unlike most industrial R&D or the bulk of applied science performed at the National Renewable Energy Laboratory (NREL), the purpose of basic experimental science is producing new knowledge (usually published in professional journals) that has no immediate application to the first link (the R) of a planned R&D chain. Consequently, performance objectives and metrics are far more difficult to define. My claim is that if one can successfully define metrics for evaluating and improving DOE-funded basic experimental science (which is the most difficult case), then defining such constructs for DOE-funded applied science should be much less problematic. With the publication of the DOE Standard - Implementation Guide for Quality Assurance Programs for Basic and Applied Research (DOE-ER-STD-6001-92) and the development of a conceptual framework for integrating all the DOE orders, we need to move aggressively toward the threefold next phase: (1) focusing the management elements found in DOE-ER-STD-6001-92 on the main output of national laboratories - the experimental science itself; (2) developing clearer definitions of basic experimental science as practice not just knowledge; and (3) understanding the relationship between the metrics that scientists use for evaluating the performance of DOE-funded basic experimental science, the management elements of DOE-ER-STD-6001-92, and the notion of continuous improvement.

  16. Evaluating IMRT and VMAT dose accuracy: Practical examples of failure to detect systematic errors when applying a commonly used metric and action levels

    SciTech Connect (OSTI)

    Nelms, Benjamin E.; Chan, Maria F.; Jarry, Geneviève; Lemire, Matthieu; Lowden, John; Hampton, Carnell

    2013-11-15

    Purpose: This study (1) examines a variety of real-world cases where systematic errors were not detected by widely accepted methods for IMRT/VMAT dosimetric accuracy evaluation, and (2) drills down to identify failure modes and their corresponding means for detection, diagnosis, and mitigation. The primary goal of detailing these case studies is to explore different, more sensitive methods and metrics that could be used more effectively for evaluating accuracy of dose algorithms, delivery systems, and QA devices. Methods: The authors present seven real-world case studies representing a variety of combinations of the treatment planning system (TPS), linac, delivery modality, and systematic error type. These case studies are typical of what might be used as part of an IMRT or VMAT commissioning test suite, varying in complexity. Each case study is analyzed according to TG-119 instructions for gamma passing rates and action levels for per-beam and/or composite plan dosimetric QA. Then, each case study is analyzed in depth with advanced diagnostic methods (dose profile examination, EPID-based measurements, dose difference pattern analysis, 3D measurement-guided dose reconstruction, and dose grid inspection) and more sensitive metrics (2% local normalization/2 mm DTA and estimated DVH comparisons). Results: For these case studies, the conventional 3%/3 mm gamma passing rates exceeded 99% for IMRT per-beam analyses and ranged from 93.9% to 100% for composite plan dose analysis, well above the TG-119 action levels of 90% and 88%, respectively. However, all cases had systematic errors that were detected only by using advanced diagnostic techniques and more sensitive metrics. The systematic errors caused variable but noteworthy impact, including estimated target dose coverage loss of up to 5.5% and local dose deviations up to 31.5%. Types of errors included TPS model settings, algorithm limitations, and modeling and alignment of QA phantoms in the TPS. Most of the errors were
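
    The passing-rate metric under discussion is based on the gamma index, which combines a dose-difference tolerance with a distance-to-agreement (DTA) tolerance. Below is a simplified 1-D, globally normalized sketch with hypothetical profiles, illustrating how a small systematic scaling error can still pass a 3%/3 mm test; it is not the specific analysis software used in the study.

```python
# Minimal 1-D global-gamma sketch (3%/3 mm style) to illustrate the passing-rate
# metric the study critiques; the dose profiles below are hypothetical.
import numpy as np

def gamma_passing_rate(x, d_ref, d_eval, dose_pct=3.0, dta_mm=3.0):
    """Fraction (%) of reference points with gamma <= 1 (global normalization)."""
    dose_tol = dose_pct / 100.0 * d_ref.max()
    gammas = []
    for xr, dr in zip(x, d_ref):
        dist2 = ((x - xr) / dta_mm) ** 2          # squared distance term vs. every point
        dose2 = ((d_eval - dr) / dose_tol) ** 2   # squared dose-difference term
        gammas.append(np.sqrt(np.min(dist2 + dose2)))
    return 100.0 * np.mean(np.array(gammas) <= 1.0)

x = np.arange(0.0, 100.0, 1.0)                      # positions in mm
d_ref = 2.0 * np.exp(-((x - 50.0) / 20.0) ** 2)     # hypothetical reference profile (Gy)
d_eval = 1.02 * d_ref                               # evaluated profile with a 2% scaling error
print(f"gamma passing rate: {gamma_passing_rate(x, d_ref, d_eval):.1f}%")  # passes despite the error
```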

  17. FY 2009 Annual Report of Joule Software Metric SC GG 3.1/2.5.2, Improve Computational Science Capabilities

    SciTech Connect (OSTI)

    Kothe, Douglas B; Roche, Kenneth J; Kendall, Ricky A

    2010-01-01

    The Joule Software Metric for Computational Effectiveness is established by Public Authorizations PL 95-91, Department of Energy Organization Act, and PL 103-62, Government Performance and Results Act. The U.S. Office of Management and Budget (OMB) oversees the preparation and administration of the President's budget; evaluates the effectiveness of agency programs, policies, and procedures; assesses competing funding demands across agencies; and sets the funding priorities for the federal government. The OMB has the power of audit and exercises this right annually for each federal agency. According to the Government Performance and Results Act of 1993 (GPRA), federal agencies are required to develop three planning and performance documents: 1. Strategic Plan: a broad, 3-year outlook; 2. Annual Performance Plan: a focused, 1-year outlook of annual goals and objectives that is reflected in the annual budget request (What results can the agency deliver as part of its public funding?); and 3. Performance and Accountability Report: an annual report that details the previous fiscal year's performance (What results did the agency produce in return for its public funding?). OMB uses its Performance Assessment Rating Tool (PART) to perform evaluations. PART has seven worksheets for seven types of agency functions. The function of Research and Development (R&D) programs is included. R&D programs are assessed on the following criteria: Does the R&D program perform a clear role? Has the program set valid long-term and annual goals? Is the program well managed? Is the program achieving the results set forth in its GPRA documents? In Fiscal Year (FY) 2003, the Department of Energy Office of Science (DOE SC-1) worked directly with OMB to come to a consensus on an appropriate set of performance measures consistent with PART requirements. The scientific performance expectations of these requirements reach the scope of work conducted at the DOE national laboratories. The Joule system

  18. THE POSSIBLE ROLE OF CORONAL STREAMERS AS MAGNETICALLY CLOSED STRUCTURES IN SHOCK-INDUCED ENERGETIC ELECTRONS AND METRIC TYPE II RADIO BURSTS

    SciTech Connect (OSTI)

    Kong, Xiangliang; Chen, Yao; Feng, Shiwei; Wang, Bing; Du, Guohui; Guo, Fan; Li, Gang

    2015-01-10

    Two solar type II radio bursts, separated by ~24 hr in time, are examined together. Both events are associated with coronal mass ejections (CMEs) erupting from the same active region (NOAA 11176) beneath a well-observed helmet streamer. We find that the type II emissions in both events ended once the CME/shock fronts passed the white-light streamer tip, which is presumably the magnetic cusp of the streamer. This leads us to conjecture that the closed magnetic arcades of the streamer may play a role in electron acceleration and type II excitation at coronal shocks. To examine such a conjecture, we conduct a test-particle simulation for electron dynamics within a large-scale partially closed streamer magnetic configuration swept by a coronal shock. We find that the closed field lines play the role of an electron trap via which the electrons are sent back to the shock front multiple times and therefore accelerated to high energies by the shock. Electrons with an initial energy of 300 eV can be accelerated to tens of keV concentrating at the loop apex close to the shock front with a counter-streaming distribution at most locations. These electrons are energetic enough to excite Langmuir waves and radio bursts. Considering the fact that most solar eruptions originate from closed field regions, we suggest that the scenario may be important for the generation of more metric type IIs. This study also provides an explanation of the general ending frequencies of metric type IIs at or above 20-30 MHz and the disconnection issue between metric and interplanetary type IIs.

  19. Table 11.1 Carbon Dioxide Emissions From Energy Consumption by Source, 1949-2011 (Million Metric Tons of Carbon Dioxide )

    U.S. Energy Information Administration (EIA) Indexed Site

    Carbon Dioxide Emissions From Energy Consumption by Source, 1949-2011 (Million Metric Tons of Carbon Dioxide 1) Year Coal 3 Natural Gas 4 Petroleum Total 2,9 Biomass 2 Aviation Gasoline Distillate Fuel Oil 5 Jet Fuel Kero- sene LPG 6 Lubri- cants Motor Gasoline 7 Petroleum Coke Residual Fuel Oil Other 8 Total Wood 10 Waste 11 Fuel Ethanol 12 Bio- diesel Total 1949 1,118 270 12 140 NA 42 13 7 329 8 244 25 820 2,207 145 NA NA NA 145 1950 1,152 313 14 168 NA 48 16 9 357 8 273 26 918 2,382 147 NA NA

  20. Table 11.2c Carbon Dioxide Emissions From Energy Consumption: Industrial Sector, 1949-2011 (Million Metric Tons of Carbon Dioxide )

    U.S. Energy Information Administration (EIA) Indexed Site

    c Carbon Dioxide Emissions From Energy Consumption: Industrial Sector, 1949-2011 (Million Metric Tons of Carbon Dioxide 1) Year Coal Coal Coke Net Imports Natural Gas 3 Petroleum Retail Elec- tricity 8 Total 2 Biomass 2 Distillate Fuel Oil 4 Kero- sene LPG 5 Lubri- cants Motor Gasoline 6 Petroleum Coke Residual Fuel Oil Other 7 Total Wood 9 Waste 10 Fuel Ethanol 11 Total 1949 500 -1 166 41 18 3 3 16 8 95 25 209 120 995 44 NA NA 44 1950 531 (s) 184 51 20 4 3 18 8 110 26 239 140 1,095 50 NA NA 50

  1. Table 11.2d Carbon Dioxide Emissions From Energy Consumption: Transportation Sector, 1949-2011 (Million Metric Tons of Carbon Dioxide )

    U.S. Energy Information Administration (EIA) Indexed Site

    d Carbon Dioxide Emissions From Energy Consumption: Transportation Sector, 1949-2011 (Million Metric Tons of Carbon Dioxide 1) Year Coal Natural Gas 3 Petroleum Retail Elec- tricity 7 Total 2 Biomass 2 Aviation Gasoline Distillate Fuel Oil 4 Jet Fuel LPG 5 Lubricants Motor Gasoline 6 Residual Fuel Oil Total Fuel Ethanol 8 Biodiesel Total 1949 161 NA 12 30 NA (s) 4 306 91 443 6 611 NA NA NA 1950 146 7 14 35 NA (s) 5 332 95 481 6 640 NA NA NA 1951 129 11 18 42 NA (s) 6 360 102 529 7 675 NA NA NA

  2. Dilaton field minimally coupled to 2+1 gravity; uniqueness of the static Chan-Mann black hole and new dilaton stationary metrics

    SciTech Connect (OSTI)

    García-Diaz, Alberto A.

    2014-01-14

    Using the Schwarzschild coordinate frame for a static cyclic symmetric metric in 2+1 gravity coupled minimally to a dilaton logarithmically depending on the radial coordinate in the presence of an exponential potential, by solving first order linear Einstein equations, the general solution is derived and it is identified with the Chan–Mann dilaton solution. In these coordinates, a new stationary dilaton solution is obtained; it does not allow for a de Sitter–Anti-de Sitter limit at spatial infinity, where its structural functions increase indefinitely. On the other hand, it is horizonless and allows for a naked singularity at the origin of coordinates; moreover, one can identify at a large radial coordinate a (quasi-local) mass parameter and in the whole space a constant angular momentum. Via a general SL(2,R)–transformation, applied on the static cyclic symmetric metric, a family of stationary dilaton solutions has been generated. A particular SL(2,R)–transformation is identified, which gives rise to the rotating Chan–Mann dilaton solution. All the exhibited solutions have been characterized by their quasi-local energy, mass, and momentum through their series expansions at spatial infinity. The algebraic structure of the Ricci–energy-momentum, and Cotton tensors is given explicitly.

  3. CT head-scan dosimetry in an anthropomorphic phantom and associated measurement of ACR accreditation-phantom imaging metrics under clinically representative scan conditions

    SciTech Connect (OSTI)

    Brunner, Claudia C.; Stern, Stanley H.; Chakrabarti, Kish; Minniti, Ronaldo; Parry, Marie I.; Skopec, Marlene

    2013-08-15

    Purpose: To measure radiation absorbed dose and its distribution in an anthropomorphic head phantom under clinically representative scan conditions in three widely used computed tomography (CT) scanners, and to relate those dose values to metrics such as high-contrast resolution, noise, and contrast-to-noise ratio (CNR) in the American College of Radiology CT accreditation phantom.Methods: By inserting optically stimulated luminescence dosimeters (OSLDs) in the head of an anthropomorphic phantom specially developed for CT dosimetry (University of Florida, Gainesville), we measured dose with three commonly used scanners (GE Discovery CT750 HD, Siemens Definition, Philips Brilliance 64) at two different clinical sites (Walter Reed National Military Medical Center, National Institutes of Health). The scanners were set to operate with the same data-acquisition and image-reconstruction protocols as used clinically for typical head scans, respective of the practices of each facility for each scanner. We also analyzed images of the ACR CT accreditation phantom with the corresponding protocols. While the Siemens Definition and the Philips Brilliance protocols utilized only conventional, filtered back-projection (FBP) image-reconstruction methods, the GE Discovery also employed its particular version of an adaptive statistical iterative reconstruction (ASIR) algorithm that can be blended in desired proportions with the FBP algorithm. We did an objective image-metrics analysis evaluating the modulation transfer function (MTF), noise power spectrum (NPS), and CNR for images reconstructed with FBP. For images reconstructed with ASIR, we only analyzed the CNR, since MTF and NPS results are expected to depend on the object for iterative reconstruction algorithms.Results: The OSLD measurements showed that the Siemens Definition and the Philips Brilliance scanners (located at two different clinical facilities) yield average absorbed doses in tissue of 42.6 and 43.1 m
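
    One of the ACR phantom image metrics referenced above, the contrast-to-noise ratio, is typically computed from region-of-interest statistics as CNR = |mean(insert) − mean(background)| / SD(background). A minimal sketch with hypothetical ROI samples:

```python
# Illustrative contrast-to-noise ratio (CNR) computation of the kind used for
# ACR phantom analysis; the ROI statistics below are hypothetical HU values.
import numpy as np

rng = np.random.default_rng(0)
roi_insert = rng.normal(loc=95.0, scale=5.0, size=1000)      # low-contrast insert ROI
roi_background = rng.normal(loc=89.0, scale=5.0, size=1000)  # adjacent background ROI

cnr = abs(roi_insert.mean() - roi_background.mean()) / roi_background.std(ddof=1)
print(f"CNR = {cnr:.2f}")
```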

  4. Reducing Power Factor Cost

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    ... The presence of both in the same circuit results in the continuous alternating transfer of ... In the diagram below, the power triangle shows an initial 0.70 power factor for a 100-kW ...
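
    As a quick illustration of the power-triangle arithmetic referenced in this excerpt, the sketch below computes the reactive power drawn by a 100-kW load at the stated 0.70 power factor and the capacitor bank needed to reach an assumed 0.95 target; the target value and the sizing approach are illustrative assumptions, not figures from the fact sheet.

```python
import math

# Assumed scenario: 100 kW of real power at 0.70 power factor,
# corrected to a hypothetical 0.95 target power factor.
p_kw = 100.0
pf_initial, pf_target = 0.70, 0.95

q_initial = p_kw * math.tan(math.acos(pf_initial))   # reactive power now, ~102 kvar
q_target = p_kw * math.tan(math.acos(pf_target))     # reactive power at target, ~33 kvar
capacitor_kvar = q_initial - q_target                # compensation the capacitor must supply

s_initial = p_kw / pf_initial                        # apparent power now, ~143 kVA
s_target = p_kw / pf_target                          # apparent power at target, ~105 kVA

print(f"Capacitor bank ~ {capacitor_kvar:.0f} kvar; "
      f"apparent power falls from {s_initial:.0f} kVA to {s_target:.0f} kVA")
```

    Raising the power factor shrinks the apparent power the utility must supply for the same 100 kW of useful work, which is the source of the cost reduction the fact sheet discusses.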

  5. FGF growth factor analogs

    DOE Patents [OSTI]

    Zamora, Paul O.; Pena, Louis A.; Lin, Xinhua; Takahashi, Kazuyuki

    2012-07-24

    The present invention provides a fibroblast growth factor heparin-binding analog of the formula: ##STR00001## where R.sub.1, R.sub.2, R.sub.3, R.sub.4, R.sub.5, X, Y and Z are as defined, pharmaceutical compositions, coating compositions and medical devices including the fibroblast growth factor heparin-binding analog of the foregoing formula, and methods and uses thereof.

  6. Table 11.2b Carbon Dioxide Emissions From Energy Consumption: Commercial Sector, 1949-2011 (Million Metric Tons of Carbon Dioxide )

    U.S. Energy Information Administration (EIA) Indexed Site

    b Carbon Dioxide Emissions From Energy Consumption: Commercial Sector, 1949-2011 (Million Metric Tons of Carbon Dioxide 1) Year Coal Natural Gas 3 Petroleum Retail Electricity 7 Total 2 Biomass 2 Distillate Fuel Oil 4 Kerosene LPG 5 Motor Gasoline 6 Petroleum Coke Residual Fuel Oil Total Wood 8 Waste 9 Fuel Ethanol 10 Total 1949 148 19 16 3 2 7 NA 28 55 58 280 2 NA NA 2 1950 147 21 19 3 2 7 NA 33 66 63 297 2 NA NA 2 1951 125 25 21 4 3 8 NA 34 70 69 289 2 NA NA 2 1952 112 28 22 4 3 8 NA 35 71 73

  7. Table 11.2e Carbon Dioxide Emissions From Energy Consumption: Electric Power Sector, 1949-2011 (Million Metric Tons of Carbon Dioxide )

    U.S. Energy Information Administration (EIA) Indexed Site

    e Carbon Dioxide Emissions From Energy Consumption: Electric Power Sector, 1949-2011 (Million Metric Tons of Carbon Dioxide 1) Year Coal Natural Gas 3 Petroleum Geo- thermal Non- Biomass Waste 5 Total 2 Biomass 2 Distillate Fuel Oil 4 Petroleum Coke Residual Fuel Oil Total Wood 6 Waste 7 Total 1949 187 30 2 NA 30 33 NA NA 250 1 NA 1 1950 206 35 2 NA 35 37 NA NA 278 1 NA 1 1951 235 42 2 NA 29 31 NA NA 308 1 NA 1 1952 240 50 2 NA 31 33 NA NA 323 1 NA 1 1953 260 57 3 NA 38 40 NA NA 358 (s) NA (s)

  8. Multi-factor authentication

    DOE Patents [OSTI]

    Hamlet, Jason R; Pierson, Lyndon G

    2014-10-21

    Detection and deterrence of spoofing of user authentication may be achieved by including a cryptographic fingerprint unit within a hardware device for authenticating a user of the hardware device. The cryptographic fingerprint unit includes an internal physically unclonable function ("PUF") circuit disposed in or on the hardware device, which generates a PUF value. Combining logic is coupled to receive the PUF value, combines the PUF value with one or more other authentication factors to generate a multi-factor authentication value. A key generator is coupled to generate a private key and a public key based on the multi-factor authentication value while a decryptor is coupled to receive an authentication challenge posed to the hardware device and encrypted with the public key and coupled to output a response to the authentication challenge decrypted with the private key.

  9. Radiation View Factor With Shadowing

    Energy Science and Technology Software Center (OSTI)

    1992-02-24

    FACET calculates the radiation geometric view factor (alternatively called shape factor, angle factor, or configuration factor) between surfaces for axisymmetric, two-dimensional planar and three-dimensional geometries with interposed third surface obstructions. FACET was developed to calculate view factors as input data to finite element heat transfer analysis codes.

  10. Inelastic Scattering Form Factors

    Energy Science and Technology Software Center (OSTI)

    1992-01-01

    ATHENA-IV computes form factors for inelastic scattering calculations, using single-particle wave functions that are eigenstates of motion in either a Woods-Saxon potential well or a harmonic oscillator well. Two-body forces of Gauss, Coulomb, Yukawa, and a sum of cut-off Yukawa radial dependences are available.

  11. ERYTHROPOIETIC FACTOR PURIFICATION

    DOE Patents [OSTI]

    White, W.F.; Schlueter, R.J.

    1962-05-01

    A method is given for purifying and concentrating the blood plasma erythropoietic factor. Anemic sheep plasma is contacted three times successively with ion exchange resins: an anion exchange resin, a cation exchange resin at a pH of about 5, and a cation exchange resin at a pH of about 6. (AEC)

  12. Two-Factor Authentication

    Broader source: Energy.gov [DOE]

    Two-Factor Authentication (2FA) (also known as 2-Step Verification) is a system that employs two methods to identify an individual. More secure than reusable passwords, when a token's random number...

  13. Anthrax Lethal Factor

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Thiang Yian Wong, Robert Schwarzenbacher and Robert C. Liddington The Burnham Institute, 10901 North Torrey Pines Road, La Jolla, CA 92037. Anthrax Toxin is a major virulence factor in the infectious disease, Anthrax1. This toxin is produced by Bacillus anthracis, which is an encapsulated, spore-forming, rod-shaped bacterium. Inhalation anthrax, the most deadly form, is contracted through breathing spores. Once spores germinate within cells of the immune system called macrophages2, bacterial

  14. Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing

    SciTech Connect (OSTI)

    Price, Phillip N.; Granderson, Jessica; Sohn, Michael; Addy, Nathan; Jump, David

    2013-09-01

    whose savings can be calculated with least error? 4. What is the state of public domain models, that is, how well do they perform, and what are the associated implications for whole-building measurement and verification (M&V)? Additional project objectives that were addressed as part of this study include: (1) clarification of the use cases and conditions for baseline modeling performance metrics, benchmarks and evaluation criteria, (2) providing guidance for determining customer suitability for baseline modeling, (3) describing the portfolio level effects of baseline model estimation errors, (4) informing PG&E’s development of EMIS technology product specifications, and (5) providing the analytical foundation for future studies about baseline modeling and saving effects of EMIS technologies. A final objective of this project was to demonstrate the application of the methodology, performance metrics, and test protocols with participating EMIS product vendors.
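
    For readers unfamiliar with the baseline-model performance metrics discussed above, the sketch below computes two goodness-of-fit metrics commonly used in whole-building measurement and verification, CV(RMSE) and NMBE, for a simple degree-day regression baseline; the synthetic data, the 18 °C balance point, and the model form are placeholders, not anything drawn from the report.

```python
import numpy as np

rng = np.random.default_rng(1)
oat = rng.uniform(0.0, 35.0, 365)                    # synthetic daily outdoor air temperature, deg C
load = 200 + 8 * np.clip(oat - 18, 0, None) + rng.normal(0, 15, 365)  # synthetic daily kWh

# Simple cooling-degree-day baseline: load ~ b0 + b1 * max(OAT - 18, 0)
X = np.column_stack([np.ones_like(oat), np.clip(oat - 18, 0, None)])
beta, *_ = np.linalg.lstsq(X, load, rcond=None)
resid = load - X @ beta

n, p = len(load), len(beta)
cvrmse = np.sqrt(np.sum(resid**2) / (n - p)) / load.mean() * 100   # CV(RMSE), percent
nmbe = np.sum(resid) / ((n - p) * load.mean()) * 100               # NMBE, percent
print(f"CV(RMSE) = {cvrmse:.1f}%, NMBE = {nmbe:.2f}%")
```

    Metrics of this kind, computed on out-of-sample periods, are what allow public-domain and proprietary baseline models to be compared on a common footing.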

  15. The Oil Security Metrics Model: A Tool for Evaluating the Prospective Oil Security Benefits of DOE's Energy Efficiency and Renewable Energy R&D Programs

    SciTech Connect (OSTI)

    Greene, David L; Leiby, Paul Newsome

    2006-05-01

    Energy technology R&D is a cornerstone of U.S. energy policy. Understanding the potential for energy technology R&D to solve the nation's energy problems is critical to formulating a successful R&D program. In light of this, the U.S. Congress requested the National Research Council (NRC) to undertake both retrospective and prospective assessments of the Department of Energy's (DOE's) Energy Efficiency and Fossil Energy Research programs (NRC, 2001; NRC, 2005). ("The Congress continued to express its interest in R&D benefits assessment by providing funds for the NRC to build on the retrospective methodology to develop a methodology for assessing prospective benefits." NRC, 2005, p. ES-2) In 2004, the NRC Committee on Prospective Benefits of DOE's Energy Efficiency and Fossil Energy R&D Programs published a report recommending a new framework and principles for prospective benefits assessment. The Committee explicitly deferred the issue of estimating security benefits to future work. Recognizing the need for a rigorous framework for assessing the energy security benefits of its R&D programs, the DOE's Office of Energy Efficiency and Renewable Energy (EERE) developed a framework and approach for defining energy security metrics for R&D programs to use in gauging the energy security benefits of their programs (Lee, 2005). This report describes methods for estimating the prospective oil security benefits of EERE's R&D programs that are consistent with the methodologies of the NRC (2005) Committee and that build on Lee's (2005) framework. Its objective is to define and implement a method that makes use of the NRC's typology of prospective benefits and methodological framework, satisfies the NRC's criteria for prospective benefits evaluation, and permits measurement of that portion of the prospective energy security benefits of EERE's R&D portfolio related to oil. While the Oil Security Metrics (OSM) methodology described in this report has been specifically developed to

  16. The MX Factor

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    MX Factor Test films played a strategic-planning role in the debates of the late 1970s and early 1980s about where and how to deploy the MX intercontinental ballistic missile (LGM-118 Peacekeeper). The deployment would have to ensure that the missiles could survive a first strike by an adversary. Military planners were considering placing the missiles in clusters of hardened concrete shelters in the hot, dry Great Basin Desert of Nevada and Utah. Films of atmospheric tests at the Nevada Test

  17. Nucleon Electromagnetic Form Factors

    SciTech Connect (OSTI)

    Marc Vanderhaeghen; Charles Perdrisat; Vina Punjabi

    2007-10-01

    There has been much activity in the measurement of the elastic electromagnetic proton and neutron form factors in the last decade, and the quality of the data has greatly improved by performing double polarization experiments, in comparison with previous unpolarized data. Here we review the experimental data base in view of the new results for the proton, and neutron, obtained at JLab, MAMI, and MIT-Bates. The rapid evolution of phenomenological models triggered by these high-precision experiments will be discussed, including the recent progress in the determination of the valence quark generalized parton distributions of the nucleon, as well as the steady rate of improvements made in the lattice QCD calculations.

  18. Characteristics RSE Column Factor: Total

    U.S. Energy Information Administration (EIA) Indexed Site

    and 1994 Vehicle Characteristics RSE Column Factor: Total 1993 Family Income Below Poverty Line Eligible for Federal Assistance 1 RSE Row Factor: Less than 5,000 5,000...

  19. DOE Project Management Update (Metrics)

    Broader source: Energy.gov [DOE]

    Michael Peek, Deputy Director, Office of Project Management Oversight and Assessments March 22, 2016

  20. "(Million Metric Tons Carbon Dioxide)"

    U.S. Energy Information Administration (EIA) Indexed Site

    Data excerpt (million metric tons of carbon dioxide): China — 2,293; 5,558; 5,862; 6,284; 7,716; 9,057; 10,514; 11,945; remaining values in this spreadsheet excerpt are fragmentary.

  1. A File System Utilization Metric

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Abstract: A high performance computing (HPC) platform today typically contains ... the remaining 1 MB − 4 kB already in memory when later 4 kB requests for that data arrive. ...

  2. Validation of mathematical models for the prediction of organs-at-risk dosimetric metrics in high-dose-rate gynecologic interstitial brachytherapy

    SciTech Connect (OSTI)

    Damato, Antonio L.; Viswanathan, Akila N.; Cormack, Robert A.

    2013-10-15

    Purpose: Given the complicated nature of an interstitial gynecologic brachytherapy treatment plan, the use of a quantitative tool to evaluate the quality of the achieved metrics compared to clinical practice would be advantageous. For this purpose, predictive mathematical models to predict the D_2cc of rectum and bladder in interstitial gynecologic brachytherapy are discussed and validated. Methods: Previous plans were used to establish the relationship between D_2cc and the overlapping volume of the organ at risk with the targeted area (C_0) or a 1-cm expansion of the target area (C_1). Three mathematical models were evaluated: D_2cc = α·C_1 + β (LIN); D_2cc = α − exp(−β·C_0) (EXP); and a mixed approach (MIX), where both C_0 and C_1 were inputs of the model. The parameters of the models were optimized on a training set of patient data, and the predictive error of each model (predicted D_2cc − real D_2cc) was calculated on a validation set of patient data. The data of 20 patients were used to perform a K-fold cross-validation analysis, with K = 2, 4, 6, 8, 10, and 20. Results: MIX was associated with the smallest mean prediction error, <6.4% for an 18-patient training set; LIN had an error <8.5%; EXP had an error <8.3%. Best-case scenario analysis shows that an error ≤5% can be achieved for a ten-patient training set with MIX, an error ≤7.4% for LIN, and an error ≤6.9% for EXP. The error decreases with the increase in training set size, with the most marked decrease observed for MIX. Conclusions: The MIX model can predict the D_2cc of the organs at risk with an error lower than 5% with a training set of ten patients or greater. The model can be used in the development of quality assurance tools to identify treatment plans with suboptimal sparing of the organs at risk. It can also be used to improve preplanning and in the development of real-time intraoperative planning tools.
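
    A minimal sketch of the K-fold cross-validation of a linear D_2cc model of the kind validated above is shown below; the synthetic C_1 values, the coefficients, and the use of scikit-learn are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold

# Synthetic stand-in for the LIN model D_2cc = a*C_1 + b, where C_1 is the organ
# volume overlapping a 1-cm expansion of the target (cc); all numbers are made up.
rng = np.random.default_rng(4)
c1 = rng.uniform(0.0, 60.0, 20)                        # 20 "patients"
d2cc = 3.5 + 0.08 * c1 + rng.normal(0, 0.3, 20)        # Gy, illustrative only

X = c1.reshape(-1, 1)
errors = []
for train, test in KFold(n_splits=10, shuffle=True, random_state=0).split(X):
    model = LinearRegression().fit(X[train], d2cc[train])
    pred = model.predict(X[test])
    errors.extend(100 * (pred - d2cc[test]) / d2cc[test])   # percent prediction error

print(f"mean |prediction error| = {np.mean(np.abs(errors)):.1f}%")
```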

  3. Factors Affecting PMU Installation Costs

    Broader source: Energy.gov (indexed) [DOE]

    ... information to improve the modeling, forecasting and controls of the grid Standards ... Department of Energy |September 2014 Factors Affecting PMU Installation Costs | Page 3 ...

  4. Factor CO2 | Open Energy Information

    Open Energy Info (EERE)

    Factor CO2 Name: Factor CO2 Place: Bilbao, Spain Zip: 48008 Product: Spain-based consultancy specializing in climate change projects. References: Factor...

  5. Poster — Thur Eve — 03: Application of the non-negative matrix factorization technique to [{sup 11}C]-DTBZ dynamic PET data for the early detection of Parkinson's disease

    SciTech Connect (OSTI)

    Lee, Dong-Chang; Jans, Hans; McEwan, Sandy; Riauka, Terence; Martin, Wayne; Wieler, Marguerite

    2014-08-15

    In this work, a class of non-negative matrix factorization (NMF) technique known as alternating non-negative least squares, combined with the projected gradient method, is used to analyze twenty-five [11C]-DTBZ dynamic PET/CT brain data. For each subject, a two-factor model is assumed and two factors representing the striatum (factor 1) and the non-striatum (factor 2) tissues are extracted using the proposed NMF technique and commercially available factor analysis software “Pixies”. The extracted factor 1 and 2 curves represent the binding site of the radiotracer and describe the uptake and clearance of the radiotracer by soft tissues in the brain, respectively. The proposed NMF technique uses prior information about the dynamic data to obtain sample time-activity curves representing the striatum and the non-striatum tissues. These curves are then used for “warm” starting the optimization. Factor solutions from the two methods are compared graphically and quantitatively. In healthy subjects, radiotracer uptake by factors 1 and 2 are approximately 35–40% and 60–65%, respectively. The solutions are also used to develop a factor-based metric for the detection of early, untreated Parkinson's disease. The metric stratifies healthy subjects from suspected Parkinson's patients (based on the graphical method). The analysis shows that both techniques produce comparable results with similar computational time. The “semi-automatic” approach used by the NMF technique allows clinicians to manually set a starting condition for “warm” starting the optimization in order to facilitate control and efficient interaction with the data.
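
    The sketch below illustrates a warm-started two-factor non-negative matrix factorization of a voxels-by-frames dynamic PET matrix. It relies on scikit-learn's NMF with a custom initialization rather than the alternating non-negative least-squares with projected gradient used in the study, and the data matrix and prior time-activity curves are synthetic placeholders.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
X = rng.random((500, 30))      # placeholder for a (voxels x frames) [11C]-DTBZ matrix

# "Warm" start: crude prior time-activity curves for the two factors
H0 = np.vstack([
    np.linspace(1.0, 0.8, 30),           # slowly clearing curve (striatal binding)
    np.exp(-np.linspace(0.0, 3.0, 30)),  # fast uptake-and-clearance curve (non-striatum)
])
W0 = np.abs(X @ np.linalg.pinv(H0))      # non-negative starting spatial weights

model = NMF(n_components=2, init="custom", solver="mu", max_iter=500)
W = model.fit_transform(X, W=W0, H=H0)   # spatial factor images
H = model.components_                    # factor time-activity curves
print(W.shape, H.shape)
```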

  6. PROGRESS TOWARDS NEXT GENERATION, WAVEFORM BASED THREE-DIMENSIONAL MODELS AND METRICS TO IMPROVE NUCLEAR EXPLOSION MONITORING IN THE MIDDLE EAST

    SciTech Connect (OSTI)

    Savage, B; Peter, D; Covellone, B; Rodgers, A; Tromp, J

    2009-07-02

    Efforts to update current wave speed models of the Middle East require a thoroughly tested database of sources and recordings. Recordings of seismic waves traversing the region from Tibet to the Red Sea will be the principal metric in guiding improvements to the current wave speed model. Precise characterizations of the earthquakes, specifically depths and faulting mechanisms, are essential to avoid mapping source errors into the refined wave speed model. Errors associated with the source are manifested in amplitude and phase changes. Source depths and paths near nodal planes are particularly error prone as small changes may severely affect the resulting wavefield. Once sources are quantified, regions requiring refinement will be highlighted using adjoint tomography methods based on spectral element simulations [Komatitsch and Tromp (1999)]. An initial database of 250 regional Middle Eastern events from 1990-2007, was inverted for depth and focal mechanism using teleseismic arrivals [Kikuchi and Kanamori (1982)] and regional surface and body waves [Zhao and Helmberger (1994)]. From this initial database, we reinterpreted a large, well recorded subset of 201 events through a direct comparison between data and synthetics based upon a centroid moment tensor inversion [Liu et al. (2004)]. Evaluation was done using both a 1D reference model [Dziewonski and Anderson (1981)] at periods greater than 80 seconds and a 3D model [Kustowski et al. (2008)] at periods of 25 seconds and longer. The final source reinterpretations will be within the 3D model, as this is the initial starting point for the adjoint tomography. Transitioning from a 1D to 3D wave speed model shows dramatic improvements when comparisons are done at shorter periods, (25 s). Synthetics from the 1D model were created through mode summations while those from the 3D simulations were created using the spectral element method. To further assess errors in source depth and focal mechanism, comparisons between the

  7. Human Factors Engineering Analysis Tool

    Energy Science and Technology Software Center (OSTI)

    2002-03-04

    HFE-AT is a human factors engineering (HFE) software analysis tool (AT) for human-system interface design of process control systems, and is based primarily on NUREG-0700 guidance.

  8. Factorization, power corrections, and the pion form factor

    SciTech Connect (OSTI)

    Rothstein, Ira Z.

    2004-09-01

    This paper is an investigation of the pion form factor utilizing recently developed effective field theory techniques. The primary result is that both the transition and electromagnetic form factors are corrected at order Λ/Q due to time-ordered products which account for deviations of the pion from being a state composed purely of highly energetic collinear quarks in the lab frame. The usual higher-twist wave function corrections contribute only at order Λ²/Q², when the quark mass vanishes. In the case of the electromagnetic form factor the Λ/Q power correction is enhanced by a power of 1/α_s(Q) relative to the leading-order result of Brodsky and Lepage, if the scale √(ΛQ) is nonperturbative. This enhanced correction could explain the discrepancy with the data.

  9. Transcription factor-based biosensor

    DOE Patents [OSTI]

    2013-10-08

    The present invention provides for a system comprising a BmoR transcription factor, a σ54-RNA polymerase, and a pBMO promoter operatively linked to a reporter gene, wherein the pBMO promoter is capable of expression of the reporter gene with an activated form of the BmoR and the σ54-RNA polymerase.

  10. Human factors in software development

    SciTech Connect (OSTI)

    Curtis, B.

    1986-01-01

    This book presents an overview of ergonomics/human factors in software development, recent research, and classic papers. Articles are drawn from the following areas of psychological research on programming: cognitive ergonomics, cognitive psychology, and psycholinguistics. Topics examined include: theoretical models of how programmers solve technical problems, the characteristics of programming languages, specification formats in behavioral research and psychological aspects of fault diagnosis.

  11. IPCC Emission Factor Database | Open Energy Information

    Open Energy Info (EERE)

    Emission Factor Database Tool Summary Name: IPCC Emission Factor Database Agency/Company/Organization: World Meteorological Organization,...

  12. Tetrahydroquinoline Derivatives as Potent and Selective Factor...

    Office of Scientific and Technical Information (OSTI)

    as Potent and Selective Factor XIa Inhibitors Citation Details In-Document Search Title: Tetrahydroquinoline Derivatives as Potent and Selective Factor XIa Inhibitors Authors: ...

  13. Structural basis for Tetrahymena telomerase processivity factor...

    Office of Scientific and Technical Information (OSTI)

    factor Teb1 binding to single-stranded telomeric-repeat DNA Citation Details In-Document Search Title: Structural basis for Tetrahymena telomerase processivity factor Teb1 ...

  14. Factors Impacting Decommissioning Costs - 13576

    SciTech Connect (OSTI)

    Kim, Karen; McGrath, Richard

    2013-07-01

    The Electric Power Research Institute (EPRI) studied United States experience with decommissioning cost estimates and the factors that impact the actual cost of decommissioning projects. This study gathered available estimated and actual decommissioning costs from eight nuclear power plants in the United States to understand the major components of decommissioning costs. Major costs categories for decommissioning a nuclear power plant are removal costs, radioactive waste costs, staffing costs, and other costs. The technical factors that impact the costs were analyzed based on the plants' decommissioning experiences. Detailed cost breakdowns by major projects and other cost categories from actual power plant decommissioning experiences will be presented. Such information will be useful in planning future decommissioning and designing new plants. (authors)

  15. Human factors in waste management

    SciTech Connect (OSTI)

    Moray, N.

    1994-10-01

    This article examines the role of human factors in radioactive waste management. Although few problems in ergonomics are special to radioactive waste management, some problems are unique, especially with long-term storage. The entire sociotechnical system must be looked at in order to see where improvement can take place, because operator errors, as seen in Chernobyl and Bhopal, are ultimately the result of management errors.

  16. Calculating Individual Resources Variability and Uncertainty Factors Based on Their Contributions to the Overall System Balancing Needs

    SciTech Connect (OSTI)

    Makarov, Yuri V.; Du, Pengwei; Pai, M. A.; McManus, Bart

    2014-01-14

    The variability and uncertainty of wind power production requires increased flexibility in power systems, or more operational reserves, to maintain a satisfactory level of reliability. The incremental increase in reserve requirement caused by wind power is often studied separately from the effects of loads. Accordingly, the cost of procuring reserves is allocated based on this simplification rather than on a fair and transparent calculation of the different resources' contributions to the reserve requirement. This work proposes a new allocation mechanism for the intermittency and variability of resources regardless of their type. It is based on a new formula, called the grid balancing metric (GBM). The proposed GBM has several distinct features: 1) it is directly linked to control performance standard (CPS) scores and interconnection frequency performance, 2) it provides scientifically defined allocation factors for individual resources, 3) the sum of allocation factors within any group of resources is equal to the group's collective allocation factor (linearity), and 4) it distinguishes helpers and harmers. The paper illustrates and provides results of the new approach based on actual transmission system operator (TSO) data.

  17. Cone Penetrometer N Factor Determination Testing Results

    SciTech Connect (OSTI)

    Follett, Jordan R.

    2014-03-05

    This document contains the results of testing activities to determine the empirical 'N Factor' for the cone penetrometer in kaolin clay simulant. The N Factor is used to relate resistance measurements taken with the cone penetrometer to shear strength.

  18. SU-D-204-05: Quantitative Comparison of a High Resolution Micro-Angiographic Fluoroscopic (MAF) Detector with a Standard Flat Panel Detector (FPD) Using the New Metric of Generalized Measured Relative Object Detectability (GM-ROD)

    SciTech Connect (OSTI)

    Russ, M; Ionita, C; Bednarek, D; Rudin, S

    2015-06-15

    Purpose: In endovascular image-guided neuro-interventions, visualization of fine detail is paramount. For example, the ability of the interventionist to visualize the stent struts depends heavily on the x-ray imaging detector performance. Methods: A study to examine the relative performance of the high resolution MAF-CMOS (pixel size 75µm, Nyquist frequency 6.6 cycles/mm) and a standard Flat Panel Detector (pixel size 194µm, Nyquist frequency 2.5 cycles/mm) detectors in imaging a neuro stent was done using the Generalized Measured Relative Object Detectability (GM-ROD) metric. Low quantum noise images of a deployed stent were obtained by averaging 95 frames obtained by both detectors without changing other exposure or geometric parameters. The square of the Fourier transform of each image is taken and divided by the generalized normalized noise power spectrum to give an effective measured task-specific signal-to-noise ratio. This expression is then integrated from 0 to each of the detector’s Nyquist frequencies, and the GM-ROD value is determined by taking a ratio of the integrals for the MAF-CMOS to that of the FPD. The lower bound of integration can be varied to emphasize high frequencies in the detector comparisons. Results: The MAF-CMOS detector exhibits vastly superior performance over the FPD when integrating over all frequencies, yielding a GM-ROD value of 63.1. The lower bound of integration was stepped up in increments of 0.5 cycles/mm for higher frequency comparisons. As the lower bound increased, the GM-ROD value was augmented, reflecting the superior performance of the MAF-CMOS in the high frequency regime. Conclusion: GM-ROD is a versatile metric that can provide quantitative detector and task dependent comparisons that can be used as a basis for detector selection. Supported by NIH Grant: 2R01EB002873 and an equipment grant from Toshiba Medical Systems Corporation.
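
    A hedged numerical sketch of a GM-ROD ratio as described, integrating |FFT|² divided by a generalized noise power spectrum from a chosen lower bound up to each detector's Nyquist frequency and then taking the ratio, is given below; the function signature, the radial-frequency masking, and the assumption that matching 2-D NPS arrays are already available are illustrative choices, not the authors' code.

```python
import numpy as np

def gm_rod(img_hi, img_std, nps_hi, nps_std, px_hi=0.075, px_std=0.194, f_min=0.0):
    """Ratio of task-specific signal-to-noise integrals (high-res over standard detector).

    img_*: frame-averaged (low quantum noise) stent images; nps_*: matching 2-D generalized
    normalized noise power spectra; px_*: pixel pitch in mm; f_min: lower bound, cycles/mm.
    """
    def snr_integral(img, nps, px):
        F2 = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
        ny, nx = img.shape
        fy = np.fft.fftshift(np.fft.fftfreq(ny, d=px))
        fx = np.fft.fftshift(np.fft.fftfreq(nx, d=px))
        fr = np.hypot(*np.meshgrid(fx, fy))          # radial spatial frequency, cycles/mm
        mask = (fr >= f_min) & (fr <= 0.5 / px)      # integrate up to this detector's Nyquist
        return np.sum(F2[mask] / nps[mask])          # discrete stand-in for the integral

    return snr_integral(img_hi, nps_hi, px_hi) / snr_integral(img_std, nps_std, px_std)
```

    Raising f_min above zero restricts the comparison to the high-frequency band, which is where the abstract reports the larger advantage for the higher-resolution detector.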

  19. Clothes Washer Test Cloth Correction Factor Information

    Broader source: Energy.gov [DOE]

    This page contains the information used to determine the test cloth correction factors for each test cloth lot.

  20. Human factors: a necessary tool for industry

    SciTech Connect (OSTI)

    Starcher, K.O.

    1984-03-09

    The need for human factors (ergonomics) input in the layout of a ferroelectric ceramics laboratory is presented as an example of the overall need for human factors professionals in industry. However, even in the absence of one trained in human factors, knowledge of a few principles in ergonomics will provide many possibilities for improving performance in the industrial environment.

  1. Antenna factorization in strongly ordered limits

    SciTech Connect (OSTI)

    Kosower, David A.

    2005-02-15

    When energies or angles of gluons emitted in a gauge-theory process are small and strongly ordered, the emission factorizes in a simple way to all orders in perturbation theory. I show how to unify the various strongly ordered soft, mixed soft-collinear, and collinear limits using antenna factorization amplitudes, which are generalizations of the Catani-Seymour dipole factorization function.

  2. Factors fragmenting the Russian Federation

    SciTech Connect (OSTI)

    Brown, E.

    1993-10-06

    This paper examines the factors that threaten the future of the Russian Federation (RF). The observations are based on a study that focused on eight republics: Mordova, Udmurtia, Tatarstan, Mari El, Bashkortostan, Kabardino-Balkaria, Buryatia, and Altay Republic. These republics were selected for their geographic and economic significance to the RF. Tatarstan, Bashkortostan, Udmurtia, and Mari El are located on important supply routes, such as the Volga River and the trans-Siberian railroad. Some of these republics are relatively wealthy, with natural resources such as oil (e.g., Tatarstan and Bashkortostan), and all eight republics play significant roles in the military-industrial complex. The importance of these republics to the RF contrasts to the relative insignificance of the independence-minded Northern Caucasus area. The author chose not to examine the Northern Caucasus region (except Kabardino-Balkaria) because these republics may have only a minor impact on the rest of the RF if they secede. Their impact would be minimized because they lie on the frontiers of the RF. Many Russians believe that "it might be best to let such a troublesome area secede."

  3. Factorization using the quadratic sieve algorithm

    SciTech Connect (OSTI)

    Davis, J.A.; Holdridge, D.B.

    1983-01-01

    Since the cryptosecurity of the RSA two key cryptoalgorithm is no greater than the difficulty of factoring the modulus (product of two secret primes), a code that implements the Quadratic Sieve factorization algorithm on the CRAY I computer has been developed at the Sandia National Laboratories to determine as sharply as possible the current state-of-the-art in factoring. Because all viable attacks on RSA thus far proposed are equivalent to factorization of the modulus, sharper bounds on the computational difficulty of factoring permit improved estimates for the size of RSA parameters needed for given levels of cryptosecurity. Analysis of the Quadratic Sieve indicates that it may be faster than any previously published general purpose algorithm for factoring large integers. The high speed of the CRAY I coupled with the capability of the CRAY to pipeline certain vectorized operations make this algorithm (and code) the front runner in current factoring techniques.

  4. Factorization using the quadratic sieve algorithm

    SciTech Connect (OSTI)

    Davis, J.A.; Holdridge, D.B.

    1983-12-01

    Since the cryptosecurity of the RSA two key cryptoalgorithm is no greater than the difficulty of factoring the modulus (product of two secret primes), a code that implements the Quadratic Sieve factorization algorithm on the CRAY I computer has been developed at the Sandia National Laboratories to determine as sharply as possible the current state-of-the-art in factoring. Because all viable attacks on RSA thus far proposed are equivalent to factorization of the modulus, sharper bounds on the computational difficulty of factoring permit improved estimates for the size of RSA parameters needed for given levels of cryptosecurity. Analysis of the Quadratic Sieve indicates that it may be faster than any previously published general purpose algorithm for factoring large integers. The high speed of the CRAY I coupled with the capability of the CRAY to pipeline certain vectorized operations make this algorithm (and code) the front runner in current factoring techniques.
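
    The quadratic sieve itself is too long to reproduce here, but the congruence-of-squares principle it accelerates fits in a few lines. The sketch below uses the simplest, Fermat-style search for an x with x² − n a perfect square; the sieve's contribution is to find such congruences far faster by collecting many smooth values of x² − n. The example modulus is a small textbook value, not one from the Sandia work.

```python
from math import gcd, isqrt

def congruence_of_squares_factor(n, max_tries=100_000):
    """Toy factorization: find x, y with x^2 ≡ y^2 (mod n) and gcd(x - y, n) nontrivial.
    Here we simply scan for x^2 - n being a perfect square (Fermat's method); the
    quadratic sieve builds the same kind of congruence from products of smooth residues."""
    for x in range(isqrt(n) + 1, isqrt(n) + 1 + max_tries):
        y2 = x * x - n
        y = isqrt(y2)
        if y * y == y2:                 # x^2 - n is a perfect square
            f = gcd(x - y, n)
            if 1 < f < n:
                return f, n // f
    return None

print(congruence_of_squares_factor(8051))   # (83, 97), since 8051 = 90**2 - 7**2
```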

  5. Synthetic heparin-binding growth factor analogs

    DOE Patents [OSTI]

    Pena, Louis A.; Zamora, Paul; Lin, Xinhua; Glass, John D.

    2007-01-23

    The invention provides synthetic heparin-binding growth factor analogs having at least one peptide chain that binds a heparin-binding growth factor receptor, covalently bound to a hydrophobic linker, which is in turn covalently bound to a non-signaling peptide that includes a heparin-binding domain. The synthetic heparin-binding growth factor analogs are useful as soluble biologics or as surface coatings for medical devices.

  6. Summary - Major Risk Factors Integrated Facility Disposition...

    Office of Environmental Management (EM)

    Office of Environmental Management (DOE-EM) External Technical Review of the Major Risk Factors Integrated Facility Disposition Project (IFDP) Oak Ridge, TN Why DOE-EM Did...

  7. CONTROL OF MECHANICALLY ACTIVATED POLYMERSOME FUSION: FACTORS...

    Office of Scientific and Technical Information (OSTI)

    MECHANICALLY ACTIVATED POLYMERSOME FUSION: FACTORS AFFECTING FUSION. Henderson, Ian M.; Paxton, Walter F Abstract not provided. Sandia National Laboratories (SNL-NM), Albuquerque,...

  8. EcoFactor Inc | Open Energy Information

    Open Energy Info (EERE)

    Name: EcoFactor Inc Place: Millbrae, California Zip: 94030 Product: California-based home energy management service provider. Coordinates: 37.60276, -122.395444

  9. CONTROL OF MECHANICALLY ACTIVATED POLYMERSOME FUSION: FACTORS...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Journal Article: CONTROL OF MECHANICALLY ACTIVATED POLYMERSOME FUSION: FACTORS AFFECTING FUSION. Citation Details In-Document Search Title: CONTROL OF MECHANICALLY ACTIVATED...

  10. Soliton form factors from lattice simulations

    SciTech Connect (OSTI)

    Rajantie, Arttu; Weir, David J.

    2010-12-01

    The form factor provides a convenient way to describe properties of topological solitons in the full quantum theory, when semiclassical concepts are not applicable. It is demonstrated that the form factor can be calculated numerically using lattice Monte Carlo simulations. The approach is very general and can be applied to essentially any type of soliton. The technique is illustrated by calculating the kink form factor near the critical point in 1+1-dimensional scalar field theory. As expected from universality arguments, the result agrees with the exactly calculable scaling form factor of the two-dimensional Ising model.

  11. Emission Factors (EMFAC) | Open Energy Information

    Open Energy Info (EERE)

    The EMission FACtors (EMFAC) model is used to calculate emission rates from all motor vehicles, ranging from passenger cars to heavy-duty trucks, operating on highways, freeways...

  12. Nonrelativistic QCD factorization and the velocity dependence...

    Office of Scientific and Technical Information (OSTI)

    CONFIGURATION; FACTORIZATION; MATRIX ELEMENTS; QUANTUM CHROMODYNAMICS; QUARKONIUM; SINGULARITY; T QUARKS; VELOCITY

  13. Industrial Power Factor Analysis Guidebook. Electrotek Concepts...

    Office of Scientific and Technical Information (OSTI)

    low power factors, increased conductor and transformer losses, and lower voltages. Utilities must supply both active and reactive power and compensate for these losses. Power...

  14. Major Risk Factors Integrated Facility Disposition Project -...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Integrated Facility Disposition Project - Oak Ridge Major Risk Factors Integrated Facility Disposition Project - Oak Ridge Full Document and Summary Versions are available for ...

  15. EM Corporate Performance Metrics, Complex Level

    Office of Environmental Management (EM)

    98,053 106,526 LLLLMW disposed Legacy (Stored) and NGW Cubic Meters 1,558,048 1,209,709 1,237,779 1,265,849 MAAs eliminated Number of Material Access Areas 35 30 30 30 Nuclear...

  16. Documentation for FY2003 GPRA metrics

    SciTech Connect (OSTI)

    None, None

    2002-02-01

    This report is broken into two sections: a summary section providing an overview of the benefits analysis of OPT technology R&D programs, and a detailed section providing specific information about the entire GPRA benefits process and each of the OPT programs.

  17. Comparison summary (key metrics and multiples)

    Broader source: Energy.gov (indexed) [DOE]

    variable lift, timing and duration to enable high efficiency engine combustion control | Department of Energy. Discusses development of an advanced variable valve actuation system to enable high-efficiency combustion, highlighting advances in system packaging while reducing cost. deer12_mccarthy.pdf (1.78 MB)

  18. Stochastic inverse problems: Models and metrics

    SciTech Connect (OSTI)

    Sabbagh, Elias H.; Sabbagh, Harold A.; Murphy, R. Kim; Aldrin, John C.; Annis, Charles; Knopp, Jeremy S.

    2015-03-31

    In past work, we introduced model-based inverse methods, and applied them to problems in which the anomaly could be reasonably modeled by simple canonical shapes, such as rectangular solids. In these cases the parameters to be inverted would be length, width and height, as well as the occasional probe lift-off or rotation. We are now developing a formulation that allows more flexibility in modeling complex flaws. The idea consists of expanding the flaw in a sequence of basis functions, and then solving for the expansion coefficients of this sequence, which are modeled as independent random variables, uniformly distributed over their range of values. There are a number of applications of such modeling: 1. Connected cracks and multiple half-moons, which we have noted in a POD set. Ideally we would like to distinguish connected cracks from one long shallow crack. 2. Cracks of irregular profile and shape which have appeared in cold work holes during bolt-hole eddy-current inspection. One side of such cracks is much deeper than the other. 3. L- or C-shaped crack profiles at the surface, examples of which have been seen in bolt-hole cracks. By formulating problems in a stochastic sense, we are able to leverage the stochastic global optimization algorithms in NLSE, which is resident in VIC-3D®, to answer questions of global minimization and to compute confidence bounds using the sensitivity coefficients that we get from NLSE. We will also address the issue of surrogate functions which are used during the inversion process, and how they contribute to the quality of the estimation of the bounds.
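
    The basis-expansion idea can be illustrated with a toy inversion: expand a flaw profile in a few basis functions, treat the coefficients as bounded unknowns, and recover them with a stochastic global optimizer. Everything below, the sine basis, the smoothing "probe response" standing in for the eddy-current forward model, and the use of SciPy's differential evolution in place of NLSE/VIC-3D®, is an illustrative stand-in rather than the authors' formulation.

```python
import numpy as np
from scipy.optimize import differential_evolution

x = np.linspace(0.0, 1.0, 50)
basis = np.array([np.sin((k + 1) * np.pi * x) for k in range(4)])  # 4 basis functions

def forward(coeffs):
    profile = coeffs @ basis
    kernel = np.ones(5) / 5.0            # toy linear "probe response" (running average)
    return np.convolve(profile, kernel, mode="same")

true_coeffs = np.array([1.0, 0.4, 0.0, 0.2])
data = forward(true_coeffs) + np.random.default_rng(2).normal(0, 0.01, x.size)

def misfit(coeffs):
    return np.sum((forward(coeffs) - data) ** 2)

# Expansion coefficients treated as independent, uniformly distributed unknowns on [0, 1.5]
result = differential_evolution(misfit, bounds=[(0.0, 1.5)] * 4, seed=0)
print(result.x)                          # recovered expansion coefficients
```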

  19. Clean Cities Annual Metrics Report 2008

    SciTech Connect (OSTI)

    Johnson, C.; Bergeron, P.

    2009-09-01

    This report summarizes the Department of Energy's Clean Cities coalition accomplishments in 2008, including petroleum displacement data, membership, funding, sales of alternative fuel blends, deployment of AFVs and HEVs, idle reduction initiatives, and fuel economy activities.

  20. EM Corporate Performance Metrics, Site Level

    Office of Environmental Management (EM)

    completed 1 1 1 1 Grand Junction Geographic Sites Eliminated Number completed 3 2 2 2 Inhalation Toxicology Laboratory LLLLMW disposed Legacy (Stored) and NGW Cubic Meters 359...

  1. Uranium Leasing Program: Lease Tract Metrics

    Broader source: Energy.gov [DOE]

    The Atomic Energy Act and other legislative actions authorized the U.S. Atomic Energy Commission (AEC), predecessor agency to the U.S. Department of Energy (DOE), to withdraw lands from the public...

  2. Comparison summary (key metrics and multiples)

    Energy Savers [EERE]

    ... Early Concern Over Slope Instability 10 (from McIver,1982) Cause Turbidity Currents Act as ... Documented Gas Release from the Seafloor 38 Sea surface Seafloor 100m ocean shear? 800m ...

  3. Clean Cities 2012 Annual Metrics Report

    SciTech Connect (OSTI)

    Johnson, C.

    2013-12-01

    The U.S. Department of Energy's (DOE) Clean Cities program advances the nation's economic, environmental, and energy security by supporting local actions to cut petroleum use in transportation. A national network of nearly 100 Clean Cities coalitions brings together stakeholders in the public and private sectors to deploy alternative and renewable fuels, idle-reduction measures, fuel economy improvements, and new transportation technologies, as they emerge. Each year DOE asks Clean Cities coordinators to submit annual reports of their activities and accomplishments for the previous calendar year. Data and information are submitted via an online database that is maintained as part of the Alternative Fuels Data Center (AFDC) at the National Renewable Energy Laboratory (NREL). Coordinators submit a range of data that characterizes the membership, funding, projects, and activities of their coalitions. They also submit data about sales of alternative fuels, deployment of alternative fuel vehicles (AFVs) and hybrid electric vehicles (HEVs), idle-reduction initiatives, fuel economy activities, and programs to reduce vehicle miles traveled (VMT). NREL analyzes the data and translates them into petroleum-use reduction impacts, which are summarized in this report.

  4. Evaluation Metrics Applied to Accident Tolerant Fuels

    SciTech Connect (OSTI)

    Shannon M. Bragg-Sitton; Jon Carmack; Frank Goldner

    2014-10-01

    The safe, reliable, and economic operation of the nation’s nuclear power reactor fleet has always been a top priority for the United States’ nuclear industry. Continual improvement of technology, including advanced materials and nuclear fuels, remains central to the industry’s success. Decades of research combined with continual operation have produced steady advancements in technology and have yielded an extensive base of data, experience, and knowledge on light water reactor (LWR) fuel performance under both normal and accident conditions. One of the current missions of the U.S. Department of Energy’s (DOE) Office of Nuclear Energy (NE) is to develop nuclear fuels and claddings with enhanced accident tolerance for use in the current fleet of commercial LWRs or in reactor concepts with design certifications (GEN-III+). Accident tolerance became a focus within advanced LWR research upon direction from Congress following the 2011 Great East Japan Earthquake, resulting tsunami, and subsequent damage to the Fukushima Daiichi nuclear power plant complex. The overall goal of ATF development is to identify alternative fuel system technologies to further enhance the safety, competitiveness and economics of commercial nuclear power. Enhanced accident tolerant fuels would endure loss of active cooling in the reactor core for a considerably longer period of time than the current fuel system while maintaining or improving performance during normal operations. The U.S. DOE is supporting multiple teams to investigate a number of technologies that may improve fuel system response and behavior in accident conditions, with team leadership provided by DOE national laboratories, universities, and the nuclear industry. Concepts under consideration offer both evolutionary and revolutionary changes to the current nuclear fuel system. Mature concepts will be tested in the Advanced Test Reactor at Idaho National Laboratory beginning in Summer 2014 with additional concepts being readied for insertion in fiscal year 2015. This paper provides a brief summary of the proposed evaluation process that would be used to evaluate and prioritize the candidate accident tolerant fuel concepts currently under development.

  5. Clean Cities 2012 Annual Metrics Report

    Alternative Fuels and Advanced Vehicles Data Center [Office of Energy Efficiency and Renewable Energy (EERE)]

    of Energy Office of Energy Efficiency & Renewable Energy Operated by the Alliance for Sustainable Energy, LLC. This report is available at no cost from the National Renewable...

  6. Efficient Synchronization Stability Metrics for Fault Clearing...

    Office of Scientific and Technical Information (OSTI)

    Technical Information Service, Springfield, VA at www.ntis.gov. Authors: Backhaus, Scott N. 1 ; Chertkov, Michael 1 ; Bent, Russell Whitford 1 ; Bienstock, Daniel 2...

  7. EECBG SEP Attachment 1 - Process metric list

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    systems installed (tons) 3e. Biomass (non-transport) system installed Number of biomass ... Bike lanes installed Length of bike lanes installed (linear feet) 12k. Vehicle ...

  8. Gauss Sum Factorization with Cold Atoms

    SciTech Connect (OSTI)

    Gilowski, M.; Wendrich, T.; Mueller, T.; Ertmer, W.; Rasel, E. M. [Institut fuer Quantenoptik, Leibniz Universitaet Hannover, Welfengarten 1, D-30167 Hannover (Germany); Jentsch, Ch. [Astrium GmbH-Satellites, 88039 Friedrichshafen (Germany); Schleich, W. P. [Institut fuer Quantenphysik, Universitaet Ulm, Albert-Einstein-Allee 11, D-89081 Ulm (Germany)

    2008-01-25

    We report the first implementation of a Gauss sum factorization algorithm by an internal state Ramsey interferometer using cold atoms. A sequence of appropriately designed light pulses interacts with an ensemble of cold rubidium atoms. The final population in the involved atomic levels determines a Gauss sum. With this technique we factor the number N=263193.
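
    Numerically, the truncated Gauss sum that the Ramsey pulse sequence realizes can be sketched as below; the truncation order M and the trial-factor range are illustrative choices, not the experimental parameters.

```python
import cmath, math

def gauss_sum(N, ell, M=30):
    """Truncated Gauss sum A_N^(M)(ell); |A| = 1 exactly when ell divides N."""
    s = sum(cmath.exp(2j * math.pi * m * m * N / ell) for m in range(M + 1))
    return s / (M + 1)

N = 263193                       # the number factored in the cold-atom experiment
hits = [ell for ell in range(2, 200) if abs(gauss_sum(N, ell)) > 0.9]
print(hits)                      # divisors such as 3, 7, 21, 83, 151 give |A| = 1;
                                 # with too small an M, non-divisors ("ghost factors")
                                 # can also creep above the threshold
```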

  9. Synthetic heparin-binding factor analogs

    DOE Patents [OSTI]

    Pena, Louis A.; Zamora, Paul O.; Lin, Xinhua; Glass, John D.

    2010-04-20

    The invention provides synthetic heparin-binding growth factor analogs having at least one peptide chain, and preferably two peptide chains branched from a dipeptide branch moiety composed of two trifunctional amino acid residues, which peptide chain or chains bind a heparin-binding growth factor receptor and are covalently bound to a non-signaling peptide that includes a heparin-binding domain, preferably by a linker, which may be a hydrophobic linker. The synthetic heparin-binding growth factor analogs are useful as pharmaceutical agents, soluble biologics or as surface coatings for medical devices.

  10. Relativistic Thomson Scatter Form Factor Calculation

    Energy Science and Technology Software Center (OSTI)

    2009-11-01

    The purpose of this program is to calculate the fully relativistic Thomson scatter form factor in unmagnetized plasmas. Such calculations are compared to experimental diagnoses of plasmas at such facilities as the Jupiter laser facility here at LLNL.

  11. Carbon Dioxide Emission Factors for Coal

    Reports and Publications (EIA)

    1994-01-01

    The Energy Information Administration (EIA) has developed factors for estimating the amount of carbon dioxide emitted, accounting for differences among coals, to reflect the changing "mix" of coal in U.S. coal consumption.

  12. Industrial Power Factor Analysis Guidebook. (Technical Report...

    Office of Scientific and Technical Information (OSTI)

    Power factor is a way of measuring the percentage of reactive power in an electrical system. Reactive power represents wasted energy--electricity that does no useful work because ...

  13. Proton form factor effects in hydrogenic atoms

    SciTech Connect (OSTI)

    Daza, F. Garcia; Kelkar, N. G.; Nowakowski, M.

    2011-10-21

    The proton structure corrections to the hyperfine splittings in electronic and muonic hydrogen are evaluated using the Breit potential with electromagnetic form factors. In contrast to other methods, the Breit equation with q²-dependent form factors is just an extension of the standard Breit equation which gives the hyperfine splitting Hamiltonian. Precise QED corrections are comparable to the structure corrections which therefore need to be evaluated ab initio.

  14. Factors affecting robust retail energy markets

    SciTech Connect (OSTI)

    Michelman, T.S.

    1999-04-01

    This paper briefly defines an active retail market, details the factors that influence market activity and their relative importance, compares activity in various retail energy markets to date, and predicts future retail energy market activity. Three primary factors translate into high market activity: supplier margins, translated into potential savings for actively shopping customers; market size; and market barriers. The author surveys activity nationwide and predicts hot spots for the coming year.

  15. Hadronic form factors in kaon photoproduction

    SciTech Connect (OSTI)

    Syukurilla, L.; Mart, T.

    2014-09-25

    We have revisited the effect of hadronic form factors in kaon photoproduction process by utilizing an isobaric model developed for kaon photoproduction off the proton. The model is able to reproduce the available experimental data nicely as well as to reveal the origin of the second peak in the total cross section, which was the main source of confusion for decades. Different from our previous study, in the present work we explore the possibility of using different hadronic form factors in each of the K?N vertices. The use of different hadronic form factors, e.g. dipole, Gaussian, and generalized dipole, has been found to produce a more flexible isobar model, which can provide a significant improvement in the model.

  16. Power-factor metering gains new interest

    SciTech Connect (OSTI)

    Womack, D.L.

    1980-01-01

    The combined effect of increased energy costs, advances in digital metering techniques, and regulatory pressures is stimulating utility interest in charging smaller customers the full cost of their burden on the electric system, by metering reactive power and billing for poor power factor. Oklahoma Gas and Electric Co. adopted the Q-meter method, made practical with the advent of magnetic-tape metering. Digital metering and new techniques now being developed will add more options for utilities interested in metering power factor. There are three commonly used methods of determining power factor, all of which require the use of the standard induction watthour meter, plus at least one other meter, to obtain a second value in the power triangle. In all cases, the third value, if required, is obtained by calculation.

  17. Measurement of the Helium Form Factors at JLab

    SciTech Connect (OSTI)

    Elena Khrosinkova

    2007-06-11

    An experiment to measure elastic electron scattering off 3He and 4He at large momentum transfers is presented. The experiment was carried out in the Hall A Facility of Jefferson Lab. Elastic electron scattering off 3He was measured at forward and backward electron scattering angles to extract the isotope's charge and magnetic form factors. The charge form factor of 4He will be extracted from forward-angle electron scattering measurements. The data are expected to significantly extend and improve the existing measurements of the three- and four-body form factors. The results will be crucial for the establishment of a canonical standard model for the few-body nuclear systems and for testing predictions of quark dimensional scaling and hybrid nucleon-quark models.
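
    The forward/backward-angle strategy amounts to a Rosenbluth-type separation: for a spin-1/2 target such as 3He the reduced cross section at fixed Q² is linear in the virtual-photon polarization ε, σ_red = ε·G_E² + τ·G_M² with τ = Q²/(4M²), so two angle settings determine both form factors. The sketch below solves that 2×2 system with made-up inputs; the numbers are placeholders, not experimental values.

```python
import numpy as np

tau = 0.05                                # Q^2 / (4 M^2), held fixed
eps = np.array([0.9, 0.2])                # virtual-photon polarization: forward, backward angle
sigma_reduced = np.array([0.365, 0.085])  # made-up reduced cross sections

A = np.column_stack([eps, np.full_like(eps, tau)])
ge2, gm2 = np.linalg.solve(A, sigma_reduced)
print(ge2, gm2)                           # recovers G_E^2 = 0.4, G_M^2 = 0.1 for these inputs
```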

  18. Human factors in nuclear technology - a history

    SciTech Connect (OSTI)

    Jones, D.B. )

    1992-01-01

    Human factors, human factors engineering (HFE), or ergonomics did not receive much formal attention in nuclear technology prior to the Three Mile Island Unit 2 (TMI-2) incident. Three principal reasons exist for this lack of concern. First, emerging technologies show little concern with how people will use a new system. Making the new technology work is considered more important than the people who will use it. Second, the culture of the users of nuclear power did not recognize a need for human factors. Traditional utilities had well established and effective engineering designs for control of electric power generation, while medicine considered the use of nuclear isotopes another useful tool, not requiring special ergonomics. Finally, the nuclear industry owed much to Admiral Rickover. He was definitely opposed.

  19. Chiral extrapolation of nucleon magnetic form factors

    SciTech Connect (OSTI)

    P. Wang; D. Leinweber; A. W. Thomas; R.Young

    2007-04-01

    The extrapolation of nucleon magnetic form factors calculated within lattice QCD is investigated within a framework based upon heavy baryon chiral effective-field theory. All one-loop graphs are considered at arbitrary momentum transfer and all octet and decuplet baryons are included in the intermediate states. Finite range regularization is applied to improve the convergence in the quark-mass expansion. At each value of the momentum transfer (Q²), a separate extrapolation to the physical pion mass is carried out as a function of m_π alone. Because of the large values of Q² involved, the role of the pion form factor in the standard pion-loop integrals is also investigated. The resulting values of the form factors at the physical pion mass are compared with experimental data as a function of Q² and demonstrate the utility and accuracy of the chiral extrapolation methods presented herein.

  20. Communication-avoiding symmetric-indefinite factorization

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Ballard, Grey Malone; Becker, Dulcenia; Demmel, James; Dongarra, Jack; Druinsky, Alex; Peled, Inon; Schwartz, Oded; Toledo, Sivan; Yamazaki, Ichitaro

    2014-11-13

    We describe and analyze a novel symmetric triangular factorization algorithm. The algorithm is essentially a block version of Aasen's triangular tridiagonalization. It factors a dense symmetric matrix A as the product A = P L T Lᵀ Pᵀ where P is a permutation matrix, L is lower triangular, and T is block tridiagonal and banded. The algorithm is the first symmetric-indefinite communication-avoiding factorization: it performs an asymptotically optimal amount of communication in a two-level memory hierarchy for almost any cache-line size. Adaptations of the algorithm to parallel computers are likely to be communication efficient as well; one such adaptation has been recently published. As a result, the current paper describes the algorithm, proves that it is numerically stable, and proves that it is communication optimal.

  1. Communication-avoiding symmetric-indefinite factorization

    SciTech Connect (OSTI)

    Ballard, Grey Malone; Becker, Dulcenia; Demmel, James; Dongarra, Jack; Druinsky, Alex; Peled, Inon; Schwartz, Oded; Toledo, Sivan; Yamazaki, Ichitaro

    2014-11-13

    We describe and analyze a novel symmetric triangular factorization algorithm. The algorithm is essentially a block version of Aasen's triangular tridiagonalization. It factors a dense symmetric matrix A as the product A = P L T Lᵀ Pᵀ where P is a permutation matrix, L is lower triangular, and T is block tridiagonal and banded. The algorithm is the first symmetric-indefinite communication-avoiding factorization: it performs an asymptotically optimal amount of communication in a two-level memory hierarchy for almost any cache-line size. Adaptations of the algorithm to parallel computers are likely to be communication efficient as well; one such adaptation has been recently published. As a result, the current paper describes the algorithm, proves that it is numerically stable, and proves that it is communication optimal.
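
    For orientation, a standard (non-communication-avoiding) symmetric-indefinite factorization is available in SciPy as an LDLᵀ routine based on Bunch–Kaufman pivoting; the sketch below only shows the kind of factorization being computed, not the block-Aasen algorithm of the paper.

```python
import numpy as np
from scipy.linalg import ldl

rng = np.random.default_rng(3)
B = rng.standard_normal((6, 6))
A = B + B.T                               # symmetric and, in general, indefinite

lu, d, perm = ldl(A, lower=True)          # A = lu @ d @ lu.T, d block diagonal (1x1/2x2 pivots)
print(np.allclose(lu @ d @ lu.T, A))      # True: the factorization reconstructs A
print(np.linalg.eigvalsh(A))              # mixed-sign eigenvalues confirm indefiniteness
```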

  2. Human factors challenges for advanced process control

    SciTech Connect (OSTI)

    Stubler, W.F.; O'Hara, J.M.

    1996-08-01

    New human-system interface technologies provide opportunities for improving operator and plant performance. However, if these technologies are not properly implemented, they may introduce new challenges to performance and safety. This paper reports the results from a survey of human factors considerations that arise in the implementation of advanced human-system interface technologies in process control and other complex systems. General trends were identified for several areas based on a review of technical literature and a combination of interviews and site visits with process control organizations. Human factors considerations are discussed for two of these areas, automation and controls.

  3. Annotated bibliography of human factors applications literature

    SciTech Connect (OSTI)

    McCafferty, D.B.

    1984-09-30

    This bibliography was prepared as part of the Human Factors Technology Project, FY 1984, sponsored by the Office of Nuclear Safety, US Department of Energy. The project was conducted by Lawrence Livermore National Laboratory, with Essex Corporation as a subcontractor. The material presented here is a revision and expansion of the bibliographic material developed in FY 1982 as part of a previous Human Factors Technology Project. The previous bibliography was published September 30, 1982, as Attachment 1 to the FY 1982 Project Status Report.

  4. Meson-photon transition form factors

    SciTech Connect (OSTI)

    Balakireva, Irina; Lucha, Wolfgang; Melikhov, Dmitri

    2012-10-23

    We present the results of our recent analysis of the meson-photon transition form factors F{sub P{gamma}}(Q{sup 2}) for the pseudoscalar mesons P = {pi}{sup 0}, {eta}, {eta}{prime}, {eta}{sub c}, using the local-duality version of QCD sum rules.

  5. Factors Affecting PMU Installation Costs (October 2014)

    Broader source: Energy.gov [DOE]

    The Department of Energy investigated the major cost factors that affected PMU installation costs for the synchrophasor projects funded through the Recovery Act Smart Grid Programs. The data was compiled through interviews with the nine projects that deployed production grade synchrophasor systems.

  6. Derivation of dose conversion factors for tritium

    SciTech Connect (OSTI)

    Killough, G. G.

    1982-03-01

    For a given intake mode (ingestion, inhalation, absorption through the skin), a dose conversion factor (DCF) is the committed dose equivalent to a specified organ of an individual per unit intake of a radionuclide. One also may consider the effective dose commitment per unit intake, which is a weighted average of organ-specific DCFs, with weights proportional to risks associated with stochastic radiation-induced fatal health effects, as defined by Publication 26 of the International Commission on Radiological Protection (ICRP). This report derives and tabulates organ-specific dose conversion factors and the effective dose commitment per unit intake of tritium. These factors are based on a steady-state model of hydrogen in the tissues of ICRP's Reference Man (ICRP Publication 23) and equilibrium of specific activities between body water and other tissues. The results differ by 27 to 33% from the estimate on which ICRP Publication 30 recommendations are based. The report also examines a dynamic model of tritium retention in body water, mineral bone, and two compartments representing organically-bound hydrogen. This model is compared with data from human subjects who were observed for extended periods. The manner of combining the dose conversion factors with measured or model-predicted levels of contamination in man's exposure media (air, drinking water, soil moisture) to estimate dose rate to an individual is briefly discussed.
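
    As a worked illustration of how organ-specific dose conversion factors combine into an effective dose commitment, the sketch below forms the risk-weighted average described above. The organ list, weighting factors, and DCF values are placeholders chosen for illustration, not the tritium results of this report.

        # Illustrative only: effective dose commitment per unit intake as a
        # tissue-weighted sum of organ-specific dose conversion factors (DCFs).
        # All numbers below are hypothetical placeholders.
        organ_dcf = {            # committed dose equivalent per unit intake (Sv/Bq)
            "gonads": 1.8e-11,
            "lung": 1.7e-11,
            "red_marrow": 1.7e-11,
            "thyroid": 1.7e-11,
            "remainder": 1.7e-11,
        }
        tissue_weight = {        # risk-based weighting factors, lumped so they sum to 1
            "gonads": 0.25,
            "lung": 0.12,
            "red_marrow": 0.12,
            "thyroid": 0.03,
            "remainder": 0.48,
        }
        effective_dcf = sum(tissue_weight[t] * organ_dcf[t] for t in organ_dcf)
        print(f"effective dose commitment per unit intake: {effective_dcf:.2e} Sv/Bq")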

  7. Module: Emission Factors for Deforestation | Open Energy Information

    Open Energy Info (EERE)

    www.leafasia.org/tools/technical-guidance-series-emission-factors-defo Cost: Free Language: English Module: Emission Factors for Deforestation Screenshot Logo: Module: Emission...

  8. Engineering an allosteric transcription factor to respond to...

    Office of Scientific and Technical Information (OSTI)

    Engineering an allosteric transcription factor to respond to new ligands Citation Details In-Document Search Title: Engineering an allosteric transcription factor to respond to new ...

  9. Test of factorization in diffractive deep inelastic scattering...

    Office of Scientific and Technical Information (OSTI)

    Test of factorization in diffractive deep inelastic scattering and photoproduction at HERA Citation Details In-Document Search Title: Test of factorization in diffractive deep ...

  10. Dense LU Factorization on Multicore Supercomputer Nodes (Conference...

    Office of Scientific and Technical Information (OSTI)

    factorization's memory hierarchy contention on now-ubiquitous multi-core architectures. ... During active panel factorization, rank-1 updates stream through memory with minimal ...

  11. Study of Factors Affecting Shrub Establishment on the Monticello...

    Office of Environmental Management (EM)

    Study of Factors Affecting Shrub Establishment on the Monticello, Utah, Disposal Cell Cover Study of Factors Affecting Shrub Establishment on the Monticello, Utah, Disposal Cell...

  12. Neutrino mass, dark energy, and the linear growth factor (Journal...

    Office of Scientific and Technical Information (OSTI)

    dark energy, and the linear growth factor Citation Details In-Document Search Title: Neutrino mass, dark energy, and the linear growth factor We study the degeneracies between ...

  13. Is the proton electromagnetic form factor modified in nuclei...

    Office of Scientific and Technical Information (OSTI)

    Is the proton electromagnetic form factor modified in nuclei? Citation Details In-Document Search Title: Is the proton electromagnetic form factor modified in nuclei? You are ...

  14. Method for determining formation quality factor from well log...

    Office of Scientific and Technical Information (OSTI)

    factor from well log data and its application to seismic reservoir characterization Citation Details In-Document Search Title: Method for determining formation quality factor ...

  15. Initiation factor 2 crystal structure reveals a different domain...

    Office of Scientific and Technical Information (OSTI)

    Initiation factor 2 crystal structure reveals a different domain organization from eukaryotic initiation factor 5B and mechanism among translational GTPases Citation Details ...

  16. Theory of factors limiting high gradient operation of warm acceleratin...

    Office of Scientific and Technical Information (OSTI)

    Theory of factors limiting high gradient operation of warm accelerating structures Citation Details In-Document Search Title: Theory of factors limiting high gradient operation of ...

  17. Crystal structure of elongation factor 4 bound to a clockwise...

    Office of Scientific and Technical Information (OSTI)

    Crystal structure of elongation factor 4 bound to a clockwise ratcheted ribosome Citation Details In-Document Search Title: Crystal structure of elongation factor 4 bound to a ...

  18. First Climate formerly Factor Consulting | Open Energy Information

    Open Energy Info (EERE)

    First Climate formerly Factor Consulting Jump to: navigation, search Name: First Climate (formerly Factor Consulting) Place: Germany Sector: Carbon Product: Former Swiss-based...

  19. Research on Factors Relating to Density and Climate Change |...

    Open Energy Info (EERE)

    on Factors Relating to Density and Climate Change Jump to: navigation, search Tool Summary LAUNCH TOOL Name: Research on Factors Relating to Density and Climate Change Agency...

  20. Human Factors Engineering Analysis Tool - Energy Innovation Portal

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Human Factors Engineering Analysis Tool Software tool that enables easy and quick selection of applicable regulatory guidelines as starting point for human factors engineering ...

  1. HUMAN FACTORS GUIDANCE FOR CONTROL ROOM EVALUATION

    SciTech Connect (OSTI)

    O'Hara, J.; Brown, W.; Stubler, W.; Higgins, J.; Wachtel, J.; Persensky, J. J.

    2000-07-30

    The Human-System Interface Design Review Guideline (NUREG-0700, Revision 1) was developed by the US Nuclear Regulatory Commission (NRC) to provide human factors guidance as a basis for the review of advanced human-system interface technologies. The guidance consists of three components: design review procedures, human factors engineering guidelines, and a software application to support design reviews called the "Design Review Guideline." Since it was published in June 1996, Rev. 1 to NUREG-0700 has been used successfully by NRC staff, contractors and nuclear industry organizations, as well as by interested organizations outside the nuclear industry. The NRC has committed to the periodic update and improvement of the guidance to ensure that it remains a state-of-the-art design evaluation tool in the face of emerging and rapidly changing technology. This paper addresses the current research to update NUREG-0700 based on the substantial work that has taken place since the publication of Revision 1.

  2. Understanding Hazardous Combustion Byproducts Reduces Factors Impacting

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Climate Change Hazardous Combustion Byproducts Reduces Factors Impacting Climate Change - Sandia Energy

  3. Transcription factor-based biosensors for detecting dicarboxylic acids

    DOE Patents [OSTI]

    Dietrich, Jeffrey; Keasling, Jay

    2014-02-18

    The invention provides methods and compositions for detecting dicarboxylic acids using a transcription factor biosensor.

  4. Various factors affect coiled tubing limits

    SciTech Connect (OSTI)

    Yang, Y.S.

    1996-01-15

    Safety and reliability remain the primary concerns in coiled tubing operations. Factors affecting safety and reliability include corrosion, flexural bending, internal (or external) pressure and tension (or compression), and mechanical damage due to improper use. Such limits as coiled tubing fatigue, collapse, and buckling need to be understood to avoid disaster. With increased use of coiled tubing, operators will gain more experience. But at the same time, with further research and development of coiled tubing, the manufacturing quality will be improved and fatigue, collapse, and buckling models will become more mature, and eventually standard specifications will be available. This paper reviews the uses of coiled tubing and current research on mechanical behavior of said tubing. It also discusses several models used to help predict fatigue and failure levels.

  5. Chiral corrections to hyperon axial form factors

    SciTech Connect (OSTI)

    Jiang, Fu-Jiun; Tiburzi, B. C.

    2008-05-01

    We study the complete set of flavor-changing hyperon axial-current matrix elements at small momentum transfer. Using partially quenched heavy baryon chiral perturbation theory, we derive the chiral and momentum behavior of the axial and induced pseudoscalar form factors. The meson pole contributions to the latter possess a striking signal for chiral physics. We argue that the study of hyperon axial matrix elements enables a systematic lattice investigation of the efficacy of three-flavor chiral expansions in the baryon sector. This can be achieved by considering chiral corrections to SU(3) symmetry predictions, and their partially quenched generalizations. In particular, despite the presence of eight unknown low-energy constants, we are able to make next-to-leading order symmetry breaking predictions for two linear combinations of axial charges.

  6. Identification and Control of Factors that Affect EGR Cooler Fouling

    Broader source: Energy.gov [DOE]

    Key factors that cause exhaust gas recirculation cooler fouling were identified through extensive literature search and controlled experiment was devised to study the impact of a few key factors on deposition.

  7. Confinement and the safety factor profile

    SciTech Connect (OSTI)

    Batha, S.H.; Levinton, F.M.; Scott, S.D.

    1995-12-01

    The conjecture that the safety factor profile, q(r), controls the improvement in tokamak plasmas from poor confinement in the Low (L-) mode regime to improved confinement in the supershot regime has been tested in two experiments on the Tokamak Fusion Test Reactor (TFTR). First, helium was puffed into the beam-heated phase of a supershot discharge which induced a degradation from supershot to L-mode confinement in about 100 msec, far less than the current relaxation time. The q and shear profiles measured by a motional Stark effect polarimeter showed little change during the confinement degradation. Second, rapid current ramps in supershot plasmas altered the q profile, but were observed not to change significantly the energy confinement. Thus, enhanced confinement in supershot plasmas is not due to a particular q profile which has enhanced stability or transport properties. The discharges making a continuous transition between supershot and L-mode confinement were also used to test the critical-electron-temperature-gradient transport model. It was found that this model could not reproduce the large changes in electron and ion temperature caused by the change in confinement.

  8. Human Factors Aspects of Operating Small Reactors

    SciTech Connect (OSTI)

    O'Hara, J. M.; Higgins, J.; Deem, R.; Xing, J.; D'Agostino, A.

    2010-11-07

    The nuclear-power community has reached the stage of proposing advanced reactor designs to support power generation for decades to come. They are considering small modular reactors (SMRs) as one approach to meet these energy needs. While the power output of individual reactor modules is relatively small, they can be grouped to produce reactor sites with different outputs. Also, they can be designed to generate hydrogen or to provide process heat. Many characteristics of SMRs are quite different from those of current plants, and so may require a concept of operations (ConOps) that also is different. The U.S. Nuclear Regulatory Commission (NRC) has begun examining the human factors engineering (HFE) and ConOps aspects of SMRs; if needed, they will formulate guidance to support SMR licensing reviews. We developed a ConOps model, consisting of the following dimensions: plant mission; roles and responsibilities of all agents; staffing, qualifications, and training; management of normal operations; management of off-normal conditions and emergencies; and management of maintenance and modifications. We are reviewing information on SMR design to obtain data about each of these dimensions, and have identified several preliminary issues. In addition, we are obtaining operations-related information from other types of multi-module systems, such as refineries, to identify lessons learned from their experience. Here, we describe the project's methodology and our preliminary findings.

  9. LPS-inducible factor(s) from activated macrophages mediates cytolysis of Naegleria fowleri amoebae

    SciTech Connect (OSTI)

    Cleary, S.F.; Marciano-Cabral, F.

    1986-03-01

    Soluble cytolytic factors of macrophage origin have previously been described with respect to their tumoricidal activity. The purpose of this study was to investigate the mechanism and possible factor(s) responsible for cytolysis of the amoeba Naegleria fowleri by activated peritoneal macrophages from B6C3F1 mice. Macrophages or conditioned medium (CM) from macrophage cultures were incubated with {sup 3}H-uridine-labeled amoebae. Percent specific release of label served as an index of cytolysis. Bacille Calmette-Guerin (BCG) and Corynebacterium parvum macrophages demonstrated significant cytolysis of amoebae at 24 h with an effector to target ratio of 10:1. Treatment of macrophages with inhibitors of RNA or protein synthesis blocked amoebicidal activity. Interposition of a 1 {mu}m pore membrane between macrophages and amoebae inhibited killing. Inhibition in the presence of the membrane was overcome by stimulating the macrophages with LPS. CM from LPS-stimulated, but not unstimulated, cultures of activated macrophages was cytotoxic for amoebae. The activity was heat sensitive and was recovered from ammonium sulfate precipitation of the CM. Results indicate that amoebicidal activity is mediated by a protein(s) of macrophage origin induced by target cell contact or stimulation with LPS.

  10. Nominal Performance Biosphere Dose Conversion Factor Analysis

    SciTech Connect (OSTI)

    Wasiolek, Maryla A.

    2000-12-21

    The purpose of this report was to document the process leading to development of the Biosphere Dose Conversion Factors (BDCFs) for the postclosure nominal performance of the potential repository at Yucca Mountain. BDCF calculations concerned twenty-four radionuclides. This selection included sixteen radionuclides that may be significant nominal performance dose contributors during the compliance period of up to 10,000 years, five additional radionuclides of importance for up to 1 million years postclosure, and three relatively short-lived radionuclides important for the human intrusion scenario. Consideration of radionuclide buildup in soil caused by previous irrigation with contaminated groundwater was taken into account in the BDCF development. The effect of climate evolution, from the current arid conditions to a wetter and cooler climate, on the BDCF values was evaluated. The analysis included consideration of different exposure pathways' contributions to the BDCFs. Calculations of nominal performance BDCFs used the GENII-S computer code in a series of probabilistic realizations to propagate the uncertainties of input parameters into the output. BDCFs for the nominal performance, when combined with the concentrations of radionuclides in groundwater, allow calculation of potential radiation doses to the receptor of interest. Calculated estimates of radionuclide concentration in groundwater result from the saturated zone modeling. The integration of the biosphere modeling results (BDCFs) with the outcomes of the other component models is accomplished in the Total System Performance Assessment (TSPA) to calculate doses to the receptor of interest from radionuclides postulated to be released to the environment from the potential repository at Yucca Mountain.

  11. Investigation of the effects of cell model and subcellular location of gold nanoparticles on nuclear dose enhancement factors using Monte Carlo simulation

    SciTech Connect (OSTI)

    Cai, Zhongli; Chattopadhyay, Niladri; Kwon, Yongkyu Luke; Pignol, Jean-Philippe; Lechtman, Eli; Reilly, Raymond M.; Department of Medical Imaging, University of Toronto, Toronto, Ontario M5S 3E2; Toronto General Research Institute, University Health Network, Toronto, Ontario M5G 2C4

    2013-11-15

    Purpose: The authors' aims were to model how various factors influence radiation dose enhancement by gold nanoparticles (AuNPs) and to propose a new modeling approach to the dose enhancement factor (DEF). Methods: The authors used Monte Carlo N-particle (MCNP 5) computer code to simulate photon and electron transport in cells. The authors modeled human breast cancer cells as a single cell, a monolayer, or a cluster of cells. Different numbers of 5, 30, or 50 nm AuNPs were placed in the extracellular space, on the cell surface, in the cytoplasm, or in the nucleus. Photon sources examined in the simulation included nine monoenergetic x-rays (10-100 keV), an x-ray beam (100 kVp), and {sup 125}I and {sup 103}Pd brachytherapy seeds. Both nuclear and cellular dose enhancement factors (NDEFs, CDEFs) were calculated. The ability of these metrics to predict the experimental DEF based on the clonogenic survival of MDA-MB-361 human breast cancer cells exposed to AuNPs and x-rays was compared. Results: NDEFs show a strong dependence on photon energies with peaks at 15, 30/40, and 90 keV. Cell model and subcellular location of AuNPs influence the peak position and value of NDEF. NDEFs decrease in the order of AuNPs in the nucleus, cytoplasm, cell membrane, and extracellular space. NDEFs also decrease in the order of AuNPs in a cell cluster, monolayer, and single cell if the photon energy is larger than 20 keV. NDEFs depend linearly on the number of AuNPs per cell. Similar trends were observed for CDEFs. NDEFs using the monolayer cell model were more predictive than either single cell or cluster cell models of the DEFs experimentally derived from the clonogenic survival of cells cultured as a monolayer. The amount of AuNPs required to double the prescribed dose in terms of mg Au/g tissue decreases as the size of AuNPs increases, especially when AuNPs are in the nucleus and the cytoplasm. For 40 keV x-rays and a cluster of cells, to double the prescribed x-ray dose (NDEF = 2

  12. Dose factor entry and display tool for BNCT radiotherapy

    DOE Patents [OSTI]

    Wessol, Daniel E.; Wheeler, Floyd J.; Cook, Jeremy L.

    1999-01-01

    A system for use in Boron Neutron Capture Therapy (BNCT) radiotherapy planning where a biological distribution is calculated using a combination of conversion factors and a previously calculated physical distribution. Conversion factors are presented in a graphical spreadsheet so that a planner can easily view and modify the conversion factors. For radiotherapy in multi-component modalities, such as Fast-Neutron and BNCT, it is necessary to combine each conversion factor component to form an effective dose which is used in radiotherapy planning and evaluation. The Dose Factor Entry and Display System is designed to facilitate planner entry of appropriate conversion factors in a straightforward manner for each component. The effective isodose is then immediately computed and displayed over the appropriate background (e.g. digitized image).
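
    The patent describes combining conversion-factor components with a previously calculated physical dose distribution to form an effective dose. The sketch below shows that kind of weighted combination at a single grid point; the component names and numerical values are hypothetical, not those of a validated BNCT plan or of this invention's software.

        # Hypothetical example of forming an effective dose from physical dose
        # components and user-entered conversion factors (all values are made up).
        physical_dose = {        # Gy, per component at one point of the dose grid
            "boron": 8.0,
            "thermal_neutron": 1.2,
            "fast_neutron": 0.9,
            "gamma": 1.5,
        }
        conversion_factor = {    # dimensionless RBE-like weights, hypothetical
            "boron": 3.8,
            "thermal_neutron": 3.2,
            "fast_neutron": 3.2,
            "gamma": 1.0,
        }
        effective_dose = sum(conversion_factor[c] * physical_dose[c] for c in physical_dose)
        print(f"effective dose at this point: {effective_dose:.1f} Gy-equivalent")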

  13. Major Risk Factors to the Integrated Facility Disposition Project |

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Department of Energy to the Integrated Facility Disposition Project Major Risk Factors to the Integrated Facility Disposition Project The scope of the Integrated Facility Disposition Project (IFDP) needs to comprehensively address a wide range of environmental management risks at the Oak Ridge Reservation (ORO). Major Risk Factors to the Integrated Facility Disposition Project (227.35 KB) More Documents & Publications Major Risk Factors Integrated Facility Disposition Project - Oak Ridge

  14. Fragmentation, NRQCD and Factorization in Heavy Quarkonium Production...

    Office of Scientific and Technical Information (OSTI)

    However, we show that gauge invariance and factorization require that conventional NRQCD production matrix elements be modified to include Wilson lines or non-abelian gauge links. ...

  15. Electromagnetic form factors and the hypercentral constituent quark model

    SciTech Connect (OSTI)

    Sanctis, M. De; Giannini, M. M.; Santopinto, E.; Vassallo, A.

    2007-12-15

    We present new results concerning the electromagnetic form factors of the nucleon using a relativistic version of the hypercentral constituent quark model and a relativistic current.

  16. A Compendium of Transfer Factors for Agricultural and Animal...

    Office of Scientific and Technical Information (OSTI)

    Tables of transfer factors are listed by element and information source for beef, eggs, fish, fruit, grain, leafy vegetation, milk, poultry, and root vegetables. Authors: Staven, ...

  17. Analytical evaluation of atomic form factors: Application to Rayleigh scattering

    SciTech Connect (OSTI)

    Safari, L.; Santos, J. P.; Amaro, P.; Jänkälä, K.; Fratini, F.

    2015-05-15

    Atomic form factors are widely used for the characterization of targets and specimens, from crystallography to biology. By using recent mathematical results, here we derive an analytical expression for the atomic form factor within the independent particle model constructed from nonrelativistic screened hydrogenic wave functions. The range of validity of this analytical expression is checked by comparing the analytically obtained form factors with the ones obtained within the Hartree-Fock method. As an example, we apply our analytical expression for the atomic form factor to evaluate the differential cross section for Rayleigh scattering off neutral atoms.
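
    To make the idea of an analytical atomic form factor concrete, the sketch below evaluates the textbook closed form for a single screened-hydrogenic 1s density, F(q) = [1 + (q/(2 Z_eff))^2]^(-2) in atomic units, and checks it against direct numerical integration of the Fourier transform. This is a special case shown only for illustration, not the paper's general independent-particle-model expression.

        # Textbook special case for illustration: elastic form factor of a single
        # screened-hydrogenic 1s density in atomic units (a0 = 1),
        #     F(q) = [1 + (q / (2*Z_eff))**2]**(-2),
        # checked against numerical integration of F(q) = ∫ rho(r) sin(qr)/(qr) 4·pi·r^2 dr.
        import numpy as np
        from scipy.integrate import quad

        Z_eff = 1.0                     # hypothetical effective (screened) charge
        alpha = 2.0 * Z_eff             # decay constant of the 1s density

        def rho(r):                     # normalized density: alpha^3/(8*pi) * exp(-alpha*r)
            return alpha**3 / (8.0 * np.pi) * np.exp(-alpha * r)

        def F_numeric(q):
            integrand = lambda r: 4.0 * np.pi * r**2 * rho(r) * np.sinc(q * r / np.pi)
            return quad(integrand, 0.0, 50.0)[0]

        def F_analytic(q):
            return (1.0 + (q / alpha) ** 2) ** -2

        for q in (0.5, 1.0, 2.0, 5.0):
            print(q, round(F_numeric(q), 6), round(F_analytic(q), 6))   # the two columns agree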

  18. Factors Affecting Power Output by Photovoltaic Cells Lesson

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Factors Affecting Power Output by Photovoltaic Cells Grade Level(s): IB 2 (Senior - 3 ... C.8 Photovoltaic cells and dye-sensitized solar cells (DSSC) Understandings: * Solar ...

  19. Development of the Electricity Carbon Emission Factors for Russia...

    Open Energy Info (EERE)

    Russia Jump to: navigation, search Name Development of the Electricity Carbon Emission Factors for Russia AgencyCompany Organization European Bank for Reconstruction and...

  20. EPA Rainfall Erosivity Factor Calculator Website | Open Energy...

    Open Energy Info (EERE)

    Calculator Website Jump to: navigation, search OpenEI Reference LibraryAdd to library Web Site: EPA Rainfall Erosivity Factor Calculator Website Abstract This website allows...

  1. EPA - Rainfall Erosivity Factor Calculator webpage | Open Energy...

    Open Energy Info (EERE)

    Not Provided DOI Not Provided Check for DOI availability: http://crossref.org Online Internet link for EPA - Rainfall Erosivity Factor Calculator webpage Citation Environmental...

  2. Critical Factors Driving the High Volumetric Uptake of Methane...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Critical Factors Driving the High Volumetric Uptake of Methane in Cu-3(btc)(2) Previous Next List Hulvey, Zeric; Vlaisavljevich, Bess; Mason, Jarad A.; Tsivion, Ehud; Dougherty,...

  3. Factors Impacting EGR Cooler Fouling - Main Effects and Interactions...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Impacting EGR Cooler Fouling - Main Effects and Interactions Factors Impacting EGR Cooler Fouling - Main Effects and Interactions Presentation given at the 16th Directions in ...

  4. Identification and Control of Factors that Affect EGR Cooler...

    Broader source: Energy.gov (indexed) [DOE]

    Key factors that cause exhaust gas recirculation cooler fouling were identified through extensive literature search and controlled experiment was devised to study the impact of a ...

  5. Multi-factor Authentication Update | The Ames Laboratory

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Multi-factor Authentication Update There is a delay in the purchase of the multi-factor authentication software solution that will cause a lag in the planned implementation. The Laboratory is currently in negotiations to complete the purchase. Once complete, the implementation can begin.

  6. Consideration of Factors Affecting Strip Effluent PH and Sodium Content

    SciTech Connect (OSTI)

    Peters, T.

    2015-07-29

    A number of factors were investigated to determine possible reasons why the Strip Effluent (SE) can sometimes have higher than expected pH values and/or sodium content, both of which have prescribed limits. All of the factors likely have some impact on the pH values and Na content.

  7. Scaling factor inconsistencies in neutrinoless double beta decay

    SciTech Connect (OSTI)

    Cowell, S. [Theoretical Division, Los Alamos National Laboratory, Los Alamos, New Mexico 87545 (United States)]

    2006-02-15

    The modern theory of neutrinoless double beta decay includes a scaling factor that has often been treated inconsistently in the literature. The nuclear contribution to the decay half-life can be suppressed by 15%-20% when scaling factors are mismatched. Correspondingly, the extracted effective neutrino mass is overestimated.
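
    For orientation, one common convention for the neutrinoless double beta decay half-life is sketched below; the point of the abstract is that the scaling factor shared between the phase-space factor and the nuclear matrix element (for example, powers of g{sub A} or of the nuclear radius) must be assigned consistently to both. This is a schematic textbook form, not the specific convention analyzed in the paper.

        % Schematic half-life formula (one common convention); the scaling factor
        % split between G^{0\nu} and M^{0\nu} must be treated consistently.
        \left[T^{0\nu}_{1/2}\right]^{-1}
          = G^{0\nu}(Q,Z)\,\left|M^{0\nu}\right|^{2}
            \left(\frac{\langle m_{\beta\beta}\rangle}{m_{e}}\right)^{2}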

  8. View Factor Calculation for Three-Dimensional Geometries.

    SciTech Connect (OSTI)

    1989-06-20

    Version 00 MCVIEW calculates the radiation geometric view factor between surfaces for three dimensional geometries with and without interposed third surface obstructions. It was developed to calculate view factors for input data to heat transfer analysis programs such as SCA-03/TRUMP, SCA-01/HEATING-5 and PSR-199/HEATING-6.
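
    As a minimal illustration of the geometric view factor MCVIEW computes, the sketch below estimates by Monte Carlo the view factor between two directly opposed parallel unit squares separated by a unit gap with no obstruction; the analytical value for this geometry is roughly 0.20. This is a toy calculation, not the MCVIEW code.

        # Monte Carlo estimate of F_{1->2} = (1/A1) ∫∫ cos(t1) cos(t2) / (pi s^2) dA2 dA1
        # for two directly opposed parallel unit squares one unit apart.
        import numpy as np

        rng = np.random.default_rng(1)
        n = 200_000
        h = 1.0                                        # plate separation

        p1 = np.column_stack([rng.random(n), rng.random(n), np.zeros(n)])    # surface 1 at z = 0
        p2 = np.column_stack([rng.random(n), rng.random(n), np.full(n, h)])  # surface 2 at z = h

        d = p2 - p1
        s2 = np.einsum("ij,ij->i", d, d)               # squared point-to-point distance
        cos1 = h / np.sqrt(s2)                         # both normals are along ±z
        kernel = cos1 * cos1 / (np.pi * s2)

        F12 = 1.0 * kernel.mean()                      # A2 = 1 for a unit square
        print(f"Monte Carlo view factor: {F12:.3f}")   # expect roughly 0.20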

  9. Lifestyle Factors in U.S. Residential Electricity Consumption

    SciTech Connect (OSTI)

    Sanquist, Thomas F.; Orr, Heather M.; Shui, Bin; Bittner, Alvah C.

    2012-03-30

    A multivariate statistical approach to lifestyle analysis of residential electricity consumption is described and illustrated. Factor analysis of selected variables from the 2005 U.S. Residential Energy Consumption Survey (RECS) identified five lifestyle factors reflecting social and behavioral choices associated with air conditioning, laundry usage, personal computer usage, climate zone of residence, and TV use. These factors were also estimated for 2001 RECS data. Multiple regression analysis using the lifestyle factors yields solutions accounting for approximately 40% of the variance in electricity consumption for both years. By adding the associated household and market characteristics of income, local electricity price and access to natural gas, variance accounted for is increased to approximately 54%. Income contributed only {approx}1% unique variance to the 2005 and 2001 models, indicating that lifestyle factors reflecting social and behavioral choices better account for consumption differences than income. This was not surprising given the 4-fold range of energy use at differing income levels. Geographic segmentation of factor scores is illustrated, and shows distinct clusters of consumption and lifestyle factors, particularly in suburban locations. The implications for tailored policy and planning interventions are discussed in relation to lifestyle issues.
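
    The sketch below reproduces the shape of this analysis pipeline on synthetic data: extract a handful of latent "lifestyle" factors from household variables, then regress consumption on the factor scores. The variable counts, data, and resulting R-squared are made up for illustration and have no connection to the RECS results quoted above.

        # Synthetic stand-in for the factor-analysis-plus-regression approach.
        import numpy as np
        from sklearn.decomposition import FactorAnalysis
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(42)
        n_households, n_vars = 2000, 12
        latent = rng.standard_normal((n_households, 5))      # hidden lifestyle factors
        loadings = rng.standard_normal((5, n_vars))
        X = latent @ loadings + 0.5 * rng.standard_normal((n_households, n_vars))
        kwh = latent @ np.array([6.0, 4.0, 3.0, 2.0, 1.0]) + rng.standard_normal(n_households)

        scores = FactorAnalysis(n_components=5, random_state=0).fit_transform(X)
        model = LinearRegression().fit(scores, kwh)
        print(f"variance in consumption explained (R^2): {model.score(scores, kwh):.2f}")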

  10. Using partial safety factors in wind turbine design and testing

    SciTech Connect (OSTI)

    Musial, W.D.; Butterfield, C.

    1997-09-01

    This paper describes the relationship between wind turbine design and testing in terms of the certification process. An overview of the current status of international certification is given along with a description of limit-state design basics. Wind turbine rotor blades are used to illustrate the principles discussed. These concepts are related to both International Electrotechnical Commission and Germanischer Lloyd design standards, and are covered using schematic representations of statistical load and material strength distributions. Wherever possible, interpretations of the partial safety factors are given with descriptions of their intended meaning. Under some circumstances, the authors' interpretations may be subjective. Next, the test-load factors are described in concept and then related to the design factors. Using technical arguments, it is shown that some of the design factors for both load and materials must be used in the test loading, but some should not be used. In addition, some test factors not used in the design may be necessary for an accurate test of the design. The results show that if the design assumptions do not clearly state the effects and uncertainties that are covered by the design's partial safety factors, outside parties such as test labs or certification agencies could impose their own meaning on these factors.
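
    A minimal limit-state check in the spirit of the discussion above is sketched here: the factored design load must not exceed the factored design resistance. The characteristic values and partial safety factors are placeholders, not numbers taken from the IEC or Germanischer Lloyd standards.

        # Toy limit-state design check with partial safety factors (all values hypothetical).
        char_load = 450.0        # kN, characteristic extreme blade-root load
        char_strength = 900.0    # kN, characteristic section resistance

        gamma_f = 1.35           # partial safety factor on loads
        gamma_m = 1.30           # partial safety factor on materials

        design_load = gamma_f * char_load
        design_resistance = char_strength / gamma_m
        print("design check passes:", design_load <= design_resistance)

        # A blade test intended to demonstrate the same margin would re-apply the
        # load factor, but not the material factor already built into the test article.
        print("test load:", gamma_f * char_load, "kN")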

  11. Optimization of scat detection methods for a social ungulate, the wild pig, and experimental evaluation of factors affecting detection of scat

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Keiter, David A.; Cunningham, Fred L.; Rhodes, Jr., Olin E.; Irwin, Brian J.; Beasley, James C.

    2016-05-25

    Collection of scat samples is common in wildlife research, particularly for genetic capture-mark-recapture applications. Due to high degradation rates of genetic material in scat, large numbers of samples must be collected to generate robust estimates. Optimization of sampling approaches to account for taxa-specific patterns of scat deposition is, therefore, necessary to ensure sufficient sample collection. While scat collection methods have been widely studied in carnivores, research to maximize scat collection and noninvasive sampling efficiency for social ungulates is lacking. Further, environmental factors or scat morphology may influence detection of scat by observers. We contrasted performance of novel radial search protocols with existing adaptive cluster sampling protocols to quantify differences in observed amounts of wild pig (Sus scrofa) scat. We also evaluated the effects of environmental (percentage of vegetative ground cover and occurrence of rain immediately prior to sampling) and scat characteristics (fecal pellet size and number) on the detectability of scat by observers. We found that 15- and 20-m radial search protocols resulted in greater numbers of scats encountered than the previously used adaptive cluster sampling approach across habitat types, and that fecal pellet size, number of fecal pellets, percent vegetative ground cover, and recent rain events were significant predictors of scat detection. Our results suggest that use of a fixed-width radial search protocol may increase the number of scats detected for wild pigs, or other social ungulates, allowing more robust estimation of population metrics using noninvasive genetic sampling methods. Further, as fecal pellet size affected scat detection, juvenile or smaller-sized animals may be less detectable than adult or large animals, which could introduce bias into abundance estimates. In conclusion, knowledge of relationships between environmental variables and scat detection may allow
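
    As a hedged sketch of the detection analysis described above, the code below fits a logistic regression of detection on pellet size, pellet count, percent ground cover, and recent rain. The data are simulated with arbitrary coefficients purely to show the structure of the analysis; they are not the study's field data.

        # Simulated stand-in for the scat-detection logistic regression.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(7)
        n = 500
        pellet_size = rng.normal(2.0, 0.6, n)        # cm, hypothetical
        pellet_count = rng.poisson(20, n)
        ground_cover = rng.uniform(0, 100, n)        # percent vegetative cover
        recent_rain = rng.integers(0, 2, n)          # 1 = rain just before the survey

        logit = (-1.0 + 1.2 * (pellet_size - 2.0) + 0.05 * (pellet_count - 20)
                 - 0.02 * ground_cover - 0.8 * recent_rain)
        detected = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

        X = np.column_stack([pellet_size, pellet_count, ground_cover, recent_rain])
        clf = LogisticRegression(max_iter=1000).fit(X, detected)
        print(dict(zip(["size", "count", "cover", "rain"], clf.coef_[0].round(2))))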

  12. Dual chain synthetic heparin-binding growth factor analogs

    DOE Patents [OSTI]

    Zamora, Paul O.; Pena, Louis A.; Lin, Xinhua

    2009-10-06

    The invention provides synthetic heparin-binding growth factor analogs having two peptide chains each branched from a branch moiety, such as trifunctional amino acid residues, the branch moieties separated by a first linker of from 3 to about 20 backbone atoms, which peptide chains bind a heparin-binding growth factor receptor and are covalently bound to a non-signaling peptide that includes a heparin-binding domain, preferably by a second linker, which may be a hydrophobic second linker. The synthetic heparin-binding growth factor analogs are useful as pharmaceutical agents, soluble biologics or as surface coatings for medical devices.

  13. Dual chain synthetic heparin-binding growth factor analogs

    DOE Patents [OSTI]

    Zamora, Paul O.; Pena, Louis A.; Lin, Xinhua

    2012-04-24

    The invention provides synthetic heparin-binding growth factor analogs having two peptide chains each branched from a branch moiety, such as trifunctional amino acid residues, the branch moieties separated by a first linker of from 3 to about 20 backbone atoms, which peptide chains bind a heparin-binding growth factor receptor and are covalently bound to a non-signaling peptide that includes a heparin-binding domain, preferably by a second linker, which may be a hydrophobic second linker. The synthetic heparin-binding growth factor analogs are useful as pharmaceutical agents, soluble biologics or as surface coatings for medical devices.

  14. The structure of the nucleon: Elastic electromagnetic form factors

    SciTech Connect (OSTI)

    Punjabi, V.; Perdrisat, C. F.; Jones, M. K.; Brash, E. J.; Carlson, C. E.

    2015-07-10

    Precise proton and neutron form factor measurements at Jefferson Lab, using spin observables, have recently made a significant contribution to the unraveling of the internal structure of the nucleon. Accurate experimental measurements of the nucleon form factors are a test-bed for understanding how the nucleon's static properties and dynamical behavior emerge from QCD, the theory of the strong interactions between quarks. There has been enormous theoretical progress, since the publication of the Jefferson Lab proton form factor ratio data, aiming at reevaluating the picture of the nucleon. We will review the experimental and theoretical developments in this field and discuss the outlook for the future.

  15. CDPHE Construction Storm Water Forms R-Factor Waiver Application...

    Open Energy Info (EERE)

    CDPHE Construction Storm Water Forms R-Factor Waiver Application Jump to: navigation, search OpenEI Reference LibraryAdd to library Legal Document- Permit ApplicationPermit...

  16. Recommended U-factors for swinging, overhead, and revolving doors

    SciTech Connect (OSTI)

    Carpenter, S.C.; Hogan, J.

    1996-11-01

    Doors are often an overlooked component in the thermal integrity of the building envelope. Although swinging doors represent a small portion of the shell in residential buildings, their U-factors are usually many times higher than those of walls or ceilings. In some commercial buildings, loading (overhead) doors represent a significant area of high heat loss. Contrary to common perception, there is a wide range in the design, type, and therefore thermal performance of doors. The 1997 ASHRAE Handbook of Fundamentals will contain expanded tables of door U-factors to account for these product variations. This paper presents the results of detailed computer simulations of door U-factors. Recommended U-factors for glazed and unglazed residential and commercial swinging doors and commercial/industrial overhead and revolving doors are presented.

  17. Hadronic Form Factors in Asymptotically Free Field Theories

    DOE R&D Accomplishments [OSTI]

    Gross, D. J.; Treiman, S. B.

    1974-01-01

    The breakdown of Bjorken scaling in asymptotically free gauge theories of the strong interactions is explored for its implications on the large q{sup 2} behavior of nucleon form factors. Duality arguments of Bloom and Gilman suggest a connection between the form factors and the threshold properties of the deep inelastic structure functions. The latter are addressed directly in an analysis of asymptotically free theories; and through the duality connection we are then led to statements about the form factors. For very large q{sup 2} the form factors are predicted to fall faster than any inverse power of q{sup 2}. For the more modest range of q{sup 2} reached in existing experiments the agreement with data is fairly good, though this may well be fortuitous. Extrapolations beyond this range are presented.

  18. Indoor Thermal Factors and Symptoms in Office Workers: Findings...

    Office of Scientific and Technical Information (OSTI)

    from the U.S. EPA BASE Study Citation Details In-Document Search Title: Indoor Thermal Factors and Symptoms in Office Workers: Findings from the U.S. EPA BASE Study You ...

  19. Phenomenology of semileptonic B -meson decays with form factors...

    Office of Scientific and Technical Information (OSTI)

    of semileptonic B -meson decays with form factors from lattice QCD Authors: Du, Daping ; El-Khadra, A. X. ; Gottlieb, Steven ; Kronfeld, A. S. ; Laiho, J. ; Lunghi, E. ; Van de...

  20. Factors Controlling The Geochemical Evolution Of Fumarolic Encrustatio...

    Open Energy Info (EERE)

    Smokes (VTTS). The six-factor solution model explains a large proportion (low of 74% for Ni to high of 99% for Si) of the individual element data variance. Although the primary...

  1. Property:Geothermal/LoadFactor | Open Energy Information

    Open Energy Info (EERE)

    to: navigation, search This is a property of type Number. Pages using the property "GeothermalLoadFactor" Showing 25 pages using this property. (previous 25) (next 25) 4 4 UR...

  2. Proton Form Factors Measurements in the Time-Like Region

    SciTech Connect (OSTI)

    Anulli, F. (Frascati)

    2007-10-22

    I present an overview of the measurement of the proton form factors in the time-like region. BABAR has recently measured with great accuracy the e{sup +}e{sup -} {yields} p{bar p} reaction from production threshold up to an energy of {approx} 4.5 GeV, finding evidence for a ratio of the electric to magnetic form factor greater than unity, contrary to expectation. In agreement with previous measurements, BABAR confirmed the steep rise of the magnetic form factor close to the p{bar p} mass threshold, suggesting the possible presence of an under-threshold N{bar N} vector state. These and other open questions related to the nucleon form factors, both in the time-like and space-like regions, await more data from different experimental techniques before they can be resolved.

  3. Property:ExplorationCostPerMetric | Open Energy Information

    Open Energy Info (EERE)

    Paleomagnetic Measurements Passive Seismic Techniques Passive Sensors Portable X-Ray Diffraction (XRD) Portfolio Risk Modeling Production Wells R Radar Remote Sensing Techniques...

  4. Analysis of key safety metrics of thorium utilization in LWRs...

    Office of Scientific and Technical Information (OSTI)

    high-temperature gas-cooled, fast spectrum sodium, and molten salt reactors), along with use in advanced accelerator-driven systems and even in fission-fusion hybrid systems. ...

  5. Phase estimation with nonunitary interferometers: Information as a metric

    SciTech Connect (OSTI)

    Bahder, Thomas B.

    2011-05-15

    Determining the phase in one arm of a quantum interferometer is discussed taking into account the three nonideal aspects in real experiments: nondeterministic state preparation, nonunitary state evolution due to losses during state propagation, and imperfect state detection. A general expression is written for the probability of a measurement outcome taking into account these three nonideal aspects. As an example of applying the formalism, the classical Fisher information and fidelity (Shannon mutual information between phase and measurements) are computed for few-photon Fock and N00N states input into a lossy Mach-Zehnder interferometer. These three nonideal aspects lead to qualitative differences in phase estimation, such as a decrease in fidelity and Fisher information that depends on the true value of the phase.
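
    A toy version of the quantity discussed above is sketched here: the classical Fisher information F(phi) = sum over outcomes of (dp/dphi)^2 / p for a single photon in a Mach-Zehnder interferometer whose loss is modeled as a third "no click" outcome. The probability model is a simplified stand-in, not the paper's Fock/N00N-state analysis; for this toy model the Fisher information reduces to the transmission eta.

        # Classical Fisher information for a lossy single-photon interferometer (toy model).
        import numpy as np

        def probabilities(phi, eta):
            p1 = eta * (1.0 + np.cos(phi)) / 2.0   # click at output port 1
            p2 = eta * (1.0 - np.cos(phi)) / 2.0   # click at output port 2
            p0 = 1.0 - eta                         # photon lost: no click
            return np.array([p1, p2, p0])

        def fisher_information(phi, eta, dphi=1e-6):
            p = probabilities(phi, eta)
            dp = (probabilities(phi + dphi, eta) - probabilities(phi - dphi, eta)) / (2 * dphi)
            mask = p > 1e-12
            return np.sum(dp[mask] ** 2 / p[mask])

        for eta in (1.0, 0.7, 0.4):
            print(eta, round(fisher_information(phi=1.0, eta=eta), 3))   # F comes out ≈ eta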

  6. Analysis of key safety metrics of thorium utilization in LWRs

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Ade, Brian J.; Bowman, Stephen M.; Worrall, Andrew; Powers, Jeffrey

    2016-04-08

    Here, thorium has great potential to stretch nuclear fuel reserves because of its natural abundance and because it is possible to breed the 232Th isotope into a fissile fuel (233U). Various scenarios exist for utilization of thorium in the nuclear fuel cycle, including use in different nuclear reactor types (e.g., light water, high-temperature gas-cooled, fast spectrum sodium, and molten salt reactors), along with use in advanced accelerator-driven systems and even in fission-fusion hybrid systems. The most likely near-term application of thorium in the United States is in currently operating light water reactors (LWRs). This use is primarily based on concepts that mix thorium with uranium (UO2 + ThO2) or that add fertile thorium (ThO2) fuel pins to typical LWR fuel assemblies. Utilization of mixed fuel assemblies (PuO2 + ThO2) is also possible. The addition of thorium to currently operating LWRs would result in a number of different phenomenological impacts to the nuclear fuel. Thorium and its irradiation products have different nuclear characteristics from those of uranium and its irradiation products. ThO2, alone or mixed with UO2 fuel, leads to different chemical and physical properties of the fuel. These key reactor safety–related issues have been studied at Oak Ridge National Laboratory and documented in “Safety and Regulatory Issues of the Thorium Fuel Cycle” (NUREG/CR-7176, U.S. Nuclear Regulatory Commission, 2014). Various reactor analyses were performed using the SCALE code system for comparison of key performance parameters of both ThO2 + UO2 and ThO2 + PuO2 against those of UO2 and typical UO2 + PuO2 mixed oxide fuels, including reactivity coefficients and power sharing between surrounding UO2 assemblies and the assembly of interest. The decay heat and radiological source terms for spent fuel after its discharge from the reactor are also presented. Based on this evaluation, potential impacts on safety requirements and identification of knowledge gaps that require additional analysis or research to develop a technical basis for the licensing of thorium fuel are identified.

  7. Hierarchical clustering using correlation metric and spatial continuity constraint

    DOE Patents [OSTI]

    Stork, Christopher L.; Brewer, Luke N.

    2012-10-02

    Large data sets are analyzed by hierarchical clustering using correlation as a similarity measure. This provides results that are superior to those obtained using a Euclidean distance similarity measure. A spatial continuity constraint may be applied in hierarchical clustering analysis of images.
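
    A minimal sketch of correlation-based hierarchical clustering (without the patent's spatial-continuity constraint) is shown below, using synthetic "spectra" that fall into two correlated families; the data and cluster count are arbitrary.

        # Hierarchical clustering with a correlation distance on synthetic spectra.
        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import pdist

        rng = np.random.default_rng(3)
        base_a, base_b = rng.standard_normal(40), rng.standard_normal(40)
        spectra = np.vstack([base_a + 0.2 * rng.standard_normal((30, 40)),
                             base_b + 0.2 * rng.standard_normal((30, 40))])

        dist = pdist(spectra, metric="correlation")   # 1 - Pearson correlation
        tree = linkage(dist, method="average")
        labels = fcluster(tree, t=2, criterion="maxclust")
        print(labels)                                 # the two families separate cleanly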

  8. Final documentation report for FY2004 GPRA metrics: Subtask 5

    SciTech Connect (OSTI)

    None, None

    2003-02-01

    The Office of Energy Efficiency and Renewable Energy's (EERE) Renewable and Distributed Energy R&D programs manage research in two broad areas: 1) Energy Supply Technologies; and 2) Electricity Delivery. Several different approaches are required to estimate the benefits of this wide array of programs. The analytical approaches used for FY 2004 are documented in this report, as are the results of these analyses. This chapter provides a broad overview of the approaches taken for each of the two EERE research areas. Greater detail for each EERE Renewable and Distributed Energy program is provided later in this report in program-specific discussions.

  9. Annex A Metrics for the Smart Grid System Report

    Broader source: Energy.gov (indexed) [DOE]

    ... D.C. A.49 M.8.3.0 Deployment Trends and Projections Table M.8.1 presents ... These scenarios assume that the PHEV will ultimately become the dominant alternative fuel vehicle...

  10. Deep Energy Retrofit Performance Metric Comparison: Eight California...

    Office of Scientific and Technical Information (OSTI)

    For each home, the details of the retrofits were analyzed, diagnostic tests to characterize the home were performed and the homes were monitored for total and individual end-use ...

  11. Summary of Proposed Metrics - QER Technical Workshop on Energy...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Infrastructure Assurance Center presentation o DEFINITION - Resilience, in the context of critical infrastructure, is defined as the ability of a facility or asset to ...

  12. Toward a new metric for ranking high performance computing systems...

    Office of Scientific and Technical Information (OSTI)

    as a true measure of system performance for a growing collection of important science and engineering applications. In this paper we describe a new high performance conjugate...

  13. Analysis of key safety metrics of thorium utilization in LWRs...

    Office of Scientific and Technical Information (OSTI)

    nuclear reactor types (e.g., light water, high-temperature gas-cooled, fast ... of thorium in the United States is in currently operating light water reactors (LWRs). ...

  14. Toward a new metric for ranking high performance computing systems...

    Office of Scientific and Technical Information (OSTI)

    Close Cite: Bibtex Format Close 0 pages in this document matching the terms "" Search For Terms: Enter terms in the toolbar above to search the full text of this document for ...

  15. FY 2016 Q3 Metrics Summary.xlsx

    Broader source: Energy.gov (indexed) [DOE]

    FY 2016 Target 95% 95% 90% 85% 90% 90% FY 2016 3rd Qtr Actual Comment FY 2016 Forecast ... Schedule Compliance, Projects Less Than 5 Years Duration: Projects will meet the project ...

  16. Property:ExplorationTimePerMetric | Open Energy Information

    Open Energy Info (EERE)

    Techniques Geothermal Literature Review Geothermometry Gravity Methods Gravity Techniques Ground Electromagnetic Techniques Groundwater Sampling H Hand-held X-Ray Fluorescence...

  17. Non-minimal derivative couplings of the composite metric (Journal...

    Office of Scientific and Technical Information (OSTI)

    In the context of massive gravity, bi-gravity and multi-gravity non-minimal matter ... limit and the matter quantum loop corrections do not detune the potential interactions. ...

  18. EAC Presentation: Metrics and Benefits Analysis for the ARRA...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    and benefits analysis for the American Recovery and Reinvestment Act smart grid programs including the Smart Grid Investment Grants and the Smart Grid Demonstration Program. ...

  19. Integration of Sustainability Metrics into Design Cases and State...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    This presentation does not contain any proprietary, confidential, or otherwise restricted information DOE Bioenergy Technologies Office (BETO) 2015 Project Peer Review Integration ...

  20. Method for Confidence Metric in Optic Disk Location in Retinal...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Images (Oak Ridge National Laboratory) To improve accuracy in diagnosis of retinal disease, ORNL researchers...

  1. On The conformal metric structure of geometrothermodynamics: Generalizations

    SciTech Connect (OSTI)

    Azreg-Aïnou, Mustapha

    2014-03-15

    We show that the range of applicability of the change of representation formula derived by Bravetti et al. [J. Math. Phys. 54, 033513 (2013)] is very narrow and extend it to include all physical applications, particularly, applications to black hole thermodynamics, cosmology, and fluid thermodynamics.

  2. Toward a new metric for ranking high performance computing systems...

    Office of Scientific and Technical Information (OSTI)

    performance for a growing collection of important science and engineering applications. ... performance and expect to drive computer system design and implementation in ...

  3. Office of HC Strategy Budget and Performance Metrics (HC-50)...

    Broader source: Energy.gov (indexed) [DOE]

    Statement and Function Statement The Office of Human Capital Strategy, Budget, and ... Provides analytical support and consultative advice to the Chief Human Capital Officer, ...

  4. Dissipation factor as a predictor of anodic coating performance

    DOE Patents [OSTI]

    Panitz, Janda K. G.

    1995-01-01

    A dissipation factor measurement is used to predict as-anodized fixture performance prior to actual use of the fixture in an etching environment. A dissipation factor measurement of the anodic coating determines its dielectric characteristics and correlates to the performance of the anodic coating in actual use. The ability to predict the performance of the fixture and its anodized coating permits the fixture to be repaired or replaced prior to complete failure.
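
    For orientation, the sketch below evaluates the dissipation factor for a simple parallel R-C model of a dielectric coating, tan(delta) = 1/(omega * Rp * Cp); the component values are hypothetical and are not measurements of an anodic coating.

        # Dissipation factor of a parallel R-C dielectric model (hypothetical values).
        import math

        freq = 1.0e3       # Hz, measurement frequency
        Cp = 2.0e-9        # F, parallel capacitance of the coating
        Rp = 5.0e5         # ohm, parallel (leakage) resistance

        omega = 2.0 * math.pi * freq
        tan_delta = 1.0 / (omega * Rp * Cp)
        print(f"dissipation factor tan(delta) = {tan_delta:.3f}")   # lower suggests a better dielectric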

  5. Classical strongly coupled quark-gluon plasma. V. Structure factors

    SciTech Connect (OSTI)

    Cho, Sungtae; Zahed, Ismail

    2010-10-15

    We show that the classical and strongly coupled quark-gluon plasma is characterized by multiple structure factors that obey generalized Ornstein-Zernike equations. We use the canonical partition function and its associated density functional to derive analytical equations for the density and charge monopole structure factors for arbitrary values of {Gamma} = V/K, the ratio of the mean potential energy to the kinetic energy. The results are compared with SU(2) molecular dynamics simulations.
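
    For background, the standard one-component Ornstein-Zernike relation and its link to the static structure factor are sketched below; the paper generalizes this to coupled equations for the density and charge structure factors of the two-component plasma, which are not reproduced here.

        % Standard one-component Ornstein-Zernike relation (background only).
        h(r) = c(r) + \rho \int c\!\left(\left|\mathbf{r}-\mathbf{r}'\right|\right) h(r')\, d\mathbf{r}'
        \qquad\Longrightarrow\qquad
        S(k) = 1 + \rho\,\tilde{h}(k) = \frac{1}{1 - \rho\,\tilde{c}(k)}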

  6. Charm and bottom hadronic form factors with QCD sum rules

    SciTech Connect (OSTI)

    Bracco, M. E.; Rodrigues, B. O.; Cerqueira, A. Jr.

    2013-03-25

    We present a brief review of some calculations of form factors and coupling constants in vertices with charm and bottom mesons in the framework of QCD sum rules. We first discuss the motivation for this work, describing possible applications of these form factors to charm and bottom decay processes. We then summarize the QCD sum rules method. We give special attention to the uncertainties of the method introduced by the intrinsic variation of the parameters. Finally we conclude.

  7. Article Published on LED Lumen Maintenance and Light Loss Factors

    Broader source: Energy.gov [DOE]

    An article has been published in LEUKOS: The Journal of the Illuminating Engineering Society of North America (IES) that may be of interest to the solid-state lighting community. Entitled "Lumen Maintenance and Light Loss Factors: Consequences of Current Design Practices for LEDs," the article was written by Michael Royer of Pacific Northwest National Laboratory and discusses complications related to the lamp lumen depreciation (LLD) light loss factor and LEDs.

  8. Constructing the S-matrix With Complex Factorization

    SciTech Connect (OSTI)

    Schuster, Philip C.; Toro, Natalia (Stanford U., ITP)

    2009-06-19

    A remarkable connection between BCFW recursion relations and constraints on the S-matrix was made by Benincasa and Cachazo in 0705.4305, who noted that mutual consistency of different BCFW constructions of four-particle amplitudes generates nontrivial (but familiar) constraints on three-particle coupling constants - these include gauge invariance, the equivalence principle, and the lack of non-trivial couplings for spins > 2. These constraints can also be derived with weaker assumptions, by demanding the existence of four-point amplitudes that factorize properly in all unitarity limits with complex momenta. From this starting point, we show that the BCFW prescription can be interpreted as an algorithm for fully constructing a tree-level S-matrix, and that complex factorization of general BCFW amplitudes follows from the factorization of four-particle amplitudes. The allowed set of BCFW deformations is identified, formulated entirely as a statement on the three-particle sector, and using only complex factorization as a guide. Consequently, our analysis based on the physical consistency of the S-matrix is entirely independent of field theory. We analyze the case of pure Yang-Mills, and outline a proof for gravity. For Yang-Mills, we also show that the well-known scaling behavior of BCFW-deformed amplitudes at large z is a simple consequence of factorization. For gravity, factorization in certain channels requires asymptotic behavior {approx} 1/z{sup 2}.
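
    A schematic form of the BCFW construction referred to above is sketched below in standard textbook notation; it is included only for orientation and does not follow the paper's specific conventions.

        % Two external momenta are shifted by a complex parameter z with a null
        % vector q: \hat{p}_i = p_i + z q, \hat{p}_j = p_j - z q, with
        % q^2 = q \cdot p_i = q \cdot p_j = 0. If the shifted amplitude vanishes
        % as z -> infinity, the tree amplitude is rebuilt from factorization poles:
        A_n(0) \;=\; \sum_{I}\sum_{h}
          A_L^{h}\!\left(z_I\right)\,\frac{1}{P_I^{2}}\,A_R^{-h}\!\left(z_I\right),
        \qquad \hat{P}_I^{2}(z_I) = 0 .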

  9. Human factors evaluation of teletherapy: Literature review. Volume 5

    SciTech Connect (OSTI)

    Henriksen, K.; Kaye, R.D.; Jones, R.; Morisseau, D.S.; Serig, D.L.

    1995-07-01

    A series of human factors evaluations were undertaken to better understand the contributing factors to human error in the teletherapy environment. Teletherapy is a multidisciplinary methodology for treating cancerous tissue through selective exposure to an external beam of ionizing radiation. A team of human factors specialists, assisted by a panel of radiation oncologists, medical physicists, and radiation therapists, conducted site visits to radiation oncology departments at community hospitals, university centers, and free-standing clinics. A function and task analysis was performed initially to guide subsequent evaluations in the areas of workplace environment, system-user interfaces, procedures, training, and organizational practices. To further acquire an in-depth and up-to-date understanding of the practice of teletherapy in support of these evaluations, a systematic literature review was conducted. Factors that have a potential impact on the accuracy of treatment delivery were of primary concern. The present volume is the literature review. The volume starts with an overview of the multiphased nature of teletherapy, and then examines the requirement for precision, the increasing role of quality assurance, current conceptualizations of human error, and the role of system factors such as the workplace environment, user-system interfaces, procedures, training, and organizational practices.

  10. The structure of the nucleon: Elastic electromagnetic form factors

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Punjabi, V.; Perdrisat, C. F.; Jones, M. K.; Brash, E. J.; Carlson, C. E.

    2015-07-10

    Precise proton and neutron form factor measurements at Jefferson Lab, using spin observables, have recently made a significant contribution to the unraveling of the internal structure of the nucleon. Accurate experimental measurements of the nucleon form factors are a test-bed for understanding how the nucleon's static properties and dynamical behavior emerge from QCD, the theory of the strong interactions between quarks. There has been enormous theoretical progress, since the publication of the Jefferson Lab proton form factor ratio data, aiming at reevaluating the picture of the nucleon. We will review the experimental and theoretical developments in this field and discuss the outlook for the future.

  11. Measurements of the Helium Form Factors at JLab

    SciTech Connect (OSTI)

    Khrosinkova, Elena

    2007-10-26

    An experiment to measure elastic electron scattering off {sup 3}He and {sup 4}He at large momentum transfers is presented. The experiment was carried out in the Hall A Facility of Jefferson Lab. Elastic electron scattering off {sup 3}He was measured at forward and backward electron scattering angles to extract the isotope's charge and magnetic form factors. The charge form factor of {sup 4}He will be extracted from forward-angle electron scattering measurements. The data are expected to significantly extend and improve the existing measurements of the three- and four-body form factors. The results will be crucial for the establishment of a canonical standard model for the few-body nuclear systems and for testing predictions of quark dimensional scaling and hybrid nucleon-quark models.

  12. Selection of powder factor in large diameter blastholes

    SciTech Connect (OSTI)

    Eloranta, J.

    1995-12-31

    This paper documents the relationship between material handling and processing costs compared to blasting cost. The old adage that "the cheapest crushing is done in the pit" appears accurate in this case study. Comparison of the accumulated costs of powder, selected wear materials, and electricity indicates a strong, inverse correlation with powder factor (lbs powder/long ton of rock). In this case, the increased powder cost is more than offset by electrical savings alone. Measurable, overall costs decline while shovel and crusher productivity rise by about 5% when the powder factor rises by 15%. These trends were previously masked by the effects of weather, ore grade fluctuations, and accounting practices. Attempts to correlate increased powder factor with wear materials in the crushing plant and with shovel hoist rope life have not shown the same benefit.

  13. Ion-ion dynamic structure factor of warm dense mixtures

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Gill, N. M.; Heinonen, R. A.; Starrett, C. E.; Saumon, D.

    2015-06-25

    In this study, the ion-ion dynamic structure factor of warm dense matter is determined using the recently developed pseudoatom molecular dynamics method [Starrett et al., Phys. Rev. E 91, 013104 (2015)]. The method uses density functional theory to determine ion-ion pair interaction potentials that have no free parameters. These potentials are used in classical molecular dynamics simulations. This constitutes a computationally efficient and realistic model of dense plasmas. Comparison with recently published simulations of the ion-ion dynamic structure factor and sound speed of warm dense aluminum finds good to reasonable agreement. Using this method, we make predictions of the ion-ion dynamical structure factor and sound speed of a warm dense mixture—equimolar carbon-hydrogen. This material is commonly used as an ablator in inertial confinement fusion capsules, and our results are amenable to direct experimental measurement.

  14. Ion-ion dynamic structure factor of warm dense mixtures

    SciTech Connect (OSTI)

    Gill, N. M.; Heinonen, R. A.; Starrett, C. E.; Saumon, D.

    2015-06-25

    In this study, the ion-ion dynamic structure factor of warm dense matter is determined using the recently developed pseudoatom molecular dynamics method [Starrett et al., Phys. Rev. E 91, 013104 (2015)]. The method uses density functional theory to determine ion-ion pair interaction potentials that have no free parameters. These potentials are used in classical molecular dynamics simulations. This constitutes a computationally efficient and realistic model of dense plasmas. Comparison with recently published simulations of the ion-ion dynamic structure factor and sound speed of warm dense aluminum finds good to reasonable agreement. Using this method, we make predictions of the ion-ion dynamical structure factor and sound speed of a warm dense mixture—equimolar carbon-hydrogen. This material is commonly used as an ablator in inertial confinement fusion capsules, and our results are amenable to direct experimental measurement.

  15. Cosmic Reionization On Computers III. The Clumping Factor

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Kaurov, Alexander A.; Gnedin, Nickolay Y.

    2015-09-09

    We use fully self-consistent numerical simulations of cosmic reionization, completed under the Cosmic Reionization On Computers project, to explore how well the recombinations in the ionized intergalactic medium (IGM) can be quantified by the effective "clumping factor." The density distribution in the simulations (and, presumably, in a real universe) is highly inhomogeneous and more-or-less smoothly varying in space. However, even in highly complex and dynamic environments, the concept of the IGM remains reasonably well-defined; the largest ambiguity comes from the unvirialized regions around galaxies that are over-ionized by the local enhancement in the radiation field ("proximity zones"). This ambiguity precludes computing the IGM clumping factor to better than about 20%. Furthermore, we discuss a "local clumping factor," defined over a particular spatial scale, and quantify its scatter on a given scale and its variation as a function of scale.

  16. Greybody factors for Myers–Perry black holes

    SciTech Connect (OSTI)

    Boonserm, Petarpa; Chatrabhuti, Auttakit; Ngampitipan, Tritos; Visser, Matt

    2014-11-15

    The Myers–Perry black holes are higher-dimensional generalizations of the usual (3+1)-dimensional rotating Kerr black hole. They are of considerable interest in Kaluza–Klein models, specifically within the context of brane-world versions thereof. In the present article, we shall consider the greybody factors associated with scalar field excitations of the Myers–Perry spacetimes, and develop some rigorous bounds on these greybody factors. These bounds are of relevance for characterizing both the higher-dimensional Hawking radiation, and the super-radiance, that is expected for these spacetimes.

  17. Structure factors for tunneling ionization rates of diatomic molecules

    SciTech Connect (OSTI)

    Saito, Ryoichi; Tolstikhin, Oleg I.; Madsen, Lars Bojer; Morishita, Toru

    2015-05-15

    Within the leading-order, single-active-electron, and frozen-nuclei approximation of the weak-field asymptotic theory, the rate of tunneling ionization of a molecule in an external static uniform electric field is determined by the structure factor for the highest occupied molecular orbital. We present the results of systematic calculations of structure factors for 40 homonuclear and heteronuclear diatomic molecules by the Hartree–Fock method using a numerical grid-based approach implemented in the program X2DHF.

  18. Dominant factors of the laser gettering of silicon wafers

    SciTech Connect (OSTI)

    Bokhan, Yu. I. E-mail: yuibokhan@gmail.com; Kamenkov, V. S.; Tolochko, N. K.

    2015-02-15

    The laser gettering of silicon wafers is experimentally investigated. The typical gettering parameters are considered. The surfaces of laser-treated silicon wafers are investigated by microscopy. When studying the effect of laser radiation on silicon wafers during gettering, a group of factors determining the conditions of interaction between the laser beam and silicon-wafer surface and affecting the final result of treatment are selected. The main factors determining the gettering efficiency are revealed. Limitations on the desired value of the getter-layer capacity on surfaces with insufficiently high cleanness (for example, ground or matte) are established.

  19. Performance analysis of parallel supernodal sparse LU factorization

    SciTech Connect (OSTI)

    Grigori, Laura; Li, Xiaoye S.

    2004-02-05

    We investigate performance characteristics for the LU factorization of large matrices with various sparsity patterns. We consider supernodal right-looking parallel factorization on a two-dimensional grid of processors, making use of static pivoting. We develop a performance model and validate it using the implementation in SuperLU-DIST, real matrices, and the IBM Power3 machine at NERSC. We use this model to obtain performance bounds on parallel computers, to perform scalability analysis and to identify performance bottlenecks. We also discuss the role of load balance and data distribution in this approach.

  20. Nucleon form factors program with SBS at JLAB

    SciTech Connect (OSTI)

    Wojtsekhowski, Bogdan B.

    2014-12-01

    The physics of the nucleon form factors is the basic part of the Jefferson Laboratory program. We review the achievements of the 6-GeV era and the program with the 12- GeV beam with the SBS spectrometer in Hall A, with a focus on the nucleon ground state properties.

  1. UPDATING THE NRC GUIDANCE FOR HUMAN FACTORS ENGINEERING REVIEWS.

    SciTech Connect (OSTI)

    O HARA,J.M.; BROWN,W.S.; HIGGINS,J.C.; PERSENSKY,J.J.; LEWIS,P.M.; BONGARRA,J.

    2002-09-15

    The U.S. Nuclear Regulatory Commission (NRC) reviews the human factors engineering (HFE) aspects of nuclear plants. NUREG-0800 (Standard Review Plan), Chapter 18, ''Human Factors Engineering,'' is the principal NRC staff guidance document. Two main documents provide the review criteria to support the evaluations. The HFE Program Review Model (NUREG-0711) addresses the design process from planning to verification and validation to design implementation. The Human-System Interface Design Review Guidelines (NUREG-0700) provides the guidelines for the review of the HFE aspects of human-system interface technology, such as alarms, information systems, controls, and control room design. Since these documents were published in 1994 and 1996 respectively, they have been used by NRC staff, contractors, nuclear industry organizations, as well as by numerous organizations outside the nuclear industry. Using feedback from users and NRC research conducted in recent years, both documents have been revised and updated. This was done to ensure that they remain state-of-the-art evaluation tools for changing nuclear industry issues and emerging technologies. This paper describes the methodology used to revise and update the documents and summarizes the changes made to each and their current contents. Index Terms for this report are: Control system human factors, Ergonomics, Human factors, Nuclear power generation safety.

  2. The Modern description of semileptonic meson form factors

    SciTech Connect (OSTI)

    Hill, Richard J.

    2006-06-01

    I describe recent advances in our understanding of the hadronic form factors governing semileptonic meson transitions. The resulting framework provides a systematic approach to the experimental data, as a means of extracting precision observables, testing nonperturbative field theory methods, and probing a poorly understood limit of QCD.

  3. Human Factors Evaluation of Advanced Electric Power Grid Visualization Tools

    SciTech Connect (OSTI)

    Greitzer, Frank L.; Dauenhauer, Peter M.; Wierks, Tamara G.; Podmore, Robin

    2009-04-01

    This report describes initial human factors evaluation of four visualization tools (Graphical Contingency Analysis, Force Directed Graphs, Phasor State Estimator and Mode Meter/ Mode Shapes) developed by PNNL, and proposed test plans that may be implemented to evaluate their utility in scenario-based experiments.

  4. Effect of Environmental Factors on Sulfur Gas Emissions from Drywall

    SciTech Connect (OSTI)

    Maddalena, Randy

    2011-08-20

    Problem drywall installed in U.S. homes is suspected of being a source of odorous and potentially corrosive indoor pollutants. The U.S. Consumer Product Safety Commission's (CPSC) investigation of problem drywall incorporates three parallel tracks: (1) evaluating the relationship between the drywall and reported health symptoms; (2) evaluating the relationship between the drywall and electrical and fire safety issues in affected homes; and (3) tracing the origin and the distribution of the drywall. To assess the potential impact on human health and to support testing for electrical and fire safety, the CPSC has initiated a series of laboratory tests that provide elemental characterization of drywall, characterization of chemical emissions, and in-home air sampling. The chemical emission testing was conducted at Lawrence Berkeley National Laboratory (LBNL). The LBNL study consisted of two phases. In Phase 1 of this study, LBNL tested thirty drywall samples provided by CPSC and reported standard emission factors for volatile organic compounds (VOCs), aldehydes, reactive sulfur gases (RSGs) and volatile sulfur compounds (VSCs). The standard emission factors were determined using small (10.75 liter) dynamic test chambers housed in a constant temperature environmental chamber. The tests were all run at 25 C, 50% relative humidity (RH) and with an area-specific ventilation rate of {approx}1.5 cubic meters per square meter of emitting surface per hour [m{sup 3}/m{sup 2}/h]. The thirty samples that were tested in Phase 1 included seventeen that were manufactured in China in 2005, 2006 and 2009, and thirteen that were manufactured in North America in 2009. The measured emission factors for VOCs and aldehydes were generally low and did not differ significantly between the Chinese and North American drywall. Eight of the samples tested had elevated emissions of volatile sulfur-containing compounds with total RSG emission factors between 32 and 258 micrograms per square meter
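
    For context on the data reduction behind such chamber measurements, a steady-state mass balance is conventionally used to relate the measured chamber concentration to an area-specific emission factor through the area-specific ventilation rate. The Python sketch below illustrates only that generic relationship; the function name, the example numbers, and the assumptions of negligible inlet concentration and chamber sinks are ours, not a statement of the LBNL data-reduction procedure.

        def emission_factor_ug_m2_h(chamber_conc_ug_m3, vent_rate_m3_m2_h=1.5):
            # Steady-state chamber mass balance (illustrative sketch): with a
            # clean inlet and negligible sink effects, the area-specific
            # emission rate equals the chamber concentration times the
            # area-specific ventilation rate.
            return chamber_conc_ug_m3 * vent_rate_m3_m2_h

        # Example: a steady-state concentration of 100 ug/m3 at the ~1.5
        # m3/m2/h rate quoted above corresponds to roughly 150 ug/m2/h.
        print(emission_factor_ug_m2_h(100.0))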

  5. Discovery of Novel P1 Groups for Coagulation Factor VIIa Inhibition...

    Office of Scientific and Technical Information (OSTI)

    for Coagulation Factor VIIa Inhibition Using Fragment-Based Screening Citation Details In-Document Search Title: Discovery of Novel P1 Groups for Coagulation Factor VIIa Inhibition ...

  6. Tuning g factors of core-shell nanoparticles by controlled positioning...

    Office of Scientific and Technical Information (OSTI)

    Tuning g factors of core-shell nanoparticles by controlled positioning of magnetic ... 22, 2017 Prev Next Title: Tuning g factors of core-shell nanoparticles by ...

  7. DOE Order 458.1 Property Clearance Requirements and Factors Considered...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    58.1 Property Clearance Requirements and Factors Considered to Update Its Clearance Limits DOE Order 458.1 Property Clearance Requirements and Factors Considered to Update Its ...

  8. Factors driving wind power development in the United States

    SciTech Connect (OSTI)

    Bird, Lori A.; Parsons, Brian; Gagliano, Troy; Brown, Matthew H.; Wiser, Ryan H.; Bolinger, Mark

    2003-05-15

    In the United States, there has been substantial recent growth in wind energy generating capacity, with growth averaging 24 percent annually during the past five years. About 1,700 MW of wind energy capacity was installed in 2001, while another 410 MW became operational in 2002. This year (2003) shows promise of significant growth with more than 1,500 MW planned. With this growth, an increasing number of states are experiencing investment in wind energy projects. Wind installations currently exist in about half of all U.S. states. This paper explores the key factors at play in the states that have achieved a substantial amount of wind energy investment. Some of the factors that are examined include policy drivers, such as renewable portfolio standards (RPS), federal and state financial incentives, and integrated resource planning; as well as market drivers, such as consumer demand for green power, natural gas price volatility, and wholesale market rules.

  9. Precision Measurements of the Proton Elastic Form Factor Ratio

    SciTech Connect (OSTI)

    Douglas Higinbotham

    2010-08-01

    New high precision polarization measurements of the proton elastic form factor ratio in the Q^2 range from 0.3 to 0.7 [GeV/c]^2 have been made. These elastic H(e,e'p) measurements were done in Jefferson Lab's Hall A using 80% longitudinally polarized electrons and recoil polarimetry. For Q^2 greater than 1 [GeV/c]^2, previous polarization data indicated a strong deviation of the form factor ratio from unity, which sparked renewed theoretical and experimental interest in how two-photon diagrams have been taken into account. The new high precision data indicate that the deviation from unity, while small, persists even at Q^2 less than 1 [GeV/c]^2.

  10. Measurement of the gamma gamma* -> pi0 transition form factor

    SciTech Connect (OSTI)

    Aubert, B.

    2009-06-02

    We study the reaction e{sup +}e{sup -} {yields} e{sup +}e{sup -}{pi}{sup 0} in the single tag mode and measure the differential cross section d{sigma}/dQ{sup 2} and the {gamma}{gamma}* {yields} {pi}{sup 0} transition form factor in the momentum transfer range from 4 to 40 GeV{sup 2}. At Q{sup 2} > 10 GeV{sup 2} the measured form factor exceeds the asymptotic limit predicted by perturbative QCD. The analysis is based on 442 fb{sup -1} of integrated luminosity collected at PEP-II with the BABAR detector at e{sup +}e{sup -} center-of-mass energies near 10.6 GeV.

  11. Performance of non-conventional factorization approaches for neutron kinetics

    SciTech Connect (OSTI)

    Bulla, S.; Nervo, M.

    2013-07-01

    The use of factorization techniques provides an interesting option for simulating the time-dependent behavior of nuclear systems with a reduced computational effort. While point kinetics neglects all spatial and spectral effects, quasi-statics and multipoint kinetics produce results with higher accuracy for transients involving relevant modifications of the neutron distribution. However, in some conditions these methods cannot work efficiently. In this paper, we discuss some possible alternative formulations of the factorization process for neutron kinetics, leading to mathematical models of reduced complexity that allow an accurate simulation of transients involving spatial and spectral effects. The performance of these innovative approaches is compared to standard techniques for some test cases, showing the benefits and shortcomings of the proposed method. (authors)

  12. Surface engineering of the quality factor of metal coated microcantilevers

    SciTech Connect (OSTI)

    Ergincan, O.; Kooi, B. J.; Palasantzas, G.

    2014-12-14

    We performed noise measurements to obtain the quality factor (Q) and frequency shift of gold coated microcantilevers before and after surface modification using focused ion beam. As a result of our studies, it is demonstrated that surface engineering offers a promising method to control and increase the Q factor up to 50% for operation in vacuum. Surface modification could also lead to deviations from the known Q ∼ P{sup −1} behavior at low vacuum pressures P within the molecular regime. Finally, at higher pressures within the continuum regime, where Q is less sensitive to surface changes, a power scaling Q ∼ P{sup c} with c ≈ 0.3 was found instead of c = 0.5. The latter is explained via a semi-empirical formulation to account for continuum dissipation mechanisms at significant Reynolds numbers Re ∼ 1.

  13. Factorization of large integers on a massively parallel computer

    SciTech Connect (OSTI)

    Davis, J.A.; Holdridge, D.B.

    1988-01-01

    Our interest in integer factorization at Sandia National Laboratories is motivated by cryptographic applications and in particular the security of the RSA encryption-decryption algorithm. We have implemented our version of the quadratic sieve procedure on the NCUBE computer with 1024 processors (nodes). The new code is significantly different in all important aspects from the program used to factor numbers of order 10{sup 70} on a single-processor CRAY computer. Capabilities of parallel processing and the limitation of small local memory necessitated this entirely new implementation. This effort involved several restarts as realizations of program structures that seemed appealing bogged down due to inter-processor communications. We are presently working with integers of magnitude about 10{sup 70} in tuning this code to the novel hardware. 6 refs., 3 figs.
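
    The abstract above takes for granted why a congruence of squares yields a factorization, which is the core idea behind the quadratic sieve. The Python sketch below is not a quadratic sieve (and certainly not the parallel NCUBE implementation); it is a brute-force, Fermat-style search for x and y with x^2 - y^2 = n, included only to show why gcd(x - y, n) then exposes a factor. A real sieve instead builds such congruences from many smooth relations combined by linear algebra.

        from math import gcd, isqrt

        def congruence_of_squares_demo(n):
            # Brute-force Fermat-style search: find x, y with x*x - y*y == n,
            # so that gcd(x - y, n) and gcd(x + y, n) are (usually nontrivial)
            # factors. Only practical for small n; shown purely for illustration.
            x = isqrt(n) + 1
            while True:
                y2 = x * x - n
                y = isqrt(y2)
                if y * y == y2:
                    return gcd(x - y, n), gcd(x + y, n)
                x += 1

        print(congruence_of_squares_demo(5959))  # (59, 101)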

  14. Factors that affect electric-utility stranded commitments

    SciTech Connect (OSTI)

    Hirst, E.; Hadley, S.; Baxter, L.

    1996-07-01

    Estimates of stranded commitments for U.S. investor-owned utilities range widely, with many falling in the range of $100 to $200 billion. These potential losses exist because some utility-owned power plants, long-term power-purchase contracts and fuel-supply contracts, regulatory assets, and expenses for public-policy programs have book values that exceed their expected market values under full competition. This report quantifies the sensitivity of stranded-commitment estimates to the various factors that lead to these above-market-value estimates. The purpose of these sensitivity analyses is to improve understanding on the part of state and federal regulators, utilities, customers, and other electric-industry participants about the relative importance of the factors that affect stranded-commitment amounts.

  15. Factors influencing photocurrent generation in organic bulk heterojunction

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    solar cells: interfacial energetics and blend microstructure | MIT-Harvard Center for Excitonics. Seminar, April 29, 2009, 3pm/36-428. Jenny Nelson, Department of Physics, Imperial College London. Abstract: The efficiency of photocurrent generation in conjugated polymer:small molecule blend solar cells is strongly influenced both by the energy level alignment

  16. Transcription factors for modification of lignin content in plants

    DOE Patents [OSTI]

    Wang, Huanzhong; Chen, Fang; Dixon, Richard A.

    2015-06-02

    The invention provides methods for modifying lignin, cellulose, xylan, and hemicellulose content in plants, and for achieving ectopic lignification and, for instance, secondary cell wall synthesis in pith cells, by altered regulation of a WRKY transcription factor. Nucleic acid constructs for altered WRKY-TF expression are described. Transgenic plants are provided that comprise modified pith cell walls, and lignin, cellulose, and hemicellulose content. Plants described herein may be used, for example, as improved biofuel feedstock and as highly digestible forage crops.

  17. Method for determining formation quality factor from seismic data

    DOE Patents [OSTI]

    Taner, M. Turhan; Treitel, Sven

    2005-08-16

    A method is disclosed for calculating the quality factor Q from a seismic data trace. The method includes calculating a first and a second minimum phase inverse wavelet at a first and a second time interval along the seismic data trace, synthetically dividing the first wavelet by the second wavelet, Fourier transforming the result of the synthetic division, calculating the logarithm of this quotient of Fourier transforms and determining the slope of a best fit line to the logarithm of the quotient.
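
    The patent abstract describes the steps at a high level; for readers who want to see the underlying idea in code, the Python sketch below implements a generic spectral-ratio estimate of Q from two windowed wavelets. It is a simplified stand-in, not the patented procedure: it skips the minimum-phase inverse-wavelet construction and synthetic division, and the function name, frequency band, and windowing choices are assumptions.

        import numpy as np

        def spectral_ratio_q(w1, w2, dt, t1, t2, band=(10.0, 60.0)):
            # w1, w2: windowed wavelets extracted at travel times t1 and t2 (s);
            # dt: sample interval (s); band: frequency band (Hz) for the line fit.
            n = max(len(w1), len(w2))
            f = np.fft.rfftfreq(n, dt)
            s1 = np.abs(np.fft.rfft(w1, n)) + 1e-12
            s2 = np.abs(np.fft.rfft(w2, n)) + 1e-12
            # Constant-Q attenuation gives ln|S2/S1| ~ -pi * f * (t2 - t1) / Q,
            # so Q follows from the slope of a best-fit line, as in the claim above.
            mask = (f >= band[0]) & (f <= band[1])
            slope, _ = np.polyfit(f[mask], np.log(s2[mask] / s1[mask]), 1)
            return -np.pi * (t2 - t1) / slope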

  18. Structure of Plasmodium falciparum ADP-ribosylation factor 1

    SciTech Connect (OSTI)

    Cook, William J.; Smith, Craig D.; Senkovich, Olga; Holder, Anthony A.; Chattopadhyay, Debasish

    2011-09-26

    Vesicular trafficking may play a crucial role in the pathogenesis and survival of the malaria parasite. ADP-ribosylation factors (ARFs) are among the major components of vesicular trafficking pathways in eukaryotes. The crystal structure of ARF1 GTPase from Plasmodium falciparum has been determined in the GDP-bound conformation at 2.5 {angstrom} resolution and is compared with the structures of mammalian ARF1s.

  19. Limiting Factors for Convective Cloud Top Height in the Tropics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Limiting Factors for Convective Cloud Top Height in the Tropics M. P. Jensen and A. D. Del Genio National Aeronautics and Space Administration Goddard Institute for Space Studies Columbia University New York, New York Introduction Populations of tropical convective clouds are mainly comprised of three types: shallow trade cumulus, mid-level cumulus congestus and deep convective clouds (Johnson et al. 1999). Each of these cloud types has different impacts on the local radiation and water budgets.

  20. Factors Affecting HCCI Combustion Phasing for Fuels with Single- and

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Dual-Stage Chemistry | Department of Energy. 2004 Diesel Engine Emissions Reduction (DEER) Conference Presentation: Sandia National Laboratories. 2004_deer_dec.pdf (185.71 KB)

  1. Tuning the thermoelectric power factor in carbon nanotube films

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Tuning the thermoelectric power factor in carbon nanotube films. Ben Zhou, Azure Avery, Andrew Ferguson, Jeff Blackburn. [Poster; figure caption: schematic of a thermoelectric device.] Introduction: Single-walled carbon nanotubes (SWCNTs) are promising thermoelectrics because of their good conductivity and one-dimensional density of states. Materials and Methods: Ink preparation: (7,5) nanotubes were dispersed by

  2. Simulation: Moving from Technology Challenge to Human Factors Success

    SciTech Connect (OSTI)

    Gould, Derek A.; Chalmers, Nicholas; Johnson, Sheena J.; Kilkenny, Caroline; White, Mark D.; Bech, Bo; Lonn, Lars; Bello, Fernando

    2012-06-15

    Recognition of the many limitations of traditional apprenticeship training is driving new approaches to learning medical procedural skills. Among simulation technologies and methods available today, computer-based systems are topical and bring the benefits of automated, repeatable, and reliable performance assessments. Human factors research is central to simulator model development that is relevant to real-world imaging-guided interventional tasks and to the credentialing programs in which it would be used.

  3. Method for factor analysis of GC/MS data

    DOE Patents [OSTI]

    Van Benthem, Mark H; Kotula, Paul G; Keenan, Michael R

    2012-09-11

    The method of the present invention provides a fast, robust, and automated multivariate statistical analysis of gas chromatography/mass spectroscopy (GC/MS) data sets. The method can involve systematic elimination of undesired, saturated peak masses to yield data that follow a linear, additive model. The cleaned data can then be subjected to a combination of PCA and orthogonal factor rotation followed by refinement with MCR-ALS to yield highly interpretable results.
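
    As a rough illustration of the kind of pipeline described (rank reduction followed by constrained alternating least squares), the Python sketch below initializes factors from an SVD and refines them with a crude non-negativity-constrained ALS loop. It is not the patented method: the orthogonal factor rotation and saturated-peak elimination steps are omitted, and the names and the clipping-based constraint are our assumptions.

        import numpy as np

        def mcr_als_sketch(D, n_factors, n_iter=50):
            # D: (samples x m/z channels) data matrix assumed to follow a
            # linear, additive model D ~ C @ S with C >= 0 and S >= 0.
            U, s, Vt = np.linalg.svd(D, full_matrices=False)   # "PCA" step
            C = np.abs(U[:, :n_factors] * s[:n_factors])       # score-like init
            S = np.abs(Vt[:n_factors, :])                      # loading-like init
            for _ in range(n_iter):
                # Alternating least squares with a crude non-negativity clip.
                C = np.clip(D @ np.linalg.pinv(S), 0.0, None)
                S = np.clip(np.linalg.pinv(C) @ D, 0.0, None)
            return C, S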

  4. Energy Factor Analysis for Gas Heat Pump Water Heaters

    SciTech Connect (OSTI)

    Gluesenkamp, Kyle R

    2016-01-01

    Gas heat pump water heaters (HPWHs) can improve water heating efficiency with zero-GWP and zero-ODP working fluids. The energy factor (EF) of a gas HPWH is sensitive to several factors. In this work, expressions are derived for the EF of gas HPWHs as a function of heat pump cycle COP, tank heat losses, burner efficiency, electrical draw, and effectiveness of supplemental heat exchangers. The expressions are used to investigate the sensitivity of EF to each parameter. EF is evaluated on a site energy basis (as used by the US DOE for rating water heater EF), and a primary-energy-basis energy factor (PEF) is also defined and included. Typical ranges of values for the six parameters are given. For gas HPWHs, using typical ranges for component performance, EF will be 59-80% of the heat pump cycle thermal COP (for example, a COP of 1.60 may result in an EF of 0.94-1.28). Most of the reduction in COP is due to burner efficiency and tank heat losses. Gas-fired HPWHs are theoretically capable of an EF of up to 1.7 (PEF of 1.6), while an EF of 1.1-1.3 (PEF of 1.0-1.1) is expected from an early market entry.
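
    Since the abstract does not reproduce the derived expressions, the Python sketch below shows only one plausible, simplified site-energy accounting consistent with the parameters listed (cycle COP, burner efficiency, tank losses, electrical draw). The functional form, parameter names, and example values are our assumptions, not the paper's equations.

        def site_energy_factor(cop_cycle, eta_burner, tank_loss_frac,
                               elec_frac_of_gas=0.02):
            # Illustrative accounting only: heat delivered to hot water per unit
            # of site energy (gas plus parasitic electricity), normalized to one
            # unit of gas input. Not the paper's derived expression.
            gas_in = 1.0
            elec_in = elec_frac_of_gas * gas_in
            heat_to_water = cop_cycle * eta_burner * gas_in * (1.0 - tank_loss_frac)
            return heat_to_water / (gas_in + elec_in)

        # Example: a cycle COP of 1.60, 85% burner efficiency, and 15% tank
        # losses give an EF of about 1.13, inside the 0.94-1.28 range above.
        print(round(site_energy_factor(1.60, 0.85, 0.15), 2))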

  5. Dirac equation in low dimensions: The factorization method

    SciTech Connect (OSTI)

    Sánchez-Monroy, J.A.; Quimbay, C.J.

    2014-11-15

    We present a general approach to solve the (1+1) and (2+1)-dimensional Dirac equations in the presence of static scalar, pseudoscalar and gauge potentials, for the case in which the potentials have the same functional form and thus the factorization method can be applied. We show that the presence of electric potentials in the Dirac equation leads to two Klein-Gordon equations including an energy-dependent potential. We then generalize the factorization method for the case of energy-dependent Hamiltonians. Additionally, the shape invariance is generalized for a specific class of energy-dependent Hamiltonians. We also present a condition for the absence of the Klein paradox (stability of the Dirac sea), showing how Dirac particles in low dimensions can be confined for a wide family of potentials. - Highlights: The low-dimensional Dirac equation in the presence of static potentials is solved. The factorization method is generalized for energy-dependent Hamiltonians. The shape invariance is generalized for energy-dependent Hamiltonians. The stability of the Dirac sea is related to the existence of supersymmetric partner Hamiltonians.

  6. Human factors engineering report for the cold vacuum drying facility

    SciTech Connect (OSTI)

    IMKER, F.W.

    1999-06-30

    The purpose of this report is to present the results and findings of the final Human Factors Engineering (HFE) technical analysis and evaluation of the Cold Vacuum Drying Facility (CVDF). Ergonomics issues are also addressed in this report, as appropriate. This report follows up and completes the preliminary work accomplished and reported by the Preliminary HFE Analysis report (SNF-2825, Spent Nuclear Fuel Project Cold Vacuum Drying Facility Human Factors Engineering Analysis: Results and Findings). This analysis avoids redundancy of effort except for ensuring that previously recommended HFE design changes have not affected other parts of the system. Changes in one part of the system may affect other parts of the system where those changes were not applied. The final HFE analysis and evaluation of the CVDF human-machine interactions (HMI) was expanded to include: the physical work environment, human-computer interface (HCI) including workstation and software, operator tasks, tools, maintainability, communications, staffing, training, and the overall ability of humans to accomplish their responsibilities, as appropriate. Key focal areas for this report are the process bay operations, process water conditioning (PWC) skid, tank room, and Central Control Room operations. These key areas contain the system safety-class components and are the foundation for the human factors design basis of the CVDF.

  7. Critical success factors in implementing process safety management

    SciTech Connect (OSTI)

    Wilson, D.J. [Chevron USA, Inc., New Orleans, LA (United States)]

    1996-08-01

    This paper focuses on several "Critical Success Factors" which will determine how well employees will embrace and utilize the changes being asked of them to implement Process Safety Management (PSM). These success factors are applicable to any change which involves asking employees to perform activities differently than they are currently performing them. This includes changes in work processes (the way we arrange and conduct a set of tasks) or changes in work activities (how we perform individual tasks). Simply developing new work processes and explaining them to employees is not enough to ensure that employees will actually utilize them -- no matter how good these processes are. To ensure successful, complete implementation of Process Safety Management, we must manage the transition from how we perform our work now to how we will perform it after PSM is implemented. Environmental and safety performance improvements, facility reliability and operability increases, and employee effectiveness and productivity gains CAN NOT be achieved until Process Safety Management processes are fully implemented. To successfully implement management of change, mechanical integrity, or any of the other processes in PSM, each of the following critical success factors must be carefully considered and utilized as appropriate. They are: (1) Vision of a Future State, Current State Assessment, and a Detailed Plan to Achieve the Future State, (2) Management Commitment, (3) Ownership by Key Individuals, (4) Justification for Actions, (5) Autonomy to Customize the Process, (6) Feedback Mechanism to Adjust Activities, and (7) Process to Refocus & Redirect Efforts.

  8. Automatic Blocking Of QR and LU Factorizations for Locality

    SciTech Connect (OSTI)

    Yi, Q; Kennedy, K; You, H; Seymour, K; Dongarra, J

    2004-03-26

    QR and LU factorizations for dense matrices are important linear algebra computations that are widely used in scientific applications. To efficiently perform these computations on modern computers, the factorization algorithms need to be blocked when operating on large matrices to effectively exploit the deep cache hierarchy prevalent in today's computer memory systems. Because both QR (based on Householder transformations) and LU factorization algorithms contain complex loop structures, few compilers can fully automate the blocking of these algorithms. Though linear algebra libraries such as LAPACK provide manually blocked implementations of these algorithms, additional benefits, such as automatic adaptation of different blocking strategies, can be gained by automatically generating blocked versions of the computations. This paper demonstrates how to apply an aggressive loop transformation technique, dependence hoisting, to produce efficient blockings for both QR and LU with partial pivoting. We present different blocking strategies that can be generated by our optimizer and compare the performance of auto-blocked versions with manually tuned versions in LAPACK, both using reference BLAS, ATLAS BLAS and native BLAS specially tuned for the underlying machine architectures.
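
    To make the blocking idea concrete, the Python/NumPy sketch below shows a right-looking blocked LU in the style the abstract describes: an unblocked panel factorization, a triangular solve for the corresponding block row of U, and a rank-b update of the trailing submatrix (the cache-friendly matrix-matrix step). Pivoting is omitted for brevity, so this is not the LAPACK getrf algorithm or the paper's generated code; it is only a sketch of the blocked structure.

        import numpy as np
        from scipy.linalg import solve_triangular

        def blocked_lu(A, b=64):
            # Right-looking blocked LU *without pivoting*; L (unit lower) and U
            # are returned packed in one array. Sketch of the blocking only.
            A = np.array(A, dtype=float)
            n = A.shape[0]
            for k in range(0, n, b):
                e = min(k + b, n)
                # 1. Unblocked factorization of the current panel A[k:n, k:e].
                for j in range(k, e):
                    A[j+1:n, j] /= A[j, j]
                    A[j+1:n, j+1:e] -= np.outer(A[j+1:n, j], A[j, j+1:e])
                if e < n:
                    # 2. Block row of U: U12 = L11^{-1} A12.
                    A[k:e, e:n] = solve_triangular(A[k:e, k:e], A[k:e, e:n],
                                                   lower=True, unit_diagonal=True)
                    # 3. Rank-b update of the trailing submatrix (GEMM-like step).
                    A[e:n, e:n] -= A[e:n, k:e] @ A[k:e, e:n]
            return A

        # Usage check on a diagonally dominant matrix (safe without pivoting):
        # M = np.random.rand(256, 256) + 256 * np.eye(256)
        # LU = blocked_lu(M); L = np.tril(LU, -1) + np.eye(256); U = np.triu(LU)
        # assert np.allclose(L @ U, M)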

  9. Transfer Factors for Contaminant Uptake by Fruit and Nut Trees

    SciTech Connect (OSTI)

    Napier, Bruce A.; Fellows, Robert J.; Minc, Leah D.

    2013-11-20

    Transfer of radionuclides from soils into plants is one of the key mechanisms for long-term contamination of the human food chain. Nearly all computer models that address soil-to-plant uptake of radionuclides use empirically-derived transfer factors to address this process. Essentially all available soil-to-plant transfer factors are based on measurements in annual crops. Because very few measurements are available for tree fruits, samples were taken of alfalfa and oats and the stems, leaves, and fruits and nuts of almond, apple, apricot, carob, fig, grape, nectarine, pecan, pistachio (natural and grafted), and pomegranate, along with local surface soil. The samples were dried, ground, weighed, and analyzed for trace constituents through a combination of induction-coupled plasma mass spectrometry and instrumental neutron activation analysis for a wide range of naturally-occurring elements. Analysis results are presented and converted to soil-to-plant transfer factors. These are compared to commonly used and internationally recommended values. Those determined for annual crops are very similar to commonly-used values; those determined for tree fruits show interesting differences. Most macro- and micronutrients are slightly reduced in fruits; non-essential elements are reduced further. These findings may be used in existing computer models and may allow development of tree-fruit-specific transfer models.
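
    For readers unfamiliar with the quantity being reported, a soil-to-plant transfer factor is conventionally the element concentration in dry plant tissue divided by its concentration in dry soil, both in the same units. The short Python sketch below states that definition; the example numbers are purely illustrative and are not taken from the study.

        def transfer_factor(conc_plant_dry, conc_soil_dry):
            # Conventional definition: dry-mass concentration in plant tissue
            # divided by dry-mass concentration in soil (same units, e.g. mg/kg).
            return conc_plant_dry / conc_soil_dry

        # Illustrative only: 0.8 mg/kg in fruit tissue vs 40 mg/kg in soil.
        print(transfer_factor(0.8, 40.0))  # 0.02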

  10. Patient-based radiographic exposure factor selection: a systematic review

    SciTech Connect (OSTI)

    Ching, William; Robinson, John; McEntee, Mark

    2014-09-15

    Digital technology has wider exposure latitude and post-processing algorithms which can mask the evidence of underexposure and overexposure. Underexposure produces noisy, grainy images which can impede diagnosis, and overexposure results in a greater radiation dose to the patient. These exposure errors can result from inaccurate adjustment of exposure factors in response to changes in patient thickness. This study aims to identify all published radiographic exposure adaptation systems which have been, or are being, used in general radiography and discuss their applicability to digital systems. Studies in EMBASE, MEDLINE, CINAHL and SCOPUS were systematically reviewed. Some of the search terms used were exposure adaptation, exposure selection, exposure technique, 25% rule, 15% rule, DuPont™ Bit System and radiography. A manual journal-specific search was also conducted in The Radiographer and Radiologic Technology. Studies were included if they demonstrated a system of altering exposure factors to compensate for variations in patients for general radiography. Studies were excluded if they focused on finding optimal exposures for an ‘average’ patient or focused on the relationship between exposure factors and dose. The database search uncovered 11 articles and the journal-specific search uncovered 13 articles discussing systems of exposure adaptation. They can be categorised as simple one-step guidelines, comprehensive charts and computer programs. Only two papers assessed the efficacy of exposure adjustment systems. No literature compares the efficacy of exposure adaptation systems for film/screen radiography with that for digital radiography technology, nor is there literature on a digital-specific exposure adaptation system.
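
    Of the one-step guidelines mentioned, the "15% rule" is the most commonly cited: a 15% increase in kVp is usually taken to roughly double receptor exposure, so the mAs can be halved to keep exposure constant. The Python sketch below interpolates that rule of thumb; the function name and the continuous 2**steps form are our assumptions, and the sketch is not an implementation of any specific system reviewed above.

        import math

        def adjust_mas_for_kvp(mas, kvp_old, kvp_new):
            # "15% rule" of thumb: each 15% step up in kVp roughly doubles
            # receptor exposure, so mAs is halved to compensate (and vice
            # versa). Interpolated here as a continuous relationship.
            steps = math.log(kvp_new / kvp_old) / math.log(1.15)
            return mas / (2.0 ** steps)

        # Example: going from 70 kVp to about 80.5 kVp (one 15% step) lets the
        # mAs drop from 20 to about 10 for a similar receptor exposure.
        print(round(adjust_mas_for_kvp(20.0, 70.0, 70.0 * 1.15), 1))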

  11. Assessment of Factors Influencing Effective CO{sub 2} Storage Capacity and Injectivity in Eastern Gas Shales

    SciTech Connect (OSTI)

    Godec, Michael

    2013-06-30

    Building upon advances in technology, production of natural gas from organic-rich shales is rapidly developing as a major hydrocarbon supply option in North America and around the world. The same technology advances that have facilitated this revolution - dense well spacing, horizontal drilling, and hydraulic fracturing - may help to facilitate enhanced gas recovery (EGR) and carbon dioxide (CO{sub 2}) storage in these formations. The potential storage of CO{sub 2} in shales is attracting increasing interest, especially in Appalachian Basin states that have extensive shale deposits, but limited CO{sub 2} storage capacity in conventional reservoirs. The goal of this cooperative research project was to build upon previous and on-going work to assess key factors that could influence effective EGR, CO{sub 2} storage capacity, and injectivity in selected Eastern gas shales, including the Devonian Marcellus Shale, the Devonian Ohio Shale, the Ordovician Utica and Point Pleasant shale and equivalent formations, and the late Devonian-age Antrim Shale. The project had the following objectives: (1) Analyze and synthesize geologic information and reservoir data through collaboration with selected State geological surveys, universities, and oil and gas operators; (2) improve reservoir models to perform reservoir simulations to better understand the shale characteristics that impact EGR, storage capacity and CO{sub 2} injectivity in the targeted shales; (3) Analyze results of a targeted, highly monitored, small-scale CO{sub 2} injection test and incorporate into ongoing characterization and simulation work; (4) Test and model a smart particle early warning concept that can potentially be used to inject water with uniquely labeled particles before the start of CO{sub 2} injection; (5) Identify and evaluate potential constraints to economic CO{sub 2} storage in gas shales, and propose development approaches that overcome these constraints; and (6) Complete new basin

  12. PHASER 2.10 methodology for dependence, importance, and sensitivity: The role of scale factors, confidence factors, and extremes

    SciTech Connect (OSTI)

    Cooper, J.A.

    1996-09-01

    PHASER (Probabilistic Hybrid Analytical System Evaluation Routine) is a software tool that has the capability of incorporating subjective expert judgment into probabilistic safety analysis (PSA) along with conventional data inputs. An earlier report described the PHASER methodology, but only gave a cursory explanation about how dependence was incorporated in Version 1.10 and about how ``Importance`` and ``Sensitivity`` measures were to be incorporated in Version 2.00. A more detailed description is given in this report. The basic concepts involve scale factors and confidence factors that are associated with the stochastic variability and subjective uncertainty (which are common adjuncts used in PSA), and the safety risk extremes that are crucial to safety assessment. These are all utilized to illustrate methodology for incorporating dependence among analysis variables in generating PSA results, and for Importance and Sensitivity measures associated with the results that help point out where any major sources of safety concern arise and where any major sources of uncertainty reside, respectively.

  13. Preparation and characterization of cobalt-substituted anthrax lethal factor

    SciTech Connect (OSTI)

    Saebel, Crystal E.; Carbone, Ryan; Dabous, John R.; Lo, Suet Y.; Siemann, Stefan, E-mail: ssiemann@laurentian.ca (Department of Chemistry and Biochemistry, Laurentian University, 935 Ramsey Lake Rd., Sudbury, Ontario, Canada P3E 2C6)

    2011-12-09

    Highlights: Cobalt-substituted anthrax lethal factor (CoLF) is highly active. CoLF can be prepared by bio-assimilation and direct exchange. Lethal factor binds cobalt tightly. The electronic spectrum of CoLF reveals penta-coordination. Interaction of CoLF with thioglycolic acid follows a 2-step mechanism. -- Abstract: Anthrax lethal factor (LF) is a zinc-dependent endopeptidase involved in the cleavage of mitogen-activated protein kinase kinases near their N-termini. The current report concerns the preparation of cobalt-substituted LF (CoLF) and its characterization by electronic spectroscopy. Two strategies to produce CoLF were explored, including (i) a bio-assimilation approach involving the cultivation of LF-expressing Bacillus megaterium cells in the presence of CoCl{sub 2}, and (ii) direct exchange by treatment of zinc-LF with CoCl{sub 2}. Independent of the method employed, the protein was found to contain one Co{sup 2+} per LF molecule, and was shown to be twice as active as its native zinc counterpart. The electronic spectrum of CoLF suggests the Co{sup 2+} ion to be five-coordinate, an observation similar to that reported for other Co{sup 2+}-substituted gluzincins, but distinct from that documented for the crystal structure of native LF. Furthermore, spectroscopic studies following the exposure of CoLF to thioglycolic acid (TGA) revealed a sequential mechanism of metal removal from LF, which likely involves the formation of an enzyme:Co{sup 2+}:TGA ternary complex prior to demetallation of the active site. CoLF reported herein constitutes the first spectroscopic probe of LF's active site, which may be utilized in future studies to gain further insight into the enzyme's mechanism and inhibitor interactions.

  14. Factors affecting expanded electricity trade in North America

    SciTech Connect (OSTI)

    Hill, L.J.

    1994-01-01

    The authors explore factors that affect electricity trade between enterprises in the US and Canada and the US and Mexico. They look to those underlying policy and institutional factors that affect the relative costs of producing electricity in the three countries. In particular, they consider six factors that appear to have a significant impact on electricity trade in North America: differences in the types of economic regulation of power leading to differences in cost recovery for wholesale and retail power and wheeling charges; changing regulatory attitudes, placing more emphasis on demand-side management and environmental concerns; differences in energy and economic policies; differences in national and subnational environmental policies; changing organization of electric power industries which may foster uncertainty, change historical relationships, and provide other potentially important sources of power for distribution utilities; and differences in the ability of enterprises to gain access to electric power markets because of restrictions placed on transmission access. In Section 2, the authors discuss the regulation of electricity trade in North America and provide an overview of the recent trading experience for electricity between Canada and the US and between Mexico and the US, including the volume of that trade over the past decade and existing transmission capacity between regions of the three countries. In Section 3, they look at the benefits that accrue to trading counties and what those benefits are likely to be for the three countries. The discussion in Section 4 centers on the relevant provisions of the Canada Free Trade Agreement and the proposed North American Free Trade Agreement. In Section 5, they set the stage for the discussion of policy and institutional differences presented in Section 6 by outlining differences in the organization of the electric power sectors of Canada, the US, and Mexico. The study is synthesized in Section 7.

  15. Pion Form Factor in Improved Holographic QCD Backgrounds

    SciTech Connect (OSTI)

    Kwee, Herry J.

    2010-08-05

    We extend our recent numerical calculation of the pion electromagnetic form factor F{sub {pi}}(Q{sup 2}) in holographic QCD with a background field that interpolates between 'hard-wall' and 'soft-wall' models to obtain an improved model that reproduces the desirable phenomenological features of both. In all cases, F{sub {pi}} for large Q{sup 2} is shallower than data, an effect that can be cured by relaxing the fit to one of the static observables, particularly the decay constant f{sub {pi}}.

  16. Greybody factors and charges in Kerr/CFT

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Cvetič, Mirjam; Larsen, Finn

    2009-09-01

    We compute greybody factors for near extreme Kerr black holes in D = 4 and D = 5. In D = 4 we include four charges so that our solutions can be continuously deformed to the BPS limit. In D = 5 we include two independent angular momenta so Left-Right symmetry is incorporated. We discuss the CFT interpretation of our emission amplitudes, including the overall frequency dependence and the dependence on all black hole parameters. We find that all additional parameters can be incorporated into Kerr/CFT, with central charge independent of U(1) charges.

  17. On the relationship between formation resistivity factor and porosity

    SciTech Connect (OSTI)

    Perez-Rosales, C.

    1982-08-01

    A theory on the relationship between formation resistivity factor and porosity is presented. This theory considers that, from the standpoint of the flow of electric current within a porous medium saturated with a conducting fluid, the pore space can be divided into flowing and stagnant regions. This assumption leads to a general expression, and formulas currently used in practice are special cases of this expression. The validity of the new expression is established by the use of data corresponding to sandstones and packings and suspensions of particles. For the case of natural rocks, the theory confirms Archie's equation and gives an interpretation of the physical significance of the so-called cementation exponent.
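
    The best-known special case of such formation factor-porosity expressions is the Archie-type relation F = a / phi^m, where m is the cementation exponent mentioned above. The Python sketch below states only that textbook special case, not the general flowing/stagnant expression derived in the paper; the default values of a and m are illustrative.

        def formation_factor_archie(porosity, a=1.0, m=2.0):
            # Archie-type relation F = a / phi**m; a and the cementation
            # exponent m are empirical fitting parameters (defaults here are
            # illustrative, not recommendations).
            return a / porosity ** m

        # Example: 20% porosity with a = 1 and m = 2 gives F = 25.
        print(formation_factor_archie(0.20))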

  18. Kaon semileptonic vector form factor and determination of |Vus| using

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    staggered fermions | Argonne Leadership Computing Facility. Authors: A. Bazavov, C. Bernard, C. M. Bouchard, C. DeTar, Daping Du, A. X. El-Khadra, J. Foley, E. D. Freeland, E. Gámiz, Steven Gottlieb, U. M. Heller, Jongjeong Kim, A. S. Kronfeld, J. Laiho, L. Levkova, P. B. Mackenzie, E. T. Neil, M. B. Oktay, Si-Wei Qiu, J. N. Simone, R. Sugar, D. Toussaint, R. S. Van de Water, Ran Zhou. Using staggered

  19. Gyromagnetic factors in {sup 144-150}Nd

    SciTech Connect (OSTI)

    Giannatiempo, A.

    2011-09-15

    The U(5) to SU(3) evolution of the nuclear structure in the even {sup 144-156}Nd isotopes has been investigated in the framework of the interacting boson approximation (IBA-2) model, taking into account the effect of the partial Z=64 subshell closure on the structure of the states of a collective nature. The analysis, which led to a satisfactory description of excitation energy patterns, quadrupole moments, and decay properties of the states (even when important M1 components were present in the transitions), is extended to the available data on g factors, in {sup 144-150}Nd. Their values are reasonably reproduced by the calculations.

  20. Human factors issues in qualitative and quantitative safety analyses

    SciTech Connect (OSTI)

    Hahn, H.A.

    1993-10-01

    Humans are a critical and integral part of any operational system, be it a nuclear reactor, a facility for assembly or disassembling hazardous components, or a transportation network. In our concern over the safety of these systems, we often focus our attention on the hardware engineering components of such systems. However, experience has repeatedly demonstrated that it is often the human component that is the primary determinant of overall system safety. Both the nuclear reactor accidents at Chernobyl and Three Mile Island and shipping disasters such as the Exxon Valdez and the Herald of Free Enterprise accidents are attributable to human error. Concern over human contributions to system safety prompts us to include reviews of human factors issues in our safety analyses. In the conduct of Probabilistic Risk Assessments (PRAs), human factors issues are addressed using a quantitative method called Human Reliability Analysis (HRA). HRAs typically begin with the identification of potential sources of human error in accident sequences of interest. Human error analysis often employs plant and/or procedures walk-downs in which the analyst considers the ``goodness`` of procedures, training, and human-machine interfaces concerning their potential contribution to human error. Interviews with expert task performers may also be conducted. In the application of HRA, once candidate sources of human error have been identified, error probabilities are developed.

  1. Asymptotic, multigroup flux reconstruction and consistent discontinuity factors

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Trahan, Travis J.; Larsen, Edward W.

    2015-05-12

    Recent theoretical work has led to an asymptotically derived expression for reconstructing the neutron flux from lattice functions and multigroup diffusion solutions. The leading-order asymptotic term is the standard expression for flux reconstruction, i.e., it is the product of a shape function, obtained through a lattice calculation, and the multigroup diffusion solution. The first-order asymptotic correction term is significant only where the gradient of the diffusion solution is not small. Inclusion of this first-order correction term can significantly improve the accuracy of the reconstructed flux. One may define discontinuity factors (DFs) to make certain angular moments of the reconstructed flux continuous across interfaces between assemblies in 1-D. Indeed, the standard assembly discontinuity factors make the zeroth moment (scalar flux) of the reconstructed flux continuous. The inclusion of the correction term in the flux reconstruction provides an additional degree of freedom that can be used to make two angular moments of the reconstructed flux continuous across interfaces by using current DFs in addition to flux DFs. Thus, numerical results demonstrate that using flux and current DFs together can be more accurate than using only flux DFs, and that making the second angular moment continuous can be more accurate than making the zeroth moment continuous.
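
    Structurally, the reconstruction described above is a leading-order product term plus a gradient-weighted correction. The Python sketch below captures only that structure; the actual form of the correction coefficient (and of the discontinuity factors) is defined in the cited work, and the function and argument names here are our own.

        import numpy as np

        def reconstruct_flux(x, shape_fn, diffusion_soln, correction_fn=None):
            # Leading order: shape function times the multigroup diffusion
            # solution. Optional first-order term: a correction coefficient
            # times the gradient of the diffusion solution, which matters where
            # that gradient is not small. Structure only, not the derived form.
            phi = shape_fn(x) * diffusion_soln(x)
            if correction_fn is not None:
                phi = phi + correction_fn(x) * np.gradient(diffusion_soln(x), x)
            return phi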

  2. Effectiveness factors for hydroprocessing of coal and coal liquids

    SciTech Connect (OSTI)

    Massoth, F.E.; Seader, J.D.

    1990-03-29

    The aim of this project is to develop a methodology to predict, from a knowledge of feed and catalyst properties, effectiveness factors for catalytic hydroprocessing of coal and coal liquids. To achieve this aim, it is necessary to account for restrictive diffusion, which has not hitherto been done from a fundamental approach under reaction conditions. The research entails a study of hydrodenitrogenation of model compounds and coal-derived liquids using three NiMo/alumina catalysts of different pore size to develop, for restrictive diffusion, a relationship that can be used for estimating reliable effectiveness factors. The research program includes: Task A - measurement of pertinent properties of the catalysts and of several coal liquids; Task B - determination of effective diffusivities and tortuosities of the catalysts; Task C - development of restrictive diffusion correlations from data on model N-compound reactions; Task D - testing of correlations with coal-liquid cuts and whole coal-liquid feed. Results are presented and discussed from Tasks B and D. 9 refs., 6 figs., 4 tabs.

  3. Effectiveness factors for hydroprocessing of coal and coal liquids

    SciTech Connect (OSTI)

    Massoth, F.E.; Seader, J.D.

    1990-01-01

    The aim of this research project is to develop a methodology to predict, from a knowledge of feed and catalyst properties, effectiveness factors for catalytic hydroprocessing of coal and coal liquids. To achieve this aim, it is necessary to account for restrictive diffusion, which has not hitherto been done from a fundamental approach under reaction conditions. The research proposed here entails a study of hydrodenitrogenation of model compounds and coal-derived liquids using three NiMo/alumina catalysts of different pore size to develop, for restrictive diffusion, a relationship that can be used for estimating reliable effectiveness factors. The program is divided into four parts: measurements of pertinent properties of the catalysts and of a coal liquid and its derived boiling-point cuts; determination of effective diffusivities and tortuosities of the catalysts; development of restrictive diffusion correlations from data on model N-compounds at reaction conditions; and testing of correlations with coal-liquid cuts and whole coal-liquid feed, modifying correlations as necessary.

  4. Factors relevant to utility integration of intermittent renewable technologies

    SciTech Connect (OSTI)

    Wan, Yih-huei; Parsons, B.K.

    1993-08-01

    This study assesses factors that utilities must address when they integrate intermittent renewable technologies into their power-supply systems; it also reviews the literature in this area and has a bibliography containing more than 350 listings. Three topics are covered: (1) interface (hardware and design-related interconnection), (2) operability/stability, and (3) planning. This study finds that several commonly held perceptions regarding integration of intermittent renewable energy technologies are not valid. Among findings of the study are the following: (1) hardware and system design advances have eliminated most concerns about interface; (2) cost penalties have not occurred at low to moderate penetration levels (and high levels are feasible); and (3) intermittent renewable energy technologies can have capacity values. Obstacles still interfering with intermittent renewable technologies are also identified.

  5. Making tensor factorizations robust to non-gaussian noise.

    SciTech Connect (OSTI)

    Chi, Eric C.; Kolda, Tamara Gibson

    2011-03-01

    Tensors are multi-way arrays, and the CANDECOMP/PARAFAC (CP) tensor factorization has found application in many different domains. The CP model is typically fit using a least squares objective function, which is a maximum likelihood estimate under the assumption of independent and identically distributed (i.i.d.) Gaussian noise. We demonstrate that this loss function can be highly sensitive to non-Gaussian noise. Therefore, we propose a loss function based on the 1-norm because it can accommodate both Gaussian and grossly non-Gaussian perturbations. We also present an alternating majorization-minimization (MM) algorithm for fitting a CP model using our proposed loss function (CPAL1) and compare its performance to the workhorse algorithm for fitting CP models, CP alternating least squares (CPALS).
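
    The contrast drawn above between the least-squares and 1-norm objectives can be illustrated with a minimal CP alternating-least-squares loop in Python/NumPy (an illustrative sketch, not the authors' CPAL1 or CPALS code); the CPAL1 method of the report keeps the same alternating structure but replaces the squared-error objective with a 1-norm loss minimized by majorization-minimization.

        import numpy as np

        def khatri_rao(B, C):
            """Column-wise Khatri-Rao product; row (j, k) of the result is B[j, :] * C[k, :]."""
            J, R = B.shape
            K, _ = C.shape
            return (B[:, None, :] * C[None, :, :]).reshape(J * K, R)

        def cp_als(X, rank, n_iters=200, seed=0):
            """Minimal CP fit of a 3-way array by alternating least squares.
            The least-squares objective is the maximum-likelihood fit under i.i.d. Gaussian noise."""
            rng = np.random.default_rng(seed)
            I, J, K = X.shape
            A = rng.standard_normal((I, rank))
            B = rng.standard_normal((J, rank))
            C = rng.standard_normal((K, rank))
            X0 = X.reshape(I, -1)                        # mode-0 unfolding
            X1 = np.moveaxis(X, 1, 0).reshape(J, -1)     # mode-1 unfolding
            X2 = np.moveaxis(X, 2, 0).reshape(K, -1)     # mode-2 unfolding
            for _ in range(n_iters):
                A = X0 @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
                B = X1 @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
                C = X2 @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
            return A, B, C

        # Toy example: a rank-2 tensor corrupted by a few gross (non-Gaussian) outliers.
        rng = np.random.default_rng(1)
        I, J, K, R = 10, 9, 8, 2
        A0, B0, C0 = (rng.standard_normal((n, R)) for n in (I, J, K))
        X_clean = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
        X_noisy = X_clean.copy()
        X_noisy.flat[rng.choice(X_clean.size, size=5, replace=False)] += 50.0
        A, B, C = cp_als(X_noisy, R)
        fit_err = np.linalg.norm(np.einsum('ir,jr,kr->ijk', A, B, C) - X_clean)
        print(f"error of least-squares CP fit vs. clean tensor: {fit_err:.2f}")  # inflated by the outliers

    The inflated error against the clean tensor illustrates the sensitivity of the least-squares objective to gross outliers that motivates the 1-norm loss studied in the record above.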

  6. Control of mechanically activated polymersome fusion: Factors affecting fusion

    SciTech Connect (OSTI)

    Henderson, Ian M.; Paxton, Walter F.

    2014-12-15

    Previously we have studied the mechanically-activated fusion of extruded (200 nm) polymer vesicles into giant polymersomes using agitation in the presence of salt. In this study we have investigated several factors contributing to this phenomenon, including the effects of (i) polymer vesicle concentration, (ii) agitation speed and duration, and (iii) variation of the salt and its concentration. It was found that increasing the concentration of the polymer dramatically increases the production of giant vesicles through the increased collisions of polymersomes. Our investigations also found that increasing the frequency of agitation increased the efficiency of fusion, though ultimately limited the size of vesicle that could be produced due to the high shear involved. Finally, it was determined that salt mediation of the fusion process was not limited to NaCl, but is instead a general effect facilitated by the presence of solvated ionic compounds, albeit with different salts initiating fusion at different concentrations.

  7. Control of mechanically activated polymersome fusion: Factors affecting fusion

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Henderson, Ian M.; Paxton, Walter F.

    2014-12-15

    Previously we have studied the mechanically-activated fusion of extruded (200 nm) polymer vesicles into giant polymersomes using agitation in the presence of salt. In this study we have investigated several factors contributing to this phenomenon, including the effects of (i) polymer vesicle concentration, (ii) agitation speed and duration, and (iii) variation of the salt and its concentration. It was found that increasing the concentration of the polymer dramatically increases the production of giant vesicles through the increased collisions of polymersomes. Our investigations also found that increasing the frequency of agitation increased the efficiency of fusion, though ultimately limited the size of vesicle that could be produced due to the high shear involved. Finally, it was determined that salt mediation of the fusion process was not limited to NaCl, but is instead a general effect facilitated by the presence of solvated ionic compounds, albeit with different salts initiating fusion at different concentrations.

  8. Factorization method and new potentials from the inverted oscillator

    SciTech Connect (OSTI)

    Bermudez, David; Fernández C., David J.

    2013-06-15

    In this article we will apply the first- and second-order supersymmetric quantum mechanics to obtain new exactly-solvable real potentials departing from the inverted oscillator potential. This system has some special properties; in particular, only very specific second-order transformations produce non-singular real potentials. It will be shown that these transformations turn out to be the so-called complex ones. Moreover, we will study the factorization method applied to the inverted oscillator and the algebraic structure of the new Hamiltonians. -- Highlights: We apply supersymmetric quantum mechanics to the inverted oscillator potential. The complex second-order transformations allow us to build new non-singular potentials. The algebraic structure of the initial and final potentials is analyzed. The initial potential is described by a complex-deformed Heisenberg-Weyl algebra. The final potentials are described by polynomial Heisenberg algebras.

  9. Meson Transition Form Factors in Light-Front Holographic QCD

    SciTech Connect (OSTI)

    Brodsky, Stanley J.; Cao, Fu-Guang; de Teramond, Guy F.; /Costa Rica U.

    2011-06-22

    We study the photon-to-meson transition form factors (TFFs) $F_{M\gamma}(Q^2)$ for $\gamma\gamma^* \to M$ using light-front holographic methods. The Chern-Simons action, which is a natural form in 5-dimensional anti-de Sitter (AdS) space, leads directly to an expression for the photon-to-pion TFF for a class of confining models. Remarkably, the predicted pion TFF is identical to the leading-order QCD result where the distribution amplitude has asymptotic form. The Chern-Simons form is local in AdS space and is thus somewhat limited in its predictability. It only retains the $q\bar{q}$ component of the pion wavefunction, and further, it projects out only the asymptotic form of the meson distribution amplitude. It is found that in order to describe simultaneously the decay process $\pi^0 \to \gamma\gamma$ and the pion TFF at the asymptotic limit, a probability for the $q\bar{q}$ component of the pion wavefunction $P_{q\bar{q}} = 0.5$ is required, thus giving an indication that the contributions from higher Fock states in the pion light-front wavefunction need to be included in the analysis. The probability for the Fock state containing four quarks (anti-quarks), which follows from analyzing the hadron matrix elements, $P_{q\bar{q}q\bar{q}} \approx 10\%$, agrees with the analysis of the pion elastic form factor using light-front holography including higher Fock components in the pion wavefunction. The results for the TFFs for the $\eta$ and $\eta'$ mesons are also presented. The rapid growth of the pion TFF exhibited by the BABAR data at high $Q^2$ is not compatible with the models discussed in this article, whereas the theoretical calculations are in agreement with the experimental data for the $\eta$ and $\eta'$ TFFs.

  10. Energy Price Indices and Discount Factors for Life-Cycle Cost...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Price Indices and Discount Factors for Life-Cycle Cost Analysis - 2015 Energy Price Indices and Discount Factors for Life-Cycle Cost Analysis - 2015 Handbook describes the annual ...

  11. LATTICE WITH SMALLER MOMENTUM COMPACTION FACTOR FOR PEP-II HIGH...

    Office of Scientific and Technical Information (OSTI)

    LATTICE WITH SMALLER MOMENTUM COMPACTION FACTOR FOR PEP-II HIGH ENERGY RING Citation Details In-Document Search Title: LATTICE WITH SMALLER MOMENTUM COMPACTION FACTOR FOR PEP-II ...

  12. Neutron and Gamma-Ray Kerma Factors Based on LLNL Nuclear Data Files.

    Energy Science and Technology Software Center (OSTI)

    1991-07-08

    Version 00 Kerma factors are used extensively in biomedical applications. Specifically, neutron kerma factors are used in determining heating in materials of interest from neutron-induced reactions in fission or fusion power applications.
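
    As background on how kerma factors enter such heating calculations (a general definition, not documentation of this software package), the kerma K delivered to a material by a neutron field is the fluence-weighted kerma factor:

        \[
        K \;=\; \int \phi_E(E)\, k_F(E)\, \mathrm{d}E
        \qquad\text{(monoenergetic case: } K = \Phi\, k_F(E)\text{)},
        \]

    where \phi_E(E) is the fluence differential in energy and k_F(E) is the kerma factor at neutron energy E, with units of dose times area per unit fluence (e.g., Gy·cm²).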

  13. Central safety factor and β N control on NSTX-U via beam power...

    Office of Scientific and Technical Information (OSTI)

    Central safety factor and βN control on NSTX-U via beam power and plasma boundary shape ... Citation Details In-Document Search Title: Central safety factor and βN control on ...

  14. Human Factors for Situation Assessment in Grid Operations

    SciTech Connect (OSTI)

    Guttromson, Ross T.; Schur, Anne; Greitzer, Frank L.; Paget, Mia L.

    2007-08-08

    Executive Summary: Despite advances in technology, power system operators must assimilate overwhelming amounts of data to keep the grid operating. Analyses of recent blackouts have clearly demonstrated the need to enhance the operators' situation awareness (SA). The long-term objective of this research is to integrate valuable technologies into the grid operator environment that support decision making under normal and abnormal operating conditions and remove non-technical barriers to enable the optimum use of these technologies by individuals working alone and as a team. More specifically, the research aims to identify methods and principles to increase SA of grid operators in the context of system conditions that are representative or common across many operating entities and develop operationally relevant experimental methods for studying technologies and operational practices which contribute to SA. With increasing complexity and interconnectivity of the grid, the scope and complexity of situation awareness have grown. New paradigms are needed to guide research and tool development aimed at enhancing and improving operations. In reviewing related research, operating practices, systems, and tools, the present study established a taxonomy that provides a perspective on research and development surrounding power grid situation awareness and clarifies the field of human factors/SA for grid operations. Information sources that we used to identify critical factors underlying SA included interviews with experienced operational personnel, available historical summaries and transcripts of abnormal conditions and outages (e.g., the August 14, 2003 blackout), scientific literature, and operational policies/procedures and other documentation. Our analysis of August 2003 blackout transcripts and interviews adopted a different perspective from previous analyses of this material, and we complemented this analysis with additional interviews. Based on our analysis and a broad

  15. Global Agricultural Supply and Demand: Factors Contributing to the Recent Increase in Food Commodity Prices

    SciTech Connect (OSTI)

    none,

    2008-05-01

    This report discusses the factors that have led to global food commodity price inflation and addresses the resulting implications.

  16. Meson transition form factors in light-front holographic QCD

    SciTech Connect (OSTI)

    Brodsky, Stanley J.; Cao, Fu-Guang; de Teramond, Guy F.

    2011-10-01

    We study the photon-to-meson transition form factors (TFFs) $F_{M\gamma}(Q^2)$ for $\gamma\gamma^* \to M$ using light-front holographic methods. The Chern-Simons action, which is a natural form in five-dimensional anti-de Sitter (AdS) space, is required to describe the anomalous coupling of mesons to photons using holographic methods and leads directly to an expression for the photon-to-pion TFF for a class of confining models. Remarkably, the predicted pion TFF is identical to the leading-order QCD result where the distribution amplitude has asymptotic form. The Chern-Simons form is local in AdS space and is thus somewhat limited in its predictability. It only retains the $q\bar{q}$ component of the pion wave function, and further, it projects out only the asymptotic form of the meson distribution amplitude. It is found that in order to describe simultaneously the decay process $\pi^0 \to \gamma\gamma$ and the pion TFF at the asymptotic limit, a probability for the $q\bar{q}$ component of the pion wave function $P_{q\bar{q}} = 0.5$ is required, thus giving an indication that the contributions from higher Fock states in the pion light-front wave function need to be included in the analysis. The probability for the Fock state containing four quarks, $P_{q\bar{q}q\bar{q}} \approx 10\%$, which follows from analyzing the hadron matrix elements for a dressed current model, agrees with the analysis of the pion elastic form factor using light-front holography including higher Fock components in the pion wave function. The results for the TFFs for the $\eta$ and $\eta'$ mesons are also presented. The rapid growth of the pion TFF exhibited by the BABAR data at high $Q^2$ is not compatible with the models discussed in this article, whereas the theoretical calculations are in agreement with the experimental data for the $\eta$ and $\eta'$ TFFs.
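
    For orientation, the "leading-order QCD result" with an asymptotic distribution amplitude referred to in both of the records above is the Brodsky-Lepage asymptotic limit of the pion TFF (quoted here as standard background, not as a number taken from the paper):

        \[
        \lim_{Q^2 \to \infty} Q^2\, F_{\pi\gamma}(Q^2) \;=\; 2 f_\pi \;\approx\; 0.185~\mathrm{GeV},
        \qquad f_\pi \simeq 92~\mathrm{MeV},
        \]

    which is the scale against which the rapid growth of the BABAR data at high $Q^2$ is judged.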

  17. Decommissioning Cost Estimating Factors And Earned Value Integration

    SciTech Connect (OSTI)

    Sanford, P.C.; Cimmarron, E.

    2008-07-01

    The Rocky Flats 771 Project progressed from the planning stage of decommissioning a plutonium facility, through the strip-out of highly-contaminated equipment, removal of utilities and structural decontamination, and building demolition. Actual cost data was collected from the strip-out activities and compared to original estimates, allowing the development of cost by equipment groupings and types and over time. Separate data was developed from the project control earned value reporting and compared with the equipment data. The paper discusses the analysis to develop the detailed factors for the different equipment types, and the items that need to be considered during characterization of a similar facility when preparing an estimate. The factors are presented based on direct labor requirements by equipment type. The paper also includes actual support costs, and examples of fixed or one-time start-up costs. The integration of the estimate and the earned value system used for the 771 Project is also discussed. The paper covers the development of the earned value system as well as its application to a facility to be decommissioned and an existing work breakdown structure. Lessons learned are provided, including integration with scheduling and craft supervision, measurement approaches, and verification of scope completion. In summary: The work of decommissioning the Rocky Flats 771 Project process equipment was completed in 2003. Early in the planning process, we had difficulty in identifying credible data and implementing processes for estimating and controlling this work. As the project progressed, we were able to collect actual data on the costs of removing plutonium contaminated equipment from various areas over the life of this work and associate those costs with individual pieces of equipment. We also were able to develop and test out a system for measuring the earned value of a decommissioning project based on an evolving estimate. These were elements that

  18. A Review of Operational Water Consumption and Withdrawal Factors for Electricity Generating Technologies

    SciTech Connect (OSTI)

    Macknick, Jordan; Newmark, Robin; Heath, Garvin; Hallett, K. C.

    2011-03-01

    This report provides estimates of operational water withdrawal and water consumption factors for electricity generating technologies in the United States. Estimates of water factors were collected from published primary literature and were not modified except for unit conversions. The presented water factors may be useful in modeling and policy analyses where reliable power plant level data are not available.

  19. Extracellular nonmitogenic angiogenesis factor and method of isolation thereof from wound fluid

    DOE Patents [OSTI]

    Banda, M.J.; Werb, Z.; Knighton, D.R.; Hunt, T.K.

    1985-03-05

    A nonmitogenic angiogenesis factor is isolated from wound fluid by dialysis to include materials in the molecular size range of 2,000 to 14,000, lyophilization, and chromatography. The nonmitogenic angiogenesis factor is identified by activity by corneal implant assay and by cell migration assay. The angiogenesis factor is also characterized by inactivity by mitogenesis assay. 3 figs.

  20. Extracellular nonmitogenic angiogenesis factor and method of isolation thereof from wound fluid

    DOE Patents [OSTI]

    Banda, Michael J.; Werb, Zena; Knighton, David R.; Hunt, Thomas K.

    1985-01-01

    A nonmitogenic angiogenesis factor is isolated from wound fluid by dialysis to include materials in the molecular size range of 2,000 to 14,000, lyophilization, and chromatography. The nonmitogenic angiogenesis factor is identified by activity by corneal implant assay and by cell migration assay. The angiogenesis factor is also characterized by inactivity by mitogenesis assay.

  1. Sugarcane transgenics expressing MYB transcription factors show improved glucose release

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Poovaiah, Charleson R.; Bewg, William P.; Lan, Wu; Ralph, John; Coleman, Heather D.

    2016-07-15

    Sugarcane, a tropical C4 perennial crop, is capable of producing 30-100 tons or more of biomass per hectare annually. The lignocellulosic residue remaining after sugar extraction is currently underutilized and can provide a significant source of biomass for the production of second-generation bioethanol. In this study, MYB31 and MYB42 were cloned from maize and expressed in sugarcane with and without the UTR sequences. The cloned sequences were 98 and 99% identical to the published nucleotide sequences. The inclusion of the UTR sequences did not affect any of the parameters tested. There was little difference in plant height and the number of internodes of the MYB-overexpressing sugarcane plants when compared with controls. MYB transgene expression determined by qPCR exhibited continued expression in young and maturing internodes. MYB31 downregulated more genes within the lignin biosynthetic pathway than MYB42. MYB31 and MYB42 expression resulted in decreased lignin content in some lines. All MYB42 plants further analyzed showed significant increases in glucose release by enzymatic hydrolysis in 72 h, whereas only two MYB31 plants released more glucose than control plants. This correlated directly with a significant decrease in acid-insoluble lignin. Soluble sucrose content of the MYB42 transgenic plants did not vary compared to control plants. In conclusion, this study demonstrates the use of MYB transcription factors to improve the production of bioethanol from sugarcane bagasse remaining after sugar extraction.

  2. Hyperdynamics boost factor achievable with an ideal bias potential

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Huang, Chen; Perez, Danny; Voter, Arthur F.

    2015-08-20

    Hyperdynamics is a powerful method to significantly extend the time scales amenable to molecular dynamics simulation of infrequent events. One outstanding challenge, however, is the development of the so-called bias potential required by the method. In this work, we design a bias potential using information about all minimum energy pathways (MEPs) out of the current state. While this approach is not suitable for use in an actual hyperdynamics simulation, because the pathways are generally not known in advance, it allows us to show that it is possible to come very close to the theoretical boost limit of hyperdynamics while maintaining high accuracy. We demonstrate this by applying this MEP-based hyperdynamics (MEP-HD) to metallic surface diffusion systems. In most cases, MEP-HD gives boost factors that are orders of magnitude larger than the best existing bias potential, indicating that further development of hyperdynamics bias potentials could have a significant payoff. Lastly, we discuss potential practical uses of MEP-HD, including the possibility of developing MEP-HD into a true hyperdynamics.
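
    For context, the boost factor reported in hyperdynamics studies is defined in the standard way (general background, not a formula specific to this paper): the accumulated hyperdynamics time advances faster than the molecular-dynamics clock by the Boltzmann factor of the bias potential ΔV, averaged over the biased trajectory,

        \[
        \mathrm{boost} \;=\; \frac{t_{\mathrm{hyper}}}{t_{\mathrm{MD}}}
        \;=\; \left\langle e^{\,\Delta V(\mathbf{x})/k_B T} \right\rangle_{b} ,
        \]

    so the "ideal" bias potential of the title is the one that maximizes this average while leaving the state-to-state dynamics unchanged.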

  3. Factors controlling pathogen destruction during anaerobic digestion of biowastes

    SciTech Connect (OSTI)

    Smith, S.R. (E-mail: s.r.smith@imperial.ac.uk); Lang, N.L.; Cheung, K.H.M.; Spanoudaki, K.

    2005-07-01

    Anaerobic digestion is the principal method of stabilising biosolids from urban wastewater treatment in the UK, and it also has application for the treatment of other types of biowaste. Increasing awareness of the potential risks to human and animal health from environmental sources of pathogens has focused attention on the efficacy of waste treatment processes at destroying pathogenic microorganisms in biowastes recycled to agricultural land. The degree of disinfection achieved by a particular anaerobic digester is influenced by a variety of interacting operational variables and conditions, which can often deviate from the ideal. Experimental investigations demonstrate that Escherichia coli and Salmonella spp. are not damaged by mesophilic temperatures, whereas rapid inactivation occurs by thermophilic digestion. A hydraulic, biokinetic and thermodynamic model of pathogen inactivation during anaerobic digestion showed that a 2-log10 reduction in E. coli (the minimum removal required for agricultural use of conventionally treated biosolids) is likely to challenge most conventional mesophilic digesters, unless strict maintenance and management practices are adopted to minimise dead zones and by-pass flow. Efficient mixing and organic matter stabilisation are the main factors controlling the rate of inactivation under mesophilic conditions and not a direct effect of temperature per se on pathogenic organisms.
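
    For readers unfamiliar with the log-reduction convention used above, it is defined on the base-10 logarithm of the pathogen count, so a 2-log10 reduction corresponds to 99% inactivation:

        \[
        \text{log reduction} = \log_{10}\!\left(\frac{N_0}{N}\right),
        \qquad
        2\text{-}\log_{10} \;\Rightarrow\; \frac{N}{N_0} = 10^{-2} = 1\%\ \text{surviving}.
        \]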

  4. Highly Efficient Small Form Factor LED Retrofit Lamp

    SciTech Connect (OSTI)

    Steven Allen; Fred Palmer; Ming Li

    2011-09-11

    This report summarizes work to develop a high efficiency LED-based MR16 lamp downlight at OSRAM SYLVANIA under US Department of Energy contract DE-EE0000611. A new multichip LED package, electronic driver, and reflector optic were developed for these lamps. At steady state, the lamp had a luminous flux of 409 lumens (lm), a luminous efficacy of 87 lumens per watt (LPW), a CRI (Ra) of 87, and an R9 of 85 at a correlated color temperature (CCT) of 3285 K. The LED alone achieved 120 lumens per watt efficacy and 600 lumen flux output at 25 °C. The driver had 90% electrical conversion efficiency while maintaining excellent power quality with power factor >0.90 at a power of only 5 watts. Compared to similar existing MR16 lamps using LED sources, these lamps had much higher efficacy and color quality. The objective of this work was to demonstrate a LED-based MR16 retrofit lamp for replacement of 35W halogen MR16 lamps having (1) luminous flux of 500 lumens, (2) luminous efficacy of 100 lumens per watt, (3) beam angle less than 40° and center beam candlepower of at least 1000 candelas, and (4) excellent color quality.
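
    As a quick arithmetic check on the figures quoted above (an illustration, not additional measured data), the steady-state input power implied by the flux and efficacy is

        \[
        P_{\mathrm{lamp}} = \frac{409\ \mathrm{lm}}{87\ \mathrm{lm/W}} \approx 4.7\ \mathrm{W},
        \qquad
        P_{\mathrm{LED}} \approx 0.90 \times 4.7\ \mathrm{W} \approx 4.2\ \mathrm{W},
        \]

    consistent with the stated operating power of about 5 watts, with roughly 4.2 W of that reaching the LED package through the 90%-efficient driver.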

  5. Factors influencing radiation therapy student clinical placement satisfaction

    SciTech Connect (OSTI)

    Bridge, Pete; Carmichael, Mary-Ann

    2014-02-15

    Introduction: Radiation therapy students at Queensland University of Technology (QUT) attend clinical placements at five different clinical departments with varying resources and support strategies. This study aimed to determine the relative availability and perceived importance of different factors affecting student support while on clinical placement. The purpose of the research was to inform development of future support mechanisms to enhance radiation therapy students' experience on clinical placement. Methods: This study used anonymous Likert-style surveys to gather data from years 1 and 2 radiation therapy students from QUT and clinical educators from Queensland relating to availability and importance of support mechanisms during clinical placements in a semester. Results: The study findings demonstrated student satisfaction with clinical support and suggested that level of support on placement influenced student employment choices. Staff support was perceived as more important than physical resources, particularly access to a named mentor, a clinical educator, and weekly formative feedback. Both students and educators highlighted the impact of time pressures. Conclusions: The support offered to radiation therapy students by clinical staff is more highly valued than physical resources or models of placement support. Protected time and acknowledgement of the importance of clinical education roles are both invaluable. Joint investment in mentor support by both universities and clinical departments is crucial for facilitation of effective clinical learning.

  6. Factors affecting coking pressures in tall coke ovens

    SciTech Connect (OSTI)

    Grimley, J.J.; Radley, C.E.

    1995-12-01

    The detrimental effects of excessive coking pressures, resulting in the permanent deformation of coke oven walls, have been recognized for many years. Considerable research has been undertaken worldwide in attempts to define the limits within which a plant may safely operate and to quantify the factors which influence these pressures. Few full-scale techniques are available for assessing the potential of a coal blend for causing wall damage. Inference of dangerous swelling pressures may be made, however, by the measurement of the peak gas pressure which is generated as the plastic layers meet and coalesce at the center of the oven. This pressure is referred to in this report as the carbonizing pressure. At the Dawes Lane cokemaking plant of British Steel's Scunthorpe Works, a large database has been compiled over several years from the regular measurement of this pressure. This data has been statistically analyzed to provide a mathematical model for predicting the carbonizing pressure from the properties of the component coals; the results of this analysis are presented in this report.

  7. Factors governing sustainable groundwater pumping near a river

    SciTech Connect (OSTI)

    Zhang, Y.; Hubbard, S.S.; Finsterle, S.

    2011-01-15

    The objective of this paper is to provide new insights into processes affecting riverbank filtration (RBF). We consider a system with an inflatable dam installed for enhancing water production from downstream collector wells. Using a numerical model, we investigate the impact of groundwater pumping and dam operation on the hydrodynamics in the aquifer and water production. We focus our study on two processes that potentially limit water production of an RBF system: the development of an unsaturated zone and riverbed clogging. We quantify river clogging by calibrating a time-dependent riverbed permeability function based on knowledge of pumping rate, river stage, and temperature. The dynamics of the estimated riverbed permeability reflects clogging and scouring mechanisms. Our results indicate that (1) riverbed permeability is the dominant factor affecting infiltration needed for sustainable RBF production; (2) dam operation can influence pumping efficiency and prevent the development of an unsaturated zone beneath the riverbed only under conditions of sufficient riverbed permeability; (3) slow river velocity, caused by dam raising during summer months, may lead to sedimentation and deposition of fine-grained material within the riverbed, which may clog the riverbed, limiting recharge to the collector wells and contributing to the development of an unsaturated zone beneath the riverbed; and (4) higher river flow velocities, caused by dam lowering during winter storms, scour the riverbed and thus increase its permeability. These insights can be used as the basis for developing sustainable water management of a RBF system.
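
    The dominant role of riverbed permeability noted in point (1) can be seen from a one-line Darcy sketch (illustrative only, not taken from the paper): the infiltration flux q through a clogging layer of thickness b under a head difference Δh between river stage and aquifer is

        \[
        q \;=\; K\,\frac{\Delta h}{b},
        \]

    so an order-of-magnitude drop in the riverbed hydraulic conductivity K cuts recharge to the collector wells by roughly the same factor at a given head difference, regardless of how the dam is operated.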

  8. Safety culture management: The importance of organizational factors

    SciTech Connect (OSTI)

    Haber, S.B.; Shurberg, D.A.; Jacobs, R.; Hofmann, D.

    1995-05-01

    The concept of safety culture has been used extensively to explain the underlying causes of performance based events, both positive and negative, across the nuclear industry. The work described in this paper represents several years of effort to identify, define and assess the organizational factors important to safe performance in nuclear power plants (NPPs). The research discussed in this paper is primarily conducted in support of the US Nuclear Regulatory Commission's (NRC) efforts in understanding the impact of organizational performance on safety. As a result of a series of research activities undertaken by numerous NRC contractors, a collection of organizational dimensions has been identified and defined. These dimensions represent what is believed to be a comprehensive taxonomy of organizational elements that relate to the safe operation of nuclear power plants. Techniques were also developed by which to measure these organizational dimensions, and include structured interview protocols, behavioral checklists, and behavioral anchored rating scales (BARS). Recent efforts have focused on devising a methodology for the extraction of information related to the identified organizational dimensions from existing NRC documentation. This type of effort would assess the applicability of the organizational dimensions to existing NRC inspection and evaluation reports, refine the organizational dimensions previously developed so they are more relevant to the task of retrospective analysis, and attempt to rate plants based on the review of existing NRC documentation using the techniques previously developed for the assessment of organizational dimensions.

  9. Relating transverse-momentum-dependent and collinear factorization theorems in a generalized formalism

    DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

    Collins, J.; Gamberg, L.; Prokudin, A.; Rogers, T. C.; Sato, N.; Wang, B.

    2016-08-08

    We construct an improved implementation for combining transverse-momentum-dependent (TMD) factorization and collinear factorization. TMD factorization is suitable for low transverse momentum physics, while collinear factorization is suitable for high transverse momenta and for a cross section integrated over transverse momentum. The result is a modified version of the standard W + Y prescription traditionally used in the Collins-Soper-Sterman (CSS) formalism and related approaches. We further argue that questions regarding the shape and Q-dependence of the cross sections at lower Q are largely governed by the matching to the Y-term.
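
    For orientation, the W + Y prescription referred to above decomposes the transverse-momentum (q_T) distribution into a TMD-factorized term plus a fixed-order correction, schematically (standard CSS-style notation, not a formula quoted from the paper):

        \[
        \frac{d\sigma}{dq_T^2\,dQ^2} \;\simeq\; W(q_T, Q) \;+\; Y(q_T, Q),
        \]

    where W resums the low-q_T (TMD) region and Y is the fixed-order result minus its low-q_T asymptote, restoring accuracy as q_T approaches Q.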

  10. Impact of HFIR LEU Conversion on Beryllium Reflector Degradation Factors

    SciTech Connect (OSTI)

    Ilas, Dan

    2013-10-01

    An assessment of the impact of low enriched uranium (LEU) conversion on the factors that may cause the degradation of the beryllium reflector is performed for the High Flux Isotope Reactor (HFIR). The computational methods, models, and tools, comparisons with previous work, along with the results obtained are documented and discussed in this report. The report documents the results for the gas and neutronic poison production, and the heating in the beryllium reflector for both the highly enriched uranium (HEU) and LEU HFIR configurations, and discusses the impact that the conversion to LEU may have on these quantities. A time-averaging procedure was developed to calculate the isotopic (gas and poisons) production in the reflector. The sensitivity of this approach to different approximations is gauged and documented. The results show that the gas is produced in the beryllium reflector at a total rate of 0.304 g/cycle for the HEU configuration; this rate increases by ~12% for the LEU case. The total tritium production rate in the reflector is 0.098 g/cycle for the HEU core and approximately 11% higher for the LEU core. A significant increase (up to ~25%) in the neutronic poisons production in the reflector during the operation cycles is observed for the LEU core, compared to the HEU case, for regions close to the core's horizontal midplane. The poisoning level of the reflector may increase by more than two orders of magnitude during long periods of downtime. The heating rate in the reflector is estimated to be approximately 20% lower for the LEU core than for the HEU core. The decrease is due to a significantly lower contribution of the heating produced by the gamma radiation for the LEU core. Both the isotopic (gas and neutronic poisons) production and the heating rates are spatially non-uniform throughout the beryllium reflector volume. The maximum values typically occur in the removable reflector and close to the midplane.

  11. Multifocal Glioblastoma Multiforme: Prognostic Factors and Patterns of Progression

    SciTech Connect (OSTI)

    Showalter, Timothy N.; Andrel, Jocelyn; Andrews, David W.; Curran, Walter J.; Daskalakis, Constantine; Werner-Wasik, Maria

    2007-11-01

    Purpose: To assess the progression patterns in patients with multifocal glioblastoma multiforme who had undergone whole brain radiotherapy (WBRT), the historical standard, versus three-dimensional conformal radiotherapy, and to identify predictive treatment and pretreatment factors. Methods and Materials: The records of 50 patients with multifocal glioblastoma multiforme treated with RT were reviewed. Univariate analyses were performed using survival methods and the Cox proportional hazards regression method. Multivariate analyses were performed using the Cox proportional hazards regression method. Results: The mean age was 61 years, and 71% had a Karnofsky performance status (KPS) score of ≥70. Of the 50 patients, 32% underwent WBRT and 68%, three-dimensional conformal RT. Progression was local in all evaluable patients, as determined by imaging in 38 patients and early neurologic progression in 12. The median time to progression (TTP) was 3.1 months, and the median survival time (MST) was 8.1 months. The significant independent predictors of TTP on multivariate analysis were a KPS score <70 (p = 0.001), the extent of surgery (p = 0.040), a radiation dose <60 Gy (p = 0.027), and the lack of chemotherapy (p = 0.001). The significant independent predictors of a reduced MST were a KPS score <70 (p = 0.022) and the absence of salvage surgery (p = 0.011) and salvage chemotherapy (p = 0.003). Conclusion: Local progression was observed in all patients. On multivariate analysis, no significant difference was found in the TTP or MST between three-dimensional conformal radiotherapy and WBRT. The KPS was a consistent independent predictor of both TTP and MST. On the basis of the progression pattern, we do not recommend WBRT as a mandatory component of the treatment of multifocal glioblastoma multiforme.
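
    For readers less familiar with the statistical model cited, the Cox proportional hazards regression used for both the univariate and multivariate analyses has the general form (a textbook definition, not study-specific detail):

        \[
        h(t \mid \mathbf{x}) \;=\; h_0(t)\,\exp\!\big(\beta_1 x_1 + \cdots + \beta_p x_p\big),
        \]

    where h_0(t) is the baseline hazard and the covariates x here include KPS, extent of surgery, radiation dose, and receipt of chemotherapy; a covariate is an independent predictor when its coefficient differs significantly from zero with the other covariates held in the model.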

  12. Experimental analysis of elemental factors controlling the life of PAFCs

    SciTech Connect (OSTI)

    Watanabe, Masahiro; Miyoshi, Hideaki; Uchida, Hiroyuki

    1996-12-31

    Since 1991, 5MW-class and 1MW-class PAFC power plants have been demonstrated with the objective of accelerating development and commercialization by the Phosphoric Acid Fuel Cell Technology Research Association (PAFC-TRA) jointly with NEDO as one of MITI's fuel cell programs. As a complementary research project to the demonstration project, the mechanism and rate of deterioration of the cells and stacks have been studied from 1995 FY, with the objective of establishing an estimation method for the service life-time of the cell stacks. Our work has been performed in the Basic Research Project, as part of that project on PAFCs, with the cooperation of Yamanashi University supported by the Ministry of Education, Science and Culture, PAFC-TRA supported by NEDO and three PAFC makers. We have selected the following four subjects as the essential factors relating to the life-time, after a year-long study of the literature and the accumulation of a large number of data as to the practical operations of the cells, cell stacks and plants of PAFCs; i.e., (1) Mechanism of the degradation of electrocatalysts and the effect of the degradation on the electrode performances. (2) Effect of the electrolyte fill-level on the electrode performances. (3) Corrosion of cell constructing materials and the effect of the corrosion on the electrode performances. (4) The rate and mechanism of electrolyte loss under various operating conditions of a model cell. The paper briefly introduces the interim results which have been found on the above subjects at this time.

  13. Human Factors Considerations in New Nuclear Power Plants: Detailed Analysis.

    SciTech Connect (OSTI)

    O'Hara, J.; Higgins, J.; Brown, W.; Fink, R.

    2008-02-14

    This Nuclear Regulatory Commission (NRC) sponsored study has identified human-performance issues in new and advanced nuclear power plants. To identify the issues, current industry developments and trends were evaluated in the areas of reactor technology, instrumentation and control technology, human-system integration technology, and human factors engineering (HFE) methods and tools. The issues were organized into seven high-level HFE topic areas: Role of Personnel and Automation, Staffing and Training, Normal Operations Management, Disturbance and Emergency Management, Maintenance and Change Management, Plant Design and Construction, and HFE Methods and Tools. The issues where then prioritized into four categories using a 'Phenomena Identification and Ranking Table' methodology based on evaluations provided by 14 independent subject matter experts. The subject matter experts were knowledgeable in a variety of disciplines. Vendors, utilities, research organizations and regulators all participated. Twenty issues were categorized into the top priority category. This Brookhaven National Laboratory (BNL) technical report provides the detailed methodology, issue analysis, and results. A summary of the results of this study can be found in NUREG/CR-6947. The research performed for this project has identified a large number of human-performance issues for new control stations and new nuclear power plant designs. The information gathered in this project can serve as input to the development of a long-term strategy and plan for addressing human performance in these areas through regulatory research. Addressing human-performance issues will provide the technical basis from which regulatory review guidance can be developed to meet these challenges. The availability of this review guidance will help set clear expectations for how the NRC staff will evaluate new designs, reduce regulatory uncertainty, and provide a well-defined path to new nuclear power plant licensing.

  14. Applying Human Factors during the SIS Life Cycle

    SciTech Connect (OSTI)

    Avery, K.

    2010-05-05

    Safety Instrumented Systems (SIS) are widely used in U.S. Department of Energy's (DOE) nonreactor nuclear facilities for safety-critical applications. Although use of the SIS technology and computer-based digital controls, can improve performance and safety, it potentially introduces additional complexities, such as failure modes that are not readily detectable. Either automated actions or manual (operator) actions may be required to complete the safety instrumented function to place the process in a safe state or mitigate a hazard in response to an alarm or indication. DOE will issue a new standard, Application of Safety Instrumented Systems Used at DOE Nonreactor Nuclear Facilities, to provide guidance for the design, procurement, installation, testing, maintenance, operation, and quality assurance of SIS used in safety significant functions at DOE nonreactor nuclear facilities. The DOE standard focuses on utilizing the process industry consensus standard, American National Standards Institute/ International Society of Automation (ANSI/ISA) 84.00.01, Functional Safety: Safety Instrumented Systems for the Process Industry Sector, to support reliable SIS design throughout the DOE complex. SIS design must take into account human-machine interfaces and their limitations and follow good human factors engineering (HFE) practices. HFE encompasses many diverse areas (e.g., information display, user-system interaction, alarm management, operator response, control room design, and system maintainability), which affect all aspects of system development and modification. This paper presents how the HFE processes and principles apply throughout the SIS life cycle to support the design and use of SIS at DOE nonreactor nuclear facilities.

  15. Factors Affecting the Disposal Capacity of a Repository at Yucca Mountain

    SciTech Connect (OSTI)

    Nutt, W.M.; Peters, M.T.; Wigeland, R.A.; Kouts, C.; Kim, D.; Gomberg, S.

    2007-07-01

    The development of a repository at Yucca Mountain is proceeding in accordance with the Nuclear Waste Policy Act (NWPA). The current design of the proposed repository emplaces 63,000 metric tons of heavy metal (MTHM) of commercial spent nuclear fuel and 7,000 MTHM-equivalent of Department of Energy-owned spent nuclear fuel and high level nuclear waste. Efforts are underway to complete the pre-closure and postclosure safety analyses in accordance with 10 CFR 63. This will be included in a license application for construction of the repository that is currently planned to be submitted to the U.S. Nuclear Regulatory Commission (NRC) no later than June of 2008. The Global Nuclear Energy Partnership (GNEP) aims to 'recycle nuclear fuel using new proliferation-resistant technologies to recover more energy and reduce waste'. The Nation's decision to choose to recycle spent nuclear fuel in an advanced nuclear fuel cycle, such as that being considered under the GNEP, would present the opportunity to change the current approach for managing and disposing nuclear waste. The total amount of waste that could be disposed in a repository at Yucca Mountain would be a key component of a new waste management strategy should a decision be made in the future to utilize the proposed Yucca Mountain repository to dispose of wastes generated under the GNEP. (authors)

  16. Economic Conditions and Factors Affecting New Nuclear Power Deployment

    SciTech Connect (OSTI)

    Harrison, Thomas J.

    2014-10-01

    This report documents work performed in support of the US Department of Energy Office of Nuclear Energy’s Advanced Small Modular Reactor (AdvSMR) program. The report presents information and results from economic analyses to describe current electricity market conditions and those key factors that may impact the deployment of AdvSMRs or any other new nuclear power plants. Thus, this report serves as a reference document for DOE as it moves forward with its plans to develop advanced reactors, including AdvSMRs. For the purpose of this analysis, information on electricity markets and nuclear power plant operating costs will be combined to examine the current state of the nuclear industry and the process required to successfully move forward with new nuclear power in general and AdvSMRs in particular. The current electricity market is generally unfavorable to new nuclear construction, especially in deregulated markets with heavy competition from natural gas and subsidized renewables. The successful and profitable operation of a nuclear power plant (or any power plant) requires the rate at which the electricity is sold to be sufficiently greater than the cost to operate. The wholesale rates in most US markets have settled into values that provide profits for most operating nuclear power plants but are too low to support the added cost of capital recovery for new nuclear construction. There is a strong geographic dependence on the wholesale rate, with some markets currently able to support new nuclear construction. However, there is also a strong geographic dependence on pronuclear public opinion; the areas where power prices are high tend to have unfavorable views on the construction of new nuclear power plants. The use of government-backed incentives, such as subsidies, can help provide a margin to help justify construction projects that otherwise may not seem viable. Similarly, low interest rates for the project will also add a positive margin to the economic

  17. Key Response Planning Factors for the Aftermath of Nuclear Terrorism

    SciTech Connect (OSTI)

    Buddemeier, B R; Dillon, M B

    2009-01-21

    Despite hundreds of above-ground nuclear tests and data gathered from Hiroshima and Nagasaki, the effects of a ground-level, low-yield nuclear detonation in a modern urban environment are still the subject of considerable scientific debate. Extensive review of nuclear weapon effects studies and discussions with nuclear weapon effects experts from various federal agencies, national laboratories, and technical organizations have identified key issues and bounded some of the unknowns required to support response planning for a low-yield, ground-level nuclear detonation in a modern U.S. city. This study, which is focused primarily upon the hazards posed by radioactive fallout, used detailed fallout predictions from the advanced suite of three-dimensional (3-D) meteorology and plume/fallout models developed at Lawrence Livermore National Laboratory (LLNL), including extensive global geographical and real-time meteorological databases to support model calculations. This 3-D modeling system provides detailed simulations that account for complex meteorology and terrain effects. The results of initial modeling and analysis were presented to federal, state, and local working groups to obtain critical, broad-based review and feedback on strategy and messaging. This effort involved a diverse set of communities, including New York City, National Capitol Regions, Charlotte, Houston, Portland, and Los Angeles. The largest potential for reducing casualties during the post-detonation response phase comes from reducing exposure to fallout radiation. This can be accomplished through early, adequate sheltering followed by informed, delayed evacuation. The response challenges to a nuclear detonation must be solved through multiple approaches of public education, planning, and rapid response actions. Because the successful response will require extensive coordination of a large number of organizations, supplemented by

  18. Energy Intensity Baselining and Tracking Guidance

    Broader source: Energy.gov (indexed) [DOE]

    ... total energy required to generate, transmit, and distribute electricity from the power generation source to the end user is factored into a company's energy consumption metrics. ...

  19. Monthly Energy Review The Monthly Energy Review

    Gasoline and Diesel Fuel Update (EIA)

    natural gas, coal, electricity, and nuclear energy. Also included are international energy and thermal and metric conversion factors. Publication of this report is in keeping with...

  20. DOE Publishes GATEWAY Report on Pedestrian Friendly Outdoor Lighting...

    Energy Savers [EERE]

    lighting project is different, and tradeoffs between such factors as visual comfort, color, visibility, and efficacy are inevitable. There is no glare metric that works reliably...