National Library of Energy BETA

Sample records for factors metric prefixes

  1. Draft Supplemental Environmental Impact Statement for the Production...

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

METRIC PREFIXES Prefix Symbol Multiplication factor Scientific notation tera- T 1,000,000,... grams/cubic centimeter pounds/cubic feet 16,025.6 grams/cubic meter inches 2.54 ...
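The truncated prefix table in this snippet follows the standard SI scheme. A minimal Python sketch of the same idea (the prefix factors and the inch and lb/ft³ conversion constants below are standard reference values, not figures taken from the document; the snippet's truncated 16,025.6 differs slightly from the usual 16,018.46):

```python
# Illustrative SI prefix table and unit conversions like those in the
# snippet above. Constants are standard reference values, not quoted
# from the source document.
SI_PREFIXES = {
    "tera":  ("T", 1e12),
    "giga":  ("G", 1e9),
    "mega":  ("M", 1e6),
    "kilo":  ("k", 1e3),
    "milli": ("m", 1e-3),
    "micro": ("u", 1e-6),
    "nano":  ("n", 1e-9),
}

def apply_prefix(value: float, prefix: str) -> float:
    """Scale a prefixed value down to base units (e.g. 3.2 kilo -> 3200)."""
    _symbol, factor = SI_PREFIXES[prefix]
    return value * factor

# Conversion factors echoed in the snippet:
CM_PER_INCH = 2.54                      # exact by definition
G_PER_M3_PER_LB_PER_FT3 = 16_018.46     # standard value for lb/ft^3 -> g/m^3

print(apply_prefix(3.2, "kilo"))   # 3200.0
print(10 * CM_PER_INCH)            # 25.4
```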

  2. performance metrics

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

    performance metrics - Sandia Energy Energy Search Icon Sandia Home Locations Contact Us Employee Locator Energy & Climate Secure & Sustainable Energy Future Stationary Power Energy ...

  3. Resilience Metrics

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    for Quadrennial Energy Review Technical Workshop on Resilience Metrics for Energy Transmission and Distribution Infrastructure April 28, 2014 Infrastructure Assurance Center ...

  4. Metric Presentation

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

    ... MODERN GRID S T R A T E G Y Value Metrics - Work to date Reliability Outage duration and frequency Momentary outages Power Quality measures Security Ratio of distributed ...

  5. Metric Presentation

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

    MODERN GRID S T R A T E G Y Smart Grid Metrics Monitoring our Progress Smart Grid Implementation Workshop Joe Miller - Modern Grid Team June 19, 2008 Conducted by the National Energy Technology Laboratory Funded by the U.S. Department of Energy, Office of Electricity Delivery and Energy Reliability MODERN GRID S T R A T E G Y Many are working on the Smart Grid FERC DOE-OE Grid 2030 GridWise Alliance EEI NERC (FM) DOE/NETL Modern Grid

  6. ARM - 2006 Performance Metrics

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

    Atmospheric System Research (ASR) Earth System Modeling Regional & Global Climate Modeling Terrestrial Ecosystem Science Performance Metrics User Meetings Past ARM Science Team ...

  7. ARM - 2007 Performance Metrics

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

    Atmospheric System Research (ASR) Earth System Modeling Regional & Global Climate Modeling Terrestrial Ecosystem Science Performance Metrics User Meetings Past ARM Science Team ...

  8. NIF Target Shot Metrics

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

    target shot metrics NIF Target Shot Metrics Exp Cap - Experimental Capability Natl Sec Appl - National Security Applications DS - Discovery Science ICF - Inertial Confinement Fusion HED - High Energy Density For internal LLNL firewall viewing - if the page is blank, please open www.google.com to flush out BCB

  9. Surveillance metrics sensitivity study.

    SciTech Connect

    Hamada, Michael S.; Bierbaum, Rene Lynn; Robertson, Alix A.

    2011-09-01

    In September of 2009, a Tri-Lab team was formed to develop a set of metrics relating to the NNSA nuclear weapon surveillance program. The purpose was to develop quantitative and/or qualitative metrics describing the effects of realized or non-realized surveillance activities on our confidence in reporting reliability and assessing the stockpile. As a part of this effort, a statistical sub-team investigated various techniques and developed a complementary set of statistical metrics that could serve as a foundation for characterizing aspects of meeting the surveillance program objectives. The metrics are a combination of tolerance limit calculations and power calculations, intended to answer level-of-confidence type questions with respect to the ability to detect certain undesirable behaviors (catastrophic defects, margin insufficiency defects, and deviations from a model). Note that the metrics are not intended to gauge product performance but instead the adequacy of surveillance. This report gives a short description of four metrics types that were explored and the results of a sensitivity study conducted to investigate their behavior for various inputs. The results of the sensitivity study can be used to set the risk parameters that specify the level of stockpile problem that the surveillance program should be addressing.
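One common "power calculation" of the kind this abstract describes is the probability that a sample of size n catches at least one defective unit at a given defect fraction. The sketch below is illustrative only (function names and numbers are ours, not from the report):

```python
import math

def detection_power(n: int, p: float) -> float:
    """P(at least one defect observed) in n independent draws
    when the true defect fraction is p: 1 - (1 - p)^n."""
    return 1.0 - (1.0 - p) ** n

def sample_size_for_power(p: float, target: float) -> int:
    """Smallest sample size whose detection power reaches the target."""
    return math.ceil(math.log(1.0 - target) / math.log(1.0 - p))

# Example: a 10% defect rate needs 22 samples for 90% detection power.
n = sample_size_for_power(0.10, 0.90)
print(n, round(detection_power(n, 0.10), 3))  # 22 0.902
```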

  10. Multi-Metric Sustainability Analysis

    SciTech Connect

    Cowlin, S.; Heimiller, D.; Macknick, J.; Mann, M.; Pless, J.; Munoz, D.

    2014-12-01

    A readily accessible framework that allows for evaluating impacts and comparing tradeoffs among factors in energy policy, expansion planning, and investment decision making is lacking. Recognizing this, the Joint Institute for Strategic Energy Analysis (JISEA) funded an exploration of multi-metric sustainability analysis (MMSA) to provide energy decision makers with a means to make more comprehensive comparisons of energy technologies. The resulting MMSA tool lets decision makers simultaneously compare technologies and potential deployment locations.

  11. Metric Construction | Open Energy Information

    OpenEI (Open Energy Information) [EERE & EIA]

    Metric Construction Jump to: navigation, search Name: Metric Construction Place: Boston, MA Information About Partnership with NREL Partnership with NREL Yes Partnership Type Test...

  12. Cyber threat metrics.

    SciTech Connect

    Frye, Jason Neal; Veitch, Cynthia K.; Mateski, Mark Elliot; Michalski, John T.; Harris, James Mark; Trevino, Cassandra M.; Maruoka, Scott

    2012-03-01

    Threats are generally much easier to list than to describe, and much easier to describe than to measure. As a result, many organizations list threats. Fewer describe them in useful terms, and still fewer measure them in meaningful ways. This is particularly true in the dynamic and nebulous domain of cyber threats - a domain that tends to resist easy measurement and, in some cases, appears to defy any measurement. We believe the problem is tractable. In this report we describe threat metrics and models for characterizing threats consistently and unambiguously. The purpose of this report is to support the Operational Threat Assessment (OTA) phase of risk and vulnerability assessment. To this end, we focus on the task of characterizing cyber threats using consistent threat metrics and models. In particular, we address threat metrics and models for describing malicious cyber threats to US FCEB agencies and systems.

  13. STAR METRICS | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    STAR METRICS May 4, 2011 - 4:47pm Energy continues to define Phase II of the STAR METRICS program, a collaborative initiative to track Research and Development expenditures and their outcomes. Visit the STAR METRICS website for more information about the program. Related Articles A New Effort to Save the Ozone Layer and Protect the Climate STAR METRICS DOE Office of Environmental Management 2015 Year in Review Civil War Icon Becomes National Clean Ener

  14. Aquatic Acoustic Metrics Interface

    Energy Science and Technology Software Center

    2012-12-18

    Fishes and marine mammals may suffer a range of potential effects from exposure to intense underwater sound generated by anthropogenic activities such as pile driving, shipping, sonars, and underwater blasting. Several underwater sound recording (USR) devices have been built to acquire samples of the underwater sound generated by anthropogenic activities. Software becomes indispensable for processing and analyzing the audio files recorded by these USRs. The new Aquatic Acoustic Metrics Interface Utility Software (AAMI) is specifically designed for analysis of underwater sound recordings to provide data in metrics that facilitate evaluation of the potential impacts of the sound on aquatic animals. In addition to the basic functions, such as loading and editing audio files recorded by USRs and batch processing of sound files, the software utilizes recording system calibration data to compute important parameters in physical units. The software also facilitates comparison of the noise sound sample metrics with biological measures such as audiograms of the sensitivity of aquatic animals to the sound, integrating various components into a single analytical frame.

  15. Metrics for Energy Resilience

    SciTech Connect

    Paul E. Roege; Zachary A. Collier; James Mancillas; John A. McDonagh; Igor Linkov

    2014-09-01

    Energy lies at the backbone of any advanced society and constitutes an essential prerequisite for economic growth, social order and national defense. However, there is an Achilles heel to today's energy and technology relationship; namely a precarious intimacy between energy and the fiscal, social, and technical systems it supports. Recently, widespread and persistent disruptions in energy systems have highlighted the extent of this dependence and the vulnerability of increasingly optimized systems to changing conditions. Resilience is an emerging concept that offers to reconcile considerations of performance under dynamic environments and across multiple time frames by supplementing traditionally static system performance measures to consider behaviors under changing conditions and complex interactions among physical, information and human domains. This paper identifies metrics useful to implement guidance for energy-related planning, design, investment, and operation. Recommendations are presented using a matrix format to provide a structured and comprehensive framework of metrics relevant to a system's energy resilience. The study synthesizes previously proposed metrics and emergent resilience literature to provide a multi-dimensional model intended for use by leaders and practitioners as they transform our energy posture from one of stasis and reaction to one that is proactive and which fosters sustainable growth.

  16. Exploring Metric Symmetry

    SciTech Connect

    Zwart, P.H.; Grosse-Kunstleve, R.W.; Adams, P.D.

    2006-07-31

    Relatively minor perturbations to a crystal structure can in some cases result in apparently large changes in symmetry. Changes in space group or even lattice can be induced by heavy metal or halide soaking (Dauter et al, 2001), flash freezing (Skrzypczak-Jankun et al, 1996), and Se-Met substitution (Poulsen et al, 2001). Relations between various space groups and lattices can provide insight in the underlying structural causes for the symmetry or lattice transformations. Furthermore, these relations can be useful in understanding twinning and how to efficiently solve two different but related crystal structures. Although (pseudo) symmetric properties of a certain combination of unit cell parameters and a space group are immediately obvious (such as a pseudo four-fold axis if a is approximately equal to b in an orthorhombic space group), other relations (e.g. Lehtio, et al, 2005) that are less obvious might be crucial to the understanding and detection of certain idiosyncrasies of experimental data. We have developed a set of tools that allows straightforward exploration of possible metric symmetry relations given unit cell parameters and a space group. The new iotbx.explore{_}metric{_}symmetry command produces an overview of the various relations between several possible point groups for a given lattice. Methods for finding relations between a pair of unit cells are also available. The tools described in this newsletter are part of the CCTBX libraries, which are included in the latest (versions July 2006 and up) PHENIX and CCI Apps distributions.

  17. Ames Laboratory Metrics | The Ames Laboratory

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

    Metrics Document Number: NA Effective Date: 01/2016 File (public): ameslab_metrics_01-14-16

  18. Variable metric conjugate gradient methods

    SciTech Connect

    Barth, T.; Manteuffel, T.

    1994-07-01

    1.1 Motivation. In this paper we present a framework that includes many well known iterative methods for the solution of nonsymmetric linear systems of equations, Ax = b. Section 2 begins with a brief review of the conjugate gradient method. Next, we describe a broader class of methods, known as projection methods, to which the conjugate gradient (CG) method and most conjugate gradient-like methods belong. The concept of a method having either a fixed or a variable metric is introduced. Methods that have a metric are referred to as either fixed or variable metric methods. Some relationships between projection methods and fixed (variable) metric methods are discussed. The main emphasis of the remainder of this paper is on variable metric methods. In Section 3 we show how the biconjugate gradient (BCG), and the quasi-minimal residual (QMR) methods fit into this framework as variable metric methods. By modifying the underlying Lanczos biorthogonalization process used in the implementation of BCG and QMR, we obtain other variable metric methods. These, we refer to as generalizations of BCG and QMR.
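The conjugate gradient method that this framework builds on can be stated in a few lines. The following is a standard textbook sketch for symmetric positive-definite systems, not the authors' implementation:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Solve Ax = b for symmetric positive-definite A by the
    classical conjugate gradient iteration."""
    x = np.zeros_like(b, dtype=float)
    r = b - A @ x            # residual
    p = r.copy()             # search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)   # step length minimizing along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p   # A-conjugate direction update
        rs_old = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
print(x)  # approximately [1/11, 7/11]
```

For nonsymmetric systems such as those the paper targets, BCG and QMR replace this symmetric recurrence with a Lanczos biorthogonalization, which is where the fixed/variable metric viewpoint enters.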

  19. Daylight metrics and energy savings

    SciTech Connect

    Mardaljevic, John; Heschong, Lisa; Lee, Eleanor

    2009-12-31

    The drive towards sustainable, low-energy buildings has increased the need for simple, yet accurate methods to evaluate whether a daylit building meets minimum standards for energy and human comfort performance. Current metrics do not account for the temporal and spatial aspects of daylight, nor for occupants' comfort or interventions. This paper reviews the historical basis of current compliance methods for achieving daylit buildings, proposes a technical basis for development of better metrics, and provides two case study examples to stimulate dialogue on how metrics can be applied in a practical, real-world context.

  20. EECBG SEP Attachment 1 - Process metric list

    Energy Saver

    EECBG 10-07B/SEP 10-006A Attachment 1: Process Metrics List Metric Area Metric Primary or ... Solar energy systems installed Number of solar energy systems installed Total capacity of ...

  1. List of SEP Reporting Metrics

    Energy.gov [DOE]

    DOE State Energy Program List of Reporting Metrics, which was produced by the Office of Energy Efficiency and Renewable Energy Weatherization and Intergovernmental Program for SEP and the Energy Efficiency and Conservation Block Grants (EECBG) programs.

  2. Common Carbon Metric | Open Energy Information

    OpenEI (Open Energy Information) [EERE & EIA]

    Common Carbon Metric Jump to: navigation, search Tool Summary LAUNCH TOOL Name: Common Carbon Metric Agency/Company/Organization: United Nations Environment Programme, World...

  3. Technical Workshop: Resilience Metrics for Energy Transmission...

    Energy Saver

    Framework for Developing Resilience Metrics for the Electricity, Oil, and Gas Sectors in the United States (14.49 MB) Sandia Presentation: Resilience Metrics for Energy ...

  4. Performance Metrics Tiers | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    Performance Metrics Tiers Performance Metrics Tiers The performance metrics defined by the Commercial Buildings Integration Program offer different tiers of information to address the needs of various users. On this page you will find information about the various goals users are trying to achieve by using performance metrics and the tiers of metrics. Goals in Measuring Performance Many individuals and groups are involved with a building over its lifetime, and all have different interests in and

  5. Thermodynamic Metrics and Optimal Paths

    SciTech Connect

    Sivak, David; Crooks, Gavin

    2012-05-08

    A fundamental problem in modern thermodynamics is how a molecular-scale machine performs useful work, while operating away from thermal equilibrium without excessive dissipation. To this end, we derive a friction tensor that induces a Riemannian manifold on the space of thermodynamic states. Within the linear-response regime, this metric structure controls the dissipation of finite-time transformations, and bestows optimal protocols with many useful properties. We discuss the connection to the existing thermodynamic length formalism, and demonstrate the utility of this metric by solving for optimal control parameter protocols in a simple nonequilibrium model.
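The geometric construction sketched in this abstract can be written out as follows (our notation, intended only as a hedged paraphrase of the linear-response result, with λ the control parameters and X their conjugate forces):

```latex
% Friction tensor from equilibrium force autocorrelations at fixed
% control parameters, and the excess (dissipated) power it induces:
\zeta_{ij}(\boldsymbol{\lambda})
  = \beta \int_{0}^{\infty} \mathrm{d}t'\,
    \bigl\langle \delta X_i(t')\,\delta X_j(0) \bigr\rangle_{\boldsymbol{\lambda}},
\qquad
\mathcal{P}_{\mathrm{ex}}
  = \dot{\boldsymbol{\lambda}}^{\top}\, \zeta(\boldsymbol{\lambda})\,
    \dot{\boldsymbol{\lambda}} .
```

Treating ζ as a Riemannian metric, paths of minimal dissipation at fixed duration are geodesics of the associated thermodynamic length, which is the connection to the "thermodynamic length formalism" the abstract mentions.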

  6. Metrics

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

    Department of Energy (DOE) national laboratories in patenting and is in the Top 5 for new copyright assertions. Fiscal Year 2015 Inventions & Patents Fiscal Year 2015 ...

  7. Comparing Resource Adequacy Metrics: Preprint

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

    Comparing Resource Adequacy Metrics Preprint E. Ibanez and M. Milligan National Renewable Energy Laboratory To be presented at the 13th International Workshop on Large-Scale Integration of Wind Power into Power Systems as Well as on Transmission Networks for Offshore Wind Power Plants Berlin, Germany November 11-13, 2014 Conference Paper NREL/CP-5D00-62847 September 2014 NOTICE The submitted manuscript has been offered by an employee of the Alliance for Sustainable Energy, LLC (Alliance), a

  8. Buildings Performance Metrics Terminology | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    Performance Metrics Terminology Buildings Performance Metrics Terminology This document provides the terms and definitions used in the Department of Energys Performance Metrics Research Project. metrics_terminology_20090203.pdf (152.35 KB) More Documents & Publications Procuring Architectural and Engineering Services for Energy Efficiency and Sustainability Transmittal Letter for the Statewide Benchmarking Process Evaluation Guide for Benchmarking Residential Energy Efficiency Program

  9. EECBG SEP Attachment 1 - Process metric list

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    10-07B/SEP 10-006A Attachment 1: Process Metrics List Metric Area Metric Primary or Optional Metric Item(s) to Report On 1. Building Retrofits 1a. Buildings retrofitted, by sector Number of buildings retrofitted Square footage of buildings retrofitted 1b. Energy management systems installed, by sector Number of energy management systems installed Square footage of buildings under management 1c. Building roofs retrofitted, by sector Number of building roofs retrofitted Square footage of building

  10. Definition of GPRA08 benefits metrics

    SciTech Connect

    None, None

    2009-01-18

    Background information for the FY 2007 GPRA methodology review on the definitions of GPRA08 benefits metrics.

  11. Western Resource Adequacy: Challenges - Approaches - Metrics | Department

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    of Energy Resource Adequacy: Challenges - Approaches - Metrics Western Resource Adequacy: Challenges - Approaches - Metrics West-Wide Resource Assessment Team. Committee on Regional Electric Power Cooperation. March 25, 2004 San Francisco, California Western Resource Adequacy: Challenges - Approaches - Metrics (368.96 KB) More Documents & Publications Eastern Wind Integration and Transmission Study (EWITS) (Revised) Estimating the Benefits and Costs of Distributed Energy Technologies

  12. Defining a Standard Metric for Electricity Savings

    SciTech Connect

    Brown, Marilyn; Akbari, Hashem; Blumstein, Carl; Koomey, Jonathan; Brown, Richard; Calwell, Chris; Carter, Sheryl; Cavanagh, Ralph; Chang, Audrey; Claridge, David; Craig, Paul; Diamond, Rick; Eto, Joseph H.; Fulkerson, William; Gadgil, Ashok; Geller, Howard; Goldemberg, Jose; Goldman, Chuck; Goldstein, David B.; Greenberg, Steve; Hafemeister, David; Harris, Jeff; Harvey, Hal; Heitz, Eric; Hirst, Eric; Hummel, Holmes; Kammen, Dan; Kelly, Henry; Laitner, Skip; Levine, Mark; Lovins, Amory; Masters, Gil; McMahon, James E.; Meier, Alan; Messenger, Michael; Millhone, John; Mills, Evan; Nadel, Steve; Nordman, Bruce; Price, Lynn; Romm, Joe; Ross, Marc; Rufo, Michael; Sathaye, Jayant; Schipper, Lee; Schneider, Stephen H; Sweeney, James L; Verdict, Malcolm; Vorsatz, Diana; Wang, Devra; Weinberg, Carl; Wilk, Richard; Wilson, John; Worrell, Ernst

    2009-03-01

    The growing investment by governments and electric utilities in energy efficiency programs highlights the need for simple tools to help assess and explain the size of the potential resource. One technique that is commonly used in this effort is to characterize electricity savings in terms of avoided power plants, because it is easier for people to visualize a power plant than it is to understand an abstraction such as billions of kilowatt-hours. Unfortunately, there is no standardization around the characteristics of such power plants. In this letter we define parameters for a standard avoided power plant that have physical meaning and intuitive plausibility, for use in back-of-the-envelope calculations. For the prototypical plant this article settles on a 500 MW existing coal plant operating at a 70 percent capacity factor with 7 percent T&D losses. Displacing such a plant for one year would save 3 billion kWh per year at the meter and reduce emissions by 3 million metric tons of CO2 per year. The proposed name for this metric is the Rosenfeld, in keeping with the tradition among scientists of naming units in honor of the person most responsible for the discovery and widespread adoption of the underlying scientific principle in question--Dr. Arthur H. Rosenfeld.
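The back-of-the-envelope arithmetic behind the Rosenfeld definition can be checked directly. This is our calculation, not the letter's; the ~1 kg CO2 per generated kWh coal intensity is a typical figure we assume to reproduce the stated emissions:

```python
# Rough check of the avoided-plant numbers quoted above (our arithmetic;
# the letter rounds both results to 3).
capacity_mw = 500.0
capacity_factor = 0.70
td_losses = 0.07
hours_per_year = 8760

generation_kwh = capacity_mw * 1000 * capacity_factor * hours_per_year
metered_kwh = generation_kwh * (1.0 - td_losses)
co2_tonnes = generation_kwh * 1.0 / 1000.0   # assumed ~1 kg CO2 / kWh

print(f"{generation_kwh / 1e9:.2f} billion kWh generated")   # 3.07
print(f"{metered_kwh / 1e9:.2f} billion kWh at the meter")   # 2.85
print(f"{co2_tonnes / 1e6:.2f} million t CO2")               # 3.07
```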

  13. Comparing Resource Adequacy Metrics: Preprint

    SciTech Connect

    Ibanez, E.; Milligan, M.

    2014-09-01

    As the penetration of variable generation (wind and solar) increases around the world, there is an accompanying growing interest and importance in accurately assessing the contribution that these resources can make toward planning reserve. This contribution, also known as the capacity credit or capacity value of the resource, is best quantified by using a probabilistic measure of overall resource adequacy. In recognizing the variable nature of these renewable resources, there has been interest in exploring the use of reliability metrics other than loss of load expectation. In this paper, we undertake some comparisons using data from the Western Electricity Coordinating Council in the western United States.
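Loss of load expectation, the baseline metric this paper compares against, counts the expected hours in which load exceeds available capacity. A toy brute-force sketch (unit sizes, outage rates, and loads are invented for illustration; real studies use convolution over far larger fleets):

```python
from itertools import product

# (capacity in MW, forced-outage rate) for each generating unit
units = [(200, 0.05), (200, 0.05), (100, 0.02)]
loads = [380, 420, 450, 300]   # a few representative hourly loads, MW

def lole(units, loads):
    """Expected number of hours with load exceeding available capacity,
    enumerating every on/off combination of units."""
    expected = 0.0
    for load in loads:
        p_shortfall = 0.0
        for states in product([0, 1], repeat=len(units)):
            cap = sum(mw for (mw, _), up in zip(units, states) if up)
            prob = 1.0
            for (_, q), up in zip(units, states):
                prob *= (1 - q) if up else q
            if cap < load:
                p_shortfall += prob
        expected += p_shortfall
    return expected

print(round(lole(units, loads), 4))  # 0.333
```

The capacity credit of an added resource can then be measured as the extra load the system carries at the same LOLE, which is the probabilistic framing the abstract refers to.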

  14. Efficient Synchronization Stability Metrics for Fault Clearing...

    Office of Scientific and Technical Information (OSTI)

    Title: Efficient Synchronization Stability Metrics for Fault Clearing Authors: Backhaus, Scott N.; Chertkov, Michael; Bent, Russell Whitford; Bienstock, Daniel ...

  15. Module 6 - Metrics, Performance Measurements and Forecasting...

    Energy.gov [DOE] (indexed site)

    This module reviews metrics such as cost and schedule variance along with cost and schedule performance indices. In addition, this module will outline forecasting tools such as ...

  16. Microsoft Word - QER Resilience Metrics - Technical Workshp ...

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    Workshop Resilience Metrics for Energy Transmission and Distribution Infrastructure Offices of Electricity Delivery and Energy Reliability (OE) and Energy Policy and Systems ...

  17. Microsoft Word - QER Resilience Metrics - Technical Workshp ...

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    Quadrennial Energy Review Technical Workshop on Resilience Metrics for Energy Transmission and Distribution Infrastructure April 29, 2014 777 North Capitol St NE Ste 300, ...

  18. Smart Grid Status and Metrics Report Appendices

    SciTech Connect

    Balducci, Patrick J.; Antonopoulos, Chrissi A.; Clements, Samuel L.; Gorrissen, Willy J.; Kirkham, Harold; Ruiz, Kathleen A.; Smith, David L.; Weimar, Mark R.; Gardner, Chris; Varney, Jeff

    2014-07-01

    A smart grid uses digital power control and communication technology to improve the reliability, security, flexibility, and efficiency of the electric system, from large generation through the delivery systems to electricity consumers and a growing number of distributed generation and storage resources. To convey progress made in achieving the vision of a smart grid, this report uses a set of six characteristics derived from the National Energy Technology Laboratory Modern Grid Strategy. The Smart Grid Status and Metrics Report defines and examines 21 metrics that collectively provide insight into the grid’s capacity to embody these characteristics. This appendix presents papers covering each of the 21 metrics identified in Section 2.1 of the Smart Grid Status and Metrics Report. These metric papers were prepared in advance of the main body of the report and collectively form its informational backbone.

  19. Metrics for border management systems.

    SciTech Connect

    Duggan, Ruth Ann

    2009-07-01

    There are as many unique and disparate manifestations of border systems as there are borders to protect. Border Security is a highly complex system analysis problem with global, regional, national, sector, and border element dimensions for land, water, and air domains. The complexity increases with the multiple, and sometimes conflicting, missions for regulating the flow of people and goods across borders, while securing them for national security. These systems include frontier border surveillance, immigration management and customs functions that must operate in a variety of weather, terrain, operational conditions, cultural constraints, and geopolitical contexts. As part of a Laboratory Directed Research and Development Project 08-684 (Year 1), the team developed a reference framework to decompose this complex system into international/regional, national, and border elements levels covering customs, immigration, and border policing functions. This generalized architecture is relevant to both domestic and international borders. As part of year two of this project (09-1204), the team determined relevant relative measures to better understand border management performance. This paper describes those relative metrics and how they can be used to improve border management systems.

  20. Description of the Sandia National Laboratories science, technology & engineering metrics process.

    SciTech Connect

    Jordan, Gretchen B.; Watkins, Randall D.; Trucano, Timothy Guy; Burns, Alan Richard; Oelschlaeger, Peter

    2010-04-01

    There has been a concerted effort since 2007 to establish a dashboard of metrics for the Science, Technology, and Engineering (ST&E) work at Sandia National Laboratories. These metrics are to provide a self-assessment mechanism for the ST&E Strategic Management Unit (SMU) to complement external expert review and advice and various internal self-assessment processes. The data and analysis will help ST&E Managers plan, implement, and track strategies and work in order to support the critical success factors of nurturing core science and enabling laboratory missions. The purpose of this SAND report is to provide a guide for those who want to understand the ST&E SMU metrics process. This report provides an overview of why the ST&E SMU wants a dashboard of metrics, some background on metrics for ST&E programs from existing literature and past Sandia metrics efforts, a summary of work completed to date, specifics on the portfolio of metrics that have been chosen and the implementation process that has been followed, and plans for the coming year to improve the ST&E SMU metrics process.

  1. Metrics for comparison of crystallographic maps

    SciTech Connect

    Urzhumtsev, Alexandre; Afonine, Pavel V.; Lunin, Vladimir Y.; Terwilliger, Thomas C.; Adams, Paul D.

    2014-10-01

    Numerical comparison of crystallographic contour maps is used extensively in structure solution and model refinement, analysis and validation. However, traditional metrics such as the map correlation coefficient (map CC, real-space CC or RSCC) sometimes contradict the results of visual assessment of the corresponding maps. This article explains such apparent contradictions and suggests new metrics and tools to compare crystallographic contour maps. The key to the new methods is rank scaling of the Fourier syntheses. The new metrics are complementary to the usual map CC and can be more helpful in map comparison, in particular when only some of their aspects, such as regions of high density, are of interest.
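The standard map CC is a Pearson correlation over grid densities, and rank scaling replaces each density value by its rank before correlating. A toy 1-D sketch of both ideas (real maps are 3-D grids, and the article's actual metrics are more elaborate):

```python
import numpy as np

def map_cc(rho1, rho2):
    """Pearson correlation between two density maps (map CC / RSCC)."""
    a = rho1 - rho1.mean()
    b = rho2 - rho2.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

def rank_scale(rho):
    """Replace each density value by its rank, scaled to [0, 1]."""
    order = rho.argsort()
    ranks = np.empty_like(order)
    ranks[order] = np.arange(len(rho))
    return ranks / (len(rho) - 1)

rho1 = np.array([0.1, 0.5, 0.9, 0.3])
rho2 = np.array([0.2, 0.4, 1.0, 0.1])
print(round(map_cc(rho1, rho2), 3))                          # 0.932
print(round(map_cc(rank_scale(rho1), rank_scale(rho2)), 3))  # 0.8
```

Correlating the rank-scaled maps de-emphasizes the exact magnitudes of the strongest peaks, which is why such metrics can agree better with visual comparison than the plain CC.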

  2. Performance Metrics and Budget Division (HC-51)

    Energy.gov [DOE]

    The mission of the Performance Metrics and Budget Division (HC-51) is to support the effective and efficient implementation of the Department of Energy’s human capital initiatives and functions...

  3. Clean Cities Annual Metrics Report 2009 (Revised)

    SciTech Connect

    Johnson, C.

    2011-08-01

    Document provides Clean Cities coalition metrics about the use of alternative fuels; the deployment of alternative fuel vehicles, hybrid electric vehicles (HEVs), and idle reduction initiatives; fuel economy activities; and programs to reduce vehicle miles driven.

  4. FY 2014 Q3 Metric Summary | Department of Energy

    Office of Environmental Management (EM)

    FY 2014 Overall Contract and Project Management Improvement Performance Metrics and Targets FY 2015 Overall Contract and Project Management Improvement Performance Metrics and ...

  5. Business Metrics for High-Performance Homes: A Colorado Springs...

    Office of Scientific and Technical Information (OSTI)

    Technical Report: Business Metrics for High-Performance Homes: A Colorado Springs Case Study Citation Details In-Document Search Title: Business Metrics for High-Performance Homes: ...

  6. Conceptual Framework for Developing Resilience Metrics for the...

    Energy.gov [DOE] (indexed site)

    Conceptual Framework for Developing Resilience Metrics for the Electricity, Oil, and Gas Sectors in the United States Technical Workshop: Resilience Metrics for Energy Transmission ...

  7. Label-invariant Mesh Quality Metrics. (Conference) | SciTech...

    Office of Scientific and Technical Information (OSTI)

    Label-invariant Mesh Quality Metrics. Citation Details In-Document Search Title: Label-invariant Mesh Quality Metrics. Abstract not provided. Authors: Knupp, Patrick Publication ...

  8. Implementing the Data Center Energy Productivity Metric

    SciTech Connect

    Sego, Landon H.; Marquez, Andres; Rawson, Andrew; Cader, Tahir; Fox, Kevin M.; Gustafson, William I.; Mundy, Christopher J.

    2012-10-01

    As data centers proliferate in both size and number, their energy efficiency is becoming increasingly important. We discuss the properties of a number of the proposed metrics of energy efficiency and productivity. In particular, we focus on the Data Center Energy Productivity (DCeP) metric, which is the ratio of useful work produced by the data center to the energy consumed performing that work. We describe our approach for using DCeP as the principal outcome of a designed experiment using a highly instrumented, high performance computing data center. We found that DCeP was successful in clearly distinguishing between different operational states in the data center, thereby validating its utility as a metric for identifying configurations of hardware and software that would improve (or even maximize) energy productivity. We also discuss some of the challenges and benefits associated with implementing the DCeP metric, and we examine the efficacy of the metric in making comparisons within a data center and among data centers.
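Per the definition in this abstract, DCeP is the ratio of useful work produced to the energy consumed producing it. A minimal sketch of that ratio (the task weighting and numbers are invented for illustration; the metric's formal definition assigns value weights to completed tasks within an assessment window):

```python
def dcep(tasks_completed, weights, energy_kwh):
    """Data Center Energy Productivity: weighted useful work per kWh
    consumed during the assessment window."""
    useful_work = sum(n * w for n, w in zip(tasks_completed, weights))
    return useful_work / energy_kwh

# Two job classes with different value weights; 1200 kWh consumed.
print(round(dcep([400, 100], [1.0, 3.0], 1200.0), 3))  # 0.583
```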

  9. Instructions for EM Corporate Performance Metrics | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    Instructions for EM Corporate Performance Metrics Instructions for EM Corporate Performance Metrics Quality Program Criteria Instructions for EM Corporate Performance Metrics (128.47 KB) More Documents & Publications EM Corporate QA Performance Metrics CPMS Tables QA Corporate Board Meeting - July 2008

  10. Metrics for Evaluating the Accuracy of Solar Power Forecasting (Presentation)

    SciTech Connect

    Zhang, J.; Hodge, B.; Florita, A.; Lu, S.; Hamann, H.; Banunarayanan, V.

    2013-10-01

    This presentation proposes a suite of metrics for evaluating the performance of solar power forecasting.

  11. Enhanced Accident Tolerant LWR Fuels: Metrics Development

    SciTech Connect

    Shannon Bragg-Sitton; Lori Braase; Rose Montgomery; Chris Stanek; Robert Montgomery; Lance Snead; Larry Ott; Mike Billone

    2013-09-01

    The Department of Energy (DOE) Fuel Cycle Research and Development (FCRD) Advanced Fuels Campaign (AFC) is conducting research and development on enhanced Accident Tolerant Fuels (ATF) for light water reactors (LWRs). This mission emphasizes the development of novel fuel and cladding concepts to replace the current zirconium alloy-uranium dioxide (UO2) fuel system. The overall mission of the ATF research is to develop advanced fuels/cladding with improved performance, reliability and safety characteristics during normal operations and accident conditions, while minimizing waste generation. The initial effort will focus on implementation in operating reactors or reactors with design certifications. To initiate the development of quantitative metrics for ATR, a LWR Enhanced Accident Tolerant Fuels Metrics Development Workshop was held in October 2012 in Germantown, MD. This paper summarizes the outcome of that workshop and the current status of metrics development for LWR ATF.

  12. Metrics for comparison of crystallographic maps

    DOE PAGES [OSTI]

    Urzhumtsev, Alexandre; Afonine, Pavel V.; Lunin, Vladimir Y.; Terwilliger, Thomas C.; Adams, Paul D.

    2014-10-01

Numerical comparison of crystallographic contour maps is used extensively in structure solution and model refinement, analysis and validation. However, traditional metrics such as the map correlation coefficient (map CC, real-space CC or RSCC) sometimes contradict the results of visual assessment of the corresponding maps. This article explains such apparent contradictions and suggests new metrics and tools to compare crystallographic contour maps. The key to the new methods is rank scaling of the Fourier syntheses. The new metrics are complementary to the usual map CC and can be more helpful in map comparison, in particular when only some of their aspects, such as regions of high density, are of interest.
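The key idea in this abstract, rank scaling before correlation, can be illustrated generically: replace each map value by its rank and then compute an ordinary correlation coefficient. This is a sketch of the general idea only, not the authors' exact procedure (which operates on Fourier syntheses), and the sample values are hypothetical:

```python
# Sketch of rank scaling followed by a correlation coefficient.
# Illustrates the general idea (replace values by their ranks, then
# correlate); ties are ignored for simplicity. Sample maps are hypothetical.

def rank_scale(values):
    """Replace each value by its 0-based rank in sorted order."""
    order = sorted(range(len(values)), key=values.__getitem__)
    ranks = [0] * len(values)
    for rank, idx in enumerate(order):
        ranks[idx] = rank
    return ranks

def correlation(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

map_a = [0.1, 5.0, 0.2, 3.0, 0.15]
map_b = [0.2, 4.0, 0.1, 6.0, 0.25]
cc_raw = correlation(map_a, map_b)                            # dominated by peaks
cc_rank = correlation(rank_scale(map_a), rank_scale(map_b))   # order-based
```

Rank scaling makes the comparison insensitive to how density values are distributed, which is why it can agree better with visual assessment than the raw map CC.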

  13. EECBG SEP Attachment 1 - Process metric list | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    SEP Attachment 1 - Process metric list EECBG SEP Attachment 1 - Process metric list Reporting Guidance Process Metric List eecbg_10_07b_sep__10_006a_attachment1_process_metric_list.pdf (93.56 KB) More Documents & Publications EECBG 10-07C/SEP 10-006B Attachment 1: Process Metrics List EECBG Program Notice 10-07A DOE Recovery Act Reporting Requirements for the State Energy Program

  14. Clean Cities 2011 Annual Metrics Report

    SciTech Connect

    Johnson, C.

    2012-12-01

    This report details the petroleum savings and vehicle emissions reductions achieved by the U.S. Department of Energy's Clean Cities program in 2011. The report also details other performance metrics, including the number of stakeholders in Clean Cities coalitions, outreach activities by coalitions and national laboratories, and alternative fuel vehicles deployed.

  15. Performance Metrics Research Project - Final Report

    SciTech Connect

    Deru, M.; Torcellini, P.

    2005-10-01

    NREL began work for DOE on this project to standardize the measurement and characterization of building energy performance. NREL's primary research objectives were to determine which performance metrics have greatest value for determining energy performance and to develop standard definitions and methods of measuring and reporting that performance.

  16. Clean Cities 2010 Annual Metrics Report

    SciTech Connect

    Johnson, C.

    2012-10-01

    This report details the petroleum savings and vehicle emissions reductions achieved by the U.S. Department of Energy's Clean Cities program in 2010. The report also details other performance metrics, including the number of stakeholders in Clean Cities coalitions, outreach activities by coalitions and national laboratories, and alternative fuel vehicles deployed.

  17. Smart Grid Status and Metrics Report

    SciTech Connect

    Balducci, Patrick J.; Weimar, Mark R.; Kirkham, Harold

    2014-07-01

    To convey progress made in achieving the vision of a smart grid, this report uses a set of six characteristics derived from the National Energy Technology Laboratory Modern Grid Strategy. It measures 21 metrics to provide insight into the grid’s capacity to embody these characteristics. This report looks across a spectrum of smart grid concerns to measure the status of smart grid deployment and impacts.

  18. Widget:CrazyEggMetrics | Open Energy Information

    OpenEI (Open Energy Information) [EERE & EIA]

    CrazyEggMetrics Jump to: navigation, search This widget runs javascript code for the Crazy Egg user experience metrics. This should not be on all pages, but on select pages...

  19. Energy Department Project Captures and Stores One Million Metric...

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    One Million Metric Tons of Carbon Energy Department Project Captures and Stores One Million Metric Tons of Carbon January 8, 2015 - 11:18am Addthis News Media Contact 202-586-4940 ...

  20. Financial Metrics Data Collection Protocol, Version 1.0

    SciTech Connect

    Fowler, Kimberly M.; Gorrissen, Willy J.; Wang, Na

    2010-04-30

    Brief description of data collection process and plan that will be used to collect financial metrics associated with sustainable design.

  1. Module 6 - Metrics, Performance Measurements and Forecasting | Department

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

of Energy 6 - Metrics, Performance Measurements and Forecasting Module 6 - Metrics, Performance Measurements and Forecasting This module focuses on the metrics and performance measurement tools used in Earned Value. This module reviews metrics such as cost and schedule variance along with cost and schedule performance indices. In addition, this module will outline forecasting tools such as estimate to complete (ETC) and estimate at completion (EAC). Begin Module >>

  2. Nonmaximality of known extremal metrics on torus and Klein bottle

    SciTech Connect

    Karpukhin, M A

    2013-12-31

    The El Soufi-Ilias theorem establishes a connection between minimal submanifolds of spheres and extremal metrics for eigenvalues of the Laplace-Beltrami operator. Recently, this connection was used to provide several explicit examples of extremal metrics. We investigate the properties of these metrics and prove that none of them is maximal. Bibliography: 24 titles.

  3. Annex A Metrics for the Smart Grid System Report

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

Annex A Metrics for the Smart Grid System Report Table of Contents: Introduction (A.1); Metric #1: The Fraction of Customers and Total Load Served by Real-Time Pricing, Critical Peak Pricing, and Time-of-Use Pricing (A.2); Metric #2: Real-Time System Operations Data

  4. Microsoft Word - McIntyre-Metrics Report SAND draft9-14.doc

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

    SAND2007-2070P Unlimited Release September 2007 Security Metrics for ... systems, including development of a metrics taxonomy and guidelines for using metrics. ...

  5. Metrics For Comparing Plasma Mass Filters

    SciTech Connect

    Abraham J. Fetterman and Nathaniel J. Fisch

    2012-08-15

High-throughput mass separation of nuclear waste may be useful for optimal storage, disposal, or environmental remediation. The most dangerous part of nuclear waste is the fission product, which produces most of the heat and medium-term radiation. Plasmas are well-suited to separating nuclear waste because they can separate many different species in a single step. A number of plasma devices have been designed for such mass separation, but there has been no standardized comparison between these devices. We define a standard metric, the separative power per unit volume, and derive it for three different plasma mass filters: the plasma centrifuge, Ohkawa filter, and the magnetic centrifugal mass filter.

  6. Metrics for comparing plasma mass filters

    SciTech Connect

    Fetterman, Abraham J.; Fisch, Nathaniel J.

    2011-10-15

    High-throughput mass separation of nuclear waste may be useful for optimal storage, disposal, or environmental remediation. The most dangerous part of nuclear waste is the fission product, which produces most of the heat and medium-term radiation. Plasmas are well-suited to separating nuclear waste because they can separate many different species in a single step. A number of plasma devices have been designed for such mass separation, but there has been no standardized comparison between these devices. We define a standard metric, the separative power per unit volume, and derive it for three different plasma mass filters: the plasma centrifuge, Ohkawa filter, and the magnetic centrifugal mass filter.

  7. Clean Cities 2014 Annual Metrics Report

    Alternative Fuels and Advanced Vehicles Data Center

    Clean Cities 2014 Annual Metrics Report Caley Johnson and Mark Singer National Renewable Energy Laboratory Technical Report NREL/TP-5400-65265 December 2015 NREL is a national laboratory of the U.S. Department of Energy Office of Energy Efficiency & Renewable Energy Operated by the Alliance for Sustainable Energy, LLC This report is available at no cost from the National Renewable Energy Laboratory (NREL) at www.nrel.gov/publications. Contract No. DE-AC36-08GO28308 National Renewable Energy

  8. Metric redefinitions in Einstein-Aether theory

    SciTech Connect

    Foster, Brendan Z.

    2005-08-15

    'Einstein-Aether' theory, in which gravity couples to a dynamical, timelike, unit-norm vector field, provides a means for studying Lorentz violation in a generally covariant setting. Demonstrated here is the effect of a redefinition of the metric and 'aether' fields in terms of the original fields and two free parameters. The net effect is a change of the coupling constants appearing in the action. Using such a redefinition, one of the coupling constants can be set to zero, simplifying studies of solutions of the theory.

  9. Clean Cities 2014 Annual Metrics Report

    SciTech Connect

    Johnson, Caley; Singer, Mark

    2015-12-22

    Each year, the U.S. Department of Energy asks its Clean Cities program coordinators to submit annual reports of their activities and accomplishments for the previous calendar year. Data and information are submitted via an online database that is maintained as part of the Alternative Fuels Data Center (AFDC) at the National Renewable Energy Laboratory (NREL). Coordinators submit a range of data that characterize the membership, funding, projects, and activities of their coalitions. They also submit data about sales of alternative fuels, deployment of alternative fuel vehicles (AFVs) and hybrid electric vehicles (HEVs), idle-reduction (IR) initiatives, fuel economy activities, and programs to reduce vehicle miles traveled (VMT). NREL analyzes the data and translates them into petroleum-use reduction impacts, which are summarized in this 2014 Annual Metrics Report.

  10. Clean Cities 2013 Annual Metrics Report

    SciTech Connect

    Johnson, C.; Singer, M.

    2014-10-01

    Each year, the U.S. Department of Energy asks its Clean Cities program coordinators to submit annual reports of their activities and accomplishments for the previous calendar year. Data and information are submitted via an online database that is maintained as part of the Alternative Fuels Data Center (AFDC) at the National Renewable Energy Laboratory (NREL). Coordinators submit a range of data that characterize the membership, funding, projects, and activities of their coalitions. They also submit data about sales of alternative fuels, deployment of alternative fuel vehicles (AFVs) and hybrid electric vehicles (HEVs), idle-reduction (IR) initiatives, fuel economy activities, and programs to reduce vehicle miles traveled (VMT). NREL analyzes the data and translates them into petroleum-use reduction impacts, which are summarized in this 2013 Annual Metrics Report.

  11. Metrics correlation and analysis service (MCAS)

    SciTech Connect

Baranovski, Andrew; Dykstra, Dave; Garzoglio, Gabriele; Hesselroth, Ted; Mhashilkar, Parag; Levshina, Tanya (Fermilab)

    2009-05-01

The complexity of Grid workflow activities and their associated software stacks inevitably involves multiple organizations, ownership, and deployment domains. In this setting, important and common tasks such as the correlation and display of metrics and debugging information (fundamental ingredients of troubleshooting) are challenged by the informational entropy inherent to independently maintained and operated software components. Because such an information 'pond' is disorganized, it is a difficult environment for business intelligence analysis, i.e., troubleshooting, incident investigation, and trend spotting. The mission of the MCAS project is to deliver a software solution to help with adaptation, retrieval, correlation, and display of workflow-driven data and of type-agnostic events generated by disjoint middleware.

  12. Metrics for Measuring Progress Toward Implementation of the Smart Grid

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    (June 2008) | Department of Energy Metrics for Measuring Progress Toward Implementation of the Smart Grid (June 2008) Metrics for Measuring Progress Toward Implementation of the Smart Grid (June 2008) Results of the breakout session discussions at the Smart Grid Implementation Workshop, June 19-20, 2008 Metrics for Measuring Progress Toward Implementation of the Smart Grid (308.23 KB) More Documents & Publications 5th Annual CHP Roadmap Workshop Breakout Group Results, September 2004

  13. Measuring energy efficiency: Opportunities from standardization and common metrics

    Energy Information Administration (EIA) (indexed site)

    Measuring energy efficiency: Opportunities from standardization and common metrics For 2016 EIA Energy Conference July 11, 2016 | Washington, D.C. By Stacy Angel, Energy Information Portfolio Analyst Carol White, Senior Energy Efficiency Analyst How is the importance of measuring energy efficiency changing? * The number of energy efficiency policies and programs is growing. * Common metrics help measure progress towards multiple objectives. * Clear metrics help consumers make informed energy

  14. Integration of the EM Corporate QA Performance Metrics With Performance

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    Analysis Process | Department of Energy the EM Corporate QA Performance Metrics With Performance Analysis Process Integration of the EM Corporate QA Performance Metrics With Performance Analysis Process August 2009 Presenter: Robert Hinds, Savannah River Remediation, LLC Track 9-12 Topics Covered: Implementing CPMS for QA Corporate QA Performance Metrics Contractor Performance Analysis Contractor Assessment Programs Assessment Program Structure CPMS Integration with P/A Process Validating

  15. Toward a new metric for ranking high performance computing systems.

    Office of Scientific and Technical Information (OSTI)

    (Technical Report) | SciTech Connect Toward a new metric for ranking high performance computing systems. Citation Details In-Document Search Title: Toward a new metric for ranking high performance computing systems. The High Performance Linpack (HPL), or Top 500, benchmark [1] is the most widely recognized and discussed metric for ranking high performance computing systems. However, HPL is increasingly unreliable as a true measure of system performance for a growing collection of important

  16. Technical Workshop: Resilience Metrics for Energy Transmission and

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    Distribution Infrastructure | Department of Energy Resilience Metrics for Energy Transmission and Distribution Infrastructure Technical Workshop: Resilience Metrics for Energy Transmission and Distribution Infrastructure During this workshop, EPSA invited technical experts from industry, national laboratories, academia, and NGOs to discuss the state of play of and need for resilience metrics and how they vary by natural gas, liquid fuels and electric grid infrastructures. Issues important to

  17. EM Corporate QA Performance Metrics | Department of Energy

    Energy.gov [DOE] (indexed site)

    QA Corporate Board Meeting - November 2008 Instructions for EM Corporate Performance Metrics FY 2015 SENIOR EXECUTIVE SERVICE (SES) AND SENIOR PROFESSIONAL (SP) PERFORMANCE ...

  18. Office of HC Strategy Budget and Performance Metrics (HC-50)

    Energy.gov [DOE]

    The Office of Human Capital Strategy, Budget, and Performance Metrics provides strategic direction and advice to its stakeholders through the integration of budget analysis, workforce projections,...

  19. DOE Announces Webinars on Solar Forecasting Metrics, the DOE...

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    DOE Announces Webinars on Solar Forecasting Metrics, the DOE ... from adopting the latest energy efficiency and renewable ... to liquids technology, advantages of using natural gas, ...

  20. Exploration Cost and Time Metric | Open Energy Information

    OpenEI (Open Energy Information) [EERE & EIA]

Language: English Exploration Cost and Time Metric Screenshot References: Conference Paper1...

  1. Integration of the EM Corporate QA Performance Metrics With Performanc...

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    Integration of the EM Corporate QA Performance Metrics With Performance Analysis Process ... Assessment Program Structure CPMS Integration with PA Process Validating The Process ...

  2. Wave Energy Converter System Requirements and Performance Metrics

    Energy.gov [DOE]

    The Energy Department and Wave Energy Scotland are holding a joint workshop on wave energy converter (WEC) system requirements and performance metrics on Friday, February 26.

  3. Practical Diagnostics for Evaluating Residential Commissioning Metrics

    SciTech Connect

    Wray, Craig; Walker, Iain; Siegel, Jeff; Sherman, Max

    2002-06-11

In this report, we identify and describe 24 practical diagnostics that are ready now to evaluate residential commissioning metrics, and that we expect to include in the commissioning guide. Our discussion in the main body of this report is limited to existing diagnostics in areas of particular concern with significant interactions: envelope and HVAC systems. These areas include insulation quality, windows, airtightness, envelope moisture, fan and duct system airflows, duct leakage, cooling equipment charge, and combustion appliance backdrafting with spillage. Appendix C describes the 83 other diagnostics that we have examined in the course of this project, but that are not ready or are inappropriate for residential commissioning. Combined with Appendix B, Table 1 in the main body of the report summarizes the advantages and disadvantages of all 107 diagnostics. We first describe what residential commissioning is, its characteristic elements, and how one might structure its process. Our intent in this discussion is to formulate and clarify these issues, but the discussion is largely preliminary because such a practice does not yet exist. Subsequent sections of the report describe metrics one can use in residential commissioning, along with the consolidated set of 24 practical diagnostics that the building industry can use now to evaluate them. Where possible, we also discuss the accuracy and usability of diagnostics, based on recent laboratory work and field studies by LBNL staff and others in more than 100 houses. These studies concentrate on evaluating diagnostics in the following four areas: the DeltaQ duct leakage test, air-handler airflow tests, supply and return grille airflow tests, and refrigerant charge tests. Appendix A describes those efforts in detail. In addition, where possible, we identify the costs to purchase diagnostic equipment and the amount of time required to conduct the diagnostics. Table 1 summarizes these data. Individual equipment costs for the 24

  4. Metrics for Evaluating Conventional and Renewable Energy Technologies (Presentation)

    SciTech Connect

    Mann, M. K.

    2013-01-01

    With numerous options for the future of natural gas, how do we know we're going down the right path? How do we designate a metric to measure and demonstrate change and progress, and how does that metric incorporate all stakeholders and scenarios?

  5. Quantitative metrics for assessment of chemical image quality and spatial resolution

    DOE PAGES [OSTI]

    Kertesz, Vilmos; Cahill, John F.; Van Berkel, Gary J.

    2016-02-28

Rationale: Currently objective/quantitative descriptions of the quality and spatial resolution of mass spectrometry derived chemical images are not standardized. Development of these standardized metrics is required to objectively describe chemical imaging capabilities of existing and/or new mass spectrometry imaging technologies. Such metrics would allow unbiased judgment of intra-laboratory advancement and/or inter-laboratory comparison for these technologies if used together with standardized surfaces. Methods: We developed two image metrics, viz., chemical image contrast (ChemIC) based on signal-to-noise related statistical measures on chemical image pixels and corrected resolving power factor (cRPF) constructed from statistical analysis of mass-to-charge chronograms across features of interest in an image. These metrics, quantifying chemical image quality and spatial resolution, respectively, were used to evaluate chemical images of a model photoresist patterned surface collected using a laser ablation/liquid vortex capture mass spectrometry imaging system under different instrument operational parameters. Results: The calculated ChemIC and cRPF metrics determined in an unbiased fashion the relative ranking of chemical image quality obtained with the laser ablation/liquid vortex capture mass spectrometry imaging system. These rankings were used to show that both chemical image contrast and spatial resolution deteriorated with increasing surface scan speed, increased lane spacing and decreasing size of surface features. Conclusions: ChemIC and cRPF, respectively, were developed and successfully applied for the objective description of chemical image quality and spatial resolution of chemical images collected from model surfaces using a laser ablation/liquid vortex capture mass spectrometry imaging system.

  6. Development of new VOC exposure metrics and their relationship to ''Sick Building Syndrome'' symptoms

    SciTech Connect

    Ten Brinke, JoAnn

    1995-08-01

    Volatile organic compounds (VOCs) are suspected to contribute significantly to ''Sick Building Syndrome'' (SBS), a complex of subchronic symptoms that occurs during and in general decreases away from occupancy of the building in question. A new approach takes into account individual VOC potencies, as well as the highly correlated nature of the complex VOC mixtures found indoors. The new VOC metrics are statistically significant predictors of symptom outcomes from the California Healthy Buildings Study data. Multivariate logistic regression analyses were used to test the hypothesis that a summary measure of the VOC mixture, other risk factors, and covariates for each worker will lead to better prediction of symptom outcome. VOC metrics based on animal irritancy measures and principal component analysis had the most influence in the prediction of eye, dermal, and nasal symptoms. After adjustment, a water-based paints and solvents source was found to be associated with dermal and eye irritation. The more typical VOC exposure metrics used in prior analyses were not useful in symptom prediction in the adjusted model (total VOC (TVOC), or sum of individually identified VOCs ({Sigma}VOC{sub i})). Also not useful were three other VOC metrics that took into account potency, but did not adjust for the highly correlated nature of the data set, or the presence of VOCs that were not measured. High TVOC values (2--7 mg m{sup {minus}3}) due to the presence of liquid-process photocopiers observed in several study spaces significantly influenced symptoms. Analyses without the high TVOC values reduced, but did not eliminate the ability of the VOC exposure metric based on irritancy and principal component analysis to explain symptom outcome.

  7. Self-benchmarking Guide for Cleanrooms: Metrics, Benchmarks, Actions

    SciTech Connect

    Mathew, Paul; Sartor, Dale; Tschudi, William

    2009-07-13

    This guide describes energy efficiency metrics and benchmarks that can be used to track the performance of and identify potential opportunities to reduce energy use in laboratory buildings. This guide is primarily intended for personnel who have responsibility for managing energy use in existing laboratory facilities - including facilities managers, energy managers, and their engineering consultants. Additionally, laboratory planners and designers may also use the metrics and benchmarks described in this guide for goal-setting in new construction or major renovation. This guide provides the following information: (1) A step-by-step outline of the benchmarking process. (2) A set of performance metrics for the whole building as well as individual systems. For each metric, the guide provides a definition, performance benchmarks, and potential actions that can be inferred from evaluating this metric. (3) A list and descriptions of the data required for computing the metrics. This guide is complemented by spreadsheet templates for data collection and for computing the benchmarking metrics. This guide builds on prior research supported by the national Laboratories for the 21st Century (Labs21) program, supported by the U.S. Department of Energy and the U.S. Environmental Protection Agency. Much of the benchmarking data are drawn from the Labs21 benchmarking database and technical guides. Additional benchmark data were obtained from engineering experts including laboratory designers and energy managers.

  8. Self-benchmarking Guide for Laboratory Buildings: Metrics, Benchmarks, Actions

    SciTech Connect

    Mathew, Paul; Greenberg, Steve; Sartor, Dale

    2009-07-13

    This guide describes energy efficiency metrics and benchmarks that can be used to track the performance of and identify potential opportunities to reduce energy use in laboratory buildings. This guide is primarily intended for personnel who have responsibility for managing energy use in existing laboratory facilities - including facilities managers, energy managers, and their engineering consultants. Additionally, laboratory planners and designers may also use the metrics and benchmarks described in this guide for goal-setting in new construction or major renovation. This guide provides the following information: (1) A step-by-step outline of the benchmarking process. (2) A set of performance metrics for the whole building as well as individual systems. For each metric, the guide provides a definition, performance benchmarks, and potential actions that can be inferred from evaluating this metric. (3) A list and descriptions of the data required for computing the metrics. This guide is complemented by spreadsheet templates for data collection and for computing the benchmarking metrics. This guide builds on prior research supported by the national Laboratories for the 21st Century (Labs21) program, supported by the U.S. Department of Energy and the U.S. Environmental Protection Agency. Much of the benchmarking data are drawn from the Labs21 benchmarking database and technical guides. Additional benchmark data were obtained from engineering experts including laboratory designers and energy managers.

  9. Self-benchmarking Guide for Data Centers: Metrics, Benchmarks, Actions

    SciTech Connect

    Mathew, Paul; Ganguly, Srirupa; Greenberg, Steve; Sartor, Dale

    2009-07-13

    This guide describes energy efficiency metrics and benchmarks that can be used to track the performance of and identify potential opportunities to reduce energy use in data centers. This guide is primarily intended for personnel who have responsibility for managing energy use in existing data centers - including facilities managers, energy managers, and their engineering consultants. Additionally, data center designers may also use the metrics and benchmarks described in this guide for goal-setting in new construction or major renovation. This guide provides the following information: (1) A step-by-step outline of the benchmarking process. (2) A set of performance metrics for the whole building as well as individual systems. For each metric, the guide provides a definition, performance benchmarks, and potential actions that can be inferred from evaluating this metric. (3) A list and descriptions of the data required for computing the metrics. This guide is complemented by spreadsheet templates for data collection and for computing the benchmarking metrics. This guide builds on prior data center benchmarking studies supported by the California Energy Commission. Much of the benchmarking data are drawn from the LBNL data center benchmarking database that was developed from these studies. Additional benchmark data were obtained from engineering experts including facility designers and energy managers. This guide also builds on recent research supported by the U.S. Department of Energy's Save Energy Now program.

  10. A Graph Analytic Metric for Mitigating Advanced Persistent Threat

    SciTech Connect

    Johnson, John R.; Hogan, Emilie A.

    2013-06-04

This paper introduces a novel graph analytic metric that can be used to measure the potential vulnerability of a cyber network to specific types of attacks that use lateral movement and privilege escalation, such as the well-known Pass the Hash (PTH). The metric is computed from an oriented subgraph of the underlying cyber network induced by selecting only those edges for which a given property holds between the two vertices of the edge. The metric with respect to a select node on the subgraph is defined as the likelihood that the select node is reachable from another arbitrary node in the graph. This metric can be calculated dynamically from the authorization and auditing layers during the network security authorization phase and will potentially enable predictive deterrence against attacks such as PTH.
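The metric described in this abstract reduces to a reachability computation: for a selected node, the fraction of other nodes from which it can be reached along directed edges. A minimal sketch of that computation follows; the edge list is a hypothetical example, not data from the paper:

```python
# Sketch of the reachability-based vulnerability metric described above:
# for a selected node, the fraction of other nodes from which it can be
# reached along directed edges. The edge list is a hypothetical example.
from collections import defaultdict, deque

def reachability_metric(edges, target):
    """Fraction of non-target nodes that can reach `target`."""
    # Walk the reversed graph from `target`: every node visited can reach it.
    reverse = defaultdict(list)
    nodes = set()
    for src, dst in edges:
        reverse[dst].append(src)
        nodes.update((src, dst))
    seen, queue = {target}, deque([target])
    while queue:
        for pred in reverse[queue.popleft()]:
            if pred not in seen:
                seen.add(pred)
                queue.append(pred)
    others = nodes - {target}
    return len(seen - {target}) / len(others) if others else 0.0

# Hypothetical lateral-movement edges (A can move to B, B to C, ...):
edges = [("A", "B"), ("B", "C"), ("C", "D"), ("E", "B"), ("F", "G")]
print(reachability_metric(edges, "D"))  # 4 of 6 other nodes can reach D
```

In the paper's setting the edges would come from the authorization and auditing layers, restricted to pairs for which the chosen property (e.g. a credential-reuse condition) holds.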

  11. ARM - Evaluation Product - AERI Data Quality Metric (AERI-QC...

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

Evaluation Product: AERI Data Quality Metric (AERI-QC). Ancillary NetCDF file to be used with the...

  12. Microsoft Word - followup to Fin Risk Metrics workshop.doc

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

March 21, 2008. Subject: Follow-up to Financial Risk Metrics Workshop. Differences in Cash Flow between Net Billing and Direct Pay for Energy Northwest. Attached...

  13. Analysis of Solar Cell Quality Using Voltage Metrics: Preprint

    SciTech Connect

    Toberer, E. S.; Tamboli, A. C.; Steiner, M.; Kurtz, S.

    2012-06-01

    The highest efficiency solar cells provide both excellent voltage and current. Of these, the open-circuit voltage (Voc) is more frequently viewed as an indicator of the material quality. However, since the Voc also depends on the band gap of the material, the difference between the band gap and the Voc is a better metric for comparing material quality of unlike materials. To take this one step further, since Voc also depends on the shape of the absorption edge, we propose to use the ultimate metric: the difference between the measured Voc and the Voc calculated from the external quantum efficiency using a detailed balance approach. This metric is less sensitive to changes in cell design and definition of band gap. The paper defines how to implement this metric and demonstrates how it can be useful in tracking improvements in Voc, especially as Voc approaches its theoretical maximum.
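The intermediate band-gap-offset metric discussed above can be illustrated with a toy calculation; the band gaps and voltages below are invented round numbers, and the paper's ultimate metric (measured Voc against a detailed-balance Voc computed from the EQE) is not reproduced here.

```python
# Hypothetical numbers illustrating the band-gap-offset metric,
# W_oc = Eg/q - Voc, for comparing material quality across unlike cells.
# The EQE-based detailed-balance refinement is not implemented here.

cells = {            # material: (band gap Eg in eV, measured Voc in V)
    "GaAs": (1.42, 1.11),
    "CIGS": (1.15, 0.72),
}
for name, (eg_ev, voc) in cells.items():
    w_oc = eg_ev - voc  # numerically valid since Eg is given in eV
    print(f"{name}: W_oc = {w_oc:.2f} V")
```

A smaller offset indicates material quality closer to the theoretical limit, even though the raw Voc values are not directly comparable.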

  14. ARM - Evaluation Product - Barrow Radiation Data (2009 metric...

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

Evaluation Product: Barrow Radiation Data (2009 metric). Observations from a suite of radiometers including...

  15. Measuring solar reflectance Part I: Defining a metric that accurately...

    Office of Scientific and Technical Information (OSTI)

    A widely used solar reflectance metric based on the ASTM Standard E891 beam-normal solar spectral irradiance underestimates the solar heat gain of a spectrally selective 'cool ...

  16. Texas CO2 Capture Demonstration Project Hits Three Million Metric...

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    On June 30, Allentown, PA-based Air Products successfully captured and transported, via pipeline, its 3 millionth metric ton of carbon dioxide (CO2) to be used for enhanced oil ...

  17. Towards Efficient Supercomputing: Searching for the Right Efficiency Metric

    SciTech Connect

    Hsu, Chung-Hsing; Kuehn, Jeffery A; Poole, Stephen W

    2012-01-01

The efficiency of supercomputing has traditionally been measured in execution time. In the early 2000s, the concept of total cost of ownership was re-introduced, broadening the efficiency measure to include aspects such as energy and space. Yet the supercomputing community has never agreed upon a metric that can cover these aspects altogether and also provide a fair basis for comparison. This paper examines the metrics that have been proposed in the past decade, and proposes a vector-valued metric for efficient supercomputing. Using this metric, the paper presents a study of where the supercomputing industry has been and how it stands today with respect to efficient supercomputing.
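One way to read a vector-valued metric is via Pareto dominance: a system is preferable only if it is at least as good in every component. The sketch below is an illustration of that idea, not the paper's proposal; the systems and figures are invented.

```python
# Sketch of comparing systems under a vector-valued efficiency metric
# (runtime, energy, space) by Pareto dominance instead of one scalar.
# All systems and figures below are invented for illustration.

def dominates(a, b):
    """True if a is at least as good as b in every component (lower is
    better) and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

sys_a = (120.0, 40.0, 10.0)   # (seconds, kWh, m^2), assumed
sys_b = (150.0, 55.0, 10.0)
sys_c = (100.0, 70.0, 8.0)

print(dominates(sys_a, sys_b))  # True: a is better or equal everywhere
print(dominates(sys_a, sys_c))  # False: c is faster and smaller
```

When neither system dominates, the metric deliberately refuses to rank them, which is the point of keeping the components separate.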

  18. Resilient Control Systems Practical Metrics Basis for Defining Mission Impact

    SciTech Connect

    Craig G. Rieger

    2014-08-01

“Resilience” describes how systems operate at an acceptable level of normalcy despite disturbances or threats. In this paper we first consider the cognitive and cyber-physical interdependencies inherent in critical infrastructure systems and how resilience differs from reliability in mitigating these risks. A terminology and metrics basis is provided to integrate the cognitive and cyber-physical aspects that should be considered when defining solutions for resilience. A practical approach is taken to roll this metrics basis up to system integrity and business case metrics that establish “proper operation” and “impact.” A notional chemical processing plant is the use case for demonstrating how the system integrity metrics can be applied to establish performance, and

  19. Weatherization Assistance Program Goals and Metrics | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

The U.S. Department of Energy (DOE) Weatherization Assistance Program (WAP) regularly reviews the work of states and grant recipients for effectiveness and for meeting program goals. DOE's Oak Ridge National Laboratory provides technical support to the program and conducts the evaluations. Goals: The overall goal of WAP is to reduce the

  20. New IEC Specifications Help Define Wind Plant Performance Reporting Metrics

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

January 6, 2014 - 10:00am. This is an excerpt from the Fourth Quarter 2013 edition of the Wind Program R&D Newsletter. The U.S. Department of Energy Wind Program and Sandia National Laboratories have been working with the International Electrotechnical Commission (IEC) Committee on wind turbine availability to

  1. Conceptual Framework for Developing Resilience Metrics for the Electricity,

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

Oil, and Gas Sectors in the United States (September 2015). This report has been written for the Department of Energy's Office of Electricity Delivery and Energy Reliability to support the Office of

  2. Enclosure - FY 2015 Q4 Metrics Report 2015-11-02.xlsx

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

Fourth Quarter Overall Root Cause Analysis (RCA)/Corrective Action Plan (CAP) Performance Metrics: Contract/Project Management Performance Metrics, FY 2015 Targets ...

  3. Microsoft Word - 2014-5-27 RCA Qtr 2 Metrics Attachment_R1

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

Second Quarter Overall Root Cause Analysis (RCA)/Corrective Action Plan (CAP) Performance Metrics: Contract/Project Management Performance Metric, FY 2014 Target, FY 2014 Projected ...

  4. Metrics Evolution in an Energy Research & Development Program

    SciTech Connect

    Brent Dixon

    2011-08-01

    All technology programs progress through three phases: Discovery, Definition, and Deployment. The form and application of program metrics needs to evolve with each phase. During the discovery phase, the program determines what is achievable. A set of tools is needed to define program goals, to analyze credible technical options, and to ensure that the options are compatible and meet the program objectives. A metrics system that scores the potential performance of technical options is part of this system of tools, supporting screening of concepts and aiding in the overall definition of objectives. During the definition phase, the program defines what specifically is wanted. What is achievable is translated into specific systems and specific technical options are selected and optimized. A metrics system can help with the identification of options for optimization and the selection of the option for deployment. During the deployment phase, the program shows that the selected system works. Demonstration projects are established and classical systems engineering is employed. During this phase, the metrics communicate system performance. This paper discusses an approach to metrics evolution within the Department of Energy's Nuclear Fuel Cycle R&D Program, which is working to improve the sustainability of nuclear energy.

  5. Metrics for Evaluating the Accuracy of Solar Power Forecasting: Preprint

    SciTech Connect

    Zhang, J.; Hodge, B. M.; Florita, A.; Lu, S.; Hamann, H. F.; Banunarayanan, V.

    2013-10-01

    Forecasting solar energy generation is a challenging task due to the variety of solar power systems and weather regimes encountered. Forecast inaccuracies can result in substantial economic losses and power system reliability issues. This paper presents a suite of generally applicable and value-based metrics for solar forecasting for a comprehensive set of scenarios (i.e., different time horizons, geographic locations, applications, etc.). In addition, a comprehensive framework is developed to analyze the sensitivity of the proposed metrics to three types of solar forecasting improvements using a design of experiments methodology, in conjunction with response surface and sensitivity analysis methods. The results show that the developed metrics can efficiently evaluate the quality of solar forecasts, and assess the economic and reliability impact of improved solar forecasting.
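Two of the generally applicable accuracy metrics of the kind such a suite includes can be sketched directly; the forecast and actual series below are invented, and this is not the paper's full metric set.

```python
# Toy computation of two standard forecast-accuracy metrics (MAE, RMSE)
# on invented solar plant data; not the paper's complete metric suite.
import math

def mae(forecast, actual):
    """Mean absolute error."""
    return sum(abs(f - a) for f, a in zip(forecast, actual)) / len(actual)

def rmse(forecast, actual):
    """Root-mean-square error; penalizes large misses more than MAE."""
    return math.sqrt(sum((f - a) ** 2 for f, a in zip(forecast, actual)) / len(actual))

forecast = [0.0, 12.0, 48.0, 80.0, 95.0]  # MW, hypothetical plant output
actual   = [0.0, 10.0, 55.0, 78.0, 90.0]

print(f"MAE  = {mae(forecast, actual):.1f} MW")   # MAE  = 3.2 MW
print(f"RMSE = {rmse(forecast, actual):.1f} MW")  # RMSE = 4.0 MW
```

The gap between RMSE and MAE hints at how much of the error comes from a few large misses, which is exactly the kind of distinction a value-based suite is meant to surface.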

  6. Non-minimal derivative couplings of the composite metric

    SciTech Connect

    Heisenberg, Lavinia

    2015-11-04

    In the context of massive gravity, bi-gravity and multi-gravity non-minimal matter couplings via a specific composite effective metric were investigated recently. Even if these couplings generically reintroduce the Boulware-Deser ghost, this composite metric is unique in the sense that the ghost reemerges only beyond the decoupling limit and the matter quantum loop corrections do not detune the potential interactions. We consider non-minimal derivative couplings of the composite metric to matter fields for a specific subclass of Horndeski scalar-tensor interactions. We first explore these couplings in the mini-superspace and investigate in which scenario the ghost remains absent. We further study these non-minimal derivative couplings in the decoupling-limit of the theory and show that the equation of motion for the helicity-0 mode remains second order in derivatives. Finally, we discuss preliminary implications for cosmology.

  7. Primer Control System Cyber Security Framework and Technical Metrics

    SciTech Connect

    Wayne F. Boyer; Miles A. McQueen

    2008-05-01

The Department of Homeland Security National Cyber Security Division supported development of a control system cyber security framework and a set of technical metrics to aid owner-operators in tracking control systems security. The framework defines seven relevant cyber security dimensions and provides the foundation for thinking about control system security. Based on the developed security framework, a set of ten technical metrics is recommended that allows control systems owner-operators to track improvements or degradations in their individual control systems' security posture.

  8. Enclosure - FY 2016 Q4 Metrics Report.xlsx

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

Fourth Quarter Overall Root Cause Analysis (RCA)/Corrective Action Plan (CAP) Performance Metrics: Contract/Project Management Performance Metrics, FY 2016 Targets. FY14-FY16: 4 completions through 4th Qtr. Based on 3-year rolling period (FY14 to FY16). Abbreviations: TPC: Total Project Cost; EVM: Earned Value Management; CD-3: Critical Decision-3, Approve Start of Construction/Execution; FPD: Federal Project Director; CD-1: Critical

  9. Calabi-Yau metrics for quotients and complete intersections

    DOE PAGES [OSTI]

    Braun, Volker; Brelidze, Tamaz; Douglas, Michael R.; Ovrut, Burt A.

    2008-05-22

    We extend previous computations of Calabi-Yau metrics on projective hypersurfaces to free quotients, complete intersections, and free quotients of complete intersections. In particular, we construct these metrics on generic quintics, four-generation quotients of the quintic, Schoen Calabi-Yau complete intersections and the quotient of a Schoen manifold with Z₃ x Z₃ fundamental group that was previously used to construct a heterotic standard model. Various numerical investigations into the dependence of Donaldson's algorithm on the integration scheme, as well as on the Kähler and complex structure moduli, are also performed.

  10. Culture, and a Metrics Methodology for Biological Countermeasure Scenarios

    SciTech Connect

    Simpson, Mary J.

    2007-03-15

Outcome Metrics Methodology defines a way to evaluate outcome metrics associated with scenario analyses related to biological countermeasures. Previous work developed a schema to allow evaluation of common elements of impacts across a wide range of potential threats and scenarios. Classes of metrics were identified that could be used by decision makers to differentiate the common bases among disparate scenarios. Typical impact metrics used in risk calculations include the anticipated number of deaths, casualties, and the direct economic costs should a given event occur. There are less obvious metrics that are often as important and require more intensive initial work to be incorporated. This study defines a methodology for quantifying, evaluating, and ranking metrics other than direct health and economic impacts. As has been observed with the consequences of Hurricane Katrina, impacts to the culture of specific sectors of society are less obvious on an immediate basis but equally important over the ensuing and long term. Culture is used as the example class of metrics within which:
• requirements for a methodology are explored,
• likely methodologies are examined,
• underlying assumptions for the respective methodologies are discussed, and
• the basis for recommending a specific methodology is demonstrated.
Culture, as a class of metrics, is shown to consist of political, sociological, and psychological elements that are highly valued by decision makers. In addition, cultural practices, dimensions, and kinds of knowledge offer complementary sets of information that contribute to the context within which experts can provide input. The quantification and evaluation of sociopolitical, socio-economic, and sociotechnical impacts depend predominantly on subjective, expert judgment. Epidemiological data is limited, resulting in samples with statistical limits. Dose response assessments and curves depend on the quality of data and its relevance to human modes of exposure

  11. Deep Energy Retrofit Performance Metric Comparison: Eight California Case Studies

    SciTech Connect

    Walker, Iain; Fisher, Jeremy; Less, Brennan

    2014-06-01

    In this paper we will present the results of monitored annual energy use data from eight residential Deep Energy Retrofit (DER) case studies using a variety of performance metrics. For each home, the details of the retrofits were analyzed, diagnostic tests to characterize the home were performed and the homes were monitored for total and individual end-use energy consumption for approximately one year. Annual performance in site and source energy, as well as carbon dioxide equivalent (CO2e) emissions were determined on a per house, per person and per square foot basis to examine the sensitivity to these different metrics. All eight DERs showed consistent success in achieving substantial site energy and CO2e reductions, but some projects achieved very little, if any source energy reduction. This problem emerged in those homes that switched from natural gas to electricity for heating and hot water, resulting in energy consumption dominated by electricity use. This demonstrates the crucial importance of selecting an appropriate metric to be used in guiding retrofit decisions. Also, due to the dynamic nature of DERs, with changes in occupancy, size, layout, and comfort, several performance metrics might be necessary to understand a project’s success.
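The site-vs-source sensitivity the study describes can be sketched numerically: an all-electric retrofit can cut site energy sharply while source energy barely falls. The site-to-source multipliers and usage figures below are assumed round numbers, not the study's monitored values.

```python
# Hedged sketch of site vs. source energy accounting for a fuel-switched
# deep energy retrofit. Multipliers and usage figures are assumptions.

SOURCE_FACTOR = {"electricity": 3.0, "gas": 1.1}  # site-to-source, assumed

def source_energy(use_by_fuel):
    """Convert per-fuel site energy (kWh) to total source energy (kWh)."""
    return sum(kwh * SOURCE_FACTOR[fuel] for fuel, kwh in use_by_fuel.items())

pre  = {"electricity": 6_000.0, "gas": 14_000.0}  # kWh/yr before retrofit
post = {"electricity": 10_000.0, "gas": 0.0}      # all-electric after DER

site_saving   = sum(pre.values()) - sum(post.values())
source_saving = source_energy(pre) - source_energy(post)
print(f"site saving:   {site_saving:,.0f} kWh/yr")    # 10,000
print(f"source saving: {source_saving:,.0f} kWh/yr")  # 3,400
```

A 50% site-energy reduction shrinks to roughly 10% at the source under these assumed factors, which is the metric-selection trap the paper warns about.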

  12. Metrics and Benchmarks for Energy Efficiency in Laboratories

    SciTech Connect

    Mathew, Paul

    2007-10-26

A wide spectrum of laboratory owners, ranging from universities to federal agencies, have explicit goals for energy efficiency in their facilities. For example, the Energy Policy Act of 2005 (EPACT 2005) requires all new federal buildings to exceed ASHRAE 90.1-2004 by at least 30 percent. The University of California Regents Policy requires all new construction to exceed California Title 24 by at least 20 percent. A new laboratory is much more likely to meet energy efficiency goals if quantitative metrics and targets are explicitly specified in programming documents and tracked during the course of the delivery process. If efficiency targets are not explicitly and properly defined, any additional capital costs or design time associated with attaining higher efficiencies can be difficult to justify. The purpose of this guide is to provide guidance on how to specify and compute energy efficiency metrics and benchmarks for laboratories, at the whole building as well as the system level. The information in this guide can be used to incorporate quantitative metrics and targets into the programming of new laboratory facilities. Many of these metrics can also be applied to evaluate existing facilities. For information on strategies and technologies to achieve energy efficiency, the reader is referred to Labs21 resources, including technology best practice guides, case studies, and the design guide (available at www.labs21century.gov/toolkit).

  13. EERE Portfolio. Primary Benefits Metrics for FY09

    SciTech Connect

    none,

    2011-11-01

    This collection of data tables shows the benefits metrics related to energy security, environmental impacts, and economic impacts for both the entire EERE portfolio of renewable energy technologies as well as the individual technologies. Data are presented for the years 2015, 2020, 2030, and 2050, for both the NEMS and MARKAL models.

  14. On the existence of certain axisymmetric interior metrics

    SciTech Connect

    Angulo Santacruz, C.; Batic, D.; Nowakowski, M.

    2010-08-15

    One of the effects of noncommutative coordinate operators is that the delta function connected to the quantum mechanical amplitude between states sharp to the position operator gets smeared by a Gaussian distribution. Although this is not the full account of the effects of noncommutativity, this effect is, in particular, important as it removes the point singularities of Schwarzschild and Reissner-Nordstroem solutions. In this context, it seems to be of some importance to probe also into ringlike singularities which appear in the Kerr case. In particular, starting with an anisotropic energy-momentum tensor and a general axisymmetric ansatz of the metric together with an arbitrary mass distribution (e.g., Gaussian), we derive the full set of Einstein equations that the noncommutative geometry inspired Kerr solution should satisfy. Using these equations we prove two theorems regarding the existence of certain Kerr metrics inspired by noncommutative geometry.

  15. Development of Technology Readiness Level (TRL) Metrics and Risk Measures

    SciTech Connect

    Engel, David W.; Dalton, Angela C.; Anderson, K. K.; Sivaramakrishnan, Chandrika; Lansing, Carina

    2012-10-01

    This is an internal project milestone report to document the CCSI Element 7 team's progress on developing Technology Readiness Level (TRL) metrics and risk measures. In this report, we provide a brief overview of the current technology readiness assessment research, document the development of technology readiness levels (TRLs) specific to carbon capture technologies, describe the risk measures and uncertainty quantification approaches used in our research, and conclude by discussing the next steps that the CCSI Task 7 team aims to accomplish.

  16. Optimal recovery of linear operators in non-Euclidean metrics

    SciTech Connect

    Osipenko, K Yu

    2014-10-31

The paper looks at problems concerning the recovery of operators from noisy information in non-Euclidean metrics. A number of general theorems are proved and applied to recovery problems for functions and their derivatives from the noisy Fourier transform. In some cases, a family of optimal methods is found, from which the methods requiring the least amount of original information are singled out. Bibliography: 25 titles.

  17. Microsoft Word - DOE_ANNUAL_METRICS_2009Q3.docx

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

Third Quarter 2009 Modeling Program Metric: Coupled model comparison with observations using improved dynamics at coarse resolution. Quantifying the impact of a finite volume dynamical core in CCSM3 on simulated precipitation over major catchment areas. July 2009. Peter J. Gleckler and Karl E. Taylor, Lawrence Livermore National Laboratory, Livermore, CA. Work supported by the U.S. Department of Energy, Office of Science, Office of Biological and Environmental Research.

  18. Modified Anti-de-Sitter Metric, Light-Front Quantized QCD, and...

    Office of Scientific and Technical Information (OSTI)

    Modified Anti-de-Sitter Metric, Light-Front Quantized QCD, and Conformal Quantum Mechanics Citation Details In-Document Search Title: Modified Anti-de-Sitter Metric, Light-Front...

  19. Microsoft Word - 2014-1-1 RCA Qtr 1 Metrics Attachment_R1

    Energy Saver

Contract/Project Management Performance Metric FY 2014 Target FY 2014 Projected FY 2014 ... Contract/Project Management Performance Metrics FY 2014 Target FY 2014 1st Qtr Actual ...

  20. FY 2012 Overall Contract and Project Management Improvement Performance Metrics and Targets

    Energy.gov [DOE]

    Overall Contract and Project Management Performance Metrics and Targets for FY 2012, first quarter through fourth quarter.

  1. FY 2014 Overall Contract and Project Management Improvement Performance Metrics and Targets

    Energy.gov [DOE]

    Overall Contract and Project Management Performance Metrics and Targets for FY 2014, first quarter through fourth quarter.

  2. FY 2011 Overall Contract and Project Management Improvement Performance Metrics and Targets

    Energy.gov [DOE]

    Overall Contract and Project Management Performance Metrics and Targets for FY 2011, first quarter through fourth quarter.

  3. FY 2010 Overall Contract and Project Management Improvement Performance Metrics and Targets

    Energy.gov [DOE]

    Overall Contract and Project Management Performance Metrics and Targets for FY 2010, first quarter through fourth quarter.

  4. FY 2016 Overall Contract and Project Management Improvement Performance Metrics and Targets

    Energy.gov [DOE]

    Overall Contract and Project Management Performance Metrics and Targets for FY 2016, first quarter through fourth quarter.

  5. Guidebook for ARRA Smart Grid Program Metrics and Benefits | Department of

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    Energy Guidebook for ARRA Smart Grid Program Metrics and Benefits Guidebook for ARRA Smart Grid Program Metrics and Benefits The Guidebook for American Recovery and Reinvestment Act (ARRA) Smart Grid Program Metrics and Benefits describes the type of information to be collected from each of the Project Teams and how it will be used by the Department of Energy to communicate overall conclusions to the public. Guidebook for ARRA Smart Grid Program Metrics and Benefits (975.03 KB) More

  6. Derivation of a Levelized Cost of Coating (LCOC) metric for evaluation of solar selective absorber materials

    SciTech Connect

    Ho, C. K.; Pacheco, J. E.

    2015-06-05

    A new metric, the Levelized Cost of Coating (LCOC), is derived in this paper to evaluate and compare alternative solar selective absorber coatings against a baseline coating (Pyromark 2500). In contrast to previous metrics that focused only on the optical performance of the coating, the LCOC includes costs, durability, and optical performance for more comprehensive comparisons among candidate materials. The LCOC is defined as the annualized marginal cost of the coating to produce a baseline annual thermal energy production. Costs include the cost of materials and labor for initial application and reapplication of the coating, as well as the cost of additional or fewer heliostats to yield the same annual thermal energy production as the baseline coating. Results show that important factors impacting the LCOC include the initial solar absorptance, thermal emittance, reapplication interval, degradation rate, reapplication cost, and downtime during reapplication. The LCOC can also be used to determine the optimal reapplication interval to minimize the levelized cost of energy production. As a result, similar methods can be applied more generally to determine the levelized cost of component for other applications and systems.
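A minimal sketch of the annualized-cost idea behind the LCOC follows, under invented inputs; the paper's actual formulation normalizes to the baseline coating's annual thermal energy production, which is folded into a single heliostat-adjustment term here.

```python
# Hedged sketch of the LCOC idea: annualize coating costs, including
# periodic reapplication and the cost (or credit) of extra/fewer heliostats
# needed to match the baseline's annual thermal output. Inputs are invented.

def lcoc(apply_cost, reapply_cost, reapply_interval_yr,
         heliostat_adjustment_cost, life_yr=30):
    """Annualized marginal cost of a candidate coating ($/yr)."""
    reapplications = life_yr / reapply_interval_yr - 1  # after initial coat
    total = (apply_cost
             + reapplications * reapply_cost
             + heliostat_adjustment_cost)  # negative if fewer heliostats
    return total / life_yr

# Candidate reapplied every 5 years; higher absorptance lets the plant
# shed heliostats, so the adjustment term is a credit.
print(f"LCOC = ${lcoc(200_000, 150_000, 5, -300_000):,.0f}/yr")
```

Varying the reapplication interval in this sketch shows the trade-off the paper exploits to find the interval that minimizes levelized cost.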

  7. Derivation of a Levelized Cost of Coating (LCOC) metric for evaluation of solar selective absorber materials

    DOE PAGES [OSTI]

    Ho, C. K.; Pacheco, J. E.

    2015-06-05

A new metric, the Levelized Cost of Coating (LCOC), is derived in this paper to evaluate and compare alternative solar selective absorber coatings against a baseline coating (Pyromark 2500). In contrast to previous metrics that focused only on the optical performance of the coating, the LCOC includes costs, durability, and optical performance for more comprehensive comparisons among candidate materials. The LCOC is defined as the annualized marginal cost of the coating to produce a baseline annual thermal energy production. Costs include the cost of materials and labor for initial application and reapplication of the coating, as well as the cost of additional or fewer heliostats to yield the same annual thermal energy production as the baseline coating. Results show that important factors impacting the LCOC include the initial solar absorptance, thermal emittance, reapplication interval, degradation rate, reapplication cost, and downtime during reapplication. The LCOC can also be used to determine the optimal reapplication interval to minimize the levelized cost of energy production. As a result, similar methods can be applied more generally to determine the levelized cost of component for other applications and systems.

  8. Evaluation of metrics and baselines for tracking greenhouse gas emissions trends: Recommendations for the California climate action registry

    SciTech Connect

    Price, Lynn; Murtishaw, Scott; Worrell, Ernst

    2003-06-01

Energy Commission (Energy Commission) related to the Registry in three areas: (1) assessing the availability and usefulness of industry-specific metrics, (2) evaluating various methods for establishing baselines for calculating GHG emissions reductions related to specific actions taken by Registry participants, and (3) establishing methods for calculating electricity CO2 emission factors. The third area of research was completed in 2002 and is documented in Estimating Carbon Dioxide Emissions Factors for the California Electric Power Sector (Marnay et al., 2002). This report documents our findings related to the first two areas of research. For the first area of research, the overall objective was to evaluate the metrics, such as emissions per economic unit or emissions per unit of production, that can be used to report GHG emissions trends for potential Registry participants. This research began with an effort to identify methodologies, benchmarking programs, inventories, protocols, and registries that use industry-specific metrics to track trends in energy use or GHG emissions, in order to determine what types of metrics have already been developed. The next step in developing industry-specific metrics was to assess the availability of data needed to determine metric development priorities. Berkeley Lab also determined the relative importance of different potential Registry participant categories in order to assess the availability of sectoral or industry-specific metrics, and then identified industry-specific metrics in use around the world. While a plethora of metrics was identified, no single metric was found that adequately tracks trends in GHG emissions while maintaining the confidentiality of data. As a result of this review, Berkeley Lab recommends the development of a GHG intensity index as a new metric for reporting and tracking GHG emissions trends. Such an index could provide an industry-specific metric for reporting and tracking GHG emissions trends to accurately

  9. Metrics for the National SCADA Test Bed Program

    SciTech Connect

    Craig, Philip A.; Mortensen, J.; Dagle, Jeffery E.

    2008-12-05

    The U.S. Department of Energy Office of Electricity Delivery and Energy Reliability (DOE-OE) National SCADA Test Bed (NSTB) Program is providing valuable inputs into the electric industry by performing topical research and development (R&D) to secure next generation and legacy control systems. In addition, the program conducts vulnerability and risk analysis, develops tools, and performs industry liaison, outreach and awareness activities. These activities will enhance the secure and reliable delivery of energy for the United States. This report will describe metrics that could be utilized to provide feedback to help enhance the effectiveness of the NSTB Program.

  10. Metrics for Evaluating the Accuracy of Solar Power Forecasting: Preprint

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

Metrics for Evaluating the Accuracy of Solar Power Forecasting: Preprint. J. Zhang, B.-M. Hodge, and A. Florita, National Renewable Energy Laboratory; S. Lu and H. F. Hamann, IBM TJ Watson Research Center; V. Banunarayanan, U.S. Department of Energy. To be presented at the 3rd International Workshop on Integration of Solar Power into Power Systems, London, England, October 21-22, 2013. Conference Paper NREL/CP-5500-60142, October 2013.

  11. User's Guide to the Energy Charting and Metrics Tool (ECAM)

    SciTech Connect

    Taasevigen, Danny J.; Koran, William

    2012-02-28

The intent of this user guide is to provide a brief description of the functionality of the Energy Charting and Metrics (ECAM) tool, including the expanded building re-tuning functionality developed for Pacific Northwest National Laboratory (PNNL). This document describes the tool's general functions and features, and offers detailed instructions for PNNL building re-tuning charts, a feature in ECAM intended to help building owners and operators look at trend data (recommended 15-minute time intervals) in a series of charts (both time series and scatter) to analyze air-handler, zone, and central plant information gathered from a building automation system (BAS).

  12. SU-E-I-71: Quality Assessment of Surrogate Metrics in Multi-Atlas-Based Image Segmentation

    SciTech Connect

    Zhao, T; Ruan, D

    2015-06-15

Purpose: With the ever-growing data of heterogeneous quality, relevance assessment of atlases becomes increasingly critical for multi-atlas-based image segmentation. However, there is no universally recognized best relevance metric and even a standard to compare amongst candidates remains elusive. This study, for the first time, designs a quantification to assess relevance metrics’ quality, based on a novel perspective of the metric as surrogate for inferring the inaccessible oracle geometric agreement. Methods: We first develop an inference model to relate surrogate metrics in image space to the underlying oracle relevance metric in segmentation label space, with a monotonically non-decreasing function subject to random perturbations. Subsequently, we investigate model parameters to reveal key contributing factors to surrogates’ ability in prognosticating the oracle relevance value, for the specific task of atlas selection. Finally, we design an effective contrast-to-noise ratio (eCNR) to quantify surrogates’ quality based on insights from these analyses and empirical observations. Results: The inference model was specialized to a linear function with normally distributed perturbations, with surrogate metric exemplified by several widely-used image similarity metrics, i.e., MSD/NCC/(N)MI. Surrogates’ behaviors in selecting the most relevant atlases were assessed under varying eCNR, showing that surrogates with high eCNR dominated those with low eCNR in retaining the most relevant atlases. In an end-to-end validation, NCC/(N)MI with eCNR of 0.12 resulted in statistically better segmentation, with mean DSC of about 0.85 and first and third quartiles of (0.83, 0.89), compared to MSD with eCNR of 0.10, mean DSC of 0.84, and first and third quartiles of (0.81, 0.89). Conclusion: The designed eCNR is capable of characterizing surrogate metrics’ quality in prognosticating the oracle relevance value. It has been demonstrated to be

  13. Projections of Full-Fuel-Cycle Energy and Emissions Metrics

    SciTech Connect

    Coughlin, Katie

    2013-01-01

    To accurately represent how conservation and efficiency policies affect energy demand, both direct and indirect impacts need to be included in the accounting. The indirect impacts are defined here as the resource savings that accrue over the fuel production chain, which, when added to the energy consumed at the point of use, constitute the full-fuel-cycle (FFC) energy. This paper uses the accounting framework developed in (Coughlin 2012) to calculate FFC energy metrics as time series for the period 2010-2040. The approach is extended to define FFC metrics for the emissions of greenhouse gases (GHGs) and other airborne pollutants. The primary focus is the types of energy used in buildings and industrial processes, mainly natural gas and electricity. The analysis includes a discussion of the fuel production chain for coal, which is used extensively for electric power generation, and for diesel and fuel oil, which are used in mining, oil and gas operations, and fuel distribution. Estimates of the energy intensity parameters make use of data and projections from the Energy Information Administration's National Energy Modeling System, with calculations based on information from the Annual Energy Outlook 2012.
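    The FFC accounting idea above can be sketched in a few lines of Python; the upstream-intensity multipliers below are hypothetical placeholders, not the paper's estimates.

```python
def full_fuel_cycle(site_energy, upstream_intensity):
    """FFC energy: energy consumed at the point of use plus the indirect
    energy expended over the fuel production chain, expressed here via a
    dimensionless upstream intensity (hypothetical values)."""
    return site_energy * (1.0 + upstream_intensity)

# Illustrative multipliers (not the paper's figures): natural gas upstream
# losses are modest; electricity carries large generation/transmission losses.
print(round(full_fuel_cycle(100.0, 0.09), 1))  # 109.0 units for natural gas
print(round(full_fuel_cycle(100.0, 2.0), 1))   # 300.0 units for electricity
```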

  14. Conceptual Soundness, Metric Development, Benchmarking, and Targeting for PATH Subprogram Evaluation

    SciTech Connect

    Mosey, G.; Doris, E.; Coggeshall, C.; Antes, M.; Ruch, J.; Mortensen, J.

    2009-01-01

    The objective of this study is to evaluate the conceptual soundness of the U.S. Department of Housing and Urban Development (HUD) Partnership for Advancing Technology in Housing (PATH) program's revised goals and establish and apply a framework to identify and recommend metrics that are the most useful for measuring PATH's progress. This report provides an evaluative review of PATH's revised goals, outlines a structured method for identifying and selecting metrics, proposes metrics and benchmarks for a sampling of individual PATH programs, and discusses other metrics that potentially could be developed that may add value to the evaluation process. The framework and individual program metrics can be used for ongoing management improvement efforts and to inform broader program-level metrics for government reporting requirements.

  15. CEM_Metrics_and_Technical_Note_7_14_10.pdf | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    CEM_Metrics_and_Technical_Note_7_14_10.pdf (129.47 KB). More Documents & Publications: SEAD-Fact-Sheet.pdf; Heat Pump Clothes Dryer (schematic credit: Oak Ridge National Lab); Wind Vision: A New Era for Wind Power in the United States

  16. EECBG 10-07C/SEP 10-006B Attachment 1: Process Metrics List |

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    Department of Energy. EECBG 10-07C/SEP 10-006B Attachment 1: Process Metrics List. eecbg_sep_reporting_guidance_attachment_06242011.pdf (56.65 KB). More Documents & Publications: EECBG SEP Attachment 1 - Process metric list; EECBG Program Notice 10-07A; DOE Recovery Act Reporting Requirements for the State Energy Program

  17. Taking the One-Metric-Ton Challenge | Y-12 National Security Complex

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

    Taking the One-Metric-Ton Challenge. Posted: January 13, 2016 - 4:46pm. NNSA Uranium Program Manager Tim Driscoll speaks with the One-Metric-Ton Challenge team in Building 9212. The team has undertaken an extensive dedicated maintenance effort to improve metal production equipment reliability and reduce unexpected downtime, with an end goal of significantly increasing purified metal production by fiscal year 2017. Last year, NNSA Uranium Program Manager Tim Driscoll

  18. DOE Will Dispose of 34 Metric Tons of Plutonium by Turning it into Fuel for

    National Nuclear Security Administration (NNSA)

    Washington, DC. Secretary Abraham announced that DOE will dispose of 34 metric tons of surplus weapons-grade plutonium by turning the material into mixed oxide (MOX) fuel for use in nuclear reactors. The decision follows an exhaustive Administration review

  19. Metrics for Assessment of Smart Grid Data Integrity Attacks

    SciTech Connect

    Annarita Giani; Miles McQueen; Russell Bent; Kameshwar Poolla; Mark Hinrichs

    2012-07-01

    There is an emerging consensus that the nation's electricity grid is vulnerable to cyber attacks. This vulnerability arises from the increasing reliance on remote measurements, transmitted over legacy data networks to system operators who make critical decisions based on the available data. Data integrity attacks are a class of cyber attacks that involve a compromise of information that is processed by the grid operator. This information can include meter readings of injected power at remote generators, power flows on transmission lines, and relay states. These data integrity attacks have consequences only when the system operator responds to compromised data by redispatching generation under normal or contingency protocols. These consequences include (a) financial losses from sub-optimal economic dispatch to service loads, (b) robustness/resiliency losses from placing the grid at operating points that are at greater risk from contingencies, and (c) systemic losses resulting from cascading failures induced by poor operational choices. This paper is focused on understanding the connections between grid operational procedures and cyber attacks. We first offer two examples to illustrate how data integrity attacks can cause economic and physical damage by misleading operators into making inappropriate decisions. We then focus on unobservable data integrity attacks involving power meter data. These are coordinated attacks in which the compromised data are consistent with the physics of power flow and are therefore passed by any bad data detection algorithm. We develop metrics to assess the economic impact of these attacks under re-dispatch decisions using optimal power flow methods. These metrics can be used to prioritize the adoption of appropriate countermeasures, including PMU placement, encryption, hardware upgrades, and advanced attack detection algorithms.
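    The "unobservable attack" idea can be illustrated with a toy DC state-estimation example: a least-squares bad-data test flags a naive single-meter tamper but passes any attack vector lying in the column space of the measurement Jacobian (a = Hc). The matrix values here are random stand-ins, not a real grid model.

```python
import numpy as np

rng = np.random.default_rng(1)

# DC state estimation: measurements z = H x + noise
m, n = 8, 3                      # 8 meters, 3 state variables
H = rng.normal(size=(m, n))      # measurement Jacobian (toy values)
z = H @ rng.normal(size=n) + rng.normal(0.0, 0.01, size=m)

def residual_norm(z, H):
    """Bad-data detection statistic: ||z - H x_hat|| after least-squares
    state estimation."""
    x_hat = np.linalg.lstsq(H, z, rcond=None)[0]
    return np.linalg.norm(z - H @ x_hat)

base = residual_norm(z, H)

naive_attack = z.copy()
naive_attack[0] += 5.0                               # tamper with one meter
stealth_attack = z + H @ np.array([1.0, -2.0, 0.5])  # a = H c, in col(H)

print(residual_norm(naive_attack, H) > 10 * base)          # True: detected
print(np.isclose(residual_norm(stealth_attack, H), base))  # True: passes
```

    The stealthy attack leaves the residual unchanged because its injected vector lies entirely in the column space of H, so the least-squares fit absorbs it into the state estimate.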

  20. Variable-metric diffraction crystals for x-ray optics

    SciTech Connect

    Smither, R.K.; Fernandez, P.B.

    1992-02-01

    A variable-metric (VM) crystal is one in which the spacing between the crystalline planes changes with position in the crystal. This variation can be either parallel to the crystalline planes or perpendicular to the crystalline planes of interest and can be produced either by introducing a thermal gradient in the crystal or by growing a crystal made of two or more elements and changing the relative percentages of the two elements as the crystal is grown. A series of experiments were performed in the laboratory to demonstrate the principle of the variable-metric crystal and its potential use in synchrotron beam lines. One of the most useful applications of the VM crystal is to increase the number of photons per unit bandwidth in a diffracted beam without losing any of the overall intensity. In a normal synchrotron beam line that uses a two-crystal monochromator, the bandwidth of the diffracted photon beam is determined by the vertical opening angle of the beam, which is typically 0.10-0.30 mrad or 20-60 arcsec. When the VM crystal approach is applied, the bandwidth of the beam can be made as narrow as the rocking curve of the diffracting crystal, which is typically 0.005-0.050 mrad or 1-10 arcsec. Thus a very large increase of photons per unit bandwidth (or per unit energy) can be achieved through the use of VM crystals. When the VM principle is used with bent crystals, new kinds of x-ray optical elements can be generated that can focus and defocus x-ray beams much like simple lenses where the focal length of the lens can be changed to match its application. Thus both large magnifications and large demagnifications can be achieved as well as parallel beams with narrow bandwidths.
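    The bandwidth argument follows from differentiating Bragg's law, dE/E = cot(theta_B) * dtheta: shrinking the accepted angular width from the beam divergence down to the rocking-curve width narrows the bandwidth by the same factor. A sketch with illustrative numbers (the Bragg angle is an assumed example value, not from the paper):

```python
import math

def bandwidth(dtheta_rad, theta_bragg_rad):
    """Relative bandwidth dE/E from differentiating Bragg's law:
    dE/E = cot(theta_B) * dtheta."""
    return dtheta_rad / math.tan(theta_bragg_rad)

theta_b = math.radians(14.3)   # illustrative Bragg angle, e.g. Si(111)
beam_divergence = 0.20e-3      # 0.20 mrad vertical opening angle
rocking_curve = 0.020e-3       # 0.020 mrad crystal rocking-curve width

# A conventional monochromator accepts the full divergence; a VM crystal
# can narrow the bandwidth to the rocking-curve limit (10x here).
print(round(bandwidth(beam_divergence, theta_b)
            / bandwidth(rocking_curve, theta_b)))  # 10
```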

  1. FY 2015 Q1 Metrics Supporting Documentation 2015-02-09.xls

    Energy.gov [DOE] (indexed site)

    Contract/Project Management Performance Metrics FY 2015 Target FY 2015 Pre- & Post- CAP* Forecast Comment 1 Capital Asset Project Success: Complete 90% of capital asset projects at ...

  2. Enclosure - FY 2015 Q3 Metrics Report 2015-08-12.xlsx

    Energy.gov [DOE] (indexed site)

    Contract/Project Management Performance Metrics FY 2015 Target FY 2015 Pre- & Post- CAP* Forecast Comment 1 Capital Asset Project Management Success: Complete 90% of capital asset ...

  3. (SSS)GAO Metrics - Project Success 2015-04-29 1100.xls

    Energy.gov [DOE] (indexed site)

    Contract/Project Management Performance Metrics FY 2015 Target FY 2015 Pre- & Post- CAP* Forecast Comment 1 Capital Asset Project Success: Complete 90% of capital asset projects at ...

  4. New Selection Metric for Design of Thin-Film Solar Cell Absorber...

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

    New Selection Metric for Design of Thin-Film Solar Cell Absorber Materials Research Details * SLME accounts for the physics of absorption, emission, and recombination by directly ...

  5. GPRA 2003 quality metrics methodology and results: Office of Industrial Technologies

    SciTech Connect

    None, None

    2002-04-19

    This report describes the results, calculations, and assumptions underlying the GPRA 2003 Quality Metrics results for all Planning Units within the Office of Industrial Technologies.

  6. Building Cost and Performance Metrics: Data Collection Protocol, Revision 1.0

    SciTech Connect

    Fowler, Kimberly M.; Solana, Amy E.; Spees, Kathleen L.

    2005-09-29

    This technical report describes the process for selecting and applying the building cost and performance metrics for measuring sustainably designed buildings in comparison to traditionally designed buildings.

  7. Microsoft PowerPoint - Snippet 3.2 Schedule Health Metrics 20140713...

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    ... available software. These metrics can be quickly reviewed each month to identify any schedule health risks on your project, whether you are the contractor or the customer. ...

  8. Enhanced Accident Tolerant LWR Fuels National Metrics Workshop Report

    SciTech Connect

    Lori Braase

    2013-01-01

    Commercialization. The activities performed during the feasibility assessment phase include laboratory scale experiments; fuel performance code updates; and analytical assessment of economic, operational, safety, fuel cycle, and environmental impacts of the new concepts. The development and qualification stage will consist of fuel fabrication and large scale irradiation and safety basis testing, leading to qualification and ultimate NRC licensing of the new fuel. The commercialization phase initiates technology transfer to industry for implementation. Attributes for fuels with enhanced accident tolerance include improved reaction kinetics with steam and slower hydrogen generation rate, while maintaining acceptable cladding thermo-mechanical properties; fuel thermo-mechanical properties; fuel-clad interactions; and fission-product behavior. These attributes provide a qualitative guidance for parameters that must be considered in the development of fuels and cladding with enhanced accident tolerance. However, quantitative metrics must be developed for these attributes. To initiate the quantitative metrics development, a Light Water Reactor Enhanced Accident Tolerant Fuels Metrics Development Workshop was held October 10-11, 2012, in Germantown, Maryland. This document summarizes the structure and outcome of the two-day workshop. Questions regarding the content can be directed to Lori Braase, 208-526-7763, lori.braase@inl.gov.

  9. Impact of Different Economic Performance Metrics on the Perceived Value of Solar Photovoltaics

    SciTech Connect

    Drury, E.; Denholm, P.; Margolis, R.

    2011-10-01

    Photovoltaic (PV) systems are installed by several types of market participants, ranging from residential customers to large-scale project developers and utilities. Each type of market participant frequently uses a different economic performance metric to characterize PV value because they are looking for different types of returns from a PV investment. This report finds that different economic performance metrics frequently show different price thresholds for when a PV investment becomes profitable or attractive. Several project parameters, such as financing terms, can have a significant impact on some metrics [e.g., internal rate of return (IRR), net present value (NPV), and benefit-to-cost (B/C) ratio] while having a minimal impact on other metrics (e.g., simple payback time). As such, the choice of economic performance metric by different customer types can significantly shape each customer's perception of PV investment value and ultimately their adoption decision.
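    The report's point that discount-sensitive metrics (NPV, IRR) and simple payback can disagree is easy to see numerically; the system cost, savings, and discount rates below are illustrative, not the report's figures.

```python
def npv(rate, cashflows):
    """Net present value of cashflows[t] at period t (t=0 is the investment)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def simple_payback(cost, annual_saving):
    """Years to recover the up-front cost, ignoring discounting entirely."""
    return cost / annual_saving

# Toy PV project: $10,000 system, $800/yr electricity savings, 25-year life
cost, saving, life = 10_000.0, 800.0, 25
flows = [-cost] + [saving] * life

print(simple_payback(cost, saving))  # 12.5 years, regardless of financing
print(npv(0.03, flows) > 0)          # True: attractive at a 3% discount rate
print(npv(0.08, flows) > 0)          # False: unattractive at 8%
```

    The same physical project looks profitable or unprofitable depending on the discount rate, while simple payback never changes, which is exactly why metric choice shapes adoption decisions.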

  10. Sensitivity of Multi-gas Climate Policy to Emission Metrics

    SciTech Connect

    Smith, Steven J.; Karas, Joseph F.; Edmonds, James A.; Eom, Jiyong; Mizrahi, Andrew H.

    2013-04-01

    Multi-gas greenhouse emission targets require that different emissions be combined into an aggregate total. The Global Warming Potential (GWP) index is currently used for this purpose, despite various criticisms of the underlying concept. It is not possible to uniquely define a single metric that perfectly captures the different impacts of emissions of substances with widely disparate atmospheric lifetimes, which leads to a wide range of possible index values. We examine the sensitivity of emissions and climate outcomes to the value of the index used to aggregate methane emissions using a technologically detailed integrated assessment model. We find that the sensitivity to index value is of order 4-14% in terms of methane emissions and 2% in terms of total radiative forcing, using index values between 4 and 70 for methane, with larger regional differences in some cases. The sensitivity to index value is much higher in economic terms, with total 2-gas mitigation cost decreasing 4-5% for a lower index and increasing 10-13% for a larger index, with even larger changes if the emissions reduction targets are small. The sensitivity to index value also depends on the assumed maximum amount of mitigation available in each sector. Evaluation of the maximum mitigation potential for major sources of non-CO2 greenhouse gases would greatly aid analysis
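    Aggregating a two-gas inventory under the index extremes the study sweeps (4 to 70 for methane) can be sketched as follows; the inventory numbers are made up for illustration.

```python
def co2_equivalent(co2_mt, ch4_mt, ch4_index):
    """Aggregate a multi-gas inventory into CO2-equivalent metric tons using
    a single exchange index for methane (GWP-style; the paper sweeps 4-70)."""
    return co2_mt + ch4_index * ch4_mt

inventory = {"co2_mt": 5000.0, "ch4_mt": 20.0}
low = co2_equivalent(**inventory, ch4_index=4)
high = co2_equivalent(**inventory, ch4_index=70)

# The same physical emissions aggregate very differently under the two indices,
# which is what drives the mitigation-cost sensitivity reported above.
print(low, high)  # 5080.0 6400.0
```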

  11. Proceedings of the 2009 Performance Metrics for Intelligent Systems Workshop

    SciTech Connect

    Madhavan, Raj; Messina, Elena

    2009-09-01

    The Performance Metrics for Intelligent Systems (PerMIS) workshop is dedicated to defining measures and methodologies of evaluating performance of intelligent systems. As the only workshop of its kind, PerMIS has proved to be an excellent forum for sharing lessons learned and discussions as well as fostering collaborations between researchers and practitioners from industry, academia and government agencies. The main theme of the ninth iteration of the workshop, PerMIS'09, seeks to address the question: 'Does performance measurement accelerate the pace of advancement for intelligent systems?' In addition to the main theme, as in previous years, the workshop will focus on applications of performance measures to practical problems in commercial, industrial, homeland security, and military applications. The PerMIS'09 program consists of six plenary addresses and six general and special sessions. The topics that are to be discussed by the speakers cover a wide array of themes centered on many intricate facets of intelligent system research. The presentations will emphasize and showcase the interdisciplinary nature of intelligent systems research and why it is not straightforward to evaluate such interconnected system of systems. The three days of twelve sessions will span themes from manufacturing, mobile robotics, human-system interaction, theory of mind, testing and evaluation of unmanned systems, to name a few.

  12. Standard metrics and methods for conducting Avian/wind energy interaction studies

    SciTech Connect

    Anderson, R.L.; Davis, H.; Kendall, W.

    1997-12-31

    The awareness of the problem of avian fatalities at large scale wind energy developments first emerged in the late 1980s at the Altamont Pass Wind Resource Area (WRA) in Central California. Observations of dead raptors at the Altamont Pass WRA triggered concern on the part of regulatory agencies, environmental/conservation groups, resource agencies, and the wind and electric utility industries. This led the California Energy Commission staff, along with the planning departments of Alameda, Contra Costa, and Solano counties, to commission a study of bird mortality at the Altamont Pass WRA. In addition to the Altamont Pass WRA, other studies and observations have established that windplants kill birds. Depending upon the specific factors, this may or may not be a serious problem. The current level of scrutiny and caution exhibited during the permitting of a new windplant development in the United States results in costly delays and studies. This is occurring during a highly competitive period for electrical production companies in the USA. Clarification of the bird fatality issue is needed to bring it into perspective. This means standardizing metrics, defining terms, and recommending methods to be used in addressing or studying wind energy/bird interactions.

  13. 13,279,806 Metric Tons of CO2 Injected as of October 3, 2016 | Department

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    This carbon dioxide (CO2) has been injected in the United States as part of DOE's Clean Coal Research, Development, and Demonstration Programs. One million metric tons of CO2 is equivalent to the annual greenhouse gas emissions from 210,526 passenger vehicles. The projects currently injecting CO2 within DOE's Regional Carbon Sequestration Partnership Program and

  14. Large-scale seismic waveform quality metric calculation using Hadoop

    DOE PAGES [OSTI]

    Magana-Zook, Steven; Gaylord, Jessie M.; Knapp, Douglas R.; Dodge, Douglas A.; Ruppert, Stanley D.

    2016-05-27

    In this work we investigated the suitability of Hadoop MapReduce and Apache Spark for large-scale computation of seismic waveform quality metrics by comparing their performance with that of a traditional distributed implementation. The Incorporated Research Institutions for Seismology (IRIS) Data Management Center (DMC) provided 43 terabytes of broadband waveform data, of which 5.1 TB were processed with the traditional architecture and the full 43 TB were processed using MapReduce and Spark. Maximum performance of ~0.56 terabytes per hour was achieved using all 5 nodes of the traditional implementation. We noted that I/O dominated processing, and that I/O performance was deteriorating with the addition of the 5th node. Data collected from this experiment provided the baseline against which the Hadoop results were compared. Next, we processed the full 43 TB dataset using both MapReduce and Apache Spark on our 18-node Hadoop cluster. We conducted these experiments multiple times with various subsets of the data so that we could build models to predict performance as a function of dataset size. We found that both MapReduce and Spark significantly outperformed the traditional reference implementation. At a dataset size of 5.1 terabytes, both Spark and MapReduce were about 15 times faster than the reference implementation. Furthermore, our performance models predict that for a dataset of 350 terabytes, Spark running on a 100-node cluster would be about 265 times faster than the reference implementation. We do not expect that the reference implementation deployed on a 100-node cluster would perform significantly better than on the 5-node cluster because the I/O performance cannot be made to scale.
Finally, we note that although Big Data technologies clearly provide a way to process seismic waveform datasets in a high-performance and scalable manner, the technology is still rapidly changing, requires a high degree of investment in personnel, and will
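    A back-of-envelope check of the reported throughput numbers (~0.56 TB/hr for the 5-node reference, ~15x for Spark at 5.1 TB) shows the scale of the saving:

```python
def hours_to_process(tb, tb_per_hour):
    """Wall-clock hours to process a dataset at a fixed throughput."""
    return tb / tb_per_hour

reference_rate = 0.56                 # TB/hr, 5-node traditional pipeline (from the paper)
spark_rate_5tb = 15 * reference_rate  # implied by the reported ~15x speedup at 5.1 TB

print(round(hours_to_process(5.1, reference_rate), 1))  # 9.1 hours, reference
print(round(hours_to_process(5.1, spark_rate_5tb), 2))  # 0.61 hours with Spark
```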

  15. Supporting analysis and assessments quality metrics: Utility market sector

    SciTech Connect

    Ohi, J.

    1996-10-01

    In FY96, NREL was asked to coordinate all analysis tasks so that in FY97 these tasks will be part of an integrated analysis agenda that will begin to define a 5-15 year R&D roadmap and portfolio for the DOE Hydrogen Program. The purpose of the Supporting Analysis and Assessments task at NREL is to provide this coordination and conduct specific analysis tasks. One of these tasks is to prepare the Quality Metrics (QM) for the Program as part of the overall QM effort at DOE/EERE. The Hydrogen Program is one of 39 program planning units conducting QM, a process begun in FY94 to assess the benefits and costs of DOE/EERE programs. The purpose of QM is to inform decision-making during the budget formulation process by describing the expected outcomes of programs during the budget request process. QM is expected to establish a first step toward merit-based budget formulation and allow DOE/EERE to get "the most bang for its (R&D) buck." In FY96, NREL coordinated a QM team that prepared a preliminary QM for the utility market sector. In the electricity supply sector, the QM analysis shows hydrogen fuel cells capturing 5% (or 22 GW) of the total market of 390 GW of new capacity additions through 2020. Hydrogen consumption in the utility sector increases from 0.009 Quads in 2005 to 0.4 Quads in 2020. Hydrogen fuel cells are projected to displace over 0.6 Quads of primary energy in 2020. In future work, NREL will assess the market for decentralized, on-site generation; develop cost credits for distributed generation benefits (such as deferral of transmission and distribution investments and uninterruptible power service), for by-products such as heat and potable water, and for environmental benefits (reduction of criteria air pollutants and greenhouse gas emissions); compete different fuel cell technologies against each other for market share; and begin to address economic benefits, especially employment.

  16. 12,877,644 Metric Tons of CO2 Injected as of July 1, 2016

    Energy.gov [DOE]

    This carbon dioxide (CO2) has been injected in the United States as part of DOE’s Clean Coal Research, Development, and Demonstration Programs. One million metric tons of CO2 is equivalent to the...

  17. 11,202,720 Metric Tons of CO2 Injected as of October 14, 2015...

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    This carbon dioxide (CO2) has been injected in the United States as part of DOE's Clean Coal Research, Development, and Demonstration Programs. One million metric tons of CO2 is ...

  18. Enclosure - FY 2016 Q1 Metrics Report 2016-02-11.xlsx

    Energy.gov [DOE] (indexed site)

    No. Contract/Project Management Performance Metrics FY 2016 Target No. 2 3 4 5 6 7 Comment FY 2016 Forecast Certified Contracting Staff: By the end of FY 2011, 85% of the 1102 ...

  19. EAC Presentation: Metrics and Benefits Analysis for the ARRA Smart Grid Programs- March 10, 2011

    Energy.gov [DOE]

    PowerPoint presentation by Joe Paladino from the Office of Electricity Delivery and Energy Reliability before the Electricity Advisory Committee (EAC) on metrics and benefits analysis for the...

  20. Modified Anti-de-Sitter Metric, Light-Front Quantized QCD, and...

    Office of Scientific and Technical Information (OSTI)

    Modified Anti-de-Sitter Metric, Light-Front Quantized QCD, and Conformal Quantum Mechanics Dosch, Hans Gunter; U. Heidelberg, ITP; Brodsky, Stanley J.; SLAC; de Teramond, Guy F.;...

  1. 11,202,720 Metric Tons of CO2 Injected as of October 14, 2015

    Office of Energy Efficiency and Renewable Energy (EERE)

    This carbon dioxide (CO2) has been injected in the United States as part of DOE's Clean Coal Research, Development, and Demonstration Programs. One million metric tons of CO2 is equivalent to the...

  2. FY 2014 Q3 RCA CAP Performance Metrics Report 2014-09-05.xlsx

    Energy Saver

    Contract/Project Management Performance Metrics FY 2014 Target FY 2014 Pre- & Post- CAP* ... TPC is Total Project Cost. No. FY 2014 Target FY 2014 3rd Qtr Actual 2 95% 92% 3 95% ...

  3. FY 2014 Q4 Metrics Report 2014-11-06.xlsx

    Energy Saver

    Contract/Project Management Performance Metrics FY 2014 Target FY 2014 Pre- & Post- CAP* ... TPC is Total Project Cost. No. FY 2014 Target FY 2014 4th Qtr Actual 2 95% 89% 3 95% ...

  4. Metrics for Developing an Endorsed Set of Radiographic Threat Surrogates for JINII/CAARS

    SciTech Connect

    Wurtz, R; Walston, S; Dietrich, D; Martz, H

    2009-02-11

    CAARS (Cargo Advanced Automated Radiography System) is developing x-ray dual energy and x-ray backscatter methods to automatically detect materials that are greater than Z=72 (hafnium). This works well for simple-geometry materials, where most of the radiographic path is through one material. However, this is usually not the case. Instead, the radiographic path includes many materials of different lengths. Single energy can be used to compute μl, which is related to areal density (mass per unit area), while dual energy yields more information. This report describes a set of metrics suitable and sufficient for characterizing the appearance of assemblies as detected by x-ray radiographic imaging systems, such as those being tested by Joint Integrated Non-Intrusive Inspection (JINII) or developed under CAARS. These metrics will be simulated both for threat assemblies and surrogate threat assemblies (such as are found in Roney et al. 2007) using geometrical and compositional information of the assemblies. The imaging systems are intended to distinguish assemblies containing high-Z material from those containing low-Z material, regardless of thickness, density, or compounds and mixtures. The systems in question operate on the principle of comparing images obtained by using two different x-ray end-point energies, so-called 'dual energy' imaging systems. At the direction of the DHS JINII sponsor, this report does not cover metrics that implement scattering, in the form of either forward-scattered radiation or high-Z detection systems operating on the principle of backscatter detection. Such methods and effects will be covered in a later report. The metrics described here are to be used to compare assemblies and not x-ray radiography systems. We intend to use these metrics to determine whether two assemblies do or do not look the same.
We are tasked to develop a set of assemblies whose appearance using this class of detection systems is indistinguishable from the

  5. Method for Confidence Metric in Optic Disk Location in Retinal Images -

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

    Energy Innovation Portal Method for Confidence Metric in Optic Disk Location in Retinal Images Oak Ridge National Laboratory Contact ORNL About This Technology Technology Marketing Summary To improve accuracy in diagnosis of retinal disease, ORNL researchers invented a method for assigning a confidence metric to computer-aided optic disc analysis. The physical condition of the optic disk determines the presence of various ophthalmic pathologies, including glaucoma and diabetic retinopathy.

  6. Implementing the Data Center Energy Productivity Metric in a High Performance Computing Data Center

    SciTech Connect

    Sego, Landon H.; Marquez, Andres; Rawson, Andrew; Cader, Tahir; Fox, Kevin M.; Gustafson, William I.; Mundy, Christopher J.

    2013-06-30

    As data centers proliferate in size and number, the improvement of their energy efficiency and productivity has become an economic and environmental imperative. Making these improvements requires metrics that are robust, interpretable, and practical. We discuss the properties of a number of the proposed metrics of energy efficiency and productivity. In particular, we focus on the Data Center Energy Productivity (DCeP) metric, which is the ratio of useful work produced by the data center to the energy consumed performing that work. We describe our approach for using DCeP as the principal outcome of a designed experiment using a highly instrumented, high-performance computing data center. We found that DCeP was successful in clearly distinguishing different operational states in the data center, thereby validating its utility as a metric for identifying configurations of hardware and software that would improve energy productivity. We also discuss some of the challenges and benefits associated with implementing the DCeP metric, and we examine the efficacy of the metric in making comparisons within a data center and between data centers.
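    DCeP itself is just a ratio; the subtlety the paper addresses is defining and instrumenting "useful work." A minimal sketch with hypothetical task counts and energy readings:

```python
def dcep(useful_work, energy_kwh):
    """Data Center Energy Productivity: useful work produced divided by the
    energy consumed producing it (the unit of work is workload-defined)."""
    if energy_kwh <= 0:
        raise ValueError("energy must be positive")
    return useful_work / energy_kwh

# Two hypothetical configurations completing the same assessment window:
# the same work at lower energy yields a higher DCeP.
print(dcep(useful_work=1.2e6, energy_kwh=4.0e3))  # 300.0 tasks/kWh
print(dcep(useful_work=1.2e6, energy_kwh=3.0e3))  # 400.0 tasks/kWh
```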

  7. Geothermal Plant Capacity Factors

    SciTech Connect

    Greg Mines; Jay Nathwani; Christopher Richard; Hillary Hanson; Rachel Wood

    2015-01-01

    The capacity factors recently provided by the Energy Information Administration (EIA) indicated this plant performance metric had declined for geothermal power plants since 2008. Though capacity factor is a term commonly used by geothermal stakeholders to express the ability of a plant to produce power, it is frequently misunderstood and in some instances incorrectly used. In this paper we discuss how capacity factor is defined and utilized by the EIA, including discussion of the information that the EIA requests from operators in its Forms 923 and 860, which are submitted both monthly and annually by geothermal operators. A discussion is also provided regarding the entities utilizing the information in the EIA reports, and how those entities can misinterpret the data being supplied by the operators. The intent of the paper is to inform facility operators as to the importance of the accuracy of the data that they provide, and the implications of not providing the correct information.
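    Capacity factor in the EIA sense is actual net generation divided by the maximum possible generation over the period; a sketch with made-up plant numbers:

```python
def capacity_factor(net_generation_mwh, nameplate_mw, hours):
    """Capacity factor: actual net generation over the maximum possible
    generation (nameplate capacity times hours in the period)."""
    return net_generation_mwh / (nameplate_mw * hours)

# Hypothetical 40 MW geothermal plant generating 280,000 MWh over a year
cf = capacity_factor(280_000, 40, 8760)
print(round(cf, 3))  # 0.799
```

    Note that misreporting either net generation or nameplate capacity shifts this ratio directly, which is why the paper stresses the accuracy of the Form 923/860 submissions.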

  8. New ansatz for metric operator calculation in pseudo-Hermitian field theory

    SciTech Connect

    Shalaby, Abouzeid M.

    2009-05-15

    In this work, a new ansatz is introduced to simplify the calculation of the metric operator in pseudo-Hermitian field theory. The idea is to assume that the metric operator is a functional not only of the field operator φ and its conjugate field π but also of the field gradient ∇φ. The ansatz enables one to calculate the metric operator just once for all dimensions of the space-time. We calculated the metric operator of the iφ³ scalar field theory up to first order in the coupling. The higher orders can be conjectured from their corresponding operators in the quantum mechanical case available in the literature. We assert that the calculations existing in the literature for the metric operator in field theory are cumbersome and are done case by case with respect to the dimension of space-time in which the theory is investigated. In fact, with the aid of this work a rigorous study of a PT-symmetric Higgs mechanism can be reached.
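    In the standard pseudo-Hermitian notation (not taken from this abstract), the metric operator η is the operator that intertwines the Hamiltonian with its adjoint and defines a modified inner product:

```latex
% \eta is the (positive-definite) metric operator; H is pseudo-Hermitian
% with respect to \eta, so its spectrum can be real even though H \neq H^\dagger.
H^{\dagger} = \eta\, H\, \eta^{-1}, \qquad
\langle \psi | \chi \rangle_{\eta} \equiv \langle \psi |\, \eta\, | \chi \rangle .
```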

  9. Effective detective quantum efficiency for two mammography systems: Measurement and comparison against established metrics

    SciTech Connect

    Salvagnini, Elena; Bosmans, Hilde; Marshall, Nicholas W.; Struelens, Lara

    2013-10-15

    Purpose: The aim of this paper was to illustrate the value of the new metric effective detective quantum efficiency (eDQE) in relation to more established measures in the optimization process of two digital mammography systems. The following metrics were included for comparison against eDQE: detective quantum efficiency (DQE) of the detector, signal difference to noise ratio (SdNR), and detectability index (d′) calculated using a standard nonprewhitened observer with eye filter. Methods: The two systems investigated were the Siemens MAMMOMAT Inspiration and the Hologic Selenia Dimensions. The presampling modulation transfer function (MTF) required for the eDQE was measured using two geometries: a geometry containing scattered radiation and a low-scatter geometry. The eDQE, SdNR, and d′ were measured for poly(methyl methacrylate) (PMMA) thicknesses of 20, 40, 60, and 70 mm, with and without the antiscatter grid and for a selection of clinically relevant target/filter (T/F) combinations. Figures of merit (FOMs) were then formed from SdNR and d′ using the mean glandular dose as the factor to express detriment. Detector DQE was measured at energies covering the range of typical clinically used spectra. Results: The MTF measured in the presence of scattered radiation showed a large drop at low spatial frequency compared to the low-scatter method and led to a corresponding reduction in eDQE. The eDQE for the Siemens system at 1 mm⁻¹ ranged between 0.15 and 0.27, depending on T/F and grid setting. For the Hologic system, eDQE at 1 mm⁻¹ varied from 0.15 to 0.32, again depending on T/F and grid setting. The eDQE results for both systems showed that the grid increased the system efficiency for PMMA thicknesses of 40 mm and above but showed only small sensitivity to T/F setting. While results of the SdNR- and d′-based FOMs confirmed the eDQE grid-position results, they were also more specific in terms of T/F selection. For the Siemens system at 20 mm PMMA
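
    A minimal sketch of a dose-normalized figure of merit of the kind described above (one common convention squares the detectability measure and divides by mean glandular dose; the paper's exact definition may differ, and all numbers are illustrative):

```python
# Hedged sketch of a dose-normalized figure of merit: square a detectability
# measure (SdNR or d') and divide by mean glandular dose (MGD) so the FOM is
# exposure-independent. One common convention, not necessarily the paper's
# exact definition. All numbers are illustrative.

def figure_of_merit(sdnr, mgd_mgy):
    """Dose-normalized detectability: higher is better at equal detriment."""
    return sdnr ** 2 / mgd_mgy

# Two hypothetical target/filter settings at the same PMMA thickness:
fom_a = figure_of_merit(sdnr=5.0, mgd_mgy=1.0)  # 25.0
fom_b = figure_of_merit(sdnr=6.0, mgd_mgy=1.6)  # ~22.5
print(fom_a > fom_b)  # True: A wins on dose efficiency despite lower SdNR
```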

  10. Energy Department Project Captures and Stores more than One Million Metric

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    Tons of CO2 | Department of Energy June 26, 2014 - 11:30am Aerial view of Air Products' existing steam methane reforming facility at Port Arthur, Texas, with new carbon-capture units and central co-gen and CO2 product compressor. | Photo courtesy of Air Products and Chemicals Inc.

  11. DOE to Remove 200 Metric Tons of Highly Enriched Uranium from U.S. Nuclear

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    Weapons Stockpile | Department of Energy November 7, 2005 - 12:38pm Will Be Redirected to Naval Reactors, Down-blended or Used for Space Programs WASHINGTON, DC - Secretary of Energy Samuel W. Bodman today announced that the Department of Energy's (DOE) National Nuclear Security Administration (NNSA) will

  12. Analysis of key safety metrics of thorium utilization in LWRs (Journal

    Office of Scientific and Technical Information (OSTI)

    Article) | DOE PAGES This content will become publicly available on April 8, 2017 Here, thorium has great potential to stretch nuclear fuel reserves because of its natural abundance and because it is possible to breed the 232Th isotope into a fissile fuel (233U). Various scenarios exist for utilization of thorium in the nuclear fuel cycle,

  13. Analysis of key safety metrics of thorium utilization in LWRs (Journal

    Office of Scientific and Technical Information (OSTI)

    Article) | SciTech Connect This content will become publicly available on April 8, 2017 Here, thorium has great potential to stretch nuclear fuel reserves because of its natural abundance and because it is possible to breed the 232Th isotope into a fissile fuel (233U). Various scenarios exist for utilization of thorium in the

  14. Multidimensional metrics for estimating phage abundance, distribution, gene density, and sequence coverage in metagenomes

    DOE PAGES [OSTI]

    Aziz, Ramy K.; Dwivedi, Bhakti; Akhter, Sajia; Breitbart, Mya; Edwards, Robert A.

    2015-05-08

    Phages are the most abundant biological entities on Earth and play major ecological roles, yet the currently sequenced phage genomes do not adequately represent their diversity, and little is known about the abundance and distribution of these sequenced genomes in nature. Although the study of phage ecology has benefited tremendously from the emergence of metagenomic sequencing, a systematic survey of phage genes and genomes in various ecosystems is still lacking, and fundamental questions about phage biology, lifestyle, and ecology remain unanswered. To address these questions and improve comparative analysis of phages in different metagenomes, we screened a core set of publicly available metagenomic samples for sequences related to completely sequenced phages using the web tool Phage Eco-Locator. We then adopted and deployed an array of mathematical and statistical metrics for a multidimensional estimation of the abundance and distribution of phage genes and genomes in various ecosystems. Experiments using those metrics individually showed their usefulness in emphasizing the pervasive, yet uneven, distribution of known phage sequences in environmental metagenomes. Using these metrics in combination allowed us to resolve phage genomes into clusters that correlated with their genotypes and taxonomic classes as well as their ecological properties. We propose adding this set of metrics to current metaviromic analysis pipelines, where they can provide insight regarding phage mosaicism, habitat specificity, and evolution.
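
    Two of the simpler metric families named above can be sketched as follows (read coordinates, genome length, and the RPKM-style normalization are illustrative assumptions, not the paper's exact definitions):

```python
# Hedged sketch of two metric families named in the abstract: sequence
# coverage (fraction of a phage genome covered by metagenomic read hits) and
# abundance (hits normalized by genome length and sample size). All inputs
# are made up for illustration; the paper's metrics are more elaborate.

def genome_coverage(genome_len, hit_intervals):
    """Fraction of genome positions covered by at least one aligned read."""
    covered = set()
    for start, end in hit_intervals:  # half-open intervals [start, end)
        covered.update(range(start, end))
    return len(covered) / genome_len

def normalized_abundance(n_hits, genome_len, n_reads_in_sample):
    """Hits per kb of genome per million sample reads (RPKM-style)."""
    return n_hits / (genome_len / 1e3) / (n_reads_in_sample / 1e6)

hits = [(0, 300), (250, 600), (900, 1000)]     # overlapping reads merge
print(genome_coverage(1000, hits))              # 0.7
print(normalized_abundance(3, 1000, 1_000_000)) # 3.0 hits per kb per M reads
```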

  15. Metrics of closed world of Friedmann, agitated by electric charge (towards a theory of electromagnetic Friedmanns)

    SciTech Connect

    Markov, M.A.; Frolov, V.P.

    1986-06-10

    The generalization of the well-known Tolman problem to the case of electrically charged dust-like matter in a centrally symmetric system is considered. The first integrals of the corresponding system of Einstein-Maxwell equations are found. The problem is specified so that, as the full charge of the system goes to zero, the metric of the closed Friedmann world arises. The system is considered at the initial moment, that of maximal expansion. For any nonvanishing, however small, value of the electric charge, the metric is not closed. The metric of the almost-Friedmannian part of the world admits continuation through a narrow throat (at small charge) as the Reissner-Nordstroem metric with parameters m₀√χ = e₀. The expression for the electric potential in the throat, φ_h = c²/√χ, does not depend on the value of the electric charge. The radius of the throat, r_h = e₀√χ/c², increases with increasing charge. The state of the throat as given by the classical description is essentially unstable from the quantum-physics viewpoint. Pair production in the enormous electric fields of the throat polarizes it up to an effective charge Z < 137e, irrespective of the initial (however large) charge of the system.

  16. Multidimensional metrics for estimating phage abundance, distribution, gene density, and sequence coverage in metagenomes

    SciTech Connect

    Aziz, Ramy K.; Dwivedi, Bhakti; Akhter, Sajia; Breitbart, Mya; Edwards, Robert A.

    2015-05-08

    Phages are the most abundant biological entities on Earth and play major ecological roles, yet the currently sequenced phage genomes do not adequately represent their diversity, and little is known about the abundance and distribution of these sequenced genomes in nature. Although the study of phage ecology has benefited tremendously from the emergence of metagenomic sequencing, a systematic survey of phage genes and genomes in various ecosystems is still lacking, and fundamental questions about phage biology, lifestyle, and ecology remain unanswered. To address these questions and improve comparative analysis of phages in different metagenomes, we screened a core set of publicly available metagenomic samples for sequences related to completely sequenced phages using the web tool Phage Eco-Locator. We then adopted and deployed an array of mathematical and statistical metrics for a multidimensional estimation of the abundance and distribution of phage genes and genomes in various ecosystems. Experiments using those metrics individually showed their usefulness in emphasizing the pervasive, yet uneven, distribution of known phage sequences in environmental metagenomes. Using these metrics in combination allowed us to resolve phage genomes into clusters that correlated with their genotypes and taxonomic classes as well as their ecological properties. We propose adding this set of metrics to current metaviromic analysis pipelines, where they can provide insight regarding phage mosaicism, habitat specificity, and evolution.

  17. PREDICTION METRICS FOR CHEMICAL DETECTION IN LONG-WAVE INFRARED HYPERSPECTRAL IMAGERY

    SciTech Connect

    Chilton, M.; Walsh, S.J.; Daly, D.S.

    2009-01-01

    Natural and man-made chemical processes generate gaseous plumes that may be detected by hyperspectral imaging, which produces a matrix of spectra affected by the chemical constituents of the plume, the atmosphere, the bounding background surface and instrument noise. A physics-based model of observed radiance shows that high chemical absorbance and low background emissivity result in a larger chemical signature. Using simulated hyperspectral imagery, this study investigated two metrics which exploited this relationship. The objective was to explore how well the chosen metrics predicted when a chemical would be more easily detected when comparing one background type to another. The two predictor metrics correctly rank ordered the backgrounds for about 94% of the chemicals tested as compared to the background rank orders from Whitened Matched Filtering (a detection algorithm) of the simulated spectra. These results suggest that the metrics provide a reasonable summary of how the background emissivity and chemical absorbance interact to produce the at-sensor chemical signal. This study suggests that similarly effective predictors that account for more general physical conditions may be derived.
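
    The relationship the abstract describes (at-sensor signal grows with chemical absorbance and shrinks with background emissivity) can be sketched as a toy predictor; the study's actual metrics are not reproduced here, and all emissivity and absorbance values are illustrative:

```python
# Hedged sketch of a predictor of the kind described above: a product metric
# that grows with chemical absorbance and shrinks with background emissivity,
# used to rank backgrounds by expected detectability. This is a toy stand-in
# for the paper's two metrics; all numbers are illustrative.

def predicted_signal(absorbance, background_emissivity):
    """Larger value => chemical plume expected to be easier to detect."""
    return absorbance * (1.0 - background_emissivity)

# Illustrative long-wave IR emissivities for three background types:
backgrounds = {"soil": 0.95, "vegetation": 0.98, "metal roof": 0.60}
chem_absorbance = 0.8

ranked = sorted(backgrounds,
                key=lambda b: predicted_signal(chem_absorbance, backgrounds[b]),
                reverse=True)
print(ranked)  # ['metal roof', 'soil', 'vegetation']
```

    Comparing such rank orders against those produced by a detection algorithm (here, the paper's Whitened Matched Filter) is how the abstract's 94% agreement figure was assessed.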

  18. Energy Department Project Captures and Stores One Million Metric Tons of Carbon

    Energy.gov [DOE]

    As part of President Obama’s all-of-the-above energy strategy, the Department of Energy announced today that its Illinois Basin-Decatur Project successfully captured and stored one million metric tons of carbon dioxide (CO2) and injected it into a deep saline formation.

  19. Performance Metrics

    Energy.gov [DOE]

    RCA/CAP Closure Report 2011 - This RCA/CAP Closure Report presents a status of the Department’s initiatives to address the most significant issues and their corresponding root causes and officially...

  20. The International Safeguards Technology Base: How is the Patient Doing? An Exploration of Effective Metrics

    SciTech Connect

    Schanfein, Mark J; Gouveia, Fernando S

    2010-07-01

    The term “Technology Base” is commonly used but what does it mean? Is there a common understanding of the components that comprise a technology base? Does a formal process exist to assess the health of a given technology base? These are important questions the relevance of which is even more pressing given the USDOE/NNSA initiatives to strengthen the safeguards technology base through investments in research & development and human capital development. Accordingly, the authors will establish a high-level framework to define and understand what comprises a technology base. Potential goal-driven metrics to assess the health of a technology base will also be explored, such as linear demographics and resource availability, in the hope that they can be used to better understand and improve the health of the U.S. safeguards technology base. Finally, through the identification of such metrics, the authors will offer suggestions and highlight choices for addressing potential shortfalls.

  1. Comparing Resource Adequacy Metrics and their Influence on Capacity Value: Preprint

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

    Comparing Resource Adequacy Metrics and Their Influence on Capacity Value Preprint E. Ibanez and M. Milligan National Renewable Energy Laboratory To be presented at the 13 th International Conference on Probabilistic Methods Applied to Power Systems Durham, United Kingdom July 7-10, 2014 Conference Paper NREL/CP-5D00-61017 April 2014 NOTICE The submitted manuscript has been offered by an employee of the Alliance for Sustainable Energy, LLC (Alliance), a contractor of the US Government under

  2. Texas CO2 Capture Demonstration Project Hits Three Million Metric Ton Milestone

    Energy.gov [DOE]

    On June 30, Allentown, PA-based Air Products and Chemicals, Inc. successfully captured and transported, via pipeline, its 3 millionth metric ton of carbon dioxide (CO2) to be used for enhanced oil recovery. This achievement highlights the ongoing success of a carbon capture and storage (CCS) project sponsored by the U.S. Department of Energy (DOE) and managed by the National Energy Technology Laboratory (NETL).

  3. Dynamical Systems in the Variational Formulation of the Fokker-Planck Equation by the Wasserstein Metric

    SciTech Connect

    Mikami, T.

    2000-07-01

    R. Jordan, D. Kinderlehrer, and F. Otto proposed the discrete-time approximation of the Fokker-Planck equation by the variational formulation. It is determined by the Wasserstein metric, an energy functional, and the Gibbs-Boltzmann entropy functional. In this paper we study the asymptotic behavior of the dynamical systems which describe their approximation of the Fokker-Planck equation and characterize the limit as a solution to a class of variational problems.
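
    The discrete-time approximation referenced above is the Jordan-Kinderlehrer-Otto scheme: for a time step h > 0,

```latex
\rho_{k+1} \in \operatorname*{arg\,min}_{\rho}
\left\{ \frac{1}{2h}\, W_2^{2}(\rho_k, \rho) + F(\rho) \right\},
\qquad
F(\rho) = \int \Psi \rho \, dx + \beta^{-1} \int \rho \log \rho \, dx,
```

    where W₂ is the Wasserstein metric and F combines the energy functional and the Gibbs-Boltzmann entropy functional.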

  4. Time delay of light signals in an energy-dependent spacetime metric

    SciTech Connect

    Grillo, A. F.; Luzio, E.; Mendez, F.

    2008-05-15

    In this paper we review the problem of time delay of photons propagating in a spacetime with a metric that explicitly depends on the energy of the particles (gravity-rainbow approach). We show that corrections due to this approach--which is closely related to the double special relativity proposal--produce for small redshifts (z<<1) smaller time delays than in the generic Lorentz invariance violating case.

  5. Performance metrics and life-cycle information management for building performance assurance

    SciTech Connect

    Hitchcock, R.J.; Piette, M.A.; Selkowitz, S.E.

    1998-06-01

    Commercial buildings account for over $85 billion per year in energy costs, far more than is technically necessary. One of the primary reasons buildings do not perform as well as intended is that critical information is lost, through ineffective documentation and communication, leading to building systems that are often improperly installed and operated. A life-cycle perspective on the management of building information provides a framework for improving commercial building energy performance. This paper describes a project to develop strategies and techniques to provide decision-makers with information needed to assure the desired building performance across the complete life cycle of a building project. A key element in this effort is the development of explicit performance metrics that quantitatively represent performance objectives of interest to various building stakeholders. The paper begins with a discussion of key problems identified in current building industry practice, and ongoing work to address these problems. The paper then focuses on the concept of performance metrics and their use in improving building performance during design, commissioning, and ongoing operations. The design of a Building Life-cycle Information System (BLISS) is presented. BLISS is intended to provide an information infrastructure capable of integrating a variety of building information technologies that support performance assurance. The use of performance metrics in case study building projects is explored to illustrate current best practice. The application of integrated information technology for improving current practice is discussed.

  6. Specification and implementation of IFC based performance metrics to support building life cycle assessment of hybrid energy systems

    SciTech Connect

    Morrissey, Elmer; O'Donnell, James; Keane, Marcus; Bazjanac, Vladimir

    2004-03-29

    Minimizing building life cycle energy consumption is becoming of paramount importance. Performance metrics tracking offers a clear and concise manner of relating design intent in a quantitative form. A methodology is discussed for storage and utilization of these performance metrics through an Industry Foundation Classes (IFC) instantiated Building Information Model (BIM). The paper focuses on storage of three sets of performance data from three distinct sources. An example of a performance metrics programming hierarchy is displayed for a heat pump and a solar array. Utilizing the sets of performance data, two discrete performance effectiveness ratios may be computed, thus offering an accurate method of quantitatively assessing building performance.

  7. Development and evaluation of aperture-based complexity metrics using film and EPID measurements of static MLC openings

    SciTech Connect

    Götstedt, Julia; Karlsson Hauer, Anna; Bäck, Anna

    2015-07-15

    Purpose: Complexity metrics have been suggested as a complement to measurement-based quality assurance for intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT). However, these metrics have not yet been sufficiently validated. This study develops and evaluates new aperture-based complexity metrics in the context of static multileaf collimator (MLC) openings and compares them to previously published metrics. Methods: This study develops the converted aperture metric and the edge area metric. The converted aperture metric is based on small and irregular parts within the MLC opening that are quantified as measured distances between MLC leaves. The edge area metric is based on the relative size of the region around the edges defined by the MLC. Another metric suggested in this study is the circumference/area ratio. Earlier defined aperture-based complexity metrics—the modulation complexity score, the edge metric, the ratio monitor units (MU)/Gy, the aperture area, and the aperture irregularity—are compared to the newly proposed metrics. A set of small and irregular static MLC openings are created which simulate individual IMRT/VMAT control points of various complexities. These are measured with both an amorphous silicon electronic portal imaging device and EBT3 film. The differences between calculated and measured dose distributions are evaluated using a pixel-by-pixel comparison with two global dose difference criteria of 3% and 5%. The extent of the dose differences, expressed in terms of pass rate, is used as a measure of the complexity of the MLC openings and used for the evaluation of the metrics compared in this study. The different complexity scores are calculated for each created static MLC opening. The correlation between the calculated complexity scores and the extent of the dose differences (pass rate) are analyzed in scatter plots and using Pearson’s r-values. Results: The complexity scores calculated by the edge
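
    The circumference/area ratio mentioned above can be sketched for an aperture built from per-leaf-pair gaps (centred rectangular strips; the leaf width and gap values are illustrative, not the study's data):

```python
# Hedged sketch of a circumference/area (perimeter-to-area) complexity score
# for an MLC opening modeled as a stack of centred rectangular strips, one
# per leaf pair: strip height = leaf width, strip width = leaf-pair gap.
# Illustrative geometry only; the study's metric definition may differ.

def circumference_area_ratio(gaps_cm, leaf_width_cm):
    """Perimeter/area of a stack of centred strips (all gaps > 0 assumed)."""
    area = sum(g * leaf_width_cm for g in gaps_cm)
    # Perimeter: left+right edge of every strip, outer top and bottom edges,
    # and the exposed steps between consecutive centred strips.
    perimeter = (2 * leaf_width_cm * len(gaps_cm)
                 + gaps_cm[0] + gaps_cm[-1]
                 + sum(abs(a - b) for a, b in zip(gaps_cm, gaps_cm[1:])))
    return perimeter / area

irregular = circumference_area_ratio([2.0, 4.0, 2.0], 0.5)  # stepped opening
regular = circumference_area_ratio([4.0, 4.0, 4.0], 0.5)    # near-rectangle
print(round(irregular, 2), round(regular, 2))  # 2.75 1.83
print(irregular > regular)  # True: more irregular aperture scores higher
```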

  8. Genome Assembly Forensics: Metrics for Assessing Assembly Correctness (Metagenomics Informatics Challenges Workshop: 10K Genomes at a Time)

    ScienceCinema

    Pop, Mihai [University of Maryland

    2016-07-12

    University of Maryland's Mihai Pop on "Genome Assembly Forensics: Metrics for Assessing Assembly Correctness" at the Metagenomics Informatics Challenges Workshop held at the DOE JGI on October 12-13, 2011.

  9. Light Water Reactor Sustainability Program Operator Performance Metrics for Control Room Modernization: A Practical Guide for Early Design Evaluation

    SciTech Connect

    Ronald Boring; Roger Lew; Thomas Ulrich; Jeffrey Joe

    2014-03-01

    As control rooms are modernized with new digital systems at nuclear power plants, it is necessary to evaluate the operator performance using these systems as part of a verification and validation process. There are no standard, predefined metrics available for assessing what is satisfactory operator interaction with new systems, especially during the early design stages of a new system. This report identifies the process and metrics for evaluating human system interfaces as part of control room modernization. The report includes background information on design and evaluation, a thorough discussion of human performance measures, and a practical example of how the process and metrics have been used as part of a turbine control system upgrade during the formative stages of design. The process and metrics are geared toward generalizability to other applications and serve as a template for utilities undertaking their own control room modernization activities.

  10. OSTIblog Articles in the metrics Topic | OSTI, US Dept of Energy Office of

    Office of Scientific and Technical Information (OSTI)

    Scientific and Technical Information metrics Topic OSTI's Committee of Visitors, An Update by Dr. Jeffrey Salmon 23 May, 2011 in Science Communications "The unexamined life is not worth living." So says Plato's Socrates in the Apology. His self-examination led to extreme humility (or to an extreme irony) when Socrates confessed to his accusers that the only knowledge he had was knowledge of his

  11. Comparing Resource Adequacy Metrics and Their Influence on Capacity Value: Preprint

    SciTech Connect

    Ibanez, E.; Milligan, M.

    2014-04-01

    Traditional probabilistic methods have been used to evaluate resource adequacy. The increasing presence of variable renewable generation in power systems presents a challenge to these methods because, unlike thermal units, variable renewable generation levels change over time, driven by meteorological events. Thus, capacity value calculations for these resources are often performed according to simple rules of thumb. This paper follows the recommendations of the North American Electric Reliability Corporation's Integration of Variable Generation Task Force to include variable generation in the calculation of resource adequacy and compares different reliability metrics. Examples are provided using the Western Interconnection footprint under different variable generation penetrations.
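
    A minimal sketch of the traditional probabilistic calculation referred to above, assuming independent two-state thermal units (unit sizes, forced outage rates, and loads are illustrative):

```python
# Hedged sketch of a traditional resource-adequacy metric: loss-of-load
# probability (LOLP) per period from an enumerated capacity-outage table of
# independent two-state units, summed into a loss-of-load expectation (LOLE).
# All unit data and loads are illustrative.
from itertools import product

def lolp(units, load_mw):
    """P(available capacity < load); units: list of (capacity_mw, forced_outage_rate)."""
    total = 0.0
    for states in product([0, 1], repeat=len(units)):  # 1 = unit available
        p, cap = 1.0, 0.0
        for up, (c, fo) in zip(states, units):
            p *= (1 - fo) if up else fo
            cap += c if up else 0.0
        if cap < load_mw:
            total += p
    return total

units = [(100, 0.05), (100, 0.05), (50, 0.10)]
hourly_loads = [120, 180, 210]                    # three illustrative hours
lole = sum(lolp(units, l) for l in hourly_loads)  # expected hours of shortfall
print(round(lole, 4))
```

    The challenge the paper addresses is what to feed such a calculation for wind and solar, whose available capacity is a weather-driven time series rather than a fixed rating with a forced outage rate.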

  12. Einstein-aether theory, violation of Lorentz invariance, and metric-affine gravity

    SciTech Connect

    Heinicke, Christian; Baekler, Peter; Hehl, Friedrich W.

    2005-07-15

    We show that the Einstein-aether theory of Jacobson and Mattingly (J and M) can be understood in the framework of the metric-affine (gauge theory of) gravity (MAG). We achieve this by relating the aether vector field of J and M to certain post-Riemannian nonmetricity pieces contained in an independent linear connection of spacetime. Then, for the aether, a corresponding geometrical curvature-square Lagrangian with a massive piece can be formulated straightforwardly. We find an exact spherically symmetric solution of our model.

  13. Perfect fluid and scalar field in the Reissner-Nordstroem metric

    SciTech Connect

    Babichev, E. O.; Dokuchaev, V. I. Eroshenko, Yu. N.

    2011-05-15

    We describe the spherically symmetric steady-state accretion of perfect fluid in the Reissner-Nordstroem metric. We present analytic solutions for accretion of a fluid with linear equations of state and of the Chaplygin gas. We also show that under reasonable physical conditions, there is no steady-state accretion of a perfect fluid onto a Reissner-Nordstroem naked singularity. Instead, a static atmosphere of fluid is formed. We discuss a possibility of violation of the third law of black hole thermodynamics for a phantom fluid accretion.

  14. Ultrahard fluid and scalar field in the Kerr-Newman metric

    SciTech Connect

    Babichev, E.; Chernov, S.; Dokuchaev, V.; Eroshenko, Yu.

    2008-11-15

    An analytic solution for the accretion of ultrahard perfect fluid onto a moving Kerr-Newman black hole is found. This solution is a generalization of the previously known solution by Petrich, Shapiro, and Teukolsky for a Kerr black hole. We show that this solution is applicable in the case of a nonextreme black hole; however, it cannot describe accretion onto an extreme black hole due to violation of the test-fluid approximation. We also present a stationary solution for a massless scalar field in the metric of a Kerr-Newman naked singularity.

  15. Integration of Sustainability Metrics into Design Cases and State of Technology Assessments

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    This presentation does not contain any proprietary, confidential, or otherwise restricted information DOE Bioenergy Technologies Office (BETO) 2015 Project Peer Review Integration of Sustainability Metrics into Design Cases and State of Technology Assessments 2.1.0.100/2.1.0.302 NREL 2.1.0.301 PNNL Mary Biddy On behalf Eric Tan, Abhijit Dutta, Ryan Davis, Mike Talmadge NREL Lesley Snowden-Swan On behalf of Sue Jones, Aye Meyer, Ken Rappe, Kurt Spies PNNL Goal Statement 2 Support the development

  16. Methodology, Methods, and Metrics for Testing and Evaluating Augmented Cognition Systems

    SciTech Connect

    Greitzer, Frank L.

    2008-09-15

    The augmented cognition research community seeks cognitive neuroscience-based solutions to improve warfighter performance by applying and managing mitigation strategies to reduce workload and improve the throughput and quality of decisions. The focus of augmented cognition mitigation research is to define, demonstrate, and exploit neuroscience and behavioral measures that support inferences about the warfighter's cognitive state and that prescribe the nature and timing of mitigation. A research challenge is to develop valid evaluation methodologies, metrics, and measures to assess the impact of augmented cognition mitigations. Two considerations are external validity, which is the extent to which the results apply to operational contexts, and internal validity, which reflects the reliability of performance measures and the conclusions based on analysis of results. The scientific rigor of the research methodology employed in conducting empirical investigations largely affects the validity of the findings. External validity requirements also compel us to demonstrate the operational significance of mitigations; thus it is important to demonstrate the effectiveness of mitigations under specific conditions. This chapter reviews cognitive science and methodological considerations in designing augmented cognition research studies, along with associated human performance metrics and analysis methods for assessing the impact of augmented cognition mitigations.

  17. An Aquatic Acoustic Metrics Interface Utility for Underwater Sound Monitoring and Analysis

    SciTech Connect

    Ren, Huiying; Halvorsen, Michele B.; Deng, Zhiqun; Carlson, Thomas J.

    2012-05-31

    Fishes and marine mammals suffer a range of potential effects from intense sound sources generated by anthropogenic underwater processes such as pile driving, shipping, sonar, and underwater blasting. Several underwater sound recording devices (USRs) have been built to monitor the acoustic pressure waves generated by these anthropogenic underwater activities, so processing software is indispensable for analyzing the audio files these USRs record. However, existing software packages did not meet performance and flexibility requirements. In this paper, we provide a detailed description of a new software package, the Aquatic Acoustic Metrics Interface (AAMI), a graphical user interface (GUI) designed for underwater sound monitoring and analysis. In addition to general functions such as loading and editing audio files recorded by USRs, the software can compute a series of acoustic metrics in physical units, monitor a sound's influence on hearing according to audiograms from different species of fishes and marine mammals, and batch process sound files. Detailed applications of the AAMI software are discussed, along with several test case scenarios illustrating its functionality.
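
    Two standard underwater acoustic metrics of the kind AAMI reports can be sketched as follows (synthetic waveform, conventional 1 µPa underwater reference; this is not code from the AAMI package):

```python
# Hedged sketch of two standard underwater acoustic metrics: rms sound
# pressure level (SPL) and sound exposure level (SEL), both referenced to
# 1 uPa as is conventional underwater. The waveform is synthetic; this is
# illustrative code, not part of the AAMI package.
import math

P_REF = 1e-6  # reference pressure, Pa (1 uPa)

def spl_rms(pressures_pa):
    """rms sound pressure level in dB re 1 uPa."""
    rms = math.sqrt(sum(p * p for p in pressures_pa) / len(pressures_pa))
    return 20 * math.log10(rms / P_REF)

def sel(pressures_pa, sample_rate_hz):
    """Sound exposure level in dB re 1 uPa^2*s."""
    exposure = sum(p * p for p in pressures_pa) / sample_rate_hz  # Pa^2*s
    return 10 * math.log10(exposure / (P_REF ** 2))

# One second of a 100 Pa amplitude, 100 Hz tone sampled at 8 kHz:
fs = 8000
wave = [100.0 * math.sin(2 * math.pi * 100 * n / fs) for n in range(fs)]
print(round(spl_rms(wave), 1))  # 157.0 dB re 1 uPa
print(round(sel(wave, fs), 1))  # 157.0 (SEL equals SPL for a 1 s signal)
```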

  18. Assessing the Effects of Data Compression in Simulations Using Physically Motivated Metrics

    DOE PAGES [OSTI]

    Laney, Daniel; Langer, Steven; Weber, Christopher; Lindstrom, Peter; Wegener, Al

    2014-01-01

    This paper examines whether lossy compression can be used effectively in physics simulations as a possible strategy to combat the expected data-movement bottleneck in future high performance computing architectures. We show that, for the codes and simulations we tested, compression levels of 3–5X can be applied without causing significant changes to important physical quantities. Rather than applying signal processing error metrics, we utilize physics-based metrics appropriate for each code to assess the impact of compression. We evaluate three different simulation codes: a Lagrangian shock-hydrodynamics code, an Eulerian higher-order hydrodynamics turbulence modeling code, and an Eulerian coupled laser-plasma interaction code. We compress relevant quantities after each time-step to approximate the effects of tightly coupled compression and study the compression rates to estimate memory and disk-bandwidth reduction. We find that the error characteristics of compression algorithms must be carefully considered in the context of the underlying physics being modeled.
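
    The paper's central idea, judging compression by a physics-motivated metric rather than pointwise signal error, can be sketched with uniform quantization standing in for a real lossy compressor (an illustrative stand-in, not one of the compressors studied):

```python
# Hedged sketch of the paper's idea: judge lossy compression by a physics-
# based metric (here, relative change in a field's total "energy") rather
# than a pointwise signal-processing error. Uniform quantization stands in
# for a real lossy compressor; the study's codes and metrics are richer.
import math

def quantize(field, step):
    """Stand-in lossy compressor: snap each value to a grid of spacing `step`."""
    return [round(v / step) * step for v in field]

def relative_energy_error(field, decompressed):
    """Relative change in the sum of squares, a crude conserved-quantity check."""
    e0 = sum(v * v for v in field)
    e1 = sum(v * v for v in decompressed)
    return abs(e1 - e0) / e0

field = [math.sin(0.01 * i) for i in range(1000)]
lossy = quantize(field, 0.01)  # per-value error up to 0.005
err = relative_energy_error(field, lossy)
print(err < 0.01)  # True: "energy" preserved to better than 1% despite loss
```

    The point of the design is that signed quantization errors largely cancel in an integrated quantity, so an aggregate physical metric can remain tight even when pointwise errors look large.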

  19. Using research metrics to evaluate the International Atomic Energy Agency guidelines on quality assurance for R&D

    SciTech Connect

    Bodnarczuk, M.

    1994-06-01

    The objective of the International Atomic Energy Agency (IAEA) Guidelines on Quality Assurance for R&D is to provide guidance for developing quality assurance (QA) programs for R&D work on items, services, and processes important to safety, and to support the siting, design, construction, commissioning, operation, and decommissioning of nuclear facilities. The standard approach to writing papers describing new quality guidelines documents is to present a descriptive overview of the contents of the document. I will depart from this approach. Instead, I will first discuss a conceptual framework of metrics for evaluating and improving basic and applied experimental science as well as the associated role that quality management should play in understanding and implementing these metrics. I will conclude by evaluating how well the IAEA document addresses the metrics from this conceptual framework and the broader principles of quality management.

  20. MULTI-SCALE MORPHOLOGICAL ANALYSIS OF SDSS DR5 SURVEY USING THE METRIC SPACE TECHNIQUE

    SciTech Connect

    Wu Yongfeng; Batuski, David J.; Khalil, Andre

    2009-12-20

    Following the novel development and adaptation of the Metric Space Technique (MST), a multi-scale morphological analysis of the Sloan Digital Sky Survey (SDSS) Data Release 5 (DR5) was performed. The technique was adapted to perform a space-scale morphological analysis by filtering the galaxy point distributions with a smoothing Gaussian function, thus giving quantitative structural information on all size scales between 5 and 250 Mpc. The analysis was performed on a dozen slices of a volume of space containing many newly measured galaxies from the SDSS DR5 survey. Using the MST, observational data were compared to galaxy samples taken from N-body simulations with current best estimates of cosmological parameters and from random catalogs. By using the maximal ranking method among MST output functions, we also develop a way to quantify the overall similarity of the observed samples with the simulated samples.

  1. Quantifying Availability in SCADA Environments Using the Cyber Security Metric MFC

    SciTech Connect

    Aissa, Anis Ben; Rabai, Latifa Ben Arfa; Abercrombie, Robert K; Sheldon, Frederick T; Mili, Ali

    2014-01-01

    Supervisory Control and Data Acquisition (SCADA) systems are distributed networks dispersed over large geographic areas that monitor and control industrial processes from remote and/or centralized locations. They are used in the management of critical infrastructures such as electric power generation, transmission, and distribution; water and sewage; industrial manufacturing; and oil and gas production. The availability of SCADA systems is essential to assuring safety, security, and profitability; such systems are the backbone of the national cyber-physical critical infrastructure. Herein, we explore the definition and quantification of an econometric measure of availability as it applies to SCADA systems; our metric is a specialization of the generic measure of mean failure cost.
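The mean-failure-cost structure referenced above is commonly written as a chain of matrix products over stakeholders, requirements, components, and threats. A minimal sketch with that shape; every matrix entry below is invented for illustration:

```python
import numpy as np

# Hypothetical mean-failure-cost (MFC) chain: MFC = ST . DP . IM . PT, where
#   ST: stakes ($/h) each stakeholder places on each security requirement
#   DP: probability a requirement fails given a component fails
#   IM: probability a component fails given a threat materializes
#   PT: probability each threat materializes per unit of operation time
ST = np.array([[900.0, 300.0],      # utility operator
               [100.0,  50.0]])     # end customer
DP = np.array([[0.8, 0.2],
               [0.1, 0.9]])
IM = np.array([[0.5, 0.0],
               [0.1, 0.4]])
PT = np.array([1e-3, 5e-4])

MFC = ST @ DP @ IM @ PT             # expected loss per stakeholder ($/h)
print(MFC)
```

The result is one expected-loss figure per stakeholder, which is what lets availability be argued about in economic rather than purely technical terms.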

  2. Anomaly metrics to differentiate threat sources from benign sources in primary vehicle screening.

    SciTech Connect

    Cohen, Israel Dov; Mengesha, Wondwosen

    2011-09-01

    Discrimination of benign sources from threat sources at Ports of Entry (POE) is of great importance for efficient screening of cargo and vehicles using Radiation Portal Monitors (RPMs). Currently, the ability of RPMs to distinguish these radiological sources is seriously hampered by the energy resolution of the deployed RPMs. Because naturally occurring radioactive materials (NORM) are ubiquitous in commerce, false alarms are problematic: they require additional resources for secondary inspection and impact commerce. To increase the sensitivity of such detection systems without increasing false alarm rates, alarm metrics need to incorporate the ability to distinguish benign from threat sources. Principal component analysis (PCA) and clustering techniques were implemented in the present study and investigated for their potential to lower false alarm rates and/or increase sensitivity to weaker threat sources without loss of specificity. Results of the investigation demonstrated improved sensitivity and specificity in discriminating benign sources from threat sources.
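A hedged sketch of the PCA-based discrimination idea (synthetic spectra, invented peak location and component count; not the study's data or code): fit a principal subspace on benign traffic only, then score new spectra by their residual outside that subspace.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical gamma-count "spectra": benign NORM-like traffic, plus one
# injected source with an extra line absent from benign data.
benign = rng.poisson(lam=50, size=(200, 32)).astype(float)
threat = rng.poisson(lam=50, size=32).astype(float)
threat[20] += 400.0

# PCA basis fit on benign data only (mean-center, then SVD).
mu = benign.mean(axis=0)
_, _, Vt = np.linalg.svd(benign - mu, full_matrices=False)
basis = Vt[:5]                      # keep 5 principal components

def anomaly_score(spectrum):
    """Norm of the residual after projecting onto the benign subspace."""
    r = spectrum - mu
    return np.linalg.norm(r - basis.T @ (basis @ r))

benign_scores = [anomaly_score(s) for s in benign]
print(f"max benign score {max(benign_scores):.1f}, "
      f"threat score {anomaly_score(threat):.1f}")
```

Because the threshold is set against benign-only statistics, sensitivity to off-manifold sources can rise without inflating the NORM false-alarm rate.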

  3. Method and system for assigning a confidence metric for automated determination of optic disc location

    DOEpatents

    Karnowski, Thomas P.; Tobin, Jr., Kenneth W.; Muthusamy Govindasamy, Vijaya Priya; Chaum, Edward

    2012-07-10

    A method for assigning a confidence metric for automated determination of optic disc location that includes analyzing a retinal image and determining at least two sets of coordinates locating an optic disc in the retinal image. The sets of coordinates can be determined using first and second image analysis techniques that are different from one another. An accuracy parameter can be calculated and compared to a primary risk cut-off value. A high confidence level can be assigned to the retinal image if the accuracy parameter is less than the primary risk cut-off value, and a low confidence level can be assigned if the accuracy parameter is greater than the primary risk cut-off value. The primary risk cut-off value is selected to represent an acceptable risk of misdiagnosis of a disease having retinal manifestations by the automated technique.
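The decision rule in this abstract can be sketched directly; the coordinates, the Euclidean choice of accuracy parameter, and the cut-off value below are illustrative, not taken from the patent:

```python
import math

def confidence_level(coords_a, coords_b, risk_cutoff):
    """'High' confidence when two independent optic-disc localizations
    agree to within the risk cut-off (pixels); values are illustrative."""
    accuracy = math.dist(coords_a, coords_b)
    return "high" if accuracy < risk_cutoff else "low"

print(confidence_level((120, 84), (123, 86), risk_cutoff=10))   # -> high
print(confidence_level((120, 84), (60, 200), risk_cutoff=10))   # -> low
```

Agreement between two independent detectors serves as a proxy for correctness, so only low-confidence images need human review.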

  4. Interval Data Analysis with the Energy Charting and Metrics Tool (ECAM)

    SciTech Connect

    Taasevigen, Danny J.; Katipamula, Srinivas; Koran, William

    2011-07-07

    Analyzing whole-building interval data is an inexpensive but effective way to identify and improve building operations, and ultimately save money. Utilizing the Energy Charting and Metrics Tool (ECAM) add-in for Microsoft Excel, building operators and managers can begin implementing changes to their Building Automation System (BAS) after trending the interval data. The two data components needed for full analyses are whole-building electricity consumption (kW or kWh) and outdoor air temperature (OAT). Using these two pieces of information, a series of plots and charts can be created in ECAM to monitor the building's performance over time, gain knowledge of how the building is operating, and make adjustments to the BAS to improve efficiency and start saving money.
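A minimal sketch of the load-versus-OAT view described above (synthetic interval data; ECAM itself is an Excel add-in, so this Python version is only an analogy):

```python
import numpy as np

rng = np.random.default_rng(2)
oat = rng.uniform(30, 100, size=1000)                           # outdoor air temp (F)
kw = 50 + 3 * np.maximum(oat - 65, 0) + rng.normal(0, 2, 1000)  # cooling-driven load

# Average whole-building demand in 10-degree OAT bins: the
# load-vs-temperature view built from trended interval data.
bins = np.arange(30, 101, 10)
idx = np.digitize(oat, bins)
for b in range(1, len(bins)):
    sel = idx == b
    print(f"{bins[b-1]:3d}-{bins[b]:3d} F: {kw[sel].mean():6.1f} kW average")
```

A change-point around the balance temperature (here, 65 F in the synthetic model) is the kind of feature an operator would look for before adjusting BAS setpoints.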

  5. Table 11.3 Methane Emissions, 1980-2009 (Million Metric Tons of Methane)

    Energy Information Administration (EIA) (indexed site)

    Methane Emissions, 1980-2009 (Million Metric Tons of Methane) Year Energy Sources Waste Management Agricultural Sources Industrial Processes 9 Total 5 Coal Mining Natural Gas Systems 1 Petroleum Systems 2 Mobile Combustion 3 Stationary Combustion 4 Total 5 Landfills Wastewater Treatment 6 Total 5 Enteric Fermentation 7 Animal Waste 8 Rice Cultivation Crop Residue Burning Total 5 1980 3.06 4.42 NA 0.28 0.45 8.20 10.52 0.52 11.04 5.47 2.87 0.48 0.04 8.86 0.17 28.27 1981 2.81 5.02 NA .27

  6. SU-E-T-359: Measurement of Various Metrics to Determine Changes in Megavoltage Photon Beam Energy

    SciTech Connect

    Gao, S; Balter, P; Rose, M; Simon, W

    2014-06-01

    Purpose: To examine the relationship between photon beam energy and various metrics for energy on the flattened and flattening filter free (FFF) beams generated by the Varian TrueBeam. Methods: Energy changes were accomplished by adjusting the bending magnet current 10% from the nominal value for the 4, 6, 8, and 10 MV flattened and 6 and 10 MV FFF beams. Profiles were measured for a 30×30 cm{sup 2} field using a 2D ionization chamber array and a 3D water scanner, which was also used to measure PDDs. For flattened beams we compared several energy metrics: PDD at 10 cm depth in water (PDD(10)); the variation over the central 80% of the field (Flat); and the average of the highest reading along each diagonal divided by the CAX value, the diagonal normalized flatness (FDN). For FFF beams we examined PDD(10), FDN, and the width of a chosen isodose level in a 30×30 cm{sup 2} field (W(d%)). Results: Changes in PDD(10) were nearly linear with changes in energy for both flattened and FFF beams, as were changes in FDN. Changes in W(d%) were also nearly linear with energy for the FFF beams. PDD(10) was not as sensitive to changes in energy as the other metrics for either flattened or FFF beams. Flat was not as sensitive to changes in energy as FDN for flattened beams, and its behavior depends on depth. FDN was the metric with the highest sensitivity to changes in energy for flattened beams, while W(d%) had the highest sensitivity for FFF beams. Conclusions: The metric FDN was found to be most sensitive to energy changes for flattened beams, while W(d%) was most sensitive to energy changes for FFF beams.
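The FDN metric as described (highest reading along each diagonal, averaged, divided by the CAX value) can be sketched on a synthetic profile; the profile shape and grid below are invented:

```python
import numpy as np

def fdn(profile):
    """Diagonal-normalized flatness: average of the highest reading along
    each diagonal of a square 2D profile, divided by the CAX value."""
    n = profile.shape[0]
    cax = profile[n // 2, n // 2]
    d1 = np.diagonal(profile)
    d2 = np.diagonal(np.fliplr(profile))
    return 0.5 * (d1.max() + d2.max()) / cax

# Synthetic flattened profile with slight off-axis "horns".
x = np.linspace(-1, 1, 101)
xx, yy = np.meshgrid(x, x)
profile = 1.0 + 0.03 * (xx**2 + yy**2)    # horns grow toward the corners
print(round(fdn(profile), 3))
```

Because the diagonals reach the corners of the field, FDN responds to off-axis softening or hardening that a central-axis metric like PDD(10) largely misses, consistent with its higher sensitivity here.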

  7. Measuring solar reflectance Part I: Defining a metric that accurately predicts solar heat gain

    SciTech Connect

    Levinson, Ronnen; Akbari, Hashem; Berdahl, Paul

    2010-05-14

    Solar reflectance can vary with the spectral and angular distributions of incident sunlight, which in turn depend on surface orientation, solar position and atmospheric conditions. A widely used solar reflectance metric based on the ASTM Standard E891 beam-normal solar spectral irradiance underestimates the solar heat gain of a spectrally selective 'cool colored' surface because this irradiance contains a greater fraction of near-infrared light than typically found in ordinary (unconcentrated) global sunlight. At mainland U.S. latitudes, this metric R{sub E891BN} can underestimate the annual peak solar heat gain of a typical roof or pavement (slope {le} 5:12 [23{sup o}]) by as much as 89 W m{sup -2}, and underestimate its peak surface temperature by up to 5 K. Using R{sub E891BN} to characterize roofs in a building energy simulation can exaggerate the economic value N of annual cool-roof net energy savings by as much as 23%. We define clear-sky air mass one global horizontal ('AM1GH') solar reflectance R{sub g,0}, a simple and easily measured property that more accurately predicts solar heat gain. R{sub g,0} predicts the annual peak solar heat gain of a roof or pavement to within 2 W m{sup -2}, and overestimates N by no more than 3%. R{sub g,0} is well suited to rating the solar reflectances of roofs, pavements and walls. We show in Part II that R{sub g,0} can be easily and accurately measured with a pyranometer, a solar spectrophotometer or version 6 of the Solar Spectrum Reflectometer.
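The underlying weighting effect can be sketched numerically: a spectrally selective surface measures as more reflective under NIR-rich irradiance than under global-horizontal-like irradiance. All spectra below are crude invented stand-ins, not the E891 or AM1GH tables:

```python
import numpy as np

wl = np.linspace(300, 2500, 500)              # wavelength grid, nm (uniform)
rho = np.where(wl < 700, 0.25, 0.80)          # "cool colored": dark visibly, NIR-reflective
I_beam_normal = np.where(wl < 700, 0.8, 1.2)  # NIR-rich, beam-normal-like weighting
I_global = np.where(wl < 700, 1.2, 0.8)       # closer to ordinary global sunlight

def reflectance(rho, irr):
    # On a uniform wavelength grid the integral ratio
    # R = int(rho * I) dl / int(I) dl reduces to a weighted mean.
    return (rho * irr).sum() / irr.sum()

print(f"NIR-rich weighting: R = {reflectance(rho, I_beam_normal):.3f}")
print(f"global weighting:   R = {reflectance(rho, I_global):.3f}")
```

The NIR-rich metric reports the higher reflectance, so it understates the heat the surface actually absorbs under global sunlight, which is the bias the AM1GH metric is designed to remove.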

  9. Marker-free registration of forest terrestrial laser scanner data pairs with embedded confidence metrics

    DOE PAGES [OSTI]

    Van Aardt, Jan; Romanczyk, Paul; van Leeuwen, Martin; Kelbe, David; Cawse-Nicholson, Kerry

    2016-04-04

    Terrestrial laser scanning (TLS) has emerged as an effective tool for rapid comprehensive measurement of object structure. Registration of TLS data is an important prerequisite to overcome the limitations of occlusion. However, due to the high dissimilarity of point cloud data collected from disparate viewpoints in the forest environment, adequate marker-free registration approaches have not been developed. The majority of studies instead rely on artificial tie points (e.g., reflective tooling balls) placed within a scene to aid in coordinate transformation. We present a technique for generating view-invariant feature descriptors that are intrinsic to the point cloud data and, thus, enable blind marker-free registration in forest environments. To overcome the limitation of initial pose estimation, we employ a voting method to blindly determine the optimal pairwise transformation parameters, without an a priori estimate of the initial sensor pose. To provide embedded error metrics, we developed a set theory framework in which a circular transformation is traversed between disjoint tie point subsets. This provides an upper estimate of the Root Mean Square Error (RMSE) confidence associated with each pairwise transformation. Output RMSE errors are commensurate with the RMSE of input tie point locations. Thus, while the mean output RMSE = 16.3 cm, improved results could be achieved with a more precise laser scanning system. This study (1) quantifies the RMSE of the proposed marker-free registration approach, (2) assesses the validity of embedded confidence metrics using receiver operating characteristic (ROC) curves, and (3) informs optimal sample spacing considerations for TLS data collection in New England forests. Furthermore, while the implications for rapid, accurate, and precise forest inventory are obvious, the conceptual framework outlined here could potentially be extended to built environments.
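A sketch of the pairwise-transformation RMSE idea on synthetic tie points, using a least-squares rigid (Kabsch) fit; the point set, noise level, and 2D setting are invented, and this is not the paper's voting or set-theory machinery:

```python
import numpy as np

def estimate_rigid(src, dst):
    """Least-squares rigid transform (Kabsch): R, t with dst ~ src @ R.T + t."""
    cs, cd = src.mean(0), dst.mean(0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

rng = np.random.default_rng(3)
src = rng.uniform(0, 10, (20, 2))                     # hypothetical tie points
theta = 0.4
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
dst = src @ R_true.T + np.array([2.0, -1.0]) + rng.normal(0, 0.05, src.shape)

R, t = estimate_rigid(src, dst)
rmse = np.sqrt(((src @ R.T + t - dst) ** 2).sum(1).mean())
print(f"pairwise registration RMSE: {rmse:.3f}")
```

As the abstract notes, the output RMSE is commensurate with the noise in the input tie point locations (here 0.05 per coordinate), so a more precise scanner directly tightens the confidence metric.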

  10. Conceptual Framework for Developing Resilience Metrics for the Electricity, Oil, and Gas Sectors in the United States

    SciTech Connect

    Watson, Jean-Paul; Guttromson, Ross; Silva-Monroy, Cesar; Jeffers, Robert; Jones, Katherine; Ellison, James; Rath, Charles; Gearhart, Jared; Jones, Dean; Corbet, Tom; Hanley, Charles; Walker, La Tonya

    2014-09-01

    This report has been written for the Department of Energy’s Energy Policy and Systems Analysis Office to inform their writing of the Quadrennial Energy Review in the area of energy resilience. The topics of measuring and increasing energy resilience are addressed, including definitions, means of measuring, and analytic methodologies that can be used to make decisions for policy, infrastructure planning, and operations. A risk-based framework is presented which provides a standard definition of a resilience metric, and a process is identified which explains how the metrics can be applied. The report also articulates research and development that will further accelerate the resilience of energy infrastructures.

  11. A Year of Radiation Measurements at the North Slope of Alaska Second Quarter 2009 ARM and Climate Change Prediction Program Metric Report

    SciTech Connect

    McFarlane, S. A.; Shi, Y.; Long, C. N.

    2009-04-15

    In 2009, the Atmospheric Radiation Measurement (ARM) Program and the Climate Change Prediction Program (CCPP) were asked to produce joint science metrics. For CCPP, the second-quarter metrics are reported in Evaluation of Simulated Precipitation in CCSM3: Annual Cycle Performance Metrics at Watershed Scales. For ARM, the metric is to produce and make available new continuous time series of radiative fluxes based on one year of observations from Barrow, Alaska, during the International Polar Year, and to report on comparisons of observations with baseline simulations of the Community Climate System Model (CCSM).

  12. Recommendations for mass spectrometry data quality metrics for open access data (corollary to the Amsterdam Principles)

    SciTech Connect

    Kinsinger, Christopher R.; Apffel, James; Baker, Mark S.; Bian, Xiaopeng; Borchers, Christoph H.; Bradshaw, Ralph A.; Brusniak, Mi-Youn; Chan, Daniel W.; Deutsch, Eric W.; Domon, Bruno; Gorman, Jeff; Grimm, Rudolf; Hancock, William S.; Hermjakob, Henning; Horn, David; Hunter, Christie; Kolar, Patrik; Kraus, Hans-Joachim; Langen, Hanno; Linding, Rune; Moritz, Robert L.; Omenn, Gilbert S.; Orlando, Ron; Pandey, Akhilesh; Ping, Peipei; Rahbar, Amir; Rivers, Robert; Seymour, Sean L.; Simpson, Richard J.; Slotta, Douglas; Smith, Richard D.; Stein, Stephen E.; Tabb, David L.; Tagle, Danilo; Yates, John R.; Rodriguez, Henry

    2011-12-01

    Policies supporting the rapid and open sharing of proteomic data are being implemented by the leading journals in the field. The proteomics community is taking steps to ensure that data are made publicly accessible and are of high quality, a challenging task that requires the development and deployment of methods for measuring and documenting data quality metrics. On September 18, 2010, the U.S. National Cancer Institute (NCI) convened the 'International Workshop on Proteomic Data Quality Metrics' in Sydney, Australia, to identify and address issues facing the development and use of such methods for open access proteomics data. The stakeholders at the workshop enumerated the key principles underlying a framework for data quality assessment in mass spectrometry data that will meet the needs of the research community, journals, funding agencies, and data repositories. Attendees discussed and agreed upon two primary needs for the wide use of quality metrics: (i) an evolving list of comprehensive quality metrics and (ii) standards accompanied by software analytics. Attendees stressed the importance of increased education and training programs to promote reliable protocols in proteomics. This workshop report explores the historic precedents, key discussions, and necessary next steps to enhance the quality of open access data. By agreement, this article is published simultaneously in Proteomics, Proteomics Clinical Applications, Journal of Proteome Research, and Molecular and Cellular Proteomics, as a public service to the research community. The peer review process was a coordinated effort conducted by a panel of referees selected by the journals.

  13. Implementation Guide - Performance Indicators (Metrics ) for Use with DOE O 440.2B, Aviation Management and Safety

    Directives, Delegations, and Other Requirements [Office of Management (MA)]

    2005-09-19

    The Guide provides information regarding specific provisions of DOE O 440.2B and is intended to be useful in understanding and implementing performance indicators (metrics) required by the Order. Cancels DOE G 440.2B-1. Canceled by DOE N 251.98.

  14. Implementation Guide - Aviation Program Performance Indicators (Metrics) for use with DOE O 440.2B, Aviation Management And Safety

    Directives, Delegations, and Other Requirements [Office of Management (MA)]

    2002-12-10

    The Guide provides information regarding Departmental expectations on provisions of DOE 440.2B, identifies acceptable methods of implementing Aviation Program Performance Indicators (Metrics) requirements in the Order, and identifies relevant principles and practices by referencing Government and non-Government standards. Canceled by DOE G 440.2B-1A.

  15. Use of Frequency Response Metrics to Assess the Planning and Operating Requirements for Reliable Integration of Variable Renewable Generation

    SciTech Connect

    Eto, Joseph H.; Undrill, John; Mackin, Peter; Daschmans, Ron; Williams, Ben; Haney, Brian; Hunt, Randall; Ellis, Jeff; Illian, Howard; Martinez, Carlos; O'Malley, Mark; Coughlin, Katie; LaCommare, Kristina Hamachi

    2010-12-20

    An interconnected electric power system is a complex system that must be operated within a safe frequency range in order to reliably maintain the instantaneous balance between generation and load. This is accomplished by ensuring that adequate resources are available to respond to expected and unexpected imbalances and by restoring frequency to its scheduled value in order to ensure uninterrupted electric service to customers. Electrical systems must be flexible enough to reliably operate under a variety of "change" scenarios. System planners and operators must understand how other parts of the system change in response to the initial change, and need tools to manage such changes to ensure reliable operation within the scheduled frequency range. This report presents a systematic approach to identifying metrics that are useful for operating and planning a reliable system with increased amounts of variable renewable generation, building on existing industry practices for frequency control after unexpected loss of a large amount of generation. The report introduces a set of metrics, or tools, for measuring the adequacy of frequency response within an interconnection. Based on the concept of the frequency nadir, these metrics take advantage of new information gathering and processing capabilities that system operators are developing for wide-area situational awareness. Primary frequency response is the leading metric used in this report to assess the adequacy of primary frequency control reserves necessary to ensure reliable operation. It measures what is needed to arrest frequency decline (i.e., to establish the frequency nadir) at a frequency higher than the highest set point for under-frequency load shedding within an interconnection. These metrics can be used to guide the reliable operation of an interconnection under changing circumstances.
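The frequency-nadir metric can be sketched on a stylized post-contingency frequency trace; the trace shape and the UFLS set point below are invented for illustration:

```python
import numpy as np

# Stylized interconnection frequency after loss of generation:
# decline, arrest at the nadir, then partial recovery toward settling.
t = np.linspace(0, 30, 601)                     # seconds
f = 60.0 - 0.35 * (1 - np.exp(-t / 4)) + 0.15 * (1 - np.exp(-t / 12))

nadir = f.min()
t_nadir = t[f.argmin()]
ufls_setpoint = 59.5                            # illustrative UFLS threshold (Hz)
print(f"nadir {nadir:.3f} Hz at t = {t_nadir:.1f} s; "
      f"margin to UFLS {nadir - ufls_setpoint:.3f} Hz")
```

The adequacy question the report poses is exactly this margin: primary frequency response must arrest the decline so the nadir stays above the highest under-frequency load-shedding set point.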

  16. The International Safeguards Technology Base: How is the Patient Doing? An Exploration of Effective Metrics

    SciTech Connect

    Schanfein, Mark; Gouveia, Fernando; Crawford, Cary E.; Pickett, Chris J.; Jay, Jeffrey

    2010-07-15

    The term “Technology Base” is commonly used, but what does it mean? Is there a common understanding of the components that comprise a technology base? Does a formal process exist to assess the health of a given technology base? These are important questions, the relevance of which is even more pressing given the USDOE/NNSA initiatives to strengthen the safeguards technology base through investments in research & development and human capital development. Accordingly, the authors establish a high-level framework to define and understand what comprises a technology base. Potential goal-driven metrics to assess the health of a technology base, such as linear demographics and resource availability, are also explored in the hope that they can be used to better understand and improve the health of the U.S. safeguards technology base. Finally, through the identification of such metrics, the authors offer suggestions and highlight choices for addressing potential shortfalls. Introduction: The U.S. safeguards technology base got its start almost half a century ago in the nuclear weapons program of the U.S. Department of Energy/National Nuclear Security Administration (DOE/NNSA) and their predecessors, the AEC and ERDA. Due to the strategic importance and value of nuclear materials, the risks to public and worker health, and the potential for theft, significant investments were made to develop techniques to measure nuclear materials using both destructive assay (DA) and non-destructive assay (NDA). Major investment within the U.S. DOE Domestic Safeguards Program continued over the next three decades, resulting in continuous improvements in the state of the art of these techniques. This was particularly true in the area of NDA, with its ability to use gamma rays, neutrons, and heat to identify and quantify nuclear materials without the need to take direct samples of the material. Most of these techniques were commercialized and transferred to

  17. New risk metrics and mathematical tools for risk analysis: Current and future challenges

    SciTech Connect

    Skandamis, Panagiotis N.; Andritsos, Nikolaos; Psomas, Antonios; Paramythiotis, Spyridon

    2015-01-22

    The current status of food safety worldwide has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports is available that details the various steps in Risk Analysis, namely Risk Management, Risk Assessment, and Risk Communication. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health, and governmental decisions. To that end, a series of new Risk Metrics has been established, as follows: (i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, is defined by quantitative risk assessments and used to establish (ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that ALOP is a metric of the tolerable public health burden (it addresses the total ‘failure’ that may be handled at a national level), it is difficult to interpret in terms of control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products has been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve the FSO, tools quantifying the effect of processes and intrinsic properties of foods on survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to Food Safety Management. Predictive modelling is the basis of exposure assessment and the development of stochastic and kinetic models, which are also available in the form of Web-based applications (e.g., COMBASE and Microbial Responses Viewer), or introduced into user
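The ALOP/FSO relationship referenced above is commonly expressed through the ICMSF-style outcome equation H0 − ΣR + ΣI ≤ FSO (in log10 units); a worked example with invented numbers:

```python
# Illustrative food-safety budget in log10 cfu/g:
#   H0:    initial hazard level
#   sum_R: total reductions (e.g., cooking, sanitation)
#   sum_I: total increases (growth, recontamination)
# The level at consumption must not exceed the FSO.
H0, sum_R, sum_I = 3.0, 6.0, 1.5
FSO = -1.0                                  # maximum level at consumption

level_at_consumption = H0 - sum_R + sum_I
print(level_at_consumption, level_at_consumption <= FSO)
```

Working in log units lets process criteria (each reduction or growth step) be summed into a single budget that can be checked against the FSO, and through it the ALOP.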

  18. Methodological Framework for Analysis of Buildings-Related Programs: The GPRA Metrics Effort

    SciTech Connect

    Elliott, Douglas B.; Anderson, Dave M.; Belzer, David B.; Cort, Katherine A.; Dirks, James A.; Hostick, Donna J.

    2004-06-18

    The requirements of the Government Performance and Results Act (GPRA) of 1993 mandate the reporting of outcomes expected to result from programs of the Federal government. The U.S. Department of Energy’s (DOE’s) Office of Energy Efficiency and Renewable Energy (EERE) develops official metrics for its 11 major programs using its Office of Planning, Budget Formulation, and Analysis (OPBFA). OPBFA conducts an annual integrated modeling analysis to produce estimates of the energy, environmental, and financial benefits expected from EERE’s budget request. Two of EERE’s major programs include the Building Technologies Program (BT) and Office of Weatherization and Intergovernmental Program (WIP). Pacific Northwest National Laboratory (PNNL) supports the OPBFA effort by developing the program characterizations and other market information affecting these programs that is necessary to provide input to the EERE integrated modeling analysis. Throughout the report we refer to these programs as “buildings-related” programs, because the approach is not limited in application to BT or WIP. To adequately support OPBFA in the development of official GPRA metrics, PNNL communicates with the various activities and projects in BT and WIP to determine how best to characterize their activities planned for the upcoming budget request. PNNL then analyzes these projects to determine what the results of the characterizations would imply for energy markets, technology markets, and consumer behavior. This is accomplished by developing nonintegrated estimates of energy, environmental, and financial benefits (i.e., outcomes) of the technologies and practices expected to result from the budget request. These characterizations and nonintegrated modeling results are provided to OPBFA as inputs to the official benefits estimates developed for the Federal Budget. This report documents the approach and methodology used to estimate future energy, environmental, and financial benefits

  19. ZFS on RBODs - Leveraging RAID Controllers for Metrics and Enclosure Management

    SciTech Connect

    Stearman, D. M.

    2015-03-30

    Traditionally, the Lustre file system has relied on the ldiskfs file system with reliable RAID (Redundant Array of Independent Disks) storage underneath. As of Lustre 2.4, ZFS was added as a backend file system with built-in software RAID, thereby removing the need for expensive RAID controllers. ZFS was designed to work with JBOD (Just a Bunch Of Disks) storage enclosures under the Solaris Operating System, which provided a rich device management system. Long-time users of the Lustre file system have relied on RAID controllers to provide metrics and enclosure monitoring and management services, with rich APIs and command line interfaces. This paper studies a hybrid approach using an advanced, full-featured RAID enclosure that is presented to the host as a JBOD. This RBOD (RAIDed Bunch Of Disks) allows ZFS to do the RAID protection and error correction, while the RAID controller handles management of the disks and monitors the enclosure. It was hoped that the value of the RAID controller features would offset the additional cost, and that performance would not suffer in this mode. The test results revealed that the hybrid RBOD approach did suffer reduced performance.

  20. Advanced Fuels Campaign Light Water Reactor Accident Tolerant Fuel Performance Metrics Executive Summary

    SciTech Connect

    Shannon Bragg-Sitton

    2014-02-01

    Research and development (R&D) activities on advanced, higher performance Light Water Reactor (LWR) fuels have been ongoing for the last few years. Following the unfortunate March 2011 events at the Fukushima Nuclear Power Plant in Japan, the R&D shifted toward enhancing the accident tolerance of LWRs. Qualitative attributes for fuels with enhanced accident tolerance, such as improved reaction kinetics with steam resulting in slower hydrogen generation rate, provide guidance for the design and development of fuels and cladding with enhanced accident tolerance. A common set of technical metrics should be established to aid in the optimization and down selection of candidate designs on a more quantitative basis. “Metrics” describe a set of technical bases by which multiple concepts can be fairly evaluated against a common baseline and against one another. This report describes a proposed technical evaluation methodology that can be applied to evaluate the ability of each concept to meet performance and safety goals relative to the current UO2 – zirconium alloy system and relative to one another. The resultant ranked evaluation can then inform concept down-selection, such that the most promising accident tolerant fuel design option(s) can continue to be developed toward qualification.
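A down-selection of the kind described (multiple concepts scored against a common baseline and against one another) can be sketched as a weighted-attribute ranking; the attribute names, weights, and scores below are entirely invented, not the report's actual metrics:

```python
# Weighted-attribute ranking of hypothetical accident-tolerant fuel
# concepts; the current UO2-zirconium baseline scores 0 on every attribute.
weights = {"oxidation_resistance": 0.40,
           "fission_product_retention": 0.35,
           "fabricability": 0.25}
concepts = {
    "coated_zr_cladding": {"oxidation_resistance": 2,
                           "fission_product_retention": 0,
                           "fabricability": 1},
    "sic_composite":      {"oxidation_resistance": 3,
                           "fission_product_retention": 1,
                           "fabricability": -1},
}
scores = {name: sum(weights[a] * v for a, v in attrs.items())
          for name, attrs in concepts.items()}
ranked = sorted(scores, key=scores.get, reverse=True)
print(ranked, scores)
```

The point of a common baseline and shared weights is that every concept is penalized or credited on the same scale, so the resulting ranking can defensibly inform down-selection.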

  1. En route to Background Independence: Broken split-symmetry, and how to restore it with bi-metric average actions

    SciTech Connect

    Becker, D.; Reuter, M.

    2014-11-15

    The most momentous requirement a quantum theory of gravity must satisfy is Background Independence, necessitating in particular an ab initio derivation of the arena all non-gravitational physics takes place in, namely spacetime. Using the background field technique, this requirement translates into the condition of an unbroken split-symmetry connecting the (quantized) metric fluctuations to the (classical) background metric. If the regularization scheme used violates split-symmetry during the quantization process it is mandatory to restore it in the end at the level of observable physics. In this paper we present a detailed investigation of split-symmetry breaking and restoration within the Effective Average Action (EAA) approach to Quantum Einstein Gravity (QEG) with a special emphasis on the Asymptotic Safety conjecture. In particular we demonstrate for the first time in a non-trivial setting that the two key requirements of Background Independence and Asymptotic Safety can be satisfied simultaneously. Carefully disentangling fluctuation and background fields, we employ a ‘bi-metric’ ansatz for the EAA and project the flow generated by its functional renormalization group equation on a truncated theory space spanned by two separate Einstein–Hilbert actions for the dynamical and the background metric, respectively. A new powerful method is used to derive the corresponding renormalization group (RG) equations for the Newton and cosmological constants, both in the dynamical and the background sector. We classify and analyze their solutions in detail, determine their fixed point structure, and identify an attractor mechanism which turns out instrumental in the split-symmetry restoration. We show that there exists a subset of RG trajectories which are both asymptotically safe and split-symmetry restoring: In the ultraviolet they emanate from a non-Gaussian fixed point, and in the infrared they lose all symmetry-violating contributions inflicted on them by the

  2. Advanced Fuels Campaign Light Water Reactor Accident Tolerant Fuel Performance Metrics

    SciTech Connect

    Brad Merrill; Melissa Teague; Robert Youngblood; Larry Ott; Kevin Robb; Michael Todosow; Chris Stanek; Mitchell Farmer; Michael Billone; Robert Montgomery; Nicholas Brown; Shannon Bragg-Sitton

    2014-02-01

    The safe, reliable and economic operation of the nation’s nuclear power reactor fleet has always been a top priority for the United States’ nuclear industry. As a result, continual improvement of technology, including advanced materials and nuclear fuels, remains central to industry’s success. Decades of research combined with continual operation have produced steady advancements in technology and yielded an extensive base of data, experience, and knowledge on light water reactor (LWR) fuel performance under both normal and accident conditions. In 2011, following the Great East Japan Earthquake, resulting tsunami, and subsequent damage to the Fukushima Daiichi nuclear power plant complex, enhancing the accident tolerance of LWRs became a topic of serious discussion. As a result of direction from the U.S. Congress, the U.S. Department of Energy Office of Nuclear Energy (DOE-NE) initiated an Accident Tolerant Fuel (ATF) Development program. The complex multiphysics behavior of LWR nuclear fuel makes defining specific material or design improvements difficult; as such, establishing qualitative attributes is critical to guide the design and development of fuels and cladding with enhanced accident tolerance. This report summarizes a common set of technical evaluation metrics to aid in the optimization and down selection of candidate designs. As used herein, “metrics” describe a set of technical bases by which multiple concepts can be fairly evaluated against a common baseline and against one another. Furthermore, this report describes a proposed technical evaluation methodology that can be applied to assess the ability of each concept to meet performance and safety goals relative to the current UO2 – zirconium alloy system and relative to one another. The resultant ranked evaluation can then inform concept down-selection, such that the most promising accident tolerant fuel design option(s) can continue to be developed for lead test rod or lead test assembly

  3. Simulation information regarding Sandia National Laboratories' Trinity capability improvement metric.

    SciTech Connect

    Agelastos, Anthony Michael; Lin, Paul T.

    2013-10-01

    Sandia National Laboratories, Los Alamos National Laboratory, and Lawrence Livermore National Laboratory each selected a representative simulation code to be used as a performance benchmark for the Trinity Capability Improvement Metric. Sandia selected SIERRA Low Mach Module: Nalu, which is a fluid dynamics code that solves many variable-density, acoustically incompressible problems of interest spanning from laminar to turbulent flow regimes, since it is fairly representative of implicit codes that have been developed under ASC. The simulations for this metric were performed on the Cielo Cray XE6 platform during dedicated application time and the chosen case utilized 131,072 Cielo cores to perform a canonical turbulent open jet simulation within an approximately 9-billion-element unstructured-hexahedral computational mesh. This report will document some of the results from these simulations as well as provide instructions to perform these simulations for comparison.

  4. DOE-HDBK-1122-99; Radiological Control Technician Training

    Office of Environmental Management (EM)

    ... PREFIX  FACTOR  SYMBOL      PREFIX  FACTOR  SYMBOL
        yotta   10^24    Y          deci    10^-1    d
        zetta   10^21    Z          centi   10^-2    c
        exa     10^18    E          milli   10^-3    m
        peta    10^15    P          micro   10^-6    µ
        tera    10^12    T          nano    10^-9    n  ...
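    As a quick illustration of how the prefix factors tabulated above are applied, here is a minimal sketch. The prefix-to-factor mapping is standard SI (a few entries elided in the excerpt, such as giga, are filled in from the SI standard); the `convert` helper is hypothetical, not part of the handbook.

    ```python
    # SI prefix -> multiplication factor, as tabulated in the handbook excerpt
    # (plus a few standard SI entries the excerpt elides).
    SI_PREFIXES = {
        "yotta": 1e24, "zetta": 1e21, "exa": 1e18, "peta": 1e15,
        "tera": 1e12, "giga": 1e9, "mega": 1e6, "kilo": 1e3,
        "deci": 1e-1, "centi": 1e-2, "milli": 1e-3,
        "micro": 1e-6, "nano": 1e-9,
    }

    def convert(value, from_prefix, to_prefix):
        """Re-express a prefixed quantity using a different prefix of the same unit."""
        return value * SI_PREFIXES[from_prefix] / SI_PREFIXES[to_prefix]

    print(convert(2.5, "tera", "giga"))  # 2.5 TW expressed in GW -> 2500.0
    ```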

  5. New Pathways and Metrics for Enhanced, Reversible Hydrogen Storage in Boron-Doped Carbon Nanospaces

    SciTech Connect

    Pfeifer, Peter; Wexler, Carlos; Hawthorne, M. Frederick; Lee, Mark W.; Jalistegi, Satish S.

    2014-08-14

    This project, since its start in 2007—entitled “Networks of boron-doped carbon nanopores for low-pressure reversible hydrogen storage” (2007-10) and “New pathways and metrics for enhanced, reversible hydrogen storage in boron-doped carbon nanospaces” (2010-13)—is in support of the DOE's National Hydrogen Storage Project, as part of the DOE Hydrogen and Fuel Cells Program’s comprehensive efforts to enable the widespread commercialization of hydrogen and fuel cell technologies in diverse sectors of the economy. Hydrogen storage is widely recognized as a critical enabling technology for the successful commercialization and market acceptance of hydrogen powered vehicles. Storing sufficient hydrogen on board a wide range of vehicle platforms, at energy densities comparable to gasoline, without compromising passenger or cargo space, remains an outstanding technical challenge. Of the three main thrust areas in 2007—metal hydrides, chemical hydrogen storage, and sorption-based hydrogen storage—sorption-based storage, i.e., storage of molecular hydrogen by adsorption on high-surface-area materials (carbons, metal-organic frameworks, and other porous organic networks), has emerged as the most promising path toward achieving the 2017 DOE storage targets of 0.055 kg H2/kg system (“5.5 wt%”) and 0.040 kg H2/liter system. The objective of the project is to develop high-surface-area carbon materials that are boron-doped by incorporation of boron into the carbon lattice at the outset, i.e., during the synthesis of the material. The rationale for boron-doping is the prediction that boron atoms in carbon will raise the binding energy of hydrogen from 4-5 kJ/mol on the undoped surface to 10-14 kJ/mol on a doped surface, and accordingly the hydrogen storage capacity of the material. The mechanism for the increase in binding energy is electron donation from H2 to electron-deficient B atoms, in the form of sp2 boron-carbon bonds. Our team is proud to have

  6. Douglas Factors

    Energy.gov [DOE]

    The Merit Systems Protection Board in its landmark decision, Douglas vs. Veterans Administration, 5 MSPR 280, established criteria that supervisors must consider in determining an appropriate penalty to impose for an act of employee misconduct. These twelve factors are commonly referred to as “Douglas Factors” and have been incorporated into the Federal Aviation Administration (FAA) Personnel Management System and various FAA Labor Agreements.

  7. User's Guide to Pre-Processing Data in Universal Translator 2 for the Energy Charting and Metrics Tool (ECAM)

    SciTech Connect

    Taasevigen, Danny J.

    2011-11-30

    This document is a user's guide for the Energy Charting and Metrics Tool to facilitate the examination of energy information from buildings, reducing the time spent analyzing trend and utility meter data. This user guide was generated to help pre-process data with the intention of utilizing the Energy Charting and Metrics (ECAM) tool to improve building operational efficiency. There are numerous occasions when the metered data that is received from the building automation system (BAS) isn't in the right format acceptable for ECAM. This includes, but isn't limited to, cases such as inconsistent time-stamps for the trends (e.g., each trend has its own time-stamp), data with holes (e.g., some time-stamps have data and others are missing data), each point in the BAS is trended and exported into an individual .csv or .txt file, the time-stamp is unrecognizable by ECAM, etc. After reading through this user guide, the user should be able to pre-process all data files and be ready to use this data in ECAM to improve their building operational efficiency.
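    The time-stamp alignment problem described above (each trend carrying its own time-stamp column in its own file) can be sketched with a generic outer join. The point names, column labels, and values below are hypothetical placeholders, not ECAM's actual file format.

    ```python
    import pandas as pd

    # Two BAS trend logs, each with its own time-stamp column (made-up data).
    t1 = pd.DataFrame({"Timestamp": pd.to_datetime(["2011-01-01 00:00", "2011-01-01 00:15"]),
                       "ZoneTemp": [70.1, 70.4]})
    t2 = pd.DataFrame({"Timestamp": pd.to_datetime(["2011-01-01 00:00", "2011-01-01 00:15"]),
                       "FanStatus": [1, 1]})

    # Outer-join on the time-stamp so gaps in either trend become NaN
    # rather than silently dropping rows.
    merged = t1.merge(t2, on="Timestamp", how="outer").sort_values("Timestamp")
    print(merged)
    ```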

  8. Development of Metric for Measuring the Impact of RD&D Funding on GTO's Geothermal Exploration Goals (Presentation)

    SciTech Connect

    Jenne, S.; Young, K. R.; Thorsteinsson, H.

    2013-04-01

    The Department of Energy's Geothermal Technologies Office (GTO) provides RD&D funding for geothermal exploration technologies with the goal of lowering the risks and costs of geothermal development and exploration. In 2012, NREL was tasked with developing a metric to measure the impacts of this RD&D funding on the cost and time required for exploration activities. The development of this metric included collecting cost and time data for exploration techniques, creating a baseline suite of exploration techniques to which future exploration and cost and time improvements could be compared, and developing an online tool for graphically showing potential project impacts (all available at http://en.openei.org/wiki/Gateway:Geothermal). The conference paper describes the methodology used to define the baseline exploration suite of techniques (baseline), as well as the approach that was used to create the cost and time data set that populates the baseline. The resulting product, an online tool for measuring impact, and the aggregated cost and time data are available on the Open EI website for public access (http://en.openei.org).

  9. Knowledge-based prediction of plan quality metrics in intracranial stereotactic radiosurgery

    SciTech Connect

    Shiraishi, Satomi; Moore, Kevin L.; Tan, Jun; Olsen, Lindsey A.

    2015-02-15

    Purpose: The objective of this work was to develop a comprehensive knowledge-based methodology for predicting achievable dose–volume histograms (DVHs) and highly precise DVH-based quality metrics (QMs) in stereotactic radiosurgery/radiotherapy (SRS/SRT) plans. Accurate QM estimation can identify suboptimal treatment plans and provide target optimization objectives to standardize and improve treatment planning. Methods: Correlating observed dose as it relates to the geometric relationship of organs-at-risk (OARs) to planning target volumes (PTVs) yields mathematical models to predict achievable DVHs. In SRS, DVH-based QMs such as brain V{sub 10Gy} (volume receiving 10 Gy or more), gradient measure (GM), and conformity index (CI) are used to evaluate plan quality. This study encompasses 223 linear accelerator-based SRS/SRT treatment plans (SRS plans) using volumetric-modulated arc therapy (VMAT), representing 95% of the institution’s VMAT radiosurgery load from the past four and a half years. Unfiltered models that use all available plans for the model training were built for each category with a stratification scheme based on target and OAR characteristics determined emergently through initial modeling process. Model predictive accuracy is measured by the mean and standard deviation of the difference between clinical and predicted QMs, δQM = QM{sub clin} − QM{sub pred}, and a coefficient of determination, R{sup 2}. For categories with a large number of plans, refined models are constructed by automatic elimination of suspected suboptimal plans from the training set. Using the refined model as a presumed achievable standard, potentially suboptimal plans are identified. Predictions of QM improvement are validated via standardized replanning of 20 suspected suboptimal plans based on dosimetric predictions. The significance of the QM improvement is evaluated using the Wilcoxon signed rank test. Results: The most accurate predictions are obtained when plans are
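    The predictive-accuracy statistics named above (mean and standard deviation of δQM = QM_clin − QM_pred, plus a coefficient of determination) can be sketched as follows. The paired QM values are synthetic placeholders, and R² is computed in the usual residual-sum-of-squares form, which may differ in detail from the authors' exact definition.

    ```python
    import numpy as np

    # Synthetic clinical vs. predicted quality-metric pairs (illustration only).
    qm_clin = np.array([10.2, 11.0, 9.8, 10.5, 10.9])
    qm_pred = np.array([10.0, 11.1, 9.9, 10.4, 10.7])

    # delta-QM: clinical minus predicted, summarized by mean and sample SD.
    delta = qm_clin - qm_pred

    # Coefficient of determination, 1 - SS_res / SS_tot.
    ss_res = np.sum((qm_clin - qm_pred) ** 2)
    ss_tot = np.sum((qm_clin - qm_clin.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot

    print(delta.mean(), delta.std(ddof=1), r2)
    ```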

  10. Impact of ASTM Standard E722 update on radiation damage metrics.

    SciTech Connect

    DePriest, Kendall Russell

    2014-06-01

    The impact of recent changes to the ASTM Standard E722 is investigated. The methodological changes in the production of the displacement kerma factors for silicon have a significant impact on some energy regions of the 1-MeV(Si) equivalent fluence response function. When evaluating the integral over all neutron energies in various spectra important to the SNL electronics testing community, the change in the response results in an increase in the total 1-MeV(Si) equivalent fluence of 2-7%. Response functions have been produced and are available for users of both the NuGET and MCNP codes.
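    A group-wise sketch of the underlying calculation: the neutron spectrum is folded with the damage response function to yield a total 1-MeV(Si) equivalent fluence. Both arrays below are invented placeholders, not actual E722 or spectrum data.

    ```python
    import numpy as np

    # Group fluences (n/cm^2 per energy group) and the response function
    # (1-MeV(Si) equivalents per neutron in each group) -- made-up values.
    group_fluence = np.array([1.0e12, 5.0e11, 2.0e11])
    response      = np.array([0.15, 0.85, 1.9])

    # Fold spectrum with response: sum over groups of fluence * response.
    phi_eq = float(np.dot(group_fluence, response))
    print(f"1-MeV(Si) equivalent fluence: {phi_eq:.3e} n/cm^2")
    ```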

  11. A Comparison of Model Short-Range Forecasts and the ARM Microbase Data Fourth Quarter ARM Science Metric

    SciTech Connect

    Hnilo, J.

    2006-09-19

    For the fourth quarter ARM metric we will make use of new liquid water data that has become available, called the “Microbase” value-added product (referred to as OBS within the text), at three sites: the North Slope of Alaska (NSA), the Tropical West Pacific (TWP), and the Southern Great Plains (SGP), and compare these observations to model forecast data. Two time periods will be analyzed: March 2000 for the SGP, and October 2004 for both TWP and NSA. The Microbase data have been averaged to 35 pressure levels (e.g., from 1000 hPa to 100 hPa at 25 hPa increments) and time averaged to 3-hourly data for direct comparison to our model output.
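    The 3-hourly time averaging described above can be sketched with pandas; the synthetic hourly series below stands in for the Microbase retrievals.

    ```python
    import numpy as np
    import pandas as pd

    # Synthetic hourly series (one day) standing in for a retrieved quantity.
    rng = pd.date_range("2004-10-01", periods=24, freq="h")
    series = pd.Series(np.arange(24.0), index=rng)

    # Average into 3-hourly bins for direct comparison to model output.
    three_hourly = series.resample("3h").mean()
    print(three_hourly)
    ```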

  12. DOE Safety Metrics Indicator Program (SMIP) Fiscal Year 2000 Annual Report of Packaging- and Transportation-related Occurrences

    SciTech Connect

    Dickerson, L.S.

    2001-07-26

    The Oak Ridge National Laboratory (ORNL) has been charged by the DOE National Transportation Program (NTP) with the responsibility of retrieving reports and information pertaining to packaging and transportation (P&T) incidents from the centralized Occurrence Reporting and Processing System (ORPS) database. These selected reports have been analyzed for trends, impact on P&T operations and safety concerns, and lessons learned (LL) in P&T operations. This task is designed not only to keep the NTP aware of what is occurring at DOE sites on a periodic basis, but also to highlight potential P&T problems that may need management attention and allow dissemination of LL to DOE Operations Offices, with the subsequent flow of information to contractors. The Safety Metrics Indicator Program (SMIP) was established by the NTP in fiscal year (FY) 1998 as an initiative to develop a methodology for reporting occurrences with the appropriate metrics to show rates and trends. One of its chief goals has been to augment historical reporting of occurrence-based information and present more meaningful statistics for comparison of occurrences. To this end, the SMIP established a severity weighting system for the classification of the occurrences, which would allow normalization of the data and provide a basis for trending analyses. The process for application of this methodology is documented in the September 1999 report DOE Packaging and Transportation Measurement Methodology for the Safety Metrics Indicator Program (SMIP). This annual report contains information on those P&T-related occurrences reported to the ORPS during the period from October 1, 1999, through September 30, 2000. Only those incidents that occur in preparation for transport, during transport, and during unloading of hazardous material are considered as packaging- or transportation-related occurrences. Other incidents with P&T significance, but not involving hazardous material (such as vehicle accidents or empty

  13. On use of CO{sub 2} chemiluminescence for combustion metrics in natural gas fired reciprocating engines.

    SciTech Connect

    Gupta, S. B.; Bihari, B.; Biruduganti, M.; Sekar, R.; Zigan, J.

    2011-01-01

    Flame chemiluminescence is widely acknowledged to be an indicator of heat release rate in premixed turbulent flames that are representative of gas turbine combustion. Though heat release rate is an important metric for evaluating combustion strategies in reciprocating engine systems, its correlation with flame chemiluminescence is not well studied. To address this gap an experimental study was carried out in a single-cylinder natural gas fired reciprocating engine that could simulate turbocharged conditions with exhaust gas recirculation. Crank angle resolved spectra (266-795 nm) of flame luminosity were measured for various operational conditions by varying the ignition timing for MBT conditions and by holding the speed at 1800 rpm and Brake Mean Effective Pressure (BMEP) at 12 bar. The effect of dilution on CO*{sub 2} chemiluminescence intensities was studied by varying the global equivalence ratio (0.6-1.0) and by varying the exhaust gas recirculation rate. It was attempted to relate the measured chemiluminescence intensities to thermodynamic metrics of importance to engine research -- in-cylinder bulk gas temperature and heat release rate (HRR) calculated from measured cylinder pressure signals. The peak of the measured CO*{sub 2} chemiluminescence intensities coincided with peak pressures within {+-}2 CAD for all test conditions. For each combustion cycle, the peaks of heat release rate, spectral intensity and temperature occurred in that sequence, well separated temporally. The peak heat release rates preceded the peak chemiluminescent emissions by 3.8-9.5 CAD, whereas the peak temperatures trailed by 5.8-15.6 CAD. Such a temporal separation precludes correlations on a crank-angle resolved basis. However, the peak cycle heat release rates and to a lesser extent the peak cycle temperatures correlated well with the chemiluminescent emission from CO*{sub 2}. Such observations point towards the potential use of flame chemiluminescence to monitor peak bulk gas

  14. Relating fish health and reproductive metrics to contaminant bioaccumulation at the Tennessee Valley Authority Kingston coal ash spill site

    DOE PAGES [OSTI]

    Pracheil, Brenda M.; Marshall Adams, S.; Bevelhimer, Mark S.; Fortner, Allison M.; Greeley, Mark S.; Murphy, Cheryl A.; Mathews, Teresa J.; Peterson, Mark J.

    2016-05-06

    A 4.1 million m3 release of coal ash into the Emory and Clinch rivers in December 2008 at Tennessee Valley Authority's Kingston Fossil Plant has prompted a long-term, large-scale biological monitoring effort to determine if there are chronic effects of this spill on biota. Of concern in this spill were arsenic (As) and selenium (Se), heavy metal constituents of coal ash that can be toxic to fish and wildlife, and also mercury (Hg): a legacy contaminant that can interact with Se in organisms. We used fish filet bioaccumulation data from Bluegill Lepomis macrochirus, Redear Lepomis microlophus, Largemouth Bass Micropterus salmoides and Channel Catfish Ictalurus punctatus and metrics of fish health including fish condition indices, blood chemistry parameters and liver histopathology data collected from 2009-2013 to determine whether tissue heavy metal burdens relate 1) to each other, 2) to metrics of fish health (e.g., blood chemistry characteristics and liver histopathology) and condition, and 3) whether relationships between fish health characteristics and heavy metals are related to site and ash-exposure. We found that burdens of Se and As are generally related to each other between tissues, but burdens of Hg between tissues are not generally positively associated. Taking the analyses together, there appear to be reductions in growth and sublethal liver and kidney dysfunction in Bluegill and Largemouth Bass, as indicated by blood chemistry parameters (elevated blood protein, glucose, phosphorous, blood urea nitrogen and creatinine in ash-affected sites) and related to concentrations of As and Se. Seeing sub-lethal effects in these species of fish is interesting because Redear had the highest filet burdens of Se, but did not have biomarkers indicating disease or dysfunction. We conclude our study by highlighting the complexities inherent in multimetric fish health data and the need for continued monitoring to further untangle contaminant and fish health

  15. DOE JGI Quality Metrics; Approaches to Scaling and Improving Metagenome Assembly (Metagenomics Informatics Challenges Workshop: 10K Genomes at a Time)

    ScienceCinema

    Copeland, Alex [DOE JGI]; Brown, C Titus [Michigan State University

    2013-01-22

    DOE JGI's Alex Copeland on "DOE JGI Quality Metrics" and Michigan State University's C. Titus Brown on "Approaches to Scaling and Improving Metagenome Assembly" at the Metagenomics Informatics Challenges Workshop held at the DOE JGI on October 12-13, 2011.

  16. SU-E-T-379: Concave Approximations of Target Volume Dose Metrics for Intensity- Modulated Radiotherapy Treatment Planning

    SciTech Connect

    Xie, Y; Chen, Y; Wickerhauser, M; Deasy, J

    2014-06-01

    Purpose: The widely used treatment plan metric Dx (minimum dose to the hottest x% by volume of the target volume) is simple to interpret and use, but is computationally poorly behaved (non-convex), which impedes its use in computationally efficient intensity-modulated radiotherapy (IMRT) treatment planning algorithms. We therefore searched for surrogate metrics that are concave, computationally efficient, and accurately correlated to Dx values in IMRT treatment plans. Methods: To find concave surrogates of D95 (and, more generally, Dx values with variable x values) we tested equations containing one or two generalized equivalent uniform dose (gEUD) functions. Fits were obtained by varying gEUD a parameter values, as well as the linear equation coefficients. Fitting was performed using a dataset of dose-volume histograms from 498 de-identified head and neck IMRT treatment plans. Fit characteristics were tested using a cross-validation process. Reported root-mean-square error values were averaged over the cross-validation shuffles. Results: As expected, the two-gEUD formula provided a superior fit, compared to the single-gEUD formula. The best approximation uses two gEUD terms: 16.25 x gEUD[a=0.45] - 15.30 x gEUD[a=1.75] - 0.69. The average root-mean-square error on repeated (70/30) cross-validation was 0.94 Gy. In addition, a formula was found that reasonably approximates Dx for x between 80% and 96%. Conclusion: A simple concave function using two gEUD terms was found that correlates well with PTV D95s for these head and neck treatment plans. More generally, a formula was found that represents well the Dx for x values from 80% to 96%, thus providing a computationally efficient formula for use in treatment planning optimization. The formula may need to be adjusted for other institutions with different treatment planning protocols. We conclude that the strategy of replacing Dx values with gEUD-based formulas is promising.
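    A sketch of the gEUD building block and the two-term surrogate. The dose bins and fractional volumes are hypothetical, and the sign placement in the surrogate (lost in the extracted abstract) is assumed to be a difference of the two terms minus a constant.

    ```python
    import numpy as np

    def gEUD(doses, volumes, a):
        """Generalized equivalent uniform dose for a differential DVH.

        doses   : dose in each DVH bin (Gy)
        volumes : fractional volume in each bin (normalized internally)
        a       : gEUD volume-effect parameter
        """
        v = np.asarray(volumes, dtype=float)
        v = v / v.sum()
        return float(np.sum(v * np.asarray(doses, dtype=float) ** a) ** (1.0 / a))

    # Hypothetical PTV dose bins (Gy) and fractional volumes, illustration only.
    doses = [17.0, 18.0, 19.0, 20.0]
    vols  = [0.05, 0.15, 0.60, 0.20]

    # Two-gEUD surrogate from the abstract (sign placement reconstructed).
    d95_approx = 16.25 * gEUD(doses, vols, 0.45) - 15.30 * gEUD(doses, vols, 1.75) - 0.69
    print(d95_approx)
    ```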

  17. Geothermal Resource Reporting Metric (GRRM) Developed for the U.S. Department of Energy's Geothermal Technologies Office

    SciTech Connect

    Young, Katherine R.; Wall, Anna M.; Dobson, Patrick F.

    2015-09-02

    This paper reviews a methodology being developed for reporting geothermal resources and project progress. The goal is to provide the U.S. Department of Energy's (DOE) Geothermal Technologies Office (GTO) with a consistent and comprehensible means of evaluating the impacts of its funding programs. This framework will allow the GTO to assess the effectiveness of research, development, and deployment (RD&D) funding, prioritize funding requests, and demonstrate the value of RD&D programs to the U.S. Congress and the public. Standards and reporting codes used in other countries and energy sectors provide guidance to develop the relevant geothermal methodology, but industry feedback and our analysis suggest that the existing models have drawbacks that should be addressed. In order to formulate a comprehensive metric for use by the GTO, we analyzed existing resource assessments and reporting methodologies for the geothermal, mining, and oil and gas industries, and sought input from industry, investors, academia, national labs, and other government agencies. Using this background research as a guide, we describe a methodology for evaluating and reporting on GTO funding according to resource grade (geological, technical and socio-economic) and project progress. This methodology would allow GTO to target funding, measure impact by monitoring the progression of projects, or assess geological potential of targeted areas for development.

  18. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

    SciTech Connect

    Harper, Bryan; Thomas, Dennis G.; Chikkagoudar, Satish; Baker, Nathan A.; Tang, Kaizhi; Heredia-Langner, Alejandro; Lins, Roberto D.; Harper, Stacey

    2015-06-04

    The integration of rapid assays, large data sets, informatics and modeling can overcome current barriers in understanding nanomaterial structure-toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality, were established at realistic exposure levels and used to develop a predictive model of nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a predictive model of gold nanoparticle toxicity to embryonic zebrafish. In addition, our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. These findings reveal the need to expeditiously increase the availability of quantitative measures of nanomaterial hazard and broaden the sharing of that data and knowledge to support predictive modeling. In addition, research should continue to focus on methodologies for developing predictive models of nanomaterial hazard based on sub-lethal responses to low dose exposures.

  19. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

    DOE PAGES [OSTI]

    Harper, Bryan; Thomas, Dennis G.; Chikkagoudar, Satish; Baker, Nathan A.; Tang, Kaizhi; Heredia-Langner, Alejandro; Lins, Roberto D.; Harper, Stacey

    2015-06-04

    The integration of rapid assays, large data sets, informatics and modeling can overcome current barriers in understanding nanomaterial structure-toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality, were established at realistic exposure levels and used to develop a predictive model of nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a predictive model of gold nanoparticle toxicity to embryonic zebrafish. In addition, our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. These findings reveal the need to expeditiously increase the availability of quantitative measures of nanomaterial hazard and broaden the sharing of that data and knowledge to support predictive modeling. In addition, research should continue to focus on methodologies for developing predictive models of nanomaterial hazard based on sub-lethal responses to low dose exposures.

  20. How Does Your Data Center Measure Up? Energy Efficiency Metrics and Benchmarks for Data Center Infrastructure Systems

    SciTech Connect

    Mathew, Paul; Greenberg, Steve; Ganguly, Srirupa; Sartor, Dale; Tschudi, William

    2009-04-01

    Data centers are among the most energy intensive types of facilities, and they are growing dramatically in terms of size and intensity [EPA 2007]. As a result, in the last few years there has been increasing interest from stakeholders - ranging from data center managers to policy makers - to improve the energy efficiency of data centers, and there are several industry and government organizations that have developed tools, guidelines, and training programs. There are many opportunities to reduce energy use in data centers and benchmarking studies reveal a wide range of efficiency practices. Data center operators may not be aware of how efficient their facility may be relative to their peers, even for the same levels of service. Benchmarking is an effective way to compare one facility to another, and also to track the performance of a given facility over time. Toward that end, this article presents the key metrics that facility managers can use to assess, track, and manage the efficiency of the infrastructure systems in data centers, and thereby identify potential efficiency actions. Most of the benchmarking data presented in this article are drawn from the data center benchmarking database at Lawrence Berkeley National Laboratory (LBNL). The database was developed from studies commissioned by the California Energy Commission, Pacific Gas and Electric Co., the U.S. Department of Energy and the New York State Energy Research and Development Authority.
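    One widely used infrastructure-efficiency metric of the kind discussed above is Power Usage Effectiveness (PUE), the ratio of total facility energy to IT equipment energy; the abstract does not name it explicitly, so this is an assumed example with invented numbers.

    ```python
    def pue(total_facility_kwh, it_equipment_kwh):
        """Power Usage Effectiveness: total facility energy / IT equipment energy.

        1.0 is the theoretical ideal (all energy reaches IT equipment);
        higher values mean more overhead in cooling, power delivery, etc.
        """
        return total_facility_kwh / it_equipment_kwh

    # Hypothetical monthly energy totals for one facility.
    print(pue(1800.0, 1000.0))  # 1.8
    ```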

  1. Reducing Power Factor Cost

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    Low power factor is expensive and inefficient. Many utility companies charge you an additional fee if your power factor is less than 0.95. Low power factor also reduces your electrical system's distribution capacity by increasing current flow and causing voltage drops. This fact sheet describes power factor and explains how you can improve your power factor to reduce electric bills and enhance your electrical system's capacity. REDUCING POWER FACTOR COST To understand power factor, visualize a
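    The relationship the fact sheet builds on can be sketched numerically: power factor is the ratio of real power to apparent power, and the capacitor bank needed to reach a 0.95 target follows from the power triangle. The 400 kW / 500 kVA load below is an invented example, not from the fact sheet.

    ```python
    import math

    def power_factor(real_kw, apparent_kva):
        """Power factor = real power (kW) / apparent power (kVA)."""
        return real_kw / apparent_kva

    def correction_kvar(real_kw, pf_now, pf_target=0.95):
        """Capacitive kVAR needed to raise power factor to the target.

        From the power triangle: kVAR = kW * (tan(acos(pf_now)) - tan(acos(pf_target))).
        """
        return real_kw * (math.tan(math.acos(pf_now)) - math.tan(math.acos(pf_target)))

    pf = power_factor(400.0, 500.0)    # 0.80 lagging (hypothetical load)
    kvar = correction_kvar(400.0, pf)  # capacitor bank size to reach 0.95
    print(f"PF = {pf:.2f}, correction = {kvar:.1f} kVAR")
    ```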

  2. FY 2015 METRIC SUMMARY

    Energy.gov [DOE]

    The Root Cause Analysis report identifies the key elements necessary to make the meaningful changes required to consistently deliver projects within cost and schedule performance parameters.

  3. ARM - 2008 Performance Metrics

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

    series of retrieved cloud, aerosol, and dust properties, based on results from the ARM ...

  4. Fire Protection Program Metrics

    Energy.gov [DOE]

    Presenter: Perry E. D’Antonio, P.E., Acting Sr. Manager, Fire Protection - Sandia National Laboratories

  5. Oil Security Metrics Model

    SciTech Connect

    Greene, David L.; Leiby, Paul N.

    2005-03-06

    A presentation to the IWG GPRA USDOE, March 6, 2005, Washington, DC. OSMM estimates oil security benefits of changes in the U.S. oil market.

  6. ARM - 2009 Performance Metrics

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

    for climate data products and report the Barrow radiation time series data set. ... (PDF). The Barrow radiation time series data set was developed and is available at the ...

  7. ASR - 2011 Performance Metrics

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

    climate modeling within BER CESD. The goal of the climate modeling program is the development of climate models that include natural and human systems, which will project...

  8. Science as Knowledge, Practice, and Map Making: The Challenge of Defining Metrics for Evaluating and Improving DOE-Funded Basic Experimental Science

    SciTech Connect

    Bodnarczuk, M.

    1993-03-01

    Industrial R&D laboratories have been surprisingly successful in developing performance objectives and metrics that convincingly show that planning, management, and improvement techniques can be value-added to the actual output of R&D organizations. In this paper, I will discuss the more difficult case of developing analogous constructs for DOE-funded non-nuclear, non-weapons basic research, or as I will refer to it - basic experimental science. Unlike most industrial R&D or the bulk of applied science performed at the National Renewable Energy Laboratory (NREL), the purpose of basic experimental science is producing new knowledge (usually published in professional journals) that has no immediate application to the first link (the R) of a planned R&D chain. Consequently, performance objectives and metrics are far more difficult to define. My claim is that if one can successfully define metrics for evaluating and improving DOE-funded basic experimental science (which is the most difficult case), then defining such constructs for DOE-funded applied science should be much less problematic. With the publication of the DOE Standard - Implementation Guide for Quality Assurance Programs for Basic and Applied Research (DOE-ER-STD-6001-92) and the development of a conceptual framework for integrating all the DOE orders, we need to move aggressively toward the threefold next phase: (1) focusing the management elements found in DOE-ER-STD-6001-92 on the main output of national laboratories - the experimental science itself; (2) developing clearer definitions of basic experimental science as practice not just knowledge; and (3) understanding the relationship between the metrics that scientists use for evaluating the performance of DOE-funded basic experimental science, the management elements of DOE-ER-STD-6001-92, and the notion of continuous improvement.

  9. Land and Water Use, CO2 Emissions, and Worker Radiological Exposure Factors for the Nuclear Fuel Cycle

    SciTech Connect

    Brett W Carlsen; Brent W Dixon; Urairisa Pathanapirom; Eric Schneider; Bethany L. Smith; Timothy M. Ault; Allen G. Croff; Steven L. Krahn

    2013-08-01

    The Department of Energy Office of Nuclear Energy’s Fuel Cycle Technologies program is preparing to evaluate several proposed nuclear fuel cycle options to help guide and prioritize Fuel Cycle Technology research and development. Metrics are being developed to assess performance against nine evaluation criteria that will be used to assess relevant impacts resulting from all phases of the fuel cycle. This report focuses on four specific environmental metrics: land use, water use, CO2 emissions, and radiological dose to workers. Impacts associated with the processes in the front-end of the nuclear fuel cycle, mining through enrichment and deconversion of DUF6, are summarized from FCRD-FCO-2012-000124, Revision 1. Impact estimates are developed within this report for the remaining phases of the nuclear fuel cycle. These phases include fuel fabrication, reactor construction and operations, fuel reprocessing, and storage, transport, and disposal of associated used fuel and radioactive wastes. Impact estimates for each of the phases of the nuclear fuel cycle are given as impact factors normalized per unit process throughput or output. These impact factors can then be re-scaled against the appropriate mass flows to provide estimates for a wide range of potential fuel cycles. A companion report, FCRD-FCO-2013-000213, applies the impact factors to estimate and provide a comparative evaluation of 40 fuel cycles under consideration relative to these four environmental metrics.
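    The normalization scheme described above lends itself to a simple calculation: per-unit impact factors are multiplied by a candidate fuel cycle's mass flows and summed over phases. A minimal sketch; the phase names and factor values below are hypothetical illustrations, not values from the report:

    ```python
    def total_impact(impact_factors, mass_flows):
        """Sum per-phase impacts: (impact per unit throughput) * (phase mass flow)."""
        return sum(impact_factors[phase] * mass_flows[phase] for phase in mass_flows)

    # hypothetical land-use factors (hectares per tHM) and annual flows (tHM/yr)
    land_use = {"fabrication": 0.02, "reactor": 0.5, "reprocessing": 0.1}
    flows = {"fabrication": 100.0, "reactor": 100.0, "reprocessing": 80.0}
    annual_hectares = total_impact(land_use, flows)
    ```

    The same function applies to any of the four metrics; only the factor table changes.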

  10. Evaluating IMRT and VMAT dose accuracy: Practical examples of failure to detect systematic errors when applying a commonly used metric and action levels

    SciTech Connect

    Nelms, Benjamin E.; Chan, Maria F.; Jarry, Geneviève; Lemire, Matthieu; Lowden, John; Hampton, Carnell

    2013-11-15

    Purpose: This study (1) examines a variety of real-world cases where systematic errors were not detected by widely accepted methods for IMRT/VMAT dosimetric accuracy evaluation, and (2) drills down to identify failure modes and their corresponding means for detection, diagnosis, and mitigation. The primary goal of detailing these case studies is to explore different, more sensitive methods and metrics that could be used more effectively for evaluating accuracy of dose algorithms, delivery systems, and QA devices. Methods: The authors present seven real-world case studies representing a variety of combinations of the treatment planning system (TPS), linac, delivery modality, and systematic error type. These case studies are typical of what might be used as part of an IMRT or VMAT commissioning test suite, varying in complexity. Each case study is analyzed according to TG-119 instructions for gamma passing rates and action levels for per-beam and/or composite plan dosimetric QA. Then, each case study is analyzed in-depth with advanced diagnostic methods (dose profile examination, EPID-based measurements, dose difference pattern analysis, 3D measurement-guided dose reconstruction, and dose grid inspection) and more sensitive metrics (2% local normalization/2 mm DTA and estimated DVH comparisons). Results: For these case studies, the conventional 3%/3 mm gamma passing rates exceeded 99% for IMRT per-beam analyses and ranged from 93.9% to 100% for composite plan dose analysis, well above the TG-119 action levels of 90% and 88%, respectively. However, all cases had systematic errors that were detected only by using advanced diagnostic techniques and more sensitive metrics. The systematic errors caused variable but noteworthy impact, including estimated target dose coverage loss of up to 5.5% and local dose deviations up to 31.5%. Types of errors included TPS model settings, algorithm limitations, and modeling and alignment of QA phantoms in the TPS. Most of the errors were …
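    The 2% local normalization/2 mm DTA criterion named above can be illustrated with a toy one-dimensional version of the standard gamma-index calculation. This is a generic sketch, not the authors' analysis code, and the array contents below are hypothetical:

    ```python
    import numpy as np

    def gamma_1d(ref_pos, ref_dose, meas_pos, meas_dose, dta_mm=2.0, dose_pct=2.0):
        """Toy 1D gamma index with *local* dose normalization (default 2%/2 mm).

        For each measured point, gamma is the minimum over all reference points of
        sqrt((distance/DTA)^2 + (dose difference / local tolerance)^2).
        """
        ref_pos = np.asarray(ref_pos, dtype=float)
        ref_dose = np.asarray(ref_dose, dtype=float)
        tol = np.maximum((dose_pct / 100.0) * ref_dose, 1e-9)  # local normalization
        gammas = np.empty(len(meas_pos))
        for i, (x_m, d_m) in enumerate(zip(meas_pos, meas_dose)):
            dist = (x_m - ref_pos) / dta_mm
            dd = (d_m - ref_dose) / tol
            gammas[i] = np.sqrt((dist**2 + dd**2).min())
        return gammas

    # passing rate at a given criterion = fraction of points with gamma <= 1
    ```

    Tightening `dose_pct` and `dta_mm`, and normalizing to the local rather than the global maximum dose, is what makes this metric more sensitive than the conventional 3%/3 mm global analysis.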

  11. FY 2009 Annual Report of Joule Software Metric SC GG 3.1/2.5.2, Improve Computational Science Capabilities

    SciTech Connect

    Kothe, Douglas B; Roche, Kenneth J; Kendall, Ricky A

    2010-01-01

    The Joule Software Metric for Computational Effectiveness is established by Public Authorizations PL 95-91, Department of Energy Organization Act, and PL 103-62, Government Performance and Results Act. The U.S. Office of Management and Budget (OMB) oversees the preparation and administration of the President's budget; evaluates the effectiveness of agency programs, policies, and procedures; assesses competing funding demands across agencies; and sets the funding priorities for the federal government. The OMB has the power of audit and exercises this right annually for each federal agency. According to the Government Performance and Results Act of 1993 (GPRA), federal agencies are required to develop three planning and performance documents: (1) Strategic Plan: a broad, 3-year outlook; (2) Annual Performance Plan: a focused, 1-year outlook of annual goals and objectives that is reflected in the annual budget request (What results can the agency deliver as part of its public funding?); and (3) Performance and Accountability Report: an annual report that details the previous fiscal year performance (What results did the agency produce in return for its public funding?). OMB uses its Performance Assessment Rating Tool (PART) to perform evaluations. PART has seven worksheets for seven types of agency functions. The function of Research and Development (R&D) programs is included. R&D programs are assessed on the following criteria: Does the R&D program perform a clear role? Has the program set valid long-term and annual goals? Is the program well managed? Is the program achieving the results set forth in its GPRA documents? In Fiscal Year (FY) 2003, the Department of Energy Office of Science (DOE SC-1) worked directly with OMB to come to a consensus on an appropriate set of performance measures consistent with PART requirements. The scientific performance expectations of these requirements reach the scope of work conducted at the DOE national laboratories. The Joule system …

  12. THE POSSIBLE ROLE OF CORONAL STREAMERS AS MAGNETICALLY CLOSED STRUCTURES IN SHOCK-INDUCED ENERGETIC ELECTRONS AND METRIC TYPE II RADIO BURSTS

    SciTech Connect

    Kong, Xiangliang; Chen, Yao; Feng, Shiwei; Wang, Bing; Du, Guohui; Guo, Fan; Li, Gang

    2015-01-10

    Two solar type II radio bursts, separated by ~24 hr in time, are examined together. Both events are associated with coronal mass ejections (CMEs) erupting from the same active region (NOAA 11176) beneath a well-observed helmet streamer. We find that the type II emissions in both events ended once the CME/shock fronts passed the white-light streamer tip, which is presumably the magnetic cusp of the streamer. This leads us to conjecture that the closed magnetic arcades of the streamer may play a role in electron acceleration and type II excitation at coronal shocks. To examine such a conjecture, we conduct a test-particle simulation for electron dynamics within a large-scale partially closed streamer magnetic configuration swept by a coronal shock. We find that the closed field lines play the role of an electron trap via which the electrons are sent back to the shock front multiple times and therefore accelerated to high energies by the shock. Electrons with an initial energy of 300 eV can be accelerated to tens of keV, concentrating at the loop apex close to the shock front with a counter-streaming distribution at most locations. These electrons are energetic enough to excite Langmuir waves and radio bursts. Considering the fact that most solar eruptions originate from closed field regions, we suggest that the scenario may be important for the generation of more metric type IIs. This study also provides an explanation of the general ending frequencies of metric type IIs at or above 20-30 MHz and the disconnection issue between metric and interplanetary type IIs.

  13. SU-E-I-28: Introduction and Investigation of Effective Diameter Ratios as a New Patient Size Metric for Use in CT

    SciTech Connect

    Lamoureux, R; Sinclair, L; Mench, A; Lipnharski, I; Carranza, C; Bidari, S; Cormack, B; Rill, L; Arreola, M

    2015-06-15

    Purpose: To introduce and investigate effective diameter ratios as a new patient metric for use in computed tomography protocol selection as a supplement to patient-specific size parameter data. Methods: The metrics of outer effective diameter and inner effective diameter were measured for 7 post-mortem subjects scanned with a standardized chest/abdomen/pelvis (CAP) protocol on a 320-slice MDCT scanner. The outer effective diameter was calculated by obtaining the anterior/posterior and lateral dimensions of the imaged anatomy at the middle of the scan range using Effective Diameter = SQRT(AP height * Lat width). The inner effective diameter was calculated with the same equation using the AP and Lat dimensions of the anatomy excluding the adipose tissue. The ratio of outer to inner effective diameter was calculated for each subject. A relationship to BMI, weight, and CTDI conversion coefficients was investigated. Results: For the largest subject, with BMI of 43.85 kg/m2 and weight of 255 lbs, the diameter ratio was calculated as 1.33. For the second largest subject, with BMI of 33.5 kg/m2 and weight of 192.4 lbs, the diameter ratio was measured as 1.43, indicating a larger percentage of adipose tissue in the second largest subject's anatomical composition. The smallest subject, at BMI of 17.4 kg/m2 and weight of 86 lbs, had the same diameter ratio (1.11) as a subject with BMI of 24.2 kg/m2 and weight of 136 lbs, indicating a similar tissue composition. Conclusion: The diameter ratio contains information about anatomical composition that the BMI and weight alone do not. The utility of this metric is still being examined but could prove useful for determining MDCT techniques and for giving a more in-depth detail of the composition of a patient's body habitus.
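    The abstract's formula is straightforward to compute. A minimal sketch, with hypothetical AP/lateral dimensions in cm (not the study's subjects):

    ```python
    import math

    def effective_diameter(ap_cm, lat_cm):
        """Effective diameter = sqrt(AP height * lateral width), per the abstract."""
        return math.sqrt(ap_cm * lat_cm)

    def diameter_ratio(outer_ap, outer_lat, inner_ap, inner_lat):
        """Outer-to-inner effective diameter ratio; larger values indicate a
        greater share of adipose tissue in the imaged anatomy."""
        return (effective_diameter(outer_ap, outer_lat)
                / effective_diameter(inner_ap, inner_lat))

    # e.g. 30x40 cm outer anatomy with a 24x30 cm non-adipose core
    ratio = diameter_ratio(30.0, 40.0, 24.0, 30.0)  # ~1.29
    ```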

  14. Dilaton field minimally coupled to 2+1 gravity; uniqueness of the static Chan-Mann black hole and new dilaton stationary metrics

    SciTech Connect

    García-Diaz, Alberto A.

    2014-01-14

    Using the Schwarzschild coordinate frame for a static cyclic symmetric metric in 2+1 gravity coupled minimally to a dilaton logarithmically depending on the radial coordinate in the presence of an exponential potential, by solving first order linear Einstein equations, the general solution is derived and it is identified with the Chan–Mann dilaton solution. In these coordinates, a new stationary dilaton solution is obtained; it does not allow for a de Sitter–Anti-de Sitter limit at spatial infinity, where its structural functions increase indefinitely. On the other hand, it is horizonless and allows for a naked singularity at the origin of coordinates; moreover, one can identify at a large radial coordinate a (quasi-local) mass parameter and in the whole space a constant angular momentum. Via a general SL(2,R) transformation, applied on the static cyclic symmetric metric, a family of stationary dilaton solutions has been generated. A particular SL(2,R) transformation is identified, which gives rise to the rotating Chan–Mann dilaton solution. All the exhibited solutions have been characterized by their quasi-local energy, mass, and momentum through their series expansions at spatial infinity. The algebraic structure of the Ricci, energy-momentum, and Cotton tensors is given explicitly.

  15. CT head-scan dosimetry in an anthropomorphic phantom and associated measurement of ACR accreditation-phantom imaging metrics under clinically representative scan conditions

    SciTech Connect

    Brunner, Claudia C.; Stern, Stanley H.; Chakrabarti, Kish; Minniti, Ronaldo; Parry, Marie I.; Skopec, Marlene

    2013-08-15

    Purpose: To measure radiation absorbed dose and its distribution in an anthropomorphic head phantom under clinically representative scan conditions in three widely used computed tomography (CT) scanners, and to relate those dose values to metrics such as high-contrast resolution, noise, and contrast-to-noise ratio (CNR) in the American College of Radiology CT accreditation phantom. Methods: By inserting optically stimulated luminescence dosimeters (OSLDs) in the head of an anthropomorphic phantom specially developed for CT dosimetry (University of Florida, Gainesville), we measured dose with three commonly used scanners (GE Discovery CT750 HD, Siemens Definition, Philips Brilliance 64) at two different clinical sites (Walter Reed National Military Medical Center, National Institutes of Health). The scanners were set to operate with the same data-acquisition and image-reconstruction protocols as used clinically for typical head scans, respective of the practices of each facility for each scanner. We also analyzed images of the ACR CT accreditation phantom with the corresponding protocols. While the Siemens Definition and the Philips Brilliance protocols utilized only conventional, filtered back-projection (FBP) image-reconstruction methods, the GE Discovery also employed its particular version of an adaptive statistical iterative reconstruction (ASIR) algorithm that can be blended in desired proportions with the FBP algorithm. We did an objective image-metrics analysis evaluating the modulation transfer function (MTF), noise power spectrum (NPS), and CNR for images reconstructed with FBP. For images reconstructed with ASIR, we only analyzed the CNR, since MTF and NPS results are expected to depend on the object for iterative reconstruction algorithms. Results: The OSLD measurements showed that the Siemens Definition and the Philips Brilliance scanners (located at two different clinical facilities) yield average absorbed doses in tissue of 42.6 and 43.1 m…
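    Of the image metrics named above, CNR has the simplest form. A minimal sketch of one common definition, |mean(ROI) − mean(background)| / std(background); the exact ROI placement and the CNR definition used for the ACR phantom analysis may differ:

    ```python
    import numpy as np

    def cnr(roi, background):
        """Contrast-to-noise ratio: |mean(ROI) - mean(background)| / std(background)."""
        roi = np.asarray(roi, dtype=float)
        bg = np.asarray(background, dtype=float)
        return abs(roi.mean() - bg.mean()) / bg.std()

    # hypothetical pixel values (HU) from a contrast insert and the phantom background
    example = cnr([110, 112, 108, 110], [100, 102, 98, 100])
    ```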

  16. The MX Factor

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

    The MX Factor, National Security Science, July 1, 2015. Data from atmospheric test films persuaded Department of Defense planners not to deploy the MX missile system in the Great Basin Desert. A Peacekeeper test missile re-entering the atmosphere at the Kwajalein Atoll in the Marshall Islands: this long-exposure photo shows the paths of the multiple re-entry vehicles deployed by the missile. (Photo: U.S. Army)

  17. FGF growth factor analogs

    DOEpatents

    Zamora, Paul O.; Pena, Louis A.; Lin, Xinhua; Takahashi, Kazuyuki

    2012-07-24

    The present invention provides a fibroblast growth factor heparin-binding analog of the formula: ##STR00001## where R1, R2, R3, R4, R5, X, Y and Z are as defined; pharmaceutical compositions, coating compositions, and medical devices including the fibroblast growth factor heparin-binding analog of the foregoing formula; and methods and uses thereof.

  18. Multi-factor authentication

    DOEpatents

    Hamlet, Jason R; Pierson, Lyndon G

    2014-10-21

    Detection and deterrence of spoofing of user authentication may be achieved by including a cryptographic fingerprint unit within a hardware device for authenticating a user of the hardware device. The cryptographic fingerprint unit includes an internal physically unclonable function ("PUF") circuit disposed in or on the hardware device, which generates a PUF value. Combining logic is coupled to receive the PUF value and combines it with one or more other authentication factors to generate a multi-factor authentication value. A key generator is coupled to generate a private key and a public key based on the multi-factor authentication value, while a decryptor is coupled to receive an authentication challenge posed to the hardware device and encrypted with the public key, and to output a response to the authentication challenge decrypted with the private key.
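    The combining step can be illustrated in software, although the patent describes hardware logic. A sketch that models "combining" as an HMAC-SHA256 chain keyed on the PUF response; the chaining choice and all input values are illustrative assumptions, not the patented circuit:

    ```python
    import hashlib
    import hmac

    def multi_factor_value(puf_response: bytes, *factors: bytes) -> bytes:
        """Chain a PUF response with additional authentication factors.

        Each factor is folded in via HMAC-SHA256, so the result depends on the
        device-bound PUF value and on every supplied factor. The 32-byte output
        could then seed deterministic key-pair generation, as in the abstract.
        """
        value = puf_response
        for factor in factors:
            value = hmac.new(value, factor, hashlib.sha256).digest()
        return value

    # hypothetical inputs: a device PUF response, a password hash, and a token code
    puf = bytes.fromhex("1337" * 16)
    pw_hash = hashlib.sha256(b"user passphrase").digest()
    seed = multi_factor_value(puf, pw_hash, b"123456")  # 32-byte seed
    ```

    Because the PUF value never leaves the device and the seed changes if any factor changes, a spoofed factor yields a different key pair and the challenge decryption fails.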

  19. Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing

    SciTech Connect

    Price, Phillip N.; Granderson, Jessica; Sohn, Michael; Addy, Nathan; Jump, David

    2013-09-01

    … whose savings can be calculated with least error? 4. What is the state of public domain models, that is, how well do they perform, and what are the associated implications for whole-building measurement and verification (M&V)? Additional project objectives that were addressed as part of this study include: (1) clarification of the use cases and conditions for baseline modeling performance metrics, benchmarks and evaluation criteria, (2) providing guidance for determining customer suitability for baseline modeling, (3) describing the portfolio level effects of baseline model estimation errors, (4) informing PG&E’s development of EMIS technology product specifications, and (5) providing the analytical foundation for future studies about baseline modeling and saving effects of EMIS technologies. A final objective of this project was to demonstrate the application of the methodology, performance metrics, and test protocols with participating EMIS product vendors.

  20. The Oil Security Metrics Model: A Tool for Evaluating the Prospective Oil Security Benefits of DOE's Energy Efficiency and Renewable Energy R&D Programs

    SciTech Connect

    Greene, David L; Leiby, Paul Newsome

    2006-05-01

    Energy technology R&D is a cornerstone of U.S. energy policy. Understanding the potential for energy technology R&D to solve the nation's energy problems is critical to formulating a successful R&D program. In light of this, the U.S. Congress requested the National Research Council (NRC) to undertake both retrospective and prospective assessments of the Department of Energy's (DOE's) Energy Efficiency and Fossil Energy Research programs (NRC, 2001; NRC, 2005). ("The Congress continued to express its interest in R&D benefits assessment by providing funds for the NRC to build on the retrospective methodology to develop a methodology for assessing prospective benefits." NRC, 2005, p. ES-2) In 2004, the NRC Committee on Prospective Benefits of DOE's Energy Efficiency and Fossil Energy R&D Programs published a report recommending a new framework and principles for prospective benefits assessment. The Committee explicitly deferred the issue of estimating security benefits to future work. Recognizing the need for a rigorous framework for assessing the energy security benefits of its R&D programs, the DOE's Office of Energy Efficiency and Renewable Energy (EERE) developed a framework and approach for defining energy security metrics for R&D programs to use in gauging the energy security benefits of their programs (Lee, 2005). This report describes methods for estimating the prospective oil security benefits of EERE's R&D programs that are consistent with the methodologies of the NRC (2005) Committee and that build on Lee's (2005) framework. Its objective is to define and implement a method that makes use of the NRC's typology of prospective benefits and methodological framework, satisfies the NRC's criteria for prospective benefits evaluation, and permits measurement of that portion of the prospective energy security benefits of EERE's R&D portfolio related to oil. While the Oil Security Metrics (OSM) methodology described in this report has been specifically developed to …

  1. Radiation View Factor With Shadowing

    Energy Science and Technology Software Center

    1992-02-24

    FACET calculates the radiation geometric view factor (alternatively called shape factor, angle factor, or configuration factor) between surfaces for axisymmetric, two-dimensional planar and three-dimensional geometries with interposed third surface obstructions. FACET was developed to calculate view factors as input data to finite element heat transfer analysis codes.
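    A view factor like those FACET computes can also be estimated by Monte Carlo ray casting: sample emission points on one surface, fire cosine-weighted rays, and count the fraction that hit the other surface. A sketch for two coaxial parallel unit squares with no obstruction; the geometry is chosen for illustration, and FACET's deterministic algorithms differ:

    ```python
    import math
    import random

    def view_factor_mc(h=1.0, n=100_000, seed=0):
        """Monte Carlo view factor from a unit square at z = 0 to a coaxial
        parallel unit square at z = h, assuming no interposed obstruction."""
        rng = random.Random(seed)
        hits = 0
        for _ in range(n):
            x0, y0 = rng.random(), rng.random()    # emission point on square 1
            u1, u2 = rng.random(), rng.random()    # cosine-weighted direction
            sin_t, cos_t = math.sqrt(u1), math.sqrt(1.0 - u1)
            phi = 2.0 * math.pi * u2
            t = h / cos_t                          # travel to the z = h plane
            x = x0 + t * sin_t * math.cos(phi)
            y = y0 + t * sin_t * math.sin(phi)
            if 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0:
                hits += 1
        return hits / n
    ```

    Shadowing by a third surface, which FACET handles, would amount to an extra intersection test rejecting rays blocked before reaching the target plane.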

  2. Inelastic Scattering Form Factors

    Energy Science and Technology Software Center

    1992-01-01

    ATHENA-IV computes form factors for inelastic scattering calculations, using single-particle wave functions that are eigenstates of motion in either a Woods-Saxon potential well or a harmonic oscillator well. Two-body forces of Gauss, Coulomb, Yukawa, and a sum of cut-off Yukawa radial dependences are available.

  3. ERYTHROPOIETIC FACTOR PURIFICATION

    DOEpatents

    White, W.F.; Schlueter, R.J.

    1962-05-01

    A method is given for purifying and concentrating the blood plasma erythropoietic factor. Anemic sheep plasma is contacted three times successively with ion exchange resins: an anion exchange resin, a cation exchange resin at a pH of about 5, and a cation exchange resin at a pH of about 6. (AEC)

  4. Two-Factor Authentication

    Energy.gov [DOE]

    Two-Factor Authentication (2FA) (also known as 2-Step Verification) is a system that employs two methods to identify an individual. More secure than reusable passwords, when a token's random number...

  5. Anthrax Lethal Factor

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

    Thiang Yian Wong, Robert Schwarzenbacher and Robert C. Liddington, The Burnham Institute, 10901 North Torrey Pines Road, La Jolla, CA 92037. Anthrax toxin is a major virulence factor in the infectious disease anthrax [1]. This toxin is produced by Bacillus anthracis, which is an encapsulated, spore-forming, rod-shaped bacterium. Inhalation anthrax, the most deadly form, is contracted through breathing spores. Once spores germinate within cells of the immune system called macrophages [2], bacterial …

  6. Nucleon Electromagnetic Form Factors

    SciTech Connect

    Kees de Jager

    2004-08-01

    Although nucleons account for nearly all the visible mass in the universe, they have a complicated structure that is still incompletely understood. The first indication that nucleons have an internal structure was the measurement of the proton magnetic moment by Frisch and Stern (1933), which revealed a large deviation from the value expected for a point-like Dirac particle. The investigation of the spatial structure of the nucleon, resulting in the first quantitative measurement of the proton charge radius, was initiated by the HEPL (Stanford) experiments in the 1950s, for which Hofstadter was awarded the 1961 Nobel prize. The first indication of a non-zero neutron charge distribution was obtained by scattering thermal neutrons off atomic electrons. The recent revival of its experimental study through the operational implementation of novel instrumentation has instigated a strong theoretical interest. Nucleon electromagnetic form factors (EMFFs) are optimally studied through the exchange of a virtual photon, in elastic electron-nucleon scattering. The momentum transferred to the nucleon by the virtual photon can be selected to probe different scales of the nucleon, from integral properties such as the charge radius to scaling properties of its internal constituents. Polarization instrumentation, polarized beams and targets, and the measurement of the polarization of the recoiling nucleon have been essential in the accurate separation of the charge and magnetic form factors and in studies of the elusive neutron charge form factor.

  7. DOE Project Management Update (Metrics)

    Energy.gov [DOE]

    Michael Peek, Deputy Director, Office of Project Management Oversight and Assessments March 22, 2016

  8. A File System Utilization Metric

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

    Abstract: A high performance computing (HPC) platform today typically contains ... the remaining 1 MB − 4 kB already in memory when later 4 kB requests for that data arrive. ...

  9. "(Million Metric Tons Carbon Dioxide)"

    Energy Information Administration (EIA) (indexed site)


  10. The MX Factor

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

    MX Factor Test films played a strategic-planning role in the debates of the late 1970s and early 1980s about where and how to deploy the MX intercontinental ballistic missile (LGM-118 Peacekeeper). The deployment would have to ensure that the missiles could survive a first strike by an adversary. Military planners were considering placing the missiles in clusters of hardened concrete shelters in the hot, dry Great Basin Desert of Nevada and Utah. Films of atmospheric tests at the Nevada Test …

  11. Nucleon Electromagnetic Form Factors

    SciTech Connect

    Marc Vanderhaeghen; Charles Perdrisat; Vina Punjabi

    2007-10-01

    There has been much activity in the measurement of the elastic electromagnetic proton and neutron form factors in the last decade, and the quality of the data has greatly improved by performing double polarization experiments, in comparison with previous unpolarized data. Here we review the experimental database in view of the new results for the proton and neutron obtained at JLab, MAMI, and MIT-Bates. The rapid evolution of phenomenological models triggered by these high-precision experiments will be discussed, including the recent progress in the determination of the valence quark generalized parton distributions of the nucleon, as well as the steady rate of improvements made in the lattice QCD calculations.

  12. Characteristics RSE Column Factor: Total

    Energy Information Administration (EIA) (indexed site)

    1993 and 1994 Vehicle Characteristics; RSE Column Factor: Total; 1993 Family Income: Below Poverty Line, Eligible for Federal Assistance; RSE Row Factor: Less than 5,000; 5,000…

  13. Validation of mathematical models for the prediction of organs-at-risk dosimetric metrics in high-dose-rate gynecologic interstitial brachytherapy

    SciTech Connect

    Damato, Antonio L.; Viswanathan, Akila N.; Cormack, Robert A.

    2013-10-15

    Purpose: Given the complicated nature of an interstitial gynecologic brachytherapy treatment plan, the use of a quantitative tool to evaluate the quality of the achieved metrics compared to clinical practice would be advantageous. For this purpose, predictive mathematical models to predict the D2cc of rectum and bladder in interstitial gynecologic brachytherapy are discussed and validated. Methods: Previous plans were used to establish the relationship between D2cc and the overlapping volume of the organ at risk with the targeted area (C0) or a 1-cm expansion of the target area (C1). Three mathematical models were evaluated: D2cc = α·C1 + β (LIN); D2cc = α − exp(−β·C0) (EXP); and a mixed approach (MIX), where both C0 and C1 were inputs of the model. The parameters of the models were optimized on a training set of patient data, and the predictive error of each model (predicted D2cc − real D2cc) was calculated on a validation set of patient data. The data of 20 patients were used to perform a K-fold cross-validation analysis, with K = 2, 4, 6, 8, 10, and 20. Results: MIX was associated with the smallest mean prediction error, <6.4% for an 18-patient training set; LIN had an error <8.5%; EXP had an error <8.3%. Best-case scenario analysis shows that an error ≤5% can be achieved for a ten-patient training set with MIX, an error ≤7.4% for LIN, and an error ≤6.9% for EXP. The error decreases with the increase in training set size, with the most marked decrease observed for MIX. Conclusions: The MIX model can predict the D2cc of the organs at risk with an error lower than 5% with a training set of ten patients or greater. The model can be used in the development of quality assurance tools to identify treatment plans with suboptimal sparing of the organs at risk. It can also be used to improve preplanning and in the development of real-time intraoperative planning tools.
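    The LIN model described above is a one-variable linear fit, so the train/validate procedure is easy to sketch. A minimal version using ordinary least squares on synthetic data; the real study fits D2cc against overlap volumes taken from prior clinical plans:

    ```python
    import numpy as np

    def fit_lin(c1, d2cc):
        """Fit the LIN model D2cc = alpha*C1 + beta by ordinary least squares."""
        A = np.column_stack([c1, np.ones_like(c1)])
        (alpha, beta), *_ = np.linalg.lstsq(A, d2cc, rcond=None)
        return alpha, beta

    def prediction_error(alpha, beta, c1, d2cc_real):
        """Predicted minus real D2cc on a validation set, as in the abstract."""
        return alpha * np.asarray(c1) + beta - np.asarray(d2cc_real)
    ```

    A K-fold cross validation would repeatedly call `fit_lin` on K−1 folds and evaluate `prediction_error` on the held-out fold.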

  14. PROGRESS TOWARDS NEXT GENERATION, WAVEFORM BASED THREE-DIMENSIONAL MODELS AND METRICS TO IMPROVE NUCLEAR EXPLOSION MONITORING IN THE MIDDLE EAST

    SciTech Connect

    Savage, B; Peter, D; Covellone, B; Rodgers, A; Tromp, J

    2009-07-02

    Efforts to update current wave speed models of the Middle East require a thoroughly tested database of sources and recordings. Recordings of seismic waves traversing the region from Tibet to the Red Sea will be the principal metric in guiding improvements to the current wave speed model. Precise characterizations of the earthquakes, specifically depths and faulting mechanisms, are essential to avoid mapping source errors into the refined wave speed model. Errors associated with the source are manifested in amplitude and phase changes. Source depths and paths near nodal planes are particularly error prone as small changes may severely affect the resulting wavefield. Once sources are quantified, regions requiring refinement will be highlighted using adjoint tomography methods based on spectral element simulations [Komatitsch and Tromp (1999)]. An initial database of 250 regional Middle Eastern events from 1990-2007 was inverted for depth and focal mechanism using teleseismic arrivals [Kikuchi and Kanamori (1982)] and regional surface and body waves [Zhao and Helmberger (1994)]. From this initial database, we reinterpreted a large, well recorded subset of 201 events through a direct comparison between data and synthetics based upon a centroid moment tensor inversion [Liu et al. (2004)]. Evaluation was done using both a 1D reference model [Dziewonski and Anderson (1981)] at periods greater than 80 seconds and a 3D model [Kustowski et al. (2008)] at periods of 25 seconds and longer. The final source reinterpretations will be within the 3D model, as this is the initial starting point for the adjoint tomography. Transitioning from a 1D to 3D wave speed model shows dramatic improvements when comparisons are done at shorter periods (25 s). Synthetics from the 1D model were created through mode summations while those from the 3D simulations were created using the spectral element method. To further assess errors in source depth and focal mechanism, comparisons between the …

  15. Table 11.5b Emissions From Energy Consumption for Electricity Generation and Useful Thermal Output: Electric Power Sector, 1989-2010 (Subset of Table 11.5a; Metric Tons of Gas)

    Energy Information Administration (EIA) (indexed site)

    Table 11.5b presents annual emissions (metric tons of gas) of carbon dioxide, sulfur dioxide, and nitrogen oxides from coal, natural gas, petroleum, geothermal, non-biomass waste, and other sources for electricity generation and useful thermal output in the electric power sector, 1989-2010.

  16. Table 11.5c Emissions From Energy Consumption for Electricity Generation and Useful Thermal Output: Commercial and Industrial Sectors, 1989-2010 (Subset of Table 11.5a; Metric Tons of Gas)

    Energy Information Administration (EIA) (indexed site)

    Table 11.5c presents annual emissions (metric tons of gas) of carbon dioxide, sulfur dioxide, and nitrogen oxides from coal, natural gas, petroleum, geothermal, non-biomass waste, and other sources for electricity generation and useful thermal output in the commercial and industrial sectors, 1989-2010.

  17. Factorized molecular wave functions: Analysis of the nuclear factor

    SciTech Connect

    Lefebvre, R.

    2015-06-07

    The exact factorization of molecular wave functions leads to nuclear factors which should be nodeless functions. We reconsider the case of vibrational perturbations in a diatomic species, a situation usually treated by combining Born-Oppenheimer products. It was shown [R. Lefebvre, J. Chem. Phys. 142, 074106 (2015)] that it is possible to derive, from the solutions of coupled equations, the form of the factorized function. By artificially increasing the interstate coupling in the usual approach, the adiabatic regime can be reached, whereby the wave function reduces to a single product. The nuclear factor of this product is determined by the lower of the two potentials obtained by diagonalization of the potential matrix. By comparison with the nuclear wave function of the factorized scheme, it is shown that a simple rectification brings the modified nodeless function into agreement with that of the adiabatic scheme.

  18. Poster — Thur Eve — 03: Application of the non-negative matrix factorization technique to [{sup 11}C]-DTBZ dynamic PET data for the early detection of Parkinson's disease

    SciTech Connect

    Lee, Dong-Chang; Jans, Hans; McEwan, Sandy; Riauka, Terence; Martin, Wayne; Wieler, Marguerite

    2014-08-15

    In this work, a class of non-negative matrix factorization (NMF) techniques known as alternating non-negative least squares, combined with the projected gradient method, is used to analyze twenty-five [{sup 11}C]-DTBZ dynamic PET/CT brain data sets. For each subject, a two-factor model is assumed and two factors representing the striatum (factor 1) and the non-striatum (factor 2) tissues are extracted using the proposed NMF technique and the commercially available factor analysis software “Pixies”. The extracted factor 1 and 2 curves represent the binding site of the radiotracer and describe the uptake and clearance of the radiotracer by soft tissues in the brain, respectively. The proposed NMF technique uses prior information about the dynamic data to obtain sample time-activity curves representing the striatum and the non-striatum tissues. These curves are then used to “warm” start the optimization. Factor solutions from the two methods are compared graphically and quantitatively. In healthy subjects, radiotracer uptake by factors 1 and 2 is approximately 35–40% and 60–65%, respectively. The solutions are also used to develop a factor-based metric for the detection of early, untreated Parkinson's disease. The metric stratifies healthy subjects from suspected Parkinson's patients (based on the graphical method). The analysis shows that both techniques produce comparable results with similar computational times. The “semi-automatic” approach used by the NMF technique allows clinicians to manually set a starting condition for “warm” starting the optimization, facilitating control and efficient interaction with the data.
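    The alternating scheme named in the abstract can be sketched in a few lines. This is a generic two-factor NMF with projected-gradient inner updates under an assumed (voxels x time-frames) data layout; it is an illustrative stand-in, not the authors' solver or the "Pixies" software:

    ```python
    import numpy as np

    def nmf_two_factor(V, k=2, outer=100, inner=5, seed=0):
        """Approximate a non-negative matrix V by W @ H with W, H >= 0.

        Alternating minimization of ||V - W H||_F^2: each factor is updated
        by a few projected-gradient steps (step = 1 / Lipschitz bound),
        clipping negatives to zero.  A "warm" start would replace the
        random H with prior sample time-activity curves.
        """
        rng = np.random.default_rng(seed)
        m, n = V.shape
        W = rng.random((m, k))
        H = rng.random((k, n))
        for _ in range(outer):
            L = np.linalg.norm(W.T @ W, 2) + 1e-12   # Lipschitz bound, H-step
            for _ in range(inner):
                H = np.maximum(H - (W.T @ (W @ H - V)) / L, 0.0)
            L = np.linalg.norm(H @ H.T, 2) + 1e-12   # Lipschitz bound, W-step
            for _ in range(inner):
                W = np.maximum(W - ((W @ H - V) @ H.T) / L, 0.0)
        return W, H
    ```

    In the two-factor PET setting, the rows of H would play the role of the striatum and non-striatum time-activity curves, and the columns of W would give each voxel's weight on the two factors.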

  19. Factor CO2 | Open Energy Information

    OpenEI (Open Energy Information) [EERE & EIA]

    Factor CO2 Jump to: navigation, search Name: Factor CO2 Place: Bilbao, Spain Zip: 48008 Product: Spain-based consultancy specializing in climate change projects. References: Factor...

  20. Human Factors Engineering Analysis Tool

    Energy Science and Technology Software Center

    2002-03-04

    HFE-AT is a human factors engineering (HFE) software analysis tool (AT) for human-system interface design of process control systems, and is based primarily on NUREG-0700 guidance.

  1. Factorization, power corrections, and the pion form factor

    SciTech Connect

    Rothstein, Ira Z.

    2004-09-01

    This paper is an investigation of the pion form factor utilizing recently developed effective field theory techniques. The primary result is that both the transition and electromagnetic form factors receive corrections at order {lambda}/Q from time-ordered products which account for deviations of the pion from a state composed purely of highly energetic collinear quarks in the lab frame. The usual higher-twist wave function corrections contribute only at order {lambda}{sup 2}/Q{sup 2} when the quark mass vanishes. In the case of the electromagnetic form factor, the {lambda}/Q power correction is enhanced by a power of 1/{alpha}{sub s}(Q) relative to the leading order result of Brodsky and Lepage if the scale {radical}({lambda}Q) is nonperturbative. This enhanced correction could explain the discrepancy with the data.

  2. Human factors in software development

    SciTech Connect

    Curtis, B.

    1986-01-01

    This book presents an overview of ergonomics/human factors in software development, recent research, and classic papers. Articles are drawn from the following areas of psychological research on programming: cognitive ergonomics, cognitive psychology, and psycholinguistics. Topics examined include: theoretical models of how programmers solve technical problems, the characteristics of programming languages, specification formats in behavioral research and psychological aspects of fault diagnosis.

  3. Transcription factor-based biosensor

    DOEpatents

    2013-10-08

    The present invention provides for a system comprising a BmoR transcription factor, a .sigma..sup.54-RNA polymerase, and a pBMO promoter operatively linked to a reporter gene, wherein the pBMO promoter is capable of expression of the reporter gene with an activated form of the BmoR and the .sigma..sup.54-RNA polymerase.

  4. SECTION M_Evaluation Factors

    National Nuclear Security Administration (NNSA)

    SECTION M EVALUATION FACTORS FOR AWARD TABLE OF CONTENTS M-1 EVALUATION OF PROPOSALS......................................................................176 M-2 BASIS FOR CONTRACT AWARD...................................................................177 M-3 TECHNICAL AND MANAGEMENT CRITERIA..........................................177 M-4 COST CRITERION.............................................................................................179 Section M, Page 176 M-1 EVALUATION OF

  5. Tetrahydroquinoline Derivatives as Potent and Selective Factor...

    Office of Scientific and Technical Information (OSTI)

    as Potent and Selective Factor XIa Inhibitors Citation Details In-Document Search Title: Tetrahydroquinoline Derivatives as Potent and Selective Factor XIa Inhibitors Authors: ...

  6. Structural basis for Tetrahymena telomerase processivity factor...

    Office of Scientific and Technical Information (OSTI)

    factor Teb1 binding to single-stranded telomeric-repeat DNA Citation Details In-Document Search Title: Structural basis for Tetrahymena telomerase processivity factor Teb1 ...

  7. IPCC Emission Factor Database | Open Energy Information

    OpenEI (Open Energy Information) [EERE & EIA]

    Emission Factor Database Jump to: navigation, search Tool Summary LAUNCH TOOL Name: IPCC Emission Factor Database AgencyCompany Organization: World Meteorological Organization,...

  8. Factors Impacting Decommissioning Costs - 13576

    SciTech Connect

    Kim, Karen; McGrath, Richard

    2013-07-01

    The Electric Power Research Institute (EPRI) studied United States experience with decommissioning cost estimates and the factors that impact the actual cost of decommissioning projects. This study gathered available estimated and actual decommissioning costs from eight nuclear power plants in the United States to understand the major components of decommissioning costs. Major costs categories for decommissioning a nuclear power plant are removal costs, radioactive waste costs, staffing costs, and other costs. The technical factors that impact the costs were analyzed based on the plants' decommissioning experiences. Detailed cost breakdowns by major projects and other cost categories from actual power plant decommissioning experiences will be presented. Such information will be useful in planning future decommissioning and designing new plants. (authors)

  9. Human factors in waste management

    SciTech Connect

    Moray, N.

    1994-10-01

    This article examines the role of human factors in radioactive waste management. Although few problems in ergonomics are special to radioactive waste management, some problems are unique, especially with long-term storage. The entire sociotechnical system must be examined in order to see where improvement can take place, because operator errors, as seen at Chernobyl and Bhopal, are ultimately the result of management errors.

  10. Calculating Individual Resources Variability and Uncertainty Factors Based on Their Contributions to the Overall System Balancing Needs

    SciTech Connect

    Makarov, Yuri V.; Du, Pengwei; Pai, M. A.; McManus, Bart

    2014-01-14

    The variability and uncertainty of wind power production require increased flexibility in power systems, or more operational reserves to maintain a satisfactory level of reliability. The incremental increase in reserve requirements caused by wind power is often studied separately from the effects of loads. Accordingly, the cost of procuring reserves is allocated based on this simplification rather than on a fair and transparent calculation of the different resources' contributions to the reserve requirement. This work proposes a new allocation mechanism for the intermittency and variability of resources regardless of their type. It is based on a new formula, called the grid balancing metric (GBM). The proposed GBM has several distinct features: 1) it is directly linked to control performance standard (CPS) scores and interconnection frequency performance, 2) it provides scientifically defined allocation factors for individual resources, 3) the sum of allocation factors within any group of resources is equal to the group's collective allocation factor (linearity), and 4) it distinguishes helpers from harmers. The paper illustrates and provides results of the new approach based on actual transmission system operator (TSO) data.

  11. SU-D-204-05: Quantitative Comparison of a High Resolution Micro-Angiographic Fluoroscopic (MAF) Detector with a Standard Flat Panel Detector (FPD) Using the New Metric of Generalized Measured Relative Object Detectability (GM-ROD)

    SciTech Connect

    Russ, M; Ionita, C; Bednarek, D; Rudin, S

    2015-06-15

    Purpose: In endovascular image-guided neuro-interventions, visualization of fine detail is paramount. For example, the ability of the interventionist to visualize the stent struts depends heavily on the performance of the x-ray imaging detector. Methods: A study examining the relative performance of the high-resolution MAF-CMOS detector (pixel size 75µm, Nyquist frequency 6.6 cycles/mm) and a standard Flat Panel Detector (pixel size 194µm, Nyquist frequency 2.5 cycles/mm) in imaging a neuro stent was done using the Generalized Measured Relative Object Detectability (GM-ROD) metric. Low quantum noise images of a deployed stent were obtained by averaging 95 frames acquired by both detectors without changing other exposure or geometric parameters. The square of the Fourier transform of each image is taken and divided by the generalized normalized noise power spectrum to give an effective measured task-specific signal-to-noise ratio. This expression is then integrated from 0 to each detector's Nyquist frequency, and the GM-ROD value is determined by taking the ratio of the integral for the MAF-CMOS to that of the FPD. The lower bound of integration can be varied to emphasize high frequencies in the detector comparisons. Results: The MAF-CMOS detector exhibits vastly superior performance over the FPD when integrating over all frequencies, yielding a GM-ROD value of 63.1. The lower bound of integration was stepped up in increments of 0.5 cycles/mm for higher frequency comparisons. As the lower bound increased, the GM-ROD value grew, reflecting the superior performance of the MAF-CMOS in the high frequency regime. Conclusion: GM-ROD is a versatile metric that provides quantitative detector- and task-dependent comparisons that can be used as a basis for detector selection. Supported by NIH Grant 2R01EB002873 and an equipment grant from Toshiba Medical Systems Corporation.
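    The ratio-of-integrals construction described in the Methods can be sketched as follows. The array names, the radial frequency mask, and the assumption that the generalized NNPS is supplied as a per-frequency array are illustrative choices, not the authors' implementation:

    ```python
    import numpy as np

    def rod_integral(img, pixel_mm, nnps, f_low=0.0):
        # Task-specific SNR: |FFT(image)|^2 / NNPS, summed over radial
        # frequencies from f_low up to this detector's Nyquist frequency.
        ft2 = np.abs(np.fft.fft2(img)) ** 2
        fy = np.fft.fftfreq(img.shape[0], d=pixel_mm)
        fx = np.fft.fftfreq(img.shape[1], d=pixel_mm)
        fr = np.hypot(*np.meshgrid(fy, fx, indexing="ij"))
        nyquist = 1.0 / (2.0 * pixel_mm)
        mask = (fr >= f_low) & (fr <= nyquist)
        return float(np.sum(ft2[mask] / nnps[mask]))

    def gm_rod(img_hi, px_hi, nnps_hi, img_lo, px_lo, nnps_lo, f_low=0.0):
        # GM-ROD: ratio of the high-resolution detector's integral to the
        # standard detector's; raising f_low emphasizes fine detail.
        return (rod_integral(img_hi, px_hi, nnps_hi, f_low)
                / rod_integral(img_lo, px_lo, nnps_lo, f_low))
    ```

    A GM-ROD value above 1 favors the first detector over the chosen frequency band.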

  12. Reducing Power Factor Cost | Department of Energy

    Energy.gov [DOE] (indexed site)

    Many utility companies charge an additional fee if your power factor is less than 0.95. Low power factor also reduces your electrical system's distribution capacity by increasing ...
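    As a back-of-envelope illustration of the quantity being billed (the 0.95 threshold is the figure quoted in the snippet; the function names are ours), power factor is the ratio of real power to apparent power:

    ```python
    import math

    def power_factor(real_kw, reactive_kvar):
        # pf = P / S, where apparent power S = sqrt(P^2 + Q^2).
        return real_kw / math.hypot(real_kw, reactive_kvar)

    def incurs_low_pf_fee(real_kw, reactive_kvar, threshold=0.95):
        # Many utilities bill a penalty when pf falls below the threshold.
        return power_factor(real_kw, reactive_kvar) < threshold
    ```

    A purely resistive load (zero reactive power) has a power factor of exactly 1.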

  13. Cone Penetrometer N Factor Determination Testing Results

    SciTech Connect

    Follett, Jordan R.

    2014-03-05

    This document contains the results of testing activities to determine the empirical 'N Factor' for the cone penetrometer in kaolin clay simulant. The N Factor is used to relate resistance measurements taken with the cone penetrometer to shear strength.

  14. Clothes Washer Test Cloth Correction Factor Information

    Energy.gov [DOE]

    This page contains the information used to determine the test cloth correction factors for each test cloth lot.

  15. Human factors: a necessary tool for industry

    SciTech Connect

    Starcher, K.O.

    1984-03-09

    The need for human factors (ergonomics) input in the layout of a ferroelectric ceramics laboratory is presented as an example of the overall need for human factors professionals in industry. However, even in the absence of someone trained in human factors, knowledge of a few principles of ergonomics will provide many possibilities for improving performance in the industrial environment.

  16. Antenna factorization in strongly ordered limits

    SciTech Connect

    Kosower, David A.

    2005-02-15

    When energies or angles of gluons emitted in a gauge-theory process are small and strongly ordered, the emission factorizes in a simple way to all orders in perturbation theory. I show how to unify the various strongly ordered soft, mixed soft-collinear, and collinear limits using antenna factorization amplitudes, which are generalizations of the Catani-Seymour dipole factorization function.

  17. Factors fragmenting the Russian Federation

    SciTech Connect

    Brown, E.

    1993-10-06

    This paper examines the factors that threaten the future of the Russian Federation (RF). The observations are based on a study that focused on eight republics: Mordova, Udmurtia, Tatarstan, Mari El, Bashkortostan, Kabardino-Balkaria, Buryatia, and Altay Republic. These republics were selected for their geographic and economic significance to the RF. Tatarstan, Bashkortostan, Udmurtia, and Mari El are located on important supply routes, such as the Volga River and the trans-Siberian railroad. Some of these republics are relatively wealthy, with natural resources such as oil (e.g., Tatarstan and Bashkortostan), and all eight republics play significant roles in the military-industrial complex. The importance of these republics to the RF contrasts with the relative insignificance of the independence-minded Northern Caucasus area. The author chose not to examine the Northern Caucasus region (except Kabardino-Balkaria) because these republics may have only a minor impact on the rest of the RF if they secede. Their impact would be minimized because they lie on the frontiers of the RF. Many Russians believe that "it might be best to let such a troublesome area secede."

  18. EM Corporate Performance Metrics, Complex Level

    Office of Environmental Management (EM)

    98,053 106,526 LLLLMW disposed Legacy (Stored) and NGW Cubic Meters 1,558,048 1,209,709 1,237,779 1,265,849 MAAs eliminated Number of Material Access Areas 35 30 30 30 Nuclear...

  19. Evaluation Metrics Applied to Accident Tolerant Fuels

    SciTech Connect

    Shannon M. Bragg-Sitton; Jon Carmack; Frank Goldner

    2014-10-01

    The safe, reliable, and economic operation of the nation’s nuclear power reactor fleet has always been a top priority for the United States’ nuclear industry. Continual improvement of technology, including advanced materials and nuclear fuels, remains central to the industry’s success. Decades of research combined with continual operation have produced steady advancements in technology and have yielded an extensive base of data, experience, and knowledge on light water reactor (LWR) fuel performance under both normal and accident conditions. One of the current missions of the U.S. Department of Energy’s (DOE) Office of Nuclear Energy (NE) is to develop nuclear fuels and claddings with enhanced accident tolerance for use in the current fleet of commercial LWRs or in reactor concepts with design certifications (GEN-III+). Accident tolerance became a focus within advanced LWR research upon direction from Congress following the 2011 Great East Japan Earthquake, resulting tsunami, and subsequent damage to the Fukushima Daiichi nuclear power plant complex. The overall goal of ATF development is to identify alternative fuel system technologies to further enhance the safety, competitiveness and economics of commercial nuclear power. Enhanced accident tolerant fuels would endure loss of active cooling in the reactor core for a considerably longer period of time than the current fuel system while maintaining or improving performance during normal operations. The U.S. DOE is supporting multiple teams to investigate a number of technologies that may improve fuel system response and behavior in accident conditions, with team leadership provided by DOE national laboratories, universities, and the nuclear industry. Concepts under consideration offer both evolutionary and revolutionary changes to the current nuclear fuel system. 
Mature concepts will be tested in the Advanced Test Reactor at Idaho National Laboratory beginning in Summer 2014 with additional concepts being readied for insertion in fiscal year 2015. This paper provides a brief summary of the proposed evaluation process that would be used to evaluate and prioritize the candidate accident tolerant fuel concepts currently under development.

  20. Comparison summary (key metrics and multiples)

    Energy Saver

    ... Early Concern Over Slope Instability 10 (from McIver,1982) Cause Turbidity Currents Act as ... Documented Gas Release from the Seafloor 38 Sea surface Seafloor 100m ocean shear? 800m ...

  1. Efficient Synchronization Stability Metrics for Fault Clearing...

    Office of Scientific and Technical Information (OSTI)

    Technical Information Service, Springfield, VA at www.ntis.gov. Authors: Backhaus, Scott N. 1 ; Chertkov, Michael 1 ; Bent, Russell Whitford 1 ; Bienstock, Daniel 2...

  2. Stochastic inverse problems: Models and metrics

    SciTech Connect

    Sabbagh, Elias H.; Sabbagh, Harold A.; Murphy, R. Kim; Aldrin, John C.; Annis, Charles; Knopp, Jeremy S.

    2015-03-31

    In past work, we introduced model-based inverse methods, and applied them to problems in which the anomaly could be reasonably modeled by simple canonical shapes, such as rectangular solids. In these cases the parameters to be inverted would be length, width and height, as well as the occasional probe lift-off or rotation. We are now developing a formulation that allows more flexibility in modeling complex flaws. The idea consists of expanding the flaw in a sequence of basis functions, and then solving for the expansion coefficients of this sequence, which are modeled as independent random variables, uniformly distributed over their range of values. There are a number of applications of such modeling: 1. Connected cracks and multiple half-moons, which we have noted in a POD set. Ideally we would like to distinguish connected cracks from one long shallow crack. 2. Cracks of irregular profile and shape which have appeared in cold work holes during bolt-hole eddy-current inspection. One side of such cracks is much deeper than the other. 3. L- or C-shaped crack profiles at the surface, examples of which have been seen in bolt-hole cracks. By formulating problems in a stochastic sense, we are able to leverage the stochastic global optimization algorithms in NLSE, which is resident in VIC-3D®, to answer questions of global minimization and to compute confidence bounds using the sensitivity coefficients that we get from NLSE. We will also address the issue of surrogate functions which are used during the inversion process, and how they contribute to the quality of the estimation of the bounds.
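    A minimal sketch of the formulation described above: the unknown flaw profile is expanded in a small cosine basis, the expansion coefficients are treated as independent uniform random variables, and a crude uniform random search stands in for the NLSE global optimizer in VIC-3D®. The basis choice, coefficient bounds, and names are illustrative assumptions:

    ```python
    import numpy as np

    def invert_profile(x, data, n_basis=4, bounds=(0.0, 1.0), trials=2000, seed=0):
        # Expand the flaw profile as f(x) = sum_k c_k * cos(k x) and search
        # uniformly over the coefficient box for the best least-squares fit.
        rng = np.random.default_rng(seed)
        basis = np.cos(np.outer(np.arange(n_basis), x))   # row k holds cos(k x)
        lo, hi = bounds
        best_c, best_err = None, np.inf
        for _ in range(trials):
            c = rng.uniform(lo, hi, n_basis)
            err = float(np.sum((c @ basis - data) ** 2))
            if err < best_err:
                best_c, best_err = c, err
        return best_c, best_err
    ```

    Repeating such a search many times yields an empirical distribution over the coefficients, from which confidence bounds like those mentioned in the abstract could be estimated.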

  3. EM Corporate Performance Metrics, Site Level

    Office of Environmental Management (EM)

    completed 1 1 1 1 Grand Junction Geographic Sites Eliminated Number completed 3 2 2 2 Inhalation Toxicology Laboratory LLLLMW disposed Legacy (Stored) and NGW Cubic Meters 359...

  4. Uranium Leasing Program: Lease Tract Metrics

    Energy.gov [DOE]

    The Atomic Energy Act and other legislative actions authorized the U.S. Atomic Energy Commission (AEC), predecessor agency to the U.S. Department of Energy (DOE), to withdraw lands from the public...

  5. Clean Cities Annual Metrics Report 2008

    SciTech Connect

    Johnson, C.; Bergeron, P.

    2009-09-01

    This report summarizes the Department of Energy's Clean Cities coalition accomplishments in 2008, including petroleum displacement data, membership, funding, sales of alternative fuel blends, deployment of AFVs and HEVs, idle reduction initiatives, and fuel economy activities.

  6. Documentation for FY2003 GPRA metrics

    SciTech Connect

    None, None

    2002-02-01

    This report is broken into two sections: a summary section providing an overview of the benefits analysis of OPT technology R&D programs, and a detailed section providing specific information about the entire GPRA benefits process and each of the OPT programs.

  7. Clean Cities 2013 Annual Metrics Report

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

    ... The number of flexible fuel vehicles that can operate on E85 (a high-level ethanol blend) ... AFVs use CNG. This is in stark contrast to E85, which accounts for only 12% of the AFV ...

  8. Clean Cities 2012 Annual Metrics Report

    SciTech Connect

    Johnson, C.

    2013-12-01

    The U.S. Department of Energy's (DOE) Clean Cities program advances the nation's economic, environmental, and energy security by supporting local actions to cut petroleum use in transportation. A national network of nearly 100 Clean Cities coalitions brings together stakeholders in the public and private sectors to deploy alternative and renewable fuels, idle-reduction measures, fuel economy improvements, and new transportation technologies, as they emerge. Each year DOE asks Clean Cities coordinators to submit annual reports of their activities and accomplishments for the previous calendar year. Data and information are submitted via an online database that is maintained as part of the Alternative Fuels Data Center (AFDC) at the National Renewable Energy Laboratory (NREL). Coordinators submit a range of data that characterizes the membership, funding, projects, and activities of their coalitions. They also submit data about sales of alternative fuels, deployment of alternative fuel vehicles (AFVs) and hybrid electric vehicles (HEVs), idle-reduction initiatives, fuel economy activities, and programs to reduce vehicle miles traveled (VMT). NREL analyzes the data and translates them into petroleum-use reduction impacts, which are summarized in this report.

  9. Clean Cities 2012 Annual Metrics Report

    Alternative Fuels and Advanced Vehicles Data Center

    of Energy Office of Energy Efficiency & Renewable Energy Operated by the Alliance for Sustainable Energy, LLC. This report is available at no cost from the National Renewable...

  10. Factorization using the quadratic sieve algorithm

    SciTech Connect

    Davis, J.A.; Holdridge, D.B.

    1983-01-01

    Since the cryptosecurity of the RSA two key cryptoalgorithm is no greater than the difficulty of factoring the modulus (product of two secret primes), a code that implements the Quadratic Sieve factorization algorithm on the CRAY I computer has been developed at the Sandia National Laboratories to determine as sharply as possible the current state-of-the-art in factoring. Because all viable attacks on RSA thus far proposed are equivalent to factorization of the modulus, sharper bounds on the computational difficulty of factoring permit improved estimates for the size of RSA parameters needed for given levels of cryptosecurity. Analysis of the Quadratic Sieve indicates that it may be faster than any previously published general purpose algorithm for factoring large integers. The high speed of the CRAY I coupled with the capability of the CRAY to pipeline certain vectorized operations make this algorithm (and code) the front runner in current factoring techniques.
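    The core idea the sieve accelerates, namely finding a congruence of squares x^2 ≡ y^2 (mod N) with x not congruent to ±y, so that gcd(x - y, N) splits N, can be shown with a brute-force toy. This sketch omits the sieving for smooth values and the GF(2) linear algebra that make the real Quadratic Sieve fast:

    ```python
    import math

    def congruence_of_squares(n):
        # Brute-force search for x such that x^2 mod n is itself a perfect
        # square y^2 with x != +-y (mod n); then gcd(x - y, n) splits n.
        # The Quadratic Sieve finds such pairs far more cleverly, by sieving
        # for smooth values of x^2 - n and combining their exponent vectors
        # over GF(2).
        for x in range(math.isqrt(n) + 1, n):
            r = x * x % n
            y = math.isqrt(r)
            if y * y == r and x % n != y and x % n != (n - y) % n:
                g = math.gcd(x - y, n)
                if 1 < g < n:
                    return g
        return None
    ```

    For example, for n = 77 the first hit is x = 9, since 81 mod 77 = 4 = 2^2 and gcd(9 - 2, 77) = 7.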

  11. Factorization using the quadratic sieve algorithm

    SciTech Connect

    Davis, J.A.; Holdridge, D.B.

    1983-12-01

    Since the cryptosecurity of the RSA two key cryptoalgorithm is no greater than the difficulty of factoring the modulus (product of two secret primes), a code that implements the Quadratic Sieve factorization algorithm on the CRAY I computer has been developed at the Sandia National Laboratories to determine as sharply as possible the current state-of-the-art in factoring. Because all viable attacks on RSA thus far proposed are equivalent to factorization of the modulus, sharper bounds on the computational difficulty of factoring permit improved estimates for the size of RSA parameters needed for given levels of cryptosecurity. Analysis of the Quadratic Sieve indicates that it may be faster than any previously published general purpose algorithm for factoring large integers. The high speed of the CRAY I coupled with the capability of the CRAY to pipeline certain vectorized operations make this algorithm (and code) the front runner in current factoring techniques.

  12. Synthetic heparin-binding growth factor analogs

    DOEpatents

    Pena, Louis A.; Zamora, Paul; Lin, Xinhua; Glass, John D.

    2007-01-23

    The invention provides synthetic heparin-binding growth factor analogs having at least one peptide chain that binds a heparin-binding growth factor receptor, covalently bound to a hydrophobic linker, which is in turn covalently bound to a non-signaling peptide that includes a heparin-binding domain. The synthetic heparin-binding growth factor analogs are useful as soluble biologics or as surface coatings for medical devices.

  13. CIMGS: An incomplete orthogonal factorization preconditioner

    SciTech Connect

    Wang, X.; Bramley, R.; Gallivan, K.

    1994-12-31

    This paper introduces, analyzes, and tests a preconditioning method for conjugate gradient (CG) type iterative methods. The authors start by examining incomplete Gram-Schmidt factorization (IGS) methods in order to motivate the new preconditioner. They show that the IGS family is more stable than IC and successfully factors any full-rank matrix. Furthermore, IGS preconditioners are at least as effective in accelerating convergence of CG type iterative methods as the incomplete Cholesky (IC) preconditioner. The drawback of IGS methods is their high factorization cost. This motivates a new algorithm, CIMGS, which can generate the same factor in a more efficient way.
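    For orientation, here is the complete (not incomplete) classical Gram-Schmidt factorization that the IGS family is built on; CIMGS additionally drops entries of R according to a sparsity pattern, and R then serves as a CG preconditioner because A^T A = R^T R when A = QR with orthonormal Q. A sketch under those stated assumptions:

    ```python
    import numpy as np

    def gram_schmidt_qr(A):
        # Classical Gram-Schmidt: orthogonalize each column of A against
        # the previously computed columns of Q, recording the projection
        # coefficients in the upper-triangular factor R, so that A = Q R.
        m, n = A.shape
        Q = np.zeros((m, n))
        R = np.zeros((n, n))
        for j in range(n):
            v = A[:, j].copy()
            for i in range(j):
                R[i, j] = Q[:, i] @ A[:, j]
                v -= R[i, j] * Q[:, i]
            R[j, j] = np.linalg.norm(v)
            Q[:, j] = v / R[j, j]
        return Q, R
    ```

    An incomplete variant would simply skip (or discard) entries of R outside a prescribed pattern, trading factorization accuracy for sparsity.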

  14. Understanding Hazardous Combustion Byproducts Reduces Factors...

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

    Hazardous Combustion Byproducts Reduces Factors Impacting Climate Change - Sandia Energy Energy Search Icon Sandia Home Locations Contact Us Employee Locator Energy & Climate ...

  15. CONTROL OF MECHANICALLY ACTIVATED POLYMERSOME FUSION: FACTORS...

    Office of Scientific and Technical Information (OSTI)

    MECHANICALLY ACTIVATED POLYMERSOME FUSION: FACTORS AFFECTING FUSION. Henderson, Ian M.; Paxton, Walter F Abstract not provided. Sandia National Laboratories (SNL-NM), Albuquerque,...

  16. Summary - Major Risk Factors Integrated Facility Disposition...

    Office of Environmental Management (EM)

    Office of Environmental Management (DOE-EM) External Technical Review of the Major Risk Factors Integrated Facility Disposition Project (IFDP) Oak Ridge, TN Why DOE-EM Did...

  17. EcoFactor Inc | Open Energy Information

    OpenEI (Open Energy Information) [EERE & EIA]

    Name: EcoFactor Inc Place: Millbrae, California Zip: 94030 Product: California-based home energy management service provider. Coordinates: 37.60276, -122.395444 Show Map...

  18. Nonrelativistic QCD factorization and the velocity dependence...

    Office of Scientific and Technical Information (OSTI)

    CONFIGURATION; FACTORIZATION; MATRIX ELEMENTS; QUANTUM CHROMODYNAMICS; QUARKONIUM; SINGULARITY; T QUARKS; VELOCITY Word Cloud More Like This Full Text Journal Articles DOI: ...

  19. CONTROL OF MECHANICALLY ACTIVATED POLYMERSOME FUSION: FACTORS...

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

    Journal Article: CONTROL OF MECHANICALLY ACTIVATED POLYMERSOME FUSION: FACTORS AFFECTING FUSION. Citation Details In-Document Search Title: CONTROL OF MECHANICALLY ACTIVATED...

  20. Soliton form factors from lattice simulations

    SciTech Connect

    Rajantie, Arttu; Weir, David J.

    2010-12-01

    The form factor provides a convenient way to describe properties of topological solitons in the full quantum theory, when semiclassical concepts are not applicable. It is demonstrated that the form factor can be calculated numerically using lattice Monte Carlo simulations. The approach is very general and can be applied to essentially any type of soliton. The technique is illustrated by calculating the kink form factor near the critical point in 1+1-dimensional scalar field theory. As expected from universality arguments, the result agrees with the exactly calculable scaling form factor of the two-dimensional Ising model.

  1. Industrial Power Factor Analysis Guidebook. Electrotek Concepts...

    Office of Scientific and Technical Information (OSTI)

    low power factors, increased conductor and transformer losses, and lower voltages. Utilities must supply both active and reactive power and compensate for these losses. Power...

  2. Emission Factors (EMFAC) | Open Energy Information

    OpenEI (Open Energy Information) [EERE & EIA]

    The EMission FACtors (EMFAC) model is used to calculate emission rates from all motor vehicles, ranging from passenger cars to heavy-duty trucks, operating on highways, freeways...

  3. Section M: Evaluations Factors for Award

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    V SECTION M EVALUATION FACTORS FOR AWARD Request for Proposal # DE-RP36-07GO97036 PART V SECTION M EVALUATION FACTORS FOR AWARD TABLE OF CONTENTS M.1 Evaluation of Proposals ..........................................................................................1 M.2 Evaluation Criteria..................................................................................................1 M.3 Basis For Award

  4. Gauss Sum Factorization with Cold Atoms

    SciTech Connect

    Gilowski, M.; Wendrich, T.; Mueller, T.; Ertmer, W.; Rasel, E. M. [Institut fuer Quantenoptik, Leibniz Universitaet Hannover, Welfengarten 1, D-30167 Hannover (Germany); Jentsch, Ch. [Astrium GmbH-Satellites, 88039 Friedrichshafen (Germany); Schleich, W. P. [Institut fuer Quantenphysik, Universitaet Ulm, Albert-Einstein-Allee 11, D-89081 Ulm (Germany)

    2008-01-25

    We report the first implementation of a Gauss sum factorization algorithm by an internal state Ramsey interferometer using cold atoms. A sequence of appropriately designed light pulses interacts with an ensemble of cold rubidium atoms. The final population in the involved atomic levels determines a Gauss sum. With this technique we factor the number N=263193.
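    The criterion behind such experiments can be sketched numerically: a truncated Gauss sum has unit magnitude when the trial integer divides N, because every phase winds through a whole number of turns, and it interferes destructively otherwise. The truncation length M and the threshold below are illustrative choices (truncated Gauss sums are known to admit occasional "ghost factor" false positives):

    ```python
    import cmath

    def gauss_sum_signal(N, ell, M=10):
        # |(1/M) * sum_{m<M} exp(2*pi*i * m^2 * N / ell)|: equals 1 when
        # ell divides N, and is typically small otherwise.
        s = sum(cmath.exp(2j * cmath.pi * (m * m * N / ell)) for m in range(M))
        return abs(s) / M

    def flag_trial_factors(N, max_ell, threshold=0.9, M=10):
        # Trial integers whose Gauss-sum signal exceeds the threshold.
        return [ell for ell in range(2, max_ell + 1)
                if gauss_sum_signal(N, ell, M) > threshold]
    ```

    Applied to N = 263193 = 3 x 7 x 83 x 151, the number factored in the experiment, every true divisor produces a signal of essentially 1.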

  5. Two-Factor Identity Proofing Process | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    Two-Factor Identity Proofing Process

  6. Synthetic heparin-binding factor analogs

    DOEpatents

    Pena, Louis A.; Zamora, Paul O.; Lin, Xinhua; Glass, John D.

    2010-04-20

    The invention provides synthetic heparin-binding growth factor analogs having at least one peptide chain, and preferably two peptide chains branched from a dipeptide branch moiety composed of two trifunctional amino acid residues, which peptide chain or chains bind a heparin-binding growth factor receptor and are covalently bound to a non-signaling peptide that includes a heparin-binding domain, preferably by a linker, which may be a hydrophobic linker. The synthetic heparin-binding growth factor analogs are useful as pharmaceutical agents, soluble biologics or as surface coatings for medical devices.

  7. Carbon Dioxide Emission Factors for Coal

    Reports and Publications

    1994-01-01

    The Energy Information Administration (EIA) has developed factors for estimating the amount of carbon dioxide emitted, accounting for differences among coals, to reflect the changing "mix" of coal in U.S. coal consumption.
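    The emission estimate is the product of heat input and a rank-specific factor, summed over the coal "mix". A minimal sketch; the factor values and consumption figures below are stand-ins for illustration, not EIA's published numbers:

    ```python
    # CO2 emission factors by coal rank, in lb CO2 per MMBtu of heat input
    # (placeholder values for the sketch, not EIA's published figures).
    FACTORS_LB_PER_MMBTU = {"bituminous": 205.7, "subbituminous": 214.3, "lignite": 215.4}

    def co2_short_tons(coal_rank, heat_input_mmbtu):
        """Short tons of CO2 emitted from a given heat input of one coal rank."""
        return FACTORS_LB_PER_MMBTU[coal_rank] * heat_input_mmbtu / 2000.0

    # A changing mix is a weighted sum over ranks (MMBtu consumed per rank):
    mix = {"bituminous": 0.6e6, "subbituminous": 0.4e6}
    total = sum(co2_short_tons(rank, mmbtu) for rank, mmbtu in mix.items())
    print(f"{total:,.0f} short tons CO2")
    ```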

  8. Relativistic Thomson Scattering Form Factor Calculation

    Energy Science and Technology Software Center

    2009-11-01

    The purpose of this program is to calculate the fully relativistic Thomson scattering form factor in unmagnetized plasmas. Such calculations are compared with experimental diagnostics of plasmas at facilities such as the Jupiter laser facility at LLNL.

  9. Industrial Power Factor Analysis Guidebook. (Technical Report...

    Office of Scientific and Technical Information (OSTI)

    Power factor is a way of measuring the percentage of reactive power in an electrical system. Reactive power represents wasted energy--electricity that does no useful work because ...

  10. Proton form factor effects in hydrogenic atoms

    SciTech Connect

    Daza, F. Garcia; Kelkar, N. G.; Nowakowski, M.

    2011-10-21

    The proton structure corrections to the hyperfine splittings in electronic and muonic hydrogen are evaluated using the Breit potential with electromagnetic form factors. In contrast to other methods, the Breit equation with q²-dependent form factors is just an extension of the standard Breit equation which gives the hyperfine splitting Hamiltonian. Precise QED corrections are comparable to the structure corrections which therefore need to be evaluated ab initio.

  11. Factors affecting robust retail energy markets

    SciTech Connect

    Michelman, T.S.

    1999-04-01

    This paper briefly defines an active retail market, details the factors that influence market activity and their relative importance, compares activity in various retail energy markets to date, and predicts future retail energy market activity. Three primary factors translate into high market activity: supplier margins, translated into potential savings for actively shopping customers; market size; and market barriers. The author surveys activity nationwide and predicts hot spots for the coming year.

  12. Sheet1 Water Availability Metric (Acre-Feet/Yr) Water Cost Metric...

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

    ... WA Washington Coastal Willapa Bay 0 0 81.241411363839845 0 ... 17100304OR OR Southern Oregon Coastal Coos 466031.37959679996 783007.00254536141 ...

  13. Hadronic form factors in kaon photoproduction

    SciTech Connect

    Syukurilla, L.; Mart, T.

    2014-09-25

    We have revisited the effect of hadronic form factors in the kaon photoproduction process by utilizing an isobaric model developed for kaon photoproduction off the proton. The model is able to reproduce the available experimental data nicely as well as to reveal the origin of the second peak in the total cross section, which was the main source of confusion for decades. In contrast to our previous study, in the present work we explore the possibility of using different hadronic form factors in each of the kaon-hyperon-nucleon vertices. The use of different hadronic form factors, e.g., dipole, Gaussian, and generalized dipole, has been found to produce a more flexible isobar model, which provides a significant improvement in the model.

  14. Human factors in nuclear technology - a history

    SciTech Connect

    Jones, D.B.

    1992-01-01

    Human factors, human factors engineering (HFE), or ergonomics did not receive much formal attention in nuclear technology prior to the Three Mile Island Unit 2 (TMI-2) incident. Three principal reasons exist for this lack of concern. First, emerging technologies show little concern with how people will use a new system. Making the new technology work is considered more important than the people who will use it. Second, the culture of the users of nuclear power did not recognize a need for human factors. Traditional utilities had well established and effective engineering designs for control of electric power generation, while medicine considered the use of nuclear isotopes another useful tool, not requiring special ergonomics. Finally, the nuclear industry owed much to Admiral Rickover. He was definitely opposed.

  15. Power-factor metering gains new interest

    SciTech Connect

    Womack, D.L.

    1980-01-01

    The combined effect of increased energy costs, advances in digital metering techniques, and regulatory pressures is stimulating utility interest in charging smaller customers the full cost of their burden on the electric system, by metering reactive power and billing for poor power factor. Oklahoma Gas and Electric Co. adopted the Q-meter method, made practical with the advent of magnetic-tape metering. Digital metering and new techniques now being developed will add more options for utilities interested in metering power factor. There are three commonly used methods of determining power factor, all of which require the use of the standard induction watthour meter, plus at least one other meter, to obtain a second value in the power triangle. In all cases, the third value, if required, is obtained by calculation.
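    The power-triangle arithmetic described above is straightforward; a minimal sketch, assuming one meter yields active power P and a second yields reactive power Q, with the apparent power obtained by calculation:

    ```python
    import math

    def power_factor(p_kw: float, q_kvar: float) -> float:
        """Power factor from the power triangle: pf = P / S, where the
        apparent power S = sqrt(P^2 + Q^2) is the calculated third value."""
        return p_kw / math.hypot(p_kw, q_kvar)

    # Example: 80 kW active and 60 kvar reactive give pf = 0.8, a level a
    # utility billing for poor power factor would typically surcharge.
    print(power_factor(80.0, 60.0))
    ```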

  16. Measurement of the Helium Form Factors at JLab

    SciTech Connect

    Elena Khrosinkova

    2007-06-11

    An experiment to measure elastic electron scattering off ³He and ⁴He at large momentum transfers is presented. The experiment was carried out in the Hall A Facility of Jefferson Lab. Elastic electron scattering off ³He was measured at forward and backward electron scattering angles to extract the isotope's charge and magnetic form factors. The charge form factor of ⁴He will be extracted from forward-angle electron scattering measurements. The data are expected to significantly extend and improve the existing measurements of the three- and four-body form factors. The results will be crucial for the establishment of a canonical standard model for the few-body nuclear systems and for testing predictions of quark dimensional scaling and hybrid nucleon-quark models.

  17. Chiral extrapolation of nucleon magnetic form factors

    SciTech Connect

    P. Wang; D. Leinweber; A. W. Thomas; R.Young

    2007-04-01

    The extrapolation of nucleon magnetic form factors calculated within lattice QCD is investigated within a framework based upon heavy baryon chiral effective-field theory. All one-loop graphs are considered at arbitrary momentum transfer and all octet and decuplet baryons are included in the intermediate states. Finite range regularization is applied to improve the convergence in the quark-mass expansion. At each value of the momentum transfer (Q²), a separate extrapolation to the physical pion mass is carried out as a function of m_π alone. Because of the large values of Q² involved, the role of the pion form factor in the standard pion-loop integrals is also investigated. The resulting values of the form factors at the physical pion mass are compared with experimental data as a function of Q² and demonstrate the utility and accuracy of the chiral extrapolation methods presented herein.
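    The extrapolation step can be illustrated schematically. The sketch below replaces the full one-loop chiral expressions with a simple linear ansatz in m_π² and uses invented lattice points, so it shows only the shape of the procedure at one fixed Q², not the paper's method:

    ```python
    import numpy as np

    # Invented lattice data at one fixed Q^2: form-factor values computed at
    # heavier-than-physical pion masses (m_pi^2 in GeV^2).
    mpi2_lattice = np.array([0.35, 0.50, 0.70])
    G_lattice = np.array([1.92, 1.80, 1.64])   # placeholder values

    # Linear chiral ansatz G(m_pi^2) = a0 + a1 * m_pi^2 fitted to the data;
    # real extrapolations use loop functions with finite-range regularization.
    a1, a0 = np.polyfit(mpi2_lattice, G_lattice, 1)

    mpi2_phys = 0.140**2  # physical pion mass squared, GeV^2
    G_phys = a0 + a1 * mpi2_phys
    print(f"extrapolated G at the physical point: {G_phys:.3f}")
    ```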

  18. Communication-avoiding symmetric-indefinite factorization

    DOE PAGES [OSTI]

    Ballard, Grey Malone; Becker, Dulcenia; Demmel, James; Dongarra, Jack; Druinsky, Alex; Peled, Inon; Schwartz, Oded; Toledo, Sivan; Yamazaki, Ichitaro

    2014-11-13

    We describe and analyze a novel symmetric triangular factorization algorithm. The algorithm is essentially a block version of Aasen's triangular tridiagonalization. It factors a dense symmetric matrix A as the product A = P L T L^T P^T where P is a permutation matrix, L is lower triangular, and T is block tridiagonal and banded. The algorithm is the first symmetric-indefinite communication-avoiding factorization: it performs an asymptotically optimal amount of communication in a two-level memory hierarchy for almost any cache-line size. Adaptations of the algorithm to parallel computers are likely to be communication efficient as well; one such adaptation has been recently published. As a result, the current paper describes the algorithm, proves that it is numerically stable, and proves that it is communication optimal.

  19. Communication-avoiding symmetric-indefinite factorization

    SciTech Connect

    Ballard, Grey Malone; Becker, Dulcenia; Demmel, James; Dongarra, Jack; Druinsky, Alex; Peled, Inon; Schwartz, Oded; Toledo, Sivan; Yamazaki, Ichitaro

    2014-11-13

    We describe and analyze a novel symmetric triangular factorization algorithm. The algorithm is essentially a block version of Aasen's triangular tridiagonalization. It factors a dense symmetric matrix A as the product A = P L T L^T P^T where P is a permutation matrix, L is lower triangular, and T is block tridiagonal and banded. The algorithm is the first symmetric-indefinite communication-avoiding factorization: it performs an asymptotically optimal amount of communication in a two-level memory hierarchy for almost any cache-line size. Adaptations of the algorithm to parallel computers are likely to be communication efficient as well; one such adaptation has been recently published. As a result, the current paper describes the algorithm, proves that it is numerically stable, and proves that it is communication optimal.
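    The factorization form can be made concrete with SciPy's symmetric-indefinite routine. Note that `scipy.linalg.ldl` uses Bunch-Kaufman-style pivoting with a block-diagonal D rather than the paper's block-tridiagonal T, so this sketches only the shape A = (permuted L) D (permuted L)^T, not the communication-avoiding block-Aasen algorithm:

    ```python
    import numpy as np
    from scipy.linalg import ldl

    rng = np.random.default_rng(0)
    B = rng.standard_normal((6, 6))
    A = B + B.T  # symmetric and, in general, indefinite

    # L is a permuted unit lower-triangular factor; D is block diagonal with
    # 1x1 and 2x2 pivots; perm records the symmetric row/column permutation.
    L, D, perm = ldl(A)
    assert np.allclose(L @ D @ L.T, A)  # reconstruction check
    print("factorization verified")
    ```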

  20. Human factors challenges for advanced process control

    SciTech Connect

    Stubler, W.F.; O'Hara, J.M.

    1996-08-01

    New human-system interface technologies provide opportunities for improving operator and plant performance. However, if these technologies are not properly implemented, they may introduce new challenges to performance and safety. This paper reports the results from a survey of human factors considerations that arise in the implementation of advanced human-system interface technologies in process control and other complex systems. General trends were identified for several areas based on a review of technical literature and a combination of interviews and site visits with process control organizations. Human factors considerations are discussed for two of these areas, automation and controls.

  1. Annotated bibliography of human factors applications literature

    SciTech Connect

    McCafferty, D.B.

    1984-09-30

    This bibliography was prepared as part of the Human Factors Technology Project, FY 1984, sponsored by the Office of Nuclear Safety, US Department of Energy. The project was conducted by Lawrence Livermore National Laboratory, with Essex Corporation as a subcontractor. The material presented here is a revision and expansion of the bibliographic material developed in FY 1982 as part of a previous Human Factors Technology Project. The previous bibliography was published September 30, 1982, as Attachment 1 to the FY 1982 Project Status Report.

  2. Meson-photon transition form factors

    SciTech Connect

    Balakireva, Irina; Lucha, Wolfgang; Melikhov, Dmitri

    2012-10-23

    We present the results of our recent analysis of the meson-photon transition form factors F_{Pγ}(Q²) for the pseudoscalar mesons P = π⁰, η, η′, η_c, using the local-duality version of QCD sum rules.

  3. Derivation of dose conversion factors for tritium

    SciTech Connect

    Killough, G. G.

    1982-03-01

    For a given intake mode (ingestion, inhalation, absorption through the skin), a dose conversion factor (DCF) is the committed dose equivalent to a specified organ of an individual per unit intake of a radionuclide. One also may consider the effective dose commitment per unit intake, which is a weighted average of organ-specific DCFs, with weights proportional to risks associated with stochastic radiation-induced fatal health effects, as defined by Publication 26 of the International Commission on Radiological Protection (ICRP). This report derives and tabulates organ-specific dose conversion factors and the effective dose commitment per unit intake of tritium. These factors are based on a steady-state model of hydrogen in the tissues of ICRP's Reference Man (ICRP Publication 23) and equilibrium of specific activities between body water and other tissues. The results differ by 27 to 33% from the estimate on which ICRP Publication 30 recommendations are based. The report also examines a dynamic model of tritium retention in body water, mineral bone, and two compartments representing organically-bound hydrogen. This model is compared with data from human subjects who were observed for extended periods. The manner of combining the dose conversion factors with measured or model-predicted levels of contamination in man's exposure media (air, drinking water, soil moisture) to estimate dose rate to an individual is briefly discussed.
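    The weighted average defining the effective dose commitment is easy to make concrete. The sketch below uses the ICRP Publication 26 tissue weighting factors; the organ DCF value is a placeholder, not the report's tritium result:

    ```python
    # ICRP Publication 26 tissue weighting factors (they sum to 1.0).
    ICRP26_WEIGHTS = {"gonads": 0.25, "breast": 0.15, "red_marrow": 0.12,
                      "lung": 0.12, "thyroid": 0.03, "bone_surface": 0.03,
                      "remainder": 0.30}

    def effective_dcf(organ_dcfs):
        """Effective dose commitment per unit intake: risk-weighted average
        of organ-specific DCFs (Sv/Bq)."""
        return sum(w * organ_dcfs[t] for t, w in ICRP26_WEIGHTS.items())

    # For tritiated water, near-uniform distribution in body water makes the
    # organ DCFs nearly equal, so the weighted average collapses to that
    # common value (1.7e-11 Sv/Bq is a placeholder, not the report's figure).
    uniform = {t: 1.7e-11 for t in ICRP26_WEIGHTS}
    print(effective_dcf(uniform))
    ```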

  4. Module: Emission Factors for Deforestation | Open Energy Information

    OpenEI (Open Energy Information) [EERE & EIA]

    www.leafasia.orgtoolstechnical-guidance-series-emission-factors-defo Cost: Free Language: English Module: Emission Factors for Deforestation Screenshot Logo: Module: Emission...

  5. Engineering an allosteric transcription factor to respond to...

    Office of Scientific and Technical Information (OSTI)

    Engineering an allosteric transcription factor to respond to new ligands Citation Details In-Document Search Title: Engineering an allosteric transcription factor to respond to new ...

  6. Test of factorization in diffractive deep inelastic scattering...

    Office of Scientific and Technical Information (OSTI)

    Test of factorization in diffractive deep inelastic scattering and photoproduction at HERA Citation Details In-Document Search Title: Test of factorization in diffractive deep ...

  7. Study of Factors Affecting Shrub Establishment on the Monticello...

    Office of Environmental Management (EM)

    Study of Factors Affecting Shrub Establishment on the Monticello, Utah, Disposal Cell Cover Study of Factors Affecting Shrub Establishment on the Monticello, Utah, Disposal Cell...

  8. Research on Factors Relating to Density and Climate Change |...

    OpenEI (Open Energy Information) [EERE & EIA]

    on Factors Relating to Density and Climate Change Jump to: navigation, search Tool Summary LAUNCH TOOL Name: Research on Factors Relating to Density and Climate Change Agency...

  9. Theory of factors limiting high gradient operation of warm acceleratin...

    Office of Scientific and Technical Information (OSTI)

    Theory of factors limiting high gradient operation of warm accelerating structures Citation Details In-Document Search Title: Theory of factors limiting high gradient operation of ...

  10. Neutrino mass, dark energy, and the linear growth factor (Journal...

    Office of Scientific and Technical Information (OSTI)

    dark energy, and the linear growth factor Citation Details In-Document Search Title: Neutrino mass, dark energy, and the linear growth factor We study the degeneracies between ...

  11. Is the proton electromagnetic form factor modified in nuclei...

    Office of Scientific and Technical Information (OSTI)

    Is the proton electromagnetic form factor modified in nuclei? Citation Details In-Document Search Title: Is the proton electromagnetic form factor modified in nuclei? You are ...

  12. Method for determining formation quality factor from well log...

    Office of Scientific and Technical Information (OSTI)

    factor from well log data and its application to seismic reservoir characterization Citation Details In-Document Search Title: Method for determining formation quality factor ...

  13. Initiation factor 2 crystal structure reveals a different domain...

    Office of Scientific and Technical Information (OSTI)

    Initiation factor 2 crystal structure reveals a different domain organization from eukaryotic initiation factor 5B and mechanism among translational GTPases Citation Details ...

  14. Dense LU Factorization on Multicore Supercomputer Nodes (Conference...

    Office of Scientific and Technical Information (OSTI)

    factorization's memory hierarchy contention on now-ubiquitous multi-core architectures. ... During active panel factorization, rank-1 updates stream through memory with minimal ...

  15. Crystal structure of elongation factor 4 bound to a clockwise...

    Office of Scientific and Technical Information (OSTI)

    Crystal structure of elongation factor 4 bound to a clockwise ratcheted ribosome Citation Details In-Document Search Title: Crystal structure of elongation factor 4 bound to a ...

  16. First Climate formerly Factor Consulting | Open Energy Information

    OpenEI (Open Energy Information) [EERE & EIA]

    First Climate formerly Factor Consulting Jump to: navigation, search Name: First Climate (formerly Factor Consulting) Place: Germany Sector: Carbon Product: Former Swiss-based...

  17. Human Factors Engineering Analysis Tool - Energy Innovation Portal

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

    Human Factors Engineering Analysis Tool Software tool that enables easy and quick selection of applicable regulatory guidelines as starting point for human factors engineering ...

  18. HUMAN FACTORS GUIDANCE FOR CONTROL ROOM EVALUATION

    SciTech Connect

    O'Hara, J.; Brown, W.; Stubler, W.; Higgins, J.; Wachtel, J.; Persensky, J.J.

    2000-07-30

    The Human-System Interface Design Review Guideline (NUREG-0700, Revision 1) was developed by the US Nuclear Regulatory Commission (NRC) to provide human factors guidance as a basis for the review of advanced human-system interface technologies. The guidance consists of three components: design review procedures, human factors engineering guidelines, and a software application to provide design review support called the "Design Review Guideline." Since it was published in June 1996, Rev. 1 to NUREG-0700 has been used successfully by NRC staff, contractors and nuclear industry organizations, as well as by interested organizations outside the nuclear industry. The NRC has committed to the periodic update and improvement of the guidance to ensure that it remains a state-of-the-art design evaluation tool in the face of emerging and rapidly changing technology. This paper addresses the current research to update NUREG-0700 based on the substantial work that has taken place since the publication of Revision 1.

  19. Structural studies on leukaemia inhibitory factor

    SciTech Connect

    Norton, R.S.; Maurer, T.; Smith, D.K.; Nicola, N.A.

    1994-12-01

    Leukaemia Inhibitory Factor (LIF) is a pleiotropic cytokine that acts on a wide range of target cells, including megakaryocytes, osteoblasts, hepatocytes, adipocytes, neurons, embryonic stem cells, and primordial germ cells. Many of its activities are shared with other cytokines, particularly interleukin-6, oncostatin-M, ciliary neurotrophic factor, and granulocyte colony-stimulating factor (G-CSF). Although secreted in vivo as a glycoprotein, nonglycosylated recombinant protein expressed in E. coli is fully active and has been used in our nuclear magnetic resonance (NMR) studies of the three-dimensional structure and structure-function relationships of LIF. With 180 amino acids and a molecular mass of about 20 kDa, LIF is too large for direct structure determination by two-dimensional and three-dimensional ¹H NMR. It is necessary to label the protein with the stable isotopes ¹⁵N and ¹³C and employ heteronuclear three-dimensional NMR in order to resolve and interpret the spectral information required for three-dimensional structure determination. This work has been undertaken with both human LIF and a mouse-human chimaera that binds to the human LIF receptor with the same affinity as the human protein and yet expresses in E. coli at much higher levels. Sequence-specific resonance assignments and secondary structure elements for these proteins will be presented and progress towards determination of their three-dimensional structures described.

  20. Transcription factor-based biosensors for detecting dicarboxylic acids

    DOEpatents

    Dietrich, Jeffrey; Keasling, Jay

    2014-02-18

    The invention provides methods and compositions for detecting dicarboxylic acids using a transcription factor biosensor.

  1. Parallel LU Factorization on GPU cluster

    SciTech Connect

    D'Azevedo, Ed F; Hill, Judith C

    2012-01-01

    This paper describes our progress in developing software for performing parallel LU factorization of a large dense matrix on a GPU cluster. Three approaches, with increasing software complexity, are considered: (i) a naive 'thunking' approach that links the existing parallel ScaLAPACK software library with cuBLAS through a software emulation layer; (ii) a more intrusive magmaBLAS implementation integrated into the LU solver in the High-Performance Linpack software; and (iii) a left-looking out-of-core algorithm for solving problems that are larger than the available memory on GPU devices. Comparison of the performance gains versus the current ScaLAPACK PZGETRF are provided.
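    The panel-factorization and trailing-update structure that such codes offload to the GPU can be sketched on the CPU. A minimal right-looking blocked LU without pivoting follows; the real codes pivot and distribute the matrix across the cluster, so this only illustrates the blocking, and the diagonally dominant test matrix is an assumption that makes pivoting unnecessary:

    ```python
    import numpy as np

    def blocked_lu(A, nb=2):
        """Right-looking blocked LU without pivoting: factor a column panel,
        triangular-solve the row panel, then rank-nb update of the trailing
        submatrix (the step GPU codes accelerate)."""
        A = A.copy()
        n = A.shape[0]
        for k in range(0, n, nb):
            e = min(k + nb, n)
            # Unblocked LU of the column panel A[k:, k:e]
            for j in range(k, e):
                A[j+1:, j] /= A[j, j]
                A[j+1:, j+1:e] -= np.outer(A[j+1:, j], A[j, j+1:e])
            if e < n:
                # Row panel: U12 = L11^{-1} A12, then trailing update
                L11 = np.tril(A[k:e, k:e], -1) + np.eye(e - k)
                A[k:e, e:] = np.linalg.solve(L11, A[k:e, e:])
                A[e:, e:] -= A[e:, k:e] @ A[k:e, e:]
        L = np.tril(A, -1) + np.eye(n)
        U = np.triu(A)
        return L, U

    rng = np.random.default_rng(1)
    A = rng.standard_normal((6, 6)) + 6 * np.eye(6)  # dominant diagonal
    L, U = blocked_lu(A)
    assert np.allclose(L @ U, A)
    print("blocked LU verified")
    ```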

  2. Various factors affect coiled tubing limits

    SciTech Connect

    Yang, Y.S.

    1996-01-15

    Safety and reliability remain the primary concerns in coiled tubing operations. Factors affecting safety and reliability include corrosion, flexural bending, internal (or external) pressure and tension (or compression), and mechanical damage due to improper use. Such limits as coiled tubing fatigue, collapse, and buckling need to be understood to avoid disaster. With increased use of coiled tubing, operators will gain more experience. At the same time, further research and development will improve manufacturing quality; fatigue, collapse, and buckling models will mature; and standard specifications will eventually become available. This paper reviews the uses of coiled tubing and current research on its mechanical behavior. It also discusses several models used to help predict fatigue and failure levels.

  3. Chiral corrections to hyperon axial form factors

    SciTech Connect

    Jiang, Fujiun; Tiburzi, B. C.

    2008-05-01

    We study the complete set of flavor-changing hyperon axial-current matrix elements at small momentum transfer. Using partially quenched heavy baryon chiral perturbation theory, we derive the chiral and momentum behavior of the axial and induced pseudoscalar form factors. The meson pole contributions to the latter possess a striking signal for chiral physics. We argue that the study of hyperon axial matrix elements enables a systematic lattice investigation of the efficacy of three-flavor chiral expansions in the baryon sector. This can be achieved by considering chiral corrections to SU(3) symmetry predictions, and their partially quenched generalizations. In particular, despite the presence of eight unknown low-energy constants, we are able to make next-to-leading order symmetry breaking predictions for two linear combinations of axial charges.

  4. Identification and Control of Factors that Affect EGR Cooler Fouling

    Energy.gov [DOE]

    Key factors that cause exhaust gas recirculation cooler fouling were identified through an extensive literature search, and a controlled experiment was devised to study the impact of a few key factors on deposition.

  5. Major Risk Factors Integrated Facility Disposition Project - Oak Ridge | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    Integrated Facility Disposition Project - Oak Ridge: Major Risk Factors. Full Document and Summary Versions are available for download: Major Risk Factors, Integrated Facility Disposition Project - Oak Ridge (3.57 MB); Summary - Major Risk Factors, Integrated Facility Disposition Project (IFDP), Oak Ridge, TN (65.13 KB). More Documents & Publications: Major Risk Factors to the Integrated Facility Disposition Project

  6. Factors Influencing Spatial and Annual Variability in Eelgrass (Zostera marina L.) Meadows in Willapa Bay, Washington, and Coos Bay, Oregon, Estuaries

    SciTech Connect

    Thom, Ronald M.; Borde, Amy B.; Rumrill, Steven; Woodruff, Dana L.; Williams, Greg D.; Southard, John A.; Sargeant, Susan L.

    2003-08-01

    Environmental factors that influence annual variability and spatial differences in eelgrass meadows (Zostera marina L.) were examined within Willapa Bay, WA, and Coos Bay, OR, over a period of 4 years (1998-2001). A suite of eelgrass metrics was recorded annually at field sites that spanned the estuarine gradient from the marine-dominated to mesohaline regions. Growth of eelgrass plants was also monitored on a monthly basis within Sequim Bay, WA. Both the spatial cover and density of Z. marina were positively correlated with estuarine salinity and inversely correlated with temperature of the tideflat sediment. Experimental evidence verified that optimal eelgrass growth occurred at highest salinities and relatively low temperatures. Eelgrass density, biomass, and the incidence of flowering plants all increased substantially in Willapa Bay, and less so in Coos Bay, over the duration of the study. Warmer winters and cooler summers associated with the transition from El Niño to La Niña ocean conditions during the study period were correlated with the increase in eelgrass abundance and flowering. Anthropogenic factors (e.g., disturbance and erosion by vessel wakes and recreational shellfishing activities) may have contributed to spatial variability. Our findings indicate that large-scale changes in climate and nearshore ocean conditions can exert a strong regional influence on eelgrass abundance, which can vary annually by as much as 700% in Willapa Bay. Lower levels of variability observed in Coos Bay may be due to the stronger and more direct influence of the nearshore Pacific Ocean. We conclude that climate variation may have profound effects on the abundance and distribution of eelgrass meadows throughout the Pacific Northwest, and we anticipate that ocean conditions will emerge as a primary driving force for living estuarine resources and ecological processes that are associated with Z. marina beds within the landscape of these estuarine tidal basins.

  7. Confinement and the safety factor profile

    SciTech Connect

    Batha, S.H.; Levinton, F.M.; Scott, S.D.

    1995-12-01

    The conjecture that the safety factor profile, q(r), controls the improvement in tokamak plasmas from poor confinement in the Low (L-) mode regime to improved confinement in the supershot regime has been tested in two experiments on the Tokamak Fusion Test Reactor (TFTR). First, helium was puffed into the beam-heated phase of a supershot discharge which induced a degradation from supershot to L-mode confinement in about 100 msec, far less than the current relaxation time. The q and shear profiles measured by a motional Stark effect polarimeter showed little change during the confinement degradation. Second, rapid current ramps in supershot plasmas altered the q profile, but were observed not to change significantly the energy confinement. Thus, enhanced confinement in supershot plasmas is not due to a particular q profile which has enhanced stability or transport properties. The discharges making a continuous transition between supershot and L-mode confinement were also used to test the critical-electron-temperature-gradient transport model. It was found that this model could not reproduce the large changes in electron and ion temperature caused by the change in confinement.

  8. Human Factors Aspects of Operating Small Reactors

    SciTech Connect

    O'Hara, J.M.; Higgins, J.; Deem, R.; Xing, J.; D'Agostino, A.

    2010-11-07

    The nuclear-power community has reached the stage of proposing advanced reactor designs to support power generation for decades to come. They are considering small modular reactors (SMRs) as one approach to meet these energy needs. While the power output of individual reactor modules is relatively small, they can be grouped to produce reactor sites with different outputs. Also, they can be designed to generate hydrogen, or to process heat. Many characteristics of SMRs are quite different from those of current plants, and so may require a concept of operations (ConOps) that also is different. The U.S. Nuclear Regulatory Commission (NRC) has begun examining the human factors engineering (HFE) and ConOps aspects of SMRs; if needed, they will formulate guidance to support SMR licensing reviews. We developed a ConOps model, consisting of the following dimensions: Plant mission; roles and responsibilities of all agents; staffing, qualifications, and training; management of normal operations; management of off-normal conditions and emergencies; and, management of maintenance and modifications. We are reviewing information on SMR design to obtain data about each of these dimensions, and have identified several preliminary issues. In addition, we are obtaining operations-related information from other types of multi-module systems, such as refineries, to identify lessons learned from their experience. Here, we describe the project's methodology and our preliminary findings.

  9. LPS-inducible factor(s) from activated macrophages mediates cytolysis of Naegleria fowleri amoebae

    SciTech Connect

    Cleary, S.F.; Marciano-Cabral, F.

    1986-03-01

    Soluble cytolytic factors of macrophage origin have previously been described with respect to their tumoricidal activity. The purpose of this study was to investigate the mechanism and possible factor(s) responsible for cytolysis of the amoeba Naegleria fowleri by activated peritoneal macrophages from B6C3F1 mice. Macrophages or conditioned medium (CM) from macrophage cultures were incubated with ³H-uridine-labeled amoebae. Percent specific release of label served as an index of cytolysis. Bacille Calmette-Guerin (BCG) and Corynebacterium parvum macrophages demonstrated significant cytolysis of amoebae at 24 h with an effector to target ratio of 10:1. Treatment of macrophages with inhibitors of RNA or protein synthesis blocked amoebicidal activity. Interposition of a 1-μm-pore membrane between macrophages and amoebae inhibited killing. Inhibition in the presence of the membrane was overcome by stimulating the macrophages with LPS. CM from LPS-stimulated, but not unstimulated, cultures of activated macrophages was cytotoxic for amoebae. The activity was heat sensitive and was recovered from ammonium sulfate precipitation of the CM. Results indicate that amoebicidal activity is mediated by a protein(s) of macrophage origin induced by target cell contact or stimulation with LPS.

  10. Disruptive Event Biosphere Dose Conversion Factor Analysis

    SciTech Connect

    M. Wasiolek

    2000-12-28

    The purpose of this report was to document the process leading to, and the results of, development of radionuclide-, exposure scenario-, and ash thickness-specific Biosphere Dose Conversion Factors (BDCFs) for the postulated postclosure extrusive igneous event (volcanic eruption) at Yucca Mountain. BDCF calculations were done for seventeen radionuclides. The selection of radionuclides included those that may be significant dose contributors during the compliance period of up to 10,000 years, as well as radionuclides of importance for up to 1 million years postclosure. The approach documented in this report takes into account human exposure during three different phases at the time of, and after, volcanic eruption. Calculations of disruptive event BDCFs used the GENII-S computer code in a series of probabilistic realizations to propagate the uncertainties of input parameters into the output. The pathway analysis included consideration of the different exposure pathways' contributions to the BDCFs. BDCFs for volcanic eruption, when combined with the concentration of radioactivity deposited by eruption on the soil surface, allow calculation of potential radiation doses to the receptor of interest. Calculation of radioactivity deposition is outside the scope of this report, as is the transport of contaminated ash from the volcano to the location of the receptor. The integration of the biosphere modeling results (BDCFs) with the outcomes of the other component models is accomplished in the Total System Performance Assessment (TSPA), in which doses are calculated to the receptor of interest from radionuclides postulated to be released to the environment from the potential repository at Yucca Mountain.
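    The final dose step described above is a simple product; as a sketch (all numbers hypothetical, and the real TSPA combines many radionuclides, exposure phases, and probabilistic realizations):

```python
# Dose to the receptor = BDCF (Sv per Bq/m^2) x activity deposited on the
# soil surface (Bq/m^2), summed over radionuclides. Values are illustrative.
bdcf_and_deposition = [
    (2.0e-9, 5.0e4),   # hypothetical radionuclide A
    (7.5e-10, 1.2e5),  # hypothetical radionuclide B
]
dose_sv = sum(bdcf * dep for bdcf, dep in bdcf_and_deposition)
print(dose_sv)
```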

  11. Nominal Performance Biosphere Dose Conversion Factor Analysis

    SciTech Connect

    Wasiolek, Maryla A.

    2000-12-21

    The purpose of this report was to document the process leading to development of the Biosphere Dose Conversion Factors (BDCFs) for the postclosure nominal performance of the potential repository at Yucca Mountain. BDCF calculations concerned twenty-four radionuclides. This selection included sixteen radionuclides that may be significant nominal performance dose contributors during the compliance period of up to 10,000 years, five additional radionuclides of importance for up to 1 million years postclosure, and three relatively short-lived radionuclides important for the human intrusion scenario. Consideration of radionuclide buildup in soil caused by previous irrigation with contaminated groundwater was taken into account in the BDCF development. The effect of climate evolution, from the current arid conditions to a wetter and cooler climate, on the BDCF values was evaluated. The analysis included consideration of the different exposure pathways' contributions to the BDCFs. Calculations of nominal performance BDCFs used the GENII-S computer code in a series of probabilistic realizations to propagate the uncertainties of input parameters into the output. BDCFs for the nominal performance, when combined with the concentrations of radionuclides in groundwater, allow calculation of potential radiation doses to the receptor of interest. Calculated estimates of radionuclide concentration in groundwater result from the saturated zone modeling. The integration of the biosphere modeling results (BDCFs) with the outcomes of the other component models is accomplished in the Total System Performance Assessment (TSPA) to calculate doses to the receptor of interest from radionuclides postulated to be released to the environment from the potential repository at Yucca Mountain.

  12. Investigation of the effects of cell model and subcellular location of gold nanoparticles on nuclear dose enhancement factors using Monte Carlo simulation

    SciTech Connect

    Cai, Zhongli; Chattopadhyay, Niladri; Kwon, Yongkyu Luke; Pignol, Jean-Philippe; Lechtman, Eli; Reilly, Raymond M.; Department of Medical Imaging, University of Toronto, Toronto, Ontario M5S 3E2; Toronto General Research Institute, University Health Network, Toronto, Ontario M5G 2C4

    2013-11-15

    Purpose: The authors' aims were to model how various factors influence radiation dose enhancement by gold nanoparticles (AuNPs) and to propose a new modeling approach to the dose enhancement factor (DEF). Methods: The authors used the Monte Carlo N-Particle (MCNP 5) computer code to simulate photon and electron transport in cells. The authors modeled human breast cancer cells as a single cell, a monolayer, or a cluster of cells. Different numbers of 5, 30, or 50 nm AuNPs were placed in the extracellular space, on the cell surface, in the cytoplasm, or in the nucleus. Photon sources examined in the simulation included nine monoenergetic x-rays (10-100 keV), an x-ray beam (100 kVp), and ¹²⁵I and ¹⁰³Pd brachytherapy seeds. Both nuclear and cellular dose enhancement factors (NDEFs, CDEFs) were calculated, and their ability to predict the experimental DEF based on the clonogenic survival of MDA-MB-361 human breast cancer cells exposed to AuNPs and x-rays was compared. Results: NDEFs show a strong dependence on photon energy, with peaks at 15, 30/40, and 90 keV. Cell model and subcellular location of AuNPs influence the peak position and value of NDEF. NDEFs decrease in the order of AuNPs in the nucleus, cytoplasm, cell membrane, and extracellular space. NDEFs also decrease in the order of AuNPs in a cell cluster, monolayer, and single cell if the photon energy is larger than 20 keV. NDEFs depend linearly on the number of AuNPs per cell. Similar trends were observed for CDEFs. NDEFs using the monolayer cell model were more predictive than either the single-cell or cluster models of the DEFs experimentally derived from the clonogenic survival of cells cultured as a monolayer. The amount of AuNPs required to double the prescribed dose, in terms of mg Au/g tissue, decreases as the size of AuNPs increases, especially when AuNPs are in the nucleus and the cytoplasm. For 40 keV x-rays and a cluster of cells, to double the prescribed x-ray dose (NDEF = 2
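    The DEF metrics compared above are dose ratios; a minimal sketch (tally values hypothetical, not the paper's MCNP results):

```python
def dose_enhancement_factor(dose_with_aunp, dose_without_aunp):
    # NDEF/CDEF-style metric: nuclear (or cellular) dose tallied with
    # AuNPs present divided by the dose tallied without them.
    return dose_with_aunp / dose_without_aunp

# Hypothetical nuclear-dose tallies (Gy per source photon):
ndef = dose_enhancement_factor(3.1e-14, 1.9e-14)
print(round(ndef, 3))
```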

  13. Dose factor entry and display tool for BNCT radiotherapy

    DOEpatents

    Wessol, Daniel E.; Wheeler, Floyd J.; Cook, Jeremy L.

    1999-01-01

    A system for use in Boron Neutron Capture Therapy (BNCT) radiotherapy planning where a biological distribution is calculated using a combination of conversion factors and a previously calculated physical distribution. Conversion factors are presented in a graphical spreadsheet so that a planner can easily view and modify the conversion factors. For radiotherapy in multi-component modalities, such as Fast-Neutron and BNCT, it is necessary to combine each conversion factor component to form an effective dose which is used in radiotherapy planning and evaluation. The Dose Factor Entry and Display System is designed to facilitate planner entry of appropriate conversion factors in a straightforward manner for each component. The effective isodose is then immediately computed and displayed over the appropriate background (e.g. digitized image).
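    The effective-dose combination the tool computes is a weighted sum over modality components; a sketch (component names and factor values are illustrative assumptions, not the patented system's data):

```python
def effective_dose(components):
    # Effective dose = sum over components of (conversion factor x physical dose).
    return sum(factor * dose for factor, dose in components.values())

# Hypothetical BNCT components: (conversion factor, physical dose in Gy)
components = {
    "boron capture": (3.8, 2.0),
    "nitrogen capture": (3.2, 0.5),
    "fast neutron": (3.2, 0.25),
    "gamma": (1.0, 1.0),
}
print(effective_dose(components))
```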

  14. Major Risk Factors to the Integrated Facility Disposition Project |

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    Department of Energy to the Integrated Facility Disposition Project Major Risk Factors to the Integrated Facility Disposition Project The scope of the Integrated Facility Disposition Project (IFDP) needs to comprehensively address a wide range of environmental management risks at the Oak Ridge Reservation (ORO). Major Risk Factors to the Integrated Facility Disposition Project (227.35 KB) More Documents & Publications Major Risk Factors Integrated Facility Disposition Project - Oak Ridge

  15. Factors Affecting PMU Installation Costs (October 2014) | Department of

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    Energy Factors Affecting PMU Installation Costs (October 2014) Factors Affecting PMU Installation Costs (October 2014) The Department of Energy investigated the major cost factors that affected PMU installation costs for the synchrophasor projects funded through the Recovery Act Smart Grid Programs. The data was compiled through interviews with the nine projects that deployed production grade synchrophasor systems. The study found that while the costs associated with PMUs as stand-alone

  16. Fragmentation, NRQCD and Factorization in Heavy Quarkonium Production...

    Office of Scientific and Technical Information (OSTI)

    However, we show that gauge invariance and factorization require that conventional NRQCD production matrix elements be modified to include Wilson lines or non-abelian gauge links. ...

  17. Factors Affecting Power Output by Photovoltaic Cells Lesson

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

    Factors Affecting Power Output by Photovoltaic Cells Grade Level(s): IB 2 (Senior - 3 ... C.8 Photovoltaic cells and dye-sensitized solar cells (DSSC) Understandings: * Solar ...

  18. A Compendium of Transfer Factors for Agricultural and Animal...

    Office of Scientific and Technical Information (OSTI)

    Tables of transfer factors are listed by element and information source for beef, eggs, fish, fruit, grain, leafy vegetation, milk, poultry, and root vegetables. Authors: Staven, ...

  19. Electromagnetic form factors and the hypercentral constituent quark model

    SciTech Connect

    Sanctis, M. De; Giannini, M. M.; Santopinto, E.; Vassallo, A.

    2007-12-15

    We present new results concerning the electromagnetic form factors of the nucleon using a relativistic version of the hypercentral constituent quark model and a relativistic current.

  20. Critical Factors Driving the High Volumetric Uptake of Methane...

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

    Critical Factors Driving the High Volumetric Uptake of Methane in Cu-3(btc)(2) Previous Next List Hulvey, Zeric; Vlaisavljevich, Bess; Mason, Jarad A.; Tsivion, Ehud; Dougherty,...

  1. Identification and Control of Factors that Affect EGR Cooler...

    Energy.gov [DOE] (indexed site)

    Key factors that cause exhaust gas recirculation cooler fouling were identified through an extensive literature search, and a controlled experiment was devised to study the impact of a ...

  2. Development of the Electricity Carbon Emission Factors for Russia...

    OpenEI (Open Energy Information) [EERE & EIA]

    Name: Development of the Electricity Carbon Emission Factors for Russia. Agency/Company/Organization: European Bank for Reconstruction and...

  3. EPA Rainfall Erosivity Factor Calculator Website | Open Energy...

    OpenEI (Open Energy Information) [EERE & EIA]

    Web Site: EPA Rainfall Erosivity Factor Calculator Website. Abstract: This website allows...

  4. Analytical evaluation of atomic form factors: Application to Rayleigh scattering

    SciTech Connect

    Safari, L.; Santos, J. P.; Amaro, P.; Jänkälä, K.; Fratini, F.

    2015-05-15

    Atomic form factors are widely used for the characterization of targets and specimens, from crystallography to biology. By using recent mathematical results, here we derive an analytical expression for the atomic form factor within the independent particle model constructed from nonrelativistic screened hydrogenic wave functions. The range of validity of this analytical expression is checked by comparing the analytically obtained form factors with the ones obtained within the Hartree-Fock method. As an example, we apply our analytical expression for the atomic form factor to evaluate the differential cross section for Rayleigh scattering off neutral atoms.
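    The simplest case of such an analytical form factor is a single hydrogenic 1s electron, whose closed form is a textbook Fourier transform of the charge density (a sketch of that limiting case only; the paper's full independent-particle expression sums screened shells):

```python
def f_1s(q, z_eff):
    # Form factor of a hydrogenic 1s electron (atomic units, a0 = 1):
    # F(q) = [1 + (q / (2 Z_eff))**2]**(-2), normalized so F(0) = 1.
    return (1.0 + (q / (2.0 * z_eff)) ** 2) ** -2

print(f_1s(0.0, 1.0))  # -> 1.0 (per-electron normalization)
print(f_1s(2.0, 1.0))  # -> 0.25
```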

  5. EPA - Rainfall Erosivity Factor Calculator webpage | Open Energy...

    OpenEI (Open Energy Information) [EERE & EIA]

    DOI: Not Provided. Check for DOI availability: http://crossref.org. Online: Internet link for EPA - Rainfall Erosivity Factor Calculator webpage. Citation: Environmental...

  6. Factors Impacting EGR Cooler Fouling - Main Effects and Interactions...

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    Impacting EGR Cooler Fouling - Main Effects and Interactions Factors Impacting EGR Cooler Fouling - Main Effects and Interactions Presentation given at the 16th Directions in ...

  7. Consideration of Factors Affecting Strip Effluent PH and Sodium Content

    SciTech Connect

    Peters, T.

    2015-07-29

    A number of factors were investigated to determine possible reasons for why the Strip Effluent (SE) can sometimes have higher than expected pH values and/or sodium content, both of which have prescribed limits. All of the factors likely have some impact on the pH values and Na content.

  8. View Factor Calculation for Three-Dimensional Geometries.

    SciTech Connect

    1989-06-20

    Version 00 MCVIEW calculates the radiation geometric view factor between surfaces for three dimensional geometries with and without interposed third surface obstructions. It was developed to calculate view factors for input data to heat transfer analysis programs such as SCA-03/TRUMP, SCA-01/HEATING-5 and PSR-199/HEATING-6.
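    Closed-form cases provide sanity checks for view-factor codes like MCVIEW; for example, the standard result for coaxial parallel disks (a sketch under the usual diffuse-surface assumptions; not MCVIEW's algorithm, which handles arbitrary geometries and obstructions):

```python
from math import sqrt

def view_factor_coaxial_disks(r1, r2, h):
    # F(1->2) between coaxial parallel disks of radii r1, r2 separated by h.
    R1, R2 = r1 / h, r2 / h
    S = 1.0 + (1.0 + R2 ** 2) / R1 ** 2
    return 0.5 * (S - sqrt(S ** 2 - 4.0 * (R2 / R1) ** 2))

# Equal disks with spacing equal to the radius:
print(round(view_factor_coaxial_disks(1.0, 1.0, 1.0), 4))  # -> 0.382
```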

  9. Scaling factor inconsistencies in neutrinoless double beta decay

    SciTech Connect

    Cowell, S. [Theoretical Division, Los Alamos National Laboratory, Los Alamos, New Mexico 87545 (United States)

    2006-02-15

    The modern theory of neutrinoless double beta decay includes a scaling factor that has often been treated inconsistently in the literature. The nuclear contribution to the decay half-life can be suppressed by 15%-20% when scaling factors are mismatched. Correspondingly, the effective neutrino mass is overestimated.

  10. Final documentation report for FY2004 GPRA metrics: Subtask 5

    SciTech Connect

    None, None

    2003-02-01

    The Office of Energy Efficiency and Renewable Energy's (EERE) Renewable and Distributed Energy R&D programs manage research in two broad areas: 1) Energy Supply Technologies; and 2) Electricity Delivery. Several different approaches are required to estimate the benefits of this wide array of programs. The analytical approaches used for FY 2004 are documented in this report, as are the results of these analyses. This chapter provides a broad overview of the approaches taken for each of the two EERE research areas. Greater detail for each EERE Renewable and Distributed Energy program is provided later in this report in program-specific discussions.

  11. Property:ExplorationCostPerMetric | Open Energy Information

    OpenEI (Open Energy Information) [EERE & EIA]

    Paleomagnetic Measurements Passive Seismic Techniques Passive Sensors Portable X-Ray Diffraction (XRD) Portfolio Risk Modeling Production Wells R Radar Remote Sensing Techniques...

  12. Integration of Sustainability Metrics into Design Cases and State...

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    ... from a fraction of hydrolysate, or by diverting a fraction of feedstock biomass to gasification train. Increases cost to $5.48/GGE (in situ), $4.95/GGE (gasification) ...

  13. Hierarchical clustering using correlation metric and spatial continuity constraint

    DOEpatents

    Stork, Christopher L.; Brewer, Luke N.

    2012-10-02

    Large data sets are analyzed by hierarchical clustering using correlation as a similarity measure. This provides results that are superior to those obtained using a Euclidean distance similarity measure. A spatial continuity constraint may be applied in hierarchical clustering analysis of images.
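    The key difference from Euclidean clustering is the dissimilarity measure; a sketch of a correlation-based distance (data are toy profiles, not the patent's image data), which any agglomerative routine can then consume:

```python
from math import sqrt

def corr_distance(x, y):
    # Correlation dissimilarity d = 1 - Pearson r: profiles with the same
    # shape get distance ~0 even if amplitudes differ, unlike Euclidean.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return 1.0 - cov / (sx * sy)

s1 = [1.0, 2.0, 3.0, 4.0]
s2 = [2.0, 4.0, 6.0, 8.0]  # same shape as s1, twice the amplitude
s3 = [4.0, 1.0, 3.0, 2.0]  # unrelated profile
print(corr_distance(s1, s2))  # ~0.0: these merge first in hierarchical clustering
print(corr_distance(s1, s3))  # large: merges last
```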

  14. Deep Energy Retrofit Performance Metric Comparison: Eight California...

    Office of Scientific and Technical Information (OSTI)

    For each home, the details of the retrofits were analyzed, diagnostic tests to characterize the home were performed and the homes were monitored for total and individual end-use ...

  15. Analysis of key safety metrics of thorium utilization in LWRs...

    Office of Scientific and Technical Information (OSTI)

    high-temperature gas-cooled, fast spectrum sodium, and molten salt reactors), along with use in advanced accelerator-driven systems and even in fission-fusion hybrid systems. ...

  16. Phase estimation with nonunitary interferometers: Information as a metric

    SciTech Connect

    Bahder, Thomas B.

    2011-05-15

    Determining the phase in one arm of a quantum interferometer is discussed taking into account the three nonideal aspects in real experiments: nondeterministic state preparation, nonunitary state evolution due to losses during state propagation, and imperfect state detection. A general expression is written for the probability of a measurement outcome taking into account these three nonideal aspects. As an example of applying the formalism, the classical Fisher information and fidelity (Shannon mutual information between phase and measurements) are computed for few-photon Fock and N00N states input into a lossy Mach-Zehnder interferometer. These three nonideal aspects lead to qualitative differences in phase estimation, such as a decrease in fidelity and Fisher information that depends on the true value of the phase.
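    The classical Fisher information used above is F(φ) = Σ_m (∂p(m|φ)/∂φ)² / p(m|φ); a sketch with a toy lossy interferometer (single photon, uniform transmission η; an illustrative assumption, not the paper's Fock/N00N-state calculation):

```python
from math import sin, cos, pi

def outcome_probs(phi, eta):
    # Three measurement outcomes: detector 1, detector 2, photon lost.
    return [eta * sin(phi / 2) ** 2, eta * cos(phi / 2) ** 2, 1.0 - eta]

def fisher_information(phi, eta, h=1e-6):
    # F = sum_m (dp_m/dphi)^2 / p_m, derivative taken numerically.
    p = outcome_probs(phi, eta)
    dp = [(a - b) / (2 * h) for a, b in
          zip(outcome_probs(phi + h, eta), outcome_probs(phi - h, eta))]
    return sum(d * d / q for d, q in zip(dp, p) if q > 0)

# For this model F = eta at any phase: loss directly erodes phase information.
print(fisher_information(pi / 3, eta=0.7))
```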

  17. Toward a new metric for ranking high performance computing systems...

    Office of Scientific and Technical Information (OSTI)


  18. FY 2016 Q3 Metrics Summary.xlsx

    Energy.gov [DOE] (indexed site)

    FY 2016 Target 95% 95% 90% 85% 90% 90% FY 2016 3rd Qtr Actual Comment FY 2016 Forecast ... Schedule Compliance, Projects Less Than 5 Years Duration: Projects will meet the project ...

  19. Annex A Metrics for the Smart Grid System Report

    Energy.gov [DOE] (indexed site)

    ... The increase in power-sensitive and digital loads has forced us to more narrowly define PQ. For example, 10 years ago a voltage sag might be classified as a drop of 40% or more for ...

  20. Energy Department Sponsored Project Captures One Millionth Metric...

    Energy Saver

    So what do you do with all this pure CO2? The dried, compressed CO2 is delivered via pipeline to the West Hastings Field, a depleted oil and gas field in southeast Texas where it ...

  1. Analysis of key safety metrics of thorium utilization in LWRs

    DOE PAGES [OSTI]

    Ade, Brian J.; Bowman, Stephen M.; Worrall, Andrew; Powers, Jeffrey

    2016-04-08

    Here, thorium has great potential to stretch nuclear fuel reserves because of its natural abundance and because it is possible to breed the ²³²Th isotope into a fissile fuel (²³³U). Various scenarios exist for utilization of thorium in the nuclear fuel cycle, including use in different nuclear reactor types (e.g., light water, high-temperature gas-cooled, fast spectrum sodium, and molten salt reactors), along with use in advanced accelerator-driven systems and even in fission-fusion hybrid systems. The most likely near-term application of thorium in the United States is in currently operating light water reactors (LWRs). This use is primarily based on concepts that mix thorium with uranium (UO2 + ThO2) or that add fertile thorium (ThO2) fuel pins to typical LWR fuel assemblies. Utilization of mixed fuel assemblies (PuO2 + ThO2) is also possible. The addition of thorium to currently operating LWRs would result in a number of different phenomenological impacts to the nuclear fuel. Thorium and its irradiation products have different nuclear characteristics from those of uranium and its irradiation products. ThO2, alone or mixed with UO2 fuel, leads to different chemical and physical properties of the fuel. These key reactor safety-related issues have been studied at Oak Ridge National Laboratory and documented in “Safety and Regulatory Issues of the Thorium Fuel Cycle” (NUREG/CR-7176, U.S. Nuclear Regulatory Commission, 2014). Various reactor analyses were performed using the SCALE code system for comparison of key performance parameters of both ThO2 + UO2 and ThO2 + PuO2 against those of UO2 and typical UO2 + PuO2 mixed oxide fuels, including reactivity coefficients and power sharing between surrounding UO2 assemblies and the assembly of interest. The decay heat and radiological source terms for spent fuel after its discharge from the reactor are also presented. Based on this evaluation, potential impacts on safety requirements and identification of knowledge gaps that require additional analysis or research to develop a technical basis for the licensing of thorium fuel are identified.

  2. EVMS Training Snippet: 3.2 Schedule Health Metrics | Department...

    Energy.gov [DOE] (indexed site)

    More Documents & Publications EVMS Training Snippet: 4.1 The Over Target Baseline (OTB) and The Over Target Schedule (OTS) Implementations EVMS Training Snippet: 3.1A Integrated ...

  3. Property:ExplorationTimePerMetric | Open Energy Information

    OpenEI (Open Energy Information) [EERE & EIA]

    Techniques Geothermal Literature Review Geothermometry Gravity Methods Gravity Techniques Ground Electromagnetic Techniques Groundwater Sampling H Hand-held X-Ray Fluorescence...

  4. Non-minimal derivative couplings of the composite metric (Journal...

    Office of Scientific and Technical Information (OSTI)

    In the context of massive gravity, bi-gravity and multi-gravity non-minimal matter ... limit and the matter quantum loop corrections do not detune the potential interactions. ...

  5. EAC Presentation: Metrics and Benefits Analysis for the ARRA...

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    and benefits analysis for the American Recovery and Reinvestment Act smart grid programs including the Smart Grid Investment Grants and the Smart Grid Demonstration Program. ...

  6. Summary of Proposed Metrics - QER Technical Workshop on Energy...

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    Infrastructure Assurance Center presentation o DEFINITION - Resilience, in the context of critical infrastructure, is defined as the ability of a facility or asset to ...

  7. Toward a new metric for ranking high performance computing systems...

    Office of Scientific and Technical Information (OSTI)

    as a true measure of system performance for a growing collection of important science and engineering applications. In this paper we describe a new high performance conjugate...

  8. On The conformal metric structure of geometrothermodynamics: Generalizations

    SciTech Connect

    Azreg-Aïnou, Mustapha

    2014-03-15

    We show that the range of applicability of the change of representation formula derived by Bravetti et al. [J. Math. Phys. 54, 033513 (2013)] is very narrow and extend it to include all physical applications, particularly, applications to black hole thermodynamics, cosmology, and fluid thermodynamics.

  9. Office of HC Strategy Budget and Performance Metrics (HC-50)...

    Energy.gov [DOE] (indexed site)

    Statement and Function Statement The Office of Human Capital Strategy, Budget, and ... Provides analytical support and consultative advice to the Chief Human Capital Officer, ...

  10. Analysis of IFR driver fuel hot channel factors

    SciTech Connect

    Ku, J.Y.; Chang, L.K.; Mohr, D.

    1994-03-01

    Thermal-hydraulic uncertainty factors for Integral Fast Reactor (IFR) driver fuels have been determined, based primarily on the database obtained from the predecessor fuels used in the IFR prototype, Experimental Breeder Reactor II. The uncertainty factors were applied in hot channel factor (HCF) analyses to obtain separate overall HCFs for fuel and cladding for steady-state analyses. A "semistatistical horizontal method" was used in the HCF analyses. The uncertainty in fuel thermal conductivity dominates the effects considered in the HCF analysis; this uncertainty will be reduced as more data are obtained to expand the currently limited database for the IFR ternary metal fuel (U-20Pu-10Zr). A set of uncertainty factors to be used for transient analyses has also been derived.
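    One common form of such a semistatistical combination multiplies direct (bias) factors and combines statistical subfactors by the root-sum-square of their excess over unity (a sketch; the subfactor values and the exact EBR-II formulation here are assumptions):

```python
from math import sqrt

def overall_hcf(direct_factors, statistical_factors):
    # Semistatistical combination: product of direct factors times
    # [1 + sqrt(sum of (f_i - 1)^2)] over the statistical subfactors.
    product = 1.0
    for f in direct_factors:
        product *= f
    rss = sqrt(sum((f - 1.0) ** 2 for f in statistical_factors))
    return product * (1.0 + rss)

# Hypothetical subfactors, with fuel thermal conductivity dominating:
print(overall_hcf(direct_factors=[1.02], statistical_factors=[1.20, 1.05, 1.08]))
```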

  11. Optimization of scat detection methods for a social ungulate, the wild pig, and experimental evaluation of factors affecting detection of scat

    DOE PAGES [OSTI]

    Keiter, David A.; Cunningham, Fred L.; Rhodes, Jr., Olin E.; Irwin, Brian J.; Beasley, James C.

    2016-05-25

    Collection of scat samples is common in wildlife research, particularly for genetic capture-mark-recapture applications. Due to high degradation rates of genetic material in scat, large numbers of samples must be collected to generate robust estimates. Optimization of sampling approaches to account for taxa-specific patterns of scat deposition is, therefore, necessary to ensure sufficient sample collection. While scat collection methods have been widely studied in carnivores, research to maximize scat collection and noninvasive sampling efficiency for social ungulates is lacking. Further, environmental factors or scat morphology may influence detection of scat by observers. We contrasted performance of novel radial search protocols with existing adaptive cluster sampling protocols to quantify differences in observed amounts of wild pig (Sus scrofa) scat. We also evaluated the effects of environmental (percentage of vegetative ground cover and occurrence of rain immediately prior to sampling) and scat characteristics (fecal pellet size and number) on the detectability of scat by observers. We found that 15- and 20-m radial search protocols resulted in greater numbers of scats encountered than the previously used adaptive cluster sampling approach across habitat types, and that fecal pellet size, number of fecal pellets, percent vegetative ground cover, and recent rain events were significant predictors of scat detection. Our results suggest that use of a fixed-width radial search protocol may increase the number of scats detected for wild pigs, or other social ungulates, allowing more robust estimation of population metrics using noninvasive genetic sampling methods. Further, as fecal pellet size affected scat detection, juvenile or smaller-sized animals may be less detectable than adult or large animals, which could introduce bias into abundance estimates. In conclusion, knowledge of relationships between environmental variables and scat detection may allow

  12. Lifestyle Factors in U.S. Residential Electricity Consumption

    SciTech Connect

    Sanquist, Thomas F.; Orr, Heather M.; Shui, Bin; Bittner, Alvah C.

    2012-03-30

    A multivariate statistical approach to lifestyle analysis of residential electricity consumption is described and illustrated. Factor analysis of selected variables from the 2005 U.S. Residential Energy Consumption Survey (RECS) identified five lifestyle factors reflecting social and behavioral choices associated with air conditioning, laundry usage, personal computer usage, climate zone of residence, and TV use. These factors were also estimated for 2001 RECS data. Multiple regression analysis using the lifestyle factors yields solutions accounting for approximately 40% of the variance in electricity consumption for both years. By adding the associated household and market characteristics of income, local electricity price, and access to natural gas, variance accounted for is increased to approximately 54%. Income contributed only approximately 1% unique variance to the 2005 and 2001 models, indicating that lifestyle factors reflecting social and behavioral choices better account for consumption differences than income. This was not surprising given the 4-fold range of energy use at differing income levels. Geographic segmentation of factor scores is illustrated, and shows distinct clusters of consumption and lifestyle factors, particularly in suburban locations. The implications for tailored policy and planning interventions are discussed in relation to lifestyle issues.
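    "Variance accounted for" in the regression models above is the coefficient of determination R²; a minimal single-predictor sketch (synthetic numbers, not RECS data):

```python
def ols_fit(x, y):
    # Least-squares fit y = a + b*x for one predictor (e.g. a factor score).
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    b = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
         / sum((xi - xbar) ** 2 for xi in x))
    return ybar - b * xbar, b

def r_squared(y, y_hat):
    # Fraction of variance in y accounted for by the fitted values.
    ybar = sum(y) / len(y)
    sse = sum((a - b) ** 2 for a, b in zip(y, y_hat))
    sst = sum((a - ybar) ** 2 for a in y)
    return 1.0 - sse / sst

# Hypothetical lifestyle-factor scores vs. annual household kWh:
x = [-1.2, -0.4, 0.1, 0.8, 1.5]
y = [8200.0, 9400.0, 10100.0, 11800.0, 12900.0]
a, b = ols_fit(x, y)
print(r_squared(y, [a + b * xi for xi in x]))
```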

  13. Using partial safety factors in wind turbine design and testing

    SciTech Connect

    Musial, W.D.; Butterfield, C.

    1997-09-01

    This paper describes the relationship between wind turbine design and testing in terms of the certification process. An overview of the current status of international certification is given along with a description of limit-state design basics. Wind turbine rotor blades are used to illustrate the principles discussed. These concepts are related to both International Electrotechnical Commission and Germanischer Lloyd design standards, and are covered using schematic representations of statistical load and material strength distributions. Wherever possible, interpretations of the partial safety factors are given with descriptions of their intended meaning. Under some circumstances, the authors' interpretations may be subjective. Next, the test-load factors are described in concept and then related to the design factors. Using technical arguments, it is shown that some of the design factors for both load and materials must be used in the test loading, but some should not be used. In addition, some test factors not used in the design may be necessary for an accurate test of the design. The results show that if the design assumptions do not clearly state the effects and uncertainties that are covered by the design's partial safety factors, outside parties such as test labs or certification agencies could impose their own meaning on these factors.
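    The limit-state basics above reduce to checking that the factored load does not exceed the factored resistance; a sketch (the γ values are hypothetical, chosen only for illustration; actual IEC and GL factors depend on load case and material):

```python
def limit_state_ok(char_load, char_resistance, gamma_f, gamma_m):
    # Design check: gamma_f * L_k <= R_k / gamma_m.
    return gamma_f * char_load <= char_resistance / gamma_m

def blade_test_load(char_load, gamma_f, test_factor):
    # Test loading: design load times any additional factor the test
    # applies (e.g. to cover effects not exercised in a static test).
    return gamma_f * char_load * test_factor

print(limit_state_ok(100.0, 200.0, gamma_f=1.35, gamma_m=1.3))  # True
print(blade_test_load(100.0, gamma_f=1.35, test_factor=1.1))
```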

  14. Using partial safety factors in wind turbine design and testing

    SciTech Connect

    Musial, W.D.

    1997-12-31

    This paper describes the relationship between wind turbine design and testing in terms of the certification process. An overview of the current status of international certification is given along with a description of limit-state design basics. Wind turbine rotor blades are used to illustrate the principles discussed. These concepts are related to both International Electrotechnical Commission and Germanischer Lloyd design standards, and are covered using schematic representations of statistical load and material strength distributions. Wherever possible, interpretations of the partial safety factors are given with descriptions of their intended meaning. Under some circumstances, the authors' interpretations may be subjective. Next, the test-load factors are described in concept and then related to the design factors. Using technical arguments, it is shown that some of the design factors for both load and materials must be used in the test loading, but some should not be used. In addition, some test factors not used in the design may be necessary for an accurate test of the design. The results show that if the design assumptions do not clearly state the effects and uncertainties that are covered by the design's partial safety factors, outside parties such as test labs or certification agencies could impose their own meaning on these factors.

  15. The structure of the nucleon: Elastic electromagnetic form factors

    SciTech Connect

    Punjabi, V.; Perdrisat, C. F.; Jones, M. K.; Brash, E. J.; Carlson, C. E.

    2015-07-10

    Precise proton and neutron form factor measurements at Jefferson Lab, using spin observables, have recently made a significant contribution to the unraveling of the internal structure of the nucleon. Accurate experimental measurements of the nucleon form factors are a test-bed for understanding how the nucleon's static properties and dynamical behavior emerge from QCD, the theory of the strong interactions between quarks. There has been enormous theoretical progress, since the publication of the Jefferson Lab proton form factor ratio data, aiming at reevaluating the picture of the nucleon. We will review the experimental and theoretical developments in this field and discuss the outlook for the future.

  16. Dual chain synthetic heparin-binding growth factor analogs

    DOEpatents

    Zamora, Paul O.; Pena, Louis A.; Lin, Xinhua

    2009-10-06

    The invention provides synthetic heparin-binding growth factor analogs having two peptide chains each branched from a branch moiety, such as trifunctional amino acid residues, the branch moieties separated by a first linker of from 3 to about 20 backbone atoms, which peptide chains bind a heparin-binding growth factor receptor and are covalently bound to a non-signaling peptide that includes a heparin-binding domain, preferably by a second linker, which may be a hydrophobic second linker. The synthetic heparin-binding growth factor analogs are useful as pharmaceutical agents, soluble biologics or as surface coatings for medical devices.

  17. Dual chain synthetic heparin-binding growth factor analogs

    DOEpatents

    Zamora, Paul O.; Pena, Louis A.; Lin, Xinhua

    2012-04-24

    The invention provides synthetic heparin-binding growth factor analogs having two peptide chains each branched from a branch moiety, such as trifunctional amino acid residues, the branch moieties separated by a first linker of from 3 to about 20 backbone atoms, which peptide chains bind a heparin-binding growth factor receptor and are covalently bound to a non-signaling peptide that includes a heparin-binding domain, preferably by a second linker, which may be a hydrophobic second linker. The synthetic heparin-binding growth factor analogs are useful as pharmaceutical agents, soluble biologics or as surface coatings for medical devices.

  18. Multi-Factor Authentication Update | The Ames Laboratory

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

    Multi-Factor Authentication Update Duo was the software selected for Multi-Factor Authentication (MFA) at the Laboratory. Training sessions will be conducted by Information Systems to explain the use of MFA for on-site and remote access. Multi-Factor Authentication training is scheduled for Moderate Enclave staff at the following times: November 15, TASF 205, 9-10 AM November 16, TASF 205, 2-3 PM This training is optional. These sessions will demonstrate how to use Duo and discuss the migration

  19. Proton Form Factors Measurements in the Time-Like Region

    SciTech Connect

    Anulli, F.; /Frascati

    2007-10-22

    I present an overview of the measurement of the proton form factors in the time-like region. BABAR has recently measured with great accuracy the e^+e^- → p pbar reaction from production threshold up to an energy of ~4.5 GeV, finding evidence for a ratio of the electric to magnetic form factor greater than unity, contrary to expectation. In agreement with previous measurements, BABAR confirmed the steep rise of the magnetic form factor close to the p pbar mass threshold, suggesting the possible presence of an under-threshold N-Nbar vector state. These and other open questions related to the nucleon form factors, in both the time-like and space-like regions, await more data taken with different experimental techniques.

  20. Recommended U-factors for swinging, overhead, and revolving doors

    SciTech Connect

    Carpenter, S.C.; Hogan, J.

    1996-11-01

    Doors are often an overlooked component in the thermal integrity of the building envelope. Although swinging doors represent a small portion of the shell in residential buildings, their U-factor is usually many times higher than those of walls or ceilings. In some commercial buildings, loading (overhead) doors represent a significant area of high heat loss. Contrary to common perception, there is a wide range in the design, type, and therefore thermal performance of doors. The 1997 ASHRAE Handbook of Fundamentals will contain expanded tables of door U-factors to account for these product variations. This paper presents the results of detailed computer simulations of door U-factors. Recommended U-factors for glazed and unglazed residential and commercial swinging doors and commercial/industrial overhead and revolving doors are presented.
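The steady-state relation behind such U-factor tables is Q = U·A·ΔT, which is why a small door with a high U-factor can rival a much larger, well-insulated wall. A minimal sketch (the U-factors, areas, and temperature difference below are hypothetical, not values from the paper or the ASHRAE tables):

```python
# Steady-state conductive heat loss through an envelope component:
#   Q = U * A * dT   (W = W/(m^2*K) * m^2 * K)
# All component values below are hypothetical illustrations.

def heat_loss(u_factor: float, area_m2: float, delta_t: float) -> float:
    """Heat loss in watts for one envelope component."""
    return u_factor * area_m2 * delta_t

delta_t = 30.0  # indoor-outdoor temperature difference, K
door = heat_loss(u_factor=2.0, area_m2=2.0, delta_t=delta_t)   # uninsulated door
wall = heat_loss(u_factor=0.3, area_m2=20.0, delta_t=delta_t)  # insulated wall

# Per unit area the door loses ~6.7x more heat than the wall here.
print(f"door: {door:.0f} W over 2 m^2, wall: {wall:.0f} W over 20 m^2")
```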

  1. Property:Geothermal/LoadFactor | Open Energy Information

    OpenEI (Open Energy Information) [EERE & EIA]

    This is a property of type Number. Pages using the property "GeothermalLoadFactor": showing 25 pages using this property. 4 4 UR...

  2. Hadronic Form Factors in Asymptotically Free Field Theories

    DOE R&D Accomplishments

    Gross, D. J.; Treiman, S. B.

    1974-01-01

    The breakdown of Bjorken scaling in asymptotically free gauge theories of the strong interactions is explored for its implications on the large q^2 behavior of nucleon form factors. Duality arguments of Bloom and Gilman suggest a connection between the form factors and the threshold properties of the deep inelastic structure functions. The latter are addressed directly in an analysis of asymptotically free theories; and through the duality connection we are then led to statements about the form factors. For very large q^2 the form factors are predicted to fall faster than any inverse power of q^2. For the more modest range of q^2 reached in existing experiments the agreement with data is fairly good, though this may well be fortuitous. Extrapolations beyond this range are presented.

  3. Indoor Thermal Factors and Symptoms in Office Workers: Findings...

    Office of Scientific and Technical Information (OSTI)

    from the U.S. EPA BASE Study Citation Details In-Document Search Title: Indoor Thermal Factors and Symptoms in Office Workers: Findings from the U.S. EPA BASE Study You ...

  4. CDPHE Construction Storm Water Forms R-Factor Waiver Application...

    OpenEI (Open Energy Information) [EERE & EIA]

    CDPHE Construction Storm Water Forms R-Factor Waiver Application. OpenEI Reference Library. Legal Document - Permit Application. Permit...

  5. Factors Controlling The Geochemical Evolution Of Fumarolic Encrustatio...

    OpenEI (Open Energy Information) [EERE & EIA]

    Smokes (VTTS). The six-factor solution model explains a large proportion (low of 74% for Ni to high of 99% for Si) of the individual element data variance. Although the primary...

  6. Phenomenology of semileptonic B -meson decays with form factors...

    Office of Scientific and Technical Information (OSTI)

    of semileptonic B -meson decays with form factors from lattice QCD Authors: Du, Daping ; El-Khadra, A. X. ; Gottlieb, Steven ; Kronfeld, A. S. ; Laiho, J. ; Lunghi, E. ; Van de...

  7. Stabilizing Perovskite Structures by Tuning Tolerance Factor: Formation of

    Office of Scientific and Technical Information (OSTI)

    Formamidinium and Cesium Lead Iodide Solid-State Alloys (Journal Article) | SciTech Connect. Title: Stabilizing Perovskite Structures by Tuning Tolerance Factor: Formation of Formamidinium and Cesium Lead Iodide Solid-State Alloys. Authors: Li, Zhen; Yang, Mengjin; Park, Ji-Sang; Wei, Su-Huai; Berry, Joseph J.; Zhu, Kai

  8. Article Published on LED Lumen Maintenance and Light Loss Factors

    Energy.gov [DOE]

    An article has been published in LEUKOS: The Journal of the Illuminating Engineering Society of North America (IES) that may be of interest to the solid-state lighting community. Entitled "Lumen Maintenance and Light Loss Factors: Consequences of Current Design Practices for LEDs," the article was written by Michael Royer of Pacific Northwest National Laboratory and discusses complications related to the lamp lumen depreciation (LLD) light loss factor and LEDs.

  9. Dissipation factor as a predictor of anodic coating performance

    DOEpatents

    Panitz, Janda K. G.

    1995-01-01

    A dissipation factor measurement is used to predict as-anodized fixture performance prior to actual use of the fixture in an etching environment. A dissipation factor measurement of the anodic coating determines its dielectric characteristics and correlates to the performance of the anodic coating in actual use. The ability to predict the performance of the fixture and its anodized coating permits the fixture to be repaired or replaced prior to complete failure.
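For a coating modeled as a simple series RC network, the dissipation factor is tan δ = ω·C·ESR, which is one common way such a measurement is reduced to a single number. A minimal sketch (the series-RC model and all component values are hypothetical illustrations, not details from the patent):

```python
import math

# Dissipation factor of a lossy dielectric modeled as a series RC network:
#   D = tan(delta) = omega * C * ESR
# The RC model and the values below are hypothetical, not from the patent.

def dissipation_factor(esr_ohm: float, cap_farad: float, freq_hz: float) -> float:
    """Dimensionless dissipation factor at the given measurement frequency."""
    omega = 2.0 * math.pi * freq_hz
    return omega * cap_farad * esr_ohm

# Example: 10 ohm equivalent series resistance, 100 nF coating, 1 kHz test signal.
d = dissipation_factor(esr_ohm=10.0, cap_farad=100e-9, freq_hz=1e3)
print(f"D = tan(delta) = {d:.5f}")
```

A rising dissipation factor indicates a lossier dielectric, which is the property being correlated with coating performance.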

  10. Charm and bottom hadronic form factors with QCD sum rules

    SciTech Connect

    Bracco, M. E.; Rodrigues, B. O.; Cerqueira, A. Jr.

    2013-03-25

    We present a brief review of some calculations of form factors and coupling constants in vertices with charm and bottom mesons in the framework of QCD sum rules. We first discuss the motivation for this work, describing possible applications of these form factors to charm and bottom decay processes. We then summarize the QCD sum rules method, giving special attention to the uncertainties introduced by the intrinsic variation of the parameters. Finally, we conclude.

  11. Classical strongly coupled quark-gluon plasma. V. Structure factors

    SciTech Connect

    Cho, Sungtae; Zahed, Ismail

    2010-10-15

    We show that the classical and strongly coupled quark-gluon plasma is characterized by multiple structure factors that obey generalized Ornstein-Zernike equations. We use the canonical partition function and its associated density functional to derive analytical equations for the density and charge monopole structure factors for arbitrary values of Γ = V/K, the ratio of the mean potential to the Coulomb energy. The results are compared with SU(2) molecular dynamics simulations.

  12. Constructing the S-matrix With Complex Factorization

    SciTech Connect

    Schuster, Philip C.; Toro, Natalia; /Stanford U., ITP

    2009-06-19

    A remarkable connection between BCFW recursion relations and constraints on the S-matrix was made by Benincasa and Cachazo in 0705.4305, who noted that mutual consistency of different BCFW constructions of four-particle amplitudes generates nontrivial (but familiar) constraints on three-particle coupling constants - these include gauge invariance, the equivalence principle, and the lack of non-trivial couplings for spins > 2. These constraints can also be derived with weaker assumptions, by demanding the existence of four-point amplitudes that factorize properly in all unitarity limits with complex momenta. From this starting point, we show that the BCFW prescription can be interpreted as an algorithm for fully constructing a tree-level S-matrix, and that complex factorization of general BCFW amplitudes follows from the factorization of four-particle amplitudes. The allowed set of BCFW deformations is identified, formulated entirely as a statement on the three-particle sector, and using only complex factorization as a guide. Consequently, our analysis based on the physical consistency of the S-matrix is entirely independent of field theory. We analyze the case of pure Yang-Mills, and outline a proof for gravity. For Yang-Mills, we also show that the well-known scaling behavior of BCFW-deformed amplitudes at large z is a simple consequence of factorization. For gravity, factorization in certain channels requires asymptotic behavior ~1/z^2.

  13. Human factors evaluation of teletherapy: Literature review. Volume 5

    SciTech Connect

    Henriksen, K.; Kaye, R.D.; Jones, R.; Morisseau, D.S.; Serig, D.L.

    1995-07-01

    A series of human factors evaluations were undertaken to better understand the contributing factors to human error in the teletherapy environment. Teletherapy is a multidisciplinary methodology for treating cancerous tissue through selective exposure to an external beam of ionizing radiation. A team of human factors specialists, assisted by a panel of radiation oncologists, medical physicists, and radiation therapists, conducted site visits to radiation oncology departments at community hospitals, university centers, and free-standing clinics. A function and task analysis was performed initially to guide subsequent evaluations in the areas of workplace environment, system-user interfaces, procedures, training, and organizational practices. To further acquire an in-depth and up-to-date understanding of the practice of teletherapy in support of these evaluations, a systematic literature review was conducted. Factors that have a potential impact on the accuracy of treatment delivery were of primary concern. The present volume is the literature review. The volume starts with an overview of the multiphased nature of teletherapy, and then examines the requirement for precision, the increasing role of quality assurance, current conceptualizations of human error, and the role of system factors such as the workplace environment, user-system interfaces, procedures, training, and organizational practices.

  14. Factorization in large-scale many-body calculations

    DOE PAGES [OSTI]

    Johnson, Calvin W.; Ormand, W. Erich; Krastev, Plamen G.

    2013-08-07

    One approach for solving interacting many-fermion systems is the configuration-interaction method, also sometimes called the interacting shell model, where one finds eigenvalues of the Hamiltonian in a many-body basis of Slater determinants (antisymmetrized products of single-particle wavefunctions). The resulting Hamiltonian matrix is typically very sparse, but for large systems the nonzero matrix elements can nonetheless require terabytes or more of storage. An alternate algorithm, applicable to a broad class of systems with symmetry, in our case rotational invariance, is to exactly factorize both the basis and the interaction using additive/multiplicative quantum numbers; such an algorithm recreates the many-body matrix elements on the fly and can reduce the storage requirements by an order of magnitude or more. Here, we discuss factorization in general and introduce a novel, generalized factorization method, essentially a ‘double-factorization’ which speeds up basis generation and set-up of required arrays. Although we emphasize techniques, we also place factorization in the context of a specific (unpublished) configuration-interaction code, BIGSTICK, which runs both on serial and parallel machines, and discuss the savings in memory due to factorization.

  15. Selection of powder factor in large diameter blastholes

    SciTech Connect

    Eloranta, J.

    1995-12-31

    This paper documents the relationship between material handling and processing costs compared to blasting cost. The old adage, "the cheapest crushing is done in the pit," appears accurate in this case study. Comparison of the accumulated costs of powder, selected wear materials, and electricity indicates a strong, inverse correlation with powder factor (lbs powder/long ton of rock). In this case, the increased powder cost is more than offset by electrical savings alone. Measurable, overall costs decline while shovel and crusher productivity rise by about 5% when powder factor rises by 15%. These trends were previously masked by the effects of weather, ore grade fluctuations, and accounting practices. Attempts to correlate increased powder factor to wear materials in the crushing plant and to shovel hoist rope life have not shown the same benefit.
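The powder factor tracked above is simply pounds of explosive per long ton of rock broken. A minimal bookkeeping sketch (all tonnages and charge weights below are hypothetical, chosen only to make the units concrete):

```python
# Powder factor = lbs of explosive / long tons of rock broken.
# A ~15% rise in powder factor is the change the paper associates with
# ~5% gains in shovel and crusher productivity. Numbers are hypothetical.

LBS_PER_LONG_TON = 2240.0

def powder_factor(explosive_lbs: float, rock_lbs: float) -> float:
    """Powder factor in lbs of explosive per long ton of rock."""
    return explosive_lbs / (rock_lbs / LBS_PER_LONG_TON)

base = powder_factor(explosive_lbs=10_000.0, rock_lbs=44_800_000.0)
raised = base * 1.15  # the ~15% increase discussed in the paper
print(f"baseline: {base:.3f} lb/long ton, raised: {raised:.3f} lb/long ton")
```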

  16. Ion-ion dynamic structure factor of warm dense mixtures

    DOE PAGES [OSTI]

    Gill, N. M.; Heinonen, R. A.; Starrett, C. E.; Saumon, D.

    2015-06-25

    In this study, the ion-ion dynamic structure factor of warm dense matter is determined using the recently developed pseudoatom molecular dynamics method [Starrett et al., Phys. Rev. E 91, 013104 (2015)]. The method uses density functional theory to determine ion-ion pair interaction potentials that have no free parameters. These potentials are used in classical molecular dynamics simulations. This constitutes a computationally efficient and realistic model of dense plasmas. Comparison with recently published simulations of the ion-ion dynamic structure factor and sound speed of warm dense aluminum finds good to reasonable agreement. Using this method, we make predictions of the ion-ion dynamical structure factor and sound speed of a warm dense mixture—equimolar carbon-hydrogen. This material is commonly used as an ablator in inertial confinement fusion capsules, and our results are amenable to direct experimental measurement.

  17. Ion-ion dynamic structure factor of warm dense mixtures

    SciTech Connect

    Gill, N. M.; Heinonen, R. A.; Starrett, C. E.; Saumon, D.

    2015-06-25

    In this study, the ion-ion dynamic structure factor of warm dense matter is determined using the recently developed pseudoatom molecular dynamics method [Starrett et al., Phys. Rev. E 91, 013104 (2015)]. The method uses density functional theory to determine ion-ion pair interaction potentials that have no free parameters. These potentials are used in classical molecular dynamics simulations. This constitutes a computationally efficient and realistic model of dense plasmas. Comparison with recently published simulations of the ion-ion dynamic structure factor and sound speed of warm dense aluminum finds good to reasonable agreement. Using this method, we make predictions of the ion-ion dynamical structure factor and sound speed of a warm dense mixture—equimolar carbon-hydrogen. This material is commonly used as an ablator in inertial confinement fusion capsules, and our results are amenable to direct experimental measurement.

  18. Measurements of the Helium Form Factors at JLab

    SciTech Connect

    Khrosinkova, Elena

    2007-10-26

    An experiment to measure elastic electron scattering off ^3He and ^4He at large momentum transfers is presented. The experiment was carried out in the Hall A Facility of Jefferson Lab. Elastic electron scattering off ^3He was measured at forward and backward electron scattering angles to extract the isotope's charge and magnetic form factors. The charge form factor of ^4He will be extracted from forward-angle electron scattering measurements. The data are expected to significantly extend and improve the existing measurements of the three- and four-body form factors. The results will be crucial for the establishment of a canonical standard model for the few-body nuclear systems and for testing predictions of quark dimensional scaling and hybrid nucleon-quark models.

  19. The structure of the nucleon: Elastic electromagnetic form factors

    DOE PAGES [OSTI]

    Punjabi, V.; Perdrisat, C. F.; Jones, M. K.; Brash, E. J.; Carlson, C. E.

    2015-07-10

    Precise proton and neutron form factor measurements at Jefferson Lab, using spin observables, have recently made a significant contribution to the unraveling of the internal structure of the nucleon. Accurate experimental measurements of the nucleon form factors are a test-bed for understanding how the nucleon's static properties and dynamical behavior emerge from QCD, the theory of the strong interactions between quarks. There has been enormous theoretical progress since the publication of the Jefferson Lab proton form factor ratio data, aimed at reevaluating the picture of the nucleon. We will review the experimental and theoretical developments in this field and discuss the outlook for the future.

  20. Cosmic Reionization On Computers III. The Clumping Factor

    DOE PAGES [OSTI]

    Kaurov, Alexander A.; Gnedin, Nickolay Y.

    2015-09-09

    We use fully self-consistent numerical simulations of cosmic reionization, completed under the Cosmic Reionization On Computers project, to explore how well the recombinations in the ionized intergalactic medium (IGM) can be quantified by the effective "clumping factor." The density distribution in the simulations (and, presumably, in a real universe) is highly inhomogeneous and more-or-less smoothly varying in space. However, even in highly complex and dynamic environments, the concept of the IGM remains reasonably well-defined; the largest ambiguity comes from the unvirialized regions around galaxies that are over-ionized by the local enhancement in the radiation field ("proximity zones"). This ambiguity precludes computing the IGM clumping factor to better than about 20%. Furthermore, we discuss a "local clumping factor," defined over a particular spatial scale, and quantify its scatter on a given scale and its variation as a function of scale.
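The effective clumping factor quantifying IGM recombinations is conventionally C = ⟨n²⟩/⟨n⟩² over the gas density field; C = 1 for a uniform medium, and inhomogeneity boosts the volume-averaged recombination rate by the factor C. A minimal sketch on a toy density array (the densities are made up, not simulation data):

```python
# Clumping factor C = <n^2> / <n>^2 of a density field.
# C = 1 for a uniform medium; clumpiness raises the effective
# recombination rate by the factor C. Toy densities, not simulation data.

def clumping_factor(densities):
    n = len(densities)
    mean = sum(densities) / n
    mean_sq = sum(d * d for d in densities) / n
    return mean_sq / mean ** 2

uniform = [1.0] * 8
clumpy = [0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 7.3]  # same mean density of 1.0
print(clumping_factor(uniform), clumping_factor(clumpy))
```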

  1. Greybody factors for Myers–Perry black holes

    SciTech Connect

    Boonserm, Petarpa; Chatrabhuti, Auttakit; Ngampitipan, Tritos; Visser, Matt

    2014-11-15

    The Myers–Perry black holes are higher-dimensional generalizations of the usual (3+1)-dimensional rotating Kerr black hole. They are of considerable interest in Kaluza–Klein models, specifically within the context of brane-world versions thereof. In the present article, we consider the greybody factors associated with scalar field excitations of the Myers–Perry spacetimes and develop some rigorous bounds on these greybody factors. These bounds are relevant for characterizing both the higher-dimensional Hawking radiation and the super-radiance expected for these spacetimes.

  2. Dominant factors of the laser gettering of silicon wafers

    SciTech Connect

    Bokhan, Yu. I. E-mail: yuibokhan@gmail.com; Kamenkov, V. S.; Tolochko, N. K.

    2015-02-15

    The laser gettering of silicon wafers is experimentally investigated. The typical gettering parameters are considered. The surfaces of laser-treated silicon wafers are investigated by microscopy. In studying the effect of laser radiation on silicon wafers during gettering, a group of factors is selected that determines the conditions of interaction between the laser beam and the silicon-wafer surface and affects the final result of treatment. The main factors determining the gettering efficiency are revealed. Limitations on the desired value of the getter-layer capacity on surfaces with insufficiently high cleanliness (for example, ground or matte) are established.

  3. Performance analysis of parallel supernodal sparse LU factorization

    SciTech Connect

    Grigori, Laura; Li, Xiaoye S.

    2004-02-05

    We investigate performance characteristics for the LU factorization of large matrices with various sparsity patterns. We consider supernodal right-looking parallel factorization on a two-dimensional grid of processors, making use of static pivoting. We develop a performance model and validate it using the SuperLU-DIST implementation, real matrices, and the IBM Power3 machine at NERSC. We use this model to obtain performance bounds on parallel computers, to perform scalability analysis, and to identify performance bottlenecks. We also discuss the role of load balance and data distribution in this approach.
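The factorization being benchmarked decomposes PA = LU with pivoting. A minimal dense, sequential sketch of that decomposition (pure Python with partial pivoting; real codes such as SuperLU-DIST additionally exploit sparsity, supernodes, and parallelism, none of which is shown here):

```python
# Dense LU factorization with partial pivoting: computes perm, L, U such
# that A with its rows permuted by perm equals L @ U. A sequential toy --
# parallel sparse solvers like SuperLU-DIST do far more, but compute the
# same underlying factorization.

def lu_decompose(a):
    n = len(a)
    u = [row[:] for row in a]              # working copy, becomes U
    l = [[0.0] * n for _ in range(n)]      # unit lower-triangular factor
    perm = list(range(n))                  # row permutation
    for k in range(n):
        # Partial pivoting: bring the largest remaining entry in column k
        # to the diagonal for numerical stability.
        p = max(range(k, n), key=lambda i: abs(u[i][k]))
        u[k], u[p] = u[p], u[k]
        l[k], l[p] = l[p], l[k]
        perm[k], perm[p] = perm[p], perm[k]
        l[k][k] = 1.0
        for i in range(k + 1, n):
            factor = u[i][k] / u[k][k]
            l[i][k] = factor
            for j in range(k, n):
                u[i][j] -= factor * u[k][j]
    return perm, l, u

perm, l, u = lu_decompose([[2.0, 1.0], [4.0, 3.0]])
print(perm, l, u)
```

For this 2x2 example the pivot swaps the rows (perm = [1, 0]), giving L = [[1, 0], [0.5, 1]] and U = [[4, 3], [0, -0.5]].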

  4. Structure factors for tunneling ionization rates of diatomic molecules

    SciTech Connect

    Saito, Ryoichi; Tolstikhin, Oleg I.; Madsen, Lars Bojer; Morishita, Toru

    2015-05-15

    Within the leading-order, single-active-electron, and frozen-nuclei approximation of the weak-field asymptotic theory, the rate of tunneling ionization of a molecule in an external static uniform electric field is determined by the structure factor for the highest occupied molecular orbital. We present the results of systematic calculations of structure factors for 40 homonuclear and heteronuclear diatomic molecules by the Hartree–Fock method using a numerical grid-based approach implemented in the program X2DHF.

  5. Nucleon form factors program with SBS at JLAB

    SciTech Connect

    Wojtsekhowski, Bogdan B.

    2014-12-01

    The physics of the nucleon form factors is a basic part of the Jefferson Laboratory program. We review the achievements of the 6-GeV era and the program with the 12-GeV beam and the SBS spectrometer in Hall A, with a focus on the nucleon ground-state properties.

  6. UPDATING THE NRC GUIDANCE FOR HUMAN FACTORS ENGINEERING REVIEWS.

    SciTech Connect

    O'Hara, J.M.; Brown, W.S.; Higgins, J.C.; Persensky, J.J.; Lewis, P.M.; Bongarra, J.

    2002-09-15

    The U.S. Nuclear Regulatory Commission (NRC) reviews the human factors engineering (HFE) aspects of nuclear plants. NUREG-0800 (Standard Review Plan), Chapter 18, "Human Factors Engineering," is the principal NRC staff guidance document. Two main documents provide the review criteria to support the evaluations. The HFE Program Review Model (NUREG-0711) addresses the design process from planning to verification and validation to design implementation. The Human-System Interface Design Review Guidelines (NUREG-0700) provides the guidelines for the review of the HFE aspects of human-system interface technology, such as alarms, information systems, controls, and control room design. Since these documents were published in 1994 and 1996 respectively, they have been used by NRC staff, contractors, nuclear industry organizations, as well as by numerous organizations outside the nuclear industry. Using feedback from users and NRC research conducted in recent years, both documents have been revised and updated. This was done to ensure that they remain state-of-the-art evaluation tools for changing nuclear industry issues and emerging technologies. This paper describes the methodology used to revise and update the documents and summarizes the changes made to each and their current contents. Index Terms for this report are: Control system human factors, Ergonomics, Human factors, Nuclear power generation safety.

  7. The Modern description of semileptonic meson form factors

    SciTech Connect

    Hill, Richard J.

    2006-06-01

    I describe recent advances in our understanding of the hadronic form factors governing semileptonic meson transitions. The resulting framework provides a systematic approach to the experimental data, as a means of extracting precision observables, testing nonperturbative field theory methods, and probing a poorly understood limit of QCD.

  8. Human Factors Evaluation of Advanced Electric Power Grid Visualization Tools

    SciTech Connect

    Greitzer, Frank L.; Dauenhauer, Peter M.; Wierks, Tamara G.; Podmore, Robin

    2009-04-01

    This report describes initial human factors evaluation of four visualization tools (Graphical Contingency Analysis, Force Directed Graphs, Phasor State Estimator and Mode Meter/ Mode Shapes) developed by PNNL, and proposed test plans that may be implemented to evaluate their utility in scenario-based experiments.

  9. Gated Si nanowires for large thermoelectric power factors

    SciTech Connect

    Neophytou, Neophytos; Kosina, Hans

    2014-08-18

    We investigate the effect of electrostatic gating on the thermoelectric power factor of p-type Si nanowires (NWs) of up to 20 nm in diameter in the [100], [110], and [111] crystallographic transport orientations. We use atomistic tight-binding simulations for the calculation of the NW electronic structure, coupled to linearized Boltzmann transport equation for the calculation of the thermoelectric coefficients. We show that gated NW structures can provide ∼5× larger thermoelectric power factor compared to doped channels, attributed to their high hole phonon-limited mobility, as well as gating induced bandstructure modifications which further improve mobility. Despite the fact that gating shifts the charge carriers near the NW surface, surface roughness scattering is not strong enough to degrade the transport properties of the accumulated hole layer. The highest power factor is achieved for the [111] NW, followed by the [110], and finally by the [100] NW. As the NW diameter increases, the advantage of the gated channel is reduced. We show, however, that even at 20 nm diameters (the largest ones that we were able to simulate), a ∼3× higher power factor for gated channels is observed. Our simulations suggest that the advantage of gating could still be present in NWs with diameters of up to ∼40 nm.

  10. An overview of transverse momentum dependent factorization and evolution

    DOE PAGES [OSTI]

    Rogers, Ted C.

    2016-06-17

    I review TMD factorization and evolution theorems, with an emphasis on the treatment by Collins and originating in the Collins-Soper-Sterman (CSS) formalism. Furthermore, I summarize basic results while attempting to trace their development over the past several decades.

  11. Effect of Environmental Factors on Sulfur Gas Emissions from Drywall

    SciTech Connect

    Maddalena, Randy

    2011-08-20

    Problem drywall installed in U.S. homes is suspected of being a source of odorous and potentially corrosive indoor pollutants. The U.S. Consumer Product Safety Commission's (CPSC) investigation of problem drywall incorporates three parallel tracks: (1) evaluating the relationship between the drywall and reported health symptoms; (2) evaluating the relationship between the drywall and electrical and fire safety issues in affected homes; and (3) tracing the origin and the distribution of the drywall. To assess the potential impact on human health and to support testing for electrical and fire safety, the CPSC has initiated a series of laboratory tests that provide elemental characterization of drywall, characterization of chemical emissions, and in-home air sampling. The chemical emission testing was conducted at Lawrence Berkeley National Laboratory (LBNL). The LBNL study consisted of two phases. In Phase 1 of this study, LBNL tested thirty drywall samples provided by CPSC and reported standard emission factors for volatile organic compounds (VOCs), aldehydes, reactive sulfur gases (RSGs) and volatile sulfur compounds (VSCs). The standard emission factors were determined using small (10.75 liter) dynamic test chambers housed in a constant temperature environmental chamber. The tests were all run at 25 C, 50% relative humidity (RH) and with an area-specific ventilation rate of ~1.5 cubic meters per square meter of emitting surface per hour [m^3/m^2/h]. The thirty samples that were tested in Phase 1 included seventeen that were manufactured in China in 2005, 2006 and 2009, and thirteen that were manufactured in North America in 2009. The measured emission factors for VOCs and aldehydes were generally low and did not differ significantly between the Chinese and North American drywall. Eight of the samples tested had elevated emissions of volatile sulfur-containing compounds with total RSG emission factors between 32 and 258 micrograms per square meter

  12. Tuning g factors of core-shell nanoparticles by controlled positioning...

    Office of Scientific and Technical Information (OSTI)

    Tuning g factors of core-shell nanoparticles by controlled positioning of magnetic ... 22, 2017 Title: Tuning g factors of core-shell nanoparticles by ...

  13. Discovery of Novel P1 Groups for Coagulation Factor VIIa Inhibition...

    Office of Scientific and Technical Information (OSTI)

    for Coagulation Factor VIIa Inhibition Using Fragment-Based Screening Citation Details In-Document Search Title: Discovery of Novel P1 Groups for Coagulation Factor VIIa Inhibition ...

  14. Assessment of Factors Influencing Effective CO{sub 2} Storage Capacity and Injectivity in Eastern Gas Shales

    SciTech Connect

    Godec, Michael

    2013-06-30

    Building upon advances in technology, production of natural gas from organic-rich shales is rapidly developing as a major hydrocarbon supply option in North America and around the world. The same technology advances that have facilitated this revolution - dense well spacing, horizontal drilling, and hydraulic fracturing - may help to facilitate enhanced gas recovery (EGR) and carbon dioxide (CO_2) storage in these formations. The potential storage of CO_2 in shales is attracting increasing interest, especially in Appalachian Basin states that have extensive shale deposits, but limited CO_2 storage capacity in conventional reservoirs. The goal of this cooperative research project was to build upon previous and on-going work to assess key factors that could influence effective EGR, CO_2 storage capacity, and injectivity in selected Eastern gas shales, including the Devonian Marcellus Shale, the Devonian Ohio Shale, the Ordovician Utica and Point Pleasant shale and equivalent formations, and the late Devonian-age Antrim Shale. The project had the following objectives: (1) Analyze and synthesize geologic information and reservoir data through collaboration with selected State geological surveys, universities, and oil and gas operators; (2) Improve reservoir models to perform reservoir simulations to better understand the shale characteristics that impact EGR, storage capacity and CO_2 injectivity in the targeted shales; (3) Analyze results of a targeted, highly monitored, small-scale CO_2 injection test and incorporate into ongoing characterization and simulation work; (4) Test and model a smart particle early warning concept that can potentially be used to inject water with uniquely labeled particles before the start of CO_2 injection; (5) Identify and evaluate potential constraints to economic CO_2 storage in gas shales, and propose development approaches that overcome these constraints; and (6) Complete new basin

  15. Precision Measurements of the Proton Elastic Form Factor Ratio

    SciTech Connect

    Douglas Higinbotham

    2010-08-01

    New high precision polarization measurements of the proton elastic form factor ratio in the Q^2 range from 0.3 to 0.7 [GeV/c]^2 have been made. These elastic H(e,e'p) measurements were done in Jefferson Lab's Hall A using 80% longitudinally polarized electrons and recoil polarimetry. For Q^2 greater than 1 [GeV/c]^2, previous polarization data indicated a strong deviation of the form factor ratio from unity, which sparked renewed theoretical and experimental interest in how two-photon diagrams are taken into account. The new high precision data indicate that the deviation from unity, while small, persists even at Q^2 less than 1 [GeV/c]^2.

  16. Measurement of the gamma gamma* -> pi0 transition form factor

    SciTech Connect

    Aubert, B.

    2009-06-02

    We study the reaction e{sup +}e{sup -} {yields} e{sup +}e{sup -}{pi}{sup 0} in the single tag mode and measure the differential cross section d{sigma}/dQ{sup 2} and the {gamma}{gamma}* {yields} {pi}{sup 0} transition form factor in the momentum transfer range from 4 to 40 GeV{sup 2}. At Q{sup 2} > 10 GeV{sup 2} the measured form factor exceeds the asymptotic limit predicted by perturbative QCD. The analysis is based on 442 fb{sup -1} of integrated luminosity collected at PEP-II with the BABAR detector at e{sup +}e{sup -} center-of-mass energies near 10.6 GeV.

  17. Factorization of large integers on a massively parallel computer

    SciTech Connect

    Davis, J.A.; Holdridge, D.B.

    1988-01-01

    Our interest in integer factorization at Sandia National Laboratories is motivated by cryptographic applications and in particular the security of the RSA encryption-decryption algorithm. We have implemented our version of the quadratic sieve procedure on the NCUBE computer with 1024 processors (nodes). The new code is significantly different in all important aspects from the program used to factor numbers of order 10{sup 70} on a single-processor CRAY computer. The capabilities of parallel processing and the limitation of small local memory necessitated this entirely new implementation. This effort involved several restarts, as program structures that seemed appealing bogged down due to inter-processor communications. We are presently working with integers of magnitude about 10{sup 70} in tuning this code to the novel hardware. 6 refs., 3 figs.
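    The core idea behind the quadratic sieve is the classical congruence of squares: find x and y with x^2 ≡ y^2 (mod n) and x not congruent to ±y (mod n), so that x - y shares a nontrivial factor with n. A minimal single-processor sketch of that idea (Fermat's method, not the parallel sieve itself) might look like:

```python
import math

def fermat_factor(n):
    """Search for x such that x*x - n is a perfect square y*y;
    then n = (x - y)(x + y) gives a nontrivial factorization.
    Assumes n is an odd composite."""
    x = math.isqrt(n)
    if x * x < n:
        x += 1
    while True:
        y2 = x * x - n
        y = math.isqrt(y2)
        if y * y == y2:
            return x - y  # a nontrivial factor of n
        x += 1

# fermat_factor(5959) -> 59, since 5959 = 59 * 101
```

    The quadratic sieve scales this idea up by collecting many smooth relations and combining them with linear algebra; the function above shows only the underlying congruence and is workable only for small n.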

  18. Factors driving wind power development in the United States

    SciTech Connect

    Bird, Lori A.; Parsons, Brian; Gagliano, Troy; Brown, Matthew H.; Wiser, Ryan H.; Bolinger, Mark

    2003-05-15

    In the United States, there has been substantial recent growth in wind energy generating capacity, with growth averaging 24 percent annually during the past five years. About 1,700 MW of wind energy capacity was installed in 2001, while another 410 MW became operational in 2002. This year (2003) shows promise of significant growth with more than 1,500 MW planned. With this growth, an increasing number of states are experiencing investment in wind energy projects. Wind installations currently exist in about half of all U.S. states. This paper explores the key factors at play in the states that have achieved a substantial amount of wind energy investment. Some of the factors that are examined include policy drivers, such as renewable portfolio standards (RPS), federal and state financial incentives, and integrated resource planning; as well as market drivers, such as consumer demand for green power, natural gas price volatility, and wholesale market rules.

  19. Performance of non-conventional factorization approaches for neutron kinetics

    SciTech Connect

    Bulla, S.; Nervo, M.

    2013-07-01

    The use of factorization techniques provides an interesting option for simulating the time-dependent behavior of nuclear systems with reduced computational effort. While point kinetics neglects all spatial and spectral effects, quasi-statics and multipoint kinetics can produce results with higher accuracy for transients involving relevant modifications of the neutron distribution. However, in some conditions these methods cannot work efficiently. In this paper, we discuss some possible alternative formulations of the factorization process for neutron kinetics, leading to mathematical models of reduced complexity that can allow an accurate simulation of transients involving spatial and spectral effects. The performance of these innovative approaches is compared to standard techniques for some test cases, showing the benefits and shortcomings of the methods proposed. (authors)

  20. Factors that affect electric-utility stranded commitments

    SciTech Connect

    Hirst, E.; Hadley, S.; Baxter, L.

    1996-07-01

    Estimates of stranded commitments for U.S. investor-owned utilities range widely, with many falling in the range of $100 to $200 billion. These potential losses exist because some utility-owned power plants, long-term power-purchase contracts and fuel-supply contracts, regulatory assets, and expenses for public-policy programs have book values that exceed their expected market values under full competition. This report quantifies the sensitivity of stranded-commitment estimates to the various factors that lead to these above-market-value estimates. The purpose of these sensitivity analyses is to improve understanding on the part of state and federal regulators, utilities, customers, and other electric-industry participants about the relative importance of the factors that affect stranded-commitment amounts.

  1. Surface engineering of the quality factor of metal coated microcantilevers

    SciTech Connect

    Ergincan, O.; Kooi, B. J.; Palasantzas, G.

    2014-12-14

    We performed noise measurements to obtain the quality factor (Q) and frequency shift of gold coated microcantilevers before and after surface modification using focused ion beam. As a result of our studies, it is demonstrated that surface engineering offers a promising method to control and increase the Q factor up to 50% for operation in vacuum. Surface modification could also lead to deviations from the known Q ∼ P{sup −1} behavior at low vacuum pressures P within the molecular regime. Finally, at higher pressures within the continuum regime, where Q is less sensitive to surface changes, a power scaling Q ∼ P{sup c} with c ≈ 0.3 was found instead of c = 0.5. The latter is explained via a semi-empirical formulation to account for continuum dissipation mechanisms at significant Reynolds numbers Re ∼ 1.

  2. Geological and Anthropogenic Factors Influencing Mercury Speciation in Mine

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

    Wastes. Christopher S. Kim,1 James J. Rytuba,2 Gordon E. Brown, Jr.3 (1Department of Physical Sciences, Chapman University, Orange, CA 92866; 2U.S. Geological Survey, Menlo Park, CA 94025; 3Department of Geological and Environmental Sciences, Stanford University, Stanford, CA 94305). Introduction. [Figure 1: Dr. Christopher Kim collects a mine waste sample from the Oat Hill mercury mine in Northern California.] The

  3. Limiting Factors for Convective Cloud Top Height in the Tropics

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

    M. P. Jensen and A. D. Del Genio, National Aeronautics and Space Administration, Goddard Institute for Space Studies, Columbia University, New York, New York. Introduction: Populations of tropical convective clouds are mainly composed of three types: shallow trade cumulus, mid-level cumulus congestus, and deep convective clouds (Johnson et al. 1999). Each of these cloud types has different impacts on the local radiation and water budgets.

  4. Structure of Plasmodium falciparum ADP-ribosylation factor 1

    SciTech Connect

    Cook, William J.; Smith, Craig D.; Senkovich, Olga; Holder, Anthony A.; Chattopadhyay, Debasish

    2011-09-26

    Vesicular trafficking may play a crucial role in the pathogenesis and survival of the malaria parasite. ADP-ribosylation factors (ARFs) are among the major components of vesicular trafficking pathways in eukaryotes. The crystal structure of ARF1 GTPase from Plasmodium falciparum has been determined in the GDP-bound conformation at 2.5 {angstrom} resolution and is compared with the structures of mammalian ARF1s.

  5. Method for factor analysis of GC/MS data

    DOEpatents

    Van Benthem, Mark H; Kotula, Paul G; Keenan, Michael R

    2012-09-11

    The method of the present invention provides a fast, robust, and automated multivariate statistical analysis of gas chromatography/mass spectrometry (GC/MS) data sets. The method can involve systematic elimination of undesired, saturated peak masses to yield data that follow a linear, additive model. The cleaned data can then be subjected to a combination of principal component analysis (PCA) and orthogonal factor rotation, followed by refinement with multivariate curve resolution-alternating least squares (MCR-ALS), to yield highly interpretable results.

  6. Method for determining formation quality factor from seismic data

    DOEpatents

    Taner, M. Turhan; Treitel, Sven

    2005-08-16

    A method is disclosed for calculating the quality factor Q from a seismic data trace. The method includes calculating a first and a second minimum phase inverse wavelet at a first and a second time interval along the seismic data trace, synthetically dividing the first wavelet by the second wavelet, Fourier transforming the result of the synthetic division, calculating the logarithm of this quotient of Fourier transforms and determining the slope of a best fit line to the logarithm of the quotient.
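    The final step of the method above (the logarithm of a spectral quotient, then the slope of a best-fit line) can be sketched as follows. The amplitude-ratio model ln|A2(f)/A1(f)| = -pi f Δt / Q and the synthetic numbers below are illustrative assumptions for this listing, not the patented procedure itself:

```python
import math

def estimate_q(freqs, log_ratio, dt):
    """Least-squares slope of ln(A2/A1) versus frequency;
    under the model ln(A2/A1) = -pi*f*dt/Q, Q = -pi*dt/slope."""
    n = len(freqs)
    mf = sum(freqs) / n
    mr = sum(log_ratio) / n
    slope = sum((f - mf) * (r - mr) for f, r in zip(freqs, log_ratio)) \
        / sum((f - mf) ** 2 for f in freqs)
    return -math.pi * dt / slope

# Synthetic check: two spectra separated by dt = 0.5 s with true Q = 80.
dt, q_true = 0.5, 80.0
freqs = [float(f) for f in range(5, 60, 5)]
log_ratio = [-math.pi * f * dt / q_true for f in freqs]
# estimate_q(freqs, log_ratio, dt) recovers Q of about 80
```

    On real traces the two spectra would come from the Fourier transforms of the synthetically divided wavelets, as the abstract describes; the slope fit is the same.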

  7. Factors influencing photocurrent generation in organic bulk heterojunction

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

    solar cells: interfacial energetics and blend microstructure. April 29, 2009 at 3pm/36-428. Jenny Nelson, Department of Physics, Imperial College London. Abstract: The efficiency of photocurrent generation in conjugated polymer:small molecule blend solar cells is strongly influenced both by the energy level alignment

  8. Tuning the thermoelectric power factor in carbon nanotube films

    U.S. Department of Energy (DOE) - all webpages (Extended Search)

    Ben Zhou 1, Azure Avery 2, Andrew Ferguson 2, Jeff Blackburn 2. [Figure: schematic of a thermoelectric device (Wikipedia).] Introduction: * Single-walled carbon nanotubes (SWCNTs) are promising thermoelectrics because of their good conductivity and one-dimensional density of states. Materials and Methods: * Ink Preparation: (7,5) nanotubes were dispersed by

  9. Transcription factors for modification of lignin content in plants

    DOEpatents

    Wang, Huanzhong; Chen, Fang; Dixon, Richard A.

    2015-06-02

    The invention provides methods for modifying lignin, cellulose, xylan, and hemicellulose content in plants, and for achieving ectopic lignification and, for instance, secondary cell wall synthesis in pith cells, by altered regulation of a WRKY transcription factor. Nucleic acid constructs for altered WRKY-TF expression are described. Transgenic plants are provided that comprise modified pith cell walls, and lignin, cellulose, and hemicellulose content. Plants described herein may be used, for example, as improved biofuel feedstock and as highly digestible forage crops.

  10. Simulation: Moving from Technology Challenge to Human Factors Success

    SciTech Connect

    Gould, Derek A.; Chalmers, Nicholas; Johnson, Sheena J.; Kilkenny, Caroline; White, Mark D.; Bech, Bo; Lonn, Lars; Bello, Fernando

    2012-06-15

    Recognition of the many limitations of traditional apprenticeship training is driving new approaches to learning medical procedural skills. Among simulation technologies and methods available today, computer-based systems are topical and bring the benefits of automated, repeatable, and reliable performance assessments. Human factors research is central to simulator model development that is relevant to real-world imaging-guided interventional tasks and to the credentialing programs in which it would be used.

  11. Human factors engineering report for the cold vacuum drying facility

    SciTech Connect

    IMKER, F.W.

    1999-06-30

    The purpose of this report is to present the results and findings of the final Human Factors Engineering (HFE) technical analysis and evaluation of the Cold Vacuum Drying Facility (CVDF). Ergonomics issues are also addressed in this report, as appropriate. This report follows up and completes the preliminary work accomplished and reported by the Preliminary HFE Analysis report (SNF-2825, Spent Nuclear Fuel Project Cold Vacuum Drying Facility Human Factors Engineering Analysis: Results and Findings). This analysis avoids redundancy of effort except for ensuring that previously recommended HFE design changes have not affected other parts of the system. Changes in one part of the system may affect other parts of the system where those changes were not applied. The final HFE analysis and evaluation of the CVDF human-machine interactions (HMI) was expanded to include: the physical work environment, human-computer interface (HCI) including workstation and software, operator tasks, tools, maintainability, communications, staffing, training, and the overall ability of humans to accomplish their responsibilities, as appropriate. Key focal areas for this report are the process bay operations, process water conditioning (PWC) skid, tank room, and Central Control Room operations. These key areas contain the system safety-class components and are the foundation for the human factors design basis of the CVDF.

  12. Dirac equation in low dimensions: The factorization method

    SciTech Connect

    Sánchez-Monroy, J.A.; Quimbay, C.J.

    2014-11-15

    We present a general approach to solve the (1+1) and (2+1)-dimensional Dirac equations in the presence of static scalar, pseudoscalar and gauge potentials, for the case in which the potentials have the same functional form and thus the factorization method can be applied. We show that the presence of electric potentials in the Dirac equation leads to two Klein-Gordon equations including an energy-dependent potential. We then generalize the factorization method for the case of energy-dependent Hamiltonians. Additionally, the shape invariance is generalized for a specific class of energy-dependent Hamiltonians. We also present a condition for the absence of the Klein paradox (stability of the Dirac sea), showing how Dirac particles in low dimensions can be confined for a wide family of potentials. - Highlights: • The low-dimensional Dirac equation in the presence of static potentials is solved. • The factorization method is generalized for energy-dependent Hamiltonians. • The shape invariance is generalized for energy-dependent Hamiltonians. • The stability of the Dirac sea is related to the existence of supersymmetric partner Hamiltonians.

  13. Critical success factors in implementing process safety management

    SciTech Connect

    Wilson, D.J. [Chevron USA, Inc., New Orleans, LA (United States)]

    1996-08-01

    This paper focuses on several "Critical Success Factors" which will determine how well employees will embrace and utilize the changes being asked of them to implement Process Safety Management (PSM). These success factors are applicable to any change which involves asking employees to perform activities differently than they are currently performing them. This includes changes in work processes (the way we arrange and conduct a set of tasks) or changes in work activities (how we perform individual tasks). Simply developing new work processes and explaining them to employees is not enough to ensure that employees will actually utilize them -- no matter how good these processes are. To ensure successful, complete implementation of Process Safety Management, we must manage the transition from how we perform our work now to how we will perform it after PSM is implemented. Environmental and safety performance improvements, facility reliability and operability increases, and employee effectiveness and productivity gains CANNOT be achieved until Process Safety Management processes are fully implemented. To successfully implement management of change, mechanical integrity, or any of the other processes in PSM, each of the following critical success factors must be carefully considered and utilized as appropriate. They are: (1) Vision of a Future State, Current State Assessment, and a Detailed Plan to Achieve the Future State, (2) Management Commitment, (3) Ownership by Key Individuals, (4) Justification for Actions, (5) Autonomy to Customize the Process, (6) Feedback Mechanism to Adjust Activities, and (7) Process to Refocus & Redirect Efforts.

  14. Transfer Factors for Contaminant Uptake by Fruit and Nut Trees

    SciTech Connect

    Napier, Bruce A.; Fellows, Robert J.; Minc, Leah D.

    2013-11-20

    Transfer of radionuclides from soils into plants is one of the key mechanisms for long-term contamination of the human food chain. Nearly all computer models that address soil-to-plant uptake of radionuclides use empirically-derived transfer factors to address this process. Essentially all available soil-to-plant transfer factors are based on measurements in annual crops. Because very few measurements are available for tree fruits, samples were taken of alfalfa and oats and the stems, leaves, and fruits and nuts of almond, apple, apricot, carob, fig, grape, nectarine, pecan, pistachio (natural and grafted), and pomegranate, along with local surface soil. The samples were dried, ground, weighed, and analyzed for trace constituents through a combination of induction-coupled plasma mass spectrometry and instrumental neutron activation analysis for a wide range of naturally-occurring elements. Analysis results are presented and converted to soil-to-plant transfer factors. These are compared to commonly used and internationally recommended values. Those determined for annual crops are very similar to commonly-used values; those determined for tree fruits show interesting differences. Most macro- and micronutrients are slightly reduced in fruits; non-essential elements are reduced further. These findings may be used in existing computer models and may allow development of tree-fruit-specific transfer models.
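    The transfer factors derived in this record are, by the usual definition, simple concentration ratios between plant tissue and soil. A minimal sketch with hypothetical numbers (the values below are not from the report):

```python
def transfer_factor(plant_conc, soil_conc):
    """Soil-to-plant transfer factor:
    TF = concentration in plant tissue / concentration in soil,
    both expressed in the same units (e.g. mg/kg dry weight)."""
    return plant_conc / soil_conc

# Hypothetical example: 0.12 mg/kg measured in fruit,
# 15.0 mg/kg in the local surface soil.
tf = transfer_factor(0.12, 15.0)  # 0.008
```

    A value well below 1, as here, reflects the abstract's finding that most elements are reduced in fruits relative to soil.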

  15. Energy Factor Analysis for Gas Heat Pump Water Heaters

    SciTech Connect

    Gluesenkamp, Kyle R

    2016-01-01

    Gas heat pump water heaters (HPWHs) can improve water heating efficiency with zero GWP and zero ODP working fluids. The energy factor (EF) of a gas HPWH is sensitive to several factors. In this work, expressions are derived for the EF of gas HPWHs as a function of heat pump cycle COP, tank heat losses, burner efficiency, electrical draw, and effectiveness of supplemental heat exchangers. The expressions are used to investigate the sensitivity of EF to each parameter. EF is evaluated on a site energy basis (as used by the US DOE for rating water heater EF), and a primary-energy-basis energy factor (PEF) is also defined and included. Typical ranges of values for the six parameters are given. For gas HPWHs, using typical ranges for component performance, EF will be 59-80% of the heat pump cycle thermal COP (for example, a COP of 1.60 may result in an EF of 0.94-1.28). Most of the reduction in COP is due to burner efficiency and tank heat losses. Gas-fired HPWHs are theoretically capable of an EF of up to 1.7 (PEF of 1.6), while an EF of 1.1-1.3 (PEF of 1.0-1.1) is expected from an early market entry.
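    One way to see why EF lands well below the cycle COP is as a multiplicative chain of loss terms. The functional form and numbers below are an illustrative assumption for this listing, not the paper's derived expressions:

```python
def site_energy_factor(cop, burner_eff, tank_loss_frac, hx_eff=1.0):
    """Illustrative site-basis EF: heat pump cycle COP degraded by
    burner efficiency, tank standby losses, and supplemental
    heat-exchanger effectiveness (electrical draw neglected)."""
    return cop * burner_eff * (1.0 - tank_loss_frac) * hx_eff

# With a cycle COP of 1.60, 80% burner efficiency, and 10% tank losses:
ef = site_energy_factor(1.60, 0.80, 0.10)  # about 1.15
```

    Here EF/COP = 0.72, consistent with the abstract's statement that EF falls in the 59-80% band of the cycle COP, with burner efficiency and tank losses doing most of the damage.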

  16. Automatic Blocking Of QR and LU Factorizations for Locality

    SciTech Connect

    Yi, Q; Kennedy, K; You, H; Seymour, K; Dongarra, J

    2004-03-26

    QR and LU factorizations for dense matrices are important linear algebra computations that are widely used in scientific applications. To efficiently perform these computations on modern computers, the factorization algorithms need to be blocked when operating on large matrices to effectively exploit the deep cache hierarchy prevalent in today's computer memory systems. Because both QR (based on Householder transformations) and LU factorization algorithms contain complex loop structures, few compilers can fully automate the blocking of these algorithms. Though linear algebra libraries such as LAPACK provide manually blocked implementations of these algorithms, automatically generating blocked versions of the computations offers additional benefits, such as automatic adaptation of different blocking strategies. This paper demonstrates how to apply an aggressive loop transformation technique, dependence hoisting, to produce efficient blockings for both QR and LU with partial pivoting. We present different blocking strategies that can be generated by our optimizer and compare the performance of auto-blocked versions with manually tuned versions in LAPACK, using reference BLAS, ATLAS BLAS, and native BLAS specially tuned for the underlying machine architectures.
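    The kind of blocking such an optimizer must generate can be illustrated with a right-looking blocked LU factorization. For brevity this sketch omits the partial pivoting that the paper's versions include, and uses plain Python lists rather than tuned BLAS:

```python
def lu_blocked(a, nb=2):
    """In-place right-looking blocked LU without pivoting: afterwards,
    `a` holds L (unit lower triangle, below the diagonal) and U (upper
    triangle). Processing nb-wide panels keeps the trailing update
    working on cache-sized blocks, which is the point of blocking."""
    n = len(a)
    for k in range(0, n, nb):
        kb = min(nb, n - k)
        # 1. Unblocked LU of the panel (columns k..k+kb-1, all rows below).
        for j in range(k, k + kb):
            for i in range(j + 1, n):
                a[i][j] /= a[j][j]
                for c in range(j + 1, k + kb):
                    a[i][c] -= a[i][j] * a[j][c]
        # 2. Triangular solve for the U block right of the panel.
        for j in range(k + kb, n):
            for r in range(k + 1, k + kb):
                for c in range(k, r):
                    a[r][j] -= a[r][c] * a[c][j]
        # 3. Trailing submatrix update (the GEMM-like blocked step).
        for i in range(k + kb, n):
            for j in range(k + kb, n):
                for c in range(k, k + kb):
                    a[i][j] -= a[i][c] * a[c][j]
    return a
```

    Step 3 is where blocking pays off: it is a matrix-matrix update that a tuned BLAS can execute at near-peak speed, whereas the unblocked algorithm performs the same arithmetic as cache-unfriendly rank-1 updates.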

  17. Patient-based radiographic exposure factor selection: a systematic review

    SciTech Connect

    Ching, William; Robinson, John; McEntee, Mark

    2014-09-15

    Digital technology has wider exposure latitude and post-processing algorithms which can mask the evidence of underexposure and overexposure. Underexposure produces noisy, grainy images which can impede diagnosis and overexposure results in a greater radiation dose to the patient. These exposure errors can result from inaccurate adjustment of exposure factors in response to changes in patient thickness. This study aims to identify all published radiographic exposure adaptation systems which have been, or are being, used in general radiography and discuss their applicability to digital systems. Studies in EMBASE, MEDLINE, CINAHL and SCOPUS were systematically reviewed. Some of the search terms used were exposure adaptation, exposure selection, exposure technique, 25% rule, 15% rule, DuPont™ Bit System and radiography. A manual journal-specific search was also conducted in The Radiographer and Radiologic Technology. Studies were included if they demonstrated a system of altering exposure factors to compensate for variations in patients for general radiography. Studies were excluded if they focused on finding optimal exposures for an ‘average’ patient or focused on the relationship between exposure factors and dose. The database search uncovered 11 articles and the journal-specific search uncovered 13 articles discussing systems of exposure adaptation. They can be categorised as simple one-step guidelines, comprehensive charts and computer programs. Only two papers assessed the efficacy of exposure adjustment systems. No literature compares the efficacy of exposure adaptations system for film/screen radiography with digital radiography technology nor is there literature on a digital specific exposure adaptation system.
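    Two of the simple one-step guidelines named among the search terms can be sketched as they are commonly stated in radiography texts; the exact formulations vary between sources, so these are assumptions for illustration, not findings of the review:

```python
def mas_for_thickness(base_mas, delta_cm):
    """'25% rule' as commonly stated: change mAs by about 25% for
    each centimetre the part is thicker (+) or thinner (-) than the
    baseline patient."""
    return base_mas * (1.25 ** delta_cm)

def kvp_15_percent(base_kvp, doublings=1):
    """'15% rule' as commonly stated: raising kVp by 15% roughly
    doubles receptor exposure, substituting for a doubling of mAs."""
    return base_kvp * (1.15 ** doublings)

# Hypothetical baseline technique of 10 mAs at 70 kVp:
# a patient 2 cm thicker -> mas_for_thickness(10, 2) gives 15.625 mAs,
# and one application of the 15% rule -> kvp_15_percent(70) gives 80.5 kVp.
```

    As the abstract notes, it is exactly such one-step guidelines (alongside comprehensive charts and computer programs) whose applicability to digital systems remains largely untested.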

  18. The Exposure Rate Conversion Factor for Nuclear Fallout

    SciTech Connect

    Spriggs, G D

    2009-02-11

    Nuclear fallout comprises approximately 2000 radionuclides. About 1000 of these radionuclides are either primary fission products or activated fission products that are created during the burn process. The exposure rate one meter above the surface produced by this complex mixture of radionuclides varies rapidly with time, since many of the radionuclides are short-lived and decay numerous times before reaching a stable isotope. As a result, the mixture of radionuclides changes rapidly with time. Using a new code developed at the Lawrence Livermore National Laboratory, the mixture of radionuclides at any given point in time can be calculated. The code also calculates the exposure rate conversion factor (ECF) for all 3864 individual isotopes contained in its database based on the total gamma energy released per decay. Based on the combination of isotope mixture and individual ECFs, the time-dependent variation of the composite exposure rate conversion factor for nuclear fallout can be easily calculated. As an example of this new capability, a simple test case corresponding to a 10 kt, uranium-plutonium fuel has been calculated. The results for the time-dependent, composite ECF for this test case are shown in Figure 1. For comparison, we also calculated the composite exposure rate conversion factor using the conversion factors found in Federal Guidance Report No. 12 (FGR-12) published by ORNL, which contains the conversion factors for approximately 1000 isotopes. As can be noted from Figure 1, the two functions agree reasonably well at times greater than about 30 minutes. However, they do not agree at early times, since FGR-12 does not include all of the short-lived isotopes that are produced in nuclear fallout. It should also be noted that the composite ECF at one hour is 19.7 R/hr per Ci/m{sup 2}. This corresponds to 3148 R/hr per 1 kt per square mile, which agrees reasonably well with the value of 3000 R/hr per 1 kt per square mile as quoted by Glasstone. We have
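    The composite ECF described above is an activity-weighted combination of per-isotope conversion factors, re-evaluated as the mixture decays. A minimal sketch with two hypothetical isotopes (not the LLNL code or its data):

```python
import math

def composite_ecf(inventory, t):
    """Composite exposure-rate conversion factor of a mixture at time t
    (hours): activity-weighted mean of per-isotope ECFs.
    inventory: list of (initial_activity_Ci, half_life_h, ecf) tuples."""
    activities = [a0 * math.exp(-math.log(2.0) * t / hl)
                  for a0, hl, _ in inventory]
    total = sum(activities)
    return sum(act * ecf
               for act, (_, _, ecf) in zip(activities, inventory)) / total

# Two hypothetical isotopes: a short-lived one with a high ECF and a
# long-lived one with a low ECF, each starting at 100 Ci.
mix = [(100.0, 0.5, 30.0), (100.0, 500.0, 5.0)]
# composite_ecf(mix, 0.0) is 17.5; the value falls toward 5.0 as the
# short-lived component decays away.
```

    This time dependence is exactly why a database limited to ~1000 longer-lived isotopes (as in FGR-12) diverges from the full mixture at early times.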

  19. PHASER 2.10 methodology for dependence, importance, and sensitivity: The role of scale factors, confidence factors, and extremes

    SciTech Connect

    Cooper, J.A.

    1996-09-01

    PHASER (Probabilistic Hybrid Analytical System Evaluation Routine) is a software tool that has the capability of incorporating subjective expert judgment into probabilistic safety analysis (PSA) along with conventional data inputs. An earlier report described the PHASER methodology, but only gave a cursory explanation about how dependence was incorporated in Version 1.10 and about how "Importance" and "Sensitivity" measures were to be incorporated in Version 2.00. A more detailed description is given in this report. The basic concepts involve scale factors and confidence factors that are associated with the stochastic variability and subjective uncertainty (which are common adjuncts used in PSA), and the safety risk extremes that are crucial to safety assessment. These are all utilized to illustrate methodology for incorporating dependence among analysis variables in generating PSA results, and for Importance and Sensitivity measures associated with the results that help point out where any major sources of safety concern arise and where any major sources of uncertainty reside, respectively.

  20. Preparation and characterization of cobalt-substituted anthrax lethal factor

    SciTech Connect

    Saebel, Crystal E.; Carbone, Ryan; Dabous, John R.; Lo, Suet Y. [Department of Chemistry and Biochemistry, Laurentian University, 935 Ramsey Lake Rd., Sudbury, Ontario, Canada P3E 2C6 (Canada)]; Siemann, Stefan, E-mail: ssiemann@laurentian.ca [Department of Chemistry and Biochemistry, Laurentian University, 935 Ramsey Lake Rd., Sudbury, Ontario, Canada P3E 2C6 (Canada)]

    2011-12-09

    Highlights: • Cobalt-substituted anthrax lethal factor (CoLF) is highly active. • CoLF can be prepared by bio-assimilation and direct exchange. • Lethal factor binds cobalt tightly. • The electronic spectrum of CoLF reveals penta-coordination. • Interaction of CoLF with thioglycolic acid follows a 2-step mechanism. -- Abstract: Anthrax lethal factor (LF) is a zinc-dependent endopeptidase involved in the cleavage of mitogen-activated protein kinase kinases near their N-termini. The current report concerns the preparation of cobalt-substituted LF (CoLF) and its characterization by electronic spectroscopy. Two strategies to produce CoLF were explored, including (i) a bio-assimilation approach involving the cultivation of LF-expressing Bacillus megaterium cells in the presence of CoCl{sub 2}, and (ii) direct exchange by treatment of zinc-LF with CoCl{sub 2}. Independent of the method employed, the protein was found to contain one Co{sup 2+} per LF molecule, and was shown to be twice as active as its native zinc counterpart. The electronic spectrum of CoLF suggests the Co{sup 2+} ion to be five-coordinate, an observation similar to that reported for other Co{sup 2+}-substituted gluzincins, but distinct from that documented for the crystal structure of native LF. Furthermore, spectroscopic studies following the exposure of CoLF to thioglycolic acid (TGA) revealed a sequential mechanism of metal removal from LF, which likely involves the formation of an enzyme:Co{sup 2+}:TGA ternary complex prior to demetallation of the active site. CoLF reported herein constitutes the first spectroscopic probe of LF's active site, which may be utilized in future studies to gain further insight into the enzyme's mechanism and inhibitor interactions.

  1. Factors affecting expanded electricity trade in North America

    SciTech Connect

    Hill, L.J.

    1994-01-01

    The authors explore factors that affect electricity trade between enterprises in the US and Canada and the US and Mexico. They look at the underlying policy and institutional factors that affect the relative costs of producing electricity in the three countries. In particular, they consider six factors that appear to have a significant impact on electricity trade in North America: differences in the types of economic regulation of power, leading to differences in cost recovery for wholesale and retail power and wheeling charges; changing regulatory attitudes, placing more emphasis on demand-side management and environmental concerns; differences in energy and economic policies; differences in national and subnational environmental policies; the changing organization of electric power industries, which may foster uncertainty, change historical relationships, and provide other potentially important sources of power for distribution utilities; and differences in the ability of enterprises to gain access to electric power markets because of restrictions placed on transmission access. In Section 2, the authors discuss the regulation of electricity trade in North America and provide an overview of the recent trading experience for electricity between Canada and the US and between Mexico and the US, including the volume of that trade over the past decade and existing transmission capacity between regions of the three countries. In Section 3, they look at the benefits that accrue to trading countries and what those benefits are likely to be for the three countries. The discussion in Section 4 centers on the relevant provisions of the Canada Free Trade Agreement and the proposed North American Free Trade Agreement. In Section 5, they set the stage for the discussion of policy and institutional differences presented in Section 6 by outlining differences in the organization of the electric power sectors of Canada, the US, and Mexico. The study is synthesized in Section 7.

  2. On the relationship between formation resistivity factor and porosity

    SciTech Connect

    Perez-Rosales, C.

    1982-08-01

    A theory on the relationship between formation resistivity factor and porosity is presented. This theory considers that, from the standpoint of the flow of electric current within a porous medium saturated with a conducting fluid, the pore space can be divided into flowing and stagnant regions. This assumption leads to a general expression, and formulas currently used in practice are special cases of this expression. The validity of the new expression is established by the use of data corresponding to sandstones and to packings and suspensions of particles. For the case of natural rocks, the theory confirms Archie's equation and gives an interpretation of the physical significance of the so-called cementation exponent.
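    Among the formulas "currently used in practice" that such a general expression reduces to is, presumably, the Archie-type relation F = a / phi^m, with m the cementation exponent mentioned in the abstract. A minimal sketch:

```python
def formation_factor_archie(porosity, m=2.0, a=1.0):
    """Archie-type formation resistivity factor F = a / phi**m,
    where phi is fractional porosity, m the cementation exponent,
    and a a tortuosity coefficient (often taken as 1)."""
    return a / porosity ** m

# For a sandstone with 20% porosity and m = 2:
f = formation_factor_archie(0.20)  # F = 25
```

    In resistivity terms, F = R0/Rw: this rock's brine-saturated resistivity would be 25 times the brine resistivity, and the fitted m is what the theory reinterprets physically via the flowing/stagnant pore-space split.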

  3. Pion Form Factor in Improved Holographic QCD Backgrounds

    SciTech Connect

    Kwee, Herry J.

    2010-08-05

    We extend our recent numerical calculation of the pion electromagnetic form factor F_π(Q²) in holographic QCD with a background field that interpolates between 'hard-wall' and 'soft-wall' models to obtain an improved model that reproduces the desirable phenomenological features of both. In all cases, F_π for large Q² is shallower than data, an effect that can be cured by relaxing the fit to one of the static observables, particularly the decay constant f_π.

  4. Factors Affecting Zebra Mussel Kill by the Bacterium Pseudomonas fluorescens

    SciTech Connect

    Daniel P. Molloy

    2004-02-24

    The specific purpose of this research project was to identify factors that affect zebra mussel kill by the bacterium Pseudomonas fluorescens. Test results obtained during this three-year project identified the following key variables as affecting mussel kill: treatment concentration, treatment duration, mussel siphoning activity, dissolved oxygen concentration, water temperature, and naturally suspended particle load. Using this latter information, the project culminated in a series of pipe tests which achieved high mussel kill inside power plants under once-through conditions using service water in artificial pipes.

  5. Gyromagnetic factors in ^{144-150}Nd

    SciTech Connect

    Giannatiempo, A.

    2011-09-15

    The U(5) to SU(3) evolution of the nuclear structure in the even ^{144-156}Nd isotopes has been investigated in the framework of the interacting boson approximation (IBA-2) model, taking into account the effect of the partial Z=64 subshell closure on the structure of the states of a collective nature. The analysis, which led to a satisfactory description of excitation energy patterns, quadrupole moments, and decay properties of the states (even when important M1 components were present in the transitions), is extended to the available data on g factors in ^{144-150}Nd. Their values are reasonably reproduced by the calculations.
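    For context, the simplest IBA-2 estimate of a collective-state g factor (a standard textbook result, not taken from this paper) is the boson-number-weighted average of the proton- and neutron-boson g factors:

```latex
g \;=\; g_{\pi}\,\frac{N_{\pi}}{N} \;+\; g_{\nu}\,\frac{N_{\nu}}{N},
\qquad N = N_{\pi} + N_{\nu},
```

where N_π and N_ν are the numbers of proton and neutron bosons; deviations from this average probe exactly the kind of subshell and M1 effects the analysis above addresses.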

  6. Greybody factors and charges in Kerr/CFT

    DOE PAGES [OSTI]

    Cvetič, Mirjam; Larsen, Finn

    2009-09-01

    We compute greybody factors for near extreme Kerr black holes in D = 4 and D = 5. In D = 4 we include four charges so that our solutions can be continuously deformed to the BPS limit. In D = 5 we include two independent angular momenta so Left-Right symmetry is incorporated. We discuss the CFT interpretation of our emission amplitudes, including the overall frequency dependence and the dependence on all black hole parameters. We find that all additional parameters can be incorporated in Kerr/CFT, with the central charge independent of the U(1) charges.

  7. Human factors issues in qualitative and quantitative safety analyses

    SciTech Connect

    Hahn, H.A.

    1993-10-01

    Humans are a critical and integral part of any operational system, be it a nuclear reactor, a facility for assembling or disassembling hazardous components, or a transportation network. In our concern over the safety of these systems, we often focus our attention on the hardware engineering components of such systems. However, experience has repeatedly demonstrated that it is often the human component that is the primary determinant of overall system safety. Both the nuclear reactor accidents at Chernobyl and Three Mile Island and shipping disasters such as the Exxon Valdez and the Herald of Free Enterprise accidents are attributable to human error. Concern over human contributions to system safety prompts us to include reviews of human factors issues in our safety analyses. In the conduct of Probabilistic Risk Assessments (PRAs), human factors issues are addressed using a quantitative method called Human Reliability Analysis (HRA). HRAs typically begin with the identification of potential sources of human error in accident sequences of interest. Human error analysis often employs plant and/or procedures walk-downs in which the analyst considers the "goodness" of procedures, training, and human-machine interfaces concerning their potential contribution to human error. Interviews with expert task performers may also be conducted. In the application of HRA, once candidate sources of human error have been identified, error probabilities are developed.

  8. Effectiveness factors for hydroprocessing of coal and coal liquids

    SciTech Connect

    Massoth, F.E.; Seader, J.D.

    1990-03-29

    The aim of this project is to develop a methodology to predict, from a knowledge of feed and catalyst properties, effectiveness factors for catalytic hydroprocessing of coal and coal liquids. To achieve this aim, it is necessary to account for restrictive diffusion, which has not hitherto been done from a fundamental approach under reaction conditions. The research entails a study of hydrodenitrogenation of model compounds and coal-derived liquids using three NiMo/alumina catalysts of different pore size to develop, for restrictive diffusion, a relationship that can be used for estimating reliable effectiveness factors. The research program includes: Task A - measurement of pertinent properties of the catalysts and of several coal liquids; Task B - determination of effective diffusivities and tortuosities of the catalysts; Task C - development of restrictive diffusion correlations from data on model N-compound reactions; Task D - testing of correlations with coal-liquid cuts and whole coal-liquid feed. Results are presented and discussed from Tasks B and D. 9 refs., 6 figs., 4 tabs.

  9. Effectiveness factors for hydroprocessing of coal and coal liquids

    SciTech Connect

    Massoth, F.E.; Seader, J.D.

    1990-01-01

    The aim of this research project is to develop a methodology to predict, from a knowledge of feed and catalyst properties, effectiveness factors for catalytic hydroprocessing of coal and coal liquids. To achieve this aim, it is necessary to account for restrictive diffusion, which has not hitherto been done from a fundamental approach under reaction conditions. The research proposed here entails a study of hydrodenitrogenation of model compounds and coal-derived liquids using three NiMo/alumina catalysts of different pore size to develop, for restrictive diffusion, a relationship that can be used for estimating reliable effectiveness factors. The program is divided into four parts: measurements of pertinent properties of the catalysts and of a coal liquid and its derived boiling-point cuts; determination of effective diffusivities and tortuosities of the catalysts; development of restrictive diffusion correlations from data on model N-compounds at reaction conditions; and testing of correlations with coal-liquid cuts and whole coal-liquid feed, modifying correlations as necessary.

  10. Asymptotic, multigroup flux reconstruction and consistent discontinuity factors

    DOE PAGES [OSTI]

    Trahan, Travis J.; Larsen, Edward W.

    2015-05-12

    Recent theoretical work has led to an asymptotically derived expression for reconstructing the neutron flux from lattice functions and multigroup diffusion solutions. The leading-order asymptotic term is the standard expression for flux reconstruction, i.e., it is the product of a shape function, obtained through a lattice calculation, and the multigroup diffusion solution. The first-order asymptotic correction term is significant only where the gradient of the diffusion solution is not small. Inclusion of this first-order correction term can significantly improve the accuracy of the reconstructed flux. One may define discontinuity factors (DFs) to make certain angular moments of the reconstructed flux continuous across interfaces between assemblies in 1-D. Indeed, the standard assembly discontinuity factors make the zeroth moment (scalar flux) of the reconstructed flux continuous. The inclusion of the correction term in the flux reconstruction provides an additional degree of freedom that can be used to make two angular moments of the reconstructed flux continuous across interfaces by using current DFs in addition to flux DFs. Thus, numerical results demonstrate that using flux and current DFs together can be more accurate than using only flux DFs, and that making the second angular moment continuous can be more accurate than making the zeroth moment continuous.
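    In 1-D, the reconstruction described above can be sketched as follows (my notation, not the authors': f_0 is the shape function and f_1 the first-order correction function from the lattice calculation, and φ_d is the multigroup diffusion solution):

```latex
\phi_{\mathrm{recon}}(x)
\;\approx\;
\underbrace{f_0(x)\,\phi_d(x)}_{\text{leading order}}
\;+\;
\underbrace{f_1(x)\,\frac{d\phi_d}{dx}(x)}_{\text{gradient correction}},
```

with the discontinuity factors then chosen so that selected angular moments of φ_recon are continuous at assembly interfaces.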

  11. The asymptotic convergence factor for a polygon under a perturbation

    SciTech Connect

    Li, X.

    1994-12-31

    Let Ax = b be a large system of linear equations, where A ∈ C^{N×N} is nonsingular and b ∈ C^N. Several iterative methods for solving such systems have recently been presented for the case where A is nonsymmetric. Many of their algorithms consist of two phases: Phase I: estimate the extreme eigenvalues of A; Phase II: construct and apply an iterative method based on the estimates. For convenience, the system is rewritten in the equivalent fixed-point form x = Tx + c. Let Ω be a compact set in the complex plane excluding 1, and let its complement in the extended complex plane be simply connected. The asymptotic convergence factor (ACF) for Ω, denoted by κ(Ω), measures the rate of convergence of the asymptotically optimal semi-iterative methods for solving the system when σ(T) ⊂ Ω.

  12. Making tensor factorizations robust to non-gaussian noise.

    SciTech Connect

    Chi, Eric C.; Kolda, Tamara Gibson

    2011-03-01

    Tensors are multi-way arrays, and the CANDECOMP/PARAFAC (CP) tensor factorization has found application in many different domains. The CP model is typically fit using a least squares objective function, which is a maximum likelihood estimate under the assumption of independent and identically distributed (i.i.d.) Gaussian noise. We demonstrate that this loss function can be highly sensitive to non-Gaussian noise. Therefore, we propose a loss function based on the 1-norm because it can accommodate both Gaussian and grossly non-Gaussian perturbations. We also present an alternating majorization-minimization (MM) algorithm for fitting a CP model using our proposed loss function (CPAL1) and compare its performance to the workhorse algorithm for fitting CP models, CP alternating least squares (CPALS).
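    The abstract's core point, that a least-squares (Gaussian) objective is fragile under gross outliers while a 1-norm objective is not, can be illustrated on the simplest possible fit, a single location parameter (the data values below are made up; the mean minimizes the squared loss and the median minimizes the 1-norm loss):

```python
import numpy as np

# Small sample with one gross non-Gaussian outlier.
data = np.array([1.0, 1.1, 0.9, 1.05, 0.95, 100.0])

ls_fit = data.mean()      # least-squares estimate: minimizes sum of squared residuals
l1_fit = np.median(data)  # 1-norm estimate: minimizes sum of absolute residuals

# The least-squares estimate is dragged far from the bulk of the data by the
# single outlier, while the 1-norm estimate stays near 1 -- the robustness
# that the proposed CPAL1 loss exploits in the tensor setting.
```

The same contrast carries over to CP fitting: CPALS solves least-squares subproblems, whereas CPAL1 replaces them with majorized 1-norm subproblems.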

  13. IMPACT OF FIVE TREATMENT FACTORS ON MUSSEL MORTALITY

    SciTech Connect

    Daniel P. Molloy

    2003-12-08

    Under this USDOE-NETL contract, the bacterium Pseudomonas fluorescens is being developed as a biocontrol agent for zebra mussels. The specific purpose of the contract is to identify factors that affect mussel kill. Test results reported herein indicate that mussel kill should not be affected by: (1) air bubbles being carried by currents through power plant pipes; (2) pipe orientation (e.g., vertical or horizontal); (3) whether the bacterial cell concentration during a treatment is constant or slightly varying; (4) whether a treatment is between 3 hr and 12 hr in duration, given that the total quantity of bacteria being applied to the pipe is a constant; and (5) whether the water temperature is between 13 °C and 23 °C.

  14. Control of mechanically activated polymersome fusion: Factors affecting fusion

    DOE PAGES [OSTI]

    Henderson, Ian M.; Paxton, Walter F.

    2014-12-15

    Previously we have studied the mechanically-activated fusion of extruded (200 nm) polymer vesicles into giant polymersomes using agitation in the presence of salt. In this study we have investigated several factors contributing to this phenomenon, including the effects of (i) polymer vesicle concentration, (ii) agitation speed and duration, and (iii) variation of the salt and its concentration. It was found that increasing the concentration of the polymer dramatically increases the production of giant vesicles through the increased collisions of polymersomes. Our investigations also found that increasing the frequency of agitation increased the efficiency of fusion, though ultimately limited the size of vesicle which could be produced due to the high shear involved. Finally it was determined that salt-mediation of the fusion process was not limited to NaCl, but is instead a general effect facilitated by the presence of solvated ionic compounds, albeit with different salts initiating fusion at different concentrations.

  15. Control of mechanically activated polymersome fusion: Factors affecting fusion

    SciTech Connect

    Henderson, Ian M.; Paxton, Walter F.

    2014-12-15

    Previously we have studied the mechanically-activated fusion of extruded (200 nm) polymer vesicles into giant polymersomes using agitation in the presence of salt. In this study we have investigated several factors contributing to this phenomenon, including the effects of (i) polymer vesicle concentration, (ii) agitation speed and duration, and (iii) variation of the salt and its concentration. It was found that increasing the concentration of the polymer dramatically increases the production of giant vesicles through the increased collisions of polymersomes. Our investigations also found that increasing the frequency of agitation increased the efficiency of fusion, though ultimately limited the size of vesicle which could be produced due to the high shear involved. Finally it was determined that salt-mediation of the fusion process was not limited to NaCl, but is instead a general effect facilitated by the presence of solvated ionic compounds, albeit with different salts initiating fusion at different concentrations.

  16. Factorization method and new potentials from the inverted oscillator

    SciTech Connect

    Bermudez, David; Fernández C., David J.

    2013-06-15

    In this article we will apply the first- and second-order supersymmetric quantum mechanics to obtain new exactly-solvable real potentials departing from the inverted oscillator potential. This system has some special properties; in particular, only very specific second-order transformations produce non-singular real potentials. It will be shown that these transformations turn out to be the so-called complex ones. Moreover, we will study the factorization method applied to the inverted oscillator and the algebraic structure of the new Hamiltonians. -- Highlights: We apply supersymmetric quantum mechanics to the inverted oscillator potential. The complex second-order transformations allow us to build new non-singular potentials. The algebraic structure of the initial and final potentials is analyzed. The initial potential is described by a complex-deformed Heisenberg-Weyl algebra. The final potentials are described by polynomial Heisenberg algebras.
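    The factorization method referred to above has a standard first-order form (generic textbook sketch in units ℏ = m = 1, not the paper's specific transformations):

```latex
H_1 = A^{\dagger}A + \epsilon, \qquad
A = \frac{1}{\sqrt{2}}\left(\frac{d}{dx} + W(x)\right), \qquad
V_{1,2}(x) = \frac{W^{2}(x) \mp W'(x)}{2} + \epsilon,
```

where H_i = -\tfrac{1}{2}\,d^2/dx^2 + V_i(x) and ε is the factorization energy. The partner Hamiltonian H_2 = AA† + ε is the new (ideally exactly solvable and non-singular) potential; the regularity of V_2 hinges on the choice of the superpotential W, which is where the complex transformations mentioned above enter.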

  17. Factors relevant to utility integration of intermittent renewable technologies

    SciTech Connect

    Wan, Yih-huei; Parsons, B.K.

    1993-08-01

    This study assesses factors that utilities must address when they integrate intermittent renewable technologies into their power-supply systems; it also reviews the literature in this area and has a bibliography containing more than 350 listings. Three topics are covered: (1) interface (hardware and design-related interconnection), (2) operability/stability, and (3) planning. This study finds that several commonly held perceptions regarding integration of intermittent renewable energy technologies are not valid. Among findings of the study are the following: (1) hardware and system design advances have eliminated most concerns about interface; (2) cost penalties have not occurred at low to moderate penetration levels (and high levels are feasible); and (3) intermittent renewable energy technologies can have capacity values. Obstacles still interfering with intermittent renewable technologies are also identified.

  18. On spectroscopic factors of magic and semimagic nuclei

    SciTech Connect

    Saperstein, E. E.; Gnezdilov, N. V.; Tolokonnikov, S. V.

    2014-10-15

    Single-particle spectroscopic factors (SF) of magic and semimagic nuclei are analyzed within the self-consistent theory of finite Fermi systems. The in-volume energy dependence of the mass operator Σ is taken into account in addition to the energy dependence induced by the surface-phonon coupling effects which is commonly considered. It appears due to the effect of high-lying collective and non-collective particle-hole excitations and persists in nuclear matter. The self-consistent basis of the energy density functional method by Fayans et al. is used. Both the surface and in-volume contributions to the SFs turned out to be of comparable magnitude. Results for the magic ^{208}Pb nucleus and semimagic lead isotopes are presented.

  19. Meson Transition Form Factors in Light-Front Holographic QCD

    SciTech Connect

    Brodsky, Stanley J.; Cao, Fu-Guang; de Teramond, Guy F. (Costa Rica U.)

    2011-06-22

    We study the photon-to-meson transition form factors (TFFs) F_Mγ(Q²) for γγ* → M using light-front holographic methods. The Chern-Simons action, which is a natural form in 5-dimensional anti-de Sitter (AdS) space, leads directly to an expression for the photon-to-pion TFF for a class of confining models. Remarkably, the predicted pion TFF is identical to the leading-order QCD result where the distribution amplitude has asymptotic form. The Chern-Simons form is local in AdS space and is thus somewhat limited in its predictability. It only retains the qq̄ component of the pion wavefunction, and further, it projects out only the asymptotic form of the meson distribution amplitude. It is found that in order to describe simultaneously the decay process π⁰ → γγ and the pion TFF at the asymptotic limit, a probability for the qq̄ component of the pion wavefunction P_{qq̄} = 0.5 is required, thus giving an indication that the contributions from higher Fock states in the pion light-front wavefunction need to be included in the analysis. The probability for the Fock state containing four quarks (antiquarks) which follows from analyzing the hadron matrix elements, P_{qq̄qq̄} ≈ 10%, agrees with the analysis of the pion elastic form factor using light-front holography including higher Fock components in the pion wavefunction. The results for the TFFs for the η and η′ mesons are also presented. The rapid growth of the pion TFF exhibited by the BABAR data at high Q² is not compatible with the models discussed in this article, whereas the theoretical calculations are in agreement with the experimental data for the η and η′ TFFs.
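    The asymptotic limit referred to above is the standard leading-order pQCD (Brodsky-Lepage) prediction for the pion transition form factor, quoted here for reference:

```latex
\lim_{Q^{2}\to\infty} Q^{2}\,F_{\pi\gamma}(Q^{2}) \;=\; 2 f_{\pi} \;\approx\; 0.185\ \mathrm{GeV},
```

using f_π ≈ 92.4 MeV; the BABAR data mentioned above appeared to exceed this bound at high Q², which is what made them controversial.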

  20. Energy Price Indices and Discount Factors for Life-Cycle Cost...

    Office of Energy Efficiency and Renewable Energy (EERE) (indexed site)

    Price Indices and Discount Factors for Life-Cycle Cost Analysis - 2015 Energy Price Indices and Discount Factors for Life-Cycle Cost Analysis - 2015 Handbook describes the annual ...