National Library of Energy BETA

Sample records for minimum computer requirements

  1. Program Evaluation: Minimum EERE Requirements

    Broader source: Energy.gov [DOE]

    The minimum requirements for EERE's in-progress peer reviews are described below. Given the diversity of EERE programs and activities, a great deal of flexibility is provided within these...

  2. HEAT Loan Minimum Standards and Requirements | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    HEAT Loan Minimum Standards and Requirements Presents additional resources on loan standards and requirements from Elise Avers' presentation on HEAT Loan Minimum Standards and Requirements. Minimum Standards and Requirements (63.33 KB) More Documents & Publications Building America Best Practices Series Vol. 14: Energy Renovations - HVAC: A Guide for Contractors to Share with Homeowners STEP Financial Incentives Summary Energy Saver 101: Home

  3. DOE CYBER SECURITY EBK: MINIMUM CORE COMPETENCY TRAINING REQUIREMENTS |

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Department of Energy CYBER SECURITY EBK: MINIMUM CORE COMPETENCY TRAINING REQUIREMENTS (78.26 KB) More Documents & Publications DOE CYBER SECURITY EBK: MINIMUM CORE COMPETENCY TRAINING REQUIREMENTS DOE CYBER SECURITY EBK: CORE COMPETENCY TRAINING REQUIREMENTS: CA Authorizing Official Designated Representative (AODR)

  4. HEAT Loan Minimum Standards and Requirements

    Energy Savers [EERE]

    you must meet the following minimum standards listed below. * New natural gas or propane boilers must be at least 90% AFUE to be eligible. * New oil boilers must be at least...

  5. Minimum Efficiency Requirements Tables for Heating and Cooling Product

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Categories | Department of Energy Minimum Efficiency Requirements Tables for Heating and Cooling Product Categories Minimum Efficiency Requirements Tables for Heating and Cooling Product Categories The Federal Energy Management Program (FEMP) created tables that mirror American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) 90.1-2013 tables, which include minimum efficiency requirements for FEMP-designated and ENERGY STAR-qualified heating and cooling product

  6. Minimum Velocity Required to Transport Solid Particles from the...

    Office of Scientific and Technical Information (OSTI)

    Required to Transport Solid Particles from the 2H-Evaporator to the Tank Farm Citation Details In-Document Search Title: Minimum Velocity Required to Transport Solid Particles ...

  7. Incorporate Minimum Efficiency Requirements for Heating and Cooling...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    and Air-Conditioning Engineers (ASHRAE) 90.1-2013 minimum efficiency requirement tables. ... These ASHRAE 90.1-2013 Table 6.8.1-1 and Table 6.8.1-2 equipment types are excluded: ...

  8. Present and Future Computing Requirements

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Computational Cosmology DES LSST Presenter: Salman Habib Argonne National Laboratory Jim Ahrens (LANL) Scott Dodelson (FNAL) Katrin Heitmann (ANL) Peter Nugent (LBNL) Anze Slosar (BNL) Risa Wechsler (SLAC) 1 Cosmic Frontier Computing Collaboration Computational Cosmology SciDAC-3 Project Ann Almgren (LBNL) Nick Gnedin (FNAL) Dave Higdon (LANL) Rob Ross (ANL) Martin White (UC Berkeley/ LBNL) Large Scale Production Computing and Storage Requirements for High Energy Physics Research A DOE Technical

  9. Intro to computer programming, no computer required! | Argonne...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    ... "Computational thinking requires you to think in abstractions," said Papka, who spoke to computer science and computer-aided design students at Kaneland High School in Maple Park about ...

  10. ASHRAE Minimum Efficiency Requirements Tables for Heating and Cooling Product Categories

    Broader source: Energy.gov [DOE]

    The Federal Energy Management Program (FEMP) created tables that mirror American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) 90.1-2013 tables, which include minimum efficiency requirements for FEMP-designated and ENERGY STAR-qualified heating and cooling product categories. Download the tables below to incorporate FEMP and ENERGY STAR purchasing requirements into federal product acquisition documents.

  11. Present and Future Computing Requirements for PETSc

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    and Future Computing Requirements for PETSc Jed Brown jedbrown@mcs.anl.gov Mathematics and Computer Science Division, Argonne National Laboratory Department of Computer Science, University of Colorado Boulder NERSC ASCR Requirements for 2017 2014-01-15 Extending PETSc's Hierarchically Nested Solvers ANL Lois C. McInnes, Barry Smith, Jed Brown, Satish Balay UChicago Matt Knepley IIT Hong Zhang LBL Mark Adams Linear solvers, nonlinear solvers, time integrators, optimization methods (merged TAO)

  12. Incorporate Minimum Efficiency Requirements for Heating and Cooling Products into Federal Acquisition Documents

    Broader source: Energy.gov [DOE]

    The Federal Energy Management Program (FEMP) organized information about FEMP-designated and ENERGY STAR-qualified heating, ventilating, and air conditioning (HVAC) and water heating products into tables that mirror American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) 90.1-2013 minimum efficiency requirement tables. Federal buyers can use these tables as a reference and to incorporate the proper purchasing requirements set by FEMP and ENERGY STAR into federal acquisition documents.

  13. "Table A52. Nonswitchable Minimum Requirements and Maximum Consumption"

    U.S. Energy Information Administration (EIA) Indexed Site

    [Flattened spreadsheet export of "Table A52. Nonswitchable Minimum Requirements and Maximum Consumption Potential by Census Region, 1991 (Estimates in Physical Units)": columns give each type of energy's actual, minimum, and maximum consumption and the RSE row factors, for the total United States.]

  14. Large Scale Computing and Storage Requirements for Advanced Scientific...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Large Scale Computing and Storage Requirements for Advanced Scientific Computing Research: Target 2014 ...

  15. Large Scale Production Computing and Storage Requirements for...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Large Scale Production Computing and Storage Requirements for Fusion Energy Sciences: Target 2017 The NERSC Program Requirements Review "Large Scale Production Computing and ...

  16. Minimum 186 Basin levels required for operation of ECS and CWS pumps

    SciTech Connect (OSTI)

    Reeves, K.K.; Barbour, K.L.

    1992-10-01

    Operation of K Reactor with a cooling tower requires that 186 Basin loss of inventory transients be considered during Design Basis Accident analyses requiring ECS injection, such as the LOCA and LOPA. Since the cooling tower systems are not considered safety systems, credit is not taken for their continued operation during a LOPA or LOCA even though they would likely continue to operate as designed. Without the continued circulation of cooling water to the 186 Basin by the cooling tower pumps, the 186 Basin will lose inventory until additional make-up can be obtained from the river water supply system. Increasing the make-up to the 186 Basin from the river water system may require the opening of manually operated valves, the starting of additional river water pumps, and adjustments of the flow to L Area. In the time required for these actions a loss of basin inventory could occur. The ECS and CWS pumps are supplied by the 186 Basin. A reduction in the basin level will result in decreased pump suction head. This reduction in suction head will result in decreased output from the pumps and, if severe enough, could lead to pump cavitation for some configurations. The subject of this report is the minimum 186 Basin level required to prevent ECS and CWS pump cavitation. The reduction in ECS flow due to a reduced 186 Basin level without cavitation is part of a separate study.
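
    The mechanism this abstract describes (a falling 186 Basin level reduces suction head until the pumps cavitate) follows the standard net positive suction head criterion. The sketch below is only an illustration of that criterion; every number in it is invented and none comes from the report.

    ```python
    # A pump cavitates when the net positive suction head available (NPSHa)
    # falls below the head the pump requires (NPSHr). All values here are
    # hypothetical placeholders, not data from the 186 Basin analysis.

    def npsh_available(basin_level_m, atm_head_m=10.3,
                       friction_loss_m=1.5, vapor_head_m=0.24):
        """NPSHa = atmospheric head + static head - friction loss - vapor head.

        basin_level_m is the water surface height above the pump suction
        centerline, so a falling basin level directly reduces NPSHa.
        """
        return atm_head_m + basin_level_m - friction_loss_m - vapor_head_m

    def cavitates(basin_level_m, npshr_m):
        """True when the available suction head no longer covers NPSHr."""
        return npsh_available(basin_level_m) < npshr_m

    # A falling basin level eventually crosses the cavitation threshold:
    for level_m in (3.0, 1.0, -1.0):
        print(level_m, cavitates(level_m, npshr_m=8.0))
    ```

    The minimum basin level the report seeks is simply the level at which NPSHa equals NPSHr for the worst-case pump configuration.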

  17. Advanced Scientific Computing Research Network Requirements

    SciTech Connect (OSTI)

    Bacon, Charles; Bell, Greg; Canon, Shane; Dart, Eli; Dattoria, Vince; Goodwin, Dave; Lee, Jason; Hicks, Susan; Holohan, Ed; Klasky, Scott; Lauzon, Carolyn; Rogers, Jim; Shipman, Galen; Skinner, David; Tierney, Brian

    2013-03-08

    The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the U.S. Department of Energy (DOE) Office of Science (SC), the single largest supporter of basic research in the physical sciences in the United States. In support of SC programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 25 years. In October 2012, ESnet and the Office of Advanced Scientific Computing Research (ASCR) of the DOE SC organized a review to characterize the networking requirements of the programs funded by the ASCR program office. The requirements identified at the review are summarized in the Findings section, and are described in more detail in the body of the report.

  18. Large Scale Computing and Storage Requirements for Advanced Scientific

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Computing Research: Target 2014 Large Scale Computing and Storage Requirements for Advanced Scientific Computing Research: Target 2014 Large Scale Computing and Storage Requirements for Advanced Scientific Computing Research An ASCR / NERSC Review January 5-6, 2011 Final Report Large Scale Computing and Storage Requirements for Advanced Scientific Computing Research, Report of the Joint ASCR / NERSC Workshop conducted January 5-6, 2011 Goals This workshop is being

  19. Large Scale Production Computing and Storage Requirements for Fusion Energy

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Sciences: Target 2017 Large Scale Production Computing and Storage Requirements for Fusion Energy Sciences: Target 2017 The NERSC Program Requirements Review "Large Scale Production Computing and Storage Requirements for Fusion Energy Sciences" is organized by the Department of Energy's Office of Fusion Energy Sciences (FES), Office of Advanced Scientific Computing Research (ASCR), and the National Energy Research Scientific Computing Center (NERSC). The review's goal is to

  20. Large Scale Production Computing and Storage Requirements for High Energy

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Physics: Target 2017 Large Scale Production Computing and Storage Requirements for High Energy Physics: Target 2017 The NERSC Program Requirements Review "Large Scale Computing and Storage Requirements for High Energy Physics" is organized by the Department of Energy's Office of High Energy Physics (HEP), Office of Advanced Scientific Computing Research (ASCR), and the National Energy Research Scientific Computing Center (NERSC). The review's goal is to characterize

  1. Large Scale Computing and Storage Requirements for High Energy Physics

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Large Scale Computing and Storage Requirements for High Energy Physics An HEP / ASCR / NERSC Workshop November 12-13, 2009 Report Large Scale Computing and Storage Requirements for High Energy Physics, Report of the Joint HEP / ASCR / NERSC Workshop conducted Nov. 12-13, 2009 Goals This workshop was organized by the Department of

  2. Can Cloud Computing Address the Scientific Computing Requirements...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    the ever-increasing computational needs of scientists, Department of Energy ... and as the largest funder of basic scientific research in the U.S., DOE was interested in ...

  3. Large Scale Computing and Storage Requirements for Basic Energy Sciences:

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Target 2014 Large Scale Computing and Storage Requirements for Basic Energy Sciences: Target 2014 Final Report Large Scale Computing and Storage Requirements for Basic Energy Sciences, Report of the Joint BES / ASCR / NERSC Workshop conducted February 9-10, 2010 Workshop Agenda The agenda for this workshop is presented here, including presentation times and speaker information. Workshop Presentations Large Scale Computing and Storage Requirements for Basic

  4. Large Scale Production Computing and Storage Requirements for Advanced

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Scientific Computing Research: Target 2017 Large Scale Production Computing and Storage Requirements for Advanced Scientific Computing Research: Target 2017 This is an invitation-only review organized by the Department of Energy's Office of Advanced Scientific Computing Research (ASCR) and NERSC. The general goal is to determine production high-performance computing, storage, and services that will be needed for ASCR to achieve its science goals through 2017. A specific focus

  5. Large Scale Production Computing and Storage Requirements for...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Requirements for Advanced Scientific Computing Research: Target 2017 This is an invitation-only review organized by the Department of Energy's Office of Advanced ...

  6. Harvey Wasserman! Large Scale Computing and Storage Requirements...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Large Scale Computing and Storage Requirements for High Energy Physics Research: Target 2017 ... www.nersc.gov/science/requirements/HEP

  7. Large Scale Production Computing and Storage Requirements for...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Large Scale Production Computing and Storage Requirements for High Energy Physics: Target 2017 ... Energy's Office of High Energy Physics (HEP), Office of Advanced Scientific ...

  8. Determining Allocation Requirements | Argonne Leadership Computing...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Determining Allocation Requirements Estimating CPU-Hours for ALCF Blue Gene/Q Systems When estimating CPU-hours for the ALCF Blue Gene/Q systems, it is...
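
    The snippet above is cut off, but the usual core-hour arithmetic can be sketched. A Blue Gene/Q node has 16 cores and leadership-facility allocations are typically charged in core-hours; the job sizes and run counts below are invented for illustration, not taken from the ALCF page.

    ```python
    # Back-of-envelope core-hour estimate for a Blue Gene/Q-style system.
    # CORES_PER_NODE reflects Blue Gene/Q hardware; the example job is
    # hypothetical.

    CORES_PER_NODE = 16  # Blue Gene/Q

    def core_hours(nodes, wall_hours, runs=1):
        """Allocation charge: nodes x cores/node x wall-clock hours x runs."""
        return nodes * CORES_PER_NODE * wall_hours * runs

    # e.g. 50 production runs of 8 hours each on 2,048 nodes:
    print(core_hours(2048, 8, runs=50))  # 13,107,200 core-hours
    ```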

  9. Surveillance Analysis Computer System (SACS) software requirements specification (SRS)

    SciTech Connect (OSTI)

    Glasscock, J.A.; Flanagan, M.J.

    1995-09-01

    This document is the primary document establishing requirements for the Surveillance Analysis Computer System (SACS) Database, an Impact Level 3Q system. The purpose is to provide the customer and the performing organization with the requirements for the SACS Project.

  10. Large Scale Computing and Storage Requirements for Nuclear Physics: Target

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    2014 Large Scale Computing and Storage Requirements for Nuclear Physics: Target 2014 May 26-27, 2011 Hyatt Regency Bethesda One Bethesda Metro Center (7400 Wisconsin Ave) Bethesda, Maryland, USA 20814 Final Report Large Scale Computing and Storage Requirements for Nuclear Physics Research, Report of the Joint NP / NERSC Workshop Conducted May 26-27, 2011 Bethesda, MD Sponsored by the U.S. Department of Energy Office of Science, Office of Advanced Scientific Computing

  11. Large Scale Production Computing and Storage Requirements for Basic Energy

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Sciences: Target 2017 Large Scale Production Computing and Storage Requirements for Basic Energy Sciences: Target 2017 This is an invitation-only review organized by the Department of Energy's Office of Basic Energy Sciences (BES), Office of Advanced Scientific Computing Research (ASCR), and the National Energy Research Scientific Computing Center (NERSC). The goal is to determine production high-performance computing, storage, and services that will be needed for BES to

  12. Large Scale Production Computing and Storage Requirements for Nuclear

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Physics: Target 2017 Large Scale Production Computing and Storage Requirements for Nuclear Physics: Target 2017 This invitation-only review is organized by the Department of Energy's Offices of Nuclear Physics (NP) and Advanced Scientific Computing Research (ASCR) and by NERSC. The goal is to determine production high-performance computing, storage, and services that will be needed for NP to achieve its science goals through 2017. The review brings together DOE Program Managers,

  13. Large Scale Computing and Storage Requirements for Biological and

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Environmental Research: Target 2014 Large Scale Computing and Storage Requirements for Biological and Environmental Research: Target 2014 A BER / ASCR / NERSC Workshop May 7-8, 2009 Final Report Large Scale Computing and Storage Requirements for Biological and Environmental Research, Report of the Joint BER / NERSC Workshop Conducted May 7-8, 2009 Rockville, MD Goals This workshop was jointly organized by the Department of Energy's Office of Biological & Environmental

  14. Large Scale Computing and Storage Requirements for Fusion Energy Sciences:

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Target 2014 Large Scale Computing and Storage Requirements for Fusion Energy Sciences: Target 2014 An FES / ASCR / NERSC Workshop August 3-4, 2010 Final Report Large

  15. PREPARING FOR EXASCALE: ORNL Leadership Computing Application Requirements and Strategy

    SciTech Connect (OSTI)

    Joubert, Wayne; Kothe, Douglas B; Nam, Hai Ah

    2009-12-01

    In 2009 the Oak Ridge Leadership Computing Facility (OLCF), a U.S. Department of Energy (DOE) facility at the Oak Ridge National Laboratory (ORNL) National Center for Computational Sciences (NCCS), elicited petascale computational science requirements from leading computational scientists in the international science community. This effort targeted science teams whose projects received large computer allocation awards on OLCF systems. A clear finding of this process was that in order to reach their science goals over the next several years, multiple projects will require computational resources more than an order of magnitude more powerful than those currently available. Additionally, for the longer term, next-generation science will require computing platforms of exascale capability in order to reach DOE science objectives over the next decade. It is generally recognized that achieving exascale in the proposed time frame will require disruptive changes in computer hardware and software. Processor hardware will necessarily become heterogeneous and will include accelerator technologies. Software must undergo the concomitant changes needed to extract the available performance from this heterogeneous hardware. This disruption promises to be substantial, not unlike the change to the message passing paradigm in the computational science community over 20 years ago. Since technological disruptions take time to assimilate, we must aggressively embark on this course of change now, to ensure that science applications and their underlying programming models are mature and ready when exascale computing arrives. This includes initiation of application readiness efforts to adapt existing codes to heterogeneous architectures, support of relevant software tools, and procurement of next-generation hardware testbeds for porting and testing codes. The 2009 OLCF requirements process identified numerous actions necessary to meet this challenge: (1) Hardware capabilities must be

  16. Large Scale Computing and Storage Requirements for Nuclear Physics Research

    SciTech Connect (OSTI)

    Gerber, Richard A.; Wasserman, Harvey J.

    2012-03-02

    The National Energy Research Scientific Computing Center (NERSC) is the primary computing center for the DOE Office of Science, serving approximately 4,000 users and hosting some 550 projects that involve nearly 700 codes for a wide variety of scientific disciplines. In addition to large-scale computing resources, NERSC provides critical staff support and expertise to help scientists make the most efficient use of these resources to advance the scientific mission of the Office of Science. In May 2011, NERSC, DOE’s Office of Advanced Scientific Computing Research (ASCR) and DOE’s Office of Nuclear Physics (NP) held a workshop to characterize HPC requirements for NP research over the next three to five years. The effort is part of NERSC’s continuing involvement in anticipating future user needs and deploying necessary resources to meet these demands. The workshop revealed several key requirements, in addition to achieving its goal of characterizing NP computing. The key requirements include: 1. Larger allocations of computational resources at NERSC; 2. Visualization and analytics support; and 3. Support at NERSC for the unique needs of experimental nuclear physicists. This report expands upon these key points and adds others. The results are based upon representative samples, called “case studies,” of the needs of science teams within NP. The case studies were prepared by NP workshop participants and contain a summary of science goals, methods of solution, current and future computing requirements, and special software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, “multi-core” environment that is expected to dominate HPC architectures over the next few years. The report also includes a section with NERSC responses to the workshop findings. NERSC has many initiatives already underway that address key workshop findings and all of the action items are aligned with NERSC strategic plans.

  17. Large Scale Production Computing and Storage Requirements for Biological

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    and Environmental Research: Target 2017 Large Scale Production Computing and Storage Requirements for Biological and Environmental Research: Target 2017 September 11-12, 2012 Hilton Rockville Hotel and Executive Meeting Center 1750 Rockville Pike Rockville, MD, 20852-1699 TEL: 1-301-468-1100 Sponsored by: U.S. Department of Energy Office of Science Office of Advanced Scientific Computing Research (ASCR) Office of Biological and Environmental Research (BER) National Energy

  18. ComPASS Present and Future Computing Requirements

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    ComPASS Present and Future Computing Requirements Panagiotis Spentzouris (Fermilab) for the ComPASS collaboration NERSC BER Requirements for 2017 September 11-12, 2012 Rockville, MD Accelerators for High Energy Physics § At the Energy Frontier, high- energy particle beam collisions seek to uncover new phenomena * the origin of mass, the nature of dark matter, extra dimensions of space. § At the Intensity Frontier, high-flux beams enable exploration of * neutrino interactions, to answer

  19. Large Scale Computing and Storage Requirements for High Energy Physics

    SciTech Connect (OSTI)

    Gerber, Richard A.; Wasserman, Harvey

    2010-11-24

    The National Energy Research Scientific Computing Center (NERSC) is the leading scientific computing facility for the Department of Energy's Office of Science, providing high-performance computing (HPC) resources to more than 3,000 researchers working on about 400 projects. NERSC provides large-scale computing resources and, crucially, the support and expertise needed for scientists to make effective use of them. In November 2009, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR), and DOE's Office of High Energy Physics (HEP) held a workshop to characterize the HPC resources needed at NERSC to support HEP research through the next three to five years. The effort is part of NERSC's legacy of anticipating users' needs and deploying resources to meet those demands. The workshop revealed several key points, in addition to achieving its goal of collecting and characterizing computing requirements. The chief findings: (1) Science teams need access to a significant increase in computational resources to meet their research goals; (2) Research teams need to be able to read, write, transfer, store online, archive, analyze, and share huge volumes of data; (3) Science teams need guidance and support to implement their codes on future architectures; and (4) Projects need predictable, rapid turnaround of their computational jobs to meet mission-critical time constraints. This report expands upon these key points and includes others. It also presents a number of case studies as representative of the research conducted within HEP. Workshop participants were asked to codify their requirements in this case study format, summarizing their science goals, methods of solution, current and three-to-five year computing requirements, and software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, multi-core environment that is expected to dominate HPC architectures over the next few years.
The report includes

  20. Present and Future Computing Requirements Radiative Transfer of Astrophysical Explosions

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Requirements Radiative Transfer of Astrophysical Explosions Daniel Kasen (UCB/LBNL) SciDAC computational astrophysics consortium Stan Woosley, Ann Almgren, John Bell, Haitao Ma, Peter Nugent, Rollin Thomas, Weiquin Zhang, Adam Burrows, Jason Nordhaus, Louis Howell, Mike Zingale topics and open questions * thermonuclear supernova: What are the progenitors: 1 or 2 white dwarfs? How does the nuclear runaway ignite and develop? How regular are these "standard candles" for cosmology? * core

  1. Advanced Scientific Computing Research Network Requirements Review

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Scientific Computing Research Network Requirements Review Final Report April 22-23, 2015 Disclaimer This document was prepared as an account of work sponsored by the United States Government. While this document is believed to contain correct information, neither the United States Government nor any agency thereof, nor The Regents of the University of California, nor any of their employees, makes any warranty, express or implied, or assumes any legal responsibility for the accuracy,

  2. Requirements for supercomputing in energy research: The transition to massively parallel computing

    SciTech Connect (OSTI)

    Not Available

    1993-02-01

    This report discusses: the emergence of a practical path to TeraFlop computing and beyond; the requirements of energy research programs at DOE; implementation of a supercomputer production computing environment on massively parallel computers; and implementation of the user transition to massively parallel computing.

  3. Scientific Application Requirements for Leadership Computing at the Exascale

    SciTech Connect (OSTI)

    Ahern, Sean; Alam, Sadaf R; Fahey, Mark R; Hartman-Baker, Rebecca J; Barrett, Richard F; Kendall, Ricky A; Kothe, Douglas B; Mills, Richard T; Sankaran, Ramanan; Tharrington, Arnold N; White III, James B

    2007-12-01

    The Department of Energy's Leadership Computing Facility, located at Oak Ridge National Laboratory's National Center for Computational Sciences, recently polled scientific teams that had large allocations at the center in 2007, asking them to identify computational science requirements for future exascale systems (capable of an exaflop, or 10^18 floating point operations per second). These requirements are necessarily speculative, since an exascale system will not be realized until the 2015-2020 timeframe, and are expressed where possible relative to a recent petascale requirements analysis of similar science applications [1]. Our initial findings, which call for further data collection, validation, and analysis, did in fact align with many of our expectations and existing petascale requirements, yet they also contained some surprises, complete with new challenges and opportunities. First and foremost, the breadth and depth of science prospects and benefits on an exascale computing system are striking. Without a doubt, they justify a large investment, even with its inherent risks. The possibilities for return on investment (by any measure) are too large to let us ignore this opportunity. The software opportunities and challenges are enormous. In fact, as one notable computational scientist put it, the scale of questions being asked at the exascale is tremendous and the hardware has gotten way ahead of the software. We are in grave danger of failing because of a software crisis unless concerted investments and coordinating activities are undertaken to reduce and close this hardware-software gap over the next decade. Key to success will be a rigorous requirement for natural mapping of algorithms to hardware in a way that complements (rather than competes with) compilers and runtime systems.
The level of abstraction must be raised, and more attention must be paid to functionalities and capabilities that incorporate intent into data structures, are aware of memory hierarchy

  4. Architectural requirements for the Red Storm computing system...

    Office of Scientific and Technical Information (OSTI)

    ... Computer architecture; Supercomputers; Accelerated Strategic Computing Initiative (ASCI) ...

  5. User's manual for RATEPAC: a digital-computer program for revenue requirements and rate-impact analysis

    SciTech Connect (OSTI)

    Fuller, L.C.

    1981-09-01

    The RATEPAC computer program is designed to model the financial aspects of an electric power plant or other investment requiring capital outlays and having annual operating expenses. The program produces incremental pro forma financial statements showing how an investment will affect the overall financial statements of a business entity. The code accepts parameters required to determine capital investment and expense as a function of time and sums these to determine minimum revenue requirements (cost of service). The code also calculates present worth of revenue requirements and required return on rate base. This user's manual includes a general description of the code as well as the instructions for input data preparation. A complete example case is appended.
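    The present-worth step the abstract mentions can be sketched as follows. This is a minimal illustration with made-up figures and an assumed end-of-year discounting convention, not RATEPAC's actual model:

```python
def present_worth(revenue_requirements, discount_rate):
    """Discount a stream of end-of-year annual revenue requirements to year 0."""
    return sum(rr / (1.0 + discount_rate) ** (t + 1)
               for t, rr in enumerate(revenue_requirements))

# Hypothetical: $10M/year cost of service for 3 years at a 10% discount rate.
pw = present_worth([10e6, 10e6, 10e6], 0.10)  # roughly $24.87M
```

    The same discounted sum, applied to the code's computed minimum revenue requirements, yields the present worth of revenue requirements reported by the program.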

  6. Architectural requirements for the Red Storm computing system...

    Office of Scientific and Technical Information (OSTI)

    specially designed hardware and software reliability features, a light weight ... LABORATORIES; SUPERCOMPUTERS; COMPUTER ARCHITECTURE Parallel processing (Electronic ...

  7. Large Scale Production Computing and Storage Requirements for...

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    This is an invitation-only review organized by the Department of Energy's Office of Basic Energy Sciences (BES), Office of Advanced Scientific Computing Research (ASCR), and the ...

  8. National Ignition Facility sub-system design requirements computer system SSDR 1.5.1

    SciTech Connect (OSTI)

    Spann, J.; VanArsdall, P.; Bliss, E.

    1996-09-05

    This System Design Requirement document establishes the performance, design, development and test requirements for the Computer System, WBS 1.5.1 which is part of the NIF Integrated Computer Control System (ICCS). This document responds directly to the requirements detailed in ICCS (WBS 1.5) which is the document directly above.

  9. Computing

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Computing and Storage Requirements Computing and Storage Requirements for FES J. Candy General Atomics, San Diego, CA Presented at DOE Technical Program Review Hilton Washington DC/Rockville Rockville, MD 19-20 March 2013 2 Computing and Storage Requirements Drift waves and tokamak plasma turbulence Role in the context of fusion research * Plasma performance: In tokamak plasmas, performance is limited by turbulent radial transport of both energy and particles. * Gradient-driven: This turbulent

  10. Large Scale Computing and Storage Requirements for Biological and Environmental Research

    SciTech Connect (OSTI)

    DOE Office of Science, Biological and Environmental Research Program Office ,

    2009-09-30

    In May 2009, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR), and DOE's Office of Biological and Environmental Research (BER) held a workshop to characterize HPC requirements for BER-funded research over the subsequent three to five years. The workshop revealed several key points, in addition to achieving its goal of collecting and characterizing computing requirements. Chief among them: scientific progress in BER-funded research is limited by current allocations of computational resources. Additionally, growth in mission-critical computing -- combined with new requirements for collaborative data manipulation and analysis -- will demand ever-increasing computing, storage, network, visualization, reliability and service richness from NERSC. This report expands upon these key points and adds others. It also presents a number of "case studies" as significant representative samples of the needs of science teams within BER. Workshop participants were asked to codify their requirements in this "case study" format, summarizing their science goals, methods of solution, current and 3-5 year computing requirements, and special software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, "multi-core" environment that is expected to dominate HPC architectures over the next few years.

  11. Large Scale Computing and Storage Requirements for Basic Energy Sciences Research

    SciTech Connect (OSTI)

    Gerber, Richard; Wasserman, Harvey

    2011-03-31

    The National Energy Research Scientific Computing Center (NERSC) is the leading scientific computing facility supporting research within the Department of Energy's Office of Science. NERSC provides high-performance computing (HPC) resources to approximately 4,000 researchers working on about 400 projects. In addition to hosting large-scale computing facilities, NERSC provides the support and expertise scientists need to effectively and efficiently use HPC systems. In February 2010, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR) and DOE's Office of Basic Energy Sciences (BES) held a workshop to characterize HPC requirements for BES research through 2013. The workshop was part of NERSC's legacy of anticipating users' future needs and deploying the necessary resources to meet these demands. Workshop participants reached a consensus on several key findings, in addition to achieving the workshop's goal of collecting and characterizing computing requirements. The key requirements for scientists conducting research in BES are: (1) Larger allocations of computational resources; (2) Continued support for standard application software packages; (3) Adequate job turnaround time and throughput; and (4) Guidance and support for using future computer architectures. This report expands upon these key points and presents others. Several 'case studies' are included as significant representative samples of the needs of science teams within BES. Research teams' scientific goals, computational methods of solution, current and 2013 computing requirements, and special software and support needs are summarized in these case studies. Also included are researchers' strategies for computing in the highly parallel, 'multi-core' environment that is expected to dominate HPC architectures over the next few years. NERSC has strategic plans and initiatives already underway that address key workshop findings. This report includes a brief summary of those relevant to issues

  12. Minimum Day Time Load Calculation and Screening

    Office of Environmental Management (EM)

    ... and TOV requirements Battery storage ... Energy Planning Grid Technologies ... Planning System Planning Department Supplemental Review: 100% minimum load ...

  13. Large Scale Computing and Storage Requirements for Fusion Energy Sciences: Target 2017

    SciTech Connect (OSTI)

    Gerber, Richard

    2014-05-02

    The National Energy Research Scientific Computing Center (NERSC) is the primary computing center for the DOE Office of Science, serving approximately 4,500 users working on some 650 projects that involve nearly 600 codes in a wide variety of scientific disciplines. In March 2013, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR) and DOE's Office of Fusion Energy Sciences (FES) held a review to characterize High Performance Computing (HPC) and storage requirements for FES research through 2017. This report is the result.

  14. ASC Computational Environment (ACE) requirements version 8.0 final report.

    SciTech Connect (OSTI)

    Larzelere, Alex R. (Exagrid Engineering, Alexandria, VA); Sturtevant, Judith E.

    2006-11-01

    A decision was made early in the Tri-Lab Usage Model process, that the collection of the user requirements be separated from the document describing capabilities of the user environment. The purpose in developing the requirements as a separate document was to allow the requirements to take on a higher-level view of user requirements for ASC platforms in general. In other words, a separate ASC user requirement document could capture requirements in a way that was not focused on ''how'' the requirements would be fulfilled. The intent of doing this was to create a set of user requirements that were not linked to any particular computational platform. The idea was that user requirements would endure from one ASC platform user environment to another. The hope was that capturing the requirements in this way would assist in creating stable user environments even though the particular platforms would be evolving and changing. In order to clearly make the separation, the Tri-lab S&CS program decided to create a new title for the requirements. The user requirements became known as the ASC Computational Environment (ACE) Requirements.

  15. Surveillance Analysis Computer System (SACS): Software requirements specification (SRS). Revision 2

    SciTech Connect (OSTI)

    Glasscock, J.A.

    1995-03-08

    This document is the primary document establishing requirements for the Surveillance Analysis Computer System (SACS) database, an Impact Level 3Q system. SACS stores information on tank temperatures, surface levels, and interstitial liquid levels. This information is retrieved by the customer through a PC-based interface and is then available to a number of other software tools. The software requirements specification (SRS) describes the system requirements for the SACS Project, and follows the Standard Engineering Practices (WHC-CM-6-1), Software Practices (WHC-CM-3-10) and Quality Assurance (WHC-CM-4-2, QR 19.0) policies.

  16. High Performance Computing and Storage Requirements for Nuclear Physics: Target 2017

    SciTech Connect (OSTI)

    Gerber, Richard; Wasserman, Harvey

    2015-01-20

    In April 2014, NERSC, ASCR, and the DOE Office of Nuclear Physics (NP) held a review to characterize high performance computing (HPC) and storage requirements for NP research through 2017. This review is the 12th in a series of reviews held by NERSC and Office of Science program offices that began in 2009. It is the second for NP, and the final in the second round of reviews that covered the six Office of Science program offices. This report is the result of that review.

  17. Requirements for Control Room Computer-Based Procedures for use in Hybrid Control Rooms

    SciTech Connect (OSTI)

    Le Blanc, Katya Lee; Oxstrand, Johanna Helene; Joe, Jeffrey Clark

    2015-05-01

    Many plants in the U.S. are currently undergoing control room modernization. The main drivers for modernization are the aging and obsolescence of existing equipment, which typically results in a like-for-like replacement of analogue equipment with digital systems. However, the modernization efforts present an opportunity to employ advanced technology that would not only extend the life, but enhance the efficiency and cost competitiveness of nuclear power. Computer-based procedures (CBPs) are one example of near-term advanced technology that may provide enhanced efficiencies above and beyond like-for-like replacements of analog systems. Researchers in the LWRS program are investigating the benefits of advanced technologies such as CBPs, with the goal of assisting utilities in decision making during modernization projects. This report will describe the existing research on CBPs, discuss the unique issues related to using CBPs in hybrid control rooms (i.e., partially modernized analog control rooms), and define the requirements of CBPs for hybrid control rooms.

  18. QCD Thermodynamics at High Temperature Peter Petreczky Large Scale Computing and Storage Requirements for Nuclear Physics (NP),

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    QCD Thermodynamics at High Temperature Peter Petreczky Large Scale Computing and Storage Requirements for Nuclear Physics (NP), Bethesda MD, April 29-30, 2014 NY Center for Computational Science 2 Defining questions of nuclear physics research in US: Nuclear Science Advisory Committee (NSAC) "The Frontiers of Nuclear Science", 2007 Long Range Plan "What are the phases of strongly interacting matter and what roles do they play in the cosmos ?" "What does QCD predict for

  19. computers

    National Nuclear Security Administration (NNSA)

    Each successive generation of computing system has provided greater computing power and energy efficiency.

    CTS-1 clusters will support NNSA's Life Extension Program and...

  20. Improved initial guess for minimum energy path calculations

    SciTech Connect (OSTI)

    Smidstrup, Søren; Pedersen, Andreas; Stokbro, Kurt

    2014-06-07

    A method is presented for generating a good initial guess of a transition path between given initial and final states of a system without evaluation of the energy. An objective function surface is constructed using an interpolation of pairwise distances at each discretization point along the path and the nudged elastic band method then used to find an optimal path on this image dependent pair potential (IDPP) surface. This provides an initial path for the more computationally intensive calculations of a minimum energy path on an energy surface obtained, for example, by ab initio or density functional theory. The optimal path on the IDPP surface is significantly closer to a minimum energy path than a linear interpolation of the Cartesian coordinates and, therefore, reduces the number of iterations needed to reach convergence and averts divergence in the electronic structure calculations when atoms are brought too close to each other in the initial path. The method is illustrated with three examples: (1) rotation of a methyl group in an ethane molecule, (2) an exchange of atoms in an island on a crystal surface, and (3) an exchange of two Si-atoms in amorphous silicon. In all three cases, the computational effort in finding the minimum energy path with DFT was reduced by a factor ranging from 50% to an order of magnitude by using an IDPP path as the initial path. The time required for parallel computations was reduced even more because of load imbalance when linear interpolation of Cartesian coordinates was used.
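    The pairwise-distance interpolation at the heart of the IDPP construction can be sketched in a few lines of NumPy. The function names and the two-atom example are illustrative, and the 1/d^4 pair weighting (which emphasizes short interatomic distances) is an assumption of this sketch:

```python
import numpy as np

def idpp_targets(r_init, r_final, n_images):
    """Target distance matrix for each intermediate image: a linear
    interpolation of every pairwise distance between the endpoints."""
    d0 = np.linalg.norm(r_init[:, None] - r_init[None, :], axis=-1)
    d1 = np.linalg.norm(r_final[:, None] - r_final[None, :], axis=-1)
    return [d0 + k / (n_images + 1) * (d1 - d0) for k in range(1, n_images + 1)]

def idpp_objective(r, d_target):
    """Weighted squared deviation of an image's pair distances from the
    interpolated targets; minimizing this over all images (e.g. with the
    nudged elastic band method) yields the IDPP initial path."""
    d = np.linalg.norm(r[:, None] - r[None, :], axis=-1)
    i, j = np.triu_indices(len(r), k=1)
    return float(np.sum((d[i, j] - d_target[i, j]) ** 2 / d_target[i, j] ** 4))

# Two atoms moved from 1 A apart to 3 A apart, with one intermediate image;
# the target separation at the midpoint image is 2 A.
r0 = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
r1 = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 3.0]])
targets = idpp_targets(r0, r1, n_images=1)
```

    An image whose geometry already matches the interpolated distances has zero objective, which is why the optimized IDPP path starts much closer to the minimum energy path than a Cartesian linear interpolation does.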

  1. computers

    National Nuclear Security Administration (NNSA)

    California.

    Retired computers used for cybersecurity research at Sandia National...

  2. Standardized Procedure Content And Data Structure Based On Human Factors Requirements For Computer-Based Procedures

    SciTech Connect (OSTI)

    Bly, Aaron; Oxstrand, Johanna; Le Blanc, Katya L

    2015-02-01

    Most activities that involve human interaction with systems in a nuclear power plant are guided by procedures. Traditionally, the use of procedures has been a paper-based process that supports safe operation of the nuclear power industry. However, the nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. Advances in digital technology make computer-based procedures (CBPs) a valid option that provides further enhancement of safety by improving human performance related to procedure use. The transition from paper-based procedures (PBPs) to CBPs creates a need for a computer-based procedure system (CBPS). A CBPS needs to have the ability to perform logical operations in order to adjust to the inputs received from either users or real time data from plant status databases. Without the ability for logical operations the procedure is just an electronic copy of the paper-based procedure. In order to provide the CBPS with the information it needs to display the procedure steps to the user, special care is needed in the format used to deliver all data and instructions to create the steps. The procedure should be broken down into basic elements and formatted in a standard method for the CBPS. One way to build the underlying data architecture is to use an Extensible Markup Language (XML) schema, which utilizes basic elements to build each step in the smart procedure. The attributes of each step will determine the type of functionality that the system will generate for that step. The CBPS will provide the context for the step to deliver referential information, request a decision, or accept input from the user. The XML schema needs to provide all data necessary for the system to accurately perform each step without the need for the procedure writer to reprogram the CBPS. The research team at the Idaho National Laboratory has developed a prototype CBPS for field workers as well as the
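    The element-per-step idea described above can be pictured with a small sketch. The element and attribute names below are invented for illustration and are not the schema developed at INL; the point is only that a step's attributes drive the functionality the CBPS generates:

```python
import xml.etree.ElementTree as ET

# Hypothetical smart-procedure step: a decision step that accepts a
# user input and branches on its value.
STEP_XML = """
<step id="3.1" type="decision">
  <instruction>Verify pump discharge pressure is at least 120 psig.</instruction>
  <input name="discharge_pressure" unit="psig" datatype="float"/>
  <branch condition="discharge_pressure &gt;= 120" goto="3.2"/>
  <branch condition="discharge_pressure &lt; 120" goto="4.0"/>
</step>
"""

step = ET.fromstring(STEP_XML)
step_type = step.get("type")       # tells the CBPS what functionality to generate
branches = step.findall("branch")  # one element per possible outcome
```

    Because the logic lives in the data, the procedure writer changes only the XML; the CBPS itself needs no reprogramming to render a new step.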

  3. Cielo Computational Environment Usage Model With Mappings to ACE Requirements for the General Availability User Environment Capabilities Release Version 1.1

    SciTech Connect (OSTI)

    Vigil,Benny Manuel; Ballance, Robert; Haskell, Karen

    2012-08-09

    Cielo is a massively parallel supercomputer funded by the DOE/NNSA Advanced Simulation and Computing (ASC) program, and operated by the Alliance for Computing at Extreme Scale (ACES), a partnership between Los Alamos National Laboratory (LANL) and Sandia National Laboratories (SNL). The primary Cielo compute platform is physically located at Los Alamos National Laboratory. This Cielo Computational Environment Usage Model documents the capabilities and the environment to be provided for the Q1 FY12 Level 2 Cielo Capability Computing (CCC) Platform Production Readiness Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory, or Sandia National Laboratories, but also addresses the needs of users working in the unclassified environment. The Cielo Computational Environment Usage Model maps the provided capabilities to the tri-Lab ASC Computing Environment (ACE) Version 8.0 requirements. The ACE requirements reflect the high performance computing requirements for the Production Readiness Milestone user environment capabilities of the ASC community. A description of ACE requirements met, and those requirements that are not met, are included in each section of this document. The Cielo Computing Environment, along with the ACE mappings, has been issued and reviewed throughout the tri-Lab community.

  4. Computations

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Computations - Sandia Energy Energy Search Icon Sandia Home Locations Contact Us Employee Locator Energy & Climate Secure & Sustainable Energy Future Stationary Power Energy Conversion Efficiency Solar Energy Wind Energy Water Power Supercritical CO2 Geothermal Natural Gas Safety, Security & Resilience of the Energy Infrastructure Energy Storage Nuclear Power & Engineering Grid Modernization Battery Testing Nuclear Energy Defense Waste Management Programs Advanced Nuclear Energy

  5. Computing

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Office of Advanced Scientific Computing Research in the Department of Energy Office of Science under contract number DE-AC02-05CH11231. ! Application and System Memory Use, Configuration, and Problems on Bassi Richard Gerber Lawrence Berkeley National Laboratory NERSC User Services ScicomP 13 Garching bei München, Germany, July 17, 2007 ScicomP 13, July 17, 2007, Garching Overview * About Bassi * Memory on Bassi * Large Page Memory (It's Great!) * System Configuration * Large Page

  6. Requirements for Computer Based-Procedures for Nuclear Power Plant Field Operators Results from a Qualitative Study

    SciTech Connect (OSTI)

    Katya Le Blanc; Johanna Oxstrand

    2012-05-01

    Although computer-based procedures (CBPs) have been investigated as a way to enhance operator performance on procedural tasks in the nuclear industry for almost thirty years, they are not currently widely deployed at United States utilities. One of the barriers to the wide scale deployment of CBPs is the lack of operational experience with CBPs that could serve as a sound basis for justifying the use of CBPs for nuclear utilities. Utilities are hesitant to adopt CBPs because of concern over potential costs of implementation, and concern over regulatory approval. Regulators require a sound technical basis for the use of any procedure at the utilities; without operating experience to support the use CBPs, it is difficult to establish such a technical basis. In an effort to begin the process of developing a technical basis for CBPs, researchers at Idaho National Laboratory are partnering with industry to explore CBPs with the objective of defining requirements for CBPs and developing an industry-wide vision and path forward for the use of CBPs. This paper describes the results from a qualitative study aimed at defining requirements for CBPs to be used by field operators and maintenance technicians.

  7. Development of a computer code to predict a ventilation requirement for an underground radioactive waste storage tank

    SciTech Connect (OSTI)

    Lee, Y.J.; Dalpiaz, E.L.

    1997-08-01

    Computer code, WTVFE (Waste Tank Ventilation Flow Evaluation), has been developed to evaluate the ventilation requirement for an underground storage tank for radioactive waste. Heat generated by the radioactive waste and mixing pumps in the tank is removed mainly through the ventilation system. The heat removal process by the ventilation system includes the evaporation of water from the waste and the heat transfer by natural convection from the waste surface. Also, a portion of the heat will be removed through the soil and the air circulating through the gap between the primary and secondary tanks. The heat loss caused by evaporation is modeled based on recent evaporation test results by the Westinghouse Hanford Company using a simulated small scale waste tank. Other heat transfer phenomena are evaluated based on well established conduction and convection heat transfer relationships. 10 refs., 3 tabs.
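    The sensible-heat part of the balance described above amounts to a simple back-of-envelope calculation. The numbers below are illustrative, and this sketch deliberately ignores the evaporation, soil-conduction, and annulus-gap terms that WTVFE models:

```python
def required_airflow_kg_s(heat_load_w, t_in_c, t_out_c, cp_air=1005.0):
    """Ventilation mass flow (kg/s) needed to carry away heat_load_w watts
    by sensible heating of the air alone: Q = m_dot * cp * (t_out - t_in)."""
    dt = t_out_c - t_in_c
    if dt <= 0:
        raise ValueError("exhaust air must be warmer than supply air")
    return heat_load_w / (cp_air * dt)

# 50 kW of decay heat; supply air at 25 C leaves the tank at 45 C.
m_dot = required_airflow_kg_s(50e3, 25.0, 45.0)  # about 2.5 kg/s
```

    Adding the evaporative term reduces the required flow substantially, since the latent heat of water dwarfs the sensible-heat capacity of air per kilogram.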

  8. Theoretical minimum energies to produce steel for selected conditions

    SciTech Connect (OSTI)

    Fruehan, R. J.; Fortini, O.; Paxton, H. W.; Brindle, R.

    2000-03-01

    An ITP study has determined the theoretical minimum energy requirements for producing steel from ore, scrap, and direct reduced iron. Dr. Richard Fruehan's report, Theoretical Minimum Energies to Produce Steel for Selected Conditions, provides insight into the potential energy savings (and associated reductions in carbon dioxide emissions) for ironmaking, steelmaking, and rolling processes (PDF, 459 KB).


  9. Computer-Based Procedures for Field Workers in Nuclear Power Plants: Development of a Model of Procedure Usage and Identification of Requirements

    SciTech Connect (OSTI)

    Katya Le Blanc; Johanna Oxstrand

    2012-04-01

    The nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. As a step toward the goal of improving procedure use performance, researchers, together with the nuclear industry, have been looking at replacing the current paper-based procedures with computer-based procedure systems. The concept of computer-based procedures is not new by any means; however most research has focused on procedures used in the main control room. Procedures reviewed in these efforts are mainly emergency operating procedures and normal operating procedures. Based on lessons learned for these previous efforts we are now exploring a more unknown application for computer based procedures - field procedures, i.e. procedures used by nuclear equipment operators and maintenance technicians. The Idaho National Laboratory and participants from the U.S. commercial nuclear industry are collaborating in an applied research effort with the objective of developing requirements and specifications for a computer-based procedure system to be used by field workers. The goal is to identify the types of human errors that can be mitigated by using computer-based procedures and how to best design the computer-based procedures to do so. This paper describes the development of a Model of Procedure Use and the qualitative study on which the model is based. The study was conducted in collaboration with four nuclear utilities and five research institutes. During the qualitative study and the model development requirements and for computer-based procedures were identified.

  10. Computing Videos

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Computing Videos Computing

  11. GMTI radar minimum detectable velocity.

    SciTech Connect (OSTI)

    Richards, John Alfred

    2011-04-01

    Minimum detectable velocity (MDV) is a fundamental consideration for the design, implementation, and exploitation of ground moving-target indication (GMTI) radar imaging modes. All single-phase-center air-to-ground radars are characterized by an MDV, or a minimum radial velocity below which motion of a discrete nonstationary target is indistinguishable from the relative motion between the platform and the ground. Targets with radial velocities less than MDV are typically overwhelmed by endoclutter ground returns, and are thus not generally detectable. Targets with radial velocities greater than MDV typically produce distinct returns falling outside of the endoclutter ground returns, and are thus generally discernible using straightforward detection algorithms. This document provides a straightforward derivation of MDV for an air-to-ground single-phase-center GMTI radar operating in an arbitrary geometry.
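    The geometry in the abstract admits a simple broadside approximation: mainbeam clutter spans radial velocities up to the platform speed times the sine of half the azimuth beamwidth, and that spread sets the MDV floor. This sketch is a simplification of the document's derivation, with illustrative numbers:

```python
import math

def mdv_broadside(platform_speed, azimuth_beamwidth):
    """Simplified broadside MDV: ground clutter inside the mainbeam has
    radial velocities up to v_p * sin(beamwidth / 2); a target slower than
    this is buried in the endoclutter return."""
    return platform_speed * math.sin(azimuth_beamwidth / 2.0)

# 150 m/s platform with a 2-degree azimuth beamwidth:
mdv = mdv_broadside(150.0, math.radians(2.0))  # about 2.6 m/s
```

    The formula makes the design trade visible: a narrower azimuth beam (a longer antenna) or a slower platform lowers MDV and exposes slower movers.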

  12. HEAT Loan Minimum Standards and Requirements | Department of...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Building America Best Practices Series Vol. 14: Energy Renovations - HVAC: A Guide for Contractors to Share with Homeowners STEP Financial Incentives Summary Energy Saver 101: Home ...

  13. Minimum Efficiency Requirements Tables for Heating and Cooling...

    Energy Savers [EERE]

    The Federal Energy Management Program (FEMP) created tables that mirror American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) 90.1-2013 tables, which ...

  14. 2-D image segmentation using minimum spanning trees

    SciTech Connect (OSTI)

    Xu, Y.; Uberbacher, E.C.

    1995-09-01

    This paper presents a new algorithm for partitioning a gray-level image into connected homogeneous regions. The novelty of this algorithm lies in the fact that by constructing a minimum spanning tree representation of a gray-level image, it reduces a region partitioning problem to a minimum spanning tree partitioning problem, and hence reduces the computational complexity of the region partitioning problem. The tree-partitioning algorithm, in essence, partitions a minimum spanning tree into subtrees, representing different homogeneous regions, by minimizing the sum of variations of gray levels over all subtrees under the constraints that each subtree should have at least a specified number of nodes, and two adjacent subtrees should have significantly different average gray-levels. Two (faster) heuristic implementations are also given for large-scale region partitioning problems. Test results have shown that the segmentation results are satisfactory and insensitive to noise.
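    The reduction from region partitioning to tree partitioning can be sketched with Kruskal's algorithm and a union-find structure. This toy variant cuts edges by a fixed gray-level threshold rather than minimizing subtree variance under size constraints as the paper's algorithm does:

```python
def segment(image, threshold):
    """Partition a gray-level image into connected homogeneous regions:
    grow a minimum spanning forest over the 4-neighbor pixel graph,
    refusing edges whose gray-level difference exceeds `threshold`.
    Each resulting tree is one segment."""
    h, w = len(image), len(image[0])
    parent = list(range(h * w))

    def find(x):  # union-find root with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    edges = []  # (gray-level difference, pixel_a, pixel_b)
    for r in range(h):
        for c in range(w):
            if c + 1 < w:
                edges.append((abs(image[r][c] - image[r][c + 1]), r * w + c, r * w + c + 1))
            if r + 1 < h:
                edges.append((abs(image[r][c] - image[r + 1][c]), r * w + c, (r + 1) * w + c))
    for wgt, a, b in sorted(edges):  # Kruskal order: cheapest edges first
        if wgt <= threshold:
            ra, rb = find(a), find(b)
            if ra != rb:
                parent[ra] = rb
    return [find(i) for i in range(h * w)]

# A dark block next to a bright column separates into two segments.
img = [[10, 12, 200],
       [11, 13, 205],
       [ 9, 14, 198]]
labels = segment(img, threshold=20)
```

    Sorting the edges dominates the cost, so the whole partition runs in O(E log E) on the pixel graph, which is the complexity reduction the paper exploits.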

  15. Computer System,

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    System, Cluster, and Networking Summer Institute New Mexico Consortium and Los Alamos National Laboratory HOW TO APPLY Applications will be accepted JANUARY 5 - FEBRUARY 13, 2016 Computing and Information Technology undergraduate students are encouraged to apply. Must be a U.S. citizen. * Submit a current resume; * Official University Transcript (with spring courses posted and/or a copy of spring 2016 schedule) 3.0 GPA minimum; * One Letter of Recommendation from a Faculty Member; and * Letter of

  16. Sample size requirements for estimating effective dose from computed tomography using solid-state metal-oxide-semiconductor field-effect transistor dosimetry

    SciTech Connect (OSTI)

    Trattner, Sigal; Cheng, Bin; Pieniazek, Radoslaw L.; Hoffmann, Udo; Douglas, Pamela S.; Einstein, Andrew J.

    2014-04-15

    Purpose: Effective dose (ED) is a widely used metric for comparing ionizing radiation burden between different imaging modalities, scanners, and scan protocols. In computed tomography (CT), ED can be estimated by performing scans on an anthropomorphic phantom in which metal-oxide-semiconductor field-effect transistor (MOSFET) solid-state dosimeters have been placed to enable organ dose measurements. Here a statistical framework is established to determine the sample size (number of scans) needed for estimating ED to a desired precision and confidence, for a particular scanner and scan protocol, subject to practical limitations. Methods: The statistical scheme involves solving equations which minimize the sample size required for estimating ED to desired precision and confidence. It is subject to a constrained variation of the estimated ED and solved using the Lagrange multiplier method. The scheme incorporates measurement variation introduced both by MOSFET calibration, and by variation in MOSFET readings between repeated CT scans. Sample size requirements are illustrated on cardiac, chest, and abdomen-pelvis CT scans performed on a 320-row scanner and chest CT performed on a 16-row scanner. Results: Sample sizes for estimating ED vary considerably between scanners and protocols. Sample size increases as the required precision or confidence is higher and also as the anticipated ED is lower. For example, for a helical chest protocol, for 95% confidence and 5% precision for the ED, 30 measurements are required on the 320-row scanner and 11 on the 16-row scanner when the anticipated ED is 4 mSv; these sample sizes are 5 and 2, respectively, when the anticipated ED is 10 mSv. Conclusions: Applying the suggested scheme, it was found that even at modest sample sizes, it is feasible to estimate ED with high precision and a high degree of confidence. As CT technology develops enabling ED to be lowered, more MOSFET measurements are needed to estimate ED with the same
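    The flavor of the sample-size calculation can be sketched with a plain normal approximation. This is not the paper's Lagrange-multiplier scheme, and the 13% per-scan coefficient of variation below is a made-up figure:

```python
import math

def sample_size(cv, precision, z=1.96):
    """Scans needed so the relative half-width of the confidence interval
    for mean ED is at most `precision`, given a per-measurement coefficient
    of variation `cv` (normal approximation; the study additionally folds
    in MOSFET calibration error)."""
    return math.ceil((z * cv / precision) ** 2)

# Hypothetical 13% per-scan CV, 5% precision, 95% confidence:
n = sample_size(cv=0.13, precision=0.05)  # 26 scans
```

    The sketch reproduces the qualitative behavior reported: requiring tighter precision, or measuring a noisier (lower-dose) protocol, drives the required number of scans up quickly.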

  17. Computer Security

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    computer security Computer Security All JLF participants must fully comply with all LLNL computer security regulations and procedures. A laptop entering or leaving B-174 for the sole use by a US citizen and so configured, and requiring no IP address, need not be registered for use in the JLF. By September 2009, it is expected that computers for use by Foreign National Investigators will have no special provisions. Notify maricle1@llnl.gov of all other computers entering, leaving, or being moved

  18. Designing a minimum-functionality neutron and gamma measurement instrument with a focus on authentication

    SciTech Connect (OSTI)

    Karpius, Peter J; Williams, Richard B

    2009-01-01

    During the design and construction of the Next-Generation Attribute-Measurement System, which included a largely commercial off-the-shelf (COTS), nondestructive assay (NDA) system, we realized that commercial NDA equipment tends to include numerous features that are not required for an attribute-measurement system. Authentication of the hardware, firmware, and software in these instruments is still required, even for those features not used in this application. However, such a process adds to the complexity, cost, and time required for authentication. To avoid these added authentication difficulties, we began to design NDA systems capable of performing neutron multiplicity and gamma-ray spectrometry measurements by using simplified hardware and software that avoids unused features and complexity. This paper discusses one possible approach to this design: A hardware-centric system that attempts to perform signal analysis as much as possible in the hardware. Simpler processors and minimal firmware are used because computational requirements are kept to a bare minimum. By hard-coding the majority of the device's operational parameters, we could cull large sections of flexible, configurable hardware and software found in COTS instruments, thus yielding a functional core that is more straightforward to authenticate.

  19. Minimum Day Time Load Calculation and Screening

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Minimum Daytime Load Calculation and Screening Page 1 of 30 Kristen Ardani, Dora Nakafuji, Anthony Hong, and Babak Enayati [Speaker: Kristen Ardani] Cover Slide: Thank you everyone for joining us today for our DG interconnection collaborative informational webinar. Today we are going to talk about minimum daytime load calculation and screening procedures and their role in the distributed PV interconnection process. We're going to hear from Babak Enayati of the Massachusetts

  20. LHC Computing

    SciTech Connect (OSTI)

    Lincoln, Don

    2015-07-28

    The LHC is the world’s highest energy particle accelerator and scientists use it to record an unprecedented amount of data. This data is recorded in electronic format and it requires an enormous computational infrastructure to convert the raw data into conclusions about the fundamental rules that govern matter. In this video, Fermilab’s Dr. Don Lincoln gives us a sense of just how much data is involved and the incredible computer resources that make it all possible.

    1. ITP Steel: Theoretical Minimum Energies to Produce Steel for...

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      Theoretical Minimum Energies to Produce Steel for Selected Conditions, March 2000 ITP Steel: Theoretical Minimum Energies to Produce Steel for Selected Conditions, March 2000 ...

    2. Theoretical Minimum Energies to Produce Steel for Selected Conditions

      SciTech Connect (OSTI)

      Fruehan, R.J.; Fortini, O.; Paxton, H.W.; Brindle, R.

      2000-05-01

      The energy used to produce liquid steel in today's integrated and electric arc furnace (EAF) facilities is significantly higher than the theoretical minimum energy requirements. This study presents the absolute minimum energy required to produce steel from ore and from mixtures of scrap and scrap alternatives. Additional cases in which the assumptions are changed to more closely approximate actual operating conditions are also analyzed. The results, summarized in Table E-1, should give insight into the theoretical and practical potentials for reducing steelmaking energy requirements. The energy values have also been converted to carbon dioxide (CO{sub 2}) emissions in order to indicate the potential for reduction in emissions of this greenhouse gas (Table E-2). The study showed that increasing scrap melting has the largest impact on energy consumption. However, scrap should be viewed as having "invested" energy, since at one time it was produced by reducing ore. Increasing scrap melting in the BOF may or may not decrease energy use if the "invested" energy in scrap is considered.

    3. Minimum Day Time Load Calculation and Screening

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Minimum Day Time Load Calculation and Screening" Dora Nakafuji and Anthony Hong, Hawaiian Electric Co. Babak Enayati, DG Technical Standards Review Group April 30, 2014 Speakers Babak Enayati Chair of Massachusetts DG Technical Standards Review Group Dora Nakafuji Director of Renewable Energy Planning Hawaiian Electric Company (HECO) Kristen Ardani Solar Analyst, (today's moderator) NREL Anthony Hong Director of Distribution Planning Hawaiian Electric Company (HECO) Standardization of

    4. Two variants of minimum discarded fill ordering

      SciTech Connect (OSTI)

      D'Azevedo, E.F. ); Forsyth, P.A.; Tang, Wei-Pai . Dept. of Computer Science)

      1991-01-01

      It is well known that the ordering of the unknowns can have a significant effect on the convergence of Preconditioned Conjugate Gradient (PCG) methods. There has been considerable experimental work on the effects of ordering for regular finite difference problems. In many cases, good results have been obtained with preconditioners based on diagonal, spiral or natural row orderings. However, for finite element problems having unstructured grids or grids generated by a local refinement approach, it is difficult to define many of the orderings used for more regular problems. A recently proposed Minimum Discarded Fill (MDF) ordering technique is effective in finding high-quality Incomplete LU (ILU) preconditioners, especially for problems arising from unstructured finite element grids. Testing indicates this algorithm can identify a rather complicated physical structure in an anisotropic problem and order the unknowns in the "preferred" direction. The MDF technique may be viewed as the numerical analogue of the minimum deficiency algorithm in sparse matrix technology. At any stage of the partial elimination, the MDF technique chooses the next pivot node so as to minimize the amount of discarded fill. In this work, two efficient variants of the MDF technique are explored to produce cost-effective high-order ILU preconditioners. The Threshold MDF orderings combine MDF ideas with drop-tolerance techniques to identify the sparsity pattern of the ILU preconditioners. These techniques identify an ordering that encourages fast decay of the entries in the ILU factorization. The Minimum Update Matrix (MUM) ordering technique is a simplification of the MDF ordering and is closely related to the minimum degree algorithm. The MUM ordering is especially suited to large problems arising from Navier-Stokes applications. Some interesting pictures of the orderings are presented using a visualization tool. 22 refs., 4 figs., 7 tabs.
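The greedy pivot-selection idea behind MDF is easy to sketch. Below is a toy, dense-matrix illustration of my own (a simplification for exposition, not the authors' sparse implementation): at each step the pivot is chosen to minimize the squared magnitude of the fill entries that an ILU(0)-style factorization would discard.

```python
def mdf_order(A):
    """Toy Minimum Discarded Fill (MDF) style ordering on a dense matrix.

    At each elimination step, pick the pivot whose elimination discards
    the least fill, measured as the sum of squares of update entries
    that land on zero positions of the current pattern (ILU(0)-style
    discard). This is an illustrative sketch only.
    """
    n = len(A)
    A = [row[:] for row in A]            # work on a copy
    active = list(range(n))
    order = []
    for _ in range(n):
        best, best_score = None, None
        for j in active:
            if abs(A[j][j]) < 1e-14:     # skip (near-)zero pivots
                continue
            others = [i for i in active if i != j]
            score = 0.0
            for i in others:
                if A[i][j] == 0:
                    continue
                for k in others:
                    # fill entry (i, k) would be created but discarded
                    if A[j][k] != 0 and A[i][k] == 0:
                        score += (A[i][j] * A[j][k] / A[j][j]) ** 2
            if best_score is None or score < best_score:
                best, best_score = j, score
        # eliminate the chosen pivot, updating only existing entries
        # (fill outside the pattern is discarded, as in ILU(0))
        others = [i for i in active if i != best]
        for i in others:
            for k in others:
                if A[i][k] != 0:
                    A[i][k] -= A[i][best] * A[best][k] / A[best][best]
        order.append(best)
        active.remove(best)
    return order
```

On a tridiagonal matrix this greedy rule first eliminates an endpoint, whose elimination creates no fill at all, mirroring the way MDF follows the "preferred" direction of a structured problem.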

    5. Minimum Day Time Load Calculation and Screening

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      Distributed Generation Interconnection Collaborative (DGIC) "Minimum Day Time Load Calculation and Screening" Dora Nakafuji and Anthony Hong, Hawaiian Electric Co. Babak Enayati, DG Technical Standards Review Group April 30, 2014 Speakers Babak Enayati Chair of Massachusetts DG Technical Standards Review Group Dora Nakafuji Director of Renewable Energy Planning Hawaiian Electric Company (HECO) Kristen Ardani Solar Analyst, (today's moderator) NREL Anthony Hong Director of

    6. Computing Frontier: Distributed Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computing Frontier: Distributed Computing and Facility Infrastructures Conveners: Kenneth Bloom 1 , Richard Gerber 2 1 Department of Physics and Astronomy, University of Nebraska-Lincoln 2 National Energy Research Scientific Computing Center (NERSC), Lawrence Berkeley National Laboratory 1.1 Introduction The field of particle physics has become increasingly reliant on large-scale computing resources to address the challenges of analyzing large datasets, completing specialized computations and

    7. Energy and IAQ Implications of Alternative Minimum Ventilation Rates in California Retail and School Buildings

      SciTech Connect (OSTI)

      Dutton, Spencer M.; Fisk, William J.

      2015-01-01

      For a stand-alone retail building, a primary school, and a secondary school in each of the 16 California climate zones, the EnergyPlus building energy simulation model was used to estimate how minimum mechanical ventilation rates (VRs) affect energy use and indoor air concentrations of an indoor-generated contaminant. The modeling indicates large changes in heating energy use, but only moderate changes in total building energy use, as minimum VRs in the retail building are changed. For example, predicted state-wide heating energy consumption in the retail building decreases by more than 50% and total building energy consumption decreases by approximately 10% as the minimum VR decreases from the Title 24 requirement to no mechanical ventilation. The primary and secondary schools have notably higher internal heat gains than in the retail building models, resulting in significantly reduced demand for heating. The school heating energy use was correspondingly less sensitive to changes in the minimum VR. The modeling indicates that minimum VRs influence HVAC energy and total energy use in schools by only a few percent. For both the retail building and the school buildings, minimum VRs substantially affected the predicted annual-average indoor concentrations of an indoor generated contaminant, with larger effects in schools. The shape of the curves relating contaminant concentrations with VRs illustrate the importance of avoiding particularly low VRs.
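The near-inverse relationship between minimum ventilation rate and the concentration of an indoor-generated contaminant follows from the single-zone steady-state mass balance. The sketch below is a deliberately simplified illustration (the paper's EnergyPlus simulations model far more detail); all names and numbers are mine:

```python
def steady_state_concentration(emission_g_per_h, vr_m3_per_h, outdoor_g_per_m3=0.0):
    """Well-mixed single-zone steady state: C = C_out + S / Q,
    where S is the indoor emission rate and Q the ventilation rate."""
    return outdoor_g_per_m3 + emission_g_per_h / vr_m3_per_h

# Halving the ventilation rate doubles the indoor-generated contribution,
# which is why particularly low VRs are disproportionately harmful.
c_full = steady_state_concentration(10.0, 1000.0)
c_half = steady_state_concentration(10.0, 500.0)
```

This 1/Q shape is what makes the concentration curves steepen sharply at low ventilation rates, as the abstract notes.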

    8. Minimum wear tube support hole design

      DOE Patents [OSTI]

      Glatthorn, Raymond H. (St. Petersburg, FL)

      1986-01-01

      A minimum-wear through-bore (16) is defined within a heat exchanger tube support plate (14) so as to have an hourglass configuration as determined by means of a constant radiused surface curvature (18) as defined by means of an external radius (R3), wherein the surface (18) extends between the upper surface (20) and lower surface (22) of the tube support plate (14). When a heat exchange tube (12) is disposed within the tube support plate (14) so as to pass through the through-bore (16), the heat exchange tube (12) is always in contact with a smoothly curved or radiused portion of the through-bore surface (16) whereby unacceptably excessive wear upon the heat exchange tube (12), as normally developed by means of sharp edges, lands, ridges, or the like conventionally part of the tube support plates, is eliminated or substantially reduced.

    9. Computer, Computational, and Statistical Sciences

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      CCS Computer, Computational, and Statistical Sciences Computational physics, computer science, applied mathematics, statistics and the integration of large data streams are central ...

    10. Optimizing minimum free-energy crossing points in solution: Linear...

      Office of Scientific and Technical Information (OSTI)

      Optimizing minimum free-energy crossing points in solution: Linear-response free energyspin-flip density functional theory approach Citation Details In-Document Search Title:...

    11. ON COMPUTING UPPER LIMITS TO SOURCE INTENSITIES

      SciTech Connect (OSTI)

      Kashyap, Vinay L.; Siemiginowska, Aneta [Smithsonian Astrophysical Observatory, 60 Garden Street, Cambridge, MA 02138 (United States); Van Dyk, David A.; Xu Jin [Department of Statistics, University of California, Irvine, CA 92697-1250 (United States); Connors, Alanna [Eureka Scientific, 2452 Delmer Street, Suite 100, Oakland, CA 94602-3017 (United States); Freeman, Peter E. [Department of Statistics, Carnegie Mellon University, 5000 Forbes Avenue, Pittsburgh, PA 15213 (United States); Zezas, Andreas, E-mail: vkashyap@cfa.harvard.edu, E-mail: asiemiginowska@cfa.harvard.edu, E-mail: dvd@ics.uci.edu, E-mail: jinx@ics.uci.edu, E-mail: aconnors@eurekabayes.com, E-mail: pfreeman@cmu.edu, E-mail: azezas@cfa.harvard.edu [Physics Department, University of Crete, P.O. Box 2208, GR-710 03, Heraklion, Crete (Greece)

      2010-08-10

      A common problem in astrophysics is determining how bright a source could be and still not be detected in an observation. Despite the simplicity with which the problem can be stated, the solution involves complicated statistical issues that require careful analysis. In contrast to the more familiar confidence bound, this concept has never been formally analyzed, leading to a great variety of often ad hoc solutions. Here we formulate and describe the problem in a self-consistent manner. Detection significance is usually defined by the acceptable proportion of false positives (background fluctuations that are claimed as detections, or Type I error), and we invoke the complementary concept of false negatives (real sources that go undetected, or Type II error), based on the statistical power of a test, to compute an upper limit to the detectable source intensity. To determine the minimum intensity that a source must have for it to be detected, we first define a detection threshold and then compute the probabilities of detecting sources of various intensities at the given threshold. The intensity that corresponds to the specified Type II error probability defines that minimum intensity and is identified as the upper limit. Thus, an upper limit is a characteristic of the detection procedure rather than the strength of any particular source. It should not be confused with confidence intervals or other estimates of source intensity. This is particularly important given the large number of catalogs that are being generated from increasingly sensitive surveys. We discuss, with examples, the differences between these upper limits and confidence bounds. Both measures are useful quantities that should be reported in order to extract the most science from catalogs, though they answer different statistical questions: an upper bound describes an inference range on the source intensity, while an upper limit calibrates the detection process. We provide a recipe for computing upper limits.
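The Type I/Type II logic described above can be made concrete for a simple Poisson counting experiment. This is an illustrative construction in the spirit of the abstract, not the authors' code: first fix a detection threshold from the false-positive rate α, then find the smallest source intensity detected with probability at least 1 − β.

```python
import math

def prob_at_least(n, mu):
    """P(N >= n) for N ~ Poisson(mu), computed from the CDF below n."""
    term, below = math.exp(-mu), 0.0
    for k in range(n):
        below += term
        term *= mu / (k + 1)
    return 1.0 - below

def detection_threshold(b, alpha):
    """Smallest count n with false-positive rate P(N >= n | b) <= alpha."""
    n = 0
    while prob_at_least(n, b) > alpha:
        n += 1
    return n

def upper_limit(b, alpha=0.05, beta=0.5, step=0.01):
    """Smallest source intensity s whose detection probability at the
    alpha-level threshold reaches 1 - beta (the Type II condition)."""
    n_star = detection_threshold(b, alpha)
    s = 0.0
    while prob_at_least(n_star, b + s) < 1.0 - beta:
        s += step
    return n_star, s
```

With a background of b = 1 expected count and α = 0.05, the threshold is 4 counts; the β = 0.5 upper limit is then the intensity at which half of such sources would be detected. Note that the upper limit depends only on the detection procedure, never on the observed counts, exactly as the abstract argues.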

    12. Present and Future Computational Requirements General Plasma...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Amitava Bhattacharjee, Fatima Ebrahimi, Will Fox, Liwei Lin CICART Space Science Center Dept. of Physics University of New Hampshire March 18, 2013 Kai Germaschewski and Homa ...

    13. BES Science Network Requirements

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Network Requirements Report of the Basic Energy Sciences Network Requirements Workshop Conducted June 4-5, 2007 BES Science Network Requirements Workshop Basic Energy Sciences Program Office, DOE Office of Science Energy Sciences Network Washington, DC - June 4 and 5, 2007 ESnet is funded by the US Dept. of Energy, Office of Science, Advanced Scientific Computing Research (ASCR) program. Dan Hitchcock is the ESnet Program Manager. ESnet is operated by Lawrence Berkeley National Laboratory, which

    14. RMACS software requirements specification

      SciTech Connect (OSTI)

      Gneiting, B.C.

      1996-10-01

      This document defines the essential user (or functional) requirements of the Requirements Management and Assured Compliance System (RMACS), which is used by the Tank Waste Remediation System program (TWRS). RMACS provides a computer-based environment that TWRS management and systems engineers can use to identify, define, and document requirements. The intent of the system is to manage information supporting definition of the TWRS technical baseline using a structured systems engineering process. RMACS has the capability to effectively manage a complete set of complex requirements and relationships in a manner that satisfactorily assures compliance to the program requirements over the TWRS life-cycle.

    15. Requirement-Reviews.pptx

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Requirements Reviews: 1½-day reviews with each Program Office covering computing and storage requirements for the next 5 years. Participants: DOE ADs & Program Managers, leading scientists using NERSC & key potential users, NERSC staff. High Energy Physics, Fusion Research. Reports from 6 requirements reviews have been published: http://www.nersc.gov/science/requirements-reviews/final-reports/ Computing and storage requirements for 2013/2014; Executive Summary of

    16. PREPARING FOR EXASCALE: ORNL Leadership Computing Application...

      Office of Scientific and Technical Information (OSTI)

      This effort targeted science teams whose projects received large computer allocation ... the proposed time frame will require disruptive changes in computer hardware and software. ...

    17. Reporting Requirements

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Reporting Requirements Reporting Requirements Contacts Director Albert Migliori Deputy Franz Freibert 505 667-6879 Email Professional Staff Assistant Susan Ramsay 505 665 0858...

    18. Energy Aware Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Energy Aware Computing. Dynamic Frequency Scaling: One means to lower the energy required to compute is to reduce the power usage on a node. One way to accomplish this is by lowering the frequency at which the CPU operates. However, reducing the clock speed increases the time to solution, creating a potential tradeoff. NERSC continues to examine how such methods impact its operations and its
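The clock-speed tradeoff can be illustrated with a toy energy model (illustrative constants of my own, not NERSC measurements): dynamic power grows roughly as f³ when voltage scales with frequency, while runtime for CPU-bound work scales as 1/f, so once static power is included the total energy has an interior minimum.

```python
def total_energy(freq_ghz, work_gcycles=1.0, p_static_w=20.0, k_dyn=5.0):
    """Toy DVFS energy model: E = (P_static + k * f^3) * (work / f).
    Constants are illustrative, not measured values."""
    runtime_s = work_gcycles / freq_ghz
    p_dynamic_w = k_dyn * freq_ghz ** 3
    return (p_static_w + p_dynamic_w) * runtime_s

# Sweep candidate frequencies: in this model neither the slowest nor
# the fastest clock minimizes energy -- the optimum is interior.
freqs = [1.0, 1.5, 2.0, 2.5, 3.0]
best = min(freqs, key=total_energy)
```

Running too slowly wastes static (leakage) energy over a long runtime; running too fast pays the cubic dynamic-power penalty. That is the tradeoff the passage describes.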

    19. ELECTRONIC DIGITAL COMPUTER

      DOE Patents [OSTI]

      Stone, J.J. Jr.; Bettis, E.S.; Mann, E.R.

      1957-10-01

      The electronic digital computer is designed to solve systems involving a plurality of simultaneous linear equations. The computer can solve a system which converges rather rapidly when using Von Seidel's method of approximation and performs the summations required for solving for the unknown terms by a method of successive approximations.
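"Von Seidel's method" here is presumably what is now called the Gauss-Seidel iteration. A minimal sketch of that successive-approximation scheme in modern notation (my own code, not the patent's circuitry):

```python
def gauss_seidel(A, b, x0=None, iters=100):
    """Solve A x = b by successive approximation (Gauss-Seidel sweeps).
    Each unknown is updated in place using the latest values of the
    others; the iteration converges rapidly for diagonally dominant
    systems, the case the patent highlights."""
    n = len(b)
    x = list(x0) if x0 is not None else [0.0] * n
    for _ in range(iters):
        for i in range(n):
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            x[i] = (b[i] - s) / A[i][i]
    return x
```

For example, the system 4x + y = 9, x + 3y = 7 converges to x = 20/11, y = 19/11 in a handful of sweeps.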

    20. Computing | Department of Energy

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      Computing Fun fact: Most systems require air conditioning or chilled water to cool super powerful supercomputers, but the Olympus supercomputer at Pacific Northwest National Laboratory is cooled by the location's 65 degree groundwater. Traditional cooling systems could cost up to $61,000 in electricity each year, but this more efficient setup uses 70 percent less energy. | Photo courtesy of PNNL.

    1. Compute nodes

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Compute nodes. Click here to see a more detailed hierarchical map of the topology of a compute node. Last edited: 2015-03-30 20:55:24...

    2. Apparatus and method for closed-loop control of reactor power in minimum time

      DOE Patents [OSTI]

      Bernard, Jr., John A.

      1988-11-01

      Closed-loop control law for altering the power level of nuclear reactors in a safe manner, without overshoot, and in minimum time. Apparatus is provided for moving a fast-acting control element, such as a control rod or a control drum, for altering the nuclear reactor power level. A computer computes at short time intervals either the function dρ/dt = (β − ρ)ω − λₑ′ρ − Σᵢ βᵢ(λᵢ − λₑ′) + l*(dω/dt) + l*[ω² + λₑ′ω] or the function dρ/dt = (β − ρ)ω − λₑρ − ((dλₑ/dt)/λₑ)(β − ρ) + l*(dω/dt) + l*[ω² + λₑω − ((dλₑ/dt)/λₑ)ω]. These functions each specify the rate of change of reactivity that is necessary to achieve a specified rate of change of reactor power. The direction and speed of motion of the control element is altered so as to provide the rate of reactivity change calculated using either or both of these functions, thereby attaining a new power level without overshoot and in minimum time. These functions are computed at intervals of approximately 0.01-1.0 seconds, depending on the specific application.

    3. Computer System,

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      undergraduate summer institute http://isti.lanl.gov (Educational Prog) 2016 Computer System, Cluster, and Networking Summer Institute Purpose The Computer System,...

    4. Modeling an Application's Theoretical Minimum and Average Transactional Response Times

      SciTech Connect (OSTI)

      Paiz, Mary Rose

      2015-04-01

      The theoretical minimum transactional response time of an application serves as a basis for the expected response time. The lower threshold for the minimum response time represents the minimum amount of time that the application should take to complete a transaction. Knowing the lower threshold is beneficial in detecting anomalies that result from unsuccessful transactions. Conversely, when an application's response time falls above an upper threshold, there is likely an anomaly in the application that is causing unusual performance issues in the transaction. This report explains how the non-stationary Generalized Extreme Value distribution is used to estimate the lower threshold of an application's daily minimum transactional response time. It also explains how the seasonal Autoregressive Integrated Moving Average time series model is used to estimate the upper threshold for an application's average transactional response time.
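As a simplified, stationary stand-in for the report's non-stationary GEV fit, the block-minima idea can be sketched with a Gumbel distribution (a GEV with zero shape parameter) fitted by the method of moments. All names and constants below are illustrative, not the report's:

```python
import math
import statistics

def min_response_threshold(daily_minima, p=0.01):
    """Estimate a lower threshold for daily minimum response times.

    Block minima of X are block maxima of -X, so fit a Gumbel
    distribution to the negated minima via the method of moments and
    return the level the daily minimum falls below with probability p.
    A simplified sketch of the GEV-threshold idea, not the report's
    non-stationary fit.
    """
    m = [-x for x in daily_minima]
    mean, sd = statistics.mean(m), statistics.stdev(m)
    beta = sd * math.sqrt(6) / math.pi           # Gumbel scale
    mu = mean - 0.5772156649 * beta              # Gumbel location
    # (1 - p)-quantile of the maxima of -X ...
    q = mu - beta * math.log(-math.log(1.0 - p))
    # ... maps back to the lower p-quantile of the minima of X
    return -q
```

Observed daily minima that fall below this threshold would then be flagged as candidate anomalies, mirroring the report's use of the lower threshold.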

    5. Table 10.1 Nonswitchable Minimum and Maximum Consumption, 2002

      U.S. Energy Information Administration (EIA) Indexed Site

      Nonswitchable Minimum and Maximum Consumption, 2002. Level: National and Regional Data. Row: Energy Sources. Column: Consumption Potential. Unit: Physical Units. Data columns: Actual Consumption, Minimum Consumption(a), Maximum Consumption(b), RSE Row Factors. Region: Total United States

    6. Animation Requirements

      Broader source: Energy.gov [DOE]

      Animations include dynamic elements such as interactive images and games. For developing animations, follow these design and coding requirements.

    7. Computing Information

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Information From here you can find information relating to: Obtaining the right computer accounts. Using NIC terminals. Using BooNE's Computing Resources, including: Choosing your desktop. Kerberos. AFS. Printing. Recommended applications for various common tasks. Running CPU- or IO-intensive programs (batch jobs) Commonly encountered problems Computing support within BooNE Bringing a computer to FNAL, or purchasing a new one. Laptops. The Computer Security Program Plan for MiniBooNE The

    8. Optimal shielding design for minimum materials cost or mass

      SciTech Connect (OSTI)

      Woolley, Robert D.

      2015-12-02

      The mathematical underpinnings of cost optimal radiation shielding designs based on an extension of optimal control theory are presented, a heuristic algorithm to iteratively solve the resulting optimal design equations is suggested, and computational results for a simple test case are discussed. A typical radiation shielding design problem can have infinitely many solutions, all satisfying the problem's specified set of radiation attenuation requirements. Each such design has its own total materials cost. For a design to be optimal, no admissible change in its deployment of shielding materials can result in a lower cost. This applies in particular to very small changes, which can be restated using the calculus of variations as the Euler-Lagrange equations. Furthermore, the associated Hamiltonian function and application of Pontryagin's theorem lead to conditions for a shield to be optimal.

    9. Optimal shielding design for minimum materials cost or mass

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Woolley, Robert D.

      2015-12-02

      The mathematical underpinnings of cost optimal radiation shielding designs based on an extension of optimal control theory are presented, a heuristic algorithm to iteratively solve the resulting optimal design equations is suggested, and computational results for a simple test case are discussed. A typical radiation shielding design problem can have infinitely many solutions, all satisfying the problem's specified set of radiation attenuation requirements. Each such design has its own total materials cost. For a design to be optimal, no admissible change in its deployment of shielding materials can result in a lower cost. This applies in particular to very small changes, which can be restated using the calculus of variations as the Euler-Lagrange equations. Furthermore, the associated Hamiltonian function and application of Pontryagin's theorem lead to conditions for a shield to be optimal.
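For reference, the variational conditions invoked above take the standard textbook forms (generic forms only; the paper's specific shielding functional is not given here):

```latex
% Euler--Lagrange condition for a cost functional J[y] = \int L(x, y, y')\,dx:
% a design is stationary when no small admissible change lowers the cost.
\frac{d}{dx}\!\left(\frac{\partial L}{\partial y'}\right)
  - \frac{\partial L}{\partial y} = 0

% Pontryagin's principle: an optimal control u^* minimizes the associated
% Hamiltonian pointwise along the optimal trajectory x^*(t) with costate p(t):
H\bigl(x^*(t), u^*(t), p(t)\bigr) \le H\bigl(x^*(t), u, p(t)\bigr)
  \quad \text{for every admissible } u .
```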

    10. Eligibility Requirements

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Eligibility Requirements Eligibility Requirements A comprehensive benefits package with plan options for health care and retirement to take care of our employees today and tomorrow. Contact Benefits Office (505) 667-1806 Email Eligibility and required supporting documentation The Laboratory offers an extensive benefits package to full and part time employees. Casual employees (excluding High School Coop, Lab Associates and Craft Employees) are eligible to enroll in the HDHP medical plan. Refer

    11. Competition Requirements

      Office of Environmental Management (EM)

      Chapter 6.1 (January 2011) Competition Requirements [Reference: FAR 6 and DEAR 906] Overview This section discusses competition requirements and provides a model Justification for Other than Full and Open Competition (JOFOC). Background The Competition in Contracting Act (CICA) of 1984 requires that all acquisitions be made using full and open competition. Seven exceptions to using full and open competition are specifically identified in Federal Acquisition Regulation (FAR) Subpart 6.3.

    12. Reporting Requirements

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Reporting Requirements Reporting Requirements Contacts Director Albert Migliori Deputy Franz Freibert 505 667-6879 Email Professional Staff Assistant Susan Ramsay 505 665 0858 Email The Fellow will be required to participate in the Actinide Science lecture series by both attending lectures and presenting a scientific lecture on actinide science in this series. Submission of a viewgraph and brief write-up of the project. Provide metrics information as requested. Submission of an overview article

    13. Competition Requirements

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      Chapter 6.1 (July 2011) Competition Requirements [Reference: FAR 6 and DEAR 906] Overview This section discusses competition requirements and provides a model Justification for Other than Full and Open Competition (JOFOC). Background The Competition in Contracting Act (CICA) of 1984 requires that all acquisitions be made using full and open competition. Seven exceptions to using full and open competition are specifically identified in Federal

    14. Deployment Requirements

      Broader source: Energy.gov (indexed) [DOE]

      Troy, Michigan June 13, 2014 THIS PRESENTATION DOES NOT CONTAIN ANY PROPRIETARY, CONFIDENTIAL OR OTHERWISE RESTRICTED INFORMATION 2 Outline of talk * SAE 2719 Requirements and ...

    15. Video Requirements

      Broader source: Energy.gov [DOE]

      All EERE videos, including webinar recordings, must meet Section 508's requirements for accessibility. All videos should be hosted on the DOE YouTube channel.

    16. Theoretical solution of the minimum charge problem for gaseous detonations

      SciTech Connect (OSTI)

      Ostensen, R.W.

      1990-12-01

      A theoretical model was developed for the minimum charge to trigger a gaseous detonation in spherical geometry as a generalization of the Zeldovich model. Careful comparisons were made between the theoretical predictions and experimental data on the minimum charge to trigger detonations in propane-air mixtures. The predictions are an order of magnitude too high, and there is no apparent resolution to the discrepancy. A dynamic model, which takes into account the experimentally observed oscillations in the detonation zone, may be necessary for reliable predictions. 27 refs., 9 figs.

    17. An introduction to computer viruses

      SciTech Connect (OSTI)

      Brown, D.R.

      1992-03-01

      This report on computer viruses is based upon a thesis written for the Master of Science degree in Computer Science from the University of Tennessee in December 1989 by David R. Brown. This thesis is entitled An Analysis of Computer Virus Construction, Proliferation, and Control and is available through the University of Tennessee Library. This paper contains an overview of the computer virus arena that can help the reader to evaluate the threat that computer viruses pose. The extent of this threat can only be determined by evaluating many different factors. These factors include the relative ease with which a computer virus can be written, the motivation involved in writing a computer virus, the damage and overhead incurred by infected systems, and the legal implications of computer viruses, among others. Based upon the research, the development of a computer virus seems to require more persistence than technical expertise. This is a frightening proclamation to the computing community. The education of computer professionals to the dangers that viruses pose to the welfare of the computing industry as a whole is stressed as a means of inhibiting the current proliferation of computer virus programs. Recommendations are made to assist computer users in preventing infection by computer viruses. These recommendations support solid general computer security practices as a means of combating computer viruses.

    18. Computing Sciences

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      The Computational Research Division conducts research and development in mathematical modeling and simulation, algorithm design, data storage, management and...

    19. Computing Resources

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Cluster-Image TRACC RESEARCH Computational Fluid Dynamics Computational Structural Mechanics Transportation Systems Modeling Computing Resources The TRACC Computational Clusters With the addition of a new cluster called Zephyr that was made operational in September of this year (2012), TRACC now offers two clusters to choose from: Zephyr and our original cluster that has now been named Phoenix. Zephyr was acquired from Atipa technologies, and it is a 92-node system with each node having two AMD

    20. Compute Nodes

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Compute Nodes. Quad-core AMD Opteron processor. Compute node configuration: 9,572 nodes; 1 quad-core AMD 'Budapest' 2.3 GHz processor per node; 4 cores per node (38,288 total cores); 8 GB DDR3 800 MHz memory per node. Peak Gflop rate: 9.2 Gflops/core, 36.8 Gflops/node, 352 Tflops for the entire machine. Each core has its own L1 and L2 caches, of 64 KB and 512 KB respectively; a 2 MB L3 cache is shared among the 4 cores. Compute node software: by default the compute nodes run a restricted low-overhead
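The peak figures quoted above are simple products, worth seeing once. The only assumption below is that 9.2 Gflops/core at 2.3 GHz implies 4 floating-point operations per cycle; the node and core counts come from the listing itself.

```python
# Figures from the compute-node listing above.
nodes = 9572
cores_per_node = 4
ghz = 2.3
flops_per_cycle = 4  # assumption: 9.2 Gflops/core at 2.3 GHz => 4 flops/cycle

gflops_per_core = ghz * flops_per_cycle            # 9.2 Gflops/core
gflops_per_node = cores_per_node * gflops_per_core # 36.8 Gflops/node
total_tflops = nodes * gflops_per_node / 1000.0    # ~352 Tflops machine peak
```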

    1. New York City- Energy Conservation Requirements for Existing Buildings

      Broader source: Energy.gov [DOE]

      Council Bill No. 564-A (Local Law 85 of 2009): Requires that renovations of existing buildings meet minimum energy conservation standards. The result of this law is essentially a city energy code ...

    2. The "minimum information about an environmental sequence" (MIENS) specification

      SciTech Connect (OSTI)

      Yilmaz, P.; Kottmann, R.; Field, D.; Knight, R.; Cole, J.R.; Amaral-Zettler, L.; Gilbert, J.A.; Karsch-Mizrachi, I.; Johnston, A.; Cochrane, G.; Vaughan, R.; Hunter, C.; Park, J.; Morrison, N.; Rocca-Serra, P.; Sterk, P.; Arumugam, M.; Baumgartner, L.; Birren, B.W.; Blaser, M.J.; Bonazzi, V.; Bork, P.; Buttigieg, P. L.; Chain, P.; Costello, E.K.; Huot-Creasy, H.; Dawyndt, P.; DeSantis, T.; Fierer, N.; Fuhrman, J.; Gallery, R.E.; Gibbs, R.A.; Giglio, M.G.; Gil, I. San; Gonzalez, A.; Gordon, J.I.; Guralnick, R.; Hankeln, W.; Highlander, S.; Hugenholtz, P.; Jansson, J.; Kennedy, J.; Knights, D.; Koren, O.; Kuczynski, J.; Kyrpides, N.; Larsen, R.; Lauber, C.L.; Legg, T.; Ley, R.E.; Lozupone, C.A.; Ludwig, W.; Lyons, D.; Maguire, E.; Methe, B.A.; Meyer, F.; Nakieny, S.; Nelson, K.E.; Nemergut, D.; Neufeld, J.D.; Pace, N.R.; Palanisamy, G.; Peplies, J.; Peterson, J.; Petrosino, J.; Proctor, L.; Raes, J.; Ratnasingham, S.; Ravel, J.; Relman, D.A.; Assunta-Sansone, S.; Schriml, L.; Sodergren, E.; Spor, A.; Stombaugh, J.; Tiedje, J.M.; Ward, D.V.; Weinstock, G.M.; Wendel, D.; White, O.; Wikle, A.; Wortman, J.R.; Glockner, F.O.; Bushman, F.D.; Charlson, E.; Gevers, D.; Kelley, S.T.; Neubold, L.K.; Oliver, A.E.; Pruesse, E.; Quast, C.; Schloss, P.D.; Sinha, R.; Whitely, A.

      2010-10-15

      We present the Genomic Standards Consortium's (GSC) 'Minimum Information about an ENvironmental Sequence' (MIENS) standard for describing marker genes. Adoption of MIENS will enhance our ability to analyze natural genetic diversity across the Tree of Life as it is currently being documented by massive DNA sequencing efforts from myriad ecosystems in our ever-changing biosphere.

    3. DOE SC Exascale Requirements Reviews: High Energy Physics

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      computational domain scientists, DOE planners and administrators, and experts in computer science and applied mathematics to determine the requirements for an exascale ecosystem ...

    4. Is "predictability" in computational sciences a myth?

      SciTech Connect (OSTI)

      Hemez, Francois M [Los Alamos National Laboratory]

      2011-01-31

      Within the last two decades, Modeling and Simulation (M&S) has become the tool of choice to investigate the behavior of complex phenomena. Successes encountered in 'hard' sciences are prompting interest in applying a similar approach to Computational Social Sciences in support, for example, of national security applications faced by the Intelligence Community (IC). This manuscript attempts to contribute to the debate on the relevance of M&S to IC problems by offering an overview of what it takes to reach 'predictability' in computational sciences. Even though models developed in 'soft' and 'hard' sciences are different, useful analogies can be drawn. The starting point is to view numerical simulations as 'filters' capable of representing information only within specific length, time or energy bandwidths. This simplified view leads to the discussion of resolving versus modeling, which motivates the need for sub-scale modeling. The role that modeling assumptions play in 'hiding' our lack-of-knowledge about sub-scale phenomena is explained, which leads to discussing uncertainty in simulations. It is argued that the uncertainty caused by resolution and modeling assumptions should be dealt with differently than uncertainty due to randomness or variability. The corollary is that a predictive capability cannot be defined solely as accuracy, or the ability of predictions to match the available physical observations. We propose that 'predictability' is the demonstration that predictions from a class of 'equivalent' models are as consistent as possible. Equivalency stems from defining models that share a minimum requirement of accuracy, while being equally robust to the sources of lack-of-knowledge in the problem. Examples in computational physics and engineering are given to illustrate the discussion.

    5. Exascale Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      DesignForward FastForward CAL Partnerships Shifter: User Defined Images Archive APEX Home » R & D » Exascale Computing Exascale Computing Moving forward into the exascale era, NERSC users will place increased demands on NERSC computational facilities. Users will be facing increased complexity in the memory subsystem and node architecture. System designs and programming models will have to evolve to face these new challenges. NERSC staff are active in current initiatives addressing

    6. DOE Requires Manufacturers to Halt Sales of Heat Pumps and Air Conditioners

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      DOE Requires Manufacturers to Halt Sales of Heat Pumps and Air Conditioners Violating Minimum Appliance Standards. June 3, 2010 - 12:00am. Washington, DC - Today, the Department of Energy announced that three manufacturers -- Aspen Manufacturing, Inc., Summit Manufacturing, and Advanced Distributor Products -- must

    7. Computational Science

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      ... Advanced Materials Laboratory Center for Integrated Nanotechnologies Combustion Research Facility Computational Science Research Institute Joint BioEnergy Institute About EC News ...

    8. Computer Science

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      CiteSeer: Department of Energy-provided open-access science research citations in chemistry, physics, materials, engineering, and computer science. IEEE Xplore: Full text...

    9. Compute Nodes

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      low-overhead operating system optimized for high performance computing called "Cray Linux Environment" (CLE). This OS supports only a limited number of system calls and UNIX...

    10. Computing and Computational Sciences Directorate - Contacts

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Home About Us Contacts Jeff Nichols Associate Laboratory Director Computing and Computational Sciences Becky Verastegui Directorate Operations Manager Computing and...

    11. Proposal for grid computing for nuclear applications

      SciTech Connect (OSTI)

      Idris, Faridah Mohamad; Ismail, Saaidi; Haris, Mohd Fauzi B.; Sulaiman, Mohamad Safuan B.; Aslan, Mohd Dzul Aiman Bin.; Samsudin, Nursuliza Bt.; Ibrahim, Maizura Bt.; Ahmad, Megat Harun Al Rashid B. Megat; Yazid, Hafizal B.; Jamro, Rafhayudi B.; Azman, Azraf B.; Rahman, Anwar B. Abdul; Ibrahim, Mohd Rizal B. Mamat; Muhamad, Shalina Bt. Sheik; Hassan, Hasni; Abdullah, Wan Ahmad Tajuddin Wan; Ibrahim, Zainol Abidin; Zolkapli, Zukhaimira; Anuar, Afiq Aizuddin; Norjoharuddeen, Nurfikri; and others

      2014-02-12

      The use of computer clusters for computational sciences, including computational physics, is vital as it provides computing power to crunch big numbers at a faster rate. In compute-intensive applications that require high resolution, such as Monte Carlo simulation, the use of computer clusters in a grid form, which supplies computational power to any node within the grid that needs it, has become a necessity. In this paper, we describe how clusters running a specific application can use resources within the grid to speed up the computing process.
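The paper's use case, embarrassingly parallel Monte Carlo batches farmed out to whatever nodes have spare cycles, can be sketched with plain Python multiprocessing standing in for real grid middleware. The pi-estimation kernel below is an invented stand-in for illustration, not the authors' application:

```python
import random
from multiprocessing import Pool

def batch(args):
    """One independent Monte Carlo batch: count random points inside the unit circle."""
    seed, n = args
    rng = random.Random(seed)
    return sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0 for _ in range(n))

if __name__ == "__main__":
    n_per, batches = 100_000, 8
    # Each batch could run on any grid node with spare capacity;
    # the per-batch tallies combine trivially at the end.
    with Pool() as pool:
        hits = sum(pool.map(batch, [(seed, n_per) for seed in range(batches)]))
    print(4.0 * hits / (n_per * batches))   # ~3.14
```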

    12. Computing and Computational Sciences Directorate - Divisions

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      CCSD Divisions Computational Sciences and Engineering Computer Sciences and Mathematics Information Technology Services Joint Institute for Computational Sciences National Center for Computational Sciences

    13. Competition Requirements

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      Chapter 5.2 (April 2008) Synopsizing Proposed Non-Competitive Contract Actions Citing the Authority of FAR 6.302-1 [Reference: FAR 5 and DEAR 905] Overview: This section discusses publicizing sole source actions as part of the approval of a Justification for Other than Full and Open Competition (JOFOC) using the authority of FAR 6.302-1. Background: The Competition in Contracting Act (CICA) of 1984 requires that all acquisitions be made using

    14. Minimum information about a marker gene sequence (MIMARKS) and minimum information about any (x) sequence (MIxS) specifications.

      SciTech Connect (OSTI)

      Yilmaz, P.; Kottmann, R.; Field, D.; Knight, R.; Cole, J. R.; Amaral-Zettler, L.; Gilbert, J. A.

      2011-05-01

      Here we present a standard developed by the Genomic Standards Consortium (GSC) for reporting marker gene sequences - the minimum information about a marker gene sequence (MIMARKS). We also introduce a system for describing the environment from which a biological sample originates. The 'environmental packages' apply to any genome sequence of known origin and can be used in combination with MIMARKS and other GSC checklists. Finally, to establish a unified standard for describing sequence data and to provide a single point of entry for the scientific community to access and learn about GSC checklists, we present the minimum information about any (x) sequence (MIxS). Adoption of MIxS will enhance our ability to analyze natural genetic diversity documented by massive DNA sequencing efforts from myriad ecosystems in our ever-changing biosphere.

    15. The transition from the open minimum to the ring minimum on the ground state and on the lowest excited state of like symmetry in ozone: A configuration interaction study

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Theis, Daniel; Ivanic, Joseph; Windus, Theresa L.; Ruedenberg, Klaus

      2016-03-10

      The metastable ring structure of the ozone 11A1 ground state, which theoretical calculations have shown to exist, has so far eluded experimental detection. An accurate prediction for the energy difference between this isomer and the lower open structure is therefore of interest, as is a prediction for the isomerization barrier between them, which results from interactions between the lowest two 1A1 states. In the present work, valence correlated energies of the 11A1 state and the 21A1 state were calculated at the 11A1 open minimum, the 11A1 ring minimum, the transition state between these two minima, the minimum of the 21A1 state, and the conical intersection between the two states. The geometries were determined at the full-valence multi-configuration self-consistent-field level. Configuration interaction (CI) expansions up to quadruple excitations were calculated with triple-zeta atomic basis sets. The CI expansions based on eight different reference configuration spaces were explored. To obtain some of the quadruple excitation energies, the method of Correlation Energy Extrapolation by Intrinsic Scaling was generalized to the simultaneous extrapolation for two states. This extrapolation method was shown to be very accurate. On the other hand, none of the CI expansions were found to have converged to millihartree (mh) accuracy at the quadruple excitation level. The data suggest that convergence to mh accuracy is probably attained at the sextuple excitation level. On the 11A1 state, the present calculations yield the estimates of (ring minimum − open minimum) ~45–50 mh and (transition state − open minimum) ~85–90 mh. For the (21A1 − 11A1) excitation energy, the estimate of ~130–170 mh is found at the open minimum and 270–310 mh at the ring minimum. At the transition state, the difference (21A1 − 11A1) is found to be between 1 and 10 mh. The geometry of the transition state on the 11A1 surface and that of the minimum on the 21A1 surface nearly coincide.

    16. HUD (Housing and Urban Development) Intermediate Minimum Property Standards Supplement 4930. 2 (1989 edition). Solar heating and domestic hot water systems

      SciTech Connect (OSTI)

      Not Available

      1989-12-01

      The Minimum Property Standards for Housing 4910.1 were developed to provide a sound technical basis for housing under numerous programs of the Department of Housing and Urban Development (HUD). These Intermediate Minimum Property Standards for Solar Heating and Domestic Hot Water Systems are intended to provide a companion technical basis for the planning and design of solar heating and domestic hot water systems. These standards have been prepared as a supplement to the Minimum Property Standards (MPS) and deal only with aspects of planning and design that are different from conventional housing by reason of the solar systems under consideration. The document contains requirements and standards applicable to one- and two-family dwellings, multifamily housing, and nursing homes and intermediate care facilities. References made in the text to the MPS refer to the same section in the Minimum Property Standards for Housing 4910.1.

    17. Optimization of Operating Parameters for Minimum Mechanical Specific...

      Office of Scientific and Technical Information (OSTI)

      in maximum Rate of Penetration. Current methods for computing MSE make it possible to ... Mathematical relationships between the parameters were established, and the conventional ...

    18. Compute Nodes

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Nodes. Quad-core AMD Opteron processor. Compute Node Configuration: 9,572 nodes; 1 quad-core AMD 'Budapest' 2.3 GHz processor per node; 4 cores per node (38,288 total cores); 8 GB...

    19. Exascale Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computing Exascale Computing CoDEx Project: A Hardware/Software Codesign Environment for the Exascale Era The next decade will see a rapid evolution of HPC node architectures as power and cooling constraints are limiting increases in microprocessor clock speeds and constraining data movement. Applications and algorithms will need to change and adapt as node architectures evolve. A key element of the strategy as we move forward is the co-design of applications, architectures and programming

    20. Minimum length, extra dimensions, modified gravity and black hole remnants

      SciTech Connect (OSTI)

      Maziashvili, Michael

      2013-03-01

      We construct a Hilbert space representation of the minimum-length deformed uncertainty relation in the presence of extra dimensions. Following this construction, we study corrections to the gravitational potential (back reaction on gravity) with the use of the correspondingly modified propagator in the presence of two (spatial) extra dimensions. Interestingly enough, for r → 0 the gravitational force approaches zero, and the horizon for the modified Schwarzschild-Tangherlini space-time disappears when the mass approaches the quantum-gravity energy scale. This result points to the existence of zero-temperature black hole remnants in the ADD brane-world model.

    1. Development of the Minimum Information Specification for in situ Hybridization and Immunohistochemistry Experiments (MISFISHIE)

      SciTech Connect (OSTI)

      Deutsch, Eric W.; Ball, Catherine A.; Bova, G. Steven; Brazma, Alvis; Bumgarner, Roger E.; Campbell, David; Causton, Helen C.; Christiansen, Jeff; Davidson, Duncan; Eichner, Lillian J.; Goo, Young Ah; Grimmond, Sean; Henrich, Thorsten; Johnson, Michael H.; Korb, Martin; Mills, Jason C.; Oudes, Asa; Parkinson, Helen E.; Pascal, Laura E.; Quackenbush, John; Ramialison, Mirana; Ringwald, Martin; Sansone, Susanna A.; Sherlock, Gavin; Stoeckert, Christian Jr. J.; Swedlow, Jason; Taylor, Ronald C.; Walashek, Laura; Zhou, Yi; Liu, Alvin Y.; True, Lawrence D.

      2006-06-06

      Background: One purpose of the biomedical literature is to report results in sufficient detail so that the methods of data collection and analysis can be independently replicated and verified. In order to ensure that this level of detail is provided in published works, a minimum information specification is needed for each experimental data type and for this specification to be a requirement for publication in peer-reviewed journals. This is especially beneficial for researchers working with complex data types and experiments. A data content specification has already been widely accepted by, and directly benefited, the microarray community, and efforts are well underway to develop a comparable specification for proteomics data types. However, no similar specification exists for visual interpretation-based tissue protein and transcript abundance/localization experiments (hereafter referred to as ‘gene expression localization experiments’), such as in situ hybridization and experiments involving immunohistochemistry. Results: Here we present for consideration a specification, called the “Minimum Information Specification For In Situ Hybridization and Immunohistochemistry Experiments (MISFISHIE)”. It is modelled after the MIAME (Minimum Information About a Microarray Experiment) specification for microarray experiments. Data specifications like MIAME and MISFISHIE specify the information content without specifying a format for encoding that information. The MISFISHIE specification describes six types of information that should be provided for each gene expression localization experiment: Experimental Design, Biomaterials and Treatments, Reporters, Staining, Imaging Data, and Image Characterizations. A general checklist is provided for quick and easy reference and to promote adherence to the specification. We consider that most articles describing gene expression localization studies do not fully provide the minimum information needed for independent verification

    2. Lattice QCD and NERSC requirements

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      QCD and NERSC requirements. Rich Brower, Steven Gottlieb and Doug Toussaint, November 26, 2012. Lattice Gauge Theory at NERSC: First-principles computations in QCD; computations in other strongly coupled field theories; find hadronic factors to get fundamental physics from experiments; understand structure and interactions of hadrons; understand QCD: confinement and chiral symmetry breaking; other strongly

    3. Computational mechanics

      SciTech Connect (OSTI)

      Goudreau, G.L.

      1993-03-01

      The Computational Mechanics thrust area sponsors research into the underlying solid, structural and fluid mechanics and heat transfer necessary for the development of state-of-the-art general purpose computational software. The scale of computational capability spans office workstations, departmental computer servers, and Cray-class supercomputers. The DYNA, NIKE, and TOPAZ codes have achieved world fame through our broad collaborators program, in addition to their strong support of on-going Lawrence Livermore National Laboratory (LLNL) programs. Several technology transfer initiatives have been based on these established codes, teaming LLNL analysts and researchers with counterparts in industry, extending code capability to specific industrial interests of casting, metalforming, and automobile crash dynamics. The next-generation solid/structural mechanics code, ParaDyn, is targeted toward massively parallel computers, which will extend performance from gigaflop to teraflop power. Our work for FY-92 is described in the following eight articles: (1) Solution Strategies: New Approaches for Strongly Nonlinear Quasistatic Problems Using DYNA3D; (2) Enhanced Enforcement of Mechanical Contact: The Method of Augmented Lagrangians; (3) ParaDyn: New Generation Solid/Structural Mechanics Codes for Massively Parallel Processors; (4) Composite Damage Modeling; (5) HYDRA: A Parallel/Vector Flow Solver for Three-Dimensional, Transient, Incompressible Viscous Flow; (6) Development and Testing of the TRIM3D Radiation Heat Transfer Code; (7) A Methodology for Calculating the Seismic Response of Critical Structures; and (8) Reinforced Concrete Damage Modeling.

    4. Computational mechanics

      SciTech Connect (OSTI)

      Raboin, P J

      1998-01-01

      The Computational Mechanics thrust area is a vital and growing facet of the Mechanical Engineering Department at Lawrence Livermore National Laboratory (LLNL). This work supports the development of computational analysis tools in the areas of structural mechanics and heat transfer. Over 75 analysts depend on thrust area-supported software running on a variety of computing platforms to meet the demands of LLNL programs. Interactions with the Department of Defense (DOD) High Performance Computing and Modernization Program and the Defense Special Weapons Agency are of special importance as they support our ParaDyn project in its development of new parallel capabilities for DYNA3D. Working with DOD customers has been invaluable to driving this technology in directions mutually beneficial to the Department of Energy. Other projects associated with the Computational Mechanics thrust area include work with the Partnership for a New Generation Vehicle (PNGV) for ''Springback Predictability'' and with the Federal Aviation Administration (FAA) for the ''Development of Methodologies for Evaluating Containment and Mitigation of Uncontained Engine Debris.'' In this report for FY-97, there are five articles detailing three code development activities and two projects that synthesized new code capabilities with new analytic research in damage/failure and biomechanics. The articles this year are: (1) Energy- and Momentum-Conserving Rigid-Body Contact for NIKE3D and DYNA3D; (2) Computational Modeling of Prosthetics: A New Approach to Implant Design; (3) Characterization of Laser-Induced Mechanical Failure Damage of Optical Components; (4) Parallel Algorithm Research for Solid Mechanics Applications Using Finite Element Analysis; and (5) An Accurate One-Step Elasto-Plasticity Algorithm for Shell Elements in DYNA3D.

    5. 02-HellandNERSC-Requirements.pptx

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      5, 2014. Barbara Helland, Advanced Scientific Computing Research. NERSC-ASCR Requirements Review, 1/15/2014. World Class Facilities: * High Performance Production Computing for the Office of Science * Characterized by a large number of projects (over 400) and users (over 4800) * Leadership Computing for Open Science * Characterized by a small number of projects (about 50) and users (about 800) with computationally intensive projects * Linking it

    6. Computing Resources

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Resources. This page is the repository for sundry items of information relevant to general computing on BooNE. If you have a question or problem that isn't answered here, or a suggestion for improving this page or the information on it, please mail boone-computing@fnal.gov and we'll do our best to address any issues. Note about this page: Some links on this page point to www.everything2.com, and are meant to give an idea about a concept or thing without necessarily wading through a whole website

    7. Computational trigonometry

      SciTech Connect (OSTI)

      Gustafson, K.

      1994-12-31

      By means of the author's earlier theory of antieigenvalues and antieigenvectors, a new computational approach to iterative methods is presented. This enables an explicit trigonometric understanding of iterative convergence and provides new insights into the sharpness of error bounds. Direct applications to Gradient descent, Conjugate gradient, GCR(k), Orthomin, CGN, GMRES, CGS, and other matrix iterative schemes will be given.
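As a hedged illustration of the trigonometry involved: for a symmetric positive definite matrix, the first antieigenvalue is cos φ(A) = 2√(λmin·λmax)/(λmin + λmax), and sin φ(A) = (λmax − λmin)/(λmax + λmin) is the classical per-step error-reduction bound for steepest descent. The eigenvalues below are made up for the example:

```python
import math

# Extreme eigenvalues of a hypothetical SPD matrix A (illustrative values only).
l_min, l_max = 1.0, 9.0

cos_phi = 2 * math.sqrt(l_min * l_max) / (l_min + l_max)  # first antieigenvalue
sin_phi = (l_max - l_min) / (l_max + l_min)               # steepest-descent bound

print(cos_phi, sin_phi)                              # 0.6 0.8
print(math.isclose(cos_phi**2 + sin_phi**2, 1.0))    # True: a genuine angle
```

The identity cos² φ + sin² φ = 1 is what makes the "trigonometric" reading of iterative convergence literal rather than metaphorical.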

    8. Institutional computing (IC) information session

      SciTech Connect (OSTI)

      Koch, Kenneth R; Lally, Bryan R

      2011-01-19

      The LANL Institutional Computing Program (IC) will host an information session about the current state of unclassified Institutional Computing at Los Alamos, exciting plans for the future, and the current call for proposals for science and engineering projects requiring computing. Program representatives will give short presentations and field questions about the call for proposals and future planned machines, and discuss technical support available to existing and future projects. Los Alamos has started making a serious institutional investment in open computing available to our science projects, and that investment is expected to increase even more.

    9. DOE Requires Manufacturers to Halt Sales of Heat Pumps and Air Conditioners

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      DOE Requires Manufacturers to Halt Sales of Heat Pumps and Air Conditioners Violating Minimum Appliance Standards. June 3, 2010 - 2:17pm. Today, the Department of Energy announced that three manufacturers -- Aspen Manufacturing, Inc., Summit Manufacturing, and Advanced Distributor Products -- must stop distributing 61 heat

    10. Advanced Scientific Computing Research

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Advanced Scientific Computing Research. Discovering, ... The DOE Office of Science's Advanced Scientific Computing Research (ASCR) program ...

    11. Theory, Simulation, and Computation

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computer, Computational, and Statistical Sciences (CCS) Division is an international ... and statistics The deployment and integration of computational technology, ...

    12. Computed Tomography Status

      DOE R&D Accomplishments [OSTI]

      Hansche, B. D.

      1983-01-01

      Computed tomography (CT) is a relatively new radiographic technique which has become widely used in the medical field, where it is better known as computerized axial tomographic (CAT) scanning. This technique is also being adopted by the industrial radiographic community, although the greater range of densities, variation in sample sizes, and the possible requirement for finer resolution make it difficult to duplicate the excellent results that the medical scanners have achieved.

    13. Data Crosscutting Requirements Review

      SciTech Connect (OSTI)

      Kleese van Dam, Kerstin; Shoshani, Arie; Plata, Charity

      2013-04-01

      In April 2013, a diverse group of researchers from the U.S. Department of Energy (DOE) scientific community assembled to assess data requirements associated with DOE-sponsored scientific facilities and large-scale experiments. Participants in the review included facilities staff, program managers, and scientific experts from the offices of Basic Energy Sciences, Biological and Environmental Research, High Energy Physics, and Advanced Scientific Computing Research. As part of the meeting, review participants discussed key issues associated with three distinct aspects of the data challenge: 1) processing, 2) management, and 3) analysis. These discussions identified commonalities and differences among the needs of varied scientific communities. They also helped to articulate gaps between current approaches and future needs, as well as the research advances that will be required to close these gaps. Moreover, the review provided a rare opportunity for experts from across the Office of Science to learn about their collective expertise, challenges, and opportunities. The "Data Crosscutting Requirements Review" generated specific findings and recommendations for addressing large-scale data crosscutting requirements.

    14. Computing at JLab

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      JLab --- Accelerator Controls CAD CDEV CODA Computer Center High Performance Computing Scientific Computing JLab Computer Silo maintained by webmaster@jlab.org...

    15. Computational Combustion

      SciTech Connect (OSTI)

      Westbrook, C K; Mizobuchi, Y; Poinsot, T J; Smith, P J; Warnatz, J

      2004-08-26

      Progress in the field of computational combustion over the past 50 years is reviewed. Particular attention is given to those classes of models that are common to most system modeling efforts, including fluid dynamics, chemical kinetics, liquid sprays, and turbulent flame models. The developments in combustion modeling are placed into the time-dependent context of the accompanying exponential growth in computer capabilities and Moore's Law. Superimposed on this steady growth, the occasional sudden advances in modeling capabilities are identified and their impacts are discussed. Integration of submodels into system models for spark ignition, diesel, and homogeneous charge compression ignition engines, surface and catalytic combustion, pulse combustion, and detonations are described. Finally, the current state of combustion modeling is illustrated by descriptions of a very large jet lifted 3D turbulent hydrogen flame with direct numerical simulation and 3D large eddy simulations of practical gas burner combustion devices.

    16. RATIO COMPUTER

      DOE Patents [OSTI]

      Post, R.F.

      1958-11-11

      An electronic computer circuit is described for producing an output voltage proportional to the product or quotient of the voltages of a pair of input signals. In essence, the disclosed invention provides a computer having two channels adapted to receive separate input signals and each having amplifiers with like fixed amplification factors and like negative feedback amplifiers. One of the channels receives a constant signal for comparison purposes, whereby a difference signal is produced to control the amplification factors of the variable feedback amplifiers. The output of the other channel is thereby proportional to the product or quotient of input signals depending upon the relation of input to fixed signals in the first mentioned channel.
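The servo principle the patent describes can be sketched numerically: a difference signal nulls the comparison channel by adjusting a shared gain, and that same gain applied to the second channel yields the quotient. Everything below (the gain law, the rate, the values) is invented for illustration; the patent's circuit is analog, not iterative:

```python
def ratio_computer(v_a, v_b, v_const=1.0, rate=0.1, steps=2000):
    """Feedback loop: drive g*v_a toward v_const, then output g*v_b = v_b*v_const/v_a."""
    g = 1.0                          # shared variable-feedback gain
    for _ in range(steps):
        error = v_const - g * v_a    # difference signal from the comparison channel
        g += rate * error            # negative feedback nulls the error
    return g * v_b                   # second-channel output is the scaled quotient

print(ratio_computer(2.0, 6.0))      # ~3.0, i.e. 6/2
```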

    17. Computing Events

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computing Events. Spotlighting the most advanced scientific and technical applications in the world! Featuring exhibits of the latest and greatest technologies from industry, academia and government research organizations; many of these technologies will be seen for the first time in Denver. Supercomputing Conference 13 (SC13), Denver, Colorado, November 17-22, 2013. Spotlighting the most advanced scientific and technical applications in the world, SC13 will bring together the international

    18. Sandia National Laboratories: Research: Research Foundations: Computing and

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Information Science. Research Foundations: Bioscience, Computing and Information Science, Engineering Science, Geoscience, Materials Science, Nanodevices and Microsystems, Radiation Effects and High Energy Density Science. Research: Computing and Information Science. Our approach: Vertically integrated, scalable supercomputing. Goal: Increase capability while reducing the space and power requirements of future computing systems by changing the nature of computing devices, computer

    19. Computing and Computational Sciences Directorate - Computer Science and

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computer Science and Mathematics Division. The Computer Science and Mathematics Division (CSMD) is ORNL's premier source of basic and applied research in high-performance computing, applied mathematics, and intelligent systems. Our mission includes basic research in computational sciences and application of advanced computing systems, computational, mathematical and analysis techniques to the solution of scientific problems of national importance. We seek to work

    20. Analysis of Minimum Efficiency Performance Standards for Residential General Service Lighting in Chile

      SciTech Connect (OSTI)

      Letschert, Virginie E.; McNeil, Michael A.; Leiva Ibanez, Francisco Humberto; Ruiz, Ana Maria; Pavon, Mariana; Hall, Stephen

      2011-06-01

      Minimum Efficiency Performance Standards (MEPS) have been chosen as part of Chile's national energy efficiency action plan. As a first MEPS, the Ministry of Energy has decided to focus on a regulation for lighting that would ban the sale of inefficient bulbs, effectively phasing out the use of incandescent lamps. Following major economies such as the US (EISA, 2007), the EU (Ecodesign, 2009) and Australia (AS/NZS, 2008), which planned phase-outs based on minimum efficacy requirements, the Ministry of Energy has undertaken the impact analysis of a MEPS on the residential lighting sector. Fundacion Chile (FC) and Lawrence Berkeley National Laboratory (LBNL) collaborated with the Ministry of Energy and the National Energy Efficiency Program (Programa Pais de Eficiencia Energetica, or PPEE) in order to produce a techno-economic analysis of this future policy measure. LBNL has developed for CLASP (CLASP, 2007) a spreadsheet tool called the Policy Analysis Modeling System (PAMS) that allows for evaluation of costs and benefits at the consumer level as well as a wide range of impacts at the national level, such as energy savings, net present value of savings, greenhouse gas (CO2) emission reductions and avoided capacity generation due to a specific policy. Because historically Chile has followed European schemes in energy efficiency programs (test procedures, labelling program definitions), we take the Ecodesign commission regulation No 244/2009 as a starting point when defining our phase-out program, i.e., a tiered phase-out based on minimum efficacy per lumen category. The following data were collected in order to perform the techno-economic analysis: (1) Retail prices, efficiency and wattage category in the current market, (2) Usage data (hours of lamp use per day), and (3) Stock data, penetration of efficient lamps in the market.
Using these data, PAMS calculates the costs and benefits of efficiency standards from two distinct but related perspectives: (1) The Life
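The consumer-level side of the PAMS calculation can be sketched as a simple life-cycle-cost comparison. The sketch below is illustrative only: every price, wattage, tariff, and discount rate in it is a hypothetical placeholder, not a value from the Chilean study.

```python
# Life-cycle cost (LCC) comparison of an inefficient vs. an efficient lamp,
# in the spirit of the PAMS consumer-level analysis. All numbers are
# hypothetical placeholders, not data from the Chilean market study.

def life_cycle_cost(price, watts, hours_per_day, tariff, discount_rate, years):
    """Purchase price plus discounted electricity cost over the lamp's life."""
    annual_kwh = watts * hours_per_day * 365 / 1000.0
    annual_cost = annual_kwh * tariff  # tariff in $/kWh
    operating = sum(annual_cost / (1 + discount_rate) ** t
                    for t in range(1, years + 1))
    return price + operating

incandescent = life_cycle_cost(price=0.5, watts=60, hours_per_day=3,
                               tariff=0.15, discount_rate=0.05, years=5)
cfl = life_cycle_cost(price=3.0, watts=14, hours_per_day=3,
                      tariff=0.15, discount_rate=0.05, years=5)
print(f"LCC incandescent: ${incandescent:.2f}, LCC CFL: ${cfl:.2f}")
print(f"Net consumer saving: ${incandescent - cfl:.2f}")
```

National-level totals (energy savings, NPV, emissions) then follow by multiplying per-unit results like these over stock and sales forecasts.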

    1. VSD and minimum pump speed: How to calculate it and why

      SciTech Connect (OSTI)

      Vaillencourt, R.R.

      1995-06-01

      The energy-savings potential of the Affinity Laws is almost unbelievable, and in the simplest situations the predicted savings are fully realized. The true test is to be able to recognize when things are not simple and the evaluation needs to be modified. Remember: it is the evaluation that needs to be modified; the laws are the laws. Rest assured, the modifications all follow the Affinity Laws and the calculations are simple. But the savings potential will be less, and in some cases too small to consider, when certain easily recognized situations exist. The reduction in expected savings comes from the misapplication of the Affinity Laws, or more correctly, the application of the wrong Affinity Law, when evaluating the real world of pumps in action. If the application is to reduce flow in a closed-loop circulation system without any devices that require a significant inlet pressure at all times to perform properly, then using the Affinity Laws in their simplest form, i.e., the flow vs. rpm relationship, is the correct way to go. If, however, there is a minimum pressure required on the system at all times for the end-use devices to work properly, or a static head that must be overcome, then the evaluation must first focus on the second Affinity Law.
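The evaluation described above can be sketched as follows (all figures hypothetical): the Affinity Laws give flow ∝ N, head ∝ N², and power ∝ N³, so the second law (head) sets a floor on pump speed whenever a static or minimum head must be maintained, and the cube-law power estimate must be evaluated at that floor rather than at the desired flow reduction.

```python
# Affinity-law savings estimate for a VSD-driven pump. The cube law
# (power ~ speed^3) applies only to pure friction systems; with a static
# or minimum required head, the usable speed reduction is limited.
# Illustrative sketch; all numbers are hypothetical.

def cube_law_power(p_full, speed_ratio):
    """Affinity-law power at reduced speed (friction-only system)."""
    return p_full * speed_ratio ** 3

def minimum_speed_ratio(static_head, design_head):
    """Lowest speed ratio that still develops the required static head.
    From the head affinity law H ~ N^2:  N_min/N_full = sqrt(H_static/H_design)."""
    return (static_head / design_head) ** 0.5

p_full = 50.0   # kW at full speed (hypothetical)
ratio = 0.6     # desired speed reduction to 60% for the flow target
n_min = minimum_speed_ratio(static_head=30.0, design_head=60.0)  # ~0.707

effective = max(ratio, n_min)   # cannot slow below the static-head limit
print(f"Minimum speed ratio: {n_min:.3f}")
print(f"Power at effective speed: {cube_law_power(p_full, effective):.1f} kW")
```

Note that the naive cube-law estimate at 60% speed would predict far lower power than the static-head-limited value, which is exactly the overstatement the abstract warns about.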

    2. Visitor Hanford Computer Access Request - Hanford Site

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Visitor Hanford Computer Access Request. The U.S. Department of Energy (DOE), Richland Operations Office (RL), in compliance with the 'Tri-Party Agreement Databases, Access Mechanism and Procedures' document, DOE/RL-93-69, Revision 5, sets forth the requirements for access to the Hanford Site

    3. Cyber Security Process Requirements Manual

      Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

      2008-08-12

      The Manual establishes the minimum implementation standards for cyber security management processes throughout the Department. No cancellation.

    4. Exploratory Experimentation and Computation

      SciTech Connect (OSTI)

      Bailey, David H.; Borwein, Jonathan M.

      2010-02-25

      We believe the mathematical research community is facing a great challenge to re-evaluate the role of proof in light of recent developments. On one hand, the growing power of current computer systems, of modern mathematical computing packages, and of the growing capacity to data-mine on the Internet, has provided marvelous resources to the research mathematician. On the other hand, the enormous complexity of many modern capstone results such as the Poincare conjecture, Fermat's last theorem, and the classification of finite simple groups has raised questions as to how we can better ensure the integrity of modern mathematics. Yet as the need and prospects for inductive mathematics blossom, the requirement to ensure the role of proof is properly founded remains undiminished.

    5. NERSC HPC Program Requirements Review Reports

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      NERSC-PRR-HEP-2017.pdf | Adobe Acrobat PDF file Large Scale Computing and Storage Requirements for High Energy Physics - Target 2017 NERSC-NP2017FINAL.pdf | Adobe Acrobat PDF file ...

    6. Constructing the ASCI computational grid

      SciTech Connect (OSTI)

      BEIRIGER,JUDY I.; BIVENS,HUGH P.; HUMPHREYS,STEVEN L.; JOHNSON,WILBUR R.; RHEA,RONALD E.

      2000-06-01

      The Accelerated Strategic Computing Initiative (ASCI) computational grid is being constructed to interconnect the high performance computing resources of the nuclear weapons complex. The grid will simplify access to the diverse computing, storage, network, and visualization resources, and will enable the coordinated use of shared resources regardless of location. To match existing hardware platforms, required security services, and current simulation practices, the Globus MetaComputing Toolkit was selected to provide core grid services. The ASCI grid extends Globus functionality by operating as an independent grid, incorporating Kerberos-based security, interfacing to Sandia's Cplant (TM), and extending job monitoring services. To fully meet ASCI's needs, the architecture layers distributed work management and criteria-driven resource selection services on top of Globus. These services simplify the grid interface by allowing users to simply request "run code X anywhere". This paper describes the initial design and prototype of the ASCI grid.

    7. Computing Resources | Argonne Leadership Computing Facility

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Argonne's Theory and Computing Sciences (TCS) building houses a wide variety of computing systems, including some of the most powerful supercomputers in the world. The facility has 25,000 square feet of raised computer floor space and a pair of redundant 20 megavolt-ampere electrical feeds from a 90 megawatt substation. The building also

    8. IDAPA 37.03.03 - Rules and Minimum Standards for the Construction...

      Open Energy Info (EERE)

      3 - Rules and Minimum Standards for the Construction and Use of Injection Wells Jump to: navigation, search OpenEI Reference LibraryAdd to library Legal Document-...

    9. DOE Requires Manufacturers to Halt Sales of Heat Pumps and Air...

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      Media contact(s): (202) 586-4940 Addthis Related Articles DOE Requires Manufacturers to Halt Sales of Heat Pumps and Air Conditioners Violating Minimum Appliance Standards DOE Orders ...

    10. High Performance Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      HPC INL Logo Home High-Performance Computing INL's high-performance computing center provides general use scientific computing capabilities to support the lab's efforts in advanced...

    11. Multiprocessor computing for images

      SciTech Connect (OSTI)

      Cantoni, V.; Levialdi, S.

      1988-08-01

      A review of image processing systems developed to date is given, highlighting the weak points of such systems and the trends that have dictated their evolution through the years, producing different generations of machines. Each generation may be characterized by its hardware architecture, its programmability features, and its relative application areas. The need for multiprocessing hierarchical systems is discussed, focusing on pyramidal architectures: their computational paradigms, their virtual and physical implementations, and their programming and software requirements and capabilities by means of suitable languages.

    12. ASCR Workshop on Quantum Computing for Science

      SciTech Connect (OSTI)

      Aspuru-Guzik, Alan; Van Dam, Wim; Farhi, Edward; Gaitan, Frank; Humble, Travis; Jordan, Stephen; Landahl, Andrew J; Love, Peter; Lucas, Robert; Preskill, John; Muller, Richard P.; Svore, Krysta; Wiebe, Nathan; Williams, Carl

      2015-06-01

      This report details the findings of the DOE ASCR Workshop on Quantum Computing for Science that was organized to assess the viability of quantum computing technologies to meet the computational requirements of the DOE’s science and energy mission, and to identify the potential impact of quantum technologies. The workshop was held on February 17-18, 2015, in Bethesda, MD, to solicit input from members of the quantum computing community. The workshop considered models of quantum computation and programming environments, physical science applications relevant to DOE's science mission as well as quantum simulation, and applied mathematics topics including potential quantum algorithms for linear algebra, graph theory, and machine learning. This report summarizes these perspectives into an outlook on the opportunities for quantum computing to impact problems relevant to the DOE’s mission as well as the additional research required to bring quantum computing to the point where it can have such impact.

    13. TRIDAC host computer functional specification

      SciTech Connect (OSTI)

      Hilbert, S.M.; Hunter, S.L.

      1983-08-23

      The purpose of this document is to outline the baseline functional requirements for the Triton Data Acquisition and Control (TRIDAC) Host Computer Subsystem. The requirements presented in this document are based upon systems that currently support both the SIS and the Uranium Separator Technology Groups in the AVLIS Program at the Lawrence Livermore National Laboratory and upon the specific demands associated with the extended safe operation of the SIS Triton Facility.

    14. BES Requirements Review 2014

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


    15. FES Requirements Review 2014

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


    16. ITP Steel: Theoretical Minimum Energies to Produce Steel for Selected Conditions, March 2000

      Broader source: Energy.gov [DOE]

      The absolute theoretical minimum energies to produce liquid steel from idealized scrap (100% Fe) and ore (100% Fe2O3) are much lower than consumed in practice, as are the theoretical minimum energies to roll the steel into its final shape.

    17. BER Requirements Review 2015

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


    18. Network Requirements Reviews

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


    19. Requirements for Wind Development

      Office of Energy Efficiency and Renewable Energy (EERE)

      In 2015 Oklahoma amended the Oklahoma Wind Energy Development Act. The amendments added new financial security requirements, setback requirements, and notification requirements for wind energy...

    20. Meeting PMU Data Quality Requirements for Mission Critical Application...

      Energy Savers [EERE]

      Meeting PMU Data Quality Requirements for Mission Critical Applications Anurag Srivastava School of Electrical Engineering and Computer Science Washington State University ...

    1. Large Scale Computing and Storage Requirements for Basic Energy...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Sciences: A BES/ASCR/NERSC Workshop, February 9-10, 2010... Read More. Workshop Logistics: workshop location, directions, and registration information are included here...

    2. Present and Future Computing Requirements Sergey Syritsyn RIKEN...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      at JLab, Mainz; OLYMPUS, MUSE (planned) * Quark Density Distributions in the Proton ... differs by 7 only 30% of the proton spin can be explained by quark spins 2012 1 value. ...

    3. Large Scale Computing Requirements for Basic Energy Sciences...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      ... Acoustic Waves: (1/v²(x,y,z)) ∂²p(x,y,z,t)/∂t² = ∇²p + s(t) ... Starting Models - Test Different Noise Assumptions * Scale Problem Up to Ever ...

    4. Large Scale Computing and Storage Requirements for High Energy...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      for High Energy Physics Accelerator Physics P. Spentzouris, Fermilab Motivation ... Project-X http:www.er.doe.govhepHEPAPreportsP5Report%2006022008.pdf ComPASS The SciDAC2 ...

    5. Large Scale Production Computing and Storage Requirements for...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      11-12, 2012 Hilton Rockville Hotel and Executive Meeting Center 1750 Rockville Pike Rockville, MD, 20852-1699 TEL: 1-301-468-1100 Sponsored by: U.S. Department of Energy...

    6. Review of PREPA Technical Requirements for Interconnecting Wind and Solar Generation

      SciTech Connect (OSTI)

      Gevorgian, Vahan; Booth, Sarah

      2013-11-01

      The Puerto Rico Electric Power Authority developed the minimum technical requirements for interconnection of wind turbine generation and photovoltaic power plants. NREL has conducted a review of these requirements based on generic technical aspects and electrical characteristics of wind and photovoltaic power plants, and on existing requirements from other utilities (both U.S. and European).

    7. Glassy slags for minimum additive waste stabilization. Interim progress report, May 1993--February 1994

      SciTech Connect (OSTI)

      Feng, X.; Wronkiewicz, D.J.; Bates, J.K.; Brown, N.R.; Buck, E.C.; Dietz, N.L.; Gong, M.; Emery, J.W.

      1994-05-01

      Glassy slag waste forms are being developed to complement glass waste forms in implementing Minimum Additive Waste Stabilization (MAWS) for supporting DOE's environmental restoration efforts. The glassy slag waste form is composed of various crystalline and metal oxide phases embedded in a silicate glass phase. The MAWS approach was adopted by blending multiple waste streams to achieve up to 100% waste loadings. The crystalline phases, such as spinels, are very durable and contain hazardous and radioactive elements in their lattice structures. These crystalline phases may account for up to 80% of the total volume of slags having over 80% metal loading. The structural bond strength model was used to quantify the correlation between glassy slag composition and chemical durability so that optimized slag compositions were obtained with limited crucible melting and testing. Slag compositions developed through crucible melts were also successfully generated in a pilot-scale Retech plasma centrifugal furnace at Ukiah, California. Utilization of glassy slag waste forms allows the MAWS approach to be applied to a much wider range of waste streams than glass waste forms. The initial work at ANL has indicated that glassy slags are good final waste forms because of (1) their high chemical durability; (2) their ability to incorporate large amounts of metal oxides; (3) their ability to incorporate waste streams having low contents of flux components; (4) their less stringent requirements on processing parameters, compared to glass waste forms; and (5) their low requirements for purchased additives, which means greater waste volume reduction and treatment cost savings.

    8. Determining collective barrier operation skew in a parallel computer

      DOE Patents [OSTI]

      Faraj, Daniel A.

      2015-12-24

      Determining collective barrier operation skew in a parallel computer that includes a number of compute nodes organized into an operational group includes: for each of the nodes until each node has been selected as a delayed node: selecting one of the nodes as a delayed node; entering, by each node other than the delayed node, a collective barrier operation; entering, after a delay by the delayed node, the collective barrier operation; receiving an exit signal from a root of the collective barrier operation; and measuring, for the delayed node, a barrier completion time. The barrier operation skew is calculated by: identifying, from the compute nodes' barrier completion times, a maximum barrier completion time and a minimum barrier completion time and calculating the barrier operation skew as the difference of the maximum and the minimum barrier completion time.
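The skew calculation the abstract describes reduces the per-node barrier completion times, one measurement per node's turn as the delayed node, to a single max-minus-min figure. A minimal sketch (node names and timings are hypothetical):

```python
# Sketch of the skew calculation described above: after each compute node
# has taken a turn as the "delayed node," the per-node barrier completion
# times are reduced to skew = max - min. Times below are hypothetical.

def barrier_skew(completion_times):
    """Barrier operation skew: spread between the slowest and fastest
    barrier completion among the delayed-node measurements."""
    times = list(completion_times)
    return max(times) - min(times)

# One measured completion time (seconds) per node-as-delayed-node trial:
times = {"node0": 1.42e-3, "node1": 1.39e-3, "node2": 1.51e-3, "node3": 1.40e-3}
print(f"skew = {barrier_skew(times.values()):.2e} s")
```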

    9. Determining collective barrier operation skew in a parallel computer

      SciTech Connect (OSTI)

      Faraj, Daniel A.

      2015-11-24

      Determining collective barrier operation skew in a parallel computer that includes a number of compute nodes organized into an operational group includes: for each of the nodes until each node has been selected as a delayed node: selecting one of the nodes as a delayed node; entering, by each node other than the delayed node, a collective barrier operation; entering, after a delay by the delayed node, the collective barrier operation; receiving an exit signal from a root of the collective barrier operation; and measuring, for the delayed node, a barrier completion time. The barrier operation skew is calculated by: identifying, from the compute nodes' barrier completion times, a maximum barrier completion time and a minimum barrier completion time and calculating the barrier operation skew as the difference of the maximum and the minimum barrier completion time.

    10. Feed tank transfer requirements

      SciTech Connect (OSTI)

      Freeman-Pollard, J.R.

      1998-09-16

      This document presents a definition of tank turnover; DOE responsibilities; TWRS DST permitting requirements; TWRS Authorization Basis (AB) requirements; TWRS AP Tank Farm operational requirements; unreviewed safety question (USQ) requirements; records and reporting requirements, and documentation which will require revision in support of transferring a DST in AP Tank Farm to a privatization contractor for use during Phase 1B.

    11. Revenue-requirement approach to analysis of financing alternatives

      SciTech Connect (OSTI)

      Ewers, B.J.; Wheaton, K.E.

      1984-07-19

      The minimum revenue requirement discipline (MRRD) is accepted throughout the utility industry as a tool to be used for economic decisions and rate making. At least one utility company has also used MRRD in the analysis of financing alternatives. This article was written to show the versatility of the revenue requirement discipline. It demonstrates that this methodology is appropriate not only for evaluating traditional capital budgeting decisions, but also for identifying the most economic financing alternatives. 5 references, 4 figures, 4 tables.
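A minimal sketch of the MRRD comparison, assuming the standard present-value treatment of annual revenue requirements (the cash flows and discount rate below are hypothetical, not from the article): the preferred alternative is the one with the lowest present value of its revenue-requirement stream.

```python
# Minimal sketch of a minimum-revenue-requirement comparison of two
# financing alternatives. All cash flows are hypothetical placeholders.

def pv_revenue_requirements(annual_reqs, discount_rate):
    """Present value of a stream of annual revenue requirements."""
    return sum(r / (1 + discount_rate) ** t
               for t, r in enumerate(annual_reqs, start=1))

debt_financing  = [120.0, 118.0, 116.0, 114.0, 112.0]  # $M/yr (hypothetical)
lease_financing = [115.0, 115.0, 115.0, 115.0, 115.0]  # $M/yr (hypothetical)

pv_debt  = pv_revenue_requirements(debt_financing, 0.10)
pv_lease = pv_revenue_requirements(lease_financing, 0.10)
print(f"PV debt: {pv_debt:.1f}  PV lease: {pv_lease:.1f}")
# Choose the alternative with the lower present value of revenue requirements.
```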

    12. Impact analysis on a massively parallel computer

      SciTech Connect (OSTI)

      Zacharia, T.; Aramayo, G.A.

      1994-06-01

      Advanced mathematical techniques and computer simulation play a major role in evaluating and enhancing the design of beverage cans, industrial, and transportation containers for improved performance. Numerical models are used to evaluate the impact requirements of containers used by the Department of Energy (DOE) for transporting radioactive materials. Many of these models are highly compute-intensive. An analysis may require several hours of computational time on current supercomputers despite the simplicity of the models being studied. As computer simulations and materials databases grow in complexity, massively parallel computers have become important tools. Massively parallel computational research at the Oak Ridge National Laboratory (ORNL) and its application to the impact analysis of shipping containers is briefly described in this paper.

    13. A common language for computer security incidents

      SciTech Connect (OSTI)

      John D. Howard; Thomas A Longstaff

      1998-10-01

      Much of the computer security information regularly gathered and disseminated by individuals and organizations cannot currently be combined or compared because a common language has yet to emerge in the field of computer security. A common language consists of terms and taxonomies (principles of classification) which enable the gathering, exchange and comparison of information. This paper presents the results of a project to develop such a common language for computer security incidents. This project results from cooperation between the Security and Networking Research Group at the Sandia National Laboratories, Livermore, CA, and the CERT® Coordination Center at Carnegie Mellon University, Pittsburgh, PA. This Common Language Project was not an effort to develop a comprehensive dictionary of terms used in the field of computer security. Instead, the authors developed a minimum set of high-level terms, along with a structure indicating their relationship (a taxonomy), which can be used to classify and understand computer security incident information. They hope these high-level terms and their structure will gain wide acceptance, be useful, and most importantly, enable the exchange and comparison of computer security incident information. They anticipate, however, that individuals and organizations will continue to use their own terms, which may be more specific both in meaning and use. They designed the common language to enable these lower-level terms to be classified within the common language structure.
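The idea of classifying organization-specific low-level terms under a shared set of high-level categories can be sketched as follows; the category and term names here are illustrative stand-ins, not the actual term set developed by the project.

```python
# Sketch of mapping site-specific (low-level) incident terms onto a small
# shared set of high-level categories, in the spirit of the common-language
# taxonomy. Category and term names are illustrative, not the real term set.

HIGH_LEVEL = {
    "probe": {"port scan", "ping sweep", "banner grab"},
    "compromise": {"rootkit install", "credential theft", "webshell upload"},
    "denial of service": {"syn flood", "udp flood", "amplification attack"},
}

def classify(local_term):
    """Map an organization's own term onto a shared high-level category."""
    for category, terms in HIGH_LEVEL.items():
        if local_term.lower() in terms:
            return category
    return "unclassified"

print(classify("SYN flood"))  # a low-level term resolved to its category
```

Two sites can then compare incident counts by category even though their internal vocabularies differ.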

    14. Cyber Security Process Requirements Manual

      Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

      2008-08-12

      The Manual establishes the minimum implementation standards for cyber security management processes throughout the Department. No cancellation. Admin Chg 1 dated 9-1-09.

    15. ASCR Requirements Review 2015

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


    16. BER Requirements Review 2015

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


    17. Science Requirements Process

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


    18. Computationally Efficient Multiconfigurational Reactive Molecular Dynamics

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      | Argonne Leadership Computing Facility Computationally Efficient Multiconfigurational Reactive Molecular Dynamics Authors: Takefumi Yamashita, Yuxing Peng, Chris Knight, Gregory A. Voth It is a computationally demanding task to explicitly simulate the electronic degrees of freedom in a system to observe the chemical transformations of interest, while at the same time sampling the time and length scales required to converge statistical properties and thus reduce artifacts due to initial

    19. ASCR Science Network Requirements

      SciTech Connect (OSTI)

      Dart, Eli; Tierney, Brian

      2009-08-24

      The Energy Sciences Network (ESnet) is the primary provider of network connectivity for the US Department of Energy Office of Science, the single largest supporter of basic research in the physical sciences in the United States. In support of the Office of Science programs, ESnet regularly updates and refreshes its understanding of the networking requirements of the instruments, facilities, scientists, and science programs that it serves. This focus has helped ESnet to be a highly successful enabler of scientific discovery for over 20 years. In April 2009 ESnet and the Office of Advanced Scientific Computing Research (ASCR), of the DOE Office of Science, organized a workshop to characterize the networking requirements of the programs funded by ASCR. The ASCR facilities anticipate significant increases in wide area bandwidth utilization, driven largely by the increased capabilities of computational resources and the wide scope of collaboration that is a hallmark of modern science. Many scientists move data sets between facilities for analysis, and in some cases (for example the Earth System Grid and the Open Science Grid), data distribution is an essential component of the use of ASCR facilities by scientists. Due to the projected growth in wide area data transfer needs, the ASCR supercomputer centers all expect to deploy and use 100 Gigabit per second networking technology for wide area connectivity as soon as that deployment is financially feasible. In addition to the network connectivity that ESnet provides, the ESnet Collaboration Services (ECS) are critical to several science communities. ESnet identity and trust services, such as the DOEGrids certificate authority, are widely used both by the supercomputer centers and by collaborations such as Open Science Grid (OSG) and the Earth System Grid (ESG). Ease of use is a key determinant of the scientific utility of network-based services. Therefore, a key enabling aspect for scientists' beneficial use of high

    20. Computation Modeling and Assessment of Nanocoatings for Ultra Supercritical Boilers

      SciTech Connect (OSTI)

      J. Shingledecker; D. Gandy; N. Cheruvu; R. Wei; K. Chan

      2011-06-21

      Forced outages and boiler unavailability of coal-fired fossil plants are most often caused by fire-side corrosion of boiler waterwalls and tubing. Reliable coatings are required for ultrasupercritical (USC) application to mitigate corrosion, since these boilers will operate at much higher temperatures and pressures than supercritical (565 C at 24 MPa) boilers. Computational modeling efforts have been undertaken to design and assess potential Fe-Cr-Ni-Al systems to produce stable nanocrystalline coatings that form a protective, continuous scale of either Al2O3 or Cr2O3. The computational modeling results identified a new series of Fe-25Cr-40Ni, with or without 10 wt.% Al, nanocrystalline coatings that maintain long-term stability by forming a diffusion-barrier layer at the coating/substrate interface. The computational modeling predictions of microstructure, formation of continuous Al2O3 scale, inward Al diffusion, grain growth, and sintering behavior were validated with experimental results. Advanced coatings, such as MCrAl (where M is Fe, Ni, or Co) nanocrystalline coatings, have been processed using different magnetron sputtering deposition techniques. Several coating trials were performed; among the processing methods evaluated, the DC pulsed magnetron sputtering technique produced the best-quality coating with a minimum number of shallow defects, and the results of multiple deposition trials showed that the process is repeatable. The cyclic oxidation test results revealed that the nanocrystalline coatings offer better oxidation resistance, in terms of weight loss, localized oxidation, and formation of mixed oxides in the Al2O3 scale, than widely used MCrAlY coatings.
However, the ultra-fine grain structure in these coatings, consistent with the computational model predictions, resulted in accelerated Al

    1. Applications of Parallel Computers

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Applications of Parallel Computers, UCB CS267, Spring 2015. Tuesday & Thursday, 9:30-11:00 Pacific Time. Applications of Parallel Computers, CS267, is a graduate-level course...

    2. Title 43 CFR 3206.12 What are the Minimum and Maximum Lease Sizes...

      Open Energy Info (EERE)

      .12 What are the Minimum and Maximum Lease Sizes? Jump to: navigation, search OpenEI Reference LibraryAdd to library Legal Document- Federal RegulationFederal Regulation: Title 43...

    3. From Fjords to Open Seas: Ecological Genomics of Expanding Oxygen Minimum Zones (2010 JGI User Meeting)

      ScienceCinema (OSTI)

      Hallam, Steven

      2011-04-26

      Steven Hallam of the University of British Columbia presents "From Fjords to Open Seas: Ecological Genomics of Expanding Oxygen Minimum Zones" on March 24, 2010, at the 5th Annual DOE JGI User Meeting.

    4. Theory, Modeling and Computation

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Theory, Modeling and Computation. The sophistication of modeling and simulation will be enhanced not only by the wealth of data available from MaRIE but by the increased computational capacity made possible by the advent of extreme computing. CONTACT: Jack Shlachter, (505) 665-1888. Extreme Computing to Power Accurate Atomistic Simulations: advances in high-performance computing and theory allow longer and larger atomistic simulations than currently possible.

    5. Computer hardware fault administration

      DOE Patents [OSTI]

      Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

      2010-09-14

      Computer hardware fault administration carried out in a parallel computer, where the parallel computer includes a plurality of compute nodes. The compute nodes are coupled for data communications by at least two independent data communications networks, where each data communications network includes data communications links connected to the compute nodes. Typical embodiments carry out hardware fault administration by identifying a location of a defective link in the first data communications network of the parallel computer and routing communications data around the defective link through the second data communications network of the parallel computer.
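The routing idea in the abstract — when a link in the first network is found defective, route the affected traffic through the second, independent network — can be sketched with a breadth-first search. The topologies, node names, and fallback policy below are hypothetical illustrations, not the patented method.

```python
# Sketch of the fault-administration idea above: when a link in the first
# data communications network is defective, route the affected traffic
# through the second, independent network. Topologies are hypothetical.

from collections import deque

def route(network, src, dst, bad_links=frozenset()):
    """Breadth-first path from src to dst avoiding any defective links."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == dst:
            return path
        for nxt in network.get(node, ()):
            link = frozenset((node, nxt))
            if nxt not in seen and link not in bad_links:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # unreachable in this network with these faults

net1 = {"A": ["B"], "B": ["A", "C"], "C": ["B"]}   # first network (faulty)
net2 = {"A": ["C"], "C": ["A", "B"], "B": ["C"]}   # independent second network
bad = {frozenset(("B", "C"))}                      # defective link in net1

path = route(net1, "A", "C", bad) or route(net2, "A", "C")
print(path)  # falls back to the second network when net1 cannot reach C
```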

    6. Computational Earth Science

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      6 Computational Earth Science We develop and apply a range of high-performance computational methods and software tools to Earth science projects in support of environmental ...

    7. advanced simulation and computing

      National Nuclear Security Administration (NNSA)

      Each successive generation of computing system has provided greater computing power and energy efficiency.

      CTS-1 clusters will support NNSA's Life Extension Program and...

    8. Computational Physics and Methods

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      2 Computational Physics and Methods Performing innovative simulations of physics phenomena on tomorrow's scientific computing platforms Growth and emissivity of young galaxy ...

    9. Applied & Computational Math

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      & Computational Math - Sandia Energy ... Applied & Computational Math HomeEnergy ...

    10. Molecular Science Computing | EMSL

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      computational and state-of-the-art experimental tools, providing a cross-disciplinary environment to further research. Additional Information Computing user policies Partners...

    11. Feed tank transfer requirements

      SciTech Connect (OSTI)

      Freeman-Pollard, J.R.

      1998-09-16

      This document presents a definition of tank turnover. It also presents DOE and PC responsibilities; TWRS DST permitting requirements; TWRS Authorization Basis (AB) requirements; TWRS AP Tank Farm operational requirements; and unreviewed safety question (USQ) requirements for two cases (i.e., tank modifications occurring before tank turnover and tank modifications occurring after tank turnover). Finally, it presents records and reporting requirements, and the documentation that will require revision in support of transferring a DST in AP Tank Farm to a privatization contractor.

    12. NERSC-BES-Requirements-Yelick10.ppt

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Basic Energy Science Research Katherine Yelick NERSC Director Requirements Workshop NERSC Mission Accelerate the pace of scientific discovery for all DOE Office of Science (SC) research through computing and data systems and services. Efficient algorithms + flexible software + effective machines = great computational science. 2010 Allocations NERSC is the Production Facility for DOE Office of Science * NERSC serves a large population Approximately 3000 users, 400 projects, 500 code instances *

    13. Regulators, Requirements, Statutes

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Clean Air Act (CAA) Requirements for air quality and air emissions from facility operations Clean Water Act (CWA) Requirements for water quality and water discharges from facility...

    14. Requirements Review Reports

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Editors, "High Energy Physics and Nuclear Physics Network Requirements - Final Report", ESnet Network Requirements Workshop, August 2013, LBNL 6642E Download File: HEP-NP-Net-Req...

    15. National Energy Research Scientific Computing Center NERSC Exceeds...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Scientific Computing Center NERSC Exceeds Reliability Standards With Tape-Based Active ... on the archive, NERSC's storage capacity and reliability requirements are significant. ...

    16. ALCF Data Science Program | Argonne Leadership Computing Facility

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      ALCF Data Science Program The ALCF Data Science Program (ADSP) is targeted at "big data" science problems that require the scale and performance of leadership computing resources. ...

    17. Fermilab computing at the Intensity Frontier

      SciTech Connect (OSTI)

      Group, Craig; Fuess, S.; Gutsche, O.; Kirby, M.; Kutschke, R.; Lyon, A.; Norman, A.; Perdue, G.; Sexton-Kennedy, E.

      2015-12-23

      The Intensity Frontier refers to a diverse set of particle physics experiments using high-intensity beams. In this paper I will focus the discussion on the computing requirements and solutions of a set of neutrino and muon experiments in progress or planned to take place at the Fermi National Accelerator Laboratory located near Chicago, Illinois. The experiments face unique challenges, but also have overlapping computational needs. In principle, by exploiting the commonality and utilizing centralized computing tools and resources, requirements can be satisfied efficiently and scientists of individual experiments can focus more on the science and less on the development of tools and infrastructure.

    18. Fermilab computing at the Intensity Frontier

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Group, Craig; Fuess, S.; Gutsche, O.; Kirby, M.; Kutschke, R.; Lyon, A.; Norman, A.; Perdue, G.; Sexton-Kennedy, E.

      2015-12-23

      The Intensity Frontier refers to a diverse set of particle physics experiments using high-intensity beams. In this paper I will focus the discussion on the computing requirements and solutions of a set of neutrino and muon experiments in progress or planned to take place at the Fermi National Accelerator Laboratory located near Chicago, Illinois. The experiments face unique challenges, but also have overlapping computational needs. In principle, by exploiting the commonality and utilizing centralized computing tools and resources, requirements can be satisfied efficiently and scientists of individual experiments can focus more on the science and less on the development of tools and infrastructure.

    19. Richard Gerber! Harvey Wasserman! Requirements Reviews Organizers

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Requirements Reviews * 1½-day reviews with each Program Office * Computing and storage requirements for next 5 years * Participants - DOE ADs & Program Managers - Leading NERSC users & key potential users - NERSC staff * High Energy Physics, Fusion Research, Adv. Comp. Science Research (Jan. 2014); Basic Energy Sciences (Oct. 2014) * Reports from 8 requirements reviews have been published: http://www.nersc.gov/science/hpc-requirements-reviews/reports/ * Computing and s...

    20. Cosmic Reionization On Computers | Argonne Leadership Computing...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      its Cosmic Reionization On Computers (CROC) project, using the Adaptive Refinement Tree (ART) code as its main simulation tool. An important objective of this research is to make...

    1. Computational Science and Engineering

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computational Science and Engineering NETL's Computational Science and Engineering competency consists of conducting applied scientific research and developing physics-based simulation models, methods, and tools to support the development and deployment of novel process and equipment designs. Research includes advanced computations to generate information beyond the reach of experiments alone by integrating experimental and computational sciences across different length and time scales. Specific

    2. Computing for Finance

      ScienceCinema (OSTI)

      None

      2011-10-06

      to joining investment banking, Adam spent a number of years lecturing in IT and mathematics at a UK university and maintains links with academia through lectures, research and through validation and steering of postgraduate courses. He is a chartered mathematician and was the conference chair of the Institute of Mathematics and its Applications' first conference in computational finance. 4. From Monte Carlo to Wall Street: Daniel Egloff, Head of Financial Engineering Computing Unit, Zürich Cantonal Bank. High performance computing techniques provide new means to solve computationally hard problems in the financial service industry. First I consider Monte Carlo simulation and illustrate how it can be used to implement a sophisticated credit risk management and economic capital framework. From an HPC perspective, basic Monte Carlo simulation is embarrassingly parallel and can be implemented efficiently on distributed memory clusters. Additional difficulties arise for adaptive variance reduction schemes, if the information content in a sample is very small, and if the amount of simulated data becomes so huge that incremental processing algorithms are indispensable. We discuss the business value of an advanced credit risk quantification, which is particularly compelling these days. While Monte Carlo simulation is a very versatile tool, it is not always the preferred solution for the pricing of complex products like multi-asset options, structured products, or credit derivatives. As a second application I show how operator methods can be used to develop a pricing framework. The scalability of operator methods relies heavily on optimized dense matrix-matrix multiplications and requires specialized BLAS level-3 implementations provided by specialized FPGA or GPU boards. Speaker Bio: Daniel Egloff studied mathematics, theoretical physics, and computer science at the University of Zurich and the ETH Zurich. He holds a PhD in mathematics from the University of Fribourg, Switzerland.
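      The abstract's claim that basic Monte Carlo simulation is embarrassingly parallel can be illustrated with a minimal sketch. The toy credit portfolio below (100 loans, 2% independent default probability, unit exposure) is invented for illustration and is far simpler than any real economic capital model:

```python
import random
from multiprocessing import Pool

def simulate_losses(args):
    """One worker's batch of independent loss simulations (toy credit model)."""
    n, seed = args
    rng = random.Random(seed)  # per-worker seed keeps batches independent
    # Hypothetical portfolio: 100 loans, each defaulting with probability 2%,
    # losing its full exposure of 1.0 on default.
    return [sum(1.0 for _ in range(100) if rng.random() < 0.02)
            for _ in range(n)]

def parallel_var(total_paths, workers=4, quantile=0.99):
    """Split paths across workers, then take an empirical loss quantile."""
    per = total_paths // workers
    with Pool(workers) as pool:
        batches = pool.map(simulate_losses, [(per, s) for s in range(workers)])
    all_losses = sorted(loss for batch in batches for loss in batch)
    return all_losses[int(quantile * len(all_losses))]

if __name__ == "__main__":
    # Empirical 99% loss quantile over 20,000 simulated paths.
    print(parallel_var(20_000))
```

Because every path is independent, the only coordination needed is the final gather and sort, which is why plain Monte Carlo maps so cleanly onto distributed-memory clusters; the adaptive variance-reduction schemes the abstract mentions break exactly this independence.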

    3. Computing for Finance

      ScienceCinema (OSTI)

      None

      2011-10-06

      to joining investment banking, Adam spent a number of years lecturing in IT and mathematics at a UK university and maintains links with academia through lectures, research and through validation and steering of postgraduate courses. He is a chartered mathematician and was the conference chair of the Institute of Mathematics and its Applications' first conference in computational finance. 4. From Monte Carlo to Wall Street: Daniel Egloff, Head of Financial Engineering Computing Unit, Zürich Cantonal Bank. High performance computing techniques provide new means to solve computationally hard problems in the financial service industry. First I consider Monte Carlo simulation and illustrate how it can be used to implement a sophisticated credit risk management and economic capital framework. From an HPC perspective, basic Monte Carlo simulation is embarrassingly parallel and can be implemented efficiently on distributed memory clusters. Additional difficulties arise for adaptive variance reduction schemes, if the information content in a sample is very small, and if the amount of simulated data becomes so huge that incremental processing algorithms are indispensable. We discuss the business value of an advanced credit risk quantification, which is particularly compelling these days. While Monte Carlo simulation is a very versatile tool, it is not always the preferred solution for the pricing of complex products like multi-asset options, structured products, or credit derivatives. As a second application I show how operator methods can be used to develop a pricing framework. The scalability of operator methods relies heavily on optimized dense matrix-matrix multiplications and requires specialized BLAS level-3 implementations provided by specialized FPGA or GPU boards. Speaker Bio: Daniel Egloff studied mathematics, theoretical physics, and computer science at the University of Zurich and the ETH Zurich. He holds a PhD in mathematics from the University of Fribourg, Switzerland.

    4. Computing for Finance

      SciTech Connect (OSTI)

      2010-03-24

      to joining investment banking, Adam spent a number of years lecturing in IT and mathematics at a UK university and maintains links with academia through lectures, research and through validation and steering of postgraduate courses. He is a chartered mathematician and was the conference chair of the Institute of Mathematics and its Applications' first conference in computational finance. 4. From Monte Carlo to Wall Street: Daniel Egloff, Head of Financial Engineering Computing Unit, Zürich Cantonal Bank. High performance computing techniques provide new means to solve computationally hard problems in the financial service industry. First I consider Monte Carlo simulation and illustrate how it can be used to implement a sophisticated credit risk management and economic capital framework. From an HPC perspective, basic Monte Carlo simulation is embarrassingly parallel and can be implemented efficiently on distributed memory clusters. Additional difficulties arise for adaptive variance reduction schemes, if the information content in a sample is very small, and if the amount of simulated data becomes so huge that incremental processing algorithms are indispensable. We discuss the business value of an advanced credit risk quantification, which is particularly compelling these days. While Monte Carlo simulation is a very versatile tool, it is not always the preferred solution for the pricing of complex products like multi-asset options, structured products, or credit derivatives. As a second application I show how operator methods can be used to develop a pricing framework. The scalability of operator methods relies heavily on optimized dense matrix-matrix multiplications and requires specialized BLAS level-3 implementations provided by specialized FPGA or GPU boards. Speaker Bio: Daniel Egloff studied mathematics, theoretical physics, and computer science at the University of Zurich and the ETH Zurich. He holds a PhD in mathematics from the University of Fribourg, Switzerland.

    5. Molecular Science Computing: 2010 Greenbook

      SciTech Connect (OSTI)

      De Jong, Wibe A.; Cowley, David E.; Dunning, Thom H.; Vorpagel, Erich R.

      2010-04-02

      This 2010 Greenbook outlines the science drivers for performing integrated computational environmental molecular research at EMSL and defines the next-generation HPC capabilities that must be developed at the MSC to address this critical research. The EMSL MSC Science Panel used EMSL’s vision and science focus and white papers from current and potential future EMSL scientific user communities to define the scientific direction and resulting HPC resource requirements presented in this 2010 Greenbook.

    6. Parallel computing works

      SciTech Connect (OSTI)

      Not Available

      1991-10-23

      An account of the Caltech Concurrent Computation Program (C³P), a five-year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C³P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high performance computing facility based exclusively on parallel computers. While the initial focus of C³P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

    7. Computational Fluid Dynamics

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      scour-tracc-cfd TRACC RESEARCH Computational Fluid Dynamics Computational Structural Mechanics Transportation Systems Modeling Computational Fluid Dynamics Overview of CFD: Video Clip with Audio Computational fluid dynamics (CFD) research uses mathematical and computational models of flowing fluids to describe and predict fluid response in problems of interest, such as the flow of air around a moving vehicle or the flow of water and sediment in a river. Coupled with appropriate and prototypical

    8. Requirements | Department of Energy

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      Requirements Requirements Statutes 42 U.S.C. 4321: National Environmental Policy Act of 1969 42 U.S.C. 4371: Environmental Quality Improvement Act of 1970 42 U.S.C. 7401: Clean Air ...

    9. Requirements Management Database

      Energy Science and Technology Software Center (OSTI)

      2009-08-13

      This application is a simplified and customized version of the RBA and CTS databases to capture federal, site, and facility requirements, link to actions that must be performed to maintain compliance with their contractual and other requirements.

    10. Point sensitive NMR imaging system using a magnetic field configuration with a spatial minimum

      DOE Patents [OSTI]

      Eberhard, P.H.

      A point-sensitive NMR imaging system in which a main solenoid coil produces a relatively strong and substantially uniform magnetic field and a pair of perturbing coils powered by current in the same direction superimposes a pair of relatively weak perturbing fields on the main field to produce a resultant point of minimum field strength at a desired location in a direction along the Z-axis. Two other pairs of perturbing coils superimpose relatively weak field gradients on the main field in directions along the X- and Y-axes to locate the minimum field point at a desired location in a plane normal to the Z-axis. An RF generator irradiates a tissue specimen in the field with radio frequency energy so that desired nuclei in a small volume at the point of minimum field strength will resonate.

    11. Point sensitive NMR imaging system using a magnetic field configuration with a spatial minimum

      DOE Patents [OSTI]

      Eberhard, Philippe H.

      1985-01-01

      A point-sensitive NMR imaging system (10) in which a main solenoid coil (11) produces a relatively strong and substantially uniform magnetic field and a pair of perturbing coils (PZ1 and PZ2) powered by current in the same direction superimposes a pair of relatively weak perturbing fields on the main field to produce a resultant point of minimum field strength at a desired location in a direction along the Z-axis. Two other pairs of perturbing coils (PX1, PX2; PY1, PY2) superimpose relatively weak field gradients on the main field in directions along the X- and Y-axes to locate the minimum field point at a desired location in a plane normal to the Z-axis. An RF generator (22) irradiates a tissue specimen in the field with radio frequency energy so that desired nuclei in a small volume at the point of minimum field strength will resonate.

    12. Polymorphous computing fabric

      DOE Patents [OSTI]

      Wolinski, Christophe Czeslaw; Gokhale, Maya B.; McCabe, Kevin Peter

      2011-01-18

      Fabric-based computing systems and methods are disclosed. A fabric-based computing system can include a polymorphous computing fabric that can be customized on a per application basis and a host processor in communication with said polymorphous computing fabric. The polymorphous computing fabric includes a cellular architecture that can be highly parameterized to enable a customized synthesis of fabric instances for a variety of enhanced application performances thereof. A global memory concept can also be included that provides the host processor random access to all variables and instructions associated with the polymorphous computing fabric.

    13. ARM - Reporting Requirements

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      StatisticsReporting Requirements 2016 Quarterly Reports First Quarter (PDF) Second Quarter (PDF) Third Quarter (PDF) Fourth Quarter (PDF) Past Quarterly Reports Historical Statistics Field Campaigns Operational Visitors and Accounts Data Archive and Usage (October 1995 - Present) Reporting Requirements As a matter of government policy, all U.S. Department of Energy user facilities, including the ARM Climate Research Facility, have a number of reporting requirements. The Facility is required to

    14. PIT Coating Requirements Analysis

      SciTech Connect (OSTI)

      MINTEER, D.J.

      2000-10-20

      This study identifies the applicable requirements for procurement and installation of a coating intended for tank farm valve and pump pit interior surfaces. These requirements are intended to be incorporated into project specification documents and design media. This study also evaluates previously recommended coatings and identifies requirement-compliant coating products.

    15. Cyber Security Process Requirements Manual

      Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

      2008-08-12

      The Manual establishes minimum implementation standards for cyber security management processes throughout the Department. Admin Chg 1 dated 9-1-09; Admin Chg 2 dated 12-22-09. Canceled by DOE O 205.1B. No cancellations.

    16. Computers in Commercial Buildings

      U.S. Energy Information Administration (EIA) Indexed Site

      Government-owned buildings of all types had, on average, more than one computer per person (1,104 computers per thousand employees). They also had a fairly high ratio of...

    17. Computers for Learning

      Broader source: Energy.gov [DOE]

      Through Executive Order 12999, the Computers for Learning Program was established to provide Federal agencies a quick and easy system for donating excess and surplus computer equipment to schools...

    18. Cognitive Computing for Security.

      SciTech Connect (OSTI)

      Debenedictis, Erik; Rothganger, Fredrick; Aimone, James Bradley; Marinella, Matthew; Evans, Brian Robert; Warrender, Christina E.; Mickel, Patrick

      2015-12-01

      Final report for Cognitive Computing for Security LDRD 165613. It reports on the development of a hybrid general-purpose/neuromorphic computer architecture, with an emphasis on potential implementation with memristors.

    19. developing-compute-efficient

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Developing Compute-efficient, Quality Models with LS-PrePost 3 on the TRACC Cluster Oct. ... with an emphasis on applying these capabilities to build computationally efficient models. ...

    20. SSRL Computer Account Request Form

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Initial password desired - must be changed after your first login: ... (Minimum length 8 characters, your password should be a combination of 3 out of 4 options: 1- ...

    1. Fermilab | Science at Fermilab | Computing | Grid Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      which would collect more data than any computing center in existence could process. ... consortium grid called Open Science Grid, so they initiated a project known as FermiGrid. ...

    2. Advanced Scientific Computing Research

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Advanced Scientific Computing Research Advanced Scientific Computing Research Discovering, developing, and deploying computational and networking capabilities to analyze, model, simulate, and predict complex phenomena important to the Department of Energy. Get Expertise Pieter Swart (505) 665 9437 Email Pat McCormick (505) 665-0201 Email Dave Higdon (505) 667-2091 Email Fulfilling the potential of emerging computing systems and architectures beyond today's tools and techniques to deliver

    3. Computational Structural Mechanics

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      load-2 TRACC RESEARCH Computational Fluid Dynamics Computational Structural Mechanics Transportation Systems Modeling Computational Structural Mechanics Overview of CSM Computational structural mechanics is a well-established methodology for the design and analysis of many components and structures found in the transportation field. Modern finite-element models (FEMs) play a major role in these evaluations, and sophisticated software, such as the commercially available LS-DYNA® code, is

    4. Computers-BSA.ppt

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Energy Computers, Electronics and Electrical Equipment (2010 MECS) Computers, Electronics and Electrical Equipment (2010 MECS) Manufacturing Energy and Carbon Footprint for Computers, Electronics and Electrical Equipment Sector (NAICS 334, 335) Energy use data source: 2010 EIA MECS (with adjustments) Footprint Last Revised: February 2014 View footprints for other sectors here. Manufacturing Energy and Carbon Footprint Computers, Electronics and Electrical Equipment (123.71 KB) More Documents

    5. Computing and Computational Sciences Directorate - Information Technology

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computational Sciences and Engineering The Computational Sciences and Engineering Division (CSED) is ORNL's premier source of basic and applied research in the field of data sciences and knowledge discovery. CSED's science agenda is focused on research and development related to knowledge discovery enabled by the explosive growth in the availability, size, and variability of dynamic and disparate data sources. This science agenda encompasses data sciences as well as advanced modeling and

    6. Computing and Computational Sciences Directorate - Information Technology

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Information Technology Information Technology (IT) at ORNL serves a diverse community of stakeholders and interests. From everyday operations like email and telecommunications to institutional cluster computing and high bandwidth networking, IT at ORNL is responsible for planning and executing a coordinated strategy that ensures cost-effective, state-of-the-art computing capabilities for research and development. ORNL IT delivers leading-edge products to users in a risk-managed portfolio of

    7. BNL ATLAS Grid Computing

      ScienceCinema (OSTI)

      Michael Ernst

      2010-01-08

      As the sole Tier-1 computing facility for ATLAS in the United States and the largest ATLAS computing center worldwide Brookhaven provides a large portion of the overall computing resources for U.S. collaborators and serves as the central hub for storing,

    8. Mathematical and Computational Epidemiology

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Mathematical and Computational Epidemiology Search Site submit Contacts | Sponsors Mathematical and Computational Epidemiology Los Alamos National Laboratory change this image and alt text Menu About Contact Sponsors Research Agent-based Modeling Mixing Patterns, Social Networks Mathematical Epidemiology Social Internet Research Uncertainty Quantification Publications People Mathematical and Computational Epidemiology (MCEpi) Quantifying model uncertainty in agent-based simulations for

    9. Computing environment logbook

      DOE Patents [OSTI]

      Osbourn, Gordon C; Bouchard, Ann M

      2012-09-18

      A computing environment logbook logs events occurring within a computing environment. The events are displayed as a history of past events within the logbook of the computing environment. The logbook provides search functionality to search through the history of past events to find one or more selected past events, and further, enables an undo of the one or more selected past events.
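      The logbook pattern described in this patent abstract (log events, display and search the history of past events, undo selected past events) can be sketched as a minimal Python class. The do/undo-callback API below is an assumption made for illustration, not the patented design:

```python
class Logbook:
    """Minimal sketch of an event logbook with search and undo (assumed API)."""

    def __init__(self):
        self.history = []  # ordered record of past events

    def log(self, action, undo_action, description):
        """Perform an action and record how to reverse it."""
        action()
        self.history.append((undo_action, description))

    def search(self, term):
        """Return (index, description) pairs whose description matches term."""
        return [(i, desc) for i, (_, desc) in enumerate(self.history)
                if term in desc]

    def undo(self, index):
        """Reverse one selected past event and drop it from the history."""
        undo_action, description = self.history.pop(index)
        undo_action()
        return description

# Toy "computing environment": a dict of variables (invented for illustration).
env = {}
book = Logbook()
book.log(lambda: env.update(x=1), lambda: env.pop("x"), "set x = 1")
book.log(lambda: env.update(y=2), lambda: env.pop("y"), "set y = 2")
print(book.search("x"))  # [(0, 'set x = 1')]
book.undo(0)             # env reverts to {'y': 2}
```

A real implementation would need to handle undo of events that later events depend on; this sketch simply removes the selected entry and runs its stored reverse action.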

    10. Other Requirements - DOE Directives, Delegations, and Requirements

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Other Requirements by Website Administrator More filters Less filters Other Policy Type Secretarial Memo Program Office Memo Invoked Technical Standards 100 Office of Primary Interest (OPI) Office of Primary Interest (OPI) All AD - Office of Administrative Services AU - Office of Environment, Health, Safety and Security CF - Office of the Chief Financial Officer CI - Office of Congressional and Intergovernmental Affairs CN - Office of Counterintelligence CP - Office of the Press Secretary CR -