National Library of Energy BETA

Sample records for develop baseline computational

  1. Scope Management Baseline Development (FPM 208), Idaho | Department of Energy

    Energy Savers [EERE]

    Scope Management Baseline Development (FPM 208), Idaho. March 29, 2016 8:00 AM EDT to March 31, 2016 5:00 PM EDT. Scope Management Baseline Development, Level 2 Required Course, 3 days / 24 CLPs. This course is designed to enhance a Program or Project Manager's ability to clearly define requirements and scope, develop a defensible baseline, and manage conformance to the baseline throughout the project life-cycle. The course emphasizes

  2. Baseline Glass Development for Combined Fission Products Waste Streams

    SciTech Connect (OSTI)

    Crum, Jarrod V.; Billings, Amanda Y.; Lang, Jesse B.; Marra, James C.; Rodriguez, Carmen P.; Ryan, Joseph V.; Vienna, John D.

    2009-06-29

    Borosilicate glass was selected as the baseline technology for immobilization of the Cs/Sr/Ba/Rb (Cs), lanthanide (Ln), and transition metal fission product (TM) waste streams as part of a cost-benefit analysis study.[1] Vitrification of the combined waste streams has several advantages: minimization of the number of waste forms, a proven technology, and similarity to waste forms currently accepted for repository disposal. A joint study was undertaken by Pacific Northwest National Laboratory (PNNL) and Savannah River National Laboratory (SRNL) to develop acceptable glasses for the combined Cs + Ln + TM waste streams (Option 1) and the combined Cs + Ln waste streams (Option 2) generated by the AFCI UREX+ set of processes. This study aimed to develop baseline glasses for both combined waste stream options and to identify key waste components and their impact on waste loading. The elemental compositions of the four-corners study were used along with the available separations data to determine the effect of burnup, decay, and separations variability on estimated waste stream compositions.[2-5] Two different components/scenarios were identified that could limit waste loading of the combined Cs + Ln + TM waste streams, whereas the combined Cs + Ln waste stream has no single component that is perceived to limit waste loading. The combined Cs + Ln waste stream in a glass waste form will most likely be limited by heat due to the high activity of Cs and Sr isotopes.

  3. A SURVEY OF ASTRONOMICAL RESEARCH: A BASELINE FOR ASTRONOMICAL DEVELOPMENT

    SciTech Connect (OSTI)

    Ribeiro, V. A. R. M.; Russo, P.; Cárdenas-Avendaño, A. E-mail: russo@strw.leidenuniv.nl

    2013-12-01

    Measuring scientific development is a difficult task. Different metrics have been put forward to evaluate scientific development; in this paper we explore a metric that uses the number of peer-reviewed, and when available non-peer-reviewed, research articles as an indicator of development in the field of astronomy. We analyzed the available publication record, using the Smithsonian Astrophysical Observatory/NASA Astrophysics Data System, by country affiliation in the time span between 1950 and 2011 for countries with a gross national income of less than 14,365 USD in 2010. This represents 149 countries. We propose that this metric identifies countries in "astronomical development" with a culture of research publishing. We also propose that for a country to develop in astronomy, it should invest in outside expert visits, send its staff abroad to study, and establish a culture of scientific publishing. Furthermore, we propose that this paper may be used as a baseline to measure the success of major international projects, such as the International Year of Astronomy 2009.
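
    As an illustration of the publication-count metric described in this abstract, the short sketch below tallies refereed articles per country and sums them over time; the record list and its structure are invented for the example and are not the authors' actual ADS query pipeline.

        from collections import defaultdict

        # Hypothetical records: (year, country of affiliation, peer-reviewed flag).
        # A real analysis would pull these from the SAO/NASA Astrophysics Data System.
        records = [
            (1995, "Ethiopia", True),
            (2003, "Ethiopia", True),
            (2010, "Colombia", True),
            (2010, "Colombia", False),  # non-peer-reviewed item, counted separately if desired
            (2011, "Kenya", True),
        ]

        counts = defaultdict(int)  # (country, year) -> refereed article count
        for year, country, peer_reviewed in records:
            if peer_reviewed:
                counts[(country, year)] += 1

        # Cumulative refereed output per country, the quantity the survey tracks over time
        totals = defaultdict(int)
        for (country, _year), n in counts.items():
            totals[country] += n

        for country, total in sorted(totals.items()):
            print(f"{country}: {total} refereed articles")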

  4. Development Of Regional Climate Mitigation Baseline For A Dominant Agro-Ecological Zone Of Karnataka, India

    SciTech Connect (OSTI)

    Sudha, P.; Shubhashree, D.; Khan, H.; Hedge, G.T.; Murthy, I.K.; Shreedhara, V.; Ravindranath, N.H.

    2007-06-01

    Setting a baseline for carbon stock changes in forest and land use sector mitigation projects is an essential step for assessing additionality of the project. There are two approaches for setting baselines, namely project-specific and regional baselines. This paper presents the methodology adopted for estimating the land available for mitigation, for developing a regional baseline, the transaction cost involved, and a comparison of project-specific and regional baselines. The study showed that it is possible to estimate the potential land and its suitability for afforestation and reforestation mitigation projects, using existing maps and data, in the dry zone of Karnataka, southern India. The study adopted a three-step approach for developing a regional baseline, namely: i) identification of likely baseline options for land use, ii) estimation of baseline rates of land-use change, and iii) quantification of baseline carbon profile over time. The analysis showed that carbon stock estimates made for wastelands and fallow lands for the project-specific as well as the regional baseline are comparable. The ratio of wasteland carbon stocks of a project to the regional baseline is 1.02, and that of fallow lands in the project to the regional baseline is 0.97. The cost of conducting field studies for determination of the regional baseline is about a quarter of the cost of developing a project-specific baseline on a per hectare basis. The study has shown the reliability, feasibility, and cost-effectiveness of adopting regional baselines for forestry sector mitigation projects.

  5. developing-compute-efficient

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Developing Compute-efficient, Quality Models with LS-PrePost® 3 on the TRACC Cluster, Oct. 21-22, 2010, Argonne TRACC. Dr. Cezary Bojanowski, Dr. Ronald F. Kulak. The LS-PrePost Introductory Course was held October 21-22, 2010 at TRACC in West Chicago with interactive participation on-site as well as remotely via the Internet. Intended primarily for finite element analysts with

  6. Development of computer graphics

    SciTech Connect (OSTI)

    Nuttall, H.E.

    1989-07-01

    The purpose of this project was to screen and evaluate three graphics packages as to their suitability for displaying concentration contour graphs. The information to be displayed is from computer code simulations describing airborne contaminant transport. The three evaluation programs were MONGO (John Tonry, MIT, Cambridge, MA, 02139), Mathematica (Wolfram Research Inc.), and NCSA Image (National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign). After a preliminary investigation of each package, NCSA Image appeared to be significantly superior for generating the desired concentration contour graphs. Hence, subsequent work and this report describe the implementation and testing of NCSA Image on both Apple Mac II and Sun 4 computers. NCSA Image includes several utilities (Layout, DataScope, HDF, and PalEdit) which were used in this study and installed on Dr. Ted Yamada's Mac II computer. Dr. Yamada provided two sets of air pollution plume data which were displayed using NCSA Image. Both sets were animated into a sequential expanding plume series.

  7. Tools for Closure Project and Contract Management: Development of the Rocky Flats Integrated Closure Project Baseline

    SciTech Connect (OSTI)

    Gelles, C. M.; Sheppard, F. R.

    2002-02-26

    This paper details the development of the Rocky Flats Integrated Closure Project Baseline - an innovative project management effort undertaken to ensure proactive management of the Rocky Flats Closure Contract in support of the Department's goal of achieving the safe closure of the Rocky Flats Environmental Technology Site (RFETS) in December 2006. The accelerated closure of RFETS is one of the most prominent projects within the Department of Energy (DOE) Environmental Management program. As the first major former weapons plant to be remediated and closed, it is a first-of-a-kind effort requiring the resolution of multiple complex technical and institutional challenges. Most significantly, the closure of RFETS is dependent upon the shipment of all special nuclear material and wastes to other DOE sites. The Department is actively working to strengthen project management across programs, and there is increasing external interest in this progress. The development of the Rocky Flats Integrated Closure Project Baseline represents a groundbreaking and cooperative effort to formalize the management of such a complex project across multiple sites and organizations. It is original in both scope and process; however, it provides a useful precedent for the other ongoing project management efforts within the Environmental Management program.

  8. Advanced Materials Development through Computational Design ...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Advanced Materials Development through Computational Design. Presentation given at the 2007 Diesel Engine-Efficiency & Emissions Research ...

  9. Development of Computer-Aided Design Tools for Automotive Batteries...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    More Documents & Publications Progress of Computer-Aided Engineering of Batteries (CAEBAT) Vehicle Technologies Office Merit Review 2014: Development of Computer-Aided Design Tools ...

  10. Development of Computer-Aided Design Tools for Automotive Batteries...

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    More Documents & Publications Progress of Computer-Aided Engineering of Batteries (CAEBAT) ... Merit Review 2014: Development of Computer-Aided Design Tools for Automotive Batteries

  11. Computational Tools to Accelerate Commercial Development

    SciTech Connect (OSTI)

    Miller, David C.

    2013-01-01

    The goals of the work reported are: to develop new computational tools and models to enable industry to more rapidly develop and deploy new advanced energy technologies; to demonstrate the capabilities of the CCSI Toolset on non-proprietary case studies; and to deploy the CCSI Toolset to industry. Challenges of simulating carbon capture (and other) processes include: dealing with multiple scales (particle, device, and whole process scales); integration across scales; verification, validation, and uncertainty; and decision support. The tools cover: risk analysis and decision making; validated, high-fidelity CFD; high-resolution filtered sub-models; process design and optimization tools; advanced process control and dynamics; process models; basic data sub-models; and cross-cutting integration tools.

  12. Preliminary Phase Field Computational Model Development

    SciTech Connect (OSTI)

    Li, Yulan; Hu, Shenyang Y.; Xu, Ke; Suter, Jonathan D.; McCloy, John S.; Johnson, Bradley R.; Ramuhalli, Pradeep

    2014-12-15

    This interim report presents progress towards the development of meso-scale models of magnetic behavior that incorporate microstructural information. Modeling magnetic signatures in irradiated materials with complex microstructures (such as structural steels) is a significant challenge. The complexity is addressed incrementally, using the monocrystalline Fe (i.e., ferrite) film as model systems to develop and validate initial models, followed by polycrystalline Fe films, and by more complicated and representative alloys. In addition, the modeling incrementally addresses inclusion of other major phases (e.g., martensite, austenite), minor magnetic phases (e.g., carbides, FeCr precipitates), and minor nonmagnetic phases (e.g., Cu precipitates, voids). The focus of the magnetic modeling is on phase-field models. The models are based on the numerical solution to the Landau-Lifshitz-Gilbert equation. From the computational standpoint, phase-field modeling allows the simulation of large enough systems that relevant defect structures and their effects on functional properties like magnetism can be simulated. To date, two phase-field models have been generated in support of this work. First, a bulk iron model with periodic boundary conditions was generated as a proof-of-concept to investigate major loop effects of single versus polycrystalline bulk iron and effects of single non-magnetic defects. More recently, to support the experimental program herein using iron thin films, a new model was generated that uses finite boundary conditions representing surfaces and edges. This model has provided key insights into the domain structures observed in magnetic force microscopy (MFM) measurements. Simulation results for single crystal thin-film iron indicate the feasibility of the model for determining magnetic domain wall thickness and mobility in an externally applied field. Because the phase-field model dimensions are limited relative to the size of most specimens used in experiments, special experimental methods were devised to create similar boundary conditions in the iron films. Preliminary MFM studies conducted on single and polycrystalline iron films with small sub-areas created with focused ion beam have correlated quite well qualitatively with phase-field simulations. However, phase-field model dimensions are still small relative to experiments thus far. We are in the process of increasing the size of the models and decreasing specimen size so both have identical dimensions. Ongoing research is focused on validation of the phase-field model. Validation is being accomplished through comparison with experimentally obtained MFM images (in progress), and planned measurements of major hysteresis loops and first order reversal curves. Extrapolation of simulation sizes to represent a more stochastic bulk-like system will require sampling of various simulations (i.e., with single non-magnetic defect, single magnetic defect, single grain boundary, single dislocation, etc.) with distributions of input parameters. These outputs can then be compared to laboratory magnetic measurements and ultimately to simulate magnetic Barkhausen noise signals.
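
    For reference, the Landau-Lifshitz-Gilbert equation that the abstract cites as the basis of these phase-field models is commonly written in the standard form below (the report may use an equivalent formulation):

        \frac{\partial \mathbf{M}}{\partial t} = -\gamma\, \mathbf{M} \times \mathbf{H}_{\mathrm{eff}} + \frac{\alpha}{M_s}\, \mathbf{M} \times \frac{\partial \mathbf{M}}{\partial t}

    where M is the magnetization, H_eff is the effective field (exchange, anisotropy, magnetostatic, and applied contributions), γ is the gyromagnetic ratio, α is the Gilbert damping parameter, and M_s is the saturation magnetization.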

  13. BASELINE DESIGN/ECONOMICS FOR ADVANCED FISCHER-TROPSCH TECHNOLOGY

    SciTech Connect (OSTI)

    1998-04-01

    Bechtel, along with Amoco as the main subcontractor, developed a Baseline design, two alternative designs, and computer process simulation models for indirect coal liquefaction based on advanced Fischer-Tropsch (F-T) technology for the U. S. Department of Energy's (DOE's) Federal Energy Technology Center (FETC).

  14. Development of Computer-Aided Design Tools for Automotive Batteries | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    9_han_2012_o.pdf. More Documents & Publications: Progress of Computer-Aided Engineering of Batteries (CAEBAT); Computer-Aided Engineering for Electric Drive Vehicle Batteries (CAEBAT); Vehicle Technologies Office Merit Review 2014: Development of Computer-Aided Design Tools for Automotive Batteries

  15. Synthesis and Comparison of Baseline Avian and Bat Use, Raptor Nesting and Mortality Information from Proposed and Existing Wind Developments: Final Report.

    SciTech Connect (OSTI)

    Erickson, Wallace P.

    2002-12-01

    Primarily due to concerns generated from observed raptor mortality at the Altamont Pass (CA) wind plant, one of the first commercial electricity generating wind plants in the U.S., new proposed wind projects both within and outside of California have received a great deal of scrutiny and environmental review. A large amount of baseline and operational monitoring data have been collected at proposed and existing U.S. wind plants. The primary use of the avian baseline data collected at wind developments has been to estimate the overall project impacts (e.g., very low, low, moderate, and high relative mortality) on birds, especially raptors and sensitive species (e.g., state and federally listed species). In a few cases, these data have also been used for guiding placement of turbines within a project boundary. This new information has strengthened our ability to accurately predict and mitigate impacts from new projects. This report should assist various stakeholders in the interpretation and use of this large information source in evaluating new projects. This report also suggests that the level of baseline data (e.g., avian use data) required to adequately assess expected impacts of some projects may be reduced. This report provides an evaluation of the ability to predict direct impacts on avian resources (primarily raptors and waterfowl/waterbirds) using less than an entire year of baseline avian use data (one season, two seasons, etc.). This evaluation is important because pre-construction wildlife surveys can be one of the most time-consuming aspects of permitting wind power projects. For baseline data, this study focuses primarily on standardized avian use data usually collected using point count survey methodology and raptor nest survey data. In addition to avian use and raptor nest survey data, other baseline data is usually collected at a proposed project to further quantify potential impacts. These surveys often include vegetation mapping and state or federal sensitive-status wildlife and plant surveys if there is a likelihood of these species occurring in the vicinity of the project area. This report does not address these types of surveys, however, it is assumed in this document that those surveys are conducted when appropriate to help further quantify potential impacts. The amount and extent of ecological baseline data to collect at a wind project should be determined on a case-by-case basis. The decision should use information gained from this report, recent information from new projects (e.g., Stateline OR/WA), existing project site data from agencies and other knowledgeable groups/individuals, public scoping, and results of vegetation and habitat mapping. Other factors that should also be considered include the likelihood of the presence of sensitive species at the site and expected impacts to those species, project size and project layout.

  16. Hazard baseline documentation

    SciTech Connect (OSTI)

    Not Available

    1994-08-01

    This DOE limited technical standard establishes uniform Office of Environmental Management (EM) guidance on hazards baseline documents that identify and control radiological and nonradiological hazards for all EM facilities. It provides a road map to the safety and health hazard identification and control requirements contained in the Department's orders and provides EM guidance on the applicability and integration of these requirements. This includes a definition of four classes of facilities (nuclear, non-nuclear, radiological, and other industrial); the thresholds for facility hazard classification; and applicable safety and health hazard identification, controls, and documentation. The standard applies to the classification, development, review, and approval of hazard identification and control documentation for EM facilities.

  17. Hazard Baseline Documentation

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    1995-12-04

    This standard establishes uniform Office of Environmental Management (EM) guidance on hazard baseline documents that identify and control radiological and non-radiological hazards for all EM facilities.

  18. Development of Computer-Aided Design Tools for Automotive Batteries | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    8_hartridge_2012_o.pdf. More Documents & Publications: Progress of Computer-Aided Engineering of Batteries (CAEBAT); Vehicle Technologies Office Merit Review 2014: Development of Computer-Aided Design Tools for Automotive Batteries; Review of A123's HEV and PHEV USABC Programs

  19. Develop baseline computational model for proactive welding stress management to suppress helium induced cracking during weld repair

    Broader source: Energy.gov [DOE]

    There are over 100 nuclear power plants operating in the U.S., which generate approximately 20% of the nation's electricity. These plants range in age from 15 to 40 years. Extending the service lives...

  20. Advanced Materials Development through Computational Design | Department of Energy

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    Advanced Materials Development through Computational Design. Presentation given at the 2007 Diesel Engine-Efficiency & Emissions Research Conference (DEER 2007), 13-16 August 2007, Detroit, Michigan. Sponsored by the U.S. Department of Energy's (DOE) Office of FreedomCAR and Vehicle Technologies (OFCVT). deer07_muralidharan.pdf. More Documents & Publications: Materials for HCCI Engines; Vehicle Technologies Office Merit Review

  1. Computational Tools for Accelerating Carbon Capture Process Development

    SciTech Connect (OSTI)

    Miller, David; Sahinidis, N. V.; Cozad, A.; Lee, A.; Kim, H.; Morinelly, J.; Eslick, J.; Yuan, Z.

    2013-06-04

    This presentation reports development of advanced computational tools to accelerate next generation technology development. These tools are to develop an optimized process using rigorous models. They include: Process Models; Simulation-Based Optimization; Optimized Process; Uncertainty Quantification; Algebraic Surrogate Models; and Superstructure Optimization (Determine Configuration).
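
    As a minimal sketch of the simulation-based optimization workflow named in this abstract (not the CCSI Toolset API; the stand-in "rigorous model", sample points, and bounds are assumptions for illustration), the example below fits an algebraic surrogate to a few simulation samples and then optimizes the cheap surrogate:

        import numpy as np
        from scipy.optimize import minimize_scalar

        def rigorous_model(x):
            """Stand-in for an expensive, rigorous process simulation (hypothetical)."""
            return (x - 0.6) ** 2 + 0.05 * np.sin(8 * x)

        # Sample the expensive model at a handful of design points
        x_samples = np.linspace(0.0, 1.0, 7)
        y_samples = np.array([rigorous_model(x) for x in x_samples])

        # Fit a simple algebraic surrogate (here, a quadratic polynomial)
        surrogate = np.poly1d(np.polyfit(x_samples, y_samples, deg=2))

        # Optimize the surrogate instead of the expensive model
        result = minimize_scalar(surrogate, bounds=(0.0, 1.0), method="bounded")
        print(f"surrogate optimum near x = {result.x:.3f}; "
              f"rigorous model value there = {rigorous_model(result.x):.4f}")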

  2. The role of customized computational tools in product development.

    SciTech Connect (OSTI)

    Heinstein, Martin Wilhelm; Kempka, Steven Norman; Tikare, Veena

    2005-06-01

    Model-based computer simulations have revolutionized product development in the last 10 to 15 years. Technologies that have existed for many decades or even centuries have been improved with the aid of computer simulations. Everything from low-tech consumer goods such as detergents, lubricants and light bulb filaments to the most advanced high-tech products such as airplane wings, wireless communication technologies and pharmaceuticals is engineered with the aid of computer simulations today. In this paper, we present a framework for describing computational tools and their application within the context of product engineering. We examine a few cases of product development that integrate numerical computer simulations into the development stage. We will discuss how the simulations were integrated into the development process, what features made the simulations useful, the level of knowledge and experience that was necessary to run meaningful simulations and other details of the process. Based on this discussion, recommendations for the incorporation of simulations and computational tools into product development will be made.

  3. Transportation Baseline Report

    SciTech Connect (OSTI)

    Fawcett, Ricky Lee; Kramer, George Leroy Jr.

    1999-12-01

    The National Transportation Program 1999 Transportation Baseline Report presents data that form a baseline to enable analysis and planning for future Department of Energy (DOE) Environmental Management (EM) waste and materials transportation. In addition, this report provides a summary overview of DOE's projected quantities of waste and materials for transportation. Data presented in this report were gathered as a part of the IPABS Spring 1999 update of the EM Corporate Database and are current as of July 30, 1999. These data were input and compiled using the Analysis and Visualization System (AVS), which is used to update all stream-level components of the EM Corporate Database, as well as TSD System and programmatic risk (disposition barrier) information. Project (PBS) and site-level IPABS data are being collected through the Interim Data Management System (IDMS). The data are presented in appendices to this report.

  4. NASA technical baseline

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    technical baseline - Sandia Energy

  5. Accelerating technology development through integrated computation and experimentation

    SciTech Connect (OSTI)

    Shekhawat, Dushyant; Srivastava, Rameshwar

    2013-01-01

    This special section of Energy & Fuels comprises a selection of papers presented at the topical conference Accelerating Technology Development through Integrated Computation and Experimentation, sponsored and organized by the United States Department of Energy's National Energy Technology Laboratory (NETL) as part of the 2012 American Institute of Chemical Engineers (AIChE) Annual Meeting held in Pittsburgh, PA, Oct 28-Nov 2, 2012. That topical conference focused on the latest research and development efforts in five main areas related to fossil energy, with each area focusing on the utilization of both experimental and computational approaches: (1) gas separations (membranes, sorbents, and solvents for CO{sub 2}, H{sub 2}, and O{sub 2} production), (2) CO{sub 2} utilization (enhanced oil recovery, chemical production, mineralization, etc.), (3) carbon sequestration (flow in natural systems), (4) advanced power cycles (oxy-combustion, chemical looping, gasification, etc.), and (5) fuel processing (H{sub 2} production for fuel cells).

  6. Baseline LAW Glass Formulation Testing

    SciTech Connect (OSTI)

    Kruger, Albert A.; Mooers, Cavin; Bazemore, Gina; Pegg, Ian L.; Hight, Kenneth; Lai, Shan Tao; Buechele, Andrew; Rielley, Elizabeth; Gan, Hao; Muller, Isabelle S.; Cecil, Richard

    2013-06-13

    The major objective of the baseline glass formulation work was to develop and select glass formulations that are compliant with contractual and processing requirements for each of the LAW waste streams. Other objectives of the work included preparation and characterization of glasses with respect to the properties of interest, optimization of sulfate loading in the glasses, evaluation of ability to achieve waste loading limits, testing to demonstrate compatibility of glass melts with melter materials of construction, development of glass formulations to support ILAW qualification activities, and identification of glass formulation issues with respect to contract specifications and processing requirements.

  7. Development of a Very Dense Liquid Cooled Compute Platform

    SciTech Connect (OSTI)

    Hughes, Phillip N.; Lipp, Robert J.

    2013-12-10

    The objective of this project was to design and develop a prototype very energy efficient high density compute platform with 100% pumped refrigerant liquid cooling using commodity components and high volume manufacturing techniques. Testing at SLAC has indicated that we achieved a DCIE of 0.93 against our original goal of 0.85. This number includes both cooling and power supply and was achieved employing some of the highest wattage processors available.

  8. Annual Technology Baseline

    Broader source: Energy.gov [DOE]

    The National Renewable Energy Laboratory is conducting a study sponsored by the U.S. Department of Energy (DOE), Office of Energy Efficiency and Renewable Energy (EERE), that aims to document and implement an annual process designed to identify a realistic and timely set of input assumptions (e.g., technology cost and performance, fuel costs) and a diverse set of potential futures (standard scenarios), initially for electric sector analysis. The primary product of the Annual Technology Baseline (ATB) project includes detailed cost and performance data (both current and projected) for both renewable and conventional technologies. This data is presented in MS Excel.
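
    A minimal sketch of reading the ATB spreadsheet data described above with pandas; the file name, sheet name, and column labels are placeholders, not the actual ATB workbook layout:

        import pandas as pd

        # Placeholder file, sheet, and column names -- substitute those used in the ATB workbook
        atb = pd.read_excel("annual_technology_baseline.xlsx", sheet_name="Solar-PV")

        # Pull projected capital cost by year for one technology (illustrative)
        subset = atb[atb["technology"] == "Utility-scale PV"]
        for _, row in subset.iterrows():
            print(row["year"], row["capex_usd_per_kw"])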

  9. FED baseline engineering studies report

    SciTech Connect (OSTI)

    Sager, P.H.

    1983-04-01

    Studies were carried out on the FED Baseline to improve design definition, establish feasibility, and reduce cost. Emphasis was placed on cost reduction, but significant feasibility concerns existed in several areas, and better design definition was required to establish feasibility and provide a better basis for cost estimates. Design definition and feasibility studies included the development of a labyrinth shield ring concept to prevent radiation streaming between the torus spool and the TF coil cryostat. The labyrinth shield concept which was developed reduced radiation streaming sufficiently to permit contact maintenance of the inboard EF coils. Various concepts of preventing arcing between adjacent shield sectors were also explored. It was concluded that installation of copper straps with molybdenum thermal radiation shields would provide the most reliable means of preventing arcing. Other design studies included torus spool electrical/structural concepts, test module shielding, torus seismic response, poloidal conditions in the magnets, disruption characteristics, and eddy current effects. These additional studies had no significant impact on cost but did confirm the feasibility of the basic FED Baseline concept.

  10. Computational Tools for Accelerating Carbon Capture Process Development

    SciTech Connect (OSTI)

    Miller, David

    2013-01-01

    The goals of the work reported are: to develop new computational tools and models to enable industry to more rapidly develop and deploy new advanced energy technologies; to demonstrate the capabilities of the CCSI Toolset on non-proprietary case studies; and to deploy the CCSI Toolset to industry. Challenges of simulating carbon capture (and other) processes include: dealing with multiple scales (particle, device, and whole process scales); integration across scales; verification, validation, and uncertainty; and decision support. The tools cover: risk analysis and decision making; validated, high-fidelity CFD; high-resolution filtered sub-models; process design and optimization tools; advanced process control and dynamics; process models; basic data sub-models; and cross-cutting integration tools.

  11. Direct coal liquefaction baseline design and system analysis

    SciTech Connect (OSTI)

    Not Available

    1991-04-01

    The primary objective of the study is to develop a computer model for a baseline direct coal liquefaction design based on two-stage, direct-coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a baseline design based on previous DOE/PETC results from the Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC staff to understand and use the computer model; a thorough documentation of all underlying assumptions for baseline economics; and a user manual and training material which will facilitate updating of the model in the future.

  12. Direct coal liquefaction baseline design and system analysis

    SciTech Connect (OSTI)

    Not Available

    1991-07-01

    The primary objective of the study is to develop a computer model for a baseline direct coal liquefaction design based on two-stage, direct-coupled catalytic reactors. This primary objective is to be accomplished by completing the following: a baseline design based on previous DOE/PETC results from the Wilsonville pilot plant and other engineering evaluations; a cost estimate and economic analysis; a computer model incorporating the above two steps over a wide range of capacities and selected process alternatives; a comprehensive training program for DOE/PETC staff to understand and use the computer model; a thorough documentation of all underlying assumptions for baseline economics; and a user manual and training material which will facilitate updating of the model in the future.
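
    The two abstracts above note that the computer model spans a wide range of plant capacities; one conventional way conceptual cost models scale capital cost with capacity is a power-law rule, sketched below purely as an illustration (the exponent and reference values are assumptions, not figures from the Bechtel/Amoco study):

        def scaled_capital_cost(capacity, ref_capacity, ref_cost, exponent=0.6):
            """Power-law capacity scaling often used in conceptual cost estimates."""
            return ref_cost * (capacity / ref_capacity) ** exponent

        # Hypothetical reference plant: 20,000 bbl/day costing $2.0 billion
        for cap in (10_000, 20_000, 40_000):  # bbl/day
            cost = scaled_capital_cost(cap, ref_capacity=20_000, ref_cost=2.0e9)
            print(f"{cap:>6} bbl/day -> ${cost / 1e9:.2f} billion (illustrative)")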

  13. Integrated Baseline System (IBS) Version 2.0: Models guide

    SciTech Connect (OSTI)

    Not Available

    1994-03-01

    The Integrated Baseline System (IBS) is an emergency management planning and analysis tool being developed under the direction of the US Army Nuclear and Chemical Agency. This Models Guide summarizes the IBS use of several computer models for predicting the results of emergency situations. These include models for predicting dispersion/doses of airborne contaminants, traffic evacuation, explosion effects, heat radiation from a fire, and siren sound transmission. The guide references additional technical documentation on the models when such documentation is available from other sources. The audience for this manual is chiefly emergency management planners and analysts, but also data managers and system managers.

  14. Energy Intensity Baselining and Tracking Guidance | Department of Energy

    Office of Environmental Management (EM)

    The Energy Intensity Baselining and Tracking Guidance for the Better Buildings, Better Plants Program helps companies meet the program's reporting requirements by describing the steps necessary to develop an energy consumption and energy intensity baseline and to calculate consumption and intensity changes over time. Most of the calculation steps described
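
    A minimal sketch of the baselining arithmetic the guidance describes; the figures below and the simple intensity definition (energy divided by production) are illustrative assumptions, and the actual guidance also covers normalization and other adjustments:

        # Hypothetical annual data for one plant
        baseline_energy_mmbtu = 120_000     # energy use in the baseline year
        baseline_production_tons = 50_000   # output in the baseline year
        current_energy_mmbtu = 126_000
        current_production_tons = 60_000

        baseline_intensity = baseline_energy_mmbtu / baseline_production_tons
        current_intensity = current_energy_mmbtu / current_production_tons

        # Percent improvement in energy intensity relative to the baseline year
        improvement = (baseline_intensity - current_intensity) / baseline_intensity * 100
        print(f"baseline intensity: {baseline_intensity:.2f} MMBtu/ton")
        print(f"current intensity:  {current_intensity:.2f} MMBtu/ton")
        print(f"intensity change vs. baseline: {improvement:.1f}% improvement")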

  15. Single ion implantation for solid state quantum computer development

    SciTech Connect (OSTI)

    Schenkel, Thomas; Meijers, Jan; Persaud, Arun; McDonald, Joseph W.; Holder, Joseph P.; Schneider, Dieter H.

    2001-12-18

    Several solid state quantum computer schemes are based on the manipulation of electron and nuclear spins of single donor atoms in a solid matrix. The fabrication of qubit arrays requires the placement of individual atoms with nanometer precision and high efficiency. In this article we describe first results from low dose, low energy implantations and our development of a low energy (<10 keV), single ion implantation scheme for {sup 31}P{sup q+} ions. When {sup 31}P{sup q+} ions impinge on a wafer surface, their potential energy (9.3 keV for P{sup 15+}) is released, and about 20 secondary electrons are emitted. The emission of multiple secondary electrons allows detection of each ion impact with 100% efficiency. The beam spot on target is controlled by beam focusing and collimation. Exactly one ion is implanted into a selected area avoiding a Poissonian distribution of implanted ions.
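
    To illustrate why detecting each ion impact matters, the sketch below works out the Poisson statistics an undetected implant would follow when the dose is tuned to an average of one ion per site; the numbers are generic Poisson probabilities, not measurements from the article.

        from math import exp, factorial

        def poisson_pmf(k, mean):
            """Probability of exactly k ions landing in a site for a Poisson-distributed dose."""
            return mean ** k * exp(-mean) / factorial(k)

        mean_ions_per_site = 1.0  # dose chosen so that, on average, one ion hits each site

        p_zero = poisson_pmf(0, mean_ions_per_site)
        p_one = poisson_pmf(1, mean_ions_per_site)
        p_multiple = 1.0 - p_zero - p_one

        print(f"P(no ion)        = {p_zero:.3f}")      # ~0.368: site left empty
        print(f"P(exactly 1 ion) = {p_one:.3f}")       # ~0.368: the desired outcome
        print(f"P(2 or more)     = {p_multiple:.3f}")  # ~0.264: overdosed site
        # With 100%-efficient secondary-electron detection, implantation into a site can be
        # stopped after the first registered impact, so each site receives exactly one ion.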

  16. Baseline Test Specimen Machining Report

    SciTech Connect (OSTI)

    Mark Carroll

    2009-08-01

    The Next Generation Nuclear Plant (NGNP) Project is tasked with selecting a high temperature gas reactor technology that will be capable of generating electricity and supplying large amounts of process heat. The NGNP is presently being designed as a helium-cooled high temperature gas reactor (HTGR) with a large graphite core. The graphite baseline characterization project is conducting the research and development (R&D) activities deemed necessary to fully qualify nuclear-grade graphite for use in the NGNP reactor. Establishing nonirradiated thermomechanical and thermophysical properties by characterizing lot-to-lot and billet-to-billet variations (for probabilistic baseline data needs) through extensive data collection and statistical analysis is one of the major fundamental objectives of the project. The reactor core will be made up of stacks of graphite moderator blocks. In order to gain a more comprehensive understanding of the varying characteristics in a wide range of suitable graphites, any of which can be classified as nuclear grade, an experimental program has been initiated to develop an extensive database of the baseline characteristics of numerous candidate graphites. Various factors known to affect the properties of graphite will be investigated, including specimen size, spatial location within a graphite billet, specimen orientation within a billet (either parallel to [P] or transverse to [T] the long axis of the as-produced billet), and billet-to-billet variations within a lot or across different production lots. Because each data point is based on a certain position within a given billet of graphite, particular attention must be paid to the traceability of each specimen and its spatial location and orientation within each billet. The evaluation of these properties is discussed in the Graphite Technology Development Plan (Windes et. al, 2007). One of the key components in the evaluation of these graphite types will be mechanical testing on specimens drawn from carefully controlled sections of each billet. To this end, this report will discuss the machining of the first set of test specimens that will be evaluated in this program through tensile, compressive, and flexural testing. Validation that the test specimens have been produced to the tolerances required by the applicable ASTM standards, and to the quality control levels required by this program, will demonstrate the viability of sending graphite to selected suppliers that will provide valuable and certifiable data to future data sets that are integral to the NGNP program and beyond.

  17. Pinellas Plant Environmental Baseline Report

    SciTech Connect (OSTI)

    Not Available

    1997-06-01

    The Pinellas Plant has been part of the Department of Energy's (DOE) nuclear weapons complex since the plant opened in 1957. In March 1995, the DOE sold the Pinellas Plant to the Pinellas County Industry Council (PCIC). DOE has leased back a large portion of the plant site to facilitate transition to alternate use and safe shutdown. The current mission is to achieve a safe transition of the facility from defense production and prepare the site for alternative uses as a community resource for economic development. Toward that effort, the Pinellas Plant Environmental Baseline Report (EBR) discusses the current and past environmental conditions of the plant site. Information for the EBR is obtained from plant records. Historical process and chemical usage information for each area is reviewed during area characterizations.

  18. ARM - Baseline Change Request Guidelines

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Baseline Change Requests (BCRs) are used by the ARM Infrastructure as a process to provide configuration control and for formally requesting and documenting changes within the ARM Infrastructure. Configuration Control: BCRs are required for changes to instruments, data systems, data processes, datastreams, measurement methods, and facilities.

  19. Baseline Control Measures.pdf

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Individual Permit Baseline Control Measures at Los Alamos National Laboratory, Poster, Individual Permit for Storm Water, NPDES Permit No. NM0030759 Author(s): Veenis, Steven J....

  20. ARM - AMF2 Baseline Instruments

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    AMF2 Baseline Instruments. Instrument Suites: View the list of...

  1. computers

    National Nuclear Security Administration (NNSA)

    Each successive generation of computing system has provided greater computing power and energy efficiency.

    CTS-1 clusters will support NNSA's Life Extension Program and...

  2. HEV America Baseline Test Sequence

    Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

    HEV America Baseline Test Sequence, Revision 1, September 1, 2006. Prepared by Electric Transportation Applications (prepared by Roberta Brayer; approved by Donald B. Karner). ©2005 Electric Transportation Applications, All Rights Reserved. HEV PERFORMANCE TEST PROCEDURE SEQUENCE: The following test sequence shall be used for conduct of HEV America

  3. Hanford Site technical baseline database

    SciTech Connect (OSTI)

    Porter, P.E., Westinghouse Hanford

    1996-05-10

    This document includes a cassette tape that contains the Hanford-specific files that make up the Hanford Site Technical Baseline Database as of May 10, 1996. The cassette tape also includes the delta files that delineate the differences between this revision and revision 3 (April 10, 1996) of the Hanford Site Technical Baseline Database.

  4. Ethiopia-National Greenhouse Gas Emissions Baseline Scenarios...

    Open Energy Info (EERE)

    Ethiopia - National Greenhouse Gas Emissions Baseline Scenarios: Learning from Experiences in Developing Countries. Name: Ethiopia-National Greenhouse Gas Emissions...

  5. U.S. Department of Energy Performance Baseline Guide

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    2008-09-12

    The guide supports DOE O 413.3A and identifies key performance baseline development processes and practices. Does not cancel other directives.

  6. Baseline Graphite Characterization: First Billet

    SciTech Connect (OSTI)

    Mark C. Carroll; Joe Lords; David Rohrbaugh

    2010-09-01

    The Next Generation Nuclear Plant Project Graphite Research and Development program is currently establishing the safe operating envelope of graphite core components for a very high temperature reactor design. To meet this goal, the program is generating the extensive amount of quantitative data necessary for predicting the behavior and operating performance of the available nuclear graphite grades. In order determine the in-service behavior of the graphite for the latest proposed designs, two main programs are underway. The first, the Advanced Graphite Creep (AGC) program, is a set of experiments that are designed to evaluate the irradiated properties and behavior of nuclear grade graphite over a large spectrum of temperatures, neutron fluences, and compressive loads. Despite the aggressive experimental matrix that comprises the set of AGC test runs, a limited amount of data can be generated based upon the availability of space within the Advanced Test Reactor and the geometric constraints placed on the AGC specimens that will be inserted. In order to supplement the AGC data set, the Baseline Graphite Characterization program will endeavor to provide supplemental data that will characterize the inherent property variability in nuclear-grade graphite without the testing constraints of the AGC program. This variability in properties is a natural artifact of graphite due to the geologic raw materials that are utilized in its production. This variability will be quantified not only within a single billet of as-produced graphite, but also from billets within a single lot, billets from different lots of the same grade, and across different billets of the numerous grades of nuclear graphite that are presently available. The thorough understanding of this variability will provide added detail to the irradiated property data, and provide a more thorough understanding of the behavior of graphite that will be used in reactor design and licensing. This report covers the development of the Baseline Graphite Characterization program from a testing and data collection standpoint through the completion of characterization on the first billet of nuclear-grade graphite. This data set is the starting point for all future evaluations and comparisons of material properties.
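
    As a minimal sketch of the kind of variability statistics such a characterization program collects (the strength values and billet labels below are invented for illustration, not measured data), within-billet and billet-to-billet scatter can be summarized like this:

        import statistics

        # Hypothetical tensile-strength measurements (MPa) keyed by billet
        measurements = {
            "billet_A": [22.1, 23.4, 21.8, 22.9],
            "billet_B": [24.0, 23.2, 24.6, 23.8],
            "billet_C": [21.5, 22.0, 21.1, 21.9],
        }

        billet_means = {}
        for billet, values in measurements.items():
            mean = statistics.mean(values)
            stdev = statistics.stdev(values)
            billet_means[billet] = mean
            # Coefficient of variation captures within-billet scatter
            print(f"{billet}: mean = {mean:.1f} MPa, CoV = {stdev / mean:.1%}")

        # Billet-to-billet scatter of the billet mean values
        overall_mean = statistics.mean(billet_means.values())
        between_stdev = statistics.stdev(billet_means.values())
        print(f"billet-to-billet: mean of means = {overall_mean:.1f} MPa, "
              f"CoV = {between_stdev / overall_mean:.1%}")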

  7. 324 Building Baseline Radiological Characterization

    SciTech Connect (OSTI)

    R.J. Reeder, J.C. Cooper

    2010-06-24

    This report documents the analysis of radiological data collected as part of the characterization study performed in 1998. The study was performed to create a baseline of the radiological conditions in the 324 Building.

  8. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, April--June 1992

    SciTech Connect (OSTI)

    Not Available

    1992-10-01

    Effective September 26, 1991, Bechtel, with Amoco as the main subcontractor, initiated a study to develop a computer model and baseline design for advanced Fischer-Tropsch (F-T) technology for the US Department of Energy's Pittsburgh Energy Technology Center (PETC). The objectives of the study are to: develop a baseline design for indirect liquefaction using advanced F-T technology; prepare the capital and operating costs for the baseline design; and develop a process flow sheet simulation (PI-S) model. The baseline design, the economic analysis, and the computer model will be the major research planning tools that PETC will use to plan, guide, and evaluate its ongoing and future research and commercialization programs relating to indirect coal liquefaction for the manufacture of synthetic liquid fuels from coal. This report is Bechtel's third quarterly technical progress report covering the period from March 16, 1992 through June 21, 1992. This report consists of seven sections: Section 1 - introduction; Section 2 - summary; Section 3 - carbon dioxide removal tradeoff study; Section 4 - preliminary plant designs for coal preparation; Section 5 - preliminary design for syngas production; Section 6 - Task 3 - engineering design criteria; and Section 7 - project management.

  9. ARM - AMF1 Baseline Instruments

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    AMF1 Baseline Instruments. AMF Deployments: McMurdo Station, Antarctica, 2015-2016; Pearl Harbor, Hawaii, to San Francisco, California, 2015; Hyytiälä, Finland, 2014; Manacapuru, Brazil, 2014; Oliktok Point, Alaska, 2013; Los Angeles, California, to Honolulu, Hawaii, 2012; Cape Cod, Massachusetts, 2012; Gan Island, Maldives, 2011; Ganges Valley, India, 2011; Steamboat

  10. ARM - AMF3 Baseline Instruments

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    AMF3 Baseline Instruments. AMF Deployments: McMurdo Station, Antarctica, 2015-2016; Pearl Harbor, Hawaii, to San Francisco, California, 2015; Hyytiälä, Finland, 2014; Manacapuru, Brazil, 2014; Oliktok Point, Alaska, 2013; Los Angeles, California, to Honolulu, Hawaii, 2012; Cape Cod, Massachusetts, 2012; Gan Island, Maldives, 2011; Ganges Valley, India, 2011; Steamboat

  11. U.S Department of Energy Performance Baseline Guide

    Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

    2011-09-23

    This guide identifies key Performance Baseline (PB) elements, development processes, and practices; describes the context in which DOE PB development occurs; and suggests ways of addressing the critical elements in PB development.

  12. Mid-Atlantic Baseline Studies Project | Department of Energy

    Energy Savers [EERE]

    Mid-Atlantic Baseline Studies Project Mid-Atlantic Baseline Studies Project Funded by the Department of Energy, along with a number of partners, the collaborative Mid-Atlantic Baseline Studies Project, led by the Biodiversity Research Institute (BRI), helps improve understanding of species composition and use of the Mid-Atlantic marine environment in order to promote more sustainable offshore wind development. This first-of-its-kind study along the Eastern Seaboard of the United States delivers

  13. computers

    National Nuclear Security Administration (NNSA)

    California.

    Retired computers used for cybersecurity research at Sandia National...

  14. Computer

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    I. INTRODUCTION This paper presents several computational tools required for processing images of a heavy ion beam and estimating the magnetic field within a plasma. The...

  15. Firm develops own EMS built on Apple computer

    SciTech Connect (OSTI)

    Pospisil, R.

    1982-04-05

    Firestone Fibers and Textile Co. programmed a $2000 desktop Apple II computer and special electronic panels designed by the engineering staff to perform process control and other energy-management functions. The system should reduce natural gas consumption 40% and save the company up to $75,000 a year by reducing the amount of hot air exhausted from fabric-treating ovens. The system can be expanded to control lights and space-conditioning equipment. The company is willing to negotiate with other firms to market the panels. The Apple II was chosen because it has a high capacity for data acquisition and testing and because of the available software. (DCK)

  16. Baseline Wind Energy Facility | Open Energy Information

    Open Energy Info (EERE)

    Baseline Wind Energy Facility. Name: Baseline Wind Energy Facility; Sector: Wind energy; Facility Type: Commercial Scale Wind...

  17. Baseline Control Measures.pdf

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Individual Permit Baseline Control Measures at Los Alamos National Laboratory, Poster, Individual Permit for Storm Water, NPDES Permit No. NM0030759 Author(s): Veenis, Steven J. Intended for: Public Purpose: This poster was prepared for the June 2013 Individual Permit for Storm Water (IP) public meeting. The purpose of the meeting was to update the public on implementation of the permit as required under Part 1.I (7) of the IP (National Pollutant Discharge Elimination System Permit No.

  18. Open source development experience with a computational gas-solids flow code

    SciTech Connect (OSTI)

    Syamlal, M; O'Brien, T. J.; Benyahia, Sofiane; Gel, Aytekin; Pannala, Sreekanth

    2008-01-01

    A case study on the use of open source (OS) software development in chemical engineering research and education is presented here. The multiphase computational fluid dynamics software MFIX is the object of the case study. The verification and validation steps required for constructing modern computational software and the advantages of OS development in those steps are discussed. The infrastructure used for enabling the OS development of MFIX is described. The impact of OS development on computational research and education in gas-solids flow and the dissemination of information to other areas such as geotechnical and volcanology research are demonstrated. It is shown that the advantages of OS development methodology were realized: verification by many users, which enhances software quality; the use of software as a means for accumulating and exchanging information; and the facilitation of peer review of the results of computational research.

  19. Computing

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Office of Advanced Scientific Computing Research in the Department of Energy Office of Science under contract number DE-AC02-05CH11231. Application and System Memory Use, Configuration, and Problems on Bassi. Richard Gerber, Lawrence Berkeley National Laboratory, NERSC User Services. ScicomP 13, Garching bei München, Germany, July 17, 2007. Overview: About Bassi; Memory on Bassi; Large Page Memory (It's Great!); System Configuration; Large Page

  20. Computations

    Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

    Computations - Sandia Energy

    1. Vehicle Technologies Office Merit Review 2015: Development of Computer-Aided Design Tools for Automotive Batteries

      Broader source: Energy.gov [DOE]

      Presentation given by General Motors at 2015 DOE Hydrogen and Fuel Cells Program and Vehicle Technologies Office Annual Merit Review and Peer Evaluation Meeting about development of computer-aided...

    2. Vehicle Technologies Office Merit Review 2014: Development of Computer-Aided Design Tools for Automotive Batteries

      Broader source: Energy.gov [DOE]

      Presentation given by CD-Adapco at 2014 DOE Hydrogen and Fuel Cells Program and Vehicle Technologies Office Annual Merit Review and Peer Evaluation Meeting about development of computer-aided...

    3. New DOE Office of Science support for CAMERA to develop computational mathematics for experimental facilities research

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      New DOE Office of Science support for CAMERA to develop computational mathematics for experimental facilities research. September 22, 2015. Contact: Linda Vu, +1 510 495 2402, lvu@lbl.gov. Experimental science is evolving. With the advent of new technology, scientific facilities are collecting data at

    4. NREL Supports Industry to Develop Computer-Aided Engineering Tools for Car Batteries

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      NREL Supports Industry to Develop Computer-Aided Engineering Tools for Car Batteries - News Releases | NREL. July 7, 2011. The U.S. Department of Energy's (DOE) National Renewable Energy Laboratory (NREL) recently awarded three industry teams, after a competitive procurement process, a total of $7 million for the development of computer-aided software design tools to help produce the next generation of electric drive vehicle (EDV) batteries. These projects support DOE's

    5. Laboratory Directed Research & Development Page National Energy Research Scientific Computing Center

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Laboratory Directed Research & Development Page, National Energy Research Scientific Computing Center. T3E Individual Node Optimization. Michael Stewart, SGI/Cray, 4/9/98. * Introduction * T3E Processor * T3E Local Memory * Cache Structure * Optimizing Codes for Cache Usage * Loop Unrolling * Other Useful Optimization Options * References. Introduction: * Primary topic will be single processor

    6. Accelerating Development of EV Batteries Through Computer-Aided Engineering (Presentation)

      SciTech Connect (OSTI)

      Pesaran, A.; Kim, G. H.; Smith, K.; Santhanagopalan, S.

      2012-12-01

      The Department of Energy's Vehicle Technology Program has launched the Computer-Aided Engineering for Automotive Batteries (CAEBAT) project to work with national labs, industry, and software vendors to develop sophisticated software. As coordinator, NREL has teamed with a number of companies to help improve and accelerate battery design and production. This presentation provides an overview of CAEBAT, including its predictive computer simulation of Li-ion batteries known as the Multi-Scale Multi-Dimensional (MSMD) model framework. MSMD's modular, flexible architecture connects the physics of battery charge/discharge processes, thermal control, safety and reliability in a computationally efficient manner. This allows independent development of submodels at the cell and pack levels.

    7. High-Performance Computing for Alloy Development | netl.doe.gov

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Tomorrow's fossil-fuel based power plants will achieve higher efficiencies by operating at higher pressures and temperatures and under harsher and more corrosive conditions. Unfortunately, conventional metals simply cannot withstand these extreme environments, so advanced alloys must be designed and fabricated to meet the needs of these advanced systems. The properties of metal alloys, which are mixtures of metallic elements,

    8. Global Nuclear Energy Partnership Waste Treatment Baseline

      SciTech Connect (OSTI)

      Dirk Gombert; William Ebert; James Marra; Robert Jubin; John Vienna

      2008-05-01

      The Global Nuclear Energy Partnership program (GNEP) is designed to demonstrate a proliferation-resistant and sustainable integrated nuclear fuel cycle that can be commercialized and used internationally. Alternative stabilization concepts for byproducts and waste streams generated by fuel recycling processes were evaluated and a baseline of waste forms was recommended for the safe disposition of waste streams. Waste forms are recommended based on the demonstrated or expected commercial practicability and technical maturity of the processes needed to make the waste forms, and performance of the waste form materials when disposed. Significant issues remain in developing technologies to process some of the wastes into the recommended waste forms, and a detailed analysis of technology readiness and availability may lead to the choice of a different waste form than what is recommended herein. Evolving regulations could also affect the selection of waste forms.

    9. Baselines for Greenhouse Gas Reductions: Problems, Precedents...

      Open Energy Info (EERE)

      Baseline projection, GHG inventory, Pathways analysis. Resource Type: Publications, Lessons learned/best practices. Website: www.p2pays.orgref2221739.pdf References:...

    10. Tank waste remediation systems technical baseline database

      SciTech Connect (OSTI)

      Porter, P.E.

      1996-10-16

      This document includes a cassette tape that contains Hanford generated data for the Tank Waste Remediation Systems Technical Baseline Database as of October 09, 1996.

    11. LEDSGP/Transportation Toolkit/Key Actions/Create a Baseline ...

      Open Energy Info (EERE)

      a Baseline) Transportation Toolkit: Key Actions for Low-Emission Development in Transportation...

    12. CD-2, Approve Performance Baseline

      Broader source: Energy.gov [DOE]

       Note:  Per 10 CFR 830.206, a major modification of an existing Hazard Category 1, 2 or 3 nuclear facility requires the development of a PDSA and its approval by DOE (10 CFR 830.207). Per DOE-STD...

    13. TWRS technical baseline database manager definition document

      SciTech Connect (OSTI)

      Acree, C.D.

      1997-08-13

      This document serves as a guide for using the TWRS Technical Baseline Database Management Systems Engineering (SE) support tool in performing SE activities for the Tank Waste Remediation System (TWRS). This document will provide a consistent interpretation of the relationships between the TWRS Technical Baseline Database Management software and the present TWRS SE practices. The Database Manager currently utilized is the RDD-1000 System manufactured by the Ascent Logic Corporation. In other documents, the term RDD-1000 may be used interchangeably with TWRS Technical Baseline Database Manager.

    14. Performance Modeling for 3D Visualization in a Heterogeneous Computing

      Office of Scientific and Technical Information (OSTI)

      Environment (Technical Report) | SciTech Connect. The visualization of large, remotely located data sets necessitates the development of a distributed computing pipeline in order to reduce the data, in stages, to a manageable size. The required baseline infrastructure for launching such

    15. Computing Videos

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


    16. Hanford Site technical baseline database. Revision 1

      SciTech Connect (OSTI)

      Porter, P.E.

      1995-01-27

      This report lists the Hanford specific files (Table 1) that make up the Hanford Site Technical Baseline Database. Table 2 includes the delta files that delineate the differences between this revision and revision 0 of the Hanford Site Technical Baseline Database. This information is being managed and maintained on the Hanford RDD-100 System, which uses the capabilities of RDD-100, a systems engineering software system of Ascent Logic Corporation (ALC). This revision of the Hanford Site Technical Baseline Database uses RDD-100 version 3.0.2.2 (see Table 3). Directories reflect those controlled by the Hanford RDD-100 System Administrator. Table 4 provides information regarding the platform. A cassette tape containing the Hanford Site Technical Baseline Database is available.

    17. Integrated Baseline System (IBS) Version 1.03: Models guide

      SciTech Connect (OSTI)

      Not Available

      1993-01-01

      The Integrated Baseline System (IBS), operated by the Federal Emergency Management Agency (FEMA), is a system of computerized tools for emergency planning and analysis. This document is the models guide for the IBS and explains how to use the emergency-related computer models. It provides information for the experienced system user and is the primary reference for the computer modeling software supplied with the system. It is designed for emergency managers and planners, and others familiar with the concepts of computer modeling. Although the IBS manual set covers basic and advanced operations, it is not a complete reference document set. Emergency situation modeling software in the IBS is supported by additional technical documents. Some of the other IBS software is commercial software for which more complete documentation is available. The IBS manuals reference such documentation where necessary.

    18. Advanced Scientific Computing Research

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Discovering, developing, and deploying computational and networking capabilities to analyze, model,...

    19. Development of a computer wellbore simulator for coiled-tube operations

      SciTech Connect (OSTI)

      Gu, H.; Walton, I.C.; Dowell, S.

      1994-12-31

      This paper describes a computer wellbore simulator developed for coiled tubing operations of fill cleanout and unloading of oil and gas wells. The simulator models the transient, multiphase fluid flow and mass transport process that occur in these operations. Unique features of the simulator include a sand bed that may form during fill cleanout in deviated and horizontal wells, particle transport with multiphase compressible fluids, and the transient unloading process of oil and gas wells. The requirements for a computer wellbore simulator for coiled tubing operations are discussed and it is demonstrated that the developed simulator is suitable for modeling these operations. The simulator structure and the incorporation of submodules for gas/liquid two-phase flow, reservoir and choke models, and coiled tubing movement are addressed. Simulation examples are presented to show the sand bed formed in cleanout in a deviated well and the transient unloading results of oil and gas wells. The wellbore simulator developed in this work can assist a field engineer with the design of coiled tubing operations. By using the simulator to predict the pressure, flow rates, sand concentration and bed depth, the engineer will be able to select the coiled tubing, fluid and schedule of an optimum design for particular well and reservoir conditions.

    20. GridPACK Toolkit for Developing Power Grid Simulations on High Performance Computing Platforms

      SciTech Connect (OSTI)

      Palmer, Bruce J.; Perkins, William A.; Glass, Kevin A.; Chen, Yousu; Jin, Shuangshuang; Callahan, Charles D.

      2013-11-30

      This paper describes the GridPACK framework, which is designed to help power grid engineers develop modeling software capable of running on today's high performance computers. The framework contains modules for setting up distributed power grid networks, assigning buses and branches with arbitrary behaviors to the network, creating distributed matrices and vectors, using parallel linear and non-linear solvers to solve algebraic equations, and mapping functionality to create matrices and vectors based on properties of the network. The framework also contains functionality to support I/O and to manage errors.

    1. Development of high performance scientific components for interoperability of computing packages

      SciTech Connect (OSTI)

      Gulabani, Teena Pratap

      2008-12-01

      Three major high performance quantum chemistry computational packages, NWChem, GAMESS and MPQC, have been developed by different research efforts following different design patterns. The goal is to achieve interoperability among these packages by overcoming the challenges caused by the different communication patterns and software design of each of these packages. Chemistry algorithms are difficult and time-consuming to develop; integrating large quantum chemistry packages allows resource sharing and avoids reinvention of the wheel. Creating connections between these incompatible packages is the major motivation of the proposed work. This interoperability is achieved by bringing the benefits of Component Based Software Engineering through a plug-and-play component framework called the Common Component Architecture (CCA). In this thesis, I present a strategy and process used for interfacing two widely used and important computational chemistry methodologies: Quantum Mechanics and Molecular Mechanics. To show the feasibility of the proposed approach, the Tuning and Analysis Utility (TAU) has been coupled with the NWChem code and its CCA components. Results show that the overhead is negligible when compared to the ease and potential of organizing and coping with large-scale software applications.
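
      A toy sketch of the plug-and-play component idea: two independently developed packages are wrapped behind one shared port so a driver can call either without knowing its internals. The interface and class names are invented for illustration and are not the actual CCA, NWChem, GAMESS, or MPQC APIs.

      # Illustrative component interface; names are hypothetical, not real package APIs.
      from abc import ABC, abstractmethod

      class EnergyComponent(ABC):
          """Common port that every wrapped chemistry package must expose."""
          @abstractmethod
          def compute_energy(self, geometry: list[tuple[str, float, float, float]]) -> float:
              ...

      class PackageAWrapper(EnergyComponent):
          def compute_energy(self, geometry):
              # Stand-in for a call into one package's native driver.
              return -1.0 * len(geometry)

      class PackageBWrapper(EnergyComponent):
          def compute_energy(self, geometry):
              # Stand-in for a call into a second, independently written package.
              return -1.1 * len(geometry)

      def driver(component: EnergyComponent) -> float:
          water = [("O", 0.0, 0.0, 0.0), ("H", 0.96, 0.0, 0.0), ("H", -0.24, 0.93, 0.0)]
          return component.compute_energy(water)

      # The driver is indifferent to which component is plugged in.
      print(driver(PackageAWrapper()), driver(PackageBWrapper()))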

    2. Revised SRC-I project baseline. Volume 1

      SciTech Connect (OSTI)

      Not Available

      1984-01-01

      International Coal Refining Company (ICRC), in cooperation with the Commonwealth of Kentucky has contracted with the United States Department of Energy (DOE) to design, build and operate a first-of-its-kind plant demonstrating the economic, environmental, socioeconomic and technical feasibility of the direct coal liquefaction process known as SRC-I. ICRC has made a massive commitment of time and expertise to design processes, plan and formulate policy, schedules, costs and technical drawings for all plant systems. These fully integrated plans comprise the Project Baseline and are the basis for all future detailed engineering, plant construction, operation, and other work set forth in the contract between ICRC and the DOE. Volumes I and II of the accompanying documents constitute the updated Project Baseline for the SRC-I two-stage liquefaction plant. International Coal Refining Company believes this versatile plant design incorporates the most advanced coal liquefaction system available in the synthetic fuels field. SRC-I two-stage liquefaction, as developed by ICRC, is the way of the future in coal liquefaction because of its product slate flexibility, high process thermal efficiency, and low consumption of hydrogen. The SRC-I Project Baseline design also has made important state-of-the-art advances in areas such as environmental control systems. Because of a lack of funding, the DOE has curtailed the total project effort without specifying a definite renewal date. This precludes the development of revised accurate and meaningful schedules and, hence, escalated project costs. ICRC has revised and updated the original Design Baseline to include in the technical documentation all of the approved but previously non-incorporated Category B and C and new Post-Baseline Engineering Change Proposals.

    3. New Set of Computational Tools and Models Expected to Help Enable Rapid Development and Deployment of Carbon Capture Technologies

      Broader source: Energy.gov [DOE]

      An eagerly anticipated suite of 21 computational tools and models to help enable rapid development and deployment of new carbon capture technologies is now available from the Carbon Capture Simulation Initiative.

    4. Solid Waste Program technical baseline description

      SciTech Connect (OSTI)

      Carlson, A.B.

      1994-07-01

      The system engineering approach has been taken to describe the technical baseline under which the Solid Waste Program is currently operating. The document contains a mission analysis, function analysis, system definition, documentation requirements, facility and project bases, and uncertainties facing the program.

    5. Waste management project technical baseline description

      SciTech Connect (OSTI)

      Sederburg, J.P.

      1997-08-13

      A systems engineering approach has been taken to describe the technical baseline under which the Waste Management Project is currently operating. The document contains a mission analysis, function analysis, requirement analysis, interface definitions, alternative analysis, system definition, documentation requirements, implementation definitions, and discussion of uncertainties facing the Project.

    6. South Africa-Danish Government Baseline Workstream | Open Energy...

      Open Energy Info (EERE)

      Baseline Workstream. Name: South Africa-Danish Government Baseline Workstream. Agency/Company/Organization: Danish Government. Partner: Danish Ministry for...

    7. Brazil-Danish Government Baseline Workstream | Open Energy Information

      Open Energy Info (EERE)

      Danish Government Baseline Workstream. Name: Brazil-Danish Government Baseline Workstream. Agency/Company/Organization: Danish Government. Partner: Danish...

    8. Mexico-Danish Government Baseline Workstream | Open Energy Information

      Open Energy Info (EERE)

      Danish Government Baseline Workstream. Name: Mexico-Danish Government Baseline Workstream. Agency/Company/Organization: Danish Government. Partner: Danish...

    9. Indonesia-Danish Government Baseline Workstream | Open Energy...

      Open Energy Info (EERE)

      Indonesia-Danish Government Baseline Workstream. Name: Indonesia-Danish Government Baseline Workstream. Agency/Company/Organization: Danish Government...

    10. India-Danish Government Baseline Workstream | Open Energy Information

      Open Energy Info (EERE)

      Danish Government Baseline Workstream. Name: India-Danish Government Baseline Workstream. Agency/Company/Organization: Danish Government. Partner: Danish...

    11. UNFCCC-Consolidated baseline and monitoring methodology for landfill...

      Open Energy Info (EERE)

      Consolidated baseline and monitoring methodology for landfill gas project activities. Tool Summary. Name: UNFCCC-Consolidated baseline and...

    12. China-Danish Government Baseline Workstream | Open Energy Information

      Open Energy Info (EERE)

      Danish Government Baseline Workstream. Name: China-Danish Government Baseline Workstream. Agency/Company/Organization: Danish Government. Partner: Danish...

    13. Computing Sciences

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      The Computational Research Division conducts research and development in mathematical modeling and simulation, algorithm design, data storage, management and...

    14. Baseline Microstructural Characterization of Outer 3013 Containers

      SciTech Connect (OSTI)

      Zapp, Phillip E.; Dunn, Kerry A

      2005-07-31

      Three DOE Standard 3013 outer storage containers were examined to characterize the microstructure of the type 316L stainless steel material of construction. Two of the containers were closure-welded yielding production-quality outer 3013 containers; the third examined container was not closed. Optical metallography and Knoop microhardness measurements were performed to establish a baseline characterization that will support future destructive examinations of 3013 outer containers in the storage inventory. Metallography revealed the microstructural features typical of this austenitic stainless steel as it is formed and welded. The grains were equiaxed with evident annealing twins. Flow lines were prominent in the forming directions of the cylindrical body and flat lids and bottom caps. No adverse indications were seen. Microhardness values, although widely varying, were consistent with annealed austenitic stainless steel. The data gathered as part of this characterization will be used as a baseline for the destructive examination of 3013 containers removed from the storage inventory.

    15. Energy Intensity Baselining and Tracking Guidance

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      Learn more at betterbuildings.energy.gov. Preface: The U.S. Department of Energy's (DOE's) Better Buildings, Better Plants Program (Better Plants) is a voluntary energy efficiency leadership initiative for U.S. manufacturers. The program encourages companies to commit to reduce the energy intensity of their U.S. manufacturing operations, usually by 25% over a 10-year period. Companies joining Better Plants are recognized by DOE for their
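
      As a rough worked example of the arithmetic behind such guidance (all numbers invented): energy intensity is energy use divided by production output, and progress is tracked against the baseline-year intensity.

      # Hypothetical numbers; illustrates baseline energy-intensity tracking only.
      baseline_energy_mmbtu = 500_000.0     # baseline-year plant energy use
      baseline_output_units = 1_000_000.0   # baseline-year production
      current_energy_mmbtu = 430_000.0
      current_output_units = 1_050_000.0

      baseline_intensity = baseline_energy_mmbtu / baseline_output_units
      current_intensity = current_energy_mmbtu / current_output_units
      improvement_pct = 100.0 * (1.0 - current_intensity / baseline_intensity)

      # A Better Plants-style goal is roughly a 25% intensity reduction over 10 years.
      print(f"Baseline intensity: {baseline_intensity:.3f} MMBtu/unit")
      print(f"Current intensity:  {current_intensity:.3f} MMBtu/unit")
      print(f"Improvement so far: {improvement_pct:.1f}%")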

    16. Module 7 - Integrated Baseline Review and Change Control | Department of

      Energy Savers [EERE]

      This module focuses on integrated baseline reviews (IBR) and change control. It outlines the objective and responsibility of an integrated baseline review. Additionally, this module will discuss the change control process required for implementing earned value

    17. Document Number Q0029500: 4.0 Baseline Risk Assessment Update

      Office of Legacy Management (LM)

      This section updates the human health and ecological risk assessments that were originally presented in the 1998 RI (DOE 1998a). The impacts on the 1998 risk assessments are summarized in Section 2.9. 4.1 Human Health Risk Assessment: Several activities completed since 1998 have contributed to changes in surface water and ground water concentrations. Activities that have impacted, or likely impacted, surface water and ground

    18. Long-Baseline Neutrino Experiment (LBNE) Conceptual Design Report: The LBNE Water Cherenkov Detector, April 13, 2012

      SciTech Connect (OSTI)

      Kettell S. H.; Bishai, M.; Brown, R.; Chen, H.; Diwan, M.; Dolph, J., Geronimo, G.; Gill, R.; Hackenburg, R.; Hahn, R.; Hans, S.; Isvan, Z.; Jaffe, D.; Junnarkar, S.; Kettell, S.H.; Lanni,F.; Li, Y.; Ling, J.; Littenberg, L.; Makowiecki, D.; Marciano, W.; Morse, W.; Parsa, Z.; Radeka, V.; Rescia, S.; Samios, N.; Sharma, R.; Simos, N.; Sondericker, J.; Stewart, J.; Tanaka, H.; Themann, H.; Thorn, C.; Viren, B., White, S.; Worcester, E.; Yeh, M.; Yu, B.; Zhang, C.

      2012-04-13

      Conceptual Design Report (CDR) developed for the Water Cherenkov Detector (WCD) option for the far detector of the Long Baseline Neutrino Experiment (LBNE).

    19. Process Simulation Role in the Development of New Alloys Based on Integrated Computational Material Science and Engineering

      SciTech Connect (OSTI)

      Sabau, Adrian S. [ORNL]; Porter, Wallace D. [ORNL]; Roy, Shibayan [ORNL]; Shyam, Amit [ORNL]

      2014-01-01

      To accelerate the introduction of new materials and components, the development of metal casting processes requires collaboration among different disciplines, as multiple physical phenomena have to be considered simultaneously for process design and for the optimization of mechanical properties. The required models for these physical phenomena, as well as their validation status for metal casting, are reviewed. Data on materials properties, model validation, and the microstructure relevant to materials properties are highlighted. One vehicle for accelerating the development of new materials is combined experimental-computational effort. Integrated computational/experimental practices are reviewed, and strengths and weaknesses are identified with respect to metal casting processes. Specifically, examples are given for the knowledge base established at Oak Ridge National Laboratory and for computer models that predict casting defects and microstructure distribution in aluminum alloy components.

    20. Integrated Baseline System (IBS) Version 2.0: Utilities Guide

      SciTech Connect (OSTI)

      Burford, M.J.; Downing, T.R.; Williams, J.R.; Bower, J.C.

      1994-03-01

      The Integrated Baseline System (IBS) is an emergency management planning and analysis tool being developed under the direction of the US Army Nuclear and Chemical Agency. This Utilities Guide explains how you can use the IBS utility programs to manage and manipulate various kinds of IBS data. These programs include utilities for creating, editing, and displaying maps and other data that are referenced to geographic location. The intended audience for this document is chiefly data managers, but also system managers and some emergency management planners and analysts.

    1. Integrated Baseline System (IBS) Version 1.03: Utilities guide

      SciTech Connect (OSTI)

      Burford, M.J.; Downing, T.R.; Pottier, M.C.; Schrank, E.E.; Williams, J.R.

      1993-01-01

      The Integrated Baseline System (IBS) is an emergency management planning and analysis tool that was developed under the direction of the Federal Emergency Management Agency (FEMA). This Utilities Guide explains how to operate utility programs that are supplied as a part of the IBS. These utility programs are chiefly for managing and manipulating various kinds of IBS data and system administration files. Many of the utilities are for creating, editing, converting, or displaying map data and other data that are related to geographic location.

    2. ABB SCADA/EMS System INEEL Baseline Summary Test Report (November 2004) |

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      This document covers the security evaluation of the "baseline" or "as delivered" system performed in the Idaho National Engineering and Environmental Laboratory (INEEL) SCADA test bed as part of the Critical Infrastructure Test Range Development Program, which is funded by the U.S. Department of Energy, Office of

    3. System design and algorithmic development for computational steering in distributed environments

      SciTech Connect (OSTI)

      Wu, Qishi; Zhu, Mengxia; Gu, Yi; Rao, Nageswara S

      2010-03-01

      Supporting visualization pipelines over wide-area networks is critical to enabling large-scale scientific applications that require visual feedback to interactively steer online computations. We propose a remote computational steering system that employs analytical models to estimate the cost of computing and communication components and optimizes the overall system performance in distributed environments with heterogeneous resources. We formulate and categorize the visualization pipeline configuration problems for maximum frame rate into three classes according to the constraints on node reuse or resource sharing, namely no, contiguous, and arbitrary reuse. We prove all three problems to be NP-complete and present heuristic approaches based on a dynamic programming strategy. The superior performance of the proposed solution is demonstrated with extensive simulation results in comparison with existing algorithms and is further evidenced by experimental results collected on a prototype implementation deployed over the Internet.
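
      A compact sketch of the dynamic-programming flavor of heuristic described: split a linear chain of pipeline stages contiguously across nodes so that the slowest assigned group, which limits the frame rate, is as fast as possible. This is the generic chain-partitioning recurrence with illustrative costs, not the authors' exact formulation.

      # Contiguous mapping of n pipeline stages onto k nodes, minimizing the bottleneck time.
      # stage_cost[i] = per-frame cost of stage i on a reference node (illustrative numbers).
      from functools import lru_cache

      def min_bottleneck(stage_cost: list[float], k: int) -> float:
          n = len(stage_cost)
          prefix = [0.0]
          for c in stage_cost:
              prefix.append(prefix[-1] + c)

          @lru_cache(maxsize=None)
          def best(i: int, nodes: int) -> float:
              """Minimal bottleneck for stages i..n-1 using `nodes` nodes."""
              if nodes == 1:
                  return prefix[n] - prefix[i]
              result = float("inf")
              for j in range(i + 1, n - nodes + 2):   # stages i..j-1 go to one node
                  group = prefix[j] - prefix[i]
                  result = min(result, max(group, best(j, nodes - 1)))
              return result

          return best(0, k)

      costs = [3.0, 1.0, 4.0, 1.0, 5.0, 9.0, 2.0]
      print(min_bottleneck(costs, 3))   # lower bottleneck time -> higher frame rate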

    4. Integrated Baseline System (IBS). Version 1.03, System Management Guide

      SciTech Connect (OSTI)

      Williams, J.R.; Bailey, S.; Bower, J.C.

      1993-01-01

      This IBS System Management Guide explains how to install or upgrade the Integrated Baseline System (IBS) software package. The IBS is an emergency management planning and analysis tool that was developed under the direction of the Federal Emergency Management Agency (FEMA). This guide includes detailed instructions for installing the IBS software package on a Digital Equipment Corporation (DEC) VAX computer from the IBS distribution tapes. The installation instructions include procedures for both first-time installations and upgrades to existing IBS installations. To ensure that the system manager has the background necessary for successful installation of the IBS package, this guide also includes information on IBS computer requirements, software organization, and the generation of IBS distribution tapes. When special utility programs are used during IBS installation and setups, this guide refers you to the IBS Utilities Guide for specific instructions. This guide also refers you to the IBS Data Management Guide for detailed descriptions of some IBS data files and structures. Any special requirements for installation are not documented here but should be included in a set of installation notes that come with the distribution tapes.

    5. Grocery 2009 TSD Miami Baseline | Open Energy Information

      Open Energy Info (EERE)

      Model Name: Grocery 2009 TSD Miami Baseline. Building Type: Food Sales. Model Type: Baseline Model. Target Type: ASHRAE 90.1-2004. Model Year: 2009. IDF file...

    6. Grocery 2009 TSD Chicago Baseline | Open Energy Information

      Open Energy Info (EERE)

      Model Name: Grocery 2009 TSD Chicago Baseline. Building Type: Food Sales. Model Type: Baseline Model. Target Type: ASHRAE 90.1-2004. Model Year: 2009. IDF file...

    7. Proposed Methodology for LEED Baseline Refrigeration Modeling (Presentation)

      SciTech Connect (OSTI)

      Deru, M.

      2011-02-01

      This PowerPoint presentation summarizes a proposed methodology for LEED baseline refrigeration modeling. The presentation discusses why refrigeration modeling is important, the inputs of energy models, resources, reference building model cases, baseline model highlights, example savings calculations and results.

    8. Development and Verification of a Computational Fluid Dynamics Model of a Horizontal-Axis Tidal Current Turbine

      SciTech Connect (OSTI)

      Lawson, M. J.; Li, Y.; Sale, D. C.

      2011-10-01

      This paper describes the development of a computational fluid dynamics (CFD) methodology to simulate the hydrodynamics of horizontal-axis tidal current turbines. Qualitative measures of the CFD solutions were independent of the grid resolution. Conversely, quantitative comparisons of the results indicated that the use of coarse computational grids results in an under prediction of the hydrodynamic forces on the turbine blade in comparison to the forces predicted using more resolved grids. For the turbine operating conditions considered in this study, the effect of the computational timestep on the CFD solution was found to be minimal, and the results from steady and transient simulations were in good agreement. Additionally, the CFD results were compared to corresponding blade element momentum method calculations and reasonable agreement was shown. Nevertheless, we expect that for other turbine operating conditions, where the flow over the blade is separated, transient simulations will be required.

    9. Moving baseline for evaluation of advanced coal-extraction systems

      SciTech Connect (OSTI)

      Bickerton, C.R.; Westerfield, M.D.

      1981-04-15

      This document reports results from the initial effort to establish baseline economic performance comparators for a program whose intent is to define, develop, and demonstrate advanced systems suitable for coal resource extraction beyond the year 2000. Systems used in this study were selected from contemporary coal mining technology and from conservative conjectures of year 2000 technology. The analysis was also based on a seam thickness of 6 ft. Therefore, the results are specific to the study systems and the selected seam thickness. To be more beneficial to the program, the effort should be extended to other seam thicknesses. This document is one of a series which describe systems level requirements for advanced underground coal mining equipment. Five areas of performance are discussed: production cost, miner safety, miner health, environmental impact, and recovery efficiency. The projections for cost and production capability comprise a so-called moving baseline which will be used to assess compliance with the systems requirement for production cost. Separate projections were prepared for room and pillar, longwall, and shortwall technology all operating under comparable sets of mining conditions. This work is part of an effort to define and develop innovative coal extraction systems suitable for the significant resources remaining in the year 2000.

    10. Development and Verification of a Computational Fluid Dynamics Model of a Horizontal-Axis Tidal Current Turbine

      SciTech Connect (OSTI)

      Lawson, Mi. J.; Li, Y.; Sale, D. C.

      2011-01-01

      This paper describes the development of a computational fluid dynamics (CFD) methodology to simulate the hydrodynamics of horizontal-axis tidal current turbines (HATTs). First, an HATT blade was designed using the blade element momentum method in conjunction with a genetic optimization algorithm. Several unstructured computational grids were generated using this blade geometry and steady CFD simulations were used to perform a grid resolution study. Transient simulations were then performed to determine the effect of time-dependent flow phenomena and the size of the computational timestep on the numerical solution. Qualitative measures of the CFD solutions were independent of the grid resolution. Conversely, quantitative comparisons of the results indicated that the use of coarse computational grids results in an under prediction of the hydrodynamic forces on the turbine blade in comparison to the forces predicted using more resolved grids. For the turbine operating conditions considered in this study, the effect of the computational timestep on the CFD solution was found to be minimal, and the results from steady and transient simulations were in good agreement. Additionally, the CFD results were compared to corresponding blade element momentum method calculations and reasonable agreement was shown. Nevertheless, we expect that for other turbine operating conditions, where the flow over the blade is separated, transient simulations will be required.
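
      One standard way to quantify what such a grid resolution study shows is Richardson extrapolation: from a quantity of interest computed on three systematically refined grids, estimate the observed order of convergence and a grid-independent value. The sketch below uses made-up force values and a refinement ratio of 2; it is a generic verification recipe, not the calculation reported in the paper.

      # Richardson-extrapolation sketch with invented data (coarse, medium, fine grids).
      import math

      f_coarse, f_medium, f_fine = 940.0, 985.0, 1000.0   # e.g., blade force in newtons
      r = 2.0                                              # grid refinement ratio

      # Observed order of convergence p and extrapolated (grid-independent) estimate.
      p = math.log(abs(f_medium - f_coarse) / abs(f_fine - f_medium)) / math.log(r)
      f_exact_est = f_fine + (f_fine - f_medium) / (r**p - 1.0)

      print(f"observed order p = {p:.2f}")
      print(f"extrapolated value = {f_exact_est:.1f} N")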

    11. International Nuclear Energy Research Initiative Development of Computational Models for Pyrochemical Electrorefiners of Nuclear Waste Transmutation Systems

      SciTech Connect (OSTI)

      M.F. Simpson; K.-R. Kim

      2010-12-01

      In support of closing the nuclear fuel cycle using non-aqueous separations technology, this project aims to develop computational models of electrorefiners based on fundamental chemical and physical processes. Spent driver fuel from Experimental Breeder Reactor-II (EBR-II) is currently being electrorefined in the Fuel Conditioning Facility (FCF) at Idaho National Laboratory (INL). And Korea Atomic Energy Research Institute (KAERI) is developing electrorefining technology for future application to spent fuel treatment and management in the Republic of Korea (ROK). Electrorefining is a critical component of pyroprocessing, a non-aqueous chemical process which separates spent fuel into four streams: (1) uranium metal, (2) U/TRU metal, (3) metallic high-level waste containing cladding hulls and noble metal fission products, and (4) ceramic high-level waste containing sodium and active metal fission products. Having rigorous yet flexible electrorefiner models will facilitate process optimization and assist in trouble-shooting as necessary. To attain such models, INL/UI has focused on approaches to develop a computationally-light and portable two-dimensional (2D) model, while KAERI/SNU has investigated approaches to develop a computationally intensive three-dimensional (3D) model for detailed and fine-tuned simulation.
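
      For orientation, the first-order relation underlying any electrorefiner mass balance is Faraday's law: the metal deposited at the cathode is proportional to the charge passed. The expression below is the generic textbook form, not the project's 2D or 3D model.

      \[
        m \;=\; \frac{\eta \, M \, I \, t}{z\,F}
      \]
      where $m$ is the deposited mass, $\eta$ the current efficiency, $M$ the molar mass, $I$ the cell current, $t$ the time, $z$ the number of electrons transferred (e.g., $z = 3$ for $\mathrm{U}^{3+}$ reduction), and $F = 96485\ \mathrm{C\,mol^{-1}}$.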

    12. Gated integrator with signal baseline subtraction

      DOE Patents [OSTI]

      Wang, X.

      1996-12-17

      An ultrafast, high precision gated integrator includes an opamp having differential inputs. A signal to be integrated is applied to one of the differential inputs through a first input network, and a signal indicative of the DC offset component of the signal to be integrated is applied to the other of the differential inputs through a second input network. A pair of electronic switches in the first and second input networks define an integrating period when they are closed. The first and second input networks are substantially symmetrically constructed of matched components so that error components introduced by the electronic switches appear symmetrically in both input circuits and, hence, are nullified by the common mode rejection of the integrating opamp. The signal indicative of the DC offset component is provided by a sample and hold circuit actuated as the integrating period begins. The symmetrical configuration of the integrating circuit improves accuracy and speed by balancing out common mode errors, by permitting the use of high speed switching elements and high speed opamps and by permitting the use of a small integrating time constant. The sample and hold circuit substantially eliminates the error caused by the input signal baseline offset during a single integrating window. 5 figs.
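
      In signal-processing terms, the circuit computes the integral of the input minus its held baseline over the gate window. A short numerical analogue of that operation, with sample values and gate timing invented for illustration:

      # Numerical analogue of a gated integrator with baseline subtraction (illustrative only).
      import numpy as np

      fs = 1.0e6                                   # sample rate, Hz
      t = np.arange(0, 200e-6, 1.0 / fs)           # 200 microseconds of data
      baseline_offset = 0.05                        # DC offset, volts
      pulse = 0.5 * np.exp(-((t - 100e-6) / 10e-6) ** 2)   # Gaussian-like pulse
      signal = baseline_offset + pulse

      # Sample-and-hold: capture the baseline just before the gate opens.
      gate = (t >= 60e-6) & (t <= 140e-6)
      baseline = signal[t < 60e-6].mean()

      # Gated integral of (signal - baseline), in volt-seconds.
      integral = np.trapz(signal[gate] - baseline, t[gate])
      print(f"gated, baseline-subtracted integral = {integral:.3e} V*s")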

    13. LTC vacuum blasting machine (concrete): Baseline report

      SciTech Connect (OSTI)

      1997-07-31

      The LTC shot blast technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The LTC 1073 Vacuum Blasting Machine uses a high-capacity, direct-pressure blasting system which incorporates a continuous feed for the blast media. The blast media cleans the surface within the contained brush area of the blast. It incorporates a vacuum system which removes dust and debris from the surface as it is blasted. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. Dust exposure during maintenance activities was minimal, but due to mechanical difficulties dust monitoring could not be conducted during operation. Noise exposure was significant. Further testing for each of these exposures is recommended because the outdoor environment where the testing demonstration took place may cause the results to be inaccurate; it is feasible that the dust and noise levels will be higher in an enclosed environment. In addition, other safety and health issues found were ergonomics, heat stress, tripping hazards, electrical hazards, lockout/tagout, and arm-hand vibration.

    14. LTC vacuum blasting machine (metal): Baseline report

      SciTech Connect (OSTI)

      1997-07-31

      The LTC coating removal technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The LTC coating removal system consisted of several hand tools, a Roto Peen scaler, and a needlegun. They are designed to remove coatings from steel, concrete, brick, and wood. These hand tools are used with the LTC PTC-6 vacuum system to capture dust and debris as removal of the coating takes place. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. The dust exposure was minimal but noise exposure was significant. Further testing for each exposure is recommended because of the environment where the testing demonstration took place. It is feasible that the dust and noise levels will be higher in an enclosed operating environment of different construction. In addition, other areas of concern found were arm-hand vibration, whole-body vibration, ergonomics, heat stress, tripping hazards, electrical hazards, machine guarding, and lockout/tagout.

    15. Pentek metal coating removal system: Baseline report

      SciTech Connect (OSTI)

      1997-07-31

      The Pentek coating removal technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The Pentek coating removal system consisted of the ROTO-PEEN Scaler, CORNER-CUTTER®, and VAC-PAC®. They are designed to remove coatings from steel, concrete, brick, and wood. The Scaler uses 3M Roto Peen tungsten carbide cutters while the CORNER-CUTTER® uses solid needles for descaling activities. These hand tools are used with the VAC-PAC® vacuum system to capture dust and debris as removal of the coating takes place. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. Dust exposure was minimal, but noise exposure was significant. Further testing for each exposure is recommended because of the environment where the testing demonstration took place. It is feasible that the dust and noise levels will be higher in an enclosed operating environment of different construction. In addition, other areas of concern found were arm-hand vibration, whole-body vibration, ergonomics, heat stress, tripping hazards, electrical hazards, machine guarding, and lockout/tagout.

    16. Baseline air quality study at Fermilab

      SciTech Connect (OSTI)

      Dave, M.J.; Charboneau, R.

      1980-10-01

      Air quality and meteorological data collected at Fermi National Accelerator Laboratory are presented. The data represent baseline values for the pre-construction phase of a proposed coal-gasification test facility. Air quality data were characterized through continuous monitoring of gaseous pollutants, collection of meteorological data, data acquisition and reduction, and collection and analysis of discrete atmospheric samples. Seven air quality parameters were monitored and recorded on a continuous real-time basis: sulfur dioxide, ozone, total hydrocarbons, nonreactive hydrocarbons, nitric oxide, nitrogen oxides, and carbon monoxide. A 20.9-m tower was erected near Argonne's mobile air monitoring laboratory, which was located immediately downwind of the proposed facility. The tower was instrumented at three levels to collect continuous meteorological data. Wind speed was monitored at three levels; wind direction, horizontal and vertical, at the top level; ambient temperature at the top level; and differential temperature between all three levels. All continuously monitored parameters were digitized and recorded on magnetic tape. Appropriate software was prepared to reduce the data. Statistical summaries, graphical displays, and correlation studies are also presented.

    17. Gated integrator with signal baseline subtraction

      DOE Patents [OSTI]

      Wang, Xucheng (Lisle, IL)

      1996-01-01

      An ultrafast, high precision gated integrator includes an opamp having differential inputs. A signal to be integrated is applied to one of the differential inputs through a first input network, and a signal indicative of the DC offset component of the signal to be integrated is applied to the other of the differential inputs through a second input network. A pair of electronic switches in the first and second input networks define an integrating period when they are closed. The first and second input networks are substantially symmetrically constructed of matched components so that error components introduced by the electronic switches appear symmetrically in both input circuits and, hence, are nullified by the common mode rejection of the integrating opamp. The signal indicative of the DC offset component is provided by a sample and hold circuit actuated as the integrating period begins. The symmetrical configuration of the integrating circuit improves accuracy and speed by balancing out common mode errors, by permitting the use of high speed switching elements and high speed opamps and by permitting the use of a small integrating time constant. The sample and hold circuit substantially eliminates the error caused by the input signal baseline offset during a single integrating window.

    18. MP Salsa: a finite element computer program for reacting flow problems. Part 1--theoretical development

      SciTech Connect (OSTI)

      Shadid, J.N.; Moffat, H.K.; Hutchinson, S.A.; Hennigan, G.L.; Devine, K.D.; Salinger, A.G.

      1996-05-01

      The theoretical background for the finite element computer program, MPSalsa, is presented in detail. MPSalsa is designed to solve laminar, low Mach number, two- or three-dimensional incompressible and variable density reacting fluid flows on massively parallel computers, using a Petrov-Galerkin finite element formulation. The code has the capability to solve coupled fluid flow, heat transport, multicomponent species transport, and finite-rate chemical reactions, and to solve coupled multiple Poisson or advection-diffusion-reaction equations. The program employs the CHEMKIN library to provide a rigorous treatment of multicomponent ideal gas kinetics and transport. Chemical reactions occurring in the gas phase and on surfaces are treated by calls to CHEMKIN and SURFACE CHEMKIN, respectively. The code employs unstructured meshes, using the EXODUS II finite element database suite of programs for its input and output files. MPSalsa solves both transient and steady flows by using fully implicit time integration, an inexact Newton method, and iterative solvers based on preconditioned Krylov methods as implemented in the Aztec solver library.
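
      For reference, a generic low Mach number species-transport equation of the advection-diffusion-reaction type that such a code solves, written in standard form (not necessarily MPSalsa's exact formulation):

      \[
        \frac{\partial (\rho Y_k)}{\partial t}
        + \nabla \cdot (\rho \mathbf{u}\, Y_k)
        = \nabla \cdot (\rho D_k \nabla Y_k) + \dot{\omega}_k ,
        \qquad k = 1,\dots,N_{\mathrm{species}},
      \]
      where $\rho$ is the mixture density, $\mathbf{u}$ the velocity, $Y_k$ and $D_k$ the mass fraction and diffusivity of species $k$, and $\dot{\omega}_k$ the net chemical production rate.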

    19. NEW DEVELOPMENTS ON INVERSE POLYGON MAPPING TO CALCULATE GRAVITATIONAL LENSING MAGNIFICATION MAPS: OPTIMIZED COMPUTATIONS

      SciTech Connect (OSTI)

      Mediavilla, E.; Lopez, P.; Gonzalez-Morcillo, C.; Jimenez-Vicente, J.

      2011-11-01

      We derive an exact solution (in the form of a series expansion) to compute gravitational lensing magnification maps. It is based on the backward gravitational lens mapping of a partition of the image plane in polygonal cells (inverse polygon mapping, IPM), not including critical points (except perhaps at the cell boundaries). The zeroth-order term of the series expansion leads to the method described by Mediavilla et al. The first-order term is used to study the error induced by the truncation of the series at zeroth order, explaining the high accuracy of the IPM even at this low order of approximation. Interpreting the Inverse Ray Shooting (IRS) method in terms of IPM, we explain the previously reported N^{-3/4} dependence of the IRS error with the number of collected rays per pixel. Cells intersected by critical curves (critical cells) transform to non-simply connected regions with topological pathologies like auto-overlapping or non-preservation of the boundary under the transformation. To define a non-critical partition, we use a linear approximation of the critical curve to divide each critical cell into two non-critical subcells. The optimal choice of the cell size depends basically on the curvature of the critical curves. For typical applications in which the pixel of the magnification map is a small fraction of the Einstein radius, a one-to-one relationship between the cell and pixel sizes in the absence of lensing guarantees both the consistence of the method and a very high accuracy. This prescription is simple but very conservative. We show that substantially larger cells can be used to obtain magnification maps with huge savings in computation time.
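
      For context, the mapping being inverted is the standard lens equation, and the quoted error scaling refers to how the noise in a magnification pixel falls with the number of rays collected. These are standard relations; the series expansion itself is developed in the paper.

      \[
        \mathbf{y} = \mathbf{x} - \boldsymbol{\alpha}(\mathbf{x}), \qquad
        \mu \simeq \frac{N_{\mathrm{rays}}}{\bar{N}_{\mathrm{rays}}}, \qquad
        \frac{\sigma_\mu}{\mu} \propto N^{-3/4}\ \text{(reported IRS scaling)},
      \]
      where $\mathbf{x}$ and $\mathbf{y}$ are image- and source-plane positions, $\boldsymbol{\alpha}$ is the deflection angle, and $\mu$ is the magnification estimated from the ray counts per source-plane pixel.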

    20. Sandia National Laboratories, California proposed CREATE facility environmental baseline survey.

      SciTech Connect (OSTI)

      Catechis, Christopher Spyros

      2013-10-01

      Sandia National Laboratories, Environmental Programs completed an environmental baseline survey (EBS) of 12.6 acres located at Sandia National Laboratories/California (SNL/CA) in support of the proposed Collaboration in Research and Engineering for Advanced Technology and Education (CREATE) Facility. The survey area comprises several parcels of land within SNL/CA, County of Alameda, California, and is located within T 3S, R 2E, Section 13. The purpose of this EBS is to document the nature, magnitude, and extent of any environmental contamination of the property; identify potential environmental contamination liabilities associated with the property; develop sufficient information to assess the health and safety risks; and ensure adequate protection for human health and the environment related to a specific property.

    1. NREL: Climate Neutral Research Campuses - Determine Baseline Energy

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Consumption. To create a climate action plan for your research campus, begin by determining current energy consumption and the resulting greenhouse gas emissions. You can then break down emissions by sector. It is important to understand the following at the beginning. The Importance of a Baseline: "The baseline inventory also provides a common data set for establishing benchmarks and priorities during the strategic planning stage and a means for

    2. ENERGY STAR Portfolio Manager Baseline Year Instructions

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      Baseline Year" Time frame Select "Multiple Properties" Using filters, choose properties to include in report Check box to Select all filtered properties Select these reporting items for the template Generate a new report using the template you created Once the report has been generated, download it as an Excel file Open downloaded "Baseline Year" report, select all and copy In report spreadsheet, choose the "Baseline

    3. Chile-Danish Government Baseline Workstream | Open Energy Information

      Open Energy Info (EERE)

      Kenya, Mexico, South Africa, Thailand and Vietnam), to share practices on setting national greenhouse gas emissions baseline scenarios. The aim of the workstream is to...

    4. EA-1943: Long Baseline Neutrino Facility/Deep Underground Neutrino...

      Broader source: Energy.gov (indexed) [DOE]

      May 27, 2015. EA-1943: Draft Environmental Assessment, Long Baseline Neutrino Facility/Deep Underground Neutrino Experiment (LBNF/DUNE) at Fermilab, Batavia, Illinois and the...

    5. Updates to the International Linear Collider Damping Rings Baseline...

      Office of Scientific and Technical Information (OSTI)

      Updates to the International Linear Collider Damping Rings Baseline Design.

    6. EA-1943: Construction and Operation of the Long Baseline Neutrino...

      Office of Environmental Management (EM)

      Neutrino Experiment at Fermilab, Batavia, Illinois, and Sanford Underground Research Facility, Lead, South Dakota.

    7. Cost and Performance Comparison Baseline for Fossil Energy Power...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      blocks together into a new, revolutionary concept for future coal-based power and energy production. Objective: To establish baseline performance and cost estimates for today's...

    8. South Africa - Greenhouse Gas Emission Baselines and Reduction...

      Open Energy Info (EERE)

      from Buildings. Agency/Company/Organization: United Nations Environment Programme. Sector: Energy. Focus Area: Buildings. Topics: Baseline projection, GHG inventory, Pathways analysis,...

    9. Mexico - Greenhouse Gas Emissions Baselines and Reduction Potentials...

      Open Energy Info (EERE)

      from Buildings. Agency/Company/Organization: United Nations Environment Programme. Sector: Energy. Focus Area: Buildings. Topics: Baseline projection, GHG inventory, Pathways analysis,...

    10. NETL - Bituminous Baseline Performance and Cost Interactive Tool...

      Open Energy Info (EERE)

      from the Cost and Performance Baseline for Fossil Energy Plants - Bituminous Coal and Natural Gas to Electricity report. The tool provides an interactive summary of the full...

    11. Sandia Energy - Scaled Wind Farm Technology Facility Baselining...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Scaled Wind Farm Technology Facility Baselining Project Accelerates Work

    12. Development of Computational Tools for Metabolic Model Curation, Flux Elucidation and Strain Design

      SciTech Connect (OSTI)

      Maranas, Costas D

      2012-05-21

      An overarching goal of the Department of Energy's mission is the efficient deployment and engineering of microbial and plant systems to enable biomass conversion in pursuit of high energy density liquid biofuels. This has spurred the pace at which new organisms are sequenced and annotated. This torrent of genomic information has opened the door to understanding metabolism not just in skeletal pathways and a handful of microorganisms but for truly genome-scale reconstructions derived for hundreds of microbes and plants. Understanding and redirecting metabolism is crucial because metabolic fluxes are unique descriptors of cellular physiology that directly assess the current cellular state and quantify the effect of genetic engineering interventions. At the same time, however, trying to keep pace with the rate of genomic data generation has ushered in a number of modeling and computational challenges related to (i) the automated assembly, testing and correction of genome-scale metabolic models, (ii) metabolic flux elucidation using labeled isotopes, and (iii) comprehensive identification of engineering interventions leading to the desired metabolism redirection.
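
      A minimal example of the kind of computation involved in flux elucidation: flux balance analysis poses a linear program that maximizes an objective flux subject to steady-state mass balance S v = 0 and flux bounds. The toy network below is invented for illustration and is not taken from the funded tools.

      # Toy flux balance analysis: maximize v3 subject to S v = 0 and bounds.
      import numpy as np
      from scipy.optimize import linprog

      # Metabolites (rows) x reactions (columns): uptake, conversion, "biomass" drain.
      S = np.array([[ 1.0, -1.0,  0.0],
                    [ 0.0,  1.0, -1.0]])
      bounds = [(0.0, 10.0),   # v1: substrate uptake limited to 10
                (0.0, None),   # v2: internal conversion
                (0.0, None)]   # v3: biomass/product flux (objective)

      # linprog minimizes, so negate the objective to maximize v3.
      c = np.array([0.0, 0.0, -1.0])
      res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds, method="highs")
      print("optimal fluxes:", res.x)   # expect all fluxes pinned at the uptake limit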

    13. Development of Computational Approaches for Simulation and Advanced Controls for Hybrid Combustion-Gasification Chemical Looping

      SciTech Connect (OSTI)

      Joshi, Abhinaya; Lou, Xinsheng; Neuschaefer, Carl; Chaudry, Majid; Quinn, Joseph

      2012-07-31

      This document provides the results of the project through September 2009. The Phase I project has recently been extended from September 2009 to March 2011. The project extension will begin work on Chemical Looping (CL) Prototype modeling and advanced control design exploration in preparation for a scale-up phase. The results to date include: successful development of dual loop chemical looping process models and dynamic simulation software tools, development and test of several advanced control concepts and applications for Chemical Looping transport control and investigation of several sensor concepts and establishment of two feasible sensor candidates recommended for further prototype development and controls integration. There are three sections in this summary and conclusions. Section 1 presents the project scope and objectives. Section 2 highlights the detailed accomplishments by project task area. Section 3 provides conclusions to date and recommendations for future work.

    14. Electrochemistry Diagnostics of Baseline and New Materials | Department of

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      Energy. 2012 DOE Hydrogen and Fuel Cells Program and Vehicle Technologies Program Annual Merit Review and Peer Evaluation Meeting presentation (es033_kostecki_2012_o.pdf). Related documents: Electrochemistry Diagnostics of Baseline and New Materials; Overview of Applied Battery Research

    15. Electrochemistry Diagnostics of Baseline and New Materials | Department of

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      Energy. 2011 DOE Hydrogen and Fuel Cells Program and Vehicle Technologies Program Annual Merit Review and Peer Evaluation presentation (es033_kostecki_2011_p.pdf). Related documents: Electrochemistry Diagnostics of Baseline and New Materials; Interfacial Processes - Diagnostics

    16. Long-Baseline Neutrino Experiment (LBNE) Water Cherenkov Detector Basis of Estimate Forms and Backup Documentation, LBNE Far Site Internal Review (December 6-9, 2011)

      SciTech Connect (OSTI)

      Stewart J.; Diwan, M.; Dolph, J.; Novakova, P.; Sharma, R.; Stewart, J.; Viren, B.; Russo, T.; Kaducak, M.; Mantsch, P.; Paulos, B.; Feyzi, F.; Sullivan, G.; Bionta, R.; Fowler, J.; Warner, D.; Bahowick, S.; Van Berg, R.; Kearns, E.; Hazen, E.; Sinnis, G.; Sanchez, M.

      2011-12-09

      Basis of Estimate (BOE) forms and backup documentation developed for the Water Cherenkov Detector (WCD) option for the far detector of the Long Baseline Neutrino Experiment (LBNE)

    17. Long-Baseline Neutrino Experiment (LBNE) Water Cherenkov Detector Schedule and Cost Books, LBNE Far Site Internal Review (December 6-9, 2011)

      SciTech Connect (OSTI)

      Stewart J.; Diwan, M.; Dolph, J.; Novakova, P.; Sharma, R.; Stewart, J.; Viren, B.; Russo, T.; Kaducak, M.; Mantsch, P.; Paulos, B.; Feyzi, F.; Sullivan, G.; Bionta, R.; Fowler, J.; Warner, D.; Bahowick, S.; Van Berg, R.; Kearns, E.; Hazen, E.; Sinnis, G.; Sanchez, M.

      2011-12-09

      Schedule and Cost Books developed for the Water Cherenkov Detector (WCD) option for the far detector of the Long Baseline Neutrino Experiment (LBNE)

    18. Development of computer program ENMASK for prediction of residual environmental masking-noise spectra, from any three independent environmental parameters

      SciTech Connect (OSTI)

      Chang, Y.-S.; Liebich, R. E.; Chun, K. C.

      2000-03-31

      Residual environmental sound can mask intrusive (unwanted) sound. It is a factor that can affect noise impacts and must be considered both in noise-impact studies and in noise-mitigation designs. Models for quantitative prediction of sensation level (audibility) and psychological effects of intrusive noise require an input with 1/3 octave-band spectral resolution of environmental masking noise. However, the majority of published residual environmental masking-noise data are given with either octave-band frequency resolution or only single A-weighted decibel values. A model has been developed that enables estimation of 1/3 octave-band residual environmental masking-noise spectra and relates certain environmental parameters to A-weighted sound level. This model provides a correlation among three environmental conditions: measured residual A-weighted sound-pressure level, proximity to a major roadway, and population density. Cited field-study data were used to compute the most probable 1/3 octave-band sound-pressure spectrum corresponding to any selected one of these three inputs. In turn, such spectra can be used as an input to models for prediction of noise impacts. This paper discusses specific algorithms included in the newly developed computer program ENMASK. In addition, the relative audibility of the environmental masking-noise spectra at different A-weighted sound levels is discussed, which is determined by using the methodology of program ENAUDIBL.
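
      A small helper illustrating one piece of machinery such a model relies on: converting unweighted band levels to an overall A-weighted level with the standard IEC 61672 A-weighting curve. The band levels below are invented; the ENMASK regression itself is not reproduced here.

      # A-weighting of illustrative 1/3 octave-band levels (IEC 61672 curve).
      import math

      def a_weight_db(f_hz: float) -> float:
          """Standard A-weighting correction, in dB, at frequency f_hz."""
          f2 = f_hz ** 2
          ra = (12194.0**2 * f2**2) / (
              (f2 + 20.6**2)
              * math.sqrt((f2 + 107.7**2) * (f2 + 737.9**2))
              * (f2 + 12194.0**2)
          )
          return 20.0 * math.log10(ra) + 2.00

      # Hypothetical unweighted band levels (dB) at a few 1/3 octave centre frequencies.
      bands = {125: 52.0, 250: 50.0, 500: 47.0, 1000: 45.0, 2000: 41.0, 4000: 36.0}

      # Apply the weighting per band and sum the bands on an energy basis.
      total = 10.0 * math.log10(sum(10.0 ** ((lvl + a_weight_db(f)) / 10.0)
                                    for f, lvl in bands.items()))
      print(f"overall A-weighted level = {total:.1f} dBA")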

    19. Scientific Opportunities with the Long-Baseline Neutrino Experiment

      SciTech Connect (OSTI)

      Adams, C.; et al.,

      2013-07-28

      In this document, we describe the wealth of science opportunities and capabilities of LBNE, the Long-Baseline Neutrino Experiment. LBNE has been developed to provide a unique and compelling program for the exploration of key questions at the forefront of particle physics. Chief among the discovery opportunities are observation of CP symmetry violation in neutrino mixing, resolution of the neutrino mass hierarchy, determination of maximal or near-maximal mixing in neutrinos, searches for nucleon decay signatures, and detailed studies of neutrino bursts from galactic supernovae. To fulfill these and other goals as a world-class facility, LBNE is conceived around four central components: (1) a new, intense wide-band neutrino source at Fermilab, (2) a fine-grained `near' neutrino detector just downstream of the source, (3) the Sanford Underground Research Facility (SURF) in Lead, South Dakota at an optimal distance (~1300 km) from the neutrino source, and (4) a massive liquid argon time-projection chamber (LArTPC) deployed there as a 'far' detector. The facilities envisioned are expected to enable many other science opportunities due to the high event rates and excellent detector resolution from beam neutrinos in the near detector and atmospheric neutrinos in the far detector. This is a mature, well developed, world class experiment whose relevance, importance, and probability of unearthing critical and exciting physics has increased with time.

    20. Baseline design/economics for advanced Fischer-Tropsch technology. Quarterly report, January--March 1992

      SciTech Connect (OSTI)

      Not Available

      1992-09-01

      The objectives of the study are to: Develop a baseline design for indirect liquefaction using advanced Fischer-Tropsch (F-T) technology. Prepare the capital and operating costs for the baseline design. Develop a process flow sheet simulation (PFS) model. This report summarizes the activities completed during the period December 23, 1991 through March 15, 1992. In Task 1, Baseline Design and Alternates, the following activities related to the tradeoff studies were completed: approach and basis; oxygen purity; F-T reactor pressure; wax yield; autothermal reformer; hydrocarbons (C{sub 3}/C{sub 4}s) recovery; and hydrogen recovery. In Task 3, Engineering Design Criteria, activities were initiated to support the process tradeoff studies in Task 1 and to develop the environmental strategy for the Illinois site. The work completed to date consists of the development of the F-T reactor yield correlation from the Mobil data and a brief review of the environmental strategy prepared for the same site in the direct liquefaction baseline study. Some work has also been done in establishing site-related criteria, in establishing the maximum vessel diameter for train sizing and in coping with the low H{sub 2}/CO ratio from the Shell gasifier. In Task 7, Project Management and Administration, the following activities were completed: the subcontract agreement between Amoco and Bechtel was negotiated; a first technical progress meeting was held at the Bechtel office in February; and the final Project Management Plan was approved by PETC and issued in March 1992.

    1. Notice of Intent to Revise DOE G 413.3-5A, Performance Baseline Guide

      Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

      2016-02-03

      The proposed revision to this Department of Energy Guide focuses on updating the current guide with the latest terminology and references regarding the Performance Baseline Development process. This update also incorporates the latest Secretarial memoranda on project management issued since the last update to DOE O 413.3B, Program and Project Management for the Acquisition of Capital Assets.

    2. Biodiversity Research Institute Mid-Atlantic Baseline Study Webinar

      Broader source: Energy.gov [DOE]

      Carried out by the Biodiversity Research Institute (BRI) and funded by the U.S. Department of Energy Wind and Water Power Technology Office and other partners, the goal of the Mid-Atlantic Baseline...

    3. Cost and Performance Baseline for Fossil Energy Plants Volume...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Cost and Performance Baseline for Coal-to-SNG and Ammonia (Volume 2), www.netl.doe.gov

    4. Renewable Diesel from Algal Lipids: An Integrated Baseline for...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      ... Original RA modeling output: mean annual biofuel production (Lha-year) under current ... of unit farms required to meet the 5 BG biofuel production target under: (a) the baseline ...

    5. Annual Technology Baseline (Including Supporting Data); NREL (National

      Office of Scientific and Technical Information (OSTI)

      Renewable Energy Laboratory) (Conference) | SciTech Connect. Consistent cost and performance data for various electricity generation technologies can be difficult to find and may change frequently for certain

    7. Fort Irwin Integrated Resource Assessment. Volume 2, Baseline detail

      SciTech Connect (OSTI)

      Richman, E.E.; Keller, J.M.; Dittmer, A.L.; Hadley, D.L.

      1994-01-01

      This report documents the assessment of baseline energy use at Fort Irwin, a US Army Forces Command facility near Barstow, California. It is a companion report to Volume 1, Executive Summary, and Volume 3, Integrated Resource Assessment. The US Army Forces Command (FORSCOM) has tasked the US Department of Energy (DOE) Federal Energy Management Program (FEMP), supported by the Pacific Northwest Laboratory (PNL), to identify, evaluate, and assist in acquiring all cost-effective energy projects at Fort Irwin. This is part of a model program that PNL has designed to support energy-use decisions in the federal sector. This program (1) identifies and evaluates all cost-effective energy projects; (2) develops a schedule at each installation for project acquisition considering project type, size, timing, and capital requirements, as well as energy and dollar savings; and (3) targets 100% of the financing required to implement energy efficiency projects. PNL applied this model program to Fort Irwin. This analysis examines the characteristics of electric, propane gas, and vehicle fuel use for a typical operating year. It records energy-use intensities for the facilities at Fort Irwin by building type and energy end use. It also breaks down building energy consumption by fuel type, energy end use, and building type. A complete energy consumption reconciliation is presented that accounts for all energy use among buildings, utilities, and applicable losses.

    8. Baseline and Target Values for PV Forecasts: Toward Improved Solar Power Forecasting: Preprint

      SciTech Connect (OSTI)

      Zhang, Jie; Hodge, Bri-Mathias; Lu, Siyuan; Hamann, Hendrik F.; Lehman, Brad; Simmons, Joseph; Campos, Edwin; Banunarayanan, Venkat

      2015-08-05

      Accurate solar power forecasting allows utilities to get the most out of the solar resources on their systems. To truly measure the improvements that any new solar forecasting methods can provide, it is important to first develop (or determine) baseline and target solar forecasting at different spatial and temporal scales. This paper aims to develop baseline and target values for solar forecasting metrics. These were informed by close collaboration with utility and independent system operator partners. The baseline values are established based on state-of-the-art numerical weather prediction models and persistence models. The target values are determined based on the reduction in the amount of reserves that must be held to accommodate the uncertainty of solar power output.
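
      A minimal sketch of the kind of persistence baseline mentioned above, using invented hourly PV output values; the paper's actual baselines also draw on numerical weather prediction models and more refined metrics.

      ```python
      import numpy as np

      def persistence_forecast(observed, horizon):
          """Persistence baseline: the forecast for time t + horizon is the value observed at t."""
          return observed[:-horizon]

      def rmse(forecast, actual):
          return float(np.sqrt(np.mean((forecast - actual) ** 2)))

      # Hypothetical hourly PV output (MW), for illustration only
      pv = np.array([0.0, 0.2, 1.1, 2.4, 3.0, 2.8, 1.9, 0.7, 0.1])
      horizon = 1
      baseline = persistence_forecast(pv, horizon)
      print("1-hour persistence RMSE (MW):", rmse(baseline, pv[horizon:]))
      ```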

    9. Dixie Valley Engineered Geothermal System Exploration Methodology Project, Baseline Conceptual Model Report

      DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

      Iovenitti, Joe

      2013-05-15

      The Engineered Geothermal System (EGS) Exploration Methodology Project is developing an exploration approach for EGS through the integration of geoscientific data. The Project chose the Dixie Valley Geothermal System in Nevada as a field laboratory site for methodology calibration purposes because, in the public domain, it is a highly characterized geothermal system in the Basin and Range with a considerable amount of geoscience and, most importantly, well data. This Baseline Conceptual Model report summarizes the results of the first three project tasks: (1) collect and assess the existing public domain geoscience data, (2) design and populate a GIS database, and (3) develop a baseline (existing data) geothermal conceptual model, evaluate geostatistical relationships, and generate baseline, coupled EGS favorability/trust maps from +1km above sea level (asl) to -4km asl for the Calibration Area (Dixie Valley Geothermal Wellfield) to identify EGS drilling targets at a scale of 5km x 5km. It presents (1) an assessment of the readily available public domain data and some proprietary data provided by Terra-Gen Power, LLC, (2) a re-interpretation of these data as required, (3) an exploratory geostatistical data analysis, (4) the baseline geothermal conceptual model, and (5) the EGS favorability/trust mapping. The conceptual model presented applies to both the hydrothermal system and EGS in the Dixie Valley region.
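
      The report's favorability/trust mapping method is not reproduced here, but a generic weighted overlay of normalized geoscience layers conveys the basic idea; the layers, grid, and weights below are purely hypothetical.

      ```python
      import numpy as np

      # Hypothetical normalized (0-1) geoscience layers on a coarse grid, for illustration only
      temperature = np.array([[0.9, 0.7], [0.6, 0.4]])
      permeability_proxy = np.array([[0.5, 0.8], [0.3, 0.6]])
      stress = np.array([[0.7, 0.6], [0.8, 0.2]])

      # Assumed weights for a simple weighted-overlay favorability map
      weights = {"temperature": 0.5, "permeability": 0.3, "stress": 0.2}

      favorability = (weights["temperature"] * temperature
                      + weights["permeability"] * permeability_proxy
                      + weights["stress"] * stress)
      print(favorability)  # higher values suggest more favorable cells
      ```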


    11. India's baseline plan for nuclear energy self-sufficiency.

      SciTech Connect (OSTI)

      Bucher, R .G.; Nuclear Engineering Division

      2009-01-01

      India's nuclear energy strategy has traditionally strived for energy self-sufficiency, driven largely by necessity following trade restrictions imposed by the Nuclear Suppliers Group (NSG) following India's 'peaceful nuclear explosion' of 1974. On September 6, 2008, the NSG agreed to create an exception opening nuclear trade with India, which may create opportunities for India to modify its baseline strategy. The purpose of this document is to describe India's 'baseline plan,' which was developed under constrained trade conditions, as a basis for understanding changes in India's path as a result of the opening of nuclear commerce. Note that this treatise is based upon publicly available information. No attempt is made to judge whether India can meet specified goals either in scope or schedule. In fact, the reader is warned a priori that India's delivery of stated goals has often fallen short or taken a significantly longer period to accomplish. It has been evident since the early days of nuclear power that India's natural resources would determine the direction of its civil nuclear power program. Its modest uranium but vast thorium reserves dictated that the country's primary objective would be thorium utilization. Estimates of India's natural deposits vary appreciably, but its uranium reserves are known to be extremely limited, totaling approximately 80,000 tons, on the order of 1% of the world's deposits; and nominally one-third of this ore is of very low uranium concentration. However, India's roughly 300,000 tons of thorium reserves account for approximately 30% of the world's total. Confronted with this reality, the future of India's nuclear power industry is strongly dependent on the development of a thorium-based nuclear fuel cycle as the only way to ensure a stable, sustainable, and autonomous program. The path to India's nuclear energy self-sufficiency was first outlined in a seminal paper by Drs. H. J. Bhabha and N. B. Prasad presented at the Second United Nations Conference on the Peaceful Uses of Atomic Energy in 1958. The paper described a three-stage plan for a sustainable nuclear energy program consistent with India's limited uranium but abundant thorium natural resources. In the first stage, natural uranium would be used to fuel graphite or heavy water moderated reactors. Plutonium extracted from the spent fuel of these thermal reactors would drive fast reactors in the second stage that would contain thorium blankets for breeding uranium-233 (U-233). In the final stage, this U-233 would fuel thorium burning reactors that would breed and fission U-233 in situ. This three-stage blueprint still reigns as the core of India's civil nuclear power program. India's progress in the development of nuclear power, however, has been impacted by its isolation from the international nuclear community for its development of nuclear weapons and consequent refusal to sign the Nuclear Nonproliferation Treaty (NPT). Initially, India was engaged in numerous cooperative research programs with foreign countries; for example, under the 'Atoms for Peace' program, India acquired the Cirus reactor, a 40 MWt research reactor from Canada moderated with heavy water from the United States. India was also actively engaged in negotiations for the NPT. But, on May 18, 1974, India conducted a 'peaceful nuclear explosion' at Pokharan using plutonium produced by the Cirus reactor, abruptly ending the era of international collaboration.
      India then refused to sign the NPT, which it viewed as discriminatory since it would be required to join as a non-nuclear weapons state. As a result of India's actions, the Nuclear Suppliers Group (NSG) was created in 1975 to establish guidelines 'to apply to nuclear transfers for peaceful purposes to help ensure that such transfers would not be diverted to unsafeguarded nuclear fuel cycle or nuclear explosive activities.' These nuclear export controls have forced India to be largely self-sufficient in all nuclear-related technologies.

    12. Long-Term Stewardship Baseline Report and Transition Guidance

      SciTech Connect (OSTI)

      Kristofferson, Keith

      2001-11-01

      Long-term stewardship consists of those actions necessary to maintain and demonstrate continued protection of human health and the environment after facility cleanup is complete. As the Department of Energy's (DOE) lead laboratory for environmental management programs, the Idaho National Engineering and Environmental Laboratory (INEEL) administers DOE's long-term stewardship science and technology efforts. The INEEL provides DOE with the technical and scientific expertise needed to oversee its long-term environmental management obligations complexwide. Long-term stewardship is administered and overseen by the Environmental Management Office of Science and Technology. The INEEL Long-Term Stewardship Program is currently developing the management structures and plans to complete INEEL-specific, long-term stewardship obligations. This guidance document (1) assists in ensuring that the program leads transition planning for the INEEL with respect to facility and site areas and (2) describes the classes and types of criteria and data required to initiate transition for areas and sites where the facility mission has ended and cleanup is complete. Additionally, this document summarizes current information on INEEL facilities, structures, and release sites likely to enter long-term stewardship at the completion of DOE's cleanup mission. This document is not intended to function as a discrete checklist or local procedure to determine readiness to transition. It is an overarching document meant as guidance in implementing specific transition procedures. Several documents formed the foundation upon which this guidance was developed. Principal among these documents was the Long-Term Stewardship Draft Technical Baseline; A Report to Congress on Long-Term Stewardship, Volumes I and II; Infrastructure Long-Range Plan; Comprehensive Facility Land Use Plan; INEEL End-State Plan; and INEEL Institutional Plan.

    13. Vehicle Technologies Office Merit Review 2014: Integrated Computational Materials Engineering Approach to Development of Lightweight 3GAHSS Vehicle Assembly

      Broader source: Energy.gov [DOE]

      Presentation given by USAMP at 2014 DOE Hydrogen and Fuel Cells Program and Vehicle Technologies Office Annual Merit Review and Peer Evaluation Meeting about integrated computational materials...

    14. Vehicle Technologies Office Merit Review 2015: Integrated Computational Materials Engineering Approach to Development of Lightweight 3GAHSS Vehicle Assembly

      Broader source: Energy.gov [DOE]

      Presentation given by USAMP at 2015 DOE Hydrogen and Fuel Cells Program and Vehicle Technologies Office Annual Merit Review and Peer Evaluation Meeting about integrated computational materials...

    15. Multiproject baselines for evaluation of electric power projects

      SciTech Connect (OSTI)

      Sathaye, Jayant; Murtishaw, Scott; Price, Lynn; Lefranc, Maurice; Roy, Joyashree; Winkler, Harald; Spalding-Fecher, Randall

      2003-03-12

      Calculating greenhouse gas emissions reductions from climate change mitigation projects requires construction of a baseline that sets emissions levels that would have occurred without the project. This paper describes a standardized multiproject methodology for setting baselines, represented by the emissions rate (kg C/kWh), for electric power projects. A standardized methodology would reduce the transaction costs of projects. The most challenging aspect of setting multiproject emissions rates is determining the vintage and types of plants to include in the baseline and the stringency of the emissions rates to be considered, in order to balance the desire to encourage no- or low-carbon projects while maintaining environmental integrity. The criteria for selecting power plants to include in the baseline depend on characteristics of both the project and the electricity grid it serves. Two case studies illustrate the application of these concepts to the electric power grids in eastern India and South Africa. We use hypothetical, but realistic, climate change projects in each country to illustrate the use of the multiproject methodology, and note the further research required to fully understand the implications of the various choices in constructing and using these baselines.
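
      A minimal sketch of a generation-weighted multiproject baseline emissions rate and the reduction credited to a hypothetical project; the plant data and rates are invented, and the actual methodology addresses plant vintage, type, and stringency choices not shown here.

      ```python
      # Hypothetical plant data: (annual generation in kWh, annual carbon emissions in kg C)
      plants = [
          (4.0e9, 1.0e9),   # coal unit
          (2.5e9, 0.25e9),  # gas combined-cycle unit
          (1.0e9, 0.0),     # hydro unit
      ]

      # Generation-weighted baseline emissions rate, kg C per kWh
      total_gen = sum(gen for gen, _ in plants)
      total_carbon = sum(carbon for _, carbon in plants)
      baseline_rate = total_carbon / total_gen
      print(f"Baseline emissions rate: {baseline_rate:.3f} kg C/kWh")

      # Reduction credited to a hypothetical zero-carbon project generating proj_kwh
      proj_kwh, proj_rate = 5.0e8, 0.0
      print(f"Credited reduction: {proj_kwh * (baseline_rate - proj_rate):.3e} kg C")
      ```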

    16. DEVELOPMENT OF A COMPUTATIONAL MULTIPHASE FLOW MODEL FOR FISCHER TROPSCH SYNTHESIS IN A SLURRY BUBBLE COLUMN REACTOR

      SciTech Connect (OSTI)

      Donna Post Guillen; Tami Grimmett; Anastasia M. Gribik; Steven P. Antal

      2010-09-01

      The Hybrid Energy Systems Testing (HYTEST) Laboratory is being established at the Idaho National Laboratory to develop and test hybrid energy systems with the principal objective to safeguard U.S. Energy Security by reducing dependence on foreign petroleum. A central component of the HYTEST is the slurry bubble column reactor (SBCR) in which the gas-to-liquid reactions will be performed to synthesize transportation fuels using the Fischer Tropsch (FT) process. SBCRs are cylindrical vessels in which gaseous reactants (for example, synthesis gas or syngas) are sparged into a slurry of liquid reaction products and finely dispersed catalyst particles. The catalyst particles are suspended in the slurry by the rising gas bubbles and serve to promote the chemical reaction that converts syngas to a spectrum of longer chain hydrocarbon products, which can be upgraded to gasoline, diesel or jet fuel. These SBCRs operate in the churn-turbulent flow regime, which is characterized by complex hydrodynamics, coupled with reacting flow chemistry and heat transfer, that affect reactor performance. The purpose of this work is to develop a computational multiphase fluid dynamic (CMFD) model to aid in understanding the physico-chemical processes occurring in the SBCR. Our team is developing a robust methodology to couple reaction kinetics and mass transfer into a four-field model (consisting of the bulk liquid, small bubbles, large bubbles and solid catalyst particles) that includes twelve species: (1) CO reactant, (2) H2 reactant, (3) hydrocarbon product, and (4) H2O product in small bubbles, large bubbles, and the bulk fluid. Properties of the hydrocarbon product were specified by vapor liquid equilibrium calculations. The absorption and kinetic models, specifically changes in species concentrations, have been incorporated into the mass continuity equation. The reaction rate is determined based on the macrokinetic model for a cobalt catalyst developed by Yates and Satterfield [1]. The model includes heat generation due to the exothermic chemical reaction, as well as heat removal from a constant temperature heat exchanger. Results of the CMFD simulations (similar to those shown in Figure 1) will be presented.
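
      For orientation, cobalt Fischer-Tropsch kinetics of the kind attributed to Yates and Satterfield are commonly written in a Langmuir-Hinshelwood form, r = a*P_CO*P_H2 / (1 + b*P_CO)^2; the sketch below uses that form with placeholder constants, not values taken from this work.

      ```python
      def ft_rate(p_co, p_h2, a, b):
          """Langmuir-Hinshelwood-type Fischer-Tropsch rate: r = a*P_CO*P_H2 / (1 + b*P_CO)**2."""
          return a * p_co * p_h2 / (1.0 + b * p_co) ** 2

      # Placeholder (hypothetical) kinetic constants and partial pressures in MPa
      a, b = 1.0e-2, 2.0
      print(ft_rate(p_co=0.5, p_h2=1.0, a=a, b=b))
      ```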

    17. Functional Testing Protocols for Commercial Building Efficiency Baseline Modeling Software

      SciTech Connect (OSTI)

      Jump, David; Price, Phillip N.; Granderson, Jessica; Sohn, Michael

      2013-09-06

      This document describes procedures for testing and validating proprietary baseline energy modeling software accuracy in predicting energy use over the period of interest, such as a month or a year. The procedures are designed according to the methodology used for public domain baselining software in another LBNL report that was (like the present report) prepared for Pacific Gas and Electric Company: "Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing Protocols" (referred to here as the "Model Analysis Report"). The test procedure focuses on the quality of the software's predictions rather than on the specific algorithms used to predict energy use. In this way the software vendor is not required to divulge or share proprietary information about how their software works, while enabling stakeholders to assess its performance.
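
      The report's exact acceptance criteria are not reproduced here, but prediction quality for baseline models is commonly summarized with statistics such as CV(RMSE) and NMBE; the sketch below computes both for hypothetical monthly data.

      ```python
      import numpy as np

      def cv_rmse(predicted, measured):
          """Coefficient of variation of the RMSE, in percent of the mean measured value."""
          resid = measured - predicted
          return 100.0 * np.sqrt(np.mean(resid ** 2)) / np.mean(measured)

      def nmbe(predicted, measured):
          """Normalized mean bias error, in percent of the mean measured value."""
          return 100.0 * np.sum(measured - predicted) / (measured.size * np.mean(measured))

      # Hypothetical monthly energy use (kWh): model prediction vs. metered data
      pred = np.array([102.0, 98.0, 110.0, 95.0])
      meas = np.array([100.0, 97.0, 115.0, 93.0])
      print(f"CV(RMSE) = {cv_rmse(pred, meas):.1f}%, NMBE = {nmbe(pred, meas):.1f}%")
      ```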

    18. Fort Drum integrated resource assessment. Volume 2, Baseline detail

      SciTech Connect (OSTI)

      Dixon, D.R.; Armstrong, P.R.; Brodrick, J.R.; Daellenbach, K.K.; Di Massa, F.V.; Keller, J.M.; Richman, E.E.; Sullivan, G.P.; Wahlstrom, R.R.

      1992-12-01

      The US Army Forces Command (FORSCOM) has tasked the Pacific Northwest Laboratory (PNL) as the lead laboratory supporting the US Department of Energy (DOE) Federal Energy Management Program's mission to identify, evaluate, and assist in acquiring all cost-effective energy projects at Fort Drum. This is a model program PNL is designing for federal customers served by the Niagara Mohawk Power Company. It will identify and evaluate all electric and fossil fuel cost-effective energy projects; develop a schedule at each installation for project acquisition considering project type, size, timing, and capital requirements, as well as energy and dollar savings; and secure 100% of the financing required to implement electric energy efficiency projects from Niagara Mohawk and have Niagara Mohawk procure the necessary contractors to perform detailed audits and install the technologies. This report documents the assessment of baseline energy use at one of Niagara Mohawk's primary federal facilities, the FORSCOM Fort Drum facility located near Watertown, New York. It is a companion report to Volume 1, the Executive Summary, and Volume 3, the Resource Assessment. This analysis examines the characteristics of electric, gas, oil, propane, coal, and purchased thermal capacity use for fiscal year (FY) 1990. It records energy-use intensities for the facilities at Fort Drum by building type and energy end use. It also breaks down building energy consumption by fuel type, energy end use, and building type. A complete energy consumption reconciliation is presented that includes the accounting of all energy use among buildings, utilities, central systems, and applicable losses.

    19. Griffiss AFB integrated resource assessment. Volume 2, Electric baseline detail

      SciTech Connect (OSTI)

      Dixon, D.R.; Armstrong, P.R.; Keller, J.M.

      1993-02-01

      The US Air Force Air Combat Command has tasked the Pacific Northwest Laboratory (PNL) as the lead laboratory supporting the US Department of Energy (DOE) Federal Energy Management Program's (FEMP) mission to identify, evaluate, and assist in acquiring all cost-effective energy projects at Griffiss Air Force Base (AFB). This is a model program PNL is designing for federal customers served by the Niagara Mohawk Power Company (Niagara Mohawk). It will (1) identify and evaluate all electric cost-effective energy projects; (2) develop a schedule at each installation for project acquisition considering project type, size, timing, and capital requirements, as well as energy and dollar savings; and (3) secure 100% of the financing required to implement electric energy efficiency projects from Niagara Mohawk and have Niagara Mohawk procure the necessary contractors to perform detailed audits and install the technologies. This report documents the assessment of baseline energy use at one of Niagara Mohawk's primary federal facilities, Griffiss AFB, an Air Combat Command facility located near Rome, New York. It is a companion report to Volume 1, the Executive Summary, and Volume 3, the Electric Resource Assessment. The analysis examines the characteristics of electric, gas, oil, propane, coal, and purchased thermal capacity use for fiscal year (FY) 1990. The results include energy-use intensities for the facilities at Griffiss AFB by building type and electric energy end use. A complete electric energy consumption reconciliation is presented that accounts for the distribution of all major electric energy uses and losses among buildings, utilities, and central systems.

    20. Precision Measurements of Long-Baseline Neutrino Oscillation at LBNF

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Worcester, Elizabeth

      2015-08-06

      In a long-baseline neutrino oscillation experiment, the primary physics objectives are to determine the neutrino mass hierarchy, to determine the octant of the neutrino mixing angle θ23, to search for CP violation in neutrino oscillation, and to precisely measure the size of any CP-violating effect that is discovered. This presentation provides a brief introduction to these measurements and reports on efforts to optimize the design of a long-baseline neutrino oscillation experiment, the status of LBNE, and the transition to an international collaboration at LBNF.

    1. Precision Measurements of Long-Baseline Neutrino Oscillation at LBNF

      SciTech Connect (OSTI)

      Worcester, Elizabeth

      2015-08-06

      In a long-baseline neutrino oscillation experiment, the primary physics objectives are to determine the neutrino mass hierarchy, to determine the octant of the neutrino mixing angle θ23, to search for CP violation in neutrino oscillation, and to precisely measure the size of any CP-violating effect that is discovered. This presentation provides a brief introduction to these measurements and reports on efforts to optimize the design of a long-baseline neutrino oscillation experiment, the status of LBNE, and the transition to an international collaboration at LBNF.

    2. Computer-Based Procedures for Field Workers in Nuclear Power Plants: Development of a Model of Procedure Usage and Identification of Requirements

      SciTech Connect (OSTI)

      Katya Le Blanc; Johanna Oxstrand

      2012-04-01

      The nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. As a step toward the goal of improving procedure use performance, researchers, together with the nuclear industry, have been looking at replacing the current paper-based procedures with computer-based procedure systems. The concept of computer-based procedures is not new by any means; however, most research has focused on procedures used in the main control room. Procedures reviewed in these efforts are mainly emergency operating procedures and normal operating procedures. Based on lessons learned from these previous efforts, we are now exploring a less familiar application for computer-based procedures: field procedures, i.e., procedures used by nuclear equipment operators and maintenance technicians. The Idaho National Laboratory and participants from the U.S. commercial nuclear industry are collaborating in an applied research effort with the objective of developing requirements and specifications for a computer-based procedure system to be used by field workers. The goal is to identify the types of human errors that can be mitigated by using computer-based procedures and how to best design the computer-based procedures to do so. This paper describes the development of a Model of Procedure Use and the qualitative study on which the model is based. The study was conducted in collaboration with four nuclear utilities and five research institutes. During the qualitative study and the model development, requirements for computer-based procedures were identified.

    3. Computational mechanics

      SciTech Connect (OSTI)

      Raboin, P J

      1998-01-01

      The Computational Mechanics thrust area is a vital and growing facet of the Mechanical Engineering Department at Lawrence Livermore National Laboratory (LLNL). This work supports the development of computational analysis tools in the areas of structural mechanics and heat transfer. Over 75 analysts depend on thrust area-supported software running on a variety of computing platforms to meet the demands of LLNL programs. Interactions with the Department of Defense (DOD) High Performance Computing and Modernization Program and the Defense Special Weapons Agency are of special importance as they support our ParaDyn project in its development of new parallel capabilities for DYNA3D. Working with DOD customers has been invaluable to driving this technology in directions mutually beneficial to the Department of Energy. Other projects associated with the Computational Mechanics thrust area include work with the Partnership for a New Generation Vehicle (PNGV) for ''Springback Predictability'' and with the Federal Aviation Administration (FAA) for the ''Development of Methodologies for Evaluating Containment and Mitigation of Uncontained Engine Debris.'' In this report for FY-97, there are five articles detailing three code development activities and two projects that synthesized new code capabilities with new analytic research in damage/failure and biomechanics. The articles this year are: (1) Energy- and Momentum-Conserving Rigid-Body Contact for NIKE3D and DYNA3D; (2) Computational Modeling of Prosthetics: A New Approach to Implant Design; (3) Characterization of Laser-Induced Mechanical Failure Damage of Optical Components; (4) Parallel Algorithm Research for Solid Mechanics Applications Using Finite Element Analysis; and (5) An Accurate One-Step Elasto-Plasticity Algorithm for Shell Elements in DYNA3D.

    4. THE FIRST VERY LONG BASELINE INTERFEROMETRIC SETI EXPERIMENT

      SciTech Connect (OSTI)

      Rampadarath, H.; Morgan, J. S.; Tingay, S. J.; Trott, C. M.

      2012-08-15

      The first Search for Extra-Terrestrial Intelligence (SETI) conducted with very long baseline interferometry (VLBI) is presented. By consideration of the basic principles of interferometry, we show that VLBI is efficient at discriminating between SETI signals and human generated radio frequency interference (RFI). The target for this study was the star Gliese 581, thought to have two planets within its habitable zone. On 2007 June 19, Gliese 581 was observed for 8 hr at 1230-1544 MHz with the Australian Long Baseline Array. The data set was searched for signals appearing on all interferometer baselines above five times the noise limit. A total of 222 potential SETI signals were detected and by using automated data analysis techniques were ruled out as originating from the Gliese 581 system. From our results we place an upper limit of 7 MW Hz{sup -1} on the power output of any isotropic emitter located in the Gliese 581 system within this frequency range. This study shows that VLBI is ideal for targeted SETI including follow-up observations. The techniques presented are equally applicable to next-generation interferometers, such as the long baselines of the Square Kilometre Array.
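
      The quoted 7 MW Hz{sup -1} limit follows from the inverse-square relation between a flux-density limit and isotropic transmitter power; the sketch below shows the arithmetic, with the roughly 6.3 pc distance to Gliese 581 and the flux-density limit treated as assumed inputs rather than values taken directly from the paper.

      ```python
      import math

      PC_TO_M = 3.0857e16  # meters per parsec

      def eirp_spectral_density(flux_density_limit, distance_pc):
          """Isotropic transmitter power per unit bandwidth (W/Hz) implied by a flux-density limit (W/m^2/Hz)."""
          d = distance_pc * PC_TO_M
          return 4.0 * math.pi * d ** 2 * flux_density_limit

      # Gliese 581 is roughly 6.3 pc away; the flux-density limit below is an assumed value
      print(eirp_spectral_density(1.5e-29, 6.3))  # on the order of 7e6 W/Hz, i.e. a few MW per Hz
      ```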

    5. 241-AZ Farm Annulus Extent of Condition Baseline Inspection

      SciTech Connect (OSTI)

      Engeman, Jason K.; Girardot, Crystal L.; Vazquez, Brandon J.

      2013-05-15

      This report provides the results of the comprehensive annulus visual inspection for tanks 241- AZ-101 and 241-AZ-102 performed in fiscal year 2013. The inspection established a baseline covering about 95 percent of the annulus floor for comparison with future inspections. Any changes in the condition are also included in this document.

    6. Technical Baseline Summary Description for the Tank Farm Contractor

      SciTech Connect (OSTI)

      TEDESCHI, A.R.

      2000-04-21

      This document is a revision of the document titled above, summarizing the technical baseline of the Tank Farm Contractor. It is one of several documents prepared by CH2M HILL Hanford Group, Inc. to support the U.S. Department of Energy Office of River Protection Tank Waste Retrieval and Disposal Mission at Hanford.

    7. Computation & Simulation > Theory & Computation > Research >...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Extensive combinatorial results and ongoing basic...

    8. EVMS Training Snippet: 4.2 Integrated Baseline Review (IBR) Process...

      Energy Savers [EERE]

      This EVMS Training Snippet, sponsored by the Office of Project...

    9. Computational mechanics

      SciTech Connect (OSTI)

      Goudreau, G.L.

      1993-03-01

      The Computational Mechanics thrust area sponsors research into the underlying solid, structural and fluid mechanics and heat transfer necessary for the development of state-of-the-art general purpose computational software. The scale of computational capability spans office workstations, departmental computer servers, and Cray-class supercomputers. The DYNA, NIKE, and TOPAZ codes have achieved world fame through our broad collaborators program, in addition to their strong support of on-going Lawrence Livermore National Laboratory (LLNL) programs. Several technology transfer initiatives have been based on these established codes, teaming LLNL analysts and researchers with counterparts in industry, extending code capability to specific industrial interests of casting, metalforming, and automobile crash dynamics. The next-generation solid/structural mechanics code, ParaDyn, is targeted toward massively parallel computers, which will extend performance from gigaflop to teraflop power. Our work for FY-92 is described in the following eight articles: (1) Solution Strategies: New Approaches for Strongly Nonlinear Quasistatic Problems Using DYNA3D; (2) Enhanced Enforcement of Mechanical Contact: The Method of Augmented Lagrangians; (3) ParaDyn: New Generation Solid/Structural Mechanics Codes for Massively Parallel Processors; (4) Composite Damage Modeling; (5) HYDRA: A Parallel/Vector Flow Solver for Three-Dimensional, Transient, Incompressible Viscous Flow; (6) Development and Testing of the TRIM3D Radiation Heat Transfer Code; (7) A Methodology for Calculating the Seismic Response of Critical Structures; and (8) Reinforced Concrete Damage Modeling.

    10. Parallel computing works

      SciTech Connect (OSTI)

      Not Available

      1991-10-23

      An account of the Caltech Concurrent Computation Program (C{sup 3}P), a five year project that focused on answering the question: 'Can parallel computers be used to do large-scale scientific computations?' As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C{sup 3}P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high performance computing facility based exclusively on parallel computers. While the initial focus of C{sup 3}P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

    11. Advanced Artificial Science. The development of an artificial science and engineering research infrastructure to facilitate innovative computational modeling, analysis, and application to interdisciplinary areas of scientific investigation.

      SciTech Connect (OSTI)

      Saffer, Shelley I.

      2014-12-01

      This is a final report of the DOE award DE-SC0001132, Advanced Artificial Science. The development of an artificial science and engineering research infrastructure to facilitate innovative computational modeling, analysis, and application to interdisciplinary areas of scientific investigation. This document describes the achievements of the goals, and resulting research made possible by this award.

    12. Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

      SciTech Connect (OSTI)

      Xu, Tengfang; Flapper, Joris; Ke, Jing; Kramer, Klaas; Sathaye, Jayant

      2012-02-01

      The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry – including four dairy processes – cheese, fluid milk, butter, and milk powder.

    13. Computing and Computational Sciences Directorate - Information...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      cost-effective, state-of-the-art computing capabilities for research and development. ... communicates and manages strategy, policy and finance across the portfolio of IT assets. ...

    14. Recent developments in large-scale finite-element Lagrangian hydrocode technology. [DYNA2D/DYNA3D computer codes]

      SciTech Connect (OSTI)

      Goudreau, G.L.; Hallquist, J.O.

      1981-10-01

      The state of Lagrangian hydrocodes for computing the large deformation dynamic response of inelastic continua is reviewed in the context of engineering computation at the Lawrence Livermore National Laboratory, USA, and the DYNA2D/DYNA3D finite element codes. The emphasis is on efficiency and computational cost, favoring the simplest elements with explicit time integration. The two-dimensional four node quadrilateral and the three-dimensional hexahedron with one point quadrature are advocated as superior to other more expensive choices. Important auxiliary capabilities are a cheap but effective hourglass control, slidelines/planes with void opening/closure, and rezoning. Both strain measures and material formulation are seen as a homogeneous stress point problem, and a flexible material subroutine interface admits both incremental and total strain formulation, dependent on internal energy or an arbitrary set of other internal variables. Vectorization on Class VI computers such as the CRAY-1 is a simple exercise for optimally organized primitive element formulations. Some examples of large scale computation are illustrated, including continuous tone graphic representation.

    15. Level 3 Baseline Risk Assessment for Building 3515 at Oak Ridge National Lab., Oak Ridge, TN

      SciTech Connect (OSTI)

      Wollert, D.A.; Cretella, F.M.; Golden, K.M.

      1995-08-01

      The baseline risk assessment for the Fission Product Pilot Plant (Building 3515) at the Oak Ridge National Laboratory (ORNL) provides the Decontamination and Decommissioning (D&D) Program at ORNL and Building 3515 project managers with information concerning the results of the Level 3 baseline risk assessment performed for this building. The document was prepared under Work Breakdown Structure 1.4.12.6.2.01 (Activity Data Sheet 3701, Facilities D&D) and includes information on the potential long-term impacts to human health and the environment if no action is taken to remediate Building 3515. Information provided in this document forms the basis for the development of remedial alternatives and the no-action risk portion of the Engineering Evaluation/Cost Analysis report.

    16. Baseline ecological risk assessment Salmon Site, Lamar County, Mississippi

      SciTech Connect (OSTI)

      1995-04-01

      The Salmon Site (SS), formerly the Tatum Dome Test Site, located in Mississippi was the site of two nuclear and two gas explosion tests conducted between 1964 and 1970. A consequence of these testing activities is that radionuclides were released into the salt dome, where they are presently contained. During reentry drilling and other site activities, incidental liquid and solid wastes that contained radioactivity were generated, resulting in some soil, ground water and equipment contamination. As part of the remedial investigation effort, a Baseline Ecological Risk Assessment was conducted at the SS. The purpose is to gauge ecological and other environmental impacts attributable to past activities at the former test facility. The results of this facility-specific baseline risk assessment are presented in this document.

    17. Baseline measurements of terrestrial gamma radioactivity at the CEBAF site

      SciTech Connect (OSTI)

      Wollenberg, H.A.; Smith, A.R.

      1991-10-01

      A survey of the gamma radiation background from terrestrial sources was conducted at the CEBAF site, Newport News, Virginia, on November 12--16, 1990, to provide a gamma radiation baseline for the site prior to the startup of the accelerator. The concentrations and distributions of the natural radioelements in exposed soil were measured, and the results of the measurements were converted into gamma-ray exposure rates. Concurrently, samples were collected for laboratory gamma spectral analyses.

    18. Hybrid Electric Vehicle Fleet and Baseline Performance Testing

      SciTech Connect (OSTI)

      J. Francfort; D. Karner

      2006-04-01

      The U.S. Department of Energy's Advanced Vehicle Testing Activity (AVTA) conducts baseline performance and fleet testing of hybrid electric vehicles (HEV). To date, the AVTA has completed baseline performance testing on seven HEV models and accumulated 1.4 million fleet testing miles on 26 HEVs. The HEV models tested or in testing include: Toyota Gen I and Gen II Prius, and Highlander; Honda Insight, Civic and Accord; Chevrolet Silverado; Ford Escape; and Lexus RX 400h. The baseline performance testing includes dynamometer and closed track testing to document the HEVs' fuel economy (SAE J1634) and performance in a controlled environment. During fleet testing, two of each HEV model are driven to 160,000 miles per vehicle within 36 months, during which maintenance and repair events and fuel use are recorded and used to compile life-cycle costs. At the conclusion of the 160,000 miles of fleet testing, the SAE J1634 tests are rerun and each HEV battery pack is tested. These AVTA testing activities are conducted by the Idaho National Laboratory, Electric Transportation Applications, and Exponent Failure Analysis Associates. This paper discusses the testing methods and results.
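
      As a simple illustration of how fleet-testing records translate into life-cycle operating costs, the sketch below divides hypothetical fuel and maintenance totals by the 160,000 fleet-test miles; all figures are invented, not AVTA results.

      ```python
      # Hypothetical fleet-testing totals for one HEV over the 160,000-mile test, for illustration only
      miles = 160_000
      fuel_gallons = 3_900                            # assumed total fuel use
      fuel_price = 2.50                               # assumed average price, $/gallon
      maintenance_costs = [120.0, 85.0, 240.0, 60.0]  # assumed maintenance/repair event costs, $

      operating_cost = fuel_gallons * fuel_price + sum(maintenance_costs)
      print(f"Operating cost per mile: ${operating_cost / miles:.3f}")
      ```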

    19. Final Report. Baseline LAW Glass Formulation Testing, VSL-03R3460-1, Rev. 0

      SciTech Connect (OSTI)

      Muller, Isabelle S.; Pegg, Ian L.; Gan, Hao; Buechele, Andrew; Rielley, Elizabeth; Bazemore, Gina; Cecil, Richard; Hight, Kenneth; Mooers, Cavin; Lai, Shan-Tao T.; Kruger, Albert A.

      2015-06-18

      The major objective of the baseline glass formulation work was to develop and select glass formulations that are compliant with contractual and processing requirements for each of the LAW waste streams. Other objectives of the work included preparation and characterization of glasses with respect to the properties of interest, optimization of sulfate loading in the glasses, evaluation of ability to achieve waste loading limits, testing to demonstrate compatibility of glass melts with melter materials of construction, development of glass formulations to support ILAW qualification activities, and identification of glass formulation issues with respect to contract specifications and processing requirements.

    20. Cognitive Computing for Security.

      SciTech Connect (OSTI)

      Debenedictis, Erik; Rothganger, Fredrick; Aimone, James Bradley; Marinella, Matthew; Evans, Brian Robert; Warrender, Christina E.; Mickel, Patrick

      2015-12-01

      Final report for Cognitive Computing for Security LDRD 165613. It reports on the development of a hybrid general-purpose/neuromorphic computer architecture, with an emphasis on potential implementation with memristors.

    1. EVMS Training Snippet: 4.6 Baseline Control Methods | Department of Energy

      Office of Environmental Management (EM)

      This EVMS Training Snippet, sponsored by the Office of Project Management (PM), discusses baseline revisions and the different baseline control vehicles used in DOE.

    2. Opportunities for Russian Nuclear Weapons Institute developing computer-aided design programs for pharmaceutical drug discovery. Final report

      SciTech Connect (OSTI)

      1996-09-23

      The goal of this study is to determine whether physicists at the Russian Nuclear Weapons Institute can profitably service the need for computer-aided drug design (CADD) programs. The Russian physicists' primary competitive advantage is their ability to write particularly efficient code able to work with limited computing power; a history of working with very large, complex modeling systems; an extensive knowledge of physics and mathematics; and price competitiveness. Their primary competitive disadvantage is their lack of biology expertise, along with cultural and geographic issues. The first phase of the study focused on defining the competitive landscape, primarily through interviews with and literature searches on the key providers of CADD software. The second phase focused on users of CADD technology to determine deficiencies in the current product offerings, to understand what product they most desired, and to define the potential demand for such a product.

    3. NREL: MIDC/SRRL Baseline Measurement System (39.74 N, 105.18 W, 1829 m,

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      GMT-7) Solar Radiation Research Laboratory Baseline Measurement System

    4. Baseline review of the U.S. LHC Accelerator project

      SciTech Connect (OSTI)

      1998-02-01

      The Department of Energy (DOE) Review of the U.S. Large Hadron Collider (LHC) Accelerator project was conducted February 23--26, 1998, at the request of Dr. John R. O'Fallon, Director, Division of High Energy Physics, Office of Energy Research, U.S. DOE. This is the first review of the U.S. LHC Accelerator project. Overall, the Committee found that the U.S. LHC Accelerator project effort is off to a good start and that the proposed scope is very conservative for the funding available. The Committee recommends that the project be initially baselined at a total cost of $110 million, with a scheduled completion date of 2005. The U.S. LHC Accelerator project will supply high technology superconducting magnets for the interaction regions (IRs) and the radio frequency (rf) straight section of the LHC intersecting storage rings. In addition, the project provides the cryogenic support interface boxes to service the magnets and radiation absorbers to protect the IR dipoles and the inner triplet quadrupoles. US scientists will provide support in analyzing some of the detailed aspects of accelerator physics in the two rings. The three laboratories participating in this project are Brookhaven National Laboratory, Fermi National Accelerator Laboratory (Fermilab), and Lawrence Berkeley National Laboratory. The Committee was very impressed by the technical capabilities of the US LHC Accelerator project team. Cost estimates for each subsystem of the US LHC Accelerator project were presented to the Review Committee, with a total cost including contingency of $110 million (then year dollars). The cost estimates were deemed to be conservative. A re-examination of the funding profile, costs, and schedules on a centralized project basis should lead to an increased list of deliverables. The Committee concluded that the proposed scope of US deliverables to CERN can be readily accomplished with the $110 million total cost baseline for the project. The current deliverables should serve as the baseline scope with the firm expectation that additional scope will be restored to the baseline as the project moves forward. The Committee supports the FY 1998 work plan and scope of deliverables but strongly recommends the reevaluation of costs and schedules with the goal of producing a plan for restoring the US deliverables to CERN. This plan should provide precise dates when scope decisions must be made.

    5. CLAMR (Compute Language Adaptive Mesh Refinement)

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      CLAMR (Compute Language Adaptive Mesh Refinement) is being developed as a DOE...

    6. Energy baseline and energy efficiency resource opportunities for the Forest Products Laboratory, Madison, Wisconsin

      SciTech Connect (OSTI)

      Mazzucchi, R.P.; Richman, E.E.; Parker, G.B.

      1993-08-01

      This report provides recommendations to improve the energy use efficiency at the Forest Products Laboratory in Madison, Wisconsin. The assessment focuses upon the four largest buildings and central heating plant at the facility, comprising a total of approximately 287,000 square feet. The analysis is comprehensive in nature, intended primarily to determine what, if any, energy efficiency improvements are warranted based upon the potential for cost-effective energy savings. Because of this breadth, not all opportunities are developed in detail; however, baseline energy consumption data and energy savings concepts are described to provide a foundation for detailed investigation and project design where warranted.

    7. Estimating baseline risks from biouptake and food ingestion at a contaminated site

      SciTech Connect (OSTI)

      MacDonell, M.; Woytowich, K.; Blunt, D.; Picel, M.

      1993-11-01

      Biouptake of contaminants and subsequent human exposure via food ingestion represents a public concern at many contaminated sites. Site-specific measurements from plant and animal studies are usually quite limited, so this exposure pathway is often modeled to assess the potential for adverse health effects. A modeling tool was applied to evaluate baseline risks at a contaminated site in Missouri, and the results were used to confirm that ingestion of fish and game animals from the site area do not pose a human health threat. Results were also used to support the development of cleanup criteria for site soil.

    8. Data Management Guide: Integrated Baseline System (IBS). Version 2.1

      SciTech Connect (OSTI)

      Bower, J.C. [Bower Software Services, Kennewick, Washington (United States)]; Burford, M.J.; Downing, T.R.; Moise, M.C.; Williams, J.R. [Pacific Northwest Lab., Richland, WA (United States)]

      1995-01-01

      The Integrated Baseline System (IBS) is an emergency management planning and analysis tool that is being developed under the direction of the US Army Nuclear and Chemical Agency (USANCA). The IBS Data Management Guide provides the background, as well as the operations and procedures needed to generate and maintain a site-specific map database. Data and system managers use this guide to manage the data files and database that support the administrative, user-environment, database management, and operational capabilities of the IBS. This document provides a description of the data files and structures necessary for running the IBS software and using the site map database.

    9. Computed tomography and optical remote sensing: Development for the study of indoor air pollutant transport and dispersion

      SciTech Connect (OSTI)

      Drescher, A.C.

      1995-06-01

      This thesis investigates the mixing and dispersion of indoor air pollutants under a variety of conditions using standard experimental methods. It also extensively tests and improves a novel technique for measuring contaminant concentrations that has the potential for more rapid, non-intrusive measurements with higher spatial resolution than previously possible. Experiments conducted in a sealed room support the hypothesis that the mixing time of an instantaneously released tracer gas is inversely proportional to the cube root of the mechanical power transferred to the room air. One table-top and several room-scale experiments are performed to test the concept of employing optical remote sensing (ORS) and computed tomography (CT) to measure steady-state gas concentrations in a horizontal plane. Various remote sensing instruments, scanning geometries and reconstruction algorithms are employed. Reconstructed concentration distributions based on existing iterative CT techniques contain a high degree of unrealistic spatial variability and do not agree well with simultaneously gathered point-sample data.
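
      The cube-root relationship supported by the sealed-room experiments can be used to rescale a measured mixing time to a different mechanical power input; the reference values below are hypothetical.

      ```python
      def scaled_mixing_time(t_ref, p_ref, p_new):
          """Rescale a measured mixing time using the cube-root relation t_mix ~ P**(-1/3)."""
          return t_ref * (p_ref / p_new) ** (1.0 / 3.0)

      # Hypothetical reference case: 10 minutes to mix at 5 W of mechanical power input
      print(scaled_mixing_time(t_ref=10.0, p_ref=5.0, p_new=40.0))  # an 8x power increase halves the time
      ```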

    10. Advanced Scientific Computing Research

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Advanced Scientific Computing Research Advanced Scientific Computing Research Discovering, developing, and deploying computational and networking capabilities to analyze, model, simulate, and predict complex phenomena important to the Department of Energy. Get Expertise Pieter Swart (505) 665 9437 Email Pat McCormick (505) 665-0201 Email Dave Higdon (505) 667-2091 Email Fulfilling the potential of emerging computing systems and architectures beyond today's tools and techniques to deliver

    11. DOE Announces Webinars on the Mid-Atlantic Baseline Study, EPA...

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      used in the Mid-Atlantic Baseline Studies (MABS), a project intended to help inform the ... The Mid-Atlantic Baseline Studies (MABS) Project was carried out by the Biodiversity ...

    12. EVMS Training Snippet: 4.1 The Over Target Baseline (OTB) and...

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      EVMS Training Snippet: 3.1A Integrated Master Schedule (IMS) Initial Baseline Review EVMS Training Snippet: 4.6 Baseline Control Methods EVMS Training Snippet: 4.9 High-level EVM...

    13. Waste Assessment Baseline for the IPOC Second Floor, West Wing

      SciTech Connect (OSTI)

      McCord, Samuel A

      2015-04-01

      Following a building-wide waste assessment in September 2014, and a subsequent presentation to Sandia leadership regarding the goal of Zero Waste by 2025, the occupants of the IPOC Second Floor, West Wing contacted the Materials Sustainability and Pollution Prevention (MSP2) team to guide them to Zero Waste in advance of the rest of the site. The occupants are from Center 3600, Public Relations and Communications, and Center 800, Independent Audit, Ethics and Business Conduct. To accomplish this, MSP2 conducted a new limited waste assessment from March 2-6, 2015 to compare the second floor, west wing to the building as a whole. The assessment also serves as a baseline with which to mark improvements in diversion in approximately 6 months.

    14. Compute nodes

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Compute nodes Compute nodes Click here to see a more detailed hierarchical map of the topology of a compute node. Last edited: 2016-02-01 08:07:08

    15. Computing Information

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      here you can find information relating to: Obtaining the right computer accounts. Using NIC terminals. Using BooNE's Computing Resources, including: Choosing your desktop....

    16. Computer System,

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      undergraduate summer institute http:isti.lanl.gov (Educational Prog) 2016 Computer System, Cluster, and Networking Summer Institute Purpose The Computer System,...

    17. TRIDAC host computer functional specification

      SciTech Connect (OSTI)

      Hilbert, S.M.; Hunter, S.L.

      1983-08-23

      The purpose of this document is to outline the baseline functional requirements for the Triton Data Acquisition and Control (TRIDAC) Host Computer Subsystem. The requirements presented in this document are based upon systems that currently support both the SIS and the Uranium Separator Technology Groups in the AVLIS Program at the Lawrence Livermore National Laboratory and upon the specific demands associated with the extended safe operation of the SIS Triton Facility.

    18. Development

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      OF IMAGING We require: (1) a high quality ion beam, (2) computer vision and image processing techniques for isolating and re- constructing the beam, and (3) wavelengths suitable...

    19. Idaho National Laboratory's Greenhouse Gas FY08 Baseline

      SciTech Connect (OSTI)

      Jennifer D. Morton

      2011-06-01

      A greenhouse gas (GHG) inventory is a systematic attempt to account for the production and release of certain gasses generated by an institution from various emission sources. The gasses of interest are those which have become identified by climate science as related to anthropogenic global climate change. This document presents an inventory of GHGs generated during fiscal year (FY) 2008 by Idaho National Laboratory (INL), a Department of Energy (DOE)-sponsored entity, located in southeastern Idaho. Concern about the environmental impact of GHGs has grown in recent years. This, together with a desire to decrease harmful environmental impacts, would be enough to encourage the calculation of a baseline estimate of total GHGs generated at INL. Additionally, INL has a desire to see how its emissions compare with similar institutions, including other DOE national laboratories. Executive Order 13514 requires that federal agencies and institutions document reductions in GHG emissions in the future, and such documentation will require knowledge of a baseline against which reductions can be measured. INL's FY08 GHG inventory was calculated according to methodologies identified in federal GHG guidance documents using operational control boundaries. It measures emissions generated in three Scopes: (1) INL emissions produced directly by stationary or mobile combustion and by fugitive emissions, (2) the share of emissions generated by entities from which INL purchased electrical power, and (3) indirect or shared emissions generated by outsourced activities that benefit INL (occur outside INL's organizational boundaries but are a consequence of INL's activities). This inventory found that INL generated a total of 113,049 MT of CO2-equivalent emissions during FY08. The following conclusions were made from looking at the results of the individual contributors to INL's baseline GHG inventory: (1) Electricity (including the associated transmission and distribution losses) is the largest contributor to INL's GHG inventory, with over 50% of the CO2e emissions; (2) Other sources with high emissions were stationary combustion (facility fuels), waste disposal (including fugitive emissions from the onsite landfill and contracted disposal), mobile combustion (fleet fuels), employee commuting, and business air travel; and (3) Sources with low emissions were wastewater treatment (onsite and contracted), fugitive emissions from refrigerants, and business ground travel (in personal and rental vehicles). This report details the methods behind quantifying INL's GHG inventory and discusses lessons learned on better practices by which information important to tracking GHGs can be tracked and recorded. It is important to note that because this report differentiates between those portions of INL that are managed and operated by the Battelle Energy Alliance (BEA) and those managed by other contractors, it includes only that large proportion of Laboratory activities overseen by BEA. It is assumed that other contractors will provide similar reporting for those activities they manage, where appropriate.
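
      As an illustration of how such an inventory rolls activity data up into CO2-equivalent totals by Scope, the sketch below multiplies gas masses by global warming potentials and sums by source category. The emission sources, gas masses, and GWP values shown are illustrative placeholders, not the factors or data used in INL's FY08 inventory.

```python
# Minimal sketch of a multi-scope GHG roll-up into metric tons of CO2-equivalent
# (CO2e). All masses, sources, and GWP values are illustrative placeholders.

GWP = {"CO2": 1, "CH4": 25, "N2O": 298}   # 100-yr global warming potentials (illustrative)

def co2e(masses_kg):
    """Convert a dict of gas masses (kg) to metric tons CO2-equivalent."""
    return sum(masses_kg[g] * GWP[g] for g in masses_kg) / 1000.0

inventory = {
    "Scope 1": [
        {"source": "stationary combustion", "CO2": 1.2e7, "CH4": 500, "N2O": 100},
        {"source": "fleet fuels",           "CO2": 3.0e6, "CH4": 200, "N2O": 150},
    ],
    "Scope 2": [
        {"source": "purchased electricity", "CO2": 5.5e7, "CH4": 900, "N2O": 800},
    ],
    "Scope 3": [
        {"source": "employee commuting",    "CO2": 6.0e6, "CH4": 300, "N2O": 250},
    ],
}

total = 0.0
for scope, sources in inventory.items():
    scope_total = sum(co2e({g: s[g] for g in GWP if g in s}) for s in sources)
    total += scope_total
    print(f"{scope}: {scope_total:,.0f} MT CO2e")
print(f"Total: {total:,.0f} MT CO2e")
```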

    20. Idaho National Laboratory's Greenhouse Gas FY08 Baseline

      SciTech Connect (OSTI)

      Jennifer D. Morton

      2010-09-01

      A greenhouse gas (GHG) inventory is a systematic attempt to account for the production and release of certain gasses generated by an institution from various emission sources. The gasses of interest are those which have become identified by climate science as related to anthropogenic global climate change. This document presents an inventory of GHGs generated during fiscal year (FY) 2008 by Idaho National Laboratory (INL), a Department of Energy (DOE)-sponsored entity, located in southeastern Idaho. Concern about the environmental impact of GHGs has grown in recent years. This, together with a desire to decrease harmful environmental impacts, would be enough to encourage the calculation of a baseline estimate of total GHGs generated at the INL. Additionally, the INL has a desire to see how its emissions compare with similar institutions, including other DOE-sponsored national laboratories. Executive Order 13514 requires that federally sponsored agencies and institutions document reductions in GHG emissions in the future, and such documentation will require knowledge of a baseline against which reductions can be measured. INL's FY08 GHG inventory was calculated according to methodologies identified in Federal recommendations and an as-yet-unpublished Technical and Support Document (TSD) using an operational control boundary. It measures emissions generated in three Scopes: (1) INL emissions produced directly by stationary or mobile combustion and by fugitive emissions, (2) the share of emissions generated by entities from which INL purchased electrical power, and (3) indirect or shared emissions generated by outsourced activities that benefit INL (occur outside INL's organizational boundaries but are a consequence of INL's activities). This inventory found that INL generated a total of 114,256 MT of CO2-equivalent emissions during fiscal year 2008 (FY08). The following conclusions were made from looking at the results of the individual contributors to INL's baseline GHG inventory: (1) Electricity is the largest contributor to INL's GHG inventory, with over 50% of the net anthropogenic CO2e emissions; (2) Other sources with high emissions were stationary combustion, fugitive emissions from the onsite landfill, mobile combustion (fleet fuels), and the employee commute; and (3) Sources with low emissions were contracted waste disposal, wastewater treatment (onsite and contracted), and fugitive emissions from refrigerants. This report details the methods behind quantifying INL's GHG inventory and discusses lessons learned on better practices by which information important to tracking GHGs can be tracked and recorded. It is important to stress that the methodology behind this inventory followed guidelines that have not yet been formally adopted. Thus, some modification of the conclusions may be necessary as additional guidance is received. Further, because this report differentiates between those portions of the INL that are managed and operated by the Battelle Energy Alliance (BEA) and those managed by other contractors, it includes only that large proportion of Laboratory activities overseen by BEA. It is assumed that other contractors will provide similar reporting for those activities they manage, where appropriate.

    1. Baseline Risk Assessment Supporting Closure at Waste Management Area C at the Hanford Site Washington

      SciTech Connect (OSTI)

      Singleton, Kristin M.

      2015-01-07

      The Office of River Protection under the U.S. Department of Energy is pursuing closure of the Single-Shell Tank (SST) Waste Management Area (WMA) C under the requirements of the Hanford Federal Facility Agreement and Consent Order (HFFACO). A baseline risk assessment (BRA) of current conditions is based on available characterization data and information collected at WMA C. The baseline risk assessment is being developed as a part of a Resource Conservation and Recovery Act (RCRA) Facility Investigation (RFI)/Corrective Measures Study (CMS) at WMA C that is mandatory under Comprehensive Environmental Response, Compensation, and Liability Act and RCRA corrective action. The RFI/CMS is needed to identify and evaluate the hazardous chemical and radiological contamination in the vadose zone from past releases of waste from WMA C. WMA C will be under Federal ownership and control for the foreseeable future, and managed as an industrial area with restricted access and various institutional controls. The exposure scenarios evaluated under these conditions include Model Toxics Control Act (MTCA) Method C, industrial worker, maintenance and surveillance worker, construction worker, and trespasser scenarios. The BRA evaluates several unrestricted land use scenarios (residential all-pathway, MTCA Method B, and Tribal) to provide additional information for risk management. Analytical results from 13 shallow zone (0 to 15 ft. below ground surface) sampling locations were collected to evaluate human health impacts at WMA C. In addition, soil analytical data were screened against background concentrations and ecological soil screening levels to determine if soil concentrations have the potential to adversely affect ecological receptors. Analytical data from 12 groundwater monitoring wells were evaluated between 2004 and 2013. A screening of groundwater monitoring data against background concentrations and Federal maximum concentration levels was used to determine vadose zone contamination impacts on groundwater. Waste Management Area C is the first of the Hanford tank farms to begin the closure planning process. The current baseline risk assessment will provide valuable information for making corrective actions and closure decisions for WMA C, and will also support the planning for future tank farm soil investigation and baseline risk assessments.

    2. Pentek metal coating removal system: Baseline report; Greenbook (chapter)

      SciTech Connect (OSTI)

      1997-07-31

      The Pentek coating removal technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The Pentek coating removal system consisted of the ROTO-PEEN Scaler, CORNER-CUTTER®, and VAC-PAC®. They are designed to remove coatings from steel, concrete, brick, and wood. The Scaler uses 3M Roto Peen tungsten carbide cutters while the CORNER-CUTTER® uses solid needles for descaling activities. These hand tools are used with the VAC-PAC® vacuum system to capture dust and debris as removal of the coating takes place. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. Dust exposure was minimal, but noise exposure was significant. Further testing for each exposure is recommended because of the environment where the testing demonstration took place. It is feasible that the dust and noise levels will be higher in an enclosed operating environment of different construction. In addition, other areas of concern found were arm-hand vibration, whole-body vibration, ergonomics, heat stress, tripping hazards, electrical hazards, machine guarding, and lockout/tagout.

    3. Ultra-high pressure water jet: Baseline report

      SciTech Connect (OSTI)

      1997-07-31

      The ultra-high pressure waterjet technology was being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The ultra-high pressure waterjet technology acts as a cutting tool for the removal of surface substrates. The Husky™ pump feeds water to a lance that directs the high pressure water at the surface to be removed. The safety and health evaluation during the testing demonstration focused on two main areas of exposure. These were dust and noise. The dust exposure was found to be minimal, which would be expected due to the wet environment inherent in the technology, but noise exposure was at a significant level. Further testing for noise is recommended because of the outdoor environment where the testing demonstration took place. In addition, other areas of concern found were arm-hand vibration, ergonomics, heat stress, tripping hazards, electrical hazards, lockout/tagout, fall hazards, slipping hazards, hazards associated with the high pressure water, and hazards associated with air pressure systems.

    4. Pentek concrete scabbling system: Baseline report; Greenbook (chapter)

      SciTech Connect (OSTI)

      1997-07-31

      The Pentek scabbling technology was tested at Florida International University (FIU) and is being evaluated as a baseline technology. This report evaluates it for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The Pentek concrete scabbling system consisted of the MOOSE, SQUIRREL-I, and SQUIRREL-III scabblers. The scabblers are designed to scarify concrete floors and slabs using cross-section, tungsten carbide tipped bits. The bits are designed to remove concrete in 3/8-inch increments. The bits are either 9-tooth or demolition type. The scabblers are used with a vacuum system designed to collect and filter the concrete dust and contamination that is removed from the surface. The safety and health evaluation conducted during the testing demonstration focused on two main areas of exposure: dust and noise. Dust exposure was minimal, but noise exposure was significant. Further testing for each of these exposures is recommended. Because of the outdoor environment where the testing demonstration took place, results may be inaccurate. It is feasible that the dust and noise levels will be higher in an enclosed operating environment. Other areas of concern were arm-hand vibration, whole-body vibration, ergonomics, heat stress, tripping hazards, electrical hazards, machine guarding, and lockout/tagout.

    5. LTC vacuum blasting machine (metal) baseline report: Greenbook (chapter)

      SciTech Connect (OSTI)

      1997-07-31

      The LTC coating removal technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The LTC coating removal system consisted of several hand tools, a Roto Peen scaler, and a needlegun. They are designed to remove coatings from steel, concrete, brick, and wood. These hand tools are used with the LTC PTC-6 vacuum system to capture dust and debris as removal of the coating takes place. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. The dust exposure was minimal but noise exposure was significant. Further testing for each exposure is recommended because of the environment where the testing demonstration took place. It is feasible that the dust and noise levels will be higher in an enclosed operating environment of different construction. In addition, other areas of concern found were arm-hand vibration, whole-body vibration, ergonomics, heat stress, tripping hazards, electrical hazards, machine guarding, and lockout/tagout.

    6. LTC vacuum blasting machine (concrete): Baseline report: Greenbook (Chapter)

      SciTech Connect (OSTI)

      1997-07-31

      The LTC shot blast technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The LTC 1073 Vacuum Blasting Machine uses a high-capacity, direct-pressure blasting system which incorporates a continuous feed for the blast media. The blast media cleans the surface within the contained brush area of the blast. It incorporates a vacuum system which removes dust and debris from the surface as it is blasted. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. Dust exposure during maintenance activities was minimal, but due to mechanical difficulties dust monitoring could not be conducted during operation. Noise exposure was significant. Further testing for each of these exposures is recommended because of the outdoor environment where the testing demonstration took place. This may cause the results to be inaccurate. It is feasible that the dust and noise levels will be higher in an enclosed environment. In addition, other safety and health issues found were ergonomics, heat stress, tripping hazards, electrical hazards, lockout/tagout, and arm-hand vibration.

    7. Computing and Computational Sciences Directorate - Computer Science...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      AWARD Winners: Jess Gehin; Jackie Isaacs; Douglas Kothe; Debbie McCoy; Bonnie Nestor; John Turner; Gilbert Weigand Organization(s): Nuclear Technology Program; Computing and...

    8. U.S. Department of Energy Performance Baseline Guide

      Broader source: Directives, Delegations, and Requirements [Office of Management (MA)]

      2011-09-23

      This guide identifies key PB elements, development processes, and practices; describes the context in which DOE PB development occurs; and suggests ways of addressing the critical elements in PB development. Supersedes DOE G 413.3-5.

    9. Sampling designs for geochemical baseline studies in the Colorado oil shale region: a manual for practical application

      SciTech Connect (OSTI)

      Klusman, R. W.; Ringrose, C. D.; Candito, R. J.; Zuccaro, B.; Rutherford, D. W.; Dean, W. E.

      1980-06-01

      This manual presents a rationale for sampling designs, and results of geochemical baseline studies in the Colorado portion of the oil-shale region. The program consists of a systematic trace element study of soils, stream sediments, and plants carried out in a way to be conservative of human and financial resources and yield maximum information. Extension of this approach to other parameters, other locations, and to environmental baseline studies in general is a primary objective. A baseline for any geochemical parameter can be defined as the concentration of that parameter in a given medium such as soil, the range of its concentration, and the geographic scale of variability. In air quality studies, and to a lesser extent for plants, the temporal scale of variability must also be considered. In studies of soil, the temporal variability does not become a factor until such time as a study is deemed necessary to evaluate whether or not there have been changes in baseline levels as a result of development. The manual is divided into five major parts. The first is a suggested sampling protocol which is presented in an outline form for guiding baseline studies in this area. The second section is background information on the physical features of the area of study, trace elements of significance occurring in oil shale, and the sample media used in these studies. The third section is concerned primarily with sampling design and its application to the geochemical studies of the oil shale region. The last sections, in the form of appendices, provide actual data and illustrate, in a systematic manner, the calculations performed to obtain the various summary data. The last segment of the appendices is a more academic discussion of the geochemistry of trace elements and the parameters of importance influencing their behavior in natural systems.
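
      Because trace-element concentrations in soils are commonly treated as approximately log-normal, a baseline of this kind is often summarized by a geometric mean and an expected range. The sketch below shows only that generic summary calculation; the data are invented, and the manual's own sampling design and statistics are more involved.

```python
import math

# Minimal sketch of summarizing a geochemical baseline as a central value and an
# expected range, assuming approximately log-normal concentrations. The data
# below are hypothetical, not measurements from the Colorado study.

def baseline_summary(concs_ppm):
    logs = [math.log(c) for c in concs_ppm]
    n = len(logs)
    mu = sum(logs) / n
    sigma = math.sqrt(sum((x - mu) ** 2 for x in logs) / (n - 1))
    gm = math.exp(mu)                      # geometric mean
    gsd = math.exp(sigma)                  # geometric standard deviation
    lo, hi = gm / gsd**2, gm * gsd**2      # approximate 95% range: gm / gsd^2 .. gm * gsd^2
    return gm, gsd, (lo, hi)

soil_mo_ppm = [1.1, 0.8, 2.3, 1.7, 0.9, 3.1, 1.4, 2.0, 1.2, 0.7]  # hypothetical Mo in soil
gm, gsd, rng = baseline_summary(soil_mo_ppm)
print(f"geometric mean = {gm:.2f} ppm, GSD = {gsd:.2f}, "
      f"expected 95% range = {rng[0]:.2f}-{rng[1]:.2f} ppm")
```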

    10. Economic Model For a Return on Investment Analysis of United States Government High Performance Computing (HPC) Research and Development (R & D) Investment

      SciTech Connect (OSTI)

      Joseph, Earl C.; Conway, Steve; Dekate, Chirag

      2013-09-30

      This study investigated how high-performance computing (HPC) investments can improve economic success and increase scientific innovation. This research focused on the common good and provided uses for DOE, other government agencies, industry, and academia. The study created two unique economic models and an innovation index: (1) a macroeconomic model that depicts the way HPC investments result in economic advancements in the form of ROI in revenue (GDP), profits (and cost savings), and jobs; (2) a macroeconomic model that depicts the way HPC investments result in basic and applied innovations, looking at variations by sector, industry, country, and organization size; and (3) a new innovation index that provides a means of measuring and comparing innovation levels. Key findings of the pilot study include: IDC collected the required data across a broad set of organizations, with enough detail to create these models and the innovation index. The research also developed an expansive list of HPC success stories.
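
      The headline quantity such a model reports is a return ratio: dollars of revenue and profit (or cost savings) attributable to HPC per dollar invested, along with jobs supported. The toy sketch below shows only that ratio calculation with invented aggregates; IDC's actual models break the analysis down by sector, industry, country, and organization size.

```python
# Toy sketch of the return ratios an HPC ROI model reports. All aggregates below
# are invented placeholders, not figures from the IDC study.

def hpc_roi(investment_usd, attributable_revenue_usd,
            attributable_profit_usd, jobs_created):
    return {
        "revenue_per_dollar": attributable_revenue_usd / investment_usd,
        "profit_per_dollar": attributable_profit_usd / investment_usd,
        "jobs_per_million": jobs_created / (investment_usd / 1e6),
    }

print(hpc_roi(investment_usd=2.5e8,             # hypothetical HPC spend
              attributable_revenue_usd=1.2e10,  # hypothetical revenue tied to HPC projects
              attributable_profit_usd=1.1e9,
              jobs_created=1500))
```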

    11. Computing Resources

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Cluster-Image TRACC RESEARCH Computational Fluid Dynamics Computational Structural Mechanics Transportation Systems Modeling Computing Resources The TRACC Computational Clusters With the addition of a new cluster called Zephyr that was made operational in September of this year (2012), TRACC now offers two clusters to choose from: Zephyr and our original cluster that has now been named Phoenix. Zephyr was acquired from Atipa technologies, and it is a 92-node system with each node having two AMD

    12. Baseline scheme for polarization preservation and control in the MEIC ion complex

      SciTech Connect (OSTI)

      Derbenev, Yaroslav S.; Lin, Fanglei; Morozov, Vasiliy; Zhang, Yuhong; Kondratenko, Anatoliy; Kondratenko, M A; Filatov, Yury

      2015-09-01

      The scheme for preservation and control of the ion polarization in the Medium-energy Electron-Ion Collider (MEIC) has been under active development in recent years. The figure-8 configuration of the ion rings provides a unique capability to control the polarization of any ion species including deuterons by means of "weak" solenoids rotating the particle spins by small angles. Insertion of "weak" solenoids into the magnetic lattices of the booster and collider rings solves the problem of polarization preservation during acceleration of the ion beam. Universal 3D spin rotators designed on the basis of "weak" solenoids allow one to obtain any polarization orientation at an interaction point of MEIC. This paper presents the baseline scheme for polarization preservation and control in the MEIC ion complex.

    13. Quality Assurance Baseline Assessment Report to Los Alamos National Laboratory Analytical Chemistry Operations

      SciTech Connect (OSTI)

      Jordan, R. A.

      1998-09-01

      This report summarizes observations that were made during a Quality Assurance (QA) Baseline Assessment of the Nuclear Materials Technology Analytical Chemistry Group (NMT-1). The Quality and Planning personnel for NMT-1 are spending a significant amount of time transitioning out of their roles of environmental oversight into production oversight. A team from the Idaho National Engineering and Environmental Laboratory Defense Program Environmental Surety Program performed an assessment of the current status of the QA Program. Several Los Alamos National Laboratory Analytical Chemistry procedures were reviewed, as well as Transuranic Waste Characterization Program (TWCP) QA documents. Checklists were developed and the assessment was performed according to an Implementation Work Plan, INEEL/EXT-98-00740.

    14. Computing and Computational Sciences Directorate - Information Technology

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computational Sciences and Engineering The Computational Sciences and Engineering Division (CSED) is ORNL's premier source of basic and applied research in the field of data sciences and knowledge discovery. CSED's science agenda is focused on research and development related to knowledge discovery enabled by the explosive growth in the availability, size, and variability of dynamic and disparate data sources. This science agenda encompasses data sciences as well as advanced modeling and

    15. 2008 CHP Baseline Assessment and Action Plan for the California Market |

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      Department of Energy California Market 2008 CHP Baseline Assessment and Action Plan for the California Market This 2008 report provides an updated baseline assessment and action plan for combined heat and power (CHP) in California and identifies hurdles that prevent the expanded use of CHP systems. This report was prepared by the Pacific Region CHP Application Center (RAC). PDF icon chp_california_2008.pdf More Documents & Publications 2008 CHP Baseline Assessment and Action Plan for the

    16. Optimization of the CLIC Baseline Collimation System (Conference) | SciTech

      Office of Scientific and Technical Information (OSTI)

      Connect Optimization of the CLIC Baseline Collimation System Citation Details In-Document Search Title: Optimization of the CLIC Baseline Collimation System Important efforts have recently been dedicated to the improvement of the design of the baseline collimation system of the Compact Linear Collider (CLIC). Different aspects of the design have been optimized: the transverse collimation depths have been recalculated in order to reduce the collimator wakefield effects while maintaining a

    17. EVMS Training Snippet: 4.1 The Over Target Baseline (OTB) and The Over

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      Target Schedule (OTS) Implementations | Department of Energy EVMS Training Snippet: 4.1 The Over Target Baseline (OTB) and The Over Target Schedule (OTS) Implementations » EVMS Training Snippet: 4.1 The Over Target Baseline (OTB) and The Over Target Schedule (OTS) Implementations EVMS Training Snippet: 4.1 The Over Target Baseline (OTB) and The Over Target Schedule (OTS) Implementations This EVMS Training Snippet, sponsored by the Office of Project Management (PM) covers Over Target

    18. EVMS Training Snippet: 4.1 The Over Target Baseline (OTB) and The Over

      Office of Environmental Management (EM)

      Target Schedule (OTS) Implementations | Department of Energy 1 The Over Target Baseline (OTB) and The Over Target Schedule (OTS) Implementations EVMS Training Snippet: 4.1 The Over Target Baseline (OTB) and The Over Target Schedule (OTS) Implementations This EVMS Training Snippet, sponsored by the Office of Project Management (PM) covers Over Target Baseline and Over Target Schedule implementations. Link to Video Presentation | Prior Snippet (3.3) | Next Snippet (4.2) | Return to Index PDF

    19. EVMS Training Snippet: 4.2 Integrated Baseline Review (IBR) Process |

      Office of Environmental Management (EM)

      Department of Energy 2 Integrated Baseline Review (IBR) Process EVMS Training Snippet: 4.2 Integrated Baseline Review (IBR) Process This EVMS Training Snippet sponsored by the Office of Project Management (PM) covers the Integrated Baseline Review (IBR) process. Link to Video Presentation | Prior Snippet (4.1) | Next Snippet (4.3) | Return to Index PDF icon Slides Only PDF icon Slides with Notes More Documents & Publications EVMS Training Snippet: 1.4 EVMS Stage 2 Surveillance EVMS

    20. Compute Nodes

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Compute Nodes Compute Nodes Quad-Core AMD Opteron processor Compute Node Configuration 9,572 nodes 1 quad-core AMD 'Budapest' 2.3 GHz processor per node 4 cores per node (38,288 total cores) 8 GB DDR3 800 MHz memory per node Peak Gflop rate 9.2 Gflops/core 36.8 Gflops/node 352 Tflops for the entire machine Each core has its own L1 and L2 caches, with 64 KB and 512 KB respectively 2 MB L3 cache shared among the 4 cores Compute Node Software By default the compute nodes run a restricted low-overhead
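
      The quoted peak rates follow directly from the node count, cores per node, and per-core rate; the short check below reproduces that arithmetic.

```python
# Quick check of the peak-performance arithmetic quoted above.
cores_per_node = 4
gflops_per_core = 9.2
nodes = 9572

gflops_per_node = gflops_per_core * cores_per_node
total_cores = nodes * cores_per_node
total_tflops = gflops_per_node * nodes / 1000.0

print(gflops_per_node)      # 36.8 Gflops/node
print(total_cores)          # 38,288 total cores
print(round(total_tflops))  # ~352 Tflops for the entire machine
```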

    1. Microsoft PowerPoint - Snippet 4.6 Baseline Control Methods 20140723 [Compatibility Mode]

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      baseline revisions and the different baseline control vehicles used in DOE. 1 It is a given that during the life of the project, the performance measurement baseline (PMB) will change for a variety of reasons. These changes may affect the technical scope, schedule, and/or budget of the project. Revisions to the baseline may be necessary to maintain a valid work plan. In accordance with the DOE Acquisition Guide Chapter 43.3 (March 2013), certain changes cannot be made to the PMB, such as

    2. U.S. Department of Energy Performance Baseline Guide - DOE Directives...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      5A, U.S. Department of Energy Performance Baseline Guide by Brian Kong Functional areas: Program Management, Project Management, Work Processes This guide identifies key PB...

    3. Development of an Extensible Computational Framework for Centralized Storage and Distributed Curation and Analysis of Genomic Data Genome-scale Metabolic Models

      SciTech Connect (OSTI)

      Stevens, Rick

      2010-08-01

      The DOE funded KBase project of the Stevens group at the University of Chicago was focused on four high-level goals: (i) improve extensibility, accessibility, and scalability of the SEED framework for genome annotation, curation, and analysis; (ii) extend the SEED infrastructure to support transcription regulatory network reconstructions (2.1), metabolic model reconstruction and analysis (2.2), assertions linked to data (2.3), eukaryotic annotation (2.4), and growth phenotype prediction (2.5); (iii) develop a web-API for programmatic remote access to SEED data and services; and (iv) application of all tools to bioenergy-related genomes and organisms. In response to these goals, we enhanced and improved the ModelSEED resource within the SEED to enable new modeling analyses, including improved model reconstruction and phenotype simulation. We also constructed a new website and web-API for the ModelSEED. Further, we constructed a comprehensive web-API for the SEED as a whole. We also made significant strides in building infrastructure in the SEED to support the reconstruction of transcriptional regulatory networks by developing a pipeline to identify sets of consistently expressed genes based on gene expression data. We applied this pipeline to 29 organisms, computing regulons which were subsequently stored in the SEED database and made available on the SEED website (http://pubseed.theseed.org). We developed a new pipeline and database for the use of kmers, or short 8-residue oligomer sequences, to annotate genomes at high speed. Finally, we developed the PlantSEED, or a new pipeline for annotating primary metabolism in plant genomes. All of the work performed within this project formed the early building blocks for the current DOE Knowledgebase system, and the kmer annotation pipeline, plant annotation pipeline, and modeling tools are all still in use in KBase today.
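
      The kmer pipeline mentioned above rests on a simple idea: short 8-residue oligomers that are characteristic of a single functional role can be stored in a lookup table, and a new protein can then be annotated from the roles its kmers hit. The sketch below illustrates that idea only; the signature-selection rule, voting threshold, reference sequences, and role names are simplified placeholders, not the SEED/KBase implementation.

```python
from collections import Counter

# Minimal sketch of kmer-based protein annotation: build a table of 8-residue
# kmers that occur in exactly one functional role, then annotate a query protein
# from the roles its kmers hit. Reference data and thresholds are hypothetical.

K = 8

def kmers(protein_seq, k=K):
    return (protein_seq[i:i + k] for i in range(len(protein_seq) - k + 1))

def build_signature_table(reference):
    """reference: dict of protein sequence -> functional role.
    Keep only kmers that occur in exactly one role (signature kmers)."""
    kmer_roles = {}
    for seq, role in reference.items():
        for km in kmers(seq):
            kmer_roles.setdefault(km, set()).add(role)
    return {km: next(iter(roles)) for km, roles in kmer_roles.items() if len(roles) == 1}

def annotate(seq, table, min_hits=2):
    hits = Counter(table[km] for km in kmers(seq) if km in table)
    if not hits:
        return None
    role, count = hits.most_common(1)[0]
    return role if count >= min_hits else None

reference = {
    "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ": "hypothetical role A",
    "MSLNVIKPGLRWDEQTTGFAAYLQENNIPTSVQ": "hypothetical role B",
}
table = build_signature_table(reference)
print(annotate("XXMKTAYIAKQRQISFVKYY", table))
```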

    4. Kenya-Danish Government Baseline Workstream | Open Energy Information

      Open Energy Info (EERE)

      Government Partner Danish Ministry for Climate, Energy, and Building; The Danish Energy Agency Sector Energy Topics Implementation, Low emission development planning Program...

    5. Vietnam-Danish Government Baseline Workstream | Open Energy Informatio...

      Open Energy Info (EERE)

      Government Partner Danish Ministry for Climate, Energy, and Building; The Danish Energy Agency Sector Energy Topics Implementation, Low emission development planning Program...

    6. Thailand-Danish Government Baseline Workstream | Open Energy...

      Open Energy Info (EERE)

      Government Partner Danish Ministry for Climate, Energy, and Building; The Danish Energy Agency Sector Energy Topics Implementation, Low emission development planning Program...

    7. Cost and Performance Comparison Baseline for Fossil Energy Plants...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      energy security. A broad portfolio of technologies is being developed within the Clean Coal Program to accomplish this objective. Ever increasing technological enhancements...

    8. Software and High Performance Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Software and High Performance Computing Software and High Performance Computing Providing world-class high performance computing capability that enables unsurpassed solutions to complex problems of strategic national interest Contact thumbnail of Kathleen McDonald Head of Intellectual Property, Business Development Executive Kathleen McDonald Richard P. Feynman Center for Innovation (505) 667-5844 Email Software Computational physics, computer science, applied mathematics, statistics and the

    9. COMPUTATIONAL SCIENCE CENTER

      SciTech Connect (OSTI)

      DAVENPORT, J.

      2006-11-01

      Computational Science is an integral component of Brookhaven's multi-science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host for the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the Riken/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to bring together researchers in these areas and to provide a focal point for the development of computational expertise at the Laboratory. These efforts will connect to and support the Department of Energy's long-range plans to provide Leadership-class computing to researchers throughout the Nation. Recruitment for six new positions at Stony Brook to strengthen its computational science programs is underway. We expect some of these to be held jointly with BNL.

    10. Computer Science

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Cite Seer Department of Energy provided open access science research citations in chemistry, physics, materials, engineering, and computer science IEEE Xplore Full text...

    11. Computer Security

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computer Security All JLF participants must fully comply with all LLNL computer security regulations and procedures. A laptop entering or leaving B-174 for the sole use by a US citizen and so configured, and requiring no IP address, need not be registered for use in the JLF. By September 2009, it is expected that computers for use by Foreign National Investigators will have no special provisions. Notify maricle1@llnl.gov of all other computers entering, leaving, or being moved within B 174. Use

    12. Sandia National Laboratories: Advanced Simulation Computing:...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      These collaborations help solve the challenges of developing computing platforms and simulation tools across a number of disciplines. Computer Science Research Institute The...

    13. Vehicle Technologies Office Merit Review 2015: Computational Design and Development of a New, Lightweight Cast Alloy for Advanced Cylinder Heads in High-Efficiency, Light-Duty Engines

      Broader source: Energy.gov [DOE]

      Presentation given by General Motors at 2015 DOE Hydrogen and Fuel Cells Program and Vehicle Technologies Office Annual Merit Review and Peer Evaluation Meeting about computational design and...

    14. Computing and Computational Sciences Directorate - Divisions

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      CCSD Divisions Computational Sciences and Engineering Computer Sciences and Mathematics Information Technology Services Joint Institute for Computational Sciences National Center for Computational Sciences

    15. Compute Nodes

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Compute Nodes Compute Nodes There are currently 2632 nodes available on PDSF. The compute (batch) nodes at PDSF are heterogeneous, reflecting the periodic procurement of new nodes (and the eventual retirement of old nodes). From the user's perspective they are essentially all equivalent except that some have more memory per job slot. If your jobs have memory requirements beyond the default maximum of 1.1GB you should specify that in your job submission and the batch system will run your job on an

    16. Baseline System Costs for 50.0 MW Enhanced Geothermal System--A Function of: Working Fluid, Technology, and Location, Location, Location

      Broader source: Energy.gov [DOE]

      Project objectives: Develop a baseline cost model of a 50.0 MW Enhanced Geothermal System, including all aspects of the project, from finding the resource through to operation, for a particularly challenging scenario: the deep, radioactively decaying granitic rock of the Pioneer Valley in Western Massachusetts.

    17. GTA (ground test accelerator) Phase 1: Baseline design report

      SciTech Connect (OSTI)

      Not Available

      1986-08-01

      The national Neutral Particle Beam (NPB) program has two objectives: to provide the necessary basis for a discriminator/weapon decision by 1992, and to develop the technology in stages that lead ultimately to a neutral particle beam weapon. The ground test accelerator (GTA) is the test bed that permits the advancement of the state-of-the-art under experimental conditions in an integrated automated system mode. An intermediate goal of the GTA program is to support the Integrated Space Experiments, while the ultimate goal is to support the 1992 decision. The GTA system and each of its major subsystems are described, and project schedules and resource requirements are provided. (LEW)

    18. Compute Nodes

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Nodes Quad-Core AMD Opteron processor Compute Node Configuration 9,572 nodes 1 quad-core AMD 'Budapest' 2.3 GHz processor per node 4 cores per node (38,288 total cores) 8 GB...

    19. Insertion Devices for NSLS-II Baseline and Future

      SciTech Connect (OSTI)

      Tanabe,T.

      2008-06-23

      NSLS-II is going to employ Damping Wigglers (DWs) not only for emittance reduction but also as a broadband hard X-ray source. In-Vacuum Undulators (IVUs) with minimal RMS phase error (< 2 degrees) and possible cryo-capability are planned as the planar X-ray devices. Elliptically Polarized Undulators (EPUs) are envisioned for polarization control. Because the weak dipole magnet field (0.4 Tesla) provides little hard X-ray flux, three-pole wigglers (3PWs) with a peak field over 1 Tesla will mainly be used by NSLS bending magnet beam line users. Magnetic designs and kick maps for dynamic aperture surveys were created using the latest version of Radia [1] for Mathematica 6, whose development we supported. There are other devices planned for the later stage of the project, such as a quasi-periodic EPU, a superconducting wiggler/undulator, and a Cryo-Permanent Magnet Undulator (CPMU) with Praseodymium Iron Boron (PrFeB) magnets and textured Dysprosium poles. For R&D, Hybrid PrFeB arrays were planned to be assembled and field-measured at room temperature, liquid nitrogen, and liquid helium temperatures using our vertical test facility. We have also developed a specialized power supply for pulsed wire measurements.

    20. Bioinformatics Computing Consultant Position Available

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      You can read more about the positions and apply at jobs.lbl.gov: Bioinformatics High Performance Computing Consultant (job number: 73194) and Software Developer for High...

    1. Exascale Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computing Exascale Computing CoDEx Project: A Hardware/Software Codesign Environment for the Exascale Era The next decade will see a rapid evolution of HPC node architectures as power and cooling constraints are limiting increases in microprocessor clock speeds and constraining data movement. Applications and algorithms will need to change and adapt as node architectures evolve. A key element of the strategy as we move forward is the co-design of applications, architectures and programming

    2. LHC Computing

      SciTech Connect (OSTI)

      Lincoln, Don

      2015-07-28

      The LHC is the world’s highest energy particle accelerator and scientists use it to record an unprecedented amount of data. This data is recorded in electronic format and it requires an enormous computational infrastructure to convert the raw data into conclusions about the fundamental rules that govern matter. In this video, Fermilab’s Dr. Don Lincoln gives us a sense of just how much data is involved and the incredible computer resources that makes it all possible.

    3. High Hydrogen Concentrations Detected In The Underground Vaults For RH-TRU Waste At INEEL Compared With Calculated Values Using The INEEL-Developed Computer Code

      SciTech Connect (OSTI)

      Rajiv Bhatt; Soli Khericha

      2005-02-01

      About 700 remote-handled transuranic (RH-TRU) waste drums are stored in about 144 underground vaults at the Intermediate-Level Transuranic Storage Facility at the Idaho National Engineering and Environmental Laboratory's (INEEL's) Radioactive Waste Management Complex (RWMC). These drums were shipped to the INEEL from 1976 through 1996. During recent monitoring, concentrations of hydrogen were found to be in excess of lower explosive limits. The hydrogen concentration in one vault was detected to be as high as 18% (by volume). This condition required evaluation of the safety basis for the facility. The INEEL has developed a computer program to estimate the hydrogen gas generation as a function of time and diffusion through a series of layers (volumes), with a maximum of five layers plus a sink/environment. The program solves the first-order diffusion equations as a function of time. The current version of the code is more flexible in terms of user input. The program allows the user to estimate hydrogen concentrations in the different layers of a configuration and then change the configuration after a given time; e.g., installation of a filter on an unvented drum or placement in a vault or in a shipping cask. The code has been used to predict vault concentrations and to identify potential problems during retrieval and aboveground storage. The code has generally predicted higher hydrogen concentrations than the measured values, particularly for drums older than 20 years, which could be due to uncertainty and conservative assumptions in drum age, heat generation rate, hydrogen generation rate, Geff, and diffusion rates through the layers.
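
      The layered model described above can be pictured as a chain of first-order compartments: hydrogen generated in the innermost volume diffuses through successive layers to the environment. The sketch below integrates such a compartment model with a simple explicit time step; the volumes, transfer coefficients, and generation rate are invented placeholders, and the numerics are not those of the INEEL code.

```python
# Minimal sketch of a first-order compartment model of hydrogen generation and
# diffusion through a chain of layers (e.g., liner -> drum headspace -> vault),
# each transfer treated as first-order in the concentration difference. All
# parameter values are illustrative placeholders.

def simulate(gen_rate, volumes, k_transfer, t_end, dt=1.0):
    """gen_rate   -- H2 generation rate into layer 0 (mol/hr)
       volumes    -- layer gas volumes (m^3), innermost first
       k_transfer -- first-order transfer coefficients between layer i and i+1
                     (m^3/hr); the last entry vents to the environment, which is
                     assumed to hold zero hydrogen
       Returns the H2 mole fraction (%) in each layer at t_end, assuming roughly
       40 mol of gas per m^3 at ambient conditions."""
    n = len(volumes)
    moles = [0.0] * n
    for _ in range(int(t_end / dt)):
        conc = [moles[i] / volumes[i] for i in range(n)]          # mol/m^3
        flows = [k_transfer[i] * (conc[i] - (conc[i + 1] if i + 1 < n else 0.0))
                 for i in range(n)]
        moles[0] += gen_rate * dt
        for i in range(n):
            moles[i] -= flows[i] * dt
            if i + 1 < n:
                moles[i + 1] += flows[i] * dt
    return [100.0 * (moles[i] / volumes[i]) / 40.0 for i in range(n)]

# Drum liner, drum headspace, vault: illustrative volumes and rates only.
print(simulate(gen_rate=1e-4, volumes=[0.02, 0.1, 2.0],
               k_transfer=[5e-4, 1e-3, 2e-4], t_end=24 * 365 * 5))
```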

    4. Computational Earth Science

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      6 Computational Earth Science We develop and apply a range of high-performance computational methods and software tools to Earth science projects in support of environmental health, cleaner energy, and national security. Contact Us Group Leader Carl Gable Deputy Group Leader Gilles Bussod Email Profile pages header Search our Profile pages Hari Viswanathan inspects a microfluidic cell used to study the extraction of hydrocarbon fuels from a complex fracture network. EES-16's Subsurface Flow

    5. Baseline Concept Description of a Small Modular High Temperature Reactor

      SciTech Connect (OSTI)

      Hans Gougar

      2014-05-01

      The objective of this report is to provide a description of generic small modular high temperature reactors (herein denoted as an smHTR), summarize their distinguishing attributes, and lay out the research and development (R&D) required for commercialization. The generic concepts rely heavily on the modular high temperature gas-cooled reactor designs developed in the 1980s which were never built but for which pre-licensing or certification activities were conducted. The concept matured more recently under the Next Generation Nuclear Plant (NGNP) project, specifically in the areas of fuel and material qualification, methods development, and licensing. As all vendor-specific designs proposed under NGNP were all both ‘small’ or medium-sized and ‘modular’ by International Atomic Energy Agency (IAEA) and Department of Energy (DOE) standards, the technical attributes, challenges, and R&D needs identified, addressed, and documented under NGNP are valid and appropriate in the context of Small Modular Reactor (SMR) applications. Although the term High Temperature Reactor (HTR) is commonly used to denote graphite-moderated, thermal spectrum reactors with coolant temperatures in excess of 650oC at the core outlet, in this report the historical term High Temperature Gas-Cooled Reactor (HTGR) will be used to distinguish the gas-cooled technology described herein from its liquid salt-cooled cousin. Moreover, in this report it is to be understood that the outlet temperature of the helium in an HTGR has an upper limit of 950 degrees C which corresponds to the temperature to which certain alloys are currently being qualified under DOE’s ARC program. Although similar to the HTGR in just about every respect, the Very High Temperature Reactor (VHTR) may have an outlet temperature in excess of 950 degrees C and is therefore farther from commercialization because of the challenges posed to materials exposed to these temperatures. The VHTR is the focus of R&D under the Generation IV program and its specific R&D needs will be included in this report when appropriate for comparison. The distinguishing features of the HTGR are the refractory (TRISO) coated particle fuel, the low-power density, graphite-moderated core, and the high outlet temperature of the inert helium coolant. The low power density and fuel form effectively eliminate the possibility of core melt, even upon a complete loss of coolant pressure and flow. The graphite, which constitutes the bulk of the core volume and mass, provides a large thermal buffer that absorbs fission heat such that thermal transients occur over a timespan of hours or even days. As chemically-inert helium is already a gas, there is no coolant temperature or void feedback on the neutronics and no phase change or corrosion product that could degrade heat transfer. Furthermore, the particle coatings and interstitial graphite retain fission products such that the source terms at the plant boundary remain well below actionable levels under all anticipated nominal and off-normal operating conditions. These attributes enable the reactor to supply process heat to a collocated industrial plant with negligible risk of contamination and minimal dynamic coupling of the facilities (Figure 1). The exceptional retentive properties of coated particle fuel in a graphite matrix were first demonstrated in the DRAGON reactor, a European research facility that began operation in 1964.

    6. Baseline Concept Description of a Small Modular High Temperature Reactor

      SciTech Connect (OSTI)

      Gougar, Hans D.

      2014-10-01

      The objective of this report is to provide a description of generic small modular high temperature reactors (herein denoted as an smHTR), summarize their distinguishing attributes, and lay out the research and development (R&D) required for commercialization. The generic concepts rely heavily on the modular high temperature gas-cooled reactor designs developed in the 1980s which were never built but for which pre-licensing or certification activities were conducted. The concept matured more recently under the Next Generation Nuclear Plant (NGNP) project, specifically in the areas of fuel and material qualification, methods development, and licensing. As all vendor-specific designs proposed under NGNP were all both ‘small’ or medium-sized and ‘modular’ by International Atomic Energy Agency (IAEA) and Department of Energy (DOE) standards, the technical attributes, challenges, and R&D needs identified, addressed, and documented under NGNP are valid and appropriate in the context of Small Modular Reactor (SMR) applications. Although the term High Temperature Reactor (HTR) is commonly used to denote graphite-moderated, thermal spectrum reactors with coolant temperatures in excess of 650oC at the core outlet, in this report the historical term High Temperature Gas-Cooled Reactor (HTGR) will be used to distinguish the gas-cooled technology described herein from its liquid salt-cooled cousin. Moreover, in this report it is to be understood that the outlet temperature of the helium in an HTGR has an upper limit of 950 degrees C which corresponds to the temperature to which certain alloys are currently being qualified under DOE’s ARC program. Although similar to the HTGR in just about every respect, the Very High Temperature Reactor (VHTR) may have an outlet temperature in excess of 950 degrees C and is therefore farther from commercialization because of the challenges posed to materials exposed to these temperatures. The VHTR is the focus of R&D under the Generation IV program and its specific R&D needs will be included in this report when appropriate for comparison. The distinguishing features of the HTGR are the refractory (TRISO) coated particle fuel, the low-power density, graphite-moderated core, and the high outlet temperature of the inert helium coolant. The low power density and fuel form effectively eliminate the possibility of core melt, even upon a complete loss of coolant pressure and flow. The graphite, which constitutes the bulk of the core volume and mass, provides a large thermal buffer that absorbs fission heat such that thermal transients occur over a timespan of hours or even days. As chemically-inert helium is already a gas, there is no coolant temperature or void feedback on the neutronics and no phase change or corrosion product that could degrade heat transfer. Furthermore, the particle coatings and interstitial graphite retain fission products such that the source terms at the plant boundary remain well below actionable levels under all anticipated nominal and off-normal operating conditions. These attributes enable the reactor to supply process heat to a collocated industrial plant with negligible risk of contamination and minimal dynamic coupling of the facilities (Figure 1). The exceptional retentive properties of coated particle fuel in a graphite matrix were first demonstrated in the DRAGON reactor, a European research facility that began operation in 1964.

    7. Baseline Concept Description of a Small Modular High Temperature Reactor

      SciTech Connect (OSTI)

      Hans Gougar

      2014-05-01

      The objective of this report is to provide a description of generic small modular high temperature reactors (herein denoted as an smHTR), summarize their distinguishing attributes, and lay out the research and development (R&D) required for commercialization. The generic concepts rely heavily on the modular high temperature gas-cooled reactor designs developed in the 1980s, which were never built but for which pre-licensing or certification activities were conducted. The concept matured more recently under the Next Generation Nuclear Plant (NGNP) project, specifically in the areas of fuel and material qualification, methods development, and licensing. As the vendor-specific designs proposed under NGNP were all either small or medium-sized and modular by International Atomic Energy Agency (IAEA) and Department of Energy (DOE) standards, the technical attributes, challenges, and R&D needs identified, addressed, and documented under NGNP are valid and appropriate in the context of Small Modular Reactor (SMR) applications. Although the term High Temperature Reactor (HTR) is commonly used to denote graphite-moderated, thermal spectrum reactors with coolant temperatures in excess of 650°C at the core outlet, in this report the historical term High Temperature Gas-Cooled Reactor (HTGR) will be used to distinguish the gas-cooled technology described herein from its liquid salt-cooled cousin. Moreover, in this report it is to be understood that the outlet temperature of the helium in an HTGR has an upper limit of 950°C, which corresponds to the temperature to which certain alloys are currently being qualified under DOE's ARC program. Although similar to the HTGR in nearly every respect, the Very High Temperature Reactor (VHTR) may have an outlet temperature in excess of 950°C and is therefore farther from commercialization because of the challenges posed to materials exposed to these temperatures. The VHTR is the focus of R&D under the Generation IV program, and its specific R&D needs will be included in this report when appropriate for comparison. The distinguishing features of the HTGR are the refractory (TRISO) coated particle fuel, the low-power-density, graphite-moderated core, and the high outlet temperature of the inert helium coolant. The low power density and fuel form effectively eliminate the possibility of core melt, even upon a complete loss of coolant pressure and flow. The graphite, which constitutes the bulk of the core volume and mass, provides a large thermal buffer that absorbs fission heat such that thermal transients occur over a timespan of hours or even days. As chemically-inert helium is already a gas, there is no coolant temperature or void feedback on the neutronics and no phase change or corrosion product that could degrade heat transfer. Furthermore, the particle coatings and interstitial graphite retain fission products such that the source terms at the plant boundary remain well below actionable levels under all anticipated nominal and off-normal operating conditions. These attributes enable the reactor to supply process heat to a collocated industrial plant with negligible risk of contamination and minimal dynamic coupling of the facilities (Figure 1). The exceptional retentive properties of coated particle fuel in a graphite matrix were first demonstrated in the DRAGON reactor, a European research facility that began operation in 1964.

    8. An evaluation of baseline conditions at lease tract C-a, Rio Blanco County, Colorado

      SciTech Connect (OSTI)

      Barteaux, W.L.; Biezugbe, G.

      1987-09-01

      An analysis was made of baseline groundwater quality data from oil shale lease tract C-a, managed by Rio Blanco Oil Shale Company. The data are limited in several respects. All conclusions drawn from the data must be qualified with these limitations. Baseline conditions were determined by analyzing data from wells in the upper bedrock and lower bedrock aquifers and from the alluvial wells. Baseline data were taken to be all data collected before mining operations began. The water quality was then evaluated using the 1987 Colorado State Basic Standards for Ground Water as a basis. The maximum baseline values for several parameters in each aquifer exceed the standard values. The quality of the upper and lower bedrock aquifers varies from region to region within the site. Data on the lower bedrock aquifer are insufficient for speculation on the cause of the variations. Variations in the upper bedrock aquifer are possibly caused by leakage from the lower bedrock aquifer. 16 refs., 9 figs., 9 tabs.

    9. Public Interest Energy Research (PIER) Program Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

      SciTech Connect (OSTI)

      Xu, Tengfang; Flapper, Joris; Ke, Jing; Kramer, Klaas; Sathaye, Jayant

      2012-02-01

      The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry, covering four dairy processes: cheese, fluid milk, butter, and milk powder. The BEST-Dairy tool developed in this project provides three options for the user to benchmark each of the dairy products included in the tool, with each option differentiated by the level of detail of the process or plant: (1) plant level, (2) process-group level, and (3) process-step level. For each detail level, the tool accounts for differences in production and other variables affecting energy use in dairy processes. The BEST-Dairy tool can be applied to a wide range of dairy facilities to provide energy and water savings estimates, which are based upon comparisons with best-available reference cases established through reviewing information from international and national samples. We performed and completed alpha- and beta-testing (field testing) of the BEST-Dairy tool, through which feedback from voluntary users in the U.S. dairy industry was gathered to validate and improve the tool's functionality. BEST-Dairy v1.2 was formally published in May 2011 and has been made available for free download from the internet (http://best-dairy.lbl.gov). A user's manual has been developed and published as the companion documentation for the BEST-Dairy tool. In addition, we carried out technology transfer activities by engaging the dairy industry in the process of tool development and testing, including field testing, technical presentations, and technical assistance throughout the project. To date, users from more than ten countries in addition to the U.S. have downloaded the BEST-Dairy tool from the LBNL website. It is expected that use of the BEST-Dairy tool will advance understanding of energy and water usage in individual dairy plants, augment benchmarking activities in the marketplace, and facilitate implementation of efficiency measures and strategies to reduce energy and water use in the dairy industry. Industrial adoption of this emerging tool and technology is expected to benefit dairy plants, which are important customers of California utilities. Further demonstration of this benchmarking tool is recommended to facilitate its commercialization and the expansion of its functionality. Wider use of the BEST-Dairy tool and its continued expansion in functionality will help reduce actual consumption of energy and water in the dairy industry. The outcomes align well with the goals set by AB 1250 for the PIER program.
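
      As an illustration of the plant-level benchmarking described above, the short sketch below compares a plant's metered energy and water use against a best-practice reference intensity for the same production volume. The reference intensities and plant figures are hypothetical placeholders, not values from the BEST-Dairy tool.

          # Minimal sketch of a plant-level benchmark comparison (illustrative only;
          # the reference intensities and plant data below are hypothetical, not BEST-Dairy values).
          def savings_estimate(production_t, actual_use, reference_intensity):
              """Savings = actual use minus best-practice use for the same production volume."""
              best_practice_use = production_t * reference_intensity
              return max(actual_use - best_practice_use, 0.0)

          # Hypothetical fluid-milk plant producing 50,000 t/yr
          energy_saving = savings_estimate(50_000, actual_use=9.0e6, reference_intensity=150.0)  # kWh, kWh/t
          water_saving = savings_estimate(50_000, actual_use=1.2e5, reference_intensity=1.8)     # m3, m3/t
          print(f"Potential energy saving: {energy_saving:,.0f} kWh/yr")
          print(f"Potential water saving: {water_saving:,.0f} m3/yr")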

    10. 2008 CHP Baseline Assessment and Action Plan for the Hawaii Market |

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      2008 CHP Baseline Assessment and Action Plan for the Hawaii Market. The purpose of this 2008 report is to provide an updated baseline assessment and action plan for combined heat and power (CHP) in Hawaii and to identify the hurdles that prevent the expanded use of CHP systems. This report was prepared by the Pacific Region CHP Application Center (RAC). (chp_hawaii_2008.pdf)

    11. Optimization of the CLIC Baseline Collimation System (Conference) | SciTech

      Office of Scientific and Technical Information (OSTI)

      Citation details for the conference paper "Optimization of the CLIC Baseline Collimation System," available through DOE's SciTech Connect, a product of the Office of Scientific and Technical Information (OSTI).

    12. EA-1943: Construction and Operation of the Long Baseline Neutrino Facility

      Office of Environmental Management (EM)

      EA-1943: Construction and Operation of the Long Baseline Neutrino Facility and Deep Underground Neutrino Experiment at Fermilab, Batavia, Illinois, and Sanford Underground Research Facility, Lead, South Dakota.

    13. Evaluation of final waste forms and recommendations for baseline alternatives to grout and glass

      SciTech Connect (OSTI)

      Bleier, A.

      1997-09-01

      An assessment of final waste forms was made as part of the Federal Facilities Compliance Agreement/Development, Demonstration, Testing, and Evaluation (FFCA/DDT&E) Program because supplemental waste-form technologies are needed for the hazardous, radioactive, and mixed wastes of concern to the Department of Energy and the problematic wastes on the Oak Ridge Reservation. The principal objective was to identify a primary waste-form candidate as an alternative to grout (cement) and glass. The effort principally comprised a literature search, the goal of which was to establish a knowledge base regarding four areas: (1) the waste-form technologies based on grout and glass, (2) candidate alternatives, (3) the wastes that need to be immobilized, and (4) the technical and regulatory constraints on the waste-form technologies. This report serves, in part, to meet this goal. Six families of materials emerged as relevant: inorganic, organic, vitrified, devitrified, ceramic, and metallic matrices. Multiple members of each family were assessed, emphasizing the materials-oriented factors and accounting for the fact that the two most prevalent types of wastes for the FFCA/DDT&E Program are aqueous liquids and inorganic sludges and solids. Presently, no individual matrix is sufficiently developed to permit its immediate implementation as a baseline alternative. Three thermoplastic materials, sulfur-polymer cement (inorganic), bitumen (organic), and polyethylene (organic), are the most technologically developed candidates. Each warrants further study, emphasizing the engineering and economic factors, but each also has limitations that relegate it to the status of a short-term alternative. The crystallinity and flexible processing of sulfur provide sulfur-polymer cement with the highest potential for short-term success via encapsulation. Long-term immobilization demands chemical stabilization, which the thermoplastic matrices do not offer. Among the properties of the remaining candidates, those of glass-ceramics (devitrified matrices) represent the best compromise for meeting the probable stricter disposal requirements in the future.

    14. Computing and Computational Sciences Directorate - Contacts

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Home › About Us Contacts Jeff Nichols Associate Laboratory Director Computing and Computational Sciences Becky Verastegui Directorate Operations Manager Computing and Computational Sciences Directorate Michael Bartell Chief Information Officer Information Technologies Services Division Jim Hack Director, Climate Science Institute National Center for Computational Sciences Shaun Gleason Division Director Computational Sciences and Engineering Barney Maccabe Division Director Computer Science

    15. Compute Nodes

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Compute Node Configuration: 6,384 nodes; 2 twelve-core AMD 'MagnyCours' 2.1-GHz processors per node; 24 cores per node (153,216 total cores); 32 GB DDR3 1333-MHz memory per node (6,000 nodes); 64 GB DDR3 1333-MHz memory per node (384 nodes); peak Gflop/s rate: 8.4 Gflops/core, 201.6 Gflops/node, 1.28 Petaflops for the entire machine. Each core has its own L1 and L2 caches, with 64 KB and 512 KB respectively; one 6-MB
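
      For reference, the quoted per-node and machine peak rates follow directly from the per-core figure; a quick arithmetic check using the numbers listed above:

          # Check of the peak-rate arithmetic quoted in the node configuration above.
          gflops_per_core = 8.4
          cores_per_node = 24
          nodes = 6_384

          gflops_per_node = gflops_per_core * cores_per_node   # 201.6 Gflop/s per node
          total_cores = cores_per_node * nodes                  # 153,216 cores
          system_pflops = gflops_per_node * nodes / 1.0e6       # ~1.29 Pflop/s (quoted as 1.28 after truncation)
          print(gflops_per_node, total_cores, round(system_pflops, 2))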

    16. Computing Resources

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Resources This page is the repository for sundry items of information relevant to general computing on BooNE. If you have a question or problem that isn't answered here, or a suggestion for improving this page or the information on it, please mail boone-computing@fnal.gov and we'll do our best to address any issues. Note about this page Some links on this page point to www.everything2.com, and are meant to give an idea about a concept or thing without necessarily wading through a whole website

    17. Quantum steady computation

      SciTech Connect (OSTI)

      Castagnoli, G. )

      1991-08-10

      This paper reports that current conceptions of quantum mechanical computers inherit from conventional digital machines two apparently interacting features, machine imperfection and temporal development of the computational process. On account of machine imperfection, the process would become ideally reversible only in the limiting case of zero speed. Therefore the process is irreversible in practice and cannot be considered to be a fundamental quantum one. By giving up classical features and using a linear, reversible and non-sequential representation of the computational process - not realizable in classical machines - the process can be identified with the mathematical form of a quantum steady state. This form of steady quantum computation would seem to have an important bearing on the notion of cognition.

    18. Assessing health impacts in complex eco-epidemiological settings in the humid tropics: Modular baseline health surveys

      SciTech Connect (OSTI)

      Winkler, Mirko S.; Divall, Mark J.; Krieger, Gary R.; Schmidlin, Sandro; Magassouba, Mohamed L.; Knoblauch, Astrid M.; Singer, Burton H.; Utzinger, Juerg

      2012-02-15

      The quantitative assessment of health impacts has been identified as a crucial feature for realising the full potential of health impact assessment (HIA). In settings where demographic and health data are notoriously scarce, but there is a broad range of ascertainable ecological, environmental, epidemiological and socioeconomic information, a diverse toolkit of data collection strategies becomes relevant for the mainly small-area impacts of interest. We present a modular, cross-sectional baseline health survey study design, which has been developed for HIA of industrial development projects in the humid tropics. The modular nature of our toolkit allows our methodology to be readily adapted to the prevailing eco-epidemiological characteristics of a given project setting. Central to our design is a broad set of key performance indicators, covering a multiplicity of health outcomes and determinants at different levels and scales. We present experience and key findings from our modular baseline health survey methodology employed in 14 selected sentinel sites within an iron ore mining project in the Republic of Guinea. We argue that our methodology is a generic example of rapid evidence assembly in difficult-to-reach localities, where improvement of the predictive validity of the assessment and establishment of a benchmark for longitudinal monitoring of project impacts and mitigation efforts is needed.

    19. Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing

      SciTech Connect (OSTI)

      Price, Phillip N.; Granderson, Jessica; Sohn, Michael; Addy, Nathan; Jump, David

      2013-09-01

      The overarching goal of this work is to advance the capabilities of technology evaluators in evaluating the building-level baseline modeling capabilities of Energy Management and Information System (EMIS) software. Through their customer engagement platforms and products, EMIS software products have the potential to produce whole-building energy savings through multiple strategies: building system operation improvements, equipment efficiency upgrades and replacements, and inducement of behavioral change among the occupants and operations personnel. Some offerings may also automate the quantification of whole-building energy savings, relative to a baseline period, using empirical models that relate energy consumption to key influencing parameters, such as ambient weather conditions and building operation schedule. These automated baseline models can be used to streamline the whole-building measurement and verification (M&V) process, and therefore are of critical importance in the context of multi-measure whole-building focused utility efficiency programs. This report documents the findings of a study that was conducted to begin answering critical questions regarding quantification of savings at the whole-building level, and the use of automated and commercial software tools. To evaluate the modeling capabilities of EMIS software particular to the use case of whole-building savings estimation, four research questions were addressed: 1. What is a general methodology that can be used to evaluate baseline model performance, both in terms of a) overall robustness, and b) relative to other models? 2. How can that general methodology be applied to evaluate proprietary models that are embedded in commercial EMIS tools? How might one handle practical issues associated with data security, intellectual property, appropriate testing blinds, and large data sets? 3. How can buildings be pre-screened to identify those that are the most model-predictable, and therefore those whose savings can be calculated with least error? 4. What is the state of public domain models, that is, how well do they perform, and what are the associated implications for whole-building measurement and verification (M&V)? Additional project objectives that were addressed as part of this study include: (1) clarification of the use cases and conditions for baseline modeling performance metrics, benchmarks and evaluation criteria, (2) providing guidance for determining customer suitability for baseline modeling, (3) describing the portfolio-level effects of baseline model estimation errors, (4) informing PG&E's development of EMIS technology product specifications, and (5) providing the analytical foundation for future studies about baseline modeling and savings effects of EMIS technologies. A final objective of this project was to demonstrate the application of the methodology, performance metrics, and test protocols with participating EMIS product vendors.
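
      To make the notion of a whole-building baseline model concrete, the sketch below fits a simple regression of load on outdoor temperature and an occupancy flag to synthetic data, scores it with CV(RMSE) (a metric commonly used for M&V baselines), and applies it to estimate avoided energy use. It is a generic, public-domain-style example under assumed synthetic data, not one of the proprietary or report-specific models evaluated in the study.

          import numpy as np

          # Generic regression baseline (illustrative; synthetic data, not the report's models).
          rng = np.random.default_rng(0)
          temp = rng.uniform(5, 35, 2000)                                  # outdoor air temperature, deg C
          occupied = (rng.uniform(size=2000) < 0.4).astype(float)          # occupied/unoccupied flag
          load = 50 + 2.5 * temp + 40 * occupied + rng.normal(0, 5, 2000)  # hourly load, kW (synthetic)

          # Fit: intercept + temperature + occupancy
          X = np.column_stack([np.ones_like(temp), temp, occupied])
          coef, *_ = np.linalg.lstsq(X, load, rcond=None)
          pred = X @ coef

          # Goodness-of-fit metric often reported for baseline models
          cv_rmse = np.sqrt(np.mean((load - pred) ** 2)) / load.mean()
          print(f"CV(RMSE) = {cv_rmse:.3f}")

          # Avoided energy use = baseline prediction minus metered post-period use
          metered_post = pred[:100] * 0.9          # pretend a 10% saving over 100 hours
          savings_kwh = (pred[:100] - metered_post).sum()
          print(f"Estimated savings over 100 h: {savings_kwh:.0f} kWh")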

    20. Sandia Energy - Computational Science

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computational Science Home Energy Research Advanced Scientific Computing Research (ASCR) Computational Science

    1. Baseline Library

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Marketing Resources Reports, Publications, and Research Agricultural Commercial Consumer Products Industrial Institutional Multi-Sector Residential...

    2. Baseline Evaluations to Support Control Room Modernization at Nuclear Power Plants

      SciTech Connect (OSTI)

      Boring, Ronald L.; Joe, Jeffrey C.

      2015-02-01

      For any major control room modernization activity at a commercial nuclear power plant (NPP) in the U.S., a utility should carefully follow the four phases prescribed by the U.S. Nuclear Regulatory Commission in NUREG-0711, Human Factors Engineering Program Review Model. These four phases include Planning and Analysis, Design, Verification and Validation, and Implementation and Operation. While NUREG-0711 is a useful guideline, it is written primarily from the perspective of regulatory review, and it therefore does not provide a nuanced account of many of the steps the utility might undertake as part of control room modernization. The guideline is largely summative—intended to catalog final products—rather than formative—intended to guide the overall modernization process. In this paper, we highlight two crucial formative sub-elements of the Planning and Analysis phase specific to control room modernization that are not covered in NUREG-0711. These two sub-elements are the usability and ergonomics baseline evaluations. A baseline evaluation entails evaluating the system as-built and currently in use. The usability baseline evaluation provides key insights into operator performance using the control system currently in place. The ergonomics baseline evaluation identifies possible deficiencies in the physical configuration of the control system. Both baseline evaluations feed into the design of the replacement system and subsequent summative benchmarking activities that help ensure that control room modernization represents a successful evolution of the control system.

    3. NERSC seeks Computational Systems Group Lead

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      NERSC seeks Computational Systems Group Lead. January 6, 2011, by Katie Antypas. Note: This position is now closed. The Computational Systems Group provides production support and advanced development for the supercomputer systems at NERSC. Manage the Computational Systems Group (CSG), which provides production support and advanced development for the supercomputer systems at NERSC (National Energy Research Scientific Computing Center). These systems, which

    4. Computer System,

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      System, Cluster, and Networking Summer Institute New Mexico Consortium and Los Alamos National Laboratory HOW TO APPLY Applications will be accepted JANUARY 5 - FEBRUARY 13, 2016 Computing and Information Technology undergraduate students are encouraged to apply. Must be a U.S. citizen. * Submit a current resume; * Official University Transcript (with spring courses posted and/or a copy of spring 2016 schedule) 3.0 GPA minimum; * One Letter of Recommendation from a Faculty Member; and * Letter of

    5. Computing Events

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Events Computing Events Spotlighting the most advanced scientific and technical applications in the world! Featuring exhibits of the latest and greatest technologies from industry, academia and government research organizations; many of these technologies will be seen for the first time in Denver. Supercomputing Conference 13 Denver, Colorado November 17-22, 2013 Spotlighting the most advanced scientific and technical applications in the world, SC13 will bring together the international

    6. MHD computations for stellarators

      SciTech Connect (OSTI)

      Johnson, J.L.

      1985-12-01

      Considerable progress has been made in the development of computational techniques for studying the magnetohydrodynamic equilibrium and stability properties of three-dimensional configurations. Several different approaches have evolved to the point where comparison of results determined with different techniques shows good agreement. 55 refs., 7 figs.

    7. Borehole temperatures and a baseline for 20th-century global warming estimates

      SciTech Connect (OSTI)

      Harris, R.N.; Chapman, D.S.

      1997-03-14

      Lack of a 19th-century baseline temperature against which 20th-century warming can be referenced constitutes a deficiency in understanding recent climate change. Combination of borehole temperature profiles, which contain a memory of surface temperature changes in previous centuries, with the meteorological archive of surface air temperatures can provide a 19th-century baseline temperature tied to the current observational record. A test case in Utah, where boreholes are interspersed with meteorological stations belonging to the Historical Climatological Network, yields a noise reduction in estimates of 20th-century warming and a baseline temperature that is 0.6° ± 0.1°C below the 1951 to 1970 mean temperature for the region. 22 refs., 3 figs., 1 tab.
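
      The re-referencing implied by the abstract is simple arithmetic: once the 19th-century baseline is tied to the 1951-1970 observational mean, any anomaly expressed relative to that mean can be restated relative to the baseline. In the small illustration below, only the 0.6 °C offset comes from the abstract; the anomaly value is hypothetical.

          # Re-referencing a temperature anomaly to a borehole-derived baseline.
          # Only the -0.6 C offset is taken from the abstract; the anomaly is hypothetical.
          baseline_offset = -0.6   # 19th-century baseline relative to the 1951-1970 mean (deg C)
          anomaly = 0.2            # hypothetical anomaly relative to the 1951-1970 mean (deg C)
          warming_since_baseline = anomaly - baseline_offset
          print(f"Warming relative to the 19th-century baseline: {warming_since_baseline:.1f} C")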

    8. Understanding the Effect of Baseline Modeling Implementation Choices on Analysis of Demand Response Performance

      SciTech Connect (OSTI)

      University of California, Berkeley; Addy, Nathan; Kiliccote, Sila; Mathieu, Johanna; Callaway, Duncan S.

      2012-06-13

      Accurate evaluation of the performance of buildings participating in Demand Response (DR) programs is critical to the adoption and improvement of these programs. Typically, we calculate load sheds during DR events by comparing observed electric demand against counterfactual predictions made using statistical baseline models. Many baseline models exist and these models can produce different shed calculations. Moreover, modelers implementing the same baseline model can make different modeling implementation choices, which may affect shed estimates. In this work, using real data, we analyze the effect of different modeling implementation choices on shed predictions. We focused on five issues: weather data source, resolution of data, methods for determining when buildings are occupied, methods for aligning building data with temperature data, and methods for power outage filtering. Results indicate sensitivity to the weather data source and data filtration methods as well as an immediate potential for automation of methods to choose building occupied modes.
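
      A minimal example of the shed calculation described above: observed demand during event hours is compared against a counterfactual baseline, here a simple average over prior non-event days. This averaging baseline is only one of the many implementation choices the paper examines, and the numbers are synthetic.

          import numpy as np

          # Illustrative load-shed calculation (a simple averaging baseline; only one
          # of the many implementation choices the paper examines; synthetic data).
          rng = np.random.default_rng(1)
          prior_days = 420 + rng.normal(0, 8, size=(10, 4))    # 10 non-event days x 4 event hours, kW
          observed_event = np.array([352.0, 358.0, 361.0, 366.0])  # metered demand during the event, kW

          baseline = prior_days.mean(axis=0)                   # counterfactual demand per event hour
          shed = baseline - observed_event                     # estimated load shed per hour
          print("hourly shed (kW):", np.round(shed, 1), " mean:", round(shed.mean(), 1))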

    9. Development and testing of FIDELE: a computer code for finite-difference solution to harmonic magnetic-dipole excitation of an azimuthally symmetric horizontally and radially layered earth

      SciTech Connect (OSTI)

      Vittitoe, C.N.

      1981-04-01

      The FORTRAN IV computer code FIDELE simulates the high-frequency electrical logging of a well in which induction and receiving coils are mounted in an instrument sonde immersed in a drilling fluid. The fluid invades layers of surrounding rock in an azimuthally symmetric pattern, superimposing radial layering upon the horizontally layered earth. Maxwell's equations are reduced to a second-order elliptic differential equation for the azimuthal electric-field intensity. The equation is solved at each spatial position where the complex dielectric constant, magnetic permeability, and electrical conductivity have been assigned. Receiver response is given as the complex open-circuit voltage on receiver coils. The logging operation is simulated by a succession of such solutions as the sonde traverses the borehole. Test problems verify consistency with available results for simple geometries. The code's main advantage is its treatment of a two-dimensional earth; its chief disadvantage is the large computer time required for typical problems. Possible code improvements are noted. Use of the computer code is outlined, and tests of most code features are presented.
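
      As a schematic illustration of the finite-difference approach described (not FIDELE's actual complex-valued azimuthal E-field formulation or geometry), the sketch below solves a toy two-dimensional variable-coefficient elliptic equation, div(sigma grad u) = 0, by Gauss-Seidel iteration, with an abrupt change in the coefficient standing in for radial layering.

          import numpy as np

          # Toy 2-D finite-difference solve of div(sigma * grad u) = 0 by Gauss-Seidel
          # iteration; a schematic stand-in for the elliptic solve described above.
          n = 32
          sigma = np.ones((n, n))      # material coefficient (conductivity-like)
          sigma[:, n // 2:] = 5.0      # abrupt change standing in for radial layering
          u = np.zeros((n, n))
          u[0, :] = 1.0                # fixed boundary value on one edge (source side)

          for _ in range(500):
              for i in range(1, n - 1):
                  for j in range(1, n - 1):
                      # face-averaged coefficients (finite-volume style weights)
                      aw = 0.5 * (sigma[i, j] + sigma[i - 1, j])
                      ae = 0.5 * (sigma[i, j] + sigma[i + 1, j])
                      as_ = 0.5 * (sigma[i, j] + sigma[i, j - 1])
                      an = 0.5 * (sigma[i, j] + sigma[i, j + 1])
                      u[i, j] = (aw * u[i - 1, j] + ae * u[i + 1, j]
                                 + as_ * u[i, j - 1] + an * u[i, j + 1]) / (aw + ae + as_ + an)

          print("field value at grid centre:", round(float(u[n // 2, n // 2]), 4))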

    10. Microsoft PowerPoint - Snippet 4.2 Integrated Baseline Review Process 20140724 [Compatibility Mode]

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      the Integrated Baseline Review (IBR) process. This snippet provides an overview of the IBR Process and will explain what it is, why it is required, when it should be conducted, who should conduct it, its applicability, and what areas may be covered during an IBR. In short, the purpose of an IBR is to achieve a mutual understanding of the baseline plan and its relationship to the underlying Earned Value Management (EVM) systems and processes that will operate during the life cycle of the project.

    11. 2008 CHP Baseline Assessment and Action Plan for the Nevada Market |

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      2008 CHP Baseline Assessment and Action Plan for the Nevada Market. The purpose of this report is to assess the current status of combined heat and power (CHP) in Nevada and to identify the hurdles that prevent the expanded use of CHP systems. The report summarizes the CHP "landscape" in Nevada, including the current installed base of CHP systems, the potential future CHP market, and the status of

    12. DOE Announces Webinars on the Mid-Atlantic Baseline Study, EPA's Clean

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      DOE Announces Webinars on the Mid-Atlantic Baseline Study, EPA's Clean Power Plan and More. November 13, 2015 - 8:30am. EERE offers webinars to the public on a range of subjects, from adopting the latest energy efficiency and renewable energy technologies, to training for the clean energy workforce. Webinars are free; however, advanced registration is typically required. You can

    13. Computing and Computational Sciences Directorate - Computer Science and

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      The Computer Science and Mathematics Division (CSMD) is ORNL's premier source of basic and applied research in high-performance computing, applied mathematics, and intelligent systems. Our mission includes basic research in computational sciences and application of advanced computing systems, computational, mathematical and analysis techniques to the solution of scientific problems of national importance. We seek to work

    14. Computing at JLab

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computing at JLab: Accelerator Controls, CAD, CDEV, CODA, Computer Center, High Performance Computing, Scientific Computing, JLab Computer Silo

    15. Parallel Computing Summer Research Internship

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Parallel Computing Parallel Computing Summer Research Internship Creates next-generation leaders in HPC research and applications development Contacts Program Co-Lead Robert (Bob) Robey Email Program Co-Lead Gabriel Rockefeller Email Program Co-Lead Hai Ah Nam Email Professional Staff Assistant Nickole Aguilar Garcia (505) 665-3048 Email The Parallel Computing Summer Research Internship is an intense 10 week program aimed at providing students with a solid foundation in modern high performance

    16. Waste Isolation Pilot Plant Transuranic Waste Baseline inventory report. Volume 2. Revision 1

      SciTech Connect (OSTI)

      1995-02-01

      This document is the Baseline Inventory Report for the transuranic (alpha-bearing) wastes stored at the Waste Isolation Pilot Plant (WIPP) in New Mexico. Waste stream profiles including origin, applicable EPA codes, typical isotopic composition, typical waste densities, and typical rates of waste generation for each facility are presented for wastes stored at the WIPP.

    17. Sandia National Laboratories/New Mexico Environmental Baseline update--Revision 1.0

      SciTech Connect (OSTI)

      1996-07-01

      This report provides a baseline update supplying the background information necessary for personnel to prepare clear and concise NEPA documentation. The environment of the Sandia National Laboratories is described in this document, including the ecology, meteorology, climatology, seismology, emissions, cultural resources and land use, visual resources, noise pollution, transportation, and socioeconomics.

    18. Baseline Risk Assessment for the F-Area Burning/Rubble Pits and Rubble Pit

      SciTech Connect (OSTI)

      Palmer, E.

      1996-03-01

      This document provides an overview of the Savannah River Site (SRS) and a description of the F-Area Burning/Rubble Pits (BRPs) and Rubble Pit (RP) unit. It also describes the objectives and scope of the baseline risk assessment (BRA).

    19. Computing Resources | Argonne Leadership Computing Facility

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computing Resources Mira Cetus and Vesta Visualization Cluster Data and Networking Software JLSE Computing Resources Theory and Computing Sciences Building Argonne's Theory and Computing Sciences (TCS) building houses a wide variety of computing systems including some of the most powerful supercomputers in the world. The facility has 25,000 square feet of raised computer floor space and a pair of redundant 20 megavolt amperes electrical feeds from a 90 megawatt substation. The building also

    20. Extreme Scale Computing, Co-design

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Information Science, Computing, Applied Math » Extreme Scale Computing, Co-design Extreme Scale Computing, Co-design Computational co-design may facilitate revolutionary designs in the next generation of supercomputers. Get Expertise Tim Germann Physics and Chemistry of Materials Email Allen McPherson Energy and Infrastructure Analysis Email Turab Lookman Physics and Condensed Matter and Complex Systems Email Computational co-design involves developing the interacting components of a

    1. Magnetic resonance imaging and computational fluid dynamics (CFD) simulations of rabbit nasal airflows for the development of hybrid CFD/PBPK models

      SciTech Connect (OSTI)

      Corley, Richard A.; Minard, Kevin R.; Kabilan, Senthil; Einstein, Daniel R.; Kuprat, Andrew P.; Harkema, J. R.; Kimbell, Julia; Gargas, M. L.; Kinzell, John H.

      2009-06-01

      The percentages of total airflows over the nasal respiratory and olfactory epithelium of female rabbits were calculated from computational fluid dynamics (CFD) simulations of steady-state inhalation. These airflow calculations, along with nasal airway geometry determinations, are critical parameters for hybrid CFD/physiologically based pharmacokinetic models that describe the nasal dosimetry of water-soluble or reactive gases and vapors in rabbits. CFD simulations were based upon three-dimensional computational meshes derived from magnetic resonance images of three adult female New Zealand White (NZW) rabbits. In the anterior portion of the nose, the maxillary turbinates of rabbits are considerably more complex than comparable regions in rats, mice, monkeys, or humans. This leads to a greater surface area to volume ratio in this region and thus the potential for increased extraction of water soluble or reactive gases and vapors in the anterior portion of the nose compared to many other species. Although there was considerable interanimal variability in the fine structures of the nasal turbinates and airflows in the anterior portions of the nose, there was remarkable consistency between rabbits in the percentage of total inspired airflows that reached the ethmoid turbinate region (~50%) that is presumably lined with olfactory epithelium. These latter results (airflows reaching the ethmoid turbinate region) were higher than previous published estimates for the male F344 rat (19%) and human (7%). These differences in regional airflows can have significant implications in interspecies extrapolations of nasal dosimetry.

    2. High Performance Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      HPC INL Logo Home High-Performance Computing INL's high-performance computing center provides general use scientific computing capabilities to support the lab's efforts in advanced...

    3. Computer Architecture Lab

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      The goal of the Computer Architecture Laboratory (CAL) is to engage in...

    4. RCRA Facility Investigation/Remedial Investigation Report with the Baseline Risk Assessment for the 716-A Motor Shops Seepage Basin

      SciTech Connect (OSTI)

      Palmer, E.

      1997-08-25

      This document describes the RCRA Facility Investigation/Remedial Investigation/Baseline Risk Assessment of the 716-A Motor Shops Seepage Basin.

    5. Baseline point source load inventory, 1985. 1991 reevaluation report No. 2

      SciTech Connect (OSTI)

      Not Available

      1993-02-04

      The report finalizes and documents the Chesapeake Bay Agreement states' 1985 point source nutrient load estimates initially presented in the Baywide Nutrient Reduction Strategy (BNRS). The Bay Agreement states include Maryland, Virginia, Pennsylvania, and the District of Columbia. Each state's final annual discharged 1985 point source total phosphorus and total nitrogen nutrient load estimates are presented. These estimates are to serve as the point source baseline for the year 2000 40% nutrient reduction goal. Facility-by-facility flows, nutrient concentrations, and nutrient loads for 1985 from above the fall line (AFL) and from below the fall line (BFL) are presented. The report presents the percent change in the 1985 baseline loads for each of the Bay Agreement states relative to 1991. Estimates of 1991 nutrient loads are not available for non-agreement states at this time.

    6. Free-piston Stirling engine experimental program: Part 1. Baseline test summary

      SciTech Connect (OSTI)

      Berggren, R.; Moynihan, T.

      1983-06-01

      Free-Piston Stirling Engine experimental data are presented from a series of tests that establish the operating characteristics of the engine and determine performance repeatability. The operating envelope of the engine was mapped to determine maximum parameter range and repeatability. Tests were then carried out in which individual operating parameters were varied while others were maintained constant. These data establish the baseline operation of the engine as a preliminary to a series of tests in which several suspected sources of energy loss are investigated by changing the engine geometry to isolate and magnify each suspected loss mechanism. Performance with the geometry change is compared against baseline operation to quantify the magnitude of the loss mechanism under investigation. The results of the loss mechanism investigation are presented in Part 2 of this report.

    7. What Are the Computational Keys to Future Scientific Discoveries...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computing Center (NERSC) developed a Data Intensive Computing Pilot. "Many of the big data challenges that have long existed in the particle and high energy physics world...

    8. Overview of Computer-Aided Engineering of Batteries (CAEBAT)...

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      Computer-Aided Engineering of Batteries (CAEBAT) and Introduction to Multi-Scale, ... Merit Review 2014: Development of Computer-Aided Design Tools for Automotive Batteries

    9. Mixed waste focus area integrated technical baseline report. Phase I, Volume 2: Revision 0

      SciTech Connect (OSTI)

      1996-01-16

      This document (Volume 2) contains Appendices A through J for the Mixed Waste Focus Area Integrated Technical Baseline Report Phase I for the Idaho National Engineering Laboratory. Included are: Waste Type Managers' resumes; detailed information on wastewater, combustible organics, debris, unique waste, and inorganic homogeneous solids and soils; and waste data information. A detailed list of technology deficiencies and site needs identification is also provided.

    10. Results from baseline tests of the SPRE I and comparison with code model predictions

      SciTech Connect (OSTI)

      Cairelli, J.E.; Geng, S.M.; Skupinski, R.C.

      1994-09-01

      The Space Power Research Engine (SPRE), a free-piston Stirling engine with linear alternator, is being tested at the NASA Lewis Research Center as part of the Civil Space Technology Initiative (CSTI) as a candidate for high capacity space power. This paper presents results of base-line engine tests at design and off-design operating conditions. The test results are compared with code model predictions.

    11. River Corridor Baseline Risk Assessment (RCBRA) Human Health Risk Assessment (Volume 2)

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Sands Jim Hansen U.S. Department of Energy - Richland Operations Office October 12, 2011 River Corridor Baseline Risk Assessment (RCBRA) Human Health Risk Assessment (Volume 2) * RCBRA Human Health Risk Assessment is final - Response provided to HAB advice #246 * RCBRA Ecological Risk Assessment (Draft C) was transmitted to regulators September 27 * Columbia River Component - Draft Ecological Screening Level Risk Assessment ready for regulator review - Draft Human health risk assessment will be

    12. ITP Distributed Energy: 2008 Combined Heat and Power Baseline Assessment and Action Planfor the Nevada Market

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      UC Berkeley UC Irvine San Diego State University 2008 Combined Heat and Power Baseline Assessment and Action Plan for the Nevada Market Final Project Report September 30, 2008 Prepared By: Pacific Region Combined Heat and Power Application Center Timothy Lipman 1 Frank Ling 1 Vincent McDonell 2 Asfaw Beyene 3 Daniel Kammen 1 Scott Samuelsen 2 1 University of California - Berkeley 2 University of California - Irvine 3 San Diego State University

    13. Microsoft PowerPoint - Snippet 3.1A IMS Initial Baseline Review 20140712 [Compatibility Mode]

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      how to review an Integrated Master Schedule (IMS) and highlights common areas of non-compliance with ANSI/EIA-748 guidelines. When the scheduling guidelines and requirements are implemented correctly, the results support creation of a realistic IMS and critical path the DOE can rely on to assess schedule performance, predictive analysis, and risk. This snippet is recommended whenever a schedule baseline is created or revised. 1 The Contract is the prevailing document regarding what Earned Value

    14. BASELINE RISK ASSESSMENT OF GROUND WATER CONTAMINATION AT THE URANIUM MILL TAILINGS

      Office of Legacy Management (LM)

      BASELINE RISK ASSESSMENT OF GROUND WATER CONTAMINATION AT THE URANIUM MILL TAILINGS SITE NEAR RIVERTON, WYOMING. Prepared by the U.S. Department of Energy, Albuquerque, New Mexico, September 1995. INTENDED FOR PUBLIC RELEASE. This report has been reproduced from the best available copy. Number of pages in this report: 166. DOE and DOE contractors can obtain copies of this report from: Office

    15. The impact of sterile neutrinos on CP measurements at long baselines

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Gandhi, Raj; Kayser, Boris; Masud, Mehedi; Prakash, Suprabh

      2015-09-01

      With the Deep Underground Neutrino Experiment (DUNE) as an example, we show that the presence of even one sterile neutrino of mass ~1 eV can significantly impact the measurements of CP violation in long baseline experiments. Using a probability level analysis and neutrino-antineutrino asymmetry calculations, we discuss the large magnitude of these effects, and show how they translate into significant event rate deviations at DUNE. These results demonstrate that measurements which, when interpreted in the context of the standard three family paradigm, indicate CP conservation at long baselines, may, in fact hide large CP violation if there is a sterile state. Similarly, any data indicating the violation of CP cannot be properly interpreted within the standard paradigm unless the presence of sterile states of mass O(1 eV) can be conclusively ruled out. Our work underscores the need for a parallel and linked short baseline oscillation program and a highly capable near detector for DUNE, in order that its highly anticipated results on CP violation in the lepton sector may be correctly interpreted.

    16. Idaho National Engineering Laboratory (INEL) Environmental Restoration Program (ERP), Baseline Safety Analysis File (BSAF). Revision 1

      SciTech Connect (OSTI)

      Not Available

      1994-06-20

      This document was prepared to take the place of a Safety Evaluation Report since the Baseline Safety Analysis File (BSAF) and associated Baseline Technical Safety Requirements (TSR) File do not meet the requirements for complete safety analysis documentation. Its purpose is to present in summary form the background of how the BSAF and Baseline TSR originated and a description of the process by which it was produced and approved for use in the Environmental Restoration Program. The BSAF is a facility safety reference document for INEL environmental restoration activities including environmental remediation of inactive waste sites and decontamination and decommissioning (D&D) of surplus facilities. The BSAF contains safety bases common to environmental restoration activities and guidelines for performing and documenting safety analysis. The common safety bases can be incorporated by reference into the safety analysis documentation prepared for individual environmental restoration activities with justification and any necessary revisions. The safety analysis guidelines in BSAF provide an accepted method for hazard analysis; analysis of normal, abnormal, and accident conditions; human factors analysis; and derivation of TSRs. The BSAF safety bases and guidelines are graded for environmental restoration activities.

    17. Development of a lab-scale, high-resolution, tube-generated X-ray computed-tomography system for three-dimensional (3D) materials characterization

      SciTech Connect (OSTI)

      Mertens, J.C.E.; Williams, J.J.; Chawla, Nikhilesh

      2014-06-01

      The design and construction of a modular high resolution X-ray computed tomography (XCT) system is highlighted in this paper. The design approach is detailed for meeting a specified set of instrument performance goals tailored towards experimental versatility and high resolution imaging. The XCT tool is unique in the detector and X-ray source design configuration, enabling control in the balance between detection efficiency and spatial resolution. The system package is also unique: the sample manipulation approach implemented enables a wide gamut of in situ experimentation to analyze structure evolution under applied stimulus, by optimizing scan conditions through a high degree of controllability. The component selection and design process is detailed: incorporated components are specified, custom designs are shared, and the approach for their integration into a fully functional XCT scanner is provided. Custom designs discussed include the dual-target X-ray source cradle, which maintains position and trajectory of the beam between the two X-ray target configurations with respect to a scintillator mounting and positioning assembly and the imaging sensor, as well as a novel large-format X-ray detector with enhanced adaptability. The instrument is discussed from an operational point of view, including the details of data acquisition and processing implemented for 3D imaging via micro-CT. The performance of the instrument is demonstrated on a silica-glass particle/hydroxyl-terminated-polybutadiene (HTPB) matrix binder PBX simulant. Post-scan data processing, specifically segmentation of the sample's relevant microstructure from the 3D reconstruction, is provided to demonstrate the utility of the instrument. - Highlights: custom-built X-ray tomography system for microstructural characterization; detector design for maximizing polychromatic X-ray detection efficiency; X-ray source design for maximizing X-ray flux with respect to imaging resolution; novel lab-scale XCT data acquisition and data processing methods; 3D characterization of a glass-bead mock plastic-bonded-explosive simulant.
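
      Post-scan segmentation of the kind mentioned above often reduces, in its simplest form, to thresholding the reconstructed grayscale volume. The sketch below applies a fixed threshold to a synthetic volume to extract a particle mask and a volume fraction; the volume and threshold value are placeholders, not data from this instrument.

          import numpy as np

          # Minimal sketch of grayscale threshold segmentation of a reconstructed volume
          # (synthetic placeholder data, not output from the instrument described above).
          rng = np.random.default_rng(2)
          volume = rng.normal(0.3, 0.05, size=(64, 64, 64))   # "binder" background intensity
          volume[20:40, 20:40, 20:40] += 0.4                  # brighter "particle" region

          threshold = 0.5                                     # chosen from the intensity histogram in practice
          particles = volume > threshold                      # boolean particle mask

          particle_fraction = particles.mean()
          print(f"segmented particle volume fraction: {particle_fraction:.3f}")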

    18. Community Greening: How to Develop a Strategic Plan | Open Energy...

      Open Energy Info (EERE)

      Focus Area People and Policy Phase Bring the Right People Together, Create a Vision, Determine Baseline, Evaluate Options, Develop Goals, Prepare a Plan, Get Feedback,...

    19. Reference Model Development

      SciTech Connect (OSTI)

      Jepsen, Richard

      2011-11-02

      Presentation from the 2011 Water Peer Review in which the principal investigator discusses project progress in developing a representative set of Reference Models (RM) for the MHK industry to develop baseline cost of energy (COE) and evaluate key cost component/system reduction pathways.

    20. FY12 Quarter 3 Computing Utilization Report LANL

      SciTech Connect (OSTI)

      Wampler, Cheryl L. [Los Alamos National Laboratory; McClellan, Laura Ann [Los Alamos National Laboratory

      2012-07-25

      DSW continues to dominate the capacity workload, with a focus in Q3 on common model baselining runs in preparation for the Annual Assessment Review (AAR) of the weapon systems. There remains unmet demand for higher fidelity simulations, and for increased throughput of simulations. Common model baselining activities would benefit from doubling the resolution of the models and running twice as many simulations. Capacity systems were also utilized during the quarter to prepare for upcoming Level 2 milestones. Other notable DSW activities include validation of new physics models and safety studies. The safety team used the capacity resources extensively for projects involving 3D computer simulations for the Furrow series of experiments at DARHT (a Level 2 milestone), fragment impact, surety theme, PANTEX assessments, and the 120-day study. With the more than tripling of classified capacity computing resources with the addition of the Luna system and the safety team's imminent access to the Cielo system, demand has been met for current needs. The safety team has performed successful scaling studies on Luna up to 16K PE size-jobs with linear scaling, running the large 3D simulations required for the analysis of Furrow. They will be investigating scaling studies on the Cielo system with the Lustre file system in Q4. Overall average capacity utilization was impacted by negative effects of the LANL Voluntary Separation Program (VSP) at the beginning of Q3, in which programmatic staffing was reduced by 6%, with further losses due to management backfills and attrition, resulting in about 10% fewer users. All classified systems were impacted in April by a planned 2 day red network outage. ASC capacity workload continues to focus on code development, regression testing, and verification and validation (V&V) studies. Significant capacity cycles were used in preparation for a JOWOG in May and several upcoming L2 milestones due in Q4. A network transition has been underway on the unclassified networks to increase access of all ASC users to the unclassified systems through the Yellow Turquoise Integration (YeTI) project. This will help to alleviate the longstanding shortage of resources for ASC unclassified code development and regression testing, and also make a broader palette of machines available to unclassified ASC users, including PSAAP Alliance users. The Moonlight system will be the first capacity resource to be made available through the YETI project, and will make available a significant increase in cycles, as well as GPGPU accelerator technology. The Turing and Lobo machines will be decommissioned in the next quarter. ASC projects running on Cielo as part of the CCC-3 include turbulence, hydrodynamics, burn, asteroids, polycrystals, capability and runtime performance improvements, and materials including carbon and silicone.

    1. Computational analysis of storage synthesis in developing Brassica napus L. (oilseed rape) embryos: Flux variability analysis in relation to 13C-metabolic flux analysis

      SciTech Connect (OSTI)

      Hay, J.; Schwender, J.

      2011-08-01

      Plant oils are an important renewable resource, and seed oil content is a key agronomical trait that is in part controlled by the metabolic processes within developing seeds. A large-scale model of cellular metabolism in developing embryos of Brassica napus (bna572) was used to predict biomass formation and to analyze metabolic steady states by flux variability analysis under different physiological conditions. Predicted flux patterns are highly correlated with results from prior 13C metabolic flux analysis of B. napus developing embryos. Minor differences from the experimental results arose because bna572 always selected only one sugar and one nitrogen source from the available alternatives, and failed to predict the use of the oxidative pentose phosphate pathway. Flux variability, indicative of alternative optimal solutions, revealed alternative pathways that can provide pyruvate and NADPH to plastidic fatty acid synthesis. The nutritional values of different medium substrates were compared based on the overall carbon conversion efficiency (CCE) for the biosynthesis of biomass. Although bna572 has a functional nitrogen assimilation pathway via glutamate synthase, the simulations predict an unexpected role of glycine decarboxylase operating in the direction of NH4+ assimilation. Analysis of the light-dependent improvement of carbon economy predicted two metabolic phases. At very low light levels small reductions in CO2 efflux can be attributed to enzymes of the tricarboxylic acid cycle (oxoglutarate dehydrogenase, isocitrate dehydrogenase) and glycine decarboxylase. At higher light levels relevant to the 13C flux studies, ribulose-1,5-bisphosphate carboxylase activity is predicted to account fully for the light-dependent changes in carbon balance.
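
      Flux variability analysis, the method applied to bna572 above, computes the admissible range of each reaction flux subject to steady-state mass balance and bounds. The sketch below runs it on a three-reaction toy network with scipy's linear-programming solver; the real analysis additionally fixes the biomass objective at its optimum, which is omitted here, and the network shown is not bna572.

          import numpy as np
          from scipy.optimize import linprog

          # Toy flux variability analysis (FVA) on a 3-reaction network (illustrative only).
          # Stoichiometric matrix S (rows: metabolites, columns: reactions):
          # one metabolite produced by reaction 0 and consumed by reactions 1 and 2.
          S = np.array([[1, -1, -1]])
          bounds = [(0, 10), (0, 10), (0, 10)]   # flux bounds for each reaction

          flux_ranges = []
          for j in range(S.shape[1]):
              c = np.zeros(S.shape[1]); c[j] = 1.0
              v_min = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds).fun
              v_max = -linprog(-c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds).fun
              flux_ranges.append((v_min, v_max))

          for j, (v_min, v_max) in enumerate(flux_ranges):
              print(f"reaction {j}: flux range [{v_min:.1f}, {v_max:.1f}]")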

    2. Evaluation of metrics and baselines for tracking greenhouse gas emissions trends: Recommendations for the California climate action registry

      SciTech Connect (OSTI)

      Price, Lynn; Murtishaw, Scott; Worrell, Ernst

      2003-06-01

      Executive Summary: The California Climate Action Registry, which was initially established in 2000 and began operation in Fall 2002, is a voluntary registry for recording annual greenhouse gas (GHG) emissions. The purpose of the Registry is to assist California businesses and organizations in their efforts to inventory and document emissions in order to establish a baseline and to document early actions to increase energy efficiency and decrease GHG emissions. The State of California has committed to use its ''best efforts'' to ensure that entities that establish GHG emissions baselines and register their emissions will receive ''appropriate consideration under any future international, federal, or state regulatory scheme relating to greenhouse gas emissions.'' Reporting of GHG emissions involves documentation of both ''direct'' emissions from sources that are under the entity's control and indirect emissions controlled by others. Electricity generated by an off-site power source is considered to be an indirect GHG emission and is required to be included in the entity's report. Registry participants include businesses, non-profit organizations, municipalities, state agencies, and other entities. Participants are required to register the GHG emissions of all operations in California, and are encouraged to report nationwide. For the first three years of participation, the Registry only requires the reporting of carbon dioxide (CO2) emissions, although participants are encouraged to report the remaining five Kyoto Protocol GHGs (CH4, N2O, HFCs, PFCs, and SF6). After three years, reporting of all six Kyoto GHG emissions is required. The enabling legislation for the Registry (SB 527) requires total GHG emissions to be registered and requires reporting of ''industry-specific metrics'' once such metrics have been adopted by the Registry. The Ernest Orlando Lawrence Berkeley National Laboratory (Berkeley Lab) was asked to provide technical assistance to the California Energy Commission (Energy Commission) related to the Registry in three areas: (1) assessing the availability and usefulness of industry-specific metrics, (2) evaluating various methods for establishing baselines for calculating GHG emissions reductions related to specific actions taken by Registry participants, and (3) establishing methods for calculating electricity CO2 emission factors. The third area of research was completed in 2002 and is documented in Estimating Carbon Dioxide Emissions Factors for the California Electric Power Sector (Marnay et al., 2002). This report documents our findings related to the first two areas of research. For the first area of research, the overall objective was to evaluate the metrics, such as emissions per economic unit or emissions per unit of production, that can be used to report GHG emissions trends for potential Registry participants. This research began with an effort to identify methodologies, benchmarking programs, inventories, protocols, and registries that use industry-specific metrics to track trends in energy use or GHG emissions in order to determine what types of metrics have already been developed. The next step in developing industry-specific metrics was to assess the availability of data needed to determine metric development priorities.
Berkeley Lab also determined the relative importance of different potential Registry participant categories in order to asses s the availability of sectoral or industry-specific metrics and then identified industry-specific metrics in use around the world. While a plethora of metrics was identified, no one metric that adequately tracks trends in GHG emissions while maintaining confidentiality of data was identified. As a result of this review, Berkeley Lab recommends the development of a GHG intensity index as a new metric for reporting and tracking GHG emissions trends.Such an index could provide an industry-specific metric for reporting and tracking GHG emissions trends to accurately reflect year to year changes while protecting proprietary data. This GHG intensity index changes while protecting proprietary data. This GHG intensity index would provide Registry participants with a means for demonstrating improvements in their energy and GHG emissions per unit of production without divulging specific values. For the second research area, Berkeley Lab evaluated various methods used to calculate baselines for documentation of energy consumption or GHG emissions reductions, noting those that use industry-specific metrics. Accounting for actions to reduce GHGs can be done on a project-by-project basis or on an entity basis. Establishing project-related baselines for mitigation efforts has been widely discussed in the context of two of the so-called ''flexible mechanisms'' of the Kyoto Protocol to the United Nations Framework Convention on Climate Change (Kyoto Protocol) Joint Implementation (JI) and the Clean Development Mechanism (CDM).
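
      The GHG intensity index recommended above can be illustrated with a minimal sketch: emissions are divided by production for each year, and the resulting intensities are indexed to a base year so that only the index, not the underlying emissions or production figures, needs to be reported. The code below is an illustration of that idea, not the Registry's adopted protocol, and every figure in it is hypothetical.

      # Minimal sketch of a production-normalized GHG intensity index, with the base
      # year set to 100 so trends can be reported without divulging absolute values.
      # Illustrative only; not the California Climate Action Registry's method.

      def intensity_index(emissions_tco2e, production_units, base_year):
          """Return {year: index}, with the base year normalized to 100."""
          intensity = {yr: emissions_tco2e[yr] / production_units[yr]
                       for yr in emissions_tco2e}
          base = intensity[base_year]
          return {yr: 100.0 * val / base for yr, val in sorted(intensity.items())}

      if __name__ == "__main__":
          emissions = {2000: 52_000, 2001: 50_500, 2002: 49_000}      # t CO2e (hypothetical)
          production = {2000: 100_000, 2001: 101_000, 2002: 103_000}  # units (hypothetical)
          for year, idx in intensity_index(emissions, production, base_year=2000).items():
              print(year, round(idx, 1))   # only the index is disclosed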

    3. Exploratory Experimentation and Computation

      SciTech Connect (OSTI)

      Bailey, David H.; Borwein, Jonathan M.

      2010-02-25

      We believe the mathematical research community is facing a great challenge to re-evaluate the role of proof in light of recent developments. On one hand, the growing power of current computer systems, of modern mathematical computing packages, and of the growing capacity to data-mine on the Internet, has provided marvelous resources to the research mathematician. On the other hand, the enormous complexity of many modern capstone results such as the Poincare conjecture, Fermat's last theorem, and the classification of finite simple groups has raised questions as to how we can better ensure the integrity of modern mathematics. Yet as the need and prospects for inductive mathematics blossom, the requirement to ensure the role of proof is properly founded remains undiminished.

    4. About the Advanced Computing Tech Team | Department of Energy

      Energy Savers [EERE]

      Advanced Computing Tech Team About the Advanced Computing Tech Team The Advanced Computing Tech Team is made up of representatives from DOE and its national laboratories who are involved with developing and using advanced computing tools. The following is a list of some of those programs and how they are currently using advanced computing in pursuit of their respective missions. Advanced Scientific Computing Research (ASCR) The mission of the Advanced Scientific Computing Research (ASCR)

    5. Application of Robust Design and Advanced Computer Aided Engineering Technologies: Cooperative Research and Development Final Report, CRADA Number CRD-04-143

      SciTech Connect (OSTI)

      Thornton, M.

      2013-06-01

      Oshkosh Corporation (OSK) is taking an aggressive approach to implementing advanced technologies, including hybrid electric vehicle (HEV) technology, throughout their commercial and military product lines. These technologies have important implications for OSK's commercial and military customers, including fleet fuel efficiency, quiet operational modes, additional on-board electric capabilities, and lower thermal signature operation. However, technical challenges exist with selecting the optimal HEV components and design to work within the performance and packaging constraints of specific vehicle applications. OSK desires to use unique expertise developed at the Department of Energy's (DOE) National Renewable Energy Laboratory (NREL), including HEV modeling and simulation. These tools will be used to overcome technical hurdles to implementing advanced heavy vehicle technology that meets performance requirements while improving fuel efficiency.
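
      The component-selection trade study described above can be sketched in a few lines: candidate hybrid configurations are screened against packaging and performance constraints, and the survivors are ranked by simulated fuel economy. The snippet below only illustrates that screening step; it is not NREL's modeling toolchain, and every component value and threshold in it is hypothetical.

      # Hypothetical sketch of constraint screening for HEV component selection.
      # Illustrative only; not NREL's simulation tools or Oshkosh's requirements.

      from dataclasses import dataclass

      @dataclass
      class Config:
          name: str
          motor_kw: float       # traction motor rating
          battery_kwh: float    # energy storage capacity
          pack_volume_l: float  # packaging volume required
          est_mpg: float        # fuel economy from a vehicle-level simulation (assumed given)

      def feasible(c, min_motor_kw=120, max_pack_volume_l=400):
          """Performance and packaging constraints (illustrative thresholds)."""
          return c.motor_kw >= min_motor_kw and c.pack_volume_l <= max_pack_volume_l

      candidates = [
          Config("parallel-A", 150, 12, 380, 9.1),
          Config("series-B",   180, 20, 450, 9.8),   # fails the packaging constraint
          Config("parallel-C", 130, 10, 350, 8.7),
      ]

      best = max((c for c in candidates if feasible(c)), key=lambda c: c.est_mpg)
      print("selected:", best.name, best.est_mpg, "mpg")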

    6. Extreme Scale Computing, Co-design

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      ... software, and hardware in an integrated computational co-design process. * Designed Cruft, a suite of molecular dynamics proxy applications (software) developed to explore ...

    7. Inexpensive computer data-acquisition system

      SciTech Connect (OSTI)

      Galvin, J.E.; Brown, I.G.

      1985-10-01

      A system based on an Apple II+ personal computer is used for on-line monitoring of ion-beam characteristics in accelerator ion source development.

    8. Computer System, Cluster, and Networking Summer Institute Program Description

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      System, Cluster, and Networking Summer Institute Program Description The Computer System, Cluster, and Networking Summer Institute (CSCNSI) is a focused technical enrichment program targeting third-year college undergraduate students currently engaged in a computer science, computer engineering, or similar major. The program emphasizes practical skill development in setting up, configuring, administering, testing, monitoring, and scheduling computer systems, supercomputer clusters, and computer

    9. Fermilab | Science at Fermilab | Computing | Grid Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      In the early 2000s, members of Fermilab's Computing Division looked ahead to experiments like those at the Large Hadron Collider, which would collect more data than any computing ...

    10. Lawrence Livermore National Laboratory Emergency Response Capability Baseline Needs Assessment Requirement Document

      SciTech Connect (OSTI)

      Sharry, J A

      2009-12-30

      This revision of the LLNL Fire Protection Baseline Needs Assessment (BNA) was prepared by John A. Sharry, LLNL Fire Marshal and LLNL Division Leader for Fire Protection and reviewed by Martin Gresho, Sandia/CA Fire Marshal. The document follows and expands upon the format and contents of the DOE Model Fire Protection Baseline Capabilities Assessment document contained on the DOE Fire Protection Web Site, but only addresses emergency response. The original LLNL BNA was created on April 23, 1997 as a means of collecting all requirements concerning emergency response capabilities at LLNL (including response to emergencies at Sandia/CA) into one BNA document. The original BNA documented the basis for emergency response, emergency personnel staffing, and emergency response equipment over the years. The BNA has been updated and reissued five times since, in 1998, 1999, 2000, 2002, and 2004. A significant format change was performed in the 2004 update of the BNA in that it was 'zero based.' Starting with the requirement documents, the 2004 BNA evaluated the requirements and determined minimum needs without regard to previous evaluations. This 2010 update maintains the same basic format and requirements as the 2004 BNA. In this 2010 BNA, as in the previous BNA, the document has been intentionally divided into two separate documents - the needs assessment (1) and the compliance assessment (2). The needs assessment will be referred to as the BNA and the compliance assessment will be referred to as the BNA Compliance Assessment. The primary driver for separation is that the needs assessment identifies the detailed applicable regulations (primarily NFPA Standards) for emergency response capabilities based on the hazards present at LLNL and Sandia/CA and the geographical location of the facilities. The needs assessment also identifies areas where the modification of the requirements in the applicable NFPA standards is appropriate, due to the improved fire protection provided and the remote location and low population density of some of the facilities. As such, the needs assessment contains equivalencies to the applicable requirements. The compliance assessment contains no such equivalencies and simply assesses the existing emergency response resources against the requirements of the BNA, and can be updated as compliance changes, independent of the BNA update schedule. There are numerous NFPA codes and standards and other requirements and guidance documents that address the subject of emergency response. These requirements documents are not always well coordinated and may contain duplicative or conflicting requirements or even coverage gaps. Left unaddressed, this regulatory situation results in frequent interpretation of requirements documents. Different interpretations can then lead to inconsistent implementation. This BNA addresses this situation by compiling applicable requirements from all identified sources (see Section 5) and analyzing them collectively to address conflict and overlap as applicable to the hazards presented by the LLNL and Sandia/CA sites (see Section 7). The BNA also generates requirements when needed to fill any identified gaps in regulatory coverage. 
Finally, the BNA produces a customized, simple set of requirements, appropriate for the DOE protection goals (such as those defined in DOE O 420.1B), the hazard level, the population density, the topography, and the site layout at LLNL and Sandia/CA, that will be used as the baseline requirements set - the 'baseline needs' - for emergency response at LLNL and Sandia/CA. A template approach is utilized to accomplish this evaluation for each of the nine topical areas that comprise the baseline needs for emergency response. The basis for conclusions reached in determining the baseline needs for each of the topical areas is presented in Sections 7.1 through 7.9. This BNA identifies only mandatory requirements and establishes the minimum performance criteria. The minimum performance criteria may not be the level of performance desired by Lawrence Livermore National Laboratory or Sandia/CA.

    11. Mira Computational Readiness Assessment | Argonne Leadership Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Mira Computational Readiness Assessment: Assess your project's computational readiness for Mira. A review of the following computational readiness points in relation to scaling, porting, I/O, memory

    12. The mixed waste management facility. Project baseline revision 1.2

      SciTech Connect (OSTI)

      Streit, R.D.; Throop, A.L.

      1995-04-01

      Revision 1.2 to the Project Baseline (PB) for the Mixed Waste Management Facility (MWMF) is in response to DOE directives and verbal guidance to (1) Collocate the Decontamination and Waste Treatment Facility (DWTF) and MWMF into a single complex and integrate certain overlapping functions as a cost-saving measure; (2) Meet certain fiscal year (FY) new-BA funding objectives ($15.3M in FY95) with lower and roughly balanced funding for out-years; (3) Reduce Total Project Cost (TPC) for the MWMF Project; (4) Include costs for all appropriate permitting activities in the project TPC. This baseline revision also incorporates revisions in the technical baseline design for Molten Salt Oxidation (MSO) and Mediated Electrochemical Oxidation (MEO). Changes in the WBS dictionary that are necessary as a result of this rebaseline, as well as minor title changes, at WBS Level 3 or above (DOE control level) are approved as a separate document. For completeness, the WBS dictionary that reflects these changes is contained in Appendix B. The PB, with revisions as described in this document, was also the basis for the FY97 Validation Process, presented to DOE and their reviewers on March 21-22, 1995. Appendix C lists information related to prior revisions to the PB. Several key changes relate to the integration of functions and sharing of facilities between the portion of the DWTF that will house the MWMF and those portions that are used by the Hazardous Waste Management (HWM) Division at LLNL. This collocation has been directed by DOE as a cost-saving measure and has been implemented in a manner that maintains separate operational elements from a safety and permitting viewpoint. Appendix D provides background information on the decision and implications of collocating the two facilities.

    13. A Study to Develop an Industrial-Scale, Computer-Controlled High Magnetic Field Processing (HMFP) System to Assist in Commercializing the Novel, Enabling HMFP Manufacturing Technology

      SciTech Connect (OSTI)

      Ludtka, Gail Mackiewicz-; Chourey, Aashish

      2010-08-01

      As the original magnet designer and manufacturer of ORNL's 9T, 5-inch ID bore magnet, American Magnetics Inc. (AMI) has collaborated with ORNL's Materials Processing Group, and this partnership has been instrumental in the development of our unique thermo-magnetic facilities and expertise. Consequently, AMI and ORNL have realized that the commercial implementation of the High Magnetic Field Processing (HMFP) technology will require the evolution of robust, automated superconducting (SC) magnet systems that will be cost-effective and easy to operate in an industrial environment. The goal of this project and CRADA is to significantly expedite the timeline for implementing this revolutionary and pervasive cross-cutting technology for future US-produced industrial components. The successful completion of this project is anticipated to significantly assist in the timely commercialization and licensing of our HMFP intellectual property for a broad spectrum of industries, and to open up a new market for AMI. One notable outcome of this project is that the ThermoMagnetic Processing Technology won a prestigious 2009 R&D 100 Award. This award acknowledges and recognizes our TMP Technology as one of the top 100 innovative US technologies in 2009. By successfully establishing the design requirements for a commercial-scale magnetic processing system, this project effort has accomplished a key first step in facilitating the building and demonstration of a superconducting magnetic processing coil, enabling the transition of the High Magnetic Field Processing Technology beyond a laboratory novelty into a commercially viable and industrially scalable Manufacturing Technology.

    14. SHORT-BASELINE NEUTRINO PHYSICS AT MiniBooNE E. D. Zimmerman

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      NEUTRINO PHYSICS AT MiniBooNE E. D. Zimmerman University of Colorado PANIC 2011 Cambridge, Mass. 25 July 2011 Short-Baseline Neutrino Physics at MiniBooNE * MiniBooNE * Neutrino cross-sections * Hadron production channels * Oscillation physics * Antineutrino Oscillations * MiniBooNE-SciBooNE joint result Motivating MiniBooNE: LSND Liquid Scintillator Neutrino Detector * Stopped π+ beam at Los Alamos LAMPF produces νe, νμ, and ν̄μ but no ν̄e (due to μ− capture). * Look for delayed coincidence of positron

    15. A comparison of baseline aerodynamic performance of optimally-twisted versus non-twisted HAWT blades

      SciTech Connect (OSTI)

      Simms, D.A.; Robinson, M.C.; Hand, M.M.; Fingersh, L.J.

      1995-01-01

      NREL has completed the initial twisted blade field tests of the ``Unsteady Aerodynamics Experiment.`` This test series continues systematic measurements of unsteady aerodynamic phenomena prevalent in stall-controlled horizontal axis wind turbines (HAWTs). The blade twist distribution optimizes power production at a single angle of attack along the span. Abrupt transitions into and out of stall are created due to rapid changes in inflow. Data from earlier experiments have been analyzed extensively to characterize the steady and unsteady response of untwisted blades. In this report, a characterization and comparison of the baseline aerodynamic performance of the twisted versus non-twisted blade sets will be presented for steady flow conditions.
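
      The role of the twist distribution mentioned above follows from the standard geometric relation between inflow angle and local blade pitch. The short sketch below neglects induced velocities and uses made-up operating numbers; it only illustrates why twisting the blade to follow the inflow angle holds the angle of attack near a single design value along the span.

      # Illustrative only: local inflow angle and angle of attack along a HAWT blade span,
      # neglecting induced velocities. A twist schedule theta(r) = phi(r) - alpha_design
      # keeps the angle of attack roughly constant along the span at the design condition.

      import math

      V_wind = 8.0        # m/s, hypothetical steady inflow
      omega = 7.5         # rad/s, hypothetical rotor speed
      alpha_design = 6.0  # deg, target angle of attack

      for r in (2.0, 4.0, 6.0, 8.0, 10.0):                  # radial stations, m
          phi = math.degrees(math.atan2(V_wind, omega * r))  # local inflow angle
          twist = phi - alpha_design                         # twist needed at this station
          alpha = phi - twist                                # resulting angle of attack
          print(f"r={r:4.1f} m  phi={phi:5.1f} deg  twist={twist:5.1f} deg  alpha={alpha:4.1f} deg")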

    16. DOE-EM-STD-5502-94; DOE Limited Standard Hazard Baseline Documentation

      Office of Environmental Management (EM)

      NOT MEASUREMENT SENSITIVE DOE-EM-STD-5502-94 August 1994 DOE LIMITED STANDARD HAZARD BASELINE DOCUMENTATION U.S. Department of Energy AREA SAFT Washington, D.C. 20585 DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. This document has been reproduced directly from the best available copy. Available to DOE and DOE contractors from the Office of Scientific and Technical Information, P.O. Box 62, Oak Ridge, TN 37831; (615) 576-8401. Available to the public from the

    17. Secure computing for the 'Everyman'

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Secure computing for the 'Everyman' Secure computing for the 'Everyman' If implemented on a wide scale, quantum key distribution technology could ensure truly secure commerce, banking, communications and data transfer. September 2, 2014 This small device developed at Los Alamos National Laboratory uses the truly random spin of light particles as defined by laws of quantum mechanics to generate a random number for use in a cryptographic key that can be used to securely transmit information

    18. Computational Sciences and Engineering Division

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      If you have questions or comments regarding any of our research and development activities, how to work with ORNL and the Computational Sciences and Engineering (CSE) Division, or the content of this website please contact one of the following people: If you have questions regarding CSE technologies and capabilities, job opportunities, working with ORNL and the CSE Division, intellectual property, etc., contact, Shaun S. Gleason, Ph.D. Division Director, Computational Sciences and Engineering

    19. Computational Sciences and Engineering Division

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      The Computational Sciences and Engineering Division is a major research division at the Department of Energy's Oak Ridge National Laboratory. CSED develops and applies creative information technology and modeling and simulation research solutions for National Security and National Energy Infrastructure needs. The mission of the Computational Sciences and Engineering Division is to enhance the country's capabilities in achieving important objectives in the areas of national defense, homeland

    20. A Study to Develop an Industrial-Scale, Computer-Controlled High Magnetic Field Processing (HMFP) System to Assist in Commercializing the Novel, Enabling HMFP Manufacturing Technology

      SciTech Connect (OSTI)

      Lutdka, G. M.; Chourey, A.

      2010-05-12

      As the original magnet designer and manufacturer of ORNL's 9T, 5-inch ID bore magnet, American Magnetics Inc. (AMI) has collaborated with ORNL's Materials Processing Group, and this partnership has been instrumental in the development of our unique thermo-magnetic facilities and expertise. Consequently, AMI and ORNL have realized that the commercial implementation of the High Magnetic Field Processing (HMFP) technology will require the evolution of robust, automated superconducting (SC) magnet systems that will be cost-effective and easy to operate in an industrial environment. The goal of this project and CRADA is to significantly expedite the timeline for implementing this revolutionary and pervasive cross-cutting technology for future US-produced industrial components. The successful completion of this project is anticipated to significantly assist in the timely commercialization and licensing of our HMFP intellectual property for a broad spectrum of industries, and to open up a new market for AMI. One notable outcome of this project is that the ThermoMagnetic Processing Technology won a prestigious 2009 R&D 100 Award. This award acknowledges and recognizes our TMP Technology as one of the top 100 innovative US technologies in 2009. By successfully establishing the design requirements for a commercial-scale magnetic processing system, this project effort has accomplished a key first step in facilitating the building and demonstration of a superconducting magnetic processing coil, enabling the transition of the High Magnetic Field Processing Technology beyond a laboratory novelty into a commercially viable and industrially scalable Manufacturing Technology.

    1. Multiprocessor computing for images

      SciTech Connect (OSTI)

      Cantoni, V.; Levialdi, S.

      1988-08-01

      A review of image processing systems developed until now is given, highlighting the weak points of such systems and the trends that have dictated their evolution through the years producing different generations of machines. Each generation may be characterized by the hardware architecture, the programmability features and the relative application areas. The need for multiprocessing hierarchical systems is discussed focusing on pyramidal architectures. Their computational paradigms, their virtual and physical implementation, their programming and software requirements, and capabilities by means of suitable languages, are discussed.
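
      The pyramidal architectures discussed above map naturally onto the image-pyramid data structure, in which each level is a reduced-resolution copy of the level below and could be assigned to a separate layer of processors. The sketch below builds such a pyramid in software by 2x2 averaging; it is a toy illustration, not a description of any particular machine in the survey.

      # Minimal software sketch of the image-pyramid abstraction used by pyramidal
      # multiprocessor architectures: each level halves the resolution, and nodes at
      # one level would operate on cells fed by a 2x2 block of the level below.

      def reduce_level(image):
          """Average non-overlapping 2x2 blocks (image is a list of equal-length rows)."""
          h, w = len(image), len(image[0])
          return [[(image[2*i][2*j] + image[2*i][2*j+1] +
                    image[2*i+1][2*j] + image[2*i+1][2*j+1]) / 4.0
                   for j in range(w // 2)]
                  for i in range(h // 2)]

      def build_pyramid(image):
          levels = [image]
          while len(levels[-1]) > 1 and len(levels[-1][0]) > 1:
              levels.append(reduce_level(levels[-1]))
          return levels

      base = [[float((x + y) % 4) for x in range(8)] for y in range(8)]  # toy 8x8 image
      for k, level in enumerate(build_pyramid(base)):
          print("level", k, "size", len(level), "x", len(level[0]))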

    2. Sandia Energy - Computations

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computations Home Transportation Energy Predictive Simulation of Engines Reacting Flow Applied Math & Software Computations ...

    3. Baseline Fracture Toughness and CGR testing of alloys X-750 and XM-19 (EPRI Phase I)

      SciTech Connect (OSTI)

      J. H. Jackson; S. P. Teysseyre

      2012-02-01

      The Advanced Test Reactor National Scientific User Facility (ATR NSUF) and Electric Power Research Institute (EPRI) formed an agreement to test representative alloys used as reactor structural materials as a pilot program toward establishing guidelines for future ATR NSUF research programs. This report contains results from the portion of this program established as Phase I (of three phases) that entails baseline fracture toughness, stress corrosion cracking (SCC), and tensile testing of selected materials for comparison to similar tests conducted at GE Global Research. The intent of this Phase I research program is to determine baseline properties for the materials of interest prior to irradiation, and to ensure comparability between laboratories using similar testing techniques, prior to applying these techniques to the same materials after having been irradiated at the Advanced Test Reactor (ATR). The materials chosen for this research are the nickel based super alloy X-750, and nitrogen strengthened austenitic stainless steel XM-19. A spare core shroud upper support bracket of alloy X-750 was purchased by EPRI from Southern Co. and a section of XM-19 plate was purchased by EPRI from GE-Hitachi. These materials were sectioned at GE Global Research and provided to INL.

    4. Baseline Fracture Toughness and CGR testing of alloys X-750 and XM-19 (EPRI Phase I)

      SciTech Connect (OSTI)

      J. H. Jackson; S. P. Teysseyre

      2012-10-01

      The Advanced Test Reactor National Scientific User Facility (ATR NSUF) and Electric Power Research Institute (EPRI) formed an agreement to test representative alloys used as reactor structural materials as a pilot program toward establishing guidelines for future ATR NSUF research programs. This report contains results from the portion of this program established as Phase I (of three phases) that entails baseline fracture toughness, stress corrosion cracking (SCC), and tensile testing of selected materials for comparison to similar tests conducted at GE Global Research. The intent of this Phase I research program is to determine baseline properties for the materials of interest prior to irradiation, and to ensure comparability between laboratories using similar testing techniques, prior to applying these techniques to the same materials after having been irradiated at the Advanced Test Reactor (ATR). The materials chosen for this research are the nickel based super alloy X-750, and nitrogen strengthened austenitic stainless steel XM-19. A spare core shroud upper support bracket of alloy X-750 was purchased by EPRI from Southern Co. and a section of XM-19 plate was purchased by EPRI from GE-Hitachi. These materials were sectioned at GE Global Research and provided to INL.

    5. Computing for Finance

      ScienceCinema (OSTI)

      None

      2011-10-06

      The finance sector is one of the driving forces for the use of distributed or Grid computing for business purposes. The speakers will review the state-of-the-art of high performance computing in the financial sector, and provide insight into how different types of Grid computing ? from local clusters to global networks - are being applied to financial applications. They will also describe the use of software and techniques from physics, such as Monte Carlo simulations, in the financial world. There will be four talks of 20min each. The talk abstracts and speaker bios are listed below. This will be followed by a Q&A; panel session with the speakers. From 19:00 onwards there will be a networking cocktail for audience and speakers. This is an EGEE / CERN openlab event organized in collaboration with the regional business network rezonance.ch. A webcast of the event will be made available for subsequent viewing, along with powerpoint material presented by the speakers. Attendance is free and open to all. Registration is mandatory via www.rezonance.ch, including for CERN staff. 1. Overview of High Performance Computing in the Financial Industry Michael Yoo, Managing Director, Head of the Technical Council, UBS Presentation will describe the key business challenges driving the need for HPC solutions, describe the means in which those challenges are being addressed within UBS (such as GRID) as well as the limitations of some of these solutions, and assess some of the newer HPC technologies which may also play a role in the Financial Industry in the future. Speaker Bio: Michael originally joined the former Swiss Bank Corporation in 1994 in New York as a developer on a large data warehouse project. In 1996 he left SBC and took a role with Fidelity Investments in Boston. Unable to stay away for long, he returned to SBC in 1997 while working for Perot Systems in Singapore. Finally, in 1998 he formally returned to UBS in Stamford following the merger with SBC and has remained with UBS for the past 9 years. During his tenure at UBS, he has had a number of leadership roles within IT in development, support and architecture. In 2006 Michael relocated to Switzerland to take up his current role as head of the UBS IB Technical Council, responsible for the overall technology strategy and vision of the Investment Bank. One of Michael's key responsibilities is to manage the UBS High Performance Computing Research Lab and he has been involved in a number of initiatives in the HPC space. 2. Grid in the Commercial WorldFred Gedling, Chief Technology Officer EMEA and Senior Vice President Global Services, DataSynapse Grid computing gets mentions in the press for community programs starting last decade with "Seti@Home". Government, national and supranational initiatives in grid receive some press. One of the IT-industries' best-kept secrets is the use of grid computing by commercial organizations with spectacular results. Grid Computing and its evolution into Application Virtualization is discussed and how this is key to the next generation data center. Speaker Bio: Fred Gedling holds the joint roles of Chief Technology Officer for EMEA and Senior Vice President of Global Services at DataSynapse, a global provider of application virtualisation software. 
Based in London and working closely with organisations seeking to optimise their IT infrastructures, Fred offers unique insights into the technology of virtualisation as well as the methodology of establishing ROI and rapid deployment to the immediate advantage of the business. Fred has more than fifteen years experience of enterprise middleware and high-performance infrastructures. Prior to DataSynapse he worked in high performance CRM middleware and was the CTO EMEA for New Era of Networks (NEON) during the rapid growth of Enterprise Application Integration. His 25-year career in technology also includes management positions at Goldman Sachs and Stratus Computer. Fred holds a First Class Bsc (Hons) degree in Physics with Astrophysics from the University of Leeds and had the privilege of being a summer student at CERN.3. Opportunities for gLite in finance and related industriesAdam Vile, Head of Grid, HPC and Technical Computing, Excelian Ltd.gLite, the Grid software developed by the EGEE project, has been exceedingly successful as an enabling infrastructure, and has been a massive success in bringing together scientific and technical communities to provide the compute power to address previously incomputable problems. Not so in the finance industry. In its current form gLite would be a business disabler. There are other middleware tools that solve the finance communities compute problems much better. Things are moving on, however. There are moves afoot in the open source community to evolve the technology to address other, more sophisticated needs such as utility and interactive computing. In this talk, I will describe how Excelian is providing Grid consultancy services for the finance community and how, through its relationship to the EGEE project, Excelian is helping to identify and exploit opportunities as the research and business worlds converge. Because of the strong third party presence in the finance industry, such opportunities are few and far between, but they are there, especially as we expand sideways into related verticals such as the smaller hedge funds and energy companies. This talk will give an overview of the barriers to adoption of gLite in the finance industry and highlight some of the opportunities offered in this and related industries as the ideas around Grid mature. Speaker Bio: Dr Adam Vile is a senior consultant and head of the Grid and HPC practice at Excelian, a consultancy that focuses on financial markets professional services. He has spent many years in investment banking, as a developer, project manager and architect in both front and back office. Before joining Excelian he was senior Grid and HPC architect at Barclays Capital. Prior to joining investment banking, Adam spent a number of years lecturing in IT and mathematics at a UK University and maintains links with academia through lectures, research and through validation and steering of postgraduate courses. He is a chartered mathematician and was the conference chair of the Institute of Mathematics and its Applications first conference in computational Finance.4. From Monte Carlo to Wall Street Daniel Egloff, Head of Financial Engineering Computing Unit, Zrich Cantonal Bank High performance computing techniques provide new means to solve computationally hard problems in the financial service industry. First I consider Monte Carlo simulation and illustrate how it can be used to implement a sophisticated credit risk management and economic capital framework. 
From a HPC perspective, basic Monte Carlo simulation is embarrassingly parallel and can be implemented efficiently on distributed memory clusters. Additional difficulties arise for adaptive variance reduction schemes, if the information content in a sample is very small, and if the amount of simulated date becomes huge such that incremental processing algorithms are indispensable. We discuss the business value of an advanced credit risk quantification which is particularly compelling in these days. While Monte Carlo simulation is a very versatile tool it is not always the preferred solution for the pricing of complex products like multi asset options, structured products, or credit derivatives. As a second application I show how operator methods can be used to develop a pricing framework. The scalability of operator methods relies heavily on optimized dense matrix-matrix multiplications and requires specialized BLAS level-3 implementations provided by specialized FPGA or GPU boards. Speaker Bio: Daniel Egloff studied mathematics, theoretical physics, and computer science at the University of Zurich and the ETH Zurich. He holds a PhD in Mathematics from University of Fribourg, Switzerland. After his PhD he started to work for a large Swiss insurance company in the area of asset and liability management. He continued his professional career in the consulting industry. At KPMG and Arthur Andersen he consulted international clients and implemented quantitative risk management solutions for financial institutions and insurance companies. In 2002 he joined Zurich Cantonal Bank. He was assigned to develop and implement credit portfolio risk and economic capital methodologies. He built up a competence center for high performance and cluster computing. Currently, Daniel Egloff is heading the Financial Computing unit in the ZKB Financial Engineering division. He and his team is engineering and operating high performance cluster applications for computationally intensive problems in financial risk management.
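
      The abstract's point that basic Monte Carlo simulation is embarrassingly parallel can be made concrete with a small sketch: independent batches of scenarios are generated in separate processes and merged only when a loss quantile is computed. The one-factor default model, the parameters, and the batch sizes below are purely illustrative and are not taken from the credit risk framework described in the talk.

      # Minimal sketch of the "embarrassingly parallel" Monte Carlo pattern: independent
      # batches run in separate processes and are merged at the end. Toy model only.

      import random
      from multiprocessing import Pool

      def batch_losses(args):
          n_scenarios, seed = args
          rng = random.Random(seed)
          losses = []
          for _ in range(n_scenarios):
              systematic = rng.gauss(0.0, 1.0)
              # toy portfolio: 100 identical exposures, default below a fixed threshold
              defaults = sum(
                  1 for _ in range(100)
                  if 0.5 * systematic + (1 - 0.5**2) ** 0.5 * rng.gauss(0.0, 1.0) < -2.0
              )
              losses.append(defaults * 1.0)   # loss given default = 1 unit per exposure
          return losses

      if __name__ == "__main__":
          jobs = [(10_000, seed) for seed in range(8)]       # 8 independent batches
          with Pool() as pool:
              all_losses = [x for batch in pool.map(batch_losses, jobs) for x in batch]
          all_losses.sort()
          var_99 = all_losses[int(0.99 * len(all_losses))]   # empirical 99% loss quantile
          print("scenarios:", len(all_losses), "99% quantile loss:", var_99)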

    6. Computing for Finance

      ScienceCinema (OSTI)

      None

      2011-10-06

      The finance sector is one of the driving forces for the use of distributed or Grid computing for business purposes. The speakers will review the state-of-the-art of high performance computing in the financial sector, and provide insight into how different types of Grid computing ? from local clusters to global networks - are being applied to financial applications. They will also describe the use of software and techniques from physics, such as Monte Carlo simulations, in the financial world. There will be four talks of 20min each. The talk abstracts and speaker bios are listed below. This will be followed by a Q&A; panel session with the speakers. From 19:00 onwards there will be a networking cocktail for audience and speakers. This is an EGEE / CERN openlab event organized in collaboration with the regional business network rezonance.ch. A webcast of the event will be made available for subsequent viewing, along with powerpoint material presented by the speakers. Attendance is free and open to all. Registration is mandatory via www.rezonance.ch, including for CERN staff. 1. Overview of High Performance Computing in the Financial Industry Michael Yoo, Managing Director, Head of the Technical Council, UBS Presentation will describe the key business challenges driving the need for HPC solutions, describe the means in which those challenges are being addressed within UBS (such as GRID) as well as the limitations of some of these solutions, and assess some of the newer HPC technologies which may also play a role in the Financial Industry in the future. Speaker Bio: Michael originally joined the former Swiss Bank Corporation in 1994 in New York as a developer on a large data warehouse project. In 1996 he left SBC and took a role with Fidelity Investments in Boston. Unable to stay away for long, he returned to SBC in 1997 while working for Perot Systems in Singapore. Finally, in 1998 he formally returned to UBS in Stamford following the merger with SBC and has remained with UBS for the past 9 years. During his tenure at UBS, he has had a number of leadership roles within IT in development, support and architecture. In 2006 Michael relocated to Switzerland to take up his current role as head of the UBS IB Technical Council, responsible for the overall technology strategy and vision of the Investment Bank. One of Michael's key responsibilities is to manage the UBS High Performance Computing Research Lab and he has been involved in a number of initiatives in the HPC space. 2. Grid in the Commercial WorldFred Gedling, Chief Technology Officer EMEA and Senior Vice President Global Services, DataSynapse Grid computing gets mentions in the press for community programs starting last decade with "Seti@Home". Government, national and supranational initiatives in grid receive some press. One of the IT-industries' best-kept secrets is the use of grid computing by commercial organizations with spectacular results. Grid Computing and its evolution into Application Virtualization is discussed and how this is key to the next generation data center. Speaker Bio: Fred Gedling holds the joint roles of Chief Technology Officer for EMEA and Senior Vice President of Global Services at DataSynapse, a global provider of application virtualisation software. 
Based in London and working closely with organisations seeking to optimise their IT infrastructures, Fred offers unique insights into the technology of virtualisation as well as the methodology of establishing ROI and rapid deployment to the immediate advantage of the business. Fred has more than fifteen years experience of enterprise middleware and high-performance infrastructures. Prior to DataSynapse he worked in high performance CRM middleware and was the CTO EMEA for New Era of Networks (NEON) during the rapid growth of Enterprise Application Integration. His 25-year career in technology also includes management positions at Goldman Sachs and Stratus Computer. Fred holds a First Class Bsc (Hons) degree in Physics with Astrophysics from the University of Leeds and had the privilege of being a summer student at CERN.3. Opportunities for gLite in finance and related industriesAdam Vile, Head of Grid, HPC and Technical Computing, Excelian Ltd.gLite, the Grid software developed by the EGEE project, has been exceedingly successful as an enabling infrastructure, and has been a massive success in bringing together scientific and technical communities to provide the compute power to address previously incomputable problems. Not so in the finance industry. In its current form gLite would be a business disabler. There are other middleware tools that solve the finance communities compute problems much better. Things are moving on, however. There are moves afoot in the open source community to evolve the technology to address other, more sophisticated needs such as utility and interactive computing. In this talk, I will describe how Excelian is providing Grid consultancy services for the finance community and how, through its relationship to the EGEE project, Excelian is helping to identify and exploit opportunities as the research and business worlds converge. Because of the strong third party presence in the finance industry, such opportunities are few and far between, but they are there, especially as we expand sideways into related verticals such as the smaller hedge funds and energy companies. This talk will give an overview of the barriers to adoption of gLite in the finance industry and highlight some of the opportunities offered in this and related industries as the ideas around Grid mature. Speaker Bio: Dr Adam Vile is a senior consultant and head of the Grid and HPC practice at Excelian, a consultancy that focuses on financial markets professional services. He has spent many years in investment banking, as a developer, project manager and architect in both front and back office. Before joining Excelian he was senior Grid and HPC architect at Barclays Capital. Prior to joining investment banking, Adam spent a number of years lecturing in IT and mathematics at a UK University and maintains links with academia through lectures, research and through validation and steering of postgraduate courses. He is a chartered mathematician and was the conference chair of the Institute of Mathematics and its Applications first conference in computational Finance.4. From Monte Carlo to Wall Street Daniel Egloff, Head of Financial Engineering Computing Unit, Zürich Cantonal Bank High performance computing techniques provide new means to solve computationally hard problems in the financial service industry. First I consider Monte Carlo simulation and illustrate how it can be used to implement a sophisticated credit risk management and economic capital framework. 
From a HPC perspective, basic Monte Carlo simulation is embarrassingly parallel and can be implemented efficiently on distributed memory clusters. Additional difficulties arise for adaptive variance reduction schemes, if the information content in a sample is very small, and if the amount of simulated date becomes huge such that incremental processing algorithms are indispensable. We discuss the business value of an advanced credit risk quantification which is particularly compelling in these days. While Monte Carlo simulation is a very versatile tool it is not always the preferred solution for the pricing of complex products like multi asset options, structured products, or credit derivatives. As a second application I show how operator methods can be used to develop a pricing framework. The scalability of operator methods relies heavily on optimized dense matrix-matrix multiplications and requires specialized BLAS level-3 implementations provided by specialized FPGA or GPU boards. Speaker Bio: Daniel Egloff studied mathematics, theoretical physics, and computer science at the University of Zurich and the ETH Zurich. He holds a PhD in Mathematics from University of Fribourg, Switzerland. After his PhD he started to work for a large Swiss insurance company in the area of asset and liability management. He continued his professional career in the consulting industry. At KPMG and Arthur Andersen he consulted international clients and implemented quantitative risk management solutions for financial institutions and insurance companies. In 2002 he joined Zurich Cantonal Bank. He was assigned to develop and implement credit portfolio risk and economic capital methodologies. He built up a competence center for high performance and cluster computing. Currently, Daniel Egloff is heading the Financial Computing unit in the ZKB Financial Engineering division. He and his team is engineering and operating high performance cluster applications for computationally intensive problems in financial risk management.

    7. Computing for Finance

      SciTech Connect (OSTI)

      2010-03-24

      The finance sector is one of the driving forces for the use of distributed or Grid computing for business purposes. The speakers will review the state-of-the-art of high performance computing in the financial sector, and provide insight into how different types of Grid computing – from local clusters to global networks - are being applied to financial applications. They will also describe the use of software and techniques from physics, such as Monte Carlo simulations, in the financial world. There will be four talks of 20min each. The talk abstracts and speaker bios are listed below. This will be followed by a Q&A; panel session with the speakers. From 19:00 onwards there will be a networking cocktail for audience and speakers. This is an EGEE / CERN openlab event organized in collaboration with the regional business network rezonance.ch. A webcast of the event will be made available for subsequent viewing, along with powerpoint material presented by the speakers. Attendance is free and open to all. Registration is mandatory via www.rezonance.ch, including for CERN staff. 1. Overview of High Performance Computing in the Financial Industry Michael Yoo, Managing Director, Head of the Technical Council, UBS Presentation will describe the key business challenges driving the need for HPC solutions, describe the means in which those challenges are being addressed within UBS (such as GRID) as well as the limitations of some of these solutions, and assess some of the newer HPC technologies which may also play a role in the Financial Industry in the future. Speaker Bio: Michael originally joined the former Swiss Bank Corporation in 1994 in New York as a developer on a large data warehouse project. In 1996 he left SBC and took a role with Fidelity Investments in Boston. Unable to stay away for long, he returned to SBC in 1997 while working for Perot Systems in Singapore. Finally, in 1998 he formally returned to UBS in Stamford following the merger with SBC and has remained with UBS for the past 9 years. During his tenure at UBS, he has had a number of leadership roles within IT in development, support and architecture. In 2006 Michael relocated to Switzerland to take up his current role as head of the UBS IB Technical Council, responsible for the overall technology strategy and vision of the Investment Bank. One of Michael's key responsibilities is to manage the UBS High Performance Computing Research Lab and he has been involved in a number of initiatives in the HPC space. 2. Grid in the Commercial WorldFred Gedling, Chief Technology Officer EMEA and Senior Vice President Global Services, DataSynapse Grid computing gets mentions in the press for community programs starting last decade with "Seti@Home". Government, national and supranational initiatives in grid receive some press. One of the IT-industries' best-kept secrets is the use of grid computing by commercial organizations with spectacular results. Grid Computing and its evolution into Application Virtualization is discussed and how this is key to the next generation data center. Speaker Bio: Fred Gedling holds the joint roles of Chief Technology Officer for EMEA and Senior Vice President of Global Services at DataSynapse, a global provider of application virtualisation software. 
Based in London and working closely with organisations seeking to optimise their IT infrastructures, Fred offers unique insights into the technology of virtualisation as well as the methodology of establishing ROI and rapid deployment to the immediate advantage of the business. Fred has more than fifteen years experience of enterprise middleware and high-performance infrastructures. Prior to DataSynapse he worked in high performance CRM middleware and was the CTO EMEA for New Era of Networks (NEON) during the rapid growth of Enterprise Application Integration. His 25-year career in technology also includes management positions at Goldman Sachs and Stratus Computer. Fred holds a First Class Bsc (Hons) degree in Physics with Astrophysics from the University of Leeds and had the privilege of being a summer student at CERN.3. Opportunities for gLite in finance and related industriesAdam Vile, Head of Grid, HPC and Technical Computing, Excelian Ltd.gLite, the Grid software developed by the EGEE project, has been exceedingly successful as an enabling infrastructure, and has been a massive success in bringing together scientific and technical communities to provide the compute power to address previously incomputable problems. Not so in the finance industry. In its current form gLite would be a business disabler. There are other middleware tools that solve the finance communities compute problems much better. Things are moving on, however. There are moves afoot in the open source community to evolve the technology to address other, more sophisticated needs such as utility and interactive computing. In this talk, I will describe how Excelian is providing Grid consultancy services for the finance community and how, through its relationship to the EGEE project, Excelian is helping to identify and exploit opportunities as the research and business worlds converge. Because of the strong third party presence in the finance industry, such opportunities are few and far between, but they are there, especially as we expand sideways into related verticals such as the smaller hedge funds and energy companies. This talk will give an overview of the barriers to adoption of gLite in the finance industry and highlight some of the opportunities offered in this and related industries as the ideas around Grid mature. Speaker Bio: Dr Adam Vile is a senior consultant and head of the Grid and HPC practice at Excelian, a consultancy that focuses on financial markets professional services. He has spent many years in investment banking, as a developer, project manager and architect in both front and back office. Before joining Excelian he was senior Grid and HPC architect at Barclays Capital. Prior to joining investment banking, Adam spent a number of years lecturing in IT and mathematics at a UK University and maintains links with academia through lectures, research and through validation and steering of postgraduate courses. He is a chartered mathematician and was the conference chair of the Institute of Mathematics and its Applications first conference in computational Finance.4. From Monte Carlo to Wall Street Daniel Egloff, Head of Financial Engineering Computing Unit, Zürich Cantonal Bank High performance computing techniques provide new means to solve computationally hard problems in the financial service industry. First I consider Monte Carlo simulation and illustrate how it can be used to implement a sophisticated credit risk management and economic capital framework. 
From a HPC perspective, basic Monte Carlo simulation is embarrassingly parallel and can be implemented efficiently on distributed memory clusters. Additional difficulties arise for adaptive variance reduction schemes, if the information content in a sample is very small, and if the amount of simulated date becomes huge such that incremental processing algorithms are indispensable. We discuss the business value of an advanced credit risk quantification which is particularly compelling in these days. While Monte Carlo simulation is a very versatile tool it is not always the preferred solution for the pricing of complex products like multi asset options, structured products, or credit derivatives. As a second application I show how operator methods can be used to develop a pricing framework. The scalability of operator methods relies heavily on optimized dense matrix-matrix multiplications and requires specialized BLAS level-3 implementations provided by specialized FPGA or GPU boards. Speaker Bio: Daniel Egloff studied mathematics, theoretical physics, and computer science at the University of Zurich and the ETH Zurich. He holds a PhD in Mathematics from University of Fribourg, Switzerland. After his PhD he started to work for a large Swiss insurance company in the area of asset and liability management. He continued his professional career in the consulting industry. At KPMG and Arthur Andersen he consulted international clients and implemented quantitative risk management solutions for financial institutions and insurance companies. In 2002 he joined Zurich Cantonal Bank. He was assigned to develop and implement credit portfolio risk and economic capital methodologies. He built up a competence center for high performance and cluster computing. Currently, Daniel Egloff is heading the Financial Computing unit in the ZKB Financial Engineering division. He and his team is engineering and operating high performance cluster applications for computationally intensive problems in financial risk management.

    8. Computer hardware fault administration

      DOE Patents [OSTI]

      Archer, Charles J. (Rochester, MN); Megerian, Mark G. (Rochester, MN); Ratterman, Joseph D. (Rochester, MN); Smith, Brian E. (Rochester, MN)

      2010-09-14

      Computer hardware fault administration carried out in a parallel computer, where the parallel computer includes a plurality of compute nodes. The compute nodes are coupled for data communications by at least two independent data communications networks, where each data communications network includes data communications links connected to the compute nodes. Typical embodiments carry out hardware fault administration by identifying a location of a defective link in the first data communications network of the parallel computer and routing communications data around the defective link through the second data communications network of the parallel computer.
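
      A minimal sketch of the fault-administration idea in this abstract: when a link on the primary network is identified as defective, traffic that would have crossed it is routed over the independent secondary network instead. The four-node topology and the breadth-first router below are invented for illustration and are not the patented implementation.

      # Illustrative sketch only: two independent networks connect the same compute nodes;
      # traffic blocked by a defective link on the primary network uses the secondary one.

      from collections import deque

      def shortest_path(links, defective, src, dst):
          """Breadth-first search over 'links', skipping any link marked defective."""
          frontier, seen = deque([[src]]), {src}
          while frontier:
              path = frontier.popleft()
              if path[-1] == dst:
                  return path
              for nxt in links.get(path[-1], ()):
                  edge = frozenset((path[-1], nxt))
                  if nxt not in seen and edge not in defective:
                      seen.add(nxt)
                      frontier.append(path + [nxt])
          return None

      primary   = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}   # a line of four nodes
      secondary = {0: [2], 2: [0, 1, 3], 1: [2], 3: [2]}   # independent second network
      defective = {frozenset((1, 2))}                      # fault identified on primary

      route = shortest_path(primary, defective, 0, 3) or shortest_path(secondary, set(), 0, 3)
      print("route used:", route)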

    9. Molecular Science Computing | EMSL

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      computational and state-of-the-art experimental tools, providing a cross-disciplinary environment to further research. Additional Information Computing user policies Partners...

    10. Applied & Computational Math

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Applied & Computational Math - Sandia Energy ...

    11. advanced simulation and computing

      National Nuclear Security Administration (NNSA)

      Each successive generation of computing system has provided greater computing power and energy efficiency.

      CTS-1 clusters will support NNSA's Life Extension Program and...

    12. NERSC Computer Security

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Security NERSC Computer Security NERSC computer security efforts are aimed at protecting NERSC systems and its users' intellectual property from unauthorized access or...

    13. Climate Modeling using High-Performance Computing

      SciTech Connect (OSTI)

      Mirin, A A

      2007-02-05

      The Center for Applied Scientific Computing (CASC) and the LLNL Climate and Carbon Science Group of Energy and Environment (E and E) are working together to improve predictions of future climate by applying the best available computational methods and computer resources to this problem. Over the last decade, researchers at the Lawrence Livermore National Laboratory (LLNL) have developed a number of climate models that provide state-of-the-art simulations on a wide variety of massively parallel computers. We are now developing and applying a second generation of high-performance climate models. Through the addition of relevant physical processes, we are developing an earth systems modeling capability as well.

    14. ORISE: Web Development

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Web Development As computer-based applications become increasingly popular for the delivery of health care training and information, the need for Web development in support of ...

    15. High energy neutron Computed Tomography developed

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      include James Hunter of the Lab's Non-Destructive Testing and Evaluation Group, Ron Nelson of LANL's LANSCE Nuclear Science and Jim Hall of Lawrence Livermore National...

    16. High energy neutron Computed Tomography developed

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      objects. May 9, 2014 Neutron tomography horizontal "slice" of a tungsten and polyethylene test object containing tungsten carbide BBs. Neutron tomography horizontal "slice"...

    17. Cosmic Reionization On Computers | Argonne Leadership Computing...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      its Cosmic Reionization On Computers (CROC) project, using the Adaptive Refinement Tree (ART) code as its main simulation tool. An important objective of this research is to make...

    18. Etalon-induced baseline drift and correction in atom flux sensors based on atomic absorption spectroscopy

      SciTech Connect (OSTI)

      Du, Yingge; Chambers, Scott A.

      2014-10-20

      Atom flux sensors based on atomic absorption (AA) spectroscopy are of significant interest in thin film growth as they can provide unobtrusive, element specific real-time flux sensing and control. The ultimate sensitivity and performance of these sensors are strongly affected by baseline drift. Here we demonstrate that an etalon effect resulting from temperature changes in optical viewport housings is a major source of signal instability, which has not been previously considered, and cannot be corrected using existing methods. We show that small temperature variations in the fused silica viewports can introduce intensity modulations of up to 1.5% which in turn significantly deteriorate AA sensor performance. This undesirable effect can be at least partially eliminated by reducing the size of the beam and tilting the incident light beam off the viewport normal.
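
      The viewport etalon effect described above follows the textbook low-finesse Fabry-Perot relations; the expressions below are a hedged summary in generic symbols (window thickness L, refractive index n, internal angle θ, surface reflectivity R, thermal expansion coefficient α) rather than formulas quoted from the paper.

      % Standard low-finesse etalon relations (generic symbols, not quoted from the paper):
      \[
        I_t(\lambda, T) \;\approx\; I_0\left[\,1 + 2R\cos\delta(T)\,\right],
        \qquad
        \delta(T) = \frac{4\pi\, n(T)\, L(T)\cos\theta}{\lambda},
      \]
      \[
        \frac{d\delta}{dT} = \frac{4\pi L\cos\theta}{\lambda}
        \left(\frac{dn}{dT} + n\,\alpha\right),
        \qquad \alpha = \frac{1}{L}\frac{dL}{dT}.
      \]
      % A small viewport temperature change shifts the fringe phase delta and modulates the
      % transmitted intensity, which the AA sensor reads as baseline drift; tilting the beam
      % off the viewport normal and shrinking the beam reduce the coherent overlap of the
      % reflections and suppress the 2R cos(delta) term.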

    19. Etalon-induced Baseline Drift And Correction In Atom Flux Sensors Based On Atomic Absorption Spectroscopy

      SciTech Connect (OSTI)

      Du, Yingge; Chambers, Scott A.

      2014-10-20

      Atom flux sensors based on atomic absorption (AA) spectroscopy are of significant interest in thin film growth as they can provide unobtrusive, element-specific, real-time flux sensing and control. The ultimate sensitivity and performance of the sensors are strongly affected by long-term and short-term baseline drift. Here we demonstrate that an etalon effect resulting from temperature changes in optical viewport housings is a major source of signal instability which has not been previously considered or corrected by existing methods. We show that small temperature variations in the fused silica viewports can introduce intensity modulations of up to 1.5%, which in turn significantly deteriorate AA sensor performance. This undesirable effect can be at least partially eliminated by reducing the size of the beam and tilting the incident light beam off the viewport normal.

    20. Dual baseline search for muon antineutrino disappearance at 0.1 eV²

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Cheng, G.; Huelsnitz, W.; Aguilar-Arevalo, A. A.; Alcaraz-Aunion, J. L.; Brice, S. J.; Brown, B. C.; Bugel, L.; Catala-Perez, J.; Church, E. D.; Conrad, J. M.; et al

      2012-09-25

      The MiniBooNE and SciBooNE collaborations report the results of a joint search for short baseline disappearance of ν̄μ at Fermilab’s Booster Neutrino Beamline. The MiniBooNE Cherenkov detector and the SciBooNE tracking detector observe antineutrinos from the same beam, therefore the combined analysis of their data sets serves to partially constrain some of the flux and cross section uncertainties. Uncertainties in the νμ background were constrained by neutrino flux and cross section measurements performed in both detectors. A likelihood ratio method was used to set a 90% confidence level upper limit on ν̄μ disappearance that dramatically improves upon prior limits in the Δm²=0.1–100 eV² region.
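      For context, short-baseline disappearance results of this kind are conventionally interpreted with the two-flavor survival probability; the expression below is the generic textbook form, not a formula quoted from the paper:

        P(\bar{\nu}_\mu \to \bar{\nu}_\mu) \;=\; 1 - \sin^2(2\theta)\,\sin^2\!\left(\frac{1.27\,\Delta m^2\,[\mathrm{eV}^2]\; L\,[\mathrm{km}]}{E_\nu\,[\mathrm{GeV}]}\right)

      Because the two detectors sit at different distances L along the same beam, they sample different L/E while sharing flux and cross-section systematics, which is what makes the combined dual-baseline fit more constraining than either detector alone.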

    1. Dixie Valley Engineered Geothermal System Exploration Methodology Project, Baseline Conceptual Model Report

      DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

      Iovenitti, Joe

      2014-01-02

      FSR Part I presents (1) an assessment of the readily available public domain data and some proprietary data provided by Terra-Gen Power, LLC, (2) a re-interpretation of these data as required, (3) an exploratory geostatistical data analysis, (4) the baseline geothermal conceptual model, and (5) the EGS favorability/trust mapping. The conceptual model presented applies to both the hydrothermal system and EGS in the Dixie Valley region. FSR Part II presents (1) 278 new gravity stations; (2) enhanced gravity-magnetic modeling; (3) 42 new ambient seismic noise survey stations; (4) an integration of the new seismic noise data with a regional seismic network; (5) a new methodology and approach to interpret these data; (6) a novel method to predict rock type and temperature based on the newly interpreted data; (7) 70 new magnetotelluric (MT) stations; (8) an integrated interpretation of the enhanced MT data set; (9) the results of a 308 station soil CO2 gas survey; (10) new conductive thermal modeling in the project area; (11) new convective modeling in the Calibration Area; (12) pseudo-convective modeling in the Calibration Area; (13) enhanced data implications and qualitative geoscience correlations at three scales (a) Regional, (b) Project, and (c) Calibration Area; (14) quantitative geostatistical exploratory data analysis; and (15) responses to nine questions posed in the proposal for this investigation. Enhanced favorability/trust maps were not generated because there was not a sufficient amount of new, fully-vetted (see below) rock type, temperature, and stress data. The enhanced seismic data did generate a new method to infer rock type and temperature. However, in the opinion of the Principal Investigator for this project, this new methodology needs to be tested and evaluated at other sites in the Basin and Range before it is used to generate the referenced maps. As in the baseline conceptual model, the enhanced findings can be applied to both the hydrothermal system and EGS in the Dixie Valley region.

    2. Dixie Valley Engineered Geothermal System Exploration Methodology Project, Baseline Conceptual Model Report

      DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

      Iovenitti, Joe

      FSR Part I presents (1) an assessment of the readily available public domain data and some proprietary data provided by Terra-Gen Power, LLC, (2) a re-interpretation of these data as required, (3) an exploratory geostatistical data analysis, (4) the baseline geothermal conceptual model, and (5) the EGS favorability/trust mapping. The conceptual model presented applies to both the hydrothermal system and EGS in the Dixie Valley region. FSR Part II presents (1) 278 new gravity stations; (2) enhanced gravity-magnetic modeling; (3) 42 new ambient seismic noise survey stations; (4) an integration of the new seismic noise data with a regional seismic network; (5) a new methodology and approach to interpret these data; (6) a novel method to predict rock type and temperature based on the newly interpreted data; (7) 70 new magnetotelluric (MT) stations; (8) an integrated interpretation of the enhanced MT data set; (9) the results of a 308 station soil CO2 gas survey; (10) new conductive thermal modeling in the project area; (11) new convective modeling in the Calibration Area; (12) pseudo-convective modeling in the Calibration Area; (13) enhanced data implications and qualitative geoscience correlations at three scales (a) Regional, (b) Project, and (c) Calibration Area; (14) quantitative geostatistical exploratory data analysis; and (15) responses to nine questions posed in the proposal for this investigation. Enhanced favorability/trust maps were not generated because there was not a sufficient amount of new, fully-vetted (see below) rock type, temperature, and stress data. The enhanced seismic data did generate a new method to infer rock type and temperature. However, in the opinion of the Principal Investigator for this project, this new methodology needs to be tested and evaluated at other sites in the Basin and Range before it is used to generate the referenced maps. As in the baseline conceptual model, the enhanced findings can be applied to both the hydrothermal system and EGS in the Dixie Valley region.

    3. Idaho National Engineering Laboratory (INEL) Environmental Restoration (ER) Program Baseline Safety Analysis File (BSAF)

      SciTech Connect (OSTI)

      1995-09-01

      The Baseline Safety Analysis File (BSAF) is a facility safety reference document for Idaho National Engineering Laboratory (INEL) environmental restoration activities. The BSAF contains information and guidance for the safety analysis documentation required by the U.S. Department of Energy (DOE) for environmental restoration (ER) activities, including characterization of potentially contaminated sites; remedial investigations to identify, and remedial actions to clean up, existing and potential releases from inactive waste sites; and decontamination and dismantlement of surplus facilities. The information is INEL-specific and is in the format required by DOE-EM-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports. An author of safety analysis documentation need only write information concerning that activity and refer to the BSAF for further information, or copy applicable chapters and sections. The information and guidance provided are suitable for: nuclear facilities (DOE Order 5480.23, Nuclear Safety Analysis Reports) with hazards that meet the Category 3 threshold (DOE-STD-1027-92, Hazard Categorization and Accident Analysis Techniques for Compliance with DOE Order 5480.23, Nuclear Safety Analysis Reports); radiological facilities (DOE-EM-STD-5502-94, Hazard Baseline Documentation); and nonnuclear facilities (DOE-EM-STD-5502-94) that are classified as "low" hazard facilities (DOE Order 5481.1B, Safety Analysis and Review System). Additionally, the BSAF could be used as an information source for Health and Safety Plans and for Safety Analysis Reports (SARs) for nuclear facilities with hazards equal to or greater than the Category 2 thresholds, or for nonnuclear facilities with "moderate" or "high" hazard classifications.

    4. Baseline for Climate Change: Modeling Watershed Aquatic Biodiversity Relative to Environmental and Anthropogenic Factors

      SciTech Connect (OSTI)

      Maurakis, Eugene G

      2010-10-01

      Objectives of the two-year study were to (1) establish baselines for fish and macroinvertebrate community structures in two mid-Atlantic lower Piedmont watersheds (Quantico Creek, a pristine forest watershed, and Cameron Run, an urban watershed, in Virginia) that can be used to monitor changes relative to the impacts of climate change in the future; (2) create mathematical expressions to model fish species richness and diversity, and macroinvertebrate taxa and macroinvertebrate functional feeding group taxa richness and diversity, that can serve as a baseline for future comparisons in these and other watersheds in the mid-Atlantic region; and (3) heighten people's awareness, knowledge, and understanding of climate change and its impacts on watersheds through a laboratory experience and interactive exhibits, internship opportunities for undergraduate and graduate students, a week-long teacher workshop, and a website about climate change and watersheds. The mathematical expressions modeled fish and macroinvertebrate richness and diversity well during most of the six thermal seasons in which sample sizes were robust. Additionally, hydrologic models provide the basis for estimating flows under varying meteorological conditions and landscape changes. Continuations of long-term studies are requisite for accurately distinguishing local human influences (e.g., urbanization and watershed alteration) from global anthropogenic impacts (e.g., climate change) on watersheds. Effective and skillful translations of the results of scientific investigations (e.g., the annual potential exposure of 750,000 people to our inquiry-based laboratory activities and interactive exhibits in Virginia) are valuable ways of communicating information to the general public and enhancing their understanding of climate change and its effects on watersheds.
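      As a hypothetical illustration of the kind of community-structure metric such baselines rely on, the sketch below computes species richness and the Shannon diversity index from raw count data; the function name and sample counts are invented for the example and are not from the study.

        import math

        def richness_and_shannon(counts):
            """Return (species richness, Shannon diversity H') for a list of per-taxon counts."""
            counts = [c for c in counts if c > 0]
            total = sum(counts)
            richness = len(counts)
            shannon = -sum((c / total) * math.log(c / total) for c in counts)
            return richness, shannon

        # Hypothetical fish counts from one seasonal sample
        sample = [42, 17, 8, 3, 1]
        print(richness_and_shannon(sample))   # -> (5, ~1.09)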

    5. Reference manual for toxicity and exposure assessment and risk characterization. CERCLA Baseline Risk Assessment

      SciTech Connect (OSTI)

      1995-03-01

      The Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA, 1980) (CERCLA or Superfund) was enacted to provide a program for identifying and responding to releases of hazardous substances into the environment. The Superfund Amendments and Reauthorization Act (SARA, 1986) was enacted to strengthen CERCLA by requiring that site clean-ups be permanent, and that they use treatments that significantly reduce the volume, toxicity, or mobility of hazardous pollutants. The National Oil and Hazardous Substances Pollution Contingency Plan (NCP) (USEPA, 1985; USEPA, 1990) implements the CERCLA statute, presenting a process for (1) identifying and prioritizing sites requiring remediation and (2) assessing the extent of remedial action required at each site. The process includes performing two studies: a Remedial Investigation (RI) to evaluate the nature, extent, and expected consequences of site contamination, and a Feasibility Study (FS) to select an appropriate remedial alternative adequate to reduce such risks to acceptable levels. An integral part of the RI is the evaluation of human health risks posed by hazardous substance releases. This risk evaluation serves a number of purposes within the overall context of the RI/FS process, the most essential of which is to provide an understanding of "baseline" risks posed by a given site. Baseline risks are those risks that would exist if no remediation or institutional controls are applied at a site. This document was written to (1) guide risk assessors through the process of interpreting EPA BRA policy and (2) help risk assessors to discuss EPA policy with regulators, decision makers, and stakeholders as it relates to conditions at a particular DOE site.

    6. Computers-BSA.ppt

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computers! Boy Scout Troop 405! What is a computer?! Is this a computer?! Charles Babbage: Father of the Computer! 1830s: Designed mechanical calculators to reduce human error. *Input device *Memory to store instructions and results *A processor *Output device! Vacuum Tube! Edison (1883) & Lee de Forest (1906) discovered that "vacuum tubes" could serve as electrical switches and amplifiers. A switch can be ON (1) or OFF (0). Electronic computers use Boolean logic (George Boole, 1850).

    7. Computational Fluid Dynamics

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computational fluid dynamics (CFD) research uses mathematical and computational models of flowing fluids to describe and predict fluid response in problems of interest, such as the flow of air around a moving vehicle or the flow of water and sediment in a river. Coupled with appropriate and prototypical...

    8. Theory & Computation > Research > The Energy Materials Center...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Theory & Computation In This Section Computation & Simulation Theory & Computation Computation & Simulation...

    9. Fracture Analysis of Vessels. Oak Ridge FAVOR, v06.1, Computer Code: Theory and Implementation of Algorithms, Methods, and Correlations

      SciTech Connect (OSTI)

      Williams, P. T.; Dickson, T. L.; Yin, S.

      2007-12-01

      The current regulations to insure that nuclear reactor pressure vessels (RPVs) maintain their structural integrity when subjected to transients such as pressurized thermal shock (PTS) events were derived from computational models developed in the early-to-mid 1980s. Since that time, advancements and refinements in relevant technologies that impact RPV integrity assessment have led to an effort by the NRC to re-evaluate its PTS regulations. Updated computational methodologies have been developed through interactions between experts in the relevant disciplines of thermal hydraulics, probabilistic risk assessment, materials embrittlement, fracture mechanics, and inspection (flaw characterization). Contributors to the development of these methodologies include the NRC staff, their contractors, and representatives from the nuclear industry. These updated methodologies have been integrated into the Fracture Analysis of Vessels -- Oak Ridge (FAVOR, v06.1) computer code developed for the NRC by the Heavy Section Steel Technology (HSST) program at Oak Ridge National Laboratory (ORNL). The FAVOR, v04.1, code represents the baseline NRC-selected applications tool for re-assessing the current PTS regulations. This report is intended to document the technical bases for the assumptions, algorithms, methods, and correlations employed in the development of the FAVOR, v06.1, code.

    10. Argonne's Laboratory computing center - 2007 annual report.

      SciTech Connect (OSTI)

      Bair, R.; Pieper, G. W.

      2008-05-28

      Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC) in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. In September 2002 the LCRC deployed a 350-node computing cluster from Linux NetworX to address Laboratory needs for mid-range supercomputing. This cluster, named 'Jazz', achieved over a teraflop of computing power (10¹² floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the 50 fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2007, there were over 60 active projects representing a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to foster growth in the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to offer more scientific data management capabilities, expanding Argonne staff use of national computing facilities, and improving the scientific reach and performance of Argonne's computational applications. Furthermore, recognizing that Jazz is fully subscribed, with considerable unmet demand, the LCRC has framed a 'path forward' for additional computing resources.

    11. Radiological Worker Computer Based Training

      Energy Science and Technology Software Center (OSTI)

      2003-02-06

      Argonne National Laboratory has developed an interactive computer based training (CBT) version of the standardized DOE Radiological Worker training program. This CD-ROM based program utilizes graphics, animation, photographs, sound and video to train users in ten topical areas: radiological fundamentals, biological effects, dose limits, ALARA, personnel monitoring, controls and postings, emergency response, contamination controls, high radiation areas, and lessons learned.

    12. Polymorphous computing fabric

      DOE Patents [OSTI]

      Wolinski, Christophe Czeslaw (Los Alamos, NM); Gokhale, Maya B. (Los Alamos, NM); McCabe, Kevin Peter (Los Alamos, NM)

      2011-01-18

      Fabric-based computing systems and methods are disclosed. A fabric-based computing system can include a polymorphous computing fabric that can be customized on a per application basis and a host processor in communication with said polymorphous computing fabric. The polymorphous computing fabric includes a cellular architecture that can be highly parameterized to enable a customized synthesis of fabric instances for a variety of enhanced application performances thereof. A global memory concept can also be included that provides the host processor random access to all variables and instructions associated with the polymorphous computing fabric.

    13. Computing for Finance

      ScienceCinema (OSTI)

      None

      2011-10-06

      The finance sector is one of the driving forces for the use of distributed or Grid computing for business purposes. The speakers will review the state-of-the-art of high performance computing in the financial sector, and provide insight into how different types of Grid computing, from local clusters to global networks, are being applied to financial applications. They will also describe the use of software and techniques from physics, such as Monte Carlo simulations, in the financial world. There will be four talks of 20 minutes each. The talk abstracts and speaker bios are listed below. This will be followed by a Q&A panel session with the speakers. From 19:00 onwards there will be a networking cocktail for audience and speakers. This is an EGEE / CERN openlab event organized in collaboration with the regional business network rezonance.ch. A webcast of the event will be made available for subsequent viewing, along with powerpoint material presented by the speakers. Attendance is free and open to all. Registration is mandatory via www.rezonance.ch, including for CERN staff. 1. Overview of High Performance Computing in the Financial Industry (Michael Yoo, Managing Director, Head of the Technical Council, UBS). The presentation will describe the key business challenges driving the need for HPC solutions, describe the means by which those challenges are being addressed within UBS (such as GRID) as well as the limitations of some of these solutions, and assess some of the newer HPC technologies which may also play a role in the Financial Industry in the future. Speaker Bio: Michael originally joined the former Swiss Bank Corporation in 1994 in New York as a developer on a large data warehouse project. In 1996 he left SBC and took a role with Fidelity Investments in Boston. Unable to stay away for long, he returned to SBC in 1997 while working for Perot Systems in Singapore. Finally, in 1998 he formally returned to UBS in Stamford following the merger with SBC and has remained with UBS for the past 9 years. During his tenure at UBS, he has had a number of leadership roles within IT in development, support and architecture. In 2006 Michael relocated to Switzerland to take up his current role as head of the UBS IB Technical Council, responsible for the overall technology strategy and vision of the Investment Bank. One of Michael's key responsibilities is to manage the UBS High Performance Computing Research Lab and he has been involved in a number of initiatives in the HPC space. 2. Grid in the Commercial World (Fred Gedling, Chief Technology Officer EMEA and Senior Vice President Global Services, DataSynapse). Grid computing gets mentions in the press for community programs starting last decade with "Seti@Home". Government, national and supranational initiatives in grid receive some press. One of the IT industry's best-kept secrets is the use of grid computing by commercial organizations with spectacular results. Grid computing and its evolution into application virtualization are discussed, along with how this is key to the next-generation data center. Speaker Bio: Fred Gedling holds the joint roles of Chief Technology Officer for EMEA and Senior Vice President of Global Services at DataSynapse, a global provider of application virtualisation software.
Based in London and working closely with organisations seeking to optimise their IT infrastructures, Fred offers unique insights into the technology of virtualisation as well as the methodology of establishing ROI and rapid deployment to the immediate advantage of the business. Fred has more than fifteen years' experience of enterprise middleware and high-performance infrastructures. Prior to DataSynapse he worked in high performance CRM middleware and was the CTO EMEA for New Era of Networks (NEON) during the rapid growth of Enterprise Application Integration. His 25-year career in technology also includes management positions at Goldman Sachs and Stratus Computer. Fred holds a First Class BSc (Hons) degree in Physics with Astrophysics from the University of Leeds and had the privilege o
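      The talks mention Monte Carlo techniques borrowed from physics. As a generic illustration only (not code from any speaker or firm), the sketch below prices a European call option by Monte Carlo under geometric Brownian motion; all parameter values are invented.

        import math
        import random

        def mc_european_call(s0, strike, rate, vol, maturity, n_paths=100_000, seed=1):
            """Monte Carlo price of a European call under geometric Brownian motion."""
            rng = random.Random(seed)
            drift = (rate - 0.5 * vol ** 2) * maturity
            diffusion = vol * math.sqrt(maturity)
            payoff_sum = 0.0
            for _ in range(n_paths):
                s_t = s0 * math.exp(drift + diffusion * rng.gauss(0.0, 1.0))
                payoff_sum += max(s_t - strike, 0.0)
            return math.exp(-rate * maturity) * payoff_sum / n_paths

        # Hypothetical parameters: spot 100, strike 105, 2% rate, 20% volatility, 1 year
        print(round(mc_european_call(100.0, 105.0, 0.02, 0.20, 1.0), 2))

      Because every path is independent, this is the kind of embarrassingly parallel workload that maps naturally onto the cluster and Grid infrastructures discussed in the talks.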

    14. Fermilab | Science at Fermilab | Computing | High-performance Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Lattice QCD Farm at the Grid Computing Center at Fermilab. High-performance Computing: A workstation computer can perform billions of multiplication and addition operations each second. High-performance parallel computing becomes necessary when computations become too large or too long to complete on a single such machine. In parallel computing, computations are divided up so that many computers can work on the same problem at...
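      As a minimal, generic sketch of the divide-the-work idea described in the snippet (not Fermilab code), the example below splits a large summation across worker processes with Python's standard multiprocessing module.

        from multiprocessing import Pool

        def partial_sum(bounds):
            """Sum of squares over a half-open integer range [lo, hi)."""
            lo, hi = bounds
            return sum(i * i for i in range(lo, hi))

        if __name__ == "__main__":
            n, workers = 10_000_000, 4
            step = n // workers
            chunks = [(i * step, n if i == workers - 1 else (i + 1) * step) for i in range(workers)]
            with Pool(workers) as pool:
                total = sum(pool.map(partial_sum, chunks))   # each chunk runs in its own process
            print(total)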

    15. Vehicle Technologies Office Merit Review 2014: Computational design and development of a new, lightweight cast alloy for advanced cylinder heads in high-efficiency, light-duty engines FOA 648-3a

      Broader source: Energy.gov [DOE]

      Presentation given by General Motors at 2014 DOE Hydrogen and Fuel Cells Program and Vehicle Technologies Office Annual Merit Review and Peer Evaluation Meeting about computational design and...

    16. Computers in Commercial Buildings

      U.S. Energy Information Administration (EIA) Indexed Site

      Government-owned buildings of all types had, on average, more than one computer per employee (1,104 computers per thousand employees). They also had a fairly high ratio of...

    17. Computers for Learning

      Broader source: Energy.gov [DOE]

      Through Executive Order 12999, the Computers for Learning Program was established to provide Federal agencies a quick and easy system for donating excess and surplus computer equipment to schools...

    18. Getting Computer Accounts

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computer Accounts When you first arrive at the lab, you will be presented with lots of forms that must be read and signed in order to get an ID and computer access. You must ensure...

    19. The 1993 baseline biological studies and proposed monitoring plan for the Device Assembly Facility at the Nevada Test Site

      SciTech Connect (OSTI)

      Woodward, B.D.; Hunter, R.B.; Greger, P.D.; Saethre, M.B.

      1995-02-01

      This report contains baseline data and recommendations for future monitoring of plants and animals near the new Device Assembly Facility (DAF) on the Nevada Test Site (NTS). The facility is a large structure designed for safely assembling nuclear weapons. Baseline data was collected in 1993, prior to the scheduled beginning of DAF operations in early 1995. Studies were not performed prior to construction and part of the task of monitoring operational effects will be to distinguish those effects from the extensive disturbance effects resulting from construction. Baseline information on species abundances and distributions was collected on ephemeral and perennial plants, mammals, reptiles, and birds in the desert ecosystems within three kilometers (km) of the DAF. Particular attention was paid to effects of selected disturbances, such as the paved road, sewage pond, and the flood-control dike, associated with the facility. Radiological monitoring of areas surrounding the DAF is not included in this report.

    20. Supporting collaborative computing and interaction

      SciTech Connect (OSTI)

      Agarwal, Deborah; McParland, Charles; Perry, Marcia

      2002-05-22

      To enable collaboration on the daily tasks involved in scientific research, collaborative frameworks should provide lightweight and ubiquitous components that support a wide variety of interaction modes. We envision a collaborative environment as one that provides a persistent space within which participants can locate each other, exchange synchronous and asynchronous messages, share documents and applications, share workflow, and hold videoconferences. We are developing the Pervasive Collaborative Computing Environment (PCCE) as such an environment. The PCCE will provide integrated tools to support shared computing and task control and monitoring. This paper describes the PCCE and the rationale for its design.

    1. Computational Structural Mechanics

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computational structural mechanics is a well-established methodology for the design and analysis of many components and structures found in the transportation field. Modern finite-element models (FEMs) play a major role in these evaluations, and sophisticated software, such as the commercially available LS-DYNA® code, is...

    2. Multicore: Fallout from a Computing Evolution

      ScienceCinema (OSTI)

      Yelick, Kathy [Director, NERSC]

      2009-09-01

      July 22, 2008 Berkeley Lab lecture: Parallel computing used to be reserved for big science and engineering projects, but in two years that's all changed. Even laptops and hand-helds use parallel processors. Unfortunately, the software hasn't kept pace. Kathy Yelick, Director of the National Energy Research Scientific Computing Center at Berkeley Lab, describes the resulting chaos and the computing community's efforts to develop exciting applications that take advantage of tens or hundreds of processors on a single chip.

    3. Baseline tests for arc melter vitrification of INEL buried wastes. Volume 1: Facility description and summary data report

      SciTech Connect (OSTI)

      Oden, L.L.; O'Connor, W.K.; Turner, P.C.; Soelberg, N.R.; Anderson, G.L.

      1993-11-19

      This report presents field results and raw data from the Buried Waste Integrated Demonstration (BWID) Arc Melter Vitrification Project Phase 1 baseline test series conducted by the Idaho National Engineering Laboratory (INEL) in cooperation with the U.S. Bureau of Mines (USBM). The baseline test series was conducted using the electric arc melter facility at the USBM Albany Research Center in Albany, Oregon. Five different surrogate waste feed mixtures were tested that simulated thermally-oxidized, buried, TRU-contaminated, mixed wastes and soils present at the INEL. The USBM Arc Furnace Integrated Waste Processing Test Facility includes a continuous feed system, the arc melting furnace, an offgas control system, and utilities. The melter is a sealed, 3-phase alternating current (ac) furnace approximately 2 m high and 1.3 m wide. The furnace has a capacity of 1 metric ton of steel and can process as much as 1,500 lb/h of soil-type waste materials. The surrogate feed materials included five mixtures designed to simulate incinerated TRU-contaminated buried waste materials mixed with INEL soil. Process samples, melter system operations data and offgas composition data were obtained during the baseline tests to evaluate the melter performance and meet test objectives. Samples and data gathered during this program included (a) automatically and manually logged melter systems operations data, (b) process samples of slag, metal and fume solids, and (c) offgas composition, temperature, velocity, flowrate, moisture content, particulate loading and metals content. This report consists of 2 volumes: Volume I summarizes the baseline test operations. It includes an executive summary, system and facility description, review of the surrogate waste mixtures, and a description of the baseline test activities, measurements, and sample collection. Volume II contains the raw test data and sample analyses from samples collected during the baseline tests.

    4. Lawrence Livermore National Laboratory Emergency Response Capability 2009 Baseline Needs Assessment Performance Assessment

      SciTech Connect (OSTI)

      Sharry, J A

      2009-12-30

      This document was prepared by John A. Sharry, LLNL Fire Marshal and Division Leader for Fire Protection, and was reviewed by the Sandia/CA Fire Marshal, Martin Gresho. This document is the second of a two-part analysis of Emergency Response Capabilities of Lawrence Livermore National Laboratory. The first part, the 2009 Baseline Needs Assessment Requirements Document, established the minimum performance criteria necessary to meet mandatory requirements. This second part analyzes the performance of the Lawrence Livermore National Laboratory Emergency Management Department against the contents of the Requirements Document. The document was prepared based on an extensive review of information contained in the 2004 BNA, a review of Emergency Planning Hazards Assessments, a review of building construction, occupancy, fire protection features, dispatch records, LLNL alarm system records, fire department training records, and fire department policies and procedures. On October 1, 2007, LLNL contracted with the Alameda County Fire Department to provide emergency response services. The level of service called for in that contract is the same level of service as was provided by the LLNL Fire Department prior to that date. This Compliance Assessment will evaluate fire department services beginning October 1, 2008, as provided by the Alameda County Fire Department.

    5. LTC America's, Inc. PTC-6 vacuum system (metal): Baseline report

      SciTech Connect (OSTI)

      1997-07-31

      The LTC coating removal technology was tested and is being evaluated at Florida International University (FIU) as a baseline technology. In conjunction with FIU's evaluation of efficiency and cost, this report covers the evaluation conducted for safety and health issues. It is a commercially available technology and has been used for various projects at locations throughout the country. The LTC coating removal system consisted of several hand tools, a Roto Peen scaler, and a needlegun. They are designed to remove coatings from steel, concrete, brick, and wood. These hand tools are used with the LTC PTC-6 vacuum system to capture dust and debris as removal of the coating takes place. The safety and health evaluation during the testing demonstration focused on two main areas of exposure: dust and noise. The dust exposure was minimal, but noise exposure was significant. Further testing for each exposure is recommended because of the environment where the testing demonstration took place. It is possible that the dust and noise levels will be higher in an enclosed operating environment of different construction. In addition, other areas of concern found were arm-hand vibration, whole-body vibration, ergonomics, heat stress, tripping hazards, electrical hazards, machine guarding, and lockout/tagout.

    6. NREL Solar Radiation Research Laboratory (SRRL): Baseline Measurement System (BMS); Golden, Colorado (Data)

      DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

      Stoffel, T.; Andreas, A.

      The SRRL was established at the Solar Energy Research Institute (now NREL) in 1981 to provide continuous measurements of the solar resources, outdoor calibrations of pyranometers and pyrheliometers, and to characterize commercially available instrumentation. The SRRL is an outdoor laboratory located on South Table Mountain, a mesa providing excellent solar access throughout the year, overlooking Denver. Beginning with the basic measurements of global horizontal irradiance, direct normal irradiance and diffuse horizontal irradiance at 5-minute intervals, the SRRL Baseline Measurement System now produces more than 130 data elements at 1-min intervals that are available from the Measurement & Instrumentation Data Center Web site. Data sources include global horizontal, direct normal, diffuse horizontal (from shadowband and tracking disk), global on tilted surfaces, reflected solar irradiance, ultraviolet, infrared (upwelling and downwelling), photometric and spectral radiometers, sky imagery, and surface meteorological conditions (temperature, relative humidity, barometric pressure, precipitation, snow cover, wind speed and direction at multiple levels). Data quality control and assessment include daily instrument maintenance (M-F) with automated data quality control based on real-time examinations of redundant instrumentation and internal consistency checks using NREL's SERI-QC methodology. Operators are notified of equipment problems by automatic e-mail messages generated by the data acquisition and processing system. Radiometers are recalibrated at least annually with reference instruments traceable to the World Radiometric Reference (WRR).
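      One common internal-consistency check with redundant radiometers is the component-sum closure of the three basic irradiance measurements. The sketch below is a simplified illustration of that idea only; it does not reproduce NREL's SERI-QC procedure, and the tolerance value is an assumption.

        import math

        def closure_check(ghi, dni, dhi, zenith_deg, tolerance=0.08):
            """Flag a record as consistent when measured GHI agrees with DNI*cos(zenith) + DHI."""
            estimated_ghi = dni * math.cos(math.radians(zenith_deg)) + dhi
            relative_error = abs(ghi - estimated_ghi) / max(estimated_ghi, 1.0)
            return relative_error <= tolerance

        # Hypothetical 1-minute record: irradiances in W/m^2, solar zenith angle in degrees
        print(closure_check(ghi=780.0, dni=850.0, dhi=110.0, zenith_deg=35.0))   # -> True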

    7. Tank Waste Remediation System retrieval and disposal mission technical baseline summary description

      SciTech Connect (OSTI)

      McLaughlin, T.J.

      1998-01-06

      This document is prepared in order to support the US Department of Energy's evaluation of readiness-to-proceed for the Waste Retrieval and Disposal Mission at the Hanford Site. The Waste Retrieval and Disposal Mission is one of three primary missions under the Tank Waste Remediation System (TWRS) Project. The other two include programs to characterize tank waste and to provide for safe storage of the waste while it awaits treatment and disposal. The Waste Retrieval and Disposal Mission includes the programs necessary to support tank waste retrieval, waste feed delivery, storage and disposal of immobilized waste, and closure of tank farms. This mission will enable the tank farms to be closed and turned over for final remediation. The Technical Baseline is defined as the set of science and engineering, equipment, facilities, materials, qualified staff, and enabling documentation needed to start up and complete the mission objectives. The primary purposes of this document are (1) to identify the important technical information and factors that should be used by contributors to the mission and (2) to serve as a basis for configuration management of the technical information and factors.

    8. NREL Solar Radiation Research Laboratory (SRRL): Baseline Measurement System (BMS); Golden, Colorado (Data)

      DOE Data Explorer [Office of Scientific and Technical Information (OSTI)]

      Stoffel, T.; Andreas, A.

      1981-07-15

      The SRRL was established at the Solar Energy Research Institute (now NREL) in 1981 to provide continuous measurements of the solar resources, outdoor calibrations of pyranometers and pyrheliometers, and to characterize commercially available instrumentation. The SRRL is an outdoor laboratory located on South Table Mountain, a mesa providing excellent solar access throughout the year, overlooking Denver. Beginning with the basic measurements of global horizontal irradiance, direct normal irradiance and diffuse horizontal irradiance at 5-minute intervals, the SRRL Baseline Measurement System now produces more than 130 data elements at 1-min intervals that are available from the Measurement & Instrumentation Data Center Web site. Data sources include global horizontal, direct normal, diffuse horizontal (from shadowband and tracking disk), global on tilted surfaces, reflected solar irradiance, ultraviolet, infrared (upwelling and downwelling), photometric and spectral radiometers, sky imagery, and surface meteorological conditions (temperature, relative humidity, barometric pressure, precipitation, snow cover, wind speed and direction at multiple levels). Data quality control and assessment include daily instrument maintenance (M-F) with automated data quality control based on real-time examinations of redundant instrumentation and internal consistency checks using NREL's SERI-QC methodology. Operators are notified of equipment problems by automatic e-mail messages generated by the data acquisition and processing system. Radiometers are recalibrated at least annually with reference instruments traceable to the World Radiometric Reference (WRR).

    9. BNL ATLAS Grid Computing

      ScienceCinema (OSTI)

      Michael Ernst

      2010-01-08

      As the sole Tier-1 computing facility for ATLAS in the United States and the largest ATLAS computing center worldwide, Brookhaven provides a large portion of the overall computing resources for U.S. collaborators and serves as the central hub for storing...

    10. Computing environment logbook

      DOE Patents [OSTI]

      Osbourn, Gordon C; Bouchard, Ann M

      2012-09-18

      A computing environment logbook logs events occurring within a computing environment. The events are displayed as a history of past events within the logbook of the computing environment. The logbook provides search functionality to search through the history of past events to find one or more selected past events, and further, enables an undo of the one or more selected past events.
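      A minimal sketch of the logbook idea described above, with an event history, text search, and undo of selected events; the class and method names are invented for illustration and do not come from the patent.

        import os

        class EventLogbook:
            """Records events, supports text search, and undoes selected past events."""

            def __init__(self):
                self.history = []          # list of (description, undo_callback) pairs

            def log(self, description, undo_callback):
                self.history.append((description, undo_callback))

            def search(self, text):
                return [i for i, (desc, _) in enumerate(self.history) if text in desc]

            def undo(self, index):
                description, undo_callback = self.history.pop(index)
                undo_callback()
                return description

        # Hypothetical usage: log a file creation, then undo it
        book = EventLogbook()
        open("scratch.txt", "w").close()
        book.log("created scratch.txt", lambda: os.remove("scratch.txt"))
        print(book.search("scratch"))   # -> [0]
        book.undo(0)                    # removes the file again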

    11. Mathematical and Computational Epidemiology

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Mathematical and Computational Epidemiology (MCEpi), Los Alamos National Laboratory. Research areas include agent-based modeling; mixing patterns and social networks; mathematical epidemiology; social internet research; and uncertainty quantification. Quantifying model uncertainty in agent-based simulations for...

    12. Scalable optical quantum computer

      SciTech Connect (OSTI)

      Manykin, E A; Mel'nichenko, E V [Institute for Superconductivity and Solid-State Physics, Russian Research Centre 'Kurchatov Institute', Moscow (Russian Federation)]

      2014-12-31

      A way of designing a scalable optical quantum computer based on the photon echo effect is proposed. Individual rare earth ions Pr³⁺, regularly located in the lattice of the orthosilicate (Y₂SiO₅) crystal, are suggested to be used as optical qubits. Operations with qubits are performed using coherent and incoherent laser pulses. The operation protocol includes both the method of measurement-based quantum computations and the technique of optical computations. Modern hybrid photon echo protocols, which provide a sufficient quantum efficiency when reading recorded states, are considered as most promising for quantum computations and communications. (quantum computer)

    13. COMPUTATIONAL SCIENCE CENTER

      SciTech Connect (OSTI)

      DAVENPORT, J.

      2005-11-01

      The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security. To achieve our goals we have established a close alliance with applied mathematicians and computer scientists at Stony Brook and Columbia Universities.

    14. Baseline risk assessment for exposure to contaminants at the St. Louis Site, St. Louis, Missouri

      SciTech Connect (OSTI)

      Not Available

      1993-11-01

      The St. Louis Site comprises three noncontiguous areas in and near St. Louis, Missouri: the St. Louis Downtown Site (SLDS), the St. Louis Airport Storage Site (SLAPS), and the Latty Avenue Properties. The main site of the Latty Avenue Properties includes the Hazelwood Interim Storage Site (HISS) and the Futura Coatings property, which are located at 9200 Latty Avenue. Contamination at the St. Louis Site is the result of uranium processing and disposal activities that took place from the 1940s through the 1970s. Uranium processing took place at the SLDS from 1942 through 1957. From the 1940s through the 1960s, SLAPS was used as a storage area for residues from the manufacturing operations at SLDS. The materials stored at SLAPS were bought by Continental Mining and Milling Company of Chicago, Illinois, in 1966, and moved to the HISS/Futura Coatings property at 9200 Latty Avenue. Vicinity properties became contaminated as a result of transport and movement of the contaminated material among SLDS, SLAPS, and the 9200 Latty Avenue property. This contamination led to the SLAPS, HISS, and Futura Coatings properties being placed on the National Priorities List (NPL) of the US Environmental Protection Agency (EPA). The US Department of Energy (DOE) is responsible for cleanup activities at the St. Louis Site under its Formerly Utilized Sites Remedial Action Program (FUSRAP). The primary goal of FUSRAP is the elimination of potential hazards to human health and the environment at former Manhattan Engineer District/Atomic Energy Commission (MED/AEC) sites so that, to the extent possible, these properties can be released for use without restrictions. To determine and establish cleanup goals for the St. Louis Site, DOE is currently preparing a remedial investigation/feasibility study-environmental impact statement (RI/FS-EIS). This baseline risk assessment (BRA) is a component of the process; it addresses potential risk to human health and the environment associated wi

    15. M & V Shootout: Setting the Stage For Testing the Performance of New Energy Baseline

      SciTech Connect (OSTI)

      Touzani, Samir; Custodio, Claudine; Sohn, Michael; Fernandes, Samuel; Granderson, Jessica; Jump, David; Taylor, Cody

      2015-07-01

      Trustworthy savings calculations are critical to convincing investors in energy efficiency projects of the benefit and cost-effectiveness of such investments and their ability to replace or defer supply-side capital investments. However, today’s methods for measurement and verification (M&V) of energy savings constitute a significant portion of the total costs of efficiency projects. They also require time-consuming data acquisition and often do not deliver results until years after the program period has ended. A spectrum of savings calculation approaches are used, with some relying more heavily on measured data and others relying more heavily on estimated or modeled data, or stipulated information. The rising availability of “smart” meters, combined with new analytical approaches to quantifying savings, has opened the door to conducting M&V more quickly and at lower cost, with comparable or improved accuracy. Energy management and information systems (EMIS) technologies, not only enable significant site energy savings, but are also beginning to offer M&V capabilities. This paper expands recent analyses of public-domain, whole-building M&V methods, focusing on more novel baseline modeling approaches that leverage interval meter data. We detail a testing procedure and metrics to assess the performance of these new approaches using a large test dataset. We also provide conclusions regarding the accuracy, cost, and time trade-offs between more traditional M&V and these emerging streamlined methods. Finally, we discuss the potential evolution of M&V to better support the energy efficiency industry through low-cost approaches, and the long-term agenda for validation of building energy analytics.
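      As a simplified, hypothetical illustration of an interval-data baseline model and one common goodness-of-fit metric, CV(RMSE), the sketch below fits a linear temperature-dependence model to pre-period meter data; it is a toy example, not the method evaluated in the paper, and the data values are invented.

        import numpy as np

        def fit_baseline(temps, loads):
            """Least-squares fit of load = a + b * outdoor_temperature."""
            design = np.column_stack([np.ones_like(temps), temps])
            coeffs, *_ = np.linalg.lstsq(design, loads, rcond=None)
            return coeffs

        def cv_rmse(observed, predicted):
            """Coefficient of variation of the root-mean-square error, in percent."""
            rmse = np.sqrt(np.mean((observed - predicted) ** 2))
            return 100.0 * rmse / np.mean(observed)

        # Hypothetical hourly pre-period data: outdoor temperature (C) and whole-building load (kW)
        temps = np.array([10.0, 15.0, 20.0, 25.0, 30.0])
        loads = np.array([50.0, 55.0, 62.0, 71.0, 80.0])
        a, b = fit_baseline(temps, loads)
        print(round(cv_rmse(loads, a + b * temps), 2))

      Savings would then be estimated as the difference between the baseline model's prediction and the metered consumption in the post-retrofit period; the fit statistics indicate how much of that difference can be trusted.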

    16. Statistical Comparison of the Baseline Mechanical Properties of NBG-18 and PCEA Graphite

      SciTech Connect (OSTI)

      Mark C. Carroll; David T. Rohrbaugh

      2013-08-01

      High-purity graphite is the core structural material of choice in the Very High Temperature Reactor (VHTR), a graphite-moderated, helium-cooled design that is capable of producing process heat for power generation and for industrial processes that require temperatures higher than the outlet temperatures of present nuclear reactors. The Baseline Graphite Characterization Program is endeavoring to minimize the conservative estimates of as-manufactured mechanical and physical properties by providing comprehensive data that captures the level of variation in measured values. In addition to providing a comprehensive comparison between these values in different nuclear grades, the program is also carefully tracking individual specimen source, position, and orientation information in order to provide comparisons and variations between different lots, different billets, and different positions from within a single billet. This report is a preliminary comparison between the two grades of graphite that were initially favored in the two main VHTR designs. NBG-18, a medium-grain pitch coke graphite from SGL formed via vibration molding, was the favored structural material in the pebble-bed configuration, while PCEA, a smaller grain, petroleum coke, extruded graphite from GrafTech was favored for the prismatic configuration. An analysis of the comparison between these two grades will include not only the differences in fundamental and statistically significant individual strength levels, but also the differences in variability in properties within each of the grades that will ultimately provide the basis for the prediction of in-service performance. The comparative performance of the different types of nuclear grade graphites will continue to evolve as thousands more specimens are fully characterized from the numerous grades of graphite being evaluated.
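      A hedged illustration of the kind of two-grade comparison such a program performs: the sketch below applies Welch's t-test to two hypothetical sets of strength measurements. The numbers are invented, and the actual program relies on far richer lot-, billet-, and position-aware statistics.

        import math
        from statistics import mean, stdev

        def welch_t(sample_a, sample_b):
            """Welch's t statistic for two independent samples with unequal variances."""
            va, vb = stdev(sample_a) ** 2, stdev(sample_b) ** 2
            na, nb = len(sample_a), len(sample_b)
            return (mean(sample_a) - mean(sample_b)) / math.sqrt(va / na + vb / nb)

        # Hypothetical tensile strengths (MPa) for two graphite grades
        grade_1 = [21.5, 22.1, 20.8, 23.0, 21.9]
        grade_2 = [19.8, 20.5, 21.0, 19.2, 20.1]
        print(round(welch_t(grade_1, grade_2), 2))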

    17. Level III baseline risk evaluation for Building 3505 at Oak Ridge National Laboratory, Oak Ridge, Tennessee

      SciTech Connect (OSTI)

      Mostella, W.B. Jr.

      1994-12-01

      The Level III Baseline Risk Evaluation (BRE) for Building 3505, the ORNL Metal Recovery Facility, provides an analysis of the potential for adverse health effects, current or future, associated with the presence of hazardous substances in the building. The Metal Recovery Facility was used from 1952 through 1960 to process large quantities of radioactive material using the PUREX process for the recovery of uranium-238, plutonium-239, neptunium-237, and americium-241. The facility consists of seven process cells (A through G), a canal, a dissolver room, a dissolver pit, an office, locker room, storage area, control room, electrical gallery, shop, and makeup area. The cells were used to house the nuclear fuel reprocessing equipment, and the canal was constructed to be used as a water-shielded transfer canal. Currently, there are no known releases of radioactive contaminants from Building 3505. To perform the BRE, historical radiological survey data were used to estimate the concentration of alpha- and beta/gamma emitting radionuclides in the various cells, rooms, and other areas in Building 3505. Data from smear surveys were used to estimate the amount of transferable contamination (to which receptors can be exposed via inhalation and ingestion), and data from probe surveys were used to estimate the amount of both fixed and transferable contamination (from which receptors can receive external exposure). Two land use scenarios, current and future, and their subsequent exposure scenarios were explored in the BRE. Under the current land use scenario, two exposure scenarios were evaluated. The first was a worst-case industrial exposure scenario in which the receptor is a maintenance worker who works 8 hours/day, 350 days/year in the building for 25 years. In the second, more realistic exposure scenario, the receptor is a surveillance and maintenance (S&M) worker who spends two 8-hour days/year in the building for 25 years.

    18. Breckinridge Project, initial effort. Report VII, Volume II. Environmental baseline report

      SciTech Connect (OSTI)

      1982-01-01

      Ashland Synthetic Fuels, Inc. (ASFI) and Airco Energy Company, Inc. (AECI) have recently formed the Breckinridge Project and are currently conducting a process and economic feasibility study of a commercial scale facility to produce synthetic liquid fuels from coal. The coal conversion process to be used is the H-COAL process, which is in the pilot plant testing stage under the auspices of the US Department of Energy at the H-COAL Pilot Plant Project near Catlettsburg, Kentucky. The preliminary plans for the commercial plant are for a 18,140 metric ton/day (24,000 ton/day) nominal coal consumption capacity utilizing the abundant high sulfur Western Kentucky coals. The Western Kentucky area offers a source of the coal along with adequate water, power, labor, transportation and other factors critical to the successful siting of a plant. Various studies by federal and state governments, as well as private industry, have reached similar conclusions regarding the suitability of such plant sites in western Kentucky. Of the many individual sites evaluated, a site in Breckinridge County, Kentucky, approximately 4 kilometers (2.5 miles) west of the town of Stephensport, has been identified as the plant location. Actions have been taken to obtain options to ensure that this site will be available when needed. This report contains an overview of the regional setting and results of the baseline environmental studies. These studies include collection of data on ambient air and water quality, sound, aquatic and terrestrial biology, and geology. This report contains the following chapters: introduction, review of significant findings, ambient air quality monitoring, sound, aquatic ecology, vegetation, wildlife, geology, soils, surface water, and ground water.

    19. Vandenberg Air Force Base integrated resource assessment. Volume 2, Baseline detail

      SciTech Connect (OSTI)

      Halverson, M.A.; Richman, E.E.; Dagle, J.E.; Hickman, B.J.; Daellenbach, K.K.; Sullivan, G.P.

      1993-06-01

      The US Air Force Space Command has tasked the Pacific Northwest Laboratory, as the lead laboratory supporting the US Department of Energy Federal Energy Management Program, to identify, evaluate, and assist in acquiring all cost-effective energy projects at Vandenberg Air Force Base (VAFB). This is a model program PNL is designing for federal customers served by the Pacific Gas and Electric Company (PG&E). The primary goal of the VAFB project is to identify all electric energy efficiency opportunities, and to negotiate with PG&E to acquire those resources through a customized demand-side management program for its federal clients. That customized program should have three major characteristics: (1) 100% up-front financing; (2) substantial utility cost-sharing; and (3) utility implementation through energy service companies under contract to the utility. A similar arrangement will be pursued with Southern California Gas for non-electric resource opportunities if that is deemed desirable by the site and if the gas utility seems open to such an approach. This report documents the assessment of baseline energy use at VAFB, located near Lompoc, California. It is a companion report to Volume 1, Executive Summary, and Volume 3, Resource Assessment. This analysis examines the characteristics of electric, natural gas, fuel oil, and propane use for fiscal year 1991. It records energy-use intensities for the facilities at VAFB by building type and energy end use. It also breaks down building energy consumption by fuel type, energy end use, and building type. A more complete energy consumption reconciliation is presented that includes the accounting of all energy use among buildings, utilities, and applicable losses.

    20. Baseline biological risk assessment for aquatic populations occurring near Eielson Air Force Base, Alaska

      SciTech Connect (OSTI)

      Dauble, D.; Brandt, C.; Lewis, R.; Smith, R.

      1995-12-31

      Eielson Air Force Base (AFB), Alaska was listed as a Superfund site in November 1989 with 64 potential source areas of contamination. As part of a sitewide remedial investigation, baseline risk assessments were conducted in 1993 and 1994 to evaluate hazards posed to biological receptors and to human health. Fish tissue, aquatic invertebrates, aquatic vegetation, sediment, and surface water data were collected from several on-site and off-site surface water bodies. An initial screening risk assessment indicated that several surface water sites along two major tributary creeks flowing through the base had unacceptable risks to both aquatic receptors and to human health because of DDTs. Other contaminants of concern (i.e., PCBs and PAHs) were below screening risk levels for aquatic organisms, but contributed to an unacceptable risk to human health. Additional samples were taken in 1994 to characterize the site-wide distribution of PAHs, DDTs, and PCBs in aquatic biota and sediments. Concentrations of PAHs followed the order invertebrates > aquatic vegetation > fish, but concentrations were sufficiently low that they posed no significant risk to biological receptors. Pesticides were detected in all fish tissue samples. Polychlorinated biphenyls (PCBs) were also detected in most fish from Garrison Slough. The pattern of PCB concentrations in Arctic grayling (Thymallus arcticus) was related to their proximity to a sediment source in lower Garrison Slough. Ingestion of PCB-contaminated fish is the primary human-health risk driver for surface water bodies on Eielson AFB, resulting in carcinogenic risks > 1 × 10⁻⁴ for future recreational land use at some sites. Principal considerations affecting uncertainty in the risk assessment process included spatial and temporal variability in media contaminant concentrations and inconsistencies between modeled and measured body burdens.

    1. EA-2020: Energy Efficiency Standards for New Federal Low-Rise Residential Buildings’ Baseline Standards Update (RIN 1904-AD56)

      Broader source: Energy.gov [DOE]

      This EA will evaluate the potential environmental impacts of implementing the provisions in the Energy Conservation and Production Act (ECPA) that require DOE to update the baseline Federal energy efficiency performance standards for the construction of new Federal buildings, including low-rise residential buildings.

    2. Computers for artificial intelligence a technology assessment and forecast

      SciTech Connect (OSTI)

      Miller, R.K.

      1986-01-01

      This study reviews the development and current state-of-the-art in computers for artificial intelligence, including LISP machines, AI workstations, professional and engineering workstations, minicomputers, mainframes, and supercomputers. Major computer systems for AI applications are reviewed. The use of personal computers for expert system development is discussed, and AI software for the IBM PC, Texas Instruments Professional Computer, and Apple Macintosh is presented. Current research aimed at developing a new computer for artificial intelligence is described, and future technological developments are discussed.

    3. Sandia Energy - High Performance Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Home » Energy Research » Advanced Scientific Computing Research (ASCR) » High Performance Computing

    4. Tribal Energy Development - Process and Guide

      Energy Savers [EERE]

      Development - Process & Guide: integrate supply and demand alternatives. Tribal Objectives: * Energy Reliability & Security * Off-Grid Electrification * Minimize Environmental Impacts * Supply Diversification * Use of Local Resources * Economic Development * Jobs * Build technical expertise * Respect for Mother Earth * Others? Process steps: develop a community energy baseline; develop a common Tribal energy vision; identify and support a Tribal champion; identify and evaluate resource options.

    5. Session on computation in biological pathways

      SciTech Connect (OSTI)

      Karp, P.D.; Riley, M.

      1996-12-31

      The papers in this session focus on the development of pathway databases and computational tools for pathway analysis. The discussion involves existing databases of sequenced genomes, as well as techniques for studying regulatory pathways.

    6. Edison Electrifies Scientific Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Edison Electrifies Scientific Computing Edison Electrifies Scientific Computing NERSC Flips Switch on New Flagship Supercomputer January 31, 2014 Contact: Margie Wylie, mwylie@lbl.gov, +1 510 486 7421 The National Energy Research Scientific Computing (NERSC) Center recently accepted "Edison," a new flagship supercomputer designed for scientific productivity. Named in honor of American inventor Thomas Alva Edison, the Cray XC30 will be dedicated in a ceremony held at the Department of

    7. Energy Aware Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Home » R & D » Energy Aware Computing. Dynamic Frequency Scaling: One means to lower the energy required to compute is to reduce the power usage on a node. One way to accomplish this is by lowering the frequency at which the CPU operates. However, reducing the clock speed increases the time to solution, creating a potential tradeoff. NERSC continues to examine how such methods impact its operations and its
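
      On Linux systems, frequency scaling of this kind is commonly exercised through the cpufreq sysfs interface. The snippet below is a minimal, hedged sketch of reading and lowering a core's frequency that way; it assumes root privileges and a driver exposing the userspace governor and scaling_setspeed, and it is not NERSC's actual tooling.

```python
# Minimal sketch of dynamic frequency scaling via the Linux cpufreq sysfs
# interface. Assumes root privileges and a driver that supports the
# "userspace" governor; paths and available knobs vary by system.
from pathlib import Path

def read_khz(cpu: int, name: str) -> int:
    return int(Path(f"/sys/devices/system/cpu/cpu{cpu}/cpufreq/{name}").read_text())

def set_frequency_khz(cpu: int, khz: int) -> None:
    base = Path(f"/sys/devices/system/cpu/cpu{cpu}/cpufreq")
    (base / "scaling_governor").write_text("userspace")
    (base / "scaling_setspeed").write_text(str(khz))

if __name__ == "__main__":
    cur = read_khz(0, "scaling_cur_freq")
    lo = read_khz(0, "scaling_min_freq")
    print(f"cpu0 at {cur} kHz; lowering to {lo} kHz trades time-to-solution for power")
    set_frequency_khz(0, lo)
```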

    8. Computation Directorate 2007 Annual Report

      SciTech Connect (OSTI)

      Henson, V E; Guse, J A

      2008-03-06

      If there is a single word that both characterized 2007 and dominated the thoughts and actions of many Laboratory employees throughout the year, it is transition. Transition refers to the major shift that took place on October 1, when the University of California relinquished management responsibility for Lawrence Livermore National Laboratory (LLNL), and Lawrence Livermore National Security, LLC (LLNS), became the new Laboratory management contractor for the Department of Energy's (DOE's) National Nuclear Security Administration (NNSA). In the 55 years under the University of California, LLNL amassed an extraordinary record of significant accomplishments, clever inventions, and momentous contributions in the service of protecting the nation. This legacy provides the new organization with a built-in history, a tradition of excellence, and a solid set of core competencies from which to build the future. I am proud to note that in the nearly seven years I have had the privilege of leading the Computation Directorate, our talented and dedicated staff has made far-reaching contributions to the legacy and tradition we passed on to LLNS. Our place among the world's leaders in high-performance computing, algorithmic research and development, applications, and information technology (IT) services and support is solid. I am especially gratified to report that through all the transition turmoil, and it has been considerable, the Computation Directorate continues to produce remarkable achievements. Our most important asset--the talented, skilled, and creative people who work in Computation--has continued a long-standing Laboratory tradition of delivering cutting-edge science even in the face of adversity. The scope of those achievements is breathtaking, and in 2007, our accomplishments span an amazing range of topics. From making an important contribution to a Nobel Prize-winning effort to creating tools that can detect malicious codes embedded in commercial software; from expanding BlueGene/L, the world's most powerful computer, by 60% and using it to capture the most prestigious prize in the field of computing, to helping create an automated control system for the National Ignition Facility (NIF) that monitors and adjusts more than 60,000 control and diagnostic points; from creating a microarray probe that rapidly detects virulent high-threat organisms, natural or bioterrorist in origin, to replacing large numbers of physical computer servers with small numbers of virtual servers, reducing operating expense by 60%, the people in Computation have been at the center of weighty projects whose impacts are felt across the Laboratory and the DOE community. The accomplishments I just mentioned, and another two dozen or so, make up the stories contained in this report. While they form an exceptionally diverse set of projects and topics, it is what they have in common that excites me. They share the characteristic of being central, often crucial, to the mission-driven business of the Laboratory. Computational science has become fundamental to nearly every aspect of the Laboratory's approach to science and even to the conduct of administration. It is difficult to consider how we would proceed without computing, which occurs at all scales, from handheld and desktop computing to the systems controlling the instruments and mechanisms in the laboratories to the massively parallel supercomputers. The reasons for the dramatic increase in the importance of computing are manifest. 
Practical, fiscal, or political realities make the traditional approach to science, the cycle of theoretical analysis leading to experimental testing, leading to adjustment of theory, and so on, impossible, impractical, or forbidden. How, for example, can we understand the intricate relationship between human activity and weather and climate? We cannot test our hypotheses by experiment, which would require controlled use of the entire earth over centuries. It is only through extremely intricate, detailed computational simulation that we can test our theories, and simulati

    9. Personal Computer Inventory System

      Energy Science and Technology Software Center (OSTI)

      1993-10-04

      PCIS is a database software system that is used to maintain a personal computer hardware and software inventory, track transfers of hardware and software, and provide reports.
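
      As a rough illustration of the kind of inventory and transfer tracking such a system maintains, a minimal database might look like the sketch below. The table layout is a hypothetical example, not PCIS's actual schema.

```python
# Hypothetical minimal schema for a PC hardware/software inventory with
# transfer tracking; an illustrative sketch only, not the PCIS schema.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE hardware (
    asset_tag TEXT PRIMARY KEY,
    description TEXT,
    custodian TEXT,
    location TEXT
);
CREATE TABLE software (
    license_id TEXT PRIMARY KEY,
    title TEXT,
    installed_on TEXT REFERENCES hardware(asset_tag)
);
CREATE TABLE transfers (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    asset_tag TEXT REFERENCES hardware(asset_tag),
    from_custodian TEXT,
    to_custodian TEXT,
    transfer_date TEXT
);
""")
conn.execute("INSERT INTO hardware VALUES ('PC-0001', 'Desktop PC', 'J. Smith', 'Bldg 12')")
conn.execute("INSERT INTO transfers (asset_tag, from_custodian, to_custodian, transfer_date) "
             "VALUES ('PC-0001', 'J. Smith', 'A. Jones', '1993-10-04')")
# A simple report: each hardware item with any recorded transfer
for row in conn.execute("SELECT h.asset_tag, h.custodian, t.to_custodian FROM hardware h "
                        "LEFT JOIN transfers t ON t.asset_tag = h.asset_tag"):
    print(row)
```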

    10. Applied Computer Science

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Results from a climate simulation computed using the Model for Prediction Across Scales (MPAS) code. This visualization shows the temperature of ocean currents using a green and ...

    11. Announcement of Computer Software

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      All Other Editions Are Obsolete UNITED STATES DEPARTMENT OF ENERGY ANNOUNCEMENT OF COMPUTER SOFTWARE OMB Control Number 1910-1400 (OMB Burden Disclosure Statement is on last...

    12. Molecular Science Computing: 2010 Greenbook

      SciTech Connect (OSTI)

      De Jong, Wibe A.; Cowley, David E.; Dunning, Thom H.; Vorpagel, Erich R.

      2010-04-02

      This 2010 Greenbook outlines the science drivers for performing integrated computational environmental molecular research at EMSL and defines the next-generation HPC capabilities that must be developed at the MSC to address this critical research. The EMSL MSC Science Panel used EMSL’s vision and science focus and white papers from current and potential future EMSL scientific user communities to define the scientific direction and resulting HPC resource requirements presented in this 2010 Greenbook.

    13. Michael Levitt and Computational Biology

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Michael Levitt and Computational Biology Resources with Additional Information * Publications Michael Levitt Courtesy of Linda A. Cicero / Stanford News Service Michael Levitt, PhD, professor of structural biology at the Stanford University School of Medicine, has won the 2013 Nobel Prize in Chemistry. ... Levitt ... shares the ... prize with Martin Karplus ... and Arieh Warshel ... "for the development of multiscale models for complex chemical systems." Levitt's work focuses on

    14. INITIAL COMPARISON OF BASELINE PHYSICAL AND MECHANICAL PROPERTIES FOR THE VHTR CANDIDATE GRAPHITE GRADES

      SciTech Connect (OSTI)

      Carroll, Mark C

      2014-09-01

      High-purity graphite is the core structural material of choice in the Very High Temperature Reactor (VHTR) design, a graphite-moderated, helium-cooled configuration that is capable of producing thermal energy for power generation as well as process heat for industrial applications that require temperatures higher than the outlet temperatures of present nuclear reactors. The Baseline Graphite Characterization Program is endeavoring to minimize the conservative estimates of as-manufactured mechanical and physical properties in nuclear-grade graphites by providing comprehensive data that captures the level of variation in measured values. In addition to providing a thorough comparison between these values in different graphite grades, the program is also carefully tracking individual specimen source, position, and orientation information in order to provide comparisons both in specific properties and in the associated variability between different lots, different billets, and different positions from within a single billet. This report is a preliminary comparison between each of the grades of graphite that are considered candidate grades from four major international graphite producers. These particular grades (NBG-18, NBG-17, PCEA, IG-110, and 2114) are the major focus of the evaluations presently underway on irradiated graphite properties through the series of Advanced Graphite Creep (AGC) experiments. NBG-18, a medium-grain pitch coke graphite from SGL from which billets are formed via vibration molding, was the favored structural material in the pebble-bed configuration. NBG-17 graphite from SGL is essentially NBG-18 with the grain size reduced by a factor of two. PCEA, petroleum coke graphite from GrafTech with a similar grain size to NBG-17, is formed via an extrusion process and was initially considered the favored grade for the prismatic layout. IG-110 and 2114, from Toyo Tanso and Mersen (formerly Carbone Lorraine), respectively, are fine-grain grades produced via an isomolding process. An analysis of the comparison between each of these grades will include not only the differences in fundamental and statistically-significant individual strength levels, but also the differences in variability in properties within each of the grades that will ultimately provide the basis for the prediction of in-service performance. The comparative performance of the different types of nuclear-grade graphites will continue to evolve as thousands more specimens are fully characterized from the numerous grades of graphite being evaluated.
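
      Comparisons of that sort (property means and their variability broken out by grade, billet, and within-billet position) reduce, computationally, to straightforward grouped statistics. The sketch below is a generic illustration with hypothetical column names and input file, not the program's actual analysis code.

```python
# Generic grouped-statistics sketch for comparing mean strength and its
# variability across graphite grades, billets, and specimen positions.
# The input file and column names are hypothetical placeholders.
import pandas as pd

df = pd.read_csv("graphite_specimens.csv")  # columns: grade, billet, position, strength_mpa
summary = (
    df.groupby(["grade", "billet", "position"])["strength_mpa"]
      .agg(n="count", mean="mean", std="std")
      .assign(cov=lambda t: t["std"] / t["mean"])  # coefficient of variation
      .reset_index()
)
print(summary.sort_values(["grade", "cov"], ascending=[True, False]))
```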

    15. Short-baseline electron neutrino disappearance, tritium beta decay, and neutrinoless double-beta decay

      SciTech Connect (OSTI)

      Giunti, Carlo; Laveder, Marco [INFN, Sezione di Torino, Via P. Giuria 1, I-10125 Torino (Italy); Dipartimento di Fisica G. Galilei, Universita di Padova, and INFN, Sezione di Padova, Via F. Marzolo 8, I-35131 Padova (Italy)

      2010-09-01

      We consider the interpretation of the MiniBooNE low-energy anomaly and the gallium radioactive source experiments anomaly in terms of short-baseline electron neutrino disappearance in the framework of 3+1 four-neutrino mixing schemes. The separate fits of MiniBooNE and gallium data are highly compatible, with close best-fit values of the effective oscillation parameters Δm² and sin²2θ. The combined fit gives Δm² ≳ 0.1 eV² and 0.11 ≲ sin²2θ ≲ 0.48 at 2σ. We consider also the data of the Bugey and Chooz reactor antineutrino oscillation experiments and the limits on the effective electron antineutrino mass in β decay obtained in the Mainz and Troitsk tritium experiments. The fit of the data of these experiments limits the value of sin²2θ below 0.10 at 2σ. Considering the tension between the neutrino MiniBooNE and gallium data and the antineutrino reactor and tritium data as a statistical fluctuation, we perform a combined fit which gives Δm² ≈ 2 eV² and 0.01 ≲ sin²2θ ≲ 0.13 at 2σ. Assuming a hierarchy of masses m₁, m₂, m₃ <
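
      For orientation on the notation, fits of this kind rest on the effective two-flavor survival probability of 3+1 schemes. The expression below is the standard textbook form, stated here as a reminder rather than quoted from the paper.

```latex
% Effective short-baseline electron-neutrino survival probability in a
% 3+1 scheme (standard two-flavor approximation, stated for orientation).
\[
  P_{\nu_e \to \nu_e} \simeq 1 - \sin^2 2\theta_{ee}\,
  \sin^2\!\left(\frac{\Delta m^2_{41} L}{4E}\right)
  = 1 - \sin^2 2\theta_{ee}\,
  \sin^2\!\left(1.27\,\frac{\Delta m^2_{41}[\mathrm{eV}^2]\,L[\mathrm{m}]}{E[\mathrm{MeV}]}\right)
\]
```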

    16. 60 Years of Computing | Department of Energy

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      60 Years of Computing

    17. Information Science, Computing, Applied Math

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Capabilities » Information Science, Computing, Applied Math. National security ...

    18. 2011 Computation Directorate Annual Report

      SciTech Connect (OSTI)

      Crawford, D L

      2012-04-11

      From its founding in 1952 until today, Lawrence Livermore National Laboratory (LLNL) has made significant strategic investments to develop high performance computing (HPC) and its application to national security and basic science. Now, 60 years later, the Computation Directorate and its myriad resources and capabilities have become a key enabler for LLNL programs and an integral part of the effort to support our nation's nuclear deterrent and, more broadly, national security. In addition, the technological innovation HPC makes possible is seen as vital to the nation's economic vitality. LLNL, along with other national laboratories, is working to make supercomputing capabilities and expertise available to industry to boost the nation's global competitiveness. LLNL is on the brink of an exciting milestone with the 2012 deployment of Sequoia, the National Nuclear Security Administration's (NNSA's) 20-petaFLOP/s resource that will apply uncertainty quantification to weapons science. Sequoia will bring LLNL's total computing power to more than 23 petaFLOP/s-all brought to bear on basic science and national security needs. The computing systems at LLNL provide game-changing capabilities. Sequoia and other next-generation platforms will enable predictive simulation in the coming decade and leverage industry trends, such as massively parallel and multicore processors, to run petascale applications. Efficient petascale computing necessitates refining accuracy in materials property data, improving models for known physical processes, identifying and then modeling for missing physics, quantifying uncertainty, and enhancing the performance of complex models and algorithms in macroscale simulation codes. Nearly 15 years ago, NNSA's Accelerated Strategic Computing Initiative (ASCI), now called the Advanced Simulation and Computing (ASC) Program, was the critical element needed to shift from test-based confidence to science-based confidence. Specifically, ASCI/ASC accelerated the development of simulation capabilities necessary to ensure confidence in the nuclear stockpile-far exceeding what might have been achieved in the absence of a focused initiative. While stockpile stewardship research pushed LLNL scientists to develop new computer codes, better simulation methods, and improved visualization technologies, this work also stimulated the exploration of HPC applications beyond the standard sponsor base. As LLNL advances to a petascale platform and pursues exascale computing (1,000 times faster than Sequoia), ASC will be paramount to achieving predictive simulation and uncertainty quantification. Predictive simulation and quantifying the uncertainty of numerical predictions where little-to-no data exists demands exascale computing and represents an expanding area of scientific research important not only to nuclear weapons, but to nuclear attribution, nuclear reactor design, and understanding global climate issues, among other fields. Aside from these lofty goals and challenges, computing at LLNL is anything but 'business as usual.' International competition in supercomputing is nothing new, but the HPC community is now operating in an expanded, more aggressive climate of global competitiveness. More countries understand how science and technology research and development are inextricably linked to economic prosperity, and they are aggressively pursuing ways to integrate HPC technologies into their native industrial and consumer products. 
In the interest of the nation's economic security and the science and technology that underpins it, LLNL is expanding its portfolio and forging new collaborations. We must ensure that HPC remains an asymmetric engine of innovation for the Laboratory and for the U.S. and, in doing so, protect our research and development dynamism and the prosperity it makes possible. One untapped area of opportunity LLNL is pursuing is to help U.S. industry understand how supercomputing can benefit their business. Industrial investment in HPC applications has historically been limited by the prohibitive cost of entry, the inaccessibility of software to run the powerful systems, and the years it takes to grow the expertise to develop codes and run them in an optimal way. LLNL is helping industry better compete in the global market place by providing access to some of the world's most powerful computing systems, the tools to run them, and the experts who are adept at using them. Our scientists are collaborating side by side with industrial partners to develop solutions to some of industry's toughest problems. The goal of the Livermore Valley Open Campus High Performance Computing Innovation Center is to allow American industry the opportunity to harness the power of supercomputing by leveraging the scientific and computational expertise at LLNL in order to gain a competitive advantage in the global economy.

    19. Appendix A - GPRA06 benefits estimates: MARKAL and NEMS model baseline cases

      SciTech Connect (OSTI)

      None, None

      2009-01-18

      NEMS is an integrated energy model of the U.S. energy system developed by the Energy Information Administration (EIA) for forecasting and policy analysis purposes.

    20. Computer Processor Allocator

      Energy Science and Technology Software Center (OSTI)

      2004-03-01

      The Compute Processor Allocator (CPA) provides an efficient and reliable mechanism for managing and allotting processors in a massively parallel (MP) computer. It maintains information in a database on the health, configuration, and allocation of each processor. This persistent information is factored into each allocation decision. The CPA runs in a distributed fashion to avoid a single point of failure.
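
      The allocation logic described here (consult persistent health and allocation state, then hand out free processors) can be pictured with a small sketch. The code below is an illustrative toy with invented names, not the CPA implementation.

```python
# Toy sketch of health-aware processor allocation: select only nodes marked
# healthy and not already assigned, and record the assignment. Illustrative
# only; not the Compute Processor Allocator itself.
import sqlite3

def allocate(conn: sqlite3.Connection, job_id: str, count: int) -> list:
    rows = conn.execute(
        "SELECT proc_id FROM processors WHERE healthy = 1 AND job_id IS NULL LIMIT ?",
        (count,),
    ).fetchall()
    if len(rows) < count:
        raise RuntimeError("not enough healthy, free processors")
    procs = [r[0] for r in rows]
    conn.executemany(
        "UPDATE processors SET job_id = ? WHERE proc_id = ?",
        [(job_id, p) for p in procs],
    )
    conn.commit()
    return procs

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE processors (proc_id INTEGER PRIMARY KEY, healthy INTEGER, job_id TEXT)")
conn.executemany("INSERT INTO processors VALUES (?, ?, NULL)", [(i, 1) for i in range(8)])
print(allocate(conn, "job-42", 4))
```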

    1. SC e-journals, Computer Science

      Office of Scientific and Technical Information (OSTI)

      & Mathematical Organization Theory Computational Complexity Computational Economics Computational Management ... Technology EURASIP Journal on Information Security ...

    2. Extreme Scale Computing to Secure the Nation

      SciTech Connect (OSTI)

      Brown, D L; McGraw, J R; Johnson, J R; Frincke, D

      2009-11-10

      Since the dawn of modern electronic computing in the mid-1940s, U.S. national security programs have been dominant users of every new generation of high-performance computer. Indeed, the first general-purpose electronic computer, ENIAC (the Electronic Numerical Integrator and Computer), was used to calculate the expected explosive yield of early thermonuclear weapons designs. Even the U.S. numerical weather prediction program, another early application for high-performance computing, was initially funded jointly by sponsors that included the U.S. Air Force and Navy, agencies interested in accurate weather predictions to support U.S. military operations. For the decades of the cold war, national security requirements continued to drive the development of high performance computing (HPC), including advancement of the computing hardware and development of sophisticated simulation codes to support weapons and military aircraft design, numerical weather prediction, as well as data-intensive applications such as cryptography and cybersecurity. U.S. national security concerns continue to drive the development of high-performance computers and software in the U.S. and, in fact, events following the end of the cold war have driven an increase in the growth rate of computer performance at the high end of the market. This mainly derives from our nation's observance of a moratorium on underground nuclear testing beginning in 1992, followed by our voluntary adherence to the Comprehensive Test Ban Treaty (CTBT) beginning in 1995. The CTBT prohibits further underground nuclear tests, which in the past had been a key component of the nation's science-based program for assuring the reliability, performance and safety of U.S. nuclear weapons. In response to this change, the U.S. Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship (SBSS) program under the Fiscal Year 1994 National Defense Authorization Act, which requires, 'in the absence of nuclear testing, a program to: (1) Support a focused, multifaceted program to increase the understanding of the enduring stockpile; (2) Predict, detect, and evaluate potential problems of the aging of the stockpile; (3) Refurbish and re-manufacture weapons and components, as required; and (4) Maintain the science and engineering institutions needed to support the nation's nuclear deterrent, now and in the future'. This program continues to fulfill its national security mission by adding significant new capabilities for producing scientific results through large-scale computational simulation coupled with careful experimentation, including sub-critical nuclear experiments permitted under the CTBT. To develop the computational science and the computational horsepower needed to support its mission, SBSS initiated the Accelerated Strategic Computing Initiative, later renamed the Advanced Simulation & Computing (ASC) program (sidebar: 'History of ASC Computing Program Computing Capability'). The modern 3D computational simulation capability of the ASC program supports the assessment and certification of the current nuclear stockpile through calibration with past underground test (UGT) data. While an impressive accomplishment, continued evolution of national security mission requirements will demand computing resources at a significantly greater scale than we have today.
In particular, continued observance and potential Senate confirmation of the Comprehensive Test Ban Treaty (CTBT), together with the U.S. administration's promise of a significant reduction in the size of the stockpile and the inexorable aging and consequent refurbishment of the stockpile, all demand increasing refinement of our computational simulation capabilities. Assessment of the present and future stockpile with increased confidence in safety and reliability, without reliance upon calibration with past or future test data, is a long-term goal of the ASC program. This will be accomplished through significant increases in the scientific bases that underlie the computational tools. Computer codes must be de

    3. Electromagnetic analysis of forces and torques on the baseline and enhanced ITER shield modules due to plasma disruption.

      SciTech Connect (OSTI)

      Kotulski, Joseph Daniel; Coats, Rebecca Sue; Pasik, Michael Francis; Ulrickson, Michael Andrew

      2009-08-01

      An electromagnetic analysis is performed on the ITER shield modules under different plasma-disruption scenarios using the OPERA-3d software. The models considered include the baseline design as provided by the International Organization and an enhanced design that includes the more realistic geometrical features of a shield module. The modeling procedure is explained, electromagnetic torques are presented, and results of the modeling are discussed.

    4. Work Domain Analysis of a Predecessor Sodium-cooled Reactor as Baseline for AdvSMR Operational Concepts

      SciTech Connect (OSTI)

      Ronald Farris; David Gertman; Jacques Hugo

      2014-03-01

      This report presents the results of the Work Domain Analysis for the Experimental Breeder Reactor II (EBR-II). This is part of the phase of the research designed to incorporate Cognitive Work Analysis in the development of a framework for the formalization of an Operational Concept (OpsCon) for Advanced Small Modular Reactors (AdvSMRs). For a new AdvSMR design, information obtained through Cognitive Work Analysis, combined with human performance criteria, can and should be used during the operational phase of a plant to assess the crew performance aspects associated with identified AdvSMR operational concepts. The main objective of this phase was to develop an analytical and descriptive framework that will help systems and human factors engineers to understand the design and operational requirements of the emerging generation of small, advanced, multi-modular reactors. Using EBR-II as a predecessor to emerging sodium-cooled reactor designs required the application of a method suitable to the structured and systematic analysis of the plant to assist in identifying key features of the work associated with it and to clarify the operational and other constraints. The analysis included the identification and description of operating scenarios that were considered characteristic of this type of nuclear power plant. This is an invaluable aspect of Operational Concept development since it typically reveals aspects of future plant configurations that will have an impact on operations. These include, for example, the effect of core design, different coolants, reactor-to-power conversion unit ratios, modular plant layout, modular versus central control rooms, plant siting, and many more. Multi-modular plants in particular are expected to have a significant impact on overall OpsCon in general, and human performance in particular. To support unconventional modes of operation, the modern control room of a multi-module plant would typically require advanced HSIs that would provide sophisticated operational information visualization, coupled with adaptive automation schemes and operator support systems to reduce complexity. These all have to be mapped at some point to human performance requirements. The EBR-II results will be used as a baseline that will be extrapolated in the extended Cognitive Work Analysis phase to the analysis of a selected advanced sodium-cooled SMR design as a way to establish non-conventional operational concepts. The Work Domain Analysis results achieved during this phase have not only established an organizing and analytical framework for describing existing sociotechnical systems, but have also indicated that the method is particularly suited to the analysis of prospective and immature designs. The results of the EBR-II Work Domain Analysis have indicated that the methodology is scientifically sound and generalizable to any operating environment.

    5. Renewable Diesel from Algal Lipids: An Integrated Baseline for Cost, Emissions, and Resource Potential from a Harmonized Model

      SciTech Connect (OSTI)

      Davis, R.; Fishman, D.; Frank, E. D.; Wigmosta, M. S.; Aden, A.; Coleman, A. M.; Pienkos, P. T.; Skaggs, R. J.; Venteris, E. R.; Wang, M. Q.

      2012-06-01

      The U.S. Department of Energy's Biomass Program has begun an initiative to obtain consistent quantitative metrics for algal biofuel production to establish an 'integrated baseline' by harmonizing and combining the Program's national resource assessment (RA), techno-economic analysis (TEA), and life-cycle analysis (LCA) models. The baseline attempts to represent a plausible near-term production scenario with freshwater microalgae growth, extraction of lipids, and conversion via hydroprocessing to produce a renewable diesel (RD) blendstock. Differences in the prior TEA and LCA models were reconciled (harmonized) and the RA model was used to prioritize and select the most favorable consortium of sites that supports production of 5 billion gallons per year of RD. Aligning the TEA and LCA models produced slightly higher costs and emissions compared to the pre-harmonized results. However, after then applying the productivities predicted by the RA model (13 g/m2/d on annual average vs. 25 g/m2/d in the original models), the integrated baseline resulted in markedly higher costs and emissions. The relationship between performance (cost and emissions) and either productivity or lipid fraction was found to be non-linear, and important implications on the TEA and LCA results were observed after introducing seasonal variability from the RA model. Increasing productivity and lipid fraction alone was insufficient to achieve cost and emission targets; however, combined with lower energy, less expensive alternative technology scenarios, emissions and costs were substantially reduced.

    6. Computational Tools to Assess Turbine Biological Performance

      SciTech Connect (OSTI)

      Richmond, Marshall C.; Serkowski, John A.; Rakowski, Cynthia L.; Strickler, Brad; Weisbeck, Molly; Dotson, Curtis L.

      2014-07-24

      Public Utility District No. 2 of Grant County (GCPUD) operates the Priest Rapids Dam (PRD), a hydroelectric facility on the Columbia River in Washington State. The dam contains 10 Kaplan-type turbine units that are now more than 50 years old. Plans are underway to refit these aging turbines with new runners. The Columbia River at PRD is a migratory pathway for several species of juvenile and adult salmonids, so passage of fish through the dam is a major consideration when upgrading the turbines. In this paper, a method for turbine biological performance assessment (BioPA) is demonstrated. Using this method, a suite of biological performance indicators is computed based on simulated data from a CFD model of a proposed turbine design. Each performance indicator is a measure of the probability of exposure to a certain dose of an injury mechanism. Using known relationships between the dose of an injury mechanism and frequency of injury (dose–response) from laboratory or field studies, the likelihood of fish injury for a turbine design can be computed from the performance indicator. By comparing the values of the indicators from proposed designs, the engineer can identify the more-promising alternatives. We present an application of the BioPA method for baseline risk assessment calculations for the existing Kaplan turbines at PRD that will be used as the minimum biological performance that a proposed new design must achieve.
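
      In outline, indicators of this kind combine an exposure distribution from the CFD results with a laboratory dose-response relationship. The sketch below shows only that generic combination, with made-up numbers; it is not the BioPA code or Priest Rapids data.

```python
# Generic sketch of a biological performance indicator: expected fish injury
# obtained by weighting simulated exposure probabilities for each dose bin of
# an injury mechanism by a laboratory dose-response curve.
# All numbers are illustrative placeholders, not BioPA or PRD data.

def injury_probability(exposure_prob_by_dose: dict, dose_response: dict) -> float:
    """Expected injury frequency = sum over dose bins of
    P(exposure to dose) * P(injury | dose)."""
    return sum(p_exposure * dose_response[dose]
               for dose, p_exposure in exposure_prob_by_dose.items())

# Example: shear-stress dose bins (arbitrary units) from simulated particle passes
exposure = {100.0: 0.70, 300.0: 0.25, 600.0: 0.05}   # fraction of particles per bin
response = {100.0: 0.00, 300.0: 0.02, 600.0: 0.15}   # injury frequency per bin
print(f"estimated injury probability: {injury_probability(exposure, response):.3f}")
```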

    7. Computers as tools

      SciTech Connect (OSTI)

      Eriksson, I.V.

      1994-12-31

      The following message was recently posted on a bulletin board and clearly shows the relevance of the conference theme: "The computer and digital networks seem poised to change whole regions of human activity -- how we record knowledge, communicate, learn, work, understand ourselves and the world. What's the best framework for understanding this digitalization, or virtualization, of seemingly everything? ... Clearly, symbolic tools like the alphabet, book, and mechanical clock have changed some of our most fundamental notions -- self, identity, mind, nature, time, space. Can we say what the computer, a purely symbolic "machine," is doing to our thinking in these areas? Or is it too early to say, given how much more powerful and less expensive the technology seems destined to become in the next few decades?" (Verity, 1994) Computers certainly affect our lives and way of thinking, but what have computers to do with ethics? A narrow approach would be that on the one hand people can and do abuse computer systems and on the other hand people can be abused by them. Well-known examples of the former are computer crimes such as the theft of money, services and information. The latter can be exemplified by violation of privacy, health hazards and computer monitoring. Broadening the concept from computers to information systems (ISs) and information technology (IT) gives a wider perspective. Computers are just the hardware part of information systems, which also include software, people and data. Information technology is the concept preferred today. It extends to communication, which is an essential part of information processing. Now let us repeat the question: What has IT to do with ethics? Verity mentioned changes in "how we record knowledge, communicate, learn, work, understand ourselves and the world".

    8. Appendix A: GPRA08 benefits estimates: NEMS and MARKAL Model Baseline Cases

      SciTech Connect (OSTI)

      None, None

      2009-01-18

      This document summarizes the results of the benefits analysis of EERE's programs, as described in the FY 2008 Budget Request. EERE estimates benefits for its overall portfolio and nine Research, Development, Demonstration, and Deployment (RD3) programs.

    9. Applications of Parallel Computers

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Applications of Parallel Computers, UCB CS267, Spring 2015, Tuesday & Thursday, 9:30-11:00 Pacific Time. Applications of Parallel Computers, CS267, is a graduate-level course offered at the University of California, Berkeley. The course is being taught by UC Berkeley professor and LBNL Faculty Scientist Jim Demmel. CS267 is broadcast live over the internet and all NERSC users are invited to monitor the broadcast course, but course credit is available only to students registered for the

    10. Results of the 2004 Knowledge and Opinions Surveys for the Baseline Knowledge Assessment of the U.S. Department of Energy Hydrogen Program

      SciTech Connect (OSTI)

      Schmoyer, Richard L; Truett, Lorena Faith; Cooper, Christy

      2006-04-01

      The U.S. Department of Energy (DOE) Hydrogen Program focuses on overcoming critical barriers to the widespread use of hydrogen fuel cell technology. The transition to a new, hydrogen-based energy economy requires an educated human infrastructure. With this in mind, the DOE Hydrogen Program conducted statistical surveys to measure and establish baselines for understanding and awareness about hydrogen, fuel cells, and a hydrogen economy. The baseline data will serve as a reference in designing an education program, and it will be used in comparisons with future survey results (2008 and 2011) to measure changes in understanding and awareness. Scientific sampling was used to survey four populations: (1) the general public, ages 18 and over; (2) students, ages 12-17; (3) state and local government officials; and (4) potential large-scale hydrogen users. It was decided that the survey design should include about 1,000 individuals in each of the general public and student categories, about 250 state and local officials, and almost 100 large-scale end users. The survey questions were designed to accomplish specific objectives. Technical questions measured technical understanding and awareness of hydrogen technology. Opinion questions measured attitudes about safety, cost, the environment, and convenience, as well as the likelihood of future applications of hydrogen technology. For most of the questions, "I don't know" or "I have no opinion" were acceptable answers. Questions about information sources assessed how energy technology information is received. The General Public and Student Survey samples were selected by random digit dialing. Potential large-scale end users were selected by random sampling. The State and Local Government Survey was of the entire targeted population of government officials (not a random sample). All four surveys were administered by computer-assisted telephone interviewing (CATI). For each population, the length of the survey was less than 15 minutes. Design of an education program is beyond the scope of the report, and comparisons of the baseline data with future results will not be made until the survey is fielded again. Nevertheless, a few observations about the data are salient: For every population group, average scores on the technical knowledge questions were lower for the fuel cell questions than for the other technical questions. State and local officials expressed more confidence in hydrogen safety than large-scale end users, and they were much more confident than either the general public or students. State and local officials also scored much higher on the technical questions. Technical understanding appears to influence opinions about safety. For the General Public, Student, and Large-Scale End User Surveys, respondents with above-average scores on the eleven technical questions were more likely to have an opinion about hydrogen technology safety, and for those respondents who expressed an opinion, their opinion was more likely to be positive. These differences were statistically significant. Using criteria of "Sometimes" or "Frequently" to describe usage, respondents rated media sources for obtaining energy information. The general public and students responded that television is the primary media source of energy information. State and local officials and large-scale end users indicated that their primary media sources are newspapers, the Internet, and science and technology journals. 
In order of importance, the general public values safety, cost, environment, and convenience. The Large-Scale End User Survey suggests that there is presently little penetration of hydrogen technology; nor is there much planning for it.
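
      A significance claim like the one above is typically checked with a contingency-table test comparing opinion rates between the higher- and lower-scoring groups. The following is a generic sketch with invented counts, not the survey's actual analysis.

```python
# Generic sketch: chi-square test of whether respondents with above-average
# technical scores were more likely to hold an opinion on hydrogen safety.
# The counts below are invented for illustration, not survey data.
from scipy.stats import chi2_contingency

#                 has opinion   no opinion
table = [[320, 180],   # above-average technical score
         [250, 250]]   # below-average technical score

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")
```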

    11. Computationally Optimized Homogenization Heat Treatment of Metal Alloys -

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Energy Innovation Portal » Advanced Materials » Computationally Optimized Homogenization Heat Treatment of Metal Alloys, National Energy Technology Laboratory. Publications: Computationally Optimized Homogenization Heat Treatment of Metal Alloys (291 KB). Technology Marketing Summary: A computational approach has been developed to improve the homogenization heat treatment of solid

    12. Representation of Limited Rights Data and Restricted Computer Software |

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      Department of Energy Representation of Limited Rights Data and Restricted Computer Software. More Documents & Publications: CLB-1003.PDF; Intellectual Property Provisions (CSB-1003) Cooperative Agreement Research, Development, or Demonstration Domestic Small Businesses; CDLB-1003.PDF

    13. Cloud computing security.

      SciTech Connect (OSTI)

      Shin, Dongwan; Claycomb, William R.; Urias, Vincent E.

      2010-10-01

      Cloud computing is a paradigm rapidly being embraced by government and industry as a solution for cost-savings, scalability, and collaboration. While a multitude of applications and services are available commercially for cloud-based solutions, research in this area has yet to address the full spectrum of potential challenges facing cloud computing. This tutorial aims to provide researchers with a fundamental understanding of cloud computing, with the goals of identifying a broad range of potential research topics, and inspiring a new surge in research to address current issues. We will also discuss real implementations of research-oriented cloud computing systems for both academia and government, including configuration options, hardware issues, challenges, and solutions.

    14. Theory, Modeling and Computation

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      modeling and simulation will be enhanced not only by the wealth of data available from MaRIE but by the increased computational capacity made possible by the advent of extreme...

    15. Argonne Leadership Computing Facility

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Argonne Leadership Computing Facility Annual Report 2012. Contents: Director's Message; About ALCF; Introducing Mira

    16. Ames Lab 101: Improving Materials with Advanced Computing

      ScienceCinema (OSTI)

      Johnson, Duane

      2014-06-04

      Ames Laboratory's Chief Research Officer Duane Johnson talks about using advanced computing to develop new materials and predict what types of properties those materials will have.

    17. Computer System, Cluster and Networking Summer Institute (CSCNSI...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      NSEC » ISTI » Summer School Programs » CSCNSI. The Computer System, Cluster and Networking Summer Institute emphasizes practical skills development. Contact: Leader Stephan Eidenbenz...

    18. Applied Computer Science

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      ADTSC » CCS » CCS-7 Applied Computer Science: innovative co-design of applications, algorithms, and architectures in order to enable scientific simulations at extreme scale. Leadership: Group Leader Linn Collins; Deputy Group Leader (Acting) Bryan Lally. Climate modeling visualization: results from a climate simulation computed using the Model for Prediction Across Scales (MPAS) code. This visualization shows the temperature of ocean currents using a green and blue color scale. These

    19. Computational Physics and Methods

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computational Physics and Methods: performing innovative simulations of physics phenomena on tomorrow's scientific computing platforms. Image captions: growth and emissivity of a young galaxy hosting a supermassive black hole, as calculated in the cosmological code ENZO and post-processed with the radiative transfer code AURORA; Rayleigh-Taylor turbulence imaging, the largest turbulence simulations to date. Topics: advanced multi-scale modeling; turbulence datasets; density iso-surfaces

    20. Compute Reservation Request Form

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Compute Reservation Request Form. Users can request a scheduled reservation of machine resources if their jobs have special needs that cannot be accommodated through the regular batch system. A reservation brings some portion of the machine to a specific user or project for an agreed upon duration. Typically this is used for interactive debugging at scale or real time processing linked to some experiment or event. It is not intended to be used to guarantee fast

    1. New TRACC Cluster Computer

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      New TRACC Cluster Computer. With the addition of a new cluster called Zephyr, made operational in September 2012, TRACC now offers two clusters to choose from: Zephyr and our original cluster, now named Phoenix. Zephyr was acquired from Atipa Technologies; it is a 92-node system, with each node having two 16-core, 2.3 GHz AMD processors and 32 GB of memory. See also Computing Resources.

    2. Advanced Simulation and Computing

      National Nuclear Security Administration (NNSA)

      NA-ASC-117R-09-Vol.1-Rev.0, Advanced Simulation and Computing Program Plan FY09, October 2008. ASC Focal Point: Robert Meisner, Director, DOE/NNSA NA-121.2, 202-586-0908. Program Plan Focal Point for NA-121.2: Njema Frazier, DOE/NNSA NA-121.2, 202-586-5789. A Publication of the Office of Advanced Simulation & Computing, NNSA Defense Programs. Contents: Executive Summary; I. Introduction

    3. Computing | Department of Energy

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      Computing. Fun fact: Most systems require air conditioning or chilled water to cool super powerful supercomputers, but the Olympus supercomputer at Pacific Northwest National Laboratory is cooled by the location's 65 degree groundwater. Traditional cooling systems could cost up to $61,000 in electricity each year, but this more efficient setup uses 70 percent less energy. | Photo courtesy of PNNL.

    4. computational-fluid-dynamics-student-thesis

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Fluid Dynamics Student Thesis Abstract DEVELOPMENT OF A THREE-DIMENSIONAL SCOURING METHODOLOGY AND ITS IMPLEMENTATION IN A COMMERCIAL CFD CODE FOR OPEN CHANNEL FLOW OVER A FLOODED BRIDGE DECK The Computational Fluid Dynamics staff at TRACC is supporting three students from Northern Illinois University who are working for a Masters degree. The CFD staff is directing the thesis research and working with them on three projects: (1) a three-dimensional scour computation methodology for pressure flow

    5. Graham Fletcher | Argonne Leadership Computing Facility

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Graham Fletcher Principal Project Specialist in Computational Science Graham Fletcher Argonne National Laboratory 9700 South Cass Avenue Building 240 - Rm. 1123 Argonne, IL 60439 630-252-0755 fletcher@alcf.anl.gov Graham Fletcher is a Principal Project Specialist in Computational Science at the ALCF with a background in quantum chemistry and supercomputing. His research interests focus on the development of highly scalable methods and algorithms for the accurate and reliable prediction of

    6. Extreme Scale Computing, Co-design

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Extreme Scale Computing, Co-design Informing system design, ensuring productive and efficient code Project Description To address the increasingly complex problems of the modern world, scientists at Los Alamos are pushing the scale of computing to the extreme, forming partnerships with other national laboratories and industry to develop supercomputers that can achieve "exaflop" speeds-that is, a quintillion (a million trillion) calculations per second. To put such speed in perspective,

    7. LANL computer model boosts engine efficiency

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      LANL computer model boosts engine efficiency. The KIVA model has been instrumental in helping researchers and manufacturers understand combustion processes, accelerate engine development and improve engine design and efficiency. September 25, 2012. KIVA simulation of an experimental engine with DOHC quasi-symmetric pent-roof combustion chamber and 4 valves.

    8. Introducing Aurora | Argonne Leadership Computing Facility

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Aurora Announcement Press Conference: DOE announces next-gen supercomputer Aurora to be built at Argonne. Introducing Aurora. Author: ALCF staff, April 9, 2015. Today, U.S. Department of Energy Under Secretary for Science and Energy Lynn Orr announced two new High Performance Computing (HPC) awards that continue to advance U.S. leadership in developing exascale computing. The announcement was made alongside leaders from Argonne

    9. DOE ASSESSMENT SEAB Recommendations Related to High Performance Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      DOE ASSESSMENT SEAB Recommendations Related to High Performance Computing 1. Introduction The Department of Energy (DOE) is planning to develop and deliver capable exascale computing systems by 2023-24. These systems are expected to have a one-hundred to one-thousand-fold increase in sustained performance over today's computing capabilities, capabilities critical to enabling the next-generation computing for national security, science, engineering, and large- scale data analytics needed to

    10. Can Cloud Computing Address the Scientific Computing Requirements for DOE

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Can Cloud Computing Address the Scientific Computing Requirements for DOE Researchers? Well, Yes, No and Maybe. January 30, 2012. Jon Bashor, Jbashor@lbl.gov, +1 510-486-5849. Magellan at NERSC. After a two-year study of the feasibility of cloud computing systems for meeting the ever-increasing computational needs of scientists,

    11. Computing and Computational Sciences Directorate - Joint Institute for

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computational Sciences Joint Institute for Computational Sciences To help realize the full potential of new-generation computers for advancing scientific discovery, the University of Tennessee (UT) and Oak Ridge National Laboratory (ORNL) have created the Joint Institute for Computational Sciences (JICS). JICS combines the experience and expertise in theoretical and computational science and engineering, computer science, and mathematics in these two institutions and focuses these skills on

    12. Computing and Computational Sciences Directorate - National Center for

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computational Sciences Home National Center for Computational Sciences The National Center for Computational Sciences (NCCS), formed in 1992, is home to two of Oak Ridge National Laboratory's (ORNL's) high-performance computing projects-the Oak Ridge Leadership Computing Facility (OLCF) and the National Climate-Computing Research Center (NCRC). The OLCF (www.olcf.ornl.gov) was established at ORNL in 2004 with the mission of standing up a supercomputer 100 times more powerful than the leading

    13. in High Performance Computing Computer System, Cluster, and Networking...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      iSSH v. Auditd: Intrusion Detection in High Performance Computing Computer System, Cluster, and Networking Summer Institute David Karns, New Mexico State University Katy Protin,...

    14. Computational design and analysis of flatback airfoil wind tunnel experiment.

      SciTech Connect (OSTI)

      Mayda, Edward A.; van Dam, C.P.; Chao, David D.; Berg, Dale E.

      2008-03-01

      A computational fluid dynamics study of thick wind turbine section shapes in the test section of the UC Davis wind tunnel at a chord Reynolds number of one million is presented. The goals of this study are to validate standard wind tunnel wall corrections for high solid blockage conditions and to reaffirm the favorable effect of a blunt trailing edge or flatback on the performance characteristics of a representative thick airfoil shape prior to building the wind tunnel models and conducting the experiment. The numerical simulations prove the standard wind tunnel corrections to be largely valid for the proposed test of 40% maximum thickness to chord ratio airfoils at a solid blockage ratio of 10%. Comparison of the computed lift characteristics of a sharp trailing edge baseline airfoil and derived flatback airfoils reaffirms the earlier observed trend of reduced sensitivity to surface contamination with increasing trailing edge thickness.
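
      For orientation, the "standard wind tunnel wall corrections" referred to here are typically built from blockage factors of the classical textbook form (e.g., Barlow, Rae & Pope). The expression below is that generic form only, stated as a reminder rather than as the specific corrections applied in this study.

```latex
% Generic low-speed blockage correction (textbook form, for orientation only):
% the effective freestream velocity is increased by a small total blockage
% factor, whose solid-blockage part scales with model volume V over the
% test-section cross-sectional area C to the 3/2 power.
\[
  U_{\mathrm{corr}} = U_{\infty}\,\bigl(1 + \varepsilon_{t}\bigr), \qquad
  \varepsilon_{t} = \varepsilon_{sb} + \varepsilon_{wb}, \qquad
  \varepsilon_{sb} \;\propto\; \frac{V_{\mathrm{model}}}{C^{3/2}}
\]
```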

    15. computational-structural-mechanics-training

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Table of contents (training course, date, location): Training Course: HyperMesh and HyperView, April 12-14, 2011, Argonne TRACC, Argonne, IL; Introductory Course: Developing Compute-efficient, Quality Models with LS-PrePost® 3 on the TRACC Cluster, October 21-22, 2010, Argonne TRACC, West Chicago, IL; Modeling and Simulation with LS-DYNA®: Insights into Modeling with a Goal of Providing Credible Predictive Simulations, February 11-12, 2010, Argonne TRACC, West Chicago, IL; Introductory Course: Using LS-OPT® on the TRACC

    16. Extensible Computational Chemistry Environment

      Energy Science and Technology Software Center (OSTI)

      2012-08-09

      ECCE provides a sophisticated graphical user interface, scientific visualization tools, and the underlying data management framework enabling scientists to efficiently set up calculations and store, retrieve, and analyze the rapidly growing volumes of data produced by computational chemistry studies. ECCE was conceived as part of the Environmental Molecular Sciences Laboratory construction to solve the problem of researchers being able to effectively utilize complex computational chemistry codes and massively parallel high performance compute resources. Bringing the power of these codes and resources to the desktops of researchers, and thus enabling world class research without users needing a detailed understanding of the inner workings of either the theoretical codes or the supercomputers needed to run them, was a grand challenge problem in the original version of the EMSL. ECCE allows collaboration among researchers using a web-based data repository where the inputs and results for all calculations done within ECCE are organized. ECCE is a first-of-a-kind end-to-end problem solving environment for all phases of computational chemistry research: setting up calculations with sophisticated GUI and direct manipulation visualization tools, submitting and monitoring calculations on remote high performance supercomputers without having to be familiar with the details of using these compute resources, and performing results visualization and analysis including creating publication quality images. ECCE is a suite of tightly integrated applications that are employed as the user moves through the modeling process.

    17. Annual Report on Environmental Monitoring Activities for FY 1995 (Baseline Year) at Waste Area Grouping 6 at Oak Ridge National Laboratory, Oak Ridge, Tennessee

      SciTech Connect (OSTI)

      1996-06-01

      This report describes baseline contaminant release conditions for Waste Area Grouping (WAG) 6 at Oak Ridge National Laboratory (ORNL). The sampling approach and data analysis methods used to establish baseline conditions were presented in "Environmental Monitoring Plan for Waste Area Grouping 6 at Oak Ridge National Laboratory, Oak Ridge, Tennessee" (EMP). As outlined in the EMP, the purpose of the baseline monitoring year at WAG 6 was to determine the annual contaminant releases from the site during fiscal year 1995 (FY95) against which any potential changes in releases over time could be compared. The baseline year data set provides a comprehensive understanding of release conditions from all major waste units in the WAG through each major contaminant transport pathway. Due to a mandate to reduce all monitoring work, WAG 6 monitoring was scaled back and reporting efforts on the baseline year results are being minimized. This report presents the quantified baseline year contaminant flux conditions for the site and briefly summarizes other findings. All baseline data cited in this report will reside in the Oak Ridge Environmental Information System (OREIS) database, and will be available for use in future years as the need arises to identify potential release changes.

    18. Information Science, Computing, Applied Math

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      National security depends on science and technology. The United States relies on Los Alamos National Laboratory for the best of both. No place on Earth pursues a broader array of world-class scientific endeavors. Computer, Computational, and Statistical Sciences (CCS)» High Performance Computing (HPC)» Extreme Scale Computing, Co-design»

    19. Description of Model Data for SNL100-00: The Sandia 100-meter All-glass Baseline Wind Turbine

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      SNL100-00: The Sandia 100-meter All-glass Baseline Wind Turbine Blade. D. Todd Griffith, Brian R. Resor, Sandia National Laboratories Wind and Water Power Technologies Department. Introduction: This document provides a brief description of model files that are available for the SNL100-00 blade [1]. For each file, codes used to create/read the model files are detailed (e.g., code version and date, description, etc.). A summary of the blade model data is also provided from the design report [1]. A Design

    20. Super recycled water: quenching computers

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Super recycled water: quenching computers Super recycled water: quenching computers New facility and methods support conserving water and creating recycled products. Using reverse...

    1. Computer simulation | Open Energy Information

      Open Energy Info (EERE)

      Computer simulation Jump to: navigation, search OpenEI Reference LibraryAdd to library Web Site: Computer simulation Author wikipedia Published wikipedia, 2013 DOI Not Provided...

    2. NREL: Computational Science Home Page

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      high-performance computing, computational science, applied mathematics, scientific data management, visualization, and informatics. NREL is home to the largest high performance...

    3. SCC: The Strategic Computing Complex

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      computer room, which is an open room about three-fourths the size of a football field. The Strategic Computing Complex (SCC) at the Los Alamos National Laboratory...

    4. Human-computer interface

      DOE Patents [OSTI]

      Anderson, Thomas G.

      2004-12-21

      The present invention provides a method of human-computer interfacing. Force feedback allows intuitive navigation and control near a boundary between regions in a computer-represented space. For example, the method allows a user to interact with a virtual craft, then push through the windshield of the craft to interact with the virtual world surrounding the craft. As another example, the method allows a user to feel transitions between different control domains of a computer representation of a space. The method can provide for force feedback that increases as a user's locus of interaction moves near a boundary, then perceptibly changes (e.g., abruptly drops or changes direction) when the boundary is traversed.
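
      The sketch below illustrates the general idea of the boundary force profile the abstract describes (feedback grows as the interaction point approaches a boundary, then drops perceptibly once the boundary is crossed); the ramp shape and constants are assumptions for illustration, not the patented method.

      ```python
      # Minimal sketch, assuming a linear force ramp near the boundary.
      def boundary_force(distance_to_boundary, ramp_width=0.05, max_force=1.0):
          """Return a scalar feedback force for a signed distance to a boundary.
          distance_to_boundary > 0 : still inside the current region (e.g., the craft)
          distance_to_boundary <= 0: the boundary has been traversed
          """
          if distance_to_boundary <= 0.0:
              return 0.0                       # perceptible drop after traversal
          if distance_to_boundary >= ramp_width:
              return 0.0                       # far from the boundary: no cue
          # Force ramps up as the user's locus of interaction nears the boundary.
          return max_force * (1.0 - distance_to_boundary / ramp_width)

      for d in (0.10, 0.04, 0.01, -0.01):
          print(d, boundary_force(d))
      ```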

    5. On the Use of an ER-213 Detonator to Establish a Baseline for the ER-486

      SciTech Connect (OSTI)

      Thomas, Keith A.; Liechty, Gary H.; Jaramillo, Dennis C.; Munger, Alan C.; McHugh, Douglas C.; Kennedy, James E.

      2014-08-19

      This report documents a series of tests using a TSD-115 fireset coupled with an ER-213, a gold exploding bridgewire (EBW) detonator. These tests were designed to fire this EBW with a smaller fireset to obtain current and voltage data as well as timing information at voltage levels below, above, and throughout the threshold firing region. This study could then create a database for comparison to our current ER-486 EBW development, which is designed to be a lower voltage (<500V) device.

    6. Modeling of Electric Water Heaters for Demand Response: A Baseline PDE Model

      SciTech Connect (OSTI)

      Xu, Zhijie; Diao, Ruisheng; Lu, Shuai; Lian, Jianming; Zhang, Yu

      2014-09-05

      Demand response (DR) control can effectively relieve balancing and frequency regulation burdens on conventional generators, facilitate integrating more renewable energy, and reduce the generation and transmission investments needed to meet peak demands. Electric water heaters (EWHs) have great potential for implementing DR control strategies because: (a) EWH power consumption has a high correlation with daily load patterns; (b) they constitute a significant percentage of domestic electrical load; (c) the heating element is a resistor, without reactive power consumption; and (d) they can be used as energy storage devices when needed. Accurately modeling the dynamic behavior of EWHs is essential for designing DR controls. Various water heater models, simplified to different extents, have been published in the literature; however, few of them were validated against field measurements, which may result in inaccuracy when implementing DR controls. In this paper, a partial differential equation physics-based model, developed to capture detailed temperature profiles at different tank locations, is validated against field test data for more than 10 days. The developed model shows very good performance in capturing water thermal dynamics for benchmark testing purposes.
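
      For readers unfamiliar with tank-stratification models, the sketch below shows a minimal one-dimensional finite-difference water heater model with inter-layer conduction, standby losses, and a thermostat-controlled element; the geometry, coefficients, and single-node element are illustrative assumptions, not the validated PDE model from the paper.

      ```python
      # Minimal sketch of a 1-D layered-tank model (assumed coefficients).
      import numpy as np

      def simulate_tank(hours=2.0, n_layers=20, dt=1.0):
          T = np.full(n_layers, 50.0)          # layer temperatures, deg C
          T_amb = 20.0                          # ambient temperature, deg C
          k = 1.5e-4                            # inter-layer mixing/conduction coefficient, 1/s
          ua = 2.0e-5                           # standby loss coefficient, 1/s
          q_element = 0.02                      # heating rate in the element layer, deg C/s
          element_layer = 3                     # element near the bottom of the tank
          setpoint, deadband = 55.0, 2.0
          heating = False
          for _ in range(int(hours * 3600 / dt)):
              # Simple thermostat acting on the element layer.
              if T[element_layer] < setpoint - deadband:
                  heating = True
              elif T[element_layer] > setpoint:
                  heating = False
              lap = np.zeros_like(T)
              lap[1:-1] = T[2:] - 2 * T[1:-1] + T[:-2]   # discrete second difference
              dT = k * lap - ua * (T - T_amb)            # conduction plus standby loss
              if heating:
                  dT[element_layer] += q_element
              T += dt * dT
          return T

      print(np.round(simulate_tank(), 1))   # temperature profile along the tank height
      ```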

    7. GPU COMPUTING FOR PARTICLE TRACKING

      SciTech Connect (OSTI)

      Nishimura, Hiroshi; Song, Kai; Muriki, Krishna; Sun, Changchun; James, Susan; Qin, Yong

      2011-03-25

      This is a feasibility study of using a modern Graphics Processing Unit (GPU) to parallelize the accelerator particle tracking code. To demonstrate the massive parallelization features provided by GPU computing, a simplified TracyGPU program is developed for dynamic aperture calculation. Performance, issues, and challenges from introducing the GPU are also discussed. General purpose Computation on Graphics Processing Units (GPGPU) brings massive parallel computing capabilities to numerical calculation. However, the unique architecture of the GPU requires a comprehensive understanding of the hardware and programming model in order to optimize existing applications well. In the field of accelerator physics, the dynamic aperture calculation of a storage ring, which is often the most time consuming part of accelerator modeling and simulation, can benefit from the GPU due to its embarrassingly parallel nature, which fits well with the GPU programming model. In this paper, we use the Tesla C2050 GPU, which consists of 14 multiprocessors (MPs) with 32 cores on each MP, for a total of 448 cores, to host thousands of threads dynamically. A thread is a logical execution unit of the program on the GPU. In the GPU programming model, threads are grouped into a collection of blocks. Within each block, multiple threads share the same code and up to 48 KB of shared memory. Multiple thread blocks form a grid, which is executed as a GPU kernel. A simplified code that is a subset of Tracy++ [2] is developed to demonstrate the possibility of using the GPU to speed up the dynamic aperture calculation by having each thread track a particle.
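
      The block/thread-to-particle mapping described above can be written out explicitly; the sketch below does so in plain Python, with the grid loops standing in for GPU threads and a placeholder linear one-turn map standing in for the Tracy++ element-by-element tracking.

      ```python
      # Sketch of one-thread-per-particle tracking (assumed sizes and map).
      import math

      THREADS_PER_BLOCK = 32        # illustrative block size
      NUM_BLOCKS = 14               # illustrative grid size (cf. 14 multiprocessors)
      TURNS = 1000
      APERTURE = 0.1                # loss threshold on |x|, arbitrary units

      C, S = math.cos(0.05), math.sin(0.05)   # placeholder linear one-turn map

      def track_one_particle(x, xp):
          # Apply the one-turn map repeatedly and test the aperture each turn;
          # a real code would apply the ring's element maps here instead.
          for _ in range(TURNS):
              x, xp = C * x + S * xp, -S * x + C * xp
              if abs(x) > APERTURE:
                  return False      # particle lost: outside the dynamic aperture
          return True

      particles = [(2.5e-4 * i, 0.0) for i in range(NUM_BLOCKS * THREADS_PER_BLOCK)]
      survived = []
      for block in range(NUM_BLOCKS):              # on the GPU these two loops are the grid
          for thread in range(THREADS_PER_BLOCK):
              gid = block * THREADS_PER_BLOCK + thread   # global thread id = particle index
              survived.append(track_one_particle(*particles[gid]))
      print(sum(survived), "of", len(particles), "particles stay within the aperture")
      ```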

    8. Baseline risk assessment of ground water contamination at the Uranium Mill Tailings Sites near Rifle, Colorado

      SciTech Connect (OSTI)

      1995-05-01

      The ground water project evaluates the nature and extent of ground water contamination resulting from the uranium ore processing activities. This report is a site specific document that will be used to evaluate current and future impacts to the public and the environment from exposure to contaminated ground water. Currently, no one is using the ground water and therefore, no one is at risk. However, the land will probably be developed in the future and so the possibility of people using the ground water does exist. This report examines the future possibility of health hazards resulting from the ingestion of contaminated drinking water, skin contact, fish ingestion, or contact with surface waters and sediments.

    9. Low No{sub x}/SO{sub x} burner retrofit for utility cyclone boilers. Baseline test report: Issue A

      SciTech Connect (OSTI)

      Moore, K.; Martin, L.; Smith, J.

      1991-05-01

      The Low NO{sub x}/SO{sub x} (LNS) Burner Retrofit for Utility Cyclone Boilers program consists of the retrofit and subsequent demonstration of the technology at Southern Illinois Power Cooperative's (SIPC's) 33-MW unit 1 cyclone boiler located near Marion, Illinois. The LNS Burner employs a simple innovative combustion process burning high-sulfur Illinois coal to provide substantial SO{sub 2} and NO{sub x} control within the burner. A complete series of boiler performance and characterization tests, called the baseline tests, was conducted in October 1990 on unit 1 of SIPC's Marion Station. The primary objective of the baseline test was to collect data from the existing plant that could provide a comparison of performance after the LNS Burner retrofit. These data could confirm the LNS Burner's SO{sub x} and NO{sub x} emissions control and any effect on boiler operation. Further, these tests would provide the project with experience in the operating characteristics of the host unit, as well as engineering design information to minimize technical uncertainties in the application of the LNS Burner technology.

    10. Synchronizing compute node time bases in a parallel computer

      DOE Patents [OSTI]

      Chen, Dong; Faraj, Daniel A; Gooding, Thomas M; Heidelberger, Philip

      2014-12-30

      Synchronizing time bases in a parallel computer that includes compute nodes organized for data communications in a tree network, where one compute node is designated as a root, and, for each compute node: calculating data transmission latency from the root to the compute node; configuring a thread as a pulse waiter; initializing a wakeup unit; and performing a local barrier operation; upon each node completing the local barrier operation, entering, by all compute nodes, a global barrier operation; upon all nodes entering the global barrier operation, sending, to all the compute nodes, a pulse signal; and for each compute node upon receiving the pulse signal: waking, by the wakeup unit, the pulse waiter; setting a time base for the compute node equal to the data transmission latency between the root node and the compute node; and exiting the global barrier operation.
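
      A minimal single-process simulation of the claimed synchronization idea is sketched below: each node knows its root-to-node transmission latency and, on receiving the broadcast pulse, sets its time base to that latency so all counters agree on when the pulse was sent. The class and numbers are illustrative, not the hardware wakeup-unit implementation.

      ```python
      # Sketch, assuming latencies are already measured and the barrier has completed.
      import random

      class Node:
          def __init__(self, node_id, latency_from_root):
              self.node_id = node_id
              self.latency = latency_from_root   # measured root-to-node latency (microseconds)
              self.time_base = None

          def on_pulse(self):
              # On pulse arrival, start the local counter at the known latency so
              # every node's counter reads the same value at the same instant.
              self.time_base = self.latency

      # Compute nodes at different distances from the root of the tree network.
      nodes = [Node(i, latency_from_root=random.uniform(0.5, 5.0)) for i in range(8)]

      # Global barrier reached; the root sends the pulse, each node receives it
      # after its own latency and sets its time base accordingly.
      for n in nodes:
          n.on_pulse()
          print(f"node {n.node_id}: time base set to {n.time_base:.2f} us after pulse send")
      ```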

    11. Synchronizing compute node time bases in a parallel computer

      DOE Patents [OSTI]

      Chen, Dong; Faraj, Daniel A; Gooding, Thomas M; Heidelberger, Philip

      2015-01-27

      Synchronizing time bases in a parallel computer that includes compute nodes organized for data communications in a tree network, where one compute node is designated as a root, and, for each compute node: calculating data transmission latency from the root to the compute node; configuring a thread as a pulse waiter; initializing a wakeup unit; and performing a local barrier operation; upon each node completing the local barrier operation, entering, by all compute nodes, a global barrier operation; upon all nodes entering the global barrier operation, sending, to all the compute nodes, a pulse signal; and for each compute node upon receiving the pulse signal: waking, by the wakeup unit, the pulse waiter; setting a time base for the compute node equal to the data transmission latency between the root node and the compute node; and exiting the global barrier operation.

    12. TRANSIMS Interface Development

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      TRANSIMS Studio (Figure 1) has been developed by TRACC for the TRANSIMS community as part of the TRANSIMS Open Source project. It provides an integrated development environment (IDE) for TRANSIMS by combining a number of components that work seamlessly with each other. The visible part of the IDE is the graphical user interface (GUI) that allows

    13. Computer Security Risk Assessment

      Energy Science and Technology Software Center (OSTI)

      1992-02-11

      LAVA/CS (LAVA for Computer Security) is an application of the Los Alamos Vulnerability Assessment (LAVA) methodology specific to computer and information security. The software serves as a generic tool for identifying vulnerabilities in computer and information security safeguards systems. Although it does not perform a full risk assessment, the results from its analysis may provide valuable insights into security problems. LAVA/CS assumes that the system is exposed to both natural and environmental hazards and to deliberate malevolent actions by either insiders or outsiders. In the process of answering the LAVA/CS questionnaire, the user identifies missing safeguards in 34 areas ranging from password management to personnel security and internal audit practices. Specific safeguards protecting a generic set of assets (or targets) from a generic set of threats (or adversaries) are considered. There are four generic assets: the facility, the organization's environment; the hardware, all computer-related hardware; the software, the information in machine-readable form stored either on-line or on transportable media; and the documents and displays, the information in human-readable form stored as hard-copy materials (manuals, reports, listings in full-size or microform), film, and screen displays. Two generic threats are considered: natural and environmental hazards, including storms, fires, power abnormalities, and water and accidental maintenance damage; and on-site human threats, both intentional and accidental acts attributable to a perpetrator on the facility's premises.
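
      The generic asset/threat structure described above can be pictured as a small tally of missing safeguards per asset and threat pair, as in the sketch below; the safeguard areas and answers are placeholders, not the actual 34 LAVA/CS questionnaire areas or its scoring.

      ```python
      # Sketch of a missing-safeguard tally over assumed assets, threats, and findings.
      ASSETS = ["facility", "hardware", "software", "documents and displays"]
      THREATS = ["natural/environmental hazards", "on-site human threats"]

      # Each finding: (safeguard area, affected asset, relevant threat, safeguard present?)
      findings = [
          ("password management", "software", "on-site human threats", False),
          ("internal audit practices", "documents and displays", "on-site human threats", True),
          ("fire protection", "facility", "natural/environmental hazards", False),
          ("media storage controls", "software", "natural/environmental hazards", True),
      ]

      # Tally missing safeguards for every asset/threat pair.
      missing = {(a, t): [] for a in ASSETS for t in THREATS}
      for area, asset, threat, present in findings:
          if not present:
              missing[(asset, threat)].append(area)

      for (asset, threat), areas in missing.items():
          if areas:
              print(f"{asset} vs {threat}: missing {', '.join(areas)}")
      ```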

    14. Applications in Data-Intensive Computing

      SciTech Connect (OSTI)

      Shah, Anuj R.; Adkins, Joshua N.; Baxter, Douglas J.; Cannon, William R.; Chavarría-Miranda, Daniel; Choudhury, Sutanay; Gorton, Ian; Gracio, Deborah K.; Halter, Todd D.; Jaitly, Navdeep; Johnson, John R.; Kouzes, Richard T.; Macduff, Matt C.; Marquez, Andres; Monroe, Matthew E.; Oehmen, Christopher S.; Pike, William A.; Scherrer, Chad; Villa, Oreste; Webb-Robertson, Bobbie-Jo M.; Whitney, Paul D.; Zuljevic, Nino

      2010-04-01

      This book chapter, to be published in Advances in Computers, Volume 78, in 2010, describes applications of data intensive computing (DIC). This is an invited chapter resulting from a previous publication on DIC. This work summarizes efforts coming out of PNNL's Data Intensive Computing Initiative. Advances in technology have empowered individuals with the ability to generate digital content with mouse clicks and voice commands. Digital pictures, emails, text messages, home videos, audio, and webpages are common examples of digital content that are generated on a regular basis. Data intensive computing facilitates human understanding of complex problems. Data-intensive applications provide timely and meaningful analytical results in response to exponentially growing data complexity and associated analysis requirements through the development of new classes of software, algorithms, and hardware.

    15. Sandia National Laboratories: Advanced Simulation and Computing...

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      The Computational Systems & Software Environment program builds integrated,...

    16. RDD-100 model development for TWRS

      SciTech Connect (OSTI)

      Gneiting, B.C.

      1996-10-01

      The purpose of this document is to describe the work performed to develop an executable model of the TWRS technical baseline using the RDD-100 Dynamic Verification Facility (DVF). The benefit of developing a DVF model that simulates the conceptual TWRS baseline system is that it provides a verification of the system performance and the traceability needed between the system requirements and the proposed architectures that will satisfy the requirements and perform the identified functions. The initial modeling results showed some potential interface and scheduling conflicts between some of the TWRS components.

    17. DEVELOPMENT OF A REFRIGERANT DISTRIBUTION SECTION FOR ASHRAE STANDARD 152.

      SciTech Connect (OSTI)

      ANDREWS,J.W.

      2001-09-07

      In a recent draft report titled "Impacts of Refrigerant Line Length on System Efficiency in Residential Heating and Cooling Systems Using Refrigerant Distribution" (Andrews 2000), some baseline calculations were performed to estimate various impacts on system efficiency of long refrigerant distribution lines. Refrigerant distribution refers to "mini-splits" and other types of space heating and cooling equipment that utilize refrigerant lines, rather than ducts or pipes, to transport heat and cooling effect from the outdoor unit to the building spaces where this heat or cooling is used. Five factors affecting efficiency were studied in each of the space conditioning modes (heating and cooling), for a total of ten factors in all. Temperature changes and pressure drops in each of the two refrigerant lines accounted for four of the factors, with the remaining one being the elevation of the indoor unit relative to the outdoor unit. Of these factors, pressure drops in the suction line in cooling showed by far the largest effect. This report builds on these baseline calculations to develop a possible algorithm for a refrigerant distribution section of ASHRAE Standard 152. It is based on the approximate treatment of the previous report, and is therefore subject to error that might be corrected using a more detailed analysis, possibly including computer modeling and field testing. However, because the calculated efficiency impacts are generally small (a few percent being typical), it may be that the approximate treatment is sufficient. That question is left open for discussion. The purpose of this report is not to advocate the adoption of the methodology developed, but rather to present it as an option that could either be adopted as-is or used as a starting point for further analysis. It is assumed that the reader has available and is familiar with ASHRAE Standard 152P and with the previous analysis referred to above.

    18. Open-Source Software in Computational Research: A Case Study

      DOE Public Access Gateway for Energy & Science Beta (PAGES Beta)

      Syamlal, Madhava; O'Brien, Thomas J.; Benyahia, Sofiane; Gel, Aytekin; Pannala, Sreekanth

      2008-01-01

      A case study of open-source (OS) development of the computational research software MFIX, used for multiphase computational fluid dynamics simulations, is presented here. The verification and validation steps required for constructing modern computational software and the advantages of OS development in those steps are discussed. The infrastructure used for enabling the OS development of MFIX is described. The impact of OS development on computational research and education in gas-solids flow, as well as the dissemination of information to other areas such as geophysical and volcanology research, is demonstrated. This study shows that the advantages of OS development were realized in the case of MFIX: verification by many users, which enhances software quality; the use of software as a means for accumulating and exchanging information; the facilitation of peer review of the results of computational research.

    19. EA-2001: Energy Efficiency Standards for New Federal Commercial and Multi-Family High-Rise Residential Buildings' Baseline Standards Update (RIN 1904-AD39)

      Broader source: Energy.gov [DOE]

      The U.S. Department of Energy (DOE) is publishing this final rule to implement provisions in the Energy Conservation and Production Act (ECPA) that require DOE to update the baseline Federal energy efficiency performance standards for the construction of new Federal commercial and multi-family high-rise residential buildings. This rule updates the baseline Federal commercial standard to the American Society of Heating, Refrigerating, and Air-Conditioning Engineers (ASHRAE) Standard 90.1-2013.

    20. Extreme Scale Computing, Co-design

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computational co-design may facilitate revolutionary designs ...

    1. Visitor Hanford Computer Access Request - Hanford Site

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)


    2. Viscosity index calculated by program in GW-basic for personal computers

      SciTech Connect (OSTI)

      Anaya, C.; Bermudez, O.

      1988-12-26

      A computer program has been developed to calculate the viscosity index of oils when viscosities at two temperatures are known.
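
      A calculation of this kind typically follows the ASTM D2270-style relation VI = 100 (L - U) / (L - H); the sketch below shows that arithmetic with the reference viscosities supplied directly (they would normally be interpolated from the standard's tables), and the sample numbers are made up for illustration.

      ```python
      # Sketch of a viscosity index calculation (D2270 Procedure A style, VI < 100).
      def viscosity_index(U, L, H):
          """U: kinematic viscosity of the oil at 40 degC (cSt)
          L, H: 40 degC viscosities of the VI=0 and VI=100 reference oils that
          have the same 100 degC viscosity as the oil being rated."""
          return 100.0 * (L - U) / (L - H)

      # Hypothetical oil: 68 cSt at 40 degC, with assumed table values L=120 cSt, H=62 cSt.
      print(round(viscosity_index(U=68.0, L=120.0, H=62.0), 1))
      ```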

    3. Environmental Baseline Survey Report for the Title Transfer of Land Parcel ED-4 at the East Tennessee Technology Park, Oak Ridge, Tennessee

      SciTech Connect (OSTI)

      SAIC

      2008-05-01

      This environmental baseline survey (EBS) report documents the baseline environmental conditions of a land parcel referred to as 'ED-4' (ED-4) at the U. S. Department of Energy's (DOE's) East Tennessee Technology Park (ETTP). DOE is proposing to transfer the title of this land to the Heritage Center, LLC. Parcel ED-4 is a land parcel that consists of two noncontiguous areas comprising a total of approximately 18 acres located east of the ETTP. The western tract of ED-4 encompasses approximately 8.5 acres in the northeastern quadrant of the intersection of Boulevard Road and Highway 58. The eastern tract encompasses an area of approximately 9.5 acres in the northwestern quadrant of the intersection of Blair Road and Highway 58 (the Oak Ridge Turnpike). Aerial photographs and site maps from throughout the history of the ETTP, going back to its initial development in the 1940s as the Oak Ridge Gaseous Diffusion Plant (ORGDP), indicate that this area has been undeveloped woodland with the exception of three support facilities for workers constructing the ORGDP since federal acquisition in 1943. These three support facilities, which were located in the western tract of ED-4, included a recreation hall, the Town Hall Camp Operations Building, and the Property Warehouse. A railroad spur also formerly occupied a portion of Parcel ED-4. These former facilities only occupied approximately 5 percent of the total area of Parcel ED-4. This report provides supporting information for the transfer of this government-owned property at ETTP to a non-federal entity. This EBS is based upon the requirements of Sect. 120(h) of the Comprehensive Environmental Response, Compensation, and Liability Act of 1980 (CERCLA). In order to support a Clean Parcel Determination (CPD) in accordance with CERCLA Sect. 120(h)(4)(d), groundwater and sediment samples were collected within, and adjacent to, the Parcel ED-4 study area. The potential for DOE to make a CPD for ED-4 is further supported by a No Further Investigation (NFI) determination made on land that adjoins ED-4 to the east (DOE 1997a) and to the south (DOE 1997b).

    4. Magellan: A Cloud Computing Testbed

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Cloud computing is gaining a foothold in the business world, but can clouds meet the specialized needs of scientists? That was one of the questions NERSC's Magellan cloud computing testbed explored between 2009 and 2011. The goal of Magellan, a project funded through the U.S. Department of Energy (DOE) Office

    5. Ultralow dose computed tomography attenuation correction for pediatric PET CT using adaptive statistical iterative reconstruction

      SciTech Connect (OSTI)

      Brady, Samuel L.; Shulkin, Barry L.

      2015-02-15

      Purpose: To develop ultralow dose computed tomography (CT) attenuation correction (CTAC) acquisition protocols for pediatric positron emission tomography CT (PET CT). Methods: A GE Discovery 690 PET CT hybrid scanner was used to investigate the change to quantitative PET and CT measurements when operated at ultralow doses (10-35 mA s). CT quantitation: noise, low-contrast resolution, and CT numbers for 11 tissue substitutes were analyzed in-phantom. CT quantitation was analyzed down to a reduction of 90% in volume computed tomography dose index (0.39/3.64 mGy) from baseline. To minimize noise infiltration, 100% adaptive statistical iterative reconstruction (ASiR) was used for CT reconstruction. PET images were reconstructed with the lower-dose CTAC iterations and analyzed for: maximum body weight standardized uptake value (SUV{sub bw}) of various diameter targets (range 8-37 mm), background uniformity, and spatial resolution. Radiation dose and CTAC noise magnitude were compared for 140 patient examinations (76 post-ASiR implementation) to determine relative dose reduction and noise control. Results: CT numbers were constant to within 10% of the non-dose-reduced CTAC image for 90% dose reduction. No change in SUV{sub bw}, background percent uniformity, or spatial resolution was found for PET images reconstructed with CTAC protocols down to 90% dose reduction. Patient population effective dose analysis demonstrated relative CTAC dose reductions between 62% and 86% (3.2/8.3 to 0.9/6.2). Noise magnitude in dose-reduced patient images increased but was not statistically different from pre-dose-reduction patient images. Conclusions: Using ASiR allowed for aggressive reduction in CT dose with no change in PET reconstructed images while maintaining sufficient image quality for colocalization of hybrid CT anatomy and PET radioisotope uptake.
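
      The dose-reduction percentages quoted above are straightforward ratios of reduced to baseline dose indices; the sketch below shows the arithmetic using the CTDIvol pair given in the abstract.

      ```python
      # Percent reduction of a dose index relative to its baseline value.
      def percent_reduction(baseline, reduced):
          return 100.0 * (baseline - reduced) / baseline

      # 3.64 mGy baseline vs 0.39 mGy reduced CTDIvol: roughly the 90% reduction cited.
      print(round(percent_reduction(3.64, 0.39)))
      ```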

    6. Computer Algebra System

      Energy Science and Technology Software Center (OSTI)

      1992-05-04

      DOE-MACSYMA (Project MAC's SYmbolic MAnipulation system) is a large computer programming system written in LISP. With DOE-MACSYMA the user can differentiate, integrate, take limits, solve systems of linear or polynomial equations, factor polynomials, expand functions in Laurent or Taylor series, solve differential equations (using direct or transform methods), compute Poisson series, plot curves, and manipulate matrices and tensors. A language similar to ALGOL-60 permits users to write their own programs for transforming symbolic expressions. Franz Lisp OPUS 38 provides the environment for the Encore, Celerity, and DEC VAX11 UNIX, SUN(OPUS) versions under UNIX and the Alliant version under Concentrix. Kyoto Common Lisp (KCL) provides the environment for the SUN(KCL), Convex, and IBM PC versions under UNIX and the Data General version under AOS/VS.

    7. GPU Computational Screening

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      GPU Computational Screening of Carbon Capture Materials J. Kim 1 , A Koniges 1 , R. Martin 1 , M. Haranczyk 1 , J. Swisher 2 , and B. Smit 1,2 1 Lawrence Berkeley National Laboratory, Berkeley, CA 94720 2 Department of Chemical Engineering, University of California, Berkeley, Berkeley, CA 94720 E-mail: jihankim@lbl.gov Abstract. In order to reduce the current costs associated with carbon capture technologies, novel materials such as zeolites and metal-organic frameworks that are based on

    8. Cloud Computing Services

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Cloud Computing Services - Sandia Energy

    9. High Performance Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      High Performance Computing - Sandia Energy

    10. Argonne Leadership Computing Facility

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Anti-HIV antibody Software optimized on Mira advances design of mini-proteins for medicines, materials Scientists at the University of Washington are using Mira to virtually design unique, artificial peptides, or short proteins. Read More Celebrating 10 years 10 science highlights celebrating 10 years of Argonne Leadership Computing Facility To celebrate our 10th anniversary, we're highlighting 10 science accomplishments since we opened our doors. Read More Bill Gropp works with students during

    11. Applied & Computational Math

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Applied & Computational Math - Sandia Energy

    12. From Federal Computer Week:

      National Nuclear Security Administration (NNSA)

      Federal Computer Week: Energy agency launches performance-based pay system By Richard W. Walker Published on March 27, 2008 The Energy Department's National Nuclear Security Administration has launched a new performance-based pay system involving about 2,000 of its 2,500 employees. NNSA officials described the effort as a pilot project that will test the feasibility of the new system, which collapses the traditional 15 General Schedule pay bands into broader pay bands. The new structure

    13. Computed Tomography Status

      DOE R&D Accomplishments [OSTI]

      Hansche, B. D.

      1983-01-01

      Computed tomography (CT) is a relatively new radiographic technique which has become widely used in the medical field, where it is better known as computerized axial tomographic (CAT) scanning. This technique is also being adopted by the industrial radiographic community, although the greater range of densities, variation in samples sizes, plus possible requirement for finer resolution make it difficult to duplicate the excellent results that the medical scanners have achieved.

    14. DETECTION OF FAST RADIO TRANSIENTS WITH MULTIPLE STATIONS: A CASE STUDY USING THE VERY LONG BASELINE ARRAY

      SciTech Connect (OSTI)

      Thompson, David R.; Wagstaff, Kiri L.; Majid, Walid A.; Brisken, Walter F.; Deller, Adam T.; Tingay, Steven J.; Wayth, Randall B.

      2011-07-10

      Recent investigations reveal an important new class of transient radio phenomena that occur on submillisecond timescales. Often, transient surveys' data volumes are too large to archive exhaustively. Instead, an online automatic system must excise impulsive interference and detect candidate events in real time. This work presents a case study using data from multiple geographically distributed stations to perform simultaneous interference excision and transient detection. We present several algorithms that incorporate dedispersed data from multiple sites, and report experiments with a commensal real-time transient detection system on the Very Long Baseline Array. We test the system using observations of pulsar B0329+54. The multiple-station algorithms enhanced sensitivity for detection of individual pulses. These strategies could improve detection performance for a future generation of geographically distributed arrays such as the Australian Square Kilometre Array Pathfinder and the Square Kilometre Array.
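
      Two ingredients of such a system are the cold-plasma dispersion delay used in the dedispersion step and a multi-station coincidence test; the sketch below illustrates both with a simple threshold rule. The coincidence tolerance, station count, and candidate times are made-up illustration values, and the DM is only the approximate catalog value for PSR B0329+54.

      ```python
      # Sketch: dispersion delay plus a toy multi-station coincidence filter.
      def dispersion_delay_ms(dm, f_low_ghz, f_high_ghz):
          """Cold-plasma dispersion delay (ms) between two radio frequencies for a
          dispersion measure DM (pc cm^-3), using the ~4.149 ms GHz^2 constant."""
          return 4.149 * dm * (f_low_ghz ** -2 - f_high_ghz ** -2)

      def coincident(candidate_times_s, tolerance_s=0.005, min_stations=3):
          """Keep a candidate only if at least min_stations stations detect it
          within the tolerance window (a very simple interference-rejection rule)."""
          times = sorted(candidate_times_s)
          return any(
              sum(1 for t in times if abs(t - t0) <= tolerance_s) >= min_stations
              for t0 in times
          )

      print(round(dispersion_delay_ms(dm=26.8, f_low_ghz=1.4, f_high_ghz=1.7), 2))
      print(coincident([10.001, 10.002, 10.003, 47.9]))   # True: three stations agree
      ```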

    15. Baseline risk assessment for groundwater contamination at the uranium mill tailings site near Monument Valley, Arizona. Draft

      SciTech Connect (OSTI)

      Not Available

      1993-09-01

      This baseline risk assessment evaluates potential impact to public health or the environment resulting from groundwater contamination at the former uranium mill processing site near Monument Valley, Arizona. The tailings and other contaminated material at this site are being relocated and stabilized in a disposal cell at Mexican Hat, Utah, through the US Department of Energy (DOE) Uranium Mill Tailings Remedial Action (UMTRA) Project. The tailings removal is planned for completion by spring 1994. After the tailings are removed, groundwater contamination at the site will continue to be evaluated. This risk assessment is the first document specific to this site for the Groundwater Project. It will be used to assist in determining what remedial action is needed for contaminated groundwater at the site.

    16. Baseline risk assessment of ground water contamination at the Monument Valley Uranium Mill Tailings Site, Cane Valley, Arizona. Revision 1

      SciTech Connect (OSTI)

      Not Available

      1994-08-01

      This baseline risk assessment evaluates potential impact to public health or the environment from ground water contamination at the former uranium mill processing site in Cane Valley near Monument Valley, Arizona. The US Department of Energy (DOE) Uranium Mill Tailings Remedial Action (UMTRA) Project has relocated and stabilized this site's tailings and other contaminated material in a disposal cell at Mexican Hat, Utah. The second phase of the UMTRA Project is to evaluate ground water contamination. This risk assessment is the first document specific to this site for the Ground Water Project that evaluates potential health and environmental risks. It will help determine the approach required to address contaminated ground water at the site.

    17. Computational nuclear quantum many-body problem: The UNEDF project

      SciTech Connect (OSTI)

      Fann, George I [ORNL

      2013-01-01

      The UNEDF project was a large-scale collaborative effort that applied high-performance computing to the nuclear quantum many-body problem. The primary focus of the project was on constructing, validating, and applying an optimized nuclear energy density functional, which entailed a wide range of pioneering developments in microscopic nuclear structure and reactions, algorithms, high-performance computing, and uncertainty quantification. UNEDF demonstrated that close associations among nuclear physicists, mathematicians, and computer scientists can lead to novel physics outcomes built on algorithmic innovations and computational developments. This review showcases a wide range of UNEDF science results to illustrate this interplay.

    18. Argonne's Laboratory computing resource center : 2006 annual report.

      SciTech Connect (OSTI)

      Bair, R. B.; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Drugan, C. D.; Pieper, G. P.

      2007-05-31

      Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC) in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. In September 2002 the LCRC deployed a 350-node computing cluster from Linux NetworX to address Laboratory needs for mid-range supercomputing. This cluster, named 'Jazz', achieved over a teraflop of computing power (10{sup 12} floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the 50 fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2006, there were 76 active projects on Jazz involving over 380 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to foster growth in the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to offer more scientific data management capabilities, expanding Argonne staff use of national computing facilities, and improving the scientific reach and performance of Argonne's computational applications. Furthermore, recognizing that Jazz is fully subscribed, with considerable unmet demand, the LCRC has framed a 'path forward' for additional computing resources.

    19. Integrated Dry NO sub x /SO sub 2 Emissions Control System baseline test report, November 11--December 15, 1991

      SciTech Connect (OSTI)

      Shiomoto, G.H.; Smith, R.A.

      1992-03-01

      The DOE-sponsored Integrated Dry NO{sub x}/SO{sub 2} Emissions Control System program, which is a Clean Coal Technology III demonstration, is being conducted by Public Service Company of Colorado. The test site is Arapahoe Generating Station Unit 4, which is a 100 MWe, down-fired utility boiler burning a low sulfur western coal. The project goal is to demonstrate 70 percent reductions in NO{sub x} and SO{sub 2} emissions through the integration of: (1) down-fired low-NO{sub x} burners with overfire air; (2) urea injection for additional NO{sub x} removal; and (3) dry sorbent injection and duct humidification for SO{sub 2} removal. The effectiveness of the integrated system on a high sulfur coal will also be tested. This report documents the first baseline test results conducted during the program. The baseline tests were conducted with the original burners and auxiliary equipment and represent the unmodified boiler emissions. The burner design of Arapahoe Unit 4 results in relatively high NO{sub x} levels ranging from 740 to 850 ppm (corrected to 3% O{sub 2}, dry) over the load range. Excess air level was the primary factor influencing NO{sub x} emissions. During normal boiler operations, there was a wide range in NO{sub x} emissions due to variations in excess air, boiler load, and other secondary parameters. SO{sub 2} emissions ranged from 350 to 600 ppm (corrected to 3% O{sub 2}, dry) and reflected variations in the coal sulfur content.
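
      The "corrected to 3% O{sub 2}, dry" values above reflect the standard excess-oxygen normalization that makes readings taken at different excess air levels comparable; the sketch below shows that correction, with illustrative measured values rather than test data.

      ```python
      # Sketch of normalizing a dry flue-gas concentration to a reference O2 level.
      def correct_to_reference_o2(c_measured_ppm, o2_measured_pct, o2_reference_pct=3.0):
          """Scale a measured dry-basis concentration to the reference O2 level."""
          return c_measured_ppm * (20.9 - o2_reference_pct) / (20.9 - o2_measured_pct)

      # Example: 700 ppm NOx measured at 5.5% O2 (dry), expressed at 3% O2.
      print(round(correct_to_reference_o2(700.0, 5.5)))
      ```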

    20. Forest Restoration Carbon Analysis of Baseline Carbon Emissions and Removal in Tropical Rainforest at La Selva Central, Peru

      SciTech Connect (OSTI)

      Patrick Gonzalez; Benjamin Kroll; Carlos R. Vargas

      2006-01-10

      Conversion of tropical forest to agricultural land and pasture has reduced forest extent and the provision of ecosystem services, including watershed protection, biodiversity conservation, and carbon sequestration. Forest conservation and reforestation can restore those ecosystem services. We have assessed forest species patterns, quantified deforestation and reforestation rates, and projected future baseline carbon emissions and removal in Amazon tropical rainforest at La Selva Central, Peru. The research area is a 4800 km{sup 2} buffer zone around the Parque Nacional Yanachaga-Chemillen, Bosque de Proteccion San Matias-San Carlos, and the Reserva Comunal Yanesha. A planned project for the period 2006-2035 would conserve 4000 ha of forest in a proposed 7000 ha Area de Conservacion Municipale de Chontabamba and establish 5600 ha of natural regeneration and 1400 ha of native species plantations, laid out in fajas de enriquecimiento (contour plantings), to reforest 7000 ha of agricultural land. Forest inventories of seven sites covering 22.6 ha in primary forest and 17 sites covering 16.5 ha in secondary forest measured 17,073 trees of diameter {ge} 10 cm. The 24 sites host trees of 512 species, 267 genera, and 69 families. We could not identify the family of 7% of the trees or the scientific species of 21% of the trees. Species richness is 346 in primary forest and 257 in the secondary forest. In primary forest, 90% of aboveground biomass resides in old-growth species. Conversely, in secondary forest, 66% of aboveground biomass rests in successional species. The density of trees of diameter {ge} 10 cm is 366 trees ha{sup -1} in primary forest and 533 trees ha{sup -1} in secondary forest, although the average diameter is 24 {+-} 15 cm in primary forest and 17 {+-} 8 cm in secondary forest. Using Amazon forest biomass equations and wood densities for 117 species, aboveground biomass is 240 {+-} 30 t ha{sup -1} in the primary sites and 90 {+-} 10 t ha{sup -1} in the secondary sites. Aboveground carbon density is 120 {+-} 15 t ha{sup -1} in primary forest and 40 {+-} 5 t ha{sup -1} in secondary forest. Forest stands in the secondary forest sites range in age from 10 to 42 y. Growth in biomass (t ha{sup -1}) as a function of time (y) follows the relation: biomass = 4.09-0.017 age{sup 2} (p < 0.001). Aboveground biomass and forest species richness are positively correlated (r{sup 2} = 0.59, p < 0.001). Analyses of Landsat data show that the land cover of the 3700 km{sup 2} of non-cloud areas in 1999 was: closed forest 78%; open forest 12%, low vegetation cover 4%, sparse vegetation cover 6%. Deforestation from 1987 to 1999 claimed a net 200 km{sup 2} of forest, proceeding at a rate of 0.005 y{sup -1}. Of those areas of closed forest in 1987, only 89% remained closed forest in 1999. Consequently, closed forests experienced disruption in the time period at double the rate of net deforestation. The three protected areas experienced negligible deforestation or slight reforestation. Based on 1987 forest cover, 26,000 ha are eligible for forest carbon trading under the Clean Development Mechanism, established by the Kyoto Protocol to the United Nations Framework Convention on Climate Change. 
Principal components analysis showed that distance to nonforest was the factor that best explained observed patterns of deforestation while distance to forest best explained observed patterns of reforestation, more significant than elevation, distance to rivers, distance to roads, slope, and distance to towns of population > 400. Aboveground carbon in live vegetation in the project area decreased from 35 million {+-} 4 million t in 1987 to 34 million {+-} 4 million t in 1999. Projected aboveground carbon in live vegetation would fall to 33 million {+-} 4 million t in 2006, 32 million {+-} 4 million t in 2011, and 29 million {+-} 3 million t in 2035. Projected net deforestation in the research area would total 13,000 {+-} 3000 ha in the period 1999-2011, proceeding at a rate of 0.003 {+-} 0.0007 y{sup -1}, and would total 33,000 {+-} 7000

    1. Ultra-Scale Computing for Emergency Evacuation

      SciTech Connect (OSTI)

      Bhaduri, Budhendra L; Nutaro, James J; Liu, Cheng; Zacharia, Thomas

      2010-01-01

      Emergency evacuations are carried out in anticipation of a disaster such as hurricane landfall or flooding, and in response to a disaster that strikes without a warning. Existing emergency evacuation modeling and simulation tools are primarily designed for evacuation planning and are of limited value in operational support for real time evacuation management. In order to align with desktop computing, these models reduce the data and computational complexities through simple approximations and representations of real network conditions and traffic behaviors, which rarely represent real-world scenarios. With the emergence of high resolution physiographic, demographic, and socioeconomic data and supercomputing platforms, it is possible to develop micro-simulation based emergency evacuation models that can foster development of novel algorithms for human behavior and traffic assignments, and can simulate evacuation of millions of people over a large geographic area. However, such advances in evacuation modeling and simulations demand computational capacity beyond the desktop scales and can be supported by high performance computing platforms. This paper explores the motivation and feasibility of ultra-scale computing for increasing the speed of high resolution emergency evacuation simulations.

    2. High Performance Computing at the Oak Ridge Leadership Computing Facility

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      High Performance Computing at the Oak Ridge Leadership Computing Facility. Outline: Our Mission; Computer Systems: Present, Past, Future; Challenges Along the Way; Resources for Users. Our Mission: World's most powerful computing facility; Nation's largest concentration of open source materials research; $1.3B budget; 4,250 employees; 3,900 research guests annually; $350 million invested in modernization; Nation's most diverse energy

    3. Environmental settings for selected US Department of Energy installations - support information for the programmatic environmental impact statement and the baseline environmental management report

      SciTech Connect (OSTI)

      Holdren, G.R.; Glantz, C.S.; Berg, L.K.; Delinger, K.; Fosmire, C.J.; Goodwin, S.M.; Rustad, J.R.; Schalla, R.; Schramke, J.A.

      1995-05-01

      This report contains the environmental setting information developed for 25 U.S. Department of Energy (DOE) installations in support of the DOE's Programmatic Environmental Impact Study (PEIS) and the Baseline Environmental Management Report (BEMR). The common objective of the PEIS and the BEMR is to provide the public with information about the environmental contamination problems associated with major DOE facilities across the country, and to assess the relative risks that radiological and hazardous contaminants pose to the public, onsite workers, and the environment. Environmental setting information consists of the site-specific data required to model (using the Multimedia Environmental Pollutant Assessment System) the atmospheric, groundwater, and surface water transport of contaminants within and near the boundaries of the installations. The environmental settings data describe the climate, atmospheric dispersion, hydrogeology, and surface water characteristics of the installations. The number of discrete environmental settings established for each installation was governed by two competing requirements: (1) the risks posed by contaminants released from numerous waste sites were to be modeled as accurately as possible, and (2) the modeling required for numerous release sites and a large number of contaminants had to be completed within the limits imposed by the PEIS and BEMR schedule. The final product is the result of attempts to balance these competing concerns in a way that minimizes the number of settings per installation in order to meet the project schedule while at the same time providing adequate, if sometimes highly simplified, representations of the different areas within an installation. Environmental settings were developed in conjunction with installation experts in the fields of meteorology, geology, hydrology, and geochemistry.

    4. Foundational Tools for Petascale Computing

      SciTech Connect (OSTI)

      Miller, Barton

      2014-05-19

      The Paradyn project has a history of developing algorithms, techniques, and software that push the cutting edge of tool technology for high-end computing systems. Under this funding, we are working on a three-year agenda to make substantial new advances in support of new and emerging Petascale systems. The overall goal for this work is to address the steady increase in complexity of these petascale systems. Our work covers two key areas: (1) The analysis, instrumentation and control of binary programs. Work in this area falls under the general framework of the Dyninst API tool kits. (2) Infrastructure for building tools and applications at extreme scale. Work in this area falls under the general framework of the MRNet scalability framework. Note that work done under this funding is closely related to work done under a contemporaneous grant, High-Performance Energy Applications and Systems, SC0004061/FG02-10ER25972, UW PRJ36WV.

    5. computational-hydraulics

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      and Aerodynamics using STAR-CCM+ for CFD Analysis, March 21-22, 2012, Argonne, Illinois. Dr. Steven Lottes. A training course in the use of computational hydraulics and aerodynamics CFD software using CD-adapco's STAR-CCM+ for analysis will be held at TRACC from March 21-22, 2012. The course assumes a basic knowledge of fluid mechanics and will make extensive use of hands-on tutorials. CD-adapco will issue

    6. Computer generated holographic microtags

      DOE Patents [OSTI]

      Sweatt, William C.

      1998-01-01

      A microlithographic tag comprising an array of individual computer generated holographic patches having feature sizes between 250 and 75 nanometers. The tag is a composite hologram made up of the individual holographic patches and contains identifying information when read out with a laser of the proper wavelength and at the proper angles of probing and reading. The patches are fabricated in a steep angle Littrow readout geometry to maximize returns in the -1 diffracted order. The tags are useful as anti-counterfeiting markers because of the extreme difficulty in reproducing them.
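
      In a Littrow readout geometry the diffracted order returns along the incident beam, so the readout angle follows from 2 d sin(theta) = m lambda for grating period d. The sketch below evaluates that relation, treating the quoted feature size as the grating period and using an assumed probe wavelength (the patent only says "the proper wavelength").

      ```python
      # Sketch of the Littrow readout angle (assumed wavelength and period-feature equivalence).
      import math

      def littrow_angle_deg(period_nm, wavelength_nm, order=1):
          s = order * wavelength_nm / (2.0 * period_nm)
          if s > 1.0:
              raise ValueError("no propagating Littrow order for this period/wavelength")
          return math.degrees(math.asin(s))

      # 250 nm features probed at an assumed 405 nm: a steep readout angle, as described.
      print(round(littrow_angle_deg(period_nm=250.0, wavelength_nm=405.0), 1))
      ```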

    7. Computer generated holographic microtags

      DOE Patents [OSTI]

      Sweatt, W.C.

      1998-03-17

      A microlithographic tag comprising an array of individual computer generated holographic patches having feature sizes between 250 and 75 nanometers is disclosed. The tag is a composite hologram made up of the individual holographic patches and contains identifying information when read out with a laser of the proper wavelength and at the proper angles of probing and reading. The patches are fabricated in a steep angle Littrow readout geometry to maximize returns in the -1 diffracted order. The tags are useful as anti-counterfeiting markers because of the extreme difficulty in reproducing them. 5 figs.

    8. Scanning computed confocal imager

      DOE Patents [OSTI]

      George, John S. (Los Alamos, NM)

      2000-03-14

      There is provided a confocal imager comprising a light source emitting light, with a light modulator in optical communication with the light source for varying the spatial and temporal pattern of the light. A beam splitter receives the scanned light, directs the scanned light onto a target, and passes light reflected from the target to a video capturing device for receiving the reflected light and transferring a digital image of the reflected light to a computer for creating a virtual aperture and outputting the digital image. In a transmissive mode of operation the invention omits the beam splitter means and captures light passed through the target.

    9. Hydropower Baseline Cost Modeling

      SciTech Connect (OSTI)

      O'Connor, Patrick W.; Zhang, Qin Fen; DeNeale, Scott T.; Chalise, Dol Raj; Centurion, Emma E.

      2015-01-01

      Recent resource assessments conducted by the United States Department of Energy have identified significant opportunities for expanding hydropower generation through the addition of power to non-powered dams and on undeveloped stream-reaches. Additional interest exists in the powering of existing water resource infrastructure such as conduits and canals, upgrading and expanding existing hydropower facilities, and the construction of new pumped storage hydropower. Understanding the potential future role of these hydropower resources in the nation's energy system requires an assessment of the environmental and techno-economic issues associated with expanding hydropower generation. To facilitate these assessments, this report seeks to fill the current gaps in publicly available hydropower cost-estimating tools that can support the national-scale evaluation of hydropower resources.

    10. Short Baseline Neutrino

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      decay region is followed by an absorber and 450 m of dirt, beyond which only the neutrino component of the beam survives. The MiniBooNE Neutrino Beam, March 10, 2003...

    11. Introduction to High Performance Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Introduction to High Performance Computing, June 10, 2013. Download: Gerber-HPC-2.pdf...

    12. Computer Wallpaper | The Ames Laboratory

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computer Wallpaper We've incorporated the tagline, Creating Materials and Energy Solutions, into a computer wallpaper so you can display it on your desktop as a constant reminder....

    13. Augmented Computer Exercise for Inspection Training

      Energy Science and Technology Software Center (OSTI)

      2001-10-08

      ACE-IT is a computer-based training tool developed to simulate an on-site inspection of a facility. Inspectors and hosts practice realistic scenarios to prepare for inspections, to supplement tabletop and mock inspections, and for general training in managed access techniques. A training exercise is conducted between interconnected computer workstations. Participants at each workstation play a role, such as inspector or host, and the exercise permits team-specific actions at each stage of the inspection. Prompts and on-screen menus let the participants know what responses are expected from them to continue the exercise.

    14. Certification of computer professionals: A good idea?

      SciTech Connect (OSTI)

      Boggess, G.

      1994-12-31

      In the early stages of computing there was little understanding of, or attention paid to, the ethical responsibilities of professionals. Companies routinely put secretaries and music majors through 30 hours of video training and turned them loose on data processing projects. As the nature of the computing task changed, these same practices were followed and the trainees were set loose on life-critical software development projects. The enormous risks of using programmers with limited training have been highlighted by the GAO report on the BSY-2 program.

    15. Super recycled water: quenching computers

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      New facility and methods support conserving water and creating recycled products. Using reverse osmosis to "super purify" water allows the system to reuse water and cool down our powerful yet thirsty computers. January 30, 2014. LANL's Sanitary Effluent Reclamation Facility is key to reducing the Lab's discharge of liquid. Millions of gallons of industrial...

    16. Fermilab | Science at Fermilab | Computing

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Computing is indispensable to science at Fermilab. High-energy physics experiments generate an astounding amount of data that physicists need to store, analyze and communicate with others. Cutting-edge technology allows scientists to work quickly and efficiently to advance our understanding of the world. Fermilab's Computing Division is recognized for its expertise in handling huge amounts of data, its success in high-speed parallel computing and its willingness to take its craft in...

    17. History | Argonne Leadership Computing Facility

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      The Argonne Leadership Computing Facility (ALCF) was established at Argonne National Laboratory in 2004 as part of a U.S. Department of Energy (DOE) initiative dedicated to enabling leading-edge computational capabilities to advance fundamental discovery and understanding in a broad range of scientific and engineering disciplines. Supported by the Advanced Scientific Computing Research (ASCR) program within DOE's Office of Science, the ALCF is one half of the DOE Leadership...

    18. Secure computing for the 'Everyman' goes to market

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      Quantum key distribution technology could ensure truly secure commerce, banking, communications and data transfer. December 22, 2014. This small device developed at Los Alamos National Laboratory uses the truly random spin of light particles, as defined by the laws of quantum mechanics, to generate a random number for use in a cryptographic key that can...
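      As a minimal sketch of why such random numbers matter (and not a description of the Los Alamos device or its protocol), the fragment below uses a random key stream as a one-time pad; os.urandom merely stands in for a quantum random number source.

```python
import os

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """One-time-pad style XOR; the key must be as long as the data and never reused."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"wire transfer: $100 to account 42"
key = os.urandom(len(message))                # stand-in for quantum-generated random bits
ciphertext = xor_bytes(message, key)
assert xor_bytes(ciphertext, key) == message  # XOR with the same key recovers the plaintext
print(ciphertext.hex())
```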

    19. An Information Dependant Computer Program for Engine Exhaust Heat Recovery

      Office of Energy Efficiency and Renewable Energy (EERE) Indexed Site

      An Information Dependant Computer Program for Engine Exhaust Heat Recovery for Heating (Department of Energy). A computer program was developed to help engineers at rural Alaskan village power plants quickly evaluate how to use exhaust waste heat from individual diesel power plants. Download: deer09_avadhanula.pdf
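      A first-order estimate of the kind such a program evaluates is the sensible heat available in the exhaust stream, Q = m_dot * cp * (T_exhaust - T_stack_min). The sketch below uses assumed gas properties and temperatures and is not taken from the presentation.

```python
def recoverable_exhaust_heat_kw(mass_flow_kg_s: float,
                                t_exhaust_c: float,
                                t_stack_min_c: float = 150.0,
                                cp_kj_kg_k: float = 1.05) -> float:
    """First-order recoverable exhaust heat (kW): Q = m_dot * cp * (T_exhaust - T_stack_min).
    T_stack_min keeps the exhaust above its acid dew point; cp is an assumed gas value."""
    return max(0.0, mass_flow_kg_s * cp_kj_kg_k * (t_exhaust_c - t_stack_min_c))

# Illustrative numbers for a small village genset (assumed, not from the paper).
print(f"Recoverable heat ~ {recoverable_exhaust_heat_kw(0.5, 420.0):.0f} kW")
```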

    20. Computer System, Cluster and Networking Summer Institute (CSCNSI)

      Broader source: All U.S. Department of Energy (DOE) Office Webpages (Extended Search)

      The Computer System, Cluster, and Networking Summer Institute (CSCNSI) is a focused technical enrichment program that emphasizes practical skills development and targets third-year college undergraduate students currently engaged in computer studies. Program Lead: Carolyn Connor, (505) 665-9891. Professional Staff Assistant: Nicole Aguilar Garcia, (505) 665-3048.